An article from the University of Oslo

New software adds a personal touch to your favourite music

With a smartphone and your own body movements, music can be different each time you listen to it.


Computer scientists and musicologists at the University of Oslo have developed entirely new software that allows you to add your own personal touch to your music.

Postdoctoral fellow Kristian Nymoen at the Department of Informatics explains how:

“With the new system you can use your smartphone and your movements to control how modern compositions sound. The composer’s task is to create a musical landscape containing many soundscapes. You then control which soundscapes you want to move between, and you can also decide for yourself the routes you take within the various soundscapes.”

Sensors in your smartphone

Nymoen has a Master’s degree in musicology, and he has created this musical framework together with Jim Tørresen, professor of Informatics.

Kristian Nymoen and Professor Jim Tørresen. (Photo: Yngve Vogt/Hanne Utigard/Colourbox)

The system should also be able to remember the musical experiences you like.

“Using the sensors in your smartphone, you can gradually be provided with a musical expression that suits your mood.”

The point is that all smartphones have sensors that detect movement, so your mobile phone can interpret how you are moving.

“The way you move can give some indication of your mood. When you’re in a good mood you move differently from when you’re in a bad mood. So we want the technology to sense what you are like as a person and to make it easier for you to get what suits your current frame of mind,” Nymoen says.

If you are cross and bad-tempered, you can get music that counteracts the mood.

“Your mobile phone can create music that either reinforces the way you’re feeling or changes it,” says Nymoen.

To describe bodily movements even better, the researchers use a sensor that records foot movements and the pressure inside the shoe, combined with an algorithm that analyses how you walk.
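Loosely speaking, estimating how vigorously someone is moving from a phone's accelerometer can be done in a few lines of code. The sketch below only illustrates that general idea and is not the researchers' actual algorithm; the function names and thresholds are invented for this article.

import math

def movement_energy(samples, gravity=9.81):
    """Rough activity level from a window of (x, y, z) accelerometer
    readings in m/s^2: the average deviation of the acceleration
    magnitude from gravity. Higher values mean more vigorous movement."""
    deviations = [abs(math.sqrt(x*x + y*y + z*z) - gravity) for x, y, z in samples]
    return sum(deviations) / len(deviations)

def activity_hint(energy, calm_threshold=0.5, lively_threshold=2.0):
    """Map the energy estimate to a coarse label. The thresholds are
    purely illustrative and would have to be tuned in practice."""
    if energy < calm_threshold:
        return "calm"
    if energy < lively_threshold:
        return "moderate"
    return "lively"

A music system could then use such a label as one of several hints when choosing music that reinforces or counteracts the listener's mood.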

European research project

The music research is part of the European research project Engineering Proprioception in Computing Systems (EPiCS), which aims to give machines the ability to adapt to the environment in which they are located.

Two industrial partners and six universities are working on the project. The University of Klagenfurt in Austria has used the new knowledge about human movement in video surveillance systems. They have developed a completely new type of communication between video cameras that makes it possible to follow a person as they move across the various camera views.

“The new algorithms that we are jointly developing are intended to imitate our brain, which has the incredible ability to determine how your fingers can meet behind your back even though you can’t see them,” Jim Tørresen says.

Inspired by nature

To attain the goal of personalized music, the researchers have derived inspiration from nature.

One of the things they have done is to make an app (a computer program) for smartphones that lets you decide how your musical experience will change, in much the same way as ants follow scent trails to gather food.

The ant leaves a scent on the path so that other ants can find their way to the food source. The more ants use the path, the stronger the scent becomes. When the ants eventually find a new source of food, the scent on the path to the old one gradually fades. The intensity of the scent on a path therefore reflects how much food lies at its end.

“We’ve used the same principle to make a rule about how listeners move between soundscapes in a musical landscape. Listeners can use their smartphones to leave ‘scents’ at the transitions between soundscapes. The more ‘scents’ there are, the greater the probability that the music will move between these soundscapes. We can also insert a disappearance function that causes these transitions to decay if the user does not take action and leave even more ‘scents’,” says Tørresen.

This means that the composer sets the framework, while the user controls the music within it. The challenge is to make the system listener-friendly while finding a method that does not demand too much processing capacity.
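A minimal sketch of how such scent-based transitions could be programmed is shown below. It only illustrates the ant-trail principle described above; the class, method and parameter names are invented for this article and are not taken from the Oslo software.

import random

class SoundscapeMap:
    """Listeners deposit 'scent' on transitions between soundscapes,
    the music is more likely to follow heavily scented transitions,
    and scent evaporates unless it is reinforced."""

    def __init__(self, evaporation=0.05):
        self.scent = {}              # (from_scape, to_scape) -> scent level
        self.evaporation = evaporation

    def deposit(self, a, b, amount=1.0):
        """A listener marks the transition from soundscape a to soundscape b."""
        self.scent[(a, b)] = self.scent.get((a, b), 0.0) + amount

    def evaporate(self):
        """Let every scent decay a little; neglected transitions fade away."""
        for key in self.scent:
            self.scent[key] *= 1.0 - self.evaporation

    def next_soundscape(self, current, candidates):
        """Pick the next soundscape with probability proportional to scent,
        plus a small floor so unexplored transitions remain possible."""
        weights = [self.scent.get((current, c), 0.0) + 0.1 for c in candidates]
        return random.choices(candidates, weights=weights, k=1)[0]

A playback engine could call deposit whenever a listener marks a transition, evaporate at regular intervals, and next_soundscape each time the music reaches a section boundary.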

Effect on the concert arena

The researchers have also made a special musical system for smartphones inspired by fireflies. Some species of firefly flash the lights in their abdomens in synchrony; in other words, they have a mechanism that allows them to synchronize their flashing. The researchers use this idea to get many smartphones to work together without the phones being controlled from a central unit.

The sound from each individual smartphone can be used to get the phones to synchronize with one another: each phone listens to the others and uses what it hears to adjust its own timing.
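The firefly mechanism can be illustrated with a toy model of pulse-coupled oscillators: each phone advances an internal clock, plays a short click when the clock wraps around, and nudges its clock forward whenever it hears a click from another phone. The sketch below is only such an illustration; the coupling values are invented and are not taken from the researchers' system.

import random

class PulseOscillator:
    """One phone: an internal phase that wraps around once per period.
    Hearing another phone's click pushes the phase slightly forward,
    so the whole group gradually drifts into step."""

    def __init__(self, period=1.0, nudge=0.05):
        self.period = period
        self.nudge = nudge
        self.phase = random.uniform(0.0, period)

    def tick(self, dt):
        """Advance time; return True if this phone fires (plays its click)."""
        self.phase += dt
        if self.phase >= self.period:
            self.phase -= self.period
            return True
        return False

    def heard_click(self):
        """Another phone fired: jump a little closer to firing ourselves."""
        self.phase = min(self.period, self.phase + self.nudge)

# Tiny simulation: a handful of phones slowly fall into the same rhythm.
phones = [PulseOscillator() for _ in range(5)]
dt = 0.01
for _ in range(int(30 / dt)):                 # simulate 30 seconds
    fired = [p for p in phones if p.tick(dt)]
    for p in phones:
        if p not in fired:
            for _ in fired:
                p.heard_click()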

That way, all the smartphones at a concert arena can be used as loudspeakers.

And as if that were not enough, your smartphone can also be used, together with all the other smartphones in the audience, to influence the music on stage.

“Even though we have so far only made a system that works for a few users, it’s exciting to imagine how it could be used in a large room full of people. Imagine a concert with an audience of hundreds: if enough of them shake their phones at the same time, the audience can also influence the sound from the actual concert loudspeakers,” Nymoen points out.

----------

Read the Norwegian version of this article at forskning.no
