RESEARCHERS ARE DEVELOPING A BRAIN-COMPUTER INTERFACE THAT CREATES MUSIC BASED ON YOUR MOOD
A brain-computer interface that translates its wearer’s mood into music is currently under development, with the aim of helping users mediate their emotions.
The device, developed by Stefan Ehrlich of the Technische Universität München and Kat Agres of the National University of Singapore, was described in an interview with EveryONE, the blog of the open-access journal PLOS ONE.
Because the music is tailored to the user’s emotional state, the user can ‘interact’ with their emotions by actively listening and responding to it. Ehrlich describes it as “a device that translates a listener’s brain activity, which corresponds to a specific emotional state, into a musical representation that seamlessly and continuously adapts to the listener’s current emotional state.”
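As a purely illustrative sketch, and not the researchers’ actual implementation, that translation can be pictured as a mapping from an estimated emotional state, here a hypothetical valence/arousal pair, onto musical parameters such as tempo and mode:

```python
# Illustrative only: a hypothetical valence/arousal estimate of emotional
# state (both in [-1, 1]) is mapped onto simple musical parameters.

def mood_to_music(valence: float, arousal: float) -> dict:
    tempo_bpm = 80 + 60 * (arousal + 1) / 2      # calm -> slower, excited -> faster
    mode = "major" if valence >= 0 else "minor"  # positive mood -> major key
    loudness = 0.4 + 0.5 * (arousal + 1) / 2     # rough dynamics scaling
    return {"tempo_bpm": round(tempo_bpm), "mode": mode, "loudness": round(loudness, 2)}

# Example: a relaxed, mildly positive state
print(mood_to_music(valence=0.3, arousal=-0.5))
# {'tempo_bpm': 95, 'mode': 'major', 'loudness': 0.53}
```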

The listener is made aware of their emotional state through the ever-changing music generated by the device, and it is through this awareness that they are able to mediate their emotions.
The scientists have already put their device to the test with a group of young people suffering from depression, who, according to Agres, “actually think of their identity in part in terms of their music.”
Although the group was divided on how easy the device was to use, Agres reported that, “without instructing the listeners on how to gain control over the feedback […] all of them reported that they self-evoked emotions by recalling happy or sad moments from their lives” in order to bring their emotional state under control.
The duo went on to discuss the challenges they faced while developing the project, including the difficulty of making music that is generated continuously from the listener’s emotional state actually sound continuous, rather than like a series of disconnected sounds indicating mood. Agres emphasized the importance of adaptability, of reacting to changes in brain state in real time: the device had to adapt “to their brain signals and sound continuous and musically cohesive.”
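One plausible way to get that kind of cohesion, shown here purely as an assumption rather than the team’s actual method, is to smooth the incoming emotion estimates before they drive the music, so that the musical parameters drift rather than jump between windows:

```python
# Assumed, illustrative smoothing step: raw per-window mood estimates (e.g.
# valence in [-1, 1]) are blended into a running value so the music generated
# from them evolves gradually instead of jumping.

class SmoothedMood:
    def __init__(self, alpha: float = 0.2):
        self.alpha = alpha   # lower alpha -> smoother, slower-reacting music
        self.value = 0.0     # running estimate

    def update(self, raw_estimate: float) -> float:
        # Blend the newest brain-signal-derived estimate with the running value.
        self.value = self.alpha * raw_estimate + (1 - self.alpha) * self.value
        return self.value

# Hypothetical noisy per-second valence readings
readings = [0.9, -0.8, 0.7, 0.85, 0.9]
mood = SmoothedMood(alpha=0.2)
print([round(mood.update(v), 2) for v in readings])  # drifts rather than jumps
```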
Agres and Ehrlich are currently preparing the second round of testing of their automatic music generation system, this time with both healthy adults and patients suffering from major depressive disorder. If the tests are successful, the researchers hope to use the device to assist stroke patients who are suffering from depression.