UMKC engineering and music composition students sonify brain waves
At the end of the fall 2008 semester, clanging, screaming and chanting emanated from 454 Robert H. Flarsheim Hall.
No, these students weren’t celebrating the end of the semester.
These graduate-level Music Composition (Computer Programming for Musicians) and Electrical Engineering (Biomedical Signal Analysis) students were presenting a final project that illustrated how a person's brain waves change in response to sounds and imagined movements. Three Kansas City Art Institute students also contributed their video production expertise to the project.
For the final presentations, three groups of music and engineering students gathered around Ramaraju Medisetty — a second-year electrical engineering graduate student who volunteered his brain for the project.
Jesse Sherwood, a graduate research assistant studying electrical and computer engineering, outfitted Medisetty with a brain wave cap connected to a bioamplifier. Then, Sherwood squirted a conductive gel into the electrodes on the cap. The saline-based gel filled the air gap between the metal electrodes and Medisetty’s scalp, ensuring good electrical contact.
After setting up the brain wave cap, Sherwood began a brain wave training set that required Medisetty to visualize moving his hands and legs.
“Left hand, right hand, left leg, right leg, relax,” Sherwood told him.
Meanwhile, a computer system processed and classified Medisetty’s thought patterns according to algorithms developed by electrical engineering students.
After the computer classified Medisetty’s brain waves, the data traveled to music-synthesizing software customized by music composition students. Sounds and music controlled by Medisetty’s brain waves echoed through the speakers, and his brain waves appeared as a series of yellow lights on a projection screen.
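The classify-then-sonify pipeline described above can be sketched in a few dozen lines. This is a hypothetical illustration, not the students' actual algorithms: the synthetic signals, the frequency bands, the nearest-centroid classifier, and the note mapping are all assumptions standing in for real EEG processing.

```python
import numpy as np

FS = 256  # sampling rate in Hz (assumed)

def band_power(signal, fs, low, high):
    """Average spectral power of `signal` between `low` and `high` Hz."""
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    mask = (freqs >= low) & (freqs < high)
    return spectrum[mask].mean()

def features(epoch):
    """Mu (8-12 Hz) and beta (13-30 Hz) band power -- the bands most
    affected by imagined movement."""
    return np.array([band_power(epoch, FS, 8, 12),
                     band_power(epoch, FS, 13, 30)])

class NearestCentroid:
    """Tiny stand-in for the classification stage: label each epoch by
    the closest class centroid learned from the training cues."""
    def fit(self, X, y):
        self.labels_ = sorted(set(y))
        self.centroids_ = {
            c: np.mean([x for x, yi in zip(X, y) if yi == c], axis=0)
            for c in self.labels_
        }
        return self

    def predict(self, x):
        return min(self.labels_,
                   key=lambda c: np.linalg.norm(x - self.centroids_[c]))

# Map each classified thought pattern to a pitch for the synth stage
# (an illustrative mapping, not the project's).
NOTE_HZ = {"left_hand": 261.6, "right_hand": 329.6, "relax": 220.0}

def make_epoch(mu_amp, rng):
    """Synthetic one-second epoch: a mu-band tone plus noise, faking the
    different mu amplitudes each imagery cue would produce."""
    t = np.arange(FS) / FS
    return mu_amp * np.sin(2 * np.pi * 10 * t) + 0.5 * rng.standard_normal(FS)

rng = np.random.default_rng(0)
train_cues = [("left_hand", 0.5), ("right_hand", 1.5), ("relax", 3.0)]
X = [features(make_epoch(amp, rng)) for _, amp in train_cues for _ in range(20)]
y = [label for label, _ in train_cues for _ in range(20)]
clf = NearestCentroid().fit(X, y)

# Classify a fresh epoch; the synth stage would then play the mapped pitch.
thought = clf.predict(features(make_epoch(3.0, rng)))
print(thought, NOTE_HZ[thought])
```

In a real system the epochs would come from the bioamplifier rather than a synthetic generator, and the predicted label would be streamed to the music software instead of printed.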
During the “The Life of the Techno Buddha” group’s presentation, Medisetty’s brain waves appeared within a video Buddha figure. The students constructed the figure and its original sounds from 108 YouTube videos they found when searching for “Buddha.”
“We use the music-synthesizing software to generate a fully customizable sound palette that adapts in real time to user preferences,” Rudy said. “Brain wave scans are used to detect user likes and dislikes that guide the computer in its selection of music, environmental sounds and effects. As the user reacts to the computer-generated soundscape, the computer learns what sounds the user prefers and customizes itself to these preferences.”
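The adaptive selection Rudy describes can be thought of as reinforcing sounds the listener likes and suppressing ones they don't. The sketch below is a loose, hypothetical model of that idea: the palette, the multiplicative weight update, and the simulated like/dislike signal (which in the real system would be decoded from brain waves) are all illustrative assumptions.

```python
import random

PALETTE = ["chant", "bell", "rain", "drone"]

class AdaptivePalette:
    """Sound palette whose selection weights adapt to listener feedback."""

    def __init__(self, sounds, learning_rate=0.3):
        self.weights = {s: 1.0 for s in sounds}
        self.lr = learning_rate

    def choose(self, rng):
        """Sample a sound in proportion to its learned weight."""
        total = sum(self.weights.values())
        r = rng.uniform(0, total)
        for sound, w in self.weights.items():
            r -= w
            if r <= 0:
                return sound
        return sound  # fallback for floating-point edge cases

    def feedback(self, sound, liked):
        """Reinforce or suppress a sound based on the decoded reaction."""
        factor = 1 + self.lr if liked else 1 - self.lr
        # Keep a small floor so no sound is permanently eliminated.
        self.weights[sound] = max(0.05, self.weights[sound] * factor)

rng = random.Random(0)
palette = AdaptivePalette(PALETTE)

# Simulate a listener whose decoded brain-wave response favors "rain".
for _ in range(200):
    s = palette.choose(rng)
    palette.feedback(s, liked=(s == "rain"))

favorite = max(palette.weights, key=palette.weights.get)
print(favorite)  # → rain
```

Over repeated reactions the preferred sound comes to dominate the mix, which is the "computer learns what sounds the user prefers" behavior the quote describes.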
In addition to musical inspiration, brain wave sonification (listening to one’s own brain waves) offers several potential healthcare benefits, including stress reduction, addiction rehabilitation, epilepsy protection and intervention for severe motor disorders.
“You can connect the bioamplifier machine to a robot hand, and a paralyzed patient can have a much better quality of life if his or her thoughts are sonified,” Derakhshani said. “The brain adapts to the machine and vice versa, and you can train your brain for biofeedback.”
Engineering and Conservatory students created videos of their final projects.