
PhD position open

25.01.2018


In cognitive neuroscience, sound sequences are used as abstracted models of temporal and sensorimotor processing in individuals and in multi-agent interactions. In engineering, Brain-Computer Interfaces (BCI) and neurofeedback applications have been developed to provide users with alternative pathways of communication, as well as neurorehabilitation treatment protocols. Bridging these two fields, the goal of this project is the interdisciplinary development of a Brain-Computer Interface platform for investigating the human brain in continuous interaction with synthesized sound and music stimuli, e.g. the underlying neural dynamics of entrainment, temporal prediction, synchronization, and adaptation.

You will be a member of the interdisciplinary project “INTERACT” in collaboration with partners from the Georgetown University Medical Center. Your work will focus on the systems engineering side: development and validation of sound synthesis algorithms; development and validation of Brain-to-Sound interfaces; and collaboration with researchers at Georgetown on implementing and transferring neuroscientific theories and models into algorithms and interfaces.

More details here: https://portal.mytum.de/jobs/wissenschaftler/NewsArticle_20180125_090056/newsarticle_view?