Dan Bohus
Senior Researcher

Microsoft Research
One Microsoft Way
Redmond, WA 98052

research agenda
My work centers on the study and development of computational models for physically situated spoken language interaction and collaboration. The long-term question that shapes my research agenda is: how can we enable interactive systems to reason more deeply about their surroundings and seamlessly participate in open-world, multiparty dialog and collaboration with people?

Physically situated interaction hinges critically on the ability to reason about and model processes such as conversational engagement, turn-taking, grounding, interaction planning, and action coordination. Creating robust solutions that operate in the real world brings broader AI challenges to the fore. Examples include questions of representation (e.g., what are useful formalisms for creating actionable, robust models of multiparty interaction?), machine learning methods for multimodal inference from streaming sensory data, predictive modeling, and decision making and planning under uncertainty and temporal constraints.

activities & news
Oct '17: Our recent papers on scene shaping and on diagnosing failures in physically situated systems deployed in the wild were accepted for publication at the AAAI Fall Symposium and ICSR!
Jul '17: We introduced the Platform for Situated Intelligence project in the Integrative-AI session at the MSR Faculty Summit.
Jul '17: I gave an invited talk on physically situated language interaction at MSR Cambridge AI Summer School.
Dec '16: The special issue of AI Magazine on Turn-Taking and Coordination in Human-Machine Interaction that I co-edited is now published. It's a fun and interesting collection of articles on the topic. A big thanks to the contributors and my co-editors!
Aug '16: Sean Andrist from the University of Wisconsin-Madison has recently joined our group! Check out his great work on gaze and human-robot interaction. Looking forward to expanding our research in the HRI space!
Jun '16: I have started serving as a member of the steering board for ICMI, the International Conference on Multimodal Interaction.
May '16: I gave an invited talk at the workshop on Designing Speech and Multimodal Interactions for Mobile, Wearable, and Pervasive Applications at CHI 2016 in San Jose.
Dec '15: I gave an invited talk on opportunities and challenges in situated dialog at ASRU 2015 in Scottsdale, AZ.
Apr '15: I am serving as Program Chair for ICMI'2015, to be held this November in Seattle. The paper submission deadline is May 15th.
Mar '15: I co-organized and attended the AAAI Spring Symposium on Turn-taking and Coordination in Human-Machine Interaction at Stanford University.
Nov '14: I attended ICMI'2014 and gave an invited keynote presentation at the co-located 7th Workshop on Eye Gaze in Intelligent Human Machine Interaction.
Sep '14: Zhou Yu started as an intern with Eric Horvitz and me. Looking forward to a fun and productive fall!
Aug '14: With Sean Andrist, Bilge Mutlu, David Schlangen, and Eric Horvitz, I am organizing an AAAI Spring Symposium on Turn-Taking and Coordination in Human-Machine Interaction.
Aug '14: Two papers, one on generating hesitations based on forecasting models and one on communicating about uncertainty in embodied agents, were recently accepted for presentation at ICMI'2014.
Aug '14: We have deployed a third Directions Robot, on the 4th floor of Building 99. Full coverage!
May '14: I am co-organizing the ICMI'14 workshop on Understanding and Modeling Multiparty, Multimodal Interactions.
Apr '14: A piece on our research featured on Engadget.
Situated interaction project overview
Video highlighting work on communicating about uncertainty in embodied agents
Directions Robot video