Angelica Lim
Title
Cited by
Year
Robot musical accompaniment: integrating audio and visual cues for real-time synchronization with a human flutist
A Lim, T Mizumoto, LK Cahier, T Otsuka, T Takahashi, K Komatani, ...
2010 IEEE/RSJ International Conference on Intelligent Robots and Systems …, 2010
40 · 2010
UE-HRI: a new dataset for the study of user engagement in spontaneous human-robot interactions
A Ben-Youssef, C Clavel, S Essid, M Bilac, M Chamoux, A Lim
Proceedings of the 19th ACM international conference on multimodal …, 2017
36 · 2017
The MEI Robot: Towards Using Motherese to Develop Multimodal Emotional Intelligence
A Lim, HG Okuno
IEEE Transactions on Autonomous Mental Development 6 (2), 126-138, 2014
31 · 2014
Towards expressive musical robots: a cross-modal framework for emotional gesture, voice and music
A Lim, T Ogata, HG Okuno
EURASIP Journal on Audio, Speech, and Music Processing 2012 (1), 1-12, 2012
30 · 2012
A Recipe for Empathy: Integrating the Mirror System, Insula, Somatosensory Cortex and Motherese
A Lim, HG Okuno
International Journal of Social Robotics 7 (1), 35-49, 2015
28 · 2015
Converting emotional voice to motion for robot telepresence
A Lim, T Ogata, HG Okuno
2011 11th IEEE-RAS International Conference on Humanoid Robots, 472-479, 2011
19 · 2011
Integration of flutist gesture recognition and beat tracking for human-robot ensemble
T Mizumoto, A Lim, T Otsuka, K Nakadai, T Takahashi, T Ogata, ...
Proc of IEEE/RSJ-2010 Workshop on Robots and Musical Expression, 159-171, 2010
14 · 2010
Using speech data to recognize emotion in human gait
A Lim, HG Okuno
International Workshop on Human Behavior Understanding, 52-64, 2012
11 · 2012
How does the robot feel? perception of valence and arousal in emotional body language
M Marmpena, A Lim, TS Dahl
Paladyn, Journal of Behavioral Robotics 9 (1), 168-182, 2018
9 · 2018
Habit detection within a long-term interaction with a social robot: an exploratory study
C Rivoire, A Lim
Proceedings of the International Workshop on Social Learning and Multimodal …, 2016
8 · 2016
HRI 2018 Workshop: Social Robots in the Wild
R Mead, DH Grollman, A Lim, C Yeung, A Stout, WB Knox
Companion of the 2018 ACM/IEEE International Conference on Human-Robot …, 2018
6 · 2018
Developing robot emotions through interaction with caregivers
A Lim, HG Okuno
Handbook of Research on Synthesizing Human Emotion in Intelligent Systems …, 2015
6 · 2015
A multimodal tempo and beat-tracking system based on audiovisual information from live guitar performances
T Itohara, T Otsuka, T Mizumoto, A Lim, T Ogata, HG Okuno
EURASIP Journal on Audio, Speech, and Music Processing 2012 (1), 1-17, 2012
6 · 2012
A musical robot that synchronizes with a coplayer using non-verbal cues
A Lim, T Mizumoto, T Ogata, HG Okuno
Advanced Robotics 26 (3-4), 363-381, 2012
6 · 2012
Audio-visual musical instrument recognition
A Lim, K Nakamura, K Nakadai, T Ogata, HG Okuno
73rd National Convention of the Information Processing Society of Japan (IPSJ) 5, 9, 2011
5 · 2011
Robot musical accompaniment: real-time synchronization using visual cue recognition
A Lim, T Mizumoto, T Otsuka, T Takahashi, K Komatani, T Ogata, ...
Proceedings of the IEEE/RSJ International Conference on Intelligent RObots …, 2010
4 · 2010
Generating robotic emotional body language with variational autoencoders
M Marmpena, A Lim, TS Dahl, N Hemion
2019 8th International Conference on Affective Computing and Intelligent …, 2019
3 · 2019
The omg-empathy dataset: Evaluating the impact of affective behavior in storytelling
P Barros, N Churamani, A Lim, S Wermter
2019 8th International Conference on Affective Computing and Intelligent …, 2019
3 · 2019
Gaze and filled pause detection for smooth human-robot conversations
M Bilac, M Chamoux, A Lim
2017 IEEE-RAS 17th International Conference on Humanoid Robotics (Humanoids …, 2017
2 · 2017
Making a robot dance to diverse musical genre in noisy environments
JL Oliveira, K Nakamura, T Langlois, F Gouyon, K Nakadai, A Lim, ...
2014 IEEE/RSJ International Conference on Intelligent Robots and Systems …, 2014
2 · 2014
Articles 1–20