The robot avatar proposed in this paper is a personal companion that captures users’ facial expressions and, in real time, translates the extracted emotion data into the gestures best suited to express the user’s emotion. This robot allows users to enjoy the physical aspects of communication and to add emphasis to parts of a conversation, thereby raising its quality. In this research, we conducted experiments to validate the efficacy of translating facial expressions into robot movements.