University of Toyama
Preparing students for public speaking and interviews places considerable demands on the teacher in the EFL classroom. “Total communication” consists of verbal communication (VC) and nonverbal communication (NVC), and in the large EFL classes typical in Japan it is difficult to focus on NVC. Our system, the Virtual Interview and Presentation Assistant, uses the Microsoft Kinect to help evaluate the NVC performance of EFL students. In this report, we focus on the development of our facial expression (FE) component. We simulated a job interview with twelve 19-year-old students using questions supplied by local companies. Their responses were recorded by the Kinect sensor and a standard video camera. The Kinect monitored changes in FE: facial movement, gaze, and engagement. After each interview, a judge rated their impression of the interviewee’s FE on a 5-point scale at 3-second intervals. At this stage, users can view a video of the assessment along with real-time scoring. The purpose of this experiment was to establish a baseline for future research into differences in FE between L1 and L2 environments. By doing so, our system can provide valuable feedback to users and help them prepare for important interviews in their future.
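As a minimal sketch of the scoring alignment described above, the fragment below bins hypothetical Kinect-derived FE events (the signal names, timestamps, and ratings are all illustrative assumptions, not the authors' data or the Kinect SDK API) into the same 3-second windows used for the judge's 5-point ratings:

```python
from collections import defaultdict

INTERVAL = 3.0  # seconds; matches the 3-second rating windows in the study

def bin_events(events, interval=INTERVAL):
    """Group (timestamp, signal) pairs into fixed-length windows.

    `events` is a list of (time_in_seconds, signal_name) tuples; the
    signal names ("gaze", "engagement", "facial_movement") are
    hypothetical labels for the FE channels monitored by the Kinect.
    Returns {window_index: {signal_name: count}}.
    """
    bins = defaultdict(lambda: defaultdict(int))
    for t, signal in events:
        bins[int(t // interval)][signal] += 1
    return {k: dict(v) for k, v in bins.items()}

# Illustrative 9-second clip with a few detected FE changes.
events = [
    (0.5, "gaze"), (1.2, "facial_movement"),
    (3.1, "gaze"), (4.8, "engagement"),
    (7.0, "gaze"),
]
windows = bin_events(events)

# Hypothetical judge ratings (5-point scale), one per 3-second window,
# paired with the FE counts detected in that same window.
ratings = {0: 4, 1: 3, 2: 4}
aligned = {w: (windows.get(w, {}), ratings[w]) for w in ratings}
```

Pairing the per-window FE counts with the per-window ratings in this way would give the kind of baseline table the experiment aims to produce.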