In the fifth post of his Tech to Teach the Teacher blog series, Zac Zidik discusses how he created movement for the virtual student character he built for Ann Clements’ Open Innovation Challenge idea, and how he manipulated the virtual space around it.
Zidik quickly got the Kinect recognizing all of the body joints it can track. He did, however, run into a few issues, such as joints getting lost when they move outside the Kinect camera’s field of view, and an inability to track individual finger movements, such as pointing.
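The lost-joint problem described above can be handled with a simple fallback filter. The sketch below is a hypothetical illustration (the class and method names are not from Zidik’s post): it keeps the last reliably tracked position for each joint, so a joint that leaves the camera’s view holds its last good value instead of snapping to unreliable data.

```python
from enum import Enum


class TrackingState(Enum):
    """Tracking quality levels, loosely modeled on the Kinect SDK's joint states."""
    NOT_TRACKED = 0
    INFERRED = 1
    TRACKED = 2


class JointFilter:
    """Remembers the last reliably tracked position of each joint and
    falls back to it whenever the sensor loses or merely infers the joint."""

    def __init__(self):
        self.last_known = {}

    def update(self, joint_name, position, state):
        if state == TrackingState.TRACKED:
            # Reliable reading: store it and pass it through.
            self.last_known[joint_name] = position
            return position
        # Lost or inferred joint: reuse the last good position if we have one.
        return self.last_known.get(joint_name, position)
```

In a real Unity/Kinect project this logic would live in the skeleton-update callback, but the fallback idea is the same: only overwrite a joint’s pose when the sensor reports it as fully tracked.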
Beyond using the Kinect to drive a character’s motion, Zidik has been experimenting with ways characters can move around a virtual classroom. One of those experiments involved manipulating the virtual camera’s rotation.
Another experiment explored how it felt to move deeper into the virtual space as he stepped closer to the Kinect, since a teacher would need to walk toward students seated in rows. Zidik has also been exploring voice commands as a way to control the virtual camera’s rotation.
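The walk-toward-the-sensor experiment boils down to a mapping from physical distance to virtual depth. The sketch below is an illustrative assumption, not Zidik’s actual implementation; all the ranges (the sensor’s usable 0.8–4.0 m play space and the 0–10 unit classroom depth) are made-up example values.

```python
def kinect_depth_to_camera_z(user_z_m, near_m=0.8, far_m=4.0,
                             scene_near=0.0, scene_far=10.0):
    """Map the user's physical distance from the sensor (in metres) to a
    camera z-position in the virtual classroom: stepping toward the
    Kinect moves the camera deeper into the room, toward the student rows.

    All range values are illustrative assumptions, not measured constants.
    """
    # Clamp to the sensor's usable tracking range.
    z = max(near_m, min(far_m, user_z_m))
    # Normalise: 0.0 at the back of the play space, 1.0 right at the sensor.
    t = (far_m - z) / (far_m - near_m)
    # Interpolate into the virtual room's depth range.
    return scene_near + t * (scene_far - scene_near)
```

Standing at the back of the play space leaves the camera at the front of the virtual room; each step toward the sensor pushes it linearly deeper, which matches the “walk closer to the students” behaviour described above.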
For a detailed look at Zidik’s virtual space experiment, see http://www.flagamengine.com/blog/tech-to-teach-the-teacher-part-five-moving-in-space/.