PARADISE BLUES


This project explores real-time interactive world-building in virtual reality. Users speak and objects appear around them. They ask kindly and mountains move. Basically, it turns them into the gods of their own terrible little worlds.

Voice control allows virtual reality to feel a little closer to our reality, in which spoken words have meaning and merit responses. This experience allows us to consider what might happen when virtual reality advances enough to feel truly real: will we all succumb to the temptations of complete control and live our lives in VR? Or will the fun of making snakes appear in mid-air wear off, leaving jaded VR users to spend their time elsewhere: maybe reaching ever closer to accurate mouth-feel for their virtual borscht?

This project was built as a final project for Nicole He’s ‘Hello, Computer’ course on voice interfaces at ITP, NYU’s Interactive Telecommunications Program.

Technologies used:

  • Unity3D is the game engine / 3D development environment in which, together with SteamVR, this VR experience was built,
  • the IBM Watson API provided speech recognition and text-to-speech,
  • Dialogflow semantically parsed the text recognized by Watson and returned the extracted meaning to Unity3D, drawing on data from Darius Kazemi’s open-source Corpora project, and
  • Google Poly is a massive database of (relatively) low-poly 3D models, with an API that lets the app query for and download models at runtime and instantiate them in the scene (a rough sketch of this pipeline follows the list).
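The runtime flow implied by the list is: Watson transcribes the user’s speech, Dialogflow extracts the intent and the thing being asked for, and that thing’s name is used to search Poly for a model to drop into the world. The sketch below illustrates that request/response flow in Python against the Dialogflow v2 and Poly REST endpoints. It is not the project’s Unity/C# code; the credentials, project and session IDs, and the Dialogflow parameter name ("object") are placeholders and assumptions, shown only to make the shape of the pipeline concrete.

```python
# Minimal sketch of the voice-to-object pipeline (assumptions noted inline).
# The real project runs inside Unity; this only mirrors the API round trips.
import requests

DIALOGFLOW_TOKEN = "..."   # placeholder: OAuth2 bearer token for the Dialogflow agent
PROJECT_ID = "my-agent"    # placeholder: Dialogflow project id
SESSION_ID = "vr-session"  # placeholder: arbitrary session id
POLY_API_KEY = "..."       # placeholder: Google Poly API key


def parse_intent(transcript: str) -> dict:
    """Send the Watson transcript to Dialogflow and return its queryResult."""
    url = (f"https://dialogflow.googleapis.com/v2/projects/{PROJECT_ID}"
           f"/agent/sessions/{SESSION_ID}:detectIntent")
    body = {"queryInput": {"text": {"text": transcript, "languageCode": "en-US"}}}
    resp = requests.post(url, json=body,
                         headers={"Authorization": f"Bearer {DIALOGFLOW_TOKEN}"})
    resp.raise_for_status()
    return resp.json()["queryResult"]


def find_poly_model(keyword: str) -> str | None:
    """Search Poly for a matching low-poly model and return its glTF root URL."""
    resp = requests.get("https://poly.googleapis.com/v1/assets",
                        params={"keywords": keyword, "format": "GLTF2",
                                "key": POLY_API_KEY})
    resp.raise_for_status()
    assets = resp.json().get("assets", [])
    if not assets:
        return None
    # Pick the glTF 2.0 format entry of the first hit; Unity would download
    # and instantiate this file in the scene.
    for fmt in assets[0].get("formats", []):
        if fmt.get("formatType") == "GLTF2":
            return fmt["root"]["url"]
    return None


if __name__ == "__main__":
    transcript = "please put a snake in front of me"      # e.g. from Watson STT
    result = parse_intent(transcript)
    thing = result["parameters"].get("object", "snake")   # parameter name is an assumption
    print("Intent:", result["intent"]["displayName"])
    print("Model URL:", find_poly_model(thing))
```

In the project itself these calls happen from within Unity, presumably through the respective Unity SDKs, with the returned model instantiated at a position in front of the user.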