
Soil micro-environments with augmented reality

For the final I experimented with projection and augmented reality (Unity & Vuforia) to tell the story of how plants remediate their environments. I narrowed this down to phytoremediation with sunflowers specifically, to keep it manageable for a week-long project. Sunflowers accumulate lead from the soil, but like all bioremediators, they then become toxic themselves. I also wanted to show the complexity of the micro-environments in soil. My grand aspiration was to create one of these experiences for each of the ways to remediate soil.

I put everything together with some text and animations. Many of the soil images I uploaded to Vuforia as tests got much better tracking ratings once I brightened them. I made a big composite image of these trackable parts, and then used screenshots of the composite as my image targets.
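I brightened the images by hand, but if I wanted to batch this, something like the following Unity C# sketch could do it before export (the component name, multiplier, and output path are all just placeholders, not part of my actual workflow):

    using System.IO;
    using UnityEngine;

    // Rough sketch: brighten a texture before uploading it to the Vuforia
    // portal, since brighter images rated better for me. The texture must
    // have Read/Write Enabled checked in its import settings.
    public class BrightenForVuforia : MonoBehaviour
    {
        public Texture2D source;     // soil photo to brighten
        public float factor = 1.4f;  // placeholder brightness multiplier

        void Start()
        {
            Color[] pixels = source.GetPixels();
            for (int i = 0; i < pixels.Length; i++)
            {
                Color c = pixels[i];
                pixels[i] = new Color(
                    Mathf.Min(c.r * factor, 1f),
                    Mathf.Min(c.g * factor, 1f),
                    Mathf.Min(c.b * factor, 1f),
                    c.a);
            }
            source.SetPixels(pixels);
            source.Apply();

            // Write the result out so it can be uploaded as an image target.
            File.WriteAllBytes(
                Path.Combine(Application.persistentDataPath, "brightened.png"),
                source.EncodeToPNG());
        }
    }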

Gabe helped me find 3D models of bacteria and bugs, but they had so many vertices and faces that they lagged on the phone. I tried using Blender's Decimate modifier to simplify the objects, but this made me want to throw my computer out the window. Instead, I used Blender to identify the objects with the fewest faces and used those. Gabe later suggested correcting this with mobile shaders in Unity.
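As a sketch of that last suggestion (the component name is mine, and I never tested this in the project), a small script could walk a model's renderers and swap every material over to one of Unity's built-in mobile shaders:

    using UnityEngine;

    // Hypothetical helper: swap every material under this object to the
    // built-in Mobile/Diffuse shader, which skips most of the lighting
    // work that makes dense models expensive on phones.
    public class MobileShaderSwap : MonoBehaviour
    {
        void Start()
        {
            // Shader.Find only works in a build if the shader is included,
            // e.g. under Project Settings > Graphics > Always Included Shaders.
            Shader mobileDiffuse = Shader.Find("Mobile/Diffuse");
            if (mobileDiffuse == null) return;

            foreach (Renderer r in GetComponentsInChildren<Renderer>())
            {
                foreach (Material m in r.materials)
                {
                    m.shader = mobileDiffuse; // keeps the main texture
                }
            }
        }
    }

Cheaper shaders only lower the per-pixel cost, though; they don't reduce the vertex count, so decimating or picking low-poly models is still the bigger win.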

It might have been useful to add more information with audio, since it seems better to avoid lots of text that you'd need to read. I would have liked to incorporate sound but couldn't find a simple way to attach it to an event, like a found target or a rendered object. I'm not sure per-target sound would have added to the experience anyway, since you can see multiple targets at the same time and the clips would overlap. It's easier to attach sound to the camera, so maybe I could have added some ambient noise.
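In hindsight, the Vuforia Unity SDK of that era did expose a hook for this: a script on an ImageTarget can register for trackable state changes. This is an untested sketch, not what I shipped (newer Vuforia versions replace this API with DefaultObserverEventHandler's found/lost events):

    using UnityEngine;
    using Vuforia;

    // Untested sketch: attach to an ImageTarget alongside an AudioSource.
    // The clip plays while the target is tracked and stops when it is lost.
    [RequireComponent(typeof(AudioSource))]
    public class PlaySoundOnFound : MonoBehaviour, ITrackableEventHandler
    {
        AudioSource audioSource;

        void Start()
        {
            audioSource = GetComponent<AudioSource>();
            var trackable = GetComponent<TrackableBehaviour>();
            if (trackable != null)
                trackable.RegisterTrackableEventHandler(this);
        }

        public void OnTrackableStateChanged(
            TrackableBehaviour.Status previousStatus,
            TrackableBehaviour.Status newStatus)
        {
            bool found = newStatus == TrackableBehaviour.Status.DETECTED ||
                         newStatus == TrackableBehaviour.Status.TRACKED ||
                         newStatus == TrackableBehaviour.Status.EXTENDED_TRACKED;

            if (found && !audioSource.isPlaying)
                audioSource.Play();
            else if (!found)
                audioSource.Stop();
        }
    }

With several targets visible at once this would still layer clips on top of each other, which is the overlap problem I worried about; a single ambient AudioSource on the ARCamera sidesteps that.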

My original project proposal:
