Coursework, notes, and progress while attending NYU's Interactive Telecommunications Program (ITP)

EcoExchange playtest #2

We spent today play-testing our projects with another class and received some valuable feedback! At this point we have the visualization Swapna created for her ICM final and have been saving data so we can make this connection clear, but we haven’t completed construction of the mechanism. Data collection is a huge part of the project’s purpose, though, so it was helpful to show others its present state! The feedback and questions we received:

  • Consider the ‘face’ of the plant
  • Could the visualization of noise be more clear?
  • Where do we see this installed?
  • Compare the real-time data to something. Is it happy? What do we learn about the state of the plant by seeing this?
  • Add an exploration element. What compels me to stay with the plant?
  • Could the donation prompt the data visualization? Show something further?
  • Should this feel like charity to the plant? Or should people interacting with it be getting something back?
  • It feels like a leap to the connection to global warming and carbon sequestration; how can we make this more apparent?
  • Prompt people for more money?
  • (Related to the comparison comment) How could we tell what the state of the environment would be without this plant, i.e. its contribution? How could we see the contribution of this plant specifically?
  • What is the CO2 concentration when there are no people?
  • Way to show the relationship between photosynthesis and the light/moisture around the plant?

EcoExchange Responsive Plant

Our API is live!!!

It updates every minute. We are getting good overnight voltage data from our CO2 sensor, which will help us get *real* ppm values; the current values are based on a zero-point voltage and voltage drop that are clearly under-calculating CO2 ppm.

We haven’t gathered enough days of data to fully calibrate our sensor, but we’re very close! We also named our project, Swapna put together an awesome visualization, and we’ve worked on implementing the coin-operated aspect, which we’ll play-test in class. Our as-yet sparsely commented code is on GitHub.
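The voltage-to-ppm conversion we’re trying to calibrate can be sketched like this. This is a hedged sketch in JavaScript; the math mirrors the log-linear curve used in the MG-811 sample code we’ve been referencing, and both constants are placeholder values, not our calibrated numbers:

```javascript
// A sketch of the voltage -> ppm math (mirrors the log-linear curve in the
// MG-811 sample code). Both constants below are placeholders that our
// overnight baseline data should eventually replace.
const ZERO_POINT_VOLTAGE = 0.324; // assumed amplifier output at 400 ppm
const REACTION_VOLTAGE = 0.020;   // assumed voltage drop from 400 to 1000 ppm

// Slope of the line through (log10(400 ppm), zero-point V) and
// (log10(1000 ppm), zero-point V - reaction V), in volts per decade.
const SLOPE = REACTION_VOLTAGE / (Math.log10(400) - Math.log10(1000));

function voltageToPpm(volts) {
  // Invert the line: log10(ppm) = (volts - zero point) / slope + log10(400)
  return Math.pow(10, (volts - ZERO_POINT_VOLTAGE) / SLOPE + Math.log10(400));
}
```

Once the overnight data gives us a trustworthy zero-point voltage, only the two constants should need to change.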

Updated production plan including this week’s progress (/ indicates in progress; x indicates complete):

| Dates | Tasks | Complete? |
| --- | --- | --- |
| 11/16/2016–11/22/2016 | Adafruit order (motor, sensors) | x |
| | Experiment with sensors (infrared, CO2) | x |
| | Buy & weigh plant | / |
| | Sensor data to Node.js | x |
| | Node.js to p5 | x |
| | Buy motor with right amount of torque | / |
| | Plan construction of plant holder | x |
| 11/23/2016–11/29/2016 | Make first interaction work: approach plant and it turns | / |
| | Create visualization in p5 based on what we learn about sensor output | / |
| | Write Node.js code to read sensor data & set up database | x |
| 11/30/2016–12/6/2016 | Connect sensors to Node to p5 | / |
| | Build coin box (scrap wood and copper tape) | / |
| 12/7/2016–12/13/2016 | Make second interaction work: add reaction for coin drop | / |
| | Build standalone website | |
| 12/14/2016 | Project due! | |

Responsive plant: sensors & movement

Initial interaction: distance sensors & motor

So far the most complicated aspect of our project has been determining the mechanism to turn the plant! We’ve experimented a lot on the advice of our peers, *Making Things Move*, and Ben Light.

Prototyping with gears and a lazy susan

Prototyping with gears, lazy susan, cardboard, and a 360 servo

First, we tested with a lazy susan, some gears, and a 360° servo motor. We figured that to get the precision we needed, we should use a stepper motor, which we got working with an EZ driver (as opposed to an H-bridge). The EZ driver has pins for power (it needs 5V), direction, speed, and the stepper motor itself. Once we got our proximity sensors, we tested an initial sketch that controls the speed and direction of the turn based on where the sensors detect something.

img_1660

Testing form

We spent some time figuring out how to connect the gears to the bottom of the lazy susan, and considered whether we needed a slip ring for the wires that will run from the plant to the Arduino in the base. After meeting with Ben Light, we think that by putting a rubber stopper on our motor shaft and placing it against the edge of the lazy susan, we can drive the lazy susan directly, but we still need to build this to see if it works.

Sensor data to node.js

My goal is to save sensor data to a database, so the first piece of this was getting the related serial lab to work. I successfully got the moisture and microphone data into the browser! Now I just need to write the code that sends the data to a MongoDB database (I’m using an mLab account) and serves it as an API.
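As a sketch of the shape this could take, assuming the Arduino prints one reading per line as comma-separated key:value pairs (the line format, field names, and the in-memory array standing in for the MongoDB write are all assumptions):

```javascript
// Parse one serial line like "moisture:512,mic:300" into a timestamped
// reading object. In the real version this object would be inserted into a
// MongoDB collection on mLab; here it is pushed to an in-memory array so
// the parsing logic stands alone.
const readings = [];

function parseReading(line) {
  const reading = { time: Date.now() };
  for (const pair of line.trim().split(',')) {
    const [key, value] = pair.split(':');
    reading[key] = Number(value);
  }
  return reading;
}

function saveReading(line) {
  // Placeholder for db.collection('readings').insertOne(parseReading(line))
  readings.push(parseReading(line));
  return readings[readings.length - 1];
}
```

The API route would then just return the most recent documents from that collection as JSON.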

CO2 sensor calibration

We’re working with this MG-811 sensor module, since this seemed to be one of the least expensive CO2 sensors available, and since we were able to find some sample code to go along with it. To figure out how to calibrate it, we’ve referenced documentation here and for a similar sensor module. From our understanding, this sensor works by outputting a voltage, and as the CO2 concentration increases the voltage output decreases. The code that goes along with it sets a voltage equal to 400ppm as a sort of baseline, and then does math to express changing voltage as CO2 concentration.

Going through the MG-811 documentation led us to think the only way to get an accurate reading (although maybe a relative reading is good enough for this project) is to calibrate it with other equipment; otherwise we might never know the true CO2 concentration of the environment. Another limitation of our sensor may be that it’s not meant to detect concentrations below 400 ppm; we’re trying to see whether this limitation can be overcome, and whether changing the math in the sample code will render the data inaccurate.
img_1675

CO2 sensor testing with a breadboard

We have tapped Marina, George, and former ITP-er Nick Johnson (currently at CUSP) to help us think through how to calibrate the sensor and to connect us with people who might have helpful equipment. After a very helpful e-mail exchange with Nick, who directed us here, here, and here, we think one solution could be to leave the sensor running at ITP to get baseline voltage readings, and assume this is 400 ppm. We should definitely set this up this week!
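The overnight-baseline idea is simple enough to sketch: average the voltage samples taken while the space is empty and treat that average as the voltage at 400 ppm. (A hypothetical sketch; the sample values below are made up for illustration.)

```javascript
// Average the overnight voltage samples (taken when the space is empty)
// to get the zero-point voltage we will treat as ~400 ppm.
function calibrateZeroPoint(overnightVoltages) {
  if (overnightVoltages.length === 0) {
    throw new Error('need at least one overnight sample');
  }
  const sum = overnightVoltages.reduce((total, v) => total + v, 0);
  return sum / overnightVoltages.length;
}

// e.g. one made-up reading per minute overnight:
const zeroPoint = calibrateZeroPoint([0.326, 0.324, 0.322, 0.328]);
```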

Box diagram for initial interaction

Aaron Parsekian helped us think through all of the elements we have discussed including in the project, and where they should go for the distance-sensing and plant-movement interaction:

img_1678

Working from this system diagram for interaction #1

Wireless?

The lowest-priority piece right now, and the part I’m least sure will work, is sending the data wirelessly to Node rather than using wires. We thought we might be able to use a HUZZAH WiFi module, but to test it on the floor we need to get the MAC address from the device, and I’ve already run into trouble. **UPDATE:** I was able to read the MAC address by printing it in `loop` instead of `setup`.

screen-shot-2016-11-26-at-11-00-01-pm

Garbled MAC address

Once we can connect the device to the sandbox network on the floor, I’m still not sure how we will send the data from multiple environment sensors, especially since the module communicates at a higher baud rate.

Updated Project Timeline

We are behind on some things we intended to have complete by now; ‘/’ indicates these are in progress!

| Dates | Tasks | Complete? |
| --- | --- | --- |
| 11/16/2016–11/22/2016 | Adafruit order (motor, sensors) | x |
| | Experiment with sensors (infrared, CO2) | / |
| | Buy & weigh plant | |
| | Sensor data to Node.js | x |
| | Node.js to p5 | x |
| | Buy motor with right amount of torque | |
| | Plan construction of plant holder | x |
| 11/23/2016–11/29/2016 | Make first interaction work: approach plant and it turns | / |
| | Create visualization in p5 based on what we learn about sensor output | |
| | Write Node.js code to read sensor data & set up database | / |
| 11/30/2016–12/6/2016 | Connect sensors to Node to p5 | |
| | Build coin box (scrap wood and copper tape) | |
| 12/7/2016–12/13/2016 | Make second interaction work: add reaction for coin drop | |
| | Build standalone website | |
| 12/14/2016 | Project due! | |

Updated BOM

| Item | Quantity | Amount ($) | Link to purchase | Did we get it? |
| --- | --- | --- | --- | --- |
| Plant | 1 | $20.00 | | Not yet |
| Pot/Plant holder | 1 | $10.00 | | Not yet |
| Lazy susan | 1 | $4.48 | http://www.homedepot.com/p/Everbilt-6-in-Square-Lazy-Susan-Turntable-with-400-lb-Load-Rating-49548/203661089?cm_mmc=SEM|THD|google|&mid=sXHAZG02o|dc_mtid_8903tb925190_pcrid_111414437105_pkw__pmt__product_203661089_slid_&gclid=Cj0KEQiA6_TBBRDInaPjhcelt5oBEiQApPeTF2Fa17c1-6Yvlaw5TNzVQrrl1r3-p2UoFVOWTScYFREaAnIT8P8HAQ | Currently borrowing for prototyping |
| Gear kit | 1 | $12.99 | http://www.robotshop.com/en/vex-gear-kit.html | Borrowed |
| Rubber stopper (small) | 1 | $1.95 | http://www.robotshop.com/en/vex-gear-kit.html | Borrowed |
| Rubber stopper (large) | 1 | $3.60 | | Yes |
| Screws and nuts | assorted | $6.00 | | Yes |
| 125 oz/in Stepper Motor | 1 | $23.95 | https://www.sparkfun.com/products/13656 | Smaller motor borrowed |
| EZ driver | 1 | $14.95 | https://www.sparkfun.com/products/12779 | Yes |
| IR distance sensor (10-80cm) | 3 | $44.85 | https://www.adafruit.com/products/164 | Yes |
| MG-811 CO2 Gas Sensor Module | 1 | $58.95 | https://www.jameco.com/z/SEN0159-DFRobot-CO2-Sensor-Arduino-Compatible-_2213270.html | Yes |
| Adafruit Feather HUZZAH w/ ESP8266 WiFi | 1 | $15.95 | https://www.adafruit.com/products/2821 | Jasmine has this |
| Arduino Mega | 1 | $45.95 | https://www.adafruit.com/product/191 | Jasmine has this if needed |
| Slip ring | 1 | $14.95 | https://www.sparkfun.com/products/13064 | Yes |
| Moisture sensor | 1 | $4.95 | https://www.sparkfun.com/products/13322 | Yes |
| Microphone (w/breakout) | 1 | $5.95 | https://www.sparkfun.com/products/12758 | Yes |
| Scrap (wood/cardboard) | some | | | School |
| Copper tape | some | | | School |
| Total | | $263.62 | | |

Final Project Production Plan

This week Swapna and I finalized our proposal, and developed the bill of materials, timeline, and system diagrams necessary to complete the project.

Project Description

Short: A coin-operated responsive plant that rotates to acknowledge human presence, alongside a visualization of the CO2 in its environment, which plants do the work of converting into human-breathable oxygen, a service people have the option to pay for.

Long: Our project is meant, in part, to prompt people to think about the work plants do regulating our environment and making it habitable, and what it means to put a monetary value on this in a capitalist society. It is also an exchange with a non-human, generally immobile, but living being: you communicate with it via your breath and physical presence, while it communicates through respiration and the representation of data. We want to bring forth the way you are each affecting the surrounding environment and sensing each other’s presence in different ways.

System Diagram

Project Timeline

| Dates | Tasks |
| --- | --- |
| 11/16/2016–11/22/2016 | Adafruit order (motor, sensors) |
| | Experiment with sensors (infrared, CO2) |
| | Buy & weigh plant |
| | Sensor data to Node.js |
| | Node.js to p5 |
| | Buy motor with right amount of torque |
| | Plan construction of plant holder |
| 11/23/2016–11/29/2016 | Make first interaction work: approach plant and it turns |
| | Create visualization in p5 based on what we learn about sensor output |
| | Write Node.js code to read sensor data & set up database |
| 11/30/2016–12/6/2016 | Connect sensors to Node to p5 |
| | Build coin box (scrap wood and copper tape) |
| 12/7/2016–12/13/2016 | Make second interaction work: add reaction for coin drop |
| 12/14/2016 | Project due! |

Bill of Materials

| Item | Quantity | Amount ($) | Link to purchase |
| --- | --- | --- | --- |
| Plant | 1 | $20.00 | |
| Pot/Plant holder | 1 | $10.00 | |
| 125 oz/in Stepper Motor | 1 | $23.95 | https://www.sparkfun.com/products/13656 |
| IR distance sensor (10-80cm) | 3 | $44.85 | https://www.adafruit.com/products/164 |
| MG811 CO2 sensor | 1 | $34.95 | http://www.futurlec.com/Gas_Sensors.shtml |
| Gas sensor breakout board | 1 | $0.95 | https://www.sparkfun.com/products/8891 |
| Adafruit Feather HUZZAH w/ ESP8266 WiFi | 1 | $15.95 | https://www.adafruit.com/products/2821 |
| Arduino Uno | 1 | $24.95 | https://www.adafruit.com/products/50 |
| Total | | $175.60 | |

Other/Additions

| Item | Quantity | Amount ($) | Link to purchase |
| --- | --- | --- | --- |
| Photocells | 2 | $1.90 | https://www.adafruit.com/products/161 |
| Moisture sensor | 1 | $4.95 | https://www.sparkfun.com/products/13322 |
| Microphone (tiny) | 1 | $0.95 | https://www.adafruit.com/products/1935 |
| Microphone (w/breakout) | 1 | $5.95 | https://www.sparkfun.com/products/12758 |

Beyond Scarcity Hackathon

There were three main themes to the Beyond Scarcity hackathon: alternative narratives, making the invisible visible, and internalities (internalizing costs, to counter the economic concept of externalities). Swapna and I wanted to continue exploring the idea of human-plant mutualism, and teamed up with Viniyata, Andy, and Lola to create a project in a similar vein to our final.

We combined the story of the Alley Pond Giant, a large tulip tree in Queens and probably the oldest living organism in New York City, with the idea that the natural environment provides humans with priceless services that we devalue for immediate economic gain. Tulip trees happen to be one of the most efficient plants in converting CO2 into oxygen. We wanted to create an experience to prompt people to think about the conversion of the CO2 we release as humans with this work that trees are constantly doing for us.

img_1641

Prototype: coin-operated ‘tree’ releases oxygen when you pay

Our final project allowed people to visualize the volume of oxygen released in an hour (our non-exact estimate as non-botanists for a tulip tree of this size is about 1 liter). When you pay a quarter, a small amount of air is released from the balloon. We did some calculations to estimate how much CO2 a tree like the Alley Pond Giant converts into oxygen on a yearly and daily basis, but we couldn’t get the volume of the balloon or the amount released exact with the hackathon’s time constraints.

Still, we hope it’s a fun and thought-provoking way to think about the carbon sequestration trees do for us, and the important role they play in giving us air to breathe! Unfortunately, this kind of monetary quantification of natural resources is one of the only ways to get the capitalist regimes of the world to understand this immense value. Our project also seems relevant given the advent of bottled air.

Responsive plant: initial playtesting

Set-up for play-testing

For our final project, Swapna and I are planning a variation on my project proposal: a responsive plant that gathers and shows data about the physical environment, sensing things that are usually invisible, specifically (for now) CO2 and EMF. We had a lot of questions about exactly how the interaction would work, which will of course affect how we execute the project.

Our main questions going into play-testing were:

  • Was the plant moving/turning in response to people approaching enough feedback for the people interacting with it? We imagined there could potentially be other ways the plant communicates, like with light or sound.
  • Is the scale right? We picked a small potted indoor plant for play-testing. Should it be larger? What about if we had multiple plants?
  • How should we depict the data? We set up a laptop with generic data visualizations, but what should the visualizations look like? Should they be screen-based? Should they be projected?
  • Does the interaction make sense to people? How did they interpret it/react?
  • Is it engaging for a sustained period of time?

We got a lot of really great, useful feedback!

  • It could be useful to have a prompt, specifically for the human-technology interaction piece. What could this be? Could we compel people to take out their phones and see how the data visualization changes based on human-generated EMF?
  • A lot of people had comments on the data visualization style. What we took away was that we should not use a graph. One solution we thought of was to include a depiction of the plant itself, and use particles, waves, or something similar to show the CO2 and EMF.
  • People really did like the plant movement as a response. They liked it turning, and thought this acknowledgement was powerful.
  • We got a couple of suggestions for the plant to speak. One person thought it would be cool to give the plant a kind of personality or character. I remember reading a while ago about an experiment where couches were given different personalities based on movement, but I couldn’t find it. I did find this article about anthropomorphized Ikea t-shirts, and of course Ikea also did that lamp commercial that made everyone sad. I think there are a lot of ways we can play with personality!
  • Could we include some kind of depiction of how the plant feels?
  • Could we anchor the interaction in something people do with plants already? Could we link some of the plant action or data visualization to some natural action/interaction? Like watering?
  • Could people interact in some way to toggle between visualizations?
  • How will we keep people from touching the plant? Could we pick a particular plant? Will we make it speak?
  • Could the plant react to people speaking? Or general noise?
  • How to make the result beneficial/positive?
  • Utilize augmented reality?

Developing for cellular networks

I got to attend Amanda Ervin’s developing-for-cellular-networks workshop at the Radical Networks Conference. We used the Adafruit FONA cellular breakout board and sent some initial texts!

img_1598

Set-up

img_1600

Hello world!

Working with the board requires the FONA Arduino library. It was a little tricky to get all the pinouts right, and the board communicates at a different baud rate than the serial connections we’ve been using in class. But in the end it was a pretty simple way to create a cellular device, and I learned a bit about how antennas work in the process. It would be cool to experiment with other AT commands in the future, since we didn’t get to explore these much. Instructions for the Arduino test I did are also available on the Adafruit site.

Responsive plant final project concept

For my final project I’d like to create an interaction where people engage with a plant in an unexpected way. One way I thought this could work was to have the plant turn toward you as you spoke to it. Another idea that I thought could scaffold nicely with this one was a speech-recognition, poetry-generating AI. The final concept: a plant that you speak to, which intelligently manipulates human speech to create another output, either as a print-out or online.

img_1578_2

Potential interaction

I was inspired by Tega Brain’s talk on eccentric engineering, a way of thinking about building infrastructure “based on mutualism” that considers all life, not just humans. Her installation Coin-Operated Wetland was a self-contained system that I thought provoked many questions around this.

In considering how I could think about our coexistence with nature in a city where we’re quite alienated from it, I immediately thought of plants. We keep plants indoors and they form a sort of backdrop, but in fact there is anecdotal and scientific evidence that plants respond to people, communicate with each other, and have feelings.

Pausing to Consider How Plants React to Humans

I was also inspired by chatbot and poetry AI projects.

Potential outputs

One reason I’m excited about this idea is that there are a lot of potential elements to work out and compromise on. There is still a lot of room to explore different kinds of interaction, learn about plants, and add other elements to the project if there’s time:

  • Rotating, as presented in the above proposal, though perhaps there are other ways the plant could react
  • Recording (speech, photo)
  • Found poetry/Surveillance poetry from recorded speech
  • Projections
  • Website (what people said, poetry generated, the poetry the AI was trained on)
  • Self-contained system powered by solar panels

 

Midterm: paper towel gifs

For our midterm, Rushali had the great idea of wirelessly sensing toilet paper use and keeping track of it. After making a list of potential sensors that could detect movement and looking at the hardware in the bathroom, we realized that it would be hard to detect the kind of toilet paper use we were interested in sensing and measuring. But the paper towel dispenser had a spring in it that we thought could easily detect a pull with a simple stretch sensor. Next, we decided the output triggered by a pull should be a gif. We wrote a p5 program that pulls from the Giphy API to create a series of arrays and auto-updates at a set time interval. When a paper towel is pulled from the dispenser, a random gif is displayed from the array, updating with each pull.
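The core of the sketch’s gif logic can be sketched as plain JavaScript. The endpoint and parameter names follow the public Giphy search API; the API key and helper names here are placeholders, not our actual code:

```javascript
// Build a Giphy search URL (public Giphy search API; 'YOUR_KEY' is a
// placeholder for a real API key). The sketch calls this at a set
// interval to refresh its array of gif URLs.
function buildSearchUrl(query, apiKey, limit = 25) {
  return 'https://api.giphy.com/v1/gifs/search' +
    '?q=' + encodeURIComponent(query) +
    '&api_key=' + apiKey +
    '&limit=' + limit;
}

// Called each time the stretch sensor registers a paper towel pull.
function pickRandomGif(gifUrls) {
  return gifUrls[Math.floor(Math.random() * gifUrls.length)];
}
```

In the p5 sketch the fetched URLs get loaded into images; the sensor event just swaps which one is drawn.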

img_1541

Non-wireless arduino & sensor installation

Actually setting up and deploying our circuit into the real world proved challenging! But we were able to hook everything up and insulate our circuit from the metal in the dispenser to get our prototype working.

After we had a wired version working the way we wanted, we set out to make it wireless with a Bluefruit module! Once our sensor was installed we didn’t want to keep removing it and putting it back, so we tested the wireless circuit with a potentiometer first. Then we hooked everything up in the paper towel dispenser and it kind of worked: there was a noticeable delay between when the paper towel was pulled and when the gif changed.

As next steps we discussed potentially keeping the data on paper towel use, since right now we aren’t saving it or keeping track of it in any way. Of course it would also be great to improve the wireless functionality and try to keep it running for a longer period of time. I imagine we could also serve our p5 sketch so that anyone could go to our app and see the gifs change as paper towels get used.

One concern we had is we made using paper towels more fun, so we might be encouraging their use! We thought it might be better to show deforestation gifs. Either way, it was a great way to experiment with sensing things in the real world.

Valparaiso Chile logo

I chose to make a logo for my favorite city in the world: Valparaiso, Chile. My mom is Chilean and my grandmother lived nearby for many years, so I’ve visited several times. It’s a port city that thrived before the Panama Canal was built and has struggled economically since. Its distinctiveness comes from being built over several hills. The sidewalks are often steep or simply stairs, and there are several elevators to bring people up and down the biggest hills and steepest parts. Some are over 100 years old! More recently it has attracted artists and is known for its graffiti and bohemian scene.

Current logo

My Logo

Originally, I purposefully tried to avoid the more expat/bohemian elements of the current culture and make something that the people there would recognize. I abstracted three iconic parts (which is perhaps already too many) to begin the name of the city, and the rest of the letters rest on the steps. I tried to maintain the expressiveness and color of the current logo. My mom was born there, and she understood and was excited by all the essential characteristics, which seemed like a good sign. She really loved the steps!

I changed the height of the A based on feedback; it didn’t quite make logical sense with the rest of the word. I also played with adding trolley cables, but ultimately I didn’t think this was successful.

valpo-final

Final version

I slept on it and decided it was just way too much, so I created a version where the elements are further simplified and got rid of the trolley that was complicating the A. I kept the elevator, and there are tiny steps on the V. The font was actually an accident (I intended to keep Futura, and the smaller text is Futura Condensed), but after typing it out I liked the only slightly varying weight of Myriad and thought it looked good once I played with the letter height.

valpo-final_new

On a tourist map:

logo_map2

Sketches and previous iterations

img_1528     img_1527

screen-shot-2016-10-22-at-5-31-59-pm        screen-shot-2016-10-22-at-7-26-47-pm

Iconic aspects of Valparaiso/Source images

Graffiti

Architecture

Doors

Elevators

Trolleys

trolly     va01

stamp

Port: ships, containers, cranes