Responsive plant: sensors & movement

Initial interaction: distance sensors & motor

So far the most complicated aspect of our project has been determining the mechanism to turn the plant! We’ve experimented a lot on the advice of our peers, Making Things Move, and Ben Light.

Prototyping with gears and a lazy susan
Prototyping with gears, lazy susan, cardboard, and a 360 servo

First, we tested with a lazy susan, some gears, and a 360° servo motor. We figured that to get the precision we needed we should use a stepper motor, which we got to work with an EasyDriver (as opposed to an H-bridge). The EasyDriver has pins for power (it needs 5V), direction (DIR), and stepping (STEP), where the pulse rate on STEP sets the speed, plus outputs for the stepper motor itself. Once we got our proximity sensors, we tested a very initial sketch that controls the speed and direction of the turn based on where the sensors detect something.
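
For reference, that very initial sketch looked something like this (pin numbers and thresholds are illustrative, not our actual wiring): the EasyDriver steps once per pulse on STEP, DIR picks the rotation direction, and the pulse rate sets the speed.

```cpp
// Very initial test: rotate the stepper toward whichever side reads closer.
// Pin numbers and thresholds are illustrative, not our final wiring.
const int STEP_PIN = 2;   // EasyDriver STEP input: one pulse = one (micro)step
const int DIR_PIN  = 3;   // EasyDriver DIR input: HIGH/LOW picks the direction
const int IR_LEFT  = A0;  // Sharp IR sensors: higher analog value = closer
const int IR_RIGHT = A1;

void setup() {
  pinMode(STEP_PIN, OUTPUT);
  pinMode(DIR_PIN, OUTPUT);
}

void loop() {
  int left    = analogRead(IR_LEFT);
  int right   = analogRead(IR_RIGHT);
  int closest = max(left, right);

  if (closest > 300) {                     // someone is in range
    digitalWrite(DIR_PIN, right > left);   // turn toward the closer side
    // Closer readings get a shorter delay between pulses, i.e. a faster turn.
    int stepDelay = map(closest, 300, 1023, 2000, 500);
    digitalWrite(STEP_PIN, HIGH);
    delayMicroseconds(stepDelay);
    digitalWrite(STEP_PIN, LOW);
    delayMicroseconds(stepDelay);
  }
}
```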

Testing form

We spent some time figuring out how to connect the gears to the bottom of the lazy susan, and considered whether we need a slip ring for the wires that will connect the plant to the Arduino in the base. After meeting with Ben Light, we think that by putting a rubber stopper on our motor shaft and placing it against the edge of the lazy susan we can drive the lazy susan directly via friction, but we still need to build this to see if it works.

Sensor data to node.js

My goal is to save sensor data to a database, so the first piece of this was getting the related serial lab to work. I successfully got the moisture and microphone data into the browser! Now I just need to write the code that sends the data to a MongoDB database (I’m using an mLab account) and serve it as an API.
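
The Arduino side of that lab boils down to something like this (pins here are placeholders): print the two readings as one comma-separated line per loop, and let the serialport code on the Node side split it apart.

```cpp
// Print moisture and microphone readings as one comma-separated line per loop;
// the Node.js serialport code splits each line and forwards the values on.
const int MOISTURE_PIN = A0;  // placeholder pin
const int MIC_PIN      = A1;  // placeholder pin

void setup() {
  Serial.begin(9600);
}

void loop() {
  Serial.print(analogRead(MOISTURE_PIN));
  Serial.print(",");
  Serial.println(analogRead(MIC_PIN));
  delay(100);  // roughly ten readings per second
}
```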

CO2 sensor calibration

We’re working with the MG-811 sensor module, since it seemed to be one of the least expensive CO2 sensors available, and since we were able to find some sample code to go along with it. To figure out how to calibrate it, we’ve referenced the MG-811 documentation as well as documentation for a similar sensor module. From our understanding, this sensor works by outputting a voltage, and as the CO2 concentration increases the voltage output decreases. The sample code sets a particular voltage equal to 400 ppm as a sort of baseline, and then does math to express changing voltage as CO2 concentration.
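
A sketch of that math, with placeholder constants rather than calibrated values: assuming the output voltage falls linearly with the log of the concentration, inverting that line recovers ppm from volts.

```cpp
#include <math.h>

// Placeholder calibration constants, not measured values:
// the voltage at the 400 ppm baseline, and how far the voltage drops
// for every tenfold increase in CO2 concentration.
const float V_AT_400PPM      = 0.32;  // volts (assumed baseline)
const float VOLTS_PER_DECADE = 0.06;  // volts per 10x concentration (assumed)

// Invert the log-linear response: lower voltage means higher concentration.
float voltsToPPM(float volts) {
  float decades = (V_AT_400PPM - volts) / VOLTS_PER_DECADE;
  return 400.0 * pow(10.0, decades);
}
```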

Going through the MG-811 documentation led us to think the only way to get an accurate reading (although maybe a relative reading is good enough for this project) is to calibrate it against some other equipment; otherwise we might never know the true CO2 concentration of the environment. Another limitation of our sensor may be that it isn’t meant to detect concentrations below 400 ppm. We’re trying to see if this limitation can be overcome, and whether changing the math in the sample code will make the data inaccurate.
CO2 sensor testing with a breadboard

We have tapped Marina, George, and former ITP-er Nick Johnson (currently at CUSP) to help us think through how we can calibrate the sensor, and to connect us to people who might have helpful equipment. After a very helpful e-mail exchange with Nick, who directed us to several resources, we think that one solution could be to leave the sensor running at ITP to get baseline voltage readings, and assume this is 400 ppm. We should definitely set this up this week!
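
Something like this could collect the baseline (the pin and timing are assumptions): average a batch of readings to smooth out noise, log one voltage per minute, and treat the quietest stretch as 400 ppm.

```cpp
const int CO2_PIN = A0;  // placeholder pin for the module's analog output
const int SAMPLES = 64;  // readings per averaged data point

void setup() {
  Serial.begin(9600);
}

void loop() {
  // Average a batch of readings to smooth out noise.
  long sum = 0;
  for (int i = 0; i < SAMPLES; i++) {
    sum += analogRead(CO2_PIN);
    delay(10);
  }
  float volts = (sum / (float)SAMPLES) * (5.0 / 1023.0);  // assuming a 5V reference
  Serial.println(volts, 3);  // log one point per minute; quietest stretch ~400 ppm
  delay(60000);
}
```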

Box diagram for initial interaction

Aaron Parsekian helped us think through all of the elements we have discussed including in the project, and where they should go, for the distance-sensing and plant-movement interaction:

Working from this system diagram for interaction #1

Wireless?

The lowest-priority piece right now, and the part I’m least sure will work, is sending the data wirelessly to node rather than over wires. We thought we might be able to use a HUZZAH WiFi module, but to test it on the floor we need to get the MAC address from the device, and I’ve already run into trouble. UPDATE: I was able to read the MAC address by printing it in loop() instead of setup().

Garbled MAC address
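
The working version was essentially this, printing the ESP8266WiFi library’s WiFi.macAddress() repeatedly from loop():

```cpp
#include <ESP8266WiFi.h>

void setup() {
  Serial.begin(115200);
}

void loop() {
  // Printing repeatedly from loop() instead of once in setup() means we
  // don't miss the output while the serial monitor is still attaching.
  Serial.println(WiFi.macAddress());
  delay(2000);
}
```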

Once we can connect the device to the sandbox network on the floor, I’m still not sure how we will send the data from multiple environment sensors, especially since this board works at a higher baud rate.
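
One option we haven’t tested yet (the network name, server address, and endpoint below are all made up) would be to batch readings into a single HTTP POST from the HUZZAH to the node server, skipping serial altogether:

```cpp
#include <ESP8266WiFi.h>
#include <ESP8266HTTPClient.h>

const char* WIFI_SSID = "sandbox";                           // placeholder network
const char* WIFI_PASS = "password";                          // placeholder
const char* ENDPOINT  = "http://192.168.1.10:8080/readings"; // made-up node server

void setup() {
  Serial.begin(115200);
  WiFi.begin(WIFI_SSID, WIFI_PASS);
  while (WiFi.status() != WL_CONNECTED) delay(500);
}

void loop() {
  // Bundle readings into one JSON body per request. Note the ESP8266 only has
  // a single analog pin, so multiple analog sensors would need a multiplexer.
  String body = "{\"co2\":" + String(analogRead(A0)) + "}";

  WiFiClient client;
  HTTPClient http;
  http.begin(client, ENDPOINT);
  http.addHeader("Content-Type", "application/json");
  int status = http.POST(body);  // 200 means the server accepted it
  Serial.println(status);
  http.end();
  delay(5000);
}
```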

Updated Project Timeline

We are behind on some things we intended to have complete by now; in the timeline below, [x] marks a task we’ve completed and [/] one that’s in progress!

11/16/2016-11/22/2016
  • Adafruit order (motor, sensors) [x]
  • Experiment with sensors (infrared, CO2) [/]
  • Buy & weigh plant [ ]
  • Sensor data to node.js [x]
  • Node.js to p5 [x]
  • Buy motor with right amount of torque [ ]
  • Plan construction of plant holder [x]

11/23/2016-11/29/2016
  • Make first interaction work: approach plant and it turns [/]
  • Create visualization in P5 based on what we learn about sensor output [ ]
  • Write Node.js code to read sensor data & set up database [/]

11/30/2016-12/6/2016
  • Connect sensors to node to p5 [ ]
  • Build coin box (scrap wood and copper tape) [ ]

12/7/2016-12/13/2016
  • Make second interaction work: add reaction for coin drop [ ]
  • Build stand-alone website [ ]

12/14/2016
  • Project Due!

Updated BOM

Item Quantity Amount ($) Link to purchase Did we get it?
Plant 1 $20.00 Not yet
Pot/Plant holder 1 $10.00 Not yet
Lazy susan 1 $4.48 http://www.homedepot.com/p/Everbilt-6-in-Square-Lazy-Susan-Turntable-with-400-lb-Load-Rating-49548/203661089?cm_mmc=SEM|THD|google|&mid=sXHAZG02o|dc_mtid_8903tb925190_pcrid_111414437105_pkw__pmt__product_203661089_slid_&gclid=Cj0KEQiA6_TBBRDInaPjhcelt5oBEiQApPeTF2Fa17c1-6Yvlaw5TNzVQrrl1r3-p2UoFVOWTScYFREaAnIT8P8HAQ Currently borrowing for prototyping
Gear kit 1 $12.99 http://www.robotshop.com/en/vex-gear-kit.html Borrowed
Rubber stopper (small) 1 $1.95 http://www.robotshop.com/en/vex-gear-kit.html Borrowed
Rubber stopper (large) 1 $3.60 Yes
Screws and nuts assorted $6.00 Yes
125oz/in Stepper Motor 1 $23.95 https://www.sparkfun.com/products/13656 Smaller motor borrowed
EasyDriver 1 $14.95 https://www.sparkfun.com/products/12779 Yes
IR distance sensor (10-80cm) 3 $44.85 https://www.adafruit.com/products/164 Yes
MG-811 CO2 Gas Sensor Module 1 $58.95 https://www.jameco.com/z/SEN0159-DFRobot-CO2-Sensor-Arduino-Compatible-_2213270.html Yes
Adafruit Feather HUZZAH w/ ESP8266 WiFi 1 $15.95 https://www.adafruit.com/products/2821 Jasmine has this
Arduino Mega 1 $45.95 https://www.adafruit.com/product/191 Jasmine has this if needed
Slip ring 1 $14.95 https://www.sparkfun.com/products/13064 Yes
Moisture sensor 1 $4.95 https://www.sparkfun.com/products/13322 Yes
Microphone (w/breakout) 1 $5.95 https://www.sparkfun.com/products/12758 Yes
Photocells 2 $1.90 https://www.adafruit.com/products/161 Jasmine has these if needed
Scrap (wood/cardboard) some School
Copper tape some School
Total $291.37

Final Project Production Plan

This week Swapna and I finalized our proposal, and developed the bill of materials, timeline, and system diagrams necessary to complete the project.

Project Description

Short: A coin-operated responsive plant that rotates to acknowledge human presence, paired with a visualization of the CO2 in its environment, the gas that plants do the work of converting into human-breathable oxygen, a service people have the option to pay for.

Long: Our project is meant, in part, to prompt people to think about the work that plants do regulating our environment and making it habitable, and what it means to put a monetary value on this in a capitalist society. It is also an exchange with a non-human, generally immobile, but living being: one that involves you communicating with it via your breath and physical presence while it communicates through respiration and the representation of data. We want to bring forth the way you are each affecting the surrounding environment and sensing each other’s presence in different ways.

System Diagram

Project Timeline

11/16/2016-11/22/2016
  • Adafruit order (motor, sensors)
  • Experiment with sensors (infrared, CO2)
  • Buy & weigh plant
  • Sensor data to node.js
  • Node.js to p5
  • Buy motor with right amount of torque
  • Plan construction of plant holder

11/23/2016-11/29/2016
  • Make first interaction work: approach plant and it turns
  • Create visualization in P5 based on what we learn about sensor output
  • Write Node.js code to read sensor data & set up database

11/30/2016-12/6/2016
  • Connect sensors to node to p5
  • Build coin box (scrap wood and copper tape)

12/7/2016-12/13/2016
  • Make second interaction work: add reaction for coin drop

12/14/2016
  • Project Due!

Bill of Materials

Item Quantity Amount ($) Link to purchase
Plant 1 $20.00
Pot/Plant holder 1 $10.00
125oz/in Stepper Motor 1 $23.95 https://www.sparkfun.com/products/13656
IR distance sensor (10-80cm) 3 $44.85 https://www.adafruit.com/products/164
MG-811 CO2 sensor 1 $34.95 http://www.futurlec.com/Gas_Sensors.shtml
Gas sensor breakout board 1 $0.95 https://www.sparkfun.com/products/8891
Adafruit Feather HUZZAH w/ ESP8266 WiFi 1 $15.95 https://www.adafruit.com/products/2821
Arduino Uno 1 $24.95 https://www.adafruit.com/products/50
Total $175.60
Other/Additions
Photocells 2 $1.90 https://www.adafruit.com/products/161
Moisture sensor 1 $4.95 https://www.sparkfun.com/products/13322
Microphone (tiny) 1 $0.95 https://www.adafruit.com/products/1935
Microphone (w/breakout) 1 $5.95 https://www.sparkfun.com/products/12758

Beyond Scarcity Hackathon

There were three main themes to the Beyond Scarcity hackathon: alternative narratives, making the invisible visible, and internalities (internalizing costs, to counter the economic concept of externalities). Swapna and I wanted to continue to explore the idea of human-plant mutualism, and teamed up with Viniyata, Andy, and Lola to create a project in a similar vein to our final.

We combined the story of the Alley Pond Giant, a large tulip tree in Queens and probably the oldest living organism in New York City, with the idea that the natural environment provides humans with priceless services that we devalue for immediate economic gain. Tulip trees happen to be among the most efficient plants at converting CO2 into oxygen. We wanted to create an experience that prompts people to connect the CO2 we release as humans with the work that trees are constantly doing to convert it for us.

Prototype: coin-operated ‘tree’ releases oxygen when you pay

Our finished project let people visualize the volume of oxygen the tree releases in an hour (our non-exact estimate, as non-botanists, for a tulip tree of this size is about 1 liter). When you pay a quarter, a small amount of air is released from the balloon. We did some calculations to estimate how much CO2 a tree like the Alley Pond Giant converts into oxygen on a yearly and daily basis, but we couldn’t get the volume of the balloon or the amount released exact within the hackathon’s time constraints.

Still, we hope it’s a fun and thought-provoking way to think about the carbon sequestration trees do for us, and the important role they play in giving us air to breathe! Unfortunately, this kind of monetary quantification of natural resources is one of the only ways to get the capitalist regimes of the world to understand this immense value. Our project also seems relevant given the advent of bottled air.

Responsive plant: initial playtesting

Set-up for play-testing

For our final project, Swapna and I are planning a variation on my project proposal: a responsive plant that gathers and shows data about the physical environment, sensing things that are usually invisible, specifically (for now) CO2 and EMF. We had a lot of questions about exactly how the interaction would work, which will of course affect how we execute the project.

Our main questions going into play-testing were:

  • Was the plant moving/turning in response to people approaching enough feedback for the people interacting with it? We imagined there could potentially be other ways the plant communicates, like with light or sound.
  • Is the scale right? We picked a small potted indoor plant for play-testing. Should it be larger? What about if we had multiple plants?
  • How should we depict the data? We set up a laptop with generic data visualizations, but what should the visualizations look like? Should they be screen-based? Should they be projected?
  • Does the interaction make sense to people? How did they interpret it/react?
  • Is it engaging for a sustained period of time?

We got a lot of really great, useful feedback!

  • It could be useful to have a prompt, specifically for the human-technology interaction piece. What could this be? Could we compel people to take out their phones and see how the data visualization changes based on human-generated EMF?
  • A lot of people had comments on the data visualization style. What we took away was that we should not use a graph. One solution we thought of in response was to include a depiction of the plant itself, and use particles, waves, or something similar to show the CO2 and EMF.
  • People really did like the plant movement as a response. They liked it turning, and thought this acknowledgement was powerful.
  • We got a couple of suggestions for the plant to speak. One person thought it would be cool to give the plant a kind of personality or character. I remember reading a while ago about an experiment where couches were given different personalities based on movement, but I couldn’t find it. I did find this article about anthropomorphized Ikea t-shirts, and of course Ikea also did that lamp commercial that made everyone sad. I think there are a lot of ways we can play with personality!
  • Could we include some kind of depiction of how the plant feels?
  • Could we anchor the interaction in something people do with plants already? Could we link some of the plant action or data visualization to some natural action/interaction? Like watering?
  • Could people interact in some way to toggle between visualizations?
  • How will we keep people from touching the plant? Could we pick a particular plant? Will we make it speak?
  • Could the plant react to people speaking? Or general noise?
  • How to make the result beneficial/positive?
  • Utilize augmented reality?

Developing for cellular networks

I got to attend Amanda Ervin’s developing-for-cellular-networks workshop at the Radical Networks Conference. We used the Adafruit FONA cellular breakout board and sent some initial texts!

Set-up
Hello world!

Working with the board requires the Adafruit FONA Arduino library. It was a little tricky to get all the pinouts right, and the board communicates at a different baud rate than the serial connections we’ve been using in class. But in the end it was a pretty simple way to create a cellular device, and I learned a bit about how antennas work in the process. It would be cool to experiment with other AT commands in the future, since we didn’t get to play with these much. Instructions for the Arduino test I did are also available on the Adafruit site.
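
The test boiled down to something like this, with the library issuing the AT commands under the hood (the pins and phone number are placeholders):

```cpp
#include <SoftwareSerial.h>
#include "Adafruit_FONA.h"

#define FONA_RX  2   // placeholder pins; match them to your wiring
#define FONA_TX  3
#define FONA_RST 4

// SoftwareSerial is named from the Arduino's point of view, so it listens
// on the pin wired to the FONA's TX and transmits to the FONA's RX.
SoftwareSerial fonaSS(FONA_TX, FONA_RX);
Adafruit_FONA fona(FONA_RST);

void setup() {
  Serial.begin(115200);
  fonaSS.begin(4800);  // the FONA's default baud, not the 9600 we use in class
  if (!fona.begin(fonaSS)) {
    Serial.println("Couldn't find FONA");
    while (1);
  }
  char number[]  = "15555550123";   // placeholder phone number
  char message[] = "Hello world!";
  if (fona.sendSMS(number, message)) {
    Serial.println("SMS sent!");
  }
}

void loop() {}
```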

Responsive plant final project concept

For my final project I’d like to create an experience where people interact with a plant in an unexpected way. One way I thought this could work was to have the plant turn toward you as you speak to it. Another idea that I thought could scaffold nicely onto this one was a speech-recognition, poetry-generating AI. The final concept: a plant that you speak to, which intelligently manipulates human speech to create another output, either as a print-out or online.

Potential interaction

I was inspired by Tega Brain’s talk on eccentric engineering, a way of thinking about building infrastructure “based on mutualism” that considers all life, not just humans. Her installation Coin-Operated Wetland was a self-contained system that I thought provoked many questions around this.

In considering how I could think about our coexistence with nature in a city where we’re quite alienated from it, I immediately thought of plants. We keep plants indoors and they form a sort of backdrop, but in fact there is anecdotal and scientific evidence that plants respond to people, communicate with each other, and have feelings.

Pausing to Consider How Plants React to Humans

I was also inspired by chatbot and poetry AI projects.

Potential outputs

One reason I’m excited about this idea is that there are a lot of potential elements to work out and compromise on. There is still a lot of room to explore different kinds of interaction, learn about plants, and add other elements to the project if there’s time:

  • Rotating, as presented in the proposal above, though perhaps there are other ways the plant could react
  • Recording (speech, photo)
  • Found poetry/Surveillance poetry from recorded speech
  • Projections
  • Website (what people said, poetry generated, the poetry the AI was trained on)
  • Self-contained system powered by solar panels