Backlightness

I began with the question of whether you could create ads for the intermediary steps in the supply chains of the many products and devices that use rare earth elements. Supply chains are incredibly opaque, and demand for the final products is what drives them. In my experience, efforts to clarify them by conveying the magnitude of trade are often abstract and not impactful, while showing the effects of technological advancement and modern consumption at the human scale doesn’t convey the breadth of the underlying systems. My goal was to reflect on these systems, and on the lifestyle I’m familiar with, by rearranging texts related to both.

Source Texts

Through researching rare earths and exploring APIs, I found that since the 1990s the overwhelming majority of rare earth mining and production has been based in China. Instead of some complicated combination of trade API queries to get at supply chains, I settled on a report about China’s Rare Earth Industry from the USGS, which provides detailed information.

I thought ad text like what I was looking for might already exist. Instead, looking through marketing and business papers on advertising, I found that such corpora are usually created for specific research projects. One way is to go through magazines and collect the text of the ads that appear. I did this with the April 2017 issue of Wired, but I specifically chose ads based on their relationship to the rare earth supply chain (companies that use rare earths directly in their products, or that are in an industry that uses such products heavily, so as not to make it so broad that any company with a computer would be included, since that would be all of them). I also developed a second corpus by googling for companies who make products known to rely on rare earths and grabbing a few of the top ads that appeared in google image search. I thought this would give me a rich corpus for developing markov ads.

Code

The code imports the list of 17 rare earths from my very first rare earth poem, along with nouns and noun chunks from the USGS report (the noun chunks are extracted using Spacy.io). I created a function that generates sentences from character-level markov chains over the two imported ad texts and replaces their nouns with rare-earth nouns. The output of 3 calls to this function is stacked together, and the words are stored in a list. The printed portion of the list then grows with each line, so the poem expands exponentially until the list is exhausted. The print doesn’t go straight through, instead sweeping down so that there’s some repetition.
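
The pipeline can be sketched roughly like this. The corpora here are tiny placeholder stand-ins (my actual code pulls noun chunks from the USGS report with Spacy and uses the full ad texts), and the noun detection is a crude capitalization heuristic rather than real part-of-speech tagging:

```python
import random

# Placeholder stand-ins for the real corpora.
RARE_EARTH_NOUNS = ["Scandium", "Yttrium", "Lanthanum", "Cerium"]
AD_TEXT = "You need it. Feel the best. Imagine much more than a wide range."

def build_chain(text, order=3):
    """Map each character n-gram to the characters that follow it."""
    chain = {}
    for i in range(len(text) - order):
        gram = text[i:i + order]
        chain.setdefault(gram, []).append(text[i + order])
    return chain

def generate(chain, length=60, order=3):
    """Walk the chain to produce a character-level markov sentence."""
    gram = random.choice(list(chain))
    out = gram
    for _ in range(length):
        nxt = chain.get(gram)
        if not nxt:
            break
        out += random.choice(nxt)
        gram = out[-order:]
    return out

def swap_nouns(sentence, nouns):
    """Crudely swap capitalized words for rare-earth nouns
    (the real code uses Spacy to find actual nouns)."""
    return " ".join(
        random.choice(nouns) if w[:1].isupper() else w
        for w in sentence.split()
    )

def growing_print(words):
    """Return an exponentially growing slice of the word list per line."""
    n, lines = 1, []
    while n <= len(words):
        lines.append(" ".join(words[:n]))
        n *= 2
    return lines
```

Stacking three `swap_nouns(generate(chain))` outputs, splitting into words, and feeding them to `growing_print` gives the expanding-poem effect.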

The poem prints so that the beginning of each line drifts across the page. I don’t know whether this is effective at conveying the movement between all the places and companies that come up across the poems, but that was the intent. For the final form, I wanted the sense of the growing poem to come across, and I wanted to connect back to the 17 rare earths. I decided to pick 17 outputs (which do not necessarily correspond in a linear way with the rare earth they appear next to) and create an image so this effect would be apparent. Along the way, I also tried mashing up markov-generated text from both sets of text, and other ways of presenting the result.

For the reading, I will read 2-3 of my favorites (image and text both available on my github). This one reminded me of this Young-Hae Chang poem (I love these animated poems but you need Flash!):

Samsung
LED TVs

 Any excuse Imagine. Samsung
    LED Jiangxi Province Any This will allow the
       company to increase the recovery of rare earths to 50 percent from 25 percent in the
          separation process (China Ministry of Industry and Information Technology, 2010; Wang Ganzhou . You Need it You Need it Every and much more than a wide range lives. electronics Every and much
             more than a wide the volume tungsten products . Surgical supplies ration for all you need it Every Detail Sony 3D explorings to the microphone yet. Scandium for all you need it Every ion-adsorption Sony Jiangxi Province Pacemakers to the Ytterbium yet . Lowest priced technolo Lowest priced Environmental Protection era of do. nickel-cobalt products of do . Bigger The Best Catchbook. Bigger The
                Best Catchbook . Ultra Slim. Ultra Slim . FaceTime. FaceTime . Smart creater and game. Smart Chinalco and Metal smelting producers . Feel the LG Door-in- Feel the LG Door - in-

Two proposals

I’m hoping to get some direction in class on two ideas, since I have unwisely spent time becoming invested in both.

Idea #1

What if we tried selling all points of a supply chain as much as the end product?

  • Uncover as much specific supply-chain information as is feasible for rare earths, heavy metals, or other depleted elements
  • Collect language from advertising
  • Create ad-like descriptions of the starting, intermediary, and end points of the supply chain

Concerns:

  • A lot of research
  • Highlighting bad things can be informative but I find this somewhat problematic when people aren’t empowered to address the issue. Perhaps I’m promoting awareness that reduces consumption?
  • As a positive, it could be interesting to include some of these ‘responsibly sourced’ product lines, but I don’t necessarily want to promote them as a solution.

Progress

My poem from week 2 was sort of a first experiment with this, where I tried putting the rare earth element back into the end product.

There are also several places to look for supply chain information, although getting a complete picture often takes people several weeks, and many databases have paywalls.

Idea #2

Site-specific found poem

  • Sniff for wifi at a specific place (this would require going to the place, so it’s a physical limitation)
  • Pull all the tweets from that place (perhaps there’s other geo-located social media that would be fun to incorporate)
  • Create poem

I like that it could be found poetry that makes a connection to how the words might have been released into the world.

Concerns

  • Is twitter poetry Two Thousand and Late?
  • Enough twitter material?

Progress

I’m playing with the Twython library and twitter API (link to geosearch doc) but have been having trouble getting the actual tweets by querying geolocation. I know that only about 1% of tweets are geolocated so this might be a fatal limitation. Or maybe my code is just incomplete.

I haven’t delved into this first step of collecting wifi networks.

returned:

{
    u'search_metadata': {
        u'count': 100,
        u'completed_in': 0.025,
        u'max_id_str': u'852568659604234240',
        u'since_id_str': u'0',
        u'refresh_url': u'?since_id=852568659604234240&q=&geocode=40.72%2C%20-73.95%2C%2010mi&include_entities=1',
        u'since_id': 0,
        u'query': u'',
        u'max_id': 852568659604234240
    },
    u'statuses': []
}

markov budget

My poems for this week juxtapose the language of Trump’s 2018 Federal Budget with the Sequoia National Park text I’ve been using. Because the federal budget is not a text I enjoy engaging with, and we had just learned natural language processing tools, I thought it might be interesting to use these new tools to decipher it. I found the budget as a PDF: I tried using PDFMiner & Slate but ended up using pyPdf to convert it to text.

Considering there have been a number of controversies around the current White House administration and the Parks Department, I became curious about using the National Parks text, specifically (and considering I’m familiar with it). I tried combining the two texts in different ways and finally found what I thought was a compelling way to show the different priorities, values, and kind of patriotism exhibited in both (final code), while also analyzing the similarities.

I couldn’t find Ibex Meadow but I did find Horseshoe Meadow, Lone Pine, California.

One of the
Enforcement law enforcement personnel
Ibex Meadow on Lone
a view of
Enforcement law enforcement personnel
Ibex Meadow on Lone
which is now
Enforcement law enforcement personnel
Ibex Meadow on Lone
the size of
Enforcement law enforcement personnel
Ibex Meadow on Lone

AIM Park updates

This week we refactored our code–I updated my Sequoia Park AIM poems from last week.

I didn’t get to correct a number of things I intended to–modularizing took longer than expected. There’s something weird going on where I’m getting a lot of the same rhymes line after line. After some trials I thought I had successfully taken out all of the punctuation I intended to, but I’m still getting some. Maybe it’s because this punctuation is part of a word? I also had trouble figuring out how to rewrite code that used if/elif/else syntax–I want to put it in a loop. I also started to write a function to create sequential times, but generating the sequential numbers, turning everything into text with a zero in front of single digits, and continually checking that the numbers were in fact moving forward in time seemed excessive. There must be some kind of time function that I haven’t found yet that could do this more easily.
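
The standard library’s datetime module seems to handle most of this: timedelta guarantees the times move forward, and strftime does the zero-padding. A sketch, with the start time and the 5–90 second gaps as my own made-up values:

```python
from datetime import datetime, timedelta
import random

def sequential_times(n, start=None):
    """Generate n AIM-style timestamps that always move forward;
    strftime handles zero-padding single-digit hours and minutes."""
    t = start or datetime(2017, 4, 1, 21, 3)
    stamps = []
    for _ in range(n):
        stamps.append(t.strftime("(%I:%M:%S %p)"))
        # advance by a random handful of seconds so the chat feels live
        t += timedelta(seconds=random.randint(5, 90))
    return stamps
```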

My favorite aspect is still the screen names. It’s funny working with this older text because every once in a while years show up, which reminds me of how people used to use their birthday in screen names, but the years are like, 1890.

Here, in another example, are some screen names where $’s are still showing up:

One nice thing is since I now have all these functions, I can play with other combinations of the code, so I experimented with making these little chats, where really I just focused on screen names:

AIM-inspired Sequoia poems

This week we were tasked with creating our own poetic form. I went to Yosemite over Spring Break so I was thinking about Sequoias–I was kind of amazed when I found this Sequoia National Park guide from 1937 on Project Gutenberg. I was stuck on how to create a poetic form from this when separately, I was researching passcode and password best practices. I thought it might be funny to create passphrases with the Sequoia guide as a seed text (har har). I didn’t really like this, but realized that some of the guidelines for passcodes result in old-school AIM-y looking text. Instead, I decided to create screen names. I also figured this could be more easily generalized to any text.
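
The screen-name idea can be sketched like this. The substitution table and the year range are hypothetical rules of my own for illustration; the real program pulls its words from the Sequoia guide:

```python
import random

# Passcode-style character swaps (an assumed rule set, not the real one).
LEET = {"a": "4", "e": "3", "i": "1", "o": "0", "s": "$"}

def screen_name(word, year=None):
    """Turn a word from the source text into an AIM-y screen name:
    occasional leet substitutions, mixed case, and a trailing year."""
    out = ""
    for i, ch in enumerate(word.lower()):
        if ch in LEET and random.random() < 0.5:
            out += LEET[ch]
        elif i % 2:
            out += ch.upper()
        else:
            out += ch
    # the guide is from 1937, so the "birth years" skew old-timey
    return out + str(year or random.randint(1890, 1937))
```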

I was amused, but I also wanted to play with the pronouncing library we used in class. I wondered if rhyming screen names might be interesting? Instead, I created some lame but sometimes funny conversations. One person/screen name character initiates the conversation, and the response is something random and generally long (an entire line). The response is a rhyme. This happens again but then the second character continues the rhyme, and then they switch roles.

My code is on github. My final program is AIM.py and my final output is parksOutput.txt. I tested out the generalizability using the mushroom recipes text I used in a previous week, but this sometimes gave me errors. It would also be nice to create dynamic timestamps!

encyclopedia of life poems

Since there’s a Magic the Gathering API, I was hoping to continue working with card text this week, but in a more robust way. But, for some reason I kept getting a handshake error. I googled it, and people suggested using the requests library, which I gather is somewhat more secure. I downloaded the libraries and adjusted my code, but I still couldn’t get it to work:

$ pip install requests
$ pip install requests[security]

Instead, I worked with the encyclopedia of life API. Since I’ve been doing work with plants and algae for Temporary Expert, I thought the material might be interesting to work with. Also, I thought my phosphorus poem worked quite nicely and I was curious to experiment more with scientific texts in a poetic form.

The algae page didn’t lend itself well to text manipulation–none of the descriptions included were particularly interesting. After developing a general structure using the bees results, I then made a more general program where you can pass through any (land) animal as a parameter. I tried passing in other living things but it seems like not all results include the same fields, and so the program returns errors. This sort of makes sense to me given my experience looking at the algae results.
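
One way to keep the program from erroring on results with missing fields is to reach for dict.get instead of direct key lookups. A sketch; the field names follow my reading of the EOL pages responses (a dataObjects list with description entries), but treat them as an assumption:

```python
def first_description(page):
    """Pull the first text description from an EOL-style page result,
    tolerating missing fields instead of raising KeyError."""
    for obj in page.get("dataObjects", []):
        text = obj.get("description")
        if text:
            return text
    return None
```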

Reframing technical scientific naming and language is quite compelling to me. Something I think I’ve been neglecting in the past few assignments is the form on the page (screen). The Hartley and Morris readings compelled me to pay more attention to this. While I didn’t address meter head-on, it was interesting to see the difference that including different-length words made. I settled on a sort of visually round form, which meant there were often short, stressed words at the beginning and end. Maybe this is bad.

I don’t understand feet.

My code and some experiments are on github. My first program and output are the ‘bees’ files. The final, more generalized program is eolapi.py and the experiments from this are saved as animals.txt.

Magic mushroom dictionaries

I decided to continue working with the same texts from last week because I was dissatisfied with their ultimate form. I thought it might be interesting to pick out words based on their length, which would allow me a lot of control over the line length and rhythm. My very ambitious goal was to create a dictionary of magic cards by scraping the Magic the Gathering card database site using cards’ “multiverseid” as the input argument. Working with the beautifulsoup library became enough of a headache that I decided to put this off for later–especially since we’ll be working more with getting text off the web this week in class. I also then discovered MTG has an API.

At first I created my dictionaries so that the word was the key and the length of the word was the value, but after struggling to pull out keys based off of values, I realized it would be much easier if my dictionary was flipped, so that word length was the key. Figuring this out made the process much easier. I sort of lost track of the structure of my text when I was putting it back together, but I enjoyed the unintentional output.
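
The flipped dictionary can be sketched in a few lines: with word length as the key, pulling a word of a given length is a single lookup instead of a search over values. (The function names here are mine, not the ones in my actual code.)

```python
from collections import defaultdict
import random

def words_by_length(text):
    """Flip the mapping: word length is the key, words are the values."""
    by_len = defaultdict(list)
    for word in text.split():
        by_len[len(word)].append(word)
    return by_len

def pick(by_len, length):
    """Grab a random word of exactly this length (None if none exist)."""
    return random.choice(by_len[length]) if by_len[length] else None
```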

Happy mistakes

order. the said classes the placed spell. one when juicier top slowly flying pay made cooking Tap little cards. him with pressed may gravy, Return the from pasture all dipped cards. way half covered You bacon, played You more drying. the spiced itself for will minutes all crumbs player him when pepper, you scales cards. the with covered any sound,

r e v e a l t h e t h e n f l a n n e l t h e a d d i n g
d a m a g e A d d b e s t t h r o u g h A d d t a s t e .
t a r g e t p u t w h e n l e a v i n g d o , p l a c e .
t a r g e t t o p B a k e f i f t e e n t h e b e f o r e
t a r g e t t w o p i n t s p i c i n g Y o u R e m o v e
G r e e n , t h e W h e n f i l l i n g a l l f r i e d .
c a r d s . a n y w i t h g l a s s e s a n d B e f o r e
d a m a g e h i s d i s h p i e c e s , a l l l a y e r s
l o o k e d A d d s o m e s t e w p a n w a y p e p p e r
n u m b e r y o u s o u p p r o c e s s m a y s l o w l y

Return Add make stalks, for rolled
itself two Chop vinegar its minced
enters the fire flannel Add pepper
grave. any with pudding may washed
itself the salt portion are rolled
Search set some butter, you water.
Simoon any half garlic; the butter
itself his with vinegar may slowly
player any into people, you onions
itself and into stewpan the using,

Final Poems

While testing my results and sorting out the issues I had with my words and arrays, I became concerned about the uniformity of the output poems. In order to vary the line structure and make them more interesting, I ended up applying different rules to different lines. Something I’ve lost by completely disregarding the structure of sentences is the sense of complete thoughts. I wonder if there’s a way to use grammatical parts of speech to have more control over this?

My code is on github and below are a few variations:

filled you when Vinegar her number
add Search Vise spoils card, some
things her pan, people, you Badham
after mind parchment teaspoonful other Add
Reveal top thin stewpan the rains,
add Cursed moat little dies, Trim
player Add with parsley the stalks
dip player from Season life. half
target any over flannel its butter
serve Then parchment circulation cost. the

Draw of add grated to dried
the player. you onion. Mana salt,
five of the taste, 4. serve
layer, bomb, mushrooms teaspoonful gold. hand
mana in and unless am ounce
and evincar may Remove each heavy
Name to the 1893. of them,
hot Destroy way butter card salad
seen If The Agaric of dozen
rains, cards mushrooms immediately card. ring

your to the pepper of white
the upkeep, two mushrooms best olive
Tap: is and Lenten to dozen
muscat deals mushrooms teaspoonful exile Tap:
life of you layers to dish,
and Destroy the mushrooms Grim added
cast as the broken 3, Baked
put doesn’t do, perfectly your round
come of and batter of blade
slowly other thickness preparation cards seen

Tap: Mushrooms

This week for our assignment on cut-ups, I decided to mash up the text of about 30 Magic the Gathering Artifact, Enchantment, and Instant cards and the recipes from Student’s Hand-book of Mushrooms of America, Edible and Poisonous by Thomas Taylor. I picked arguably the best cards from these categories from the Magic Card database, because of their language around spell-casting. I thought they would work well with recipes since both are essentially instructions.

I think I had spells on the mind because of the hexes witches have been placing on Trump, MTG because a friend recently gave away his entire collection, and I just loved the sounds of all the mushroom names from this book while searching through Gutenberg and wanted to find a way to incorporate something about them. Even though I selected specific text to begin with by cutting and pasting, I struggled to cut down the text to what I wanted. I don’t think I’m quite there (mushroom language or structure-wise), but I was pleased by how cauldron-y some of the products were.

Before cutting down lines and reducing length more intelligently I was getting a lot of text as output. There was something very nice about the repetition in a later version, but I refined this a bit more before settling on the final code. Right now I am picking out every other line I create, and there’s probably a better way to do this. My source text and final code are here. Each time it runs you get something a bit different. I thought this one was okay.
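
The every-other-line trick can be sketched like this; the word pool, line width, and function name are my illustrative choices, not the real code’s:

```python
import random

def cut_up(source_a, source_b, lines=10, width=8):
    """Mash words from two sources into lines, then keep every
    other line to thin the output."""
    pool = source_a.split() + source_b.split()
    random.shuffle(pool)
    all_lines = [" ".join(pool[i:i + width])
                 for i in range(0, len(pool), width)]
    return all_lines[::2][:lines]
```

Each run gives something a bit different, since the pool is reshuffled every time.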

Rare earths in things

This week I spent some time learning about rare earths, which have very obscure element names but are ubiquitous. All their names end with the same few letters which seemed like it might lend itself nicely to computational manipulation.

I wanted to play with the scientific taxonomy and naming conventions of these strange elements, the weirdness of their rarity, and their displacement. Like my last poem, I wanted to use space on the page/screen as a way to see this. Playing with spaces and elemental numbers didn’t work as well as I’d have liked this time.

The source text is all 17 of the earth metals plus 17 products made of these elements I assembled from the internet.
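
The suffix-grafting that produces lines like “Pacemakersetium” can be sketched like this. The fixed ‘-etium’-style ending and the shuffling are my guesses at a rule; only 4 of the 17 elements and products are listed here as placeholders:

```python
import random

RARE_EARTHS = ["Scandium", "Yttrium", "Cerium", "Lutetium"]  # 4 of the 17
PRODUCTS = ["Pacemakers", "Headphones", "Magnets", "Superconductors"]

def elementify(product, earth):
    """Graft a rare earth's ending onto a product name,
    e.g. 'Pacemakers' + 'Lutetium' -> 'Pacemakersetium'."""
    return product + earth[-5:]

def poem(products, earths):
    """Interleave elementified products with the bare element names."""
    lines = [elementify(p, random.choice(earths)) for p in products]
    lines += earths
    random.shuffle(lines)
    return lines
```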

My final poem and code.

Telescope lenetium
Lutetium
Dysprosium
Promethium
Nuclear control rodsetium
Pacemakersetium
Scandium
Cerium
Europium
Aircraft enginesetium
Ytterbium
Automotive Lutexhaust systemsetium
Yttrium
Superconductorsetium
Pulsed lasersetium
Thulium
Praseodymium
X-raysetium
Erbium
Lanthanum
Camera lenetium
Holmium
Neodymium
Electrodesetium
Magnetsetium
Surgical suppliesetium
Gadolinium
Televisionsetium
Optical glasetium
Samarium
Computer disksetium
etium
Terbium
Headphonesetium
Fluorescent lampsetium


First computational poem

For Temporary Expert I’m currently researching the concept “Limits to Growth,” which led me down the path of researching Phosphorus depletion, something I didn’t really know anything about. I decided to use source text from Phosphorus Futures, a research group.

I liked the alliteration I noticed on the website and thought this might be fun to play with more. My first problem was that the paragraph-style structure of my source text didn’t allow me to easily manipulate lines. I created new lines with commas as the delimiter. Then, to maximize alliterative capacity (? sure) I pulled out all the lines that included phos*, or any other word that started with an F. I stuck these lines back together and used the short python program included in the notes to randomize them.
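
The splitting-and-filtering step can be sketched like this; the shuffle stands in for the short randomizing program from the notes:

```python
import random

def alliterative_lines(text):
    """Split on commas, keep lines containing 'phos' or any word
    starting with f, then shuffle them."""
    lines = [line.strip() for line in text.split(",")]
    keep = [line for line in lines
            if "phos" in line.lower()
            or any(w.lower().startswith("f") for w in line.split())]
    random.shuffle(keep)
    return keep
```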

The use of many non-renewable resources is growing exponentially–in the case of Phosphorus, because of its use as a fertilizer. I wanted to play with this idea of exponential growth as well, so I actually created 5 mini poems, each one grabbing exponentially more words/space than the previous one. I ran into a problem sticking them back together because I didn’t know how to insert my own text in between each mini-poem (I would have liked to somehow designate each new generation). As it is now, each one just follows the preceding one with no particular marker.
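
The exponential grab can be sketched like this. The marker line between generations is my hypothetical fix for the problem above; my actual code doesn’t include it:

```python
import random

def exponential_poems(words, generations=5):
    """Build mini poems that each grab exponentially more words
    than the last, with a marker line between generations."""
    out = []
    for g in range(generations):
        out.append("-- generation %d --" % (g + 1))
        out.append(" ".join(random.choice(words) for _ in range(2 ** g)))
    return "\n".join(out)
```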

My source text

Unix commands that would recreate the poem from the source text, excluding all of my experiments

Final poem: P (P being Phosphorus’ chemical element symbol)