[ITP: Designing Your Synthetic Voice] Cringe Comfort

I used to have a much stronger relationship with my music taste; it meant so much to me and was a big part of my identity in high school. I mostly listened to emo, rock, alternative, bla bla in my formative years. I used to rip and burn all my friends’ CDs and go to concerts. I took pride in being a snob haha. Music was such a big part of my life: I played classical piano for years, violin in my high school orchestra, and even picked up guitar for a bit. Here I am playing a song at my senior recital.

It got too exhausting trying to keep up with all the more informed music people, so these days I’m a really casual listener, maybe even too casual. I’ve lost track of my own tastes and preferences. I’ll basically listen to whatever Spotify tells me I like.

Voice of the past

Usually I prefer music played by a band with traditional instruments, a lot of indie, folk, and R&B, but I started to think a bit about my relationship with electronic music. I listened to some albums that I still really like that have some electronic components and tried to identify what I liked about them. Here are some eras of past-Priyanka music taste:

Height of 2012 indie hipster cringe core

It wasn’t cringe at the time, but I sometimes cringe looking back on what I loved. Cringe isn’t always bad; I think it’s something to lean into! Listening to this music made me into the person I am today. We’re talking Washed Out, Passion Pit, The Postal Service, Crystal Castles, Grimes, MGMT, Kishi Bashi …

One of my favorite albums from that era is Merriweather Post Pavilion by Animal Collective.

What I like about it: floaty, mushy, reverb, airy, sparkly, punchy, vocals, what’s he saying?!, kinda feels like you’re outside, daylight, rhythm


80s alternative synth pop? art pop? dream pop?

Think Talking Heads, Kate Bush, and Cocteau Twins. Honestly, I’m so music illiterate, I’m not really sure if these truly fall under this genre. I feel like I can hear electronic components in this music sometimes, but are they even using any synths? Maybe!

Brotherhood by New Order

What I like about it: dark, crisp, fast, dynamic, sharp, moody, gloomy, whiny


90s House: World Clique, Deee-Lite

I guess I don’t really listen to any other 90s house but I had a phase where I listened to this album a lot.

What I like about it: punchy, beat, bass line, simple lyrics, simple melody, dance, fast, fun, repetitive, catchy, funky, organ, hard yet moments of softness, samples


<3 Mort Garson’s Plantasia <3

Real analog synth, right?! His artist profile pic on Spotify shows him sitting in front of one, so it better be!

What I like about it: whimsical, magical, can visualize a plant growing, triumphant, melodic, rhythmic, charming, uplifting, can focus on other things while listening but interesting enough to focus on the music as well


Makin my own electronic music

I also just remembered that I took a music technology class in my undergrad. We learned to work with DAWs like Logic Pro and Ableton, and I think I even messed around with the SuperCollider programming language. I uploaded some of my weird experiments to this SoundCloud profile.

[ITP: Programming A2Z] Final Project Progress

AI-defined text-based portrait

Since my thesis deals with portraiture and self-portraiture I thought it could be interesting to see if I could get the Replicate Llama chatbot to try to describe what I look like based on the text I’ve written on my website and these blog posts.

Here are a couple of snippets of my convo from last week in which I’ve highlighted the physical details:

Screenshots 1 and 2

Below are some more snapshots from another convo with the Llama. It felt like pulling teeth trying to get a physical description of myself; he was really fighting me on making any aesthetic assumptions. Eventually I got something pretty weird. The last two screenshots contain mostly other-worldly descriptions.

Screenshots 1 through 4

In some office hours, Dan told me that the bot wasn’t actually going to my website and reading the text; it was purely guessing the next most likely character based on its training data and our previous conversation … and then I realized that was what this whole class was about!!! So basically the bot is just predicting text and telling me what it thinks I want to hear. It’s not being “trained” on any of this text or even referring to it; it’s just making its best guess. Kinda ironic, but good that I finally understood this!

Either way, Mr. Llama is super complimentary when it comes to describing me, and the convo really helped inflate my ego. Below, I categorized the descriptions by physical element and started creating a pencil sketch. Because the descriptions were really “out there”, I tried to lean in to the kind of deity or god-like imagery sometimes found in representations of Hindu gods. There’s some kind of critique in this work, like AI playing god or something?!

Here’s the final, painted, AI-collaboration portrait. I think I look pretty good!

Priyanka Phone

I also revisited my assignment from a couple of weeks ago where I was generating texts from my Dad. Previously, I had no visual elements for that p5 sketch so I wanted to create an interface that looked like my own phone. I drew up some assets using my iPad and Procreate and made the sketch look more like my phone home screen. Now the generated texts come in periodically like they do on my actual phone. You can try out this sketch for yourself here.
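A detail worth noting: making the texts “come in periodically” is really just a timer check in draw(). Here’s a minimal sketch of that pattern; the interval value is arbitrary, and newText() is a hypothetical stand-in for my actual generators:

```js
// Fire a new "text message" every so often, like notifications arriving
let lastText = 0;
const INTERVAL = 8000; // ms between texts; tune to taste
let messages = [];

function setup() {
  createCanvas(300, 600);
}

function draw() {
  background(255);
  if (millis() - lastText > INTERVAL) {
    messages.push(newText()); // swap in the Markov/Tracery generator here
    lastText = millis();
  }
  // Stack the messages down the screen like a conversation thread
  messages.forEach((m, i) => text(m, 20, 40 + i * 30));
}

// Hypothetical stand-in for the real generator
function newText() {
  return random(["Your order is on its way!", "Drink water beta!"]);
}
```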

Trying to recreate my experience or relationship with my phone is interesting to me because in some ways it also feels like a self-portrait. These days our devices become extensions of ourselves. Honestly, my relationship with my phone seems so shallow because sometimes I’m doom scrolling for hours until I’m completely hollow and the only texts that really matter are the ones from my loved ones and DoorDash. Is everything a self portrait?!

Future to-do’s for this self portrait:

  • Make all the texts address me. For variety’s sake, some of the name options in the DoorDash context-free grammar are people other than me, which doesn’t make sense if I’m trying to simulate my own phone.

  • I was also given a really small screen by Cindy, and I would totally love to make this a physical piece at some point.

  • I’d also really like to make an unconventional phone enclosure to house the screen and the Raspberry Pi running my sketch. I would like to explore my relationship with my phone more, but at the moment it really feels like it makes my life worse. My attention span is shot. Any notification will completely rip my attention away from anything I’ve been doing or thinking. How could I convey that in a physical form?

  • Is the screen touch? Can the sketch do something when you click on a text? Does it make sound?

[ITP: Thesis I] This system is NERVOUS!

Brain Board

Here’s how I’m feeling about brains this week!

Mojo Jojo from The Powerpuff Girls

Third Brain from Spy Kids

Abby Normal brain from Young Frankenstein

It’s crunch time baby

Since I was waiting for my boards to come in from JLC I decided to cook up some more agar. I’m still unsure what exactly I’m going to use this for but I’ve gotten pretty good at cooking it and I know that it needs a long time to dry out, so the sooner it is cooked, the better.

RECIPE

Water = 700mL (less water this time), vegetable glycerin = 60g, agar agar = 30g, 1/2 capful of vinegar, a little bit of conductive metal powder

COOKING SEQUENCE

140°F for 5 minutes, 176°F for 7 minutes, 212°F for 7 minutes, 248°F for 3 minutes, 212°F for 7 minutes. Since I used less water, the mixture boiled more and I was left with many little bubbles in the cooked plastic. When I was done cooking, I poured half of the plastic into the baking sheet. I added the conductive powder to the plastic remaining in the pot, stirred it in as best and as quickly as I could, and poured that into the other half of the baking sheet. I needed to move quickly because the plastic solidifies fast once the heat is turned off. Below is a picture of my yin-yang sheet of plastic.

Spoiler alert: the bioplastic didn’t become conductive…

My brain arrived! FINALLY!

Paying $60 for expedited shipping from China really paid off; I got my boards in time. I ordered my brain boards from JLCPCB. The minimum order quantity is 5 boards, and since I had two brain board designs, I now have 10 brain PCBs! I also ordered stencils to help with building up the circuits.

Other things you will need for populating a PCB:

  • Solder paste

  • Flux

  • Painters tape

  • Spatula or putty knife

  • Tweezers

  • Surface mount components

  • Hot air gun

  • Paper towels (bonus points: microfiber cloth) and isopropyl alcohol

Here are the steps I followed to populate my brain PCBs:

1. Create a jig to hold your PCB in place using extra boards and tape

2. Line up your stencil to the circuit board and tape it in place on one end

3. Load up your putty knife with some solder paste

4. In a slow, controlled motion, apply the solder paste by dragging the putty knife across the stencil. This can be tricky! Make sure not to bump the stencil and check that the paste made it onto all the pads.

5. Gently lift up the stencil and remove it from the workspace. Double-check that solder paste is covering all the pads.

6. Remove components from packaging and place them on board using fine tweezers. Make sure components are placed in the correct orientation.

7. Once all the components are placed on the board, use the hot air gun to make the solder paste reflow. Point the hot air directly onto each component. You’ll want to find the sweet spot in height where the hot air doesn’t melt the component but still melts the solder paste. You can tell the solder has melted once the component shimmies into its correct place, centered on the footprint, thanks to the surface tension of the molten solder.

8. Soldered board! Repeat the outlined steps three more times to populate both sides of both circuit boards.

So yeah, you just have to repeat that process three more times to populate both sides of both brain boards. I made a lil time lapse of the process. It really wasn’t so bad once I had all the materials and equipment I needed and I’m honestly impressed that I could get this done in our ITP shop!

Here’s the assembled thing:

Next, I laser cut the other cross sections of my brain using some smokey gray acrylic (my laser files are in my Github repo). I was also curious about trying out some two-way mirrored acrylic that I found at Canal Plastics. I was told that I needed to tape up the whole surface of the mirrored acrylic before laser cutting it because otherwise the laser beam would reflect right back into itself?! Not sure if that’s real, but I taped up my acrylic anyway!

Now that I had the majority of the sculpture built, I had to figure out how to display it. I had some really great help from Phil in the shop, but I ended up going with my friend Bianca’s inspiration: the Third Brain from the Spy Kids movie. The way those pinchers hold the tiny brain reminded me of the helping hands used for soldering. It works on many levels, and I didn’t really need to build anything, yay!

Pro tip: Put heat shrink on the ends of the alligator clips so that they don’t scratch up the circuit boards!

Here’s my vision for the final exhibit:

Software

Here’s where things get a lil hairy… it’s always the software that gives me trouble. And idk how to design interaction either…

After some introspection, getting in touch with my body, I wanted to write an Arduino sketch so my brain would behave in the following way:

Nominal state: Neurons firing constantly, thought loop looping.

Someone detected in front of brain: uh oh, I’m nervous. Neurons firing quicker, thought loop LOOPING!

That person still there: I’m super nervous! Neurons don’t know how to fire anymore. Disco time!

Is this how it is to be in my brain?! Possibly.
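The Arduino code wasn’t written yet at this point, but the state logic is easy to mock up in p5.js, with the mouse hovering over the canvas standing in for the presence sensor. This is just a sketch of the idea; the three-second “super nervous” threshold and the firing visuals are invented, but the poll-the-sensor-every-loop structure matches the approach the final code ended up taking (see the PIR section below):

```js
// Mock-up of the nervous-brain states; mouse over the canvas = person detected
const NOMINAL = 0, NERVOUS = 1, DISCO = 2;
let state = NOMINAL;
let detectedSince = null; // when the "person" first showed up
let phase = 0;            // drives the firing animation

function setup() {
  createCanvas(400, 100);
  noStroke();
}

function draw() {
  // Poll the "sensor" once per frame, like polling a PIR in loop()
  const present = mouseX > 0 && mouseX < width && mouseY > 0 && mouseY < height;
  if (present) {
    if (detectedSince === null) detectedSince = millis();
    state = millis() - detectedSince > 3000 ? DISCO : NERVOUS;
  } else {
    detectedSince = null;
    state = NOMINAL;
  }

  phase += [0.05, 0.2, 0.6][state]; // neurons "fire" faster as it gets nervous

  background(0);
  for (let i = 0; i < 10; i++) {
    // DISCO = total chaos; otherwise a pulsing wave across the "neurons"
    const b = state === DISCO ? random(255) : 128 + 127 * sin(phase + i);
    fill(b, 0, b);
    circle(i * 40 + 20, height / 2, 20);
  }
}
```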

STHS34PF80 Qwiic board

I started with this brand-new, spiffy sensor from SparkFun. The STHS34PF80 is a human presence and motion sensor that measures black-body radiation as described by Planck’s law. It can discriminate between stationary and moving objects, has an 80-degree field of view and a 4 m range, and monitors ambient temperature conditions. It’s a classic Qwiic board: 3.3V supply and logic voltage, I2C communication, and configurable SPI.

Though I was super excited about this shiny new sensor, I had a lot of issues with it. Right off the bat, there were compilation errors when running the examples from the library… not a good start, and it made me skeptical of the software as a whole. I was getting temperature readings that didn’t make any sense to me. I also didn’t understand the presence readings, and the interrupts example was inconsistent; it didn’t trigger readings every time I waved something in front of the sensor. With time running out, I decided to switch to a sensor that is tried and true.

Interlude: what is black body radiation?

  • A black body is a theoretical object that perfectly absorbs all incoming electromagnetic radiation and re-emits it as radiation such as heat and light

  • Black-body radiation is used to understand the temperature, size, and age of celestial objects

i2c_scanner sketch. Why does this sensor return so many I2C addresses? Default is 0x5A.

Example 1 from library. No idea why temperature is negative.

Example 2: interrupts

Regular old PIR

Luckily, attending a tech and art program and being surrounded by super awesome people has its perks, and I was able to get my hands on a replacement sensor immediately. The Adafruit site has some really great documentation on getting to know your PIR sensor. I was able to dial in its parameters and get it to trigger reliably when someone stands in front of it.

There’s probably no reason to get into the weeds too much here, but I had a lot of issues using interrupts for the sensor while also running light-up sequences for my LEDs. I eventually ditched the interrupts and just poll the status of the sensor. Not ideal, because the code is blocking, but it does the trick for now.

When it comes to lighting up the NeoPixels, I was originally inspired by this well-done example that achieves a “comet” effect. It worked well before I incorporated the sensor, but the animations would freeze when interrupts were triggered. I switched to the NeoPixel library and wrote some code that *kinda* works like I would like.
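The comet logic itself is tiny: light the head pixel, advance it, and let every pixel decay a little each frame. Here’s a p5.js mock-up of that pattern (strip length and fade rate are arbitrary); it translates almost line-for-line into a NeoPixel loop():

```js
// Comet chase along a fake 60-pixel "strip"
const NUM_PIXELS = 60;
const FADE = 0.9; // per-frame decay; closer to 1 = longer tail
let levels = new Array(NUM_PIXELS).fill(0);
let head = 0;

function setup() {
  createCanvas(600, 40);
  noStroke();
}

function draw() {
  background(0);
  levels[head] = 255;             // light the comet head
  head = (head + 1) % NUM_PIXELS; // advance, wrapping at the end of the strip
  for (let i = 0; i < NUM_PIXELS; i++) {
    levels[i] *= FADE;            // everything fades a little each frame
    fill(levels[i], 0, levels[i]);
    circle(i * 10 + 5, height / 2, 8);
  }
}
```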

As always, you can find my final code at my Github repo here.

Graph theory and nearest neighbor bs

I also had high hopes that I could get the “neurons” to fire randomly, following neural pathways. I got a lot of help from my buddy Tres coming up with an algorithm to travel the “graph” of neurons/LEDs. We consulted ChatGPT and got some code that would compile, but we didn’t have much luck running it. Might be something I revisit in the future.
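For posterity, the shape of the algorithm we were going for is basically a random walk over an adjacency list: each LED is a node that knows its neighbors, and an “impulse” hops to a random neighbor each step. A minimal plain-JavaScript version (this five-neuron graph is invented for illustration, not my actual board layout):

```js
// Each index is a "neuron"; the array lists the indices of its neighbors
const neighbors = [
  [1, 2],    // neuron 0 connects to neurons 1 and 2
  [0, 3],
  [0, 3, 4],
  [1, 2],
  [2],
];

let current = 0; // start the impulse at an arbitrary hub
for (let step = 0; step < 10; step++) {
  console.log(`impulse at neuron ${current}`); // on the board: light that LED
  const options = neighbors[current];
  current = options[Math.floor(Math.random() * options.length)];
}
```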


More bioplastic weirdness

I was really motivated to figure out how I could incorporate some of my bioplastic research into my art practice. I decided to try to create a sort of “gray matter” for my brain by sewing together a cap for the sculpture. I found this pattern for sewing a ball and sent my sheets through the laser cutter. The laser didn’t cut all the way through (which I expected), so I used the scored lines as guides to cut out the pattern with a knife.

Kay, my most stylish friend, trying on the agar agar hat!

I also wanted to make a bioplastic glove to go along with my previous project. Through my research, I learned about this concept of “skin hunger”, the need for humans to have skin-to-skin contact. I had hopes that this glove would be conductive and would need to be pet, but the metal powder didn’t make the material conductive at all.

ITP Winter Show

Some feedback I got: people were really curious about how to calm down the brain. That’s something I should consider for future revisions.

Special Thanks to

It truly takes a village: Phil Caridi, Nick Poole, David Rios, Tres Pimentel, Jeff Feddersen, Erin Mamaril, Bianca Gan, Blair Simmons, Yeseul Song, Gabe Barcia-Colombo, Kay Wasil

Resources

My Brain Github Repo

SparkFun STHS34PF80 Qwiic Board

SparkFun STHS34PF80 Hookup Guide

SparkFun STHS34PF80 Arduino Library

Black body radiation video, and here’s another

Interrupt rabbit hole

Volatile variables - Arduino

PIR with interrupts

ISR Arduino forum issue

Neopixels

Neopixel Comet Tutorial

Ball and sphere sewing pattern

[ITP: Programming A2Z] Transformers and Final Project Ideas

Llama Llama Red Pajama

On the Dangers of Stochastic Parrots

  • Increasing the environmental and financial costs of these models doubly punishes marginalized communities that are least likely to benefit from the progress achieved by large LMs and most likely to be harmed by negative environmental consequences of its resource consumption.

  • A majority of cloud compute providers’ energy is not sourced from renewable sources, and many energy sources in the world are not carbon neutral

    • Negative effects of climate change are impacting marginalized communities the most

    • Researchers should prioritize energy efficiency and cost to reduce negative environmental impact and inequitable access to resources

  • Using the “Common Crawl” dataset

    • In accepting large amounts of web text as ‘representative’ of ‘all’ humanity we risk perpetuating dominant viewpoints, increasing power imbalances, and further reifying inequality

    • Social movements which are poorly documented and which do not receive significant media attention will not be captured at all. Media coverage can fail to cover protest events and social movements and can distort events that challenge state power

  • Documentation debt: datasets are undocumented and too large to document post hoc. Undocumented training data perpetuates harm without recourse.

  • Text generated by an LM is not grounded in communicative intent, any model of the world, or any model of the reader’s state of mind.

  • Disseminating texts generated by LMs would mean more text in the world that reinforces and propagates stereotypes and problematic associations.

Foundation Model Transparency Index

  • FMTI evaluates 100 different aspects of transparency, covering how a company builds a foundation model, how it works, and how it is used downstream

    • Questions involving intellectual property, labor practices, energy use, and bias

    • “In our view, if this rigorous process [Googling] didn’t find information about an indicator, then the company hasn’t been transparent about it.”

  • Less transparency makes it harder for other businesses to know if they can safely build applications that rely on commercial foundation models

Assignment

Chatbot with Llama using Replicate

I’ve been a little overwhelmed with all the things we’ve covered in the last weeks of class, so I thought it was best to try out a variety of things for this assignment. I started by downloading the ChatBot with Llama Replicate example. Replicate is a repository of different machine learning models that you can pay to use in projects. Below are some examples of featured models:

It took me a second to get my bearings, even with all the code handy. I didn’t realize that I didn’t have Node installed on my interim computer, so I needed to get all of that set up. Then, at the command line, I navigated to where this example lives on my PC. The “npm install” command installed all the dependencies required by this example from the package.json file.

I tried running the example, but I got the “Missing required parameter: auth” error. Haha, I need an API token, duh! I got an invite from Shiffman to join the ITP organization set up for Replicate and generated my own token to use. I created a .env file in the example directory with this token. Then I was ready to npm run start, and I navigated to http://localhost:3000 in my browser.
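For reference, the heart of that example is a single call to Replicate’s Node.js client. A stripped-down version looks roughly like the snippet below (run as an ES module, with REPLICATE_API_TOKEN in the .env file); the model name and prompt here are placeholders, not necessarily what the ITP example uses:

```js
// Minimal Replicate call; expects REPLICATE_API_TOKEN in a .env file
import "dotenv/config";
import Replicate from "replicate";

const replicate = new Replicate({ auth: process.env.REPLICATE_API_TOKEN });

// Llama models stream back an array of text chunks; join them into one reply
const output = await replicate.run("meta/llama-2-70b-chat", {
  input: { prompt: "Can you breathe?" },
});
console.log(output.join(""));
```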

My first convo with the chatbot got deleted by accident but here’s an overview of my second conversation with the Llama Chatbot:

1. Trying to get help with my final for this class … and questioning the chatbot on its breathing abilities.

2. Text-based portrait of Dan Shiffman

3. Text-based portrait of Priyanka Makin (based on my website)

4. Asking the chatbot if it could make any conclusions about my physical appearance from the text on my site. This question killed it!

I am honestly surprised how much time I spent talking with this chatbot; I think I got pretty into it! Using other generative text tools, like ChatGPT, is not really part of my current workflow. I’m actually a bit of a skeptic, so it is extra surprising to me how much I enjoyed talking to this llama.

Replicate and p5.js

Next I tried the Replicate p5.js example. This pretty much worked straight out of the box but I needed to wait a second for the responses to come back from the model.

I was actually texting one of my BFFs from my undergrad, Kendle, as I was doing this assignment. She studied something similar to ITP and is an AI hater like myself. Two of the prompts above actually came from her. The final prompt was “college roommates that are all grown up now” and this continually threw an NSFW warning and wouldn’t generate an image. Seems I’ve reached the limits of AI in a matter of 4 prompts…

Interactive Drawing with SketchRNN

It was pretty complicated, but I just HAD TO try out the Interactive Drawing Coding Challenge because visual, and doodle, and scribble is kinda my thing. And let me just say… I LOVED the silliness of this tutorial. Meow meow meow meow!!!

SketchRNN is a recurrent neural network that has been trained on the Quick, Draw! open source dataset from Google. It has many different models for different drawings, and sketchRNN is included as part of the ml5 library! I honestly just followed the video; there’s lots of complicated math that I can’t really explain. The user draws a starting stroke as seed points for the chosen model, and then the model finishes the drawing based on those points. There’s some smarty line simplification happening before the points are fed into the sketchRNN model. This example is super cool even though it was kinda over my head. My sketch is here.
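From what I can reconstruct of the ml5 v0.x API (treat the exact signatures as assumptions; the video is the real reference), the core loop asks the model for one stroke at a time and keeps asking until the pen state says the drawing is done:

```js
// ml5 sketchRNN, stroke by stroke; strokes arrive as {dx, dy, pen} deltas
let model, x, y, prevPen = "down";

function preload() {
  model = ml5.sketchRNN("cat"); // pick any of the Quick, Draw! models
}

function setup() {
  createCanvas(400, 400);
  background(220);
  x = width / 2;
  y = height / 2;
  model.reset();
  model.generate(gotStroke); // no seed: the model draws from scratch
}

function gotStroke(err, s) {
  if (err) return console.error(err);
  if (prevPen === "down") line(x, y, x + s.dx, y + s.dy);
  x += s.dx;
  y += s.dy;
  prevPen = s.pen; // "down", "up", or "end"
  if (s.pen !== "end") model.generate(gotStroke); // ask for the next stroke
}
```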

Cat drawing when the pen isn’t being picked up

Nice cat drawing

Changed the model to face, green is what’s drawn by the user

Final Project Ideas

Or things I’ve started and haven’t really completed yet…

  • 2 bots that scream at each other. Scream into the digital void as much as you want! Cathartic?!

  • Dad (Markov) + DoorDash (CFG) text generator

    • Create a message app UI

    • P-comp-ify this? Run p5 sketch on a lil screen? Lil fake phone?

    • Phones as extension of self/identity

  • Thesis: Self portrait! Body! Artificial Life!

    • ?????

    • Give a LLM? something? and have it describe myself back to me? And then I draw that?

      • My website, instagram, journal, idk how to do any of this

  • Ask chat-gpt to read my blog and tell me what I’m at school for, what my degree is, what job I can get

[ITP: Thesis I] Brain Fabrication Update ... and stuff

Mood Board

Here’s how I’m feeling about brains this week.

Does Brain Size Matter?!

Cardboard 3D Prototype

So if you’re following along with my brain journey, you know that last week I was trying to slice up a 3D model of a brain to make up my sculpture. You also know that I had a lot of Fusion confusion and couldn’t figure it out. So I gave up and decided to go freehand in Illustrator. Basically, I copied the shapes I drew for my circuit prototypes and shrunk them down for the outer cross sections. I added in the slots so that the pieces could fit together.

Laser cutting!

Here’s my mess

I basically repeated this process, adjusting all the cross-section drawings, until I was happy with the overall shape. It took me three iterations of cardboard prototypes to get the shape just right. The one I landed on for my sculpture is the one on the right.

Proto 1

Proto 2

Proto 3 - Winner!

Bioplastic

I have been really curious about how I could add my bioplastic research to my art practice. Selfishly, I also think a bioplastic component would add a unique element to my sculpture and be a great contrast to the sterile/cold/technological feel of the circuit boards. Here’s the most beautiful preemptive agar agar bioplastic, which I cooked on 11/19/23 for unspecified future use.

RECIPE

Water = 800mL, glycerin = 60g, agar = 39g, 1/2 cap of vinegar

COOKING SEQUENCE

140°F for 5 minutes, 176°F for 7 minutes, 212°F for 7 minutes, 248°F for 7 minutes, 212°F for 7 minutes

I’m also trying to practice embodied making, so I’m kind of going into this one without a plan. This is a picture from the next day (11/20), trying to drape the bioplastic around the brain prototype. I thought it might dry and conform to the shape of the cardboard, but I should’ve known that the plastic was still too moist and heavy. It ended up ripping at the top.

This is the bioplastic ONE WEEK later. It is much thinner and dry to the touch but the rips are much more apparent. I was hoping to mold and bind the sheet around the 3D shape so I mended the sheet with a couple of stitches and tried wrapping the cardboard brain up like a gift. The ribbons are meant to hold the plastic in place and I’m kind of hoping that the bioplastic will just stay in that shape.

PCB v10

If you’ve been following along, you also know that at this point I have two WORKING milled PCB prototypes! Sweet! Now I am moving on to the more final version of the PCB. True to life brain size and many LEDs!

I started with updating my board outlines in Adobe Illustrator. I made sure the slot sizes would fit the PCB thickness and the acrylic thickness. To get these outlines into EAGLE, I saved them as SVGs and converted them to DXF files online. However, when I did this, the scale was completely off from the original drawing. Keeping the scale accurate is super important because I will have different parts slotting into each other. Turns out there’s a really weird scaling issue going from Illustrator to other software, because Illustrator uses a different size for pixels than most other programs. This was super frustrating!!!

A workaround I found to keep the scaling fixed was to open the original .ai (Adobe Illustrator) files in Inkscape via “File” > “Open”, being sure to select “*.ai” as the file type. This takes care of properly converting the pixels to the correct size and keeps the drawing to scale. I then saved the file as a DXF R12 while in Inkscape and could import the DXF into EAGLE as normal.

Another pro tip: if you place your drawing in the bottom left corner of the artboard in Illustrator or Inkscape, it will align with the EAGLE origin in the board layout on import.

Something else I came across was that the free EAGLE license has restrictions on board area. I ended up having to switch from my personal login to the school login I created when experimenting with Fusion 360 so that I could access my educational license and create my big sexy brain!

Opening *.ai files in Inkscape

Yay! Here’s the imported PCB shape to correct scale!

This first view of the brain is meant to SPIRAL. I created a schematic with an LED chain 80 LEDs long: 40 LEDs on the top and 40 on the bottom. I first placed all the LEDs, reoriented them so that the airwires were shortest, created a 5V plane on the top and a GND plane on the bottom, and then got to routing.

Here are some things I noted while refreshing my memory on using EAGLE:

  • Command “ripup @ ;” will rip up all polygons so it is easier to see components and airwires

  • I chose a trace width of 0.01 inch and via drill size of 0.02362205 inch

  • Ctrl + F flips the board over

I also attempted importing an Illustrator drawing as a DXF to go on the silkscreen layer, and it didn’t quite work. I’d love to be able to draw a bunch of strokes for the fissures of the brain and put those on the white silkscreen layer. On the right is a quick test I did. The stroke I drew translated to an inside and an outside outline, which is not really what I want. If I could make this only one line, the pen path, then I could potentially draw some folds and bumps to scale, import them like normal, select all the silkscreen, and bump up the thickness?

There’s also some nuance between fills and polygons. A fill is composed of a bunch of lil lines, which requires a whole bunch of data, but polygons are more SVG-like. I was following tutorials like this one, but I couldn’t get anything to work. This was too hard for me, so I’m abandoning the silkscreen for now.

I shifted gears to the second PCB and ran into the same scaling issues again.

Brain imports too small!

After opening *.ai in Inkscape and saving as DXF, the import is to scale!

I then went through the same layout and routing process. This brain should look more like actual neurons firing, so I laid out the LEDs randomly. In software, I’ll treat them as a graph with neighbors, with impulses coming from selected neuron hubs… I guess. Hope I can figure that out.

I don’t have any good images, but I also added some extra pads at the slots of the PCB. The idea is that when the two boards are slotted together, I can solder them into place for extra electro-mechanical connection. Not sure if the pads are the right size. I also grounded them… not sure if that’s right either.

Finished design

Generating gerbers

Here’s my Github repo with all my EAGLE files.

Design Review with Nick Poole

Time was quickly running out, so I had to go ahead and pull the trigger on ordering v10 of the brain boards, but I was able to get some eyes on my design after the fact. I used to work with my friend Nick at SparkFun some years ago. Luckily, he’s the best person ever and responded to an email from a desperate graduate student and estranged co-worker. He looked over my design and gave me pointers on creating 3D PCBs. In fact, he’s an expert on the subject and even gave a whole presentation, Boggling the Boardhouse, at SuperCon some years ago. I’m really so lucky to know such smart and kind people and am excited to take Nick’s pointers and develop brain boards v11 soon!

References

Github Repo with Brain files

Morkana Nopales

Get DXF files into EAGLE

Adobe Illustrator scaling issue in Inkscape

Inkscape scaling issue with SVG files (this didn’t end up fixing my issue but is still relevant)

Importing custom images into EAGLE

How to add logos to PCB layouts

Nick Poole - Boggling the Boardhouse video and Github

[ITP: Programming A2Z] Markov Chains and Context-Free Grammars

Notes

  • A Markov chain is a stochastic model describing a sequence of possible events in which the probability of each event depends only on the state attained in the previous event

    • Can simulate “stickiness” by modifying probabilities

  • Grammar = “language of languages”. Provides a framework to describe the syntax and structure of any language

    • Context-free grammar = rules don’t depend on surrounding context

    • Vocab: alphabet, terminal and non-terminal symbols, production rules, axiom

  • n-gram = contiguous sequence of characters or words

    • Used to reconstruct or generate a text that is statistically similar to the original text

    • Frequently used in natural language processing and text analysis

    • Unit of an n-gram is called the level, length is called order

  • A Markov Chain will return less nonsense if the n-gram is longer, output will be closer to the original text

  • Tracery = generative text tool that creates text that is algorithmically combinatorial and surprising while still seeing the authorial “voice” in the finished text

Predictive Text

Thinking about natural language processing and predictive text, one of my friends made me aware of the predictive text feature in iMessage (a setting I didn’t have turned on). Basically, I tapped the middle suggestion button over and over again in different text conversations. The generated messages are below, different for each thread. The text to my boyfriend just devolved into “mama” “mommy” over and over. Really weird…

Assignment

It’s that point in the semester where my brain is completely dead and I’m having a hard time thinking of text that I’ve come across that follows a pattern. Yikes! But my last food delivery order of sub-par Pad Thai sparked some inspiration.

Building off the context-free grammar using Tracery Coding Train example, I created a context-free grammar that simulates the super-friendly texts I get from DoorDash. Creating this grammar with the Tracery library was really easy because it essentially works like a mad-lib, choosing items randomly from word banks that I hard-coded. You can find the p5 sketch for my DoorDash CFG here.
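For anyone curious, the whole grammar really is just an object of word banks plus an origin rule. A trimmed-down sketch of the idea using the tracery-grammar package (these particular word banks are invented for illustration, not the ones in my sketch):

```js
// Context-free grammar for fake DoorDash texts, mad-lib style
const tracery = require("tracery-grammar");

const grammar = tracery.createGrammar({
  origin: ["Hi #name#! Your #food# order is #status#. #signoff#"],
  name: ["Priyanka"],
  food: ["Pad Thai", "dumplings", "a burrito"],
  status: ["on its way", "almost there", "being prepared"],
  signoff: ["Enjoy!", "Yum!", "You deserve this!"],
});

// Each flatten() expands #origin#, picking randomly from each word bank
console.log(grammar.flatten("#origin#"));
```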

Also, Gracy gave me the genius idea of generating texts from my dad, since we compare dad texts sometimes. My dad is … involved, to say the least. Generally, I’ll get reminders, motivations and wisdoms, unsolicited “suggestions”, and the occasional dad-in-the-life update with visuals. I started with the Markov chain example and put in my dad-text data. This was particularly annoying to gather because I’m working on my PC these days, so I had to manually select and copy+paste each text and compose emails to myself, etc, etc. Maybe this would’ve been easier using the iMessage app on my MacBook?! Anyway, I just created a new txt file in the p5 sketch with an assortment of my dad’s recent texts to me… and that’s it really! My sketch is here. I’m not sure how to put it exactly, but the generated texts feel pretty authentic to me!
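The Markov technique itself is compact: slice the source text into n-grams, record which character follows each one, then walk those choices. A bare-bones character-level version in plain JavaScript (order 4, seeded with a stand-in dad text, not my real data):

```js
// Map every n-gram in the text to the list of characters that follow it
function buildModel(text, order) {
  const model = {};
  for (let i = 0; i <= text.length - order - 1; i++) {
    const gram = text.slice(i, i + order);
    (model[gram] = model[gram] || []).push(text[i + order]);
  }
  return model;
}

// Walk the model: repeatedly pick a random continuation of the last n-gram
function generate(model, order, length) {
  let gram = Object.keys(model)[0];
  let out = gram;
  for (let i = 0; i < length; i++) {
    const next = model[gram];
    if (!next) break; // this n-gram only appears at the very end of the text
    out += next[Math.floor(Math.random() * next.length)];
    gram = out.slice(-order);
  }
  return out;
}

const dadTexts = "Good morning beta! Drink water. Good luck today. Call me back.";
console.log(generate(buildModel(dadTexts, 4), 4, 120));
```

A longer n-gram (higher order) gives output closer to the original text, which matches the notes above.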

DoorDash texts

Real dad texts

Generated dad texts

You know, I actually really love that I landed on these two sketches because it is only my dad and DoorDash that text me haha! Or maybe that’s just how it feels sometimes!

I realize that these sketches I modified for the assignment are super close to the examples provided and are pretty simplistic. I haven’t had the time to build these examples out fully yet. In the future, I’d love to create more options for the DoorDash CFG. I think it would be kinda funny if I generated texts that sometimes responded back to my dad. Also, I want to build out the UI in p5 to look like the Messages app on my phone so that I could put my generated text in a meaningful visual context.

Resources

Priyanka’s DoorDash CFG sketch

Priyanka’s Dad Text generator sketch

https://github.com/galaxykate/tracery/tree/tracery2

[ITP: Thesis I] Project Proposal Draft and Fusion Confusion

First things first, here’s a link to my Thesis Project Proposal Draft. Remember, it’s just a draft…

Embodiment

I’m continuing to explore my own embodiment through different sketches of self portraits. I’ve been using some Inktober prompts as inspiration.

Inktober 20, bulb - sometimes ideas carry me away

Inktober 21, slingshot - why do I attack myself?!

Please Wash my Brain

What this project is

A life-size brain 3D PCB light sculpture with acrylic cross sections. The middle cross sections will be double-sided PCBs with RGB LED neurons. The neuron firing patterns will change as the brain detects people standing in front of it.

How I will be making it

PCB prototypes were milled on the Bantam and soldered by hand. The final version (100+? LEDs/PCB) will be manufactured by JLCPCB with a stencil for solder paste, hopefully. Then I will populate all the components.

Acrylic cross sections will be laser cut.

Materials left to buy

Schedule

  • 11/15 - 11/19: 3D model and cardboard prototype. I’ve been struggling with this this week, here are some options:

    • Try to model my own simple brain in Womp or Nomad Sculpt (for iPad) and then slice with Fusion Slicer

    • Draw my own brain cross-sections in Illustrator and just scale down and add slits

  • Order sensors - DONE!

  • 11/20 - 11/22: design v10 of brain PCBs (there’s two!), solder pads as mechanical connection? Do I need to order any more parts? LEDs, qwiic cables, solder paste, spatula?

  • 11/23: Order v10 PCBs

  • 11/23 - 11/28: Talk to some LED/PCB people — Aaron Parsekian, Brian Oakes, someone at SF? Nick?

  • 11/28: Assemble PCBs

  • Need v11 of PCBs?

  • On, and on, and on: write software, laser cut acrylic, assemble sculpture

Fusion Confusion

Tutorial

I downloaded Fusion 360 and secured a student license to use the software. I followed this tutorial to make a stamp in Fusion in hopes of learning my way around. The results are below, but I’m not really sure I’ve learned this software yet.

Fusion Slicer

My plan for getting the cross sections needed for my sculpture is to slice up an existing 3D model of a brain. I downloaded an application called Fusion Slicer, which does just that. This Fusion extension is deprecated, so it is really hard to find for download and there isn’t support for it anymore, but I was able to find downloads and tutorials by the same guy who created the tutorial above, Product Design Online.

Once I got that all set up, I started looking for models of brains. I searched through sites like Turbosquid, GrabCad, and Thingiverse for models that weren’t overly complicated. A lot of them were really accurate and included all the parts of the brain, but I really just needed the simple cloud shape.

Anyway, I tried out a lot of those models. Slicer works with .stl or .obj files so that’s what I downloaded from those sites. I tried opening the meshes directly in Slicer but the software would hang for a long time and eventually crash. I decided to open up the meshes in Fusion to see what was going on with them. Many of the models from the internet had warnings that the mesh was incomplete or open. Through some googling I found I could fix that by using “Mesh” > “Prepare” > “Repair”. Then I also learned that Slicer might be crashing because the brains were too complex! I used “Mesh” > “Modify” > “Reduce” in Fusion to simplify the mesh and reduce the number of vertices/faces. Kinda ironic!

Finally, the brain meshes were opening in the Slicer software! Below are some screenshots. You can see it does a great job creating 2D slices from a 3D model. You can configure the number of slices, the direction, the placement, almost everything! It’s starting to look like what I imagined for my sculpture. Except everything in red has an error … so basically all the slices. I went through this process with various models I found online, but essentially the meshes were all too complex and anatomically accurate, with all the extra giblets and stuff inside, for Slicer to cut without conflicts. I couldn’t figure out how to delete the meshes inside the cerebrum and make the brains dumber. Double ironic!

How to Customize Shirts Using Heat Transfer Vinyl

Ever want to buy a custom shirt and it doesn’t exist? Ever wonder what people do with Cricuts?! Well you can make custom graphics and text for clothing using heat transfer vinyl!

My school has a bunch of Cricut knock-offs called Cameo. I was actually first introduced to them in my first semester Lo-Fi Hi-Impact class at ITP. I’ve linked some docs from that class below for my own reference I guess.

First off, create your design in Illustrator. If your design includes text, make sure to “create outlines” for it. Save the Illustrator file as a DXF. The Cameo works with this ancient software called Silhouette Studio, so make sure you have that installed.

Bring your design into the Cameo software by either opening it or dragging the DXF file into the software. Make sure to mirror the design and resize it to fit your surface. The maximum cut size is 12 inches in width but I think the length can be whatever because the material passes through the machine.

Now you are ready to set up your Cameo machine. Plug it into power, then connect the Cameo to your computer using the printer/USB cable.

Prep your vinyl by cutting it to the relative size of your design. Put the vinyl shiny side down on the sticky cutting board. The ITP cutting boards are … well loved, so they’re not so sticky anymore, but a little tape can keep the vinyl in place. Load the cutting mat with the material into the Cameo machine.

In Silhouette Studio, navigate to the “SEND” tab. Choose “Heat Transfer, Smooth” as the material. The settings that go with this material have worked for me, so no need to adjust the blade depth or speed. Make sure the software detects the machine connected to your computer and hit the “SEND” button to start cutting.

Once the machine has stopped cutting, eject the mat from the machine. The next step is called weeding, which is removing all the excess material from the plastic backing. I grabbed my handy, super-sharp tweezers and went really slow to make sure I wasn’t pulling up any of my design.

Before weeding

After weeding

Once the vinyl was weeded, I flipped it over and placed it on my surface, in this case a cotton T-shirt, plastic side up. Then, all you need to do is iron it in place and peel the plastic away. To make sure the vinyl was really well attached and not curling up on the edges, I turned the shirt inside out and ironed some more from the inside.

Final Product

[ITP: Thesis I] Uhhhh

Research - Embodiment / Bodies everywhere!

The Wisdom of Your Body, Hillary L McBride, PhD

  • “The body is not a thing we have, but an experience we are” - Theresa Silow

  • How do I experience my body?

  • If you were in fact your hand, how would you reveal yourself as a hand? — not sure what this means any more in hindsight… wish I had a page number for this

  • We forget all about bodies until we face pain, aging, illness, trauma, incarceration, or impending death

  • People are reduced to body-objects not empowered as body-subjects

  • We have forgotten ourselves as bodies to survive the pain or to be compliant

    • Dissociation!

  • Embodiment helps us be fully present and connected with ourselves and with those around us

  • Trauma occurs when something negative and unexpected happens, and it leaves us feeling confused, overwhelmed, and powerless (59)

  • Depression is not a feeling but a nervous system state we move into after having been overwhelmed or mobilized for too long. Same thing for burnout (63)

  • We complete the stress cycle when we release our trauma response mechanisms by moving the stress-related energy out through our body (65)

  • Interoception = ability to sense what is happening in your body and to know yourself from the inside out

  • Culture has ideals about bodies and they are communicated through three primary sources of influence: parents, media, and peers (85)

  • Reinforced idea that our appearance is connected to our social belonging (87)

  • Objectification as the experience of being female in a patriarchal culture that sexually objectifies the female body.

  • Self-objectify = to internalize the observer’s perspective as a primary way to view oneself (90)

  • Body neutrality = appearance acceptance or appearance neutrality

  • How we value and honor our own bodies impacts how we value and honor the bodies of others (96)

  • There is goodness in our own body

  • Interoception is essential for creating the experience of balance and homeostasis, regulating emotion, having a sense of body ownership, and experiencing continuity as a person over time (98)

Media

  • Trauma schism = mentally remove yourself from your body - Big Mouth, season 7 episode 9

  • “The body can’t live without the mind” - Morpheus, the Matrix

My own thoughts

  • Want to reconnect and remember my body

  • Shut down connection to the body as a trauma response, it’s just easier

  • Physically practicing my physical art practice

  • Me: I don’t want to be perceived, I don’t want to attract any extra attention, be invisible, take up as little space as possible

What it feels like, or what I visualize, in a dissociative episode. These happen when I have heightened anxiety or stress

I also tried completing Inktober last month. I didn’t succeed with creating 31 drawings in 31 days but I did represent myself a few other ways. I’m planning on still completing the rest of the prompts for Inktober, on my own timeline, but it might be helpful to my thesis if I keep on approaching these prompts as self-portraits!

8 Dream

12 Nest

17 Blush

18 Snooze

Eyes always closed… why am I always sleeping?!

Office Hours

Blair Simmons

  • I want to know about your portraits

    • Devices are an extension of self

    • Release of data

  • Should look into the history of portraiture

    • Who are the people who historically could afford to get portraits made? Can we reclaim this process?

  • Exploring a self-portrait could be an opportunity to be known and held in memory. Is there a desire to share?

  • Lean in to your instincts

    • Do the work to decide why. What’s the justification for what you are doing, choices you make?

      • Why is this all about you? Why is this literal? Why feminine?

      • Note when things are frustrating and try to disentangle, note when things are joyful

      • This thesis is an exploration of embodied/instinctual making

  • Is this project a critique? Inserting the self (or the human) into logical, technical thinking/design

    • The self wasn’t meant to be rational, emotions were meant to be taken out of the scientific process

    • What do we lose when we attempt to take out the human bias when designing tech?

    • Personal connection to the work

Sharon de la Cruz

  • Find more examples of self-portraiture. Why would you make it?

  • Dabble in the range of literal to abstract

  • I’m in practice with my body, body is a vehicle to create, PHYSICAL art practice

    • I want to more consciously understand my body and its relationship to my making process

    • How are you making differently now?

    • “Don’t think, just do” has been my motto for the last few years, trying to put words to the gut feelings

  • Give yourself space to defend and be secure in what you made, reflection on the message

  • How will you make people (viewers) sit and be in the story?

Gabe Barcia-Colombo

  • Brain looks cool but doesn’t say much

  • I should be writing everyday… about anything

  • Brain storm a list of body interfaces, inputs and outputs

  • Some Jan Svankmajer reference…

Prototype

Ok, so if we can remember, I ended last week with two PCB designs and one milled PCB. I started by milling my second PCB, just a different view of the brain with a different neuron configuration.

I got all my parts in from Adafruit, so I was ready for assembly. I made some markings on the PCB to indicate the orientation of the LEDs. Then I glopped some solder paste on all the pads. This was tricky because the paste is hard to control coming out of the tube; I had to use my favorite, super sharp tweezers to clean up and push around the paste. Then I placed all my LEDs and used the hot air gun and a coffee cup warmer to reflow the solder. Honestly, assembly worked a lot better than I thought it would!

Pin indicator markings

Solder paste

Completed prototype

So I plugged this thing into my Arduino, powered it with my computer, and got this error: “Power surge on USB port”… FUCKING SHITTY LED footprint!!!!!! Turns out the footprint Adafruit provided for their NeoPixel is not drawn to match the physical footprint of the part, so the pads short to ground, since this self-milled prototype has conductive copper all along the top surface. So I needed to redraw the LED footprint.

Here’s all the dimensional information on the part datasheet… not enough measurements to draw an accurate footprint as far as I can tell.

I took a picture of the pads on the bottom side of the LED and brought it into Illustrator. I kept the image to scale with the part and measured the size of the pads. Then I followed this trusty SparkFun guide on how to make SMD footprints in EAGLE. It took me longer than I would like to admit, but my improved footprint is below. I revved my EAGLE files with the new footprint.

Here’s a comparison of my new footprint (above) vs. the Adafruit footprint (below)

This is v02 design of my brain proto

With the updated board files, I was ready to start fabricating again. I followed the same steps of milling, marking, solder paste, placing LEDs, and reflowing to create my assembled prototype. This time, I thoroughly checked the board to make sure it wasn’t shorting across power and ground before powering it up.

And voila! It’s alive!

Honestly, I feel proud that I could revisit my PCB drawing skills and see that they weren’t all lost to time! Also, I think I’m a pro on the Bantam now, milling my own boards. Super bad-ass!

Next Steps

  • Redesign the other brain view in Eagle with new footprint

    • Mill, assemble, test that proto

    • Try out some other LED animations?

    • Need to order presence sensor

  • Do Fusion 360 Tutorial

    • Find (or model) a brain

    • Try out the slicer and make a cardboard prototype

  • Continue working on Washing Machine

    • Try out new motor driver

    • Do I need to order a stronger motor and power supply?

Resources

My GitHub repo with Eagle design files and code

https://learn.sparkfun.com/tutorials/designing-pcbs-smd-footprints

https://learn.sparkfun.com/tutorials/making-custom-footprints-in-eagle/all

[ITP: Programming A2Z] Chatbots

Bots Thoughts

  • Bots follow the PUDG model = procedural, uncreative, data-driven, graffiti

    • “unoriginal genius” = remixing pre-existing textual artifacts

    • Interventions in a public space, inherently politically progressive

  • “My bot isn’t me but something I’m responsible for”

    • Need to think through all the misuses and consequences

  • Techniques: matching and substitution

    • AIML (Artificial Intelligence Markup Language)

    • Retrieval model = picks a response from a fixed set based on the input and context

    • Generative model = generates new responses from scratch based on Machine Translation techniques

    • RiveScript = simple language with “Unix-like” approach, takes human input and gives an intelligent response

  • Other considerations

    • Conversation length

    • Open vs. closed domain conversation

Assignment

For the bot assignment, I wanted to create a really simple screaming Discord bot, seasonal for spooky season! All this bot will do is scream at you or scream back whatever you tell it (change text to all uppercase letters). To get this going, I just watched some ✨super exclusive✨ ✨unreleased✨ Coding Train video tutorials and I hacked together the bot by referencing these examples.

I started by creating a node project by making a new directory. In the terminal, I navigated to that directory and typed “npm init” to create the package.json file; the Node Package Manager then installs packages and dependencies into the node_modules folder. To work with Discord, I typed “npm install discord.js” to install the library, and also “npm install dotenv” to store environment variables.

Then I went to discord.com/developers and logged into my account and created a new application. Then I generated the OAuth2 URL and made sure to select “bot” and “applications.commands”. When I followed that link, my bot was added to the Programming A2Z Discord server! Success!

I then created a .env file to hold onto my secret variables which contain my client ID, server ID and bot token. I followed the tutorial video to use the Discord.js library to login to my bot. The green dot shows it is live!

Success!

Then I built the scream.js file, which handles the slash command; each command gets its own JavaScript file. I also copied the deploy-commands.js file from Dan’s tutorial, which reads and registers all the commands. Finally, I finished up the bot.js file to handle the command interaction: if an interaction comes in and it’s a “scream” command, run the scream.js handler.
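For reference, a minimal scream.js in the discordjs.guide style looks something like this (the option name here is my own choice; the overall shape matches discord.js v14 slash commands):

```js
// scream.js — a slash command that yells back whatever it's given
const { SlashCommandBuilder } = require("discord.js");

module.exports = {
  data: new SlashCommandBuilder()
    .setName("scream")
    .setDescription("Screams back whatever you say")
    .addStringOption((option) =>
      option.setName("text").setDescription("What should I scream?")
    ),
  async execute(interaction) {
    const text = interaction.options.getString("text") ?? "AAAAAAAHHHH";
    await interaction.reply(text.toUpperCase()); // that's the whole bot, honestly
  },
};
```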

Tada! Voila! It works!

I was so inspired by Tres’s arguing MCR bots that I created a second bot named scaredy bot! It is up and running, but maybe a future Priyanka will finish this project, in which the scream machine screams, scaredy bot gets scared and also screams, and the two bots just keep on screaming at each other…

You can find my code at this Github repo.

Questions:

When do I need to run deploy-commands.js? You only have to run it once. If you change anything inside execute(), you don’t need to rerun deploy-commands; if you change something in the SlashCommandBuilder() definition, then rerun deploy-commands.

Resources

https://github.com/Programming-from-A-to-Z/Discord-Bot-Examples

https://discord.com/developers/applications

https://discordjs.guide/#before-you-begin

https://www.omc-stepperonline.com/nema-17-bipolar-59ncm-84oz-in-2a-42x48mm-4-wires-w-1m-cable-connector-17hs19-2004s1

[ITP: Digital Fabrication] Algorithmic Design of 2D Geometry

Overview

Danny gave us a lecture on using the algorithmic and sometimes random nature of p5.js and Processing to help design physical art pieces as part of the digital fabrication workflow. He has code examples for exporting DXF and PDF files from Processing and SVGs from p5.js. The assignment for this week was to export 2D geometry from code and cut it on the laser cutter.

Sketch Selection

I had other presentations to do for this class this week, so I knew I wouldn’t have time to come up with new generative algorithms; instead, I looked through some things I had coded in the past. I considered this sketch, which I wrote referencing Nature of Code. It’s lines drawn in a Perlin noise flow field, but it really looks like a lot of hair! I decided that though the output looked super cool, it wouldn’t be a good candidate for laser cutting.

A preliminary step to the “hair” was these dots navigating a flow field. I thought this was still interesting to look at and might be a good candidate for laser cutting. I started adapting the sketch to turn the dots into circles with outlines to cut. I also made fewer of them, because the laser might melt the material if it cuts in the same place over and over again.

Getting the sketch to save an SVG was somehow difficult. I was only able to get it to work by copying my code into the example Danny gave us.

Once I got this sketch to be relatively laser-cuttable, I didn’t like the output as much anymore. But then I remembered my flower generator sketch from a few years ago! I was able to slow down the frame rate and save SVGs from the sketch. What’s kind of funny is that whenever I clicked, the sketch wouldn’t export the current flower; it would export the next one. So the flowers I picked were kind of a surprise.
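For reference, one common way to get SVGs out of p5.js is the p5.js-svg renderer; I can’t say that’s exactly what Danny’s example does, but the minimal pattern looks like this (it requires the p5.js-svg library loaded alongside p5, which adds the SVG renderer constant):

```js
function setup() {
  createCanvas(400, 400, SVG); // draw into an SVG element instead of a canvas
  noLoop();                    // render one frame at a time
}

function draw() {
  // ...generative flower drawing goes here...
  circle(width / 2, height / 2, 100); // placeholder geometry
}

function mousePressed() {
  save("flower.svg"); // downloads the current frame as an SVG file
  redraw();           // generate the next flower
}
```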

Once I had my SVGs, I brought them into Illustrator for some post-processing. Because of the sketch I chose, the post-processing was significant. I had to remove all colors, create outlines at the correct line width, and connect some shapes. But finally the SVGs were ready for laser cutting!

Original flower SVG

Ready for laser cutting!

Original flower SVG

Ready for laser cutting!

Final Product

[ITP: Thesis I] Research and Prototype Update

Research

I did a lot of research on the parts of the brain and neurons which you can find at this blog post.

I also finished reading Your Body is Not Your Body, edited by Alex Woodroe and Matt Blairstone. This is an anthology of horror stories written by trans and gender non-conforming authors. I really enjoyed reading this book, fun research for spooky season! After finishing, I went through story by story and did some word association. Here are some of my thoughts:

Some themes/concepts I identified were: mutilation, transfiguration, acceptance, gender presentation, gender roles, gendered bodies (society is stuck on classifying bodies this way), weaponizing bodies, bodies under attack, bodies in desperation, violence

What happens when the body doesn’t look or function the way we’re used to? How does this challenge our way of thinking?

Interview

Erin Mamaril, MS in Nursing, BS in Neuroscience & Cognitive Science, my smartest high school friend

General question: what does anxiety and depression do to the brain?

  • Check out: Greg Dunn, Neuro artist

  • There are circuits of specific interest —> prefrontal cortex

    • Connections break down and there’s noisy firing related to anxiety/depression

  • Default mode network is the part of the brain that is active when we’re not doing anything, like day dreaming and thinking about the future

    • This gets broken down too in depressed people

  • Anxiety and depression are really connected in the brain

    • Amygdala is overactive when we are fearful, less control

    • Physical responses: heart rate, breathing, etc.

    • Lizard brain (brain stem) takes over

  • Neurogenesis: it was largely believed that humans don’t generate more neurons, we get what we’re born with

    • Research has shown that new neurons can be produced in the hippocampus but this doesn’t really happen in the case of chronic anxiety and depression

  • Mindfulness is something you have to intentionally practice

    • Physical vs. cognitive parts of anxiety, there’s a connection but they’re also separate

    • Tune into what's happening in your body to learn that this level of threat response isn't necessary

Creative Experiments and Investigations

Bantam Desktop PCB Mill Test

Last week I learned how to use the Bantam mill to make PCBs for my Digital Fabrication class. Here’s the demo PCB I cut. I plan on using the Bantam to prototype my brain sculpture. And the shop stocks a whole bunch of sheets of copper circuit board.

For my own reference: single and double sided PCB blanks, 4x5 inches.

Circuit Design

I ordered some preliminary parts:

I chose these LEDs because they're affordable and have both library and EAGLE support.

Another thing I learned while prepping to design this prototype is that the EAGLE PCB design software reaches EOL (end of life) in 2026. Autodesk will no longer support it, but EAGLE designs and libraries will be fully compatible with the Fusion 360 electronics workspace.

That being said, I used EAGLE a lot at my last job, so I'm pretty familiar with it, and I'm going to use it for this project. It's been a long time since I started it up on my computer, so I needed to create a new personal Autodesk account and get my workspace set up with the SparkFun EAGLE libraries and settings. In EAGLE, I went to "Options" and "Directories…" and made sure my paths were set up like this:

This way I will be able to use all the parts and design rules developed by SparkFun over the years. I might need to import some Adafruit libraries so that I don’t have to draw the symbol and footprint for the LEDs. Phil also told me that there might be design rules floating around for working with the Bantam mill (…can someone say: Homemade Hardware??).

Might as well install the Adafruit libraries as well…

Ok, cool! I got my EAGLE environment all set up, and look how cool this is! Here are the schematic drawing and PCB footprint for the LEDs I've ordered; they're already part of the Adafruit libraries.

I wired up this schematic. It’s really just an addressable LED strip with some holes to attach headers and wires (spinal cord?!).

Getting a custom PCB shape took some trial and error. You can import DXF files into any board layer. I made some DXF outlines of brain shapes in Illustrator, but the shapes didn't show up when I imported them into EAGLE. For whatever reason, generating the DXF files in Inkscape was the only way I could get them imported properly. I followed this tutorial. Here's my first round of prototypes! Next step: the Bantam!!

I updated some parameters to mill on the Bantam. Generally, I followed some recommendations from the Homemade Hardware curriculum. Here’s what I changed:

  • Increased the trace width to 1 mm (0.04 in)

  • Added a GND polygon

Phil helped me get set up with the Bantam again. Next time I'll be able to work the machine myself for sure! One thing he reminded me was that the machine generally mills only as deep as the tool diameter (or even less) per pass, so it will make multiple passes over paths, especially if it's cutting all the way through.

Milled brain proto! :)

LED animations to write in code

  • Neurons firing, normal

  • Neurons firing, FAST

  • Spiraling — does this mean that LEDs need to be laid out in a circular pattern?!

  • Fuzzy, random, noisy neuron firing
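Before touching the microcontroller, I can probably prototype these animations in p5.js first. Here's a sketch of the fuzzy/noisy firing idea, with each circle standing in for one pixel (all the numbers are placeholders, not final values):

```js
// 50 stand-in "pixels"; noise() drives brightness for the fuzzy firing look
const NUM_PIXELS = 50;
let speed = 0.05; // bump this up for the FAST version

function setup() {
  createCanvas(520, 100);
  noStroke();
}

function draw() {
  background(0);
  for (let i = 0; i < NUM_PIXELS; i++) {
    // Each pixel samples its own slice of noise, animated over time
    const b = noise(i * 0.3, frameCount * speed) * 255;
    fill(b);
    circle(20 + i * 10, height / 2, 8);
  }
}
```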

Dude, get that Fusion 360 out of my face!

As much as I've been avoiding the propaganda, I think I might have to finally look into Fusion 360. I got some impromptu office hours with Phil while I was checking out the Bantam today and, surprise surprise, he was preaching the gospel of Fusion 360 again. If I want to make a 3D sculpture, I can draw (or find) a 3D model of a brain and slice it in Fusion 360. Adnan already gave me the slicer software, which makes 2D slices of 3D drawings. Guess I need to start watching some tutorials…

[ITP: Digital Fabrication] Project Update

Big Brain Research

So I watched a series of brain and neuron videos from this great YouTube channel called CrashCourse. Here are some of my key takeaways:

  • The brain is part of the central nervous system

  • The brain is the source of self and controls human behavior

  • Parts of the brain have specific functions yet nearly every region of the brain lights up during simple tasks

  • “Old brain” = medulla + brain stem + pons + thalamus + reticular formation + cerebellum

    • Basic bodily functions

  • Limbic system = amygdala + hypothalamus + hippocampus

    • Memory consolidation, fear, aggression, learning

  • Gray matter = two hemispheres that make up 85% of brain weight

    • Oversee thinking, speaking, perceiving

    • Corpus callosum connects the two hemispheres

    • Left = language, right = creative function; each hemisphere controls the opposite side of the body

    • Cerebral cortex = 20 billion neurons and glial cells for protection

  • Neurons are not only in the brain, they run the whole length of your body!

    • They respond to stimuli and transmit signals

    • They are the longest lived cells in your body, brain neurons will live as long as you do!

    • Neurons are irreplaceable ( … or are they?! In my interview with Erin she said there’s research proving otherwise…)

    • They have large appetites, 25% of the calories humans consume go to brain activity

  • Three types of neurons:

    • Sensory neurons = transmit impulses from sensory receptors to the CNS

    • Motor neurons = impulse moves from CNS to rest of body

    • Interneurons = impulse moves between sensory and motor neurons, these are the most abundant

  • Neurons transmit at one uniform strength and speed but can vary in frequency

    • The brain reads the neuron signals like binary code, organized by location, sensation, magnitude, and importance

  • Neuron action potential = depolarize resting neuron to cause a big enough change in membrane potential to trigger voltage-gated channels to open

    • Resting membrane potential = -70mV

    • Trigger threshold = -55mV, all-or-none triggering

    • A weak stimulus tends to trigger less frequent action potentials, intense signals have increased frequency to tell muscles to contract harder

  • Conduction velocity is faster for reflexes and slower for glands and guts

    • Transmission speed is affected by whether or not there’s a myelin sheath on the neuron axon

Remaining brain questions:

What purpose do the folds in the brain have?

What do anxiety, depression, and poor mental health do to how the brain functions?

Moodboard - interesting connections and weird thoughts

Neurons firing (video)

Light map

Stars

big brain, smooth brain, brain fart, mind blown, brain spiral, black box, two hemispheres

Sketch

For this project I would love to build a custom PCB sculpture. Components would only be on one side of the circuit board and I’d like to build some prototypes on the Bantam Desktop mill in the ITP shop.

LEDs

I want to illuminate a series of LEDs. They need to be addressable so that I can control each pixel individually to make neuron-firing animations. I only need the LEDs to emit white light. Here are a couple of options I was considering:

I ended up going for the NeoPixel mini because of its small footprint. Also, Adafruit has provided the EAGLE component that I can use for my design and I can leverage the NeoPixel library. This will cut down my work considerably! These are also a relatively affordable option as far as I can tell from my searching. I might need up to 50 pixels (or more?!) in my final sculpture, so I’ll be going through a lot of NeoPixels...

Resources

https://www.youtube.com/watch?v=pRFXSjkpKWA

https://www.youtube.com/watch?v=vHrmiy4W9C0

https://www.youtube.com/watch?v=qPix_X-9t7E

⚡️ https://www.youtube.com/watch?v=OZG8M_ldA1M ⚡️

Neuron Forest by Katie Caron

[ITP: Digital Fabrication] Bantam Othermill

Overview

The Bantam Othermill is a desktop PCB milling machine for milling printed circuit boards, small 3D parts, and engraving. A milling machine removes material from a workpiece by rotating a cutting tool and moving it into the workpiece.

There is also a fourth axis attachment with which you can rotate a material around the x-axis and mill away material to make 3D parts.

Working Volume

The milling area is pretty small: 4.5 x 5.5 x 1.35 inches with the spoilboard installed and 4.5 x 5.5 x 1.6 inches without. There are no spoilboards installed on the mills in the shop, and Phil said there's a little more wiggle room with the material thickness, more like 2 inches max.

Materials

Don’t use these:

  • Steel (stainless or otherwise)

  • Iron

  • Titanium

  • Fiberglass

  • Hard stone and precious gems

  • Glass

  • Food products

Use these:

  • FR-1 PCB blanks

  • Machining wax

  • Linoleum

  • ABS

  • Acrylic

  • Brass (for engraving)

  • Aluminum (for engraving)

  • Silver (for engraving)

For a complete list of materials you can use and tips, check out this article.

Confusion between Othermill and Bantam Desktop CNC

Othermill and Othermill Pro are both small table-top CNC mills from Bantam. They can be identified by their white plastic housing, and we have many of them in the ITP shop. Bantam stopped selling them in 2017, and Phil was adamant about focusing on the Desktop machine (still being manufactured/supported) with its wider capabilities.

Othermill

Bantam Tools Desktop PCB Milling Machine

Before you begin - what you need

Design file - SVG, BRD, Gerbers, and G-code files

Milling material - it cuts almost any material softer than steel, for example wood, plastic, or machining wax.

Milling tool - a 1/32” Flat End Mill and a 1/64” Flat End Mill are included, but you can use any cutting tool with a 1/8” shank. We have all the tools, bits, and things in the shop. Eventually there will be kits available for checkout in the shop.

Software

You can download the Bantam Tools Desktop Milling Machine Software from here but there are computers in the shop designated for use with the Bantam machines with the software already downloaded. All you need is your design file on a flash drive.

Other software that might be handy is anything used for drawing and/or modeling, like Fusion 360, EAGLE, Inkscape, etc. The Bantam software expects .svg, .brd, PCB Gerber, or G-code files. Some CAM software requires a Bantam Tools post-processor to integrate with the milling machine; the post-processor translates the design into tool paths and machine movements specific to the Bantam.

Process - Cutting sample PCB

1. Check out a Bantam tool set kit from the ITP shop. Below are some of the things you might use when working with the Bantam. There's also a drawer under the work bench with a bunch of tools and materials for working with the mill.

2. Plug the desktop mill into power and make sure the computer is connected to the machine over USB.

3. Turn on the machine with the switch in the back. When it is first powered on, the mill will calibrate itself.

4. Open your design file in the Bantam 2.0 software. The software will prompt you through the setup and milling process. In the screenshot below you can see that there's a warning. As a safety feature, the mill won't rotate if the front door is open.

5. Start by installing the tool you will be using. The machine should be left in a state where no tools or bits are installed, so if there's any hardware already in place, feel free to remove it. For our setup, we installed the fan over the bit to clear away excess material. Screw the collet into the nut and the bit into the collet. Use the two wrenches to install the tool, but keep it only finger tight so it can be removed in the future.

6. The software will prompt you to go through the process of locating the newly installed tool. The machine will go through an automated process of conductively measuring the vertical distance between the tool tip and the collet so that the machine knows where the tip of the tool is in space.

7. Follow the software prompts to define the material and size. Make sure to double-check all these measurements.

8. Prep the material for install. The sheet of FR-4 will be stuck down to the bed using double-sided tape. Make sure to cover as much of the area as possible, being careful not to wrinkle or double up the tape. Inconsistencies in material level will affect the cut. At some point the software will ask for the z-offset, so measure the PCB material together with the tape (both sides of paper backing removed).

9. The software will prompt you to attach the material and push the bed forward. You can clean the bed down with rubbing alcohol. Place the material as close to and lined up with the bottom left corner as possible. This is the origin. Stick the material down.

10. Confirm the placement and scale of your design and tool path. Be sure that what you’re milling was designed with the tool in mind and never place your design directly on the edge of the material.

11. Setup is complete; now we can run the mill! NOTE: the Bantam laptop isn't currently logged into the PCB subscription needed to cut the .brd file, but the shop is working on getting laptops set up for checkout with all the correct software and subscriptions for any digi-fab process.


Voila!

12. The last thing to do is remove the circuit board from the bed. Use the scraper to gently pry up the material. You can use some isopropyl alcohol to loosen the adhesive. Remove all bits from the machine and put everything away.

Notes

  • Lines can only be milled if they are at least as wide as your tool

  • Remember, the smaller the tool, the more fragile it will be and the slower you’ll need to cut. Big tools are great for roughing passes and clearing out lots of material.

  • The tool moves in the z- and x-directions; the bed moves in the y-direction

Terms

CNC = computer numerical control; all of the robot-y machines in the shop are CNCs

CAD = computer-aided design. Software includes SolidWorks, Fusion 360, Vectorworks, etc.

CAM = computer-aided manufacturing

Collet = a segmented band or sleeve put around a shaft or spindle and tightened to grip it

Shank = the end of a drill bit grasped by the chuck of the drill

Spoilboard = disposable workbench mounted above a working/milling surface

Fixturing = the process of securing the material to the milling machine's bed

For my own reference…

FR = flame retardant materials

FR1 = paper substrate for single-sided boards, RoHS certified

FR2 = basically FR1

FR3 = basically FR2, except it uses epoxy resin instead of phenolic resin

FR4 = glass fiber composite with a flame retardant epoxy resin, water resistant, superior electrical insulation, excellent copper adhesion, preferred for multi-layer PCBs

Resources

Bantam Tools Desktop PCB Milling Machine Documentation

Technical Specifications

Materials Overview

Bantam Tools Milling Machine Software

Light Up PCB Badge Example Project

https://www.wellpcb.com/fr1-pcb.html

[ITP: Programming A2Z] Word Counting

Notes

  • “Artisanal data” coined by Sarah Groff-Palermo = small, fragmented, incomplete, human

    • Data to express who we are in the language of today

  • Concordance = list of principal words in a text, listing every instance with its immediate context

  • Sentiment analysis, pronouns hold the key! —> James Pennebaker

  • An associative array relates a key to a specific number or value

    • Un-ordered list, dictionary

  • loadStrings() function returns an array where each element is a line from the text

  • TF-IDF (term frequency inverse document frequency) = what words are important to this text versus others

    • score = term frequency * log(total # of docs / # of docs term appears in) (see the sketch after these notes)

  • Corpus = collection of written texts
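To make sure I actually understand the TF-IDF formula above, here's a tiny plain-JavaScript sketch with made-up documents (one common variant that normalizes term frequency by document length):

```js
// Made-up corpus for illustration
const docs = [
  'the brain is the source of self',
  'neurons in the brain fire signals',
  'the laser cutter cuts acrylic',
];

// Term frequency: how often the term appears in one document,
// normalized by the document's length
function tf(term, doc) {
  const tokens = doc.split(/\s+/);
  return tokens.filter(t => t === term).length / tokens.length;
}

// Inverse document frequency: log(total docs / docs containing the term)
function idf(term, allDocs) {
  const containing = allDocs.filter(d => d.split(/\s+/).includes(term)).length;
  return Math.log(allDocs.length / containing);
}

// "brain" appears in 2 of 3 docs, so it scores modestly;
// "laser" appears in only 1, so it scores as more distinctive there
console.log(tf('brain', docs[0]) * idf('brain', docs));
console.log(tf('laser', docs[2]) * idf('laser', docs));
```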

Assignment

1) initial word counting, 2) tokens split by whitespace, 3) German stop words removed

I’ve had this question written down in my notebook for the last week: “what are texts that are important/interesting to me?” I didn’t really get a chance to think too much about it before jumping into this assignment. I went to Project Gutenberg and the home page featured Meine Erinnerungen aus Ostafrika by General von Lettow-Vorbeck. German! I can understand that!

I wasn’t so sure about that specific text but I looked at the list of texts in German and I landed on this: Der Bucheinband: Seine Technik und seine Geschichte by Paul Adam, a book about the art and history of book binding from 1890. Pretty cool, I think!

I started by putting the plain text into a .txt file and removing the weird header and footer license stuff that was in English. I counted the words of this book using the code from the Coding Train tutorial.

In the initial counting I noticed that a lot of single letters were being counted as words… which was strange. When I compared my token array to the text itself, I figured out that the special German characters (ö, ä, ß, etc.) were causing issues with the token splitting. I changed the split call from non-word characters to splitting by whitespace, which made the word list much more sensible to me.
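Here's my reconstruction of what was going wrong: JavaScript's \W only knows ASCII word characters by default, so umlauts count as word breaks.

```js
const line = 'Der Bucheinband: Über Bücher und Größe';

// \W+ (non-word characters) chops the German words apart,
// because \W doesn't know about ö, ä, ü, ß:
console.log(line.split(/\W+/));
// → ["Der", "Bucheinband", "ber", "B", "cher", "und", "Gr", "e"]

// \s+ (whitespace) keeps them intact:
console.log(line.split(/\s+/));
// → ["Der", "Bucheinband:", "Über", "Bücher", "und", "Größe"]
```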

When I look at the word list now, at the top are words like der, die, and das, which all mean “the”. Und is “and”, mit is “with”, in is “in”, zu is “to”: all words that don't mean much, right? These are considered stop words, which are just commonly used words in a language. We all know that pronouns can be really important, but I wanted to challenge myself to remove them from my word count. Maybe then the word count would be more representative of the content of the text.

The internet is amazing! With a quick search, I found a complete list of German stop words on GitHub. I uploaded the “plain” list to my p5 sketch and put all the words into an array. Then, before creating a div for each word and its count, I did a quick check to see if that word is on the stop word list (sketched below). My new word list has a lot of great book-words in it, like leather, fold, and cover. Some other standout words for me are: weise = way, ganz = quite/all, genau = exactly/precisely, gut = good.
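A minimal sketch of the counting plus stop-word filtering (the filenames are stand-ins for my actual files):

```js
let bookLines, stopLines;

function preload() {
  // Hypothetical filenames standing in for the real ones
  bookLines = loadStrings('der-bucheinband.txt');
  stopLines = loadStrings('german-stopwords-plain.txt');
}

function setup() {
  noCanvas();
  const stopWords = new Set(stopLines.map(w => w.trim().toLowerCase()));
  const counts = {};
  for (const token of bookLines.join(' ').split(/\s+/)) {
    const word = token.toLowerCase();
    if (word === '' || stopWords.has(word)) continue; // skip stop words
    counts[word] = (counts[word] || 0) + 1;
  }
  // One div per word, most frequent first
  const sorted = Object.keys(counts).sort((a, b) => counts[b] - counts[a]);
  for (const word of sorted) {
    createDiv(`${word}: ${counts[word]}`);
  }
}
```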

This book has really great illustrations and images depicting all the book binding techniques and many beautiful books. Some of my favorites are below:

Since I dabble in illustration too, I thought I could try my hand at visualizing some of the top words from Der Bucheinband:

[ITP: Thesis I] Research Update

Body-ody-ody

I might not have any physical proof that I've been working on my thesis, but I've been thinking about my project a lot, believe me! I recently re-started going to therapy. For someone who exclusively lives in her head and in the future, I think it's really funny that all three projects I made last year are kinetic sculptures depicting body parts. Is my subconscious trying to tell me something?!

I’ve never had a great relationship with my body. My body image has always been crap.

I don't really take care of it either. I'm not one for long skin care routines or workouts. I don't put much time into taking care of my hair or cooking for myself. I'm so busy or worried about the future or my never-ending to-do list that I don't prioritize the physical here and now. It's never really been that important to me. I am realizing that some of these rituals are an expression of self-love, and I want to learn to practice them more often.

Outside looking in. When my anxiety is really bad I cannot get out of my head, with all my spiraling thoughts. It makes it really hard for me to be in the present moment and feel myself in physical reality.

Thots, Questions, Keywords

technology as portraits, body as mechanism, priyanka as mechanisms, self-portrait

Why do I want to make myself known through these mechanisms?

Cyborg, post-human

Woman/female bodies, bodies under attack

Embodiment, anxiety, depression - how does my anxiety manifest itself in my body?

bio something?! human being in nature, human being is nature

Reading List

On hand

So I missed the presentation from the librarian about how to do research at our library. I do plan on reaching out to her soon, but I honestly just googled books on the body and decided to start with these for my research. As I've been thinking about my project, I feel these three books are a good starting point and offer a broad array of POVs.

Also, I am a firm believer in judging books by their covers…

Book on embodiment

Anthology of horror stories written by trans or non-binary authors

Reading for “Multispecies Theory” class in IDM. Bodies, environment, nature, the works!

Future Reading

Beyond the Periphery of the Skin: Rethinking, Remaking, and Reclaiming the Body in Contemporary Capitalism, Silvia Federici

The Body Keeps the Score: Brain, Mind, and Body in the Healing of Trauma, Bessel van der Kolk - this is kind of a long one, might be good for winter break

A Cyborg Manifesto, Donna J Haraway - tried reading the first two chapters of this last week and it is … dense. Need to try again

Recommendations from Friends

The Mindbody Prescription: Healing the Body, Healing the Pain, John E. Sarno, M.D.

The Wounded Storyteller: Body, Illness, and Ethics, Arthur W. Frank

Her Body and Other Parties, Carmen Maria Machado

Further Research

Wow! I’ve gathered a lot of resources dealing with the human body in such a short time, also without consulting any “experts” yet. I do think I want to know more about the history of portraiture. Who are the people who got portraits made of themselves? What about self portraits? Why were they made? What cultural or historical significance do portraits serve? Etc etc etc

Experts for Interview

Blair Simmons - IMA professor, artist, curator, storyteller, and everything! She teaches many classes in her department, among them one called Useless Machines. She also has a sculptural series called Archive of Digital Portraits in which she casts technological items in concrete and exposes them by chipping away with hammers and chisels.

Elizabeth Hénaff - computational biologist, artist, and assistant professor at IDM. She teaches a course called “Multispecies Theory: Design for the More-Than-Human”. Still need to do a bit more research on her work and her class but this seems really relevant to my project!

Would Danny Rozin also be a good person to interview?! Or Gabe BC? Lots of work on reflection, right?!

Path Forward

Next Two Weeks

  • Start reading my books, look at the library presentation I missed

  • Think up questions for Blair and schedule an interview for next week

  • Revisit Womb/Washing Machine

    • Bring into school, get it working right, finish putting it together

    • Finish writing up documentation and book an appointment with the Doc Lab

  • Start designing a prototype for brain circuit for Digi Fab class. This could potentially be a part of my thesis

And Beyond

  • Think up questions for Dr. Hénaff and reach out to her

  • Continue working on brain circuit

  • Body part brainstorm: how do I relate to my own body?

  • Revisit Lungs sculpture

  • Ideas: lil wooden guy, nipples (3D modeling), eyes (video?)

[ITP: Programming from A2Z] APIs and Libraries

RiTa

I followed this Coding Train tutorial to learn how to use the RiTa.js library. RiTa has tools for natural language processing and generative writing. It seems the video is a bit outdated when it comes to using the current version of RiTa, so here are some things I noted:

  • This is how you include the library: <script src="https://unpkg.com/rita"></script>

  • I did not create a RiString; I used the tokenize() function

  • No Lexicon object, right?! I checked the reference and no Lexicon exists…

I did run into an error following the tutorial when trying to check for plural nouns, i.e., parts of speech with the Penn tag “nns”. By reading the documentation, I figured out that the randomWord() function expects an options object in JSON notation as its parameter. So I fixed it, yay!
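For future reference, this is roughly what the updated calls look like, as far as I can tell from the reference (the sentence is just an example):

```js
// Tokenize instead of constructing a RiString (RiTa v2)
const tokens = RiTa.tokenize('The quick brown foxes jumped.');
// → ["The", "quick", "brown", "foxes", "jumped", "."]

// randomWord() takes an options object, not a bare tag string
const plural = RiTa.randomWord({ pos: 'nns' }); // a random plural noun
```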

Dan, stop trying to make fetch happen! (Notes)

Sorry! Jk! I love the fetch() web API … but not actually because I don’t understand it.

  • loadJSON() function is asynchronous

  • fetch() returns a promise

  • Use .then() to deal with asynchronous promises

  • ES8 supports await

    • If a function returns a promise you can use await

    • await can only be used inside an ASYNC function!

    • await can replace the .then() function for promises …. right?!

  • async functions by definition return a promise

  • API = application programming interface, how two applications communicate with each other

Assignment

Ok, so I'm officially SO CONFUSED! I wanted to see if I could get some data from the Datamuse API using async and await, because I just went through all the video examples for week 3 and there was no direct Coding Train example for that. From all the videos I learned that you should only call await on functions that return a promise, right?! I'm still not 100% sure what that is either…

So this is what I wrote first; I modeled it after the code we wrote in the fetch video.

My API call is definitely correct. This is what I get when I put that URL in my browser.

Console for async/await p5 sketch.

The response is really cryptic. It looks like it just returns a bunch of function names or something. Definitely not the JSON response to the Datamuse URL. So what’s all that stuff from lines 8-16? That’s the promise, right? Do I need that stuff?

Then I tried the loadJSON() function. This worked perfectly even though this isn’t how I wanted to retrieve the JSON data.

This response looks much better. This is the data I want to work with.

Then I also saw that in the week 3 email from Dan, he attached a code example, “Data muse with async and await”. Great, this is exactly what I'm trying to do! So I re-wrote my getDatamuse() function to call the json() function on the fetch response, hoping that this would finally unwrap the API data for me. Basically, the console log shows me that data is a Promise and it is perpetually pending; it never resolves. There's a screenshot below.

AHHHH! I forgot the second awaittttt! For line 25! Now I’m getting my data from the API.
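For posterity, my understanding of the fix as a minimal sketch (the query is just an example): fetch() returns a promise for the response, and response.json() returns another promise for the parsed body, so each needs its own await.

```js
async function getDatamuse() {
  const response = await fetch('https://api.datamuse.com/words?ml=brain'); // example query
  const data = await response.json(); // the second await I was missing!
  return data;
}
```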

Alrighty, here's what I made for my assignment! I even tried to load the image asynchronously by calling the fetch() function, but this did not work for drawing on the canvas; I had to use the preload() function like normal. Also, in my sketch I call my asynchronous function in draw(). I was trying to draw the words the async function returned to the canvas in draw() itself, but I quickly learned that wouldn't work. The text with the retrieved word from the API needs to be drawn to the canvas in the async function itself, and I lowered the frame rate of the sketch as well.

Some high art I made!

Q: Why use fetch and await instead of loadJSON?

References

All my p5.js sketches live here ❤️✨

https://github.com/dhowe/ritajs/tree/master

https://rednoise.org/rita/#reference

https://www.youtube.com/watch?v=lIPEvh8HbGQ

[ITP: Digital Fabrication] Multi-Dimensional Drawings in VectorWorks

The assignment was to choose an existing object and try to create a scale drawing of it, in multiple views, in Vectorworks. I have never used this software before and have very limited experience in CAD other than designing PCBs.

Process

Initially I chose this mini recycling bin. This is truly my most prized possession; my boyfriend got it for me and I use it to hold all my scissors, X-Acto knives, and screwdrivers. It also felt like an object that would be easy enough to draw but also had some curves and interesting shapes I could learn the software with.

Below is as far as I got with my Vectorworks drawing. Turns out the curves and weird shapes were too hard. I really couldn't find good support online for learning this software or for drawing what I wanted.

I ended up switching my object to my Circuit Playground Express. I didn't want to go back to something I already had some experience with (drawing PCBs); I really wanted to learn something new. But time was running out, and this seemed like something simple enough that I could measure and draw.

To complete this drawing, I used some calipers to measure specific components and get exact dimensions. I also followed some of Danny's videos and did a lot of googling to navigate drawing in Vectorworks. Here are some things I noted:

  • Combined shapes using “add surface”

  • Cut out the alligator clip holes using “clip surface”

  • Used the “DIN” dimensions, but it would be really nice if you could default to those each time!

I don't really feel like I learned enough about using this software yet. I'm still looking out for some good beginner tutorials to familiarize myself! Also, I'm not sure which CAD software to invest time in; there are so many to choose from, and I'm not sure what's unique to each one. I know Phil really recommends Fusion 360…

Final Drawing

Questions

How to repeat something (copy/paste) around a circle?

How to get shapes consistently equidistant from each other?

Sometimes it is hard to move shapes small, precise amounts. How to do that?

How to make my page landscape orientation?

References

https://university.vectorworks.net/mod/scorm/player.php?scoid=24&cm=92&currentorg=articulate_rise

https://www.youtube.com/watch?v=R1Pz6u3oaeY

YouTube getting started series I still wanna check out (4 parts)

Danny’s Multiple view drawing video

Danny’s Dimensioning video

[ITP: Programming A2Z] Regular Expressions

DOM and ES6 Review

  • Arrow functions, “=>”, are a shorthand function notation in ES6

    • Works for anonymous functions which are unnamed

  • Switched from “var” to “let” in ES6 too

  • “this” refers to the current context of where you are in the code

  • Callback functions aren't asynchronous by themselves in JS

  • Promises are not supported in p5.js yet but you can use the fetch() function

    • There are three states: pending, fulfilled, and rejected

    • Fetch() handles asynchronous events using then() and catch() (example after these notes)

  • Made my own word interactor! Could do some interesting things with this in the future…
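To pin down that then()/catch() pattern, a minimal example (the Datamuse query is just an illustration):

```js
fetch('https://api.datamuse.com/words?ml=ocean')
  .then(response => response.json()) // fulfilled: parse the body as JSON
  .then(data => console.log(data))   // use the parsed data
  .catch(err => console.error('rejected:', err)); // network or parse failure
```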

RegEx Notes

  • RegEx patterns are written between forward slashes: / ___ /

  • meta-characters: “\d” = 0-9, “\w” = A-Z or a-z or 0-9, “\s” = whitespace, “.” = any character

    • Capital letter means opposite or NOT

  • Quantifiers: “*” = 0 or more, “+” = 1 or more, “?” = 0 or 1, “{min, max}” = range, “{n}” = exactly n

  • Position: “^” = beginning of line, “$” = end of line, “\b” = word boundary

  • Character classes [] act like OR

  • Capturing groups with ()

    • Group 0 is the full regular expression result

    • “$1” and “\1” (back reference) refers to group 1

  • JavaScript functions to use RegEx: test(), match(), exec(), split(), replace()

    • Use the g (global) flag to return every match of the RegEx, or i for case-insensitive matches (quick demo after these notes)
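A quick demo of a few of these, so future me remembers (made-up string):

```js
const line = 'neurons fire at -55mV';

console.log(/\d+/.test(line));                     // true: \d+ matches the digits
console.log(line.match(/(\d+)mV/)[1]);             // "55": capturing group 1
console.log(line.replace(/(\w+) (\w+)/, '$2 $1')); // "fire neurons at -55mV": $1/$2 swap the groups
```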

Assignment - MadLibs Generator

Here is my MadLibs generator; I followed this Coding Challenge.

So as I was following this challenge, I saw that the Tabletop.js library has been deprecated since 2020, so I transitioned to using something called Papa Parse 5, which converts CSV to JSON.

Here are some other important notes from following the challenge:

  • Need to “escape” the dollar signs in the RegEx because they are meta-characters

  • Whatever you return in the replacer function is what the expression results will be replaced with

  • Global flag!!

  • We kind of touched on this in class but entry.noun = entry[‘noun’]

Now that I've completed the MadLib generator, I've noticed that sometimes a MadLib is generated with all “undefined”s. My gut instinct was that this came from the first line of the .csv file, but I think that's accounted for in the Papa Parse declaration with “header: true”. Actually, I just realized the “undefined”s come when I try to generate a MadLib before the file is loaded! Duh!

Another question I have: I'm still not sure how the “replacer” function really works. How does it know which part of speech corresponds to the matched regular expression?
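Writing it out as a toy example helped me (the word lists are made up): the answer seems to be that the capturing group itself is the part of speech.

```js
// Hypothetical word lists standing in for the parsed CSV rows
const words = {
  noun: ['brain', 'neuron', 'sculpture'],
  verb: ['fires', 'spirals', 'glows'],
};

const template = 'The $noun$ $verb$ over the $noun$.';

// replace() calls the function once per match (thanks to the g flag).
// The first argument is the full match ("$noun$"); the second is
// capturing group 1 ("noun"), which is how it knows the part of speech.
const madlib = template.replace(/\$(\w+)\$/g, (match, pos) => {
  const options = words[pos];
  return options[Math.floor(Math.random() * options.length)];
});

console.log(madlib);
```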

References

https://thecodingtrain.com/challenges/38-word-interactor

https://www.youtube.com/watch?v=7DG3kCDx53c&list=PLRqwX-V7Uu6YEypLuls7iidwHMdCM6o2w

https://www.papaparse.com/docs