Black Projections – ICM

For my final projects at ITP, I combined my assignments for Physical Computation and ICM. This made a lot of sense to me, as I genuinely am more enthusiastic about coding if it is part of a physical piece or installation.

I am not sure if my ideas or feelings towards code have changed. Honestly, I think I've always been somewhat skeptical about my personal interest in coding and how it could be a tool for me, a skepticism that started before my ITP journey, when I was considering a coding bootcamp. What drew me to ITP was that there was a world just beyond coding, and that if I had to use code, it could be part of more meaningful projects.

This larger project, which I am calling "Black Projections," is an exploration of how time, space, and current design systems alter our reality, memory, and ability to take control of our future. It specifically focuses on the Black Experience and explores these concepts through blackness.

I am very much a long-form thinker, which means my sketches tend to be less "efficient" and everything is spelled out. This could lend itself to mistakes, but honestly helps my brain out when I'm exhausted. Notation also helps. For my final project sketch, I really learned how to push p5.js's limits in terms of its storage / photo-holding capacity, and I was pleasantly surprised. It could hold a lot more images than I thought.

The device I built is a portal-mapping device with 14 node points connected to sensors that trigger images and series of text in p5.js.

I had a few issues with my text disobeying the parameters I thought I had set, and there are also a few issues with some of the photo alignment. Taking this further, I would like to explore more of a collage effect: when an image is triggered, it stays on the screen and fades away even after the sensor is triggered off. Right now the sensor is digital and operates on a binary, so the images follow that, appearing either on the screen or off depending on the digital read of the sensor. In actuality, the sensor values oscillate between on and off very quickly as they navigate the magnetic field, so to accommodate this I lowered the framerate of the sketch to one, which keeps the images from flickering through the sketch.
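One way to sketch that lingering collage effect is to give each image its own alpha value that snaps to full when its sensor reads on and decays otherwise, so the image fades out after the magnet leaves. `nextAlpha` below is a hypothetical helper for illustration, not part of the actual sketch:

```javascript
// Each image keeps its own alpha. When the reed switch reads on,
// alpha snaps to fully opaque; otherwise it decays each frame,
// so the image lingers and fades rather than disappearing instantly.
function nextAlpha(sensorOn, alpha, decay) {
  if (sensorOn) {
    return 255;                        // sensor triggered: fully visible
  }
  return Math.max(0, alpha - decay);   // fade toward invisible, floor at 0
}

// In the p5.js draw() loop, something like:
// alpha = nextAlpha(sensorValue === 1, alpha, 5);
// tint(255, alpha);
// image(img, 0, 0);
```

Because the fade is gradual, this approach would also remove the need to lock the framerate at one; the rapid sensor oscillation just keeps re-snapping alpha to 255.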

Sketch link:

Server: Web Server for Chrome

Black Projections – Update

For my final project, I'm creating a device that invites people to interrogate the idea of the colonial time project (linear time) and investigate how colonization and race as technologies have led to the erasure of the past and of other ways of being, and the destruction of magic, while creating the "one-world world," a term coined by John Law.

The device will be accompanied by a zine that is a collection of inspiration, research, and questions I have asked while making this project.


Monday Dec 2: Buy acrylic from Canal Plastics, do first draft of laser cutting for the top of device

Tuesday Dec 3: Set up Reed Switches + Submit to be in Winter Show

Wednesday Dec 4: Edit Zine + User testing

Thursday Dec 5: Assess User Feedback + Edit Zine

Friday Dec 6: Finish Device Design (the non-electronic components inspired by previous project "Open Portal")

Saturday Dec 7: Edit Zine + Decide if I want to add more images and gif to Black Projections p5.js sketch

Questions / Things to Figure out:

-Non-computer power supply for Arduino

-Music / Sound / Noise Component

-What will “hold” the magnet

-Connect magnets to overarching theory / thesis

Calls – p5.js sound

For our unit on sound using p5.js, my partner Lizzy and I created a 30 second composition called “Calls.”

For our project, we knew that we wanted to incorporate recorded / found sound as the basis of our piece. Lizzy had recordings of different birds from the Northeast from a previous project.

We tested these in p5.js to hear what they all sounded like interacting with each other. Originally, the sounds had ambient noise before and after the initial call or had extreme volume peaks at certain portions, so we edited the sounds in Audition to have more control over looping them.

We arrived at a good point but wanted to think more about our sonic scape. With just the bird sounds, we were getting a lot of higher frequencies, but there was still empty space in other registers. Also, personally, I am very curious about synthesizers and how those sounds can interact with nature, so I did some experimentation with that. The addition of the oscillator / sine wave moving between 100 and 200 Hz allowed for a more ambient bass rumble. We also added leaves for texture and variety.
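As a rough illustration of that bass layer (assuming p5.sound and a sine oscillator; `driftFreq` is a made-up helper, not our project code), a slow drift between 100 and 200 Hz could look like:

```javascript
// Map a 0–1 phase onto a smooth 100–200 Hz sweep, centered at 150 Hz.
function driftFreq(phase) {
  return 150 + 50 * Math.sin(2 * Math.PI * phase);
}

// Hedged p5.sound outline (browser only):
// let osc;
// function setup() {
//   osc = new p5.Oscillator('sine');
//   osc.start();
//   osc.amp(0.3);
// }
// function draw() {
//   // complete one sweep roughly every ten seconds
//   osc.freq(driftFreq((millis() / 10000) % 1));
// }
```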

Listen to calls here:


For Introduction to Computational Media, we had to create some sort of p5.js sketch that we would then present to the class for five minutes.

I had a hard time coming up with an idea of what to do initially, but then remembered back to an old idea / old source of inspiration which was Image CAPTCHAs (those response tests you’re forced to do so that the computer can determine if you’re a human or a robot).

I always wanted to recreate these tests but as a way of testing some sort of bias.

Unfortunately, I didn’t quite feel confident coding something similar but instead opted for something more loosely inspired.

This week's videos focused heavily on manipulating the DOM, so I figured I should try that out with this project. And because we recently had Indigenous Peoples' Day, I came up with something thematically tied to that.

var words = ["click verify", "evil", "invader", "confused", "pain", "murder", "genocide", "thief", "fake"];
var index = 0;
var button;
var truthP;

function setup() {
  createCanvas(500, 500);
  button = createButton("verify");
  truthP = createP('Have you been lied to?');
  truthP.position(65, 601);
}

function draw() {
  background(255);
  textwords();
  verifyTruths();
  theEnd();
}

function textwords() {
  textAlign(CENTER, CENTER);
  // guard against reading past the end of the word list
  if (index < words.length) {
    text(words[index], 250, 250);
  }
}

function mouseClicked() {
  index = index + 1;
  // once every word has been shown, swap the prompt below the canvas
  if (index > 8) {
    oververify();
  }
}

function verifyTruths() {
  if (index < 9 && index > 0) {
    text('#tru', 250, 200);
  }
}

function theEnd() {
  if (index > 8) {
    text('#FUCKCHRISTOPHERCOLUMBUS', 250, 200);
  }
}

function oververify() {
  truthP.html('Are you surprised?');
}

Computation and Me

The World of Computation is still very new to me. A few years ago, I had no idea what that word meant or that it could be something that I could use.

In many ways, I feel like I still cannot completely fathom all the possibilities of computational media. As an artist I am interested in how to combine installations and fabrications with my thematic interests in exploring black futurity, and queer futures. There is still much to be figured out here. Still many more dreams to be dreamt and stories to birth before I know what project I shall work on specifically.

Something I am interested in is the merging of the digital and physical, how they can inform each other (well technically the digital is actually pretty physical, although people seem to forget that). A project I really like that merges these worlds well is recent ITP grad Wipawe’s Thesis Project, Me, myself, and .io.

A projection-based augmented reality (AR) installation about who I am
as seen through both my physical and digital data.

Me, myself, and io as described by Wipawe

For this week’s homework assignment, I tried out some new computational methods using p5.js. Although I was intrigued by p5.js initially, it is becoming clear that it is not really my ideal method / platform of choice. But nevertheless, I am just thinking of it as a learning tool through which I can grow and find something that speaks to me more.

This sketch is silly, which I’m realizing is also something I need to get in touch with. I haven’t been silly in an academic sense in a while. The requirements for this sketch were: 1) Have one element controlled by the mouse 2) one element that changes over time, independently of the mouse 3) one element that is different every time you run the sketch.

For my sketch, the color of the character’s shirt changes every time the sketch is made to run again (by pressing play, by refreshing the browser, etc). The afro of the character also grows as you move your mouse down. I’m still working on the second requirement. I wanted to make the eyeballs move / be animated but was struggling a bit with that.
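A minimal sketch of how all three requirements could fit together is below. This is a hypothetical reconstruction, not my actual portrait code, and the eye movement I was struggling with is handled here with a simple sine of the frame count:

```javascript
// Hypothetical sketch of the three requirements, not the original code.
let shirtColor;

function setup() {
  createCanvas(400, 400);
  // requirement 3: a random shirt color makes each run different
  shirtColor = color(random(255), random(255), random(255));
}

function draw() {
  background(230);
  // requirement 1: the afro grows as the mouse moves down
  const afroSize = map(mouseY, 0, height, 120, 220);
  fill(30);
  ellipse(200, 120, afroSize, afroSize);     // afro
  fill('rgb(101, 51, 0)');
  ellipse(200, 160, 100, 130);               // head
  fill(255);
  ellipse(180, 150, 24, 16);                 // eye whites
  ellipse(220, 150, 24, 16);
  // requirement 2: pupils drift over time, independent of the mouse
  const shift = eyeOffset(frameCount);
  fill(0);
  ellipse(180 + shift, 150, 8, 8);           // moving pupils
  ellipse(220 + shift, 150, 8, 8);
  fill(shirtColor);
  rect(160, 220, 80, 180);                   // shirt
}

// pupils sway left and right within a 5-pixel range
function eyeOffset(t) {
  return 5 * Math.sin(t * 0.1);
}
```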

That's all for now. Currently, I'm feeling a bit limited by my capabilities. Sometimes I want something to happen but I can't always make it happen, so I creatively figure out a new alternative. I have become better at embracing flexibility, but would love to get better at precision. I'm also curious about how p5.js is used after this class; it would be nice to view more examples of work.

Why Computational Media?

My creative process for two-dimensional designs or video has mostly relied on existing platforms and software, such as the Adobe Creative Suite.

After accidentally falling into a design role at my last job, I became curious about other ways I could use technology to tell stories and/or connect with other people. However, my initial searches for ways of learning how to code brought me to coding bootcamps or to resources aimed at getting a coding job. (This explains why I was very happy when I found out about ITP.)

Over the year of learning more about the possibilities of computational media, I have become inspired by:

Queering the Map

Queering the Map is a community-generated mapping project that geo-locates queer moments, memories and histories in relation to physical space. 

This project is intriguing as it investigates queering space both virtually and physically while enriching our collective memory.

Queering the Map homepage with black location pins showing presence of logged stories.


Co-Star is an app that I use frequently and one that has garnered a lot of interest within the past year or so.

Co-Star is an AI-powered astrology app with horoscopes, personality analysis, and compatibility, founded by Banu Guler, with consulting astrologers Lor O'Connor, Dr. Jennifer Freed, and Alice Sparkly Kat.

With the rise of astrology, Co-Star, in my opinion, has been one of the first apps not only to start conversations about astrology that extend beyond knowing your sun sign, but also to make astrology an ongoing conversation, whether through the sometimes ridiculous A.I.-powered notifications that give you advice about your day, through the ever-changing movements of planets and moons with specific updates on how they affect you, or through the ability to view the charts of your closest friends.

Screenshots of four co-star notifications against a celestial background.

For my first assignment in Intro to Computational Media, I decided to create a self-portrait using p5.js.

I found the process not too difficult for a rather simple portrait, but had a harder time thinking of how to add more details to my face, and was also unsure whether more details would be beneficial or necessary.

Here’s what I created:

function setup() {
  createCanvas(400, 300);

  // Body
  rect(160, 200, 80, 500);

  // Hair 
  ellipse(200, 90, 170, 125);

  // Head
  fill('rgb(101, 51, 0)');
  ellipse(200, 150, 150, 200);

  // Outer Eyes
  ellipse(160, 150, 50, 32);
  ellipse(240, 150, 50, 32);

  // Inner Eye
  ellipse(160, 150, 30, 32);
  ellipse(240, 150, 30, 32);

  // Mouth
  stroke('rgb(210, 127, 161)');
  line(190, 210, 210, 210);

  // Cheeks
  fill('rgb(119, 71, 69)');
  ellipse(160, 180, 20, 5);
  ellipse(240, 180, 20, 5);
}