Cy X is a black queer multidisciplinary artist based in Brooklyn, NY. They received their BA in Film and Media Studies from Colorado College in 2017. They are currently an MPS candidate at The Interactive Telecommunications Program, New York University Tisch School of the Arts. Cy is interested in exploring black queer futures and abolitionist possibilities through emerging technology, immersive environments, and performances.
- Ideal goals
- Launch interactive website with two to three mixes
- Realistic goals
- Launch interactive website with one mix
- Minimum goal
- Design website using Sketch + Figma to show User flow
Interactivity + Visual Design
- Ideal goals
- Launch full-fledged working website
- Website is responsive for mobile and web
- Website allows users to submit knowledge / information about sounds
- Guest mixes are also showcased
- Realistic goals
- Design website using Sketch and Figma and show examples of User Flow
- Minimum goals
- Web or Visual experience to be interacted with online
Final Execution Plan
By April 14
- Color Scheme decided
- Landing Page Design
- Layout for First Mix
- Annotated Notes for First Mix
- Animation designed for moving from Landing Page to First Mix Page
By April 21
- Design for how information flows on screen decided for first mix
- Annotated notes for Second Mix
- Two minutes of information shown and animated
By April 28
- Design for second mix
- Two-minute animation shown for navigating to second mix
By May 5 – Final Presentation
- Animated Video exported and ready for feedback
- Figma/Sketch design accessible for others to view
Sonic Sessions / Sonic Portals
Sonic Sessions is an experiment in listening that seeks to incorporate music theory, history, and exploration into the way we listen to music and mixes.
Sonic Sessions will combine music literacy through active listening with the exploration of topics including, but not limited to:
- Music Theory
- Music History
- Collective Knowledge
The goal would be to provide an alternative experience to the way we often listen to music. Mixes are stories, woven together with the unique talents of the DJ and the selector. What if we provided insight into the process? What if there were more opportunities to learn the story behind each sample?
HOW AND WHY ME?
I currently DJ and have had a bi-weekly residency at Playground Coffee Shop. Sometimes when I would make my own mixes, I would make sure to include a tracklist, but even that wasn’t always helpful unless it was synced up with the mix and included the time each track was playing. And depending on the DJ, a mix can have as many as four samples and effects moving at different moments.
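The kind of synced tracklist described above could be represented as timestamped annotations. Here is a minimal sketch in Python; the structure, field names, and entries are my own hypothetical illustration, not an existing format:

```python
# Hypothetical sketch: a timestamped tracklist that annotations could sync to.
# Field names and entries are illustrative only.

def annotation_at(tracklist, seconds):
    """Return the most recent tracklist entry at a given time in the mix."""
    current = None
    for entry in tracklist:
        if entry["start"] <= seconds:
            current = entry
        else:
            break
    return current

mix = [
    {"start": 0,   "track": "Intro pad",   "note": "ambient texture, no drums"},
    {"start": 95,  "track": "Track two",   "note": "sample enters, 4-bar loop"},
    {"start": 210, "track": "Track three", "note": "tempo rises, solo section"},
]

print(annotation_at(mix, 120)["track"])  # the entry playing at 2:00
```

A player UI could call `annotation_at` on every tick of the playhead to keep the on-screen notes in sync with the audio.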
Additionally, this year I decided that I wanted to become a DJ because I wanted to better my listening practice and actively build a relationship with music. Thinking about music and sound more intentionally often led to a deeper level of research and curation.
Music Listeners who are looking for a curated music discovery experience that exists beyond/outside the algorithms.
Music Listeners who are interested in learning basic music theory and music history in an approachable way.
For this exploration, I decided to use Max to learn more about synthesis and combine that with learning about UI.
There were a few parameters for the assignment, and I eventually ended up taking an additive approach as I began to work.
I was first tasked with controlling an element using a button (or a digital on/off state). I decided to learn about the key press function in Max and used the pressing of “p” on the keyboard to activate the phasor. I’ve been really drawn to the phasor because of its ability to produce a clicking noise.
I then created another button to start a sine wave. I liked the idea of combining the wave with the clicking of the phasor to see how they interacted and sometimes got some pretty interesting results.
For the sliders, I also connected them with buttons, using the built-in slider of live.gain as well as other additional sliders, and programmed them so that their min and max values made sense and reflected their roles.
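As a rough text-code analogue of the patch described above (not the actual Max patch; the frequencies and gain values are assumptions), the phasor, sine wave, and gain sliders could be sketched in Python:

```python
import math

SR = 44100  # sample rate, assumed

def phasor(freq, dur):
    """Ramp from 0 to 1 at the given frequency, like Max's phasor~."""
    n = int(SR * dur)
    return [(i * freq / SR) % 1.0 for i in range(n)]

def sine(freq, dur):
    """Sine oscillator, like Max's cycle~."""
    n = int(SR * dur)
    return [math.sin(2 * math.pi * freq * i / SR) for i in range(n)]

# Each source sits behind its own gain "slider" scaled 0.0-1.0,
# roughly how a gain stage scales a signal before it reaches the output.
phasor_gain, sine_gain = 0.5, 0.5
out = [phasor_gain * p + sine_gain * s
       for p, s in zip(phasor(2, 1.0), sine(440, 1.0))]
```

The sharp resets of the ramp from 1 back to 0 are what produce the clicking quality of the phasor when it runs at a low frequency.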
Max Patch UI:
I spent this past week focusing on effects and research. The last piece I did was pretty melody-heavy, so I worked on an additional one and used both effects and sampling to try something new.
The inspiration for this piece was pretty accidental. I was listening to some songs I was thinking of including in a DJ performance, and some of the pieces were more ambient. The dialogue you hear in the beginning was actually playing from a Twitter tab in a different window on my computer while I was listening to one song, and I thought it was part of the song.
When I realized that the dialogue kept looping, I recognized that it was not actually part of the song at all, but it gave me an idea of what I could experiment with.
Additionally, I also researched more musical devices that could serve as inspiration as I further develop my device.
I really like the Roland G-707, a string-inspired synthesizer, and the Hyve, a 60-voice polyphonic analog synthesizer controlled by pressure and touch movement.
I’ve been having fun experimenting with Ableton! For some reason, experimentation is starting to feel a little less scary and I’m having fun. In our last class, we talked about rhythm and did a few exercises, so for this week’s assignment I continued with the use of Kalimba-esque sounds and added other sounds with them. This is making me think there should be a loop option for the instrument…
Sound #1: Simple Rhythm Exercise with no effects and a distorted bass sound.
Sound #2: Kalimba with melody and another type of Kalimba sound taking place of the original bass sound that was used in the first example. This new sound is at a higher octave and deviates from the original pattern done in the first sound exploration.
This week, I used Ableton to begin prototyping sounds and effects. So far, I’ve prototyped the default sounds of the Kalimba and used a MIDI keyboard to trigger them. I also prototyped the wah-wah effect. I’m still deciding whether the second effect for the instrument will be the tuning, mostly because other kalimba tunings usually have different string numbers / lengths.
Video 1: Base Notes of Instruments with No Effect. Still working on how I’d like the ADSR to be for the notes. Right now it’s sequencing, which is not what I expected, but I like the effect / the idea of the instrument being a sequencer.
Video 2: Base Notes + “Wah Wah” Effect. Here the wah-wah effect is pretty much at the “max” in my mind. I would love to experiment more with settings for controlling it, because as of now the effect is just either on or off, but in the actual design of the instrument it would be controllable.
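A minimal sketch of how a controllable wah-wah could work digitally, assuming the common approach of sweeping a band-pass filter's center frequency with an LFO; the filter form and all parameter values here are illustrative, not taken from the actual prototype:

```python
import math

SR = 44100  # sample rate, assumed

def wah(samples, lfo_hz=2.0, lo=400.0, hi=2000.0, q=4.0):
    """Sweep a band-pass filter's center frequency with a sine LFO,
    a common digital approximation of a wah-wah effect."""
    out = []
    x1 = x2 = y1 = y2 = 0.0  # filter state (previous inputs/outputs)
    for n, x in enumerate(samples):
        # LFO moves the center frequency between lo and hi
        sweep = 0.5 * (1 + math.sin(2 * math.pi * lfo_hz * n / SR))
        fc = lo + (hi - lo) * sweep
        # Band-pass biquad coefficients (constant 0 dB peak gain form)
        w0 = 2 * math.pi * fc / SR
        alpha = math.sin(w0) / (2 * q)
        b0, b2 = alpha, -alpha
        a0, a1, a2 = 1 + alpha, -2 * math.cos(w0), 1 - alpha
        y = (b0 * x + b2 * x2 - a1 * y1 - a2 * y2) / a0
        x2, x1 = x1, x
        y2, y1 = y1, y
        out.append(y)
    return out

tone = [math.sin(2 * math.pi * 440 * n / SR) for n in range(SR)]
wet = wah(tone)
```

Exposing `lfo_hz` or the `lo`/`hi` sweep range as knobs is one way the effect could become continuously controllable rather than just on/off.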
I changed my final project idea from creating an installation to creating / building an electric kalimba-esque instrument. I’m curious about getting into performances and would love to explore the possibilities of building instruments through Max MSP so this feels like a good start.
Here’s a video of someone playing kalimba: https://twitter.com/louboutintwink/status/1211722029893701635?s=20
Aural Soundboard: https://www.dropbox.com/s/res9xlgltlksat2/Kalimba.aif?dl=0
The first portion is the sound of the Kalimba without effects; the second portion is the sound of the Kalimba with a “wah wah” effect. Traditionally, the wah wah effect is created by placing a hand over the soundboard and modulating its position as the notes sound, but I intend to create this sonically. Another “effect” I’m curious about experimenting with is a knob to change the tuning of the instrument so that users may be able to play different genres of music.
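A tuning knob like the one described could be sketched as a semitone transposition applied to each tine's frequency. This is a hypothetical illustration: the seven-tine C major layout and the knob's semitone scaling are my assumptions, not the instrument's actual design:

```python
# Hypothetical sketch: a "tuning knob" that retunes the kalimba's tines
# by shifting every note's frequency by a number of semitones.
C4 = 261.63  # Hz, approximate frequency of middle C

# An assumed 7-tine layout as major-scale degrees above C4 (in semitones)
TINE_SEMITONES = [0, 2, 4, 5, 7, 9, 11]

def tine_frequencies(knob_semitones=0):
    """Return each tine's frequency, transposed by the knob setting."""
    return [C4 * 2 ** ((s + knob_semitones) / 12) for s in TINE_SEMITONES]

c_major = tine_frequencies(0)
d_major = tine_frequencies(2)  # knob turned up a whole step
```

Because the transposition is purely digital, the knob could even sweep continuously between tunings, something a physical kalimba with fixed tine lengths can't do.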
Something that isn’t captured in either the moodboard or the user journey is the imagined aesthetic of the instrument. I believe that creating electronic instruments increases the possibilities of what the instruments can look like, especially since the need to rely on materials to shape the sound isn’t as significant as it is for acoustic instruments.
In addition to sounding good, I want the instruments to stand out as artistic pieces, and I will be experimenting with the design of the kalimba as part of that, including trying a circular shape for the body.
For my mono piece, I did a sonic exploration of my mind in transit. Oftentimes when I’m taking public transportation, my mind drifts deeply into thinking about different things that are occurring or have occurred in my life. Sometimes I’ll write about these things in the notes app on my phone. Other times, I’ll just mindlessly drift further into my brain.
For this piece, I recorded a bit of my train ride toward my therapy appointment. I then found one of the notes I had written during a commute and overlaid it on top of the sound of the train ride.
I decided to do this for the mono piece because I like the way that in mono everything sits in one space, oftentimes making the sounds muddled or creating moments that are out of phase. To me, this adds to the overwhelming nature of the piece.
Mono assignment: https://www.dropbox.com/s/3pctzfxdmajg58t/MonoAssignement.aif?dl=0
Music Interaction Design Week 1 Prompt
For my interactive music project, I have been deeply inspired by the Golden Record: Sounds of Earth by NASA. These records exist as time capsules of Earth and are etched in copper and plated with gold.
The current sounds they have are:
- Life Signs, Pulsar
- Kiss, Mother and Child
- Tractor, Bus, Auto
- Horse and Cart
- Morse Code, Ships
- Tractor, Riveter
- Herding Sheep, Blacksmith, Sawing
- Tame Dog
- Mud Pots
- Wind, Rain, Surf
The list goes on. There aren’t many sounds of people speaking or people playing music. It is a record of noise.
For my project, when reimagining this, I would like to create a room-sized capsule with different imagery that can trigger sound. I am imagining that sound can be triggered by an AR image target or by literally touching the image.
Multiple people can interact with the piece at once and create a collaborative soundscape.
TLDR: Golden Record inspired room capsule allowing people to discover the Sounds of the Earth. People move throughout the room and trigger sounds live through AR (image target or phone) or by tactile interaction (touch).
Why? Interested in asynchronous clocks, relative time, time exploration that doesn’t center time as linear, maps, sonic maps / memories.
On the first day of class, my partner Yiting and I designed an interactive musical experience inspired by a jazz piece that Yiting liked.
The jazz piece had many interesting yet simple components that we wanted to play with and allow someone to control and experience, so we started off with a catalog of attributes:
- Three instruments: guitar, drums, upright bass
- rotating solos that invite new rhythms and melody
- tempo that grows over time
We both were also interested in movement and the body and wanted to incorporate that into the piece.
So our idea was to create a three-panel, wall-sized installation where each panel represented an instrument. Each panel would have a unique visual aspect representing its instrument, projected onto the panel.
A person interacts with the piece by moving across the panels. The panel you stand in front of allows you to solo that instrument. Additionally, you can change the tempo of the piece with the height of your body, which would be tracked through projection mapping and a Kinect. Wiggling your arms also allows you to accentuate different points of the piece and alters the rhythm.
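The position-to-solo and height-to-tempo mappings could be sketched as below. This is a hypothetical illustration: the wall width, BPM range, and tracked-height range are assumed values, not part of the actual design:

```python
# Hypothetical sketch of the mapping logic: which panel a tracked person
# stands in front of decides the solo, and their head height sets the tempo.
PANELS = ["guitar", "drums", "upright bass"]
WALL_WIDTH = 6.0          # meters, assumed installation width
MIN_BPM, MAX_BPM = 80, 160  # assumed tempo range

def solo_panel(x_position):
    """Map a horizontal position (0..WALL_WIDTH meters) to the soloed panel."""
    index = min(int(x_position / WALL_WIDTH * len(PANELS)), len(PANELS) - 1)
    return PANELS[index]

def tempo_from_height(head_y, min_y=0.8, max_y=2.2):
    """Map tracked head height (meters) linearly onto the BPM range."""
    t = max(0.0, min(1.0, (head_y - min_y) / (max_y - min_y)))
    return MIN_BPM + t * (MAX_BPM - MIN_BPM)

print(solo_panel(1.0))         # standing in front of the leftmost panel
print(tempo_from_height(1.7))  # a mid-height visitor
```

In practice the `x_position` and `head_y` inputs would come from the Kinect's skeleton tracking on each frame, with the results driving the audio engine's solo and tempo parameters.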