How Jacqueline Li is Visualizing Sound with Machine Learning
Can you tell us more about yourself?
Tell us about your capstone project; what was your inspiration and approach?
What were some of the challenges you faced while creating this project?
What drew you to use ml5.js and p5.js over the other tools out there?
Did you have previous machine learning experience?
Do you have any future plans for this project?

A gif of the Synesthesia Lens website in motion

Jacqueline Li is a fifth year at Northeastern University studying Computer Science and Design. She recently completed her capstone project, the Synesthesia Lens, a synesthetic audio visualizer built with ml5.js and p5.js. Continue reading to learn more about her work!

Interview by Ozioma Chukwukeme

JL: I’m currently a fifth-year undergrad at Northeastern University, graduating in May. I grew up in Virginia and moved to New Jersey for middle school and high school. I always had a preference towards STEM: I liked math, though I wasn’t very good at it, and I liked physics. Still, in high school I didn’t really know what I wanted to do in college. The funny thing is that I hated computer science in high school and didn’t want to pursue it in college. My mom kept telling me ‘You should do it, it’s so easy,’ but I thought it was confusing and difficult, so I figured it would make sense to come into college undeclared as an engineer, since it still scratched that STEM itch in my brain. After a semester, I learned that engineering just wasn’t for me. So, I decided to give computer science a second chance; I switched majors from engineering and eventually settled on a combined major in computer science and design.

The reason I chose design as the second half of my major was that I was always a creative kid, but never considered that as something I could pursue professionally. After exploring a bit more, I realized the potential it had for careers such as UI/UX, product design, and frontend development. Something that really sparked my interest, though, was going to Zach Lieberman’s ARTECHOUSE exhibit in DC. I genuinely thought, and still think, ‘That’s one of the coolest things I’ve ever seen.’ It was a direct intersection between CS and design that I didn’t know existed before, and it became a niche I wanted to explore.

JL: So the prompt for the capstone project was very broad; the purpose of design capstones is really just to allow senior design students to work on whatever passion project they want, using the skills they’ve learned in undergrad so far. Going into it, I knew I wanted to create an interactive installation with creative code because I never had the outlet or space to do that during my time at Northeastern. At the beginning of the semester, I was fooling around with p5.js and doing a lot with audio and randomly generated bubbles. During one of the crits, my professor mentioned that I should look into synesthesia and specifically Alexander Scriabin, a composer who had synesthesia and saw specific colors for specific notes.

I wondered how I could visualize this, since I don’t have synesthesia. Eventually, I decided I wanted to showcase the fleeting nature of sound along with the colors that Scriabin saw when he heard music. To accomplish this, I decided to randomly place lines along the x-axis with a color mapped to the note being picked up through the microphone. Immediately after a line is projected on the screen, it begins to fade away. I ended up titling the project the Synesthesia Lens to highlight that it lets you see through Scriabin’s synesthetic lens.
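The fading-line behavior described above can be sketched as plain JavaScript state that a p5.js draw() loop would render once per frame. This is a minimal illustration, not the project’s actual code; the fade rate and the 0–255 alpha range are assumptions for the sketch.

```javascript
// Each detected note spawns a line at a random x position at full opacity;
// every frame, each line's opacity decays until the line disappears.
const lines = [];

function spawnLine(x, color) {
  lines.push({ x, color, alpha: 255 });
}

function update(fadeRate = 5) {
  // Fade every line a little each frame...
  for (const line of lines) line.alpha -= fadeRate;
  // ...then drop the lines that have become fully transparent.
  for (let i = lines.length - 1; i >= 0; i--) {
    if (lines[i].alpha <= 0) lines.splice(i, 1);
  }
}
```

In an actual p5.js sketch, draw() would call update() and then stroke each remaining line with its color and current alpha.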

If you’re interested in the nitty gritty, I used ml5.js to figure out the frequency of the audio data. Then, I converted each frequency to the note it falls under and visualized that as a line in the color of that note, as Scriabin saw it. If you want to see the exact mapping of notes to colors, check out the about section of my project website!
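The frequency-to-note step can be sketched with standard MIDI math: round the detected pitch in Hz to the nearest MIDI note number, take its pitch class, and look the note up in a palette. The palette below is only a loose approximation of Scriabin’s note-color associations for illustration; the project’s about page has the real mapping.

```javascript
const NOTE_NAMES = ['C', 'C#', 'D', 'D#', 'E', 'F', 'F#', 'G', 'G#', 'A', 'A#', 'B'];

// Loose, illustrative approximation of Scriabin's associations
// (not the exact mapping used in the project).
const SCRIABIN_COLORS = {
  C: 'red', G: 'orange', D: 'yellow', A: 'green',
  E: 'skyblue', B: 'blue', 'F#': 'violet',
};

// Round a frequency in Hz to the nearest MIDI note (A4 = 440 Hz = note 69),
// then take the pitch class (0-11) to get the note name.
function freqToNote(freq) {
  const midi = Math.round(69 + 12 * Math.log2(freq / 440));
  return NOTE_NAMES[((midi % 12) + 12) % 12];
}

function noteToColor(note) {
  return SCRIABIN_COLORS[note] || 'gray';
}
```

Feeding this the pitch that ml5’s detector reports each frame gives the color for the next line to draw.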

JL: I sort of went into this project blind because I didn’t have much experience with creative code, so it was roadblock after roadblock. I found videos from The Coding Train on YouTube, which helped me get started when I didn’t know how things like the draw() function worked, since I was so used to object-oriented programming.

Once I got deeper into the project, one of the bigger challenges was figuring out how to get frequency data. I was initially using the p5.js FFT (fast Fourier transform) library to get the frequencies, but I was having trouble with it and thought there had to be a better way. Eventually, I found The Coding Train’s Ukulele Tuner video and decided to use ml5.js.

JL: I think it’s really the web editor for p5.js; it’s so fun and easy to get started with. My initial plan was to work in p5 and see the extent of its capabilities, and if I needed more, then I would look into Processing. However, I felt that for this project p5 was good enough. I didn’t even know ml5 existed until I had the issue with frequencies, and that was a godsend!

JL: Surprisingly, I don’t have any prior experience with machine learning. At my university, the required classes we had to take covered the basics. So my first computer science class there was in a language called Racket; it’s a functional programming language meant to get you thinking like a programmer, and it was very difficult to learn. Some of my friends took machine learning as an elective, but I wasn’t interested in it at the time because I wanted to go into frontend and learn more about web development. So I didn’t have previous experience with machine learning, but ml5.js made it super easy to understand, and I’m interested in learning more about machine learning and AI.

JL: My initial plan was to create several visualizations for the Synesthesia Lens, but due to the time constraints of the capstone I wasn’t able to. Over the summer, I plan on making more interactive and unique visualizations for this project. I also want to look into generative art and creative code, and I hope to do more with that in the future!
