Mood Ring is an interactive mirror that illuminates based on your mood.
My classmate Michael Blum and I have developed a proposal for a public intervention piece called ‘Invisible Crowds’ that uses data art/visualization.
Invisible Crowds aims to raise awareness of the sheer volume and diversity of invisible waves that are constantly bouncing around public spaces.
For this proposal we developed the presentation above and the more detailed write-up below.
Using data from the Musixmatch API, I have created a tool that shows which words musical artists use most in their lyrics. It uses a combination of the jQuery and D3 libraries to work as a single-page “app”. Try it here.
The web “app” works by finding the artist you search for and running through every lyric it can find by that artist to build a lexicon. Once that lexicon is built, you can click on the box that is created to view statistics about their most-used words.
You can also view the most-used words overall by clicking the “All Artists” button.
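To give a sense of how the lexicon step works, here is a minimal TypeScript sketch. The fetchLyrics helper and the other names are placeholders I'm using for illustration, standing in for the app's actual Musixmatch requests; the core idea is simply counting and ranking words:

```typescript
// Placeholder for the Musixmatch requests: returns the lyric text of every
// track found for the artist. The real app does this with jQuery/AJAX calls.
async function fetchLyrics(artist: string): Promise<string[]> {
  return []; // stub
}

// Run through every lyric and tally how often each word appears.
async function buildLexicon(artist: string): Promise<Map<string, number>> {
  const counts = new Map<string, number>();
  for (const text of await fetchLyrics(artist)) {
    for (const word of text.toLowerCase().match(/[a-z']+/g) ?? []) {
      counts.set(word, (counts.get(word) ?? 0) + 1);
    }
  }
  return counts;
}

// The top N entries are what get rendered (in the app, with D3).
function topWords(lexicon: Map<string, number>, n = 10): [string, number][] {
  return [...lexicon.entries()].sort((a, b) => b[1] - a[1]).slice(0, n);
}
```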
The instant an Arduino or Raspberry Pi connects to the web (with a public IP), it is out there for anyone – or anything – to detect.
In our connected devices class, my classmates and I saw this vulnerability firsthand. After leaving our connected thermostats online for a week, we watched our devices being scanned and sometimes attacked by machines from across the globe.
The connected thermostat I was building earlier is now complete! 🌡🌡🌡
This thermostat works like a Nest Thermostat (though clearly not as pricey): it collects the current temperature and sends that information to an online server.
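For the curious, the collect-and-send loop is conceptually simple. Here is a rough TypeScript sketch of it; readTemperature() and the server URL are placeholders for illustration rather than the project's actual code, and the real sensor read depends on the hardware:

```typescript
// Hypothetical endpoint; the real project posts to its own server.
const SERVER_URL = "https://example.com/readings";

// Stand-in for the actual sensor read (hardware-specific in practice).
function readTemperature(): number {
  return 21.5;
}

// Package the current reading and send it to the server as JSON.
async function report(): Promise<void> {
  await fetch(SERVER_URL, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ tempC: readTemperature(), timestamp: Date.now() }),
  });
}

// Report once a minute.
setInterval(report, 60_000);
```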
Over the past few weeks I have been working on my first ‘patch’ – an interface for controlling an audio-visual performance. Here is a still-shot of something made by the patch:
Let’s call it CRYPT0MANIA – the connected crystal.
This post is about the crystal’s digital details, because it works – it really works!
Planet Music is a sound visualizer on the moon! The current version has a set playlist of 6 songs that can be activated with the media player buttons. The space scene reacts to the sound using a fast Fourier transform (FFT). Here is a sample video:
Try it for yourself here (Note: please allow ~20 seconds to load; it is currently not optimized for mobile).
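Under the hood, the idea is to run an FFT on the audio every animation frame and map the resulting frequency bins onto properties of the space scene. Here is a minimal sketch of that loop using the browser’s Web Audio AnalyserNode; the element selector and the CSS variable are assumptions for illustration, and the actual project may route audio through a helper library instead:

```typescript
// Set up an analyser on whatever <audio> element is playing the current song.
// (Browsers require a user gesture before audio starts; the media player's
// play button covers that.)
const audioCtx = new AudioContext();
const analyser = audioCtx.createAnalyser();
analyser.fftSize = 1024; // 512 frequency bins per frame

const audioEl = document.querySelector("audio")!; // assumed playlist element
const source = audioCtx.createMediaElementSource(audioEl);
source.connect(analyser);
analyser.connect(audioCtx.destination);

const bins = new Uint8Array(analyser.frequencyBinCount);

function draw(): void {
  analyser.getByteFrequencyData(bins); // FFT magnitudes, 0-255 per bin
  // Average the lowest bins as a rough "bass energy" value in [0, 1].
  const bass = bins.slice(0, 32).reduce((sum, v) => sum + v, 0) / (32 * 255);
  // Map that energy to something visual, e.g. scaling an element in the scene.
  document.documentElement.style.setProperty("--planet-scale", String(1 + bass));
  requestAnimationFrame(draw);
}
draw();
```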
I’m fascinated by the way sound and images combine, and how we perceive them. I like concerts because the music is accompanied by complementary visuals, whether that is lighting, dancing, or (increasingly) digital images and video on the big screens.
In the big leagues, these visualizations are high-quality animations, often specifically designed for the song that is playing. One that always comes to mind is the animation that accompanied Tiesto’s ‘Escape Me’ during his Kaleidoscope world tour. It was many years ago, and this was the best video I could find:
It was an amazing production, and the visuals really complemented the song. But what about music in the mid-leagues or little leagues? Is it possible to entertain and engage people with more accessible visuals? This is the avenue I would like to explore for this final project.
My goal is to create a dynamic music visualizer – an accessible sketch that detects features in the music (such as amplitude or frequency content) and provides visual feedback in real (or near-real) time.