Building upon my artist lyrics analyzer project, I made a small web app that searches an artist's lyrics for the colors they mention.
You can try it yourself here.
Using data from the Musixmatch API, I created a tool that shows which words musical artists use most in their lyrics. It uses a combination of the jQuery and D3 libraries to work as a single-page “app”. Try it here.
The web “app” works by finding the artist you search for and running through every lyric it can find by that artist to build a lexicon. Once the lexicon is built, you can click the box that appears to view statistics about the artist's most-used words.
You can also view the most-used words overall by clicking the “All Artists” button.
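The lexicon step boils down to word-frequency counting. Here is a minimal sketch of that idea in plain JavaScript – the function names and tokenizing rule are illustrative assumptions, not the app's actual code:

```javascript
// Build a word-frequency lexicon from an array of lyric lines.
// (Illustrative sketch – names and tokenizing are assumptions.)
function buildLexicon(lyrics) {
  const counts = {};
  for (const line of lyrics) {
    // Lowercase and keep only word-like tokens (letters and apostrophes).
    const words = line.toLowerCase().match(/[a-z']+/g) || [];
    for (const word of words) {
      counts[word] = (counts[word] || 0) + 1;
    }
  }
  return counts;
}

// Return the n most-used words, sorted by descending count.
function topWords(lexicon, n) {
  return Object.entries(lexicon)
    .sort((a, b) => b[1] - a[1])
    .slice(0, n)
    .map(([word]) => word);
}
```

From there, a D3 bar chart or word cloud can be bound directly to the output of `topWords`.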
The instant an Arduino or Raspberry Pi connects to the web (with a public IP) it is out there for anyone – or anything – to detect.
In our connected devices class, my classmates and I saw this vulnerability firsthand. After leaving our connected thermostats online for a week, our devices were scanned – and sometimes attacked – by machines from across the globe.
The connected thermostat I was building earlier is now complete! 🌡🌡🌡
This thermostat works like a Nest Thermostat (though clearly not as pricey): it collects the current temperature and sends that information to an online server.
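The reporting step is simple in principle: package a sensor reading as JSON and POST it to the server. Here is a rough sketch of that loop – the endpoint URL, field names, and device ID are placeholder assumptions, not the thermostat's real configuration:

```javascript
// Package one temperature reading as a JSON payload.
// (Field names are illustrative assumptions.)
function buildReading(deviceId, tempCelsius) {
  return JSON.stringify({ device: deviceId, temperature_c: tempCelsius });
}

// POST a reading to the collection server.
// "https://example.com/readings" is a placeholder endpoint.
async function report(reading) {
  await fetch("https://example.com/readings", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: reading,
  });
}
```

On the device, `buildReading` would be called on a timer with the latest sensor value and handed to `report`.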
Over the past few weeks I have been working on my first ‘patch’ – an interface for controlling an audio-visual performance. Here is a still-shot of something made by the patch:
Let's call it CRYPT0MANIA – the connected crystal.
This post is about the crystal's digital details – because it works – it really works!
Planet Music is a sound visualizer on the moon! The current version has a set playlist of 6 songs that can be activated with the media player buttons. The space scene reacts to the sound using a fast Fourier transform (FFT). Here is a sample video:
Try it for yourself here (Note: please allow ~20 seconds to load; currently not optimized for mobile).
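The FFT gives the scene an array of bin amplitudes every frame; averaging those bins into a handful of bands yields one level per visual element. A minimal sketch of that band-averaging step (illustrative, not the project's actual code):

```javascript
// Collapse an FFT spectrum (array of bin amplitudes) into a few bands
// by averaging equal-sized groups of bins. Each band's level can then
// drive one visual element in the scene.
// (Illustrative sketch – equal-width bands are an assumption.)
function toBands(spectrum, numBands) {
  const bands = [];
  const size = Math.floor(spectrum.length / numBands);
  for (let b = 0; b < numBands; b++) {
    let sum = 0;
    for (let i = b * size; i < (b + 1) * size; i++) {
      sum += spectrum[i];
    }
    bands.push(sum / size);
  }
  return bands;
}
```

In practice, low bands (bass) tend to make the most satisfying drivers for size and motion, while high bands work well for brightness or particle effects.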
I’m fascinated by the way sound and images combine, and how we perceive them. I like concerts because the music is accompanied by visual complements, whether it is lighting, dancing, or (increasingly) digital images and videos on the big screens.
In the big leagues, these visualizations are high-quality animations, often specifically designed for the song that is playing. One that always comes to mind is the animation that accompanied Tiesto’s ‘Escape Me’ during his Kaleidoscope world tour. It was many years ago, and this was the best video I could find:
Amazing production – the visuals really complemented the song. But what about music in the mid-leagues or little leagues? Is it possible to entertain and engage people with more accessible stuff? This is the avenue I would like to explore for this final project.
My goal is to create a dynamic music visualizer – an accessible sketch that detects something in the music data and provides visual feedback in real (or near-real) time.
As the technical feasibility of the API to LED project comes together, it is time to consider the physical specifications, including design, construction, and materials. Here is the current conceptual design:
Above is a little illustration that depicts the sunrise and sunset times (listed on the y-axis) of two locations in the world for each day of the year (spread across the x-axis).
See the code and web version here (press the play button – not optimized for mobile).
In this case it shows the 2016 sunrises and sunsets for Eastern Standard Time (EST) – aka New York – and Central European Time (namely Switzerland & France).
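The mapping behind the chart is simple: day of the year becomes the x position, and time of day becomes the y position. A sketch of that mapping (illustrative helper names, not the sketch's actual code):

```javascript
// Map a day of the year (1..366) to an x position across the canvas.
// (Illustrative sketch – a 365-day span is an assumption.)
function dayToX(dayOfYear, width) {
  return ((dayOfYear - 1) / 365) * width;
}

// Map a time of day (minutes since midnight, 0..1440) to a y position.
function timeToY(minutesSinceMidnight, height) {
  return (minutesSinceMidnight / 1440) * height;
}
```

Each location then contributes two curves – one of `timeToY(sunrise)` points and one of `timeToY(sunset)` points – and the band between them is that location's daylight.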
At many stages in my life, for different reasons, I have found myself doing a mental calculation to imagine this time difference, and specifically when our normal waking hours overlap.
These days, both a friend and my girlfriend’s father are stationed in Switzerland.
This illustration is designed to show that our days share a lot of the same sunlight – especially in the summer. So even if we are far away, we can often look up at the same thing in the sky.