Before the beautiful viz is made, data must be bent and molded.
Last week I visited the American Museum of Natural History through the lens of a museum/exhibit critic.
As a child and young adult, I remember the museum seeming impossibly big. Yet this time, through this lens – and thanks to the dozens of other visits I’ve made – the space felt totally manageable.
I performed a 5-minute audio-visual piece at CultureHub for an audience of about 50 people. Here is a recording of it – the piece is called Big Dark Age.
In Adobe After Effects, I created a small piece based on a repetitive activity 😤
Revenge of Left Shark is an interactive rhythm and dance game. In other words, DDR with your hands. The game uses computer vision to place the player on the beach, supply visual cues and provide real-time scoring information.
700+ lines of code later, a simple game has grown out of what was once a tiny idea.
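The real-time scoring idea behind the game can be sketched as timing-window matching: a detected gesture is compared against the nearest scheduled beat cue and rated by how close it landed. This is an illustrative sketch only – the function and window names are hypothetical, not taken from the actual Revenge of Left Shark code.

```javascript
// Hypothetical sketch of DDR-style timing-window scoring.
// A hit detected by the computer vision layer is compared
// against the nearest beat cue; tighter timing earns more points.
function scoreHit(hitTimeMs, beatTimesMs, perfectWindowMs = 60, goodWindowMs = 150) {
  // Find how far the hit landed from the closest beat.
  let best = Infinity;
  for (const beat of beatTimesMs) {
    const diff = Math.abs(hitTimeMs - beat);
    if (diff < best) best = diff;
  }
  if (best <= perfectWindowMs) return { rating: "perfect", points: 100 };
  if (best <= goodWindowMs) return { rating: "good", points: 50 };
  return { rating: "miss", points: 0 };
}
```

Running this once per detected gesture keeps the scoring loop cheap enough to stay real-time, since it only scans the list of upcoming beat times.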
Working as a group, my teammates (Huiyi Chen & Fanni Fazakas) and I created an object theater installation & performance called No ∞ Symphony.
The piece is a modern expression of the story of Sisyphus, suggesting that life is an endless cycle of actions and choices.
Using data from the musixmatch API, I have created a tool that shows what words musical artists use most in their lyrics. It uses a combination of jQuery and D3 libraries to work as a single-page “app”. Try it here.
The web “app” works by finding the artist you search for and running through every lyric it can find by that artist to build a lexicon. Once the lexicon is built, you can click the box that appears to view statistics about the artist’s most-used words.
You can also view the most-used words overall by clicking the “All Artists” button.
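The lexicon-building step described above boils down to tokenizing each lyric and tallying word frequencies. Here is a minimal sketch of that core, assuming the lyrics arrive as plain strings from the musixmatch API – the function names are illustrative, not the app’s actual code.

```javascript
// Build a word-frequency lexicon from a list of lyric strings.
// Tokenization here is deliberately simple: lowercase, then keep
// runs of letters and apostrophes.
function buildLexicon(lyricsList) {
  const counts = {};
  for (const lyrics of lyricsList) {
    const words = lyrics.toLowerCase().match(/[a-z']+/g) || [];
    for (const w of words) {
      counts[w] = (counts[w] || 0) + 1;
    }
  }
  return counts;
}

// Pull out the n most-used words for display (e.g. binding to D3).
function topWords(lexicon, n) {
  return Object.entries(lexicon)
    .sort((a, b) => b[1] - a[1])
    .slice(0, n)
    .map(([word, count]) => ({ word, count }));
}
```

An “All Artists” view would simply merge the per-artist lexicons before calling `topWords`, which is one reason a flat word-to-count object is a convenient shape for this data.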
My work in Max MSP is getting more advanced. Yet at times, things can look very, very complicated:
Some classmates and I will be putting together an object-theater-based live performance later this week. Stay tuned for documentation.