I used Runscope and ThingsBoard to prototype an IoT API for a working steam engine. With experience in both producing and consuming APIs, I wanted it to be a relevant proof of concept for devs trying to integrate the physical world with the virtual. One of the challenges I wanted to take on was how to distinguish a hardware failure from an API failure and then pass that response on for debugging by a dev down the line. The base IoT capability of the devices is to send MQTT messages to an external broker. ThingsBoard.io digests the MQTT stream and offers distinct REST API endpoints for each device, as well as an administrative API for historical data queries.
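That triage can be sketched as a small decision function. This is a minimal, hypothetical sketch: the threshold, labels, and function name are mine, not ThingsBoard's. Given the HTTP status of a REST telemetry query and the timestamp of the device's newest telemetry point, it guesses which side of the system failed.

```python
import time

# Hypothetical threshold; a real value depends on the device's publish interval.
STALE_AFTER_S = 120  # no MQTT telemetry for 2 minutes -> suspect hardware

def classify_failure(http_status, last_telemetry_ts, now=None):
    """Rough triage: decide whether a bad reading points at the API or the device.

    http_status       -- status code from the REST telemetry call
    last_telemetry_ts -- Unix time of the newest telemetry point the broker saw
    """
    now = time.time() if now is None else now
    if http_status >= 500:
        return "api-failure"           # the API/broker itself is down or erroring
    if http_status in (401, 403, 404):
        return "api-misconfiguration"  # bad token or wrong device endpoint
    if now - last_telemetry_ts > STALE_AFTER_S:
        return "hardware-failure"      # API is healthy but the device went silent
    return "ok"
```

The useful property for a downstream dev is that a 2xx response with stale telemetry still flags the hardware, rather than silently returning the last known value.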
Originally prototyped at the Gray Area Foundation for the Arts Creative Code Immersive, Embers is an interactive installation built from 1,250 LEDs that react to collaborative wind movement such as breath. At the core of the piece is an Arduino Mega 2560 driving 25 strands of 50-count WS2811 LEDs, 16 of Modern Device's improved Rev. P wind sensors, and 300 ft. of calligraphy-grade rice paper. It debuted at the SubZERO art festival and is currently on display at Kaleid Gallery.
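The firmware itself is Arduino C, but the core sensor-to-light mapping can be sketched in Python. The baseline and gain here are hypothetical tuning knobs, not the installation's actual calibration values:

```python
def wind_to_brightness(raw, baseline=0.0, gain=4.0):
    """Map a wind-sensor reading to an LED brightness in 0-255.

    raw      -- sensor reading in arbitrary ADC units (assumed scale)
    baseline -- calm-air reading, subtracted so still air stays dark
    gain     -- how sharply a breath lights the strand (tuning knob)
    """
    level = max(0.0, raw - baseline) * gain
    return min(255, int(level))
```

Clamping at both ends matters in practice: subtracting the calm-air baseline keeps the piece dark between visitors, and capping at 255 keeps a hard blow from overflowing the LED value.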
What began as a small effort to support the release of my first album has blossomed into an ongoing video art and animation project. The videos are around one minute long and feature original music and sound art tracks. Originally shared on Instagram and Twitter, they have also been projected on the sides of buildings for art festivals in San Jose, CA. They were made with a variety of video tools, ranging from Final Cut Pro to ffmpeg and iOS apps.
These albums are the culmination of two years of experimenting with modern field-recording and music-production hardware and software. As with programmatic approaches to media manipulation, the process treats sound as an object: sound objects have properties that can be shifted and transformed like variables. The resulting music veers into dark ambient, glitch, and post-industrial territory.
The goal of this box was to have the Linux soft synth ZynAddSubFX running headless on a battery-powered, untethered Raspberry Pi, controllable by a simple MIDI keyboard and an instrument switcher on a phone or tablet. It also runs Node.js and serves a networked configuration app, allowing for networked playing from a variety of contexts. I wrote up a detailed tutorial with code snippets, video, and a public GitHub repo.
For this particular project, I ended up using a Raspberry Pi Zero W for its size and versatility. Because it shares the codebase used by the Zyn box, it also serves a Node.js web app over Wi-Fi for changing instruments. It's controllable by any basic USB MIDI keyboard and runs on a mid-sized USB battery pack for around six hours. That's pretty good for such a tiny footprint, and the board costs around $12. An interesting challenge was the need to script Telnet to control FluidSynth, its main sound engine.
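FluidSynth, when started in server mode (`fluidsynth -is`), accepts plain-text shell commands such as `prog <channel> <program>` over a telnet socket (port 9800 by default). A minimal sketch of scripting that interface, with hypothetical helper names:

```python
import socket

FLUIDSYNTH_HOST = "127.0.0.1"
FLUIDSYNTH_PORT = 9800  # fluidsynth's default telnet server port

def build_program_change(channel, program):
    """Build the text command FluidSynth's telnet shell expects
    for an instrument change, e.g. b"prog 0 5\n"."""
    return f"prog {channel} {program}\n".encode("ascii")

def send_command(cmd_bytes, host=FLUIDSYNTH_HOST, port=FLUIDSYNTH_PORT):
    """Push one command down the telnet socket (fire-and-forget,
    no reply parsing)."""
    with socket.create_connection((host, port), timeout=2) as sock:
        sock.sendall(cmd_bytes)
```

A web app's "change instrument" button then reduces to one line, e.g. `send_command(build_program_change(0, 5))`, which is what makes the headless, networked setup practical.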
The Charleston City Paper provides comprehensive coverage of the annual Spoleto Festival USA over the course of three weeks. To better serve the many out-of-town visitors on mobile devices, I planned, designed, and coded this fully featured HTML5 app. In addition to editorial content and a complete event calendar, it offered geolocation-based suggestions for dining near events. Built with jQuery Mobile and drawing from the pre-existing CMS, it was compatible with a broad range of devices and updated automatically with fresh content.
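The dining suggestions boil down to a great-circle distance sort, sketched here in Python for brevity (the app itself was JavaScript); the venue format and function names are illustrative:

```python
from math import asin, cos, radians, sin, sqrt

EARTH_RADIUS_KM = 6371.0

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points, in kilometres."""
    dlat = radians(lat2 - lat1)
    dlon = radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * EARTH_RADIUS_KM * asin(sqrt(a))

def nearby(venues, lat, lon, limit=3):
    """Rank (name, lat, lon) venues by distance from an event location."""
    ranked = sorted(venues, key=lambda v: haversine_km(lat, lon, v[1], v[2]))
    return [v[0] for v in ranked[:limit]]
```

On the client side, the event's coordinates would come from the CMS and the user's from the browser's geolocation API; everything after that is this sort-and-slice.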
The Electronic Arts Alliance of Atlanta Annual Member Show consisted mostly of musical performances. My contribution was to take a combination of different members' electronic music and build this interactive musical instrument for anybody to experiment with online. The aesthetic came from a combination of scanned images of actual electronics and web-based animations, while backend scripting handled audio streaming and trigger timing.