I used Runscope and Thingsboard to prototype an IoT API for a working steam engine. Drawing on experience in both API production and consumption, I wanted it to be a relevant proof of concept for devs trying to integrate the physical world with the virtual. One of the challenges I wanted to take on was how to differentiate a hardware failure from an API failure and then pass that response on for debugging by a dev down the line. The base IoT capability of the devices is to send MQTT messages to an external broker. Thingsboard.io ingests the MQTT stream and offers distinct REST API endpoints for each device, as well as an administrative API for historical data queries.
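One way to frame that hardware-vs-API distinction on the consuming side: an HTTP error from the platform points at the API layer, while a successful response carrying stale telemetry points at the device itself. A minimal Python sketch of that triage logic; the function name, status-code thresholds, and staleness window are my own illustrative assumptions, not Thingsboard specifics:

```python
import time

# Hypothetical triage helper: given the HTTP status of a telemetry query
# and the timestamp (epoch seconds) of the newest reading it returned,
# decide where to point the dev who has to debug it.
def classify_failure(status_code, last_telemetry_ts, now=None, max_age_s=300):
    now = time.time() if now is None else now
    if status_code >= 500:
        return "api_failure"       # platform/broker side is down
    if status_code >= 400:
        return "request_error"     # bad auth or malformed query
    if last_telemetry_ts is None or now - last_telemetry_ts > max_age_s:
        return "hardware_failure"  # API is healthy but the device went quiet
    return "ok"

# e.g. a 200 response whose newest reading is 10 minutes old
print(classify_failure(200, time.time() - 600))  # hardware_failure
```

The staleness check matters because a dead sensor and a healthy API look identical at the HTTP layer; only the age of the last datapoint tells them apart.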
Originally prototyped at the Gray Area Art Foundation Creative Code Immersive, Embers is an interactive installation built from 1250 LEDs that react to collaborative wind movement such as breath. The core of the piece is an Arduino Mega 2560 driving 25 strands of 50-count WS2811 LEDs, with 16 improved Modern Device wind sensors (Rev. P) and 300 ft. of calligraphy-grade rice paper. It debuted at the SubZERO art festival and is currently on display at Kaleid gallery.
This album is the culmination of a year of experimenting with modern field recording and music production hardware and software. As in programmatic approaches to media manipulation, the process treated each sound as an object with properties that could be shifted and transformed like variables. I created videos for several of the songs using open source Linux tools such as ffmpeg alongside mainstream production platforms like Final Cut Pro.
The goal of this box was to have the Linux soft-synth ZynAddSubFX running headless on a battery-powered, untethered Raspberry Pi, controllable by a simple MIDI keyboard and an instrument switcher on a phone or tablet. It also runs Node.js and serves a networked configuration app, which allows for playing from a variety of contexts. I wrote up a detailed tutorial with code snippets, video, and a public GitHub repo.
For this particular project, I ended up using a Raspberry Pi Zero W for its size and versatility. Because it shares the codebase used by the Zyn box, it also serves a Node.js web app over Wi-Fi for changing instruments. It's controllable by any basic USB MIDI keyboard and runs on a mid-sized USB battery pack for around 6 hours. Pretty good for such a tiny footprint, and the board costs around $12. An interesting challenge was the need to script Telnet to control Fluidsynth, its main sound engine.
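Fluidsynth exposes its command shell over TCP when launched in server mode, so "scripting Telnet" largely comes down to writing newline-terminated shell commands to a socket. A rough Python sketch of the idea; `prog` is a real Fluidsynth shell command, but the host, port, and helper names are assumptions about this setup rather than a copy of the actual scripts:

```python
import socket

FLUIDSYNTH_HOST = "127.0.0.1"  # the Pi itself; assumed setup
FLUIDSYNTH_PORT = 9800         # assumed default port for Fluidsynth's TCP shell

def prog_change(channel, program):
    """Build the Fluidsynth shell command that switches an instrument."""
    return f"prog {channel} {program}\n"

def send_command(command, host=FLUIDSYNTH_HOST, port=FLUIDSYNTH_PORT):
    """Push one newline-terminated command down the Telnet-style socket."""
    with socket.create_connection((host, port), timeout=2) as sock:
        sock.sendall(command.encode("ascii"))

# The web app's instrument switcher reduces to calls like:
# send_command(prog_change(0, 42))  # channel 0 -> MIDI program 42
```

Separating command construction from the socket write keeps the formatting testable without a running synth, which is handy when the synth only exists on the Pi.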
The Charleston City Paper provides comprehensive coverage of the annual Spoleto Festival USA over the course of three weeks. To better serve the many out-of-town visitors using mobile devices, I planned, designed, and coded this fully featured HTML5 app. In addition to editorial content and a complete event calendar, it offered geolocation-based suggestions for dining near events. Built with jQuery Mobile and drawing on the pre-existing CMS, it was compatible with a broad range of devices and updated automatically with fresh content.
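At its core, a geolocation-based dining suggestion is a nearest-neighbor sort over venue coordinates. A simplified sketch of that idea in Python (the original logic lived in the jQuery Mobile app; the venue list and names here are invented for illustration):

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points, in kilometers."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 \
        + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371 * asin(sqrt(a))  # 6371 km = mean Earth radius

def nearby_dining(event_lat, event_lon, venues, limit=3):
    """Rank dining venues by distance from an event's location."""
    ranked = sorted(
        venues,
        key=lambda v: haversine_km(event_lat, event_lon, v["lat"], v["lon"]),
    )
    return ranked[:limit]

# Hypothetical venues near a downtown Charleston event
venues = [
    {"name": "A", "lat": 32.779, "lon": -79.931},
    {"name": "B", "lat": 32.790, "lon": -79.940},
]
print([v["name"] for v in nearby_dining(32.778, -79.930, venues)])  # ['A', 'B']
```

In the app the coordinates came from the browser's geolocation API and the venue list from the CMS; the ranking step itself is this small.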
The Electronic Arts Alliance of Atlanta Annual Member Show consisted mostly of musical performances. My contribution was to combine different members' electronic music into this interactive musical instrument that anybody could experiment with online. The aesthetic paired scanned images of actual electronics with web-based animations, while backend scripting handled audio streaming and trigger timing.
I've leveraged my journalism background to write case studies, blog posts, tutorials, and stories for a variety of tech companies. Some of them are detailed technical explainers, while others approach storytelling from a technology perspective. Clients have included NordicAPIs, API Evangelist (Kin Lane), Runscope, and DreamFactory Software. For a while, I even had a tech column, both irreverent and informative, in a regional newspaper in Charleston, SC.