steven dale
ux. interaction. sound. design
spark
2010
a playful way to cross-pollinate research
research, collaboration, remix, prototype, cards, game, lego, connections
Collaborators
  • Aabhira Aditya
  • Jacqueline Cooksey
  • Bland Hoke
  • Mai Kobori
  • Eulani Labay
  • Minuette Le
Overview
We designed + prototyped a paper-based game. This tool helped us make meaningful connections between fragments of insight each of us discovered over the course of an 8-week research project.
I was particularly interested in this method of using facilitated conversation and spatial reasoning to improve communication on complex collaborative projects.
Context
We investigated the food distribution ecosystem of New York City, and its effects on people living near its central hub in Hunts Point.
Each of us came from a different professional background, including graphic design, sound design, architecture, public policy, performance art and computer science.
Instead of setting out to try and "solve" a specific problem, we let our interests and discoveries guide potential directions for our experimental research.
In our 2nd playthrough we used "BAM" or question cards to help us start conversations and link topics across disciplines.
Challenge
Our work was highly collaborative. We walked through neighborhoods, interviewed people, conducted literature research and created data visualizations.
Whenever we debriefed our field research, we struggled to weave together relationships, pinpoint cause/effects, and share what we learned with each other in a simple way.
Could we create an engaging structure for sharing the bits of insight each had uncovered, while keeping the energy of an open-ended discussion?
Could we somehow record the connections we made and the patterns we discovered, to later share outside of our group?
Process
A simple ruleset emerged after lots of false starts + several playthroughs.
At the beginning of each session, players chose an identifying marker color. This helped us visually scan the construction later to get a sense of participation balance. Then, on blank white index cards, each player wrote one noteworthy tidbit of an idea they'd discovered recently, for example:
  • an assumption they held
  • a statistic from a report they found
  • a phrase they overheard on the subway
  • a telling quote from an interview
  • an open question
  • a recent happening in the news
  • a troubling gap or point of conflict
We called these the spark cards. Players took turns placing these on the board and briefly talking about their significance.
The second type was the link card. We used red tape to draw connections between spark cards on the board, then added link cards to the tape to describe the relationship: for example, Card A causes Card B, or Card D is a result of Card F. Lastly, we had question or BAM cards. These could be played by anyone as a way to tease out more detail.
As record keeper, I tried numbering the cards to keep track of the flow of conversation, so the board could later be reconstructed as a timelapse. Inspired by distributed version control systems (DVCS) like git, I was hoping to create a visualization similar to this:
From the facilitation side, I was inspired by the work of Dialog Mapping and Issue Based Information Systems (IBIS).
Splitting into thematic groups in later iterations
Results
The game was helpful for starting meaningful conversations within the group that may not have otherwise occurred. Some of the resulting ideas were seeds for multi-year research projects that later grew out of the program.
After completing the project, 2 things stuck with me: The idea that seemingly unrelated fragments have value in different contexts, and the power of spatial thinking for making sense of fragmented research. I revisited these interests in my thesis research, and later as part of the Make Parallels project.
Roles
  • Rapid Prototyping
  • Visual Design
  • Gameplay Mechanics
  • Concepting
  • Workshop Facilitation
visualizing flow
2015
an animated, visual identity
process, flow, rollage, remix, animation, identity, prototype, canvas, js
Overview
Flow, remix and rollage are key design principles for Make Parallels, a free + open source creativity support tool I've been helping design + build.
I've been working on various forms of dynamic animations for use across the product, an ecosystem of web + desktop applications. The latest animation is on our proof of concept web application, as a non-functional teaser to the identity. We'll incorporate animations based on this research + framework into the application's functionality next.
Goals
Inspired by the phrase "motion is meaning", I started with two questions:
How can the motion of the application's UI elements communicate the core ideas + feeling of the project?
How could it inspire people and improve their experience in a meaningful way?
Design Research
A key part of my design process is collecting art + design images. I'm heavily inspired by early collage artists Jiří Kolář + Boguslaw Schaeffer, along with a new wave of collage artists.
I looked for inspiring images under three themes: flow, rollage + remix. I then dropped the images for each theme onto a separate digital canvas, playing with arrangement as a way to discover patterns, ideas and connections. A different structure emerged for each.
This was not a one-off exercise, but an ongoing process; I've iterated these arrangements many times as I've found new inspiration and learned more about what we're making.
A timelapse of this canvas arrangement emerging. The art + design images are of sine waves, figure 8s and other circular, back + forth movement, representing the theme of Flow.
Remix / Collage: Each image is an artwork composed of an assemblage of fragments with different shape, color, texture and style.
Rollage is a specific type of collage, where fragments are shaped as thin vertical slices. Some exploration into horizontal and off-kilter variations
Prototype 1:
Rollage Study
This image in particular inspired me with a feeling of movement. Could I recreate this in code, to see the glimmer in action?
With this as a first goal, I created a prototype using an early version of p5.js, a library similar to Processing that's useful for art, design + interaction projects. I was able to recreate the feeling with a bit of experimentation and a small amount of code.
Prototype 2:
Realtime wave
This movement had a nice glimmering feeling, but I wanted to push it further by adding a wave motion to the slices. Looking to another Kolář image gave me more inspiration:
How could the application embody this feeling?
I started by sketching: what might different moments, or states, of the animation look like?
With this goal, I broke the problem down into parts. First step: get the image sliced into vertical bars, in realtime. While I already had something similar working from the earlier rollage prototype, I wanted to understand how the p5.js library worked under the hood.
I read the library's source code for the functions I was using. I understood only a little, but enough to see it was abstracting HTML5 canvas manipulation. Since I was using p5.js for only one function and planned on doing more custom image manipulation, it made sense to move forward with JS + HTML5 canvas directly.
I found some related snippets that I adapted, and after a few tests, got a basic version working: a static image, sliced into bars and rendered in realtime, in the browser. The bars were frozen in a set position - no animation yet.
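The slicing step can be sketched in plain JS + canvas. This is a minimal sketch, not the project's actual code; the function names are mine. The idea: compute equal-width source rectangles, then copy each strip to the screen with the 9-argument form of `drawImage`.

```javascript
// Compute source rectangles for slicing an image into nSlices
// equal-width vertical bars. (Hypothetical names, illustrative only.)
function sliceRects(imgWidth, imgHeight, nSlices) {
  const sliceWidth = imgWidth / nSlices;
  const rects = [];
  for (let i = 0; i < nSlices; i++) {
    rects.push({ x: i * sliceWidth, y: 0, w: sliceWidth, h: imgHeight });
  }
  return rects;
}

// Render the bars with no library: drawImage(img, sx, sy, sw, sh,
// dx, dy, dw, dh) copies one vertical strip of the source image to
// the same position on the canvas. With identical source +
// destination rects the image looks whole, but each bar can now be
// moved independently.
function drawSlices(ctx, img, rects) {
  for (const r of rects) {
    ctx.drawImage(img, r.x, r.y, r.w, r.h, r.x, r.y, r.w, r.h);
  }
}
```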
With the number of links, references, prototypes + images for this task growing, I began compiling all of my research in one place.
Now for animating the wave motion. I found a great example in Processing with all the physics code and the feel I had in mind.
I used this as a base, first porting the Processing code to JavaScript, replacing the Processing run loop with the browser's requestAnimationFrame loop. From there, it wasn't too hard to replace the dots from the demo with the bars I had already figured out how to slice.
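The port can be sketched like this. It's a simplified stand-in (a plain sine wave rather than the demo's physics), and the function names are mine, not the original code: each bar's vertical offset depends on its index and the elapsed time, producing a traveling wave across the slices.

```javascript
// Simplified wave model: offset for one bar at a given time.
// amplitude in px, wavelength in bars, speed in radians/second.
function waveOffset(barIndex, timeMs, amplitude, wavelength, speed) {
  const phase = (barIndex / wavelength) * 2 * Math.PI;
  return amplitude * Math.sin(phase + (timeMs / 1000) * speed);
}

// The browser run loop: requestAnimationFrame stands in for
// Processing's draw() loop. Defined but not started here; call
// startWave(ctx, img, rects) in a page with a canvas.
function startWave(ctx, img, rects, amplitude = 20, wavelength = 8, speed = 4) {
  function frame(timeMs) {
    ctx.clearRect(0, 0, ctx.canvas.width, ctx.canvas.height);
    rects.forEach((r, i) => {
      const dy = waveOffset(i, timeMs, amplitude, wavelength, speed);
      // Same source strip, vertically displaced destination.
      ctx.drawImage(img, r.x, r.y, r.w, r.h, r.x, r.y + dy, r.w, r.h);
    });
    requestAnimationFrame(frame);
  }
  requestAnimationFrame(frame);
}
```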
Some sketching again to work out the math...
After many tests and iteration, I finally got the mashup working:
Result
Once I got everything stable, I refactored out my code into a reusable module, as a local (unpublished) Meteor package.
You can see the live animation in code directly on CodePen.
Next Steps:
Interpolation
In its current form, the animation represents two of the three key themes: flow (via wave movement), and rollage (via the fragments or slices of vertical bars). It does not yet incorporate the third theme: remix.
How could this theme be expressed, within this framework of animation?
A logical place in the application's functionality where the remix theme could be expressed is during the act of creating connections between two images, ie, making a parallel. Here, we could combine all three: flow movement, the rollage aesthetic, and the remix theme through a momentary interpolation of the images.
Imagine selecting a source image, the image on the left, selecting a destination image, the image on the right, and then tagging that relationship as born from.
Here's a short demo of how selecting + drawing the connection might work, if both the source and destination images are in the viewport:
Kolář's early rollage work was a touchstone here, giving a sense of what that interpolated moment might look like:
This is a storyboard of how that might look in motion. Note the wave motion isn't represented here; this is just the interpolation.
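One way that interpolated moment could work, sketched as a hypothetical function (not part of the current codebase): each vertical slice cross-fades from the source image to the destination image, with a stagger so the transition sweeps across the bars instead of happening all at once.

```javascript
// Hypothetical per-slice cross-fade: returns the destination
// image's opacity (0..1) for one slice, given the overall
// transition progress t (0..1). stagger controls how far apart
// in time the slices begin their fades.
function sliceBlend(sliceIndex, nSlices, t, stagger = 0.5) {
  // Each slice starts its fade slightly after the previous one.
  const delay = (sliceIndex / (nSlices - 1)) * stagger;
  const local = (t - delay) / (1 - stagger);
  return Math.min(1, Math.max(0, local)); // clamp to [0, 1]
}
```

Drawing each slice twice per frame (source strip at full opacity, destination strip at `sliceBlend(...)` opacity via `ctx.globalAlpha`) would produce the sweeping remix effect.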
Next Steps:
Algorithmic Augmentation
Lastly, I've been interested in experimenting with sampling the color + shapes of images on the canvas and using these as parameters to generate algorithmic elements for the animations in real time.
Designer Tristan Bagot experimented with this idea in his thesis, and this experiment might be an interesting direction to pursue as our framework develops.
Roles
  • Concepting + Design Research
  • Visual Design
  • Animation + Motion Design
  • UI Prototyping + Development
the rub
2010
sound design + composition for an experimental theater production
immersive, hamlet, theater, experimental, sound design, space
Overview
I composed, recorded and mixed original music for The Rub, an experimental theater production. The play was an immersive, contemporary re-imagining of Shakespeare's Hamlet.
Listen

Process
I worked alongside the director, designers and cast for several months to produce the soundscape.
The pieces changed continuously, as the script evolved and the actors learned + fine-tuned their performances. I wrote several hours of music, with many variations. Much of the material was not used, though all of it was valuable in the process.
A key piece was the appearance of Hamlet's ghost. I recorded the actor reading his lines during one of the rehearsals with a Rode Omni Lavalier condenser microphone. I later heavily remixed, layered and effected this recording. We used this as a background track to add impact + ambiance during the pivotal scene.
Inspiration for this project came from Trent Reznor + Atticus Ross' soundtrack for The Social Network and Loscil's sound compositions:
Result
The theatrical production ran at the 2012 New Orleans Fringe Festival and received positive reviews, including a mention by The Gambit as one of the best productions of the festival.
Next Steps
A big challenge was syncing sound + music to the performance because no two were ever the same. Audience immersion added another layer of variability: the production relied on a fixed stage for several sections only, leaving the audience free to roam about during the rest.
This made designing a consistent experience for all listeners very difficult. Standing in different areas of the space produced big changes in sound quality, resonance, loudness and intelligibility.
Thus, an interesting area to explore is dynamic composition: can sound + musical elements be designed in pre-production, to be triggered and shaped in realtime, based on actors + audience members' movements in the space?
There are exciting intersections in dynamic audio/sound system design, drawing from emerging practices in exhibit design, interactive installations, video games + virtual reality (VR).
Roles
  • Sound Design
  • Sound Editing
  • Music Composition
  • Sound Recording
Attribution
I relied on sound effect contributions from these sound designers on freesound.org:
Photography by Valerie Pisarello
turn+zoom
2008
rotating + zooming products in 360°
interaction, ux, 360, js, zoom, tile, product, ui, ecommerce, product photography
Collaborators
  • Drew Diller
  • Nitin Dhar
  • Maurício Linhares
  • Peter Lee
Overview
Our team designed + delivered into production a 360° turn + zoom product viewer for an ecommerce startup selling refurbished networking hardware.
We developed a pipeline for photographing products as they flowed through operations, which generated images and powered the product viewer on the customer-facing website. This experience loaded images dynamically using a technique similar to Google Maps. Each product was shot from 16 angles, enabling customers to zoom in for detail after choosing any angle.
At the time (2008), no similar functionality existed which combined a rotating and zooming product experience directly in the browser, without plugins.
Context
Over 50% of our customer service inquiries were product-related: how many ports are there? will it fit in this rack unit? We had a low return and defect rate, but whenever we did have a return, it was usually because a customer ordered the wrong part.
We started to see a clearer picture of the problem once we did a review of our chat logs and RMA data, and interviewed customers, salespeople and customer service reps:
While manufacturing specification data was freely available online for basic product information, (ie, number of ports, jack types), customers often needed to see the product from various angles to determine if it was a good fit for their configuration.
Manufacturer-released photos were low resolution, were from one angle, and were nearly always available by category only. Tens of variations per category and complex SKU numbering added up to a lot of confusion for customers.
Since many could not afford costly manufacturer maintenance contracts, they relied on ecommerce websites for information to inform their purchase. We realized we could significantly improve the customer experience, differentiate ourselves from competition, and potentially change industry standards if we could provide free, high-res product photography online.
Process
We were a small team with few resources. Many moving parts were needed to launch this feature, so we needed to prototype quickly and cheaply to see if the idea was feasible. We bought a used low-end USB turntable that had configurable software to photograph products at intervals and copy photos to a shared network drive.
We started with simple tests to get consistent color with white sheets against a wall, then iterated to foam-core light boxes. We tried many lighting arrangements and combinations of equipment.
Concurrently, we wrote automation scripts in Ruby, relying on ImageMagick for image processing: 16 shots for each product were color-corrected, resized, compressed, optimized and sliced into tiles, which the front-end widget would consume.
All of the code needed for the front-end experience was custom, written mostly in JavaScript. We integrated this into our live Rails-backed codebase, designed the UI, tested across browsers, and conducted usability tests.
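The map-style tile logic can be sketched like this. It's an illustrative reconstruction, not the original code, and the path scheme is hypothetical: given the visible region of the zoomed image, compute which tiles the widget needs to request for the current product angle and zoom level, so only on-screen tiles are downloaded.

```javascript
// Hypothetical tile lookup: which tiles cover the visible region?
// viewX/viewY/viewW/viewH describe the viewport in zoomed-image
// pixels; tileSize is the side length of each square tile.
function visibleTiles(viewX, viewY, viewW, viewH, tileSize, angle, zoom) {
  const firstCol = Math.floor(viewX / tileSize);
  const firstRow = Math.floor(viewY / tileSize);
  const lastCol = Math.floor((viewX + viewW - 1) / tileSize);
  const lastRow = Math.floor((viewY + viewH - 1) / tileSize);
  const tiles = [];
  for (let row = firstRow; row <= lastRow; row++) {
    for (let col = firstCol; col <= lastCol; col++) {
      // Path scheme is an assumption for illustration.
      tiles.push(`/tiles/angle-${angle}/zoom-${zoom}/${col}-${row}.jpg`);
    }
  }
  return tiles;
}
```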
Result
Several years later, the zooming JS library we originally evaluated released a plugin with similar functionality [combining turning + zooming].
Roles
  • Concepting + Design Research
  • Visual + Interaction Design
  • HTML/CSS Development
  • User Experience Design
  • Product Management
image morph
2015
a JavaScript animation for expanding + contracting thumbnail images
ui, prototype, interaction, morph, image, animation, transition
Overview
Flow is a key design principle in Make Parallels, a free + open source creativity support tool I've been involved with.
I designed + coded an image preview animation for our proof of concept web/desktop application.
Challenge
People using the application arrange fragments, or bits of media, on an infinite canvas. Maintaining context + viewport position is a key design requirement for a good user experience. I was experimenting with several image preview ideas that fit within the visual + interaction framework we were concurrently developing.
Process
One morning I went on a tangent watching old video game footage on YouTube. I found a moment in one of the Zelda video games where the main character, Link, enters a new room and discovers a treasure.
A lightbulb went off: in the same way the black bars direct attention to Link and his discovery, maybe I could use a similar curtain effect in our app when someone selects an image to preview.
I sketched how this might work in the context of our web application...
... and then worked out the math needed to display the image at the right aspect ratio, at the largest size possible while still fitting the viewport.
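That math reduces to the classic "contain" fit: scale the image to the largest size that preserves its aspect ratio while still fitting the viewport. A minimal sketch (the app's version would also account for margins + the curtain bars; the function name is mine):

```javascript
// Largest size for an imgW x imgH image that fits inside a
// vpW x vpH viewport without distorting the aspect ratio.
function fitToViewport(imgW, imgH, vpW, vpH) {
  // The smaller of the two axis ratios is the binding constraint.
  const scale = Math.min(vpW / imgW, vpH / imgH);
  return { width: imgW * scale, height: imgH * scale };
}
```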
Result
Try the latest version of this interaction on our proof of concept web application demo: hover over an image, press the Space key to expand it and Escape to close. This is a storyboard sketch of the latest revision:
I used Verge.js to reliably calculate viewport dimensions, and Greensock.js for smooth, performant animations. Both had strong, well-documented APIs + consistent behavior across browsers.
Once I got everything stable, I refactored out my code into a reusable module, as a local (unpublished) Meteor package.
Next Steps
I'd like to experiment with elasticity next for a better fit with our identity. The code could also be refactored to reduce some repetition.
Roles
  • Concepting + Design Research
  • Visual Design
  • Animation + Motion Design
  • UI Prototyping + Development
  • User Experience Design
what's up with Jen?
2011
a game for parents to explore issues of cyberbullying
game, learning, facilitation, balance, storytelling, cards
Collaborator
  • Aabhira Aditya
Overview
We designed a ‘choose your own adventure’ style card game for parents to learn about the issues of cyberbullying.
There's no way to win or lose, there are no points, and no "correct" way to play the game. Instead, players navigate a story with multiple paths, choosing what action to take at each scene. The game is designed to be played several times with the goal of experiencing different endings.
We playtested the experience many times. Our goal was not to take a stand on the issue, but to help parents gain new insight and consider perspectives they might not have otherwise. As a result, we hoped they could be empowered to work with their children to reduce cyberbullying.
Context
Mobile phones and social media are prevalent today, and are a key component of teenagers' lives. Much research now focuses on a growing side effect of technology use: cyberbullying.
We wanted to explore if a more playful, interactive method could be more effective than traditional public campaigns for shedding light on this complex social issue.
Inspiration
The Drop The Weapons project was a big inspiration for us. It used a first-person, choose-your-own-adventure style narrative powerfully and was cleverly designed around YouTube's 'Recommended Videos' feature.
This campaign was made for teenagers, hoping to directly influence their behavior. But although parents have strong potential to influence teenagers, they were rarely the focus of cyberbullying public awareness campaigns. We saw an opportunity to focus on parents instead.
We were also inspired by a community of designers + game makers making experiential games targeting various social issues.
Process
We tried using interactive narrative tools like Twine, but sketching and writeboarding proved the most effective way to prototype the story.
We iterated the narrative many times, using bits and pieces from our research to design a believable story that we hoped would resonate with parents.
It was much more difficult to do this than we expected. The choose-your-own-adventure format meant any scene could lead to another, and it was extremely difficult to check that each path made logical sense and maintained an engaging rhythm. Every added scene significantly multiplied the possible paths, making iteration very time consuming.
Once we had a loose narrative, we began to consider aesthetics. Two constraints led us to a watercolor style. First, we did not have time or a production budget to find and photograph people in the scenes. We also found in early tests that parents would not relate to the characters as strongly with photographed imagery.
By having imagery that was more hazy + ambiguous, we felt people could place themselves in the story more easily. This is a design strategy also used in concept art, in the early stages of envisioning a film or game.
We found images of people in poses that matched the scenes we had written, then traced, layered and painted these with a watercolor brush set in Photoshop to give them a consistent feel.
Result
We facilitated the workshop at a local games community event, after running several playthroughs with friends who were parents.
We got positive feedback from participants, who mentioned it was a compelling issue and thanked us for helping raise their awareness. We mapped the routes parents took through the game to see if we could identify any patterns. The visualizations gave us some insight, but this strategy's value would likely come with larger data sets.
This type of narrative-based facilitated tool has many potential applications, and can be adapted to different kinds of stories, issues and contexts.
We found it's quite difficult to remain passive as a facilitator, and exploring the effects of our bias as moderators is an important aspect for future research. Also, even with permission, video recording makes participants uncomfortable, and so technical considerations for a smaller recording rig would be beneficial.
Roles
  • Workshop Facilitation
  • Visual Design
  • Gameplay + Narrative Development
constructing the city
2012
a sound design composition
sound, city, construction, audio, recording, sampling, chaos, harmony
Collaborator
  • Howard Chambers
Overview
A sound composition exploring the different sound states of chaos and respite in the urban environment. Produced as an accompaniment for a visual design work.
Listen
Process
I began by going on a field trip around the streets of Manhattan, recording and sampling sounds of interest: construction workers, trucks, engines, buses, generator noise, passing conversations.
Most sounds were recorded with a shotgun microphone and an Olympus digital recorder. Back in the studio, I edited and cleaned up the recordings and created the composition, a transition from chaos to harmony, trying several arrangements along the way.
Inspiration
Inspiration came from driving, crescendoing pieces that build up over time and peak at the end.
Roles
  • Sound Design
  • Sound Editing
  • Music Composition
  • Sound Recording
Attribution
I relied on sound effect contributions from these sound designers on freesound.org: