Sonus

A mobile app that celebrates social networking as an embodied and emergent user experience, grounded in primary research with the visually impaired community and built around a concept of spatial information architecture.

Role

UI/UX Designer and Creative Technologist

Tools

Figma, Unity3D, C#, Blender, Ableton, Illustrator, and Photoshop

Context

MFA Thesis project

Background

With the rise of mobile technology, much of our social interaction has shifted from in-person encounters to app-based platforms on smartphones, like Snapchat, Facebook/Instagram, etc. These social media companies, driven by advertising revenue, continue to invest heavily in visually immersive technology like spatial computing, with Snap pioneering AR features in 2015.

Moreover, approximately 8% of the U.S. population has a visual impairment, hindering their participation in this form of modern socialization, which can lead to social and health issues, per Georgetown University's Health Policy Institute. Source

On Snap’s website, the company illustrates its desire to further occupy a User’s perception with its wearable device, Spectacles.
Can a social networking app accommodate differences in its users' visual abilities?
Only 0.1% of tweets include alt-text descriptions. Source

Research

To understand the lived experience of the visually impaired and their relationship with smartphone technology, I interviewed two individuals directly involved in the community:

  1. Ryan Richards, a 40-something blind man from Kansas
  2. Emory James Edwards, a neurodivergent researcher at University of California, Irvine (UCI) in the Donald Bren School of Information & Computer Science. Link

Interview Questions

I compiled the features and visual languages of comparable applications and evaluated each for its UI/UX.

The Epiphany

During my design research into Snapchat, I was particularly drawn to the Periscope-like interaction design of the Snap Map feature.

Viewing the map from a top-down perspective, I was struck by the strategy of aggregating localized, user-generated content into cluster shapes that, to me, were reminiscent of clouds.

Yet the media shared on this application is principally visual information, so I chose an audio-focused route instead.

Concept Sketch

To help visually articulate the project's concept, I collaborated with Midjourney to illustrate a digital cloud of information intertwined within the urbanscape.

Prototypes

Using Unity3D, I examined different cloud types and interpreted each into a spatialized prototype.

Users navigated primarily with WASD keyboard controls and a mouse, listening through headphones.
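For reference, here is a minimal sketch of the kind of first-person rig the prototype scenes used, assuming Unity's legacy Input Manager; the speed and sensitivity values are illustrative, not the thesis parameters.

using UnityEngine;

// Minimal first-person rig for the prototype scenes: WASD to move,
// mouse to look. A sketch of the setup, not the exact thesis code.
public class PrototypeController : MonoBehaviour
{
    public float moveSpeed = 3f;       // meters per second (assumed)
    public float lookSensitivity = 2f; // degrees per mouse unit (assumed)
    float pitch;

    void Update()
    {
        // WASD / arrow keys map to the default Horizontal and Vertical axes.
        Vector3 input = new Vector3(Input.GetAxis("Horizontal"), 0f,
                                    Input.GetAxis("Vertical"));
        transform.Translate(input * moveSpeed * Time.deltaTime, Space.Self);

        // Mouse look: yaw the body, pitch the camera.
        transform.Rotate(0f, Input.GetAxis("Mouse X") * lookSensitivity, 0f);
        pitch = Mathf.Clamp(pitch - Input.GetAxis("Mouse Y") * lookSensitivity, -80f, 80f);
        Camera.main.transform.localEulerAngles = new Vector3(pitch, 0f, 0f);
    }
}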

Cumulus

I applied different audio effects and behaviors to interrogate the emergent qualities that an immediate collection of Audio Notes might manifest.
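As a hedged sketch of how one such collective behavior could be prototyped: each note raises its reverb mix as more notes cluster around it, so denser groups sound more diffuse. The 5 m radius and the mix curve are assumptions for illustration, not values from the study.

using UnityEngine;

[RequireComponent(typeof(AudioSource))]
public class CumulusNote : MonoBehaviour
{
    public float clusterRadius = 5f; // assumed neighborhood size
    AudioSource source;

    void Start() { source = GetComponent<AudioSource>(); }

    void Update()
    {
        // Count sibling notes inside the cluster radius.
        int neighbors = 0;
        foreach (var note in FindObjectsOfType<CumulusNote>())
            if (note != this &&
                Vector3.Distance(note.transform.position, transform.position) < clusterRadius)
                neighbors++;

        // Denser clusters sound more diffuse and cloud-like.
        source.reverbZoneMix = Mathf.Clamp01(neighbors / 10f);
    }
}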



Stratus

To assist user navigation, audio was binaurally recorded and played back. Each Audio Note contained a beacon sound that dynamically adjusted to the User's location in the space.
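A minimal sketch of such a beacon in Unity, assuming a looping, fully spatialized AudioSource whose pitch rises as the listener approaches; the range and pitch values are illustrative.

using UnityEngine;

[RequireComponent(typeof(AudioSource))]
public class AudioNoteBeacon : MonoBehaviour
{
    public Transform listener;   // the User's head / camera
    public float maxRange = 20f; // beacon audible within this radius (assumed)
    AudioSource beacon;

    void Start()
    {
        beacon = GetComponent<AudioSource>();
        beacon.loop = true;
        beacon.spatialBlend = 1f; // fully 3D so panning and distance cues apply
        beacon.Play();
    }

    void Update()
    {
        float distance = Vector3.Distance(listener.position, transform.position);
        // Nearer notes tick at a higher pitch: a simple proximity cue.
        beacon.pitch = Mathf.Lerp(1.5f, 0.75f, Mathf.Clamp01(distance / maxRange));
    }
}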

Cirrus

By prototyping the concept in an interactive simulated environment, I was able to move beyond conversations and loose ideas and consider a holistic, novel interaction paradigm.

Nimbus

Testing

Three volunteers interacted with each cloud type via the interactive Unity Scenes and provided feedback on the user experience, visual language and immersive qualities.

My classmate, Noah, interacting with the Stratus cloud study using WASD keyboard controls, a mouse and headphones for binaural playback.

The feedback was collected via a Google Form and synthesized into the following six comments/questions.

Insights

From the extensive primary and secondary research, along with feedback from user testing and my thesis advisors, I synthesized the following three key insights:

Challenging UX with VoiceOver on iPhone

“Despite VoiceOver capabilities on my iPhone, most applications, including all social media applications, are nearly inaccessible. This severely impacts my ability to interact with others, unless I'm able to speak to them on the phone.”

Ryan Richards

1

Spatial Information Architecture

The techniques of Orientation and Mobility (O&M) enable the visually impaired to develop a mental model of a space based on its environmental sounds, empowering the individual to travel safely without relying on others.

2

Positive Feedback to Interaction Design Studies

The Stratus and Nimbus interaction design studies were the most effective at conveying the idea of networked audio recordings and emergent behaviors that might result from content-specific information and/or density.

3

UX Design

By placing the primary action button in the center of the screen, all Users can easily interact with the app's main function without needing to rely on VoiceOver.

Each Audio Note's beacon sound and audio information provides a mental model for the user’s surroundings, while facilitating connections with others.

The sketches were expanded to include actions for activating a spatial filter based on Natural Language Processing (NLP) and a toggle switch for Users to move between AR and Planar modes.

Regardless of mode, the UI and beacon sounds suggest to the User how the system interprets the Audio Note content and density in the vicinity, per the Nimbus and Cumulus interaction design studies (see above).

Recording an Audio Note

Listening to an Audio Note

Interacting with an Audio Note

Initiating a Spatial Filter

UI Design

One of the feedback notes from the interactive prototype—How can the metaphor of a cloud be better expressed visually and behaviorally?—instigated visual research into my own cloud forms and behaviors.

To differentiate my interpretation of a cloud, I experimented with an analog method of making clouds using a cloud tank, filming the cloud-like effect with my digital camera at 120 fps.

Of the numerous experiments I created with the cloud tank, one in particular captured the essence of a fluffy cloud.

A still from that footage became the app's background image, and I sampled many of its colors, with some tweaks, for use throughout the app.

#8698C7
#86ACC7
#374050

The cloud motif was applied throughout the app, including interactive animations and UI elements.

Audio Note creation animation

Drawing on the behavioral qualities observed in the cloud tank, I mimicked that circular, billowy feeling in the animation a User sees when creating their own Audio Note.

Soften visual treatment

For the visual treatment of most elements, including the menus, I feathered the hard edges to accentuate the soft feel of a cloud, while the typeface, Proxima Nova, suited the overall circular forms.

I structured the Audio Notes to make them identifiable and more meaningful for Users.

When these Audio Notes gather in one place, they form clusters that embody cloud-like, emergent characteristics.

A connecting line between Audio Notes indicates a conversation or response involving one or more people. The beacon sound helps orient Users to each Audio Note or group of Audio Notes, while the central node acts as the Audio Note's GPS anchor.

Final Design

A User can easily engage with a space full of Audio Notes without directly interacting with the application, guided to each Audio Note's location by listening for its spatial beacon sound through wireless earbuds.

Enjoy the presence and perspective of others

To listen to an Audio Note, the User simply walks toward its fixed location, and the recording automatically plays as spatialized audio.
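A hedged sketch of that auto-play behavior in Unity: the recording begins when the listener crosses a proximity threshold and stops when they leave. The 4 m trigger radius is an assumed value.

using UnityEngine;

[RequireComponent(typeof(AudioSource))]
public class AudioNotePlayback : MonoBehaviour
{
    public Transform listener;
    public float triggerRadius = 4f; // assumed proximity threshold
    AudioSource recording;

    void Start()
    {
        recording = GetComponent<AudioSource>();
        recording.spatialBlend = 1f; // fully spatialized playback
    }

    void Update()
    {
        bool inRange = Vector3.Distance(listener.position, transform.position) < triggerRadius;
        if (inRange && !recording.isPlaying)
            recording.Play(); // begin playback as the User arrives
        else if (!inRange && recording.isPlaying)
            recording.Stop(); // stop when the User walks away
    }
}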

Capture the moment with one's voice

To record an Audio Note, an individual User simply presses the central button.

Up to four Users can record a single Audio Note if they are within 3 ft of one another. In these instances, the recording spatializes each User's voice so that others may inhabit that conversation.
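The proximity rule could be checked as below; a sketch only, where the metric conversion of 3 ft and the pairwise-distance interpretation are my assumptions.

using System.Collections.Generic;
using UnityEngine;

public static class GroupRecording
{
    const int MaxParticipants = 4;
    const float MaxSeparationMeters = 0.91f; // ~3 ft

    // True if the group is small enough and mutually close enough
    // for everyone to share one Audio Note.
    public static bool CanRecordTogether(IReadOnlyList<Vector3> positions)
    {
        if (positions.Count < 1 || positions.Count > MaxParticipants)
            return false;
        for (int i = 0; i < positions.Count; i++)
            for (int j = i + 1; j < positions.Count; j++)
                if (Vector3.Distance(positions[i], positions[j]) > MaxSeparationMeters)
                    return false;
        return true;
    }
}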

Filter surroundings by topics

In denser spaces, the User can initiate a spatial filter based on the topics discussed in the Audio Note recordings.

This feature is made possible by the app's Natural Language Processing (NLP) capabilities.
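A minimal sketch of how such a filter might act on the scene, assuming each note already carries topic tags produced upstream by the NLP pipeline; the tagging itself is out of scope here, and the class and field names are illustrative.

using System.Collections.Generic;
using System.Linq;
using UnityEngine;

public class AudioNoteTopics : MonoBehaviour
{
    public List<string> topics; // e.g. "music", "food"; assigned upstream by NLP
}

public class SpatialFilter : MonoBehaviour
{
    // Show only the notes tagged with the requested topic; hide the rest.
    public void Apply(string topic)
    {
        foreach (var note in FindObjectsOfType<AudioNoteTopics>(true))
            note.gameObject.SetActive(
                note.topics.Any(t => t.Equals(topic, System.StringComparison.OrdinalIgnoreCase)));
    }
}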

Access details about a recorded Audio Note

Users can tap an individual Audio Note to view its metadata, such as the User who posted it and the number of plays and responses.

Delight in emergent Audio Note behavior

Based on the density of Audio Notes within a 24-hour timeframe, the system automatically switches between two states, expressed by a hue change in the application's background, purple vs. blue, and a distinct auditory experience.
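A sketch of that state switch, assuming a simple count threshold over the last 24 hours and mapping the two states onto the palette sampled earlier; the threshold of 20 notes and the exact color pairing are my assumptions.

using System;
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.UI;

public class DensityState : MonoBehaviour
{
    public Image background;
    public Color calmBlue    = new Color32(0x86, 0xAC, 0xC7, 0xFF); // #86ACC7 (assumed mapping)
    public Color denseViolet = new Color32(0x86, 0x98, 0xC7, 0xFF); // #8698C7 (assumed mapping)
    public int denseThreshold = 20; // assumed cutoff

    // Count the notes posted in the last 24 hours and pick the state.
    public void Refresh(IEnumerable<DateTime> noteTimestamps)
    {
        int recent = 0;
        foreach (var t in noteTimestamps)
            if (DateTime.UtcNow - t < TimeSpan.FromHours(24))
                recent++;

        background.color = recent >= denseThreshold ? denseViolet : calmBlue;
    }
}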



Furthermore, the experience can be observed and interacted with in a more immersive manner through the Mixed Reality mode.

Reflection

Fundamentally, Sonus offers a way to reunite our physical and social spaces, one that includes the varied accessibility needs of all probable users and acknowledges how context and wonder play a major role in socialization.

Next Steps
