iPhones host a robust library of accessibility features, hidden deep within an able-centric interface. My BFA thesis addresses this issue and explores the difference between accessibility and universal design.
research, task analysis, UX design
Growing up with a disabled parent, I've developed a keen eye for potential obstacles and inaccessible environments. This, plus my newly discovered interest in UX design, led me to ask, "How do people with disabilities use smartphones?" Tackling this challenge meant straying from the expertise of my Graphic Design BFA professors and relying on research and the design process.
I surveyed the landscape of smartphones and found many incredible accessibility features that make smartphone use possible. However, they were all buried deep in the settings and were... well, inaccessible. I focused on the lowest-hanging fruit and centered my BFA thesis on rearchitecting the Apple onboarding workflow to be as universally designed as possible, giving disabled users a chance to independently access and turn on these robust accessibility features.
I first called a local occupational therapist to find out if and how patients were adapting to the physical challenges of using their phones with their injury or disability. She told me that none of her patients used assistive technology; they either struggled through on their own or asked others for help, and most didn't know about the many accessibility features available on smartphones.
I then mapped out the current (iOS 12) iPhone onboarding user journeys and found some challenges:
I didn't have time to tackle the physical challenges of unboxing and installing hardware in the phone itself, so I focused on revising the onboarding flow. In my proposed model, each step anticipates a possible need in the step that follows. This lets users iteratively configure more complex options using the basic options they configured earlier in the process. I made sure to incorporate functionality that already existed within iOS so that these revisions were realistic and achievable in the short term.
This particular chart outlines a possible workflow of a user with individual, specific needs.
I prototyped the updated model focusing on a user with a touch-related disability. This example shows how turning on features like Siri by default can dramatically change a user's ability to understand and interact with the options presented on the screen.
You can watch the full prototype here.
My completed project was selected to be displayed at the 2019 Graduating Student Exhibition at the Museum at FIT. This work was also accompanied by a new iPhone manual, outlining the revisions made and why they’re important for more inclusive user experiences. I wanted viewers to walk away understanding that processes that seem simple can actually be very complicated for others.
Researching how people with disabilities experience technology has made accessibility a priority in my work. Now, working as a user experience designer at a large software company, I still see much room for further exploration and improvement in this project.
How can we make the physical steps of this process more accessible? What happens if the user isn't an English speaker? Can we as designers and producers do more to ease the experience of a first-time user?