While at Lexmark, I worked on several different products. Some saw the light of day, and some did not. Mobile Print is one project that made it out the door. The goals of this project were to give mobile users more power in their pocket to scan to and print from the cloud, to determine how a user might need to interact with the printer, to explore proximity sensing, and to give the app an overall UX facelift. Several minor concepts (scan adjustments, print page range, and image previews, to name a few) could have made this app gold, but developmental limitations (such as schedule) prevented many of them from being implemented. Note: While I wasn't the only designer to work on the app, I was responsible for formalizing its UX design, building the prototypes, and creating the storyboards. The app has continued to be iterated upon since I left Lexmark.
Looking at the original app:
The screens above are from the original application. It was a UX nightmare, but when something works, people will often overlook awkward UX. The main problem was that this app often didn't work. Knowing that we had a lot of work ahead of us, the mobile UX team began the research phase. We reached out to the product owners and the previous developers to understand what they knew when they initially built the app. We probed for what we could remove, fix, and add to the experience to make this application sing.
After discussing the ins and outs with the previous team, we broke down the current app's workflow to figure out what was extraneous, broken, or right. Without a direct line to existing users, we relied on app-store reviews (iOS and Android), Google searches, competitive research, and feedback marketing had heard from the field.
Pulling out what bits and pieces of UI/UX critique we could, we set to work. One of our main priorities was to know which toolset we had to use to build Mobile Print. The development platform changed three times, and each change altered the types of interactions and micro-interactions we could use. Also, since our wireframes needed to reflect the toolkit (for internal reasons), we had to rebuild those wires multiple times, which was not the most productive workflow.
After exploring two frameworks, Bootstrap and Lexmark's Perceptive, we finally settled on native apps. A native app allowed us to hook into the system more efficiently: the user wouldn't have to deal with the odd discrepancies that can come with web frameworks, and the app would feel natural because it spoke a system language the user was already familiar with. Once we locked down the intent to use native components and folded in the scanty user input we had gathered, we were able to move forward toward a more finalized design and direction. We iterated through section-specific workflows and determined the primary patterns the app would use.
Several questions we had to answer throughout the project were:
- Why does a user need this app?
- What is the process a new user should experience versus a preexisting user?
- How do we let a new user understand what the app is for?
- How do we hint at progress and intended direction?
- What advantages can we give the user in native apps versus a web-only app?
There are many more questions to explore, and we did, but the point is: understanding UX is about asking the right questions.
Since we chose native apps and knew we could tap into the system better, we wanted to build extensions/plug-ins that would allow the user to scan to and print from the cloud from within any app on their phone. The user would now be able to access OS extensions (which both iOS and Android have), but they would need to enable these extensions and grant them access to the system.
- How do we inform the user they will need to enable the extension?
- How do we tell them of the benefit?
- Is a wizard the proper flow to help them accomplish this task?
- How do we help them to remember it later?
Onboarding can be tricky. Done wrong, it leaves the user feeling as though they have been swiping forever. We decided to test a few simple screens.
Once we had the ideas mapped out better, we worked our way through each of the onboarding workflows, discussing the possibilities and limitations with the dev team and the product owner. Can we add animation to the onboarding? Can we have an interaction trigger the animation? We'd seen it done, but could our team do it?
After testing the onboarding and several other workflows, we began to build rapid prototypes to test the experience so we could iterate to an MVP design that was consistent with user expectation and one which would help the user understand what they needed.
We were able to work out most of the problem areas for an MVP through iteration and user testing. While not foolproof, the rapid prototypes were essential to understanding the pitfalls a user might encounter. They helped us clean up the user-facing language, slim the flows, and refine the interface. If we needed to clarify a specific flow, we would build a rapid prototype and then refine it into a more fully functioning one.
My belief as a User Experience designer is that all designs should be rapidly prototyped. I should be experiencing the designs I create, not just producing lifeless wireframes and wireflows. When I experience a design, I better understand the needs and problems a user will have. Challenges and successes make more sense than they do in static designs.
In addition to the designs above, we developed storyboards (really, beat boards) to explain and experience a possible growth function for the application. Our goal was for the user to accomplish their tasks and projects more quickly and with more flexibility. The theory was that if the printer could "sense" the user was near and could ping them to release their prints, they wouldn't have to use the printer interface or return to their desk. With a simple swipe on their phone, the user could release the prints. This concept allowed us to leverage the mobile phone to provide an overall smoother and faster experience.
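As a thought experiment, the proximity-release flow described above can be sketched as a tiny state machine: the printer senses the user in range, pings the phone once, and a single swipe releases all held jobs. Everything below is hypothetical; names like `PrinterSession`, the three-meter range, and the prompt-once behavior are illustrative assumptions, not how the shipped product worked.

```python
# Hypothetical sketch of the proximity-release concept.
# All names, thresholds, and behaviors are illustrative assumptions.
from dataclasses import dataclass, field
from typing import List, Optional

RELEASE_RANGE_METERS = 3.0  # assumed distance at which the printer "senses" the user


@dataclass
class PrintJob:
    title: str
    released: bool = False


@dataclass
class PrinterSession:
    held_jobs: List[PrintJob] = field(default_factory=list)
    prompted: bool = False

    def on_distance_update(self, meters: float) -> Optional[str]:
        """When the user first comes into range, ping the phone to offer release."""
        if meters <= RELEASE_RANGE_METERS and self.held_jobs and not self.prompted:
            self.prompted = True
            return f"Release {len(self.held_jobs)} held print(s)? Swipe to confirm."
        return None  # out of range, nothing held, or already prompted

    def on_swipe_confirm(self) -> List[PrintJob]:
        """A single swipe on the phone releases every held job."""
        for job in self.held_jobs:
            job.released = True
        released, self.held_jobs = self.held_jobs, []
        return released


session = PrinterSession(held_jobs=[PrintJob("report.pdf"), PrintJob("slides.pdf")])
session.on_distance_update(10.0)            # too far away: no prompt
prompt = session.on_distance_update(2.0)    # in range: user is pinged once
released = session.on_swipe_confirm()       # one swipe releases both jobs
```

The prompt-once flag mirrors the UX concern in the storyboards: the user should be nudged as they approach, not nagged every time the distance estimate updates.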
We explored several user scenarios and mapped them out in a beat board fashion to visualize the user within their environment. This visualization helps tell the story of why the function or flow works or fails.
These beat boards are similar to rapid prototypes. They provide useful, visual insight into the scenarios. A manager or product owner usually can't envision our concepts from written scenarios alone; rapid beat boards paint a better picture. They can also help with workflow and functional requirements.
Here are the screens of the final application.
Below is a hinting animation for enabling extensions; I storyboarded it but didn't create the final animation. I suggested this direction to help the user understand how to enable the iOS extension, since newly installed extensions are disabled by default. This animation was never used. The original vision was that the user would tap the "Set up Sharing" button (image two, first row), and then the animation would begin playing. (Note: the original animation played at a slightly slower rate. This is an example animated GIF.)
As a User Experience designer and illustrator, I believe that all of UX is about storytelling. Our goal is to help users be the heroes of their own stories and guide them through their immediate needs without stumbling. While no application or user experience is perfect, we can help the user overcome inconsistencies and oddities simply by helping them be heroic in their day-to-day. That was my goal with this app: make it the best way for a user to interact with their printer and cloud printing, so they feel they have succeeded in their task. If we remove one difficult task from their day, then we have designed well.