Summary: While at Lexmark, I worked on a number of products. Some saw the light of day and some did not. Mobile Print is one project that made it out the door. The goals of this project were to give the mobile user more power in their pocket to scan to and print from the cloud, determine how a user might need to interact with the printer, explore proximity sensing, and give the app an overall UX facelift. There were several minor concepts (scan adjustments, print page range, and image previews, to name a few) that could have made this app gold, but development constraints (such as schedule) prevented many of them from being implemented. Note: While I wasn't the only designer to work on it, I was the person responsible for formalizing its UX design, building the prototypes, and creating the storyboards. The app has continued to be iterated upon since I left Lexmark.

Looking at the original app:

The screens above are from the original application. It was a UX nightmare, but when something works, people will often overlook the awkward UX. The main problem was that this app often didn't work. Knowing we had a lot of work ahead of us, the mobile UX team began a research phase. We reached out to the product owners and the previous developers to understand what they knew going into building the app. Then we probed for more information on what we could remove, fix, and add to the experience to make this application sing.

After discussing the ins and outs with the previous team, we broke down the current app's workflow to figure out what was extraneous, what worked, and what didn't. Not having a direct line to current users, we relied on app store reviews (iOS and Android), Google searches, competitive research, and feedback marketing had heard from the field.

Pulling out the bits and pieces of UI/UX critique that we could, we set to work. One of our main priorities was knowing which toolset we had to use to build Mobile Print. The development platform changed three times, which changed the types of interactions and micro-interactions we could use. And since our wireframes needed to reflect the toolkit (for internal reasons), we had to rebuild those wires multiple times. Not the most productive workflow.

After trying two frameworks, Bootstrap and Lexmark's Perceptive, we finally settled on native apps. We chose native because it allowed us to hook into the system more easily, spared the user the odd discrepancies that can come with web frameworks, and felt natural, using a system language the user already knew. Once we locked down the intent to use native components, and taking what scanty user input we had, we moved toward a more finalized design and direction. We iterated through section-specific workflows and determined the primary patterns the app would use.

What's the process a new user should experience versus an existing user? How do we let a new user understand what the app is for? How do we hint at progress and intended direction? What advantages can we give the user in a native app versus a web-only app? There are many more questions to ask, but the point is: understanding UX is about asking the right questions.

Since we had gone native and knew we could tap into the system better, we wanted to add extensions/plug-ins that would let the user scan to and print from the cloud from within any app. The user would now be able to access OS extensions (which both iOS and Android have), but they would need to enable those extensions and allow them access to the system. How do you inform the user they will need to do this? How do you inform them of the benefit? Is a wizard the proper flow? Onboarding can be tricky: done wrong, the user feels like they have been swiping forever. We decided to test a few simple screens.
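For a sense of what that hook looks like under the hood on iOS: the sketch below is a hypothetical Share-extension entry point, not our actual code. The class name and the sendToCloudPrintQueue step are invented for illustration; it simply receives a PDF from the host app and hands it off for printing.

```swift
import UIKit
import UniformTypeIdentifiers

// Minimal sketch of an iOS Share-extension view controller that accepts a
// PDF from the host app and hands it off for cloud printing.
class PrintShareViewController: UIViewController {

    override func viewDidLoad() {
        super.viewDidLoad()

        // The host app delivers shared content as NSExtensionItems, each
        // carrying one or more NSItemProvider attachments.
        guard
            let item = extensionContext?.inputItems.first as? NSExtensionItem,
            let provider = item.attachments?.first,
            provider.hasItemConformingToTypeIdentifier(UTType.pdf.identifier)
        else {
            extensionContext?.cancelRequest(withError: CocoaError(.userCancelled))
            return
        }

        // Load the shared PDF, queue it for printing, then dismiss the extension.
        provider.loadItem(forTypeIdentifier: UTType.pdf.identifier, options: nil) { pdf, _ in
            // sendToCloudPrintQueue(pdf)  // placeholder for the app's upload step
            self.extensionContext?.completeRequest(returningItems: nil, completionHandler: nil)
        }
    }
}
```

Android's rough analogue would be an intent-filter share target or a print service, and in both ecosystems the user still has to grant or enable the hook, which is exactly the onboarding problem described above.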

Once we had the ideas mapped out better, we worked our way through each of the onboarding workflows, discussing the possibilities and limitations with the dev team and the product owner. Could we add animation to the onboarding? Could we have an interaction trigger the animation? We'd seen it done, but could our team do it?

After testing the onboarding and several other workflows, we began building rapid prototypes to test the experience, iterating toward an MVP design that was consistent with user expectations and would help the user understand what they needed.

Scan to Phone: https://xd.adobe.com/view/1f9ccbff-acd2-4aed-be95-61bdb480d46f/

Android: https://xd.adobe.com/view/d9b98d48-6e05-4270-9677-27df1cf5c788/ 

Through iteration and user testing we were able to iron out most of the problem areas for an MVP. The rapid prototypes, while not foolproof, were essential to understanding the pitfalls a user might encounter. They helped us clean up the verbiage, slim the flows, and refine the interface. If we needed to clarify a specific flow, we would build a rapid prototype and then refine it into a more fully functioning prototype as needed.

My belief is that, as a User Experience designer, all designs should be rapidly prototyped. I should be experiencing the designs I create, not just producing lifeless wireframes and wireflows. When I experience a design, I better understand the needs and problems a user will have. Things click in a way they won't from static designs.

In addition to the designs above, we developed storyboards (really, beat boards) to grow the possible functions of the application. Our goal was for the user to accomplish their tasks and projects more quickly and with more flexibility. If the printer could "sense" the user was near and ping them to release their print, they wouldn't have to use the printer interface or return to their desk; with a simple swipe on their phone, they could release the prints. This let us leverage the mobile phone to provide an overall easier and faster experience.
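Proximity sensing like this is commonly built on Bluetooth beacon ranging. As a hypothetical sketch of the concept (not what shipped), here is roughly how an iOS client could detect that it is near a beacon-equipped printer; the UUID, identifier, and promptSwipeToRelease step are invented for illustration.

```swift
import CoreLocation

// Hypothetical sketch of "the printer senses the user is near" via iBeacon
// ranging. The UUID and the swipe-to-release prompt are placeholders.
class PrinterProximityMonitor: NSObject, CLLocationManagerDelegate {

    private let manager = CLLocationManager()
    // Example beacon identity; a real deployment would use the printer fleet's UUID.
    private let printerUUID = UUID(uuidString: "E2C56DB5-DFFB-48D2-B060-D0F5A71096E0")!

    func start() {
        manager.delegate = self
        manager.requestWhenInUseAuthorization()  // requires a usage string in Info.plist
        manager.startRangingBeacons(satisfying: CLBeaconIdentityConstraint(uuid: printerUUID))
    }

    // Called repeatedly with distance estimates for beacons in range.
    func locationManager(_ manager: CLLocationManager,
                         didRange beacons: [CLBeacon],
                         satisfying constraint: CLBeaconIdentityConstraint) {
        if beacons.contains(where: { $0.proximity == .near || $0.proximity == .immediate }) {
            // promptSwipeToRelease()  // placeholder: notify the user to swipe
            //                         // and release their held print job
        }
    }
}
```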

We explored several user scenarios and mapped them out in beat-board fashion so we could put the user in their environment. This helps tell the story of why a function or flow works.

These beat boards are similar to rapid prototypes: they give us visual insight into the scenarios. A manager or product owner can't always "see" what we intend them to understand when we describe a scenario in writing, so doing rapid beat boards helps them get a better picture. It can also help with workflow and function requirements.

Here are screens of the final application.

Here is an animation we suggested to help the user understand how to enable the iOS extension, since newly installed extensions are disabled by default. However, it was not used. The original vision was that the user would tap the "Set up Sharing" button (in image two, first row) and the animation would begin playing, at a slightly slower rate than this animated GIF plays here. (Note: I designed the flow for this, but I did not do the animation.)

As a User Experience designer and illustrator, I believe that all of UX is about storytelling. Our goal is to help the user be the hero of their story and guide them through their immediate needs without stumbling. While no application or user experience is perfect, we can aid the user in overcoming inconsistencies and oddities by simply helping them be heroic in their day-to-day. That was my goal with this app: make it the best way for a user to interact with their printer and cloud printing, so they feel successful in their task. If this is one less difficult task in their day, then we have designed well.


UX Mobile Print App Design

UX Hinting

UX Gestures

UX Storyboarding

Prototyping & Interaction Design
