Visual Feedback for Swipe Gesture

Several recently released smartphones, including Google's Pixel, have a touchless swipe gesture. They all rely on dedicated hardware: a camera, radar, or time-of-flight sensor. Elliptic Labs is also working on a touchless swipe gesture, using only ultrasound and the hardware that's already common across all smartphones. We wanted to make the experience as clear and seamless as possible, and needed a demo app for the OEM partners we work with. This experience was built for a specific phone from one of the world's largest smartphone manufacturers.

My Role: I was the product designer and product manager for this project. My main design task was to find a way to provide visual and haptic feedback for when the swipe gesture is available and when it succeeds or fails. This feedback needed to clearly guide the user toward more successful swipe gestures, without distracting from the rest of the experience.

Below is a video of the final experience. Keep scrolling to see the full process.
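
To make that feedback model concrete, here is a minimal Kotlin sketch of the kind of state-driven dispatcher it implies. The state names, vibration durations, and view wiring are illustrative assumptions for this write-up, not the shipped Elliptic Labs implementation.

import android.os.VibrationEffect
import android.os.Vibrator
import android.view.View

// Hypothetical states for the gesture feedback described above.
enum class GestureState { UNAVAILABLE, AVAILABLE, SUCCESS, FAILURE }

class GestureFeedback(private val indicator: View, private val haptics: Vibrator) {
    fun onStateChanged(state: GestureState) {
        when (state) {
            // Hide the cue entirely when the gesture can't be performed.
            GestureState.UNAVAILABLE -> indicator.visibility = View.INVISIBLE
            // A subtle, purely visual cue: the gesture is ready.
            GestureState.AVAILABLE -> indicator.visibility = View.VISIBLE
            // A short tap confirms a recognized swipe (durations are made up).
            GestureState.SUCCESS -> haptics.vibrate(
                VibrationEffect.createOneShot(30L, VibrationEffect.DEFAULT_AMPLITUDE))
            // A longer buzz signals a miss and a chance to retry.
            GestureState.FAILURE -> haptics.vibrate(
                VibrationEffect.createOneShot(80L, VibrationEffect.DEFAULT_AMPLITUDE))
        }
    }
}

Keeping the "available" cue purely visual and reserving haptics for success and failure is one way to signal outcomes without pulling attention away from the foreground app, which was the design constraint described above.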

Gesture User Testing

Overview: This was a user test I designed and led to help Elliptic Labs decide between three possible versions of a "zoom" gesture experience. I conducted these tests with the help of my interaction design intern at the time. We used this testing process to determine whether zoom was an experience we wanted to create, and what exactly that experience would look like. It also served to gauge public interest in this product within our target demographic.

What we were building was a touchless gesture experience that reused the hardware sensors already present in all smartphones. It allows a user to easily zoom in and out on their phone screen while using only one hand. We tested this gesture in a variety of applications, including photos and maps.

The specific test described here was for a photo experience: zooming in after taking a selfie. The task for the user was to perform a gesture to "move" the image closer to their face to zoom in, and to move it away from their face to zoom out. This is a natural way of mimicking how people already bring real-world objects closer to their face for a more detailed view.

How was the test conducted? The first round of user testing happened in California, on the campus of UC Berkeley. We specifically chose UC Berkeley because of its high concentration of people between the ages of 18 and 22, our target demographic. Users were recruited on campus, asked to fill out a demographic form, and then shown a demo. This was the demographic form we used.
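
The test task implies a simple mapping from face-to-phone distance to zoom level. Below is a hedged Kotlin sketch of that mapping, assuming an ultrasound distance reading in centimeters; the baseline distance and zoom bounds are made-up values for illustration, not measured parameters from the study.

// Assumed comfortable viewing distance; zoom is 1x when the phone sits here.
const val BASELINE_CM = 35f
const val MIN_ZOOM = 1f
const val MAX_ZOOM = 4f

// Closer than the baseline zooms in; farther than it zooms out,
// clamped so the image never shrinks below 1x or blows past 4x.
fun zoomFor(distanceCm: Float): Float =
    (BASELINE_CM / distanceCm).coerceIn(MIN_ZOOM, MAX_ZOOM)

Under these assumed values, holding the phone at 35 cm returns 1x, and halving the distance to about 17.5 cm returns 2x, which mirrors how bringing a real object twice as close roughly doubles its apparent size.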

Mobile Ecommerce Payment

An exploration of the payment process for mobile ecommerce. I researched what was and wasn't needed to complete a credit card transaction, along with what information a native app would already have about its users. This was created to show some of the ways the current mobile payment process could be simplified.

To create this clickable prototype, I started with user research on the pain points in mobile ecommerce today and how they might be eliminated. From there I sketched the information architecture (IA) of the app, followed by very low-fidelity sketched flows and wireframes. Finally, I produced a second round of low-fidelity wireframes before creating the visual design and the clickable prototype.
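
One way to picture the research outcome is to separate what a processor strictly needs to authorize a card-not-present charge from what a logged-in native app already knows. The Kotlin sketch below is illustrative only; the field names are assumptions, and actual requirements vary by payment processor.

// Roughly what a processor needs to authorize a card-not-present charge.
data class CardPayment(
    val cardNumber: String,   // the card's primary account number
    val expiryMonth: Int,
    val expiryYear: Int,
    val cvv: String,
    val billingZip: String?   // often optional, used for address verification
)

// What a logged-in native app typically already has, and never needs to ask for.
data class KnownUser(
    val name: String,
    val email: String,
    val shippingAddress: String
)

Everything in the second group can be prefilled, which is the core of the simplification this prototype explores: the checkout flow only has to collect the handful of card fields, not re-ask for information the app already holds.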