My first professional experience in the world of UX was at a hardware company in San Jose, CA called Synaptics. They design hardware components – focusing on displays, touch, and other biometric products – that are then integrated into PCs and mobile devices. The UX team at Synaptics sits within the Biometric Division, and in the summer of 2016 I was the usability intern there, working under two researchers.

My managers came from two different backgrounds, and together they introduced me to different facets of usability, especially in the realm of hardware and tangible products. While one manager had a background in engineering, the other had previously done research in biomechanics. Working with the two of them highlighted the unique challenges of building hardware products, where the physical demands of both the electronics and the human body limit the possible interactions.

For the three months that I was there, my managers did not set aside any specific projects for me; instead, they let me contribute to whatever work came their way. This way, aside from working on a variety of products, I became familiar with the pace and perspective of research in industry. Until this internship, I had only known academic research projects in cognitive science, which extend over multiple years and look to answer questions by finding statistically significant trends in the data. Unlike my lab experience, the studies I conducted at Synaptics took only a few weeks each, and focused more on identifying flaws in prototypes or personal preferences about a certain technology. By the end of the summer, I was pleasantly surprised to have completed four major studies in my three months there.

Facial Recognition as Secondary Authentication

My first study at Synaptics was an ongoing project that one of my managers let me collect data for as a way to get my feet wet. It was a competitive analysis of several facial recognition apps, focusing on the efficiency of the technology and its use as a form of security.

Methods

To investigate the effectiveness and efficiency of the apps, we recorded the time required for enrollment and verification under indoor and outdoor lighting conditions from four angles. Enrollment refers to the process of creating the original image stored as the ‘key’ that each subsequent verification attempt is compared to. Each application had a protocol for enrolling the user’s face, some of which involved rotating the head and making expressions to capture its nuances. After enrollment, subjects were asked to try to open the device or app by verifying their faces while we recorded the app’s accuracy and the time it took to process the scan. The subjects verified through each app multiple times, rotating to change the angle of lighting, and then repeating under sunnier conditions.
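The kind of summary we pulled from those timed trials can be sketched in a few lines of Python. The records, app names, and values below are purely illustrative placeholders, not the study’s actual data:

```python
from statistics import mean
from collections import defaultdict

# Hypothetical trial records: (app, lighting, angle, success, seconds).
trials = [
    ("AppA", "indoor", 0, True, 1.2),
    ("AppA", "outdoor", 0, False, 3.4),
    ("AppB", "indoor", 90, True, 0.9),
    ("AppB", "outdoor", 90, True, 2.1),
]

# Group verification times and outcomes by app and lighting condition.
times = defaultdict(list)
passes = defaultdict(list)
for app, lighting, angle, success, seconds in trials:
    key = (app, lighting)
    times[key].append(seconds)
    passes[key].append(success)

# Report mean verification time and success rate per condition.
for key in sorted(times):
    rate = mean(1 if p else 0 for p in passes[key])
    print(key, round(mean(times[key]), 2), f"{rate:.0%}")
```

With real data, comparing the indoor and outdoor rows per app surfaces exactly the lighting sensitivity the study was probing.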

[Image: diagram of the lab setup]
The lab setup. The subjects rotated through the different positions (denoted by numbers) to change angles, and the light was turned on to recreate sunny/outdoor conditions.

After trying each app, subjects were asked to rank the four by preference. Finally, I compiled a short interview protocol to identify subjects’ feelings about the effectiveness and security of facial recognition.

Results

While the usability test and ranking described above highlighted users’ preferred apps, the qualitative data we collected showed surprising trends. Many subjects came into the lab feeling confident about the effectiveness of facial recognition technology, but left disillusioned after seeing it in action. All four apps struggled in the outdoor lighting conditions, and even rejected some subjects during verification. When asked about this, subjects reasoned that if the apps could not handle bright light, they would struggle even more with changes to hair, makeup, glasses, or darker settings.

Furthermore, subjects felt that the enrollment process was not always thorough enough, as they would sometimes be rejected during verification. This led to concerns about the security of facial recognition, and subjects felt that it was best used as a secondary form of authentication. They did not feel safe using it as the only security measure, especially for sensitive information such as in banking or email applications.

Trackpad Competitive Analysis

The next study I conducted at Synaptics centered on their trackpad products and how they compared to a competitor’s. The study was conducted using three laptops – two with Synaptics trackpads, and one with the competitor’s. I have not included the results of this study, as they pertain directly to Synaptics products.

Methods

Subjects were asked to sit at each laptop (the order of laptops was randomized across subjects) and complete a set of tasks that tested three main interaction types:

  • Single-point contact: subjects saw a point on the screen that they had to click on. This tested the precision users had with each trackpad.
  • Zooming or pinching motions: subjects saw two arrows pointing outward or inward, indicating that two fingers should be used to expand or pinch at a certain angle. This tested how effectively each trackpad processed simultaneous contacts.
  • Typing: subjects saw some text to type. This tested how well each trackpad rejected incidental contact near the space bar caused by the bases of the users’ thumbs.

Two Fingerprint Sensor Integration Studies (Patent Pending)

My last two studies focused on new Synaptics products that were being used in their clients’ devices. Both involved addressing the usability challenges of integrating fingerprint sensors beneath the glass (topmost surface) of trackpads and cellphones. This was an interesting problem because there are several design benefits to having invisible sensors that are only noticeable when necessary. Such sensors give back large portions of real estate on a device’s surface and allow for larger UIs. But they also introduce a new usability challenge, as users now have to interact with sensors that cannot be seen or felt. These studies revolved around finding new ways to enable interaction between users and fingerprint sensors. I designed experimental UIs for these studies with JavaScript, HTML, CSS, and Android Studio.

For the first study, we tried to address this question for Synaptics’ SecurePad, a trackpad with an integrated fingerprint sensor. HP was using SecurePads in its laptops and wanted to embed additional LEDs that would light up and flicker to indicate the location of the sensor, as well as authentication status. We worked with a team there and conducted a study to identify the most user-friendly arrangement and behavior for these LEDs.

The second study focused on Samsung’s upcoming (at the time) Galaxy S8 cellphone, and is part of work currently wrapped up in the IP process at Synaptics. For this study, I worked with my managers to come up with the experimental protocol, as well as scripts for the subsequent data analysis. In this study, we investigated how UI visuals presented during the fingerprint enrollment process affected the types of contact users made. Effective fingerprint enrollment involves maximizing the area of the finger exposed to the sensor. But when the sensor is invisible to the user, this becomes much more challenging. This study was meant to identify visual cues that could elicit contact from various regions of the finger. To model the contact made by subjects during the study, I used Python’s SciPy library.
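My actual SciPy analysis scripts are tied up in the IP process, but the underlying idea of measuring how much of the finger a series of enrollment touches exposes can be sketched with a simple grid model. The grid size and touch data here are hypothetical stand-ins:

```python
# Discretize the fingertip into a hypothetical 8x8 grid of regions and
# measure what fraction of it a series of enrollment taps covers.
GRID = 8

def coverage(touches):
    """touches: list of sets of (row, col) cells contacted in each tap.
    Returns the fraction of the fingertip grid covered across all taps."""
    covered = set()
    for tap in touches:
        covered |= tap
    return len(covered) / (GRID * GRID)

# Two illustrative taps that overlap in a 2x2 block of cells.
tap1 = {(r, c) for r in range(0, 4) for c in range(0, 4)}  # 16 cells
tap2 = {(r, c) for r in range(2, 6) for c in range(2, 6)}  # 16 cells, 4 shared
print(coverage([tap1, tap2]))  # 28 of 64 cells covered
```

A visual cue that nudges users to touch from new angles should push this cumulative coverage higher per tap than repeated touches from the same region.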
