Preventing $1.38M in Electrostatic Damages

Did you know that the same harmless static shock you create by rubbing your feet against a carpet can ruin the small but powerful electrical components used in space structures?

At an aerospace manufacturer, I led a project that modernized the procedural safeguards used to prevent static damage to electrical parts. This case study walks through the project's design process and highlights some technical and personal UX learnings.

I have omitted & obfuscated confidential information in this case study. All information in this case study is my own and does not necessarily reflect the views of the employer.

The Mission

My team and I had to design a way to communicate temperature & relative humidity data, and alert on it, to the employees on the manufacturing floor in the form of a live dashboard, shown on six digital signage touchscreens placed throughout the area.

Main Goals:

  1. Empower manufacturing floor personnel to maintain optimal environmental conditions
  2. Replace the standard practice of relying on paper recorders

My Role

I saw this four-month-long project to completion as the project lead and lead UX designer, and worked alongside three software developers, our client, and our end users. (This case study will focus on my design contributions, but feel free to contact me to learn more about my contributions as a project lead!)

I owned the research, design, and testing phases of this project.

Setting the Scene

Quick Science Lesson 🔬

That carpet static shock mentioned earlier is possible through a phenomenon called electrostatic discharge (ESD): a sudden flow of electricity between two objects with different electrical charges. ESD is a huge concern on manufacturing floors that assemble sensitive electronic components, because the discharge can damage parts and lead to malfunctions or failure.

Out with the Old, in with the New

One of many specific measures to reduce the risk of electrostatic discharge is to track & maintain temperature and relative humidity (T&RH) conditions. After relying on paper recorders for over 30 years to monitor T&RH, one manufacturing team introduced digital sensors. The new technology gathered data into a server, but there was no way to communicate those conditions to the 150+ people working on the floor. This created a need to surface that data.

The Discovery

I kicked off the project by holding discovery interviews with our client, one of the department managers of the manufacturing floor. Over several meetings, we built an understanding of the space, the main and secondary users, and the user needs for the dashboard.

The Space

The manufacturing floor was sectioned into three zones for the three teams (~50 people each) that worked there. One zone manager supported each team of technicians, who made up the majority of the employees on the floor. Each zone had two digital signage screens meant to display relevant information to the employees on the floor.

Main Users

Technicians: Responsible for assembling and testing electronic components. Suboptimal conditions can impact multiple technicians in an area and the quality of the components being built. Technicians are collectively responsible for taking appropriate action when thresholds are breached, but have had to rely on outdated paper recorders for real-time information.

ā€Zone Managers: Responsible for the productivity of technicians and the quality of their zoneā€™s production. Theyā€™re very concerned with collective environmental conditions which threaten component quality. They have access to the updated sensor data, but could not easily communicate breaches to their technicians. Zone Managers are responsible when thresholds are breached.

Secondary Users

Quality Assurance Personnel: Periodically audit the floor to check for process compliance, including temperature & relative humidity protocols.

Engineers: Work alongside the technicians but aren't on the manufacturing floor all the time. They design the parts that the technicians assemble and understand the importance of the environmental conditions. They are not directly responsible for addressing environmental issues.

Looking Back

We started ideating with very little information on our users. We depended on the client to communicate user needs via video call rather than going to the users themselves on the manufacturing floor. As a result, our initial designs did not address our users' problems. The "Usability Testing" section below details how pushing for usability testing saved the project from this error.

With a stronger UX foundation, I now know the importance of conducting thorough user research. For a recent example of my learnings around personas, check out The Sweetest Makeover case study. Lessons learned! 😉

The Limitations

Below are the main constraints we worked with when designing our solution to communicate temperature and relative humidity data.

  • Changing Old Habits: The solution would be displayed on recently installed interactive screens, meaning 150+ users would be adopting a completely new workflow to get the data they need. Our goal was to make this transition as seamless as possible.
  • Time > Quality: We had to work within the skillsets of our developers to meet the four-month deadline. That meant more complex features would have to wait until the next version of the product.
  • Physical Factors: There were only six 55" screens evenly spread across an incredibly large manufacturing floor filled with rows of tall workstations. Our design had to account for visibility from farther distances.

Digital Signage Best Practices

Drawing on lessons learned from previous digital signage projects like Navigating Success, our team also followed these guidelines (a sketch of how they might be encoded follows the list):

  1. Text has to be large (min 24pt), legible (sans serif), and simple (keep it short)
  2. Interactive elements have to be within reach (located in the bottom half of the screen, and limited to finger taps and swipes)
  3. Screen elements have to accommodate a center focal point (i.e., when users are interacting at arm's length, the far corners of the screen are out of view)
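For illustration only, here is a minimal sketch of how these guidelines could be encoded as front-end design tokens. The case study doesn't describe the actual tech stack, so TypeScript and every value beyond the 24pt minimum and the bottom-half touch zone are assumptions.

```typescript
// Hypothetical design tokens encoding the three signage guidelines above.
// Only the 24pt minimum and the bottom-half touch zone come from the
// guidelines; the remaining values are illustrative assumptions.
export const SIGNAGE_GUIDELINES = {
  // 1. Large, legible, simple text.
  text: {
    minFontSizePt: 24,
    fontFamily: "sans-serif",
  },
  // 2. Interactive elements within reach: bottom half of the screen,
  //    limited to finger taps and swipes.
  interaction: {
    touchZone: { top: 0.5, bottom: 1.0 }, // fractions of screen height
    allowedGestures: ["tap", "swipe"] as const,
  },
  // 3. Accommodate a center focal point: keep key content away from the
  //    far corners, which fall out of view at arm's length.
  layout: {
    focalSafeInset: 0.15, // assumed fraction of each screen edge to avoid
  },
};
```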

Initial Ideation

Because we had so little information, many of our initial features were based on our client's requests. We invited our developer team to ideate with us to make sure the solution could be built within the four-month timeframe.

Features

  • Sensor Reading & Map: Displays the real-time T&RH reading for each sensor, positioned to match the sensor's physical location in the area. Our client requested a bird's-eye view of environmental conditions.
  • Alerts: Triggered when a temperature or relative humidity reading crosses a threshold, and persists until the reading returns to the acceptable range. Historically, users relied on audio alerts from the paper recorders to know when conditions were bad; this feature recreates that behavior. (A rough sketch of this threshold logic follows the list.)
  • Trendline Graphs: Show historical data for every sensor within a given time range. Managers wanted to anticipate problems and take preventative measures before conditions got bad.
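As promised above, here is a minimal sketch of the alert behavior: an alert is raised when a reading crosses a threshold and cleared only once the reading is back in the acceptable range. The reading shape, limit values, and function names are assumptions for illustration, not the product's actual implementation, and TypeScript is an assumed stack.

```typescript
// Illustrative sketch of the alert behavior: raise an alert when a reading
// leaves the acceptable range, and keep it active until the reading is
// back "in the green". All names and limits are assumptions.
interface SensorReading {
  sensorId: string;
  temperatureC: number;
  relativeHumidityPct: number;
  timestamp: Date;
}

// Hypothetical acceptable ranges; the real ESD-control limits would come
// from the manufacturer's process documentation.
const LIMITS = {
  temperatureC: { min: 18, max: 24 },
  relativeHumidityPct: { min: 30, max: 70 },
};

const activeAlerts = new Set<string>(); // sensor IDs currently alerting

function inRange(value: number, limit: { min: number; max: number }): boolean {
  return value >= limit.min && value <= limit.max;
}

function updateAlerts(reading: SensorReading): void {
  const ok =
    inRange(reading.temperatureC, LIMITS.temperatureC) &&
    inRange(reading.relativeHumidityPct, LIMITS.relativeHumidityPct);

  if (!ok) {
    activeAlerts.add(reading.sensorId); // threshold crossed: alert stays on
  } else {
    activeAlerts.delete(reading.sensorId); // back in the green: alert clears
  }
}
```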

Usability Testing ⚠️

💡 TL;DR

We introduced usability testing for the first time, and the results completely debunked our original assumptions. We went back to the drawing board and created a solution that met the actual needs of our users.

Recognizing the danger of developing the product without first getting feedback from our users, I successfully proposed usability testing with the main and secondary users on the manufacturing floor. I worked with the team to reevaluate the timeline so we could still meet our deadline.

This was my first time attempting a usability test, so I consulted a UX researcher, who helped me design the questions and structure. The test aimed to gather individual and group feedback within a short period of time.

At the manufacturing floor, we brought together five testers, who represented the main and secondary users: a technician, a quality inspector, a test engineer, and two zone managers. The initial design was presented as a prototype on one of the digital signage screens, and we provided some questions for them to silently record their thoughts and impressions of the design at first glance. Then, we had each individual privately interact with the prototype based on a set prompt and recorded their behaviors. Finally, we brought these users in for a group discussion to share their impressions.

The results from the usability test (which cannot be shown) greatly surprised our team and our client. We had falsely assumed that technicians would prioritize exact point-in-time readings; instead, they simply needed to know whether conditions were right for working on components. We had also assumed the trendlines would only be relevant to zone managers; however, the managers hoped to create a system of ownership in which the entire manufacturing floor could collectively monitor environmental conditions and take preventative measures.

What We Know Now

Below are the main user personas, which reflect the updated knowledge we gained from usability testing and were created post-project for your convenience. We didn't know what personas were at the time, so the final ideation was guided by asking questions more diligently from our users' perspectives (e.g., Would a technician understand what this means? Does this meet the zone manager's needs?)

The Main Changes

  • Average Gauges: This feature, requested by our client, communicated the real-time average, upper bound, and lower bound of temperature and relative humidity for all sensors in a zone.
  • Sensor List: We placed live data in a list rather than on a map. Any reading that crossed a threshold moved to the top of the list and was highlighted in the color matching its urgency. A "Map" button opens a window with a map to help users match a sensor's name to its physical location in the area. (A rough sketch of the gauge-averaging and list-sorting logic follows this list.)
  • Leveling Up Trendlines: Added three toggles, "Week," "Day," and "Hour," so users could view trends at different time intervals. Users could tap anywhere on the lines to identify the sensor, the reading, and the time it was recorded. We also increased the size of the trendline graphs for higher visibility.
  • 3 Dashboards & 1 Menu: Rather than showing data for the entire manufacturing floor on one page, we separated the data into the floor's three zones. A user can switch to any zone via the navigation menu on the left.
  • Brighter Alerts: Alert notifications were redesigned to be bigger and more noticeable, to better catch a user's eye during their busy day-to-day. The alerts cover the average gauges, since average information isn't relevant when thresholds are crossed.
  • Info Button: To help new users trust the data they saw and learn how to use the dashboard, we added a pop-up information window that described each feature in detail and listed whom to contact with questions, comments, or concerns.
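As referenced above, here is a minimal sketch of the averaging behind the average gauges and the sorting behind the sensor list. The thresholds, urgency levels, and function names are assumptions for illustration, not the team's actual code; the same pattern would apply to temperature as well as relative humidity.

```typescript
// Illustrative sketch of the average gauges and the urgency-sorted sensor
// list. Thresholds and the urgency scheme are assumptions.
type Urgency = "ok" | "warning" | "critical";

interface ZoneReading {
  sensorId: string;
  relativeHumidityPct: number; // the same idea applies to temperature
}

// Average gauge: real-time average plus the highest and lowest readings
// across all sensors in a zone (assumes at least one reading per zone).
function zoneGauge(readings: ZoneReading[]) {
  const values = readings.map((r) => r.relativeHumidityPct);
  const sum = values.reduce((a, b) => a + b, 0);
  return {
    average: sum / values.length,
    upperBound: Math.max(...values),
    lowerBound: Math.min(...values),
  };
}

// Hypothetical urgency classification against assumed thresholds.
function urgencyOf(rh: number): Urgency {
  if (rh < 30 || rh > 70) return "critical";
  if (rh < 35 || rh > 65) return "warning";
  return "ok";
}

// Sensor list: readings that breach a threshold float to the top so they
// can be highlighted in their urgency color.
function sortForDisplay(readings: ZoneReading[]): ZoneReading[] {
  const rank: Record<Urgency, number> = { critical: 0, warning: 1, ok: 2 };
  return [...readings].sort(
    (a, b) =>
      rank[urgencyOf(a.relativeHumidityPct)] -
      rank[urgencyOf(b.relativeHumidityPct)]
  );
}
```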

Final Product ✅

A second round of usability testing wasnā€™t possible due to time constraints, so we moved ahead with development. Once our developers completed the build, I organized and led the launch/implementation effort.

Launch & Test

💡 TL;DR

  • In-person launch doubled as casual usability testing
  • Technicians found the dashboards to be useful and had many suggestions on how to improve them
  • Zone managers enthusiastically adopted the dashboard in their responsibilities and communication with their team

I personally met with the technicians and managers of each zone, as well as the teams representing the secondary users, to demo the dashboard, gather live in-person feedback to inform future iterations, and identify any bugs. Rather than holding one mass meeting, I met with each group individually, which encouraged group discussion and helped me get to know many users by name.

I started each meeting by having a volunteer interact with the dashboard based on preset prompts, which acted as a proxy usability test. We took notes on their behavior to guide feature adjustments and future iterations. I ended each meeting by asking the group for live feedback and sharing a survey link to gather anonymous feedback. We got very favorable comments from zone managers and great suggestions from our technicians, highlighted below.

Technician Feedback Highlights

"It'll save space, and help us get rid of the paper recorders"
  • Found the information on the dashboard to be friendly and useful in their daily workflow
  • Some were relieved to get rid of the paper recorders and save some space on their workstations
  • Several requested a sound alert to accompany the visual alert, and the ability to mute it when necessary
  • Many were confused about what the average gauges represented: some thought they were a live reading from a single sensor, while others saw them as a static reference to the thresholds
  • Most had trouble tapping individual lines on the trendline graphs; the screens didn't allow for that level of touch accuracy.

Zone Manager Feedback Highlights

"It's been really good! I check it every morning, and it's easy to show it to the team"
  • Delighted in being able to see their zone's environmental status at a glance
  • Gained confidence in quality assurance efforts
  • Enjoyed the ability to more easily communicate trends and concerns with their team
  • Noted that unlike the paper recorders, the dashboard gave everybody a high-level understanding of real-time temperature and relative humidity data.

Conclusion

💡 TL;DR

This project was a coming-of-age moment in my UX design journey.

It changed my approach to design, and helped me understand the importance of user interviews, usability testing, and involving end users in the process. The positive results and user testimonies helped upper management understand the value of user-centric design, and from then on, UX design practices were implemented with less pushback.

By leveraging these techniques, my team and I delivered a temperature & relative humidity dashboard that improved work efficiency, reduced reliance on paper recorders, and empowered manufacturing floor personnel to maintain optimal environmental conditions.

A few months after the launch, I visited the manufacturing floor to do a maintenance check on the screens. There, I witnessed a technician walk up to the dashboard with their zone manager to point out a trend they had noticed in the humidity levels. The two then used the dashboard to support an active discussion of how to address the problem. It was personally affirming to see the feature being used voluntarily, rather than because "upper management said to use it".
