Optimizing Employee Performance Plan application
Reducing support tickets by 54% for an internal application with over 4,000 users
Overview
I trained the development team on how to plan, conduct, and analyze a usability test. I established a reusable framework to assess the severity and significance of findings, helping the Product Owner prioritize and gain approval for additional, impactful work. I supported solution design to accelerate improvements and grow design skills within the team. Through continued collaboration, we delivered a new Performance Plan application to thousands of employees, which improved the user experience and significantly decreased support tickets.
Goals
Conduct usability testing and incorporate feasible results to reduce support costs
Provide usability testing training to the HR team, enabling them to conduct future testing independently
Provide a useful prioritization framework to evaluate potential improvements with limited resources and gain executive buy-in
Use the results and impact of this work as a case study to grow buy-in for UX efforts in solution development and secure additional UX resources
Role
Solution Development Lead: UX Researcher, Designer & Consultant
Technology
ServiceNow Application, HTML, CSS
Responsibilities
Usability Testing, Prioritization Framework, UX Training and Resources, Promotion of Results, Contractor Management
Stakeholders
HR Federal Team: Application Product Owner, HR IT System Owner, HR IT Executive
HR Contract Team: Business Analyst, Customer Support Lead, ServiceNow Developers (2)
Solutions Development Team: UX Designer (Contractor), Solution Architect, Solutions Executive Leader
Impact
54% reduction in support tickets post-launch
The Challenge
Performance plans are an annual requirement. With just under 5,000 employees, the performance plan cycle is time-consuming for employees, supervisors, and HR professionals. The previous version of the performance plan application frustrated users, caused an increase in support tickets, and drew pointed leadership criticism. Addressing usability issues with a new application was key to ensuring process compliance and reducing costs.
User Need
Simplify performance plan actions to reduce confusion and ensure compliance.
Business Need
Utilize out-of-the-box functionality to reduce support and maintenance costs.
Expand UX capability with internal development teams.
Performance Plan Creation Screen
Recreation of the Performance Plan application page to create a new plan at the beginning of a fiscal year with header, tabbed navigation, and two-column form field layout.
Building trust through listening and realistic goals
Understanding stakeholder motivations and building trust are critical to any project's success; without them, I risk the results of testing and design iteration never being incorporated into the final product. The following items were most important to our HR partners:
Stay on schedule
Reduce costs to maintain app
Increase performance action compliance
Minimize app customizations
Receive less negative feedback post-launch
Produce an app that can incrementally improve over time
UX research was not initially scoped into the schedule, but with growing concerns around usability, I was brought in to provide additional support. After meeting with the HR group, I learned that anything UX felt like an "all or nothing" situation to the team. It was my job to shift this perception and begin research on the new app.
Testing Application Usability
Usability Test Overview
Key Research Questions
Intuitiveness: How intuitive are available processes for employees, supervisors, and reviewing officials?
Enough Guidance: Is there enough timely guidance to complete key workflows without assistance or training?
Content Clarity: Does the app use plain language and basic design patterns in a consistent and useful way?
Ease of Use: Does this new version of the app provide a better user experience than the existing version?
Session Structure
Based on available functionality and a pilot session, we focused on the Supervisor and Employee roles, excluding HR Professionals. Supervisor tasks were the most complex and highest priority for launch.
9 usability sessions completed over 2 weeks
Participants consisted of 6 supervisors and 3 employees
Each participant completed 2 tasks per session
Participants were given 2 attempts per task
Testing Results
Participant Success Results By Task
Plan Creation
5 participants were given this task.
None completed the task on their first attempt; all 5 needed a second attempt and help from the moderator.
Change in Position
5 participants were given this task.
1 participant completed the task on their first attempt, 3 completed it on their second attempt with help, and 1 could not complete it.
Final Rating
8 participants were given this task.
5 completed the task on their first attempt, 2 completed it on their second attempt with help from the moderator, and 1 could not complete it.
Usability Session Tasks
Plan Creation: A new fiscal year has begun and it’s time to create and issue a new performance plan for your employee.
Change in Position: One of your employees is moving on to a new role within the organization. Issue a change to complete this transfer.
Final Rating: It’s approaching the end of the fiscal year and it’s time for final ratings to close out this year’s performance plan.
Each session included two of the three tasks, yielding 18 task runs across the 9 sessions (5 + 5 + 8).
Total Usability Issues: 12
Critical: 4
Serious: 3
Minor: 5
Severity Rating Definitions
Critical - If we do not fix the issue, users will not be able to complete a task in the tool/service
Serious - Users can complete a task in the tool/service but are frustrated with the cumbersome experience
Minor - Cosmetic issues or small hindrances that a minority of users mentioned. These are potentially low-effort changes that could have a big cumulative impact.
End of Session Survey
At the end of each session, participants completed a ten-question survey. Each question was a statement rated on a Likert scale of agreement. Statements were written with both positive and negative sentiment, and each was associated with one of the four key research question themes.
Results By Key Research Question
Enough Guidance - Poor
Ease of Use - Moderate
Intuitiveness - Moderate
Content Clarity - Good
Averages from each question indicated which themes needed the most improvement in the new application. The survey portion also prompted additional discussion, helping us understand why participants agreed or disagreed with each statement.
Each statement was rated on a 5-point Likert scale of agreement:
Strongly Disagree = 1
Disagree = 2
Neutral = 3
Agree = 4
Strongly Agree = 5
Scores for each statement were averaged. Negatively phrased statements needed an average of 2.5 or lower to be considered Good; positively phrased statements needed an average of 3.5 or higher. Averages within .5 of their target were marked Moderate, and anything more than .5 from its target was marked Poor.
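For illustration, this classification rule can be expressed as a short function. This is a minimal sketch, not part of the study itself; the function and variable names are hypothetical, while the thresholds come from the methodology above.

```typescript
type Rating = "Good" | "Moderate" | "Poor";

// Classify a statement's average score against its target.
// Positively phrased statements target an average of 3.5 or higher;
// negatively phrased statements target 2.5 or lower.
function rateStatement(average: number, positivelyPhrased: boolean): Rating {
  // How far the average missed the target, in the unfavorable direction.
  const miss = positivelyPhrased ? 3.5 - average : average - 2.5;
  if (miss <= 0) return "Good";       // met or beat the target
  if (miss <= 0.5) return "Moderate"; // within .5 of the target
  return "Poor";                      // more than .5 from the target
}

// Example from the results below: a negatively phrased statement
// averaging 3.56 rates as Poor.
console.log(rateStatement(3.56, false)); // "Poor"
```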
Enough Guidance
I think I would need to ask questions to know how to complete each action.
Target: 2.5 or less
Average: 3.56 - Poor
The amount of information was adequate for each action.
Target: 3.5 or more
Average: 3.22 - Moderate
I feel confident I completed the HR action correctly.
Target: 3.5 or more
Average: 3.89 - Good
Ease of Use
I think the app was easy to use.
Target: 3.5 or more
Average: 3.44 - Moderate
Figuring out how to use this app was difficult.
Target: 2.5 or less
Average: 2.56 - Moderate
Intuitiveness
I think most people would figure out how to use this app quickly.
Target: 3.5 or more
Average: 3.11 - Moderate
Similar information was consistently placed within the app.
Target: 3.5 or more
Average: 4.22 - Good
Content Clarity
Information was difficult to read and understand.
Target: 2.5 or less
Average: 2.67 - Moderate
The amount of information was adequate to complete each action.
Target: 3.5 or more
Average: 3.67 - Good
The information contained little to no jargon or overly specific terms.
Target: 3.5 or more
Average: 3.78 - Good
I structured this usability test to provide clear, prioritized results, alleviating the feeling that UX improvements must be "All or Nothing." This framework clearly defined tradeoffs and allowed the Product Owner to make informed implementation decisions that benefitted users most.
Examples of Addressing Usability Issues
Users had to recall information from separate screens to complete tasks
6 out of 9 test participants indicated that they felt the overall instructions and labeling were unclear. One supervisor said, "It doesn't feel natural to go between the task and the [performance] plan."
Critical Issue - Instructions Location
Instructions for a task were on the Tasks/To-dos tab, while the task itself was completed on the Plan tab. Users were brought to the Plan tab from the homepage, causing them to miss the instructions on the Tasks/To-dos tab.
Because of this workflow, most users guessed at the necessary steps before seeing the instructions or unknowingly left the creation task unfinished.
Solution - Integrated Instructions
We added dynamic instructions within the Plan tab rather than only on the Tasks/To-dos tab. Instructions varied by role and performance plan phase.
This change was a minor customization worth taking on because it addressed a critical usability issue and our worst-performing research theme, Enough Guidance.
We determined this issue caused the most confusion and frustration for users in the old application. These changes likely contributed to the 54% decrease in support tickets post-launch.
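As a rough sketch of how role- and phase-aware guidance can be modeled (the real implementation lives in ServiceNow; the types, names, and instruction copy below are hypothetical placeholders):

```typescript
type Role = "Employee" | "Supervisor" | "Reviewing Official";
type Phase = "Plan Creation" | "Change in Position" | "Final Rating";

// Instruction copy keyed by role and plan phase, rendered inline on the
// Plan tab instead of only on the Tasks/To-dos tab.
const planTabInstructions: Partial<Record<Role, Partial<Record<Phase, string>>>> = {
  Supervisor: {
    "Plan Creation":
      "Complete each performance element below, then issue the plan for your employee to review.",
  },
  Employee: {
    "Plan Creation":
      "Review each performance element below, then acknowledge the plan.",
  },
};

// Select the guidance to show above the plan form for the current user.
function guidanceFor(role: Role, phase: Phase): string | undefined {
  return planTabInstructions[role]?.[phase];
}
```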
Users needed stronger visual hierarchy to distinguish important information and scan content
4 out of 9 test participants indicated the performance plan page needed additional formatting to help draw attention to important fields and prevent errors.
Serious Issue - Plan Page
Recreation of the page used to review a plan. This content represents about 10% of a standard plan. Limited contrast in spacing and font size made it difficult to scan for information and led to frequent rereading.
Solution - Spacing and Contrast
To make plans easier to scan and review, we adjusted heading sizes, added line breaks to all performance plan templates, and increased spacing to improve readability.
We conservatively estimated this change would save employees 10 minutes per year on average completing performance plan actions, equating to 160+ days of time back to the agency annually.
Because UX and CX practices are still emerging within government environments, I used this project as an opportunity to strengthen the team’s long-term research capability rather than just completing a single usability study. A key goal was to ensure the HR development team could run future usability tests independently, confidently, and without additional design resources.
Growing internal research capacity
Foundational Training That Built Early Buy-In
I started with a concise one-hour training session covering usability testing fundamentals, session flow, and the materials we'd be using. This helped the team understand how research would answer their most important questions and created early buy-in from a group that was new to, and somewhat skeptical of, UX research.
Reference images from DHS CX Usability Testing
Progressive Handoff to Build Confidence
To make learning practical, I moderated the first few sessions while the HR team observed, and then they took over. By the final sessions, they were independently moderating with confidence, achieving the goal of repeatable internal testing capability.
Structured Debriefs That Accelerated Learning
After each participant session, we held short, structured debriefs. These discussions reinforced research concepts, helped the team quickly identify patterns in user behavior, and strengthened moderation skills. Early debriefs took around an hour; by the final sessions they were down to 10 minutes, an encouraging sign that our study had surfaced the majority of the usability issues.
Debrief Questions:
What did we learn?
What surprised us most?
What could we improve next time?
Growing Design Skills for Better Future MVPs
As insights shifted into solution discussions, I used simple mock-ups and sketches to introduce core design principles such as proximity, alignment, contrast, and repetition. These conversations improved the team’s design literacy and supported better decision-making early in the development process, reducing future rework.
By the end of the project, the HR team could independently plan, moderate, and synthesize usability tests, effectively expanding internal research capacity and ensuring the team can continue improving the product long after this project.
The Impact
Using a low-cost methodology and investing in meaningful collaboration, this work produced massive results:
54%
reduction in support tickets during the “Plan Creation Phase” post-launch
10
of 12 usability issues addressed partially or fully by launch, with the other 2 addressed in training
5
HR Team Members trained in usability testing and basic design principles
1000+
audience members at organization-wide Lunch & Learn about this case study
Early data for mid-cycle reviews showed a 76% reduction in support tickets compared to the prior year. Usability testing was scoped into the next phase of work by the HR team to be completed independently.
My leadership was very pleased with the collaboration effort and the role I played in the noticeable app improvement. Because of this, I received additional funding and support to continue growing UX solution development services.
What I Learned From This Project
Experiential learning is critical for UX maturity growth
The choice to make this an immersive project, rather than a quick study I did myself, is one of the best decisions I made. For the HR development team, the experiential learning shifted the way they approached development and prepared them to build better apps in the future, not just improve this one.
For me, this was also time with my customer. I learned where development teams were in their UX maturity and how they perceived UX. Based on this project, I crafted and implemented an action plan to scale UX across the software development organization and shift the way apps were built from done to done well.
Step-by-step collaboration pays off
This project took more time than expected, with collaboration slowing between the end of the study and launch. We hit a 2-month lull as the development team focused on other portions of the app. Leveraging the relationships and the strong, shared understanding of the results built during testing helped us pick the work back up quickly and deliver results. Everyone benefitted from the better app and the large reduction in support tickets.
Severity frameworks should be standardized
The severity framework from this study proved invaluable in gaining leadership buy-in to implement our results. Because of this, I standardized our definitions of Critical, Serious, and Minor to be specific to enterprise software applications. I then applied this standardized framework to all research projects and established an internal tracker to show issues across applications. This allowed us to identify repetitive or software-specific issues. I could then use issue frequency metrics as a key performance indicator (KPI) for UX work, with the goal of reducing critical issues and improving out-of-the-box functionality over time.
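A minimal sketch of what such a tracker entry and frequency KPI might look like (all names here are illustrative, not the actual internal tooling):

```typescript
type Severity = "Critical" | "Serious" | "Minor";

// Hypothetical shape of one tracker entry.
interface UsabilityIssue {
  application: string; // e.g., "Performance Plan"
  severity: Severity;
  theme: string;       // e.g., "Enough Guidance"
  description: string;
  resolved: boolean;
}

// KPI: open issue counts per severity across all tracked applications,
// with the goal of driving the Critical count down over time.
function openIssuesBySeverity(issues: UsabilityIssue[]): Record<Severity, number> {
  const counts: Record<Severity, number> = { Critical: 0, Serious: 0, Minor: 0 };
  for (const issue of issues) {
    if (!issue.resolved) counts[issue.severity] += 1;
  }
  return counts;
}
```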

