Case Study:


Helping teachers create online lesson plans.

The goal of this project was to evaluate the usability of primary tasks for two key user groups and improve the usability of DocentEDU. To accomplish this, our team conducted usability tests, collected and analyzed data, and developed insights about different aspects of the application.

The Challenge

DocentEDU is an online tool designed for teachers to help create online lesson plans for students. It seamlessly integrates into existing webpages and allows teachers to insert information, quizzes, and images into the page.

We were tasked with reviewing this product and recommending improvements to the experience for teachers and administrators who use the 'Tools Manager' section of the product.

The Approach

To fully understand the issues that needed to be solved, I started by defining the users' goals and creating a Usability Test Plan & Script. Next, I conducted a Usability Review of the existing product using my script, then ran two remote tests (one moderated and one unmoderated). With a team of four other UX designers, I ran three in-person usability tests and compiled all of our data. I wireframed and prototyped my recommendations for improvement. Finally, I created a Findings & Recommendations report to deliver to our development team.


Usability Test Plan & Script

The initial step in the project was to identify the two key user groups for this product (teachers and administrators) and define their goals.

"My user is a teacher who wants to find resources to embed into external webpages for student assignments."

"My user is an administrator who wants to manage, add, edit, and delete database resources."

I created a Test Plan & Script with tasks and scenarios based on the goals of each user group. The scenarios gave users context and allowed them to imagine using the app in a realistic way.



Usability Review

After creating the Test Plan & Script, I conducted a Usability Review using Shneiderman's Eight Golden Rules of Interface Design. I applied the user goals and tasks I had defined to evaluate the "Tools Manager" section of DocentEDU.

I rated each user group's tasks on a scale from "No Error" to "Catastrophic Failure" against the eight rules I selected for my analysis.

After the review, I compiled the raw data with my team, and we refined the Test Plan & Script based on our findings.


Remote User-Testing


I conducted two remote tests using different software for communication and screen capture. The first test was moderated, with me walking the user through the script. The second was unmoderated, using a slightly adapted script so the user could understand the tasks on their own.

I used the think-aloud protocol to understand why users approached tasks the way they did. Each user completed the tasks differently and provided valuable feedback.



In-person Testing


Along with four other UX designers, I conducted three in-person tests. For each test, a moderator and an observer sat in the room with the user, while the three other designers watched from an observation room with an audio feed and a screencast from the user's laptop.

Many of the issues that surfaced during the remote tests were apparent in these sessions as well. We also recorded each session so we could refer back to it.


“I need more clear instructions.”


“I have no idea what just happened.”


“I’ll probably find it eventually.”


“Did it work?”

Data collection & Synthesis


We collected the data from our user tests and compiled our findings, which consisted of observations, body language, facial expressions, user quotes, and the pain points users ran into.

We then synthesized and grouped our findings to generate insights based on commonalities. These findings helped us confirm suspected usability issues, uncover new ones, and develop meaningful recommendations.


Wireframes & Prototype

After I delivered my report, the development team decided to implement my recommendations. I created annotated wireframes of the enhancements I found would be most beneficial, along with an interactive prototype showing how the usability improvements would function.



Findings & Recommendations Report

After completing all of the user tests, I compiled the information into a report on my findings and recommendations. The report walks through my complete process and uses the data I collected to highlight usability improvements:


  • Add documentation/tutorials/how-to’s.

  • Improve consistency of language.

  • Fix broken elements.

  • Add a search feature.

  • Provide feedback when an action has taken place.




One of the main issues I discovered through testing was a lack of consistency and findability, which I worked to correct in my wireframes and prototype. There was also little feedback when buttons were clicked or an action took place. I used a card layout for both the Teacher and Admin views of the tool to keep the flow consistent.

Another point raised by multiple users was that when they got stuck on a task, they automatically looked for some sort of documentation or tutorial. The original tool had no section for easily finding these resources, so I added both of those categories to the main menu.

One of the most challenging parts of this project was the user tests themselves. Many users struggled with the tasks and did not complete them as we had hoped. That gave us good data for recommendations, but it also made it hard to judge the current state of the tool. For future projects like this, I learned that it is beneficial to think ahead about the findings report: knowing what type of data you're looking for during user tests helps you present the findings to the client more precisely.