CompTIA Virtual Workbench Labs UX/UI
Activities & Roles:
UI design
UX design
UX research/Usability testing
Programs Used:
Adobe XD
Figma
Team:
James Chesterfield
3rd Party Development Vendor
CompTIA produces training and certifications for the IT industry. Often, training takes place in physical classrooms with physical hardware to assemble. As the COVID-19 pandemic hit and students could no longer be in physical classrooms, CompTIA had to create new ways for learners to interact with hardware without having to be in a physical space.
This accelerated the training team’s plans for creating virtual “workbench labs” - interactive, hands-on experiences embedded into our flagship eLearning product where users could assemble physical hardware within a virtual environment.
CompTIA would partner with a development vendor to build the functionality, while I would work alongside a product manager to define the UX, UI, and controls for the new labs. I would also perform usability testing, create a basic design system, and define a method to smoothly transition a user into the lab experience from its parent platform.
The new labs would be required for the updated version of our largest product line, meaning we’d need them launched in about four months. This had to be done well, but fast. I’d start by breaking the project down into sub-projects and issues. After analysis, I identified three issues I’d need to resolve to complete the project:
How to smoothly transition users from the parent/host eLearning platform into the labs platform
How to design the interface aesthetics and usability
How to teach users how to control the lab
I’d attack these issues from easiest to most complex and run any testing or research I needed as I went.
Labs are built by a separate vendor in a separate platform. The labs platform is then integrated into the eLearning platform. This creates many technical and vendor-based constraints. Transitioning from eLearning to labs requires that a user move across separate platforms without a jarring or confusing state change.
Interestingly enough, there were so many technical and integration constraints around transitioning a user that only one immediate solution avoided a jarring experience - using an existing notification modal feature. The modal would alert the user to the state change and then explicitly launch the lab in a separate window. Despite the extra click and pop-up, this would clearly transition the user and keep both the eLearning and lab windows available.
eLearning-to-Labs Flow: The eLearning interface shows a list of learning tasks at the bottom and a primary “Get Started” action button at the top right. Clicking either will show a modal before launching the lab. This modal alerts the user to the context change and then launches the lab in a separate window while maintaining the eLearning platform in its own window in the background.
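To make the mechanics of that hand-off concrete, here is a minimal, hypothetical TypeScript sketch. The element hooks, function names, and the use of window.confirm as a stand-in for the platform’s existing notification modal are illustrative assumptions, not the production implementation.

```typescript
// Hypothetical sketch of the eLearning-to-labs hand-off. Element hooks,
// function names, and window.confirm (standing in for the platform's
// existing notification modal) are illustrative assumptions.

/** Alert the user to the context change before leaving the eLearning view. */
function showContextChangeModal(message: string): boolean {
  // In production this would be the host platform's notification modal;
  // window.confirm stands in for it here.
  return window.confirm(message);
}

/** Open the lab in its own window, keeping the eLearning window behind it. */
function launchLab(labUrl: string): void {
  const confirmed = showContextChangeModal(
    "You are about to open a Virtual Workbench Lab in a new window."
  );
  if (!confirmed) return;

  // The eLearning tab stays open in the background; the lab gets its own window.
  const labWindow = window.open(labUrl, "_blank");
  if (!labWindow) {
    // Pop-up blocked: surface a message asking the user to allow pop-ups.
    console.warn("Lab window was blocked by the browser.");
  }
}

// Wired to both the learning-task list items and the "Get Started" button.
document.querySelectorAll<HTMLElement>("[data-lab-url]").forEach((el) => {
  el.addEventListener("click", () => launchLab(el.dataset.labUrl!));
});
```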
Initially, the labs vendor provided an interface and style for the labs. It featured a lesson selection screen, a mode or difficulty selection window, and the activity window itself. This interface, however, didn’t fulfill our needs for functionality or aesthetics: it was visually jarring, required extra clicks and sub-menus, featured unclear icons and actions, and didn’t provide clear information on user controls.
Based on CompTIA requirements, I designed an initial mock of the interface that cut out unneeded interactions, changed and moved other interactions, and updated the visual style to align to our product style guides. I would then take this interface into usability testing.
This initial mock has the following updated features:
A simple lab mode picker to launch directly into the work
More accessible, contrasting colors
Sidebar with comprehensive steps and sub-steps to guide user tasks
Robust footer with icons and labels for easy access to controls, options, and operations
Vendor Demo Interface: This initial interface provided by the vendor features an entry and lesson selection screen (left), a mode or difficulty selector (middle), and the lab activity itself with instructions (right).
v1 Interface: I designed this updated interface based on organizational needs, updating aesthetics and functionality and simplifying entry so users move directly into the lab. This cuts out an extra selection screen/step.
CompTIA has an extremely varied user base, from 16-year-olds considering IT to 70-year-olds maintaining skills. All users needed a way to interact with a content type (3D) they may have never used before. We needed to make the environment as accessible as possible, so I decided on supporting four control methods, not all of which would be available at launch:
Mouse or Touchpad + Keyboard
Mouse or Touchpad Only
Keyboard Only
Touch
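As a rough illustration of how these methods might be represented in the lab UI, the TypeScript sketch below models each control method with a placeholder set of bindings so only the relevant controls can be surfaced for a given user. The type names and bindings are assumptions for illustration, not the shipped controls.

```typescript
// Hypothetical sketch: one way to model the four supported control methods so
// the lab UI can surface only the bindings relevant to a given user. Type
// names and the placeholder bindings are illustrative, not the shipped set.

type ControlMethod =
  | "mouse-keyboard"
  | "mouse-only"
  | "keyboard-only"
  | "touch";

interface ControlBinding {
  action: string; // e.g. "Rotate view", "Zoom"
  input: string;  // human-readable input, e.g. "Right-click + drag"
}

// Methods not yet available at launch can simply be omitted until they ship.
const controlBindings: Partial<Record<ControlMethod, ControlBinding[]>> = {
  "mouse-keyboard": [
    { action: "Rotate view", input: "Right-click + drag" },
    { action: "Zoom", input: "Scroll wheel" },
  ],
  "keyboard-only": [
    { action: "Rotate view", input: "Arrow keys" },
    { action: "Zoom", input: "+ / -" },
  ],
  touch: [
    { action: "Rotate view", input: "One-finger drag" },
    { action: "Zoom", input: "Pinch" },
  ],
};
```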
Determining v1 Controls
I wanted to take a best-guess approach at controls first, and then bring those controls into testing to refine them. I designed the initial controls based on the following examples:
Game Design (e.g. Dyson Sphere Program, Cities: Skylines)
Depth-based Mobile Apps (e.g. Google Maps)
Interaction Guidelines (e.g. Apple Human Interface Guidelines)
The result was an initial, albeit complex, showcase of all the controls a user might need. I’d use this as the basis for usability testing.
v1 Controls: A first attempt at displaying controls within the lab.
I wanted to build an interface as fast as possible so that I could get a prototype in front of users to test. I performed six usability studies, each with a variety of open-ended scenarios and questions to understand how the interface and, especially, the controls were used and perceived.
During this time I also realized that I had no real data or understanding of how users physically interacted with our content. I wasn’t sure how to order or introduce the controls because I had no data on what devices or peripherals our users had. So, I asked our research team to run a survey of customers to determine peripheral, device, and assistive technology usage. I’d use this to prioritize menus, accessibility considerations, and especially controls.
Usability Testing: This is a portion of the document I used to test users on various actions within the v1 interface.
Though usability testing and research led to many tweaks and updates, the overall changes can be categorized into three major areas:
Change 1: Updating the UI Footer
Users were confused by the placement and usage of many icons. We went back and forth with them to refine a new set of icons, labels, and placements to make a more understandable interface. For example, in the original design, users correctly identified the compass icon on the right side of the UI, but they didn’t understand its meaning of resetting the lab view. Though the idea was pulled from video game map design, that pattern just didn’t work here, so I scrapped it and made the icon more explicit.
Original (top) vs Updated (bottom) UI Footer: Based on usability testing, we updated the layout, labels, and icons for many functions. Follow-up sessions with users showed greatly increased clarity and understanding.
Change 2: Streamlining the Controls
The next big change is an example of reducing complexity into more focused, consumable experiences. Usability testing showed it took users significant time and effort to take in the controls, largely because they were shown all controls for all peripherals. Users just needed the controls for their own peripherals.
After multiple iterations and feedback, I landed on an accordion interaction with a default or suggested control method shown. The ordering and default display were based on the peripherals survey.
After testing again, users spent only one-third of the time on this screen vs the original display.
Updated Controls Interface v1 (left), v2 (middle), and Final (right): Following usability testing and peripheral usage research, the controls menu was iteratively improved to reduce complexity and provide a more focused, prioritized method of consuming controls.
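As a sketch of the design choice behind the accordion, the snippet below orders the sections by a survey-derived priority and expands only the top-ranked available method by default. The priority order and type names here are placeholder assumptions, not the survey’s actual results.

```typescript
// Hypothetical sketch of the accordion logic: sections are ordered by a
// survey-derived priority and only the top-ranked available method is
// expanded by default. The ranking below is a placeholder, not the actual
// survey results.

type ControlMethod =
  | "mouse-keyboard"
  | "mouse-only"
  | "keyboard-only"
  | "touch";

interface AccordionSection {
  method: ControlMethod;
  expanded: boolean;
}

// Placeholder ranking standing in for the peripherals survey data.
const surveyPriority: ControlMethod[] = [
  "mouse-keyboard",
  "touch",
  "mouse-only",
  "keyboard-only",
];

function buildControlsAccordion(
  available: ControlMethod[]
): AccordionSection[] {
  // Sort the available methods by survey rank, then expand only the first
  // section so users see a single suggested control scheme up front.
  return available
    .slice()
    .sort((a, b) => surveyPriority.indexOf(a) - surveyPriority.indexOf(b))
    .map((method, index) => ({ method, expanded: index === 0 }));
}

// Example: even if only a subset of methods is available, the
// highest-priority one is still the default.
buildControlsAccordion(["keyboard-only", "mouse-keyboard"]);
// -> [{ method: "mouse-keyboard", expanded: true },
//     { method: "keyboard-only", expanded: false }]
```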
Change 3: Creating an “Active” Tutorial
Usability testing showed that users just weren’t familiar with 3D controls. They needed a low-stakes, active way of playing with them (as opposed to a textual or video explanation). I pitched an “active” tutorial using the same environment we’d already built so that we could keep the time costs low. This would get users familiar with the controls early on and, even more importantly, scaffold knowledge of how the interface and steps work.
This is a key idea: instead of passively presenting information, we design a simple tutorial that has users perform 3D movements in increasingly complex ways. In instructional design, we call this “scaffolding” knowledge.
Tutorial Walkthrough Video - This shows an early version of the tutorial, following a basic 5-step onboarding process with each step building knowledge of how to use 3D controls. The final step requires a user to use all the controls.
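A rough sketch of how that scaffolding could be structured: each tutorial step requires the controls introduced so far, and the final step requires all of them. The step titles and control names below are illustrative placeholders, not the tutorial’s actual content.

```typescript
// Hypothetical sketch of the scaffolded tutorial: five steps, each requiring
// the controls introduced so far, with the final step requiring all of them.
// Step titles and control names are illustrative placeholders.

type LabControl = "rotate" | "zoom" | "pan" | "select" | "place";

interface TutorialStep {
  title: string;
  requires: LabControl[]; // controls the user must perform to advance
}

const tutorialSteps: TutorialStep[] = [
  { title: "Look around", requires: ["rotate"] },
  { title: "Get closer", requires: ["rotate", "zoom"] },
  { title: "Move across the bench", requires: ["rotate", "zoom", "pan"] },
  { title: "Pick up a component", requires: ["rotate", "zoom", "pan", "select"] },
  // Final step: every control practiced together.
  {
    title: "Install the component",
    requires: ["rotate", "zoom", "pan", "select", "place"],
  },
];

/** Advance only once the user has performed every control the step requires. */
function isStepComplete(
  step: TutorialStep,
  performed: Set<LabControl>
): boolean {
  return step.requires.every((control) => performed.has(control));
}
```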
I successfully collaborated with a product manager and vendors to complete the labs interface, UX, controls, usability studies, and platform flows. The labs launched on time, embedded cleanly into our flagship learning product. Labs are now being used in hundreds of classrooms by thousands of students.
Many bug fixes, enhancements, and delight improvements are on the roadmap, but the core product was released to great excitement. Customers are ecstatic about the features and capabilities and the ability to place their students in virtual environments with real hardware.
Virtual Workbench Labs Overview Video: This shows some of the features of a workbench lab, including “explore mode” that shows components and definitions and “assisted mode” that functions as a walk-through of assembly steps.