Last month I attended a webinar hosted by the Performance Support Community of Practice entitled “Making PS Work! – Bank of America.” If you haven’t seen this presentation, you missed an amazing example of Performance Support in action at Bank of America – including an idea I’ve heard many times over the years: a performance support tool for performance support projects! Stop reading now and click here to watch the recorded session. I’ll wait.
You’re back? Amazing, right? Beth Daniel and her team have done incredible things to transform the organization and retool their own thinking around how performance support and traditional training can fit together.
During the webinar, there was a lively backchannel conversation going on in the chat window. At one point, Craig B. asked a very good question that got me thinking. Craig asked how best to order a list of resources to make it easier for users of a Performance Support system to find what they are looking for. Specifically, he was asking whether it was better to arrange resources in groups by process, or whether leading with an action verb makes resources easier to find. Then Craig asked if there was any usability evidence to support one arrangement over the other.
I wasn’t aware of any evidence, but I knew someone who might be. I told my User Experience expert friend Jeff that I was buying lunch if he’d do a quick consult with me. Jeff and I have worked together in the past on eLearning courses, and my designs always benefit from his User Experience expertise. For those unfamiliar with the term “User Experience” (otherwise known as “UX” in Jeff’s world), it is a broad term describing all aspects of a person’s interaction with a system, application, or tool. It grew out of – and remains primarily focused on – website design, but the discipline also includes user interface, information architecture, graphics, interaction design, content strategy, communication, and even the guides and helpdesk support.
Much like “Performance Support” draws blank looks from the uninitiated in Learning & Development circles, “User Experience” suffers the same branding issues in the web design world. Jeff even asked me to find a catchier name for it for this blog post. Well, in the spirit of helping you improve your Performance Support, I need to call it UX so you’ll know what to research. But rest assured, Jeff and I will spend some quality creative time working on terminology at our next happy hour!
Three Focuses of UX
The goal of UX is to create products that give users a positive experience through the measurement and improvement of the product’s efficiency, effectiveness, and satisfaction. Let’s unpack each of these terms and see what insights we can apply to our Performance Support tools.
Efficiency helps us to understand how quickly a user can accomplish a task or learn an interaction. Words like “usable” or “intuitive” come to mind, describing how easy it is for a user to adapt to a new system or to become proficient in its use. Good goals, but it turns out “fewer clicks” does not necessarily mean “more efficient.” Talk about counter-intuitive!
With respect to Bob and Con’s “2 clicks, 10 seconds” goal, UX practitioners say the number of decisions a user has to make is more important than the number of clicks it takes. One study concluded “users complain about how long it takes to find things all the time…. However, these complaints aren’t actually about the clicks. They are really complaints about failing to find something.” Frequently, a user’s satisfaction can be more closely attributed to the difficulty of the decisions they make along the way, rather than the number of clicks.
[See “Testing the Three-Click Rule,” by Joshua Porter. Originally published: Apr 16, 2003 on the User Interface Engineering site, accessed 12 March 2015: http://www.uie.com/articles/three_click_rule/ ]
We’ve been talking about this for a long time: make it easy for a performer to find the information they need in the Moment of Apply. That’s our bottom line. If you have struggled with how to implement that goal, UX may offer you some insights.
Effectiveness is a measure of how easy it is for users to complete a task. Digging into the reasons they don’t complete a task reveals important clues about the way users are interacting with the system. I’ve simplified this so much, my friend Jeff may pretend he doesn’t know me. Suffice to say that effectiveness is a tricky thing to measure because so much of a user’s experience is subjective. (Never fear, UX practitioners have a dizzying array of techniques, measures, and metrics to determine effectiveness. Just ask, but be prepared for a long dissertation!)
For our purposes, Performance Support is all about making it easy for performers to complete a task, right? Now we just have to make sure our own PS tools present the content in a way that makes it easy to find and complete the task. The Performance Support Pyramid already provides an excellent framework for presenting just enough contextual information at the Moment of Apply.
Jeff tells me the trick to creating effective designs is to reduce the amount of thinking a performer has to do when using your PS tool. His exact quote was: “Don’t make them step out of the tool to think about what you were thinking when you designed it.” For example, he suggested that instead of separate tabs for Steps and Details, a more effective design might use the “+” and “-” hierarchical list convention to roll up nested Details under each Step. Then performers keep the context of the Step when they need to drill down into the Details. I thought this idea had merit, but as Jeff warned, it depends on the audience. Knowing your target audience is an important step here: what you think of as “intuitive” may not be intuitive to others.
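To make Jeff’s “+/-” suggestion concrete, here is a minimal sketch of the idea in TypeScript. The data model, step names, and `renderSteps` function are all hypothetical illustrations, not part of any real PS tool: the point is simply that expanding a Step reveals its Details indented beneath it, so the performer never loses the context of the parent Step.

```typescript
// Hypothetical sketch of the collapsible Steps/Details convention.
// Nothing here comes from a real product; names are illustrative only.

interface Step {
  title: string;
  details: string[];
  expanded: boolean; // whether the "+" has been clicked to reveal Details
}

// Render the list as plain text: collapsed Steps show "+", expanded
// Steps show "-" with their nested Details indented underneath.
function renderSteps(steps: Step[]): string {
  return steps
    .map((step) => {
      const marker = step.expanded ? "-" : "+";
      const head = `${marker} ${step.title}`;
      if (!step.expanded) return head;
      const nested = step.details.map((d) => `    ${d}`).join("\n");
      return `${head}\n${nested}`;
    })
    .join("\n");
}

const steps: Step[] = [
  { title: "Open the account record", details: ["Search by account number"], expanded: false },
  { title: "Verify the customer", details: ["Check photo ID", "Confirm address"], expanded: true },
];

console.log(renderSteps(steps));
// + Open the account record
// - Verify the customer
//     Check photo ID
//     Confirm address
```

In a real tool this would be an interactive widget rather than plain text, but the design principle is the same: the Details live inside the Step instead of on a separate tab.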
User Satisfaction is how we measure the user’s perspective on their interaction, and it gives us insight into the user’s attitude toward the product. As with effectiveness, user satisfaction is directly related to task completion. If users can complete their assigned tasks quickly and easily, they often report higher satisfaction, especially if there is an element of delight in the interaction. The ultimate goal for User Satisfaction is for the technology to fade into the background. As long as users don’t have to think about the interface, you have a winner.
One of Jeff’s favorite stories about User Satisfaction hit close to home for me. Jeff was working with a training team who deployed a compliance training WBT to a large group of learners. They were getting hammered with consistently low evaluation scores on the question “How satisfied were you with this training course?” and the client was not happy. After a few quick interviews with users, Jeff recommended changing the question to “What did you find most frustrating about this training?” As it turned out, it was not a learning content issue that was leading to the low satisfaction. The majority of learners who gave a low evaluation score identified issues in three categories: 1) accessing the training course; 2) loading the videos during the training; and 3) downloading the training certificate. This is just one example of how User Satisfaction issues – in this case related to the technology solution and not the content – interfered with the learners’ overall experience with the training course.
This hit home for me because I’ve been there. A measure of success can often be as simple as hearing a user say “That was less painful than I thought it would be!”
Answering the question
And this takes us back to Craig’s webinar chat question about how to organize content in the least painful manner.
According to Jeff, because the performer is focused on Process, and because Process provides the framework for the context, the best way to organize information is by the easiest logical association (which, in this case, is the name of the Process). This allows the performer to leave their current activity, look for broad resources associated with the task at hand, then come back and use them where appropriate. Where possible, Jeff continues, provide real-time resources (i.e., context-sensitive, embedded support in the workflow) as tooltips, or open a new window specifically related to the step the performer is currently attempting to complete.
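Jeff’s “organize by Process name” advice can be pictured as a simple grouping of resources. This is a hypothetical sketch – the Process names, resource titles, and `groupByProcess` function are invented for illustration – showing how a performer who searches on the task at hand would find every related resource collected under one heading.

```typescript
// Hypothetical sketch: grouping PS resources by Process name so the
// performer's easiest logical association leads straight to them.

interface Resource {
  process: string; // the Process name the performer will search by
  title: string;
}

const resources: Resource[] = [
  { process: "Open New Account", title: "Required documents checklist" },
  { process: "Open New Account", title: "Identity verification steps" },
  { process: "Close Account", title: "Refund processing guide" },
];

// Collect resource titles under their Process name for display.
function groupByProcess(items: Resource[]): Map<string, string[]> {
  const groups = new Map<string, string[]>();
  for (const r of items) {
    const titles = groups.get(r.process) ?? [];
    titles.push(r.title);
    groups.set(r.process, titles);
  }
  return groups;
}

const grouped = groupByProcess(resources);
console.log(grouped.get("Open New Account"));
// [ 'Required documents checklist', 'Identity verification steps' ]
```

The design choice mirrors Jeff’s point: the grouping key is the association the performer already has in mind (the Process), not a category scheme the designer has to explain.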
We owe it to our performers to make sure our PS tools are as effortless to use as possible. PS tools that are difficult to navigate or not intuitive to the performers will not generate the level of usage metrics needed to justify their expense. And if users find our systems hard to use, they will simply go back to the old way of doing things. Fortunately, the Performance Support Pyramid already furnishes a very solid framework for an efficient, effective interface that boosts user satisfaction. But it never hurts to start thinking about continuous improvement!
Can you say that your performers are satisfied with your PS tool? Not everyone has the luxury of a full User Experience team to run through the battery of assessments and focus groups used to analyze and evaluate user interactions. But I would bet that if you work in a company with a large web presence or one that does e-commerce, you already have UX practitioners on staff. Time to call your contacts in IT to see if you can “borrow” one for a quick consultation. Check around your town, too. There’s probably a local chapter of the User Experience Professionals Association (https://uxpa.org/) where you could find someone who would be willing to help.
In my experience, a bowl of chili and a beer is usually enough to get those UX-types waxing philosophical about interface design and other very useful topics! Speaking of which, send me your questions about UX in the Comments below and I’ll pick Jeff’s brain and report back his answers.
Further reading, or everything you never knew you wanted to know about UX:
“What Is User Experience Design? Overview, Tools And Resources,” by Jacob Gube, Smashing Magazine, October 5th, 2010. Accessed on 11 March 2015. http://www.smashingmagazine.com/2010/10/05/what-is-user-experience-design-overview-tools-and-resources/
HackDesign’s “An Introduction to User Experience Design” by Dan Zambonini. Accessed on 12 March 2015. https://hackdesign.org/lessons/9