Performance Improvement for Data Quality

Problem & Solution

The audience for this performance improvement solution was a global group of data operations specialists. My client needed to improve the quality of underlying app data in order to increase end user satisfaction. 

The issues I set out to address were:

  • the existing 3-hour instructor-led training did not reduce operator errors (even after multiple iterations), was too long, and was not scalable;
  • learner KPIs did not align with desired behaviors; 
  • quizzes tested whether learners could regurgitate abstract concepts rather than apply the concepts;
  • there was no assessment on the technical procedure;
  • learners were struggling with complex policies and procedures;
  • high learner satisfaction was overweighted as an indicator of success, delaying a much-needed overhaul of this training.

The solution reduced seat time to 1.5 hours, significantly reduced operator errors,* and got the client one step closer to a fully scalable solution. We kept one hour of instructor-led training (which could later be converted to asynchronous training), eliminated the quizzes, and added a scored, 30-minute scenario-based eLearning. The two main eLearning activities assessed whether learners could perform the procedure, while applying the correct policies, in a simulated user interface. Learners worked through challenging branching scenarios and multi-step tasks, making this an L3 eLearning. The warm-up activities made the complex technical material much more digestible.

Training is often mistaken for a silver bullet. If we had implemented only this new blended learning solution, it likely would not have worked. We also had to fix the learner KPIs. Learners were incentivized (through frequent KPI reviews) to work as quickly as possible. However, to improve data quality, they needed to slow down, at least during their ramp-up period. I advised that once new hires could demonstrate a minimum level of quality, their speed goals should then be increased.

I also encouraged my client to stop placing undue weight on learner satisfaction scores when prioritizing training needs. This was a prime example of the unreliability of the learner satisfaction metric: learners reportedly loved the existing instructor-led training, and yet business metrics were telling us a very different truth—it was not getting results.


*A business analyst reported the Top 10 monthly error types to watch. The error type this training targeted stopped appearing on the Top 10 report in two separate pilots before the solution was rolled out successfully to the entire learner population. It had previously ranked regularly in the Top 3.

Note: To respect the proprietary nature of this content, I’ve changed the details of the project. But this example is representative of my process and solution.

Process: SAM

This process flexes based on specific project needs. Below is one way of implementing the iterative design phase principles. Read more about the SAM approach here.

Explore More