Learning Campaigns Part 1: Drive Learner Engagement
For the first time we now have a line of sight to metrics that help us evaluate the impact of delivering learning in an extended campaign format, rather than as a single event. Using data gathered from a variety of Fort Hill’s clients who have implemented the 70-20® learning activation platform, we can now assess the relative value of learning journeys vs. standalone courses.
This is the first of a three-part blog series examining learning campaigns from three perspectives: 1) participant engagement; 2) performance improvement; and 3) social learning.
Part 1 – Use Campaigns to Drive Learner Engagement
Embracing and implementing the 70:20:10 learning framework brings many changes to how learning is designed and delivered in organizations. The 70:20:10 approach fundamentally shifts the focus from course delivery to performance improvement. One strategy that supports the shift is to transform training programs into learning campaigns.
Learning campaigns are an effective alternative to providing “one and done” training courses or assigning people a list of uninspiring eLearning modules (the traditional “10s” in a corporate training curriculum). Charles Jennings, co-founder of the 70:20:10 Institute in Europe, recently discussed this shift in his excellent blog: From Courses to Campaigns: using the 70:20:10 approach.
Unlike a typical program or event, learning campaigns take place over an extended timeframe and can include a variety of activities and resources, many of which are embedded in an employee’s daily experience. For example, a campaign may involve formal educational components delivered in person or online; social exposure to ideas, coaching and collaboration with others; and on-the-job experiential and self-directed learning paired with performance support tools.
Two key elements of any learning campaign include spaced opportunities for practice over time and guided reflection. Depending on the performance objectives of the campaign, there can also be social interactions among a cohort working on common challenges as well as individual activities.
Structuring a Learning Campaign
Based on our experience, we recommend that campaigns consist of a series of learning challenges, activated at different times as people progress through the various elements of the learning journey and build their capabilities on the job.
For a campaign involving multi-modal formal learning components, challenges can start during the preparation phase before the first module and provide bridging activities between and after modules. The challenges can be specific activities, tasks or projects that participants work on solo or in groups, including:
- Short assignments done in the context of work.
- Opportunities for practicing new skills and behaviors in the flow of work.
- Conversations and coaching with colleagues, managers and mentors.
- Reflection on what’s working and what’s not.
Learning campaigns represent a significant and creative step forward in how L&D practitioners design solutions to meet dynamic business and performance needs. But once implemented, how do you evaluate the results of this new approach? Traditional evaluation strategies will not work effectively when participants are taking a variety of activities and steps over an extended timeframe.
We suggest new metrics to evaluate the impact of learning campaigns, such as improved participant engagement, visible progress and observable performance outcomes on each challenge. With these new metrics, you can demonstrate that campaigns yield stronger outcomes than a traditional short-term, standalone application period after a course.
Supporting Skill and Knowledge Transfer
Fort Hill’s new 70-20 learning activation platform supports skills and knowledge transfer after formal courses. It also facilitates and captures a variety of informal and social learning activities. More importantly, 70-20 provides a platform to create, implement and measure results from extended learning campaigns.
We now have data in 70-20 from a variety of clients and learning initiatives that make visible the impact of extended (and planned) learning campaigns, as compared to the results of single application periods following a learning initiative. (Note: Fort Hill has clients who use 70-20 to support single learning challenges after programs, or align challenges with team projects, so our comparison data is coming from different types of applications in the same system).
Participant Engagement in Learning Campaigns
When a learning campaign is defined upfront and positive expectations are set in advance for all stakeholders – the learners, their managers, coaches and business sponsors – the results are impressive. Before we get into the data, below are some quick definitions of 70-20 system terms:
Development initiatives structured as learning campaigns are highly engaging. Here are some 70-20 client examples (Note: the first three campaigns required manager validation of results):
What’s exciting in the data from these five learning campaigns is that the average percentage of participants active in one or more challenges was 85%. The average percentage of participants who completed challenges across all five campaigns was 45%. Looking only at the three campaigns with the higher level of accountability (managers validating completed challenges), average participant engagement rose to 89% and the average completion rate rose to 64%.
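The averages above can be reproduced with simple arithmetic. The sketch below uses hypothetical per-campaign percentages (not Fort Hill’s actual client data) chosen only so that the aggregates match the figures reported here; the variable names and values are illustrative assumptions.

```python
# Hypothetical per-campaign percentages, for illustration only.
engagement = [90, 88, 89, 80, 78]   # % of participants active in >= 1 challenge
completion = [60, 65, 67, 20, 13]   # % of participants who completed challenges
validated  = [True, True, True, False, False]  # first three: manager-validated

def avg(values):
    """Arithmetic mean of a list of percentages."""
    return sum(values) / len(values)

print(avg(engagement))  # 85.0 — average engagement across all five campaigns
print(avg(completion))  # 45.0 — average completion across all five campaigns

# Restrict to the three campaigns with manager validation of results:
v_eng = [e for e, v in zip(engagement, validated) if v]
v_com = [c for c, v in zip(completion, validated) if v]
print(avg(v_eng))  # 89.0 — engagement rises with accountability
print(avg(v_com))  # 64.0 — completion rises even more sharply
```

The point of the comparison is that the validated subset outperforms the overall average on both metrics, which is the accountability effect described above.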
This is an extremely high level of engagement, and it dramatically lifts the performance outcomes of these development initiatives. It’s also worth noting that the campaigns spanned different companies (including one open-enrollment group) and diverse programs, with participants located in the US, Europe and several Asia Pacific countries.
In contrast, many research studies (including those conducted by Fort Hill) over the years have shown that the majority of participants who have no accountability or support mechanisms in place after a learning experience rarely apply their learning or improve their capabilities. Our research data, compiled over 17 years, shows that fewer than 20% of people practice and apply learning well enough to demonstrate results or improved outcomes on the job after training.
Participant Engagement in Standalone Challenges
Fort Hill also has a number of clients using 70-20 periodically with teams or groups working on single challenges associated with different learning programs or projects. In the examples below, the groups were active periodically in 70-20 over a 12-month period. As you will see, the engagement levels on these challenges were not as consistent as the learning campaigns.
1. A large, geographically dispersed L&D team at a global biotech company had four unique development challenges over a year (ranging in length from several weeks to two months). Across the four challenges, the group’s activity level ranged from a high of 75% to a low of 38%, and the majority of participants reached 50 or 75 points on the LOI. The group had exceptionally high social engagement (commenting on and liking others’ progress posts), but didn’t complete their challenges in the time provided.
a. Note: From the client’s perspective, her goal was to activate social learning and collaboration among peers who rarely see each other in person. She was focused on having the team learn together by encouraging creativity and out-of-the-box problem solving on the 70-20 platform. On this measure, the client felt the application of learning and outcomes were successful.
2. A diverse Communications and Organizational Effectiveness team at a financial services company had six unique challenges spread over a year on different topics and initiatives. The level of engagement on these disconnected challenges ranged from a high of 55% on one challenge, to a low of 22%. Again, we observed a high level of variability on the single challenges and a relatively low level of challenge completion. The team had an above average level of social interaction on 70-20, but the inconsistent engagement levels reduced the outcomes for each challenge. Reflecting on these results, the client acknowledged the need to improve how:
a. the context is set for each challenge and its connection to business priorities;
b. expectations for involvement are communicated;
c. progress and outcomes will be evaluated.
Conclusions on Engagement Levels
Comparing data from both approaches, we can see that creating, communicating and supporting a structured learning campaign results in significantly higher engagement of participants across a variety of learning experiences, as compared to standalone challenges. The learning campaigns with validated challenge outcomes resulted in even higher engagement and challenge completion than the campaigns without verification of results.
Ideas for Your Action
1. Look at your high-priority development initiatives and consider how you can redesign and extend the learning experience over a series of challenges. Communicate performance expectations for the learning campaign and the overall implementation plan to all stakeholders in advance. Have participants regularly communicate their progress on each challenge to the campaign sponsors.
2. Involve participants’ managers and communicate the anticipated outcomes and results that will support business priorities. Give managers coaching pointers and discussion questions, with clear guidelines for optimizing their engagement with direct reports.
We will explore specific learning campaign outcomes and performance metrics in Part 2 of this blog series.