By Guest Author, Andrew Downes, Learning & Interoperability Consultant with Watershed

During a recent webinar, I joined Pam Boiros, learning advisor at Training Orchestra, to discuss the importance of laying the foundation for building your learning tech stack. But having an effective L&D department takes more than having the right tools in place; you also need the right strategy and approach to use those tools. So, once you’ve got your technology foundation in place, how do you stand on that foundation to really see the benefit? This blog post outlines three key principles for implementing a new technology, using learning analytics platforms as an example to illustrate them.

1. Start Small, Then Scale

When thinking about a new and unfamiliar technology, there’s a risk of becoming so overwhelmed with the possibilities that you never actually get started. There’s also a risk that, because you’re new to the technology, you don’t have the knowledge and experience to implement it well. For both these reasons, it’s important to start small, then scale.

In the context of a learning analytics platform, such as Watershed, that means choosing a limited-scope project to begin using learning analytics. Start by identifying a learning program and a group of learners so you can begin collecting data and practicing learning analytics. Running this type of pilot project will help you develop your knowledge and skills in learning analytics and teach you important lessons for when you’re ready to scale your learning programs. When your pilot is complete, you’ll be ready not only to roll out learning analytics on a larger scale within your organization, but also to do so with an informed plan based on your pilot experience.

That’s exactly what Verizon has accomplished with their learning analytics implementation. In a recorded xAPI case study webinar, you can see how Dwayne Thomas, a Verizon training manager, and the rest of his L&D team used a proof of concept as the foundation of a broader xAPI ecosystem strategy.

2. Embrace Change

Another challenge when implementing a new technology is that it can become secondary to your existing technologies and processes, rather than a core part of what you do. But remember, if you keep doing things the way you’ve always done them, you’ll struggle to see the benefits of the new technology. Especially for technologies that are intended to form part of the foundation of your learning tech stack, you need to change your processes to properly stand on that foundation.

In the case of learning analytics, this means thinking about data, analytics, and evaluation throughout the lifecycle of your learning content, resources, and tools. Evaluation isn’t something to start thinking about after learners have taken the course; it needs to be on your mind even as you start to gather requirements and design your learning solutions.

So, before you design your learning solution, ask yourself:

  • What is the business goal?
  • What needs to happen to meet that goal, and who is responsible for making sure these things happen?
  • What knowledge is required for people to support these actions?

For each of these questions, consider what data you’ll need to measure success and how you’ll capture that data.
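
To make that concrete, here’s a minimal sketch of what capturing one of those data points might look like using xAPI, the specification behind learning analytics platforms like Watershed. The LRS endpoint, credentials, learner, course, and result values below are all placeholder assumptions for illustration; the statement structure and version header follow the xAPI specification.

```python
import requests

# Hypothetical LRS endpoint and credentials -- substitute your own.
LRS_ENDPOINT = "https://lrs.example.com/xapi"
LRS_AUTH = ("api_key", "api_secret")

# A minimal xAPI statement: who did what, to which activity, with what result.
# The verb and activity IDs are illustrative; in practice, agree on a
# vocabulary up front so the data can answer your business-goal questions.
statement = {
    "actor": {
        "mbox": "mailto:learner@example.com",
        "name": "Example Learner",
    },
    "verb": {
        "id": "http://adlnet.gov/expapi/verbs/completed",
        "display": {"en-US": "completed"},
    },
    "object": {
        "id": "https://example.com/courses/call-handling-101",
        "definition": {"name": {"en-US": "Call Handling 101"}},
    },
    "result": {
        "score": {"scaled": 0.85},
        "success": True,
    },
}

# POST the statement to the LRS's standard /statements resource.
response = requests.post(
    f"{LRS_ENDPOINT}/statements",
    json=statement,
    auth=LRS_AUTH,
    headers={"X-Experience-API-Version": "1.0.3"},
)
response.raise_for_status()
```

Because every statement shares this actor–verb–object shape, data captured from different courses, tools, and systems can be brought together and reported on later.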

3. Keep Sight of the Purpose

A third risk you might encounter is getting so caught up in the functionality and features of the new technology that you lose sight of the reason you implemented it. You might be getting some benefit from the technology, but you need to make sure that it’s fulfilling its intended purpose.

If you’re not seeing the promised benefit of the technology, the problem may lie with the technology itself or with how you’re implementing and using it. As mentioned above, if the technology isn’t embedded into your processes, you may not be seeing its true benefits.

The purpose of a learning analytics platform is to provide data to improve the effectiveness of your learning solutions. You should be able to collect data about learning and performance and compare the two in order to gain insights into what’s working and what’s not (see the sketch after the list below). If you’re not able to do that, it may be because:

  • Your learning solutions aren’t aligned to any particular objectives, so it’s difficult to measure whether they were successful.
  • You’re missing data from a key data source.
  • You haven’t taken time to explore the data.
  • Your learning analytics platform has a limitation that’s getting in the way.
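
As a rough illustration of that learning-versus-performance comparison, the sketch below joins course-completion records with a business performance metric. All of the data, field names, and the metric itself are invented for the example; the only real assumption is that both datasets share a learner identifier so they can be compared.

```python
import pandas as pd

# Invented example data: learning records (e.g., from your LRS) and a
# business performance metric (e.g., customer satisfaction) from another system.
learning = pd.DataFrame({
    "learner": ["ana", "ben", "cho", "dev", "eli"],
    "completed_course": [True, True, False, True, False],
})
performance = pd.DataFrame({
    "learner": ["ana", "ben", "cho", "dev", "eli"],
    "csat_score": [4.6, 4.2, 3.1, 4.4, 3.5],
})

# Join the two datasets on the shared learner identifier, then compare
# average performance for completers vs. non-completers.
merged = learning.merge(performance, on="learner")
print(merged.groupby("completed_course")["csat_score"].mean())
```

Of course, a gap between the two groups is a starting point for investigation, not proof that the training caused the difference.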

Whatever the case, take action to address any issues and keep pushing ahead. Keep sight of the purpose and don’t let go! And if you missed our webinar, you can still check out the slides here.

About the Author

With a background in instructional design and development, Andrew Downes creates learning platforms and experiences in academic and corporate environments. Now a learning and interoperability consultant with Watershed, Andrew is an expert in Evidence-Driven Learning and Learning Technologies Interoperability.

An author of and top contributor to xAPI (Experience API), as well as the majority of the material on experienceapi.com, Andrew is a recognized xAPI expert who has delivered presentations, webinars, and training sessions across the globe. Keep up with Andrew on Twitter by following @mdownes.