At the same time, instructors are uneasy about ceding control of personalized instruction to a platform’s opaque algorithms. This tension is unsurprising; there are many competing definitions of “adaptive learning” in circulation, and it’s hard for instructors to trust that a product’s adaptive choices will align with their teaching methods. It’s on the provider to make sure that instructors know exactly how the courseware works and what makes it adaptive.
What efforts are in place on your campus to educate instructors on emerging models such as adaptive learning?
The participants in the Next Generation Courseware Challenge (NGCC) grant employed practical tactics to address instructors’ concerns, starting by clearly explaining the adaptive features in their particular products. Because instructors aren’t always well versed in courseware design and engineering, providers must help them understand the science behind the design choices in the product. To increase their customers’ comfort with adaptive features, the grantees flagged the specific places in the platform that were driven by learning science research and made visible the actions that the system was taking as a result.
An instructor’s course focus doesn’t always align with a courseware provider’s automated outputs. During the grant, OpenStax found that, while one feature auto-generated quiz questions based on topics instructors wanted to cover, some of the generated questions addressed material that hadn’t been taught in the lessons. Instructors also wanted more standardization among adaptive learning questions so that one student wouldn’t receive a far longer or more difficult assignment than a peer. OpenStax Tutor gave instructors more control over their students’ experiences, including greater control over the selection, number and type of questions that each student receives.
There are important trade-offs to consider when giving instructors control over which features to use. You’ve likely built your adaptive features on the latest learning science to implement specific principles, so you may need to use nudging techniques to encourage instructors to use adaptive learning to their advantage. In another grant example, Acrobatiq realized that students in some implementations were not getting enough practice time for its courseware to provide a truly adaptive experience. While this is not unique to courseware—every teacher knows she needs a certain amount of data to personalize her instruction—this insufficiency of data wasn’t easily visible to instructors. So, Acrobatiq developed a tool that clearly informed teachers at launch about what the courseware requires in order to adapt to students. The company released this tool recently and is excited to see how it helps improve the experience for instructors and outcomes for students.
“If students don't provide enough data in the form of practice, then courseware providers can write an adaptive experience, but it won't actually adapt to the students. That’s a trickier thing you can’t necessarily expect instructors to just know.”
Bill Jerome, Senior Product Manager at Acrobatiq
“It’s not easy to find the balance between provisioning a fully adaptive platform and letting human instructors assert autonomy in its implementation. However, when instructors have a clear understanding about how an adaptive platform works, they can make intelligent choices about its features and their classroom content, thereby extending the benefits of machine learning.”
Wendy Wright, Professor of Business Administration at Cerritos College
Clearly define adaptive learning for instructors. With individual students’ outcomes at stake, instructors are naturally leery of black box implementations. An instructor who understands your tool’s adaptive features will be more willing to use them as designed.
Leave some features under instructor control, while ensuring instructors understand the trade-offs and the learning science behind the design of the solution. Then, give instructors opportunities to tailor relevant features of their experience in the platform to their unique method.
Build in learning science triggers. Use evidence-based practices to inform the design of the product, and use them to explain to instructors the rationale behind these design decisions.
Because instructors have varying comfort levels with Lumen’s Waymaker platform communicating with students automatically, Lumen makes certain features related to student communication optional. But its data showed that these features have a real, positive impact on student outcomes. This drove Lumen to find a good middle ground: instructors can choose not to enable the automated system, but the platform nudges them to turn on these features by displaying statistics that show their positive impact, based on Lumen student performance data.