
Could Safeguards for Efficacy Help Balance the Use of Smart Technologies in Education?

Topics: Emerging Trends, Future of Learning

When we look to the future of learning, some prospects can feel exciting, while others can feel alarming. The drivers of change shaping learning can enable new kinds of practices, programs, structures and roles – or lead to the need for them. One possibility is that safeguards for efficacy could enable education stakeholders to provide vision and stewardship for implementing effective data strategies and for embracing emerging technologies for intentional learner support.

This provocation from KnowledgeWorks’ latest ten-year forecast, Navigating the Future of Learning, explores the tensions and promises related to making more extensive use of artificial intelligence (AI), machine learning, data analytics and other smart technologies in education.

Looking ahead ten years, AI ethics cooperatives composed of education stakeholders, community members and artificial intelligence and data experts could work together to establish robust ethical frameworks and standards governing the use of artificial intelligence and machine learning applications in education. These groups could also develop strategies for preventing the bias that could all too easily be built into such tools or baked into the data and values informing them. Today, we are beginning to understand both the vast potential and the vast unintended negative consequences that could result from the use of increasingly smart technologies. In one example of an attempt to balance this tension, GovLab’s Data Stewards project facilitates collaboration across private-sector organizations, public agencies and researchers to promote responsible data leadership and analytical expertise in service of the public good.

Machine learning audits could also help ensure the ethical use of machine learning in education. K-12 schools, postsecondary institutions and other learning organizations could regularly undergo systemic technology audits by trusted professionals to examine whether the algorithms driving decisions and experiences were interpretable, were based on unbiased data and reflected fair assumptions about specific local communities. Today, we are seeing some early moves to address such issues in other settings. At Socos Lab, research by theoretical neuroscientist and entrepreneur Vivienne Ming describes the financial and social inequities resulting from racial and gender bias in recruiting and hiring processes and suggests how to use artificial intelligence to develop smart talent acquisition tools that can prevent bias.

As we look ahead to an increasingly data-rich world, another possibility is that students could gain the rights to own their own data instead of having it be owned, monetized, and potentially exploited by the corporations that own the platforms and apps with which they interact. If that happened, data asset advisors could help students and their families manage, present and exchange data related to students’ learning, locations and device and platform usage. In addition, these trusted data stewards could have fiduciary responsibility for safeguarding the integrity of educational data systems. They could collaborate on security and defense mechanisms to prevent unauthorized third-party data collection. While such roles do not yet exist, new laws for data protection signal a possible move in this direction. Among them, California’s AB 375 law puts in place a variety of powerful protections against consumers’ having their data bought and sold without their knowledge. In addition, the European Union’s General Data Protection Regulation aims to give citizens more control over their data.

Another possibility in this space is that follow-me schools could emerge. Composed of place-based learning opportunities, mentor and coach networks and digital resources, these flexible, self-organizing schools could help students experiencing educational disruption due to family instability, mobility challenges, homelessness or climate dislocation achieve continuity in their learning. These schools could use smart contracts, learner resource accounts and searchable, customizable repositories of vetted educational options to configure themselves around students wherever they were. Today, School in the Cloud provides an early indicator of this possibility through the creation of self-organized learning environments where anyone, “no matter how rich or poor, can engage and connect with information and mentoring online.”

Such possibilities promise to respond to the drivers of change shaping the future of learning while helping to support the healthy development of young people, enable effective lifelong learning and contribute to community vitality. We cannot know whether they will come to pass, but we can consider whether they might help us achieve more of what we want for learning – and for all learners.

How might safeguards for efficacy and the more specific possibilities described in this post benefit your organization or community? What first steps could you take to begin exploring their potential further?

For help making sense of future possibilities in your context, see KnowledgeWorks’ audience-specific Discussion Guides for Forecast 5.0: Navigating the Future of Learning.