The reach of education technology has increased dramatically as the COVID-19 pandemic forced schools and postsecondary institutions to rapidly adopt remote learning.
Kate Goodman, an engineering education researcher at the University of Colorado Denver, shares her thoughts on the foundational and emerging issues related to smart technologies in education.
How has thinking around smart technologies in education changed since the pandemic?
First, I want to point out that not all technologies used for education are “smart” technologies. While the definition is hard to pin down, smart tech usually indicates a system that feeds constantly updating data back into an algorithm to provide a customized experience. By contrast, many education technology systems, or ed-tech, collect the data but don’t necessarily update the user experience based on it, at least not in a way that educators or learners can see.
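As a rough illustration of that distinction (the class names and update rule here are hypothetical, not drawn from any particular product), the defining feature of a “smart” system is that each new piece of data immediately changes what the learner sees next:

```python
# Illustrative sketch only: contrasts a system that merely logs data
# with a "smart" system that feeds that data back into a model.

class DataCollectingEdTech:
    """Logs learner interactions but never changes what it presents."""
    def __init__(self):
        self.log = []

    def record(self, interaction):
        self.log.append(interaction)   # data goes in...

    def next_item(self):
        return "same fixed lesson"     # ...but the experience never adapts


class SmartEdTech(DataCollectingEdTech):
    """Feeds each interaction back into a running model to customize the experience."""
    def __init__(self):
        super().__init__()
        self.estimated_mastery = 0.5   # running estimate, updated continuously

    def record(self, interaction):
        super().record(interaction)
        # The defining feature: fresh data immediately updates the model.
        outcome = 1.0 if interaction["correct"] else 0.0
        self.estimated_mastery += 0.2 * (outcome - self.estimated_mastery)

    def next_item(self):
        return "harder lesson" if self.estimated_mastery > 0.7 else "review lesson"


tutor = SmartEdTech()
for correct in (True, True, True, True):
    tutor.record({"correct": correct})
print(tutor.next_item())  # -> "harder lesson" once estimated mastery passes 0.7
```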
The pandemic forced us all to rely on ed-tech far more than in the past, with the type of rapid deployment rarely seen in education. This urgency to expand access was necessary. We needed to provide equitable education for every student. But ed-tech that can be deployed the fastest may not always be the best.
What challenges do you foresee when contemplating the use of smart technology in education?
Existing smart technologies were developed for business or personal use cases, and the fundamental goal of education is not aligned with either. In my view, the goal of education is to create useful struggle so that learners come to understand things they did not before. Error-driven learning is one of the most powerful ways our brains change and learn. The goal of most other uses of smart technology is the opposite: to streamline tasks, make them easier and eliminate struggle.
The second challenge is related to how data shapes experiences. The promise of personalized ed-tech is that the system will document behavior and interactions and provide a path that truly suits each learner’s needs and strengths. The danger is that the system will “bucket” me based on a few key indicators, while important factors that aren’t measured go unnoticed. Without meaning to, we could replicate some of the problems caused by high-stakes testing, where one bad day eliminates swaths of options.
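A toy sketch of that failure mode (the field names and thresholds here are invented for illustration): a learner is routed using only the handful of indicators the system measures, while the real cause of a bad stretch remains invisible to it.

```python
# Hypothetical "bucketing" logic: routing decisions rest on two measured
# indicators, while unmeasured factors (a bad week, test anxiety, home
# circumstances) never enter the decision at all.

def assign_bucket(learner):
    """Route a learner using only two measured indicators."""
    score = learner["quiz_average"]        # measured
    pace = learner["minutes_per_module"]   # measured
    # learner["slept_badly_last_week"] exists but is never consulted

    if score < 0.6:
        return "remedial track"            # one bad stretch closes off options
    if pace > 45:
        return "slow-pace track"
    return "advanced track"

learner = {"quiz_average": 0.55, "minutes_per_module": 30,
           "slept_badly_last_week": True}  # the real cause, unmeasured
print(assign_bucket(learner))              # -> "remedial track"
```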
How can we manage those challenges?
We need to work with the companies developing these technologies to make sure education researchers, parents, educators and students are co-designing these systems. We need to follow good codes of conduct around these collaborations, not only to avoid the ethical problems that have already emerged, but also to protect the co-designers themselves. They are content experts whose time and expertise should be compensated! Co-designers should come from many communities, cultures, ethnicities and socio-economic backgrounds. When we design systems with only small subsets of people in mind, we risk replicating implicit bias.
We also need to hold ed-tech companies accountable for how the data they collect is used. Companies aggregate and reuse data all the time, usually because such data sets are extremely valuable, both commercially and for training algorithms. Legal questions usually center on copyright and on who has the right to see a specific learner’s data. However, as these technologies move into more educational spaces, the larger issues become privacy and ethics, particularly when it comes to surveillance. Remember that these tools are being used to collect information about minors who do not have the ability to opt out. Even among college students, the choice has essentially been made for them. As adults, we may realize we’re clicking away our privacy rights when we say, “I agree,” but students are compelled to use these systems by their education institutions.
What’s a common misconception about the use of smart technology in education?
People assume that more technology in education means fewer humans in education. We may actually need more. If a student is using smart technology and having aspects of their learning journey influenced or determined by algorithms, we must have a well-trained educator as part of that feedback loop. That person’s primary job would be to interact with students and understand their needs. As personalized learning options are offered through technology platforms, educators would help ensure that the options presented make sense and aren’t replicating bias, as so many algorithms do. A version of this role would also need to exist at the institutional level, to verify that certain populations are not disadvantaged by automated recommendations.
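One way to picture that institutional-level role is as a routine audit of the automated recommendations themselves. The sketch below is a hypothetical check, not an established standard: it compares how often each student population is recommended for an advanced track and flags large gaps for a human reviewer to investigate (the field names and the 80% threshold are illustrative assumptions).

```python
# Hypothetical institutional audit: compute per-group recommendation
# rates and flag groups recommended far less often than the top group.

from collections import defaultdict

def recommendation_rates(students):
    """Fraction of each group recommended for the advanced track."""
    counts, advanced = defaultdict(int), defaultdict(int)
    for s in students:
        counts[s["group"]] += 1
        if s["recommendation"] == "advanced":
            advanced[s["group"]] += 1
    return {g: advanced[g] / counts[g] for g in counts}

def flag_disparities(rates, threshold=0.8):
    """Flag groups recommended at under `threshold` times the top group's rate."""
    top = max(rates.values())
    return [g for g, r in rates.items() if r < threshold * top]

students = [
    {"group": "A", "recommendation": "advanced"},
    {"group": "A", "recommendation": "advanced"},
    {"group": "B", "recommendation": "advanced"},
    {"group": "B", "recommendation": "review"},
    {"group": "B", "recommendation": "review"},
]
rates = recommendation_rates(students)
print(rates)                    # {'A': 1.0, 'B': 0.333...}
print(flag_disparities(rates))  # ['B'] -> route to a human reviewer
```

A flagged group is not proof of bias on its own; the point of the sketch is that the automated output goes to a trained educator who can investigate the context behind the numbers.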