Smart Technologies in the Era of COVID-19

A Q&A with engineering education researcher Kate Goodman

Topics: Future of Learning

In Navigating the Future of Learning: A Strategy Guide, we discussed both the interest in using smart technologies in education and the challenges inherent in doing so. The reach of education technology has dramatically increased as schools and postsecondary institutions have been forced to adopt remote learning, so we wanted to revisit this concept and consider which pre-COVID-19 challenges remain relevant and which new challenges and opportunities have emerged since the pandemic began.

Kate Goodman is an engineering education researcher who leads an innovation initiative at the University of Colorado Denver and who attended one of the workshops that informed the strategy guide. She shared her thoughts on foundational and emerging issues related to smart technologies in education.

How has thinking or efforts around smart technologies in education changed due to the pandemic?

First, I want to point out that not all technologies used for education are in fact “smart” technologies. While the definition is hard to pin down, most of the time smart tech refers to a system that uses constantly updating data, fed back into an algorithm, to provide a customized experience. By contrast, many education technology systems, or ed-tech, collect that data but don’t necessarily update the user experience based on it, at least not in a way that educators or learners can see.
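
To make that distinction concrete, here is a minimal sketch. The class and method names are entirely hypothetical and do not refer to any real product: one system only logs interaction data, while the “smart” one feeds each new data point back into a deliberately simplistic model and changes what the learner sees next.

```python
# Minimal sketch of the distinction above. All names are hypothetical.

class LoggingEdTech:
    """Collects interaction data but never changes what the learner sees."""

    def __init__(self):
        self.log = []

    def record(self, correct: bool):
        self.log.append(correct)  # data goes in...

    def next_activity(self) -> str:
        return "the same fixed lesson for everyone"  # ...but never comes back out


class SmartEdTech:
    """Feeds fresh data back into an algorithm to customize the experience."""

    def __init__(self):
        self.estimated_mastery = 0.5  # crude running estimate of skill

    def record(self, correct: bool):
        # Constantly updating data fed back into an algorithm: here, a
        # simple exponential moving average stands in for the model.
        self.estimated_mastery = 0.8 * self.estimated_mastery + 0.2 * (1.0 if correct else 0.0)

    def next_activity(self) -> str:
        # The experience actually changes based on the data.
        if self.estimated_mastery < 0.4:
            return "review activity"
        if self.estimated_mastery < 0.7:
            return "practice activity"
        return "challenge activity"


if __name__ == "__main__":
    smart = SmartEdTech()
    for outcome in [True, True, False, True]:
        smart.record(outcome)
        print(round(smart.estimated_mastery, 2), "->", smart.next_activity())
```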

The pandemic has forced us all to rely on ed-tech far more than in the past, and to deploy it at a speed rarely seen in education. This urgency to expand access is necessary: we must provide equitable education for every student. But the ed-tech that can be deployed the fastest may not always be the best.

What questions do you have about those efforts?

The first challenge is that existing smart technologies were developed for business or personal use cases, and the fundamental goal of education is not aligned with the goals of business or even personal usage. In my view, the goal of education is to create useful struggle so that learners come to understand things they did not before. Error-driven learning is one of the most powerful ways our brains change and learn. The goal of most other uses of smart technologies, by contrast, is to streamline tasks and eliminate struggle. From a learning perspective, the biggest risk of instant information is not that it might be wrong; it is that no matter what, there is an answer, even if it is not a good one. There is a gap between wondering about something and finding an answer. That gap is where curiosity grows. That gap is what leads to original insights.

The second challenge relates to how data shapes those experiences. The promise of personalized ed-tech is that the system will document my behavior and interactions and give me an individualized path, one that truly suits my learning needs and strengths and keeps my curiosity fueled without overwhelming me. The danger is that the system will “bucket” me based on a few key indicators, while important factors that aren’t measured go unnoticed. Without meaning to, we could replicate some of the problems caused by high-stakes testing, where one bad day eliminates swaths of options.
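
As an oversimplified illustration of that “bucketing” danger (the function name and thresholds below are hypothetical), a single measured indicator can gate the entire recommended pathway, and anything the system does not measure cannot influence the outcome at all.

```python
# Hypothetical sketch of "bucketing": one measured indicator determines
# the whole pathway, and unmeasured factors cannot influence the result.

def recommend_pathway(test_score: int) -> str:
    # One bad day -> one low score -> whole branches of options disappear.
    if test_score < 60:
        return "remedial track"
    return "standard track"

# A learner's curiosity, persistence, or illness on test day are not
# inputs at all, so the algorithm cannot account for them.
print(recommend_pathway(58))  # a 58 on one bad day -> "remedial track"
```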

So how can we manage those challenges?

We need to work with the companies developing these technologies and make sure education researchers, parents, educators, and students are co-designing these systems. We need to follow good codes of conduct around these collaborations, not only to address the ethical concerns that have already surfaced, but also to protect the co-designers themselves. They are content experts whose time and expertise should be remunerated! These co-designers need to come from many communities, cultures, ethnicities, and socio-economic backgrounds. We create algorithms that replicate bias when we consider only small subsets of people as we design.

We also need to hold ed-tech companies accountable for how they use data. Companies aggregate and reuse data all the time, because such data sets are extremely valuable both for training their algorithms and as commercial assets. To date, the legal questions have usually centered on copyright and on who has the right to see a specific learner’s data. As these technologies move into more educational spaces, however, the larger issues should be privacy and ethics, particularly when it comes to surveillance. Remember that these tools are being used with minors who do not have the ability to opt out. Even among college students, the choice has essentially been made for them by the college. As individual adults, we may realize we’re clicking away our privacy when we say “I agree” to unread user license agreements, but students are compelled to use these systems by their education institutions.

What’s a common misconception you think people hold about incorporating smart technologies into education?

People think more technology in education means fewer humans in education. We may actually need more humans. If a student is using educational smart technology and having aspects of their learning journey influenced or determined by algorithms, we must have a well-trained educator as part of that feedback loop. That person’s primary job would be to interact with students and understand their needs. As personalized learning options are offered to learners through technology platforms, these educators would help ensure that the options make sense and aren’t replicating bias, as so many algorithms do. And a version of this role would need to exist not just at the classroom level but at the institutional level, to verify that certain populations are not disadvantaged by automated recommendations.

Navigating the Future of Learning: A Strategy Guide is designed to help K-12 educators, postsecondary education institutions, and community-based learning organizations take action with the future in mind.