Oxford Professor Outlines 3 Ways Schools Can Empower Students to Shape AI Responsibly, Not Just Adapt to It
As artificial intelligence reshapes classrooms, workplaces, and everyday life, Professor Rebecca Eynon from the Oxford Internet Institute and the University of Oxford’s Department of Education is urging schools to rethink how they prepare students for the future. Rather than simply teaching students to adapt to AI, she argues, schools must equip them to actively shape its development.

Eynon warns that many young people are not as digitally literate as commonly assumed. Her research through the Towards Equity-Focused EdTech project revealed that students often struggle with basic digital tasks like managing files or sending emails. Meanwhile, teachers remain uncertain about how or where to integrate digital literacy into the curriculum. She calls for a shift from reactive to proactive education—one that empowers students not just to use AI, but to understand and influence it. According to Eynon, this requires three key changes in how schools approach AI education.

First, schools should teach criticality, not just coding. While technical skills are important, they should be paired with a deeper understanding of the social, political, and economic forces behind AI. Students need to learn how bias is embedded in algorithms, how tech companies profit from user data, and how misinformation spreads. This helps them move beyond being passive users and become informed, questioning citizens who can challenge and shape technology.

Second, AI education must be designed with inclusion in mind. Eynon emphasizes that design is a core part of digital literacy. She advocates for hands-on projects that allow students to explore real-world issues—such as algorithmic bias or inequitable access to technology—and develop digital tools that serve their communities. By connecting technology to social realities, these projects help students see themselves as creators and changemakers, not just consumers.

Third, responsibility for addressing AI’s challenges should not fall solely on students. Eynon warns against placing the burden of fixing flawed systems on young people. Instead, she stresses that governments, educators, and tech companies must share accountability for the ethical, environmental, and legal impacts of AI. Students should be empowered to question and contribute, but not expected to solve systemic problems alone.

Ultimately, Eynon believes schools have a vital role in shaping a future where AI serves society equitably. By fostering critical thinking, inclusive design, and shared responsibility, education can help students become active participants in building a more just digital world—not just survivors of it.
