Student-teacher interactions empowered by equitable AI computing
Schools, teachers and students can now own and run their own AI. The next generation of students will have first-hand prompt-engineering skills, learning how and what to ask and generating better explanations of the world around them. At the same time, the access gap is widening and inclusivity needs to be reinforced. Dedicating development time to neurodiverse learners, and to improving engagement between teachers and students and among students in the classroom, is not yet a common thread in AI R&D.
In the Computer Science Department at University College London (UCL), undergraduate and master's students use AI to help solve society's challenges. Four teams of undergraduates are presenting their EdTech innovations for schools. Supported by Intel, IBM, MotionInput Games, and the National Autistic Society, the teams are building prototypes of assistive technologies and proofs of concept that enhance student-teacher interactions. These projects are:
- a reading app that turns improving a learner's pronunciation into a karaoke-style game
- a tool that uses AI to encourage and measure class participation
- a reading accessibility app that makes books scalable and provides accompanying augmented reality images
- touchless technology that can turn any TV into a whiteboard
Presenting these projects are:
- Peter Ling - Final Year Computer Science Student (UCL)
- Sahil Gaikwad - Final Year Computer Science Student (UCL)
- Jack Chen - Final Year Computer Science Student (UCL)
- Aishani Sinha - Second Year Computer Science Student (UCL)
- Ediz Cinbas - Second Year Computer Science Student (UCL)
- Pranay Vaka - Second Year Computer Science Student (UCL)