Artificial intelligence (AI) systems, tools, and techniques are being rapidly adopted into devices and products that youth interact with daily at home and in school. Educators are beginning to recognize the need to teach K-12 students about AI and are developing novel programs, such as robotics, to address this need. However, AI ethics, despite its critical importance to youth's future personal, civic, and work lives, has been largely neglected. This project develops and integrates AI ethics teaching modules into existing youth robotics programs. As students learn about robotics, they will also learn about emerging ethical issues in the design of AI-enabled robots, including fairness, transparency, autonomy, respect, accountability, privacy, and security. The project aims to promote students' sense of responsibility for AI ethics, alongside their AI skills, through two types of innovative interventions: engaging learners in (1) stories based on AI ethics issues likely to be meaningful to young adolescents, such as surveillance of their physical activities, and (2) empathy-driven hands-on activities with an embedded ethical dilemma, in which students experience the process of committing an ethical error, identifying it, and fixing it, potentially developing a stronger sense of ownership of ethical issues and agency to address them. The research will contribute to understanding the effects of integrating AI ethics into STEM programs such as robotics, particularly regarding students' engagement in, awareness of, knowledge about, and sense of responsibility for a range of emerging ethical issues underlying the use of AI in STEM and CS fields. It will provide a new model for teaching students about AI, integrated with STEM concepts in a contextualized and relevant manner, by elevating ethics to a primary concern rather than a side issue.
This project is funded by the STEM + Computing (STEM+C) program that supports research and development to understand the integration of computing and computational thinking in STEM learning.
The research questions driving this project are: (1) How does integrating AI ethics into a robotics program help develop students' ethical thinking while deepening their learning in STEM and CS? (2) How do empathy-driven hands-on activities with embedded ethical dilemmas help cultivate students' sense of responsibility for AI ethics? (3) How do teachers in various contexts integrate AI ethics modules into their own robotics programs? Over the course of three years, AI ethics modules will be formatively designed and studied to assess their effects on students' AI ethics reasoning along with their STEM and CS skills. Researchers will also study how educators in different formal and informal contexts integrate these AI ethics modules into their robotics programs. Seventy-two middle school students will participate in a between-subjects research study while attending a 5-day robotics camp under one of three conditions: a camp program without AI ethics, a camp program with an AI ethics module separate from the robotics program, and a camp program with robotics integrated with AI ethics. Researchers will also conduct a mixed-methods study of 50 robotics educators who have agreed to participate in an online community, examining how these teachers integrate AI ethics modules into their own robotics programs. The resulting AI ethics modules will be disseminated widely through partnerships with organizations and schools, as well as through the project website. Given AI's far-reaching implications for society, this project aims to help students develop a sense of responsibility for AI to complement their AI skills and knowledge, so that more students will be prepared to handle ethical issues in their future professional practices and to guide the inevitable AI-driven societal transformation in a safer, fairer, and freer direction.
This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria.