Despite the growing popularity of generative artificial intelligence to enhance instruction in K-12 schools and universities, many educators remain skeptical of GenAI-powered tools in today's education technology market, in part because of the lack of data privacy protections. To address these concerns, some education technology leaders are adjusting their technology development processes to include teacher input, as well as internal protocols focused on ethical use and data safety.
Amber Orenstein, senior vice president of product management, design and operations at education technology company Discovery Education, says many of her customers are turning to AI tools to help provide feedback to students and generate course content. However, data privacy concerns remain a major barrier, especially at the K-12 level.
As edtech companies continue to integrate AI into their products, Orenstein said, they should consider establishing internal policies and controls to ensure staff are developing safe and ethical tools. At Discovery Education, she said, the company has sought to adhere to data privacy and cybersecurity best practices.
“We have taken on the challenge of making the AI revolution safe and effective for teachers and students by focusing on internal controls, and we also created an AI policy,” she said. “We then created guardrails within specific teams to further ensure that AI was used appropriately within that team's work.”
As for how user data on AI platforms should be leveraged, Orenstein said Discovery Education recently launched an AI feature that collects data on students' problem-solving processes so that teachers can provide personalized feedback and guidance. She noted that providing rapid feedback to students is one of the key goals for educators as they adopt new ed-tech tools and AI programs, and that teacher input is a critical part of developing effective classroom tools.
“In the next few months, we will [also] be introducing new AI-powered formative assessment technology to a small number of educators,” she said. “Our goal is to iterate quickly and co-design [tools] in collaboration with experts in the field – the teachers who work hard every day to drive change and improve student learning outcomes.”
Brian Imholte, director of teaching and learning services at EPAM Systems, a software engineering services company, said EPAM has been speaking with school leaders, educators, and students across the country to understand their needs and concerns regarding GenAI tools in education. He said he has noticed that educators are more apprehensive than students when it comes to using AI tools, in part due to concerns about data privacy and AI hallucinations. Reducing AI-induced hallucinations remains a major challenge for developers of AI-driven educational tools, he said, noting that the company is working to develop more accurate and effective AI tutors.
“One of the big things we heard from teachers [about AI] was a lack of understanding and vision, and the immediate reaction is often, ‘Screw it!’ I think it’s a very different experience from the students’ perspective,” he said. “I think what we’re seeing from young people and young college students is a complete acceptance [of the technology]. Young kids in particular don’t really care about data privacy because they’ve grown up in a world where all data is inherently shared.”
Charles Thayer, chief academic officer at online curriculum provider Lincoln Learning Solutions, also cited AI hallucinations and data privacy as his top concerns.
“Artificial intelligence [tools] sometimes hallucinate or make things up,” he said. “If you’re not prompting correctly, you can be fooled and things can quickly go sideways. … That’s why we’re taking a very deliberate approach to how we leverage the technology.”
Dave Whitehead, chief technology officer at Lincoln Learning Solutions, said that when it comes to data privacy, AI tools need to be developed to protect personal information from unauthorized access, and he reiterated the need to ensure AI education technology tools are designed with data privacy in mind.
“Technology companies that use AI to enhance their products and services must be aware of not only user expectations but also the potential risks and threats to their data and systems. AI cybersecurity, data and privacy practices include protecting data sources and storage, ensuring the quality and validity of the data used to train and test AI, protecting learner and educator identity and privacy, and being transparent about how the AI works and what it does,” he said, adding that companies also need to comply with changing local, state and federal regulations related to AI.