
Students need human relationships to thrive. Why bots can get in the way

In August, OpenAI released its latest system card for GPT-4o. It is a highly technical and rather sobering document describing the risks and safety issues that generative artificial intelligence could create or amplify. At the very bottom of the document, OpenAI lists the “societal impacts” it wants to study further. First on the list? Anthropomorphization and emotional reliance.

That’s a nice way of saying that bots sound increasingly human, and that people are therefore increasingly at risk of forming bonds with them. OpenAI admits its own catch-22: these improvements create “both a compelling product experience and the potential for over-trust and dependency.” In short: as the technology improves, the social risks grow.

Ed tech tools are not immune to this challenge. As AI floods the marketplace, tech companies and district leaders alike must start asking the hard question: If bots are increasingly built to mimic human relationships—if they are designed to sound human—are they also designed to help students connect with real people?

If not, AI tools risk displacing students’ human connections. That poses long-term risks to students’ well-being, their ability to maintain human relationships and their access to the networks that open doors to opportunity.

In a new report, Navigation and Guidance in the Age of AI, Anna Arsenault and I analyze whether and how this question is being addressed in AI-enabled college and career advising.

Related: Q&A: Putting AI in its place in an era of lost human connection at school

This is a domain where chatbots are rapidly being introduced. On average, high schools have one guidance counselor for every 385 students, and research shows that only 6% of secondary school counselors’ time is spent on career advising. Such scarce human resources create gaps that chatbots can fill, offering students on-demand, personalized advice about applying to college, graduating and starting a career.


Our report features insights from founders, CEOs and Chief Technology Officers at more than 30 technology companies building and deploying chatbots to support students in their college applications and careers.

Based on these interviews, OpenAI’s warnings about anthropomorphization—attributing human characteristics to non-human things—ring true. For example, most college and career bots have names and are designed to emulate happy, fun-loving personalities. Many go beyond informational support to provide students with emotional and motivational help when advisers cannot.

Do students rely too much on these bots? It’s too early to tell. But while most leaders we interviewed envision a system of hybrid advising that gives students access to both bots and human coaches, the majority admitted that some students are drawn to bots in the hope of avoiding human interaction altogether.

Related: Generative artificial intelligence can help teachers. Does it work for students?

In short, the possibility that students will become attached to and rely on bots instead of humans is very real.


Fortunately, some leaders are taking steps to build bots that foster relationships rather than just mimic them. Here are five examples of efforts to ensure that authentic human connection is an outcome, rather than a casualty, of AI products:

Promoting frequent social interaction offline: Axio AI, born from Arizona State University’s student-run Luminosity Lab, is an AI companion that supports students’ personal growth. The bot is trained to learn the relationships in students’ lives. If students tell the bot that they are having a hard time or are bored, it suggests reaching out to specific friends or family members. Axio has also worked to reduce over-reliance by limiting students’ time on the app.

Involve students’ family and friends: Uprooted Academy is a nonprofit organization that operates a virtual community center where students can interact with AI-powered coaches who help them apply to college. To ensure that these digital relationships do not replace real-life relationships, Uprooted Academy asks students to identify up to five supportive people in their lives when they enroll. The tool automatically sends these five people a text message every two weeks with recommendations to support students’ study progress.

Stimulating conversations – even the difficult ones: CollegeVine, an AI advising and tutoring platform, guides high school students through the entire application process. Through conversations with students, CollegeVine’s bot, Sage, tracks how they describe their interactions with counselors and teachers. When it comes time to ask for letters of recommendation, the bot can coach students on whom to ask and how to address any issues they have encountered with those adults.


Matching students and mentors: Backrs is a platform that recruits online volunteer mentors to coach high school students on projects related to their academic and extracurricular interests. The company recently released an AI success coach, Lubav, which helps students find the right mentors on the platform and message them.

Practice networking through online role play: CareerVillage.org’s Coach is a chatbot designed to help students practice asking for help. The platform includes a series of career development activities in which students practice interviewing with the bot and draft networking and job search emails, social media posts, and reference letters.

These examples highlight AI’s potential to strengthen human connections. However, the incentives to build relationship-oriented AI tools remain weak: few schools ask for these social features or evaluate tools for their social impact.

If things remain as they are, the more anthropomorphic bots simulate relationships across tutoring, advising and student support, the more they risk deepening student isolation.

Related: If teachers work together with tech makers, AI doesn’t have to be scary for schools

But that outcome is not inevitable. Research underlines the importance of relationships in youth development and economic mobility. To deliver on their mission, schools must prioritize human connection and ensure that AI tools work for, not against, expanding students’ networks. IT coordinators, procurement officers, regulators, principals and educators involved in purchasing new technologies should demand evidence that AI strengthens relationships and implement data systems to track that progress. Entrepreneurs who take steps to protect and expand connections should be rewarded for their efforts.

Otherwise, ed tech companies risk the same catch-22 as OpenAI: building artificial intelligence that keeps getting better, but at the expense of the human relationships students need to thrive.

Disclosure: Julia Freeland Fisher serves as an unpaid adviser to Backrs.
