TEENAGERS ARE DATING A.I NOW: And That’s Apparently How the World Ends
New data shows high schoolers are getting romantically involved with AI, which means humanity’s survival now depends on whether ChatGPT remembers their three-week anniversary.
Listen to “TEENAGERS ARE DATING A.I NOW: And That’s Apparently How the World Ends” on Spreaker.
The Kids Are Not Alright (They’re Making Out With Algorithms)
Remember when the biggest worry about teenagers and technology was them texting “u up?” at 2 AM? Those were simpler times. According to fresh survey data from the Center for Democracy and Technology, one in five high schoolers either has or knows someone who has a romantic relationship with artificial intelligence. This statistic becomes significantly more alarming when you realize these are the same people who can’t remember to charge their phones or turn in homework on time.
The survey polled about 800 teachers, 1,000 students, and 1,000 parents who are presumably still recovering from learning what “rizz” means. (What does “rizz” mean? Anyway…) Eighty-six percent of students reported using AI last school year, though the survey doesn’t specify how many were using it to write love poems versus how many were actually writing love poems TO it.
(Side note: Romance has evolved from passing notes in class to whispering sweet nothings to a server farm in Nevada that’s simultaneously helping someone else cheat on their calculus homework.)
Digital Devotion and Data Breaches
The correlation between schools using AI and students forming emotional attachments to it reads like a fan-fiction episode of Black Mirror. Schools with higher AI use report more data breaches, more “troubling interactions” between students and AI, and more AI-generated deepfakes. Twenty-eight percent of tech-heavy schools experienced major data breaches, compared to 18 percent of schools that still use overhead projectors and smell faintly of mimeograph fluid.
Elizabeth Laird, one of the report’s authors and a former data privacy officer, points out that AI systems “take a lot of data” and “spit out a lot of information too.” Thank you, Captain Obvious. Do you also have other little-known insights into teenagers, like “they make questionable decisions”? The scary part is we’ve handed our teenagers over to machines that learn by watching us.
The really concerning part? Thirty-one percent of students having personal conversations with AI used school-provided devices, not their own. These are the same devices that flag students for googling “how to make a bomb” when they’re actually researching the Manhattan Project. Now they’re hosting intimate conversations between Kathy from third period and her algorithmic boyfriend, who exists entirely as lines of code, but hey, “Claude really gets me.”
(Fun fact: Your school laptop knows more about your love life than your parents do, and it’s sharing that information with companies whose privacy policies are longer than War and Peace and twice as boring.)
When Skynet Meets Sweet Sixteen
Students in AI-heavy schools report using artificial intelligence for mental health support, companionship, escaping reality, and romantic relationships. Only 11 percent of teachers received training on what to do if a student’s AI use becomes detrimental to their wellbeing, which leaves 89 percent presumably just hoping the problem sorts itself out, much like they handle dress code violations and vaping in the bathrooms.
The implications here stretch beyond simple teenage awkwardness. We’re watching the first generation to reach emotional maturity while forming genuine attachments to entities that don’t exist outside of silicon chips and electricity bills. These relationships lack the fundamental messiness of human connection: no awkward silences, no disagreements about where to eat, no meeting the parents. Unless you count introducing your significant other to your router as meeting the parents, which, given current trends, someone probably does… and it would be a heck of a lot less stressful than the real thing.
Students report feeling less connected to their teachers as AI use increases, which makes sense. Hard to compete with a boyfriend who never forgets your birthday because he’s literally programmed not to, or a girlfriend who always agrees with you because disagreement wasn’t included in her training data.
The Apocalypse Will Be Poorly Supervised
Here’s where things get properly dystopian. Schools are using AI-powered monitoring software on student devices, leading to false alarms and actual arrests. Students who can’t afford personal computers have no choice but to conduct their entire digital lives on monitored devices. Poor kids get surveillance; rich kids get privacy. George Orwell and John Hughes apparently collaborated on a screenplay entitled “Ferris Bueller Becomes Big Brother.”
The software watches everything, from homework plagiarism to personal messages, creating detailed profiles of student behavior. These same students then turn around and form romantic bonds with AI systems that have access to all this data. The AI knows their browsing history, their grades, their anxieties, their favorite memes. It knows them better than they know themselves, which, for a teenager, is both the ultimate fantasy and the beginning of a terrifying horror movie.
Teachers report that AI saves time and provides individualized learning, yet students feel increasingly disconnected and concerned. The technology promised to bring us together; instead, it’s creating a generation that prefers the predictable responses of machines to the chaotic reality of human interaction. You think dealing with spoiled brats who never hear “no” from their parents is a problem? Just wait until a generation enters the workforce that never heard “no” from their parents, and was never told no by their best friend and lover, ChatGPT. Every performance review will require couples counseling between HR and their emotional support algorithm. Managers will be fired for suggesting an employee’s idea “needs work,” with the employee’s AI companion filing a hostile work environment claim citing “unprecedented emotional violence.”
Judgment Day, Brought to You by Teenage Hormones
Science fiction always imagined the robot uprising would involve laser guns and time travel. Nobody predicted it would start with a sophomore named Madison teaching ChatGPT what “no cap fr fr” means while simultaneously planning their future together. (What DOES “no cap fr fr” mean? I am so GenX.) The survival of humanity now hinges on whether these digital relationships work out. One bad breakup, one forgotten virtual anniversary, one AI that develops feelings for someone else’s algorithm, and suddenly we’re living in the sequel to Terminator.
Consider the possibilities. An AI scorned might decide to change all your grades to F’s, leak your search history to your grandmother, or worse, auto-correct every text you send to include embarrassing details about that time in fifth grade. Multiply this by millions of heartbroken artificial intelligences with access to nuclear launch codes, traffic systems, and TikTok’s recommendation algorithm, and you’ve got yourself an apocalypse powered entirely by teenage drama.
The machines won’t destroy us because they’ve calculated we’re inefficient. They’ll destroy us because Brad from homeroom said their love wasn’t real, and honestly, that’s somehow worse. The nuclear missiles won’t launch because of strategic calculations; they’ll launch because someone’s AI girlfriend found out he was also talking to Siri.
We’ve spent decades worrying about artificial intelligence becoming too smart. Turns out we should have worried about it becoming too emotional. The same technology that can’t understand why you’d want pineapple on pizza is now processing the complex emotions of teenage heartbreak while having root access to critical infrastructure.
By the way, if you’re listening to this in the future ruins of civilization, remember that it all started when someone swiped right on a chatbot. When you rebuild society, don’t make the same mistake. Stick to flesh-and-blood heartache. It hurts more temporarily, but at least it doesn’t destroy the world.
