I'm 16 and I Use AI Every Day. Here's What Most People Are Missing.
Growing up in the middle of the fastest technological shift in modern history is a strange experience. Every day I hear two completely different versions of reality. My teachers tell me AI is an easy way out, a crutch that will make me less capable in the long run. Meanwhile, my dad, a software engineer with 30+ years of experience, uses AI daily to improve his output and free up time for higher-level work. Same technology, opposite conclusions.
Now obviously, we all carry biases even if we don't think we do. Mine come from watching my father's career up close. But I've also spent time looking at what the research actually says, and it tells a more interesting story than either side usually admits.
Here's where educators tend to stop listening. They hear "I use AI" and immediately think "shortcut." But a study from Harvard Business School and Boston Consulting Group found that when consultants used AI on tasks it was well suited for, the quality of their work rose by roughly 40% compared to those who didn't use it. That's not a shortcut. That's a tool making people meaningfully better at what they do. If a 40% improvement showed up in any other context (a new teaching method, a new textbook, a new curriculum), educators would be celebrating it, not banning it.
To be fair, the other side has evidence too. A study in the British Journal of Educational Technology found that college students who revised essays with ChatGPT improved their writing more than any other group, outperforming even students who worked with human writing coaches. But here's the catch: those same students retained less knowledge about the topic they were writing about. Better output, less learning. That's the study educators love to cite.
But here's what that argument misses. Those students were handed ChatGPT with zero guidance. No one taught them how to use it as a thinking tool. No one said "use this to check your reasoning" or "use this to break down a concept you're stuck on." They used it the way anyone uses a tool when they've had no training: poorly. That's not an AI problem. That's a teaching problem. If you hand a kid a calculator before they understand multiplication, they won't learn anything. But nobody is arguing we should ban calculators.
From what I see, the biggest issue surrounding AI in education is how people are being introduced to it. I hear my teachers talk about how harmful AI is, but their main experience with it is being forced to teach from a poorly designed curriculum where AI handles all the grading. Think about that. The people telling students AI is bad are forming that opinion from a system they had no say in, one that was implemented badly and stripped away the parts of teaching they actually care about. Of course they hate it. They were handed the worst possible version of AI and told it was the future.
And there are real limits. Good teaching involves knowing when to push a student and when to let them struggle, reading the room in a way no algorithm can. Teachers structure how you encounter information, challenge you at the right moments, and adjust based on who you are as a person. AI is not going to replace that. We aren't at that point now, and I don't believe we ever will be.
But what most people haven't figured out is that replacing humans was never the point. AI is a tool meant to help us do more with what we already know.
At 16, I've used it to build a coordination app that helps my family manage who watches my younger brother and when. I've built tools that support my dad's options trading strategy. These are real, functional applications that solve real problems, the kind of projects most people don't touch until college or later.
If there's one thing you take away from this, it's that the conversation around AI is broken. Not because people disagree, but because they're not listening to each other. My teachers aren't wrong for worrying about what AI does to learning. My dad isn't wrong for using it to be better at his job. And I'm not wrong for building real things at 16 that create value for the people around me.
The pushback is good. It means we're taking this seriously. But fear isn't a strategy. Teaching people how to use AI responsibly is. And right now, the people who understand that the least are the ones making the rules about it.