Teach the Right Struggle: AI as an Exoskeleton for Kids

Parents are stuck with a bad choice. Let kids use AI freely and risk turning school into copy-paste, shallow focus, and helplessness when the tool is wrong. Ban AI and risk teaching for a world that no longer exists, where kids spend years grinding skills that may not matter much anymore.

I think there is a third path, but it is not just “do things manually first.” It is more selective than that: teach the right manual skills, skip the rest, then use AI as a power tool on top of that foundation. The goal is not to preserve every old method. The goal is to build the human abilities that let a kid steer AI, judge its output, and recover when it fails.

I am writing this as a parent of three elementary-school-aged kids, a co-founder of Ketshop (where kids learn money by living through earning and spending), and a member of the Tahoe Truckee Unified School District AI Board Policy Work Group.

TLDR

AI should work like an exoskeleton. It should amplify skills kids already have. So the job of education is to choose the manual skills that build real judgment and independence, then stop drilling once those skills are there.

What I learned from the people sounding the alarm

A few voices have been especially helpful for me in thinking this through.

Nate B. Jones, an AI professional, argues that kids should do real work on paper first, then use AI to build things and learn how to give clear instructions. A Fortune piece makes a broader claim that when schools replace too much reading and writing with screens and edtech, student focus and outcomes can suffer. And researchers have been studying related questions from different angles, like how handwriting and typing engage the brain differently, and how reliance on AI can cause “cognitive offloading,” where the tool repeatedly does the hard part and the student loses skill over time.

Those sources do not all agree on every detail. But they point to the same core risk: if we let tools do the thinking too early, we should not be surprised when kids struggle to think without them.

The Exoskeleton Model, updated

AI is not just another app. It can produce finished-looking work on demand. That changes the temptation. A kid can skip the part where learning actually happens: the first struggle, the false starts, the revision, the check-your-work moment.

So I agree with the direction of the alarm, but I want to tighten the prescription.

The question is not, “Should kids do things manually first?”
The better question is, “Which manual skills are worth the time because they make AI safe and useful, and which ones are mostly tradition?”

If we treat every old method as sacred, we waste time and create resentment. If we treat AI as a shortcut for everything, we create dependency. The fix is a filter.

The two traps to avoid

  • Nostalgia trap: treating every old method as mandatory. This turns learning into busywork and steals time from higher value learning and building.
  • Autopilot trap: letting the tool do the hard part. This creates dependency and weak judgment.

My filter: the “one layer down” rule

You do not need to master every layer under a tool to use it well. But you do need the layer right below the button you are pressing.

That means the manual skills you teach should have a clear purpose. They should help a kid:

  • steer the work (give clear direction),
  • judge the result (spot weak, wrong, or empty output),
  • recover (revise, fix, or try another path).

If a manual skill does not serve those goals, it is a candidate to shorten, simplify, or skip.

The policy that changes behavior

Attempt before augmenting.

This is the same principle we built into Ketshop: you learn faster when you feel the real feedback loop.

Before AI, the kid produces a first pass. Not perfect, just real: a rough outline, a short answer, a plan, a guess plus a quick check.

Then AI is allowed to help: polish, reorganize, suggest options, explain mistakes, help test.

This keeps learning inside the kid while still letting AI accelerate progress.

What this looks like in practice

This is the difference between “manual first” and “manual on purpose.”

Writing
Kids need enough writing experience to know what strong writing looks like, so they can judge and revise what a tool produces. AI can help polish and reorganize, but the kid stays the editor in charge.

Math
Kids need enough math skill to tell when an answer is reasonable, and to notice when something is off. AI can help show methods and explain errors after the kid tries.

Coding and building
Kids can use AI early for building things because building forces decisions and iteration. But they still need to be able to explain what they want, test what they got, and fix what is broken.

Sources I am drawing from

Nate B. Jones: Why My 10-Year-Old Does Math by Hand. This is a video by an AI professional explaining his approach to parenting in an AI-driven world. He details how he makes his daughter do math on paper and read physical books to build a strong brain first. Once she does the hard work, he then lets her use AI to actively build things like video games, teaching her how to give the machine very specific instructions rather than letting it do her thinking for her.

Fortune: Why American Schools are Broken. This article explains the negative impact of replacing traditional schoolwork with digital screens and educational tech. It points out that when schools removed physical books and paper in favor of tablets, students’ reading and math scores plummeted. It serves as strong evidence that removing the natural “struggle” of analog learning harms a child’s ability to focus and retain information.

Frontiers in Psychology: Handwriting vs. Typewriting (2024). This is a scientific study in which researchers measured brain waves to see the difference between writing and typing. They found that carefully forming letters by hand activates large, interconnected networks in the brain that are responsible for memory and learning. Typing on a keyboard engages far less of the brain and does not create these same important pathways, which is why kids still need to write with a pen.

The Cognitive Paradox of AI in Education (2025). This research paper looks at what happens when students rely on artificial intelligence too early in their education. It explains a concept called “cognitive offloading,” in which the brain gets weaker when a machine does the heavy lifting. The study warns that if kids don’t first learn how to struggle through problems on their own, they will eventually lose the ability to do complex, independent thinking.

The point, in one sentence

Teach kids to be directors: build the right manual skills, skip the rest, require a first attempt, then let AI amplify what they can already do.