This scene from Good Will Hunting (1997) hits different these days:
For the video-averse crowd, here’s the context.
Sean Maguire (played by Robin Williams) is talking to Will Hunting (played by Matt Damon). Sean is Will’s therapist.
The subject of the conversation: Will just psychoanalyzed Sean’s painting to suggest that he married the wrong woman. (She had passed away quite a few years earlier.) Basically, Will knows how to push his therapist’s buttons.
And in case it weren’t obvious in the clip, Will is a genius. He’s a genius’s genius. He can look at a book and immediately memorize it. He can look at a math problem and immediately solve it. He can look at a painting and immediately analyze it like a top critic.
Basically, he’s Neo from The Matrix (1999):
But instead of hooking himself up to a machine, Will just reads a book. His mind digests all of the information, synthesizes it, and applies it.
The problem?
As this scene dramatizes, the endless pursuit of knowledge doesn’t add up to experience.
Will’s never been outside of Boston. He cannot appreciate what it’s like to stand in the Sistine Chapel, or to be truly vulnerable with another person.
Hold up…Why Are We Talking about Good Will Hunting?
Here’s my thought.
Will is AI. Sean is us.
We have a person who can digest information, seemingly infinitely. He comes face-to-face with a therapist who cannot even imagine what that is like.
This is their empathy gap. Neither Sean nor Will can fully understand what it’s like to be the other person.
Now…AI.
No matter how much it advances (don’t yell at me), AI will never replace these experiences:
The experience of writing ourselves—word by word, sentence by sentence, paragraph by paragraph—and having ideas bubble up as we write.
The experience of watching a movie, scene by scene, with little idea of what is going to happen next.
The experience of putting a paintbrush to canvas, making decisions as we go along.
The experience of learning to play an instrument, through hours and hours and hours of practice and failure.
These kinds of experiences defy shortcuts. They cannot be shortened, packaged, and delivered to our doorstep.
Shortening an experience changes the experience.
Because in the end, experiencing something is about process rather than product.
Ok, But What About…?
I hear you.
What if someone created an AI so powerful that it can make us feel emotions and experiences as if they were really happening to us?
Great question! To philosophy we go!
(Sorry, not sorry).
In 1974, the philosopher Thomas Nagel had an idea. He wrote all about it in his awesome essay, “What Is It Like to Be a Bat?”
The idea is pretty simple.
A person could spend 100 years studying bats from every conceivable angle. They could live with bats. They could read everything ever written about bats. They could attach electrodes to bats’ heads and learn everything about their brains. They could create VR simulators (I updated this example a bit) and spend every waking moment living like a bat in that simulator.
They could become the greatest expert on bats ever in the history of ever.
BUT they will never know what it’s like to be a bat.
There’s nothing they can do to truly live that experience. No matter what AI/VR program they use to live just like a bat, they will always have an asymptotic relationship to the bat’s experience. They can get really really really really close, but they will never know what it’s like to be a bat.
Ever.
Ok, fine. Let’s be honest. It’s not really about bats.
This example demonstrates something about phenomenology (the study of consciousness and experience).
No matter what I do, I cannot know what it’s like to be you. And you cannot ever know what it’s like to be me.
No matter what. No matter what we do. No matter how AI evolves.
This brings me to my (perhaps) knuckleheaded claim: AI cannot replace what we learn in the process of doing something. Technology cannot actually replace the experience of standing in the Sistine Chapel, because any AI/VR simulation will be framed by the technology itself.
It can get close. But no cigar.
It will always be a simulacrum.
What Does This Mean for the Classroom?
I’ve written about this idea before.
What if the classroom is about experiencing the beauty of human creation? I don’t mean “human creation” as a product. I mean experiencing the thoughts, ideas, and questions that pop up in the process of creating something.
What if the way forward is bringing that beauty front and center, and helping people process it?
(That process of creating could be with a machine or technology of any sort.)
In other words, the emerging technology is great for many things. But I still want to take my sons to the Sistine Chapel one day.
That’s Not It…There’s One More Thing
I wrote about this on LinkedIn. An objection immediately came up.
And I admit it: I could be wrong.
It is possible that, in the future (and maybe not even too far off), someone will create an AI program that really does substitute for real-life experience.
We hook ourselves up to it, and it’s so “good” that it effectively replaces the kinds of processes I write about here.
It’s possible.
But if that happens, my last worry will be that you’ll come across this Substack post and hold my feet to the fire.
I’ll have bigger things to worry about.
Bingo. I think you nailed it.
Neo "knew" Kung Fu. But he didn't learn it. I think we all realize that bit of the movie takes a serious suspension of disbelief. Muscle memory and the reactions necessary to be proficient in a martial art cannot be downloaded. It takes training. Training is experiential. Education is experiential.
I agree with you.
AI is missing what we call tacit knowledge: the experiences, intuitions, skills, and insights that we develop over time. Most tacit knowledge is not in any book, nor can it be taught to another person; you have to live and experience it to learn it. This is why we handle the edge cases in self-driving-car scenarios better than AI does.
That’s why I always say we are more than books, articles, and posts on the internet; that part of us will be very difficult for AI to learn with its current approach.
While AI can process vast amounts of data and identify patterns, it struggles with the intuitive and experiential aspects of human understanding. This is why humans often excel in situations requiring creativity, empathy, and nuanced judgment—areas where tacit knowledge plays a crucial role.
Not every problem can be solved by mathematics or more data, but for now, unfortunately, that is what the AI industry seems to think.