We’ve Confused Access to Information With Actual Knowledge
We’ve confused access to information with actual knowledge.
And it’s getting worse.
We watched it happen in real time. First, it was a handful of people experimenting with AI like it was some kind of forbidden tech. Then it became normal. Then it became default. Now it’s Uncle Fred screenshotting what he thinks are hard facts and forgetting to crop out the “how do I reply to…” at the top.
And somehow… that counts as research.
This is where it gets messy. Because I’m about to say something that sounds dramatic until you really sit with it:
A whole generation knows how to find answers.
But not how to think.
When Information Sounds Like a Good Idea
Let me give you a stupidly perfect example.
I’ve been trying to eat healthier. Doing the whole “make better choices” thing. But the movie theater? That’s my downfall. It’s basically Temptation Island with surround sound.
So naturally, I did what any sane person would do—I googled “healthy snacks to sneak into a movie theater.”
And Google, powered by its shiny AI brain, told me to bring a watermelon. A whole watermelon.
Now yes, technically I could cut it up. But I need you to imagine sitting in a packed theater next to someone elbow-deep in a Ziploc bag of watermelon during a romcom. That’s not a snack. That’s a social crime (and a disgusting one at that).
That’s information.
It exists. It was offered. It is, in some universe, an option.
Should it have been? Absolutely not.
And yet there it was, technically correct and wrong at the same time.
So I scrolled past the nonsense and went where people with actual lived experience tend to gather: Reddit. There, I found trail mix suggestions that were stealthy, not messy, and didn’t make me look like I was assembling a fruit salad by hand mid-movie.
And this is where the line gets dangerous, because information isn’t the problem. The problem is the assumption that information equals truth or correctness.
Technically Correct. Completely Wrong.
Let’s talk about the infamous glue pizza situation.
People were trying to figure out how to get cheese to stick better to pizza. Google AI Search, in all its brilliance, suggested adding non-toxic glue.
And here’s the wild part: it’s not entirely wrong.
In theory, glue would make the cheese stick better. And if it’s non-toxic, you’re probably not ending your night in the ER.
So technically? Correct. But also… What the actual hell are we doing?
Because this is where knowledge steps in. Even the smallest amount of real-world understanding kicks in and says, “Maybe we don’t add Elmer’s glue to our dinner.”
That pause? That questioning? That sanity check?
That’s knowledge.
When We Used to Learn How to Think
We used to be taught this.
Think back to math class. You didn’t just write the answer; you had to show your work. Not because teachers loved extra steps.
But because the process mattered.
The path from point A to point B told you:
if the answer was correct
why it was correct
and where it broke if it wasn’t
That’s knowledge.
And we’ve replaced it with equal parts speed and laziness.
Now it’s:
fastest answer wins
most confident-sounding response wins
it’s AI, so it must know best?
And people stopped asking: Is this right? Does this make sense? Should I trust this?
A Generation Trained to Find Answers
Now zoom out for a second.
Because this isn’t just about watermelon suggestions or glue on pizza.
This is where it starts to scale into something bigger.
We now have a generation graduating from high school and college that has had AI access for most of their education. They absolutely know how to find answers. That part isn’t the issue.
But knowing how to think through those answers? That’s a different skill entirely.
Because life is not the fastest route from question to answer.
It’s asking: Is that answer correct? Why is it correct? What supports it? Where could it be wrong?
And that part is getting skipped.
AI will hand you something that looks complete. It will give you structure, confidence, even citations that look legitimate at a glance. But dig even a little deeper and sometimes those citations lead nowhere, or worse, loop back into something that only sounds credible.
And when that becomes the norm, you don’t just lose accuracy.
You lose the habit of questioning.
Most of the time now, the same tools being used to generate answers are also being used to grade them. So the system reinforces itself. Output becomes validation, whether it holds up or not.
And the world they’re stepping into looks completely different from the one we came up in.
Entry-level roles, the ones that used to teach judgment, pattern recognition, and how things actually work, are disappearing or being replaced by AI. So now you’ve got people entering the workforce with access to information, but fewer opportunities to build real knowledge.
That gap doesn’t stay theoretical for long.
It shows up in how decisions get made.
And it compounds fast.
If you want to go deeper on what this actually means long-term, Big Left founder James Ellis breaks it down in The Coming Leadership Drought. It’s worth the read, but be warned: it may keep you up at night pondering what the future holds.
This Was Never Harmless
But you don’t have to look that far ahead to see the impact.
This is where it stops being a weird internet problem and starts becoming a business problem.
Because this doesn’t stay in harmless Google searches or half-baked AI summaries.
This shows up in:
budget spreadsheets
startup cost projections
engineering decisions
grant writing
And we’ve already seen what happens when people use AI instead of actual expertise to make those calls. They get information. Sometimes a lot of it. But they don’t have the knowledge to verify whether any of it is right.
And that’s the part most people skip.
Machines are not perfect. Tech is not perfect.
And once you add the human element (bad prompts, shallow understanding, blind trust, people grabbing the first answer that sounds polished), it gets messier fast.
What AI thinks a human needs and what a human with actual experience knows is needed are not the same thing. AI can generate an answer. It cannot guarantee that answer makes sense in the real world.
Information Doesn’t Make Decisions. You Do.
That matters even more in business, because business is not about access to answers. It is about making decisions. And decisions require more than information. They require judgment, context, pattern recognition, and the ability to look at something and say, “This technically works, but it’s still wrong.”
We are already seeing the cracks. Confident strategies built on shaky logic. Clean-looking outputs mistaken for correct ones. People moving fast because they found an answer, not because they understood it.
And that is the uncomfortable truth underneath all of this: you do not need more information. You need the ability to question it, check it, and understand it well enough to know when it is incomplete, misapplied, or flat-out nonsense.
This isn’t about rejecting AI or pretending technology is the enemy. It’s about not outsourcing your thinking. Because the moment you do, you lose the ability to tell the difference between something that sounds right and something that is right.
And in business, that difference is everything.
Because the cost of getting it wrong is not a weird snack suggestion or a bad dinner recommendation. It is money. Time. Reputation. Sometimes far more than that.
If you are running a business, you have probably already felt this. The answers are everywhere, but clarity is not. And that gap, between having information and actually knowing what to do with it, is where people get stuck.
That is also where Big Left works with founders. Not at the surface level where answers are easy to find, but at the level where decisions actually get made. Where information has to be translated into something usable, accurate, and grounded in reality.
The Strategic Working Session is not about getting more random input.
It is about figuring out what actually makes sense for you, your situation, and your business.
Because access to information is not the advantage anymore.
Knowing how to think is.