Artificial Words

Technology makes us stupid.

I’ve written about this before. The availability of amazing tools that provide knowledge and organization allows us to stay on the ball without remembering anything. My car knows where I’m going and how to get there. My calendar will tell me what I’m doing this weekend. My phone knows how to contact everyone I care about, so I don’t need to know phone numbers. Technology helps us move the mundane things out of our brains so we can focus on more important ideas. Or, you know, watch more TikTok videos. But we have a choice about how to spend our time and cognitive energy.

Generative AI is the next step along that path. Maybe the mundane thing is writing a report or a recommendation letter or a lesson plan filled with “I can” statements that all tie back to state standards. Maybe it’s replying to a bunch of emails that we don’t care a whole lot about. AI can help us be more efficient about getting things done that aren’t super important to us.

Image Source: New Scientist

But as we become more familiar with AI, it also gets easier to recognize. The special effects in old movies look cheesy now, even though they were fantastic when the movies came out. As we become more familiar with the new amazing thing, it becomes easier to spot it. At first glance, the office filled with plastic plants looks like a tropical paradise. But when you spend more than a couple of seconds looking at them, you realize it’s just an illusion. Someone wanted to create a soft, lush, inviting tropical environment. But they didn’t want the hassle of taking care of real plants. And while it looks fine if you don’t spend a lot of time there, just knowing they’re fake cheapens the effect a little.

We’re all getting a lot better at identifying AI-generated content. And there are places where it totally makes sense to use it. But we also need to be aware that we’re taking an authenticity hit when we use it. We’re sending the message — whether we intend it or not — that this thing isn’t really that important.

There’s also now some evidence to suggest that the use of generative AI has an impact on critical thinking. People who are dependent on generative AI tools show lower critical thinking ability than those who don’t use AI, and that effect is magnified with younger participants. It makes sense that using AI for cognitive offloading would reduce our immersion, interaction, and reflection around those topics. And because we’re experiencing them at a much shallower depth, our ability to apply those ideas in new contexts is reduced. Using a navigation system to find my way around a strange city doesn’t help me learn my way around. Using a calculator to do arithmetic doesn’t help my mental math skills. In the same way, using AI to write a report or outline a plan allows me to focus on other things. And that means my experience with and understanding of those products is diminished.

So maybe we should be careful about using AI to help kids remember stuff or prepare for assessments. The more interesting conversations about AI in school center on using AI to improve student engagement, foster project-based learning, help students create artifacts that demonstrate their learning, and assist teachers in providing authentic, appropriate learning experiences specific to each student’s needs. That’s where the exciting work is happening. And hopefully the generative AI tools will help us take care of the less important stuff while we work on that.