If Only

Last week, I attended an Artificial Intelligence forum sponsored by OhioX and InnovateOhio. The event included a couple of demonstrations: one on how AI can be used for marketing, and one on Microsoft’s approach to AI. But the primary focus of the event was two panel discussions: healthcare and education.

The healthcare discussion consisted of representatives from Cleveland Clinic, University Hospitals, and Pandata, a company that builds AI solutions for high-risk industries. It was clear from the conversation that there’s interest in AI for healthcare, and both hospitals are pursuing applications that are safe, reliable, and cost-effective. But there are also significant concerns. The technology industry is notoriously sloppy, and its tolerance for error is very high. If ChatGPT is 90% correct, we consider that a victory. If Waze accurately directs you to your destination 99% of the time, and finds the most efficient route 75% of the time, that’s good enough. But if we miss 1% of cancer diagnoses, or if AI can spot potential health issues better than humans only 75% of the time, that’s not going to work.

The hospitals are starting to use AI in redundant ways: double-checking the work of humans and supporting, but not replacing, medical professionals. That doesn’t save any money, but it can lead to a higher level of care. Neither humans nor AI working alone produce better results than the two working together.

The education panel was more interesting to me. It included a professor from Case, a professor from John Carroll, an education industry architect from Microsoft, and the CEO of aiEDU, an organization that helps schools improve AI literacy among students. It was moderated by Ohio Lieutenant Governor Jon Husted.

The professors, interestingly, had trouble seeing the forest for the trees. They focused on how AI can be leveraged to help manage very large lecture classes. They acknowledged the use of AI to “cheat” on student projects, but didn’t really explore any strategies for mitigating that challenge, or even examine what “cheating” means now. With some input from the aiEDU CEO, there was a significant focus on preparing students to work in a world of AI. Students should be “AI literate,” the thinking goes. There are toolkits for introducing AI to students. There are strategies for helping students write better AI prompts and incorporate this technology into their work. The implication was clear: if our students don’t leave school with a solid understanding of artificial intelligence, they’ll be at a disadvantage when they enter the workforce.

They weren’t serious about this, of course. We’ll know the state really wants us to embrace AI when kids are allowed to use it on the state proficiency tests.

Only the Microsoft guy seemed to actually see the big picture. He talked about growth mindset and innovative applications of AI, and he had some ideas for how students could leverage AI to improve their work. He seemed much less interested in compiling a list of AI competencies students need in order to be successful when they enter the workforce.

I think the goal of the event was to raise awareness about artificial intelligence, but also to create some sense of urgency: we need to adapt to this new technology and make sure our students and young adults are up to speed so they don’t get left behind. All of this was framed through the lens of workforce development, which is the focus of Husted’s work.

So, we should be teaching AI, but also making sure students don’t actually use it for schoolwork. We’ll add that to the list. There’s a new list of things we have to teach this year, following changes to the prescribed curriculum enacted by the legislature this past spring. This is the seventh time that law has changed in the last ten years, and every time, it gets longer. And there seems to be an uproar every time something is taken off the list. We have to have cursive, and multiplication tables, and spelling tests, and states and capitals. We can’t cut family and consumer science, or industrial technology, or (God forbid) chemistry.

But what about all the students who DIDN’T get AI instruction? What about the students who graduated last year? Or the ones who don’t have room in their schedules to fit another class? Or the ones who learned everything there was to know about AI last spring, only to find that the industry has completely changed since then?

If only there were a way to equip students for a world in which the things they need to know are constantly changing. It’s almost like we need to teach people how to learn, so they can keep doing it after they leave school. If only they could learn, unlearn, and relearn as needed. Maybe instead of constantly adding to the half-baked list of things kids need to know, we should teach them how to learn new stuff when they need to. Then they could learn without us. We could build that into our portrait of a graduate. We could write it into our mission statement.

Oh, wait. We did.