Thirty years ago, when computers were still new, we didn’t know what to do with them. There was a sense — in that half generation between the widespread availability of computers and the advent of the Internet age — that the world was changing in fundamental ways. No one was sure what was really going on, but this was big. Really big.
Schools bought computers because they thought children should be computer literate. They didn’t really know what that meant. They were struggling to prepare students for a world that no one understood. So they scraped together money from the media budget and grants and the PTA and they bought a few computers that they set up in a lab. The digital age was here.
As students, we played a little Oregon Trail and Lemonade Stand. We practiced our math facts. And then, we learned BASIC programming. Programming seemed… important. This was a skill we would need. Everyone would have a computer when we became adults. Everyone would need to know how to program it. Plus, the computers came with programming software. Programming was one of the few things we could do without buying more software, a cost no one had anticipated when the computers were purchased.
I learned BASIC in fifth grade, and by sixth I had forgotten it. I learned LOGO, but just the turtle parts of LOGO, not the really cool list handling stuff that made it a useful language. I learned to type on a typewriter, which I then used through high school. Computers didn’t help me learn other things. They were a subject all to themselves. Our school didn’t have enough of them to teach students much of anything about them. And they didn’t know what to teach us anyway. So after sixth grade, I didn’t use a computer again until I was a senior in high school, when I took programming (in Pascal, this time) for fun.
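For the curious, here is a rough sketch of that distinction, translated into Python since working LOGO interpreters are scarce these days. The square is the "turtle parts" we all learned; the recursive `total` function is my own stand-in for the FIRST/BUTFIRST list handling we never got to:

```python
# The "turtle parts" everyone learned: drawing a square with
# Python's standard-library turtle module, a descendant of LOGO's turtle.
import turtle

t = turtle.Turtle()
for _ in range(4):
    t.forward(100)   # LOGO: FORWARD 100
    t.right(90)      # LOGO: RIGHT 90

# The list handling most of us never saw: LOGO processed lists
# recursively with FIRST and BUTFIRST. A Python equivalent:
def total(xs):
    if not xs:
        return 0
    return xs[0] + total(xs[1:])   # FIRST plus the total of BUTFIRST

print(total([1, 2, 3, 4]))   # 10

turtle.done()   # keep the drawing window open
```

The first half is graphics-as-motivation; the second half is actual computation. We only ever saw the first half.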
Half a decade later, I found myself with a minor in Systems Analysis and half a dozen different programming languages under my belt. I was teaching a middle school computer applications class. My predecessor had spent about 80% of the course teaching programming. Having no actual curriculum to follow, I scrapped all of it, and focused on applications instead. I felt that students needed to know more about word processing and spreadsheets and presentation software than BASIC. A year later, I was emphasizing Internet research, evaluating online resources, documenting sources, and using the Internet to disseminate content. These were things my middle school students needed to know. They’re still things they need to know.
At the dawn of the new millennium, I was teaching a high school programming course. It was an elective. It was a neat class for students interested in programming. But students don’t need to take auto mechanics to drive a car. They don’t have to study structural engineering to work or live in a high-rise building. They don’t need a degree in economics to work at the bank. The course I taught was an introduction meant to spark interest in the field. I never felt like I was teaching a necessary (or even marketable) skill.
The early 2000s confirmed that. Remember The World is Flat? Friedman talked about pulling up to a drive-through fast-food restaurant and giving your order to someone in India or China, then picking the food up at the window. Auto companies were making tail-light assemblies in the US, shipping them to Malaysia so cheap, unskilled labor could put the light bulbs in, and then shipping them back to go into new cars. Software companies were keeping their systems-design efforts in the US but outsourcing the routine coding to India.
Programming is an entry-level skill. There’s nothing wrong with that. But it’s the kind of position that is more “job” than “career”. Sure, there’s a bubble right now, and programming skills are in demand. There are also good reasons to teach programming: it helps students learn logic, reasoning, and problem solving. But if schools are reacting to the media hype around coding by teaching programming to a generation of would-be programmers, they’re preparing students for a future of unemployment.
You seem to be saying that because there was a fad for offshore programmers 15 years ago, software development isn’t a career? I would beg to differ. The offshoring fad crashed against the notorious difficulty of software specification. Onshore hiring is strong today, fed in part by the rise of Agile methods and their emphasis on close interaction with product owners. An offshore team can’t do that.
Should everyone learn to program, at least a little? Maybe. I don’t know.
Is programming an important skill to teach technically-inclined students? Yes, absolutely! Most science and engineering degrees require at least one programming class. It is a key skill for many technical careers, not just computer science. Those technical careers are among the few with wage growth in the face of relentless automation of low-skill jobs. High school students need the opportunity to learn to program.
First, I’m honored and humbled that you read this.
I think we initially taught programming because there wasn’t much else you could do with a computer. As technology has evolved, we’ve found ways to be productive with it that don’t require users to understand how it works. You don’t have to be a programmer to use technology effectively. As that happened, we transitioned “computer” classes to focus less on programming skills, which have a limited audience, and more on broadly accessible productivity tools.
But I think there is a difference between coding and software/systems design/engineering, just as there’s a difference between a builder and a structural engineer. Kids love Legos. But there are kids who follow the instructions to build the plane or the car or whatever comes in the kit, and there are kids who throw the instructions away and make their own thing. Those are the kids who will still have jobs when the mindless coding-to-spec positions evaporate.
I think you disproved your point in the title of your article. If someone doesn’t know coding, they’re not going to understand what != means. Maybe ≠ would work, if that’s what they learned in math.
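For readers in exactly that position, a minimal illustration in Python — != is simply the keyboard-friendly spelling of math’s ≠:

```python
# In most programming languages, != means "not equal" (math's ≠).
x = 3
print(x != 4)   # True:  3 is not equal to 4
print(x != 3)   # False: 3 equals 3
```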
The coding movement isn’t about jump-starting a career in programming, or even ending up in a technological career. It’s about helping students do their chosen careers faster and more efficiently than others.
In December of my sophomore year (1984), I received a computer for Christmas, and immediately my classmates were at a disadvantage. I was able to produce papers better and faster than anyone else in the class. Unfortunately, I used it more as a tool for procrastination.
To put it in terms of the car analogy: people don’t need to understand how the internal combustion engine or the transmission works, but they do need to know that a car needs gas and that the tires need air in them. Coding gives students a framework on which to build. More jobs and careers will require some sort of technology background than won’t.