Before the pitchforks and torches come out, allow me to clarify – don’t just teach coding.
Jeff Sussna and Mark M captured the essence of what I’m talking about. The article Mark M referenced, “Is coding the new literacy?”, had this to say:
So you might be forgiven for thinking that learning code is a short, breezy ride to a lush startup job with a foosball table and free kombucha, especially given all the hype about billion-dollar companies launched by self-taught wunderkinds (with nary a mention of the private tutors and coding camps that helped some of them get there). The truth is, code—if what we’re talking about is the chops you’d need to qualify for a programmer job—is hard, and lots of people would find those jobs tedious and boring.
But let’s back up a step: What if learning to code weren’t actually the most important thing? It turns out that rather than increasing the number of kids who can crank out thousands of lines of JavaScript, we first need to boost the number who understand what code can do. As the cities that have hosted Code for America teams will tell you, the greatest contribution the young programmers bring isn’t the software they write. It’s the way they think. It’s a principle called “computational thinking,” and knowing all of the Java syntax in the world won’t help if you can’t think of good ways to apply it.
Whether you call it computational thinking or computer literacy, it’s understanding the high-level basics of the technology that is useful. Conflating that understanding with the ability to produce Hello World in JavaScript (a one-line program, shown after the quote below) is both simplistic and counterproductive. Chris Granger, in “Coding is not the new literacy”, observed:
Being literate isn’t simply a matter of being able to put words on the page, it’s solidifying our thoughts such that they can be written. Interpreting and applying someone else’s thoughts is the equivalent for reading. We call these composition and comprehension. And they are what literacy really is.
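Lest there be any doubt about how low the “Hello World” bar sits, here it is in its entirety:

```javascript
// All of "Hello World" in JavaScript: one line, no computational thinking required.
console.log("Hello, world!");
```

Being able to produce that line demonstrates nothing about whether someone can solidify a thought into a form a computer can execute, which is exactly the composition Granger is talking about.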
Languages come and go. Technologies evolve. Concentrating exclusively on specific languages and technologies risks future obsolescence; the underlying concepts that must be understood to use them effectively, however, have longer lives. It’s the 21st-century equivalent of the difference between giving someone a fish and teaching them to fish. Erik Dietrich, in “Don’t Learn to Code — Learn to Automate”, points out the broader utility of computer literacy:
It’s obtuse to suppose that a prerequisite for every job in the future will be the ability to implement sophisticated, specialized computer applications. But it’s not at all obtuse to suppose that, given the ubiquity of computing, a prerequisite for every job in the future will be the ability to recognize which tasks are better suited for humans and which for computers. Learn at least to recognize which parts of your job are a poor use of your time. After that, perhaps learn to use your ingenuity and creativity to automate using the tools that you know (such as googling for solutions, leveraging apps, etc). And, if you’ve come that far, maybe it’s time to roll up your sleeves and take the plunge into learning to code a little bit to help you along.
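To make Dietrich’s progression concrete, here is a minimal sketch of that last step: a few lines of Node.js that replace a daily manual check of a folder of reports. The folder name and the task itself are hypothetical; what matters is the scale of “coding a little bit”:

```javascript
// A minimal sketch of "automating a little bit": list the report files
// modified today instead of opening each one by hand.
const fs = require("fs");
const path = require("path");

const reportsDir = "./reports"; // hypothetical folder of daily reports
const today = new Date().toDateString();

for (const name of fs.readdirSync(reportsDir)) {
  const modified = fs.statSync(path.join(reportsDir, name)).mtime;
  if (modified.toDateString() === today) {
    console.log(`Updated today: ${name}`);
  }
}
```

The point isn’t the dozen lines themselves; it’s recognizing that the daily check was a poor use of a person’s time in the first place.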
The need for greater numbers of people with computer literacy is real, as is the need for greater diversity within the ranks of those who work with technology. It serves no one to misrepresent the skills needed or the nature of those skills. People who know how to google for code but lack the ability to understand and evaluate what they get back are no better off than if they had never seen a computer. We owe them, and ourselves, better.