We always hear of the self-taught wunderkinds who build billion-dollar tech businesses, but rarely do we hear about the arduous work those individuals endure in learning a technical skill like coding. If you look at entrepreneurs like Bill Gates and Mark Zuckerberg, who both dropped out of college and emerged as tech giants, you might think the path to riches is simple: drop out of college, start a company, become a millionaire, right? Wrong.
In his book Outliers, author Malcolm Gladwell says that it takes roughly 10,000 hours of practice to truly master a skill in any field. He backs this up with a number of cases, illustrating the long road to mastery for individuals like Gates and even musicians like The Beatles.
Gates logged his hard-earned hours well before graduating high school. In 1968, he and his future Microsoft co-founder, Paul Allen, helped raise money for their school to buy a computer terminal. Their early access to that terminal, along with the resources available at the nearby University of Washington, gave Gates the hours he needed to become a proficient computer programmer.
Likewise, Mark Zuckerberg, founder of Facebook, had a computer science tutor in high school who helped teach him to code. His tutor considered him a prodigy, and at the age of 12, Zuckerberg created an instant messaging app for his father’s dental practice. Needless to say, he had a lot of hours of practice before he arrived at Harvard.
Coding is not for everyone
The reality is that the work of coders is tedious and sometimes boring. Just like accounting, legal and marketing, computer programming is not a discipline for everyone.
But is learning code the most important thing? Or is it teaching people to understand the capabilities of code and the merits of “computational thinking”?
Jeannette Wing, VP of Research at Microsoft, who popularized the term, explains that “computational thinking involves solving problems, designing systems, and understanding human behavior.”
These are good skill sets for anyone to have, not just computer programmers. The computational approach to solving problems views the world as a series of puzzles, ones that can be broken down into ever-smaller challenges and solved piece by piece through logic and deductive reasoning.
Your mother is one of the best computational thinkers
If you’ve ever cooked dinner, congratulations—you’ve engaged in some light computational thinking. When preparing a meal, a home cook mentally maps out the path from raw ingredients to an edible dish. They think through the process of adding, dividing, substituting, timing and applying external processes (heating, mixing), and so on until they achieve their desired result.
Computational thinking applies the same principles. Where the cook takes raw ingredients to make a meal, a programmer treats a software algorithm as a kind of recipe: a guide on how to take raw inputs and combine them in various ways, for specific amounts of time, until an outcome is achieved.
Similar to cooking, computational thinking begins with imagination, the ability of an individual to envision how digitized information can be combined and computed to achieve a specific outcome. The coder will decompose the big picture idea and break it down into a logical series of smaller steps to achieve the desired result, just like a recipe.
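To make the recipe analogy concrete, here is a minimal sketch in Python. All of the function names are illustrative, not from the article; the point is the decomposition itself: one big task (“make a salad”) is broken into small, ordered steps, each simple enough to get right on its own, with each step’s output feeding the next.

```python
def wash(ingredients):
    # Step 1: normalize the raw inputs (trim whitespace, lowercase).
    return [item.strip().lower() for item in ingredients]

def chop(ingredients):
    # Step 2: break each ingredient into prepared pieces.
    return [f"chopped {item}" for item in ingredients]

def toss(ingredients, dressing):
    # Step 3: combine everything into the finished dish.
    return f"salad of {', '.join(ingredients)} with {dressing}"

def make_salad(raw, dressing="vinaigrette"):
    # The "recipe": a fixed sequence of small steps.
    return toss(chop(wash(raw)), dressing)

print(make_salad(["  Lettuce", "Tomato "]))
# → salad of chopped lettuce, chopped tomato with vinaigrette
```

Nothing here requires advanced programming; the value is in the thinking that precedes the code, namely deciding what the steps are and in what order they must run.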
But is learning code the most important thing?
Although it might seem premature to force every kid to learn how to code, opportunities that expose children to coding are becoming increasingly prevalent. As the tech sector grows, so too have resources that introduce children to the basic logic of coding. Popular, kid-oriented sites that focus on learning code include Code.org and Scratch, a programming language developed at MIT. There are even board games like Robot Turtles that teach programming basics to kids as young as three.
Scratch and other coding games use simple drag and drop techniques to teach kids about computational logic and basic programming principles.
In the future, many predict that understanding code will one day be as essential as reading and writing. It’s hard to imagine that coding, something that seems so foreign, will eventually invade our everyday life on par with reading and writing. But we should remember that it has taken us thousands of years to settle on our modern day writing conventions, while digital technology started in the 1950s and has evolved exponentially faster.
While you or your child might not be ready for a coding bootcamp, it is never too early to learn how to become a better computational thinker. Whether we like it or not, the future will favor those who can manipulate digital information as skillfully as a cook handles the ingredients of a tasty salad.
This article was adapted from “Is Coding the New Literacy?”, originally published by Mother Jones and written by Tasneem Raja.