Historians divide early human history into three primary periods, which they refer to as the three-age system. These ages, as you probably already know, were the Stone, Bronze, and Iron Ages. Each age is named for the defining technological material of its time. So, then, would it really be so far-fetched to say that we live in the Computer Age?

In a Nutshell: What is Coding?

Coding, in the simplest terms, is the means by which a human operator instructs a computer what to do and how to do it. A more detailed answer starts with the understanding that a computer, at its most basic level, can really only understand two signals: on and off. If you’ve ever heard of binary code, this may already sound familiar: a 1 represents on, and a 0 represents off.
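Everything a computer stores, including text, ultimately boils down to patterns of these ones and zeroes. As a small illustration (shown here in Python, used in this article purely for demonstration), here is how the letter "A" looks in binary:

```python
# The letter "A" is stored as the number 65, which in binary is a
# pattern of on (1) and off (0) signals.
letter = "A"
number = ord(letter)          # "A" -> 65
bits = format(number, "08b")  # 65 -> "01000001"
print(f"{letter!r} is stored as {number}, or {bits} in binary")
```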

Clearly, telling a computer what to do by writing out millions or billions of ones and zeroes by hand would be a nearly impossible task for a human (although it is rumored that Linus Torvalds, creator of the Linux kernel, has some experience programming in binary directly).

To solve this problem, humans developed a number of programming languages that are far simpler for people to read and write, but that can still be translated into the binary a computer understands. Using and developing software in these languages is what most people mean by “coding,” or programming.
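To make the contrast concrete, here is what a few lines of one of those human-friendly languages look like. This is a minimal, hypothetical Python sketch: easy for a person to read at a glance, yet ultimately carried out by the machine as binary.

```python
# A short human-readable program: greet a user by name.
def greet(name):
    return f"Hello, {name}! Welcome to the Computer Age."

print(greet("Ada"))  # -> Hello, Ada! Welcome to the Computer Age.
```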

Code is involved every time you use your smartphone, your television, and your automatic coffee maker. Unless you live in the woods without any form of technology, it is likely involved in every facet of your life. Welcome to the Computer Age.

How Does Code Get Translated to Binary?

As we learned above, code must ultimately be translated into the binary system of on and off switches, or ones and zeroes. Several steps occur before this happens, usually invisible even to the programmer.

First, the program is written in whatever language the programmer believes is best suited to the project, such as Python, Java, or C (among many others). This code is then fed into another program, such as a compiler, which translates it into assembly language. The assembly is then translated into machine code: the binary ones and zeroes the computer finally executes.
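The details vary by language: a compiled language like C follows the path described above, while Python is translated into bytecode for a virtual machine rather than into assembly. Still, Python’s built-in dis module offers a convenient peek at the kind of lower-level instructions your code becomes. A minimal sketch:

```python
import dis

def add(a, b):
    return a + b

# Print the lower-level instructions that Python translates add() into.
# (These are bytecode instructions for Python's virtual machine, playing
# a role analogous to assembly instructions for a physical CPU.)
dis.dis(add)
```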

What is a “Bug”?

Humans make mistakes, and, at least for now, computers can only do exactly what their human operators tell them. When a programmer makes an error in a computer program, it’s referred to as a bug, a term usually attributed to the work of computing pioneer Grace Hopper, whose team discovered a literal bug (a moth) in the Mark II computer.

Bugs must be carefully tracked down and fixed by coders in a process aptly named debugging. Since some computer programs are very large, with perhaps millions of lines of code, this is typically an ongoing process.
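As a toy illustration, here is a hypothetical Python bug of a very common kind, an off-by-one error that quietly skips the last element, along with its fix:

```python
def average(numbers):
    total = 0
    for i in range(len(numbers) - 1):  # BUG: the "- 1" skips the last number
        total += numbers[i]
    return total / len(numbers)

print(average([2, 4, 6]))  # prints 2.0, but the correct average is 4.0

# The fix: include every element.
def average_fixed(numbers):
    return sum(numbers) / len(numbers)

print(average_fixed([2, 4, 6]))  # prints 4.0
```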

Code Is All Around Us, and It’s Here to Stay

The advent of computers and the internet has revolutionized the world, changing how we communicate with one another and how we learn. At the heart of this revolution is code. To function in our new world, it’s becoming more and more necessary to have at least a rudimentary understanding of this primary building block of technology.