
To bit, or not to bit

A dualistic assessment pulses at the heart of cyber-sprawl. Our computer universe revolves around one persistent question, asked in rapid-fire succession. We call the tiny, indivisible parcels of information bits. Bits answer one question, again and again: Does it exist? If it does not exist, the bit takes its identity from a powerful idea often taken for granted in the grind of everyday arithmetic.

The concept of zero arose in the wake of the agricultural revolution, most likely in South Asia. Nothingness, as a distinct mathematical entity, logged into history about the same time our ancestors first built cities. Nonexistence equates mathematically to the absence of value—zero.

Humans speciated about 300,000 years before present (ybp). We abandoned hunting and gathering approximately 10,000 ybp. The concept of zero didn't manifest until more like 5000 ybp. It most likely snuck into existence under the cover of flashier history: Egyptian pyramid construction, Babylonian hanging gardens, and the spread of pottery through human settlements across the Fertile Crescent.

We may never know zero's precise origin in space and time. Most likely, the numerical value of nothingness arose again and again among geographically dispersed cultures, throughout separate eras. Without reliable and more precise evidence, 5000 ybp serves as a sufficient estimate. Humanity has reaped the fruit of zero for roughly 2% of its time as a distinct species.

If a bit exists, we assign it the least magnitude, just enough to establish its presence. The symbol “1” represents existence.

A "true" or "false" also answers the bit's existential query. But this is just a more complicated restatement of the answer rendered above. A string of characters, a word like "true" or "false", adds contextual meaning and alleviates the anxiety of the mathematically averse. True or false successfully answers the existence question. As simple as answering true or false may seem, imagine repeating the process hundreds, thousands, millions, or even millions of millions of times.

Many people would instinctively switch to “T” and “F” in place of writing four or five letters for each query. A single letter suffices to state the existence of something. Continue along this line of logic and some people—probably the mathematically inclined—will substitute a 1 for T, and a 0 for F. This transcends language barriers, removes ambiguity, and adds quantitative value for more complicated manipulations of our nascent data.
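
To make the substitution concrete, here is a minimal sketch in Python (the sample list of answers is hypothetical, invented purely for illustration); it leans on the fact that Python's bool is a subclass of int, so True converts to 1 and False to 0:

    # Hypothetical answers to the existence question, asked five times.
    answers = [True, False, True, True, False]

    # bool is a subclass of int in Python: int(True) == 1, int(False) == 0.
    bits = "".join(str(int(a)) for a in answers)

    print(bits)  # prints: 10110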

[image: binary pile, photo courtesy of Mark Ordonez]

The concept of zero wields power in ways that often escape even the most mathematically gifted. Humans prefer ten digits (0-9) that repeat in an endless cycle up and down the number line. A zero initiates the process: the origin on a number line has no value. Moving up, we count to nine; further progress requires a reset of the ones place to zero, and then the addition of a one to the tens place (1 ten plus 0 ones is 10). Repeat the process until we reach nineteen (1 ten plus 9 ones is 19); now replace the one in the tens position with a two, and reset the ones to zero: 20. Without the concept of zero, we have no natural origin, nor any means to recycle our counting system at a convenient interval.
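
A tiny worked example, using nothing but Python's built-in divmod, shows the same reset at work; the numbers are the ones from the paragraph above:

    # divmod(n, 10) splits a base-ten number into (tens, ones).
    for n in (9, 10, 19, 20):
        tens, ones = divmod(n, 10)
        print(f"{n}: {tens} ten(s) plus {ones} one(s)")

    # 9: 0 ten(s) plus 9 one(s)
    # 10: 1 ten(s) plus 0 one(s)
    # 19: 1 ten(s) plus 9 one(s)
    # 20: 2 ten(s) plus 0 one(s)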

Computers operate with only two digits. Remember: the most basic element of a computer is a bit with two possible states—existence, or not. Humans prefer ten digits because modal humans have four fingers and a thumb on two hands. Base ten is natural for us because we typically inaugurate our mathematical experience tallying small quantities on our hands. Computers don’t have appendages. Instead, computers recognize existence or lack thereof; on or off; 1 or 0.

[image: binary window, photo courtesy of Chris McClanahan]

Math isn’t partial to any number of digits. As long as the absence of value is expressible, only two fundamental numbers can effectively represent any quantity. Two digit numerical systems employ binary code.

Here’s a string of binaries beginning with the absence of value: 0, 1, 10, 11, 100, 101, 110, 111, 1000, 1001, 1010. You don’t need masterful spatial perception to discern the pattern at the end of the previous sentence. Let me translate into more familiar base ten notation: 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10.
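
You can reproduce the translation with Python's built-in bin and int; a quick sketch (the range of 0 through 10 simply mirrors the sequence above):

    # Count from 0 to 10, showing each value in binary.
    for n in range(11):
        print(n, bin(n)[2:])  # bin() prefixes its output with "0b"; strip it

    # int(s, 2) translates back to base ten:
    print(int("1010", 2))  # prints: 10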

The same principle governs all positional numeral systems. In binary, we recycle the process using only two digits; base ten counts through nine distinct quantities before an abrupt return to the concept of zero. The first position has no value, then it does. Since binary allows only two possible values, we must then reset that position to zero and carry a one into the next position. Then we resume adding value to the initial position. This process can repeat forever, creating an infinite series of magnitudes.
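
The reset-and-carry step can be written once for any base. Here is a minimal sketch, assuming a little-endian list of digits (the function name increment and the representation are my own choices, not anything from the original post):

    def increment(digits, base):
        """Add one to a little-endian list of digits in the given base."""
        i = 0
        while i < len(digits):
            digits[i] += 1
            if digits[i] < base:
                return digits       # no overflow; done
            digits[i] = 0           # reset this position to zero...
            i += 1                  # ...and carry into the next one
        digits.append(1)            # every position overflowed; grow the number
        return digits

    # Counting in binary: 1, 10, 11, 100, 101, ...
    n = [0]
    for _ in range(5):
        n = increment(n, 2)
        print("".join(str(d) for d in reversed(n)))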

Tedious repetitions, like writing and manipulating binary code, strain human attention spans and raise the probability of error. Also, we’re slow when it comes to mundane tasks. Fortunately, humans designed computers to flawlessly iterate simple instructions. According to a loose interpretation of Moore’s Law, processing speed doubles about every two years. My processor executes 2.2 billion (the same as 2.2 thousand x 1 million, or 2.2 thousand million) instructions every second.

By the way, base ten 2,200,000,000 is the same as 10000011001000010101011000000000 in binary. I think. Just transcribing a 32-digit quantity, regardless of base, carries a high probability of a simple mistake.
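
Rather than transcribe by hand, Python's built-ins can settle the doubt; a short verification sketch:

    n = 2_200_000_000

    b = bin(n)[2:]         # bin() prefixes "0b"; strip it
    print(b)               # 10000011001000010101011000000000
    print(len(b))          # 32 (digits)
    print(int(b, 2) == n)  # True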

If you enjoyed reading this piece, check out An Electron Story or Hunter Gatherers in the Quantum Age.