Tag Archives: Computers

To bit, or not to bit

A dualistic assessment pulses at the heart of cyber-sprawl. Our computer universe revolves around a persistent question delivered in rapid-fire succession. We call tiny, indivisible parcels of information bits. Bits answer one question, again and again: Does it exist? If it does not exist, the bit takes its identity from a powerful idea often taken for granted in the rudimentary arithmetical grind: zero.

The concept of zero arose in the wake of the agricultural revolution, most likely in South Asia. Nothingness, as a distinct mathematical entity, logged into history about the same time our ancestors first built cities. Nonexistence equates mathematically to the absence of value—zero.

Humans speciated about 300,000 years before present (ybp). We abandoned hunting and gathering approximately 10,000 ybp. The concept of zero didn’t manifest until more like 5000 ybp. It most likely snuck into existence under the cover of flashier history: Egyptian pyramid construction, Babylonian hanging gardens, and the preponderance of pottery in human settlements across the Fertile Crescent.

We may never know zero’s precise origin in space and time. Most likely, the numerical value of nothingness arose again and again among geographically dispersed cultures, in separate eras. Without reliable and more precise evidence, 5000 ybp serves as a sufficient estimate. Humanity has reaped the fruit of zero for roughly 2% of its time as a distinct species.

If a bit exists, we assign it the least magnitude, just enough to establish its presence. The symbol “1” represents existence.

A true or false also answers the bit’s existential query. But this is just a more complicated restatement of the answer rendered above. The use of a string of characters, a word, like “true” or “false” adds contextual meaning and alleviates the anxiety of the mathematically averse. True or false successfully answers the existence question. As simple as answering true or false may seem, imagine repeating the process hundreds, thousands, millions, or even millions of millions of times.

Many people would instinctively switch to “T” and “F” in place of writing four or five letters for each query. A single letter suffices to state the existence of something. Continue along this line of logic and some people—probably the mathematically inclined—will substitute a 1 for T, and a 0 for F. This transcends language barriers, removes ambiguity, and adds quantitative value for more complicated manipulations of our nascent data.

[image: photo courtesy of Mark Ordonez]

The concept of zero wields power in ways that often escape even the most mathematically gifted. Humans prefer ten digits (0-9) that repeat in an endless cycle up and down the number line. A zero initiates the process: the origin on a number line has no value. Moving up, we count to nine; further progress requires a reset of the ones place to zero, and then the addition of a one to the tens place (1 ten plus 0 ones is 10). Repeat the process until we reach nineteen (1 ten plus 9 ones is 19); now, replace the one in the tens position with a two, and reset the ones to zero—20. Without the concept of zero, we have no natural origin, and no means to recycle our counting system at a convenient interval.

Computers operate with only two digits. Remember: the most basic element of a computer is a bit with two possible states—existence, or not. Humans prefer ten digits because modal humans have four fingers and a thumb on two hands. Base ten is natural for us because we typically inaugurate our mathematical experience tallying small quantities on our hands. Computers don’t have appendages. Instead, computers recognize existence or lack thereof; on or off; 1 or 0.

[image: photo courtesy of Chris McClanahan]

Math isn’t partial to any particular number of digits. As long as the absence of value is expressible, two fundamental digits suffice to represent any quantity. Two-digit numerical systems employ binary code.

Here’s a string of binaries beginning with the absence of value: 0, 1, 10, 11, 100, 101, 110, 111, 1000, 1001, 1010. You don’t need masterful spatial perception to discern the pattern at the end of the previous sentence. Let me translate into more familiar base ten notation: 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10.
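If you have Python handy, you can generate the same sequence yourself; bin() is a built-in that writes any whole number in base two (the leading "0b" is just Python’s tag for binary notation):

    # Count from 0 to 10, printing each number in base ten and base two.
    for n in range(11):
        print(n, bin(n)[2:])   # strip the "0b" prefix
    # 0 0, 1 1, 2 10, 3 11, 4 100, ... 10 1010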

The same principle governs all positional numeral systems. In binary, we recycle the process using only two digits; base ten counts nine distinct quantities before an abrupt return to the concept of zero. The first position has no value, then it does. Since we only have two possible values, we have to reset the value to zero and progress by adding value to the next position. Then we add value to the initial position. This process can repeat forever creating an infinite series of magnitudes.
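Here’s a minimal sketch of that reset-and-carry process in Python, generalized to any base. The function name and the least-significant-digit-first storage are my own conveniences, not anything standard:

    def increment(digits, base):
        """Add one to a number stored as a list of digits, least significant first."""
        i = 0
        while i < len(digits):
            digits[i] += 1
            if digits[i] < base:
                return digits      # no carry needed
            digits[i] = 0          # reset this position to zero...
            i += 1                 # ...and carry one into the next position
        digits.append(1)           # ran off the end: grow a new position
        return digits

    # Counting in binary from zero:
    n = [0]
    for _ in range(5):
        n = increment(n, 2)
        print(''.join(str(d) for d in reversed(n)))
    # prints 1, 10, 11, 100, 101

Pass base=10 instead and the very same function counts the human way, nines rolling over to zeros with a carry.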

Tedious repetitions, like writing and manipulating binary code, strain human attention spans and raise the probability of error. Also, we’re slow when it comes to mundane tasks. Fortunately, humans designed computers to flawlessly iterate simple instructions. According to a loose interpretation of Moore’s Law, processing speed doubles about every two years. My processor executes 2.2 billion (the same as 2.2 thousand x 1 million, or 2.2 thousand million) instructions every second.
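Purely as an illustration of that doubling arithmetic (not a serious forecast), a few lines of Python project the rate forward; the year spans are arbitrary examples of my own choosing:

    # Loose Moore's Law arithmetic: the rate doubles every two years.
    base_rate = 2.2e9                        # 2.2 billion instructions per second
    for years in (2, 10, 20):
        projected = base_rate * 2 ** (years / 2)
        print(years, "years:", f"{projected:.2e}", "instructions per second")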

By the way, base ten 2,200,000,000 is the same as 10000011001000010101011000000000 in binary. I think. Just transcribing a 32-digit quantity, regardless of base, carries a high probability of a simple mistake.
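Happily, no one has to transcribe by hand; Python can check the claim in both directions:

    # Base ten to binary, and the 32-digit string back to base ten.
    print(bin(2_200_000_000))                          # 0b10000011001000010101011000000000
    print(int("10000011001000010101011000000000", 2))  # 2200000000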

If you enjoyed reading this piece, check out An Electron Story or Hunter Gatherers in the Quantum Age.


Hunter Gatherers in the Quantum Age

Fifteen thousand years ago, it’s probable that all humans banded together in hunter-gatherer clans of 50 to 100. That’s the way we survived for thousands of generations. Subsistence in permanent settlements is relatively novel for our species. Although we have spread worldwide on the waves of an agricultural revolution, we still carry the heart of a fundamentally nomadic species.

Most human brains can’t maintain more than 100 concurrent relationships. Apparently, this is the threshold at which alpha-male rivalry drove prehistoric nomadic clans apart. (Try a little test: write down all the people you interact with, face to face, in an average month. I bet you’ll struggle once you pass 50.)

Social media may be a development on par with the printing press because it allows us to engage in hundreds (thousands?) of concurrent relationships, and get past evolutionary cognitive barriers. Ultimately, this new connectedness could generate a hyper-level of creativity. Add this connectivity to the advent of quantum computers—they should be available in about 30 years—and we might become a completely interconnected species.

What is a quantum computer? Ordinary computers communicate via binary mathematics: All instructions are coded as ones and zeroes, value and no value.

For an obvious reason, humans prefer to use a numerical base of ten symbols: 0, 1, 2, 3, 4, 5, 6, 7, 8, 9. We start repeating these symbols in different positions and arrangements to represent any quantity on the number line.

All the familiar mathematical operations are possible using only 0 and 1. For example, a base ten 7 is equivalent to 111 in binary. Seventeen is 10001 in base two. I won’t explain how to translate from base two to base ten or vice versa. Just take my word for it.
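Or don’t take my word for it: Python’s built-ins translate in both directions, if you want a quick check:

    print(bin(7)[2:])       # 111
    print(bin(17)[2:])      # 10001
    print(int("10001", 2))  # 17, and back again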

Computers only have two fingers; or, I guess you could say, the computer alphabet only has two letters. Computers make up for this weakness by processing the 0s and 1s rapidly. For example, my computer can do 2,660,000,000 actions every second.

Each 0 or 1 represents a bit that is or is not. Quantum computers have qubits. A qubit is allowed to occupy both value and no value simultaneously. Don’t feel bad if you don’t completely understand; no one really understands quantum physics. Here’s a good example to help you understand the power of quantum computers: if you wanted to find your way out of a complicated maze, you would try options one at a time until you discovered a correct path. That’s what ordinary computers do, but they do it faster than humans.

I’m sure you can imagine a maze so complicated that my 2.66 GHz processor will get bogged down and take a long time to find a solution. The perfect solution to escape the maze is to try all paths simultaneously. That’s what a quantum computer would do; at that point, I guess you could say exponential technological growth becomes essentially vertical.
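To make the classical strategy concrete, here is a minimal sketch in Python: a depth-first search that tries paths one at a time through a toy three-by-three maze. The grid layout is an invented example, nothing more:

    # 0 = open floor, 1 = wall; start at top-left, exit at bottom-right.
    MAZE = [
        [0, 0, 1],
        [1, 0, 0],
        [1, 1, 0],
    ]

    def escape(r=0, c=0, seen=None):
        """Try paths one at a time, backtracking at dead ends."""
        seen = seen or set()
        if (r, c) == (2, 2):
            return True                      # found the exit
        seen.add((r, c))
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < 3 and 0 <= nc < 3 and MAZE[nr][nc] == 0 and (nr, nc) not in seen:
                if escape(nr, nc, seen):
                    return True
        return False                         # dead end: back up, try another path

    print(escape())                          # True, found one branch at a time

A quantum computer, in the hand-wavy picture painted above, would explore every branch of that search at once instead of backtracking through them in sequence.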


If you enjoyed this post, try Binary Fraud or De-frag Brain.


Binary Fraud

Humans tend to substitute duality when it’s clear unity provides the best description.

Consider the concepts of light and dark. Ordinary language indicates these two things are separate entities and work in opposition. Try to define darkness without using the concept of light. Good luck. Perhaps a clever wordsmith will succeed, but I doubt it.

Darkness is the absence of light. Darkness does not exist independently of light. Darkness cannot overcome the light. When light appears, darkness vanishes.

Hot and cold is another false duality. Scientifically, hot and cold don’t represent distinct physical conditions. Hot objects possess relatively high temperatures; cold things have correspondingly low temperatures.

This duality inspires misunderstanding of one of the central concepts in science: temperature. The faster the molecules move in a substance, the higher its temperature. The amount of stuff in each molecule matters for temperature too, but temperature essentially represents the measure of molecular motion in a substance.
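To put a rough number on “molecular motion”: kinetic theory ties temperature to the average kinetic energy of the molecules, and a molecule’s mass determines how fast it must move to carry that energy. A back-of-the-envelope calculation in Python, using room-temperature nitrogen as the example:

    import math

    k_B = 1.380649e-23   # Boltzmann constant, joules per kelvin
    T = 300              # room temperature, kelvin
    m_N2 = 4.65e-26      # approximate mass of one nitrogen molecule, kilograms

    # Average kinetic energy per molecule is (3/2) * k_B * T.
    # Setting (1/2) * m * v**2 equal to it gives the typical (rms) speed.
    v_rms = math.sqrt(3 * k_B * T / m_N2)
    print(round(v_rms), "meters per second")   # roughly 500 m/s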

It’s not a question of “Are the molecules moving fast or slow?” because this is another false duality. The true question: how much motion does it have?

Duality’s power is apparent in binary numbering systems: computers have transformed humanity. Computers operate according to binary mathematics. All operations in binary math use a language based on 0 and 1. Humans prefer a base ten system: 0, 1, 2, 3, 4, 5, 6, 7, 8, 9.

Actually, binary math isn’t really binary. The concept of zero is based on the absence of value. A one represents value. Value vs. the absence of value.

Humanity is on the cusp of constructing quantum computers. The fundamental strength of quantum computing is that each bit is allowed to occupy value and no value simultaneously. Quantum computers will one day change humanity in ways that cannot be predicted, or described with current language. Quantum computers are based on a unifying principle: probability.
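A toy sketch of that principle, in ordinary Python rather than anything resembling real quantum hardware: one qubit holds two amplitudes, and measurement converts their squares into probabilities:

    import math, random

    # An equal superposition: the same amplitude on "no value" (0) and "value" (1).
    amp0 = amp1 = 1 / math.sqrt(2)

    # Measurement probabilities are the squared amplitudes: 0.5 and 0.5 here.
    p0, p1 = amp0 ** 2, amp1 ** 2

    # Repeated measurements yield 0s and 1s in those proportions.
    samples = [1 if random.random() < p1 else 0 for _ in range(1000)]
    print(sum(samples) / 1000)   # close to 0.5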

If you enjoyed this post, you might also like Hunter Gatherers in the Quantum Age or De-frag Brain.