Category Archives: Humanities

An Evolving Universe I—The Greatest

Isaac Newton was the greatest, most influential scientist.

[Figure: portrait of Isaac Newton]

This is a fact, but not really a scientific fact. There aren’t really any facts—even in science—because the scientific method (question, hypothesis, experiment, analysis, conclusion, evaluation) dictates that all ideas must carry some degree of uncertainty. The scientific method never rests; it never tires, no matter how many iterations it runs. If exhaustive repetitions fail to uncover evidence against a hypothesis—scientists attempt to falsify their predictions, not support them—the hypothesis becomes a theory: a scientific fact is born. Keep in mind that all facts—theory is probably a better moniker, since a fact and a theory are essentially the same—are subject to ongoing review.

Any evidence against a theory compels at least a modification of that theory, or even its abandonment. The idea that facts don’t exist confuses the general public; it often confounds people with advanced degrees. Most realize the universe is continuously changing, evolving. Facts are part of the universe. Assuming ideas are manifestations of the physical universe, facts should be subject to evolution too.

Why was Newton the greatest scientist? His influential accomplishments were many. In order of estimated decreasing importance, here is what Newton revealed: the nature of light (he even hypothesized that light comes in particles called corpuscles, a precursor of photons, though he conducted no experiments to test this belief), the universal nature of gravitation, and the laws of motion. He invented (should we say discovered?) calculus too.

Calculus would be a more significant achievement, but another bright chap, Gottfried Leibniz, created the same branch of mathematics at about the same time as Newton. Had Newton died in the plague—he fled Cambridge when the pandemic ravaged the British Isles—calculus would have been Leibniz’s baby, so to speak.

[Figure: portrait of Gottfried Leibniz]

It’s unlikely another scientist would have discovered (should we say invented?) the other three ideas within a few decades. Newton’s color theory of light might have taken a century or more before another scientist discovered it.

[Figure: Newton’s prism experiment]

If Newton were alive today, he wouldn’t claim to be history’s first scientist; Newton would most likely defer to Galileo. Galileo seems to be the first person we know of to test his ideas. Newton didn’t really do anything distinctly different from Galileo. Newton just took Galileo’s practices to another level.

I’ve never heard a convincing argument that any scientist surpasses Newton’s greatness. Albert Einstein is often considered Newton’s closest competition. Einstein was the first scientist to compel a modification of Newton’s law of gravitation; it was a cosmetic adjustment, really, and one that matters only under extreme conditions. But Einstein’s General Theory of Relativity did do something Newton couldn’t: it explained the true nature of gravity—a distortion of space-time caused by the presence of matter.
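
For the curious, that distortion is captured by Einstein’s field equations, shown here in their simplest form (standard physics, added as a supplement to the post):

$$G_{\mu\nu} = \frac{8\pi G}{c^4}\,T_{\mu\nu}$$

where the left-hand side describes the curvature of space-time and the right-hand side describes the matter and energy present.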

[Figure: the Earth and the Moon]

It’s appropriate that we distinguish between laws and theories. It’s likely most people believe laws are superior to theories. Unfortunately, the word theory is often mistakenly applied when the word hypothesis should be employed. A hypothesis is an educated guess; a theory is a system of ideas backed by a vast and complicated reservoir of experiments. In short, and once again, a theory is what we commonly call a scientific fact.

A law is a mathematical system that allows us to make predictions. Laws are powerful scientific tools, but they have a profound weakness: they don’t explain what’s actually happening physically. We just know that, as long as we respect the necessary constraints, laws yield reliable predictions.
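
Newton’s law of universal gravitation, mentioned above, illustrates the point nicely (the equation is standard; its placement here is my addition):

$$F = G\,\frac{m_1 m_2}{r^2}$$

where $m_1$ and $m_2$ are two masses, $r$ is the distance between their centers, and $G$ is the gravitational constant. The law predicts the attractive force with great precision while saying nothing about why the attraction exists; that explanation had to wait for Einstein.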

Niels Bohr is a dark-horse candidate to compete with Newton for greatest scientist. What did he do? Bohr was a father of quantum theory. Why not the father of quantum theory? Because quantum theory has many fathers: Max Planck and Einstein, to name two more—there are others we should consider too—but neither of those two ever really accepted the fundamental weirdness that goes with quantum theory.

[Figure: portrait of Niels Bohr]

Bohr was the first scientist to embrace the weirdness, the probabilistic nature of the universe, at the root of quantum theory. Once Bohr convinced the scientific community—not all scientists were on board with Bohr; Einstein was stubborn and never accepted the dicey nature of quantum theory—a vast array of successive quantum theorists continued to build the most explicative theoretical system in the history of science: quantum mechanics.

[Figure: the Copenhagen interpretation of quantum theory]

Bohr is not the father of quantum theory, but he’s the first on the list of potential fathers. Since quantum theory is the most successful scientific system of ideas, it makes sense that the first on the list of fathers is one of the greatest scientists.

It will be nearly impossible to knock Newton off his lofty perch. He had the advantage of getting in at the start of the game. Science didn’t really exist in an organized way when he was born.

The whole discipline rests on a foundation he constructed. Thanks to Newton, the base of science is strong. The only way to supersede Newton may be to discover a new characteristic of the foundation, or something we had not considered about how the foundation rests on whatever supports it. In my opinion, there is one possibility for another scientist to take the title of greatest scientist from Isaac Newton.

Click here to go to Part II. Here’s Part III.

[Figure: hunter-gatherers]

Hunter Gatherers in the Quantum Age

Fifteen thousand years ago, it’s probable that all humans banded together in hunter-gatherer clans of 50 to 100. That’s the way we survived for thousands of generations. Subsistence in permanent settlements is relatively novel for our species. Although we have spread worldwide on the waves of an agricultural revolution, we still carry the heart of a fundamentally nomadic species.

Most human brains can’t maintain more than 100 concurrent relationships. Apparently, this is the size at which alpha-male rivalry drove prehistoric nomadic clans apart. (Try a little test: write down all the people you interact with, face to face, in an average month. I bet you’ll struggle once you pass 50.)

Social media may be a development on par with the printing press because it allows us to engage in hundreds (thousands?) of concurrent relationships and get past evolutionary cognitive barriers. Ultimately, this new connectedness could generate a hyper-level of creativity. Add this connectivity to the advent of quantum computers—they should be available in about 30 years—and we might become a completely interconnected species.

What is a quantum computer? Ordinary computers communicate via binary mathematics: All instructions are coded as ones and zeroes, value and no value.

For an obvious reason (count your fingers), humans prefer a numerical base of ten symbols: 0, 1, 2, 3, 4, 5, 6, 7, 8, 9. We repeat these symbols in different positions and arrangements to represent any quantity on the number line.

All the familiar mathematical operations are possible using only 0 and 1. For example, base-ten 7 is equivalent to 111 in binary, and seventeen is 10001 in base two. I won’t explain the full theory of translating from base two to base ten and vice versa; the sketch below shows the idea.
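
For readers who would rather see it than take my word for it, here is a minimal sketch of the translation in Python (my addition; the function names are arbitrary):

```python
def to_binary(n: int) -> str:
    """Convert a non-negative base-ten integer to its binary string."""
    if n == 0:
        return "0"
    digits = []
    while n > 0:
        digits.append(str(n % 2))  # each remainder is the next binary digit
        n //= 2
    return "".join(reversed(digits))

def from_binary(bits: str) -> int:
    """Convert a binary string back to a base-ten integer."""
    value = 0
    for bit in bits:
        value = value * 2 + int(bit)  # shift left one place, then add the bit
    return value

print(to_binary(7))          # 111
print(to_binary(17))         # 10001
print(from_binary("10001"))  # 17
```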

Computers only have two fingers, or I guess you could say the computer alphabet only has two letters. Computers make up for this weakness by processing the 0’s and 1’s rapidly. For example, my computer can do 2,660,000,000 actions every second (that’s what its 2.66 GHz rating means).

Each 0 or 1 represents a bit that is or is not. Quantum computers have qubits. A qubit is allowed to occupy both value and no value simultaneously. Don’t feel bad if you don’t completely understand; no one really understands quantum physics.
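
In the standard notation of quantum mechanics (added here as a supplement; it isn’t in the original post), a qubit’s state is a weighted blend of both classical values:

$$|\psi\rangle = \alpha\,|0\rangle + \beta\,|1\rangle, \qquad |\alpha|^2 + |\beta|^2 = 1$$

where $|\alpha|^2$ and $|\beta|^2$ are the probabilities of finding 0 or 1 when the qubit is measured.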

Here’s a good example to help you grasp the power of quantum computers. If you wanted to find your way out of a complicated maze, you could try options one at a time until you discovered a correct path. That’s what ordinary computers do, but they do it faster than humans. I’m sure you can imagine a maze so complicated that my 2.66 GHz processor would get bogged down and take a long time to find a solution. The perfect way to escape the maze would be to try all paths simultaneously. That’s, in effect, what a quantum computer would do; you could say exponential technological growth becomes essentially vertical.
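
To make the one-path-at-a-time approach concrete, here is a minimal classical maze solver in Python (my illustration; the tiny maze and the names are invented for the example):

```python
# A classical computer escapes a maze the way described above: it explores
# one path at a time, backtracking at dead ends, until it finds the exit.

MAZE = [
    "S.#.",
    ".#..",
    "...E",
]  # 'S' = start, 'E' = exit, '#' = wall, '.' = open floor

def solve(maze):
    rows, cols = len(maze), len(maze[0])
    start = next((r, c) for r in range(rows) for c in range(cols)
                 if maze[r][c] == "S")
    stack, seen = [(start, [start])], {start}
    while stack:
        (r, c), path = stack.pop()
        if maze[r][c] == "E":
            return path  # one correct path out
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if (0 <= nr < rows and 0 <= nc < cols
                    and maze[nr][nc] != "#" and (nr, nc) not in seen):
                seen.add((nr, nc))
                stack.append(((nr, nc), path + [(nr, nc)]))
    return None  # no escape exists

print(solve(MAZE))  # [(0, 0), (1, 0), (2, 0), (2, 1), (2, 2), (2, 3)]
```

Scale the maze up and this sequential search bogs down exactly as described; the quantum promise, loosely speaking, is to weigh many paths at once instead of checking them one by one.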

[Figure: exponential growth curve]

If you enjoyed this post, try Binary Fraud or De-frag Brain.

De-frag Brain

Why do humans insist on categorizing everything? Is the human brain a series of discrete, isolated compartments where information is stored and retrieved via a master distribution-acquisition system?

Not only do our brains seem fragmented, but we are also supposed to use only certain parts of our brain once we decide which functions to employ. If you are good at linguistics, then supposedly you cannot be a mathematician. Those with strong math skills should devote most of their time to developing mathematical prowess. Math is generally perceived as difficult, and practitioners of the craft are in short supply, so it is almost a duty to do something with that rare gift.

Using more than one part of your brain in the same thought, or series of connected thoughts, is implicitly discouraged. Specialization is the key to a rising academic status. Education creates human blocks of expertise and then the blocks are stacked to create progressively more complicated structures. This is how we built the foundation of modern civilization, no?

During childhood—actually it continues into the early adult years, through college—a series of strategically timed bells dictates when it’s time to utilize a different part of the brain. Some say school bells had an even more basic purpose when they started ringing during the late industrial age. The bells conditioned students in preparation for a life on the production line.

Knowledge divisions are logical but arbitrary: language, social studies, math, science, and art. Each discipline competes for time and influence in a zero-sum game. Remember: individuals must eventually choose only one focus subject.

Double majors are a chore and are held in high regard, but they are an exception to the norm, and the two fields are almost always closely related: math-computer science, electrical-mechanical engineering, English-history… When was the last time you heard of a math-history double major? I am sure it happens, but it’s rare.

In a display of equity, policy dictates that each subject gets equal time. This is absurd because we all know science requires lab activity on top of classical instruction, and math is difficult to learn without problem-solving exercises with teacher assistance. There is some correction for this absurdity in college: math and science classes often get four or five credit-contact hours while most other subjects get only three.

The grand prize of education is the Doctor of Philosophy degree. A Ph.D. is an expert in one sliver of the knowledge spectrum. To earn this lofty distinction, one must make a unique contribution to some field. A Ph.D. knows something that is obscure to the rest of humanity. A doctorate is almost universally required to be a professor.

Essentially, a professor knows everything about nearly nothing, and they teach it to very few.

The dutiful researcher is at the top of the educational food chain. Researchers typically avoid the classroom; when professors do share their knowledge with the masses, it tends to be in the form of canned lectures to hundreds of young adults. The grand cathedral of knowledge is mostly available only to people who have yet to acquire any significant measure of wisdom.

Teaching assistants tackle the details of coursework in recitations. Professors answer questions too, but only during rarefied office hours or brisk appointments.

The whole process resembles an assembly line when viewed from a distant and detached perspective. We manufacture individual cerebral parts and bring them together to make a working machine of sorts.

Not that this is all bad, but with technology making information more accessible, it seems the skill most needed is the ability to process ideas from as many parts of the information spectrum as possible. We no longer need a learned professor to divvy out parcels of knowledge according to a four-year plan.

All we need to know is out there in the ether waiting for us to access it. If anything needs to be taught formally it’s how to bring all this accumulated knowledge together in one fluid motion of thought.

[Figure: binary digits in red and black]

Binary Fraud

Humans tend to substitute duality for unity even when unity clearly provides the better description.

Consider the concepts of light and dark. Ordinary language indicates these two things are separate entities and work in opposition. Try to define darkness without using the concept of light. Good luck. Perhaps a clever wordsmith will succeed, but I doubt it.

Darkness is the absence of light. Darkness does not exist independently of light. Darkness cannot overcome the light. When light appears, darkness vanishes.

Hot and cold is another false duality. Scientifically, hot and cold don’t represent distinct physical conditions. Hot objects possess relatively high temperatures; cold things have correspondingly low temperatures.

This duality inspires misunderstanding of one of the central concepts in science: temperature. The faster the molecules in a substance move, the higher its temperature. The mass of each molecule matters too, but temperature essentially represents a measure of molecular motion in a substance.
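
Kinetic theory makes this precise for an ideal gas (standard physics, added here as a supplement): the average translational kinetic energy per molecule is proportional to temperature,

$$\tfrac{1}{2}\,m\,\overline{v^2} = \tfrac{3}{2}\,k_B T$$

where $m$ is the molecular mass, $\overline{v^2}$ is the mean-square speed, and $k_B$ is Boltzmann’s constant. At a given temperature, heavier molecules move more slowly on average, which is why the amount of stuff in each molecule matters.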

It’s not a question of “Are the molecules moving fast or slow?” because that is another false duality. The true question: how much motion do the molecules have?

Duality’s power is apparent in binary numbering systems: computers, which operate according to binary mathematics, have transformed humanity. All operations in binary math use a language based on 0 and 1, while humans prefer a base-ten system: 0, 1, 2, 3, 4, 5, 6, 7, 8, 9.

Actually, binary math isn’t really binary. The concept of zero is based on the absence of value; a one represents value. Value versus the absence of value: one thing and its absence, not two independent things.

Humanity is on the cusp of constructing quantum computers. The fundamental strength of quantum computing is that each quantum bit (qubit) is allowed to occupy value and no value simultaneously. Quantum computers will one day change humanity in ways that cannot be predicted, or even described, with current language. Quantum computers are based on a unifying principle: probability.
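
As a toy illustration of that principle, here is a minimal sketch in Python (my addition; the amplitudes are invented for the example). An idealized qubit’s two squared amplitudes give the probabilities of reading out 0 or 1, and probability is all you ever get:

```python
import random

# Idealized qubit: amplitudes for |0> and |1>. The squared magnitudes
# give the probabilities of each measurement outcome (the Born rule).
alpha, beta = 0.6, 0.8  # 0.6**2 + 0.8**2 = 1.0, so this is a valid state

def measure():
    """Collapse the superposition: return 0 or 1 at random, per the Born rule."""
    return 0 if random.random() < alpha ** 2 else 1

counts = [0, 0]
for _ in range(10_000):
    counts[measure()] += 1

print(counts)  # roughly [3600, 6400]: value and no value, weighted by probability
```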

If you enjoyed this post, you might also like Hunter Gatherers in the Quantum Age or De-frag Brain.