In best of all worlds, Leibniz surpasses Newton

07/21/2003

By TOM SIEGFRIED / The Dallas Morning News
When Gottfried Wilhelm von Leibniz declared that humans inhabit the best of all possible worlds, he really meant that the universe is a cool place because it permits the pursuit of science.

In modern translation, says mathematician Gregory Chaitin, Leibniz meant that God is a computer programmer.

Whether you take "God" in a devout religious sense (as Leibniz did), or as metaphorical shorthand for the ultimate nature of the universe, is beside the point. [In my opinion, this remark does not do justice to Leibniz, whose religious views were extremely sophisticated.---GJC] Dr. Chaitin's message is that the practice of science today reflects a new "digital philosophy" that was articulated by Leibniz more than three centuries ago.

A mathematician, philosopher, physicist and lawyer, [Instead of "lawyer", I would have said "diplomat".---GJC] Leibniz devised calculus independently of Isaac Newton, invented a primitive calculating device, and rediscovered the binary number system (previously used by musicians in India more than a millennium earlier). [In my opinion, the I Ching and Hindu explorations of all possible rhythmic structures exemplify binary combinatorics, but not base-two arithmetic.---GJC] Expressing information using the binary digits 0 and 1 (as modern computers do) stemmed from Leibniz's desire to apply logic to the cosmos.

It was through logical analysis that Leibniz concluded that the world God created was the most nearly perfect possible. After all, Leibniz reasoned, God could have made a universe in any of many possible ways - some complicated, some simple.

Suppose, for instance, that God made the world the way a child might make abstract art - splashing dots of color on a sheet of paper with a crayon. Even for an apparently random pattern of dots, Leibniz showed that it would be possible to devise a mathematical formula describing their positions. In other words, you could use the formula to plot a curving line that would pass through all the dots - in the order they were made.

For most patterns of dots, though, the formula would be extremely complicated. A "better" world might be built by arbitrarily changing certain aspects of nature. But events in such a world, though obeying some complicated formula, would appear to be utterly chaotic, beyond the power of human scientists to comprehend.

"When a rule is extremely complex, that which conforms to it passes for random," Leibniz observed.

To say that the world is ruled by "laws of nature," then, is not very meaningful unless the laws are relatively simple. You could in principle find a rule to describe the world no matter how random it looks. But unless the rule is short and sweet, you have not discovered a very useful law.
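Leibniz's dot argument can be sketched in a few lines of code. This is an illustration, not anything from the article itself: the `lagrange_value` function below is a hypothetical helper implementing standard Lagrange interpolation, which finds the unique polynomial of degree n-1 passing through any n dots. The catch, as Leibniz saw, is that the "rule" (the full list of dots it encodes) is as long as the data it describes.

```python
import random

def lagrange_value(xs, ys, x):
    """Evaluate the unique degree n-1 polynomial through the n dots (xs, ys) at x."""
    total = 0.0
    for i, (xi, yi) in enumerate(zip(xs, ys)):
        term = yi
        for j, xj in enumerate(xs):
            if j != i:
                # Each factor is 1 at xi and 0 at every other dot's x-coordinate.
                term *= (x - xj) / (xi - xj)
        total += term
    return total

random.seed(42)
n = 6
xs = list(range(n))                       # dots in the order they were made
ys = [random.random() for _ in range(n)]  # arbitrary "splashed" positions

# The interpolating curve passes through every dot exactly,
# no matter how random the dots look.
for xi, yi in zip(xs, ys):
    assert abs(lagrange_value(xs, ys, xi) - yi) < 1e-9
```

A formula always exists, in other words; what makes a pattern lawful is whether a formula much shorter than the data exists.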

Fortunately, the world that humans actually inhabit succumbs to rather simple formulas; scientists can describe all sorts of processes with only a few fundamental equations.

"God has chosen the most perfect world," Leibniz wrote in his Discourse on Metaphysics, "the one which is at the same time the simplest in hypothesis and the richest in phenomena."

In Leibniz's writings, Dr. Chaitin sees the precursor of his own seminal work in describing nature in terms of information. He was a pioneer in the development of algorithmic information theory, a method for gauging the complexity of a scientific theory.

A good theory, Dr. Chaitin explains, compresses lots of observations about nature into a concise mathematical statement - best expressed as an algorithm, or computer program. The algorithmic information content of a theory is the length of the shortest program that can reproduce all the observations.

"The smaller the program is, the better the theory," Dr. Chaitin writes in a recent paper (online at arXiv.org/math.HO/0306303). That is, a good theory, or law of nature, reduces complex data to a short (or simple) formula.
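The true shortest program is uncomputable, but the idea can be approximated with an ordinary compressor. The sketch below (my illustration, not Dr. Chaitin's method) uses zlib-compressed length as a crude upper bound on a sequence's algorithmic information content: a sequence generated by a simple rule compresses to a short description, while a random-looking one does not.

```python
import os
import zlib

def description_length(data: bytes) -> int:
    # A computable stand-in for algorithmic information content:
    # the length of a maximally zlib-compressed description.
    # This only gives an upper bound on the true program size.
    return len(zlib.compress(data, 9))

lawful = bytes(i % 256 for i in range(10_000))  # simple rule: count and wrap
random_looking = os.urandom(10_000)             # no short rule expected

# The lawful sequence admits a far shorter description
# than the random-looking one of the same length.
assert description_length(lawful) < description_length(random_looking)
```

In this spirit, "the smaller the program, the better the theory": the lawful sequence is the kind of world where science pays off.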

Leibniz's anticipation of this concept, plus his invention of binary digits, planted a 17th-century seed that is only now flowering in many scientific pursuits based on information processing. Dr. Chaitin cites recent advances in quantum information theory and quantum computing, the use of holographic information descriptions of black holes, and models of computation studied by Edward Fredkin (originator of the "digital philosophy" terminology).

"The digital philosophy paradigm is a direct intellectual descendant of Leibniz," writes Dr. Chaitin, of IBM's research laboratory in Yorktown Heights, N.Y. "The human race has finally caught up with this part of Leibniz's thinking."

The digital view of nature also represents a subtle revision of a more ancient philosophy - practiced by the followers of Pythagoras - often summarized by the statement that "Everything is number; God is a mathematician." Today, says Dr. Chaitin, the proper credo should be "Everything is software; God is a computer programmer."

This "digital" philosophy has become a new paradigm for science, and it's just what science needs to cope with the complexity that eludes description with traditional Newtonian physics, based on analysis of mass and motion, or matter and energy.

"In our new interest in complex systems, the concepts of energy and matter take second place to the concepts of information and computation," Dr. Chaitin asserts. The godlike authority of Newton and his materialist philosophy must yield to the new digital philosophy in order for science to extend the human ability to comprehend creation.

Newton's vision is the history of science; Leibniz's is the future.

"Newtonian physics," Dr. Chaitin avers, "is now receding into the dark, distant intellectual past."

E-mail tsiegfried@dallasnews.com

Tom Siegfried is the author of The Bit and the Pendulum: How the New Physics of Information is Revolutionizing Science and Strange Matters: Undiscovered Ideas at the Frontiers of Space and Time.