# A Very Brief (visual) History of Mathematics

A couple of weeks back Stephen Wolfram released a podcast where he went over the history of mathematics largely from memory.

While I was listening to the talk, I had a few dozen Wikipedia pages open that really brought the talk alive for me. This blog post is an attempt to bring you to the same state of mind I was in when I was listening.

We can trace back the origin of Mathematics to about 18,000 BC via the Ishango bone.

The etchings on the Ishango bone are reminiscent of tally marks, the system you’d use if you wanted to quickly count something.

And in fact we can see counting evolve in several different cultures through their scripts.

The first invented mathematical subjects were arithmetic and geometry, with the respective applications being accounting and architecture. Accounting is particularly important because it’s a system that allowed and encouraged trade to develop. The Phoenicians, for example, didn’t seem particularly interested in writing down poems or stories; their only surviving writings are ledgers.

Eventually we found more efficient representations that let us count large numbers more quickly. Here’s an early decimal system from ancient Egypt.

At some point numbers started to become something more abstract: instead of just taking on values, numbers could be used for arithmetic computation. While knowledge of addition α + β was evident even in the Ishango bone, it took a bit longer to understand the utility of

**Multiplication**: α × β which helps you count much more quickly

**Reciprocals**: 1 / α which then helps you understand division to split things up efficiently

While the origins of Mathematics are rooted in practical applications, at some point the objective truths it purported to give drew out an almost religious fervor.

Plato had strong contributions to voting theory and policy decisions, which necessitated the introduction of mathematical logic: verifying whether arguments like the one below are valid.

“All cats are mammals”

“All mammals are mortal”

Therefore all cats are mortal

However, Plato’s mistake was that he thought you could reason about the world in pure thought without any regard to the physical world. He’s the original armchair philosopher.

Pythagoras (~600 BC) started a communal living society that sought to further mathematical knowledge and see its relationship to all other aspects of life.

Oddly enough for such an avant-garde society, Pythagoras’ followers imploded upon their discovery of irrational numbers like the square root of 2. Even mentioning their existence shattered the supposed purity of mathematics and of the world it was trying to describe.

One of the most famous proofs of the existence of irrational numbers came almost 300 years later from Euclid (~300 BC). His book Elements is the second most published book of all time after the Bible. In fact, up until recently, reading Elements in Greek was a hallmark of prestigious schools and universities around the world. Euclid is probably the most “Lindy” mathematician: most of what he wrote still holds up. (Not true for Plato or Aristotle.)

And so, with a growing body of knowledge and a new structure involving theorems and proofs, mathematics from all over the world was collected and centralized at the Library of Alexandria.

Of particular interest were the contributions of the Babylonians, who were big on predictions, whether predicting astronomical events for agricultural purposes or predicting who would win a battle based on various observations. In some sense the Babylonians are the inventors of econometrics and data analysis.

While Euclid formalized 2-dimensional geometry, the world around us is 3-dimensional, so it was only a matter of time before Archimedes (~250 BC) started estimating π to solve 3-d geometry problems like the law of the lever or the motion of water inside a screw pump.

On the other side of the world, the production rules of Sanskrit grammar necessitated a more discrete kind of mathematics, one that works on words and phrases and not just algebraic or geometric primitives. (~200 BC)

α + β → γ

β + α → α + β
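These rules have the flavor of a string-rewriting system, which we can sketch in a few lines of Python (the symbols and rules here are illustrative toys, not Pāṇini’s actual grammar):

```python
# A minimal string-rewriting system: repeatedly apply production rules
# until no rule matches. Rules and symbols are illustrative toys.

rules = [
    ("b + a", "a + b"),   # reorder terms into a canonical form
    ("a + b", "c"),       # collapse a known pattern into one symbol
]

def rewrite(expr, rules):
    changed = True
    while changed:
        changed = False
        for lhs, rhs in rules:
            if lhs in expr:
                expr = expr.replace(lhs, rhs, 1)
                changed = True
    return expr

print(rewrite("b + a", rules))  # "b + a" -> "a + b" -> "c"
```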

And in the classic I Ching text in China (~900 BC), discrete structures like the hexagram take on a spiritual meaning.

Back in Greece we find the first known analog computer, the Antikythera mechanism (~87 BC), which was used for astronomical predictions.

Already we see signs of humans realizing that mathematical calculation is tedious and error-prone, and that these errors can be reduced by machines. It took about 1,000 more years to develop better number systems.

We see the first familiar variables x, a, α, etc. introduced around 1200 as placeholders for different possible values.

Fibonacci introduced Arabic numerals to the West in 1202 with his Liber Abaci; the main motivation was accounting for commerce.

Have fun adding V + I + VII vs 5 + 1 + 7
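To feel the difference, here’s a small Python converter (the subtractive-rule handling is a standard exercise, not historical). Roman arithmetic forces you to decode each numeral first, while positional digits just add column by column:

```python
# Convert a Roman numeral to an integer, handling the subtractive rule
# (e.g. IV = 4 because I comes before the larger V).

VALUES = {"I": 1, "V": 5, "X": 10, "L": 50, "C": 100, "D": 500, "M": 1000}

def roman_to_int(s):
    total = 0
    for i, ch in enumerate(s):
        v = VALUES[ch]
        if i + 1 < len(s) and VALUES[s[i + 1]] > v:
            total -= v  # smaller symbol before a larger one subtracts
        else:
            total += v
    return total

print(roman_to_int("V") + roman_to_int("I") + roman_to_int("VII"))  # 13
```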

The Arabic numerals should really be called the Brahmi numerals (~250 BC); it took over a thousand years for people in the West to see their utility.

Fractals started showing up around 1205 with the Cosmati family, which over the course of four generations worked on decorative fractal mosaics.

Standard symbols for operations like addition +, subtraction −, and multiplication × only started to appear in the late 1400s and onward, and this initiated rapid progress in math because notation became less of a barrier to understanding new work.

Polynomials became all the rage when representing the world algebraically took off, so naturally people started looking at the roots f(x) = 0 of different equations. It was a huge surprise, to say the least, that even though you can visually inspect any cubic function and verify that there are points where it crosses 0, the formulas for those roots could pass through complex numbers

α + β i where i² = -1

This is also why no one paid much attention to the square root of −1 before: complex or imaginary solutions felt fictional in the context of quadratic equations. A function like f(x) = x² + 1 doesn’t seem to need an imaginary solution; it simply never crosses 0.
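Today these fictional numbers are an ordinary data type. In Python, for example, 1j plays the role of i, and the quadratic formula happily produces the imaginary roots of x² + 1 (a quick illustrative check):

```python
import cmath

# Python has complex numbers built in: 1j is i, and 1j**2 == -1.
print((1j) ** 2)  # (-1+0j)

# x^2 + 1 = 0 has no real roots, but two complex ones: +i and -i.
a, b, c = 1, 0, 1
disc = cmath.sqrt(b * b - 4 * a * c)
roots = ((-b + disc) / (2 * a), (-b - disc) / (2 * a))
print(roots)  # (1j, -1j)
```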

In 1614 John Napier invented the logarithm, which turned large multiplications into much smaller additions on his calculating tables.

log(a × b) = log(a) + log(b)

Let’s say you wanted to multiply 10 × 100: instead of moving 1,000 blocks one by one, you would only need to move 1 + 2 = 3.
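You can check the identity directly (floating point makes log values approximate, so we compare with a tolerance):

```python
import math

# log turns multiplication into addition: log(a*b) = log(a) + log(b).
a, b = 10, 100
lhs = math.log10(a * b)
rhs = math.log10(a) + math.log10(b)
print(lhs, rhs)  # both 3.0, up to floating-point rounding
```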

John Napier was also rumored to be an accomplished necromancer.

Descartes introduced coordinate systems in 1637 and it would play a huge role in the coming revolution of Newtonian physics.

Probability theory mainly came about to study gambling problems and was pioneered in the West by Cardano in the 1600s. There is also older work from between 800 and 1300, where Al-Kindi pioneered the use of statistical inference for cryptanalysis.

In one of the more poetic coincidences in science, Galileo died in 1642, the same year Newton was born (January 4, 1643 on the Gregorian calendar, but December 25, 1642 on the Julian calendar then in use in England).

Newton really pushed the idea that you could reason about the physical world using math, and to do this he had to invent calculus.

Calculus was also independently invented by Leibniz, who pioneered a more symbolic version of it. Leibniz also wanted to develop a distributed contract language, more than 300 years before cryptocurrencies became a mainstream topic.

Math textbooks for kids started appearing in the 1600s; before that, textbooks were mostly traded among merchants, since the techniques provided a competitive business advantage to those who mastered them.

In the 1700s there was an explosion of research in calculus pioneered by Euler, Lagrange, and Laplace, and this is actually a fairly defining moment in mathematics: the papers from this era are still readable to us right now.

Gauss was involved in land surveying, so he invented differential geometry to make his job easier. He needed, for example, to calculate the area of complicated surfaces with lots of ups and downs.

Galois (~1820s) was a hotheaded French activist whose accomplishments in activism were rather terrible relative to his accomplishments in math. He’s the inventor of Group Theory, which was a strong departure from math as a geometric language toward something higher level.

We take for granted that α + β = β + α (the commutative property), but this is not true for all number systems, for all types of numbers, or for all kinds of spaces. So group theory is really about studying when symmetries like the one above hold and when they don’t.
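A concrete place where commutativity fails is composing linear maps, i.e. matrix multiplication. A quick pure-Python check (the example matrices are my own):

```python
# Multiplication of 2x2 matrices is a classic example where
# commutativity fails: in general A*B != B*A.

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

A = [[1, 1], [0, 1]]   # a shear
B = [[0, -1], [1, 0]]  # a 90-degree rotation

print(matmul(A, B))  # [[1, -1], [1, 0]]
print(matmul(B, A))  # [[0, -1], [1, 1]]  -- a different answer!
```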

Most of modern mathematics uses some form of group theory, the parent subject being abstract algebra. While most people have heard of calculus, most haven’t heard of anything in mathematics beyond group theory, which is just unfortunate.

Galois died at 20 years old in a duel over a mutual love interest, having written down his groundbreaking research the night before he died.

While abstract algebra does sound really abstract, it has applications when you consider that basic properties like commutativity fail to hold in settings like curved spaces, which ended up being fundamental to the theories of relativity.

In 1827 Möbius introduced homogeneous coordinates, which formalized how to project a 3-dimensional space like the real world onto a 2-dimensional space like a piece of paper or a computer screen, by explicitly adding a point at infinity, or vanishing point, where parallel lines intersect. Your computer renders stuff on your screen based on insight from 1827.

Artists on the other hand had an understanding of projective geometry long before Mathematicians and it was used to great effect during the Renaissance.

3-dimensional geometry became a mainstream topic, and it really needed a theory for how to represent rotations. In 2 dimensions it was simple, a single angle (or a unit complex number) does the job, but in 3 dimensions representations based on 3 numbers had numerous issues, until Hamilton in 1843 wrote down the brilliant insight that you could represent 3-d rotations using 4 numbers instead of 3. Quaternions were born that day, and game programmers 150 years later are still rejoicing.
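Hamilton’s insight in a few lines of Python (the multiplication rule is the standard Hamilton product; the helper names are mine):

```python
import math

# A quaternion is 4 numbers (w, x, y, z) with a special product.
def qmul(q, r):
    w1, x1, y1, z1 = q
    w2, x2, y2, z2 = r
    return (w1*w2 - x1*x2 - y1*y2 - z1*z2,
            w1*x2 + x1*w2 + y1*z2 - z1*y2,
            w1*y2 - x1*z2 + y1*w2 + z1*x2,
            w1*z2 + x1*y2 - y1*x2 + z1*w2)

# Rotate vector v by angle theta around a unit axis:
# embed v as (0, v), then compute q * (0, v) * conj(q).
def rotate(v, axis, theta):
    s, c = math.sin(theta / 2), math.cos(theta / 2)
    q = (c, axis[0] * s, axis[1] * s, axis[2] * s)
    qc = (q[0], -q[1], -q[2], -q[3])
    _, x, y, z = qmul(qmul(q, (0.0, *v)), qc)
    return (x, y, z)

# Rotating the x axis 90 degrees around the z axis gives the y axis.
print(rotate((1, 0, 0), (0, 0, 1), math.pi / 2))  # ~ (0.0, 1.0, 0.0)
```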

In 1889 Peano published his axioms for arithmetic, a major improvement in logical rigor: with only a handful of axioms (statements assumed to be true) you can derive the arithmetic of the natural numbers. Peano also wrote in an axiomatic style, which makes his work difficult to read as a human but easy to read if you’re a computer.
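A sketch of the Peano flavor in Python, where a number is just “zero or the successor of another number” (the encoding is mine, not Peano’s notation):

```python
# Peano-style naturals: ZERO, and S(n) for "the number after n".
ZERO = ()

def S(n):
    """Successor: the number after n."""
    return (n,)

def add(m, n):
    # m + 0 = m            (base axiom)
    # m + S(n) = S(m + n)  (inductive axiom)
    return m if n == ZERO else S(add(m, n[0]))

def to_int(n):
    return 0 if n == ZERO else 1 + to_int(n[0])

two = S(S(ZERO))
three = S(S(S(ZERO)))
print(to_int(add(three, two)))  # 5
```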

In 1920 Hilbert started an ambitious program to solve all open problems in mathematics; the idea was that you would define a couple of simple rules and then derive all possible truths of mathematics from them.

Hilbert wanted to feed axioms in and get all of math out. Then Gödel’s incompleteness theorems came out in 1931 and shook everything, by proving that Hilbert’s dream was not possible.

Back in 1892, Poincaré was obsessed with the 3-body problem and started thinking about phase space, which led him to study higher-level shapes and invent topology, dynamical systems theory, and chaos theory. The main idea: how do you describe the evolution of a physical system using a representation different from the familiar Cartesian coordinates, one that lets you actually tame the instabilities associated with the 3-body problem? Changing spaces to get easier problems is an extremely powerful and recurring idea in modern math.

Knots provide a good way to introduce topology and algebra to kids. Given two knots, can you deform one into the other? It turns out there are unique prime knots, analogous to the prime numbers.

In 1822 Fourier developed a theory of heat, which required him to come up with Fourier series, again a way of morphing a space from one form to another to make problems easier to work with.

The heat equation, ∂u/∂t = α ∂²u/∂x², is an example of a differential equation: it tracks the change of one variable (heat) with respect to others (time and position).

And we can further compress this representation if we use linear algebra.
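As a sketch of that compression (toy parameters, plain Python, all names are mine): discretizing a rod turns the second derivative into a tridiagonal matrix A, and each explicit time step into the matrix-vector update u ← u + dt·A·u.

```python
# Discretize the 1-d heat equation du/dt = alpha * d2u/dx2 on a rod.
# The second derivative becomes a tridiagonal matrix, so one explicit
# time step is just a matrix-vector product.

n, alpha, dx, dt = 5, 1.0, 1.0, 0.1  # toy parameters (stable: dt*alpha/dx^2 < 0.5)

# Second-difference matrix, with the rod's ends held at 0.
A = [[0.0] * n for _ in range(n)]
for i in range(n):
    A[i][i] = -2 * alpha / dx**2
    if i > 0:
        A[i][i - 1] = alpha / dx**2
    if i < n - 1:
        A[i][i + 1] = alpha / dx**2

u = [0.0, 0.0, 1.0, 0.0, 0.0]  # a spike of heat in the middle
for _ in range(10):
    Au = [sum(A[i][j] * u[j] for j in range(n)) for i in range(n)]
    u = [u[i] + dt * Au[i] for i in range(n)]

print([round(x, 3) for x in u])  # the spike has diffused outward symmetrically
```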

So we had methods to describe the evolution of systems, and we had methods to compress the representation into something nicely computable on a computer, but we didn’t have computers yet.

Fortunately, in 1936, Alan Turing (Turing Machines) and Alonzo Church (λ-calculus) independently came up with the idea of universal computation, and the two models turned out to be equivalent: anything computable by a Turing Machine is computable in the λ-calculus and vice versa. Turing Machines work by writing symbols on a tape, and the λ-calculus works by composing mathematical functions.
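A tiny taste of the λ-calculus side: Church numerals, where the number n is literally “apply f n times” (an illustrative sketch in Python):

```python
# Church numerals: numbers as repeated function application.
zero = lambda f: lambda x: x
succ = lambda n: lambda f: lambda x: f(n(f)(x))
add = lambda m: lambda n: lambda f: lambda x: m(f)(n(f)(x))

def to_int(n):
    # Decode by applying "+1" starting from 0.
    return n(lambda k: k + 1)(0)

two = succ(succ(zero))
three = succ(two)
print(to_int(add(two)(three)))  # 5
```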

Earlier, in 1854, George Boole had come up with Boolean algebra, which lets you express any Turing Machine computation using 1, 0, and a couple of simple rules, and which later paved the way for Claude Shannon to invent Information Theory and develop the idea that we can transmit information using bits. Phones led to computers, which led to the internet.
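To make the “couple of simple rules” concrete, here’s a sketch showing the familiar gates, and a 1-bit adder, built from NAND alone (a standard exercise, not Boole’s own notation):

```python
# Every logic gate can be expressed with NAND alone, and circuits are
# just compositions of gates.

def NAND(a, b):
    return 1 - (a & b)

def NOT(a):    return NAND(a, a)
def AND(a, b): return NOT(NAND(a, b))
def OR(a, b):  return NAND(NOT(a), NOT(b))
def XOR(a, b): return AND(OR(a, b), NAND(a, b))

# A half adder: adds two bits, producing a sum bit and a carry bit.
def half_adder(a, b):
    return XOR(a, b), AND(a, b)

print(half_adder(1, 1))  # (0, 1): 1 + 1 = binary 10
```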

After the 1960s there was, and still is, a revolution in scientific computing, including dynamical systems and control systems for guided ballistics and rocketry. Numerical simulation, linear algebra, and partial differential equations became important research topics that received tremendous funding, especially if you consider how this aligned with tensions between the U.S.A. and the U.S.S.R.

After WW2, partial differential equations became such a big field that they caused a rift, which exists to this day, between applied mathematicians who work on numerical methods (including machine learning) and abstract mathematicians who deal with topics like topology, algebra, category theory, and number theory.

Even though a large part of modern math is about proving things, the point of math is not so much to prove things as to compute them. That is why in the past 3 years there has been an explosion of papers using modern abstract math to better understand and speed up applied math. Open-access publishing via websites like arxiv.org has allowed research communities that would otherwise never have interacted, because of artificial paywalls, to collaborate and lean on each other.

About 3,000,000 math theorems exist today, and more theorems have been proved in the last 50 years than in the entire prior history of mankind. You’re bound to find something useful; there’s never been a better time to get into math.

# Next steps

If you’ve enjoyed this post you’re likely to enjoy the full podcast by Wolfram.

If you’d rather read a book, History of Mathematics is exceptional and goes into more detail than the podcast does, covering concepts the podcast misses, such as Lie groups.

I’m planning an extension part 2 of this blog post that would also cover content from this book including but not limited to Probability Theory, Geometric Algebra and Lie Groups.

So please subscribe if you’re interested!

Also you may be interested in my upcoming book “**Robot Overlord Manual**” which will help get you up to speed with all the modern math you’ll need to start building your own robots at home.

Sign up for my mailing list and I’ll send you a maximum of 2 emails. One when part 2 of this blog post comes out and one when my book comes out.

Randomly browsing Wikipedia is medicine for the soul