“The universe cannot be read until we have learnt the language and become familiar with the characters in which it is written. It is written in mathematical language, and the letters are triangles, circles and other geometrical figures, without which means it is humanly impossible to comprehend a single word.” -Galileo Galilei
I’ve always loved this quote. In it, Galileo lays out his philosophy very simply: math is the language of the universe. This was a novel idea for the time; most looked to the church to explain natural phenomena. Galileo used this mindset to make massive strides in our understanding of physics and astronomy, completely reshaping our view of the universe.
The reason this quote has stuck with me is the analogy it creates between math and language. Languages are complex, ever-evolving forms of communication that allow us to convey ideas. Mathematics aims to do something similar; it's just specialized for a set of particularly abstract concepts. To serve this purpose, mathematical notation must be unambiguous (more on this later). This has led to an immense number of mathematical symbols, created to distinguish between an ever-growing list of ideas. Many of these symbols are combined with devices such as subscripts and superscripts to further distinguish between mathematical objects.
Like language, mathematics contains a variety of dialects. Different fields of mathematics have different needs, and the notations within these fields evolve to reflect them. This is a long, difficult process. Newer fields, like category theory, are still working to establish their own "standards." Most importantly, all fields of math share a common notational core, keeping them united under one language.
Earlier, I said that mathematical notation must be precise in order to accurately convey an idea. This is much easier said than done. Because math is ultimately a human endeavor, with thousands of us working to expand our sphere of knowledge, notation can never be quite as rigorous as we'd like. See this page for several examples of differences in notation that can potentially lead to confusion. When I was studying math and physics for my undergraduate degree, I took Multivariable Mathematics and Mathematical Physics in the same semester. Both classes dealt with Stokes' Theorem, but used radically different notation and wording to describe it. Reconciling these two versions of the same theorem was a huge challenge for me. I can only imagine the pain of reading through mathematical papers, each with its own take on how symbols should be used.
Of course, the constant shift in mathematical notation is essential. Without it, we would still be stuck with Roman numerals, which become incredibly clunky for large numbers. Richard Feynman famously invented his own notation for sine and cosine which, while pretty, never caught on (shown below).
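To make the Roman-numeral point concrete, here is a small sketch (the `roman_numeral` helper is something I wrote for this illustration, not a standard function): a number that takes four Arabic digits can balloon into more than a dozen Roman symbols.

```python
def roman_numeral(n):
    """Convert a positive integer to its Roman numeral string."""
    values = [(1000, "M"), (900, "CM"), (500, "D"), (400, "CD"),
              (100, "C"), (90, "XC"), (50, "L"), (40, "XL"),
              (10, "X"), (9, "IX"), (5, "V"), (4, "IV"), (1, "I")]
    digits = []
    for value, symbol in values:
        # Greedily take the largest symbol that still fits.
        while n >= value:
            digits.append(symbol)
            n -= value
    return "".join(digits)

print(roman_numeral(1888))  # MDCCCLXXXVIII: 13 symbols for a 4-digit number
```

Positional notation didn't just look nicer; it made arithmetic algorithms tractable, which is exactly the kind of payoff a notation change can bring.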
My absolute favorite article on Stack Exchange is linked here. It contains a bunch of alternate notation for logarithms, exponents, and roots: three concepts that are intimately related. However, their respective notations don’t indicate this. Commenters come up with several beautiful ideas; it’s a fascinating read. Looking through these suggestions will help you understand the ideas I present in the remainder of this article. If you want a video summary, 3Blue1Brown has a fantastic presentation on it.
The discussion linked above leads to an interesting question: should we change the notations for logarithms, exponents, and roots? Our current symbols are certainly precise; they convey exactly what we need to know to perform each operation. Yet they still feel like something is missing. I strongly sympathize with the question-asker in that Stack Exchange discussion.
I distinctly remember being baffled by logarithms when they were first presented to me. I only made it through high-school math by blindly memorizing what they do and how to calculate the logarithms provided to me. Only much later in my mathematical education did I realize the strong relationship connecting logarithms to exponents and roots. Of course, this is only a personal anecdote, but it is interesting to consider how this could be avoided. Would this lapse have occurred if I had been presented with a different notation?
Believe it or not, the symbol π is also controversial. There is a group dedicated to replacing π with τ (pronounced tau), where τ = 2π. At first, this seems rather silly. Why is this necessary? There is a lot of culture around π, most notably π-day, which occurs on March 14. In addition, an immense number of mathematical textbooks would have to be rewritten. Proponents of τ have written out an extensive list of reasons for this change. While they are presented in a joking manner, the arguments laid out are rather convincing, and I would encourage you to read them. Using τ instead of π might improve mathematical education, but this has yet to be demonstrated. (I happen to be a supporter of τ only because I was born on τ-day.)
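The core of the τ argument is easy to demonstrate. With τ defined as one full turn, a fraction of a turn is the same fraction of τ, whereas with π a quarter turn becomes the less obvious π/2. A minimal sketch, using Python's built-in `math.tau` constant:

```python
import math

# tau is defined as one full turn of a circle, i.e. 2 * pi.
assert math.isclose(math.tau, 2 * math.pi)

# With tau, the angle notation reads directly as a fraction of a turn:
quarter_turn = math.tau / 4   # with pi, this is written pi/2
half_turn = math.tau / 2      # with pi, this is just pi

assert math.isclose(math.sin(quarter_turn), 1.0)  # sine peaks at a quarter turn
assert math.isclose(math.cos(half_turn), -1.0)    # cosine bottoms out at a half turn
```

Whether this readability actually helps students is, as noted above, an open question, but the notation itself is straightforward.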
Communicating mathematics is incredibly important. Just like a language, math is all about expressing complicated ideas in simple ways. It's important to keep discussing how we talk about math and how that could be improved. If you're interested in notation and how it can be used to describe more ideas (not just in mathematics), then I highly recommend you check out this collection of pieces. It served as my inspiration for this article (some of the pieces are linked already) and has so many cool thoughts in it!