“Quantum Mechanics is a beautiful generalization of the laws of probability.”
– Scott Aaronson 

I have in general eschewed too many comments about the process of writing this book. Glimpses of Symmetry is meant to be a book about Group Theory, not a book about writing a book ^{[1]}. However, here I am going to make an exception. Has there ever been a book whose author has not suffered from writers’ block at some point? My guess is that there are probably few of these; Glimpses of Symmetry is not an exception to the rule. My specific block related to Quantum Mechanics. As advertised in the Introduction, this book is primarily about Mathematics, not Physics. However, the motivation behind the story that I have been trying to tell is of course Physics-based and relates to explaining the role that Group Theory plays in the Standard Model of Particle Physics. Given this Mathematics / Physics dichotomy, the sort of questions I have got caught up on are as follows:
Just how much of Quantum Mechanics (and Quantum Field Theory) do I want to include in the text?
 Quite a lot, so that the actual role of SU(3) × SU(2) × U(1) is laid bare?
 A handwavy outline (something common to many a book discussing Quantum Mechanics), which is just about enough of a framework on which to hang the Mathematics?
 Or something in between these two extremes?
A factor here is that I am not a Physicist. To do Quantum Mechanics any sort of justice, I would have to devote myself to learning a lot of things that I did not cover in University. The path that I thought was most likely ahead of me is eloquently described by Scott Aaronson as follows:
You start with classical mechanics and electrodynamics, solving lots of gruelling differential equations at every step. Then, you learn about the “blackbody paradox” and various strange experimental results, and the great crisis that these things posed for Physics. Next, you learn a complicated patchwork of ideas that physicists invented between 1900 and 1926 to try to make the crisis go away. Then, if you are lucky, after years of study, you finally get around to the central conceptual point: that nature is described not by probabilities (which are always nonnegative), but by numbers called amplitudes that can be positive, negative or even complex.
— Quantum Computing Since Democritus – Chapter 9
Scott Aaronson is a theoretical computer scientist and his above-referenced book proved something of an epiphany for me, allowing me to break through my writers’ block and begin the process of finishing this book. The central point that he makes is that, while you can approach learning Quantum Mechanics in the manner he describes above, you don’t have to. There is an alternative formulation, in which Quantum Mechanics “falls out” of a process of generalising probability theory. This path is essentially a Mathematical one and thus appeals strongly to me. It is Aaronson’s ideas that I will discuss in this Chapter. Just one benefit of his approach (at least speaking selfishly) is that the importance of Unitary matrices appears early on and in a very natural manner. Having acknowledged my considerable debt to Aaronson’s wonderful insight, let’s dive in. Given my desire to make this work as accessible as possible, I’ll start with a basic primer on probability theory.
Chances Are…
Probability Theory relates to the chances of something happening (or not happening). This is probably ^{[2]} an unsurprising statement. The main concepts in the field are essentially accessible to anyone with experience of placing a bet, or of trying to make a decision based on the likelihood of something else occurring. As Probability Theory is part of Mathematics, we need to bring numbers into the picture. We do this as follows:
If an event, let’s call it X, has n potential outcomes: {x_{1}, x_{2}, … , x_{n}}, then the probability of any one, say x_{i}, occurring is denoted by P(x_{i}). The value of P(x_{i}) is a Real Number in the range 0 to 1.
If an outcome cannot happen, its probability is assigned the value 0. If it must happen, its probability is assigned a value of 1.
If the outcome may occur, but is uncertain to do so, then its probability is neither 0, nor 1, but somewhere between the two. For example, if an outcome is just as likely to happen as to not happen (throwing tails with a fair coin for example), it has a probability of a half; we can also say a 1 in 2 chance (1/2 = 0.5 clearly).
The closer that a probability P(x_{i}) is to 1, the more likely it is to happen; the closer to 0, the less likely.
We can obviously also gather together the probabilities as follows: {P(x_{1}), P(x_{2}), … , P(x_{n})}, or more simply just {P_{1}, P_{2}, … , P_{n}}.
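The rules above can be expressed compactly in code. As a sketch (in Python, with a function name of my own choosing), a set of outcome probabilities {P_{1}, P_{2}, … , P_{n}} is just a list of numbers, each of which must lie between 0 and 1:

```python
def probabilities_in_range(probabilities):
    """Check that every P(x_i) is a Real Number between 0 and 1.

    0 means the outcome cannot happen, 1 means it must happen,
    and anything in between means it may or may not happen.
    """
    return all(0 <= p <= 1 for p in probabilities)

# A fair coin: tails is just as likely as heads, so each has probability 1/2.
print(probabilities_in_range([0.5, 0.5]))   # True
print(probabilities_in_range([1.5, -0.5]))  # False: values outside the range 0 to 1
```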
As ever an example should help to clarify things. Consider that staple of probability exercises, a fair, six-sided die. The basic outcomes are then throwing each of a one through to a six, which we can write {1,2,3,4,5,6}. Obviously, as a fair die does not favour any number over another, the probability of any specific number being thrown is one in six or 1/6. Thus P(1) = P(2) = … = P(6) = 1/6.
We can however consider some other outcomes:
 P(roll a seven) = 0
 P(number is less than or equal to 6) = 1
 P(even number) = 1/2
 P(prime number) = 1/2
We can see that some of these are compound outcomes, so, for example, P(even number) = P(2) + P(4) + P(6) = 1/6 + 1/6 + 1/6 = 1/2.
Returning to the general case, can we say anything about our set {P_{1}, P_{2}, … , P_{n}}? Well – assuming that the outcomes are mutually exclusive and that, between them, they cover every possibility – the sum of these probabilities must be 1.
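The die example can be checked directly. The sketch below (Python again, using the standard `fractions` module so that the arithmetic stays exact) computes the compound outcomes listed above and confirms that the basic probabilities sum to 1:

```python
from fractions import Fraction

# A fair six-sided die: each basic outcome 1..6 has probability 1/6.
die = {outcome: Fraction(1, 6) for outcome in range(1, 7)}

# A compound outcome's probability is the sum over its constituent outcomes.
p_seven = sum(p for x, p in die.items() if x == 7)          # impossible, so 0
p_at_most_six = sum(p for x, p in die.items() if x <= 6)    # certain, so 1
p_even = sum(p for x, p in die.items() if x % 2 == 0)       # 1/2
p_prime = sum(p for x, p in die.items() if x in {2, 3, 5})  # 1/2

# The probabilities of all the basic outcomes must sum to 1.
assert sum(die.values()) == 1
```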
[To be completed]


Chapter 23 – Notes
^{ [1]}  An image of an infinite stack of turtles swims unbidden into my mind. 
^{ [2]}  Irony intended. 
^{ [3]}  TBC. 
Text: © Peter James Thomas 2016–18. 