Monday, July 16, 2012

The Origin of Force

 

As I have discussed at length in this previous post, interference is a key phenomenon in quantum theory. In this post, we will see how it can be used to explain the existence of forces between certain objects, using the electromagnetic force as our main example.
The usual popular account of the quantum origin of forces rests on the notion of virtual particles. Basically, two charged particles are depicted as 'ice skaters' on a frictionless plane; they exchange momentum via appropriate virtual particles, i.e. one skater throws a ball over to the other, and both receive an equal amount of momentum imparted in opposite directions. This nicely explains repulsive forces, i.e. the case in which both skaters carry charges of the same sign. In order to explain attraction as well, the virtual particles have to be endowed with a negative momentum, causing both parties to experience a momentum change in the direction towards the other. Sometimes, this is accompanied by some waffle about how this is OK for virtual particles, since they are not 'on-shell' (which is true, but a highly nontrivial concept to appeal to in a 'popular level' explanation).
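In symbols, the bookkeeping behind the analogy is just momentum conservation (a schematic version of my own, with q denoting the momentum carried by the exchanged 'ball'):

$$
\Delta p_{1} = -\,q, \qquad \Delta p_{2} = +\,q, \qquad \Delta p_{1} + \Delta p_{2} = 0 .
$$

For an ordinary ball thrown from skater 1 to skater 2, q points from 1 towards 2, so the skaters recoil apart; to get attraction out of the same picture, q has to point the 'wrong' way, which is exactly the negative momentum the virtual particle is asked to carry.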
In this post, after this introduction, I will not talk about virtual particles any further. The reason for this is twofold. First, the picture one gets from the 'ice-skater' analogy is irreducibly classical and thus obfuscates the true quantum nature of the process, leaving the reader with an at best misleading, at worst simply wrong impression. Second, and a bit more technically, virtual particles are artifacts of what is called a perturbation expansion. Roughly, this denotes an approximation to an actual physical process obtained by taking into account all possible ways the process can occur, and then summing them to derive the full amplitude -- if you're somewhat versed in mathematical terminology, it's similar to approximating a function by means of a Taylor series. The crucial point is that virtual particles appear in each individual term of this expansion, but the physical process does not correspond to any single term -- only to their totality. So the virtual-particle analogy can't give you the full picture.
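To make the Taylor-series analogy a little more concrete, here is the schematic form such an expansion takes (a generic sketch in standard notation, not tied to any particular process), with g the small coupling constant:

$$
\mathcal{A} \;=\; \sum_{n=0}^{\infty} g^{n}\,\mathcal{A}_{n}
\;=\; \mathcal{A}_{0} + g\,\mathcal{A}_{1} + g^{2}\,\mathcal{A}_{2} + \dots
\qquad\text{compare}\qquad
f(x) \;=\; \sum_{n=0}^{\infty} \frac{f^{(n)}(0)}{n!}\,x^{n} .
$$

Each individual term $\mathcal{A}_{n}$ is where the virtual particles live (in diagrammatic language, it collects the Feynman diagrams with a given number of vertices); the physical amplitude $\mathcal{A}$ is only the whole sum, which is why no single term, and hence no particular virtual-particle picture, captures the process by itself.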

Monday, January 16, 2012

The Interpretation of Quantum Mechanics




So far on this blog, I have argued that quantum mechanics is most aptly seen as a generalization of probability theory, necessary to account for complementary propositions (propositions which can't jointly be known exactly). Quantum mechanics can then be seen to emerge either as a generalization (more accurately, a deformation) of statistical mechanics on phase space, or, more abstractly (but conceptually cleaner), as deriving from quantum logic in the same way classical probability derives from classical, i.e. Boolean, logic.
Using this picture, we've had a look at how it helps explain two of quantum mechanics' most prominent, and at the same time, most mysterious consequences -- the phenomena of interference and entanglement, both of which are often thought to lie at the heart of quantum mechanics.
In this post, I want to have a look at the interpretation of quantum mechanics, and how the previously developed picture helps to make sense of the theory. But first, we need to take a look at what, exactly, an interpretation of quantum mechanics is supposed to accomplish -- and whether we in fact need one (because if we find that we don't, I could save myself a lot of writing).

Thursday, December 29, 2011

Untangling Entanglement



What interference was to Feynman (see the previous post), entanglement was to Erwin Schrödinger (he of the cat): the 'essence' of quantum mechanics. Entanglement is often portrayed as one of the most outlandish features of quantum mechanics: the seemingly preposterous notion that the outcome of a measurement conducted over here can instantaneously influence the outcome of a measurement carried out way over there.
Indeed, Albert Einstein himself was so taken aback by this consequence of quantum mechanics (a theory which, after all, he helped to create), that he derided it as 'spooky' action at a distance, and never fully accepted it in his lifetime.
However, if we view quantum mechanics as a simple generalization of probability theory, adopted in order to deal with the complementary propositions that arise when not all possible properties of a system are simultaneously decidable, then quantum entanglement is unmasked as not really that strange after all, but rather as a natural consequence of the limited information content of quantum systems. In brief, quantum entanglement does not qualitatively differ from classical correlation; however, the amount of information carried by the correlation exceeds the bounds imposed by classical probability theory.
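To illustrate that last point, here is a minimal numpy sketch (my own illustration, not taken from the post) that computes the measurement correlations of a maximally entangled two-qubit state and evaluates the CHSH combination, which for any classical (local hidden variable) correlation is bounded in magnitude by 2:

```python
import numpy as np

# Pauli matrices for spin measurements
sx = np.array([[0, 1], [1, 0]], dtype=complex)
sz = np.array([[1, 0], [0, -1]], dtype=complex)

def spin_op(theta):
    """Spin measurement along a direction in the x-z plane, at angle theta from the z-axis."""
    return np.cos(theta) * sz + np.sin(theta) * sx

# Singlet (maximally entangled) state of two qubits: (|01> - |10>)/sqrt(2)
psi = (np.kron([1.0, 0.0], [0.0, 1.0]) - np.kron([0.0, 1.0], [1.0, 0.0])) / np.sqrt(2)

def E(theta_a, theta_b):
    """Expectation value of the product of the two (+1/-1-valued) measurement outcomes."""
    op = np.kron(spin_op(theta_a), spin_op(theta_b))
    return np.real(psi.conj() @ op @ psi)

# CHSH combination with the standard optimal angle choices
a, a2 = 0.0, np.pi / 2
b, b2 = np.pi / 4, 3 * np.pi / 4
S = E(a, b) - E(a, b2) + E(a2, b) + E(a2, b2)
print(abs(S))  # ~2.828 = 2*sqrt(2): the correlations exceed the classical bound of 2
```

Each individual correlation function looks perfectly classical (it is just an expectation value of outcomes that are each +1 or -1); it is only the combination of correlations across incompatible measurement settings that exceeds what any classical joint probability distribution could deliver.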

Saturday, December 17, 2011

What is Quantum Mechanics?



So far, I've told you a little about where I believe quantum theory comes from. To briefly recap: information-theoretic incompleteness, a feature of every universal system (where 'universal' is to be understood in the sense of 'computationally universal'), introduces the notion of complementarity. This can be interpreted as the impossibility for any physical system to answer more than finitely many questions about its state -- i.e. it furnishes an absolute restriction on the amount of information contained within any given system. From this, one gets to quantum theory either via a deformation of statistical mechanics (more accurately, Liouville mechanics, i.e. statistical mechanics on phase space), or, more abstractly, via introducing the possibility of complementary propositions into logic. In both cases, quantum mechanics emerges as a generalization of ordinary probability theory. Both points of view have their advantages: the former is more intuitive, relying on little more than an understanding of the notions of position and momentum, while the abstractness of the latter, and especially its independence from the concepts of classical mechanics, highlights the fundamental nature of the theory -- it is not merely an empirically adequate description of nature, but a necessary consequence of dealing with arbitrary systems of limited information content. For a third way of telling the story of quantum mechanics as a generalized probability theory, see this lecture by Scott Aaronson, author of the always-interesting Shtetl-Optimized.
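To give the word 'deformation' a little more content: in the phase-space route, the Poisson bracket of Liouville mechanics is replaced by an ħ-dependent bracket that reduces to it in the classical limit. Schematically (this is the standard Moyal form, quoted here for orientation rather than taken from the posts linked above):

$$
\{f, g\}_{\mathrm{PB}} \;\longrightarrow\; \{f, g\}_{\star} \;=\; \frac{f \star g - g \star f}{i\hbar} \;=\; \{f, g\}_{\mathrm{PB}} + \mathcal{O}(\hbar^{2}),
$$

where $\star$ is the Moyal star product; classical statistical mechanics is recovered as $\hbar \to 0$, while the higher-order terms carry the genuinely quantum corrections.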
But now, it's high time I tell you a little something about what, actually, this generalized theory of probability is, how it works, and what it tells us about the world we're living in. First, however, I'll tell you a little about the mathematics of waves, the concept of phase, and the phenomenon of interference.

Monday, December 5, 2011

The Emergence of Law



For many scientists, the notion of a lawful, physical universe is a very attractive one -- it implies that in principle, everything is explicable through appeal to notions (more or less) directly accessible to us via scientific investigation. If the universe were not lawful, then it seems that any attempt at explanation would be futile; if it were not (just) physical, then elements necessary to its explanation may lie in a 'supernatural' realm that is not accessible to us by reliable means. Of course, the universe may be physical and lawful, but just too damn complicated for us to explain -- this is a possibility, but it's not something we can really do anything about.
(I have previously given a plausibility argument that if the universe is computable, then it is in principle also understandable, human minds being capable of universal computation at least in the limit; however, the feasibility of this understanding, of undertaking the necessary computations, is an entirely different question. There are arguments one can make that if the universe is computable, one should expect it to be relatively simple, see for instance this paper by Jürgen Schmidhuber, but a detailed discussion would take us too far afield.)
But first, I want to take a moment to address a concern (in my opinion, a misplaced one) that some may have about proposing 'explanations' for the universe, or perhaps about the desirability of doing so: isn't such a thing terribly reductionist? Is it desirable to reduce the universe, and moreover, human experience within the universe, to some cold scientific theory? Doesn't such an explanation miss everything that makes life worth living?
I have already said some words about the apparent divide between those who want to find an explanation for the world and those who prefer, for lack of a better word, some mystery and magic to sterile facts, in this previous post. Suffice it to say that I believe both groups' wishes can be granted: the world may be fully explicable, and yet full of mystery. The reason is that even if some fundamental law is known, it does not fix all facts about the world -- or, more precisely, not all facts can be deduced from it: for any sufficiently complex system, there exist undecidable questions about its evolution. Thus, there will always be novelty, always be mystery, and always be a need for creativity. That an underlying explanation for a system's behaviour is known does not cheapen the phenomena it gives rise to; in particular, the value of human experiences lies in the experiences themselves, not in the question of whether they are generated by some algorithmic rule or are the result of an irreducible mystery.

Saturday, November 19, 2011

The Origin of the Quantum, Part III: Deviant Logic and Exotic Probability



Classical logic is a system concerned with certain objects that can take either of two values (usually interpreted as propositions that may be either true or false, commonly denoted 1 or 0 for short), and with ways to connect them. Though its origins can be traced back to antiquity, and to the Stoic philosopher Chrysippus in particular, its modern form was essentially introduced by the English mathematician and philosopher George Boole (and is thus also known under the name Boolean algebra) in his 1854 book An Investigation of the Laws of Thought, and was intended by him as a formalization of how humans carry out mental operations. To this end, Boole introduced certain connectives and operations, intended to capture the ways a human mind connects and operates on propositions in the process of reasoning.
An elementary operation is that of negation. As the name implies, it turns a proposition into its opposite, e.g. 'it is raining today' into 'it is not raining today'. If we write 'it is raining today' for short as p, 'it is not raining today' gets represented as ¬p, '¬' thus being the symbol of negation.
Two propositions, p and q, can be connected to form a third, composite proposition r in various ways. The most elementary and intuitive connectives are the logical and, denoted by ∧, and the logical or, denoted ∨.
These are intended to capture the intuitive notions of 'and' and 'or': a composite proposition r, formed by the 'and' (the conjunction) of two propositions p and q, i.e. r = p ∧ q, is true if both of its constituent propositions are true -- i.e. if p is true and q is true. Similarly, a composite proposition s, formed by the 'or' (the disjunction) of two propositions p and q, i.e. s = p ∨ q, is true if at least one of its constituent propositions is true, i.e. if p is true or q is true. So 'it is raining and I am getting wet' is true if it is both true that it is raining and that I am getting wet, while 'I am wearing a brown shirt or I am wearing black pants' is true if I am wearing either a brown shirt or black pants -- but also if I am wearing both! This is a subtle but important difference from the way we usually use the word 'or': typically, we understand 'or' in the so-called exclusive sense, where we distinguish between two alternatives, either of which may be true, but not both; the logical 'or', however, is used in the inclusive sense, where the composite proposition is also true if both of its constituent propositions are true.
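As a quick illustration of the inclusive 'or' (and of the other connectives), here is a small Python sketch of my own that prints the corresponding truth tables, using the built-in Boolean operators as stand-ins for ¬, ∧ and ∨:

```python
# Truth table for negation (¬)
for p in (True, False):
    print(f"not {p} = {not p}")

print()

# Truth tables for conjunction (∧) and disjunction (∨)
for p in (True, False):
    for q in (True, False):
        print(f"p={p!s:5} q={q!s:5} | p and q = {(p and q)!s:5} | p or q = {(p or q)!s}")
```

Note the row in which p and q are both True: there, 'p or q' comes out True as well, which is precisely the inclusive reading described above.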

Saturday, November 12, 2011

Maxwell's Demon, Physical Information, and Hypercomputation



The second law of thermodynamics is one of the cornerstones of physics. Indeed, even among the most well-tested fundamental scientific principles, it enjoys a somewhat special status, prompting Arthur Eddington to write, rather famously, in his 1929 book The Nature of the Physical World:
The Law that entropy always increases—the second law of thermodynamics—holds, I think, the supreme position among the laws of nature. If someone points out to you that your pet theory of the universe is in disagreement with Maxwell's equations—then so much the worse for Maxwell's equations. If it is found to be contradicted by observation—well these experimentalists do bungle things sometimes. But if your theory is found to be against the second law of thermodynamics I can give you no hope; there is nothing for it but to collapse in deepest humiliation.
But what, exactly, is the second law? And what about it justifies Eddington's belief that it holds 'the supreme position among the laws of nature'?
In order to answer these questions, we need to re-examine the concept of entropy. Unfortunately, one often encounters, at least in the popular literature, quite muddled accounts of this elementary (and actually quite simple) notion. Sometimes, one sees entropy equated with disorder; at other times, a more technical route is taken, and entropy is described as a measure of a thermodynamic system's ability to do useful work. It is wholly unclear, at least at first, how the one description is supposed to relate to the other.
I have tackled this issue in some detail in a previous post; nevertheless, it is an important enough concept to briefly go over again.
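As a reminder (and an assumption on my part that these are the definitions the earlier post works with), the statistical-mechanical formulas connecting the two descriptions are Boltzmann's and Gibbs's:

$$
S = k_{B} \ln W, \qquad S = -k_{B} \sum_{i} p_{i} \ln p_{i},
$$

where $W$ counts the microstates compatible with a given macrostate and $p_{i}$ is the probability of microstate $i$; the first is the special case of the second for equally likely microstates. Counting microstates is what links the 'disorder' picture to the 'useful work' picture: roughly, the more microstates a macrostate admits, the less of its energy is available to be harnessed as useful work.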