The Map Is Not The Terrain: The Pitfalls of Language

What's in a name?

Rather more than we might think, actually. Words have baggage, and sometimes that baggage can be difficult to shake off.

Here I want to talk about natural language, and how we use it. Increasingly, in my virtual travels, I encounter the degeneration of discussion into semantic quibbling, most of which entirely misses the point of semantics (and I freely confess to having gone down this rabbit-hole myself).

We've previously had some discussion on semantics, and why and when it's important, but there are some things that weren't necessarily made explicit, and I wish to address some of them here, along with touching on some things we haven't previously covered. Specifically, I want to deal with some of those instances in which language can lead us astray, especially in our reliance on conventions. By halfway through what follows, I'm going to try to convince you that words can't be trusted and, by the end, I'm going to try to convince you that they can, as long as care is taken.

Language can be a tricky bugger, especially when dealing with complex languages like English. One of the issues is how it evolves over time. There's an exceptional book by Professor Steve Jones entitled The Language of the Genes, which details the evolutionary history of language, and how it follows genuine evolutionary principles of divergence, selection, drift, etc. Properly, the process is stochastic. That's an important term in the sciences, which we discussed in some detail in Has Evolution Been Proven. It means that the future state of a system depends on the current state plus one or more random variables. By random, of course, I mean 'statistically independent', not, as some seem to use the term, 'uncaused'.
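
As a concrete, if toy, illustration of what 'stochastic' means, here's a minimal Python sketch of a random walk: each new state is the current state plus a statistically independent random term.

import random

def random_walk(steps, start=0.0, step_size=1.0):
    """Minimal stochastic process: next state = current state + an independent random term."""
    state = start
    history = [state]
    for _ in range(steps):
        # The increment is statistically independent of everything that came before,
        # but the future state still depends on the current state.
        state += random.choice([-step_size, +step_size])
        history.append(state)
    return history

print(random_walk(10))

Each run produces a different history, but every new value is anchored to the one before it: random in the statistical sense, not 'uncaused'.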

There's a useful example of how language evolves that should be reasonably familiar to all. In 1976, Richard Dawkins' The Selfish Gene was published. In it, he coined a term, 'meme'. He meant it to signify an idea that becomes culturally embedded. It was erected more as a didactic tool than anything, an analogy for biological evolution, in which ideas were subject to divergence, selection and drift. Since then, an entire field of study has arisen to look at the evolution of memes. It's known as memetics.

I'd be willing to place a Hawking-style wager that, were you to ask most internet denizens these days what a meme is, they'd tell you that it's an image, especially one that delivers some sort of message. I first encountered this idea on a particular social networking site. An entire cognitive culture seems to have grown around them, even to the extent that they're treated as discardable simply on the basis of them being images, which had me somewhat flummoxed.

I like to be as clear as I can be, and character limits can sometimes be an issue for clarity, even when ideas are spread over several submissions, so it's a no-brainer to me to do a screen cap of some text I've written, crop it and upload, so that I can be as clear and logical as possible in natural language. Here's a typical example:


OK, so it's direct, and slightly scornful, but accurate. So what happens when I do that? It gets dismissed on the basis that it's a meme, and no consideration is even given to the content. This is, of course, another example of the genetic fallacy.

The main concern here, though, is simply that the idea of what a meme is has evolved, as memes do, and as language does, even beginning to find some divergence in meaning.

I recently looked at cognitive inertia, our tendency to hang on to ideas we find comfortable for any reason, in Patterns and the Inertia of Ideas, which detailed how ideas can form deep roots and consequently become very resistant to change. Words can be deeply embedded in our psyches, and we each carry with us our own version of what is meant by a given word. A word is just an idea, after all. The most common words will tend to be well-correlated between speakers, because it soon gets picked up if we're not using a word in the way others are using it, probably long before we've imprinted a specific definition. Dictionaries also serve to standardise many of the less common ones, but there are pitfalls, and they fall under a set of fallacies I like to term argumentum ad lexicum, or appeal to dictionary, itself a subset of the genetic fallacy.

The best known of these is probably the etymological fallacy. This is an appeal to either the etymological root of a word or to a historical definition of a word in order to dismiss other usages. For obvious reasons, it's also related to the fallacy of equivocation and the argumentum ad antiquitatem (appeal to antiquity or history). As we can see with the example of memes above, this is trivially silly.

The major example of the argumentum ad lexicum is 'here's how such-and-such dictionary defines that word, and it's not how you're using it, therefore your argument is wrong' (or the corollary positive argument, which still commits the same fallacy). This fallacy is a twofer, because it commits the argumentum ad populum and the argumentum ad verecundiam.

I've come to the conclusion that people tend to think of dictionaries as prescriptive, as monolithic authorities on what words mean. If this were actually the case, English would have had no new words or meanings since Dr Samuel Johnson finished his famous tome, in which he defined 'oats' as 'a grain, which in England is generally given to horses, but in Scotland supports the people'. With a bit of luck and a fair wind, I don't have to point out how absurd that conclusion is.

So what's a dictionary, then, if not a prescriptive source on what words mean? It's a descriptive source, documenting usage. A dictionary does nothing more than record how words are generally used, and trace their history through written sources. Thus, accepting or dismissing an argument based purely on a dictionary definition is appealing to popular usage, which is the ad populum. In treating the dictionary as authoritative, the ad verecundiam is committed.

There's another way that words can be tricky, and it occurs most in areas in which natural language is a poor reflection of what it's describing. In Who Put it There? we looked at information in DNA and, among the issues we covered was the principle that some of the terms we use in dealing with it can give an inaccurate picture of what's really going on if one isn't careful about what those terms actually refer to in that context. There are other areas in which this is even more pronounced, and I want to briefly talk about one of them here.

The map is not the terrain.

In some of the previous posts on quantum theory, we've explored the double-slit experiment and wave-particle duality. In many popular treatments, this is described as a quantum entity being, or having properties of, both waves and particles. This serves as a useful shorthand, but doesn't rigorously capture the essence of what's actually going on. Consider:

What are the characteristics of a wave?
Distributed (i.e. not localised).
Displays interference.

What are the characteristics of a particle?
Localised (i.e. not distributed).
Doesn't display interference.

Suppose we run the double-slit experiment. There's a full description elsewhere of a beautiful version of it, conducted by some R&D guys at Hitachi, in which electrons were sent through the apparatus one at a time. Each electron arrives at the screen as a single, localised spot yet, after many tens of thousands of electrons have passed through, one at a time, the spots build up an interference pattern.

So, is it a particle? Well, it's certainly localised on the screen, but it shows interference. Particles don't show interference.

Is it a wave? Well, it shows interference, but it isn't distributed.

So which is it? Is it a particle, a wave, or both?

The answer is, of course, that it's neither: It's an electron!
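
To see how localised arrivals can still add up to a wave-like pattern, here's a minimal, purely illustrative Python sketch. The wavelength, slit separation and screen size below are made-up numbers chosen for readability, not the Hitachi parameters; each simulated electron lands at a single point, drawn from a two-slit interference distribution, and the fringes only appear in the histogram of many arrivals.

import math
import random
from collections import Counter

# Illustrative parameters only - not the Hitachi setup
wavelength = 50e-12        # metres
slit_separation = 1e-6     # metres
screen_distance = 1.0      # metres
half_width = 2e-4          # half-width of the screen, metres

# Discretise the screen and compute a simple two-slit interference intensity
xs = [-half_width + i * (2 * half_width / 400) for i in range(401)]
intensity = [math.cos(math.pi * slit_separation * x / (wavelength * screen_distance)) ** 2
             for x in xs]

# Each 'electron' is a single localised detection, sampled from that distribution
arrivals = random.choices(xs, weights=intensity, k=50_000)

# Bin the arrivals: the interference pattern only emerges in the aggregate
bins = Counter(round(x / 2e-5) for x in arrivals)
for b in sorted(bins):
    print(f"{b * 2e-5:+.5f} m  {'#' * (bins[b] // 200)}")

One at a time, every detection is a localised dot; only the statistics of many detections carry the interference.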

Another good example, also from quantum theory, is what Heisenberg's Uncertainty Principle tells us about pairs of conjugate variables, such as the position and momentum of an electron, and what we can 'know' about them. The language is again slightly misleading, because talk of 'knowing' suggests that the limitation might be overcome with sufficient experimental cunning. What we're actually talking about when we say 'know', or when we talk about the information that can be extracted from a quantum system, is all of the information the system contains, whether we can access it or not. If we extract the information about the position, a well-defined momentum doesn't actually exist, and vice versa. Indeed, neither can be said to exist until we make an observation.
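
For reference, the relation for position and momentum is usually written as:

\[ \Delta x \, \Delta p \ge \frac{\hbar}{2} \]

where Δx and Δp are the spreads in position and momentum respectively, and ħ is the reduced Planck constant. The product has a hard floor, so narrowing one spread necessarily widens the other: it's a property of the system itself, not of our instruments.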

Tying these two ideas together, what actually happens when we observe a quantum system? Well, when we interact with the entity in one way, we see a behaviour that we associate with waves. When we interact with it another way, we see a behaviour that we associate with particles.

So what about fields? Well, when we interact with the field in certain ways, we see behaviours that we associate with fields. Are fields a rigorous description? Quite probably not, but it seems to work for now. Modelling phenomena in terms of excitations in fields has been amazingly fruitful.

Here's Brian Greene, string theorist and professor of physics at Columbia University.
Quantum mechanics is a conceptual framework for understanding the microscopic properties of the universe. And just as special relativity and general relativity require dramatic changes in our worldview when things are moving very quickly or when they are very massive, quantum mechanics reveals that the universe has equally if not more startling properties when examined on atomic and subatomic distance scales. In 1965, Richard Feynman, one of the greatest practitioners of quantum mechanics, wrote:
"There was a time when the papers said that only twelve men understood the theory of relativity. I do not believe there ever was such a time. There might have been a time when one man did because he was the only guy that caught on, before he wrote his paper. But after people read the paper a lot of people understood the theory of relativity in one way or other, certainly more than twelve. On the other hand, I think I can safely say that nobody understands quantum mechanics"
Although Feynman expressed this view more than three decades ago, it applies equally well today. What he meant is that although the special and general theories of relativity require a drastic revision of previous ways of seeing the world, when one fully accepts the basic principles underlying them, the new and unfamiliar implications for space and time follow directly from careful logical reasoning. If you ponder the descriptions of Einstein's work in the preceding two chapters with adequate intensity, you will - if even for just a moment - recognize the inevitability of the conclusions we have drawn. Quantum mechanics is different. By 1928 or so, many of the mathematical formulas and rules of quantum mechanics had been put in place and, ever since, it has been used to make the most precise and successful numerical predictions in the history of science. But in a real sense those who use quantum mechanics find themselves following rules and formulas laid down by the "founding fathers" of the theory - calculational procedures that are straightforward to carry out - without any real understanding of why the procedures work or what they really mean. Unlike relativity, few if any people ever grasp quantum mechanics at a "soulful" level.
What are we to make of this? Does it mean that on a microscopic level the universe operates in ways so obscure and unfamiliar that the human mind, evolved over eons to cope with phenomena on familiar everyday scales, is unable to fully grasp "what really goes on"? Or, might it be that through historical accident physicists have constructed an extremely awkward formulation of quantum mechanics that, although quantitatively successful, obfuscates the true nature of reality? No one knows. Maybe some time in the future some clever person will see clear to a new formulation that will fully reveal the "whys" and the "whats" of quantum mechanics. And then again, maybe not. The only thing we know with certainty is that quantum mechanics absolutely and unequivocally shows us that a number of basic concepts essential to our understanding of the familiar everyday world fail to have any meaning when our focus narrows to the microscopic realm. As a result, we must significantly modify both our language and our reasoning when attempting to understand and explain the universe on atomic and subatomic scales.
Similarly, as discussed in Who Put it There?, we do the same in evolutionary theory. We know that DNA is not actually a code; it's our treatment of it that is a code, because that's an extremely good way of looking at it to further understanding. It's extremely fruitful, because DNA displays many of the features that we associate with codes. Have another meme:


So it looks an awful lot like words are extremely plastic. How the holy shit are we supposed to trust them?

Luckily, there's an escape. It deals with an area of thought central to what we mean when we say a thing, yet is often dismissed. Yes, it's semantics. As we've already covered this topic at some length, I won't belabour it here, but I want to deal with one more thing, and it's the most important thing about semantics. It encompasses all that we've been talking about here, and goes directly to what semantic discussion should be about. It doesn't actually matter what this or that dictionary says about a word, or even if you invent a brand new one just for the purpose, or repurpose an existing word and build a new usage. What actually matters about semantics is clear communication. Thus, whether you and I agree on a particular definition is irrelevant. Semantic arguments should only ever revolve around ensuring that both arguer and arguee understand what's meant by the given term as it is being used.

In other words, it doesn't matter whether I understand 'random' to mean 'statistically independent' and you understand it to mean 'uncaused', it only matters that you know what I mean by it, and I know what you mean. The easy way to resolve such discrepancies is to discuss the definitions and come to a consensus that you're both happy doesn't misrepresent either view, so that communication is actually achieved.

As a corollary to this, it's also important that, if somebody asks you to define a term that you're using, it isn't remotely sufficient to say 'go look it up'. That will only tell them how some other people use the term. If I ask for a definition, it isn't because I don't have one, it's because I want to understand what you associate with a given term. The dictionary can't tell me what you mean, which is what we need to ascertain in order to communicate effectively.

Always remember that words and the entities and phenomena they describe are distinct and different things, and that the words are merely an aid to understanding; a metaphor; a cipher.

The map is not the terrain.

DJ, Spin That Shit!

Right, peepholes, time for a bit of fun, methinks.

Today's offering is going to deal with something that many will think is insane - and it is: two notions that have been popping up all over the internet with increasing frequency over the last few years, despite having been put to bed in scientific terms centuries or even millennia ago.

I've always enjoyed reading. In my younger days, I read all sorts of stuff, pretty much anything I could get my hands on. Science fiction, historical fiction, thrillers, you name it.

About twenty-five years ago or thereabouts, I stopped reading fiction. It wasn't a conscious decision, at least at first. Rather, it was a function of discovering some areas of non-fiction that I got hooked on. I ended up with such a long reading list, comprising mostly history and science, that my bookcase groaned under the weight of Simon Schama, Alison Weir, Antonia Fraser, Stephen Hawking, Isaac Asimov, Richard Dawkins, Karl Popper, Thomas Kuhn... There was just too much to get through, and the list has remained fairly constant in size ever since, even growing on occasion. As I write this, the stack of books waiting to be read runs to about fifty titles, almost all science, with a smattering of history books that have tended, in recent years, to get deferred in favour of whatever new science stuff piques my interest.

There's been one exception throughout this time - the only fiction I've read in a couple of decades. I never missed a new publication, and each one was elevated straight to the top of the list; in fact, I'd tend to pause partway through whatever I was reading (although I usually have three or four books on the go at once, so I can change to suit whatever mood I'm in). Sadly, that exception has now dried up, and there's no new fiction to take its place, because the author behind it has passed away. I dedicate this post to him, my all-time favourite author: Terry Pratchett.

Pratchett is known to the world as the author of the Discworld series of books. Ostensibly just fantasy stories with a comedic slant, they're really a satire on some of the silly things that humans have believed throughout the millennia. He covered parodies of all the daft religions we've adhered to throughout history; various inventions, such as movable type, newspapers, gunpowder, movies, postage stamps, trains, long-distance communications and computers; and satires on real people like Leonardo da Vinci, Capability Brown and Heath Robinson. It's all there.

His world is, as the name suggests, a disc, sitting on the back of four elephants which sit in turn on the back of a giant space turtle swimming through the void. This again is a direct reflection of what was once a popularly-held belief among humans about Earth, although I'm not aware of any of the elephants having to cock a leg to allow the sun to pass by in any of the Earth mythologies...

He also, along with mathematician Ian Stewart and reproductive biologist Jack Cohen, wrote some science books, The Science of the Discworld series, which are genuine science books for a lay audience intertwined with Discworld stories. I can't recommend these highly enough for anybody getting interested in science (even approachable to young adult readers). They do a side-by-side comparison of the science in this world with magic in Discworld (it runs on magic; they can't get science to work there). Brilliant stuff, and extremely funny.

I should also note that there's a fair bit of science weaving through the entire series, but you have to be quick to spot it.

Anyway, to more serious matters. Silly and satirical as Pratchett's books are, it appears that there are genuinely still people who think that Earth is flat or, alternatively, that it sits still at the centre of the universe while all around us the stars violate the speed limit imposed by special relativity, completing orbits many millions of light-years long in a single day of slightly more or less than 24 hours (yes, the length of the day really does vary).


It's reasonably certain that, for much of human history, people intuited that the planet was flat. This is hardly surprising. There's an interesting thing about curvature: on a large enough sphere, the surface appears extremely flat locally. This is fairly easy to intuit, with a little careful thought. Take a square piece of paper 1 cm on a side, and place it on a golf ball. It's easy to see that a piece of paper this size on that size sphere will not sit flat. Now take the same piece of paper and put it on a tennis ball. It will still curve, but it will sit considerably flatter. Do the same with a sphere 1 mile in diameter, and it will sit even flatter. Now take it to the moon, and it will sit functionally flat (assuming the moon were perfectly spherical).* When you get to something the size of Earth, your 1 cm piece of paper will be so close to flat that measuring the curvature would be pretty close to impossible in practice.
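
To put rough numbers on that intuition, here's a quick Python sketch (the ball radii are approximate) of how far each surface falls away from a flat 1 cm square touching it at the centre. The gap at the paper's edge is roughly d²/2R, where d is half the paper's width and R is the sphere's radius.

def edge_gap(radius_m, half_width_m=0.005):
    """Approximate gap between a flat square (tangent at its centre) and the sphere,
    measured at the square's edge: d^2 / (2R), valid when d is much smaller than R."""
    return half_width_m ** 2 / (2 * radius_m)

spheres = {
    "golf ball (R ~ 2.1 cm)": 0.021,
    "tennis ball (R ~ 3.3 cm)": 0.033,
    "1-mile-diameter sphere": 804.5,
    "Moon": 1.737e6,
    "Earth": 6.371e6,
}

for name, radius in spheres.items():
    print(f"{name:26s} gap at edge ~ {edge_gap(radius):.3g} m")

For the golf ball the gap is a visible fraction of a millimetre; for Earth it comes out at around two picometres, far smaller than an atom, which is why the paper sits, to all intents and purposes, flat.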

Our intuitions are incredibly powerful, and have proved immensely useful to us in the evolutionary past of our species. As we've learned in other areas we've explored in previous posts, they can also be incredibly misleading, not least because they're inherently naïve, tuned to the everyday scales and circumstances in which they evolved. Intuition tells us that we can't get big hunks of metal to fly, that time flows at a fixed rate, that something can't be in two places at once. All these intuitions are demonstrably wrong.

Sometimes, we need to gather data before our intuitions can be useful. In other cases, no amount of data helps, and we have to throw intuition out of the window.

So, how can we be confident that the planet is an oblate spheroid rotating about its axis and orbiting Sol?

First, it's worth pointing out that, in any places with significant maritime histories, it's probable that the popular intuition didn't hold, not least because they'd have had plenty of experience of watching ships disappear hull-first over the horizon. Similarly, any populations living in the high places of the world would have been able to see the curvature.

For most of us, who don't live on the coast or in high places, it's easy to understand why people might have thought the Earth was flat. It certainly looks like it on a naïve appraisal.

The idea that the planet was spherical has been around since sometime around the middle of the first millennium BCE, though it appears in ancient Greek writings with little justification. Priority on the matter is very difficult to pin down, but the biographer Diogenes Laërtius, writing many centuries later, attributes it to Pythagoras, a regular contributor to my own witterings. In any event, it's fairly clear that, among the ancient Greek philosophers, few thought the planet to be anything other than spherical.

Image credit: Graham.beverley (own work), CC BY-SA 3.0
The earliest known account that gives some justification for this comes to us from our resident expert on sexual dimorphism in human dentition, Aristotle. He noted several observations that only made sense on a curved surface. Chief among these is the simple fact that there are stars visible to the South from Egypt and Cyprus that are not visible from further North. He also noted that, during a lunar eclipse, the shadow of the Earth on the moon is always curved, as in this image - and a sphere is the only shape that casts a circular shadow from every angle.

The first proper measurement on record showing that the planet isn't flat was conducted by Eratosthenes of Cyrene in around 240 BCE. It was well known that, at noon on the Summer Solstice, the sun shone directly down a well in the town of Syene, now known as Aswan, on the banks of the Nile. Eratosthenes took measurements about 515 miles further North, in Alexandria, and found that the sun's angle differed by about seven degrees. Since seven degrees is roughly one fiftieth of a full circle, it was a simple matter of multiplication, and he came up with a figure of approximately 26,000 miles. There is some uncertainty on the exact figure, because his distance measures were expressed in stadia, an archaic measure of 600 feet, where the foot was a broadly variable measure, with a range of lengths defined all across Greece. In any event, his figure is somewhere within four to six percent of what we now know to be the correct figure of 24,901 miles.
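
A quick back-of-the-envelope version of that calculation, using the figures above (his actual inputs were in stadia of disputed length, so this is purely illustrative):

angle_deg = 7.0          # measured difference in the sun's angle; ~7.2° is the figure usually quoted
distance_miles = 515     # approximate Syene-to-Alexandria distance

circumference_miles = distance_miles * 360.0 / angle_deg
modern_value_miles = 24_901

print(f"Eratosthenes-style estimate: {circumference_miles:,.0f} miles")
print(f"Modern equatorial figure:    {modern_value_miles:,} miles")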

Eratosthenes was responsible for quite a few other firsts, as well. He measured the axial tilt of the Earth (I'll be coming back to this shortly), created the first world map based on the knowledge available in his day, and invented the discipline of geography. He's also thought to have invented the leap day and to have calculated the distance to the sun. He also gave us the sieve of Eratosthenes, a clever way of finding prime numbers.
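
For the curious, here's a minimal sketch of the sieve in Python:

def sieve_of_eratosthenes(limit):
    """Return all primes up to and including limit, by crossing out multiples."""
    is_prime = [True] * (limit + 1)
    is_prime[0:2] = [False, False]               # 0 and 1 are not prime
    for n in range(2, int(limit ** 0.5) + 1):
        if is_prime[n]:
            # Cross out every multiple of n, starting from n*n
            for multiple in range(n * n, limit + 1, n):
                is_prime[multiple] = False
    return [n for n, prime in enumerate(is_prime) if prime]

print(sieve_of_eratosthenes(50))
# [2, 3, 5, 7, 11, 13, 17, 19, 23, 29, 31, 37, 41, 43, 47]

The trick is that you never test divisibility at all; you simply cross out multiples, which is why the method is so efficient.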

In any event, the knowledge that the planet is (approximately) a sphere has been with our species for more than two millennia. This hasn't had much impact on some, quite probably because of a combination of cognitive inertia and the fact that the notion that we live on a flat planet has been made dogma in some spheres, even to the point that suggesting otherwise has been considered sanctionable heresy in societies in which the sanctions included a painful death, if you were lucky.

That's not the end of the story, of course. It should be, but there you go.

In 1687, Newton published his Principia Mathematica, in which were detailed his theory of gravity and his laws of motion. We've seen in previous posts that, when a new, ground-breaking theory is formulated, it isn't always immediately obvious what its full implications are. Newton's theory was no different, but one thing that became apparent pretty quickly caused some controversy. It was bad enough, some thought, that heavenly bodies didn't move in perfect circles - the inverse-square law dictates that they travel in ellipses, of which a circle is a special case - but Newton's work also implied that the Earth wasn't, as had been thought, a perfect sphere. Specifically, because of its rotation, the planet must bulge slightly at the equator and be slightly flattened at the poles.

There's a wonderfully convoluted tale of how a French team battled for nine years to try to measure this effect by conducting surveys in the Andes, only to be beaten to the result by another team that took measurements in Scandinavia, using essentially the same methods as Eratosthenes had two thousand years before, but on a larger scale. This is discussed in Bill Bryson's marvellous A Short History of Nearly Everything.

Anyhoo, they confirmed Newton's result, and measured that a degree of arc was indeed longer at higher latitudes, meaning that the planet is oblate.


Image source: NASA
Of course, there's a really straightforward way to show that the planet is a spheroid, but it's one the flat-Earthers love to paint as a conspiracy. Here it is.

This image was taken by the Apollo 11 crew  in 1969. The deniers, of course, will adamantly insist that the entire mission was a hoax, but this doesn't stack up for several reasons.

The first and most obvious reason is the Soviets. Recall that this mission was the culmination of an intensive race between the USA and the Soviet Union, at a time when their rivalry was reshaping the world (figuratively, of course; it was still an oblate spheroid). Remember that this race began immediately in the post-McCarthy era. The Soviets, who had every reason to undermine NASA's claim to having landed on the moon, tracked the entire mission themselves. Indeed, the Soviets had their own unmanned mission, Luna 15, running at the same time, with the Soviets and the Americans sharing information to ensure that they didn't collide. Luna 15 ultimately failed its mission, crashing into the lunar surface. However, the Soviets acknowledged the Americans' achievement, and the then Soviet President, Nikolay Podgorny, sent a telegram to President Nixon offering 'our congratulations and best wishes to the space pilots', acknowledging a success that, had it not actually happened, the Soviets would have had every reason to deny, even if only for propaganda purposes.

More importantly, both missions were independently tracked from the local observatory here, by the Lovell radio telescope, just a few miles from where I'm sitting, at Jodrell Bank.

Moreover, the Apollo astronauts left something behind that's still used to this day, and which put in a cameo appearance in season 3 of The Big Bang Theory: the Lunar Laser Ranging Retroreflector Array, the only experiment from the moon landings still running. It's a specially-designed array of small mirrors on a panel about 2 ft across.
Image source: NASA
These mirrors are designed to reflect light directly back to its source with minimal scattering. We see examples of these all over the modern world: the reflective surfaces on road signs, the reflectors on bicycles, the cat's eyes set into roads, etc. Indeed, the eyes of cats themselves are natural retroreflectors (in fact, all camera-type eyes are, and this is why, if the flash is too close to the lens on your camera, eyes come out glowing red like a demon's). In the case of the LLRRA, the mirrors are used to precisely track the moon's orbit, and provide information about the moon's gradual recession from Earth, a function of the transfer of angular momentum from Earth's rotation to the moon's orbit via the tides.


In short, there can be no doubt whatsoever that we live on an oblate spheroid. There are many other lines of evidence, but none of the above is explainable if we live on a disc, and that's even before we get into why the sky is red at sunset but blue when the sun is overhead, and other easily observable effects (this is a function of Rayleigh scattering, named for the British physicist Lord Rayleigh: at sunset the light is travelling through significantly more atmosphere, so more of the shorter-wavelength blue photons are scattered out of the line of sight, leaving the longer-wavelength red ones; see Give Us A Wave).
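
The wavelength dependence is strong - Rayleigh scattering goes as one over the fourth power of the wavelength - and a couple of lines of Python make the point (the wavelengths are just representative values for blue and red light):

# Rayleigh scattering strength varies as 1 / wavelength^4
blue_nm, red_nm = 450.0, 650.0   # representative wavelengths
ratio = (red_nm / blue_nm) ** 4
print(f"Blue (~{blue_nm:.0f} nm) is scattered roughly {ratio:.1f} times more strongly than red (~{red_nm:.0f} nm)")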

As we observed, another thing that Eratosthenes was responsible for was calculating the planet's axial tilt which, contrary to what one might hear on Faux News, is the real reason for the season. Indeed, it's the real reason for all the seasons.

Earth's axis is tilted at approximately 23.5° from the perpendicular to its orbital plane. If this were not the case, every day would have pretty much the same length regardless of latitude or time of year. Indeed, if the axis weren't tilted, it would have been considerably more difficult even to define a year. We'd still have been able to do it, of course, by noting the positions of stars, but even this would have been significantly more involved. The reasons for this are fairly simple. First, as the planet orbits the sun, the axis always points in the same direction (this is why it always points roughly at Polaris, the North Star, regardless of the time of year), which means that, at different times of the year, the Northern and Southern Hemispheres alternate in terms of which is tilted more directly toward the sun. As the year progresses, the point on the surface over which the sun sits directly overhead oscillates between the tropics. In the Northern Hemisphere, the winter solstice occurs when the sun is directly over the Tropic of Capricorn, and the summer solstice when it's directly over the Tropic of Cancer. The spring and autumn equinoxes occur when the sun is directly over the equator. This progression is what drives the seasons.
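
To see how the tilt drives day length, here's a rough Python sketch using the standard sunrise equation and a common approximation for the sun's declination. It ignores refraction and the sun's angular size, so treat the numbers as approximate.

import math

def solar_declination_deg(day_of_year):
    """Approximate solar declination for a given day of the year (1-365)."""
    return -23.44 * math.cos(math.radians(360.0 / 365.0 * (day_of_year + 10)))

def day_length_hours(latitude_deg, day_of_year):
    """Approximate hours of daylight from the standard sunrise equation."""
    phi = math.radians(latitude_deg)
    delta = math.radians(solar_declination_deg(day_of_year))
    cos_h = -math.tan(phi) * math.tan(delta)
    cos_h = max(-1.0, min(1.0, cos_h))    # clamp for polar day / polar night
    return 2.0 * math.degrees(math.acos(cos_h)) / 15.0

for day, label in [(172, "June solstice"), (355, "December solstice")]:
    for lat in (0, 30, 53, 78):
        print(f"{label:17s} lat {lat:3d}°: {day_length_hours(lat, day):5.1f} h of daylight")

At the equator the answer hovers around twelve hours all year; at 78°N it swings between round-the-clock daylight and none at all. Without the tilt, the declination term vanishes and every latitude gets twelve hours every day.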

The other thing we get from our annual trip around the sun - this one a consequence of the orbit rather than of the tilt itself - is that we see different constellations rising above the horizon at different times of the year and, of course, this is the basis for the Zodiac.

Moving on, what about the axial rotation?

Again, naïve intuition can certainly make it seem that we're sitting still while the universe whirls around above our heads, but there are massive problems with this. The most critical is that, in order for this situation to hold, one of our most fundamental laws would have to be breached. 

To put this into perspective, let's look at our closest galactic neighbour, the Andromeda galaxy. Andromeda sits a little over 2.5 million light-years away. Assuming, for the sake of simplicity, that Andromeda orbited Earth directly, in a circular orbit, we can apply a simple calculation to see the distance it would have to travel in a single orbit. We're all familiar, I hope, with the formula for the circumference of a circle, 2πr. A quick puff of chalk dust tells us that Andromeda would have to travel approximately 16 million light-years in a day. Now, light can travel about 16 billion miles in a day, give or take, which sounds a lot, but this is only 1/365th of a single light-year, and constitutes the maximum distance that anything can travel in a single 24-hour period (see The Idiot's Guide to Relativity). For anything with mass, of course, even that speed is impossible, according to special relativity.
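
Putting numbers on that - a back-of-the-envelope check assuming a circular path of radius 2.5 million light-years:

import math

LIGHT_YEAR_KM = 9.4607e12        # kilometres in one light-year
SPEED_OF_LIGHT_KM_S = 299_792.458

distance_ly = 2.5e6              # approximate distance to Andromeda
day_s = 24 * 3600

circumference_ly = 2 * math.pi * distance_ly            # ~15.7 million light-years
required_speed_km_s = circumference_ly * LIGHT_YEAR_KM / day_s

print(f"Circumference of the supposed daily orbit: {circumference_ly:,.0f} light-years")
print(f"Required speed: {required_speed_km_s:.3e} km/s, "
      f"or about {required_speed_km_s / SPEED_OF_LIGHT_KM_S:,.0f} times the speed of light")

Andromeda would need to be moving at several billion times the speed of light, which is emphatically not allowed.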

We've already met one consequence of living on a rotating planet, in the form of the moon's recession due to conservation of angular momentum, but it's quite a subtle one. Let's look at a consequence that's more pronounced and apparent.


Image source: Wikipedia
There's a simple experiment we can carry out on Earth. This experiment, first conducted by Léon Foucault - whom we met in The Certainty of Uncertainty as the scientist who first gave an accurate measure of the speed of light - is among the most elegant experiments ever devised. It consists of a simple pendulum, now known as Foucault's Pendulum. The idea is straightforward enough to intuit with a little care. You set the pendulum swinging and, because of the rotation of the planet, the plane of swing of the pendulum precesses. In other words, the orientation of the swing, relative to the ground, changes throughout the course of the day.
Image source: Wikipedia

Here's a simple diagram showing this in action. As you can clearly see, the plane of swing precesses as the platform rotates beneath it. The diagram shows a rotating disc, so Foucault's Pendulum in and of itself doesn't demonstrate that the Earth is a sphere - but we've already done that. More importantly, moving the experiment to a different latitude shows that we do live on a sphere. This might seem odd, especially for those who've been paying attention since the inception of this blog project, because we've been insisting, on the strength of Noether's Theorem, that merely translating an experiment shouldn't change its outcome. However, moving across the surface of a rotating sphere isn't merely a translation: it changes the angle between the local vertical and the rotation axis, and so changes the experimental setup itself - which is why the result changes too.

The amount of precession is a function of latitude. At the equator, the plane of swing remains fixed in relation to the Earth. As you move North from the equator, the precession increases, with the rate proportional to the sine of the latitude. It reaches its maximum at the pole, where the plane rotates through 360° in 24 hours. If we pick a point partway between - Paris, for example, where Foucault first conducted the experiment - the plane precesses through roughly 270° every 24 hours. If you move South of the equator, the same thing occurs, except that the precession is in the opposite direction.
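
Here's a minimal sketch of that relationship. Strictly, the period is one sidereal day of about 23.93 hours rather than 24, but that barely matters at this precision.

import math

def precession_deg_per_day(latitude_deg):
    """Rate at which the pendulum's swing plane rotates: 360° x sin(latitude) per day."""
    return 360.0 * math.sin(math.radians(latitude_deg))

for place, lat in [("Equator", 0.0), ("Paris", 48.85), ("Manchester", 53.48), ("North Pole", 90.0)]:
    print(f"{place:10s} ({lat:5.2f}°N): {precession_deg_per_day(lat):6.1f}° per day")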

This experiment has been conducted at both poles, and there are many examples all over the planet at different latitudes; they all behave precisely as Foucault predicted. Although the deniers insist that the South Pole is a fiction, there's actually an example of this experiment as a permanent fixture there.

This is a manifestation of the Coriolis force. First formalised by Gaspard-Gustave de Coriolis in 1835, the Coriolis force is an inertial force (or pseudo-force, like the centrifugal force) arising from motion within a rotating reference frame. There are many manifestations of it, not least that airline pilots must correct for it when flying across latitudes. Indeed, even snipers have to take it into account for shots over distance.

Another beautiful - if deadly - manifestation of the Coriolis force is hurricanes. Because of this effect, hurricanes in the Northern Hemisphere always rotate anti-clockwise, while in the Southern Hemisphere, they always rotate clockwise (the effect on tornadoes tends to be negligible, the critical balance there being between pressure and centrifugal force).

I'm conscious that this is already a long read, so I'm just going to address a few quick arguments raised by flat-Earthers for completeness.

The first is one that was presented to me on Twitter a few weeks ago. The FE proponent in question showed a video of a rocket lifting off, with a camera mounted on the rocket looking down toward the Earth, and objected that, if the Earth were a rotating sphere, we should have been able to see the planet rotating sideways beneath the rocket as it ascended. The reason we don't is straightforward once you grasp the relativity of motion. Specifically, it's the same reason your hat doesn't blow off when you're riding in a train: the motion of the planet is already imparted to the rocket prior to lift-off. In simple terms, the rocket and the surface of the planet are both moving at precisely the same velocity. We know from Newton's law of inertia that a body will remain in its state of motion until a force acts upon it. When the rocket lifts off, it continues moving with the surface, as it was before, but now with a force applied upwards, away from the surface. For the rocket to drift sideways from the point of lift-off would require a sideways force to overcome its inertia in that direction (rockets are affected by the Coriolis force, as are all fast-moving objects in rotating frames, but that's a far smaller effect than the objection imagines).

Another objection deals with space shuttles on re-entry, and suggests that they should experience cross-winds. Orbital altitude is a function of velocity, which means that for an object to remain in orbit at a certain altitude, it needs to keep moving at a certain speed. When the shuttle is re-entering the atmosphere, the first thing it does is execute a de-orbit burn, which slows it down until it meets the atmosphere. Once inside the atmosphere, the shuttle is within the rotating frame of the Earth, and we simply reverse the principle above for lift-off.
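
Roughly, for a circular orbit the required speed comes from balancing gravity against the centripetal requirement, v = √(GM/r). A quick Python sketch with approximate constants:

import math

MU_EARTH = 3.986e14      # Earth's gravitational parameter GM, in m^3/s^2
R_EARTH = 6.371e6        # mean radius of Earth, in metres

def circular_orbit_speed(altitude_km):
    """Speed needed to hold a circular orbit at the given altitude: v = sqrt(GM / r)."""
    r = R_EARTH + altitude_km * 1000.0
    return math.sqrt(MU_EARTH / r)

for h in (200, 400, 1000, 35_786):    # 35,786 km is geostationary altitude
    print(f"{h:7,d} km altitude: {circular_orbit_speed(h) / 1000:.2f} km/s")

Slow down below the speed for your altitude and you can no longer stay there, which is exactly what the de-orbit burn exploits.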

One final example of an objection is the idea that rockets can't thrust in space. Once again, we need only refer to Newton here, and his third law of motion, namely that every action has an equal and opposite reaction. A naïve view might suppose that a rocket needs something to push against, but this overlooks that law. There have been a few instances in science fiction films in which somebody becomes untethered from their craft, and the mission leader tells them to throw away whatever they have in their hand; a spanner (wrench, for our transatlantic cousins) or some other tool is thrown in a direction away from the craft, which exerts an equal and opposite force on the thrower, propelling them back toward the craft. When exhaust is expelled from a rocket, the exhaust exerts a force on the rocket equal and opposite to the force the rocket exerts in expelling it. That's why rockets work in space.
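
The same bookkeeping in a few lines of Python; the masses and throw speed here are made-up, purely illustrative numbers.

# Conservation of momentum: throwing a spanner away from the craft
astronaut_kg = 120.0     # astronaut plus suit (assumed)
spanner_kg = 1.5         # the thrown tool (assumed)
throw_speed_m_s = 10.0   # relative to the initial rest frame (assumed)

# Total momentum starts at zero and must stay at zero
recoil_speed_m_s = spanner_kg * throw_speed_m_s / astronaut_kg
print(f"Astronaut recoils at {recoil_speed_m_s:.3f} m/s back toward the craft")

A rocket is just the continuous version of this: it throws exhaust backwards at very high speed, and the exhaust pushes the rocket forwards.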

De Chelonian Mobile - vale, Terry. You're missed.

Hope this was enjoyable. I'd wanted to deal with expanding Earth nonsense in this post, but I think this is long enough, so I'll address that in a future post.

Thanks for reading. As always, crits and nits welcome.

Edited to add:

I got confronted with a picture and a question, and thought I might as well tag it on here. Here's the picture.

The implication should be fairly obvious. The FE proponent wants to know why this doesn't happen with the planet's oceans if we're living on a spinning ball.

Of course, the answer is fairly straightforward to anybody with even a tiny understanding of gravity and its relationship to mass. We haven't had an equation for a while, so here's a nice one: \[ v=\sqrt{\frac{2Gm}{r}} \] where v is the escape velocity, G is the gravitational constant, m is the mass of the sphere and r is its radius.

Taking a vaguely standard mass of 58.5 grams and radius of 3.17cm for the tennis ball, a quick calculation tells us that the escape velocity for the tennis ball is a massive 5.65cm/h.

The mass of the Earth, on the other hand, is 5.976 × 10²⁴ kg, with a radius at the equator of 6,378 km. Plugging those numbers into the equation, we get an escape velocity of 11.2 km/s. The actual rate of rotation of the planet at the equator is approximately 465 m/s, which is less than 5% of escape velocity.
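
Here's the same arithmetic as a quick Python check, using the figures above:

import math

G = 6.674e-11    # gravitational constant, m^3 kg^-1 s^-2

def escape_velocity(mass_kg, radius_m):
    """v = sqrt(2Gm/r)"""
    return math.sqrt(2 * G * mass_kg / radius_m)

# Tennis ball, using the figures quoted above
v_ball = escape_velocity(0.0585, 0.0317)
print(f"Tennis ball: {v_ball * 3600 * 100:.2f} cm/h")          # ~5.65 cm/h

# Earth, using the equatorial radius
v_earth = escape_velocity(5.976e24, 6.378e6)
equatorial_rotation_m_s = 465.0
print(f"Earth: {v_earth / 1000:.1f} km/s")                     # ~11.2 km/s
print(f"Equatorial rotation is {equatorial_rotation_m_s / v_earth:.1%} of escape velocity")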

That's why the oceans don't fly off into space.

A further edit: I was confronted with the following on Twitter and again thought I might as well tag it on the end here with my response:

*There's a famous physicist, Fritz Zwicky, who was fond of growling that people he didn't like were 'spherical bastards', because they were bastards no matter which direction you looked at them from.

Promoting Atheism... And Why I Don't.

There's a fairly common collection of memes, in those areas of the internet in which various alleged entities are discussed, to the effect that atheists 'hate god', that we all 'know he exists, which is why we spend so much time arguing against him', amid accusations of 'atheist dogma'. I want to address some of them here.

I've previously posted on how I use the term 'atheist' and why I think it's the most robust definition. To summarise that post, atheism is nothing more nor less than the non-acceptance of a specific class of truth-claim with regard to the existence of a specific class of entity. I went on to address some objections, such as the idea that it should be a distinct cognitive position and why I felt them to be problematic.

Here, I want to expand on some of what I said in that post, particularly with regard to what motivates me to argue, and why I think it's both fruitful and important. It's going to be somewhat biographical, because it's going to deal with where my motivations originate.

The first thing to note is that, among all the things that motivate me, atheism isn't one of them. To be absolutely clear, my atheism is a symptom; an effect, not a cause. Atheism, being a simple privative, cannot be causal. Indeed, to imbue it with causal properties is to commit several basic fallacies, the modal fallacy and the category error among them.

When I was young, I really tried to believe. It seemed incredibly important to the people I cared about, and I was assured that it was for my own well-being. Evolution has equipped us with many things crucial to our survival, among which is, in humans, a certain credulity. That sounds a little odd, so perhaps some unpacking is in order.

We're a funny old species in some ways. Due to the limits of the human birth canal, we have an incredibly protracted post-birth development programme. It takes us a long time to reach adulthood*, and that time is taken with learning how to survive in our niche in the environment. As children, it's actually advantageous to be gullible, especially with regard to certain information sources, because rejection of information impedes such learning. When we're told things as children such as 'don't put your hand in the fire', whether or not this concept is reinforced by associating it with pain, we're hard-wired to accept them, especially if it comes from a parent. From our earliest imprinting, we put our faith in our parents. When they tell us 'you can trust this person, or this class of people', we simply accept it, without question. We take it on faith. Up to a point, this is extremely useful, and is the nature of many of the life lessons we learn during our emotional, psychological and intellectual development. It does, however, have its pitfalls.

I've posted before about a famous saw popularly attributed to Ignatius Loyola, founder of the Jesuit order (some say Aristotle): 'Give me the child until he is seven, and I will show you the man.' The Jesuits were famous as educators, and can be found in popular fiction as far-flung missionaries bringing Western education and the word of god to the world's heathens. They were aware, as are most, that the most effective inculcation occurs when our minds are at their most malleable. Indeed, many cog-sci studies show that our capacity for this kind of learning diminishes radically as we get older. This is one of the few areas in which the computer analogies don't fall too wide of the mark. It's not unuseful to think of childhood as a system configuration process, during which you make a lot of tweaks to the hardware (growth), install the operating system and the core software (learning), and configure internal connections and set up key commands (training). By the time we reach adulthood, many of the basic principles with regard to how we think about things are laid down in a fairly unshakeable foundation. It takes an extreme effort of will to shed or even to challenge these 'core beliefs', whether in isolation or in concert. We'll even call them 'common sense' or, worse, 'basic logic'. The latter, they're certainly not. Logic has little to do with them.

Now, I should state that I'm not having a go at parents here. I'm a parent myself, and I know as well as anybody that children don't come with an instruction manual. You do your best to keep them safe, teach them not to be dicks as best you can, and give them the tools necessary to be a successful human, however you personally define success. This is an enormous responsibility, and far from easy. We draw extensively from our own childhood, remembering what our parents did well and, perhaps, what they didn't do well, and we try to bring those examples into our own children's upbringing.

Of course, this makes the whole process relentlessly cyclical, not least because now we have this collection of ideas that we're not supposed to question, are programmed not to question. We don't even think of them as being subject to question. This idea, that there exists an idea that is not to be questioned, is the most pernicious and destructive idea of them all in my considered opinion.

There are very few children, I suspect, that don't question some of the social, religious and political beliefs they're brought up with. Humans are innately curious and, as a result, they tend to pick at things until they make sense. We want to believe things, and we'll challenge things that don't make sense to us, asking the awkward questions until they do. As children, when we do this with certain ideas, we're given answers designed to stifle discussion. There's an old joke that sums it up: 'What was God doing before he created the universe?' and the response 'He was preparing hell for people who ask awkward questions'. In many cases, this has the desired effect and curtails that curiosity. Not always, though, and this brings me nicely back to where we began.

I really tried to believe, but there were things in the religion I was raised in that simply didn't make sense, and they niggled away the whole time, even as I engaged in all the behaviours a good Christian should. The evidence mounted, though, that showed that the entire idea was really quite silly, and I could never get beyond that.

Anyway, I got on with my life, somewhat saddened that so many would subjugate their intellect to such a poorly-supported idea, but not massively concerned with it.

I left school at a very early age (12) after some less-than stellar experiences. That's not to say that I hated school. Quite the contrary, in fact. I loved school, and I loved to learn. I was quite easily bored, but managed to keep it largely in check (although I could be a bit of a handful). After leaving, I continued to educate myself. I'm a voracious reader, and always have been. 

In my 20s, a friend introduced me to A Brief History of Time. I was hooked. I couldn't get enough. This was the thing I'd been looking for, and I read every bit of science I could lay my hands on, especially physics and cosmology, but I also encountered The Selfish Gene, which triggered another area of study, evolutionary theory. On reading some of Dawkins' other work, I came across the concept of creationism, and the concerted efforts of well-funded and politically well-connected organisations to undermine the teaching of valid science in schools, and the promulgation of a pseudo-controversy, as if there was any reasonable doubt that their preferred model of reality was anything more than a blind assertion, passed down by these same mechanisms of credulity, or that it shared anything like the epistemological status of evolutionary theory.

I started to investigate, and was horrified at what I discovered. Among my first encounters was Kent 'Dr Dino' Hovind, and a video recording of him - get this - teaching children, in a large auditorium, that the Universe was less than 10,000 years old. It was riddled with the most shoddy reasoning, anecdotes like I was on a plane with an atheist - or often, a scientist - and I said, 'Brother...' followed by some inane question that falls broadly into two categories, the first being the category of question that anybody with a functional neuron should be able to answer without breaking a sweat, and the second being the category of question that could only come from not understanding the subject matter at all, being so disconnected from or misrepresentative of the topic at hand that there was no sensible answer that didn't involve deconstructing the entire edifice of misunderstanding prior to building up the understanding necessary to grasp the subject properly. Aron Ra, in a debate series on Youtube with a particularly special case from Birmingham (UK, I'm sorry to say, not Alabama), voiced his frustration succinctly thus:
"Before I could explain any of that, I'd have to drain your brain of the faeces festering in it and replace all that with a clue"

Hovind wasn't alone. I discovered that there were large organisations, deep of pocket and equally deep of ignorance, actively pursuing a strategy of undermining acceptance of valid science and infecting children with the results: building big sham museums (another of which opened in Kentucky in the past couple of months) and inveigling their way into the bodies responsible for selecting textbooks for schools, pushing books clearly intended to undermine what the evidence shows, aimed at children who, for the reasons covered in the foregoing discussion, are largely powerless to resist it.

Worse, with a little more digging, I discovered that scientists - the real ones, as opposed to the poster-boys of the creationist movement in stolen lab-coats - were having to spend their time actively working against it, as opposed to getting on with probing the universe for its hidden secrets. I couldn't get my head around it, and it saddened me. It wasn't all bad, of course; some of the assertions of creationists have led to really interesting research, such as the work on the bacterial flagellum discussed in Irreducible Complexity and Evolution.


Now, I have no problem with what anyone believes. To put it bluntly, you have every right to be wrong, if you wish. Where I do object is when those beliefs are manifest in the world in a way that impinges on the rights of others. A summary way of stating this is that your rights end where mine begin, and vice versa. 

Rights are a source of a fair bit of confusion judging by some of the discussions I've had over the years. I've engaged with people who assert things like 'this is a right afforded me by the constitution'. This is to fundamentally misunderstand what a right is. It isn't something awarded by a government, but flows directly from our empathy, and the social contract that allows us to function successfully as social animals. The role of government, and of frameworks like the US constitution, is to recognise rights that are innate and inalienable, to enshrine them in law, and to give them the protection of the state. It's to ensure that we are not subjugated by individuals, by groups and, indeed, by governments themselves.

One of the rights I think most important is the right to self-determination. It informs all other rights in the most fundamental way, and is central to a healthy, well-balanced and progressive society. Of course, self-determination doesn't happen in a vacuum. It requires, among many other things, that people be well-informed, and able to think for themselves, and this brings me at last to the central topic I want to cover here.

In my last post dealing with patterns and the inertia of ideas, we looked at how certain ideas embed themselves in the way we think and the way we see the world, and how difficult it can be to shift them once ingrained. We discussed Morton's Demon, and how ingrained ideas actively work to survive by editing out contradictory evidence, the mind's mechanism for the avoidance of cognitive dissonance.

Most of our prejudices, including those that vilify others based on perceived non-membership of a given group or clique, are rooted in this issue, and the source of very many of them is the religion we're indoctrinated into. Those who've been reading this blog from its inception will be aware of some of the knowledge we've gathered about the universe and how it operates, and the means by which we've gathered it. Consider the two pillars of physics, which we've covered broadly but also in some detail: they run massively counter to our intuitions, yet they underpin all of our technology, including our ability to communicate instantly over distances that, even only 100 or so years ago, would have taken weeks or even months to cross - by means that, in times not much earlier, would have resulted in wholesale torture and burning.

Science can be difficult, and not just because it requires modes of thought that our mammalian brains - which evolved as mechanisms to avoid predation and access resources - are not naturally equipped to cope with. It requires an open mind and the willingness to shed biases. It requires training, and that training can be massively undermined by those complexes of ideas that colour our perceptions.

I'm hoping that my case has largely been made here, because there's a clear conflict between inculcation and independent thought, the latter being fundamental to self-determination. Just as we can't reasonably engage in the social contract if we don't understand the nature of the contract, we can't engage in independent thought if we don't understand the nature of our biases, prejudices and beliefs.

As parents, we often think of ourselves as having the 'right' to raise our children as we see fit. I'm going to voice here what I know is going to be an unpopular opinion, but it is my opinion, and it's incumbent on me to deliver it, and it's this:

We have no such right. What we do have is a duty to prepare our children for the world. Central to this is giving our children the tools to function as social animals, to engage in the social and moral contract, to help them, grow, not just physically and emotionally, but intellectually as well. Imbuing them with our 'memeplexes' is anathema to that goal, and we really need to recognise this. What we actually do when we instil unchallengeable ideas in our children is to curtail their rights, especially the rights to free speech and self-determination. Teaching a child that they must bow to an entity that is merely asserted to exist, sans evidence, and that this must not be questioned, is stripping them of their most fundamental freedoms. In my not-so-humble opinion, this crosses the line into intellectual abuse. When this is reinforced with the suggestion that even questioning such assertions will lead to eternal torture, it's tantamount to emotional torture. In short, imbuing children with poorly-supported doctrines, and the attendant doctrinal imperatives that must be adhered to without question, is a fundamental breach of the rights of those children to self-determination.

Raising children is a privilege and, like all privileges, it comes with a great responsibility. We should teach our children to question, to challenge, even to challenge the paradigms and edifices of science, but also to understand the nature of evidence, and when denial is counter-productive. The one thing above all others that we should be teaching our children is that ideas are disposable entities, and that there's no such thing as an unchallengeable idea. No idea is sacred, and ideas aren't entities that are, in and of themselves, worthy of respect. People are worthy of respect not least because, in respecting others, we're respecting ourselves. Ultimately, if we show the beliefs of others the respect they think they deserve, we aren't showing the holders of those ideas the respect that they definitely do deserve.

As I write this, discussion is once again rising in legislative bodies in many countries around the idea of blasphemy. Here in the UK, the new, unelected prime minister is throwing her weight behind moves to remove the cap on religiously selective admissions to faith schools - prejudice, in plain terms - as if there could be any such thing as a Christian child, or a Muslim child. This is a step in precisely the wrong direction. No pluralistic society can remain truly pluralistic while such segregation is promulgated and actively promoted by government. Only by promoting pluralism can we ever be truly pluralistic, and promoting religious apartheid in this manner is massively destructive to any progress in this area.

This is especially of moment now, when the world is facing an ideologically-motivated crisis of violence and terror arising directly from these pernicious in-group/out-group distinctions. And the clash of ideologies isn't only breeding fear; the fear itself is making individual nations more insular.

Over the past decade or so, we've seen instances of violence that are difficult for a moral or thinking person to countenance. Only earlier this week, I was in discussion with somebody on Twitter who alluded to the actions of 'atheist extremists', going on to say that we 'write articles'. This is exactly the sort of prejudice that arises from the silly notion that beliefs and ideas should be immune from criticism. Meanwhile, 'extremists' of another stripe are flying planes into buildings, stabbing and shooting cartoonists, beheading film-makers, and beheading aid workers such as Alan Henning, a Salford taxi-driver who wanted nothing more than to help people in need (I didn't know him, but he was well-known to some of my family and friends). They're hacking bloggers to death in the streets for saying things, among them Avijit Roy, with whom I used to chat physics on the Richard Dawkins forum, and whose wife barely escaped with her life.

If it weren't for the fact that there's nothing remotely amusing about any of the above, it would be almost funny that I, who write articles about science and reason, and argue with believers and science-deniers on social media, can have an epithet levelled at me comparing my behaviour to those atrocities.

In the face of all this, we have accommodationists across the political spectrum enjoining us to curtail our critique, and to be careful where we aim it.

Let me be clear here. I'm more than aware that Muslims are not a homogeneous group, and that treating them as such is horribly bigoted, as it is with any group of people. I know many Muslims and, just like any other arbitrary distinction we might draw, the fact that they're Muslim tells you one, and only one, thing about them, in the same way that me telling you I'm an atheist tells you only one thing about me, namely that I don't believe in the existence of a deity.

I wouldn't even say that it's the responsibility of all Muslims to speak against the extremists; it's the responsibility of all of us. Shying away from this responsibility, whether out of genuinely believing that religion and religious beliefs are things to be respected, or from fear of reprisal, is actively working against progress and the future safety of the society in which I and my loved ones live. Edmund Burke is famously credited with saying that the only thing necessary for the triumph of evil is that good men do nothing. While I have issues with the terms 'evil' and 'good', I'd say he was largely correct in this, but there's more. When good people do nothing, evil can succeed. When good people actively work against doing something, evil has a free pass, and doesn't even have to do any work.

So, do I promote atheism? Certainly not. I promote thought, education, scepticism and evidence-based reasoning. I promote intellectual, social, moral and scientific progress. I promote the disposal of bad ideas. That religion and the idea of god are, in my considered estimation, among the worst and most useless of those ideas is functionally incidental.

Let's teach our children how to think, not what to think. Let's teach them to reason, to value evidence, and to properly assess the truth-claims of others. Let's raise our children to be properly and fully engaged in the social contract, to understand its nature, and to know what morality really is. Then maybe, just maybe, we can look forward to a future for our children that's free of terror, free of prejudice, free of subjugation. Let's teach our children to be free.

Freedom of speech is the freedom from which all other freedoms flow. When you fight against freedom of speech, you forge your own manacles.

As always, thanks for reading.


*There's good reason to suppose that this never happens with humans. We know there are species in nature which are 'neotenic', meaning that they reach reproductive maturity while still in the juvenile form, so that the ancestral adult stage is effectively lopped off the life-cycle. Axolotls are one such species. Because, among other things, humans are among the most highly paedomorphic (resembling the young form) of extant species, it's thought that humans are also neotenic.

Patterns and the Inertia of Ideas

Ideas are tricky little buggers. 

They're extremely attractive, but they can be dangerous. Before we can really look at why that is, we have to understand the nature of an idea.
Everything, all human life, is history. - Michael Rosen
There's a very real truth to that, though I want to cast it in a slightly different context than Rosen was getting at. What he was saying, of course, is that everything about us stems from our experiences, and that every topic we might study is really the study of the past. Literature, mathematics, science, all are disciplines that rely heavily on the past. Indeed, modern physics tells us that even our most immediate experiences are experiences of the past, due to the hard limit on how fast information can travel. Even our pain, which we experience as immediate, takes time to arrive, the signals travelling along nerve fibres from our pain receptors to trigger neurons in the brain. I'm going to take a slight diversion here to show how critical this is.

As a musician who's spent considerable time working with other musicians in recording environments, this is quite apparent to me, but I'm aware that those who haven't experienced such environments with regularity may not have the temporal sense that I've developed in that environment, so it's worth looking at what happens.

One of the hardest things to do in a recording environment is synchronisation. When musicians are playing together, this isn't much of a problem, except when one or more are less than entirely competent at keeping time. What generally happens is that the timing ebbs and flows. It won't generally be noticed by many but, for example, when musicians play live, they tend to slow down slightly during drum fills over section changes, speeding back up to tempo on introduction of the new section. This is merely the most obvious feature of a general principle, namely that musicians move around. They'll generally, if they're competent, go through these changes together.

In a recording studio, this takes on a slightly different complexion, because it's rare that the entire ensemble will all be playing at the same time, except in the case of professional gatherings, such as orchestras. What usually happens is that the rhythm section gets laid down first, sometimes starting with only the drummer. If that happens, the drummer will need a metronome or click track.

Sound travels at a finite speed, approximately 767 mph in air, although this varies with temperature and the medium. A reasonable rule of thumb is to work on the basis that, at room temperature, sound takes about a millisecond to cover a foot. In the technical jargon, any delay of this kind, between a sound leaving a source and arriving at the ear, is known as 'latency'.

Often, these days, we use digital drum kits to trigger sample players, so that we can muck about afterwards with the sounds. 

So, drummer sits down, speakers are about 5 or 6 feet away in the average project studio. He starts laying down his drum track based on the click track. Because there is a delay of 5 or 6 ms, this can have some interesting effects.

Most people can probably detect a delay of about 15-20 ms, but very good drummers will tend to have a sensitivity of about 4-5 ms. I'm sensitive, after many years of conditioning, to 3 or 4 ms. The average musician will be somewhere around 5-10 ms.
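If you want to put rough numbers on that rule of thumb, here's a quick back-of-the-envelope sketch in Python; the speed of sound and the detection thresholds are just the figures quoted above, and the function name is mine, purely for illustration.

# Back-of-the-envelope latency from speaker distance, using the figures above:
# ~343 m/s (~767 mph) for sound in air at room temperature, i.e. roughly
# a millisecond per foot.

SPEED_OF_SOUND_FT_PER_MS = 343 * 3.28084 / 1000  # ~1.13 ft/ms

def acoustic_latency_ms(distance_ft: float) -> float:
    """Time for sound to cover distance_ft, in milliseconds."""
    return distance_ft / SPEED_OF_SOUND_FT_PER_MS

for distance in (1, 5, 6, 15):
    delay = acoustic_latency_ms(distance)
    # Thresholds quoted above: ~15-20 ms for most people, ~4-5 ms for a
    # very good drummer.
    noticeable_to = (
        "most people" if delay >= 15
        else "a good drummer" if delay >= 4
        else "almost nobody"
    )
    print(f"{distance:>2} ft -> {delay:4.1f} ms (noticeable to {noticeable_to})")

Run it and the 5-6 foot speaker distance lands squarely in the region that only a very good drummer will feel, which is exactly the trouble described below.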

Anyhoo, let's assume a really cracking drummer with excellent temporal sense. He lays down his drum track for a song that the band have been playing for years. He knows every note, and plays it flawlessly week-in, week-out, and never drops a note, even when practising on his own. When he gets in the studio situation, he often flounders (note that I'm not saying this of all drummers). Why is this? 

First, it's because he's struggling with the latency. Second, it's because he's having a hard time staying on the beat without the variations. The drummer is usually the core of the regimentation of the ensemble, but now he's subservient to another core, the metronome.

There are some things we can do to mitigate this, but which course we choose depends on the musician. Some drummers work really well with a flashing light giving the tempo and, of course, light travels rather quicker than sound, so that latency all but disappears. We can also send the click via headphones, which gets it there just as quickly for practical purposes, because electrical signals in a cable travel at a substantial fraction of the speed of light, making the delay negligible over studio distances. For the record, the technique that works best most of the time in my experience is simply for me to sit marking time with a finger: being the engineer, I'm closer to the source, more aware of the pitfalls, and can read the situation well enough to reduce the trigger latency for the drummer, even anticipating and keying into the drummer's perception.

So, we labour away, and he gets his part laid down. Then come the rest of the band, one or two at a time. They have an even harder time, because the variations they usually expect aren't there. They will tend to fall behind during fills, and run ahead when the section picks up again. They anticipate things derived from their experience of playing in the band but, of course, the band has never played to a metronome before, let alone a drummer playing to a metronome.

The above should highlight the approach I want to take here. I want to go deeper than Rosen's ideas, and get to the meat of what it is that we're studying.

We have an interesting ability, literally inherited from a long evolutionary history stretching back many millions of years, and it's all to do with how we process space and time. The brain records sensory input from all our senses (of which there are considerably more than five, despite popular opinion), nicely temporally ordered. This ordering means that we can abstract periods of time, remembering things that happened in the past, recognising patterns, and using those patterns to abstract into the future, projecting potential future patterns so that we can avoid pitfalls and work toward desirable goals. 

Terry Pratchett once described humans as Pan narrans, the story-telling ape, and this is precisely what he was talking about. We tell ourselves stories about what happened in the past, and we tell ourselves stories about what will happen in the future. All of this arises directly from our temporal processes and pattern recognition. Herein lurks a danger, though, and that's because of the way our ideas interact with each other and how they interact with contradictory information.

You can think of ideas as having a property we'll call mass. That is, they have a certain resistance to any changes in 'velocity'. You could even stretch the analogy further and say that different kinds of idea have different mass. What we're talking about here is something that we all have, and it's extremely difficult to defeat: cognitive bias. Cognitive bias comes in many forms, but the most pernicious of these is confirmation bias. Confirmation bias means that each of the ideas you hold, both about the past and about the future, colours your perception of the world. Ideas are the lenses of the mind. When you meet a new idea, it's always seen through the lens of all the ideas you currently hold. New ideas that sit well with those will readily be taken in with minimal scrutiny, adding to the mass of the whole, while ideas that don't sit well will struggle to come under the influence of the forces in the nucleus, as it were.

OK, so I'm stretching the analogy too far. Mea culpa. 

The point is that ideas group together and find safety in numbers. The more safe, comfortable ideas you hold, the greater the inertia in those ideas. 

There's a wonderful post written some years ago by a former creationist, Glenn Morton, about how this works. He imagines a demon that sits on the shoulder, diligently filtering out any evidence that might cause dissonance or contradict any of the core ideas, preventing you from ever seeing it, while letting through any evidence that might help to confirm the nucleus, even to the degree that it will carefully interpret that evidence to make it fit. I'll link to his post at the bottom.

There are other factors in our cognitive development having taken the course it did besides pattern-recognition, of course; a large brain with a high brain-to-body mass ratio, opposable thumbs, etc., but what really defines the way we interact with the world is our ability with patterns.

All of our knowledge about the universe comes to us from recognising patterns. Our development from foragers, to hunter-gatherers, to agrarian, to technological, all stem from recognising patterns. Poetry, art, music, science, philosophy, logic, mathematics, all patterns. 

We're so good at it, and in responding to it, that we even attach significance to events that are entirely meaningless or random. This is a useful skill in some respects, and gave clear advantages in evolutionary terms. When our ancestors saw what they thought matched the pattern of a tiger hidden in foliage, they took cover. It didn't actually matter whether or not there was a tiger there, simply being conditioned to respond to the appropriate pattern was sufficient to increase the odds of survival. However, it can be taken too far, and the result is what we call superstition. In the dim and distant past, this took quite a few forms according to the best evidence we have. In Homer's Odyssey there are descriptions of 'auguries', which consist of predicting the future by reading bird-sign. If a hawk passes by on the left in the evening with one eye closed doing the secret handshake, yada yada...
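To see why a hair-trigger pattern detector can win out even when it's wrong most of the time, here's a toy expected-cost calculation for the tiger example above; every number in it is invented purely for illustration, not drawn from any real data.

# Toy illustration of why over-detecting 'tigers' can pay off.
# All probabilities and costs are made up for the sake of the example.

def expected_cost(p_tiger, p_detect_given_tiger, p_false_alarm,
                  cost_eaten=1000.0, cost_flinch=1.0):
    """Average cost per rustle in the foliage for a given detector."""
    miss = p_tiger * (1 - p_detect_given_tiger) * cost_eaten   # tiger missed
    false_alarm = (1 - p_tiger) * p_false_alarm * cost_flinch  # needless dive for cover
    hit = p_tiger * p_detect_given_tiger * cost_flinch         # correct dive for cover
    return miss + false_alarm + hit

# A jumpy detector versus a sceptical one, with a tiger behind 1 rustle in 100.
jumpy = expected_cost(0.01, p_detect_given_tiger=0.99, p_false_alarm=0.50)
sceptic = expected_cost(0.01, p_detect_given_tiger=0.60, p_false_alarm=0.05)
print(f"jumpy detector:     {jumpy:.2f} per rustle")
print(f"sceptical detector: {sceptic:.2f} per rustle")

With costs that lopsided, the jumpy detector comes out far cheaper on average, despite crying tiger half the time, and that's the evolutionary logic behind our tendency to see patterns that aren't there.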

The night sky is a wealth of information about the cast of ancient mythology, all rooted in patterns. Astrology is nothing more than connecting an occurrence to a celestial event and making a religion out of it. Dowsing, acupuncture, homoeopathy, alchemy, antivax and all manner of credulous wibble is founded in this superstitious drivel.

We're not alone in this, of course, and we can see high degrees of pattern-recognition ability in many species. There's a famous series of experiments conducted by B.F. Skinner in which birds, most famously pigeons, exhibited superstitious behaviour. These experiments are a real eye-opener, and they tell us a lot about ourselves. In the best-known of them, food is delivered to a pigeon at intervals entirely independent of anything the pigeon does. The pigeon nonetheless comes to associate whatever it happened to be doing when the food arrived, turning in a particular direction, say, with the reward, and repeats that behaviour with increasing conviction, long after any coincidence between the action and the food has broken down. Because the schedule has nothing to do with the behaviour, there's no way for the bird to work out which action 'works', and this kind of intermittent reinforcement is exactly what entrenches the response. It's the same mechanism that underpins addiction to gambling and other risk-related activities.
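For the sake of illustration, here's a toy simulation, very loosely in the spirit of those experiments rather than a model of Skinner's actual apparatus; the action names, probabilities and reinforcement values are all invented.

import random

# Toy 'superstitious pigeon': reward arrives with a fixed probability each
# step, completely independent of behaviour, yet whichever action the bird
# happened to be performing gets credited and becomes more likely.

ACTIONS = ["turn left", "turn right", "bob head", "peck floor"]

def simulate(steps=10_000, reward_prob=0.05, boost=1.0, seed=1):
    random.seed(seed)
    weights = {a: 1.0 for a in ACTIONS}  # start indifferent between actions
    for _ in range(steps):
        # Choose an action in proportion to its current weight.
        action = random.choices(ACTIONS, [weights[a] for a in ACTIONS])[0]
        # Reward is delivered regardless of which action was chosen...
        if random.random() < reward_prob:
            # ...but the action that happened to coincide with it is reinforced.
            weights[action] += boost
    return weights

final = simulate()
total = sum(final.values())
for action, w in sorted(final.items(), key=lambda kv: -kv[1]):
    print(f"{action:>10}: {w / total:.0%} of behaviour")

Run it a few times with different seeds and you'll typically find one arbitrary action hogging a disproportionate share of the behaviour, purely because it happened to coincide with the early rewards; nothing about the reward schedule ever depended on it.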

I was reminded of this when watching an episode of Tricks of the Mind, in which the presenter uses some of the same techniques on celebrities, with much the same result. I recommend watching the series, not least because it's almost entirely rooted in the principles we're discussing here.

Like the example of the tiger above, we're also extremely adept at spotting visual patterns where there is no pattern. This is especially true for faces, almost certainly as a result of our earliest imprintings as babies, learning to recognise our families. When we see faces in the clouds, it's a manifestation of this pattern-recognition. Similarly, any viewers of Top Gear will recall Clarkson extracting the Michael from the 'face' of Hammond's favoured wood-constructed Morgan. Psychologists have a name for this: Pareidolia. This is nothing more nor less than our cognitive biases at work, as are all these abilities with patterns.

Cognitive biases are difficult to defeat, but not impossible. Cognitive behavioural therapy, for example, is highly useful in reducing some particular biases. Many of the stresses we face in our daily lives arise from cognitive bias. We anticipate situations and begin to 'catastrophise', tending to assume they will go badly, and this escalates stress. CBT works by getting us to examine those situations and assess pragmatically whether the stresses we anticipate are real, or as bad as we might think. Often, simply by casting the same situation in more positive terms, we can make it less stressful.

The best mechanism we know of for reducing and eliminating bias is science. Because it's self-correcting, biases tend ultimately to be rooted out, even if it can take a long time.

That's not to say that scientists are bias-free, of course. Bias manifests in science all the time. Einstein, for example, in what he called his 'greatest blunder' (and, as we've discussed previously, it really was a howler, despite the romanticism you'll hear in popular science books), introduced a term into the equations of general relativity simply because he thought the universe was static, and GR as it stood said it couldn't be. This 'Lambda' term, the cosmological constant, was essentially a knob that could be set to a value determining the rate at which the cosmos expands or contracts, and Einstein set it so that everything balanced perfectly, all because he, along with pretty much every physicist of his time, thought that the universe was eternal and unchanging. Pure confirmation bias.
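For anyone who wants to see where that knob actually sits, here's the standard textbook form of the field equations with the cosmological term included (just the usual modern presentation, not a quotation from Einstein's 1917 paper):

% Einstein field equations with the cosmological term.
% Setting \Lambda = 0 recovers the original 1915 equations; Einstein chose
% a value of \Lambda tuned to hold a matter-filled universe static against
% its own gravity.
G_{\mu\nu} + \Lambda\, g_{\mu\nu} = \frac{8\pi G}{c^{4}}\, T_{\mu\nu}

Set Lambda to zero and you recover the original equations; Einstein's carefully-tuned non-zero value is the knob-twiddling described above.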

His real greatest blunder, to my mind, was the huge amount of time and intellectual energy he exhausted in trying to debunk quantum mechanics, another manifestation of bias, but we've covered that at length elsewhere.

When Eddington read Einstein's work, he was immediately struck by it, but he was pretty much alone in the Royal Astronomical Society. Other notable fellows of the society in his day were extremely resistant to the 'German science', in what was pure prejudice, another form of bias.

Another good example is continental drift, the precursor to plate tectonics, proposed by meteorologist Alfred Wegener in 1912 (and even that isn't strictly the beginning; Leonardo da Vinci, having found fossil marine organisms on mountains around the turn of the 16th century, had already inferred that the land must once have lain beneath the sea). The idea was resisted for decades, and even ridiculed by many, until observations in the '50s and '60s confirmed seafloor spreading, making plate tectonics the only game in town (aside from the asinine 'expanding Earth' nonsense, which I'll be covering in a future post).

There's a marvellous book on this topic, the brilliant The Structure of Scientific Revolutions by philosopher Thomas Kuhn, which deals specifically with the inertia of ideas in science and how old paradigms become extinct in practice. I highly recommend this work.

The point is that ideas have inertia, and can be extremely resistant to change. This is especially true with ideas from certain sources during our formative years. We're programmed to trust our parents implicitly. When they tell us we can trust somebody and take what they say as true, we will tend to accept it at face value. This can be dangerous, not least because the ideas imprinted during this time are the hardest to shift. There's an old saw of the Jesuits, the Catholic church's educators, often attributed to Aristotle. 
'Give me the child until he is seven and I will show you the man.'
The implication is clear, and with it comes some insight into how, for many decades - more than we can know - children were subjected to institutionalised abuse, much of which has been coming to light the world over in recent decades, and continues to do so. I attended an event for survivors of institutional abuse in Ireland some years ago, hosted by the then president, Mary McAleese, at Áras an Uachtaráin, in which she talked about this as being the result of 'bad imprinting of children' (this while new legislation for 'blasphemous libel' was being pushed through).

One final thought about a particular bias that we often think of as something desirable, but which actually serves to increase inertia: common sense. Common sense can be a useful tool, but over-reliance on it can be catastrophic to intellectual progress, precisely because of the inertia it represents. Because no post is complete without at least three mentions of the wiry-haired brainiac, I'll leave you with one of his most famous quotations:
"Common sense is the collection of prejudices we accumulate by the age of eighteen."
Thanks for reading. Nits and crits welcome, as always.

Morton's Demon

A quick note of thanks to Neuro Sooz (@myscienceylife) for her input on the behavioural aspects of this article. Much appreciated.