Wednesday 7 January 2015

One hundred revolutions

To kick off the year, allow me to apologise in advance to you, the relatively innocent, for the actions of my fellow devotees of the cult of Einstein. In the coming months, the Relativists will be out in force to accost you with breathless attempts to explain Einstein's general theory of relativity. They will gesticulate and gibber and blabber in an incomprehensible argot that they will paradoxically refer to as "simple terms". I have no idea how many of these shambling, mumbling, misshapen creatures roam the Earth, but don't worry, by the end of the year you and I will both know the precise number with complete certainty: we'll have had the opportunity to count them all six or seven times over.

Why is this? What will draw these bizarre individuals out into daylight?

It's quite simple. On November 25, 1915, Einstein presented his general theory of relativity to the Prussian Academy of Sciences. Basic mathematics tells us that 1915 + 100 = 2015. And guess what? By an astounding coincidence, 2015 happens to be, since last Thursday, this year. Feel free to take a moment to recover from the shock of this revelation.

Yes, on November 25 of this year, the Earth will have completed one hundred entire orbits of the Sun since Einstein completed his theory. That isn't a relative statement -- even amidst the distorted distances and distended durations of Einstein's wild theory, the number of orbits is an invariant. All observers can agree: 100 Earth years have passed. So, it seems, it's time to celebrate.

The excitement isn't confined to the science Popularisers and Outreachers and Communicators and Ambassadors and whatever other grandiose titles these buzzing burblers go by. The frenzied desire for celebration will also ascend to the hallowed ranks of the academics. There will be lectures, conferences, workshops and symposia galore, as every scientist who has ever calculated a Christoffel symbol crawls out of the shadows to bask in the distant glow of Einstein's triumph.

It is for this that I apologise because, as I have ranted in the past, it is futile to attempt to explain complex scientific concepts with fancy metaphors, "everyday examples" and "simple terms". It is a sham and a waste of time.

You don't believe me? You think that every detail of the marvels of modern science is available to all? I'm afraid that if you refer back to my previous rant, you'll find that I am backed up by the strongest piece of evidence any physicist could ever hope for: a Feynman quotation.

Do you think I'm happy about this? That perhaps it gives me a thrill? That I'm an elitist snob?

I certainly am an elitist snob. But even more than that I'm an egotist and a show-off. I'd love to be able to impress as many people as possible with the fathomless depths of my knowledge of general relativity, and provide a daily blow-by-blow account of all of the amazing research that I do. This year of all years I have the perfect excuse to bore you all.

You can't imagine how wonderful that would be. I would probably even get more work done. Every day I would rush into my office and furiously perform calculations and write papers, just so I could go home at the end of the day and wow family, friends and random lucky acquaintances like you. No-one would ever get sick of me, because everything I said would be so mind-blowingly, jaw-droppingly fascinating.

Plus, I'd be very fair to you. You would also have the opportunity to bore me. I'd let you blather about your latest kitchen renovation. You could showcase YouTube videos of your kid's astounding progress at potty training. You could even rhapsodise about your favourite football team. I'm sure I could smile and nod through every one of the repetitively tedious details of your mundane lives, for upwards of seven or eight whole minutes. Then we could settle down to a nice discussion of frame dragging, quasi-local definitions of mass, and the finer points of post-Newtonian spin effects. Oh boy!

Sadly, it doesn't work that way. I could trawl the internet and gather together some witty phrases and colourful computer animations, and pretend that I had explained something. You would be suitably impressed, since you wouldn't know any better. But what would we have achieved? You wouldn't have gained anything, except the false impression that I am very clever.

On second thoughts, that sounds fine to me. So perhaps I will take the opportunity this year, just once or twice or 14 times, to wheel out the creaky metaphors. There'll be stretchy rulers and dodgy clocks. There'll be trampolines and spinning tops. There'll be no end of rubber sheets and rubber balloons -- oh yes, there will be a definite excess of rubber.

For now, I will simply remind you that Einstein's general theory is not to be confused with his special theory. That was the one where the short, fat man runs very fast and finds that unfortunately he's only succeeded in becoming shorter and fatter. General relativity is the one where he falls into a black hole and gets stretched into spaghetti.

Unless I can control myself, that may be only the beginning. I apologise in advance.

8 comments:

  1. Ooh, that's me! I've calculated a Christoffel symbol. I've calculated at least two Christoffel symbols.

    Wait...how many are there? I think I've calculated them all. Yes, I've calculated every Christoffel symbol.

    Einstein's triumph is mine!

    ReplyDelete
    Replies
    1. I trust that you have alerted the media. You have 11.5 months left.

      Delete
  2. The part on "simple terms" and "a Feynman quote" reminded me of some lines in the preface of Aaronson's Quantum Computing Since Democritus:

    "[From] Carl Sagan, in The Demon-Haunted World:

    Imagine you seriously want to understand what quantum mechanics is about. There is a mathematical underpinning that you must first acquire, mastery of each mathematical subdiscipline leading you to the threshold of the next. In turn you must learn arithmetic, Euclidean geometry, high school algebra, differential and integral calculus, ordinary and partial differential equations, vector calculus, certain special functions of mathematical physics, matrix algebra, and group theory...The job of the popularizer of science, trying to get across some idea of quantum mechanics to a general audience that has not gone through these initiation rites, is daunting. Indeed, there are no successful popularizations of quantum mechanics in my opinion – partly for this reason. These mathematical complexities are compounded by the fact that quantum theory is so resolutely counterintuitive. Common sense is almost useless in approaching it. It's no good, Richard Feynman once said, asking why it is that way. No one knows why it is that way. That's just the way it is (p. 249)."

    ReplyDelete
    Replies
    1. Interesting comment from someone who's most famous for popularising science. Much of the same applies to astronomy. Is astronomy easy to understand, or just easier to *think* you understand?

      Delete
    2. Well, it's easy to wow people with pictures of the Crab Nebula or talk about turning into spaghetti as you fall into a black hole. But how many people who are (rightly) awed by astronomy know their Roche lobe from their virial theorem?

      Delete
    3. That's an interesting question Mark. Astronomy has always been a canonical example in discussions of the philosophy of science. I'd be curious to know how the implications of Hanam et al.'s "Futility Theory" pan out for that case :)
      My personal experience tells me not to hope for simple answers in the philosophy of science, though; simple answers in popular readings of the philosophy of science can be more deceptive than their counterparts in popular science.

      Delete
    4. Don't worry about the astronomy terms -- if you get a bout of the virial theorem in your Roche lobe, you'll know about it!

      Delete
    5. As for my "interesting question mark": I have no idea if the public understands the philosophy of science, because I don't understand it myself. Unless, of course, I am mistaken.

      Delete

[Note: comments do not seem to work from Facebook.]