Thursday, 27 February 2014


Being a good scientist requires an aptitude for sharp, analytical thinking, being able to critically evaluate evidence and draw sound conclusions. When I was an undergraduate student, I discovered that it also requires a lot of time. All of the time you have. In fact, more time than you have. To be a successful scientist, it seemed, I needed more than a highbrow soundtrack: I needed more hours in the day.

Fortunately, I found a way. I read that Leonardo da Vinci survived on a total of only three hours of sleep per day. He did it by taking a 30-minute nap every four hours. His brain had adjusted to accelerate and compress all of the vital sleep patterns. What a thought! A whole five extra working hours each day! I read further, and discovered that the reason hardly anyone emulates him was that it requires several weeks of painful adjustment that are virtually impossible without trained assistance. There are professional courses that provide just such training, but they are affordable only to the ranks of the richest corporate executives.

That was fine with me. If this already obscure technique was also beyond the budget of my fellow students, then all I needed was a month of iron discipline, and I could start composing my Nobel acceptance speech.

The first day was fine. I even overcame the first two novice problems, finding time to nap during a busy working day, and getting to sleep at irregular times, with one simple solution: lectures. The university lecture system is so conducive to regular napping that I wondered if it hadn't been designed to support the da Vinci sleep plan. Apart from the small embarrassment of my alarm clock going off in the middle of both Algebra and Classical Mechanics, I was well under way. I was a little drowsy as I passed through the early hours of the next morning, but I could keep myself awake by pacing my flat nutting out the opening paragraphs of my Nobel speech ("I must confess that honours and accolades mean nothing to me, but since it would be impolite to refuse this prize…").

The second day was hell. Yesterday's convenient lectures were today's torture. How could I possibly stay awake? In the morning I thought it would help to sit in the front row, where I tried valiantly to pay attention while telling myself that I just had to keep this up for a few more weeks. "Weeks?" my mind frantically asked. "No, no, don't worry, I meant days." Just a minute, did I say that aloud? In the afternoon I moved to the back. It got progressively harder as I entered the second night. I became delirious. I sat in deep mumbling discussions with myself about when I would get to sleep next, and tried to motivate myself with repeated calculations of just how much extra work time I was going to gain, and how quickly it would offset all the time I was losing right now by staring into space for hours on end. I found myself wondering things like, "Does one extra minute of nap count as a sleep-in?"

By 3am I was convinced that, although it felt like I had been sitting on my sofa staring at the opposite wall for the past 45 minutes, it was in reality a sensory illusion concocted by my desperate brain while I slept. The ensuing hours were spent in existential argument over whether or not I was truly conscious. It wasn't quite as gruelling as watching The Matrix, but it was bad enough: the plan was over.

At this point you may be wondering how anyone ever adjusts to the da Vinci sleep plan. The answer is that they don't. It doesn't work for anyone, and there are no executive training courses. I made that up, to see if you could be fooled as easily as I was.

I read about the da Vinci sleep plan in a newspaper article when I was 18. At that age I thought I was a sharp and sophisticated fellow, but I hadn't quite shaken off the childishly naive notion that the media report facts.

In the modern era of the internet, we have replaced faith in second-rate newspaper editors with the more democratic blindness of the google search ranking. Following this current research practice, I have now learnt that (a) no article on the first page of search results for "da Vinci sleep" cites any evidence that da Vinci ever followed this plan, (b) many like-minded hopeful geniuses have tried to follow it, and have almost all failed, and (c) the number of those who have succeeded, and shown a heightened level of concentration, creativity, memory retention, or any other indicator of smartypantsness, and been able to demonstrate this under reliable scientific/medical scrutiny, is not only statistically zero, but, again by the compelling evidence of one page of google search results, exactly zero.

You see, being a scientist requires a lot of time to work, but mostly it requires an aptitude for sharp, analytical thinking, and being able to critically evaluate evidence and draw sound conclusions. That's not easy, because the evidence is not always trustworthy, and the brain is easily fooled. Even when the answer is revealed, it's hard to let go of an attractive idea. To the many hopeful da Vincis out there, none of the evidence matters. All objections to the plan's feasibility are easily countered with yet another calculation of just how fantastic it would be if it were true. This is the same reasoning that convinces people to gamble: if the potential winnings are large enough, we can replace a careful calculation of the odds with an approximate estimate of "practically certain".

You might think, in your charming innocence, that, once trained, no scientist would be fooled by such nonsense. Sadly, no. Scientists believe as much stupid baloney as anyone else. Maybe a bit less. I can believe that you are less likely to meet a scientist who moonlights as a palm reader, or claims to be able to see your aura, or thinks that your cellphone gives you cancer -- but they do exist. Trust me, they do.

Monday, 24 February 2014

The classical solution

Last time I talked about that essential scientist's skill of appearing to be as extreme an embodiment as possible of the eccentric genius. That skill is not taught. It's not even acknowledged. When incoming students speak to the admissions advisor on university enrolment day, they are never told, "Oh, you want to be a research scientist? Then I suggest you drop that extra mathematics module and sign up for either Eccentricity 101 or Intro to Arrogance." It's a shame. The innocent novices still believe that all scientists are geniuses, and therefore the only way to succeed is to actually, really and truly become one themselves, and many years of desperate foolishness lie ahead before they discover the truth.

I went through all that. I was seduced by the popular claim, "The average person uses only 10% of their brain", unaware that the accompanying statement should be, "If you believe that, then it's true." I was a sucker for any technique advertised to improve brain power. Speed reading: "Read a novel in 30 minutes, and absorb a physics textbook in one hour." Meditation: "Clear your mind of all distractions, and let your creativity flow." Whacky diets: actually, the one thing I never tried was strange diets. They were against my basic criterion that any instant genius scheme worthy of the name would require no effort and no discipline. It should also require no money, which ruled out extensive drug experimentation; I shudder to think what sort of gibbering cabbage I'd have become if serious funds had been at my disposal. And it could have been even worse than drugs -- it could have been Scientology.

When I was at high school, back in the late 80s, the cheap and simple miracle brain enhancer that was coming to prominence was listening to classical music. The abilities of classical music to unleash the hidden potential of your brain were seemingly endless.

As instant genius plans went, classical music was particularly easy to try out. It was no problem at all to go to the public library once a week and take out a few classical albums. Well, there was one problem. It was essential that I keep this most un-cool of activities hidden from my adolescent peers. I'm not entirely sure why. There was already a serious risk that I had been identified as a nerd. I wore hand-me-down clothes from my brother, who attended high school in the 70s. I went to the same hairdresser as my mother. When my exam marks were high I was visibly ecstatic, and I threw loud tantrums when they were low. I was a vocal proselytiser of my religious devotion to Doctor Who. On balance, my fear that a bit of Vivaldi on vinyl would shatter my trendsetter image now seems a tad irrational.

Nonetheless, secrecy was essential. This was not entirely trivial in the days before the complete takeover of the compact disc, when a stack of Tchaikovsky LPs was difficult to hide up your jumper. I could have easily bookended them with some U2 and REM, but this simply wouldn't have occurred to me; after all, I was so ignorant of music that I was starting from scratch 300 years earlier. I didn't know the difference between Bono and Barry Manilow, or Billy Idol and Billy Joel; all I knew for certain was that getting it wrong could end my unofficial role as School King of Kool. It was only later, perhaps armed with the increased brain power provided by a regular intellectual bathing in Bach, that I would learn that asocial nerd awkwardness posing as brave individuality is the perfect dress rehearsal for any budding scientist's genius act.

It was even beginning to dawn on me before the end of high school. I could note with approval the stunned reaction of a fellow student when he asked me what I was listening to on my walkman. "Mozart," I told him. "Bullshit!" he snorted. I took out the earbud and handed it to him. He contemptuously stuck it in his ear. And immediately his expression changed to pure amazement. "You weren't fucking kidding," he said, and indicated his newfound respect by handing it back, generously encased in earwax.

I have no idea if the classical music helped. I can provide anecdotal evidence of wonderful insights during the frenetic conclusion of many a Beethoven symphony (invariably after having jumped out of my chair to assist the conductor), but that proves nothing, because I also once solved an especially thorny coding problem somewhere in the middle of the Sex Pistols cover of My Way. All I can say for sure is that I consider myself lucky that the pop-science fad of my youth provided a rewarding minimal education in classical music, because in my innocence I would, if instructed, have just as avidly devoted myself to Elizabethan folk songs, barbershop quartets, or Broadway musicals.

A few years later record companies began to make a killing with classical compilation albums to help you think: "Mozart for your Mind", "Brahms for your Brain", "Saint-Saëns for your Psyche". This quickly expanded into classics to assist with every task. "Mendelssohn for the Morning", "Bach for Bath-time", "Debussy for Doing the Dishes"; "Rachmaninov for Ranting" and "Wagner for Wanking". Fortunately by that time I was already snobby enough to listen only to complete works, the essentials of which I had already copied from public library records onto a bank of cassettes. And anyway, I was beyond such childish mind tricks. Having observed how rarely a crowd flooded out of an opera house and went home to quantise gravity, and that Fermat's Last Theorem had now been solved without the aid of The Ring Cycle on continuous repeat, it was time to move on. Something far more serious was required.

Thursday, 20 February 2014

The mythical genius

If you want to be a successful scientist, it helps to be a genius. Failing this, you should ensure that people think you're a genius. Given that geniuses are extremely rare, but that hundreds of scientists work in every university in every city in the world, it should be clear that the genius act is far more common than the real thing.

A full virtuoso genius act is a vast artistic creation, and the life's work of most scientists. All I can do here is mention two of its common aspects.

The first is a set of extreme work patterns. Any extreme will do. On the one hand: "I work all the time, and I never sleep." Alternatively, "I never work, but I am so brilliant that I wipe the floor with everyone else anyway." As props in these performances, the former involves a ridiculous show of drinking a lot of coffee, while the latter, drinking a lot of alcohol.

Working hours also play a key role. You can choose to be in the office before everyone else arrives in the morning, or the last to leave after everyone else has gone home -- but a nine-to-five schedule is too radical to countenance. Or you may never enter the office at all, except to occasionally tell stories of four-day-long drug parties and three-week-long snowboarding holidays, while casually dropping off samples of your most astounding work. This potent performance culture leads to graduate student offices that are never empty day or night, never free of boasting and story-telling, never quiet, and in which actual work is never done.

The second important choice is arrogance versus modesty. Are you in-your-face "I am smarter than all of you!", or do you adopt the Goofy tone, "Aw, gee, shucks, I don't think I could do that," before blowing everyone else out of the water?

A perfect example of the in-your-face variety is the student exam technique of a now-distinguished professor. For a three-hour exam, he always turned up half an hour late. He went to the back of the room, where he sat down and made a lot of noise. He then proceeded to complete the exam in an hour and a half. He made it clear he was finished by putting his feet up on the desk, and spending the next fifteen minutes leisurely smoking a cigar. Finally he made yet more noise and left, with almost an hour still on the clock. The crucial part of this performance, of course, was that he aced the exam.

It's an unforgettable act, but it can fail spectacularly, so most people prefer to feign modesty instead. Then it's easier to keep a low profile if you screw up.

If we are to look to the true pinnacle of the arrogance/modesty show, we come to the physicist Richard Feynman. His personal myth-making was so successful that he was able to turn it into bestseller books. And he was able to concoct the perfect myth, by appearing both a simple, regular guy, and a flamboyant show-off genius, all at the same time. The trick worked so well that my first introduction to his books was from a fellow student who said, "Look at this guy. He's just a normal person, not that smart, but just by looking at things carefully and working hard, he managed to get a Nobel prize. There's hope for us yet." There couldn't have been a greater accolade to Feynman's performance.

It has to be said, however, that outside the crowd of physicists eager to dote on their hero, the act is not always so convincing. A mathematician friend (mathematicians, being so close to physicists, resent them intensely) had to read only half of Feynman's first book before distilling the essence of a typical Feynman story. "I didn't think I'd be very good at mountain climbing," he mimicked, "but I read a bit about it in the Encyclopaedia Britannica, and after a while I got pretty good at it. The next thing I knew I was at the top of some mountain and all these people were cheering at me, and they told me later that it was called Everest." As skeptical as my friend pretended to be, however, he learned the recipe well, and went on to a successful career.

This genius act may sound simple, but you have to remember that every scientist's performance is in competition with every other. And to forget that it's an act can lead to disaster. Everyone has met the first-year student who is so brilliant that they don't need to turn up to lectures; they soon cease to turn up at all. And we've all met the dedicated scholar who refuses every invitation to go out for a drink, a movie, a party, a hike, or even just to accompany you to check the mailbox, because, "I'm sorry, I'm busy, I just have so much work." Eventually these characters crack up, and achieve nothing besides providing a short burst of precious material for their colleagues' own mad-scientist performances.

I have my own experiences of getting dangerously deep into the genius act, but I'll leave those for next week.

Monday, 17 February 2014

Some light history of black holes

My research is on black holes. So how about a story or two about black holes?

There are two famous solutions to Einstein's equations that describe black holes. The first was worked out by Karl Schwarzschild only months after Einstein completed his general theory of relativity, in 1915. Keen students of history will realise that this was during the First World War, and sharp-eyed etymologists will guess that Schwarzschild was German. So, yes, he managed to work out the first really big-deal solution of Einstein's equations while fighting in the First World War.

Although it is surely completely untrue, I like to imagine young Karl hunched down in the trenches, barely able to see in the gloom of smoke, and barely able to hear over the erratic BANGs and RAT-TAT-TATs of battle, and with his papers spattered with mud and blood from the debris that occasionally rains down into the trench, scrawling away at his calculations. You can be sure that in my vision he is writing with a blunt pencil, and the man who borrowed his sharpener two weeks ago just went Over The Top and never came back.

"Jesus Christ! Where have I lost a minus sign?"

All the other soldiers think he has gone mad. But they do still give him cigarettes.

The next day he is clearly at a very different stage of madness: all grins, and just bubbling with joy.

"What's got you?" they ask. (I leave it to the reader to translate this dialogue into German, and correct my one-century-displaced colloquialisms.)

"Yesterday I made a major scientific discovery!"

"Oh yeah?"

"I finally solved Einstein's new equations of gravity for the space-time metric of a spherically symmetric star. I sent it off to a journal in a dispatch last night. Isn't that fantastic?"

"Um. Yeah. Good for you."

"Now I can relax and get on with contributing to the war effort. What's happening today?"

"We're going over the top."

And that is the last we hear of Schwarzschild [1].

Karl's solution can describe the most basic kind of black hole, which just sits there in space. The next important black-hole result described a rotating black hole, which might sound like only a minor step up in complexity, but it took almost 50 years for anyone to work out. (When physicists say that Einstein's equations are difficult, they are not trying to be funny. You can always tell when physicists are trying to be funny. It is very uncomfortable.)
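For readers who like to see what all the fuss is about, the non-rotating solution can at least be written down compactly (deriving it from Einstein's equations is another matter entirely). In the standard textbook coordinates, and with units chosen so that G = c = 1, the Schwarzschild metric for a mass M reads:

```latex
% Schwarzschild metric (spherically symmetric, non-rotating mass M; G = c = 1)
ds^2 = -\left(1 - \frac{2M}{r}\right) dt^2
       + \left(1 - \frac{2M}{r}\right)^{-1} dr^2
       + r^2 \left( d\theta^2 + \sin^2\theta \, d\varphi^2 \right)
```

The surface r = 2M is the event horizon. The Kerr metric adds a spin parameter and a cross term mixing t and φ, which is roughly where those extra 48 years of difficulty live.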

The rotating solution was worked out by Roy Kerr in 1963. Kerr happens to be from New Zealand, just like me, which I have to say is a far greater source of national pride than the All Blacks. Pulling out of your hat one of the two most important solutions of arguably the most difficult and mind-bending physical theory of the 20th century strikes me as just a tad more impressive than carrying a ball around a field while grunting [2].

I am not telling you about the Kerr solution because of the amazing physics it predicts. That was just a bit of background to what I think is a wonderful example of just how perversely interesting the behaviour of scientists can be.

Before I tell that story, I have another little story, which I think illustrates just what an impressive achievement the Kerr solution was. I was once at a conference in Argentina (ah, the jet-setting life of the scientist!), and I was walking behind two big-shots on the way to lunch, listening to them talk [3]. You should understand that these were senior Professors, big names, great minds, leaders in the field. And I heard one of them whisper to the other, "Have you ever worked through the derivation of the Kerr solution?"

"Fuck, no!"

"Me neither."

Anyway, here is the story I really wanted to tell.

Kerr came up with his solution in 1963. Needless to say, many others had been trying to solve the same problem, and some had got quite close. But close is not good enough. Kerr will be remembered, and the rest will not. One of them (I have forgotten who) recently gave a talk at a conference. During his talk he paused to tell a joke. Now, I warned you earlier about physicists trying to make jokes. He told us it was a story he had never told before. He did a long build-up, all about a cute solution of Einstein's equations that he once worked out, which aficionados of the theory in his audience were familiar with. Then came the punchline.

"I sent it off to the journal, who sent it to someone to referee. He realised that if he changed a few minus signs, he would have the solution for a rotating black hole. So he wrote that up, and became the most famous relativist after Einstein."

I told you it can be uncomfortable. For anyone not paying attention to the subtle subtext of that punchline, the guy was saying, "I was robbed! It should have been me!" And he was saying it FIFTY YEARS after the Kerr solution was published. So this guy, who had done a number of other notable things and was a renowned physicist, had been bitter and twisted about not finding the Kerr solution for half a century. It is easy to see why he got so wound up about it, but you also have to wonder if there was not some way he could get a bit of perspective. At least he was not blown up in the trenches in the First World War!

In the course of relating this story to another venerable scientist, I realised a comforting lesson about the mental-health dangers of life at the top. "We should count ourselves lucky to be mediocre," I said.

"Speak for yourself!" he snapped back. Then he walked away, and has not spoken to me since.

1. I just checked, and "completely untrue" is not the half of it. Schwarzschild did indeed serve in WWI, although he was not a young man. He was a distinguished and successful scientist who joined the army at the age of 40. He died of a rare skin disease called pemphigus in 1916, a year after he worked out the Schwarzschild solution. I learnt all this from wikipedia, so it must be true. Nonetheless, when the epic story of Einstein's general theory of relativity is finally turned into a major Hollywood blockbuster, I expect that it is my version that will make the cut. [return]

2. I understand that this is unfair, and that rugby is a game that requires great skill and physical strength, agility and endurance. It is especially unfair, considering how terribly under-appreciated is the great cultural contribution of our sportsmen. I apologise, guys. Now please let me take my head out of the toilet bowl. [return]

3. I am afraid lunch did not include Argentinian steak. At least, not that day. [return]

Thursday, 13 February 2014

The end of science as you know it

Now I really am going to tell you what this blog will be about. After all, the sooner it's properly categorized and indexed on the major interchanges of the information superhighway, the sooner all that social-media and search-engine magic will occur, and my inevitable millions of readers will flock to me.

Basic category: science blog. This is a science blog.

Except that it's not quite. That's fine: good old google will have picked up on the keywords, and this is now a science blog, and no amount of protesting will do me the least bit of good. But I'll try to forget that my main audience right now are the servers and algorithms of google, which probably outweigh living human readers by a factor of several billion, and address myself just to those two or three actual humans. Hi guys! Sorry if I keep getting distracted by that very loud, humming, buzzing, absolutely endless horde of ghostly googleness that's watching me.

As I said, this is not quite a science blog. You see, on the other side of the media veil, inside the world of science, a science blog is part of a big operation with the loathsome heading of "science outreach". It's a terrible name. It sounds like a community programme to assist recovering alcoholics and abuse victims. Or perhaps a term you'd see in a water safety brochure. "In a drowning situation, try to remain calm, and even when a rescuer is within close proximity, do not attempt outreach." Or a police report on the death of that unfortunate prehistoric mammoth: "The victim, observing a tasty morsel overhanging the cliff, appears to have suffered a fall following excessive outreach." Maybe someone has realized this, because now they have an alternative term: "public engagement". Not really an improvement. I would have assumed that was the natural successor to the public marriage proposal.

Whatever the connotations of the term, science outreach means telling non-scientists just how wonderful science is. And getting more kids to study science, based on the premise that there are deep flaws in our education system that drive them away from science, when it is in fact vital to our cultural and economic health. ("Our" also means "your", whoever you are, because every nation, in the Western world at the very least, seems to take exactly the same view.) I'm happy to accept all of this, and would not want to question the motivation, dedication or skill of the many wonderful people who engage, reach out to, or otherwise enthuse the public about science. Especially since a large number of them work in the same building as me, and have a huge storeroom of "science demonstrations", which I'm sure could very easily be repurposed to demonstrate to me the error of my ways.

Nonetheless. Since scientific research relies on public funding, and university teaching also relies on funding, which is allotted in proportion to the number of students who enroll, there is a huge temptation for the whole enterprise to devolve into a massive advertising campaign. I'm not saying that's what has happened (and I would be too chicken to say so even if I believed it; God knows what you can do to someone if you fiddle the wiring of a Van de Graaff generator), and I frankly wouldn't care that much anyway. I'm a scientist, and I like my funding! But I'm not interested in writing a blog all about what a wow whizzbang jolly time science is. Plenty of other people do that, and with far more colourful vocabulary.

What I find interesting is what the "world of science" is really like, on the inside. I've no idea what most people think it's like, but it's certainly nothing like the naive, idealistic, romantic view I had when I started out, a world of dedicated individuals who have devoted their lives to the pursuit of the truths of the universe. In the end, it turns out that it's just like every other job: full of incompetence, rivalry, power struggles, pettiness and stupidity, needless crises and ridiculous dramas. The work of most scientists is of as much consequence to humanity as that of the characters in The Office, only executed less professionally. In other words, it's entirely human and thoroughly entertaining. It's wonderful. Put another way: I've realized that science is not just fascinating as an intellectual exercise, it's also fascinating as a human endeavor, with all the comic nonsense that entails. Plus, in the end, and this is the tricky bit to get across: the whole scientific enterprise, despite everything, actually works. Somehow a handful of scientists really do make major discoveries, they really do make steady progress in working out how the universe operates, and their contribution to society really is incalculable, even if the process is far more chaotic than most of them would be willing to admit. I am happy to admit it. I love it. And that's what this blog will be about. At least, some of the time.

Monday, 10 February 2014

Slow blogs, rushed book reviews, and the last great promise of Obama

I want to explain my motivations for writing this blog, the inspiring ways I plan to talk about science and all that, but first I will digress a little.

On the dashed-off nature of blogs. I'm skeptical of it. If you don't mind, I'd rather mull things over a bit before just throwing them out there. It's all very well if I've gone into the intellectual wilds, as it were, and caught a good idea, but in most cases I think you'd prefer if I didn't just toss it to you raw; I should cook it first. If a draft or two is against the spirit of a blog, then I confess to them up front. In fact, for added pretentiousness disguised as honesty, whenever necessary I will include the dates over which the post was written, like those fancy writers do: "Barcelona-Osaka-Montreal-Lima, 2007-2012."

My reservations about the spontaneity of blogs extend to a lot of old-fashioned print media, too. There has always been a desire to be the first out with an opinion, to set the tone of the discussion. The example that comes to mind, for some reason, is political memoirs. I remember when Bill Clinton's memoirs came out. That was one heavy block of a book, and I imagined that all the reviewers had been handed a copy one day before publication, and had immediately leaped into their reading chairs, or beds, or baths, and started ploughing through it like maniacs, every one of them desperately hoping that they didn't miss a single headline-making passage before they finally had to put it down and belt out a 1000-word review to send off to their paper by 5am.

In reality they were probably given a pre-publication copy and a non-disclosure agreement three months earlier, plus an index to key pages provided by the publicist ("…amusing childhood experience No. 32, p. 234… re-election campaign, p. 452-587… stains on dress, forget it."). But the point is that no-one is interested in a review that appears six months later, even if it is the result of careful consideration, detailed research and fact-checking and interviews, and contains brilliant insights and subtle observations. Now, this is probably because no-one gave a damn about Clinton's memoirs for more than a week, and maybe that's how it should be. But I assume that there is the occasional amazing political memoir out there, and I would be very happy if someone, years after it was published, were to write a "re-review" (or perhaps it should be "aged review") that explained to me why, even though I haven't read a political memoir in my life, this one will be worth my while. That is something I would love to see. I wouldn't mind having a go at that kind of "aged review" myself in this blog, although probably not with political memoirs.

Except, perhaps, Obama's memoir. (Ok, now I'm writing an opinion several years before the book is written, so I'm botching the whole thing already.) I have high hopes for Obama's memoirs. Probably higher hopes than I had for his presidency. After all, he had never been President of the United States before, and, as was disingenuously pointed out during the 2008 election campaign by all of his rivals, who also had never been President before, this was a man with no direct experience of the job.

On the other hand, Obama has already written a memoir. I read "Dreams from my Father" in those heady times between Obama's election and his inauguration, and I was amazed. It wasn't just that I found the book inspiring, although of course it was certainly that, especially in those months. After his every insight about the difficulties of black America, I would think, "Yes, racial prejudice is just too deeply ingrained in the culture. There's no way this guy could ever be elected president, not in a million years… Just a minute, he HAS been!" This wonderful realisation would strike me every 20 pages or so, and maybe that biased my assessment of the book. (Really, Mark? You think?) But I'm not the only one who gave it the literary thumbs-up. A few months later I read a beautiful article in the New York Review of Books by Zadie Smith, where she discussed, among other things, Obama's great skill at capturing a character's personality, outlook, and entire culture, all in the perfect rendition of their unique voice.

In short: this is a guy capable of incisive self-analysis and shrewd observation of others, with the ability to illuminate it all in fine prose. He's not a President who can write, but a writer who became President. How often does that happen? Who would not relish the thought of such a fellow writing a first-hand account of being President of the United States? Sure, we could imagine more tantalising figures for the job. I'd much rather read Mark Twain's century-delayed autobiography if Book Three were devoted to his (presumably disastrous) one-term presidency. Or Hemingway refusing to be apologetic that in a drunken fit he launched a nuclear holocaust: "I ordered the attack. The missiles were airborne. It felt good. And that time, the Earth really did move." Or Philip Roth's autobiography-as-fiction blockbuster Zuckerman Rules. Obama may not match what they would have done, but he's ambitious, and we can be sure that he'll try; and if you read Smith's article (and you should), you'll be convinced that in fact he is capable of doing a greater job than any American writer before him. Now, I know that the man who comes out the other end of the presidency will not be the same man who went in (who in turn was not the man who'd just left Harvard when he wrote his first memoir), and the whole thing may be nothing but charming anecdotes and convoluted self-justifications, but -- just to induce a cringe in disillusioned liberals -- I can have the audacity to hope.

If you cannot bear the possibility of disappointment, you can also just wait to see if I write an "aged review" of it. Sometime around 2020.

Sorry. I was going to tell you what this blog will be about, and why I'm writing it, but I got a bit sidetracked. At this rate I'm going to write a blog devoted entirely to the question of what the blog is about. As thrillingly post-modern as that may sound, I promise to reveal at least one of my nefarious motives next time.

Thursday, 6 February 2014

A scientist starts a blog

One of the main reasons people become scientists, which they're not supposed to talk about, is to satisfy their ego. I have become a scientist, and I can attest that the ego is indeed fairly well satisfied. But, it seems, not quite enough. Although I have convinced myself that I am extremely clever, and that the work I do is so important that I don't mind being paid relatively little to do it (certainly much less than a brilliant mind like mine could presumably earn if turned to a lesser vocation), my ego still wants more. Fame! Glory! The adoration of millions! Unfortunately, having spent all my adult life training to be a scientist, it's a little late to become a rock star or a Hollywood celebrity. Plus, see lesser vocations, above.

A blog is the perfect option. It is a natural extension of the scientist's urge to continuously pontificate, and my observations of the internet suggest that notoriety can be achieved even if you have nothing original or useful to say. The internet has done a sterling job of uncovering a great democratic urge to communicate the fathomless depths of inanity within us all --- so I'm sure I have plenty to offer.

In most hit blogs the author is either an entertaining screw-up, or screws entertainingly. Scientists are invariably entertaining screw-ups, so I have that one covered. I'll have to remember to change the names and shuffle around the facts, but as a practised scientific researcher, that will come naturally. As for the second point: forget it. If the world's labs, libraries and lecture theatres constitute a modern-day re-enactment of the sauciest passages of the Decameron, then they've kept it all well hidden from me. I could make something up, but this isn't that kind of blog.

The ideal blog is written by an expert we're thankful to hear from. That might work. I can occasionally feign being an expert. After all, I do know a little bit about physics. I can say nice things about black holes and rude things about string theorists as well as the next guy.

Besides self-aggrandisement, there are slightly more serious reasons (i.e., not made up as an excuse for flippant remarks) why I have started a blog. But since it is the nature of blogs to be brief and dashed-off, I will leave those for next time.