So long…


… and thanks for all the fish!

About six months ago, I purchased the domain name with the goal of one day porting this blog over to it.  I optimistically suggested that would be done “by October,” which was just balls-to-the-wall crazy.  I could never do the three hours of work it took to export everything from here and put it over there in a mere six months.  Instead, I waited until this afternoon to do it and knocked it out in three hours.

You can check out the new digs over at — it should look really similar to this site, except with a few additional “About” pages, a newer simpler theme, and this post won’t be there.

A few things to note — if you are reading my blog and want to continue to read it, I think you should!  I’ll continue to post on the new site going forward, and hopefully I’ll do so more often.  I’m also aiming to have my posts be a bit more… targeted.  I’ll still post rambling nonsense, but more often than before it will have something to do with web development or coding in general.

Unfortunately, though, it costs money to keep this site up and point it to the new one, so I’m just going to shut it down.  I’m giving this notice a week in advance; I still need time to go through each of the posts and make sure the links work, since a lot of them link back to here instead of to the new site.  I’m aiming to shut down this site one week from today, at 5:00 PM on Wednesday, April 1 (no this is not an April Fools joke — who would even care?).

Finally, anyone who’s been going to regularly (which is extremely unlikely because why would you be doing this?) may end up with a redirect error to a 404’d page:


It’s not this one, but we can all wish it was.

It’s a surprising PITA to get around this, but for Chrome users I suggest you follow the accepted answer here; the gist is to go to Chrome Menu > Settings > Show advanced settings… > Privacy > Click Clear browsing data…, then select “Cached images and files” (and probably nothing else) and clear the browsing data.  If you don’t have Chrome, try it out.  It’s great? Or just move on with your life; what I’m posting probably isn’t that worthwhile anyway.


Thanks for reading!  I hope you continue to do so on my new site,


Hi, Perbole

This week, the American sports world was shocked by a (rather hilarious) controversy involving a shirtless photo of Jameis Winston, a possible first pick in the 2015 NFL draft.  The photo was controversial not because it was lewd or because it captured conduct unbecoming a first pick, but because he appears, for lack of a better word, fat.


Iggy piggy!

This, of course, was much-debated on ESPN this week leading up to the NFL combine; his coach pointed out on SportsCenter that Jameis would be paraded around to GMs and owners in his underwear*, so if he’s still fat they’re gonna see it anyway.  I think he was referencing the medical examination process, as part of which each player gets a physical — as well they should, since their physical properties are about to be heavily invested in. However, this struck me** as an alarming image.

What else in our nation’s history involved a bunch of rich white guys (NFL owners are 100% non-black with one Pakistani, and there are only 7 black GMs) lining up a group of predominantly black people (NFL players are over 2/3 black) with no say in their below-market-value compensation (rookie salaries are predetermined by draft position) and valuing them based on their physical attributes?


Did somebody say… slavery??

* I could not find a video or transcript of this, so I’m not sure of the exact wording.  I may also be misattributing the source.  Womp womp.

** A middle class white guy with exactly zero stake in this statement.

Communication is Key

In keeping with the ancient forms and traditions, today the world celebrates its yearly remembrance of Saint Valentine driving the snakes from Ireland by surrounding those we love with chocolate, shiny objects, and Hallmark cards (snakes hate those things).  We spend countless ducats (OK, maybe they’re countable) reinforcing relationships that are already strong — in part because if we don’t, they’ll no longer be so strong.  I think all of this outpouring is misdirected; we should gather together to save foundering relationships rather than pouring resources into those that can stand under their own power.  In particular, one relationship needs our collective societal help: the ancient, ever-tumultuous relationship between science and the right-wing nutjob morons who deny her.


This guy captioned himself!

The chief problem here is a complete and total breakdown in communication between the two parties.  On the one hand, you have the scientists and sane folk who present evidence and draw conclusions, and on the other hand you have stubborn fools backed into a corner against an overwhelming onslaught of evidence contradicting their, let’s face it, not facts but opinions and beliefs, and who lash out at that which they can’t possibly wrap their tiny minds around.  If they would only listen instead of digging their heels in and dismissing their obvious superiors, imagine what a society we’d have!

OK, maybe that was overblown, and maybe the Science clan would never actually make those claims out loud, but writing from that camp, I can tell you that the frustrations with the stubbornness of the other side are real.  But it should be obvious from reading that paragraph that we are doing the exact same thing — senselessly digging in our heels and refusing to budge.  We’re not engaging in a dialogue, we’re engaging in a shouting match, and we’re making it worse.  Here’s Bill Nye, arguably the single greatest science communicator to my generation, on Science Friday last month talking about his new book on the theory of evolution, Undeniable.  You probably don’t even have to give that a listen to see what I’m about to say — the title of his book says it all.  “Evolution is undeniable.  Anyone who denies it is wrong.”  However, if you do give it a listen, I’d point out around 3:10, where he actually says “Evolution is provable.”

Bill Nye

That’s what happens when you get your science from professional baseball players.

The statement that any scientific theory is “proven” is directly contradictory to the way that science works.  Quite simply, nothing can be proven except in pure mathematics.  As an example, can you prove that a wall is solid?  Take a running leap at it and see if you pass through.  You don’t, so it’s super solid, right?  The current model of the universe suggests it’s not — it’s just extremely likely to be solid.  If you repeated the experiment about e^(10^50) times (that’s a 1 followed by roughly 10^50 zeroes), one of those times you’d probably pass through the wall.  The wall isn’t solid; it’s just that it would likely take more interactions than there have ever been or ever will be in the universe to find an interaction in which it is not solid.  So we simplify — we claim the wall is solid, with a reasonable certainty that we’ll never see it behave any other way.  A model that’s wrong only once in every (1 with a sexdecillion zeroes after it) is a really really really really really really really strong model.
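The number in that parenthetical is too big to write down, but you can sanity-check the “roughly 10^50 zeroes” claim with logarithms; here’s a quick sketch (the e^(10^50) figure is the one from the text, the rest is just log identities):

```python
import math

# The figure above: odds of about 1 in e^(10^50).  That number is far too
# large to compute directly, so work with its logarithm instead.
log10_of_odds = 1e50 * math.log10(math.e)  # log10(e^(10^50)) = 10^50 * log10(e)

# So e^(10^50) written out is a 1 followed by about 4.34e49 digits --
# "roughly 10^50 zeroes," as claimed.
print(f"e^(10^50) has about {log10_of_odds:.3g} digits")
```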


This is not the number. The number has this number of  zeroes.

The same thing applies for evolution, albeit at a much, much less certain level.  There’s a slew of evidence in favor of it — we’ve seen it in action at the microscopic level (many times to our dismay, with antibiotic-resistant bacteria and mutating viruses), and the fossil record supports the theory that plants and animals change gradually over time.  But just to pick apart a tiny piece of the overwhelming evidence in support, the fossil record relies on radiometric dating, which itself relies on the assumption that all radioactive substances decay in the same manner, and that they have done so identically for all time.  We can verify that, right here and right now, radioactive substances with half-lives up to a few centuries — or even millennia — all decay in the same way, and there’s no reason to think that there have been any changes in these fundamental behaviors.  However, we’re talking about millions and billions of years; we’ve only been able to study radioactive decay for a hundred years.  We are able to see just a tiny tiny tiny tiny tiny point on the curve of radiological history, and we’re applying what we see to that whole curve.  It is possible — very, very unlikely, but possible — that there may be some change to this seemingly universal behavior that we can’t or don’t perceive.
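The dating arithmetic itself is just the decay law run backwards; here’s a minimal sketch (carbon-14’s ~5,730-year half-life is a standard figure, and the sample fraction is made up for illustration), with the constancy assumption called out in the comment:

```python
import math

C14_HALF_LIFE_YEARS = 5730  # carbon-14's half-life, a standard figure

def radiometric_age(fraction_remaining, half_life):
    """Age implied by the decay law N(t) = N0 * (1/2)^(t / half_life).
    This ASSUMES the decay rate has been constant for all time --
    exactly the assumption discussed above."""
    return half_life * math.log2(1 / fraction_remaining)

# A hypothetical sample retaining 25% of its original carbon-14 has
# been through two half-lives:
print(radiometric_age(0.25, C14_HALF_LIFE_YEARS))  # 11460.0 years
```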

But when we talk about evolution we don’t talk about it that way; we say “it’s proven,” and when people ask for that proof we give them evidence that certainly points to our conclusion but is chock-full of assumptions, and when people point out our assumptions, we say they don’t understand, and they say we don’t understand, and we dig in and dismiss them.  Instead, we should encourage people to test these assumptions.  There should be an army of scientists actively trying to prove that radiometric dating doesn’t work, formulating hypotheses of what kinds of physical evidence we would see if half-lives had been exponentially increasing since the formation of the world.  Think about what an incredible boon for science it would be if we found out we were wrong about something so basic!


For one thing, it might mean dinosaurs were way more recent than we thought…

While we frequently overstate our case, I think we magnify the problem by frequently being wrong.  Sometimes this arises from uncertainties due to the mind-bendingly complicated nature of the world; the Thalidomide tragedy, where a drug marketed to protect against morning sickness in pregnant women could cause severe, often fatal birth defects, was the result of one enantiomer of a chiral molecule behaving differently than the other.

Sometimes, it’s worse, and science is being used to deliberately mislead.  Think about those vintage tobacco ads suggesting that smoking was good for you (usually that it would help you stay thin and virile), where they mention the number of doctors or scientists who agree with some study, and then the tobacco company says something like “Proven to make you thinner and more virile!”  Yes, all of those studies were hilariously paid for by the tobacco companies, and yes, they picked and chose the results they wanted.*   Still, it’s not hard to imagine where people might have gotten the idea that scientific studies may be influenced by partisan concerns.  We’ve come a long way in promoting impartial research, but in (another) recent episode of Science Friday specifically about science communication, the overarching theme coming from science doubters is that they don’t believe in the neutrality of research — they all think someone is trying to promote some viewpoint or new product.  No one bothers to go into details about what the research means or whether it’s impartial; they just dismiss the callers.  You can imagine how frustrating that would be if you were on the other side of it.  Now imagine you’re already skeptical of some perceived science lobby, and you hear Bill Nye talking about how he’s going to indoctrinate your sons and daughters while you can’t do anything about it (5:40 from the Bill Nye episode):

The people that I’m concerned about are people in, let’s say, middle school.  People who are open minded, or who still haven’t established themselves intellectually.  By that I mean their reasoning is not fully developed, this way or that way.

I know where he’s going with that, but it’s terrifying if you think he’s specifically trying to indoctrinate your children because, to skew his words, they’re young and impressionable and they’ll believe any lies you tell them.  I can imagine how that sort of language might back me further into a corner.


and he demands that you accept his “science”

I don’t think that either side in this debate is doing a particularly good job of communicating, but I also disagree with what I perceive as a widely-held view that science and scientists are somehow holding the moral high ground.  We need to be better at communicating everything, from our most basic theories and laws to the most recent studies and findings; in particular, we need to be up front about our assumptions and the uncertainties in our findings.  The world is an absurdly complicated place, and it will never be possible to eliminate all doubts; instead, we should meet in the middle and be up front about those possibilities — and, critically, encourage their exploration.  If we want to ultimately bridge the gap between the two camps, we need to do a better job of establishing communication across that great divide.  Communication is key; after all, without establishing communication, could Saint Valentine have gained the dragon’s trust and lured it out of its lair to be slain?

* Note even in the linked ads that they say things like, “proven to be less irritating to the throat than other brands,” and then they repackage that as “protects the throat.”  I love it.

Fix It

Last weekend, a friend hosted a Star Wars marathon, screening all six movies (yes, six, for those of you who don’t count Episode I).  The correct order to watch them in has been debated, but it is generally accepted that the best order is IV -> V -> I (if you do watch I) -> II -> III -> VI.


This applies to the entire article, from here forward, by the way.

That way, you don’t get the “No, I am your father!” ruined for you (which would be obvious after episode III — although you do get the “… sister” moment ruined for you.  But I think that makes it better, since you don’t want to know about that while you see them makin’ out super hard in V.)

The other thing about watching them in that order is that you get to start and end with movies that are actually good.  If you started me on Episode I, I’d never watch the second one; the only reason I saw Episode II is because by the time I watched Ep. I, I already loved Star Wars.  Compared to the original trilogy, Episode I (and really all of the prequel trilogy) is just the ravings of a crazy old wizard who lives in the desert.

There has been nigh-endless discussion of how to fix the prequel trilogy; in particular, there’s one video I believe is pretty widely known (but I can’t seem to find it…?) that points out that there’s no main character in the prequels — especially Episode I, where you’re sort of following Qui-Gon Jinn and his apprentice Obi-Wan (along with Anakin, the queen, and Jar-Jar) about equally, and you never really get to know anything about the characters.  He also points out that the characters have obvious traits in the original trilogy: Han Solo is a cynical rogue with a hidden heart of gold; Luke is an idealistic rube, etc.  Comparatively, Obi-Wan in the prequels is … like a guy who’s kinda young but really powerful…?  Or like Qui-Gon is older and wiser than he is, but like… sometimes kinda petulant?

There’s a reason for this; one of the things I was struck by watching originals side-by-side with the prequels is that there are entire sections of the original trilogy that do nothing to really advance the plot but instead exist solely for characterization; in Episode IV, we get to see whiney Luke be a whiney little brat, set up an internal conflict about fulfilling his duties to his adoptive family on his farm vs. going off to fight the Empire, and then it gets resolved for him by some storm troopers; this takes a good half hour, and in the end we’re basically where we started, but we know Luke as a character and have been introduced to Obi-Wan, the mysterious sage.  In Episode I, we open on Jedi using laser swords, then they go down to a planet and there they get to use their laser swords, then they leave the planet and use laser swords, then they come back to the planet and use laser swords.  In the un-testable hypothetical scenario where Episode IV was written like Episode I, we’d get a shot of Luke being like “I wanna leave!” and then Obi-Wan coming from nowhere and telling his aunt and uncle he’s coming, and then they leave.  The prequels are generally like this; people flitting from place to place advancing the plot, and it looks really cool, but at the end of the day I don’t really know anyone in those movies.


Except Jake Lloyd, who to this day denies any involvement in that film.

The thing that kills me about those movies is that they could be great; it’s a fascinating story, and even the nonsense that happens in the background (er, it should be in the background) with the Galactic Senate and the chancellorship and the mystery around the war is interesting.  It should just be in the background, and the characters should take center stage.  If I were Disney (and depending on the success of the upcoming Episode VII), I would rewrite and reproduce the prequels.  And if I did that, there are a few things I would do differently…


Seriously though; the first thing I’d do is work on character development, and I’d base it on the characters from the original trilogy.  They don’t have to be identical — they should change as the story progresses — but we should see their roots in the prequels.  What is Obi-Wan like in the original trilogy?  He’s a patient, wise and learned old man.  When we see him in Episode I, he should be frustrated and anxious (yes I know this is actually the case in the real Episode I), and there should be moments of tension where he explicitly grows into his character, so that by the end of Episode III, he’s essentially the patient, wise man we see in Episode IV, not the gung-ho warrior who runs off to kill a separatist general (or a Sith Lord, for that matter) without any backup.

The obvious person to change, though, is Anakin.  Besides getting someone who can actually act and completely rewriting all dialogue, there are some really fun things you can do with the character.  How would you describe him in the original trilogy?  Intimidating.  What makes him intimidating is, of course, the suit — but also his speech pattern.  He speaks slowly; he thinks about what he’s saying and says exactly what he means.  He wastes no words.  Build that into his backstory; when we meet him, he’s a slave boy.  His master has drilled it into him that he is to be seen and not heard; he is to speak only when spoken to.  Show that.

This also brings up my next point, which is to make direct comparisons to the original trilogy.  Open with Anakin on Tatooine; spend a half hour in his life and draw the comparison to Luke and his uncle; Luke is surrounded by family who love him, Anakin is surrounded by his mother and his abusive owner (note: make the owner abusive, and don’t make it that stubbly blue thing — model it after someone from Mos Eisley or Jabba’s palace).  Both Luke and Anakin long for something more, and are eventually scooped up — almost by happenstance — by Jedi.  Also, while we’re at it, let’s ignore the entire opening on Naboo; we only meet the Jedi after they’ve come to Tatooine in a battle-worn spacecraft.  (Here’s another point — the original trilogy, in part because of budget constraints, was unable to show everything, and it made it better.  You have Lando in VI show up as a general, and he casually says, “They must have liked that stunt I pulled at the Battle of Taanab.”  We didn’t see that stunt; it’s left to our imagination what the daring General Calrissian did to save the day.  (Probably it involved engaging those Star Destroyers at point blank range.)  Do that here — briefly mention the harrowing escape from a far-away land, show the outcome — a busted-up ship — and leave the rest to our imagination.)


This also brings up an important rule: AT-ATS. As Yoda would say, “All Trilogies At Tatooine Start”

Pander to the audience a little; sprinkle in quotes from the original trilogy (honestly, if just to make the writing better, since the dialogue in the original trilogy was pretty fantastic, and there’s literally not a single quotable line in the prequels).  Someone’s worried Anakin won’t win the podrace?  Slowly, and somewhat disappointedly, he says, “Your lack of faith … disturbs me.”  If you have to have that scene where he saves the day at Naboo, put him in the cockpit and tell him if he stays put you’ll come back and get him, deal?  When he shows up in space, you call him out on it and he says — and this is the only thing he says in that entire scene — “I am altering the deal.”

Back to Anakin; don’t tell us the Force is strong with him.  Show it.  At that dinner scene, Anakin looks expectantly at the Jedi, they ask him what’s on his mind, and he says nothing until his mother tells him it’s OK (because remember, he’s afraid to speak for fear of angering his master).  He tells them he knows they are Jedi.  He does not know what that means.  “How do you know that?”  Slowly, “… I saw it in my dreams.”  Oh snap!  The Force is strong with this kid!  Then build up an arc about how he turns to the dark side.  Don’t have him slaughter a bunch of helpless Sand Persons (not just the men, but the women, and the children, too).  Keep that story — his mother is kidnapped and he is unable to save her — but then have him discuss it with Obi-Wan.  “Do you ever wonder if there is something more… powerful?”  “No, Anakin; the Force binds us together… blah blah blah.”  This sets up his fall without making it obvious — there’s still hope for him, but his motivations are clear when the Chancellor tells him about the powah of the Dahk Side.

In this — the key conflict of the entire series! — again draw comparisons to the original trilogy.  The Emperor poses a real threat to Anakin’s wife and unborn children (he’s holding them hostage, whatever).  Anakin comes in to reason with Mace Windu — already victorious — about the Emperor’s release.  He says he will not fight him, but the Emperor plays on his emotions; as the sword comes down on the Emperor, Vader’s comes up to defend him (I know this actually happens in the movie; the rest of the scene is super bizarre, though) and Vader gives in to his emotions (as does Luke at the end of Jedi).  Here’s the key difference — when he disarms Windu (literally), he pauses and considers what he has done.  Maybe the Emperor offers some encouragement, and then he brings the sword down; Anakin kills Mace Windu and realizes there is nowhere else to run but to the Emperor.  I’m tempted to say end the movie there, but I guess you need him to get lava’d up real hard or whatever.

Anyway, that’s just some of the stuff I would do, if I were put in charge of this project that doesn’t exist.  But if, for any reason, some Disney exec is reading this, know the following:

  • You should re-make the prequels so that they’re good instead of terrible
  • I will render my services free of charge
  • I’m looking forward to Episode VII

2015 Year In Preview

As promised in my last post, I have a new batch of resolutions in 2015.  I apologize for a post that’s just a list of stuff I’m hoping to do this year, but “publishing” is one of my keys to successful resolution keeping, so I’m hoping my reader(s) will help keep me honest.  Having learned from last year’s failures, I’ve set up a system of tracking everything weekly, and I’ve also set up a grading scale to determine success and failure.  The way it works varies a bit from goal to goal, but I’ve set up a pretty sweet spreadsheet to track everything, so I should have a good view of where I stand at any point in time (and can presumably come up with interesting trends and stuff).  There are a few new ones, but many of the resolutions are extensions of last year’s successes.

Generally, I’d say the theme for 2015 is productively using my time.  Since I’ll be funemployed on Thursday, I should have plenty of time to spend on fitness and hobbies and the like.  The plan for my funemployment (and perhaps my first resolution) is to work in five 75-minute chunks, separated by at least 15-minute breaks, 5 days per week (for a total of about 30 hours per week) on my main project, and a sixth day hopefully on some personal project (maybe a Coursera course or a stupid app I want to build).  The breaks will be great time to work on some of these goals — getting in a solid 15 minutes of juggling, or a workout, or what have you.  I may need to revisit these if I end up getting another job, but I think these goals are tractable through at least the first half of the year.

Without any further ado, I give you my resolutions for 2015.

Up by 6:00 4 times / week

I can make it easier to maximize productive time by maximizing awake time.  Note that I can also take naps between my 75-minute blocks.  I’ve made this both stricter and more relaxed than last year’s “into the office by 8” rule; on the one hand, I’ll be getting up earlier, but on the other hand, it’s only 4x per week, plus I can do whatever I want (see “naps,” above) for the rest of the day after I get up.  Another more relaxed aspect of this is that I am in complete control and won’t miss a week if, e.g., I get stuck in traffic.

Grading Scale: A: 45 weeks; B: 42 weeks; C: 38 weeks; Failure: 35 weeks


Work Out 5 Times / Week

Same as last year; I felt that was a good number.  I have some more pointed goals in 2015, though.  See below.

Grading Scale: A: 50 weeks; B: 47 weeks; C: 45 weeks; Failure: 42 weeks

25,000 Push-Ups (2015 Total)

I said I’d have weekly goals; this one is a bit weird.  It’s 500 push-ups (100 push-ups, 5x) per week * 50 weeks, but I’m counting a weekly success as being on target by Saturday of that week. E.g., if that Saturday is the 100th day of the year, I should have done 25,000 * (100 / 365) = ~6850 push-ups by then.  This allows me to get ahead or behind, rather than doing exactly 500 push-ups in any given week, which if anything makes it a bit more relaxed than just doing 500 per week, since I’m more likely to get ahead than behind.
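That pacing scheme boils down to one line of arithmetic; here’s a quick sketch (the helper name is mine, just formalizing the rule described above):

```python
def on_target_total(day_of_year, annual_goal=25_000):
    # Hypothetical helper formalizing the pacing scheme above:
    # cumulative push-ups needed by a given day of the year to stay on pace.
    return annual_goal * day_of_year / 365

# The example from the text: by the Saturday that is day 100 of the year,
print(round(on_target_total(100)))  # 6849, i.e. the "~6850" above
```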

Grading Scale: A: 48 weeks; B: 45 weeks; C: 42 weeks; Failure: Not completing 25K

12,500 Pull-Ups (2015 Total)

Same metric as the push-ups, but 50 per day, 5 days per week.  I think this will be much harder than the push-ups… especially if I don’t join a gym.

Grading Scale: A: 48 weeks; B: 45 weeks; C: 42 weeks; Failure: Not completing 12.5K

Run + Row 10 Miles / Week

This should establish a good baseline cardio regimen; it’s more than I probably did last year, and it’s relatively easy, since my apartment has treadmills.  It’s also going to be the easiest to do on vacation, etc., since it’s pretty much always possible to run.

Grading Scale: A: 50 weeks; B: 48 weeks; C: 45 weeks; Failure: 42 weeks

1 Hour of Yoga / Week

This will be pretty difficult, since I will basically have to teach myself yoga.  Also, I’m very inflexible.  But that’s exactly the point, so I’ll need to do this.  I’m guessing I’ll end up doing 15-minute stretches right after I wake up or something.

Grading Scale: A: 48 weeks; B: 45 weeks; C: 42 weeks; Failure: 40 weeks

“Fast” 5 Days / Week

I heard on a podcast recently that a study suggested that mice are protected from obesity, diabetes, and a host of other lifestyle diseases if they just don’t eat during a 12-hour window, 5 days / week, so in the dumbest possible way, I’m going to try that out.  I’ll wait until February, since January will see a lot of lifestyle changes anyway, then I’ll stop eating between 9:30 PM and 9:30 AM Sunday night – Friday morning and see how it goes.  If successful, I’ll switch to 7:00 PM – 7:00 AM for the first four weeks in March and compare; if it’s unsuccessful in February and March, I’ll stop doing it altogether.  “Success” will be determined by weight and body fat measurements.

Grading Scale: A: 40 weeks (or no change after 8 weeks) ; B: 38 weeks; C: 35 weeks; Failure: 30 weeks


Contact 1 Out-of-Town Friend / Day

Same as last year; that went really well, and I don’t want to stop that.  But I’ve added in a few others (that’s why it gets its own category!).  Also, I’m formalizing that I cannot repeat people within a week.  “Success” for a week means not missing any days.

Grading Scale: A: 48 weeks; B: 45 weeks; C: 42 weeks; Failure: 40 weeks

Contact 1 Still-Employed Ex-Coworker / Week

This should give me a way to keep in touch with the folks at my old job.  If the ones I tend to talk to all end up leaving, I may have to revisit.  I’m also excluding people I would talk to normally anyway, and I’m saying that I cannot repeat people within a given month.

Grading Scale: A: 48 weeks; B: 45 weeks; C: 42 weeks; Failure: 40 weeks

Call 1 Friend / Week

I hate calling people, but the number of people I’ve promised to call (and subsequently not called) is embarrassing and awful.  I intend to fix this by calling a few people (mostly the ones I promise to call and don’t) more frequently.  Same rule here — cannot repeat a name within a given month.

Grading Scale: A: 48 weeks; B: 45 weeks; C: 42 weeks; Failure: 40 weeks


Practice Music 3 Hours / Week

Given my newfound time, I’m increasing this by an hour per week.

Grading Scale: A: 48 weeks; B: 45 weeks; C: 42 weeks; Failure: 40 weeks

Juggle 60 Minutes / Week

I want to be able to juggle 4 balls and pins.  I’m hoping if I spend an hour a week juggling, I can probably manage it.  This should be a great way to spend my 15-minute breaks.

Grading Scale: A: 48 weeks; B: 45 weeks; C: 42 weeks; Failure: 40 weeks

Practice Sleight of Hand 1 Hour / Week

I’d like to be able to do stupid card / coin tricks.  My brother’s been married for a while, so I could be an uncle like… any time, with no warning.  Better get practicing.

Grading Scale: A: 48 weeks; B: 45 weeks; C: 42 weeks; Failure: 40 weeks

Volunteer 2 Hours / Week

Really this is “Volunteer 8 Hours / Month.”  I never volunteer, but now that I have time, I should be able to give it freely toward something interesting.  I’m hoping I can land a volunteer job doing something awesome like hunting snakes in Florida or cataloguing fossils at the natural history museum, but I have yet to explore opportunities.

Grading Scale: A: 25 weeks; B: 23 weeks; C: 21 weeks; Failure: 20 weeks

Practice Speed Reading 1 Hour / Week

I’m a crazy slow reader; if I spend some time addressing this, I think I could make some good progress, and it’d be a huge boon to my productivity generally.

Grading Scale: A: 48 weeks; B: 45 weeks; C: 42 weeks; Failure: 40 weeks

2 Blog Posts / Month

I’ve enjoyed writing this blog, but it’s a killer on Saturdays, and — let’s face it — I often have nothing worthwhile to say.  I’m going to scale back and try to come out with more interesting things to say less often.  I don’t promise that they’ll be interesting, but the bar wasn’t exactly super high to begin with…

Grading Scale: A: 25 weeks; B: 23 weeks; C: 21 weeks; Failure: 20 weeks

It’s gonna be a long year*, but I imagine it’ll fly by.  Hopefully this time next year I’m celebrating by writing a blog post about my hugely successful magic show / yoga / rock concert New Year’s Eve performance extravaganza.

*Statistically, slightly shorter than average, actually.

2014 Year In Review

In a fitting end to the year, I’m revisiting my first post.  In that post, I put together a long-winded list of helpful habits for keeping a New Year’s resolution.  That list boils down to the following advice: identify, write down, and publicize quantifiable goals with definitions for success and failure that can be evaluated at a regular time interval.  Track your progress, and don’t let a single failure derail the whole goal, but instead consider it a part of the learning process.  Finally, know when to give up.

I then proceeded to follow those instructions all year long.  I put my money where my mouth is, and for the first time ever, it paid off.


Summary page from my goal tracking notebook

As you can see, my goals met with mixed success.  I’ll go through them one by one and provide some commentary and a grade, and then I’ll wrap up with some closing remarks.  Here goes!

Weekly Goals

These are the goals that boil down to “do something X times per week.”  I determined successes and failures on a weekly basis; if I failed to complete the goal for a week, that was a complete failure, but I got a clean slate the next week.

Work out 5 Days

47 / 52 weeks.  I only had one week with fewer than 4 workouts (the week I was in Vegas), and I basically didn’t miss a week from July on (including holidays and vacations), so I feel pretty good about this.  The raw numbers say A-, but I’m going to give myself an A on this one.  The key was being flexible enough to treat even relatively easy workouts (20 min on an elliptical) as workouts, without which I would have missed a few weeks due to limited time or desire (in particular on vacations).


Sleep 56 Hours

This went really poorly.  I did this in only 6 weeks this year, and three of those were on vacation (including the last two weeks).  Here we find a good example of the “know when to give up” rule.  I basically decided that I didn’t care as much about sleeping as I did about the things I was doing while I was awake; I still tracked it, but I stopped trying.

F-, though.

Get to Work Early 5 Days

The criterion for success here was either being online by 7:30 AM or in the office by 8:00 AM, all five days in a week.  This is another good example of knowing when to give up; although progress on this goal was initially promising, I decided it wasn’t worth disturbing my girlfriend to get out of bed so early, so I just sort of stopped.  I only succeeded 7 weeks this year before giving up.


One Date

I consider this a pretty strong success.  I got off to a rocky start — for the most part, I managed a date every other week or so — but the whole point was to find someone to go on dates with consistently, which I did (and she’s super great).  Plus, I was still trying really hard during those early periods; as I’ve mentioned before, it was a surprising amount of work to do the whole online dating thing.  I managed 40 weeks with a date this year, plus netted a girlfriend from it.  I’d call that an…


Contact 1 Long-Distance Friend Per Day for Seven Days

I managed 40 weeks contacting a long-distance friend every day.  For at least three of the weeks where I failed, I was actually out of the country and couldn’t contact anyone at all.  Most of the rest were single day failures.  Without going through my records and counting it up I’d guess I talked to someone on at least 330 days this year, with a possible max of about 345 days — that’s a solid 95%.


One Blog Post

I cheated a bit here, because if I wrote something but didn’t post it, I’d count it.  Also, I definitely wrote a blog post on Sunday once and counted it for Saturday.  Either way, I managed to write something in at least 42 weeks this year, which is pretty solid.

I will note that the point was really to get me to start writing fiction again, but I found that writing the blog was more enjoyable (and more readable, probably, since my fiction is terrible).  I give myself a solid B on this — I definitely passed, but it wasn’t excellent work.


Two Hours of Music

I only managed this in about 27 weeks this year, but I put those 27 weeks to good use.  I can play pretty passable guitar, which is particularly impressive since I’ve tried to teach myself at least two or three times prior to this year.  In a bunch of the weeks where I didn’t get two hours in, I still got some time (typically more than an hour), but there were definitely stretches where I didn’t play at all even though I could have.  Given the 50% success rate (low) balanced with the results (fairly good), I’m going to give myself a …


Yearly Goals

< 10% Body Fat

Hahahahahaha!  No.  Not even close.  I gave up on this one after I gave up drinking for Lent and there was no change.

F –

Olympic Triathlon:

I never even really tried for this one.

F –

Dance Lessons:

I didn’t really try for this one either, although maybe I brought it up once with my girlfriend… I think?  I might be making that up.

F –


Join a Band:

I didn’t join a band this year.

F –

Complete 2 Coursera Courses

I completed a Stanford machine learning course on September 1 and a Princeton algorithms course on October 17 — plus I took almost all of an intro to interactive Python course out of Rice, but it ended while I was out of the country so I missed it.

A +

New Job
I didn’t publish this one originally, since I thought it would send a bad message to be like “I CAN HAZ NEW JOB” on the internet while I was still employed, but I did have a goal to get a new job (or leave my current job) by year’s end.  I am happy to report that my last day will be January 7, after which I will be effectively funemployed, but nominally working for myself, for some indeterminate period of time.  I’m looking forward to it!


What has two thumbs and works for himself?

However, because I don’t have a replacement income, and because I’m not quitting until 2015, I give myself a…



I gave myself a passing grade in 7 of 13 goals, which isn’t great, but it’s a whole lot better than the 10% or so that was quoted in my original post.  Plus, I think we can glean some more info from this.

Notice that, of the weekly goals, I passed 5 of 7?  In my original post, I mentioned that one of the worst resolutions you can make is to do something by year’s end, since you just keep putting it off forever and ever until the end of the year sneaks up on you.  That’s certainly what happened to me here.  Even my Coursera courses didn’t get completed until well after the halfway point, and I didn’t get a new job until the very end of the year (and at that I didn’t like… look for a new one at all).  Meanwhile, the weekly goals I gave up on, I made a conscious decision to give up on — every day I recorded that I didn’t get to work early or that I didn’t get 8 hours of sleep — I never recorded that I didn’t take dance lessons or that I didn’t do an Olympic triathlon.  The key takeaway from this for me is that these goals are stupid and need to be reworked so they can be better tracked.

Due to mixed (but well above-average) results, I give myself a passing grade, but due to my poor setup for the yearly goals, it can’t be a high one.  I’ll scrape through the year with a …


Next week, I’ll reveal my goals for 2015!  Given what I’ve learned in 2014, I’ve come up with a pretty robust set of goals for the coming year that I can hopefully track a bit better than some of the ones I made this year.  Get excited…?


I don’t fly well. I’m writing this at an astonishing 34,000 feet in the air – that’s almost seven miles in the sky. We are currently experiencing what is best described as “mild turbulence” – and it is mild – and I’m utterly terrified.   I keep looking out the window, as though to reassure myself that we are, in fact, still asky – like it would be possible for me to discern that we had changed altitude at all at such a preposterous height. In fact, I’m checking to make sure that the ground is still parallel to the plane’s trajectory, and we are not plummeting nose-first to our doom, or – somehow, worse – nose-up, streaking toward the moon.

This is particularly interesting, because I am, by training, a mechanical engineer*. In particular, I took a number of aerospace courses in school, and always dreamed of becoming a rocket scientist.

I was going to attempt to present a coherent blog post (for the first time ever!), but as I’m writing this, our turbulence has passed from mild to medium, the point at which I FREAK OUT. So instead please enjoy the following bullet points.

  • I always think about the scene in … I don’t remember which movie it actually is, so let’s say The Hunt for Red October, where Jack Ryan (probably played by Alec Baldwin) is on some red-eye and the stewardess hands him a pillow and he’s like “No, I can’t sleep due to turbulence,” and the stewardess looks at him like “Well, there’s an SAT word,” and he has to explain to her what turbulence is. What stewardess doesn’t know what turbulence is!?
  • I used to have relatively smooth flights all the time. I have not had one in at least a year, and for years before that the frequency has been decreasing; there is always some patch of rough air, as measured by the fasten seatbelt sign being on or the pilot announcing it. I don’t know if I’ve become more aware of it, or if climate change is causing increased atmospheric energy which has no choice but to punch every plane I’m on, but something in the last few years has changed, and it’s making me worse at flying, since I’m always worried it’s about to get bumpy, and I’m always right.
  • They have suspended cabin service. This is a double-whammy, since it means things are getting rougher, but it also means the cocktail I so desperately need to calm my nerves upon hearing that announcement won’t be coming.
  • We are either over the coast or North America’s largest uninhabited forest. There is nothing outside the window. I have no point of reference. I have no choice but to assume we are taking this one straight to the moon.
  • Did you know that before planes started falling out of the sky in the ‘70s due to microcracks and metal fatigue, they didn’t know that planes could fall out of the sky due to microcracks and metal fatigue? Food for thought.
  • Statistically, flying is one of the safest modes of transportation, both in terms of number of accidents and fatalities. Did you know that statistically, most planets are unable to support life? And yet here we are. Improbable things happen.
  • And while I’m on the “deaths per passenger” issue, can we talk about how much worse it would be to survive a plane crash? I’m not afraid of dying, I’m afraid of being afraid to die, or worse – of living in unending pain. Both of those are uncomfortable. At least in death you feel nothing. All the most comfortable things feel like they’re not even there, like that scene in Community where Troy experiences the room in which room temperature is kept, or being naked.

At this point, drink service has resumed, I have my precious cocktail, and the captain has even turned off his fasten seatbelt sign. We now return you to your regularly scheduled programming.

But that’s the thing – do you know just how heavy a commercial airliner is? Because I do. It’s like … super heavy. Now, I understand the principle of lift – it’s pretty key to aerospace engineering — so I get that moving air quickly over an airfoil results in an upward force on the airfoil. I know that this force is (at least) proportional to the velocity over said airfoil. However, I’m also familiar with Newton’s three laws of motion (but apparently insufficiently so to number them? How embarrassing…), one of which is that an object experiencing no net force will continue to move at its current velocity, with no acceleration. Since we are cruising at constant velocity and altitude, there must be no net force on the aircraft. This means that the lift force we are experiencing is exactly equal to the weight of the aircraft, but it also means that our forward thrust is exactly equal to our drag. And drag is proportional to the square of our velocity, which as mentioned previously is essential to maintain our lift. So basically we need to push hard to move fast through the air in order to stay aloft, but the harder we push, the less return we get on our additional investment due to drag. How we’re in the sky is a complete mystery to me.
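For what it’s worth, the books do balance. Here’s a back-of-the-envelope sketch in Python of the classic lift equation, L = ½ρv²SC_L — every number below is a rough illustrative assumption (approximate cruise density, speed, wing area, and lift coefficient for a generic narrow-body jet), not the spec of any real aircraft:

```python
# Rough sanity check: can the lift equation plausibly carry an airliner?
# All constants are loose assumptions for a generic narrow-body at cruise.

RHO = 0.38   # air density at ~34,000 ft, kg/m^3 (approximate)
V = 230.0    # cruise speed, m/s (~515 mph, approximate)
S = 125.0    # wing area, m^2 (roughly narrow-body class, assumed)
CL = 0.5     # lift coefficient at cruise (assumed)

# Classic lift equation: L = 1/2 * rho * v^2 * S * C_L
lift_newtons = 0.5 * RHO * V**2 * S * CL
lift_tonnes = lift_newtons / 9.81 / 1000  # convert newtons to metric tonnes

print(f"Lift ≈ {lift_tonnes:.0f} tonnes")
```

Sixty-odd tonnes — on the order of a loaded narrow-body’s weight, so the wings and I at least disagree for quantifiable reasons.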

Then there’s the fact that the lift manifests entirely on the wings. The wings are supporting the entire aircraft right now. THE WINGS. Have you seen those things? Those wobbly little spindly things hanging off our plane? Sure, during normal flight they seem peaceful, but hit a tiny pocket of rough air and all of a sudden they’re going “juggada-juggada-juggada-juggada-juggada” all over that foot-tall area where they’re mounted to the body. Don’t worry though, they’re probably riveted there by the most experienced child laborers in China. Mark my slurred words, the next rash of airline disasters will involve whole wings just snapping off the body of the plane and fluttering peacefully to the ground behind the spiraling death plume of the main body and its other wing.

And now you understand why I’m (and I’ll admit it, unfoundedly) terrified of flying. Luckily, we’ve begun our initial descent into the Fort Lauderdale area, which means I’ll soon be safe and sound aboard a cruise ship at sweet, sweet sea level. Those things never have any problems, right?

* Legally I’m not actually allowed to say that. I trained as a mechanical engineer but never took the licensing test, having gone into energy trading after college. Therefore, I think it’s probably truer to say, “I have received mechanical engineering training.”

Privacy Please

Growing up in the ’90s and coming of age in the 2000s, I was bombarded by old fogeys complaining about the lack of privacy in this digital age, with its Facepokes and interweb retailers asking us for credit card numbers. I — and other young people — typically dismissed this as an older generation that didn’t understand the awesome transformative power of the Internet and the fact that, with Amazon, not only can I avoid all human contact by buying my groceries online, I can set up recurring deliveries so I can avoid all online contact, too.


Now the real question is, what do I do with all this milk?

Of course, the old fogeys had some valid points, but for the most part our generation has accepted that to some degree, we are giving up some small amount of our digital privacy in order to reap the huge rewards of the Internet.  However, over the past ten years or so, the amount of privacy we’re giving away has shot up at an alarming rate (whether or not we realized it at the time).

Ten years ago (Feb ’04), Mark Zuckerberg launched a social networking website called thefacebook.  While other social networking sites had existed before it, by 2008 Facebook had overtaken its most popular rival and boasted 100 million users.  While morons continue to post racist diatribes and pictures of illegal activity, the savvy (read: non-moronic) user accepts that the information they post is largely public and can be used against them by, e.g., employers, law enforcement, and the public at large.  That’s a pretty small price to pay for being able to Like your friend in Santa Fe’s recent relationship status change from “married” to “single.”


Followed by several Status Updates™ about chocolate ice cream

Around the same time Facebook achieved ubiquity, Apple changed the world by introducing a mobile device that was — reliably and quickly — connected to the Internet.  Again, other smartphones had existed before, but this one was lightning fast; it could be connected to WiFi, and even its cellular-backed data connectivity was faster than anything on the market.  This ushered in the era of smartphone ubiquity (by 2013, more than 50% of American adults would own a smartphone), but more importantly, it meant that small objects — phones and other handheld devices, cameras, monitoring equipment, house lighting systems, cars, anything — could connect directly to the Internet so long as they had some connective (e.g., cellular) service.  As more and more devices connect to the Internet and we approach the awesome realities of the Internet of Things — we’re at around 16 billion installed devices now, and the number is growing rapidly as wearables and other applications take advantage of these technologies — the privacy we’ve given up has crossed from the digital world into reality.

The same technology that will allow me to video chat with my cat while I’m in Puerto Rico next week also allows the NSA to know that I’m in Puerto Rico.  At least in my experience, the news that our government was spying on us was met with a resounding “meh.”  Just another bit of privacy we’re giving up for the awesome power of being able to correct Siri when you tell her to text your dad that your plane landed and she tries to send a text to Jad that your fame is left-handed.  Of course, there are other ways that these data can be used against you — if put in the wrong hands (as though the NSA are not the wrong hands, amiright?), data about your smartphone usage or the location of your car might alert burglars that you’re not home, or that you were cheating on your spouse, etc.  But that folds neatly into our history of giving up privacy in order to reap huge rewards — and, without really getting into it, I really do believe the rewards are huge.  But the more connected we get, the more real-world privacy we will lose, in ways that we may not expect.

Last week, my cousin came over to my parents’ house for Thanksgiving dinner.  He brought with him a quadcopter drone.  I don’t want to nerd out too much, but this thing was really cool — gyroscopically stabilized, with a swivel-mounted high-def camera (the videos are incredibly stable), and quintessentially a part of the IoT: connected to GPS (it won’t let you fly it around restricted airspaces like airports, and it has an automatic return-to-home feature), and with real-time wireless feeds of the camera sent straight to your smartphone or mobile device.  After he demo’d its flying capabilities, he brought out his laptop to show us some of the videos he had taken.  They were mostly aerial shots of his neighborhood or his friends’ houses; a few pictures he had taken of groups, that kind of thing.

But one of them was, for lack of a better word, somewhat disturbing.  His office is down the street from a rougher part of town, so he had gone out in the parking lot, flown the drone over a few buildings and a couple of blocks down the street, and come across a street corner.  And pretty much immediately you see a bunch of people looking up at the drone, pointing, and running away.  And here’s my cousin, sitting in my parents’ living room, showing us HD video with the faces of a bunch of guys who undoubtedly think they’re being watched by the police and have been caught on camera selling drugs.  I don’t want to use the words “human rights violation,” but if you assume that humans have the right not to be surveilled by random strangers without some sort of regulated oversight, then I guess I’d use those words.*

This problem — and problems like it — will continue to grow with our connectivity and our ability to process it.  Imagine a future where the Amazon delivery drone has cameras; it uses images to make product recommendations for you based on your perceived tastes and preferences.  As it passes by your window it catches a glimpse of your daughter’s room and all of a sudden your Amazon recommendations are all My Little Pony-themed.  Imagine you can stream the feed live so you can watch your package being delivered.  Your neighbors are delivering a birthday present for their eight-year-old, and they’re recording the video so they can have his reaction.  As the drone flies by your house, they — and their eight-year-old — watch and record you having sex.  Congratulations on exposing yourself to a minor!

I’m not saying we should stop the connectivity — far from it — I’m just saying that for the next 80 years, I’ll be inside with the blinds closed.

*Rhetorical; I don’t actually think that we have that right and I’m too lazy to look up what sorts of hoops the po-lice have to jump through to surveil.  I guess my point is, maybe we should.

Daylight Waste of Time

This post is a couple of weeks late, but I feel like it’s important for people to understand just what an enormous waste of time the whole Daylight Saving Time concept is.  It has literally zero good or useful qualities and everything it touches turns to total garbage.


It is the Nic Cage of time tracking.

I’ve read a lot of blog posts (OK, I’ve read a couple (OK, I’ve heard about some)) criticizing the modern practice of DST that begin by saying it made sense when we were an agrarian society and most of the work and production in the country took place on farms.  This argument makes exactly zero sense — farmers are known for getting up at the crack of dawn.  That’s sorta their thing.  Did you know that the crack of dawn doesn’t magically shift forward and backward by an hour every spring and fall?  Same thing with sunset — I’m not making this up, you can look it up.  It’s probably in books somewhere.  Whether a farmer gets up at 4:30 and goes to bed at 9:00 or gets up at 5:30 and goes to bed at 10:00 is completely immaterial to the farmer, who is beholden to the daylight.  He’ll get up no matter when it happens.  And before you say, “Ah, but the [grain | cattle | vegetable] market opens at a certain time each day,” I would remind you that those markets are beholden to the farmers, and therefore also to the daylight, so there’s no reason they wouldn’t open earlier or later as the daylight grows and fades.  Maybe this is why historically, farmers have opposed the practice.

The other main argument for DST — and the reason it was extended in the Bush administration — is that it somehow saves energy.  The idea here is that a large portion of our electricity still goes to lighting the home (accounting for around 14% of residential energy usage and 3.5% of total US energy consumption), and we expend that energy disproportionately while we are awake after sundown.  So the reasoning goes that if we make the sun set an hour later, but keep our schedules the same, we’ll effectively be going to bed an hour earlier, using an hour less energy to light our homes.  This also seems ridiculous to me.  Not only do recent studies suggest that we’re probably not actually saving any energy by doing this, but the whole concept of only doing this for 8 months out of the year makes no sense.  Why not always be on DST?  Or better yet, why not just… move our schedules back an hour?  From now on, we work from 10-6 instead of 9-5.  Problem solved.  Go home.  Keep your lights on or don’t, I don’t care.


A whopping 30% of commercial electricity usage in the Bay Area goes to illuminating people’s idea-bulbs.

Right now you’re thinking, “But Mysterious Internet Authority, what about that weekend where we get an extra hour of sleep?  That’s a pretty great weekend!”  First of all, it’s “Mr. Ious Internet Authority” to you, and secondly, what about that weekend where we get an hour less sleep?  We reap what we sow, and in this case it’s a day of traffic accidents and lost money from sleep-deprived laypeople trying to navigate a world in which the very concept of time has magically skipped over an hour they normally would have spent resting.  The only good that DST ever did for us — making it dark earlier so we could trick-or-treat at 5:30 — has been stolen from us, since fall back was moved out of late October and into November.


Thanks, Obama.

OK — so I’ve “conclusively” “proven” that DST has little-to-no upside.  But what about its downsides?  Check this guy out — it’s basically a list of crazy things that happened because of DST!  Mixed in are hilaaaarious cases of, for example, a terrorist plot foiled by the terrorists failing to know what the time on the bomb meant, along with more innocuous stories of twins where the older one is actually born “after” the younger one.  There are also stories about how one year there were 23 different DST events in Iowa alone, all of which needed to be kept track of for things like train schedules, which cost an estimated $12 million more per year to maintain than if DST had not existed at all.

These problems are neither problems of the past, nor limited to terrorists or Iowa.  For me, professionally, DST is a nightmare.  In my line of work, it is important — actually, it is essential — to be able to store information with what is called a “primary key.”  This key is used to look up data; a key (hahaha) component of a primary key’s effectiveness is its uniqueness, which allows the user to specify a primary key value and return exactly one entry.  This is so key (hahahaha) that the software used to store data requires primary keys to be unique.
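If you’ve never watched a primary key do its job, here’s a toy sketch using Python’s built-in SQLite (the table and values are made up for illustration): a lookup by key returns exactly one row, and the database flat-out refuses to store a second row with the same key.

```python
import sqlite3

# In-memory database with a hypothetical table keyed on city name.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE cities (name TEXT PRIMARY KEY, population INTEGER)")
conn.execute("INSERT INTO cities VALUES ('Baltimore', 622000)")

# A primary-key lookup returns exactly one row.
row = conn.execute(
    "SELECT population FROM cities WHERE name = 'Baltimore'"
).fetchone()
print(row)  # (622000,)

# A second row with the same key is rejected outright.
try:
    conn.execute("INSERT INTO cities VALUES ('Baltimore', 9999)")
except sqlite3.IntegrityError as e:
    print("Rejected:", e)
```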

As a concrete example, let’s say that I’m storing information about what the temperature was in Baltimore at a given time.  The reasonable primary key for this data is time.  Now let’s say I want to know what the temperature was in Baltimore on, say, Sunday, November 2, 2014, at 1:30 AM.  Looks like I’m SOL — there were two 1:30 AMs on November 2.  So now I either need to include an additional piece of information in my key — “was this hour the DST duplicate hour or not,” which will be “not” in 8,759 out of 8,760 entries in that table — or I need to pick a single time zone to put the data in (EST or EDT, rather than EPT) and then keep track of that every time I want to tie that out with another piece of data (this problem is actually largely solved, but my company is so far down the path of not-best-practices that we actually choose to just ignore the extra hour and store everything in EPT).  This leads to a huge number of problems for the company as a whole (how do we treat the extra hour for products that trade on an hourly basis?  Can we tie out our data to the point where we can even have a view on that hour?) and me personally, as I spend probably ten hours every fall fixing things that were written back when time didn’t suddenly and inexplicably duplicate itself — you know, 99.98% of the year — and the fix is almost always to ignore any duplicates in the primary key of the data, which means if there are real duplication errors (e.g., they post two completely different temperatures for Baltimore at 2:30 AM on Sunday, Nov. 2, and we need to figure out which one is correct), we completely miss them.  Then of course there’s the data that gets posted on an hourly level — what do you do when one of those says that the hour is “hour ending 25” or “hour ending 2_2” …
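To make the duplicate hour concrete, here’s a sketch using Python’s zoneinfo module (3.9+), where the `fold` attribute distinguishes the two 1:30 AMs. Converting each to UTC shows they’re genuinely different instants — which is exactly why a naive local timestamp fails as a primary key and a UTC one doesn’t.

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo

eastern = ZoneInfo("America/New_York")

# The same wall-clock time occurred twice on fall-back day, Nov 2, 2014.
first = datetime(2014, 11, 2, 1, 30, tzinfo=eastern, fold=0)   # still EDT (UTC-4)
second = datetime(2014, 11, 2, 1, 30, tzinfo=eastern, fold=1)  # after fall-back, EST (UTC-5)

print(first.astimezone(timezone.utc))   # 2014-11-02 05:30:00+00:00
print(second.astimezone(timezone.utc))  # 2014-11-02 06:30:00+00:00
```

Store 05:30 and 06:30 UTC and the key stays unique; store “1:30 AM” and you’re me, every November.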


I feel you, Pikachu.

So I of course propose that we eliminate DST entirely.  However, if we’re going to look at DST, we might as well look at time zones generally.  I further posit that these, also, are misleading and confusing (though not to the same degree as DST), and should be eliminated in favor of a single universal time (say, a Coordinated Universal Time).  There’s exactly zero reason that 12:00 PM has to occur in the middle of daylight (and in fact, it doesn’t — we already shift that by an hour in DST, and near the borders of time zones by up to two hours), and as already discussed there’s no reason that we have to operate on a 9-5 schedule.  In Virginia, we’re (currently) 5 hours behind UTC; why not go to work on a 2PM-10PM schedule?  We’d wake up when the sun rises at 12:00 PM and go to bed after it sets at 1 AM.  Midnight would be 5AM and noon would be 5PM.  The key is that it makes no difference what time the clock says — the only thing that matters is that everybody agrees on its meaning.  In today’s global marketplace, if the entire world can agree on a calendar*, the entire world should be able to agree on a time.

* It’s surprisingly hard (not to say that it’s actually all that difficult — I googled a bit and didn’t find much, but I expected it to be very easy) to find data on who all uses the Gregorian Calendar, but the wikipedia page on New Year claims the Gregorian calendar is in “worldwide use,” and this page cites the Wiki page on the Gregorian Calendar (which doesn’t currently appear to have a list of countries using it) in its calculation that over 96% of the world population currently observes the Gregorian Calendar.  Of course, the Gregorian Calendar has its own problems — even if we all agree that the best way to keep track of dates is to have 12 months and an extra day every four years (but not … every four years…), why have a month with only 28 days ever?   Why not make every month either 30 or 31 days and have it be 7 / 5 for three years and then 6 / 6 for the leap year?
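The arithmetic on that 7 / 5 proposal does check out, for whatever that’s worth:

```python
# Seven 30-day months plus five 31-day months in a common year,
# six of each in the leap year.
common_year = 7 * 30 + 5 * 31
leap_year = 6 * 30 + 6 * 31
print(common_year, leap_year)  # 365 366
```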

Note: for a (presumably) less biased and more fact-based discussion of the goods and bads of DST, you can check out the wikipedia page (duh), which has a section devoted to this discussion.

Accounting 101

I think I frequently annoy my girlfriend.  Let’s be honest, that’s not really a surprising statement — I think I annoy pretty much everyone.


Frequently by not touching people

As a biologist working to address climate change (and with quite a passion for doing so, I might add), I think I really get on her nerves whenever I try to put myself into the shoes of skeptics.  To be clear, I 100% believe that the earth is warming, that it’s caused (or at least aided) by increased levels of carbon dioxide in the atmosphere, and that those increased levels are manmade.  There is no doubt in my mind that humans are causing the climate to change in ways that may be difficult to predict, except for this: it will have catastrophic consequences for people and ecosystems.

However, I also think that some level of skepticism is essential — blindly accepting science just because it’s called “science” doesn’t make you any smarter than rejecting all science outright; after all, think about the supposedly causal link between vaccines and autism — that was “science” at one point (though quickly and thoroughly disproven — hooray, science!), and if we had looked at the outcome of that one study, collectively thrown our hands in the air and accepted what some “scientist” had to say…


We’d be no better than this.

Obviously, compared to climate change, the anti-vaccine “science” is a wildly different beast — that paper was directly contradicted by dozens of other research papers, and almost immediately a scientific consensus was built that there was no causal link between vaccinations and autism.  That’s how we do science — we look into relationships, we find something compelling, and we generate a theory — a model representing the real world — that says “A is probably caused by B, but it could perhaps be influenced by X, Y or Z.”  Subsequent studies explore the relationship between A and B to confirm what the first study found, and others explore X, Y, and Z to confirm that they are not major influences on the causal relationship between A and B.  Every subsequent study increases the probability that, in real life, B causes A, and we increase the degree to which we believe our models are accurate predictors and explainers of real world outcomes.  This is exactly how the major scientific theories and laws were developed, from gravity to evolution to climate change, with varying degrees of certainty.  There happens to be a pretty high amount of certainty in both the climate change and vaccination-autism models currently employed.

However, I assert that the debate is still important.  One of the really cool things about the way we build our scientific models is that we constantly get to revise them in the face of new evidence; we should always be pushing to find new evidence to revise our models.  When people who don’t know what they’re talking about say things like “Well evolution is just a theory,” and people who do know what they’re talking about then say, “Well, so is gravity!” no one learns anything.  However, directly engaging the uncertainties could potentially lead to a meaningful discussion (although a model based on past experience states that it probably won’t…).  Educating people about the scientific process and drawing comparisons between something they believe in (e.g., gravity) and something they don’t (e.g., evolution) may be a more productive avenue forward: “It is true that evolution is just a theory; however, many of the scientific theories and models that we take as fact today exhibit the same weaknesses that the theory of evolution does.  Consider the theory of gravity; the current model suggests that any mass exerts an attractive force on other mass.  This mechanism has been empirically observed on Earth and can be used to explain the movements of planets in our solar system, stars in our galaxy, and galaxies in our universe; most recently it was instrumental in allowing us to land a probe on a comet traveling at tens of thousands of miles per hour after a journey of over 3 billion miles.  However, one of the weaknesses of this theory is that some of the gravitational movement of the universe is unexplainable due to a lack of visible mass; empirically, it is much more likely that this mass exists and cannot be observed (so-called ‘dark matter’), than that the theory of gravity is non-universal or flat out wrong — but there are still other possible explanations.  
“Similarly, evolution is widely agreed upon to be the most likely manner in which the ecological diversity we see today came into being; however, since we cannot physically go back in time to watch it unfold, we can only say that it is extremely likely — though not certain — that the processes and mechanisms that we see today, including radioactive decay used in carbon dating and random genetic mutation and speciation that we have empirically observed, behaved the same way in the past as they do now.  In the absence of other evidence, this theory works as an excellent model going forward, and has been instrumental in HIV/AIDS research, ecology, and the selective breeding of animals and plants, enabling us to feed a population that is five times higher than it was when the theory was first proposed.  I would encourage you to look for evidence that supports or disproves these assumptions, so that the scientific community can update its models, and we can do even more — after all, the theory has been constantly refined and changed over the last 150 years in light of new evidence, and it was those refinements that enabled those breakthroughs.”


We used to think that time was a cube until Professor Cohle did his seminal work on its true geometry.

However essential skepticism may be, there is no room for flat-out deniers.  Anyone unwilling to engage with the theory at all should be left out of the conversation entirely, and that goes especially for people controlling policy.  This makes it utterly terrifying that the presumptive head of the Senate Environment and Public Works Committee has published a book called The Greatest Hoax: How the Global Warming Conspiracy Threatens Your Future.  Meanwhile, the governor of Florida — a state that will be hit hard by rising sea levels — is on the record as saying he’s “not a scientist” when asked about climate change.  Congratulations, Rick; neither am I.  That’s why I listen to the scientists when they tell me your state will be under water in fifty years.  This is like a chef walking into his five-star restaurant and declaring “I’m not a food safetyist,” then serving raw chicken to the diners.

Dr. Nick

Or just generally anything that this guy does.

Except it’s so much worse than that.  Because when a chef gives people salmonella, he’s held responsible.  His restaurant is closed, and he can be charged with criminal negligence — or worse, if he did it on purpose.  If people die, he can be charged with manslaughter or homicide.  When Florida floods, Governor Rick Scott will be remembered for his presidential run or his time in the Senate.  He may be long dead by the time the true damage is done.  In short, he won’t be held accountable for his actions.

So here’s my solution: let’s hold climate change deniers (and for that matter, anti-vaccinators) accountable for their deeds.  Every parent whose child dies from a vaccine-preventable disease should be tried and convicted of infanticide.  Climate change deniers making policy should be held accountable for the future destruction of life and property that they will have caused.  A White House paper estimates that a global temperature rise of just three degrees Celsius would incur a penalty of 0.9% of global GDP ($74 trillion in 2013, or over $650 billion per year).  The cost in human lives might be large as well — one estimate suggests as many as 300,000 deaths per year due to malnutrition and severe weather, and those deaths will come disproportionately from developing nations that will have the hardest time adapting to the changing climate.  World War II cost an estimated $1.3 trillion and killed an estimated 50 million people — while the cost in human lives might be smaller in the short term for climate change, we would hit that damage total in only two years at current GDP levels, and since WWII’s $1.3 trillion was spread over six years (roughly $217 billion per year), that makes climate change about three times more destructive than a perpetually waged World War II.  If these predictions come to fruition, any policy makers who prevent climate action should have their names permanently smeared by history as the mass murderers and belligerents that they are, with their pictures next to Hitler’s on history’s Wall of Shame.  Schoolchildren should be reading about them by name in their boiling hot schoolrooms for the next millennium; their families should have to live with the shame of what they did for generations.  And they should have no trouble agreeing to this plan — after all, if what they believe is true and climate change isn’t manmade or isn’t even happening, there won’t be any deaths or destruction, and they’ll go down as lone visionaries bucking the conventional wisdom and saving us all the hassle of living in a cold world.
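Since the numbers in that comparison are doing a lot of work, here’s a quick back-of-envelope sanity check in Python.  It uses only the rough estimates cited above (0.9% of GDP, $74 trillion, $1.3 trillion, six years of war), so treat the output as exactly as rough as its inputs:

```python
# Back-of-envelope check of the climate-damage vs. WWII comparison.
# All figures are the rough estimates cited in the text, not precise data.
global_gdp_2013 = 74e12      # ~$74 trillion global GDP in 2013
damage_fraction = 0.009      # ~0.9% of GDP per year at +3 °C

annual_damage = global_gdp_2013 * damage_fraction
print(f"Annual climate damage: ${annual_damage / 1e9:.0f} billion")  # ≈ $666 billion

ww2_cost = 1.3e12            # estimated total cost of WWII
years_to_match = ww2_cost / annual_damage
print(f"Years to match WWII's total cost: {years_to_match:.1f}")  # ≈ 2

# WWII ran roughly six years, so its average yearly cost was:
ww2_per_year = ww2_cost / 6  # ≈ $217 billion/year
print(f"Damage ratio vs. WWII: {annual_damage / ww2_per_year:.1f}x")  # ≈ 3x
```

So the “two years” and “three times” claims hold up, given the estimates they’re built on.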