The Only Known Instance of Voter Fraud

This year, for the first major election in my history as a voter here, the state of Virginia conducted elections under two controversial legislative measures aimed at preventing electoral fraud (a problem that is, by the way, almost 50 times less likely to occur than you, personally, being struck by lightning this year).  The first requires voters to display photo identification at their polling place to confirm their identity.  This seems pretty straightforward; you are required to prove who you are so you don’t vote under someone else’s name.  However, opponents of the law claim that it disproportionately affects the poor (and, due to demographics in the US, this in turn disproportionately affects minorities), acting as a potentially prohibitive poll tax due to the high prices of photo IDs, especially in cases where a birth certificate is unavailable.  This law, typically championed by conservatives, could therefore be seen as a way to suppress minority turnout in elections, which is especially pernicious considering minorities tend to vote for liberal candidates.  In fact, it appears voter ID laws may have reduced voter turnout in the election this week, which saw conservatives gain a solid majority in the Senate.

The second law has been around much longer, and prevents voters from registering to vote on the day of the election (in the state of VA, you must be registered 22 days prior to the polling date).  Here again the logic seems straightforward; if you make people register prior to the election, it gives you plenty of time to confirm that they are registered in only one place and prevents people from registering at multiple polling stations and casting multiple ballots.  However, proponents of same-day registration have pointed out that it dramatically increases voter turnout and that, contrary to what its opponents claim, it is not a burden on poll workers.  Movements against the practice are again often considered another method that conservatives employ to restrict voter turnout in key demographics that typically vote for liberal candidates.

Below is the very “real” story of how a member of a key conservative demographic, white upper-middle-class males, was disenfranchised in a battleground state by a measure aimed at solving the non-existent problem of voter fraud, a measure that, ironically, led him to commit voter fraud.  Feel free to share it with your local (likely Republican) senator or congressperson.


Like every good American, I checked my voter registration long before the election, in mid-October.  This was well in advance of the 22 days required by the great battleground state of Virginia.  I found what I had expected: I was still registered to vote.  I checked that off of my list of to-dos and continued to monitor election news and events as a well-informed voter.

Fast forward a month, to this past Monday evening (November 3, 2014).  In planning to perform my civic duty the following day, I checked my polling location.  I had moved about a mile and a half down the road in July, so I assumed it wouldn’t be the same location it was the last time I had voted, but I was surprised to see that it was.  Being just the type of hard-working, no-nonsense white, upper-middle-class man that built this country, I investigated further.  I found that, while I was registered to vote, I was registered under my old address.

“No trouble!” thought I, “I’ll just change my registration now, or perhaps tomorrow at my new polling station.”  I checked the registration deadline and found that it had long ago passed.  I thought that perhaps my polling station would be the same for my new address as for my old one, such that I needn’t re-register.  I was disappointed to find that my new address did indeed have a different polling location.  I was stuck.  In order to perform my civic duty and exercise my God-given right to vote — in one of the tightest Senate races in the country, only decided yesterday — I would have to do something that runs completely counter to the American ideals of freedom and democracy: commit voter fraud.

In the end, my duty to contribute to the governing of this great nation trumped my law-abiding nature.  The ballots at the two polling locations were identical; I live in the same county and the same district that I moved from, so I was not misrepresenting my eligibility to vote for any candidates.  The poll workers were unfazed by the different address on my driver’s license; a small amount of paperwork indicating that I currently lived at my old address was all it took.

In the end, I was able to participate in this election.  But at what cost to its integrity?  To my own integrity, as a man?  Nay, at what cost to the integrity of this great nation?  Help end voter fraud — support same-day registration in Virginia and in all states.

Duke Sucks. Go Duke!

My company recruits in the fall, and I typically get to spend some time traveling to schools or meeting recruits in their senior years at top-notch east coast academic institutions.  As part of the recruiting schtick, I tell them that I graduated from Duke with a mechanical engineering degree to illustrate that, at our company, we’re not really looking for people with any particular background in finance or energy, but instead we want people who are good with numbers and like to solve problems.  Frequently, people will ask me how I enjoyed Duke or what I think of it.  That’s a simple question with a complicated answer.

I’m a firm believer that, in the end, it’s the people who you go to school with (or work with — another part of my schtick) that ultimately shape your experience, and that at any sizable institution you’ll be able to find a niche and connect with people who share similar values and facilitate whatever personal growth you seek.  Whether you’re at a 60,000 person state school looking to spend your evenings expanding the horizons of homemade wearable tech or at an Ivy hoping to spend weekends smoking pot and like, really finding yourself, man, you can find people to do it with, and you and they will determine the quality of your educational experience.  Duke was certainly no exception, and my friends from Duke are some of the smartest, most self-aware, humble, and all around best people that I know.

Some of them are crooks, though.

However, my experience with the general culture and population of the university was different.  While there were at least as many decent people minding their own business on campus, the group of students subscribing to the “work hard, play hard” ethos was the loudest and most visible subculture; although only ~35% of students at Duke are involved in Greek life*, the social structure revolved around the typically awful parties and events thrown in their “dormitory sections” (fraternities at Duke are housed in sections of dormitories, rather than in houses) or at the hilariously awful local club, Shooters II.  It frequently seemed like the only options on a Friday night were to head to some dormitory common room with a sticky floor and cardboard covering the lights, where the sweet, bready smell of stale beer permeated the residence halls and people held boring conversations over red Solo cups, or to venture out to Shooters and try to grind on a stranger (a stranger who would, of course, end up doing much better than me on our econ test that Monday).  Needless to say, this was not my jam.

My jam, of course, is raspberry.

The fact that I didn’t enjoy the particular social scene isn’t so much a reflection on Duke as it is a reflection on me; I’m relatively awkward, and I’m terrible at having a good time.  What really soured my perception of the institution was the pervasive culture of elitism.  Those section parties I mentioned earlier?  They get cleaned up by the cleaning staff the next day, who before 8 AM have to come in and scrub the beer-covered walls, mop the floors, and throw away the empty cases and cans of crappy beer that a bunch of thankless 20-year-olds, sleeping off their revelry from the night before, left for them to deal with.  This sort of elitism isn’t limited to the selfish acts of a few kids; it’s widespread in the institution.  The Great Coach K actually gave a speech before a game against UNC that can be summed up as, “You’re better than people because you go to Duke.”  Not, “Our basketball team is going to win,” or even, “Our basketball team is better than other basketball teams,” but “Other people are inferior to you.”  Who says that — and more to the point, who cheers for it?  Duke students do.

While certainly a minority, the number of people who went to Duke because it was the best school they got into — but not the best school they applied to — was high enough that the Ivy chip on their shoulders contributed to a defensive sense of elitism, perhaps best exemplified by the pervasive “D-U-K-E” / “DDMF” … chant… thing … that freshmen learn on their first drunken bus ride back from west campus.  The funny truth is, Duke really isn’t that special as an academic institution.  Sure, it’s consistently ranked as a top-ten undergraduate institution, but those rankings are based on highly biased criteria like peer perceptions, admissions selectivity, and alumni giving, rather than any measured academic value.  Meanwhile, not a single non-professional graduate program ranks in the top nine (we have three at #10), and the engineering program barely cracks the top thirty.

The truth is, whenever I tell someone I went to Duke, I get one of two reactions.  I either hear, “Oh, you must be so smart!” and feel the need to correct them (“Actually, pretty much none of the people I went to engineering school with are fit to be engineers”), or I see them immediately start treating me as though I’m some sort of elitist, and I can’t blame them.

On the other hand, most people know that I’m a pretty big Duke athletics fan.  I frequently issue disparaging comments about inferior institutions, and almost always immediately follow them up with a quick “Go Duke!”

Carolina sucks.

How can I hypocritically claim to chafe against the elitism of the institution while simultaneously displaying it, you ask?  To start with, Duke does athletics pretty well.  That elitist coach I mentioned earlier?  He’s really good at coaching basketball; while it’s not necessarily great sportsmanship to throw up the #1 sign and sing “We Are the Champions” on the plane ride back from winning an NCAA title game, the claim is at least technically true.

But the thing that I really like about Duke’s athletics is that it unites the campus in a way that you don’t often otherwise experience.  Whether you’re in the most exclusive sorority or you’re a founding member of the D&D club or you’re a math professor, everyone is on the same side and rooting for the same thing.  Basketball games were one of the only places where people would leave their own petty elitisms behind and join a group that was larger than themselves; where nobody questions anyone’s decisions to sleep outside on a sidewalk for three nights in sub-freezing temperatures in order to watch Kyle Singler or Jabari Parker dunk on next year’s NBA rookie class.  And nothing, nothing, will ever beat the feeling on campus the week after watching Gordon Hayward’s would-be buzzer beater rim out in the 2010 championship game.  I wouldn’t trade it for anything.

[Photo: Duke celebrates its victory over Butler in the NCAA national championship game in Indianapolis.]
No caption necessary.

Go Duke!


*I single out Greek life here not because all Greek life at Duke is “bad” or “a problem,” but because I’m lazy, and its loudest, most visible subculture is the same loudest, most visible subculture endemic to the institution.

The Uncanny Valley of American History

Note – the following was composed at an altitude of 35,000 feet by a man too cheap to purchase in-flight WiFi. It makes no attempt at factual accuracy, presents no graphical illustrations of its ridiculous claims, and claims no informational purpose whatsoever.


Most (all two) of my readers are probably already familiar with the “uncanny valley,” a concept first proposed by some guy, as I recall, to describe the strange dip in human comfort levels with robots as a function of those robots’ similarity to the human form. Put simply, we humans are comfortable with a robot that is sufficiently inhuman – an R2-D2 or a C-3PO – because we recognize that it is non-human due to its shiny metal exterior, and we treat it as human only in the same way that we anthropomorphize our pets. On the other hand, a robot that is sufficiently human – the Replicants from Blade Runner or the Michael Fassbender / that-guy-who-played-Bilbo-in-Lord-of-the-Rings androids from the Alien franchise (where the inclusion of Prometheus plays fast and loose with the terms “Alien” and “franchise”) – doesn’t bother us, because it looks, acts, and feels like one of us (here the extras chant “one of us” in creepy unison). It’s specifically when something looks like one of us, but doesn’t quite get something right – a motion that’s too jerky, or unseeing, dead eyes, perhaps – that we become uncomfortable. There is something wrong, but we just can’t quite put our finger on what it is. It’s like the human-skin-mask-wearing villain of our own personal horror movie.

Ladies and gentlemen (read: “lady and gentleman”), I would like to introduce you to the human-skin-mask-wearing villain of American history: 1870-1910.

Think about it – if you could envision a graph of your own personal familiarity with American history, starting in, say, the 1750s, it’d probably start pretty low, hit the Revolution and spike, stay pretty high through the 1830s (the War of 1812, Andrew Jackson and the Trail of Tears), dip a bit, and then spike again from 1850-1865, dropping abruptly after the end of the Civil War and Lincoln’s subsequent assassination. Then it rises again come WWI; the ’20s were the Roaring Twenties; the business of America was business! The ’30s had the Great Depression, the ’40s had The War, and the rest your parents lived through.

What the hell happened between 1870 and 1910? As far as I know, literally only two things: Reconstruction and Populism. I can’t tie a single event to either of those things, except maybe … something about the gold standard? I don’t know if that was abolished around the turn of the century or if it happened 30 years later … or 30 years prior, for that matter. And don’t get me started on Reconstruction; I grew up and took all of my history classes in the South, and all anyone could say about Reconstruction was “there were carpetbagging northerners!” It is a fact that I know the derivation of the term carpetbagger, but I couldn’t tell you why they were bad, or what they did.

But I think there’s more to it than just “it was a boring-ass time, and we didn’t do a whole lot as a country for several decades.” Really, it gets down to the fact that this era saw the rise of a modern industrial, imperialist America … but it wasn’t quite the industrial, imperialist America of today.

Think about it – antebellum American society is sufficiently foreign to us that we immediately recognize it as an Other, something that is distinctly different from ourselves. The slave-owning, locally-governed federal republic of the pre-war era, where the most important legislative chamber was the House (the House, for God’s sake!) and presidential hopefuls could come from the backwoods of Kentucky (Kentucky, for God’s sake!) is the R2-D2 of American history. There are no photographs, just paintings, and the subjects of those paintings are wearing strange garb that is immediately recognizable as old-fashioned. You know no one who lived through that time, and in all likelihood you know no one who knew someone who lived through that time. It is safely relegated to The Past.

The last hundred years, meanwhile, are readily identifiable in today’s world. They exist in living memory; chances are, your grandparents were born before or during the Depression; their parents fought World War I, and they know that history as well as you know your parents’. The society of today is firmly rooted in the progress made over the last hundred years; the flapper society that saw women get the vote is echoed in the ’60s and ’70s with the sexual revolution and again in the ongoing fight for gender equality in the workplace, and the excesses of Wall Street that led to the stock market crash of 1929 and ultimately the Great Depression are alive and well today, as evidenced most recently and poignantly by the 2008 subprime mortgage crisis. The advent of photojournalism and the rise of modern mass media provide meticulous documentation of this era; color footage of kamikaze pilots hitting American warships made of steel could, aside from a certain level of graininess and a lack of jet engines, have been filmed in the Gulf today. The players of the last 100 years of American history are indistinguishable from us; they are the Replicants living among us, and we are comfortable with them. The last 100 years is Now.

The intervening period (postbellum?) is a strange time — a relative unknown wherein is found the seed of modern society. This is the era when the electric light goes mainstream; when Tesla and Edison duke it out in the war between AC and DC and electrocute elephants at public spectacles. This is an era when the railroad hits its full stride, the continent is suddenly and inextricably linked together, and the wilderness is conquered. Think about the scene in Lincoln where Lincoln goes into the basement of the White House (or whatever) and talks to the telegraph operator. The 1910s have telephones; the antebellum period has mail carriages. This era has telegraphs – a crude approximation that clearly leads to the technology available today, but one sufficiently different that it is utterly foreign to the average person in the modern era (although admittedly it was still being used in the American Navy until recently, I think?). Similarly, the antebellum period has covered wagons; the 1910s have the automobile and the airplane. This period has enormous steam engines powering their way across the country, by river or by rail, as part of everyday life; something we are now sufficiently divorced from to find comfortable.

From a political standpoint, this period gives rise to the modern, centrally-empowered federal government of today, but it doesn’t really come to fruition until the trust-busting and federal regulations of the early 20th Century, which of course reach their height in FDR’s New Deal. Antebellum America is a collection of states – “the United States are going to war over the issue of slavery.” Modern America is a nation composed of states – “the United States is going to war in Europe.” The America of this interstitial period is a murky grey area with respect to the issues of federal versus state power; an America that will see the rise of nationalism and jingoism and the ultimate triumph of the Red, White and Blue in the early 20th Century, complete with Sousa marches to back it up.

The people of the era are perhaps the most obvious residents of the uncanny valley, captured in a nascent photographic medium that distorts colors, with details that have faded over a century and a half; the etiquette for photography had not yet evolved to produce the “say cheese,” smiley happy photos of today, and subjects often look serious or wooden, with cold eyes staring out through the years. Auditory recordings are similar; the newborn phonograph produces a higher-pitched, tinny tone, complete with a voice speaking what is unmistakably American English, but in a slightly strange accent, with an odd, formal vocabulary and strange cadence. The Civil War brings fashion out of the tailored age of the antebellum era and into a mass-produced, industrially sized and supplied era, but the materials are the same wools and cottons used before the war, rather than the machine-picked, clean cottons of today, and they completely lack artificial fibers, which wouldn’t be invented until the early 20th Century. This is a people that is discernibly real – we can see and hear them – but we can’t quite put ourselves in their shoes; we can’t genuinely identify with them, and that makes us uncomfortable. There is something distinctly uncanny about our encounters with them.

Perhaps all that’s missing is a light to shine on this period of American history – perhaps it’s precisely the lack of knowledge of this time that leads to this conclusion. Maybe if we get to know these people and their deeds, study their likenesses and get to know their voices, we’ll look at them and find within them the essence of our own kind. More likely, though, the uncanny valley will spread into the future; the next generation will view the deeds of our grandparents as uncomfortably foreign, and eventually even our time, our Facebook posts and our newspaper clippings, will seem queer and awkward for not being in 3D and set on Mars. But if that’s the case, maybe eventually we, too, will be relegated to the Past. And maybe that’s the most anyone could ask for.

XBox (Turn Me) On

All my life I’ve played video games.  I remember when we got the original Nintendo (which, thanks to a friend of mine, I still own a version of); I remember being upset when I realized that the onset of the Super Nintendo era meant that they would stop making NES Mega Man games, but falling in love with the SNES’s classic (and maybe best game of all time) Zelda installment, A Link to the Past.  I remember thinking a Zelda game could never work on one of the fancy new 3D consoles (“How could they possibly get around the top-down view from the old games?”), and then being blown away by Ocarina of Time.  I remember thinking that Sony’s PlayStation gamble would never be able to compete with fierce Nintendo and Sega brand loyalty, and that XBox’s entry into the market in the following generation was a misguided joke.  Then the Dreamcast bombed, my GameCube couldn’t play about half the games on the market, and Halo came out… and within a few years, I bought an XBox.  By the time the next generation rolled around, I was hip to the times; XBox and PlayStation were here to stay, and Nintendo was relying on cheap consoles, cheap games, and novel gimmicks to capture the family gaming market.  I bought an XBox 360 long before I bought a Wii, and to date I have about 10 times as many XBox games as Wii games, maybe more.

With the next generation of consoles out, I knew I would have to make some serious decisions.  History had taught me that brand loyalty means nothing; just because I have a 360 now doesn’t mean I should get an XBox One.  Moreover, the new XBox isn’t backwards compatible, so my impressive library of 360 games wouldn’t factor into my decision.  Initial reports on specs for the new platforms suggested that the PS4 would outstrip the XBox One in most categories; plus, the XBox system would be far less flexible, requiring users to maintain an expensive XBox Live account and boasting a litany of other offenses large and small.  On top of all of that, it would cost more.  In the end, I was thinking there was a good chance I’d finally make the switch that I had failed to make years ago with the GameCube.

[Image: Decision]
Unfortunately, I had none of these things.

In the end, my decision was made for me — my friend (the same one who sold me his NES) decided to go with the XBox option; this in spite of the fact that he owned a PS3 (and also a 360 — he gamed real hard).  Since a lot of the fun of these games lies in their multiplayer options, and to date no cross-platform multiplayer options exist, there was basically no option but to follow suit.  I’ve had my XBox One out of the box for about 24 hours now, and here’s what I’ve discovered.

The Good

So far, there are a lot of really cool things about the XBox.  In many cases, these apply to both consoles; if you’re looking for an in-depth breakdown of which is better at what, this seems to do a pretty nice job.  But as for the things that I like about my new XBox One so far, here they are, in no particular order:

Graphics

The graphics are hands down better than the 360’s.  I didn’t really think this would be a huge selling point, but it turns out it is.  Apparently, the PS4 has better graphics than the XBox One, but the difference isn’t that noticeable unless they’re side by side.  Anyway, I actually think that the gaming industry will be at a crossroads for the next generation of consoles — they’re rapidly approaching the uncanny valley, and I think the industry is going to have to pour a ton of money into elevating their graphics to the point of photorealism in order to avoid the sickening sensation of seeing a perfectly-crafted computer-generated face, then seeing its hair fail to detect its own skin and pass seamlessly through it.

Kinect

I didn’t have a Kinect with my 360, but I went ahead and spent the extra money to get one with my new XBox One — we’ll get into why later.  So far, the ability to issue voice commands has been extremely helpful, and I intend to continue to use it.  An obvious application here is pausing Netflix and other video apps; one of my biggest pet peeves about the old system was how hard it was to judge the timing on the menu — how often did you hit play, wait to see if the controller menu would disappear, then click “B” to get it to go away, only to find you were a second too late and had accidentally exited the movie?  Or worse — bump the controller to the ground and all hell breaks loose.  That’s a thing of the past, now!

Integration

By far the coolest thing is that you can use your XBox One as an entertainment console — you can plug your satellite or cable into the XBox and use it to switch between TV and games and apps without having to change your input.  I actually had a pretty complicated setup involving an audio receiver, so I didn’t think this would be that useful, but since 95% of my use is either the XBox or the TV, now I don’t even have to switch my receiver.  I was a bit worried when I ran through the setup steps that the receiver wouldn’t work (it kept trying to issue volume commands to my TV, for instance, which has zero practical effect), but in looking through the settings it turns out you can redirect audio commands to a receiver.  I had a few hiccups during setup (it was unable to automatically detect my cable box type), but once I manually specified my hardware, it had no trouble with the rest; it seems like they have a pretty rich library of integrations.  There’s also the ability to basically picture-in-picture an app — including TV, so today I watched the Duke game while playing Shadow of Mordor.  Go Devils!

As an added bonus, if you get the Kinect, you can actually issue TV commands verbally, and switch between TV and games with verbal commands too.  To top it off, you can even turn the entire system on and off with voice commands.  I just took a video of it, but I can’t upload videos to my blog unless I buy a $100 upgrade (stupid WordPress), so that’s not going to make its way up here.  The important thing is that it turns on my audio receiver and my TV, just by me saying “XBox On.”  But don’t take my word for it — here’s meth kingpin Jesse Pinkman shilling for the man.

Exclusive Games

This is actually where brand loyalty to XBox (and Nintendo in the past) really matters; I want to play the next Halo game when it comes out.  I have no attachment to any PS games, and none have been sufficiently intriguing for me to even consider getting a new console just to play them.  XBox exclusive games are a pretty small fraction of the games I’ll play, but I’m excited to know I’ll be able to play the ones I really want to.

The Bad

Of course, knowing me, there’s always plenty to gripe about. None of these things is a deal breaker, but in my opinion they’ll need to be dealt with if XBox wants to steal away any market share from PlayStation in the long run.

Menu Layout

The new menu is — at least for now — totally un-navigable.  It’s just a huge jumble of stuff; it seems like the most recent things you’ve viewed show up on the main page, and anything important you can “pin” to the far-left page, but they’ve severely reduced the navigable content and thrown all apps under a generic “apps” tag.  It’s also almost impossible to navigate with only a controller; it’s really built for use with the Kinect, which is why I ended up getting one.  Speaking of which…

Kinect

It’s pretty cool as a gimmick, but the Kinect has a long way to go.  Its speech recognition is at best fair and at worst barely usable; it definitely gets things wrong from time to time.  I’ve also heard that its facial recognition is fairly poor — it signs you in automatically when it sees you, but apparently if you have two people with accounts on the same system who look even remotely similar, that’s a deal breaker.  (This is anecdotal; friends have complained of this.  I’ve not had any issues.)  Where it’s really lacking is in its gesture controls.  They’re basically unusable: having to wave your hands out in front of you is tiring on the arms; the gesture recognition software is either a little too jumpy, picking up gestures where there are none, or not perceptive enough, failing to interpret gestures unless they’re performed perfectly; and the gesture interface has unintuitive and uncomfortable commands (you extend your arm to click something, but frequently it doesn’t register that you’ve begun extending until your arm is already fully extended, and then there’s no more extending you can do, so you just… sit there…).  Altogether, the gesturing is not ready for production and primarily serves as a distracting gimmick.

The other shortfall in this area is Kinect-integrated games.  Assuming the technology is where they say it is, this should be an incredibly powerful tool for developers to exploit.  Unfortunately, I think the fact that the Kinect is not a guaranteed feature of the XBox makes it difficult to design around the assumption that there will be one — if you make it a central feature of the game, you’re really just limiting your audience.  Furthermore, given the above list of grievances, it’s definitely possible the technology just isn’t where they claim.  Either way, the Kinect is one killer app away from being something really special; until then, it’s just not there.

Apps

There doesn’t seem to be an HBOGO app? Thank God I finished The Wire last weekend.

The Ugly

All in all, I have no regrets about my recent upgrade.  I’m excited to start playing through some of the new titles, and I’m cautiously optimistic that Microsoft will fix some of my gripes above in the coming year or two.  Bottom line?  Now that I’ve finished my second Coursera course for the year, I’m definitely going to be wasting a ton of time on my new toy.  Don’t look for CCM.com to be ready for launch any time soon…

Yankee Doodle

I think it’s finally happened — I’ve completely run out of things to say.  I have some ideas for future topics, but I don’t want to write about any of them now (some because I’m saving them for more topical release dates, some because even I think they’re boring).  I haven’t done anything particularly interesting recently, and I — somehow — haven’t heard or seen or read anything that’s really incensed me (or possibly everything I’ve read and seen and heard has incensed me, but I’ve become too numb to the constant incense).

My sense of smell was never that good either.

I briefly contemplated doing a listicle, but then I intellectually belittled myself for even knowing the “word.”  Plus, ClickHole has already won the listicle game anyway, so it’s pointless to even try.  I could post cat videos, but Lana is surprisingly unphotogenic for the world’s most adorable bear.

[Photo of Lana.]
Seriously, look how grainy she is.

I could just go through pictures on my phone and post them, but nobody wants to see that many pictures of my junk — trust me, I have the cease and desist order to prove it.

So instead, I’ll ramble a bit about something that I’ve always had a special attachment to that’s coming up.  I’m talking, of course, about the anniversary of October 17… 1777.

[Painting: Surrender of General Burgoyne.]
To be fair, I would surrender too if you pointed a cannon at me from that range.

This date marks the close of the Saratoga Campaign during the Revolutionary War; specifically, when General Gates accepted the surrender of the dastardly British commander, General Whatsisface McNobodyCaresBecauseHeLost — er, I mean, John Burgoyne — and the 6,000 troops under him, while the American camp played Yankee Doodle, which, let’s be honest here, is awesome.

Why do I care about this, you ask?  Because this victory was absolutely pivotal to the American cause during the war.  Prior to the battles at Saratoga in the summer and fall of ’77, the Americans had had little success prosecuting their war.  Although Washington managed to drive the British from Boston in early ’76, he turned around and lost New York later that year, and Philadelphia fell to the British in the fall of ’77.  The successes of the war to that point, including Boston, were due primarily to smuggled French materiel; in fact, the French government (largely inspired by their embarrassment during the Seven Years’ War) continued to supply the Americans with gunpowder and other resources throughout the early part of the war and into the Saratoga campaign.  Without these French resources, the rebellion would have foundered before it really had a chance to take off.  Still, the French were reluctant to declare all-out war — which would prove necessary to defeat the British, since the Americans had no Navy to speak of and would ultimately rely on the French Navy to blockade the Chesapeake, cutting off the British retreat and trapping them at Yorktown, leading to the final surrender of the war.  (Incidentally, the customary white flag appeared four years to the day after the surrender at Saratoga.)

Coming out of these embarrassing defeats in New York and Philadelphia, the French were hesitant to support what might be a losing American cause, especially since they had a relatively untested Navy and the war would be expensive to support.  Ultimately, the victory at Saratoga proved that the fledgling nation could stand up to the big bad Brits, and the French officially recognized the United States as an independent nation in early ’78, joining the war officially in March; the rest, as they say, is history.

[Image]
Pictured here as a Civil War soldier for some reason.

Another key facet of this story is its effect on Washington, who, until Saratoga, had little to show for over a year’s work.  Washington’s legacy as a military commander is something of a mixed bag — on the one hand, he commanded and trained the Continental Army, which ultimately defeated the largest British force ever deployed outside of Europe at the time, but on the other, he had very few military victories in his career — principally Boston, Yorktown, and Saratoga, although he was not present to command at the last.  To his credit, after losing New York, he avoided large, pitched battles with what was a pretty meager army, but just before the victories at Saratoga in the summer and fall of ’77, he was coming off of what may have been his worst year as Commander-in-Chief — such a poor year, in fact, that Congress was threatening to remove him from command.  The surrender at Saratoga on October 17, 1777 came at just the right time to reinforce his credentials, and for better or worse, Congress decided to stick with him, through French involvement all the way to the surrender at Yorktown.  (Ed. note: This is actually a somewhat simplified and rosy view of how everything went down, but it helps my narrative.)

So this Friday, think about what happened 237 years ago on a field with a cannon pointed at some British guy’s junk and whistle Yankee Doodle to yourself.  I know I will.

MONSTER MADNESS

Guys, Ebola is terrifying.  Like, super duper 100% incredibly terrifying.  Like, every time I hear about it on the news I think about every scene in every horror movie ever where everyone is hanging out minding their own business and in the background the news is on and it’s like “And today, another reported case of what is being called ‘Simian Flu’ was reported in Paris.  It appears to have arrived on a commercial jet from a generic third world country.  Officials have implemented a quarantine and assure us there is no cause for alarm about this mysterious disease. In other news…” and then they go on to like the war that everyone is actually worried about or some mindless fluff about how Kim Kardashian lost her Skechers endorsement deal after she had her kid or something, and the audience is like “OMG they only gave it three sentences in that news report like it wasn’t important at all BUT I BET SIMIAN FLU IS THE ZOMBIE VIRUS WHICH I KNOW BECAUSE I JUST PAID MONEY TO WATCH A ZOMBIE MOVIE,” and then the main title credits roll, and in the background it’s like headlines about Simian Flu cases reaching 1 x 10^X or something and diagrams of transmission vectors and a globe with a big, growing red spot labeled “INFECTED ZONE” or whatever, and then, before you know it, 28 Days Later the world is a mass of shambling animated corpses, and there’s like 3 people left on earth, and one of them has an axe for some reason.  And that’s exactly what’s happening right now — we’re so preoccupied with the brewing Cold War II and the ongoing Middle Eastern War Part CXII that when we hear about Ebola in west Africa we’re like, “Eh, we’ve got bigger fish to fry” and then we move on with our lives to worry about other things.  And as cool as it sounds to be the one dude with the axe, remember that when this whole Ebola thing / zombie movie plays out, the odds of you being that dude are about 1 in 7.2 billion; the odds of you being a corpse are … considerably better.

Of course there are cultural factors in play, and although there have been many suspected cases of Ebola in the US, it turns out that the US is actually probably well prepared to stop any further outbreaks, due to higher levels of education, materiel, and preparedness than the countries in which it usually breaks out (as shown by the fact that only one of those suspected patients actually had Ebola (although he was admittedly discharged from the hospital… again, terrifying)).  We’re also aided by the fact that patients are not contagious during the incubation period, which means we don’t have to worry about tracking down all contacts in a 3-week window prior to hospitalization or death.

So, all in all, this probably won’t be the end of the world — at least, not the developed world — but it still has many parallels to the classic horror movie virus transmission trope, which got me thinking about the fact that most of those movie ghouls and monsters are ancient abstractions of things that were feared and poorly understood — including diseases.  Here are some of the real-world explanations for classic mythical monsters, in no particular order:

Dragons

According to David Jones in “An Instinct for Dragons” (as summarized here), the dragon is likely an ancient abstraction of the three most common predators of our tree-dwelling ancestors: the cat, the snake, and the raptor (and I would argue that you could add fire to the list).  This is supported by the dragon’s universal cultural significance (from the far East to Europe to the Americas), which suggests an origin far older than any specific culture, and indeed older than the diaspora of early man from eastern Africa across the globe.  The dragon’s flying nature and sharp claws (representing the raptor) may suggest even older roots; it’s easy to imagine a time when large cats or venomous snakes (or even just very large ones) posed a valid threat to wild Man, but no predatory bird could successfully hunt a human more than a few years old.  However, evidence suggests that raptors probably fed on the young of our australopithecine forebears 3 million years ago (with the most famous example being the Taung Child).  It is also hypothesized that this may be the reason for genetically inherited fear reactions to objects flying overhead.  The more you know!

Werewolves

Although there is much debate over the source of werewolf mythology, it seems to have arisen from any number of diseases with parallel symptoms, from those causing excessive hair growth to the more likely culprit, rabies, which, like Werewolfism, causes aggression and frothing at the mouth and is spread through biting.  Since rabies is objectively more terrifying than Ebola (~100% fatality rate after symptoms appear), and since medieval Europeans didn’t have access to rabies vaccines, it seems likely that this monster may actually be a manifestation of our fear of the disease.  Additionally, the prevalent theory that various psychoses were tied to the phases of the moon (hence, “lunatic“) ties in nicely with werewolf mythology.

Zombies

Zombies come from Caribbean tradition (particularly Haitian voodoo), which in turn draws its roots from west African traditions (the source of the word’s etymological roots (nzambi, for example), which principally carried spiritual connotations), and the zombie seems to be an allegory for slavery, due to its brain-dead submission to an evil authority.  However, the modern concept of the zombie as a shambling, undead corpse actually finds its roots as recently as George A. Romero’s Night of the Living Dead.  A more general treatment of these creatures as among the undead is probably more fitting, as various forms of corporeal undead mythology exist across many cultures, including the draugr of northern Europe, the revenant of western Europe, and the vampire of eastern Europe, which typically feature malevolent dead returning to take care of unfinished business, and many of which serve as vectors for spreading disease and destruction.  The modern zombie (and the associated zombie apocalypse) actually stems from more modern fears — the fear of societal collapse at the hands of an ineffective, bureaucratic government, and of the selfishness of the individual outweighing the greater good.  Over the last few decades, zombies have also come to represent a fear of over-reliance on technology and the inability to deal with new information that fundamentally breaks our worldview; the Max Brooks novel World War Z in particular mentions the struggle of coming to terms with something that is, on its face, scientifically impossible, and how that affects the initial understanding of the threat it poses.

Vampires

As previously mentioned, vampires share a common ancestry with many other corporeal undead monster myths, and in fact almost all cultures have some form of vampiric undead, suggesting that they represent some ancient, deep-seated fears — in particular, the fear of death and the process of decay (given the illusion of continued hair and fingernail growth immediately after death, or various swellings that may be interpreted as signs of life after death, plus natural bleeding from the mouth and receding gums giving the illusion of long teeth used to suck blood).  Additionally, vampires and other undead are often associated with the spread of contagion, which may symbolize the poorly-understood mechanism by which victims of disease might spread the disease after death to those who interacted with the corpse; by implying that the corpse could come back to life and spread evil, the myth helped to promote safe practices like burning corpses or avoiding them altogether.  Finally, many of these undead monsters display cannibalism in some form, either hungering for human flesh (as in modern zombies) or drinking human blood; these practices are known mechanisms for spreading diseases, and associating these practices with evil spirits may have performed a vital role in discouraging them among society.


I hope you enjoyed this brief look at the roots of the monsters we hold so dear.  I certainly look forward to 2016’s Ebola-inspired release of the next big blockbuster horror film, Evening of the Monsters that Bleed A Lot.

Notions on the City of Notions

I work for a small company — between 50 and 60 “investment professionals,” as we call ourselves — and because we don’t have the human resources necessary to field a human resources department, every fall we do our own recruiting.  I love recruiting season, of course, because it’s a great excuse to not do any work for a couple of weeks, but also because it offers an opportunity to travel to places I’d not normally go, usually with someone who spent a few years there.

“Do you need this by tomorrow?”
“No, I’m out of the office until January, so anytime in the intervening 4 months is fine.”
Last year I went to Pittsburgh to recruit at CMU, and I ended up getting shown around the neighborhood by a bunch of PhD students.  I hate to say it but… Pittsburgh is really cool?  (I also get to show people around Durham and visit friends I don’t get to see very often who are there, which is great.)

This weekend, I went to Boston for the MIT career fair on Friday, after which I elected to stay for the better part of Saturday so I could explore the city.  I really liked Boston proper — I got to walk the Freedom Trail (hooray, history!) and see crazy things like a bike race through the location of the Boston Massacre and Paul Revere’s house (also the bar next to it, with the sign “Paul Revere probably drank here!”).

Also, Paul Revere’s tomb.

The Freedom Trail goes from the Boston Common (not to be confused with Ballston Commons, AKA “The Worst Mall in America”) all the way to the site of the fortifications on Breed’s Hill, the location of the famous and poorly-named Battle of Bunker Hill in the Revolutionary War.  Charlestown, now built up around the hill, is a beautiful town filled with little brick houses — and also the scene of that totally rad mission where you have to run down through Charlestown as the British burn it in Assassin’s Creed 3.

Boston is one of the oldest towns in America, and as you walk the Freedom Trail, you wend your way through 17th and 18th Century landmarks like the Old North Church and Faneuil Hall.  But it’s very different from the other old cities I’ve been to (principally in Europe), where they separate the old city from the new; in Boston, these landmarks from the first European settlements are interspersed among massive 20th and 21st Century skyscrapers.  It’s actually kind of cool to think of the city as moving on and evolving in exactly the same place that it’s been for the last four centuries.

The other thing I enjoy about recruiting is that, against all odds, by telling people how much I love my job all day, I actually start to like my job.  Somehow, the combination of cherry-picking the parts of my job that are actually interesting (and then, perhaps, lying through my teeth about the magnitude of that allure) and hearing how incredibly douchey it sounds to tell people how dynamic the energy markets are, and how sending accurate pricing signals to the market is a much-needed service that the market is only happy to reward with windfall ducats, leaves me excited about the prospect of another several years of sending those pricing signals.  Then I come home and am confronted by the sheer, mind-numbing drudgery of what I actually do — that the problems I solve are both esoteric and incredibly uninteresting; that, while the actual problem at hand is often different, the tools and methods used to solve it are always the same; and that I literally cannot remember a time in the last 2 years when I didn’t feel undervalued — and I fantasize about brushing up my resume and leaving it all behind.

I should be clear: the job is actually great; it pays well, it’s relatively well-run, and it’s fairly successful.  The problems, while esoteric, are real-world problems that do need to be solved, and while this opinion may be atypical of finance / investment professionals, I actually do believe that financial participants provide a service to the market by providing pricing signals, provided those pricing signals are based on rigorous analytical modeling of market fundamentals.  Moreover, many of the people I’ve worked with in my time there are easily the smartest people that I’ve ever met.

[Image: Mr. Peabody]
Me and my first boss on my first day of work.

And I think that that’s the problem.  This was underscored to me as I was wandering the streets of Cambridge the night before the career fair.  Everywhere I went, I passed the elite of the elite, the young up-and-comers of Harvard and MIT in their native habitat, and they were having the most boring conversations about just the shallowest inanity.  To their credit, it was markedly different from walking around Duke’s campus, where the conversation skews douchey rather than pretentious, and I now appreciate that these schools actually attract a different — perhaps even better — class of student than my alma mater.  But hearing the under-stimulated intelligentsia walk around talking about how the play they saw last night really explored the depths of neo-post-modern consumerism in an ante-post-racial world, or how being in the production crew for Exploring the Depths of Neo-post-modern Consumerism in an Ante-post-racial World really opened their eyes to other cultures because of the diversity of the cast, served to underline the degree to which our most intelligent people are directing their considerable brain power toward just the most frivolous bunk, and that’s exactly what I feel most of my coworkers are doing.

I can hear you thinking, “But you all just sold out for well-paying jobs.  It’s your fault that you bought into The System, man.”  Yes, I always wanted to be a rocket scientist, and I gave up on that because I could get a (much) larger paycheck by going into trading.  I have absolutely no problem with people responding to market incentives or doing things for personal gain.  People should be going out there and creating Facebooks and Twitters and starting energy trading companies and generally solving problems — both the global-scale, world-changing problems like curing cancer and putting men on Mars, and the small-scale esoteric ones that most people will never knowingly benefit from.  The issue is that the people I work with are capable of so much more than what they’re doing, even if you’re just judging by paychecks.

At this point I’ve outgrown trading; within a year or two of taking the job, I started to feel that a trained monkey could often do what I did.  I was underutilized — I still am — and although my days are busy, I’m not really learning anything.  But worse, I think, is that I’m not creating anything worthwhile, either for myself or anyone else.  Instead, everything that I do goes to make some rich guy at the top of the ladder even richer while I and those around me — many of whom exhibit far greater potential than I ever could — largely stagnate.  And we’re all doing this instead of exploring our universe or curing diabetes or becoming billionaires.

So to those out there who feel uninspired by your current situation, who know you are capable of more than you’re doing, I say take a risk and go do something productive.  Start a business, build something new, change the world.  Don’t get stuck in a neo-post-modern ante-post-racial consumerist America and spend your time analyzing it on a Thursday night with your under-stimulated friends.

Wired

For years, people have been telling me to watch HBO’s The Wire.  Having finally knocked Breaking Bad and Game of Thrones off my list, I’ve recently found the time to take them up on it.  I’m almost done with Season 3 (I’ll probably finish tonight), and so far, I haven’t been disappointed.  Except by Season 2.

Since then, I’ve had that weird thing happen where you see something for the first time and then see it again everywhere, in this case with Season 2 of The Wire, and in particular with how underrated people say it is.  My brother, who defends Season 2 (or should I say, “Season poo”… amiright?  I’m so right) as “the quintessential modern Greek tragedy,” recently retweeted this for my benefit.

And just this morning, in my weekly drive to keep up with the ever-changing, highly-varied Cracked.com, I found a reference to Season 2’s underratedness (to quote, “…every episode of the underrated second season”) in the most recent episode of The Spit Take (if you really care, it happens around 5:15).

Here’s the thing though — I haven’t talked to anyone who’s watched the second season recently.  And, looking back on it from a distance of almost two weeks, there were some really good things about it; I can honestly imagine thinking, “It wasn’t as bad as everyone says.  The story was compelling, and the characters were well-developed.  The callousness and understated professionalism of The Greek and company played well against the tragically-flawed Frank Sobotka, and the close of the second-to-last episode of the season is destined to become a classic moment in TV history.”

But every time I think that, I remember how much I hated actually watching that season — it’s easy to pull back and see the good story but forget about the terrible execution.  And we must never forgive the creators for bungling what could have been fantastic television.  Here’s what went wrong — may we never forget.

It starts right at the beginning — actually, at the beginning of every episode.  The theme song for Season 2 is terrible.  In fact, the above-linked Cracked video even refers to it as “almost confrontationally bad.”  The first season’s theme is a darkly upbeat, bluesy masterpiece that uses the historically black roots of blues music to highlight the racial tensions of policing drug crime in Baltimore.  The second season’s intro is at best boring and at worst unlistenable — the upbeat bluesy aspect is gone, and in its place is a slow, static background track that never seems to lead anywhere.  And that’s without even bringing up the vocals, which sound like they caused the singer almost as much physical pain erupting from his throat as the listener incurs by hearing them.

But let us not judge a show by its intro.  Let’s instead look at the stuff that the show did right, and then remember how in execution it actually went wrong.  We’ve established that the overarching story was pretty good, so let’s start with that.  I have two major complaints about the story:

  1. The story has little to nothing to do with Avon Barksdale and Stringer Bell until the very end, and even that tie-in is weak and contrived.
  2. The “getting the band back together” storyline is so contrived as to be laughable.

Here’s the thing — you can fix both of these in one shot.  Establish that Stringer has to start going through Prop Joe for his stuff at the beginning of the season; give Lt. Daniels a motive for looking into it (unfinished business or some such), and make that the focus of the investigation from the get-go.  Don’t give the shrew-faced, wrinkly Major some BS story about wanting to put stained glass in a church, and don’t have him create a 12-person task force to pursue a petty personal vendetta against some no-name Pole down at the docks just so Lt. Daniels can come in and say, “If I’m going to do this, I’ll need all of the people from last season who signed on for this one.”

There are of course other executional missteps in the story — for instance, I would point to the amount of Dunkin Donuts in each episode as somewhat distracting, and the technology used to track the shipping containers, with the little truck graphic and the disappearing boxes, as frankly hilarious — but the fact that my two complaints above cover basically the entire first two-thirds of the season speaks for itself.

Another aspect of the show that went really well was the characters — we’ve already covered the tragic hero Sobotka and the villainous Greek and his henchmen, but in doing so we’ve completely overlooked two of the worst, most useless, least sympathetic, least watchable characters of our generation: Mousey Docks Police Lady (I can’t be bothered to look up her real name because I just don’t care about her enough) and, of course, Ziggy.

My problem with Mousey Docks Police Lady is quite simply that she’s boring.  She doesn’t really do anything or add anything to the show; occasionally, something interesting looks like it might happen (OMG she’ll hook up with McNulty!  Nope.  OMG she gave away that they were under investigation!  Doesn’t matter), but then it doesn’t.  At one point at the beginning of the season they literally talk about how she’s not real po-lice, which seems like it’ll be a source of at least some tension (can she come up to speed and deal with life … on the street?), and then … she never really grows or progresses as a character, despite being a main focal character in several episodes.  Case in point: she’s so useless to the season that I don’t even know her name.

Meanwhile, the problem with Ziggy seems obvious; every Season 2 defender I’ve talked to so far has had to admit that none of Ziggy’s actions can be explained by any sort of thought process or motivation.  He’s a completely unlovable, whiny, irresponsible fool who does things like celebrate paying off his drug connections by lighting a $100 bill on fire in front of his notably poor friends and coworkers at a bar, and just straight up murdering a dude, then sitting there like an idiot.

But the real problem with Ziggy isn’t Ziggy himself; it’s the other characters’ — particularly his family’s — interactions with him that make no sense whatsoever.  His dad basically talks to him twice throughout the show: once to tell him off for the aforementioned $100-bill incident, a conversation that ends in a Lion King, Mufasa-teaching-a-hard-lesson-after-rescuing-Simba-from-the-elephant-graveyard-that-ultimately-reinforces-their-relationship type moment, and … probably another time?  I would guess?  They’d have to, since they’re family, right?  And then you have the cousin, whom the elder Sobotka treats basically as his son, who for no apparent reason puts up with Ziggy’s garbage and continues to invite him along to deal with people who are infinitely his superiors.

[Image: The Greek, this is Ziggy…]
[Image: …Ziggy, I’d like you to meet The Greek.]

Also, his name is Ziggy.  Can we talk about what a dumb name that is?  It’s a really dumb name.

Anyway, the kicker on all of this is that the acting — in particular, that of these two characters — is pretty bad.  To be fair, I don’t really know how one could act the part of Ziggy, since it’s almost impossible to take that character and put yourself in his place, but even Mousey Lady Cop seems to have basically no emotions; it’s like her character was described to her as “earnest professional” and she got no further instructions.  Also (and I can’t really make this a knock against the show, since there was no way they could have known when they cast him), the fact that the cousin is Pornstache from OITNB is really distracting.

It’s sad, but all of these missteps really detract from what could have been top-notch television.  I understand why people think the season is underrated, but I challenge them to remember that, however much they like the story and the parts that really went well, when they actually watched it, all the little stuff that got in the way kept them from enjoying it.

On the Origin of “Species”

I am not a biologist, so I can guarantee that I really have no clue what I’m talking about. Needless to say, I’ll charge ahead stubbornly anyway.

[Image: My sophomore yearbook photo]

I was listening to Radiolab a couple weeks ago, and they did an interesting bit on conservation in the Galapagos — in particular, half the episode was about an invasive parasite that is endangering three species of finches on one of the islands, and the resulting interbreeding of those finches in the face of annihilation.  Now, as I’ve mentioned, I’m not a biologist, but I very distinctly recall that in my 6th grade biology class a species was defined as a set of individuals who can create fertile offspring — a donkey and a horse are not the same species, because when they mate, they produce mules, which are infertile; a chihuahua and a St. Bernard, however, produce a batch of fertile puppies (it helps if the mom is the St. Bernard), because they are the same species.  So how was it that three separate species were interbreeding and producing fertile offspring?

This question has actually crossed my mind a number of times before, since basically every other week we discover that we’re not actually X% Cro-Magnon man, we’re actually some weird hybridization of (X-1)% Cro-Magnon and (Y+1)% Neanderthal.  

[Image: My senior yearbook photo]

And with extinct species it’s particularly difficult to say anything about species and speciation, since species are constantly evolving and changing.  Take, say, the slice of 2 MILLION years during which we know that T. rex existed: we have only about 50 specimens from which to determine what constitutes the species.  Meanwhile, in the last two million years, the genus Homo has gone from this:

[Image: Homo habilis]

to this: 

[Image: We’re evolving BACKWARD!]

Now tell me that with 50 specimens spread across a two-million-year period you’re going to have any clue what separates natural variation within a given species (keep in mind that, at least technically, Justin Bieber can produce fertile offspring with Anne Hathaway or Jennifer Lawrence or whomever you take to be the height of evolutionary perfection) from two completely separate species.  It’s not hard to see that the concept of species is just totally 100% man-made bunk.

I mean, it’s not entirely useless — I was trained as an engineer, and I still work a lot with models, and as understandable, simplified representations of the real world, models are incredibly useful.  The species model gives biologists a shared and meaningful framework for all sorts of things — again, not a biologist, but I’d assume it’s much easier to have a formal classification system so that you can discover that Huge Green Turtle with Domed Shell and Green Spot is resilient to drought, but Huge Green Turtle with Domed Shell and Red Spot is susceptible to drought, and then when drought hits The Island With Huge Green Turtles with Domed Shells, you know how to react and which to worry about.  And sure, maybe they can breed, but they don’t, and their gene pool stays separate in practice, so hey — good model, or whatever.

[Image: And Albert is susceptible to toothache.]

So I was kind of taken aback, listening to that episode of Radiolab, to hear a biologist just freak right the F out about the finches interbreeding (see around 15:45-17:30 in the episode).  The hosts present the species concept as “a biological rule” specifying which individuals will or will not mate, and then treat the breaking of that rule as this crazy thing that’s never happened before, when in reality this “rule” is just a model — a simplification that holds in most cases, but not all — and that model has broken at every stage of our own evolutionary history.  All I could think was, “Here’s this biologist who ought to know better going crazy because her model broke, like she’d never seen it before or realized it could, in the face of countless known instances where exactly that has happened.”

I ran this little incident by my friend (who happens to be a biologist), and she …

  1. had already heard about the Galapagos finches and thought it was a pretty big deal, and…
  2. told me why it was such a big deal.

It turns out that, yes, the species model is just a model, and it is acknowledged as such; in fact there are many species models, and the “species problem” (essentially, what actually constitutes a species) has been around in some form or another since before Darwin even made it to the Galapagos.  Ernst Mayr explored the definition of species in Systematics and the Origin of Species, eventually settling more-or-less on the “textbook” definition given above.  More generally, though, the various species models may also take into account physical characteristics (…with Green Spot vs. …with Red Spot, or in the case of the finches in question, song pitch and size) or statistical clustering of genetic characteristics.  There the lines may be a bit fuzzy, but if you compare the genes of a large sample of members of two different species, you’ll generally see two fairly distinct populations, and given the genetic characteristics of a new sample, you can say with some certainty whether it belongs to one or the other.

[Image: Examine genetic characteristics of turtle species with red spots vs. green spots]
[Image: Given genetic characteristics of a new, unknown sample, can we classify it?]
[Image: Pretty likely to be a member of “red spot” group.]
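To make that clustering idea concrete, here’s a minimal sketch of the sort of classification pictured above.  The “genetic characteristics” are completely made-up toy data, and the nearest-centroid approach is just the simplest thing that could work, not anything a real biologist’s tooling would actually use:

    import numpy as np

    # Toy "genetic characteristics" for two turtle populations (made-up data):
    # each row is one individual, each column one measured trait.
    np.random.seed(42)
    red_spot = np.random.normal(loc=[2.0, 5.0], scale=0.5, size=(50, 2))
    green_spot = np.random.normal(loc=[6.0, 1.0], scale=0.5, size=(50, 2))

    # "Train" by summarizing each cluster as its centroid (mean trait vector).
    centroids = {
        "red spot": red_spot.mean(axis=0),
        "green spot": green_spot.mean(axis=0),
    }

    def classify(sample):
        """Assign a new, unknown sample to the nearest cluster centroid."""
        return min(centroids, key=lambda name: np.linalg.norm(sample - centroids[name]))

    # A new sample whose traits land near the red-spot cluster...
    print(classify(np.array([2.3, 4.6])))  # ...prints "red spot"

Real genetic data has way more dimensions and way fuzzier boundaries, but the shape of the argument is the same: two clusters, a new point, and a “which group does this most likely belong to?”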

I’m about to do something that I very, very rarely do, so cherish it: I admit that I was wrong.  Well, with the caveat that the hosts sorta bungled the presentation.  They presented it as though the biologist in question was amazed at the breakdown in the model; upon re-listening, I realize that what she was actually presenting as amazing was not that previously-thought-to-be-different species were mating, but that a new species was being created, in a way that biologists expected could and would happen but had never actually seen happen in vertebrates, and that this speciation was occurring at a wicked fast rate.  The end result of this process would be a new statistical cluster of genetic characteristics, representing a group of finches sharing physical similarities that largely chooses to mate within that cluster, fitting pretty nicely into any of the various species models presented.

[Image: A brand new clustering of turtles!]

And that actually is pretty cool — because now that they’ve found a new finch, they can correct one of the greatest social injustices of our times by naming it Atticus.

Mistakes Were Made

I recently decided to purchase www.carscafmoo.com (do *not* go there — it is extremely not ready yet).  I was going to register the domain through WordPress for the low, low price of $18 annually, which would have allowed me to port this blog over seamlessly (like — really seamlessly… like going to ccm.wordpress.com would just redirect you to ccm.com).  But what I really wanted was a chance to develop my own website back-end — twiddle around with PHP scripts, build out customized themes with my own CSS, explore Python; you know, fiddle with how real websites get developed in the real world.

Unfortunately, WordPress doesn’t let you do that.  If you register with WordPress (and perhaps pay for some additional add-ons), you can build a pretty bitchin’, full-featured website without writing basically any code.  A WordPress site also takes care of a lot of the administrative work that’s nice to have, but not strictly necessary — basic stuff like following and comments (both of which are pretty customizable), but also things like alerting you when someone has commented on or followed your blog, or even linked to it, plus page view counts and the like.  And that’s, like, totally rad — except that if you actually want to write code, they direct you to this page, which lists hosting options.  External hosting gets you none of those sweet WordPressy things, although you can then download WordPress and have all of its features available for your website, which is actually pretty cool, and perhaps something I will, nay must, explore, especially when I eventually port this sucker over there.

Anyway, assuming I want to develop my website from the ground up, here’s what I should have done.  I should have built out the basics of my website locally: designated a folder on my local machine as the web root, figured out a way to serve it through Apache (preferably limiting the IP addresses served to localhost so I’m not serving it out for the world to see), spent a month or two building up the PHP, JS, & CSS libraries required to make my site run smoothly, back-populated the existing post history into the new format or whatever, crossed my t’s and dotted my … lowercase j’s …, and then, once everything was all neatly tested out and packaged up — WHAMMO, roll-out!  Huge success!  Standing ovation!  High fives for everyone involved!

[Image: Pictured: both the standing ovation and “everyone involved.”]
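For the serve-it-locally piece of that plan, the Apache side is only a few lines.  Here’s a minimal sketch of a local-only vhost, assuming Apache 2.4; the paths and folder names are hypothetical placeholders, not my actual setup:

    # Hypothetical local-only vhost (Apache 2.4 syntax).
    <VirtualHost 127.0.0.1:80>
        ServerName localhost
        DocumentRoot "/home/me/dev/carscafmoo"
        <Directory "/home/me/dev/carscafmoo">
            # "Require local" only answers requests from this machine,
            # so the work-in-progress site never leaves localhost.
            Require local
            Options FollowSymLinks
            AllowOverride All
        </Directory>
    </VirtualHost>

Drop that into the Apache config, restart the server, and http://localhost/ serves the local web root while the rest of the world sees nothing.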

But instead of doing any of that, I just blew a cool $100 registering with BlueHost for a year (it was the cheapest single-year plan).  I then proceeded to spend the next 3 hours twiddling around with Twitter Bootstrap and the header image I have on this blog in order to get the very, very simple header currently up at www.carscafmoo.com (go ahead, click the link at your own risk…).

[Image: Or don’t bother. Here it is.]

The impressive thing is that I actually have some experience with web design from work (though admittedly I frequently forget that the HTML / CSS / JS part is what takes me the longest, not the actual operational code), but what struck me was how surprisingly difficult it was to get off the ground here.  It wasn’t that the hosting setup was confusing or hard to deal with; the registration went really smoothly, and pretty much immediately after clicking the confirmation button, I went to the website and it totally existed and everything.  What’s incredible is the insane amount of overhead that I just completely take for granted at work — and that’s despite not having to set up the server and run Apache (which, actually, I’d like to figure out how to do sometime).  Things like versioning systems and test spaces are just completely missing, and I have to build them out from scratch, which will be hard since I don’t yet have SSH access, and when I do get it I probably won’t have root access (or maybe I will???).  Speaking of SSH access, I don’t think I have any way of copying files up to the server as I write them locally, other than through their web-based SFTP client (at least not yet), which is a huge pain in the butt for debugging.  I probably spent all of 30 seconds setting that up at work, and I literally don’t even know where to start with this site.  Needless to say, there is a ton of stupid nonsense I’ll have to figure out between now and actually launching the website.

So where does that leave me?  Right now, I have a website that totally exists and people can go to it DEAR GOD that has exactly zero features and will take me, I dunno, at least a month and a half of working in my spare time to get anything going, and I’m paying almost $10 a month for it.  And the first step in that many-day journey is to figure out the very basics — stuff like

  • Access
    • SSH and exploration of the server (I don’t even know what OS it’s running?); root access?
    • MySQL — what version are we running?  Do I have admin privs? 
  • How to quickly upload files (SFTP access + SublimeText SFTP plugin?  See the sketch after this list.)
  • Secret test space
    • Folder structure: Maybe ccm.com/SECRETTESTING/ holds a version of the website and ccm.com/ just points to ccm.com/live or something?  Or maybe I figure everything out locally, serve it through Apache to localhost only, and test there?
    • Includes paths: How do I reference inclusions? (Probably relative to the file in question?  Maybe a centralized includes function?)
    • Access: Is there any way I can shut the test space off from non-authorized visitors?
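In the meantime, the file-upload problem at least has a stopgap that beats the web-based client: a little push script.  This is a rough sketch under some loud assumptions — that BlueHost exposes plain old FTP (common for shared hosts), and that the host, credentials, and paths below are all hypothetical placeholders:

    # upload.py -- quick-and-dirty FTP push (hypothetical host/creds/paths).
    from ftplib import FTP

    HOST = "ftp.carscafmoo.com"   # placeholder; whatever the host assigns
    USER = "username"             # placeholder
    PASSWORD = "hunter2"          # placeholder; don't hard-code this for real

    def push(local_path, remote_path):
        """Upload one local file to the server over plain FTP."""
        ftp = FTP(HOST)
        ftp.login(USER, PASSWORD)
        with open(local_path, "rb") as f:
            ftp.storbinary("STOR " + remote_path, f)
        ftp.quit()

    if __name__ == "__main__":
        push("index.php", "public_html/index.php")

Once real SSH/SFTP access materializes, the same idea gets nicer (the SublimeText SFTP plugin can push on every save), but even this beats re-uploading through a browser every time I change a semicolon.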

Once that’s done (hopefully this weekend…?) I can start to get some content.  I at least have a pretty basic idea for that — I’d like to have a blog (which also means I’ll have to copy all of these posts over there…), but I’d also like to have other contributors (so if you have something to say and don’t really want anyone to read it, start writing it down!), so I’ll need a Contributors page with brief bios of people who have contributed and links to their most recent stuff.  I’ll want an About page that’s even remotely meaningful, and for now, that’s probably about it (although that’s a lot of work in getting folder structures and back-ends — think searching — set up).  Only then will I really be ready to launch.  I’m hoping I can be there by mid-October for obvious reasons (… it’ll almost be Halloween…?)

Once the basics are up, I’ll be able to start looking into things like tracking page views, following and commenting (assuming I can’t just plug right into that through WordPress; maybe I can, or maybe I’d rather build my own for sport), converting to Python, and adding all sorts of fun features (logins for contributors!?).  Of course, I can literally do all of these things on my current (free) blog through WordPress, but that’s beside the point — and the topic of an upcoming post, so I’ll leave that for then.

Anyway, in the meantime, I’ll try to keep posting here weekly, and I’ll double-try to have my posts there make their way back here.  And if you’re reading this, remember — I’m looking for contributors!  Let me know if you want in.