Geek Chic

March 16, 2010

It’s no big surprise that nerds are taking over the universe.

This, of course, is something that has been a long time coming. Most of us knew, deep in our hearts, that eventually the smart folks would be in charge. However, a lot of us assumed that Type A personality salesmen and that high school quarterback who still pumps gas at the hometown Citgo station and still does whippits in the Denny’s parking lot during shift change would be able to hold off the full-scale invasion for at least a few more decades. But it looks like their time has come.

Now, don’t get me wrong. When Percy McPocketprotector asks Molly Sue Easypants to the junior prom, he’s still getting the floor mopped with his braces. But nerds have come a long way since the slide-rule stereotyping of years gone by.

But a word of warning to many of you self-described geeks out there: Just because you’re a video-game-playing, science-fiction-reading, Pokemon-loving dork doesn’t mean you’re something special.

There are plenty of ways of determining exactly when being a geek became cool. Nerds have wallowed in the lower depths of the social scale for a long time, and there are any number of moments to point to as the turning point. Most people would peg it to the ascent of Bill Gates as the world’s richest person, because–let’s face it–he probably still showers with his underwear on, and all the money in the world apparently doesn’t stop one from using a cereal bowl as a hairstyling product.

I’m sympathetic to that thought, but making gobs of money and being awesome on the social scale aren’t necessarily mutually exclusive. I peg it to The Lord of the Rings: The Return of the King winning the Best Picture Academy Award. It’s fair to say that Hollywood hasn’t necessarily been hostile to geeks, but they certainly haven’t helped–for every Grade-Z science fiction flick and grating performance by Jerry Lewis perpetuating nerdiness as a valid lifestyle choice, there are a thousand Ryan O’Neals, Tom Cruises, and Sean Connerys slamming the ladies and being the hit of the party. But having the movie industry actually recognize the best representation of what it means to be a geek…well, let’s just say the World of Warcraft servers fell silent for whole minutes after it won, which no doubt accounts for all the half-orcs and dwarves born about nine months later.

The problem, of course, is that once there are enough geeks out there, the mere act of being a geek means less and less. There is always a segment of the population that strives not to be like everyone else, and many folks become quite conflicted when a once-obscure space opera only dozens of people obsessed over becomes a major motion picture, and years of devoted fandom are rendered useless because every bonehead and their brother can just look it up in the Internet Movie Database’s trivia section. Most of these poor kids don’t have any other avenues of interest to turn to, and simply become geeks without all the awesome geek parts. Also See: Emo.

I’m lucky. I’ve been a geek for quite some time. If you were to go back in a time machine and ask me what my life goal was, here would be the results:

Age 4: I want to be a pirate.
Age 6: I want to be a stage magician.
Age 10: I want to build my own computer from random bits I bought at Radio Shack.
Age 12: I want to beat Sid Meier’s Civilization on Deity.
Age 16: I want to work for the National Security Agency.
Age 18: I want to be the world’s foremost expert on The Adventures of Brisco County, Jr.
Age 21: I want to write and star in my own steampunk version of Dune.
Age 32: I want to beat Sid Meier’s Civilization on Deity.

However, I didn’t embrace geekiness whole-heartedly. I never got into anime–I mean, the Japanese are just…weird. And I never really got into role-playing beyond the most basic level. And I didn’t own a game console between the Nintendo Entertainment System and the Wii, so I apparently missed out on such wonderful products as the Nintendo 64 version of GoldenEye, supposedly one of the greatest cultural milestones of the century.

Now, I’m not going to say anything about my lovely wife regarding this, though I would like to point out that our first date involved a book store, and that at one point in our marriage I’m pretty sure she played Super Smash Brothers Brawl for 36 hours straight. I’m just sayin’.

While we are celebrating the cerebral, it might be prudent to point out what is NOT within the realm of the geek:
1. Being female and putting on a pair of glasses does not make you a “sexy geek girl.” Tina Fey and Olivia Munn can pull it off because they are real-world, capital-N Nerds. Some random hottie on the internet who is looking for attention and thinks she qualifies as a nerd because she’s wearing an iPod, glasses, an unbuttoned white men’s dress shirt, and nothing else does not.
2. Number of Call of Duty units sold: 55 Million. Number of American soldiers in the actual World War II: 16.1 million.
3. Just because you picked up the Foundation series at the used book store or once watched forty-five minutes of anime at two in the morning in your dorm room via dial-up doesn’t mean you get a free punch on your geek card. Being a geek involves a more drastic change in your lifestyle, such as pissing in empty Mountain Dew bottles so you don’t miss a minute of that EverQuest campaign you’re playing, or actually reading any of those Harry Turtledove books your odd aunt gave you one Christmas when you were eight.

Even with all of this information, there isn’t any well-defined test for determining whether one is a geek or not. However, if you are already in the process of writing me a scathing e-mail about how the gestation period of a half-orc is not, in fact, nine months, then we may have a pretty good idea of where to start.


The End of the World as We Know It

February 6, 2010

Sometimes in the slow evenings of my existence I think about the end of the world.

Now, granted, this usually occurs when I’m watching either the History Channel or Jersey Shore, which are exceptional candidates for finding out how, and hoping for, respectively, the end of the world. But in thinking about it I’ve realized we have quite a bit to worry about.

There’s never been a shortage of theories. The Long Count Calendar, an elaborate prediction made by the Maya civilization, advised everyone that the end of the world would happen in 2012, at around 10:30 in the morning, right after breakfast. The Mayans devised this calendar using incredibly advanced technology for the time, mostly thanks to the scientific effort they saved by not investing in, say, the wheel, or in not falling for the Spanish-dude-is-a-god trick.

Of course, we could also die by natural catastrophe. We naturally see blips of this on occasion, when tsunamis and rock slides just outright destroy entire nations in mere minutes. At some point, the weather is going to get its act together and start coordinating this nonsense. Someday, we’re going to get earthquakes, tidal waves, solar flares, volcanoes, ice storms, and Brett Favre’s last throw in regulation in the 2009 NFC Championship Game all at once, and–poof!–we’re done.

The prospect of an infinitely expanding outer space doesn’t help. Those of us worried that some day the aliens will come with plasma rays and titanium boots and start laying waste to our cities should really be worried about what could actually happen, since that is much scarier. The magnetic poles of the earth could switch, causing electrical generators to self-destruct and digital watches to switch to military time. A gamma ray burst, which as far as I can tell is the stellar equivalent of a six-year-old’s explanation of an episode of Kim Possible along with a brick of C4, could devour the earth in the blink of an eye. An asteroid could demolish the world, even with the assistance of a frustrated Bruce Willis now that he’s not going home to Demi Moore.

Of course, we could be doing it to ourselves each time we tap away at our computers. The concept of the singularity–humans develop a computer smarter than humans, which then takes over its own development in an infinite loop of e-nightmares and cyberterror–is frightening. Only more so since I think it’s supposed to be the plot of Tron, but Disney was too scared to awesomeify it into reality. The term “grey goo” sounds cute, but it’s a scenario in which self-replicating nanobots, created with the intention of helping medicine and industry, end up consuming everything in their path, including–amazingly–Hot Pockets. Granted, I may be biased in this particular regard, since I am fairly certain the copy machine at work is smarter than I am, and is at least no doubt better organized. (For the record, I am also scared of most vending machines.)

Not all end-of-the-world scenarios involve random nastiness. It could be deliberate acts of cranky. Iran has been given a green light to nuke Israel, in the sense that I suspect that Tehran’s weapon of choice will be a fully functional and peaceful nuclear power plant small enough to fit inside an intercontinental ballistic missile. Kim Jong Il has been chucking Fat Men in the Pacific since Churchill was in diapers. Osama Bin Laden has been sitting in some cave in western Pakistan with an Erector Set, knocking over scale replicas of the Golden Gate Bridge and the Sydney Opera House while he waits for the canister of sarin to get to him, once FedEx finds his street number.

Then again, the world’s most destructive terrorist isn’t a nutjob in a turban or a fisttwister in Beijing, but a pig. Or a bird. Or some other random animal who, for some reason, holds a grudge against its caretakers and occasional preheaters. Swine Flu, Avian Flu, and no doubt Pachyderm Flu have mutated across species and will some day doom us all. Our bodies are ill-equipped to resist such outbreaks–thanks to their newly-formed transmission methods, but also because doctors have been pumping our bodies full of antibiotics like peanut M&M’s for years–and at some point a global pandemic may leave buildings empty across the globe.

Of course, worrying about all of this isn’t going to do us any good. Aside from getting on the Opus Dei mailing list and maybe buying some of that astronaut ice cream, there isn’t any practical way to prepare for the end of the world. Me, I’ll pop a bag of kettle corn and crack open a Mr. Pibb. No, it won’t save me, but I certainly hope and expect that it will be one hell of a show.


The Quest For the Holy iGrail

June 26, 2007

Decades into the future, one could easily be misled into thinking that a small but vocal religion cropped up during the latter years of the first decade of the new century. Hymns have been composed, deities have been created, and rites have been established for all to participate in the Church of the iPhone. Welcome, all who are willing to embrace intuitive technology, advanced processing power, and a two-year contract, terms subject to change.

The iPhone, a new product shilled by Apple prodigy Steve Jobs that promises to integrate several disparate features into one single cell phone, has been getting a significant amount of press and heightened anticipation. While the media and technogeeks love the iPhone, it has more than its share of detractors. The first strike against the iPhone is that it’s called the iPhone. Listen, Steve, the whole business of putting an “i” before everything you sell was kind of cute back in 1980 or whenever, launching yourself ahead of the curve when internet startups were calling everything names like eParkingTickets or eSuicidePrevention. But “e” at least made some sort of sense in its original, historical usage: identifying something as electronic, as opposed to dead tree or vacuum tube. Now I’m sure there’s an equally valid reason why there’s an “i” placed on everything, like it stands for information or it’s better than Microsoft or some nonsense like that, but we all know it’s simply a marketing ploy to be able to put a tiny little apple where the dot on the “i” is supposed to be. It’s the fourth-grade-girl-writing-her-name-with-little-hearts-four-hundred-times-on-her-Trapper-Keeper-cover school of marketing.

Secondly, the almost hagiographic devotion journalists have displayed in covering the iPhone borders on the nauseating. Newsmen have breathlessly fawned over marginally notable items before—MySpace, Flickr, John McCain—but this time the escalated emotion approaches the Mark David Chapmanesque to an almost embarrassing degree. One analyst stated that “This is the most anticipated phone since Alexander Graham Bell did his,” apparently equating the inauguration of intercontinental communications with not having to walk three yards to check your email. “Is This The Holy Grail of Gadgets?” ran another headline, no doubt comparing the singularly elusive treasure with something that will soon be on a 10-million-unit production schedule.

Jobs himself is becoming more and more insufferable. I mean, God bless the guy and all, he’s certainly a consummate innovator and has forced competitors to make their devices more intuitive and resourceful. But in press conferences and interviews he always seems to exude a certain level of smugness normally reserved for French salons or college Democrat mixers.

Some of the more notable features of the iPhone:

Touch Screen: The iPhone uses a touchscreen as the primary input device, which is hardly a revolutionary addition to the world of gadgetry. However, the screen on the iPhone is glass—not plastic, as most others are—the main advantage to glass being the convenient ability to determine whether an engagement ring is truly diamond or not. Notably, you cannot use a stylus with the touchscreen; it requires contact with bare skin to operate. The marketing department of Apple apparently feels that the segment of the population likely to purchase the iPhone is not going to include those individuals who have fingers the size, dexterity, and composition of sausages, a safe bet since I doubt too many iPhones will grace longshoreman union meetings or Slovenia.

Voice Mail: I don’t know why this is so important, but it’s on every list of features I’ve seen. The iPhone will let you pick and choose the order in which you listen to your voice mails. I mean, I guess that’s kind of nice, but I’m lonely enough that I only get about one voicemail every lunar cycle or so, so I guess a lot of it is lost on me. I suppose skipping over the voice mails left by your boss, your parents, or (most importantly) your wife is significant enough, since it lets you hear that girl you met at the bar last night drunkenly butchering her phone number a full two minutes earlier in your life than you could with a regular, prehistoric phone.

Multimedia: The iPhone will combine many of the media applications that other Apple products have provided, such as the iPod and the Video iPod. Combined with the phone’s internet capabilities, this is a major breakthrough in the field of being able to watch a video clip of a dog riding a skateboard anywhere you can pick up a signal.

Internet Access: In an increasingly mobile world, having access to information via the internet is becoming more and more important. I have access to the internet on my phone, for instance, though I have no earthly reason to besides, apparently, a desire to throw money away and the thought that there may be a chance, however remote, that some day I will need to know which actress played Six on Blossom, and that I will be able to resolve the question with only a few moments’ worth of effort. The iPhone will have this capability much in the same manner as regular Macs do, except that it will be on a screen about the size of a deck of cards, which is going to make that Photoshopped picture of Jenna Von Oy spread eagle on the beach all that much harder to admire.

Plenty of people are waiting with anticipation for a device that has all of the applications they desire; however, most consumers will probably take a wait-and-see approach. While no doubt useful for some, there’s a good chance that the iPhone is simply going to be a more efficient way to drop a $500 device down the toilet.


The Platform of My Expertise

May 21, 2007

There’s something frighteningly alluring about the “Get a Mac” campaign. You know the commercial: two disparate representations of the PC and the Mac stand side by side, getting paid scale and mugging it up for the ad agency. It’s not particularly subtle—the white space background is a psychiatrist’s wet dream’s wet dream, and the dialogue is snarkily understated until you reach the end of the commercial and suddenly realize you’ve just witnessed the manufactured equivalent of the whitest dozens ever propagated by the computer industry. It seems to have been engineered down to the core demographics of home computer buyers—college graduates with sudden disposable money, and parents of 13-year-old future meth mules who think that purchasing their child a computer will make them smarter, when in fact they’ll spend 18 hours a day playing EverQuest and looking up donkey-on-donkey porn.

Yet these commercials are shrewdly constructed, pointing out the flaws of the PC without being asshatish about it, retaining a standoffish smugness that keeps them from falling into helpless melodrama. Humorist John Hodgman (for those who know the difference between a humorist and a comedian, life presents a sweeter flavor the criminally uninformed will never know) portrays the PC, a stodgy, lumbering bundle of projects and competing platforms, fending off backhanded compliments from the Mac. Mac is portrayed by the smooth and slightly self-satisfied Justin Long, who somehow manages to craft fiercely cranky repartee without coming off as a completely soulless douchebag. Each actor, therefore, manages to match his personality to that of the operating system he represents.

That said, it’s probably not the most effective commercial in the world. Watching it has shown me that, yeah, all things considered, the Mac is probably a superior computing system. Many of the irritants of the PC are completely absent, or at least heavily arbitraged in exchange for an inability to connect to any kind of reliable network and a wait of eight to twelve months to play any game that console and PC players have already bought, won, replayed, bragged about uselessly to their girlfriends, and sold on eBay.

Yet it’s not just that easy. Just as Long’s slick presentation replicates the ease of use and rampant utility of the Macintosh, Hodgman’s PC doesn’t just point out flaws; he carries the unintended consequence of familiarity. He reminds us of the friendliness of the PC. Not the come-over-this-lazy-afternoon-and-drink-lemonade-and-play-Atari kind of friendliness, but more along the lines of a baleful friendship long since forgotten. It’s a comfort-zone kind of friendship. Yeah, the Mac may be more powerful and easier to use, but we like PCs because we’re used to PCs. And Hodgman’s portrayal of the PC is dead on—yeah, he may freeze up once in a while, and he may not be the best at multimedia applications, but he helped us when we stayed up all night to write that term paper due the following morning back in the spring of ’97 when that cute blonde blew us off right before dinner, so we kind of owe him.

I really don’t like getting into the operating system wars, of course, because no one will ever be convinced. Actually, that’s not true. Most PC users are at least vaguely aware that Macs are probably more efficient machines, but they just aren’t better enough to bother switching; likewise, the simple tyranny of the PC’s hold on the home computer business makes doing a lot of things on the Mac inconvenient. But no matter what I say, I’ll probably get an avalanche of emails and painful correspondence dedicated to my complete and utter inability to grasp the simple concepts of kindergarten engineering that would crack open for me, like the Dead Sea Scrolls, the knowledge that Macs are the evolutionary answer to all of life’s problems from Middle East peace to QuarkXPress, or that PCs are the harkened messiah in silicon form, assuming the pinnacle achievement of the messiah involves the ability to play Sheepshead with an Australian housewife at four o’clock in the morning while an irritating graphic advertising mortgage payments with a dancing cactus flashes off to the side. Most people can’t seem to accept that in some areas PCs are superior, while in others the Macintosh is superior. And some jacked rivethead will always bring up the eternally oppressed Linux, like a latter-day Liberal Party MP or an RC Cola diehard.

Still, one has to be impressed with the Mac advertising campaign. Macs used to be largely the sole province of publishers and Computer Science professors, with the occasional science fiction writer thrown in for good measure. Now, with a rather heavily aired campaign, along with sister products such as the iPod, iPhone, and iWhateverthehellthenextoverblownconspicuousconsumptioniconisgoingtobe, perhaps Apple will finally be able to crack more than single digits in market share.