The personal blog of Jay Garmon: professional geek, Web entrepreneur, and occasional science fiction writer.
Thursday, December 31, 2009
Nerd Word of the Year: Reboot
Reboot (n.) - A new version of an existing story or franchise that discards or ignores existing story continuity. This is different from a retcon, which sees much or all of existing continuity maintained, but with select changes in the backstory. Reboots start from scratch in many ways, and are sometimes indistinguishable from remakes. For example, the Ron Moore/David Eick reinvention of Battlestar Galactica saw major deviations from the 1978 original, with main characters changing race, gender, or even species, alongside the introduction of major new characters, settings, and themes.
I bring it up because: As we look back at 2009, this was The Year of the Reboot. Culturally, politically, economically, and spec-fictionally, so much got the reset button that it's hard to fathom it all. Sticking close to the nerd-o-verse, Star Trek was conspicuously rebooted, as was the classic TV series V. You can be forgiven for ignoring the painful cinematic reboots of GI Joe, Land of the Lost, Friday the 13th, and Astro Boy, along with the second punch-to-the-brain installment of the live-action Transformers reboot. Even classics like The Prisoner, Day of the Triffids, and Sherlock Holmes weren't beyond the reboot treatment this year. The aforementioned, critically acclaimed Battlestar Galactica reboot -- which in many ways kicked off the reboot craze that dominated 2009 -- also drew to a close this year. Here's hoping that in 2010 we get a few more original ideas.
Wednesday, December 30, 2009
2010: The beginning of the end of free
With the exception of broadband/dial-up access charges, the history of the Internet has been defined by free. Content is free. Access is free. Applications and features are free. Once you're online, everything is free for the taking. It's a culture of free, to the point that even when content producers want to charge you for something -- movies, music, games, books -- there's an accepted practice of getting it for free anyway.
And I think 2010 is the year that all starts to change. 2010 is the beginning of the end of free.
Why do I say that?
First, because 2009 was the year that the previously presumed notion of online = free became a real discussion point. Back in 2007 and 2008 it was just quasi-contrarians like Jason Fried from 37signals who got press for "daring" to charge for online apps. But in 2009 the groundswell of "maybe it shouldn't be free" got loud enough that the free-lovers countered with a seminal text, Chris Anderson's Free: The Future of a Radical Price. The fact that free was even a major topic of debate in 2009 -- let alone that guys like Chris Brogan came out against the idea of giving everything away -- was the first tremor of change. If you have to defend your position, that's an acknowledgement that the position is under assault.
Second, 2010 will be (I predict) the year the PC really stops being our primary Internet device, and our phones take over. Apple has a lot to do with this, thanks to the iPhone, though BlackBerry did plenty to make us like and want mobile Internet. Google is here with its Android OS, playing the same game. And if/when the Apple tablet gets here -- probably running the iPhone OS -- it will annex the ebook reader landscape into the phone universe. (Ray Kurzweil's Blio book platform will pick up the dedicated e-reader stragglers, because it renders better-looking ebooks. And it's worth mentioning that almost everyone, Amazon included, would rather sell you a tablet PC than a dedicated e-reader, if only for the higher market penetration.) So what's that got to do with free?
Whereas we've been trained that our PC-based Internet is a world of free, we've always paid for everything on our phones. A phone-based Internet universe is a world of micropayments. We pay for ringtones. We pay to download text messages. We pay extra for data connectivity. We pay for apps. Phone-based features are a Chinese menu of mini-payments. And the tweens and teens raised in a text-messaging, pay-as-you-go world won't think anything of a not-free Internet when they are driving the culture and finance bus 10 to 15 years from now. Moreover, not only will we pay for online stuff on our phone, we'll use our phone to pay for stuff offline. Virtual payments using a secure mobile device are just beginning to take hold in the US, but this trend will escalate and cement the notion of our phone as a nexus of commerce.
And the third nail in free's online coffin? Content providers have to find a way to make money online, and they're dead set on charging us something for the work. Rupert Murdoch wants to delist his sites from Google and put up a paywall -- and he's crazy enough to do it. Other conventional print content creators are trying to create a "Hulu for magazines" to encourage online payment. The RIAA and MPAA aren't going anywhere, and Apple and Amazon are happily playing along, selling you 99-cent content snippets. Now, I expect all of these moves to fail, but they will send ripples through the content marketplace that erode the concept of free. While I won't pay $50 for a year's subscription to Time, I might pay $5 for a Time Warner app on my tablet/phone -- one that gives me access to all their print content across all their properties. The problem for content providers isn't that charging is wrong; it's that they're charging more than the value consumers place on their products. 2010 is the year we start to zero in on a realistic, sustainable price -- and it ain't free.
I'm not arguing that free is going away tomorrow, or that it will ever go away entirely. I am arguing that 2010 is the year that free stops being the assumed standard price for everything online.
Tuesday, December 29, 2009
Truly Trivial: What school is secretly referenced in every Pixar film ever made?
Christmas is over, but this godless heathen is still taking it easy. As such, you get another rerun from my Geek Trivia archives. Don't worry; it's timely. For those of you with kids, there's a good chance somebody got the rugrat a Pixar film -- probably Up! -- as a holiday gift, and the adults in the house are watching the CG animation on a nigh-endless loop. Fret not, Jay is here to help. The following bit of info can be used to construct a Pixar drinking game to kill the pain:
It just wouldn’t be Christmas without a CGI cartoon character voiced by John Ratzenberger, after all. John who? For the uninitiated, John Ratzenberger is merely the actor that portrayed iconic trivia geek Cliff Clavin on the long-running sitcom Cheers ... [and] he’s the only voice actor to appear in every Pixar feature film to date — to the point Ratzenberger is jokingly referred to as Pixar’s “good luck charm” ...
That said, Ratzenberger isn’t the only Easter egg that has so far snuck into every Pixar feature film to date. An inside reference to a famous academic institution has also shown up in each Pixar movie — if you know where to look for it.
WHAT ACADEMIC INSTITUTION HAS BEEN SECRETLY REFERENCED IN EVERY PIXAR FILM TO DATE?

Get the complete Q&A here.
Thursday, December 24, 2009
Nerd Word of the Week: Santa Claus machine
Santa Claus machine (n.) - Whimsical nickname for a self-fueling universal constructor; essentially, a machine that can create any object or structure desired by transmuting any materials already on hand. The term was coined by the late physicist and nuclear disarmament advocate Ted Taylor. Santa Claus machines are often seen as necessary components for the creation of megastructures, since the time and materials needed to build Dyson Spheres or Niven Rings under direct human supervision and effort are astronomically impractical. (I once did some back-of-the-napkin math on what it would take for NASA to build a Death Star, and that's a pretty clear case for why we need Santa Claus and a legion of tireless robo-elves.)
Some analogue or equivalent of the Santa Claus machine has long been a staple of speculative fiction. Star Trek's replicators are perhaps the most famous example, though the pharaohic chemical transmuter factories from Kim Stanley Robinson's Mars trilogy also fit the bill, as do the semi-sentient household "makers" from Warren Ellis and Darick Robertson's comic series Transmetropolitan. Self-directing Santa Claus machines are also fodder for sci-fi horror, as they may be a precursor to a gray goo outbreak. In the right hands, Santa Claus machines could lead to a post-scarcity economy (cue the Whuffie references). Paradise or apocalypse, Santa Claus machines could bring about either.
I bring it up because: Uh, Christmas Eve. Duh!
Thursday, December 17, 2009
Nerd Word of the Week: SantaCon
SantaCon (n.) - A flash mob or mass gathering of persons dressed in Santa Claus costumes, usually for the purpose of performance art or mild civil disobedience. The term is a play on the -Con naming tradition of science fiction conventions. These events are also known as Naughty Santas, Cheapsuit Santas, Santa Rampage, or Santarchy. SantaCons regularly occur in major cities during the holiday season, either as a celebration of, or commentary on, Christmas traditions (or simply as an excuse to dress up and act a little nutty). SantaCon participants may distribute gifts to strangers or sing bawdy versions of traditional Christmas carols, while others may simply descend on a retailer, restaurant, or area of town en masse.
Both SantaCon and perversions of the Santa Claus image have a long tradition in speculative fiction. No less a noted novelist than Fight Club author Chuck Palahniuk wrote about a Santa Rampage in his book Fugitives and Refugees. William Gibson imagined a supposed Santa battling a sentient, burglar-repelling house in "Cyber-Claus", while both Doctor Who and the Futurama crew have faced off against robotic Santas with rather overzealous definitions of naughty and nice. Of course, Santa isn't always evil. In the pages of the webcomic PvP, Santa fends off the Christmas-crushing ambitions of a superintelligent cat each year, occasionally with the help of the superhero team Jingle Force Five. In the same vein, Santa saved the world from alien invaders in the nonetheless horrible cult B-movie Santa Claus Conquers the Martians. If you're a good little boy or girl, the real Father Christmas will place the Mystery Science Theater 3000 version of this flick in your stocking, rather than the stinking lump of coal that is the uncut original.
I bring it up because: Four years ago tomorrow -- Dec. 18, 2005 -- the Auckland, New Zealand SantaCon supposedly erupted in a violent riot that included looting and property damage. The event was organized -- allegedly -- in the online forums of the skateboard magazine Muckmouth by one Alex Dyer, who claimed it was merely an excuse for drunken revelry that got out of hand. Like all Santa legends, the truth of the Auckland Santarchy has little to do with the public perception of the event. As such, SantaCon (or, at least, Santarchy) is developing a myth of its own, one that's far more often naughty than nice.
Tuesday, December 15, 2009
Truly Trivial: Who is the original creator of Festivus? (Hint: It ain't Seinfeld)
Today is my daughter's birthday, and thus I have no plans to do any actual work, so enjoy a recycled Truly Trivial from my old Geek Trivia days:
On Dec. 18, 1997, the Seinfeld episode “The Strike” aired for the first time, introducing the world to the now infamous faux holiday, Festivus. Billed as a counterpoint to the perceived increasing commercialism of Christmas (even though said commercialism is vital to the economy), Festivus — the so-called “holiday for the rest of us” — struck a chord with audiences, and real-world celebrations of this fictional festivity have been on the rise ever since. ...
Lost in all this Festivus revelry is the fact that, despite Seinfeld’s role in popularizing Festivus, the holiday is not original to the sitcom. In fact, Festivus was over 30 years old when “The Strike” first aired [more than] a decade ago.
Ironically, for a holiday ostensibly devoted to denouncing commercialization, Festivus may have been commercialized to the point of obscuring its own origins.
WHO IS THE ORIGINAL CREATOR OF THE FAUX HOLIDAY FESTIVUS?

Read the complete Q&A here.
Thursday, December 10, 2009
Nerd Word of the Week: God particle
God particle (n.) - Nickname for the Higgs boson, a theoretical elementary particle that -- if proven to exist -- could explain many inconsistencies in the so-called Standard Model of particle physics. The search for the Higgs boson, and its presumed importance, have given it something of a cult following within both science and science fiction, with the "god particle" becoming both a media darling for science journalists and a convenient plot device for authors and screenwriters. In Robert J. Sawyer's novel Flashforward, it was a particle accelerator's attempt to identify the Higgs boson that temporarily transported all of humanity's consciousness 20 years into the future -- a plot point so far absent from the FlashForward TV series based on Sawyer's book. John Ringo's novel Into the Looking Glass uses Higgs boson experiments as the catalyst for an explosion that allows alien invaders to enter our dimension. The film version of Dan Brown's novel Angels and Demons references the Higgs boson repeatedly, though often with dubious scientific accuracy.
I bring it up because: The Large Hadron Collider at CERN -- an instrument designed largely to find the Higgs boson -- became the highest-energy particle accelerator in human history this week (breaking its own record) when it slammed together two 1.18-teraelectronvolt proton beams to create a 2.36 TeV collision. As raw kinetic energy goes, that's tiny, but as a concentration of energy in a subatomic volume, it's astronomical. To grossly oversimplify, if physicists can pack enough energy into an extremely small space, they hope to recreate the conditions necessary to create or observe otherwise unfindable exotic particles -- like the Higgs boson. We're likely still years away from that, but with the LHC online and hard at work crunching subatomic particles at notable fractions of lightspeed, odds are that the notion of a god particle -- if not its outright discovery -- will remain a regular subject of our science and our fiction.
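For the curious, the arithmetic behind those numbers is simple -- a back-of-the-envelope sketch using standard conversion factors, not anything out of CERN's press kit. For two equal-energy beams colliding head on, the collision energy is just twice the per-beam energy:

$$\sqrt{s} \approx 2E_{\text{beam}} = 2 \times 1.18\ \text{TeV} = 2.36\ \text{TeV}$$

$$2.36\ \text{TeV} \approx 2.36 \times 10^{12}\ \text{eV} \times 1.6 \times 10^{-19}\ \text{J/eV} \approx 3.8 \times 10^{-7}\ \text{J}$$

That's roughly 0.4 microjoules -- on the order of the kinetic energy of a single flying mosquito -- but crammed into a subatomic volume, which is what makes it astronomical.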
Tuesday, December 08, 2009
Truly Trivial: What technology was the subject of 1968's famous "Mother of All Demos?"
On Dec. 9, 1968, a watershed moment in computer science and consumer electronics took place: a technology demonstration now known as The Mother of All Demos. The events of the MoAD arguably lit the fuse on the technological, cultural, and economic conflagration that produced the modern personal computer. It's just that no one really knew it at the time.
A pair of researchers named Douglas Engelbart and Bill English coordinated the MoAD under the somewhat immodest title of "A research center for augmenting human intellect" at the 1968 Fall Joint Computer Conference in San Francisco. In attendance at the FJCC was Butler Lampson, who went on to help found Xerox's Palo Alto Research Center (PARC) in 1970. Two years after that, Lampson wrote a memo titled "Why Alto?" which outlined his vision for a new type of computer -- the Xerox Alto -- based in part on what he saw at the Mother of All Demos.
The Xerox Alto is rather infamous in computing circles. First, for the many technologies the Alto integrated into what we now recognize as a modern personal computer, including a mouse, Ethernet networking, file servers, and a graphical user interface built around a desktop metaphor. Second, because Xerox decided there wasn't a market for the Alto and refused to produce it commercially. The latter point is often considered one of the great missed opportunities in business history.
You'll likely not be surprised to learn that a young Steve Jobs got a first-hand look at the Alto at Xerox PARC in 1979, and it inspired him to build the first Macintosh. The Mac, in turn, so impressed Bill Gates that he licensed elements of its GUI from Apple for Windows 1.0 -- which Microsoft went on to sell for IBM PCs and their clones with some multibillion-dollar, world-changing success. (Though there was some rather ugly fallout from the Windows-versus-Mac similarity, some of which still rages today.)
Thus, you can draw a straight line from the Mother of All Demos to whichever personal computer -- be it a Mac, a PC, or a GUI flavor of Linux -- is sitting on your desk right now. Lost in all the fervor, however, was the actual technology displayed at the MoAD that inspired it all.
What technology was the subject of 1968's famous Mother of All Demos?
Monday, December 07, 2009
What force can bend even PR reps to his will? Cthulhu!
As a guy who currently earns much of his living from reviewing stuff, I get blasted with my fair share of press releases, most of which hold little interest for me. Thus, it makes my Monday to get a little honest PR representing rightful fear and worship of The Elder Gods, like the missive Tor sent me today. I recount it all for your benefit below:
Hi Jay,
This December, take a break from sparkly vampires and annoying good cheer with regular stops at Tor.com, where every day we’ll be tempting the Great Old Ones to awaken for our inaugural Cthulhu-mas, a month dedicated to all things Lovecraft.
Upcoming features include posts from Weird Tales editorial director Stephen Segal, an original comic from art superteam Teetering Bulb, Cthulhu-themed gift recommendations selected by Ellen Datlow, a new short story with a Lovecraftian focus, and lots more.
If you haven’t had a chance to check out the blog this month yet, here’s some of the posts you’ve missed:
*A list of H.P. Lovecraft-related titles available for 30% off all month: http://store.tor.com/
*Very special Cthulhu-mas wishes! http://www.tor.com/index.php?option=com_content&view=blog&id=58390
*Lovecraft monster drawings from Mike Mignola, Michael Whelan, John Jude Palencar, and Bob Eggleton: http://www.tor.com/index.php?option=com_content&view=blog&id=58269
*The introduction of Tor.com’s exclusive line of holiday cards: http://www.tor.com/index.php?option=com_content&view=blog&id=58384
*Patrick Nielsen Hayden on “H.P. Lovecraft, Founding Father of SF Fandom”: http://www.tor.com/index.php?option=com_content&view=blog&id=58397
As if your sanity weren’t already pushed to the brink this December.
All best,
Ami

As I've recounted before, Tor has a strikingly good PR staff who seem to get it. Invoking Lovecraft and taking potshots at Twilight both earn points in my book, and I'm pretty much the target audience for Tor products. Plus, I just finished the milSF-Cthulhu mashup "A Colder War" from Charles Stross's Toast, so I was primed for this missive either way. Still, for those of you wondering what a targeted (and geek-friendly) press release looks like, the above is a great example. Ami, keep 'em coming!
Thursday, December 03, 2009
Nerd Word of the Week: Tuckerization
Tuckerization (n.) - The use of a real person's name for a fictional character as a conscious literary in-joke. The term derives from SFWA Author Emeritus Wilson "Bob" Tucker, a science fiction writer and fanzine editor who famously appropriated the names of his friends and family for his fictional characters. A contemporary example of a serial tuckerizer is John Scalzi, who has made a habit of doing so in his novels, though Scalzi claims it's simply because he's terrible at conjuring names for his characters. (In fact, in Scalzi's novel The Ghost Brigades, he presents several artificially engineered soldiers named after famous scientists, and notes that their creators chose those names simply out of convenience, effectively as a meta-tuckerization.) Nonpersons can also be subjects of tuckerization, as in Allen Steele's novel Spindrift, where the author named a pair of space probes Larry and Jerry, after sci-fi authors Larry Niven and Jerry Pournelle -- authors who themselves were known for some famous tuckerizations.
It should be noted that tuckerization is different from including real people as fictional characters, as often happens in alt-history novels, or in the somewhat self-referencing tradition of including some version of the late writer/uberfan Forrest J. Ackerman in sci-fi works. Tuckerized characters are simply namesakes, not sci-fi versions of a roman à clef. Over the years, it has become tradition for established science fiction authors to auction off tuckerizations to benefit science fiction conventions or charitable causes.
I bring it up because: A host of spec-fic authors are auctioning off tuckerizations this week in support of the Trans-Atlantic Fan Fund, which pays for sci-fi and fantasy fans to cross the big pond and meet their counterparts on the other side of the ocean. Basically, it's an exchange program for geeks. Elizabeth Bear, David Brin, Julie Czerneda, Cory Doctorow, Nalo Hopkinson, Mary Robinette Kowal, and Charlie Stross all have TAFF-benefit tuckerizations up for auction now, with the most expensive one (Stross's) still lingering around $250. That's a very reasonable price for fan-insider literary immortality, even accounting for the price-sniping that will occur when the auctions expire on Monday. If there's an uber-nerd in your life and you've got a Benjamin or three to drop on his/her hobby, this would make a frakkin' awesome Christmahannukwanzukah-Solstice-Festivus present. (Hint, hint.) And it might even be tax-deductible.
Tuesday, December 01, 2009
Truly Trivial: What non-sci-fi book was the original basis for the alien invasion series V?
The revamped version of the television miniseries V just finished its initial four-episode "pod" run before embarking on a four-month (are you kidding me?) hiatus. When the series resumes in March 2010, it will have a new showrunner: Chuck's Scott Rosenbaum, who displaces the technically-not-fired-but-no-longer-in-charge Scott Peters, former producer of The 4400.
In some ways this switchout is just another tribute that the 2009 V miniseries is paying to the 1983 V miniseries. Or, to paraphrase another sci-fi franchise: All of this has happened to V before, and all of this will (probably) happen to V again.
The Visitors were first brought to the small screen in 1983 by Kenneth Johnson, who at the time was riding high as creator of The Bionic Woman and The Incredible Hulk TV series. During its original two-episode, four-hour run, V garnered a 25 share and over 40 million viewers -- which meant that a sequel was all but assured. In 1984, ABC television rolled out a followup miniseries, V: The Final Battle -- a series produced without Kenneth Johnson.
Johnson cowrote the original Final Battle draft script but ABC decided that it would be too expensive to produce and fired Johnson prior to rewrites. Many of Johnson's ideas survived the transition and thus ABC wanted to credit him as a writer on the sequel series. Likely in protest, Johnson is instead credited under the pseudonym Lillian Weezer. (For the record, you almost never outright refuse a writing credit because refusing endangers your royalty position.)
Thus, firing your showrunner prior to a cliffhanger is a V tradition. So is, apparently, the network wanting to "change the direction" of the series -- a tradition that started before the series even began production. Johnson, you see, originally pitched V without the Visitors as a non-sci-fi miniseries based on a non-sci-fi book. It was ABC's idea to "Star Wars" it up, which may be the only time in history that a TV network has asked for a series to be more sci-fi.
WHAT NON-SCI-FI BOOK WAS THE ORIGINAL INSPIRATION FOR V?
Thursday, November 26, 2009
Nerd Word of the Week: Vatvegan
Vatvegan (adj.) - A term coined just days ago on Twitter by William Gibson, referring to a vegan who is nonetheless willing to consume artificially grown meat. Specifically, someone who has moral objections to eating conventional meat but is willing to consume vat-grown pseudo-animal products, so long as the substance consumed was not capable of experiencing pain. Artificial and/or moral meats have a long tradition in spec-fic -- from the vat-meats mentioned offhand in Robert Heinlein's The Door Into Summer, to the sentient self-slaughtering cows found in Douglas Adams's The Restaurant at the End of the Universe, to the replicated meat found throughout the Star Trek mythos -- so vatvegans have been waiting for the goods for a long, long time.
I bring it up because: Today is Thanksgiving in the United States, a day of feasting gleefully without much concern for potential (if overhyped) frankenfood farmageddon or the lessons of The Omnivore's Dilemma. More specifically, Time named the nascent possibility of vat-grown meat one of the top technologies of 2009. Thus, we may be on the verge of sidestepping many of the moral concerns that have stirred so many to vegetarianism (and into the ranks of PETA), leaving us only with the health risks associated with vat-meat. One wonders whether the next foodie counterculture movement won't be a rejection of "establishment" vat-meat in favor of natural, farm-fresh animal flesh, with the next generation of suburban hippies defined as avowed carnivores rather than proselytizing meat-abstainers. Now that would be ironic.
Tuesday, November 24, 2009
Truly Trivial: What US President tried to reschedule Thanksgiving, and why?
A short work week at NotebookReview -- including preparations for Black Friday -- means we're going back to the Geek Trivia well in lieu of a fresh Truly Trivial column, but it's a timely and popular retread. The key excerpt below explains.
Sales volume typically spikes on Black Friday, then plummets back down the following Monday. The sales trend then ramps back up as Christmas approaches, with the two Saturdays and two Sundays nearest to but before Christmas eventually overtaking Black Friday in sales volume. (December 23 is always a good bet to be near the top as well.) The last-minute shopper is a statistical -- and economically powerful -- reality.
Despite Black Friday's somewhat overstated reputation, one U.S. president so respected the economic power wielded by the Christmas shopping season that he tried to extend it a week -- by rescheduling Thanksgiving.
WHICH U.S. PRESIDENT WANTED TO RESCHEDULE THANKSGIVING TO ACCOMMODATE CHRISTMAS SHOPPING?
The full Q&A is here.
Thursday, November 19, 2009
Nerd Word of the Week: Whuffie
Whuffie (n.) - A form of currency based on social status, rather than tangible wealth. The Whuffie originated in Cory Doctorow's science fiction novel Down and Out in the Magic Kingdom. Whuffie has since become an insider term for online reputation, with such non-fiction works as The Whuffie Factor taking the word mainstream. In Doctorow's original novel, Whuffie (which is both the plural and singular form) replaced money as the means of acquiring wealth, though "wealth" in that setting meant something very different. Down and Out is set in a post-scarcity economy, where necessities and luxuries are so easily and cheaply produced they have no practical cost. Thus, the economy was based on status rather than property, and complex systems were set up to upvote and downvote your every action, such that every activity became a public political performance. If you think that's an unlikely future, please say so in the syndicated comments below this post, and/or downvote (using the controversial new dislike button) the Facebook note version of this blog. Possibly using your always-on Internet phone. Which is itself a status symbol. The point (and the sarcasm) should be evident by now.
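For the programmers in the audience, the core mechanic is almost embarrassingly simple to sketch. The toy ledger below is purely illustrative -- my own hypothetical names throughout, not anything from Doctorow's novel, The Whuffie Bank, or LoveMachine:

```python
# A toy Whuffie-style reputation ledger. Hypothetical illustration only --
# "wealth" here is standing, not property, and every vote nudges it.
from collections import defaultdict

class WhuffieLedger:
    def __init__(self):
        self.scores = defaultdict(int)  # person -> running reputation score

    def upvote(self, voter, target, weight=1):
        # A real system would weight votes by the voter's own standing,
        # decay old votes, and resist sockpuppets; this one just adds.
        self.scores[target] += weight

    def downvote(self, voter, target, weight=1):
        self.scores[target] -= weight

    def standing(self, person):
        return self.scores[person]

ledger = WhuffieLedger()
ledger.upvote("alice", "jay")    # somebody liked this post
ledger.downvote("bob", "jay")    # somebody hit the dislike button
print(ledger.standing("jay"))    # net Whuffie: 0
```

Everything hard about a real Whuffie -- ubiquity, trust, resistance to gaming -- lives in those comments, which is rather the point.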
I bring it up because: The New Oxford American Dictionary declared the verb unfriend 2009's Word of the Year. When removing someone from your virtual social circle is so commonplace and mainstream as to get pub from Oxford's lexicographers, you know that virtual, pervasive reputation management is here to stay. Combine that with the relatively ballyhooed debut of LoveMachine, a real-life analogue of the Whuffie -- a virtual reputation-management system created by Second Life founder Philip Rosedale. Granted, The Whuffie Bank already got there and already has the Whuffie name. Also, the Penguicon science fiction-slash-Linux convention has recently experimented with an in-con virtual reputation currency called a Whuffie, though in practice it worked a bit more like Slashdot's karma system. That may be a distinction without a difference, but I'll leave it to my Whuffie score to sort it out.
Tuesday, November 17, 2009
Truly Trivial: How many planets are known to have liquid surface water?
Last Friday, NASA announced that its LCROSS mission -- also known as "let's slam a probe into the lunar surface while another probe films the action, Jackass-style" -- found significant amounts of water on the moon. This revved up interest in lunar colonization again, despite the fact that there isn't any pure, drinkable, liquid water on the moon.
Now, water is very useful for space exploration even if you can't directly drink it, as simple electrolysis lets you crack water into breathable oxygen and hydrogen rocket fuel. The moon just became much more interesting as a way station for spacecraft, as robot factories could be sent ahead to stockpile hydrogen, oxygen, and possibly even purified drinkable water for manned craft heading farther out into the solar system. Moreover, water is a very effective radiation shield, provided you can surround a crew cabin with a two-meter-deep tank of H2O on all sides. The "significant amounts" of lunar water mean moonbases might just be able to extract the Olympic swimming pools' worth of water necessary to ward off the cosmic fry-daddy rays awaiting space travelers.
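For reference, the chemistry involved is straightforward high-school stoichiometry -- a back-of-the-envelope sketch using standard atomic weights, not any actual NASA mission spec:

$$2\,\mathrm{H_2O} \longrightarrow 2\,\mathrm{H_2} + \mathrm{O_2}$$

By mass, water is roughly 11 percent hydrogen and 89 percent oxygen, so every metric ton of lunar ice you can melt and crack yields on the order of 110 kg of hydrogen fuel and 890 kg of oxygen to breathe or burn.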
As Karl Schroeder points out, the main barrier to manned spaceflight isn't technology but the expense of hauling the necessary materials into orbit. Launching these materials off the moon requires climbing out of one-sixth the gravity well that exists on Earth, which means there's an economic argument to be made for gathering spaceflight materials from a lunar base rather than launching them from Earth with the astronauts. Besides, if we can perfect the robot factories on the moon, we can get serious about mining comets, which have a lot more water-ice mass per capita and a lot less gravity. Robot comet iceminers mean that manned spacecraft could have a plethora of automated air, fuel, and shielding depots scattered throughout the Interplanetary Transport Network.
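To put rough numbers on that gravity-well argument -- my own napkin math using textbook escape velocities, not Schroeder's figures -- the Moon's surface gravity is about one-sixth of Earth's, and measured in energy per kilogram the advantage is even bigger:

$$\frac{E_{\text{Moon}}}{E_{\text{Earth}}} = \frac{v_{\text{esc,Moon}}^2}{v_{\text{esc,Earth}}^2} \approx \frac{(2.4\ \text{km/s})^2}{(11.2\ \text{km/s})^2} \approx 0.05$$

Climbing all the way out of the Moon's gravity well costs roughly one-twentieth the energy per kilogram of climbing out of Earth's, which is exactly why shipping water, fuel, and shielding from a lunar depot is such attractive orbital economics.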
Thus, the lunar water discovery is very exciting, but it's unlikely to lead directly to the holy grail of space exploration -- the discovery of extraterrestrial life. Virtually all life as we know it requires water to exist, particularly liquid water that appears on the surface of the world, where it can be exposed to nurturing (and mutating) solar radiation. We've found hundreds of extrasolar planets and thousands of extraterrestrial non-stellar bodies, but liquid surface water remains stubbornly rare.
How many planets besides Earth are known to possess liquid surface water?
Thursday, November 12, 2009
Nerd Word of the Week: Cryptid
Cryptid (n.) - A creature that is rumored to exist but no one has managed to prove is real. Bigfoot, chupacabras, the Mothman, and the Jersey Devil are among the more famous examples of cryptids. Cryptozoology is the "scientific" study of cryptids, though (much like ufology) the field is often dominated by fringe scientists and conspiracy theorists who seem ill-acquainted with the actual scientific method. (There's also a subfield called cryptobotany, which seems mostly obsessed with finding giant man-eating plants. Little Shop of Horrors apparently was inspired by true events.)
Cryptids, to no one's surprise, make excellent fodder for speculative fiction. A. Lee Martinez's novel Monster is a recent example, though the trend is hardly limited to books. The animated series The Secret Saturdays chronicles the adventures of a family of cryptid hunters, depicted in a Jack Kirby pulp-hero style. Moreover, giant cryptids seem to have a rolling contract with Syfy, as the channel has become infamous for cheeseball original cryptid-kaiju movies like Mega Shark vs. Giant Octopus and Dinocroc vs. Supergator.
I bring it up because: A mere 76 years ago today, the first known photographs of the world's most famous cryptid, the Loch Ness Monster, were taken. On Nov. 12, 1933, Scotsman Hugh Gray snapped the first purported pic of Nessie, though to most eyes it looks like a blurry image of a dog swimming with a stick in its mouth. 1933 was the year the Nessie phenomenon gained media attention, with multiple sighting reports published in papers that year. It was only a year later that the world-famous but since-discredited Surgeon's Photo of the Loch Ness Monster was taken, proving that hoaxsters trying to horn in on the cryptid action are a time-honored tradition, even if cryptozoology itself isn't.
Wednesday, November 11, 2009
We're on the verge of the 4th Epoch of memory
Before we begin, a hat tip goes out to Phil "Bad Astronomy" Plait for disseminating this collection of Carl Sagan quotes, which inspired this little observation. As such, I'll begin with Sagan's own words, which invariably outstrip my own:

"When our genes could not store all the information necessary for survival, we slowly invented brains. But then the time came, perhaps ten thousand years ago, when we needed to know more than could conveniently be contained in brains. So we learned to stockpile enormous quantities of information outside our bodies. We are the only species on the planet, so far as we know, to have invented a communal memory stored neither in our genes nor in our brains. The warehouse of that memory is called the library."

-- "Persistence of Memory," Cosmos

I posit that the advancement of our (and every) species is defined by the limits of our memory. As we overcome the inherent design limitations of each type of memory, we move into a new Epoch of Memory, and a new level of complexity as both a society and a species.

Within Sagan's words we observe the first three Epochs of Memory. Like all species on Earth, we first stored memory solely in our genes. Not just the memory of our biological selves, but whatever rote, instinctual behaviors still dominate our actions today. Even viruses, which exhibit nothing so complex as "thought" or "intention," nonetheless possess a form of genetic memory that impels them to propagate. This was our First Epoch of Memory, when our genes made of us all that we could ever be. The design limitation of genetic memory was its dependence on mutation to spur advancement. All learning was random. Carried to its logical conclusion, the First Epoch led to a random mutation that improved our memory.
The Second Epoch saw the rise of brains, organs designed explicitly to deal with information. We (or, rather, our genetic forebears) could now remember information and apply that memory long before the implications of that data became encoded in our genes. We need not wait for some random mutation to give us an advantageous instinctive aversion to chewing on hemlock leaves; we could now remember the illness we suffered after first consuming hemlock and, if we survived, choose not to eat it again. Many species developed brains and applied them to this explicit advantage.
Carried to its logical conclusion, the Second Epoch gave rise to language, a collection of symbols that could be remembered collectively by a group and used to communicate information not directly discerned by an individual. Now, if one member of the tribe learned not to eat hemlock, all the members of the tribe could benefit from that knowledge. The limitation of language was tied, initially, to the limitation of our physical memory. We could only remember so much, and the fidelity of that memory often suffered when it was transferred -- as in, told -- to another individual. The children's game of telephone is the classic exhibition of oral history's limitations.
Language, carried to its logical conclusion, gave rise to the Third Epoch of Memory, a time when information could be stored outside the individual in tangible form. Books, for most of our history. Now, so long as you had the basic knowledge necessary to decipher the encoding language, you could benefit from the collective experience of the entire species. The limitation of external memory is organization and availability. We no longer carry information with us; we must actively seek it out, and then apply it contextually.
Books, carried to their logical conclusion, now place us on the precipice of the Fourth Epoch of Memory. In this Epoch, seeking out, organizing, and contextually applying information becomes an external process. Put more simply, the Fourth Epoch of Memory arrives when externally stored information is indistinguishable from internally stored information. What we know won't be limited to what we can remember, because we can access external memory (the Internet, et al.) with the same ease, speed, and faculty as internal memory (our brains).
A combination of high-speed mobile communications, augmented reality interfaces, and massive free, searchable libraries of information are only just now being born. But when this mass of external memory that we have spent centuries exponentially expanding is suddenly available in perfect context at the perfect moment for everyone, we will have become a fundamentally different species all over again. What we know will be limited only by our curiosity and our bandwidth speed.
I, for one, can't wait for the Fourth Epoch to arrive. As Jamais Cascio warns us, this era won't be without its drawbacks, but I think it will be well worth the price of admission. I only wish Carl had been here to see it.
"When our genes could not store all the information necessary for survival, we slowly invented brains. But then the time came, perhaps ten thousand years ago, when we needed to know more than could conveniently be contained in brains. So we learned to stockpile enormous quantities of information outside our bodies. We are the only species on the planet, so far as we know, to have invented a communal memory stored neither in our genes nor in our brains. The warehouse of that memory is called the library."I posit that the advancement of our (and every) species is defined by the limits of our memory. As we overcome the inherent design limitations of each type of memory, we move into a new Epoch of Memory, and a new level of complexity as both a society and a species.
-- "Persistence of Memory," Cosmos
Within Sagan's words we observe the first three Epochs of Memory. Like every species on Earth, we first stored memory solely in our genes -- not just the blueprint of our biological selves, but also the rote, instinctual behaviors that still dominate our actions today. Even viruses, which exhibit nothing so complex as "thought" or "intention," nonetheless possess a form of genetic memory that impels them to propagate. This was our First Epoch of Memory, when our genes made of us all that we could ever be. The design limitation of genetic memory was its dependence on mutation to spur advancement; all learning was random. Carried to its logical conclusion, the First Epoch led to a random mutation that improved our memory.
The Second Epoch saw the rise of brains, organs designed explicitly to deal with information. We (or, rather, our genetic forebears) could now remember information and apply that memory long before the implications of that data became encoded in our genes. We need not wait for some random mutation to give us an advantageous instinctive aversion to chewing on hemlock leaves; we could now remember the illness we suffered after first consuming hemlock and, if we survived, choose not to eat it again. Many species developed brains and applied them to exactly this advantage.
Carried to its logical conclusion, the Second Epoch gave rise to language, a collection of symbols that could be remembered collectively by a group and used to communicate information not directly discerned by an individual. Now, if one member of the tribe learned not to eat hemlock, all the members of the tribe could benefit from that knowledge. The limitation of language was tied, initially, to the limitation of our physical memory. We could only remember so much, and the fidelity of that memory often suffered when it was transferred -- as in, told -- to another individual. The children's game of telephone is the classic demonstration of oral history's limitations.
Language, carried to its logical conclusion, gave rise to the Third Epoch of Memory, a time when information could be stored outside the individual in tangible form -- books, for most of our history. Now, so long as you had the basic knowledge necessary to decipher the encoding language, you could benefit from the collective experience of the entire species. The limitation of external memory is organization and availability: we no longer carry the information with us, so we must actively seek it out and then apply it contextually.
Books, carried to their logical conclusion, now place us on the precipice of the Fourth Epoch of Memory. In this Epoch, seeking out, organizing, and contextually applying information becomes an external process. Put more simply, the Fourth Epoch of Memory arrives when externally stored information is indistinguishable from internally stored information. What we know won't be limited to what we can remember, because we will access external memory (the Internet, et al.) with the same ease, speed, and faculty as internal memory (our brains).
The combination of high-speed mobile communications, augmented reality interfaces, and massive, free, searchable libraries of information is only just now being born. But when this mass of external memory that we have spent centuries exponentially expanding is suddenly available in perfect context at the perfect moment for everyone, we will have become a fundamentally different species all over again. What we know will be limited only by our curiosity and our bandwidth.
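(A quick aside for the programmers in the audience: here's a minimal, purely hypothetical Python sketch of what "externally stored information is indistinguishable from internally stored information" would look like in code. Every name in it is my own invention for illustration; the point is the single recall() call, not the plumbing.)

# A toy model of Fourth Epoch recall: the caller asks one question and never
# knows (or cares) whether the answer came from "internal" or "external" memory.
# external_lookup() is a hypothetical stand-in for search/AR layers, not a real API.

INTERNAL_MEMORY = {
    "hemlock": "made me sick once -- do not eat",
}

EXTERNAL_MEMORY = {
    "laika": "first living creature launched into orbit, Sputnik 2, 1957",
}

def external_lookup(query: str) -> str:
    # Pretend this is a high-speed mobile link to a searchable global library.
    return EXTERNAL_MEMORY.get(query, "no answer found")

def recall(query: str) -> str:
    # Internal memory answers first; external memory fills the gaps seamlessly.
    return INTERNAL_MEMORY.get(query) or external_lookup(query)

print(recall("hemlock"))  # answered from "internal" memory
print(recall("laika"))    # answered from "external" memory, just as easily

Once the fallback is that seamless, the distinction between what you remember and what you can look up stops mattering -- which is the whole point of the Fourth Epoch.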
I, for one, can't wait for the Fourth Epoch to arrive. As Jamais Cascio warns us, this era won't be without its drawbacks, but I think it will be well worth the price of admission. I only wish Carl had been here to see it.
Tuesday, November 10, 2009
Truly Trivial: What sci-fi novel coined the phrase 'computer worm'?
Once again, I am overburdened by day jobbery and must resort to recycling one of my old TechRepublic Geek Trivia posts for this week's Truly Trivial column. Don't worry; it's timely. Twenty-six years ago today, the computer virus was born. Twenty-one years ago last week, perhaps the most famous computer worm ever, the Morris Worm, was launched. Which brings me to my original article:
The effects of the Morris Worm were so widespread and pronounced that it made the national news (quite a feat in 1988), and it eventually earned Morris a landmark, if decidedly unintimidating, conviction: three years' probation, 400 hours of community service, and a $10,050 fine. In geek circles, people sometimes referred to the Morris Worm as the Great Worm, a reference to the Great Worms (i.e., dragons) found in J. R. R. Tolkien's Lord of the Rings series.
This, of course, was not why we called Morris' creation a worm, as these types of malicious programs (distinct from computer viruses) owe their etymology to a work of science fiction, rather than fantasy.
WHAT WORK OF SCIENCE FICTION COINED THE TERM COMPUTER WORM?
You can find the complete original Q&A here.
Monday, November 09, 2009
Short story: Crimes Against Science Fiction
I want to give a shout-out to Suzanne Vincent and the editing crew over at Flash Fiction Online, who gave this story one of the kindest and most encouraging rejection letters I've ever received. This was really my first stab at flash fiction, and as such it's rather derivative -- I was channeling a bit of Douglas Adams when I threw this together -- with a weak ending. I had fun with it, but it's time to put this little experiment out to pasture. Thus, Crimes Against Science Fiction is trunked here for your enjoyment.
---
When Tommy stumbled half-awake out of his apartment this morning, he didn't expect that the future would be waiting outside to kill him. Or that the future would be so fat.
"Stand fast, and prepare to answer for your crimes!" the one of the left shouted, pointing an exotic firearm at Tommy. The assailant was dressed in a flamboyantly colorful excuse for a military uniform, though the garment was clearly intended for someone far thinner than the rotund, pimply-faced gent breathing heavily in Tommy's parking space.
The one on the right interrupted. "Wait, we have to explain his crimes to him first. He isn't guilty of them yet." This one was slightly taller, and rail thin, except for the almost comically out-of-place beer belly swelling beneath the belt of his…er…outfit. He was wearing some bizarre hybrid of 17th century samurai armor and 21st century leather fetish gear while carrying a wildly impractical, oversized sword.
Thursday, November 05, 2009
Nerd Word of the Week: Spam in a can
Spam in a can (n.) - Space program slang term for a passive occupant in a spacecraft, specifically a space capsule. The phrase is generally attributed to Chuck Yeager, if only because he's shown describing the Mercury astronauts as "spam in a can" in the movie The Right Stuff, though there is ample evidence that multiple astronauts and NASA officials used the term liberally during the 1960s Space Race. The early Mercury astronauts, all trained military pilots, are known to have resisted being mere "spam in a can" with no active control of their vessels, thus forcing a level of human direction into early space vehicles.
Spam in a can is now typically used as a snarky criticism of the current level of manned spaceflight technology, as humans still travel as meat packed into primitive metal containers and shipped long distances. This falls under the sensawunda criticism of NASA -- particularly the space shuttle successor Project Constellation, which has been described as "Apollo on steroids" -- in that we are still not creating or using the sci-fi-inspired tech that books and movies have promised us for decades. Warren Ellis and Colleen Doran rather deftly pointed out the spam-in-a-can disappointment factor with NASA in the graphic novel Orbiter, wherein an alien intelligence redesigns our "primitive" space shuttle into a true interplanetary exploration vehicle.
I bring it up because: Laika passed away 52 years ago on Tuesday. For those who don't know the name, Laika was the first living creature that humans sent into space. She was a Soviet space dog launched aboard Sputnik 2 on Nov. 3, 1957. She died from overheating a few hours after launch, making her the first spaceflight casualty. Her likeness is preserved in a statue at the cosmonaut training facility in Star City, Russia, where she is honored as her nation's first space traveler. Telemetry from her mission showed that living beings could survive launch g-forces and weightlessness, thus proving that spam in a can was a viable manned spaceflight model.
I also mention the spam in a can principle as a corollary to Charles Stross's recent thought-experiment blog post, How habitable is the Earth? Stross essentially argues that humans are explicitly designed for a particular fraction of Earth's environment that exists during a hyper-minute fraction of Earth's geological history, thus making human space exploration -- which removes us from that environment -- a terribly difficult and expensive undertaking. Karl Schroeder recently counter-argued (against the point, not against Stross personally) that most of these problems are surmountable if we get launch expenses down and can put the proper equipment -- all of which already exists -- into orbit cheaply. Which gets us back to the spam in a can criticism: until the tech gets better, large-scale human space exploration is a pipe dream.
Spam in a can is now typically used as a snarky criticism of the current level of manned spaceflight technology, as humans are still travelling as meat packed into primitive metal containers and shipped long distances. This falls under the sensawunda criticism of NASA -- particularly the space shuttle successor Project Constellation, which is described as "Apollo on steroids" -- in that we are still not creating or using the sci-fi-inspired tech that books and movies has promised us for decades. Warren Ellis and Colleen Doran rather deftly pointed out the spam-in-a-can disappointment factor with NASA in the graphic novel Orbiter, wherein an alien intelligence redesigns our "primitive" space shuttle into a true interplanetary exploration vehicle.
I bring it up because: Laika passed away 52 years ago on Tuesday. For those that don't know the name, Laika was the first living creature that humans sent into space. She was a Soviet space dog launched aboard Sputnik 2 on Nov. 3, 1957. She died from overheating a few hours after launch, thus making Laika the first spaceflight casualty. Her likeness is preserved in a statue at the cosmonaut training facility in Star City, Russia, as her nation's first space traveler. Telemetry from her mission proved that living beings could survive launch g-forces and weightlessness, thus proving that spam in a can was a viable manned spaceflight model.
I also mention the spam in a can principle as a corollary to Charles Stross's recent thought experiment blog post, How habitable is the Earth? Stross essentially argues that humans are explicitly designed for a particular fraction of Earth's environment that exists during a hyper-minute fraction of Earth's geological history, thus making human space exploration -- which removes us from this environment -- a terribly difficult and expensive undertaking. Karl Schroeder recently counter-argued (the point, not Stross) that most of these problems are surmountable if we get launch expenses down and can get the proper equipment -- all of which already exists -- into orbit cheaply. Which gets us back to the spam in a can criticism: Until the tech gets better, large-scale humans space exploration is a pipe dream.
Tuesday, November 03, 2009
Truly Trivial: What was the original written formulation for E=mc2?
The old trivia geek is bogged down with day job wonkery, so I'm reprinting this old-school Geek Trivia from my TechRepublic days, "Eye for an Einstein." I quote from it thusly:
Einstein proved that even apart from movement, all objects that possess mass also possess energy. You can find out exactly how much energy with the simple calculation of mass multiplied by the square of the speed of light, E=mc2.
The crazy thing is, you won’t find this famous equation in any of Einstein’s papers published before, during, or after 1905. That’s because Einstein never wrote his most famous equation in its most famous form.
Instead, he earned his scientific accolades expressing energy-mass equivalence very differently.
WHAT WAS THE ORIGINAL FORMULATION OF E=mc2 AS WRITTEN BY EINSTEIN?
Get the full Q&A here.
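Just to put the "simple calculation" above in perspective -- this is my own back-of-the-envelope illustration, not part of the original column -- here's E=mc2 applied to a single gram of mass, in Python:

# Energy equivalent of one gram of mass, via E = m * c^2 (SI units throughout).
c = 299_792_458           # speed of light, in meters per second
m = 0.001                 # one gram, expressed in kilograms
E = m * c ** 2            # energy in joules
print(f"{E:.3e} joules")  # roughly 9.0e13 J, on the order of 20 kilotons of TNT

However Einstein originally wrote the relationship down, that's the punchline: a sliver of mass stands in for a staggering amount of energy.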
Thursday, October 29, 2009
Nerd Word of the Week: Fail Whale
Fail Whale (n.) - Nickname for the custom over-capacity error page of the Twitter microblogging service. Also a geek-slang curse invoked whenever something goes wrong, especially if that something is Internet-related. The Fail Whale page depicts a flock of the Twitter mascot birds attempting to hoist a cartoon whale into the air, and includes the phrase "Twitter is over capacity." The Fail Whale usually appears when the Twitter servers are overloaded, typically as a function of the service's burgeoning popularity. Thus, the Fail Whale often shows up just when the most Twitter users are paying attention. The Fail Whale has become both a beloved and a reviled symbol for Twitter and for the growing-pain-ridden universe of social networking Web applications. You can now buy Fail Whale merchandise and join the Fail Whale fan club.
I bring it up because: 40 years ago today, the ancient Internet ancestor of the Fail Whale was born -- on the same day as the Internet itself. On Oct. 29, 1969, the first communication between two ARPANET-linked computers occurred when the SDS Sigma 7 host at UCLA sent the following message to the SRI SDS 940 host at Stanford: "lo." That's the letters L and O, transmitted in lowercase. No, that isn't the earliest l33t-speak version of hello ever recorded. UCLA was trying to send the login command to the Stanford system, but the ARPANET link failed two letters in. Sort of like when your Twitter post times out and the page refreshes to a cartoon rendering of several painfully peppy songbirds trying to hoist an equally over-happy humpback into the sky. In other words, on the day we invented the Internet, we also invented the Fail Whale.
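(One last aside for the technically curious: underneath a page like the Fail Whale sits an ordinary HTTP "service unavailable" response. The sketch below is a generic, purely hypothetical Python stand-in for that mechanic -- it is not Twitter's actual code, headers, or page.)

# A generic "over capacity" page served as an HTTP 503 response.
# Purely illustrative; run it and visit http://localhost:8000/ to see the result.
from http.server import BaseHTTPRequestHandler, HTTPServer

class OverCapacityHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        body = b"<html><body><h1>We're over capacity. Please try again shortly.</h1></body></html>"
        self.send_response(503)                    # 503 Service Unavailable
        self.send_header("Retry-After", "30")      # hint: come back in 30 seconds
        self.send_header("Content-Type", "text/html")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    HTTPServer(("localhost", 8000), OverCapacityHandler).serve_forever()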