The personal blog of Jay Garmon: professional geek, Web entrepreneur, and occasional science fiction writer.
Thursday, January 28, 2010
Nerd Word of the Week: Econopocalypse
Econopocalypse (n.) -- Slang term for a sudden and catastrophic economic calamity, one that results in the rise of a dystopian or even post-apocalyptic society. While the term is relatively new, econopocalyptic fiction isn't -- we just usually call it dystopian fiction and ignore its economic bent. George Orwell's 1984, Ayn Rand's Anthem, and Lois Lowry's The Giver are all arguably econopocalyptic fiction, though at least in Lowry's case the economic aspect of the dystopia -- state control of the economy -- is secondary to the state control of human consciousness. This brings up an important point: real-world economic distress often leads to a rise in the creation and consumption of dystopian or apocalyptic fiction, though rarely is that fiction directly econopocalyptic.
I bring it up because: The econopocalypse is the new zombie apocalypse, at least according to Barack Obama's State of the Union Address last night. It was interesting to juxtapose the political crossfire over how to combat the presumed jobless recovery we're staring down after the subprime crisis with Apple's fanboy-entrancing release of the iPad, a computer that is part phone, part laptop, and all status symbol. Clearly, we're not nearing a post-consumerist society, so far as St. Jobs is concerned, but then Steve wouldn't mind being the Big Brother in charge of the new media economy. Maybe that's why the iPad didn't ship with a user-facing camera -- too much of a tipoff that Big Steve is watching. In any case, times of economic unrest often inspire dystopian fiction, but whether we stay on the zombie track (as is indicated by AMC's Frank Darabont-helmed option on Robert Kirkman's Walking Dead series) or get a new econopocalypse-styled dystopian breed in line with Jeff Somers's The Electric Church remains to be seen. In either case, a lack of money will probably be good for the downer spec-fic business. Ironic.
Tuesday, January 26, 2010
Truly Trivial: How did Apple infamously deal with its unsold inventory of Lisa computers in the 1980s?
This trivia geek has been under the weather lately, so I'm recklessly going to the bullpen again for this week's Truly Trivial, resurrecting an old (but timely) Geek Trivia column. Apple is set to announce its new tablet (or so we've been led to believe) in the next few hours. The uber-anticipated device is supposed to revolutionize publishing, gaming, Web surfing, commerce, race relations, economic disparity, and possibly even the quantum structure of the universe -- at least according to Apple fanboys. How soon we forget that other Apple products have had similarly anticipated debuts only to fail miserably. No matter how certain you are that the Apple tablet is a can't-miss consumer device, just remember the sad tale of the Apple Lisa:
The Apple Lisa was the first commercially available stand-alone PC to employ both a graphical user interface (GUI) and a mouse. Developed under the direct supervision of Apple cofounder Steve Jobs, the Lisa was intended to revolutionize office computing as an all-in-one technical solution.
Beyond the GUI and the mouse, the original Lisa boasted several hardware and software features that were well ahead of their time. ... Too bad the Lisa cost almost $10,000 per unit and suffered woeful performance lags. The exorbitant amount of RAM and other high-end features made it significantly more expensive than IBM PCs and Apple's own Macintosh, which itself ran a faster, leaner GUI.
These drawbacks helped ensure that the Lisa never gained any significant market traction or adoption. After six years of frustrations and failures, Apple finally took a drastic and somewhat poignant measure to rid itself of the last 2,700 Lisa PCs it had in stock in 1989.
HOW DID APPLE UNLOAD ITS FINAL UNSOLD INVENTORY OF THE ORIGINAL LISA COMPUTER?
Read the complete Q&A here.
Thursday, January 21, 2010
Nerd Word of the Week: Uncanny Valley
Uncanny valley (n.) - The feeling of discomfort that arises when observing characters, objects, or images that appear almost, but not quite, human. Basically, the kind of icky you feel when catching a glimpse of a weirdly almost-human robot or cartoon character -- something that's almost human, but not quite there. The explicit term uncanny valley refers to a graph of human reaction to human-like images. As the images become more human, people become more comfortable with them. As the images approach a very near-human status, a valley appears in the reaction curve, one that abates once the images become all but indistinguishable from actual humans. It is generally assumed that the uncanny valley is an evolutionary aversion -- one that arose as a means of dissuading humans from interacting with the ill or recently dead (who often look slightly less than human). That, or it's nature's way of preparing us for the inevitable zombie apocalypse.
I bring it up because: A recent Popular Mechanics article pointed out that there is virtually no scientific basis for the uncanny valley (hat tip to io9), despite the fact that science and science fiction have been casually invoking the term since the 1970s. While there is ample anecdotal evidence of the uncanny valley -- just ask anyone who has seen The Polar Express -- almost no formal research has ever been conducted into the incidence of, or mechanisms behind, the phenomenon. That's a shame, as computer animation, particularly of the Avatar-esque 3D variety, is going to start pushing character designs right into the supposed uncanny valley, and the Japanese can't seem to stop themselves from building more and more (creepily) human-like robots. If mainstream media and consumer electronics are heading in that direction, it would be nice to know whether the public notion of the uncanny valley is mere conventional wisdom, or perhaps even erroneous pseudoscience. Somebody get the Skepchicks on this, stat!
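For the visually inclined, here's a quick-and-dirty Python sketch (numpy plus matplotlib, nothing fancy) of the reaction graph described in the definition above. The shape of the curve is entirely invented for illustration -- which, given the lack of actual research, is rather the point.

```python
# Illustrative only: a hand-tuned curve shaped like the hypothetical uncanny valley,
# not a fit to any empirical data (none really exists, per the Popular Mechanics piece).
import numpy as np
import matplotlib.pyplot as plt

likeness = np.linspace(0, 1, 500)  # 0 = obviously artificial, 1 = indistinguishable from human
# Comfort rises with human likeness, dips sharply in the "almost human" zone, then recovers.
comfort = likeness - 1.6 * np.exp(-((likeness - 0.85) ** 2) / 0.004)

plt.plot(likeness, comfort)
plt.axvspan(0.80, 0.90, alpha=0.2, label="the 'valley'")
plt.xlabel("Human likeness")
plt.ylabel("Viewer comfort (arbitrary units)")
plt.title("Hypothetical uncanny valley curve (illustration, not data)")
plt.legend()
plt.show()
```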
Tuesday, January 19, 2010
Truly Trivial: When was the first 3D television broadcast, and what program was shown?
For those of us fortunate enough to attend the 2010 Consumer Electronics Show in Las Vegas, it has become glaringly apparent that every major television manufacturer is desperate to shove 3D TV down our throats -- whether consumers like it or not. If you're among the millions of movie-goers (or Golden Globes judges) who saw James Cameron's Avatar, you know that Hollywood has also suddenly decided that 3D is the technology that will once again get consumers lining up at cinemas rather than queuing up on BitTorrent. Piling on, ESPN and the Discovery Channel are committed to creating 3D HD television channels this year, and pretty much every major PC video game has a 3D expansion or sequel in the works (you couldn't throw a rock at CES without hitting a 3D version of Batman: Arkham Asylum).
Hope you like wearing dorky 3D glasses for several hours a day.
What's lost in all this sudden 3D hoopla is that 3D photography, motion pictures, and television have been around for decades and that, while each has enjoyed a brief spark of popularity, the public has always swung back to familiar, comfortable two-dimensional media as its preferred viewing format. Some of this has been due to limitations in technology, some of it to the paucity of good 3D content, but for this author's money the problem that killed 3D in the past remains the one that neither Silicon Valley nor Tinseltown has yet solved -- nobody wants to wear 3D glasses to watch TV. (Yes, there are 3D screens that don't require glasses, but those models demand a direct viewing angle; step a few degrees left or right of center and the image blurs, which is equally if not more inconvenient.)
Three-dimensional stereoscopic photography dates back at least to the 1840s, with Charles Wheatstone and David Brewster inaugurating the technology. A 3D photograph of Queen Victoria was displayed at the Great Exhibition in 1851. In 1855 the Kinematoscope 3D movie camera was produced, and by 1935 3D films had started appearing in theaters. The technology didn't achieve a major commercial groundswell until the 1950s, when classic films like Bwana Devil and the original House of Wax delighted movie-going audiences. But by the 1960s the craze for 3D films had died out, partly because movie houses couldn't afford, maintain, or properly operate the dual-projector systems required to show the films, and partly because the novelty had simply worn off.
That same fad reached television in the 1990s, when major networks offered special 3D episodes of primetime television programs -- including a particularly memorable episode of 3rd Rock from the Sun, "Nightmare on Dick Street" -- but again the craze died out by the end of the decade. This time, the passing couldn't be laid at the feet of the technology, as 3D TV was proven feasible over forty years earlier.
So, when was the first 3D television broadcast, and what program was shown?
Thursday, January 14, 2010
Nerd Word of the Week: Slacktivism
Slacktivism (n.) -- Pejorative slang term for feel-good measures that have little hope of actually aiding the cause or individuals they claim to support. Particularly used in cases where the "activism" is limited to online activity with little or no investment or sacrifice required of the supporter, such as signing an online petition, joining a Facebook group, or modifying your online avatar in some fashion.
I bring it up because: This week saw two events that brought the slacktivists out in droves: the earthquake near Haiti and Google's threatened withdrawal from China. While many online actions have actually done some good -- such as entreaties to text certain codes that will send a donation to Haiti relief efforts through your cell phone bill -- far too much of the "response" has been limited to Twitter hashtags. How much did turning your avatar green really help the election protesters in Tehran last year? How much did revealing the color of your bra on Facebook really help breast cancer awareness or research? True protest, and true activism, mean upsetting the status quo; when we fold our support into our regular routine of online activities, it becomes just more trend-follower noise rather than actual change. Something to think about as we rage against the machine online -- if our words aren't backed up by substantive actions, we're just another part of the machine we rage against.
Tuesday, January 12, 2010
Truly Trivial: What billion-dollar movie franchise did James Cameron NOT create, but want a writing credit for?
So James Cameron's Avatar is on its way to becoming the highest grossing movie in cinema history, displacing the previous record-holder of 12 years, James Cameron's Titanic. That Cameron guy sure has a knack for the box office; factoring in Terminator 2 and True Lies, Cameron's last four major motion pictures have raked in a little over $4 billion -- and that's before you count ancillary merchandising, home video, and television rebroadcast profits. It also ignores the added tally of the original Terminator, The Abyss, and Aliens, all of which were arguably superior movies -- aesthetically speaking, if not financially -- to Cameron's more contemporary cash cows.
What's funny about Cameron's success is that, for all his ability to push the visual envelope and expertly depict even the most pedestrian of storylines (*COUGH*Titanic*COUGH*Avatar*COUGHCOUGH*), he's begun to develop a rep as the guy who steals all his ideas. The most famous example is the Terminator franchise, which owes great steaming piles of mea culpa to one Harlan Ellison, who just happens to be one of the most iconic sci-fi scribes of the 20th century. The Terminator shared a number of explicit plot points and story ideas with a couple of classic Outer Limits episodes written by Ellison: "Demon With a Glass Hand" and "Soldier".
Now, Ellison is infamous for being both contentious and litigious -- he earned as much notoriety for quarreling with Gene Roddenberry, for whom he wrote arguably the greatest original Star Trek episode ever, "City on the Edge of Forever", as he did for his actual writing -- so it's little wonder that Cameron found himself taking some heat from Ellison over The Terminator. What's surprising is that Ellison's case was strong enough that Cameron caved, and the Terminator film and franchise now appear with the following phrase in their credits: "Acknowledgment to the works of Harlan Ellison."
If the Ellison affair were an isolated incident, we'd be apt to let it go and probably even chalk it up to Ellison being easier to buy off than fight off. But questions have been raised about the unacknowledged inspirations for Avatar, too. No, we're not talking about Dances With Wolves, though that parallel is explicit. Poul Anderson's novella Call Me Joe follows a paraplegic who connects his mind to a genetically engineered life form to explore a harsh planet and then ends up going native. Sound familiar?
Here's where it gets really funny. There's another billion-dollar movie franchise that Cameron wanted to direct but for which he couldn't secure the legal rights. For once, lack of ownership actually stopped Cameron from making the movie, even if it didn't stop him from writing a script treatment for the property. Moreover, when the movie finally got made -- and became an international phenomenon -- there were some very vague similarities between Cameron's script treatment and the finished product, so much so that Cameron felt "slighted" that his previous work wasn't acknowledged. Ironic, isn't it? So, you gotta ask:
What billion-dollar movie franchise did James Cameron NOT create, but want a writing credit for?
Thursday, January 07, 2010
Nerd Word of the Week: Blobject
Blobject (n.) - A portmanteau of blob and object, a blobject is a household item or device distinguished by its smooth, rounded, almost seamless design. The iPod is a classic blobject, and its success has made the blobject design ethos nearly ubiquitous. Blobjects owe their existence largely to computer-aided design and manufacturing, and you can see early inklings of the aesthetic's association with futurism in early 1970s sci-fi television and movies, where the smooth "plastic fantastic" designs of Logan's Run and its ilk took hold. This aesthetic was mainstreamed, arguably, by Star Trek: The Next Generation, where rounded edges and buttonless interfaces were the norm. Everything was seamless, plastic, and disposable. In some ways, the steampunk movement arose as a repudiation of the blobjectivism of mainstream design, with the individualized, customized, constantly-tinkered-with and constantly-maintained bulk and clatter of steampunk tech (and its associated DIY culture) rejecting the upgrade-every-year trendiness and assumed vacuousness of blobject ownership.
I bring it up because: I am presently at the Consumer Electronics Show, and though I wrote this entry before I left (Planning!), I fully expect CES to be dominated both by already known blobjects (Google's Nexus One) and by speculation about possible future blobjects (Apple's iSlate tablet). It's just one more step towards an ability to instantly manufacture anything we can mock up in a CAD program -- hello, 3D printing, which is already scheduled to be demo'd at this year's CES -- which is itself another increment on our journey towards Ray Kurzweil and Vernor Vinge's predicted techno-singularity. Just so long as the future has Wi-Fi, I'm cool.
Tuesday, January 05, 2010
Truly Trivial: What sci-fi novel was the Xbox 360 dev team required to read?
I'm off to the Consumer Electronics Show this week, so I'm shamelessly and indefensibly shirking my trivia responsibilities yet again. To fill the void, here's some tech-toy themed minutia from my old Geek Trivia columns:
While Bill Gates may have a personal wealth that dwarfs the gross national product of many third-world countries, and Microsoft boasts a cash flow that would make some state revenue cabinets envious, jumping headlong into the multibillion-dollar gaming hardware market was still quite a daring leap for a software company. The man who convinced Gates and, perhaps more important, Steve Ballmer to get in the game, so to speak, was Xbox development chief J Allard. ...
Allard drew his inspiration for the Xbox 360 not just from more traditional sources of product development and market research, but also from science fiction—including a noted sci-fi novel that Allard made required reading for his entire Xbox 360 development team.
WHAT SCIENCE-FICTION NOVEL DID XBOX 360 DEVELOPMENT CHIEF J ALLARD REQUIRE HIS TEAM MEMBERS TO READ?
Get the complete Q&A here.