The personal blog of Jay Garmon: professional geek, Web entrepreneur, and occasional science fiction writer.
Thursday, November 26, 2009
Nerd Word of the Week: Vatvegan
Vatvegan (n.) - A term coined just days ago on Twitter by William Gibson, referring to a vegan who is nonetheless willing to consume artificially grown meat. Specifically, someone who has moral objections to eating conventional meat, but is willing to consume vat-grown pseudo-animal products so long as the substance consumed was not capable of experiencing pain. Artificial and/or moral meats have a long tradition in spec-fic, from the vat-meats mentioned offhand in Robert Heinlein's The Door into Summer to the sentient self-slaughtering cows found in Douglas Adams's The Restaurant at the End of the Universe to the replicated meat found throughout the Star Trek mythos. Vatvegans have been waiting for the goods for a long, long time.
I bring it up because: Today is Thanksgiving in the United States, a day of feasting gleefully without much concern for potential (if overhyped) frankenfood farmageddon or the lessons of The Omnivore's Dilemma. More specifically, Time named the nascent possibility of vat-grown meat one of the top technologies of 2009. Thus, we may be on the verge of sidestepping many of the moral concerns that have stirred so many to vegetarianism (and into the ranks of PETA), leaving us only with the health risks associated with vat-meat. One wonders whether the next foodie counterculture movement won't be a rejection of "establishment" vat-meat in favor of natural farm-fresh animal flesh, with the next generation of suburban hippies defined as avowed carnivores, rather than proselytizing meat-abstainers. Now that would be ironic.
Tuesday, November 24, 2009
Truly Trivial: What US President tried to reschedule Thanksgiving, and why?
A short work week at NotebookReview -- including preparations for Black Friday -- means we're going back to the Geek Trivia well in lieu of a fresh Truly Trivial column, but it's a timely and popular retread. The key excerpt below explains.
Sales volume typically spikes on Black Friday, then plummets back down the following Monday. The sales trend then ramps back up as Christmas approaches, with the two Saturdays and two Sundays nearest to but before Christmas eventually overtaking Black Friday in sales volume. (December 23 is always a good bet to be near the top as well.) The last-minute shopper is a statistical -- and economically powerful -- reality.
Despite Black Friday's somewhat overstated reputation, one U.S. president so respected the economic power wielded by the Christmas shopping season that he tried to extend it a week -- by rescheduling Thanksgiving.
WHICH U.S. PRESIDENT WANTED TO RESCHEDULE THANKSGIVING TO ACCOMMODATE CHRISTMAS SHOPPING?
The full Q&A is here.
Thursday, November 19, 2009
Nerd Word of the Week: Whuffie
Whuffie (n.) - A form of currency based on social status, rather than tangible wealth. The Whuffie originated in Cory Doctorow's science fiction novel Down and Out in the Magic Kingdom. Whuffie has since become an insider term for online reputation, with such non-fiction works as The Whuffie Factor taking the word mainstream. In Doctorow's original novel, Whuffie (which is both the plural and singular form) replaced money as the means of acquiring wealth, though "wealth" in that setting meant something very different. Down and Out is set in a post-scarcity economy, where necessities and luxuries are so easily and cheaply produced they have no practical cost. Thus, the economy was based on status rather than property, and complex systems were set up to upvote and downvote your every action, such that every activity became a public political performance. If you think that's an unlikely future, please say so in the syndicated comments below this post, and/or downvote (using the controversial new dislike button) the Facebook note version of this blog. Possibly using your always-on Internet phone. Which is itself a status symbol. The point (and the sarcasm) should be evident by now.
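For the curious, the core mechanic -- a running score that other people raise or lower, and that gates your access to goods instead of being spent like cash -- is simple enough to sketch in a few lines of toy Python. All the names and numbers below are my own invention, not Doctorow's:

```python
# Toy sketch of a Whuffie-style reputation ledger.
# Invented names and numbers for illustration only.
from collections import defaultdict

class WhuffieLedger:
    def __init__(self):
        self.scores = defaultdict(int)  # person -> running reputation

    def upvote(self, person, weight=1):
        self.scores[person] += weight

    def downvote(self, person, weight=1):
        self.scores[person] -= weight

    def can_afford(self, person, status_cost):
        # In a post-scarcity economy, a "price" is a status threshold,
        # not a transfer: you qualify for goods if enough people approve
        # of you, and the score is never spent down.
        return self.scores[person] >= status_cost

ledger = WhuffieLedger()
ledger.upvote("jay", 50)
ledger.downvote("jay", 8)
print(ledger.scores["jay"])          # 42
print(ledger.can_afford("jay", 40))  # True
```

Note the design choice that makes it currency rather than karma: the score is the price of admission itself, so reputation is wealth, not just a badge.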
I bring it up because: The Oxford English Dictionary declared the verb unfriend 2009's Word of the Year. When removing someone from your virtual social circle is so commonplace and mainstream as to get pub from the OED, you know that virtual, pervasive reputation management is here to stay. Add to that the recently ballyhooed debut of LoveMachine, a virtual reputation-management system created by Second Life founder Philip Rosedale and a real-life analogue to the Whuffie. Granted, The Whuffie Bank already got there and already has the Whuffie name. Also, the Penguicon science fiction-slash-Linux convention has recently experimented with an in-con virtual reputation currency called a Whuffie, though in practice it worked a bit more like Slashdot's karma system. That may be a distinction without a difference, but I'll leave it to my Whuffie score to sort it out.
Tuesday, November 17, 2009
Truly Trivial: How many planets are known to have liquid surface water?
Last Friday, NASA announced that its LCROSS mission -- also known as "let's slam a probe into the lunar surface while another probe films the action, Jackass-style" -- found significant amounts of water on the moon. This revved up interest in lunar colonization again, despite the fact that there isn't any pure, drinkable, liquid water on the moon.
Now, water is very useful for space exploration even if you can't directly drink it, as simple electrolysis lets you crack water into breathable oxygen and hydrogen rocket fuel. The moon just became much more interesting as a way station for spacecraft, as robot factories could be sent ahead to stockpile hydrogen, oxygen, and possibly even purified drinkable water for manned craft heading further out into the solar system. Moreover, water is a very effective radiation shield, provided you can encircle a crew cabin in a two-meter-deep tank of H2O on all sides. The "significant amounts" of lunar water mean moonbases might just be able to extract the Olympic swimming pools' worth of water necessary to ward off the cosmic fry-daddy rays awaiting space travelers.
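As a back-of-the-envelope check on that pools claim (the cabin dimensions below are my guesses, not NASA figures), here's the arithmetic in Python:

```python
# How much water does a two-meter blanket around a small crew cabin take?
# Cabin dimensions are assumptions for illustration.
import math

r, length = 2.0, 10.0   # assumed cabin: 2 m radius, 10 m long cylinder
t = 2.0                 # shield thickness on all sides, per the post

inner = math.pi * r**2 * length
outer = math.pi * (r + t)**2 * (length + 2 * t)
shield_volume = outer - inner   # cubic meters of water

olympic_pool = 2500.0           # m^3, a standard 50 x 25 x 2 m pool
print(f"{shield_volume:.0f} m^3 of water, "
      f"about {shield_volume / olympic_pool:.2f} Olympic pools")
# -> roughly 580 m^3, about a quarter of a pool for one small cabin;
#    scale up to a multi-module ship and whole pools add up fast.
```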
As Karl Schroeder points out, the main barrier to manned spaceflight isn't technology but the expense of hauling the necessary materials into orbit. Launching these materials off the moon requires climbing out of one-sixth the gravity well that exists on Earth, which means there's an economic argument to be made for gathering spaceflight materials from a lunar base rather than launching them from Earth with the astronauts. Besides, if we can perfect the robot factories on the moon, we can get serious about mining comets, which have a lot more water-ice mass per capita and a lot less gravity. Robot comet iceminers mean that manned spacecraft could have a plethora of automated air, fuel, and shielding depots scattered throughout the Interplanetary Transport Network.
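The gravity-well argument is easy to quantify with the standard escape-velocity formula, v = sqrt(2GM/r). A quick sketch using textbook values for mass and radius:

```python
# Gravity-well comparison via escape velocity: v = sqrt(2GM/r).
import math

G = 6.674e-11  # gravitational constant, m^3 kg^-1 s^-2

bodies = {
    "Earth": (5.972e24, 6.371e6),  # mass (kg), mean radius (m)
    "Moon":  (7.342e22, 1.737e6),
}

for name, (mass, radius) in bodies.items():
    v_esc = math.sqrt(2 * G * mass / radius)
    print(f"{name}: escape velocity ~{v_esc / 1000:.1f} km/s")
# -> Earth ~11.2 km/s, Moon ~2.4 km/s
```

Since kinetic energy scales with the square of velocity, getting a kilogram off the Moon takes roughly one-twentieth the energy of getting it off Earth, which is the whole economic argument in one number.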
Thus, the lunar water discovery is very exciting, but it's unlikely to lead directly to the holy grail of space exploration -- the discovery of extraterrestrial life. Virtually all life as we know it requires water to exist, particularly liquid water that appears on the surface of the world, where it can be exposed to nurturing (and mutating) solar radiation. We've found hundreds of extrasolar planets and thousands of extraterrestrial non-stellar bodies, but liquid surface water remains stubbornly rare.
How many planets besides Earth are known to possess liquid surface water?
Thursday, November 12, 2009
Nerd Word of the Week: Cryptid
Cryptid (n.) - A creature that is rumored to exist but that no one has managed to prove is real. Bigfoot, chupacabras, the Mothman, and the Jersey Devil are among the more famous examples of cryptids. Cryptozoology is the "scientific" study of cryptids, though (much like ufology) the field is often dominated by fringe scientists and conspiracy theorists who seem ill-acquainted with the actual scientific method. (There's also a subfield called cryptobotany, which seems mostly obsessed with finding giant man-eating plants. Little Shop of Horrors apparently was inspired by true events.)
Cryptids, to no one's surprise, make excellent fodder for speculative fiction. A. Lee Martinez's novel Monster is a recent example, though the trend is hardly limited to books. The animated series The Secret Saturdays chronicles the adventures of a family of cryptid hunters, depicted in a Jack Kirby pulp-hero style. Moreover, giant cryptids seem to have a rolling contract with Syfy, as the channel has become infamous for cheeseball original cryptid-kaiju movies like Mega Shark vs. Giant Octopus and Dinocroc vs. Supergator.
I bring it up because: A mere 76 years ago today, the first known photographs of the world's most famous cryptid, the Loch Ness Monster, were taken. On Nov. 12, 1933, Scotsman Hugh Gray snapped the first purported pics of Nessie, though to most eyes the image looks like a blurry shot of a dog swimming with a stick in its mouth. 1933 was also the year the Nessie phenomenon first gained media attention, with multiple sighting reports published in papers that year. It was only a year later that the world-famous but since-discredited Surgeon's Photo of the Loch Ness Monster was taken, proving that hoaxers trying to horn in on the cryptid action are a time-honored tradition, even if cryptozoology itself isn't.
Wednesday, November 11, 2009
We're on the verge of the 4th Epoch of memory
Before we begin, a hat tip goes out to Phil "Bad Astronomy" Plait for disseminating this collection of Carl Sagan quotes, which inspired this little observation. As such, I'll begin with Sagan's own words, which invariably outstrip my own:
"When our genes could not store all the information necessary for survival, we slowly invented brains. But then the time came, perhaps ten thousand years ago, when we needed to know more than could conveniently be contained in brains. So we learned to stockpile enormous quantities of information outside our bodies. We are the only species on the planet, so far as we know, to have invented a communal memory stored neither in our genes nor in our brains. The warehouse of that memory is called the library."I posit that the advancement of our (and every) species is defined by the limits of our memory. As we overcome the inherent design limitations of each type of memory, we move into a new Epoch of Memory, and a new level of complexity as both a society and a species.
-- "Persistence of Memory," Cosmos
Within Sagan's words we observe the first three Epochs of Memory. Like all species on Earth, we first stored memory solely in our genes. Not just the memory of our biological selves, but whatever rote, instinctual behaviors still dominate our actions today. Even viruses, which exhibit nothing so complex as "thought" or "intention," nonetheless possess a form of genetic memory that impels them to propagate. This was our First Epoch of Memory, when our genes made of us all that we could ever be. The design limitation of genetic memory was its dependence on mutation to spur advancement. All learning was random. Carried to its logical conclusion, the First Epoch led to a random mutation that improved our memory.
The Second Epoch saw the rise of brains, organs designed explicitly to deal with information. We (or, rather, our genetic forebears) could now remember information and apply that memory long before the implications of that data became encoded in our genes. We need not wait for some random mutation to give us an advantageous instinctive aversion to chewing on hemlock leaves; we could now remember the illness we suffered after first consuming hemlock and, if we survived, choose not to eat it again. Many species developed brains and applied them to exactly this advantage.
Carried to its logical conclusion, the Second Epoch gave rise to language, a collection of symbols that could be remembered collectively by a group and used to communicate information not directly discerned by an individual. Now, if one member of the tribe learned not to eat hemlock, all the members of the tribe could benefit from that knowledge. The limitation of language was tied, initially, to the limitation of our physical memory. We could only remember so much, and the fidelity of that memory often suffered when it was transferred -- as in, told -- to another individual. The children's game of telephone is the classic demonstration of oral history's limitations.
Language, carried to its logical conclusion, gave rise to the Third Epoch of Memory, a time when information could be stored outside the individual in tangible form. Books, for most of our history. Now, so long as you had the basic knowledge necessary to decipher the encoding language, you could benefit from the collective experience of the entire species. The limitation of external memory is organization and availability. We no longer carry information with us, so we must actively seek it out, and then apply it contextually.
Books, carried to their logical conclusion, now stand us on the precipice of the Fourth Epoch of Memory. In this Epoch, seeking out, organizing, and contextually applying information has been rendered an external process. Put more simply, the Fourth Epoch of Memory arrives when externally stored information is indistinguishable from internally stored information. What we know won't be limited to what we can remember, because we can access external memory (the Internet et al.) with the same ease, speed, and faculty as internal memory (our brains).
A combination of high-speed mobile communications, augmented reality interfaces, and massive free, searchable libraries of information is only just now being born. But when this mass of external memory that we have spent centuries exponentially expanding is suddenly available in perfect context at the perfect moment for everyone, we will have become a fundamentally different species all over again. What we know will be limited only by our curiosity and our bandwidth.
I, for one, can't wait for the Fourth Epoch to arrive. As Jamais Cascio warns us, this era won't be without its drawbacks, but I think it will be well worth the price of admission. I only wish Carl had been here to see it.
Tuesday, November 10, 2009
Truly Trivial: What sci-fi novel coined the phrase 'computer worm?'
Once again, I am overburdened by day jobbery and must resort to recycling one of my old TechRepublic Geek Trivia posts for this week's Truly Trivial column. Don't worry; it's timely. Twenty-six years ago today, the computer virus was born. Twenty-one years ago last week, perhaps the most famous computer worm ever, the Morris Worm, was launched. Which brings me to my original article:
The effects of the Morris Worm were so widespread and pronounced that it made the national news (quite a feat in 1988), and it eventually earned Morris a landmark if decidedly unintimidating conviction: three years' probation, 400 hours of community service, and a $10,050 fine. In geek circles, people sometimes referred to the Morris Worm as the Great Worm, a reference to the Great Worms (i.e., dragons) found in J. R. R. Tolkien's Middle-earth tales.
This, of course, was not why we called Morris' creation a worm, as these types of malicious programs (distinct from computer viruses) owe their etymology to a work of science fiction, rather than fantasy.
WHAT WORK OF SCIENCE FICTION COINED THE TERM COMPUTER WORM?
You can find the complete original Q&A here.
Monday, November 09, 2009
Short story: Crimes Against Science Fiction
I want to give a shout-out to Suzanne Vincent and the editing crew over at Flash Fiction Online, who gave this story one of the kindest and most encouraging rejection letters I've ever received. This was really my first stab at flash fiction, and as such it's rather derivative -- I was channeling a bit of Douglas Adams when I threw this together -- with a weak ending. I had fun with it, but it's time to put this little experiment out to pasture. Thus, Crimes Against Science Fiction is trunked here for your enjoyment.
---
When Tommy stumbled half-awake out of his apartment this morning, he didn't expect that the future would be waiting outside to kill him. Or that the future would be so fat.
"Stand fast, and prepare to answer for your crimes!" the one of the left shouted, pointing an exotic firearm at Tommy. The assailant was dressed in a flamboyantly colorful excuse for a military uniform, though the garment was clearly intended for someone far thinner than the rotund, pimply-faced gent breathing heavily in Tommy's parking space.
The one on the right interrupted. "Wait, we have to explain his crimes to him first. He isn't guilty of them yet." This one was slightly taller and rail thin, except for the almost comically out-of-place beer belly swelling beneath the belt of his…er…outfit. He was wearing some bizarre hybrid of 17th-century samurai armor and 21st-century leather fetish gear while carrying a wildly impractical, oversized sword.
---
Thursday, November 05, 2009
Nerd Word of the Week: Spam in a can
Spam in a can (n.) - Space-program slang for a passive occupant in a spacecraft, specifically a space capsule. The phrase is generally attributed to Chuck Yeager, if only because he's shown describing the Mercury astronauts as "spam in a can" in the movie The Right Stuff, though there is ample evidence that multiple astronauts and NASA officials used the term liberally during the 1960s Space Race. The early Mercury astronauts, all trained military pilots, are known to have resisted being mere "spam in a can" with no active control of their vessels, thus forcing a level of human direction into early space vehicles.
Spam in a can is now typically used as a snarky criticism of the current level of manned spaceflight technology, as humans are still traveling as meat packed into primitive metal containers and shipped long distances. This falls under the sensawunda criticism of NASA -- particularly the space shuttle successor Project Constellation, which is described as "Apollo on steroids" -- in that we are still not creating or using the sci-fi-inspired tech that books and movies have promised us for decades. Warren Ellis and Colleen Doran rather deftly pointed out the spam-in-a-can disappointment factor with NASA in the graphic novel Orbiter, wherein an alien intelligence redesigns our "primitive" space shuttle into a true interplanetary exploration vehicle.
I bring it up because: Laika passed away 52 years ago on Tuesday. For those that don't know the name, Laika was the first living creature that humans sent into space. She was a Soviet space dog launched aboard Sputnik 2 on Nov. 3, 1957. She died from overheating a few hours after launch, thus making Laika the first spaceflight casualty. Her likeness is preserved in a statue at the cosmonaut training facility in Star City, Russia, as her nation's first space traveler. Telemetry from her mission proved that living beings could survive launch g-forces and weightlessness, thus proving that spam in a can was a viable manned spaceflight model.
I also mention the spam in a can principle as a corollary to Charles Stross's recent thought-experiment blog post, How habitable is the Earth? Stross essentially argues that humans are explicitly designed for a particular fraction of Earth's environment that exists during a hyper-minute fraction of Earth's geological history, thus making human space exploration -- which removes us from this environment -- a terribly difficult and expensive undertaking. Karl Schroeder recently counter-argued (the point, not Stross) that most of these problems are surmountable if we get launch expenses down and can get the proper equipment -- all of which already exists -- into orbit cheaply. Which gets us back to the spam in a can criticism: until the tech gets better, large-scale human space exploration is a pipe dream.
Tuesday, November 03, 2009
Truly Trivial: What was the original written formulation for E=mc2?
The old trivia geek is bogged down with day job wonkery, so I'm reprinting this old-school Geek Trivia from my TechRepublic days, "Eye for an Einstein." I quote from it thusly:
Einstein proved that even apart from movement, all objects that possess mass also possess energy. You can find out exactly how much energy with the simple calculation of mass multiplied by the square of the speed of light, E=mc2.
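To get a feel for the scale, plug a single gram of mass into the formula (a worked example of mine, not from the original column):

```latex
E = mc^2 = (0.001\,\mathrm{kg}) \times (2.998 \times 10^{8}\,\mathrm{m/s})^2
         \approx 9 \times 10^{13}\,\mathrm{J}
```

That's on the order of twenty kilotons of TNT from a paperclip's worth of matter, which is why the equation gets the press it does.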
The crazy thing is, you won't find this famous equation in any of Einstein's papers published before, during, or after 1905. That's because Einstein never wrote his most famous equation in its most famous form.
Instead, he earned his scientific accolades expressing energy-mass equivalence very differently.
WHAT WAS THE ORIGINAL FORMULATION OF E=mc2 AS WRITTEN BY EINSTEIN?
Get the full Q&A here.