The Ultimate Irony of Climate Change: Before We Created It, It Created Us


The picture above was taken at one of the best exhibits I’ve ever seen in any museum: the Hall of Human Origins at the Smithsonian National Museum of Natural History. The chart in the lower right-hand corner shows the correlation between brain size in humans and drastic changes in climate with an emphasis on the period between 800,000 and 200,000 years ago. (A nicer version of this chart is available on the exhibit’s website.)

It makes perfect sense that greater intelligence (as evidenced by larger brains) proved advantageous during times of unpredictable weather since the more humans were able to plan ahead, communicate, and work collaboratively, the more likely they were to survive. In fact, I’ve even read that the cranial capacity of fossilized skulls increases the farther from the equator they were found, suggesting a correlation between larger brains and harsher weather. In other words, in terms of natural selection, everything here appears to be in perfect working order.

But while there are no surprises in the relationship between brain size and climate change, there certainly is plenty of irony. The eventual results of all that hard-fought intelligence were the agricultural and industrial revolutions — precisely the technological advances most closely associated with modern climate change. Therefore, one could theorize that surviving rapid climate change bestowed upon humanity just enough intelligence to create even more rapid and dangerous climate change. One might even go so far as to say that the human brain is attempting to perpetuate its own continued growth.

I’ve read conflicting predictions of how this latest wave of climate change will ultimately affect brain size. Since equatorial temperatures will continue to expand latitudinally, it’s possible that the human brain could suddenly stop growing; on the other hand, due to all the challenges humanity faces as a result of rapid climate change, the size of our brains could continue to grow — perhaps at an even faster pace. Personally, I’m hoping for a future where we learn to use technology, intelligence, and even a little empathy to finally take control of our own evolutionary paths. Although it’s a little late for me to be genetically engineered, I wouldn’t mind a few multi-core petaflop processors embedded in my brain and at least one robotic arm.

The Miniaturization of Warfare


Growing up in the 80s, we were taught to fear a nuclear attack by the Soviet Union. Today, I think it’s fair to say that most people believe cyberwarfare is probably a greater threat than a full-scale nuclear holocaust.

What many people don’t fully grasp about nuclear weapons (in particular, those who object to reducing our stockpiles) is that they constitute a tremendous expense without much benefit — primarily because governments can’t actually use them. Whereas the U.S. currently deploys conventional weapons on a weekly and sometimes daily basis, it’s very difficult to imagine a scenario where the United States could justify launching a nuclear attack of even the smallest scale.

This concept is critical to the plot of my story The Epoch Index, and is probably best described by the following passage:

After centuries-old rivalries finally escalated into full-scale nuclear conflicts, the United Nations drafted and unanimously voted into effect a resolution unequivocally banning any sized nuclear arsenal anywhere on the planet. The U.S. and other early nuclear adopters were happy to back (and help enforce) the new international law, having long ago anticipated the nuclear backlash and invested heavily in Prompt Global Strike systems: networks of launch vehicles and hypersonic cruise missiles designed to deliver warheads filled with scored tungsten rods twice as strong as steel and capable of ripping any structure anywhere on Earth to shreds in less time than it takes to have a pizza delivered. Thermonuclear hydrogen bombs were old news, as far as most world powers were concerned. The only reason to unleash 50 megatons of destruction is if you have very little faith in the accuracy of your delivery mechanisms. Modern weaponry can target down to the square centimeter, and since it uses real time topographical guidance, it can do so even when your entire GPS satellite network is compromised. Besides, what’s the point of defeating another nation if your great grandchildren can’t even set foot in it, and just about everything worth looting, pillaging, or oppressing is either incinerated or radioactive? Nuclear weapons are clumsy and inelegant. High-tech conventional is the new thermonuclear. Modern militaries say less is more.

In my upcoming novel Kingmaker, drones are a central theme:

It wasn’t special operations teams that concerned him; he was confident he could see a takedown coming in plenty of time, and even if he didn’t, he probably stood as good a chance of walking away from a team of Navy Seals as any one of the Seals themselves. What Alexei feared was death from above. With a well coordinated drone strike, you were simply there one moment, and everywhere but there the next. It didn’t matter how quick you were, or how smart, or how well trained. If you were on the CIA’s radar, they knew how to get you off of it and still be home in time for dinner and to kiss the kids goodnight. All it cost them was barely an hour’s worth of classified paperwork that everyone already knew would never see the inside of either a civilian or military courtroom.

As a deterrent, maintaining a nuclear arsenal equal to (or slightly greater than) those of one’s rivals still makes some strategic sense; however, the reality is that weapons which can be deployed relatively inexpensively and surreptitiously are far more menacing than weapons everyone knows you cannot actually use. In other words, the world has much more to fear from weapons that can — without due process — target buildings, vehicles, and even individuals than from indiscriminate warheads that can destroy entire cities.

Just as in the world of technology, we are now witnessing the miniaturization of warfare.

How the Chrome Dev Tools Got Me an Awesome License Plate


One of my favorite places in the world is the Udvar-Hazy Air and Space Museum (which is only about 15 minutes from my house), so when I saw that I could help support the Smithsonian with a custom license plate, I figured I’d give it a go. While I was at it, I decided to see if I could figure something out that would also symbolize one of my other passions: web development. It occurred to me that the perfect way to bring them both together would be the tag “&nbsp” which is the HTML entity code for “space” (technically, it’s “&nbsp;” but you can’t get a semicolon on a license plate, and most browsers don’t require it, anyway).

When I checked the plate online, I was both pleased and surprised to find that it was available, but after I started the registration and purchase process, I found out why. The DMV web application does not escape user input, so the character sequence “&nbsp” is always displayed as a literal space. I hoped I might still get away with it, however when I tried to submit the order confirmation form, I got a server-side error message explaining that the plate ” ” (empty space) was invalid.

Being the determined hacker that I am, I initially saved the source from the confirmation page, fixed the error by turning “&nbsp” into “&amp;nbsp” (the character entity for ampersand followed by “nbsp” — the proper way to escape user input in this case), and started working on tricking the DMV’s servers into believing that the form I was submitting actually came from them. But then it occurred to me that I could simply fix the DMV’s mistake using the WebKit Web Inspector. I opened up the awesome Chrome Dev Tools, made the change in the live page, and the form submitted perfectly. About two weeks later, my brand new plates arrived.
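For the curious, the escaping the DMV’s form was missing boils down to a few lines. This is just an illustrative sketch (obviously not the DMV’s actual code): replace the HTML-significant characters with their entity codes so input like “&nbsp” renders as the literal five characters instead of being interpreted as an entity.

```javascript
// Minimal HTML escaping: "&" must be replaced first, or the entities
// produced by the later replacements would themselves get re-escaped.
function escapeHtml(input) {
  return input
    .replace(/&/g, '&amp;')
    .replace(/</g, '&lt;')
    .replace(/>/g, '&gt;')
    .replace(/"/g, '&quot;');
}

// "&nbsp" becomes "&amp;nbsp", which a browser then displays
// as the literal text &nbsp
escapeHtml('&nbsp');
```

Had the DMV’s confirmation page run user input through something like this, my plate would have displayed correctly on the first try.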

Thanks to the WebKit Web Inspector, the Chrome Dev Tools, and the openness and transparency of the web, I’m now rolling through Northern Virginia representing all my space-enthusiast and web-developer homies.

Macro Photographs of a MacBook Pro Retina Display

The other day, I noticed my Canon 7D with a 100mm macro lens on it sitting right beside my MacBook Pro with a Retina display, so I decided to see what 220 pixels per inch looks like blown up. The photographs below compare the same icons and text on a Retina display versus the display on an 11″ MacBook Air.

Click on any of the images to see it at twice the size (note that the images are 1,000 pixels wide and 220 PPI, so they look awesome on a retina display, but they may also take a few seconds to load).


Text on an 11″ MacBook Air.


Text on a MBP Retina. Much sharper.


The icon on a standard display.


The icon on a retina display. If your monitor is clean and you look really closely, you can see a few dead pixels.


A close-up of the icon on a standard display.


A close-up of the icon on a retina display.


The menu bar on a retina display. Notice how the updated icons look great, and those that haven’t been updated yet look like crap. Unfortunately, this is what most of the internet looks like (with the exception of text, which looks great).


The dreaded ghosting issue. You can also see several pixels misbehaving in this photo (top center).


The one curious exception to text looking almost universally better on the retina display is the Twitter application. For some reason, the text looks as bad as the scaled-up profile pictures.




Genetic Data Storage Technology From Containment Becomes a Reality


In my novel Containment, I write about a computer scientist (Arik) and a biologist (Cadie) who work together on a project to use human DNA as a general data storage medium. They call the project ODSTAR, for Organic Data Storage and Retrieval, and the first big piece of data they store and successfully retrieve is an image of Earth known as The Blue Marble (one of the most famous photographs in history, taken by the crew of Apollo 17). Their ODSTAR technology eventually gets used to store critical research which, they discover, can actually be passed down to future generations.

As was the case with artificial photosynthesis and the proposal to use light pollution from distant worlds to detect the existence of extraterrestrials, technology proposed in Containment has again become a reality. Researchers at Harvard University encoded a 53,426-word book into DNA and then decoded it again with an error rate of only ten bits total.
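The core of the encoding is remarkably simple — as I understand the paper, one bit per base, with A or C standing for 0 and G or T standing for 1 (the real system adds addressing blocks and redundancy that I’m glossing over here). A toy version of the round trip looks like this:

```javascript
// Toy one-bit-per-base DNA encoding (simplified from my reading of
// the paper; the actual scheme includes addressing and redundancy).
// Writing: arbitrarily pick A for 0 and G for 1 (C and T would be
// equally valid choices for the synthesizer).
function bitsToDna(bits) {
  return bits.map(bit => (bit === 0 ? 'A' : 'G')).join('');
}

// Reading: A or C decodes to 0, G or T decodes to 1.
function dnaToBits(dna) {
  return [...dna].map(base => ('AC'.includes(base) ? 0 : 1));
}
```

The flexibility on the write side (two bases per bit) matters in practice, since it lets the encoder avoid long runs of identical bases, which are hard to synthesize and sequence accurately.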

If you have a subscription to the journal Science, you can read the paper here. Otherwise, you can find more details on Mashable. And, of course, you can find Containment on Amazon.

Controlling Web Based Music Players with Global Keyboard Shortcuts

Ever since I switched from iTunes to using web-based music players (Google Music, Amazon Cloud Player, and Pandora), I’ve wanted the ability to control them with global keyboard shortcuts. The other day, I finally took the time to set it up, and I’m very happy with the results:

If you’re interested in setting this up for yourself (or simply learning about how it works), download the project files here, then follow these instructions:

  1. Unzip the project files. You should see a directory called “music_control”.
  2. Make sure you have node.js installed, then cd into the “music_control” directory and start the server with: node server.js.
  3. Cd into the “extension” directory and open “background.html” in your favorite editor. Change the SERVER_HOST variable to reflect your host name.
  4. In Chrome, go to Window > Extensions. Make sure “Developer Mode” is checked.
  5. Click on “Load unpacked extension,” then navigate to the “extension” directory. (You can also package the extension and install it normally by double-clicking on the resulting “music_control.crx” file.)
  6. Install any application that lets you map global keyboard shortcuts to shell scripts (or AppleScripts, but I prefer bash). I used an app called Shortcuts, but I’m sure there are plenty of free alternatives.
  7. Set up whatever keyboard shortcuts you want to map to the following bash commands (note that you can use something like wget rather than curl if you prefer):
    • curl "http://localhost:8000/music?play"
    • curl "http://localhost:8000/music?next"
    • curl "http://localhost:8000/music?previous"
  8. You’re done! You should now be able to control your web-based music players with keyboard shortcuts.

I realize there are a lot of moving parts here, and any number of ways to accomplish the same thing. If you decide you don’t want to use this exact implementation, hopefully this will at least get you started down the right path of your own setup. Let me know if you get this working and/or if you adapt the concept to something equally or even more interesting. I have lots of ideas for where this could go.

Inspired by the Past

Not long ago, I took my two daughters out of school for the day and the three of us went on a field trip to Udvar-Hazy Air and Space Museum. I made a deal with them: they could miss school for the entire day if they promised to listen to everything I told them, read everything I asked them to read, and answer questions at the end of the day. I wasn’t taking them out of school to ride simulators and eat freeze-dried ice cream; we were going in search of inspiration.

The idea was prompted by the arrival of Discovery (which I also took them out of school to watch). I was about the age of my youngest daughter when the Space Shuttle Columbia first launched on April 12, 1981, and now, thirty-one years later, we were witnessing the (hopefully temporary) end of manned space flight in the United States. It suddenly occurred to me that without adequate education, children today might never know that:

  • Putting astronauts into low Earth orbit was once considered almost routine (the Space Shuttle fleet flew a total of 135 missions);
  • Forty-three years ago — more than four years before I was even born — man first walked on the moon, accomplishing a feat that doesn’t seem even remotely possible in today’s economic and political climate;
  • As children, we frequently saw the Concorde — a supersonic transport jet capable of traveling at over Mach 2 — fly overhead as it landed or took off from Dulles airport, conveying passengers from New York to Paris in only 3.5 hours — over twice as fast as brand new passenger jets being built today.

While I recognize that there’s a lot of fantastic innovation going on right now, we also appear to be in an era when the best way to inspire future generations is to look to the past.


The retired Space Shuttle Discovery.


The retired Space Shuttle Discovery.


The retired Space Shuttle Discovery.


The nose of the Concorde.


The unmistakable delta-wing configuration of the Concorde.


The SR-71 Blackbird.


Probably the best view in the entire museum. The SR-71 Blackbird in the foreground, and the Space Shuttle Discovery in the background.


The top of the SR-71 Blackbird.


Some of the toys that inspired me as a child.

Using a Mobile Device as a Desktop Computer

Part 1

Part 2


Some friends of mine and I are experimenting with what it’s like to use a mobile device (in this case, a Galaxy Nexus) as a desktop computer. With the addition of a bluetooth keyboard, multi-touch trackpad, and a monitor, I found that the experience is surprisingly good.

I don’t demo all that many applications in the video for fear of inadvertently showing sensitive data, but I think I show enough to give you an idea of how close we already are to this type of computing model. In fact, I think if you were to set up a workspace like this for someone who didn’t have “professional” needs (such as writing code or video editing), and/or someone who didn’t have a lot of preconceptions about how a computer should work, they would be perfectly happy with the experience. I was able to do all of the following with relative ease:

  • Browse the internet.
  • Read news.
  • Manage my calendar, tasks, contacts, etc.
  • Read and write email almost as easily as I can on my desktop.
  • Listen to music and podcasts.
  • Chat on IM.
  • Edit documents.
  • Do some light photo editing (in the default gallery application).
  • Participate in social networks (Google+, Twitter, and Facebook).
  • Watch videos on YouTube and Netflix.

In other words, I was able to do most of what many people do with desktop computers on a daily basis. Of course, there were a few key things I wasn’t able to do such as:

  • Write code. I’m sure it’s possible, but definitely not practical, and probably not something I would enjoy.
  • Advanced editing of things like photos and video.
  • Advanced file management. With this kind of computing model, you definitely want to keep as much data in the cloud as possible since the file system is generally de-emphasized on mobile devices.

Keep in mind that I’m using a stock Android device with whatever capabilities are already in the OS. If you’re willing to go as far as installing Linux on your phone, you can do far more than this. Additionally, operating systems will likely have much better support for this kind of model in the future — in particular, Windows 8 with Metro.

I’m really curious about whether this kind of interaction represents the future of computing. Are we moving toward a model where we use multiple computers and mobile devices with all our data in the cloud, or in five to ten years, are we all just going to use our phones for most of our computing needs? I’m guessing it’s going to be somewhere in the middle (as these things tend to be), but I’d love to hear your thoughts.

Update: I’ve been getting a lot of questions about the cables I used to make this work. Here’s all you need to know:

  • For the display, I used a Samsung MHL to HDMI adapter (along with an HDMI cable, obviously). If you want to do audio through your monitor, make sure your HDMI cable supports audio.
  • For a USB keyboard and mouse, you’ll need a micro USB host mode OTG cable, and a powered USB hub. (I used a bluetooth keyboard and mouse, so this isn’t in the video.)
  • For audio (if you don’t have speakers in your monitor), I just used a standard 3.5mm audio cable from the phone to my computer speakers.

Thanks to Matt Pandina for helping to get this working.

Scientists Propose Detecting Extraterrestrials Through Light Pollution (as Described in “Containment”)

Astronomers Avi Loeb and Edwin Turner recently published a paper proposing a technique for detecting extraterrestrials: use telescopes to look for light pollution from alien cities. From the paper’s abstract:

This method opens a new window in the search for extraterrestrial civilizations. The search can be extended beyond the Solar System with next generation telescopes on the ground and in space, which would be capable of detecting phase modulation due to very strong artificial illumination on the night-side of planets as they orbit their parent stars.

I was thinking the same thing when I wrote Containment:

The telescope assembled on the far side of the Moon succeeded in capturing some stunning images, including a few faint pixels of possible light pollution originating from a small rocky planet in the habitable zone of a nearby solar system…

The SETI Institute (Search for Extraterrestrial Intelligence) is already using arrays of Earth-based radio telescopes to search for evidence of alien technology (as dramatized in Carl Sagan’s excellent novel, Contact). Since we’re already detecting exoplanets, it seems reasonable that within the foreseeable future, the technology could exist to measure light pollution on extrasolar planets, providing the first hard evidence of extraterrestrial intelligence. Perhaps alien civilizations have already detected us.

It’s fascinating to watch technologies dreamed up for the sake of science fiction gradually become reality: the idea of using the LHC for time travel, for instance, or artificial photosynthesis.