If We Met Space Aliens, Would We Have to Kill Them?
Depicted: The Australia Telescope Compact Array (ATCA) at the Paul Wild Observatory in Narrabri, New South Wales.
Photo Credit: CSIRO Australia Telescope Narrabri, Narrabri, Australia
(This piece contains spoilers for The Three-Body Problem series, also known as the Remembrance of Earth's Past trilogy, by Cixin Liu.)
(Dark Shift may earn money from links in this post to books on Amazon.com.)
A lot of science fiction assumes that humans could interact peacefully with space aliens... or that if we did go to war with them, the aliens would fight with the same motivations and tactics as another group of humans.
The Chinese science-fiction author Cixin Liu challenges that assumption in his popular series The Three-Body Problem, which explores what human interaction with space aliens might actually look like. In the series, the characters develop and deal with the implications of Dark Forest Theory--the idea that the universe is dense with intelligent civilizations that are all hiding from each other.
The series starts out with two concepts:
- Technological explosions. A seemingly primitive civilization may undergo an industrial revolution or a technological explosion that suddenly makes it threatening. A militarily stronger civilization can never assume it is safe from a weaker one in the long run.
- Chains of suspicion. We cannot know whether an alien civilization has benevolent intentions toward us. We also cannot know whether they believe our intentions toward them are benevolent, or what they think we believe about their intentions. This chain of second-guessing recurses indefinitely.
Together, these two concepts make it nearly impossible for two civilizations to trust each other in the long run, even if neither currently seems capable of harming the other.
In The Three-Body Problem series, these two concepts create the following incentives for civilizations:
- The safest course of action for a civilization is never to be discovered by another. A civilization can never safely broadcast its location into space.
- If a civilization is discovered, it must annihilate whoever found it to pre-empt any incoming attack.
- By the same logic, the safest response to discovering another civilization is to annihilate it, lest it annihilate you for having found it.
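To make the incentive concrete, here is a toy maximin sketch in Python. This is my own illustration, not anything that appears in the novels, and the payoff numbers are arbitrary placeholders; the only assumption doing any work is that being annihilated is vastly worse than the cost of striking first.

```python
# Toy payoff table for a one-shot "dark forest" encounter (illustrative only).
# Payoffs are for "us"; the other civilization faces a mirror-image table,
# which is what creates the chain of suspicion.
PAYOFFS = {
    # (our move, their move): payoff to us
    ("stay hidden", "stay hidden"): 0,      # mutual silence: status quo
    ("stay hidden", "strike"):      -1000,  # we were found and destroyed
    ("strike first", "stay hidden"): -1,    # we paid the cost of a strike
    ("strike first", "strike"):      -1,    # assumed: our strike lands first
}

def maximin_choice(payoffs):
    """Pick our move whose worst-case outcome is least bad."""
    our_moves = {ours for ours, _ in payoffs}
    worst_case = {
        ours: min(p for (o, _), p in payoffs.items() if o == ours)
        for ours in our_moves
    }
    return max(worst_case, key=worst_case.get), worst_case

choice, worst = maximin_choice(PAYOFFS)
print(worst)   # e.g. {'stay hidden': -1000, 'strike first': -1}
print(choice)  # 'strike first'
```

Under this toy model, "strike first" is the move with the least-bad worst case, which is the direction the chain of suspicion keeps pushing every civilization.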
One of the characters explains Dark Forest Theory as follows:
“The universe is a dark forest. Every civilization is an armed hunter stalking through the trees like a ghost, gently pushing aside branches that block the path and trying to tread without sound. Even breathing is done with care. The hunter has to be careful, because everywhere in the forest are stealthy hunters like him. If he finds other life—another hunter, an angel or a demon, a delicate infant or a tottering old man, a fairy or a demigod—there’s only one thing he can do: open fire and eliminate them. In this forest, hell is other people. An eternal threat that any life that exposes its own existence will be swiftly wiped out. This is the picture of cosmic civilization. It’s the explanation for the Fermi Paradox.”
In the series, the universe is so dense with life--all of it hiding--that the easiest way to destroy a civilization is simply to broadcast its position to the rest of space, in the hope that someone else will receive the signal and launch a pre-emptive strike.
Dark Forest Theory is Cixin Liu's answer to the Fermi Paradox--the puzzle of why, if the universe is fertile enough for life like ours to evolve, we have found no evidence of intelligent life anywhere else. Dark Forest Theory explains why we haven't found them: those civilizations are either in hiding or about to wipe us out.