I recently read a bummer AI-doom essay by Eliezer Yudkowsky (not that this narrows it down) titled “The Sun is Big, But Superintelligences Will Not Spare Earth a Little Sunlight.” In it, he argues that humans will someday be of so little use to AGIs that they will probably exterminate us to free up resources that they could put to better use doing machine things. They’ll go so far as to build a Dyson sphere around the Sun, diverting all its energy toward their own computational needs and plunging Earth into freezing darkness.
Yudkowsky must have just finished a really funny movie that left him in good spirits, because he suggested there could be an alternative to our doom: Humanity might be spared–just barely–with each person rationed exactly enough energy to remain alive.
He writes:
“A human being runs on 100 watts. Without even compressing humanity at all, 800GW, a fraction of the sunlight falling on Earth alone, would suffice to go on operating our living flesh, if Something wanted to do that to us.”
His math checks out: Humans run on chemical energy from the food they eat, but that consumption can be expressed in equivalent electrical units. An adult human continuously consumes about as much energy as an appliance drawing 100 watts of electricity (think of an old incandescent 100-watt light bulb, or a modern phone charger actively charging a smartphone). That translates into 2,400 watt-hours (2.4 kWh) of energy consumption per day. Multiplying those figures by eight billion humans yields a species-wide continuous consumption of 800 GW and a daily consumption of 19.2 TWh.
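If you want to reproduce the arithmetic yourself, here is a minimal back-of-the-envelope sketch in Python, using the same rounded inputs (100 W per person, eight billion people):

```python
# Back-of-the-envelope check of the figures above. The 100 W metabolic rate
# and the eight-billion population are the essay's own rounded inputs.
WATTS_PER_HUMAN = 100            # continuous power draw of one adult, in watts
POPULATION = 8_000_000_000       # eight billion people

per_person_daily_kwh = WATTS_PER_HUMAN * 24 / 1_000          # 2.4 kWh per person per day
species_continuous_gw = WATTS_PER_HUMAN * POPULATION / 1e9   # 800 GW, continuous
species_daily_twh = species_continuous_gw * 24 / 1_000       # 19.2 TWh per day

print(per_person_daily_kwh, species_continuous_gw, species_daily_twh)
# -> 2.4 800.0 19.2
```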
To grasp how much that is, consider the following:
- In 2021, China’s electric grid produced about 23.4 TWh per day.
- Earth continuously receives 173,000,000 GW of solar power.
Mindful of these figures, Yudkowsky’s original calculation suggests it would be much more efficient to fully enclose the Sun in a Dyson sphere, plunging Earth into darkness, and to transmit a tiny fraction of the energy that would otherwise have reached Earth down to the surface, where it could essentially “feed” a population of eight billion humans. The other 99.9995% (yes, I actually calculated it) of the sunlight blocked from hitting Earth would be used for intelligent machine stuff, like running Big Bang computer simulations or building billions of spaceships to fight aliens.
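The 99.9995% figure falls straight out of the two numbers above; here is a quick, purely illustrative check using those rounded values:

```python
# Rough check of the 99.9995% figure, reusing the essay's rounded numbers.
HUMANITY_GW = 800                 # continuous power to keep eight billion humans alive
EARTH_SOLAR_GW = 173_000_000      # solar power Earth continuously intercepts

human_share = HUMANITY_GW / EARTH_SOLAR_GW        # ~4.6e-6 of Earth's sunlight
machine_share_pct = (1 - human_share) * 100       # what's left over for machine stuff

print(f"{human_share:.2e}  {machine_share_pct:.5f}%")
# -> 4.62e-06  99.99954%
```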
This vision is stark, but strangely logical. AGIs get all the energy they want, but without the downsides of destroying the human race, and we get an existence that is not quite as sucky as you’d imagine (but I’ll get to that later). In a previous essay of mine, “Why the Machines Might Not Exterminate Us,” I argued that AGIs could have multiple reasons not to destroy our species even after vastly surpassing us. These include:
- It’s unethical.
- The unique aspects of human cognition could be valuable to them.
- Aliens or God might punish them for killing us.
- Human consciousness might be unique enough to be worth preserving.
- The systemic value of diversity and “slack.”
The fifth point—preserving diversity to maintain antifragility—deserves special attention. For a brilliant exploration of this idea, I highly recommend Scott Alexander’s “Studies on Slack.”
AGIs might recognize that keeping biologically-based minds around—despite their inefficiency—adds long-term resilience. A system optimized to the hilt is often brittle, and biological intelligences could serve as backups, immune to EMPs or computer viruses, and offering other advantages we can’t even imagine. If unforeseen challenges arise—solar flares, algorithmic failures, black swans—humans might offer a last-resort solution.
But here’s the weird part Yudkowsky makes salient: AGIs could preserve humanity and still plunge Earth into darkness.
Without sunlight, Earth’s surface would freeze, photosynthesis would cease, and most life would die. But with enough time and preparation, humanity could survive this and even thrive. Imagine the world rebuilt into a network of insulated, artificially lit megastructures—somewhere between Las Vegas hotels and arcologies. Food could be grown in domed farms or bioreactors. Environments for animals could be simulated indoors, perhaps at scales large enough to preserve entire ecosystems. And even without sunlight and solar power, several energy sources would remain, including geothermal, nuclear, and fossil fuels (note that the downsides of the latter would largely disappear: without incoming sunlight there would be no greenhouse warming to worry about, and moving the population into sealed habitats would protect us from the noxious gas byproducts of fossil fuel combustion).

As Yudkowsky’s calculations make clear, without any kind of magic or technological breakthroughs, we could produce enough energy to support the whole human race, plus many animal species. And if our AGI overlords beamed down just 0.1% of the “solar endowment” that their Dyson structure intercepted before it would have hit Earth, each person’s energy allotment would be off the charts.
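To put a rough number on “off the charts”: under the essay’s hypothetical 0.1% beam-down share, the same arithmetic gives roughly 21 kilowatts per person, more than two hundred times the 100-watt metabolic baseline. A quick sketch (the 0.1% is the essay’s hypothetical, not a prediction):

```python
# The essay's hypothetical: beam 0.1% of the sunlight the Dyson structure
# intercepts in Earth's place back down to the surface.
EARTH_SOLAR_W = 173_000_000 * 1e9    # 1.73e17 W, the "solar endowment" figure above
POPULATION = 8_000_000_000
BEAM_DOWN_FRACTION = 0.001           # the hypothetical 0.1% share

watts_per_person = EARTH_SOLAR_W * BEAM_DOWN_FRACTION / POPULATION
print(round(watts_per_person))            # ~21,625 W per person
print(round(watts_per_person / 100))      # ~216 times the 100 W metabolic baseline
```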
There is no physics barrier to any of this: the only obstacles would be building the infrastructure and bearing the costs. And for AGIs advanced enough to encase the Sun in a Dyson structure, covering Earth’s surface in habitats would be chump change.
And now the idea gets stranger—and more beautiful.
If AGIs accept that diversity adds value through systemic slack, they may actively promote cultural, biological, and cognitive variety well beyond what humanity currently possesses. There’s nothing special about the number or kinds of races, languages, or ways of thinking we will happen to have at the moment the Sun is blotted out and everyone has to move into arcologies–AGIs could conclude that more diversity is optimal for their purposes. With Earth’s surface dotted with isolated habitats separated from one another by inhospitable terrain, AGIs could reshape the planet into a patchwork of experiments in civilization. Imagine:
- Cities practicing real communism or reinvented feudalism.
- Enclaves where Indigenous American cultures flourish free of colonial legacy.
- New languages designed to expand cognition.
- Genetically modified humans and post-humans with radically different sensory systems or social structures.
- Entirely new intelligent species, biologically or synthetically evolved.
Each habitat could be a testbed for ideas and minds that serve as creative engines—or fail safely. With full-immersion virtual reality, even stranger societies could emerge.
And if this model works on Earth, why not replicate it elsewhere? Mars, Europa, Titan—each seeded with experimental intelligences, some human-descended, some not. Perhaps in A.D. 3284, the hive-minded, click-speaking hobgoblins of Europa will be the ones who save the solar system from Dyson instability.
Final Thoughts
Yudkowsky’s argument is sobering: AGIs may not leave us sunlight. But if they recognize the systemic value of preserving biological minds—not just for sentiment, but for antifragile robustness—they might choose not just to preserve us, but to cultivate us, turning Earth into a living museum, incubator, and think tank.
In this vision, humanity doesn’t just survive in the shadows—we thrive in an AI-guided garden of endless diversity.