On my right screen, four astronauts are answering questions from the public in real time. Reid Wiseman, Victor Glover, Christina Koch, and Jeremy Hansen — the Artemis II crew — launched six days ago, broke the Apollo 13 distance record, swung around the far side of the Moon yesterday, and are now on the long ride home. It is genuinely thrilling. I grew up with the space program. It was an American institution, and the sciences and engineering fields I devoted my career to flourished under the cheerleading that program encouraged.

On my left screen, I am reading a physicist's scathing argument that the entire enterprise of human space flight is an expensive stunt.

The contrast sat with me all morning as five articles from my inbox — from space physics to Swiss fiber optics to protein databases to evolutionary biology to the open web — turned out to be dispatches from a single, ongoing war. The one between people who build real infrastructure and people who tell seductive stories to capture it.

The Hose Problem

Tom Murphy at UC San Diego runs a blog called Do the Math where he subjects popular assumptions to quantitative scrutiny. His target in this piece is space colonization — not space technology, which he's fine with, but the narrative that humanity's destiny lies among the stars.

His demolition is thorough and entertaining. The International Space Station, he points out, is essentially a camp with hoses connected to Earth. Air hose. Water hose. Fuel hose. Food conveyor belt. Without constant resupply, the occupants would be dead within a year. The cost runs to about a million dollars per person per day, and the whole enterprise would be cheaper to run on the ground. He compares it, devastatingly, to the 1958 stunt where two men kept a Cessna 172 airborne for 65 days by refueling from a truck driving underneath them — essentially the same feat, three orders of magnitude cheaper.

The deeper point is about narrative capture. A former NASA administrator once stated that the only justification for confronting the enormous challenge of human space flight is ultimate colonization. Murphy's response: once the spell is broken, that's delusional speech. The colonization story justifies the expenditure. Remove the story, and what remains is expensive stunt work.

I can hear Murphy's argument while watching Wiseman and Koch float in Orion's cabin, and both things are true at once. The old justifications belonged to a different era: spin-off technologies from Velcro to microprocessors, economic juggernauts born from the space race. But the new models matter too: SpaceX driving launch costs down by an order of magnitude, the booming satellite industry, interplanetary robots, asteroid detection. None of these requires humans living in space. They are Earth-serving infrastructure, operated from the ground. The narrative needs updating, not the engineering.

I fly drones over my research site periodically to measure vegetation. The temporal coverage is sparse: I get the spatial picture only on the days I'm out there flying. A persistent overhead platform would close that gap, giving me continuous coverage in both space and time. The commercial satellite constellations are creeping toward that capability, not because anyone is colonizing space, but because the infrastructure keeps getting better while remaining firmly tethered to Earth. Artemis II itself is a test flight: infrastructure validation, not colonization. The astronauts know the difference, even if the mythology doesn't.

The Pipe and the Story

Cory Doctorow's newsletter arrived the same morning with a piece about Switzerland's broadband infrastructure. The Swiss, it turns out, solved the natural monopoly problem that the United States and Germany both fumbled. They ran four-strand dedicated fiber to every home, terminating at a neutral hub. Any carrier can provide service. You switch providers with a click. The result: 25 gigabit symmetrical connections — upload and download — from multiple competing providers. The physical layer is shared; the service layer is competitive.

Doctorow frames this as a "Goldilocks" approach between two failures. The American model lets monopolists extract rent from captive customers. The German model builds redundant competing networks, wasting capital on duplicate trenches. The Swiss separated the pipe from the service and regulated accordingly. Even when Swisscom tried to dismantle the system in 2020, the federal courts slapped them down and fined them for wasting everyone's time.

Then Doctorow generalizes the principle. He calls it a "pluralized utility model" and compares it to public roads — the government builds and maintains the surface, but that doesn't mean the government controls every vehicle. Public buses, private cars, bike co-ops, delivery trucks all share the same infrastructure. You can have public provision at the physical layer without state control at the application layer.

This isn't abstract for me. I run a home server — a Mac Mini on fiber in Oregon City — that serves ecological data and published writing to anyone in the world. No intermediary. No platform. No one else's terms of service. That's the open web as I've known it since 1990, and it depends entirely on the physical layer remaining open.

My fiber provider used to be CenturyLink. Then CenturyLink became Lumen. Then, in February of this year, AT&T acquired Lumen's consumer fiber business for $5.75 billion. My service has already transitioned. The terms of service currently protect my use case, but terms of service are unilateral documents — they can be changed with thirty days' notice, and my only recourse is to cancel. Without the Swiss model, where the pipe is neutral and I choose my provider independently, I'm at the mercy of whoever owns the last mile. In Oregon City, there probably isn't a second fiber option.

The Quiet Builders

The same morning brought two pieces from Asimov Press that, taken together, tell a story about what real scientific infrastructure looks like.

Ella Watkins-Dulaney's history of bioinformatics software is nominally about protein databases and alignment algorithms, but the deeper narrative is about a quantum chemist named Margaret Dayhoff who saw, in 1962, that biology had a data management problem that computers could solve. Her program COMPROTEIN could assemble a protein sequence in minutes that had taken researchers months to work through by hand. She went on to build the Atlas of Protein Sequence and Structure — the first biological sequence database — and invented the single-letter amino acid codes still used today, because three-letter codes wasted precious computer memory.

What strikes me most is the resistance she encountered. One biochemist refused to collaborate, stating flatly: "I am not a theorizer." Frederick Sanger himself resisted computers because they might take away the pleasure of working through sequences manually. These were people at the frontier of their fields who couldn't see that computation wasn't replacing their science but enabling its next phase.

Dayhoff didn't wait for a grand theory to justify her work. She just started collecting and organizing what was available. The Atlas began with about a thousand protein sequences. Today, databases like GenBank and UniProt hold billions. That patient, unglamorous, incremental accumulation eventually made AlphaFold possible, work recognized with the 2024 Nobel Prize in Chemistry. The infrastructure preceded the insight by sixty years.

Sam Clamons's companion piece takes a different approach to the same problem of biological comprehension. He normalizes evolutionary time to one generation per second and watches what emerges. The industrial revolution began ten seconds ago. Behaviorally modern humans appeared half an hour back. The dinosaur extinction, a hundred days. SARS-CoV-2 has existed in humans for approximately ten minutes.

The power is in the comparisons. The peppered moth's adaptation to industrial soot took 47 seconds. Belyayev's foxes produced elite human-seeking individuals in 6 seconds. Barley took 2.7 hours from first exploitation to recognizable domestication. These aren't just entertaining rescalings — they reveal that the intensity of selection pressure matters as much as elapsed time. And when you look across replicates — 203 domesticated crops, twelve parallel E. coli lineages — the variance in evolutionary speed is only three- to five-fold around the median. The signal emerges from the noise if you collect enough data and choose the right frame.
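Clamons's rescaling is simple arithmetic, and it's worth making explicit. Here's a minimal sketch; the generation lengths are my own illustrative assumptions, not necessarily the article's exact values:

```python
# Rescale elapsed calendar time onto Clamons's clock:
# one generation of the organism = one second.

def generation_seconds(years_elapsed: float, generation_years: float) -> float:
    """Seconds on the rescaled clock = number of generations elapsed."""
    return years_elapsed / generation_years

# Humans, assuming ~25-year generations:
print(generation_seconds(250, 25))      # industrial revolution: 10.0 seconds ago
print(generation_seconds(50_000, 25))   # modern humans: 2000.0 s, about half an hour

# Peppered moth, assuming one generation per year:
print(generation_seconds(47, 1))        # 47.0 seconds of industrial soot
```

The rescaling divides out generation time, which is what makes selection intensity comparable across organisms — and why the moth's 47 seconds and barley's 2.7 hours can sit in the same sentence.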

For someone building ecological sensing systems, that's a deeply encouraging finding. Individual sensor readings are noisy, contingent, place-specific. But accumulate enough of them across enough sites and time, and structure should emerge. That's the bet I'm making with every five-minute data point the Macroscope records.

Going Out Swinging

The fifth article was the most personal. Responding to Anil Dash's "Endgame for the Open Web," Jay Hoffman, who runs The History of the Web, lays out a pattern that repeats every ten years or so. A group of well-capitalized "visionaries" takes ideas that emerged organically from open communities and repackages them as centralized, controlled products. The dot-com era turned open storefronts into Amazon. Web 2.0 turned open APIs into walled gardens. Now AI is the latest vehicle for the same consolidation.

But the web is resilient by design, Hoffman argues. Berners-Lee didn't just release it into the public domain; he deliberately made the architecture resistant to capture. Simple technologies, globally accessible, no control over who links to what. After every wave of consolidation, fragments survive. Wikipedia emerged from the dot-com wreckage. WordPress and RSS outlived Web 2.0's collapse. Metafilter and Ravelry never went anywhere.

The closing was raw: "The open web gave me a community, and friends, and a career, and a purpose. It's the foundation I built my life on."

I know that feeling. I could have written that sentence.

Infrastructure All the Way Down

What connects these five pieces is a structural insight that operates at every scale. Murphy exposes the space colonization narrative as a story that distorts investment in useful space infrastructure. Doctorow exposes the free market narrative as a story that distorts investment in useful communications infrastructure. The open web essay exposes the disruption narrative as a recurring story that captures and commodifies open digital infrastructure. Watkins-Dulaney shows that the real breakthroughs in biology came from quiet infrastructure building, not grand narratives. Clamons demonstrates that the deepest patterns in evolution emerge from patient accumulation and careful reframing.

It's infrastructure all the way down. From orbital mechanics to fiber optics to web protocols to genomic databases to the molecular clockwork of evolution itself — the things that actually work are the things that someone built carefully, maintained stubbornly, and refused to let be captured by a more exciting story.

I have a technical specification sitting in a file right now for something I call the "Genetic Address" — a module that would let my ecological profiling engine query population genomics databases by geographic coordinate, returning a molecular portrait of any place on Earth. It draws on datasets that trace directly back to Dayhoff's Atlas. It depends on open APIs maintained by public institutions. It would run on a home server, over fiber, through open protocols, to anyone who asks. It's not built yet. The architecture is ready and the data sources aren't going anywhere — they're only getting richer.
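For concreteness, here is one shape that module's interface could take. This is a hypothetical sketch: every name and field below is invented for illustration and reflects no real API or the actual specification.

```python
# Hypothetical interface sketch for a "Genetic Address" style module.
# All names are invented for illustration; nothing here is a real API.
from dataclasses import dataclass, field


@dataclass
class GeneticAddress:
    latitude: float
    longitude: float
    taxa: list = field(default_factory=list)        # species with genomic coverage nearby
    allele_summaries: dict = field(default_factory=dict)  # locus -> frequency summary


def lookup(latitude: float, longitude: float) -> GeneticAddress:
    """Resolve coordinates to a molecular portrait of a place.

    A real implementation would query open population-genomics
    databases (the lineage of Dayhoff's Atlas) and aggregate the
    results; this stub just returns an empty portrait.
    """
    return GeneticAddress(latitude, longitude)
```

The point of the sketch is the contract, not the plumbing: coordinates in, a molecular portrait out, with all the heavy lifting delegated to open, publicly maintained databases.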

That's what infrastructure looks like. Not a manifesto. Not a moonshot. A sensor reading every five minutes. A domain you own. A server you control. A database that grows one entry at a time. And the stubborn, daily act of keeping it all running while the stories swirl overhead.