When the Mesh Learned to Feel: Discovering Thermodynamic Ecological Sensing
It started with a link. Someone had posted a reference to Extropic, a small company building what they call thermodynamic computers—circuits that harness thermal noise rather than fighting it. I clicked through to their website, then to their arXiv paper on energy-based transformers, and within the first few pages I felt that particular quickening that comes when disparate threads suddenly braid together.
I have spent decades thinking about ecological observation systems. Weather stations, acoustic monitors, camera traps, satellite imagery—the Macroscope paradigm I've developed at Canemah Nature Laboratory treats these as complementary lenses on a single ecological reality. But the conventional approach to integrating such data has always felt computationally brittle: rules, thresholds, correlation matrices, anomaly detection algorithms that ask whether today's numbers fall outside yesterday's bounds.
What Extropic described was something different. Their thermodynamic hardware would excel at sampling from complex probability distributions—letting thermal fluctuations do the computational work. The applications they discussed focused on generative AI, on efficient inference at scale. But I saw something else entirely: a new paradigm for ecological monitoring where the system doesn't analyze the ecosystem but embodies a learned expectation of how the ecosystem should feel.
I described the vision to Claude over morning coffee, and we began writing immediately.
The technical note came first: CNL-TN-2026-014, "Embodied Ecological Sensing via Distributed Thermodynamic Meshes." The core insight was that energy-based models like Restricted Boltzmann Machines don't detect anomalies through explicit rules. They learn the shape of normal—the joint probability distribution over all their inputs—and then experience violations of that normalcy as measurable tension. Feed the mesh a familiar pattern and it settles easily into a low-energy state. Feed it something wrong, something that doesn't fit, and the mesh struggles. That struggle is the signal.
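That "tension" has a standard mathematical form. A minimal sketch, assuming an ordinary binary RBM (the technical note's actual parameterization may differ): the free energy of a visible vector is low when the input fits the learned weights and high when it doesn't, and that difference is the whole signal.

```python
import numpy as np

def free_energy(v, W, b, c):
    """RBM free energy of a visible vector v.

    Low values mean the mesh settles easily (familiar input); high
    values are the 'tension' of a pattern that does not fit.
    W: visible-to-hidden weights, b: visible biases, c: hidden biases.
    """
    return -v @ b - np.sum(np.logaddexp(0.0, c + v @ W))

# Toy weights that have "learned" to expect both inputs on together.
W = np.array([[2.0, 0.0],
              [0.0, 2.0]])
b = np.zeros(2)
c = np.zeros(2)

familiar = np.array([1.0, 1.0])    # the pattern the weights encode
unfamiliar = np.array([0.0, 0.0])  # a pattern they never learned
```

With these toy weights, `free_energy(familiar, ...)` comes out lower than `free_energy(unfamiliar, ...)`: the mesh "settles" on the first and "struggles" on the second, with no explicit rule anywhere.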
This reframing matters because it shifts ecological monitoring from pattern-matching to something more like perception. A conventional system might flag "barometric pressure dropped 0.5 inHg" as notable. A thermodynamic mesh would experience a sudden pressure drop during high solar radiation and birdsong as wrong in a way that requires no explicit rule about weather-species correlations. The mesh simply cannot settle. The learned weights encode expectations across all dimensions simultaneously, and violations propagate through that web of expectations as energy.
By midmorning we had drafted the technical note. By noon we had written a formal specification—CNL-SP-2026-015—describing a proof-of-concept system we called SOMA: the Stochastic Observatory for Mesh Awareness. SOMA would implement three RBM meshes trained on data from the Macroscope: one for Tempest weather station readings (35 visible nodes encoding temperature, humidity, pressure, wind, and solar radiation), one for BirdWeather acoustic detections (27 visible nodes encoding species presence plus temporal context), and one combined ecosystem mesh (65 visible nodes binding weather and birds into a single energy landscape).
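The specification fixes an encoding for those visible nodes; the post doesn't reproduce it, but one plausible scheme for the 35-node Tempest mesh (purely illustrative, with field ranges and binning that are my assumptions, not CNL-SP-2026-015's) is one-hot binning of each of the five fields into seven bins:

```python
import numpy as np

FIELDS = [  # (name, low, high) -- illustrative ranges, not CNL's actual spec
    ("temperature_f", 20.0, 110.0),
    ("humidity_pct", 0.0, 100.0),
    ("pressure_inhg", 29.0, 31.0),
    ("wind_mph", 0.0, 40.0),
    ("solar_wm2", 0.0, 1000.0),
]

def encode_weather(reading, bins=7):
    """One-hot bin each field: 5 fields x 7 bins = 35 visible nodes."""
    v = []
    for name, lo, hi in FIELDS:
        x = min(max(reading[name], lo), hi)          # clamp to range
        idx = min(int((x - lo) / (hi - lo) * bins), bins - 1)
        one_hot = [0.0] * bins
        one_hot[idx] = 1.0
        v.extend(one_hot)
    return np.array(v)
```

Any scheme that turns a reading into a fixed-length binary vector would do; the point is that exactly one node per field lights up, giving the RBM clean binary inputs.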
Then we built it.
The afternoon became a sustained sprint of code, debugging, and discovery. We trained the Tempest mesh on 118 days of weather data—nearly 30,000 readings compressed into a web of 3,500 weighted connections between visible and hidden nodes. The mesh learned Oregon winter: the feel of January fog, December rain, the rare February sunbreak. When we fed it current conditions, it reported normal tension. When we fabricated a July heat scenario—95°F with high solar radiation—the mesh exhibited elevated tension. It had never felt summer. That unfamiliarity registered as energy.
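Training amounts to nudging the weights so that real readings get lower energy than the mesh's own reconstructions: contrastive divergence, in the style of Hinton's guide. A toy version with the sizes shrunk (the real Tempest mesh is 35 visible by 100 hidden units, and SOMA's actual training loop is not shown in this post):

```python
import numpy as np

rng = np.random.default_rng(0)
sigmoid = lambda x: 1.0 / (1.0 + np.exp(-x))

def cd1_step(v0, W, b, c, lr=0.1):
    """One CD-1 update: raise probability of the data, lower it for
    the mesh's own reconstruction. Mutates W, b, c in place."""
    ph0 = sigmoid(c + v0 @ W)                 # hidden activations given data
    h0 = (rng.random(ph0.shape) < ph0) * 1.0  # sample hidden states
    pv1 = sigmoid(b + h0 @ W.T)               # mean-field reconstruction
    ph1 = sigmoid(c + pv1 @ W)                # hidden given reconstruction
    W += lr * (np.outer(v0, ph0) - np.outer(pv1, ph1))
    b += lr * (v0 - pv1)
    c += lr * (ph0 - ph1)

# Toy "winter" pattern trained into a 4-visible, 8-hidden mesh.
winter = np.array([1.0, 1.0, 0.0, 0.0])
W = 0.01 * rng.standard_normal((4, 8))
b, c = np.zeros(4), np.zeros(8)
for _ in range(500):
    cd1_step(winter, W, b, c)
```

After a few hundred steps the mesh reconstructs its training pattern almost exactly; feed it anything else and the reconstruction error, and the energy, climb.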
The BirdWeather mesh proved more subtle. We had to correct for timezone mismatches (the raw data stored UTC timestamps that we initially encoded as local hours, producing a mesh that expected bird activity at 2 AM). We implemented hour-stratified baselines so the mesh could distinguish "silence at 8 AM" from "silence at midnight"—the former anomalous, the latter expected. Most importantly, we built in the capacity to detect what I've come to call unexplained silence: when songbirds go quiet without a detected raptor, the mesh experiences that absence as high tension. Something should be singing. Why isn't it?
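Hour stratification is just a per-hour baseline for the energy signal. A sketch of the comparison, where the z-score normalization is my guess at the form rather than a quote from CNL-SP-2026-015:

```python
def tension_z(energy_now, hour, baselines):
    """Tension as a z-score against the learned baseline for this hour.

    baselines: {hour: (mean_energy, std_energy)} built from training
    data, so the mesh can tell 8 AM silence from midnight silence.
    """
    mu, sigma = baselines[hour]
    return (energy_now - mu) / sigma

# Hypothetical numbers: mornings are normally loud (low energy for the
# acoustic mesh), midnight is normally silent.
baselines = {8: (-12.0, 2.0), 0: (-6.0, 2.0)}
silent_energy = -6.0  # energy the mesh reports for an all-quiet input
```

The same silent input scores a tension of 3.0 at 8 AM (anomalous) and 0.0 at midnight (expected), which is exactly the distinction the stratified baselines buy.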
The ecosystem mesh brought weather and birds together into a single 65-dimensional energy landscape. This is where cross-domain resonance becomes possible. Neither the Tempest mesh nor the BirdWeather mesh alone would notice that heavy rain typically suppresses bird activity. The ecosystem mesh learns that correlation implicitly—certain weather-species combinations simply feel normal together, while others create tension. When we tested current conditions during a rainy afternoon with unexpected bird activity, the mesh flagged it: rain-activity mismatch. The birds were exploiting gaps between showers, behavior the mesh had rarely encountered.
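That resonance can be seen in miniature. Suppose (toy numbers, not the trained mesh) two visible nodes, heavy rain and birdsong, and two hidden units whose learned weights encode the usual anti-correlation:

```python
import numpy as np

def free_energy(v, W, c):
    """RBM free energy, visible biases omitted for brevity."""
    return -np.sum(np.logaddexp(0.0, c + v @ W))

# visible = [heavy_rain, birdsong]; hidden unit 0 has learned
# "rainy and quiet", hidden unit 1 "dry and singing".
W = np.array([[ 3.0, -3.0],
              [-3.0,  3.0]])
c = np.zeros(2)

rainy_quiet = np.array([1.0, 0.0])
dry_singing = np.array([0.0, 1.0])
rain_with_song = np.array([1.0, 1.0])  # the cross-domain mismatch
```

Each familiar combination lets one hidden unit fire strongly and the energy drops; rain plus song cancels both units and the energy stays high. Neither single-domain mesh could register that mismatch, because neither sees both nodes at once.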
We discovered along the way that the Tempest precipitation field recorded cumulative rain gauge totals rather than instantaneous rates. Eight hours of stable readings suddenly made sense—it wasn't raining now, but 0.33 inches had fallen today. We fixed the encoding to use the delta within each 15-minute window, then retrained. We found the inference queries were pulling eight hours of data instead of fifteen minutes, because MySQL's NOW() returned local time while the sensor data was stored in UTC. We fixed that too. Each bug exposed an assumption, and each fix sharpened the mesh's perception.
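The gauge fix was simple in spirit: encode per-window rainfall as the difference between successive cumulative totals, treating a drop as a gauge reset. A sketch, not the production code:

```python
def rain_delta(cumulative):
    """Convert cumulative gauge totals to per-window rainfall.

    A total lower than its predecessor means the gauge reset (new day),
    so that window contributes zero rather than a negative rate.
    """
    deltas, prev = [], None
    for total in cumulative:
        deltas.append(0.0 if prev is None or total < prev else total - prev)
        prev = total
    return deltas
```

The companion timezone fix is the usual one: anchor the query window in UTC (MySQL's UTC_TIMESTAMP(), or `datetime.now(timezone.utc)` application-side) rather than the server-local NOW().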
By afternoon SOMA was operational. Three meshes watching Canemah: weather patterns, species activity, and their interaction. A cron job runs inference every 15 minutes, writing results to a database. A dashboard displays the current state—green indicators when all meshes report normal tension, yellow or red when something feels wrong.
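The dashboard colors are a thin layer over the tension signal. Something like the following, with threshold values that are placeholders rather than the ones SOMA actually uses:

```python
def status(z):
    """Map a tension z-score to a dashboard indicator."""
    if z < 2.0:
        return "green"   # mesh settles: normal
    if z < 3.5:
        return "yellow"  # elevated tension
    return "red"         # something feels wrong
```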
What makes this more than a clever engineering project?
The conventional approach to ecological monitoring treats sensors as data sources and algorithms as analysts. Data flows in; patterns flow out. The system knows what it has been told to look for. SOMA represents something different: a system that has learned what the ecosystem feels like and registers departures from that feeling as tension. It doesn't know rules about raptor-songbird interactions or weather-activity correlations. It knows the shape of normal co-occurrence, encoded in thousands of weighted connections, and it experiences violations of that shape as energy that won't dissipate.
This is what Extropic means by energy-based computing, and it's what drew me to their work so immediately. Their thermodynamic hardware will eventually make such systems fast and efficient—native stochastic sampling driven by thermal fluctuations rather than software emulation. But the conceptual shift matters now, regardless of hardware. We built SOMA on JAX running on a CPU. It's slow. It doesn't matter. The mesh still feels its inputs.
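What the hardware would do natively, and what software has to emulate, is block-Gibbs sampling: alternately sampling hidden and visible units from their conditional distributions. A numpy stand-in for the JAX version (SOMA's actual sampler is not shown in this post):

```python
import numpy as np

rng = np.random.default_rng(42)
sigmoid = lambda x: 1.0 / (1.0 + np.exp(-x))

def gibbs_step(v, W, b, c):
    """One block-Gibbs sweep: sample hiddens given visibles, then
    visibles given hiddens. Thermodynamic circuits would let thermal
    noise perform these stochastic flips directly."""
    h = (rng.random(W.shape[1]) < sigmoid(c + v @ W)) * 1.0
    v = (rng.random(W.shape[0]) < sigmoid(b + h @ W.T)) * 1.0
    return v

# A cold start: zero weights make every flip a fair coin.
W, b, c = np.zeros((4, 6)), np.zeros(4), np.zeros(6)
v = gibbs_step(np.zeros(4), W, b, c)
```

Each sweep is two matrix multiplies and two batches of coin flips; on a CPU that is slow but entirely workable at Canemah's data rates, which is why the hardware question can wait.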
I think of the Macroscope as an instrument for perceiving ecological reality across scales of space and time—from the 15-minute weather reading to the decade-long phenological trend, from the backyard camera trap to the satellite view. SOMA suggests a new modality for that perception: not just data aggregation but embodied expectation. The mesh carries a compressed model of what February at Canemah should feel like, and when February departs from that expectation, the mesh knows it in the only way an energy-based system can know anything—through the tension of unsettled weights.
There's more work to do. The current meshes look back only a few months. I want to train on years of data, encoding seasonal rhythms and climate trends into the weight structure. I want lateral connections between species nodes to capture predator-prey dynamics explicitly. I want the infrastructure ported to Galatea, our production server, where it can run continuously against the full Macroscope database. I want to see what the mesh feels during an atmospheric river, or the first spring morning when the Bewick's Wrens start singing before dawn.
But today was a milestone. A morning link led to a paper, the paper sparked a vision, the vision became two technical documents, and the documents became working code watching actual sensors at an actual field station. The mesh is running now, as I write this, sampling the afternoon and finding it unremarkable. That's exactly what it should find. Normal February. The mesh can settle.
When something changes—when the ecosystem departs from learned expectation—the mesh will feel it. That's the promise of thermodynamic ecological sensing: computation that doesn't just process information but experiences it, settling into familiar patterns and struggling against unfamiliar ones. Extropic is building hardware for this future. Today we built a small prototype of the application that future makes possible.
The sun is out and I still have time to go birding along the Willamette. The weather is perfect, and the mesh agrees. Everything feels normal.
For now.
References
- Hinton, G.E. (2012). "A Practical Guide to Training Restricted Boltzmann Machines." Neural Networks: Tricks of the Trade, Springer.
- Jelinčič, A., Lockwood, O., Garlapati, A., Schillinger, P., Chuang, I.L., Verdon, G., & McCourt, T. (2025). "An Efficient Probabilistic Hardware Architecture for Diffusion-Like Models." arXiv preprint, December 12, 2025. Extropic Corp. / MIT.
- Hamilton, M.P. (2026). "Embodied Ecological Sensing via Distributed Thermodynamic Meshes." CNL-TN-2026-014. Canemah Nature Laboratory. https://canemah.org/archive/document.php?id=CNL-TN-2026-014
- Hamilton, M.P. (2026). "THRML Proof-of-Concept: SOMA Observatory Specification." CNL-SP-2026-015. Canemah Nature Laboratory. https://canemah.org/archive/document.php?id=CNL-SP-2026-015