This is the second of four connected essays telling the story of "Strata"—a near-future fiction about an interconnected crisis in human and artificial intelligence, and one family's response. If you haven't read Chapter One, please start there, as these essays build on each other.


In Chapter One, we met Maya, a seventeen-year-old who's been learning with her grandfather's bioregional AI system for four years. While her friends use ChatGPT to write essays about books they've never read, Maya uses Strata to understand ecological patterns through actual observation. The contrast crystallizes at a Regenerate Whatcom meeting where community members describe a knowledge crisis—"information everywhere, understanding nowhere"—and announce a $10 million challenge seeking AI frameworks that can reverse the decline.

The chapter ended with Maya's family gathered around a kitchen table at Owl Farm in Bellingham, realizing that her grandfather Paul had spent fourteen years building exactly what the contest was asking for. The question: Could they explain it well enough for others to build it too?

Chapter Two begins the next morning, as the family starts mapping what makes Strata different from commercial AI systems—and discovers that the real innovation isn't the technology itself, but the relationship it enables between intelligence, people, and place.

— Mike Hamilton
Canemah Nature Lab
November 2025


Chapter 2: The Collaboration

They started the next morning in Mary's living room, coffee and notebooks spread across the old oak coffee table Paul had helped her refinish two summers ago. Outside, February rain turned the Whatcom County forest into shades of grey and green.

"We need to map what Strata actually does," Howard said, opening his laptop. "Not the technical specs yet. Just... what happens when someone uses it versus using ChatGPT or Claude or whatever."

Maya pulled out her field notebook—four years of observations now, pages warped from field use and rain. "When I asked Strata about the salamanders, here's what happened." She read from her notes: "Soil temperature 2.1°C above historical average. However, I cannot observe what you observe. Are the salamanders healthy?"

"It gave you data, then sent you back to look," Catherine said.

"Always. Every single time." Maya flipped through pages. "When I asked about a plant identification, it showed me similar species and asked me to observe specific features. When I asked about water quality, it gave me chemistry readings then asked: What do the indicator species tell you?"

Paul was at the whiteboard Mary kept in the corner. He drew two columns:

CURRENT AI | STRATA

"What else?" he asked.

"Current AI gives answers," Howard said. "Complete, confident, immediate. Even when it's wrong, it sounds certain."

"Strata expresses uncertainty," Maya added. "It tells me what it doesn't know. What it can't observe. Where its models might be wrong."

Paul wrote:

  • Gives complete answers | Asks questions
  • Confident even when wrong | Expresses uncertainty
  • Replaces observation | Directs observation

"And the learning pattern is different," Maya said. "My friends ask ChatGPT for answers they don't understand and move on. I ask Strata and end up spending an hour outside with a field guide and a hand lens, actually looking at things."

Mary came in from the greenhouse with Margaret, who'd driven over from the Lummi reservation. Margaret set a basket of cedar bark on the table—materials she and Mary had been preparing to weave.

"Tell them what you told me," Mary said.

Margaret picked up a piece of bark, running her thumb along the inner surface. "This cedar should have been ready three weeks ago. The timing's all wrong now. I've been harvesting cedar for sixty years—learned from my grandmother, who learned from hers. We know the signs: when the sap rises, when the bark loosens, when to ask the tree. But the trees are confused now. Everything's shifted."

"Could you teach that knowledge to someone young?" Paul asked. "Someone who hasn't spent sixty years with cedar?"

"I try. But the young people... they want to look it up. Watch a YouTube video. They don't want to spend years learning to feel it."

"What if AI helped with that?" Maya asked. "Not replacing the learning, but... helping someone understand what to pay attention to? When to ask you questions?"

Margaret looked skeptical. "How would a machine know what to pay attention to?"

"It wouldn't," Mary said. "Not on its own. But what if it learned from you? With your permission, with your guidance. What if it could help people understand enough to ask you better questions?"

"And you'd decide what knowledge goes in," Paul added. "What's shareable, what's sovereign, what stays with the Lummi people."

Margaret was quiet for a long moment, turning the cedar bark in her hands. "Tell me more."


By afternoon they'd filled three whiteboards and Maya's laptop was covered in sticky notes. Howard had sketched out a system architecture that made his electrical engineering background visible—clean diagrams showing how Strata connected to sensors, processed data, interfaced with users, maintained ground truth.

"The key difference," Howard said, "is feedback loops. Current AI is trained once, deployed, then just runs. Strata is constantly being retrained by actual observations. Every time Paul or Maya or Mary records field data, that goes back into the system. It can't drift away from reality because reality keeps correcting it."

"That's what prevents model collapse," Maya said, connecting it to the research papers Paul had sent her. "Those papers showed AI trained on AI output gradually loses touch with rare events, edge cases, the tails of distributions. But Strata's trained on real observations, not synthetic data."

"And it maintains metacognition," Catherine added. She'd been sketching while they talked—visual representations of how different AI systems worked. "It never lets users forget what they don't know. It's constantly asking: Are you sure? What did you observe? How confident are you?"

Margaret had been listening carefully. "And you're saying this could help our youth learn traditional practices? Not from a machine, but from elders, with the machine helping them understand what questions to ask?"

"That's the idea," Mary said. "The machine doesn't know about cedar harvest. You do. But it could help someone understand enough about seasonal patterns, about tree physiology, about observation skills, that when they come to you, they're ready to learn."

"It prepares them," Margaret said slowly, "to receive teaching."

"Yes." Mary understood immediately. "It doesn't replace the teacher. It prepares the student."

Margaret nodded. "Then I want to help write this. If you're serious about knowledge sovereignty, about doing this right, I want to be part of it."


That evening, David Wilson and Elena Santos drove over from their places in Whatcom County. Paul had called them after the meeting—"We're working on something for the challenge. Could use your input."

They gathered around Mary's kitchen table, now covered in notes and diagrams.

"Show us what you've got," David said.

Howard walked them through it: the technical architecture, the feedback loops, the ground-truth anchoring. Maya described her experience as a user. Margaret explained the knowledge sovereignty frameworks they were developing.

Elena kept shaking her head. "This is what we need. Exactly what we need. But can it scale? Can it work for more than just a few researchers?"

"That's what we're trying to figure out," Paul said. "I built Strata for myself, very specific to my work. But Maya's been using it for four years and she's not an ecologist. Mary's been using it up here with completely different projects. Margaret's talking about using it for cultural knowledge transfer. So maybe it's more adaptable than I thought."

"You're thinking about it wrong," David said. He'd been studying Howard's architecture diagrams. "You're not trying to scale one AI to serve millions of people. You're creating a framework for communities to grow their own AI. Each one specific to its place, its people, its knowledge."

"Like the Macroscope installations," Howard said, getting it. "Paul built one at Canemah, one here at Owl Farm. Same basic architecture, but each one's adapted to its location. Each one's learning from its specific place."

"So the proposal isn't 'here's Strata, deploy it everywhere,'" Catherine said. "It's 'here's the framework for how to grow your own Strata, rooted in your own bioregion.'"

"That's bioregionalism," Mary said. "That's what Regenerate Cascadia is all about—not one solution imposed from above, but communities developing solutions appropriate to their own places."

Elena pulled out her tablet. "The contest is asking for frameworks, not finished products. You're not competing with deployed systems. You're proposing a methodology. A way of building AI that's fundamentally different."

"And you've already proven it works," David added. "Two installations, years of data, multiple users with different backgrounds and needs. That's your proof of concept."

Paul looked around the table at these people he'd known for years through Regenerate Whatcom, at Mary who understood living systems as deeply as he did, at his son-in-law who saw the engineering elegance of it, at his daughter who could visualize it, at his granddaughter who'd grown up with it, at Margaret who was willing to trust them with something precious.

"Alright," he said. "Let's write the proposal."


Maya couldn't sleep that night. She lay in Mary's guest room, listening to rain on the roof, thinking about her friends at school. Jordan who'd never read Gatsby. Connor who didn't want to understand algorithms. June who asked AI for life advice.

None of them were dumb. They were smart, capable, curious about things they cared about. But they were learning to be passive. Learning that information was the same as understanding. Learning that performance mattered more than knowledge.

She thought about the salamanders migrating early, about Margaret's cedar bark ready at the wrong time, about all the ways the world was changing faster than people could adapt. And she thought about Strata asking her: What do you observe?

The difference was relationship. Her friends had a transactional relationship with AI: ask for output, receive output, move on. She had something else with Strata—something more like conversation, or collaboration, or maybe teaching. Strata never let her be lazy. Never let her accept easy answers. Never let her forget that understanding required work.

She got up and opened her laptop. Started writing:

User Experience: Learning to Learn

I've been using Strata since I was thirteen. My friends have been using ChatGPT since they were twelve. We're both Generation AI—the first generation to grow up with AI as a normal part of life. But we're learning completely different things.

My friend Jordan uses AI to write essays about books he hasn't read. He gets A's. He learns nothing. I use Strata to understand ecological patterns in my own backyard. My grades are probably lower. But I know what I'm looking at when I go outside.

The difference isn't the technology. It's the relationship...

She wrote for an hour, trying to capture what was so hard to articulate: how Strata had taught her to be curious, to observe carefully, to question her own assumptions. How it had made her smarter by never letting her be passive.

In the morning, she showed it to her mom over breakfast.

Catherine read it twice. "This is the heart of the proposal, you know. The technical architecture matters, but this—this is why it matters."

"It's just my experience. Grandfather's the scientist, Dad is the engineer—"

"And you're the actual user. The generation this is supposed to serve." Catherine pointed at a paragraph. "This part about learning to learn—that's what the contest is really asking for. Not better AI tools. A different relationship with intelligence."

Mary came in from feeding the chickens. "Let me see."

She read it standing at the counter, still in her rain jacket, and when she looked up her eyes were bright.

"Paul," she called, "come read this."


They worked for ten days straight. Howard took family leave. Maya used her spring break. Catherine set up a workspace in Mary's living room, creating visual diagrams of how Strata worked, how it could scale, how communities could adapt it.

Paul handled the technical architecture—the sensor networks, the data processing, the machine learning that kept Strata grounded in observation. Howard designed the hardware specifications and deployment frameworks. Mary and Margaret worked on the knowledge sovereignty protocols: how to protect traditional knowledge, how to ensure community ownership, how to prevent corporate capture.

And Maya wrote about being a learner.

They held video calls with people across Cascadia: Portland Regenerate groups, Vancouver Island communities, tribal partners from other Coast Salish nations. Each conversation added new dimensions. A teacher in Portland described students who couldn't write without AI. A fisher in British Columbia talked about youth who could look up tide tables but couldn't read the water. An Indigenous language teacher in Seattle explained how AI could help with vocabulary but couldn't teach the cultural context that made language alive.

"Everyone's describing the same crisis," David said on one call. "Knowledge everywhere, understanding nowhere. Information abundant, wisdom scarce."

"Because current AI treats knowledge as data," Paul said. "Extract it, process it, deliver it. No relationship, no context, no embodiment."

"Strata treats knowledge as something that grows," Mary added. "In people, in places, through relationship over time."

Margaret spoke up from her seat next to Mary. "In our traditions, knowledge isn't something you have. It's something you're in relationship with. You don't own it, you're a caretaker of it. That's what this framework does—it makes AI a caretaker too, not an owner."

"That needs to be in the proposal," Elena said. "That exact framing."


On day six, Paul stood at the whiteboard trying to articulate something he'd understood intuitively but never put into words.

"The problem with current AI development is the training data. They scrape the internet—social media, forums, blogs, everything. But most online content is optimized for engagement, not truth. Short, provocative, emotionally charged. That's what causes brain rot in the models."

"And model collapse happens when AI trains on AI output," Howard added, referencing the research papers. "Each generation loses fidelity. Rare information disappears. It converges toward mediocrity."

"So Strata resists both problems by training on what?" Catherine asked, trying to visualize it.

"Ground truth," Paul said. "Actual sensor data. Real observations. Field notes from people who are outside looking at things. Traditional knowledge from people who've spent decades learning to see."

"Not scraped from the internet," Maya said, getting it. "Intentionally curated. Only high-quality sources."

"And constantly updated with new observations," Paul added. "It never stops learning from reality."

"That's the architectural innovation," Howard said, sketching rapidly. "Current AI is trained once and deployed. Strata is in constant conversation with the real world. It can't drift because reality keeps correcting it."

"Write that," Catherine said. "That's a core principle."


On day ten, they had a complete draft. Seventy-three pages: technical specifications, community frameworks, ethical protocols, implementation guidelines, and Maya's essay about learning to learn.

They gathered around Mary's table one more time: Paul, Mary, Howard, Catherine, Maya, Margaret, David, and Elena.

Catherine projected the title page:

STRATA: A Framework for Growing Bioregional Intelligence

Submitted to The Cascadia Intelligence Commons Challenge

Authors: Paul Hamilton (Canemah Nature Lab), Mary Sullivan (Owl Farm), Maya Quackenbush (student researcher), Howard Quackenbush (technical implementation), Catherine Hamilton (communication design), Margaret Williams (Lummi Nation, knowledge sovereignty), with community partners across Cascadia

"Read the abstract," David said.

Paul pulled it up on his laptop:

We propose a framework for developing AI systems that reverse current trends toward human cognitive decline, loss of metacognitive awareness, model collapse, and disconnection from place-based knowledge. Over fourteen years, we developed Strata—a bioregional intelligence system that demonstrates how AI can enhance rather than replace human capability. Unlike commercial systems that extract knowledge and deliver information transactionally, Strata facilitates learning through continuous relationship between users, places, and communities.

Our framework addresses six interconnected crises:

  1. Declining human cognitive abilities (Flynn effect reversal)
  2. Erosion of metacognitive awareness in AI users
  3. Model collapse in AI trained on synthetic data
  4. "Brain rot" from low-quality training data
  5. Loss of traditional and place-based knowledge
  6. Disconnection of learners from observation and embodiment

We present technical architecture, ethical protocols, community ownership structures, and four years of validation across multiple users and bioregions. The framework is open-source, community-adaptable, and designed to be grown rather than deployed—each bioregion cultivating its own intelligence, rooted in its own place.

Margaret was crying. "You listened. You really listened."

"It's your framework too," Mary said. "You helped build it."

"Will it be enough?" Elena asked. "To win, I mean?"

"I don't know," Paul said honestly. "But it's true. Every word of it is true. We've proven it works. Now we just have to explain it well enough that other people can see it too."

Howard checked his watch. "Submission deadline is midnight tonight. We need to upload it."

"Wait," Maya said. She pulled out her phone, took a photo of all of them around the table: three generations, two bioregions, cultures in collaboration, holding a proposal that might change how humanity and artificial intelligence could grow together.

"For the record," she said. "In case this matters someday."

"It already matters," Mary said. "Whether we win or not, we've proven it's possible."

Paul uploaded the proposal at 11:47 PM. They sat in Mary's living room, exhausted, exhilarated, uncertain, hopeful, knowing they'd done something that mattered.

Outside, February rain continued falling on Owl Farm, on the Salish Sea, on the forests of Cascadia, on a world desperately needing new ways to think, to learn, to be intelligent together.


Next: Chapter Three—The Proposal (presentations, competition, and what happens when they win)
