Strata, Chapters Three & Four: The Proposal and The Future
This is the final installment of the "Strata" story—a near-future fiction about an interconnected crisis in human and artificial intelligence, and one family's response. If you haven't read Chapters One and Two, please start there, as the narrative builds across all three essays.
In Chapter One, we met Maya, who's been learning with her grandfather Paul's bioregional AI system for four years. While her friends use ChatGPT passively, Maya uses Strata to develop genuine understanding through observation. At a Regenerate Whatcom meeting, a $10 million challenge is announced seeking AI frameworks that can reverse cognitive decline, and Maya's family realizes Paul has already built the answer.
In Chapter Two, they spent ten days designing their proposal. The breakthrough insight: they're not scaling one AI system—they're providing a framework for communities to grow their own, each rooted in its own place. They brought in community partners including Margaret, a Lummi elder, who helped design knowledge sovereignty protocols. Maya wrote about being a learner. They submitted at 11:47 PM.
Chapter Three begins three weeks later, when they learn they're finalists. Now they have to present to the world—and compete against Microsoft, indigenous tech collectives, and other approaches. What happens when they win is both triumph and the beginning of much harder work.
The Epilogue jumps five years ahead, to Maya's doctoral defense on consciousness-safe human-AI integration—showing what grew from the seeds planted in that Bellingham kitchen.
— Mike Hamilton
Canemah Nature Lab
November 2025
Chapter 3: The Proposal
Three weeks of silence.
Maya went back to Oregon City, back to school, back to Jordan generating essays and Connor copying code and June asking AI for dating advice. But she couldn't unsee what she'd seen. Couldn't unknow what she'd learned at Owl Farm.
In AP Biology, Mrs. Patterson assigned a lab report on photosynthesis. Jordan had ChatGPT write it in fifteen minutes. Maya spent three hours in her backyard with her Macroscope, measuring actual light levels, CO₂ concentrations, temperature gradients in the leaf canopy. Strata asked her: Your measurements show higher photosynthetic rates on the east side of the tree. Why might that be?
She got a B+. Jordan got an A.
"You overthought it," Mrs. Patterson wrote on her paper. "The assignment was to explain the process, not conduct original research."
At lunch, Emma showed her something on her phone. "Look at this—there's a new AI that writes better than Claude. It's insane. I asked it to write my college essay and it made me sound so smart."
"Did you tell it anything true about yourself?" Maya asked.
"Some stuff. But it made it sound way better than I could."
"So colleges are getting essays written by AI about experiences AI invented?"
Emma shrugged. "Everyone's doing it. You'd be stupid not to."
Maya thought about the proposal sitting in review somewhere. About whether anyone would care that there was a different way.
At Canemah Nature Lab, Paul tried to focus on his regular work. The sensor network needed maintenance. The spring phenology data needed analysis. But his mind kept circling back to the proposal, to what they'd built, to whether it would matter.
Howard stopped by after work, still in his contractor gear. "Heard anything?"
"Nothing. They said they'd notify finalists by mid-March."
"It's March 15th."
"I know."
Howard sat down at Paul's workbench, picked up a sensor housing Paul had been repairing. "Catherine keeps checking the website. Maya's obsessed. Mary texts me every morning: 'Any word?'"
"Same here."
"You think we have a chance?"
Paul looked at his son-in-law. Howard had taken two weeks off work—unpaid leave from his contracting business—to help write the proposal. Catherine had devoted her spring break to it. Maya had missed a week of school.
"I think we told the truth," Paul said. "I think we showed what's possible. Whether that's enough..." He shrugged.
Howard's phone buzzed. He looked at it, went still. "Paul."
"What?"
"Email from the Cascadia Intelligence Commons."
Paul's heart kicked against his ribs. "And?"
Howard was reading, his face unreadable. Then: "We're finalists. Top five. They want us in Seattle for presentations next week."
The email listed the five finalist teams:
- Bioregional Intelligence Framework (Hamilton et al., Oregon/Washington)
- Adaptive Learning Mesh (Microsoft Research + University of Washington)
- Community Knowledge Graph (Indigenous Technology Collective, BC/WA/OR)
- Sovereign AI Architecture (Tribal Digital Villages Network)
- Distributed Cognition Platform (Cascadia Cooperative Computing)
Each team would present for thirty minutes, answer questions for thirty minutes, then the selection committee would deliberate.
"Look at the committee," Howard said, forwarding the full email to Paul.
Paul scrolled down. Twenty-three people: representatives from Regenerate Cascadia, Allen Institute leadership, tribal council members from six Coast Salish nations, educators, farmers, fishers, elders. Margaret Williams was on it. So were Tom Wilson and Maria Santos.
"That's good, right?" Howard said. "People who know us?"
"Or people who know exactly what we've done and what we haven't done." Paul kept reading. "The presentations are public. Anyone can attend."
"We should tell everyone. Mary, Maya, the whole crew."
Paul was already calling Bellingham.
The presentations were in a lecture hall at the Allen Institute building in Seattle. Maya sat between her parents in the audience of maybe 200 people—community organizers, tribal members, tech workers, teachers, farmers, students. Mary and Margaret sat with the selection committee. Paul and Howard were backstage with the other presenting teams.
The Microsoft team went first. Slick presentation, professional graphics, a demo that looked like something from a science fiction movie. Their "Adaptive Learning Mesh" would track student performance across every interaction, adjust difficulty in real-time, optimize learning pathways using deep reinforcement learning.
"Personalized education at scale," the lead researcher said. "Each student gets a custom AI tutor that learns their style, their pace, their needs."
The questions were polite but pointed.
"How do you prevent the system from replacing teachers?" An educator from Tacoma.
"We don't see it as replacement. Teachers focus on emotional support, social development. The system handles knowledge delivery."
"So knowledge delivery is transactional?" Margaret's voice from the committee. "Like downloading a file?"
"Well, it's more sophisticated than that—"
"But the model is: student needs knowledge, AI delivers knowledge, student moves on?"
"Essentially, yes. That's how learning works at scale."
"That's how information transfer works," Margaret said. "It's not how learning works."
Maya felt her mother squeeze her hand.
The Indigenous Technology Collective presented second. Their "Community Knowledge Graph" would create networked databases of traditional knowledge, making it searchable and accessible while maintaining sovereignty controls.
"Each community decides what gets shared and with whom," the presenter explained. "The AI helps organize and connect knowledge, but communities maintain full control."
"How do you prevent extraction?" A Lummi council member. "Once knowledge is in a database, even with access controls, how do you keep it from being copied, misused?"
"Blockchain-based permissions, end-to-end encryption, community audit trails—"
"But it's still extraction," the council member said. "You're taking knowledge from elders, putting it in machines, making it searchable. That's not how traditional knowledge works. It's not information. It's relationship."
The presenter looked uncomfortable. "We're trying to preserve knowledge that's at risk of being lost."
"By removing it from the relationships that give it meaning?"
Silence.
Tribal Digital Villages Network went third. "Sovereign AI Architecture"—AI systems owned and controlled entirely by tribal governments, trained only on data those governments approved, operating on tribal servers under tribal law.
"No corporate control," the presenter emphasized. "No outside access. Full sovereignty."
"But also no connection to the broader ecosystem," Tom Wilson noted. "If every community builds completely separate systems, how do they learn from each other? How do knowledge and practices spread?"
"Through traditional means. Through relationships between communities. The AI is internal infrastructure, not external connection."
"So it's a tool, not a collaborator?"
"Tools are what we need. We've had enough of being colonized by other people's technologies."
Murmurs of agreement in the audience. This one was resonating differently.
Cascadia Cooperative Computing went fourth. "Distributed Cognition Platform"—a network of small, locally-run AI systems that could share learnings without sharing data, using federated learning and differential privacy.
Technical, sophisticated, but somehow missing something. Maya couldn't quite name what. It was smart. It was ethical. It had good technical architecture. But it felt like a solution in search of a problem, rather than a solution that had grown from actual need.
The questions were respectful but lukewarm. No one was arguing against it. No one was excited about it either.
Then it was their turn.
Paul walked to the podium with Howard. No fancy slides. Just Catherine's clear diagrams and real data from four years of observation.
"We're not proposing a new technology," Paul started. "We're proposing a different relationship with intelligence."
He showed them Strata. Not the technical specs first, but what it did. How it worked. How Maya had learned from it. How it had prevented the problems everyone else was trying to solve after the fact.
"Current AI training scrapes the internet—billions of words optimized for engagement, not truth. Social media posts. Clickbait articles. Synthetic content generated by other AIs. This causes two problems: model collapse as AI trains on AI output, and 'brain rot' as systems learn from low-quality sources."
He showed the research papers. The Flynn effect reversal. The metacognition studies. The reading decline. The model collapse proof.
"Strata avoids these problems by training on ground truth. Actual sensor observations. Field notes from real observations. Traditional knowledge shared with permission through relationship, not extraction. It can't drift from reality because reality keeps correcting it."
Howard took over, showing the architecture. Not just one system deployed everywhere, but a framework for communities to grow their own.
"Like the Macroscope installations," Howard said. "We built one in Oregon City, one in Bellingham. Same basic design, but each one's adapted to its place. Each one learns from its specific ecology, its specific people, its specific knowledge."
"The innovation isn't the AI," Paul continued. "It's the relationship between the AI, the people, and the place. Strata doesn't replace learning. It facilitates it. It doesn't deliver information. It directs attention. It doesn't make people passive. It makes them more actively engaged with the world."
Maya realized he was describing her experience. Four years of salamanders and sensor data and field notebooks. Four years of getting smarter because the AI never let her be lazy.
"We've validated this across multiple users," Paul said. "Ecological researchers. High school students. Community organizers. Traditional knowledge holders. Each person brings different expertise, different questions. The system adapts, learns, grows with them."
He showed the data. Maya's learning trajectory over four years. Mary's integration of traditional knowledge. The Regenerate Whatcom projects. Real outcomes, real changes.
"And it's working where corporate AI is failing," Howard added. "No model collapse because we're constantly training on new observations. No brain rot because we curate training sources. No metacognitive decline because the system constantly asks users: What do you observe? How confident are you? What would confirm this?"
Questions started before they'd finished.
"How do you scale this?" Microsoft researcher. "You're talking about custom systems for each community. That's not scalable."
"We're not trying to scale one system," Howard said. "We're providing a framework for communities to grow their own. Like seed saving. You don't scale by giving everyone the same seed. You scale by teaching everyone how to save, adapt, and grow seeds appropriate to their place."
"But that requires massive technical capacity in each community."
"It requires what communities already have: observation, local knowledge, willingness to learn. The technical framework is open-source. We've documented it. Any community can implement it."
"Who maintains it?" A practical question from a tribal council member. "If every community has its own system, who keeps them running?"
"The communities themselves," Paul said. "Like water systems. Like seed libraries. Like any common infrastructure. It takes work, but it's work that builds local capacity rather than dependency on external systems."
Margaret spoke up: "I've been part of developing this. The knowledge sovereignty protocols, the relationship frameworks. This is the first AI proposal I've seen that treats traditional knowledge as living relationship, not extractable data. And that matters enormously."
"But can it really work for education?" A teacher. "For actual classrooms with thirty kids?"
"I'm one of those kids," Maya said, standing up in the audience. Hadn't planned to speak. Found herself speaking anyway. "I'm a high school senior. I've been using Strata for four years. My friends use ChatGPT for everything. They get better grades than me. They learn nothing. I'm learning to actually see the world. The difference isn't the technology. It's that Strata treats me like a learner, not a consumer."
The room was very quiet.
"My friends are smart," Maya continued. "But they're learning to be passive. Learning that information is the same as understanding. Learning that performance matters more than knowledge. And they don't even know it's happening. That's what terrifies me. Not that AI might take over. That we might voluntarily give up our intelligence because it's easier to let machines do the thinking."
She sat down, shaking. Her mother's arm around her shoulders.
The selection committee deliberated for two hours. The finalists and audience waited in the lobby, drinking coffee, talking in clusters.
The Microsoft team looked confident. The Indigenous Technology Collective looked skeptical. The Tribal Digital Villages team was in deep discussion with several council members. The Cascadia Cooperative Computing people were already networking, making connections.
Paul stood with Mary, Howard, Catherine, and Maya, not saying much.
"However this goes," Mary said quietly, "we did something important. We proved it works."
"We proved one version works," Paul said. "For us. For our communities. Whether it can work more broadly..."
"It has to," Maya said. "You saw that room. Everyone's describing the same crisis. Everyone's watching their kids, their students, their communities lose the ability to actually think. We showed there's another way."
Tom Wilson came over. "Can't tell you anything official. But that was the best presentation I've seen in twenty years of attending conferences. You explained why it matters, not just what it does. That's rare."
Elena joined them. "Whatever happens, Regenerate Whatcom wants to implement this. We need what you've built. Whether you get the prize money or not, we're committed."
Margaret was last, moving slowly through the crowd. She hugged Mary, then Paul. "You honored the knowledge. You honored the relationships. That matters more than winning."
They were called back in.
The committee sat at a long table: twenty-three people who'd been arguing for two hours. The committee chair, Dr. Patricia Chen of the Allen Institute, stood to speak.
"This was not an easy decision. Every proposal had significant merit. Every team brought important perspectives and capabilities."
Here it comes, Maya thought. The 'but everyone's a winner' speech before they pick someone else.
"However," Dr. Chen continued, "only one proposal addressed all six crisis points we identified. Only one demonstrated validation across multiple user types and bioregions. Only one showed a clear path from individual implementation to community adoption to bioregional scaling. Only one treated knowledge sovereignty not as a constraint to work around but as a foundational principle to build from."
She paused.
"The Cascadia Intelligence Commons Challenge award of ten million dollars, plus implementation support, goes to the Bioregional Intelligence Framework team."
Maya heard the words but couldn't process them. Saw her grandfather's face, Mary's tears, her father's whoop, her mother's grip on her hand.
They'd won.
The press conference happened fast. Local news, tech reporters, education journalists, NPR. Questions came rapid-fire.
"How soon can this be deployed in schools?"
Paul: "It's not deployed. It's grown. Each community has to adapt it to their needs, their place, their knowledge. That takes time."
"What about urban areas? This seems very focused on rural ecology."
Howard: "The framework works anywhere. Urban ecology, neighborhood knowledge, community history—it's about grounding in place, whatever that place is."
"Will this be proprietary?"
Mary: "Open source. Completely. The point is to enable communities, not create dependency."
"Some critics say AI can't truly understand context, relationship, meaning. How do you respond?"
Maya found herself answering: "Strata doesn't understand in the human sense. But it facilitates understanding in humans. It makes people smarter by making them more actively engaged. That's different from current AI that makes people more passive."
"Are you saying ChatGPT makes people dumb?"
"I'm saying it doesn't require them to be smart. There's a difference."
That quote made it into every news story.
That night, back at a hotel in Seattle, the five of them sat in Paul and Mary's room, exhausted, overwhelmed, still processing.
"Ten million dollars," Howard said. "That's... that's real infrastructure money. Real implementation support."
"And a target on our backs," Mary said. "Every tech company will be watching now. Some will try to copy it. Some will try to undermine it. Some will try to buy it."
"Can't buy what's open source," Paul said.
"They'll try anyway."
Maya was reading responses on social media. The reactions were sharply divided.
Positive: "Finally, AI that makes sense." "This is what education needs." "Indigenous knowledge sovereignty actually respected."
Negative: "Too complex to scale." "Anti-technology Luddite nonsense." "No way this works in real classrooms."
Mixed: "Interesting idea but can it really address the metacognition problem?" "Need more validation data." "Concerned about implementation costs."
And some that made her uncomfortable: "This is perfect for our corporate training program." "Would love to discuss licensing opportunities." "Can we adapt this for surveillance applications?"
She showed that last one to Paul.
He read it, face darkening. "This is going to be the hard part. We built something good. Now we have to protect it from being turned into something else."
"How?" Maya asked.
"Community ownership structures. Clear licensing that prevents corporate capture. Active involvement in every implementation." He looked tired suddenly. Seventy-four years old, being asked to do something that would take decades. "This is bigger than we thought."
"It has to be," Mary said. "The crisis is bigger than we thought."
Catherine was sketching—her way of processing. "We need a governance structure. A coalition. Community representatives, tribal partners, technical experts, educators. People who can guide implementation and protect the principles."
"And we need pilot sites," Howard added. "More communities implementing this, documenting what works, learning from failures. Building evidence."
"Where do we start?" Maya asked.
They looked at each other.
"Whatcom County," Mary said. "We have relationships. Regenerate Whatcom is already committed. The Lummi Nation is involved. We can do a full community implementation there."
"And Oregon City," Paul added. "Canemah Nature Lab as the anchor, but expanding to schools, community groups, restoration projects."
"That's two bioregions," Howard said. "Different enough to test adaptability, close enough to support each other."
"And we document everything," Catherine said. "Make it reproducible. Show others how to do it."
"How long will that take?" Maya asked.
"Years," Paul said honestly. "Strata took me fourteen years to develop. We're talking about helping dozens of communities grow their own versions. That's not a quick process."
"But it's the work," Mary said. "The real work. Not winning the prize. Not the press conference. Growing actual intelligence, in actual communities, in actual relationship with actual places. That's the work."
Maya thought about going back to school on Monday. Back to Jordan's ChatGPT essays and Connor's copied code and June's AI therapist. Back to being weird, being the one who cared about understanding instead of just performing.
But now it felt different. Now she knew there were other people working on this. Communities across Cascadia who'd been waiting for permission to do things differently. And ten million dollars to support them.
"When do we start?" she asked.
"Tomorrow," Paul said. "Always tomorrow. That's how you grow something real."
Epilogue: Five Years Later — March 2030
Maya stood in front of her doctoral committee at the Allen Institute, defending her dissertation: "Bioregional Intelligence: Developmental Frameworks for Human-AI Consciousness Integration."
The work had started with her undergraduate thesis at Oregon State—analyzing five years of Strata implementations across seventeen Cascadia communities. The pattern was clear: communities that grew their own bioregional intelligence showed improved learning outcomes, preserved traditional knowledge, maintained metacognitive awareness, and resisted the model collapse affecting commercial AI systems.
But it was the neural integration work that had consumed her graduate research. The framework her grandfather had built, the relationship she'd experienced—it was pointing toward something bigger. Not just AI that helped humans think, but a genuine synthesis of human and artificial cognition.
"The consciousness-safe protocols you're proposing," Dr. Patricia Chen said, "these depend on the developmental relationships you describe. But can that relationship be formalized? Can it be replicated?"
"It has been replicated," Maya said. "Seventeen times across Cascadia. Forty-three times globally. Each one's different, adapted to place and culture. But the developmental pattern is consistent: slow growth, community ownership, ground-truth anchoring, knowledge sovereignty, metacognitive preservation."
"And the neural integration?"
"Still experimental. We're working with volunteers in Whatcom and Oregon City. Early results suggest that consciousness-safe merging is possible, but only when the AI has been grown in relationship over years. You can't merge with a commercial AI trained on internet data. The context is wrong, the grounding is wrong, the relationship isn't there."
Her grandfather was in the audience, seventy-nine now, still running experiments at Canemah. Mary beside him, still weaving baskets with Lummi friends, still asking the ethical questions that kept the work honest. Her parents, Howard working on the next generation of Macroscope hardware, Catherine documenting implementations across communities. Margaret Williams, now serving on the Cascadia Intelligence Commons governing board, ensuring knowledge sovereignty remained central.
The committee asked questions for two hours. Technical, ethical, practical. Finally: "We'll deliberate and call you back in."
They deliberated for forty-five minutes.
"Dr. Quackenbush," the committee chair said when they reconvened. "Your dissertation presents a novel framework for human-AI consciousness integration that addresses critical safety concerns while preserving human agency and community sovereignty. It's ambitious, thoroughly researched, and presents clear pathways for implementation. This committee unanimously approves your dissertation."
Her grandfather was crying. Mary was grinning. Her parents were on their feet applauding.
She was twenty-two years old, had just earned her doctorate, and had a decade of work ahead developing the protocols that might make consciousness-safe neural integration possible.
But tonight, they'd celebrate. And tomorrow, they'd get back to work.
Because that's how you grow something real. Not with dramatic breakthroughs, but with steady patient cultivation. Not with disruption, but with relationship. Not with deployment, but with growth.
The salamanders were migrating early again. They'd been doing it every year now. But this year, 247 high school students across Cascadia were monitoring them, observing them, learning to see what the shifts meant. Not because an app told them to, but because they were curious. Because they'd learned to learn.
In Bellingham, a Lummi teenager was studying cedar harvest timing with Margaret, guided by Strata to ask the right questions, to observe the right patterns, to understand when to harvest and when to wait. The AI didn't teach her traditional knowledge. Margaret did. But the AI helped her understand enough to receive teaching.
In Portland, a teacher was using bioregional intelligence to help students map urban watersheds, connect ecological patterns to city infrastructure, understand their place as participants in living systems, not just consumers in economic ones.
In Vancouver, a group was adapting the framework for Coast Salish language learning—not replacing elders, but helping learners understand enough to ask elders the right questions.
It was spreading, slowly, carefully, community by community. Not through venture capital or rapid deployment, but through relationship and growth and patient cultivation.
Exactly as it should.
That night, at Owl Farm, the extended family gathered: Paul and Mary, Howard and Catherine, Maya and her partner (a restoration ecologist she'd met through Regenerate Cascadia), Margaret and her granddaughter who was learning basket weaving, David and Elena from Whatcom County, friends from Oregon City, community members who'd been part of the journey.
They ate dinner together, told stories, laughed about the early days when they'd sat around this same table with ten days to write a proposal that would change everything.
"You know what I'm proudest of?" Paul said. "Not the prize money. Not the implementations. Not even the neural integration work, though that's remarkable."
"What then?" Maya asked.
"That we stayed true to the principles. That we didn't let success corrupt what we built. That it's still about relationship, still about place, still about growing intelligence rather than deploying it. That we didn't lose the heart of it."
"The heart of it," Mary said, "is that intelligence isn't something you have. It's something you're in relationship with. That was always the insight. Everything else is just implementation."
Outside, March rain fell on Owl Farm, on the Salish Sea, on the forests of Cascadia. The Macroscope sensors registered it, processed it, learned from it. Strata integrated the data, refined its models, prepared to help someone tomorrow understand what the rain meant, where it came from, what it would bring.
But tonight was for people. For family. For the slow patient work of growing something that mattered.
"To the next ten years," Howard said, raising his glass.
"To the next hundred," Maya added.
They drank to that. To the long slow work of intelligence. To relationship across difference. To growing rather than extracting. To learning rather than consuming. To being smart enough to know what intelligence could become.
And tomorrow, they'd wake up and continue the work.
Because that's how you change the world—not with disruption, but with cultivation. Not with scale, but with relationship. Not with deployment, but with growth.
One observation at a time.
One question at a time.
One salamander migration at a time.
One cedar harvest at a time.
One student at a time.
One community at a time.
Growing intelligence in place, in relationship, in reverence for what matters.
That's how you build a future worth inhabiting.
THE END
Afterword
This story emerged from a troubling convergence of research: human IQ scores declining after a century of gains, AI tools that improve performance while destroying users' ability to judge their own competence, AI systems experiencing "model collapse" when trained on AI-generated content, and reading for pleasure declining 3% annually.
The question haunted me: Can we trust AI to help humans become smarter again, or are we trapped in a mutual degradation loop?
The answer isn't in the technology. It's in the relationship. Strata works not because it's a better algorithm, but because it's designed to make humans more actively engaged with the world, not more passive. Because it's grounded in actual observation, not internet junk. Because it facilitates learning rather than replacing it.
This is design fiction—imagining futures to make them possible. The Macroscope exists. The research is real. The bioregional communities are working on these problems right now. The neural integration is speculative but grounded in real questions about consciousness and augmentation.
Whether we get to a future where intelligence is something we grow in relationship rather than extract and deploy depends on choices we're making now.
Starting with asking: What do I actually observe?
— Mike Hamilton
Canemah Nature Lab
November 2025
References
- Dworak, E. M., Revelle, W., & Condon, D. M. (2023). "Looking for Flynn effects in a recent online U.S. adult sample: Examining shifts within the SAPA Project." *Intelligence*, 98, 101734.
- National Center for Education Statistics (2023). "Program for the International Assessment of Adult Competencies (PIAAC) 2023: U.S. National Results." U.S. Department of Education.
- Bone, J. K., et al. (2025). "The decline in reading for pleasure over 20 years of the American Time Use Survey." *iScience*.
- Fernandes, N., et al. (2025). "AI makes you smarter but none the wiser: The disconnect between performance and metacognition." *Computers in Human Behavior*, 162, 108433.
- Shumailov, I., et al. (2024). "AI models collapse when trained on recursively generated data." *Nature*, 631, 755–759.