MetaModels
Economic agents need data and hyper context.
"The ultimate UX pattern for autoFi is chat β full stop" - h
MetaModels: Adaptive Model Architecture for Autonomous Economic Agents
MetaModels provide a dynamic, context-aware operational framework that enables agents to adapt their capabilities, knowledge base, culture, and behavioral patterns based on specific use cases and environmental conditions.
The strategy enables truly autonomous, incorruptible economic agents capable of governing complex autoFi workflows.
Enter GEOFF:
We created MetaModels to power GEOFF. The fundamental principle behind GEOFF is algorithmic incorruptibility in the context of autoFi. By operating through predetermined MetaModels rather than real-time human input, GEOFF cannot be bribed, manipulated, or influenced by external actors seeking to game the system.
GEOFF dynamically adapts its operational contexts through MetaModels.
The MetaModels enable extensive context engineering. Adding environmental context, such as the current page, improves MetaModel usage by 324%.
Originally we were using AI simply for response entropy for Mining Rig Insights. Our thought was simple: wrap the insights into a remix prompt so that the message was already fun and fresh as well as consistent with the data.
<MakeItFun>
  <RealData />
</MakeItFun>
And then we started exploring ways to extend that. One of the main challenges we ran into was context poisoning, where negative output surfaced due to oversized context. To solve this, we broke the MetaModels down into atomic parts and tapped into a router in instances where data could be easily enriched, which improved reasoning dramatically.
It reads raw objects and wraps them into a text-based request template:
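As a rough illustration of that wrapping step (the object shape and helper name below are assumptions, not the production code):

```typescript
// Illustrative only: wrap a raw data object into a text-based request template.
interface RawUserData {
  handle: string;      // e.g. "@userX"
  page: string;        // current page the user is on
  daysActive: number;
  swaps: number;
  isPro: boolean;
}

function toTransformedData(d: RawUserData): string {
  // Each fact becomes a plain-English line the model can reason over directly.
  return [
    "<transformedData>",
    `${d.handle} visited the ${d.page} page`,
    `${d.handle} has been active ${d.daysActive} days`,
    `${d.handle} has ${d.swaps} swaps`,
    `${d.handle} is ${d.isPro ? "" : "not "}Pro`,
    "</transformedData>",
  ].join("\n");
}
```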
This simple pattern taught us that data is king in context. User requests provide multiple context layers (a rough typing is sketched after this list):
- Economic context (financial position, transaction history)
- Participation context (community engagement, reputation)
- Technological context (tools available, platform capabilities)
- Environmental context (current page, session state)
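Roughly typed, those layers might look like this (field names are illustrative assumptions):

```typescript
// Illustrative typing of the four context layers; field names are assumptions.
interface RequestContext {
  economic: { balanceUsd: number; recentTxCount: number };         // financial position, transaction history
  participation: { engagementScore: number; reputation: number };  // community engagement, reputation
  technological: { availableTools: string[]; platform: string };   // tools available, platform capabilities
  environmental: { currentPage: string; sessionState: string };    // current page, session state
}
```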
While some data is resolute (dollars rewarded, active users, metric summaries), other data is surfaced through periodic snapshots taken by other models. For community memories, GEOFF communicates with Grok4, periodically asks about users, profiles, sentiment, etc., and saves the results into a memory for later use: <communityMemories>.
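A minimal sketch of that snapshot loop, assuming a generic askModel stand-in for the Grok4 call and an in-memory store rather than the real persistence layer:

```typescript
// Sketch of the periodic community-memory snapshot; askModel and MemoryStore are
// hypothetical stand-ins for the real Grok4 call and persistence layer.
type MemoryStore = Map<string, { summary: string; capturedAt: number }>;

async function snapshotCommunityMemories(
  askModel: (prompt: string) => Promise<string>,
  handles: string[],
  store: MemoryStore
): Promise<void> {
  for (const handle of handles) {
    // Ask the external model for a compact profile / sentiment summary...
    const summary = await askModel(
      `Summarize recent activity, profile, and sentiment for ${handle} in 2-3 sentences.`
    );
    // ...and save it under <communityMemories> for later prompt assembly.
    store.set(handle, { summary, capturedAt: Date.now() });
  }
}
```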
MetaModels serve as comprehensive operational blueprints that define every aspect of GEOFF's behavior in specific contexts. Each MetaModel encapsulates:
- Guardrails: Ethical, Economical and operational boundaries
- Tools: Available functions and capabilities
- Context: Domain-specific knowledge and understanding
- Personality: Cultural and behavioral patterns
MetaModel Architecture
Each MetaModel consists of five primary components (a rough shape is sketched after the list):
- Document Repository: PDF, MD, TXT, DOC files processed through advanced RAG. Converted to embeddings that give each MetaModel guard rails both conversationally and functionally, and include configurations for additional data requests.
- URL Sources: Cached web content for continuous knowledge updates
- FAQ Resources: Structured question-answer pairs for common queries
- Culture Resources: Personality and behavioral guidelines
- Tools: Very specific function calls for user data, network data, onchain data (example: MetaVaults), and execution.
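A rough shape for a MetaModel, with illustrative field names rather than the real schema:

```typescript
// Rough shape of a MetaModel; field names are illustrative, not the real schema.
interface MetaModel {
  id: string;
  documents: { path: string; embeddingIds: string[] }[];  // PDF/MD/TXT/DOC processed through RAG
  urlSources: { url: string; cachedAt: number }[];        // cached web content
  faq: { question: string; answer: string }[];            // structured question-answer pairs
  culture: string[];                                      // personality and behavioral guidelines
  tools: { name: string; description: string }[];         // user, network, onchain, and execution calls
}
```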
Breaking MetaModels into atomic parts and using routers helps overcome today's LLM limitations.
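A naive router sketch makes the idea concrete; real routing could use embeddings or a classifier, but keyword overlap is enough to show the selection step:

```typescript
// Naive router sketch: score each atomic MetaModel against the request and pick the
// best match. Real routing could use embeddings or a classifier; keyword overlap
// keeps the selection step visible.
function pickMetaModel(
  request: string,
  models: { id: string; keywords: string[] }[]
): { id: string; keywords: string[] } {
  const words = new Set(request.toLowerCase().split(/\s+/));
  let best = models[0];
  let bestScore = -1;
  for (const m of models) {
    const score = m.keywords.filter((k) => words.has(k.toLowerCase())).length;
    if (score > bestScore) {
      best = m;
      bestScore = score;
    }
  }
  return best;
}
```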
Putting it all together, an incoming request flows through the following steps (a code sketch follows the list):
🖥️ Page, User, Existing Context, Existing Conversation
🗄️ Get User Data: { swaps: 45, isPro: false, twitter: @userX || user }
🗄️ Get Network Data
🔄 Transform Data
🧠 Get Historical Memory
🧬 Route context to matching MetaModel
🤔 Think
🎯 Find objective
⚡ Do (If applicable)
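One way that flow might read in code; every dependency here is a placeholder passed in, not GEOFF's real implementation:

```typescript
// End-to-end sketch of handling an incoming request; every dependency is a
// placeholder passed in, not GEOFF's real implementation.
interface Deps {
  getUserData: (user: string) => Promise<Record<string, unknown>>;
  getNetworkData: () => Promise<Record<string, unknown>>;
  getHistoricalMemory: (user: string) => Promise<string>;
  routeToMetaModel: (message: string) => Promise<{ buildPrompt: (facts: string) => string }>;
  think: (prompt: string) => Promise<{ objective: string; act?: () => Promise<void> }>;
}

async function handleRequest(
  req: { user: string; page: string; message: string },
  deps: Deps
): Promise<string> {
  const userData = await deps.getUserData(req.user);                        // get user data
  const networkData = await deps.getNetworkData();                          // get network data
  const facts = JSON.stringify({ page: req.page, userData, networkData });  // transform data
  const memory = await deps.getHistoricalMemory(req.user);                  // get historical memory
  const metaModel = await deps.routeToMetaModel(req.message);               // route to matching MetaModel
  const result = await deps.think(metaModel.buildPrompt(`${facts}\n${memory}`)); // think + find objective
  if (result.act) await result.act();                                       // do, if applicable
  return result.objective;
}
```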
An inbound request is formatted for the SuperPrompt:
<transformedData>
@userX visited the Bubbles page
@userX account is 492hfksdg2938gvdfngdkfgfdg
@userX has been active 296 days
@userX has 45 swaps
@userX is not Pro
</transformedData>
Stats are included to enrich context responses:
<statCache>
average user has 156 days active
3401 users out of 30525 are PRO users
the top user has 521,608 swaps
there are a total of 38,922,038 swaps
pond has given away $42,593,992
users with π have 2.31x better rewards than without
</statCache>
Once you have user data, stats, and request data, you can use the MetaModel components to wrap everything into a SuperPrompt (a sketch of the assembly follows the layout below):
<RAG>
  <lore>
    <communityMemories>
      <personality>
        <culture>
          <tools>
            <objectives>
              <transformedData />
              <statCache />
            </objectives>
          </tools>
        </culture>
      </personality>
    </communityMemories>
  </lore>
</RAG>
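A sketch of how that assembly could work, assuming each component has already been rendered to text (the function and parameter names are illustrative):

```typescript
// Sketch of SuperPrompt assembly: each MetaModel component wraps the next,
// with the transformed data and stat cache nested in the innermost layer.
function buildSuperPrompt(
  sections: Record<string, string>,
  transformedData: string,
  statCache: string
): string {
  const order = ["RAG", "lore", "communityMemories", "personality", "culture", "tools", "objectives"];
  const opening = order.map((tag) => `<${tag}>\n${sections[tag] ?? ""}`).join("\n");
  const closing = [...order].reverse().map((tag) => `</${tag}>`).join("\n");
  return `${opening}\n${transformedData}\n${statCache}\n${closing}`;
}
```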
Key Improvements:
- Context Poisoning Elimination: Breaking MetaModels into atomic components prevented negative outputs from oversized context
- Enhanced Reasoning: Router-based data enrichment dramatically improved decision-making quality
- Scalable Architecture: Modular design allows for easy expansion and customization
Now simple intent is enriched into SuperPrompts that not only improve technical performance but also create more equitable, efficient, and sustainable economic agents that serve a collective good while maintaining individual agency.
This architecture represents a new paradigm for autonomous economic agents. By combining algorithmic incorruptibility with adaptive intelligence, we're building systems that can be trusted with real economic decisions at scale.
Thanks to everyone building and contributing to GEOFF:
@pauly0x @eternitybro @phunk2243 @soupydev @shannonNullCode @jakegallen_
And my little guy who set up the model router! (He takes after me, I'm the big 🍎 tree, and you know the 🍎 don't fall too far from it - ngl proud pappy moment, on G!)