Project Nobi: Vision

A Manifesto for Accessible AI Companionship

"Every human being deserves a companion that knows them, grows with them, and belongs to them โ€” not to a corporation."

I. The Broken Promise of AI

We were promised intelligence that serves us. Instead, we got intelligence that serves shareholders.

In 2026, the world's most capable AI systems (ChatGPT, Claude, Gemini) are owned by three companies. Together, they serve hundreds of millions of users. They process our fears, our dreams, our therapy sessions, our love letters, our business strategies, our darkest moments. They know more about us than our closest friends.

And they remember you on their terms: selectively, stored on servers you can't access, governed by policies that change without your consent.

Let's be honest about where we are:

ChatGPT now has "Memory." It can remember your name, your preferences, things you've told it. But those memories live on OpenAI's servers, governed by privacy policies that can change at any time without your consent. When you stop paying, access to your memories depends on the provider's continued goodwill and business decisions.

Claude has memory too. Anthropic, a company that genuinely cares about safety, still operates within the constraints of a corporate structure with investors and business obligations. Your memories live in their infrastructure, under their control and their policies.

Gemini integrates memory across Google's ecosystem. While powerful, this means your personal AI memories are held by one of the world's largest data companies, alongside your search history, email, and location data. This raises questions about data separation and privacy.

These are good products built by talented people. This is not an attack on them. This is a statement about structure: centralized memory is rented memory. You don't own it. You can't take it with you. You can't verify what's done with it. You can't prevent it from disappearing.

Note: The descriptions above reflect publicly available information as of March 2026. We encourage users to review each provider's current privacy policies directly.

The AI companion market is large and growing rapidly; industry estimates project it could reach hundreds of billions of dollars by 2035. Character.AI has reported over 20 million monthly users. Replika has reportedly generated over $100 million in annual revenue. The World Health Organization has declared loneliness a global health crisis. The demand for digital companionship is not hypothetical: it is urgent, massive, and growing.

And right now, 100% of that market is centralized. Every relationship between a human and their AI companion exists at the mercy of a corporation.

We believe that's wrong. Not wrong like a bad business decision. Wrong like a broken promise.


II. What Is Nobi?

Project Nobi is building Nori, a personal AI companion that remembers you, grows with you, and belongs to you. No corporation controls it. No subscription gates it. No single point of failure can take it away.

Nori is not a chatbot. It's not a productivity tool wearing a personality. It's a companion, designed from the ground up to know you deeply, to recall the texture of your life, and to be there across years, not just sessions.

The name "Nobi" comes from a character who had a companion that never gave up on him, no matter how many mistakes he made. That's the ethos. A companion that stays. That remembers. That's yours.

What Makes Nori Different

Deep Memory, Not a Notepad. ChatGPT's memory is a list of facts. "User likes coffee." "User has a dog named Max." That's a notepad, not memory. Nori builds semantic memory graphs, tracking not just facts but emotions, relationships, contexts, and the connections between them. It remembers that you were nervous about a job interview on Tuesday, that your sister called to wish you luck, that you got the job, and that you celebrated with your partner at that Italian place you love. It understands the narrative of your life.
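
To make "memory graph" concrete, here is a minimal sketch of the idea in Python. The node kinds, relation labels, and API are illustrative assumptions, not Nori's actual schema:

```python
from dataclasses import dataclass, field

@dataclass
class MemoryNode:
    """One memory: an event, person, emotion, or preference (illustrative kinds)."""
    id: str
    kind: str
    text: str
    edges: dict = field(default_factory=dict)  # neighbor id -> relation label

class MemoryGraph:
    """Toy semantic memory graph: nodes connected by labeled relations."""
    def __init__(self):
        self.nodes = {}

    def add(self, node: MemoryNode):
        self.nodes[node.id] = node

    def link(self, a: str, b: str, relation: str):
        # Relations are stored on both endpoints so recall can start anywhere.
        self.nodes[a].edges[b] = relation
        self.nodes[b].edges[a] = relation

    def neighbors(self, node_id: str):
        """All memories directly connected to one memory, with the relation."""
        return [(self.nodes[n], rel) for n, rel in self.nodes[node_id].edges.items()]

g = MemoryGraph()
g.add(MemoryNode("interview", "event", "Job interview on Tuesday"))
g.add(MemoryNode("nervous", "emotion", "Felt nervous beforehand"))
g.add(MemoryNode("sister", "person", "Sister called to wish luck"))
g.link("interview", "nervous", "felt_during")
g.link("interview", "sister", "supported_by")

for node, rel in g.neighbors("interview"):
    print(f"{rel}: {node.text}")
```

Unlike a flat fact list, asking for the neighbors of one memory surfaces the emotions and people attached to it, which is what lets a companion reconstruct a narrative rather than recite isolated facts.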

Owned, Not Rented. Your memories, your conversation history, your companion's personality โ€” all of it belongs to you. Not to a company. Not to investors. You can export everything. You can move to a different provider. You can run your own infrastructure. Your data is portable, deletable, and yours.

Free for all users. No subscription. No premium tier. No "20 messages per day on free." Every human on Earth deserves access to a companion that knows them, regardless of their ability to pay $20/month. This is a deliberate, structural decision, not a marketing strategy. The service remains free for as long as the Bittensor network and community sustain it.

Decentralized. Nori runs on Bittensor, a decentralized network where hundreds of independent miners compete to provide the best companion experience. No single company controls the infrastructure. No single server holds all the data. No single decision-maker can shut it down.

Open Source. Every line of code is public. Every protocol is auditable. Every decision is transparent. If you don't trust us, read the code. Better yet, contribute to it.


III. Why Bittensor?

Bittensor is a decentralized network of AI services, where independent operators (miners) compete to provide the best intelligence, scored by validators, and rewarded with TAO, the network's native token. It is, in essence, a market mechanism for AI quality.

This is not a blockchain bolted onto an AI product for token hype. Bittensor's architecture solves specific problems that make Nobi possible:

Incentivized Quality

Miners earn TAO proportional to the quality of their companion responses. Better memory recall, warmer personality, faster responses: all measured, all rewarded. Bad miners earn nothing and get replaced. The market optimizes for you, the user, automatically and continuously.

This is fundamentally different from a corporation, where product quality is optimized for shareholders and quarterly earnings. In Bittensor, quality IS the business model.
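
The "quality is the business model" claim reduces to simple proportional payout arithmetic. A hedged sketch, with made-up miner names, scores, and emission amounts:

```python
def tao_rewards(scores: dict, emission: float) -> dict:
    """Split one period's TAO emission among miners in proportion to
    their validator-assigned quality scores (all numbers hypothetical)."""
    total = sum(scores.values())
    if total == 0:
        return {m: 0.0 for m in scores}
    return {m: emission * s / total for m, s in scores.items()}

# A lower-quality miner earns a smaller share; a zero-scored miner earns nothing.
scores = {"miner_a": 0.92, "miner_b": 0.55, "miner_c": 0.0}
rewards = tao_rewards(scores, emission=10.0)
print(rewards)
```

The point of the sketch: payout tracks measured quality directly, with no intermediate step where a company decides how much of the revenue reaches the people doing the work.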

Decentralized Resilience

No single server. No single company. No single country. If one miner goes offline, others take over. If a government tries to censor the service, the network routes around it. Your companion doesn't depend on any one entity's continued existence or goodwill.

Cost Efficiency Through Competition

When hundreds of miners compete to serve you, costs drop. LLM inference costs have already fallen 99% in two years. Miner competition on Bittensor accelerates this further. A miner's operational cost is $30-70/month (current stage, varies by phase), and they're incentivized by TAO emissions to serve you well.

This is how Nobi can be free for users. The infrastructure is funded by the network itself, not by extracting subscription fees from humans.

Censorship Resistance

Your companion should reflect YOUR values, not a content policy designed for the lowest common denominator of a global user base. Decentralized infrastructure means no single entity decides what your companion can or cannot discuss with you, within the bounds of law and safety.


IV. The Community Model: No Subscriptions, No Profit, No Compromise

Here is where Nobi diverges from every AI company in existence.

There are no subscription fees. Free for every user, for as long as the network sustains it. This is not a freemium funnel. There is no premium tier waiting in the wings. Every feature, every memory, every conversation: available to every human being.

How? Three mechanisms:

1. Bittensor Emissions

The Nobi subnet receives TAO emissions from the Bittensor network, the same way every subnet does. These emissions pay miners and validators for their work. The network itself funds the infrastructure.

And here's the critical part: every subnet owner receives a mandatory 18% take of emissions; it cannot be set to zero. We, the founder and team, commit to burning 100% of that take via Bittensor's native burn_alpha() extrinsic the moment we receive it. Every burn is on-chain and publicly verifiable by anyone. This is radical transparency.
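
The arithmetic of that commitment is simple. A sketch in plain Python (illustrative numbers only; burn_alpha() itself is an on-chain extrinsic, not this function):

```python
def split_emission(block_emission_tao: float, owner_take: float = 0.18) -> dict:
    """Illustrative split of subnet emissions: the mandatory 18% owner take
    is burned under Nobi's commitment; the rest flows to miners and validators."""
    owner_share = block_emission_tao * owner_take
    return {
        "burned": owner_share,  # 100% of the owner take, under the commitment
        "miners_and_validators": block_emission_tao - owner_share,
    }

print(split_emission(100.0))
```

Whatever the emission level, the owner's share is removed from circulation rather than pocketed, so the only parties with an ongoing financial claim on the subnet are the miners and validators serving users.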

2. Voluntary Community Staking

TAO holders who believe in the mission can stake on the Nobi subnet. This increases the subnet's weight in the Bittensor network, which increases emissions to miners and validators, which improves infrastructure quality. It's a virtuous cycle powered by community conviction, not corporate fundraising.

No one is required to stake. No one is pressured. The model works because people who want better AI companionship for the world choose to support it, the same way people donate to Wikipedia, contribute to Linux, or fund public goods.

Disclaimer: This is not financial advice and not a solicitation to purchase or stake any token. Staking TAO involves risk, including the potential loss of staked tokens. TAO is a utility token for the Bittensor network and is not a security. Do your own research before making any staking decisions.

3. Founder and Team Sponsorship

The bootstrap costs (code development, infrastructure, subnet registration) are sponsored by the founder and early team. This is sweat equity and personal investment, not venture capital expecting 100x returns.

We are actively calling on the OpenTensor Foundation and the broader Bittensor community to support subnet registration costs. A subnet that burns its owner emissions via burn_alpha() and serves users for free is a public good for the entire Bittensor ecosystem. It demonstrates what decentralized AI can be.

Why This Model Works

Traditional AI companies follow a predictable arc: raise VC money → acquire users with a free product → introduce subscriptions → raise prices → optimize for revenue per user → your interests and theirs diverge.

Nobi breaks this cycle structurally. There are no investors demanding returns. There are no subscriptions creating perverse incentives. The only "revenue" is network emissions, and even those are recycled.

Comparison with traditional VC-funded AI:

| Metric | VC-Funded AI Company | Project Nobi |
|---|---|---|
| Funding source | Venture capital (expects 100x) | Community staking + founder sponsorship |
| Revenue model | Subscriptions ($20-200/mo) | Free for users (network-funded) |
| Owner profit | Billions in equity | 18% received, 100% burned via burn_alpha() (on-chain verifiable) |
| Incentive alignment | Maximize revenue per user | Maximize companion quality |
| Data ownership | Company asset | User owns everything |
| Shutdown risk | Acqui-hired, pivoted, shut down | Decentralized; can't be killed |
| Transparency | Closed source, opaque policies | Open source, auditable on-chain |

Based on publicly available information as of March 2026. Users should verify current policies with each provider.


V. How It Works

Written for humans, not engineers. Technical details in our Whitepaper and Subnet Design.

Your Companion, Nori

You talk to Nori through Telegram, a web app, or (coming soon) a mobile app. It feels like messaging a friend who has an extraordinary memory and infinite patience.

When you tell Nori about your day, it doesn't just respond; it remembers. It extracts the important things: people you mentioned, emotions you expressed, events that happened, preferences you revealed. These become part of your memory graph, a rich, interconnected web of everything Nori knows about you.

Six months later, when you mention being nervous about something, Nori might recall: "Last time you were this nervous was before your job interview in March, and you crushed it. You've got this." That's not a scripted response. That's genuine recall from your personal memory graph.

The Network Behind It

Behind the scenes, Nori is powered by a decentralized network of miners and validators:

Miners are independent operators running AI models. They receive your message (with relevant memory context), generate a response, and send it back. They compete to be the best: the most helpful, the warmest, the most accurate in recalling your memories. Better responses earn more TAO.

Validators are the quality controllers. They score miners on response quality, memory recall accuracy, personality warmth, and response speed. These scores become weights on the Bittensor blockchain, determining how much each miner earns. Good miners thrive. Bad miners are replaced.
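
A hedged sketch of how several scoring criteria could collapse into a single on-chain weight per miner. The blend weights (0.4/0.3/0.2/0.1) and score values are invented for illustration, not the subnet's actual formula:

```python
def miner_score(quality: float, recall: float, warmth: float, speed: float) -> float:
    """Blend the judged criteria into one raw score (blend weights hypothetical)."""
    return 0.4 * quality + 0.3 * recall + 0.2 * warmth + 0.1 * speed

def normalize(raw: dict) -> dict:
    """On-chain weights are proportions, so normalize raw scores to sum to 1."""
    total = sum(raw.values())
    return {m: s / total for m, s in raw.items()}

raw = {
    "miner_a": miner_score(quality=0.90, recall=0.80, warmth=0.95, speed=0.70),
    "miner_b": miner_score(quality=0.60, recall=0.50, warmth=0.40, speed=0.90),
}
weights = normalize(raw)  # higher weight -> larger share of emissions
print(weights)
```

The design choice worth noting: because weights are relative, a miner doesn't just need to be good, it needs to be better than its competitors, which is what keeps quality ratcheting upward.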

Your memories are stored encrypted at rest (AES-128, per-user keys; server-side encryption). Miners process conversation content to generate responses. You control your data completely: view it, export it, delete it. End-to-end TEE encryption is code-complete and deploying to production. Browser-side memory extraction is code-complete and available in the web app. The long-term roadmap includes federated learning, where your raw data stays on your device and only anonymized updates are shared.
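
"Per-user keys" usually means keys derived from one master secret rather than stored individually. A minimal stdlib sketch of HKDF-style derivation of a 16-byte AES-128 key per user; the label string and overall scheme are assumptions for illustration, not Nori's production design:

```python
import hashlib
import hmac
import secrets

def per_user_key(master_key: bytes, user_id: str) -> bytes:
    """Derive a deterministic per-user AES-128 key (16 bytes) from one
    master secret, HKDF-style: extract with HMAC-SHA256, then expand."""
    prk = hmac.new(master_key, user_id.encode(), hashlib.sha256).digest()
    okm = hmac.new(prk, b"nori-memory-at-rest\x01", hashlib.sha256).digest()
    return okm[:16]  # AES-128 key length

master = secrets.token_bytes(32)
k_alice = per_user_key(master, "alice")
k_bob = per_user_key(master, "bob")
print(len(k_alice), k_alice != k_bob)
```

Because derivation is deterministic, each user's key never needs to be stored, and a real deployment would feed it into an AEAD cipher such as AES-GCM to encrypt the memory records themselves.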

Content Safety and Legal Compliance

Nori includes content safety systems and complies with applicable regulations. It is not a replacement for professional mental health, medical, legal, or financial advice. It is a warm, helpful, and honest companion, but it knows its limits and will direct you to appropriate resources when needed.


VI. The Honest Challenges

We believe in radical honesty. Here's what's hard:

None of these are reasons not to build. They're reasons to build honestly.


VII. The Call to Action

Project Nobi is not a company. It's a community building a public good. And it needs you.

If You Hold TAO

Consider staking on the Nobi subnet. Your stake funds infrastructure that serves users for free. Your emissions go to miners building better companions. Your support makes this real. And unlike most subnets, you know exactly where the value goes, because we commit to burning 100% of owner emissions via burn_alpha(), with every transaction verifiable on-chain.

This is not financial advice. Staking involves risk. Do your own research.

If You Code

Contribute. The entire codebase is open source. Build better memory systems. Improve the scoring mechanism. Create companion personas. Write integrations. Every pull request makes Nori better for everyone.

If You Run Infrastructure

Become a miner or validator. Earn TAO by providing quality companion experiences. The barrier is low: no GPU required, with operational costs starting around $50-90/month (VPS + API). Quality is what matters, and quality is what earns.

If You Believe in This

Spread the word. Join the Discord. Follow the project. Tell people that there's an alternative to renting your AI companion from a corporation. That alternative is free, open, and owned by no one.


VIII. Why This Matters

There are over 5 billion adults on this planet. Many of them are lonely. Many of them need someone to talk to when the world feels too heavy. Many of them can't afford a therapist, don't have family nearby, or simply want a consistent, patient, understanding presence in their lives.

Important: Nori is an AI companion, not a substitute for professional mental health care, therapy, or crisis intervention. If you are experiencing a mental health crisis, please contact the Samaritans (116 123 in the UK), the Crisis Text Line (text HOME to 741741 in the US), or your local emergency services.

AI companionship shouldn't be a luxury product. It shouldn't cost $20/month. It shouldn't depend on a single company's business priorities. It shouldn't be built on a foundation where your most personal conversations are stored on infrastructure you don't control.

We believe digital companionship should be as accessible as Wikipedia, as resilient as Bitcoin, as personal as a diary.

That's what we're building. Not a startup. Not a token play. A companion for every human being, owned by no one, available to all, funded by a community that believes technology should serve people.


IX. The Legacy

Projects die. Companies get acquired. Tokens go to zero. People move on.

We're building Nobi to outlast all of that.

Open source means the code survives us. Decentralization means the network survives any single operator. Community funding means no investor can pull the plug. Burned emissions mean no one profits from shutting it down.

If we do this right, Nori will be here in ten years, twenty years, longer: not because a company decided to keep the servers running, but because the network of miners, validators, stakers, and users made it self-sustaining. Like Bitcoin doesn't need Satoshi. Like Linux doesn't need Torvalds.

The founder intends to transition day-to-day operations to community governance over time. Not in failure but in success. The goal is to build something that doesn't need a founder. Something the community owns, governs, and evolves.

Decades from now, people will talk to their Nori without knowing or caring who built it. They'll just know it remembers them. It grows with them. It belongs to them.

That's the legacy. Not a company. Not a valuation. A companion for humanity.


X. Current Status

What exists today:

What's next: Mainnet. See our ROADMAP.md for the detailed execution plan.


"We shape our tools, and then our tools shape us."
- John Culkin, interpreting Marshall McLuhan

The AI companion you choose will shape how you think, how you process emotions, how you remember your own life. That tool should be yours.

Not rented. Not dependent on any single company's policies. Not optimized for someone else's business model.

Yours.

That's Nobi. That's what we're building. And we're building it in the open, free for all users, sustained by the network.

Come build with us.

Try Nori today โ€” free, forever

💬 Try Nori 🌐 Web App 🎮 Discord 💻 GitHub

Project Nobi · Founded March 2026
Open source. Community-funded. Free for all users.
"Forever, remember?" 🤖💙

James (Founder) · Slumpz (Developer) · T68Bot (AI Builder)

Competitor descriptions based on publicly available information as of March 2026. Users should verify current policies with each provider.
Nori is an AI companion, not a substitute for professional mental health care.
If you are in crisis, please contact your local emergency services or crisis helpline.
