Unlocking Community Trust: How Dinoki Puts Privacy First
Building on a Foundation of Respect
When we set out to create Dinoki—a pixel-art AI companion that lives on your desktop—we faced a fundamental question: How do we build an AI sidekick that users can truly trust?
The answer came from an unexpected place: Apple's approach to native app development. As macOS developers, we've experienced firsthand how Apple's unwavering commitment to security shapes everything. It's challenging, but enlightening. Their philosophy is simple: never violate the user's trust.
This philosophy became our north star.
The Privacy Crisis in AI
After 16 years in tech, I've watched the evolution—and failure—of data security. Constant breaches. Centralized honeypots. As AI grows more powerful, centralization isn't just a technical liability; it's becoming the single greatest risk to AI safety.
When your conversations and personal data flow through centralized servers, you're trusting more than just a company. You're trusting their security, their business model, and every future decision they'll make.
A Different Path: True Decentralization
Dinoki represents something different. At just 5MB, our entire app is lighter than many web pages, because we believe in doing more with less. But size isn't just about efficiency; it's about transparency. You can't hide invasive tracking in 5MB.
Here's what makes Dinoki fundamentally different:
We have no special backend. Connect directly to OpenAI, Anthropic, OpenRouter, Ollama, or any inference API you prefer. You're never locked into our infrastructure because there isn't any.
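To make that concrete, here's a minimal sketch of what a direct, backend-free request can look like in Swift. The endpoint (a local Ollama server speaking the OpenAI-compatible chat API) and the model name are illustrative assumptions, not Dinoki's actual code:

```swift
import Foundation

// Minimal sketch: the app talks straight to whichever provider you configure.
// There is no intermediary server in between.
struct ChatMessage: Codable {
    let role: String
    let content: String
}

struct ChatRequest: Codable {
    let model: String
    let messages: [ChatMessage]
    let stream: Bool
}

func sendChat(_ prompt: String) async throws -> Data {
    // Example endpoint: a local Ollama server exposing an OpenAI-compatible API.
    // Point this at OpenAI, Anthropic, OpenRouter, or any endpoint you prefer.
    let url = URL(string: "http://localhost:11434/v1/chat/completions")!
    var request = URLRequest(url: url)
    request.httpMethod = "POST"
    request.setValue("application/json", forHTTPHeaderField: "Content-Type")
    request.httpBody = try JSONEncoder().encode(
        ChatRequest(
            model: "llama3",
            messages: [ChatMessage(role: "user", content: prompt)],
            stream: false
        )
    )
    let (data, _) = try await URLSession.shared.data(for: request)
    return data
}
```

Because the request goes straight from your Mac to the endpoint you chose, switching providers is just a matter of changing the URL and credentials.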
Everything runs locally. When Dinoki browses the web or performs tasks, it uses your Mac's native capabilities—not our servers. Even file saving is limited to your Downloads folder, and only when you ask.
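As an illustration of that boundary, here is a rough sketch (an assumption for clarity, not Dinoki's actual code) of how a save operation can be confined to the user's Downloads folder:

```swift
import Foundation

// Illustrative sketch: writes are confined to ~/Downloads, and nothing is
// saved unless the user explicitly asks for it.
func saveToDownloads(_ text: String, suggestedName: String) throws -> URL {
    let downloads = try FileManager.default.url(
        for: .downloadsDirectory,
        in: .userDomainMask,
        appropriateFor: nil,
        create: true
    )
    // Keep only the last path component so a relative path can't escape
    // the Downloads directory.
    let safeName = (suggestedName as NSString).lastPathComponent
    let destination = downloads.appendingPathComponent(safeName)
    try text.write(to: destination, atomically: true, encoding: .utf8)
    return destination
}
```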
We can't see your world. Unlike AI apps that request Accessibility permissions to "see" your screen, Dinoki operates within strict boundaries:
- Can't see your computer's contents
- Can't see what apps you're using
- Can't read your screen or keystrokes
- Can't access files without your explicit selection
- Can't track your usage patterns
- Doesn't even ask for your email
This isn't a limitation—it's our promise. Your AI companion doesn't need to spy on you to be helpful.
Will we ever ask for deeper permissions? Maybe someday, when we've built something truly magical. But like a game where you unlock features through trust and progression, we'll never ask for more access until we've earned it. And when that day comes, it will always be your choice—with clear explanations of what we'd use it for and why. Trust isn't just earned once; it's earned every single day.
Beyond Privacy: Building Tomorrow's AI Companion
While privacy is our foundation, our vision reaches further. We're creating a true AI companion—think virtual pet for the AI age. Dinoki is an agent system designed for everyday AI use, helping with tasks while respecting your boundaries.
We're still early. Our pixelated friend is learning, sometimes stumbling in hilarious ways. But that's part of the journey. We're building in public, with transparency at our core, because trust isn't just about what we don't do—it's about being open about what we're building.
Join the Revolution
The future of AI doesn't require massive data collection or privacy violations. It doesn't need Accessibility permissions to peer into your digital life. Together, we can prove that AI companions can be helpful, delightful, and respectful.
This is just the beginning. Every user who chooses Dinoki is voting for a future where innovation and privacy coexist. Where AI enhances life without surveilling it. Where a 5MB app can do more than a 500MB one—because it's built on trust, not extraction.
Have questions or want to contribute? Reach out; we're building this future together with a community that believes the best technology respects its users.
Ready to Join the Adventure?
Experience AI companionship built on trust. Download Dinoki and become part of our research community.