PrivyFeed — Support
Private, Local-first AI RSS Reader for macOS
This document covers how to get help, frequently asked questions, and troubleshooting for PrivyFeed.
About PrivyFeed
PrivyFeed is a local-first, privacy-first AI RSS reader for macOS. Your feeds, your articles, and your AI—all on your device. No cloud, no tracking, no account required.
Frequently Asked Questions
How do I add RSS feeds?
- Click Add Feed (or use the menu)
- Enter the feed URL (e.g. https://example.com/feed.xml)
- Or import feeds from an OPML file
Why isn't "Ask this Feed" working?
PrivyFeed uses Ollama for AI features. To use Ask this Feed:
- Install Ollama on your Mac
- Pull a model: ollama pull llama3.2 (or another model)
- In PrivyFeed Settings, ensure the Ollama URL is http://localhost:11434
- Wait for feeds to be indexed (the first time may take a few minutes)
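To confirm Ollama is actually reachable at the URL PrivyFeed expects, you can run a quick check in Terminal. This is a sketch assuming the default Ollama address; adjust OLLAMA_URL if you changed it in Settings:

```shell
# Check whether Ollama is answering at the default address PrivyFeed uses.
OLLAMA_URL="http://localhost:11434"

if curl -s --max-time 2 "$OLLAMA_URL/api/tags" > /dev/null; then
  echo "Ollama is running at $OLLAMA_URL"
else
  echo "Ollama is not reachable at $OLLAMA_URL - is it installed and running?"
fi
```

If the check fails, start Ollama (or install it from ollama.ai) and try Ask this Feed again.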
What models does PrivyFeed support?
- Chat: Any Ollama chat model (e.g. llama3.2, mistral, qwen2)
- Embeddings: Any embedding model (e.g. nomic-embed-text, mxbai-embed-large)
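Pulling one model of each kind covers both features. A sketch using two of the example models above (any Ollama chat and embedding models work; this assumes the ollama CLI is on your PATH):

```shell
# Pull one chat model and one embedding model for PrivyFeed's AI features.
if command -v ollama >/dev/null 2>&1; then
  ollama pull llama3.2          # chat model, used by "Ask this Feed"
  ollama pull nomic-embed-text  # embedding model, used for indexing
  ollama list                   # confirm both models are installed
  PULLED=yes
else
  echo "ollama CLI not found; install it from https://ollama.ai first"
  PULLED=no
fi
```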
How do I import my existing feeds?
Use File → Import OPML to import feeds from another RSS reader (Feedly, Inoreader, NetNewsWire, etc.).
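Most readers export OPML directly. For reference, a minimal OPML file has this shape (the feed title and URLs here are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<opml version="2.0">
  <head>
    <title>My Feeds</title>
  </head>
  <body>
    <outline type="rss" text="Example Blog"
             xmlUrl="https://example.com/feed.xml"
             htmlUrl="https://example.com"/>
  </body>
</opml>
```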
Where is my data stored?
All data is stored locally on your Mac—in SQLite and UserDefaults. Nothing is sent to external servers.
System requirements
- macOS 13.0 or later
- Ollama (optional, for AI features): ollama.ai
PrivyFeed Pro
What do I get with PrivyFeed Pro?
PrivyFeed Pro is a one-time purchase that unlocks advanced knowledge tools:
- Unlimited Ask: Remove the daily 100-request limit
- Multi-feed Ask: Query across multiple subscriptions or groups at once
- Knowledge Assets: Save Q&A as Notes, generate Daily/Weekly Digests, and view Topic Insights
- Advanced Control: Fine-tune AI temperature, max tokens, and indexing strategies
Does PrivyFeed Pro use the cloud?
No. Just like the free version, Pro features run 100% locally on your Mac using Ollama. When you ask a question across 50 feeds, your own computer does the reading and thinking.
Why is generating a Digest slow?
Generating a digest for a week's worth of articles requires the AI to read and synthesize a lot of text. Since this happens on your device (not a cloud server), performance depends on your Mac's speed (M1/M2/M3 chips perform best).
Contact Us
For support, bug reports, or feature requests:
- Feedback & Issues: Submit Feedback or Report a Problem
Related Links
- Privacy Policy
- Ollama — Local AI models
PrivyFeed — Private, Local-first AI RSS Reader for macOS • Bundle ID: com.kenelite.privyfeed