PrivyPDF — Support

Local PDF Q&A with Ollama. Your documents never leave your machine.

This document covers how to get help, frequently asked questions, and troubleshooting for PrivyPDF.

About PrivyPDF

PrivyPDF is a local-first macOS app that turns your PDFs into a searchable, question-answerable knowledge base. All processing runs on your Mac via Ollama. Your documents and embeddings stay in your App Sandbox.

Requirements

  - A Mac running macOS with Ollama installed and running locally
  - An embedding model and a chat model pulled via Ollama (see Quick Start)

Quick Start

  1. Install Ollama from ollama.com
  2. Pull models in Terminal:
    ollama pull nomic-embed-text
    ollama pull qwen2.5:7b-instruct
  3. Open PrivyPDF → Import a PDF → Parse & Index → Ask questions
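After pulling the models, you can confirm they are available by querying Ollama's /api/tags endpoint, which lists locally installed models. The sketch below is illustrative; the helper name `missing_models` is our own, not part of PrivyPDF or Ollama:

```python
import json
import urllib.request

# Models the Quick Start above asks you to pull.
REQUIRED_MODELS = {"nomic-embed-text", "qwen2.5:7b-instruct"}

def missing_models(tags_response: dict) -> set:
    """Return required models absent from an Ollama /api/tags response.

    /api/tags returns JSON like:
    {"models": [{"name": "nomic-embed-text:latest", ...}, ...]}
    """
    installed = [m["name"] for m in tags_response.get("models", [])]
    missing = set()
    for req in REQUIRED_MODELS:
        # A model counts as installed if the name matches exactly
        # or matches with an extra tag suffix (e.g. ":latest").
        if not any(n == req or n.startswith(req + ":") for n in installed):
            missing.add(req)
    return missing

def fetch_tags(base_url: str = "http://localhost:11434") -> dict:
    """Fetch the installed-model list from a running Ollama server."""
    with urllib.request.urlopen(base_url + "/api/tags", timeout=5) as resp:
        return json.load(resp)
```

With Ollama running, `missing_models(fetch_tags())` returning an empty set means both models are ready.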

Frequently Asked Questions

Q: "Cannot connect to Ollama" or indexing fails

A: Ensure Ollama is running:

ollama serve

Check that it is listening on http://localhost:11434. If you use a custom base URL, set it in PrivyPDF → Settings.
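To check the connection outside the app, a quick standard-library sketch (the default URL matches the one above; a running Ollama server answers a plain GET on its root):

```python
import urllib.error
import urllib.request

def ollama_reachable(base_url: str = "http://localhost:11434",
                     timeout: float = 2.0) -> bool:
    """Return True if an Ollama server answers at base_url."""
    try:
        # A running Ollama server responds 200 ("Ollama is running").
        with urllib.request.urlopen(base_url, timeout=timeout) as resp:
            return resp.status == 200
    except (urllib.error.URLError, OSError):
        # Connection refused, timeout, DNS failure, etc.
        return False
```

If this returns False, start Ollama with `ollama serve` and try again.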

Q: Which models should I use?

A:

  - Embedding model: nomic-embed-text
  - Chat model: qwen2.5:7b-instruct (a larger instruct model improves answer quality)

Configure both in PrivyPDF → Settings.

Q: Where are my PDFs stored?

A: PDFs and indexes are stored in the app sandbox:

~/Library/Containers/<BundleID>/Data/Library/Application Support/PrivyPDF/

Your documents never leave your Mac.
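If you want to inspect that location from a script, the path can be built with `pathlib`. Note that `com.example.PrivyPDF` below is a placeholder bundle ID for illustration, not the app's real identifier:

```python
from pathlib import Path

def privypdf_data_dir(bundle_id: str) -> Path:
    """Build the sandboxed Application Support path for a bundle ID."""
    return (Path.home() / "Library" / "Containers" / bundle_id
            / "Data" / "Library" / "Application Support" / "PrivyPDF")

# Placeholder bundle ID; substitute the app's actual identifier.
data_dir = privypdf_data_dir("com.example.PrivyPDF")
```

Passing the real bundle ID yields the directory quoted above.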

Q: Can I use a remote Ollama server?

A: The app is designed for a local Ollama server. To use a remote server, set its base URL (e.g. http://your-server:11434) in PrivyPDF → Settings. The app must be allowed outgoing network access.
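Before pointing the app at a remote server, it is worth sanity-checking the URL's shape. A small sketch of the kind of validation involved, using Python's `urllib.parse` (this is illustrative, not PrivyPDF's actual Settings logic):

```python
from urllib.parse import urlparse

def valid_ollama_url(url: str) -> bool:
    """Accept only http(s) URLs with a host, e.g. http://your-server:11434."""
    parts = urlparse(url)
    return parts.scheme in ("http", "https") and bool(parts.hostname)
```

A common mistake is omitting the scheme: `your-server:11434` is not a valid base URL, while `http://your-server:11434` is.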

Q: Answers are slow or inaccurate

A: Try a larger chat model (e.g. qwen2.5:14b or llama3.1:70b). Ensure your PDF has extractable text (not scanned images). For scanned PDFs, OCR support may be added in future versions.

Q: How do I export document content?

A: Use the Export buttons in the document list or chat view to export as Markdown or JSON.

Contact Us

For support, bug reports, or feature requests:

PrivyPDF is a free product; we do not guarantee response times.
