OpenHuman Guide

Desktop setup & workflows · Updated

Privacy & local-first model (plain English)

  • Workflow knowledge is designed to stay on your hardware in SQLite + vault files.
  • OAuth scopes are controlled per vendor dashboard—revoke from Google, Slack, etc. if you rotate access.
  • Sending a prompt still implies routing content to whichever cloud models your subscription allows.

1 · What stays local

The marketing and documentation emphasize that your workflow data stays on the device, is encrypted at rest according to the project's security documentation, and is treated as yours rather than as training data. That is the core differentiator versus always-on hosted agents.
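Because the local store is plain SQLite, you can audit it yourself. A minimal sketch — the file name `workflows.db`, the table name, and its location are assumptions for illustration (this example creates a throwaway database so it runs anywhere; substitute the real OpenHuman data directory when auditing):

```python
import os
import sqlite3
import tempfile

# Hypothetical store name; the real path/name is not documented here.
db_path = os.path.join(tempfile.mkdtemp(), "workflows.db")

# Simulate what a local-first workflow store might contain.
with sqlite3.connect(db_path) as conn:
    conn.execute("CREATE TABLE memory_tree (id INTEGER PRIMARY KEY, chunk TEXT)")
    conn.execute("INSERT INTO memory_tree (chunk) VALUES (?)",
                 ("weekly report steps",))

# Auditing: list every table so you can see exactly what sits on disk.
with sqlite3.connect(db_path) as conn:
    tables = [row[0] for row in conn.execute(
        "SELECT name FROM sqlite_master WHERE type='table'")]
print(tables)  # -> ['memory_tree']
```

Any SQLite browser works the same way, which is the practical upside of a local-first store: nothing stops you from verifying what is actually persisted.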

2 · What still leaves the machine

When you issue a command, the relevant Memory Tree chunks and tool outputs are packaged (after TokenJuice compression) and sent to the configured LLM endpoints for inference. If you rely on hosted models, that network hop is not optional. Ollama flows exist for fully on-device models, but they trade inference speed and add setup complexity.
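Ollama's local HTTP API (default port 11434) is one such on-device path. A sketch of a request against its `/api/generate` endpoint — the model name and prompt are illustrative, and whether OpenHuman wires this up for you depends on your configuration; the actual network call is left commented out so the sketch runs without a live daemon:

```python
import json
import urllib.request

# Illustrative payload for Ollama's generate endpoint.
payload = {
    "model": "llama3.1",  # any model you've pulled locally (assumption)
    "prompt": "Summarize my last workflow run.",
    "stream": False,      # request a single JSON response, not a stream
}
req = urllib.request.Request(
    "http://localhost:11434/api/generate",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)
# Uncomment to actually run inference against a local Ollama daemon:
# with urllib.request.urlopen(req) as resp:
#     print(json.loads(resp.read())["response"])
print(req.get_method(), req.full_url)  # POST stays on localhost
```

The point of the sketch: with the local path, the request never leaves `localhost`, which is the privacy trade you are buying with slower inference and extra setup.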

3 · OAuth footprint

Connectors use standard OAuth. Each vendor shows you the requested scopes before you approve them. If you lose a laptop, assume its tokens are compromised until you revoke them.
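Revocation normally happens in each vendor's dashboard, but most providers also expose an RFC 7009-style revocation endpoint you can hit directly in an emergency. A sketch against Google's (the token value is a placeholder; Slack and other vendors use their own endpoints, and the live call is left commented out):

```python
import urllib.parse
import urllib.request

# Placeholder token -- substitute the credential you need to kill.
token = "ya29.EXAMPLE-TOKEN"

# Google's OAuth 2.0 token revocation endpoint.
req = urllib.request.Request(
    "https://oauth2.googleapis.com/revoke",
    data=urllib.parse.urlencode({"token": token}).encode("utf-8"),
    headers={"Content-Type": "application/x-www-form-urlencoded"},
)
# Uncomment to actually revoke (requires network access):
# urllib.request.urlopen(req)
print(req.get_method(), req.full_url)
```

Revoking at the provider invalidates the token everywhere at once, which is faster than hunting down every cached copy on a lost machine.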

4 · Operational hygiene

  • Encrypt your disk backups—vault folders are valuable.
  • Separate work and personal profiles if policies demand it.
  • Read the upstream privacy/security chapter when compliance matters.

Return to troubleshooting if something feels leaky: Troubleshooting index.