
Microsoft vs Anthropic (Claude)

Based on our analysis, Anthropic (Claude) is the more privacy-respecting choice overall.

Category                 Microsoft       Anthropic (Claude)
Overall                  C- · 44/100     B · 72/100
What they collect        Concern (35)    Mixed (65)
Who they share it with   Concern (40)    Positive (78)
What you can do          Mixed (58)      Positive (78)
What they promise        Mixed (52)      Positive (82)
In plain English — Microsoft

Microsoft's privacy statement covers an enormous product surface — Windows, Office, Azure, Bing, Xbox, and Copilot — and the data practices vary dramatically across them. The umbrella policy is deliberately vague, deferring almost all specifics to product-level documentation. Cross-product data combination, AI model training on your content, and employer/school access to your files and communications are the key risks most consumers don't realise they're accepting.

In plain English — Anthropic (Claude)

Anthropic collects identity and account data, all prompts and responses, and coding sessions. Consumer users can opt in to having their conversations used for model training, with that data retained for up to five years. API and commercial customers are unaffected: their data is never used for training. With training off, conversations are retained for 30 days for safety purposes and then deleted. Anthropic runs no advertising business and never sells data, and it maintains a dedicated Privacy Center at privacy.claude.com.

