Discord vs Anthropic (Claude)
Based on our analysis, Anthropic (Claude) is the more privacy-respecting choice overall.
| Category | Discord | Anthropic (Claude) |
|---|---|---|
| Overall | C+ · 58/100 | B · 72/100 |
| What they collect | Mixed (52) | Mixed (65) |
| Who they share it with | Mixed (55) | Positive (78) |
| What you can do | Positive (72) | Positive (78) |
| What they promise | Positive (65) | Positive (82) |
Discord collects your messages, activity, device data, and behavioural signals, and uses them for personalisation and sponsored-content targeting. It doesn't sell your data, encrypts voice and video end-to-end, and gives you genuine in-app controls over most processing. The biggest risks are public server content being used to train AI systems and third-party bots that operate largely outside Discord's privacy guarantees.
Anthropic collects identity and account data, all prompts and responses, and coding sessions. Consumer users can opt in to having their conversations used for model training, with that data retained for up to five years. API and commercial customers are unaffected: their data is never used for training. With training turned off, conversations are retained for 30 days for safety purposes and then deleted. Anthropic has no advertising business and never sells data, and it maintains a dedicated Privacy Center at privacy.claude.com.