privatiser started a discussion
It solves the AI data privacy problem; now people just need to see it.
Hey all, I built privatiser to stop people from leaking their sensitive data and to stop AI providers from using it for training. That's just the start: it can now work with virtually every type of sensitive data. I recently launched on Product Hunt but didn't get the traction I was looking for, so I'm creating this thread as a pre-launch space to happily announce that after reaching 150 total downloads the pro...
Every time you paste logs, configs, or code into an AI assistant, you risk leaking API keys, passwords, internal hostnames, and customer data.
Privatiser strips it all out locally before you hit send, then lets you reverse the anonymization on the AI's response.
Works as a browser extension, VS Code extension, and web tool. Nothing leaves your machine.
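The strip-then-reverse flow can be sketched roughly like this. This is a minimal illustration, not Privatiser's actual implementation: the two regexes and the placeholder format are my own assumptions.

```python
import re

# Hypothetical patterns -- Privatiser's real detector covers 30+ types.
PATTERNS = {
    "API_KEY": re.compile(r"sk-[A-Za-z0-9]{20,}"),
    "IPV4": re.compile(r"\b(?:\d{1,3}\.){3}\d{1,3}\b"),
}

def anonymise(text):
    """Replace sensitive matches with placeholders; keep a local map to reverse later."""
    mapping = {}
    for name, pattern in PATTERNS.items():
        for i, match in enumerate(sorted(set(pattern.findall(text)))):
            placeholder = f"<{name}_{i}>"
            mapping[placeholder] = match
            text = text.replace(match, placeholder)
    return text, mapping

def deanonymise(text, mapping):
    """Restore the original values in the AI's response."""
    for placeholder, original in mapping.items():
        text = text.replace(placeholder, original)
    return text

original = "curl -H 'Authorization: sk-abcdefghij1234567890' 10.0.0.5"
safe, mapping = anonymise(original)
# "safe" now contains "<API_KEY_0>" and "<IPV4_0>" instead of the real values,
# and only "safe" would ever leave the machine.
```

The key property is that the mapping never leaves your machine, so the AI only ever sees placeholders, and the response is rehydrated locally.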

Privatiser: Anonymise sensitive data before it hits any AI
privatiser left a comment
Hey PH! Built this after watching a colleague paste a production .env file into an AI assistant. The data was gone before anyone noticed. Privatiser detects 30+ pattern types across secrets, PII, network identifiers, AWS/GCP/Azure resources, and more. All processing is local. Would love feedback on what patterns you'd want added, or what's getting flagged that shouldn't be.
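To make the categories concrete, here is one guessed example pattern per category. These regexes and samples are illustrative assumptions on my part, not Privatiser's actual rules:

```python
import re

# Illustrative only -- each entry is (assumed regex, sample that should match).
EXAMPLES = {
    "secret (AWS access key ID)": (r"\bAKIA[0-9A-Z]{16}\b", "AKIAIOSFODNN7EXAMPLE"),
    "PII (email address)": (r"[\w.+-]+@[\w-]+\.[\w.]+", "jane.doe@example.com"),
    "network identifier (IPv4)": (r"\b(?:\d{1,3}\.){3}\d{1,3}\b", "192.168.1.10"),
    "cloud resource (GCP project path)": (r"projects/[a-z][a-z0-9-]{4,28}[a-z0-9]", "projects/my-prod-project"),
}

for label, (pattern, sample) in EXAMPLES.items():
    # Each sample should trip its category's detector.
    assert re.search(pattern, sample), label
```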
