We were spending too much on LLM API calls and realized that most prompts carry redundant tokens that inflate costs without adding value. So we built PithToken: an API proxy that sits between your app and any OpenAI-compatible provider and automatically optimizes your prompts before forwarding them. What you get: • 48% average token reduction (benchmarked on 100K real conversations from...
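As a rough illustration of the idea (this is a toy sketch, not PithToken's actual algorithm, and `compress_prompt` is a hypothetical helper), a prompt-optimizing proxy can shrink a prompt before forwarding it by collapsing redundant whitespace and dropping exact-duplicate lines, which cost tokens without adding information:

```python
import re

def compress_prompt(prompt: str) -> str:
    """Toy prompt shrinker: collapse whitespace, drop duplicate lines."""
    seen = set()
    kept = []
    for line in prompt.splitlines():
        # Collapse runs of whitespace inside the line.
        line = re.sub(r"\s+", " ", line).strip()
        if not line or line in seen:
            continue  # skip empty and exact-duplicate lines
        seen.add(line)
        kept.append(line)
    return "\n".join(kept)

prompt = (
    "You are a helpful   assistant.\n"
    "You are a helpful assistant.\n"
    "Answer   concisely.\n"
)
print(compress_prompt(prompt))
```

Real optimizers go much further (deduplicating semantically similar instructions, trimming stale conversation history), but the proxy placement is what makes this transparent to the application: the app keeps sending ordinary chat-completion requests and only the base URL changes.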
PithToken: Cut LLM API costs 50% with prompt injection defense
