Feeding raw data directly into your AI agents eats up context windows and spikes your OpenAI and Anthropic costs.
Earlier this year, we launched standard HTML, JSON, and Markdown extraction. Today, we are introducing three output formats built specifically for AI: markdown-llm, text-llm, and html-llm. Each one automatically strips out navbars, footers, ads, and scripts, delivering only the context your models actually need.
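To make the idea concrete, here is a minimal sketch of the kind of cleanup these formats perform. This is an illustration, not our actual pipeline: a stdlib HTML parser that drops boilerplate subtrees (`nav`, `footer`, `script`, and friends) and keeps only the readable text.

```python
from html.parser import HTMLParser

# Tags whose entire subtree is boilerplate for LLM purposes.
# (Illustrative list only — the real extraction is more sophisticated.)
STRIP = {"nav", "footer", "aside", "script", "style"}

class ContentExtractor(HTMLParser):
    def __init__(self):
        super().__init__()
        self.depth = 0    # nesting depth inside a stripped subtree
        self.chunks = []  # text fragments worth keeping

    def handle_starttag(self, tag, attrs):
        if tag in STRIP:
            self.depth += 1

    def handle_endtag(self, tag):
        if tag in STRIP and self.depth:
            self.depth -= 1

    def handle_data(self, data):
        # Keep text only when we are outside every stripped subtree.
        if self.depth == 0 and data.strip():
            self.chunks.append(data.strip())

html = """<html><body>
<nav><a href="/">Home</a> <a href="/blog">Blog</a></nav>
<article><h1>Pricing update</h1><p>Plans now start at $9.</p></article>
<script>trackPageView();</script>
<footer>© 2024 Example Inc.</footer>
</body></html>"""

parser = ContentExtractor()
parser.feed(html)
text = "\n".join(parser.chunks)
print(text)  # only the article content survives
```

Even on this toy page, the navigation links, tracking script, and footer vanish, and the surviving text is a fraction of the original markup — that shrinkage is exactly where the token savings come from.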
You can save up to 85% on tokens compared to raw HTML when using the text-llm output format.