All activity
Dante Smaghi left a comment
New Research Paper: Prompt Injection Attack Trajectory and Architectural Mitigation Strategies for Sensitive Data Environments
I have published a systematic analysis of why prompt injection remains an unresolved problem in AI security and what it implies for organizations managing sensitive data. Key findings from the research include: - Joint research from OpenAI, Anthropic, and...

ISONQ™ – Claude Desktop zero-trust MCP app for local files, M365 & Slack
Dante Smaghi left a comment
Full disclosure — I'm a big fan of Anthropic and actually built an app that works exclusively with Claude Desktop for secure local file search. So I've spent a lot of time thinking about this space. My concern with Cowork isn't the vision, it's the architecture. Agentic tasks burn through tokens fast (Anthropic says so themselves), so that $200/month cap can disappear quicker than you'd expect....

Cowork – Turn Claude into your digital coworker
Dante Smaghi left a comment
Hey PH 👋 Rust. Local-first. Encrypted. No-cloud tool for Claude Desktop. First release. Feedback welcome. — Dante

ISONQ™ – Claude Desktop zero-trust MCP app for local files, M365 & Slack
ISONQ is a local-first search tool that gives Claude Desktop context — without your data ever leaving your device.
Search your emails, files, and Teams messages. Built from scratch in Rust with a proprietary indexing system. Encrypted at rest. Blazing fast. Token-efficient.
No cloud. No sync. No subscriptions phoning home. Your data stays yours.

ISONQ™ – Claude Desktop zero-trust MCP app for local files, M365 & Slack
Dante Smaghi started a discussion
What's your biggest friction point with local AI tools?
Building in the local-first AI space. Curious what's actually frustrating people — setup, performance, privacy trade-offs, integrations? What would it take to make you switch from cloud-based tools?
Dante Smaghi left a comment
Congrats on the launch — the convergence play is the right move. Tool sprawl is a real productivity killer. As someone building in the productivity/AI space, one thing caught my attention: when AI has full context across tasks, docs, and chat, what does the data flow look like under the hood? Local processing, or routed through cloud infrastructure and third-party LLM providers? For SMBs...

ClickUp 4.0 – All your work: tasks, docs, chat, and AI with 100% context
