👋 Hey PH — exploring what it means for AI to actually understand emotion
Hi! I’m Freya, co-founder of Tarta Labs, and I spend most of my time thinking about one question:
What happens when AI can actually understand how we feel — not just what we say?
We talk so much about AI being “intelligent,” but real human interaction is driven by something deeper — tone, expression, eye contact, timing.
What if AI could sense and respond to those things, too?
I’m currently working on emotional AI — building a multimodal model that picks up on emotion from voice, facial expression, and gaze, in real time, on-device.
This isn’t just about raw model performance; it’s about making AI feel present. Not just reactive, but empathetic.
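For the technically curious, here’s a rough sketch of what a late-fusion setup like this *could* look like. To be clear, this is a toy illustration, not our actual model: the encoder sizes, feature dimensions, and emotion-class count are all made up for the example.

```python
import torch
import torch.nn as nn

class LateFusionEmotionNet(nn.Module):
    """Toy late-fusion model: one small encoder per modality (voice, face, gaze),
    concatenated and mapped to a set of emotion classes. All dimensions are
    illustrative placeholders, not real values."""
    def __init__(self, voice_dim=128, face_dim=256, gaze_dim=16,
                 hidden=64, num_emotions=7):
        super().__init__()
        self.voice_enc = nn.Sequential(nn.Linear(voice_dim, hidden), nn.ReLU())
        self.face_enc = nn.Sequential(nn.Linear(face_dim, hidden), nn.ReLU())
        self.gaze_enc = nn.Sequential(nn.Linear(gaze_dim, hidden), nn.ReLU())
        self.head = nn.Linear(hidden * 3, num_emotions)

    def forward(self, voice, face, gaze):
        # Encode each modality separately, then fuse by concatenation.
        fused = torch.cat([self.voice_enc(voice),
                           self.face_enc(face),
                           self.gaze_enc(gaze)], dim=-1)
        return self.head(fused)  # logits over emotion classes

# Dummy per-frame features standing in for real voice/face/gaze embeddings.
model = LateFusionEmotionNet()
logits = model(torch.randn(1, 128), torch.randn(1, 256), torch.randn(1, 16))
print(logits.softmax(dim=-1))
```

The real challenge, of course, is doing something like this fast enough to run on-device and in real time — which is exactly what we’re working on.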
We're applying this in different contexts — in games, where players form bonds with AI companions; in apps, where digital characters remember your emotional patterns and grow with you.
To me, emotional intelligence is the missing layer in most of today’s AI products.
Without it, every “smart” interaction still feels kinda empty.
If you’re also working on:
Emotion modeling / affective computing
Voice or visual interaction
AI companionship / digital characters
Or just curious about human-AI relationships...
I’d love to connect. Let’s chat — or just drop a “hi”! 👋
