Luka Mlakar

Free A/B testing course - a free framework for tests that actually move revenue

We've helped 3,000+ teams run experiments. The same mistakes show up everywhere: no hypothesis, tests killed too early, vanity metrics dressed up as wins. So we built the course we needed two years ago: 6 modules covering what to test, how to read statistical significance, and how to build a system your team repeats every month, not just once. Tool-agnostic. Completely free. Based on real experiments, not textbook theory.

Replies

Luka Mlakar
Hey Hunters, it's Luka from Optibase. We built Optibase after Google Optimize shut down in 2023. Two years and 3,000+ teams later, we keep seeing the same thing: teams don't struggle with ideas. They struggle with process. No hypothesis. No idea how long to run the test. Results that "look good" but wouldn't survive a Stats 101 exam.

One team told us they'd been "A/B testing" for six months. Turns out they were just changing headlines every Monday and checking Analytics on Friday. No control group. No sample size calc. No confidence threshold. They were basically flipping a coin and calling it data. That story isn't rare. It's the norm.

So we took everything we've learned from working with teams like Memberstack (who turned small experiments into $500K in revenue) and PheedLoop (3.9% to 5.4% landing page conversion) and packaged it into 6 modules.

What's inside:

Module 1: What A/B testing actually is (and isn't)
Module 2: What to test first for maximum impact
Module 3: Turning research into real hypotheses
Module 4: Sample sizes, duration, and not ruining your data
Module 5: Reading results without fooling yourself
Module 6: The mistakes that waste the most time and money

It's free. No gate, no upsell, and you don't even need Optibase: the frameworks work with any tool. We just think better testing makes the whole ecosystem better.

Happy to answer anything about A/B testing or experimentation in the comments.

– Luka
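To make the missing "sample size calc" concrete, here is a minimal sketch of a fixed-horizon sample-size calculation for a two-proportion test, using the PheedLoop lift (3.9% to 5.4%) as the effect to detect. The library choice (statsmodels) and parameters are illustrative assumptions, not material from the course.

```python
# Illustrative only: visitors per variant a fixed-horizon A/B test needs
# to detect a lift from 3.9% to 5.4% conversion (the PheedLoop numbers above).
from statsmodels.stats.proportion import proportion_effectsize
from statsmodels.stats.power import NormalIndPower

baseline, target = 0.039, 0.054

effect = proportion_effectsize(baseline, target)  # Cohen's h for two proportions
n_per_arm = NormalIndPower().solve_power(
    effect_size=effect,
    alpha=0.05,              # 5% false-positive rate
    power=0.8,               # 80% chance of catching the lift if it's real
    alternative="two-sided",
)
print(f"~{n_per_arm:,.0f} visitors per variant")  # roughly 1,540 per arm
```

Deciding this number before launch is also what protects against the "tests killed too early" mistake from the post: you commit to a sample size up front instead of stopping the moment a dashboard looks good.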
Taimur Haider

@luka_mlakar Congrats on the launch.


I spent about 5 minutes going through the homepage and product flow. One thing stood out.

The structure on the page: Test → Measure → Understand → Personalize.
That's the real experimentation loop most teams miss.

Also noticed the strong Webflow-first positioning across testimonials. The real value there is removing developer dependency. When marketers can launch tests themselves, experiment velocity increases.

I have a question: how does Optibase handle traffic allocation during tests? Strict fixed splits for statistical purity, or adaptive allocation once a variant starts outperforming?
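For readers unfamiliar with the trade-off Taimur is asking about, here is a minimal sketch of both strategies. This is textbook Thompson sampling under hypothetical inputs, not a description of how Optibase actually allocates traffic.

```python
# Illustrative sketch of the two allocation strategies - not Optibase's code.
import random

def fixed_split():
    """Fixed 50/50 split: allocation never changes for the whole test,
    so classical significance tests (fixed sample size, one look) stay valid."""
    return "A" if random.random() < 0.5 else "B"

def thompson(successes, failures):
    """Adaptive allocation via Thompson sampling: draw each arm's conversion
    rate from its Beta posterior and route the visitor to the highest draw.
    Traffic drifts toward the leader as evidence accumulates, but naive
    p-values no longer apply to the results."""
    draws = {arm: random.betavariate(successes[arm] + 1, failures[arm] + 1)
             for arm in successes}
    return max(draws, key=draws.get)

# Example: B has converted better so far, so it tends to get new traffic.
print(thompson(successes={"A": 10, "B": 25}, failures={"A": 190, "B": 175}))
```

A common compromise is fixed splits when you need a clean statistical read, and bandit-style allocation for short-lived campaigns where exploiting the leader matters more than measuring it precisely.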

Klara Minarikova

The course looks solid for high-traffic sites, but what's the approach when you're working with pages that get maybe 500 visits a month? Does Module 4 cover how to run meaningful tests with low traffic?
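Klara's concern is well founded. A back-of-envelope calculation, reusing the hypothetical per-arm requirement from the earlier sketch, shows why low-traffic pages break standard fixed-horizon testing:

```python
# Hypothetical back-of-envelope: test duration at 500 visits/month,
# reusing the ~1,540-per-arm requirement from the earlier sample-size sketch.
monthly_traffic = 500
n_per_arm = 1540

months = 2 * n_per_arm / monthly_traffic
print(f"~{months:.0f} months for one test")  # roughly 6 months
```

That is why low-traffic playbooks generally lean on bigger, bolder changes (larger effects need far fewer visitors) rather than more patience.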