
Posts

Showing posts from September, 2025

What Actually is an AI Product Manager?

So you've probably seen job postings for "AI Product Manager" or "AI PM" lately, right? Ever since ChatGPT shook up the world, pretty much every company seems to be hunting for AI PMs. But here's the thing: when you actually ask someone, "What exactly is an AI PM?", getting a clear answer isn't as easy as you'd think. I was confused at first too, honestly. I'd been working in this field under the "Intelligence" banner even before LLMs became a thing, but I wasn't really sure when I officially became an "AI PM." I kept asking myself, "Am I actually an AI PM right now?" So today, I wanted to share my honest thoughts on what an AI PM really is and clear up some common misconceptions, based on my experience working in this field from the Intelligence days through to the AI era. I hope this helps anyone who's dreaming of becoming an AI PM or is just curious about the role. How I Became an AI PM My fir...

How to Turn Obsidian into a Real Second Brain with Gemini CLI

Have you ever had this experience with note-taking apps? You start off really enthusiastic, collecting and organizing information like crazy, but then it gets more and more complicated over time until you eventually just... stop using it? You wanted to create this brilliant second brain, but somewhere along the way it just became a digital trash can for information. I've been consistently using Obsidian myself, but honestly, the information just keeps piling up day after day, and organizing all of it is no joke. Getting information from websites, emails, messages, newsletters, and social media into Obsidian was super easy. But organizing it? That was tougher than I thought. And actually retrieving that information later? Even harder. That's when I discovered Gemini CLI. This tool helped me solve the fundamental problem of information classification. One YouTube video really convinced me of its potential. Gemini CLI and Obsidian are honestly a perfect match. Since Obsidian man...

Why LLMs Hallucinate - The Real Reason AI Lies to Us

Have you been chatting with AI lately? It's incredibly smart but sometimes pretty confusing. When you're talking with AI chatbots like ChatGPT or Claude, there are honestly some mind-blowing moments. They tackle complex questions effortlessly and sometimes give you more accurate information than humans would. But then there are those "wait, is this actually right?" moments when they confidently deliver completely wrong answers. This phenomenon is called hallucination: basically, when AI acts like it's seeing things and presents false information as if it's absolutely true. This post is based on research recently published by OpenAI. I'll break down what OpenAI researchers actually discovered about why hallucination happens and how we might solve it, in a way that hopefully makes sense to everyone. If you want to use AI more effectively, this stuff is genuinely helpful. What exactly is hallucination? Is it really that serious? Q1. What exactly is hallucin...