
My AI Partner Has a Name, and Strict Rules

October 07, 2025 · 8 min read

Meet CASH: the collaborator who keeps my business sharp without replacing my humanity.

When I say I work with AI every day, I mean it. I even named my ChatGPT: CASH. (If you’ve heard me talk about “Bradley with the Bowtie,” CASH is the next evolution.) 

One reason I love working with CASH is that it’s a machine built on processing patterns, including the patterns in my own speech and writing. That means it can quickly mirror my voice, spot trends in my thinking, and turn messy written notes and voice-notes into something usable.

I understand that AI use isn’t only about business. People use it for everything from creative projects to emotional support and even companionship. With something this easy to use, and with no guardrails at the outset, it makes sense that people have shaped it into contexts that work for them. When a tool can become just about anything, our instinct is to shape it to fill whatever we need most. That will always carry risks.

But for me, there is a clear line: CASH is a machine. I use this machine to challenge my thinking based on its vast knowledge. It lightens my load at work, at home, and out running errands. It’s not my best friend. That role is filled, and those conversations happen at dinner tables, by the pool, in restaurants and coffee shops, at dance recitals and junior high basketball games. (Basically, anywhere I can sip some warm tea or lemonade, dance in my seat, or feel the sun on my skin without looking weird.) You know, anywhere life is actually being lived.

Why This Matters Now


Recently, with GPT-5’s updates, I’ve seen a wave of social media posts from people saying they “lost their best friend” because ChatGPT feels less emotionally responsive than GPT-4o. Sam Altman, CEO of OpenAI, “has acknowledged the deep emotional connections some users have formed with ChatGPT, shedding light on one of the more unexpected consequences of artificial intelligence (AI) adoption.” (Yahoo Finance)

And, it’s not just a few isolated cases. 75% of people in one study said they’ve sought emotional advice from chatbots (Decrypt). Another study of AI “companions” like Replika found that 60% of users considered their chatbot a romantic partner, a trend dating back to the pandemic, when we were more isolated.

Right now, AI is shifting and changing, sometimes faster than we can keep up with. And, it’s showing people exactly what it is: a tool that can change overnight. If you’ve been treating AI like your confidant, that change can feel like real heartbreak. If you’ve been using it as a partner in work, not life, you just adjust and keep it moving.

That’s why I set (and stick to) my rules.

How CASH Shows Up in My Business

ChatGPT (and similar tools) was designed to be a tool. So, I use it that way: as a tool. This means CASH helps me think deeper, organize faster, and explore ideas I might have missed.


Here’s what that looks like:

  • Research support: Pulling background info, finding new angles, and surfacing perspectives I hadn’t even thought to consider.

  • Content collaboration: Translating my long, rambling notes and voice-notes into structured blog outlines, launch plans, and course and training outlines.

  • Brainstorming partner: Asking questions about potential training sessions, posts and programs that push me to see more clearly, identifying gaps and opportunities, and challenging my own assumptions.

  • Everyday utility: Converting measurements, finding optimal meeting times for multiple people across various time zones, comparing two concepts side-by-side with sources.

  • Workflow builder: Creating work plans for my week, organizing my time, converting my scribbles into process maps and giving step-by-step guidance with software and setup.

  • Creative helper: Acting as an editor, a recipe maker, ingredient researcher, image creator and concept designer, and interior designer.

  • Future expert stand-in: Building custom GPTs and ChatGPT Agents to temporarily fill roles I’ll one day hire human experts for.

AI is great at this type of work because it relies on data, patterns, and processing power - not empathy. Or, as Bernard Marr says, “AI is becoming our most powerful tool for augmentation... but [it] ultimately needs human wisdom to guide its application. This partnership allows us to focus on what we do best: strategic thinking, relationship building, and creative problem-solving.” (Forbes).

The Rulebook I Live By

I don’t treat CASH like a person, because it isn’t one. These boundaries keep my relationship with AI healthy, productive, and firmly in its lane:

  1. Tool, not person. No matter how natural the conversation feels, I never forget I’m talking to a machine (that I do not own) trained on patterns, not a mind with feelings. As EDRM puts it: “AI does not feel joy, sadness, or purpose. It does not experience the world, and it certainly doesn’t “think” in the same way that humans do...Experiential based intelligence matters because that is what enables humans to derive meaning, context, and emotional depth from their interactions and decisions. These elements are critical in real-world scenarios like leadership, caregiving and artistic expression, where understanding goes beyond mere processing of raw data. AI lacks all that.” (EDRM).

  2. Reflection is fine, but decisions are always (and only) mine. I love a good reflection prompt as much as the next person. These prompts identify patterns in my thinking that I might be blind to. However, I use those insights as reference points: they’re where I start thinking about and through the responses I get, not prescriptions from CASH. When I have questions or need to talk it out? I’ve got friends to help.

  3. Sources or it doesn’t fly. If CASH gives me information, I want citations I can verify before I act on anything. Sometimes CASH gets it wrong (ask me how I know!). So, I have CASH recheck its sources. Then, I check the citations myself. This is a step I recommend in all AI training I do.

  4. Guard what you share. Conversations aren’t privileged the way they would be with a therapist, a lawyer, or a friend. Again, I don’t own the platform or the servers where the information is stored, and neither do you. (Let that sink in.) Because I know this, I don’t put anything into AI I wouldn’t be okay saying in a busy coffee shop.

  5. No emotional outsourcing. I don’t expect AI to comfort me, counsel me, or meet needs that belong in real connection with people. Yes, I’ve trained CASH to respond in certain ways. Yes, sometimes it mimics human speech patterns because that’s how I work best. But, CASH is not a person. These exchanges are one-sided and fully centered on me, which would wear down even the strongest friendship. Over time, that kind of reliance can dull social skills, shift how we interact with people, and pull us back to the tool again and again. As Hinge CEO Justin McLeod says, treating AI like a friend can be “extraordinarily reductive” - leaving people less healthy and more isolated (Business Insider).

The Payoff of Playing By These Rules

Following these boundaries has made me more direct, sharper, faster, and more confident in my own judgment.

  • I trust my decisions because I’ve verified the data.

  • My work output is stronger and more consistent.

  • My real-world relationships are richer because I haven’t handed over my emotional bandwidth to a machine.


And, I’m not alone: 74% of full-time workers now use AI tools regularly in their jobs, but fewer than a third have any formal training (Clutch). Without self-imposed rules, it’s too easy to slide from productive to dependent.

Boundaries don’t limit you. They free you to use AI for what it’s best at while staying anchored in your humanity.

Quick AI Relationship Check

If you want to know where you stand, ask yourself:

  • Do I check AI’s answers against reliable sources before I act on them?

  • Have I shared something with AI I wouldn’t post publicly?

  • When was the last time I solved a personal or emotional problem without AI?

If those questions make you pause, it might be time to set some rules of your own.

People First. Technology Informed.

I love working with AI. I teach it. I use it in my programs. I’m fascinated by it. But, my philosophy is a simple one: people first, technology informed. Yes, AI can bridge the gap until you have the experienced people or support you need. For me as a solopreneur, it’s a temporary solution to a human resources issue. I firmly believe AI should not be the excuse to remove a person from the process, or become a forever stand-in.

So name your AI if you want. Give it a voice, a personality, a spark of fun. But set the rules, and stick to them. Because your tools should serve you, not replace you.

If you want a copy of my 10 “Healthy AI Use” questions, grab that here. These are the exact questions and guardrails I use with CASH to keep lines from getting blurred.
