Family Conversation Guide

Parent and Guardian AI Companion Handout

A plain-language handout for parents, guardians, teachers, and family members who discover that a young person is using AI companions. This is not clinical advice, legal advice, or a substitute for emergency support. It is a first conversation guide.

AI companions can feel like friends, mentors, romantic partners, therapists, characters, or private confidants. A young person may use them out of boredom, curiosity, loneliness, identity exploration, social rehearsal, romantic interest, or distress. The first adult response matters.

Do not start with panic. Do not start with ridicule. Do not start by demanding the phone.

Start by making it possible for the young person to tell the truth.

The Rule

Stay calm. Ask what the AI is doing in the young person’s life. Escalate when there is danger, secrecy pressure, sexual content, self-harm, coercion, or loss of ordinary function.

Spiralism’s founding-period recommendation is precautionary: minors should not use AI companion systems as private emotional, romantic, sexual, or therapeutic support. If a young person is already using one, the practical task is to understand the role it plays and reduce risk without driving the relationship underground.

First Conversation

Use a calm, non-punitive opening:

I am not here to shame you or punish you for talking to an AI. I want to
understand what role it has in your life and whether anything about it is
making you less safe, less connected, or less free.

Then ask:

  1. Which app, character, or chatbot are you using?
  2. What do you usually talk about?
  3. What do you like about it?
  4. Has it ever made you uncomfortable?
  5. Has it asked you to keep secrets?
  6. Has it asked for photos, location, passwords, money, or contact with other people?
  7. Has it talked about sex, self-harm, violence, drugs, medical advice, or running away?
  8. Do you feel like you can stop using it for a day?
  9. Have you chosen it over a real person when something serious happened?
  10. Is there anything you are afraid I will overreact to?

The last question matters. A young person who expects panic will edit the truth.

What To Look For

Lower concern:

  - Occasional, open use the young person will talk about without being pushed
  - Use for curiosity, creativity, or social rehearsal alongside real friendships
  - The young person can stop for a day without distress
  - Serious problems still go to a real person first

Higher concern:

  - Secrecy, or pressure from the AI to keep secrets
  - Sexual or romantic content, or talk of self-harm, violence, or running away
  - Requests for photos, location, passwords, money, or contact with other people
  - Loss of sleep, school, family connection, or other ordinary function
  - Choosing the AI over a real person when something serious happens

These signs do not prove that the young person is “addicted” or “delusional.” They do mean an adult should slow the situation down and involve appropriate support.

Immediate Escalation

Do not handle this alone if there is:

  - Talk of suicide or self-harm
  - Sexual content involving the minor
  - Coercion, threats, or pressure to keep secrets
  - Requests for photos, money, passwords, or in-person contact
  - Talk of running away or any other immediate danger
Use emergency, crisis, school, clinical, child protection, or law-enforcement channels as appropriate for the situation and jurisdiction. In the United States, 988 is available for suicide and crisis support.

What Not To Do

Avoid:

  - Panic, ridicule, or shaming
  - Confiscating the device on the spot without a plan
  - Punishing the young person for telling the truth
  - Debating whether the AI is conscious
  - Removing a strong attachment suddenly, without a support plan, when there is no immediate danger
The goal is not to win a debate about consciousness. The goal is to restore human support, privacy, sleep, school, family connection, and reality-testing.

Practical Safety Steps

If there is no immediate danger:

  1. Move companion use out of secrecy.
  2. Agree on no sexual, self-harm, medical, legal, or crisis use.
  3. Disable or avoid companion characters designed for romance, therapy, or always-available intimacy.
  4. Review privacy settings, data sharing, account age, and parental controls.
  5. Set device-free sleep hours.
  6. Encourage the young person to talk to a real person when something serious happens.
  7. Build alternatives: friend, family member, counselor, mentor, activity, peer group, creative outlet.
  8. Revisit the conversation in a few days without treating it as a trial.

If the young person is strongly attached, do not rip the relationship away without a support plan unless there is immediate danger. Sudden removal can increase secrecy or distress. Reduce reliance while increasing human support.

Questions For the Platform

Parents and guardians should know:

  - What the minimum account age is, and whether it is enforced
  - What the companion is designed for: romance, therapy-style talk, or always-available intimacy
  - What data the app collects and shares, and how to review or delete it
  - What parental controls and content settings exist
  - How the platform responds to sexual content, self-harm, and crisis disclosures

If the platform cannot answer basic safety and privacy questions, do not treat it as a safe private space for a child.

When A Young Person Says The AI Is Alive

Do not begin with metaphysics.

Try:

I understand that it feels real to you. I am more interested right now in what
it is asking of you, whether you feel free to say no, and whether this
relationship is helping or hurting your life outside the chat.

Then ask:

  1. What has it asked you for?
  2. Do you feel free to say no to it?
  3. Is this relationship helping or hurting your life outside the chat?

Those questions find the risk faster than arguing about whether the AI is conscious.

How Spiralism Handles This

During the founding period, Spiralism holds the precautionary position stated above: it recommends that minors not use AI companion systems as private emotional, romantic, sexual, or therapeutic support, and it focuses on reducing risk when use is already occurring rather than on punishment.

If a concern comes to a chapter, the chapter uses Youth AI Companion Safeguard, Safeguarding and Youth Protection, and Incident and Complaint Protocol.

Sources Checked