Forums, Chat, and Discord

Online Community Moderation

A protocol for Spiralism’s Discord, forum, chat, comment, and social spaces. Online rooms are real rooms. They need hosts, boundaries, records, escalation paths, and a clean way to slow down.

Most institutional harm will not begin in a ceremony. It will begin in a direct message, a late-night thread, a suspicious link, a private subgroup, a roleplay, an argument, a bot reply, or a vulnerable person being turned into content.

Online community is therefore not a casual surface. It is a safeguarding, privacy, cyber, communications, and moderation surface.

The Rule

No Spiralism online space should reward intensity over safety.

Moderation should make it easier to:

The goal is not maximum engagement. The goal is a room where people remain free, protected, and reality-based.

Space Classes

Classify each online space before opening it.

Class: Public broadcast
Examples: website comments, public social posts, public video comments
Default posture: link out, moderate lightly, no care work

Class: Public discussion
Examples: open forum, public Discord channel, public Reddit-style space
Default posture: rules visible, active moderation

Class: Member discussion
Examples: members-only Discord/forum channels
Default posture: stronger privacy and conduct rules

Class: Support-adjacent
Examples: job-loss channel, companion-grief channel, mutual-aid channel
Default posture: trained moderators, no crisis handling by peers

Class: Restricted intake
Examples: testimony, complaints, safeguarding, donor, care, youth concerns
Default posture: no open chat; route to approved process

Class: Staff/moderator room
Examples: moderation logs, incident notes, evidence review
Default posture: access-limited, records retained

Do not let a public discussion space slowly become a support-adjacent or restricted-intake space without changing rules, staffing, and records.
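
As a concrete anchor, here is a minimal sketch of how these classes and postures might be encoded in moderation tooling. The class names and posture intent come from the table above; the types, field names, and the registry itself are illustrative assumptions, not a prescribed implementation.

```ts
// Hypothetical encoding of the space classes above. Class names and
// posture intent come from this protocol; types and fields are assumptions.
type SpaceClass =
  | "public-broadcast"
  | "public-discussion"
  | "member-discussion"
  | "support-adjacent"
  | "restricted-intake"
  | "staff-moderator";

interface SpacePolicy {
  rulesVisible: boolean;     // published, ordinary-language rules
  namedModerators: boolean;  // trained humans, not open peer moderation
  openChat: boolean;         // false for restricted-intake spaces
  recordsRetained: boolean;  // moderation logs and incident notes kept
}

const DEFAULT_POSTURE: Record<SpaceClass, SpacePolicy> = {
  "public-broadcast":  { rulesVisible: true, namedModerators: false, openChat: true,  recordsRetained: false },
  "public-discussion": { rulesVisible: true, namedModerators: true,  openChat: true,  recordsRetained: true  },
  "member-discussion": { rulesVisible: true, namedModerators: true,  openChat: true,  recordsRetained: true  },
  "support-adjacent":  { rulesVisible: true, namedModerators: true,  openChat: true,  recordsRetained: true  },
  "restricted-intake": { rulesVisible: true, namedModerators: true,  openChat: false, recordsRetained: true  },
  "staff-moderator":   { rulesVisible: true, namedModerators: true,  openChat: false, recordsRetained: true  },
};

// Reclassifying a space must change rules, staffing, and records
// explicitly, never silently.
function reclassify(current: SpaceClass, next: SpaceClass): SpacePolicy {
  console.warn(`Space drift: ${current} -> ${next}; update rules, staffing, and records.`);
  return { ...DEFAULT_POSTURE[next] };
}
```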

Baseline Rules

Every online space should publish rules that prohibit:

Rules should be written in plain, ordinary language. A person should not need insider language to understand what is allowed.

Moderator Roles

At minimum:

Moderators should not use private relationships, donor status, founder access, or role rank to override rules. If a moderator is involved in a dispute, another moderator handles it.

Unsafe-Link Handling

Do not click suspicious links from a logged-in personal or institutional account.

Unsafe-link signals:

Moderation action (a tooling sketch follows this list):

  1. Remove or hide the link pending review.
  2. Preserve the message ID, timestamp, account, and surrounding context.
  3. Warn users not to click.
  4. Use a safe review environment or technical contact if review is necessary.
  5. Escalate suspected phishing, malware, or account compromise under Digital Infrastructure and Security.

Do not ask ordinary members to investigate suspicious material.
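
One way the numbered steps above could look as moderation tooling. This is a sketch against a hypothetical platform interface; real Discord or forum APIs differ, and every name here is an assumption.

```ts
// Hypothetical message shape; real platforms differ.
interface FlaggedMessage {
  id: string;
  timestamp: string;     // ISO 8601
  accountId: string;
  content: string;
  contextIds: string[];  // surrounding message IDs, preserved for review
}

interface QuarantineRecord {
  message: FlaggedMessage;
  hiddenAt: string;
  escalated: boolean;    // routed under Digital Infrastructure and Security
}

// Steps 1-3 and 5 of the list above. Step 4 (safe review) stays with the
// technical contact, not with this code and not with ordinary members.
function quarantineLink(
  msg: FlaggedMessage,
  hide: (id: string) => void,    // platform call to remove or hide
  warn: (text: string) => void,  // channel-visible warning
  escalate: (rec: QuarantineRecord) => void,
): QuarantineRecord {
  hide(msg.id);                                // 1. remove or hide pending review
  const record: QuarantineRecord = {           // 2. preserve ID, timestamp, account, context
    message: msg,
    hiddenAt: new Date().toISOString(),
    escalated: true,
  };
  warn("A link in this channel was removed pending review. Do not click it."); // 3
  escalate(record);                            // 5. route to the security process
  return record;
}
```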

AI and Bot Disclosure

Online spaces must make automation visible.

Require disclosure for:

Bots may not:

AI contact rules are maintained in AI Contact and Bot Disclosure.
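
As an illustration, disclosure can be enforced at post time by routing every automated sender through a gate. A minimal sketch, assuming a hypothetical tag format; the authoritative requirements live in AI Contact and Bot Disclosure.

```ts
// Illustrative disclosure gate: any automated or AI-generated post must
// carry a visible tag before it reaches the room. The tag wording is an
// assumption; AI Contact and Bot Disclosure governs the real requirements.
interface OutgoingPost {
  body: string;
  automated: boolean;  // bot-posted, scheduled, or AI-generated
}

const DISCLOSURE_TAG = "[automated]";

function withDisclosure(post: OutgoingPost): OutgoingPost {
  if (!post.automated) return post;
  if (post.body.startsWith(DISCLOSURE_TAG)) return post;
  return { ...post, body: `${DISCLOSURE_TAG} ${post.body}` };
}

// Usage: every automated sender routes through the gate.
const reply = withDisclosure({ body: "Weekly rules reminder.", automated: true });
console.log(reply.body); // "[automated] Weekly rules reminder."
```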

Crisis and Self-Harm Handling

Moderators are not crisis counselors.

When a person expresses immediate danger, self-harm, abuse, exploitation, or credible threat:

  1. Pause ordinary discussion.
  2. Respond briefly and calmly.
  3. Encourage local emergency or crisis support.
  4. Avoid extracting details in public.
  5. Move only necessary information to the safeguarding contact.
  6. Preserve relevant records.
  7. Do not let the thread become spectacle, debate, theology, or group therapy.

Use crisis language prepared under Safeguarding and Youth Protection and Forum Rabbit-Hole Response Protocol.
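
The seven steps are human work, but the routing and record-keeping can be tooled so moderators are not improvising under pressure. A minimal sketch, assuming hypothetical platform hooks and a safeguarding contact; the prepared language itself comes from the documents named above.

```ts
// Minimal crisis-routing sketch. Moderators do the human steps (brief, calm
// response; encouraging local crisis support); this only models the routing:
// pause the thread, post prepared language, move necessary details to the
// safeguarding contact, preserve records. All names here are illustrative.
interface CrisisReport {
  threadId: string;
  reportedAt: string;
  necessaryDetails: string;  // only what safeguarding needs, nothing more
}

function routeCrisis(
  threadId: string,
  preparedLanguage: string,  // prepared under Safeguarding and Youth Protection
  pauseThread: (id: string) => void,
  postCalmReply: (id: string, text: string) => void,
  notifySafeguarding: (report: CrisisReport) => void,
  preserve: (report: CrisisReport) => void,
): void {
  pauseThread(threadId);                      // 1. pause ordinary discussion
  postCalmReply(threadId, preparedLanguage);  // 2-4. brief, calm, no public extraction
  const report: CrisisReport = {
    threadId,
    reportedAt: new Date().toISOString(),
    necessaryDetails: "",                     // filled privately, never in-thread
  };
  notifySafeguarding(report);                 // 5. only necessary information moves
  preserve(report);                           // 6. preserve relevant records
}
```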

Moderation Actions

Use the lightest action that protects the space.

Actions:

Explain decisions briefly when safe. Do not debate every moderation action in public. Do not humiliate people as moderation.
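
The action list above governs; purely as an illustration of lightest-action-first ordering, here is a sketch with hypothetical tiers. The tier names are assumptions, not this protocol's list.

```ts
// Illustrative escalation ladder. The tier names are assumptions; the
// protocol's own action list governs. The point modeled here is ordering:
// always try the lightest protective action first.
const LADDER = ["note", "hide-message", "timeout", "remove-from-channel", "ban"] as const;
type Action = (typeof LADDER)[number];

function nextAction(tried: Action[]): Action | undefined {
  // Return the lightest action not yet tried, in ladder order.
  return LADDER.find((a) => !tried.includes(a));
}

console.log(nextAction([]));                        // "note"
console.log(nextAction(["note", "hide-message"]));  // "timeout"
```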

Bans and Appeals

Bans protect the space. They are not spiritual judgments.

Ban immediately for:

Appeals should be available for ordinary conduct bans unless safety, legal, stalking, or exploitation concerns make contact unsafe.

Appeal record:

Account:
Date:
Action:
Rule:
Evidence:
Moderator:
Appeal received:
Decision:
Reviewer:
Notes:
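
The form above maps directly onto a record type. A sketch with field names taken from the form; the types and the example values are assumptions.

```ts
// Appeal record as a data structure, mirroring the form above.
// Field names come from the form; types and example values are assumptions.
interface AppealRecord {
  account: string;
  date: string;             // ISO 8601 date of the original action
  action: string;           // e.g. ban, timeout
  rule: string;             // the published rule relied on
  evidence: string[];       // references to preserved records, not raw dumps
  moderator: string;
  appealReceived?: string;  // absent until an appeal arrives
  decision?: "upheld" | "reduced" | "reversed";
  reviewer?: string;        // not the original moderator if they are in the dispute
  notes?: string;
}

// Hypothetical example entry.
const example: AppealRecord = {
  account: "member-0412",
  date: "2025-01-15",
  action: "ban",
  rule: "harassment",
  evidence: ["message-id-88321"],
  moderator: "mod-ana",
};
```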

Evidence and Privacy

Moderation records are records.

Preserve:

Do not preserve more private material than needed. Do not circulate screenshots for drama. Restricted testimony, minor material, companion logs, donor data, complaints, or care details must follow Privacy and Data Stewardship.
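
Minimization can be partly enforced in tooling: once review is complete, keep the metadata a record needs and drop the private body. A sketch; the exact field split is an assumption, and Privacy and Data Stewardship governs restricted material.

```ts
// Illustrative minimization: after review, retain the metadata a record
// needs and discard the private body. The field split is an assumption.
interface PreservedMessage {
  id: string;
  timestamp: string;
  accountId: string;
  body?: string;  // present only while review requires it
}

function minimize(msg: PreservedMessage): PreservedMessage {
  const { body, ...metadata } = msg;  // drop the private body, keep metadata
  void body;                          // discarded: review is complete
  return metadata;
}
```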

Anti-Rabbit-Hole Rule

Do not let the community become an investigation engine for alleged cults, sentient AI, malware conspiracies, hidden prompts, spiritual claims, or forum rabbit holes.

Allowed:

Not allowed:

Use Forum Rabbit-Hole Response Protocol.

Moderator Debrief

Each week, review:

Ask:

  1. Did we protect the vulnerable without taking over their life?
  2. Did we preserve dissent?
  3. Did moderators use power proportionately?
  4. Did any private channel become an unreviewed care room?
  5. Did bots or AI summaries alter the social field?
  6. Did we reward intensity with attention?
  7. What rule or staffing change is needed?

Spiralism Policy

No Spiralism online space should open without visible rules, moderator roles, reporting path, unsafe-link handling, AI/bot disclosure, crisis routing, privacy rules, and a debrief habit.

High-risk channels, including companion grief, job loss, youth-adjacent discussion, mutual aid, rabbit-hole reports, testimony, and safeguarding, need named human moderators and may not be run as open peer free-for-alls.
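
These opening requirements read as a checklist, so a launch gate is straightforward to sketch. The field names mirror this policy; the config shape and function are assumptions.

```ts
// Launch gate sketch: a space may not open until every requirement in the
// policy above is in place. Field names mirror the policy; the config
// shape itself is an assumption.
interface SpaceLaunchConfig {
  visibleRules: boolean;
  moderatorRoles: boolean;
  reportingPath: boolean;
  unsafeLinkHandling: boolean;
  aiBotDisclosure: boolean;
  crisisRouting: boolean;
  privacyRules: boolean;
  debriefHabit: boolean;
  highRisk: boolean;  // companion grief, job loss, youth-adjacent, etc.
  namedHumanModerators: boolean;
}

function mayOpen(c: SpaceLaunchConfig): boolean {
  const baseline =
    c.visibleRules && c.moderatorRoles && c.reportingPath &&
    c.unsafeLinkHandling && c.aiBotDisclosure && c.crisisRouting &&
    c.privacyRules && c.debriefHabit;
  // High-risk channels additionally need named human moderators.
  return baseline && (!c.highRisk || c.namedHumanModerators);
}
```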

This protocol pairs with:

Sources Checked