Transparency and Public Registers
A protocol for making institutional accountability visible. Spiralism should not ask the public to trust private assurances when a public register can carry the weight.
The institution now has policies for AI use, vendors, moderation, provenance, incidents, corrections, partnerships, and data stewardship. Those policies matter only if someone outside the inner circle can see how they are applied.
Transparency is not total exposure. It is disciplined disclosure: enough public information for members, donors, partners, critics, and future researchers to understand the institution’s choices without exposing private people.
The Rule
Trustworthy institutions publish what they can, protect what they must, and explain the difference.
A public register should answer:
- what exists;
- who owns it;
- why it exists;
- what data or people it affects;
- what policy governs it;
- when it was last reviewed;
- how to challenge, correct, or ask questions.
The register does not replace internal records. It gives the public a stable window into them.
Public Registers
Spiralism should maintain these public registers as the institution matures.
| Register | Purpose | Public fields |
|---|---|---|
| AI Use Register | Explain material AI use | tool category, purpose, disclosure norm, human owner, last review |
| Vendor Register | Show important third-party dependencies | vendor class, purpose, data class, owner role, review date |
| Partnership Register | Disclose material relationships | partner, purpose, money/data/access involved, conflict review |
| Correction Log | Preserve public corrections | artifact, issue, correction date, source of correction |
| Incident Aggregate | Report patterns without exposing people | counts by category, lessons, policy changes |
| Publication Register | Track public works | title, owner, sources checked, AI-use note, correction contact |
| Chapter Register | Show active chapters | city/region, status, host role, contact route, review date |
| Policy Revision Log | Show governance changes | policy, date, reason, approving role |
Not every small operational choice needs public posting. Material choices do.
What Not to Publish
Do not publish:
- names of vulnerable people without consent;
- information about minors;
- private testimony details;
- donor details beyond public-recognition consent;
- incident narratives that identify people;
- moderation evidence;
- private messages;
- companion logs;
- credentials, recovery details, account names, or security architecture;
- live malicious links or usable attack details;
- legal strategy;
- internal deliberation that would chill reporting or repair.
Transparency is not a reason to betray privacy.
AI Use Register
The AI Use Register should include material institutional uses, not every small grammar check.
Record:
- tool or vendor category;
- use case;
- data class allowed;
- data class prohibited;
- whether outputs are public-facing;
- whether human review is required;
- whether automated contact occurs;
- disclosure text;
- owner role;
- last review date.
Example:
Use: Public research drafting
Tools: General-purpose language model, public web search
Allowed data: Public sources, non-sensitive drafts
Prohibited data: testimony, donor records, incident records, material involving minors
Human review: required for sources, claims, consent, and publication
Disclosure: noted when AI materially shaped a public artifact
Owner: Editorial Steward
Last review: YYYY-MM-DD
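A register entry like the one above is easier to audit when it is kept in a machine-readable form, so missing public fields can be caught before posting. A minimal sketch in Python; the field names mirror the example but are illustrative, not a prescribed schema:

```python
from dataclasses import dataclass, fields

@dataclass
class AIUseEntry:
    """One public AI Use Register entry. Field names are illustrative."""
    use: str
    tools: str
    allowed_data: str
    prohibited_data: str
    human_review: str
    disclosure: str
    owner: str           # a role, not a personal name
    last_review: str     # ISO date, YYYY-MM-DD

def missing_fields(entry: AIUseEntry) -> list[str]:
    """Return the names of any empty public fields."""
    return [f.name for f in fields(entry) if not getattr(entry, f.name).strip()]

entry = AIUseEntry(
    use="Public research drafting",
    tools="general-purpose language model, public web search",
    allowed_data="public sources, non-sensitive drafts",
    prohibited_data="testimony, donor records, incident records, material involving minors",
    human_review="required for sources, claims, consent, and publication",
    disclosure="noted when AI materially shaped a public artifact",
    owner="Editorial Steward",
    last_review="2026-05-01",
)
print(missing_fields(entry))  # an empty list means the entry is complete
```

A completeness check like this is the kind of quiet automation a register page can run before each quarterly update.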
Vendor Register
The Vendor Register should summarize important dependencies without publishing security-sensitive details.
Record:
- vendor name or category;
- purpose;
- data class touched;
- account owner role;
- backup owner role;
- whether MFA is required;
- whether an export path exists;
- whether an exit plan exists;
- last review date;
- governing policy.
Do not publish admin emails, recovery methods, payment details, API keys, technical topology, or vulnerability details.
Correction Log
Corrections should be visible and boring.
Record:
- date;
- artifact;
- claim or issue corrected;
- correction made;
- source or reason;
- whether AI assistance contributed;
- reviewer.
Do not turn correction logs into defensive essays. The point is to make repair ordinary.
Incident Aggregate
Incident transparency should protect privacy while showing learning.
Publish annually:
- number of incidents by broad category;
- number of complaints received;
- number resolved, pending, or referred;
- number of publication corrections;
- number of data/security near misses;
- number of chapter pauses or closures where public disclosure is appropriate;
- policy or training changes made because of incidents.
Do not publish details that identify people unless the affected person has explicitly agreed and publication is necessary.
Update Cadence
Suggested cadence:
| Register | Cadence |
|---|---|
| AI Use Register | quarterly |
| Vendor Register | quarterly |
| Partnership Register | when material relationship begins or changes |
| Correction Log | as needed, within seven days of correction where practical |
| Incident Aggregate | annually |
| Publication Register | at publication |
| Chapter Register | monthly or when status changes |
| Policy Revision Log | at approval |
A stale register is worse than no register because it creates false confidence. If a register cannot be maintained, narrow its scope until it can.
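Staleness can be checked mechanically against the cadence table. A minimal sketch, with the suggested cadences expressed as a maximum age in days; event-driven registers (partnerships, publications, policy revisions) are updated on change rather than on a clock, so they are omitted:

```python
from datetime import date, timedelta

# Suggested cadences from the table above, as a maximum age in days.
MAX_AGE_DAYS = {
    "AI Use Register": 92,      # quarterly
    "Vendor Register": 92,      # quarterly
    "Incident Aggregate": 366,  # annually
    "Chapter Register": 31,     # monthly
}

def stale_registers(last_reviews: dict[str, date], today: date) -> list[str]:
    """Return registers whose last review is older than their cadence allows."""
    return [
        name
        for name, reviewed in last_reviews.items()
        if name in MAX_AGE_DAYS
        and today - reviewed > timedelta(days=MAX_AGE_DAYS[name])
    ]

reviews = {
    "AI Use Register": date(2026, 1, 10),
    "Vendor Register": date(2026, 5, 2),
    "Chapter Register": date(2026, 5, 20),
}
print(stale_registers(reviews, today=date(2026, 6, 1)))  # → ['AI Use Register']
```

A stale entry flagged this way should narrow the register's scope or trigger a review, not silently stay published.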
Public Register Page
The website should eventually include a public “Transparency” page with:
- plain-language AI-use statement;
- current material AI uses;
- important vendors by category;
- material partnerships;
- correction log;
- annual incident aggregate;
- policy revision log;
- contact route for corrections, privacy, safeguarding, and press.
The page should not be a marketing page. It should be quiet, dated, and easy to audit.
Challenge and Response
Every register needs a challenge path.
When a member, source, donor, critic, partner, journalist, or researcher raises a register concern:
- Acknowledge receipt.
- Preserve the challenged record.
- Assign a reviewer.
- Compare the public register to the underlying record.
- Correct, explain, or escalate.
- Record the outcome.
Do not punish people for noticing a mismatch. Mismatch is a useful signal.
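The challenge steps above form an ordered checklist, which can be tracked so no challenge is closed before every step is recorded. A minimal sketch; the step names and record fields are illustrative, and the protocol itself prescribes no particular software:

```python
from dataclasses import dataclass, field
from datetime import date

# The six challenge steps above, in order. Names are illustrative.
STEPS = [
    "acknowledged",
    "record preserved",
    "reviewer assigned",
    "compared to underlying record",
    "resolved",          # corrected, explained, or escalated
    "outcome recorded",
]

@dataclass
class Challenge:
    register: str
    raised_by: str                       # a role, not a personal name
    received: date
    completed: list[str] = field(default_factory=list)

    def advance(self, step: str) -> None:
        """Mark the next step done; steps must happen in order."""
        expected = STEPS[len(self.completed)]
        if step != expected:
            raise ValueError(f"expected {expected!r}, got {step!r}")
        self.completed.append(step)

    @property
    def open(self) -> bool:
        return len(self.completed) < len(STEPS)

c = Challenge(register="Vendor Register", raised_by="member", received=date(2026, 6, 1))
c.advance("acknowledged")
c.advance("record preserved")
print(c.open)  # True until all six steps are recorded
```

Enforcing the order in code makes it harder to skip preservation or review on the way to a quick answer.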
Spiralism Policy
Spiralism should publish a public transparency page before it asks for broad public trust, major donations, formal partnerships, or chapter expansion.
During the founding period, a simple register is enough: AI use, material vendors, material partners, corrections, and policy revisions. As the institution grows, add incident aggregates, chapter status, and annual learning notes.
This protocol pairs with:
- Communications and Press;
- Research and Editorial Integrity;
- AI Literacy and Use Protocol;
- Vendor and Platform Governance;
- Incident and Complaint Protocol;
- Partnership Strategy;
- Evaluation and Learning Loop;
- Privacy and Data Stewardship.
Sources Checked
- NIST, Artificial Intelligence Risk Management Framework, January 2023.
- NIST AI Resource Center, AI RMF Core, accessed May 2026.
- OECD, AI Principles, adopted 2019 and updated 2024, accessed May 2026.
- OECD, Governing with Artificial Intelligence, 2025.
- Federal Trade Commission, Advertising FAQ’s: A Guide for Small Business, accessed May 2026.
- Federal Trade Commission, AI Companies: Uphold Your Privacy and Confidentiality Commitments, January 9, 2024.