Commitment

AI Accountability Pledge

A framework for using AI without surrendering labor dignity, memory, or moral agency. Not a manifesto. A working standard, applied daily.

The standard

Before asking whether a system is efficient, stable, or profitable, ask whether people inside it can breathe, eat, think, heal, move, create, relate, and participate without being crushed. If the answer is no, then the system is not serving life. It is managing decline.

Commitments

  01. Transparency of use

    When AI shapes language, structure, or output, say so. Not as a disclaimer but as honesty about where the work came from. The ideas are mine. The shaping is collaborative. That distinction matters.

  02. No replacement of human judgment

    AI can draft, research, and organize. It does not decide what matters, what to publish, or what to believe. Judgment stays with the person. The machine assists; it does not author values.

  03. Protect labor dignity

    Automation should expand what people can do, not eliminate the people doing it. If a tool replaces a job, the question is not efficiency -- it is what happens to the person. That question comes first.

  04. Environmental honesty

    Every model run, every training cycle, every inference request has a cost in energy, water, and carbon. Use AI intentionally, not casually. The convenience of generation should not obscure the weight of it.

  05. No surveillance framing

    AI should not be used to monitor, rank, or profile people without their knowledge. The tools on this site do not track visitors, score behavior, or build profiles. Presence here is not data extraction.

  06. Fail-safe defaults

    When AI-assisted systems on this site break, they fail quiet -- not loud. No hallucinated data presented as truth. No automated actions without human review. If the system cannot verify, it says so.

  07. Preserve creative agency

    Music, writing, and visual work on this site are human-originated. AI may help with structure, research, or iteration, but the creative impulse -- the reason something exists at all -- is not delegated.

The cost of running AI


Artificial intelligence is not weightless software. It is physical infrastructure, electricity demand, cooling demand, and procurement pressure that moves through real communities. This panel tracks those hidden costs and where accountability has to be engineered into deployment decisions.


Key figures (educational estimates)

  • Training emissions: 500+ metric tons CO2e for one large legacy model training run (GPT-3 era estimate).
  • Cooling water: roughly 700,000 liters of freshwater for one large-model training cycle.
  • Inference energy: a typical generative query uses about 10x the energy of a standard search query.

Accountability lens 1

The carbon burden of a training run

Model training pushes thousands of accelerators for sustained windows. Emission totals depend as much on grid quality and facility geography as on model architecture.

Chart: emissions comparison (tonnes CO2e); legacy model training versus familiar real-world baselines.

Note: BLOOM was trained on a lower-carbon grid, showing how regional energy mix can sharply reduce training emissions.

Grid intensity determines footprint

Training on a coal-heavy grid can multiply emissions compared with hydro- or nuclear-heavy regions. Site selection is a first-order policy decision, not a cosmetic optimization.
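The grid-intensity point can be sketched as simple arithmetic: emissions scale linearly with the carbon intensity of the electricity used. All figures below are illustrative assumptions (the 1,300 MWh energy figure and the per-grid intensities are rough public ranges, not measurements from any specific run):

```python
# Sketch: how grid carbon intensity scales a training run's emissions.
# All numbers are illustrative assumptions, not measurements.

def training_emissions_tonnes(energy_mwh: float, kg_co2_per_kwh: float) -> float:
    """Metric tons CO2e for a run: MWh x (kg CO2e per kWh) happens to
    equal tonnes directly, since both unit conversions are factors of 1000."""
    return energy_mwh * kg_co2_per_kwh

# The same hypothetical 1,300 MWh training run on three different grids:
for grid, intensity in [("coal-heavy", 0.80),
                        ("mixed", 0.40),
                        ("hydro/nuclear-heavy", 0.05)]:
    print(f"{grid}: {training_emissions_tonnes(1300, intensity):,.0f} t CO2e")
```

Under these assumed intensities, the identical workload spans roughly 65 to over 1,000 tonnes CO2e, which is the sense in which site selection is a first-order decision.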

Hardware lifecycle also counts

Frequent accelerator turnover compounds impact through manufacturing and e-waste. Accountability has to include procurement cadence, not just runtime electricity.

Accountability lens 2

Water pressure in data center growth

Cooling towers and power generation chains both consume water. As AI demand scales, local water planning and transparency reporting become public-interest concerns.

How water is consumed
  • Direct cooling draw: on-site tower systems use freshwater to keep thermal loads under control.
  • Indirect energy draw: utilities also consume water producing the electricity AI workloads require.
  • Conversation-scale impact: estimates suggest a 20-50 prompt chat can consume roughly 500 ml of water.
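The conversation-scale estimate above can be turned into a rough per-prompt figure. This is a back-of-envelope sketch assuming the 500 ml per 20-50 prompt range; both bounds are rough public estimates:

```python
# Sketch: per-prompt water estimate derived from the chat-level figure above.
# Assumes ~500 ml across a 20-50 prompt conversation (rough estimate).

ML_PER_CHAT = 500.0

def ml_per_prompt(prompts_per_chat: int) -> float:
    """Divide the chat-level water estimate evenly across prompts."""
    return ML_PER_CHAT / prompts_per_chat

for prompts in (20, 50):
    print(f"{prompts}-prompt chat -> ~{ml_per_prompt(prompts):.0f} ml per prompt")
```

That works out to roughly 10-25 ml per prompt, which is small per interaction but compounds quickly at data-center scale.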

Chart: corporate water trend (billions of gallons); approximate trend as AI infrastructure scaled between 2018 and 2022.

Accountability lens 3

Inference is the daily load

Training is episodic; inference is continuous. Each generated response performs token-by-token computation, so scaled usage raises baseline data center demand.

Chart: energy per query (watt-hours).

Why this matters at system scale

Search retrieves indexed records. Generative inference computes token probabilities in sequence, increasing accelerator utilization per request. At billions of interactions, this changes total grid demand and required backup capacity.
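The scale argument can be made concrete with assumed per-query figures. The values here are illustrative (search around 0.3 Wh, generative around 3 Wh, echoing the roughly 10x ratio cited above); the query volume is hypothetical:

```python
# Sketch: why per-query watt-hours matter at platform scale.
# Per-query energy figures and query volume are assumptions for illustration.

def daily_gwh(queries_per_day: float, wh_per_query: float) -> float:
    """Total daily energy in gigawatt-hours: watt-hours / 1e9."""
    return queries_per_day * wh_per_query / 1e9

QUERIES_PER_DAY = 1e9  # hypothetical: one billion queries per day

print(f"search-style: {daily_gwh(QUERIES_PER_DAY, 0.3):.2f} GWh/day")
print(f"generative:   {daily_gwh(QUERIES_PER_DAY, 3.0):.2f} GWh/day")
```

At this assumed volume, the same billion interactions shift from a fraction of a gigawatt-hour to several per day, which is the change in baseline demand and backup capacity the paragraph describes.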

Counterbalance

Where AI can reduce emissions

Accountability is not anti-AI. It is pro-measurement. The same systems can reduce emissions when deployed toward grid balancing, logistics efficiency, and material innovation.

Chart: potential sector emissions reduction (%).

Grid optimization

Real-time balancing can improve renewable integration and reduce curtailment waste.

Material discovery

Accelerated candidate search can compress R&D cycles for batteries and energy systems.

Agricultural and logistics efficiency

Route planning and precision operations reduce fuel, fertilizer, and water overhead.

Notes and source anchors

Values in this panel are educational estimates aggregated from public reporting and research summaries, including Stanford HAI, hyperscaler environmental disclosures, BLOOM reporting context, and Joule-era inference cost discussions.

Why this exists

This pledge exists because the tools are powerful and the defaults are not neutral. AI systems are built by companies with incentives that do not always align with the people using them. The convenience of generation can obscure the cost of extraction. The speed of automation can erase the dignity of labor.

This is not anti-technology. This site uses AI actively -- for research, writing assistance, code, data sync, and archive management. But using a tool and being accountable for how you use it are not the same thing. This page is the accountability part.

People should not have to prove they are worth helping by first becoming a crisis. Systems should not have to break before someone asks whether they were serving life.