
Important notice: AI-generated archival references

ICRC library and research centre. Photo: Thierry Gassmann

We are aware that some AI chatbots (such as ChatGPT, Gemini, Copilot, Bard and others) may generate incorrect or fabricated archival references. These systems do not conduct research, verify sources, or cross-check information. They generate new content based on statistical patterns, and may therefore produce invented catalogue numbers, descriptions of documents, or even references to platforms that have never existed.

A specific risk is that generative AI tools always produce an answer, even when the historical sources are incomplete or silent. Because their purpose is to generate content, they cannot indicate that no information exists; instead, they will invent details that appear plausible but have no basis in the archival record.

If you received an archival reference or document description from an AI chatbot, please keep in mind that it may be inaccurate.

Reliable sources for information about the ICRC’s archival fonds include:

  • The official online catalogue, which enables you to browse inventories and contact the archivists for further guidance (see tutorial).
  • Published scholarly works that cite ICRC archival material accurately and offer valuable entry points into our fonds. These publications can be consulted in the ICRC Library.

If a reference cannot be found, this does not mean that the ICRC is withholding information. Various situations may explain this, including incomplete citations, documents preserved in other institutions, or, increasingly, AI-generated hallucinations. In such cases, you may need to look into the administrative history of the reference to determine whether it corresponds to a genuine archival source.

We remain committed to transparency, and our teams are available through the established channels to guide users through verified research steps.