
Compliance & Audit

What auditors commonly ask for (and how to be ready)

An inspection is rarely about whether you did the work. It's about whether you can prove it. The records that typically matter, and how to make them easy to retrieve.

Trained Team Editorial
22 March 2026 · 10 min read

An inspection is rarely about whether you did the work. It is almost always about whether you can prove you did the work. Two teams can run identical operations; the one that can produce the right records in thirty seconds has a very different morning to the one searching shared drives and texting colleagues.

This is a practical look at what inspectors, auditors, and tribunals commonly ask for, organised by the records that typically matter. It is not a legal guide. Your sector, jurisdiction, and size dictate the specifics. What follows is the shape of the questions, based on patterns common across HSE inspections, CQC visits, ICO engagements, and employment tribunal evidence requests.

Training records

The commonest first request. “Show me your training records for this topic for this person.” Underneath that single sentence are four distinct pieces of evidence.

Who was trained. Named individuals. Not “the team” or “all kitchen staff.” Where someone is named in an incident or a complaint, the inspector will ask for that specific person's record. If your training records are held at team level and not at named-person level, expect this to surface quickly.

When they were trained. Dates. Specifically, dates that match your renewal policy. If your policy says fire safety is refreshed annually and the record shows the last training was fourteen months ago, that is an immediate finding regardless of anything else.

On what content. Version matters here. Training records that reference a training module without being able to show what was in that module at the time of training are weaker than records that include the version number and, ideally, a copy or link to the version-specific content. Content changes; an inspector may want to know what someone was trained on, not what that module says today.

How you know they engaged. Some form of evidence of completion beyond “the system says they finished.” A knowledge check with a passing score. An acknowledgment signed against a specific version. A competency observation. The nature of the evidence varies by sector; its presence almost always matters.

A well-maintained training record for a single staff member should be retrievable in under a minute and should show all four of these elements at a glance. If assembling it takes an hour and involves three systems, that is a signal to fix before the next inspection, not during it.
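The four elements above can be sketched as a simple record shape with an overdue check against the renewal policy. This is a minimal illustration, not any particular system's schema; the field names and the `is_overdue` helper are hypothetical.

```python
from dataclasses import dataclass
from datetime import date, timedelta

# Hypothetical shape for a single training record. Field names are
# illustrative only; real systems will differ.
@dataclass
class TrainingRecord:
    person: str          # who: a named individual, not a team
    topic: str
    trained_on: date     # when: the completion date
    module_version: str  # on what: content version at time of training
    evidence: str        # how you know they engaged, e.g. "knowledge check, 90%"

def is_overdue(record: TrainingRecord, renewal_days: int, today: date) -> bool:
    """Flag a record whose training date falls outside the renewal policy."""
    return today - record.trained_on > timedelta(days=renewal_days)

rec = TrainingRecord("A. Example", "Fire safety",
                     trained_on=date(2025, 1, 10),
                     module_version="2.4",
                     evidence="knowledge check, 90%")

# Annual policy, checked fourteen months later: an immediate finding.
print(is_overdue(rec, renewal_days=365, today=date(2026, 3, 10)))  # True
```

The point of holding all four fields on one record is the retrievability test in the paragraph above: one lookup per person, everything at a glance.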

Acknowledgment records

Distinct from training records. An acknowledgment is evidence that a named person saw and accepted a specific document - a policy, a procedure update, a code of conduct, a safety briefing.

What typically constitutes a defensible acknowledgment.

  • Identity. The named individual, usually via authenticated login, not an email field anyone could type into.
  • Document and version. The exact version of the document being acknowledged, not just the title. Policies evolve. Acknowledging version 3.1 and acknowledging version 4.2 are different facts.
  • Timestamp. Server-side, not client-side. Precise to the minute.
  • IP address or device indicator. A weaker signal than the others, but often requested as a belt-and-braces measure, particularly in data protection and HR contexts.
  • Immutability. The record cannot be edited after the fact. This is the quiet requirement that rules out a spreadsheet listing acknowledgments.

Paper sign-off sheets are not inherently weak, but they are much harder to produce on request and much easier to lose. Digital e-signature records with the five elements above are typically the path of least resistance and hold up well under scrutiny.
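As a sketch of the five elements together, a frozen (read-only) record makes the immutability requirement concrete: once created, the acknowledgment cannot be edited in place. The class and field names are illustrative assumptions; production systems enforce immutability at the storage layer, not just in application code.

```python
from dataclasses import dataclass, FrozenInstanceError
from datetime import datetime, timezone

# Illustrative sketch of a defensible acknowledgment record.
# frozen=True rejects edits after creation, mirroring the immutability
# requirement that rules out an editable spreadsheet.
@dataclass(frozen=True)
class Acknowledgment:
    person_id: str       # identity: authenticated user, not a free-text email
    document: str
    version: str         # the exact version acknowledged, e.g. "4.2"
    signed_at: datetime  # server-side timestamp, minute precision or better
    client_ip: str       # weaker belt-and-braces signal

ack = Acknowledgment("user-1042", "Code of Conduct", "4.2",
                     datetime(2026, 3, 1, 9, 15, tzinfo=timezone.utc),
                     "203.0.113.7")

try:
    ack.version = "3.1"  # editing after the fact is rejected
except FrozenInstanceError:
    print("record is immutable")
```

Note that acknowledging version 3.1 and acknowledging version 4.2 would be two separate records here, matching the point above that they are different facts.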

Incident records

When something goes wrong, the question an auditor or inspector will ask is almost never “did this happen?” It is “what did you do about it?”

A complete incident record typically has four components.

What happened. A factual account of the event, written promptly, with dates, times, named individuals where relevant, and physical detail. Written in the voice of the person closest to it, not sanitised through three levels of management review.

The immediate response. What was done in the moments and hours after. Who was informed, what was stopped, what was contained.

The investigation. Who looked into it, what they found, what root cause or contributory factors were identified. This is the section most weak incident records underdevelop. “Human error” as a sole root cause is rarely accepted.

The corrective action. Specific, dated actions taken to reduce the chance of recurrence. Updated procedures, targeted retraining, equipment changes, process redesign. Crucially, evidence that those actions were actually completed, not just proposed.

The gap most organisations have is the loop between the incident and the corrective action. A retraining note in a meeting minutes document does not constitute evidence of retraining. A signed acknowledgment from named staff on a revised procedure, dated after the incident, does. The inspector is looking for the loop to close, visibly, in the record.

Competency evidence

In regulated industries - healthcare, food production, financial services, care, parts of manufacturing - training and competency are different records and are examined separately.

Training evidence says: this person received instruction. Competency evidence says: this person can do the task. Conflating the two creates weak records.

Typical competency evidence forms:

  • Knowledge checks with non-trivial pass criteria. Tests that probe judgment, not memory. First-attempt pass rate, median attempts to pass, and questions with anomalous failure patterns are all data points inspectors may ask about.
  • Observation logs. A named assessor watched a named staff member perform the task on a specific date, documented the outcome, and signed the record.
  • Supervision records. Regular check-ins where a supervisor confirmed the staff member's practice remained competent, including any development areas identified.
  • Task completion records. Evidence that the person has actually done the task a meaningful number of times with acceptable outcomes.

The weakest competency records are those where observation is recorded as “yes, competent” with no supporting detail. The strongest include a short structured note on what was observed, what went well, and what if anything needed reinforcement. The middle ground - a couple of lines of specific observation - is both easy to produce in the moment and materially stronger than a bare checkbox.

Version history and change records

Procedures and policies change. Auditors routinely ask for version history, particularly when an incident or complaint refers to behaviour against a procedure that has since been updated.

A usable version history shows, for each meaningful change: what changed, who approved the change, when it was effective, and who acknowledged the new version. Four pieces of information. A shared drive with files named “Procedure_v3_final_FINAL_2024” is not a version history. It is a filing system.

Two common weak spots here. First, the gap between approval and roll-out - a new procedure approved in January but not communicated to the frontline until April is a vulnerability. Second, the lack of linkage between training and the specific version someone was trained on, which we covered under training records.
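The four pieces of information per change, plus the roll-out gap check, can be sketched as follows. The structure and names are hypothetical; the useful part is that once acknowledgments are linked to versions, finding staff who have not yet signed off on the latest version is a one-line query.

```python
from dataclasses import dataclass, field
from datetime import date

# Hypothetical row in a usable version history: what changed, who
# approved it, when it was effective, and who acknowledged it.
@dataclass
class VersionEntry:
    version: str
    summary: str                  # what changed
    approved_by: str              # who approved the change
    effective_from: date          # when it was effective
    acknowledged_by: list[str] = field(default_factory=list)

history = [
    VersionEntry("3.1", "Clarified cleaning frequency", "J. Owner",
                 date(2025, 11, 3), ["A. Example", "B. Example"]),
    VersionEntry("4.2", "New allergen handling step", "J. Owner",
                 date(2026, 2, 10), ["A. Example"]),
]

# Roll-out gap check: anyone who acknowledged an earlier version
# but has not yet acknowledged the latest one.
latest = history[-1]
lagging = set(history[0].acknowledged_by) - set(latest.acknowledged_by)
print(sorted(lagging))  # ['B. Example']
```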

Principles for keeping evidence retrievable

The common thread across everything above is retrievability. Records that exist somewhere but cannot be produced quickly are a real liability. Four principles help.

Searchability. You can find all records for a named person, or all records for a specific procedure, or all records from a specific date range, in a single system with one query. If that requires stitching data from three places, your system has a retrieval problem waiting to surface.

Ownership. Every category of record has a named owner responsible for its accuracy and currency. Not a department, a person. When the person leaves, the ownership transfers explicitly, not implicitly.

Review cycles. Policies, procedures, and training modules have explicit review dates that are not optional. Expired items surface automatically. A document library without review dates is one that silently rots.

Immutability where it matters. Training records, acknowledgments, and incident records should not be editable after the fact. Corrections should be additive (an amendment with its own timestamp), not destructive. This is a regulatory expectation in several sectors and a good default everywhere.
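Additive correction is easy to picture as an append-only log: the original entry is never rewritten, and a correction arrives as a new entry with its own timestamp pointing back at what it amends. A minimal sketch, with illustrative field names:

```python
from datetime import datetime, timezone

# Append-only log: corrections are added, never edited in place.
log = [
    {"entry": "Fire safety training completed, module v2.3",
     "recorded_at": datetime(2026, 1, 5, 14, 0, tzinfo=timezone.utc)},
]

def amend(log: list, correction: str, amends_index: int) -> None:
    """Append a correction instead of rewriting history."""
    log.append({"entry": correction,
                "recorded_at": datetime.now(timezone.utc),
                "amends": amends_index})

amend(log, "Correction: module version was v2.4, not v2.3", amends_index=0)

# The original entry survives alongside the amendment.
print(len(log))  # 2
```

An auditor reading this log sees both the original entry and the correction, each dated, which is exactly the trail a silently edited record destroys.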

How to test your own readiness

A small exercise that will tell you more than a week of policy review. Pick a named staff member at random. Set a timer. Produce, in order:

  1. Their complete current training record with dates and versions.
  2. Their signed acknowledgments for the last three policy updates.
  3. Their competency evidence for the most safety-critical task they perform.
  4. The current version of the procedure governing that task.
  5. The version history of that procedure for the last twelve months.

If you can produce all five in under five minutes, you are in good shape. Most organisations take twenty minutes or longer, and some simply cannot produce items three and four at all. Where the time goes tells you exactly where the weakness is.

The underlying shift

Being audit-ready is not a state you achieve in the week before the inspector arrives. It is a shape your documentation takes when the underlying work is organised for retrieval rather than for storage. Most organisations have more evidence than they realise; they just cannot get to it.

The operational version of this is straightforward. Hold records where you do the work, not in a separate compliance silo. Attach evidence to tasks at the moment the task is completed, not in a sweep-up exercise. Keep ownership named and reviews scheduled. Those three habits make most inspections boring, which is the outcome you want.

If you are putting this shape together from scratch, the compliance solution page outlines how the record types above typically fit together in a single working system.
