Most MS4 program audits go badly for the same reason: the program is run on records that aren’t designed to be defended. Inspections live in one spreadsheet, training in another, IDDE in a shared drive folder, BMP inventory in someone’s email outbox. When the auditor asks for a chronology, a coordinator spends two days assembling one — and the assembled chronology has gaps the original records didn’t.
Audit-readiness isn’t a project you do before an audit. It’s a property of how you keep records day-to-day. This post is a practical guide to getting there without rewriting everything.
What auditors actually look for
The high-frequency audit questions, in roughly the order they tend to come up:
- Show me your inspection records for [program] in [time period]. Not a summary. The actual records — date, location, inspector, findings, photos, follow-up.
- For deficient findings, show me the resolution. Each deficient finding should have a re-inspection record, a resolution note, or an escalation to enforcement.
- For your annual report’s [number], show me the underlying records. The report says 47 construction inspections — show the 47 inspections.
- For this enforcement action, show me the chronology. Initial finding through final closure, including all communication and re-inspections.
- Show me your BMP inventory and verify each one has been inspected. The inventory should be current; each item should have a recent inspection.
- Show me your training records. Per-employee, by date, with topics covered and attestation.
What these have in common: every question is “show me the record.” None of them are “tell me about your program.” A program that can produce records on demand is audit-ready. One that has to assemble them in advance isn’t.
The spreadsheet trap
Spreadsheets can run a small MS4 program. The trap isn’t the spreadsheet itself; it’s three patterns that emerge over time.
One spreadsheet per topic, no cross-references. Inspections in one tab, BMPs in another, outfalls in a third. The inspector logs findings against site names that don’t quite match the BMP inventory, so reconciliation requires a human to recognize that “Boat Yard” and “Tacoma Boat Yard” are the same site.
Manual report fields disconnected from underlying data. The annual report has cells where the coordinator types totals. Those totals are computed from spreadsheets in another file. Updating one doesn’t update the other. By the time the report is signed off, the source spreadsheet has drifted.
One person owns the institutional memory. Knowledge of which inspections happened, why a site is on a watch list, what happened to the violation in 2021 — it lives in the head of one coordinator. When that person changes roles, the program loses its audit narrative.
You don’t escape these patterns by being more careful with spreadsheets. You escape them by structuring records so the audit story assembles itself.
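The cross-reference failure in the first pattern is easy to demonstrate. A minimal sketch (the site names and IDs are hypothetical, not a prescribed schema): grouping inspections by free-text site name splits one site into two, while grouping by a shared site ID reconciles cleanly.

```python
from collections import Counter

# Hypothetical inspection rows, as an inspector might type the site names
inspections = [
    {"site_name": "Boat Yard",        "site_id": "SITE-014"},
    {"site_name": "Tacoma Boat Yard", "site_id": "SITE-014"},
    {"site_name": "Elm St Outfall",   "site_id": "SITE-022"},
]

# Reconciling on free text: the same site appears as two different sites
by_name = Counter(r["site_name"] for r in inspections)
print(by_name)  # 'Boat Yard' and 'Tacoma Boat Yard' are counted separately

# Reconciling on a structured site reference: counts line up without a human
by_id = Counter(r["site_id"] for r in inspections)
print(by_id)    # SITE-014 has 2 inspections, SITE-022 has 1
```

The human who knows the two names are the same site is doing work a shared ID would do for free.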
What audit-defensible records actually look like
Each compliance event should carry:
- A timestamp with both date and (where relevant) time
- A named user who entered the record
- A location reference that links to a specific site, outfall, or BMP — not free text
- Findings or actions in a structured format that can be queried
- Evidence attachments (photos, screening results, signed forms) on the record itself
- A status (open, in progress, closed) that updates as work proceeds
- A trail of changes — when something was modified, by whom, and what changed
The last bullet is the one most spreadsheets fail. A spreadsheet doesn’t remember that Cell B47 used to say “12” and now says “14” because of an import correction. An audit log does.
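As a concrete sketch, here is one way a record shaped like this might look in code. This is illustrative only, assuming a simple in-memory model; the field names (`site_id`, `status`, `changes`) are hypothetical, not a required schema.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class ChangeEntry:
    """One line of the audit trail: who changed what, when, and why."""
    field_name: str
    old_value: object
    new_value: object
    user: str
    at: datetime
    reason: str

@dataclass
class ComplianceRecord:
    entered_by: str                                  # named user, not anonymous
    entered_at: datetime                             # full timestamp
    site_id: str                                     # structured link to a site, outfall, or BMP
    findings: list = field(default_factory=list)     # queryable structure, not free text
    attachments: list = field(default_factory=list)  # photos, screening results, signed forms
    status: str = "open"                             # open / in progress / closed
    changes: list = field(default_factory=list)      # the audit trail

    def update(self, field_name, new_value, user, reason):
        """Every modification leaves a change entry instead of overwriting silently."""
        old = getattr(self, field_name)
        setattr(self, field_name, new_value)
        self.changes.append(ChangeEntry(field_name, old, new_value, user,
                                        datetime.now(timezone.utc), reason))

rec = ComplianceRecord(entered_by="j.alvarez",
                       entered_at=datetime.now(timezone.utc),
                       site_id="SITE-014")
rec.update("status", "closed", "j.alvarez", "re-inspection passed; deficiency resolved")
```

The point is the `update` method: the record never forgets that its status used to be "open", or who closed it and why.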
Counts derived from records, not typed in
The annual report’s MCM section says “47 construction inspections this year.” That number should be derived from the inspection log — when you count records that match the criteria, you should get 47. If you can’t run that query and arrive at the same number, your report has drifted from your data.
Two failure modes to recognize:
- Optimistic drift: the report says 47 because that's what you remembered. The records show 41. The audit finds the six-inspection gap.
Both are bad. The fix is the same: derive the count from the records, every time, automatically. Don’t type it.
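In code terms, the report number becomes a query over the inspection log rather than a typed constant. A hedged sketch, assuming each record carries a program tag and a date (both field names are illustrative):

```python
from datetime import date

def count_inspections(records, program, year):
    """Derive the report count from the records that back it."""
    return sum(
        1 for r in records
        if r["program"] == program and r["date"].year == year
    )

records = [
    {"program": "construction", "date": date(2023, 3, 14)},
    {"program": "construction", "date": date(2023, 8, 2)},
    {"program": "IDDE",         "date": date(2023, 5, 9)},
]

# The annual report cell is this function call, not a number someone typed
print(count_inspections(records, "construction", 2023))  # 2
```

If the query and the report disagree, you find out the day it happens, not the day of the audit.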
The override problem
Sometimes a count needs adjustment. A duplicate slipped in. A record was logged in the wrong year. An inspection happened but didn’t get entered until weeks later. Manual adjustments will happen — the audit-readiness question is whether those adjustments are documented.
In a spreadsheet, an override is a typed-over cell. There’s no record of what changed or why. In a defensible system, an override is a deliberate action that produces a one-line audit-log entry: previous value, new value, user, timestamp, reason.
The audit-log entry isn’t bureaucracy. It’s the answer to “show me what changed and why.” A program that can produce that for any number in the report is one an auditor closes the file on quickly.
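An override can be sketched the same way: a deliberate function that refuses to change a value without a user and a reason, and whose only side effect beyond the change itself is the log entry. The function and field names here are hypothetical.

```python
from datetime import datetime, timezone

audit_log = []

def override(record, field_name, new_value, user, reason):
    """Adjust a value deliberately; the log entry is the point, not overhead."""
    if not reason.strip():
        raise ValueError("an override without a reason is just a typed-over cell")
    entry = {
        "previous": record[field_name],
        "new": new_value,
        "user": user,
        "at": datetime.now(timezone.utc).isoformat(),
        "reason": reason,
    }
    record[field_name] = new_value
    audit_log.append(entry)
    return entry

counts = {"construction_inspections": 12}
override(counts, "construction_inspections", 14, "j.alvarez",
         "import correction: two inspections logged under wrong year")
# audit_log now answers "show me what changed and why" for this number
```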
Readiness as a continuous view
If audit-readiness is a property of day-to-day records, then there should be a way to see it day-to-day. A program-level readiness view — what’s complete, what’s overdue, what’s flagged for review, what hasn’t been started — running across all six MCMs.
The view doesn’t have to be fancy. It does have to:
- Be live, not a snapshot generated in advance
- Cover the whole program, not one MCM at a time
- Surface what’s actionable (overdue inspections, unresolved deficiencies, incomplete records)
- Avoid invented metrics (no “92% compliant” calculated from fields no one filled in)
If the readiness view is a manufactured percentage, it’s not telling the truth. If it’s the actual state of your records, it’s the audit dress rehearsal you can run any week.
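A readiness view along these lines can be a live query over the records themselves. A sketch under the same assumptions as above (each record carries an MCM tag, a status, and a due date; the MCM names are illustrative):

```python
from datetime import date

def readiness(records, today):
    """Roll up actionable state per MCM from live records; no stored percentages."""
    view = {}
    for r in records:
        mcm = view.setdefault(r["mcm"], {"complete": 0, "overdue": 0, "open": 0})
        if r["status"] == "closed":
            mcm["complete"] += 1
        elif r["due"] < today:
            mcm["overdue"] += 1   # actionable: surface it, don't average it away
        else:
            mcm["open"] += 1
    return view

records = [
    {"mcm": "construction", "status": "closed", "due": date(2024, 3, 1)},
    {"mcm": "construction", "status": "open",   "due": date(2024, 2, 1)},
    {"mcm": "IDDE",         "status": "open",   "due": date(2024, 9, 1)},
]

print(readiness(records, today=date(2024, 6, 1)))
# construction: 1 complete, 1 overdue; IDDE: 1 still open
```

Nothing here is a snapshot or an invented percentage; rerunning the function any week gives the actual state of the records that week.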
How software fits
Tools like NPDESTracker structure inspection, IDDE, BMP, and enforcement records so they reconcile to the annual report automatically — counts derived, overrides logged, readiness rollup live across all six MCMs. The full reporting story is on the reporting page; the inspection workflow is detailed on the inspections page.
The point of the tool is to remove audit-readiness from year-end and make it a property of how records are kept day-to-day. That’s the only sustainable version.