NPDESTracker

NPDES Reporting Software: What MS4 Programs Need Before Annual Reports Are Due

Learn what NPDES reporting software should do for MS4 programs: inspections, IDDE, public education, outfalls, evidence linking, audit logs, and annual report preparation.

Published May 5, 2026

Most small Phase II MS4 programs do the work all year and rebuild the annual report in March. The inspections happened, the IDDE complaints were investigated, the outreach events ran, the BMPs got walked, the training was logged. But the records that prove it live across a few spreadsheets, a few hundred inspection PDFs, an inbox of complaints, a binder of enforcement letters, and a shared drive nobody is sure who curates. The work was real. Reassembling it into a permit-aligned annual report is the part that breaks every year.

NPDES reporting software exists to close that gap. The point is not to build a separate report at the end of the year. The point is to connect the annual report answers to the records the program already collects through the year, and to keep the audit log honest about what was logged, when, and by whom. This post is about what NPDES reporting software should actually do for an MS4 program, where the hard parts are, and how a permit-aligned product fits into the work without overpromising.

What NPDES reporting software should track

A useful product holds the records a small MS4 program is generating across all six Minimum Control Measures, plus the cross-cutting records the annual report eventually rolls up. Concretely:

  • Inspections. Construction site inspections, post-construction BMP inspections, structure inspections (catch basins, manholes, vaults, outfalls), and municipal facility inspections. Each kind gets its own structured form, permit alignment, follow-up logic for deficient findings, and a clear distinction between site-level and structure-level work.
  • IDDE complaints and investigations. Complaint intake, screening visits, sample readings, source tracing, enforcement, and closure on a single thread per incident, not scattered across files.
  • Outfalls. A live inventory with inspection history, dry-weather screening results, and any IDDE incidents linked back to the affected outfall.
  • Public education and outreach (MCM 1). Activity records with audience, topic, materials distributed, reach methodology, and dates that the annual report’s MCM 1 section can pull from directly.
  • Public participation (MCM 2). Public meetings, comment periods, volunteer events, and agenda or minutes documentation.
  • Operations and maintenance, good housekeeping (MCM 6). Facility inspections, spill response, fleet operations, salt and fertilizer application records, and routine maintenance logs.
  • Construction and source control inspections (MCM 4). Per-site inspection cadence, deficient findings, and inspector follow-through.
  • Enforcement and corrective actions. Notices of violation, compliance schedules, and corrective-action records, linkable back to the deficiencies they closed.
  • Training and documentation. Staff training records (date, audience, topic, materials), with attribution and timestamping.

The boundary that matters: each of these should be a typed record with a known shape, not a generic note attached to a generic task. The full inspection workflow is detailed on the inspections page, and the cross-program rollup is on the annual reporting page.

Why annual reports are hard

Most of the difficulty in MS4 annual reporting is not the writing. It is the reassembly. A few patterns that show up in nearly every small Phase II program:

  • Counts live in separate places. The inspection count is in one spreadsheet, the IDDE incident count is in an email folder, the outreach event count is on someone’s calendar. The annual report asks for those numbers in a single section structure, and the coordinator walks five sources to produce the right cell.
  • Evidence is disconnected from the answer. A question on the report needs to be supported by inspection records, BMP records, training logs, or enforcement actions. In an ad hoc stack, the evidence is real but the link from “this answer” to “those records” is held in the coordinator’s head.
  • Narrative answers are written from memory. Long-form questions about how a program ran a campaign, investigated an illicit discharge, or maintained municipal facilities tend to be written from recall, weeks or months after the work happened, with no easy way to cite the underlying records inline.
  • Staff turnover creates risk. When the coordinator who knew where everything was leaves, the spreadsheet is still there, but the mental model that made it work isn’t. The next coordinator starts from a partial picture.
  • Report deadlines compress review time. By the time the report is drafted, supervisors and city attorneys have a narrow review window. Anything that pushes drafting later, like reassembling counts from scratch, eats into review.

These are not failures of the program. They are what happens when the records and the report live in different systems. NPDES reporting software should remove the gap, not paper over it.

What good NPDES reporting software should do

The capability test for “is this product actually built for NPDES reporting” is whether the annual report opens as a view of records that already exist, or as a blank document waiting for the coordinator to type counts into it. The capabilities that distinguish a permit-aligned product from a generic compliance tool tend to look like this:

  • Organize records by permit section. The product knows the permit’s MCM structure and the question prompts inside each section. The user does not assemble a template from scratch each year.
  • Link evidence to each annual report question. Each question knows what kinds of evidence support it. A coordinator can attach the supporting inspection, IDDE incident, BMP record, training log, or enforcement action with one click. Linked evidence is visible in the report itself.
  • Derive counts from records. When the report asks how many construction inspections happened during the reporting year, the count is computed from the actual inspection records. It is not typed in.
  • Support manual overrides with audit logs. Sometimes a count needs adjusting. A record was logged in the wrong year, an inspection happened but never got entered, a duplicate slipped through. Every override should produce a one-line audit-log entry with the user, timestamp, original value, new value, and reason.
  • Show missing evidence. A readiness view should surface what is complete, what is overdue, and what is flagged for review across all six MCMs, before the deadline rather than after.
  • Produce a print- and export-ready report. A clean, print-friendly view of the full report (municipality, year, period, certification metadata, every section in template order, every saved answer, every linked evidence summary) that can be saved as a PDF or attached to an email.
  • Keep staff in control of review. Every saved answer is a human action. The product does not auto-submit reports to a state agency portal, does not certify the report on its own, and does not lock or unlock reporting years without an explicit user step.
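The derive-then-override pattern described above can be sketched in a few lines. Names and shapes here are hypothetical, not NPDESTracker's API:

```python
from dataclasses import dataclass
from datetime import datetime, date

@dataclass
class Inspection:
    site_id: str
    inspected_on: date

def derived_count(inspections: list[Inspection], start: date, end: date) -> int:
    # The reported number is computed from the records, never typed in.
    return sum(1 for i in inspections if start <= i.inspected_on <= end)

@dataclass
class OverrideEntry:
    user: str
    at: datetime
    original: int
    new: int
    reason: str

audit_log: list[OverrideEntry] = []

def override_count(user: str, original: int, new: int, reason: str) -> int:
    # Every manual adjustment leaves a one-line, attributable trail.
    audit_log.append(OverrideEntry(user, datetime.now(), original, new, reason))
    return new
```

The derived count is the default; the override exists for the real-world cases (wrong year, never entered, duplicate), and the audit log keeps both values and the stated reason.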

The point of all of this: the report should be a review of records, not a rebuild. We covered the hard parts of audit-readiness in how to make an MS4 program audit-ready and the full evaluation framework in the MS4 stormwater software guide.

Smart Draft from linked evidence

Even when counts come from records, the long-form narrative questions on an annual report are still written by hand. A drafting feature that can read from the linked evidence and produce a starting-point draft for the coordinator to edit is a meaningful time-saver, but only if it is scoped honestly.

Inside NPDESTracker, that drafting feature is called Smart Draft. It is assistive, not authoritative. The coordinator reads the draft, edits the parts that need judgment, and saves the answer. The audit log captures the action against the user who saved it, regardless of how the draft was produced. The draft is never saved as the answer until a human uses it and saves it.

A few details worth stating clearly for procurement and IT review:

  • By default, Smart Draft is template-based and deterministic. It assembles the draft from the linked evidence and a set of question-aware templates inside NPDESTracker. The same inputs produce the same draft.
  • The default mode does not contact an external AI provider. No customer records leave the tenant in the default mode.
  • An optional external AI mode is opt-in and operator-controlled. When configured, Smart Draft can send a limited, field-whitelisted slice of linked record context to a configured provider. The optional mode is off by default and can be disabled at any time.
  • The draft is not saved until a human uses and saves it. Smart Draft does not submit reports, does not certify reports, does not lock reporting years, and does not replace staff review.
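"Template-based and deterministic" has a simple shape: the draft is a pure function of the question template and the linked evidence. A schematic sketch (illustrative only, not Smart Draft's internals):

```python
def smart_draft(template: str, evidence: list[dict]) -> str:
    # Deterministic: the same linked evidence always yields the same
    # draft, assembled locally with no calls outside the tenant.
    lines = [template.format(count=len(evidence))]
    for item in sorted(evidence, key=lambda e: e["date"]):
        lines.append(f"- {item['date']}: {item['summary']}")
    return "\n".join(lines)
```

Because the function has no hidden inputs, two runs over the same linked records produce byte-identical drafts, which is what makes the default mode reviewable and IT-friendly.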

A full description of how Smart Draft works is on the Smart Draft product page, and the full disclosure of what is and is not sent in optional external AI mode is on the Smart Draft and AI page.

MS4 vs generic compliance tools

There is a meaningful difference between software built around MS4 stormwater records and the annual reporting cycle, and software repurposed from a generic compliance or task-management product. The difference shows up in the moments where the work meets the permit.

A generic task manager can hold inspection tasks. It cannot, on its own, know that a question on the WA Phase II annual report’s MCM 4 section expects a count of construction inspections during the reporting year, drawn from inspection records of a specific kind, with specific evidence linking. That alignment has to be built. A product that ships with that alignment included starts in a very different place than one that requires the agency to configure it.

NPDESTracker is built around MS4 stormwater records and annual reporting. The first complete annual report template wired in is the Western Washington Phase II 2024-2029 template. Adjacent state programs (Oregon Phase II, Idaho Phase II, Colorado Phase II) are covered in the permit pages.

What NPDESTracker is not: it is not a GIS platform. It can read and write standard GIS file formats so it coexists with ArcGIS Online, ArcGIS Enterprise, or QGIS. The boundary between purpose-built MS4 software and a GIS platform is covered in MS4 software vs ArcGIS: why most municipal programs need both.

How NPDESTracker fits

For a small Phase II program evaluating NPDES reporting software, the practical fit comes down to whether the product holds the records that feed the annual report, in shapes that match the permit, with audit-defensible records and honest scope.

Where NPDESTracker fits in the work:

  • Annual reporting. Counts roll up from records. Evidence linkable per question. Manual override with audit log. Print-friendly preview. The first template wired in is WA Phase II Western 2024-2029. Detailed on the annual reporting page.
  • Inspections. Mobile-friendly construction, post-construction BMP, structure, and facility inspections, with photo and GPS attachment, deficient-finding follow-through, and a clear site-vs-structure distinction. Detailed on the inspections page.
  • Smart Draft. An assistive drafting feature inside the annual reporting workspace. Default mode is template-based and deterministic. Optional external AI mode is opt-in and operator-controlled. Staff review every saved answer. Detailed on the Smart Draft page and the Smart Draft and AI disclosure.
  • Tenant-scoped, auditable, browser-based. Per-agency tenant separation at the query layer, multi-factor authentication supported for administrators, audit logs across compliance records. Detailed on the security page.
  • Honest scope. No invented compliance scores. No certifications claimed that are not held. NPDESTracker does not submit annual reports to Ecology, EPA, or any state agency on a customer’s behalf. The submission step stays with the human at the agency, on the agency’s submission channel.
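"Tenant separation at the query layer" generally means that records belonging to other agencies are filtered out before any other query logic runs. A schematic sketch of the pattern (hypothetical helper names, not NPDESTracker's internals):

```python
def tenant_scoped(query_fn):
    # Every query helper receives the tenant id first; rows belonging to
    # other tenants are excluded before the wrapped logic ever sees them.
    def wrapper(tenant_id: str, records: list[dict], *args, **kwargs):
        own = [r for r in records if r.get("tenant_id") == tenant_id]
        return query_fn(own, *args, **kwargs)
    return wrapper

@tenant_scoped
def list_inspections(records: list[dict]) -> list[dict]:
    return records
```

Enforcing the filter in one place, rather than in each individual query, is what makes the separation auditable.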

The simplest way to evaluate the fit is the interactive demo. Browse-only with sample data, no signup, no call. Open the inspections workflow, the IDDE thread, the annual report view, and a Smart Draft run side by side and see whether the workflow matches how the program actually runs.

For Phase II programs ready to move past the spreadsheet and PDF stack, pricing covers the Founder Pilot at $2,500 and Standard Pilot at $4,900 for a 90-day evaluation, and the Annual Platform from $13,000 a year for the full deployment. The Annual Platform is the place where evidence-linked annual reporting, audit-defensible records, Smart Draft, and the WA Phase II template are designed to operate together.

See it run.

Open the demo with sample data. Browse-only, no signup, no call.