NPDESTracker

MS4 Stormwater Software: What Small Cities Actually Need

What small Phase II MS4 programs actually need from stormwater compliance software. Why spreadsheets and PDFs break down, what to look for, and how a permit-aligned product replaces the ad hoc stack without overpromising.

Published May 5, 2026

Small Phase II MS4 programs run on a stack that almost no one designed on purpose. A few spreadsheets carry inspection counts. A shared drive carries inspection PDFs. An inbox carries IDDE complaints. A binder carries enforcement letters. A separate spreadsheet carries the post-construction BMP inventory. Once a year the coordinator opens a blank annual report template and rebuilds the year from those pieces. The work gets done, but the system around it is fragile, and every March it shows.

This post is about what small cities actually need from MS4 stormwater software, written from the perspective of the program running the work. It is not a feature list. It is the set of practical capabilities that make the work survive turnover, audits, and the next permit cycle.

Why spreadsheets and PDFs break down

The ad hoc stack is not stupid. It works for a while, especially in a small program with one coordinator who has been there long enough to know where everything is. The points where it stops working are predictable.

The annual report becomes a from-scratch project every year. The data exists, scattered across files and inboxes. Reassembling it into the report’s section structure takes hours of copy-paste, sorting, and reconciliation. The narrative sections end up written in freeform paragraphs, partly from memory.

Records drift away from the program. A construction inspection PDF lives in one folder. The deficient findings land in a follow-up spreadsheet. The closure letter lands in an email thread. By the time anyone wants to know whether the deficiency was actually closed, the chain is broken.

IDDE investigations lose their thread. A complaint comes in by phone, gets logged in an inbox, opens a screening visit, leads to source tracing across two streets, and resolves with a notice of violation. Six months later, when a state-agency reviewer asks about the incident, the chronology is reconstructed from memory and three places.

Cadence slips quietly. Catch basin inspections are due annually, post-construction BMPs on their own schedule, training on theirs. None of those schedules live in the spreadsheet. A coordinator who is busy enough has no warning that an inspection is sliding past due until well after it has.

Audit-defensibility is implicit. Records exist, but the answer to “who changed this entry on November 14, 2024, and why” is usually “I did, I think, because I remember.” That answer does not land well in an audit.

Turnover deletes institutional memory. When the coordinator leaves, the system leaves with them. The spreadsheet is still there. The mental model that made it work is not.

The pattern across all of these: the records exist, but the workflow around the records is held together by one person paying attention. That is a fragile foundation for permit compliance.

What the work actually consists of

Before talking about software, it helps to lay out what a small Phase II MS4 program is doing across the year. The six Minimum Control Measures cover:

  • Public education and outreach (MCM 1)
  • Public involvement and participation (MCM 2)
  • Illicit discharge detection and elimination, IDDE (MCM 3)
  • Construction site stormwater runoff control (MCM 4)
  • Post-construction stormwater management for new development and redevelopment (MCM 5)
  • Pollution prevention and good housekeeping for municipal operations (MCM 6)

The records that flow out of those programs include sites and projects, inspections, IDDE complaints and screenings, drainage structures, outfalls, BMP inventory, public outreach activities, public participation events, training records, and enforcement actions. The annual report is a view of all of those records over the reporting year, structured by MCM, with permit citations behind each line.

The software question is whether all of those pieces can live in one workflow that knows about the permit, or whether they stay scattered across tools that were not designed for permit alignment.

Sites, inspections, complaints, IDDE, activities, outfalls

Useful MS4 software organizes around the things the program actually tracks, not around generic “tickets” or “tasks.” The first practical capability test is whether the product knows the difference between a site, a structure, an inspection, and a complaint, and treats them as distinct first-class objects.

Sites and projects. A construction site has a project file, a SWPPP, a contractor of record, a permit-aligned set of inspections, and (eventually) a closure date. The product should hold all of that together, not split it across a project management tool and a spreadsheet.

Drainage structures and outfalls. Catch basins, manholes, vaults, outfalls. Each has a location, a maintenance and inspection cadence, and a history. The product should keep that history per structure, with a clear distinction between site-level work and structure-level work. We wrote about that distinction in the catch basin and drainage structure inspections post.

Inspections. Construction, post-construction BMP, structure, facility. Each kind has a structured form, a permit alignment, photo and GPS attachment, and follow-up logic for deficient findings. Mobile capture matters more than people expect, especially for inspectors working in the field with patchy signal. Detailed in the photo and GPS evidence post.

IDDE complaints and incidents. A complaint should open a single thread. Screening visits, sample readings, source tracing, enforcement actions, and closure all live on that thread, not in scattered files. The chronology should be obvious from the record itself. Detailed in the IDDE complaint-to-closure playbook.

BMP inventory. Every post-construction BMP a city accepts is a long-term obligation. The product should hold the BMP, its as-built reference, its inspection history, and its responsible party. Without that, BMPs get lost. Covered in tracking post-construction BMPs without losing them.

Public education and participation activities. MCM 1 and MCM 2 work needs activity records with audience, reach methodology, and dates. The annual report sections for these MCMs are usually the ones coordinators most dread reconstructing from email threads. Activity records should write themselves into the report.

Training and enforcement. Staff training records and enforcement actions are first-class records too. Both feed the annual report directly.

The boundary that matters: the product should treat each of these as a typed record with a known shape, not as a generic note attached to a generic task.

Annual report evidence

The deepest difference between a working MS4 software stack and an ad hoc one is what happens when the annual report opens. In the ad hoc stack, the report is a blank document, and the coordinator types numbers into it from memory, spreadsheets, and email. In a permit-aligned product, the report is a view of records that already exist, organized by the permit’s section structure, with each number traceable to the records that produced it.

A useful product does the following on the report side:

  • Counts come from records, not text fields. If a question asks for the number of construction inspections during the reporting year, the count is computed from the actual inspection records, not typed into a cell.
  • Evidence is linkable per question. Each annual report question knows what kinds of evidence support it, and a coordinator can link the supporting records with one click. Linked records are visible in the report itself.
  • Narrative drafts can start from linked evidence, not from a blank field. A drafting feature that reads from the linked evidence and produces a starting-point answer for the coordinator to edit is a meaningful time-saver. The coordinator stays in the loop. This is what Smart Draft does inside NPDESTracker, and the Smart Draft and AI disclosure explains exactly how it works and what it does not do.
  • Manual override leaves a trail. Sometimes a count needs adjusting. Every override should produce an audit-log entry with the user, the timestamp, the original value, the new value, and the reason. The override should not be invisible.
  • The output is print-friendly. The reporting workspace should produce a clean PDF-ready view of the full report, with municipality, year, period, and certification metadata. The product does not auto-submit to any state agency portal. The submission step stays with the human at the agency.
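Two of the behaviors above, counts computed from records and overrides that leave a trail, can be sketched in a few lines. This is a hedged illustration with hypothetical names, not the product’s API:

```python
from dataclasses import dataclass
from datetime import date, datetime

@dataclass
class Inspection:
    performed_on: date
    kind: str

def count_inspections(records, kind, year):
    """A report count derived from the records, never typed into a cell."""
    return sum(1 for r in records
               if r.kind == kind and r.performed_on.year == year)

audit_log = []

def override_count(question, computed, new_value, user, reason):
    """Every manual override produces a visible audit-log entry."""
    audit_log.append({
        "question": question,
        "user": user,
        "at": datetime.now().isoformat(),
        "was": computed,
        "now": new_value,
        "reason": reason,
    })
    return new_value

records = [
    Inspection(date(2025, 4, 2), "construction"),
    Inspection(date(2025, 9, 15), "construction"),
    Inspection(date(2024, 11, 1), "construction"),
]
computed = count_inspections(records, "construction", 2025)   # -> 2
final = override_count("MCM4.Q3", computed, 3, "coordinator",
                       "late paper inspection added after field sync")
```

The override is allowed, but it is never silent: the original value, the new value, the user, and the reason all survive in the log, which is what a reviewer will ask for.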

The detailed framing of how counts roll up across all six MCMs is on the annual reporting page. For Western Washington programs specifically, the WA Phase II Western 2024-2029 template is the first template wired in.

Audit readiness

Audit readiness is not a separate capability. It is what is left over when records, cadence, and audit logs are built into the workflow from the start. A program is audit-ready when:

  • A reviewer can ask about any number in the annual report and the underlying records are one click away.
  • A reviewer can ask “what changed on this BMP record on November 14, 2024,” and the audit log shows who changed what and when.
  • A reviewer can ask for a list of overdue inspections at the end of the year and the system can produce it without anyone scrambling.
  • Enforcement actions are linked to the deficiencies they closed, and IDDE incidents have a single chronological thread.
  • Training records are timestamped and attributed.

A program running on spreadsheets can pass an audit, but it usually does so because of the coordinator’s personal recall, not because the records make the case on their own. A program running on permit-aligned software should be able to defend itself from the records alone. We covered the practical side in how to make an MS4 program audit-ready.

What to look for in software

When a small Phase II program is evaluating MS4 stormwater software, the practical evaluation criteria look something like this:

Does it know the permit? A product that ships with a permit-aligned annual reporting template, the right MCM structure, expected evidence kinds, and citations is starting from a different place than a generic compliance product. Ask whether the report assembles automatically from the records and whether the section structure is the actual permit structure, not an adapted approximation.

Does the field workflow survive a real day in the field? Mobile capture, offline support, photo and GPS attachment, structured forms, deficient findings flowing into follow-up, and the same login working in the browser back at the desk. Ask the question from a working inspector’s perspective.

Does IDDE work as a single thread? Complaint to screening to source tracing to closure, on one record. Photos, samples, and notes anchored to that record.

Is there a real audit log? Every create, edit, and override logged with user, timestamp, and reason. The product should not need a separate audit module.

Is the annual report a view of records, not a separate document? Counts derived from records, not typed in. Evidence linkable per question. Manual override allowed but logged.

Is the data the program’s data? Tenant separation, exports in standard formats, account removal on customer request, no cross-agency record references. The agency owns its records.

Is the team accountable? A small team with direct contact between the buyer and the person building the product usually serves a small program better than an enterprise sales process does.

Does the security page tell the truth? A page that says what the product does have, plainly says what it does not, and does not claim certifications it does not hold is a much safer signal than a page full of vague compliance assertions.

Is AI scoped honestly? AI-assisted drafting can save real time on long-form questions. The software should be able to explain exactly when it calls an external AI provider, what data is sent, how to disable it, and what the human still has to review. AI features that submit, certify, or replace staff review on the annual report should be a hard no.

The cheaper version of all of this: ask the vendor to walk through the annual report assembling from sample data, in real time. If the workflow does not make sense in five minutes, it is not the workflow.

How NPDESTracker fits

NPDESTracker is purpose-built MS4 stormwater software for small and mid-sized Phase II MS4 permittees. It is not a generic asset-management product, not a generic permit-management product, and not a CMMS. It is a compliance documentation tool organized around the six Minimum Control Measures and the annual report.

Where it fits in the work:

  • Field inspections and IDDE. Mobile-friendly inspections, IDDE complaint threads from intake to closure, photo and GPS evidence, deficient findings flowing into follow-up. Detailed on the inspections page.
  • Annual reporting. Counts roll up from records. Evidence linkable per question. Override with audit log. Print-friendly preview. The first template wired in is WA Phase II Western 2024-2029. Detailed on the annual reporting page.
  • Smart Draft. An assistive drafting feature that turns linked evidence into a starting-point draft for an annual report answer. Default mode is template-based and deterministic. Optional external AI mode is opt-in and operator-controlled. Staff review every saved answer. Full disclosure on the Smart Draft and AI page.
  • Tenant-scoped, auditable, browser-based. Per-agency tenant separation at the query layer, multi-factor authentication supported for administrators, audit logs across compliance records. Detailed on the security page.
  • Honest scope. No invented compliance scores. No traffic-light theatre. No claims of certifications the product does not hold.

What NPDESTracker is not: it is not a GIS platform, not a CAD system, not a hydraulic model, and not a billing system. It can read and write standard GIS file formats so it coexists with ArcGIS and QGIS, and that boundary is covered in MS4 software vs ArcGIS.

The simplest way to evaluate the fit is the interactive demo. Browse-only with sample data, no signup, no call. Open the inspections workflow, the IDDE thread, the annual report view, and the Smart Draft run side by side and see whether the workflow matches how the program actually runs.

For Phase II programs that decide to move past the spreadsheet stack, pricing covers the Founder Pilot at $2,500 and the Standard Pilot at $4,900 for a 90-day evaluation, and the Annual Platform from $13,000 a year for the full deployment. The Annual Platform is the place where Smart Draft, audit-defensible records, and the WA Phase II template are designed to operate together.

See it run.

Open the demo with sample data. Browse-only, no signup, no call.