2024 – Now
3 min read

Water Restrictions API

A public .NET Web API serving structured water restriction data for New Zealand, with an automated AI-powered scraping pipeline that keeps the data up to date.


Water restriction data in New Zealand is decentralised — each council and water entity publishes its own information in its own format, with no standard or central source. The Water Restrictions API provides a single machine-readable interface: a structured REST API covering areas, restriction stages, stage assignment history, scheduled upcoming changes, and the organisations that manage them. It’s the data layer behind the Water Restrictions Website.

API overview

Here’s a simple example — fetching the current restrictions for Upper Hutt, managed by Wellington Water:

GET https://api.waterrestrictions.nz/v1/areas/73dac0b7-a246-4298-aa6f-d80cf23e93c2

This returns information about the area, including the current stage assignment if one is active:

{
  "id": "73dac0b7-a246-4298-aa6f-d80cf23e93c2",
  "name": "Upper Hutt",
  "scheme": {
    "id": "b3e4bf6d-c613-4ba6-bf4a-ff9e3bea15d4",
    "name": "Greater Wellington Region",
    "description": null
  },
  "currentStageAssignment": {
    "id": "9f2a1c3b-4d7e-4a2b-8b1d-2c3e4f5a6b7c",
    "effectiveFrom": "2025-01-15T00:00:00Z",
    "createdBy": "AI",
    "stage": {
      "id": "5a3f7a40-fd8c-45ac-86c6-e17d5725e4a1",
      "severity": 1,
      "name": "Level 1",
      "description": "Outdoor residential water restrictions start at Level 1.\n\n..."
    }
  },
  "organisation": {
    "id": "3f0412e0-7f15-41d9-adf0-6fc0e3985b7d",
    "name": "Wellington Water",
    "websiteUrl": null
  },
  "scrapingSources": [
    {
      "url": "https://www.wellingtonwater.co.nz/water-supply/water-restrictions/",
      "description": "Wellington Water restrictions page"
    }
  ]
}

Additional endpoints expose stage assignment history (/v1/areas/{id}/stage-assignments) and upcoming scheduled changes (/v1/areas/{id}/scheduled-changes). A full reference is available in the API docs.

Data management

Externally, the API is a read-only interface. Internally, data is managed through Bridge — a companion web application for viewing and editing restriction data, reviewing AI-detected changes, and monitoring the scraping pipeline.

Automation

A background pipeline periodically scrapes council and water entity websites for restriction updates. The scraper runs as a scheduled container job and triggers AI processing on completion.

The scraper handles sources that render restriction information with JavaScript, and uses proxies to reduce the risk of bot detection. If the content hasn’t changed since the last scrape, processing stops there and no AI call is made. If bot detection is triggered, AI processing is skipped and a Discord notification is sent for review.

For sources where new content is detected, the extracted content is passed to an AI language model, which identifies any restriction stage changes and returns them with a confidence score. Changes that meet the confidence threshold are auto-approved and applied immediately; those that don’t are held for manual review in Bridge, where they can be approved or rejected.
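The approve-or-hold decision reduces to a threshold comparison. This is a sketch under assumptions: the threshold value, field names, and return labels are illustrative, not the actual configuration:

```python
from dataclasses import dataclass

# Hypothetical value; the real threshold is a pipeline setting.
AUTO_APPROVE_THRESHOLD = 0.9

@dataclass
class DetectedChange:
    area_id: str
    new_stage: str
    confidence: float  # reported by the AI model, 0.0 to 1.0

def route_change(change: DetectedChange) -> str:
    """Auto-approve confident changes; hold the rest for manual
    review in Bridge, where they can be approved or rejected."""
    if change.confidence >= AUTO_APPROVE_THRESHOLD:
        return "auto-approved"
    return "pending-review"
```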

Scheduled stage changes can also be prepared in advance — for example, a planned stage increase ahead of a dry season — and will be applied automatically at the configured time.
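Applying scheduled changes amounts to a periodic sweep for entries whose effective time has passed. A minimal sketch, assuming each scheduled change carries an ISO 8601 `effectiveFrom` timestamp like the one in the example payload:

```python
from datetime import datetime

def due_changes(scheduled: list[dict], now: datetime) -> list[dict]:
    """Return the scheduled stage changes whose effective time
    has arrived and which should now be applied."""
    return [
        change
        for change in scheduled
        if datetime.fromisoformat(change["effectiveFrom"]) <= now
    ]
```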

Stack

This project uses my standard stack setup.