2025 – Now
3 min read

Bridge

An internal admin portal for managing the Water Restrictions platform. Handles area and organisation management, scraping pipeline configuration, AI-assisted data quality audits, and scheduled changes. Built with Next.js.

Bridge is the internal admin portal for the Water Restrictions platform. It manages areas, organisations, and restriction schemes, and is the primary tool for configuring and monitoring the scraping pipeline that powers the Water Restrictions Website. Feedback submissions from the website are reviewed and managed in Bridge, and a full domain events log provides an audit trail of all important system changes.

Dashboard

The dashboard provides an at-a-glance view of the platform’s current state. Stat cards show key platform counts. A restriction severity overview displays the current distribution of areas across restriction stages as colour-coded progress bars. A recent activity feed lists the latest domain events across all entities in the system.

Dashboard with stat cards, restriction severity overview, and recent activity
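The severity overview boils down to a simple aggregation. As a rough sketch (the stage names and field shapes here are illustrative, not Bridge's actual schema), the distribution behind the progress bars could be computed like this:

```typescript
// Hypothetical stage labels; the real platform's stages may differ.
type Stage = "none" | "moderate" | "severe" | "critical";

interface Area {
  name: string;
  stage: Stage;
}

// Count areas per stage and convert to percentages for the progress bars.
function severityDistribution(areas: Area[]): Record<Stage, number> {
  const counts: Record<Stage, number> = { none: 0, moderate: 0, severe: 0, critical: 0 };
  for (const area of areas) counts[area.stage] += 1;
  const total = areas.length || 1; // avoid division by zero on an empty platform
  const result = { ...counts };
  for (const stage of Object.keys(result) as Stage[]) {
    result[stage] = Math.round((counts[stage] / total) * 100);
  }
  return result;
}
```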

Domain events

The domain events log provides a paginated record of all system changes. Each entry can be opened to inspect the full event payload as formatted JSON.

Domain event detail with full event payload as JSON
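A domain event entry only needs a handful of fields to support both the list view and the JSON detail view. The shape below is an assumption for illustration, not the real Bridge event schema:

```typescript
// Illustrative domain event record; field names are assumptions.
interface DomainEvent {
  id: string;
  entityType: string;  // e.g. "area", "organisation"
  eventType: string;   // e.g. "area.stage_changed"
  occurredAt: string;  // ISO 8601 timestamp
  payload: unknown;    // full event payload, stored as JSON
}

// Pretty-print the payload the way the detail view might, with 2-space indentation.
function formatPayload(event: DomainEvent): string {
  return JSON.stringify(event.payload, null, 2);
}
```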

Areas

The areas list is searchable and filterable by organisation and scheme. Each row shows details about the area, including the current stage as a colour-coded badge.

Areas list with search, filters, and current stage badges

Each area has a detail page showing the current stage, a severity history chart tracking restriction level changes over time, a stage timeline of past assignments, and the area boundary on an interactive map.

Area detail page showing the stage timeline, severity chart, and boundary map

Areas are edited on a dedicated page with a Monaco GeoJSON editor and a live map preview that updates as the boundary changes.

Area edit page with GeoJSON editor and live map preview
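For the live preview to update safely as the user types, the editor contents have to be parsed and sanity-checked on every change. A minimal sketch of that check, assuming a plain GeoJSON `Polygon` (the actual validation in Bridge may be stricter):

```typescript
// Minimal GeoJSON Polygon shape; rings are arrays of [lng, lat] pairs.
interface GeoJsonPolygon {
  type: "Polygon";
  coordinates: number[][][];
}

// Parse the editor text; return null on anything invalid so the map
// can keep showing the last good boundary instead of flickering.
function parseBoundary(text: string): GeoJsonPolygon | null {
  let parsed: unknown;
  try {
    parsed = JSON.parse(text);
  } catch {
    return null; // mid-edit, not yet valid JSON
  }
  const geo = parsed as GeoJsonPolygon;
  if (geo?.type !== "Polygon" || !Array.isArray(geo.coordinates)) return null;
  // Every ring must close: first and last coordinates identical.
  for (const ring of geo.coordinates) {
    if (ring.length < 4) return null;
    const first = ring[0];
    const last = ring[ring.length - 1];
    if (first[0] !== last[0] || first[1] !== last[1]) return null;
  }
  return geo;
}
```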

Scraping pipeline

Organisations group the scraping sources: the URLs monitored for restriction updates. Per-source settings and organisation-wide reprocessing are managed at the organisation level.

Each scrape run is recorded as a scrape job. The jobs list shows status across all organisations, with badges indicating each job’s outcome.

Scrape jobs list with status badges
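A scrape job record and its badge mapping could look something like the following. The status values and colours are assumptions chosen to match the badges described above, not the pipeline's real schema:

```typescript
// Hypothetical job lifecycle states.
type ScrapeJobStatus = "queued" | "running" | "succeeded" | "failed";

interface ScrapeJob {
  id: string;
  organisation: string;
  sourceUrl: string;
  status: ScrapeJobStatus;
  startedAt?: string;   // ISO 8601, absent while queued
  finishedAt?: string;  // ISO 8601, absent until terminal
}

// Map a job's status to a badge colour, as the jobs list might.
function badgeColour(status: ScrapeJobStatus): string {
  switch (status) {
    case "queued": return "grey";
    case "running": return "blue";
    case "succeeded": return "green";
    case "failed": return "red";
  }
}
```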

Data quality audits

The Area Data Quality audit is triggered manually to check for drift between the API’s stored restriction data and what the scraping sources currently report. Results are tabbed by outcome, with severity mismatches surfaced in a dedicated tab. Each row expands to show the sources used and any quality flags raised.

Area Data Quality audit results with expandable rows
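At its core the audit is a per-area comparison between the stored stage and the freshly scraped one. A sketch of that comparison, with illustrative outcome labels that roughly mirror the tabs described above:

```typescript
// Illustrative audit outcomes; the real audit may distinguish more cases.
type AuditOutcome = "match" | "severity_mismatch" | "source_unavailable";

interface AuditResult {
  area: string;
  storedStage: string;
  scrapedStage: string | null; // null when the source could not be read
  outcome: AuditOutcome;
}

// Compare what the API stores against what the source currently reports.
function auditArea(
  area: string,
  storedStage: string,
  scrapedStage: string | null,
): AuditResult {
  let outcome: AuditOutcome;
  if (scrapedStage === null) outcome = "source_unavailable";
  else if (scrapedStage === storedStage) outcome = "match";
  else outcome = "severity_mismatch";
  return { area, storedStage, scrapedStage, outcome };
}
```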

Scheduled changes

The scheduled changes page provides a global view of all planned stage changes across every area. Each entry shows the area, the target stage, the effective date, and current approval status. Scheduled changes can also be managed from the area detail page.

Global scheduled changes with approval status
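The gating logic for a scheduled change is simple: only approved entries whose effective date has passed should be applied. A minimal sketch, with assumed field names:

```typescript
// Illustrative scheduled change record; field names are assumptions.
interface ScheduledChange {
  area: string;
  targetStage: string;
  effectiveDate: string; // ISO 8601 date
  approved: boolean;
}

// Return the changes due as of `now`, oldest effective date first.
function dueChanges(changes: ScheduledChange[], now: Date): ScheduledChange[] {
  return changes
    .filter((c) => c.approved && new Date(c.effectiveDate) <= now)
    .sort((a, b) => a.effectiveDate.localeCompare(b.effectiveDate));
}
```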

Feedback

Feedback submitted via the Water Restrictions Website surfaces in Bridge for review, giving users a direct channel to report incorrect restriction data.

Stack

This project uses my standard stack setup.