New Joint Agency Guidance: Secure Connectivity Principles for OT

By Patrick Miller

A Five Eyes plus European intelligence coalition has published a new doctrine for securing OT connectivity against nation-state threats. This Deep Dive examines what the NCSC principles mean for utilities and industrial operators, what breaks in legacy environments, and the safety, cost, and engineering realities of moving from compliance-driven security to true operational resilience.

 

Overview

In early 2026, the UK National Cyber Security Centre published Secure Connectivity Principles for Operational Technology (OT). On the surface, it reads like another piece of industrial cybersecurity guidance. In reality, it is something much more significant.

This document was developed jointly by the UK NCSC, Australia’s ASD, Canada’s Cyber Centre, the US CISA and FBI, Germany’s BSI, the Netherlands’ NCSC, and New Zealand’s NCSC. That breadth makes it, in effect, a Five Eyes plus European intelligence-grade doctrine for how critical infrastructure networks should be connected.

Unlike regulatory frameworks, which focus on minimum controls and auditability, this guidance is built around one question: How do you prevent nation-state adversaries from gaining and maintaining access inside operational technology environments?

The Eight Secure OT Connectivity Principles at a Glance

1. Balance risks and opportunities: Every OT connection must have a documented business case, risk tolerance, and senior risk owner. Connectivity is not assumed. It is justified, threat-modeled, and revisited over time.
2. Limit the exposure of your connectivity: OT systems should not be directly reachable from external networks. Connectivity should be outbound-initiated, brokered through a DMZ, time-limited, and continuously assessed for accidental internet exposure.
3. Centralise and standardise network connections: Remote access, data sharing, and vendor connectivity should use a small number of hardened, repeatable, monitored access paths instead of ad hoc VPNs and bespoke links.
4. Use standardised and secure protocols: Legacy industrial protocols should be replaced or wrapped with secure variants such as DNP3-SA, OPC UA, CIP Security, and TLS. Protocols must be validated, not blindly trusted.
5. Harden your OT boundary: Treat the OT perimeter as a security platform: modern firewalls, strong authentication, least-privilege access, removal of default credentials, and where appropriate hardware-enforced data flows.
6. Limit the impact of compromise: Segment and micro-segment networks so a single compromised laptop, vendor account, or device cannot pivot to broader zones or safety-critical systems.
7. Ensure all connectivity is logged and monitored: Monitoring should be designed to detect misuse, anomaly, and command-level abuse, not just collect logs for audits.
8. Establish an isolation plan: Build and test the ability to isolate OT from external dependencies during incidents, while preserving safety-critical visibility and operations through trusted data flows where needed.

The threat model is not hypothetical

The NCSC is explicit about why this guidance exists: “Exposed and insecure OT connectivity is targeted by both opportunistic and highly capable actors, including state-sponsored campaigns against critical national infrastructure.”

This is a post-Volt Typhoon, post-TRITON, post-Industroyer world. The goal is not to stop nuisance malware. It is to prevent strategic pre-positioning inside power grids, pipelines, water systems, and manufacturing environments. Connectivity is where those campaigns live.

These are not best practices, they are end-state architecture

The NCSC does not pretend this is easy: “These principles are intended as goals rather than minimum requirements.” This matters, because the document is not describing what most organizations can do today. It is describing what OT must become if it is to survive in a world where cyber operations are instruments of state power.

The architectural pivot: OT stops accepting inbound connections

The single most important control in the entire document is deceptively simple: “All connections with the OT environment should be initiated as outbound connections from within the OT environment.”

This one requirement invalidates most of today’s remote access and integration patterns:

  • No inbound VPNs

  • No internet-facing gateways

  • No persistent vendor tunnels

  • No cloud systems directly pulling data from OT

Instead, all access becomes brokered through hardened intermediaries in a DMZ. External systems never connect to OT. OT selectively connects outward. This design alone neutralizes many of the techniques used in real-world intrusions.
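The brokered, outbound-only pattern can be sketched in miniature. This is an illustrative model, not a product: DmzBroker and OtAgent are hypothetical names, and a real deployment would use authenticated, encrypted outbound sessions from the OT side rather than in-memory queues.

```python
import queue

class DmzBroker:
    """Hypothetical DMZ-side broker. External parties enqueue requests here;
    the broker never opens a connection into the OT network."""
    def __init__(self):
        self._pending = queue.Queue()
        self.results = {}

    def submit(self, request_id, payload):
        # Called from the IT/external side of the boundary.
        self._pending.put((request_id, payload))

    def next_request(self):
        # Served only to the OT-side agent over its outbound session.
        try:
            return self._pending.get_nowait()
        except queue.Empty:
            return None

    def post_result(self, request_id, result):
        self.results[request_id] = result


class OtAgent:
    """OT-side agent: initiates all traffic outbound, listens on no sockets."""
    def __init__(self, broker):
        self.broker = broker

    def poll_once(self):
        """Pull one pending request from the broker, if any."""
        item = self.broker.next_request()
        if item is None:
            return False
        request_id, payload = item
        # A real agent would validate, rate-limit, and log the request here.
        self.broker.post_result(request_id, f"ack:{payload}")
        return True
```

The inversion is the point: external systems can only leave requests at the broker, and nothing happens until the OT side chooses to reach out and pull them.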

Exposure is not a device problem, it is a systems problem

The guidance treats exposure as a property of the whole system, not just internet-facing PLCs. Wireless links, radio bridges, vendor endpoints, cloud connectors, and even undocumented modems all count as network edges. This is why the document strongly pushes exposure management and external attack surface monitoring for OT. If it can be discovered, it can be targeted.

Protocols are no longer trusted, they are verified

The document goes far beyond “use secure protocols.” It calls for schema-based protocol validation at trust boundaries, where only known-good commands and data structures are allowed to pass. This is how classified networks protect cross-domain data flows. The same logic is now being applied to industrial control systems.
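A minimal sketch of what schema-based validation looks like at a trust boundary, assuming frames have already been parsed into structured commands. The function names and register ranges here are hypothetical; a production inspection gateway would validate against the full protocol grammar, but the deny-by-default shape is the same.

```python
# Hypothetical allowlist schema: only these function codes and register
# ranges may cross the trust boundary; everything else is dropped.
ALLOWED = {
    "read_holding_registers": {"min_addr": 0, "max_addr": 199, "max_count": 16},
    "read_input_registers":   {"min_addr": 0, "max_addr": 99,  "max_count": 8},
}

def validate_command(cmd: dict) -> bool:
    """Return True only if the parsed command matches the known-good schema."""
    rule = ALLOWED.get(cmd.get("function"))
    if rule is None:
        return False  # unknown function code: deny by default
    addr, count = cmd.get("address"), cmd.get("count")
    if not isinstance(addr, int) or not isinstance(count, int):
        return False  # malformed structure: deny
    if not (rule["min_addr"] <= addr <= rule["max_addr"]):
        return False  # register outside the approved range
    return 1 <= count <= rule["max_count"]
```

Note that write operations simply do not appear in the schema, so they never pass, no matter how well-formed the frame is.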

The OT boundary becomes a hardened security platform

The guidance is blunt about OT boundary devices: “They must be modern, patchable, cryptographically capable, and easy to replace. Obsolete devices should not be used as security controls.” That is a direct challenge to how many OT perimeters are built today.

Hardware-enforced trust is back

For high-risk environments, the NCSC explicitly endorses hardware-enforced designs such as network taps, data diodes, and cross-domain solutions that constrain data flows to one direction or to tightly controlled exchanges. Given the steady stream of critical firewall vulnerabilities in recent years, this is a recognition that software-based controls do not hold up against well-resourced adversaries.

Lateral movement is the real enemy

The document introduces two concepts most OT security programs still underplay: contamination and lateral movement. Compromised laptops, poisoned updates, misconfigured gateways, and flat vendor networks allow attackers to move quietly until they reach safety-critical systems. Micro-segmentation, separation of duties, DMZs, and browse-down administration are designed to make that movement architecturally difficult.
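One way to reason about lateral movement is to model the zone policy as an explicit, deny-by-default flow matrix and then ask which zones are transitively reachable. The zone names below are invented for illustration; the useful property is that any pivot path an attacker could take is visible in the model before it exists on the wire.

```python
# Hypothetical zone model: a flow exists only if it is explicitly listed.
ALLOWED_FLOWS = {
    ("it", "dmz"),        # IT may reach the DMZ brokers
    ("ot", "dmz"),        # OT initiates outbound to the DMZ only
    ("ot_cell_a", "ot"),  # a process cell may talk to its supervisory zone
}

def flow_permitted(src: str, dst: str) -> bool:
    """Deny-by-default: a direct flow exists only if (src, dst) is listed."""
    return (src, dst) in ALLOWED_FLOWS

def path_exists(src: str, dst: str, flows=ALLOWED_FLOWS) -> bool:
    """Could an attacker pivot from src to dst through permitted flows?"""
    seen, stack = set(), [src]
    while stack:
        node = stack.pop()
        if node == dst:
            return True
        if node in seen:
            continue
        seen.add(node)
        stack.extend(d for s, d in flows if s == node)
    return False
```

In this model IT can reach the DMZ but has no path, direct or transitive, into OT, which is exactly the property micro-segmentation is meant to guarantee.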

Monitoring is about detecting adversaries, not just collecting logs

Principle 7 reframes logging and monitoring around detecting misuse, anomaly, and abuse of emergency access, not simply satisfying audit requirements. This is a threat-intelligence/hunting-driven SOC model applied to OT.
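Read literally, this means the detection logic has to live at the command level. A toy baseline detector, under the assumption that traffic has already been decoded into function-code strings (the names and thresholds are illustrative, not from the guidance):

```python
from collections import Counter

class CommandBaseline:
    """Hypothetical command-level monitor: flags function codes never seen
    during a learning window, and rate spikes beyond a fixed multiplier."""
    def __init__(self, spike_factor=5):
        self.baseline = Counter()
        self.spike_factor = spike_factor

    def learn(self, commands):
        # Build the baseline from a window of known-good traffic.
        self.baseline.update(commands)

    def alerts(self, window):
        """Compare an observation window against the baseline."""
        flagged = []
        for cmd, n in Counter(window).items():
            base = self.baseline.get(cmd, 0)
            if base == 0:
                flagged.append((cmd, "never-before-seen"))
            elif n > base * self.spike_factor:
                flagged.append((cmd, "rate-spike"))
        return flagged
```

A single unexpected write command surfaces immediately, while routine polling at normal volume generates nothing, which is the difference between hunting for misuse and warehousing logs.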

Isolation is a design requirement, not an emergency tactic

The guidance requires OT systems to be able to isolate from external dependencies while preserving safety-critical data flows through trusted mechanisms such as data diodes. This is how infrastructure survives pre-positioning campaigns without shutting down. The approach elevates the “intelligent islanding” or “turtle mode” concepts familiar from incident response and crisis management into a design principle rather than a purely reactive behavior.

The [in]feasibility reality

This model assumes capabilities many [most] OT environments do not yet have.

It requires:

  • OT network architects

  • Industrial protocol security engineers

  • Identity and access control specialists

  • Threat detection engineers

  • Operational governance across IT, OT, and vendors

  • Significant resources and budget

Implementing this in a medium-sized utility or industrial enterprise is typically a multi-year, eight-figure transformation, involving tens of thousands of engineering hours. There is no “lightweight” version of this that still delivers nation-state resilience.

What breaks in legacy environments

Most brownfield OT networks violate nearly every assumption in this guidance:

  • Flat trust zones

  • Insecure, unauthenticated protocols

  • Vendor VPNs with broad access

  • Unsupported and unpatchable devices

  • Hidden external dependencies

Many legacy systems cannot be upgraded to secure protocols and must instead be isolated behind gateways, diodes, or segmented enclaves. This adds cost, complexity, and operational friction.

The safety tradeoff

This is where the guidance becomes dangerous if misunderstood. Done correctly, this model reduces the likelihood that cyber events become physical safety events by preventing unauthorized control and limiting blast radius.

Done poorly, it introduces new safety risks:

  • Latency and jitter from encryption and inspection

  • Lost alarms or operator visibility

  • Accidental isolation of safety-critical systems

  • Fragile dependencies on security infrastructure

The NCSC explicitly warns that connectivity changes must be evaluated for operational risk, loss of connectivity, and manual fallback capability. Security controls inserted without safety engineering can become new failure modes.

What this means for regulated infrastructure

One of the most potentially disruptive sentences in the document is this: “If regulatory connectivity patterns fall short of modern security expectations, operators must implement compensating controls.”

That is a direct challenge to legacy regulatory models that lock in weak architectures. This joint doctrine is saying something quietly but clearly: If your regulation requires a weak architecture to remain compliant, do not expect to fend off highly skilled adversaries.

Final thoughts

This document is not a checklist. It is a blueprint for how industrial systems should be connected and architected in a world where cyber operations are strategic weapons. Most OT environments today are not built this way; far from it, and most cannot be made this way quickly. But this is where the bar has been set, agreed upon by agencies with first-hand knowledge of what capable adversaries can do.

 
