OT/IT Convergence — What It Means for Your Factory

Walk into any manufacturing plant and you will find two worlds operating side by side but rarely talking to each other. On the shop floor, programmable logic controllers (PLCs), SCADA systems, human-machine interfaces (HMIs), and variable frequency drives keep production running. Upstairs, enterprise resource planning (ERP) systems, manufacturing execution systems (MES), databases, and cloud analytics platforms manage planning, quality, and business intelligence. The first world is Operational Technology. The second is Information Technology. For decades they have coexisted with minimal overlap, each managed by separate teams with separate budgets, separate networks, and fundamentally separate priorities. That separation made sense once. It is now becoming one of the biggest obstacles to manufacturing competitiveness.

OT/IT convergence is the practice of creating structured, secure pathways between these two domains so that data flows from machines to business systems and decisions flow back down to the shop floor. It does not mean merging everything into one team or one network. It means building a shared data layer that both sides can trust. This article explains what that looks like in practice, why it matters now, and how to approach it without breaking what already works.

Understanding the Two Worlds

Operational Technology encompasses every system that directly monitors or controls physical processes. In a typical discrete manufacturing environment, this includes PLCs that execute machine logic in deterministic scan cycles measured in milliseconds, SCADA platforms that aggregate process data across production lines, HMIs that give operators real-time visibility into machine state, distributed control systems (DCS) in process-intensive operations, and industrial networks running protocols like EtherNet/IP, Profinet, or Modbus TCP. OT systems are designed around one overriding principle: uptime. A PLC that has been running a stamping press reliably for fifteen years is not considered outdated. It is considered proven. OT engineers are understandably cautious about anything that could introduce latency, instability, or new attack surfaces into these environments.

Information Technology, by contrast, encompasses the systems that store, process, and transmit business data. ERP systems manage production orders, material requirements, and financial transactions. MES platforms track work-in-progress, enforce routing logic, and record quality data. Cloud infrastructure hosts analytics, reporting, and increasingly, AI-driven decision support. IT teams prioritise data accessibility, integration, security policy enforcement, and the ability to deploy updates and new capabilities rapidly. Where OT measures success in years of uninterrupted operation, IT measures success in how quickly systems can adapt to new business requirements.

Neither perspective is wrong. Both are right within their own context. The challenge of convergence is respecting both sets of priorities simultaneously.

Why They Were Separate — and Why That Is Changing

The historical separation between OT and IT was not an accident. It was a deliberate architectural decision driven by real constraints. OT networks needed deterministic communication where a control signal arrives within a guaranteed time window. IT networks operated on best-effort delivery where a few hundred milliseconds of latency is irrelevant to an ERP transaction but catastrophic to a motion control loop. OT systems ran proprietary protocols designed for reliability in harsh electrical environments. IT systems standardised on TCP/IP. The security models were fundamentally different: OT relied on physical isolation (air gaps) while IT relied on firewalls, authentication, and encryption. Connecting these worlds meant exposing safety-critical equipment to risks that no plant manager was willing to accept.

Three forces are now making convergence both possible and necessary. First, the Industrial Internet of Things (IIoT) has placed IP-capable sensors and edge devices directly on production equipment, creating data streams that need to reach business systems in near real time. Second, cloud-based analytics and AI require access to granular operational data that has historically been trapped inside OT networks. Third, customer demands for traceability, lot-level genealogy, and real-time order visibility require production data to flow seamlessly into supply chain and quality systems. You cannot build a digital thread if the first half of that thread is locked inside a PLC network that nothing else can see.

ISA-95: The Model That Still Matters

Before discussing modern convergence architectures, it is worth revisiting the ISA-95 standard (IEC 62264), because it remains the most widely understood framework for describing how OT and IT relate to each other. ISA-95 defines five levels:

  • Level 0 — Physical Process: The actual production equipment, sensors, and actuators that interact with raw materials and work-in-progress.
  • Level 1 — Basic Control: PLCs, RTUs, and safety controllers that execute real-time control logic with scan times in the low milliseconds.
  • Level 2 — Area Supervisory Control: SCADA systems, HMIs, and local data historians that provide operators with visibility and control over production lines or cells.
  • Level 3 — Manufacturing Operations: MES, quality management, maintenance management, and production scheduling systems that orchestrate work across the plant.
  • Level 4 — Business Planning and Logistics: ERP, supply chain management, CRM, and business intelligence platforms that operate at the enterprise level.

Traditionally, data moved upward through these levels in batch transfers with significant latency at each boundary. A shift production report would be compiled in the MES at Level 3 and uploaded to the ERP at Level 4 hours or even a day later. Machine-level data at Levels 0-2 rarely made it past the local historian. Modern convergence does not eliminate these levels. It compresses the latency between them and enables selective real-time data flow across level boundaries without compromising the isolation that keeps lower levels stable.

Unified Namespace: The Bridge Between OT and IT

The most practical architectural pattern for OT/IT convergence today is the Unified Namespace (UNS). A UNS is a single, event-driven data layer where every system, machine, sensor, and application publishes and subscribes to data using a consistent, semantically organised topic hierarchy. Instead of building point-to-point integrations between every OT system and every IT system (which creates an unmanageable web of N-squared connections), every system connects to the UNS once. Data producers publish. Data consumers subscribe. The namespace itself is organised to reflect the ISA-95 hierarchy: enterprise, site, area, line, work cell, and equipment.
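To make that topic hierarchy concrete, here is a minimal Python sketch of a helper that composes ISA-95 aligned topic paths. The enterprise, site, and equipment names are purely illustrative, not part of any standard.

```python
# MQTT wildcard characters belong in subscription filters,
# never in the topics that producers publish to.
WILDCARDS = {"+", "#"}

def uns_topic(*segments: str) -> str:
    """Join hierarchy segments into a UNS topic path, rejecting empty
    segments, embedded slashes, and MQTT wildcard characters."""
    for s in segments:
        if not s or s in WILDCARDS or "/" in s:
            raise ValueError(f"invalid topic segment: {s!r}")
    return "/".join(segments)

# enterprise / site / area / line / work cell / equipment / tag
topic = uns_topic("acme", "dallas", "stamping", "line-3",
                  "cell-2", "press-01", "cycle_count")
```

Because every producer builds its topics the same way, a consumer can navigate the namespace by structure alone, without knowing in advance which machines exist.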

The messaging backbone of a UNS is typically MQTT, a lightweight publish-subscribe protocol designed for constrained environments. MQTT brokers handle message routing with minimal overhead, support quality-of-service levels from fire-and-forget to guaranteed delivery, and can operate over unreliable network links with automatic reconnection. For OT data ingestion, OPC UA provides a standardised, secure interface for reading data from PLCs, CNCs, and industrial controllers, which is then published into the MQTT namespace. The combination of OPC UA at the equipment edge and MQTT as the transport layer has emerged as the de facto standard for industrial data architectures.

What makes a UNS powerful is what it eliminates. There is no more waiting for a weekly data dump from the historian into a spreadsheet. There is no more building a custom integration every time a new system needs access to machine data. When an MES needs real-time cycle counts from a press line, it subscribes to the relevant topic. When the ERP needs to know that a production order is complete, it subscribes to the order status topic. When a quality system needs to correlate process parameters with inspection results, both data streams are available in the same namespace with aligned timestamps. The UNS becomes the single source of truth for the current state of the operation.
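Subscription in MQTT relies on wildcard filters: `+` matches exactly one topic level, `#` matches the remainder of the path. The following sketch implements the core matching rule so the behaviour is concrete; it assumes a syntactically valid filter and ignores special cases such as `$`-prefixed system topics.

```python
def topic_matches(topic_filter: str, topic: str) -> bool:
    """Core MQTT wildcard matching: '+' matches one level,
    '#' matches everything below the current level."""
    f_parts = topic_filter.split("/")
    t_parts = topic.split("/")
    for i, part in enumerate(f_parts):
        if part == "#":
            return True          # multi-level wildcard: rest of path matches
        if i >= len(t_parts):
            return False         # filter is deeper than the topic
        if part != "+" and part != t_parts[i]:
            return False         # literal segment mismatch
    return len(f_parts) == len(t_parts)
```

An MES wanting cycle counts from every press on every line would subscribe with a filter such as `acme/dallas/stamping/+/+/press-01/cycle_count`, while an enterprise dashboard might subscribe to `acme/dallas/#` (topic names illustrative).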

Edge Computing: Where OT Meets IT in Practice

Convergence does not mean sending every PLC register value to the cloud. That approach floods networks, creates unnecessary latency dependencies, and exposes OT systems to risks they should not carry. Edge computing provides the buffer zone where OT data is collected, filtered, contextualised, and made available to IT systems without requiring direct access to the OT network.

An edge gateway sits at the boundary between the OT and IT domains. On its southbound interface, it speaks native industrial protocols: OPC UA, Modbus, Siemens S7, EtherNet/IP, Profinet. On its northbound interface, it publishes contextualised data into the UNS via MQTT or makes it available through REST APIs. The critical design principle is that the edge device initiates all outbound connections. No inbound traffic from the IT network or the cloud reaches the OT network through the edge gateway. This outbound-only architecture preserves the isolation that OT teams depend on while giving IT teams the data access they need.
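A minimal sketch of that data path, with the southbound read and northbound publish injected as plain functions. In a real gateway these would be, for example, an OPC UA client read and an MQTT publish; the topic prefix and tag names here are purely illustrative.

```python
import json
import time

def contextualise(raw: dict, line: str, machine: str) -> dict:
    """Wrap a raw southbound sample with UNS context and a timestamp."""
    return {
        "value": raw["value"],
        "quality": raw.get("quality", "good"),
        "source": f"{line}/{machine}",
        "ts": raw.get("ts", time.time()),
    }

def poll_once(read_south, publish_north, tags, line, machine):
    """One gateway cycle: read each tag southbound, publish it northbound.
    Every connection is initiated by the gateway itself (outbound-only);
    nothing on the IT side ever opens a connection into the OT network."""
    for tag in tags:
        sample = read_south(tag)  # e.g. an OPC UA read in a real gateway
        payload = contextualise(sample, line, machine)
        publish_north(f"acme/dallas/stamping/{line}/{machine}/{tag}",
                      json.dumps(payload))
```

The key property is directionality: `read_south` talks to the controllers, `publish_north` pushes into the UNS broker, and there is no code path by which a northbound message reaches a controller.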

Edge devices also perform local processing that does not need to leave the plant floor. Real-time alarming, threshold monitoring, local dashboards for operators, and even lightweight analytics can run at the edge with sub-second response times. Only aggregated, contextualised data moves northward to the cloud for enterprise-level analytics, cross-plant benchmarking, and long-term trend analysis. This tiered approach respects the latency requirements of OT while satisfying the data requirements of IT.
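One common edge-side technique for deciding what leaves the plant floor is report-by-exception: a sample is forwarded northward only when it has moved meaningfully since the last reported value. A minimal sketch:

```python
class DeadbandFilter:
    """Report-by-exception: accept a value only when it differs from the
    last reported value by at least `deadband` (in engineering units)."""

    def __init__(self, deadband: float):
        self.deadband = deadband
        self.last = None  # no value reported yet

    def accept(self, value: float) -> bool:
        if self.last is None or abs(value - self.last) >= self.deadband:
            self.last = value
            return True   # forward this sample northward
        return False      # suppress: change is within the deadband
```

A temperature tag sampled every 100 ms but filtered with a 0.5 degree deadband might publish only a handful of messages per minute, which is exactly the kind of reduction that keeps the northbound link and the cloud historian manageable.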

Security: The Non-Negotiable Foundation

No discussion of OT/IT convergence is complete without addressing security directly. The IEC 62443 standard provides a comprehensive framework for industrial cybersecurity, and any convergence initiative should align with its principles. The core concept is defence in depth: multiple layers of security controls so that the failure of any single layer does not expose the entire system.

In practical terms, this means several things for a convergence architecture:

  • Network segmentation: OT and IT networks remain physically or logically separated. Communication between zones passes through defined conduits (firewalls, DMZs, data diodes) with strict access control lists that permit only the specific traffic patterns required.
  • Outbound-only communication: Edge devices push data outward through encrypted TLS connections. No inbound port is opened on the OT network, which removes a common attack vector: an open port that an adversary can scan and exploit.
  • Authentication and authorisation: Every device, user, and application that connects to the UNS or accesses OT data must authenticate. Role-based access controls ensure that a business analyst can view OEE dashboards but cannot send commands to a PLC.
  • Encrypted transport: All data moving between OT and IT domains is encrypted in transit using TLS 1.2 or higher. Certificate-based mutual authentication ensures both ends of the connection are verified.
  • Monitoring and anomaly detection: Network traffic between zones is logged and monitored for unusual patterns. A sudden spike in outbound data volume or an unexpected protocol on the OT network triggers an alert for investigation.
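As one concrete piece of the above, the encrypted-transport requirement can be expressed with Python's standard ssl module. This is a sketch of the client-side context an edge gateway would hand to its MQTT library for the northbound link; the certificate file paths in the comments are illustrative placeholders.

```python
import ssl

def make_northbound_tls_context() -> ssl.SSLContext:
    """Client-side TLS context for the edge gateway's outbound MQTT
    connection: TLS 1.2 minimum, server certificate verification on."""
    ctx = ssl.create_default_context(ssl.Purpose.SERVER_AUTH)
    ctx.minimum_version = ssl.TLSVersion.TLSv1_2
    ctx.verify_mode = ssl.CERT_REQUIRED
    ctx.check_hostname = True
    # For certificate-based mutual authentication, the gateway would also
    # present its own client certificate (paths are illustrative):
    # ctx.load_cert_chain(certfile="edge-gateway.crt",
    #                     keyfile="edge-gateway.key")
    # ctx.load_verify_locations(cafile="plant-ca.pem")
    return ctx
```

Because the context enforces a minimum protocol version and requires the broker's certificate to verify, a misconfigured or impersonated broker causes the connection to fail rather than silently downgrading.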

The key message for both OT and IT teams is that convergence does not mean removing the boundaries between networks. It means creating controlled, monitored, and encrypted pathways through those boundaries. The air gap may be gone, but the security posture should be stronger than it was when the only protection was a locked cabinet and an unlabelled Ethernet cable.

The Human Side: OT and IT Teams Working Together

Technology is rarely the hardest part of convergence. The harder challenge is organisational. OT teams and IT teams have different training, different tools, different escalation paths, and different definitions of what constitutes a critical incident. An OT engineer will tell you that the worst thing that can happen is unplanned downtime. An IT security analyst will tell you that the worst thing that can happen is a data breach. Both are correct, and convergence requires finding an operating model where both concerns are addressed without one team overriding the other.

The most successful convergence programmes establish a cross-functional governance structure early. This does not mean creating a new department. It means defining shared change management processes so that an IT-initiated firmware update does not disrupt a production run, and an OT-initiated PLC programme change does not introduce an unmonitored data flow. It means agreeing on a common asset inventory so that both teams know what devices are on the network. It means defining shared incident response procedures so that a cybersecurity event on the IT network triggers the right notifications to OT personnel, and an equipment fault on the OT network is visible to the IT monitoring team.

Convergence is not IT taking over OT, and it is not OT ignoring IT. It is both teams agreeing that shared visibility, shared data, and shared accountability produce better outcomes than operating in parallel with no communication.

A Practical Approach: Start with the Data Layer

If your organisation is beginning a convergence initiative, do not start by reorganising teams or rewriting network architectures. Start with the data layer. Identify the top three to five data flows that, if connected, would deliver immediate operational value. Common starting points include:

  • Machine availability and cycle time data flowing from PLCs into the MES for automated OEE calculation, replacing manual data entry and shift-end reports.
  • Process parameter data (temperatures, pressures, torques) correlated with quality inspection results to detect process drift before it produces scrap.
  • Equipment condition data (vibration, current draw, temperature trends) flowing to a maintenance system for condition-based or predictive maintenance programmes.
  • Production order status and material consumption flowing from the shop floor to the ERP in near real time, giving planning teams accurate visibility into work-in-progress.
  • Energy consumption data at the machine level, enabling scheduling optimisation and sustainability reporting with actual measured values rather than estimates.
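Taking the first of these flows as an example, the standard OEE formula (availability × performance × quality) can be computed directly from counters a PLC typically already exposes. The parameter names and shift figures below are illustrative.

```python
def oee(planned_min: float, downtime_min: float, ideal_cycle_s: float,
        total_count: int, good_count: int) -> float:
    """OEE = availability x performance x quality.
    planned_min:   planned production time for the shift, in minutes
    downtime_min:  unplanned downtime, in minutes
    ideal_cycle_s: ideal cycle time per part, in seconds
    total_count:   total parts produced; good_count: parts passing quality
    """
    run_min = planned_min - downtime_min
    availability = run_min / planned_min
    performance = (ideal_cycle_s * total_count) / (run_min * 60)
    quality = good_count / total_count
    return availability * performance * quality

# 8-hour shift, 48 min downtime, 30 s ideal cycle, 700 parts, 665 good
shift_oee = oee(480, 48, 30, 700, 665)  # roughly 0.69
```

With the inputs published into the UNS, the MES can run this calculation continuously and per machine, instead of reconstructing it from manual entries at shift end.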

Deploy an edge gateway on a single production line. Connect it to the existing PLCs and sensors using native industrial protocols. Publish the data into a local UNS broker. Let the MES, quality system, and maintenance system subscribe to the topics they need. Measure the value delivered. Then expand to additional lines and additional data flows using the same architecture. This incremental approach proves value at each stage, builds confidence across both OT and IT teams, and avoids the risk of a large-scale transformation programme that takes two years to deliver its first result.

Where This Is Heading

OT/IT convergence is not a one-time project. It is an ongoing architectural evolution that enables progressively more sophisticated use cases. Once the shared data layer is in place, manufacturers can layer on advanced analytics, machine learning models for predictive quality and maintenance, digital twin simulations fed by live production data, and closed-loop optimisation where AI-driven recommendations flow back down to the shop floor as adjusted set points or scheduling changes. None of these capabilities are possible without the foundational step of connecting OT and IT through a secure, standardised data architecture.

The manufacturers who will lead in the coming decade are not the ones with the most advanced machines or the largest IT budgets. They are the ones who bridge the OT/IT divide effectively, creating an operational data fabric where every system contributes to and benefits from a shared understanding of what is happening on the factory floor right now.

How Tomax Supports OT/IT Convergence

The Tomax platform is built on a federated Unified Namespace architecture with native support for OPC UA and MQTT, enabling secure data flow from shop floor equipment into MES, quality, maintenance, and planning applications without requiring direct OT network access. Edge gateways handle protocol translation and local processing with outbound-only communication, while the cloud layer provides enterprise-wide analytics and cross-plant visibility. This architecture lets manufacturers start with a single line and scale at their own pace, using the same data foundation at every stage.