====== Resilient Data Transfer ======

//Last modified: 2023/12/29 13:41 by su//
===== Overview =====
  
Before reading this guide, you should be familiar with the [[alarm_analysis:alarm_analysis_network_architectures|Alarm Analysis Network Architecture]] document. It illustrates possible network installations, including scenarios where Alarms & Events are collected from a militarized zone and transferred to another network for processing.
  
In these scenarios, it is necessary to configure Data Core installations to act as data relays. Moreover, if the connectivity between relays is unreliable, further configuration is required to guarantee resilient data transfer.
  
This guide focuses on one such scenario:
  
Data is gathered from a serial feed and relayed via a Data Core Node from the Process Information Network to an App Store Connect on the Business Network. The relay is configured for resilient data transfer.
  
{{ :alarm_analysis:alarmanalysisresilientdatatransfer.png |}}
  
A two-box installation is required, with the following Data Core configuration:
  
{{ :data_core:resilienttransfer_02.png |}}
  
  * Data Core listens for events (TCP Printer Stream). Arriving events are parsed and sent:
    - on an immediate data relay to App Store Connect (Fast TCP Out)
    - to a local data repository (Big Data Sink)
  
  * A second process on the Data Core Node polls the local data repository for stored events (Big Data Source) and sends them to:
    - a further data relay that awaits an Ack from App Store Connect (TCP Out). If the Ack is not received, data is resubmitted.
  
  * App Store Connect listens for incoming events (TCP In) and stores them to a local data repository (Big Data Sink).
  
While connectivity between the PIN and BN is good, all events reach the destination twice: the fast stream means data arrives quickly, and the second stream guarantees resilience. App Store Connect consolidates the two streams to avoid duplication.
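The consolidation step can be sketched as follows. This is an illustrative model, not Intelligent Plant code: the `(timestamp, message)` deduplication key is an assumption made for the example, and the real Data Core consolidation logic may differ.

```python
# Illustrative sketch of dual-stream consolidation: the same event may arrive on
# both the fast and the resilient stream, so the destination store keeps one copy.
# The (timestamp, message) key is an assumption for illustration only.

def consolidate(fast_stream, resilient_stream):
    """Merge two event streams, dropping duplicates; returns (events, duplicates)."""
    store = {}
    duplicates = 0
    for event in list(fast_stream) + list(resilient_stream):
        key = (event["timestamp"], event["message"])
        if key in store:
            duplicates += 1  # analogous to "docs.deleted" in the Big Data Store
        else:
            store[key] = event
    return list(store.values()), duplicates
```

A duplicate arrival is counted rather than stored twice, which mirrors the behaviour described in the Testing section, where duplicate documents are marked for deletion.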
  
  
====== Step-by-Step Install and Configuration ======
  
**1. Install Data Core Node on PIN**

For detailed instructions on how to deploy a Data Core node, see: [[data_core:stand-alone_installation]].

**2. Data Core Node Configuration**

Configure the following Data Core components (assume default settings unless otherwise stated).

For detailed instructions on how to create an Event Source to Sink subscription, see: [[data_core:Event Subscription]].
  
^ //TCP Printer Stream// ^^
^ Type | TCP Printer (Event Source) |
^ Description | Listen and parse data arriving on TCP channel |
^ Disabled | False |
^ TCP Port | 9000 |
^ Message Delimiter | New Line {\n} |
^ Maximum Characters per Scan | 4000 |
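If no live Alarm & Event feed is available while you are configuring the node, a single test message can be pushed to the listener by hand. A minimal sketch, assuming the listener is reachable on port 9000 and accepts newline-delimited text (matching the Message Delimiter above); the alarm text in the example is hypothetical:

```python
import socket

def send_printer_line(message: str, host: str = "localhost", port: int = 9000) -> None:
    """Send one message to the TCP Printer Stream listener."""
    with socket.create_connection((host, port), timeout=5) as sock:
        # Terminate with \n to match the configured Message Delimiter.
        sock.sendall((message + "\n").encode("ascii"))

# Example (hypothetical alarm text):
# send_printer_line("2018-02-05 14:08:00 PUMP-101 HI ALARM")
```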

^ //Fast TCP Out// ^^
^ Type | TCP Event Sink (Event Sink) |
^ Description | Immediate data transfer to the Business Network |
^ Disabled | False |
^ TCP Server Host | [ IP address of server hosting "TCP In" ] |
^ TCP Server Port | 11000 |
^ Username | [ Service account with access to server hosting "TCP In" ] |
^ Password | [ Service account password with access to server hosting "TCP In" ] |
^ Check Response | No |
^ //Subscribes to:// | TCP Printer Stream |

^ //Big Data Sink// ^^
^ Type | Big Data Event Sink (Event Sink) |
^ Description | Save data to the Big Data Store |
^ Disabled | False |
^ Big Data URL | http://localhost:9200 |
^ //Subscribes to:// | TCP Printer Stream |
  
^ //Big Data Source// ^^
^ Type | Big Data Event Source (Event Source) |
^ Description | Retrieve collected data from Big Data Store |
^ Disabled | False |
^ Paused | False |
^ Big Data URL | http://localhost:9200 |
^ Index Filter | tcpprinterstream.evt_* |
^ Sleep Period | 30 |
^ Lag | 60 |
  
^ //TCP Out// ^^
^ Type | TCP Event Sink (Event Sink) |
^ Description | Resilient data transfer to the Business Network |
^ Disabled | False |
^ TCP Server Host | [ IP address of server hosting "TCP In" ] |
^ TCP Server Port | 11000 |
^ Username | [ Service account with access to server hosting "TCP In" ] |
^ Password | [ Service account password with access to server hosting "TCP In" ] |
^ Check Response | Yes |
^ //Subscribes to:// | Big Data Source |
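The difference between the two sinks is the Check Response setting: Fast TCP Out is fire-and-forget, while TCP Out waits for an acknowledgement and resubmits on failure. A rough sketch of the resilient behaviour — the `ACK` token and retry policy here are illustrative assumptions, not the actual Data Core wire protocol:

```python
import socket

def send_with_ack(message: str, host: str, port: int,
                  retries: int = 3, ack: bytes = b"ACK") -> bool:
    """Send one event and wait for an acknowledgement; resubmit on failure."""
    for _ in range(retries):
        try:
            with socket.create_connection((host, port), timeout=5) as sock:
                sock.sendall((message + "\n").encode("ascii"))
                if sock.recv(len(ack)) == ack:
                    return True  # delivered and acknowledged
        except OSError:
            pass  # connection or read failed; resubmit
    return False  # event remains in the Big Data Store for a later attempt
```

Because unacknowledged events stay in the Big Data Store and the Big Data Source keeps polling, delivery is retried until the destination confirms receipt.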
  
**3. Install App Store Connect on BN**

For detailed instructions on how to deploy App Store Connect, see: [[data_core:how_to_connect_your_data_to_the_app_store]].
  
  
**4. App Store Connect Configuration**

Configure the following Data Core components (assume default settings unless otherwise stated).

For detailed instructions on how to create an Event Source to Sink subscription, see: [[data_core:Event Subscription]].
  
^ //TCP In// ^^
^ Type | TCP Event Source (Event Source) |
^ Description | Receive data from Process Information Network |
^ Disabled | False |
^ TCP Server Port | 11000 |
  
^ //Big Data Sink// ^^
^ Type | Big Data Event Sink (Event Sink) |
^ Description | Save data to the Big Data Store |
^ Disabled | False |
^ Big Data URL | http://localhost:9200 |
^ Big Data Refresh Interval | 5s |
^ //Subscribes to:// | TCP In |
  
**5. Firewall Settings**

Firewalls must allow traffic on the following protocols and ports:
  
^ Firewall ^ Requirements ^
| BN:Internet Network Firewall | TCP Port 443 open to outbound traffic from the computer hosting App Store Connect and user machines to: \\ \\ https://appstore.intelligentplant.com \\ https://login.microsoftonline.com * \\ \\ * Required for Azure Active Directory log-in. \\ \\ For instructions on enabling log-in to the App Store with business accounts, refer to: App Store Registration for Organisations. |
| Computer hosting App Store Connect | Windows Firewall: TCP Port 443 open to outbound traffic \\ TCP Port 11000 open to inbound traffic |
| PIN:BN Network Firewall | TCP Port 11000 open to outbound traffic |
| PCN:PIN Network Firewall | No inbound access required |
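The firewall rules can be verified from each host with a simple connectivity probe. A sketch; the BN address in the comments is a placeholder for your own server:

```python
import socket

def port_open(host: str, port: int, timeout: float = 3.0) -> bool:
    """Check whether a TCP connection can be established through the firewalls."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# From the PIN Data Core node (replace 10.0.2.15 with the BN host's address):
# port_open("10.0.2.15", 11000)                    # PIN:BN firewall, TCP In listener
# port_open("appstore.intelligentplant.com", 443)  # outbound HTTPS from the BN
```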
  
==== 6. Testing ====

Assuming that the TCP Printer Stream configured above is listening to an active Alarm & Event stream, we should see evidence of data stored in the Big Data repositories on the PIN and BN.

A quick test is to execute a URL search query.

1. Log on to the server hosting the Data Core Node.

Open a web browser and enter:
<code> http://localhost:9200/_cat/indices/tcpprinterstream.evt_*?v </code>

A "docs.count" greater than zero indicates Alarm & Event data is successfully stored. For example:
  
<code>
health status index                          pri rep docs.count docs.deleted store.size pri.store.size
green  open   tcpprinterstream.evtidx_201802             10            0    211.7kb        211.7kb
</code>
  
  
2. Log on to the server hosting App Store Connect.

Open a web browser and enter:
<code> http://localhost:9200/_cat/indices/tcpin.evt_*?v </code>
  
You should expect to see something like:
  
<code>
health status index                          pri rep docs.count docs.deleted store.size pri.store.size
green  open   tcpin.evtidx_201802              2           10           10    211.7kb        211.7kb
</code>
  
A "docs.count" greater than zero indicates Alarm & Event data is successfully stored.
  
A "docs.deleted" greater than zero indicates events are arriving on both the fast and resilient streams: the data consolidation process marks duplicate documents for deletion. Deletion happens in the background, so this count clears to zero over time.
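The manual checks above can be scripted. A sketch, assuming the Big Data Store exposes an Elasticsearch-compatible REST API (its `_cat/indices` endpoint also supports `format=json`); the URLs and index patterns match the configuration in this guide — adjust them if yours differ:

```python
import json
import urllib.request

def fetch_indices(base_url: str, pattern: str) -> list:
    """Fetch the _cat/indices rows for an index pattern as a list of dicts."""
    url = f"{base_url}/_cat/indices/{pattern}?format=json"
    with urllib.request.urlopen(url, timeout=10) as resp:
        return json.load(resp)

def stream_is_healthy(rows: list) -> bool:
    """True if at least one matching index holds stored events."""
    return any(int(row["docs.count"]) > 0 for row in rows)

# On the PIN Data Core node:
#   stream_is_healthy(fetch_indices("http://localhost:9200", "tcpprinterstream.evt_*"))
# On the App Store Connect server:
#   stream_is_healthy(fetch_indices("http://localhost:9200", "tcpin.evt_*"))
```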
  
==== 7. Next Steps ====

So far, we've moved Alarm & Event data across a network. We are now ready to configure Alarm Analysis processing.

For more info, see [[alarm_analysis:how_to_configure_an_alarm_event_import_stream|How to configure an Alarm & Event Import Stream]].
  