Digital danger: controversy and concerns of data lifecycle in shipping
Updated: Jul 22, 2020
The shipping industry is transforming and adapting to the current world scenario. Fuel prices remain low, some sectors are struggling with excess tonnage while others are earning above break-even, and regulatory compliance, monitoring and reporting requirements are either piling up or just around the corner.
As the industry transforms, data-driven services and analytics are being brought to the forefront and rapidly adopted. For many companies, their data is understandably a sensitive subject. Since this is arguably a new area for the shipping industry, a few questions keep cropping up about how data is handled. Most of the concern stems from fear of the unknown or of a worst-case scenario, and we hope to answer some of those questions here by addressing the entire data lifecycle.
Data is collected from on-board equipment and sensors. Relevant data is also collated from other sources, most of which are publicly available. It is important that the data is not just continuously available but also completely reliable. Ensuring data quality means it has to be of the required accuracy and frequency. Companies that work with data ensure its sanctity by continuously monitoring individual data streams, carefully filtering them and creating sensitive alerts. This process is the cornerstone of obtaining accurate results when the data is analyzed.
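As a minimal sketch of what this kind of stream filtering and alerting can look like (the sensor, units and thresholds below are illustrative, not taken from any real system):

```python
from statistics import mean

# Plausibility window for a hypothetical speed-through-water sensor, in knots.
PLAUSIBLE_RANGE = (0.0, 30.0)

def filter_stream(readings, lo=PLAUSIBLE_RANGE[0], hi=PLAUSIBLE_RANGE[1]):
    """Split raw readings into accepted values and flagged outliers."""
    accepted, flagged = [], []
    for value in readings:
        (accepted if lo <= value <= hi else flagged).append(value)
    return accepted, flagged

raw = [12.1, 12.3, -4.0, 12.2, 99.9, 12.4]   # -4.0 and 99.9 are sensor glitches
good, bad = filter_stream(raw)
if bad:
    print(f"ALERT: {len(bad)} implausible readings dropped: {bad}")
print(f"mean speed over accepted samples: {mean(good):.2f} kn")
```

Real pipelines would add time-stamp checks for stale streams and rate-of-change limits, but the accept/flag split above is the basic shape of the quality gate.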
The speed of data transfer depends on the communication capability of the vessel, and the volume transferred depends on the company's appetite for data. With good broadband connectivity, data can be both transmitted and received by the vessel virtually in real time. Where vessels do not have the latest communications systems, data can be transferred to a memory stick and shipped to the data centres when the vessel arrives in port.
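A rough illustration of why on-board logs suit constrained satellite links: repetitive sensor logs compress to a small fraction of their raw size before transmission (the sample log format below is invented):

```python
import zlib

# A repetitive, CSV-style sensor log, typical of periodic telemetry.
log = ("2020-07-22T12:00:00Z,SOG,12.3\n" * 1000).encode()

packed = zlib.compress(log, level=9)
ratio = len(packed) / len(log)

assert zlib.decompress(packed) == log   # lossless round trip
print(f"{len(log)} B -> {len(packed)} B ({ratio:.1%} of original)")
```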
Several data-enrichment methods are applied depending on the requirement. Enrichment enhances the value of the raw data, allowing better comprehension and greater understanding. Examples include mathematical and statistical modelling, such as 'propulsion power decomposition' to understand the breakdown of the dynamic sea margin; meteorological and hydrographic overlays to measure the impact of weather conditions; and exposing new data to machine-learning technology.
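One simple form of meteorological overlay can be sketched as a nearest-in-time join of weather observations onto vessel data points (all values below are made up for illustration):

```python
from bisect import bisect_left

# Hypothetical weather observations: (unix_time, wind speed in m/s).
weather = [(0, 8.0), (3600, 12.0), (7200, 15.0)]
times = [t for t, _ in weather]

def nearest_wind(ts):
    """Return the wind observation closest in time to timestamp ts."""
    i = bisect_left(times, ts)
    candidates = weather[max(0, i - 1):i + 1]
    return min(candidates, key=lambda w: abs(w[0] - ts))[1]

# Hypothetical vessel data points: (unix_time, speed in knots).
vessel = [(1000, 12.1), (5000, 11.8)]
enriched = [(ts, spd, nearest_wind(ts)) for ts, spd in vessel]
```

Production systems would interpolate between observations and match on position as well as time, but the join-on-nearest-key idea is the core of any overlay.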
Quite a few areas have been identified along the data trail where security breaches can happen.
The best way to prevent physical tampering of sensors and on-board networks is to restrict entry to only authorized personnel on board. Restricted areas on-board should be identified and access strictly controlled for visitors, contractors, etc.
The IT systems on ships need to follow industry standards and best practices on security including a solid architecture design and proper firewalls.
Data in transit is typically highly compressed and encrypted. Security during transfer is ensured by the industry best practice of 'public key infrastructure' (PKI), the same technology used in e-commerce and internet banking.
Internet security is ensured by the standard industry best practice of IP verification, or a physical token combined with a username and password, i.e. 'two-factor authentication'.
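The article does not specify which token scheme is meant; a widely used one for hardware and app tokens is the time-based one-time password (TOTP) of RFC 6238, which can be sketched in a few lines:

```python
import hashlib
import hmac
import struct
import time

def hotp(key: bytes, counter: int, digits: int = 6) -> str:
    """HMAC-based one-time password (RFC 4226), using SHA-1."""
    mac = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = mac[-1] & 0x0F                       # dynamic truncation offset
    code = struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

def totp(key: bytes, timestamp=None, step: int = 30, digits: int = 6) -> str:
    """Time-based one-time password (RFC 6238): HOTP over a 30 s counter."""
    t = int(time.time()) if timestamp is None else timestamp
    return hotp(key, t // step, digits)

# RFC 6238 test vector: SHA-1, secret "12345678901234567890", time 59.
print(totp(b"12345678901234567890", timestamp=59, digits=8))  # -> 94287082
```

The token and the server share the secret key; both compute the same code for the current 30-second window, so a stolen password alone is not enough to log in.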
This is by far the biggest area of concern for most companies. Following standard industry practices, such as keeping computer software and operating systems up to date and educating and training employees on cyber-security and cyber-crime, is an effective way to mitigate these threats.
The system is considered robust: with current computing power, breaking the encryption codes used to secure data en route is estimated to take hundreds of years.
With the amount of data increasing exponentially, secure storage has become a matter of grave concern. Most data companies use the cloud to store data. Additionally, backup data may also be stored in bank vaults. For data redundancy and performance improvement, industry best practices on data storage technology such as RAID (redundant array of independent disks) are used.
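The redundancy idea behind RAID 5 can be illustrated with its XOR parity: one parity block allows any single lost data block to be rebuilt (the block contents below are made up):

```python
from functools import reduce

def xor_blocks(blocks):
    """XOR equal-length byte blocks column by column."""
    return bytes(reduce(lambda a, b: a ^ b, column) for column in zip(*blocks))

data = [b"AAAA", b"BBBB", b"CCCC"]        # blocks striped across three disks
parity = xor_blocks(data)                 # parity block on a fourth disk

lost = data.pop(1)                        # disk 2 fails
rebuilt = xor_blocks(data + [parity])     # XOR of survivors + parity
assert rebuilt == lost                    # the lost block is recovered
```

Real RAID implementations stripe at the block-device level and rotate the parity block across disks, but the recovery mathematics is exactly this XOR.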
A commonly asked question is: "whose data is it?" Does it belong to the ship-owner, equipment manufacturer, sensor manufacturer, data collector or automation provider? Most 'data companies' take the view that the raw data belongs to the ship-owner, who in most cases is their customer. However, standard industry practice is to use data that has been further processed, enriched or improved for R&D purposes, as long as it is sufficiently anonymized.
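One common building block of such anonymization is replacing the vessel identifier with a keyed hash before data enters an R&D dataset. A sketch, with an invented key and IMO number (note that pseudonymization alone is not always sufficient anonymization if other fields can re-identify the ship):

```python
import hashlib
import hmac

# Illustrative secret key; in practice it is kept out of the R&D environment.
SECRET = b"rotate-me-regularly"

def pseudonymize(imo_number: str) -> str:
    """Stable pseudonym: the same ship always maps to the same token,
    but the mapping cannot be reversed without the key."""
    return hmac.new(SECRET, imo_number.encode(), hashlib.sha256).hexdigest()[:16]

record = {"imo": "1234567", "speed_kn": 12.3, "fuel_t_per_day": 28.5}
record["imo"] = pseudonymize(record["imo"])   # identifier replaced by a token
```

A keyed HMAC (rather than a plain hash) prevents anyone without the key from confirming a guessed IMO number by hashing it themselves.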
Companies dealing with data in the industry are continuously evaluating the threats to data, finding new ways to improve and keep it secure.
Are these some of your common concerns?
Let me know your thoughts.
#Data #DataLifecycle #DataProprietorship #DataStorage #Cyberthreats #DataEnrichment #DataSecurity #DataCollection #DataTransfer #maritime #shipping #autonomous #unmanned #technology #insight #marine #renewables #innovation #offshore #leadership #fuel #emissions #environment #regulation #compliance #strategy #IMO #melvinmathews