Maritime AIS: Why data quality matters


For virtually any commercial business or organization (governmental or non-governmental) in the modern world, the collection of quality data is of elemental importance.

The world is a big, complicated place with many, many moving parts. Getting and keeping a handle on what’s going on is step one.

And trying to forge a path forward or make significant decisions without quality data is simply shooting in the dark.

As noted in Forbes only five short years ago:

“As organizations look to adopt the new wave of coming technologies, like automation, artificial intelligence and the Internet of Things, their success in doing so and their ability to differentiate themselves in those spaces will be dependent upon their ability to get data management right. This will become increasingly important as connected devices and sensors proliferate, causing an exponential growth in data—and a commensurate growth in opportunity to exploit the data.

Those that position their organizations to manage data correctly and understand its inherent value will have the advantage. In fact, we may see leaders pull so far in front that it will make the market very difficult for slow adopters and new entrants.”



Nothing has happened since this was written to alter that trajectory. Data-driven practices continue to move to the forefront of organizational management.

With regard to the maritime industry, every significant stakeholder now depends on automatic identification system (AIS) data to inform policymaking and manage expectations. And that data had better be accurate and of high quality.

Spire was one of the first companies to dive into this rapidly expanding realm. Now traded on the New York Stock Exchange (NYSE), Spire has eight corporate offices on three continents and more than 375 employees drawn from over 40 countries.

The methodology of good data

Quality data requires a few benchmarks to be met. It has to be comprehensive and consistent—not to mention accurate. And it has to be collected quickly enough to be exploitable in the moment at hand.
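Those benchmarks can be made concrete. As an illustration only (the record layout, field names, and thresholds below are hypothetical, not a real AIS schema), a few lines of Python can score a vessel track against the three criteria: completeness, positional validity, and timeliness:

```python
from datetime import datetime, timedelta

# Hypothetical position reports for one vessel; field names are illustrative.
reports = [
    {"mmsi": 366053209, "lat": 29.95, "lon": -90.06, "ts": datetime(2022, 6, 1, 12, 0)},
    {"mmsi": 366053209, "lat": 29.96, "lon": -90.05, "ts": datetime(2022, 6, 1, 12, 1)},
    {"mmsi": 366053209, "lat": None,  "lon": -90.04, "ts": datetime(2022, 6, 1, 12, 9)},
]

def quality_metrics(reports, max_gap=timedelta(minutes=5)):
    """Score a track: fraction of complete fixes, range-valid positions,
    and whether the reporting cadence ever exceeds max_gap."""
    complete = sum(
        r["lat"] is not None and r["lon"] is not None for r in reports
    ) / len(reports)
    valid = all(
        -90 <= r["lat"] <= 90 and -180 <= r["lon"] <= 180
        for r in reports
        if r["lat"] is not None and r["lon"] is not None
    )
    times = sorted(r["ts"] for r in reports)
    gaps = [b - a for a, b in zip(times, times[1:])]
    timely = all(g <= max_gap for g in gaps)
    return {"completeness": complete, "valid_positions": valid, "timely": timely}

print(quality_metrics(reports))
# The third fix is missing latitude and arrives after an 8-minute gap,
# so the track scores 2/3 on completeness and fails timeliness.
```

The point of such a check is not the specific thresholds but the discipline: every incoming track gets measured before it is trusted.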

Measuring the quality of data is a discipline in and of itself, one that plays a major role in the IT field, especially in the machine learning realm. Concerns span the spectrum from ensuring modeling protocols are not based on flawed data to using machine learning to fine-tune the collection of the data itself.

What’s vital is that datasets used for analysis are accurate:

“It is well understood from literature that the performance of a machine learning (ML) model is upper bounded by the quality of the data. While researchers and practitioners have focused on improving the quality of models (such as neural architecture search and automated feature selection), there are limited efforts towards improving the data quality. One of the crucial requirements before consuming datasets for any application is to understand the dataset at hand and failure to do so can result in inaccurate analytics and unreliable decisions. Assessing the quality of the data across intelligently designed metrics and developing corresponding transformation operations to address the quality gaps helps to reduce the effort of a data scientist for iterative debugging of the ML pipeline to improve model performance.”



The drive to create and capture higher-quality data is an effort that spans academic disciplines and economic sectors. The quality and management of earth observation data is an especially “hot topic,” as outlined in a paper recently presented at an Institute of Electrical and Electronics Engineers symposium entitled “A Machine Learning Approach for Data Quality Control of Earth Observation Data Management System.” 

Fishing for data on the open seas

In the maritime sector, especially when dealing with the global supply chain, comprehensiveness means … well, being global in scope. Silicon on a ship leaving Brazil’s Port of Belém en route to Taiwan’s Port of Keelung is intertwined with the manufactured computer chips that will later make their way to the Port of Los Angeles. Quality data has to capture moving parts far and wide.

It has to be available round-the-clock, since decisions need to be made now, not at some vague moment in the future, and consistent, so that it doesn’t leave gaps of ignorance in its wake. And while being steady, it has to be accurate. Whatever the end use, bad data is worse than no data: false confidence that you know what’s going on invites riskier decisions than an honest admission that you lack the data you need, which at least breeds a caution closer to wisdom.

Such reality is not simply theoretical; it can have real-world implications of the highest order. For example, three people died when two vessels towing barges, the Cooperative Spirit and the RC Creppel, collided on the Lower Mississippi River last year. The National Transportation Safety Board (NTSB) blamed the incident on “insufficient radio communication [and] not broadcasting accurate Automatic Identification System (AIS) information regarding tow size.”

According to the NTSB, the “RC Creppel’s AIS broadcast showed its length at 69 feet rather than its actual overall length of 514 feet, while the Cooperative Spirit’s AIS broadcast showed the length at 200 feet, far below its actual overall length of 1,600 feet.” The accident saw 42 barges break free, about 8,000 gallons of diesel fuel leak into the river, the release of sulfuric acid vapors, and an estimated $3 million in damages.

The accident is an example of how important quality data is in crowded shipping channels. The course, speed, rate of turn, and size of dozens of vessels have to be tracked accurately to avoid collisions and the delays that result from them. Likewise, better understanding the entirety of the modern maritime transport system demands quality data on a macro scale.
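A simple automated cross-check could flag the kind of length discrepancy the NTSB described. The sketch below is illustrative only (the function name and tolerance are assumptions, not an NTSB or AIS-standard procedure); it relies on the fact that an AIS static broadcast reports the antenna-to-bow and antenna-to-stern distances, whose sum is the vessel’s reported length:

```python
def length_mismatch(dim_a_m, dim_b_m, registry_length_m, tolerance=0.10):
    """Return True when the AIS-reported length (dimension A + dimension B,
    in meters) differs from a registry figure by more than `tolerance`
    (expressed as a fraction of the registry length). Hypothetical check."""
    reported = dim_a_m + dim_b_m
    return abs(reported - registry_length_m) > tolerance * registry_length_m

# The RC Creppel broadcast roughly 69 ft (~21 m) against an actual overall
# length of 514 ft (~157 m) -- a gap this kind of check flags immediately.
print(length_mismatch(10, 11, 157))   # reported 21 m vs. 157 m -> True
print(length_mismatch(75, 82, 157))   # reported 157 m matches   -> False
```

Comparing broadcast static data against an independent registry is a routine data-quality tactic; the hard part in practice is keeping the registry itself current.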

AIS data has quickly matured

Though not originally designed to be a data-gathering tool, AIS has over the past two decades grown into a comprehensive information-retrieval infrastructure. When first made operational in the late 1990s, terrestrial AIS (T-AIS) was a localized system for high-congestion corridors meant to prevent ship-to-ship collisions. Transceivers on ships pinged similar units on land to pinpoint vessel locations within AIS communication cells restricted to areas near land.

But the technology was moving fast. By the mid-2000s it became feasible for transceivers to communicate much farther, all the way to orbiting satellite-based receivers. This opened up a wealth of possibilities.

Satellite AIS (S-AIS) effectively expanded the range of tracking ships from about 30 nautical miles to 3,000. In short order, a fleet of low Earth orbit (LEO) satellites was deployed, enabling a comprehensive communication matrix to come into being.

How maritime AIS accuracy has improved

The rapid expansion of AIS hardware and software has led to an explosion of high-quality data. Satellite-based maritime domain awareness has been established by the relatively rapid positioning of a powerful orbiting infrastructure based on miniaturized satellite technology. For example, in the past decade Spire has launched over 100 satellites in what is now the world’s largest multi-purpose constellation, including wide placement in differing orbital planes and managed orbital regimes to provide truly global coverage.

This network provides the high-quality data that over 600 solution customers depend on. Coverage spans the entire globe, and our architecture delivers real-time data without the latency associated with store-and-forward satellite architecture. The Spire global S-AIS constellation provides global coverage with superior detection rates, dependable downlinks in near real-time, the ability to detect all AIS broadcasters (not just those mandated by the International Maritime Organization [IMO]), ramp-up capabilities to handle the rapidly expanding use of application-specific messages (ASM), and adaptability to innovative uses of the maritime VHF spectrum.

Maritime AIS continues to expand

The capabilities and breadth of AIS have grown rapidly and continue to expand. The first generation of AIS focused on large vessels above 300 gross tonnage (GT), a category now classified as Class A and mandated by the IMO. Class B AIS operates at a lower wattage and is not mandated internationally, though some governments do require it for larger local vessels, usually fishing boats.

But with the development of exactTrax, AIS coverage is now rapidly expanding to smaller vessels that can incorporate this newer lower-cost technology into their operations. Using autonomous maritime radio devices (AMRD) and solar/battery-powered AIS units, exactTrax is practical for virtually any ocean-going vessel and is unleashing a wealth of new quality data regarding local shipping routes and marine protected areas.

The vast array of ways maritime AIS data is now used

AIS has fundamentally altered the monitoring of the maritime domain in a relatively short time. It is now deployed aboard every large vessel and many smaller ones around the globe, supporting crucial applications such as improved navigation, search and rescue transmitters, man-overboard units, application-specific messages (ASM), and Emergency Position Indicating Radio Beacons (EPIRB).
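All of these applications start from the same raw material: NMEA !AIVDM sentences whose payloads pack message fields into 6-bit ASCII, as specified in ITU-R M.1371. As a minimal sketch (it handles only single-fragment sentences and skips checksum validation), the two header fields common to every AIS message type can be decoded like this, shown here on a widely published test sentence:

```python
def six_bit_string(payload):
    """Expand an AIVDM payload into its raw bit string (6-bit ASCII armoring)."""
    bits = []
    for ch in payload:
        val = ord(ch) - 48
        if val > 40:          # characters after 'W' skip 8 code points
            val -= 8
        bits.append(format(val, "06b"))
    return "".join(bits)

def decode_header(sentence):
    """Extract the message type and MMSI from a single-fragment !AIVDM sentence."""
    payload = sentence.split(",")[5]
    bits = six_bit_string(payload)
    msg_type = int(bits[0:6], 2)     # bits 0-5: message type
    mmsi = int(bits[8:38], 2)        # bits 8-37: MMSI, after the 2-bit repeat indicator
    return msg_type, mmsi

sentence = "!AIVDM,1,1,,A,15M67FC000G?ufbE`FepT@3n00Sa,0*5C"
print(decode_header(sentence))  # (1, 366053209): a Class A position report
```

Everything beyond the header (position, speed, course, dimensions) is laid out field by field in the same bit string, which is why robust production decoders lean on the full ITU-R M.1371 tables rather than hand-rolled slicing like this.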

And the wealth of data now collected is mind-blowing. Spire operates over 30 ground stations and 70 antennas in 16 countries, tracking 350 million AIS messages from some 250,000 vessels daily. In total, this forms a continuous web of quality data that creates real value for users in the governmental, commercial, and nonprofit sectors.

The Spire S-AIS constellation is now the foundation for a wide range of applications in the maritime analytics field.

Some of the specific ways Spire AIS data is being used include monitoring the sanctions currently placed on Russian economic activities due to the invasion of Ukraine, scrutinizing the functioning of the global supply chain, helping increase the efficiency of the shipping industry, and better preparing for the expansion of routes in Arctic waters.

These kinds of specific solutions using quality AIS data will likely continue to grow in the future. As the impact of climate change continues to broaden, pressure will grow to minimize the carbon and sulfur emissions of commercial vessels via better routing. Quality data will be a vital tool in pursuing better energy efficiency performance.

As Sweden’s SSPA notes, “For future research, it is proposed that AIS data could be useful in studies about the spread of marine litter, shallow water effects, fuel consumption, the fill rate of cargo ships, or how political and historical events such as Brexit affect maritime traffic.”

Spire is an industry leader

Since being founded, Spire has developed advanced capabilities to provide the most accurate and reliable AIS data possible and has delivered proven value across a wide range of applications. We now operate the world’s largest maritime listening constellation and have already undertaken two launch campaigns in 2022.

Quality data is critical to the many clients that depend on Spire to make crucial decisions. Providing accurate and reliable information not only drives optimal operations for our clients, but also expands what analytics can accomplish in the face of challenges like climate change and supply chain interruptions. Over the long haul, it is part of a rising tide of transparency, safety, and compliance that will raise all boats.
