Publication Date: 2023-06-14

Approval Date: 2023-02-23

Submission Date: 2023-02-01

Reference number of this document: OGC 23-010

Reference URL for this document: http://www.opengis.net/doc/PER/FMSDI3

Category: OGC Public Engineering Report

Editors: Robert Thomas, Sara Saeedi

Title: Towards a Federated Marine SDI: Connecting Land and Sea to Protect the Arctic Environment Engineering Report


OGC Public Engineering Report

COPYRIGHT

Copyright © 2023 Open Geospatial Consortium. To obtain additional rights of use, visit http://www.opengeospatial.org/

WARNING

This document is not an OGC Standard. This document is an OGC Public Engineering Report created as a deliverable in an OGC Interoperability Initiative and is not an official position of the OGC membership. It is distributed for review and comment. It is subject to change without notice and may not be referred to as an OGC Standard. Further, any OGC Public Engineering Report should not be referenced as required or mandatory technology in procurements. However, the discussions in this document could very well lead to the definition of an OGC Standard.

LICENSE AGREEMENT

Permission is hereby granted by the Open Geospatial Consortium, ("Licensor"), free of charge and subject to the terms set forth below, to any person obtaining a copy of this Intellectual Property and any associated documentation, to deal in the Intellectual Property without restriction (except as set forth below), including without limitation the rights to implement, use, copy, modify, merge, publish, distribute, and/or sublicense copies of the Intellectual Property, and to permit persons to whom the Intellectual Property is furnished to do so, provided that all copyright notices on the intellectual property are retained intact and that each person to whom the Intellectual Property is furnished agrees to the terms of this Agreement.

If you modify the Intellectual Property, all copies of the modified Intellectual Property must include, in addition to the above copyright notice, a notice that the Intellectual Property includes modifications that have not been approved or adopted by LICENSOR.

THIS LICENSE IS A COPYRIGHT LICENSE ONLY, AND DOES NOT CONVEY ANY RIGHTS UNDER ANY PATENTS THAT MAY BE IN FORCE ANYWHERE IN THE WORLD. THE INTELLECTUAL PROPERTY IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE, AND NONINFRINGEMENT OF THIRD PARTY RIGHTS. THE COPYRIGHT HOLDER OR HOLDERS INCLUDED IN THIS NOTICE DO NOT WARRANT THAT THE FUNCTIONS CONTAINED IN THE INTELLECTUAL PROPERTY WILL MEET YOUR REQUIREMENTS OR THAT THE OPERATION OF THE INTELLECTUAL PROPERTY WILL BE UNINTERRUPTED OR ERROR FREE. ANY USE OF THE INTELLECTUAL PROPERTY SHALL BE MADE ENTIRELY AT THE USER’S OWN RISK. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR ANY CONTRIBUTOR OF INTELLECTUAL PROPERTY RIGHTS TO THE INTELLECTUAL PROPERTY BE LIABLE FOR ANY CLAIM, OR ANY DIRECT, SPECIAL, INDIRECT OR CONSEQUENTIAL DAMAGES, OR ANY DAMAGES WHATSOEVER RESULTING FROM ANY ALLEGED INFRINGEMENT OR ANY LOSS OF USE, DATA OR PROFITS, WHETHER IN AN ACTION OF CONTRACT, NEGLIGENCE OR UNDER ANY OTHER LEGAL THEORY, ARISING OUT OF OR IN CONNECTION WITH THE IMPLEMENTATION, USE, COMMERCIALIZATION OR PERFORMANCE OF THIS INTELLECTUAL PROPERTY.

This license is effective until terminated. You may terminate it at any time by destroying the Intellectual Property together with all copies in any form. The license will also terminate if you fail to comply with any term or condition of this Agreement. Except as provided in the following sentence, no such termination of this license shall require the termination of any third party end-user sublicense to the Intellectual Property which is in force as of the date of notice of such termination. In addition, should the Intellectual Property, or the operation of the Intellectual Property, infringe, or in LICENSOR’s sole opinion be likely to infringe, any patent, copyright, trademark or other right of a third party, you agree that LICENSOR, in its sole discretion, may terminate this license without any compensation or liability to you, your licensees or any other party. You agree upon termination of any kind to destroy or cause to be destroyed the Intellectual Property together with all copies in any form, whether held by you or by any third party.

Except as contained in this notice, the name of LICENSOR or of any other holder of a copyright in all or part of the Intellectual Property shall not be used in advertising or otherwise to promote the sale, use or other dealings in this Intellectual Property without prior written authorization of LICENSOR or such copyright holder. LICENSOR is and shall at all times be the sole entity that may authorize you or any third party to use certification marks, trademarks or other special designations to indicate compliance with any LICENSOR standards or specifications.

This Agreement is governed by the laws of the Commonwealth of Massachusetts. The application to this Agreement of the United Nations Convention on Contracts for the International Sale of Goods is hereby expressly excluded. In the event any provision of this Agreement shall be deemed unenforceable, void or invalid, such provision shall be modified so as to make it valid and enforceable, and as so modified the entire Agreement shall remain in full force and effect. No decision, action or inaction by LICENSOR shall be construed to be a waiver of any rights or remedies available to it.

None of the Intellectual Property or underlying information or technology may be downloaded or otherwise exported or reexported in violation of U.S. export laws and regulations. In addition, you are responsible for complying with any local laws in your jurisdiction which may impact your right to import, export or use the Intellectual Property, and you represent that you have complied with any regulations or registration procedures required by applicable law to make this license enforceable.

Table of Contents

1. Executive Summary

The Federated Marine Spatial Data Infrastructure (FMSDI) Pilot Phase 3 is an OGC Collaborative Solutions & Innovation (COSI) initiative with the objectives of enhancing Marine Spatial Data Infrastructures (MSDIs) through a better understanding of MSDI maturity and of learning more about the current capabilities and shortcomings of the marine data services offered by MSDIs. The Pilot focused on advancing the implementation of open data standards, architecture, and prototypes for use with the creation, management, integration, dissemination, and onward use of marine and terrestrial data services for the Arctic.

The OGC FMSDI Phase 3 initiative builds directly on what was accomplished between September 2021 and July 2022 through the Federated Marine Spatial Data Infrastructure Pilot Phases 1 & 2. These pilots in turn built on the work of prior initiatives, such as the Marine Spatial Data Infrastructure Concept Development Study, the Maritime Limits and Boundaries Pilot, and the Arctic Spatial Data Pilot. The latter plays an important role for this pilot because it pursued comparable goals and motivations.

The Marine Spatial Data Infrastructure Concept Development Study Engineering Report (ER)[1] summarized the efforts and information gathered from a Request for Information which focused on in-depth data requirements, architecture, and standards needs for a Marine Spatial Data Infrastructure. The Maritime Limits and Boundaries Pilot ER[2] worked to build a detailed implementation for testing IHO S-121 Standard data. The Arctic Spatial Data Infrastructure Pilot[3] aimed to utilize international standards to support a spatial data exchange focusing on the complex issues of Arctic marine space. FMSDI Pilot Phases 1 and 2 focused on Marine Protected Areas (MPAs) and on further advancing the interoperability and usage of MPA data by implementing the IHO S-122 standard and several OGC API standards through a federation of MPA data from the various countries interested in the Baltic/North Sea region[4]. The third phase of the FMSDI pilot, which started in July 2022, focused on land/sea use cases and extended the use cases developed in the second phase by adding the Arctic region as a new location for the demonstration scenarios.

A goal of Phase 3 of the FMSDI Pilot was to demonstrate to stakeholders the value of implementing and leveraging open data standards, architecture, and prototypes for use with the creation, management, integration, dissemination, and onward use of marine and terrestrial data services for the Arctic. Through a variety of use cases and scenarios set in the Arctic off the western coast of Alaska, the Pilot demonstrated how accessing and publishing data according to the FAIR Data Principles (Findable, Accessible, Interoperable, Reusable) can improve decision making during a maritime emergency.

This pilot addressed a variety of questions uniquely related to MSDIs. A few of these questions were as follows.

  • Are stakeholders discovering and obtaining the right data?

  • What do stakeholders have and what is missing?

  • Do the available standards and interfaces to these data work?

  • Do stakeholders have the right tools?

The Pilot Phase 3 also prototyped the proposed OGC API - Discrete Global Grid System (DGGS) in performing analytics related to coastal flooding and erosion, focused on the integration of both terrestrial and marine spatial information. In addition, this Pilot included an open survey with the objective of gathering additional information to better support the future development of FMSDI pilots. These ongoing pilots will aid in unlocking the full societal and economic potential of the wealth of marine data at local, national, regional, and international levels. The survey also provided information and insight on the current state of MSDIs.

1.1. Technical Overview

The Pilot Phase 3 activities were divided amongst seven individual participants according to the component they were providing, either a client or a server, and the approved sub-scenario each participant developed. Each participant performed a thorough execution of all phases of data discovery, access, integration, analysis, and visualization appropriate to their component. Participants collaborated and supported each other with data, processing capacities, and visualization tools. An overarching goal was to learn more about the current capabilities and possible shortcomings of marine data services offered by all Marine Spatial Data Infrastructures, Web portals, and directly accessible cloud native data in the Arctic area. The seven participants implemented separate components, four clients and four servers in total, as follows:

  • Two universal Clients (D100 and D101): These client services demonstrated different viewpoints and methods for digesting the data from different servers and standardized data.

  • Two Data Fusion Servers (D103 and D104): These servers ingested various data inputs, including S-100 data. Both were implemented using the OGC API - Features standard.

  • An independent DGGS Client (D102) and an independent DGGS fusion server (D105): The client ingested the outputs from the DGGS server and analyzed and presented the data and results to end users using the draft OGC API - DGGS.

  • An integrated, real-time data fusion server (D107) and client (D106): One processing server that ingested data from a variety of providers.

Phase 3 included an overarching, sea-based health and safety scenario incorporating the land/sea interface in the Arctic. This scenario demonstrated the technology and data used with OGC, IHO, and other community standards in response to a grounding event and the evacuation of a cruise ship or research vessel in the Arctic. Participants were responsible for developing and demonstrating sub-scenarios that showed how data can be discovered, accessed, used and reused, shared, processed, analyzed, and visualized. Each sub-scenario demonstrated what is currently possible and what gaps remain with the resources that can be discovered on the Internet.

Participants' contributions included, but were not limited to, the following:

  • Demonstrating how FAIR data principles, through the use of OGC APIs and IHO standards, can be effectively used to facilitate rescue operations in the Arctic, including oil spill tracking and remediation;

  • Demonstrating the interoperability between land and marine data that is necessary to understand coastal erosion;

  • Demonstrating the OGC Discrete Global Grid Systems (DGGS) API use case in the Arctic;

  • Providing access to land-based emergency services/resources (e.g., Coast Guard stations, transit times to emergency services or ports, medical facilities and resources, helicopter access);

  • Providing access to coastal environmental/topographic/hydrographic/maintenance data;

  • Providing access to Global Maritime Traffic data used in the Arctic; and

  • Providing access to voyage planning information (e.g., the Arctic Voyage Planning Guide).

In addition to marine data, the sub-scenarios included elements coming from the land side. This is particularly interesting because land-sea use cases often require the integration of data from multiple organizations, with each organization potentially limiting its view to one side of the land-sea transition.

1.2. Key Conclusions and Recommendations

The Pilot Phase 3 encountered many challenges, leading to a number of conclusions and recommendations, a summary of which is presented here. The key deliverables were participant demonstrations of OGC and IHO standards highlighting interoperability between a selection of clients and servers. In addition to the deliverables from participants, an online user survey was conducted to gather insights from the marine community.

These conclusions and recommendations are summarized below. A more comprehensive description is presented in Chapter 9, Challenges and Lessons Learned, and Chapter 10, Recommendations and Future Work.

1.2.1. Phase 3 User Survey

During the course of the Pilot an online user survey was conducted. An analysis of the responses to the survey tended to show the following.

  • Partnerships will be required to create a successful FMSDI.

  • Most of the participants found the data within their MSDI to be least findable in contrast to the other FAIR principles.

  • The marine community appears to be an avid user of standards.

  • By implementing interoperability between OGC and IHO standards, it is possible to increase the number of IHO S-100 datasets made available to the public.

  • Although data-oriented use cases made up the majority of the suggested use cases, an MSDI should also support real end-user applications using land-sea data integration.

1.2.2. Summary

There were many challenges leading to lessons learned from the execution of the pilot. These include the following.

  • Data Availability - Phase 3 of the pilot demonstrated the lack of data in the subject area. There was little real-time data available, so these data were simulated for the pilot. It was also noted by participants that there was limited support for data conforming to either S-100 or OGC API models. It is recommended that future scenarios occur in an AOI where a wider variety of datasets is available. It is also recommended that, moving forward, developing and publishing implementation guides be part of future Marine pilot activities.

  • Arctic Voyage Planning Guide (AVPG) - The AVPG provided a significant amount of data through much of the Arctic region, but only a small subset of the layers contained data in the subject area. As this is a developing framework, continued investigation of the effective use of the AVPG in future phases of the FMSDI is recommended.

  • Land / Sea Interface - In the area that connects the land and sea domains, it is crucial for datasets to meet and integrate effectively. However, datasets from these two domains do not always achieve this, particularly when data has been captured at different times, with different coordinate reference frames, to different standards and data models, or to varying levels of scale, precision, and accuracy. It is recommended that future phases investigate how a concrete methodology, using existing best practices, could be published for resolving land / sea interoperability issues.

  • DDIL (Denied, Disrupted, Intermittent, and Limited Bandwidth) Environments - Given the challenging connectivity of the Arctic environment, all the more important when dealing with emergency and disaster situations, it is recommended that further investigation be conducted on how to optimize the retrieval and storage of marine and terrestrial feature collections as a GeoPackage using supported OGC and IHO file encoding standards (see the sketch following this list).

  • OGC Standards

    • Using OGC API — Features to serve Federated Marine Data. An attractive element of the API model for data producers, particularly of marine data, is the retrieval of data directly from the authoritative source. For this phase, OGC WMS and OGC API — Features were compared using the same data sources. It was found that OGC WMS provided the benefit of uniform styling and legends across all clients, which was not available through OGC API — Features. However, OGC APIs may provide a much faster response time due to the possibly smaller size of the JSON responses. It is recommended that providing a consistent portrayal from OGC API — Features be further investigated.

    • The issue of security (authentication and authorization) has not been explored to its full extent and should be explored further as much of the marine community uses restricted datasets.

    • Using Draft OGC API — DGGS - In this phase, one of the first OGC API — Discrete Global Grid Systems TIEs was conducted between different software solutions, each implementing very different Discrete Global Grid Systems. A lesson to take away is that although hexagons offer uniquely interesting sampling characteristics (the hexagon being the regular polygon tiling the plane that is closest to a circle), they add a significant amount of complexity compared to square or rhombus tiles. During the process, the OGC API — Tiles standard was used to encode and transfer DGGS data. This phase of the pilot showed that the draft candidate OGC API — DGGS standard is on solid ground, and that the candidate OGC API — Processes — Part 3: Workflows and Chaining standard nicely complements these capabilities by allowing the description of workflows integrating multiple OGC API data sources and processes, regardless of where they originate. It is therefore recommended that future work address these opportunities and challenges.

  • IHO Standards - The pilot demonstrated how the S-100 General Feature Model (GFM) can represent multiple different datasets, for different purposes, in a search/rescue context. The ability to integrate such APIs together to form a common endpoint for users, and the ability for users to ingest OGC API endpoints, are high priorities. Therefore, it is recommended that these be pursued further in future phases.
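
The following minimal sketch illustrates the DDIL caching pattern referenced above: while connectivity is available, feature collections are pulled from an OGC API - Features endpoint and stored locally as layers of a single GeoPackage for offline use. The server URL and collection identifiers are hypothetical placeholders; only the /collections/{collectionId}/items access path comes from the OGC API - Features standard.

import geopandas as gpd  # reads GeoJSON responses and writes GeoPackage files

BASE = "https://example.org/msdi"  # hypothetical OGC API - Features root

# Hypothetical collection ids; each becomes one layer in the offline cache.
# Some servers use HTTP content negotiation instead of the "f" parameter.
for collection in ("coastline", "medical-facilities", "ports"):
    url = f"{BASE}/collections/{collection}/items?f=json&limit=10000"
    layer = gpd.read_file(url)  # fetch and parse the GeoJSON feature collection
    # The GPKG driver supports multiple named layers in a single file.
    layer.to_file("arctic_cache.gpkg", layer=collection, driver="GPKG")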

1.2.3. Demonstration Videos

To view demonstration videos showing the accomplishments of the seven participants in this phase of the pilot, please click the following link to the OGC YouTube Channel.

2. Security Considerations

No security considerations have been made for this document.

3. Contributors

All questions regarding this submission should be directed to the editors or the submitters:

Name                         Affiliation                              Role
Robert Thomas                OGC                                      Editor
Sara Saeedi, Ph.D.           OGC                                      Editor
Sina Taghavikish, Ph.D.      OGC                                      Contributor
Jason MacDonald              Compusult Limited                        Contributor
Jérôme Jacovella-St-Louis    Ecere Corporation                        Contributor
Gordon Plunkett              ESRI Canada                              Contributor
Matthew Jestico              Helyx Secure Information Systems Ltd     Contributor
Jonathan Pritchard           IIC Technologies Limited                 Contributor
Bradley Matthew Battista     Tanzle, Inc.                             Contributor
Perry Peterson               University of Calgary                    Contributor
Marta Padilla Ruiz           University of Calgary                    Contributor

4. Acknowledgements

The OGC expresses its gratitude to the sponsor of this phase of the FMSDI pilot: The National Geospatial-Intelligence Agency (NGA).

Also special thanks to the sponsors of the previous phases of the FMSDI pilot: UK Hydrographic Office and Danish Geodata Agency.

The OGC further wishes to express its gratitude to all participants, to the data providers (see Appendix A: Data & Services), and to the companies and organizations that provided excellent contributions by responding to the online survey, which provided key input for this OGC Engineering Report.

5. Scope

This Engineering Report (ER) summarizes the main achievements of the Federated Marine Spatial Data Infrastructure (FMSDI) Pilot Phase 3. The Pilot focused on a variety of aspects contributing to an overarching scenario to aid in better understanding both the challenges and the potential opportunities for coastal communities, ecosystems, and economic activities in the Arctic region.

The sub-scenarios, i.e., the scenarios developed by each participant, addressed aspects of the changing Arctic landscape. These activities included the following.

  • A sea-based health and safety scenario incorporating the land/sea interface in the Arctic. This scenario demonstrates the technology and data used with OGC, IHO, and other community standards in response to a grounding event and the evacuation of an expedition cruise ship or research vessel in the Arctic, including the interoperability between land and marine data that is necessary to aid first responders and other stakeholders. This scenario incorporates, but is not limited to:

    • voyage planning information (e.g., Arctic Voyage Planning Guide, Safety of Navigation products and services, Maritime Safety Information);

    • land-based emergency services/resources (e.g., Coast Guard stations, transit times to emergency services or ports, medical facilities and resources, helicopter access);

    • coastal environmental/topographic/hydrographic/maintenance data (e.g., deposition and dredging of seafloor sediment, changes in coastline and bathymetry); and

    • global maritime traffic data in the Arctic (e.g., to help assess the likelihood of other ships responding to a ship in distress).

  • Demonstrating interoperability between land and marine data that is necessary to understand coastal erosion (e.g., ocean currents, geology, permafrost characteristics, etc.).

  • General sensitivity to climate change.

Normative References

  • [IHO S-57], IHO: IHO S-57, Transfer Standard for Digital Hydrographic Data. International Hydrographic Organization, Monaco (2000–). https://iho.int/uploads/user/pubs/standards/s-57/31Main.pdf

  • [IHO S-100], IHO: IHO Universal Hydrographic Data Model. International Hydrographic Organization, Monaco (2018–). https://iho.int/uploads/user/pubs/standards/s-100/S-100Ed%204.0.0 Clean 17122018.pdf

  • [OGC 18-062r2], OGC API — Processes, https://ogcapi.ogc.org/processes/

  • [OGC 17-069r4], OGC API — Features, https://ogcapi.ogc.org/features/

  • [OGC 20-004], OGC API — Records, https://ogcapi.ogc.org/records/

  • [OGC 20-039r2], OGC API — Discrete Global Grid System (DGGS), https://ogcapi.ogc.org/dggs/

  • [OGC 19-086r5], OGC API — Environmental Data Retrieval (EDR), https://ogcapi.ogc.org/edr/

  • [ISO 19136], ISO 19136-1:2020 Geographic information — Geography Markup Language (GML) — Part 1: Fundamentals

6. Terms, definitions and abbreviated terms

6.1. Terms and definitions

6.1.1. API

An Application Programming Interface (API) is a standard set of documented and supported functions and procedures that expose the capabilities or data of an operating system, application, or service to other applications [adapted from ISO/IEC TR 13066-2:2016].

6.1.2. DDIL

DDIL (Denied, Disrupted, Intermittent, and Limited bandwidth) describes environments where connectivity is not ideal and actions need to be taken to guarantee normal or minimal operation of software applications.

6.1.3. interoperability

Interoperability is the ability to communicate, execute programs, or transfer data among various functional units in a manner that requires the user to have little or no knowledge of the unique characteristics of those units.

6.1.4. Marine Spatial Data Infrastructure (MSDI)

MSDI is a specific type of Spatial Data Infrastructure (SDI) with a focus on the marine environment.

6.2. Abbreviated terms

AOI

Area of Interest

AVPG

Arctic Voyage Planning Guide

CAAS

Communication as a Service

CDS

Concept Development Study

CGDI

Canadian Geospatial Data Infrastructure

CGNDB

Canadian Geographical Names Database

COM

Component Object Model

COSI

Collaborative Solutions & Innovation

COTS

Commercial Off The Shelf

CRS

Coordinate Reference System

CSW

Catalogue Service for the Web

DCE

Distributed Computing Environment

DaaS

Data as a Service

DAP

Data Access Protocol

DAB

Data Access Broker

DCAT

Data Catalog Vocabulary

DCOM

Distributed Component Object Model

DDIL

Denied, Disrupted, Intermittent, and Limited bandwidth environments

DGGS

Discrete Global Grid System

DOT

Department of Transportation

EDR

Environmental Data Retrieval

EO

Earth Observation

ER

Engineering Report

FAIR

Findable, Accessible, Interoperable, and Reusable

FMSDI

Federated Marine Spatial Data Infrastructure

GEO

Group on Earth Observations

GEOINT

Geospatial Intelligence

GEOSS

Global Earth Observation System of Systems

GeoXACML

Geospatial XACML

GFM

General Feature Model

GIS

Geographic Information System

GISS

Geographic Information System Service

GML

Geography Markup Language

HDF

Hierarchical Data Format

HTTP

Hypertext Transfer Protocol

ICT

Information and Communication Technology

IDL

Interface Definition Language

IHO

International Hydrographic Organization

InaaS

Information as a Service

IoT

Internet of Things

ISO

International Organization for Standardization

JSON

JavaScript Object Notation

JSON-LD

JSON Linked Data

KML

Keyhole Markup Language

MPA

Marine Protected Area

MSDIWG

Marine Spatial Data Infrastructures Working Group

NASA

National Aeronautics and Space Administration

netCDF

network Common Data Form

NGA

National Geospatial-Intelligence Agency

NMA

Norwegian Mapping Authority

NOAA

U.S. National Oceanic and Atmospheric Administration

NRCan

Natural Resources Canada

NSDI

National Spatial Data Infrastructure

OGC

Open Geospatial Consortium

OPeNDAP

Open-source Project for a Network Data Access Protocol

OSM

OpenStreetMap

PaaS

Platform as a Service

POI

Point(s)-of-interest

RDF

Resource Description Framework

RFI

Request For Information

RFQ

Request For Quotation

SaaS

Software as a Service

SDI

Spatial Data Infrastructure

SDK

Software Development Kit

SDO

Standards Developing Organization

SLD

Styled Layer Descriptor

SOS

Sensor Observation Service

SPARQL

SPARQL Protocol and RDF Query Language

SWE

Sensor Web Enablement

SWG

Standards Working Group

TIE

Technology Integration Experiment

UN-GGIM

United Nations Committee of Experts on Global Geospatial Information Management

USGS

U.S. Geological Survey

W3C

World Wide Web Consortium

WCPS

Web Coverage Processing Service

WCS

Web Coverage Service

WFS

Web Feature Service

WMS

Web Map Service

WMTS

Web Map Tile Service

WPS

Web Processing Service

WSDL

Web Services Description Language

WxS

Web <whatever> Service

XACML

eXtensible Access Control Markup Language

7. Overview and Background

We currently experience a rapidly changing environment in the Arctic, with climate change being an important factor. Coastlines change and sea currents are affected, leading to changing climates on both land and sea, with consequences for marine food chains. Formerly frozen methane deposits in the Arctic Ocean have started to be released, while changing ice patterns make large areas of the Arctic accessible and navigable during long periods of the year. Thus, the impacts of climate change on the Arctic environment present both challenges and potential opportunities for coastal communities, ecosystems, and economic activities. With more and more data becoming available, the question is: what can we do with all of these data? Do we better understand the status quo, or changes over time? Is it the right data that we find? What do we have and what are we missing? Do the interfaces to these data work? And do we have the right tools?

Expressed more specifically, we can ask: what analytical and predictive information products can we extract from the available data? What is the role of EO frameworks in sharing multi-temporal, multi-spectral analyses as information products? How can these data streams feed into and streamline regulatory processes?

This initiative builds on what was accomplished in previous initiatives: the Marine Spatial Data Infrastructure Concept Development Study, the Maritime Limits and Boundaries Pilot, and the Arctic Spatial Data Infrastructure Pilot. The Marine Spatial Data Infrastructure Concept Development Study summarized the efforts and information gathered from a Request for Information which focused on in-depth data requirements, architecture, and standards needs for a Marine Spatial Data Infrastructure. The Maritime Limits and Boundaries Pilot worked to build a detailed implementation for testing IHO S-121 Standard data.

Phase 3 of the Federated Marine SDI (FMSDI) Pilot is focused on advancing the implementation of open data standards, architecture, and prototypes for use with the creation, management, integration, dissemination, and onward use of marine and terrestrial data services for the Arctic. The use cases developed in this phase of the FMSDI pilot further demonstrated the capabilities and use of OGC and IHO standards.

An overarching, sea-based health and safety scenario incorporating the land/sea interface in the Arctic was developed. This scenario demonstrates the technology and data used with OGC, the International Hydrographic Organization (IHO), and other community standards in response to a grounding event and the evacuation of a cruise ship or research vessel in the Arctic.

Participants developed demonstration sub-scenarios showing how data can be discovered, accessed, used and reused, shared, processed, analyzed, and visualized. Each sub-scenario demonstrates what is currently possible and what gaps are experienced with the resources that can be discovered on the Internet.

Activities included the following.

  • Demonstrate how FAIR principles can be effectively used to facilitate rescue operations in the Arctic including oil spill tracking and remediation.

  • Demonstrate interoperability between land and marine data that is necessary to understand coastal erosion (e.g., ocean currents, geology, permafrost characteristics, etc.).

  • Investigate the role of vector tiles and style sheets across the land-sea interface.

  • Inclusion of the Arctic Voyage Planning Guide (AVPG) and how standards may be used to expand the guide.

  • Demonstrate the OGC Discrete Global Grid Systems (DGGS) API use-case.

7.1. Towards an FMSDI (Initiative Overview)

The Federated Marine Spatial Data Infrastructure (FMSDI) Pilot is an OGC Collaborative Solutions & Innovation (COSI) initiative with the objective of enhancing Marine Spatial Data Infrastructures (MSDIs), better understanding MSDI maturity, and demonstrating the power of FAIR (Findable, Accessible, Interoperable, Reusable) data in the context of the marine environment.

A Marine Spatial Data Infrastructure (MSDI) is a specific type of Spatial Data Infrastructure (SDI) with a focus on the marine environment. It is not only a collection of hydrographic products but an infrastructure that promotes the interoperability of data at all levels (e.g., national, regional, and international). Similar to all SDIs, it tries to enhance the discoverability, accessibility, and interoperability of marine data. By doing so, it supports a wider, non-traditional user-base of marine data, far beyond what is typically used for navigation.

7.1.1. Previous Phases

The FMSDI pilot built on the work of prior initiatives, such as the Marine Spatial Data Infrastructure Concept Development Study, the Maritime Limits and Boundaries Pilot, and the Arctic Spatial Data Infrastructure Pilot. Currently, the FMSDI initiative includes the following three phases, which will be extended in the future.

  • Phase 1: Marine Data Availability and Accessibility Study (MDAAS): The first phase of the FMSDI initiative (Aug 2021 - Dec 2021) started with the release of a Request for Information (RFI) to help determine the data availability and accessibility of Marine Protected Areas (MPA, IHO S-122) and other marine data in the North Sea and Baltic Sea. MDAAS further helped assess the interoperability, availability, and usability of data, geospatial web services, and tools across different regions and the use of marine spatial data. MDAAS also identified gaps and helped define reference use cases and scenarios for use in future FMSDI Pilot activities.

  • Phase 2: IHO and OGC standards applied to Marine Protected Area Data: The second phase (Jan 2022 - Jun 2022) extended the MPA focus of the first phase by digging into the various data services and beginning to build out an S-122 demonstration model, including the exploration of the S-100 data specifications and how other data (terrestrial, meteorological, Earth observation, etc.) can be combined to create a more holistic view of the region of focus. In addition, phase two designed an MSDI maturity model, which provides a roadmap for MSDI development. The maturity model was derived from the United Nations Global Geospatial Information Management (UN-GGIM) Integrated Geospatial Information Framework (IGIF, or UN-GGIM IGIF).

  • Phase 3: Connecting Land and Sea to Protect the Arctic Environment: The third phase of the FMSDI initiative (Jul 2022 - Dec 2022) builds directly on what was accomplished in Phases 1 & 2 and is also related to the previous Arctic Spatial Data Pilot conducted in 2017. Phase 3 focuses on land/sea use cases and extends the use cases developed in the second phase by adding the Arctic region as a new location for the demonstration scenarios. Phase 3 advanced the implementation of open data standards, architecture, and prototypes for use with the creation, management, integration, dissemination, and onward use of marine and terrestrial data services for the Arctic. This phase included an overarching, sea-based health and safety scenario incorporating the land/sea interface in the Arctic. The scenario demonstrated the technology and data used with OGC, IHO, and other community standards in response to a grounding event and the evacuation of a cruise ship or research vessel in the Arctic.

7.1.2. IHO Standards, S-100

The IHO is a high-level, domain-specific, international intergovernmental standards organization whose member states are often represented by their national hydrographic offices. The IHO initially developed unique stand-alone standards, such as the IHO Transfer Standard for Digital Hydrographic Data (S-57), but is in the process of replacing these with standards based on the ISO Geographic information/Geomatics standards (i.e., ISO/TC 211). The transition to the new IHO Universal Hydrographic Data Model (S-100) is in progress, and much of the hydrographic data currently in use is built to S-57 and is therefore only partially suitable for use with many of the Web Service standards and APIs available from OGC.

IHO S-100, the framework standard, has been under development since the publication of its predecessor, IHO S-57, in 2001. IHO S-57, the current standard for the encoding of Electronic Navigational Charts under the SOLAS convention, is a vector-based standard developed specifically for encoding charts for safe navigation. S-100, conceived shortly afterwards, represented a much bigger step forward.

S-100 aimed to overcome many of the perceived shortcomings of the newly released S-57 standard and was defined as a much broader standard: a framework of ISO-like structures, leaving the details of content and encoding to the authors of specific product specifications which would then sit alongside the main framework. S-100 therefore had the following goals:

  • production of a standard in close alignment with the ISO 19100 framework;

  • a framework standard which defines content through individual product specifications;

  • a separation of data content from its representation in encodings;

  • fully machine-readable standards for both feature content and portrayal;

  • the location of features, their attributes, and metadata within a registry hosted at the IHO; and

  • the facility to update feature content and portrayal by end user systems.

S-100 is currently at edition 4.0.0, with edition 5.0.0 in preparation. As part of the development of S-100, two encodings for vector feature data were defined: one was ISO 8211, a compact binary format used predominantly in the encoding of vector ENC charts, and the other was the S-100 GML profile, Part 10b. The S-100 GML (Geography Markup Language) profile is a subset of GML (ISO 19136) designed to support the encoding of simple vector datasets. The profile was developed by the UK around 2013 and incorporated into S-100 as Part 10b shortly thereafter. It has since been used by various project teams within the IHO's Nautical Publications working group, as well as in other S-100 based product specification developments.

S-100 does not define data content but only provides a toolkit for its definition. Data content is defined within S-100 product specifications, which detail how data is defined, aggregated, and exchanged, along with metadata such as coverage, CRS, geometry, and encoding details. The detailed definitions of the features, attributes, and associations for an individual product specification are contained within its feature catalog and are also lodged in the IHO's geospatial registry: an ISO-compliant registry where the definitions, concepts, and details of all S-100 products are kept and reconciled by a dedicated team.
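
As an illustration of how approachable the S-100 GML profile is to consume, the sketch below lists the feature instances in a hypothetical S-122 dataset using only the Python standard library. The file name, the assumption that dataset members carry the S-122 MarineProtectedArea feature class, and the structure of the dataset are illustrative placeholders; the gml:id attribute and the GML 3.2 namespace come from ISO 19136.

import xml.etree.ElementTree as ET

GML_NS = "http://www.opengis.net/gml/3.2"  # GML 3.2 namespace (ISO 19136)

tree = ET.parse("s122_dataset.gml")  # placeholder dataset file name
for elem in tree.getroot().iter():
    local_name = elem.tag.split("}")[-1]  # strip the namespace prefix
    # MarineProtectedArea is the S-122 feature class; other product
    # specifications define their own classes in their feature catalogs.
    if local_name == "MarineProtectedArea":
        print("feature:", local_name, "gml:id:", elem.get(f"{{{GML_NS}}}id"))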

The following S-100 product specifications were used during this pilot:

  • S-101 Electronic Navigational Chart (ENC);

  • S-102 Bathymetric Surface;

  • S-104 Water Level Information for Surface Navigation;

  • S-111 Surface Currents;

  • S-122 Marine Protected Areas;

  • S-124 Navigational Warnings;

  • S-125 Marine Aids to Navigation (AtoN); and

  • others.

It is important to note that the development of IHO S-100 continued during the Pilot's progress, which is reflected in updates to the existing S-100 standard. This Pilot remains among the most up-to-date implementations of the S-100 framework and certainly one of the few with concrete reference implementations in software from multiple participating vendors.

7.1.3. OGC API Standards and the Challenges for the Marine Community

OGC's interaction with the marine community takes place mainly through its established Marine Domain Working Group (MDWG), formed within the IHO/OGC memorandum of understanding (MOU). This group was formed to address the gaps in the OGC framework within the marine domain. For many years the IHO has run an MSDI community through its MSDIWG, and the OGC MDWG works closely with the MSDIWG to cross-fertilize ideas and outline where opportunities exist to improve the ecosystem for the benefit of stakeholders.

The OGC MDWG has a focus on the S-100 framework and its broader integration into the OGC community. This process is likely to take some time, and the MLB Pilot was a significant step in exploring such interfaces. IHO and OGC standards have many common elements and both derive largely from overarching ISO standards. However, the practicalities of their use alongside each other are not always well defined, and the project has sought to explore these practical steps as much as possible.

For several years, OGC members have worked on developing a family of Web API standards for the various geospatial resource types. These APIs are defined using OpenAPI. As the OGC API standards evolve, are approved by the OGC, and are implemented by the community, the marine community can subsequently experiment with and implement them.

The following OGC API Standards and Draft Specifications were used for the development of APIs during this Pilot.

OGC API – Features: a multi-part standard that defines the capability to create, modify, and query vector feature data on the Web and specifies requirements and recommendations for APIs that want to follow a standard way of accessing and sharing feature data.
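
As a minimal sketch of the access pattern, the following Python fragment discovers the feature collections offered by a hypothetical OGC API - Features endpoint and requests features from one of them, filtered by a bounding box over the pilot's area of interest. The server URL and the collection id are invented for illustration; the /collections and /collections/{collectionId}/items paths and the bbox and limit parameters are defined by the standard.

import requests

BASE = "https://example.org/msdi"  # hypothetical server root

# Discover the feature collections offered by the server.
collections = requests.get(f"{BASE}/collections",
                           headers={"Accept": "application/json"}).json()
for c in collections.get("collections", []):
    print(c["id"], "-", c.get("title", ""))

# Fetch features from one (hypothetical) collection, filtered by a
# bounding box off the western coast of Alaska (lon/lat, WGS 84).
items = requests.get(
    f"{BASE}/collections/coast-guard-stations/items",
    params={"bbox": "-170,60,-160,68", "limit": 10},
    headers={"Accept": "application/geo+json"},
).json()
print(len(items.get("features", [])), "features returned")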

OGC API - Maps: Maps offers a modern approach to the OGC Web Map Service (WMS) standard for providing map and raster content.

OGC API – Common: Common provides those elements shared by most or all of the OGC API standards to ensure consistency across the family.

OGC API - Processes: Processes allows processing tools to be called and combined from many sources and applied to data in other OGC API resources through a simple API.
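
A hedged sketch of the execute pattern follows. The process id and its inputs are invented for illustration; the POST to /processes/{processId}/execution, the "inputs" execute-request structure, and the Prefer: respond-async header with a Location header pointing at a job status resource come from OGC API - Processes - Part 1: Core.

import requests

BASE = "https://example.org/processing"  # hypothetical server root

# Hypothetical process modeling oil spill drift from a release point.
execute_request = {
    "inputs": {
        "release_point": {"type": "Point", "coordinates": [-165.4, 64.5]},
        "duration_hours": 48,
    }
}
resp = requests.post(f"{BASE}/processes/oil-spill-drift/execution",
                     json=execute_request,
                     headers={"Prefer": "respond-async"})

# Asynchronous execution returns 201 Created and a Location header with
# a job status resource that can be polled until the job completes.
if resp.status_code == 201:
    job = requests.get(resp.headers["Location"]).json()
    print(job.get("status"))  # e.g., "accepted", "running", "successful"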

OGC API - EDR: The Environmental Data Retrieval (EDR) Application Programming Interface (API) provides a family of lightweight query interfaces to access spatio-temporal data resources by requesting data at a Position, within an Area, along a Trajectory, or through a Corridor. A spatio-temporal data resource is a collection of spatio-temporal data that can be sampled using the EDR query pattern geometries. These patterns are described in the section of the EDR standard describing the Core Requirements Class.
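
For example, a position query against a hypothetical EDR collection of surface currents could look like the sketch below. The server root, collection id, and parameter name are placeholders; coords (a WKT geometry), parameter-name, datetime, and f are query parameters defined by the EDR standard.

import requests

BASE = "https://example.org/edr"  # hypothetical server root

resp = requests.get(
    f"{BASE}/collections/surface-currents/position",  # hypothetical collection
    params={
        "coords": "POINT(-165.4 64.5)",  # WKT position near Nome, Alaska
        "parameter-name": "surface_current_speed",  # hypothetical parameter
        "datetime": "2022-10-01T00:00:00Z/2022-10-02T00:00:00Z",
        "f": "CoverageJSON",
    },
)
print(resp.status_code, resp.headers.get("Content-Type"))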

OGC API – Tiles: This API defines how to discover which resources offered by the Web API can be retrieved as tiles, how to retrieve metadata about the tile set (including the supported tile matrix sets and the limits of the tile set inside the tile matrix set), and how to request a tile.
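
The sketch below computes which tile covers a point of interest at a given zoom level of the common WebMercatorQuad tile matrix set and assembles a request URL following the tiles path template. The server and collection are hypothetical; the /collections/{collectionId}/tiles/{tileMatrixSetId}/{tileMatrix}/{tileRow}/{tileCol} template comes from the standard, and the index math is the standard Web Mercator tiling calculation (valid only below roughly 85.05 degrees of latitude, which still covers this pilot's area of interest).

import math

def web_mercator_tile(lon: float, lat: float, level: int):
    """Return (tileRow, tileCol) in WebMercatorQuad for a WGS 84 point."""
    n = 2 ** level  # number of tiles along each axis at this level
    col = int((lon + 180.0) / 360.0 * n)
    row = int((1.0 - math.asinh(math.tan(math.radians(lat))) / math.pi) / 2.0 * n)
    return row, col

row, col = web_mercator_tile(-165.4, 64.5, 8)  # near Nome, Alaska
print(f"https://example.org/msdi/collections/bathymetry/tiles"
      f"/WebMercatorQuad/8/{row}/{col}")  # hypothetical endpoint and collection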

OGC SensorThings API: Provides an open, geospatial-enabled, and unified way to interconnect Internet of Things (IoT) devices, data, and applications over the Web.
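
A live feed such as an ocean-monitoring buoy could be read as in the sketch below. The service root and entity id are placeholders; the Things, Datastreams, and Observations entities and the OData-style query options ($top, $expand, $orderby) are defined by the SensorThings API standard.

import requests

ROOT = "https://example.org/sta/v1.1"  # hypothetical SensorThings service root

# List a few Things (e.g., buoys) registered on the service.
things = requests.get(f"{ROOT}/Things", params={"$top": 5}).json()
for thing in things.get("value", []):
    print(thing["@iot.id"], thing.get("name"))

# Latest observations of the Datastreams of Thing 1 (placeholder id).
datastreams = requests.get(
    f"{ROOT}/Things(1)/Datastreams",
    params={"$expand": "Observations($orderby=phenomenonTime desc;$top=3)"},
).json()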

Draft OGC API - DGGS: This draft API enables applications to organize and access data arranged according to a Discrete Global Grid System (DGGS).
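
The fragment below sketches the zone-oriented access pattern of the draft, hedged heavily because the specification was still evolving during the pilot and its paths and response structures may since have changed. The server, collection, DGGRS id, and the assumption that the zone listing returns a zones array are all illustrative placeholders.

import requests

BASE = "https://example.org/msdi"  # hypothetical server root

# List zones of a (hypothetical) ISEA3H grid intersecting the area of
# interest, then retrieve the data quantized to the first zone returned.
zones = requests.get(
    f"{BASE}/collections/sea-ice/dggs/ISEA3H/zones",
    params={"bbox": "-170,60,-160,68"},
).json()
zone_ids = zones.get("zones", [])  # assumed response structure
if zone_ids:
    data = requests.get(
        f"{BASE}/collections/sea-ice/dggs/ISEA3H/zones/{zone_ids[0]}/data")
    print(data.status_code, data.headers.get("Content-Type"))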

Draft OGC API Coverages: Coverages allows discovery, visualization and query of complex raster stacks and data cubes.

Draft OGC API – Styles: This draft API specifies building blocks for OGC Web APIs that enable map servers, clients, and visual style editors to manage and fetch styles.

The development and adoption of these APIs is likely to have a significant impact on how organizations like the IHO engineer systems which seek to be interoperable with OGC standards. The move to an API-based, interconnecting system of standards clarifies the dividing line between content, its expression within a technical encapsulation (its encoding), and the transport of that data to its ultimate destination.

The IHO, similarly, is instigating a major drive towards implementation of IHO S-100. The standard, many years in the making, is already key to many activities and initiatives in the marine domain. Of particular note is the IMO's e-Navigation initiative, for which S-100 forms the Common Maritime Data Structure (CMDS). The IHO is embarking on a push to get S-100 accepted as equivalent for the carriage of charts and publications under the global SOLAS convention: a large undertaking and one which will embed its use in live vessel navigation for many years. As a dynamic standard which relies on the creation of product specifications, S-100 is wholly dependent on its ability to be interoperable with other standards frameworks and to remain current with those frameworks.

This project is intended to contribute positively to the interface between the OGC APIs and the IHO and to assist the productionization of S-100 as part of the implementation roadmap.

7.1.4. Arctic Voyage Planning Guide (AVPG)

Where possible, participants incorporated the Arctic Regional Marine Spatial Data Infrastructures Working Group's (ARMSDIWG) Arctic Voyage Planning Guide (AVPG). The AVPG is intended as a strategic planning tool and a compilation of data and services for national and international vessels traveling in the Arctic. The guide is currently in its development stage; an example of a Canadian implementation is shown in Appendix B. Participants attempted to utilize data from the list of themes and content below while identifying gaps and making recommendations to improve the federation of data services required for usable voyage planning in the Arctic.

The Arctic Voyage Planning Guide contains the following themes and content.

  • Theme 1 – Carriage Requirements

    • Navigational Warnings Services

    • Radio Aids to Navigation

    • List of Lights and Buoys and Aids to Navigation

    • Nautical Charts and Publications services

  • Theme 2 – Regulatory Requirements

    • Acts and Regulations specific to marine navigation (similar to S-49 E.3.2)

    • IMO Guidelines for Operating in Polar Waters

  • Theme 3 – Arctic Environment Considerations

    • Communities and Populated Areas Information

    • Weather Station Locations and Services Available (similar to S-49 E.4.2 and U.4)

    • Airports and Hospitals

    • Resource Development Significant Locations

  • Theme 4 – Route Planning

    • Traditional Traffic Routes (similar to S-49 E.3.2)

    • Controlled Navigational Areas including Vessel Traffic Services Zones

    • Limiting Depth For Constricted Waterways

    • Tide, Current and Water Level information (similar to S-49 U.6.1)

    • Environment Protected Areas

    • Major Aids to Navigation (similar to S-49 E.1.2 and U.1.2)

    • Places of refuge or Pilot Boarding Stations (similar to S-49 E.1.5)

    • Calling-in Points (similar to S-49 E.4.1)

  • Theme 5 – Reporting and Communicating

    • Areas of Legislative Importance to Navigation

    • Marine Communication Services (similar calling-in info to S-49 E.4.1)

  • Theme 6 – Marine Services

    • Ice Breaking Support Services

    • Search and Rescue Support Services

    • Ice Services Information (similar to S-49 U.6.4)

  • Theme 7 – Nautical Charts and Publication

    • Nautical Chart Catalog and Coverage

    • Publication Catalog and Coverage

7.2. Pilot Execution Process

All participants were invited to suggest sub-scenarios and/or modifications to the overall scenario. During the pilot execution phase, each participant performed a thorough execution of all phases of data discovery, access, integration, analysis, and visualization. Participants collaborated and supported each other with data, processing capacities, and visualization tools. The overarching goal was to learn more about the current capabilities and shortcomings of marine data services offered by all Marine Spatial Data Infrastructures, Web portals, and directly accessible cloud native data. The following graphic illustrates the pilot development process.

Figure 1. FMSDI Phase 3 execution process

7.3. ER Chapter Overview

The remainder of this document is summarized below.

Chapter 5: A Survey on User Community Needs

This chapter summarizes the results of a web-based survey to gather and identify the requirements and use cases of a regional/international MSDI from a user community perspective.

Chapter 6: Research Objectives and Technical Architecture

This chapter describes the motivations that guided this Pilot’s work, the research objectives, and the component architecture that was demonstrated to address this Pilot’s goals.

Chapter 7: Overarching Master Scenario

This chapter describes the overarching master scenario: a sea-based health and safety scenario at the land/sea interface in the Arctic that framed the sub-scenarios developed by the participants.

Chapter 8: Components and sub-scenarios

This section describes the four servers and four clients used within this pilot.

  • Section a: Data Servers and Services: This section describes the Data Fusion Servers: the components designed to ingest various datasets, combine them, and serve them through APIs built using the OGC API standards. These components were demonstrated by IIC Technologies, Compusult, the University of Calgary, and Tanzle.

  • Section b: Data Clients and Visualization: This section describes the Data Clients and Visualization components: the components designed to access, combine, and visualize datasets served through APIs built using the OGC API standards. These clients were demonstrated by ESRI Canada, Helyx, Ecere, and Tanzle.

Chapter 9: Challenges and Lessons Learned

This section outlines a prescriptive list of challenges and lessons learned encountered through the different stages of the initiative. The section also includes recommendations for the various standards utilized through the initiative.

Chapter 10: Recommendations for Future Work

This section outlines a descriptive list of various items that could be expanded upon in future initiatives or for the sponsors to utilize and build from.

8. A Survey on User Community Needs

As part of FMSDI Phase 3, an on-line user survey was conducted. A blank copy of the online survey is included in Appendix D.

The survey on the Federated Marine Spatial Data Infrastructure (FMSDI) user community was released in October 2022 to gather and identify the requirements and use cases of a regional/international MSDI. The results of the survey will help shape the OGC's future FMSDI pilot activities and serve the user community's needs better.

The survey was made available online on October 27, 2022, and the results were finalized by December 2022.

8.1. Questions & Response Summaries

A total of ten questions were asked in four categories. A summary of the responses is provided in the following sections.

Question 1: Is your organization aware of marine spatial data infrastructures (MSDIs)?

Of the 35 replies, 91% of the survey participants were aware of marine spatial data infrastructures and only 9% were unaware of MSDIs.

Question 2: Is your organization aware of the concept of a Federated MSDI?

Out of 35 respondents, 80% were aware of the concept of a Federated MSDI, while only 20% were not aware of the concept of the Federated MSDI.

Question 3: What is the overall role of your organization in a federated marine spatial data infrastructure?

Most of the 35 participants had multiple roles: 60% data user/analyst, 57% data provider/enabler, 51% data producer/owner, and 23% other roles (Figure 2). The other roles included human health aspects, system research, creating Artificial Intelligence environments, GIS and SDI technologies, software solutions (data integration, data visualization, server, client, analytics), supporting standards development, data architecture, and technology development.

Figure 2. The summary of the answers for the role of the 35 respondents in federated marine spatial data infrastructures.

Question 4: What activities is your organization involved in within the marine domain?

The 35 participants reported multiple activities within their organizations. Figure 3 shows that Research had the highest percentage at 57%, followed by Science at 54%, Hydrography at 49%, Government at 43%, Oceanography at 34%, Transportation at 31%, Emergency Management at 29%, Resource Management at 26%, and Marine Biology at 17%. Only 11% had activities beyond this list, including human health in FMSDI, natural resource development, stakeholder engagement, learning and research, and navigation services.

Figure 3. The summary of the 35 responses received to the question regarding the participants’ activities within the marine domain.

Question 5: Do you use any of these spatial/marine standards?

Only three participants did not specify the standards that they use. Of the 31 remaining responses, the majority use OGC and IHO standards equally (22 participants each), followed by ISO standards (20 participants). Figure 4 also shows the other standards mentioned by the participants, including IALA, W3C, NENA, INSPIRE, IEC, WMO, and FGDC. Figure 5 furthermore illustrates the combinations of standards used by each participant, where the largest group is the OGC, ISO, and IHO combination at 27%.

Figure 4. The summary of the 34 responses received to the question regarding the spatial/marine standards used by the participants.
Figure 5. Combination of standards used by the 34 participants

Question 6: Do you currently use an MSDI (Marine Spatial Data Infrastructure)?

Of the 35 responses received, 66% of respondents currently use an MSDI, 28% do not, and 6% do not use an MSDI but have a data service or software package that could contribute to an SDI.

Question 7: If you answered 'Yes' to the previous question (#6), does the MSDI that you are using meet your needs?

Of the 23 responses (Figure 6), 26% of participants' MSDIs meet their needs, 18% do not, 48% answered maybe, 4% not always, and 4% partially.

Figure 6. The summary of the 23 responses received to the question regarding whether their MSDI meets their needs.

Question 8: Please rank each of the FAIR (Findable, Accessible, Interoperable and Reusable) Data Principles as it applies to data and services provided in the MSDI you contribute to or access (5-star rating system, with a 5-star being the best score).

Figure 7 summarizes the responses received from 30 participants. Although the answers are not identical for any of the four principles, the lowest score (i.e., 1) had the lowest response percentage, showing that most of the participants had, to some extent, faith in their own MSDI being Findable, Accessible, Interoperable, and Reusable. Participants found their MSDIs least Findable, with scores of 1 and 2 accumulating to 37%. The majority find their MSDIs Reusable and Accessible, at 53% and 50% respectively (i.e., the sum of scores 4 and 5), and, to a lesser extent, Interoperable (43%).

Figure 7. The summary of the 30 responses received to the question regarding ranking each of the FAIR (Findable, Accessible, Interoperable and Reusable) Data Principles as it applies to data and services provided in the MSDI that participants contribute to or access (5-star rating system, with a 5-star being the best score)

Question 9: Please rank the UN-GGIM Integrated Geospatial Information Framework (IGIF) Strategic Pathways, on the basis of needed improvement to enable federation of the MSDIs you contribute to or access, from your perspective

Figure 8 summarizes the 26 responses received across nine categories: Governance and Institutions, Policy and Legal, Financial, Data, Innovation, Standards, Partnership, Capacity and Education, and Communication and Engagement. The participants identified Partnership (48%) as the area that needs the most improvement, followed by Data and Governance and Institutions (42%).

Figure 8. The summary of 26 responses received for ranking the UN-GGIM Integrated Geospatial Information Framework (IGIF) Strategic Pathways on the basis of needed improvement to enable federation of the MSDIs participants to contribute to or access, from their perspective.

Question 10: Please provide applicable high-level use case(s), from your point of view, that would best exercise a federated MSDI.

From the 24 responses, data-related use cases had the highest interest among the participants, though there was no consensus on any specific data use case, with topics including data distribution and data access. As Figure 9 presents, the monitoring, predictive model, and port-related use cases each had 21%. The suggested monitoring use cases focused on coasts and biodiversity, while the predictive model use cases focused on models for an MSDI to predict future potential impacts. The port-related use cases included port creation, port and harbor management, and integrated coastal zone management.

Environmental and ecosystem-related use cases (e.g., water temperature, toxic materials in water) followed at 17%; this category included climate change and Marine Protected Areas (MPAs). Impact analysis also constituted 17% of participants' use cases, with topics such as infrastructure and supply chains. Cutting-edge technologies, such as predictive digital twins and virtual environments, likewise comprised 17%, as did spatial topics such as marine GIS, marine cadastre, and marine spatial planning.

Offshore renewable energy, sea level rise, legislative and regulatory information on maritime boundaries, e-Navigation, and nautical charts were other categories of use cases that each accounted for 13%. Disaster response, land/sea interface, and economy-related use cases each accounted for 8%.

Survey 8
Figure 9. Summary of the high-level use cases that participants felt would best exercise a federated MSDI, grouped by the number of participants who mentioned them

8.2. User Survey Conclusions

The need for partnership in the FMSDI is clear. Most participants found their MSDI to be least Findable relative to the other FAIR principles; in terms of data, the question almost always begins with either an exploratory "What is here?" or a search for "Where is it?". This may be another angle on why Partnership emerged as the main area needing improvement in global initiatives such as the UN-GGIM IGIF strategic pathways.

The marine community appears to be an avid user of standards, and the need for interoperability between IHO and OGC standards is once again highlighted. It should be noted that few use cases explicitly mentioned interoperability between OGC- and IHO-compliant products and datasets; however, implementing interoperability between OGC and IHO makes it possible to bring more IHO S-100 datasets to the public.

Although data-oriented use cases made up the majority of suggestions, it was clear that data itself has become less of an issue and access to analysis is coming into focus. The use cases were mostly focused on real end-user applications.

Research Objectives and Technical Architecture

This Pilot was the third phase of a broader initiative. This chapter describes the motivations that guided the Pilot’s work and the component architecture that was demonstrated to address the Pilot’s goals.

The goal of this pilot is to explore the current data basis, the accessibility of these data, the available processing capacities, analytical components, and visualization options that help to better understand the opportunities and challenges described above. This pilot explores the data integration and analysis layer and demonstrates information products, in particular for senior decision makers.

It is widely recognized that standards play an essential role in data integration and processing. The FAIR principles (Findable, Accessible, Interoperable, and Reusable) are key to integrating data from multiple sources and owners into analytical environments. This pilot explores how geospatial data, served at standardized interfaces, can be combined with other data and live spatial data feeds. These feeds, e.g., from ocean monitoring buoys or weather sensors, represent a key dimension of a Digital Arctic. A significant amount of data in the marine environment has been produced with navigation as the primary goal. Is that data fit for other purposes, such as search and rescue, environmental protection, or economics? What new purposes and use cases can be built with the data already available?
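As a concrete illustration of such a live feed, the short Python sketch below polls recent observations from an OGC SensorThings API endpoint of the kind a buoy network might expose. This is a minimal sketch: the base URL and entity contents are hypothetical, and only the SensorThings v1.1 resource paths and query parameters are taken from the standard.

# Minimal sketch: polling recent buoy observations from a hypothetical
# OGC SensorThings API v1.1 endpoint.
import requests

BASE = "https://example.org/sta/v1.1"  # hypothetical SensorThings endpoint

# List a few Things (e.g., ocean monitoring buoys), then walk their
# Datastreams down to the latest Observation of each.
things = requests.get(f"{BASE}/Things", params={"$top": 3}).json()["value"]
for thing in things:
    streams = requests.get(
        f"{BASE}/Things({thing['@iot.id']})/Datastreams").json()["value"]
    for stream in streams:
        obs = requests.get(
            f"{BASE}/Datastreams({stream['@iot.id']})/Observations",
            params={"$top": 1, "$orderby": "phenomenonTime desc"},
        ).json()["value"]
        for o in obs:
            print(thing["name"], stream["name"], o["phenomenonTime"], o["result"])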

8.3. Problem Statement

One of the challenges with marine data is making it available to a wide variety of users, including those outside the MSDI domain such as fishermen, resource extractors, utilities, tourists, or recreational boaters. These users do not have direct access to the marine databases holding the data they need to perform their activities; they rely on smaller consumer-facing applications, which in turn rely on APIs to request and consume the data they work with.

The use of standards makes it easier for developers to build software applications. The more robust these standards are, the easier it is to build applications and the more diverse the audiences able to use them in a variety of scenarios. Because of this, demonstrating standards for both marine data and the APIs through which the data are served becomes of key importance.
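By way of illustration, the sketch below shows the kind of request such a consumer-facing application would issue against an OGC API - Features endpoint. The base URL and collection identifier are invented; the /collections and /items paths and the bbox, limit, and f parameters come from the standard.

# Minimal sketch: a consumer-facing application requesting features from an
# OGC API - Features endpoint. Base URL and collection ID are hypothetical.
import requests

BASE = "https://example.org/ogcapi"   # hypothetical landing page
COLLECTION = "ais_ship_positions"     # hypothetical collection

# Discover available collections, then request features within a bounding box.
collections = requests.get(f"{BASE}/collections", params={"f": "json"}).json()
print([c["id"] for c in collections["collections"]])

items = requests.get(
    f"{BASE}/collections/{COLLECTION}/items",
    params={"bbox": "-168.0,65.5,-160.0,67.5", "limit": 10, "f": "json"},
).json()
for feature in items["features"]:
    print(feature["id"], feature["geometry"]["type"])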

Within this context, this pilot addressed the following research questions.

  • What steps were taken in the server development to standardize the various data into an S-100 or OGC API accessible data set?

  • What stages does the data go through in a fusion scenario, regarding format, metadata, etc.?

  • What steps were taken in the server development to synthesize the data and create digestible data for clients?

  • Which OGC APIs were leveraged to perform transformations to this data?

  • How were the data processed by the clients, and which views were used?

  • What kind of modifications do the various S-100 and OGC API standards need to better address the use of Marine data?

This pilot addresses all these questions, challenges, and opportunities using three main pillars. As illustrated in the figure below, the first pillar is "Demonstration": participants demonstrate scenarios that integrate data from various sources with emphasis on the land-sea interface. The second pillar explores the value of existing and emerging OGC standards, and the third pillar looks at related standards from the International Hydrographic Organization (IHO).

3 pillars of pilot
Figure 10. Overview of the pilot’s three pillars and their respective work items on top together with the envisioned results of the pilot at the bottom; all in context of federated marine spatial data infrastructures

8.4. Technical Overview

The Pilot activities were divided amongst the individual participants according to the component they were providing, either a client or server, and the approved sub-scenario each participant developed.

Each participant performed a thorough execution of all phases of data discovery, access, integration, analysis, and visualization appropriate to their component. Participants collaborated and supported each other with data, processing capacities, and visualization tools. The overarching goal is to learn more about current capabilities and shortcomings of marine data services offered by all Marine Spatial Data Infrastructures, Web portals, and directly accessible cloud native data.

The following image shows the participants and their general allocated roles:

Participant roles
Figure 11. Roles of each participant

FMSDI Phase 3 includes an overarching sea-based health and safety scenario incorporating the land/sea interface in the Arctic. This scenario demonstrates the technology and data used with OGC, IHO, and other community standards in response to a grounding event and the evacuation of a cruise ship or research vessel in the Arctic.

Participants developed demonstration sub-scenarios that showed how data can be discovered, accessed, used and reused, shared, processed, analyzed, and visualized. Each sub-scenario demonstrates what is currently possible and what gaps are experienced with the resources that can be discovered on the Internet.

Activities include the following.

  • Demonstrate how FAIR data principles, through the use of OGC APIs and IHO standards, can be effectively used to facilitate rescue operations in the Arctic, including oil spill tracking and remediation.

  • Access to land-based emergency services/resources (e.g., Coast Guard stations, transit times to emergency services or ports, medical facilities and resources, helicopter access).

  • Coastal environmental/topographic/hydrographic/maintenance data (e.g., deposition and dredging of seafloor sediment, changes in coastline and bathymetry).

  • Global Maritime Traffic data used in the Arctic (e.g., to help assess likelihood of other ships in responding to a ship in distress).

  • Voyage planning information (e.g., Arctic Voyage Planning Guide, Safety of Navigation products and services, Maritime Safety Information).

  • Demonstrate interoperability between land and marine data that is necessary to understand coastal erosion (e.g., ocean currents, geology, permafrost characteristics, etc.).

  • Demonstrate the OGC Discrete Global Grid Systems (DGGS) API use-case.

  • Investigate the role of vector tiles and style sheets across the land-sea interface.

In addition to the marine data, the sub-scenarios included elements coming from the land side. This is particularly interesting, because often land-sea use cases require the integration of data from multiple organizations, with each organization potentially limiting its view to one side of the land-sea transition. This interoperability of marine data with land/terrestrial geospatial web-based data services, for example, is part of this pilot.

A goal of this Phase is to advance the implementation of open data standards, architecture, and prototypes for use with the creation, management, integration, dissemination, and onward use of marine and terrestrial data services for the Arctic.

The following chapters describe the Overarching Master Scenario and all participant components and sub-scenarios individually, including the interactions between them. The final two chapters present the main lessons learned, recommendations, and suggested future work.

Each component chapter includes a description of the baseline from which that component was demonstrated, the sub-scenario, the technical architecture of that component, and the challenges and lessons learned that emerged from the demonstration of that component.

9. Overarching Master Scenario

9.1. Background

Shipping traffic in the waters off western Alaska and in the Arctic has increased significantly over the last 12 years, and all signs indicate that traffic will continue to grow. As shipping traffic increases, so does the risk of accidents.

The area off western Alaska includes national parks, several Large Marine Ecosystems (LMEs), and challenging navigation conditions; the increase in vessel traffic potentially impacts wildlife species through disturbance and related effects.

This is a sea-based, transportation, health and safety scenario incorporating the land/sea interface off the west coast of Alaska. The area of interest for the overall scenario is shown below:

FMSDI3 AOI01
Figure 12. Broad view of the area of interest

9.2. Narrative

The expedition cruise ship, ‘Discovery’ (completely fictional), is conducting a 22-day trip from Nome, Alaska to Kangerlussuaq (pronounced gang-er-LOOSE-swak), Greenland. Discovery is carrying 142 passengers and 104 crew and departs Nome on July 22 around 1700 hours. Overnight she passes through the Bering Strait on her way to her first stop, Kotzebue Bay (pronounced KAWT·zuh·boo). She is scheduled to arrive in the area around 0800 hours and will remain there for most of the day providing an opportunity for passengers to view the flora and fauna via shore excursions.

However, early in the morning of July 23, just as she was about to round Cape Espenberg, the ship suffered a major power failure that, due to a cascading generator shutdown, caused the vessel to lose propulsion and most major systems. Bridge electronics for communications and navigation continued to function, as they are on a separate redundant system. One of the generators was restarted within an hour to supply partial electricity to the ship, but not before the relatively high winds at the time pushed the ship dangerously close to the shore. Under partial power the ship tried to navigate the shallow area indicated on the nautical chart, but ran aground.

Grounding area detail new
Figure 13. Detailed area of grounding incident

Discovery grounded in an ecologically sensitive area, namely the Bering Land Bridge National Preserve. With only partial power to run on-board systems and a low probability of becoming seaworthy again, Discovery declares an emergency.

Grounded ship03
Figure 14. Expedition cruise ship, Discovery - grounded

9.3. Development of Participant Sub-scenarios

All participants developed sub-scenarios within the context of the overarching scenario. Sub-scenario workflows could be linked to the overarching scenario through:

  • the Event - Vessel grounding (possibly due to weather and/or mechanical issues and/or coastal erosion);

  • moving the vessel - combination land (heavy equipment) & sea (tug boats) access;

  • rescuing / transferring passengers and crew (immediate transfer of any injured); and

  • ecological impact prediction and mediation (both on land and in the ocean).

End-to-end sub-scenario workflow included and demonstrated the following:

  • the exposure of existing data via OGC APIs;

  • consumption and interoperability of available data by OGC APIs and IHO standards;

  • a focus on the marine / terrestrial boundary;

  • discovery of data and services metadata and integration into the overall incident; and

  • the binding to, or the portrayal of data and services via a variety of clients and/or GIS application.

9.4. Data Resources

Each participant was responsible, in part, for discovering and integrating data that were appropriate to their sub-scenarios. Available data would serve many purposes, including, but not limited to the following.

  • Emergency planning and management

  • Soil erosion

  • Coastal protection and shoreline management

  • Habitat mapping and heritage assessment

  • Fisheries regulation

  • Site usage (e.g., renewable energy and oil and gas extraction)

  • Conservation assessment and designation

  • Homeland security and defense

  • Licensing and consent evaluation

  • and more

Data discovery was part of the sub-scenario development and it was up to individual participants to find appropriate data for their sub-scenarios. A variety of available datasets discovered and used within the sub-scenarios are shown in Appendix A.

10. Components and Sub Scenarios

This section describes the individual participant’s component, either client or server, and their approved sub-scenario.

As described in detail in the following sections, each participant performed a thorough execution of all phases of data discovery, access, integration, analysis, and visualization appropriate to their component. Participants collaborated and supported each other with data, processing capacities, and visualization tools.

The following sections describe the details regarding their individual experiences during this pilot. These include the following.

  • Description - a brief description of their role within the pilot

  • Sub-scenario - description and how it relates to the overarching scenario. Includes:

    • Actor(s), end-user(s) and/or stakeholder(s)

    • Data and Platforms

    • AVPG Themes if used

  • Technical Architecture - describing interoperable technologies used and data flow

  • Demonstration - outlines the demonstration including an abbreviated storyboard

  • Results: Integration, Interactions and Interoperability - results of the Pilot from participant’s POV. Describes various interactions with a variety of data and services including the interoperability of these interactions and whether it was successful.

  • Challenges and Lessons Learned

  • Other sections the participant feels are important to their experience

To view individual videos of the accomplishments of the seven participants in this phase of the pilot, please click the following link to the OGC YouTube Channel.

10.1. Technology Integration Experiments (TIEs)

For FMSDI Pilot Phase 3, no formal TIEs were required of participants. Instead, participants tended to perform informal TIEs between the various clients and servers; the outcomes are shown in the ‘Results: Integration, Interactions & Interoperability’ sub-sections of each participant’s write-up. These informal TIEs focused on the exchange of marine and terrestrial data through OGC APIs and IHO standards, exploring the potential synergies of this integration.

However, due to the more experimental nature of the draft OGC API - DGGS and the significant development effort incurred during the Pilot, these components were tested more thoroughly. A complete presentation of this effort is provided in the ‘Client 3 (D102) - Ecere’ section of this ER.

10.1.1. TIE Summary Table

The following table summarizes the TIEs (Technology Integration Experiments) performed during FMSDI Phase 3.

Table 1. Summary of the general TIEs between FMSDI Phase 3 components

| SERVER / CLIENT | Client 1 (D100) - Esri Canada | Client 2 (D101) - Helyx | Client 3 (D102) - Ecere | Client 4 (D106) - Tanzle |
|---|---|---|---|---|
| Server 1 (D103) - IIC Technologies | Successful | Successful | Successful | N/A |
| Server 2 (D104) - Compusult | Successful | Successful | Successful | N/A |
| Server 3 (D105) - UCalgary (DGGS) | N/A | N/A | Successful | N/A |
| Server 4 (D107) - Tanzle | N/A | N/A | Successful | |

10.2. Data Fusion Servers and Services

The data fusion servers were the components designed to ingest a wide variety of data from various sources. These servers may transform and/or expose the data to comply with either OGC APIs or the IHO S-100 standard.

Data fusion servers were demonstrated by IIC Technologies (D103), Compusult (D104), the University of Calgary (D105), and Tanzle (D107). The Tanzle component was a special case: its contribution demonstrated a more integrated (client and server), proprietary solution for which Tanzle was looking to expand interoperability capabilities.

10.3. Server 1 (D103) - IIC Technologies

10.3.1. Description

In previous FMSDI pilots, IIC made use of open-source components for the deployment of OGC API - Features services covering Marine Protected Areas that implement the IHO S-100 General Feature Model (GFM). This blending of IHO content with OGC web services models has been extended in this pilot to a more diverse set of data, meeting the use cases defined in the project. As an in-kind contribution, use has been made of IIC’s own featureBuilder S-100 authoring tools and an automated pipeline for deployment to the web services API endpoint. Additionally, a fully configured cloud server has been deployed and is used for serving data.

Of particular interest within this pilot is the mapping between the S-100 GFM and GeoJSON. This mapping is reproduced in Appendix C: The S-100 GFM (Geo)JSON format, and the intent is to release it as an OGC document. It provides a way of taking data that meets the GFM requirements, described by an S-100 feature catalog, and encoding it in a form natively understood by most OGC API implementations. The GeoJSON encoding, whilst limited in some respects, has a ubiquitous appeal to developers of web-based systems, and the combination of OGC transport with IHO content is attractive to many members of the IHO community who wish to deploy marine geospatial data alongside navigational use cases.

The technology used for deployment was pygeoapi, a Python server implementation of the OGC API suite of standards that also allows metadata to be defined for services. This allowed IIC to develop mappings not just for content but for its metadata too. Names, aggregations of features, services, and many “delivery” and “discovery” aspects were developed, all of which contributes to the goal of using S-100 alongside, and outside, direct navigational use cases, an ongoing theme for the IHO community in general.
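For illustration, the fragment below shows the general shape of a collection definition as pygeoapi reads it from its YAML configuration, rendered here as an equivalent Python dictionary. The keys follow the usual pygeoapi layout but are abridged, and the collection name, extents, and data path are invented; this is a sketch, not IIC’s actual configuration.

# Illustrative sketch of a pygeoapi collection definition (normally YAML),
# shown as an equivalent Python dict. Names and paths are invented.
collection_config = {
    "mpa-features": {
        "type": "collection",
        "title": "Marine Protected Areas (S-100 GFM GeoJSON)",
        "description": "MPA features mapped from the S-100 GFM",
        "keywords": ["S-100", "MPA", "FMSDI"],
        "extents": {"spatial": {"bbox": [-168.0, 65.5, -160.0, 67.5],
                                "crs": "http://www.opengis.net/def/crs/OGC/1.3/CRS84"}},
        "providers": [{
            "type": "feature",
            "name": "GeoJSON",            # pygeoapi's built-in GeoJSON provider
            "data": "/data/mpa.geojson",  # invented path to the encoded data
            "id_field": "id",
        }],
    }
}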

The pilot described in these sections was a good opportunity to broaden the data scope of IHO-OGC interoperability, as it encompasses navigational datasets and use cases together with logistical/temporal use cases (water level against time, route exchange for moving vessels) and land/sea “interfaces” (the use of land-based logistics to effect search and rescue, and the use of land resources in exceptional circumstances).

10.3.2. Sub-scenario and Data

The sub-scenario documented in this section is the detail of the grounding of a cruise vessel in the region of Kotzebue, Alaska. Much thought was given to the construction of the sub-scenario, balancing the following:

  1. the need to ensure it represents a credible set of circumstances described in real-world terms; and

  2. the desire to show a diverse and representative set of use cases for marine geospatial data and how an API paradigm, and in particular OGC API, transport can enhance response scenarios.

The location of the pilot is the region of Kotzebue in Alaska, approx 66° North and 166° West, illustrated in the following chart (NOAA).

IIC 01
Figure 15. The location of the pilot (AOI) in Kotzebue in Alaska

The sub-scenario has been chosen and constructed to highlight the following:

  • the use of OGC API for exchange and interoperable computation in relation to search and rescue activities in a chosen (specifically, remote) region;

  • building on previous experience with Marine geospatial data in such settings and technologies; and

  • highlighting the role of Land/Sea interfaces in such scenarios.

The sub-scenario concerns a single vessel, “Discovery”, which was modeled on a real-world vessel with dimensions set accordingly. The vessel is a passenger vessel with 182 passengers on board, engaged on a coastal trip and entering Kotzebue harbor via a pilot boarding point, the entirety of Kotzebue harbor being subject to compulsory pilotage conditions. En route to the pilot rendezvous point (the pilots meet at the Pilot Boarding Place for transfer to the vessel), the vessel suffers a loss of power and propulsion and drifts towards the shore, eventually grounding approximately 4.5 nautical miles NW of the Espenberg Light. At the point of engine failure the coastguard is notified and an emergency is declared due to the number of passengers involved and the risk to the environment and property.

The sub-scenario was designed to enable the use of many different types of data and so various aspects of the proposed chain of events are in place to exercise these aspects. These are described in more detail in the Demonstration subsection.

10.3.3. Technical Architecture

The architecture of the deployed server is similar in concept to that deployed in the previous pilots, using JSON-encoded data within a pygeoapi / Apache 2 web server to render OGC API - Features conformant data to end users. The Apache mod_wsgi plugin is used to enable the pygeoapi application, which renders the GeoJSON according to the requests made.

IIC 02
Figure 16. The architecture of the deployed server (D103)

Data was initially either converted (in the case of ENCs, where only S-57 data is available) or compiled from scratch (most of the data in the pilot was compiled due to the sparsity of data in the region). GeoJSON is used as the encoding, but this is moderated by the S-100 GFM encoding, which maps all properties from S-100 attributes/sub-attributes and defines feature names and codes. In this way the S-100 GFM is used as a universal model for the data promulgated by the pygeoapi server.
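The fragment below is an illustrative sketch only of what such an encoding can look like: an S-101 DepthArea feature wrapped as a GeoJSON Feature, with the feature catalog code kept as a property. The property names and values are stand-ins chosen for readability; the authoritative mapping is the one reproduced in Appendix C.

# Illustrative sketch: wrapping an S-100 GFM feature as GeoJSON, keeping the
# feature catalog code as a property. Names/values are hypothetical stand-ins.
import json

def gfm_feature_to_geojson(feature_code, attributes, geometry):
    """Wrap an S-100 GFM feature's code and attribute values as GeoJSON."""
    return {
        "type": "Feature",
        "geometry": geometry,
        "properties": {"featureCode": feature_code, **attributes},
    }

feature = gfm_feature_to_geojson(
    "DepthArea",  # S-101 feature class, identified by its catalog code
    {"depthRangeMinimumValue": 5.0, "depthRangeMaximumValue": 10.0},
    {"type": "Polygon", "coordinates": [[[-166.4, 66.3], [-166.2, 66.3],
                                         [-166.2, 66.4], [-166.4, 66.3]]]},
)
print(json.dumps(feature, indent=2))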

Other aspects of the technical architecture are as follows.

  • Only S-100 edition 5.0.0 was used - this is now the published version of S-100.

  • A revised GML encoding is also available under S-100 edition 5.0.0. This could have been implemented as well but the interoperability goals are probably better served by the use of GeoJSON.

  • The project explored a number of different datasets including the following.

    • S-421 (an edition 5.0.0 feature catalog was produced). Route exchange became increasingly important as the sub-scenario was developed. The exchange of data between participants (vessel, coastguard, pilot, assisting vessel) is crucial for managing a time-sensitive operation. S-421 (migrated to edition 5.0.0) is a very rich data model and, although much of the detailed modeling wasn’t used, it is striking how frequently the interaction between static (S-100 modeled) data and routing information appears in the scenario.

    • S-101 (using converted S-57 data). A number of individual feature types were extracted from S-101 data and implemented as OGC API services, e.g., LandArea, DepthArea, LandRegion, and others. These are essential “backdrop” datasets and individual layers are a good way of enabling their use. S-101 itself is an aggregation of features without a “layer” concept but with a strong topology element, so the backdrop of LandArea, DepthArea, UnsurveyedArea (essential in a remote region), and DredgedArea forms a “skin of the earth” which serves as an agreed backdrop for all activities in the scenario. Additionally, extracts were made of land-based information contained within electronic charts. The best example is the runways, drawn from the largest scale charts and matched against the land runways viewable through OpenStreetMap. The attribution of these is, of course, limited, but the spatial extents are accurate.

    • S-127 (pilot boarding places, areas of compulsory pilotage, contact details, etc.). These regulatory areas were modeled and released as API features datasets. As with S-123, more compilation could have been done to construct a fuller picture of the regulatory environment, but this is not really specific to the scenario itself; emergency MRCC and Juneau Coast Guard details are modeled using S-127 and their integration with the main ECDIS data is noted.

    • S-131 marine harbor infrastructure (fairways and berths). At a late stage some modeling of Pilot Boarding Places in S-131 was also done. There is an outstanding consideration of the S-98 interoperability specification in respect of projects such as these; this is noted as an output, as there is no direct corresponding standard in the OGC framework that accomplishes what S-98 is designed to achieve in a regulatory environment.

    • S-123 Radio Service Areas (a single radio station was modeled using an edition 5.0.0 feature catalog; this could be further enhanced for better coverage and expression of S-123 content).

    • S-104. S-104 is currently only encoded in HDF5 under S-100, and no facility for simple time series predictions covering a defined area is possible. This is a shortcoming of S-104 and makes its use in remote areas difficult. This will be fed back to the IHO working groups, as such scenarios are valid use cases for water level information, most of which has been withdrawn from S-101 electronic charts.

    • “Depth Data.” S-102 would have been useful for the area, particularly for the refloating element of the scenario, but it is not available. It is noted that some older surveys are available in .bag form which could be converted into S-102, but project timing did not allow for this to be completed. Depth data in this region were drawn directly from the largest scale charts and served as a separate OGC API layer. The GeoJSON encoding for S-100 GFM data could be enhanced to provide for such conversion (i.e., converting areas of dense, gridded data to individual features for geoprocessing; a naive sketch follows this list) and this will be considered in that document as it is developed further.

  • Sources for ENCs were NOAA S-57 ENCs, which were converted to S-101 charts for use in the project. These used v1.0.0 of the S-101 model, although it is now published at v1.1. The video made for the project put the S-101 data in a simulator with the route data of the vessel and simulated the grounding incident from end to end.
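The gridded-data-to-features conversion hinted at in the “Depth Data” item above can be sketched naively as follows; the grid, origin, and spacing are invented for illustration and carry no relation to the actual survey data.

# Naive sketch: turn a dense depth grid into individual point features.
import numpy as np

def grid_to_features(depths, lon0, lat0, dlon, dlat):
    """Emit one GeoJSON point feature per grid cell with a valid depth."""
    features = []
    for (row, col), depth in np.ndenumerate(depths):
        if np.isnan(depth):
            continue  # skip unsurveyed cells
        features.append({
            "type": "Feature",
            "geometry": {"type": "Point",
                         "coordinates": [lon0 + col * dlon, lat0 + row * dlat]},
            "properties": {"depth": float(depth)},
        })
    return {"type": "FeatureCollection", "features": features}

grid = np.array([[5.2, 6.1], [np.nan, 7.4]])  # toy 2x2 depth grid (meters)
fc = grid_to_features(grid, lon0=-166.5, lat0=66.3, dlon=0.01, dlat=0.01)
print(len(fc["features"]))  # 3 cells carry depth values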

10.3.4. Demonstration

The sequence of events, broadly, is:

  • Time of power failure

  • Likely grounding is notified to CG and vessel/passengers made ready

  • Eventual grounding

  • End point of vessel drift, site of grounding within shoal area offshore in location shown

    • A small number of minor casualties caused by vessel movement when grounding occurs. 2 of these require evacuation which is requested.

    • 4Nm NW of Espenberg light

    • Tide at this point is falling and current is onshore making natural freeing of the vessel unlikely.

  • Planning phase for recovery

    • Efforts to free vessel not successful

    • Medical assistance for 2 passengers required

    • Communications from 2 assisting vessels received and on course to the grounding site, the pilot vessel (having already left Kotzebue harbor) and a vessel in the vicinity of the grounding site, responding to the mayday call.

    • Planning between CG and the vessel considers:

      • Tugs from Kotzebue/towing back to Kotzebue harbor

      • Spares for propulsion repair by air

      • Required dive survey for check of seaworthiness of vessel

  • Plan established

    • Medical Evacuation of 2 passengers via helicopters from Kotzebue

    • Evacuation of remaining passengers to vessel Good Samaritan upon its arrival

    • CG vessel + divers to undertake survey on arrival at vessel

    • Time to next high tide < 6 hrs and the weather appears more favorable, therefore an attempt to free the vessel within 2 hrs of high tide, also using a tow from the CG vessel, is organized.

  • Medical Evacuation via helicopter of casualties

  • Evacuation of remaining passengers via attending vessel, assisted by CG vessel (now in attendance)

  • Detailed Inspection of hull carried out. Re-Planning of recovery

  • Spares delivered by air - repairs carried out to propulsion system

  • Attempt to free the vessel is successful at high tide, now with propulsion and power restored.

  • Vessel proceeds towards Kotzebue harbor with CG vessel accompanying/towing in case of recurrence of problems

The route, shown in red, is depicted in the image below.

IIC 03

The actions, actors, and technologies involved are described in the following table, according to each stage of the incident and its response.

Table 2. D103 server demonstration storyboard

Activity Description

Actor/End User & Technology

Vessel routing

  • Prior to engine failure the vessel routing is almost entirely S-101 / chart based. Future S-100 ECDIS will carry more data but the minimum is the navigational chart.

  • It is noted that the published route crosses the safety contour due to a large area marked as shallow. This is not unusual; many vessels are required to cross the safety contour because of a lack of available contours in existing charts. Availability of S-102 would allow a user-selected contour (our vessel has a draft of 7.8 m and the depth area in question is 5-10 m). This illustrates the challenging nature of navigation in the area.

Initial engine failure

  1. Likely grounding is notified to CG and vessel/passengers made ready.

  2. Estimates of vessel trajectory and projected vessel route are made. S-421 projection is shared with CG. This is promulgated by CG.

  3. Projected grounding point is plotted over S-101 data, using S-102 and adjusted for S-104 water levels (from fixed tidal station predictions). The nearby light (Espenberg) is used as a reference (S-101 and S-125 for details on its periodic status). Images of the light are sent as an extract to all vessels to aid in locating the grounded vessel (S-125).

  4. Pilot Boat is already en route to the PilotBoardingPlace (location in S-101). Pilot Boat reroutes to the projected site of grounding (S-421 exchange).

  5. CG uses AIS feeds and VHF (AIS, S-123) to ask for assistance from nearby vessels. A single vessel is able to assist. It signals CG (S-123 radio) and sends a projected route plan (S-421) with intended route and intercept points.

Grounding

  1. End point of vessel drift, site of grounding within shoal area offshore in location shown.

  2. A small number of minor casualties caused by vessel movement when grounding occurs. 2 of these require evacuation which is requested.

  3. Position is 4Nm NW of Espenberg light.

  4. Tide at this point is falling and current is onshore making natural freeing of the vessel unlikely.

  5. Vessel grounding point is noted and relayed to CG.

  6. Medical and evacuation protocols using contact details in S-127 are used to ask for immediate assistance.

  7. CG coordinates with air/sea rescue.

  8. Planning is commenced for recovery operations. Uses S-101 as a backdrop, S-104 (predictions) for tidal height. Land sources for medical assistance and proximity of vessels assisting.

+1hr from Grounding

Planning phase for recovery

  • CG can use land based data for establishing time taken for arrival by medical team. ETA communicated to Pilot Boat and assistance vessel (this could use S-421 or S-211 for messaging).

  • No other vessels are available immediately. Focus is on refloating vessel at next high tide (using S-104, S-101 and S-102 data (noting lack of S-102 or S-104 in their native forms for the area)).

  • Close inspection of S-102 and S-98 Water Level Adjustment algorithms allows for accurate modelling of water levels. S-111 also suggests tidal streams are favorable.

  • CG vessel is dispatched to location (S-421 route data sent to all parties) along with emergency contact details (S-123). CG vessel is to assist with evacuation and offer tow if necessary during re-floatation.

Plan established

  • Medical Evacuation of 2 passengers via helicopters from Kotzebue.

  • Evacuation of remaining passengers to vessel Good Samaritan upon its arrival.

  • CG vessel + divers to undertake survey on arrival at the vessel.

  • Time to next high tide < 6 hrs and the weather appears more favorable, therefore an attempt to free the vessel within 2 hrs of high tide, also using a tow from the CG vessel, is organized.

Summary

The recovery Plan is based on data content including:

  • Current charts and publications

  • Vessel movements and intentions

  • Tidal predictions

  • Weather forecast

  • Detailed survey bathymetry of location

  • Sea floor investigation

  • Currents

  • Risk analysis process

Plan is approved by CG including data on which it is based

Initial Evacuation

  • Evacuation of remaining passengers via attending vessel assisted by CG vessel (now in attendance)

  • Detailed Inspection of hull carried out

Re-Planning

CG carries out surveys and an assessment of the vessel and reports the results to the JRCC.

  • Ongoing monitoring from real-time sensors including S-111 and S-104 (water levels).

  • Vessel assisted by Pilot and CG vessels. Passengers evacuated to Kotzebue harbor (S-421 routes produced during evac for approval by vessel traffic services). Liaison uses emergency contact information within S-123 and S-127.

Recovery

  • Spares delivered by air (runway information is included in the S-101 extracts as a separate feature endpoint).

  • Attempt to free the vessel is successful using S-104 water levels, S-101 depths and (could use) S-111 currents as well.

  • Proceeds towards Kotzebue harbor with CG vessel accompanying/towing in case of recurrence of problems.

  • Follows route in S-421 - exchange between ship/shore.

The pilot has demonstrated how S-100’s GFM can represent multiple different datasets for different purposes in a search/rescue context. The ability to integrate such APIs together and form a common endpoint for users is a high priority, as is the ability for end users to ingest OGC API endpoints. There is already a well-defined ecosystem for search/rescue using multiple spatial data sources, and the extent to which OGC API is used in those systems is unknown. The aim of the pilot is to show how S-100 data can be reused and encoded as OGC APIs, and to show its potential in broader use case scenarios.

10.3.4.1. Demonstration Video

To view the demonstration video showing the IIC Technologies Server highlighting the blending of diverse IHO S-100 content with OGC APIs for marine rescue activities in the Arctic, please click the following link to the video on OGC’s YouTube Channel.

10.3.5. Results: Integration, Interactions, and Interoperability

Results of the interoperability and implementation work were mixed. For some data types a lot of coverage exists and standards are in place for them to be expressed using S-100: charts, S-421 routes, spot depths, aids to navigation, and regulatory information all exist, even though some are still in publication form. This is to be expected, as navigation in the bay is the primary usage and safety of navigation in the region is assured by the various authorities with jurisdiction.

However, there was no systematic S-100 data approach for the region, which is not unusual given that S-100 implementation is in its infancy. So, for this Pilot, a number of different datasets were created, converted, and exposed via OGC API. This process worked very well and showed that, for test regions, a multi-product test bed can be established reasonably quickly to test and exercise scenarios.

The S-100 data and API endpoints created proved effective, with clients able to quickly use the data and understand its source and intended usage. This allowed a number of aspects of the scenario to be explored and identified areas where data was deficient (e.g., it is difficult to model whether the tide will refloat a vessel if only point predictions are available with no S-100 model to transport them). It also showed the value of certain data types, e.g., the skin of the earth from ENC as a ubiquitous backdrop, and the value of route exchange during operational scenarios where multiple vessels are interacting.
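To make the point-prediction limitation concrete, the toy sketch below computes a water level time series at a single point from harmonic constituents, the classical astronomical approach discussed in the lessons learned below. The constituent speeds are standard, but the amplitudes, phases, and mean sea level are invented; real predictions require station-specific constituents derived from long observation periods.

# Toy sketch: astronomical tide prediction at a single point as a sum of
# harmonic constituents, h(t) = MSL + sum(A * cos(w*t - phase)).
import numpy as np

def water_level(t_hours, msl, constituents):
    """Sum harmonic constituents over a time axis in hours."""
    h = np.full_like(t_hours, msl, dtype=float)
    for amplitude, speed_deg_per_hr, phase_deg in constituents:
        h += amplitude * np.cos(np.radians(speed_deg_per_hr * t_hours - phase_deg))
    return h

# M2 and K1 with invented amplitudes/phases (constituent speeds are standard).
constituents = [(0.30, 28.984, 40.0), (0.12, 15.041, 120.0)]
t = np.arange(0.0, 25.0, 0.5)  # next ~25 hours, half-hourly steps
levels = water_level(t, msl=0.9, constituents=constituents)
print(f"max predicted level: {levels.max():.2f} m at t={t[levels.argmax()]:.1f} h")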

The land/sea integration, from a basic maritime perspective, was not a major factor. Post-incident enquiries/modeling and broader considerations (e.g., the effects of climate change and sea level rise on siltation) may require more detailed land/sea integration, but at a basic “vector features” level (coastlines, runways, amenities, and landmarks were examined) the data found for maritime purposes is interoperable with the land-based sources. However, this should be tempered by the fact that the land data was neither rich nor dense, so a meaningful assessment of integration with the available maritime data is difficult.

10.3.6. Challenges and Lessons Learned

  1. Water Level Information for Surface Navigation - S-104 (point predictions + time series). During the Pilot a number of datasets were produced and created either by conversion or compilation. The scenario offers the ability to test which datasets have a higher priority or more frequent use in the chosen scenario. As the scenario concerns a grounding incident with potential danger to life, property, and the environment, there is also an attendant liability to consider for coastal states and search/rescue operators. Water levels (primarily tidal levels) are a complex topic and their accurate prediction must be incorporated into any scenarios modeling the after-effects of incidents. Much tidal prediction globally is still based on astronomical predictions stemming from modeling at individual points over long periods of time. Additionally, in areas of sparse coverage the determination of water level at a particular point may be challenging. IHO S-104 is the S-100 product specification designed to model such water levels and its inclusion will have a profound effect on future navigation systems. Tidal effects have (currently) been withdrawn from the product specifications for revised electronic charts in favor of water level integration stemming from S-104. However, there are few options from an ECDIS point of view for the inclusion of tidal water level heights should the dense gridded data required by S-104 be unavailable. This warrants further discussion with the responsible IHO bodies to ensure states can implement S-100 even in areas of sparse coverage.

  2. Features, metadata, aggregations. There are a number of areas where the OGC API implementation is incomplete. Although the JSON encoding and the arrangement of the API into individual endpoints have been accomplished, the mapping between the S-100 GFM and the feature catalog structure is still very loose. The encoding does not really resolve this, nor does it answer how Part 17 and metadata should be mapped to a web-service-type model. Content is well covered by the existing encoding but metadata is not, and OGC API - Records metadata may not perform the same function as S-100 Part 17. This is also noted in the section defining the GeoJSON encoding of S-100 GFM data.

  3. How are APIs forwarded and aggregated? Under S-100 there is a very clear, prescriptive methodology for aggregating features into datasets and multiple datasets into exchange sets for transport to an implementing system. With APIs the situation is more complex. It is not clear how a data producer (and endpoint implementer) exercises governance or control over how APIs are aggregated or forwarded. Data integrity is a very active area of interest for the IHO community and a number of possibilities exist, but a broader, OGC-specific methodology for aggregating individual endpoints, and for specifying how they may be forwarded, would be a big step forward. An example is ENC endpoints, which are composed of many (>180) different feature types. Ideally an API endpoint for ENC data would be an aggregation of different endpoints for individual feature types, but no methodology for achieving this exists.

  4. Areas of sparse coverage. Areas of sparse coverage exercise standards in a number of ways. Marine charting has long been aware of such issues and ensures that data is “marked” (in a number of ways) to denote either age or uncertainty. Marine geospatial data is expensive to collect and process, and areas of sparse coverage, particularly in the Arctic, often pose a challenge to end users when routes are constructed. The S-104 accommodation documented in the first of these lessons learned is a case in point. Inclusion of land-based features often has a greater significance in remote areas or harsh environments; this scenario includes locations of runways on land, which have greater significance where air support is more likely to be required. Would it be possible to form a more detailed view of data requirements, or a profile suitable for Arctic charting/data gathering? This is an intriguing question, pointing to a regional approach for marine geospatial data for maritime use (navigation and search/rescue). S-100 is a very flexible framework for creating such things and, now that the published tools are available, such a project is far more feasible.

  5. Land/Sea observations - isolated vs. features which require integration. One of the main topics of the pilot was “Land/Sea” integration, one which has been addressed repeatedly by the IHO and MSDI community. There are, admittedly, many challenges here. The experience of the Pilot, and of the use cases examined in the search/rescue scenario, is that land/sea challenges fall into three broad categories.

    1. Those caused by a mismatch of Coordinate Reference Systems (CRS)/datums. Frequently (and with good reason) IHO marine geospatial data is given against multiple vertical datums, with a sounding datum (for Low Water) and a High Water datum for measuring heights/elevations. Most frequently the horizontal datum is WGS84, as the vast majority of use cases are focused on navigation systems, which are always aligned to WGS84. (A minimal horizontal reprojection sketch follows after this list.)

    2. A mismatch of modeling. Land-based data frequently uses feature models and attribution that are specific to the country, region, or some administrative sub-division. Marine geospatial data under S-100, by contrast, is based on globally agreed definitions and uses structures prescribed by the framework. This then requires a semantic mapping between the features. An example is the runway feature defined in our scenario. No concrete methodology for mapping between S-100 and other domain models currently exists (nor between S-100 models, although this is under consideration).

    3. A mismatch of scale. Marine geospatial data is often at varying scales due to the costs of acquisition. By contrast, land data is normally at a homogeneous scale for a region. This leads to geometry discrepancies between the features, which require harmonization for land/sea interoperability to be effected.

No concrete methodology has been published for resolving these three aspects of land/sea interoperability, but one could be constructed using existing best practices. This would at least establish where the biggest gaps are and identify technologies for their resolution. Certainly, the scenario considered, search and rescue in a remote region with sparse coverage, is a very good example: land data must be repurposed (for delivery of spares to the vessel, evacuation via helicopter or plane, contact details on shore) and innovative methods must be used for safety reasons.
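As a minimal illustration of the first category, the sketch below reprojects a land-side coordinate into the WGS84 longitude/latitude used by the marine datasets, assuming pyproj and an Alaska Albers source CRS chosen purely for the example; vertical datum harmonization is considerably harder and is not shown.

# Minimal sketch: reproject a land-side coordinate into WGS84 lon/lat.
from pyproj import Transformer

# always_xy=True keeps coordinates in (x, y) / (lon, lat) order.
transformer = Transformer.from_crs("EPSG:3338", "EPSG:4326", always_xy=True)

x, y = 158000.0, 2276000.0  # a point in Alaska Albers meters (invented)
lon, lat = transformer.transform(x, y)
print(f"lon={lon:.4f}, lat={lat:.4f}")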

10.4. Server 2 (D104) - Compusult

10.4.1. Description

Compusult, incorporated in 1985, is a diversified information technology company with over 37 years of experience in software development and is a leading provider of geospatial software solutions. Compusult is a long-standing member of the Open Geospatial Consortium (OGC) that assisted in pioneering specifications such as the Web Map Service (WMS), Catalog Service for the Web (CSW), and GeoPackage.

Our commercial geospatial software solution, Web Enterprise Suite (WES), manages geospatial data by delivering a complete, end-to-end system providing geospatial interoperability, publishing, data discovery, data management, data access/collection, data analysis/synchronization, user management and collaboration. WES components are based on OGC / IHO / ISO and other interoperability specifications.

GO Mobile is an iOS, Android and Windows-based mobile application that uses OGC standards to provide first responders, surveyors and in-field data collectors with the ability to collect, consume and share information in connected or disconnected environments.

Compusult created server instance D104 and instantiated a copy of our Web Enterprise Suite software as an in-kind contribution to provide OGC Web and OGC API services for accessing the datasets outlined in the sections below. This allowed clients to directly consume and compare the performance and applicability of the OGC interfaces with a common dataset.

The WES product DBWMS utilizes RDBMS data stores with accompanying OGC Styled Layer Descriptor (SLD) documents to produce Web Map Service endpoints that display the RDBMS data using the OGC WMS standard. Compusult also provided an OGC API component that allows data from RDBMS data stores to be provisioned as OGC API - Features and Coverages.

The data sources described in the sections below were made available to all clients participating in the pilot via OGC standards, to be used to support emergency operations and to assess potential threats to the surrounding marine ecosystems. These data, and associated APIs, can also be useful in crafting climate change and/or search and rescue scenarios. The availability of the services as both OGC Web Services and OGC API services allowed for direct comparison of the performance and flexibility of these standards.

10.4.2. Sub-scenario and Data

Rescuing / transferring passengers and crew (immediate transfer of any injured) and potential oil spill detection.

The Discovery cruise ship is being monitored as it travels from Nome, Alaska to Kangerlussuaq, Greenland. A distress message has been sent from the Discovery cruise ship via shipboard sensors stating that the ship has lost power. This is confirmed with the captain, who informs CG-SAR that the ship is adrift. The US Coast Guard Search and Rescue (CG-SAR) has been alerted and has entered the incident into the Incident Command System (ICS).

An analyst with CG-SAR begins creating a situational awareness portfolio to support the rescue operation, gathering relevant information and content to provide situational awareness and input into the decision-making process.

The following sources were made available to all participants as OGC API - Features and OGC API - Coverages and, where applicable, OGC WMS services. These data sources can be compared and contrasted historically against other sources of weather data.

  • Daily Polar Sea Ice Concentrations: Canadian Ice Services sea ice concentration data for the period 2006-current for the eastern and western Arctic. This data provides coverage polygons of sea ice concentrations over the entire Arctic area.

  • NOAA Global Deterministic Prediction System (GDPS) Data: Global datasets (in GRIB format) from NASA using OGC API - Features and Coverages. This allows users to interact with temperature, precipitation, cloud cover, wind speed and direction, and humidity on a global scale.

  • National Oceanic and Atmospheric Administration (NOAA) Weather Data: Global Forecast System (GFS) weather prediction system containing a global computer model and variational analysis run by the U.S. National Weather Service (NWS).

  • METAR Weather Report Observations: Worldwide METAR weather reports.

The analyst then incorporates the logistics to support the rescue operation. Logistics services include the following, provided by Compusult as OGC APIs and OGC WMS services, as well as other readily available sources from the internet, such as hospitals.

  • FAA Commercial Air Traffic and Live Flight Information: Commercial Air Traffic and Live Flight Information providing real-time En Route Flight Data information about aircraft around the world from the Federal Aviation Administration (FAA).

  • Automatic Identification System (AIS) Maritime Ship Traffic: Automatic Identification System (AIS) provides access to real time updates of vessel positions data, ship details, port calls and voyage information and even more.

  • National Oceanic and Atmospheric Administration’s (NOAA) National Data Buoy Center (NDBC) Buoys and Ships Observation Data: Comprehensive, reliable systems and marine observations to support the forecasts and warnings missions of the National Weather Service (NWS) and NOAA, to save lives and property. NDBC’s major missions include the weather/hurricane buoys network, C-MAN coastal station network, TAO Buoy Network that supports ENSO forecasting, and DART Buoy Network that provides open-ocean measurement of Tsunamis. Also includes NDBC ship data.

The analyst then creates a situational view using the information above and dispatches the nearest aircraft to provide up-to-date reconnaissance of the ship's condition. This was simulated using the Compusult GO Mobile application to take pictures and gather metadata about the ship and its surroundings. This data was made available as an OGC WMS and OGC API - Features for other participants to consume.

Reports from the reconnaissance mission indicate there is a potential fuel leakage. An analysis is conducted using satellite imagery to determine the impact. The analysis concludes there are no serious impacts from the potential spillage, and rescue operations are instigated with constant updates and briefings. The rescue operation includes the dispatch of a helicopter bringing in personnel to assist the captain and the dispatch of a nearby US Coast Guard cruiser to rescue all personnel. The analyst provides the cleanup crew with relevant information to support vessel recovery and disposal. The following data sources were used to determine potential fuel leakage.

  • Radarsat Satellite imagery: RCM Dual pole HH/HV SAR Imagery

  • Sentinel 1: all-weather, day and night radar imagery for land and ocean services

  • Sentinel 2: high-resolution optical imagery for land services. It provides for example, imagery of vegetation, soil and water cover, inland waterways and coastal areas.

  • Sentinel 3: high-accuracy optical, radar and altimetry data for marine and land services. It measures variables such as sea-surface topography, sea- and land-surface temperature, ocean color and land color with high-end accuracy and reliability

Output of the fuel leakage analysis was made available as Esri Shapefile and OGC API - Features.

Actor(s), end-user(s), and/or stakeholder(s):

  • Environmental Analyst, CG-SAR

Interoperable Technologies:

  • OGC API - Features, Coverages, Styles

  • OGC Catalog

  • OGC SensorThings API (IoT)

  • OGC services (WMS, WMTS)

Data and Platforms

  • Accessing the following data through local OGC API - Features and OGC WMS

    • AIS Ship Traffic (API Features)

    • Live flight information

    • Daily Polar Sea Ice

    • Metar Weather Reports

    • NOAA ship and buoy data

  • Accessing the following data through local OGC API - Coverages

    • NOAA Weather Data

  • Accessing other participants' OGC API - Features, Coverages, and EDR: Compusult uses other participants' OGC APIs and services as they become available, to test interoperability and to enhance the scenario.

Arctic Voyage Planning Guide Themes (additional info)

  • Theme 1: Carriage Requirements : Navigational Warnings Services

  • Theme 3: Arctic Environment Considerations

    • Communities and Populated Areas Information

    • Weather Station Locations and Services Available (similar to S-49 E.4.2 and U.4)

    • Airports and Hospitals

  • Theme 6: Marine Services

    • Ice Breaking Support Services

    • Search and Rescue Support Services

    • Ice Services Information (similar to S-49 U.6.4)

10.4.3. Technical Architecture

The figure below shows the technical architecture of the D104 server made available by Compusult to all OGC participants. This diagram shows the flow of data from the various sources through OGC APIs and services. For this pilot, the open-source application pygeoapi was used to create the OGC API services. OGC SensorThings and OGC WMS services were made available from Compusult’s Web Enterprise Suite application.

Compusult 01
Figure 17. Technical architecture of the D104 server

NOAA Global Forecast System (GFS) GRIB weather files are downloaded and processed to generate particle layers depicting wind direction and speed. Additionally, the GRIB files are processed to create CoverageJSON that is fed to the client using OGC API - Coverages.
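A rough sketch of that GRIB-to-CoverageJSON step is shown below, assuming xarray with the cfgrib engine. The file name and variable are placeholders, and the CoverageJSON is abridged (the parameters and referencing members required by the full specification are omitted); this is not Compusult's actual pipeline.

# Rough sketch: read a GRIB file and emit an abridged CoverageJSON grid.
import xarray as xr

ds = xr.open_dataset("gfs_sample.grib2", engine="cfgrib")  # placeholder file
temp = ds["t2m"]  # 2 m temperature, assuming the file carries this variable

coverage = {
    "type": "Coverage",
    "domain": {
        "type": "Domain",
        "domainType": "Grid",
        "axes": {
            "x": {"values": temp["longitude"].values.tolist()},
            "y": {"values": temp["latitude"].values.tolist()},
        },
    },
    # "parameters" and "referencing" omitted for brevity.
    "ranges": {
        "t2m": {
            "type": "NdArray",
            "dataType": "float",
            "axisNames": ["y", "x"],
            "shape": list(temp.shape),
            "values": temp.values.ravel().tolist(),
        }
    },
}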

Text-based Daily Polar Sea Ice Concentrations, METAR weather reports, FAA flight information, Automatic Identification System (AIS) maritime ship traffic, and National Data Buoy Center (NDBC) buoy and ship observation data are parsed, stored in a PostgreSQL database, and served to the client using OGC API - Features.
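The parse-and-store step for these text-based feeds might look like the sketch below, assuming psycopg2 and a PostGIS-enabled table; the connection string, table, columns, and sample record are all invented for illustration.

# Sketch: store one parsed observation record in a PostGIS-enabled table.
import psycopg2

record = {"station": "PAOM", "lon": -165.44, "lat": 64.51, "temp_c": 3.0}

conn = psycopg2.connect("dbname=fmsdi user=fmsdi")  # hypothetical DSN
with conn, conn.cursor() as cur:  # commits the transaction on success
    cur.execute(
        """
        INSERT INTO metar_obs (station, temp_c, geom)
        VALUES (%s, %s, ST_SetSRID(ST_MakePoint(%s, %s), 4326))
        """,
        (record["station"], record["temp_c"], record["lon"], record["lat"]),
    )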

IoT Sensor Data is gathered and stored in SensorHub and made available externally using OGC SensorThings and OGC WMS standards.

Satellite data such as Sentinel and Radarsat are processed to detect potential oil spills. This process results in a shapefile which is rendered in the client using OGC API - Features.

Using GO Mobile, field data is collected in disconnected operations, stored in OGC GeoPackages, and rendered in the client using OGC WMS.

In order to provide context to the data rendered on screen, Compusult extended the OGC API - Features specification to also include styling rules. These styling rules provide a default style for each layer, with a URL to the associated SLD. To make use of the SLDs, Compusult incorporated an open-source application that extends the capabilities of Leaflet to read and render complicated SLD documents. The SLD extension for Leaflet is located on GitHub at https://github.com/orfon/Leaflet.SLD.

The sample JSON below shows the default style for the METAR Weather Report Observations layer.

 {"default":"","styles":[
    {"styles":[
        {"description":"Default Style for Metar Weather Report Observations","links":[
           {"rel":"style",
"href":"https://dbwms-dev-srv2.compusult.com/ServiceDBWMS/webservices/geodb/METAR/feature/styles/metar_weather_report_default?",
          "type":"application/vnd.ogc.sld+xml;version=1.0",
          "title":"Style for Metar Weather Report Observations"
        }],
    "Id":"metar_weather_report_default",
    "title":"Metar Weather Report Observations"}]}]
}
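A client following this style link might then fetch the referenced SLD for rendering, as in the sketch below (the URL is the one advertised in the sample above; treat the request itself as illustrative).

# Sketch: follow the advertised style link and retrieve the SLD document.
import requests

style_url = ("https://dbwms-dev-srv2.compusult.com/ServiceDBWMS/webservices/"
             "geodb/METAR/feature/styles/metar_weather_report_default?")
sld_xml = requests.get(style_url).text  # SLD 1.0 XML, per the advertised type
print(sld_xml[:200])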

10.4.4. Demonstration

The following D104 server demonstration storyboard illustrates how users interact with the server component developed in this deliverable and helps to clarify the product objectives and functionalities. The actions, actors, and technologies involved are described in the following table, according to each stage of the demonstration.

Table 3. D104 server demonstration storyboard

Seq # | Activity Description | Actor/End User & Technology
1 | A call center receives a call about an emergency incident. | Actor: Call Center Operator. Application: Incident Reporting System. Interoperable Technologies: OGC CSW Catalog, OGC API - Features. Data: AIS ship positions.
2 | The location of the incident is forwarded to emergency response professionals. A “location report” allows them to view the incident location on a map and also see their own position on the map. | Actor: Coast Guard. Application: Incident Reporting System. Interoperable Technologies: OGC CSW Catalog, OGC API - Features. Data: AIS ship positions.
3 | Ground crews are dispatched to the shoreline to collect field data in the form of photos and routes, to assess potential impacts. | Actor: Coast Guard. Application: Incident Reporting System. Interoperable Technologies: OGC CSW Catalog, OGC GeoPackage. Data: Routes and photos.
4 | Helicopters are dispatched to retrieve people from the ship and transport them to the nearest medical center. | Actor: Coast Guard. Application: Incident Reporting System. Interoperable Technologies: OGC CSW Catalog, OGC GeoPackage. Data: Routes, photos, and GIS locations.
5 | Satellite imagery over the area is gathered for analysis of a potential oil spill. | Actor: Coast Guard. Application: Incident Reporting System. Interoperable Technologies: OGC CSW Catalog, OGC API - Features, ESRI Shapefile. Data: Data detailing a potential oil spill.

10.4.4.1. Demonstration Video

To view the demonstration video showing the Compusult Fusion Server supporting emergency operations and monitoring potential threats to the surrounding marine ecosystems in the Arctic, please follow the link to the video on OGC’s YouTube Channel.

10.4.5. Results: Integration, Interactions & Interoperability

Compusult provided services through both OGC APIs and OGC WMS to the various clients. These services were successfully integrated with all of the following clients.

  • D100 - Esri Canada

  • D101 - Helyx

  • D102 - Ecere

These clients each provided different technologies, including Esri’s ArcGIS Pro and Helyx’s web-based Leaflet JS client, showing that the integration of OGC Standards for Features and WMS can be accomplished using various clients. There were no issues with interoperability between the servers and clients. By providing the root endpoint of the OGC API - Features server, the clients were able to interact with the services and data without requiring additional information from Compusult.

From a client perspective, Compusult successfully published the OGC API endpoint into its OGC CSW catalog to allow for easy search and discovery later. The same OGC API was then used in a portfolio to enhance the data content for the scenario. However, the lack of defined styling rules and legends made it difficult to determine visually what the intent of the data was when first viewed. Users were required to interact with the data to understand the content they were analyzing visually.

As described in the Challenges and Lessons Learned below, Compusult feels that OGC API - Features can provide a great deal of information to the end user in a standard format that is easy to discover and navigate. However, the lack of styling rules and legends detracts from the end user experience.

10.4.6. Challenges and Lessons Learned

The results of this pilot revealed a lack of data in the subject area. The Arctic Voyage Planning Guide provided a significant amount of data through much of the Arctic region, but only a small subset of the layers contained data in the subject area. This posed significant challenges in developing a scenario, and hence simulated data had to be used.

Besides the lack of data, two of the main challenges Compusult faced when rendering the data in the client were the lack of styling rules and the lack of legends. This left Compusult rendering polygon and point features from various sources that all looked and felt the same. Without styling and legends it was difficult to put the data in the right context for the end user.

As described in the technical architecture section above, Compusult overcame the styling problem with its own API and client by extending OGC API - Features to include references to SLDs that were utilized in Leaflet through the open source Leaflet.SLD addition.

With CoverageJSON, a legend was added to the client by mapping the data content in JSON to a color palette representing the legend for the data. The color profile is selected by the client based on the type of GRIB data (e.g., wind, temperature, rain). This provides the end user with a legend representative of the data being viewed. This approach, however, comes with limitations, as the client has to be updated to accommodate new data, and it does not provide a solution for external clients.

Compusult was also able to compare the use of OGC WMS and OGC API - Features on the same data sources through this pilot. OGC WMS provided the benefit of uniform styling and legends across all clients, as the images were rendered and served from the same server. However, the images can be quite large and thus cause performance issues in bandwidth-constrained environments. It was determined that the smaller JSON responses from OGC APIs provide much faster response times. The ability to filter the results from an OGC API also gave much more control to the end user, allowing them to select only features relevant to the area in which they were interested. With the fine-grained filtering available through CQL queries, clients can make requests that return only those features pertinent to the situation. For example, the scenarios shown in the resulting videos could be enhanced for the end user if they were given the ability to find features contained in an AOI with a water depth no greater than N meters. Such a query would have allowed the user to determine if and when the ship would run aground.
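
For illustration, the sketch below issues such a filtered request against an OGC API - Features endpoint using the CQL2 text encoding from the draft Part 3 (Filtering) extension. The endpoint URL, collection name, and the "depth" property are hypothetical.

    import requests

    # Request only shallow-water features inside an area of interest;
    # the endpoint, collection, and "depth" property are hypothetical.
    resp = requests.get(
        "https://example.com/ogcapi/collections/bathymetry/items",
        params={
            "bbox": "-156.0,70.0,-154.0,71.5",  # AOI as lon/lat bounds
            "filter": "depth <= 10",            # CQL2 text filter
            "filter-lang": "cql2-text",
            "limit": 100,
            "f": "json",
        },
        timeout=30,
    )
    resp.raise_for_status()
    features = resp.json()["features"]
    print(f"{len(features)} features in the AOI with depth <= 10 m")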

10.5. Server 3 (D105) - University of Calgary

10.5.1. Description

Our team is undertaking research into the use of DGGS within the University of Calgary, School of Engineering, Geomatics Department. Through successful participation in Phase Two of the FMSDI Pilot Project, our Fusion server provided data-agnostic feature and coverage data fusion services discovered through OGC EDR API Collections (See Video Here). Participation in the original Arctic Spatial Data Pilot (See Video Here) involved a scenario of scientific modeling and analysis, based on a DGGS system architecture that fused data coming from OGC Feature and Coverage services to forecast change in permafrost distribution due to climate change. Slope stability, wildlife ranges, and human socio-economic effects were assessed.

Our contribution to the Federated Marine SDI Pilot Phase 3 is a Discrete Global Grid System (DGGS) server deployed as the D103 Fusion Server. The D103 DGGS Fusion Server acts as a data integration engine, harvesting key data values from multiple sources of heterogeneous data and delivering packets of analysis-ready data through the OGC API - DGGS. As DGGSs are hierarchical equal-area data structures, the data can reside at a native resolution while aligned with other spatial resolutions/scales, and the data in a collection of cells can be summarized using statistics and lists.

The D103 DGGS Fusion Server interacted with the DGGS client provided by Ecere Corporation, serving as a practical Technology Integration Experiment (TIE) for the current OGC API - DGGS candidate draft standard.

10.5.2. Sub-scenario and Data

For Phase 3 of the pilot, the University of Calgary developed three sub-scenarios as follows.

  • Analysis of coastal erosion due to climate change in Alaska on a DGGS Client.

  • Analysis of potential flooded populations due to sea level rise on a DGGS Client.

  • Analysis of potential hazards along a navigational route on a DGGS Client.

10.5.2.1. Predictive modeling and visualization of coastal erosion

For the three sub-scenarios, predictive modeling and visualization were used. Permafrost coastline, characterized by the presence of ice and cohesive sediments, constitutes one-third of the world’s coastline. Erosion of the Arctic coastline has adverse impacts on social life and the economy of the communities living in the area. There are two main processes of coastal erosion in the Arctic regions: thermodenudation and thermoabrasion. Studies suggest that erosion along the Arctic coastline is considerable and increasing. With increasing global warming, sea ice is disappearing at an accelerated rate and wave growth in the Arctic has increased to an alarming level. The situation is worsened by the fact that most of the existing knowledge regarding coastal erosion pertains to temperate areas and non-cohesive sediments (Afzal and Lubbad, 2019).

Generally, diverse land and marine datasets (ocean currents, geology, temperature, wind, permafrost characteristics, etc.) are integrated to reveal sensitivity to inundation, coastal flooding, and erosion arising from climate-related changes (e.g., sea level, sea ice, and storminess). These integrated data values are used as inputs to parametric-type models that can be tested and used. An example is the coastal erosion model described by Rolph et al. (2021) [6], where a wind forcing dataset, masked during times of sea ice cover, is used to force a coupled storm surge model. This provides water level data to the erosion model, driving the bluff retreat and beach erosion through a heat and volume balance. Sea surface temperature, wave height, and wave period are also considered, as well as the prescribed cliff and beach parameters of volumetric ice content, sediment grain size, cliff height, thaw depth, and cliff and beach angle (see Figure 18).

UoC 01
Figure 18. Model sketch illustrating an example of a parametrization model of pan-Arctic coastline erosion. Basic physical model parameters are in black and processes in red (Rolph, R., Overduin, P. P., Ravens, T., Lantuit, H., & Langer, M. (2021). ArcticBeach v1.0: A physics-based parameterization of pan-Arctic coastline erosion. Geoscientific Model Development Discussions, 1-26).
10.5.2.2. Analysis of coastal erosion due to climate change in Alaska on a DGGS Client

In the first sub-scenario, a regional government on the coastline of Alaska is concerned over the risk of coastal erosion and has hired a consulting scientist to provide a general overview of it. Using acceptable methods of prediction and readily available data, the scientist is able to rapidly integrate data via a DGGS Service from their client application and undertake various computations to assess the potential risk.

The analysis and modeling performed here are limited: they illustrate how data and information acquisition and integration in the Alaska region can be transformed into knowledge discovery using interoperability standards. They should not be taken as representative scientific results or conclusions.

The main goal is to predict where coastal slopes will be weakened due to permafrost loss and sea level rise, considering the slope and aspect of the surface model, surficial geology, land use, and current permafrost conditions.

For this, a parametric model was developed on the DGGS space following the five steps listed below, based on Couture and Riopel (2008) [Couture, R., & Riopel, S. (2008). Regional landslide susceptibility mapping and inventorying in the Mackenzie Valley, Northwest Territories. In Proceedings of the 4th Canadian Conference on Geohazards: from causes to management, Presse de l’Université Laval, Québec, pp. 375-382] (see Figure 19).

  • Step 1: Data acquisition

  • Step 2: Derive new data

  • Step 3: Parametrize/reclassify

  • Step 4: Rating and weighting

  • Step 5: Final integration

UoC 02
Figure 19. The landslide susceptibility parametric method steps used to produce a landslide susceptibility map (Couture and Riopel, 2008)

Each condition is normalized to a parameter value from 1 to 10, where 10 indicates a characteristic likely to contribute significantly to slope failure. The final integration is a combination of the parameters with the following weightings (a minimal sketch of the weighted integration follows the list).

  • 30% Slope Parameter

  • 5% Aspect of Slope Parameter

  • 5% Extent of Permafrost Parameter

  • 20% Ice Content Parameter

  • 10% Land Cover Type Parameter

  • 30% Surficial Geology
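
A minimal sketch of this weighted integration, assuming each parameter layer has already been reclassified to a 1-10 rating per DGGS cell (all names are illustrative):

    # Weights from the list above; they sum to 1.0 (i.e., 100%).
    WEIGHTS = {
        "slope": 0.30,
        "aspect": 0.05,
        "permafrost_extent": 0.05,
        "ice_content": 0.20,
        "land_cover": 0.10,
        "surficial_geology": 0.30,
    }

    def susceptibility(cell_ratings: dict[str, float]) -> float:
        """Weighted sum of the parametrized 1-10 ratings for one cell."""
        return sum(WEIGHTS[name] * rating for name, rating in cell_ratings.items())

    print(susceptibility({"slope": 7, "aspect": 5, "permafrost_extent": 10,
                          "ice_content": 5, "land_cover": 1,
                          "surficial_geology": 5}))  # -> 5.45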

The resulting DGGS Globe indicating coastal sensitivity to climate change can be shared with others and can be used to assess the impacts on local communities, infrastructure, traditional activities, wildlife migration, etc., and to identify areas in need of detailed study.

Several questions were asked during the scenario development.

  • Where are the areas along the coast of Alaska that might be prone to erosion due to the rising sea levels and lost permafrost due to climate change?

  • What are the characteristics of the human population and natural environment that may be affected by these changes?

The relationship of this sub-scenario to the main scenario was a Coastal Erosion Map for Ecotourists interested in what the landscape will look like in 50 years.

10.5.3. Data and sources

The datasets described above come from different types of sources and services: coverage services, feature services, etc. The Discrete Global Grid System (DGGS) server is able to act as a data integration engine and acquire this data via OGC standards. During the ingestion process, the data is quantized and sampled into DGGS cells. Using DGGS operations, the erosion model is implemented as a series of calculations over the cells. The results are exposed using the OGC API - DGGS as packets of analysis-ready data that can be read by any DGGS-enabled client (see Figure 20).

UoC 03
Figure 20. Analysis of Coastal Erosion due to climate change scenario architecture.
10.5.3.1. Analysis of potential flooded populations due to sea level rise on a DGGS Client

Rising sea levels are due to two main factors: the melting of land ice, which adds water to the oceans; and the warming of the waters (as water expands as it gets warmer). Sea level has been measured regularly since the 19th century using systems of coastal tide gauges. Over the past 100 years, global average sea level has risen steadily by about 0.2 m (see Figure 21). In recent years, sea level has been rising more than twice as fast.

UoC 04
Figure 21. Two possible results of sea level rising based on two pathways from global warming: RCP2.6 Low – 0.44m (black) and RCP8.5 High 0.84m (red)

Many coastal communities are already seeing the effects of sea level rise. This sub-scenario simulates several future situations where new, raised sea levels are modeled over the coast of Alaska. This is achieved by a cell selection filtered by a range of heights/depths and quickly shows the areas that would be affected by a sea level rise of a certain magnitude. The integration of additional datasets of interest to analyze the characteristics of the affected population is possible through the DGGS server.
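
A minimal sketch of that cell selection, assuming per-zone elevations (meters relative to current sea level) have already been quantized from GEBCO into DGGS cells; the zone IDs and values below are toy data:

    # Select zones currently above sea level that a rise of `rise_m`
    # meters would inundate.
    def flooded_zones(zone_elevations: dict[str, float], rise_m: float) -> set[str]:
        return {zone for zone, elev in zone_elevations.items() if 0.0 < elev <= rise_m}

    zones = {"A1": 0.4, "A2": 1.7, "A3": -3.0}   # toy elevations in meters
    print(flooded_zones(zones, rise_m=1.0))      # {'A1'}

Joining the selected zones against a population density layer quantized into the same cells then yields an estimate of the affected population.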

10.5.4. Data and sources

  • Global Elevations – GEBCO Bathymetry: https://www.gebco.net/data_and_products/gridded_bathymetry_data

  • World population density dataset: Center for International Earth Science Information Network - CIESIN - Columbia University. 2018. Gridded Population of the World, Version 4 (GPWv4): Population Density Adjusted to Match 2015 Revision UN WPP Country Totals, Revision 11. Palisades, New York: NASA Socioeconomic Data and Applications Center (SEDAC). https://doi.org/10.7927/H4F47M65.

10.5.4.1. Analysis of potential hazards along a navigational route on a DGGS Client

Information concerning the navigational safety of ships is of critical importance along a navigational route, and unfortunately, shipwrecks are not uncommon. In earlier days, this information was difficult to obtain in a timely manner due to limitations of maritime communication equipment and data availability. With S-100 hydrographic sources containing historical, derived, and real-time information, a model of navigational risk can help assess navigational routes.

In this sub-scenario, a navigator is planning a route over the waters near Alaska and wants to assess the major risks and concerns along the route before deciding where to navigate.

Shallow sea depth, Marine Protected Areas, High-Density Vessel Traffic, marine life, and other cautions provided by NOAA and the US Coast Guard are mashed up and parameterized to prioritize risk.

10.5.6. Technical Architecture

10.5.6.1. Overview

The D103 Fusion Server allows the integration of geographical data from heterogeneous sources (both raster and vector encodings) into an Icosahedral Snyder Equal Area Aperture 3 Hexagonal (ISEA3H) Discrete Global Grid System (DGGS). The D103 DGGS Fusion Server acts as a data integration engine, harvesting key data values from multiple sources of heterogeneous data and delivering packets of analysis-ready data through the OGC API - DGGS implementation, or other types of OGC API implementations (EDR, Features, etc.). These implementations allow the discovery of data as OGC collections and serve as multiple interfaces through which an integrated data application can access DGGS data. Figure 22 shows an overview of this server’s technical architecture.

UoC 05
Figure 22. Overview of the technical architecture of the DGGS fusion server with three main components: data, DGGS server, and clients that communicate using OGC APIs

The D103 Fusion Server component was demonstrated through the University of Calgary sub-scenario developments and successfully responded to DGGS API query requests from the Ecere Corporation client.

10.5.6.2. DGGS and the Icosahedral Snyder Equal Area (ISEA) projection

A Discrete Global Grid System is a geometric partitioning of the entire globe into tessellations of uniform cells that refine into ever smaller cells. Each cell is indexed to permit parent-child and neighbor operations. Data is easily partitioned horizontally and vertically for efficient search, processing, and transmission. As the data values are sampled and quantized into the discrete cells, most DGGS operations are akin to image processing and image algebra. Topological relationships are explicit set theory, and because cells are equal area, aggregation and summary operations are straightforward.

The D103 Fusion Server internally performs data quantization and integration based on the Icosahedral Snyder Equal-Area Aperture 3 Hexagonal (ISEA3H) DGGS. This DGGS is based on the planar ISEA projection described by John P. Snyder in “An Equal-Area Map Projection For Polyhedral Globes” (1992) (see Figure 23).

UoC 06
Figure 23. ISEA projection illustration from the PROJ library for proj-string: +proj=isea

The ISEA3H DGGS has the icosahedron as its base polyhedron, with its triangular faces partitioned into hexagons. Each vertex of the icosahedron is the meeting point of 5 triangles, so the grid shape at each vertex is necessarily a pentagon, and the first resolution is composed of 12 pentagon cells. Each subsequent resolution also contains 12 pentagon cells, with the remaining cells being hexagons. The pentagon cells have the property of being 5/6 the area of the hexagons. Figure 24 shows the ISEA3H DGGS at several resolutions.

UoC 07
Figure 24. The ISEA3H global grid at several resolutions.

Aperture (A3) refers to the way in which refinements of the hexagons are performed. Aperture 3 (Figure 25) causes the hexagons to alternate between two orientations (point at top and edge at top), and the area of each hexagon decreases to one-third at each increasing resolution (a short consistency check of these properties follows Figure 25).

UoC 08
Figure 25. Three resolutions of an aperture 3 hexagon grid on the face of the icosahedron. Adapted from Sahr, K., White, D. and Kimerling, A. J. (2003). Geodesic discrete global grid systems, Cartography and Geographic Information Science 30(2): 121–134.
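
As a quick consistency check of these properties, the sketch below computes cell counts and mean hexagon areas per resolution, assuming the standard ISEA3H total of 10·3^r + 2 cells (with resolution 0 taken as the 12-pentagon grid):

    # Every ISEA3H resolution contains exactly 12 pentagons, each 5/6 the
    # area of a hexagon; with 10*3**r + 2 cells in total, the mean hexagon
    # area is EARTH_AREA_KM2 / (10 * 3**r), shrinking 3x per resolution.
    EARTH_AREA_KM2 = 510_000_000  # approximate

    def cell_count(r: int) -> int:
        return 10 * 3**r + 2

    def hexagon_area_km2(r: int) -> float:
        return EARTH_AREA_KM2 / (10 * 3**r)

    for r in range(4):
        print(r, cell_count(r), round(hexagon_area_km2(r)))
    # r=0: 12 cells (all pentagons), r=1: 32, r=2: 92, r=3: 272, ...
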
10.5.6.3. The PYXIS indexing

The PYXIS indexing is a parent-child hierarchical indexing for the ISEA3H DGGS that uses a 3-bit digit to encode the 7 child cells of each parent. It is the indexing system that the D103 Fusion Server uses internally.

The square root three tiling used in the ISEA3H is a series of base 3 refinements. Hexagons can neither be divided into smaller hexagons nor aggregated to form larger hexagons, which distinguishes them from common congruent tessellations of square and triangular tiles (quad-trees). Congruence provides an efficient basis for the hierarchical division of space but fails to retain the important mathematical property of monotonic convergence required in a complete numbering system.

The PYXIS index encodes both hierarchy and convergence. Two types of child cells can be identified in the square root three subdivision. A child cell that shares the centroid of the parent cell is called a Centroid Child, and a child cell whose centroid is located at the vertex of a parent cell is called a Vertex Child. The PYXIS indexing allocates a digit to each of the children of a parent cell. The parent index is refined by adding a zero (0) placeholder to index the Centroid Child.

Vertex Children share a vertex with 3 parents. This raises a problem similar to the one encountered when rounding the number 23.465 to 2 decimal places: up or down? A rule is required to overcome this dilemma. Parent cells that were themselves Centroid Children (centroid parents) are used to form the index of the Vertex Children. A useful property follows: every child zone centered on a parent zone’s vertex (Vertex Child) always has exactly one parent that is a Centroid Child. This property can be used to easily select the primary parent for Vertex Child zones.

In Figure 26, the parent cell "0" spawns a Centroid Child "00". This can continue indefinitely for all cells located at the center of their parent, so "00" is in turn the parent of the centroid cell "000". The parent index is refined by adding a digit ("1", "2", … "6") to index a Vertex Child (shown in Figure 26, for example, as "002"). This type of indexing generates the PYXIS tile shown in Figure 27, known as the “Snowflake pattern” (a minimal sketch of this refinement rule follows Figure 27).

UoC 09
Figure 26. Three levels of the pyxis indexing at different colors (green, blue, red).
UoC 10
Figure 27. Pyxis tile and the snowflake pattern
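
The refinement rule can be summarized in a couple of helper functions (names are illustrative; pentagon cells have only five vertex children):

    def pyxis_centroid_child(parent: str) -> str:
        """Append the '0' placeholder to index the Centroid Child."""
        return parent + "0"

    def pyxis_vertex_children(centroid_parent: str, pentagon: bool = False) -> list[str]:
        """Vertex Children are indexed only from a centroid parent,
        using digits 1..6 (1..5 for the 12 pentagon cells)."""
        digits = "12345" if pentagon else "123456"
        return [centroid_parent + d for d in digits]

    assert pyxis_centroid_child("00") == "000"     # "0" -> "00" -> "000"
    assert "002" in pyxis_vertex_children("00")    # the vertex child from Figure 26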

Coverage of PYXIS tiles over the Earth forms a complete global spatial reference system. A full PYXIS tile is positioned on each of the 20 faces of the icosahedron, and a modified 5/6th PYXIS tile (with one branch removed from the root/first parent and its subsequent Centroid Children) is positioned on each of the 12 vertices of the icosahedron, as shown in the unfolded version (Figure 28). Each of these 32 tiles (20 + 12) is given a unique label with the general form shown in Figure 29.

UoC 11
Figure 28. Full earth coverage of the PYXIS tiles on the unfolded icosahedron
UoC 12
Figure 29. General form of the PYXIS indexing encoding
10.5.6.4. The Rhombus indexing

The Rhombus Index addresses the need for a rectilinear data structure suited to texture mapping and other efficient GPU operations, while maintaining complete coverage over all ISEA3H DGGS hexagonal cells. An indexing method based on rhombuses greatly simplifies the tasks of partitioning, indexing, and encoding the data of particular geospatial regions.

The first resolution of 10 rhombus tiles corresponds to pairs of the twenty icosahedron triangle faces of the planar ISEA projection (Figure 30). Each of these root rhombuses is divided into nine (3 by 3) equal-area rhombuses at the next level (aperture 9; Figure 31). Since the hexagons of each successive ISEA3H level occupy an area three times smaller than those of the previous level, each rhombus tile level corresponds to every other (even) level of ISEA3H.

UoC 13
Figure 30. Full earth coverage of the Rhombus tiles on the unfolded icosahedron
UoC 14
Figure 31. Next level resolution of the Rhombus tiles shown on the globe. Each root Rhombus is divided into 9 rhombus children

The Rhombus index uses UV axis coordinates to address rhombus tiles that cover the ISEA3H hexagons (Figure 32). The problem arises of how to create a relationship between the rhombus and the hexagon cells within it. A decision needs to be made about which hexagon edge cells to include in each rhombus, creating the concept of included vs. open edges (Figure 33): all cells intersecting the included edges are included in the rhombus region, while all other cells are included in the rhombus region only if fully contained by it.

UoC 15
Figure 32. Next level resolution of the Rhombus tiles shown on the globe. Each root Rhombus is divided into 9 rhombus children
UoC 16
Figure 33. Included VS open edges on a Rhombus tile.

As a summary, the rhombus indexing maintains the following main properties.

  • All rhombuses have the same area on Earth.

  • There are 10 rhombuses covering the whole Earth, and the north and south pentagons do not belong to any rhombus.

  • Each subsequent resolution divides a rhombus into 9 children and corresponds to two resolution steps of the PYXIS indexing.

  • Rhombuses are congruent.

  • Each rhombus includes hexagonal tiles on only 2 of its edges, following the concept of included and open edges.

Each of these rhombus tiles is also given a unique label with the following general form:

                MDDDD

where M identifies the root rhombus (0…9, or N/S for the north and south poles) and each D identifies a sub-rhombus (0…8).
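Navigating this labeling scheme reduces to appending or stripping digits, as in the illustrative helpers below:

    def rhombus_children(label: str) -> list[str]:
        """Aperture 9: each rhombus divides into nine children, digits 0..8."""
        return [label + str(d) for d in range(9)]

    def rhombus_parent(label: str) -> str | None:
        """Strip the last digit; the root rhombuses (0..9, N, S) have no parent."""
        return label[:-1] if len(label) > 1 else None

    print(rhombus_children("3"))   # ['30', '31', ..., '38']
    print(rhombus_parent("304"))   # '30'
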
10.5.6.5. The PYX0 format

The PYX0 format is the PYXIS-defined format used to send rhombus tiles (containing float data values) to a DGGS client that understands the rhombus tiling schema. It is a way to send float32 textures by quantizing the data values and compressing them. The compression is done in 2 steps.

  1. A lookup table of float values is created (the desired BinCount defines the number of entries). This is done by sorting and normalizing the data.

  2. Each float data value is quantized into 2 bytes.

    1. First byte (stored in ValueBinIndex array) defines the bin to use in the Bins lookup table.

    2. Second byte (stored in ValueBinOffset array) defines the offset of the current value inside the bin.

They get stored into a buffer as shown in the following table.

Table 4. PYX0 format utilized to quantize and send Rhombus tiles

Section | Length (bytes) | Description
MagicHeader (signature) | 4 | "PYX0"
RhombusSize | 4 | size of image (as int)
BinCount | 4 | number of bins (as int)
BinOffsetResolution | 4 | offset resolution range (as int)
Bins (lookup table) | 4 * BinCount | bins (as floats)
ValueBinIndex array | size * size | one-byte index into the Bins lookup table for each value
ValueBinOffset array | size * size | one-byte offset into the bin for each value

To restore the value and decode the PYX0 format, a DGGS API Client reader should do the following:

                resultValue = Bins[ValueBinIndex] +
            (Bins[ValueBinIndex+1] - Bins[ValueBinIndex]) *
                ValueBinOffset / BinOffsetResolution

where a ValueBinIndex of 1 refers to the first entry in the Bins lookup table (it should be replaced by ValueBinIndex - 1 for 0-based indexing, as in C-like programming languages). A ValueBinIndex of 0 indicates a null value.

Therefore, the number of possible values is BinCount * BinOffsetResolution, distributed nonlinearly over the given input value range. Note also that some rhombuses can contain null values. A null value is encoded as ValueBinIndex=0 and ValueBinOffset=0; therefore bin numbers start from 1.
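
A minimal decoder sketch following Table 4 and the formula above (Python/NumPy; the little-endian byte order and the clamping of the interpolation at the last bin are assumptions):

    import struct
    import numpy as np

    def decode_pyx0(buf: bytes) -> np.ndarray:
        """Decode a PYX0 rhombus tile into a (size, size) float array."""
        magic, size, bin_count, offset_res = struct.unpack_from("<4siii", buf, 0)
        assert magic == b"PYX0", "not a PYX0 tile"
        pos = 16
        bins = np.frombuffer(buf, dtype="<f4", count=bin_count, offset=pos)
        pos += 4 * bin_count
        idx = np.frombuffer(buf, dtype=np.uint8, count=size * size, offset=pos)
        pos += size * size
        off = np.frombuffer(buf, dtype=np.uint8, count=size * size, offset=pos)

        values = np.full(size * size, np.nan, dtype=np.float32)  # index 0 = null
        valid = idx > 0
        i = idx[valid].astype(np.int64) - 1            # 1-based -> 0-based bins
        upper = np.minimum(i + 1, bin_count - 1)       # clamp at the last bin
        values[valid] = bins[i] + (bins[upper] - bins[i]) * off[valid] / offset_res
        return values.reshape(size, size)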

10.5.6.6. DGGS server architecture

The PYXIS DGGS server cluster used in the Pilot Project operates 6 different modules to facilitate data discovery, encoding, storage, processing, visualization, and sharing, depicted in Figure 34 and explained below.

  1. Discovery: crawling and harvesting data sources. This is done through the ability to discover OGC-compliant services, and other types of services, for available datasets.

  2. Encoding: sampling and quantizing data into the hexagonal cells following quantization algorithms of nearest neighbor, bilinear and bicubic. DGGS are data agnostic – capable of ingesting data values across all spatial resolutions, reference systems, formats, and data types.

  3. Storage: cache and indexing. The fast data transmission of rhombus tiles is based on the indexing system and a caching mechanism on the server.

  4. Processing: based on the four DGGS operations of iterations, calculations, selections, and aggregated summaries.

  5. Visualization: worldview webglobe UI (Ucalgary DGGS Client).

  6. Sharing:

    1. Tile generator: that generates rhombus tiles encoded in PYX0 format as data packets, and also PNG styled images based on the rhombus indexing.

    2. Pipeline encoding: deals with processes (operations) in the data and allows the chaining of processes and styling mechanisms.

UoC 17
Figure 34. DGGS server architecture composed of six different modules.

The capabilities of the server are exposed through the available OGC API implementations. Integrated data applications acting as clients can send requests to retrieve data packages encoded as PYX0 rhombus tiles and PNG styled images of the data, perform calculations across the server’s data collections, explore and discover characteristics of selected locations, and search for locations that meet specific characteristics.

10.5.6.7. DGGS operations and processes

As described for the processing module of the DGGS server, four main operations and their combinations describe DGGS processes as follows.

  • Iterations: processes that traverse over the cell indexing (Zone IDs).

  • Calculations: processes that create or modify a single value in a cell. With data values aligned in the DGGS cells, calculations are as simple as spreadsheet expressions that result in new data values.

  • Selection: processes that group a set of cells. For example, using a feature to select an area of interest (represented as a group of cells).

  • Aggregation: processes that characterize or summarize the values in a group of cells. After a selection, the server can return aggregated statistics to characterize a location.

The DGGS server defines the concept of pipelines as a chaining of DGGS operations and datasets. The pipelines are stored in the server and are accessible as if they were OGC collections. Note that the pipelines perform their calculations on the fly and use the same caching mechanism as datasets. As a best practice, the DGGS server would provide a method of serializing these operations into a process chain, and instances of DGGS process chains could themselves be used in a larger process chain framework with the ability to use a network of different OGC-compliant servers.
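
A minimal sketch of this pipeline idea, modeling DGGS operations as composable functions over a mapping of Zone ID to value (the helper names are illustrative, not the server's actual interfaces):

    from functools import reduce
    from typing import Callable

    Cells = dict[str, float]
    Op = Callable[[Cells], Cells]

    def scale(factor: float) -> Op:
        """A calculation: derive a new value for every cell."""
        return lambda cells: {z: v * factor for z, v in cells.items()}

    def select(predicate: Callable[[float], bool]) -> Op:
        """A selection: keep only the cells matching a filter."""
        return lambda cells: {z: v for z, v in cells.items() if predicate(v)}

    def pipeline(*ops: Op) -> Op:
        """Chain operations; evaluated on the fly like a stored pipeline."""
        return lambda cells: reduce(lambda acc, op: op(acc), ops, cells)

    high_risk = pipeline(scale(0.3), select(lambda v: v >= 2.4))
    print(high_risk({"A": 10.0, "B": 5.0}))  # {'A': 3.0}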

10.5.6.8. OGC API - DGGS implementation

This section describes the OGC API - DGGS implementation of the UCalgary DGGS server. This was done based on the official OGC documentation accessible at https://developer.ogc.org/api/dggs/index.html.

The following API endpoints were implemented:

10.5.7. Demonstration

This section presents the results of the implementation of the sub-scenarios described in the sub-scenario and data section.

10.5.7.1. Analysis of coastal erosion due to climate change in Alaska on a DGGS client

The results of the five steps of the parametric model implementation are described below.

  1. Data acquisition: data was harvested from the described data sources and quantized into the DGGS server.

  2. Derive new datasets: slope and aspect were derived from the GEBCO Bathymetry dataset using the following DGGS operations and the calculator tool (Figure 35) from the UCalgary DGGS client:

                slope([OGC_DGGS_Server/GEBCO_2022_Bathymetry_Topo_Elevations])
                aspect([OGC_DGGS_Server/GEBCO_2022_Bathymetry_Topo_Elevations])
    UoC 18
    Figure 35. Calculator tool of the UCalgary DGGS client web Globe application.
    UoC 19
    Figure 36. Results of the slope calculation based on the GEBCO bathymetry dataset.
    UoC 20
    Figure 37. Results of the aspect calculation based on the GEBCO bathymetry dataset.
  3. Parametrize/reclassify: the parametrization and reclassification of the parameters involved in the model were done using the calculator tool from the UCalgary DGGS client. It assigns different values depending on the attributes of the datasets:

    Slope Parameter -> transform([Slope],{"0":1, "3.5":3, "8.7":5, "17.6":7, "36.4":10, "100":10}, false)
    Aspect of Slope Parameter -> transform([Aspect],{0:1, 45:5, 135:10, 225:5, 315:1, 360:1}, false)
    Extent of Permafrost Parameter -> transform ([EXTENT@Permafrost],{"C":1, "D":5, "S":7, "I":10},true)
    Ice Content Parameter -> transform ([CONTENT@Permafrost],{"l":1, "m":5, "h":10},true)
    Land Cover Type Parameter -> transform([Global_Land_Cover],{0:1, 1:1, 2:1, 3:1, 4:1, 5:1, 6:5, 7:5, 8:5, 9:5, 10:5, 11:1, 12:5, 13:1, 14:5, 15:1, 16:10},false)
    Surficial Geology -> =transform([STATE_LABEL2@/Surficial_Geology_of_Alaska],{"CPC":1, "DOgi":1, "DSum":1, "Gabbro":1, "Jag":1, "Jegd":1, "Jg":1, "Jgr":1, "Jhg":1, "Jise":1, "Jit":1, "JPztu":1, "Jtr":1, "Jum":1, "Keg":1, "Kgb":1, "Kgu":1, "KJdg":1, "KJg":1, "KJmu":1, "KJse":1, "Klgr":1, "Klqm":1, "Kmgr":1, "Kmqm":1, "KPzum":1, "Ksfg":1, "Ksy":1, "Kum":1, "MDgi":1, "MzPzi":1, "Ogi":1, "PIPgi":1, "Pzgb":1, "PzPxmi":1, "QTi":1, "Tcp":1, "Tcpp":1, "Tehi":1, "Tephi":1, "Tgba":1, "Tgbe":1, "Tgbw":1, "Tgw":1, "TKg":1, "TKgb":1, "TKgd":1, "TKhi":1, "TKpd":1, "TKpeg":1, "TKts":1, "Tmi":1, "Tod":1, "Toegr":1, "Togr":1, "Togum":1, "Tpg":1, "Tpgi":1, "Tpgr":1, "Trc":1, "Trdg":1, "Trgb":1, "TrPzig":1, "Trqd":1, "Trum":1, "":1, "Clgv":2, "Dfr":2, "Dmv":2, "Dvec":2, "Jab":2, "Jtk":2, "JTrob":2, "JTrpf":2, "KJiv":2, "KJv":2, "Kmvi":2, "Ksbd":2, "Ksv":2, "Kvu":2, "MzPzyo":2, "OCv":2, "Phb":2, "Pv":2, "Pxtnm":2, "Pxv":2, "QTv":2, "QTvs":2, "Qv":2, "Sv":2, "Tbk":2, "Tca":2, "Tepv":2, "Tev":2, "Thi":2, "TKv":2, "TKwt":2, "Tmv":2, "Tob":2, "Togv":2, "Tpcv":2, "Tpt":2, "Tpv":2, "Trb":2, "Trcb":2, "Trn":2, "Trvs":2, "Trvsw":2, "Tsr":2, "Tvm":2, "Tvu":2, "Twv":2, "CPxwg":5, "CPxwgm":5, "Crc":5, "CZogn":5, "Das":5, "Dgb":5, "Dgbm":5, "Dmi":5, "Dogn":5, "DOmvs":5, "DOnx":5, "DOtm":5, "DOtp":5, "DOtu":5, "DPxacs":5, "DPxaqm":5, "DPxasm":5, "DPxsgm":5, "Dq":5, "Dv":5, "JPk":5, "JPs":5, "JTrsch":5, "KDt":5, "Khs":5, "Kmig":5, "Kps":5, "MacLaren":5, "MDag":5, "MDm":5, "MDts":5, "MDtv":5, "MDv":5, "MOkg":5, "Mzmu":5, "MzPza":5, "MzPzgs":5, "MzPzm":5, "MzPzmb":5, "MzPzp":5, "MzPzsk":5, "MzPzss":5, "MzPzsv":5, "Ocs":5, "Onim":5, "pCkm":5, "Pcs":5, "PIPsm":5, "Pks":5, "Pm":5, "Ptls":5, "Pxm":5, "Pxqm":5, "Pzarqm":5, "Pzce":5, "Pzcn":5, "Pze":5, "Pzgn":5, "Pzkp":5, "Pzks":5, "Pzm":5, "Pzncs":5, "Pznp":5, "Pzps":5, "PzPxb":5, "PzPxgb":5, "PzPxkg":5, "PzPxnc":5, "PzPxrg":5, "PzPxrqm":5, "PzPxybg":5, "PzPxygs":5, "PzPxyqm":5, "PzPxyqs":5, "Pzymi":5, "Pzyms":5, "QTm":5, "Sbs":5, "SOmi":5, "SZfw":5, "Tcc":5, "TKgg":5, "Trms":5, "TrMsm":5, "TrPvs":5, "TrPzbi":5, "TrPzgp":5, "Tvc":5, "Xio":5, "Zam":5, "Zgn":5, "Zgns":5, "Zngn":5, "":5, "":5, "":5, "Ca":7, "Chulitna":7, "Clg":7, "Clgk":7, "Clgt":7, "CPxt":7, "CPxwn":7, "CZls":7, "Dbf":7, "Dbfl":7, "Dbfw":7, "DCbg":7, "Dcc":7, "DCd":7, "Dcr":7, "Degh":7, "Degn":7, "Dke":7, "Dls":7, "Dlse":7, "Dnr":7, "Dof":7, "DOhb":7, "DOka":7, "DOls":7, "DSbr":7, "DSld":7, "DSpf":7, "DSt":7, "DSwc":7, "DSyl":7, "Du":7, "Dyp":7, "Dyss":7, "DZkb":7, "DZnl":7, "DZwp":7, "IPDcf":7, "IPlgw":7, "IPMch":7, "IPMn":7, "IPsb":7, "JDmc":7, "JDoc":7, "JIPe":7, "Jk":7, "JMct":7, "JMpu":7, "Jms":7, "JMsu":7, "Jnk":7, "JPzs":7, "Js":7, "Jsc":7, "Jt":7, "JTrct":7, "JTrls":7, "JTrmc":7, "JTrmv":7, "JTro":7, "JTrp":7, "JTrrb":7, "JTrv":7, "JTrvs":7, "Jvc":7, "Jvs":7, "Kaf":7, "Kafv":7, "Kcc":7, "Kcct":7, "Kcgc":7, "Kcm":7, "Kcs":7, "Kcv":7, "Kcvg":7, "Kfm":7, "Kfy":7, "Kgk":7, "Khk":7, "Khnl":7, "Kipc":7, "Kit":7, "KJgn":7, "KJgv":7, "KJks":7, "KJs":7, "KJyg":7, "KJyh":7, "Kk":7, "Kke":7, "Kkg":7, "Kkn":7, "Km":7, "Kmf":7, "KMmu":7, "Kmss":7, "Knf":7, "Kof":7, "Kpf":7, "KPu":7, "Kqc":7, "Kqcs":7, "Ksb":7, "Ksbf":7, "Ksd":7, "Ksg":7, "Kst":7, "Kto":7, "KTrs":7, "KTrvs":7, "Ktu":7, "Kvgc":7, "Kyg":7, "MDe":7, "MDegk":7, "MDip":7, "Mek":7, "Meks":7, "Mes":7, "Mgq":7, "Mk":7, "Mlga":7, "Mlgac":7, "Mlgk":7, "Mlgnu":7, "Mlgw":7, "Oc":7, "OCdv":7, "OCjr":7, "Ols":7, "OPxls":7, "OPxpt":7, "Oyl":7, "PDcf":7, "PDsc":7, "Pe":7, "Peh":7, "Pehls":7, "Ph":7, "PIPms":7, "PIPt":7, "Plps":7, "Pls":7, "Pstc":7, "Ptl":7, "Pxkd":7, 
"Pzcu":7, "Pzgp":7, "Pzls":7, "Pzpr":7, "QTgm":7, "SCpl":7, "SCs":7, "Sl":7, "Slc":7, "SOd":7, "SOdc":7, "SOig":7, "SOv":7, "SOyl":7, "SOyld":7, "St":7, "Stc":7, "Tarcs":7, "Tcb":7, "Tcl":7, "Tes":7, "Tk":7, "TKcf":7, "Tkf":7, "TKgrs":7, "TKis":7, "TKkf":7, "Tkn":7, "TKpc":7, "TKs":7, "Tms":7, "TMzu":7, "Tnc":7, "Tng":7, "Top":7, "Tos":7, "Tovs":7, "Trcnk":7, "Trcs":7, "TrDtz":7, "Trgs":7, "Trgsl":7, "Trhg":7, "Trhgs":7, "Trhgv":7, "Trif":7, "TrIPeg":7, "TrIPsf":7, "Trkc":7, "Trlb":7, "Trls":7, "Trpg":7, "TrPsg":7, "TrPzvs":7, "Trrs":7, "Trsl":7, "Trsy":7, "Trsyv":7, "Trwm":7, "Tsf":7, "Tsi":7, "Tsk":7, "Tsti":7, "Tsu":7, "Ttk":7, "Tts":7, "Tuu":7, "Tvs":7, "White Mountain":7, "York":7, "Yukon Flats":7, "":7, "":7, "":7, "JTros":3, "JTrpp":3, "KJm":3, "KJmy":3, "Kkbm":3, "Kmar":3, "KTrm":3, "Kumc":3, "TKm":3, "g":10, "Qs":10, "QTs":10, "bu":1, "Canada":0, "DCmt":7, "w":0},true)
    UoC 21
    Figure 38. Results of the slope parametrization and reclassification process.
    UoC 22
    Figure 39. Results of the aspect parametrization and reclassification process.
    UoC 23
    Figure 40. Results of the extent of permafrost parametrization and reclassification process.
    UoC 24
    Figure 41. Results of the ice content parametrization and reclassification process.
    UoC 25
    Figure 42. Results of the land cover parametrization and reclassification process.
    UoC 26
    Figure 43. Results of the surficial geology parametrization and reclassification process.
  4. Rating and weighting, and 5. Final integration

    The final model weighting and integration was performed with a final calculation using the calculator tool from the UCalgary DGGS client:

            =[Parametrized Slope]*0.3+[Parametrized Aspect]*0.05+[Parameterized Extent of Permafrost]*0.05+[Parameterized Content of Permafrost]*.2+[Parameterized Land Cover]*0.1+[Parameterized Surficial Geology]*0.3

    The final results of the integration are shown in Figure 44:

UoC 29
Figure 44. Final results of the model integration in Alaska.

Using the capabilities of the DGGS client and the basic DGGS operation of subselection, one can go a step further and select only the areas of high erosion risk on the coast of Alaska (Figure 45). These results would give the scientist a good starting point for providing a general overview of the potential risk of coastal erosion to the regional government of Alaska.

UoC 28
Figure 45. Final results of high erosion potential risks on Alaska’s coastline.
10.5.7.2. Analysis of potential Flooded Populations Due to Sea Level Rise on a DGGS Client

Figure 46 shows the results of a simulated scenario where the sea level has risen 1 m, achieved by a cell selection filtered by a range of heights/depths. It shows the areas that would be affected by the sea level rise; by exploring and integrating additional datasets of interest (such as world population density, buildings, or orthophotographs; see Figure 47), the characteristics of the affected population can be quickly analyzed.

UoC 30
Figure 46. Results of a sea level rise of 1 m on the coast of Alaska.
UoC 31
Figure 47. Integration of a high resolution orthophoto in an area identified as flooded after a sea level rise of 1 m.
10.5.7.3. Analysis of potential hazards along a navigational route on a DGGS client

The datasets described in the sub-scenario section are mashed up and parameterized to prioritize the risk along a navigational route. The following calculations were performed on the UCalgary DGGS client using the calculator tool, taking into account the characteristics and attributes of each dataset.

GEBCO – Shallow Water → shallow waters should be avoided. This reclassification could be changed depending on the navigator’s ship draft.

transform([OGC_DGGS_Server/GEBCO_2022_Bathymetry_Topo_Elevations],{"-1000":0,"-20":1,"-15":2,"-10":3,"-5":4,"0":0},false)*1.0

NOAA_Lanes_and_Cautions → this dataset contains information about shipping lanes and it is classified by the types of areas (some to be avoided and others recommended for navigation).

transform([THEMELAYER@OGC_DGGS_Server/NOAA_shipping_lanes],{"Area to be Avoided":4,"Particularly Sensitive Sea Area":3,"Precautionary Areas":1,"Recommended Routes":0,"Shipping Fairways Lanes and Zones":0,"Speed Restrictions/Right Whales":1,"Traffic Separation Schemes":2,"Traffic Separation Schemes/Traffic Lanes":0},false)*1.0

Coast_Guard_Warnings → this dataset includes information about areas to be avoided for navigation. It also includes data about shipwrecks and major shipwrecks that could be of danger for navigation.

transform([area_desc@OGC_DGGS_Server/Vessel_High_Density_Areas],{"Area to Be Avoided":2},false)*1
transform([area_desc@OGC_DGGS_Server/Vessel_Points],{"ADEC Response Capacity":3,"Major shipwreck":1,"Shipwreck":4},false)*1.0

Coast_Guard_Traffic_Observations → density of the vessel traffic is classified by the types of ships that navigate the areas.

transform([type@OGC_DGGS_Server/Vessel_Traffic_Density],{"Cargo":3,"Fishing":1,"Tanker":4,"Tow / Tug":2},false)*1.0

Marine_Protected_Areas → Marine protected areas should be avoided for navigation.

transform([MARINE@OGC_DGGS_Server/World_Database_of_Protected_Areas_Dec2022],{"0":0,"1":2,"2":4},false)*1.0

The results of the mashup and reclassification are shown in Figures 48 through 50. Exploring the mashup map and symbolization, a navigator is able to assess the major risks and concerns along their route before deciding where to navigate.

UoC 32
Figure 48. Mash up map result of the potential hazards along a navigational route
UoC 33
Figure 49. Mash up map result of the potential hazards along a navigational route. Filtered by areas to be avoided.
UoC 34
Figure 50. Mash up map result of the potential hazards along a navigational route. Overview showing traffic points datasets related to shipwrecks.
10.5.7.4. Demonstration Video

To view the demonstration video showing the University of Calgary’s server using the OGC API - DGGS across three diverse sub-scenarios (coastal erosion, rising sea levels, and navigational hazards in the Arctic), please follow the link to the video on OGC’s YouTube Channel.

10.5.8. Results: Integration, Interactions & Interoperability

The D103 DGGS Fusion Server interacted successfully with the DGGS client provided by Ecere Corporation using the current OGC API - DGGS candidate draft standard. It used ISEA3H as the underlying DGGS, the Rhombus indexing as the indexing method and tiling system, and the PYX0 format as the data encoding. Figure 51 shows the results of the integration of the coastal erosion susceptibility map displayed in Ecere’s GNOSIS Cartographer 3D client. For more information on the integration results and interactions, please see the Client 3 (D102) - Ecere client section.

Since the OGC draft standard API was implemented, any other client that understands the DGGS and tiling system, and is able to decode the PYX0 format, would be able to read data tiles from the UCalgary DGGS server.

UoC 35
Figure 51. Results of the coastal erosion susceptibility workflow and GEBCO bathymetry data retrieved from the University of Calgary DGGS and displayed in Ecere’s GNOSIS Cartographer 3D client.

10.5.9. Challenges and Lessons Learned

10.5.9.1. Rhombuses over PYXIS tiles

From the pilot kickoff, the decision was made that the Ecere client would interact with the ISEA3H DGGS server based on its hexagonal and pentagonal zones. Therefore, the first R&D activities from the Ecere client focused on attempting to build a hierarchical DGGS based on the ISEA3H hierarchical topology. After some initial work, and given the complexity of the implementation and the limited time for the project, the final integration and TIE experiments were based on rhombus tiles instead of the hexagon-based PYXIS indexing. This decision was made to facilitate the transport mechanism, as rhombus tiles address the need for a rectilinear data structure suited to texture mapping and other efficient GPU operations, while maintaining complete coverage over all ISEA3H DGGS cells.

The initial R&D work by the Ecere client led to valuable and interesting insights that can be consulted in its own section.

10.5.9.2. GeoTIFF data encoding

The initial months of the project were focused on a server implementation to encode DGGS data in the GeoTIFF format. This decision was made since GeoTIFF is a popular format for encoding coverage (tiled) data, and it seemed like a good choice for encoding rhombus tile float data values from the DGGS server. However, it was problematic to encode the Inverse Snyder Projection into the GeoTIFF header since the projection is not yet supported by the popular proj4 coordinate transformation library, and it also does not have a standardized identifier in EPSG or WKT form. Other concerns relate to the fact that static coverage formats fail to take advantage of the hierarchical progressive streaming characteristic of DGGS; when attempting to stream data at several resolutions, other standardized formats might be a better fit (netCDF/HDF5).

The breakthrough came from encoding a rhombus-shaped tile of ISEA3H hexagonal cells using the server’s own UV-coordinate-based PYX0 format, which successfully passed data through to the client since a detailed description of the encoding was given a priori. During the process, it was determined that the rhombus tiling schema can be constructed to conform with the OGC Tiles API standard and can be described in a mechanism suitable for GDAL and popular GIS tools, so that other non-DGGS clients can access these data tiles.

10.5.9.3. OGC API - DGGS server limitations

While a process-based analysis is desirable, the current structure of the DGGS Server has focused on the discovery-based aspects of DGGS capabilities. That is, at the core of the server is the ability to perform the following.

  • Choose the data values (properties) from available geographical data sources to be quantized and sampled into the cells of the DGGS.

  • Determine a Zone ID based on a geographic coordinate.

  • Select a cell from a Zone ID.

  • Select a group of implied cells from a Parent Zone ID and depth (this method assumes a congruent tiling is available).

  • Select a group of cells within a single geometry, buffer of a geometry, or topological operations for 2 or more geometries.

  • Select a group of cells based on a data value filter.

  • List features and/or summarize the spatial statistic (histogram) of the data values held in a selection.

The implemented OGC API - DGGS does expose some of this functionality, but it assumes the client has detailed knowledge of the DGGS geometry, including tessellations, refinement, indexing/encoding, etc. The implemented DGGS API only allows access to DGGS data through the PYX0 format and static PNGs. Also, one of the more interesting proposed OGC API - DGGS endpoints, related to DGGS Zone queries answering the question “where is it?”, was not implemented due to the limited time for the project and the added complexity of the endpoint, which needs to encode filtering capabilities in a query language (like CQL2). This should be considered in future work on the DGGS API implementation.

Another area to explore would be related to the data transmission at several resolutions at once. At the moment, the DGGS server only serves data from a particular Rhombus tile at a given resolution. The current OGC API - DGGS does not include any mention or recommendations on this topic. A depth parameter could be added to the request, in relation to the number of resolutions that the client wants the data for. This would result in a hierarchical tile of data.

One of the challenges while implementing the OGC API - DGGS was the lack of clear documentation on the OGC website. While it is understood that the details of the API are still being discussed, confusing information is given on the webpage (compare https://developer.ogc.org/api/dggs/index.html with https://ogcapi.ogc.org/dggs/overview.html and https://github.com/opengeospatial/ogcapi-discrete-global-grid-systems). If there are still two approaches being discussed for the OGC API - DGGS, this should be clearly stated in the documentation. If a decision has not yet been made, the material should not be published on the website, or it should include a warning.

10.5.9.4. DGGS processes represented as chain of OGC API - Processes workflows

As described in the DGGS server architecture section, as a best practice the DGGS server would provide a method of serializing pipeline operations into a process chain, which is not yet exposed through the server APIs. The Ecere Corporation group pointed out that the candidate Standard OGC API – Processes - Part 3: Workflows and Chaining enables such instant integration and visualization of any geospatial data and/or processing capabilities. It achieves this by extending OGC API – Processes – Part 1: Core with the ability to perform the following.

  • Reference local and remote nested processes as inputs in execution requests.

  • Reference local and remote OGC API collections as inputs in execution requests.

  • Modify data accessed as inputs and returned as outputs through filtering, selecting, deriving and sorting by fields (measured/observed properties).

  • Request output data (e.g., via OGC API - DGGS “Data Retrieval” requests for a particular DGGS zone) from a resulting virtual OGC API collection in order to trigger processing execution for a particular area and resolution of interest.

Future efforts should be made to encode the pipeline concept of the DGGS server into OGC API - Processes workflows and to explore the capabilities of this implementation in relation to DGGS processes and operations.

10.5.10. Final remarks

One significant outcome of the pilot project was the establishment of the DGGS server as a repository for connecting and using heterogeneous data relevant to Arctic and marine spatial analysis, which will be useful for testbeds, pilots, and hackathon-type sandbox activities in the future.

Since it was discovered during the development of the project that a rhombus tile system conforms to the OGC Tiles API standard, future work should focus on implementing the Inverse Snyder Projection in the proj4 library and on proposing a standardized identifier for this projection. This would allow other non-DGGS clients to access the data tiles and utilize DGGS to its full potential. In addition, more types of integrated data applications that make use of machine learning and data mining algorithms, Jupyter notebooks, and common GIS applications could begin benefiting from DGGS capabilities.

We recommend that future work focus on prototyping a DGGS Tile Generator based on these recommendations to interoperate and transmit DGGS data via the OGC Tile Matrix standard, and on a client that can read these tiles and operate on them through emerging OGC API standards. Of particular interest are OGC API – Processes - Part 3: Workflows and Chaining for encoding processes and pipelines, and the DGGS API Zone query endpoint combined with the Common Query Language to refine by search and filter. Developing a more scientifically oriented case study, such as an improved coastal erosion scenario, would demonstrate the value of all of these APIs and provide valuable feedback for the DGGS community and those writing the DGGS API Standard.

Additional challenges and lessons learned during the integration experiment and interactions are also presented in the Client 3 (D102) - Ecere client section.

10.6. Server 4 (D107) and Client 4 (D106): Integrated - Tanzle

10.6.1. Description

Tanzle’s platform is a modular data integration platform that may be configured to source and supply any describable data, including but not limited to the data standards used within the FMSDI Pilot Phases. Tanzle is creating a new foundation for dynamic data that is refreshed and reconciled in time and place. The most unique aspect of the platform is how it can index, dynamically process, and present data (both static and raw global sensor data) together in a collaborative environment. Through its data plane, it can organize, dynamically summarize, and present the data, allowing the user to explore raw sensor data or pass it through data processing pipelines with intermediate through full curation.

Tanzle participated in this phase of the FMSDI Pilot as an in-kind contribution, tasking its platform to demonstrate a cross-cutting server/client solution leveraging cloud infrastructure. Further development/refinement inside Tanzle can be expected to be required to effectively contribute to the Pilot Technology Integration Experiments (TIEs) and to offer an alpha-level client component demonstrating server-client interaction without impeding the other Pilot participants.

10.6.2. Sub-scenario and Data

A call center receives a call about an emergency incident. The location of the incident is electronically forwarded to emergency response professionals. A continuously updated “location report” is created and shared, allowing responders to view the incident location on a map, and also see their own position on the map. The initial report could include any relevant information from the past or present, or from a model. Once created, the report will be continuously updated to inform all users of dynamic conditions that might affect them.

Dynamic conditions not characterized by foundation data require rapid data integration of historic, new, and time-dynamic data to ensure safe response and recovery. Past and present activity is streamed across authorized clients, dynamically contributing to the report for in situ and post-emergency investigation. Several stakeholders may interact with the report. The stakeholders in this perspective include real-time actors such as vessel crews and first responders, law enforcement, analysts, and engineers, whereas post-emergency activities expand the group to include reinsurance agents, media, and corporate and government agents. Many of the stakeholders are high-level actors requiring simple tools and interfaces to the following basic data sources.

  • S-102: Bathymetric Surface

  • S-127: Marine Traffic Management

  • WMS-T: Time-dynamic web mapping service

  • GRIB2: Binary gridded weather data

  • CSV: Unstructured vector data

  • [Geo]JSON: Vector data

  • CAD: Entity models

10.6.3. Technical Architecture

The Tanzle platform organizes data by time and space in a when-where-what order that enables both a real-time data plane and efficient visualization of data over time. Managing geospatial time-series data at scale requires a flexible, configurable combination of cloud infrastructure and reactive data management. Our creation is Tanzle Advanced Backend Services (TABS). TABS is uniquely derived from time-critical financial technology, adapted to support any definable data, with native support for geospatial data. The figure below shows the core features of TABS. Particular attention should be paid to the multiple options (arrows) at major pipeline junctions. Elements of both real-time and curation workflows may be blended to suit a variety of scenarios, in a way that takes care not to overwhelm a frontend client.

Tanzle 1
Figure 52. Tanzle Advanced Backend Services (TABS) - a reactive platform designed for real-time and spatiotemporal analytics.

TABS is configured here to index [geo]spatiotemporal data and algorithms, manage streaming data using reactive data processing, and reduce latency and load on downstream clients and front ends. The relationship between TABS and any frontend is definable; in this case, CesiumJS was utilized as the starting frontend for its advanced support for web-based and time-dynamic data and models. Due to the time constraints of developing this new method, there was limited time to evaluate other clients such as QGIS. The implementation of QGIS as a frontend client was not successful; this is addressed below. TABS and CesiumJS provide a flexible and efficient means of integrating native data (i.e., sensor streams) into OGC standards using a core TABS feature, “Algorithmic Map Tiles.” Algorithmic map tiles result when time-series data are input to a predefined algorithm, the result of which is stored in a time-dynamic web mapping service (WMS-T). What is novel about this approach is that:

  1. it may be done on-the-fly and driven by the user experience; and

  2. the metadata associated with the pre-curated data is preserved in TABS and may also be made available.
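
To make the concept concrete, the following minimal sketch illustrates the idea of an algorithmic map tile: pixel values computed on demand from time-filtered raw observations. The function names, the accumulation algorithm, and the use of standard Web Mercator tile math are assumptions for illustration only, not Tanzle's TABS implementation.

    import math
    import numpy as np

    TILE = 256

    def lonlat_to_pixel(lon, lat, z, x, y):
        """Map a lon/lat to pixel coordinates inside tile (z, x, y), Web Mercator."""
        n = 2 ** z
        fx = (lon + 180.0) / 360.0 * n
        fy = (1.0 - math.asinh(math.tan(math.radians(lat))) / math.pi) / 2.0 * n
        return (fx - x) * TILE, (fy - y) * TILE

    def algorithmic_tile(obs, t0, t1, z, x, y):
        """Render one time-slice tile from raw (t, lon, lat, value) records."""
        grid = np.zeros((TILE, TILE))
        for t, lon, lat, value in obs:
            if not (t0 <= t < t1):
                continue                          # time filter: one WMS-T time step
            px, py = lonlat_to_pixel(lon, lat, z, x, y)
            if 0 <= px < TILE and 0 <= py < TILE:
                grid[int(py), int(px)] += value   # accumulate; any algorithm fits here
        return grid                               # encode to PNG/JPEG before serving

Each (z, x, y, time) request would run such a function and encode the resulting grid as an image tile, so that newly arriving observations automatically change subsequent renderings.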

10.6.4. Demonstration

Emergency responders building execution plans needed to access and collaborate on multiple sets of data. This includes the aggregation of foundation and sensor data such as themed maps, shipping and weather patterns, and real-time sensor streams. TABS is the core technology that reacts to both changes in data and changes coming from users, providing a real-time, reactive backend to dynamic data. While TABS is the core technology, it is the client that ultimately connects the user to the data. The demonstration used an edge-friendly web-based client. More specifically, we chose to use VueJS and CesiumJS as the frontend client, and tested the QGIS client for interoperability.

There was insufficient real-time data in the primary area of interest to thoroughly demonstrate capabilities; the demonstration was therefore moved to another area in order to successfully present the most fundamental components:

  1. capture or playback of real-time data feeds from their native format;

  2. apply transformation, rendering and user-supplied algorithms to the data feeds; and

  3. serve results to clients as low-level graphics and WMS-T, providing tools for visual discovery in a collaborative scene.

Interoperable Technologies

  • Tanzle Advanced Backend Services (TABS): data streaming manager

  • CesiumJS: web-based geospatial client

  • QGIS: OSGeo desktop geospatial client

10.6.4.1. Demonstration Video

To view the demonstration video of Tanzle demonstrating a cross-cutting server/client solution leveraging cloud infrastructure and real-time data for the Federated Marine SDI, please click the following link to the video on OGC’s YouTube Channel.

10.6.5. Server-Side Results: Integration, Interactions & Interoperability

A significant aspect of this sub-scenario is the implementation of algorithmic map tiles, a reactive and dynamic data reduction and visualization approach. Algorithmic map tiles are similar to conventional map tiles; however, the pixel values are determined in real time, allowing new data to automatically update the tile set. This allows consistent management of metadata across raw and simplified data, and allows tile specifications to be manipulated as the need arises, surfacing opportunities for improvement. The figure below illustrates two representations of a fleet of buoys obtained through Xerox PARC and DARPA's Ocean of Things program: 1) as raw entities, and 2) as a dynamically rendered heatmap. The drift buoy data are obtained in CSV or JSON format, then ingressed, indexed, and optimized such that they may be dynamically processed and rendered at all supported levels of detail.

Tanzle 2
Figure 53. Drift buoy entities are introduced in real time or played back. The left panel shows individual buoys as markers whereas the right panel shows the group dynamically rendered as a time-dynamic heatmap. This enables the fleet of buoys to behave in near real time as a localized marine weather layer.

Dynamically rendered heatmaps are pursued here to demonstrate how real-time, streaming data may utilize algorithms and steps usually found in curation, except that these steps are localized in time and space. These are not client-side graphics, because the volume of data even within small spatiotemporal boundaries can be overwhelming. The figure below shows two examples: the long-term, all-inclusive heatmaps for vessel traffic and buoy environmental data, and the short-term heatmaps filtered to represent only one entity from either data set.

Tanzle 3
Figure 54. Attribute-based heatmaps of AIS and drift buoy fields, for the entire datasets in the left panel and for single entities in the right panel.

Each time-step of the dynamically rendered heatmaps is served to clients as either a low-level PNG tile or through the Web Mapping Service for Time-Dynamic data (WMS-T). While building support for self-service, it was determined that a fully supported external time-dynamic service would be a significant task, so only a subset could be supported during this phase of the Pilot. This subset includes the {time}/{tileset}/{row}/{col}.JPEG request pattern of WMS-T, as well as support for dynamic rendering of GRIB2 files.
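
For illustration, a client request for a single time step following this pattern could look like the sketch below; the host name, tileset identifier, and time value are placeholders rather than actual TABS endpoints.

    import requests

    # {time}/{tileset}/{row}/{col}.JPEG -- placeholder host and tileset name.
    base = "https://tabs.example.com/wmst"
    url = f"{base}/2022-07-23T08:00:00Z/buoy-heatmap/3/5.JPEG"

    resp = requests.get(url, timeout=30)
    resp.raise_for_status()
    with open("buoy-heatmap_3_5.jpg", "wb") as out:
        out.write(resp.content)  # one time step of the dynamically rendered heatmap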

Data integration beyond algorithmic map tiles was significantly easier in TABS. Unstructured data coming from telemetered and broadcast messages (AIS and drift buoys) may be ingressed and fed through the appropriate data processing pipelines to produce either entity mapping or heatmap rendering. Additionally, our heatmap renderings allow for the application of additional algorithms such as: 1) autocorrelation, and 2) cross-correlation of the fields within or between data sets. More advanced heatmap results were provided to clients as the following.

  • AIS vessel counts

  • AIS vessel speed

  • Drift Buoy water temperature

  • Drift Buoy air temperature

  • Drift Buoy air-water temperature difference.

We were able to serve interactive CAD models of the drift buoys, vessels, and the International Space Station (ISS). Live video feeds were successfully streamed into the scene and could even be associated with CAD models. Video and snapshot media were streamed from NASA’s High Definition Earth Viewing (HDEV) experiment hosted on Ustream for the ISS, and from Trafficland for the Department of Transportation cameras. Unfortunately, access to the most suitable data was not available, so surrogates were used. For example, a single buoy model is used to represent all buoys, and likewise a single model represents all AIS vessels.

Other data layers were used from various sources. Weather data in WMS and GRIB2 formats was pulled from NOAA to test sideloading versus streaming, and both approaches were found to be successful. Much of the weather data and data coming from other Pilot participants was not leveraged because the area of interest was changed to pursue a broader variety of data sources. This does not mean that such data is incompatible with the Pilot-hosted layers; they are supported per OGC specifications. Topography and bathymetry from USGS and BOEM were also pulled and fused together to produce a high-fidelity topobathymetric surface (a composite DEM) for the region. DEM creation followed a curation workflow in which conventional, static map tiles were produced using GDAL and sideloaded as a local tileset. However, the source elevation data (were it available as a stream) may also be assigned to streaming data processing pipelines in the future, and continuously updated elevation data could automatically update the tileset.
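
The curation workflow described above can be sketched with standard GDAL command-line tools; the file names below are placeholders, and the exact options used to build the composite DEM during the Pilot were not recorded here.

    import subprocess

    # Mosaic USGS topography and BOEM bathymetry into one virtual dataset.
    subprocess.run(["gdalbuildvrt", "topobathy.vrt",
                    "usgs_topo.tif", "boem_bathy.tif"], check=True)

    # Materialize the composite DEM in a common web-mapping CRS.
    subprocess.run(["gdalwarp", "-t_srs", "EPSG:3857",
                    "topobathy.vrt", "topobathy.tif"], check=True)

    # Shade the DEM (gdal2tiles expects 8-bit imagery), then cut static tiles
    # suitable for sideloading as a local tileset.
    subprocess.run(["gdaldem", "hillshade", "topobathy.tif", "hillshade.tif"],
                   check=True)
    subprocess.run(["gdal2tiles.py", "-z", "0-12", "hillshade.tif", "tiles/"],
                   check=True)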

10.6.6. Client-Side Results: Integration, Interactions & Interoperability

A few caveats must be noted: there are cases where CesiumJS, OGC or other APIs, and data providers may not be well supported; these limitations are not limitations of TABS, and they present opportunities for improvement in the services and data being integrated. While Tanzle hopes to improve TABS through extended engineering to overcome interoperability limitations like these, noting where such effort is required is likely all there is time for during this Pilot. Such notes provide beneficial topics to address in later phases of FMSDI as well as in working groups focusing on standards and emerging strategies. A deeper dive into these and related topics is anticipated.

The following figures illustrate how the Tanzle platform successfully integrated all of the data sources and feeds that were available. However, demonstrating real-time data management required data that were unavailable in the primary area of interest at the time, forcing us to use a surrogate area: the Gulf of Mexico (GoM). The GoM provides abundant real-time and historic data in a variety of formats and projections. The figures below illustrate our successes in adapting parts of the CesiumJS source code to accommodate a variety of features. One such feature is our introduction of location histograms into the native time scrubber, as seen in the figure below. Like the algorithmic map tiles, the location histogram is dynamically updated based on the viewport boundaries as well as time- and entity-filtered data results.

Tanzle 4
Figure 55. Location histograms have been created and added to the native CesiumJS time scrubber. This informs a user when and roughly how much data exists whereas the map alone expresses only where data exists.

The location histogram works with the map layers and the layer manager tree to visually inform a user about data trends in space and in time, and the reactive nature of TABS allows a user to switch between entity-level data and the reduced, curated forms without the need to recreate entire layers when new data arrive. The figure below shows a scenario where a user filters data for a subregion, allowing entity-level analysis of AIS vessel traffic, then uses the filtered heatmap to identify a dark vessel before turning to satellite data to further explore the scenario. Unfortunately, the best data were unavailable, and surrogate data from the International Space Station, in the form of live video, was used instead.

Tanzle 5
Figure 56. AIS vessel patterns on the sea and in orbit. The top left shows search-and-recovery in blue, port authorities in yellow and other vessels in random colors. The top right shows a ferry, and the location histogram that has been supplied to the time scrubber reveals the ferry schedule. The lower left shows the heatmap for a vessel (CAD in lower left) that goes dark as it moves offshore. The lower right shows a video feed for the International Space Station as a placeholder for any orbital data feeds that could be provided (none were).

Streaming media need not come from satellites alone. It was determined that there is an airport and a lighthouse near the Alaska emergency site in the sub-scenario, and it was decided to find surrogates for data that might stream from these sources as well. The possibility of Department of Transportation traffic cameras imaging and streaming onsite information about vessels nearing bridges was identified. The figure below shows just how abundant these cameras are and how they could be useful. The cameras have known positions, but the imagery they provide is not georeferenced. Therefore, the platform supplies the image/video feeds to a thumbnail in the lower left of the interface that opens when the associated marker is clicked on the map. This is a good example of integrating geospatial data with non-geospatial data.

Tanzle 6
Figure 57. Trafficland video feeds representing real-time Department of Transportation data.

10.6.7. Challenges and Lessons Learned

A significant challenge was the lack of data in the AOI. For example, historic AIS shipping data declined from late 2020 to nearly no data in the AOI during 2022. Collectively, Pilot participants struggled to find data in the AOI that supported their sub-scenarios, warranting either simulation of data or demonstration of a capability in another area. Tanzle’s preference was to focus on capability instead of simulation, given that the collective effort aimed to identify opportunities for technology improvement and not to actually characterize the AOI. The following two figures suggest that Alaska AIS data come primarily from privately maintained ground-based receivers.

Tanzle 7
Figure 58. AIS data findings in Alaska from well-known sources. The Marine Exchange of Alaska data holdings are likely the only source of AIS traffic data in Alaska in the last two years.

There are likely two related causes for the loss of AIS data in Alaska. First, satellite data coverage in northern latitudes does not match that of lower latitudes. The figure below illustrates various low Earth orbit constellations. It is clear that the coverage cutoff coincides with the Aleutian Island chain, north of which there is little coverage. This places greater demands on the ground-based AIS receivers, and those appear to be privately operated and maintained only by the Marine Exchange of Alaska (MXAK). Why MXAK AIS data does not trickle into MarineCadastre.gov in compliance with the Freedom of Information Act guidelines referred to by the US Coast Guard is a question that cannot be answered here.

Tanzle 8
Figure 59. Disparate constellations in Low Earth Orbit where satellite AIS tracking would occur. Most constellations focus on lower latitudes, which may explain why AIS data in Alaska are privately held by the Marine Exchange of Alaska, which maintains the ground-based receiver network.

Cesium and QGIS are both strong candidates for Tanzle’s sub-scenario. However, both presented challenges and limitations. In Cesium’s case, utilizing and augmenting the CesiumJS source code was required to leverage low-level PNG graphics, as illustrated below.

Tanzle 9
Figure 60. Heatmaps for two distinct data sources: 1) AIS vessel traffic (brownish) and 2) Drift buoy environmental data (blueish). Both datasets provide point clouds too large for most frontend clients but are reduced in real time to prevent the need for heavy client-side graphics.

A tiling algorithm was created, based on Cesium’s, in order to be both time-dynamic and reactive. Augmentation of other components of the CesiumJS source code, such as the time scrubber, was also required. It was determined that the default time scrubber was a good start but lacked many of the features that needed to be expressed, such as location histograms, time-optimized data filtering, and other backend or statistical features. Experimentation with various workstations and browsers was also performed before setting data volume caps derived from empirical data. The user was informed of “too much data” by coloring the entity in the layer manager red and reporting the number of vertices requested. This informs the user that switching to a reduced data view would be better than querying too much entity-level data, providing an intuitive means of guiding users toward entity-level data at higher levels of detail or simplified data at lower levels of detail. More elegant solutions are underway, but this was a good starting point.

QGIS presented other challenges. First, while QGIS does offer tools for low-level “hacking,” there was not time to explore solutions as was done with CesiumJS. As a result, it was decided to leverage QGIS to test the WMS-T egress from TABS. This surfaced an immediate limitation in QGIS: it did not offer web-based authentication using a Bearer Token, which is how authentication to TABS was configured. The long-term release of QGIS at the time of this writing was 3.22. It was found that a bleeding-edge release (3.28) offered an option to supply an HTTP header in the WMS layer manager. This approach was viable because a WMS-T map tile could be successfully retrieved with curl by passing an HTTP header. The figure below illustrates the same layers used in CesiumJS above, and shows the Advanced option in QGIS where the HTTP header for authentication was supplied. This successfully authenticated QGIS to TABS. However, it required scraping the Bearer Token out of the web-developer tools of the web browser, because Bearer Tokens expire daily in this configuration. Not elegant, but it proves the concept and surfaces the limitation. Unfortunately, authentication was not the only issue with QGIS. It was next discovered that QGIS was not properly parsing the capabilities that TABS was providing in its WMS-T implementation. The X-Y locations were not valid, and the result was “Failed to parse capabilities.” This is likely because there was not time to fully implement the WMS-T specification, or perhaps because the version implemented was no longer aligned with the version of QGIS used. This is certainly something to explore further.

Tanzle 10
Figure 61. QGIS access to WMS-T required updating from a long-term release to a bleeding-edge release to support authentication through a Bearer Token. Then, upgrading surfaced issues in the version of WMS-T being supported. QGIS fails to parse the WMS-T capabilities from Tanzle’s server.
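
For reference, the curl test described above amounts to attaching an Authorization header to an ordinary tile request. A minimal Python equivalent is sketched below; the endpoint and token are placeholders.

    import requests

    # Token scraped from the browser's web-developer tools; it expires daily
    # in the TABS configuration described above.
    headers = {"Authorization": "Bearer <token>"}
    url = "https://tabs.example.com/wmst/2022-07-23T08:00:00Z/ais-heatmap/3/5.JPEG"

    resp = requests.get(url, headers=headers, timeout=30)
    resp.raise_for_status()  # a 401/403 here would indicate an expired token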

The final lesson learned here is that any form of situational awareness will demand data coming from diverse sources provided through a variety of economic models, and these characteristics challenge the application of a data standards approach alone. We found that it is easier to integrate data through platform disaggregation, meaning that every aspect of the data workflow strives to maintain orthogonal relationships to every other aspect. The use cases driving the resulting products and applications determine where flexibility must yield to absolute standards. This even extends to the decisions governing whether a server or client is assigned a firmly defined set of roles.

10.7. Data Clients and Visualization

The data clients were the components designed to ingest and visualize a wide variety of data from multiple server instances provided by other Pilot participants or other sources. This was accomplished primarily through the use of a variety of OGC APIs.

Data clients and visualization were demonstrated by ESRI Canada (D100), Helyx (D101), and Ecere (D102).

As described in detail in the following sections, each participant performed a thorough execution of all phases of data discovery, access, integration, analysis, and visualization appropriate to their component. Participants collaborated and supported each other with data, processing capacities, and visualization tools.

10.8. Client 1 (D100) - Esri Canada

10.8.1. Description

Esri Canada appreciates the opportunity to contribute to the Open Geospatial Consortium (OGC) “Federated Marine Spatial Data Infrastructure” (FMSDI) Pilot Phase 3 and to this Engineering Report. Esri Canada is pleased to see that the international hydrographic community is continuing to investigate the use of Spatial Data Infrastructure (SDI) best practices for increasing the interoperability of systems and streamlining the exchange of geospatial data assets for important marine applications such as coastal erosion, climate change, coastal flooding, maritime safety, and marine traffic. Once fully implemented, the FMSDI will help improve the efficiency and quality of marine decision-making and help governments, agencies, NGOs, private companies, and citizens unlock their valuable investments in spatial and observation data at national, regional, community, and local levels.

National and international SDIs are collections of publicly (and sometimes privately) available geospatial data, applications, policies, and standards that help users find, use, and share geospatial information. An SDI is a distributed ‘system of systems’ based on a high level of interoperability, usually implemented, operated, and maintained by a particular community to support a specific application. The MSDI is an IHO initiative to develop the marine component of an SDI. The MSDI is a framework of suggested best practices and guidance for the management of marine geospatial data, underpinned by key principles such as standards, supporting interoperability, integration, institutional collaboration, and coordination. The FMSDI adds the important dimension of a “federated” model as part of that overarching MSDI framework.

Many governments and organizations have recognized that spatial data and related information collected to produce navigational charts, which help support the safety of navigation, are also useful for many other applications such as the study and management of the ocean, marine, and coastal environments. Additional, and sometimes duplicate, thematic maritime, coastal, marine, and ocean data is being independently collected and used. This means that the FMSDI must support a wider and less traditional base of marine data and data users, far beyond just the typical marine navigation community.

Esri Canada has been a leader in supporting SDI efforts in Canada and internationally, including the development of the Canadian Geospatial Data Infrastructure (CGDI) and Canada’s contribution to the international Marine Spatial Data Infrastructure (MSDI) initiative. Esri Canada is an OGC Technical Member and previously successfully participated in the OGC Arctic Spatial Data Pilot conducted in 2017 and the OGC Maritime Limits and Boundaries Pilot conducted in 2019.

10.8.2. Sub-scenario and Data

In the FMSDI Phase 3 Pilot Project, the Esri Canada team developed a client application (Client 1 (D100)) that supports accessing and organizing FMSDI data to efficiently visualize, manage, analyze, and store the data and Web Service data for a hypothetical ship grounding scenario.

In this scenario, the expedition cruise ship, Discovery, is conducting a 22-day trip from Nome, Alaska, to Kangerlussuaq, Greenland. Discovery is carrying 142 passengers and 104 crew and departs Nome on July 22 at around 1700 hours. She passes through the Bering Strait overnight to her first stop, Kotzebue Bay (pronounced kaat·suh·byoo). She is scheduled to arrive in the area around 0800 hours and will remain there for most of the day, providing an opportunity for passengers to view flora and fauna via shore excursions.

EsriC 1
Figure 62. Area of Interest (Cape Espenberg, Alaska) showing Discovery ship route and grounding location due to power failure [ESRI Canada, 2022]

However, early in the morning, the ship has a major power failure that, due to a cascading generator shutdown, causes the vessel to lose propulsion. One of the generators can be restarted within an hour to supply partial electricity to the ship, but not before the relatively high winds at the time push the ship dangerously close to the shore (Cape Espenberg). Under partial power, the ship tries to navigate the shallow area indicated on the nautical chart, but runs aground. With only partial power to run on-board systems and a low probability of becoming seaworthy, Discovery declares an emergency.

For this client, Esri Canada used ArcGIS Pro (latest version) software to consume and test the performance and compatibility of the datasets provided by the servers (Servers 1 and 2) in developing an Arctic Voyage Planning Guide to help the ship captain in search and rescue scenarios.

EsriC 2
Figure 63. Arctic Voyage Planning Guide (ESRI Canada)

The Esri Canada client application provided current information to the ship’s captain and crew based on the data layers included in the Arctic Voyage Planning Guide (AVPG) and other authoritative sources. The client supports the hypothetical scenario of the cruise ship “Discovery,” which is conducting a 22-day cruise when a power failure occurs and the ship drifts aground near the Cape Espenberg area of Kotzebue Bay, Alaska.

Actor(s), end-user(s) and/or stakeholder(s)

  • Ship captain, crew, and passengers on board

Interoperable Technologies

  • OGC API - Features

  • OGC web services (WMS, WMTS, WFS)

  • S-100 data layers

  • Esri shapefiles (where data was missing in the AVPG guide)

Data and Platforms

  • Platform: ArcGIS Pro

  • Data: OGC standard layers provided by Servers 1 and 2; other applicable data layers and web services from publicly accessible sites; and Esri shapefiles where OGC standard layers were missing or unavailable

Arctic Voyage Planning Guide Themes

Esri Canada built the Arctic Voyage Planning Guide using the themes provided by the OGC, for use by the ship captain and the crew in search and rescue. In addition to the AVPG data layers, a layer was developed for data specifically related to the ship grounding incident, and an “other” data layer was included for relevant data not directly related to the incident and not included as an AVPG layer.

EsriC 03
Figure 64. Datasets within each Theme for Arctic Planning Voyage Guide (Esri Canada, 2022)

10.8.3. Technical Architecture

The technical architecture of the D-100 Client is based on the ArcGIS Pro architecture (https://pro.arcgis.com/en/pro-app/latest/get-started/get-started.htm).

SDIs are evolving and developing. From local and regional data cooperatives, to National Spatial Data Infrastructures, to application-specific SDIs like the FMSDI, the internet and cloud computing are transforming the way organizations manage data and collaborate in a global system of systems (https://www.esri.com/en-us/arcgis/integrated-geospatial-infrastructure/overview).

The process for developing Client 1 began with various internet and geospatial catalog searches for data or web services related to this application. When applicable data or services were found, they were included in the ArcGIS Pro demonstration client. To support the servers developed in this Pilot project, the web services were integrated into the client simply by adding the web address and indicating the type of web service being added.

The interoperable technologies used by the D100 client included the following.

  • OGC API - Features

  • OGC web services (WMS, WMTS, WFS)

  • S-100 data layers

  • Esri shapefiles (where data was missing in the AVPG guide)

The platform used by the D100 client included the following.

  • ArcGIS Pro, the desktop GIS application developed by Esri: a feature-packed software product with a 64-bit architecture, ArcGIS Online integration, and Python 3 support. ArcGIS Pro supports data visualization; advanced analysis; and authoritative data maintenance in 2D, 3D, and 4D. It supports data sharing across a suite of ArcGIS products such as ArcGIS Online and ArcGIS Enterprise, and enables users to work across the ArcGIS system and other web server systems through Web GIS.

  • Specifically for this project, the ArcGIS Pro D100 client accessed and displayed the various OGC Web Service layers published by the FMSDI Project participant servers.

  • In addition, the ArcGIS Pro D100 client accessed and displayed numerous public web services found online that fell within the Area of Interest and related to the various themes of the Arctic Voyage Planning Guide.

The data accessed by the D100 client included the following.

  • OGC standard layers provided by Servers 1 and 2; other applicable data layers and web services from publicly accessible sites; and Esri shapefiles where OGC standard layers were missing or unavailable.

  • A list of the many AVPG data layers accessed by the D100 client is provided elsewhere in this ER.

The architecture wiring diagram of the technical architecture for the D100 Client is provided below.

EsriC 4
Figure 65. Architecture Wiring Diagram (ESRI Canada, 2022)

10.8.4. Demonstration

The following storyboard illustrates how users interact with the client component developed in this deliverable and helps to clarify the product objectives and functionalities. The actions, actors, and technologies involved are described in the following table, according to each stage of the demonstration.

Table 5. D100 client demonstration storyboard
Seq # Activity Description Actor/End User & Technology

1

A call center receives a call about an emergency incident from Discovery ship.

Actor: Call Center Operator

Application: Incident Reporting System

Interoperable Technologies: Catalog OGC Feature API, S-111

Data: AIS ship positions

2

Ship captain reviews the Arctic Voyage Planning Guide to view Discovery’s current location and forwards the location to the emergency response professionals for ship crew rescue

Actor: Ship Captain

Application: Arctic Voyage Planning Guide

Interoperable Technologies: OGC CSW Catalog, OGC Feature API

Data: AIS ship positions and route using Arctic Voyage Planning Guide

3

“Location report” allows the emergency response professionals to view the incident location on the map

Actor: Coast Guard

Application: Incident Reporting System

Interoperable Technologies: OGC CSW Catalog, OGC Feature API, S-111

Data: AIS ship positions

10.8.4.1. Demonstration Video

To view the demonstration video showing how ESRI Canada’s client consumes and tests the performance and compatibility of datasets to develop an Arctic Voyage Planning Guide aiding the ship captain in the search and rescue scenario in the Arctic, please click the following link to the video on OGC’s YouTube Channel.

10.8.5. Results: Integration, Interactions, and Interoperability

It was easy to integrate the OGC compliant data from other servers into ArcGIS Pro. There were no issues in terms of data interoperability and interactions.

The FMSDI D-100 Client demonstration showed how easy it is to implement standards-based web services. The demonstration showed how web service geospatial data can be ingested from various sources via the internet and then how this data can be used in important client applications. Multiple web service standards were demonstrated including WMS, WFS, WMTS, GeoServices REST, and the new OGC API - Features. In addition, multiple IHO S-100 hydrographic and marine data model standards were also shown.

Web services allow implementers to easily employ geospatial functionality over the Internet. Common standards for web services and data models help make their use straightforward and easy to integrate. Web services underpin the findable, accessible, interoperable, and reusable (FAIR) data principles, and the D-100 Client, highlighting a hypothetical emergency management application built on standards-based interoperability, shows how and why the use of geospatial web service data in applications is now a necessity.

To provide situational awareness information in the hypothetical scenario to the ship’s captain, crew and passengers, geospatial web services were used to deliver geographic information simultaneously from numerous internet servers to the client map display program.

The D100 web service client was focused on selecting, ingesting and displaying user requested data. The geospatial data was layered and displayed in an organized fashion according to the Arctic Voyage Planning Guide themes with additional themes for information about this specific emergency incident plus a theme for other data layers not defined in the guide. The client functionality includes the following.

  • Support for open geospatial data standards.

  • Employs commonly available internet architectures and protocols.

  • Was built and tested using ArcGIS Pro software.

  • Shows the interoperability of geospatial web services.

  • Illustrates the data layers from the Arctic Voyage Planning Guide.

  • Demonstrates the use of a robust client implementation.

  • Confirms the ease of use for non-geospatial practitioners.

The D100 client developed for this project was:

  • easy to use and understand for novice map display operators;

  • able to ingest and display a wide variety of data from numerous government and other web map servers;

  • able to demonstrate the concurrent use of several different types of web service standards including WMS, WFS, WMTS, OGC API - Features, and ArcGIS REST services; and

  • able to show the ingest and display of a number of different IHO S-100 hydrographic and maritime data model specifications.

The data layers presented in the D100 client demonstration include the following web services.

  1. World Oceans basemap

  2. Arctic high-resolution DEM – hillshade elevation tinted

  3. World high resolution imagery

  4. World elevation 3D/Terrain 3D

In addition, the FMSDI demonstration showed the following layers from web services published for the FMSDI project by IIC Technologies – Server 1 (D103) at http://18.130.213.16/ogcfmsdi/collections.

  1. Vessel waypoints (symbolized as red map pins)

  2. Vessel route (symbolized as a white line between waypoints)

The specific point/location of the vessel grounding is also shown with a special yellow, red, and black ship icon symbol.
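
For reference, retrieving such layers from the Server 1 (D103) endpoint listed above requires only simple HTTP requests. The sketch below lists the available collections and fetches items from one of them; the collection identifier shown is illustrative rather than a confirmed name on that server.

    import requests

    base = "http://18.130.213.16/ogcfmsdi"

    # Discover the collections offered by the OGC API - Features endpoint.
    collections = requests.get(f"{base}/collections",
                               params={"f": "json"}, timeout=30).json()
    for c in collections.get("collections", []):
        print(c["id"], "-", c.get("title", ""))

    # Fetch features (e.g., the vessel waypoints) as GeoJSON; "vessel_waypoints"
    # is a hypothetical collection id used for illustration.
    items = requests.get(f"{base}/collections/vessel_waypoints/items",
                         params={"f": "json", "limit": 100}, timeout=30).json()
    print(len(items.get("features", [])), "features returned")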

10.8.6. Challenges and Lessons Learned

  1. A significant challenge is the lack of OGC compliant data in the Area of Interest (AOI) for the Arctic Voyage Planning Guide. Where possible, Esri Canada used Esri shapefiles provided by different sources to fill the themes with relevant data to complete the guide. Also, some datasets, e.g., Coast Pilot guides, are only available as PDFs (viewable online or downloadable), and as such they were added as hyperlinks within the shapefiles. However, in the case of a power outage or limited Internet access, this can hinder the ability of the ship captain to make informed decisions, as he/she will have no way to download the Coast Pilot guides. Hence, hard copies should be kept available on the ship.

  2. Because the emergency took place in Alaskan waters near the Arctic Circle, various map projections were tested to clearly visualize the situation for the captain and crew. Some of the best map projections for this Arctic region include the following.

    1. Global projection as shown below

    2. Alaska Polar Stereographic

    3. Lambert Conformal Conic

    Inferior projections for this part of the world include the following.

    4. Web Mercator

    5. Universal Transverse Mercator

      EsriC 5
      Figure 66. Projection testing in ESRI Software (ESRI Canada, 2022)
  3. Some Sentinel satellite images do not display from the State of Alaska web service, possibly because the URL for the web service is incorrect. However, when looking at the Sentinel-2 view, there appears to be significant cloud cover in the images around the AOI in Alaska.

    EsriC 6
    Figure 67. ArcGIS Online Map View Classic (ESRI Canada, 2022)
  4. AVPG challenges encountered during the pilot.

    1. A link to the US Coast Pilot book(s) in theme #7 has been included. This is accessed in the demonstration via a link in the popup to the PDF file.

    2. For theme 6, it was not clear what some feature layer names mean (e.g., FAA_JSONToFeatures). A more descriptive name has been provided, given the audience of the ship’s captain and crew.

    3. For Theme 5, it was not clear what was required, but there are a few US laws and regulations that correspond to the Canadian legislation in Theme 5.

      1. Shipping https://www.ecfr.gov/current/title-46

      2. Navigation and Navigable Waters https://www.ecfr.gov/current/title-33

      3. Protection of Environment https://www.ecfr.gov/current/title-40. For this layer, one option would be to create a polygon of Alaska and provide these links as popups.

    4. Also, for theme 5, the difference between the communication in theme 1 and theme 5 is that theme 5 covers two-way communication, while theme 1 covers one-way radio communication of aids to navigation (e.g., https://www.ccg-gcc.gc.ca/publications/mcts-sctm/ramn-arnm/docs/ramn-arnm-2022-eng.pdf), so Navtex should be in theme 1.

    5. For theme 4, it would have been preferred if a limits and boundaries service in S-121 format could have been found.

    6. For theme 3, a few layer names were made more descriptive for lay mariners; layer names such as METAR, AIS_JasonToFeatures, Marine Transportation (x2), ahari_2020, etc. were changed.

    7. For theme 2 – this is the Alaska region for the Coast Guard. Some additional layers from the Coast Pilot were included here.

    8. For theme 0 – there were some layers, such as moving Light All Round, Radiostation, soundings, deptharea, runway, unsupervisedAreas, land region, and polar/arctic ocean, referencing other layers that were useful but not directly related to the incident. An additional AVPG layer (Theme 8) was therefore added for this other appropriate layer data.

10.9. Client 2 (D101) - Helyx

10.9.1. Description

Helyx Secure Information Systems Ltd is a UK professional services company covering a wide range of relevant skills and capabilities, specializing in the provision of information management, exploitation, assurance, and GIS solutions and services. Helyx have been a member of the OGC since 2016 and have been involved in previous OGC Testbeds and Pilots. Helyx were also involved in Phase 2 of the FMSDI Pilot, investigating how S-122 Marine Protected Area (MPA) datasets could be used in low-connectivity environments, developing a web-based client, and contributing to the Summary Engineering Report. Several members of Helyx have extensive geographical and environmental backgrounds and, combining this with technical knowledge and hands-on experience, recognize the increasing importance of the terrestrial and maritime interface in relation to climate change.

For the FMSDI Pilot Phase 3, which focuses on the maritime and terrestrial domains in the Arctic, Helyx was interested in how users, applying various interoperable technologies, can access, process, and visualize datasets that span these two domains. Additionally, these two domains have traditionally been segregated as the end users generally have different requirements and use cases, as well as varying processes and standards for data capture, processing, storage, and dissemination. Helyx views this as an opportunity to demonstrate how users could use and process data, and a chance to increase the exposure of OGC APIs, IHO standards, and the terrestrial / maritime interface to a growing user base.

10.9.2. Sub-scenario and Data

As described in Chapter 7, the overall scenario is based on the expedition cruise ship ‘Discovery’ losing engine power and subsequently running aground off the coast of Kotzebue, Western Alaska. The sub-scenario for the D101 client is centered on an oil spill from the grounded vessel and on determining what anthropogenic and environmental areas may be affected when the vessel is recovered and towed to a suitable port. The sub-scenario therefore occurs some hours to days after the grounding of the vessel, once the Search and Rescue (SAR) operation is underway or has been completed.

The emphasis is on allowing an end user, who may not be equipped with the appropriate software or processing capabilities, to analyze an event in a web client using data from several disparate sources. Whilst it can be favorable to have all the necessary data stored or hosted on one server, whereby powerful server-side processing and analytics can occur, this is not always possible; for example, an event of interest may occur in a new or unusual Area of Interest (AOI) where there are no existing datasets or infrastructure in place.

In this sub-scenario the end user utilizes client-side processing to conduct the analysis using available datasets in the region. This end user could be an analyst from a government or non-profit environmental organization, or an analyst from the vessel’s company (or insurance company) who needs to monitor the nature and extent of the oil spill and then identify the potentially disrupted areas when the vessel is recovered.

Actor(s), end-user(s), and/or stakeholder(s)

  • Ecological and environmental groups

  • Vessel recovery team

  • Vessel operating company

  • Vessel insurance company

  • Local / national government

Platform, Datasets, and Technical Architecture

The web client is built using Leaflet JS, a lightweight and openly available JavaScript library for creating interactive maps. This was chosen for its accessibility, ease of use, and the ability to extend the core capabilities using a vast number of plugins. A number of datasets from two participant servers (D103 & D104) were utilized, alongside several other external data providers and web services.

The types of technologies, services, and datasets accessed include the following.

  • Basemap services (OpenStreetMap, ArcGIS) - to allow the user to select the most appropriate one for their needs.

  • OGC API - Features and Coverages – datasets hosted by the D103 & D104 servers.

  • OGC WMS – layers hosted by the D104 server.

  • S-100 type data – S-101, S-104, S-127, S-421 datasets and more from D103.

Arctic Voyage Planning Guide

Where possible, datasets from across the seven common themes were ingested into the web client (see the table below of AVPG themes and corresponding datasets for the demonstration sub-scenario).

Table 6. AVPG theme and corresponding datasets (non-exhaustive) for the demonstration sub-scenario

  • Carriage Requirements – S-104 Water Level; S-421 Route Lines / Points; NOAA Buoys

  • Regulatory Requirements – N/A

  • Arctic Env. Considerations – Populated Places; Weather Stations; Airports; Oil Spills

  • Route Planning – Maritime Protected Areas; Boundaries / Limits

  • Reporting and Comms – Soundings; Radio Station

  • Marine Services – Ice concentrations / extent; FAA Flights; AIS

  • Nautical Charts and Publications – N/A

Helyx 1
Figure 68. Example of the D101 client with various datasets ingested, along with some processing results.

10.9.3. Technical Architecture

The D101 client utilizes connections to multiple data services, including those from other FMSDI Pilot participants, requesting data by submitting queries. Some datasets are loaded into the client by default, whereas others are only loaded once the user has provided input, e.g., drawn an AOI. Various types of services are used, including OGC APIs, other standard geospatial web services, and locally held data (Figure 69).

Helyx 2
Figure 69. Diagram of Helyx Technical Architecture

10.9.4. Demonstration

The D101 client demonstrates client-side geoprocessing of datasets from different sources and ingest routes. The table below (D101 client demonstration storyboard) summarizes the sub-scenario in the video demonstration and helps to clarify the product objectives and functionalities. The actions, actors, and technologies involved are described according to each stage of the demonstration.

Table 7. D101 client demonstration storyboard
Seq # Activity Description Actor/End User & Technology

1

An analyst within an environmental team is tasked with determining what areas may be affected by the oil spill from the Discovery. Using data supplied from D104, they load in the oil spill detection data.

Actor: Analyst

Application: Web-based client processing

Interoperable Technologies: OGC Feature API, WMS, WFS

Data: Oil spill dataset

2

They are then given some parameters to conduct some preliminary analysis to identify affected areas. They can use the preset functions within the client to select an option for the weather conditions and volume of oil spilt.

Actor: Analyst

Application: Web-based client processing

Interoperable Technologies: OGC Feature API, client-processing

Data: Oil spill datasets, weather, sea-state

3

The client returns the results to the end user, who can then see which areas are potentially at risk of being affected by the oil spill. The analyst is able to quickly put this into their report and send it out to their management for review.

Actor: Analyst

Application: Web-based client

Interoperable Technologies: Openly available web-based client

Data: AIS ship positions

4

The analyst is then tasked with finding out which suitable ports are nearby. Using open-source research and suitable mapping they find one or more nearby ports and draw an approximate route to these in the client. The client then returns various datasets within a specified distance back to the user displaying what areas and features may be affected when the Discovery is towed back to port.

Actor: Analyst

Application: Web-based client processing

Interoperable Technologies: OGC APIs, WMS

Data: Routes, AIS traffic, populated places, environmental areas, etc.

10.9.4.1. Demonstration Video

To view the demonstration video showing how Helyx Secure Information Systems Ltd’s client uses client-side processing to determine what anthropogenic and environmental areas may be affected from an oil spill when the vessel is recovered and towed to a suitable port in the Arctic, please click the following link to the video on OGC’s YouTube Channel.

10.9.5. Results: Integration, Interactions, and Interoperability

In general, there were no issues in accessing and integrating the various data services from either the participant servers or external services. Most were served via OGC compliant APIs or as standard geospatial formats which Leaflet can easily ingest. It should be noted, though, that having a large number of disparate data sources, with some held locally, is not the most efficient method for getting data into a client or for displaying the relevant metadata associated with each dataset. Additionally, the level of metadata supplied with each dataset varied; therefore, some datasets may be of limited use to end users who are unfamiliar with them.

Leaflet is a versatile library; however, many plugins are required to make the most out of the datasets, e.g., geoprocessing capabilities, WMS enhancements, Esri interoperability, etc.

DDIL / Low Network Connectivity

Whilst not the primary focus during this phase of the FMSDI Pilot, the D101 client recognizes the need to account for the fact that network access and connectivity can be difficult, particularly for maritime users or terrestrial users in remote areas. With that in mind, several methods were employed in the client to minimize the volume of data requested from providers, such as the following.

  • Using spatial filters to only request common data within the map extent.

  • Only querying and subsequently displaying other data once the user has provided an AOI.

There are various mechanisms relating to spatial data infrastructures, network connectivity, and, as covered in Phase 2 of the FMSDI Pilot, the data itself, that could be implemented to reduce the volume of data if this were of primary concern to the end users.
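
The first of these methods, spatial filtering, maps directly onto the bbox parameter of OGC API - Features. A minimal sketch follows; the endpoint and collection name are placeholders.

    import requests

    def features_in_extent(base, collection, west, south, east, north):
        """Request only the features intersecting the current map extent."""
        url = f"{base}/collections/{collection}/items"
        params = {"f": "json",
                  "bbox": f"{west},{south},{east},{north}",
                  "limit": 500}
        resp = requests.get(url, params=params, timeout=30)
        resp.raise_for_status()
        return resp.json()

    # Illustrative extent around Kotzebue Sound, Alaska.
    fc = features_in_extent("https://example.com/ogcapi", "populated_places",
                            -165.0, 66.0, -160.0, 68.0)
    print(len(fc.get("features", [])), "features in extent")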

10.9.6. Challenges and Lessons Learned

Data fidelity

One noticeable challenge was the limited processing capability of the client when analyzing high fidelity data, as these were typically large datasets with complex shapes. The oil spill dataset in particular:

  • had a high spatial resolution (cells of 10-15m);

  • contained many holes within the polygons; and

  • contained many individual features.

These characteristics meant the data took a long time to process in the web client and would occasionally cause it to crash. There are some possible workarounds, including the following.

  • Utilizing other geoprocessing functionality to pre-process the data before running other processes (i.e., fill holes in the polygon, simplify the shape or extent).

  • Having data pre-processed and stored on the server side.

  • Iterating through the polygons one at a time, rather than all at once.

  • Using desktop software or other machine methods to assist in simplifying the shape.

  • Standing up a local server to store and process data.

Land and Maritime Domains

The area that connects the land and maritime domains is crucial, and datasets covering the two domains need to meet and integrate effectively there. However, datasets from these two domains do not always achieve this, particularly when data has been captured at different times, to different standards, or to varying levels of precision and accuracy. This was almost certainly exacerbated during this Pilot, as data was obtained from a wide range of data providers.

Oil Spill Modeling

The processing for the oil spill was deliberately rudimentary, as oil spill behavior is a very complex topic with many variables and the focus for the Pilot was on the data and client-side processing. There is scope for future efforts to use a more sophisticated model (such as NOAA’s GNOME) or to create one in-house to better reflect the reality of an oil spill scenario. Doing so would require the correct datasets, some of which were not readily available over the scenario AOI. Finally, the buffer method used within the client is generated in all directions, as there is no functionality to determine or emphasize a particular direction; this would again be improved using a more sophisticated model.
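
For comparison, the rudimentary omnidirectional buffer described above is equivalent to the following server- or notebook-side sketch using shapely; the spill polygon coordinates and buffer distance are invented for illustration.

    from shapely.geometry import Polygon

    # Simplified spill footprint (lon/lat) near the grounding site; made up.
    spill = Polygon([(-163.2, 66.55), (-163.0, 66.60), (-162.9, 66.50)])

    # Uniform buffer in degrees (~0.05 deg). A real model such as NOAA's GNOME
    # would weight direction by wind and current instead of buffering uniformly.
    at_risk_zone = spill.buffer(0.05)
    print(round(at_risk_zone.area, 4), "square degrees potentially affected")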

Data Availability

Finally, as noted by the other Pilot participants, whilst there were many different data sources available, the actual amount of data situated within the scenario AOI was minimal. This made it difficult to test some of the datasets within the AVPG. A future scenario could occur in an AOI where more datasets are available, either over a larger geographic extent or across several smaller AOIs tied to a master scenario.

10.10. Client 3 (D102) - Ecere

10.10.1. Description

Ecere Corporation is a Canadian company developing geospatial software solutions. Ecere implements several standards that are part of the OGC API family in its GNOSIS Map Server, GNOSIS Software Development Kit, and GNOSIS Cartographer (3D visualization client) software products. Ecere also actively contributes to the OGC API standards themselves, including through a co-editor role for OGC API - Discrete Global Grid Systems (DGGS).

As a participant in the Federated Marine SDI Pilot Phase 3, Ecere focused on the implementation of a DGGS client interacting with the DGGS server provided by the University of Calgary. This close collaboration made it possible to validate the current draft OGC API - DGGS candidate standard in the context of practical Technology Integration Experiments (TIEs).

10.10.2. Sub-scenario and Data

The sub-scenario for the Ecere client consisted of demonstrating the use of DGGS to perform analytics related to coastal flooding and erosion, in combination with the DGGS server provided by the University of Calgary, focusing on the integration of both terrestrial and marine spatial information. The client enabled visualizing the results in 3D perspective together with additional sources of information provided by other services.

10.10.3. Technical Architecture

10.10.3.1. Overview

The TIEs with the University of Calgary DGGS server were demonstrated in Ecere’s GNOSIS Cartographer 3D visualization client, built using the cross-platform GNOSIS Software Development Kit. The GNOSIS SDK, written in the eC programming language and leveraging the open-source Ecere SDK, enables the development of geospatial applications for desktop, mobile and web platforms. Support for accessing data from the DGGS server was added to the GNOSIS SDK’s OGC API client module. In the first experiment, corresponding to the OGC API - DGGS “Data Retrieval” requirements class, the client requested data for specific zones of a Discrete Global Grid System instance. Support for the ISEA9R DGGS, described below, was also implemented in Ecere’s GNOSIS Map Server in addition to its native GNOSIS DGGS.

10.10.3.2. DGGS based on the Icosahedral Snyder Equal Area (ISEA) projection

Although the University of Calgary’s DGGS server internally performs data quantization and integration based on the Icosahedral Snyder Equal-Area Aperture 3 Hexagonal (ISEA3H) DGGS, for the purpose of the TIEs the participants selected a dual DGGS of ISEA3H based on rhombuses, which greatly simplifies the tasks of partitioning, indexing, and encoding the data of particular geospatial regions. Each of the ten root rhombuses (or diamonds, as they are also called in Kevin Sahr, Denis White, and A. Jon Kimerling’s 2003 paper “Geodesic DGGS” [7]) corresponds to two of the twenty icosahedron triangles (joined at their base) of the planar ISEA projection – the same equal-area projection devised by John P. Snyder, described in his 1992 paper “An Equal-Area Map Projection For Polyhedral Globes,” and also used for ISEA3H. In the DGGS used for the data exchange, each of these root rhombuses is divided into nine (3 by 3) equal-area rhombuses at the next level (aperture 9). The ISEA9R name has been suggested to refer to this DGGS, but the initial GNOSIS implementation used the ISEA9Diamonds identifier. Since for ISEA3H the hexagons of each successive level occupy an area three times smaller than those of the previous one, each level of ISEA9R corresponds to every other (even) level of ISEA3H.

Ecere 01
Figure 70. ISEA projection illustration from the PROJ library for proj-string: +proj=isea
Ecere 02
Figure 71. NASA Blue Marble Next Generation projected by Ecere’s GNOSIS Map Server to the ISEA unfolded icosahedron plane where the ten root rhombuses can be seen connected in a zig-zag pattern.
Ecere 03a
Figure 72. ISEA9R root (level 0) rhombus divided into nine level 1 rhombuses, showing vertices centered in ISEA3H level 6 hexagons (the polar ISEA orientation is used here rather than the standard one).
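
The aperture-9 construction described above implies simple zone arithmetic, sketched below; the function names are illustrative.

    def isea9r_zone_count(level: int) -> int:
        """Ten root rhombuses, each subdivided 3 x 3 at every level (aperture 9)."""
        return 10 * 9 ** level

    def corresponding_isea3h_level(isea9r_level: int) -> int:
        """Each ISEA9R level matches every other (even) ISEA3H level by zone area."""
        return 2 * isea9r_level

    assert isea9r_zone_count(0) == 10           # the ten root rhombuses
    assert isea9r_zone_count(1) == 90           # nine children per root rhombus
    assert corresponding_isea3h_level(3) == 6   # cf. Figure 74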
10.10.3.3. Relationship between ISEA3H and ISEA9R

The hexagonal (and 12 pentagonal) zones of the ISEA3H DGGS even levels are centered on the vertices of the ISEA9R rhombus zones. This allows transporting zone area values for those ISEA3H even levels as ISEA9R zone point values. The reverse is also true: ISEA9R rhombus zone vertices can be mapped to the single ISEA3H hexagon (or pentagon) within which they are centered.

ASCII diagram illustrating the relationship between ISEA3H hexagonal zones at even levels (formed by six vertices marked with asterisks) and ISEA9R rhombus zones (formed by four vertices marked with letters, centered within ISEA3H hexagons).
 \ *   \ *   \ *   \ *
	* \   * \   * \   * \   *
	|--A--|--B--|--C--|--D--|
	*   \ *   \ *   \ *   \ *
	   * \   * \   * \   * \   *
	   |--E--|--F--|--G--|--H--|
	   *   \ *   \ *   \ *   \ *
	      * \   * \   * \   * \   *
	      |--I--|--J--|--K--|--L--|
	      *   \ *   \ *   \ *   \ *
	         * \   * \   * \   * \   *
	         |--M--|--N--|--O--|--P--|
	         *   \ *   \ *   \ *   \ *
	            * \   * \   * \   * \
Ecere 03b
Figure 73. ISEA9R root (level 0) rhombus divided into level 1, 2 and 3 rhombuses, showing vertices centered in ISEA3H level 6 hexagons (the polar ISEA orientation is used here rather than the standard one).
Ecere 03c
Figure 74. Area value for a ISEA3H level 6 hexagon or pentagon corresponding to an ISEA9R level 3 point value.

Hexagonal zones from odd levels (e.g., hexagon BCGJIE in the figure above) do not have a direct mapping to a level of the same ISEA9R DGGS, but instead could be mapped to a separate aperture 9 rhombus layout, with the two alternating sets of multi-resolution rhombus grids encompassing together all ISEA3H hierarchy levels.

Conversely, a value representing the area of an ISEA9R rhombus zone can be mapped to two vertices of a corresponding ISEA3H even level hexagonal zone. If the area of each rhombus is divided into two triangles, then the area of each triangle maps to a single ISEA3H zone vertex. Another triangle-based ISEA9T DGGS could be defined which would be equal area and partially axis-aligned, since two out of three triangular zone edges (those shared with the encompassing rhombus zone) are aligned with the CRS axes (if applying an affine transform, as discussed below).

Vice-versa: each vertex of a hexagonal ISEA3H even level’s zone also maps to one triangle half of an ISEA9R rhombus zone. It would also be easy to aggregate the two values into a single rhombus zone area value.

ASCII diagram illustrating the relationship between ISEA3H hexagonal zones (formed by six vertices marked with asterisks) and the triangular zones of a potential ISEA9T DGGS (formed by three vertices marked with letters, within which are centered the hexagon vertices), corresponding to two halves of an ISEA9R rhombus.
 \ * / \ * / \ * / \ * /
	* \ / * \ / * \ / * \ / *
	|--A--|--B--|--C--|--D--|
	* / \ * / \ * / \ * / \ * /
	 / * \ / * \ / * \ / * \ / *
	   |--E--|--F--|--G--|--H--|
	   * / \ * / \ * / \ * / \ * /
	    / * \ / * \ / * \ / * \ / *
	      |--I--|--J--|--K--|--L--|
	      * / \ * / \ * / \ * / \ * /
	       / * \ / * \ / * \ / * \ / *
	         |--M--|--N--|--O--|--P--|
	         * / \ * / \ * / \ * / \ *
	          / * \ / * \ / * \ / * \
ASCII diagram illustrating the two triangle halves of an ISEA9R rhombus at the parent level, each triangle containing 9 triangles and the rhombus containing 9 rhombuses from the previous diagram. The hexagons from the following diagram have their vertices F and K centered within each triangle.
|--A--|--B--|--C--|--D--|
	    \               / \
	     \             /   \
	      E     F     G     H
	       \         /       \
	        \       /         \
	         I     J     K     L
	          \   /             \
	           \ /               \
	         |--M--|--N--|--O--|--P--|
ASCII diagram illustrating the ISEA3H hexagonal zones at two hierarchy levels up compared to the first ASCII diagrams shown earlier (next higher even level). The hexagon vertices F and K are centered within the parent triangular zones from the previous diagram.
 \          |      \          |
	  \         |       \         |
	---A-----B--|--C-----D---     |
	    \       |         \       |
	     \      |          \      |
	      E     F     G     H     *
	       \ /     \         \  /    \
	\     / \         \     / \         \
	   *     I     J     K     L     *     *
	   |      \          |      \          |
	   |       \         |       \         |
	   |     ---M-----N--|--O-----P---     |
	   |         \       |         \       |
	   |          \      |          \      |
10.10.3.4. A Two-Dimensional Tile Matrix Set based on ISEA9R

An important simplifying aspect of the rhombus-based ISEA9R DGGS compared to the hexagon-based ISEA3H is that data for each zone can easily be encoded in traditional imagery formats such as PNG and (Geo)TIFF by mapping the skewed rhombus grid to a rectilinear grid. In addition to facilitating data transmission, this also makes it possible to cache data packets in 2D textures stored in the memory of Graphics Processing Units (GPUs).

During the pilot, participants realized that a CRS based on the ISEA plane to which a simple affine transformation is applied can form the basis of a proper 2D Tile Matrix Set (2DTMS), as defined by the 2D Tile Matrix Set and Tileset Metadata Standard. This enables the description of an ISEA9R DGGS strictly based on the definition of such a CRS as well as the 2DTMS definition schema specified in the 2DTMS standard. Furthermore, this also enables the use of the approved OGC API - Tiles standard as an alternative to the OGC API - DGGS “Data Retrieval” requirements class. A DGGS defined in this CRS (after the affine transformation has been applied) conforms to the axis-aligned conformance class of the OGC Abstract Topic 21 - Discrete Global Grid Systems Standard, planned for Part 4, where all zone edges are parallel to the axes of the base CRS. Because the ISEA projection is equal area and the affine transformation maintains this property, the DGGS also still conforms to the equal area DGGS conformance class of OGC Topic 21 - Part 1, resulting in a very useful combination of convenient characteristics: being both equal area as well as axis-aligned. The base CRS for the DGGS and 2DTMS used in the TIEs is the ISEA projection using the standard orientation, placing one vertex of the icosahedron at (58.282525590°N, 11.250°E) and an adjacent vertex due North from this first vertex. The planar space of the unfolded icosahedron is then translated so that the leftmost icosahedron vertex lies at (0, 0), rotated 60 degrees clockwise, sheared with a 1.0 / sqrt(3) factor horizontally (30° skew), and finally scaled by (1.0 / 7674457.9483758880087 m, 1.0 / 6646275.5435688359285 m), resulting in a 0..5 x 0..6 2D space. Ten of the thirty 1x1 square tiles within this space are valid and correspond to the ISEA projection rhombuses, being located wherever the integral part of the y (vertical) coordinate is either equal to or one more than the integral part of the x (horizontal) coordinate (forming a staircase pattern).
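As a rough sketch of this transformation chain (the translation offsets TX and TY are hypothetical placeholders, as the exact values depend on the plane coordinates produced by a given ISEA implementation):

#include <math.h>
#include <stdbool.h>

// Scale factors from the 2DTMS definition (ISEA plane meters per CRS unit).
static const double SX = 7674457.9483758880087;
static const double SY = 6646275.5435688359285;

// Hypothetical translation placing the leftmost icosahedron vertex at (0, 0);
// the actual offsets depend on the ISEA implementation's plane extents.
static const double TX = 0.0, TY = 0.0;

// Maps ISEA unfolded-icosahedron plane coordinates (meters) to the ISEA9R
// CRS: translate, rotate 60 degrees clockwise, shear 30 degrees, then scale,
// yielding coordinates in the 0..5 x 0..6 space of squared rhombuses.
void iseaToISEA9R(double x, double y, double * u, double * v)
{
   double a = -M_PI / 3;                    // 60 degrees clockwise
   double tx = x + TX, ty = y + TY;         // translate
   double rx = tx * cos(a) - ty * sin(a);   // rotate
   double ry = tx * sin(a) + ty * cos(a);
   rx += ry / sqrt(3);                      // horizontal shear (30 degree skew)
   *u = rx / SX, *v = ry / SY;              // scale to unit rhombus squares
}

// One of the ten valid tiles lies wherever the integral part of the vertical
// coordinate equals, or is one more than, that of the horizontal coordinate.
bool isValidISEA9RTile(double u, double v)
{
   int iu = (int)floor(u), iv = (int)floor(v);
   return iv == iu || iv == iu + 1;
}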

Ecere 04a
Figure 75. NASA Blue Marble Next Generation in the CRS for the ISEA9R 2D Tile Matrix Set (the ISEA unfolded icosahedron plane translated, rotated 60° clockwise, sheared 30° and scaled to perfect squares), showing the 10 / 30 valid tiles (squared icosahedron rhombuses along a staircase pattern).
Ecere 04b
Figure 76. 4° graticule projected to ISEA (standard orientation) by the GNOSIS implementation of the projection, with the ISEA9R 2D Tile Matrix Set affine transformation applied (forward projection)
Ecere 05
Figure 77. The projected 4°graticule from the previous figure successfully de-projected back to a WGS84 geographic CRS by the GNOSIS implementation of the ISEA projection (inverse projection)
The first two levels of a JSON ISEA9R 2D Tile Matrix Set definition. Note that the CRS and 2DTMS URIs are not yet registered, and that the cellSize shown here is in meters (rather than scaled CRS units; this remains to be reviewed).
{
   "id" : "ISEA9R",
   "title" : "ISEA9R",
   "uri" : "http://www.opengis.net/def/tilematrixset/OGC/1.0/ISEA9R",
   "crs" : "http://www.opengis.net/def/crs/OGC/0/1534",
   "orderedAxes" : [ "E", "N" ],
   "tileMatrices" : [
      {
         "id" : "0",
         "scaleDenominator" : 112793326.695706769824,
         "cellSize" : 31582.131474797894,
         "cornerOfOrigin" : "topLeft",
         "pointOfOrigin" : [ 0, 0 ],
         "matrixWidth" : 5,
         "matrixHeight" : 6,
         "tileWidth" : 243,
         "tileHeight" : 243
      },
      {
         "id" : "1",
         "scaleDenominator" : 37597775.5652355924249,
         "cellSize" : 10527.3771582659647,
         "cornerOfOrigin" : "topLeft",
         "pointOfOrigin" : [ 0, 0 ],
         "matrixWidth" : 15,
         "matrixHeight" : 18,
         "tileWidth" : 243,
         "tileHeight" : 243
      }
   ]
}
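Successive tile matrices follow a simple pattern: each level divides the cell size and scale denominator by three and triples the matrix width and height, while the tile size remains fixed at 243x243. A minimal sketch generating these parameters, matching the listing above to within rounding:

#include <stdio.h>

// Prints tile matrix parameters for the first ISEA9R levels, following
// the aperture 9 refinement pattern of the JSON definition above.
int main(void)
{
   double scaleDenominator = 112793326.695706769824;   // level 0
   double cellSize = 31582.131474797894;               // level 0, in meters
   int width = 5, height = 6;
   for(int level = 0; level <= 2; level++)
   {
      printf("level %d: scale 1:%.4f, cell %.4f m, %d x %d tiles of 243x243\n",
         level, scaleDenominator, cellSize, width, height);
      scaleDenominator /= 3, cellSize /= 3;   // 3x3 refinement per level
      width *= 3, height *= 3;
   }
   return 0;
}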
Ecere 06
Figure 78. Rows and columns indexing for the ISEA9R 2D Tile Matrix Set, with the transformed grid from the PROJ ISEA projection illustration, also showing level 1 grid (the number displayed on level 1 tiles is not the {index} relative to the root level rhombus, but rather a unique index for each zone within ISEA9R level 1)

A sample definition for the 2DTMS based on this CRS is available from the GNOSIS Map Server implementation (supporting both OGC API - Tiles as well as OGC API - DGGS):

	https://maps.gnosis.earth/ogcapi/tileMatrixSets/ISEA9Diamonds?f=json

With both OGC API - DGGS "Data Retrieval" and OGC API - Tiles (coverage tiles), it is possible to return data packets that represent either the values of cells lying wholly within the zone / tile area (Value-Is-Area), or values corresponding to infinitely small points located at the vertices of the zone / tile and at regular intervals in between (Value-Is-Point). Rather than representing a single value for a zone / tile, a data packet typically also contains values for a fixed number of zoom levels beyond the hierarchy level / tile matrix of the requested zone / tile.

In this 2DTMS definition, a tile size of 243x243 (3^5) is assumed for Value-Is-Area tile payloads (corresponding to 244x244 for Value-Is-Point coverage tiles), and tile indexing using rows and columns starts from the top-left corner. The provisional (currently invalid) http://www.opengis.net/def/crs/OGC/0/1534 URI was used to identify the CRS of the ISEA plane to which the affine transformation required for the 2DTMS definition has been applied.

10.10.3.5. Indexing the ISEA9R DGGS

For the purpose of the DGGS API ../zones/{zoneID}, Ecere suggested a zone indexing of the form {level}{root}-{index}, where:

  • {level} is a letter indicating the hierarchy level (A:0, B:1, C:2…);

  • {root} is an index from 0-9 referring to the root rhombus: 0 for the top-left zone / left-most in ISEA plane, following a zig-zag pattern to the bottom/right; and

  • {index} is a hexadecimal numerical index for zones within the root rhombus starting at 0 and incrementing first going from left to right along a row, then continuing to increment at the first column of the next row; for level 1, each root rhombus contains 9 (3x3) such zones numbered 0..8, for level 2 each rhombus contains 81 (9x9) numbered 0..50 (hex), and so on.

Therefore, zone ID “C0-3C” corresponds to level 2 (“C”), root rhombus 0, row 6, column 6 (index 3C in hexadecimal, or 60 in decimal: 6 x 9 + 6). The main advantage of this indexing is that it is more compact than the decimal rows and columns of the Tiles API, especially at more detailed zoom levels. This is particularly relevant for the DGGS API “Zones Query” requirements class, where a (possibly very large) list of zone IDs is returned in response to a geospatial query, e.g., one formulated using an OGC Common Query Language (CQL2) expression, answering a question of the type “Where is it?”. Ecere implemented this indexing in its GNOSIS Map Server demonstration end-point. For example, data of the NASA Visible Earth night lights (“black marble”, https://blackmarble.gsfc.nasa.gov/) collection for zone C0-3C is available at: https://maps.gnosis.earth/ogcapi/collections/lights/dggs/ISEA9Diamonds/zones/C0-3C/data?f=png (DGGS API - Data Retrieval conformance class), or alternatively through the Tiles API at: https://maps.gnosis.earth/ogcapi/collections/lights/tiles/ISEA9Diamonds/2/6/6?f=png
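For illustration, a minimal sketch (our own helper, not part of the GNOSIS API) deriving this zone identifier from a 2DTMS level, row, and column; note that zero-padding of {index} may differ in the actual implementation:

#include <stdio.h>

// Builds the suggested {level}{root}-{index} zone identifier from an
// ISEA9R 2DTMS level, row, and column (assumed to be a valid staircase tile).
void isea9rZoneID(char * id, int level, int row, int col)
{
   int p = 1, l;
   for(l = 0; l < level; l++) p *= 3;    // zones per root rhombus side: 3^level
   int rootRow = row / p, rootCol = col / p;
   int root = rootRow + rootCol;         // staircase pattern: row - col is 0 or 1
   int index = (row - rootRow * p) * p + (col - rootCol * p);
   sprintf(id, "%c%d-%X", 'A' + level, root, index);
}

int main(void)
{
   char id[32];
   isea9rZoneID(id, 2, 6, 6);
   printf("%s\n", id);   // prints "C0-3C": level 2, root 0, index 60 (0x3C)
   return 0;
}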

Ecere 07
Figure 79. Zone Information Resource for ISEA9R (“ISEA9Diamonds”) DGGS Zone C0-3C for NASA Visible Earth “black marble” in GNOSIS Map Server OGC API - DGGS Implementation
10.10.3.6. Discrete Global Grid System based on the GNOSIS Global Grid 2D Tile Matrix Set

In addition to newly implemented support for the ISEA9R DGGS, the GNOSIS Map Server and GNOSIS SDK rely heavily on their native axis-aligned DGGS based on the GNOSIS Global Grid 2D Tile Matrix Set (http://www.opengis.net/def/tilematrixset/OGC/1.0/GNOSISGlobalGrid). This 2D Tile Matrix Set leverages the variable-width conformance class, allowing tiles to be coalesced when approaching the poles in order to approximate equal area in polar regions while being based on a geographic (EPSG:4326) CRS.

Ecere 08
Figure 80. Tile Matrix (zoom level) 2 from the GNOSIS Global Grid 2D Tile Matrix Set, showing the different coalescence factors (c) for different latitude rows near the poles (OGC 17-083r4)

Although this DGGS does not conform to the OGC Topic 21 - Part 1 equal area conformance class since the area difference for zones of the same level exceeds the maximum threshold of 1% allowed, it achieves a reasonable ±45.9% difference from median zone area up to level 25 (~849 cm² mean zones) or ±48.4% from median up to level 26 (~212 cm² mean zones). This difference in zone area can easily be accounted for in analytics calculations, and is the same for all zones of a given latitude band, a fact which can be used for optimizing such calculations. Examples of zone area calculations and their associated variance from the mean zone area are seen in the zones query resources of the GNOSIS Map Server implementation.

This grid’s main advantage is that data in an unprojected geographic coordinate system can be very efficiently ingested, while projected data only requires a single inverse projection to geographic coordinates. Similarly, projecting data from this grid requires a single forward projection transformation from EPSG:4326, which is readily available for most projections. The grid is particularly well suited to present data on a virtual 3D globe, including with a 3D terrain mesh generated from digital elevation models.

In contrast, a DGGS based on the ISEA projection offers the advantage of being truly equal area, but requires performing an additional costly forward or inverse conversion to that ISEA projection when ingesting or exporting data to another projection or to a geographic CRS.

Data served by the GNOSIS Map Server is internally stored and processed using the GNOSIS Global Grid, and data displayed in GNOSIS clients is also natively cached and displayed using the GNOSIS Global Grid. The TIE results with the University of Calgary DGGS server demonstrate that runtime conversion between the two different DGGSs can be achieved efficiently, although further optimizations and more direct pathways could be implemented.

10.10.3.7. Particularities of TIEs with University of Calgary DGGS Server

At the time of performing TIEs with the University of Calgary DGGS server, the implementation differed from the OGC API – DGGS candidate standard (though a simple direct mapping between the two was possible) and zone data packets were available in a custom “PYX0” data format (though they could also easily have been implemented using a common format such as GeoTIFF). The initial template URL for retrieving raw data value packets from the University of Calgary API was: http://digitalearthsolutions.com/api/v1/rhombus/TextureValue?key={PYXISRhombusKey}&size=244&geosource={geoSourceID}&format=PYX0 where {geoSourceID} referred to a particular collection of geospatial data, and {PYXISRhombusKey} referred to a particular rhombus zone ID. In a conforming OGC API - DGGS implementation, this data access template URL would correspond to e.g., http://digitalearthsolutions.com/api/v1/collections/{collectionId}/dggs/ISEA9R/zones/{zoneId}/data, where the PYX0 format would be negotiated with an Accept: HTTP request header using a corresponding media type (for GeoTIFF: Accept: image/tiff; application=geotiff). A relatively simple mapping from the suggested ISEA9R zone ID (through the corresponding 2D Tile Matrix Set level, row and column) to a PYXISRhombusKey was established. In addition, either a vertical or horizontal flip (depending on the root rhombus) had to be performed on the data to map correctly to the 2DTMS. This is illustrated in the following code listing and figure:

This function returns a PYXIS Rhombus Key string from an ISEA9R 2DTMS Level, Row, Column
void pyxisRhombusKeyFromLRC(char * key, int level, int row, int col)
{
   // Mapping from the ISEA9R staircase root index (root tile row + column)
   // to the PYXIS root diamond numbering, with the per-root horizontal or
   // vertical flip required to match the 2DTMS orientation
   static const int pyxisRootDiamonds[10] = { 0, 6, 1, 5, 2, 9, 3, 8, 4, 7 };
   static const bool xFlips[10] = { true,  false, true,  false, true,  false, true,  false, true,  false };
   static const bool yFlips[10] = { false, true,  false, true,  false, true,  false, true,  false, true  };
   uint64 p = (uint64)pow(3, level);                // zones per root rhombus side
   uint pr = (uint)(row / p), pc = (uint)(col / p); // root tile row and column
   int root = pr + pc;                              // staircase: row - col is 0 or 1
   int pyxisRoot = pyxisRootDiamonds[root];
   bool flipX = xFlips[root], flipY = yFlips[root];
   uint r = row, c = col;
   int i = 0, l;
   key[i++] = (char)('0' + pyxisRoot);
   for(l = 1; l <= level; l++)
   {
      int n;
      r -= pr * p, c -= pc * p;                     // local position within parent
      p /= 3;
      pr = (uint)(r / p), pc = (uint)(c / p);       // next base 3 digits (0..2)
      n = (flipY ? 2-pr : pr) * 3 + (flipX ? 2-pc : pc); // 0..8 sub-rhombus, flipped
      key[i++] = (char)('0' + n);
   }
   key[i] = 0;                                      // null-terminate the key
}
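As a usage sketch, tracing the listing above for the level 2, row 6, column 6 zone from the earlier example (zone ID “C0-3C”) yields the PYXIS rhombus key “062”:

   char key[8];
   pyxisRhombusKeyFromLRC(key, 2, 6, 6);
   printf("%s\n", key);   // prints "062": PYXIS root diamond 0, then digits 6 and 2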
Ecere 09
Figure 81. Numbering of root rhombus and horizontal or vertical flip for University of Calgary Server API, overlaid on top of the initial GEBCO elevation data styled PNG tiles provided
10.10.3.8. Defining an integration and processing workflow to determine susceptibility to coastal erosion

Although most of the effort during the pilot was spent on achieving a successful TIE in which the client retrieves data from the server for a particular geographic area, thought was also put into how a client could describe the information it wishes to retrieve, which could be the result of integrating several different sources of spatiotemporal information and processing capabilities discovered by the client, whether they are provided by the same service as the DGGS server or by other OGC API endpoints.

The OGC API – Processes - Part 3: Workflows and Chaining candidate Standard enables such instant integration and visualization of any geospatial data and/or processing capabilities available from anywhere. The candidate Standard achieves this by extending OGC API – Processes – Part 1: Core with the ability to:

  • reference local and remote nested processes as inputs in execution requests;

  • reference local and remote OGC API collections as inputs in execution requests;

  • modify data accessed as inputs and returned as outputs through filtering, selecting, deriving and sorting by fields (measured/observed properties); and

  • request output data (e.g., via OGC API - DGGS “Data Retrieval” requests for a particular ISEA9R DGGS zone) from a resulting virtual OGC API collection in order to trigger processing execution for a particular area and resolution of interest.

A workflow for determining the susceptibility to coastal erosion from four geospatial datasets (digital elevation model, surface geology, permafrost, and land cover) was developed by the University of Calgary and used as a basis to prototype a definition using an extended JSON execution request as specified by OGC API - Processes - Part 3. CQL2 expressions are used to associate properties from the datasets with the extent to which they could be contributing to erosion, and the results are finally weighted to produce a single susceptibility percentage value. The digital elevation model values first go through slope and aspect processes in order to produce two distinct values, each independently contributing to the erosion susceptibility.

This workflow is illustrated in a diagram below, and the corresponding JSON execution request is presented in the following JSON listing.

Ecere 10
Figure 82. Diagram illustrating possible interactions between OGC API clients and servers enabled by OGC API - Processes - Part 3: Workflows & Chaining, including the ability to trigger on-demand processing for a given Area & Resolution of Interest through OGC API data access mechanisms, such as OGC API - Discrete Global Grid Systems
Ecere 11
Figure 83. Diagram illustrating coastal erosion susceptibility workflow, integrating four different data sources
JSON listing for the coastal erosion susceptibility workflow illustrated in the previous wiring diagram
{
   "process": "https://example.com/ogcapi/processes/PassThrough",
   "inputs": {
      "data": [
         {
            "process": "https://example.com/ogcapi/processes/Slope",
            "inputs": {
               "dem": { "collection":
"https://example.com/ogcapi/collections/DEM" }
            },
            "properties": { "s" : "slope >= 36.4 ? 10 : slope >= 17.6 : 7 : slope >= 8.7 ? 5 : slope >= 3.5 ? 3 : 1" }
         },
         {
            "process": "https://example.com/ogcapi/processes/Aspect",
            "inputs": {
               "dem": { "collection":
"https://example.com/ogcapi/collections/DEM" }
            },
            "properties": { "a" : "aspect >= 315 or aspect < 45 ? 1 : aspect >= 225 or aspect < 135 : 5 : 10" }
         },
         {
            "collection":
"https://example.com/ogcapi/collections/ArcticPermafrost",
            "properties": {
               "e" : "extent = 'c' ? 1 : extent = 'd' ? 5 : extent = 's' ? 7 : extent = 'i' ? 10 : null"
               "c" : "content = 'l' ? 1 : content = 'm' ? 5 : content = 'h' ? 10 : 0"
            }
         },
         {
            "collection":
"https://example.com/ogcapi/collections/Landsat7LandCover",
            "properties": { "l" : "lc in(0,1,2,3,4,5,11,13,15) ? 1 : lc in(6,7,8,9,10,12,14) ? 5 : lc = 16 ? 10 : 0" }
         },
         {
            "collection": "https://example.com/ogcapi/collections/AlaskaSurficialGeology",
            "properties": {
               "g" :
               "qcode = 'Ql' ? 0 : qcode in ('Qra','Qi','Qrc', 'Qrd', 'Qre') ? 1 : qcode in ('Qrb','Qaf', 'Qat', 'Qcb','Qfp','Qgmr') ? 3 : qcode in ('Qcc','Qcd','Qel','Qm1', 'Qm2','Qm3','Qm4','Qw1','Qw2') ? 5 : qcode in ('Qes','Qgm') ? 7 : qcode in ('Qed','Qgl','Qu') ? 10 : 0"
            }
         }
      ]
   },
   "properties": { "susceptibility" : "0.30 * s + 0.05 * a + 0.05 * e + 0.20 * c + 0.10 * l + 0.30 * g" }
}
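Note that the weights in the final susceptibility expression (0.30 + 0.05 + 0.05 + 0.20 + 0.10 + 0.30) sum to 1.0, so the combined susceptibility value remains on the same scale as the individual contributing factor scores.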

10.10.4. Demonstration

Due to the amount of effort involved in researching and developing support for the DGGS and the Icosahedral Snyder Equal Area projection necessary to achieve successful TIEs with the University of Calgary DGGS server, the demonstration of the pilot results for Ecere’s contribution consists mainly of the above detailed description of a proven technical architecture for client/server DGGS interaction, as well as screenshots and a video of Ecere’s GNOSIS Cartographer client visualizing data retrieved from the University of Calgary’s DGGS server in its 3D virtual globe. These screenshots are featured in the following section.

In a real-world scenario, a client could allow a user to independently discover data sources and processes available from one or more OGC API servers, define a workflow integrating these data sources and processes in a meaningful way, request data for its current area and resolution of interest (using e.g., OGC API - DGGS) based on its current viewport configuration which would trigger on-the-fly processing, and quickly display results visually to the user. Demonstrating such interaction involving multiple end-points where a workflow is built starting from data and process discovery to final visualization in the context of a practical scenario would be interesting and insightful future work for a follow-on phase of the Federated Marine SDI Pilot, or for another OGC Innovation Program initiative.

10.10.4.1. Demonstration Video

In the demonstration video, Ecere’s client demonstrates, from a 3D perspective, the use of DGGS in performing analytics related to coastal flooding and erosion, focusing on the integration of both terrestrial and marine spatial information using OGC API – DGGS in the Arctic. To view the demonstration video, please click the following link to the video on OGC’s YouTube Channel

10.10.5. Results: Integration, Interactions & Interoperability

Ecere 12
Figure 84. Results of the coastal erosion susceptibility workflow and GEBCO bathymetry data retrieved from the University of Calgary DGGS server as ISEA9R / PYX0 data tiles styled and displayed in Ecere’s GNOSIS Cartographer 3D client, integrated with elevation data from Viewfinder Panoramas and Blue Marble Next Generation stored as GNOSIS Global Grid data tiles. ESA’s Gaia Sky in Colour in the background.
Ecere 12b
Figure 85. Results of the coastal erosion susceptibility workflow retrieved from the University of Calgary DGGS server styled and displayed in Ecere’s GNOSIS Cartographer, integrated with elevation data from Viewfinder Panoramas and Blue Marble Next Generation.
Ecere 13
Figure 86. GEBCO bathymetry data retrieved from University of Calgary DGGS server as ISEA9R / PYX0 data tiles displayed in Ecere’s GNOSIS Cartographer 3D client, integrated with elevation data from Viewfinder Panoramas stored as GNOSIS Global Grid data tiles. ESA’s Gaia Sky in Colour in the background.
Ecere 14
Figure 87. GEBCO bathymetry data retrieved from University of Calgary DGGS server as ISEA9R / PYX0 data tiles displayed in Ecere’s GNOSIS Cartographer 3D client, integrated with elevation data from Viewfinder Panoramas stored as GNOSIS Global Grid data tiles. The GEBCO elevation data is used for constructing the 3D terrain mesh as well as for applying an elevation color map style.
Ecere 15
Figure 88. Illustrating the dynamic 3D terrain mesh constructed on the fly based on the variance in elevation.

10.10.6. Additional research and development efforts

This section describes several additional research and development experiments related to an ISEA3H DGGS based on hexagons (and 12 pentagons), on which considerable time and effort was spent but which, in the end, did not contribute directly to the final results, which relied solely on rhombus grids. However, this work led to valuable and interesting insights as well as familiarity with the ISEA3H Discrete Global Grid System. Details on the challenges that led to taking on this extra work are given in the following section on challenges and lessons learned.

10.10.6.1. Building hierarchical hexagonal/pentagonal grids

Since Ecere’s initial understanding, from the pilot kickoff and from the first meetings discussing the DGGS work, was that the client would interact with the ISEA3H DGGS server based on hexagonal and pentagonal zones, the first research & development activities focused on attempting to build a hierarchical DGGS based on the ISEA3H hierarchical topology. These experiments did not use a proper Icosahedral Snyder Equal Area projection, but instead tried to emulate the desired equal area properties of the ISEA3H zones by dividing geodetic distances on the WGS84 ellipsoid, working directly with geodetic latitude and longitudes. As a result, the computed zone edges are not accurate, which is more obvious when superposing grids from several hierarchy levels apart. The icosahedron orientation used in these experiments was the polar orientation, rather than the standard ISEA orientation used in the final TIEs with the University of Calgary DGGS server.

Ecere 16
Figure 89. The dodecahedron (12 pentagonal faces) used as a starting point in these experiments (level 0)
Ecere 17
Figure 90. The icosahedron (20 triangular faces), which is the true solid forming the basis of the ISEA projection and ISEA3H / ISEA9R DGGS. Each vertex is centered in a pentagonal face of the above dodecahedron, mapping ISEA3H level 0 pentagons to ISEA9R level 0 rhombuses (two triangles).
Ecere 18
Figure 91. Level 1 truncated icosahedron (traditional soccer ball pattern: 20 hexagons and 12 pentagons)
Ecere 19
Figure 92. Level 1, 2 and 3 grids displayed together
Ecere 20
Figure 93. Level 4, 5 and 6 grids displayed together
Ecere 20b
Figure 94. Level 3, 5 and 8 grids displayed together
Ecere 20c
Figure 95. Level 3, 4, 5, 6, and 7 grids displayed together
Ecere 21
Figure 96. Level 2, 4 and 6 grids displayed together
Ecere 22
Figure 97. Level 6 grid displayed together with the icosahedron outline. Each ISEA3H level 6 hexagon is centered on a vertex of an ISEA9R level 3 rhombus, resulting from taking a root rhombus made of two triangular icosahedron faces, with 28 ((6/2)^3+1) level 6 hexagons along its edges, and dividing it in 3x3 equally three times.
Ecere 23
Figure 98. A close-up view of 3D globe showing the Blue Marble Next Gen. with three hierarchy levels overlaid
Ecere 24
Figure 99. A close-up view of 3D globe showing the Blue Marble Next Gen. with two hierarchy levels overlaid
Ecere 25
Figure 100. A close-up view of 3D globe showing the Viewfinder Panoramas elevation coverage with three hierarchy levels overlaid
Ecere 26
Figure 101. 3D globe showing the Viewfinder Panoramas elevation coverage with hierarchy level 7 overlaid
Ecere 27
Figure 102. Viewfinder Panoramas elevation coverage around Mt. Fuji with hierarchy level 7 grid overlaid
10.10.6.2. Indexing mechanism based on each zone having a single primary parent (and most zones 3 primary children)

Ecere researched a potential indexing scheme where each child zone has a "primary" parent (centroid child zones have 1 parent, while vertex child zones have 3 parents) and most zones have 3 "primary" children (most zones have 7 children except pentagons which have 6). One advantage of this indexing is that it is hierarchical in nature, and can represent any zone in a single 64-bit identifier all the way down to level 31. A unique zone index key is structured like so as a 64-bit integer:

class ISEA3HKey : uint64
{
public:
   int level:5; // Level 0..31
   int root:5;  // Level 1 root zone (0..11 pentagons; 12..31 hexagons (level 1+))
   uint64 index:54; // Base 3 (0,1,2) * 3^pos
}

where index contains the parent index (itself implying all of its ancestors starting from level 2) multiplied by 3, plus the current child index (0: center child, 1: second primary child, 2: third primary child), each displayed in gray/green, red, and blue in the screenshots below. While deciding how to associate children with parents based on the vertex index order originally seemed trivial at the lower levels (with pentagons centered on both the north and south poles, the two polar pentagons would only be assigned their center child as primary), things become more complicated at higher levels, where interesting patterns become necessary to avoid overlaps (at level 5 and deeper).
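In other words, a child’s digit is appended in base 3 to its parent’s index; a minimal sketch of this step (our own helper, using a standard C integer type rather than the bitfield class above):

#include <stdint.h>

// Appends a child's base 3 digit (0: center child, 1: second primary child,
// 2: third primary child) to the parent's index to form the child's index.
uint64_t isea3hChildIndex(uint64_t parentIndex, int childDigit)
{
   return parentIndex * 3 + childDigit;
}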

Ecere 28
Figure 103. In this screenshot, the most counter-clockwise (1: blue) and most clockwise (2: red) primary children are rendered closer to the center child (0: green) of the primary parent (orange outline).
Ecere 29
Figure 104. In this screenshot, the parent zone vertices are numbered 0-5 counter-clockwise, organized in such a way that the most clockwise primary child (2: red) is always centered on vertex 0, and the most counter-clockwise primary child (1: blue) is always centered on vertex 1. As a general rule, the parent hexagons usually need to be re-oriented so that vertex 5 is southernmost in the southern hemisphere, and northernmost in the northern hemisphere. However, this rule breaks down at higher levels, where a more complicated pattern becomes necessary.
Ecere 30
Figure 105. Level 1 and 2 grids in a plate carrée projection
Ecere 31
Figure 106. Level 1, 2 and 3 grids in a plate carrée projection, missing zone areas due to indexing challenges
Ecere 32
Figure 107. Level 2, 3 and 4 grids in a plate carrée projection, used in attempts to figure out patterns to associate a unique primary parent with a maximum of two primary vertex children with pen and paper
Ecere 33
Figure 108. This screenshot shows the pattern of children at level 6, with level 5 parent hexagons having to be oriented based on whether or not they are within a triangular pattern that repeats five times around the Earth’s longitudes.
Ecere 34
Figure 109. This screenshot shows the pattern of children at level 8.
Ecere 35
Figure 110. This screenshot shows the patterns of children of both level 6 and 8 superposed, showing how the triangular areas align.
Ecere 36
Figure 111. This screenshot shows the pattern of children at level 8, together with the level 9 outlines, the icosahedron, as well as the level 0 pentagons which are centered on those icosahedron vertices (forming a dodecahedron).
Ecere 37
Figure 112. This screenshot shows a different pattern necessary for assigning primary parents at level 9.

10.10.7. Implementing the PYXIS “snowflake fractal” indexing

The PYXIS indexing successfully achieves a similar aim to the indexing Ecere attempted to develop (described above), but relies on the very useful property that every child zone centered on a parent zone’s vertex (Vertex Child) always has exactly one parent that is a Centroid Child. This property is not particularly obvious, since three levels of the hierarchy need to be considered, but it can be used to easily select the primary parent for Vertex Child zones. The descendants of a root pentagon or hexagon zone form an interesting fractal pattern resembling a snowflake.

Ecere 38
Figure 113. PYXIS indexing snowflake fractal pattern for root hexagonal zones, for a single high resolution level

Ecere implemented support for this indexing as a simpler and more reliable approach for uniquely indexing zones, as well as a mechanism to encode zone indices in 64-bit identifiers down to level 24 (with 10 additional levels within the data tiles themselves, resulting in roughly ~6.2 cm / pixel resolution), made of a 5-bit {Level} component, a 5-bit (0-31) {Root} component corresponding to the Level 1 (truncated icosahedron) root zones, and a 54-bit {Index} component computed as follows.

  • All 32 root zones at level 1 have an {Index} of 0.

  • The {Index} of a Centroid Child is its parent’s {Index} multiplied by 8 if that index has any of its 3 least significant bits set (parent’s {Index} & 7 is non-zero), or otherwise its parent’s {Index} plus 7, where

    • index = (parentIndex & 7) ? (parentIndex << 3) : (parentIndex + 7).

  • The {Index} of a Vertex Child is its primary parent’s {Index} (the primary parent itself being a Centroid Child) multiplied by 8 if either that index has any of its 3 least significant bits set or its primary parent’s primary parent (primary grandparent) was also a Centroid Child, plus (1..6) to indicate on which vertex it lies (see the code sketch after this list), where

    • index = ((parentIndex & 7) || (parentZone.parents[0] && parentZone.parents[0].isCentroidChild) ? (parentIndex << 3) : parentIndex) + vertexIndex.
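Putting these two rules together, a minimal sketch in C (the Zone structure and its field names are hypothetical illustrations, not the actual implementation):

#include <stdint.h>
#include <stdbool.h>

// Hypothetical zone record for illustration purposes only.
typedef struct Zone Zone;
struct Zone
{
   uint64_t index;
   bool isCentroidChild;
   Zone * parents[3];   // parents[0] is the primary parent
};

// {Index} of a Centroid Child, per the first rule above.
uint64_t centroidChildIndex(const Zone * parent)
{
   return (parent->index & 7) ? (parent->index << 3) : (parent->index + 7);
}

// {Index} of a Vertex Child on vertex 1..6 of its primary parent,
// per the second rule above.
uint64_t vertexChildIndex(const Zone * parent, int vertexIndex)
{
   bool shift = (parent->index & 7) != 0 ||
      (parent->parents[0] && parent->parents[0]->isCentroidChild);
   return (shift ? (parent->index << 3) : parent->index) + vertexIndex;
}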

Ecere 39
Figure 114. PYXIS indexing snowflake fractal pattern for root hexagonal zones
Ecere 40
Figure 115. PYXIS indexing snowflake fractal pattern for root pentagonal zones
10.10.7.1. Hexagonal zone data packets

One disadvantage of the PYXIS indexing is that the descendants of a root hexagon or pentagon zone do not fully cover the root zone (which results in the fractal edges). This is inconvenient for a DGGS data retrieval API where a client first identifies the list of zones of interest and then requests data for these zones at a higher resolution that should ideally fully cover the requested area.

A solution would be to use the PYXIS indexing for the purpose of identifying unique zones of any level, while devising a data packet encoding which would contain all children zones contained within a requested zone, regardless of whether or not they are part of that root zone’s hierarchy based on the PYXIS indexing, i.e., including children zones within the requested hexagonal zone but outside of the fractal pattern, and excluding children zones within the zone’s fractal pattern but outside of the requested pentagonal zone.

One remaining challenge would be to determine how the values for each zone should be ordered within such a data packet. This could for example be based on a spiral pattern starting in the center of the zone, or based on a particular orientation of the requested zone.

These questions are left as future work for an ISEA3H OGC API - DGGS where actual hexagonal zones, rather than rhombuses, are used as a data transport mechanism.

10.10.8. Challenges and Lessons Learned

The goal of achieving one of the first OGC API - Discrete Global Grid Systems TIEs between different software solutions, each implementing very different Discrete Global Grid Systems prior to the start of this short pilot, presented several challenges.

The most important of these challenges was that an initial server API with which to attempt TIEs, and associated documentation, was only available to Ecere late in the pilot phase. As a result, it was very difficult for Ecere to plan for the best approach in developing its DGGS client component. Significant efforts were spent in different directions, exploring various possibilities that provided valuable insights and familiarity with the research topic, but had no direct impact on the final results demonstration.

A large portion of this additional work related to the complexities of dealing with the hexagons (and 12 pentagons) of the ISEA3H DGGS, which can be avoided completely by benefiting from the contrasting simplicity of rhombus grids. The results of these additional experiments are presented in the previous section.

A lesson to take away from this is that although hexagons offer unique interesting sampling characteristics (the hexagon being the regular polygon tiling the plane closest to a circle), they do add a significant amount of complexity compared to square or rhombus tiles. This is precisely why using the ISEA9R DGGS as a simpler transport mechanism for ISEA3H is an appealing proposition.

Another challenge was that the Icosahedral Snyder Equal Area (ISEA) projection is not currently widely supported by popular geospatial software tools. One of the most widely used of these tools is the PROJ library, which currently only supports the forward projection (from a geodetic latitude and longitude on the WGS84 ellipsoid to the unfolded icosahedron plane), but not the inverse projection (from the unfolded icosahedron plane back to a geodetic latitude and longitude on the WGS84 ellipsoid). As a result, the ability of other tools relying on PROJ, such as GDAL and QGIS, to integrate data from DGGSs based on the ISEA projection is very limited. Several operations, such as QGIS visualization and gdalwarp re-projection of raster data, require the projection transformation to be implemented in both directions.

A lesson learned is that it should not be very difficult to implement the missing reverse transformation in the PROJ library. An implementation of both the forward and reverse projection in the Java programming language under an MIT license, which was used as a starting point for Ecere’s implementation, is available from:

https://github.com/mocnik-science/geogrid/blob/master/src/main/java/org/giscience/utils/geogrid/projections/ISEAProjection.java

A related lesson learned the hard way is that the sphereToIcosahedron() and icosahedronToSphere() functions in this module appear to deal with geocentric, rather than geodetic, latitudes.
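For reference, converting between the two latitude conventions is straightforward for points on the ellipsoid surface; a minimal sketch (our own helper, not part of the module above):

#include <math.h>

// WGS84 first eccentricity squared.
static const double E2 = 0.00669437999014;

// Converts a geodetic latitude (radians) to the geocentric latitude (radians)
// of a point on the WGS84 ellipsoid surface: tan(phiC) = (1 - e^2) * tan(phiG).
double geocentricLatitude(double geodeticLat)
{
   return atan((1.0 - E2) * tan(geodeticLat));
}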

In addition to the lack of support in PROJ, there does not seem to be a well-defined standardized identifier (e.g., EPSG code) to refer to the ISEA projection. This in turn makes it difficult to properly geo-reference ISEA-projected data in a GeoTIFF file. The (currently invalid) http://www.opengis.net/def/crs/OGC/0/1534 identifier was used in a provisional manner for the pilot, but referred to the ISEA projection after applying an affine transformation to define a 2D Tile Matrix Set.

Effective communication with the University of Calgary participants on the complex topic of Discrete Global Grid Systems also initially proved difficult. Different terminology is sometimes used by different researchers and by the OGC DGGS working groups. As an example, what were previously called “DGGS cells” are now called “DGGS zones” by the DGGS working groups, to avoid confusion with the coverage concept of cells. The available documentation on Discrete Global Grid Systems is limited and consists mostly of academic papers, often very dense in content, difficult to understand, and sometimes ambiguous to the reader.

After several regular task-specific DGGS meetings, and a learning experience for both DGGS participants, a valuable common understanding had emerged by the end of the pilot. A lesson learned is that more accessible documentation, focused on practical implementation but grounded in an in-depth understanding of a particular discrete global grid system, would be highly valuable, and is something towards which the pilot participants are in a unique position to contribute based on the experience gained.

Another lesson learned is that the current draft candidate OGC API - DGGS Standard is on solid ground to answer typical DGGS questions of the type “What is here?” and “Where is it?”, and that the candidate OGC API - Processes - Part 3: Workflows and Chaining Standard nicely complements these capabilities by making it possible to describe workflows integrating multiple OGC API data sources and processes, regardless of where they originate.

Finally, the key take-away lesson from this pilot’s DGGS activity is that an axis-aligned and equal-area ISEA9R DGGS can be defined and communicated very simply based on a 2D Tile Matrix Set using a CRS constructed by applying an affine transformation to the Icosahedral Snyder Equal Area (unfolded icosahedron plane) projection. This ISEA9R DGGS can serve as a simple transport mechanism for values corresponding to zones of the ISEA3H DGGS’s even hierarchy levels. It should be easy for several existing geospatial tools to enable support for this transport mechanism once the reverse ISEA projection is implemented in the PROJ library, since the mechanism can be described strictly in terms of the widely supported OGC 2D Tile Matrix Set Standard, allowing data to be accessed through OGC API - Tiles as well as OGC API - DGGS.

11. Challenges and Lessons Learned

The development, testing, and demonstrations performed throughout Phase 3 of the Pilot presented many challenges and lessons learned for all of the Pilot’s participants. While many chapters include lessons learned for their corresponding components, this chapter summarizes and groups them for easier reference.

11.1. Phase 3 User Survey

The need for partnership in the FMSDI is clear. Most participants found their MSDI to be weakest on findability in contrast to the other FAIR principles; in terms of data, the question almost always begins with either an exploratory “What is here?” or a search for “Where is it?”. This may be another indication that Partnership is the main area needing improvement in global initiatives such as the UNGGIM IGIF strategic pathway.

The marine community appears to be an avid user of standards, and the need for interoperability between IHO and OGC standards is once again highlighted. It should be noted that not many use cases explicitly mentioned interoperability between OGC- and IHO-compliant products and datasets, but by implementing interoperability between OGC and IHO it is possible to make more IHO S-100 datasets available to the public.

Although data-oriented use cases made up the majority of the suggested use cases, it was clear that data availability has become less of an issue and that access to analysis is coming into focus. The use cases were mostly focused on real end-user applications.

11.2. Data Availability

Phase 3 of the pilot revealed a lack of data in the subject area, with areas of very sparse coverage. Little real-time data was available, so these data were simulated for the pilot. Participants also noted that there was limited support for data conforming to either S-100 or OGC API models. This may have been due, in part, to the relatively recent publication and adoption of these specifications.

11.3. Arctic Voyage Planning Guide

The Arctic Voyage Planning Guide provided a significant amount of data through much of the Arctic region but only a small subset of the layers contained data in the subject area. This provided great challenges in developing a scenario and hence simulated data had to be used for some of the themes.

A list of other AVPG challenges encountered during the pilot is as follows.

  1. For theme 6, it is not clear what some feature layer names refer to (e.g., FAA_JSONToFeatures).

  2. For theme 4, it would have been preferable if a limits and boundaries service in the S-121 format could have been located.

  3. In Theme 3, a few layer names were made more descriptive for lay mariners; these include METAR, AIS_JasonToFeatures, Marine Transportation (x2), ahari_2020, etc.

11.4. Land / Sea Interface

One of the main topics of the pilot was “Land / Sea” interface integration. In the area where the land and sea domains connect, it is crucial for datasets to meet and integrate effectively. However, datasets from these two domains do not always achieve this, particularly when data has been captured at different times, to different standards, or to varying levels of precision and accuracy. This was certainly exacerbated during this Pilot, as data was obtained from a wide range of data providers.

There were many challenges experienced here. The use cases examined in the search / rescue scenario determined that land / sea challenges fall into three broad categories.

  1. Mismatches of Coordinate Reference Systems (CRS)/Datums. Frequently (and with good reason) IHO marine geospatial data is given against multiple vertical datums, with a sounding datum (for Low Water) and a High Water datum for measuring heights/elevations. Most frequently the horizontal datum is WGS84, as the vast majority of marine use cases are focused on navigation systems, which are always aligned to WGS84.

  2. Mismatches of modeling. Land-based data frequently uses feature models and attribution which are specific to the country, region, or some administrative sub-division. Marine geospatial data under S-100, by contrast, is based on globally agreed definitions and uses structures prescribed by the framework. This then requires a semantic mapping between the features. No concrete methodology for mapping between S-100 and other domain models currently exists (nor between S-100 models, although this is under consideration).

  3. Mismatches of scale. Marine geospatial data is often at varying scales due to the costs of acquisition. By contrast, land data is normally at a homogeneous scale for a region. This leads to geometry discrepancies between the features, which require harmonization for land/sea interoperability to be effected.

No concrete methodology has been published for resolving these three different aspects of land/sea interoperability but one could be constructed using existing best practices.

11.5. DDIL Environments

Several participants noted that the lack of connectivity in the Arctic environment had a detrimental effect on data collection, dramatically reducing both the variety and amount of appropriate data in times of emergency. This had an impact on informed decision making for the various actors (ship’s captain, first responders, etc.) within the scenarios and sub-scenarios.

11.6. OGC Standards

11.6.1. Using OGC API - Features to Serve Federated Marine Data

API extracts data from the source: The most attractive element of the API model for data producers, particularly of marine data, is retrieval from the authoritative source of the data. This obviously poses questions in the navigational context, most notably: how can data producers be assured that the data is being used in the correct context, and should any attempts be made to standardize data import to the bridge in such API formats?

11.6.2. Accessing Data Through OGC API - Features

  • For this Phase, one participant compared the use of OGC WMS and OGC API - Features using the same data sources. OGC WMS provided the benefit of uniform styling and legends across all clients, as the images provided were rendered and served from the same server. However, the images can be quite large and thus cause performance issues in bandwidth-constrained environments. The smaller size of the JSON responses from OGC APIs may provide much faster response times.

  • Security: The issue of security (authentication and authorization) has not been explored to its full extent. Although the client may authenticate to a service to allow its usage, there does not appear to be a formal means to restrict the authorization of the client to use a particular service capability or collections endpoint.

  • The lack of styling rules and legends within OGC API - Features detracts from the end-user experience. It made it difficult to determine visually what the intent of the data was when first viewed. Without styling and legends, it was difficult to put the data in the right context for the end user.

11.6.3. Feature Styling in OGC API - Features

OGC API - Features does not contain any style information. This might not be as simple as having the source service provide the style because different countries and organizations could have different standards or priorities when visualizing the data. Some possible solutions are listed below.

  • Standardize the data fields and then indicate the type of data associated with that standard at the collection level. The client can then use predefined styling rules based on the provided data type.

  • Provide styling rules with the data for the client to use.

11.6.4. DGGS: Using Draft OGC API - DGGS

The goal of achieving one of the first OGC API - Discrete Global Grid Systems TIEs between different software solutions, each implementing very different Discrete Global Grid Systems prior to the start of this short pilot, presented several challenges.

The most important of these challenges was that an initial server API with which to attempt TIEs, and associated documentation, was only available to Ecere late in the pilot phase. As a result, it was very difficult for Ecere to plan for the best approach in developing its DGGS client component. Significant efforts were spent exploring various possibilities that provided valuable insights and familiarity with the DGGS, but had no direct impact on the final results demonstration.

A lesson to take away from this is that although hexagons offer unique interesting sampling characteristics (the hexagon being the regular polygon tiling the plane closest to a circle), they do add a significant amount of complexity compared to square or rhombus tiles. This is precisely why using the ISEA9R DGGS as a simpler transport mechanism for ISEA3H is an appealing proposition.

Early in the pilot it was decided to encode DGGS data into a GeoTIFF format. During the process, it was determined that the rhombus tiling schema can be constructed to conform with the OGC API - Tiles standard, and can be described in a mechanism suitable for GDAL and popular GIS tools, so that other non-DGGS clients can access these data tiles.

Another challenge was that the Icosahedral Snyder Equal Area (ISEA) projection is not currently widely supported by popular geospatial software tools. One of these most widely used tools is the PROJ library, which currently only supports the forward projection (from a geodetic latitude, longitude on the WGS84 ellipsoid to the unfolded icosahedron plane), but not the inverse projection (from the unfolded icosahedron plane to a geodetic latitude, longitude on the WGS84 ellipsoid).

A lesson learned is that it should not be very difficult to implement the missing reverse transformation in the PROJ library. An implementation of both the forward and reverse projection in the Java programming language under an MIT license, which was used as a starting point for Ecere’s implementation, is available from:

https://github.com/mocnik-science/geogrid/blob/master/src/main/java/org/giscience/utils/geogrid/projections/ISEAProjection.java

Related to this is that the sphereToIcosahedron() and icosahedronToSphere() functions in this module appear to deal with geocentric latitudes, rather than geodetic latitudes.

A lesson learned is that more accessible documentation, focused on practical implementation but with an in-depth understanding of a particular discrete global grid system, would be highly valuable, and something towards which the pilot participants would be in a unique position to contribute based on the experience gained.

There were DGGS API Server limitations. While a process-based analysis is desirable, the current structure of the DGGS Server has focused on the discovery-based aspects of DGGS capabilities. The implemented OGC API - DGGS does expose some of this functionality, but it assumes the client has detailed knowledge of the DGGS geometry, including tessellations, refinement, indexing / encoding, etc. The DGGS API implemented only allows access to DGGS data through the PYX0 format and static PNGs.

Another challenge while implementing the OGC API - DGGS was the lack of clear documentation on the OGC website. If two approaches are still being discussed for the OGC API - DGGS, this should be clearly stated in the documentation. If a decision has not yet been made, the draft should not be published on the website, or it should include a warning.

A key take-away lesson from this pilot’s DGGS activity is that an axis-aligned and equal-area ISEA9R DGGS can be defined and communicated very simply based on a 2D Tile Matrix Set using a CRS constructed by applying an affine transformation to the Icosahedral Snyder Equal Area (unfolded icosahedron plane) projection. This ISEA9R DGGS can serve as a simple transport mechanism for values corresponding to zones of the ISEA3H DGGS’s even hierarchy levels. It should be easy for several existing geospatial tools to enable support for this transport mechanism once the reverse ISEA projection is implemented in the PROJ library, since the mechanism can be described strictly in terms of the widely supported OGC 2D Tile Matrix Set Standard, allowing data to be accessed through OGC API - Tiles as well as OGC API - DGGS.

This phase of the pilot has shown that the draft candidate OGC API - DGGS standard is on solid ground with regards to answering typical DGGS questions of “What is here?” and “Where is it?”, and that the candidate OGC API - Processes - Part 3: Workflows and Chaining Standard nicely complements these capabilities by making it possible to describe workflows integrating multiple OGC API data sources and processes, regardless of where they originate.

11.7. IHO Standards

The pilot demonstrated how S-100’s General Feature Model (GFM) can represent multiple different datasets, for different purposes, in a search/rescue context. The ability to integrate such APIs together to form a common endpoint for users, and the ability for users to ingest OGC API endpoints, are high priorities. A goal of the pilot was to show how S-100 data can be reused and encoded as OGC APIs, and its potential in a broader use case scenario.

Results of the interoperability and implementation were very mixed.

For some data types, substantial coverage exists and standards are in place for them to be expressed using S-100: e.g., charts, S-421 routes, spot depths, aids to navigation, and regulatory information all exist, even though some are still in publication form.

However, there was a lack of any systematic S-100 data approach for the area of interest, not unusual given that S-100 implementation is in its infancy.

The S-100 data and API endpoints created proved effective, with clients able to quickly use the data and understand its source and intended usage. This allowed a number of aspects of the scenario to be explored and identified areas where data was deficient.

12. Recommendations and Future Work

The recommendations and future work presented here are built upon findings that emerged from the development and testing of a variety of servers and clients in an emergency response scenario. While many previous chapters include recommendations and future work suggestions, this chapter summarizes and groups them for easier reference.

The results of this Phase of the Pilot form an important step in the evolution of a Marine SDI. These are summarized in the following sections.

12.1. Data Availability

There was an overall lack of data and limited support for data conforming to either S-100 or OGC API models. This may have been due, in part, to the relatively recent publication and adoption of these specifications. There was improved support by providers for OGC Web services. Both issues point towards the need to improve ease of implementation for these specifications. It is recommended that, moving forward, developing and publishing implementation guides be part of future Marine pilot activities.

It is also recommended that future scenarios occur in an AOI where more datasets are available, either over a larger geographic extent or across several smaller AOIs tied to a master scenario.

The implementation of these recommendations would be highly dependent on the goals of future pilot activities, i.e., whether the goal is to determine data availability or determine interoperability and use of open standards.

12.2. Mismatches in the Land/Sea Interface

The pilot presented three broad categories of land/sea mismatches with no concrete methodology published for resolving interoperability issues. Future phases could look into how one could be constructed using existing best practices and address the mismatches as follows.

  • Coordinate Reference Systems (CRS)/Datums: Ensuring data is located on common datums (horizontal/vertical).

  • Modeling: Interoperable content and geometry models (either the same, or able to be transformed) - mapping between S-100 feature models and other domain models.

  • Scale: Requires harmonization for land/sea interoperability. Primarily achieved through institutional arrangements and management practices which promote interoperability (these are mostly non-technical issues).

12.3. Arctic Voyage Planning Guide

The Arctic Voyage Planning Guide provided a significant amount of data through much of the Arctic region but only a small subset of the layers contained data in the subject area. This provided great challenges in developing a scenario and hence simulated data had to be used.

As this is a developing framework, continued investigation of effective use of the AVPG in future Phases of FMSDI is recommended.

12.4. IHO S-100 Standards

For the FMSDI Pilot Phase 3 scenario, the search and rescue responses demonstrated a strong role for interoperable content and geometry models (either the same, or able to be transformed). Such transformation frameworks remain to be developed at the S-100 content level and the OGC transport level, and it is recommended that this be pursued further in future phases.

There is now a first draft of S-100 GFM data expressed in a JSON encoding, enabling many services to reuse existing structures and interoperate with open-standards tooling. Transformation of content, methods for aggregation, and common OGC API - Records metadata would enhance this greatly. It is recommended that such efforts be continued in future phases.

12.5. DDIL Environments

Given the challenging connectivity in the Arctic environment, all the more important when dealing with emergency and disaster situations, further investigation is recommended into how to optimize the retrieval and storage of marine and terrestrial feature collections as a GeoPackage using supported OGC and IHO file encoding standards. A minimal sketch of such an offline caching step follows.
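The sketch below assumes a hypothetical OGC API - Features endpoint and uses the requests and geopandas Python libraries to cache a feature collection as a GeoPackage before connectivity is lost. The server URL, collection name, and layer name are assumptions for illustration.

import requests
import geopandas as gpd

# Hypothetical OGC API - Features items endpoint; any conformant server works.
url = "https://example.org/ogcapi/collections/S-122_MPA/items"
params = {"f": "json", "limit": 1000}

features = []
while url:
    page = requests.get(url, params=params, timeout=30).json()
    features.extend(page.get("features", []))
    # Follow the "next" link for paged results, if the server provides one.
    url = next((link["href"] for link in page.get("links", [])
                if link.get("rel") == "next"), None)
    params = {}  # a "next" link already carries its own query parameters

# Write the cached features to a GeoPackage for disconnected (DDIL) use.
gdf = gpd.GeoDataFrame.from_features(features, crs="EPSG:4326")
gdf.to_file("offline_cache.gpkg", layer="s122_mpa", driver="GPKG")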

12.6. Using Vector Tiles

Vector tiles deliver data in predefined tiles, which allows small amounts of data to be sent to the client, an approach that has previously been proven to work in DDIL environments. Some of the potential benefits of vector tiles are as follows.

  • Efficiently storing, delivering and visualizing vector data from large datasets (such as S-122)

  • Varying levels of data ‘resolution’

  • Efficient caching of data

  • Providing clients with a hierarchy of available data, while awaiting requests for higher resolution tiles

  • Using established techniques and APIs (OGC API - Tiles, OGC API - Features, OGC WMTS)

Due to the many benefits that vector tiles offer, especially for users operating in a DDIL environment, it is also recommended that future work explore using vector tiles for marine data; a simple request sketch follows.
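As a simple illustration, the sketch below retrieves a single vector tile using the OGC API - Tiles path template. The server URL, collection name, and tile indices are illustrative assumptions.

import requests

base = "https://example.org/ogcapi"  # hypothetical endpoint
collection, tms = "S-122_MPA", "WebMercatorQuad"
z, row, col = 6, 18, 2  # low zoom levels keep payloads small on DDIL links

# OGC API - Tiles path template:
# /collections/{collectionId}/tiles/{tileMatrixSetId}/{tileMatrix}/{tileRow}/{tileCol}
url = f"{base}/collections/{collection}/tiles/{tms}/{z}/{row}/{col}"
resp = requests.get(url,
                    headers={"Accept": "application/vnd.mapbox-vector-tile"},
                    timeout=30)
resp.raise_for_status()

# Cache the Mapbox Vector Tile payload locally for offline use.
with open(f"{collection}_{z}_{row}_{col}.mvt", "wb") as f:
    f.write(resp.content)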

12.7. Further Explore OGC API - DGGS

Sufficient capabilities exist within the OGC API - DGGS draft standard to take advantage of the potential of DGGS. However, a more efficient method is required for the client application to encode and decode DGGS cell locations, where the general case involves millions of DGGS cells. It is therefore recommended that future work address the challenge of representing the unique DGGS geometric representation in a way that is sufficient for a naïve client application.

More accessible documentation, focused on practical implementation but grounded in an in-depth understanding of a particular discrete global grid system, would be highly valuable, and the pilot participants are in a unique position to contribute to it based on the experience gained. However, two approaches appear to be under discussion for OGC API - DGGS, and the documentation should clearly state that this situation exists; it may be advisable to include a warning.

It is also recommended that future efforts should be made to encode the pipeline concept of the DGGS server into the OGC API - Processes workflow and explore the capabilities of this implementation in relation to DGGS processes and operations.

During the course of the project it was discovered that a rhombus tile system conforms to the OGC API - Tiles standard. It is therefore recommended that future work focus on implementing the Inverse Snyder Projection in the proj4 library and on proposing a standardized identifier for this projection. This would allow non-DGGS clients to access the data tiles and utilize DGGS to its full potential. In addition, more types of integrated data applications that make use of machine learning and data mining algorithms, Jupyter notebooks, and common GIS applications could begin benefiting from DGGS capabilities.

Another area to explore is data transmission at several resolutions at once. At the moment, the DGGS server only serves data from a particular rhombus tile at a given resolution, and the current draft of OGC API - DGGS does not mention or make recommendations on this topic. A depth parameter could be added to the request, indicating the number of resolutions for which the client wants data. This would result in a hierarchical tile of data; a speculative sketch of such a request follows.
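The sketch below is purely speculative: neither the endpoint nor the depth parameter exists in the current draft standard, and the zone identifier format is a hypothetical placeholder.

import requests

base = "https://example.org/dggs-api"  # hypothetical DGGS server endpoint

# Hypothetical request: the data for a parent zone at resolution 7 plus its
# descendants two resolutions deeper, returned as one hierarchical tile.
params = {
    "zone": "R7-0123456",  # hypothetical rhombus zone identifier
    "depth": 2,            # proposed parameter: number of child resolutions
}
resp = requests.get(f"{base}/data", params=params, timeout=30)
hierarchical_tile = resp.json()  # parent data plus the two finer resolutions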

A significant outcome of this Phase of the pilot was the establishment of the DGGS server as a repository for connecting and using heterogeneous data relevant to Arctic and marine spatial analysis. This server could be useful for testbed, pilot, and hackathon-type sandbox activities in the future.

12.8. Participant Demonstration Videos

To view individual videos of the accomplishments of the seven participants that led to many of these recommendations, please see the OGC YouTube Channel.

Appendix A: Data & Services

A.2. Other data sources that may be useful to the pilot

  • Arctic SDI GeoPortal - 8-country international Arctic mapping (Information). The Arctic SDI Factsheet.

  • Conservation of Arctic Flora and Fauna (CAFF) - Biodiversity working group of the Arctic Council (Additional Info)

  • Ecological and Biological Significance Areas in the Arctic Marine Environment - Scripps Institution of Oceanography, University of California, San Diego

  • Arctic Portal - Private sector

  • International Bathymetric Chart of the Arctic Ocean - International effort (IHO & IOC-UNESCO)

  • Polar Geospatial Center (PGC) - University of Minnesota (incl. ArcticDEM - NGA Sponsored)

  • Alaska Division of Geological & Geophysical Surveys (DGGS) - State of Alaska

  • Alaska Ocean Observing System (AOOS) - one of 11 regional systems across the US coast that represent NOAA’s Integrated Ocean Observing System (IOOS)

  • Alaska Science Center - https://www.usgs.gov/centers/alaska-science-c

A.3. Data and Services Discovered and Accessed by Participants

A.3.1. ESRI Canada

Public web services for the project area of interest in Kotzebue Bay, Alaska, discovered for the FMSDI project, are as follows.

From the NWS Integrated Dissemination Program (IDP) GIS Services Showcase - https://noaa.maps.arcgis.com/apps/MapSeries/index.html?appid=cf93575e5535467199da7358ee6c825c

  • Satellite Loop – True Colour (under the right-most tab) - https://noaa.maps.arcgis.com/apps/MapSeries/index.html?appid=cf93575e5535467199da7358ee6c825c

  • NOAA - Marine Protected Areas - WMS - https://idpgis.ncep.noaa.gov/arcgis/rest/services/NOAA/MPA_Inventory/MapServer

  • NMFS - Essential Fish Habitat (EFH) MapperData - WMS & WFS - https://idpgis.ncep.noaa.gov/arcgis/rest/services/NMFS/EFH_MapperData/MapServer

  • National Ocean Service Observations/CO_OPS - Web Map - https://tidesandcurrents.noaa.gov/map/index.html?region=Alaska

  • National Ocean Service Observations/CO_OPS - WMS & WFS - https://idpgis.ncep.noaa.gov/arcgis/rest/services/NOS_Observations/CO_OPS_Products/MapServer

  • NWS Climate Outlooks - cpc_6_10_day_outlk - WMS & WFS - https://idpgis.ncep.noaa.gov/arcgis/rest/services/NWS_Climate_Outlooks/cpc_6_10_day_outlk/MapServer

  • NWS Climate Outlooks - cpc_8_14_day_outlk - WMS & WFS - https://idpgis.ncep.noaa.gov/arcgis/rest/services/NWS_Climate_Outlooks/cpc_8_14_day_outlk/MapServer

  • NWS Climate Outlooks - cpc_gfs_precip_anom - WMS - https://idpgis.ncep.noaa.gov/arcgis/rest/services/NWS_Climate_Outlooks/cpc_gfs_precip_anom/MapServer

  • NWS Climate Outlooks - cpc_mthly_precip_outlk - WMS & WFS - https://idpgis.ncep.noaa.gov/arcgis/rest/services/NWS_Climate_Outlooks/cpc_mthly_precip_outlk/MapServer

  • NWS Climate Outlooks - cpc_mthly_temp_outlk - WMS & WFS - https://idpgis.ncep.noaa.gov/arcgis/rest/services/NWS_Climate_Outlooks/cpc_mthly_temp_outlk/MapServer

  • NWS Climate Outlooks - cpc_sea_precip_outlk - WMS & WFS - https://idpgis.ncep.noaa.gov/arcgis/rest/services/NWS_Climate_Outlooks/cpc_sea_precip_outlk/MapServer

  • NWS Climate Outlooks - cpc_sea_temp_outlk - WMS & WFS - https://idpgis.ncep.noaa.gov/arcgis/rest/services/NWS_Climate_Outlooks/cpc_sea_temp_outlk/MapServer

  • NWS Climate Outlooks - cpc_weather_hazards - WMS & WFS - https://idpgis.ncep.noaa.gov/arcgis/rest/services/NWS_Climate_Outlooks/cpc_weather_hazards/MapServer

  • NWS Observations - ahps_riv_gauges - WMS & WFS - https://idpgis.ncep.noaa.gov/arcgis/rest/services/NWS_Observations/ahps_riv_gauges/MapServer

  • NWS Observations - radar_base_reflectivity - WMS - https://idpgis.ncep.noaa.gov/arcgis/rest/services/NWS_Observations/radar_base_reflectivity/MapServer

From the NOAA NowCOAST Web Mapping Portal - https://nowcoast.noaa.gov/

  • NowCOAST - WMS & ArcGIS REST - https://nowcoast.noaa.gov/

From the State of Alaska Open Data Portal - https://gis.data.alaska.gov/pages/imagery

  • Alaska High Resolution Imagery (RGB) (0.5 m) - WMTS - https://geoportal.alaska.gov/arcgis/rest/services/ahri_2020_rgb_cache/MapServer/WMTS/1.0.0/WMTSCapabilities.xml

  • Alaska High Resolution Imagery (RGB) (0.5 m) - WMS - https://geoportal.alaska.gov/arcgis/services/ahri_2020_rgb_cache/MapServer/WMSServer?request=GetCapabilities&service=WMS

  • Alaska High Resolution Imagery (CIR) (0.5 m) - WMTS - https://geoportal.alaska.gov/arcgis/rest/services/ahri_2020_rgb_cache/MapServer/WMTS/1.0.0/WMTSCapabilities.xml

  • Alaska High Resolution Imagery (CIR) (0.5 m) - WMS - https://geoportal.alaska.gov/arcgis/services/ahri_2020_rgb_cache/MapServer/WMSServer?request=GetCapabilities&service=WMS

  • SPOT5 Orthomosaic (RGB) (2.5 m) - WMTS - https://geoportal.alaska.gov/arcgis/rest/services/ahri_2020_rgb_cache/MapServer/WMTS/1.0.0/WMTSCapabilities.xml

From the Marine Cadastre National Viewer - https://marinecadastre.gov/nationalviewer/

  • 12NM Territorial Sea - WMS - https://maritimeboundaries.noaa.gov/arcgis/services/MaritimeBoundaries/US_Maritime_Limits_Boundaries/MapServer/WMSServer?request=GetCapabilities&service=WMS

  • 24NM Contiguous Zone - WMS - https://maritimeboundaries.noaa.gov/arcgis/services/MaritimeBoundaries/US_Maritime_Limits_Boundaries/MapServer/WMSServer?request=GetCapabilities&service=WMS

  • Bureau of Ocean Energy Management - Oil and Gas Planning Areas - WMS - https://gis.boem.gov/arcgis/services/BOEM_BSEE/MMC_Layers/MapServer/WMSServer?request=GetCapabilities&service=WMS

  • Bureau of Ocean Energy Management - Oil and Gas Resource Potential - WMS - https://gis.boem.gov/arcgis/services/BOEM_BSEE/MMC_Layers/MapServer/WMSServer?request=GetCapabilities&service=WMS

  • Essential Fish Habitat (EFH) - WMS - https://idpgis.ncep.noaa.gov/arcgis/services/NMFS/EFH_Only/MapServer/WMSServer?request=GetCapabilities&service=WMS

  • Federal and State Waters - WMTS - https://coast.noaa.gov/arcgis/rest/services/OceanReports/FederalAndStateWaters/MapServer/WMTS/1.0.0/WMTSCapabilities.xml

  • Limit of OCSLA '8(g)' zone - WMS - https://gis.boem.gov/arcgis/services/BOEM_BSEE/MMC_Layers/MapServer/WMSServer?request=GetCapabilities&service=WMS

  • Submerged Lands Act Boundary - WMS - https://gis.boem.gov/arcgis/services/BOEM_BSEE/MMC_Layers/MapServer/WMSServer?request=GetCapabilities&service=WMS

  • FEMA Regional Headquarters - WMS - https://gis.fema.gov/arcgis/services/FEMA/RegHQs/MapServer/WMSServer?request=GetCapabilities&service=WMS

  • Numerous additional FEMA layers (ESRI REST services only) - https://gis.fema.gov/arcgis/rest/services/FEMA

  • NOAA Charttools - https://gis.charttools.noaa.gov/arcgis/rest/services

From the Marineregions.org Portal - https://www.marineregions.org/downloads.php

  • Marineregions.org downloads - https://www.marineregions.org/downloads.php

  • Esri ArcGIS Maritime Server - ArcGIS REST - https://enterprise.arcgis.com/en/server/10.8/publish-services/windows/what-is-a-maritime-chart-service.htm

  • OceanWise - WMS - https://www.oceanwise.eu/data/enc-wms/

  • Lifewatch Metadata Catalog - https://metadatacatalogue.lifewatch.eu/srv/api/records/5ec6b072-9e83-433b-b293-e0f75a714e34

A.3.2. Ecere

Ecere visualized, in its GNOSIS Cartographer 3D client, the results of a coastal erosion workflow from the University of Calgary’s DGGS server, which integrated four datasets:

In addition, Ecere also used the following datasets for its visualization demonstration:

Appendix B: Arctic Voyage Planning Guide

B.1. A Canadian Implementation

Arctic Voyage Planning Guide (Canada)

FOR INFORMATION ONLY - NOT APPLICABLE TO THE FMSDI AOI; MANY OF THE LINKS DO NOT WORK.

Carriage Requirements:

Nautical Charts

The Canadian Hydrographic Service (CHS) publishes almost 1,000 paper charts, which are described in a series of four free chart catalogs covering the Pacific Coast, Central Canada, the Atlantic Coast, and the Arctic. These catalogs are available from CHS chart dealers, or CHS can be contacted directly for a free catalog. Although CHS does not sell charts directly to the public, it distributes to over 800 dealers across Canada and around the world.

At present, less than 10% of Arctic waters are surveyed to modern standards. In addition, the mariner must be aware of the horizontal datum used for the chart. GPS positions can only be plotted directly on NAD 83 (equivalent to WGS 84) charts. For charts with other datums, the appropriate correction must be applied. Some Arctic charts do not have a reference datum and therefore no available corrections. In such cases, alternative sources of positional information should be used such as radar and visual lines of position when possible. It is always recommended that more than one means be used to fix a position.

As always, mariners must use up-to-date nautical charts and nautical publications to plan each voyage. This includes making use of annual and monthly Notices to Mariners and northern Canada Sailing Directions. Of particular note, given the challenges in Canada’s northern waters of charting, confirming chart anomalies, and servicing aids to navigation, mariners must ensure that all Notices to Shipping (broadcast and written) and NAVAREA warnings that are in force in the area are taken into account.

Sailing Directions

Sailing Directions are the indispensable companions to charts. A great tool for planning and assisting in navigation, Sailing Directions provide information beyond that which can be shown on a chart.

Tides and Current Tables

The Tides, Currents, and Water Levels publication provides predicted times and heights of high and low waters, and the hourly water levels for over seven hundred stations in Canada. The printed version is published yearly and is available through the authorized chart dealers.

List of Lights, Buoys and Fog Signals

Provides key information about Canadian Coast Guard approved and managed lights, buoys and fog signals, including position, characteristics, height and reference charts.

Radio Aids to Marine Navigation (RAMN)

The main purpose of RAMN is to present information on radio communications and radio navigational aids services provided in Canada by the Canadian Coast Guard. Radio facilities of other government agencies that contribute to the safety of ships in Canadian waters are also included.

Ice Navigation in Canadian Waters

Ice Navigation in Canadian Waters published by the Department of Fisheries and Oceans, Canadian Coast Guard, provides important information to ships operating in ice in all Canadian waters, including the Arctic. This document provides Masters and watchkeeping crew of vessels transiting Canadian ice-covered waters with the necessary information to achieve an understanding of the hazards, navigation techniques, and response of the vessel.

Annual Notice to Mariners (NOTMAR)

The Canadian Coast Guard (CCG) Notices to Mariners (NOTMAR) publication provides mariners the necessary information to update charts and nautical publications. It advises mariners of new initiatives, services and also important announcements concerning the maritime community.

Notice to Shipping (NOTSHIP)

The Canadian Coast Guard (CCG) issues Notices to Shipping (NOTSHIP) to inform mariners about hazards to navigation and to share other important information. NOTSHIP alerts are broadcast by radio by Marine Communications and Traffic Services (MCTS). Written NOTSHIP alerts are issued when the location of the hazard is beyond broadcast range, or when the information remains in effect for an extended period of time.

Legislation and Regulation

Acts and Regulations

The Canada Shipping Act (CSA) 2001 is Canada’s principal legislation for shipping. It applies in all Canadian waters, including the Arctic. The Marine Liability Act (MLA) makes the owners and/or operators of vessels responsible and liable for their vessels and the consequences of their operations. The Marine Transportation Security Act (MTSA) provides for the security of marine transportation. It applies to ships and marine facilities in Canada, and to Canadian ships outside of Canada. The Navigable Waters Protection Act (NWPA) protects navigation from being impeded or made more dangerous, and it regulates ferry cables and drawbridges.

Northern Canada Vessel Traffic Service Zone (NORDREG)

The Northern Canada Vessel Traffic Services Zone Regulations formally establish the Northern Canada Vessel Traffic Services (NORDREG) Zone and, consistent with international law regarding ice-covered areas, implement the requirements for vessels to report information prior to entering, while operating within and upon exiting Canada’s northern waters.

Arctic Shipping Pollution Prevention Regulations

Navigation in coastal waters within Canadian jurisdiction north of latitude 60°N is governed by the Arctic Shipping Pollution Prevention Regulations (ASPPR), under the Arctic Waters Pollution Prevention Act. The ASPPR deal with the construction of ships (certain construction requirements for different navigation zones); bunkering stations; Arctic Pollution Prevention Certificates; Ice Navigator issues (any vessel planning to use the Arctic Ice Regime Shipping System and every tanker must have a qualified Ice Navigator on board); fuel and water concerns (enough of both on board before entering a zone); sewage deposit and oil deposit mishaps (unavoidable deposit only, that is, to save a life; or from damage to a ship from stranding, collision, or foundering if all reasonable precautions were taken).

IMO Guidelines for Ships Operating in Polar Waters

Ice poses serious danger to ships. The International Maritime Organization (IMO) Guidelines seek to minimize incidents and to prevent loss of life and property during ship operations in ice-covered waters.

Appendix C: The S-100 GFM (Geo)JSON format

The format of data returned from the API is a JSON-rendered version of S-131 “data”. The format is described in this appendix.

C.1. Overall scheme.

The JSON format consists of the following elements, dealt with separately in the subsequent sections.

  1. Collection Object. A single JSON object consisting of one or more features (see the note below on the meaning of “feature” in this context). Each of these consists of the following.

    1. Feature header information

    2. Properties

    3. Geometry

    4. Relationships.

C.1.1. Collection Object.

The collection object uses the GeoJSON “FeatureCollection” structure and consists of no other properties or attributes, merely a (possibly empty) array of feature objects. These feature objects represent either S-100 GFM feature types or information types. (“Feature” in this sense is the GeoJSON definition of “feature”, not the S-100 GFM definition; where the difference is not obvious in context it is defined accordingly.) Example:

{
  "type": "FeatureCollection",
  "features": []
}

Currently, no dataset metadata is included in the FeatureCollection; this could be implemented and, for simplicity, should be modeled on the S-100 Part 10b encoding. The FeatureCollection acts only as an aggregation operator for S-100 features and information types, so there is no requirement for it where only a single feature (or information type) is being encoded.

C.1.2. Feature Objects

An example of an individual feature object is shown below:

{
  "type": "Feature",
  "id": "fiho.s100.S131.Terminal.0179",
  "properties": {
    "featureName": {
      "name": "Port of Rauma - South section - Container-terminal"
    }
  },
  "geometry": {
    "type": "Point",
    "coordinates": [
      21.4342,
      61.1298
    ]
  }
}

C.1.3. Header

The feature object contains the following sub-objects.

  1. The “type” – this is labeled “Feature” to conform to the GeoJSON specification.

  2. “id” – a unique identifier for the feature itself. By convention this can be included either alongside the feature properties or as a separate property of the feature itself. There are pros and cons to either option, including compatibility with some third-party systems and the OGC API - Features specification.

  3. “properties” – this sub-object contains the S-100 thematic attribute mappings (as described later).

  4. “geometry” – this sub-object contains the S-100 geometry applicable to this feature. Note that no support for topology is included in this iteration of the specification. Topology (specifically, the topology supported under S-100 Part 7) could be supported using a methodology like TopoJSON, but for simplicity the initial release of the GFM mapping uses simple, inline geometry. Geometry is further described later.

  5. Relationships. The implementation of S-100 relationships is described later.

As yet, no formal mapping from S-100 Feature Catalogs to JSON Schemas is defined. This is an ongoing work in progress but, once complete, it should provide a normative method for deriving a schema against which S-100 JSON data can be validated. A speculative sketch of what such a derived schema might look like is shown below.
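The following sketch shows a hand-written, non-normative JSON Schema fragment for the Terminal feature above and validates an instance against it using the Python jsonschema library. The schema content is an illustrative assumption, not the product of any defined Feature Catalogue mapping.

import jsonschema

# Illustrative schema only; a real schema would be derived from the
# S-131 Feature Catalogue once a normative mapping is defined.
terminal_schema = {
    "type": "object",
    "required": ["type", "id", "properties", "geometry"],
    "properties": {
        "type": {"const": "Feature"},
        "id": {"type": "string"},
        "properties": {
            "type": "object",
            "properties": {
                # Attribute codes would come from the feature catalogue.
                "featureName": {
                    "type": "object",
                    "properties": {"name": {"type": "string"}},
                },
            },
        },
        "geometry": {"type": "object"},
    },
}

feature = {
    "type": "Feature",
    "id": "fiho.s100.S131.Terminal.0179",
    "properties": {
        "featureName": {
            "name": "Port of Rauma - South section - Container-terminal"
        }
    },
    "geometry": {"type": "Point", "coordinates": [21.4342, 61.1298]},
}

jsonschema.validate(instance=feature, schema=terminal_schema)  # raises on invalid data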

C.1.4. Properties

The properties sub-object contains the thematic attribution for each feature. This is a direct mapping from the names and values defined by the S-100 feature catalog. Only defined attributes are included and they are referenced by the Feature Catalog “code” value, according to the feature catalog defined for the product specification (and defined by the S-100 XML Schemas for feature catalogs). This is similar in structure to the S-100 Part 10b GML encoding.

No ordering is imposed on the JSON data, unlike the GML Schemas defined under Part 10b.

Sub-Attributes are defined as sub-objects, for example:

{
  "type": "FeatureCollection",
  "features": [
    {
      "type": "Feature",
      "properties": {
        "communicationChannel": [
          "B20",
          "B14"
        ],
        "fixedDateRange": {
          "endDate": "2021-06-01",
          "startDate": "2021-05-01"
        },
        "orientation": "25.6",
        "periodicDateRange": {
          "endDate": "2021-01-01"
        }
      }
    }
  ]
}

Here the sub-attribute endDate (defined by the feature catalog) is implemented as a sub-object of the fixedDateRange attribute (also defined by the feature catalog schema).

Where the feature catalog allows for multiplicity > 1, the individual values are arranged in a JSON array, for example:

      "type": "Feature",
      "properties": {
        "communicationChannel": [
          "B20",
          "B14"
        ],

Here communicationChannel has multiple values (as the feature catalog allows a multiplicity > 1) and so the values are implemented as an array. There is a possible ambiguity here: a single value for an attribute with multiplicity > 1 could be implemented either as an array with one element or as a single value. The implementation uses a single value instead of a single-valued array in order to increase interoperability with GIS and spatial tools, but this probably requires further experimentation and discussion. A small client-side normalization sketch follows.
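Until that discussion settles, clients can defend against both forms. The illustrative helper below normalizes an attribute value to a list whether the producer encoded a scalar or an array.

def as_list(value):
    """Return an attribute's values as a list, wrapping a bare scalar."""
    if value is None:
        return []
    return value if isinstance(value, list) else [value]

# Single-value form and array form normalize to the same shape.
assert as_list("B20") == ["B20"]
assert as_list(["B20", "B14"]) == ["B20", "B14"]
assert as_list(None) == []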

JSON supports a restricted set of types, and therefore only the following explicit mappings are defined.

  • Number (for S-100 GFM integer primitive types)

  • Boolean (for S-100 GFM boolean types)

  • String (for all other S-100 GFM types, including dates)

This means that the JSON encoding is, with the exception of boolean and integer types, effectively untyped. This is not necessarily a problem and has not hindered uptake of JSON forms in the broader geospatial community; it remains to be seen whether it causes problems. It is not dissimilar to the S-100 Part 10a ISO8211 encoding, where attribute and sub-attribute values are encoded as binary strings regardless of their S-100 GFM type - these types can be enforced by S-100 data editors, rather than at the encoding level.

C.1.5. Geometry

Geometry can be directly implemented as GeoJSON geometry objects: Point, LineString, and Polygon, using coordinates defined in an appropriate CRS, can be embedded directly in GeoJSON objects. The suggestion is to use inline coordinates with the simple geometry types allowed in GeoJSON. The big gap is, of course, that S-100 topology is not supported under GeoJSON, as geometric primitives cannot be indirectly referenced. This means all geometry must be inline, and shared geometry (and, by implication, attributes on spatial features) is not supported. These features are not universally used; it is suggested that more complex spatial types be supported by other encodings (GML or ISO8211 under Part 10b and Part 10a of S-100).

C.1.6. Relationships

The hardest element of the S-100 GFM to implement is the relationships between features and information types. JSON does not currently implement a standardized way of referencing between objects in a dataset; there is a $ref convention that can be used for such things, and the encoding uses unique identifiers for all features. The S-100 GFM names both the relationships and the individual roles using unique codes, so it seems a referencing mechanism could be finalized reasonably simply. This is an ongoing area; the pygeoapi implementation uses $ref as a first cut, and a speculative sketch is shown below.
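The following sketch is illustrative only: the role name and identifiers are assumptions, not drawn from the S-131 Feature Catalogue. It shows how a named association role on a feature might point at an information type elsewhere in the same FeatureCollection using a $ref-style pointer, in the spirit of the pygeoapi first cut.

import json

collection = {
    "type": "FeatureCollection",
    "features": [
        {
            "type": "Feature",
            "id": "fiho.s100.S131.Terminal.0179",
            "properties": {
                # Hypothetical named association role resolving to the
                # information type below via a JSON Pointer-style $ref.
                "contactDetails": {"$ref": "#/features/1"}
            },
            "geometry": {"type": "Point", "coordinates": [21.4342, 61.1298]},
        },
        {
            # S-100 information types carry no geometry of their own.
            "type": "Feature",
            "id": "fiho.s100.S131.ContactDetails.0001",
            "properties": {"callName": "Rauma Port Control"},
            "geometry": None,
        },
    ],
}

print(json.dumps(collection, indent=2))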

C.2. Moving Forward

The GeoJSON encoding for S-100 GFM data has been a big step forward in moving S-100 GFM data to OGC API - Features, but it is very much a “v1.0.0”. In order to produce a more finished encoding, the following points should be considered.

  1. CRS Support in GeoJSON is still limited. For many S-100 product specifications this isn’t an issue as they are located on WGS84 but S-100’s CRS support is far broader. A balance needs to be made between utility and scope - GeoJSON enjoys good interoperability with the broader geospatial community and weakening interoperability in order to support more CRSs with a bespoke format may be a step backwards.

  2. Better Support for Associations. S-100 has a very rich model for representing associations between features and information types. A suitable/equivalent encoding for GeoJSON has not yet been found and this work is ongoing.

  3. Geometry. No support for topology is yet available. Extending the encoding to encompass topoJSON would be a good step forward and allow the JSON encoding to sit alongside ISO8211 (S-100’s encoding with a rich support for topological structures). A tighter definition of geometry primitives and a mapping to S-100 Part 7 would be a step forward. A mapping against the geometry defined in S-100 Part 10b would also be useful for implementers.

  4. Metadata and transport. The GeoJSON encoding is the most important part of data delivery via an API but a number of other areas remain to be covered including dataset metadata and a definition of how the aggregation is documented by OGC API. OGC API has a number of mandatory and optional features and a closer examination should be done, concentrating on a number of S-100 product specifications, such as those considered by this project.

Appendix D: FMSDI User Survey

[Figures: FMSDI user survey results (survey1, survey2)]

Appendix E: Revision History

Date | Editor | Release | Primary clauses modified | Descriptions
Jan 31, 2023 | S. Saeedi | 1.0 | all | final review and edit
January 26, 2023 | R. Thomas | 0.9 | all | preparation for publication
November 1, 2022 | R. Thomas | 0.8 | all | comments integrated
July 8, 2022 | R. Thomas and S. Saeedi | 0.1 | all | initial version

Bibliography


1. OGC: Development of Spatial Data Infrastructures for Marine Data Management Engineering Report.
2. OGC: Maritime Limits and Boundaries Pilot Engineering Report.
3. OGC Arctic Spatial Data Pilot reports: https://www.ogc.org/pub/ArcticSDP/pilot-reports.html
4. OGC: Towards a Federated Marine SDI: IHO and OGC standards applied to Marine Protected Area Data Engineering Report.