Publication Date: YYYY-MM-DD

Approval Date: YYYY-MM-DD

Submission Date: YYYY-MM-DD

Reference number of this document: OGC 19-016

Reference URL for this document:

Category: OGC Public Engineering Report

Editor: Michael A. Leedahl

Title: OGC Testbed-15: Data Centric Security

OGC Public Engineering Report


Copyright © 2019 Open Geospatial Consortium. To obtain additional rights of use, visit


This document is not an OGC Standard. This document is an OGC Public Engineering Report created as a deliverable in an OGC Interoperability Initiative and is not an official position of the OGC membership. It is distributed for review and comment. It is subject to change without notice and may not be referred to as an OGC Standard. Further, any OGC Public Engineering Report should not be referenced as required or mandatory technology in procurements. However, the discussions in this document could very well lead to the definition of an OGC Standard.


Permission is hereby granted by the Open Geospatial Consortium, ("Licensor"), free of charge and subject to the terms set forth below, to any person obtaining a copy of this Intellectual Property and any associated documentation, to deal in the Intellectual Property without restriction (except as set forth below), including without limitation the rights to implement, use, copy, modify, merge, publish, distribute, and/or sublicense copies of the Intellectual Property, and to permit persons to whom the Intellectual Property is furnished to do so, provided that all copyright notices on the intellectual property are retained intact and that each person to whom the Intellectual Property is furnished agrees to the terms of this Agreement.

If you modify the Intellectual Property, all copies of the modified Intellectual Property must include, in addition to the above copyright notice, a notice that the Intellectual Property includes modifications that have not been approved or adopted by LICENSOR.


This license is effective until terminated. You may terminate it at any time by destroying the Intellectual Property together with all copies in any form. The license will also terminate if you fail to comply with any term or condition of this Agreement. Except as provided in the following sentence, no such termination of this license shall require the termination of any third party end-user sublicense to the Intellectual Property which is in force as of the date of notice of such termination. In addition, should the Intellectual Property, or the operation of the Intellectual Property, infringe, or in LICENSOR’s sole opinion be likely to infringe, any patent, copyright, trademark or other right of a third party, you agree that LICENSOR, in its sole discretion, may terminate this license without any compensation or liability to you, your licensees or any other party. You agree upon termination of any kind to destroy or cause to be destroyed the Intellectual Property together with all copies in any form, whether held by you or by any third party.

Except as contained in this notice, the name of LICENSOR or of any other holder of a copyright in all or part of the Intellectual Property shall not be used in advertising or otherwise to promote the sale, use or other dealings in this Intellectual Property without prior written authorization of LICENSOR or such copyright holder. LICENSOR is and shall at all times be the sole entity that may authorize you or any third party to use certification marks, trademarks or other special designations to indicate compliance with any LICENSOR standards or specifications.

This Agreement is governed by the laws of the Commonwealth of Massachusetts. The application to this Agreement of the United Nations Convention on Contracts for the International Sale of Goods is hereby expressly excluded. In the event any provision of this Agreement shall be deemed unenforceable, void or invalid, such provision shall be modified so as to make it valid and enforceable, and as so modified the entire Agreement shall remain in full force and effect. No decision, action or inaction by LICENSOR shall be construed to be a waiver of any rights or remedies available to it.

None of the Intellectual Property or underlying information or technology may be downloaded or otherwise exported or reexported in violation of U.S. export laws and regulations. In addition, you are responsible for complying with any local laws in your jurisdiction which may impact your right to import, export or use the Intellectual Property, and you represent that you have complied with any regulations or registration procedures required by applicable law to make this license enforceable.


1. Subject

The OGC Testbed-15 Data Centric Security Engineering Report (ER) discusses the current state of security in protecting data in a geospatial environment. The ER examines the use of encrypted container formats such as NATO STANAG 4778 "Metadata Binding Mechanism" carrying metadata as defined in NATO STANAG 4774 "Confidentiality Metadata Label Syntax" in combination with geospatial data encoded in an OGC Web Feature Service (WFS) FeatureCollection structure. This report also recommends the creation of new media types to support output container formats such as STANAG 4778. The report then discusses various implementation scenarios in which a STANAG 4778 eXtensible Markup Language (XML) container maintains encrypted data from author to service to viewer. These implementations use the new OGC API - Features - Part 1: Core standard, with features encrypted using keys supplied by feature authors and users.

2. Executive Summary

OGC members can derive business value from this ER in the following three areas:

  1. Where Data Centric Security fits in with proposed standards such as OGC API - Features.

  2. Techniques to use and issues that impact implementation of Data Centric Security.

  3. The continuing work that remains in the area of Data Centric Security.

The motivation for data centric security is a response to the possibility of an unauthorized user intercepting network traffic or hacking systems storing sensitive data. When drafting OGC standards such as OGC API - Features for a data centric security scenario, standards need to include ways to classify the security requirements around data access. This classification can exist as additional metadata fields. The requirement stems from the need to limit different consumers to different subsets of data. Additional requirements include the need to represent the source of the information as well as an assurance that the information has not been tampered with. A fundamental requirement for data centric security is that the data is always in an encrypted form until an authorized actor makes use of the data. As the data could pass through systems that belong to neither the data consumer nor the producer, the data must remain encrypted throughout the geospatial environment. The geospatial environment includes all infrastructure that touches the geospatial data (services, networks, storage, clients, etc.). For the purposes of this ER and the data centric security thread in the testbed, a requirement exists to use an open source implementation of OGC API - Features.

The Testbed-15 findings show that it is possible to support data centric security within the OGC API service framework. The ER explores three scenarios:

  • A request for features that is intercepted, filtered, encrypted and signed by a security proxy. The proxy forwards the request to a vanilla OGC API - Features service and transforms the response into a STANAG 4778 output format. Annex A provides more details for this scenario.

  • A scenario in which the security proxy contains a geospatial policy of classified and unclassified areas. The scenario is similar to the one above in that a request is intercepted, filtered, encrypted and signed by the security proxy. The difference is that temporal decisions and spatial filtering are performed on the results of the request by the security proxy. See Annex B for more details.

  • A request for features that is intercepted, filtered, encrypted and signed by a security proxy which forwards the request to, and the response from, an OGC API - Features service that understands the STANAG 4778 output format. In this scenario, the OGC API - Features service returns a feature collection with STANAG 4778 encoded feature objects. See Annex C for more details.

The first challenge an implementor encounters occurs when sending the request. The current code lists do not support a STANAG 4778 output format. The STANAG 4778 output format is a container format that holds encrypted portions of sensitive data and associated metadata. A sub-challenge is that the OGC API set of standards needs a way to specify both the container encoding and the format of the data in the container. Once standards such as OGC API - Features support the documentation of containers and data, and gain agreement from the implementing community, interoperability becomes possible.
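As an illustration of that sub-challenge, the sketch below shows how a client might ask for a container encoding through content negotiation. The media type `application/stanag4778+xml` and the helper function are assumptions for illustration only; no such media type is registered today, which is precisely the gap identified above.

```python
from urllib.parse import urlencode

# Hypothetical media type: no media type for STANAG 4778 containers is
# registered today, which is the gap this report identifies.
STANAG_4778_MEDIA_TYPE = "application/stanag4778+xml"

def build_items_request(base_url, collection, bbox=None):
    """Build an OGC API - Features /items request that asks for a
    STANAG 4778 container via content negotiation (illustrative only)."""
    params = {"f": STANAG_4778_MEDIA_TYPE}
    if bbox:
        params["bbox"] = ",".join(str(v) for v in bbox)
    url = f"{base_url}/collections/{collection}/items?{urlencode(params)}"
    headers = {"Accept": STANAG_4778_MEDIA_TYPE}
    return url, headers

url, headers = build_items_request(
    "https://example.org/ogcapi", "poi", bbox=(5.5, 52.0, 6.0, 52.5))
```

A real standard would additionally need a parameter to name the format of the data inside the container, which this sketch does not attempt.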

The next challenge for implementation, which is outside the scope of OGC API standards work, is key management. In the first and second scenarios, the OGC API service does not know anything about keys. The feature data is either not encrypted in storage, or the data is encrypted by the file system or the database system. The security proxy (PDP/PEP) encrypts the data as the data is returned to the authorized actor. This allows the OGC API - Features service to search the data, as the data is visible to the service. In the third scenario, the OGC API - Features service either needs authorization to access keys or the service's ability to filter data is limited. One challenge that is within the scope of OGC API standards is the description and negotiation of key management. Currently there are no markings in the service to specify whether the metadata is encrypted with the public key of the client or whether the metadata contains the key for the sensitive encrypted feature data. There are potentially other key management methods which client and service implementations may use in negotiation and description of key management.
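The missing key-management markings could, for example, take a form like the following sketch. The element and attribute names are illustrative only (loosely modeled on the W3C XML Signature KeyInfo element); no OGC or STANAG document currently standardizes them.

```python
import xml.etree.ElementTree as ET

# Two possible key-management markings. All names here are illustrative
# assumptions, loosely modeled on W3C XML Signature's KeyInfo element.

def inline_key_marking(wrapped_key_b64):
    """Metadata carries the content key itself, wrapped for the client."""
    info = ET.Element("KeyInfo", {"method": "inline"})
    ET.SubElement(info, "EncryptedKey").text = wrapped_key_b64
    return info

def reference_key_marking(key_id, kms_url):
    """Metadata only references a key held by a key management service."""
    info = ET.Element("KeyInfo", {"method": "reference"})
    ET.SubElement(info, "KeyIdentifier", {"href": kms_url}).text = key_id
    return info

inline_xml = ET.tostring(inline_key_marking("bW9jay13cmFwcGVkLWtleQ=="),
                         encoding="unicode")
ref_xml = ET.tostring(reference_key_marking("key-42",
                                            "https://kms.example.org/keys"),
                      encoding="unicode")
```

A standardized marking of this kind would let a client discover, from the response itself, whether it must decrypt an inline key with its private key or fetch the key from a key management service.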

One other challenge that an OGC Standards Working Group (SWG) should address is the inclusion of a digital signature element in the schema of a feature collection. Current standards, such as OGC API - Features, do not contain a digital signature as part of the schema. The testbed participants were able to add one for the purposes of demonstrating Data Centric Security. However, the resulting feature collection will fail WFS FeatureCollection schema validation. This issue is demonstrated in scenario three, where the OGC API - Features service returns a feature collection with STANAG 4778 encoded features.

Future testbeds should investigate:

  • Additional container formats for encoding output formats. In this testbed, STANAG 4778 was chosen because of its use by NATO Partner Nations for exchanging data. The STANAG XML format is useful for systems that are working with XML data. Other encoding formats exist and some applications, particularly in a commercial business, may not be as keen to support XML. An investigation in using a JavaScript Object Notation (JSON) based encoding would be beneficial as many applications today exchange information using JSON.

  • Key management markings. The tests run in the testbed show that the metadata contains the symmetric key for decrypting the feature data, and that the metadata is encrypted with the public key of the user. An alternative key management scenario may store the keys in a key management service and require the client to fetch the key via a key identifier stored in the metadata. There should be some indication to the client of where to fetch the keys from and how to decrypt the features and metadata.

  • Authentication and Authorization Protocols. The tests in this testbed use OAuth2 to issue a bearer token for access delegation. OAuth2 scopes are validated along with GeoXACML to define authorization. Future implementations should evaluate data provenance using assertions, Blockchain technologies or other standards.

  • XML Digital Signature in OGC encoding standards. Scenario Three demonstrates the ability to include a set of STANAG 4778 container objects in a WFS FeatureCollection result. Putting a STANAG 4778 container object in as a feature works because the feature collection schema allows for a type of xs:any. Applying a digital signature to the final feature collection results in an invalid structure as the schema defined in wfs.xsd does not support the insertion of a W3C XML Digital Signature element. From the testbed results, the participants encourage OGC SWGs for OGC API standards to add optional schema elements that allow the use of XML Digital Signatures. See the OGC Change Request #614 for more information.
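The structural problem described in the last bullet can be made concrete with a minimal sketch: a feature collection carrying a STANAG 4778 container object as a feature (permitted via xs:any), plus the ds:Signature element that wfs.xsd does not currently allow. The container element name is an illustrative assumption, and the empty Signature element only marks the proposed insertion point rather than a computed signature.

```python
import xml.etree.ElementTree as ET

WFS = "http://www.opengis.net/wfs/2.0"
DS = "http://www.w3.org/2000/09/xmldsig#"

# Build a WFS FeatureCollection whose member carries a STANAG 4778 container
# object (element name illustrative) through the schema's xs:any extension.
fc = ET.Element(f"{{{WFS}}}FeatureCollection")
member = ET.SubElement(fc, f"{{{WFS}}}member")
ET.SubElement(member, "MetadataBindingContainer")

# Append the enveloped ds:Signature element proposed in Change Request #614.
# Left empty here; wfs.xsd as published would reject this document.
fc.append(ET.Element(f"{{{DS}}}Signature"))

doc = ET.tostring(fc, encoding="unicode")
```

Validating `doc` against the published wfs.xsd would fail on the Signature element, which is exactly the schema change the participants recommend.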

2.1. Document contributor contact points

All questions regarding this document should be directed to the editor or the contributors:


Name | Organization | Role

Michael Leedahl | Maxar Technologies, Inc. | Editor

Andreas Matheus | Secure Dimensions |

Donovan Dall | Helyx Secure Information Systems |

George Elphick | Helyx Secure Information Systems |


2.2. Foreword

Attention is drawn to the possibility that some of the elements of this document may be the subject of patent rights. The Open Geospatial Consortium shall not be held responsible for identifying any or all such patent rights.

Recipients of this document are requested to submit, with their comments, notification of any relevant patent claims or other intellectual property rights of which they may be aware that might be infringed by any implementation of the standard set forth in this document, and to provide supporting documentation.

3. References

4. Terms and definitions

For the purposes of this report, the definitions specified in Clause 4 of the OWS Common Implementation Standard OGC 06-121r9 shall apply. In addition, the following terms and definitions apply.

● AS

OAuth2 Authorization Server — a component that dispatches, validates, and manages bearer access tokens.

● GeoPDP

Geospatial Policy Decision Point — a component of a policy based system that uses a request, attributes about a request (including geospatial attributes) and a policy document to make an access decision to allow access to a resource. The GeoPDP implements the OGC GeoXACML implementation specification.

● GeoPEP

Geospatial Policy Enforcement Point — a component of a geospatial aware policy based system that works with a GeoPDP to enforce access decisions and perform obligations requested by the GeoPDP.

● OGC API - Features

OGC API - Features - Part 1: Core — a new OGC standard for a feature service application programming interface that provides access to feature collections and the items in them. This standard was formerly known as WFS3 for Web Feature Service version 3.

● LDProxy

LDProxy — An Open Source product by Interactive Instruments which provides most of the REST implementation specified in the OGC API - Features Standard.


● PDP

Policy Decision Point — a component of a policy based system that uses a request, attributes about a request and a policy document to make an access decision to allow access to a resource. The PDP implements the OASIS XACML3 standard.

4.1. Abbreviated terms

  • AD Authorization Decision

  • ADR Authorization Decision Request

  • AS Authorization Server

  • DCS Data Centric Security

  • DWG Domain Working Group

  • GeoPDP Geospatial Policy Decision Point

  • GeoPEP Geospatial Policy Enforcement Point

  • GeoXACML Geospatial eXtensible Access Control Markup Language

  • OAPIF Short form of OGC API - Features - Part 1: Core

  • OGC Open Geospatial Consortium

  • PDP Policy Decision Point

  • SAML Security Assertion Markup Language

  • SWG Standard Working Group

  • TB15 OGC Testbed-15

  • WFS3 Web Feature Service version 3 (Also known as OGC API Features)

  • XACML eXtensible Access Control Markup Language

  • XML eXtensible Markup Language

  • XSLT eXtensible Stylesheet Language Transformations

5. Overview

This section provides a brief overview and description of the key sections of this Engineering Report.

Section 6 provides a look at the landscape of data centric security technologies and techniques. The section also covers the impact of implementing data centric security on the performance of a feature retrieval service such as OGC API - Features.

Section 7 outlines scenarios, requirements and architecture used in the Testbed for Data Centric Security. The first of three scenarios defined in this testbed works with a proxy solution. The proxy deals with authentication, authorization and converting to a STANAG 4778 container format. The proxy is put in front of a vanilla implementation of an OGC API - Features service. The second scenario is similar to the first scenario with the exception that the proxy service applies a spatial filter on the request. The third scenario looks at an authentication and authorization system that passes through a request to an OGC API - Features service which already stores the features in a STANAG 4778 format encoding. Next the section covers requirements and presents a mapping of requirements to architectural elements. This is followed by the engineering aspects of the architecture and infrastructure setup for the Data Centric Security portion of Testbed-15.

Section 10 presents the results of the Technology and Integration Experiment (TIE) testing. In general, the TIEs show what happens when an OGC API - Features service is called with and without a security proxy in front for each scenario.

Section 11 provides a summary of the main findings. This section shows that adding data centric security using containers that conform to STANAG 4778 is possible.

Section 12 looks at the additional aspects of key management, authentication/authorization and filtering that were not covered in this testbed.

Annex A provides a demonstration and implementation instructions for scenario one.

Annex B provides a demonstration and implementation instructions for scenario two.

Annex C provides a demonstration and implementation instructions for scenario three.

6. Data Centric Security

6.1. State of Art in Data Centric Security

When implementing a platform for data centric security, data providers and distributors need a way to:

  • Authenticate agents/users

  • Prove the authenticity and integrity of data

  • Record the provenance of data

  • Classify data

  • Manage rights and policies for accessing data

  • Manage keys for encrypting and decrypting data

  • Automate encryption and decryption of data

  • Automate data masking and unmasking

  • Discover and catalog encrypted data

Data Centric Security requires some sort of agent or client, running on endpoints where data is created, that can assess the data and perform actions on the data as defined in policies. Often these policies are centrally managed and pushed out to the agent or client. Users or automated processes that create or use data should not need to be aware that the data is encrypted when they access the data.

There are many technologies and standards used to accomplish data centric security, such as:

  • Digital Rights Management (DRM)

  • Data Loss Prevention (DLP)

  • Key Management Interoperability Protocol (KMIP)

  • Public-Key Cryptography Standards (PKCS #11)

  • Information Flow Control

  • Attribute-Based Access Control

  • Role-Based Access Control

  • W3C PROV

  • Digital Signatures

  • Transport Layer Security (TLS)

  • Hypertext Transfer Protocol Secure (HTTPS)

Many of the aforementioned technologies and standards have been applied to various solutions for file and email management. When looking for solutions to these problems in a geospatial context, data centric security seems to be behind other infrastructures. Some work was done on Data Provenance in Testbed-10, and some companies have published papers on various ways to record provenance in metadata. Most products support HTTPS using TLS; however, this is a transport security model rather than a data centric security model. There are products available as a proxy service to provide authentication, authorization and policy access controls based on attributes or roles. These solutions can be, and often are, placed in front of geospatial solutions today. There are a number of solutions available to inspect network traffic to and from a web service or a database that provide DLP and threat assessment services. These proxy services can also be put in front of geospatial services. The common thread among all of these services, as applied to geospatial implementations, is that they operate on the network transport layer and not on the data and services themselves. This suggests that more work and experiments are needed to ultimately create standards that make it possible to implement DRM, DLP and policy decisions by marking the data itself and implementing the technologies in the services that serve the data.

6.2. Data Burden

Adding encryption will always create some additional overhead for any web implementation. This overhead comes in two parts. The first is the size of packets. The second is performance, due to additional processing time on servers and clients. This is especially true when the data is large. Returning an entire feature collection as an encrypted section in a container format could be quite large. Encryption produces binary results that are converted to Base64 encoding when put into an XML container. This conversion expands the data by roughly one third, since every three bytes of input become four bytes of output. This creates a burden on network communications. In this engineering report, we examine a few scenarios in which we implement data centric security as a proxy in front of OGC API - Features services that are unaware and aware of data centric security. In both cases, a data burden is placed on the client application as it has to decrypt the data to use or present the data.
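The Base64 expansion factor is easy to verify directly: the encoding maps every three input bytes to four output characters.

```python
import base64
import os

# Measure the Base64 expansion: every 3 payload bytes become 4 encoded
# bytes, so an encrypted blob grows by roughly one third when embedded
# as text inside an XML container.
payload = os.urandom(30_000)   # stand-in for an encrypted feature blob
encoded = base64.b64encode(payload)
overhead = len(encoded) / len(payload)
```

Here `overhead` comes out to exactly 4/3 for any payload whose length is a multiple of three; padding makes short payloads slightly worse.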

In the unaware version, all the encryption is done in the proxy service at request time. This could impose a burden on the data over a classic service that does not provide data centric security. However, in many classic implementations, security is still enforced and encryption is applied in other forms that still add a burden in terms of retrieval performance. The big difference is that in a classic scenario the encryption is done on the transport layer and is communicated in binary form, while in the data centric security scenario the data is expanded by conversion to an ASCII format for inclusion in a container. For example, in a classic implementation, whole disk encryption or database encryption may be applied. This adds latency to the delivery of data in that the data must be decrypted from storage before being delivered. An additional source of overhead in a classic implementation is imposed on the transport layer, as the data is often encrypted through a communication protocol such as Transport Layer Security (TLS). By using a proxy to provide data centric security, it is no longer as important to use TLS, as the data is already encrypted. However, the solution we present in this engineering report for the proxy scenario does use both asymmetric and symmetric keys to encrypt portions of the data, as does TLS. In effect, a classic implementation and the data centric security implementation should be similar in performance in cases where transport and storage encryption were used on a classic service.

In contrast, a data centric security aware OGC API - Features service stores containers with encryption already applied to the data in the database or on the data storage. In this scenario, the data creator had the burden of encrypting the data before putting it into the service. For this engineering report and the testbed experiments, we did not encrypt the metadata portion of the containers stored in the service. This allows a proxy server to make classification and filtering decisions based on the metadata. The proxy service then used an asymmetric key to encrypt the metadata, as was done in the unaware scenarios. An alternative that we could look at in future testbeds is to use a key to decrypt the metadata in the proxy service, thus allowing the metadata to be stored encrypted as well. As to data burden and performance, this solution is potentially more performant in that the proxy service is not applying encryption to the bulk of the data, as the data is already encrypted. There is still a burden on the client side to decrypt the data. A classic solution with network and storage encryption makes the encryption seamless to the client, whereas a data centric security model imposes work on the client to decrypt. However, if we were to do performance testing, it is quite possible that the scenario where features are stored as encrypted containers may actually perform a little faster than the classic solution.
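A small sketch of the filtering that cleartext metadata enables: the proxy can decide which stored containers are releasable from their labels alone, without ever touching the encrypted payloads. The label values and their ordering are made-up assumptions, not taken from any STANAG 4774 profile.

```python
# Illustrative clearance ordering and labels; not from any real profile.
CLEARANCE_ORDER = ["unclassified", "restricted", "confidential", "secret"]

# Stored containers: cleartext metadata next to an opaque encrypted payload.
stored_containers = [
    {"metadata": {"classification": "unclassified"}, "data": "<ciphertext>"},
    {"metadata": {"classification": "secret"},       "data": "<ciphertext>"},
]

def releasable(container, clearance):
    """Decide from cleartext metadata whether a container may be released."""
    level = container["metadata"]["classification"]
    return CLEARANCE_ORDER.index(level) <= CLEARANCE_ORDER.index(clearance)

visible = [c for c in stored_containers if releasable(c, "restricted")]
```

If the metadata were stored encrypted as well, the proxy would first need a key to decrypt it before this kind of decision could be made, which is the alternative mentioned above.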

In summary, encryption always adds a burden to the data and to the performance of a system. That is certainly the case in geospatial data solutions. In this testbed, we did not perform any performance testing, so there are no metrics to report; perhaps that is something best left to future testbeds. However, we can work out the data burden in a logical manner, and looking at it logically, the data burden should be similar to a classic implementation where data is encrypted at rest and in motion. The client will carry more burden in a data centric security model to decrypt data than a classic implementation would impose. A data centric aware service, such as OGC API - Features, could logically decrease some of the performance burden placed on the data.

7. Scenarios, Requirements and Architecture

7.1. Scenarios

The driving use case for the Testbed-15 Data Centric Security activity is enabling NATO partnering countries to share geodata across potentially insecure networks. To accomplish this use case, data must be secure from storage through delivery. This involves storing the data in an encrypted form in the spatial database and leaving it encrypted in transit. For a client to use this data, the client needs the ability to retrieve a key to decrypt the data. To support the NATO use case, the decision was made to encode the data in a transfer protocol format defined in STANAG 4778. This decision provides the developers with three implementation scenarios:

7.1.1. Scenario One

In this scenario, a default GeoServer implementation is put behind a Geospatial Policy Enforcement Point (GeoPEP). The GeoPEP acts as a proxy server that applies generic metadata, packages and encrypts geo-data (features) into the STANAG 4778 format, and returns the data to the client. Figure 1 shows an example of a typical flow from client request to response using the GeoPEP proxy. This scenario leverages XACML without processing geographic properties of the request and response.

Figure 1. GeoPEP as a Proxy for STANAG 4778

The steps illustrated in Figure 1 are explained as follows:
  1. Client sends an OGC API Features (WFS3) request.

  2. The GeoPEP intercepts the request and asks the PDP if the request is allowed. The Policy Decision Point (PDP) uses a XACML policy for a list of obligations that apply to the request parameters. A XACML policy is an XML document that contains access control rules and obligations. The PDP responds with filter obligations on the request, and a STANAG 4778 transformation with digital signature and encryption obligations on the response.

  3. The GeoPEP applies the filters specified in the filter obligation to the request and sends the modified request to the OGC API Features (WFS3) service.

  4. The GeoPEP receives a response from the OGC API Features (WFS3) service and applies the STANAG 4778 transformation to the response.

  5. The GeoPEP creates a symmetric key to encrypt the STANAG 4778 objects and uses the public key of the user to encrypt the symmetric key for inline distribution. This step is not shown on the diagram.

  6. The GeoPEP calculates the digital signature for the response and sends it to the client.
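The six steps above can be sketched as a single pipeline. This is a control-flow illustration only: the cipher, key wrap and signature below are toy stand-ins (a real deployment would use AES for the content key, asymmetric key wrapping such as RSA-OAEP, and W3C XML Signature), and all function and field names are hypothetical.

```python
import base64
import hashlib
import json
import secrets

def pdp_decide(request):
    """Stand-in PDP: permit, with filter/transform/encrypt/sign obligations."""
    return {"decision": "Permit",
            "obligations": ["filter:classification<=secret",
                            "transform:stanag4778", "encrypt", "sign"]}

def geopep_handle(request, features, user_public_key):
    decision = pdp_decide(request)                     # steps 2-3
    if decision["decision"] != "Permit":
        return {"status": 403}
    dek = secrets.token_bytes(32)                      # step 5: content key
    body = json.dumps(features).encode()               # step 4: transform
    ciphertext = base64.b64encode(                     # toy XOR cipher, NOT AES
        bytes(b ^ dek[i % 32] for i, b in enumerate(body)))
    wrapped_dek = base64.b64encode(dek + user_public_key)  # toy key wrap
    container = {"DataObject": ciphertext.decode(),
                 "Metadata": {"EncryptedKey": wrapped_dek.decode()}}
    container["Signature"] = hashlib.sha256(           # step 6: toy signature
        json.dumps(container, sort_keys=True).encode()).hexdigest()
    return {"status": 200, "container": container}

resp = geopep_handle({"path": "/collections/poi/items"},
                     [{"id": 1, "name": "bridge"}], b"user-pub-key")
```

The shape of the result mirrors the scenario: an encrypted data object, metadata carrying the wrapped content key for inline distribution, and a signature computed over the whole response.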

7.1.2. Scenario Two

This scenario uses the same setup as scenario one except that the GeoXACML Policy contains conditions for geographic and temporal access as well as filtering. Figure 2 shows a similar flow to scenario one except that the GeoPEP may deny the request based on a spatial condition with temporal requirements. The flow in scenario two may rewrite the request to filter on spatial requirements and ignore classification due to policy overrides for emergency situations. Lastly, the flow may result in classification-based filtering, as in scenario one, when time and location are not a consideration.

Figure 2. GeoPEP as a Proxy for STANAG 4778 with Spatial and Temporal Filtering

The steps illustrated in Figure 2 are explained as follows:

  1. As in scenario one, the client sends an OGC API Features (WFS3) request.

  2. The GeoPEP intercepts the request and asks the GeoPDP if the request is allowed. The GeoPDP responds with a geographic and temporal filter obligation on the request and a STANAG 4778 transformation with digital signature obligations on the response. GeoPEP applies the filters specified in the filter obligation to the request and sends the modified request to the OGC API - Features (WFS3) service.

  3. The GeoPEP receives a response from the OGC API Features (WFS3) service and applies the STANAG 4778 transformation to the response.

  4. The GeoPEP creates a symmetric key to encrypt the STANAG 4778 objects and uses the public key of the user to encrypt the symmetric key for inline distribution. This step is not shown on the diagram.

  5. The GeoPEP calculates the digital signature for the response and sends it to the client.
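The spatial and temporal condition that distinguishes this scenario from scenario one can be sketched as a simple permit function. The releasable area, time window and bounding-box containment test below are illustrative stand-ins for a real GeoXACML policy evaluation, with made-up coordinates and dates.

```python
from datetime import datetime, timezone

# Illustrative stand-in for a GeoXACML spatial/temporal condition: permit a
# request only if its bbox lies inside the releasable area and the request
# time falls within the allowed window. All values below are assumptions.
RELEASABLE_BBOX = (5.0, 51.0, 7.0, 53.0)        # minx, miny, maxx, maxy
WINDOW = (datetime(2019, 1, 1, tzinfo=timezone.utc),
          datetime(2019, 12, 31, tzinfo=timezone.utc))

def bbox_within(inner, outer):
    """True when the inner bbox is entirely contained in the outer bbox."""
    return (inner[0] >= outer[0] and inner[1] >= outer[1]
            and inner[2] <= outer[2] and inner[3] <= outer[3])

def permit(request_bbox, when):
    """Combined spatial and temporal access decision."""
    return (bbox_within(request_bbox, RELEASABLE_BBOX)
            and WINDOW[0] <= when <= WINDOW[1])

ok = permit((5.5, 52.0, 6.0, 52.5),
            datetime(2019, 6, 1, tzinfo=timezone.utc))
denied = permit((4.0, 52.0, 6.0, 52.5),
                datetime(2019, 6, 1, tzinfo=timezone.utc))
```

A real GeoPDP would evaluate full geometries rather than bounding boxes and could also return rewrite obligations, as described in step 2, instead of a plain deny.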

7.1.3. Scenario Three

This scenario involves an OGC API - Features service that supports STANAG 4778 as an output format. Figure 3 shows an example of a flow from client request to response using an OGC API - Features service supporting STANAG 4778.