
Possibility to add server information in service description #4

Closed
mjacoby opened this issue Jul 27, 2016 · 25 comments
Labels
sensing v1.1 This change should be discussed for v1.1 of the sensing document.

Comments

@mjacoby

mjacoby commented Jul 27, 2016

Section 9.2.1 of the SensorThings API Specification defines that an HTTP GET call to the service root URI has to return a JSON object with a property named value. Is it allowed, according to the standard, to return additional information like this?

{
    "capabilities" : 
    {
        "idType" : "integer",
        "supportedEncodings" : [
            "application/vnd.geo+json",
            "application/pdf",
            "http://www.opengis.net/doc/IS/SensorML/2.0"
        ],
        "keepTimezone" : true
    },
    "value": [
        {
            "name": "Things",
            "url": "http://example.org/v1.0/Things"
        },
        ...
    ]
}

And if so, are there any considerations to standardize any of the capabilities, like supportedEncodings, in a later version?
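For illustration, here is a minimal client-side sketch (Python; the helper names and the treatment of capabilities are our own assumptions, since the block is non-standard) that consumes such a root document: it relies only on the mandatory value property and degrades gracefully when capabilities is absent.

```python
import json

# Hypothetical root document following the proposed (non-standard)
# "capabilities" block; only "value" is required by Section 9.2.1.
ROOT_DOC = json.loads("""
{
  "capabilities": {
    "idType": "integer",
    "supportedEncodings": ["application/vnd.geo+json"],
    "keepTimezone": true
  },
  "value": [
    {"name": "Things", "url": "http://example.org/v1.0/Things"}
  ]
}
""")

def entity_sets(root):
    """Map entity-set names to URLs from the mandatory 'value' property."""
    return {item["name"]: item["url"] for item in root.get("value", [])}

def supports_encoding(root, mime):
    """Probe the non-standard 'capabilities' block, tolerating its absence."""
    caps = root.get("capabilities", {})
    return mime in caps.get("supportedEncodings", [])
```

Because the extra block is optional, a standard-conforming client that ignores it still works unchanged.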

@liangsteve
Contributor

I think this should be a broader discussion in the SWG. Capabilities were in a very early version of the specification draft; there was a specific link to them, similar to this: PATH/v1.0/Capabilities. In OData, I believe there is something like $metadata, which returns service metadata. The OData way is quite comprehensive, but follows EDMX. Why don't you prepare a presentation for the Orlando SWG? Or we can have SWG telecons in early September (after summer)?

@mjacoby
Author

mjacoby commented Jul 28, 2016

You're right, in OData you can access metadata of the service via
GET ~/$metadata/
but as far as I understand, this returns information on the data model of the service rather than on general server capabilities like supportedEncodings.
Unfortunately, it's pretty unlikely that I'll join the SWG in Orlando, but a telecon sounds like a good idea, as there is not only this issue but also some others that Hylke and I have come across so far.

@liangsteve
Contributor

Notes from the 2018-08-17 telecon discussion: the only information (as far as we know for now) that needs an additional document to be offered is MQTT-related information, e.g. the MQTT URL, whether websockets are supported, and the port number (1883 and/or 9001?).

@hylkevds
Contributor

I've updated the proposal to only contain the mqtt bits. Something that might be a point of discussion:
Do we add protocol/address/port as it is now, or do we just add URLs, like:

{
  "mqttOptions": {
    "isEnabled": true,
    "endpoints": [
      "tcp://example.org:1883",
      "wss://example.org:443/mqtt"
    ]
  }
}
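One argument for the URL-only form: a client can recover protocol, address, and port from a plain endpoint URL with a standard URL parser, so no information is lost. A sketch using Python's urllib.parse (the helper name is ours):

```python
from urllib.parse import urlsplit

def split_endpoint(url):
    """Split an MQTT endpoint URL into (scheme, host, port, path)."""
    parts = urlsplit(url)
    return parts.scheme, parts.hostname, parts.port, parts.path

# e.g. split_endpoint("wss://example.org:443/mqtt")
#      -> ("wss", "example.org", 443, "/mqtt")
```

This keeps the service document compact (one string per endpoint) while still letting clients that need the individual fields extract them.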

@MarcusAlzona

Explore naming "capabilities" something else

@taniakhalafbeigi taniakhalafbeigi added the sensing v1.1 This change should be discussed for v1.1 of the sensing document. label Dec 27, 2018
@hylkevds
Contributor

hylkevds commented Feb 1, 2019

How about serverSettings?
Or serverFeatures, configuration, serverConfiguration.

We also ran into another situation where probing the server for supported extensions becomes problematic: when you want to use several optional features or extensions at the same time. If I want to fetch data using data arrays, with a filter based on interval logic functions, I may have to do several requests before I actually get any data.

  1. Request data with data arrays and interval logic. If I get an error, it might be that either data arrays or interval logic are not supported.
  2. Request data with only interval logic. If I again get an error, it might be that just interval logic is not supported, but data arrays are.
  3. Rewrite the query to not use interval logic, and use data arrays. If I still get an error, maybe data arrays are also not supported.
  4. Request data without data arrays or interval logic.

We could add a simple list of implemented extensions (and optional features?) to the serverSettings, something like:

  "serverSettings": {
    "extensions": [
      "mqtt",
      "dataArray",
      "put",
      "jsonPatch"
    ],
    "mqtt": {
      "endpoints": [
        "mqtt://server.example.com:1883",
        "mqtts://server.example.com:8883",
        "ws://server.example.com/sensorThings",
        "wss://server.example.com:443/sensorThings"
      ]
    }
  }

Any future extensions can define their keyword to be added to this list.
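With such a list, the four-step probing described above collapses into a single lookup before the first data request. A sketch (the extension keywords follow the proposal above; the helper is hypothetical):

```python
# Proposed serverSettings block, as sketched in this thread.
SERVER_SETTINGS = {
    "extensions": ["mqtt", "dataArray", "put", "jsonPatch"],
    "mqtt": {"endpoints": ["mqtts://server.example.com:8883"]},
}

def choose_query_options(settings, wanted):
    """Return the subset of wanted extensions the server advertises,
    so the client can build a working query on the first try."""
    advertised = set(settings.get("extensions", []))
    return [ext for ext in wanted if ext in advertised]

# A client wanting data arrays plus interval logic learns up front
# that only data arrays are available, with zero trial requests.
```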

@taniakhalafbeigi

I agree with adding the implemented extensions there as well. That could be very useful for clients.

@hylkevds
Contributor

I've updated the PR with the proposed changes: https://github.com/opengeospatial/sensorthings/pull/60/files

@hylkevds
Contributor

Additional points that came up in the Telco at 2019-04-23:

  • How does this relate to an OpenAPI description?
    Having an OpenAPI description would be good, but:
    • OpenAPI cannot describe the MQTT part
    • Making an OpenAPI description would be too big a change for v1.1, would be an item for v2.0
    • The STA landing page is supposed to be simple and user-friendly, an OpenAPI description is neither.
    • The landing page could link to an OpenAPI description.
  • Instead of having a list of extension keywords, and a separate set of extension settings, would it be an option to merge those into one?
    • That would be awkward for extensions that do not have settings.
    • Most extensions do not have additional settings.

@hylkevds
Contributor

In the telco of 2019-05-08 this was discussed some more:

  • Vendor extensions?
    The codes labelling extensions in the current PR are simple strings. To avoid conflicts, the
    PR states that vendors must add a vendor prefix to custom extensions. The worry is
    that the standard should not contain non-standard parts.
    • The simple solution is to use URLs as extension keys
      This directly provides a link to the description of the extension.
  • Security: We can/should add an Authentication/Security extension

@hylkevds
Contributor

The URLs of the official extensions could be:

    http://docs.opengeospatial.org/is/18-088r1/18-088r1.html#BatchRequests
    http://docs.opengeospatial.org/is/18-088r1/18-088r1.html#DataArray
    http://docs.opengeospatial.org/is/18-088r1/18-088r1.html#JSON-Patch
    http://docs.opengeospatial.org/is/18-088r1/18-088r1.html#MQTT
    http://docs.opengeospatial.org/is/18-088r1/18-088r1.html#MultiDatastream
    http://docs.opengeospatial.org/is/18-088r1/18-088r1.html#PUT

Would it make sense to add a key to indicate the core "version"?

   http://docs.opengeospatial.org/is/18-088r1/18-088r1.html#core

This could allow the service to advertise all versions of the API it supports. If a client gets the URL for v1.1, it could find out that the service also supports v2.0, and then switch to v2.0 instead...

@taniakhalafbeigi

In the telco of 2019-05-08 this was discussed some more:

  • Vendor extensions?
    The codes labelling extensions in the current PR are simple strings. To avoid conflicts, the
    PR states that vendors must add a vendor prefix to custom extensions. The worry is
    that the standard should not contain non-standard parts.

I don't see the problem of adding extensions to the standard. The only thing that could be a concern for the standard is if something couples with the standard data model in a way that the standard data model cannot be used standalone. Other than that, anything could be a feature/extension of a specific implementation. Whether an extension interferes with this, we would not know either with or without a vendor prefix.
So I am not sure why we need the vendor prefix. It could be an implementation's choice; I am not sure why we need to add this to the spec, as we are making the spec bigger and more complicated than it should be. My understanding is that interoperability is core for standard features, but not for anything beyond that. We could simply add a line like: The standard does not prevent services from having extended functionalities that do not interfere with the coherence of the data model, and these can be advertised in the capability document as extensions.
I am not sure, maybe I am missing something, so please correct me if I am wrong and hopefully it will be clear to me 😊

  • The simple solution is to use URLs as extension keys
    This directly provides a link to the description of the extension.
  • Security: We can/should add an Authentication/Security extension

@hylkevds
Contributor

Sorry, I forgot to add that bit of the discussion. The vendor prefix was to ensure there are no collisions in the tags. We don't want the situation where vendor A creates an extension called FantasticStuff, and vendor B also creates an extension called FantasticStuff, that does something different. That would be very confusing for users.

Using URLs that point to the actual description of the extension solves that issue in a much nicer way, since one can't have two different descriptions at the same URL. And it even helps a user in finding out what an extension actually does. So that means the vendor prefix bit can be removed.

Adding the line that the extensions must not interfere with the coherence of the data model is a good idea!

@taniakhalafbeigi

Thanks a lot for the clarification.
I think we are discussing interoperability for things that are not in the standard. From the spec's point of view, there would actually be no harm if you have an extension with the same name as ours. A client can initially work against only the standard specification; if it then decides to use an implementation's custom features, it needs to be customized for that implementation, which is no longer an interoperability issue.
I think we are trying to standardize things that are not part of the standard spec itself. I totally see how using that URL can be helpful, and I agree it is a good idea. But I think it is a best-practice thing; I am not sure it should be part of the spec.

@liangsteve
Contributor

We might want to look at OGC API and see how we can align with it.

Example
/conformance

{
  "conformsTo": [
    "http://www.opengis.net/spec/ogcapi-features-1/1.0/req/core",
    "http://www.opengis.net/spec/ogcapi-features-1/1.0/req/oas30",
    "http://www.opengis.net/spec/ogcapi-features-1/1.0/req/html",
    "http://www.opengis.net/spec/ogcapi-features-1/1.0/req/geojson"
  ]
}

http://docs.opengeospatial.org/DRAFTS/17-069r1.html#_declaration_of_conformance_classes
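A client-side check against such a conformance document is a simple membership test; a sketch in Python (the document shape and class URIs follow the example above; the helper name is ours):

```python
# OGC API style /conformance response, as in the example above.
CONFORMANCE = {
    "conformsTo": [
        "http://www.opengis.net/spec/ogcapi-features-1/1.0/req/core",
        "http://www.opengis.net/spec/ogcapi-features-1/1.0/req/geojson",
    ]
}

def conforms_to(doc, conformance_class):
    """True if the service declares the given conformance class URI."""
    return conformance_class in doc.get("conformsTo", [])
```

Because the keys are URIs, they double as links to the definition of each class, which is the same property that makes URLs attractive as extension keys above.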

@hylkevds
Contributor

Adding the conformance declaration at a separate URL sounds like a good idea, to increase compatibility with the OGC API. This should probably be a separate issue.

Using the conformance classes in our serviceSettings list also makes a lot of sense. One problem there is that some of the optional things we have in the spec do not have a separate requirement or conformance class (PUT and JSON Patch). Would it be worth it to specify separate requirements for those?

Should the list contain the full list of requirements? Or can a class be a placeholder for "all requirements in the class are implemented"?

@hylkevds
Contributor

The ITU-T has now also published the (OGC?) SensorThings API, but without conformance classes or requirement definitions. Will it be a problem for the adoption of v1.1 by the ITU if we use conformance classes pointing to opengis.net, or URLs pointing to docs.opengeospatial.org, in our serverSettings definition?

@liangsteve
Contributor

IMHO, I don't see any issue.

@hylkevds
Contributor

Will the ITU-T have no problems publishing the document with requirements that point to opengis.net?
Because if we add the requirement URLs to the serverSettings page, and the ITU-T publishes their version without those requirements, or with requirements at different URLs, that is not exactly good for interoperability.

@liangsteve
Contributor

If you look at Table 11 or Table 14, it points to the OGC domain (e.g., SensorML http://www.opengis.net/doc/IS/SensorML/2.0). For v1.1, I plan to use a similar approach, that is, a code list for the declaration of conformance classes. I don't see any issue, as some examples are already there.

As I mentioned, the content of the OGC standard and the ITU-T standard will be the same, except for (1) the requirement classes and conformance classes, and (2) the foreword, introduction, and summary (which are informative anyway).

@KathiSchleidt
Collaborator

KathiSchleidt commented Aug 5, 2019

Based on my analysis of the INSPIRE Download Service Requirements, I have the following wish-list for the server info page:

  1. STA Version Number: while we have this in the URL, it might be nice to also have this persisted in the info page
  2. Metadata: INSPIRE requires additional metadata, 2 options:
    1. via a link to a metadata service (CSW) providing service metadata,
    2. in the download service,
      Option 1 seems far better suited to not overloading STA, thus I would recommend provision of a CSW URL
      Example: “@metadata”: “http://md.example.org/csw?id=2134”
      Source: 2.2.1. Download Service Metadata parameter
  3. Language: while we only have one language, we should name this. If it’s only one, we’re fine only responding in this one language
    Example: “@lang”: “eng”
    Source: 2.2.1. Download Service Metadata parameter
    2.2.3. Languages parameter
    Two language parameters shall be provided:
    • the response Language parameter indicating the natural language used in the Get Download Service Metadata response parameters,
    • the Supported languages parameter containing the list of the natural languages supported by the Download Service.
      (not sure we can/should support both)
  4. CRS: in STA, we have the following options for providing features:
  • GeoJSON - which currently requires WGS84
  • WKT - allows anything
    Some information on which alternative has been utilized in the service would be valuable, together with additional CRS supported in WKT
    Example: "encodingType" : "application/vnd.geo+json"
    For WKT add: “crs”: “1234”
    Source: 2.2.4. Spatial Data Sets Metadata parameters
  5. Service Root URI page Descriptions:
    Add “title” and “description” to base information on Service Root URI page in addition to "name" and "URL"
    Source: 4.2.1. Describe Spatial Data Set response parameter
  6. Extended Properties Description:
    INSPIRE requires that a service can provide descriptions of the featureTypes served. While the base types in STA are defined by the standard, it would be valuable to provide additional information on what has been added to the properties
    Source: 8.2.1. Describe Spatial Object Type response parameter
    Note: could also be covered by JSON-LD Contexts
  7. SelfLink:
    To support INSPIRE get service metadata requirements (different from dataset metadata requirements with the suggested @metadata element above), we'd need a selflink
    Example: "@iot.selfLink" : "http://service.datacove.eu:8080/SensorThingsServer-1.0_BRGM/v1.0/”
    Source: 2.2.1. Download Service Metadata parameter
  8. Identifier (not sure we need this):
    Would it make sense to provide a unique identifier for a STA endpoint?
    It could be useful for the INSPIRE requirement of providing dataset identifiers (whatever a dataset is). It could also help in alignment should the URL of the STA service change.
    However, as it's required in most service requests, it's probably simpler to do dataset identification via the base STA Endpoint
    Example: “@iot.id”: “French Fairies”
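Purely as a sketch of how several of these wish-list items might surface together on the landing page (all property names and values here are illustrative, not standardized):

```json
{
  "@iot.selfLink": "http://example.org/v1.0/",
  "@metadata": "http://md.example.org/csw?id=2134",
  "@lang": "eng",
  "value": [
    {
      "name": "Things",
      "title": "Things",
      "description": "The Things entity set of this service",
      "url": "http://example.org/v1.0/Things"
    }
  ]
}
```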

@KathiSchleidt
Collaborator

  1. OpenAPI: add an OpenAPI doc under /api

@hylkevds
Contributor

I've updated PR #60 by grouping the data-model requirement classes into a new requirement class. Since every implementation has to implement all of them, grouping them greatly shortens the conformance-class list.

@hylkevds
Contributor

Fixed with the publication of Sensing v1.1.
