This repository has been archived by the owner on Nov 18, 2021. It is now read-only.

Eth2.0 Call 37 Agenda #141

Closed
djrtwo opened this issue Apr 6, 2020 · 11 comments

Comments

@djrtwo
Collaborator

djrtwo commented Apr 6, 2020

Eth2.0 Call 37 Agenda

Meeting Date/Time: Thursday 2020/4/9 at 14:00 GMT

Meeting Duration: 1.5 hours

YouTube Live Stream Link

  1. Testing and Release Updates
  2. Client Updates
  3. Testnets
  4. Research Updates
  5. Networking
  6. APIs
  7. Spec discussion
  8. Open Discussion/Closing Remarks
@skmgoldin

Hi Danny, I have one proposed item to add to the agenda.

The agenda item: Can we use a protobufs-based spec as the basis for a uniform REST API in Eth 2.0, and can we use Prysm's as a starting point?

The context: The Infura team has been considering how we are going to support the upcoming multi-client testnet, and of course we are planning for the eventual mainnet genesis as well. As Infura has extensively experienced the pains resulting from Eth 1.0's API not being standardized across client implementations, we want to help and be involved early in the API standardization process for Eth 2.0 so that users don't suffer from poor interoperability in the future.

For this reason, we will soon open an issue in the eth2.0-APIs repo proposing the community adopt Prysm’s protobufs-based spec as a starting point for a common Eth 2.0 API spec. As OpenAPI can be generated from protobufs we will not be advocating for all clients to adopt gRPC, merely that we use a protobufs-based spec as the basis for a uniform REST API.

We appreciate that non-spec and Prysm-specific elements of the protobufs spec will need to be excised, that documentation will need to be improved in the prysmaticlabs/ethereumapis repo to avoid issues pertaining to the GPL license on the Prysm client itself, and that the spec will generally need to be picked over, modified and improved to meet the needs of all client teams.

Infura has already begun dedicating internal resources to this effort, and we intend to produce an API conformance testing tool on the basis of the (hopefully generative) spec that will be adopted in the eth2.0-apis repo.
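To make the proposal concrete, here is a rough sketch of how a single protobuf definition with grpc-gateway annotations can drive both a gRPC method and a REST route, from which OpenAPI can then be generated. The service, message shapes and route below are invented for illustration; the actual prysmaticlabs/ethereumapis definitions differ.

```protobuf
syntax = "proto3";

package eth2.v1;

// grpc-gateway reads the google.api.http annotations to generate
// REST routes and OpenAPI (Swagger) documentation from the same
// definition that protoc uses for gRPC stubs.
import "google/api/annotations.proto";

// Hypothetical message shapes, for illustration only.
message GetBlockRequest {
  uint64 slot = 1;
}

message BeaconBlockResponse {
  bytes block_root = 1;
  uint64 slot = 2;
}

service BeaconChain {
  // One RPC definition yields both the gRPC method and, via the
  // annotation below, the REST route GET /v1/beacon/block.
  rpc GetBlock(GetBlockRequest) returns (BeaconBlockResponse) {
    option (google.api.http) = {
      get: "/v1/beacon/block"
    };
  }
}
```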

@prestonvanloon

@skmgoldin, with regards to licensing, prysmaticlabs/ethereumapis is Apache 2.0 and has no dependencies on Prysm or other GPL-licensed code. The only dependencies within the protobufs are github.com/grpc-ecosystem/grpc-gateway (BSD-3) for swagger and well-known protos like https://github.com/protocolbuffers/protobuf/blob/master/src/google/protobuf/timestamp.proto.

We've taken great care to avoid any Prysm-specific elements in this repository, but I understand there may be additional RPCs in our specifications that would not be mandatory for a minimum implementation, if such a definition is required.

For additional context, there have been previous open discussions:

Some other interesting links

@edsonayllon
Contributor

Hello. I know this is an Eth2 call, but I believe there is an overlap. The EIP Improvement Process group currently has a survey seeking feedback on how EIPs are currently selected for deployment. We are reaching out to stakeholders in the EIP process. Feedback from the Eth2 team would also be appreciated.

Survey link: https://docs.google.com/forms/d/e/1FAIpQLSeadXscoQgrKznUOAEB_jSzNNFKHWDEFJxKH1LpDsDsC6mXpw/viewform

@paulhauner
Contributor

Can we use a protobufs-based spec as the basis for a uniform REST API in Eth 2.0

I won't be on the call tonight but I have some thoughts on the topic.

If the one and only goal is to build an HTTP REST API with concise and polished OpenAPI documentation, then I would find it difficult to argue that defining it with protobuf and then machine-translating it into OpenAPI is the best way to go about it.

However, I assume that we don't have only one goal. There's another goal in here that is perhaps implied: ensure the API is compatible with gRPC.

So now we have two known goals:

  • Produce HTTP REST API with OpenAPI docs.
  • Ensure the API is compatible with gRPC.

The idea from @skmgoldin is to define one of these formats and then derive the other from it. This is sensible, but given we're talking about two disparate languages I think it's inevitable that the translation is lossy or restrictive. Seeing as these two goals conflict a little, I think it will help to prioritize. From what I can tell, @skmgoldin's goal is to define a "uniform REST API in Eth 2.0" whilst "not ... advocating for all clients to adopt gRPC". Therefore I'd have to assume the priorities are:

  1. Produce HTTP REST API with OpenAPI docs.
  2. Ensure the API is compatible with gRPC.

The current suggestion is to define protobuf and then derive OpenAPI. I've done some research into this and here's what I've found:

  1. In order to affect the OpenAPI docs, one will need to learn:
    • protobuf
    • Mapping between protobuf and JSON
    • Mapping between gRPC and HTTP
    • Mapping between protobuf comments and OpenAPI descriptions, attributes, etc. I struggled to find docs, but I think this is the source code.
  2. From what I can tell there are features in OpenAPI that we're not going to be able to hit via protobuf (I couldn't confirm this since I couldn't find docs).

Considering these findings, I think that (1) stands out the most; that's quite an onerous and tangential learning path if you want to modify a JSON HTTP spec and I think low-barrier-to-entry collaboration is crucial here. I couldn't get detail on (2) so I find it hard to judge the impact.

If there are downsides to converting from protobuf to OpenAPI, then perhaps we can go the other way: define OpenAPI and derive protobuf? This would seem to fit better with our priorities by shielding the highest-priority goal from the losses of conversion. Indeed this seems possible:

  • gnostic (googleapis): "converts JSON and YAML OpenAPI descriptions to and from equivalent Protocol Buffer representations."
  • openapi2proto (ny times): "will accept an OpenAPI/Swagger definition (yaml or JSON) and generate a Protobuf v3 schema and gRPC service definition from it."

Given that our primary goal is to produce a slick OpenAPI specification that's easy to collaborate on, I would say we stick with YAML. It's already the de facto human-readable serialization for Eth2 and it's the most expressive for the primary task at hand (slick HTTP API docs). Seeing as gRPC compatibility is something that is desired, add an openapi->protobuf converter to the repository's CI that fails whenever the OpenAPI doesn't map to protobuf. The end result is a maximally collaborative and expressive OpenAPI along with protobufs for those so inclined.
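A check like this could be wired into CI roughly as follows. This is a sketch in GitHub Actions syntax; the openapi2proto invocation follows the NYT openapi2proto README, and the spec filename is a placeholder, so both would need verifying against the actual repository and tool documentation.

```yaml
# Hypothetical CI job: fail a pull request when the OpenAPI spec
# can no longer be converted to protobuf.
name: openapi-to-protobuf-check
on: [pull_request]
jobs:
  convert:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
      - name: Install openapi2proto
        run: go get github.com/NYTimes/openapi2proto/cmd/openapi2proto
      - name: Derive protobuf from the OpenAPI spec
        # openapi2proto writes the derived schema to stdout; a
        # non-zero exit (a spec that cannot be converted) fails CI.
        run: openapi2proto -spec spec.yaml > derived.proto
```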

@AgeManning

I also agree with the points raised by @paulhauner.

The second point that was raised was:

use Prysm's (API) as a starting point

I've not looked into Prysm's API yet (so I cannot speak to how far it diverges from the current specification), but we (Lighthouse) have been working under the assumption that the current specification in the eth2.0-APIs repo was the one to follow. The Eth2 spec has shifted and we have adjusted our API slightly without updating that repo, so our API currently differs (minimally) from the one in the repository.

I would propose we keep the current specification as a base and iterate from there. I'm curious to know if other teams have also been implementing based on the current version of this API.

@arnetheduck

arnetheduck commented Apr 9, 2020

For background on the choice of OpenAPI, see ethereum/consensus-specs#1012 - the core idea being that we want to maximise compatibility with all kinds of existing internet infrastructure, for example by making requests for simple things like blocks cache-friendly.

Let's assume that we do not wish to revisit this choice at this point - this is what nimbus has been doing (we're in the process of building out the infrastructure needed to implement the eth2-api repo specification).

Going protobuf -> OpenAPI to design an OpenAPI specification seems backwards for all the reasons @paulhauner points out - producing a high-quality API is difficult enough that we will want to avoid additional tooling and constraints.

Regarding gRPC: if we want to introduce the additional requirement that the spec also comply with gRPC constraints, it would be useful to know what those constraints are and whether they are reasonable from an OpenAPI point of view. Every compatibility and conversion layer incurs a cost, so finding out what that cost is would be a first step - as would deciding where the cost should be borne (by not using the full OpenAPI potential, or by having a less natural gRPC interface).

Regarding using the Prysm API as a base: if Infura wishes to dedicate resources to conversion, it would indeed be useful to perform the conversion once and compare the outcome - also to identify flaws and weak spots in the API as published in the eth2.0-APIs repo. That would be a useful basis for discussion on how to iterate on these APIs and make them better.

@ericsson49

I've finally adapted my fork choice integration tests so that they match the existing eth2.0 spec test format (and also work with the new BLS signatures); the spec version is v0.10.1.
More info can be found here.

@djrtwo
Collaborator Author

djrtwo commented Apr 9, 2020

From my understanding, one of the blockers in using the eth2.0-apis repo for Prysm was the choice to use the 0x-prefixed hex string format for bytes (pubkeys, etc.), as opposed to something like base64. This choice was made to preserve standard readability of pubkeys, addrs, etc. as in Ethereum today, and to provide some sort of conformance in an eventual merge.

That said, this is not a hill worth dying on. The system-level eth2 APIs diverge from Eth1 in many other respects.

I'd like to address whether this is really the sticking point on Prysmatic's end, and if others have strong thoughts about the use of hex-string vs another encoding for bytes.
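To illustrate the trade-off, here is a quick sketch comparing the two encodings for a 48-byte BLS pubkey; the key bytes below are made up, and the point is only the size and readability difference.

```python
import base64

# A made-up 48-byte BLS public key, for illustration only.
pubkey = bytes(range(48))

# Eth1-style 0x-prefixed hex: 2 characters per byte, plus the prefix.
hex_repr = "0x" + pubkey.hex()

# Base64 (the proto3 JSON mapping's default for `bytes` fields):
# roughly 4 characters per 3 bytes, so shorter but less familiar
# to Eth1 users and tooling.
b64_repr = base64.b64encode(pubkey).decode("ascii")

print(len(hex_repr))  # 98 characters
print(len(b64_repr))  # 64 characters
```

Hex is about 50% longer, but matches what Eth1 users already see in block explorers and wallets.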

@benjaminion
Contributor

Quick notes from the call - with actions! 😀

@paulhauner
Contributor

if others have strong thoughts about the use of hex-string vs another encoding for bytes.

No strong feelings. Pubkeys and hashes in 0x... seems natural to me since that's what we see in Eth1 and hashing tools like sha256sum and md5sum. Also not willing to die (or even suffer minor injuries) on this hill.

@poojaranjan
Contributor

Ethereum 2.0 Implementers Call 37 Notes

@djrtwo djrtwo closed this as completed Apr 14, 2020