feat(rest)!: update AFJ to 0.4.0 #222
Conversation
Branch updated from 49d01de to 53378d9.
This is great @mattdean-digicatapult! Regarding the instability problems in your tests: they might be related to the 'classic' performance problem we have with the askar/anoncreds-rs/indy-vdr wrappers on NodeJS, where the only solution we have found (up to now) is to use a patched version of the ref-napi package. See https://aries.js.org/guides/0.4/getting-started/set-up/aries-askar So it could be useful to restrict, in …
Branch updated from f88fa0a to 07a4b72.
Thanks for the pointer @genaris. I'll be honest, I hadn't followed the docs precisely because this package is marked as supporting Node 12+. Having tried what you suggested locally, it helps ... somewhat. The tests now pass most of the time, but occasionally the same error will still occur. Honestly, my immediate thought is that there's some sort of race condition occurring under the hood that prevents shutdown from always succeeding. Looking at the logs when the error occurs, there does seem to be some DIDComm activity taking place after shutdown is called, and this is then somehow causing the test failure. This is pretty much a guess at this stage. One question: is the change in https://aries.js.org/guides/0.4/getting-started/set-up/aries-askar something you're happy for me to make? It would mean changing the top-level package in this repository to only support Node 18+.
IMHO it will not be a problem to do so, given that we are already planning to drop support for Node 16 in the AFJ main repository. We are also discussing updating dependencies in …
Amazing job @mattdean-digicatapult! Very clean and useful. I left some comments regarding agent configuration to support all the features we previously had (in terms of protocols and credential formats). There will be room for improvements to make the REST Agent more configurable (for instance adding VDRs other than Indy, or credential formats other than AnonCreds), but I think we can start with this update and then think about that.
An additional point that would make this update match the features of the previous release is DID creation and import: as you've seen, we don't have a public DID setting anymore, so we'll need to either register or import an existing DID in order to be able to sign Indy ledger write transactions. So my suggestion would be to update DidController to have an interface like the following (a rough sketch is included after the list):

- `/dids/resolve/{did}`: resolve a DID (same as the current `/dids/{did}`)
- `/dids/create`: create, register and store a new DID (take parameters from `agent.dids.create`)
- `/dids/import`: import an existing DID (this is the typical case that replaces the current `publicDidSeed` functionality)

There could be some other methods (like `update`) but I think this would be enough and quite straightforward to implement.
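A rough sketch of what such a controller could look like, assuming the tsoa decorators already used in this package and the `agent.dids` API from AFJ 0.4.x (`DidCreateOptions`/`ImportDidOptions` as the body types and the exact response shapes are assumptions, not the final design):

```ts
import { Body, Controller, Get, Path, Post, Route, Tags } from 'tsoa'
import { injectable } from 'tsyringe'
import { Agent } from '@aries-framework/core'
import type { DidCreateOptions, ImportDidOptions } from '@aries-framework/core'

@Tags('Dids')
@Route('/dids')
@injectable()
export class DidController extends Controller {
  public constructor(private agent: Agent) {
    super()
  }

  // Same as the current /dids/{did}, moved under /resolve
  @Get('/resolve/{did}')
  public async resolveDid(@Path('did') did: string) {
    return this.agent.dids.resolve(did)
  }

  // Create, register and store a new DID; parameters mirror agent.dids.create
  @Post('/create')
  public async createDid(@Body() options: DidCreateOptions) {
    return this.agent.dids.create(options)
  }

  // Import an existing DID (replaces the old publicDidSeed behaviour)
  @Post('/import')
  public async importDid(@Body() options: ImportDidOptions) {
    await this.agent.dids.import(options)
    return this.agent.dids.resolve(options.did)
  }
}
```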
Pretty happy to see this PR. Hopefully we can merge it soon! I promise to give you feedback more quickly than before 😄
packages/rest/src/cli.ts

@@ -21,23 +21,25 @@ const parsed = yargs
  })
  .option('indy-ledger', {
    array: true,
    // TODO: this default is invalid, fixme
The problem here is that you cannot pass an empty array to `IndyVdrModule`'s network object, right? I think you should instantiate `IndyVdrModule` only when it has at least one network; otherwise it is not meaningful to have such a module (as it will not be able to resolve or register any did/anoncreds object).

So my suggestion here is to first prepare the agent modules and then instantiate the Agent, passing them in the constructor parameters (like we did in the AFJ demo or tests with `getXXAgentModules()`).
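A minimal sketch of that approach, assuming the AFJ 0.4.x `IndyVdrModule` and `Agent` constructor options (the `getAgentModules` helper name, the example config values and the commented-out module list are assumptions):

```ts
import { Agent } from '@aries-framework/core'
import type { InitConfig } from '@aries-framework/core'
import { agentDependencies } from '@aries-framework/node'
import { IndyVdrModule } from '@aries-framework/indy-vdr'
import type { IndyVdrPoolConfig } from '@aries-framework/indy-vdr'
import { indyVdr } from '@hyperledger/indy-vdr-nodejs'

// Only include IndyVdrModule when at least one ledger network was configured,
// instead of passing an empty (invalid) networks array.
function getAgentModules(networks: IndyVdrPoolConfig[]) {
  return {
    ...(networks.length > 0
      ? { indyVdr: new IndyVdrModule({ indyVdr, networks: networks as [IndyVdrPoolConfig, ...IndyVdrPoolConfig[]] }) }
      : {}),
    // ...askar, anoncreds, credentials and proofs modules would be added here as well
  }
}

const config: InitConfig = { label: 'rest-agent', walletConfig: { id: 'rest', key: 'rest-key' } }
const agent = new Agent({ config, dependencies: agentDependencies, modules: getAgentModules([]) })
```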
packages/rest/src/cliAgent.ts

import type { InitConfig } from '@aries-framework/core'
import type { WalletConfig } from '@aries-framework/core/build/types'
Suggested change:
- import type { InitConfig } from '@aries-framework/core'
- import type { WalletConfig } from '@aries-framework/core/build/types'
+ import type { InitConfig, WalletConfig } from '@aries-framework/core'
packages/rest/src/cliAgent.ts

proofs: new ProofsModule({
  autoAcceptProofs,
}),
credentials: new CredentialsModule({
  autoAcceptCredentials,
}),
As you may have noticed, the Proofs and Credentials modules only support V2 protocols by default. And they don't set any specific credential/proof format, meaning that they are not actually usable.

I think it would be better to create a configuration where the agent supports both the V1 and V2 protocols for each, using legacy Indy credentials and AnonCreds. Look at `getAskarAnonCredsIndyModules` from the AFJ demo for a good example of what I'm referring to.
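For reference, a sketch of that kind of configuration, modelled loosely on `getAskarAnonCredsIndyModules` and assuming the AFJ 0.4.x protocol/format classes (treat the exact option names as assumptions to verify against the demo):

```ts
import { CredentialsModule, ProofsModule, V2CredentialProtocol, V2ProofProtocol } from '@aries-framework/core'
import {
  AnonCredsCredentialFormatService,
  AnonCredsProofFormatService,
  LegacyIndyCredentialFormatService,
  LegacyIndyProofFormatService,
  V1CredentialProtocol,
  V1ProofProtocol,
} from '@aries-framework/anoncreds'

// Shared format service instances, reused by both protocol versions
const legacyIndyCredentialFormat = new LegacyIndyCredentialFormatService()
const legacyIndyProofFormat = new LegacyIndyProofFormatService()

const modules = {
  credentials: new CredentialsModule({
    credentialProtocols: [
      new V1CredentialProtocol({ indyCredentialFormat: legacyIndyCredentialFormat }),
      new V2CredentialProtocol({
        credentialFormats: [legacyIndyCredentialFormat, new AnonCredsCredentialFormatService()],
      }),
    ],
  }),
  proofs: new ProofsModule({
    proofProtocols: [
      new V1ProofProtocol({ indyProofFormat: legacyIndyProofFormat }),
      new V2ProofProtocol({
        proofFormats: [legacyIndyProofFormat, new AnonCredsProofFormatService()],
      }),
    ],
  }),
}
```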
> As you may have noticed, the Proofs and Credentials modules only support V2 protocols by default. And they don't set any specific credential/proof format, meaning that they are not actually usable.

We should change this...
const proof = await this.agent.proofs.proposeProof(connectionId, presentationPreview, proposalOptions)
const proof = await this.agent.proofs.proposeProof({
  connectionId,
  protocolVersion: 'v2',
Here we are hard-coding v2 because this agent setup supports only that protocol. I think `RequestProposalOptions` could be extended to accept other parameters, like `protocolVersion` and format (indy / anoncreds).
const proof = await this.agent.proofs.acceptProposal(proofRecordId, proposal)
const proof = await this.agent.proofs.acceptProposal({
  proofRecordId,
  proofFormats: {
Here, depending on the format, it will use indy or anoncreds.
const proof = await this.agent.proofs.requestProof(connectionId, proofRequestOptions, config)
const proof = await this.agent.proofs.requestProof({
  connectionId,
  protocolVersion: 'v2',
Same as above regarding `protocolVersion`.
packages/rest/src/utils/agent.ts

autoAcceptProofs: AutoAcceptProof.ContentApproved,
}),
credentials: new CredentialsModule({
  autoAcceptCredentials: AutoAcceptCredential.ContentApproved,
The same comments about module config apply to this default agent.
This is great @mattdean-digicatapult!! Agree with @genaris on extending the interface to allow for more options. In general I think we should keep the interface of the REST API calls as close as possible to the API classes' methods and parameters, so there is consistency in features, options, and how you use the APIs. So e.g. if a call to createOffer is an object, with …

You are correct that this is now handled by the oob functionality. Basically what is possible now:

Yeah, we probably need to add some configuration options and move some things around.
import { Body, Controller, Delete, Example, Get, Path, Post, Query, Res, Route, Tags, TsoaResponse } from 'tsoa'
import { injectable } from 'tsyringe'

import { ProofRecordExample, RecordId } from '../examples'
import { RequestProofOptions, RequestProofProposalOptions } from '../types'

const maybeMapValues = <V, U>(
It would be good to move this to a utils directory.
  ...transformRequiredAttributes(restriction.requiredAttributes),
  ...transformRequiredAttributeValues(restriction.requiredAttributeValues),
})

@Tags('Proofs')
@Route('/proofs')
@injectable()
export class ProofController extends Controller {
As mentioned in a separate comment, I think we need to align the API with the new interface from AFJ for these modules, especially the oob, connections, credentials and proofs modules.
    proofRecord: proof.proofRecord,
  }
}
// TODO: Represent out-of-band proof
This can be just `create-request`, and then return an object with the proofRecord and the requestMessage (following the API of `createRequest` on the proofs module). Then the user can call the oob API with the message.
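A sketch of that flow, assuming the AFJ 0.4.x `proofs.createRequest` and `oob.createInvitation` APIs (the helper name, the example proof request payload, the loose agent typing and the domain are assumptions):

```ts
import type { Agent } from '@aries-framework/core'

// Hypothetical helper: create an out-of-band proof request and return both
// the proof record and a shareable invitation URL.
// Typed loosely here; in the package this would be the module-typed RestAgent.
async function createOutOfBandProofRequest(agent: Agent<any>) {
  const { proofRecord, message } = await agent.proofs.createRequest({
    protocolVersion: 'v2',
    proofFormats: {
      anoncreds: { name: 'proof-request', version: '1.0', requested_attributes: {} },
    },
  })

  // Wrap the request message in an out-of-band invitation the user can share
  const { outOfBandInvitation } = await agent.oob.createInvitation({ messages: [message] })

  return { proofRecord, invitationUrl: outOfBandInvitation.toUrl({ domain: 'https://example.org' }) }
}
```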
type CredentialProtocols = [V2CredentialProtocol]
type CredentialFormats = [CredentialFormat]
We probably also want to support the v1 protocol + the legacy indy / anoncreds formats.
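For example, the protocol tuple could look something like this (assuming the AFJ 0.4.x class names and that `V2CredentialProtocol` is generic over its format services; whether both formats are registered on V2 is a design choice):

```ts
import { V2CredentialProtocol } from '@aries-framework/core'
import {
  AnonCredsCredentialFormatService,
  LegacyIndyCredentialFormatService,
  V1CredentialProtocol,
} from '@aries-framework/anoncreds'

// V1 for the legacy indy protocol, V2 carrying both the legacy indy and anoncreds formats
type CredentialProtocols = [
  V1CredentialProtocol,
  V2CredentialProtocol<[LegacyIndyCredentialFormatService, AnonCredsCredentialFormatService]>,
]
```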
@@ -175,20 +172,55 @@ export interface ConnectionInvitationSchema {
  imageUrl?: string
}

export interface RequestProofOptions extends ProofRequestConfig {
export interface RequestProofOptionsProofRequestRestriction {
I think we should match this with the API as we have in AFJ, and not create custom properties here.
The API in AFJ is:
{
  connectionId: string
  protocolVersion: string // 'v1' or 'v2'
  proofFormats: {
    <dynamic interface>
  }
}
You can use the whole `RequestProofOptions` type from AFJ to get the correct types and input interface.
You can define the proof protocols type once like this:
type ProofProtocols = [V1ProofProtocol, V2ProofProtocol<[LegacyIndyProofFormatService, AnonCredsProofFormatService]>]
and then in the REST API calls you can use it as the type:
RequestProofOptions<ProofProtocols>
One thing you may run into is that the generated swagger is invalid, and this is due to some bugs in tsoa: https://github.com/lukeautry/tsoa/issues?q=is%3Aopen+is%3Aissue+author%3A%40me+sort%3Aupdated-desc
Maybe this is resolved by updating to the latest version, but if it's still the case, you need to define the top-level object yourself, but can still infer the format types:
See this as an example: https://github.com/hyperledger/aries-framework-javascript-ext/blob/main/packages/rest/src/controllers/types.ts#L125
For the request proof interface this would look something like:
// Define once, can be reused
type ProofFormats = [LegacyIndyProofFormat, AnonCredsProofFormat]

interface RestRequestProofOptions {
  connectionId: string
  autoAcceptProof?: AutoAcceptProof
  comment?: string
  proofFormats: ProofFormatPayload<ProofFormats, 'createRequest'>
}
It may seem a bit complex, but it helps in keeping the API of the rest package consistent with what we have in the JS API.
Let me know if you have any questions
In general, the credentials controller is a good example to look at, as it was already updated to match the new API from 0.3.x, which overhauled the credentials API but not yet the proofs API.
I understand the reasoning here, but there are some problems with `attr::${string}::marker` and `attr::${string}::value` specifically. Notably, tsoa does not deal well with types containing patterned keys like:

type Foo = {
  [key: `attr::${string}::marker`]: '1' | '0'
  [key: `attr::${string}::value`]: string,
}

Including types like this in a body will result in tsoa throwing an internal `GenerateMetadataError` when we build the spec and routes. (I'll see if I can find the issue that discussed this, but I can't find it to hand.)
This essentially leaves us with two choices:
1. match the API using just a `{ [key: string]: unknown }` and then parse out and validate the `marker` and `value` entries by hand
2. introduce new properties that represent these concepts more neatly within tsoa

I understand the value of consistency, but choosing option 1 will lead to a poorly documented OpenAPI type and potentially buggy, error-prone parsing logic being done by hand, which feels messy to say the least.
I don't really like the idea of the API being dictated by an issue with tooling, but in this instance I felt it was the lesser evil
Thoughts?
I agree with you here. Maybe we can use the structure of the class in this case? So it will look something like:
restrictions: [
  {
    attributeValues: {
      test_prop: 'test_value',
      test_prop2: 'test_value2',
    },
    attributeMarkers: {
      test_prop: true,
      test_prop2: true,
    },
  },
],
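A tsoa-friendly interface for that shape could look roughly like the following (the property names beyond `attributeValues`/`attributeMarkers` are assumptions based on the usual Indy restriction fields):

```ts
// Sketch of a restriction schema that avoids patterned keys entirely
interface ProofRequestRestrictionSchema {
  schema_id?: string
  schema_issuer_did?: string
  schema_name?: string
  schema_version?: string
  issuer_did?: string
  cred_def_id?: string
  // Replaces the attr::<name>::value patterned keys
  attributeValues?: Record<string, string>
  // Replaces the attr::<name>::marker patterned keys
  attributeMarkers?: Record<string, boolean>
}
```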
Thanks for the comments @genaris @TimoGlastra, going to try and have a look at this over the next couple of days.
Also switches the rest package to use the indy-vdr and anoncreds packages. Tests are currently passing, but 2 suites fail due to a cleanup issue that still needs to be resolved. There are also several TODO items that should be reviewed.

Signed-off-by: Matthew Dean <matthew.dean@digicatapult.org.uk>
1. refactored module instantiation to a helper function
2. configured correct types to support v1 and v2 credential and proof protocols and correct formats
3. proof controller updated to take parameters like agent
4. re-enabled oob proof api as create-request

Signed-off-by: Matthew Dean <matthew.dean@digicatapult.org.uk>
Branch updated from b6d7bc4 to 039948d.
I think I've covered the review comments now, if somebody would be able to have a look. On the new Did routes in particular there is a bucket tonne of opinionation.
Ports changes from openwallet-foundation/credo-ts-ext#222 to this repo. This includes commits:

* 039948d9849f0082bb057eb1885b35595b098233
* 2e2638903f5510acd559026057140819bf74c54f
…26)

* Synchronise updates from openwallet-foundation/credo-ts-ext#222

  Ports changes from openwallet-foundation/credo-ts-ext#222 to this repo. This includes commits:

  * 039948d9849f0082bb057eb1885b35595b098233
  * 2e2638903f5510acd559026057140819bf74c54f

* Fix did import
* Fix routes
* bump version
@TimoGlastra Was wondering if you had a chance to review Matt's subsequent changes? Thanks
Awesome work! Some small, non-blocking issues, but overall it looks good!
"@hyperledger/anoncreds-nodejs": "^0.1.0", | ||
"@hyperledger/anoncreds-shared": "^0.1.0", | ||
"@hyperledger/aries-askar-nodejs": "^0.1.0", | ||
"@hyperledger/aries-askar-shared": "^0.1.0", |
The shared packages are re-exported by the `nodejs` and `react-native` ones, so these can be omitted from the dependencies.
const { schemaState } = await this.agent.modules.anoncreds.registerSchema({
  schema: {
    issuerId: schema.issuerId,
Suggested change:
- const { schemaState } = await this.agent.modules.anoncreds.registerSchema({
-   schema: {
-     issuerId: schema.issuerId,
+ const { schemaState } = await this.agent.modules.anoncreds.registerSchema({ schema, ... })

Something like that would work, right?
id: '821f9b26-ad04-4f56-89b6-e2ef9c72b36e',
createdAt: new Date('2022-01-01T00:00:00.000Z'),
protocolVersion: 'v2',
state: 'proposal-sent' as ProofState,
Enum can be used here.
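For instance, assuming the `ProofState` enum from `@aries-framework/core` (the constant name here is just illustrative):

```ts
import { ProofState } from '@aries-framework/core'

// Same example record, using the enum member instead of a string cast
const proofRecordExample = {
  id: '821f9b26-ad04-4f56-89b6-e2ef9c72b36e',
  createdAt: new Date('2022-01-01T00:00:00.000Z'),
  protocolVersion: 'v2',
  state: ProofState.ProposalSent,
}
```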
  return undefined
}

return Object.entries(attributes).reduce<{ [key in `attr::${string}::marker`]: '1' | '0' }>(
Might be mistaken, but don't we have this logic within AFJ itself? Or is there a minor difference?
@@ -1,33 +1,43 @@
import type {
Maybe out-of-scope for now, but I think it would be better to define these types where they are being used. A separate `types` dir can always be a bit confusing and it will never capture all the types defined in the package. We can leave it right now, but it is worthwhile to clean this up another time.
Also, looking at these types a bit, we might be able to create some of them based on the agent modules input types. Would be a fun generic to write and something that we can fix later.
export interface RestAgentModules extends ModulesMap {
  connections: ConnectionsModule
  proofs: ProofsModule<[V1ProofProtocol, V2ProofProtocol<[LegacyIndyProofFormatService, AnonCredsProofFormatService]>]>
  credentials: CredentialsModule<[V1CredentialProtocol, V2CredentialProtocol]>
  anoncreds: AnonCredsModule
}

export type RestAgent<
  modules extends RestAgentModules = {
    connections: ConnectionsModule
    proofs: ProofsModule<
      [V1ProofProtocol, V2ProofProtocol<[LegacyIndyProofFormatService, AnonCredsProofFormatService]>]
    >
    credentials: CredentialsModule<[V1CredentialProtocol, V2CredentialProtocol]>
    anoncreds: AnonCredsModule
  }
> = Agent<modules>
Can we not infer these types from the modules? If not, this is perfectly fine, but if we can do something like `type RestAgent = Agent<ReturnType<typeof getAgentModules>>` (I think) that would be better.
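A sketch of that inference, assuming a `getAgentModules()` helper exists in the package (the import path is a guess, and this assumes the helper is synchronous):

```ts
import { Agent } from '@aries-framework/core'
// Hypothetical location of the module factory added in this PR
import { getAgentModules } from '../cliAgent'

// Infer the module map from the factory's return type instead of repeating the generics
export type RestAgent = Agent<ReturnType<typeof getAgentModules>>
```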
obj?: {
  [key: string]: V
}
Suggested change:
- obj?: {
-   [key: string]: V
- }
+ obj?: Record<string, V>
Might also be good to define the return type as `Record<string, U> | undefined`.
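Putting both suggestions together, `maybeMapValues` could look roughly like this (the parameter order and body are assumptions based on the surrounding diff):

```ts
// Map over an optional record, preserving undefined
const maybeMapValues = <V, U>(
  transform: (input: V) => U,
  obj?: Record<string, V>
): Record<string, U> | undefined => {
  if (obj === undefined) return undefined

  return Object.fromEntries(Object.entries(obj).map(([key, value]) => [key, transform(value)]))
}
```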
@@ -194,7 +190,7 @@ describe('CredentialController', () => {
      ],
    },
  },
  autoAcceptCredential: 'always',
  autoAcceptCredential: 'always' as AutoAcceptCredential,
Enum can be used here.
Codecov Report
| | main | #222 | +/- |
|---|---|---|---|
| Coverage | 86.78% | 59.18% | -27.61% |
| Files | 24 | 2 | -22 |
| Lines | 1241 | 49 | -1192 |
| Branches | 270 | 8 | -262 |
| Hits | 1077 | 29 | -1048 |
| Misses | 164 | 20 | -144 |
Apologies for the delay in finishing this off. I've been pulled onto other projects but will hopefully find some time in the next couple of weeks.
Superseded by #256
resolves #215
Also switches the rest package to use the `indy-vdr`, `askar` and `anoncreds` package set, removing `indy-sdk`.
So this is a fairly opinionated upgrade with a fair number of breaking changes against the original REST API. The majority of the potentially controversial changes are to the `SchemaController` and the `CredentialDefinitionController` components, as the error handling methodology in the `anoncreds` module is very different to that in `indy-sdk`. There are also some pretty substantial changes to the `ProofController`.

I'm now at the point where I could use an experienced set of eyes to determine whether what I've done is sensible. This isn't quite ready, but I feel it's close. Any help/advice from existing maintainers would be greatly appreciated.
A few TODO items still remain to be resolved, notably:

- `oob` functionality, but it's unclear how to replicate this or if it's even possible
- `attr::${string}::value` that I cannot reproduce using TSOA. I'm not sure how or if these should be represented.

Next, whilst the tests are currently passing, 2 suites fail due to a cleanup issue that still needs to be resolved. Essentially, when two of the test suites shut down I'm getting an error I'm struggling to debug around the Askar wallet being closed:
Honestly the entire time I've worked on this the tests have felt a bit fragile, partly because of concurrent testing causing collisions in open wallets. I've done my best to fix some of these issues, but the above failure seems intermittent so I'm not convinced I was entirely successful.
Finally, I should point out that up to now I've been relying almost exclusively on the tests here to determine if things are working. I don't know if there are any other things I should be checking, but if there are, some guidance would be very helpful.