Add reso versioning #182


Merged 4 commits into master from add-reso-versioning on Nov 7, 2022

Conversation

@codygustafson (Contributor) commented Nov 4, 2022

We should be able to alter the version of the RESO Web API we are
hitting. This change should allow for that using the already existing
`version` configuration option. That option was initially spark-only
since the RESO Web API did not exist yet.
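
For context, a minimal usage sketch of what this change enables. The configure block follows the gem's existing configuration style; the option names other than `version` and `middleware`, and the values shown, are assumptions for illustration rather than part of this PR:

SparkApi.configure do |config|
  config.api_key    = "your_key"      # placeholder credentials
  config.api_secret = "your_secret"
  config.middleware = :reso_api       # send requests down the RESO Web API code path
  config.version    = "2"             # with this change, also applied to RESO requests (/Version/2/...)
end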
@codygustafson requested a review from dgmyrek on November 4, 2022 22:11
@dgmyrek (Contributor) left a comment

I couldn't get this to work in testing. The URI seemed to be locked to the default (no /Version/x in the path) regardless of whether I set SparkApi.client.version to the integer 2 or the string "2".
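
A sketch of the reproduction described above; only SparkApi.client.version comes from the comment, the rest is assumed for illustration:

client = SparkApi.client       # assumes a client configured with the :reso_api middleware
client.version = "2"           # also tried the integer 2
client.get("/Property")        # observed request path still lacked the /Version/2 segment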

@@ -83,7 +83,7 @@ def request(method, path, body, options)
       request_opts.merge!(options)
       request_path = if middleware && middleware.to_sym == :reso_api
         dd_version = "Dictionary/#{dictionary_version}/" unless dictionary_version.nil?
-        "/Reso/#{dd_version}OData#{path}"
+        "/Version/#{version}/Reso/#{dd_version}OData#{path}"
@dgmyrek (Contributor) commented on the diff

Does setting the :reso_api flag set a different default version than the Spark API's "v1"?

I'm wondering if the default value of this variable needs to be updated on the fly to be compatible with this new RESO API code path:

DEFAULT_VERSION = 'v1'

@codygustafson (Contributor, Author) replied Nov 7, 2022

I agree, @dgmyrek. The way this was built, that parameter was taken into account under all circumstances. Instead of having a computed default, I opted to ignore this parameter unless it has been changed from the default.
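
A minimal sketch of the guard described above, reusing the existing DEFAULT_VERSION constant; this illustrates the described behavior and is not the exact code merged in this PR:

# Only add the /Version/x segment when the configured version differs from the default.
reso_version = "Version/#{version}/" unless version.nil? || version.to_s == DEFAULT_VERSION
request_path = "/#{reso_version}Reso/#{dd_version}OData#{path}"
# version = DEFAULT_VERSION ("v1") => /Reso/Dictionary/1.7/OData/Property
# version = "2"                    => /Version/2/Reso/Dictionary/1.7/OData/Property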

@codygustafson merged commit 097d805 into master Nov 7, 2022
@codygustafson deleted the add-reso-versioning branch November 7, 2022 18:02