Add tokenMap props and convertTokenData utility #1397
Conversation
// Stop either when we encounter a "value" prop or if we find that every prop is not an object, meaning we cannot traverse any further
if (isPlainObject(slice[key]) && Object.hasOwn(slice[key], `${usesDtcg ? '$' : ''}value`)) {
  result.push({
    .../** @type {T} */ (slice[key]),
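For context, here is a minimal sketch of the kind of traversal this hunk sits in. The `isPlainObject` helper, the shape of `key`, and the function name are assumptions for illustration, not the actual implementation:

```js
// Illustrative flatten-style traversal (not the real code).
const isPlainObject = (v) => typeof v === 'object' && v !== null && !Array.isArray(v);

function flatten(slice, usesDtcg = false, result = []) {
  for (const key of Object.keys(slice)) {
    const valueProp = `${usesDtcg ? '$' : ''}value`;
    if (isPlainObject(slice[key]) && Object.hasOwn(slice[key], valueProp)) {
      // Shallow copy: top-level props are detached from the input tree,
      // but nested objects (attributes, object values) are still shared.
      result.push({ ...slice[key], key });
    } else if (isPlainObject(slice[key])) {
      // Not a token yet, keep recursing into the group.
      flatten(slice[key], usesDtcg, result);
    }
  }
  return result;
}
```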
@dbanksdesign this change introduces a top-level token copy, which can break some use cases. convertTokenData processes tokens the same way.
This could be a breaking change after version 4.2.0.
to_ret.push(/** @type {Token} */ (slice[name]));
is equivalent to
result.push({ .../** @type {T} */ (slice[key]) })
given that the token is an object type. All we added is a key property to it.
I'm not sure why that would be a breaking change; could you elaborate or show an example?
It's not the same; the second is equivalent to Object.assign({}, slice[key]).
It's not a breaking change for me, but it may be for someone else.
For example:
for (const token of flattenTokens(dictionary.tokens)) {
  // ...some modification of token
}
In 4.2 this will affect the real tokens in the dictionary.
In 4.3 it will ignore changes to top-level properties, for example token.value.
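To illustrate the difference being described, here is a hedged sketch; the token shape and path are made up for the example, and flattenTokens is used as in the snippet above:

```js
const dictionary = {
  tokens: {
    color: {
      red: { value: '#f00', attributes: { category: 'color' } },
    },
  },
};

for (const token of flattenTokens(dictionary.tokens)) {
  token.value = '#ff0000';          // top-level property
  token.attributes.modified = true; // nested property
}

// 4.2: flattenTokens returned references to the original token objects,
//      so both writes are visible on dictionary.tokens.color.red.
// 4.3: each flattened token is a shallow copy ({ ...token }), so the
//      write to `value` is lost, while the nested `attributes` write
//      still leaks through because nested objects remain shared.
```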
Hmm, I see. The flattenTokens util should return a value that isn't coupled to the input value imo, so mutating the result should not mutate the input; this coupling is unintentional and bad practice if you ask me.
I would consider this an unintentional bugfix then; while people may rely on the "bug" (unintended behavior) in their code, that doesn't make the bugfix warrant a major bump.
I understand that. I fixed it in my pipeline.
But to make the function totally pure you should use a full (deep) clone instead of just a shallow clone, because the current behavior still allows updating subproperties of a token, e.g. attributes, properties of object values, etc.
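A small sketch of why a shallow copy is not enough for full purity (the token shape here is illustrative):

```js
const original = { value: '#f00', attributes: { category: 'color' } };

const shallow = { ...original };         // what the shallow copy does
shallow.value = '#ff0000';               // does NOT touch `original`
shallow.attributes.category = 'changed'; // DOES touch `original.attributes`

const deep = structuredClone(original);  // full clone
deep.attributes.category = 'changed';    // fully isolated from `original`
```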
True, I think in a lot of places I'm already using structuredClone(),
but we should verify that every function doing some kind of token handling is pure. https://github.com/amzn/style-dictionary/blob/main/lib/utils/flattenTokens.js#L62 is a good example where that's not yet being done.
Created an issue for this: #1412
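As a sketch, one way the util could be made fully pure, assuming flattenTokens returns an array of token objects (this is not the actual fix tracked in #1412):

```js
function flattenTokensPure(tokens) {
  // Deep-clone each flattened token so no references into the input survive.
  return flattenTokens(tokens).map((token) => structuredClone(token));
}
```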