feat: validate token name localizations #2468
Conversation
Walkthrough

This change introduces localization support and validation for token configurations in the data contract module. A new accessor module (with a …

Changes
Actionable comments posted: 0
🧹 Nitpick comments (4)
packages/rs-dpp/src/data_contract/associated_token/token_configuration_convention/accessors/v0/mod.rs (1)
1-7: LGTM! Consider enhancing the documentation.

The trait is well-designed with clear method signatures and consistent naming patterns. The methods appropriately handle both singular and plural forms with default language fallback.
Consider adding documentation to clarify:
- What is the default language code?
- What happens when the requested language code is not found?
```diff
 /// Accessor trait for getters of `TokenConfigurationConventionV0`
 pub trait TokenConfigurationConventionV0Getters {
-    /// Returns the localized token name in singular form
+    /// Returns the localized token name in singular form.
+    /// If the requested language code is not found, falls back to English ('en').
     fn singular_form_by_language_code_or_default(&self, language_code: &str) -> &str;
-    /// Returns the localized token name in plural form
+    /// Returns the localized token name in plural form.
+    /// If the requested language code is not found, falls back to English ('en').
     fn plural_form_by_language_code_or_default(&self, language_code: &str) -> &str;
 }
```

packages/rs-dpp/src/errors/consensus/basic/token/missing_default_localization.rs (1)
1-24: LGTM! Consider adding documentation.

The error type is well-designed, with appropriate trait implementations and a clear error message.
Consider adding documentation to explain the error's purpose and when it's raised:
```diff
+/// Error raised when a token configuration is missing the required English ('en') localization.
+/// This error occurs during validation of token configurations to ensure that a default localization
+/// is always available.
 #[derive(
     Error, Debug, Clone, PartialEq, Eq, Encode, Decode, PlatformSerialize, PlatformDeserialize,
 )]
 #[error("Missing english ('en') localization which is using by default")]
 #[platform_serialize(unversioned)]
 pub struct MissingDefaultLocalizationError {}
```

packages/rs-dpp/src/data_contract/associated_token/token_configuration_convention/methods/validate_localizations/mod.rs (1)
9-26: LGTM! Consider adding documentation.

The method is well-structured with proper version handling and error management.
Consider adding documentation to explain the method's purpose and return values:
```diff
+    /// Validates the localizations of a token configuration based on the platform version.
+    ///
+    /// # Arguments
+    /// * `platform_version` - The platform version to determine which validation rules to apply
+    ///
+    /// # Returns
+    /// * `Ok(SimpleConsensusValidationResult)` - The validation result
+    /// * `Err(ProtocolError)` - If the platform version is not supported
     pub fn validate_localizations(
         &self,
         platform_version: &PlatformVersion,
     ) -> Result<SimpleConsensusValidationResult, ProtocolError> {
```

packages/rs-dpp/src/data_contract/associated_token/token_configuration_convention/methods/validate_localizations/v0/mod.rs (1)
5-22: Consider enhancing the error message.

The implementation correctly validates the presence of English localization. However, consider adding a more descriptive error message to help users understand why English localization is required.
```diff
-    MissingDefaultLocalizationError::new().into(),
+    MissingDefaultLocalizationError::new()
+        .with_message("English (en) localization is required as the default language")
+        .into(),
```
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (15)
- packages/rs-dpp/src/data_contract/associated_token/token_configuration_convention/accessors/mod.rs (1 hunks)
- packages/rs-dpp/src/data_contract/associated_token/token_configuration_convention/accessors/v0/mod.rs (1 hunks)
- packages/rs-dpp/src/data_contract/associated_token/token_configuration_convention/methods/mod.rs (1 hunks)
- packages/rs-dpp/src/data_contract/associated_token/token_configuration_convention/methods/validate_localizations/mod.rs (1 hunks)
- packages/rs-dpp/src/data_contract/associated_token/token_configuration_convention/methods/validate_localizations/v0/mod.rs (1 hunks)
- packages/rs-dpp/src/data_contract/associated_token/token_configuration_convention/mod.rs (1 hunks)
- packages/rs-dpp/src/data_contract/associated_token/token_configuration_convention/v0/mod.rs (3 hunks)
- packages/rs-dpp/src/errors/consensus/basic/basic_error.rs (2 hunks)
- packages/rs-dpp/src/errors/consensus/basic/token/missing_default_localization.rs (1 hunks)
- packages/rs-dpp/src/errors/consensus/basic/token/mod.rs (2 hunks)
- packages/rs-drive-abci/src/execution/validation/state_transition/state_transitions/data_contract_create/advanced_structure/v0/mod.rs (1 hunks)
- packages/rs-platform-version/src/version/dpp_versions/dpp_validation_versions/mod.rs (1 hunks)
- packages/rs-platform-version/src/version/dpp_versions/dpp_validation_versions/v1.rs (1 hunks)
- packages/rs-platform-version/src/version/dpp_versions/dpp_validation_versions/v2.rs (1 hunks)
- packages/wasm-dpp/src/errors/consensus/consensus_error.rs (2 hunks)
✅ Files skipped from review due to trivial changes (2)
- packages/rs-dpp/src/data_contract/associated_token/token_configuration_convention/methods/mod.rs
- packages/rs-dpp/src/data_contract/associated_token/token_configuration_convention/mod.rs
⏰ Context from checks skipped due to timeout of 90000ms (4)
- GitHub Check: Build Docker images (Drive, drive, drive-abci) / Build Drive image
- GitHub Check: Build Docker images (DAPI, dapi, dapi) / Build DAPI image
- GitHub Check: Build JS packages / Build JS
- GitHub Check: Rust crates security audit
🔇 Additional comments (13)
packages/rs-dpp/src/errors/consensus/basic/token/mod.rs (1)
7-7: LGTM!

The new module and its public export are correctly added in alphabetical order, maintaining the file's organization.
Also applies to: 16-16
packages/rs-dpp/src/data_contract/associated_token/token_configuration_convention/methods/validate_localizations/v0/mod.rs (1)
1-4: LGTM!

The imports are correctly organized and include all necessary dependencies.
packages/rs-dpp/src/data_contract/associated_token/token_configuration_convention/accessors/mod.rs (2)
1-8: LGTM!

The module structure and imports are well-organized, with clear separation of concerns.
9-25: ❓ Verification inconclusive
Consider handling potential panic cases.
The implementation correctly delegates to v0 methods, but consider handling cases where the requested language code doesn't exist to prevent potential panics.
Could you provide the implementation of `singular_form_by_language_code_or_default` and `plural_form_by_language_code_or_default` from the v0 module to ensure they handle missing language codes gracefully?
🏁 Script executed:
```shell
#!/bin/bash
# Description: Check implementation of getter methods in v0 module

ast-grep --pattern $'impl TokenConfigurationConventionV0 {
  $$$
  fn singular_form_by_language_code_or_default($$$) {
    $$$
  }
  $$$
  fn plural_form_by_language_code_or_default($$$) {
    $$$
  }
  $$$
}'
```

Length of output: 210
Manual Verification Required for v0 Getter Implementations
The current delegation in the `TokenConfigurationConventionV0Getters` implementation passes calls directly to the v0 methods without any additional safeguard. However, we couldn't confirm from an automated search whether the underlying implementations of `singular_form_by_language_code_or_default` and `plural_form_by_language_code_or_default` in the v0 module handle missing language codes gracefully.

Please manually verify that in the v0 module:
- When an unknown `language_code` is provided, the implementations either return a proper default value without panicking or have explicit error handling.
- If no such handling exists, consider refactoring these methods to ensure they do not trigger panics when the requested language code is missing.
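For reviewers tracing the fallback behavior by hand, the pattern under discussion can be sketched as follows. This is a hypothetical stand-in — the struct and field names are assumptions, not the crate's real types — showing a `BTreeMap`-backed lookup that falls back to the `'en'` entry, which the validation added in this PR is meant to guarantee exists:

```rust
use std::collections::BTreeMap;

// Hypothetical stand-in for the localization map inside
// `TokenConfigurationConventionV0`; names are illustrative only.
pub struct Localizations {
    // ISO 639 language code -> singular form of the token name
    singular: BTreeMap<String, String>,
}

impl Localizations {
    // Look up the singular form, falling back to English ('en').
    // Panics only if the 'en' entry is missing, which the
    // `validate_localizations` check introduced in this PR rules out.
    pub fn singular_form_by_language_code_or_default(&self, language_code: &str) -> &str {
        self.singular
            .get(language_code)
            .or_else(|| self.singular.get("en"))
            .map(String::as_str)
            .expect("'en' localization must exist after validation")
    }
}
```

With this shape, an unknown language code cannot panic as long as the `'en'` entry was validated at contract-creation time.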
packages/rs-platform-version/src/version/dpp_versions/dpp_validation_versions/v2.rs (1)
22-22: LGTM!

The addition of the `validate_localizations` field with initial version 0 is consistent across v1 and v2, correctly reflecting the introduction of localization validation as a breaking change.

packages/rs-platform-version/src/version/dpp_versions/dpp_validation_versions/v1.rs (1)
22-22: LGTM!

The addition of the `validate_localizations` field with initial version 0 is consistent across v1 and v2, correctly reflecting the introduction of localization validation as a breaking change.

packages/rs-platform-version/src/version/dpp_versions/dpp_validation_versions/mod.rs (1)
24-24: LGTM!

The new field `validate_localizations` is correctly added to the `DataContractValidationVersions` struct, following the same pattern as the other validation fields.

packages/rs-dpp/src/data_contract/associated_token/token_configuration_convention/v0/mod.rs (3)
26-26: LGTM!

The constant `ENGLISH_ISO_639` is well-named and uses the standard ISO 639 language code.
33-36: LGTM!

The `localizations` field is well-designed with:
- Ordered storage using `BTreeMap`
- Clear documentation specifying the ISO 639 two-character language code requirement
- A default attribute for serde
63-77: LGTM!

The implementation of `TokenConfigurationConventionV0Getters` is efficient and idiomatic:
- Uses references to avoid unnecessary cloning
- Properly handles missing language codes by defaulting to English
- Uses `map` and `unwrap_or_else` for clean error handling

packages/rs-drive-abci/src/execution/validation/state_transition/state_transitions/data_contract_create/advanced_structure/v0/mod.rs (1)
121-133: LGTM!

The localization validation is well-integrated:
- Follows the existing pattern of validation checks
- Properly handles validation errors through `ConsensusValidationResult`
- Correctly integrates with the bump identity nonce action
packages/rs-dpp/src/errors/consensus/basic/basic_error.rs (1)
473-474: LGTM!

The `MissingDefaultLocalizationError` variant is well-integrated:
- Follows the existing pattern of error variants
- Uses transparent error propagation
- Is placed at a logical position in the enum
packages/wasm-dpp/src/errors/consensus/consensus_error.rs (1)
69-69: LGTM! Error handling for token localization validation is properly implemented.

The addition of `MissingDefaultLocalizationError` is well-integrated into the error handling system and aligns with the PR's objective to enforce token name localization requirements. The implementation follows the established error handling patterns and is correctly placed within the token-related error section.

Also applies to: 693-695
# Conflicts:
#	packages/rs-dpp/src/data_contract/associated_token/token_configuration_convention/accessors/mod.rs
#	packages/rs-dpp/src/data_contract/associated_token/token_configuration_convention/accessors/v0/mod.rs
#	packages/rs-dpp/src/data_contract/associated_token/token_configuration_convention/v0/mod.rs
#	packages/rs-dpp/src/errors/consensus/codes.rs
#	packages/wasm-dpp/src/errors/consensus/consensus_error.rs
Caution
Inline review comments failed to post. This is likely due to GitHub's limits when posting large numbers of comments. If you are seeing this consistently it is likely a permissions issue. Please check "Moderation" -> "Code review limits" under your organization settings.
Actionable comments posted: 25
🔭 Outside diff range comments (3)
packages/rs-dpp/src/data_contract/associated_token/token_configuration_localization/mod.rs (1)
27-28: ⚠️ Potential issue — Fix syntax error: Remove extra closing brace.

There's an extra closing brace `}` on line 27 that doesn't have a matching opening brace. This will cause compilation errors.

```diff
-}
-
```

packages/rs-dpp/src/data_contract/associated_token/token_configuration_convention/accessors/mod.rs (1)
26-26: ⚠️ Potential issue — Fix file structure issue

There appears to be a misplaced closing brace that doesn't match any opening brace in this file. This suggests either incomplete code or a formatting error.

This closing brace should be removed, as it doesn't match any opening brace in the implementation:

```diff
-    }
```
packages/dapi-grpc/clients/platform/v0/objective-c/Platform.pbobjc.h (1)
6935-6984: 🛠️ Refactor suggestion — Ensure TokenEvent handling is complete across the codebase.
With the removal of the Transfer event type, ensure that all code paths that process TokenEvent objects are updated to handle the new type set.
Any switch statements or if-else chains that process TokenEvent types should be updated to remove the Transfer case. Additionally, ensure that any documentation is updated to reflect the current set of supported token event types.
🧹 Nitpick comments (62)
packages/js-dash-sdk/src/SDK/Client/Platform/NonceManager/NonceManager.spec.ts (2)
54-55: Consider addressing TypeScript type issues

The code uses a TypeScript ignore directive to bypass type checking for BigInt addition. Consider properly typing these operations instead of using ignore directives.

```diff
-      // @ts-ignore
-      .to.equal(prevNonce + BigInt(1));
+      .to.equal(prevNonce + BigInt(1) as bigint);
```
99-100: Consider addressing TypeScript type issues

Similar to the previous instance, consider properly typing BigInt operations instead of using ignore directives.

```diff
-      // @ts-ignore
-      .to.equal(prevNonce + BigInt(1));
+      .to.equal(prevNonce + BigInt(1) as bigint);
```

packages/js-dapi-client/lib/test/fixtures/getStatusFixture.js (1)
1-3: Add documentation for the fixture function

The JSDoc comment is empty. Consider adding descriptive documentation explaining the purpose of this fixture and how it should be used in tests.

```diff
-/**
- *
- */
+/**
+ * Creates a fixture for testing the getStatus functionality
+ * with realistic mock data representing various system components.
+ * @returns {Object} A mock status response object
+ */
```

packages/rs-dpp/src/data_contract/associated_token/token_configuration_localization/v0/mod.rs (1)
7-13: New token configuration localization structure

This struct defines the localization information for token configurations, with fields for capitalization preference and singular/plural forms.

Consider adding validation for the `singular_form` and `plural_form` fields to ensure they're non-empty strings. For example:

```rust
impl TokenConfigurationLocalizationV0 {
    pub fn new(
        should_capitalize: bool,
        singular_form: String,
        plural_form: String,
    ) -> Result<Self, &'static str> {
        if singular_form.is_empty() {
            return Err("singular_form cannot be empty");
        }
        if plural_form.is_empty() {
            return Err("plural_form cannot be empty");
        }
        Ok(Self {
            should_capitalize,
            singular_form,
            plural_form,
        })
    }
}
```

packages/js-dapi-client/lib/methods/platform/getStatus/StateSyncStatus.js (3)
39-44: Fix documentation comment for getRemainingTime method.

The JSDoc comment incorrectly states "Total synced time" but should describe "Remaining time" to match the method's actual purpose.

```diff
 /**
- * @returns {bigint} Total synced time
+ * @returns {bigint} Remaining time to sync
 */
```
60-65: Fix documentation comment for getSnapshotHeight method.

The JSDoc comment incorrectly states "Chunk process average time" but should describe "Snapshot height" to match the method's actual purpose.

```diff
 /**
- * @returns {bigint} Chunk process average time
+ * @returns {bigint} Snapshot height
 */
```
67-72: Fix documentation comment for getSnapshotChunkCount method.

The JSDoc comment incorrectly states "Chunk process average time" but should describe "Snapshot chunks count" to match the method's actual purpose.

```diff
 /**
- * @returns {bigint} Chunk process average time
+ * @returns {bigint} Snapshot chunks count
 */
```

packages/js-dapi-client/lib/methods/platform/getEpochsInfo/EpochInfo.js (1)
10-16: Missing type validation for constructor parameters

While the types in JSDoc have been updated, there's no runtime validation to ensure the parameters are actually BigInts. This could lead to type inconsistencies if callers pass regular numbers.

Consider adding validation in the constructor:

```diff
 constructor(number, firstBlockHeight, firstCoreBlockHeight, startTime, feeMultiplier) {
   this.number = number;
+  if (typeof firstBlockHeight !== 'bigint') {
+    firstBlockHeight = BigInt(firstBlockHeight);
+  }
   this.firstBlockHeight = firstBlockHeight;
   this.firstCoreBlockHeight = firstCoreBlockHeight;
+  if (typeof startTime !== 'bigint') {
+    startTime = BigInt(startTime);
+  }
   this.startTime = startTime;
   this.feeMultiplier = feeMultiplier;
 }
```

packages/js-dapi-client/test/unit/methods/platform/getEpochsInfo/GetEpochsInfoResponse.spec.js (1)
17-164: Consider adding edge case tests for BigInt conversions

While the tests have been updated for the happy path, it would be valuable to add tests that verify correct behavior when passing different types of values (e.g., number, string) that should be converted to BigInt.

Consider adding test cases like:

```javascript
it('should handle number values for bigint fields', () => {
  const epochInfoWithNumberValues = new EpochInfoClass(1, 1, 1, Date.now(), 1.1);

  // Verify that the getters return bigint values even when constructed with numbers
  expect(typeof epochInfoWithNumberValues.getFirstBlockHeight()).to.equal('bigint');
  expect(typeof epochInfoWithNumberValues.getStartTime()).to.equal('bigint');
});
```

This would require adding the type conversion in the EpochInfo constructor as suggested in my previous comment.
packages/rs-dpp/src/data_contract/associated_token/token_perpetual_distribution/reward_distribution_type/evaluate_interval.rs (1)
7-40: Well-documented new method for reward calculation.

The `rewards_in_interval` method is a well-structured addition that calculates total rewards emitted in a given interval. The implementation leverages existing functionality by delegating to `evaluate_interval_in_bounds`, and the comprehensive documentation clearly explains parameters and return values.

One suggestion: consider adding unit tests specifically for this method to ensure it handles various scenarios correctly, including edge cases like empty intervals or maximum token amounts.
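The suggested unit tests can start from a toy model. The sketch below is not the crate's API — the real method delegates to `evaluate_interval_in_bounds` and takes distribution parameters — but it illustrates the edge cases worth covering (empty interval, reversed bounds, overflow near the maximum token amount) under the assumption of a flat per-step emission:

```rust
// Hypothetical flat-emission model; the real `rewards_in_interval`
// has a different signature and distribution logic.
fn rewards_in_interval(emission_per_step: u64, start: u64, end: u64) -> Option<u64> {
    if end < start {
        return None; // reversed/invalid interval
    }
    // checked_mul makes overflow near u64::MAX explicit instead of wrapping
    emission_per_step.checked_mul(end - start)
}
```

Tests against such a model would assert an exact sum for a small interval, zero for an empty interval, and `None` when the total would overflow.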
packages/rs-dpp/src/data_contract/methods/equal_ignoring_time_based_fields/mod.rs (1)
23-41: Well-structured implementation for comparing contracts ignoring timestamps.

The method effectively enables validation of contract equality while disregarding time-related fields, which is valuable for testing and migration scenarios. The version handling pattern is clean and follows best practices.

Consider adding `platform_version` to the documentation comments to fully document all parameters.

```diff
 /// # Arguments
 /// - `other`: A reference to another `DataContract` to compare against.
+/// - `platform_version`: The platform version to use for the comparison.
 ///
 /// # Returns
 /// - `true` if all non-time fields match, otherwise `false`.
```
packages/rs-dapi-client/src/transport/tonic_channel.rs (2)
1-2: Consider more descriptive error handling.

Panicking when `uri.host()` is `None` might be acceptable in a controlled environment, but consider replacing `expect` with a custom error to gracefully handle malformed URIs.

```diff
-let host = uri.host().expect("Failed to get host from URI").to_string();
+let host = match uri.host() {
+    Some(h) => h.to_string(),
+    None => return Err(TransportError::HostMissing("Provided URI has no host".to_string())),
+};
```
16-45: Validate TLS settings and domain name handling.

The `create_channel` function is well structured:
- Conditionally applies user-specified timeouts and CA certificates.
- Sets TLS config properly and uses `connect_lazy()`.

However, consider:
- Validating the `host` to ensure it matches the certificate's SNI requirements, especially if dealing with non-production endpoints.
- Handling potential errors from `domain_name(host)`, which could fail if the format is invalid.

Otherwise, the function is solid and consistent with Tonic best practices.
packages/rs-dpp/src/block/finalized_epoch_info/getters.rs (1)
1-79: Return slices instead of `&Vec` for more flexibility.

Overall, these getters are straightforward and cleanly implemented. As a minor improvement, returning slices (e.g., `&[(Identifier, u64)]`) instead of `&Vec<(Identifier, u64)>` can offer callers more flexibility and avoid exposing the underlying vector type.

packages/rs-dpp/src/data_contract/associated_token/token_perpetual_distribution/distribution_function/validation.rs (2)
13-17: Consider modularizing the validation logic.

The single `validate` method is quite large and handles many distribution variants. Splitting it further into helper functions or a trait-based approach can reduce complexity and improve readability.
13-2345: Overall thorough test coverage.

This file has a comprehensive suite of tests covering valid and invalid scenarios for all distribution variants. Maintaining such coverage ensures reliability and aids future refactoring.
packages/rs-dpp/src/errors/consensus/basic/data_contract/invalid_token_distribution_function_incoherence_error.rs (1)
1-22: Consider adding more context to the error

Right now, only a message string is stored. Adding additional fields (e.g., the parameters that triggered the incoherence) can help diagnose the root cause more precisely during debugging.

packages/rs-dpp/src/data_contract/associated_token/token_perpetual_distribution/distribution_function/evaluate.rs (2)
20-43: Consider using a cryptographically secure PRNG

The Random distribution uses a deterministic SplitMix64-like approach. If unpredictability is required, you might consider a more secure PRNG (e.g., `rand::rngs::StdRng`). Otherwise, this deterministic approach is acceptable for non-security-oriented randomness.
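For reference, a SplitMix64 step looks like the following — a minimal sketch using the standard SplitMix64 constants (not necessarily the exact code in `evaluate.rs`), showing why the output is fully determined by the seed, which is what makes it reproducible for consensus but unsuitable as a secure source of randomness:

```rust
// Standard SplitMix64 step: deterministic, fast, not cryptographically secure.
fn splitmix64(state: &mut u64) -> u64 {
    *state = state.wrapping_add(0x9E37_79B9_7F4A_7C15);
    let mut z = *state;
    z = (z ^ (z >> 30)).wrapping_mul(0xBF58_476D_1CE4_E5B9);
    z = (z ^ (z >> 27)).wrapping_mul(0x94D0_49BB_1331_11EB);
    z ^ (z >> 31)
}
```

Because the mixing function is a bijection over `u64`, two different seeds can never collide on their first output, and the same seed always replays the same sequence.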
132-198: Polynomial distribution floating-point caution

Although overflow checks are in place, partial exponents (e.g., fractional `m/n`
) can still introduce floating-point precision issues. If precise integer math is required, consider an alternative approach.packages/rs-dpp/src/errors/consensus/state/token/invalid_token_claim_property_mismatch.rs (1)
23-41: Consider adding unit tests for error conversion and field retrieval.

While this is a straightforward error type, adding a minimal set of tests would help ensure consistent behavior and guard against future regressions, particularly for error formatting and struct field usage.
packages/rs-dpp/src/data_contract/v1/serialization/mod.rs (2)
135-140: Initializing new time-related fields from `DataContractInSerializationFormatV1`.

These lines correctly map the optional fields from the serialization format into the data contract. Verify any required validation (e.g., ensuring `created_at` <= `updated_at`) is handled, if applicable.
165-170: Revisiting validation of block heights and epochs.

Consider verifying the integrity of `created_at_block_height` and `updated_at_block_height`, and similarly for the epochs, if you anticipate constraints (e.g., not in the distant past). Providing some basic checks can preempt invalid states.

Would you like me to propose a validation snippet for these fields or open an issue to track this?
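A basic-consistency check along those lines could look like this sketch (the function name is hypothetical; the field types are assumed to be `Option<u64>`, mirroring the new optional contract fields):

```rust
// Hypothetical sanity check: a contract cannot be updated before it was created.
// The same comparison applies to millisecond timestamps, block heights, or epochs.
fn heights_consistent(created_at: Option<u64>, updated_at: Option<u64>) -> bool {
    match (created_at, updated_at) {
        (Some(created), Some(updated)) => created <= updated,
        // with one side (or both) unset there is nothing to compare
        _ => true,
    }
}
```

Treating partially-set pairs as valid keeps the check non-breaking for contracts that only populate one of the fields.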
packages/dapi-grpc/build.rs (1)
16-27: Conditional code generation for multiple feature flags is well-structured.

Invoking `generate_code` for each `ImplType` is a clean, modular approach. If you ever need a combined server+client build, consider adding a dedicated variant or a combined approach.
10-10
: Using BigInt conversion ensures consistent handling of numeric valuesConverting inputs to
BigInt
ensures consistent type handling throughout the code and prevents potential issues with large numbers.However, consider adding error handling for invalid inputs that can't be converted to BigInt:
- this.height = BigInt(properties.height); + try { + this.height = BigInt(properties.height); + } catch (e) { + throw new Error(`Invalid height value: ${properties.height}`); + } - this.timeMs = BigInt(properties.timeMs); + try { + this.timeMs = BigInt(properties.timeMs); + } catch (e) { + throw new Error(`Invalid timeMs value: ${properties.timeMs}`); + }Also applies to: 12-12
packages/rs-dpp/src/errors/consensus/basic/data_contract/invalid_token_distribution_function_divide_by_zero_error.rs (1)
19-26: Versioning comment inconsistency

There's a comment about not changing field order without introducing a new version, but the struct is marked as `#[platform_serialize(unversioned)]`. This inconsistency could cause confusion for future developers.

Clarify the versioning strategy with one of these approaches:

```diff
-#[platform_serialize(unversioned)]
+#[platform_serialize(versioned)]
```

Or update the comment to reflect the current unversioned state:

```diff
-/*
-
-DO NOT CHANGE ORDER OF FIELDS WITHOUT INTRODUCING A NEW VERSION
-
-*/
+/*
+
+Even though this struct is unversioned, consider backward compatibility
+before changing the field structure
+
+*/
```

packages/rs-dpp/src/errors/consensus/basic/token/invalid_token_amount_error.rs (1)
22-29: Constructor without validation

The constructor creates the error but doesn't validate that `token_amount` is actually greater than `max_token_amount`. While this validation might be done by the code that creates the error, adding it here would ensure the error state is always valid.

```diff
 pub fn new(max_token_amount: u64, token_amount: u64) -> Self {
+    debug_assert!(
+        token_amount > max_token_amount,
+        "token_amount must be greater than max_token_amount"
+    );
     Self {
         max_token_amount,
         token_amount,
     }
 }
```
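The check above can be made concrete as follows — a simplified sketch (the real struct also derives serialization and error traits, omitted here) showing that `debug_assert!` enforces the invariant in debug builds while compiling away in release builds:

```rust
// Simplified version of the error type; derives omitted for brevity.
pub struct InvalidTokenAmountError {
    pub max_token_amount: u64,
    pub token_amount: u64,
}

impl InvalidTokenAmountError {
    pub fn new(max_token_amount: u64, token_amount: u64) -> Self {
        // Debug-only invariant: this error only makes sense when the
        // amount actually exceeds the maximum.
        debug_assert!(
            token_amount > max_token_amount,
            "token_amount must be greater than max_token_amount"
        );
        Self {
            max_token_amount,
            token_amount,
        }
    }
}
```

Using `debug_assert!` rather than `assert!` keeps the consensus-critical release path free of a new panic site while still catching misuse in tests.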
packages/js-dapi-client/test/unit/methods/platform/getProtocolVersionUpgradeVoteStatus/GetProtocolVersionUpgradeVoteStatusResponse.spec.js (1)
117-124: Tests updated to match BigInt conversion

The tests have been properly updated to verify each metadata property individually, including the conversion of height and timeMs to BigInt. This change maintains test coverage for the updated Metadata implementation.

One minor observation: line 94 still uses `.to.equal()` for height comparison rather than `.to.deep.equal(BigInt())` like lines 117-118. Consider updating that assertion for consistency.
61-64
: Remove unused default function.The
default_keeps_history()
function is no longer used after implementing the newTokenKeepsHistoryRules
structure and should be removed.-// Default function for `keeps_history` -fn default_keeps_history() -> bool { - true // Default to `true` for keeps_history -}packages/rs-dpp/src/errors/consensus/state/state_error.rs (1)
60-64
: Be mindful of enum variant ordering.Note the comment about not changing the order of variants. The new variants are correctly added at the end of the enum to preserve backward compatibility, but this is an important consideration for future changes.
Also applies to: 242-272
packages/rs-dpp/src/data_contract/associated_token/token_perpetual_distribution/reward_distribution_type/accessors.rs (1)
1-1: Remove unused import

The `TokenAmount` import is not used in this file and should be removed.

```diff
-use crate::balances::credits::TokenAmount;
```
🧰 Tools
🪛 GitHub Check: Rust packages (wasm-dpp) / Linting — [warning] 1-1: unused import: `crate::balances::credits::TokenAmount`
🪛 GitHub Check: Rust packages (drive) / Linting — [warning] 1-1: unused import: `crate::balances::credits::TokenAmount`
🪛 GitHub Check: Rust packages (dash-sdk) / Linting — [warning] 1-1: unused import: `crate::balances::credits::TokenAmount`

```
warning: unused import: `crate::balances::credits::TokenAmount`
 --> packages/rs-dpp/src/data_contract/associated_token/token_perpetual_distribution/reward_distribution_type/accessors.rs:1:5
  |
1 | use crate::balances::credits::TokenAmount;
  |     ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
```

packages/rs-dapi-client/src/transport/grpc.rs (1)
473-482: Add test coverage for new transport request.

This new transport request for subscribing to the masternode list looks good. However, consider adding or confirming existing test coverage to ensure correct streaming behaviors and error handling downstream.

packages/rs-dpp/src/data_contract/document_type/index/mod.rs (3)
62-66: Review concurrency aspect.

`OnceLock<Regex>` is thread-safe and suitable here. Just keep in mind that once it panics (if the regex is invalid), it may be stuck. Ensuring validation or a fallback helps prevent runtime failures.
69-87: Runtime fallback vs. panics.

Calling `Regex::new(&self.regex_str).expect("valid regexp")` will panic if run outside the `validation` feature check. Consider gracefully handling or surfacing errors at runtime instead.

```diff
-    .get_or_init(|| Regex::new(&self.regex_str).expect("valid regexp"));
+    .get_or_init(|| {
+        Regex::new(&self.regex_str)
+            .unwrap_or_else(|_| Regex::new("^$").unwrap()) // inert fallback
+    });
```
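The shape of that fallback is easy to test in isolation. The sketch below substitutes a fallible number parse for the `regex` crate (not available in this sketch) — the compute-once caching and inert-fallback pattern with `std::sync::OnceLock` is the same:

```rust
use std::sync::OnceLock;

// Stand-in for the regex case: cache a fallible, compute-once value and
// fall back to an inert default instead of panicking at runtime.
struct Endpoint {
    port_str: String,
    port: OnceLock<u16>,
}

impl Endpoint {
    fn port(&self) -> u16 {
        *self.port.get_or_init(|| {
            // inert fallback (0) rather than .expect(...) and a panic
            self.port_str.parse().unwrap_or(0)
        })
    }
}
```

Note that `get_or_init` runs the closure at most once per value, so the fallback is computed once and cached, exactly like the compiled regex would be.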
525-539: Compile-time validation logic.

The `#[cfg(feature = "validation")]` block to pre-validate the regex is beneficial. However, for run-time usage without this feature, consider a fallback to avoid `expect` panics. This is mostly a non-blocking suggestion, as you have partial coverage already.

packages/rs-dpp/src/errors/consensus/basic/data_contract/invalid_token_distribution_function_invalid_parameter_tuple_error.rs (1)
21-29: Consider using an enum for the `relation` field.

Currently, the `relation` field is stored as a `String` with possible values like "greater than" or "smaller than." If these are the only expected values, you might consider using an enum for stronger type safety and clearer semantics.

```diff
 first_parameter: String,
 second_parameter: String,
-relation: String, // "greater than" or "smaller than"
+relation: DistributionRelation,

+#[derive(Debug, Clone, PartialEq, Eq, Encode, Decode, PlatformSerialize, PlatformDeserialize)]
+pub enum DistributionRelation {
+    GreaterThan,
+    SmallerThan,
+}
```
1-5
: Breaking change alert in Protocol Buffer definitions.The field renumbering and removal of the Transfer field represents a breaking change in the Protocol Buffer definitions, as indicated in the PR title with the "!" marker. This is important to document clearly for clients of this API.
In Protocol Buffers, field numbers are critical for serialization compatibility. When removing fields and renumbering, clients must be updated accordingly. Consider providing migration guidance for users of this API and ensure version compatibility is properly handled.
For future protocol evolution, consider using reserved field numbers for removed fields to prevent accidental reuse:
// Example of how the proto file should reserve removed field numbers message TokenEvent { // Current fields... // Prevent reuse of removed field numbers reserved 5; // formerly Transfer reserved "Transfer"; }Also applies to: 6935-6938
packages/rs-dpp/src/data_contract/v1/data_contract.rs (4)
3-14: Consider verifying if these imports are all necessary.

The newly added imports `EpochIndex`, `TimestampMillis`, and `BlockHeight` are presumably for the new fields below. If any unused imports remain, removing them can help keep the codebase clean.

27-36: Documentation improvement suggestion.

These doc comments comprehensively describe changes from `DataContractV0` to `DataContractV1`. Consider including a quick example or a short "usage scenario" to clarify how these contract governance enhancements address real-world use cases.

38-45: Clarify the intended usage of tokens in documentation.

You mention tokens facilitate supply, minting/burning, and governance. It might be beneficial to provide a prose example (e.g., "Token X with max supply Y...") for clarity.

46-65: Expand upon the block height usage in comments.

While referencing block height for traceability is good, specifying how other services might consume or verify these fields (e.g., for auditing or conflict resolution) could guide downstream developers.

Would you like me to propose additional doc clarifications or open a separate PR to detail usage patterns?
packages/rs-dapi-client/src/transport/wasm_channel.rs (4)
1-2: Add clarifying docstrings for all gRPC requests.

Given this file introduces a WASM-oriented gRPC approach, adding docstrings for each request type and usage within the DAPI context can improve maintainability.

19-22: Export naming clarity.

It might be helpful to name these clients with a suffix (like `WasmPlatformGrpcClient`) so that their usage context is obvious in a large codebase.

32-36: Consider adding an async constructor.

WASM-based networking might benefit from an `async` constructor that can validate connectivity. Although the current synchronous approach is simpler, verifying connectivity upfront can catch issues early.

46-72: Ensure readiness checks on repeated calls.

`poll_ready` returns to a "ready" state, but WASM clients can unpredictably become "not ready" if the environment changes (e.g., a network interruption). Consider rechecking readiness before each critical send if reliability is crucial.

packages/rs-dpp/src/data_contract/associated_token/token_perpetual_distribution/distribution_function/encode.rs (3)
4-4: Check for possible usage of specialized data structures.

The import of `BTreeMap` suggests we store step values. Verify whether frequent insertions/deletions or large data might benefit from other data structures.

268-287: InvertedLogarithmic decoding.

Because it is effectively a 1 / log(x) style function, confirm it doesn't blow up when `log(x)` is near zero. Potentially clamp or limit the distribution.
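To make the clamp suggestion concrete, here is a minimal sketch of guarding a 1 / log(x) evaluation. The function name and bounds are illustrative only, not the crate's actual `DistributionFunction` API:

```rust
// Hypothetical guard for a 1 / ln(x)-style evaluation. Near x = 1, ln(x)
// approaches zero and 1 / ln(x) blows up, so we clamp to a maximum instead.
fn inverted_log_clamped(x: f64, max_value: f64) -> f64 {
    let ln = x.ln();
    if !ln.is_finite() || ln.abs() < 1e-9 {
        return max_value; // domain edge: x <= 0 or x close to 1
    }
    (1.0 / ln).clamp(0.0, max_value)
}
```

The same shape works for integer arithmetic: test the denominator first, and substitute a fixed cap rather than letting the division run.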
307-310: Random borrow_decode constraints.

The same fundamental domain-check caution as above applies. If `min > max`, consider an error path or domain check.

packages/rs-dpp/src/data_contract/associated_token/token_configuration_localization/accessors/v0/mod.rs (1)
13-23: Consider accepting string references in setters to avoid unnecessary allocations.

Currently, these setters accept `String`, which forces ownership transfer. In many cases, allowing `&str` or using a generic parameter (e.g., `S: Into<String>`) could be more flexible and efficient.

Example of a refactor to remove extra allocations:

```diff
-    fn set_singular_form(&mut self, singular_form: String);
+    fn set_singular_form<S: Into<String>>(&mut self, singular_form: S);

-    fn set_plural_form(&mut self, plural_form: String);
+    fn set_plural_form<P: Into<String>>(&mut self, plural_form: P);
```

packages/rs-dpp/src/data_contract/associated_token/token_perpetual_distribution/reward_distribution_moment/mod.rs (1)
86-155: Reduce boilerplate by consolidating partial equality checks with macros.

You've defined multiple `PartialEq` implementations to compare `RewardDistributionMoment` against different numeric types. Consider using a macro or generic approach to unify this code and streamline maintenance.

packages/rs-dpp/src/data_contract/v1/accessors/mod.rs (1)
246-274: Optionally enforce monotonic updates or domain constraints.

The setter methods for timestamps, block heights, and epochs currently set optional values without domain-specific validation (e.g., checking that the updated value is after creation for timestamps or block heights). If the domain requirements allow out-of-order or absent values, this is fine; otherwise, consider adding checks here or in upstream logic to ensure monotonic progression or non-null fields.
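As a concrete illustration of such a guard, here is a hypothetical sketch; the struct and method names are invented for this example and are not the crate's actual accessor API:

```rust
// Hypothetical monotonic-update guard for optional timestamps; not the real
// DataContractV1 setter, just the shape such validation could take.
struct ContractTimes {
    created_at: Option<u64>,
    updated_at: Option<u64>,
}

impl ContractTimes {
    /// Rejects an `updated_at` that would precede `created_at`.
    fn try_set_updated_at(&mut self, updated_at: u64) -> Result<(), String> {
        if let Some(created) = self.created_at {
            if updated_at < created {
                return Err(format!(
                    "updated_at {updated_at} precedes created_at {created}"
                ));
            }
        }
        self.updated_at = Some(updated_at);
        Ok(())
    }
}
```

Returning `Result` keeps the decision with the caller, which matters here because contract replays may legitimately set values out of order.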
packages/rs-dpp/src/data_contract/accessors/mod.rs (1)
270-318: Consider clarifying edge cases in doc comments.

The newly introduced getters for `created_at()`, `updated_at()`, block heights, and epochs all return `None` when called on a V0 contract. Consider updating doc comments to clarify that these values are `Some(...)` only for V1 contracts and must be handled carefully in downstream logic.

packages/rs-dpp/src/data_contract/associated_token/token_perpetual_distribution/reward_distribution_type/mod.rs (2)
1-3: Ensure module organization is necessary.

The new submodules (`accessors`, `evaluate_interval`) appear to broaden the functionality for reward distribution. Please confirm they're actually used in other parts of the codebase and not causing unnecessary modular fragmentation.

25-31: Verify time-bound logic in `TimeBasedDistribution`.

Similarly, the optional `start` and `end` fields for time-based intervals rely on precise timestamp comparisons. Consider edge cases around clock drift or missing timestamps.

packages/rs-dpp/src/data_contract/accessors/v1/mod.rs (1)
1-7: Enhance documentation for newly included `BlockHeight` and `TimestampMillis`.

Explicitly mention the units or meaning of each field (e.g., "block height counts from network launch," "timestamp is UNIX milliseconds"). This avoids confusion for cross-team developers.

packages/rs-dpp/src/data_contract/associated_token/token_perpetual_distribution/distribution_function/mod.rs (8)
76-106: `StepDecreasingAmount` approach is reminiscent of Bitcoin-like halving.

The shift to a numerator/denominator representation is safer than floating-point. However, confirm whether large exponents or accumulated multiplications might overflow as `u64` ranges are approached.

108-121: `Stepwise` with `BTreeMap<u64, TokenAmount>` is flexible but can be large.

Be mindful of potential memory overhead if storing many intervals. Consider an alternative data structure or lazy loading if intervals become too large in real use.
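For context on how such a map is typically consulted, here is a small sketch (the step heights and amounts are invented): a `BTreeMap` keeps the lookup at O(log n) even with many intervals, because `range` can find the active step without scanning:

```rust
use std::collections::BTreeMap;

// Hypothetical stepwise lookup: find the emission amount of the latest step
// at or below a given block height. Keys are step-start heights.
fn stepwise_amount(steps: &BTreeMap<u64, u64>, height: u64) -> Option<u64> {
    // `range(..=height)` walks keys <= height; the last one is the active step.
    steps.range(..=height).next_back().map(|(_, amount)| *amount)
}
```

Returning `None` for a height before the first step makes the "no step applies yet" case explicit rather than defaulting to zero.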
242-325: `Polynomial` variant handles rational exponents well.

This is powerful, but watch for potential performance overhead in repeated exponentiation. Caching partial results might help if used frequently.

327-384: `Exponential` extension covers growth/decay parameters.

The combination of `m/n` to scale the exponent is flexible, but it can also lead to confusion. Clear internal docs or tests around partial exponent usage might help.

521-523: `Display` for `Random` clarifies the range.

Recommend adding "uniform random range" text if needed for debugging clarity. Optional.

524-547: `Display` for `StepDecreasingAmount`: watchers might want to see step boundaries.

Currently, it prints the decrease factor and step size. Consider also showing how many steps have already passed if that info is tracked.

609-634: `Display` for `Exponential`: watch performance if debug logs are frequent.

String formatting combined with exponent math can be expensive. Typically not critical, but keep in mind.

635-660: `Display` for `Logarithmic` ensures all parameters are shown.

Consider explicitly indicating the log base in the output for clarity.
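A minimal sketch of what spelling out the base might look like; the struct and field names here are illustrative stand-ins, not the actual `Logarithmic` variant's fields:

```rust
use std::fmt;

// Hypothetical stand-in for the Logarithmic parameters; real fields differ.
struct Logarithmic {
    a: u64,
    base: u64,
    offset: u64,
}

impl fmt::Display for Logarithmic {
    fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
        // Spell out the base so readers need not guess between ln and log10.
        write!(f, "f(x) = {} * log_{}(x + {})", self.a, self.base, self.offset)
    }
}
```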
🛑 Comments failed to post (25)
packages/js-dapi-client/lib/BlockHeadersProvider/BlockHeadersProvider.js (1)
11-11: ⚠️ Potential issue

Fix incorrect JSDoc description for the fromBlockHeight property

The description for the `fromBlockHeight` property incorrectly states "a target batch size per stream," which is the same as the description for `targetBatchSize`. The `fromBlockHeight` property should have a description that reflects its purpose as a starting block height.

```diff
- @property {number} [fromBlockHeight=1] a target batch size per stream
+ @property {number} [fromBlockHeight=1] starting block height for synchronization
```

📝 Committable suggestion

‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.

```js
 * @property {number} [fromBlockHeight=1] starting block height for synchronization
```
packages/rs-dpp/Cargo.toml (1)
76-76: 💡 Verification agent
🧩 Analysis chain
Removal of 'platform-value' from default features
Removing 'platform-value' from default features could potentially break functionality for consumers of this crate that rely on this feature without explicitly enabling it.
🏁 Script executed:
```bash
#!/bin/bash
# Check for usage of platform-value in codebase
# Look for imports or uses of platform-value
rg -i "use.*platform[_-]value|platform[_-]value::" --type rust
```

Length of output: 68203
Action Needed: Reconsider the removal of the “platform-value” default feature or update documentation accordingly
After verifying the extensive usage of `platform_value` across the codebase (as shown by the grep results in numerous modules such as in `rs-dpp`, `rs-drive`, and others), it appears that many parts of the system rely on its availability. Removing `"platform-value"` from the default features in `packages/rs-dpp/Cargo.toml` means that consumers who previously depended on its implicit inclusion might experience build or runtime failures unless they explicitly opt into it.

- Verify Breaking Change: Confirm whether this change is intentional. If it is, ensure that downstream users are clearly informed that they must add `"platform-value"` manually to their dependencies.
- Documentation & Migration: Update the migration notes and documentation to explain that the functionality provided by `platform_value` is now an optional feature.
- Optional Reintroduction: Consider whether reintroducing `"platform-value"` in the default feature set might be preferable if most consumers rely on it.

Please ensure that these points are addressed to prevent unintentional breakage for consumers of this crate.
packages/js-dash-sdk/src/SDK/Client/Platform/NonceManager/NonceManager.ts (1)
159-160: 🛠️ Refactor suggestion

Remove @ts-ignore comment and fix the underlying issue

The `@ts-ignore` comment is hiding a TypeScript error instead of properly fixing it. This could lead to runtime issues.

```diff
-      // @ts-ignore
-      const nextIdentityContractNonce = identityContractNonce + BigInt(1);
+      const nextIdentityContractNonce = identityContractNonce + BigInt(1);
```

If TypeScript is complaining about mixing types, ensure that `identityContractNonce` is properly typed as `bigint`.

📝 Committable suggestion

‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.

```ts
const nextIdentityContractNonce = identityContractNonce + BigInt(1);
```
packages/wasm-dpp/src/errors/consensus/consensus_error.rs (2)
693-695: ⚠️ Potential issue

Fix duplicated error handling case

The `MissingDefaultLocalizationError` is being handled twice in this file - once here and again at lines 731-733. This redundancy should be resolved to ensure proper error handling flow.

690-690: ⚠️ Potential issue

Fix syntax error with mismatched parenthesis

There is a stray closing parenthesis on this line without a matching opening parenthesis, which will cause compilation errors.
packages/rs-dpp/src/data_contract/associated_token/token_configuration_localization/mod.rs (1)
1-1: ⚠️ Potential issue

Update import statement.

The static analysis shows that the current import is unused and differs from what's actually used in the file. The code is importing `TokenConfigurationLocalizationV0` from the `v0` module instead.

```diff
-use crate::data_contract::associated_token::token_configuration_localization::v0::TokenConfigurationLocalizationV0;
+use crate::data_contract::associated_token::token_configuration_localization::v0::TokenConfigurationLocalizationV0;
```

Committable suggestion skipped: line range outside the PR's diff.
🧰 Tools
🪛 GitHub Check: Rust packages (wasm-dpp) / Linting
[warning] 1-1: unused import:
crate::data_contract::associated_token::token_configuration_localization::accessors::v0::TokenConfigurationLocalizationV0Getters
warning: unused import:crate::data_contract::associated_token::token_configuration_localization::accessors::v0::TokenConfigurationLocalizationV0Getters
--> packages/rs-dpp/src/data_contract/associated_token/token_configuration_localization/mod.rs:1:5
|
1 | use crate::data_contract::associated_token::token_configuration_localization::accessors::v0::TokenConfigurationLocalizationV0Getters;
| ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^🪛 GitHub Check: Rust packages (drive) / Linting
[warning] 1-1: unused import:
crate::data_contract::associated_token::token_configuration_localization::accessors::v0::TokenConfigurationLocalizationV0Getters
warning: unused import:crate::data_contract::associated_token::token_configuration_localization::accessors::v0::TokenConfigurationLocalizationV0Getters
--> packages/rs-dpp/src/data_contract/associated_token/token_configuration_localization/mod.rs:1:5
|
1 | use crate::data_contract::associated_token::token_configuration_localization::accessors::v0::TokenConfigurationLocalizationV0Getters;
| ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^🪛 GitHub Check: Rust packages (dash-sdk) / Linting
[warning] 1-1: unused import:
crate::data_contract::associated_token::token_configuration_localization::accessors::v0::TokenConfigurationLocalizationV0Getters
warning: unused import:crate::data_contract::associated_token::token_configuration_localization::accessors::v0::TokenConfigurationLocalizationV0Getters
--> packages/rs-dpp/src/data_contract/associated_token/token_configuration_localization/mod.rs:1:5
|
1 | use crate::data_contract::associated_token::token_configuration_localization::accessors::v0::TokenConfigurationLocalizationV0Getters;
| ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^packages/js-dapi-client/lib/methods/platform/getDataContractHistory/getDataContractHistoryFactory.js (1)
48-48: 💡 Verification agent
🧩 Analysis chain
Potential type inconsistency without validation
The code now converts
startAtMs
to a string, but there's no validation to ensure that the parameter is actually a BigInt when passed by the caller.Consider adding validation to ensure the expected type:
- .setStartAtMs(startAtMs.toString()) + .setStartAtMs((typeof startAtMs === 'bigint' ? startAtMs : BigInt(startAtMs)).toString())Let's verify if the code works correctly with different input types:
🏁 Script executed:
#!/bin/bash # Look for other places handling startAtMs to see if they perform validation rg -A 3 "startAtMs" --type js --type tsLength of output: 11891
Fix Required: Add Runtime Validation for
startAtMs
The current implementation directly calls
startAtMs.toString()
, which assumes that the value is a BigInt. However, while the JSDoc documentsstartAtMs
as a BigInt and a default ofBigInt(0)
is provided when undefined, there's nothing preventing a caller from passing a number. This could lead to unintended behavior if precision is lost or if the parameter isn't in the expected type.Actionable suggestions:
- Update the code to enforce that
startAtMs
is a BigInt at runtime. For instance, change the call fromto.setStartAtMs(startAtMs.toString())- .setStartAtMs(startAtMs.toString()) + .setStartAtMs((typeof startAtMs === 'bigint' ? startAtMs : BigInt(startAtMs)).toString())- This guard ensures that if
startAtMs
is not already a BigInt, it will be properly converted, avoiding potential type inconsistencies.packages/rs-dpp/src/data_contract/associated_token/token_configuration_convention/accessors/mod.rs (1)
1-25:
⚠️ Potential issueNew token configuration accessors with unused imports
The implementation adds new accessors for token configurations to retrieve localized singular and plural forms with language code fallback support, which aligns with the PR objective of validating token name localizations.
However, there are unused imports that should be removed:
-use crate::data_contract::associated_token::token_configuration::accessors::v0::{ - TokenConfigurationV0Getters, TokenConfigurationV0Setters, -}; +use crate::data_contract::associated_token::token_configuration::accessors::v0::TokenConfigurationV0Getters;The
TokenConfigurationV0Setters
trait is imported but not used in this file, as flagged by the linter.📝 Committable suggestion
‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.pub mod v0; -use crate::data_contract::associated_token::token_configuration::accessors::v0::{ - TokenConfigurationV0Getters, TokenConfigurationV0Setters, -}; +use crate::data_contract::associated_token::token_configuration::accessors::v0::TokenConfigurationV0Getters; use crate::data_contract::associated_token::token_configuration_convention::accessors::v0::TokenConfigurationConventionV0Getters; use crate::data_contract::associated_token::token_configuration_convention::TokenConfigurationConvention; use crate::data_contract::associated_token::token_configuration_localization::TokenConfigurationLocalization; use std::collections::BTreeMap; impl TokenConfigurationConventionV0Getters for TokenConfigurationConvention { fn singular_form_by_language_code_or_default(&self, language_code: &str) -> &str { match self { TokenConfigurationConvention::V0(v0) => { v0.singular_form_by_language_code_or_default(language_code) } } } fn plural_form_by_language_code_or_default(&self, language_code: &str) -> &str { match self { TokenConfigurationConvention::V0(v0) => { v0.plural_form_by_language_code_or_default(language_code) } } } }
🧰 Tools
🪛 GitHub Check: Rust packages (wasm-dpp) / Linting
[warning] 4-4: unused imports:
TokenConfigurationV0Getters
andTokenConfigurationV0Setters
warning: unused imports:TokenConfigurationV0Getters
andTokenConfigurationV0Setters
--> packages/rs-dpp/src/data_contract/associated_token/token_configuration_convention/accessors/mod.rs:4:5
|
4 | TokenConfigurationV0Getters, TokenConfigurationV0Setters,
| ^^^^^^^^^^^^^^^^^^^^^^^^^^^ ^^^^^^^^^^^^^^^^^^^^^^^^^^^🪛 GitHub Check: Rust packages (drive) / Linting
[warning] 4-4: unused imports:
TokenConfigurationV0Getters
andTokenConfigurationV0Setters
warning: unused imports:TokenConfigurationV0Getters
andTokenConfigurationV0Setters
--> packages/rs-dpp/src/data_contract/associated_token/token_configuration_convention/accessors/mod.rs:4:5
|
4 | TokenConfigurationV0Getters, TokenConfigurationV0Setters,
| ^^^^^^^^^^^^^^^^^^^^^^^^^^^ ^^^^^^^^^^^^^^^^^^^^^^^^^^^🪛 GitHub Check: Rust packages (dash-sdk) / Linting
[warning] 4-4: unused imports:
TokenConfigurationV0Getters
andTokenConfigurationV0Setters
warning: unused imports:TokenConfigurationV0Getters
andTokenConfigurationV0Setters
--> packages/rs-dpp/src/data_contract/associated_token/token_configuration_convention/accessors/mod.rs:4:5
|
4 | TokenConfigurationV0Getters, TokenConfigurationV0Setters,
| ^^^^^^^^^^^^^^^^^^^^^^^^^^^ ^^^^^^^^^^^^^^^^^^^^^^^^^^^packages/rs-dpp/src/state_transition/state_transitions/document/batch_transition/batched_transition/document_transition.rs (2)
112-114: 🛠️ Refactor suggestion
Implement or remove placeholder return of
None
.This method always returns
None
. If token-claim transitions are expected, consider implementing the logic or removing this placeholder to avoid confusion.
119-121: 🛠️ Refactor suggestion
Address missing logic for token configuration updates.
Similar to
as_transition_token_claim
, this function also returnsNone
. IfTokenConfigUpdateTransition
is supported, provide the necessary handling; otherwise, remove or comment why it's a stub.packages/rs-dpp/src/data_contract/associated_token/token_perpetual_distribution/distribution_function/evaluate_interval.rs (1)
44-70: 🛠️ Refactor suggestion
Consider safe increment operation.
Withinevaluate_interval
, you properly usechecked_add
when accumulatingtotal
. However, incrementingx
withx += step;
at line 67 could overflow ifx
is very large. Consider usingx = x.checked_add(step).ok_or_else(...)
for consistency.- x += step; + x = x.checked_add(step).ok_or_else(|| { + ProtocolError::Overflow("Block height overflow in evaluate_interval".into()) + })?;📝 Committable suggestion
‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.pub fn evaluate_interval( &self, start_not_included: u64, step: u64, end_included: u64, ) -> Result<TokenAmount, ProtocolError> { if step == 0 { return Err(ProtocolError::DivideByZero( "evaluate_interval: step cannot be zero".into(), )); } if end_included <= start_not_included { return Ok(0); } let mut total: u64 = 0; // Begin at the first period after start_not_included by adding 'step'. let mut x = start_not_included + step; while x <= end_included { // Call evaluate(x) and accumulate the result. total = total.checked_add(self.evaluate(x)?).ok_or_else(|| { ProtocolError::Overflow("Total evaluation overflow in evaluate_interval".into()) })?; x = x.checked_add(step).ok_or_else(|| { ProtocolError::Overflow("Block height overflow in evaluate_interval".into()) })?; } Ok(total) }
packages/rs-dapi-client/src/transport/wasm_channel.rs (2)
24-30: 🛠️ Refactor suggestion
Consider advanced configuration for
create_channel
.
Currently, this function only initializes a client with a URI. If you plan to accept or enforce timeouts, retries, or security settings, you may want to extend_settings
usage or rename it to highlight potential future usage.
74-82: 🛠️ Refactor suggestion
Leverage more descriptive or granular error statuses.
Currently, unknownwasm_client_error_to_status
errors becomeStatus::internal(...)
. If you foresee different error categories, consider mapping them toUnavailable
,DeadlineExceeded
, or other status types.packages/rs-dpp/src/data_contract/associated_token/token_perpetual_distribution/distribution_function/encode.rs (9)
23-36:
⚠️ Potential issueStepDecreasingAmount variant complexity.
This variant adds multiple fields (decrease_per_interval_numerator
,decrease_per_interval_denominator
, etc.). Carefully verify each field in your domain logic to prevent division-by-zero or negative results.
41-48: 🛠️ Refactor suggestion
Linear variant min and max values.
The presence of optionalmin_value
andmax_value
in a linear function is helpful. Validate that your domain usage checks if, for example,min_value <= max_value
whenever both are set.
79-89: 🛠️ Refactor suggestion
Exponential variant domain checks.
Applying exponentials can rapidly grow or shrink values. Make sure your domain logic prevents going below zero or exceeding supply constraints.
123-143: 🛠️ Refactor suggestion
InvertedLogarithmic usage scenario.
Inverting logs can be tricky. If results approach infinity near 0, confirm that extreme boundary conditions are gracefully handled (e.g.,min_value
or practical cutoffs).
165-179:
⚠️ Potential issueDeserialization for StepDecreasingAmount.
Same caution about potential division-by-zero or negative values ifdecrease_per_interval_denominator
is zero or ifstep_count
is invalid. Consider handling it gracefully or throwing an error.
182-183: 🛠️ Refactor suggestion
Validate Stepwise steps post-decoding.
If theBTreeMap
is empty or unsorted, how will your distribution logic behave? Confirm a fallback or error if the map is invalid.
202-211:
⚠️ Potential issuePolynomial decoding.
Similar to the linear variant, a zerod
plus a largem
orn
can cause unexpected results. Specifically test for large negative or positive exponents.Also applies to: 212-221
246-265: 🛠️ Refactor suggestion
Logarithmic decoding.
Similar domain-check caution applies. Negative or zero input to a standard log function can produceNaN
, which must be handled.
328-329: 🛠️ Refactor suggestion
Stepwise consistency check in borrow_decode.
Checking the map after decoding can prevent unexpected runtime errors if the map is empty or keys are out of order.packages/rs-dpp/src/data_contract/associated_token/token_perpetual_distribution/reward_distribution_type/mod.rs (1)
42-74: ⚠️ Potential issue

Consider fallback behavior in `contract_creation_moment`.

This method returns `None` if creation data is unavailable. Validate that downstream code robustly handles a `None` scenario and doesn't incorrectly default to now or block 0.

packages/rs-dpp/src/data_contract/accessors/v1/mod.rs (1)

81-98: 🛠️ Refactor suggestion

Setter methods must validate domain constraints.

If there's any business rule about these fields (e.g. block height can't decrease), consider validating such invariants in the setters to avoid inconsistent contract states.

packages/rs-dpp/src/data_contract/associated_token/token_perpetual_distribution/distribution_function/mod.rs (1)
386-453: ⚠️ Potential issue

`Logarithmic` coverage includes flexible bounding.

Careful: the log domain must not reach zero or go negative. Confirm input validation is robust so that the code never attempts `log(0)` or a negative argument.
Caution
Inline review comments failed to post. This is likely due to GitHub's limits when posting large numbers of comments. If you are seeing this consistently it is likely a permissions issue. Please check "Moderation" -> "Code review limits" under your organization settings.
Actionable comments posted: 1
🛑 Comments failed to post (1)
packages/token-history-contract/test/unit/tokenHistoryContract.spec.js (1)
120-120: 💡 Verification agent
🧩 Analysis chain
Consider using undefined assignment instead of delete operator
While functionally correct, using the delete operator might impact performance. For test code this is acceptable, but consider using undefined assignment instead.
- delete rawMintDocument.tokenId; + rawMintDocument.tokenId = undefined;Also applies to: 186-186, 255-255, 330-330
🌐 Web query:
Are there performance implications of using the delete operator vs setting to undefined in JavaScript?
💡 Result:
Using the
delete
operator versus setting a property toundefined
in JavaScript has significant performance implications due to how JavaScript engines optimize object property access. Here's a detailed breakdown:
Performance Impact of
delete
Hidden Class Disruption:
JavaScript engines like V8 use hidden classes (Shapes) to optimize object property access. When you usedelete
, the object’s hidden class is downgraded to a slower hash table structure, invalidating inline caches. This forces the engine to perform costly deoptimizations, slowing subsequent property accesses[1][3][7][8].Benchmark Results:
delete
is ~10x slower than setting a property toundefined
ornull
in property-access tests[1][11].- Deleting properties in loops or performance-critical code can lead to measurable slowdowns[7][12].
Memory Management:
Whiledelete
removes the property entirely, it doesn’t directly free memory. Instead, it breaks references, allowing the garbage collector to reclaim memory only if no other references exist[5][9]. Setting a property toundefined
retains the property (with anundefined
value), keeping its memory allocated[10][12].
When to Use
undefined
ornull
- Faster Performance:
Setting a property toundefined
ornull
preserves the object’s hidden class, avoiding deoptimization. This is optimal for performance-sensitive code[1][3][7].obj.property = undefined; // Retains property, faster access- Use Cases:
- When you need to signal a property’s absence without altering the object’s structure.
- When iterating with
for...in
orObject.keys()
, asundefined
properties remain enumerable[1][4].
Practical Recommendations
Approach Pros Cons delete
Removes property entirely Slower, causes hidden class changes undefined
/null
Faster, preserves object structure Property remains enumerable
- Avoid
delete
in hot code paths or loops[3][7][8].- Prefer immutable patterns (e.g., object destructuring) to avoid mutation side effects[10]:
const { deletedProp, ...rest } = obj; // Creates a new object without 'deletedProp'- Use
Reflect.deleteProperty
for a functional alternative todelete
[3][6].
Summary
While
delete
semantically removes a property, its performance cost in V8-engine applications makesundefined
or destructuring preferable in most cases. Reservedelete
for scenarios where property removal is structurally necessary, and optimize elsewhere for speed[1][3][7][10].Citations:
- 1: https://jsperf.app/delete-vs-undefined-vs-null/12
- 2: https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Operators/delete
- 3: https://phillcode.hashnode.dev/javascript-delete
- 4: https://jsperf.app/delete-vs-undefined-vs-null/10
- 5: https://developer.mozilla.org/en-US/docs/Web/JavaScript/Guide/Memory_management
- 6: https://bito.ai/resources/javascript-object-delete-javascript-explained/
- 7: https://builtin.com/software-engineering-perspectives/javascript-delete
- 8: googleapis/google-api-nodejs-client#375
- 9: https://community.lambdatest.com/t/understanding-javascript-delete-operator-with-objects/33378
- 10: https://www.smashingmagazine.com/2023/10/removing-object-properties-javascript/
- 11: https://measurethat.net/Benchmarks/Show/604/1/delete-vs-set-undefined
- 12: https://www.measurethat.net/Benchmarks/Show/1565/0/delete-vs-null-vs-undefined-vs-void-0-vs-objectcreatenu
Action Required: Replace
delete
with Undefined Assignment for Better PerformanceAfter verifying with external performance benchmarks, assigning
undefined
to a property is preferable as it avoids the hidden class deoptimizations associated with thedelete
operator. Although this is test code, adopting the assignment approach can yield more consistent performance and aligns with best practices.
- Update the following lines in
packages/token-history-contract/test/unit/tokenHistoryContract.spec.js
:
- Line 120
- Line 186
- Line 255
- Line 330
Diff example:
- delete rawMintDocument.tokenId; + rawMintDocument.tokenId = undefined;🧰 Tools
🪛 Biome (1.9.4)
[error] 120-120: Avoid the delete operator which can impact performance.
Unsafe fix: Use an undefined assignment instead.
(lint/performance/noDelete)
Actionable comments posted: 0
🧹 Nitpick comments (1)
packages/token-history-contract/test/unit/tokenHistoryContract.spec.js (1)
120-120: Consider replacing delete operator with undefined assignment

Static analysis flagged the use of the `delete` operator as potentially impacting performance. Although this is test code where performance is less critical, consider:

```diff
-delete rawBurnDocument.tokenId;
+rawBurnDocument.tokenId = undefined;
```

And similar changes for other delete operations in the test file.

Also applies to: 186-186, 255-255, 330-330
🧰 Tools
🪛 Biome (1.9.4)
[error] 120-120: Avoid the delete operator which can impact performance.
Unsafe fix: Use an undefined assignment instead.
(lint/performance/noDelete)
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (3)

- packages/token-history-contract/test/unit/tokenHistoryContract.spec.js (8 hunks)
- packages/token-history-contract/test/unit/tokenHistoryContract.spec.js (1 hunks)
- packages/token-history-contract/test/unit/tokenHistoryContract.spec.js (1 hunks)
🚧 Files skipped from review as they are similar to previous changes (2)
- packages/token-history-contract/test/unit/tokenHistoryContract.spec.js
- packages/token-history-contract/test/unit/tokenHistoryContract.spec.js
🧰 Additional context used
🪛 Biome (1.9.4)
packages/token-history-contract/test/unit/tokenHistoryContract.spec.js
[error] 120-120: Avoid the delete operator which can impact performance.
Unsafe fix: Use an undefined assignment instead.
(lint/performance/noDelete)
[error] 186-186: Avoid the delete operator which can impact performance.
Unsafe fix: Use an undefined assignment instead.
(lint/performance/noDelete)
[error] 255-255: Avoid the delete operator which can impact performance.
Unsafe fix: Use an undefined assignment instead.
(lint/performance/noDelete)
[error] 330-330: Avoid the delete operator which can impact performance.
Unsafe fix: Use an undefined assignment instead.
(lint/performance/noDelete)
⏰ Context from checks skipped due to timeout of 90000ms (20)
- GitHub Check: Rust packages (rs-dapi-client) / Detect immutable structure changes
- GitHub Check: Rust packages (rs-dapi-client) / Check each feature
- GitHub Check: Rust packages (rs-dapi-client) / Linting
- GitHub Check: Rust packages (rs-dapi-client) / Formatting
- GitHub Check: Rust packages (json-schema-compatibility-validator) / Unused dependencies
- GitHub Check: Rust packages (json-schema-compatibility-validator) / Tests
- GitHub Check: Rust packages (json-schema-compatibility-validator) / Linting
- GitHub Check: Rust packages (drive-abci) / Tests
- GitHub Check: Rust packages (drive-abci) / Check each feature
- GitHub Check: Rust packages (drive-abci) / Linting
- GitHub Check: Rust packages (drive) / Tests
- GitHub Check: Rust packages (drive) / Formatting
- GitHub Check: Rust packages (drive) / Linting
- GitHub Check: Rust packages (dpp) / Tests
- GitHub Check: Rust packages (dpp) / Check each feature
- GitHub Check: Rust packages (dpp) / Unused dependencies
- GitHub Check: Rust packages (dpp) / Linting
- GitHub Check: Rust packages (dash-sdk) / Tests
- GitHub Check: Rust packages (dash-sdk) / Linting
- GitHub Check: Build JS packages / Build JS
🔇 Additional comments (10)
packages/token-history-contract/test/unit/tokenHistoryContract.spec.js (10)
10-15: Improved code readability

The `expectJsonSchemaError` function has been refactored to use more concise assertions, making the code cleaner and more maintainable.

28-32: Well-structured contract creation

The multiline formatting for contract creation improves readability while maintaining the same functionality.

36-42: Cleaner test implementation

The refactored test case uses arrow functions and proper formatting, making the intent clearer while maintaining the same test coverage.

94-96: Better error expectations formatting

Improved formatting of the error parameters expectations enhances readability.

119-126: Updated validation approach for tokenId

The test now validates that tokenId is a required property rather than checking its byte length, which aligns better with JSON schema validation.
🧰 Tools
🪛 Biome (1.9.4)
[error] 120-120: Avoid the delete operator which can impact performance.
Unsafe fix: Use an undefined assignment instead.
(lint/performance/noDelete)
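The shift from byte-length checks to required-property validation can be sketched in isolation. The helper below is a minimal, hand-rolled stand-in for a JSON Schema `required` check, not the project's actual validator:

```javascript
// Minimal sketch of a required-property check in the spirit of JSON
// Schema's `required` keyword; not the project's actual validator.
function findMissingProperties(document, requiredProps) {
  return requiredProps
    .filter((prop) => !(prop in document))
    .map((prop) => ({
      keyword: 'required',
      params: { missingProperty: prop },
    }));
}

// A document with tokenId deliberately removed, as in the tests.
const rawMintDocument = { amount: 100 };

const errors = findMissingProperties(rawMintDocument, ['tokenId', 'amount']);
console.log(errors.length); // 1
console.log(errors[0].params.missingProperty); // tokenId
```

Asserting on the `missingProperty` parameter, as the updated tests do, pins down which property triggered the error rather than relying on the shape of a byte-length message.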
185-192: Consistent validation approach for tokenId in transfer documents

This change maintains consistency with the other validation updates, focusing on property requirements rather than byte-length checks.
🧰 Tools
🪛 Biome (1.9.4)
[error] 186-186: Avoid the delete operator which can impact performance.
Unsafe fix: Use an undefined assignment instead.
(lint/performance/noDelete)
254-261: Updated frozenIdentityId validation approach

Similar to the other property validations, this now checks for the required property rather than its byte length, ensuring consistent validation across the contract.
🧰 Tools
🪛 Biome (1.9.4)
[error] 255-255: Avoid the delete operator which can impact performance.
Unsafe fix: Use an undefined assignment instead.
(lint/performance/noDelete)
324-324: More descriptive property naming

Renaming the generic `amount` to the specific `destroyedAmount` improves clarity about the property's purpose in the context of frozen funds destruction.
329-336: Consistent validation for frozenIdentityId

This change maintains the pattern of validating property requirements rather than byte lengths, ensuring consistent validation approaches throughout the test suite.
🧰 Tools
🪛 Biome (1.9.4)
[error] 330-330: Avoid the delete operator which can impact performance.
Unsafe fix: Use an undefined assignment instead.
(lint/performance/noDelete)
339-347: Updated test descriptions to match renamed property

Test descriptions now correctly reference the renamed `destroyedAmount` property, maintaining consistency between code and documentation.
chore: update to latest dash core 37 (#2483) feat(platform)!: token advanced distribution and updates (#2471) fix: token history contract (#2474) Co-authored-by: Ivan Shumkov <ivan@shumkov.ru> Co-authored-by: QuantumExplorer <quantum@dash.org> fix(drive): using new rust dash core methods for reversed quorum hash to maintain backwards compatibility (#2489) feat: more granular integer document property types (#2455) Co-authored-by: Quantum Explorer <quantum@dash.org> docs: update comment for data contract code range (#2476) feat: validate token name localizations (#2468) feat(sdk): get identity by non-unique keys build(deps): update grovedb to current develop test: test identity by non-unique pubkey hashes fix(sdk): dash core client fails to get quorum chore: minor fixes test(drive-abci): identity by non-unique pubkey start after chore: minor changes to verify feat(sdk): token and group queries (#2449) chore: revert limit 1 => limit none chore: add non-unique key to test identities test(sdk): test vectors for test_fetch_identity_by_non_unique_public_keys fix(platform)!: token distribution fixes and tests (#2494) chore(platform): bump to version 2.0.0-dev.1 (#2495) test: update assertion fix(sdk): make some things public (#2496) feat(platform): require token for document actions (#2498) fix: data contract proof doesn't work with new auto fields (#2501)
Issue being fixed or feature implemented

Currently, you can create a token without a name. Also, there is no standard localization defined.

What was done?

- Validate that the default (`en`) localization is present

How Has This Been Tested?

Updated existing tests

Breaking Changes

It's a breaking change for the `v2.0-dev` branch

Checklist:
For repository code-owners and collaborators only

Summary by CodeRabbit

New Features

- `DataContract` capabilities now include timestamps and block heights for creation and updates, enhancing traceability.

Chores
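The core check this PR introduces — that a token name must carry a default English (`en`) localization — could be sketched as follows. The shapes and names below are illustrative only; the real types live in rs-dpp (e.g. `TokenConfigurationConvention` and the new `MissingDefaultLocalizationError`):

```javascript
// Hypothetical convention shape; the real structure lives in rs-dpp.
function validateTokenNameLocalizations(convention) {
  const localizations = convention.localizations || {};
  // The 'en' entry acts as the required default fallback.
  if (!('en' in localizations)) {
    return [{ code: 'MissingDefaultLocalizationError' }];
  }
  return [];
}

const ok = validateTokenNameLocalizations({
  localizations: { en: { singularForm: 'token', pluralForm: 'tokens' } },
});
const missingDefault = validateTokenNameLocalizations({
  localizations: { fr: { singularForm: 'jeton', pluralForm: 'jetons' } },
});

console.log(ok.length); // 0
console.log(missingDefault.length); // 1
```

Requiring `en` as the default also gives the accessor methods (`singular_form_by_language_code_or_default` and its plural counterpart) a guaranteed fallback entry when a requested language code is absent.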