## Summary
Add a Cursor rule that guides developers to write tests using Fixture Monkey effectively, following established patterns and best practices found in the codebase.
## Description
Currently, developers new to Fixture Monkey may struggle with:
- Choosing the right FixtureMonkey configuration for their test scenarios
- Understanding when to use `giveMeOne()` vs `giveMeBuilder()` vs `giveMe()`
- Properly setting up test fixtures with appropriate customizations
- Following consistent patterns for test data generation
This issue proposes creating a Cursor rule that provides guidance for writing effective tests with Fixture Monkey. The rule should reference successful patterns from the existing test suite, particularly from files like JavaTest.java and documentation examples.
Key areas the rule should cover:
- FixtureMonkey instance creation and configuration
- Choosing appropriate generation methods
- Customizing test objects with `set()`, `size()`, and other builder methods
- Working with generic types and complex objects
- Best practices for test data consistency
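For the generic-types bullet in particular, one pattern worth documenting is Fixture Monkey's `TypeReference`, which captures type arguments that a bare `Class` token loses to erasure. A minimal sketch, assuming the default `FixtureMonkey.create()` configuration:

```java
import java.util.List;
import java.util.Map;

import com.navercorp.fixturemonkey.FixtureMonkey;
import com.navercorp.fixturemonkey.api.type.TypeReference;

public class GenericTypeSketch {
    public static void main(String[] args) {
        FixtureMonkey fixtureMonkey = FixtureMonkey.create();

        // Class<List> cannot express List<String>; an anonymous
        // TypeReference subclass preserves the full parameterized type.
        List<String> strings =
            fixtureMonkey.giveMeOne(new TypeReference<List<String>>() {});

        Map<String, Integer> map =
            fixtureMonkey.giveMeOne(new TypeReference<Map<String, Integer>>() {});

        System.out.println(strings != null);
        System.out.println(map != null);
    }
}
```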
## Expected Usage
Before (without the rule):
```java
@Test
void testUserService() {
    // Developer might write verbose setup code
    User user = new User();
    user.setName("Test User");
    user.setAge(25);
    user.setEmail("test@example.com");
    // Test logic...
}
```

After (with Cursor rule guidance):
```java
@Test
void testUserService() {
    // Cursor suggests using Fixture Monkey
    FixtureMonkey fixtureMonkey = FixtureMonkey.builder()
        .objectIntrospector(ConstructorPropertiesArbitraryIntrospector.INSTANCE)
        .defaultNotNull(true)
        .build();

    User user = fixtureMonkey.giveMeBuilder(User.class)
        .set("age", 25)
        .sample();
    // Test logic...
}
```

Complex scenario guidance:
```java
@Test
void testProductWithMultipleOptions() {
    // Rule guides proper generic type handling and collection customization
    Product product = fixtureMonkey.giveMeBuilder(Product.class)
        .size("options", 3)
        .set("options[0]", "premium")
        .set("price", Arbitraries.longs().greaterThan(0))
        .sample();
}
```

## Implementation Hints
- Study existing patterns: Examine `JavaTest.java` and other test files to identify common patterns
- Reference documentation: Use examples from the docs/ directory for best practices
- Create rule structure: Follow the format of existing `.cursor/rules/*.mdc` files
- Include code examples: Provide before/after examples for common scenarios
- Add decision trees: Help developers choose between different Fixture Monkey methods
## Validation Requirements
The rule must be tested across 5 different projects to ensure effectiveness:
- Create test scenarios: Apply the rule to 5 different Java/Kotlin projects with varying complexity
- Measure effectiveness: Document how well the rule guides developers to use Fixture Monkey
- Collect feedback: Gather feedback on rule clarity and usefulness from different project contexts
- Iterate based on results: Refine the rule based on real-world usage patterns
- Document findings: Include a summary of validation results in the PR
Suggested test projects:
- Simple Spring Boot REST API project
- Complex domain model with JPA entities
- Microservice with multiple data transfer objects
- Legacy codebase with existing manual test setup
- Kotlin project using data classes
Validation criteria:
- Rule successfully guides developers to use Fixture Monkey instead of manual object creation
- Developers can easily choose appropriate Fixture Monkey methods
- Generated test code follows established best practices
- Rule reduces time spent on test data setup
- Code suggestions are contextually appropriate
## Files to Reference
- `fixture-monkey-tests/java-tests/src/test/java/com/navercorp/fixturemonkey/tests/java/JavaTest.java` - For test patterns
- `docs/content/v1.1.x/docs/get-started/` - For beginner-friendly examples
- `docs/content/v1.1.x/docs/generating-objects/fixture-monkey.md` - For API usage patterns
- `docs/content/v1.1.x/docs/customizing-objects/` - For customization examples
## Good First Issue Because
This issue is perfect for beginners because:
- Pattern-following: The task involves analyzing existing successful patterns rather than creating new functionality
- Clear examples: Abundant test files and documentation provide concrete examples to follow
- Documentation-focused: Primarily involves writing clear, helpful documentation
- Low risk: Creating a Cursor rule doesn't affect the core codebase functionality
- Learning opportunity: Contributors will gain deep understanding of Fixture Monkey best practices
- Real-world validation: Testing across multiple projects provides valuable hands-on experience
The existing `.cursor/rules/fixture-monkey-gfi-rule.mdc` provides a clear template to follow, and the extensive test suite offers plenty of real-world examples to reference.
## If you're interested
- Comment below to let us know you'd like to work on this
- Study the existing patterns in the test files and documentation
- Prepare test projects or identify 5 existing projects for validation
- Ask questions if you need clarification on any Fixture Monkey concepts
We're here to help guide you through the process! 🚀
## Deliverables
- Cursor rule file (`.cursor/rules/fixture-monkey-test-writing-rule.mdc`)
- Validation report documenting testing across 5 projects
- Before/after code examples from real project usage
- Recommendations for rule improvements based on validation results