[repo] Integration testing #636
Pull request overview
This PR introduces a comprehensive integration testing framework for the Kafka ClickHouse connector. The changes refactor existing integration tests into a more maintainable structure with shared base classes and helpers, update configuration files to support both local and cloud testing environments, and add utilities for generating test datasets across multiple serialization formats.
- Refactored integration tests into a base class structure for better code reuse
- Added `DatasetGenerator` utility for creating test data in multiple formats (Avro, Protobuf, JSON); see the sketch after this list
- Updated Confluent Platform version from 7.5.0 to 7.7.0
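The diff excerpts reviewed below suggest the generator fills each record field with a deterministic value derived from the message offset. Here is a minimal, self-contained sketch of that pattern (Java 16+); the `Field`/`Type` stand-ins and the `generateRecord` signature are assumptions for illustration, not the PR's actual API — only the `DatasetGenerator` name and the per-field `record.put(...)` pattern come from this PR.

```java
// Hypothetical sketch of the per-field generation pattern implied by the
// diff excerpts below; the real DatasetGenerator API may differ.
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class DatasetGeneratorSketch {
    // Assumed minimal stand-ins for a schema field and its type.
    enum Type { BOOLEAN, FLOAT, DOUBLE, STRING }
    record Field(String name, Type type) {}

    // Fill one record, choosing a deterministic value per field type
    // from the message offset (mirrors the record.put(...) calls below).
    static Map<String, Object> generateRecord(List<Field> fields, long offset) {
        Map<String, Object> record = new HashMap<>();
        for (Field field : fields) {
            switch (field.type()) {
                case BOOLEAN -> record.put(field.name(), offset % 2 == 0);
                case FLOAT, DOUBLE -> record.put(field.name(), offset * 10.123456);
                case STRING -> record.put(field.name(), "value-" + offset);
            }
        }
        return record;
    }

    public static void main(String[] args) {
        List<Field> fields = List.of(new Field("flag", Type.BOOLEAN),
                                     new Field("score", Type.DOUBLE));
        // e.g. {flag=true, score=40.493824} (map iteration order may vary)
        System.out.println(generateRecord(fields, 4));
    }
}
```

Note that arrow-style `switch` (Java 14+) cannot fall through between cases, which would have sidestepped the bug Copilot flags in the review below.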
Reviewed changes
Copilot reviewed 16 out of 17 changed files in this pull request and generated 2 comments.
| File | Description |
|---|---|
| data01.yaml | New test fixture defining database schema and message field mappings |
| DatasetGenerator.java | New utility class for generating test datasets in multiple serialization formats |
| keeper_map_config.xml | New ClickHouse configuration for integration tests |
| kafka_compose.yaml | New Docker Compose configuration for local Kafka/ClickHouse environment |
| clickhouse_sink_with_jdbc_prop.json | Updated to use template variables for exactlyOnce setting |
| clickhouse_sink_schemaless.json | Updated to use template variables for username and exactlyOnce settings |
| clickhouse_sink_no_proxy_with_jdbc_prop.json | New connector configuration without proxy settings |
| clickhouse_sink_no_proxy_schemaless.json | Updated database and SSL settings to use defaults |
| clickhouse_sink_no_proxy.json | Updated database and SSL settings to use defaults |
| clickhouse_sink.json | Updated to use template variables for username and exactlyOnce settings |
| IntegrationTestBase.java | New base class providing common setup/teardown and helper methods for integration tests |
| ConfluentPlatform.java | Updated Confluent version to 7.7.0, added constants, improved startup configuration |
| ExactlyOnceTest.java | Disabled test class pending restoration |
| ClickHouseSinkTest.java | New test class using refactored base class structure |
| ClickHouseSinkConnectorIntegrationTest.java | Removed old integration test implementation |
| ClickHouseCloudTest.java | Removed cloud-specific test implementation |
The first flagged snippet:

```java
                record.put(field.name(), offset % 2 == 0);
            case FLOAT:
            case DOUBLE:
                record.put(field.name(), offset * 10.123456);
```
Copilot AI commented on Dec 4, 2025:
Missing break statements in switch cases. The BOOLEAN, FLOAT, DOUBLE, and STRING cases will fall through to subsequent cases, causing incorrect behavior where the STRING value overwrites all previous field assignments.
Suggested change:

```diff
-                record.put(field.name(), offset % 2 == 0);
-            case FLOAT:
-            case DOUBLE:
-                record.put(field.name(), offset * 10.123456);
+                record.put(field.name(), offset % 2 == 0);
+                break;
+            case FLOAT:
+                record.put(field.name(), offset * 10.123456);
+                break;
+            case DOUBLE:
+                record.put(field.name(), offset * 10.123456);
+                break;
```
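To make the failure mode concrete, here is a runnable, self-contained sketch of the same fall-through. The field name, types, and values are illustrative stand-ins, not the connector's actual code; every branch ends up returning the STRING value:

```java
// Minimal sketch of the fall-through bug flagged above.
import java.util.HashMap;
import java.util.Map;

public class FallThroughDemo {
    enum Type { BOOLEAN, FLOAT, DOUBLE, STRING }

    static Object valueFor(Type type, long offset) {
        Map<String, Object> record = new HashMap<>();
        switch (type) {
            case BOOLEAN:
                record.put("field", offset % 2 == 0);
                // missing break: execution falls through to FLOAT/DOUBLE
            case FLOAT:
            case DOUBLE:
                record.put("field", offset * 10.123456);
                // missing break: execution falls through to STRING
            case STRING:
                record.put("field", "value-" + offset);
        }
        return record.get("field");
    }

    public static void main(String[] args) {
        // Every type prints "value-4" because of the fall-through.
        for (Type t : Type.values()) {
            System.out.println(t + " -> " + valueFor(t, 4));
        }
    }
}
```

With the suggested `break` statements in place, BOOLEAN would return `true` and FLOAT/DOUBLE would return `40.493824` for offset 4, as intended.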
The second flagged snippet:

```java
    protected int generateSchemalessData(String topicName, int numberOfPartitions, int numberOfRecords) throws Exception {
        Map<String, Object> config = new HashMap<>(DATA_GEN_BASE_CONFIG);
        config.put("value.converter", ConfluentPlatform.KAFKA_JSON_CONVERTER);
        DATA_GEN_BASE_CONFIG.put("value.converter.schemas.enable", "false");
```
Copilot AI commented on Dec 4, 2025:
This line modifies the shared static DATA_GEN_BASE_CONFIG map, which will affect all subsequent test executions. This should modify the local config map instead to avoid test interference.
Suggested change:

```diff
-        DATA_GEN_BASE_CONFIG.put("value.converter.schemas.enable", "false");
+        config.put("value.converter.schemas.enable", "false");
```
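As a concrete illustration of the interference, here is a minimal sketch; the static map name mirrors the snippet above, while the surrounding harness is hypothetical. The buggy version also has a second defect worth noting: because `config` is copied before the `put`, the current call never sees `schemas.enable=false` (unless the base map already contained it), while every later test silently inherits it:

```java
// Sketch of the shared-state leak; the harness around it is hypothetical.
import java.util.HashMap;
import java.util.Map;

public class SharedConfigDemo {
    // Shared base config, as in the snippet above.
    static final Map<String, Object> DATA_GEN_BASE_CONFIG = new HashMap<>();

    static Map<String, Object> buggySchemalessConfig() {
        Map<String, Object> config = new HashMap<>(DATA_GEN_BASE_CONFIG);
        // Bug: writes to the shared map, so every later copy of
        // DATA_GEN_BASE_CONFIG inherits schemas.enable=false, while
        // this call's own config (copied above) never gets it.
        DATA_GEN_BASE_CONFIG.put("value.converter.schemas.enable", "false");
        return config;
    }

    static Map<String, Object> fixedSchemalessConfig() {
        Map<String, Object> config = new HashMap<>(DATA_GEN_BASE_CONFIG);
        // Fix: only the per-test copy is modified.
        config.put("value.converter.schemas.enable", "false");
        return config;
    }

    public static void main(String[] args) {
        buggySchemalessConfig();
        // The shared base is now polluted for all subsequent tests:
        System.out.println(DATA_GEN_BASE_CONFIG); // {value.converter.schemas.enable=false}
    }
}
```

A common hardening, if it fits the test design, is to build the base as an unmodifiable map (e.g. `Map.copyOf(...)`, Java 10+) so any accidental write fails fast with `UnsupportedOperationException` instead of leaking between tests.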