[#1573] feat(spark-connector): Spark regression test system #3933
Conversation
Force-pushed from a6923a2 to 744990b.
@yuqi1129 @diqiu50 @jerryshao @qqqttt123, please help review when you are free, thanks.
The links are not accessible.
I can access the links. Which links are not accessible? Could you try again?
Force-pushed from 8d6d2ec to 2cb407d.
@diqiu50 @yuqi1129 @jerryshao, do you have time to review this?
I'm not sure whether you have added docs about how to test it; if not, please add the docs.
  public String getWarehouseLocation() {
    return get(WAREHOUSE_DIR);
  }
}
Do we really need configurations for tests? I don't think we have to formalize them just for testing, WDYT?
I prefer to keep the configs to control the test behavior better.
If you want to keep this configuration, you should choose a better name that clarifies its scope and usage. Currently the names feel randomly picked, and some of them are not very meaningful.
updated
@@ -144,10 +144,14 @@ tasks.test {
val skipITs = project.hasProperty("skipITs")
val skipSparkITs = project.hasProperty("skipSparkITs")
val skipSparkSQLITs = project.hasProperty("skipSparkSQLITs")
Do we need a configuration to enable or disable this test?
I feel there are too many configurations controlling whether or not to run something; do we have a better solution?
I would be inclined to use Gradle to control whether to run this, and by default we would not run these SQL tests.
I tried to define a specific task to run the test, but failed to call initIntegrationTest. So I'd like to define an enableSparkSQLTests flag to enable the test explicitly, WDYT?
tasks.register("testSQL") {
dependsOn("compileTestJava")
doLast {
javaexec {
mainClass.set("org.apache.gravitino.spark.connector.integration.test.sql.SparkSQLRegressionTest33")
classpath = sourceSets["test"].runtimeClasspath
environment("GRAVITINO_CI_HIVE_DOCKER_IMAGE", "datastrato/gravitino-ci-hive:0.1.13")
// failed to init initIntegrationTest in javaexec
val init = project.extra.get("initIntegrationTest") as (Test) -> Unit
init(this)
}
}
}
switch (str.toLowerCase()) {
  case "hive":
    return HIVE;
  case "lakehouse-iceberg":
    return ICEBERG;
  default:
    return UNKNOWN;
}
Suggested change:

for (CatalogType type : CatalogType.values()) {
  if (type.name().equals(str.toUpperCase())) {
    // ...
  }
}
LAKEHOUSE-ICEBERG can't be used as an enum value.
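For illustration only, a minimal sketch of how a values()-based lookup could still handle the hyphenated provider name by pairing each constant with an alias string; the alias field and the fromString helper here are assumptions, not the PR's actual code:

enum CatalogType {
  HIVE("hive"),
  ICEBERG("lakehouse-iceberg"),
  UNKNOWN("unknown");

  private final String alias;

  CatalogType(String alias) {
    this.alias = alias;
  }

  // Match on the external provider name, so "lakehouse-iceberg" maps to ICEBERG
  // even though a hyphen is not a legal character in an enum constant name.
  static CatalogType fromString(String str) {
    for (CatalogType type : CatalogType.values()) {
      if (type.alias.equalsIgnoreCase(str)) {
        return type;
      }
    }
    return UNKNOWN;
  }
}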
private static File stringToFile(Path path, String str) throws IOException {
  File file = path.toFile();
  try (PrintWriter out = new PrintWriter(file, StandardCharsets.UTF_8.toString())) {
    out.write(str);
  }
  return file;
}

private String fileToString(Path filePath) throws IOException {
  return new String(Files.readAllBytes(filePath), StandardCharsets.UTF_8);
}
FileUtils.writeStringToFile();
FileUtils.readFileToString()
done
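For reference, a minimal sketch of the helpers rewritten on top of commons-io, assuming org.apache.commons.io.FileUtils is on the test classpath; the enclosing class name is illustrative:

import java.io.File;
import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.nio.file.Path;
import org.apache.commons.io.FileUtils;

class SqlFileIO {
  // Write the SQL text to the target file as UTF-8.
  static File stringToFile(Path path, String str) throws IOException {
    File file = path.toFile();
    FileUtils.writeStringToFile(file, str, StandardCharsets.UTF_8);
    return file;
  }

  // Read the whole file back as a UTF-8 string.
  static String fileToString(Path filePath) throws IOException {
    return FileUtils.readFileToString(filePath.toFile(), StandardCharsets.UTF_8);
  }
}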
* "com.datastrato.gravitino.spark.connector.integration.test.sql.SparkSQLRegressionTest" | ||
* -PconfigFile=/xxx/xx | ||
*/ | ||
@Tag("gravitino-docker-it") |
gravitino-docker-test
done
    .toString();

private static final ConfigEntry<String> TEST_BASE_DIR =
    new ConfigBuilder("gravitino.test.sql.dir")
gravitino.test.sql.dir -> gravitino.spark.test.dir
done
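For illustration, a sketch of what the renamed config entry might look like, assuming the ConfigBuilder fluent style used elsewhere in Gravitino; the doc text and default value below are made up for this example:

// Hypothetical defaults: the doc string and fallback directory are not from the PR.
private static final ConfigEntry<String> TEST_BASE_DIR =
    new ConfigBuilder("gravitino.spark.test.dir")
        .doc("The base directory of the Spark SQL regression test files")
        .stringConf()
        .createWithDefault(System.getProperty("java.io.tmpdir"));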
Updated the document.
@yuqi1129, would you please take another look?
  }
}

// To find the catalog type from the directory name, the first name in directory match CatalogType
the first non-unknown CatalogType from parent to child determines the catalog type.
done
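As a sketch of the rule described above (not the PR's exact code), the resolver could walk the path components from parent to child and return the first non-UNKNOWN match; the method name and the CatalogType.fromString helper are assumptions:

import java.nio.file.Path;

class CatalogTypeResolver {
  // Path iterates its name elements from the root-most component down to the leaf,
  // so the first component that maps to a known CatalogType wins.
  static CatalogType fromDirectory(Path sqlDir) {
    for (Path component : sqlDir) {
      CatalogType type = CatalogType.fromString(component.toString());
      if (type != CatalogType.UNKNOWN) {
        return type;
      }
    }
    return CatalogType.UNKNOWN;
  }
}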
Others LGTM.
Added support to exclude test resources from the test jar.
What changes were proposed in this pull request?
Add a Spark regression test system to run regression tests for Spark SQL.
Why are the changes needed?
Fix: #1573
Does this PR introduce any user-facing change?
No.
How was this patch tested?
Added integration tests.