
[Bug] Hive DDL and Paimon schema mismatched exception is thrown when showing the table's schema with the timestamp type #5450

Open
@Jack1007

Description

Search before asking

  • I searched in the issues and found nothing similar.

Paimon version

paimon-1.0.1

Compute Engine

hive-3.1.2
spark-3.5.1

Minimal reproduce step

Step 1:
Configure the Spark Paimon catalog:

spark.sql.extensions=org.apache.paimon.spark.extensions.PaimonSparkSessionExtensions
spark.sql.catalog.spark_catalog=org.apache.paimon.spark.SparkGenericCatalog
spark.sql.catalog.spark_catalog.metastore=hive
spark.sql.catalog.spark_catalog.uri=thrift://metastore:9083
spark.sql.catalog.spark_catalog.warehouse=hdfs://nameservice1/user/hive/warehouse
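
If the paimondb database used below does not already exist, it can be created first; a minimal sketch, assuming only that the database name matches the report:

CREATE DATABASE IF NOT EXISTS paimondb;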

Create the table using Spark SQL:

create table paimondb.test_ts (
name string, 
ts timestamp)
USING paimon
TBLPROPERTIES ( 'primary-key' = 'name');
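
To confirm what Paimon actually recorded for the ts column, the field types can be inspected from Spark SQL via Paimon's schemas system table; a minimal sketch (the `$schemas` suffix is Paimon's system-table convention, the table name is from the report):

-- The fields column is expected to show ts as TIMESTAMP(6) WITH LOCAL TIME ZONE
SELECT schema_id, fields FROM paimondb.`test_ts$schemas`;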

Step 2:
Show the table's schema in Hive:

show create table paimondb.test_ts;

No column info is returned, and the following exception appears in the log:

java.lang.IllegalArgumentException: Hive DDL and paimon schema mismatched! It is recommended not to write any column definition as Paimon external table can read schema from the specified location.
Mismatched fields are:
Field #7
Hive DDL          : ts timestamp
Paimon Schema: ts timestamp with local time zone

        at org.apache.paimon.hive.HiveSchema.checkFieldsMatched(HiveSchema.java:274) ~[paimon-hive-connector-3.1-1.0.1.jar:1.0.1]
        at org.apache.paimon.hive.HiveSchema.extract(HiveSchema.java:165) ~[paimon-hive-connector-3.1-1.0.1.jar:1.0.1]
        at org.apache.paimon.hive.PaimonSerDe.initialize(PaimonSerDe.java:67) ~[paimon-hive-connector-3.1-1.0.1.jar:1.0.1]
        at org.apache.hadoop.hive.serde2.AbstractSerDe.initialize(AbstractSerDe.java:54) ~[hive-exec-3.1.2.jar:3.1.2]
        at org.apache.hadoop.hive.serde2.SerDeUtils.initializeSerDeWithoutErrorCheck(SerDeUtils.java:562) ~[hive-exec-3.1.2.jar:3.1.2]
        at org.apache.hadoop.hive.metastore.HiveMetaStoreUtils.getDeserializer(HiveMetaStoreUtils.java:87) ~[hive-exec-3.1.2.jar:3.1.2]
        at org.apache.hadoop.hive.metastore.HiveMetaStoreUtils.getDeserializer(HiveMetaStoreUtils.java:77) ~[hive-exec-3.1.2.jar:3.1.2]
        at org.apache.hadoop.hive.ql.metadata.Table.getDeserializerFromMetaStore(Table.java:289) ~[hive-exec-3.1.2.jar:3.1.2]
        at org.apache.hadoop.hive.ql.metadata.Table.getDeserializer(Table.java:282) ~[hive-exec-3.1.2.jar:3.1.2]

What doesn't meet your expectations?

When Spark creates a table with the timestamp type, that type by default means 'timestamp with local time zone', and the Paimon schema file records the type as 'TIMESTAMP(6) WITH LOCAL TIME ZONE'.
Then, when Hive shows the table's schema, it throws the exception above. Hive has supported the 'timestamp with local time zone' type since version 3.1.2, so the error shouldn't occur.
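
A possible workaround sketch, not a fix for the underlying mismatch: in Spark 3.4+ the timestamp_ntz type maps to Paimon's plain TIMESTAMP (without time zone), which Hive reads back as timestamp. The table name below is illustrative only:

create table paimondb.test_ts_ntz (
name string,
ts timestamp_ntz)
USING paimon
TBLPROPERTIES ( 'primary-key' = 'name');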

Anything else?

No response

Are you willing to submit a PR?

  • I'm willing to submit a PR!
