
[Subtask] [spark-connector] support hive table properties #1549

Closed
Tracked by #1227
FANNG1 opened this issue Jan 17, 2024 · 5 comments
FANNG1 (Contributor) commented Jan 17, 2024:

Describe the subtask

Support properties when creating and loading Hive tables, e.g. the following clauses (a sketch of a full statement follows the list):

    [ ROW FORMAT row_format ]
    [ STORED AS file_format ]
    [ LOCATION path ]
    [ TBLPROPERTIES ( key1=val1, key2=val2, ... ) ]

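For reference, a minimal Spark SQL sketch that exercises all four clauses; the catalog, schema, table, and location names (hive_catalog, demo_db, sample_events, the HDFS path) are hypothetical placeholders, not taken from this issue:

    -- Hypothetical names; only the clause syntax matters here.
    CREATE TABLE hive_catalog.demo_db.sample_events (
      id   BIGINT,
      name STRING
    )
    ROW FORMAT DELIMITED FIELDS TERMINATED BY ','       -- [ ROW FORMAT row_format ]
    STORED AS TEXTFILE                                  -- [ STORED AS file_format ]
    LOCATION 'hdfs://namenode:8020/tmp/sample_events'   -- [ LOCATION path ]
    TBLPROPERTIES ('key1' = 'val1', 'key2' = 'val2');   -- [ TBLPROPERTIES ( ... ) ]

Loading the table back (e.g. with DESCRIBE TABLE EXTENDED hive_catalog.demo_db.sample_events) should then surface the same storage format, location, and table properties.
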
Parent issue

#1227

FANNG1 added this to the Gravitino 0.5.0 milestone Feb 19, 2024
FANNG1 self-assigned this Mar 4, 2024
Yangxuhao123 (Contributor) commented:

@FANNG1 Can I work on this issue?

FANNG1 (Contributor, Author) commented Mar 20, 2024:

@Yangxuhao123, I'm working on ROW FORMAT and STORED AS; could you work on LOCATION?

Yangxuhao123 (Contributor) commented:

ok, no problem.

FANNG1 (Contributor, Author) commented Mar 20, 2024:

Please create the corresponding issue and I'll assign it to you.

FANNG1 (Contributor, Author) commented Apr 8, 2024:

All of these clauses are now supported.

FANNG1 closed this as completed Apr 8, 2024