feat(snowflake)!: transpilation support MAP_PICK #7189
fivetran-ashashankar wants to merge 2 commits into main from RD-1147671-transpile_map_pick
Conversation
SQLGlot Integration Test Results
Comparing: main vs RD-1147671-transpile_map_pick

Overall:
- main: 4003 total, 4003 passed (pass rate: 100.0%)
- RD-1147671-transpile_map_pick: 4003 total, 4003 passed (pass rate: 100.0%)
def mappick_sql(self, expression: exp.MapPick) -> str:
    map_arg = expression.this
    keys_to_pick = expression.expressions

    x_dot_key = exp.Dot(this=exp.to_identifier("x"), expression=exp.to_identifier("key"))

    if len(keys_to_pick) == 1 and isinstance(keys_to_pick[0], exp.Array):
        lambda_expr = exp.Lambda(
            this=exp.ArrayContains(this=keys_to_pick[0], expression=x_dot_key),
            expressions=[exp.to_identifier("x")],
        )
    else:
        lambda_expr = exp.Lambda(
            this=exp.In(this=x_dot_key, expressions=keys_to_pick),
            expressions=[exp.to_identifier("x")],
        )

    result = exp.func(
        "MAP_FROM_ENTRIES",
        exp.ArrayFilter(this=exp.func("MAP_ENTRIES", map_arg), expression=lambda_expr),
    )
    return self.sql(result)
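As a hedged illustration of the SQL shape this generator is expected to emit (this sketch is not the sqlglot API; the exact DuckDB function names in the output, e.g. LIST_FILTER, are assumptions about how exp.ArrayFilter renders):

```python
def mappick_to_duckdb(map_sql: str, keys: list[str], keys_is_array: bool = False) -> str:
    """Illustrative string builder: render MAP_PICK(map, keys...) in the
    MAP_FROM_ENTRIES + filter form that mappick_sql above constructs as an AST."""
    if keys_is_array:
        # Single ARRAY argument: membership test via ARRAY_CONTAINS.
        predicate = f"ARRAY_CONTAINS({keys[0]}, x.key)"
    else:
        # Variadic keys: membership test via IN.
        predicate = f"x.key IN ({', '.join(keys)})"
    return f"MAP_FROM_ENTRIES(LIST_FILTER(MAP_ENTRIES({map_sql}), x -> {predicate}))"

print(mappick_to_duckdb("m", ["'a'", "'b'"]))
```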
For exp.ArrayContains and exp.ArrayFilter we can directly use self.func(...).
edit: I meant exp.Func, not self.func(...).
I think it's fine as-is because we're building the ASTs and generating once, at the end. Unless you're suggesting using exp.func, not self.func? In that case, I agree.
Yep, sorry, I meant exp.Func; I've edited my first comment.
    x_dot_key = exp.Dot(this=exp.to_identifier("x"), expression=exp.to_identifier("key"))

    if len(keys_to_pick) == 1 and isinstance(keys_to_pick[0], exp.Array):
This condition isn't quite right: we shouldn't be checking for Array literals (i.e., [1, 2, 3]). We should check is_type(exp.DataType.Type.ARRAY) instead. The current logic doesn't handle, for example, columns of type ARRAY.
Also, please include unit tests that cover the type inference part. I.e., annotate types, then do sql(...), etc.
Agh, nice catch @georgesittas, I misread it and thought it was a type check.
To add on top of George's comment, this is the same issue as this comment: #7190 (comment). The check isinstance(keys_to_pick[0], exp.Array) can fail if, for example, the input is just an identifier:
SELECT MAP_PICK(x, y) FROM t;
In this example, we don't know whether y holds an Array value, but we can know at parse time that it is a column whose type is an Array (this holds at parse time and also at run time).
So checking static values at parse time isn't the correct approach. Checking types at parse time, on the other hand, is valid because the static typing won't change at run time.
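The literal-check vs type-check distinction can be shown with a toy sketch (the node classes and dtype field below are illustrative stand-ins, not sqlglot's actual classes):

```python
from dataclasses import dataclass

@dataclass
class Node:
    dtype: str  # annotated type, e.g. "ARRAY" or "VARCHAR"

class ArrayLiteral(Node):
    """Stands in for an array literal like [1, 2, 3]."""

class Column(Node):
    """Stands in for a column reference like y in MAP_PICK(x, y)."""

def is_array_literal(node: Node) -> bool:
    # The flagged approach: only matches literal syntax, so it
    # misses ARRAY-typed columns.
    return isinstance(node, ArrayLiteral)

def is_array_typed(node: Node) -> bool:
    # The suggested approach: checks the annotated type, so it
    # covers both literals and columns of type ARRAY.
    return node.dtype == "ARRAY"

y = Column(dtype="ARRAY")  # MAP_PICK(x, y) where y is an ARRAY column
print(is_array_literal(y), is_array_typed(y))
```

For the column y, the literal check returns False while the type check returns True, which is exactly the gap described above.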
Adds transpilation support for Snowflake MAP_PICK to DuckDB.