
Unify DbApiHook.run() method with the methods which override it #23971

Merged
merged 7 commits on Jul 22, 2022

Conversation

kazanzhy
Contributor

@kazanzhy kazanzhy commented May 27, 2022

Now we have the DbApiHook.run() method, which is used by many other hooks.

There are also the following hooks that override this method:

I did my best to make them as similar as possible, but some of them have peculiarities (a rough sketch of the unified shape follows the list below):

  • SnowflakeHook can split a string into several statements
  • DatabricksSqlHook can split a string into several statements, and each statement is run using a separate connection
  • ExasolHook uses pyexasol, which doesn't have a reusable cursor
  • PrestoHook takes an additional deprecated hql parameter and runs _strip_sql(sql)
  • TrinoHook also takes an additional deprecated hql parameter and runs _strip_sql(sql)
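
To make the comparison concrete, here is a rough sketch (not the merged code) of the unified shape the overriding methods converge to; it assumes the hook already provides split_sql_string, strip_sql_string, _run_command, and the autocommit helpers:

    # Rough sketch only - an assumption of the unified DbApiHook.run() shape, not the merged code.
    from contextlib import closing

    def run(self, sql, autocommit=False, parameters=None, handler=None, split_statements=False):
        # Normalize a single string into a list of statements.
        if isinstance(sql, str):
            sql = self.split_sql_string(sql) if split_statements else [self.strip_sql_string(sql)]

        results = []
        with closing(self.get_conn()) as conn:
            self.set_autocommit(conn, autocommit)
            with closing(conn.cursor()) as cur:
                for statement in sql:
                    self._run_command(cur, statement, parameters)
                    if handler is not None:
                        results.append(handler(cur))
            if not self.get_autocommit(conn):
                conn.commit()
        return results if handler is not None else None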

@boring-cyborg boring-cyborg bot added the area:core-operators (Operators, Sensors and hooks within Core Airflow), area:providers, and provider:snowflake (Issues related to Snowflake provider) labels May 27, 2022
@eladkal
Contributor

eladkal commented May 27, 2022

Check #23112; maybe incorporating a fix for that as well can give a more complete solution.

The goal is to avoid all the custom logic in each provider.

@eladkal eladkal self-requested a review May 27, 2022 18:52
@kazanzhy
Contributor Author

@eladkal thanks. I'll investigate what can be done for #23112. Anyway, this PR could be the first step.

@kazanzhy kazanzhy force-pushed the unify_dbapihook.run_method branch from 1a41803 to 3f9697c Compare May 27, 2022 21:51
@eladkal
Contributor

eladkal commented May 28, 2022

Note about changes to DbApiHook:
The hook can be updated in the next Airflow version, but providers must be compatible with Airflow>=2.1, so depending on the changes you might need to add special handling for users who run older Airflow versions.
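
For illustration, a minimal sketch of one way a provider can gate on the installed Airflow version (the 2.4.0 cutoff below is only an assumption about when the core change would ship):

    # Assumption: the updated DbApiHook behaviour ships with Airflow 2.4.0;
    # older supported versions (>=2.1) keep the provider-local fallback path.
    from airflow import __version__ as AIRFLOW_VERSION
    from packaging.version import Version

    HAS_UNIFIED_DBAPI_RUN = Version(AIRFLOW_VERSION) >= Version("2.4.0")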

@kazanzhy kazanzhy force-pushed the unify_dbapihook.run_method branch 3 times, most recently from 6694bb5 to f1cc33f Compare May 29, 2022 23:49
@kazanzhy kazanzhy force-pushed the unify_dbapihook.run_method branch 2 times, most recently from d4f5218 to d653abd Compare May 31, 2022 12:36
This was referenced Jun 13, 2022
@eladkal
Contributor

eladkal commented Jun 15, 2022

FYI @kazanzhy DbApiHook might be moved to the SQL provider #24422

@kazanzhy
Contributor Author

Hi @eladkal
I think it would be great to get this PR merged before #24422.
But I probably need some suggestions from @potiuk

@eladkal
Contributor

eladkal commented Jun 15, 2022

What is the current blocker to move forward with this PR?

Member

@potiuk potiuk left a comment


I think this work should be connected with #24422 and #24476, and any unification should be done there.

This will have the added benefit that the "Core SQL" provider can be released independently from Airflow, and newer versions of it can be made prerequisites for the updated "SQL" providers, which will have the "core sql" provider as a dependency. I think we should simply deprecate the DbApi and move all the unification effort there.

@kazanzhy
Contributor Author

@potiuk
Unfortunately, this PR took a lot of time, and I just want to remind you that it was inspired while I worked on #23767 (#23812, #23815, #23816).

I see that the changes in #24476 don't intersect with these. But even so, it's better to keep them separate to make review easier.

@kazanzhy kazanzhy marked this pull request as ready for review June 27, 2022 14:15
@kazanzhy kazanzhy requested a review from turbaszek as a code owner June 27, 2022 14:15
@potiuk
Member

potiuk commented Jun 27, 2022

@potiuk Unfortunately, this PR took a lot of time, and I just want to remind you that it was inspired while I worked on #23767 (#23812, #23815, #23816).

I see that the changes in #24476 don't intersect with these. But even so, it's better to keep them separate to make review easier.

First - a general remark. I am not debating your statements and reminders above. Yeah. Things were happening, and we were thinking and discussing various ideas while others were being implemented. And from those discussions, it seems that the "being implemented" change is better delayed, reworked, and likely incorporated into the other work.

Unfortunately, this happens. Sometimes ideas come later and result in a much better outcome than the first one. And it's pretty normal to redo parts or all of a change when we find a better one. This is normal. We should also think about code as a liability rather than an asset. Sometimes iterating and improving on a PR leads to a better idea, which is the result of the learning during the process. There is much bigger value in an idea done right than in one that implements a worse solution, even if it is already written.

BTW. It has happened many times to my changes and to others', and if it results in something that is better for the community - this is perfectly ok, normal, and even good.

There is also the "sunk cost" fallacy that you are falling into. The fact that a lot of effort has been invested in something does not make it any more valuable than an alternative solution. If anyone spent a lot of time on something, it does not mean that we should invest further in it if we know we have a better solution. The cost is "sunk" already. If the solution is better - we should go for it. If the alternative is better - we should go for the alternative. And often that cost is not "totally sunk" - more often than not, even if the code cannot be directly "moved", the learnings from it can, and reimplementing the same thing on another approach and base takes a fraction of the originally invested time, because the learning/discovery has already been done and the original code can be used as a "blueprint" for reapplying it.

If we know a better approach, the cost already invested in this PR should not matter. We should simply make better decisions based on the current understanding of the problem. Time spent on any solution so far should not matter at all in that decision - only which of the solutions is better.

Re 'core sql':

We have already had a number of problems with DbApi being in the core of Airflow, and especially with any efforts there that impacted multiple providers implementing it. We have learned that with the current approach, where providers are released separately from the core, keeping DbApi in the "airflow" package has more costs than benefits.

The thinking (which was born after you started working on the change, and which your change actually sparked - seeing the consequences it caused) is that this approach is not sustainable. For any unification/change that uses the current DbApi, we will only start reaping real benefits in ~14 months - i.e. about 12 months after Airflow 2.4.0 is released, because only then can any code in providers that only works with the current (2.3) version of DbApi be dropped. And for all 14 months we will have to live with keeping backwards compatibility, which means we will not be able to fully utilise the benefits of such unification for more than a year.

On the other hand (and this is what the actual driver behind #24422 and #24476 is) - we have a much bigger need to unify the DB API even further. At the Airflow Summit we spoke a lot about lineage, and in order to allow column-based lineage and generally SQL lineage, we need to find a better way to make sure that all community-based providers can utilise a more "unified" and "fully-featured" DbApi equivalent that will be able to evolve together with the providers. Thus the idea of 'core.sql' was born.

When 'core.sql' is implemented, we will be able to release any DB-API-related unifications and changes independently from Airflow releases. This means that 3 months from now we will be able to release all DB providers with new features that anyone will be able to use in Airflow 2.2 and 2.3, and that will provide much better lineage integration (and we can deprecate the DbApi inside Airflow). We will not have to wait 14 months. This is a major win for the community - even if it means that you will have to redo all or parts of your PR.

@kazanzhy, would you want to implement your change knowing that it's going to be deprecated almost immediately and will never really be used, because we will replace the DbApi with the core.sql provider? Do you think it makes sense? Why would you want to do it?

@potiuk
Member

potiuk commented Jun 28, 2022

And @kazanzhy -> the core.sql provider was already merged yesterday - so REALLY the best you can do now is to rebase your changes on top of it, move the DbApiHook there, and deprecate the DbApiHook in core Airflow - this is absolutely the best we can do now, I think, and it will make it sooooooo much easier to add new features to the DB-related providers.

This will not even be a huge change for you, I believe.
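
For illustration, a minimal sketch (an assumption about the general shape, not actual code) of what such a deprecation shim in airflow/hooks/dbapi.py could look like once the hook lives in the common.sql provider:

    # Hypothetical deprecation shim (sketch only).
    import warnings

    from airflow.providers.common.sql.hooks.sql import DbApiHook as _CommonSqlDbApiHook

    warnings.warn(
        "airflow.hooks.dbapi is deprecated; use airflow.providers.common.sql.hooks.sql instead.",
        DeprecationWarning,
        stacklevel=2,
    )


    class DbApiHook(_CommonSqlDbApiHook):
        """Deprecated alias kept only for backwards compatibility."""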

potiuk added a commit to potiuk/airflow that referenced this pull request Aug 14, 2022
There was a bug in an incoming change to common-sql provider
introduced in apache#23971 where `;-less` statements were removed
when "split_statements" flag was used. Since this flag is used
by default in Databricks statement, it introduced backwards
incompatible change.
potiuk added a commit that referenced this pull request Aug 15, 2022
There was a bug in an incoming change to common-sql provider
introduced in #23971 where `;-less` statements were removed
when "split_statements" flag was used. Since this flag is used
by default in Databricks statement, it introduced backwards
incompatible change.
@hewerthomn
Contributor

After this update, every statement using OracleOperator now fails:

[2022-08-19, 04:09:47 -04] {sql.py:315} INFO - Running statement:
DECLARE
    v_sql LONG;
BEGIN
    v_sql := '
CREATE TABLE usr_bi_cgj.aguarda_recebimento_mandado
(
    nr_processo         VARCHAR2(25) NOT NULL,
    ds_tarefa           VARCHAR2(300) NOT NULL,
    dt_inicial          TIMESTAMP(6) NOT NULL ,
    id_orgaojulgador    NUMBER(15) NOT NULL,
    cd_especie          VARCHAR2(200) NOT NULL,
    qt_dias             NUMBER(22) NOT NULL
)
';
    EXECUTE IMMEDIATE v_sql;
    COMMIT;
    EXCEPTION
        WHEN OTHERS
        THEN EXECUTE IMMEDIATE 'TRUNCATE TABLE usr_bi_cgj.aguarda_recebimento_mandado';
    COMMIT;
END
, parameters: None
[2022-08-19, 04:09:47 -04] {taskinstance.py:1909} ERROR - Task failed with exception
Traceback (most recent call last):
  File "/home/airflow/.local/lib/python3.7/site-packages/airflow/providers/oracle/operators/oracle.py", line 69, in execute
    hook.run(self.sql, autocommit=self.autocommit, parameters=self.parameters)
  File "/home/airflow/.local/lib/python3.7/site-packages/airflow/providers/common/sql/hooks/sql.py", line 295, in run
    self._run_command(cur, sql_statement, parameters)
  File "/home/airflow/.local/lib/python3.7/site-packages/airflow/providers/common/sql/hooks/sql.py", line 320, in _run_command
    cur.execute(sql_statement)
  File "/home/airflow/.local/lib/python3.7/site-packages/oracledb/cursor.py", line 378, in execute
    impl.execute(self)
  File "src/oracledb/impl/thin/cursor.pyx", line 121, in oracledb.thin_impl.ThinCursorImpl.execute
  File "src/oracledb/impl/thin/protocol.pyx", line 375, in oracledb.thin_impl.Protocol._process_single_message
  File "src/oracledb/impl/thin/protocol.pyx", line 376, in oracledb.thin_impl.Protocol._process_single_message
  File "src/oracledb/impl/thin/protocol.pyx", line 369, in oracledb.thin_impl.Protocol._process_message
oracledb.exceptions.DatabaseError: ORA-06550: line 21, column 3:
PLS-00103: Encountered the symbol "end-of-file" when expecting one of the following:

   ; <an identifier>
   <a double-quoted delimited-identifier>
The symbol ";" was substituted for "end-of-file" to continue

OracleOperator with a single query works, but every operator using a BEGIN block now fails with a similar message:

PLS-00103: Encountered the symbol "end-of-file" when expecting one of the following:

   ; <an identifier>
   <a double-quoted delimited-identifier>
The symbol ";" was substituted for "end-of-file" to continue

@KarthikRajashekaran

KarthikRajashekaran commented Aug 19, 2022

Updated to:

     Airflow 2.3.3 
     python3.8
     apache-airflow-providers-amazon==5.0.0
     apache-airflow-providers-postgres==5.2.0
     apache-airflow-providers-common-sql 

In the DAG:

from airflow.providers.amazon.aws.operators.redshift_sql import RedshiftSQLOperator

Still, I am not able to run multiple SQL statements in a single file using RedshiftSQLOperator:

select * from table A; delete from table B;

@potiuk
Member

potiuk commented Aug 19, 2022

@hewerthomn @KarthikRajashekaran -> can you please open separate issues for that with all details there?

@kazanzhy
Contributor Author

kazanzhy commented Aug 20, 2022

Hi @hewerthomn. Sorry for the inconvenience.
Unfortunately, some bugs were introduced by this PR. I think your case is fixed by PR #25713.

@kazanzhy
Contributor Author

kazanzhy commented Aug 20, 2022

@KarthikRajashekaran that's correct.

This PR added the ability to split statements when you're using the hook.
Currently, I'm working on making it possible for the operators as well (see #25717). For that you will use:

op = RedshiftSQLOperator(
    ...
    split_statements=True
    ...
)
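
Until the operator-level flag is available, a hedged sketch of doing the split at the hook level (the connection id is an assumption, and split_statements in run() requires the common-sql version containing this PR):

    # Sketch only: split and run multiple statements at the hook level.
    from airflow.providers.amazon.aws.hooks.redshift_sql import RedshiftSQLHook

    hook = RedshiftSQLHook(redshift_conn_id="redshift_default")  # conn id is an assumption
    hook.run(
        "select * from table_a; delete from table_b;",
        autocommit=True,
        split_statements=True,  # run each ';'-separated statement in turn
    )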

@potiuk
Member

potiuk commented Aug 20, 2022

Hi @hewerthomn. Sorry for the inconvenience. Unfortunately, some bugs were introduced by this PR. I think your case is fixed by PR #25713.

@hewerthomn - can you please install the common-sql 1.1.0 provider and check if it is fixed?

@hewerthomn
Contributor

Hi @potiuk and @kazanzhy, I added the package apache-airflow-providers-common-sql==1.1.0, but the same errors continue.

Log

[2022-08-20, 14:58:14 ] {sql.py:315} INFO - Running statement: DECLARE
    v_sql LONG;
BEGIN
    v_sql := '
create table usr_bi_cgj.dim_tarefa
(
    id_tarefa   NUMBER(22) not null primary key,
    ds_tarefa   VARCHAR2(4000) not NULL
);
';
    EXECUTE IMMEDIATE v_sql;
    COMMIT;
    EXCEPTION
        WHEN OTHERS
        THEN EXECUTE IMMEDIATE 'TRUNCATE TABLE usr_bi_cgj.dim_tarefa';
    COMMIT;
END, parameters: None
[2022-08-20, 14:58:14 ] {taskinstance.py:1909} ERROR - Task failed with exception
Traceback (most recent call last):
  File "/home/airflow/.local/lib/python3.7/site-packages/airflow/providers/oracle/operators/oracle.py", line 69, in execute
    hook.run(self.sql, autocommit=self.autocommit, parameters=self.parameters)
  File "/home/airflow/.local/lib/python3.7/site-packages/airflow/providers/common/sql/hooks/sql.py", line 295, in run
    self._run_command(cur, sql_statement, parameters)
  File "/home/airflow/.local/lib/python3.7/site-packages/airflow/providers/common/sql/hooks/sql.py", line 320, in _run_command
    cur.execute(sql_statement)
  File "/home/airflow/.local/lib/python3.7/site-packages/oracledb/cursor.py", line 378, in execute
    impl.execute(self)
  File "src/oracledb/impl/thin/cursor.pyx", line 121, in oracledb.thin_impl.ThinCursorImpl.execute
  File "src/oracledb/impl/thin/protocol.pyx", line 375, in oracledb.thin_impl.Protocol._process_single_message
  File "src/oracledb/impl/thin/protocol.pyx", line 376, in oracledb.thin_impl.Protocol._process_single_message
  File "src/oracledb/impl/thin/protocol.pyx", line 369, in oracledb.thin_impl.Protocol._process_message
oracledb.exceptions.DatabaseError: ORA-06550: line 17, column 3:
PLS-00103: Encountered the symbol "end-of-file" when expecting one of the following:

   ; <an identifier>
   <a double-quoted delimited-identifier>
The symbol ";" was substituted for "end-of-file" to continue.

@eladkal
Contributor

eladkal commented Aug 20, 2022

This looks like PL/SQL, not SQL.
I don't think it was ever intended to work with SqlOperator of any kind.
When we talked about the unified solution I didn't consider PL/pgSQL or PL/SQL (or equivalents), for the simple reason that it's not pure SQL.

I guess if you want it to work with SQL operators, you should wrap your statements in a stored procedure and then invoke it with a SELECT statement.
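
For example, a hedged sketch of that kind of workaround using OracleStoredProcedureOperator (mentioned later in this thread); the procedure name and connection id are illustrative assumptions, and it presumes the PL/SQL block has already been wrapped in a stored procedure on the database side:

    # Sketch only: call a database-side stored procedure instead of sending raw PL/SQL.
    from airflow.providers.oracle.operators.oracle import OracleStoredProcedureOperator

    recreate_dim_tarefa = OracleStoredProcedureOperator(
        task_id="recreate_dim_tarefa",
        oracle_conn_id="oracle_default",             # assumption
        procedure="usr_bi_cgj.recreate_dim_tarefa",  # hypothetical procedure wrapping the block
        parameters=None,
    )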

@hewerthomn
Contributor

I see...

So it will stop working with PL/SQL now?

If I need to write procedures for all my operators, I will use the previous version of the package for a while instead.

Thx

@potiuk
Member

potiuk commented Aug 20, 2022

I think this is really a bug, @eladkal @kazanzhy - especially if it worked before.

I don't think we have any limitation here, and in most cases we shouldn't care whether the statement we send is SQL or not. If it can be sent through the DB API and the server processes it, it should be fine. Why would we care if it is a SELECT query or PL/SQL?

I think the problem in this case is that we strip the ; at the end of the statement. Am I right, @hewerthomn? Can you confirm that the same query worked before? Can you send some output of the query run on the previous version?

@kazanzhy
Contributor Author

kazanzhy commented Aug 20, 2022

@hewerthomn can you please create a separate issue and post the query that worked before?
I was sure that it was because of the ;.
Also, please describe which operator you are using: OracleOperator or OracleStoredProcedureOperator.
And how do you use it, like
sql="DECLARE ..." or
sql=["DECLARE ..."]

@hewerthomn
Contributor

@kazanzhy yes, I can post the issue.

I was using OracleOperator

@hewerthomn
Contributor

I use sql="DECLARE ..."

@eladkal
Contributor

eladkal commented Aug 21, 2022

Why would we care if it is a SELECT query or PL/SQL?

Because PL/SQL has complex multi-statement blocks that cannot be split.

I also think we addressed this already?
I'm not near my laptop, but as far as I remember we added a split_statements flag, so:
OracleOperator(..., split_statements=False)
should work in this case.

@hai-nv

hai-nv commented Aug 21, 2022

Hi @potiuk and @kazanzhy, I added the package apache-airflow-providers-common-sql==1.1.0, but the same errors continue.

@hewerthomn
Have you tried removing the ';' inside the dynamic query?
From:
v_sql := ' create table usr_bi_cgj.dim_tarefa ( id_tarefa NUMBER(22) not null primary key, ds_tarefa VARCHAR2(4000) not NULL ); ';
To:
v_sql := ' create table usr_bi_cgj.dim_tarefa ( id_tarefa NUMBER(22) not null primary key, ds_tarefa VARCHAR2(4000) not NULL ) ';

@hai-nv

hai-nv commented Aug 21, 2022

Why would we care if it is a SELECT query or PL/SQL?

Because PL/SQL has complex multi-statement blocks that cannot be split.

I also think we addressed this already? I'm not near my laptop, but as far as I remember we added a split_statements flag, so: OracleOperator(..., split_statements=False) should work in this case.

@eladkal,
OracleOperator and OracleStoredProcedureOperator do not have 'split_statements' yet (provider version 3.3.0):
airflow.exceptions.AirflowException: Invalid arguments were passed to OracleStoredProcedureOperator (task_id: call_procedure). Invalid arguments were: | **kwargs: {'split_statements': False}
And I have the same issue with ';' when using OracleStoredProcedureOperator to trigger a procedure:

[2022-08-21, 09:00:10 UTC] {sql.py:315} INFO - Running statement: BEGIN pkg_test_***.test_insert_into_tmp_test(:p_input); END, parameters: {'p_input': 'A'}
[2022-08-21, 09:00:10 UTC] {taskinstance.py:1909} ERROR - Task failed with exception
Traceback (most recent call last):
  File "/home/airflow/.local/lib/python3.7/site-packages/airflow/providers/oracle/operators/oracle.py", line 104, in execute
    return hook.callproc(self.procedure, autocommit=True, parameters=self.parameters)
  File "/home/airflow/.local/lib/python3.7/site-packages/airflow/providers/oracle/hooks/oracle.py", line 327, in callproc
    handler=handler,
  File "/home/airflow/.local/lib/python3.7/site-packages/airflow/providers/common/sql/hooks/sql.py", line 295, in run
    self._run_command(cur, sql_statement, parameters)
  File "/home/airflow/.local/lib/python3.7/site-packages/airflow/providers/common/sql/hooks/sql.py", line 318, in _run_command
    cur.execute(sql_statement, parameters)
  File "/home/airflow/.local/lib/python3.7/site-packages/oracledb/cursor.py", line 378, in execute
    impl.execute(self)
  File "src/oracledb/impl/thin/cursor.pyx", line 121, in oracledb.thin_impl.ThinCursorImpl.execute
  File "src/oracledb/impl/thin/protocol.pyx", line 375, in oracledb.thin_impl.Protocol._process_single_message
  File "src/oracledb/impl/thin/protocol.pyx", line 376, in oracledb.thin_impl.Protocol._process_single_message
  File "src/oracledb/impl/thin/protocol.pyx", line 369, in oracledb.thin_impl.Protocol._process_message
oracledb.exceptions.DatabaseError: ORA-06550: line 1, column 63:
PLS-00103: Encountered the symbol "end-of-file" when expecting one of the following:

   ; <an identifier> <a double-quoted delimited-identifier>
The symbol ";" was substituted for "end-of-file" to continue.

@potiuk
Member

potiuk commented Aug 21, 2022

I also think we addressed this already?
I'm not near my laptop, but as far as I remember we added a split_statements flag, so:
OracleOperator(..., split_statements=False)
should work in this case.

Yeah. I am not talking about "split_statements". I know it's hard to split such PL/SQL, but I believe the problem is that it does not work even when we set split_statements=False.

I think the problem is that we currently ALWAYS remove the closing ; from each statement, and this is the root of the problem, as apparently Oracle does not really like it.

            if split_statements:
                sql = self.split_sql_string(sql)
            else:
                sql = [self.strip_sql_string(sql)]

strip_sql_string:

    @staticmethod
    def strip_sql_string(sql: str) -> str:
        return sql.strip().rstrip(';')

The result is that if you pass the "run" method a statement with ";" at the end, it will be converted into a one-element array of statements, and the ; will be removed from that one element. I think this is the root cause of the problem.

Now the question is, @kazanzhy - why do we remove that ;? Is there a reason - do other DBs complain if we do not remove it? Unfortunately the DBAPI PEP https://peps.python.org/pep-0249/#cursor-methods is silent about whether the semicolon should be there or not. And while in SQL it is generally a separator between multiple statements, whether it is merely a separator (not needed in the statement) or a terminator (needed in the statement) is not only debatable but also changes over time. For example, whether the semicolon can be treated as just an optional separator or is a mandatory terminator has changed in SQL Server - https://wp.larnu.uk/fundamentals-the-semicolon-is-a-statement-terminator/ . It was possible to run a statement in SQL Server without a semicolon, but this was deprecated.

So I think we should have flexibility on whether to remove the semicolon or not.
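
For example, a minimal sketch (an assumption, not the implemented API) of making the stripping optional via a hypothetical strip_semicolon flag:

    # Hypothetical flag (sketch only): keep the trailing ';' for dialects that need it.
    @staticmethod
    def strip_sql_string(sql: str, strip_semicolon: bool = True) -> str:
        sql = sql.strip()
        return sql.rstrip(';') if strip_semicolon else sql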

@KarthikRajashekaran

@KarthikRajashekaran that's correct.

This PR added the ability to split statements when you're using the hook. Currently, I'm working on making it possible for the operators as well (see #25717). For that you will use:

op = RedshiftSQLOperator(
    ...
    split_statements=True
    ...
)

So I need to wait for #25717 to get Redshift to execute multiple statements?

@hewerthomn
Contributor

@hewerthomn can you please create a separate issue and post the query that worked before? I was sure that it was because of the ;. Also, please describe which operator you are using: OracleOperator or OracleStoredProcedureOperator. And how do you use it, like sql="DECLARE ..." or sql=["DECLARE ..."]

@kazanzhy I created the issue #25851
