
Commit d3a3853

AngersZhuuuu authored and pan3793 committed
[KYUUBI #5937] PVM causes cache table to not work
# 🔍 Description

## Issue References 🔗

This pull request fixes #5937

## Describe Your Solution 🔧

When a query that reads a persistent view is cached, the Kyuubi authz extension wraps the view in a `PermanentViewMarker` (PVM) during analysis, so the analyzed plan that gets cached contains the marker node. Spark matches cached plans by their canonicalized form, and the extra PVM node makes later queries over the same view fail to match the cached plan. This change implements `doCanonicalize()` on PVM so that canonicalization ignores the marker and the cached table can be matched again.

## Types of changes 🔖

- [x] Bugfix (non-breaking change which fixes an issue)
- [ ] New feature (non-breaking change which adds functionality)
- [ ] Breaking change (fix or feature that would cause existing functionality to change)

## Test Plan 🧪

#### Behavior Without This Pull Request ⚰️

#### Behavior With This Pull Request 🎉

#### Related Unit Tests

---

# Checklist 📝

- [ ] This patch was not authored or co-authored using [Generative Tooling](https://www.apache.org/legal/generative-tooling.html)

**Be nice. Be informative.**

Closes #5982 from AngersZhuuuu/KYUUBI-5937.

Closes #5937

e28275f [Angerszhuuuu] Update PermanentViewMarker.scala
c504103 [Angerszhuuuu] Update PermanentViewMarker.scala
19102ff [Angerszhuuuu] [KYUUBI-5937][Bug] PVM cause cache table not work

Authored-by: Angerszhuuuu <angers.zhu@gmail.com>
Signed-off-by: Cheng Pan <chengpan@apache.org>
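A minimal reproduction sketch of the scenario this patch targets (not part of the commit; the table/view names and session setup are illustrative, and it assumes a Spark session with the Kyuubi authz extension configured via `spark.sql.extensions`):

```scala
// Illustrative only: `t`, `v`, and `cached_v` are hypothetical names.
spark.sql("CREATE TABLE t (id INT) USING parquet")
spark.sql("CREATE VIEW v AS SELECT id FROM t")

// Cache a query over the permanent view; authz wraps `v` in a PermanentViewMarker.
spark.sql("CACHE TABLE cached_v AS SELECT id FROM v")

// Spark's cache lookup compares canonicalized plans. Without doCanonicalize() on the
// marker node, this scan does not match the cached plan; with the fix it reads from cache.
spark.sql("SELECT id FROM v").show()
```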
1 parent 3b2e674 commit d3a3853

File tree

1 file changed (+17 −2 lines)
  • extensions/spark/kyuubi-spark-authz/src/main/scala/org/apache/kyuubi/plugin/spark/authz/rule/permanentview


extensions/spark/kyuubi-spark-authz/src/main/scala/org/apache/kyuubi/plugin/spark/authz/rule/permanentview/PermanentViewMarker.scala

Lines changed: 17 additions & 2 deletions
```diff
@@ -21,11 +21,14 @@ import org.apache.spark.sql.catalyst.analysis.MultiInstanceRelation
 import org.apache.spark.sql.catalyst.catalog.CatalogTable
 import org.apache.spark.sql.catalyst.expressions.{Alias, Attribute, Cast}
 import org.apache.spark.sql.catalyst.plans.QueryPlan
-import org.apache.spark.sql.catalyst.plans.logical.{LeafNode, LogicalPlan, Project, Statistics}
+import org.apache.spark.sql.catalyst.plans.logical.{LeafNode, LogicalPlan, Project, Statistics, View}
+import org.apache.spark.sql.catalyst.trees.TreeNodeTag

 case class PermanentViewMarker(child: LogicalPlan, catalogTable: CatalogTable)
   extends LeafNode with MultiInstanceRelation {

+  private val PVM_NEW_INSTANCE_TAG = TreeNodeTag[Unit]("__PVM_NEW_INSTANCE_TAG")
+
   override def output: Seq[Attribute] = child.output

   override def argString(maxFields: Int): String = ""
@@ -38,6 +41,18 @@ case class PermanentViewMarker(child: LogicalPlan, catalogTable: CatalogTable)
     val projectList = child.output.map { case attr =>
       Alias(Cast(attr, attr.dataType), attr.name)(explicitMetadata = Some(attr.metadata))
     }
-    this.copy(child = Project(projectList, child), catalogTable = catalogTable)
+    val newProj = Project(projectList, child)
+    newProj.setTagValue(PVM_NEW_INSTANCE_TAG, ())
+
+    this.copy(child = newProj, catalogTable = catalogTable)
+  }
+
+  override def doCanonicalize(): LogicalPlan = {
+    child match {
+      case p @ Project(_, view: View) if p.getTagValue(PVM_NEW_INSTANCE_TAG).contains(true) =>
+        view.canonicalized
+      case _ =>
+        child.canonicalized
+    }
   }
 }
```
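For context on why the `doCanonicalize()` override matters, here is a small sketch (not from this commit) using Spark's public `sameResult`, which compares canonicalized plans in the same way the cache manager decides whether a cached plan can serve a new query; the session setup and view name are illustrative:

```scala
import org.apache.spark.sql.SparkSession

// Illustrative only: sameResult compares canonicalized plans, the mechanism Spark's
// CacheManager relies on when matching a new query against cached plans.
val spark = SparkSession.builder().master("local[1]").getOrCreate()
spark.range(10).createOrReplaceTempView("nums")

val p1 = spark.sql("SELECT id FROM nums").queryExecution.analyzed
val p2 = spark.sql("SELECT id FROM nums").queryExecution.analyzed

// Prints true, because both plans canonicalize to the same form. A wrapper node that
// does not override doCanonicalize() would make otherwise-equal plans compare unequal,
// which is exactly how the PVM node was breaking cached-table matching.
println(p1.sameResult(p2))
```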
