
Commit f67140e

AngersZhuuuu authored and pan3793 committed
[KYUUBI #5594][AUTHZ] BuildQuery should respect normal node's input
# 🔍 Description

## Issue References 🔗

This pull request fixes #5594

## Describe Your Solution 🔧

For the case

```python
def filter_func(iterator):
    for pdf in iterator:
        yield pdf[pdf.id == 1]

df = spark.read.table("test_mapinpandas")
execute_result = df.mapInPandas(filter_func, df.schema).show()
```

the logical plan is

```
GlobalLimit 21
+- LocalLimit 21
   +- Project [cast(id#5 as string) AS id#11, name#6]
      +- MapInPandas filter_func(id#0, name#1), [id#5, name#6]
         +- HiveTableRelation [`default`.`test_mapinpandas`, org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe, Data Cols: [id#0, name#1], Partition Cols: []]
```

When handling `MapInPandas`, we did not match its input against the underlying `HiveTableRelation`, so we missed the input table's columns. This PR fixes that by removing the dedicated branch for each node type (`Project`, `Aggregate`, etc.) and handling all of them in one generic branch.

## Types of changes 🔖

- [x] Bugfix (non-breaking change which fixes an issue)
- [ ] New feature (non-breaking change which adds functionality)
- [ ] Breaking change (fix or feature that would cause existing functionality to change)

## Test Plan 🧪

#### Behavior Without This Pull Request ⚰️

For the case above, we miss the column info of table `test_mapinpandas`.

#### Behavior With This Pull Request 🎉

We get a privilege object for table `test_mapinpandas` with its column info.
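The fix can be pictured with a small toy model. The sketch below is hypothetical Python, not the plugin's Scala code: attribute names are plain strings, and `prune` / `requested_columns` are invented stand-ins for the plugin's `columnPrune` / `buildQuery`.

```python
# Toy model of the fix: a "normal" plan node whose output attributes are
# freshly numbered (like MapInPandas: input id#0/name#1, output id#5/name#6)
# makes the parent's projection useless for pruning, so we fall back to
# requesting the node's entire input set instead of dropping the columns.

def prune(attrs, available):
    """Keep only attribute names that actually occur in `available`."""
    return [a for a in attrs if a in available]

def requested_columns(plan, projection):
    """Walk a toy plan top-down and return the scan columns it needs."""
    if plan["kind"] == "scan":
        cols = prune(projection, plan["output"])
        # An empty projection means the whole table is read.
        return cols if cols else list(plan["output"])
    pruned = prune(projection + plan["references"], plan["input"])
    if not pruned:
        # MapInPandas-style node: the parent's projection says nothing
        # about this node's input, so request the whole input set
        # (the behavior this PR adds).
        return requested_columns(plan["child"], list(plan["input"]))
    return requested_columns(plan["child"], list(dict.fromkeys(pruned)))

scan = {"kind": "scan", "output": ["id#0", "name#1"]}
map_in_pandas = {
    "kind": "node",
    "child": scan,
    "input": ["id#0", "name#1"],
    "references": [],  # its output attributes id#5/name#6 are brand new
}

# Before the fix, the empty prune result silently dropped the table's
# columns; now both input columns are requested:
print(requested_columns(map_in_pandas, ["id#5", "name#6"]))  # -> ['id#0', 'name#1']
```

The key design point mirrored here is that pruning happens against each node's *input* set rather than per node type, so nodes that rewrite their attribute IDs no longer lose the lineage back to the scan.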
#### Related Unit Tests

---

# Checklists

## 📝 Author Self Checklist

- [x] My code follows the [style guidelines](https://kyuubi.readthedocs.io/en/master/contributing/code/style.html) of this project
- [x] I have performed a self-review
- [x] I have commented my code, particularly in hard-to-understand areas
- [x] I have made corresponding changes to the documentation
- [x] My changes generate no new warnings
- [x] I have added tests that prove my fix is effective or that my feature works
- [x] New and existing unit tests pass locally with my changes
- [x] This patch was not authored or co-authored using [Generative Tooling](https://www.apache.org/legal/generative-tooling.html)

## 📝 Committer Pre-Merge Checklist

- [x] Pull request title is okay.
- [x] No license issues.
- [x] Milestone correctly set?
- [x] Test coverage is ok
- [x] Assignees are selected.
- [x] Minimum number of approvals
- [x] No changes are requested

**Be nice. Be informative.**

Closes #5787 from AngersZhuuuu/KYUUBI-5594-approach2.
Closes #5594

e085455 [Angerszhuuuu] Update RangerSparkExtensionSuite.scala
49f09fb [Angerszhuuuu] Update RangerSparkExtensionSuite.scala
4781f75 [Angerszhuuuu] Update PrivilegesBuilderSuite.scala
9e9208d [Angerszhuuuu] Update V2JdbcTableCatalogRangerSparkExtensionSuite.scala
626d3dd [Angerszhuuuu] Update RangerSparkExtensionSuite.scala
3d69997 [Angerszhuuuu] Update PrivilegesBuilderSuite.scala
6eb4b8e [Angerszhuuuu] Update RangerSparkExtensionSuite.scala
61efb8a [Angerszhuuuu] update
794ebb7 [Angerszhuuuu] Merge branch 'master' into KYUUBI-5594-approach2
a236da8 [Angerszhuuuu] Update PrivilegesBuilderSuite.scala
74bd3f4 [Angerszhuuuu] Update RangerSparkExtensionSuite.scala
4acbc42 [Angerszhuuuu] Merge branch 'KYUUBI-5594-approach2' of https://github.com/AngersZhuuuu/incubator-kyuubi into KYUUBI-5594-approach2
266f7e8 [Angerszhuuuu] update
a6c7845 [Angerszhuuuu] Update PrivilegesBuilder.scala
d785d5f [Angerszhuuuu] Merge branch 'master' into KYUUBI-5594-approach2
014ef3b [Angerszhuuuu] Update PrivilegesBuilder.scala
7e1cd37 [Angerszhuuuu] Merge branch 'master' into KYUUBI-5594-approach2
71d2661 [Angerszhuuuu] update
db95941 [Angerszhuuuu] update
490eb95 [Angerszhuuuu] update
70d110e [Angerszhuuuu] Merge branch 'master' into KYUUBI-5594-approach2
e6a5877 [Angerszhuuuu] Update PrivilegesBuilder.scala
5ff22b1 [Angerszhuuuu] Update PrivilegesBuilder.scala
e684301 [Angerszhuuuu] Update PrivilegesBuilder.scala
594b202 [Angerszhuuuu] Update PrivilegesBuilder.scala
2f87c61 [Angerszhuuuu] Update RangerSparkExtensionSuite.scala
1de8c1c [Angerszhuuuu] Update PrivilegesBuilder.scala
ad17255 [Angerszhuuuu] Update PrivilegesBuilderSuite.scala
4f5e850 [Angerszhuuuu] update
64349ed [Angerszhuuuu] Update PrivilegesBuilder.scala
11b7a4c [Angerszhuuuu] Update PrivilegesBuilder.scala
9a58fb0 [Angerszhuuuu] update
d0b022e [Angerszhuuuu] Update RuleApplyPermanentViewMarker.scala
e0f28a6 [Angerszhuuuu] Merge branch 'master' into KYUUBI-5594
0ebdd5d [Angerszhuuuu] Merge branch 'master' into KYUUBI-5594
8e53236 [Angerszhuuuu] update
3bafa7c [Angerszhuuuu] update
d6e984e [Angerszhuuuu] update
b00bf5e [Angerszhuuuu] Update PrivilegesBuilder.scala
8214228 [Angerszhuuuu] update
93fc689 [Angerszhuuuu] Merge branch 'master' into KYUUBI-5594
04184e3 [Angerszhuuuu] update
0bb7624 [Angerszhuuuu] Revert "Revert "Update PrivilegesBuilder.scala""
f481283 [Angerszhuuuu] Revert "Update PrivilegesBuilder.scala"
9f87182 [Angerszhuuuu] Revert "Update PrivilegesBuilder.scala"
29b67c4 [Angerszhuuuu] Update PrivilegesBuilder.scala
8785ad1 [Angerszhuuuu] Update PrivilegesBuilder.scala
270f21d [Angerszhuuuu] Update RangerSparkExtensionSuite.scala
60872ef [Angerszhuuuu] Update RangerSparkExtensionSuite.scala
c34f32e [Angerszhuuuu] Merge branch 'master' into KYUUBI-5594
86fc475 [Angerszhuuuu] Update PrivilegesBuilder.scala
404f1ea [Angerszhuuuu] Update PrivilegesBuilder.scala
dcca394 [Angerszhuuuu] Update PrivilegesBuilder.scala
c2c6fa4 [Angerszhuuuu] Update PrivilegesBuilder.scala
6f6a36e [Angerszhuuuu] Merge branch 'master' into KYUUBI-5594]-AUTH]BuildQuery-should-respect-normal-node's-input
4dd47a1 [Angerszhuuuu] update
c549b6a [Angerszhuuuu] update
80013b9 [Angerszhuuuu] Update PrivilegesBuilder.scala
3cbba42 [Angerszhuuuu] Update PrivilegesBuilder.scala

Authored-by: Angerszhuuuu <angers.zhu@gmail.com>
Signed-off-by: Cheng Pan <chengpan@apache.org>
1 parent a2179cc commit f67140e

File tree

8 files changed, +337 −283 lines changed


extensions/spark/kyuubi-spark-authz/src/main/scala/org/apache/kyuubi/plugin/spark/authz/PrivilegesBuilder.scala

Lines changed: 57 additions & 36 deletions
```diff
@@ -20,7 +20,7 @@ package org.apache.kyuubi.plugin.spark.authz
 import scala.collection.mutable.ArrayBuffer
 
 import org.apache.spark.sql.SparkSession
-import org.apache.spark.sql.catalyst.expressions.{Expression, NamedExpression}
+import org.apache.spark.sql.catalyst.expressions.{AttributeSet, Expression, NamedExpression}
 import org.apache.spark.sql.catalyst.plans.logical._
 import org.apache.spark.sql.execution.command.ExplainCommand
 import org.slf4j.LoggerFactory
@@ -69,44 +69,20 @@ object PrivilegesBuilder {
       if (projectionList.isEmpty) {
         privilegeObjects += PrivilegeObject(table, plan.output.map(_.name))
       } else {
-        val cols = (projectionList ++ conditionList).flatMap(collectLeaves)
-          .filter(plan.outputSet.contains).map(_.name).distinct
-        privilegeObjects += PrivilegeObject(table, cols)
+        val cols = columnPrune(projectionList ++ conditionList, plan.outputSet)
+        privilegeObjects += PrivilegeObject(table, cols.map(_.name).distinct)
       }
     }
 
+    def columnPrune(projectionList: Seq[Expression], output: AttributeSet): Seq[NamedExpression] = {
+      (projectionList ++ conditionList)
+        .flatMap(collectLeaves)
+        .filter(output.contains)
+    }
+
     plan match {
       case p if p.getTagValue(KYUUBI_AUTHZ_TAG).nonEmpty =>
 
-      case p: Project => buildQuery(p.child, privilegeObjects, p.projectList, conditionList, spark)
-
-      case j: Join =>
-        val cols =
-          conditionList ++ j.condition.map(expr => collectLeaves(expr)).getOrElse(Nil)
-        buildQuery(j.left, privilegeObjects, projectionList, cols, spark)
-        buildQuery(j.right, privilegeObjects, projectionList, cols, spark)
-
-      case f: Filter =>
-        val cols = conditionList ++ collectLeaves(f.condition)
-        buildQuery(f.child, privilegeObjects, projectionList, cols, spark)
-
-      case w: Window =>
-        val orderCols = w.orderSpec.flatMap(orderSpec => collectLeaves(orderSpec))
-        val partitionCols = w.partitionSpec.flatMap(partitionSpec => collectLeaves(partitionSpec))
-        val cols = conditionList ++ orderCols ++ partitionCols
-        buildQuery(w.child, privilegeObjects, projectionList, cols, spark)
-
-      case s: Sort =>
-        val sortCols = s.order.flatMap(sortOrder => collectLeaves(sortOrder))
-        val cols = conditionList ++ sortCols
-        buildQuery(s.child, privilegeObjects, projectionList, cols, spark)
-
-      case a: Aggregate =>
-        val aggCols =
-          (a.aggregateExpressions ++ a.groupingExpressions).flatMap(e => collectLeaves(e))
-        val cols = conditionList ++ aggCols
-        buildQuery(a.child, privilegeObjects, projectionList, cols, spark)
-
       case scan if isKnownScan(scan) && scan.resolved =>
         val tables = getScanSpec(scan).tables(scan, spark)
         // If the the scan is table-based, we check privileges on the table we found
@@ -125,7 +101,33 @@ object PrivilegesBuilder {
 
       case p =>
         for (child <- p.children) {
-          buildQuery(child, privilegeObjects, projectionList, conditionList, spark)
+          // If current plan's references don't have relation to it's input, have two cases
+          // 1. `MapInPandas`, `ScriptTransformation`
+          // 2. `Project` output only have constant value
+          if (columnPrune(p.references.toSeq ++ p.output, p.inputSet).isEmpty) {
+            // If plan is project and output don't have relation to input, can ignore.
+            if (!p.isInstanceOf[Project]) {
+              buildQuery(
+                child,
+                privilegeObjects,
+                p.inputSet.map(_.toAttribute).toSeq,
+                Nil,
+                spark)
+            }
+          } else {
+            buildQuery(
+              child,
+              privilegeObjects,
+              // Here we use `projectList ++ p.reference` do column prune.
+              // For `Project`, `Aggregate`, plan's output is contained by plan's referenced
+              // For `Filter`, `Sort` etc... it rely on upper `Project` node,
+              // since we wrap a `Project` before call `buildQuery()`.
+              // So here we use upper node's projectionList and current's references
+              // to do column pruning can get the correct column.
+              columnPrune(projectionList ++ p.references.toSeq, p.inputSet).distinct,
+              conditionList ++ p.references,
+              spark)
+          }
         }
     }
   }
@@ -221,7 +223,26 @@ object PrivilegesBuilder {
           LOG.debug(ud.error(plan, e))
         }
       }
-      spec.queries(plan).foreach(buildQuery(_, inputObjs, spark = spark))
+      spec.queries(plan).foreach { p =>
+        if (p.resolved) {
+          buildQuery(Project(p.output, p), inputObjs, spark = spark)
+        } else {
+          try {
+            // For spark 3.1, Some command such as CreateTableASSelect, its query was unresolved,
+            // Before this pr, we just ignore it, now we support this.
+            val analyzed = spark.sessionState.analyzer.execute(p)
+            buildQuery(Project(analyzed.output, analyzed), inputObjs, spark = spark)
+          } catch {
+            case e: Exception =>
+              LOG.debug(
+                s"""
+                   |Failed to analyze unresolved
+                   |$p
+                   |due to ${e.getMessage}""".stripMargin,
+                e)
+          }
+        }
+      }
      spec.operationType
 
    case classname if FUNCTION_COMMAND_SPECS.contains(classname) =>
@@ -315,7 +336,7 @@ object PrivilegesBuilder {
       case cmd: Command => buildCommand(cmd, inputObjs, outputObjs, spark)
       // Queries
       case _ =>
-        buildQuery(plan, inputObjs, spark = spark)
+        buildQuery(Project(plan.output, plan), inputObjs, spark = spark)
         OperationType.QUERY
     }
     (inputObjs, outputObjs, opType)
```
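The generic branch in `PrivilegesBuilder` distinguishes two situations where pruning `references ++ output` against the input set comes up empty: a `Project` whose output is all constants (e.g. `SELECT 1 FROM t`) needs no input columns at all, while a `MapInPandas`/`ScriptTransformation`-style node must fall back to its whole input set. The following is a hypothetical Python sketch of just that decision; the node shapes and function names are invented for illustration, not the plugin's API:

```python
# Toy decision table for the "empty prune" cases the generic branch
# separates: constant-only Project vs. attribute-rewriting nodes.

def prune(attrs, available):
    """Keep only attribute names that actually occur in `available`."""
    return [a for a in attrs if a in available]

def columns_to_request(node):
    """Which of the node's input columns need a privilege check."""
    empty = not prune(node["references"] + node["output"], node["input"])
    if empty and node["type"] == "Project":
        return []                    # constant-only Project: nothing to check
    if empty:
        return list(node["input"])   # MapInPandas-style: whole input set
    return prune(node["references"], node["input"])

const_project = {"type": "Project", "references": [], "output": ["1 AS one"],
                 "input": ["id#0", "name#1"]}
map_in_pandas = {"type": "MapInPandas", "references": [], "output": ["id#5"],
                 "input": ["id#0", "name#1"]}

print(columns_to_request(const_project))  # -> []
print(columns_to_request(map_in_pandas))  # -> ['id#0', 'name#1']
```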

extensions/spark/kyuubi-spark-authz/src/test/scala/org/apache/kyuubi/plugin/spark/authz/PrivilegesBuilderSuite.scala

Lines changed: 20 additions & 24 deletions
```diff
@@ -59,11 +59,15 @@ abstract class PrivilegesBuilderSuite extends AnyFunSuite
   protected def checkColumns(plan: LogicalPlan, cols: Seq[String]): Unit = {
     val (in, out, _) = PrivilegesBuilder.build(plan, spark)
     assert(out.isEmpty, "Queries shall not check output privileges")
-    val po = in.head
-    assert(po.actionType === PrivilegeObjectActionType.OTHER)
-    assert(po.privilegeObjectType === PrivilegeObjectType.TABLE_OR_VIEW)
-    assert(po.columns === cols)
-    checkTableOwner(po)
+    if (in.nonEmpty) {
+      val po = in.head
+      assert(po.actionType === PrivilegeObjectActionType.OTHER)
+      assert(po.privilegeObjectType === PrivilegeObjectType.TABLE_OR_VIEW)
+      assert(po.columns === cols)
+      checkTableOwner(po)
+    } else {
+      assert(cols.isEmpty)
+    }
   }
 
   protected def checkColumns(query: String, cols: Seq[String]): Unit = {
@@ -365,7 +369,7 @@ abstract class PrivilegesBuilderSuite extends AnyFunSuite
     assertEqualsIgnoreCase(reusedPartTableShort)(po0.objectName)
     if (isSparkV32OrGreater) {
       // Query in AlterViewAsCommand can not be resolved before SPARK-34698
-      assert(po0.columns === Seq("key", "value", "pid"))
+      assert(po0.columns === Seq("key", "pid", "value"))
       checkTableOwner(po0)
     }
     val accessType0 = ranger.AccessType(po0, operationType, isInput = true)
@@ -526,12 +530,8 @@ abstract class PrivilegesBuilderSuite extends AnyFunSuite
     assert(po0.privilegeObjectType === PrivilegeObjectType.TABLE_OR_VIEW)
     assertEqualsIgnoreCase(reusedDb)(po0.dbname)
     assertEqualsIgnoreCase(reusedTableShort)(po0.objectName)
-    if (isSparkV32OrGreater) {
-      assert(po0.columns.head === "key")
-      checkTableOwner(po0)
-    } else {
-      assert(po0.columns.isEmpty)
-    }
+    assert(po0.columns.head === "key")
+    checkTableOwner(po0)
     val accessType0 = ranger.AccessType(po0, operationType, isInput = true)
     assert(accessType0 === AccessType.SELECT)
 
@@ -549,12 +549,8 @@ abstract class PrivilegesBuilderSuite extends AnyFunSuite
     assert(po0.privilegeObjectType === PrivilegeObjectType.TABLE_OR_VIEW)
     assertEqualsIgnoreCase(reusedDb)(po0.dbname)
     assertEqualsIgnoreCase(reusedTableShort)(po0.objectName)
-    if (isSparkV32OrGreater) {
-      assert(po0.columns === Seq("key", "value"))
-      checkTableOwner(po0)
-    } else {
-      assert(po0.columns.isEmpty)
-    }
+    assert(po0.columns === Seq("key", "value"))
+    checkTableOwner(po0)
     val accessType0 = ranger.AccessType(po0, operationType, isInput = true)
     assert(accessType0 === AccessType.SELECT)
 
@@ -1050,7 +1046,7 @@ abstract class PrivilegesBuilderSuite extends AnyFunSuite
     assertEqualsIgnoreCase(reusedDb)(po.dbname)
     assertStartsWithIgnoreCase(reusedTableShort)(po.objectName)
     assert(
-      po.columns === Seq("value", "pid", "key"),
+      po.columns === Seq("value", "key", "pid"),
       s"$reusedPartTable both 'key', 'value' and 'pid' should be authenticated")
     checkTableOwner(po)
     val accessType = ranger.AccessType(po, operationType, isInput = true)
@@ -1107,7 +1103,7 @@ abstract class PrivilegesBuilderSuite extends AnyFunSuite
     assertEqualsIgnoreCase(reusedDb)(po.dbname)
     assertStartsWithIgnoreCase(reusedTableShort)(po.objectName)
     assert(
-      po.columns === Seq("key", "value"),
+      po.columns.sorted === Seq("key", "value").sorted,
       s"$reusedPartTable 'key' is the join key and 'pid' is omitted")
     checkTableOwner(po)
     val accessType = ranger.AccessType(po, operationType, isInput = true)
@@ -1218,7 +1214,7 @@ abstract class PrivilegesBuilderSuite extends AnyFunSuite
     assertEqualsIgnoreCase(reusedDb)(po.dbname)
     assertStartsWithIgnoreCase(reusedTableShort)(po.objectName)
     assert(
-      po.columns === Seq("key", "value", "pid"),
+      po.columns === Seq("key", "pid", "value"),
       s"$reusedPartTable both 'key', 'value' and 'pid' should be authenticated")
     checkTableOwner(po)
     val accessType = ranger.AccessType(po, operationType, isInput = true)
@@ -1625,7 +1621,7 @@ class HiveCatalogPrivilegeBuilderSuite extends PrivilegesBuilderSuite {
     assert(po0.privilegeObjectType === PrivilegeObjectType.TABLE_OR_VIEW)
     assertEqualsIgnoreCase(reusedDb)(po0.dbname)
     assert(po0.objectName equalsIgnoreCase reusedPartTable.split("\\.").last)
-    assert(po0.columns === Seq("key", "value", "pid"))
+    assert(po0.columns === Seq("key", "pid", "value"))
     checkTableOwner(po0)
     val accessType0 = ranger.AccessType(po0, operationType, isInput = true)
     assert(accessType0 === AccessType.SELECT)
@@ -1721,7 +1717,7 @@ class HiveCatalogPrivilegeBuilderSuite extends PrivilegesBuilderSuite {
     assert(out1.isEmpty)
     val pi1 = in1.head
     assert(pi1.columns.size === 3)
-    assert(pi1.columns === Seq("key", "value", "pid"))
+    assert(pi1.columns === Seq("key", "pid", "value"))
 
     // case2: Some columns are involved, and the group column is not selected.
     val plan2 = sql(s"SELECT COUNT(key) FROM $reusedPartTable GROUP BY pid")
@@ -1741,7 +1737,7 @@ class HiveCatalogPrivilegeBuilderSuite extends PrivilegesBuilderSuite {
     assert(out3.isEmpty)
     val pi3 = in3.head
     assert(pi3.columns.size === 2)
-    assert(pi3.columns === Seq("key", "pid"))
+    assert(pi3.columns === Seq("pid", "key"))
 
     // case4: HAVING & GROUP clause
     val plan4 = sql(s"SELECT COUNT(key) FROM $reusedPartTable GROUP BY pid HAVING MAX(key) > 1000")
```
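Many of the expected column lists in this suite change order because the unified code path collects referenced attributes in plan-traversal order and de-duplicates them with `.distinct`, and one assertion becomes order-insensitive via `.sorted`. A small standalone Python illustration of both behaviors (plain stdlib, nothing from the plugin):

```python
# First-seen-order dedup, the shape of `.distinct` on the pruned
# attribute sequence in the new code path:
refs = ["key", "pid", "key", "value", "pid"]
deduped = list(dict.fromkeys(refs))
print(deduped)  # -> ['key', 'pid', 'value']

# Order-insensitive comparison, mirroring the suite's
# `po.columns.sorted === Seq("key", "value").sorted` pattern:
print(sorted(deduped) == sorted(["value", "pid", "key"]))  # -> True
```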

extensions/spark/kyuubi-spark-authz/src/test/scala/org/apache/kyuubi/plugin/spark/authz/V2CommandsPrivilegesSuite.scala

Lines changed: 2 additions & 2 deletions
```diff
@@ -127,7 +127,7 @@ abstract class V2CommandsPrivilegesSuite extends PrivilegesBuilderSuite {
     assert(po0.catalog.isEmpty)
     assertEqualsIgnoreCase(reusedDb)(po0.dbname)
     assertEqualsIgnoreCase(reusedTableShort)(po0.objectName)
-    assert(po0.columns.take(2) === Seq("key", "value"))
+    assert(po0.columns === Seq("a", "key", "value"))
     checkTableOwner(po0)
 
     assert(outputs.size === 1)
@@ -186,7 +186,7 @@ abstract class V2CommandsPrivilegesSuite extends PrivilegesBuilderSuite {
     assert(po0.catalog.isEmpty)
     assertEqualsIgnoreCase(reusedDb)(po0.dbname)
     assertEqualsIgnoreCase(reusedTableShort)(po0.objectName)
-    assert(po0.columns.take(2) === Seq("key", "value"))
+    assert(po0.columns === Seq("a", "key", "value"))
     checkTableOwner(po0)
 
     assert(outputs.size === 1)
```

extensions/spark/kyuubi-spark-authz/src/test/scala/org/apache/kyuubi/plugin/spark/authz/ranger/DeltaCatalogRangerSparkExtensionSuite.scala

Lines changed: 18 additions & 15 deletions
```diff
@@ -215,8 +215,9 @@ class DeltaCatalogRangerSparkExtensionSuite extends RangerSparkExtensionSuite {
       s" SELECT * FROM $namespace1.$table2"
     interceptEndsWith[AccessControlException](
       doAs(someone, sql(insertIntoSql)))(
-      s"does not have [select] privilege on [$namespace1/$table2/id,$namespace1/$table2/name," +
-        s"$namespace1/$table2/gender,$namespace1/$table2/birthDate]," +
+      s"does not have [select] privilege on " +
+        s"[$namespace1/$table2/birthDate,$namespace1/$table2/gender," +
+        s"$namespace1/$table2/id,$namespace1/$table2/name]," +
         s" [update] privilege on [$namespace1/$table1]")
     doAs(admin, sql(insertIntoSql))
 
@@ -225,8 +226,9 @@ class DeltaCatalogRangerSparkExtensionSuite extends RangerSparkExtensionSuite {
       s" SELECT * FROM $namespace1.$table2"
     interceptEndsWith[AccessControlException](
       doAs(someone, sql(insertOverwriteSql)))(
-      s"does not have [select] privilege on [$namespace1/$table2/id,$namespace1/$table2/name," +
-        s"$namespace1/$table2/gender,$namespace1/$table2/birthDate]," +
+      s"does not have [select] privilege on " +
+        s"[$namespace1/$table2/birthDate,$namespace1/$table2/gender," +
+        s"$namespace1/$table2/id,$namespace1/$table2/name]," +
         s" [update] privilege on [$namespace1/$table1]")
     doAs(admin, sql(insertOverwriteSql))
   }
@@ -283,8 +285,9 @@ class DeltaCatalogRangerSparkExtensionSuite extends RangerSparkExtensionSuite {
       |""".stripMargin
     interceptEndsWith[AccessControlException](
       doAs(someone, sql(mergeIntoSql)))(
-      s"does not have [select] privilege on [$namespace1/$table2/id,$namespace1/$table2/name," +
-        s"$namespace1/$table2/gender,$namespace1/$table2/birthDate]," +
+      s"does not have [select] privilege on " +
+        s"[$namespace1/$table2/birthDate,$namespace1/$table2/gender," +
+        s"$namespace1/$table2/id,$namespace1/$table2/name]," +
         s" [update] privilege on [$namespace1/$table1]")
     doAs(admin, sql(mergeIntoSql))
   }
@@ -378,19 +381,19 @@ class DeltaCatalogRangerSparkExtensionSuite extends RangerSparkExtensionSuite {
     val insertIntoSql = s"INSERT INTO delta.`$path` SELECT * FROM $namespace1.$table2"
     interceptEndsWith[AccessControlException](
       doAs(someone, sql(insertIntoSql)))(
-      s"does not have [select] privilege on [$namespace1/$table2/id," +
-        s"$namespace1/$table2/name,$namespace1/$table2/gender," +
-        s"$namespace1/$table2/birthDate], [write] privilege on [[$path, $path/]]")
+      s"does not have [select] privilege on [$namespace1/$table2/birthDate," +
+        s"$namespace1/$table2/gender,$namespace1/$table2/id," +
+        s"$namespace1/$table2/name], [write] privilege on [[$path, $path/]]")
     doAs(admin, sql(insertIntoSql))
 
     // insert overwrite
     val insertOverwriteSql =
       s"INSERT OVERWRITE delta.`$path` SELECT * FROM $namespace1.$table2"
     interceptEndsWith[AccessControlException](
       doAs(someone, sql(insertOverwriteSql)))(
-      s"does not have [select] privilege on [$namespace1/$table2/id," +
-        s"$namespace1/$table2/name,$namespace1/$table2/gender," +
-        s"$namespace1/$table2/birthDate], [write] privilege on [[$path, $path/]]")
+      s"does not have [select] privilege on [$namespace1/$table2/birthDate," +
+        s"$namespace1/$table2/gender,$namespace1/$table2/id," +
+        s"$namespace1/$table2/name], [write] privilege on [[$path, $path/]]")
     doAs(admin, sql(insertOverwriteSql))
   })
 }
@@ -433,9 +436,9 @@ class DeltaCatalogRangerSparkExtensionSuite extends RangerSparkExtensionSuite {
       |""".stripMargin
     interceptEndsWith[AccessControlException](
       doAs(someone, sql(mergeIntoSql)))(
-      s"does not have [select] privilege on [$namespace1/$table2/id," +
-        s"$namespace1/$table2/name,$namespace1/$table2/gender," +
-        s"$namespace1/$table2/birthDate], [write] privilege on [[$path, $path/]]")
+      s"does not have [select] privilege on [$namespace1/$table2/birthDate," +
+        s"$namespace1/$table2/gender,$namespace1/$table2/id," +
+        s"$namespace1/$table2/name], [write] privilege on [[$path, $path/]]")
     doAs(admin, sql(mergeIntoSql))
   })
 }
```

extensions/spark/kyuubi-spark-authz/src/test/scala/org/apache/kyuubi/plugin/spark/authz/ranger/HudiCatalogRangerSparkExtensionSuite.scala

Lines changed: 1 addition & 1 deletion
```diff
@@ -507,7 +507,7 @@ class HudiCatalogRangerSparkExtensionSuite extends RangerSparkExtensionSuite {
     interceptEndsWith[AccessControlException] {
       doAs(someone, sql(mergeIntoSQL))
     }(s"does not have [select] privilege on " +
-      s"[$namespace1/$table2/id,$namespace1/$table2/name,$namespace1/$table2/city], " +
+      s"[$namespace1/$table2/city,$namespace1/$table2/id,$namespace1/$table2/name], " +
       s"[update] privilege on [$namespace1/$table1]")
     doAs(admin, sql(mergeIntoSQL))
   }
```
