[SPARK-29807][SQL] Rename "spark.sql.ansi.enabled" to "spark.sql.dialect.spark.ansi.enabled" #26444

Closed · wants to merge 6 commits

Changes from all commits:
8 changes: 4 additions & 4 deletions docs/sql-keywords.md
@@ -19,15 +19,15 @@ license: |
limitations under the License.
---

When `spark.sql.ansi.enabled` is true, Spark SQL has two kinds of keywords:
When `spark.sql.dialect.spark.ansi.enabled` is true, Spark SQL has two kinds of keywords:
* Reserved keywords: Keywords that are reserved and can't be used as identifiers for table, view, column, function, alias, etc.
* Non-reserved keywords: Keywords that have a special meaning only in particular contexts and can be used as identifiers in other contexts. For example, `SELECT 1 WEEK` is an interval literal, but WEEK can be used as an identifier in other places.

When `spark.sql.ansi.enabled` is false, Spark SQL has two kinds of keywords:
* Non-reserved keywords: Same definition as the one when `spark.sql.ansi.enabled=true`.
When `spark.sql.dialect.spark.ansi.enabled` is false, Spark SQL has two kinds of keywords:
* Non-reserved keywords: Same definition as the one when `spark.sql.dialect.spark.ansi.enabled=true`.
* Strict-non-reserved keywords: A strict version of non-reserved keywords, which cannot be used as a table alias.

By default `spark.sql.ansi.enabled` is false.
By default `spark.sql.dialect.spark.ansi.enabled` is false.
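
As a quick illustration of the two keyword classes (a sketch against a Spark shell session; the table name and exact error wording are assumptions, not part of this patch):

```scala
// With the Spark dialect's ANSI flag on, ANSI reserved words stop working as identifiers.
spark.conf.set("spark.sql.dialect.spark.ansi.enabled", "true")
spark.sql("CREATE TABLE t (select INT) USING parquet") // expected: parse error, 'select' is reserved
// Non-reserved keywords such as WEEK remain usable as identifiers.
spark.sql("SELECT 1 AS week")                          // still parses
```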

Below is a list of all the keywords in Spark SQL.

@@ -954,7 +954,7 @@ number
| MINUS? BIGDECIMAL_LITERAL #bigDecimalLiteral
;

// When `spark.sql.ansi.enabled=true`, there are 2 kinds of keywords in Spark SQL.
// When `spark.sql.dialect.spark.ansi.enabled=true`, there are 2 kinds of keywords in Spark SQL.
// - Reserved keywords:
// Keywords that are reserved and can't be used as identifiers for table, view, column,
// function, alias, etc.
@@ -1154,9 +1154,9 @@ ansiNonReserved
| YEARS
;

// When `spark.sql.ansi.enabled=false`, there are 2 kinds of keywords in Spark SQL.
// When `spark.sql.dialect.spark.ansi.enabled=false`, there are 2 kinds of keywords in Spark SQL.
// - Non-reserved keywords:
// Same definition as the one when `spark.sql.ansi.enabled=true`.
// Same definition as the one when `spark.sql.dialect.spark.ansi.enabled=true`.
// - Strict-non-reserved keywords:
// A strict version of non-reserved keywords, which cannot be used as a table alias.
// You can find the full keywords list by searching "Start of the keywords list" in this file.
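
To make the strict-non-reserved case concrete, a sketch of the intended behavior (assumed; the exact error is illustrative, not from this patch):

```scala
// With spark.sql.dialect.spark.ansi.enabled=false, MINUS stays a set operator,
// so it still cannot double as a table alias.
spark.conf.set("spark.sql.dialect.spark.ansi.enabled", "false")
spark.sql("SELECT * FROM range(2) MINUS SELECT * FROM range(1)") // parses as a set operation
spark.sql("SELECT * FROM range(2) MINUS")                        // expected: rejected as an alias
```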
@@ -592,7 +592,7 @@ abstract class CastBase extends UnaryExpression with TimeZoneAwareExpression wit
* Change the precision / scale in a given decimal to those set in `decimalType` (if any),
* modifying `value` in-place and returning it if successful. If an overflow occurs, it
* either returns null or throws an exception according to the value set for
* `spark.sql.ansi.enabled`.
* `spark.sql.dialect.spark.ansi.enabled`.
*
* NOTE: this modifies `value` in-place, so don't call it on external data.
*/
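
A sketch of the behavior this scaladoc describes, reusing the out-of-range value exercised by the cast tests below (shell session assumed):

```scala
// Overflowing a cast to DECIMAL(3,2): NULL with the flag off, an exception with it on.
spark.conf.set("spark.sql.dialect.spark.ansi.enabled", "false")
spark.sql("SELECT CAST('134.12' AS DECIMAL(3,2))").show() // NULL
spark.conf.set("spark.sql.dialect.spark.ansi.enabled", "true")
spark.sql("SELECT CAST('134.12' AS DECIMAL(3,2))").show() // ArithmeticException: cannot be represented
```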
@@ -611,7 +611,7 @@ abstract class CastBase extends UnaryExpression with TimeZoneAwareExpression wit

/**
* Create new `Decimal` with precision and scale given in `decimalType` (if any).
* If overflow occurs, if `spark.sql.ansi.enabled` is false, null is returned;
* If overflow occurs, if `spark.sql.dialect.spark.ansi.enabled` is false, null is returned;
* otherwise, an `ArithmeticException` is thrown.
*/
private[this] def toPrecision(value: Decimal, decimalType: DecimalType): Decimal =
@@ -150,7 +150,7 @@ abstract class BinaryArithmetic extends BinaryOperator with NullIntolerant {
sys.error("BinaryArithmetics must override either calendarIntervalMethod or genCode")

// Name of the function for the exact version of this expression in [[Math]].
// If the option "spark.sql.ansi.enabled" is enabled and there is a corresponding
// If the option "spark.sql.dialect.spark.ansi.enabled" is enabled and there is a corresponding
// function in [[Math]], the exact function will be called instead of evaluation with [[symbol]].
def exactMathMethod: Option[String] = None
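
The "exact version" refers to the `java.lang.Math` *Exact methods, which raise on overflow instead of wrapping; for example:

```scala
// Primitive addition wraps silently; the exact variant throws.
val wrapped = Long.MaxValue + 1L   // Long.MinValue
Math.addExact(Long.MaxValue, 1L)   // throws ArithmeticException: long overflow
```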

@@ -101,15 +101,15 @@ abstract class AbstractSqlParser(conf: SQLConf) extends ParserInterface with Log
lexer.removeErrorListeners()
lexer.addErrorListener(ParseErrorListener)
lexer.legacy_setops_precedence_enbled = conf.setOpsPrecedenceEnforced
lexer.ansi = conf.ansiEnabled
lexer.ansi = conf.dialectSparkAnsiEnabled

val tokenStream = new CommonTokenStream(lexer)
val parser = new SqlBaseParser(tokenStream)
parser.addParseListener(PostProcessor)
parser.removeErrorListeners()
parser.addErrorListener(ParseErrorListener)
parser.legacy_setops_precedence_enbled = conf.setOpsPrecedenceEnforced
parser.ansi = conf.ansiEnabled
parser.ansi = conf.dialectSparkAnsiEnabled

try {
try {
@@ -1673,6 +1673,20 @@ object SQLConf {
.checkValues(Dialect.values.map(_.toString))
.createWithDefault(Dialect.SPARK.toString)

val ANSI_ENABLED = buildConf("spark.sql.ansi.enabled")

Inline review (Member): Will we deprecate this?

Reply (Member, Author): I think so; maybe after a few versions, this config can be deleted.

.internal()
.doc("This configuration is deprecated and will be removed in the future releases." +
"It is replaced by spark.sql.dialect.spark.ansi.enabled.")
.booleanConf
.createWithDefault(false)

val DIALECT_SPARK_ANSI_ENABLED = buildConf("spark.sql.dialect.spark.ansi.enabled")
.doc("When true, Spark tries to conform to the ANSI SQL specification: 1. Spark will " +
"throw a runtime exception if an overflow occurs in any operation on integral/decimal " +
"field. 2. Spark will forbid using the reserved keywords of ANSI SQL as identifiers in " +
"the SQL parser.")
.fallbackConf(ANSI_ENABLED)
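
A sketch of what `.fallbackConf` buys here (assumed runtime-conf usage; the point is the read-through to the legacy key):

```scala
// If only the legacy key is set, the new key reads through to it.
spark.conf.set("spark.sql.ansi.enabled", "true")
spark.conf.get("spark.sql.dialect.spark.ansi.enabled") // "true", via fallback
// An explicit value on the new key takes precedence over the fallback.
spark.conf.set("spark.sql.dialect.spark.ansi.enabled", "false")
spark.conf.get("spark.sql.dialect.spark.ansi.enabled") // "false"
```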

val ALLOW_CREATING_MANAGED_TABLE_USING_NONEMPTY_LOCATION =
buildConf("spark.sql.legacy.allowCreatingManagedTableUsingNonemptyLocation")
.internal()
@@ -1784,14 +1798,6 @@
.checkValues(StoreAssignmentPolicy.values.map(_.toString))
.createWithDefault(StoreAssignmentPolicy.ANSI.toString)

val ANSI_ENABLED = buildConf("spark.sql.ansi.enabled")
.doc("When true, Spark tries to conform to the ANSI SQL specification: 1. Spark will " +
"throw a runtime exception if an overflow occurs in any operation on integral/decimal " +
"field. 2. Spark will forbid using the reserved keywords of ANSI SQL as identifiers in " +
"the SQL parser.")
.booleanConf
.createWithDefault(false)

val SORT_BEFORE_REPARTITION =
buildConf("spark.sql.execution.sortBeforeRepartition")
.internal()
@@ -2521,9 +2527,11 @@ class SQLConf extends Serializable with Logging {
def storeAssignmentPolicy: StoreAssignmentPolicy.Value =
StoreAssignmentPolicy.withName(getConf(STORE_ASSIGNMENT_POLICY))

def ansiEnabled: Boolean = getConf(ANSI_ENABLED)
def usePostgreSQLDialect: Boolean = getConf(DIALECT) == Dialect.POSTGRESQL.toString

def dialectSparkAnsiEnabled: Boolean = getConf(DIALECT_SPARK_ANSI_ENABLED)

def usePostgreSQLDialect: Boolean = getConf(DIALECT) == Dialect.POSTGRESQL.toString()
def ansiEnabled: Boolean = usePostgreSQLDialect || dialectSparkAnsiEnabled
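
The derived accessor can be exercised directly against a fresh `SQLConf` (a sketch; the assertions just restate the definitions above):

```scala
val conf = new SQLConf
conf.setConfString("spark.sql.dialect", "POSTGRESQL")
assert(conf.ansiEnabled) // the PostgreSQL dialect implies ANSI semantics
conf.setConfString("spark.sql.dialect", "SPARK")
conf.setConfString("spark.sql.dialect.spark.ansi.enabled", "true")
assert(conf.ansiEnabled) // the Spark dialect honors the renamed flag
```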

def nestedSchemaPruningEnabled: Boolean = getConf(NESTED_SCHEMA_PRUNING_ENABLED)

@@ -436,7 +436,7 @@ class ExpressionEncoderSuite extends CodegenInterpretedPlanTest with AnalysisTes
testAndVerifyNotLeakingReflectionObjects(
s"overflowing $testName, ansiEnabled=$ansiEnabled") {
withSQLConf(
SQLConf.ANSI_ENABLED.key -> ansiEnabled.toString
SQLConf.DIALECT_SPARK_ANSI_ENABLED.key -> ansiEnabled.toString
) {
// Need to construct Encoder here rather than implicitly resolving it
// so that SQLConf changes are respected.
@@ -169,7 +169,7 @@ class RowEncoderSuite extends CodegenInterpretedPlanTest {
}

private def testDecimalOverflow(schema: StructType, row: Row): Unit = {
withSQLConf(SQLConf.ANSI_ENABLED.key -> "true") {
withSQLConf(SQLConf.DIALECT_SPARK_ANSI_ENABLED.key -> "true") {
val encoder = RowEncoder(schema).resolveAndBind()
intercept[Exception] {
encoder.toRow(row)
@@ -182,7 +182,7 @@ class RowEncoderSuite extends CodegenInterpretedPlanTest {
}
}

withSQLConf(SQLConf.ANSI_ENABLED.key -> "false") {
withSQLConf(SQLConf.DIALECT_SPARK_ANSI_ENABLED.key -> "false") {
val encoder = RowEncoder(schema).resolveAndBind()
assert(encoder.fromRow(encoder.toRow(row)).get(0) == null)
}
@@ -61,7 +61,7 @@ class ArithmeticExpressionSuite extends SparkFunSuite with ExpressionEvalHelper
checkEvaluation(Add(positiveLongLit, negativeLongLit), -1L)

Seq("true", "false").foreach { checkOverflow =>
withSQLConf(SQLConf.ANSI_ENABLED.key -> checkOverflow) {
withSQLConf(SQLConf.DIALECT_SPARK_ANSI_ENABLED.key -> checkOverflow) {
DataTypeTestUtils.numericAndInterval.foreach { tpe =>
checkConsistencyBetweenInterpretedAndCodegenAllowingException(Add, tpe, tpe)
}
@@ -80,7 +80,7 @@ class ArithmeticExpressionSuite extends SparkFunSuite with ExpressionEvalHelper
checkEvaluation(UnaryMinus(Literal(Int.MinValue)), Int.MinValue)
checkEvaluation(UnaryMinus(Literal(Short.MinValue)), Short.MinValue)
checkEvaluation(UnaryMinus(Literal(Byte.MinValue)), Byte.MinValue)
withSQLConf(SQLConf.ANSI_ENABLED.key -> "true") {
withSQLConf(SQLConf.DIALECT_SPARK_ANSI_ENABLED.key -> "true") {
checkExceptionInExpression[ArithmeticException](
UnaryMinus(Literal(Long.MinValue)), "overflow")
checkExceptionInExpression[ArithmeticException](
@@ -122,7 +122,7 @@ class ArithmeticExpressionSuite extends SparkFunSuite with ExpressionEvalHelper
checkEvaluation(Subtract(positiveLongLit, negativeLongLit), positiveLong - negativeLong)

Seq("true", "false").foreach { checkOverflow =>
withSQLConf(SQLConf.ANSI_ENABLED.key -> checkOverflow) {
withSQLConf(SQLConf.DIALECT_SPARK_ANSI_ENABLED.key -> checkOverflow) {
DataTypeTestUtils.numericAndInterval.foreach { tpe =>
checkConsistencyBetweenInterpretedAndCodegenAllowingException(Subtract, tpe, tpe)
}
@@ -144,7 +144,7 @@ class ArithmeticExpressionSuite extends SparkFunSuite with ExpressionEvalHelper
checkEvaluation(Multiply(positiveLongLit, negativeLongLit), positiveLong * negativeLong)

Seq("true", "false").foreach { checkOverflow =>
withSQLConf(SQLConf.ANSI_ENABLED.key -> checkOverflow) {
withSQLConf(SQLConf.DIALECT_SPARK_ANSI_ENABLED.key -> checkOverflow) {
DataTypeTestUtils.numericTypeWithoutDecimal.foreach { tpe =>
checkConsistencyBetweenInterpretedAndCodegenAllowingException(Multiply, tpe, tpe)
}
@@ -445,12 +445,12 @@ class ArithmeticExpressionSuite extends SparkFunSuite with ExpressionEvalHelper
val e4 = Add(minLongLiteral, minLongLiteral)
val e5 = Subtract(minLongLiteral, maxLongLiteral)
val e6 = Multiply(minLongLiteral, minLongLiteral)
withSQLConf(SQLConf.ANSI_ENABLED.key -> "true") {
withSQLConf(SQLConf.DIALECT_SPARK_ANSI_ENABLED.key -> "true") {
Seq(e1, e2, e3, e4, e5, e6).foreach { e =>
checkExceptionInExpression[ArithmeticException](e, "overflow")
}
}
withSQLConf(SQLConf.ANSI_ENABLED.key -> "false") {
withSQLConf(SQLConf.DIALECT_SPARK_ANSI_ENABLED.key -> "false") {
checkEvaluation(e1, Long.MinValue)
checkEvaluation(e2, Long.MinValue)
checkEvaluation(e3, -2L)
@@ -469,12 +469,12 @@ class ArithmeticExpressionSuite extends SparkFunSuite with ExpressionEvalHelper
val e4 = Add(minIntLiteral, minIntLiteral)
val e5 = Subtract(minIntLiteral, maxIntLiteral)
val e6 = Multiply(minIntLiteral, minIntLiteral)
withSQLConf(SQLConf.ANSI_ENABLED.key -> "true") {
withSQLConf(SQLConf.DIALECT_SPARK_ANSI_ENABLED.key -> "true") {
Seq(e1, e2, e3, e4, e5, e6).foreach { e =>
checkExceptionInExpression[ArithmeticException](e, "overflow")
}
}
withSQLConf(SQLConf.ANSI_ENABLED.key -> "false") {
withSQLConf(SQLConf.DIALECT_SPARK_ANSI_ENABLED.key -> "false") {
checkEvaluation(e1, Int.MinValue)
checkEvaluation(e2, Int.MinValue)
checkEvaluation(e3, -2)
@@ -493,12 +493,12 @@ class ArithmeticExpressionSuite extends SparkFunSuite with ExpressionEvalHelper
val e4 = Add(minShortLiteral, minShortLiteral)
val e5 = Subtract(minShortLiteral, maxShortLiteral)
val e6 = Multiply(minShortLiteral, minShortLiteral)
withSQLConf(SQLConf.ANSI_ENABLED.key -> "true") {
withSQLConf(SQLConf.DIALECT_SPARK_ANSI_ENABLED.key -> "true") {
Seq(e1, e2, e3, e4, e5, e6).foreach { e =>
checkExceptionInExpression[ArithmeticException](e, "overflow")
}
}
withSQLConf(SQLConf.ANSI_ENABLED.key -> "false") {
withSQLConf(SQLConf.DIALECT_SPARK_ANSI_ENABLED.key -> "false") {
checkEvaluation(e1, Short.MinValue)
checkEvaluation(e2, Short.MinValue)
checkEvaluation(e3, (-2).toShort)
@@ -517,12 +517,12 @@ class ArithmeticExpressionSuite extends SparkFunSuite with ExpressionEvalHelper
val e4 = Add(minByteLiteral, minByteLiteral)
val e5 = Subtract(minByteLiteral, maxByteLiteral)
val e6 = Multiply(minByteLiteral, minByteLiteral)
withSQLConf(SQLConf.ANSI_ENABLED.key -> "true") {
withSQLConf(SQLConf.DIALECT_SPARK_ANSI_ENABLED.key -> "true") {
Seq(e1, e2, e3, e4, e5, e6).foreach { e =>
checkExceptionInExpression[ArithmeticException](e, "overflow")
}
}
withSQLConf(SQLConf.ANSI_ENABLED.key -> "false") {
withSQLConf(SQLConf.DIALECT_SPARK_ANSI_ENABLED.key -> "false") {
checkEvaluation(e1, Byte.MinValue)
checkEvaluation(e2, Byte.MinValue)
checkEvaluation(e3, (-2).toByte)
@@ -891,7 +891,7 @@ abstract class CastSuiteBase extends SparkFunSuite with ExpressionEvalHelper {
}

test("Throw exception on casting out-of-range value to decimal type") {
withSQLConf(SQLConf.ANSI_ENABLED.key -> requiredAnsiEnabledForOverflowTestCases.toString) {
withSQLConf(
SQLConf.DIALECT_SPARK_ANSI_ENABLED.key -> requiredAnsiEnabledForOverflowTestCases.toString) {
checkExceptionInExpression[ArithmeticException](
cast(Literal("134.12"), DecimalType(3, 2)), "cannot be represented")
checkExceptionInExpression[ArithmeticException](
@@ -957,7 +958,8 @@ abstract class CastSuiteBase extends SparkFunSuite with ExpressionEvalHelper {
}

test("Throw exception on casting out-of-range value to byte type") {
withSQLConf(SQLConf.ANSI_ENABLED.key -> requiredAnsiEnabledForOverflowTestCases.toString) {
withSQLConf(
SQLConf.DIALECT_SPARK_ANSI_ENABLED.key -> requiredAnsiEnabledForOverflowTestCases.toString) {
testIntMaxAndMin(ByteType)
Seq(Byte.MaxValue + 1, Byte.MinValue - 1).foreach { value =>
checkExceptionInExpression[ArithmeticException](cast(value, ByteType), "overflow")
@@ -982,7 +984,8 @@ abstract class CastSuiteBase extends SparkFunSuite with ExpressionEvalHelper {
}

test("Throw exception on casting out-of-range value to short type") {
withSQLConf(SQLConf.ANSI_ENABLED.key -> requiredAnsiEnabledForOverflowTestCases.toString) {
withSQLConf(
SQLConf.DIALECT_SPARK_ANSI_ENABLED.key -> requiredAnsiEnabledForOverflowTestCases.toString) {
testIntMaxAndMin(ShortType)
Seq(Short.MaxValue + 1, Short.MinValue - 1).foreach { value =>
checkExceptionInExpression[ArithmeticException](cast(value, ShortType), "overflow")
@@ -1007,7 +1010,8 @@ abstract class CastSuiteBase extends SparkFunSuite with ExpressionEvalHelper {
}

test("Throw exception on casting out-of-range value to int type") {
withSQLConf(SQLConf.ANSI_ENABLED.key -> requiredAnsiEnabledForOverflowTestCases.toString) {
withSQLConf(
SQLConf.DIALECT_SPARK_ANSI_ENABLED.key ->requiredAnsiEnabledForOverflowTestCases.toString) {
testIntMaxAndMin(IntegerType)
testLongMaxAndMin(IntegerType)

@@ -1024,7 +1028,8 @@ abstract class CastSuiteBase extends SparkFunSuite with ExpressionEvalHelper {
}

test("Throw exception on casting out-of-range value to long type") {
withSQLConf(SQLConf.ANSI_ENABLED.key -> requiredAnsiEnabledForOverflowTestCases.toString) {
withSQLConf(
SQLConf.DIALECT_SPARK_ANSI_ENABLED.key -> requiredAnsiEnabledForOverflowTestCases.toString) {
testLongMaxAndMin(LongType)

Seq(Long.MaxValue, 0, Long.MinValue).foreach { value =>
@@ -1201,7 +1206,7 @@ class CastSuite extends CastSuiteBase {
}

test("SPARK-28470: Cast should honor nullOnOverflow property") {
withSQLConf(SQLConf.ANSI_ENABLED.key -> "false") {
withSQLConf(SQLConf.DIALECT_SPARK_ANSI_ENABLED.key -> "false") {
checkEvaluation(Cast(Literal("134.12"), DecimalType(3, 2)), null)
checkEvaluation(
Cast(Literal(Timestamp.valueOf("2019-07-25 22:04:36")), DecimalType(3, 2)), null)
@@ -32,7 +32,7 @@ class DecimalExpressionSuite extends SparkFunSuite with ExpressionEvalHelper {
}

test("MakeDecimal") {
withSQLConf(SQLConf.ANSI_ENABLED.key -> "false") {
withSQLConf(SQLConf.DIALECT_SPARK_ANSI_ENABLED.key -> "false") {
checkEvaluation(MakeDecimal(Literal(101L), 3, 1), Decimal("10.1"))
checkEvaluation(MakeDecimal(Literal.create(null, LongType), 3, 1), null)
val overflowExpr = MakeDecimal(Literal.create(1000L, LongType), 3, 1)
@@ -41,7 +41,7 @@ class DecimalExpressionSuite extends SparkFunSuite with ExpressionEvalHelper {
evaluateWithoutCodegen(overflowExpr, null)
checkEvaluationWithUnsafeProjection(overflowExpr, null)
}
withSQLConf(SQLConf.ANSI_ENABLED.key -> "true") {
withSQLConf(SQLConf.DIALECT_SPARK_ANSI_ENABLED.key -> "true") {
checkEvaluation(MakeDecimal(Literal(101L), 3, 1), Decimal("10.1"))
checkEvaluation(MakeDecimal(Literal.create(null, LongType), 3, 1), null)
val overflowExpr = MakeDecimal(Literal.create(1000L, LongType), 3, 1)
@@ -57,7 +57,7 @@ class ScalaUDFSuite extends SparkFunSuite with ExpressionEvalHelper {
}

test("SPARK-28369: honor nullOnOverflow config for ScalaUDF") {
withSQLConf(SQLConf.ANSI_ENABLED.key -> "true") {
withSQLConf(SQLConf.DIALECT_SPARK_ANSI_ENABLED.key -> "true") {
val udf = ScalaUDF(
(a: java.math.BigDecimal) => a.multiply(new java.math.BigDecimal(100)),
DecimalType.SYSTEM_DEFAULT,
@@ -69,7 +69,7 @@ class ScalaUDFSuite extends SparkFunSuite with ExpressionEvalHelper {
}
assert(e2.getCause.isInstanceOf[ArithmeticException])
}
withSQLConf(SQLConf.ANSI_ENABLED.key -> "false") {
withSQLConf(SQLConf.DIALECT_SPARK_ANSI_ENABLED.key -> "false") {
val udf = ScalaUDF(
(a: java.math.BigDecimal) => a.multiply(new java.math.BigDecimal(100)),
DecimalType.SYSTEM_DEFAULT,