
Conversation

@navis
Contributor

@navis navis commented Jun 30, 2015

Currently, metaHive in HiveContext shares a single SessionState instance across all execution threads, which causes problems, especially with the "use" command. I ran into this problem myself and spent a long time figuring out what had happened. This is just a first attempt and needs more tests.
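To illustrate the problem (a minimal sketch, not Spark's actual code — all class and field names below are hypothetical): when per-session state such as the current database lives in one shared object, a "use" issued on one thread changes it for every other thread. Keeping the state per thread, e.g. in a ThreadLocal, isolates it:

```java
// Minimal sketch, not Spark code: per-thread session state via ThreadLocal.
// A "USE db" executed on one thread no longer leaks into other threads.
public class SessionStateDemo {
    static class SessionState {
        String currentDatabase = "default";
    }

    // withInitial gives every thread its own independent SessionState
    static final ThreadLocal<SessionState> STATE =
        ThreadLocal.withInitial(SessionState::new);

    public static void main(String[] args) throws InterruptedException {
        // Another thread switches its own session's database
        Thread other = new Thread(() -> STATE.get().currentDatabase = "sales");
        other.start();
        other.join();
        // The main thread's session is unaffected by the other thread's change
        System.out.println(STATE.get().currentDatabase); // prints "default"
    }
}
```

With a single shared SessionState instead, the second thread's assignment would be visible from the main thread too, which is exactly the cross-session "use" interference described above.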

@marmbrus
Contributor

marmbrus commented Jul 6, 2015

ok to test

@SparkQA

SparkQA commented Jul 6, 2015

Test build #36606 has finished for PR 7118 at commit f5a6830.

  • This patch fails Spark unit tests.
  • This patch merges cleanly.
  • This patch adds the following public classes (experimental):
    • class SQLContext(
    • class HiveContext(sc: SparkContext, optionConf: Map[String, String] = Map.empty)

@SparkQA

SparkQA commented Aug 21, 2015

Test build #41336 has finished for PR 7118 at commit 5f3240a.

  • This patch fails Scala style tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@SparkQA

SparkQA commented Aug 21, 2015

Test build #41339 has finished for PR 7118 at commit f1d01d5.

  • This patch fails Scala style tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@SparkQA

SparkQA commented Aug 21, 2015

Test build #41358 has finished for PR 7118 at commit cd9aaf0.

  • This patch fails MiMa tests.
  • This patch merges cleanly.
  • This patch adds the following public classes (experimental):
    • class SQLContext(
    • class HiveContext(sc: SparkContext, optionConf: Map[String, String] = Map.empty)
    • protected[hive] class SQLSession(val sessionID: Int, val config: Map[String, String])
    • class SharedWrapper(
    • class IsolatedWrapper(

@SparkQA

SparkQA commented Aug 21, 2015

Test build #41363 has finished for PR 7118 at commit c707370.

  • This patch fails Spark unit tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@SparkQA

SparkQA commented Aug 24, 2015

Test build #41434 has finished for PR 7118 at commit 21b42cc.

  • This patch fails Spark unit tests.
  • This patch merges cleanly.
  • This patch adds the following public classes (experimental):
    • class SQLContext(
    • class HiveContext(sc: SparkContext, optionConf: Map[String, String] = Map.empty)
    • protected[hive] class SQLSession(val sessionID: Int, val config: Map[String, String])
    • class SharedWrapper(
    • class IsolatedWrapper(

@SparkQA

SparkQA commented Aug 24, 2015

Test build #41436 has finished for PR 7118 at commit 8691a2c.

  • This patch fails Spark unit tests.
  • This patch merges cleanly.
  • This patch adds the following public classes (experimental):
    • class SQLContext(
    • class HiveContext(sc: SparkContext, optionConf: Map[String, String] = Map.empty)
    • protected[hive] class SQLSession(val sessionID: Int, val config: Map[String, String])
    • class SharedWrapper(
    • class IsolatedWrapper(

@SparkQA

SparkQA commented Aug 24, 2015

Test build #41449 has finished for PR 7118 at commit dac944b.

  • This patch fails Spark unit tests.
  • This patch merges cleanly.
  • This patch adds the following public classes (experimental):
    • class SQLContext(
    • class HiveContext(sc: SparkContext, optionConf: Map[String, String] = Map.empty)
    • protected[hive] class SQLSession(val sessionID: Int, val config: Map[String, String])
    • class SharedWrapper(
    • class IsolatedWrapper(

@SparkQA

SparkQA commented Aug 25, 2015

Test build #41506 has finished for PR 7118 at commit f64398b.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds the following public classes (experimental):
    • class SQLContext(
    • class HiveContext(sc: SparkContext, optionConf: Map[String, String] = Map.empty)
    • protected[hive] class SQLSession(val sessionID: Int, val config: Map[String, String])
    • class SharedWrapper(
    • class IsolatedWrapper(

@SparkQA

SparkQA commented Aug 26, 2015

Test build #41578 has finished for PR 7118 at commit b3f7805.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds the following public classes (experimental):
    • class SQLContext(
    • class HiveContext(sc: SparkContext, optionConf: Map[String, String] = Map.empty)
    • protected[hive] class SQLSession(val sessionID: Int, val config: Map[String, String])
    • class SharedWrapper(
    • class IsolatedWrapper(

@SparkQA

SparkQA commented Aug 31, 2015

Test build #41819 timed out for PR 7118 at commit eaeb692 after a configured wait of 250m.

@SparkQA

SparkQA commented Sep 8, 2015

Test build #42124 has finished for PR 7118 at commit a7baede.

  • This patch fails Spark unit tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@navis navis force-pushed the SPARK-8552 branch 2 times, most recently from 28c7649 to 6a16a7e on September 9, 2015 01:06
@SparkQA

SparkQA commented Sep 9, 2015

Test build #42167 has finished for PR 7118 at commit 28c7649.

  • This patch passes all tests.
  • This patch does not merge cleanly.
  • This patch adds the following public classes (experimental):
    • class SQLContext(
    • class HiveContext(sc: SparkContext, optionConf: Map[String, String] = Map.empty)
    • protected[hive] class SQLSession(val sessionID: Int, val config: Map[String, String])
    • class SharedWrapper(
    • class IsolatedWrapper(

@SparkQA

SparkQA commented Sep 9, 2015

Test build #42175 has finished for PR 7118 at commit 6a16a7e.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds the following public classes (experimental):
    • class SQLContext(
    • class HiveContext(sc: SparkContext, optionConf: Map[String, String] = Map.empty)
    • protected[hive] class SQLSession(val sessionID: Int, val config: Map[String, String])
    • class SharedWrapper(
    • class IsolatedWrapper(

@WangTaoTheTonic
Contributor

I tested this patch, but got the following error when executing "show databases;" using beeline:

15/09/10 15:11:02 INFO SessionState: Created HDFS directory: /tmp/hive/root/fc5c8bbe-0e63-49f1-8286-ec51c4432b94/_tmp_space.db
15/09/10 15:11:02 INFO HiveSessionImpl: Operation log session directory is created: /tmp/root/operation_logs/fc5c8bbe-0e63-49f1-8286-ec51c4432b94
15/09/10 15:11:26 ERROR SparkExecuteStatementOperation: Error running hive query as user : root
java.lang.NullPointerException
at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation$$anon$1$$anon$2.run(SparkExecuteStatementOperation.scala:164)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation$$anon$1.run(SparkExecuteStatementOperation.scala:182)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:334)
at java.util.concurrent.FutureTask.run(FutureTask.java:166)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1110)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:603)
at java.lang.Thread.run(Thread.java:722)

@navis
Contributor Author

navis commented Sep 10, 2015

@WangTaoTheTonic I've also seen that after rebasing this onto spark-1.5.0. It seems something went wrong in the rebase process. I'll update soon.

@SparkQA

SparkQA commented Sep 11, 2015

Test build #42312 has finished for PR 7118 at commit 7eba459.

  • This patch fails Spark unit tests.
  • This patch merges cleanly.
  • This patch adds the following public classes (experimental):
    • class ShuffleDependency[K: ClassTag, V: ClassTag, C: ClassTag](
    • class CoGroupedRDD[K: ClassTag](
    • class ShuffledRDD[K: ClassTag, V: ClassTag, C: ClassTag](
    • class StringIndexer(JavaEstimator, HasInputCol, HasOutputCol, HasHandleInvalid):
    • class HasHandleInvalid(Params):
    • class SQLContext(
    • class HiveContext(sc: SparkContext, optionConf: Map[String, String] = Map.empty)
    • protected[hive] class SQLSession(val sessionID: Int, val config: Map[String, String])
    • class SharedWrapper(
    • class IsolatedWrapper(

@WangTaoTheTonic
Contributor

@navis Thank you for the fix. I have tested "use $database" on a local Thrift Server and it works correctly.

There still seems to be a failing test that needs fixing.

Contributor

The comments need to be updated to match its constructor args.

@navis
Contributor Author

navis commented Sep 14, 2015

@WangTaoTheTonic I've fixed the test itself, which seemed invalid: "add jar" should apply only to the session that invoked it, but the test expected the added jar to be accessible from other sessions.
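A minimal sketch of the semantics the fixed test now expects (illustrative only — the class, methods, and jar path below are hypothetical, not Spark's actual implementation): resources registered via "add jar" are tracked per session, so a jar added in one session is not visible from another.

```java
import java.util.Collections;
import java.util.HashMap;
import java.util.HashSet;
import java.util.Map;
import java.util.Set;

// Illustrative only: per-session resource tracking for "ADD JAR".
public class SessionResources {
    private final Map<Integer, Set<String>> jarsBySession = new HashMap<>();

    // Called when a session executes "ADD JAR <path>"
    public void addJar(int sessionId, String path) {
        jarsBySession.computeIfAbsent(sessionId, k -> new HashSet<>()).add(path);
    }

    // A jar is visible only to the session that added it
    public boolean isVisible(int sessionId, String path) {
        return jarsBySession.getOrDefault(sessionId, Collections.emptySet())
                            .contains(path);
    }

    public static void main(String[] args) {
        SessionResources resources = new SessionResources();
        resources.addJar(1, "/tmp/example-udf.jar");  // session 1 runs ADD JAR
        System.out.println(resources.isVisible(1, "/tmp/example-udf.jar")); // true
        System.out.println(resources.isVisible(2, "/tmp/example-udf.jar")); // false
    }
}
```

Under a single shared resource list, the second lookup would return true as well, which is the behavior the original (invalid) test asserted.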

@SparkQA

SparkQA commented Sep 14, 2015

Test build #42388 has finished for PR 7118 at commit 03d4370.

  • This patch fails Spark unit tests.
  • This patch merges cleanly.
  • This patch adds the following public classes (experimental):
    • class SQLContext(
    • class HiveContext(sc: SparkContext, optionConf: Map[String, String] = Map.empty)
    • protected[hive] class SQLSession(val sessionID: Int, val config: Map[String, String])
    • class SharedWrapper(
    • class IsolatedWrapper(

@SparkQA

SparkQA commented Sep 14, 2015

Test build #42387 has finished for PR 7118 at commit a72d490.

  • This patch fails Spark unit tests.
  • This patch merges cleanly.
  • This patch adds the following public classes (experimental):
    • class SQLContext(
    • class HiveContext(sc: SparkContext, optionConf: Map[String, String] = Map.empty)
    • protected[hive] class SQLSession(val sessionID: Int, val config: Map[String, String])
    • class SharedWrapper(
    • class IsolatedWrapper(

@SparkQA

SparkQA commented Sep 14, 2015

Test build #42418 has finished for PR 7118 at commit 431b0d1.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds the following public classes (experimental):
    • class SQLContext(
    • class HiveContext(sc: SparkContext, optionConf: Map[String, String] = Map.empty)
    • protected[hive] class SQLSession(val sessionID: Int, val config: Map[String, String])
    • class SharedWrapper(
    • class IsolatedWrapper(

@WangTaoTheTonic
Contributor

@marmbrus What do you think of this fix? As the issue priority is very high, I think we had better fix it ASAP.

@marmbrus
Contributor

I think I prefer the solution taken in #8909, which limits the use of thread-local variables.

@davies
Contributor

davies commented Oct 9, 2015

Since #8909 is merged, would you mind closing this PR?

@navis navis closed this Oct 12, 2015
@pzzs
Contributor

pzzs commented Nov 7, 2015

@navis
When I use spark-sql and run SQL like this:

add jar /home/udf-0.0.1-SNAPSHOT.jar;
create temporary function arr_greater_equal as 'com.xx.yy.dac.hive.udf.UDFArrayGreaterEqual';

I got an error:
15/11/07 16:01:15 ERROR CliDriver: org.apache.spark.sql.execution.QueryExecutionException: FAILED: SemanticException [Error 10072]: Database does not exist: default
at org.apache.spark.sql.hive.client.ClientWrapper$$anonfun$runHive$1.apply(ClientWrapper.scala:349)
at org.apache.spark.sql.hive.client.ClientWrapper$$anonfun$runHive$1.apply(ClientWrapper.scala:326)
at org.apache.spark.sql.hive.client.SharedWrapper.withHiveState(ClientWrapper.scala:469)
at org.apache.spark.sql.hive.HiveContext$$anon$2.withHiveState(HiveContext.scala:194)
at org.apache.spark.sql.hive.client.ClientWrapper.withHiveState(ClientWrapper.scala:138)
at org.apache.spark.sql.hive.client.ClientWrapper.runHive(ClientWrapper.scala:326)
at org.apache.spark.sql.hive.client.ClientWrapper.runSqlHive(ClientWrapper.scala:316)
at org.apache.spark.sql.hive.HiveContext.runSqlHive(HiveContext.scala:542)
at org.apache.spark.sql.hive.execution.HiveNativeCommand.run(HiveNativeCommand.scala:33)
at org.apache.spark.sql.execution.ExecutedCommand.sideEffectResult$lzycompute(commands.scala:57)

