
[SPARK-11672] [ML] Set active SQLContext in MLlibTestSparkContext.beforeAll #9694


Closed
wants to merge 2 commits

Conversation


@mengxr mengxr commented Nov 13, 2015

Still saw some error messages caused by `SQLContext.getOrCreate`:

https://amplab.cs.berkeley.edu/jenkins/job/Spark-Master-SBT/3997/AMPLAB_JENKINS_BUILD_PROFILE=hadoop2.3,label=spark-test/testReport/junit/org.apache.spark.ml.util/JavaDefaultReadWriteSuite/testDefaultReadWrite/

This PR sets the active SQLContext in `beforeAll`, since `new SQLContext` does not automatically register itself as the active context. This makes `SQLContext.getOrCreate` return the right SQLContext.
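A minimal sketch of the fix described above, against the Spark 1.x API. The trait shape and the `SQLContext.setActive` call are assumptions based on the PR description, not the exact committed code, and assume `setActive` is accessible from the test package:

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext

trait MLlibTestSparkContext {
  @transient var sc: SparkContext = _
  @transient var sqlContext: SQLContext = _

  def beforeAll(): Unit = {
    val conf = new SparkConf().setMaster("local[2]").setAppName("MLlibUnitTest")
    sc = new SparkContext(conf)
    sqlContext = new SQLContext(sc)
    // `new SQLContext` does not register itself as the active context,
    // so register it explicitly; otherwise SQLContext.getOrCreate may
    // create or return a different instance in later test code.
    SQLContext.setActive(sqlContext)
  }
}
```

With the active context registered up front, any code path in the tests that falls back to `SQLContext.getOrCreate` resolves to the suite's own context instead of creating a fresh one.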

cc: @yhuai

@@ -48,8 +48,11 @@ private[util] sealed trait BaseReadWrite {
/**
* Returns the user-specified SQL context or the default.
*/
protected final def sqlContext: SQLContext = optionSQLContext.getOrElse {
SQLContext.getOrCreate(SparkContext.getOrCreate())
Contributor:
Seems we still need to use getOrCreate? Otherwise, we will keep creating new SQLContexts. Also, if we actually use HiveContext, without getOrCreate we will not be able to get it.

Contributor Author:
This is just a workaround for the issue. Users can call `.context(hiveContext)` to specify a context to use.
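The diff under discussion centers on this fallback. A self-contained sketch of the pattern (the field name and `context` setter are assumptions inferred from the snippet above, not verbatim Spark source): a reader/writer keeps an optional user-supplied SQLContext, and only falls back to `SQLContext.getOrCreate` when none was given.

```scala
import org.apache.spark.SparkContext
import org.apache.spark.sql.SQLContext

private[util] sealed trait BaseReadWrite {
  private var optionSQLContext: Option[SQLContext] = None

  /** Lets the user supply an explicit context, e.g. a HiveContext. */
  def context(sqlContext: SQLContext): this.type = {
    optionSQLContext = Some(sqlContext)
    this
  }

  /**
   * Returns the user-specified SQL context or the default.
   * The fallback relies on the active context being set correctly,
   * which is what this PR fixes for the test suites.
   */
  protected final def sqlContext: SQLContext = optionSQLContext.getOrElse {
    SQLContext.getOrCreate(SparkContext.getOrCreate())
  }
}
```

This keeps `getOrCreate` as the default path (addressing the concern about repeatedly creating contexts), while `.context(...)` covers the HiveContext case explicitly.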

@SparkQA

SparkQA commented Nov 13, 2015

Test build #45868 has finished for PR 9694 at commit 2b68807.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds the following public classes (experimental):
    • public class JavaMultilayerPerceptronClassifierExample
    • public class JavaGradientBoostingClassificationExample
    • public class JavaGradientBoostingRegressionExample
    • public class JavaRandomForestClassificationExample
    • public class JavaRandomForestRegressionExample

@mengxr mengxr changed the title [SPARK-11672] [ML] do not use SQLContext.getOrCreate [SPARK-11672] [ML] Set active SQLContext in MLlibTestSparkContext.beforeAll Nov 13, 2015
@SparkQA

SparkQA commented Nov 13, 2015

Test build #45887 has finished for PR 9694 at commit 1a2a866.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@yhuai
Contributor

yhuai commented Nov 13, 2015

LGTM

asfgit pushed a commit that referenced this pull request Nov 13, 2015
[SPARK-11672] [ML] Set active SQLContext in MLlibTestSparkContext.beforeAll

Still saw some error messages caused by `SQLContext.getOrCreate`:

https://amplab.cs.berkeley.edu/jenkins/job/Spark-Master-SBT/3997/AMPLAB_JENKINS_BUILD_PROFILE=hadoop2.3,label=spark-test/testReport/junit/org.apache.spark.ml.util/JavaDefaultReadWriteSuite/testDefaultReadWrite/

This PR sets the active SQLContext in beforeAll, which is not automatically set in `new SQLContext`. This makes `SQLContext.getOrCreate` return the right SQLContext.

cc: yhuai

Author: Xiangrui Meng <meng@databricks.com>

Closes #9694 from mengxr/SPARK-11672.3.

(cherry picked from commit 2d2411f)
Signed-off-by: Xiangrui Meng <meng@databricks.com>
@asfgit asfgit closed this in 2d2411f Nov 13, 2015
asfgit pushed a commit that referenced this pull request Nov 15, 2015
The same as #9694, but for Java test suite. yhuai

Author: Xiangrui Meng <meng@databricks.com>

Closes #9719 from mengxr/SPARK-11672.4.

(cherry picked from commit 64e5551)
Signed-off-by: Yin Huai <yhuai@databricks.com>