Fixed broken pyspark shell. #444
Conversation
Merged build triggered.
Merged build started.
Yep, I think the Python shell's documentation should be updated at the same time. sys.version_info only became a named tuple in 2.7. To get this to work in 2.6, it needs to be accessed as a regular tuple: see line 25.
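For reference, a minimal sketch of the difference being discussed (this is illustrative, not the actual shell.py patch):

```python
import sys

# On Python 2.7+ (and 3.x), sys.version_info is a named tuple, so
# attribute access like sys.version_info.major works. On Python 2.6 it
# is a plain tuple, so the fields must be read by position:
# (major, minor, micro, releaselevel, serial).
major, minor = sys.version_info[0], sys.version_info[1]
print("Running on Python %d.%d" % (major, minor))
```

Index access works on every Python version, which is why the fix switches to it.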
Can you submit a PR for that?
Sure. I'll modify it.
Actually never mind, I will do it here. You can't just change that line without changing the indent anyway.
ok pushed
Yes, agree with you.
Merged build triggered.
Merged build started.
Merged build finished.
Refer to this link for build results: https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/14229/
Merged build finished.
Refer to this link for build results: https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/14232/
Jenkins, retest this please.
Merged build triggered.
Merged build started.
Ok I merged this since none of the tests will run this code path, and it is a hot fix. |
Author: Reynold Xin <rxin@apache.org>

Closes #444 from rxin/pyspark and squashes the following commits:

fc11356 [Reynold Xin] Made the PySpark shell version checking compatible with Python 2.6.
571830b [Reynold Xin] Fixed broken pyspark shell.

(cherry picked from commit 81a152c)
Signed-off-by: Reynold Xin <rxin@apache.org>
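A hedged sketch of what a 2.6-compatible version gate looks like in spirit (hypothetical code, not the actual shell.py contents; the message text is invented for illustration):

```python
import sys

# Tuple-index access instead of .major/.minor so the check also runs
# on Python 2.6, where sys.version_info is a plain tuple.
def check_python_version(min_version=(2, 6)):
    """Return True if the interpreter meets the minimum version."""
    return sys.version_info[:2] >= min_version

if not check_python_version():
    sys.stderr.write("Python %d.%d+ is required\n" % (2, 6))
```

The point of the fix is simply that nothing in the hot path touches the named-tuple attributes that 2.6 lacks.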
Merged build finished. All automated tests passed.
All automated tests passed.
Clarify that Python 2.7 is only needed for MLlib
Author: Reynold Xin <rxin@apache.org>

Closes apache#444 from rxin/pyspark and squashes the following commits:

fc11356 [Reynold Xin] Made the PySpark shell version checking compatible with Python 2.6.
571830b [Reynold Xin] Fixed broken pyspark shell.
Clarify that Python 2.7 is only needed for MLlib

(cherry picked from commit 4f0c361)
Signed-off-by: Patrick Wendell <pwendell@gmail.com>
* Use a list of environment variables for JVM options. * Fix merge conflicts.
Fixes issues apache#444 Signed-off-by: Melvin Hillsman <mrhillsman@gmail.com>
…t/txt format plans

### What changes were proposed in this pull request?

This PR adds a download link to the ExecutionPage for SVG/dot/txt format plans.

### Why are the changes needed?

These downloaded assets can improve the UX for sharing/porting to papers, social media, external advanced visualization tools, etc.

### Does this PR introduce _any_ user-facing change?

Yes, UI changes.

### How was this patch tested?

- SVG

```svg
<svg xmlns="http://www.w3.org/2000/svg" viewBox="-16 -16 304.046875 95.53125" width="304.046875" height="95.53125">
<style>
/*
 * Licensed to the Apache Software Foundation (ASF) under one or more
 * contributor license agreements. See the NOTICE file distributed with
 * this work for additional information regarding copyright ownership.
 * The ASF licenses this file to You under the Apache License, Version 2.0
 * (the "License"); you may not use this file except in compliance with
 * the License. You may obtain a copy of the License at
 *
 *    http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */
.label { font-size: 0.85rem; font-weight: normal; text-shadow: none; color: #333; }
svg g.cluster rect { fill: #A0DFFF; stroke: #3EC0FF; stroke-width: 1px; }
svg g.node rect { fill: #C3EBFF; stroke: #3EC0FF; stroke-width: 1px; }
/* Highlight the SparkPlan node name */
svg text :first-child:not(.stageId-and-taskId-metrics) { font-weight: bold; }
svg text { fill: #333; }
svg path { stroke: #444; stroke-width: 1.5px; }
/* Breaks the long string like file path when showing tooltips */
.tooltip-inner { word-wrap: break-word; }
/* Breaks the long job url list when showing Details for Query in SQL */
.job-url { word-wrap: break-word; }
svg g.node rect.selected { fill: #E25A1CFF; stroke: #317EACFF; stroke-width: 2px; }
svg g.node rect.linked { fill: #FFC106FF; stroke: #317EACFF; stroke-width: 2px; }
svg path.linked { fill: #317EACFF; stroke: #317EACFF; stroke-width: 2px; }
</style>
<g><g class="output"><g class="clusters"/><g class="edgePaths"/><g class="edgeLabels"/><g class="nodes"><g class="node" id="node0" transform="translate(136.0234375,31.765625)" style="opacity: 1;" data-original-title="" title=""><rect rx="5" ry="5" x="-136.0234375" y="-31.765625" width="272.046875" height="63.53125" class="label-container"/><g class="label" transform="translate(0,0)"><g transform="translate(-131.0234375,-26.765625)"><foreignObject width="262.046875" height="53.53125"><div xmlns="http://www.w3.org/1999/xhtml" style="display: inline-block; white-space: nowrap;"><br /><b>Execute CreateHiveTableAsSelectCommand</b><br /><br /></div></foreignObject></g></g></g></g></g></g>
</svg>
```

- DOT

```dot
digraph G {
  0 [id="node0" labelType="html" label="<b>Execute InsertIntoHadoopFsRelationCommand</b><br><br>task commit time: 7 ms<br>number of written files: 1<br>job commit time: 24 ms<br>number of output rows: 1<br>number of dynamic part: 0<br>written output: 468.0 B" tooltip="Execute InsertIntoHadoopFsRelationCommand file:/Users/hzyaoqin/spark/spark-warehouse/t, false, Parquet, [parquet.compression=zstd, serialization.format=1, mergeschema=false, __hive_compatible_bucketed_table_insertion__=true], Append, `spark_catalog`.`default`.`t`, org.apache.hadoop.hive.ql.io.parquet.serde.ParquetHiveSerDe, org.apache.spark.sql.execution.datasources.InMemoryFileIndex(file:/Users/hzyaoqin/spark/spark-warehouse/t), [c]"];
  1 [id="node1" labelType="html" label="<br><b>WriteFiles</b><br><br>" tooltip="WriteFiles"];
  subgraph cluster2 {
    isCluster="true";
    id="cluster2";
    label="WholeStageCodegen (1)\n \nduration: 158 ms";
    tooltip="WholeStageCodegen (1)";
    3 [id="node3" labelType="html" label="<br><b>Project</b><br><br>" tooltip="Project [1 AS c#0]"];
    4 [id="node4" labelType="html" label="<b>Scan OneRowRelation</b><br><br>number of output rows: 1" tooltip="Scan OneRowRelation[]"];
  }
  1->0;
  3->1;
  4->3;
}
```

- TXT

  [plan.txt](https://github.com/user-attachments/files/19480587/plan.txt)

### Was this patch authored or co-authored using generative AI tooling?

no

Closes #50427 from yaooqinn/SPARK-51629.

Authored-by: Kent Yao <yao@apache.org>
Signed-off-by: Kent Yao <yao@apache.org>