Commit 8c0f198

Updates for security patches and Python 3.7.9
1 parent 1767f7b commit 8c0f198

8 files changed (+31, -41 lines)


Installed-Libraries.md

Lines changed: 6 additions & 6 deletions
@@ -1,8 +1,8 @@
---

copyright:
-  years: 2017, 2019
-lastupdated: "2019-10-07"
+  years: 2017, 2020
+lastupdated: "2020-10-19"

subcollection: AnalyticsEngine

@@ -18,12 +18,12 @@ subcollection: AnalyticsEngine
# Spark kernels and libraries on the cluster
{: #installed-libs}

-The {{site.data.keyword.iae_full_notm}} comes with a set of libraries. These libraries are pre-installed on each of the cluster nodes and are available by default on the kernels. The table below lists the locations of these libraries:
+The {{site.data.keyword.iae_full_notm}} comes with a set of libraries. These libraries are pre-installed on each of the cluster nodes and the libraries are available by default on the kernels. The table below lists the locations of these libraries:

|AE version| Environment | Kernel | Libraries |
|-------------|--------|-----------|------------|
-|AE 1.2| Python 3.7 |Python 3.7 with Spark 2.3.2 |Python libraries packaged with Anaconda3-2018.12 at /home/common/conda/anaconda3/ |
-|AE 1.2| Scala 2.11|Scala 2.11 with Spark 2.3.2 |Scala/Java libraries (Scala 2.11 and Java 1.8) under /home/common/lib/scala/spark2 |
+|AE 1.2| Python 3.7 |Python 3.7 with Spark 2.3.2 |Python libraries packaged with Miniconda3.7 at `/home/common/conda/miniconda3.7` |
+|AE 1.2| Scala 2.11|Scala 2.11 with Spark 2.3.2 |Scala/Java libraries (Scala 2.11 and Java 1.8) under `/home/common/lib/scala/spark2` |

For installed Spark connectors, see the [documentation](/docs/AnalyticsEngine?topic=AnalyticsEngine-spark-connectors).


@@ -38,7 +38,7 @@ To see the list of installed packages, execute the following command from a Pyth
Alternately, SSH to the cluster and run this command:

```python
-/home/common/conda/anaconda3/bin/pip list
+/home/common/conda/miniconda3.7/bin/pip list
```

## R
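
Because the Miniconda3.7 distribution ships fewer preinstalled packages than the earlier Anaconda3 one, it can be quicker to query a single package than to scan the whole listing. A small sketch; the package names here are purely illustrative:

```
# Show version and install location for one package, if it is present
/home/common/conda/miniconda3.7/bin/pip show numpy

# Or filter the full listing for a name fragment (pandas is just an example)
/home/common/conda/miniconda3.7/bin/pip list | grep -i pandas
```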

SSH-connection.md

Lines changed: 3 additions & 14 deletions
@@ -1,8 +1,8 @@
---

copyright:
-  years: 2017, 2019
-lastupdated: "2018-09-26"
+  years: 2017, 2020
+lastupdated: "2020-10-19"

subcollection: AnalyticsEngine


@@ -36,23 +36,12 @@ $ ssh clsadmin@iae-tmp-867-mn003.us-south.ae.appdomain.cloud
/usr/hdp/current/spark2-client/jars/spark-examples.jar
```

-## Running spark-submit with Anaconda Python 2
-
-To run spark-submit with Anaconda Python 2, enter:
-
-```
-PYSPARK_PYTHON=/home/common/conda/anaconda2/bin/python spark-submit \
---master yarn \
---deploy-mode cluster \
-/usr/hdp/current/spark2-client/examples/src/main/python/pi.py
-```
-
## Running spark-submit with Anaconda Python 3

To run spark-submit with Anaconda Python 3, enter:

```
-PYSPARK_PYTHON=/home/common/conda/anaconda3/bin/python spark-submit \
+PYSPARK_PYTHON=/home/common/conda/miniconda3.7/bin/python spark-submit \
--master yarn \
--deploy-mode cluster \
/usr/hdp/current/spark2-client/examples/src/main/python/pi.py
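
In `--deploy-mode cluster` the driver runs inside YARN, so it can help to pin the interpreter for the application master as well, via Spark's `spark.yarn.appMasterEnv.*` mechanism. A sketch, assuming the Miniconda path exists on every cluster node; this is redundant in many setups, but makes the interpreter choice unambiguous:

```
PYSPARK_PYTHON=/home/common/conda/miniconda3.7/bin/python spark-submit \
--master yarn \
--deploy-mode cluster \
--conf spark.yarn.appMasterEnv.PYSPARK_PYTHON=/home/common/conda/miniconda3.7/bin/python \
/usr/hdp/current/spark2-client/examples/src/main/python/pi.py
```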

best-practices.md

Lines changed: 3 additions & 3 deletions
@@ -2,7 +2,7 @@

copyright:
  years: 2017, 2020
-lastupdated: "2020-01-09"
+lastupdated: "2020-10-19"

subcollection: AnalyticsEngine

@@ -140,8 +140,8 @@ The software packages on `AE 1.2` clusters include components for Horton Datawor

| AE 1.2 clusters | Based on HDP 3.1 |
|-----------------|-----------------------------|
-| `AE 1.2 Hive LLAP` <br>Choose if you are planning to run Hive in interactive mode, with preconfigured settings for Hive LLAP for faster responses. | Hadoop, Livy, Knox, Ambari, Anaconda-Py, Hive (LLAP mode) |
-| `AE 1.2 Spark and Hive` <br>Choose if you are planning to run Hive and/or Spark workloads. | Hadoop, Livy, Knox, Spark, JEG, Ambari, Anaconda Py, Hive (non LLAP mode ) |
+| `AE 1.2 Hive LLAP` <br>Choose if you are planning to run Hive in interactive mode, with preconfigured settings for Hive LLAP for faster responses. | Hadoop, Livy, Knox, Ambari, Conda-Py, Hive (LLAP mode) |
+| `AE 1.2 Spark and Hive` <br>Choose if you are planning to run Hive and/or Spark workloads. | Hadoop, Livy, Knox, Spark, JEG, Ambari, Conda-Py, Hive (non-LLAP mode) |
| `AE 1.2 Spark and Hadoop`<br>Choose if you are planning to run Hadoop workloads in addition to Spark workloads. | (AE 1.2 Spark and Hive) + HBase, Phoenix, Oozie |

**Note:** Currently you cannot resize a cluster that uses the `AE 1.2 Hive LLAP` software package.

example-of-customizations.md

Lines changed: 2 additions & 2 deletions
@@ -2,7 +2,7 @@

copyright:
  years: 2017, 2020
-lastupdated: "2020-06-23"
+lastupdated: "2020-10-19"

subcollection: AnalyticsEngine


@@ -147,7 +147,7 @@ For more information, see [Installing additional libraries](/docs/services/Analy

The Anaconda3 environment on `AE 1.2` clusters comes with Python 3.7.

-To install Python 3.x libraries, your script must install to the `/home/common/conda/anaconda3` environment by using:
+To install Python 3.x libraries, your script must install to the `/home/common/conda/miniconda3.7` environment by using:
```
pip install <package-name>
```
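
For context, a minimal sketch of such a customization-script step, assuming the Miniconda `pip` is addressed by its absolute path; the package names are illustrative only:

```
#!/bin/bash
# Customization-script sketch: install extra Python 3 libraries
# into the Miniconda3.7 environment that the kernels use
/home/common/conda/miniconda3.7/bin/pip install pandas requests
```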

index.md

Lines changed: 4 additions & 4 deletions
@@ -2,7 +2,7 @@

copyright:
  years: 2017, 2020
-lastupdated: "2020-05-12"
+lastupdated: "2020-10-19"

subcollection: AnalyticsEngine


@@ -105,8 +105,8 @@ The following software packages are available when you create a cluster based on

| AE 1.2 | Based on HDP 3.1 |
|-----------------|-----------------------------|
-| `AE 1.2 Hive LLAP` | Hadoop, Livy, Knox, Ambari, <br>Anaconda-Py, Hive (LLAP mode) |
-| `AE 1.2 Spark and Hive` | Hadoop, Livy, Knox, Spark, JEG, Ambari, <br>Anaconda Py, Hive (non LLAP mode ) |
+| `AE 1.2 Hive LLAP` | Hadoop, Livy, Knox, Ambari, <br>Conda-Py, Hive (LLAP mode) |
+| `AE 1.2 Spark and Hive` | Hadoop, Livy, Knox, Spark, JEG, Ambari, <br>Conda-Py, Hive (non-LLAP mode) |
| `AE 1.2 Spark and Hadoop` | (AE 1.2 Spark and Hive) + HBase, Phoenix, <br>Oozie |

**Important:**
@@ -124,7 +124,7 @@ You can create a cluster based on Hortonworks Data Platform 3.1. The following s
| Apache Livy 0.5|
| Knox 1.0.0|
| Ambari 2.7.3|
-| Anaconda with Python 3.7.1 |
+| Miniconda with Python 3.7.9 |
| Jupyter Enterprise Gateway 0.8.0
| HBase 2.1.6 |
| Hive 3.1.0 |

release-notes.md

Lines changed: 8 additions & 1 deletion
@@ -2,7 +2,7 @@

copyright:
  years: 2017, 2020
-lastupdated: "2020-09-30"
+lastupdated: "2020-10-19"

subcollection: AnalyticsEngine


@@ -23,6 +23,13 @@ Use these notes to learn about the latest features, additions and changes to {{s
{: shortdesc}
## {{site.data.keyword.iae_full_notm}} information

+### 17 October 2020
+
+- **[AE-1.2.v29]** - The following security fixes were applied:
+[CVE-2020-26116](https://exchange.xforce.ibmcloud.com/vulnerabilities/189404), [CVE-2019-20907](https://exchange.xforce.ibmcloud.com/vulnerabilities/185442), and [CVE-2020-14422](https://exchange.xforce.ibmcloud.com/vulnerabilities/184320)
+
+To apply these fixes, the cluster Python runtime was upgraded from the Python 3.7.1 Anaconda3 distribution (conda 4.5.12) to the Python 3.7.9 Miniconda3.7 distribution (conda 4.8.5). Note that Python packages that were bundled with the earlier Anaconda distribution are no longer available out of the box on newly created clusters. If you need any of those packages, install them explicitly by using the `pip install` command. Remember to follow the recommendations for creating and deleting clusters as described in [Best practices](/docs/AnalyticsEngine?topic=AnalyticsEngine-best-practices).
+
### 1 October 2020

- A fix was added to alleviate cluster instability issues caused by an error in an underlying Docker runtime. If you created a cluster between 23 July 2020 and 01 October 2020, you might have experienced intermittent instability that manifested as connectivity issues or down times in nodes. With today's deployment, the runtime version has been replaced by an earlier stable runtime version.
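
For anyone affected by the package trimming described above, restoring formerly bundled libraries is a per-package reinstall. A sketch with an illustrative package list; check your own jobs and notebooks for what they actually import:

```
# Reinstall packages that Anaconda bundled but Miniconda does not; names are examples only
/home/common/conda/miniconda3.7/bin/pip install numpy pandas scikit-learn
```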

spark-interactive-notebooks-api.md

Lines changed: 2 additions & 8 deletions
@@ -47,17 +47,11 @@ $ ssh clsadmin@iae-tmp-867-mn003.<changeme>.ae.appdomain.cloud

`<changeme>` is the {{site.data.keyword.Bluemix_short}} hosting location, for example `us-south`.

-2. You can start Python 2, Python 3, Scala, and R interactive shells on the cluster as follows:
+2. You can start Python 3, Scala, and R interactive shells on the cluster as follows:

-* Run Spark applications interactively with Python 2:
-```
-PYSPARK_PYTHON=/home/common/conda/anaconda2/bin/python pyspark \
---master yarn \
---deploy-mode client
-```
* Run Spark applications interactively with Python 3:
```
-PYSPARK_PYTHON=/home/common/conda/anaconda3/bin/python pyspark \
+PYSPARK_PYTHON=/home/common/conda/miniconda3.7/bin/python pyspark \
--master yarn \
--deploy-mode client
```
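
A quick sanity check before launching the shell is to confirm that the Miniconda interpreter is the one you expect; after this change it should report Python 3.7.9:

```
/home/common/conda/miniconda3.7/bin/python --version
```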

working-with-spark-sql.md

Lines changed: 3 additions & 3 deletions
@@ -1,8 +1,8 @@
---

copyright:
-  years: 2017, 2019
-lastupdated: "2019-07-24"
+  years: 2017, 2020
+lastupdated: "2020-10-19"

subcollection: AnalyticsEngine


@@ -93,7 +93,7 @@ To run Spark SQL with Python 3:

1. Launch the PySpark shell:
```
-PYSPARK_PYTHON=/home/common/conda/anaconda3/bin/python pyspark \
+PYSPARK_PYTHON=/home/common/conda/miniconda3.7/bin/python pyspark \
--master yarn \
--deploy-mode client
```
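
As a complement to the interactive shell, the same flow can run as a one-shot job. A sketch, assuming `/tmp` is writable on the node; the script, view, and query are illustrative:

```
# Write a tiny Spark SQL script, then submit it with the Miniconda interpreter
cat > /tmp/sql_demo.py <<'EOF'
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("sql-demo").getOrCreate()
spark.range(10).createOrReplaceTempView("demo")   # temp view with ids 0-9
spark.sql("SELECT id FROM demo WHERE id > 5").show()
spark.stop()
EOF

PYSPARK_PYTHON=/home/common/conda/miniconda3.7/bin/python spark-submit \
--master yarn \
--deploy-mode client \
/tmp/sql_demo.py
```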
