Merge
EmergentOrder committed Jun 17, 2019
2 parents 0faf78a + 93f5aa6 commit 92cab7a
Showing 199 changed files with 2,598 additions and 1,420 deletions.
11 changes: 4 additions & 7 deletions .travis.yml
@@ -47,9 +47,6 @@ env:
matrix:
- BUILD_TYPE=Unit
METADATA_REP=PGSQL EVENTDATA_REP=PGSQL MODELDATA_REP=PGSQL
- BUILD_TYPE=Integration
METADATA_REP=ELASTICSEARCH EVENTDATA_REP=PGSQL MODELDATA_REP=S3
PIO_ELASTICSEARCH_VERSION=1.7.3
- BUILD_TYPE=Integration
METADATA_REP=ELASTICSEARCH EVENTDATA_REP=ELASTICSEARCH MODELDATA_REP=S3
PIO_ELASTICSEARCH_VERSION=5.6.9
@@ -73,12 +70,12 @@ env:
- BUILD_TYPE=Integration
METADATA_REP=PGSQL EVENTDATA_REP=PGSQL MODELDATA_REP=PGSQL
PIO_SCALA_VERSION=2.11.12
PIO_SPARK_VERSION=2.2.2
PIO_SPARK_VERSION=2.2.3
PIO_HADOOP_VERSION=2.6.5
- BUILD_TYPE=Integration
METADATA_REP=PGSQL EVENTDATA_REP=PGSQL MODELDATA_REP=HDFS
PIO_SCALA_VERSION=2.11.12
PIO_SPARK_VERSION=2.3.2
PIO_SPARK_VERSION=2.3.3
PIO_HADOOP_VERSION=2.6.5

- BUILD_TYPE=Integration
@@ -94,12 +91,12 @@ env:
- BUILD_TYPE=Integration
METADATA_REP=PGSQL EVENTDATA_REP=PGSQL MODELDATA_REP=PGSQL
PIO_SCALA_VERSION=2.11.12
PIO_SPARK_VERSION=2.2.2
PIO_SPARK_VERSION=2.2.3
PIO_HADOOP_VERSION=2.7.7
- BUILD_TYPE=Integration
METADATA_REP=PGSQL EVENTDATA_REP=PGSQL MODELDATA_REP=HDFS
PIO_SCALA_VERSION=2.11.12
PIO_SPARK_VERSION=2.3.2
PIO_SPARK_VERSION=2.4.0
PIO_HADOOP_VERSION=2.7.7

- BUILD_TYPE=LicenseCheck
69 changes: 35 additions & 34 deletions PMC.md
@@ -27,40 +27,40 @@ http://apache.org/dev/openpgp.html#generate-key on how to generate a strong code
signing key.
2. Add your public key to the `KEYS` file at the root of the source code tree.
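   A common way to carry out this step, assuming `YOUR_KEY_ID` is a placeholder for your own signing key ID (the exact command is a convention, not mandated by this checklist):

   ```
   # Append your public key block, with signatures, to the KEYS file at the
   # repository root. YOUR_KEY_ID is a placeholder.
   (gpg --list-sigs YOUR_KEY_ID && gpg --armor --export YOUR_KEY_ID) >> KEYS
   ```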
3. Create a new release branch, with version bumped to the next release version.
* `git checkout -b release/0.14.0`
* Replace all `0.14.0-SNAPSHOT` in the code tree to `0.14.0`
* `git commit -am "Prepare 0.14.0-rc1"`
* `git tag -am "Apache PredictionIO 0.14.0-rc1" v0.14.0-rc1`
* `git checkout -b release/0.15.0`
* Replace all `0.15.0-SNAPSHOT` in the code tree to `0.15.0`
* `git commit -am "Prepare 0.15.0-rc1"`
* `git tag -am "Apache PredictionIO 0.15.0-rc1" v0.15.0-rc1`
4. Push the release branch and tag to the apache git repo.
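   A minimal sketch of this step, assuming the Apache repository is configured as the `origin` remote (adjust the remote name to your checkout):

   ```
   # Push the 0.15.0 release branch and the RC tag created above.
   git push origin release/0.15.0
   git push origin v0.15.0-rc1
   ```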
5. Wait for Travis to pass build on the release branch.
6. Package a clean tarball for staging a release candidate.
* `git archive --format tar v0.14.0-rc1 >
../apache-predictionio-0.14.0-rc1.tar`
* `cd ..; gzip apache-predictionio-0.14.0-rc1.tar`
* `git archive --format tar v0.15.0-rc1 >
../apache-predictionio-0.15.0-rc1.tar`
* `cd ..; gzip apache-predictionio-0.15.0-rc1.tar`
7. Generate detached signature for the release candidate.
(http://apache.org/dev/release-signing.html#openpgp-ascii-detach-sig)
* `gpg --armor --output apache-predictionio-0.14.0-rc1.tar.gz.asc
--detach-sig apache-predictionio-0.14.0-rc1.tar.gz`
* `gpg --armor --output apache-predictionio-0.15.0-rc1.tar.gz.asc
--detach-sig apache-predictionio-0.15.0-rc1.tar.gz`
8. Generate SHA512 checksums for the release candidate.
* `gpg --print-md SHA512 apache-predictionio-0.14.0-rc1.tar.gz >
apache-predictionio-0.14.0-rc1.tar.gz.sha512`
* `gpg --print-md SHA512 apache-predictionio-0.15.0-rc1.tar.gz >
apache-predictionio-0.15.0-rc1.tar.gz.sha512`
9. Run `./make-distribution.sh` and repeat steps 6 to 8 to create binary distribution release.
* `mv PredictionIO-0.14.0.tar.gz apache-predictionio-0.14.0-bin.tar.gz`
* `gpg --armor --output apache-predictionio-0.14.0-bin.tar.gz.asc
--detach-sig apache-predictionio-0.14.0-bin.tar.gz`
* `gpg --print-md SHA512 apache-predictionio-0.14.0-bin.tar.gz >
apache-predictionio-0.14.0-bin.tar.gz.sha512`
* `mv PredictionIO-0.15.0.tar.gz apache-predictionio-0.15.0-bin.tar.gz`
* `gpg --armor --output apache-predictionio-0.15.0-bin.tar.gz.asc
--detach-sig apache-predictionio-0.15.0-bin.tar.gz`
* `gpg --print-md SHA512 apache-predictionio-0.15.0-bin.tar.gz >
apache-predictionio-0.15.0-bin.tar.gz.sha512`
10. If you have not done so, use SVN to checkout
https://dist.apache.org/repos/dist/dev/predictionio. This is the area
for staging release candidates for voting.
* `svn co https://dist.apache.org/repos/dist/dev/predictionio`
11. Create a subdirectory at the SVN staging area. The area should have a `KEYS` file.
* `mkdir apache-predictionio-0.14.0-rc1`
* `cp apache-predictionio-0.14.0-* apache-predictionio-0.14.0-rc1`
* `mkdir apache-predictionio-0.15.0-rc1`
* `cp apache-predictionio-0.15.0-* apache-predictionio-0.15.0-rc1`
12. If you have updated the `KEYS` file, also copy that to the staging area.
13. `svn commit -m "Apache PredictionIO 0.14.0-rc1"`
13. `svn commit -m "Apache PredictionIO 0.15.0-rc1"`
14. Set up credentials with Apache Nexus using the SBT Sonatype plugin. Put this
in `~/.sbt/0.13/sonatype.sbt`.
in `~/.sbt/1.0/sonatype.sbt`.

```
publishTo := {
@@ -78,26 +78,27 @@ Close the staged repository on Apache Nexus.
16. Send out email for voting on PredictionIO dev mailing list.

```
Subject: [VOTE] Apache PredictionIO 0.14.0 Release (RC1)
Subject: [VOTE] Apache PredictionIO 0.15.0 Release (RC1)
This is the vote for 0.14.0 of Apache PredictionIO.
This is the vote for 0.15.0 of Apache PredictionIO.
The vote will run for at least 72 hours and will close on Apr 7th, 2017.
The release candidate artifacts can be downloaded here: https://dist.apache.org/repos/dist/dev/predictionio/apache-predictionio-0.14.0-rc1/
The release candidate artifacts can be downloaded here: https://dist.apache.org/repos/dist/dev/predictionio/apache-predictionio-0.15.0-rc1/
Test results of RC1 can be found here: https://travis-ci.org/apache/predictionio/builds/xxx
Maven artifacts are built from the release candidate artifacts above, and are provided as convenience for testing with engine templates. The Maven artifacts are provided at the Maven staging repo here: https://repository.apache.org/content/repositories/orgapachepredictionio-nnnn/
All JIRAs completed for this release are tagged with 'FixVersion = 0.14.0'. You can view them here: https://issues.apache.org/jira/secure/ReleaseNote.jspa?projectId=12320420&version=12337844
All JIRAs completed for this release are tagged with 'FixVersion = 0.15.0'. You can view them here: https://issues.apache.org/jira/secure/ReleaseNote.jspa?projectId=12320420&version=12337844
The artifacts have been signed with Key : YOUR_KEY_ID
Please vote accordingly:
[ ] +1, accept RC as the official 0.14.0 release
[ ] -1, do not accept RC as the official 0.14.0 release because...
[ ] +1, accept RC as the official 0.15.0 release
[ ] -1, do not accept RC as the official 0.15.0 release because...
```
17. After the vote has been accepted, update `RELEASE.md`.
18. Create a release tag
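    One plausible form for this step, mirroring the RC tag commands earlier in this file and assuming `origin` points at the Apache repository:

    ```
    # Tag the voted commit as the final release and push the tag.
    git tag -am "Apache PredictionIO 0.15.0" v0.15.0
    git push origin v0.15.0
    ```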
@@ -106,36 +107,36 @@ Close the staged repository on Apache Nexus.
https://dist.apache.org/repos/dist/release/predictionio/. This is the area
for staging actual releases.
21. Create a subdirectory at the SVN staging area. The area should have a `KEYS` file.
* `mkdir 0.14.0`
* `mkdir 0.15.0`
* Copy the binary distribution from the dev/ tree to the release/ tree
* Copy the official release to the release/ tree
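    As a hedged illustration of this step (the directory names `dev` and `release` are assumptions for the two SVN checkouts):

    ```
    # Copy the voted artifacts from the dev staging checkout into the release tree.
    svn co https://dist.apache.org/repos/dist/release/predictionio release
    mkdir release/0.15.0
    cp dev/apache-predictionio-0.15.0-rc1/apache-predictionio-0.15.0-* release/0.15.0/
    ```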
22. If you have updated the `KEYS` file, also copy that to the staging area.
23. Remove old releases from the ASF distribution mirrors.
(https://www.apache.org/dev/mirrors.html#location)
* `svn delete 0.13.0`
24. `svn commit -m "Apache PredictionIO 0.14.0"`
* `svn delete 0.14.0`
24. `svn commit -m "Apache PredictionIO 0.15.0"`
25. Document breaking changes in https://predictionio.apache.org/resources/upgrade/.
26. Mark the version as released on JIRA.
(https://issues.apache.org/jira/projects/PIO?selectedItem=com.atlassian.jira.jira-projects-plugin%3Arelease-page&status=no-filter)
27. Send out an email to the following mailing lists: announce, user, dev.

```
Subject: [ANNOUNCE] Apache PredictionIO 0.14.0 Release
Subject: [ANNOUNCE] Apache PredictionIO 0.15.0 Release
The Apache PredictionIO team would like to announce the release of Apache PredictionIO 0.14.0.
The Apache PredictionIO team would like to announce the release of Apache PredictionIO 0.15.0.
Release notes are here:
https://github.com/apache/predictionio/blob/release/0.14.0/RELEASE.md
https://github.com/apache/predictionio/blob/v0.15.0/RELEASE.md
Apache PredictionIO is an open source Machine Learning Server built on top of state-of-the-art open source stack, that enables developers to manage and deploy production-ready predictive services for various kinds of machine learning tasks.
More details regarding Apache PredictionIO can be found here:
https://predictionio.apache.org/
The release artifacts can be downloaded here:
https://www.apache.org/dyn/closer.lua/predictionio/0.14.0/apache-predictionio-0.14.0-bin.tar.gz
https://www.apache.org/dyn/closer.lua/predictionio/0.15.0/apache-predictionio-0.15.0-bin.tar.gz
All JIRAs completed for this release are tagged with 'FixVersion = 0.13.0'; the JIRA release notes can be found here:
All JIRAs completed for this release are tagged with 'FixVersion = 0.15.0'; the JIRA release notes can be found here:
https://issues.apache.org/jira/secure/ReleaseNote.jspa?projectId=12320420&version=12337844
Thanks!
4 changes: 1 addition & 3 deletions README.md
@@ -45,9 +45,7 @@ A few installation options available.
* [Installing Apache PredictionIO from
Binary/Source](http://predictionio.apache.org/install/install-sourcecode/)
* [Installing Apache PredictionIO with
Docker](http://predictionio.apache.org/community/projects/#docker-images)
(community contributed)

Docker](http://predictionio.apache.org/install/install-docker/)

## Quick Start

56 changes: 49 additions & 7 deletions RELEASE.md
@@ -17,17 +17,59 @@ limitations under the License.

# Release Notes and News

**Note:** For upgrade instructions please refer to [this page](http://predictionio.apache.org/resources/upgrade/).
**Note:** For upgrade instructions please refer to [this page](https://predictionio.apache.org/resources/upgrade/).

## Deprecation Notice
## Version History

### 0.13.0
### 0.14.0

- Support for Scala 2.10.x and Spark 1.x are now deprecated. The next non-patch
version of PredictionIO may no longer support them
([PIO-158](https://issues.apache.org/jira/browse/PIO-158)).
Mar 11, 2019

## Version History
#### Breaking changes

- [PIO-168](https://issues.apache.org/jira/browse/PIO-168): Elasticsearch 6.x support (see the [pull request](https://github.com/apache/predictionio/pull/466))

#### New Features

- [PIO-183](https://issues.apache.org/jira/browse/PIO-183): Add Jupyter Docker image
- [PIO-199](https://issues.apache.org/jira/browse/PIO-199): Spark 2.4 (Scala 2.11) support

#### Behavior Changes

- [PIO-31](https://issues.apache.org/jira/browse/PIO-31): Move from spray to akka-http in servers
- [PIO-171](https://issues.apache.org/jira/browse/PIO-171): Drop Scala 2.10 and Spark 1.6 support
- [PIO-175](https://issues.apache.org/jira/browse/PIO-175): Deprecation of Elasticsearch 1.x support
- [PIO-179](https://issues.apache.org/jira/browse/PIO-179): Bump up HBase client version and make it configurable
- [PIO-192](https://issues.apache.org/jira/browse/PIO-192): Enhance PySpark support
- [PIO-196](https://issues.apache.org/jira/browse/PIO-196): Use external PySpark environment variables in Jupyter Docker image

#### Other Changes

- [PIO-153](https://issues.apache.org/jira/browse/PIO-153): Allow use of GNU tar on non-GNU systems
- [PIO-170](https://issues.apache.org/jira/browse/PIO-170): Upgrade sbt to 1.x
- [PIO-176](https://issues.apache.org/jira/browse/PIO-176): Clean up unmanaged sources in the data module
- [PIO-182](https://issues.apache.org/jira/browse/PIO-182): Add asynchronous (non-blocking) methods to LEventStore
- [PIO-188](https://issues.apache.org/jira/browse/PIO-188): Update the build matrix to the latest supported versions
- [PIO-189](https://issues.apache.org/jira/browse/PIO-189): ES6 integration test fails
- [PIO-194](https://issues.apache.org/jira/browse/PIO-194): S3 Model Data Storage should allow more flexible ways for specifying AWS credentials
- [PIO-203](https://issues.apache.org/jira/browse/PIO-203): pio status warnings
- [PIO-205](https://issues.apache.org/jira/browse/PIO-205): Update Dockerfile to reflect new Spark version
- [PIO-206](https://issues.apache.org/jira/browse/PIO-206): Spark 2.3.2 to 2.3.3

#### Documentation

- [PIO-172](https://issues.apache.org/jira/browse/PIO-172): Migration guide for ES 6.x changes
- [PIO-180](https://issues.apache.org/jira/browse/PIO-180): Trivial LiveDoc Link Change in Readme
- [PIO-185](https://issues.apache.org/jira/browse/PIO-185): Non-tracked Link in Apache Project page
- [PIO-195](https://issues.apache.org/jira/browse/PIO-195): Improve readability and grammar of documentation

#### Credits

The following contributors have spent a great deal of effort to bring to you
this release:

Alexander Merritt, Chris Wewerka, Donald Szeto, Naoki Takezoe, Saurabh Gulati,
Shinsuke Sugaya, Takako Shimamoto, Wei Chen, Yavor Stoychev

### 0.13.0

1 change: 0 additions & 1 deletion bin/pio-shell
@@ -65,7 +65,6 @@ then
# Get paths of assembly jars to pass to pyspark
. ${PIO_HOME}/bin/compute-classpath.sh
shift
export PYTHONSTARTUP=${PIO_HOME}/python/pypio/shell.py
export PYTHONPATH=${PIO_HOME}/python
${SPARK_HOME}/bin/pyspark --jars ${ASSEMBLY_JARS} $@
else
2 changes: 0 additions & 2 deletions bin/pio-start-all
@@ -34,8 +34,6 @@ if [ `echo $SOURCE_TYPE | grep -i elasticsearch | wc -l` != 0 ] ; then
echo "Starting Elasticsearch..."
if [ -n "$PIO_STORAGE_SOURCES_ELASTICSEARCH_HOME" ]; then
ELASTICSEARCH_HOME=$PIO_STORAGE_SOURCES_ELASTICSEARCH_HOME
elif [ -n "$PIO_STORAGE_SOURCES_ELASTICSEARCH5_HOME" ]; then
ELASTICSEARCH_HOME=$PIO_STORAGE_SOURCES_ELASTICSEARCH5_HOME
fi
if [ -n "$ELASTICSEARCH_HOME" ]; then
if [ -n "$JAVA_HOME" ]; then
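With the `ELASTICSEARCH5` fallback removed above, only the unsuffixed variable is consulted; a hedged example of setting it before running `pio-start-all` (the installation path is an assumption):

```
# Point pio-start-all at a local Elasticsearch installation; the path below
# is illustrative only.
export PIO_STORAGE_SOURCES_ELASTICSEARCH_HOME=/opt/elasticsearch-5.6.9
```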
63 changes: 24 additions & 39 deletions build.sbt
@@ -37,7 +37,7 @@ lazy val scalaSparkDepsVersion = Map(

name := "apache-predictionio-parent"

version in ThisBuild := "0.14.0-SNAPSHOT"
version in ThisBuild := "0.15.0-SNAPSHOT"

organization in ThisBuild := "org.apache.predictionio"

@@ -60,23 +60,20 @@ sparkVersion in ThisBuild := sys.props.getOrElse("spark.version", "2.1.3")

sparkBinaryVersion in ThisBuild := binaryVersion(sparkVersion.value)

akkaVersion in ThisBuild := sys.props.getOrElse(
"akka.version",
scalaSparkDepsVersion(scalaBinaryVersion.value)(sparkBinaryVersion.value)("akka"))
hadoopVersion in ThisBuild := sys.props.getOrElse("hadoop.version", "2.7.7")

lazy val es = sys.props.getOrElse("elasticsearch.version", "5.6.9")
akkaVersion in ThisBuild := sys.props.getOrElse("akka.version", "2.5.17")

elasticsearchVersion in ThisBuild := es
elasticsearchVersion in ThisBuild := sys.props.getOrElse("elasticsearch.version", "5.6.9")

lazy val hbase = sys.props.getOrElse("hbase.version", "1.2.6")
hbaseVersion in ThisBuild := sys.props.getOrElse("hbase.version", "1.2.6")

hbaseVersion in ThisBuild := hbase

json4sVersion in ThisBuild := scalaSparkDepsVersion(scalaBinaryVersion.value)(sparkBinaryVersion.value)("json4s")

hadoopVersion in ThisBuild := sys.props.getOrElse(
"hadoop.version",
scalaSparkDepsVersion(scalaBinaryVersion.value)(sparkBinaryVersion.value)("hadoop"))
json4sVersion in ThisBuild := {
sparkBinaryVersion.value match {
case "2.0" | "2.1" | "2.2" | "2.3" => "3.2.11"
case "2.4" => "3.5.3"
}
}

val conf = file("conf")

@@ -92,10 +89,6 @@ val commonTestSettings = Seq(
"org.postgresql" % "postgresql" % "9.4-1204-jdbc41" % "test",
"org.scalikejdbc" %% "scalikejdbc" % "3.1.0" % "test"))

val dataElasticsearch1 = (project in file("storage/elasticsearch1")).
settings(commonSettings: _*).
enablePlugins(GenJavadocPlugin)

val dataElasticsearch = (project in file("storage/elasticsearch")).
settings(commonSettings: _*)

@@ -151,33 +144,31 @@ val core = (project in file("core")).
enablePlugins(SbtTwirl).
disablePlugins(sbtassembly.AssemblyPlugin)

val tools = (project in file("tools")).
val e2 = (project in file("e2")).
dependsOn(core).
dependsOn(data).
settings(commonSettings: _*).
settings(commonTestSettings: _*).
settings(skip in publish := true).
enablePlugins(GenJavadocPlugin).
enablePlugins(SbtTwirl)
disablePlugins(sbtassembly.AssemblyPlugin)

val e2 = (project in file("e2")).
val tools = (project in file("tools")).
dependsOn(e2).
settings(commonSettings: _*).
settings(commonTestSettings: _*).
settings(skip in publish := true).
enablePlugins(GenJavadocPlugin).
disablePlugins(sbtassembly.AssemblyPlugin)

val dataEs = if (majorVersion(es) == 1) dataElasticsearch1 else dataElasticsearch
enablePlugins(SbtTwirl)

val storageSubprojects = Seq(
dataEs,
val storageProjectReference = Seq(
dataElasticsearch,
dataHbase,
dataHdfs,
dataJdbc,
dataLocalfs,
dataS3)
dataS3) map Project.projectToRef

val storage = (project in file("storage"))
.settings(skip in publish := true)
.aggregate(storageSubprojects map Project.projectToRef: _*)
.aggregate(storageProjectReference: _*)
.disablePlugins(sbtassembly.AssemblyPlugin)

val assembly = (project in file("assembly")).
@@ -187,9 +178,8 @@ val root = (project in file(".")).
settings(commonSettings: _*).
enablePlugins(ScalaUnidocPlugin).
settings(
skip in publish := true,
unidocProjectFilter in (ScalaUnidoc, unidoc) := inAnyProject -- inProjects(dataElasticsearch, dataElasticsearch1),
unidocProjectFilter in (JavaUnidoc, unidoc) := inAnyProject -- inProjects(dataElasticsearch, dataElasticsearch1),
unidocProjectFilter in (ScalaUnidoc, unidoc) := inAnyProject -- inProjects(storageProjectReference: _*),
unidocProjectFilter in (JavaUnidoc, unidoc) := inAnyProject -- inProjects(storageProjectReference: _*),
scalacOptions in (ScalaUnidoc, unidoc) ++= Seq(
"-groups",
"-skip-packages",
@@ -202,11 +192,6 @@
"org.apache.predictionio.controller.java",
"org.apache.predictionio.data.api",
"org.apache.predictionio.data.storage.*",
"org.apache.predictionio.data.storage.hdfs",
"org.apache.predictionio.data.storage.jdbc",
"org.apache.predictionio.data.storage.localfs",
"org.apache.predictionio.data.storage.s3",
"org.apache.predictionio.data.storage.hbase",
"org.apache.predictionio.data.view",
"org.apache.predictionio.data.webhooks",
"org.apache.predictionio.tools",
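Since `build.sbt` above reads `spark.version`, `hadoop.version`, `akka.version`, `elasticsearch.version`, and `hbase.version` from system properties, a build against a non-default stack might be invoked roughly as follows (the chosen versions are examples, not requirements):

```
# Override the defaults that build.sbt reads via sys.props.
sbt -Dspark.version=2.4.0 \
    -Dhadoop.version=2.7.7 \
    -Delasticsearch.version=5.6.9 \
    clean test
```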