[SPARK-16880][ML][MLLib] make ann training data persisted if needed #14483

Closed

Conversation

WeichenXu123
Contributor

What changes were proposed in this pull request?

Make sure the ANN layer's input training data is persisted, so that we avoid the overhead of recomputing the RDD from its lineage on each pass.

How was this patch tested?

Existing tests.
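The fix follows the "persist if needed" convention used by the other LBFGS-based algorithms: persist the input only when the caller has not already cached it, and unpersist only what we persisted ourselves. Below is a minimal, self-contained sketch of that logic (no real Spark dependency; `FakeRDD`, `trainWithPersistence`, and the `persisted` flag are illustrative stand-ins for `RDD`, the trainer, and `getStorageLevel == StorageLevel.NONE`):

```scala
// Illustrative stand-in for an RDD's storage state.
final class FakeRDD {
  var persisted: Boolean = false
  def persist(): Unit = persisted = true
  def unpersist(): Unit = persisted = false
}

// Persist the data around an iterative optimization body, but only if the
// caller has not already persisted it; never unpersist caller-managed data.
def trainWithPersistence(data: FakeRDD)(body: => Unit): Unit = {
  val handlePersistence = !data.persisted // mirrors getStorageLevel == StorageLevel.NONE
  if (handlePersistence) data.persist()
  body // an iterative optimizer like LBFGS traverses `data` many times here
  if (handlePersistence) data.unpersist()
}
```

Without this guard, every LBFGS iteration would recompute the training RDD from its lineage, which is exactly the overhead this patch removes for ANN.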

@SparkQA

SparkQA commented Aug 3, 2016

Test build #63178 has finished for PR 14483 at commit 0bfece8.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@srowen
Member

srowen commented Aug 3, 2016

LGTM. Do you see other instances of this type of pattern? Worth a quick glance.

@WeichenXu123
Contributor Author

@srowen Yeah, the other algorithms using LBFGS all have this pattern; only ANN was missing it.

asfgit pushed a commit that referenced this pull request Aug 4, 2016
## What changes were proposed in this pull request?

Make sure the ANN layer's input training data is persisted, so that we avoid the overhead of recomputing the RDD from its lineage on each pass.

## How was this patch tested?

Existing tests.

Author: WeichenXu <WeichenXu123@outlook.com>

Closes #14483 from WeichenXu123/add_ann_persist_training_data.

(cherry picked from commit 462784f)
Signed-off-by: Sean Owen <sowen@cloudera.com>
@srowen
Member

srowen commented Aug 4, 2016

Merged to master/2.0

@asfgit asfgit closed this in 462784f Aug 4, 2016
@WeichenXu123 WeichenXu123 deleted the add_ann_persist_training_data branch August 5, 2016 09:59