
Wide-deep example shutting down #394

Closed
@ksy3395

Description

Hello,
I am trying to run the Wide-deep example on a Spark standalone cluster, on my Mac and separately on an Ubuntu machine, but the Spark job keeps shutting down immediately, and since there is no error or log output I can't investigate further. The MNIST standalone example runs fine on both machines. Is there anything else I can check to find out what is wrong?

Command:
${SPARK_HOME}/bin/spark-submit --master spark://192.168.1.9:7077 --py-files census_dataset.py, wide_deep_run_loop.py --conf spark.cores.max=3 --conf spark.task.cpus=1 --conf spark.executorEnv.JAVA_HOME="$JAVA_HOME" --conf spark.task.maxFailures=1 --conf spark.stage.maxConsecutiveAttempts=1 census_main.py --cluster_size 3

2019-02-18 01:55:26 WARN NativeCodeLoader:62 - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
2019-02-18 01:55:28 INFO ShutdownHookManager:54 - Shutdown hook called
2019-02-18 01:55:28 INFO ShutdownHookManager:54 - Deleting directory /private/var/folders/98/34ltsy7x4f5btv6bpt6d75xm0000gn/T/spark-7cf40b10-9420-4680-8059-fed7d4ae4930
2019-02-18 01:55:28 INFO ShutdownHookManager:54 - Deleting directory /private/var/folders/98/34ltsy7x4f5btv6bpt6d75xm0000gn/T/localPyFiles-75adab00-d652-47e4-92c5-b12dcb07ca99
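
One detail that may be relevant: spark-submit expects the value of --py-files to be a single comma-separated list with no spaces. With the space after the comma in the command above, the shell splits the list, so wide_deep_run_loop.py is likely parsed as the application script and census_main.py as one of its arguments, which could explain the job exiting with nothing but shutdown-hook messages. For comparison, a version of the same command without the space (master URL, file names, and conf values copied from the report above) would look like:

${SPARK_HOME}/bin/spark-submit --master spark://192.168.1.9:7077 --py-files census_dataset.py,wide_deep_run_loop.py --conf spark.cores.max=3 --conf spark.task.cpus=1 --conf spark.executorEnv.JAVA_HOME="$JAVA_HOME" --conf spark.task.maxFailures=1 --conf spark.stage.maxConsecutiveAttempts=1 census_main.py --cluster_size 3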
