Prediction.io - pio train fails

I'm using the Elasticsearch + Hbase version of Prediction.IO from the sphereio/docker-predictionio docker image and the universal recommendation template template-scala-parallel-universal-recommendation.

pio-start-all and pio status work fine and the eventserver is perfectly functional. I have created an app and imported a few hundred events to start with.
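For reference, events of this kind can be sent to the eventserver over its REST endpoint roughly like this (the access key, entity IDs, and timestamp below are placeholders, not taken from my actual import):

curl -i -X POST "http://localhost:7070/events.json?accessKey=YOUR_ACCESS_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "event": "START",
    "entityType": "user",
    "entityId": "u1",
    "targetEntityType": "item",
    "targetEntityId": "i1",
    "eventTime": "2016-09-19T06:00:00.000Z"
  }'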

However, after doing pio build on the template, pio train fails with a couple of javax.naming.NameNotFoundException warnings, and even pio.log contains nothing beyond them.
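For completeness, this is roughly the sequence I run (the engine directory path matches the manifest path shown in the train output below):

cd /PredictionIO-0.9.6/engines/universal-recommendation
pio build
pio train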

Here's my engine.json:

{
    "comment": " This config file uses default settings for all but the required values see README.md for docs",
    "id": "default",
    "description": "Default settings",
    "engineFactory": "com.test.RecommendationEngine",
    "datasource": {
        "params": {
            "name": "sample-handmade-data.txt",
            "appName": "testapp",
            "eventNames": ["START"]
        }
    },
    "sparkConf": {
        "spark.serializer": "org.apache.spark.serializer.KryoSerializer",
        "spark.kryo.registrator": "org.apache.mahout.sparkbindings.io.MahoutKryoRegistrator",
        "spark.kryo.referenceTracking": "false",
        "spark.kryoserializer.buffer": "300m",
        "spark.executor.memory": "4g",
        "es.index.auto.create": "true"
    },
    "algorithms": [{
        "comment": "simplest setup where all values are default, popularity based backfill, must add eventsNames",
        "name": "ur",
        "params": {
            "appName": "testapp",
            "indexName": "urindex",
            "typeName": "items",
            "comment": "must have data for the first event or the model will not build, other events are optional",
            "eventNames": ["START"]
        }
    }]
}

And the pio train output:

[INFO] [Console$] Using existing engine manifest JSON at /PredictionIO-0.9.6/engines/universal-recommendation/manifest.json
[INFO] [Runner$] Submission command: /PredictionIO-0.9.6/vendors/spark-1.5.1-bin-hadoop2.6/bin/spark-submit --class io.prediction.workflow.CreateWorkflow --jars file:/PredictionIO-0.9.6/engines/universal-recommendation/target/scala-2.10/template-scala-parallel-universal-recommendation-assembly-0.2.3-deps.jar,file:/PredictionIO-0.9.6/engines/universal-recommendation/target/scala-2.10/template-scala-parallel-universal-recommendation_2.10-0.2.3.jar --files file:/PredictionIO-0.9.6/conf/log4j.properties,file:/PredictionIO-0.9.6/vendors/hbase-1.0.0/conf/hbase-site.xml --driver-class-path /PredictionIO-0.9.6/conf:/PredictionIO-0.9.6/vendors/hbase-1.0.0/conf file:/PredictionIO-0.9.6/lib/pio-assembly-0.9.6.jar --engine-id FYOHZGlAmUH2xAYWNmQFIf9Jls201WVr --engine-version a892fe59be15dcf27a17f07fb76135a967309fda --engine-variant file:/PredictionIO-0.9.6/engines/universal-recommendation/engine.json --verbosity 0 --json-extractor Both --env PIO_STORAGE_SOURCES_HBASE_TYPE=hbase,PIO_ENV_LOADED=1,PIO_STORAGE_REPOSITORIES_METADATA_NAME=pio_meta,PIO_VERSION=0.9.6,PIO_FS_BASEDIR=/root/.pio_store,PIO_STORAGE_SOURCES_ELASTICSEARCH_HOSTS=localhost,PIO_STORAGE_SOURCES_HBASE_HOME=/PredictionIO-0.9.6/vendors/hbase-1.0.0,PIO_HOME=/PredictionIO-0.9.6,PIO_FS_ENGINESDIR=/root/.pio_store/engines,PIO_STORAGE_SOURCES_LOCALFS_PATH=/root/.pio_store/models,PIO_STORAGE_SOURCES_ELASTICSEARCH_TYPE=elasticsearch,PIO_STORAGE_REPOSITORIES_METADATA_SOURCE=ELASTICSEARCH,PIO_STORAGE_REPOSITORIES_MODELDATA_SOURCE=LOCALFS,PIO_STORAGE_REPOSITORIES_EVENTDATA_NAME=pio_event,PIO_STORAGE_SOURCES_ELASTICSEARCH_CLUSTERNAME=predictionio,PIO_STORAGE_SOURCES_ELASTICSEARCH_HOME=/PredictionIO-0.9.6/vendors/elasticsearch-1.4.4,PIO_FS_TMPDIR=/root/.pio_store/tmp,PIO_STORAGE_REPOSITORIES_MODELDATA_NAME=pio_model,PIO_STORAGE_REPOSITORIES_EVENTDATA_SOURCE=HBASE,PIO_CONF_DIR=/PredictionIO-0.9.6/conf,PIO_STORAGE_SOURCES_ELASTICSEARCH_PORTS=9300,PIO_STORAGE_SOURCES_LOCALFS_TYPE=localfs
[INFO] [Engine] Extracting datasource params...
[INFO] [WorkflowUtils$] No 'name' is found. Default empty String will be used.
[INFO] [Engine] Datasource params: (,DataSourceParams(testapp,List(START)))
[INFO] [Engine] Extracting preparator params...
[INFO] [Engine] Preparator params: (,Empty)
[INFO] [Engine] Extracting serving params...
[INFO] [Engine] Serving params: (,Empty)
[INFO] [Remoting] Starting remoting
[INFO] [Remoting] Remoting started; listening on addresses :[akka.tcp://[email protected]:42582]
[WARN] [MetricsSystem] Using default name DAGScheduler for source because spark.app.id is not set.
[INFO] [Engine$] EngineWorkflow.train
[INFO] [Engine$] DataSource: com.test.DataSource@75bd28d
[INFO] [Engine$] Preparator: com.test.Preparator@13278a41
[INFO] [Engine$] AlgorithmList: List(com.test.URAlgorithm@2365ea38)
[INFO] [Engine$] Data sanity check is on.
[WARN] [TableInputFormatBase] Cannot resolve the host name for 9a94fb2890b3/172.17.0.2 because of javax.naming.NameNotFoundException: DNS name not found [response code 3]; remaining name '2.0.17.172.in-addr.arpa'
[INFO] [Engine$] com.test.TrainingData does not support data sanity check. Skipping check.
[WARN] [TableInputFormatBase] Cannot resolve the host name for 9a94fb2890b3/172.17.0.2 because of javax.naming.NameNotFoundException: DNS name not found [response code 3]; remaining name '2.0.17.172.in-addr.arpa'
Asked Sep 19 '16 by Harsha Bhat


1 Answer

There is one way to sort out this problem: use Google DNS when starting your Docker container.

--dns=8.8.8.8
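For example, assuming the container is started with a plain docker run, the flag would be added like this (the image name comes from the question; the port mappings are only an illustration and may differ in your setup):

docker run -d --dns=8.8.8.8 \
  -p 7070:7070 -p 8000:8000 \
  sphereio/docker-predictionio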
Answered Nov 06 '22 by Sri