spark-submit job submission fails with "Error initializing SparkContext"

16/03/04 00:21:09 WARN SparkContext: Using SPARK_MEM to set amount of memory to use per executor process is deprecated, please use spark.executor.memory instead.

16/03/04 00:21:09 ERROR SparkContext: Error initializing SparkContext.
org.apache.spark.SparkException: Could not parse Master URL: 'at'
        at org.apache.spark.SparkContext$.org$apache$spark$SparkContext$$createTaskScheduler(SparkContext.scala:2554)
        at org.apache.spark.SparkContext.<init>(SparkContext.scala:489)
        at com.bigdata.deal.scala.DomainLib$.main(DomainLib.scala:22)
        at com.bigdata.deal.scala.DomainLib.main(DomainLib.scala)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:664)
        at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:169)
        at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:192)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:111)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

When building the SparkConf, you must set both the Spark home and the master URL; otherwise SparkContext cannot parse the master and fails as shown above.
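
Below is a minimal Scala sketch of the driver setup. The master URL spark://mini-cp1:7077 and the Spark install path are assumptions based on this cluster's host name; adjust them to your own environment.

import org.apache.spark.{SparkConf, SparkContext}

object DomainLib {
  def main(args: Array[String]): Unit = {
    // Set the master URL and Spark home explicitly; a missing or malformed
    // master is what produces "Could not parse Master URL".
    val conf = new SparkConf()
      .setAppName("DomainLib")
      .setMaster("spark://mini-cp1:7077")                   // assumed standalone master host:port
      .setSparkHome("/usr/local/spark-1.4.0-bin-hadoop2.6") // assumed Spark install path

    val sc = new SparkContext(conf)
    // ... job logic ...
    sc.stop()
  }
}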

Also note that the following should be configured in spark-env.sh; with that done, the job submits normally:

[root@mini-cp1 spark-1.4.0-bin-hadoop2.6]# cat conf/spark-env.sh
#!/usr/bin/env bash

# Standalone master host
SPARK_MASTER_IP=mini-cp1

# JAVA_HOME must be exported
export JAVA_HOME=/usr/local/jdk1.7.0_65
export HADOOP_HOME=/usr/local/hadoop-2.6.0
#export SCALA_HOME=/opt/scala

export SPARK_WORKER_MEMORY=3g
export HADOOP_CONF_DIR=/usr/local/hadoop-2.6.0/etc/hadoop

# SPARK_MEM is deprecated (see the WARN above); spark.executor.memory is preferred
#SPARK_MEM=${SPARK_MEM:-1g}
export SPARK_MEM=3g

export HADOOP_COMMON_LIB_NATIVE_DIR=/usr/local/hadoop-2.6.0/lib/native
export HADOOP_OPTS="-Djava.library.path=/usr/local/hadoop-2.6.0/lib"
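
As for the WARN at the top of the log: SPARK_MEM is deprecated, and executor memory is better set through spark.executor.memory. A sketch of the replacement on the SparkConf (the 3g value simply mirrors SPARK_MEM=3g above; the master URL is the same assumption as in the earlier sketch):

// Replaces the deprecated SPARK_MEM environment variable.
val conf = new SparkConf()
  .setAppName("DomainLib")
  .setMaster("spark://mini-cp1:7077")      // assumed master URL, same as the sketch above
  .set("spark.executor.memory", "3g")      // per-executor memory, replaces SPARK_MEM=3g

The same value can also be passed on the command line with spark-submit --executor-memory 3g.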


