Building a Spark Development Environment on Linux
1. Spark
2. Setting Up the Spark Development Environment
【1】Setting up Spark requires Hadoop, Java, and Scala. Java and Hadoop are already installed on this machine, so their setup is not covered again here.
【2】Set up the Scala and SBT environments:
(2.1) Download the Scala and SBT packages:
Scala official site: https://www.scala-lang.org/
SBT official site: https://www.scala-sbt.org/download.html
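Alternatively, the Scala package can be fetched directly on the server with wget. This is only a sketch; the exact download URL is an assumption based on the usual Scala release naming, so verify it on the download page:
[root@marklin ~]# # URL below is assumed from the standard Scala release path; verify before use
[root@marklin ~]# wget https://downloads.lightbend.com/scala/2.12.5/scala-2.12.5.tgz
For SBT, pick the tarball matching your preferred version from the download page above.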
(2.2) Install Scala and SBT:
Scala installation:
Use Xftp 5 to upload scala-2.12.5.tgz to the Linux server under /usr/local/scala.
Log in with Xshell 5, change to /usr/local/scala, and extract the archive: tar -xvf scala-2.12.5.tgz
Last login: Sat Apr  7 07:22:36 2018 from 192.168.3.4
[root@marklin ~]# cd /usr/local/scala
[root@marklin scala]# ll
total 19832
-rw-r--r--. 1 root root 20303983 Apr  7 10:10 scala-2.12.5.tgz
[root@marklin scala]# tar -xvf scala-2.12.5.tgz
Configure the environment variables; run: vim /etc/profile
#Setting SCALA_HOME PATH
export SCALA_HOME=/usr/local/scala/scala-2.12.5
export PATH=${PATH}:${SCALA_HOME}/bin
Run: source /etc/profile to make the variables take effect.
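To verify the installation, check the version banner (the line shown is what Scala 2.12.5 is expected to print):
[root@marklin ~]# scala -version
Scala code runner version 2.12.5 -- Copyright 2002-2018, LAMP/EPFL and Lightbend, Inc.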
【3】Download the package spark-2.3.0-bin-hadoop2.7.tgz from the official site: https://spark.apache.org/
【4】Upload spark-2.3.0-bin-hadoop2.7.tgz to /usr/local/spark.
【5】Change to /usr/local/spark and extract the archive: tar -xvf spark-2.3.0-bin-hadoop2.7.tgz
[root@marklin scala]# cd /usr/local/spark
[root@marklin spark]# ll
total 220832
-rw-r--r--. 1 root root 226128401 Apr  7 10:38 spark-2.3.0-bin-hadoop2.7.tgz
[root@marklin spark]# tar -xvf spark-2.3.0-bin-hadoop2.7.tgz
Note: the tarball extracts to a directory named spark-2.3.0-bin-hadoop2.7; the paths used below assume it was renamed, e.g. mv spark-2.3.0-bin-hadoop2.7 spark-2.3.0.
【6】Configure the environment variables: vim /etc/profile
#Setting SPARK_HOME PATH
export SPARK_HOME=/usr/local/spark/spark-2.3.0
export PATH=${PATH}:${SPARK_HOME}/bin
Run: source /etc/profile to make the variables take effect.
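To confirm the Spark binaries are now on the PATH, the version can be queried; spark-submit prints a banner that should report version 2.3.0:
[root@marklin ~]# spark-submit --version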
【7】Edit the configuration files:
Change to /usr/local/spark/spark-2.3.0/conf and edit the slaves file in Spark's conf directory.
Before editing, make a copy of the template with cp slaves.template slaves, then change localhost in the slaves file to the hostname (mine is marklin.com):
[root@marklin conf]# cp slaves.template slaves
[root@marklin conf]# ll
total 40
-rw-r--r--. 1 1311767953 1876110778  996 Feb 22 14:42 docker.properties.template
-rw-r--r--. 1 1311767953 1876110778 1105 Feb 22 14:42 fairscheduler.xml.template
-rw-r--r--. 1 1311767953 1876110778 2025 Feb 22 14:42 log4j.properties.template
-rw-r--r--. 1 1311767953 1876110778 7801 Feb 22 14:42 metrics.properties.template
-rw-r--r--. 1 root       root        865 Apr  7 10:54 slaves
-rw-r--r--. 1 1311767953 1876110778  865 Feb 22 14:42 slaves.template
-rw-r--r--. 1 1311767953 1876110778 1292 Feb 22 14:42 spark-defaults.conf.template
-rwxr-xr-x. 1 1311767953 1876110778 4221 Feb 22 14:42 spark-env.sh.template
[root@marklin conf]# chmod +x slaves
[root@marklin conf]# ll
total 40
-rw-r--r--. 1 1311767953 1876110778  996 Feb 22 14:42 docker.properties.template
-rw-r--r--. 1 1311767953 1876110778 1105 Feb 22 14:42 fairscheduler.xml.template
-rw-r--r--. 1 1311767953 1876110778 2025 Feb 22 14:42 log4j.properties.template
-rw-r--r--. 1 1311767953 1876110778 7801 Feb 22 14:42 metrics.properties.template
-rwxr-xr-x. 1 root       root        865 Apr  7 10:54 slaves
-rw-r--r--. 1 1311767953 1876110778  865 Feb 22 14:42 slaves.template
-rw-r--r--. 1 1311767953 1876110778 1292 Feb 22 14:42 spark-defaults.conf.template
-rwxr-xr-x. 1 1311767953 1876110778 4221 Feb 22 14:42 spark-env.sh.template
[root@marklin conf]# vim slaves
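After editing, the effective (non-comment) content of slaves should be just the hostname; a quick way to check:
[root@marklin conf]# grep -v '^#' slaves
marklin.com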
Edit the spark-env.sh file. Before editing, make a copy of the template: cp spark-env.sh.template spark-env.sh
[root@marklin conf]# cp spark-env.sh.template spark-env.sh
Then open spark-env.sh and append the following:
export JAVA_HOME=/usr/local/java/jdk1.8.0_162
export HADOOP_HOME=/usr/local/hadoop/hadoop-2.7.5
export SCALA_HOME=/usr/local/scala/scala-2.12.5
export SPARK_HOME=/usr/local/spark/spark-2.3.0
export HADOOP_CONF_DIR=${HADOOP_HOME}/etc/hadoop
export YARN_CONF_DIR=${HADOOP_HOME}/etc/hadoop
export SPARK_LOCAL_IP=marklin.com
export SPARK_MASTER_HOST=marklin.com
export SPARK_WORKER_MEMORY=512M
export SPARK_CONF_DIR=${SPARK_HOME}/conf
export SPARK_LOG_DIR=/usr/local/spark/repository/logs
export SPARK_PID_DIR=/usr/local/spark/repository/pids
export SPARK_LIBRARY_PATH=.:${JAVA_HOME}/lib:${JAVA_HOME}/jre/lib:${HADOOP_HOME}/lib/native
export SPARK_WORKER_DIR=/usr/local/spark/repository/worker
export SPARK_MASTER_PORT=8188
export SPARK_MASTER_WEBUI_PORT=8180
export SPARK_WORKER_PORT=8181
export SPARK_WORKER_WEBUI_PORT=8182
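The log, PID, and worker directories referenced above do not exist yet. The start scripts generally create the log directory on their own, but creating all three up front avoids permission surprises:
[root@marklin ~]# mkdir -p /usr/local/spark/repository/logs /usr/local/spark/repository/pids /usr/local/spark/repository/worker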
Open the required ports in the firewall:
[root@marklin ~]# systemctl start firewalld.service
[root@marklin ~]# firewall-cmd --zone=public --add-port=8180/tcp --permanent
success
[root@marklin ~]# firewall-cmd --zone=public --add-port=8188/tcp --permanent
success
[root@marklin ~]# firewall-cmd --zone=public --add-port=8181/tcp --permanent
success
[root@marklin ~]# firewall-cmd --zone=public --add-port=8182/tcp --permanent
success
[root@marklin ~]# firewall-cmd --reload
success
[root@marklin ~]# systemctl stop firewalld.service
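Before stopping the service, the opened ports can be verified with firewall-cmd; the listing shown is the expected (assumed) output for the four ports added above:
[root@marklin ~]# firewall-cmd --zone=public --list-ports
8180/tcp 8188/tcp 8181/tcp 8182/tcp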
【8】Start and test
Change to /usr/local/spark/spark-2.3.0/sbin and run: start-master.sh
[root@marklin sbin]# start-master.sh
starting org.apache.spark.deploy.master.Master, logging to /usr/local/spark/repository/logs/spark-root-org.apache.spark.deploy.master.Master-1-marklin.com.out
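Note that start-master.sh launches only the master. To also bring up the worker(s) listed in the slaves file, the companion script can be used (in Spark 2.3 this is start-slaves.sh; start-all.sh starts master and workers in one go):
[root@marklin sbin]# start-slaves.sh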
Open http://192.168.3.4:8180/#running-app in a browser to check the master web UI.
Then change to the bin directory and run: spark-shell
[root@marklin sbin]# cd ..
[root@marklin spark-2.3.0]# cd bin
[root@marklin bin]# spark-shell
2018-04-07 11:43:08 WARN  NativeCodeLoader:62 - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
Spark context Web UI available at http://marklin.com:4040
Spark context available as 'sc' (master = local[*], app id = local-1523115824100).
Spark session available as 'spark'.
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 2.3.0
      /_/
Using Scala version 2.11.8 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_162)
Type in expressions to have them evaluated.
Type :help for more information.
scala>
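(Note that spark-shell reports Scala 2.11.8: the prebuilt Spark 2.3.0 bundles its own Scala 2.11, independent of the Scala 2.12.5 installed earlier.) As a final sanity check, a trivial job can be run at the prompt; the expected result is shown:
scala> sc.parallelize(1 to 100).reduce(_ + _)
res0: Int = 5050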