Install Spark on macOS

Sean Source

I am trying to install Spark and its associated programs on a Mac, but I receive an error message when testing the installation:

/Users/somedirectory/apachespark/spark-2.3.0-bin-hadoop2.7/bin/pyspark
/Users/somedirectory/apachespark/spark-2.3.0-bin-hadoop2.7/bin/spark-class: line 71: /Library/Java/JavaVirtualMachines/jdk1.8.0_162.jdk/Contents/Home//bin/java: No such file or directory

These are the entries from my .bash_profile:

export JAVA_HOME=/Library/Java/JavaVirtualMachines/jdk1.8.0_162.jdk/Contents/Home/

export SPARK_HOME=/Users/directory/apachespark/spark-2.3.0-bin-hadoop2.7

export SBT_HOME=/Users/directory/apachespark/sbt

export SCALA_HOME=/Users/directory/apachespark/scala-2.11.12

export PATH=$JAVA_HOME/bin:$SBT_HOME/bin:$SBT_HOME/lib:$SCALA_HOME/bin:$SCALA_HOME/lib:$PATH

export PATH=$JAVA_HOME/bin:$SPARK_HOME:$SPARK_HOME/bin:$SPARK_HOME/sbin:$PATH

export PYSPARK_PYTHON=python3

PATH="/Library/Frameworks/Python.framework/Versions/3.6/bin:${PATH}"
export PATH

Any suggestions for corrections? Thanks.

Tags: python, scala, apache-spark

Answers

#1: Leo C, answered 5 days ago

As shown in the reported error message:

/Library/Java/JavaVirtualMachines/jdk1.8.0_162.jdk/Contents/Home//bin/java: No such file or directory

the path to the Java executable, $JAVA_HOME/bin/java, contains a doubled / because of the trailing / in your JAVA_HOME:

export JAVA_HOME=/Library/Java/JavaVirtualMachines/jdk1.8.0_162.jdk/Contents/Home/
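The doubling is easy to reproduce in a shell, since spark-class simply appends /bin/java to $JAVA_HOME (a minimal illustration; the JDK path here is the one from your profile):

```shell
# Minimal illustration: appending /bin/java to a JAVA_HOME that already
# ends in / produces the doubled slash seen in the error message.
JAVA_HOME=/Library/Java/JavaVirtualMachines/jdk1.8.0_162.jdk/Contents/Home/
echo "${JAVA_HOME}/bin/java"
# -> /Library/Java/JavaVirtualMachines/jdk1.8.0_162.jdk/Contents/Home//bin/java
```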

Removing the trailing / in JAVA_HOME should fix the problem. Better yet, setting JAVA_HOME as shown below automatically points to the active JDK on macOS:

export JAVA_HOME=$(/usr/libexec/java_home)
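Either fix can be applied and verified from the shell. A minimal sketch, assuming bash (the guard around /usr/libexec/java_home is only there so the snippet is a no-op on systems that lack it):

```shell
# Option 1: strip the single trailing slash from the existing value
# using bash parameter expansion.
JAVA_HOME=/Library/Java/JavaVirtualMachines/jdk1.8.0_162.jdk/Contents/Home/
JAVA_HOME="${JAVA_HOME%/}"        # %/ removes one trailing /
echo "$JAVA_HOME/bin/java"        # no doubled slash any more

# Option 2: on macOS, ask the system for the active JDK's home.
if [ -x /usr/libexec/java_home ]; then
  export JAVA_HOME="$(/usr/libexec/java_home)"
  "$JAVA_HOME/bin/java" -version  # should print the JDK version
fi
```

After updating .bash_profile, remember to open a new terminal or run `source ~/.bash_profile` before retrying pyspark.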
