Fix compatibility of Apache-Spark 2.1.0 with Python 3.6

If you are using macOS Sierra and Homebrew like me, and you want to build something cool with Apache-Spark and Python 3, you will run into a compatibility problem when using the pyspark framework. This article explains how to solve it.



For now, the only workaround is to install Python 3.5 with pyenv (via Homebrew), since the compatibility fix will ship in Apache-Spark 2.2.0, which has not been released yet.


  1. Install pyenv
    1. Assuming you have already installed Apache-Spark and Python 3.6 correctly via Homebrew
    2. brew install pyenv
    3. Edit your shell rc file
      1. add the following line at the beginning of the configuration

        eval "$(pyenv init -)"

    4. Install Python 3.5 (for now the latest version is 3.5.2)
      1. pyenv install 3.5.2

    5. Change global Python to 3.5.2 instead of 3.6.x
      1. pyenv global 3.5.2
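After switching the global version, it is worth confirming that the active interpreter really is 3.5.x before touching pyspark. Below is a small sketch; `is_python_35` is a hypothetical helper, not part of pyenv, which checks whether a `python --version` string reports the 3.5 series that Spark 2.1.0 supports.

```shell
# is_python_35: hypothetical helper that checks whether a
# `python --version` string reports Python 3.5.x
is_python_35() {
  case "$1" in
    "Python 3.5."*) return 0 ;;
    *) return 1 ;;
  esac
}

# In practice, feed it the real interpreter output:
#   is_python_35 "$(python --version 2>&1)" && echo "Python 3.5 active"
is_python_35 "Python 3.5.2" && echo "Python 3.5 active"
```

If the check fails, re-run `pyenv global 3.5.2` and make sure the `pyenv init` line from step 1.3 is actually loaded by your shell.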

  2. Install pyspark framework by using pip3
    1. At this point your pip3 points to Python 3.5.2
    2. Go to your $SPARK_HOME/libexec/python
    3. pip3 install -e .
      pip2 install -e .

    4. You may then find that you can no longer access $SPARK_HOME, so relink it
    5. brew link --overwrite apache-spark

    6. Then happy hacking with pyspark
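The steps above can be tied together in a short sketch. `pyspark_src_dir` is a hypothetical helper that prints the pyspark source directory for a given Spark prefix, and the Homebrew prefix shown in the usage comment is an assumption about a default installation.

```shell
# pyspark_src_dir: hypothetical helper that prints the pyspark source
# directory for a given Spark prefix, matching the Homebrew layout above
pyspark_src_dir() {
  echo "$1/libexec/python"
}

# Usage with the steps above (commands shown, not run here):
#   cd "$(pyspark_src_dir "$SPARK_HOME")"
#   pip3 install -e .
#   pip2 install -e .
pyspark_src_dir "/usr/local/opt/apache-spark"
```

The `-e` flag makes pip install pyspark in editable mode, so Python imports it straight from the Spark tree instead of copying it into site-packages.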


  1. Vanban Wu


    I cannot find the shell rc file anywhere in my macOS Sierra to fix the problem of Spark 2.1.0 working with Python 3.6. Please advise where I should look. Thanks much.

    Best regards,


    • Well, whatever mainstream shell you are using, bash or zsh, you will find your shell rc file in your home directory. Open your terminal, then run "cat .bashrc" or "cat .zshrc" and see if anything shows up; if yes, you can edit that file.
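    The lookup described in the reply above can be sketched as a small function. `rc_file_for_shell` is a hypothetical helper, and the fallback to .profile for shells other than bash or zsh is an assumption.

```shell
# rc_file_for_shell: hypothetical helper that prints the likely rc file
# for a given shell path (bash -> .bashrc, zsh -> .zshrc)
rc_file_for_shell() {
  case "$1" in
    */zsh)  echo "$HOME/.zshrc" ;;
    */bash) echo "$HOME/.bashrc" ;;
    *)      echo "$HOME/.profile" ;;  # assumed fallback for other shells
  esac
}

# Usage: rc_file_for_shell "$SHELL"
rc_file_for_shell "/bin/zsh"
```

    If the file does not exist yet, simply create it; the shell will read it on the next login.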