
PYSPARK_PYTHON Setup in Jupyter Notebooks Is Ignored

I've been trying to set PYSPARK_PYTHON from a Jupyter notebook (using JupyterLab) so that PySpark uses a specific conda env, but I cannot find a way to make it work. I have found some examples.

Solution 1:

To get it working, you should also export these variables in your shell before launching pyspark from the CLI:

export PYSPARK_DRIVER_PYTHON=jupyter
export PYSPARK_DRIVER_PYTHON_OPTS='notebook'
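If the goal is to point the executors at a specific conda env from inside the notebook itself, the variable has to be set before the SparkSession (and its JVM) is created; here is a minimal sketch, assuming a hypothetical env at /opt/conda/envs/myenv:

import os
from pyspark.sql import SparkSession

# Hypothetical conda env path; replace with the interpreter of your own env.
os.environ["PYSPARK_PYTHON"] = "/opt/conda/envs/myenv/bin/python"

# Setting the variable after a SparkSession already exists has no effect,
# so do it at the top of the notebook, before getOrCreate().
spark = SparkSession.builder.appName("conda-env-demo").getOrCreate()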

Another method is to install the findspark package:

import findspark
findspark.init()  # locate the Spark installation and add pyspark to sys.path

import pyspark
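
Once findspark has made pyspark importable, a quick way to confirm which interpreter the executors actually picked up is to ask a worker for its sys.executable; a minimal sketch, assuming a local Spark installation:

import sys
import findspark
findspark.init()

from pyspark.sql import SparkSession

spark = SparkSession.builder.master("local[*]").getOrCreate()

# Ask one executor which Python binary it is running under; if PYSPARK_PYTHON
# was picked up, this should print the conda env's interpreter path.
print(spark.sparkContext.parallelize([0], 1).map(lambda _: sys.executable).collect())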

Hope it'll help: https://www.sicara.ai/blog/2017-05-02-get-started-pyspark-jupyter-notebook-3-minutes
