This repository has been archived by the owner on May 17, 2022. It is now read-only.
Can you confirm that setting the environment variable is not working?
I think the environment variable is read by Spark only when the SparkContext object is created.
The extension only imports pyspark and creates a SparkConf object. If I'm not wrong, you can still add properties to the conf, as well as set environment variables, before starting the context.
(Here again, you must pass the conf when creating the SparkContext for the extension to work.)
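A minimal sketch of what that workflow could look like in a notebook cell, assuming the extension has injected a SparkConf named `conf` into the namespace; the property key and environment variable below are placeholders, not anything sparkmonitor requires:

```python
import os
import pyspark

# 'conf' is the SparkConf the sparkmonitor extension injects into the
# notebook namespace; the fallback below exists only so this sketch
# runs standalone.
try:
    conf
except NameError:
    conf = pyspark.SparkConf()

# Extra properties can still be added before the context is started.
conf.set("spark.executor.memory", "2g")  # example property

# Environment variables can also still be set at this point.
os.environ["MY_APP_SETTING"] = "value"  # hypothetical variable

# Passing the conf is what lets the extension's listener attach.
sc = pyspark.SparkContext(conf=conf)
```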
You're right. PYSPARK_SUBMIT_ARGS is ignored only in the case of the PySpark kernel in Jupyter, because that kernel initializes the SparkContext internally, so the args have no effect (the SparkContext has already been initialized by the time user code runs).
An observation: it does look like sparkmonitor won't work correctly with the PySpark kernel, since that kernel won't use the conf created by sparkmonitor.
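For contrast, a sketch of the ordering that does work in a plain Python kernel, where the user controls when the context is created; the `--packages` coordinate is a placeholder:

```python
import os

# As noted above, PYSPARK_SUBMIT_ARGS is only read when the SparkContext
# (and its JVM gateway) is created, so it must be in the environment
# before that point. The trailing 'pyspark-shell' token is required.
os.environ["PYSPARK_SUBMIT_ARGS"] = (
    "--packages com.example:some-artifact:1.0 pyspark-shell"  # placeholder
)

import pyspark

sc = pyspark.SparkContext()  # the submit args take effect here

# With the PySpark kernel this is impossible: 'sc' already exists before
# any notebook cell runs, so the variable is set too late.
```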
Based on the discussion at #6 (comment)
The extension is doing an `import pyspark` inside the extension. This means that if I, as a Jupyter user, want to do something like setting `PYSPARK_SUBMIT_ARGS` in a notebook cell, I cannot, because the environment variable will be created after `pyspark` has already been imported in the sparkmonitor module.
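The code block from the original report is not preserved here; a sketch of the failing sequence being described, assuming the sparkmonitor extension is loaded (which injects `conf`) and using a placeholder `--packages` coordinate, would look something like this:

```python
# First notebook cell. By this point the sparkmonitor extension has
# already run 'import pyspark' while loading, and created its SparkConf.
import os

os.environ["PYSPARK_SUBMIT_ARGS"] = (
    "--packages com.example:some-artifact:1.0 pyspark-shell"  # placeholder
)

import pyspark  # a no-op: the module is already in sys.modules

# Per the report, the variable is set too late to influence what the
# extension prepared at import time.
sc = pyspark.SparkContext(conf=conf)  # 'conf' injected by sparkmonitor
```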