Launch cluster with AMI that already has Spark #308
It's probably because Spark isn't being configured with the addresses of the nodes in your cluster (which would happen as part of the "Configuring Spark master" step). Off the top of my head, I think to get this to work you'd need to set …. A more proper fix would perhaps be to add a new ….
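For concreteness, the manual route would look roughly like the following. This is a sketch of standard Spark standalone configuration, not a Flintrock feature; the `/opt/spark` install path and the private IPs are placeholders for whatever your AMI actually uses:

```shell
# On the master node: tell Spark its own address and list the workers.
# (Spark 3.x reads conf/workers; older releases call the file conf/slaves.)
echo "export SPARK_MASTER_HOST=<master-private-ip>" >> /opt/spark/conf/spark-env.sh
printf '%s\n' "<worker-1-private-ip>" "<worker-2-private-ip>" > /opt/spark/conf/workers

# Then start the master and all listed workers over SSH:
/opt/spark/sbin/start-all.sh
```

This is essentially what the "Configuring Spark master" step automates; with `install-spark` set to `False`, nothing writes these files, so executors never learn where the master is.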
Got it, thank you. I'm not sure if/when I'll have the time, but would you consider a PR in the direction of your second suggestion? And one last question, for my own understanding: what is the intended use of the …?
Yes, I would consider a PR along those lines. If you just want a cluster with HDFS, or plan to do the Spark config yourself, then setting …. In other words, you can make things work with Flintrock the way it is and set …. The ….
That makes sense. I might have a little time to tinker this week; I'll be back if I make sufficient progress. Thanks for your help, and for Flintrock itself!
Let's continue this discussion over on #237, which I think captures the same need expressed here.
I tend to use Flintrock with custom builds of Spark. Normally I host the build somewhere and use the `download-source` configuration parameter in the Flintrock config to link to it, and this works fine. But I thought it might be convenient to create an AMI (starting with Amazon Linux 2) with Java and Spark already installed, and to set `install-spark` to `False` in the Flintrock config, so I gave this a try. The cluster launched as expected, but when I tried to start a Spark session I got:

…

It retried for a while, and eventually errored out. Is there a known way to get this to work?

One last note: when I allow Flintrock to install Spark, there is normally a message at the end of the launch process which says `Configuring Spark master...`. I didn't get that when I set `install-spark` to `False`.

Thank you!
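As a point of reference, the setup described above corresponds to a Flintrock config roughly along these lines. This is a trimmed sketch, not a complete config: the AMI ID, download URL, versions, and instance counts are placeholders, and I'm assuming `install-spark` lives under the `launch` section as in Flintrock's config template:

```yaml
services:
  spark:
    version: 2.4.4
    # Custom Spark build hosted somewhere reachable from the cluster:
    download-source: "https://example.com/spark/spark-custom-bin.tgz"

providers:
  ec2:
    ami: ami-0123456789abcdef0   # placeholder: Amazon Linux 2 image with Java + Spark baked in
    user: ec2-user

launch:
  num-slaves: 2
  install-spark: False   # skip download/install; the AMI already has Spark
```

With `install-spark: False`, Flintrock launches the instances but skips both the Spark install and the per-node configuration, which is what this issue runs into.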