Spark Will Not Start: java.lang.OutOfMemoryError PermGen space

The Problem

This issue came up when I was getting started with Spark 1.4. I could get the Spark shell running, but as soon as I tried to execute any code, it would crash with the error below. This issue also popped up on the Spark mailing list. It's an interesting JVM error that's worth elaborating on a bit.
java.lang.OutOfMemoryError: PermGen space
After some digging, I found the cause of the error.

What is PermGen space?

This permgen Stack Overflow question answers it in detail, but in short, PermGen stands for Permanent Generation: the region of JVM memory that holds loaded classes and related metadata. Because Spark loads a large number of classes, the default PermGen limit can be too small in certain environments. Luckily, this is actually a pretty straightforward fix.
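You can check whether your JVM even has a PermGen to tune. Here's a quick sketch, assuming a JDK is on the PATH: it prints the JVM's final flag values and counts the PermGen sizing flags (PermSize, MaxPermSize). A Java 7 JVM reports them; Java 8 and later report zero, because PermGen was removed and replaced by Metaspace.

```shell
# Count the PermGen sizing flags this JVM recognizes.
# Non-zero: Java 7-era JVM with a tunable PermGen.
# Zero: Java 8+, where Metaspace replaced PermGen.
java -XX:+PrintFlagsFinal -version 2>&1 | grep -ic permsize
```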

Resolving the Problem

The resolution is pretty straightforward: you've got to give the JVM more PermGen! You can do that with a command-line option when you're starting the shell or submitting an application. You'll simply need to add the --driver-java-options -XX:MaxPermSize=256m option, as you can see below.
./bin/spark-shell --driver-java-options -XX:MaxPermSize=256m
Feel free to bump it up to something larger if you keep running into the error. The same option applies to spark-submit as well.
./bin/spark-submit  --driver-java-options -XX:MaxPermSize=256m
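If you'd rather not pass the flag on every invocation, Spark can also pick the setting up from conf/spark-defaults.conf. The sketch below uses the standard Spark property names; the executor line is included because executors load the same class-heavy jars and can hit their own PermGen limit (adjust the size to your needs).

```
# conf/spark-defaults.conf -- read by both spark-shell and spark-submit
spark.driver.extraJavaOptions    -XX:MaxPermSize=256m
spark.executor.extraJavaOptions  -XX:MaxPermSize=256m
```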

Further References

Spark Archives
