By now, we should feel familiar with setting up a Spark project, running it in IntelliJ, and visiting localhost in the browser to view results. But it is also important to know how to terminate a Spark process correctly, in order to avoid a frustrating issue we'll all encounter sooner or later: our app builds, but won't start the server, reporting
[Thread-1] ERROR spark.Spark - ignite failed java.net.BindException: Address already in use
Why does this happen?
As we know, hitting Run on a Spark project initiates a Java process and assigns a port number (4567 by default) to our application. This is why we visit localhost:4567 to see our project.
If we close the app's files in IntelliJ without properly shutting down our process, or have more than one Spark app running, the process that "reserves" localhost:4567 will still be running in the background, "out of reach" to us. We need to execute a special command to shut it down; otherwise we will get the following frustrating error:
[Thread-1] INFO org.eclipse.jetty.util.log - Logging initialized @711ms to org.eclipse.jetty.util.log.Slf4jLog
[Thread-1] INFO spark.embeddedserver.jetty.EmbeddedJettyServer - == Spark has ignited ...
[Thread-1] INFO spark.embeddedserver.jetty.EmbeddedJettyServer - >> Listening on 0.0.0.0:4567
[Thread-1] INFO org.eclipse.jetty.server.Server - jetty-9.4.4.v20170414
[Thread-1] INFO org.eclipse.jetty.server.session - DefaultSessionIdManager workerName=node0
[Thread-1] INFO org.eclipse.jetty.server.session - No SessionScavenger set, using defaults
[Thread-1] INFO org.eclipse.jetty.server.session - Scavenging every 600000ms
[Thread-1] ERROR spark.Spark - ignite failed
java.net.BindException: Address already in use
    at sun.nio.ch.Net.bind0(Native Method)
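The root cause is easy to reproduce with plain JDK classes: only one process (or socket) can bind a given port at a time. The sketch below is an illustration, not Spark code; it binds an ephemeral port (port 0) rather than 4567 so the demo is self-contained, and the BindDemo class name is made up for this example.

```java
import java.io.IOException;
import java.net.ServerSocket;

public class BindDemo {
    public static void main(String[] args) throws IOException {
        // First bind succeeds - this is what a running Spark app does on 4567.
        ServerSocket first = new ServerSocket(0); // 0 = pick any free port
        int port = first.getLocalPort();

        // Second bind to the same port fails while the first socket is open,
        // which is the same failure Spark reports as "ignite failed".
        try (ServerSocket second = new ServerSocket(port)) {
            System.out.println("Unexpected: second bind succeeded");
        } catch (IOException e) {
            System.out.println("Second bind failed: " + e.getMessage());
        }

        first.close();
    }
}
```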
Let's practice dealing with this scenario. Open an application that uses Spark, run it as normal, and visit localhost:4567 to confirm it's there. Now, click the red X in the top left-hand corner of the app editor window in IntelliJ to close the window and project. We should see the following prompt:
Choose "Disconnect" - which means we are disconnecting from the current Spark/Java process with no way to get it back, effectively leaving it floating around in the ether.
Our window will close, but the process is still running in the background. Ack!
We want to avoid this. Moving forward, we should always hit "Terminate" to shut down the process instead.
Open the same project again and try to re-run it. We'll see the above error in the terminal output, and our project won't load. Oh no! Neither will any other Spark-backed project until we kill this process. Let's hit the ejector seat to get back on track.
Pull up a regular console window, and execute the following command:
$ killall java
Now, we can re-run our app again, and see that it can successfully build. Yay!
Important Note: This error will always occur when you have either a) disconnected from a process instead of terminating it, or b) more than one app running Spark in IntelliJ. Check to make sure you don't have an app open in a different IntelliJ window before you run $ killall java. Killing the process should be a last resort.
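Before reaching for killall, it can also help to confirm that something really is holding the port. Here is a small JDK-only sketch (the PortCheck class name is mine, not part of Spark): it simply tries to bind the port itself, which succeeds only if no other process has it.

```java
import java.io.IOException;
import java.net.ServerSocket;

public class PortCheck {
    // Returns true if we can bind the port, i.e. no other process holds it.
    static boolean isFree(int port) {
        try (ServerSocket probe = new ServerSocket(port)) {
            return true;
        } catch (IOException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        int port = 4567; // Spark's default port
        System.out.println(isFree(port)
                ? "Port " + port + " is free; safe to run the app"
                : "Port " + port + " is taken; a stale process is likely still running");
    }
}
```

If the port turns out to be free, the BindException is coming from somewhere else (for example, a second Spark app already started in this run), and killing Java processes won't help.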