I am fairly new to both Spark and Scala, so some things need clarification. I searched the web for a definitive answer to my question, but I did not really find one.
At the moment, I am running the spark-shell in order to write some basic Scala and work through my tutorials. Now, the tutorial wants me to add a library to Spark so I can import it and use it in the examples. I have downloaded the .jar file of the library. Should I put it in the /spark/jars/ folder? Is that enough to be able to import it, or do I also need to declare it somewhere else? Do I need to add an option to the command before running ./spark-shell?
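While searching, I came across a `--jars` option for spark-shell. Is something like the following the right approach? (The path and jar name here are just placeholders for the library I downloaded.)

```shell
# Placeholder path/jar name: launch spark-shell with an extra
# jar added to the driver and executor classpaths via --jars.
./bin/spark-shell --jars /home/me/downloads/some-library.jar
```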
Also, when I create a standalone program (using sbt and declaring the library in build.sbt), will Spark find the .jar in the /spark/jars/ folder, or do I need to put it elsewhere?
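For reference, my build.sbt looks roughly like this (the library name and versions are placeholders, not the actual ones from the tutorial):

```scala
// build.sbt -- minimal sketch; group IDs and versions are placeholders
name := "spark-tutorial"
version := "0.1"
scalaVersion := "2.12.15"

libraryDependencies ++= Seq(
  // Spark itself, marked "provided" since the cluster supplies it at runtime
  "org.apache.spark" %% "spark-core" % "3.3.0" % "provided",
  // the tutorial's library (placeholder coordinates)
  "org.example" %% "some-library" % "1.0.0"
)
```

My assumption was that declaring the dependency here is enough for sbt to resolve it at compile time, but I am not sure what happens at runtime when the job is submitted.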