I'm trying to figure out how the JVM decides which dependency version to use:
- We have a Gradle-compiled fat JAR containing several dependencies, e.g. Jackson at version X.
- My app is a Play Framework 1.x app that uses the fat JAR artifact from stage 1, plus other dependencies, including the AWS Java SDK, which itself uses Jackson at a version newer than X.
How can I tell which Jackson version is used at runtime?
[It seems that on one environment it uses the correct one, and on the other the AWS SDK ends up with the wrong Jackson version.]
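For reference, one way I know to inspect this at runtime is to ask the class's `ProtectionDomain` which jar (or directory) it was loaded from. This is a generic sketch; in an actual app you would pass `com.fasterxml.jackson.databind.ObjectMapper.class` (Jackson 2.x) or `org.codehaus.jackson.map.ObjectMapper.class` (Jackson 1.x) instead of the placeholder class below — those class names are assumptions about which Jackson generation the dependencies ship.

```java
import java.security.CodeSource;

public class WhichJar {
    /**
     * Returns the jar file or classes directory a class was loaded from.
     * In your app, call this with Jackson's ObjectMapper class (see lead-in)
     * to see whether it came from the fat JAR or from the AWS SDK's
     * transitive dependency.
     */
    static String locationOf(Class<?> c) {
        CodeSource src = c.getProtectionDomain().getCodeSource();
        // JDK bootstrap classes (e.g. java.lang.String) have no code source.
        return src == null ? "bootstrap/JDK class" : src.getLocation().toString();
    }

    public static void main(String[] args) {
        // This class itself was loaded from the application classpath.
        System.out.println(locationOf(WhichJar.class));
    }
}
```

The jar's manifest version, if it was packaged with one, can also be read via `ObjectMapper.class.getPackage().getImplementationVersion()`; note that a fat JAR may have merged or dropped the original manifests, so the code-source location is usually the more reliable signal.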