I'm looking to write a custom class loader that will load a JAR file from across a custom network. In the end, all I have to work with is a byte array of the JAR file. I cannot dump the byte array onto the file system and use a URLClassLoader.

My first plan was to create a JarFile object from a stream or byte array, but it only supports a File object. I've already written up something that uses a JarInputStream:
public class RemoteClassLoader extends ClassLoader {

    private final byte[] jarBytes;

    public RemoteClassLoader(byte[] jarBytes) {
        this.jarBytes = jarBytes;
    }

    @Override
    public Class<?> loadClass(String name, boolean resolve) throws ClassNotFoundException {
        Class<?> clazz = findLoadedClass(name);
        if (clazz == null) {
            try {
                InputStream in = getResourceAsStream(name.replace('.', '/') + ".class");
                ByteArrayOutputStream out = new ByteArrayOutputStream();
                StreamUtils.writeTo(in, out);
                byte[] bytes = out.toByteArray();
                clazz = defineClass(name, bytes, 0, bytes.length);
                if (resolve) {
                    resolveClass(clazz);
                }
            } catch (Exception e) {
                clazz = super.loadClass(name, resolve);
            }
        }
        return clazz;
    }

    @Override
    public URL getResource(String name) {
        return null;
    }

    @Override
    public InputStream getResourceAsStream(String name) {
        try (JarInputStream jis = new JarInputStream(new ByteArrayInputStream(jarBytes))) {
            JarEntry entry;
            while ((entry = jis.getNextJarEntry()) != null) {
                if (entry.getName().equals(name)) {
                    return jis;
                }
            }
        } catch (IOException e) {
            e.printStackTrace();
        }
        return null;
    }
}
This may work fine for small JAR files, but when I tried loading a 2.7 MB JAR with almost 2000 classes, it took around 160 ms just to iterate through all the entries, let alone load the class it found. If anyone knows a solution that's faster than iterating through a JarInputStream's entries each time a class is loaded, please share!
First, you have no need for JarInputStream, as it only adds manifest support on top of ZipInputStream, which we don't care about here. You cannot cache the entries themselves (unless you store the content of each entry directly, which would be terrible in terms of memory consumption), because a ZipInputStream is not meant to be shared, so it cannot be read concurrently. The best you can do is cache the names of the entries, so that you only iterate over the stream when you already know the entry exists.
The code could be something like this:
public class RemoteClassLoader extends ClassLoader {

    private final byte[] jarBytes;
    private final Set<String> names;

    public RemoteClassLoader(byte[] jarBytes) throws IOException {
        this.jarBytes = jarBytes;
        this.names = RemoteClassLoader.loadNames(jarBytes);
    }

    /**
     * This will put all the entry names into a thread-safe Set
     */
    private static Set<String> loadNames(byte[] jarBytes) throws IOException {
        Set<String> set = new HashSet<>();
        try (ZipInputStream jis = new ZipInputStream(new ByteArrayInputStream(jarBytes))) {
            ZipEntry entry;
            while ((entry = jis.getNextEntry()) != null) {
                set.add(entry.getName());
            }
        }
        return Collections.unmodifiableSet(set);
    }
...
    @Override
    public InputStream getResourceAsStream(String name) {
        // Check first if the entry name is known
        if (!names.contains(name)) {
            return null;
        }
        // The ZipInputStream is declared outside a
        // try-with-resources statement because it must not be closed here,
        // otherwise the returned InputStream would already be closed
        boolean found = false;
        ZipInputStream jis = null;
        try {
            jis = new ZipInputStream(new ByteArrayInputStream(jarBytes));
            ZipEntry entry;
            while ((entry = jis.getNextEntry()) != null) {
                if (entry.getName().equals(name)) {
                    found = true;
                    return jis;
                }
            }
        } catch (IOException e) {
            e.printStackTrace();
        } finally {
            // Only close the stream if the entry could not be found
            if (jis != null && !found) {
                try {
                    jis.close();
                } catch (IOException e) {
                    // ignore me
                }
            }
        }
        return null;
    }
}
Accessing a zip entry through a JarInputStream is clearly not the way to do it, as you need to iterate over the entries to find it. That approach does not scale, because performance depends on the total number of entries in your JAR file.

To get the best possible performance, you need to use a ZipFile, whose getEntry(name) method gives direct access to an entry whatever the size of your archive. Unfortunately, ZipFile doesn't provide any constructor that accepts the content of the archive as a byte array (which is not good practice anyway, as you could face an OutOfMemoryError if the file is too big), only as a File. So you will need to change the logic of your class: store the content of the zip in a temporary file, then hand that temporary file to a ZipFile so you can access entries directly.
The code could be something like this:
public class RemoteClassLoader extends ClassLoader {

    private final ZipFile zipFile;

    public RemoteClassLoader(byte[] jarBytes) throws IOException {
        this.zipFile = RemoteClassLoader.load(jarBytes);
    }

    private static ZipFile load(byte[] jarBytes) throws IOException {
        // Create my temporary file
        Path path = Files.createTempFile("RemoteClassLoader", ".jar");
        // Delete the file on exit
        path.toFile().deleteOnExit();
        // Copy the content of my jar into the temporary file
        try (InputStream is = new ByteArrayInputStream(jarBytes)) {
            Files.copy(is, path, StandardCopyOption.REPLACE_EXISTING);
        }
        return new ZipFile(path.toFile());
    }
...
    @Override
    public InputStream getResourceAsStream(String name) {
        // Get the entry by its name
        ZipEntry entry = zipFile.getEntry(name);
        if (entry != null) {
            // The entry could be found
            try {
                // Give the content of the entry as an InputStream
                return zipFile.getInputStream(entry);
            } catch (IOException e) {
                // Could not get the content of the entry;
                // you could log the error if needed
                return null;
            }
        }
        // The entry could not be found
        return null;
    }
}
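As a quick sanity check of the ZipFile approach, here is a self-contained sketch (the entry name and contents are made up for the demo): it builds a small zip in memory, standing in for the jar bytes received over the network, dumps it to a temporary file, and reads one entry back directly through getEntry:

```java
import java.io.BufferedReader;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStreamReader;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.zip.ZipEntry;
import java.util.zip.ZipFile;
import java.util.zip.ZipOutputStream;

public class ResourceLookupDemo {

    // Build a small zip in memory (standing in for the jar bytes received
    // over the network), dump it to a temp file, and read one entry back
    // through ZipFile.getEntry, as the answer above does
    static String readBack() throws IOException {
        ByteArrayOutputStream buffer = new ByteArrayOutputStream();
        try (ZipOutputStream zos = new ZipOutputStream(buffer)) {
            zos.putNextEntry(new ZipEntry("config/app.properties"));
            zos.write("mode=remote".getBytes(StandardCharsets.UTF_8));
            zos.closeEntry();
        }
        Path path = Files.createTempFile("demo", ".jar");
        path.toFile().deleteOnExit();
        Files.write(path, buffer.toByteArray());
        try (ZipFile zipFile = new ZipFile(path.toFile());
             BufferedReader reader = new BufferedReader(new InputStreamReader(
                     zipFile.getInputStream(zipFile.getEntry("config/app.properties")),
                     StandardCharsets.UTF_8))) {
            return reader.readLine();
        }
    }

    public static void main(String[] args) throws IOException {
        System.out.println(readBack()); // prints "mode=remote"
    }
}
```

The lookup cost is the same regardless of how many entries the archive holds, which is exactly why getEntry beats iterating a ZipInputStream.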
I would iterate through the JAR once and cache the entries. I would also look at the source code for URLClassLoader to see how it does it. If that fails, write the data to a temporary file and load it via a normal URLClassLoader.
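That temporary-file fallback could look roughly like this sketch (the class and method names are made up for illustration; the jar bytes are assumed to come from the custom network layer):

```java
import java.io.IOException;
import java.net.URL;
import java.net.URLClassLoader;
import java.nio.file.Files;
import java.nio.file.Path;

public class TempFileLoading {

    // Write the received jar bytes to a temporary file and let a
    // stock URLClassLoader do all the work instead of a hand-rolled one
    static URLClassLoader fromBytes(byte[] jarBytes) throws IOException {
        Path path = Files.createTempFile("remote", ".jar");
        path.toFile().deleteOnExit();
        Files.write(path, jarBytes);
        return new URLClassLoader(new URL[] { path.toUri().toURL() });
    }
}
```

This buys you URLClassLoader's indexed jar access and resource handling for free, at the cost of touching the file system, which the question explicitly rules out.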