TL;DR
When benchmarking Java image libraries, what matters most depends on your use case. Speed is critical for high-throughput services, file size drives storage and bandwidth costs, and quality is non-negotiable when visual output matters. Testing across all three reveals a more honest picture. In this benchmark covering seven formats, JDeli leads overall on speed, quality, and reliability, producing virtually no broken files and handling every format without plugins. ImageIO (with plugins) holds its own on file size and raw speed in certain formats, and Apache delivers solid, consistent quality where it has coverage. But if you need one library that performs well across all three metrics, JDeli is the clear all-rounder. Check out our performance comparisons page if you’re just interested in the numbers.
As a developer and product manager, I’m always looking closely at how our library performs, especially compared to other options. I want to know where we’re doing well, where we can improve, and what actually makes a difference in real-world use.
Over the years, I have benchmarked various image encoding and decoding libraries, including JDeli, and shared the results.
This time, I decided to expand things a bit further and run broader performance benchmarks across the board. Partly because it’s useful, and partly because, let’s be honest, it’s fun to see how your work stacks up. I’ll walk you through how I ran the tests and, more importantly, which metrics I believe actually matter.
So… what actually matters?
In a world obsessed with faster and smaller, it’s tempting to look for a single number that tells you which library is “best”.
But as my boss likes to remind me, referencing Mazzeo’s Law:
“The answer to every strategic question is… It depends.”
The right answer depends entirely on what you care about.
If you’re running a high-throughput service, speed might be everything. If you’re trying to reduce bandwidth or storage costs, file size matters more. And if the visual result is critical, then quality is non-negotiable.
There isn’t one winner across every scenario.
So instead of focusing on just one metric, I’ve looked at the three that tend to matter most in the real world:
- Speed
- File size
- Quality
Together, they tell a much more useful story.
How the benchmarks work
To measure these, I used a combination of JMH for performance testing and SSIM for image quality.
Why JMH?
JMH (Java Microbenchmark Harness) is the gold standard for benchmarking Java code. It handles JVM warm-up, avoids common benchmarking pitfalls, and produces reliable, repeatable results.
This is important because naïve benchmarks can be wildly misleading.
Why use this over simpler approaches like System.nanoTime(), custom timers, or basic loop-based benchmarks? The main reason is accuracy. Benchmarking Java code correctly is surprisingly difficult because of JVM optimisations like JIT compilation, garbage collection, and dead code elimination. These can make naïve benchmarks produce misleading and sometimes completely wrong results.
JMH is designed specifically to handle these issues. It includes proper warm-up phases so the JVM can fully optimise the code before measurements are taken, runs multiple iterations to produce stable averages, and uses techniques to prevent the JVM from optimising away the code being tested. It also helps isolate the benchmark from other system activity as much as possible.
In short, JMH gives you confidence that you’re measuring the actual performance of your code, not the side effects of how the JVM happens to be behaving at that moment.
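To make the pitfall concrete, here is a minimal sketch (not part of the benchmark itself, and no substitute for JMH) showing the same workload timed cold and then again after a warm-up loop. JMH automates this properly, along with forking and dead-code protection; the class and method names here are purely illustrative.

```java
// Minimal sketch of why naive timing misleads: the same workload is timed
// cold (likely interpreted) and again after a warm-up phase (JIT-compiled).
// JMH handles all of this correctly; this only illustrates the effect.
public class WarmupDemo {
    // A stand-in CPU-bound workload; any hot method shows the same pattern.
    static long work() {
        long sum = 0;
        for (int i = 0; i < 100_000; i++) {
            sum += Integer.bitCount(i);
        }
        return sum;
    }

    static long timeOnce() {
        long start = System.nanoTime();
        work();
        return System.nanoTime() - start;
    }

    public static void main(String[] args) {
        long cold = timeOnce();            // first call: not yet optimised
        for (int i = 0; i < 5_000; i++) {  // warm-up so the JIT can compile work()
            work();
        }
        long warm = timeOnce();            // now measuring optimised code
        System.out.println("cold(ns)=" + cold + " warm(ns)=" + warm);
    }
}
```

On most JVMs the warm measurement comes out noticeably lower than the cold one, which is exactly the difference a naive single-shot benchmark would hide.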
Why SSIM instead of VIF, PSNR or MSE?
For image quality, I’ve used the Structural Similarity Index (SSIM) rather than pixel-error metrics like PSNR (Peak Signal-to-Noise Ratio) or MSE (Mean Squared Error), and rather than the more expensive Visual Information Fidelity (VIF).
The main reason is simple: I care about how the image looks, not just how different it is numerically.
Metrics like PSNR and MSE work by measuring pixel-by-pixel error. That’s useful, but it doesn’t always match what a person actually sees. You can end up with an image that scores worse on paper but looks better to the eye.
SSIM takes a more perceptual approach, looking at things like structure, luminance, and contrast. In practice, it lines up much more closely with how we judge image quality.
VIF is also a strong option and can often outperform SSIM, but it’s significantly more computationally expensive. For this kind of broad benchmarking, that extra cost doesn’t really justify itself.
SSIM hits a good balance: it’s fast enough to run at scale, and accurate enough to reflect real-world visual quality.
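For readers curious what SSIM actually computes, here is a simplified sketch. Real SSIM averages the formula over small sliding windows (often Gaussian-weighted); this single-window version over greyscale pixel values is only meant to illustrate the luminance, contrast, and structure terms, and the class name is my own.

```java
// Simplified, single-window SSIM over two greyscale pixel arrays (0-255).
// Real SSIM slides a window over the image and averages the local scores;
// this global version just illustrates the formula.
public class SsimSketch {
    static final double C1 = Math.pow(0.01 * 255, 2); // stabilises the luminance term
    static final double C2 = Math.pow(0.03 * 255, 2); // stabilises the contrast term

    static double ssim(double[] x, double[] y) {
        int n = x.length;
        double muX = 0, muY = 0;
        for (int i = 0; i < n; i++) { muX += x[i]; muY += y[i]; }
        muX /= n; muY /= n;

        double varX = 0, varY = 0, cov = 0;
        for (int i = 0; i < n; i++) {
            varX += (x[i] - muX) * (x[i] - muX);
            varY += (y[i] - muY) * (y[i] - muY);
            cov  += (x[i] - muX) * (y[i] - muY);
        }
        varX /= n - 1; varY /= n - 1; cov /= n - 1;

        // ((2*mu_x*mu_y + C1)(2*cov + C2)) / ((mu_x^2 + mu_y^2 + C1)(var_x + var_y + C2))
        return ((2 * muX * muY + C1) * (2 * cov + C2))
             / ((muX * muX + muY * muY + C1) * (varX + varY + C2));
    }
}
```

Identical inputs score exactly 1.0, and the score drops as structure diverges, which is the behaviour the quality tables below rely on.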
Why not use multiple Quality metrics together?
Many benchmarks include SSIM alongside PSNR and MSE, and there’s nothing inherently wrong with that approach.
However, in practice, these additional metrics often don’t change the overall conclusions — they just add more numbers to interpret.
For this benchmark, the goal is to compare practical performance in realistic scenarios, not to perform academic analysis of compression error.
SSIM provides a strong, reliable indicator of perceptual quality on its own, and adding additional error-based metrics wouldn’t materially change the decisions most developers would make based on the results.
It also keeps the results simpler and easier to interpret.
If there’s a visible quality difference, SSIM will reflect it. If there isn’t, that’s usually what matters most.
Benchmark Methodology
Before getting into the results, it’s worth explaining how the benchmarks were run. Performance numbers without context can be misleading, and small differences in setup can produce very different outcomes.
The goal here wasn’t to create a synthetic “best case” scenario, but to measure performance in a way that reflects how image libraries are actually used in real systems.
Test Environment
All benchmarks were run using the following setup:
- CPU: Apple M1
- RAM: 16GB
- OS: macOS Tahoe 26.3.1
- Java version: 17
- JDeli version: 2026.04
- Comparison libraries: ImageIO (including plugins: TwelveMonkeys, JAI, and darkXanther for WebP) and Apache Commons Imaging (referred to as “Apache” below)
Each test was run on an otherwise idle system to minimise interference from background processes.
Test Images
The image set used for testing is the PngSuite test suite.
Using a varied dataset helps ensure the results reflect real-world usage rather than favouring a specific optimisation. I would usually stick to images that every library under comparison can handle, but this time I chose this corpus because it includes a wide variety of images covering every feature that should be supported, and I report on whether each library can actually handle them as part of the benchmarking.
Ensuring Fair Comparisons
Where possible, equivalent settings were used across libraries to ensure fair comparisons.
However, this is one of the challenges of benchmarking image encoding libraries — different libraries expose different configuration options, and not all settings map perfectly.
The goal was to use realistic, production-style configurations rather than artificially tuning one library to outperform others.
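To illustrate what “equivalent settings” means in practice, this is roughly how a JPEG quality level is pinned explicitly on the ImageIO side (JDeli exposes its own options API, so the mapping between libraries is approximate, and the class name here is illustrative):

```java
import javax.imageio.IIOImage;
import javax.imageio.ImageIO;
import javax.imageio.ImageWriteParam;
import javax.imageio.ImageWriter;
import javax.imageio.stream.ImageOutputStream;
import java.awt.image.BufferedImage;
import java.io.ByteArrayOutputStream;
import java.util.Random;

// Sketch: forcing an explicit JPEG quality with ImageIO so that runs are
// comparable. The quality scale itself is writer-specific, which is exactly
// why settings never map perfectly across libraries.
public class JpegQualityDemo {
    static byte[] encodeJpeg(BufferedImage img, float quality) throws Exception {
        ImageWriter writer = ImageIO.getImageWritersByFormatName("jpeg").next();
        ImageWriteParam param = writer.getDefaultWriteParam();
        param.setCompressionMode(ImageWriteParam.MODE_EXPLICIT);
        param.setCompressionQuality(quality); // 0.0f = smallest, 1.0f = best
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        try (ImageOutputStream ios = ImageIO.createImageOutputStream(out)) {
            writer.setOutput(ios);
            writer.write(null, new IIOImage(img, null, null), param);
        } finally {
            writer.dispose();
        }
        return out.toByteArray();
    }

    public static void main(String[] args) throws Exception {
        // Noise compresses poorly, so the quality setting clearly changes size.
        BufferedImage img = new BufferedImage(64, 64, BufferedImage.TYPE_INT_RGB);
        Random rnd = new Random(42);
        for (int y = 0; y < 64; y++)
            for (int x = 0; x < 64; x++)
                img.setRGB(x, y, rnd.nextInt(0xFFFFFF));
        System.out.println("high: " + encodeJpeg(img, 0.9f).length + " bytes");
        System.out.println("low:  " + encodeJpeg(img, 0.3f).length + " bytes");
    }
}
```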
Speed: The metric everyone looks at first
Let’s start with speed — because nobody wants slow image processing.
Speed benchmarks were run using JMH (Java Microbenchmark Harness).
Each benchmark was run across multiple iterations, and the reported results reflect the average performance after the JVM had fully stabilised.
BMP
Mode: throughput
Count: 25
Units: ops/s
| Benchmark | Score | Error |
|---|---|---|
| Apache | 10.684 | ± 0.027 |
| ImageIO | 5.288 | ± 0.484 |
| JDeli | 17.223 | ± 0.064 |
GIF
Mode: throughput
Count: 25
Units: ops/s
| Benchmark | Score | Error |
|---|---|---|
| Apache | 1.705 | ± 0.251 |
| ImageIO | 4.713 | ± 0.025 |
| JDeli | 0.538 | ± 0.003 |
JPEG
Mode: throughput
Count: 25
Units: ops/s
| Benchmark | Score | Error |
|---|---|---|
| ImageIO | 13.345 | ± 0.049 |
| ImageIO_high | 10.049 | ± 0.033 |
| ImageIO_low | 13.989 | ± 0.040 |
| JDeli | 28.073 | ± 0.513 |
| JDeli_high | 15.778 | ± 0.126 |
| JDeli_low | 29.451 | ± 0.491 |
JPEG 2000
Mode: throughput
Count: 25
Units: ops/s
| Benchmark | Score | Error |
|---|---|---|
| ImageIO | 2.306 | ± 0.007 |
| JDeli_JP2 | 6.226 | ± 0.023 |
| JDeli_JPX | 6.228 | ± 0.020 |
PNG
Mode: throughput
Count: 25
Units: ops/s
| Benchmark | Score | Error |
|---|---|---|
| Apache | 5.100 | ± 0.018 |
| ImageIO | 4.819 | ± 0.010 |
| ImageIO_fast | 5.597 | ± 0.014 |
| ImageIO_max_comp | 3.512 | ± 0.008 |
| JDeli_COMPRESS | 5.402 | ± 0.027 |
| JDeli_FAST | 13.061 | ± 0.026 |
| JDeli_QUANT | 1.192 | ± 0.001 |
| JDeli_UNCOMPRESS | 12.985 | ± 0.088 |
TIFF
Mode: throughput
Count: 25
Units: ops/s
| Benchmark | Score | Error |
|---|---|---|
| Apache | 4.140 | ± 0.070 |
| ImageIO_Deflate | 4.533 | ± 0.054 |
| ImageIO_JPEG | 7.788 | ± 0.084 |
| ImageIO_LZW | 4.265 | ± 0.043 |
| ImageIO_uncompressed | 9.213 | ± 0.029 |
| JDeli_LZW | 9.473 | ± 0.027 |
| JDeli_better_comp | 5.484 | ± 0.007 |
| JDeli_better_speed | 13.116 | ± 0.026 |
| JDeli_deflate | 6.361 | ± 0.017 |
| JDeli_jpeg | 25.160 | ± 1.732 |
| JDeli_uncompressed | 99.850 | ± 3.051 |
WEBP
Mode: throughput
Count: 25
Units: ops/s
| Benchmark | Score | Error |
|---|---|---|
| ImageIO | 6.508 | ± 0.044 |
| JDeli_lossless | 4.632 | ± 0.049 |
| JDeli_lossy | 3.659 | ± 0.034 |
Quality: How good does the output actually look?
Next, let’s look at image quality. For this I used the Structural Similarity Index (SSIM).
The purpose here was to measure perceived visual quality, rather than purely mathematical differences.
BMP
| Benchmark | SSIM score | Zero-length/broken files | Similarity |
|---|---|---|---|
| Apache | 1.0 | 0 | Identical or virtually identical |
| ImageIO | 0.4430668957720272 | 30 | Very different |
| JDeli | 0.9684172944944214 | 0 | Very similar (high quality) |
GIF
| Benchmark | SSIM score | Zero-length/broken files | Similarity |
|---|---|---|---|
| Apache | 0.8785090614816008 | 0 | Similar (noticeable but minor differences) |
| ImageIO | 0.18769307741202973 | 4 | Very different |
| JDeli | 0.8284792072170122 | 0 | Moderately similar (visible differences) |
JPEG
| Benchmark | SSIM score | Zero-length/broken files | Similarity |
|---|---|---|---|
| ImageIO | 0.9843408924688256 | 36 | Very similar (high quality) |
| ImageIO_high | 0.9955252550315202 | 36 | Identical or virtually identical |
| ImageIO_low | 0.753963280422963 | 36 | Moderately similar (visible differences) |
| JDeli | 0.9635278566129812 | 0 | Very similar (high quality) |
| JDeli_high | 0.9824853317459952 | 0 | Very similar (high quality) |
| JDeli_low | 0.7308454760175299 | 0 | Moderately similar (visible differences) |
JPEG2000
| Benchmark | SSIM score | Zero-length/broken files | Similarity |
|---|---|---|---|
| ImageIO | 0.9659635878077559 | 0 | Very similar (high quality) |
| JDeli_jp2 | 0.8817187006016771 | 0 | Similar (noticeable but minor differences) |
| JDeli_jpx | 0.8817187006016771 | 0 | Similar (noticeable but minor differences) |
PNG
| Benchmark | SSIM score | Zero-length/broken files | Similarity |
|---|---|---|---|
| Apache | 0.967281580718986 | 0 | Very similar (high quality) |
| ImageIO | 1.0 | 0 | Identical or virtually identical |
| ImageIO_fast | 1.0 | 0 | Identical or virtually identical |
| ImageIO_max_comp | 1.0 | 0 | Identical or virtually identical |
| JDeli | 0.9985227287696926 | 0 | Identical or virtually identical |
| JDeli_fast | 1.0 | 0 | Identical or virtually identical |
| JDeli_compress | 1.0 | 0 | Identical or virtually identical |
| JDeli_uncompressed | 1.0 | 0 | Identical or virtually identical |
| JDeli_quant | 0.9926136438484626 | 0 | Identical or virtually identical |
TIFF
| Benchmark | SSIM score | Zero-length/broken files | Similarity |
|---|---|---|---|
| Apache | 1.0 | 1 | Identical or virtually identical |
| ImageIO_deflate | 1.0 | 1 | Identical or virtually identical |
| ImageIO_jpeg | 0.9964492586342384 | 114 | Identical or virtually identical |
| ImageIO_lzw | 1.0 | 1 | Identical or virtually identical |
| ImageIO_uncompress | 1.0 | 1 | Identical or virtually identical |
| JDeli_better_speed | 1.0 | 1 | Identical or virtually identical |
| JDeli_deflate | 1.0 | 1 | Identical or virtually identical |
| JDeli_jpeg | 0.8720826599375827 | 1 | Similar (noticeable but minor differences) |
| JDeli_lzw | 1.0 | 1 | Identical or virtually identical |
| JDeli_uncompressed | 1.0 | 1 | Identical or virtually identical |
| JDeli_better_comp | 1.0 | 1 | Identical or virtually identical |
WEBP
| Benchmark | SSIM score | Zero-length/broken files | Similarity |
|---|---|---|---|
| ImageIO | 0.9764829133391945 | 28 | Very similar (high quality) |
| JDeli_lossy | 0.9539262155199079 | 0 | Very similar (high quality) |
| JDeli_lossless | 0.9684172944944212 | 2 | Very similar (high quality) |
File Size: Smaller isn’t always better — but it helps
Finally, let’s look at file size. This was measured by comparing the output size of encoded images using equivalent settings across each library.
This reflects how efficiently each encoder compresses image data, which directly impacts:
- Storage requirements
- Network transfer time
- Application performance at scale
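The measurement itself is simple: encode the same image and count the bytes. Here is a minimal sketch of the idea using only javax.imageio (the image and format list are stand-ins, not the actual benchmark corpus):

```java
import javax.imageio.ImageIO;
import java.awt.Color;
import java.awt.Graphics2D;
import java.awt.image.BufferedImage;
import java.io.ByteArrayOutputStream;
import java.io.IOException;

// Sketch of the size measurement: encode the same image in several formats
// and compare byte counts.
public class SizeCompare {
    static int encodedSize(BufferedImage img, String format) throws IOException {
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        if (!ImageIO.write(img, format, out)) {
            throw new IOException("No writer for " + format);
        }
        return out.size();
    }

    public static void main(String[] args) throws IOException {
        // A simple synthetic image; flat regions compress very differently
        // between uncompressed (BMP) and compressed (PNG) formats.
        BufferedImage img = new BufferedImage(256, 256, BufferedImage.TYPE_INT_RGB);
        Graphics2D g = img.createGraphics();
        g.setColor(Color.WHITE);
        g.fillRect(0, 0, 256, 256);
        g.setColor(Color.BLUE);
        g.fillOval(32, 32, 192, 192);
        g.dispose();
        for (String format : new String[] {"png", "jpg", "bmp", "gif"}) {
            System.out.println(format + ": " + encodedSize(img, format) + " bytes");
        }
    }
}
```

The benchmark tables below report exactly this kind of byte count, summed across the test corpus, with each library's own encoder doing the writing.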
BMP
| Benchmark | Size (in bytes) | Zero length or poor output |
|---|---|---|
| Apache | 4,989,192 | 0 |
| ImageIO | 4,960,616 | 30 |
| JDeli | 5,059,924 | 0 |
GIF
| Benchmark | Size (in bytes) | Zero length or poor output |
|---|---|---|
| Apache | 604,167 | 0 |
| ImageIO | 413,805 | 4 |
| JDeli | 739,280 | 0 |
JPEG
| Benchmark | Size (in bytes) | Zero length or poor output |
|---|---|---|
| ImageIO | 205,925 | 36 |
| ImageIO_high | 944,779 | 36 |
| ImageIO_low | 113,022 | 36 |
| JDeli | 237,632 | 0 |
| JDeli_high | 1,520,609 | 0 |
| JDeli_low | 135,372 | 0 |
JPEG2000
| Benchmark | Size | Zero length or poor output |
|---|---|---|
| ImageIO | 1,453,674 | 0 |
| JDeli_jp2 | 295,585 | 0 |
| JDeli_jpx | 283,965 | 0 |
PNG
| Benchmark | Size | Zero length or poor output |
|---|---|---|
| Apache | 2,741,666 | 0 |
| ImageIO | 2,768,119 | 0 |
| ImageIO_fast | 2,877,192 | 0 |
| ImageIO_max_comp | 2,749,323 | 0 |
| JDeli | 2,749,038 | 0 |
| JDeli_compress | 2,749,038 | 0 |
| JDeli_fast | 2,876,817 | 0 |
| JDeli_quant | 538,898 | 0 |
| JDeli_uncompress | 2,876,817 | 0 |
TIFF
| Benchmark | Size | Zero length or poor output |
|---|---|---|
| Apache | 1,917,348 | 1 |
| ImageIO_deflate | 2,911,052 | 1 |
| ImageIO_jpeg | 911,415 | 114 |
| ImageIO_lzw | 4,074,430 | 1 |
| ImageIO_uncompressed | 5,077,936 | 1 |
| JDeli_better_comp | 2,786,074 | 1 |
| JDeli_better_speed | 2,912,986 | 1 |
| JDeli_deflate | 2,787,004 | 1 |
| JDeli_jpeg | 274,036 | 1 |
| JDeli_lzw | 4,065,664 | 1 |
| JDeli_uncompressed | 5,075,876 | 1 |
WEBP
| Benchmark | Size | Zero length or poor output |
|---|---|---|
| ImageIO | 80,430 | 28 |
| JDeli_lossless | 1,542,834 | 2 |
| JDeli_lossy | 90,956 | 0 |
Summary: How does JDeli perform?
| Library | Format coverage | Avg speed (ops/s) | Avg SSIM | Similarity | Avg size (MB) | Avg zero-length/broken files (of 166) |
|---|---|---|---|---|---|---|
| Apache | 4/7 | 5.407 ± 0.092 | 0.9614 | Very similar (high quality) | 2.444 | 1 |
| ImageIO | 7/7 (with plugins) | 6.852 ± 0.068 | 0.8788 | Similar (noticeable but minor differences) | 2.012 | 18 |
| JDeli | 7/7 | 15.994 ± 0.333 | 0.9512 | Very similar (high quality) | 1.888 | 1 |
Looking at the individual tests, each library has its strengths: Apache and JDeli both reliably write formats correctly without producing zero-length or broken files, and ImageIO does well on speed and output size in some formats. But, as the summary shows, if you are looking for a Java image library that can do it all, JDeli is the clear winner!
Over to you
These are my results, but benchmarks are always shaped by real-world usage, so your mileage may vary. Download JDeli and try it yourself.
I’d genuinely love to hear how it went for you! Let me know in the comments.
What did you find when testing? Did your results match mine, or were they different?
Are you a Java Developer working with Image files?
AVIF

```java
// Read an image
BufferedImage bufferedImage = JDeli.read(avifImageFile);
// Write an image
JDeli.write(bufferedImage, "avif", outputStreamOrFile);
```

DICOM

```java
// Read an image
BufferedImage bufferedImage = JDeli.read(dicomImageFile);
```

HEIC

```java
// Read an image
BufferedImage bufferedImage = JDeli.read(heicImageFile);
// Write an image
JDeli.write(bufferedImage, "heic", outputStreamOrFile);
```

JPEG

```java
// Read an image
BufferedImage bufferedImage = JDeli.read(jpegImageFile);
// Write an image
JDeli.write(bufferedImage, "jpeg", outputStreamOrFile);
```

JPEG 2000

```java
// Read an image
BufferedImage bufferedImage = JDeli.read(jpeg2000ImageFile);
// Write an image
JDeli.write(bufferedImage, "jpx", outputStreamOrFile);
```

PDF

```java
// Write an image
JDeli.write(bufferedImage, "pdf", outputStreamOrFile);
```

PNG

```java
// Read an image
BufferedImage bufferedImage = JDeli.read(pngImageFile);
// Write an image
JDeli.write(bufferedImage, "png", outputStreamOrFile);
```

TIFF

```java
// Read an image
BufferedImage bufferedImage = JDeli.read(tiffImageFile);
// Write an image
JDeli.write(bufferedImage, "tiff", outputStreamOrFile);
```

WebP

```java
// Read an image
BufferedImage bufferedImage = JDeli.read(webpImageFile);
// Write an image
JDeli.write(bufferedImage, "webp", outputStreamOrFile);
```
What is JDeli?
JDeli is a commercial Java image library used to read, write, convert, manipulate and process many different image formats.
Why use JDeli?
To handle many well-known formats such as JPEG, PNG and TIFF, as well as newer formats like AVIF, HEIC and JPEG XL, in Java with no calls to any external system or third-party library.
What licenses are available?
We have 3 licenses available:
Server for on-premises and cloud servers, Distribution for use in named end-user applications, and Custom for more demanding requirements.
How does JDeli compare?
We work hard to make sure JDeli performance is better than or similar to other Java image libraries. Check out our benchmarks to see just how well JDeli performs.