Compress multiple JPEG files together in C#

I was trying to compress JPEG files (say 16 files) together using C#. I successfully created a tar file and finally a tar.gz (using the C# GZipStream class). But the problem with my solution is that the gzip pass increased the size of the tar file by 37% (so a compression ratio of 137%). I tried to manually compress the files together using WinRAR and it gave me a 10% reduction in size (a compression ratio of 90%).
I believe that my problem is with GZipStream. I think I should go for another kind of compression (or compressor?!). Do you have any idea/suggestion of a compression method to use?

The framework's compression routines don't always do a great job.
I would recommend trying DotNetZip to compress this. My experience is that its compression (even its gzip) is much closer to other software, and far smaller than what the framework classes produce. It's also nice in that its GZipStream implementation requires nearly no code changes compared to the framework's GZipStream class.
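For reference, here is a minimal sketch of gzipping an existing tar with DotNetZip's Ionic.Zlib.GZipStream; the file names are placeholders and BestCompression is just one reasonable choice, not something from the answer above.

// Sketch: gzip the existing .tar with DotNetZip's Ionic.Zlib.GZipStream.
// File names are placeholders; BestCompression trades speed for size.
using System.IO;
using Ionic.Zlib;

using (Stream input = File.OpenRead("photos.tar"))
using (Stream output = File.Create("photos.tar.gz"))
using (var gzip = new GZipStream(output, CompressionMode.Compress, CompressionLevel.BestCompression))
{
    input.CopyTo(gzip);
}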

Related

Compressing Image in C# [duplicate]

I am using C# and want to save images in JPEG format. However, .NET reduces the quality of the images and saves them with compression that isn't acceptable.
I want to save the files with their original quality and size. I am using the following code, but the compression and quality are not like the originals.
// FilePath and x come from the surrounding code (source image path and an index).
using System.Drawing;
using System.Drawing.Imaging;

Bitmap bm = (Bitmap)Image.FromFile(FilePath);

// Find the JPEG encoder.
ImageCodecInfo ici = null;
foreach (ImageCodecInfo codec in ImageCodecInfo.GetImageEncoders())
{
    if (codec.MimeType == "image/jpeg")
        ici = codec;
}

// Save with the quality parameter set to 100.
EncoderParameters ep = new EncoderParameters(1);
ep.Param[0] = new EncoderParameter(Encoder.Quality, 100L);
bm.Save("C:\\quality" + x.ToString() + ".jpg", ici, ep);
I am archiving studio photos, and quality and compression are very important. Thanks.
The JPEG encoder built into .NET (at least the default Windows implementation provided by Microsoft) is pretty bad:
http://b9dev.blogspot.com/2013/06/nets-built-in-jpeg-encoder-convenient.html
Partial Update
I'm now using an approach outlined here, which uses ImageMagick for the resize and then jpegoptim for the final compression, with far better results. I realize that's a partial answer, but I'll expand on it once time allows.
Older Answer
ImageMagick is the best choice I've found so far. It performs relatively solid jpeg compression.
http://magick.codeplex.com/
It has a couple downsides:
It's better but not perfect. In particular, its chroma subsampling is set to high detail at 90% quality or above, then jumps down to a lower detail level - one that can introduce a lot of artifacts. If you want to ignore subsampling, this is actually pretty convenient. But if you want high-detail subsampling at, say, 50%, you have a larger challenge ahead. It also still won't quite hit the quality/compression levels of Photoshop or Google PageSpeed.
It has a special deployment burden on the server that's very easy to miss: it requires a Visual Studio 2008 SDK lib to be installed. This lib is available on any dev machine with Visual Studio on it, but the first time you hit the server it implodes with an obscure error. It's one of those lurking gotchas most people won't have scripted/automated, and you'll trip over it during some future server migration.
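For what it's worth, here is a minimal sketch of saving a JPEG at an explicit quality with Magick.NET (the library linked above); the paths, the quality value and the sampling-factor define are my own assumptions for illustration, not values taken from this answer.

// Sketch: save a JPEG at an explicit quality with Magick.NET.
// Paths, quality value and the sampling-factor define are assumptions.
using ImageMagick;

using (var image = new MagickImage(@"C:\photos\input.png"))
{
    image.Quality = 85;
    // Ask for high-detail chroma subsampling rather than letting the quality level decide.
    image.Settings.SetDefine(MagickFormat.Jpeg, "sampling-factor", "4:4:4");
    image.Write(@"C:\photos\output.jpg");
}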
Oldest Answer
I dug around and came across a project to implement a C# JPEG encoder by translating a C project over:
http://www.codeproject.com/Articles/83225/A-Simple-JPEG-Encoder-in-C
which I've simplified slightly:
https://github.com/b9chris/ArpanJpegEncoder
It produces much higher quality JPEGs than the .Net built-in, but it's still not as good as Gimp's or Photoshop's. File sizes also tend to be larger.
BitMiracle's implementation is practically identical to the .Net built-in - same quality problems.
It's likely that just wrapping an existing open source implementation, like Google's jpeg_optimizer in PageSpeed Tools (seemingly libjpeg underneath), would be the most efficient option.
Update
ArpanJpegEncoder appears to have issues once it's deployed - maybe I need to increase the trust level of the code, or perhaps something else is going on. Locally it writes images fine, but once deployed I get a blank black image from it every time. I'll update if I determine the cause. Just a warning to others considering it.
It looks like you're setting the quality to 100%. That means the encoder applies as little compression as it can.
If you change the compression level (80, 50, etc.) and you're unsatisfied with the quality, you may want to try a different image library. LEADTools has a good (non-free) engine.
UPDATE: As a commenter mentioned, 100% quality still does not mean lossless compression when using JPEG. Loading the image, doing something to it, and then saving it again will ultimately result in image degradation. If you need to alter and save an image without losing any of the data, you need to use a lossless format such as TIFF, PNG or BMP. I'd go with compressed TIFF (since it's still lossless even though it's compressed) or PNG.
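If lossless output is the priority, a minimal sketch (paths are placeholders):

// Sketch: re-save the image in a lossless format (PNG here) so no further quality
// is lost; the JPEG loss already baked into the source cannot be undone.
using System.Drawing;
using System.Drawing.Imaging;

using (var img = Image.FromFile(@"C:\studio\original.jpg"))
{
    img.Save(@"C:\studio\archive.png", ImageFormat.Png);
}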
Compression and quality are always a trade off.
JPEGs are always going to be lossy.
You may want to consider using PNG and minifying the files using PNGCrush or PNGGauntlet.
Regarding the setup of the compression level in .NET, please check this link (everything included): http://msdn.microsoft.com/en-us/library/bb882583.aspx
Regarding your question:
Usually you save the image uploaded by users as a PNG, then use this PNG as the base to generate your JPGs in different sizes (and you put a watermark ONLY on the JPGs, never on the original PNG!).
The advantage of this is: if you change your image dimensions later on for your platform, you still have the original PNG saved, and based on it you can re-generate any new image sizes.
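A rough sketch of that workflow with System.Drawing, generating one resized JPEG from the PNG master; the paths, target size and quality value are placeholders, and watermarking is omitted.

// Sketch: generate a resized JPEG from the PNG master (paths/sizes/quality are placeholders).
using System.Drawing;
using System.Drawing.Drawing2D;
using System.Drawing.Imaging;
using System.Linq;

using (var master = Image.FromFile(@"C:\masters\photo.png"))
using (var resized = new Bitmap(1200, 800))
using (var g = Graphics.FromImage(resized))
{
    g.InterpolationMode = InterpolationMode.HighQualityBicubic;
    g.DrawImage(master, 0, 0, resized.Width, resized.Height);

    ImageCodecInfo jpegCodec = ImageCodecInfo.GetImageEncoders()
        .First(c => c.MimeType == "image/jpeg");
    using (var ep = new EncoderParameters(1))
    {
        ep.Param[0] = new EncoderParameter(Encoder.Quality, 85L);
        resized.Save(@"C:\output\photo_1200.jpg", jpegCodec, ep);
    }
}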
It must save the file with its original quality and size.
That doesn't make a lot of sense. When you are using lossy compression you are going to lose some information by definition. The point of compressing an image is to reduce the file size. If you need high quality and JPEG isn't doing it for you, you may have to go with some type of lossless compression, but your file sizes will not be reduced by much. You could always try using the 'standard' library for compressing to JPEG (libjpeg) and see if that gives you any different results (I doubt it, but I don't know what .NET is using under the hood).
JPEG compression, by its very nature, reduces quality. Perhaps you should look into file compression instead, such as with #ziplib. You may be able to get reasonable compression over a group of files.
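A minimal sketch using SharpZipLib's (#ziplib's) FastZip helper; the archive name, source folder and filename filter are placeholders.

// Sketch: zip a folder of JPEGs with SharpZipLib (#ziplib).
// Archive name, source folder and the ".jpg" regex filter are placeholders.
using ICSharpCode.SharpZipLib.Zip;

var fastZip = new FastZip();
fastZip.CreateZip(@"C:\photos\shoot.zip", @"C:\photos\shoot", true, @"\.jpg$");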

Quickest Way To Decompress BIG .tar.gz In C#?

I have a load of .tar.gz files that are around 5GB. I have noticed that the .NET GZipStream actually gets stuck in an infinite loop trying to decompress them.
I found some pure C# solutions, but they all had issues with the size of my files. Unlike other posters (24GB tar.gz Decompress using sharpziplib), I am compiling the application as a 64-bit .NET 4.5.1 application on an x64 machine.
I noticed that .NET 4.5.1 supposedly removes the 2GB limit, but after reading up on it I found that quite misleading: it appears it only removes the restriction that the parts of an object cannot exceed 2GB in total, while the actual addressable range for objects such as byte arrays still appears to be limited to about 2GB even with the relevant option turned on.
Does anyone have any solutions, or have I hit a limitation in C#? I can invoke the 64-bit 7-Zip DLL from my app, or call the 7-Zip .exe and wait for it to finish (a bit of a bodge), but there has to be a cleaner way? Also, I want the quickest decompression, preferably in pure C# code, but I'm currently left thinking this is not possible in C# (due to the limitations on the addressable range of byte arrays).
You won't be able to load the resulting data into a single byte[] in C#. You will still be limited by the array size.
However, you should be able to decompress these without issue by streaming the data through rather than loading it all into memory. I've had very good luck with DotNetZip and large streams - using it, you should be able to just do:
using (System.IO.Stream input = System.IO.File.OpenRead(inputFile))
using (System.IO.Stream decompressor = new Ionic.Zlib.GZipStream(input, Ionic.Zlib.CompressionMode.Decompress, true))
using (var output = System.IO.File.Create(outputFile))
{
    // Streaming copy: nothing is ever held in a single large byte[].
    decompressor.CopyTo(output);
}

Compress a file with RAR

I have a text file that I want to compress after it reaches a specified size. I've already looked at GZipStream, which works great, but RAR compression is much better.
I've been looking for a library that can compress a file with RAR (I really don't care about extracting or decompressing), but I haven't found one yet.
As the RAR compression algorithm isn't free (only the decompression algorithm is), you won't find a free library for it (you would have to purchase a license).
A good alternative is the LZMA SDK that delivers the compression algorithms used in 7-Zip.
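As a rough illustration only, here is a minimal sketch based on the LZMA SDK's C# sample code; the SevenZip.Compression.LZMA.Encoder class and the header layout are assumptions taken from that sample, and the file names are placeholders.

// Sketch: LZMA-compress one file using the encoder class shipped with the LZMA SDK's
// C# sources (assumed API, modeled on the SDK sample; paths are placeholders).
using System;
using System.IO;

var encoder = new SevenZip.Compression.LZMA.Encoder();

using (FileStream input = File.OpenRead("large.log"))
using (FileStream output = File.Create("large.log.lzma"))
{
    encoder.WriteCoderProperties(output);                    // 5-byte property header
    output.Write(BitConverter.GetBytes(input.Length), 0, 8); // uncompressed size, as in the SDK sample
    encoder.Code(input, output, -1, -1, null);               // compress the stream
}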
For a compression ratio/speed comparison, you can have a look e.g. at the Maximum Compression summary page, ranks 50 and 52, comparing WinRAR 4.01 in "Best Solid" mode and 7-Zip 9.22 in "Ultra" mode. WinRAR compresses only slightly better (<1%) and faster, 7-Zip decompresses faster.
Note that, as peachykeen noted, if you look at the efficiency ratings instead of size, WinRAR in normal mode is much faster than 7-Zip.

C#/Android Compatible Compression Algorithm

I have a lot of plain-text content (English). I have a C# tool for creating the content, and it will be consumed in an Android app.
I need, therefore, to know my options for compression algorithms. What library can I use to compress/decompress, where I can compress in C# and decompress in Java?
I'm looking at probably 1-2MB of uncompressed text (at least), so it's definitely worth it to compress it.
You should be able to gzip in C# (for example with a GZipStream) and unzip on the Java side (for example with GZIPInputStream). The GZIP format should do the trick.
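For example, a minimal sketch of the C# side using the framework's GZipStream (file names are placeholders); on Android the result can be read back with java.util.zip.GZIPInputStream.

// Sketch: gzip the generated text content with the framework classes (placeholder paths).
using System.IO;
using System.IO.Compression;

using (FileStream input = File.OpenRead("content.txt"))
using (FileStream output = File.Create("content.txt.gz"))
using (var gzip = new GZipStream(output, CompressionMode.Compress))
{
    input.CopyTo(gzip);
}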

Compressing XML File

All,
I have a requirement to compress an XML file. At the moment I am using C# and the gzip algorithm from the .NET classes. It does compress it, but not at the rate I would like.
For example a 12MB file was compressed to a little less than 4MB.
Is there any other way to compress it more than that? Speed of compression / decompression is not very important.
Thanks,
M
ZIP compression is well suited to compressing XML data. In .NET you're best off relying on third-party libraries (see the sketch after this list):
DotNetZip
SharpZipLib
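A minimal DotNetZip sketch, with placeholder file names; BestCompression is just one reasonable setting.

// Sketch: zip the XML with DotNetZip (Ionic.Zip); file names are placeholders.
using Ionic.Zip;

using (var zip = new ZipFile())
{
    zip.CompressionLevel = Ionic.Zlib.CompressionLevel.BestCompression;
    zip.AddFile("data.xml");
    zip.Save("data.zip");
}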
You may try 7zip.
7-zip has an SDK.
Use the client version of 7-zip to try different compression settings to find the one with best compression for your particular data set.
This website compared different compression libraries against a large amount of text data; 7-zip is also included. I hope this helps you choose the right library for your requirements.
Take a look at System.IO.Packaging.ZipPackage in WindowsBase. It's the .NET Framework code behind the DOCX and XLSX file formats, and these are more or less zipped XML files. You can zip multiple files of any format together, not just XML.
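A minimal sketch of that approach; the part name and paths are placeholders, and note the Packaging API also adds a [Content_Types].xml part to the archive.

// Sketch: zip an XML file via System.IO.Packaging (requires a reference to WindowsBase).
using System;
using System.IO;
using System.IO.Packaging;

using (Package package = Package.Open("data.zip", FileMode.Create))
{
    Uri partUri = PackUriHelper.CreatePartUri(new Uri("/data.xml", UriKind.Relative));
    PackagePart part = package.CreatePart(partUri, "text/xml", CompressionOption.Maximum);

    using (FileStream source = File.OpenRead("data.xml"))
    using (Stream dest = part.GetStream())
    {
        source.CopyTo(dest);
    }
}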
