The term data compression means reducing the number of bits needed to store or transmit information. Compression can be performed with or without loss of information: what is removed during compression is either redundant data or data deemed unnecessary. When the data is later uncompressed, in the first case the content and its quality are identical to the original, while in the second case the quality is lower. Different compression algorithms suit different kinds of data. Compressing and uncompressing data often takes a great deal of processing time, so the server performing the operation needs sufficient resources to process your information quickly enough. One simple example of how information can be compressed is to store how many consecutive positions in the binary code should contain 1 and how many should contain 0, instead of storing the actual 1s and 0s.
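The run-length idea described above can be sketched in a few lines of Python. This is a minimal illustration, not a production codec; the function names are chosen for this example.

```python
def rle_encode(bits: str) -> list[tuple[str, int]]:
    """Run-length encode a string of '0'/'1' characters into (bit, count) pairs."""
    runs: list[tuple[str, int]] = []
    for bit in bits:
        if runs and runs[-1][0] == bit:
            # Same bit as the previous run: just extend the count.
            runs[-1] = (bit, runs[-1][1] + 1)
        else:
            # A new run starts here.
            runs.append((bit, 1))
    return runs

def rle_decode(runs: list[tuple[str, int]]) -> str:
    """Reverse the encoding by repeating each bit by its count."""
    return "".join(bit * count for bit, count in runs)

# A sequence of 10 bits becomes just 3 (bit, count) pairs,
# and decoding restores the original exactly (lossless).
encoded = rle_encode("0000011100")
print(encoded)               # [('0', 5), ('1', 3), ('0', 2)]
print(rle_decode(encoded))   # 0000011100
```

Because decoding reproduces the input exactly, this is a lossless scheme; it pays off only when the data contains long runs of identical symbols.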
Data Compression in Hosting
The ZFS file system that runs on our cloud Internet hosting platform uses a compression algorithm called LZ4. It is significantly faster than most comparable algorithms, particularly for compressing and uncompressing non-binary data such as web content. LZ4 can even uncompress data faster than that data can be read from a hard disk, which improves the performance of websites hosted on ZFS-based platforms. Because the algorithm compresses data well and does so very quickly, we can generate several backups of all the content kept in the hosting accounts on our servers every day. Both your content and its backups take up less space, and since both ZFS and LZ4 work very fast, the backup generation does not affect the performance of the web hosting servers where your content is kept.
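For readers who administer their own ZFS systems, LZ4 compression is enabled per dataset via the standard `zfs` command-line tool. This is a generic configuration sketch; the dataset name `tank/web` is a placeholder, not one from our platform.

```shell
# Enable LZ4 compression on a (hypothetical) dataset named tank/web.
zfs set compression=lz4 tank/web

# Check the active compression setting and the achieved compression ratio.
zfs get compression,compressratio tank/web
```

The setting applies to newly written blocks; data written before the change stays in its previous form until it is rewritten.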