Data compression refers to reducing the number of bits needed to store or transmit data. Compression can be lossless or lossy: a lossless algorithm removes only redundant information, so the data is restored exactly when it is decompressed, while a lossy algorithm also discards less important details, so the quality is worse after decompression. Different compression algorithms are more effective for different kinds of data. Compressing and decompressing data normally takes a lot of processing time, so the server carrying out the operation needs sufficient resources to handle the load quickly enough. A simple example of how information can be compressed is to store how many consecutive positions in the binary code contain a 1 and how many contain a 0, rather than storing the actual 1s and 0s.
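The counting idea described above is known as run-length encoding. A minimal sketch in Python (the `rle_encode`/`rle_decode` helper names are illustrative, not a production codec):

```python
def rle_encode(bits: str) -> list[tuple[str, int]]:
    """Store each bit together with how many times it repeats
    consecutively, instead of storing every 1 and 0."""
    runs: list[tuple[str, int]] = []
    for bit in bits:
        if runs and runs[-1][0] == bit:
            # Extend the current run of identical bits
            runs[-1] = (bit, runs[-1][1] + 1)
        else:
            # A different bit starts a new run
            runs.append((bit, 1))
    return runs


def rle_decode(runs: list[tuple[str, int]]) -> str:
    """Expand the (bit, count) pairs back into the original string."""
    return "".join(bit * count for bit, count in runs)


encoded = rle_encode("0000001111100")
print(encoded)                    # [('0', 6), ('1', 5), ('0', 2)]
print(rle_decode(encoded))       # 0000001111100
```

Lossless compression works exactly like this round trip: the decoded output is bit-for-bit identical to the input, only the stored representation is smaller when the data contains long runs.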
Data Compression in Web Hosting
The ZFS file system that runs on our cloud web hosting platform uses a compression algorithm called LZ4. It is considerably faster than the other algorithms in widespread use, particularly for compressing and decompressing non-binary data, i.e. web content. LZ4 can even decompress data faster than it can be read from a hard drive, which improves the overall performance of sites hosted on ZFS-based platforms. Because the algorithm compresses data well and does so very quickly, we can generate several backups of all the content stored in the web hosting accounts on our servers every day. Both your content and its backups take up less space, and since both ZFS and LZ4 work extremely fast, generating the backups does not affect the performance of the web hosting servers where your content is kept.
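On systems where ZFS is administered directly, LZ4 compression is enabled per dataset with the `zfs set` command; a brief sketch (the pool/dataset name `tank/web` below is just a placeholder):

```shell
# Enable LZ4 compression on a hypothetical dataset
zfs set compression=lz4 tank/web

# Confirm the setting took effect
zfs get compression tank/web

# See how well the data stored so far actually compresses
zfs get compressratio tank/web
```

Because compression is a dataset property, ZFS applies it transparently on every write and read, which is why the hosting platform can compress all stored content without any change on the website's side.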