Data compression is the encoding of information using fewer bits than the original representation. Because of this, compressed data takes up much less disk space than the original, so considerably more content can be stored in the same amount of space. There are many compression algorithms that work in different ways. With some of them, only redundant bits are removed, so no quality is lost when the data is uncompressed; this is called lossless compression. Others discard less important bits, so uncompressing the data later yields lower quality than the original; this is lossy compression. Compressing and uncompressing content consumes a considerable amount of system resources, particularly CPU time, so any web hosting platform that uses real-time compression needs ample processing power to support that feature. A simple example of how data can be compressed is run-length encoding: a binary sequence such as 111111 is replaced with 6x1, i.e. the number of consecutive 1s or 0s is stored instead of the actual bits.
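The run-length idea described above can be sketched in a few lines of Python. This is a minimal illustration of the 111111 → 6x1 example, not a production compressor; the function names and the `NxS` run format are chosen here for clarity.

```python
def rle_encode(bits: str) -> str:
    """Run-length encode a bit string: '111111' -> '6x1'.

    Each run of identical symbols is stored as 'countxsymbol',
    with runs separated by commas.
    """
    if not bits:
        return ""
    runs = []
    count = 1
    for prev, cur in zip(bits, bits[1:]):
        if cur == prev:
            count += 1
        else:
            runs.append(f"{count}x{prev}")
            count = 1
    runs.append(f"{count}x{bits[-1]}")  # flush the final run
    return ",".join(runs)


def rle_decode(encoded: str) -> str:
    """Reverse the encoding: '6x1' -> '111111' (lossless round trip)."""
    if not encoded:
        return ""
    return "".join(
        symbol * int(count)
        for count, symbol in (run.split("x") for run in encoded.split(","))
    )
```

Because the decoder reconstructs the input exactly, this is a lossless scheme: `rle_decode(rle_encode(data))` always returns the original bit string.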
Data Compression in Shared Website Hosting
The ZFS file system that runs on our cloud web hosting platform employs a compression algorithm called LZ4. LZ4 is considerably faster than comparable algorithms, particularly at compressing and uncompressing non-binary data, i.e. web content. It can even uncompress data faster than the data can be read from a hard disk, which improves the performance of websites hosted on ZFS-based platforms. Because the algorithm compresses data well and does so very quickly, we can generate several backup copies of all the content stored in the shared website hosting accounts on our servers every day. Both your content and its backups take up less space, and since both ZFS and LZ4 work very quickly, generating the backups does not affect the performance of the web hosting servers where your content is kept.
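For readers who administer their own ZFS systems, compression of the kind described above is enabled per dataset through the standard `zfs` command-line tool. The dataset name `tank/web` below is a hypothetical example; the `compression` property and the read-only `compressratio` property are part of the standard ZFS property set.

```shell
# Enable LZ4 compression on a dataset (hypothetical dataset name).
# New writes are compressed transparently; existing data is unaffected
# until it is rewritten.
zfs set compression=lz4 tank/web

# Check how well the stored data is compressing.
zfs get compression,compressratio tank/web
```

Because LZ4 is so cheap to run, it is commonly left enabled even on busy datasets; the CPU cost is typically outweighed by the reduced disk I/O.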