The term data compression refers to reducing the number of bits needed to store or transmit information. Compression can be lossless or lossy: a lossless algorithm removes only redundant data, so when the data is uncompressed later on, the content and its quality are identical to the original, while a lossy algorithm discards data considered less important, so the restored content is of lower quality. Different compression algorithms work better for different kinds of data. Compressing and uncompressing data often takes a lot of processing time, so the server performing the operation must have sufficient resources to handle the workload quickly. One simple example of how information can be compressed is to store how many consecutive positions contain 1 and how many contain 0 in the binary code, rather than storing the actual 1s and 0s.
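The counting scheme described above is known as run-length encoding. A minimal sketch in Python (the function names are illustrative, not part of any particular library) shows both the encoding and why it is lossless, since decoding reproduces the input exactly:

```python
def rle_encode(bits):
    """Run-length encode a string of '1's and '0's.

    Returns a list of (bit, count) pairs, e.g.
    "1110001" becomes [("1", 3), ("0", 3), ("1", 1)].
    """
    runs = []
    for bit in bits:
        if runs and runs[-1][0] == bit:
            # Same bit as the previous run: extend its count.
            runs[-1] = (bit, runs[-1][1] + 1)
        else:
            # Bit changed: start a new run.
            runs.append((bit, 1))
    return runs


def rle_decode(runs):
    """Reverse the encoding by expanding each (bit, count) pair."""
    return "".join(bit * count for bit, count in runs)


data = "111111110000000011"
encoded = rle_encode(data)
# Lossless compression: the round trip restores the data exactly.
assert rle_decode(encoded) == data
```

Long runs of identical bits compress well under this scheme, while data that alternates frequently does not, which illustrates why different algorithms suit different kinds of data.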

Data Compression in Web Hosting

The compression algorithm used by the ZFS file system that runs on our cloud hosting platform is called LZ4. It can boost the performance of any website hosted in a web hosting account on our end, because it not only compresses data more efficiently than the algorithms used by other file systems, but also uncompresses data faster than a hard disk can read it. This comes at the cost of considerable CPU time, which is not a problem for our platform, as it uses clusters of powerful servers working together. Another advantage of LZ4 is that it enables us to generate backup copies much faster and with less disk space, so we can keep a couple of daily backups of your databases and files without their generation affecting the performance of the servers. This way, we can always recover any content that you may have deleted by accident.
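For illustration, LZ4 compression on ZFS is enabled per dataset through the standard `zfs` administration commands. The pool and dataset names below are placeholders, not the names used on any particular platform:

```shell
# Enable LZ4 compression on a dataset (tank/webhosting is a hypothetical name)
zfs set compression=lz4 tank/webhosting

# Confirm the setting and inspect the compression ratio actually achieved
zfs get compression tank/webhosting
zfs get compressratio tank/webhosting
```

Because the property applies transparently at the file-system level, applications read and write files as usual while ZFS compresses and uncompresses the blocks underneath.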