The term data compression describes reducing the number of bits needed to store or transmit data. It can be done with or without losing information: lossless compression removes only redundant data, while lossy compression also discards data that is considered unneeded. When the data is decompressed afterwards, a lossless algorithm restores it exactly, whereas a lossy one produces output of lower quality. Different compression algorithms are more effective for different kinds of data. Compressing and decompressing data normally takes a considerable amount of processing time, so the server performing the operation must have sufficient resources to handle it quickly enough. A simple example of compression is run-length encoding: instead of storing the actual 1s and 0s of a binary sequence, you store how many consecutive positions contain a 1 and how many contain a 0.
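
As a rough illustration of the run-length idea above, here is a minimal sketch in Python. The function names (rle_encode, rle_decode) are made up for this example and do not refer to any particular product or library.

def rle_encode(bits):
    """Turn a string of 1s and 0s into (bit, run_length) pairs."""
    runs = []
    i = 0
    while i < len(bits):
        j = i
        while j < len(bits) and bits[j] == bits[i]:
            j += 1
        runs.append((bits[i], j - i))
        i = j
    return runs

def rle_decode(runs):
    """Rebuild the original bit string from the (bit, run_length) pairs."""
    return "".join(bit * count for bit, count in runs)

original = "0000001111111100011"
encoded = rle_encode(original)
print(encoded)                          # [('0', 6), ('1', 8), ('0', 3), ('1', 2)]
assert rle_decode(encoded) == original  # lossless: the original is restored exactly

Because only the run lengths are stored, long runs of identical bits take far less space than the bits themselves, which is the whole point of the technique.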

Data Compression in Hosting

The ZFS file system that runs on our cloud hosting platform uses a compression algorithm called LZ4. It can boost the performance of any website hosted in an account with us, as it not only compresses data better than the algorithms used by other file systems, but also decompresses data faster than a hard drive can read it. This comes at the cost of a great deal of CPU processing time, which is not a problem for our platform because it uses clusters of powerful servers working together. Another advantage of LZ4 is that it allows us to generate backups much faster and with less disk space, so we keep several daily backups of your files and databases, and generating them does not affect the performance of the servers. This way, we can always restore any content that you may have deleted by mistake.
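
For illustration only, the lossless behaviour described above can be reproduced with the open-source LZ4 library. The sketch below assumes the python-lz4 package (imported as lz4.frame) is installed; it has nothing to do with your hosting account itself, where compression is applied transparently at the file-system level by ZFS.

import lz4.frame  # assumes the python-lz4 package: pip install lz4

# Typical web content (HTML, CSS, text) is highly repetitive,
# which is why LZ4 achieves good ratios on it.
page = ("<div class='item'>Lorem ipsum dolor sit amet</div>\n" * 500).encode("utf-8")

compressed = lz4.frame.compress(page)
restored = lz4.frame.decompress(compressed)

assert restored == page  # lossless: the decompressed data matches the original
print(f"original:   {len(page)} bytes")
print(f"compressed: {len(compressed)} bytes")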

Data Compression in Semi-dedicated Servers

The semi-dedicated server plans that we provide are built on a powerful cloud platform that runs on the ZFS file system. ZFS uses a compression algorithm called LZ4, which outperforms other algorithms in terms of speed and compression ratio when processing web content. This is especially true for decompression, which LZ4 performs faster than uncompressed data can be read from a hard disk drive, so websites running on a platform with LZ4 enabled will load faster. We are able to take advantage of this feature even though it requires a great deal of CPU processing time, because our platform uses a large number of powerful servers working together and we never place accounts on a single machine the way most companies do. There is an additional benefit to using LZ4: since it compresses data very well and does so very quickly, we can also generate multiple daily backups of all accounts without affecting the performance of the servers and keep them for a whole month. This way, you can always recover any content that you delete by accident.
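
The speed and ratio claims above can be explored on your own machine with a hedged sketch like the one below. It again assumes the python-lz4 package is installed; the sample data and the repeat counts are arbitrary, and the measured throughput depends entirely on your hardware, so treat the output as a rough experiment rather than a benchmark of our platform.

import time
import lz4.frame  # assumes the python-lz4 package: pip install lz4

# Repetitive sample data standing in for typical site content.
data = ("user,visits,country\nalice,42,BG\nbob,17,US\n" * 20000).encode("utf-8")
compressed = lz4.frame.compress(data)

start = time.perf_counter()
for _ in range(50):
    lz4.frame.decompress(compressed)
elapsed = time.perf_counter() - start

decompressed_mb = len(data) * 50 / (1024 * 1024)
print(f"compression ratio: {len(data) / len(compressed):.1f}x")
print(f"decompression throughput: {decompressed_mb / elapsed:.0f} MB/s")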