Data compression is the process of reducing the number of bits that need to be stored or transmitted, and it plays an important role in web hosting, as the information kept on hard disk drives is usually compressed to take up less space. There are various algorithms for compressing data, and their effectiveness depends on the content. Some of them remove only redundant bits, so no information is lost (lossless compression), while others discard bits that are considered unnecessary, which results in lower quality once the data in question is uncompressed (lossy compression). The process requires a fair amount of processing time, so a web hosting server has to be powerful enough to compress and uncompress data on the fly. A simple example of how binary code can be compressed is to "remember" that there are five consecutive 1s, for instance, instead of storing all five 1s individually.
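The "five consecutive 1s" idea is run-length encoding, one of the simplest lossless schemes. Purely as an illustration (the function names below are ours and are not part of any hosting platform), a minimal Python sketch might look like this:

```python
def rle_encode(bits):
    """Collapse runs of repeated bits into (bit, count) pairs."""
    runs = []
    for bit in bits:
        if runs and runs[-1][0] == bit:
            runs[-1][1] += 1          # extend the current run
        else:
            runs.append([bit, 1])     # start a new run
    return runs

def rle_decode(runs):
    """Expand (bit, count) pairs back into the original bit string."""
    return "".join(bit * count for bit, count in runs)

original = "1111100110000"
encoded = rle_encode(original)        # [['1', 5], ['0', 2], ['1', 2], ['0', 4]]
assert rle_decode(encoded) == original
```

Instead of storing thirteen individual bits, the encoder stores four short runs, which is the essence of how redundant data shrinks without losing anything.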

Data Compression in Shared Hosting

The compression algorithm we employ on the cloud hosting platform where your new shared hosting account will be created is called LZ4, and it is applied by the state-of-the-art ZFS file system that powers the platform. The algorithm outperforms the ones other file systems use, as its compression ratio is higher and it processes data considerably faster. The speed is most noticeable when content is being uncompressed, since this happens even faster than the uncompressed data could be read from a hard disk drive. As a result, LZ4 improves the performance of every website hosted on a server that uses the algorithm. We take advantage of LZ4 in one more way: its speed and compression ratio allow us to generate a couple of daily backups of the entire content of all accounts and keep them for one month. Not only do these backups take up less space, but generating them does not slow the servers down, as often happens with other file systems.
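As a rough illustration of how LZ4 behaves on repetitive web content, here is a small round-trip sketch using the third-party python-lz4 package; the sample page and the byte counts are ours and have nothing to do with the hosting platform itself, where compression happens transparently at the file system level:

```python
# Minimal LZ4 round-trip sketch (pip install lz4); illustrative only.
import lz4.frame

page = b"<html><body>" + b"Hello, visitor! " * 1000 + b"</body></html>"

compressed = lz4.frame.compress(page)        # the form that is stored on disk
restored = lz4.frame.decompress(compressed)  # decompression is the fast path

assert restored == page                      # lossless: nothing is changed
print(f"original: {len(page)} bytes, compressed: {len(compressed)} bytes")
```

Highly repetitive content such as HTML markup compresses very well, which is why both live sites and their backups take up noticeably less disk space.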

Data Compression in Semi-dedicated Servers

The ZFS file system that runs on the cloud platform where your semi-dedicated server account will be created uses a powerful compression algorithm called LZ4. It is among the best algorithms available and arguably the best one when it comes to compressing and uncompressing web content, as its compression ratio is very high and it uncompresses data much faster than the same data could be read from a hard disk drive in uncompressed form. In this way, using LZ4 speeds up any website that runs on a platform where the algorithm is enabled. This high performance requires plenty of CPU processing time, which is provided by the large number of clusters working together as part of our platform. Furthermore, LZ4 enables us to generate several backups of your content every day and keep them for one month, as they take up much less space than regular backups and are generated much faster without loading the servers.
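The two claims above, a high ratio on web content and very fast decompression, can be checked with a simple measurement. The sketch below again uses the python-lz4 package on synthetic HTML-like data; the exact figures are an assumption and depend entirely on the sample content and the machine running it:

```python
# Hedged sketch: measure LZ4's compression ratio and decompression time.
import time
import lz4.frame

content = b"<div class='post'><p>Lorem ipsum dolor sit amet.</p></div>\n" * 5000

compressed = lz4.frame.compress(content)
ratio = len(content) / len(compressed)       # how many times smaller it gets

start = time.perf_counter()
lz4.frame.decompress(compressed)             # the operation a busy site repeats
elapsed = time.perf_counter() - start

print(f"ratio: {ratio:.1f}x, decompressed in {elapsed * 1000:.2f} ms")
```

Because decompressing a block typically takes far less time than reading the full-size block from a mechanical drive, serving compressed content can end up faster than serving it uncompressed.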