
Things that affect compression of files



#1 TheDcoder


Posted 08 February 2015 - 03:55 AM

I want to know all the things (or conditions) which affect compression :)

 

So far I only know about the following things:

1. Free Memory (RAM Usage)

2. Compression methods

3. Archive format (or Compressor)

4. Files which are being compressed
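
To make points 2 and 4 concrete, here is a rough sketch of my own (plain Python standard library, not tied to any particular archiver) showing how the choice of method/level and the nature of the data change the result:

import os
import zlib
import lzma

# Two very different inputs: one full of repetition, one essentially random.
repetitive = b"the same phrase over and over " * 1000
random_bytes = os.urandom(30000)

for name, data in [("repetitive", repetitive), ("random", random_bytes)]:
    for label, packed in [
        ("zlib level 1", zlib.compress(data, 1)),   # fast, weaker compression
        ("zlib level 9", zlib.compress(data, 9)),   # slower, stronger compression
        ("lzma (xz)", lzma.compress(data)),         # a different algorithm entirely
    ]:
        print(f"{name:10s} {label:12s} {len(data):6d} -> {len(packed):6d} bytes")

The repetitive input shrinks dramatically with any method, while the random input barely shrinks at all, whichever method or level is used.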

 

(I want this info so I can squeeze my files until they break :P)

 

Thanks in Advance! :)



#2 Platypus


Posted 08 February 2015 - 04:50 AM

IMO the major remaining factor is probably compression time. I think some of the other factors could be viewed as two sides of the same coin: compression method and file type/data characteristics, for example, or compression method/algorithm and archive format/utility.

 

Once the relationships between these are established (in terms of how much compression can be achieved), the main limitation becomes a trade-off between the ultimate compression factor and the time that can feasibly be devoted to the process. We could imagine that the maximum possible compression would be achieved by a utility that had numerous algorithms optimised for different data types, analysed each file to pick the ideal algorithm, and even applied different algorithms to different sections of some files. That would be a time-consuming process, though, and how do you work out whether 5% more compression is worth taking 50% longer? Where does the law of diminishing returns really set in?

 

Most utilities address this to some degree by letting you select a compression strength, with rough estimates of the resulting size and the time required.
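
As a rough sketch of that trade-off (my own example, using only Python's standard zlib rather than any particular archive utility), timing a few compression strengths on the same input shows the diminishing returns directly:

import os
import time
import zlib

# Moderately compressible input: repeated text mixed with some random bytes.
data = (b"some text that repeats " * 50 + os.urandom(200)) * 500

for level in (1, 6, 9):
    start = time.perf_counter()
    packed = zlib.compress(data, level)
    elapsed = time.perf_counter() - start
    ratio = len(packed) / len(data)
    print(f"level {level}: {len(data)} -> {len(packed)} bytes "
          f"({ratio:.1%}) in {elapsed:.4f} s")

Going from level 1 to 6 usually buys a noticeable size reduction; going from 6 to 9 typically costs considerably more time for only a small further gain.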




#3 rp88


Posted 08 February 2015 - 12:03 PM

As a general rule, some file formats are already compressed, so compressing them into zip, 7z or rar files will have little or no further effect. Compression works by finding sections of the file where the same data repeats and, rather than keeping the full data each time, replacing it with something that says "repeat the following X times". Some file formats already do this to the content they contain, so compressing them again doesn't make them much smaller.
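
A quick way to see this (my own sketch, using Python's zlib, which implements the same DEFLATE algorithm used in zip files) is to compress something twice: the first pass removes the repetition, so the second pass has almost nothing left to work with:

import zlib

original = b"abcabcabc " * 5000       # lots of repeated sections
once = zlib.compress(original, 9)     # first pass removes the repetition
twice = zlib.compress(once, 9)        # second pass finds almost nothing to remove

print(len(original), len(once), len(twice))
# The first pass shrinks the data dramatically; the second pass leaves it
# about the same size, sometimes a few bytes larger.

Already-compressed formats such as JPEG, MP3 or MP4 behave like that second pass.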

#4 TheDcoder


Posted 09 February 2015 - 12:48 AM

@Platypus IMO time is not as important as compression ratio :D

@rp88 I agree, by "4. Files which are being compressed" I meant exactly what you are describing :)





