It is my understanding that a *.jpg file loses a fraction of its quality each time it is copied, due to its lossy compression algorithm (correct me if I'm wrong).
"Copy" is the wrong word. When you copy one file to another, both files are perfectly identical.
A .jpg file is image data encoded with a lossy compression algorithm: it discards some of the original image's detail in order to make the file's disk footprint much smaller. When you open a .jpg file, you are decoding it to view the reduced-quality image it contains. If you then save that decoded image to another .jpg file, you are not copying but re-encoding the picture data, running it through lossy compression yet again and further reducing the image quality.
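Here is a toy sketch of that "generation loss" effect. This is not real JPEG encoding (which quantizes DCT coefficients), but it illustrates the same principle: each lossy re-save rounds the data again, and the error compounds.

```python
# Toy model of JPEG generation loss: simulate a lossy "save" by
# quantizing pixel values to a grid, the way a lossy encoder throws
# away fine detail it deems unnoticeable.

def lossy_resave(pixels, step):
    # Snap each value to the nearest multiple of `step`
    return [round(p / step) * step for p in pixels]

original = [3, 7, 12, 18, 25, 31]

gen1 = lossy_resave(original, 3)   # first save
gen2 = lossy_resave(gen1, 5)       # re-saved at a different "quality"
gen3 = lossy_resave(gen2, 3)       # and re-saved again

def error(pixels):
    # Total deviation from the original data
    return sum(abs(a - b) for a, b in zip(original, pixels))

print(error(gen1))  # 3  -- some detail is lost on the first save
print(error(gen3))  # 12 -- the error compounds with each re-save
```

Note that copying `gen3` to another file, byte for byte, would change nothing; it is the re-encoding step that does the damage.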
Suppose instead that the .jpg file (or a document containing .jpgs) is zipped with an archival program (for example, WinRAR).
My question: Is the .jpg quality similarly degraded during the zipping/unzipping process? Or is this a "lossless" process? Or is it program-dependent?
Thank you very much for your thoughts.
The answer is no. Compressing a .jpg file into a .zip or .rar archive uses a lossless compression algorithm, which encodes data so that it takes up less space yet can be decoded back to exactly the same bytes that went in. In the case of .jpg files, though, the size reduction from archiving is minimal, since JPEG data is already compressed and therefore mostly incompressible.
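You can verify the losslessness yourself. The sketch below zips a file, extracts it, and compares SHA-256 hashes of the bytes before and after; random bytes stand in for JPEG data, which the zip format treats as an opaque byte stream anyway.

```python
import hashlib
import os
import tempfile
import zipfile

with tempfile.TemporaryDirectory() as tmp:
    # Create a stand-in "photo": random bytes, incompressible like JPEG data
    src = os.path.join(tmp, "photo.jpg")
    with open(src, "wb") as f:
        f.write(os.urandom(4096))

    with open(src, "rb") as f:
        before = hashlib.sha256(f.read()).hexdigest()

    # Compress into a zip archive (DEFLATE, a lossless algorithm)
    archive = os.path.join(tmp, "photo.zip")
    with zipfile.ZipFile(archive, "w", zipfile.ZIP_DEFLATED) as z:
        z.write(src, "photo.jpg")

    # Extract and hash the restored file
    out_dir = os.path.join(tmp, "out")
    with zipfile.ZipFile(archive) as z:
        z.extract("photo.jpg", out_dir)
    with open(os.path.join(out_dir, "photo.jpg"), "rb") as f:
        after = hashlib.sha256(f.read()).hexdigest()

print(before == after)  # True: bit-for-bit identical after the round trip
```

The matching hashes show the unzipped file is byte-identical to the original, so no image quality can possibly be lost.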