We have demonstrated that very rapid and efficient compression is possible with a simple algorithm that exploits the basic structure of time-varying digital image data. Our results suggest that the method is nearly as fast as a file copy for both compression and decompression. It can therefore be deployed as a routine part of the transfer of data from the scanner to any choice of archive media without significant time penalty. Once the data are compressed, of course, all subsequent file operations are correspondingly faster because the files are smaller.
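This section does not restate the algorithm itself, but a scheme consistent with "exploiting the structure of time-varying image data" is frame-to-frame differencing: successive images in a time series differ only slightly, so the differences cluster near zero and compress far better than the raw frames. The sketch below is illustrative only; the function names are ours, and gzip stands in for whatever fast back-end coder is actually used.

```python
import gzip
import numpy as np

def compress_series(frames: np.ndarray) -> bytes:
    """Hypothetical temporal-difference compression (not the paper's code).

    frames: int16 array of shape (n_frames, rows, cols).
    """
    # Store the first frame verbatim, then successive differences.
    # int16 subtraction wraps mod 2**16, so the transform is exactly
    # invertible even if a difference overflows.
    deltas = np.concatenate([frames[:1], np.diff(frames, axis=0)])
    return gzip.compress(deltas.tobytes())

def decompress_series(blob: bytes, shape: tuple) -> np.ndarray:
    deltas = np.frombuffer(gzip.decompress(blob), dtype=np.int16).reshape(shape)
    # Running sum along the time axis undoes the differencing exactly.
    return np.cumsum(deltas, axis=0, dtype=np.int16)
```

Because the differencing pass is a single linear sweep over the data, a scheme of this kind can run at close to file-copy speed, which is consistent with the timing behavior reported above.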
The times listed in Figure 3 may appear too short to be of much interest, even for the worst case. However, a 15 Mbyte set such as this represents only about one-twentieth of our usual data collection, so compressing a full study with gzip (at roughly one minute per set) adds almost twenty minutes of compute time. Further, the final file size is a major factor in the time required for archiving, network transfer, and similar operations. Our lab, for example, routinely uses ftp to move data from the scanners to our archiving computer, which takes about 5 minutes for the uncompressed files; archiving the data to compact discs requires about 20 minutes more. We estimate conservatively that the proposed compression method will shorten processing time by at least 30 minutes per study, and by much more if the data are routinely moved, compressed, and expanded during subsequent analyses.
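To make the arithmetic explicit, the sketch below totals the per-study costs quoted above. All constants are taken from the text except the per-set gzip time, which we infer from the one-twentieth scaling rather than measure.

```python
# Back-of-the-envelope time budget for one study (minutes).
SETS_PER_STUDY = 20      # a 15 Mbyte set is about 1/20 of a study
GZIP_MIN_PER_SET = 1.0   # inferred: 20 sets -> "almost twenty minutes"
FTP_MIN = 5              # scanner -> archive host, uncompressed files
CD_MIN = 20              # writing the study to compact disc

gzip_pipeline = SETS_PER_STUDY * GZIP_MIN_PER_SET + FTP_MIN + CD_MIN
print(f"gzip-based pipeline: ~{gzip_pipeline:.0f} min per study; "
      f"estimated saving with the fast method: at least 30 min")
```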