The collection of large numbers of serial images is common in functional imaging studies, such as fMRI [1,2], cerebral blood volume [3], DSA [4,5], MR mammography [6,7] and others. A typical functional neuroimaging study in our laboratory, for example, involves the collection of about 500 Mbytes of digital data and places a considerable burden on the data storage, archiving, backup and network systems. Commonly available data compression algorithms such as gzip (Open Software Foundation, Cambridge, MA), compress (Digital UNIX 4.0, Compaq Computer, Houston, TX) or StuffIt (Aladdin Systems, Watsonville, CA) offer lossless compression ratios of at most about 50%, but are relatively time consuming both in compression and in expansion, adding a significant time penalty to the entire data analysis stream.
The standard tools are inefficient because the image data typically contain noise and therefore appear as essentially random numbers to the compression programs. By exploiting the repetitive structure of an imaging time series, however, a much more efficient algorithm can be developed, as described here.
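As a minimal sketch of the underlying idea (not the specific algorithm described in this paper), the example below uses a synthetic image time series with a repeated background and small frame-to-frame noise, and compares the compressed size of the raw frames against the compressed size of the first frame plus the differences between consecutive frames. The frame dimensions, noise amplitude, and use of zlib are illustrative assumptions only.

```python
import zlib
import numpy as np

# Illustrative sketch only: demonstrates that an imaging time series
# compresses far better once its temporal redundancy is exposed.
rng = np.random.default_rng(0)

# Synthetic "functional" series: a fixed background repeated in every
# frame, plus a small amount of frame-to-frame noise (assumed values).
n_frames, ny, nx = 100, 64, 64
background = rng.integers(0, 4096, size=(ny, nx), dtype=np.int16)
noise = rng.integers(-8, 9, size=(n_frames, ny, nx)).astype(np.int16)
series = background[None, :, :] + noise          # shape (frames, y, x)

# Naive approach: compress the raw frames directly.
raw_size = len(zlib.compress(series.tobytes()))

# Exploit the repetitive structure: store the first frame plus the
# differences between consecutive frames, which are mostly small values.
# The original series is recoverable by a cumulative sum, so this
# transformation is lossless.
diffs = np.diff(series, axis=0)
residual = np.concatenate([series[:1], diffs], axis=0)
diff_size = len(zlib.compress(residual.tobytes()))

print(f"raw frames:        {raw_size} bytes compressed")
print(f"temporal residual: {diff_size} bytes compressed")
```

On such data, the temporal residual compresses to a small fraction of the size of the raw frames, because the differencing step removes the repeated anatomical structure and leaves only low-amplitude noise for the entropy coder.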