Content of review 1, reviewed on April 12, 2019

The Algorithmic Information Content (AIC) of a string s is defined as the length of the shortest binary program p that outputs s. Unfortunately, no algorithm can produce such a shortest program, and therefore the AIC is not computable. Consequently, other, more practical measures are used to quantify the information content of a string s. The empirical entropy is a sequence of numbers H_l, computed from the l-digit substrings of s, which give statistical measures of the average information content of the digits of s. The notion of coarse optimality states that an algorithm Z is coarsely optimal when its compression ratio |Z(s)|/|s| is less than or equal to the empirical entropy H_l(s) for each l. Examples of this type are the Lempel-Ziv algorithms, LZ77 and LZ78, for positive-entropy systems. However, coarse optimality is not sufficient for a coding algorithm to characterize zero-entropy strings. For this reason, the notion of asymptotic optimality is introduced, identifying those algorithms whose compression ratio has the same asymptotic behavior as the empirical entropy. The authors show that the set of asymptotically optimal compression algorithms with respect to each H_l is not empty, and they describe one such example, the so-called Frequency Coding algorithm. Finally, they also show that, for the orbits of the Manneville maps, the information content measured by these algorithms has the same asymptotic behavior as the AIC. (Reviewed for MathReviews, MathSciNet, 2007).
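To illustrate the empirical entropy described above, here is a minimal sketch (not taken from the paper; the function name, the use of overlapping l-digit substrings, and the normalization by l are assumptions of this illustration, since conventions vary in the literature) that computes H_l for a periodic string and for a random binary string:

    from collections import Counter
    from math import log2
    import random

    def empirical_entropy(s, l):
        """l-th empirical entropy of s, in bits per digit (one common
        convention, assumed here): relative frequencies of all overlapping
        l-digit substrings, block entropy divided by l."""
        blocks = [s[i:i + l] for i in range(len(s) - l + 1)]
        n = len(blocks)
        freqs = Counter(blocks)
        block_entropy = -sum((c / n) * log2(c / n) for c in freqs.values())
        return block_entropy / l

    periodic = "01" * 500  # a periodic, hence zero-entropy, string
    random_bits = "".join(random.choice("01") for _ in range(1000))

    for l in (1, 2, 4, 8):
        print(l, empirical_entropy(periodic, l), empirical_entropy(random_bits, l))
    # For the periodic string H_l decays like 1/l, while for the random
    # string it stays near 1 bit per digit (up to finite-sample effects).

On a zero-entropy string every H_l tends to 0 as l grows, which is exactly the regime where coarse optimality alone says little and the asymptotic optimality discussed in the paper becomes relevant.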

Source

    © 2019 the Reviewer (CC BY 4.0).