Since 2006, Baldrson (Slashdot reader #78,598) has been part of the team that verifies entries in an ongoing challenge to compress 100 MB of Wikipedia excerpts (roughly the amount a human can read in a lifetime): the Hutter Prize for Lossless Compression of Human Knowledge.
“The purpose of this award is to encourage the development of intelligent compressors/programs as a path to artificial general intelligence,” the project's website explains. Fifteen years ago, Baldrson wrote a Slashdot post explaining the rationale (titled “Compress Wikipedia and Win AI Prize”).
The basic theory, which Hutter proved, is that after any set of observations, the optimal move for an AI is to find the smallest program that predicts those observations and then assume its environment is controlled by that program. Think of it as Occam's razor on steroids.
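As a toy illustration of that idea (my own sketch, not Hutter's actual formalism), consider several candidate "programs" that all try to explain an observed sequence: keep only the ones consistent with the observations, then prefer the one with the shortest description and use it to predict what comes next. The candidate models and their description strings below are entirely hypothetical.

```python
# Toy sketch of "find the smallest program that predicts the observations".
# Description length stands in for true program length; candidates are
# hypothetical (description, predictor) pairs, not real compressors.

observations = [0, 1, 0, 1, 0, 1]

candidates = [
    ("repeat 0,1", lambda i: i % 2),
    ("always 0", lambda i: 0),
    ("literal list 0,1,0,1,0,1 then zeros forever",
     lambda i: [0, 1, 0, 1, 0, 1][i] if i < 6 else 0),
]

# Keep only models that reproduce every observation seen so far.
consistent = [
    (desc, fn) for desc, fn in candidates
    if all(fn(i) == obs for i, obs in enumerate(observations))
]

# Occam's razor: among consistent models, the shortest description wins.
best_desc, best_fn = min(consistent, key=lambda pair: len(pair[0]))

print(best_desc)   # the preferred (shortest consistent) model
print(best_fn(6))  # its prediction for the next observation
```

Note that "always 0" is even shorter than "repeat 0,1", but it is eliminated first because it fails to predict the observations; brevity only breaks ties among models that actually fit the data.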
The prize amount also scales with how much compression is achieved. (That is, if you compress the 1 GB file x% better than the current record, you receive x% of the prize money...) The first prize was awarded in 2006. And now Baldrson writes:
Kaido Orav has just improved 1.38% on the Hutter Prize for Lossless Compression of Human Knowledge with his “fx-cmix” entry.
This year's winner comes just six months after the previous one, and the competition seems to be heating up. This is all the more impressive because each improvement on the benchmark edges closer to the (unknown) minimum possible size of the data, known as its Kolmogorov complexity.
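The payout rule described earlier (an x% improvement over the record earns x% of the prize fund) can be sketched in a few lines. The record size and fund amount below are hypothetical placeholders, not the prize's actual figures; only the 1.38% margin comes from the quote above.

```python
# Sketch of the proportional payout rule: compressing x% better than
# the standing record earns x% of the prize fund.
# All byte and currency figures below are hypothetical.

def payout(record_bytes, new_bytes, fund):
    # Fractional improvement over the standing record...
    improvement = (record_bytes - new_bytes) / record_bytes
    # ...maps directly to the same fraction of the fund.
    return improvement * fund

# A 1.38% improvement (the margin mentioned above) on a hypothetical
# 116,000,000-byte record, against a hypothetical fund of 500,000:
print(payout(116_000_000, 116_000_000 * (1 - 0.0138), 500_000))
```

The rule is deliberately open-ended: because the true minimum size (the Kolmogorov complexity) is unknown and uncomputable, the fund pays out incrementally for every verified step toward it rather than for reaching a fixed target.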