Elementary Kolmogorov complexity
Two introductory lectures on program-size complexity
by G. Barmpalias in December 2023 at UESTC
Abstract. Algorithmic information theory is a subfield of computer science and mathematics. The program-size complexity of a finite string is the length of the shortest program that prints it. This complexity measure is also known as Kolmogorov or Solomonoff–Kolmogorov–Chaitin complexity (after the founders of the theory), as descriptive complexity, or as algorithmic entropy. In these two lectures I introduce this concept, along with some simple and some fundamental properties of it.
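Since the shortest program for a given string cannot be computed, in practice one can only bound its length from above, for example with a general-purpose compressor. A minimal Python sketch of this idea, using `zlib` purely as an illustrative upper bound (the function name is my own, not from the lectures):

```python
import os
import zlib

def compressed_length(s: bytes) -> int:
    # Length of a zlib-compressed description of s.  This is an UPPER bound
    # on description length: true Kolmogorov complexity is uncomputable,
    # and a compressor can only ever overestimate it.
    return len(zlib.compress(s, 9))

regular = b"ab" * 500            # highly regular: "print 'ab' 500 times" is short
random_like = os.urandom(1000)   # incompressible with overwhelming probability

# compressed_length(regular) is tiny, compressed_length(random_like) is ~1000
```

The contrast between the two calls illustrates the basic intuition: strings with short descriptions compress well, while typical random strings do not compress at all.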
Lecture 1: Program-size complexity - [Web slides] and [pdf slides]
- Machines, Programs and Numberings
- Universal machines and Invariance
- Plain Kolmogorov complexity
- Quantitative estimates and properties of information
- Algorithmic properties of complexity, Berry's paradox, undecidability
- Conditional complexity
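The undecidability result listed above rests on a Berry-style argument, which can be sketched as code: if plain complexity K were computable, the short search below would print the first string whose complexity exceeds n, yet the search itself (plus n) is a description of its output of only about log n bits. The function name and the stand-in for K used below are hypothetical, purely for illustration:

```python
from itertools import count, product

def first_string_with_high_complexity(n, K):
    # Hypothetical search: given a (supposedly computable) complexity
    # function K, return the first binary string, in length-lexicographic
    # order, with K(x) > n.
    # If K were truly Kolmogorov complexity, this routine together with n
    # would describe its output in O(log n) bits, contradicting
    # K(output) > n for large n -- so no such computable K exists.
    for length in count(0):
        for bits in product("01", repeat=length):
            x = "".join(bits)
            if K(x) > n:
                return x
```

Running it with a computable stand-in such as `len` (which of course is not Kolmogorov complexity) shows only the search mechanics; with the true K the argument above shows it could never be implemented.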
Lecture 2: Universal distribution - [Web slides] and [pdf slides]
- Self-delimiting (prefix-free) complexity
- Prefix-free codes and Kraft's inequality
- Universal distribution
- The coding theorem and Algorithmic probability
- Symmetry of information
- Relationship with Shannon information
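Kraft's inequality, listed above, states that the codeword lengths l_i of any prefix-free binary code satisfy Σ 2^{-l_i} ≤ 1. A small sketch checking this for a concrete code (the helper names are my own):

```python
from itertools import combinations

def is_prefix_free(codewords):
    # A code is prefix-free if no codeword is a prefix of another.
    return all(not a.startswith(b) and not b.startswith(a)
               for a, b in combinations(codewords, 2))

def kraft_sum(codewords):
    # Kraft's inequality: for a prefix-free binary code, this sum is <= 1.
    return sum(2.0 ** -len(w) for w in codewords)

code = ["0", "10", "110", "111"]
assert is_prefix_free(code)
print(kraft_sum(code))  # 0.5 + 0.25 + 0.125 + 0.125 = 1.0
```

A code that meets the bound with equality, like this one, is complete: every infinite binary sequence begins with exactly one codeword.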
Reading material
Wikipedia article on Kolmogorov complexity
Wikiwand article on Kolmogorov complexity
Peter Gács. Lecture notes on descriptional complexity and randomness. [Sections 1, 3]
A. Shen, V. A. Uspensky, N. Vereshchagin. Kolmogorov Complexity and Algorithmic Randomness. AMS, 2013. [Chapters 1, 2, 4]
R. G. Downey, D. R. Hirschfeldt. Algorithmic Randomness and Complexity. Springer, 2010. [Chapter 3]