In model theory, various notions of isolation (ways in which the full information of a type may be determined by a fragment of it) have been important tools in the analysis and classification of models of a theory. Meanwhile, much has been written in machine learning theory on sample compression schemes: ways to encode the information of a concept in a bounded part of a sample.
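As an illustration (not taken from the work described here), the classical toy example of a sample compression scheme is the class of threshold concepts on the real line, where any labelled sample can be compressed to at most one kept point. A minimal Python sketch, with hypothetical helper names:

```python
# Illustrative sketch: a size-1 sample compression scheme for threshold
# concepts c_t(x) = 1 iff x >= t on the real line. The function names
# `compress` and `reconstruct` are hypothetical, for illustration only.

def compress(sample):
    """Keep only the leftmost positively labelled point (or None)."""
    positives = [x for x, y in sample if y == 1]
    return min(positives) if positives else None

def reconstruct(kept):
    """Rebuild a consistent hypothesis from the single kept point."""
    return lambda x: kept is not None and x >= kept

# A sample labelled by the threshold t = 3:
sample = [(1, 0), (2, 0), (4, 1), (5, 1)]
h = reconstruct(compress(sample))
assert all(h(x) == bool(y) for x, y in sample)
```

The point of the scheme is that the reconstructed hypothesis agrees with the original labelling on the entire sample, even though only one point was retained.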

These related ideas came together in work of Chernikov and Simon on NIP and distal theories, where they found in particular that a theory is distal precisely when every type satisfies a certain isolation notion, termed "compressibility". A model-theoretically natural question to ask of such an isolation notion is whether isolated types are dense in the natural topological space of types.

Separately, a recent result of Chen, Cheng, and Tang provides a strong form of compression scheme for finite concept classes, with size bounded in terms of the VC dimension.

In joint work with Itay Kaplan and Pierre Simon, we give a certain generalisation of this last result to infinite concept classes, and use it to obtain the density of compressible types in countable NIP theories.

https://www.imj-prg.fr/spip.php?article189