Résumé | Joint work with Itay Kaplan and Pierre Simon.
Distal theories are NIP theories which are “wholly unstable”. Chernikov and Simon's “strong honest definitions” characterise distal theories as those in which every type is compressible. Adapting recent work in machine learning of Chen, Cheng, and Tang on bounds on the “recursive teaching dimension” of a finite concept class, we find that compressibility is dense in NIP structures, i.e. any formula can be completed to a compressible type in S(A). Considering compressibility as an isolation notion (which specialises to l-isolation in stable theories), we obtain consequences for the existence of models with certain properties.