Entropy of AI

Speaker

Antal Jakovác (ELTE TTK)

Description

When we understand a subject, we feel that we have put our previously messy thoughts in order. Since messiness is associated with high entropy, learning should, in some sense, mean entropy reduction, and a fully understood subject should be represented with the least entropy. To give these ideas a formal description, we need to speak about the different representations of a phenomenon and associate each representation with a given level of understanding. We can then define a representation entropy and prove that it is minimal in the best representation of the knowledge. In this talk, this program is demonstrated on a simple pattern recognition task: telling apart few-bit-long 'cat images' from 'non-cat images'.
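The intuition that a better representation carries less entropy can be illustrated with a minimal sketch. The toy dataset and the "cat" rule below are illustrative assumptions, not the speaker's actual construction: we treat 3-bit strings as images and compute the Shannon entropy of the raw representation versus a label-only representation.

```python
from collections import Counter
from math import log2

def shannon_entropy(samples):
    """Shannon entropy (in bits) of the empirical distribution of samples."""
    counts = Counter(samples)
    n = len(samples)
    return -sum(c / n * log2(c / n) for c in counts.values())

# Hypothetical toy dataset: all 3-bit "images"; we call an image a 'cat'
# iff its first bit is 1 (an arbitrary rule chosen for illustration).
images = [(1, 0, 1), (1, 1, 0), (1, 0, 0), (0, 1, 1),
          (0, 0, 1), (0, 1, 0), (1, 1, 1), (0, 0, 0)]

# Raw representation: every image is distinct, so the empirical
# distribution is uniform over 8 outcomes -> 3 bits of entropy.
raw_entropy = shannon_entropy(images)

# "Understood" representation: keep only the cat/non-cat label,
# discarding the task-irrelevant bits -> 1 bit of entropy.
labels = [img[0] for img in images]
label_entropy = shannon_entropy(labels)

print(raw_entropy, label_entropy)  # → 3.0 1.0
```

In this caricature, the representation that captures exactly the relevant distinction (cat vs. non-cat) has the lowest entropy, while the raw bit-level representation carries the full 3 bits of "messiness".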

Presentation materials