What does it mean to understand something?
If an agent produces a solution to a problem, does it necessarily follow that the agent understands it? Many people would say no. A chess computer excels at the problem of playing chess but has no understanding of what it does; it’s just an algorithm crunching numbers, after all, right?
To really understand a problem, from a human perspective, means not just being able to solve it but also being able to explain it to others. OK, so what does it mean to explain something? I have successfully explained something if another agent understands my explanation, meaning they are able to solve the explained problem as well (applying the transferred knowledge) and able to explain it successfully to other agents.
Can I explain without language? Can I explain without any prior beliefs about the knowledge possessed by the agent I am explaining to?