Your conclusion at the end that real AGI should not be expected to remember everything is in line with my understanding; I have no objection there. However, I think the "few essential bits and pieces of what we experience" and remember do constitute models. We perceive the world through direct perception, with the help of these pieces. These pieces might not be visual representations of the world (they can be relations, predictions, etc.), and thus are not technically models in the sense used by today's AI, but they are models nonetheless. Maybe our difference comes down to the definition of the word "model": while you do not think these essential bits constitute a "model", I do.