Interesting paper on how computer systems can be more precise than natural language
... and one must avoid using terms for processes and relationships in computer programming that suggest more than you actually wish to claim is true. It's generally a good paper, and I found this section particularly interesting, partly because I have studied the ecological theory of visual perception, which departs dramatically from the idea that we see the world as imagery.
The obsession with natural language seems to have caused the feeling that the human use of language is a royal road to the cognitive psyche. I find this analogous to preoccupation with imagery as a way of studying vision. Most AI researchers react with amusement to proposals to explain vision in terms of stored images, reducing the physical eye to the mind's eye. But many of the same people notice themselves talking to themselves in English, and conclude that English is very close to the language of thought.
Clearly, there must be some other notation, or we will have done for the ear what imagery theory does for the eye. No matter how fascinating the structure of consciousness is, it is dangerous to gaze too long into its depths. The puzzles we find there can be solved only by sneaking up on them from behind. As of now, we have no idea at all why people experience their thoughts the way they do, in pictures and words. It will probably turn out to be very different, even simpler, than what we think now, once we understand why and how people experience their thoughts at all.
—Drew McDermott, "Artificial Intelligence Meets Natural Stupidity", MIT AI Lab, Cambridge, Mass., April 1976