I saw a picture of the St. Louis Arch – an arc of nothing but light. It looked like it had been taken at night, but the caption said it was taken during the day: it was a picture of the sun reflecting off of the arch.

I realized that all we photograph is just light. We take pictures of light reflecting off of things, which is different from taking a picture of the actual thing itself.

We see light reflections as well. Nothing more than light, reflected.

I then realized (about a week later – it took some cogitating) that my view of the world – my internal map – is almost entirely visual. I judge my space, my situation, the hardness or softness of objects, the smiles of friends, by what I see with my eyes.

So I tried standing in a forest, focusing my eyes on one thing, and listening. I heard birds. I heard a woodpecker about 20 feet away off to my left. Then I realized there were two – it sounded like a child and parent. Behind me was a cardinal. A blue jay flew overhead. Something I couldn’t identify was to the right. Maybe a mockingbird was around somewhere too.

I realized I could draw a different type of map in my head based on the sounds I heard. I think, in fact I strongly suspect, that birds hold these maps. They are constantly aware of the company around them, without visual confirmation. They hear the calls of the other birds, mark their locations in their heads, and so have a map.
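To make that idea concrete for myself, here is a small sketch in Python of what such a map might look like if you wrote it down as data. The species, bearings, and distances are just my guesses from the forest scene above, not measurements, and the names are mine – a toy illustration, not a claim about how a bird's brain actually stores it.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class HeardBird:
    species: str        # who I think it is
    bearing_deg: float  # 0 = straight ahead, 90 = to my right, 270 = to my left
    distance_m: float   # a rough guess from how loud the call is

@dataclass
class SoundMap:
    """A picture of the surroundings built only from what has been heard."""
    birds: List[HeardBird] = field(default_factory=list)

    def hear(self, species: str, bearing_deg: float, distance_m: float) -> None:
        # Each call that is heard gets marked on the map.
        self.birds.append(HeardBird(species, bearing_deg, distance_m))

    def company(self) -> List[str]:
        return [f"{b.species}: {b.bearing_deg:.0f} degrees, about {b.distance_m:.0f} m"
                for b in self.birds]

# The forest scene above, written down as a sound map:
forest = SoundMap()
forest.hear("woodpecker", 270, 6)   # about 20 feet, off to my left
forest.hear("woodpecker", 265, 7)   # the second one, child or parent
forest.hear("cardinal", 180, 12)    # behind me
forest.hear("blue jay", 0, 20)      # flew overhead
forest.hear("unknown", 90, 15)      # something to the right
for line in forest.company():
    print(line)
```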

Dolphins use echolocation. They make high-pitched whistles and clicking noises. Their teeth are offset, so the uppers do not line up with the lowers, and the teeth are thought to help receive the returning sounds. Based on the slight differences in arrival time between upper and lower, left and right, a dolphin can tell where an object is and how far away it is. In a dolphin’s head, there is a map of the world that is built on vibrations.
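Here is that idea reduced to arithmetic – a rough sketch, not a real model of dolphin hearing. A round-trip time gives the distance, and the tiny difference in arrival time between two receiving points gives the direction. The speed of sound, the receiver spacing, and the timing numbers below are illustrative values I chose, not measurements.

```python
import math

SPEED_OF_SOUND_WATER = 1500.0  # m/s, a rough value for seawater

def locate_echo(round_trip_s: float, delta_t_right_minus_left_s: float,
                receiver_spacing_m: float):
    """Estimate range and bearing of an object from a returning click.

    round_trip_s: time from emitting the click to hearing its echo
    delta_t_right_minus_left_s: how much earlier the echo reached the right
        side than the left (positive means the object is off to the right)
    receiver_spacing_m: distance between the two receiving points
    """
    # The click travels out and back, so the one-way distance is half the trip.
    distance_m = SPEED_OF_SOUND_WATER * round_trip_s / 2.0

    # The arrival-time difference fixes the bearing:
    # sin(bearing) = speed * delta_t / spacing, clamped to a valid range.
    ratio = SPEED_OF_SOUND_WATER * delta_t_right_minus_left_s / receiver_spacing_m
    ratio = max(-1.0, min(1.0, ratio))
    bearing_deg = math.degrees(math.asin(ratio))
    return distance_m, bearing_deg

# An echo that comes back after 20 ms and hits the right side 30 microseconds
# early works out to roughly 15 m away, about 8.6 degrees to the right:
print(locate_echo(0.020, 30e-6, 0.3))
```

Change the timing numbers and the point moves on the map – that, in miniature, is a map built on vibrations.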