Arrival is perhaps my favorite film, and while I was already familiar with the Wolfram documentary and the original Ted Chiang story, I am always excited to revisit these ideas and learn more about the interplay of language and cognition. I was impressed by the way the Electric Didact dissected this concept in the film and tied it back to the root of the very word “understand.”
Even more interesting to me is what happens when we try to use language to express what we see, translating visual cognition into audible expression and back again (as we aim to do with our projects in this course).
This idea reminds me of a study which found that Russian speakers, whose language has separate words for light and dark blue, were quicker than English speakers to tell two subtly different shades apart, an indication that language affects visual perception right here on our own blue planet.
Article from the Proceedings of the National Academy of Sciences
At the other end of the same cycle, Vox did an interesting piece on how words for color evolve across cultures: languages almost always begin with terms for just light and dark, then add red before blue and green.
I’m curious whether there is any way to actively exploit the interconnection between visual and linguistic cognition in interface design, or to create new connections by building a vocabulary that maps optical cues onto concepts with no natural representation in the visual spectrum.
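To make that last idea a little more concrete, here is a minimal, purely hypothetical sketch of what such a “visual vocabulary” might look like in code: an invented mapping from non-visual concepts (urgency, staleness of data) to optical cues (hue, saturation, pulse rate) that an interface could apply consistently. All of the names and values below are my own placeholders, not any existing library or design system.

```typescript
// Hypothetical "visual vocabulary": each non-visual concept gets a fixed
// optical cue, so the mapping can be learned the way a word is learned.
type Concept = "urgent" | "stale" | "fresh" | "uncertain";

interface OpticalCue {
  hue: number;        // degrees on the HSL color wheel
  saturation: number; // 0–100 (%)
  pulseHz: number;    // 0 = steady, >0 = gentle pulsing animation
}

// The vocabulary itself: invented values, chosen only to illustrate a
// consistent concept -> cue mapping rather than ad-hoc styling per screen.
const vocabulary: Record<Concept, OpticalCue> = {
  urgent:    { hue: 0,   saturation: 90, pulseHz: 1.5 }, // saturated red, pulsing
  stale:     { hue: 220, saturation: 15, pulseHz: 0   }, // washed-out blue-grey
  fresh:     { hue: 140, saturation: 70, pulseHz: 0   }, // vivid green
  uncertain: { hue: 45,  saturation: 60, pulseHz: 0.5 }, // amber, slow pulse
};

// Turn a cue into a CSS color string an interface could actually apply.
function cueToCss(cue: OpticalCue): string {
  return `hsl(${cue.hue} ${cue.saturation}% 50%)`;
}

// Example: tagging a piece of data with a concept and rendering its cue.
const reading = { label: "sensor 7", concept: "stale" as Concept };
const cue = vocabulary[reading.concept];
console.log(`${reading.label}: ${cueToCss(cue)}, pulse ${cue.pulseHz} Hz`);
```

The point is less the specific values than the consistency: if the same cue always means the same concept, the hope is that users eventually read “staleness” off a screen as directly as they read color, a bit like the linguistic shortcut in the Russian blues study.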