Toward the end of The Simpsons’ golden age, one episode sent the titular family off to Japan, not without resistance from its famously lazy patriarch. “Come on, Homer,” Marge insists, “Japan will be fun! You liked Rashomon.” To which Homer naturally replies, “That’s not how I remember it!” This joke must have written itself, working not as a high-middlebrow cultural reference (in the way that, say, Frasier would later name-check Tampopo) but as a play on a universally understood byword for the nature of human memory. Even those of us who’ve never seen Rashomon, the period crime drama that made its director Akira Kurosawa a household name in the West, know what its title represents: the tendency of each human being to remember the same event in his own way.
“A samurai is found dead in a quiet bamboo grove,” says the narrator of the animated TED-Ed lesson above. “One by one, the crime’s only known witnesses recount their version of the events that transpired. But as they each tell their tale, it becomes clear that every testimony is plausible, yet different, and each witness implicates themselves.”
So goes “In a Grove,” a story by celebrated early 20th-century writer Ryūnosuke Akutagawa. An avid reader, Kurosawa combined that literary work with another of Akutagawa’s to create the script for Rashomon. Both Akutagawa and Kurosawa “use the tools of their media to give each character’s testimony equal weight, transforming each witness into an unreliable narrator.” Neither reader nor viewer can trust anyone — nor, ultimately, can they arrive at a defensible conclusion as to the identity of the killer.
Such conflicts of memory and perception occur everywhere in human affairs: this TED-Ed lesson finds examples in biology, anthropology, politics, and media. So many psychological phenomena converge to produce the Rashomon effect that it seems almost overdetermined; the more illuminating question may be under what conditions it doesn’t occur. But it also makes us ask even tougher questions: “What is truth, anyway? Are there situations when an objective truth doesn’t exist? What can different versions of the same event tell us about the time, place, and people involved? And how can we make group decisions if we’re all working with different information, backgrounds, and biases?” We seem no closer to definitive answers than we were when Rashomon came out more than 70 years ago — only one of the reasons the film still holds up so well today.
In an old Zen story, two monks argue over whether a flag is waving or whether it’s the wind that waves. Their teacher strikes them both dumb, saying, “It is your mind that moves.” The centuries-old koan illustrates a point Zen masters — and later philosophers, psychologists, and neuroscientists — have all emphasized at one time or another: human experience happens in the mind, but we share reality through language and culture, and these in turn set the terms for how we perceive what we experience.
Such observations bring us to another koan-like question: if a language lacks a word for something like the color blue, can the thing be said to exist in the speaker’s mind? We can dispense with the idea that there’s a color blue “out there” in the world. Color is a collaboration between light, the eye, the optic nerve, and the visual cortex. And yet, claims Maria Michela Sassi, professor of ancient philosophy at the University of Pisa, “every culture has its own way of naming and categorizing colours.”
The most famous example comes from the ancient Greeks. Since the 18th century, scholars have pointed out that in the thousands of words in the Iliad and Odyssey, Homer never once describes anything — sea, sky, you name it — as blue. It wasn’t only the Greeks who didn’t see blue, or didn’t see it as we do, Sassi writes:
There is a specific Greek chromatic culture, just as there is an Egyptian one, an Indian one, a European one, and the like, each of them being reflected in a vocabulary that has its own peculiarity, and not to be measured only by the scientific meter of the Newtonian paradigm.
It was once thought that cultural color differences had to do with stages of evolutionary development — that more “primitive” peoples had a less developed biological visual sense. But differences in color perception are “not due to varying anatomical structures of the human eye,” writes Sassi, “but to the fact that different ocular areas are stimulated, which triggers different emotional responses, all according to different cultural contexts.”
As the AsapSCIENCE video above explains, the evidence of ancient Greek literature and philosophy shows that since blue was not part of Homer and his audience’s shared vocabulary (yellow and green do not appear either), it may not have been part of their perceptual experience, either. That blue spread across the world only relatively recently has to do with its availability. “If you think about it,” writes Business Insider’s Kevin Loria, “blue doesn’t appear much in nature — there aren’t blue animals, blue eyes are rare, and blue flowers are mostly human creations.”
The color blue took hold in modern times with the development of substances that could act as blue pigment, like Prussian Blue, invented in Berlin, manufactured in China, and exported to Japan in the 19th century. “The only ancient culture to develop a word for blue was the Egyptians — and as it happens, they were also the only culture that had a way to produce a blue dye.” Color is not only cultural; it is also technological. And before either, perhaps, it is linguistic.
One modern researcher, Jules Davidoff, found this to be true in experiments with a Namibian people whose language makes no distinction between blue and green (but names many finer shades of green than English does). “Davidoff says that without a word for a colour,” Loria writes, “without a way of identifying it as different, it’s much harder for us to notice what’s unique about it.” Unless we’re color blind, we all “see” the same things when we look at the world because of the basic biology of human eyes and brains. But whether certain colors stand out, it seems, has less to do with what we see than with what we’re already primed to expect.
Why do people play video games, and what keeps them playing? Do we want to have to think through innovative puzzles or just lose ourselves in mindless reactivity? Your hosts Mark Linsenmayer, Erica Spyres, and Brian Hirt are joined by Dr. Jamie Madigan, an organizational psychologist who runs the Psychology of Video Games podcast, to discuss what sort of a thing this is to research, the evolution of games, player types, motivation vs. engagement, incentives and feedback, as well as the gamification of work or school environments. Some games we touch on include Donkey Kong, Dark Souls, It Takes Two, Returnal, Hades, Subnautica, Fortnite, and Age of Z.
Back in 2017, Ray Dalio published Principles: Life and Work, a bestselling book where the creator of the world’s largest hedge fund shared “the unconventional principles that he’s developed, refined, and used over the past forty years to create unique results in both life and business.” You can find a distilled version of those unconventional principles in a 30-minute animation video previously featured on our site.
According to psychologist Brian Little, “PrinciplesYou was developed over a two-year intensive and creative R&D process with two goals in mind. First, it measures traits that Ray Dalio and his team have observed and studied for many years as critical for personal and organizational success. Second, it is based on the latest research in personality science. The assessment provides a person’s score on a comprehensive set of traits, their underlying facets and interactive patterns, and it shows high reliability (internal structure, re-test) and validity for these traits and facets. A distinctive strength is its ability to predict an extraordinary array of actual behaviors observed by the Bridgewater staff over many years.”
Adam Grant adds: “To achieve success, you need to know yourself and the people around you. Although your car comes with an owner’s manual, your mind doesn’t—and neither do your colleagues. We designed PrinciplesYou to help you gain the self-awareness and other-awareness that are critical to making good decisions, getting things done, and turning a group of coworkers into a great team.”
The technology we put between ourselves and others tends to create additional strains on communication, even as it enables near-constant, instant contact. When it comes to our now-primary mode of interacting — staring at each other as talking heads or Brady Bunch-style galleries — those stresses have been identified by communication experts as “Zoom fatigue,” now a subject of study among psychologists who want to understand our always-connected-but-mostly-isolated lives in the pandemic, and a topic for Today show segments like the one above.
As Stanford researcher Jeremy Bailenson vividly explains to Today, Zoom fatigue refers to the burnout we experience from interacting with dozens of people for hours a day, months on end, through pretty much any video conferencing platform. (But, let’s face it, mostly Zoom.) We may be familiar with the symptoms already if we spend some part of our day on video calls or lessons. Zoom fatigue combines the problems of overwork and technological overstimulation with unique forms of social exhaustion that do not plague us in the office or the classroom.
Bailenson, director of Stanford University’s Virtual Human Interaction Lab, refers to this kind of burnout as “Nonverbal Overload,” a collection of “psychological consequences” from prolonged periods of disembodied conversation. He has been studying virtual communication for two decades and began writing about the current problem in April of 2020 in a Wall Street Journal op-ed that warned, “software like Zoom was designed to do online work, and the tools that increase productivity weren’t meant to mimic normal social interaction.”
Now, in a new scholarly article published in the APA journal Technology, Mind, and Behavior, Bailenson elaborates on the argument with a focus on Zoom, not to “vilify the company,” he writes, but because “it has become the default platform for many in academia” (and everywhere else, perhaps its own form of exhaustion). The constituents of nonverbal overload include gazing into each others’ eyes at close proximity for long periods of time, even when we aren’t speaking to each other.
Anyone who speaks for a living understands the intensity of being stared at for hours at a time. Even when speakers see virtual faces instead of real ones, research has shown that being stared at while speaking causes physiological arousal (Takac et al., 2019). But Zoom’s interface design constantly beams faces to everyone, regardless of who is speaking. From a perceptual standpoint, Zoom effectively transforms listeners into speakers and smothers everyone with eye gaze.
On Zoom, we also have to expend much more energy to send and interpret nonverbal cues, and without the context of the room outside the screen, we are more apt to misinterpret them. Depending on the size of our screen, we may be staring at each other as larger-than-life talking heads, a disorienting experience for the brain and one that lends more impact to facial expressions than may be warranted, creating a false sense of intimacy and urgency. “When someone’s face is that close to ours in real life,” writes Vignesh Ramachandran at Stanford News, “our brains interpret it as an intense situation that is either going to lead to mating or to conflict.”
Unless we turn off the view of ourselves on the screen — which we generally don’t do because we’re conscious of being stared at — we are also essentially sitting in front of a mirror while trying to focus on others. The constant self-evaluation adds an additional layer of stress and taxes the brain’s resources. In face-to-face interactions, we can let our eyes wander, even move around the room and do other things while we talk to people. “There’s a growing research now that says when people are moving, they’re performing better cognitively,” says Bailenson. Zoom interactions, conversely, can inhibit movement for long periods of time.
“Zoom fatigue” may not be as dire as it sounds; it may simply be among the inevitable trials of a transitional period, Bailenson suggests. He offers solutions we can implement now: using the “hide self-view” button, muting our video regularly, setting up the technology so that we can fidget, doodle, and get up and move around…. Not all of these are going to work for everyone — we are, after all, socialized to sit and stare at each other on Zoom; refusing to participate might send unintended messages we would have to expend more energy to correct. Bailenson further describes the phenomenon in the BBC Business Daily podcast interview above.
“Videoconferencing is here to stay,” Bailenson admits, and we’ll have to adapt. “As media psychologists it is our job,” he writes to his colleagues in the new article, to help “users develop better use practices” and help “technologists build better interfaces.” He mostly leaves it to the technologists to imagine what those are, though we ourselves have more control over the platform than we collectively acknowledge. Could we maybe admit, Bailenson writes, that “perhaps a driver of Zoom fatigue is simply that we are taking more meetings than we would be doing face-to-face”?
Fledgling animators may feel as if they’ve swallowed a stone: “no matter how hard I try, nothing I make will approach the beauty on display here.”
Sticklers—and there are plenty leaving comments on YouTube—may be irritated to realize that it’s actually not 30 but 6 minutes of visuals, looped 5 times.
Insomniacs (such as this reporter) may wish there were more looping and less content. The selected scenery is tranquil enough, but the clips themselves are brief, leading to some jarring transitions.
(One possible workaround for those hoping to lull themselves to sleep: fiddle with the speed settings. Played at .25 speed and muted, this compilation becomes very relaxing, much like artist Douglas Gordon’s video installation, 24 Hour Psycho. Leave the sound up, though, and the lapping waves, gentle winds, and chuffing trains turn into something worthy of a slasher flick.)
Finally, with so much attention focused on Mars these days, we can’t help imagining what alien life forms might make of these earthly visions—ahh, this green, sheep-dotted pasture does lower my stress level…wait, WTF was THAT!? Nothing on my home planet prepared me for the possibility of a monstrous winged house composed of overgrown bagpipes and chicken legs lumbering through the countryside!
We concede that 30 Minutes of Relaxing Visuals from Studio Ghibli is a pleasant thing to have playing in the background as we wait for COVID restrictions to be lifted… but ultimately, you may find these 36 minutes of music from Studio Ghibli films more genuinely relaxing.
We moderns might wonder what ancient peoples did when not hunting, gathering, and reproducing. The answer is that they did mushrooms, at least according to one interpretation of cave paintings at Tassili n’Ajjer in Algeria, some of which go back 9,000 years. “Here are the earliest known depictions of shamans with large numbers of grazing cattle,” writes ethnobotanist/mystic Terence McKenna in his book Food of the Gods: The Search for the Original Tree of Knowledge. “The shamans are dancing with fists full of mushrooms and also have mushrooms sprouting out of their bodies. In one instance they are shown running joyfully, surrounded by the geometric structures of their hallucinations. The pictorial evidence seems incontrovertible.”
McKenna wasn’t the only scholar of the psychedelic experience to take an interest in Tassili. Giorgio Samorini had written about its ancient paintings a few years before, focusing on one that depicts “a series of masked figures in line and hieratically dressed or dressed as dancers surrounded by long and lively festoons of geometrical designs of different kinds.” Each dancer “holds a mushroom-like object in the right hand,” but the key visual element is the parallel lines that “come out of this object to reach the central part of the head of the dancer.” These “could signify an indirect association or non-material fluid passing from the object held in the right hand and the mind,” an interpretation in line with the idea of “the universal mental value induced by hallucinogenic mushrooms and vegetals, which is often of a mystical and spiritual nature.”
The U.S. Forest Service acknowledges Tassili as “the oldest known petroglyph depicting the use of psychoactive mushrooms,” adding the postulate that “the mushrooms depicted on the ‘mushroom shaman’ are Psilocybe mushrooms.” That name will sound familiar to 21st-century consciousness-alteration enthusiasts, some of whom advocate for the use of psilocybin, the psychedelic compound that occurs in such mushrooms, as not just a recreational drug but a treatment for conditions like depression. Cave art like Tassili’s suggests that such instrumental uses of hallucinogenic plants — as vital parts of rituals, for example — may stretch all the way back to the Neolithic era, when last the Sahara desert was a relatively verdant savanna rather than the vast expanse of sand we know today.
A sense of continuity with the practices of these long-ago predecessors — ancient Egyptians to the ancient Egyptians, as one Redditor frames it — must enrich mushroom use for many psychonauts today. And indeed, the “bee-headed shaman” and his compatriots have had a robust cultural afterlife: “A popularly published drawing based on one of the Tassili figures has become an icon of post-1990’s psychedelia,” says Brian Akers of Mushroom: The Journal of Wild Mushrooming. The “abstract-bizarre” style of its images has also put it “among the sites favored by ancient ET theorizing.” However rich the visions experienced by the cave-painters who once lived there, surely none could have been as mind-blowing as the idea that their work would still fire up imaginations nine millennia later.