Empathy, compassion and gratitude — these traits don’t usually spring to mind when you think about Darwinism and natural selection. No, your mind more immediately drifts toward anti-social characteristics like competition, survival of the fittest, and selfishness (as in the “selfish gene”). But above, on the first day of 2015, UC Berkeley psychologist Dacher Keltner reminds us that evolution can bring out the best in us, and Darwin recognized that. As Darwin wrote in The Descent of Man, the strengthening of our capacity for “sympathy” played a central role in human evolution:
With mankind, selfishness, experience, and imitation, probably add … to the power of sympathy; for we are led by the hope of receiving good in return to perform acts of sympathetic kindness to others; and sympathy is much strengthened by habit. In however complex a manner this feeling may have originated, as it is one of high importance to all those animals which aid and defend one another, it will have been increased through natural selection; for those communities, which included the greatest number of the most sympathetic members, would flourish best, and rear the greatest number of offspring.
“Smartphones and laptops seem so ubiquitous to us all,” writes experience designer Jinsoo An. “But in reality, the ubiquitousness we experience every day is based on a series of learned behaviors. Someone once said that, ‘The only intuitive interface is the nipple. Everything else is learned.’ ” This, he points out, holds for the simple magazine as much as it does for the computer mouse — a device which certain generations use even more intuitively than they do anything involving the printed word. But just a few years before it achieved ubiquity, many computer users found the mouse hardly intuitive at all. “If you can point, you can use a Macintosh,” insisted an early Apple ad for that innovative desktop computer.
If, convinced, you went on to buy a Mac of your own, you received with it a printed manual including a section explaining the mechanics of mouse usage. “Every move you make with the mouse moves the pointer in exactly the same way,” goes one of its sentences that would now seem comically unnecessary. “Usually the pointer is shaped like an arrow, but it changes shape depending on what you’re doing.” And for those who found the book too intimidating, Apple also included a cassette tape containing a production called “A Guided Tour of Macintosh,” in which friendly voices explain such important subjects as “Mousing Around,” “What’s the Finder?,” and “Why Do I Have Windows?” to a soundtrack by artists from the powerhouse new-age music label Windham Hill.
An’s post includes the audio of this techno-educational journey, and at the top of the post you can watch it synchronized with video of the accompanying application that came onboard the computer. We can all have a good laugh at this sort of thing now that we’ve fully internalized once-confusing concepts like windows, the finder, and the mouse — but isn’t it more startling, in this era when so few people even consider reading manuals that many companies seem to have stopped printing them entirely, to imagine anyone, before they dare use their new computer, popping in a tape?
The common conception of New Year’s resolutions frames them as disposable ideals, not to be taken too seriously or followed through past the first few months of winter; by spring, we all assume, we’ll be right back to our slothful, gluttonous ways. Perhaps the problem lies in the way we approach this yearly ritual. Lists of the most common resolutions tend towards the almost shockingly banal, such that most people’s desires for change are interchangeable with those of their friends and neighbors and might as well be scripted by greeting card companies. I’d hazard it’s impossible to be passionate about half-thoughts and boilerplate ambition.
But there are those few people who really pour their hearts into it, creating lists so individualized and authentic that the documents expose their inner lives, their hopes, fears, loves, struggles, and deep, personal yearnings and aspirations. One such list that circulates often, and which we featured last year, is this gem from Woody Guthrie circa 1943. It’s so completely him, so much in his voice, that no one else could have written it, even in parody. This year, we direct your attention to the list above, from Marilyn Monroe, written at the end of 1955 when the star was 29.
Already well-known for her acting in such fine films as All About Eve, Gentlemen Prefer Blondes, and The Seven Year Itch, Monroe had recently been accepted to Lee Strasberg’s Actors Studio. As Lists of Note puts it, “judging by this list, she was determined to make the most of her opportunities.” I’m not sure what to make of the odd use of random letters at the beginning of each resolution, but what the list does offer us is a glimpse into Monroe’s deep commitment—despite her feeling that her life was “miserable”—to growing and developing as an actor and a person.
See a full transcript of her list of resolutions below.
Must make effort to do
Must have the dicipline to do the following –
z – go to class – my own always – without fail
x – go as often as possible to observe Strassberg’s other private classes
g – never miss actor’s studio sessions
v – work whenever possible – on class assignments – and always keep working on the acting exercises
u – start attending Clurman lectures – also Lee Strassberg’s directors lectures at theater wing – enquire about both
l – keep looking around me – only much more so – observing – but not only myself but others and everything – take things (it) for what they (it’s) are worth
y – must make strong effort to work on current problems and phobias that out of my past has arisen – making much much much more more more more more effort in my analisis. And be there always on time – no excuses for being ever late.
w – if possible – take at least one class at university – in literature –
o – follow RCA thing through.
p – try to find someone to take dancing from – body work (creative)
t – take care of my instrument – personally & bodily (exercise)
try to enjoy myself when I can – I’ll be miserable enough as it is.
I often say that, if you want to vastly overestimate your own capabilities, you need only do one of two things: (a) get coked out of your mind, or (b) get behind the wheel of a car. But what if the problem runs deeper in humanity than that? Indeed, what if our inability to perceive our own incompetence exactly matches the degree of the incompetence itself? Now, none of us can do everything well, but we’ve all met people who, even well outside of the contexts of drugs or driving, simply cannot grasp the full extent of how much they can’t do well. “The problem with people like this is that they are so stupid,” explains Monty Python’s John Cleese in the clip above, “they have no idea how stupid they are.”
“In order to know how good you are at something requires exactly the same skills as it does to be good at that thing in the first place,” Cleese elaborates, “which means — and this is terribly funny — that if you are absolutely no good at something at all, then you lack exactly the skills you need to know that you are absolutely no good at it.” With that, he gives us an extremely brief introduction to the Dunning–Kruger effect, “a cognitive bias wherein unskilled individuals suffer from illusory superiority, mistakenly rating their ability much higher than is accurate” owing to “a metacognitive inability of the unskilled to recognize their ineptitude” (and, by the same token, of “highly skilled individuals to underestimate their relative competence, erroneously assuming that tasks which are easy for them are also easy for others”).
The effect takes its name from Cornell University researchers Justin Kruger and David Dunning, the latter of whom Cleese, who has spent time at Cornell as a long-term visiting professor (where he has, among other projects, taken part in a talk about creativity, group dynamics and celebrity), counts as a friend. He originally invoked Dunning and Kruger’s “wonderful bit of research” in the video “John Cleese Considers Your Futile Comments,” where he talks back to YouTube commenters on Monty Python videos — in this case, those who mentioned the names of certain political commentators beneath the 1970 sketch “Upperclass Twit of the Year.” “This explains not just Hollywood,” Cleese concludes, “but almost the entirety of Fox News.”
Those of you interested in both cognitive phenomena and conservative American political figures will surely have seen Gates of Heaven and A Brief History of Time documentarian Errol Morris’ most recent film The Unknown Known, a long-form conversation with former U.S. Secretary of Defense Donald Rumsfeld. In the years before its release, Morris wrote a five-part series for the New York Times called “The Anosognosic’s Dilemma,” fueled not just by his fascination with Rumsfeld but by his near-obsession with the Dunning–Kruger effect. In it, he actually interviews Dunning himself, who summarizes the issue thus: “We’re not very good at knowing what we don’t know.”
Dunning even brings up the subject of Rumsfeld first, specifically his speech on “unknown unknowns” that gave Morris’ movie its title: “It goes something like this: ‘There are things we know we know about terrorism. There are things we know we don’t know. And there are things that are unknown unknowns. We don’t know that we don’t know.’ He got a lot of grief for that. And I thought, ‘That’s the smartest and most modest thing I’ve heard in a year.’ ” When Morris followed up, Dunning added that “the notion of unknown unknowns really does resonate with me, and perhaps the idea would resonate with other people if they knew that it originally came from the world of design and engineering rather than Rumsfeld.” Or maybe they could associate it with the Ministry of Silly Walks instead.
If you would like to sign up for Open Culture’s free email newsletter, please find it here. It’s a great way to see our new posts, all bundled in one email, each day.
If you would like to support the mission of Open Culture, consider making a donation to our site. It’s hard to rely 100% on ads, and your contributions will help us continue providing the best free cultural and educational materials to learners everywhere. You can contribute through PayPal, Patreon, and Venmo (@openculture). Thanks!
What is “Philosophy”? Yes, we know, the word comes from the Greek philosophia, which means “the love of wisdom.” This rote etymological definition does little, I think, to enhance our understanding of the subject, though it may describe the motivation of many a student. Maybe philosophy, like certain diseases, is a spectrum, a collection of loosely related behaviors. Maybe a better question would be, “what are all the symptoms of this thing we call philosophy?” The medical metaphor is timely. We live in an age when the discipline of philosophy, like many of the humanities, gets treated like a pathology, in universities and in the wider culture. See, for example, popular articles on whether science has rendered philosophy (and religion) obsolete. There seems to be an underlying assumption in our society that philosophy is something to be eradicated, like smallpox.
Perhaps this sort of thing is just an empty provocation; after all, many logical positivists of the early 20th century also claimed to have invalidated large areas of philosophical inquiry by banishing every unclear concept to the dustbin. And yet, philosophy persists, infecting us with its relentless drive to define, inquire, critique, systematize, problematize, and deconstruct.
And of course, in a less technical sense, philosophy infects us with the drive to wonder. Without its tools, I maintain, we would not only lack the basis for understanding the world we live in, but we would also lack important means of imagining, and creating, a better one. If this sounds grandiose, wait till you encounter the thought of Plato, Spinoza, Hegel, Kant, Nietzsche, Kierkegaard, and jazz-futurist Sun Ra—all unaccustomed to thinking small and staying in their lane.
Some philosophers are more circumspect, some more precise, some more literary and imaginative, some more practical and technologically inclined. Like I said, many symptoms, one disease.
We at Open Culture have compiled a list of 140 free philosophy courses from as much of the wide spectrum as we could, ranging from University of Chicago’s Leo Strauss on Aristotle’s Ethics (Free Online Audio) and Plato’s Laws (Free Online Audio) to Columbia University Buddhist scholar Robert Thurman (Uma’s dad) on “The Central Philosophy of Tibet” (Free Online Audio). We have specific courses on Medical Ethics, taught by Notre Dame’s David Solomon (Free Online Audio) and the University of New Orleans’ Frank Schalow (Free iTunes Audio). We have hugely general courses like “The History of Philosophy Without Any Gaps,” from King’s College’s Peter Adamson (Free Course in Multiple Formats). We have philosophy courses on death, love, religion, film, law, the self, the ancients and the moderns…. See what I mean about the spectrum?
Perhaps philosophy incurs resentment because it roams at large and won’t be packaged into neatly salable—or jailable—units. Perhaps its amorphous nature, its tolerance of uncertainty and doubt, makes some kinds of people uncomfortable. Or perhaps some think it’s too abstruse and difficult to make sense of, or to matter. Not so! Visit our list of 140 philosophy courses and you will surely find a point of entry somewhere. One class will lead to another, and another, and before you know it, you’ll be asking questions all the time, of everything, and thinking rigorously and critically about the answers, and… well, by then it may be too late for a cure.
At the start of 2014, Edge.org posed its annual question to 176 scientific minds: “What Scientific Idea is Ready for Retirement?” The question (as we noted in January) came prefaced by this thought:
Science advances by discovering new things and developing new ideas. Few truly new ideas are developed without abandoning old ones first. As theoretical physicist Max Planck (1858–1947) noted, “A new scientific truth does not triumph by convincing its opponents and making them see the light, but rather because its opponents eventually die, and a new generation grows up that is familiar with it.” In other words, science advances by a series of funerals. Why wait that long?
As is its custom, Edge initially gathered and published the responses (in text format) from thinkers like Steven Pinker, Kevin Kelly, Sherry Turkle, Robert Sapolsky, and Daniel Dennett. Now, as the sun sets on 2014, filmmaker Jesse Dylan has created a four-minute film based on the project, featuring some of the same figures mentioned above. Watch it up top.
In a few short weeks, we’ll bring you the Edge question of 2015.
Earlier this year, Colin Marshall told you how “Chess has obsessed many of humanity’s finest minds over centuries and centuries and Marcel Duchamp seems to have shown little resistance to its intellectual and aesthetic pull.” His passion for the game (which he describes above) led him to design a now iconic Art Deco chess set, to print an array of chess tournament posters, and to become a pretty adept chess player himself, eventually earning the title of “grand master.” In a neat project, Scott Kildall has looked back at records of Duchamp’s chess matches and created a computer program that lets you play against a “Duchampian ghost.” Just click here, and then click on the chess piece you want to move. It will turn green, and then you can move it with your trackpad or mouse. Enjoy.
If you heard Sun Ra’s Christmas-day radio broadcast of poetry and music we featured on, well, Christmas day, perhaps it inspired you to create something — music, poetry, radio — yourself. More than twenty years after his death, the flamboyant jazz visionary continues to inspire all kinds of creative acts on the part of his listeners. Surely he played no small part in motivating the production of Big Music, Little Musicians, an album by the fourth‑, fifth‑, and sixth-graders of music teacher Randy Porter’s classes at Chabot, Montclair, and Thornhill elementary schools in Oakland, California. The album offers not just 43 (!) compositions by these elementary schoolers, but, 42 tracks in, their interpretation of Sun Ra’s “Planet Earth” (in its original form the opening cut from 1966’s Sun Ra and His Solar Arkestra Visits Planet Earth):
You can hear the entirety of this out-of-print 1994 release (incidentally, the year after Sun Ra took his leave of planet Earth) at Ubuweb. “With as little as a couple months of experience under their belts,” say the notes there, the ten‑, eleven‑, and twelve-year-old students “are encouraged to improvise and compose and this disc documents it.” And admittedly, “while some may cringe at some of the technical problems young, inexperienced players are bound to have, the creativity exhibited is undeniable. It is also refreshing to hear such unabashed, egoless joy as we have here. Many a seasoned player could stand to give this a listen.” It puts me in mind of not just the grade-schoolers who sang David Bowie’s “Space Oddity” but also the Portsmouth Sinfonia, an amateur orchestra at the Portsmouth School of Art that compensated for each member’s shaky grasp of their instrument (its ranks at one point including none other than Brian Eno on clarinet) with its sheer size and the famousness of its selections.
Ghost Train
Tom-Foolery
Help I’m Drowning In A Sea Of Harmony
Just above, you can hear a few original cuts of intriguingly named big music from these little musicians: “Ghost Train,” “Tom Foolery,” and “Help! I’m Drowning in a Sea of Harmony.” Seeing as these kids would be the same age as me today, it would certainly interest me to hear how they’ve turned out; such an early and strong dose of Sun Ra certainly couldn’t make one’s life less interesting.
“When they study our civilization two thousand years from now, there will only be three things that Americans will be known for: the Constitution, baseball and jazz music. They’re the three most beautiful things Americans have ever created.” — Gerald Early talking to Ken Burns.
In this clip unearthed by the Smithsonian earlier this year, we find two great American traditions intertwined — baseball and jazz. As John Edward Hasse explains in his online essay, jazz and baseball grew up together. According to some, the first documented use of the word “jazz” came from a 1913 newspaper article in which a reporter, writing about the San Francisco Seals minor league team, said that “The poor old Seals have lost their ‘jazz’ and don’t know where to find it,” adding: “It’s a fact … that the ‘jazz,’ the pepper, the old life, has been either lost or stolen, and that the San Francisco club of today is made up of jazzless Seals.” Or, if you listen to this public radio report, another use of the word can be traced back to 1912. That’s when a washed-up pitcher named Ben Henderson claimed that he had invented a new pitch — the “jazz ball.”
During the Swing Era, jazz musicians often took a keen interest in baseball. Writes Ryan Whirty in Offbeat, Louis Armstrong’s “passion for America’s pastime was so intense that, in the early ’30s, he owned his own team, the Secret Nine, in his hometown of New Orleans, even decking the players out in the finest, whitest uniforms ever seen on the sandlots of the Big Easy.” (See them in the photo above.) And then other band leaders like Benny Goodman, Count Basie, Tommy Dorsey, and Duke Ellington formed baseball teams with members of their groups.
Above, you can watch Ellington playing ball in some home videos, both hitting and pitching. When the Duke was a kid, he imagined himself becoming a professional baseball player one day. But the youngster eventually got hit in the head with a bat during a game, and that’s where his baseball career ended. He later noted, “The mark is still there, but I soon got over it. With that, however, my mother decided I should take piano lessons.”
Note: The Duke Ellington Center writes on YouTube that “The appearance of Ben Webster at the end of the clip times the video to around 1940–41.”
Ten months before his death — a death he knew was coming — Christopher Hitchens debated the question, “Is there an afterlife?” Sharing the stage with Sam Harris and Rabbis David Wolpe and Bradley Shavit Artson at the American Jewish University in Los Angeles, Hitchens lamented how “It’s considered perfectly normal in this society to approach dying people who you don’t know, but who are unbelievers, and say, ‘Now are you gonna change your mind [about the existence of God]?’ That is considered almost a polite question.” He added: “It’s a religious falsification that people like myself scream for a priest at the end. Most of us go to our end with dignity.”
After spending years as an unapologetic atheist, Hitchens also wasn’t going to start believing in an afterlife — or what he half-jokingly called “The Never Ending Party.” The video above takes some of Hitchens’ comments from the debate and turns them into a whimsical animation. It’s classic Hitchens: equal parts emphatic and funny. Below, you can watch the original debate in its entirety.
What do movies like Blade Runner, Her, Drive, and Repo Man, separated by the years and even more so by their sensibilities, have in common? All come from auteur directors, all have accumulated considerable fan followings, and all have styles all their own. But to my mind, one important quality unites them more than any other: all take place in Los Angeles. What’s more, all take place in a distinctive vision of Los Angeles, that most photographed but least understood city in the world. Every feature film that uses Los Angeles as something more than a backdrop, whether it tries to represent or reimagine it, also acts as an accidental documentary of the city: of its built environment, of its people, of the ever-shifting ideas we have of it.
On that premise, I created Los Angeles, the City in Cinema, a series of video essays meant to examine the variety of Los Angeleses revealed in the films set there, both those new and old, mainstream and obscure, respectable and schlocky, appealing and unappealing — just like the contradictory characteristics of the city itself. At the top of the post, you can watch my episode on Blade Runner, Ridley Scott’s 1982 proto-cyberpunk future noir that remains, to this day, the popular idea of the Los Angeles of the future (as evidenced by the pejorative currency of the term “Blade Runner-ization” among NIMBYs): denser, darker, thoroughly Asianized, and taken back to a third-world industrial phase it never really passed through in the first place.
But more recently, a competing vision of Los Angeles’ future emerged in the form of Her, Spike Jonze’s tale of a mustachioed, ukulele-playing milquetoast who falls in love with a sentient computer operating system. He does so in the high-rises and high-speed trains of, by comparison to Blade Runner, a glossier, gentler, future Los Angeles not only free of killer android replicants but — even more surprisingly to many an Angeleno — free of cars. My video essay on Her compares and contrasts Scott and Jonze’s ideas of what lies ahead for the city: would you rather live in the former’s Los Angeles, hybridized with a grittier, less orderly Tokyo, or the latter’s, hybridized with a sanitized Shanghai?
Nicolas Winding Refn’s Drive gave us a new take on the old tradition of European filmmakers examining Los Angeles with a kind of perplexed fascination, as previously exemplified by John Boorman’s Point Blank, Jacques Deray’s The Outside Man, and Jacques Demy’s Model Shop. English cult director Alex Cox added his own rough-edged volume to that shelf with 1984’s sci-fi punk favorite Repo Man. In 2000, Cox’s countryman Mike Figgis pulled off his real-time, four-screen experiment Timecode on the Sunset Strip, not far from the strip club where John Cassavetes set much of The Killing of a Chinese Bookie more than twenty years earlier. You can find video essays on these movies and others among those I’ve produced so far.
New videos, including episodes on this year’s solid Los Angeles pictures, Nightcrawler and the Thomas Pynchon adaptation Inherent Vice, will appear regularly. If you live anywhere near Portland, Oregon, note that I’ll give a talk and screening there entitled “Los Angeles and Portland: The Cities in Cinema” at the Hollywood Theatre, featuring never-before-seen video essays on both Los Angeles and Portland films, on January 25, 2015. Keep an eye on their site for details.