Discover Rare 1980s CDs by Lou Reed, Devo & Talking Heads That Combined Music with Computer Graphics

When it first hit the market in 1982, the compact disc famously promised "perfect sound that lasts forever." But innovation has a way of marching continually on, and naturally the innovators soon started wondering: what if perfect sound isn't enough? What if consumers want something to go with it, something to look at? And so, when compact disc co-developers Sony and Philips updated the format's standards, they included documentation on the use of the channels not occupied by audio data. So was born the CD+G, which boasted "not only the CD's full, digital sound, but also video information — graphics — viewable on any television set or video monitor."

That text comes from a package scan posted by the online CD+G Museum, whose YouTube channel features rips of nearly every record released on the format, beginning with the first, the Firesign Theatre's Eat or Be Eaten.

When it came out, listeners who happened to own a CD+G-compatible player (or a CD+G-compatible video game console, my own choice at the time having been the TurboGrafx-16) could see that beloved "head comedy" troupe's densely layered studio production and even more densely layered humor accompanied by images rendered in psychedelic color — or as psychedelic as images can get with only sixteen colors available on the palette and a resolution of 288 by 192 pixels, not much larger than an icon on the home screen of a modern smartphone. Those limitations may make CD+G graphics look unimpressive today, but just imagine what a cutting-edge novelty they must have seemed in the late 1980s when they first appeared.
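
To put those numbers in perspective, here's a quick back-of-the-envelope calculation in Python, a rough illustration based only on the specs just mentioned (a sixteen-color palette means four bits per pixel) rather than on the details of the CD+G encoding itself:

# How much data one full CD+G screen represents, assuming the specs above:
# 288 x 192 pixels and a 16-color palette (4 bits per pixel).
WIDTH, HEIGHT = 288, 192
BITS_PER_PIXEL = 4                             # 16 colors -> 2^4

pixels = WIDTH * HEIGHT                        # 55,296 pixels
screen_bytes = pixels * BITS_PER_PIXEL // 8    # 27,648 bytes

print(f"{pixels:,} pixels, {screen_bytes:,} bytes (~{screen_bytes / 1024:.0f} KB) per screen")

That works out to roughly 27 kilobytes for an entire screen, a reminder of just how small the graphical payload riding alongside the audio really was.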

Displaying lyrics for karaoke singers was the most obvious use of CD+G technology, but the format's short lifespan also saw a fair few experiments on other major-label releases, all viewable at the CD+G Museum: Lou Reed's New York, which combines lyrics with digitized photography of the eponymous city; Talking Heads' Naked, which provides musical information such as the chord changes and the instruments playing on each phrase; Johann Sebastian Bach's St. Matthew Passion, which translates the libretto alongside works of art; and Devo's single "Disco Dancer," which tells the origin story of those "five Spudboys from Ohio." With these and almost every other CD+G release available at the CD+G Museum, you'll have no shortage of not just background music but background visuals for your next late-80s-early-90s-themed party.

Related Content:

Watch 1970s Animations of Songs by Joni Mitchell, Jim Croce & The Kinks, Aired on The Sonny & Cher Show

The Story of How Beethoven Helped Make It So That CDs Could Play 74 Minutes of Music

Discover the Lost Early Computer Art of Telidon, Canada’s TV Proto-Internet from the 1970s

Based in Seoul, Colin Marshall writes and broadcasts on cities and culture. His projects include the book The Stateless City: a Walk through 21st-Century Los Angeles and the video series The City in Cinema. Follow him on Twitter at @colinmarshall or on Facebook.

M.I.T. Computer Program Alarmingly Predicts in 1973 That Civilization Will End by 2040

In 1704, Isaac Newton predicted the end of the world sometime around (or after, "but not before") the year 2060, using a strange series of mathematical calculations. Rather than study what he called the “book of nature,” he took as his source the supposed prophecies of the book of Revelation. While such predictions have always been central to Christianity, it is startling for modern people to look back and see the famed astronomer and physicist indulging them. For Newton, however, as Matthew Stanley writes at Science, “laying the foundation of modern physics and astronomy was a bit of a sideshow. He believed that his truly important work was deciphering ancient scriptures and uncovering the nature of the Christian religion.”

Over three hundred years later, we still have plenty of religious doomsayers predicting the end of the world with Bible codes. But in recent times, their ranks have seemingly been joined by scientists whose only professed aim is interpreting data from climate research and sustainability estimates given population growth and dwindling resources. The scientific predictions do not draw on ancient texts or theology, nor involve final battles between good and evil. Though there may be plagues and other horrible reckonings, these are predictably causal outcomes of over-production and consumption rather than divine wrath. Yet by some strange fluke, the science has arrived at the same apocalyptic date as Newton, plus or minus a decade or two.

The “end of the world” in these scenarios means the end of modern life as we know it: the collapse of industrialized societies, large-scale agricultural production, supply chains, stable climates, nation states…. Since the late sixties, an elite society of wealthy industrialists and scientists known as the Club of Rome (a frequent player in many conspiracy theories) has foreseen these disasters in the early 21st century. One of the sources of their vision is a computer program developed at MIT by computing pioneer and systems theorist Jay Forrester, whose model of global sustainability, one of the first of its kind, predicted civilizational collapse in 2040. “What the computer envisioned in the 1970s has by and large been coming true,” claims Paul Ratner at Big Think.

Those predictions include population growth and pollution levels, “worsening quality of life,” and “dwindling natural resources.” In the video at the top, you can watch Australia's ABC explain the computer’s calculations: “an electronic guided tour of our global behavior since 1900, and where that behavior will lead us,” as the presenter puts it. The graph spans the years 1900 to 2060. “Quality of life” begins to decline sharply after 1940, and by 2020, the model predicts, the metric contracts to turn-of-the-century levels, meeting the sharp rise of the “Zed Curve” that charts pollution levels. (ABC revisited this reporting in 1999 with Club of Rome member Keith Suter.)
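
Forrester's world models work by coupling "stocks" such as population, natural resources, and pollution through feedback loops and stepping them forward in time. The Python loop below is a deliberately toy sketch of that general idea, with made-up coefficients; it is emphatically not his actual World2/World3 model, just an illustration of how a few coupled equations can produce the kind of rise-and-collapse curves the ABC graph shows.

# A toy system-dynamics loop: NOT Forrester's World2/World3 equations,
# just an illustration of coupled stocks (population, resources, pollution).
def simulate(years=160, start_year=1900):
    population, resources, pollution = 1.6, 100.0, 0.1   # arbitrary toy units
    history = []
    for t in range(years):
        quality_of_life = resources / (population * (1 + pollution))
        births = 0.03 * population * min(quality_of_life, 1.0)
        deaths = 0.01 * population * (1 + 0.5 * pollution)
        extraction = 0.05 * population                    # resource use scales with population
        population += births - deaths
        resources = max(resources - extraction, 0.0)
        pollution = max(pollution + 0.02 * extraction - 0.05 * pollution, 0.0)
        history.append((start_year + t, population, resources, pollution, quality_of_life))
    return history

for year, pop, res, pol, qol in simulate()[::20]:
    print(f"{year}: population={pop:5.1f}  resources={res:6.1f}  pollution={pol:4.2f}  quality={qol:5.2f}")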

You can probably guess the rest—or you can read all about it in the Club of Rome's 1972 report Limits to Growth, which drew wide popular attention to Jay Forrester’s books Urban Dynamics (1969) and World Dynamics (1971). Forrester, a figure of Newtonian stature in the worlds of computer science, management, and systems theory—though not, like Newton, a Biblical prophecy enthusiast—more or less stood by his conclusions until the end of his life in 2016. In one of his last interviews, at the age of 98, he told the MIT Technology Review, “I think the books stand all right.” But he also cautioned against acting without systematic thinking in the face of the globally interrelated issues the Club of Rome ominously calls “the problematic”:

Time after time … you’ll find people are reacting to a problem, they think they know what to do, and they don’t realize that what they’re doing is making a problem. This is a vicious [cycle], because as things get worse, there is more incentive to do things, and it gets worse and worse.

Where this vague warning is supposed to leave us is uncertain. If the current course is dire, are we to conclude that “unsystematic” solutions would only make things worse? This theory also seems to leave powerfully vested human agents (like Exxon's executives) wholly unaccountable for the coming collapse. Limits to Growth—scoffed at and disparagingly called “neo-Malthusian” by a host of libertarian critics—stands on far surer evidentiary footing than Newton’s weird predictions, and its climate forecasts, notes Christian Parenti, “were alarmingly prescient.” But for all this doom and gloom, it’s worth bearing in mind that models of the future are not, in fact, the future. There are hard times ahead, but no theory, no matter how sophisticated, can account for every variable.

via Big Think

Related Content:

In 1704, Isaac Newton Predicts the World Will End in 2060

A Century of Global Warming Visualized in a 35 Second Video

A Map Shows What Happens When Our World Gets Four Degrees Warmer: The Colorado River Dries Up, Antarctica Urbanizes, Polynesia Vanishes

It’s the End of the World as We Know It: The Apocalypse Gets Visualized in an Inventive Map from 1486

Josh Jones is a writer and musician based in Durham, NC. Follow him at @jdmagness

Meet Grace Hopper, the Pioneering Computer Scientist Who Helped Invent COBOL and Build the Historic Mark I Computer (1906-1992)

On a page for its School of Technology, Rasmussen College lists six “Assumptions to Avoid” for women who want to enter the field of computer science. I couldn’t comment on whether these “assumptions” (alleged misconceptions like “the work environment is hostile to women”) are actually disproved by the commentary. But I might suggest a seventh “assumption to avoid”—that women haven’t always been computer scientists, integral to the development of the computer, programming languages, and every other aspect of computing, even 100 years before computers existed.

In fact, one of the most notable women in computer science, Grace Hopper, served as a member of the Harvard team that built one of the first computers, the room-sized Mark I, designed by physics professor Howard Aiken and completed in 1944. Hopper also helped develop COBOL, the first universal programming language for business, a language based on written English rather than on symbols or numbers and still widely in use today. And she is credited with coining the term “computer bug” (and by extension “debug”), when she and her associates found a moth stuck inside the Mark II in 1947. (“From then on,” she told Time magazine in 1984, “when anything went wrong with a computer, we said it had bugs in it.”)

These are but a few of her achievements in a computer science career that spanned more than 42 years, during which she rose through the ranks of the Naval Reserve and, later, active naval duty, retiring at age 79 as a rear admiral and the Navy's oldest active-duty commissioned officer.

In addition to winning distinguished awards and commendations over the course of her career—including the first-ever computer science “Man of the Year” award—Hopper also acquired a few memorable nicknames, including “Amazing Grace” and “Grandma COBOL.” She may become known to a new generation by the nickname “Queen of Code,” the title of a recent documentary from FiveThirtyEight’s “Signals” series. Directed by Community star Gillian Jacobs, the short film, which you can watch in full here, tells the story of her “inimitable legacy as a brilliant programmer and pioneering woman in a male-dominated field,” writes Allison McCann at FiveThirtyEight.

Hopper’s name may be "mysteriously absent from many history books,” as Amy Poehler’s Smart Girls notes, but before her death in 1992, she was introduced to millions through TV appearances on shows like Late Night with David Letterman (top) and 60 Minutes, just above. As you’ll see in these clips, Hopper wasn’t just a crack mathematician and programmer but also an ace public speaker whose deadpan humor cracked up Letterman and the groups of students and fellow scientists she frequently addressed.

The 60 Minutes segment notes that Hopper became “one of that small band of brothers and sisters who ushered in the computer revolution” when she left her professor’s job at Vassar at the start of WWII to serve in the Naval Reserve, where she was assigned to the Bureau of Ships Computation Project at Harvard. But she never stopped being an educator and considered “training young people” her second-most important accomplishment. In this, her legacy lives on as well.

The world’s largest gathering of women technologists is called “The Grace Hopper Celebration.” And a documentary in production called Born with Curiosity (see a teaser above) hopes that “shining a light on and humanizing role models like Grace makes them relatable in a way that inspires others to greatness.” At a time when women's enrollment in computer science is the lowest of all the STEM fields, Hopper’s example and encouragement may be much needed.

via Mental Floss

Related Content:

How Ada Lovelace, Daughter of Lord Byron, Wrote the First Computer Program in 1842–a Century Before the First Computer

The Map of Computer Science: New Animation Presents a Survey of Computer Science, from Alan Turing to “Augmented Reality”

Introduction to Computer Science and Programming: A Free Course from MIT 

Josh Jones is a writer and musician based in Durham, NC. Follow him at @jdmagness

Computer Scientists Figure Out What’s the Longest Distance You Could Sail at Sea Without Hitting Land

Back in 2012, a redditor by the name of "Kepleronlyknows" wondered what the longest distance you could travel by sea without hitting land might be, then hazarded an educated guess: "you can sail almost 20,000 miles in a straight line from Pakistan to the Kamchatka Peninsula, Russia."

Six years later, two computer scientists--Rohan Chabukswar (United Technologies Research Center in Ireland) and Kushal Mukherjee (IBM Research in India)--developed an algorithm that offers a more definitive answer. According to their computations, "Kepleronlyknows was entirely correct," notes the MIT Technology Review.

The longest path over water "begins in Sonmiani, Balochistan, Pakistan, passes between Africa and Madagascar and then between Antarctica and Tierra del Fuego in South America, and ends in the Karaginsky District, Kamchatka Krai, in Russia. It is 32,089.7 kilometers long." Or 19,939 miles.

While they were at it, Chabukswar and Mukherjee also determined the longest land journey you could take without hitting the sea. That path, again notes the MIT Technology Review, "runs from near Jinjiang, Fujian, in China, weaves through Mongolia, Kazakhstan, and Russia, and finally reaches Europe to finish near Sagres in Portugal. In total the route passes through 15 countries over 11,241.1 kilometers." Or 6,984 miles. You can read Chabukswar and Mukherjee's research report here.
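
The heavy lifting in the paper is the search over candidate paths, but the geometric core is simple: a "straight line" at sea is a great circle, fixed by a starting point and an initial bearing. The Python sketch below generates sample points along such a circle, which you could then test against a land/water mask; the coordinates and the is_water() check in the usage comment are hypothetical placeholders for illustration, not the authors' code.

import math

EARTH_RADIUS_KM = 6371.0

def great_circle_points(lat_deg, lon_deg, bearing_deg, step_km=100, max_km=40030):
    """Sample points along the great circle leaving (lat, lon) at the given bearing."""
    lat1, lon1, theta = map(math.radians, (lat_deg, lon_deg, bearing_deg))
    points, d = [], 0
    while d <= max_km:
        delta = d / EARTH_RADIUS_KM                      # angular distance traveled
        lat2 = math.asin(math.sin(lat1) * math.cos(delta) +
                         math.cos(lat1) * math.sin(delta) * math.cos(theta))
        lon2 = lon1 + math.atan2(math.sin(theta) * math.sin(delta) * math.cos(lat1),
                                 math.cos(delta) - math.sin(lat1) * math.sin(lat2))
        points.append((math.degrees(lat2), (math.degrees(lon2) + 540) % 360 - 180))
        d += step_km
    return points

# Hypothetical usage: sample the circle leaving Sonmiani (roughly 25.4 N, 66.6 E)
# at some bearing, then find the longest run of samples for which a land/water
# mask (an is_water(lat, lon) function you would supply) reports open sea.
# path = great_circle_points(25.4, 66.6, bearing_deg=120)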

via the MIT Technology Review

Related Content:

A Colorful Map Visualizes the Lexical Distances Between Europe’s Languages: 54 Languages Spoken by 670 Million People

Colorful Maps from 1914 and 2016 Show How Planes & Trains Have Made the World Smaller and Travel Times Quicker

The Atlantic Slave Trade Visualized in Two Minutes: 10 Million Lives, 20,000 Voyages, Over 315 Years

Free Online Computer Science Courses

A Free Oxford Course on Deep Learning: Cutting Edge Lessons in Artificial Intelligence

Nando de Freitas is a "machine learning professor at Oxford University, a lead research scientist at Google DeepMind, and a Fellow of the Canadian Institute For Advanced Research (CIFAR) in the Neural Computation and Adaptive Perception program."

Above, you can watch him teach an Oxford course on Deep Learning, a hot subfield of machine learning and artificial intelligence that builds neural networks--essentially layered algorithms loosely modeled on the human brain--and trains them to recognize patterns and perform tasks.
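
If "neural network" remains an abstraction, here is a minimal sketch of one in plain Python with NumPy, an illustration of the general idea rather than code from the course: two layers of weights turn an input vector into output scores, and "learning" consists of adjusting those weights until the scores become useful.

import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    return np.maximum(0, x)          # a simple nonlinearity between layers

# A tiny two-layer network: 4 inputs -> 8 hidden units -> 3 output scores.
W1, b1 = rng.normal(size=(4, 8)), np.zeros(8)
W2, b2 = rng.normal(size=(8, 3)), np.zeros(3)

def forward(x):
    hidden = relu(x @ W1 + b1)       # learned feature detectors
    return hidden @ W2 + b2          # class scores; training would tune W1, b1, W2, b2

print(forward(rng.normal(size=4)))   # scores for one random 4-feature example

The "deep" networks covered in the lectures simply stack many more such layers and train them on large datasets.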

To complement the 16 lectures, you can also find lecture slides, practicals, and problem sets on this Oxford website. If you'd like to learn about Deep Learning in a MOOC format, be sure to check out the new series of courses created by Andrew Ng on Coursera.

Oxford's Deep Learning course will be added to our list of Free Online Computer Science Courses, part of our meta collection, 1,300 Free Online Courses from Top Universities.

Related Content:

Google Launches Free Course on Deep Learning: The Science of Teaching Computers How to Teach Themselves

New Deep Learning Courses Released on Coursera, with Hope of Teaching Millions the Basics of Artificial Intelligence

Neural Networks for Machine Learning: A Free Online Course

Infographics Show How the Different Fields of Biology, Chemistry, Mathematics, Physics & Computer Science Fit Together

Ask anyone who's pursued a career in the sciences what first piqued their interest in what would become their field, and they'll almost certainly have a story. Gazing at the stars on a camping trip, raising a pet frog, fooling around with computers and their components: an experience sparks a desire for knowledge and understanding, and the pursuit of that desire eventually delivers one to their specific area of specialization.

Or, as they say in science, at least it works that way in theory; the reality usually unfolds less smoothly. On such a journey, just like any other, it might help to have a map.

Enter the work of science writer and physicist Dominic Walliman, whose animated videos on the YouTube channel Domain of Science we've previously featured here on Open Culture. (See the "Related Content" section below for the links.)

Walliman's videos astutely explain how the subfields of biology, chemistry, mathematics, physics, and computer science relate to each other, but now he's turned that same material into infographics readable at a glance: maps, essentially, of the intellectual territory. He's made all five of these maps freely available on his Flickr account: you can view them on a single page here, along with a few more of his infographics.

As much use as Walliman's maps might be to science-minded youngsters looking for the best way to direct their fascinations into a proper course of study, they also offer a helpful reminder to those farther down the path — especially those who've struggled with the blinders of hyperspecialization — of where their work fits in the grand scheme of things. No matter one's field, scientific or otherwise, one always labors under the threat of losing sight of the forest for the trees. Or the realm of life for the bioinformatics, biophysics, and biomathematics; the whole of mathematics for the number theory, the differential geometry, and the differential equations; the workings of computers for the scheduling, the optimization, and the Boolean satisfiability.

Related Content:

The Map of Biology: Animation Shows How All the Different Fields in Biology Fit Together

The Map of Computer Science: New Animation Presents a Survey of Computer Science, from Alan Turing to “Augmented Reality”

The Map of Mathematics: Animation Shows How All the Different Fields in Math Fit Together

The Map of Physics: Animation Shows How All the Different Fields in Physics Fit Together

The Map of Chemistry: New Animation Summarizes the Entire Field of Chemistry in 12 Minutes

The Art of Data Visualization: How to Tell Complex Stories Through Smart Design

Based in Seoul, Colin Marshall writes and broadcasts on cities and culture. His projects include the book The Stateless City: a Walk through 21st-Century Los Angeles and the video series The City in Cinema. Follow him on Twitter at @colinmarshall or on Facebook.

A Turing Machine Handmade Out of Wood

It took Richard Ridel six months of tinkering in his workshop to create this contraption--a mechanical Turing machine made out of wood. The silent video above shows how the machine works. But if you're left hanging, wanting to know more, I'd recommend reading Ridel's fifteen-page paper, in which he carefully documents why he built the wooden Turing machine and what pieces and steps went into the construction.

If this video prompts you to ask what exactly a Turing machine is, also consider adding this short primer by philosopher Mark Jago to your media diet.
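
And if you'd like to poke at the concept in software before (or instead of) building one out of wood, below is a minimal Turing machine simulator in Python, an illustrative sketch rather than a model of Ridel's particular design: a tape, a read/write head, and a table of transition rules, shown running a tiny program that flips every bit it reads.

# A minimal Turing machine: tape, head, and transition table.
def run(tape, transitions, state="start", blank="_", max_steps=1000):
    cells = dict(enumerate(tape))            # sparse tape indexed by position
    head = 0
    for _ in range(max_steps):
        if state == "halt":
            break
        symbol = cells.get(head, blank)
        write, move, state = transitions[(state, symbol)]
        cells[head] = write
        head += 1 if move == "R" else -1
    return "".join(cells[i] for i in sorted(cells))

# Flip every bit, then halt at the first blank cell.
flip = {
    ("start", "0"): ("1", "R", "start"),
    ("start", "1"): ("0", "R", "start"),
    ("start", "_"): ("_", "R", "halt"),
}

print(run("1011", flip))   # prints 0100_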

Follow Open Culture on Facebook and Twitter and share intelligent media with your friends. Or better yet, sign up for our daily email and get a daily dose of Open Culture in your inbox. 

If you'd like to support Open Culture and our mission, please consider making a donation to our site. It's hard to rely 100% on ads, and your contributions will help us provide the best free cultural and educational materials.

via BoingBoing

Related Content:

Free Online Computer Science Courses

The Books on Young Alan Turing’s Reading List: From Lewis Carroll to Modern Chromatics

The LEGO Turing Machine Gives a Quick Primer on How Your Computer Works

The Enigma Machine: How Alan Turing Helped Break the Unbreakable Nazi Code

Hear the Christmas Carols Made by Alan Turing’s Computer: Cutting-Edge Versions of “Jingle Bells” and “Good King Wenceslas” (1951)
