The Seven Road-Tested Habits of Effective Artists

Fifteen years ago, a young construction worker named Andrew Price went in search of free 3D software to help him achieve his goal of rendering a 3D car.

He stumbled onto Blender, a just-the-ticket open-source program that covers every aspect of 3D creation—modeling, rigging, animation, simulation, rendering, compositing, and motion tracking.

Price describes his early learning style as “playing it by ear,” sampling tutorials, some of which he couldn’t be bothered to complete.




Desire for freelance gigs led him to forge a new identity, Blender Guru, whose tutorials, podcasts, and articles would help other new users get the hang of the software.

But it wasn’t declaring himself an expert that ultimately improved his artistic skills. It was holding his own feet to the fire by placing a bet with his younger cousin, who stood to gain $1,000 if Price failed to rack up 1,000 “likes” by posting 2D drawings to ArtStation within a six-month period.

(If he succeeded—which he did, three days before his self-imposed deadline—his cousin owed him nothing. Loss aversion proved to be a more powerful motivator than any carrot on a stick…)

In order to snag the requisite likes, Price found that he needed to revise some habits and commit to a more robust daily practice, a journey he detailed in a presentation at the 2016 Blender Conference.

Price confesses that the challenge taught him much about drawing and painting, but even more about having an effective artistic practice. His seven rules apply to any number of creative forms:

 

Andrew Price’s Rules for an Effective Artist Practice:

  1. Practice Daily

A number of prolific artists have subscribed to this belief over the years, including novelist (and mother!) J.K. Rowling, comedian Jerry Seinfeld, autobiographical performer Mike Birbiglia, and memoirist David Sedaris.

If you feel too fried to uphold your end of the bargain, pretend to go easy on yourself with a little trick Price picked up from music producer Rick Rubin: Do the absolute minimum. You’ll likely find that performing the minimum positions you to do much more than that. Your resistance is not so much to the doing as it is to the embarking.

  2. Quantity over Perfectionism Masquerading as Quality

This harkens back to Rule Number One. Who are we to say which of our works will be judged worthy? Just keep putting it out there—remember, it’s all practice, and the law of averages favors those whose output is, like Picasso’s, prodigious. Don’t stand in the way of progress by splitting a single work’s endless hairs.

  3. Steal Without Ripping Off

Immerse yourself in the creative brilliance of those you admire. Then profit off your own improved efforts, a practice advocated by the likes of musician David Bowie, computer visionary Steve Jobs, and artist/social commentator Banksy.

  4. Educate Yourself

As a stand-alone, that old chestnut about practice making perfect is not sufficient to the task. Whether you seek out online tutorials, as Price did, enroll in a class, or designate a mentor, a conscientious commitment to study your craft will help you to better master it.

  5. Give Yourself a Break

Banging your head against the wall is not good for your brain. Price celebrates author Stephen King’s practice of giving the first draft of a new novel six weeks to marinate. Your break may be shorter. Three days may be ample to juice you up creatively. Just make sure it’s in your calendar to get back to it.

  6. Seek Feedback

Filmmaker Taika Waititi, rapper Kanye West, and the big gorillas at Pixar are not threatened by others' opinions. Seek them out. You may learn something.

  7. Create What You Want To

Passion projects are the key to creative longevity and pleasurable process. Don’t cater to a fickle public, or the shifting sands of fashion. Pursue the sorts of things that interest you.

Implicit in Price’s seven commandments is the notion that something may have to budge—your nightly cocktails, the number of hours spent on social media, that extra half hour in bed after the alarm goes off... Don’t neglect your familial or civic obligations, but neither should you shortchange your art. Life’s too short.

Read the transcript of Andrew Price's Blender Conference presentation here.

Related Content:

The Daily Habits of Famous Writers: Franz Kafka, Haruki Murakami, Stephen King & More

The Daily Habits of Highly Productive Philosophers: Nietzsche, Marx & Immanuel Kant

How to Read Many More Books in a Year: Watch a Short Documentary Featuring Some of the World’s Most Beautiful Bookstores

Ayun Halliday is an author, illustrator, theater maker and Chief Primatologist of the East Village Inky zine.  Join her in NYC on Monday, December 9 when her monthly book-based variety show, Necromancers of the Public Domain celebrates Dennison’s Christmas Book (1921). Follow her @AyunHalliday.

How Margaret Hamilton Wrote the Computer Code That Helped Save the Apollo Moon Landing Mission

From a distance of half a century, we look back on the moon landing as a thoroughly analog affair, an old-school engineering project of the kind seldom even proposed anymore in this digital age. But the Apollo 11 mission could never have happened without computers and the people who program them, a fact that has become better-known in recent years thanks to public interest in the work of Margaret Hamilton, director of the Software Engineering Division of MIT's Instrumentation Laboratory when it developed on-board flight software for NASA's Apollo space program. You can learn more about Hamilton, whom we've previously featured here on Open Culture, from the short MAKERS profile video above.

Today we consider software engineering a perfectly viable field, but back in the mid-1960s, when Hamilton first joined the Apollo project, it didn't even have a name. "I came up with the term 'software engineering,' and it was considered a joke," says Hamilton, who remembers her colleagues making remarks like, "What, software is engineering?"




But her own experience went some way toward proving that working in code had become as important as working in steel. Only by watching her young daughter play at the same controls the astronauts would later use did she realize that just one human error could potentially bring the mission to ruin — and that she could minimize the possibility by accounting for such errors in the software's design. Hamilton's proposal met with resistance, NASA's official line at the time being that "astronauts are trained never to make a mistake."

But Hamilton persisted, prevailed, and was vindicated during the moon landing itself, when an astronaut did make a mistake, one that overloaded the flight computer. The whole landing might have been aborted if not for Hamilton's foresight in implementing an "asynchronous executive" function capable, in the event of an overload, of setting less important tasks aside and prioritizing more important ones. "The software worked just the way it should have," Hamilton says in the Christie's video on the incident above, describing what she felt afterward as "a combination of excitement and relief." Engineers of software, hardware, and everything else know that feeling when they see a complicated project work — but surely few know it as well as Hamilton and her Apollo collaborators do.
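To make the idea concrete, here is a minimal sketch of priority-based task shedding in present-day Python, not anything resembling the Apollo Guidance Computer's actual code: when there is more work than capacity, the least important jobs are set aside so the critical ones keep running. The job names and the capacity figure are invented for illustration.

```python
import heapq

# A toy "overload executive": keep only the most critical jobs when the
# processor cannot service everything. (Illustrative only -- not the
# Apollo Guidance Computer's real scheduler.)

CAPACITY = 3  # hypothetical number of jobs we can run this cycle

def shed_load(jobs, capacity=CAPACITY):
    """jobs: list of (priority, name) pairs; lower number = more critical.
    Returns (kept, shed)."""
    kept = heapq.nsmallest(capacity, jobs)          # most critical jobs first
    shed = [job for job in jobs if job not in kept]
    return kept, shed

# Invented example jobs, roughly in the spirit of a lunar descent:
jobs = [(1, "landing guidance"), (2, "navigation update"),
        (3, "crew display refresh"), (4, "telemetry formatting")]

kept, shed = shed_load(jobs)
print("running:", [name for _, name in kept])    # the tasks that matter most
print("set aside:", [name for _, name in shed])  # dropped until the overload clears
```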

Related Content:

Margaret Hamilton, Lead Software Engineer of the Apollo Project, Stands Next to Her Code That Took Us to the Moon (1969)

How 1940s Film Star Hedy Lamarr Helped Invent the Technology Behind Wi-Fi & Bluetooth During WWII

Meet Grace Hopper, the Pioneering Computer Scientist Who Helped Invent COBOL and Build the Historic Mark I Computer (1906-1992)

How Ada Lovelace, Daughter of Lord Byron, Wrote the First Computer Program in 1842–a Century Before the First Computer

Based in Seoul, Colin Marshall writes and broadcasts on cities, language, and culture. His projects include the book The Stateless City: a Walk through 21st-Century Los Angeles and the video series The City in Cinema. Follow him on Twitter at @colinmarshall, on Facebook, or on Instagram.

Pioneering Computer Scientist Grace Hopper Shows Us How to Visualize a Nanosecond (1983)

Human imagination seems seriously limited when faced with the cosmic scope of time and space. We can imagine, through stop-motion animation and CGI, what it might be like to walk the earth with creatures the size of office buildings. But how to wrap our heads around the fact that they lived hundreds of millions of years ago, on a planet some four and a half billion years old? We trust the science, but can’t rely on intuition alone to guide us to such mind-boggling knowledge.

At the other end of the scale, events measured in nanoseconds, or billionths of a second, seem inconceivable, even to someone as smart as Grace Hopper, the Navy mathematician who helped invent COBOL and build the historic Mark I computer. Or so she says in the 1983 clip above, from one of the many guest lectures she gave at universities, museums, military bodies, and corporations.




When she first heard of “circuits that acted in nanoseconds,” she says, “billionths of a second… Well, I didn’t know what a billion was…. And if you don’t know what a billion is, how on earth do you know what a billionth is? Finally, one morning in total desperation, I called over the engineering building, and I said, ‘Please cut off a nanosecond and send it to me.’” What she asked for, she explains, and shows the class, was a piece of wire representing the distance a signal could travel in a nanosecond.

Now of course it wouldn’t really be through wire — it’d be out in space, the velocity of light. So if we start with a velocity of light and use your friendly computer, you’ll discover that a nanosecond is 11.8 inches long, the maximum limiting distance that electricity can travel in a billionth of a second.

Follow the rest of her explanation, with wire props, and see if you can better understand a measure of time beyond the reaches of conscious experience. The explanation was immediately successful when she began using it in the late 1960s “to demonstrate how designing smaller components would produce faster computers,” writes the National Museum of American History. The bundle of wires below, each about 30 cm (11.8 inches) long, comes from a lecture Hopper gave museum docents in March 1985.

Photo via the National Museum of American History

Like the age of the dinosaurs, the nanosecond may only represent a small fraction of the incomprehensibly small units of time scientists are eventually able to measure—and computer scientists able to access. “Later,” notes the NMAH, “as components shrank and computer speeds increased, Hopper used grains of pepper to represent the distance electricity traveled in a picosecond, one trillionth of a second.”
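Hopper's figures are easy to verify with a few lines of arithmetic. Here is a quick sketch (mine, not from her lectures) using the speed of light in a vacuum:

```python
# Distance light travels in a nanosecond and a picosecond.
SPEED_OF_LIGHT = 299_792_458  # metres per second, in a vacuum

def light_distance(seconds):
    """Distance light covers in the given time, in metres."""
    return SPEED_OF_LIGHT * seconds

ns = light_distance(1e-9)   # one nanosecond
ps = light_distance(1e-12)  # one picosecond

print(f"1 nanosecond ≈ {ns * 100:.1f} cm ≈ {ns / 0.0254:.1f} inches")  # ~30.0 cm, ~11.8 inches
print(f"1 picosecond ≈ {ps * 1000:.2f} mm")  # ~0.30 mm -- pepper-grain scale
```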

At this point, the map becomes no more revealing than the unknown territory, invisible to the naked eye, inconceivable but through wild leaps of imagination. But if anyone could explain the increasingly inexplicable in terms most anyone could understand, it was the brilliant but down-to-earth Hopper.

via Kottke

Related Content:

Meet Grace Hopper, the Pioneering Computer Scientist Who Helped Invent COBOL and Build the Historic Mark I Computer (1906-1992)

The Map of Computer Science: New Animation Presents a Survey of Computer Science, from Alan Turing to “Augmented Reality”

Free Online Computer Science Courses 

Josh Jones is a writer and musician based in Durham, NC. Follow him at @jdmagness

How to Take a Picture of a Black Hole: Watch the 2017 Ted Talk by Katie Bouman, the MIT Grad Student Who Helped Take the Groundbreaking Photo

What triggered the worst impulses of the Internet last week?

The world's first photo of a black hole, which proved the presence of troll life here on earth, and confirmed that female scientists, through no fault of their own, have a much longer way to go, baby.

If you want a taste, sort the comments on the two-year-old TED Talk, above, so they're ordered “newest first.”

Katie Bouman, soon-to-be assistant professor of computing and mathematical sciences at the California Institute of Technology, was a PhD candidate at MIT two years ago, when she taped the talk, but she could've passed for a nervous high schooler competing in the National Science Bowl finals, in clothes borrowed from Aunt Judy, who works at the bank.




Her studies focused on the ways in which emerging computational methods could help expand the boundaries of interdisciplinary imaging.

Prior to last week, I’m not sure how well I could have parsed the focus of her work had she not taken the time to help less STEM-inclined viewers such as myself wrap our heads around her highly technical, then-wholly-theoretical subject.

What I know about black holes could still fit in a thimble, and in truth, my excitement about one being photographed for the first time pales in comparison to my excitement about Game of Thrones returning to the airwaves.

Fortunately, we’re not obligated to be equally turned on by the same interests, an idea theoretical physicist Richard Feynman promoted:

I've always been very one-sided about science and when I was younger I concentrated almost all my effort on it. I didn't have time to learn and I didn't have much patience with what's called the humanities, even though in the university there were humanities that you had to take. I tried my best to avoid somehow learning anything and working at it. It was only afterwards, when I got older, that I got more relaxed, that I've spread out a little bit. I've learned to draw and I read a little bit, but I'm really still a very one-sided person and I don't know a great deal. I have a limited intelligence and I use it in a particular direction.

I'm pretty sure my lack of passion for science is not tied to my gender. Some of my best friends are guys who feel the same. (Some of them don't like team sports either.)

But I couldn't help but experience a wee thrill that this young woman, a science nerd who admittedly could’ve used a few theater nerd tips regarding relaxation and public speaking, realized her dream—an honest-to-goodness photo of a black hole just like the one she talked about in her TED Talk, "How to take a picture of a black hole."

Bouman and the 200+ colleagues she acknowledges and thanks at every opportunity achieved their goal, not with an earth-sized camera but with a network of linked telescopes, much as she had described two years earlier, when she invoked disco balls, Mick Jagger, oranges, selfies, and a jigsaw puzzle in an effort to help people like me understand.

Look at that sucker (or, more accurately, its shadow!) That thing’s 500 million trillion kilometers from Earth!

(That's much farther than King's Landing is from Winterfell.)

I’ll bet a lot of elementary science teachers, be they male, female, or non-binary, are going to make science fun by having their students draw pictures of the picture of the black hole.

If we could go back (or forward) in time, I can almost guarantee that mine would be among the best because while I didn’t “get” science (or gym), I was a total art star with the crayons.

Then, crafty as Lord Petyr Baelish when presentation time rolled around, I would partner with a girl like Katie Bouman, who could explain the science with winning vigor. She genuinely seems to embrace the idea that it “takes a village,” and that one’s fellow villagers should be credited whenever possible.

(How did I draw the black hole, you ask? Honestly, it's not that much harder than drawing a doughnut. Now back to Katie!)

Alas, her professional warmth failed to register with legions of Internet trolls who began sliming her shortly after a colleague at MIT shared a beaming snapshot of her, taken, presumably, with a regular old phone as the black hole made its debut. That pic cemented her accidental status as the face of this project.

Note to the trolls—it wasn't a dang selfie.

“I’m so glad that everyone is as excited as we are and people are finding our story inspirational,’’ Bouman told The New York Times. “However, the spotlight should be on the team and no individual person. Focusing on one person like this helps no one, including me.”

Although Bouman was a junior team member, she and other grad students made major contributions. She directed the verification of images and the selection of imaging parameters, and she authored an imaging algorithm that researchers used in creating three scripted code pipelines from which the instantly famous picture was cobbled together.

As Vincent Fish, a research scientist at MIT's Haystack Observatory, told CNN:

One of the insights Katie brought to our imaging group is that there are natural images. Just think about the photos you take with your camera phone—they have certain properties.... If you know what one pixel is, you have a good guess as to what the pixel is next to it.

Hey, that makes sense.
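Fish's point about neighboring pixels is the kind of statistical assumption, a "natural image" prior, that reconstruction algorithms lean on when telescope data alone cannot pin down a single picture. Here is a minimal, illustrative sketch of one such prior, a smoothness penalty; it is my toy example, not the imaging team's actual code.

```python
import numpy as np

def smoothness_penalty(image):
    """Sum of squared differences between horizontally and vertically
    adjacent pixels -- a crude stand-in for a 'natural image' prior."""
    dx = np.diff(image, axis=1)  # horizontal neighbor differences
    dy = np.diff(image, axis=0)  # vertical neighbor differences
    return np.sum(dx ** 2) + np.sum(dy ** 2)

rng = np.random.default_rng(0)
noise = rng.random((64, 64))              # pixels unrelated to their neighbors
smooth = np.outer(np.linspace(0, 1, 64),  # pixels that vary gradually
                  np.linspace(0, 1, 64))

print(smoothness_penalty(noise))   # large: doesn't look like a natural image
print(smoothness_penalty(smooth))  # small: neighbors predict each other
```

A reconstruction that racks up a huge penalty like the noise image would be discounted in favor of one whose pixels hang together the way photographs do.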

As The Verge’s science editor, Mary Beth Griggs, points out, the rush to defame Bouman is of a piece with some of the non-virtual realities women in science face:

Part of the reason that some posters found Bouman immediately suspicious had to do with her gender. Famously, a number of prominent men like disgraced former CERN physicist Alessandro Strumia have argued that women aren’t being discriminated against in science — they simply don’t like it, or don’t have the aptitude for it. That argument fortifies a notion that women don’t belong in science, or can’t really be doing the work. So women like Bouman must be fakes, this warped line of thinking goes…

Even I, whose 7th grade science teacher tempered a bad grade on my report card by saying my interest in theater would likely serve me much better than anything I might eke from her class, know that just as many girls and women excel at science, technology, engineering, and math as excel in the arts. (Sometimes they excel at both!)

(And power to every little boy with his sights set on nursing, teaching, or ballet!)

(How many black holes have the haters photographed recently?)

Griggs continues:

Saying that she was part of a larger team doesn’t diminish her work, or minimize her involvement in what is already a history-making project. Highlighting the achievements of a brilliant, enthusiastic scientist does not diminish the contributions of the other 214 people who worked on the project, either. But what it is doing is showing a different model for a scientist than the one most of us grew up with. That might mean a lot to some kids — maybe kids who look like her — making them excited about studying the wonders of the Universe.

via BoingBoing

Related Content:

Women’s Hidden Contributions to Modern Genetics Get Revealed by New Study: No Longer Will They Be Buried in the Footnotes

New Augmented Reality App Celebrates Stories of Women Typically Omitted from U.S. History Textbooks

Stephen Hawking (RIP) Explains His Revolutionary Theory of Black Holes with the Help of Chalkboard Animations

Watch a Star Get Devoured by a Supermassive Black Hole

Ayun Halliday is an author, illustrator, theater maker and Chief Primatologist of the East Village Inky zine.  Join her in New York City tonight for the next installment of her book-based variety show, Necromancers of the Public Domain. Follow her @AyunHalliday.

Artificial Intelligence Identifies the Six Main Arcs in Storytelling: Welcome to the Brave New World of Literary Criticism

Is the singularity upon us? AI seems poised to replace everyone, even artists, whose work can seem like an inviolably human industry. Or maybe not. Nick Cave’s poignant answer to a fan question might persuade you a machine will never write a great song, though it might master all the moves to write a good one. An AI-written novel did almost win a Japanese literary award, a suitably impressive feat, even if much of the authorship should be attributed to the program’s human designers.

But what about literary criticism? Is this an art that a machine can do convincingly? The answer may depend on whether you consider it an art at all. For those who do, no artificial intelligence will ever properly develop the theory of mind needed for subtle, even moving, interpretations. On the other hand, one group of researchers has succeeded in using “sophisticated computing power, natural language processing, and reams of digitized text,” writes Atlantic editor Adrienne LaFrance, “to map the narrative patterns in a huge corpus of literature.” The name of their literary criticism machine? The Hedonometer.




We can treat this as an exercise in compiling data, but it's arguable that the results are on par with work from the comparative mythology school of James Frazer and Joseph Campbell. A more immediate comparison might be to the very deft, if not particularly subtle, Kurt Vonnegut, who—before he wrote novels like Slaughterhouse-Five and Cat’s Cradle—submitted a master’s thesis in anthropology to the University of Chicago. His project did the same thing as the machine, 35 years earlier, though he may not have had the wherewithal to read “1,737 English-language works of fiction between 10,000 and 200,000 words long” while struggling to finish his graduate program. (His thesis, by the way, was rejected.)

Those numbers describe the dataset from Project Gutenberg fed into The Hedonometer by computer scientists at the University of Vermont and the University of Adelaide. After the computer finished “reading,” it plotted “the emotional trajectory” of all of the stories, using “sentiment analysis to generate an emotional arc for each work.” What it found were six broad categories of story, listed below (a minimal sketch of this kind of analysis follows the list):

  1. Rags to Riches (rise)
  2. Riches to Rags (fall)
  3. Man in a Hole (fall then rise)
  4. Icarus (rise then fall)
  5. Cinderella (rise then fall then rise)
  6. Oedipus (fall then rise then fall)
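Stripped of the researchers' scale, the sentiment-analysis step can be pictured as sliding a window across a text, scoring each window's emotional tone, and reading the story's shape off the resulting curve. Here is a minimal, illustrative sketch; the tiny word lexicon is an invented stand-in, not the Hedonometer's actual word list.

```python
# A toy emotional-arc extractor: the idea behind the analysis, not the
# Hedonometer itself. The lexicon below is invented; the real project
# scores thousands of words using crowd-sourced happiness ratings.

LEXICON = {"joy": 2, "love": 2, "happy": 1, "hope": 1,
           "loss": -1, "fear": -1, "grief": -2, "death": -2}

def emotional_arc(text, window=50, step=25):
    """Average sentiment over sliding word windows; the curve's rises and
    falls trace the story's shape."""
    words = text.lower().split()
    if not words:
        return []
    arc = []
    for start in range(0, max(1, len(words) - window + 1), step):
        chunk = words[start:start + window]
        arc.append(sum(LEXICON.get(w, 0) for w in chunk) / len(chunk))
    return arc

# A "Man in a Hole" toy story: things get bad, then get better again.
story = ("hope love happy " * 20 +
         "fear loss grief death " * 20 +
         "hope joy love happy " * 20)
print(emotional_arc(story))  # positive -> negative -> positive
```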

How does this endeavor compare with Vonnegut’s project? (See him present the theory below.) The novelist used more or less the same methodology, in human form, to come up with eight universal story arcs or “shapes of stories.” Vonnegut himself left out the Rags to Riches category; he called it an anomaly, though he did have a heading for the same rising-only story arc—the Creation Story—which he deemed an uncommon shape for Western fiction. He did include the Cinderella arc, and was pleased by his discovery that its shape mirrored the New Testament arc, which he also included in his schema, an act the AI surely would have judged redundant.

Contra Vonnegut, the AI found that one-fifth of all the works it analyzed were Rags-to-Riches stories. It determined that this arc was far less popular with readers than “Oedipus,” “Man in a Hole,” and “Cinderella.” Its analysis does get much more granular, and to allay our suspicions, the researchers promise they did not control the outcome of the experiment. “We’re not imposing a set of shapes,” says lead author Andy Reagan, Ph.D. candidate in mathematics at the University of Vermont. “Rather: the math and machine learning have identified them.”

But the authors do provide a lot of their own interpretation of the data, from choosing representative texts—like Harry Potter and the Deathly Hallows—to illustrate “nested and complicated” plot arcs, to providing the guiding assumptions of the exercise. One of those assumptions, unsurprisingly given the authors’ fields of interest, is that math and language are interchangeable. “Stories are encoded in art, language, and even in the mathematics of physics,” they write in the introduction to their paper, published on arXiv.org.

“We use equations," they go on, "to represent both simple and complicated functions that describe our observations of the real world.” If we accept the premise that sentences and integers and lines of code are telling the same stories, then maybe there isn’t as much difference between humans and machines as we would like to think.

via The Atlantic

Related Content:

Nick Cave Answers the Hotly Debated Question: Will Artificial Intelligence Ever Be Able to Write a Great Song?

Kurt Vonnegut Diagrams the Shape of All Stories in a Master’s Thesis Rejected by U. Chicago

Kurt Vonnegut Maps Out the Universal Shapes of Our Favorite Stories

Josh Jones is a writer and musician based in Durham, NC. Follow him at @jdmagness

Watch 110 Lectures by Donald Knuth, “the Yoda of Silicon Valley,” on Programming, Mathematical Writing, and More

Many see the realms of literature and computers as not just completely separate, but growing more distant from one another all the time. Donald Knuth, one of the most respected figures among the most deeply computer-savvy in Silicon Valley, sees it differently. His claims to fame include The Art of Computer Programming, an ongoing multi-volume series of books whose publication began more than fifty years ago, and the digital typesetting system TeX, which, in a recent profile of Knuth, the New York Times' Siobhan Roberts describes as "the gold standard for all forms of scientific communication and publication."

Some, Roberts writes, consider TeX "Dr. Knuth’s greatest contribution to the world, and the greatest contribution to typography since Gutenberg." At the core of his lifelong work is an idea called "literate programming," which emphasizes "the importance of writing code that is readable by humans as well as computers — a notion that nowadays seems almost twee.




"Dr. Knuth has gone so far as to argue that some computer programs are, like Elizabeth Bishop’s poems and Philip Roth’s American Pastoral, works of literature worthy of a Pulitzer." Knuth's mind, technical achievements, and style of communication have earned him the informal title of "the Yoda of Silicon Valley."

That appellation also reflects a depth of technical wisdom only attainable by getting to the very bottom of things, which in Knuth's case means fully understanding how computer programming works all the way down to the most basic level. (This in contrast to the average programmer, writes Roberts, who "no longer has time to manipulate the binary muck, and works instead with hierarchies of abstraction, layers upon layers of code — and often with chains of code borrowed from code libraries.") Now everyone can get more than a taste of Knuth's perspective and thoughts on computers, programming, and a host of related subjects on the YouTube channel of Stanford University, where Knuth is now professor emeritus (and where he still gives informal lectures under the banner "Computer Musings").

Stanford's online archive of Donald Knuth Lectures now numbers 110, ranging across the decades and covering such subjects as the usage and mechanics of TeX, the analysis of algorithms, and the nature of mathematical writing. “I am worried that algorithms are getting too prominent in the world,” he tells Roberts in the New York Times profile. “It started out that computer scientists were worried nobody was listening to us. Now I’m worried that too many people are listening.” But having become a computer scientist before the field of computer science even had a name, the now-octogenarian Knuth possesses a rare perspective from which anyone in 21st-century technology could certainly benefit.
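As for what "literate programming" looks like on the page, here is a loose gesture at the spirit of the idea, written as ordinary commented Python rather than in Knuth's actual WEB/CWEB system, where the prose is primary and a tool ("tangle") extracts the compilable code from it.

```python
# A loose gesture at literate programming: code written so the explanation
# reads top to bottom for a human, not just for the machine.

def gcd(a: int, b: int) -> int:
    """Greatest common divisor by Euclid's algorithm.

    The idea: any number dividing both a and b also divides a mod b,
    so we can keep shrinking the pair until the remainder is zero.
    """
    while b != 0:
        # Replace (a, b) with (b, a mod b); the second element strictly
        # decreases, so the loop terminates.
        a, b = b, a % b
    # When b reaches zero, a holds the divisor every earlier pair shared.
    return a

assert gcd(48, 36) == 12
```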

Related Content:

Free Online Computer Science Courses

50 Famous Academics & Scientists Talk About God

The Secret History of Silicon Valley

When J.M. Coetzee Secretly Programmed Computers to Write Poetry in the 1960s

Introduction to Computer Science and Programming: A Free Course from MIT

Peter Thiel’s Stanford Course on Startups: Read the Lecture Notes Free Online

Based in Seoul, Colin Marshall writes and broadcasts on cities, language, and culture. His projects include the book The Stateless City: a Walk through 21st-Century Los Angeles and the video series The City in Cinema. Follow him on Twitter at @colinmarshall or on Facebook.

Discover Rare 1980s CDs by Lou Reed, Devo & Talking Heads That Combined Music with Computer Graphics

When it first hit the market in 1982, the compact disc famously promised "perfect sound that lasts forever." But innovation has a way of marching continually on, and naturally the innovators soon started wondering: what if perfect sound isn't enough? What if consumers want something to go with it, something to look at? And so, when compact disc co-developers Sony and Philips updated their standards, they included documentation on the use of the format's channels not occupied by audio data. So was born the CD+G, which boasted "not only the CD's full, digital sound, but also video information — graphics — viewable on any television set or video monitor."

That text comes from a package scan posted by the online CD+G Museum, whose Youtube channel features rips of nearly every record released on the format, beginning with the first, the Firesign Theatre's Eat or Be Eaten.




When it came out, listeners who happened to own a CD+G-compatible player (or a CD+G-compatible video game console, my own choice at the time having been the TurboGrafx-16) could see that beloved "head comedy" troupe's densely layered studio production and even more densely layered humor accompanied by images rendered in psychedelic color — or as psychedelic as images can get with only sixteen colors available on the palette, not to mention a resolution of 288 pixels by 192 pixels, not much larger than an icon on the home screen of a modern smartphone. Those limitations may make CD+G graphics look unimpressive today, but just imagine what a cutting-edge novelty they must have seemed in the late 1980s when they first appeared.
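For a rough sense of scale, here is a back-of-the-envelope sketch using the figures above. The 16-color palette and 288-by-192 resolution come from the article; the 4-bits-per-pixel framebuffer math is my simplifying assumption, since the real format streams drawing commands through the disc's spare subcode channels rather than storing a bitmap.

```python
# Rough size of a CD+G screen, using the figures quoted above.
WIDTH, HEIGHT = 288, 192                    # pixels
COLORS = 16                                 # palette entries
BITS_PER_PIXEL = (COLORS - 1).bit_length()  # 16 colors -> 4 bits per pixel

framebuffer_bytes = WIDTH * HEIGHT * BITS_PER_PIXEL // 8
print(f"{WIDTH}x{HEIGHT} at {BITS_PER_PIXEL} bpp = {framebuffer_bytes:,} bytes")
# -> 27,648 bytes for the whole screen
```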

Displaying lyrics for karaoke singers was the most obvious use of CD+G technology, but its short lifespan also saw a fair few experiments on such other major-label releases, all viewable at the CD+G Museum, as Lou Reed's New York, which combines lyrics with digitized photography of the eponymous city; Talking Heads' Naked, which provides musical information such as the chord changes and instruments playing on each phrase; Johann Sebastian Bach's St. Matthew Passion, which translates the libretto alongside works of art; and Devo's single "Disco Dancer," which tells the origin story of those "five Spudboys from Ohio." With these and almost every other CD+G release available at the CD+G Museum, you'll have no shortage of not just background music but background visuals for your next late-80s-early-90s-themed party.

Related Content:

Watch 1970s Animations of Songs by Joni Mitchell, Jim Croce & The Kinks, Aired on The Sonny & Cher Show

The Story of How Beethoven Helped Make It So That CDs Could Play 74 Minutes of Music

Discover the Lost Early Computer Art of Telidon, Canada’s TV Proto-Internet from the 1970s

Based in Seoul, Colin Marshall writes and broadcasts on cities and culture. His projects include the book The Stateless City: a Walk through 21st-Century Los Angeles and the video series The City in Cinema. Follow him on Twitter at @colinmarshall or on Facebook.
