The Colors of Mister Rogers’ Hand-Knit Sweaters from 1979 to 2001: A Visual Graph Created with Data Science

Writer Owen Phillips may be a solid data analyst, but I suspect he’s not much of a knitter.

The software he used to run a scientific analysis of 22 years’ worth of Fred Rogers’ sweaters ultimately reduces the beloved children’s television host’s homey zip-front cardigans to a slick graphic of colorful bars.

A knitter would no doubt prioritize other types of patterns: stitch numbers, wool weight, cable variations… the sort of information Mister Rogers’ mother, Nancy, would have had at her fingertips.

As Mister Rogers reveals in the story of his sweaters, his mom was the knitter behind many of the on-air sweaters Phillips crunched with R code. Whether their subtly shifting palette reflects an adventurous spirit on the part of the maker or the recipient’s evolving taste is not for us to know.

After Mrs. Rogers’ death, producers had to resort to buying similar models. Many of her originals had worn through or been donated to charity events.

“Not an easy challenge in the ’80s and ’90s,” Margy Whitmer, a producer of Mister Rogers’ Neighborhood, told Rewire. “It certainly wasn’t in style! But we found a company who made cotton ones that were similar, so we bought a bunch and dyed them.”

(A moment of silent gratitude that no one tried to shoehorn Fred Rogers into a Cosby Show sweater…)

It would be interesting to see what Phillips’ code could do with faulty viewer memories.

His input for the Mister Rogers’ Cardigans of Many Colors project was a chart on superfan Tim Lybarger’s Neighborhood Archive detailing the hue of every sweater Mister Rogers changed into on-camera from 1979 to 2001.

Without samples of the actual sweaters, Lybarger’s color chart could only be approximate, but unlike viewers’ fading memories, it’s rooted in his own visual observations of distinct episodes. Aging fans tend to jettison Rogers’ spectral reality in favor of a single shade, the bright red in which he greeted Wicked Witch of the West Margaret Hamilton in 1975, say, or the pleasant mouse-colored number he sported for a 1985 breakdancing session with a visiting 12-year-old.

For those who’d rather code than purl, Phillips shares MrRogers.R, the program he used to scrape the Neighborhood Archive for Mister Rogers’ daily sweater colors.
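Readers who would rather not run R can still get a feel for the final step of the pipeline with a few lines of Python. The sketch below is not Phillips’ MrRogers.R (which scrapes the Neighborhood Archive directly); it assumes a small, hand-typed list of years and hex colors (all placeholder values, not Lybarger’s real data) and draws one colored bar per sweater, which is the basic move behind the finished graphic.

```python
# A minimal Python sketch of the plotting idea. Not Phillips' MrRogers.R:
# the (year, color) pairs below are placeholder values, not real data.
import matplotlib.pyplot as plt

sweaters = [
    (1979, "#7a1f1f"),
    (1980, "#caa93c"),
    (1981, "#2e5c3f"),
    (1982, "#4a6fa5"),
    (1983, "#8c8c8c"),
]

fig, ax = plt.subplots(figsize=(6, 2))
for i, (year, color) in enumerate(sweaters):
    ax.bar(i, 1, width=0.9, color=color)       # one colored bar per sweater
ax.set_xticks(range(len(sweaters)))
ax.set_xticklabels([str(year) for year, _ in sweaters])
ax.set_yticks([])                               # only the colors matter
ax.set_title("Sweater colors by year (illustrative data)")
plt.tight_layout()
plt.show()
```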

Then have a look at Rogers’ sweaters as rendered by Phillips’ fellow data geek, Alan Joyce, who tinkered with Phillips’ code to produce a gradient image.

via Kottke

Related Content:

Mr. Rogers Takes Breakdancing Lessons from a 12-Year-Old (1985)

Mr. Rogers Introduces Kids to Experimental Electronic Music by Bruce Haack & Esther Nelson (1968)

Mister Rogers Turns Kids On to Jazz with Help of a Young Wynton Marsalis and Other Jazz Legends (1986)

Ayun Halliday is an author, illustrator, theater maker and Chief Primatologist of the East Village Inky zine.  Her current project is Theater of the Apes Sub-Adult Division’s fast approaching production of Animal Farm at the Tank in New York City.  Follow her @AyunHalliday.

Coursera Partners with Leading Universities to Offer Master’s Degrees at a More Affordable Price

If you're a regular Open Culture reader, you're already familiar with Coursera, the ed tech company, which, since its founding in 2012, has given the world access to online courses from top universities--e.g. courses on Roman Architecture (Yale), Modern and Postmodern Philosophy (Wesleyan), and Buddhism and Neuroscience (Princeton). And you've perhaps noticed, too, that Coursera has recently bundled certain courses into "Specializations"--essentially areas of concentration--that let students specialize in fields like Deep Learning and Data Science.

But what if students want to deepen their knowledge further and get a traditional degree? In what perhaps marks the beginning of a significant new trend, Coursera has partnered with leading universities to offer full-fledged graduate degrees in a more affordable online format. As described in the video above, HEC Paris (the #2 business school in Europe) now offers through Coursera's platform a Master's in Innovation and Entrepreneurship. Designed for aspiring entrepreneurs, the program consists of 20 courses (all online) and takes an estimated 10-16 months to complete. The total tuition amounts to 20,000 Euros (roughly 23,500 U.S. dollars), a sum that's considerably less than what executive education programs usually cost.

For students looking for a broader education in business, the University of Illinois at Urbana-Champaign has launched an entire MBA program through Coursera. Consisting of 18 online courses and three capstone projects, the iMBA program covers the subjects usually found in b-school programs--leadership, strategy, economics, accounting, finance, etc. The complete curriculum should take roughly 24 to 36 months to complete, and costs less than $22,000--about 25%-33% of what an on-campus MBA program typically runs.

(The iMBA is actually one of three degree programs the University of Illinois has launched on Coursera. The other two are a Master of Science in Accountancy (iMSA) and a Master of Computer Science in Data Science (MCS-DS).)

Now, in case you're wondering, the diplomas and transcripts for these programs are granted directly by the universities themselves (e.g., the University of Illinois at Urbana-Champaign and HEC Paris). The paperwork doesn't carry Coursera's name. Nor does it indicate that the student completed an "online program." In short, online students get the same transcript as brick-and-mortar students.

Finally, all of the degree programs mentioned above are "stackable"--meaning students can (at no cost) take an individual course offered by any of these programs. They can then decide later whether they want to apply to the degree program and, if so, count that course retroactively toward the actual degree. Essentially, you can try things out before making a larger commitment.

If you want to learn more about these programs, or submit an application, check out the following links. We've included the deadlines for submitting applications.

Follow Open Culture on Facebook and Twitter and share intelligent media with your friends. Or better yet, sign up for our daily email and get a daily dose of Open Culture in your inbox. 

If you'd like to support Open Culture and our mission, please consider making a donation to our site. It's hard to rely 100% on ads, and your contributions will help us provide the best free cultural and educational materials.

Note: Open Culture has a partnership with Coursera. If readers enroll in certain Coursera courses, it helps support Open Culture.

Related Content:

New Deep Learning Courses Released on Coursera, with Hope of Teaching Millions the Basics of Artificial Intelligence

MOOCs from Great Universities (Many With Certificates)

Alan Turing Algorithmically Approximated by Ellipses: A Computer Art Project

Just a cool find on Twitter: a work of computer art created by Jeremy Kun, a math PhD from the University of Illinois at Chicago and now an engineer at Google.


via BoingBoing

Related Content:

Japanese Computer Artist Makes “Digital Mondrians” in 1964: When Giant Mainframe Computers Were First Used to Create Art

When J.M. Coetzee Secretly Programmed Computers to Write Poetry in the 1960s

Hear the First Recording of Computer Music: Researchers Restore Three Melodies Programmed on Alan Turing’s Computer (1951)

 

The Map of Computer Science: New Animation Presents a Survey of Computer Science, from Alan Turing to “Augmented Reality”

I’ve never wanted to start a sentence with “I’m old enough to remember…” because, well, who does? But here we are. I remember the enormously successful Apple IIe and Commodore 64, and a world before Microsoft Windows. Smartphones were science fiction. To do much more than word process or play games, one had to learn a programming language. These ancient days seemed at the time (and in hindsight as well) to be the very dawn of computing. Before the personal computer, such devices were the size of kitchen appliances and were hidden away in military installations, universities, and NASA labs.

But of course we all know that the history of computing goes far beyond the early ’80s: at least back to World War II, and perhaps even much further. Do we begin with the abacus, the 2,200-year-old Antikythera Mechanism, the astrolabe, or Ada Lovelace and Charles Babbage? The question is perhaps one of definitions. In the short, animated video above, physicist, science writer, and YouTube educator Dominic Walliman defines the computer according to its basic binary function of “just flipping zeros and ones,” and he begins his condensed history of computer science with tragic genius Alan Turing, of Turing Test and Bletchley Park codebreaking fame.

Turing’s most significant contribution to computing came from his 1936 concept of the “Turing Machine,” a theoretical mechanism that could, writes the Cambridge Computer Laboratory, “simulate ANY computer algorithm, no matter how complicated it is!” All other designs, says Walliman (apart from a quantum computer), are equivalent to the Turing Machine, “which makes it the foundation of computer science.” But since Turing’s time, the simple design has come to seem endlessly capable of adaptation and innovation.
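If a machine that can “simulate any computer algorithm” sounds abstract, a toy simulator makes the idea concrete. The Python sketch below is a generic illustration rather than anything from Walliman’s video: a tape of symbols, a read/write head, and a table of transition rules are all it takes, here programmed with a trivial rule set that flips every bit it reads.

```python
# A toy Turing machine simulator: tape + head + transition rules.
def run(tape, rules, state="start", blank="_", max_steps=1000):
    tape, head = list(tape), 0
    for _ in range(max_steps):
        if state == "halt":
            break
        symbol = tape[head] if head < len(tape) else blank
        write, move, state = rules[(state, symbol)]    # consult the rule table
        if head < len(tape):
            tape[head] = write
        else:
            tape.append(write)
        head += 1 if move == "R" else -1               # move the head
    return "".join(tape)

# Example rule table: flip 0s and 1s, halt at the first blank cell.
rules = {
    ("start", "0"): ("1", "R", "start"),
    ("start", "1"): ("0", "R", "start"),
    ("start", "_"): ("_", "R", "halt"),
}

print(run("10110_", rules))   # prints "01001_"
```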

Walliman illustrates the computer’s exponential growth by pointing out that a smartphone has more computing power than the entire world possessed in 1963, and that the computing capability that first landed astronauts on the moon is equal to “a couple of Nintendos” (first-generation classic consoles, judging by the image). But despite the hubris of the computer age, Walliman points out that “there are some problems which, due to their very nature, can never be solved by a computer,” either because of the degree of uncertainty involved or the degree of inherent complexity. This fascinating yet abstract discussion is where Walliman’s “Map of Computer Science” begins, and for most of us this will probably be unfamiliar territory.

We’ll feel more at home once the map moves from the region of Computer Theory to that of Computer Engineering, but while Walliman covers familiar ground here, he does not dumb it down. Once we get to applications, we’re in the realm of big data, natural language processing, the internet of things, and “augmented reality.” From here on out, computer technology will only get faster and weirder, despite the fact that the “underlying hardware is hitting some hard limits.” Certainly this very quick course in Computer Science only makes for an introductory survey of the discipline, but like Walliman’s other maps (of mathematics, physics, and chemistry), this one provides us with an impressive visual overview of the field that is both broad and specific, and that we likely wouldn’t encounter anywhere else.

As with his other maps, Walliman has made the Map of Computer Science available as a poster, perfect for dorm rooms, living rooms, or wherever else you might need a reminder.

Related Content:

Free Online Computer Science Courses

How Ada Lovelace, Daughter of Lord Byron, Wrote the First Computer Program in 1842–a Century Before the First Computer

Watch Breaking the Code, About the Life & Times of Alan Turing (1996)

The Map of Mathematics: Animation Shows How All the Different Fields in Math Fit Together

The Map of Physics: Animation Shows How All the Different Fields in Physics Fit Together

The Map of Chemistry: New Animation Summarizes the Entire Field of Chemistry in 12 Minutes

Josh Jones is a writer and musician based in Durham, NC. Follow him at @jdmagness

New Deep Learning Courses Released on Coursera, with Hope of Teaching Millions the Basics of Artificial Intelligence

FYI: If you follow edtech, you know the name Andrew Ng. He's the Stanford computer science professor who co-founded MOOC provider Coursera and later became chief scientist at Baidu. Since leaving Baidu, he's been working on three artificial intelligence projects, the first of which he unveiled yesterday. On Medium, he wrote:

I have been working on three new AI projects, and am thrilled to announce the first one: deeplearning.ai, a project dedicated to disseminating AI knowledge, is launching a new sequence of Deep Learning courses on Coursera. These courses will help you master Deep Learning, apply it effectively, and build a career in AI.

Speaking to the MIT Technology Review, Ng elaborated: "The thing that really excites me today is building a new AI-powered society... I don’t think any one company could do all the work that needs to be done, so I think the only way to get there is if we teach millions of people to use these AI tools so they can go and invent the things that no large company, or company I could build, could do."

Ng's new 5-part series of courses on Deep Learning can be accessed here.


Related Content:

Google Launches Free Course on Deep Learning: The Science of Teaching Computers How to Teach Themselves

Google’s DeepMind AI Teaches Itself to Walk, and the Results Are Kooky, No Wait, Chilling

Artificial Intelligence: A Free Online Course from MIT

Google Launches Free Course on Deep Learning: The Science of Teaching Computers How to Teach Themselves

Last Friday, we mentioned how software from Google's artificial intelligence lab DeepMind can teach itself many things. It can teach itself how to walk, jump and run. Even take professional pictures. Or defeat the world's best player of the Chinese strategy game, Go. The approach behind these feats--training many-layered neural networks to learn from data--is called Deep Learning. And you can now immerse yourself in this world by taking a free, 3-month course on the subject. Offered through Udacity, the course is taught by Vincent Vanhoucke, a technical lead in the Google Brain team. You can learn more about the course via Vanhoucke's blog post. Or just enroll here. (You will need to create an account with Udacity to get started.)

The course takes about three months to complete. It will be added to our list of Free Computer Science courses, a subset of our larger collection, 1,300 Free Online Courses from Top Universities.


Related Content:

Google’s DeepMind AI Teaches Itself to Walk, and the Results Are Kooky, No Wait, Chilling

Learn Python: A Free Online Course from Google

Take a Free Course on Digital Photography from Stanford Prof Marc Levoy

 

How Aristotle Invented Computer Science

In popular conceptions, we take the computer to be the natural outcome of empirical science, an inheritance of the Enlightenment and subsequent scientific revolutions in the 19th and 20th centuries. Of course, modern computers have their ancient precursors, like the Antikythera Mechanism, a 2,200-year-old bronze and wood machine capable of predicting the positions of the planets, eclipses, and phases of the moon. But even this fascinating artifact fits into the narrative of computer science as “a history of objects, from the abacus to the Babbage engine up through the code-breaking machines of World War II.” Much less do we invoke the names of “philosopher-mathematicians,” writes Chris Dixon at The Atlantic, like George Boole and Gottlob Frege, “who were themselves inspired by Leibniz’s dream of a universal ‘concept language,’ and the ancient logical system of Aristotle.” But these thinkers are as essential, if not more so, to computer science, especially, Dixon argues, Aristotle.

The ancient Greek thinker did not invent a calculating machine, though such devices may have existed in his lifetime. Instead, as Dixon writes in his recent piece, “How Aristotle Created the Computer,” Aristotle laid the foundations of mathematical logic, “a field that would have more impact on the modern world than any other.”

The claim may strike historians of philosophy as somewhat ironic, given that Enlightenment philosophers like Francis Bacon and John Locke announced their modern projects by thoroughly repudiating the medieval scholastics, who, they alleged, were guilty of a slavish devotion to Aristotle. Their criticisms of medieval thought were varied, and in many ways warranted, and yet, like many an empiricist since, they often overlooked the critical importance of Aristotelian logic to scientific thought.

At the turn of the 20th century, almost three hundred years after Bacon sought to transcend Aristotle’s Organon with his form of natural philosophy, the formal logic of Aristotle could still be “considered a hopelessly abstract subject with no conceivable applications.” But Dixon traces the “evolution of computer science from mathematical logic” and Aristotelian thought, beginning in the 1930s with Claude Shannon, author of the groundbreaking essay “A Symbolic Analysis of Relay and Switching Circuits.” Shannon drew on the work of George Boole, whose name is now known to every computer scientist and engineer but who, in 1938, “was rarely read outside of philosophy departments.” And Boole owed his principal intellectual debt, as he acknowledged in his 1854 The Laws of Thought, to Aristotle’s syllogistic reasoning.

Boole derived his operations by replacing the terms in a syllogism with variables, “and the logical words ‘all’ and ‘are’ with arithmetical operators.” Shannon discovered that “Boole’s system could be mapped directly onto electrical circuits,” which hitherto “had no systematic theory governing their design.” The insight “allowed computer scientists to import decades of work in logic and mathematics by Boole and subsequent logicians.” Shannon, Dixon writes, “was the first to distinguish between the logical and the physical layer of computers,” a distinction now “so fundamental to computer science that it might seem surprising to modern readers how insightful it was at the time.” And yet, the field could not move forward without it—without, that is, a return to ancient categories of thought.
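Boole’s substitution is easy to see in a few lines of code. The sketch below is a general illustration, not something drawn from Dixon’s essay or from Boole’s own notation: truth values become the numbers 0 and 1, the logical connectives become ordinary arithmetic, and a syllogism (flattened here into propositional form) can be checked mechanically, which is exactly the kind of operation Shannon realized a relay circuit could perform.

```python
# Truth values as numbers, connectives as arithmetic -- Boole's move, and the
# same operations Shannon later mapped onto relay circuits.
def NOT(x):     return 1 - x            # negation
def AND(x, y):  return x * y            # switches in series
def OR(x, y):   return x + y - x * y    # switches in parallel
def IMPLIES(x, y):                      # "all X are Y", read as material implication
    return OR(NOT(x), y)

# Check the classic syllogism over every 0/1 assignment:
# (man -> mortal) and (socrates -> man) together entail (socrates -> mortal).
for man in (0, 1):
    for mortal in (0, 1):
        for socrates in (0, 1):
            premises = AND(IMPLIES(man, mortal), IMPLIES(socrates, man))
            conclusion = IMPLIES(socrates, mortal)
            assert IMPLIES(premises, conclusion) == 1

print("The syllogism holds under every assignment of truth values.")
```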

Since the 1940s, computer programming has become significantly more sophisticated. One thing that hasn’t changed is that it still primarily consists of programmers specifying rules for computers to follow. In philosophical terms, we’d say that computer programming has followed in the tradition of deductive logic, the branch of logic discussed above, which deals with the manipulation of symbols according to formal rules.

Dixon’s argument for the centrality of Aristotle to modern computer science takes many turns: through the quasi-mystical thought of the 13th-century philosopher Ramon Llull and, later, his admirer Gottfried Leibniz. Through Descartes, and later Frege and Bertrand Russell. Through Alan Turing’s work at Bletchley Park. Nowhere do we see Aristotle, wrapped in a toga, building a circuit board in his garage, but his modes of reasoning are everywhere in evidence as the scaffolding upon which all modern computer science has been built. Aristotle’s attempts to understand the laws of the human mind “helped create machines that could reason according to the rules of deductive logic.” The application of ancient philosophical principles may, Dixon concludes, “result in the creation of new minds—artificial minds—that might someday match or even exceed our own.” Read Dixon’s essay at The Atlantic, or hear it read in its entirety in the audio above.

Related Content:

Free Online Computer Science Courses

How the World’s Oldest Computer Worked: Reconstructing the 2,200-Year-Old Antikythera Mechanism

The Books on Young Alan Turing’s Reading List: From Lewis Carroll to Modern Chromatics

How Arabic Translators Helped Preserve Greek Philosophy … and the Classical Tradition

Josh Jones is a writer and musician based in Durham, NC. Follow him at @jdmagness

 
