The Map of Computer Science: New Animation Presents a Survey of Computer Science, from Alan Turing to “Augmented Reality”

I’ve never wanted to start a sentence with “I’m old enough to remember…” because, well, who does? But here we are. I remember the enormously successful Apple IIe and Commodore 64, and a world before Microsoft. Smart phones were science fiction. To do much more than word process or play games, one had to learn a programming language. These ancient days seemed at the time—and in hindsight as well—to be the very dawn of computing. Before the personal computer, such devices were the size of kitchen appliances and were hidden away in military installations, universities, and NASA labs.

But of course we all know that the history of computing goes far beyond the early 80s: at least back to World War II, and perhaps even much farther. Do we begin with the abacus, the 2,200-Year-Old Antikythera Mechanism, the astrolabe, Ada Lovelace and Charles Babbage? The question is maybe one of definitions. In the short, animated video above, physicist, science writer, and YouTube educator Dominic Walliman defines the computer according to its basic binary function of “just flipping zeros and ones,” and he begins his condensed history of computer science with tragic genius Alan Turing of Turing Test and Bletchley Park codebreaking fame.




Turing’s most significant contribution to computing came from his 1936 concept of the “Turing Machine,” a theoretical mechanism that could, writes the Cambridge Computer Laboratory, “simulate ANY computer algorithm, no matter how complicated it is!” All other designs, says Walliman—apart from a quantum computer—are equivalent to the Turing Machine, “which makes it the foundation of computer science.” But since Turing’s time, the simple design has come to seem endlessly capable of adaptation and innovation.
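To make Turing’s idea a bit more concrete, here is a minimal Python sketch of a Turing machine simulator: a handful of states, a tape, and a table of transition rules. The “flip every bit” program it runs is my own toy example, not anything from Walliman’s video.

```python
# A minimal Turing machine: a tape (modeled as a dict), a read/write head,
# and a transition table mapping (state, symbol) -> (new symbol, move, new state).

def run_turing_machine(transitions, tape_input, start_state, halt_state, blank="_"):
    tape = dict(enumerate(tape_input))   # sparse tape; unwritten cells read as blank
    head, state = 0, start_state
    while state != halt_state:
        symbol = tape.get(head, blank)
        new_symbol, move, state = transitions[(state, symbol)]
        tape[head] = new_symbol
        head += 1 if move == "R" else -1
    cells = [tape.get(i, blank) for i in range(min(tape), max(tape) + 1)]
    return "".join(cells).strip(blank)

# Toy program: sweep right, flipping every bit, then halt on the first blank.
flip_bits = {
    ("scan", "0"): ("1", "R", "scan"),
    ("scan", "1"): ("0", "R", "scan"),
    ("scan", "_"): ("_", "R", "halt"),
}

print(run_turing_machine(flip_bits, "10110", "scan", "halt"))  # prints 01001
```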

Walliman illustrates the computer's exponential growth by pointing out that a smart phone has more computing power than the entire world possessed in 1963, and that the computing capability that first landed astronauts on the moon is equal to “a couple of Nintendos” (first generation classic consoles, judging by the image). But despite the hubris of the computer age, Walliman points out that “there are some problems which, due to their very nature, can never be solved by a computer” either because of the degree of uncertainty involved or the degree of inherent complexity. This fascinating, yet abstract discussion is where Walliman’s “Map of Computer Science” begins, and for most of us this will probably be unfamiliar territory.

We’ll feel more at home once the map moves from the region of Computer Theory to that of Computer Engineering, but while Walliman covers familiar ground here, he does not dumb it down. Once we get to applications, we’re in the realm of big data, natural language processing, the internet of things, and “augmented reality.” From here on out, computer technology will only get faster, and weirder, despite the fact that the “underlying hardware is hitting some hard limits.” Certainly this very quick course in Computer Science only makes for an introductory survey of the discipline, but like Walliman’s other maps—of mathematics, physics, and chemistry—this one provides us with an impressive visual overview of the field that is both broad and specific, and that we likely wouldn’t encounter anywhere else.

As with his other maps, Walliman has made the Map of Computer Science available as a poster, perfect for dorm rooms, living rooms, or wherever else you might need a reminder.

Related Content:

Free Online Computer Science Courses

How Ada Lovelace, Daughter of Lord Byron, Wrote the First Computer Program in 1842–a Century Before the First Computer

Watch Breaking the Code, About the Life & Times of Alan Turing (1996)

The Map of Mathematics: Animation Shows How All the Different Fields in Math Fit Together

The Map of Physics: Animation Shows How All the Different Fields in Physics Fit Together

The Map of Chemistry: New Animation Summarizes the Entire Field of Chemistry in 12 Minutes

Josh Jones is a writer and musician based in Durham, NC. Follow him at @jdmagness

New Deep Learning Courses Released on Coursera, with Hope of Teaching Millions the Basics of Artificial Intelligence

FYI: If you follow edtech, you know the name Andrew Ng. He's the Stanford computer science professor who co-founded the MOOC provider Coursera and later became chief scientist at Baidu. Since leaving Baidu, he's been working on three artificial intelligence projects, the first of which he unveiled yesterday. On Medium, he wrote:

I have been working on three new AI projects, and am thrilled to announce the first one: deeplearning.ai, a project dedicated to disseminating AI knowledge, is launching a new sequence of Deep Learning courses on Coursera. These courses will help you master Deep Learning, apply it effectively, and build a career in AI.

Speaking to the MIT Technology Review, Ng elaborated: "The thing that really excites me today is building a new AI-powered society... I don’t think any one company could do all the work that needs to be done, so I think the only way to get there is if we teach millions of people to use these AI tools so they can go and invent the things that no large company, or company I could build, could do."

Andrew's new 5-part series of courses on Deep Learning can be accessed here.

Follow Open Culture on Facebook and Twitter and share intelligent media with your friends. Or better yet, sign up for our daily email and get a daily dose of Open Culture in your inbox. 

If you'd like to support Open Culture and our mission, please consider making a donation to our site. It's hard to rely 100% on ads, and your contributions will help us provide the best free cultural and educational materials.

Related Content:

Google Launches Free Course on Deep Learning: The Science of Teaching Computers How to Teach Themselves

Google’s DeepMind AI Teaches Itself to Walk, and the Results Are Kooky, No Wait, Chilling

Artificial Intelligence: A Free Online Course from MIT

Google Launches Free Course on Deep Learning: The Science of Teaching Computers How to Teach Themselves

Last Friday, we mentioned how Google's artificial intelligence software DeepMind has the ability to teach itself many things. It can teach itself how to walk, jump and run. Even take professional pictures. Or defeat the world's best player of the Chinese strategy game, Go. The technique that lets computers learn these skills on their own, by training layered neural networks on large amounts of data, is called Deep Learning. And you can now immerse yourself in this world by taking a free, 3-month course on Deep Learning itself. Offered through Udacity, the course is taught by Vincent Vanhoucke, the technical lead of Google's Brain team. You can learn more about the course via Vanhoucke's blog post. Or just enroll here. (You will need to create an account with Udacity to get started.)

The free course takes about 3 months to complete. It will be added to our list of Free Computer Science Courses, a subset of our larger collection, 1,300 Free Online Courses from Top Universities.


Related Content:

Google’s DeepMind AI Teaches Itself to Walk, and the Results Are Kooky, No Wait, Chilling

Learn Python: A Free Online Course from Google

Take a Free Course on Digital Photography from Stanford Prof Marc Levoy


How Aristotle Invented Computer Science

In popular conceptions, we take the computer to be the natural outcome of empirical science, an inheritance of the Enlightenment and subsequent scientific revolutions in the 19th and 20th centuries. Of course, modern computers have their ancient precursors, like the Antikythera Mechanism, a 2,200-year-old bronze and wood machine capable of predicting the positions of the planets, eclipses, and phases of the moon. But even this fascinating artifact fits into the narrative of computer science as “a history of objects, from the abacus to the Babbage engine up through the code-breaking machines of World War II.” Much less do we invoke the names of “philosopher-mathematicians,” writes Chris Dixon at The Atlantic, like George Boole and Gottlob Frege, “who were themselves inspired by Leibniz’s dream of a universal ‘concept language,’ and the ancient logical system of Aristotle.” But these thinkers are as essential, if not more so, to computer science, especially, Dixon argues, Aristotle.

The ancient Greek thinker did not invent a calculating machine, though such machines may have existed in his lifetime. Instead, as Dixon writes in his recent piece, “How Aristotle Created the Computer,” Aristotle laid the foundations of mathematical logic, “a field that would have more impact on the modern world than any other.”




The claim may strike historians of philosophy as somewhat ironic, given that Enlightenment philosophers like Francis Bacon and John Locke announced their modern projects by thoroughly repudiating the medieval scholastics, whom they accused of a slavish devotion to Aristotle. Their criticisms of medieval thought were varied and in many ways warranted, and yet, like many an empiricist since, they often overlooked the critical importance of Aristotelian logic to scientific thought.

At the turn of the 20th century, almost three hundred years after Bacon sought to transcend Aristotle’s Organon with his form of natural philosophy, the formal logic of Aristotle could still be “considered a hopelessly abstract subject with no conceivable applications.” But Dixon traces the “evolution of computer science from mathematical logic” and Aristotelian thought, beginning in the 1930s with Claude Shannon, author of the groundbreaking essay “A Symbolic Analysis of Relay and Switching Circuits.” Shannon drew on the work of George Boole, whose name is now known to every computer scientist and engineer but who, in 1938, “was rarely read outside of philosophy departments.” And Boole owed his principal intellectual debt, as he acknowledged in his 1854 book The Laws of Thought, to Aristotle’s syllogistic reasoning.

Boole derived his operations by replacing the terms in a syllogism with variables, “and the logical words ‘all’ and ‘are’ with arithmetical operators.” Shannon discovered that “Boole’s system could be mapped directly onto electrical circuits,” which hitherto “had no systematic theory governing their design.” The insight “allowed computer scientists to import decades of work in logic and mathematics by Boole and subsequent logicians.” Shannon, Dixon writes, “was the first to distinguish between the logical and the physical layer of computers,” a distinction now “so fundamental to computer science that it might seem surprising to modern readers how insightful it was at the time.” And yet, the field could not move forward without it—without, that is, a return to ancient categories of thought.
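That correspondence is easy to see in code. The short Python sketch below is my own illustration, not anything from Dixon’s essay: it treats true and false as the numbers 1 and 0, so that AND, OR, and NOT become ordinary arithmetic over those two values, which is exactly the property that let Shannon realize Boole’s algebra in relay and switching circuits.

```python
# Boolean algebra over {0, 1}: logical operations written as arithmetic,
# the correspondence Shannon exploited to analyze switching circuits.

def AND(a, b):   # 1 only when both "switches" are closed (contacts in series)
    return a * b

def OR(a, b):    # 1 when either switch is closed (contacts in parallel)
    return a + b - a * b

def NOT(a):      # inverts a signal
    return 1 - a

# A (much simplified) syllogism with its terms replaced by 0/1 variables.
socrates_is_a_man, all_men_are_mortal = 1, 1
socrates_is_mortal = AND(socrates_is_a_man, all_men_are_mortal)
print(socrates_is_mortal)  # prints 1

# Truth table for a small "circuit": out = (a AND b) OR (NOT c)
for a in (0, 1):
    for b in (0, 1):
        for c in (0, 1):
            print(a, b, c, "->", OR(AND(a, b), NOT(c)))
```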

Since the 1940s, computer programming has become significantly more sophisticated. One thing that hasn’t changed is that it still primarily consists of programmers specifying rules for computers to follow. In philosophical terms, we’d say that computer programming has followed in the tradition of deductive logic, the branch of logic discussed above, which deals with the manipulation of symbols according to formal rules.

Dixon’s argument for the centrality of Aristotle to modern computer science takes many turns—through the quasi-mystical thought of the 13th-century thinker Ramon Llull and, later, his admirer Gottfried Leibniz. Through Descartes, and later Frege and Bertrand Russell. Through Alan Turing’s work at Bletchley Park. Nowhere do we see Aristotle, wrapped in a toga, building a circuit board in his garage, but his modes of reasoning are everywhere in evidence as the scaffolding upon which all modern computer science has been built. Aristotle’s attempts to understand the laws of the human mind “helped create machines that could reason according to the rules of deductive logic.” The application of ancient philosophical principles may, Dixon concludes, “result in the creation of new minds—artificial minds—that might someday match or even exceed our own.” Read Dixon’s essay at The Atlantic, or hear it read in its entirety in the audio above.

Related Content:

Free Online Computer Science Courses

How the World’s Oldest Computer Worked: Reconstructing the 2,200-Year-Old Antikythera Mechanism

The Books on Young Alan Turing’s Reading List: From Lewis Carroll to Modern Chromatics

How Arabic Translators Helped Preserve Greek Philosophy … and the Classical Tradition

Josh Jones is a writer and musician based in Durham, NC. Follow him at @jdmagness


How Ada Lovelace, Daughter of Lord Byron, Wrote the First Computer Program in 1842–a Century Before the First Computer

I’ve never quite understood why the phrase “revisionist history” became purely pejorative. Of course, it has its Orwellian dark side, but all knowledge has to be revised periodically, as we acquire new information and, ideally, discard old prejudices and narrow frames of reference. A failure to do so seems fundamentally regressive, not only in political terms, but also in terms of how we value accurate, interesting, and engaged scholarship. Such research has recently brought us fascinating stories about previously marginalized people who made significant contributions to scientific discovery, such as NASA's “human computers,” portrayed in the book Hidden Figures, then dramatized in the film of the same name.

Likewise, the many women who worked at Bletchley Park during World War II—helping to decipher encryptions like the Nazi Enigma Code (out of nearly 10,000 codebreakers, about 75% were women)—have recently been getting their historical due, thanks to “revisionist” researchers. And, as we noted in a recent post, we might not know much, if anything, about film star Hedy Lamarr’s significant contributions to wireless, GPS, and Bluetooth technology were it not for the work of historians like Richard Rhodes. These few examples, among many, show us a fuller, more accurate, and more interesting view of the history of science and technology, and they inspire women and girls who want to enter the field, yet have grown up with few role models to encourage them.




We can add to the pantheon of great women in science the name Ada Byron, Countess of Lovelace, the daughter of Romantic poet Lord Byron. Lovelace has been renowned, as Hank Green tells us in the video at the top of the post, for writing the first computer program, “despite living a century before the invention of the modern computer.” This picture of Lovelace has been a controversial one. “Historians disagree,” writes prodigious mathematician Stephen Wolfram. “To some she is a great hero in the history of computing; to others an overestimated minor figure.”

Wolfram spent some time with “many original documents” to untangle the mystery. “I feel like I’ve finally gotten to know Ada Lovelace,” he writes, “and gotten a grasp on her story. In some ways it’s an ennobling and inspiring story; in some ways it’s frustrating and tragic.” Educated in math and music by her mother, Anne Isabella Milbanke, Lovelace became acquainted with mathematics professor Charles Babbage, the inventor of a calculating machine called the Difference Engine, “a 2-foot-high hand-cranked contraption with 2000 brass parts.” Babbage encouraged her to pursue her interests in mathematics, and she did so throughout her life.

Widely acknowledged as one of the forefathers of computing, Babbage eventually corresponded with Lovelace on the creation of another machine, the Analytical Engine, which “supported a whole list of possible kinds of operations, that could in effect be done in arbitrarily programmed sequence.” When, in 1842, Italian mathematician Luigi Menabrea published a paper in French on the Analytical Engine, “Babbage enlisted Ada as translator,” notes the San Diego Supercomputer Center's Women in Science project. “During a nine-month period in 1842-43, she worked feverishly on the article and a set of Notes she appended to it. These are the source of her enduring fame.” (You can read her translation and notes here.)

In the course of his research, Wolfram pored over Babbage and Lovelace’s correspondence about the translation, which reads “a lot like emails about a project might today, apart from being in Victorian English.” Although she built on Babbage and Menabrea’s work, “She was clearly in charge” of successfully extrapolating the possibilities of the Analytical Engine, but she felt “she was first and foremost explaining Babbage’s work, so wanted to check things with him.” Her additions to the work were very well-received—Michael Faraday called her “the rising star of Science”—and when her notes were published, Babbage wrote, “you should have written an original paper.”

Unfortunately, as a woman, “she couldn’t get access to the Royal Society’s library in London,” and her ambitions were derailed by a severe health crisis. Lovelace died of cancer at the age of 37, and for some time, her work sank into semi-obscurity. Though some historians have seen her as simply an expositor of Babbage’s work, Wolfram concludes that it was Ada who had the idea of “what the Analytical Engine should be capable of.” Her notes suggested possibilities Babbage had never dreamed of. As the Women in Science project puts it, "She rightly saw [the Analytical Engine] as what we would call a general-purpose computer. It was suited for 'developping [sic] and tabulating any function whatever. . . the engine [is] the material expression of any indefinite function of any degree of generality and complexity.' Her Notes anticipate future developments, including computer-generated music."

In a recent episode of the BBC’s In Our Time, above, you can hear host Melvyn Bragg discuss Lovelace’s importance with historians and scholars Patricia Fara, Doron Swade, and John Fuegi. And be sure to read Wolfram’s biographical and historical account of Lovelace here.

Related Content:

How 1940s Film Star Hedy Lamarr Helped Invent the Technology Behind Wi-Fi & Bluetooth During WWII

The Contributions of Women Philosophers Recovered by the New Project Vox Website

Real Women Talk About Their Careers in Science

Josh Jones is a writer and musician based in Durham, NC. Follow him at @jdmagness

Google Uses Artificial Intelligence to Map Thousands of Bird Sounds Into an Interactive Visualization

If you were around in 2013, you may recall that we told you about Cornell's Archive of 150,000 Bird Calls and Animal Sounds, with Recordings Going Back to 1929. It's a splendid place for ornithologists and bird lovers to spend time. And, it turns out, the same also applies to computer programmers.

Late last year, Google launched an experiment that, drawing on Cornell's sound archive, used machine learning (artificial intelligence that lets computers learn and do tasks on their own) to organize thousands of bird sounds into a map where similar sounds are placed closer together. The result is this impressive interactive visualization. Check it out. Or head into Cornell's archive and do your own old-fashioned explorations.
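Google doesn't spell out its method in the announcement, but the usual recipe for this kind of similarity map is to turn each recording into a numerical feature vector and then project those vectors down to two dimensions so that alike sounds land near one another. Here is a minimal Python sketch of that idea; the file names, the MFCC features, the librosa and scikit-learn libraries, and the choice of t-SNE are all illustrative assumptions on my part, not a description of Google's actual pipeline.

```python
# Sketch: place similar sounds near each other on a 2-D "map".
# Hypothetical clip list; requires the third-party librosa and scikit-learn packages.
import numpy as np
import librosa
from sklearn.manifold import TSNE

clip_paths = ["sparrow.wav", "warbler.wav", "owl.wav"]  # placeholder files

features = []
for path in clip_paths:
    audio, sample_rate = librosa.load(path, sr=22050)
    # MFCCs are a common compact summary of a sound's timbre.
    mfcc = librosa.feature.mfcc(y=audio, sr=sample_rate, n_mfcc=20)
    features.append(mfcc.mean(axis=1))      # one 20-number vector per clip

# t-SNE squeezes the high-dimensional vectors into 2-D, keeping neighbors close.
coords = TSNE(n_components=2, perplexity=2).fit_transform(np.array(features))
for path, (x, y) in zip(clip_paths, coords):
    print(f"{path}: ({x:.2f}, {y:.2f})")    # 2-D position on the map
```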

Note: You can find free courses on machine learning and artificial intelligence in the Relateds below.


Related Content:

Cornell Launches Archive of 150,000 Bird Calls and Animal Sounds, with Recordings Going Back to 1929 

Neural Networks for Machine Learning: A Free Online Course 

A Free Course on Machine Learning & Data Science from Caltech

Introduction to Python, Data Science & Computational Thinking: Free Online Courses from MIT

FYI: MIT has posted online the video lectures for an essential series of courses. In the playlist of 38 lectures above, you can get an Introduction to Computer Science and Programming in Python. Recorded this past fall, and taught by Prof. Eric Grimson, Prof. John Guttag, and Dr. Ana Bell, the course is "intended for students with little or no programming experience. It aims to provide students with an understanding of the role computation can play in solving problems and to help students, regardless of their major, feel justifiably confident of their ability to write small programs that allow them to accomplish useful goals. The class uses the Python 3.5 programming language." Find accompanying course materials, including syllabus, here.

The follow-up course, Introduction to Computational Thinking and Data Science, is again intended for students with little or no programming experience. "It aims to provide students with an understanding of the role computation can play in solving problems and to help students, regardless of their major, feel justifiably confident of their ability to write small programs that allow them to accomplish useful goals. The class uses the Python 3.5 programming language." Find related course materials here, and the 15 lectures on this playlist.

Both courses will be added to our collection of Free Computer Science Courses, a subset of our collection, 1,300 Free Online Courses from Top Universities.


Related Content:

Learn Python: A Free Online Course from Google

Learn How to Code for Free: A DIY Guide for Learning HTML, Python, Javascript & More

Download 243 Free eBooks on Design, Data, Software, Web Development & Business from O’Reilly Media
