How Aristotle Invented Computer Science

In popular conceptions, we take the computer to be the natural outcome of empirical science, an inheritance of the Enlightenment and subsequent scientific revolutions in the 19th and 20th centuries. Of course, modern computers have their ancient precursors, like the Antikythera Mechanism, a 2,200-year-old bronze and wood machine capable of predicting the positions of the planets, eclipses, and phases of the moon. But even this fascinating artifact fits into the narrative of computer science as “a history of objects, from the abacus to the Babbage engine up through the code-breaking machines of World War II.” Much less do we invoke the names of “philosopher-mathematicians” like George Boole and Gottlob Frege, writes Chris Dixon at The Atlantic, “who were themselves inspired by Leibniz’s dream of a universal ‘concept language,’ and the ancient logical system of Aristotle.” But these thinkers are as essential to computer science as its machines, if not more so, and none more, Dixon argues, than Aristotle.

The ancient Greek thinker did not invent a calculating machine, though such devices may have existed in his lifetime. Instead, as Dixon writes in his recent piece, “How Aristotle Created the Computer,” Aristotle laid the foundations of mathematical logic, “a field that would have more impact on the modern world than any other.”

The claim may strike historians of philosophy as somewhat ironic, given that Enlightenment philosophers like Francis Bacon and John Locke announced their modern projects by thoroughly repudiating the medieval scholastics, whom they alleged were guilty of a slavish devotion to Aristotle. Their criticisms of medieval thought were varied and in many ways warranted, and yet, like many an empiricist since, they often overlooked the critical importance of Aristotelian logic to scientific thought.

At the turn of the 20th century, almost three hundred years after Bacon sought to transcend Aristotle’s Organon with his own form of natural philosophy, the formal logic of Aristotle could still be “considered a hopelessly abstract subject with no conceivable applications.” But Dixon traces the “evolution of computer science from mathematical logic” and Aristotelian thought, beginning in the 1930s with Claude Shannon, author of the groundbreaking essay “A Symbolic Analysis of Relay and Switching Circuits.” Shannon drew on the work of George Boole, whose name is now known to every computer scientist and engineer but who, in 1938, “was rarely read outside of philosophy departments.” And Boole owed his principal intellectual debt, as he acknowledged in his 1854 The Laws of Thought, to Aristotle’s syllogistic reasoning.

Boole derived his operations by replacing the terms in a syllogism with variables, “and the logical words ‘all’ and ‘are’ with arithmetical operators.” Shannon discovered that “Boole’s system could be mapped directly onto electrical circuits,” which hitherto “had no systematic theory governing their design.” The insight “allowed computer scientists to import decades of work in logic and mathematics by Boole and subsequent logicians.” Shannon, Dixon writes, “was the first to distinguish between the logical and the physical layer of computers,” a distinction now “so fundamental to computer science that it might seem surprising to modern readers how insightful it was at the time.” And yet, the field could not move forward without it—without, that is, a return to ancient categories of thought.
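To see Shannon’s insight in miniature, consider a short sketch in Python (ours, not Shannon’s notation): in Boole’s algebra over 0 and 1, “and” behaves like multiplication and “or” like capped addition, and those same two operations describe a pair of switches wired in series and in parallel.

```python
# A minimal sketch of Shannon's insight: Boole's algebra over {0, 1}
# maps directly onto switching circuits. Illustrative only; the names
# below are ours, not Shannon's.

def AND(x, y):
    # Boole: logical "and" behaves like multiplication over {0, 1}.
    return x * y

def OR(x, y):
    # Boole: logical "or" behaves like capped addition over {0, 1}.
    return min(x + y, 1)

def NOT(x):
    # Boole: negation is subtraction from 1.
    return 1 - x

# Circuit view: two switches in series conduct only if both are closed
# (AND); two switches in parallel conduct if either one is closed (OR).
def series(switch_a, switch_b):
    return AND(switch_a, switch_b)

def parallel(switch_a, switch_b):
    return OR(switch_a, switch_b)

# Exhaustively check that the algebra and the circuit agree.
for a in (0, 1):
    for b in (0, 1):
        assert series(a, b) == AND(a, b)
        assert parallel(a, b) == OR(a, b)
        print(f"a={a} b={b}  series(AND)={series(a, b)}  parallel(OR)={parallel(a, b)}")
```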

Since the 1940s, computer programming has become significantly more sophisticated. One thing that hasn’t changed is that it still primarily consists of programmers specifying rules for computers to follow. In philosophical terms, we’d say that computer programming has followed in the tradition of deductive logic, the branch of logic discussed above, which deals with the manipulation of symbols according to formal rules.
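As a toy illustration of that deductive tradition (our own sketch, not anything from Dixon’s essay), here is a program that applies Aristotle’s syllogism as a purely formal rule, deriving “all Greeks are mortal” from “all Greeks are men” and “all men are mortal” without any notion of what the symbols mean:

```python
# A toy deductive engine in the syllogistic spirit: facts are "all X are Y"
# pairs, and the single inference rule chains them transitively.
# Illustrative sketch only.

facts = {("men", "mortal"), ("Greeks", "men")}

def deduce(facts):
    """Repeatedly apply: all A are B, all B are C => all A are C."""
    derived = set(facts)
    changed = True
    while changed:
        changed = False
        for a, b in list(derived):
            for b2, c in list(derived):
                if b == b2 and (a, c) not in derived:
                    derived.add((a, c))
                    changed = True
    return derived

for a, b in sorted(deduce(facts)):
    print(f"all {a} are {b}")
# Derives "all Greeks are mortal" by rule alone, with no understanding
# of what "Greeks" or "mortal" mean.
```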

Dixon’s argument for the centrality of Aristotle to modern computer science takes many turns—through the quasi-mystical thought of the 13th-century philosopher Ramon Llull and, later, his admirer Gottfried Leibniz; through Descartes, and later Frege and Bertrand Russell; through Alan Turing’s work at Bletchley Park. Nowhere do we see Aristotle, wrapped in a toga, building a circuit board in his garage, but his modes of reasoning are everywhere in evidence as the scaffolding upon which all modern computer science has been built. Aristotle’s attempts to understand the laws of the human mind “helped create machines that could reason according to the rules of deductive logic.” The application of ancient philosophical principles may, Dixon concludes, “result in the creation of new minds—artificial minds—that might someday match or even exceed our own.” Read Dixon’s essay at The Atlantic, or hear it read in its entirety in the audio above.

Related Content:

Free Online Computer Science Courses

How the World’s Oldest Computer Worked: Reconstructing the 2,200-Year-Old Antikythera Mechanism

The Books on Young Alan Turing’s Reading List: From Lewis Carroll to Modern Chromatics

How Arabic Translators Helped Preserve Greek Philosophy … and the Classical Tradition

Josh Jones is a writer and musician based in Durham, NC. Follow him at @jdmagness

How Ada Lovelace, Daughter of Lord Byron, Wrote the First Computer Program in 1842, a Century Before the First Computer

I’ve never quite understood why the phrase “revisionist history” became purely pejorative. Of course, it has its Orwellian dark side, but all knowledge has to be revised periodically, as we acquire new information and, ideally, discard old prejudices and narrow frames of reference. A failure to do so seems fundamentally regressive, not only in political terms, but also in terms of how we value accurate, interesting, and engaged scholarship. Such research has recently brought us fascinating stories about previously marginalized people who made significant contributions to scientific discovery, such as NASA's “human computers,” portrayed in the book Hidden Figures, then dramatized in the film of the same name.

Likewise, the many women who worked at Bletchley Park during World War II—helping to decipher encryptions like the Nazi Enigma Code (out of nearly 10,000 codebreakers, about 75% were women)—have recently been getting their historical due, thanks to “revisionist” researchers. And, as we noted in a recent post, we might not know much, if anything, about silent film star Hedy Lamarr’s significant contributions to wireless, GPS, and Bluetooth technology were it not for the work of historians like Richard Rhodes. These few examples, among many, show us a fuller, more accurate, and more interesting view of the history of science and technology, and they inspire women and girls who want to enter the field, yet have grown up with few role models to encourage them.

We can add to the pantheon of great women in science the name Ada Byron, Countess of Lovelace, the daughter of Romantic poet Lord Byron. Lovelace has been renowned, as Hank Green tells us in the video at the top of the post, for writing the first computer program, “despite living a century before the invention of the modern computer.” This picture of Lovelace has been a controversial one. “Historians disagree,” writes prodigious mathematician Stephen Wolfram. “To some she is a great hero in the history of computing; to others an overestimated minor figure.”

Wolfram spent some time with “many original documents” to untangle the mystery. “I feel like I’ve finally gotten to know Ada Lovelace,” he writes, “and gotten a grasp on her story. In some ways it’s an ennobling and inspiring story; in some ways it’s frustrating and tragic.” Educated in math and music by her mother, Anne Isabella Milbanke, Lovelace became acquainted with mathematics professor Charles Babbage, the inventor of a calculating machine called the Difference Engine, “a 2-foot-high hand-cranked contraption with 2000 brass parts.” Babbage encouraged her to pursue her interests in mathematics, and she did so throughout her life.

Widely acknowledged as one of the forefathers of computing, Babbage eventually corresponded with Lovelace on the creation of another machine, the Analytical Engine, which “supported a whole list of possible kinds of operations, that could in effect be done in arbitrarily programmed sequence.” When, in 1842, Italian mathematician Luigi Menabrea published a paper in French on the Analytical Engine, “Babbage enlisted Ada as translator,” notes the San Diego Supercomputer Center’s Women in Science project. “During a nine-month period in 1842-43, she worked feverishly on the article and a set of Notes she appended to it. These are the source of her enduring fame.” (You can read her translation and notes here.)

In the course of his research, Wolfram pored over Babbage and Lovelace’s correspondence about the translation, which reads “a lot like emails about a project might today, apart from being in Victorian English.” Although she built on Babbage and Menabrea’s work, “She was clearly in charge” of successfully extrapolating the possibilities of the Analytical Engine, but she felt “she was first and foremost explaining Babbage’s work, so wanted to check things with him.” Her additions to the work were very well received—Michael Faraday called her “the rising star of Science”—and when her notes were published, Babbage wrote, “you should have written an original paper.”

Unfortunately, as a woman, “she couldn’t get access to the Royal Society’s library in London,” and her ambitions were derailed by a severe health crisis. Lovelace died of cancer at the age of 37, and for some time her work sank into semi-obscurity. Though some historians have seen her as simply an expositor of Babbage’s work, Wolfram concludes that it was Ada who had the idea of “what the Analytical Engine should be capable of.” Her notes suggested possibilities Babbage had never dreamed of. As the Women in Science project puts it, “She rightly saw [the Analytical Engine] as what we would call a general-purpose computer. It was suited for ‘developping [sic] and tabulating any function whatever. . . the engine [is] the material expression of any indefinite function of any degree of generality and complexity.’ Her Notes anticipate future developments, including computer-generated music.”

In a recent episode of the BBC’s In Our Time, above, you can hear host Melvyn Bragg discuss Lovelace’s importance with historians and scholars Patricia Fara, Doron Swade, and John Fuegi. And be sure to read Wolfram’s biographical and historical account of Lovelace here.

Related Content:

How 1940s Film Star Hedy Lamarr Helped Invent the Technology Behind Wi-Fi & Bluetooth During WWII

The Contributions of Women Philosophers Recovered by the New Project Vox Website

Real Women Talk About Their Careers in Science

Josh Jones is a writer and musician based in Durham, NC. Follow him at @jdmagness

Google Uses Artificial Intelligence to Map Thousands of Bird Sounds Into an Interactive Visualization

If you were around in 2013, you may recall that we told you about Cornell's Archive of 150,000 Bird Calls and Animal Sounds, with Recordings Going Back to 1929. It's a splendid place for ornithologists and bird lovers to spend time. And, it turns out, the same also applies to computer programmers.

Late last year, Google launched an experiment in which, drawing on Cornell's sound archive, they used machine learning (a branch of artificial intelligence in which computers learn patterns from data rather than following explicitly programmed instructions) to organize thousands of bird sounds into a map where similar sounds sit closer together. The result is this impressive interactive visualization. Check it out. Or head into Cornell's archive and do your own old-fashioned explorations.
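Google hasn't published the details of its pipeline in the experiment itself, but the general recipe behind such a map is well established: turn each audio clip into a feature vector, then let a dimensionality-reduction algorithm such as t-SNE place similar vectors near one another in two dimensions. Here's a minimal sketch in Python, assuming a hypothetical local folder of WAV clips and the librosa and scikit-learn libraries:

```python
# Minimal sketch of an audio-similarity map: featurize each clip, then
# embed the features in 2D so that similar sounds land near each other.
# "bird_clips/" is a hypothetical local folder of WAV files.
import glob

import numpy as np
import librosa                      # audio feature extraction
from sklearn.manifold import TSNE   # dimensionality reduction

def featurize(path):
    """Summarize a clip as the mean of its MFCC frames (a common baseline)."""
    y, sr = librosa.load(path, sr=22050, mono=True)
    mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=20)
    return mfcc.mean(axis=1)

paths = sorted(glob.glob("bird_clips/*.wav"))
features = np.stack([featurize(p) for p in paths])

# t-SNE pulls similar feature vectors together and pushes dissimilar
# ones apart; perplexity must stay below the number of samples.
tsne = TSNE(n_components=2, perplexity=min(30, len(paths) - 1), random_state=0)
coords = tsne.fit_transform(features)

for path, (x, y) in zip(paths, coords):
    print(f"{path}: ({x:.2f}, {y:.2f})")
```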

Note: You can find free courses on machine learning and artificial intelligence in the Relateds below.

Follow Open Culture on Facebook and Twitter and share intelligent media with your friends. Or better yet, sign up for our daily email and get a daily dose of Open Culture in your inbox. 

If you'd like to support Open Culture and our mission, please consider making a donation to our site. It's hard to rely 100% on ads, and your contributions will help us provide the best free cultural and educational materials.

Related Content:

Cornell Launches Archive of 150,000 Bird Calls and Animal Sounds, with Recordings Going Back to 1929 

Neural Networks for Machine Learning: A Free Online Course 

A Free Course on Machine Learning & Data Science from Caltech

Introduction to Python, Data Science & Computational Thinking: Free Online Courses from MIT

FYI: MIT has posted online the video lectures for an essential series of courses. In the playlist of 38 lectures above, you can get an Introduction to Computer Science and Programming in Python. Recorded this past fall and taught by Prof. Eric Grimson, Prof. John Guttag, and Dr. Ana Bell, the course is "intended for students with little or no programming experience. It aims to provide students with an understanding of the role computation can play in solving problems and to help students, regardless of their major, feel justifiably confident of their ability to write small programs that allow them to accomplish useful goals. The class uses the Python 3.5 programming language." Find accompanying course materials, including the syllabus, here.

The follow-up course, Introduction to Computational Thinking and Data Science, is again intended for students with little or no programming experience. It shares the same aims as the first course and likewise uses the Python 3.5 programming language. Find related course materials here, and the 15 lectures on this playlist.

Both courses will be added to our collection of Free Computer Science Courses, a subset of our collection, 1,250 Free Online Courses from Top Universities.

Related Content:

Learn Python: A Free Online Course from Google

Learn How to Code for Free: A DIY Guide for Learning HTML, Python, Javascript & More

Download 243 Free eBooks on Design, Data, Software, Web Development & Business from O’Reilly Media

Artificial Intelligence: A Free Online Course from MIT

Today we're adding MIT's course on Artificial Intelligence to our ever-growing collection, 1,250 Free Online Courses from Top Universities. That's because, to paraphrase Amazon's Jeff Bezos, artificial intelligence (AI) is "not just in the first inning of a long baseball game, but at the stage where the very first batter comes up." Look around, and you will find AI everywhere--in self-driving cars, Siri on your phone, online customer support, movie recommendations on Netflix, fraud detection for your credit cards, etc. To be sure, there's more to come.

Featuring 30 lectures, MIT's course "introduces students to the basic knowledge representation, problem solving, and learning methods of artificial intelligence." It includes interactive demonstrations designed to "help students gain intuition about how artificial intelligence methods work under a variety of circumstances." And, by the end of the course, students should be able "to develop intelligent systems by assembling solutions to concrete computational problems; understand the role of knowledge representation, problem solving, and learning in intelligent-system engineering; and appreciate the role of problem solving, vision, and language in understanding human intelligence from a computational perspective."

Taught by Prof. Patrick Henry Winston, the lectures can all be viewed above. Or watch them on YouTube and iTunes. Related course materials (including a syllabus) can be found on this MIT website. The textbook, available on Amazon, was written by Professor Winston.

Related Content:

Free Online Computer Science Courses

Hayao Miyazaki Tells Video Game Makers What He Thinks of Their Characters Made with Artificial Intelligence: “I’m Utterly Disgusted. This Is an Insult to Life Itself”

Artificial Intelligence Program Tries to Write a Beatles Song: Listen to “Daddy’s Car”

Two Artificial Intelligence Chatbots Talk to Each Other & Get Into a Deep Philosophical Conversation

Noam Chomsky Explains Where Artificial Intelligence Went Wrong

36 eBooks on Computer Programming from O’Reilly Media: Free to Download and Read

This past week, we featured a free course on the programming language Python, presented by MIT. A handy resource, to be sure.

And then it struck us that you might want to complement that course with some of the 36 free ebooks on computer programming from O’Reilly Media--of which 7 are dedicated to Python itself. Other books focus on Java, C++, Swift, Software Architecture, and more. See the list of programming books here.

If you're looking for yet more free ebooks from O’Reilly Media, see the post in our archive: Download 243 Free eBooks on Design, Data, Software, Web Development & Business from O’Reilly Media.

For more computer science resources, see our collections:

Free Online Computer Science Courses

Free Textbooks: Computer Science

Related Content:

Learn Python: A Free Online Course from Google

Free Online Computer Science Courses

Learn Python with a Free Online Course from MIT

800 Free eBooks for iPad, Kindle & Other Devices

A Free Course on Machine Learning & Data Science from Caltech

Machine Learning and Data Science are two of the hottest topics at universities today, the subject of many new courses. Above, you can watch a playlist of 18 lectures from a course called Learning From Data: A Machine Learning Course, taught by Caltech's Feynman Prize-winning professor Yaser Abu-Mostafa. The course is summarized as follows:

This is an introductory course in machine learning (ML) that covers the basic theory, algorithms, and applications. ML is a key technology in Big Data, and in many financial, medical, commercial, and scientific applications. It enables computational systems to adaptively improve their performance with experience accumulated from the observed data. ML has become one of the hottest fields of study today, taken up by undergraduate and graduate students from 15 different majors at Caltech. This course balances theory and practice, and covers the mathematical as well as the heuristic aspects. The lectures follow each other in a story-like fashion.

A real Caltech course (not watered down at all), it assumes familiarity with basic probability, matrices, and calculus.
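The summary's central claim, that a system can adaptively improve its performance with experience accumulated from the observed data, is easy to demonstrate in miniature. Here is a small sketch of our own (not material from the course) using scikit-learn: the same classifier, shown progressively more data, gets steadily better on examples it has never seen.

```python
# A miniature "learning from data" demo: the same model, given more
# observed data, improves on unseen examples. Illustrative only; the
# course itself develops the theory behind why this works.
from sklearn.datasets import load_digits
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

X, y = load_digits(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0)

# Train on growing slices of the data and score on the held-out set.
for n in (20, 100, 500, len(X_train)):
    model = LogisticRegression(max_iter=2000)
    model.fit(X_train[:n], y_train[:n])
    acc = model.score(X_test, y_test)
    print(f"trained on {n:4d} examples -> test accuracy {acc:.2f}")
```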

The lectures can be found on YouTube, iTunes U, and this Caltech website, which hosts slides and other course materials. The professor wrote the course textbook, also called Learning from Data.

Learning From Data will be permanently added to our list of Free Online Computer Science Courses, part of our ever-growing collection, 1,250 Free Online Courses from Top Universities.

Related Content:

Download 243 Free eBooks on Design, Data, Software, Web Development & Business from O’Reilly Media

The Pioneering Physics TV Show, The Mechanical Universe, Is Now on YouTube: 52 Complete Episodes from Caltech

The Neuronal Basis of Consciousness Course: A Free Online Course from Caltech
