Coursera Partners with Leading Universities to Offer Master’s Degrees at a More Affordable Price

If you’re a regular Open Culture reader, you’re already familiar with Coursera, the ed tech company, which, since its founding in 2012, has given the world access to online courses from top universities–e.g. courses on Roman Architecture (Yale), Modern and Postmodern Philosophy (Wesleyan), and Buddhism and Neuroscience (Princeton). And you’ve perhaps noticed, too, that Coursera has recently bundled certain courses into “Specializations”–essentially areas of concentration–that let students specialize in fields like Deep Learning and Data Science.

But what if students want to deepen their knowledge further and get a traditional degree? In what perhaps marks the beginning of a significant new trend, Coursera has partnered with leading universities to offer full-fledged graduate degrees in a more affordable online format. As described in the video above, HEC Paris (the #2 business school in Europe) now offers through Coursera’s platform a Master’s in Innovation and Entrepreneurship. Designed for aspiring entrepreneurs, the program consists of 20 courses (all online) and takes an estimated 10–16 months to complete. The total tuition amounts to 20,000 Euros (roughly 23,500 U.S. dollars), a sum that’s considerably less than what executive education programs usually cost.

For students looking for a broader education in business, the University of Illinois at Urbana-Champaign has launched an entire MBA program through Coursera. Consisting of 18 online courses and three capstone projects, the iMBA program covers the subjects usually found in b-school programs–leadership, strategy, economics, accounting, finance, etc. The complete curriculum should take roughly 24 to 36 months to complete, and costs less than $22,000–about 25%-33% of what an on-campus MBA program typically runs.

(The iMBA is actually one of three degree programs the University of Illinois has launched on Coursera. The other two are a Master’s in Accounting (iMSA) and a Master of Computer Science in Data Science (MCS-DS).)

Now, in case you’re wondering, the diplomas and transcripts for these programs are granted directly by the universities themselves (e.g., the University of Illinois at Urbana-Champaign and HEC Paris). The paperwork doesn’t carry Coursera’s name. Nor does it indicate that the student completed an “online program.” In short, online students get the same transcript as brick-and-mortar students.

Finally, all of the degree programs mentioned above are “stackable”–meaning students can (at no cost) take an individual course offered by any of these programs. And then they can decide later whether they want to apply to the degree program, and, if so, retroactively apply that course towards the actual degree. Essentially, you can try things out before making a larger commitment.

If you want to learn more about these programs, or submit an application, check out the following links. We’ve included the deadlines for submitting applications.


Note: Open Culture has a partnership with Coursera. If readers enroll in certain Coursera courses, it helps support Open Culture.

Related Content:

New Deep Learning Courses Released on Coursera, with Hope of Teaching Millions the Basics of Artificial Intelligence

MOOCs from Great Universities (Many With Certificates)

Alan Turing Algorithmically Approximated by Ellipses: A Computer Art Project

Just a cool find on Twitter: a work of computer art created by Jeremy Kun, a math PhD from the University of Illinois at Chicago, and now an engineer at Google.


via BoingBoing

Related Content:

Japanese Computer Artist Makes “Digital Mondrians” in 1964: When Giant Mainframe Computers Were First Used to Create Art

When J.M. Coetzee Secretly Programmed Computers to Write Poetry in the 1960s

Hear the First Recording of Computer Music: Researchers Restore Three Melodies Programmed on Alan Turing’s Computer (1951)


The Map of Computer Science: New Animation Presents a Survey of Computer Science, from Alan Turing to “Augmented Reality”

I’ve never wanted to start a sentence with “I’m old enough to remember…” because, well, who does? But here we are. I remember the enormously successful Apple IIe and Commodore 64, and a world before Microsoft. Smartphones were science fiction. To do much more than word process or play games, one had to learn a programming language. These ancient days seemed at the time—and in hindsight as well—to be the very dawn of computing. Before the personal computer, such devices were the size of kitchen appliances and were hidden away in military installations, universities, and NASA labs.

But of course we all know that the history of computing goes far beyond the early 80s: at least back to World War II, and perhaps even much farther. Do we begin with the abacus, the 2,200-Year-Old Antikythera Mechanism, the astrolabe, Ada Lovelace and Charles Babbage? The question is maybe one of definitions. In the short, animated video above, physicist, science writer, and YouTube educator Dominic Walliman defines the computer according to its basic binary function of “just flipping zeros and ones,” and he begins his condensed history of computer science with tragic genius Alan Turing of Turing Test and Bletchley Park codebreaking fame.

Turing’s most significant contribution to computing came from his 1936 concept of the “Turing Machine,” a theoretical mechanism that could, writes the Cambridge Computer Laboratory, “simulate ANY computer algorithm, no matter how complicated it is!” All other designs, says Walliman—apart from a quantum computer—are equivalent to the Turing Machine, “which makes it the foundation of computer science.” But since Turing’s time, the simple design has come to seem endlessly capable of adaptation and innovation.
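
Walliman’s definition is abstract, but the machine itself is simple enough to simulate in a few lines of code. Below is a minimal, hypothetical Python sketch (my own illustration, not anything from the video) of a Turing machine that increments a binary number: a lookup table of rules, a read/write head, and a tape are all it takes.

```python
# A minimal Turing machine simulator (illustrative sketch only).
# The rules below make the machine increment a binary number written on the tape.

def run_turing_machine(tape, rules, state="start", blank="_", max_steps=10_000):
    """rules: {(state, symbol): (new_state, new_symbol, move)} with move in {-1, +1}."""
    tape = dict(enumerate(tape))   # sparse tape: position -> symbol
    head = 0
    for _ in range(max_steps):
        if state == "halt":
            break
        symbol = tape.get(head, blank)
        state, tape[head], move = rules[(state, symbol)]
        head += move
    # Read the tape back out, left to right
    cells = [tape[i] for i in sorted(tape)]
    return "".join(cells).strip(blank)

# Rules for binary increment: scan to the rightmost digit, then carry leftward.
rules = {
    ("start", "0"): ("start", "0", +1),
    ("start", "1"): ("start", "1", +1),
    ("start", "_"): ("carry", "_", -1),
    ("carry", "1"): ("carry", "0", -1),
    ("carry", "0"): ("halt", "1", -1),
    ("carry", "_"): ("halt", "1", -1),
}

print(run_turing_machine("1011", rules))  # 1011 (11) + 1 -> 1100 (12)
```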

Walliman illustrates the computer’s exponential growth by pointing out that a smartphone has more computing power than the entire world possessed in 1963, and that the computing capability that first landed astronauts on the moon is equal to “a couple of Nintendos” (first generation classic consoles, judging by the image). But despite the hubris of the computer age, Walliman points out that “there are some problems which, due to their very nature, can never be solved by a computer,” either because of the degree of uncertainty involved or the degree of inherent complexity. This fascinating yet abstract discussion is where Walliman’s “Map of Computer Science” begins, and for most of us this will probably be unfamiliar territory.

We’ll feel more at home once the map moves from the region of Computer Theory to that of Computer Engineering, but while Walliman covers familiar ground here, he does not dumb it down. Once we get to applications, we’re in the realm of big data, natural language processing, the internet of things, and “augmented reality.” From here on out, computer technology will only get faster, and weirder, despite the fact that the “underlying hardware is hitting some hard limits.” Certainly this very quick course in Computer Science only makes for an introductory survey of the discipline, but like Walliman’s other maps—of mathematics, physics, and chemistry—this one provides us with an impressive visual overview of the field that is both broad and specific, and that we likely wouldn’t encounter anywhere else.

As with his other maps, Walliman has made the Map of Computer Science available as a poster, perfect for dorm rooms, living rooms, or wherever else you might need a reminder.

Related Content:

Free Online Computer Science Courses

How Ada Lovelace, Daughter of Lord Byron, Wrote the First Computer Program in 1842–a Century Before the First Computer

Watch Breaking the Code, About the Life & Times of Alan Turing (1996)

The Map of Mathematics: Animation Shows How All the Different Fields in Math Fit Together

The Map of Physics: Animation Shows How All the Different Fields in Physics Fit Together

The Map of Chemistry: New Animation Summarizes the Entire Field of Chemistry in 12 Minutes

Josh Jones is a writer and musician based in Durham, NC. Follow him at @jdmagness

New Deep Learning Courses Released on Coursera, with Hope of Teaching Millions the Basics of Artificial Intelligence

FYI: If you follow edtech, you know the name Andrew Ng. He’s the Stanford computer science professor who co-founded MOOC provider Coursera and later became chief scientist at Baidu. Since leaving Baidu, he’s been working on three artificial intelligence projects, the first of which he unveiled yesterday. On Medium, he wrote:

I have been working on three new AI projects, and am thrilled to announce the first one: deeplearning.ai, a project dedicated to disseminating AI knowledge, is launching a new sequence of Deep Learning courses on Coursera. These courses will help you master Deep Learning, apply it effectively, and build a career in AI.

Speaking to the MIT Technology Review, Ng elaborated: “The thing that really excites me today is building a new AI-powered society… I don’t think any one company could do all the work that needs to be done, so I think the only way to get there is if we teach millions of people to use these AI tools so they can go and invent the things that no large company, or company I could build, could do.”

Andrew’s new 5-part series of courses on Deep Learning can be accessed here. Courses include: Neural Networks and Deep Learning, Improving Deep Neural Networks, Structuring Machine Learning Projects, Convolutional Neural Networks, and Sequence Models.

You can find these courses on our list of Free Computer Science Courses, a subset of our collection, 1,700 Free Online Courses from Top Universities.


Related Content:

Google Launches Free Course on Deep Learning: The Science of Teaching Computers How to Teach Themselves

Google’s DeepMind AI Teaches Itself to Walk, and the Results Are Kooky, No Wait, Chilling

Artificial Intelligence: A Free Online Course from MIT

Google Launches Free Course on Deep Learning: The Science of Teaching Computers How to Teach Themselves

Last Friday, we mentioned how Google’s artificial intelligence software DeepMind has the ability to teach itself many things. It can teach itself how to walk, jump and run. Even take professional pictures. Or defeat the world’s best player of the Chinese strategy game, Go. The approach behind many of these feats, training layered neural networks to learn from data, is known as Deep Learning. And you can now immerse yourself in this world by taking a free, 3-month course on Deep Learning itself. Offered through Udacity, the course is taught by Vincent Vanhoucke, the technical lead of Google’s Brain team. You can learn more about the course via Vanhoucke’s blog post. Or just enroll here. (You will need to create an account with Udacity to get started.)

The free course takes about 3 months to complete. It will be added to our list of Free Computer Science Courses, a subset of our larger collection, 1,700 Free Online Courses from Top Universities.


Related Content:

Google’s DeepMind AI Teaches Itself to Walk, and the Results Are Kooky, No Wait, Chilling

Learn Python: A Free Online Course from Google

Take a Free Course on Digital Photography from Stanford Prof Marc Levoy



How Aristotle Invented Computer Science

In popular conceptions, we take the computer to be the natural outcome of empirical science, an inheritance of the Enlightenment and subsequent scientific revolutions in the 19th and 20th centuries. Of course, modern computers have their ancient precursors, like the Antikythera Mechanism, a 2,200-year-old bronze and wood machine capable of predicting the positions of the planets, eclipses, and phases of the moon. But even this fascinating artifact fits into the narrative of computer science as “a history of objects, from the abacus to the Babbage engine up through the code-breaking machines of World War II.” Much less do we invoke the names of “philosopher-mathematicians,” writes Chris Dixon at The Atlantic, like George Boole and Gottlob Frege, “who were themselves inspired by Leibniz’s dream of a universal ‘concept language,’ and the ancient logical system of Aristotle.” But these thinkers are as essential, if not more so, to computer science, especially, Dixon argues, Aristotle.

The ancient Greek thinker did not invent a calculating machine, though such devices may have existed in his lifetime. Instead, as Dixon writes in his recent piece, “How Aristotle Created the Computer,” Aristotle laid the foundations of mathematical logic, “a field that would have more impact on the modern world than any other.”

The claim may strike historians of philosophy as somewhat ironic, given that Enlightenment philosophers like Francis Bacon and John Locke announced their modern projects by thoroughly repudiating the medieval scholastics, who, they alleged, were guilty of a slavish devotion to Aristotle. Their criticisms of medieval thought were varied and in many ways warranted, and yet, like many an empiricist since, they often overlooked the critical importance of Aristotelian logic to scientific thought.

At the turn of the 20th century, almost three hundred years after Bacon sought to transcend Aristotle’s Organon with his form of natural philosophy, the formal logic of Aristotle could still be “considered a hopelessly abstract subject with no conceivable applications.” But Dixon traces the “evolution of computer science from mathematical logic” and Aristotelian thought, beginning in the 1930s with Claude Shannon, author of the groundbreaking essay “A Symbolic Analysis of Switching and Relay Circuits.” Shannon drew on the work of George Boole, whose name is now known to every computer scientist and engineer but who, in 1938, “was rarely read outside of philosophy departments.” And Boole owed his principal intellectual debt, as he acknowledged in his 1854 The Laws of Thought, to Aristotle’s syllogistic reasoning.

Boole derived his operations by replacing the terms in a syllogism with variables, “and the logical words ‘all’ and ‘are’ with arithmetical operators.” Shannon discovered that “Boole’s system could be mapped directly onto electrical circuits,” which hitherto “had no systematic theory governing their design.” The insight “allowed computer scientists to import decades of work in logic and mathematics by Boole and subsequent logicians.” Shannon, Dixon writes, “was the first to distinguish between the logical and the physical layer of computers,” a distinction now “so fundamental to computer science that it might seem surprising to modern readers how insightful it was at the time.” And yet, the field could not move forward without it—without, that is, a return to ancient categories of thought.
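
Dixon’s point about mapping Boole’s algebra onto circuits is easy to make concrete. The Python sketch below (a toy example of my own, not Dixon’s argument or Shannon’s 1938 notation) treats AND, OR, and NOT as switching “gates” and composes them into a half adder, then checks that the circuit really performs binary addition.

```python
# Toy illustration of Shannon's insight: Boole's two-valued algebra maps directly
# onto switching circuits. Logic "gates" are plain functions here, and composing
# them yields a half adder that genuinely adds two one-bit numbers.

from itertools import product

def AND(a, b): return a & b   # both switches closed
def OR(a, b):  return a | b   # either switch closed
def NOT(a):    return 1 - a   # inverter
def XOR(a, b): return OR(AND(a, NOT(b)), AND(NOT(a), b))  # built from the basic gates

def half_adder(a, b):
    """Add two bits: returns (sum_bit, carry_bit)."""
    return XOR(a, b), AND(a, b)

# Exhaustively verify the circuit against ordinary arithmetic.
for a, b in product((0, 1), repeat=2):
    s, c = half_adder(a, b)
    assert 2 * c + s == a + b
    print(f"{a} + {b} -> sum={s}, carry={c}")
```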

Since the 1940s, computer programming has become significantly more sophisticated. One thing that hasn’t changed is that it still primarily consists of programmers specifying rules for computers to follow. In philosophical terms, we’d say that computer programming has followed in the tradition of deductive logic, the branch of logic discussed above, which deals with the manipulation of symbols according to formal rules.

Dixon’s argument for the centrality of Aristotle to modern computer science takes many turns—through the quasi-mystical thought of the 13th-century philosopher Ramon Llull and, later, his admirer Gottfried Leibniz. Through Descartes, and later Frege and Bertrand Russell. Through Alan Turing’s work at Bletchley Park. Nowhere do we see Aristotle, wrapped in a toga, building a circuit board in his garage, but his modes of reasoning are everywhere in evidence as the scaffolding upon which all modern computer science has been built. Aristotle’s attempts to understand the laws of the human mind “helped create machines that could reason according to the rules of deductive logic.” The application of ancient philosophical principles may, Dixon concludes, “result in the creation of new minds—artificial minds—that might someday match or even exceed our own.” Read Dixon’s essay at The Atlantic, or hear it read in its entirety in the audio above.

Related Content:

Free Online Computer Science Courses

How the World’s Oldest Computer Worked: Reconstructing the 2,200-Year-Old Antikythera Mechanism

The Books on Young Alan Turing’s Reading List: From Lewis Carroll to Modern Chromatics

How Arabic Translators Helped Preserve Greek Philosophy … and the Classical Tradition

Josh Jones is a writer and musician based in Durham, NC. Follow him at @jdmagness


How Ada Lovelace, Daughter of Lord Byron, Wrote the First Computer Program in 1842–a Century Before the First Computer

I’ve never quite understood why the phrase “revisionist history” became purely pejorative. Of course, it has its Orwellian dark side, but all knowledge has to be revised periodically, as we acquire new information and, ideally, discard old prejudices and narrow frames of reference. A failure to do so seems fundamentally regressive, not only in political terms, but also in terms of how we value accurate, interesting, and engaged scholarship. Such research has recently brought us fascinating stories about previously marginalized people who made significant contributions to scientific discovery, such as NASA’s “human computers,” portrayed in the book Hidden Figures, then dramatized in the film of the same name.

Likewise, the many women who worked at Bletchley Park during World War II—helping to decipher encryptions like the Nazi Enigma Code (out of nearly 10,000 codebreakers, about 75% were women)—have recently been getting their historical due, thanks to “revisionist” researchers. And, as we noted in a recent post, we might not know much, if anything, about film star Hedy Lamarr’s significant contributions to wireless, GPS, and Bluetooth technology were it not for the work of historians like Richard Rhodes. These few examples, among many, show us a fuller, more accurate, and more interesting view of the history of science and technology, and they inspire women and girls who want to enter the field, yet have grown up with few role models to encourage them.

We can add to the pantheon of great women in science the name Ada Byron, Countess of Lovelace, the daughter of Romantic poet Lord Byron. Lovelace has been renowned, as Hank Green tells us in the video at the top of the post, for writing the first computer program, “despite living a century before the invention of the modern computer.” This picture of Lovelace has been a controversial one. “Historians disagree,” writes prodigious mathematician Stephen Wolfram. “To some she is a great hero in the history of computing; to others an overestimated minor figure.”

Wolfram spent some time with “many original documents” to untangle the mystery. “I feel like I’ve finally gotten to know Ada Lovelace,” he writes, “and gotten a grasp on her story. In some ways it’s an ennobling and inspiring story; in some ways it’s frustrating and tragic.” Educated in math and music by her mother, Anne Isabella Milbanke, Lovelace became acquainted with mathematics professor Charles Babbage, the inventor of a calculating machine called the Difference Engine, “a 2-foot-high hand-cranked contraption with 2000 brass parts.” Babbage encouraged her to pursue her interests in mathematics, and she did so throughout her life.

Widely acknowledged as one of the forefathers of computing, Babbage eventually corresponded with Lovelace on the creation of another machine, the Analytical Engine, which “supported a whole list of possible kinds of operations, that could in effect be done in arbitrarily programmed sequence.” When, in 1842, the Italian mathematician Luigi Menabrea published a paper in French on the Analytical Engine, “Babbage enlisted Ada as translator,” notes the San Diego Supercomputer Center’s Women in Science project. “During a nine-month period in 1842–43, she worked feverishly on the article and a set of Notes she appended to it. These are the source of her enduring fame.” (You can read her translation and notes here.)

In the course of his research, Wolfram pored over Babbage and Lovelace’s correspondence about the translation, which reads “a lot like emails about a project might today, apart from being in Victorian English.” Although she built on Babbage and Menabrea’s work, “She was clearly in charge” of successfully extrapolating the possibilities of the Analytical Engine, but she felt “she was first and foremost explaining Babbage’s work, so wanted to check things with him.” Her additions to the work were very well received—Michael Faraday called her “the rising star of Science”—and when her notes were published, Babbage wrote, “you should have written an original paper.”

Unfortunately, as a woman, “she couldn’t get access to the Royal Society’s library in London,” and her ambitions were derailed by a severe health crisis. Lovelace died of cancer at the age of 36, and for some time, her work sank into semi-obscurity. Though some historians have seen her as simply an expositor of Babbage’s work, Wolfram concludes that it was Ada who had the idea of “what the Analytical Engine should be capable of.” Her notes suggested possibilities Babbage had never dreamed of. As the Women in Science project puts it, “She rightly saw [the Analytical Engine] as what we would call a general-purpose computer. It was suited for ‘developping [sic] and tabulating any function whatever… the engine [is] the material expression of any indefinite function of any degree of generality and complexity.’ Her Notes anticipate future developments, including computer-generated music.”

In a recent episode of the BBC’s In Our Time, above, you can hear host Melvyn Bragg discuss Lovelace’s importance with historians and scholars Patricia Fara, Doron Swade, and John Fuegi. And be sure to read Wolfram’s biographical and historical account of Lovelace here.

Related Content:

How 1940s Film Star Hedy Lamarr Helped Invent the Technology Behind Wi-Fi & Bluetooth During WWII

The Contributions of Women Philosophers Recovered by the New Project Vox Website

Real Women Talk About Their Careers in Science

Josh Jones is a writer and musician based in Durham, NC. Follow him at @jdmagness

Google Uses Artificial Intelligence to Map Thousands of Bird Sounds Into an Interactive Visualization

If you were around in 2013, you may recall that we told you about Cornell’s Archive of 150,000 Bird Calls and Animal Sounds, with Recordings Going Back to 1929. It’s a splendid place for ornithologists and bird lovers to spend time. And, it turns out, the same also applies to computer programmers.

Late last year, Google launched an experiment where, drawing on Cornell’s sound archive, they used machine learning (artificial intelligence that lets computers learn and do tasks on their own) to organize thousands of bird sounds into a map where similar sounds are placed closer together. The result is this impressive interactive visualization. Check it out. Or head into Cornell’s archive and do your own old-fashioned explorations.
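
Google hasn’t spelled out its exact pipeline in this post, but the general technique, turning each recording into a feature vector and then projecting those vectors into two dimensions so that similar sounds land near each other, can be sketched briefly. The Python below is a hypothetical, simplified illustration using crude spectral features and scikit-learn’s t-SNE; the clip filenames are placeholders, and the real experiment is far more sophisticated.

```python
# Hypothetical sketch: place audio clips on a 2-D map so that similar-sounding
# clips end up near each other. A simplified stand-in for Google's (unspecified)
# pipeline: coarse spectral features + a t-SNE projection.
import numpy as np
from scipy.io import wavfile
from sklearn.manifold import TSNE

def spectral_features(path, n_bands=64):
    """Summarize a WAV file as the average log-magnitude in n_bands frequency bands."""
    rate, samples = wavfile.read(path)
    if samples.ndim > 1:                      # mix stereo down to mono
        samples = samples.mean(axis=1)
    spectrum = np.abs(np.fft.rfft(samples))   # magnitude spectrum of the whole clip
    bands = np.array_split(spectrum, n_bands) # coarse frequency bands
    return np.log1p(np.array([band.mean() for band in bands]))

# 'clips' would be a list of paths to bird recordings (placeholder names here).
clips = ["song_sparrow.wav", "wood_thrush.wav", "common_loon.wav"]
features = np.stack([spectral_features(p) for p in clips])

# t-SNE squeezes the 64-dimensional feature vectors into 2-D map coordinates,
# keeping similar clips close together.
coords = TSNE(n_components=2, perplexity=2, init="random").fit_transform(features)
for path, (x, y) in zip(clips, coords):
    print(f"{path}: ({x:.1f}, {y:.1f})")
```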

Note: You can find free courses on machine learning and artificial intelligence in the Related Content section below.


Related Content:

Cornell Launches Archive of 150,000 Bird Calls and Animal Sounds, with Recordings Going Back to 1929

Neural Networks for Machine Learning: A Free Online Course

A Free Course on Machine Learning & Data Science from Caltech

