How to Take a Picture of a Black Hole: Watch the 2017 Ted Talk by Katie Bouman, the MIT Grad Student Who Helped Take the Groundbreaking Photo

What triggered the worst impulses of the Internet last week?

The world's first photo of a black hole, which proved the presence of troll life here on earth and confirmed that female scientists, through no fault of their own, still have a much longer way to go, baby.

If you want a taste, sort the comments on the two-year-old TED Talk, above, so they're ordered "newest first."

Katie Bouman, soon-to-be assistant professor of computing and mathematical sciences at the California Institute of Technology, was a PhD candidate at MIT two years ago, when she taped the talk, but she could've passed for a nervous high schooler competing in the National Science Bowl finals, in clothes borrowed from Aunt Judy, who works at the bank.

The focus of her studies was the ways in which emerging computational methods could help expand the boundaries of interdisciplinary imaging.

Prior to last week, I’m not sure how well I could have parsed the focus of her work had she not taken the time to help less STEM-inclined viewers such as myself wrap our heads around her highly technical, then-wholly-theoretical subject.

What I know about black holes could still fit in a thimble, and in truth, my excitement about one being photographed for the first time pales in comparison to my excitement about Game of Thrones returning to the airwaves.

Fortunately, we’re not obligated to be equally turned on by the same interests, an idea theoretical physicist Richard Feynman promoted:

I've always been very one-sided about science and when I was younger I concentrated almost all my effort on it. I didn't have time to learn and I didn't have much patience with what's called the humanities, even though in the university there were humanities that you had to take. I tried my best to avoid somehow learning anything and working at it. It was only afterwards, when I got older, that I got more relaxed, that I've spread out a little bit. I've learned to draw and I read a little bit, but I'm really still a very one-sided person and I don't know a great deal. I have a limited intelligence and I use it in a particular direction.

I'm pretty sure my lack of passion for science is not tied to my gender. Some of my best friends are guys who feel the same. (Some of them don't like team sports either.)

But I couldn't help but experience a wee thrill that this young woman, a science nerd who admittedly could’ve used a few theater nerd tips regarding relaxation and public speaking, realized her dream—an honest to goodness photo of a black hole just like the one she talked about in her TED Talk, "How to take a picture of a black hole."

Bouman and the 200+ colleagues she acknowledges and thanks at every opportunity achieved their goal not with an earth-sized camera but with a network of linked telescopes, much as she had described two years earlier, when she invoked disco balls, Mick Jagger, oranges, selfies, and a jigsaw puzzle in an effort to help people like me understand.

Look at that sucker (or, more accurately, its shadow)! That thing’s 500 million trillion kilometers from Earth!

(That's much farther than King's Landing is from Winterfell.)

I’ll bet a lot of elementary science teachers, be they male, female, or non-binary, are going to make science fun by having their students draw pictures of the picture of the black hole.

If we could go back (or forward) in time, I can almost guarantee that mine would be among the best because while I didn’t “get” science (or gym), I was a total art star with the crayons.

Then, crafty as Lord Petyr Baelish when presentation time rolled around, I would partner with a girl like Katie Bouman, who could explain the science with winning vigor. She genuinely seems to embrace the idea that it “takes a village,” and that one’s fellow villagers should be credited whenever possible.

(How did I draw the black hole, you ask? Honestly, it's not that much harder than drawing a doughnut. Now back to Katie!)

Alas, her professional warmth failed to register with legions of Internet trolls who began sliming her shortly after a colleague at MIT shared a beaming snapshot of her, taken, presumably, with a regular old phone as the black hole made its debut. That pic cemented her accidental status as the face of this project.

Note to the trolls—it wasn't a dang selfie.

“I’m so glad that everyone is as excited as we are and people are finding our story inspirational,” Bouman told The New York Times. “However, the spotlight should be on the team and no individual person. Focusing on one person like this helps no one, including me.”

Although Bouman was a junior team member, she and other grad students made major contributions. She directed the verification of images and the selection of imaging parameters, and authored an imaging algorithm that researchers used in the creation of three scripted code pipelines, from which the instantly famous picture was cobbled together.

As Vincent Fish, a research scientist at MIT's Haystack Observatory told CNN:

One of the insights Katie brought to our imaging group is that there are natural images. Just think about the photos you take with your camera phone—they have certain properties.... If you know what one pixel is, you have a good guess as to what the pixel is next to it.

Hey, that makes sense.
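Fish's point about neighboring pixels can be demonstrated in a few lines. The sketch below is purely our own toy illustration (it has nothing to do with the Event Horizon Telescope's actual pipeline): it compares the correlation between adjacent pixels in a smooth, "natural-looking" image against the same measurement on pure noise.

```python
import numpy as np

# Toy illustration of the "natural image" prior Fish describes:
# in real photos, neighboring pixels tend to have similar values,
# while pure noise has no such structure. (Our own sketch, not
# code from the Event Horizon Telescope imaging pipeline.)

rng = np.random.default_rng(0)

def neighbor_correlation(img):
    """Correlation between each pixel and its right-hand neighbor."""
    left = img[:, :-1].ravel()
    right = img[:, 1:].ravel()
    return np.corrcoef(left, right)[0, 1]

# A smooth, "natural-looking" image vs. pure random noise.
x = np.linspace(0, 1, 64)
natural = np.outer(np.sin(2 * np.pi * x), np.cos(2 * np.pi * x))
natural += 0.05 * rng.standard_normal(natural.shape)  # mild sensor noise
noise = rng.standard_normal((64, 64))

print(neighbor_correlation(natural))  # close to 1
print(neighbor_correlation(noise))    # close to 0
```

Knowing one pixel really does give you "a good guess" about the next one, and that statistical regularity is what lets an algorithm fill in the gaps between telescopes.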

As The Verge’s science editor, Mary Beth Griggs, points out, the rush to defame Bouman is of a piece with some of the non-virtual realities women in science face:

Part of the reason that some posters found Bouman immediately suspicious had to do with her gender. Famously, a number of prominent men like disgraced former CERN physicist Alessandro Strumia have argued that women aren’t being discriminated against in science — they simply don’t like it, or don’t have the aptitude for it. That argument fortifies a notion that women don’t belong in science, or can’t really be doing the work. So women like Bouman must be fakes, this warped line of thinking goes…

Even I, whose 7th grade science teacher tempered a bad grade on my report card by saying my interest in theater would likely serve me much better than anything I might eke from her class, know that just as many girls and women excel at science, technology, engineering, and math as excel in the arts. (Sometimes they excel at both!)

(And power to every little boy with his sights set on nursing, teaching, or ballet!)

(How many black holes have the haters photographed recently?)

Griggs continues:

Saying that she was part of a larger team doesn’t diminish her work, or minimize her involvement in what is already a history-making project. Highlighting the achievements of a brilliant, enthusiastic scientist does not diminish the contributions of the other 214 people who worked on the project, either. But what it is doing is showing a different model for a scientist than the one most of us grew up with. That might mean a lot to some kids — maybe kids who look like her — making them excited about studying the wonders of the Universe.

via BoingBoing

Related Content:

Women’s Hidden Contributions to Modern Genetics Get Revealed by New Study: No Longer Will They Be Buried in the Footnotes

New Augmented Reality App Celebrates Stories of Women Typically Omitted from U.S. History Textbooks

Stephen Hawking (RIP) Explains His Revolutionary Theory of Black Holes with the Help of Chalkboard Animations

Watch a Star Get Devoured by a Supermassive Black Hole

Ayun Halliday is an author, illustrator, theater maker and Chief Primatologist of the East Village Inky zine.  Join her in New York City tonight for the next installment of her book-based variety show, Necromancers of the Public Domain. Follow her @AyunHalliday.

Artificial Intelligence Identifies the Six Main Arcs in Storytelling: Welcome to the Brave New World of Literary Criticism

Is the singularity upon us? AI seems poised to replace everyone, even artists whose work can seem like an inviolably human industry. Or maybe not. Nick Cave’s poignant answer to a fan question might persuade you a machine will never write a great song, though it might master all the moves to write a good one. An AI-written novel did almost win a Japanese literary award. A suitably impressive feat, even if much of the authorship should be attributed to the program’s human designers.

But what about literary criticism? Is this an art that a machine can do convincingly? The answer may depend on whether you consider it an art at all. For those who do, no artificial intelligence will ever properly develop the theory of mind needed for subtle, even moving, interpretations. On the other hand, one group of researchers has succeeded in using “sophisticated computing power, natural language processing, and reams of digitized text,” writes Atlantic editor Adrienne LaFrance, “to map the narrative patterns in a huge corpus of literature.” The name of their literary criticism machine? The Hedonometer.

We can treat this as an exercise in compiling data, but it's arguable that the results are on par with work from the comparative mythology school of James Frazer and Joseph Campbell. A more immediate comparison might be to the very deft, if not particularly subtle, Kurt Vonnegut, who—before he wrote novels like Slaughterhouse-Five and Cat’s Cradle—submitted a master’s thesis in anthropology to the University of Chicago. His project did the same thing as the machine, 35 years earlier, though he may not have had the wherewithal to read “1,737 English-language works of fiction between 10,000 and 200,000 words long" while struggling to finish his graduate program. (His thesis, by the way, was rejected.)

Those numbers describe the dataset from Project Gutenberg fed into The Hedonometer by computer scientists at the University of Vermont and the University of Adelaide. After the computer finished "reading," it then plotted “the emotional trajectory” of all of the stories, using sentiment analysis “to generate an emotional arc for each work.” What it found were six broad categories of story, listed below:

  1. Rags to Riches (rise)
  2. Riches to Rags (fall)
  3. Man in a Hole (fall then rise)
  4. Icarus (rise then fall)
  5. Cinderella (rise then fall then rise)
  6. Oedipus (fall then rise then fall)
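The mechanics behind these arcs can be sketched in a few lines. The real Hedonometer slides a window across the text and scores each window against a crowd-rated happiness lexicon of roughly 10,000 words; the toy version below uses a tiny invented lexicon and a deliberately crude shape classifier, just to show the idea.

```python
# Toy sketch of sliding-window sentiment scoring, in the spirit of
# the Hedonometer. The lexicon and classifier here are invented for
# illustration; the real system uses ~10,000 crowd-rated words.

TOY_LEXICON = {
    "joy": 2.0, "love": 2.0, "win": 1.0, "home": 0.5,
    "loss": -1.0, "fear": -1.0, "grief": -2.0, "death": -2.0,
}

def emotional_arc(words, window=3):
    """Average lexicon score over a sliding window of words."""
    scores = [TOY_LEXICON.get(w.lower(), 0.0) for w in words]
    return [sum(scores[i:i + window]) / window
            for i in range(len(scores) - window + 1)]

def classify(arc):
    """Crude shape label from the arc's start, middle, and end."""
    start, mid, end = arc[0], arc[len(arc) // 2], arc[-1]
    if start < mid > end:
        return "Icarus (rise then fall)"
    if start > mid < end:
        return "Man in a Hole (fall then rise)"
    return "Rags to Riches (rise)" if end > start else "Riches to Rags (fall)"

story = "grief loss fear home win love joy".split()
print(classify(emotional_arc(story)))  # → Rags to Riches (rise)
```

Scale that window up from three words to thousands, swap in a real lexicon, and you have the gist of how a machine "reads" 1,737 novels.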

How does this endeavor compare with Vonnegut’s project? (See him present the theory below.) The novelist used more or less the same methodology, in human form, to come up with eight universal story arcs or “shapes of stories.” Vonnegut himself left out the Rags to Riches category; he called it an anomaly, though he did have a heading for the same rising-only story arc—the Creation Story—which he deemed an uncommon shape for Western fiction. He did include the Cinderella arc, and was pleased by his discovery that its shape mirrored the New Testament arc, which he also included in his schema, an act the AI surely would have judged redundant.

Contra Vonnegut, the AI found that one-fifth of all the works it analyzed were Rags-to-Riches stories. It determined that this arc was far less popular with readers than “Oedipus,” “Man in a Hole,” and “Cinderella.” Its analysis does get much more granular, and to allay our suspicions, the researchers promise they did not control the outcome of the experiment. “We’re not imposing a set of shapes,” says lead author Andy Reagan, Ph.D. candidate in mathematics at the University of Vermont. “Rather: the math and machine learning have identified them.”

But the authors do provide a lot of their own interpretation of the data, from choosing representative texts—like Harry Potter and the Deathly Hallows—to illustrate “nested and complicated” plot arcs, to providing the guiding assumptions of the exercise. One of those assumptions, unsurprisingly given the authors’ fields of interest, is that math and language are interchangeable. “Stories are encoded in art, language, and even in the mathematics of physics,” they write in the introduction to their paper.

“We use equations," they go on, "to represent both simple and complicated functions that describe our observations of the real world.” If we accept the premise that sentences and integers and lines of code are telling the same stories, then maybe there isn’t as much difference between humans and machines as we would like to think.

via The Atlantic

Related Content:

Nick Cave Answers the Hotly Debated Question: Will Artificial Intelligence Ever Be Able to Write a Great Song?

Kurt Vonnegut Diagrams the Shape of All Stories in a Master’s Thesis Rejected by U. Chicago

Kurt Vonnegut Maps Out the Universal Shapes of Our Favorite Stories

Josh Jones is a writer and musician based in Durham, NC. Follow him at @jdmagness

Watch 110 Lectures by Donald Knuth, “the Yoda of Silicon Valley,” on Programming, Mathematical Writing, and More

Many see the realms of literature and computers as not just completely separate, but growing more distant from one another all the time. Donald Knuth, one of the most respected figures among the most deeply computer-savvy in Silicon Valley, sees it differently. His claims to fame include The Art of Computer Programming, an ongoing multi-volume series of books whose publication began more than fifty years ago, and the digital typesetting system TeX, which, in a recent profile of Knuth, the New York Times' Siobhan Roberts describes as "the gold standard for all forms of scientific communication and publication."

Some, Roberts writes, consider TeX "Dr. Knuth’s greatest contribution to the world, and the greatest contribution to typography since Gutenberg." At the core of his lifelong work is an idea called "literate programming," which emphasizes "the importance of writing code that is readable by humans as well as computers — a notion that nowadays seems almost twee.

"Dr. Knuth has gone so far as to argue that some computer programs are, like Elizabeth Bishop’s poems and Philip Roth’s American Pastoral, works of literature worthy of a Pulitzer." Knuth's mind, technical achievements, and style of communication have earned him the informal title of "the Yoda of Silicon Valley."

That appellation also reflects a depth of technical wisdom only attainable by getting to the very bottom of things, which in Knuth's case means fully understanding how computer programming works all the way down to the most basic level. (This in contrast to the average programmer, writes Roberts, who "no longer has time to manipulate the binary muck, and works instead with hierarchies of abstraction, layers upon layers of code — and often with chains of code borrowed from code libraries.") Now everyone can get more than a taste of Knuth's perspective and thoughts on computers, programming, and a host of related subjects on the YouTube channel of Stanford University, where Knuth is now professor emeritus (and where he still gives informal lectures under the banner "Computer Musings").

Stanford's online archive of Donald Knuth Lectures now numbers 110, ranging across the decades and covering such subjects as the usage and mechanics of TeX, the analysis of algorithms, and the nature of mathematical writing. "I am worried that algorithms are getting too prominent in the world,” he tells Roberts in the New York Times profile. “It started out that computer scientists were worried nobody was listening to us. Now I’m worried that too many people are listening." But having become a computer scientist before the field of computer science even had a name, the now-octogenarian Knuth possesses a rare perspective from which anyone in 21st-century technology could certainly benefit.

Related Content:

Free Online Computer Science Courses

50 Famous Academics & Scientists Talk About God

The Secret History of Silicon Valley

When J.M. Coetzee Secretly Programmed Computers to Write Poetry in the 1960s

Introduction to Computer Science and Programming: A Free Course from MIT

Peter Thiel’s Stanford Course on Startups: Read the Lecture Notes Free Online

Based in Seoul, Colin Marshall writes and broadcasts on cities, language, and culture. His projects include the book The Stateless City: a Walk through 21st-Century Los Angeles and the video series The City in Cinema. Follow him on Twitter at @colinmarshall or on Facebook.

Discover Rare 1980s CDs by Lou Reed, Devo & Talking Heads That Combined Music with Computer Graphics

When it first hit the market in 1982, the compact disc famously promised "perfect sound that lasts forever." But innovation has a way of marching continually on, and naturally the innovators soon started wondering: what if perfect sound isn't enough? What if consumers want something to go with it, something to look at? And so, when compact disc co-developers Sony and Philips updated their standards, they included documentation on the use of the format's channels not occupied by audio data. So was born the CD+G, which boasted "not only the CD's full, digital sound, but also video information — graphics — viewable on any television set or video monitor."

That text comes from a package scan posted by the online CD+G Museum, whose YouTube channel features rips of nearly every record released on the format, beginning with the first, the Firesign Theatre's Eat or Be Eaten.

When it came out, listeners who happened to own a CD+G-compatible player (or a CD+G-compatible video game console, my own choice at the time having been the TurboGrafx-16) could see that beloved "head comedy" troupe's densely layered studio production and even more densely layered humor accompanied by images rendered in psychedelic color — or as psychedelic as images can get with only sixteen colors available on the palette and a resolution of 288 by 192 pixels, not much larger than an icon on the home screen of a modern smartphone. Those limitations may make CD+G graphics look unimpressive today, but just imagine what a cutting-edge novelty they must have seemed in the late 1980s when they first appeared.

Displaying lyrics for karaoke singers was the most obvious use of CD+G technology, but its short lifespan also saw a fair few experiments on other major-label releases, all viewable at the CD+G Museum, such as Lou Reed's New York, which combines lyrics with digitized photography of the eponymous city; Talking Heads' Naked, which provides musical information such as the chord changes and instruments playing on each phrase; Johann Sebastian Bach's St. Matthew Passion, which translates the libretto alongside works of art; and Devo's single "Disco Dancer," which tells the origin story of those "five Spudboys from Ohio." With these and almost every other CD+G release available at the CD+G Museum, you'll have no shortage of not just background music but background visuals for your next late-80s-early-90s-themed party.

Related Content:

Watch 1970s Animations of Songs by Joni Mitchell, Jim Croce & The Kinks, Aired on The Sonny & Cher Show

The Story of How Beethoven Helped Make It So That CDs Could Play 74 Minutes of Music

Discover the Lost Early Computer Art of Telidon, Canada’s TV Proto-Internet from the 1970s

Based in Seoul, Colin Marshall writes and broadcasts on cities and culture. His projects include the book The Stateless City: a Walk through 21st-Century Los Angeles and the video series The City in Cinema. Follow him on Twitter at @colinmarshall or on Facebook.

M.I.T. Computer Program Alarmingly Predicts in 1973 That Civilization Will End by 2040

In 1704, Isaac Newton predicted the end of the world sometime around (or after, "but not before") the year 2060, using a strange series of mathematical calculations. Rather than study what he called the “book of nature,” he took as his source the supposed prophecies of the book of Revelation. While such predictions have always been central to Christianity, it is startling for modern people to look back and see the famed astronomer and physicist indulging them. For Newton, however, as Matthew Stanley writes at Science, “laying the foundation of modern physics and astronomy was a bit of a sideshow. He believed that his truly important work was deciphering ancient scriptures and uncovering the nature of the Christian religion.”

Over three hundred years later, we still have plenty of religious doomsayers predicting the end of the world with Bible codes. But in recent times, their ranks have seemingly been joined by scientists whose only professed aim is interpreting data from climate research and sustainability estimates given population growth and dwindling resources. The scientific predictions do not draw on ancient texts or theology, nor involve final battles between good and evil. Though there may be plagues and other horrible reckonings, these are predictably causal outcomes of over-production and consumption rather than divine wrath. Yet by some strange fluke, the science has arrived at the same apocalyptic date as Newton, plus or minus a decade or two.

The “end of the world” in these scenarios means the end of modern life as we know it: the collapse of industrialized societies, large-scale agricultural production, supply chains, stable climates, nation states…. Since the late sixties, an elite society of wealthy industrialists and scientists known as the Club of Rome (a frequent player in many conspiracy theories) has foreseen these disasters in the early 21st century. One of the sources of their vision is a computer program developed at MIT by computing pioneer and systems theorist Jay Forrester, whose model of global sustainability, one of the first of its kind, predicted civilizational collapse in 2040. “What the computer envisioned in the 1970s has by and large been coming true,” claims Paul Ratner at Big Think.

Those predictions include population growth and pollution levels, “worsening quality of life,” and “dwindling natural resources.” In the video at the top, see Australia's ABC explain the computer’s calculations, “an electronic guided tour of our global behavior since 1900, and where that behavior will lead us,” says the presenter. The graph spans the years 1900 to 2060. "Quality of life" begins to sharply decline after 1940, and by 2020, the model predicts, the metric contracts to turn-of-the-century levels, meeting the sharp increase of the “Zed Curve” that charts pollution levels. (ABC revisited this reporting in 1999 with Club of Rome member Keith Suter.)
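To give a flavor of the feedback-loop modeling behind such graphs, here is a deliberately tiny caricature in Python. It is our own invention, not Forrester's World2 or the World3 model behind Limits to Growth: capital grows while resources last, extraction scales with capital, and a crude "quality of life" index rises, peaks, and then collapses, much like the curve ABC describes.

```python
# A tiny caricature of a Forrester-style feedback loop (our own toy,
# not World2/World3): capital compounds while resources remain,
# resource extraction grows with capital, and "quality of life"
# rises, peaks, and then collapses once resources run out.

def simulate(years=120):
    capital, resources = 1.0, 100.0
    quality = []
    for _ in range(years):
        quality.append(capital * resources / 100.0)
        if resources > 0:
            capital *= 1.05      # 5% growth while resources remain
        else:
            capital *= 0.95      # decline once they are exhausted
        resources = max(resources - 0.3 * capital, 0.0)
    return quality

q = simulate()
peak = q.index(max(q))
print(peak, round(max(q), 2))  # quality peaks mid-run, then collapses
```

The point of such models is not the particular numbers but the shape: coupled growth and depletion produce overshoot and collapse without any villain in the loop.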

You can probably guess the rest—or you can read all about it in the 1972 Club of Rome-published report Limits to Growth, which drew wide popular attention to Jay Forrester’s books Urban Dynamics (1969) and World Dynamics (1971). Forrester, a figure of Newtonian stature in the worlds of computer science, management, and systems theory—though not, like Newton, a Biblical prophecy enthusiast—more or less endorsed his conclusions until the end of his life in 2016. In one of his last interviews, at the age of 98, he told the MIT Technology Review, “I think the books stand all right.” But he also cautioned against acting without systematic thinking in the face of the globally interrelated issues the Club of Rome ominously calls “the problematic”:

Time after time … you’ll find people are reacting to a problem, they think they know what to do, and they don’t realize that what they’re doing is making a problem. This is a vicious [cycle], because as things get worse, there is more incentive to do things, and it gets worse and worse.

Where this vague warning is supposed to leave us is uncertain. If the current course is dire, might “unsystematic” solutions be worse than none at all? This theory also seems to leave powerfully vested human agents (like Exxon's executives) wholly unaccountable for the coming collapse. Limits to Growth—scoffed at and disparagingly called “neo-Malthusian” by a host of libertarian critics—stands on far surer evidentiary footing than Newton’s weird predictions, and its climate forecasts, notes Christian Parenti, “were alarmingly prescient.” But for all this doom and gloom it’s worth bearing in mind that models of the future are not, in fact, the future. There are hard times ahead, but no theory, no matter how sophisticated, can account for every variable.

via Big Think

Related Content:

In 1704, Isaac Newton Predicts the World Will End in 2060

A Century of Global Warming Visualized in a 35 Second Video

A Map Shows What Happens When Our World Gets Four Degrees Warmer: The Colorado River Dries Up, Antarctica Urbanizes, Polynesia Vanishes

It’s the End of the World as We Know It: The Apocalypse Gets Visualized in an Inventive Map from 1486

Josh Jones is a writer and musician based in Durham, NC. Follow him at @jdmagness

Meet Grace Hopper, the Pioneering Computer Scientist Who Helped Invent COBOL and Build the Historic Mark I Computer (1906-1992)

On a page for its School of Technology, Rasmussen College lists six “Assumptions to Avoid” for women who want to enter the field of computer science. I couldn’t comment on whether these “assumptions” (alleged misconceptions like “the work environment is hostile to women”) are actually disproved by the commentary. But I might suggest a seventh “assumption to avoid”—that women haven’t always been computer scientists, integral to the development of the computer, programming languages, and every other aspect of computing, even 100 years before computers existed.

In fact, one of the most notable women in computer science, Grace Hopper, served as a member of the Harvard team that built one of the first computers, the room-sized Mark I, designed in 1944 by physics professor Howard Aiken. Hopper also helped develop COBOL, the first universal programming language for business, still widely in use today, a system based on written English rather than on symbols or numbers. And she is credited with coining the term “computer bug” (and by extension “debug”), when she and her associates found a moth stuck inside the Mark II in 1947. (“From then on,” she told Time magazine in 1984, “when anything went wrong with a computer, we said it had bugs in it.”)

These are but a few of her achievements in a computer science career that spanned more than 42 years, during which time she rose through the ranks of the Naval Reserves, then later active naval duty, retiring as the oldest commissioned officer, a rear admiral, at age 79.

In addition to winning distinguished awards and commendations over the course of her career—including the first-ever computer science “Man of the Year” award—Hopper also acquired a few distinguished nicknames, including “Amazing Grace” and “Grandma COBOL.” She may become known to a new generation by the nickname, “Queen of Code,” the title of a recent documentary from FiveThirtyEight’s “Signals” series. Directed by Community star Gillian Jacobs, the short film, which you can watch in full here, tells the story of her “inimitable legacy as a brilliant programmer and pioneering woman in a male-dominated field,” writes Allison McCann at FiveThirtyEight.

Hopper’s name may be "mysteriously absent from many history books,” as Amy Poehler’s Smart Girls notes, but before her death in 1992, she was introduced to millions through TV appearances on shows like Late Night with David Letterman (top) and 60 Minutes, just above. As you’ll see in these clips, Hopper wasn’t just a crack mathematician and programmer but also an ace public speaker whose deadpan humor cracked up Letterman and the groups of students and fellow scientists she frequently addressed.

The 60 Minutes segment notes that Hopper became “one of that small band of brothers and sisters who ushered in the computer revolution” when she left her professor’s job at Vassar at the start of WWII to serve in the Naval Reserve, where she was assigned to the Bureau of Ships Computation Project at Harvard. But she never stopped being an educator and considered “training young people” her second-most important accomplishment. In this, her legacy lives on as well.

The world’s largest gathering of women technologists is called “The Grace Hopper Celebration.” And a documentary in production called Born with Curiosity (see a teaser above) hopes that “shining a light on and humanizing role models like Grace makes them relatable in a way that inspires others to greatness.” At a time when computer science has the lowest enrollment of women among all the STEM fields, Hopper’s example and encouragement may be much needed.

via Mental Floss

Related Content:

How Ada Lovelace, Daughter of Lord Byron, Wrote the First Computer Program in 1842–a Century Before the First Computer

The Map of Computer Science: New Animation Presents a Survey of Computer Science, from Alan Turing to “Augmented Reality”

Introduction to Computer Science and Programming: A Free Course from MIT 

Josh Jones is a writer and musician based in Durham, NC. Follow him at @jdmagness

Computer Scientists Figure Out What’s the Longest Distance You Could Sail at Sea Without Hitting Land

Back in 2012, a redditor by the name of "Kepleronlyknows" wondered what's the longest distance you could travel by sea without hitting land. And then s/he hazarded an educated guess: "you can sail almost 20,000 miles in a straight line from Pakistan to the Kamchatka Peninsula, Russia."

Six years later, two computer scientists--Rohan Chabukswar (United Technologies Research Center in Ireland) and Kushal Mukherjee (IBM Research in India)--have developed an algorithm that offers a more definitive answer. According to their computations, "Kepleronlyknows was entirely correct," notes the MIT Technology Review.

The longest path over water "begins in Sonmiani, Balochistan, Pakistan, passes between Africa and Madagascar and then between Antarctica and Tierra del Fuego in South America, and ends in the Karaginsky District, Kamchatka Krai, in Russia. It is 32,089.7 kilometers long." Or 19,939 miles.

While they were at it, Chabukswar and Mukherjee also determined the longest land journey you could take without hitting the sea. That path, again notes the MIT Technology Review, "runs from near Jinjiang, Fujian, in China, weaves through Mongolia, Kazakhstan, and Russia, and finally reaches Europe to finish near Sagres in Portugal. In total the route passes through 15 countries over 11,241.1 kilometers." Or 6,984 miles. You can read Chabukswar and Mukherjee's research report here.
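The search space here is the set of great circles on a sphere: a "straight line" at sea is a great-circle arc, and the algorithm exhaustively checks which arcs stay over water. As a rough sanity check on the reported figure (using endpoint coordinates that are our own estimates, not the paper's), the sketch below computes the great-circle distance between the two endpoints; since the sea route runs the long way around the globe, its length is the full circumference minus the short arc.

```python
from math import asin, cos, pi, radians, sin, sqrt

EARTH_RADIUS_KM = 6371.0  # mean radius

def haversine_km(lat1, lon1, lat2, lon2):
    """Shorter great-circle arc between two points, in kilometers."""
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlam = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlam / 2) ** 2
    return 2 * EARTH_RADIUS_KM * asin(sqrt(a))

# Approximate endpoints of the sea route (rough coordinates,
# not figures taken from Chabukswar and Mukherjee's paper):
sonmiani = (25.4, 66.6)    # Balochistan, Pakistan
kamchatka = (59.0, 164.0)  # Karaginsky District, Russia

short_arc = haversine_km(*sonmiani, *kamchatka)
long_arc = 2 * pi * EARTH_RADIUS_KM - short_arc
print(round(short_arc), round(long_arc))
```

The long arc comes out in the neighborhood of 32,000 kilometers, consistent with the 32,089.7-kilometer route the researchers report.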

via the MIT Technology Review

Related Content:

A Colorful Map Visualizes the Lexical Distances Between Europe’s Languages: 54 Languages Spoken by 670 Million People

Colorful Maps from 1914 and 2016 Show How Planes & Trains Have Made the World Smaller and Travel Times Quicker

The Atlantic Slave Trade Visualized in Two Minutes: 10 Million Lives, 20,000 Voyages, Over 315 Years

Free Online Computer Science Courses

More in this category... »