This month MIT is celebrating the launch of the new $1 billion MIT Stephen A. Schwarzman College of Computing. To help commemorate the event, here’s a list of 25 ways in which MIT has already transformed the world of computing technology.
1937: Digital circuits
Master’s student Claude Shannon showed that the principles of true/false logic could be used to represent the on-off states of electric switches — a concept that served as the foundation of the field of digital circuits, and, therefore, the entire industry of digital computing itself.
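Shannon’s insight can be sketched in a few lines of modern code (an illustration, not Shannon’s own notation): model each switch as a boolean, and wiring switches in series or parallel becomes logical AND or OR.

```python
# Shannon's insight, sketched: a switch is a boolean, and wiring
# switches in series or parallel computes AND and OR.
def series(a: bool, b: bool) -> bool:
    """Two switches in series conduct only if both are closed (AND)."""
    return a and b

def parallel(a: bool, b: bool) -> bool:
    """Two switches in parallel conduct if either is closed (OR)."""
    return a or b

# A relay circuit for "light on if the master switch is closed AND
# either wall switch is closed" becomes a logic formula:
def light_on(master: bool, wall1: bool, wall2: bool) -> bool:
    return series(master, parallel(wall1, wall2))

print(light_on(True, False, True))   # True
print(light_on(False, True, True))   # False
```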
1944: The digital computer
The first digital computer that could operate in real time came out of Project Whirlwind, an initiative during World War II in which MIT worked with the U.S. Navy to develop a universal flight simulator. The device’s success led to the creation of MIT Lincoln Laboratory in 1951.
1945: The Memex
Professor Vannevar Bush proposed a data system called a “Memex” that would allow a user to “store all his books, records, and communications” and retrieve them at will, a concept that inspired the early hypertext systems that led, decades later, to the World Wide Web.
1958: Functional programming
The first functional programming language, LISP, was invented at MIT by Professor John McCarthy. Earlier languages were “procedural,” spelling out computations step by step like a recipe; functional languages let programmers describe the desired behavior more directly, making it practical to tackle much bigger problems than before.
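The contrast can be illustrated in Python rather than LISP (a sketch of the two styles, not McCarthy’s original code): the procedural version mutates state step by step, while the functional version composes functions in the spirit LISP pioneered.

```python
# Procedural style: spell out each step and mutate a running total.
def sum_squares_procedural(numbers):
    total = 0
    for n in numbers:
        total += n * n
    return total

# Functional style: describe the result as a composition of
# functions (map a squaring function, then reduce with sum).
def sum_squares_functional(numbers):
    return sum(map(lambda n: n * n, numbers))

print(sum_squares_procedural([1, 2, 3]))  # 14
print(sum_squares_functional([1, 2, 3]))  # 14
```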
1959: The fax
In trying to understand the words of a strongly accented colleague over the phone, MIT student Sam Asano was frustrated that they couldn’t just draw pictures and instantly send them to each other, so he created a technology to transmit scanned material through phone lines. His fax machine was licensed to a Japanese telecom company before becoming a worldwide phenomenon.
1962: The multiplayer video game
When a PDP-1 computer arrived at MIT’s Electrical Engineering Department, a group of crafty students, including Steven “Slug” Russell from Marvin Minsky’s artificial intelligence group, went to work creating “Spacewar!,” a space-combat video game that became very popular among early programmers and is considered the world’s first multiplayer video game.
1963: The password
The average person has 13 passwords, and for that you can thank MIT’s Compatible Time-Sharing System (CTSS), which by most accounts established the first computer password. “We were setting up multiple terminals which were to be used by multiple persons but with each person having his own private set of files,” Professor Fernando “Corby” Corbató told WIRED. “Putting a password on for each individual user as a lock seemed like a very straightforward solution.”
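The per-user “lock” Corbató describes can be sketched in modern terms (CTSS itself simply kept passwords in a file; the salted hashing below is today’s practice, not the 1963 scheme):

```python
# A minimal sketch of per-user password "locks". Modern practice
# stores a salted hash, never the password itself; CTSS's original
# scheme just stored the passwords in a file.
import hashlib
import hmac
import os

users = {}  # username -> (salt, password hash)

def set_password(username: str, password: str) -> None:
    salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    users[username] = (salt, digest)

def check_password(username: str, password: str) -> bool:
    salt, digest = users[username]
    attempt = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    # Constant-time comparison avoids leaking information via timing.
    return hmac.compare_digest(attempt, digest)

set_password("corby", "ctss1963")
print(check_password("corby", "ctss1963"))  # True
print(check_password("corby", "guess"))     # False
```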
1963: Graphical user interfaces
Nearly 50 years before the iPad, an MIT PhD student had already come up with the idea of directly interfacing with a computer screen. The “Sketchpad” developed by Ivan Sutherland PhD ’63 allowed users to draw geometric shapes with a light pen, pioneering the practice of computer-aided design (CAD), which has proven vital for architects, planners, and even toddlers.
1965: Multics
MIT spearheaded Multics, the time-sharing operating system that inspired UNIX and laid the groundwork for many aspects of modern computing, from hierarchical file systems to buffer-overflow security. Multics furthered the idea of the computer as a “utility” to be used at any time, like water or electricity.
1969: Moon code
Margaret Hamilton led the MIT team that coded the Apollo 11 navigation system, which landed astronauts Neil Armstrong and Buzz Aldrin ScD ’63 on the moon. When a misconfigured radar switch flooded the flight computer during the final descent, the software’s priority scheduling shed lower-priority tasks and kept the landing on track, and no software bugs were ever found on any crewed Apollo mission.
1971: Email
The first email ever to travel across a computer network was sent between two computers that sat right next to each other, and it came from MIT alumnus Ray Tomlinson ’65 when he was working at spinoff BBN Technologies. (He’s the one you can credit, or blame, for the @ symbol.)
1973: The PC
MIT Professor Butler Lampson was a founding member of Xerox’s Palo Alto Research Center (PARC), where his work earned him the title of “father of the modern PC.” The Xerox Alto platform was used to create the first graphical user interface (GUI), the first bitmapped display, and the first “What-You-See-Is-What-You-Get” (WYSIWYG) editor.
1977: Data encryption
E-commerce was first made possible by the MIT team behind the RSA algorithm, a method of data encryption whose security rests on how difficult it is to factor the product of two huge prime numbers. Who knew that math would be why you can get your last-minute holiday shopping done?
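The whole scheme fits in a few lines with toy numbers (real keys use primes hundreds of digits long, which is what makes the factoring problem hard in practice):

```python
# Toy RSA with tiny primes; the private key d is easy to recover
# here only because n is trivial to factor.
p, q = 61, 53
n = p * q                     # 3233: the public modulus
phi = (p - 1) * (q - 1)       # 3120
e = 17                        # public exponent, coprime with phi
d = pow(e, -1, phi)           # private exponent: modular inverse of e

message = 65
ciphertext = pow(message, e, n)    # encrypt with the public key (e, n)
decrypted = pow(ciphertext, d, n)  # decrypt with the private key (d, n)
print(decrypted == message)        # True
```

The three-argument `pow` does modular exponentiation efficiently, and `pow(e, -1, phi)` (Python 3.8+) computes the modular inverse.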
1979: The spreadsheet
In 1979, Bob Frankston ’70 and Dan Bricklin ’73 worked late into the night on an MIT mainframe to create VisiCalc, the first electronic spreadsheet, which sold more than 100,000 copies in its first year. Three years later, Microsoft got into the game with “Multiplan,” a precursor to Excel.
1973: Ethernet
Before there was Wi-Fi, there was Ethernet: the networking technology that lets you get online with a simple cable plug-in. Co-invented by MIT alumnus Bob Metcalfe ’69, who was part of MIT’s Project MAC team and later went on to found 3Com, Ethernet helped make the Internet the fast, convenient platform that it is today.
1980: The optical mouse
Undergrad Steve Kirsch ’80 was the first to patent an optical computer mouse — he had wanted to make a “pointing device” with a minimum of precision moving parts — and went on to found Mouse Systems Corp. (He also patented the method of tracking online ad impressions through click-counting.)
1983: The free software movement
Early AI Lab programmer Richard Stallman was a major pioneer in hacker culture and the free-software movement through his GNU Project, which set out to develop a free alternative to the Unix OS, and laid the groundwork for Linux and other important computing innovations.
1985: Spanning tree algorithm
Radia Perlman ’73, SM ’76, PhD ’88 hates when people call her “the mother of the Internet,” but her invention of the Spanning Tree Protocol was vital for letting bridged networks grow without data circling endlessly in loops. (She also developed TORTIS, a version of the educational programming language LOGO designed for very young children.)
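The idea behind the protocol can be sketched centrally (the real protocol reaches the same result in a distributed way, with bridges exchanging BPDU messages): bridges elect a root, then keep only the links on shortest paths to it, leaving a loop-free tree.

```python
# A sketch of the Spanning Tree Protocol's core idea: elect the
# bridge with the lowest ID as root, then keep only the links a
# breadth-first search from the root would use.
from collections import deque

def spanning_tree(links):
    nodes = {n for link in links for n in link}
    root = min(nodes)                       # lowest bridge ID wins
    adjacency = {n: set() for n in nodes}
    for a, b in links:
        adjacency[a].add(b)
        adjacency[b].add(a)
    tree, seen, queue = [], {root}, deque([root])
    while queue:
        node = queue.popleft()
        for neighbor in sorted(adjacency[node]):
            if neighbor not in seen:
                seen.add(neighbor)
                tree.append((node, neighbor))
                queue.append(neighbor)
    return tree

# Four bridges wired with redundant, loop-forming links:
links = [(1, 2), (2, 3), (3, 4), (4, 1), (1, 3)]
print(spanning_tree(links))  # [(1, 2), (1, 3), (1, 4)] -- loop-free
```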
1994: The World Wide Web consortium (W3C)
After inventing the web, Tim Berners-Lee joined MIT and launched a consortium devoted to setting global standards for building websites, browsers, and devices. Among other things, W3C standards ensure that sites are accessible, secure, and easily “crawled” for SEO.
1999: The birth of blockchain
MIT Institute Professor Barbara Liskov’s paper on Practical Byzantine Fault Tolerance helped kickstart the field of blockchain, the distributed-ledger technology behind cryptocurrencies. Her team’s protocol could handle high transaction throughput and used concepts that remain vital to many of today’s blockchain platforms.
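The arithmetic at PBFT’s heart is simple enough to state directly: with n = 3f + 1 replicas the system tolerates f faulty ones, because any two quorums of 2f + 1 replicas must overlap in at least f + 1, guaranteeing at least one honest replica in common.

```python
# PBFT's replica-counting rule, computed for small fault budgets.
def pbft_sizes(f: int):
    n = 3 * f + 1              # total replicas needed
    quorum = 2 * f + 1         # votes required before proceeding
    overlap = 2 * quorum - n   # minimum overlap of any two quorums
    return n, quorum, overlap

for f in range(1, 4):
    n, quorum, overlap = pbft_sizes(f)
    print(f"tolerate {f} faults: {n} replicas, quorum {quorum}, "
          f"quorum overlap >= {overlap} (more than {f} faulty)")
```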
2002: The Roomba
While we don’t yet have robots running errands for us, we do have robo-vacuums, and for that we can thank MIT spinoff iRobot. The company has sold more than 20 million of its Roombas and spawned an entire industry of automated cleaning products.
2007: The mobile personal assistant
Before Siri and Alexa, there was MIT Professor Boris Katz’s StartMobile, an app that allowed users to schedule appointments, get information, and do other tasks using natural language.
2012: edX
Led by former CSAIL director Anant Agarwal, edX, the nonprofit online-learning platform MIT launched with Harvard University, offers free courses that have drawn more than 18 million learners around the globe, and its underlying platform is open-source.
2013: Boston Dynamics
Professor Marc Raibert’s spinoff Boston Dynamics builds robots like “BigDog” and “SpotMini” that can climb, run, jump, and even do backflips. Its humanoid robot Atlas was used in the DARPA Robotics Challenge, aimed at developing robots for disaster-relief sites.
2016: Robots you can swallow
CSAIL Director Daniela Rus’ ingestible origami robot can unfold itself from a swallowed capsule. Using an external magnetic field, it could one day crawl across your stomach wall to remove swallowed batteries or patch wounds.