"Optical character recognition, or OCR, is a technology that grew up alongside computing itself. In many ways, it still feels like magic, even though it's a problem we solved long ago. Today's Tedium tells its story."
In celebration of Women’s History Month, Google Play is teaming up with Google’s Made with Code to encourage more teen girls to study computer science. Starting today, you can watch Hidden Figures on Google Play Movies & TV in the U.S.
The MITS Altair 8800 occupies a unique place in computing history as the first commercially successful microcomputer built for personal rather than business use. It is famous as the platform on which the first Microsoft product, the company's original BASIC interpreter, ran. [Josh Bensadon] has an Altair 8800 and became intrigued by its bootloader.
A Database Management System allows a person to organize, store, and retrieve data from a computer. It is a way of communicating with a computer’s “stored memory.” In the very early years of computers, “punch cards” were used for input, output, and data storage.
March 22, 1993: Apple launches the PowerCD, the first device from the company that doesn’t require a computer to work. A portable CD player that also worked as an external CD drive for Macs, PowerCD will ultimately fail in the marketplace — but it offers a glimpse of the extremely lucrative path Apple will follow […] (via Cult of Mac - Tech and culture through an Apple lens)
Later this month, one of the rarest specimens of hardware history will go up for auction: a working Apple-1. While most people will never be able to afford to bid on it, this is a great opportunity to look at this beautifully ragtag machine before it disappears into someone’s collection.
Researchers at the American Museum of Natural History have used superfast computers to reorganize the raw notes that formed “On the Origin of Species.”
Nearly 30 years ago, Tim Berners-Lee, a British computer scientist, proposed a global system of technology that would change history: the world wide web. On this day in 1989, Tim …
Conway’s Game of Life has to be the most enduring zero-player computer game in history. Its four simple cellular-automaton rules have been used to create amazing simulations since the 1970s. The latest is an entire digital clock implemented in Life.
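For readers who haven't run Life themselves, those four rules collapse into a single survival/birth condition that fits in a few lines. A minimal sketch in Python (a generic illustration of the rules, not the clock build mentioned above; the `life_step` and `blinker` names are ours):

```python
from collections import Counter

def life_step(live):
    """Advance one generation of Conway's Game of Life.

    `live` is a set of (x, y) coordinates of live cells on an
    unbounded grid.
    """
    # Count live neighbors for every cell adjacent to a live cell.
    counts = Counter(
        (x + dx, y + dy)
        for x, y in live
        for dx in (-1, 0, 1)
        for dy in (-1, 0, 1)
        if (dx, dy) != (0, 0)
    )
    # The four rules reduce to: a cell is alive next generation if it
    # has exactly 3 live neighbors (birth), or exactly 2 and is already
    # alive (survival); everything else dies or stays dead.
    return {cell for cell, n in counts.items()
            if n == 3 or (n == 2 and cell in live)}

# A "blinker" oscillates between a horizontal and a vertical bar:
blinker = {(0, 1), (1, 1), (2, 1)}
```

Calling `life_step(blinker)` twice returns the original pattern, which is why the blinker is the classic period-2 oscillator.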
In what appears to be the largest leak of C.I.A. documents in history, WikiLeaks released on Tuesday thousands of pages describing sophisticated software tools and techniques used by the agency to break into smartphones, computers and even Internet-connected televisions.
March 3, 1975: The Homebrew Computer Club, a hobbyist group that helps spark the personal computing revolution, holds its first meeting in Menlo Park, California. A forum for computer geeks at a time when few others cared, regular attendee…
March 1, 1991: Apple introduces the Apple IIe Card, a $199 peripheral that lets users turn their LC-family Macs into fully functioning Apple IIe computers. The ability to emulate Apple’s popular Apple IIe computer on a Mac brings together the two operating systems Apple was running side by side, which had previously remained separate.
February 28, 2006: Apple introduces its Mac mini with an Intel processor. An affordable “headless” Mac for entry-level users, it’s the third Apple computer overall to switch to Intel chips. Oh, and it makes one heckuva media player when plugged into a TV.
Code is almost everywhere. Modern computers arrived in the 1940s, and in the decades since, programming has enabled better communication and led to advancements across a myriad of industries. Everything from space travel to telecommunications and healthcare has been revolutionized by code.
Archaeological artifacts, such as the Jupiter Column of Ladenburg, a town with an impressive Roman history, hold many as-yet-undiscovered secrets. Although the monument was discovered in 1973 and is more than 1,800 years old, its history is still unclear.
February 22, 2001: Apple introduces the iMac Special Edition, making its savior iMac G3 computer available in custom “Flower Power” and “Blue Dalmatian” patterned designs. A far cry from the super-serious, aluminum-heavy industrial design…
February 17, 1997: Apple launches the PowerBook 3400, a laptop that Apple claims is the fastest portable computer in the world. After a rough few years for the PowerBook, this laptop is designed to throw down the gauntlet to rivals…
Writers and scientists throughout history have searched for an apt technological analogy for the human brain, often comparing it to a computer. For Pulkit Grover, Carnegie Mellon University assistant professor of electrical and computer engineering and member of the Center for the Neural Basis of Cognition, this analogy couldn't be more fitting.
Deep in the Valley this afternoon, 500 Startups’ 19th demo day drew to a close at the Computer History Museum. Business-to-business software and fashion and beauty products made up the largest proportion of companies, but 500 left room for a few outliers.