(Princeton University, Engineering School) Researchers wrote computer programs that found patterns in anonymized web-traffic data and used those patterns to identify individual users. The researchers note that web users with active social media accounts are vulnerable to the attack.
"What cannot be automated is the understanding of the implications of these findings for people," said Dr. Tom Lansdall-Welfare, who led the computational part of the study. "That will always be the realm of the humanities and social sciences, and never that of machines." From "How a computer sees history after 'reading' 35 million news stories."
This photo looks like an amazing piece of computer history. But nobody’s sure where it came from, not even the photo company that controls its rights.
January 18, 1983: Computer manufacturer Franklin Electronic Publishers announces its Franklin Ace 1200 computer, one of several Apple II clones the company made. Franklin's line of unauthorized Apple clones (unlike the later official clone Macs in the 1990s) becomes the center of an important legal battle, in which a U.S. appeals court rules that computer software is protected by copyright.
January 17, 1984: One week before its more famous appearance at Super Bowl XVIII, Apple's iconic "1984" ad debuts as a pre-movie trailer in movie theaters. To sell its revolutionary new Macintosh computer, Apple buys several months of ad time from theatrical ad distributor ScreenVision.
This week’s milestones in the history of technology include the public unveiling of the air defense system that led to networked and interactive computing, and the Apple Lisa and Apple Macintosh that led to the mainstreaming of the graphical user interface.
So far, humans have relied on the written word to record what we know as history. When artificial intelligence researchers ran billions of those words from decades of news coverage through an automated analysis, however, even more patterns emerged.
It's been 10 years since Apple debuted the first iPhone. We review the device's history, its achievements, and where Apple might go from here. The post "The iPhone was announced 10 years ago, reinventing smartphones, mobile computing" appeared first on ExtremeTech.
The platform still has a cult following today.
Windows vs. Mac is just about the oldest battle in computer fanboy history. Right-click sheeple against computers that just barely work is a flame war for the ages. But throughout the rivalry, it's always been accepted that Windows will be sold on more devices.
Apple's iconic computer nugget turns 10 on Monday, and my, how it's changed. When Steve Jobs first introduced the iPhone in 2007, it was a squat, chubby gadget sold largely on the merit that you could use it as a "widescreen iPod with touch controls."
From the Computer History Museum: Years ahead of its time, the 1972 Xerox Alto featured Ethernet networking, a full page display, a mouse, laser printing, e-mail, and a windows-based user interface. Although its high price limited sales, the Alto was a groundbreaking invention and the inspiration for the Apple Macintosh and Microsoft Windows operating systems. (Thanks UPSO!)
Ten years ago today, January 9, 2007, was a milestone in the history of computing: The launch of the first iPhone. It wasn't the first "smartphone," or the first phone with a camera. It wasn't the first mobile device to have a touchscreen, or to let users install apps.
Today, modern man finds himself living in the single greatest time in history. Food is abundant, cell phones have more computing power than the entire world possessed 70 years ago, and the internet gives access to limitless information at the touch of a button.
January 4, 1995: Apple signs a deal with third-party Mac accessory maker Radius, allowing it to build Macintosh clones. Radius is the second company to license the Macintosh operating system, after Power Computing did the same thing one month earlier.
January 3, 1977: Apple Computer Co. is officially incorporated, with Steve Jobs and Steve Wozniak listed as co-founders. Third Apple founder Ron Wayne — who initially invested in the company — is not part of the deal, after selling back his share in Apple for $800. The funding and expertise needed to turn Apple into […] (via Cult of Mac - Tech and culture through an Apple lens)
January 2, 1979: Entrepreneurs Dan Bricklin and Bob Frankston incorporate their company Software Arts to publish a little program called VisiCalc. The first spreadsheet for the Apple II, the $100 VisiCalc becomes personal computing's first "killer app" and helps transform the personal computer from a "cool to have" toy into a "must have" business accessory.
December 29, 1999: Apple starts shipping its at-the-time unfathomably large 22-inch LCD "Cinema Display." The biggest LCD computer display available anywhere in 1999, Apple's all-digital flat panel is a far cry from the bulky cathode ray tube monitor the then-current iMac sported.
On November 18–20, the American Museum of Natural History (AMNH) hosted "Hack the Stacks," a solution-building event where over 100 developers, programmers, and others with a passion for computer science worked overnight to develop innovative solutions for the challenges faced by modern libraries and archives.
For the past 30 years, Hollywood has consistently struggled to depict computer hacking in accurate and exciting ways. The history of Hollywood and hacking is littered with lazy writing, absurdly unrealistic computer interfaces, and stereotypical "nerd" characters.