Trend Results: Computer History


Blog Post Results (1-20 of 1486)


Big data allows computer engineers to find genetic clues in humans

Computer scientists analyzed a large dataset about an important protein and discovered its connections across human history, as well as clues about its role in complex neurological diseases.

Generating the Mandelbrot Set With IBM Mainframes

[Ken Shirriff] is apparently very cool, and when he found out the Computer History Museum had a working IBM 1401 mainframe, he decided to write a program. Not just any program, mind you; one that would generate a Mandelbrot fractal on a line printer. The IBM 1401 is an odd beast.
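
If you are curious what such a program actually computes, here is a minimal sketch of the same idea in Python: iterate z = z² + c for each character cell and print one row of text per printer line. This is not [Ken Shirriff]'s actual 1401 code, and the grid size and iteration limit below are arbitrary assumptions.

    # A modern, minimal sketch of a line-printer Mandelbrot: one printed line per row
    # of character cells, '*' for points that stay bounded, ' ' for points that escape.
    WIDTH, HEIGHT = 100, 40   # assumed page dimensions in characters
    MAX_ITER = 30             # assumed iteration cap

    for row in range(HEIGHT):
        line = []
        for col in range(WIDTH):
            # Map this character cell to a point c in the complex plane.
            c = complex(-2.5 + 3.5 * col / WIDTH, -1.25 + 2.5 * row / HEIGHT)
            z = 0j
            for _ in range(MAX_ITER):
                z = z * z + c
                if abs(z) > 2:        # the point escapes, so it is outside the set
                    line.append(" ")
                    break
            else:                     # never escaped: treat the point as inside the set
                line.append("*")
        print("".join(line))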

Robots: Quest for Computer Vision

In this episode, Audrow Nash interviews Peter Corke, Professor of Robotic Vision at Queensland University of Technology, about computer vision. They speak about the history and future of computer vision and Peter's upcoming Massive Open Online Courses (MOOCs).

The story behind the first computer viruses ever

When we think about computer viruses, we tend to think about Windows, or perhaps cross-platform malware picked up from visiting questionable websites. But truth be told, computer viruses have a long and storied history on both the PC and Apple sides of the equation.

Web History: The Most Hilarious, Weirdest, And Most Obvious Things People Google Search In Each State

15 Of The Most Googled Terms By Cost For Every State In America. If you were allowed to go on a random person's computer and look through their web history, you would probably end up laughing, crying, and being...

An Animated Lesson Exploring Whether Robots Can Truly Be Creative

last week · Humor / Odd: Laughing Squid

A recent TED-Ed animated lesson by Gil Weinberg explores whether robots can truly be creative, or if they simply do what humans program them to do. The lesson provides a quick history of the first computer programmer, Ada Lovelace, and her stipulation that for a machine to be considered intelligent, it must be capable of creating original […]

Long Live Internet Explorer

Internet Explorer, as we know it, is dead. Microsoft is killing off the name of the most popular internet browser in the history of computing in favor of a browser called Spartan. Go ahead and make your jokes, yawn if you want to, then pour one out, because it's actually kind of a big deal.

Safari Private Browsing History Is Not Forgotten After All

Visited URLs from Safari private browsing sessions are stored in an easily viewable file on your computer. Safari users, take note: Your private browsing history is actually quite easy to retrieve. A list of the URLs you have visited...

Computing history under the hammer at Bonhams' Fine Books & Manuscripts auction

last week · Arts: Artdaily

Exciting and valuable objects from the dawn of computing lead the Fine Books & Manuscripts auction on April 13 at Bonhams New York. The star lot, a recently discovered handwritten manuscript by Alan Turing in which he works on the foundations of mathematical notation and computer science, is expected to fetch at least seven figures.

NYPD Edits Wikipedia Posts on Police Corruption, Shootings

According to Capital New York, computers at the NYPD's headquarters have been used to edit Wikipedia entries on NYPD corruption, alleged murders, and more. Ladies and gentlemen, New York's finest (at wiping its dirty history clean). The NYPD's computers were traced by their IP addresses.

Susan Kare’s original Mac icon designs go on show in New York

As the artist responsible for the famous icons used for the original Macintosh, Susan Kare played an immensely important role in personal computer history. A new exhibition at the Museum of Modern Art in New York pays homage to the queen …

Wikibon view: ODP clarifies choices in chaotic Hadoop market

Industry consortia have a mixed track record in computer industry history. At their best, they create standards the market can rally around, thereby simplifying user decision making (think W3C). At their worst, they foster bickering and divisiveness that shoves innovation …

There's one big difference in the way Apple approached the Apple Watch versus the iPhone, according to Jony Ive (AAPL)

Many have labeled the iPhone as one of the most revolutionary computing devices in history — it essentially popularized the modern smartphone. Now, the tech industry is looking to Apple to make a similar impact on the smartwatch industry.

Alibaba Is Expanding Its Cloud Services To The U.S. To Give Amazon New Competition

3 weeks ago · Technology: TechCrunch

Alibaba, the Chinese commerce firm that held the largest IPO in history last year, is bringing cloud computing services to the U.S. after announcing plans to open a data center in Silicon Valley.

Computational Anthropology Reveals How the Most Important People in History Vary by Culture

Data mining Wikipedia's entries on notable people reveals some surprising differences in the way Eastern and Western cultures identify important figures in history, say computational anthropologists.

Today in Media History: In 1973, the Philadelphia Inquirer published one of the first computer-assisted reporting projects

On February 18, 1973, The Philadelphia Inquirer published the first in a series of computer-assisted reporting stories called "Crime and Injustice." Investigative reporters Donald Barlett and James Steele used a mainframe computer to...

International Hacking Group Steals $300 Million – Global Digital Banking System Not Secure

- Sophisticated "Ocean's 11"-style heist is one of the largest in history
- Hackers remotely accessed bank computers to manipulate accounts and A.T.M.s.
- Banking groups make no comment
- Details expose incredible systemic vulnerability...

Non-Arduino powered by a piece of Computing history

Sometimes it is a blessing to have some spare time on your hands, especially if you are a hacker with lots of ideas and the skill to bring them to life. [Matt] was lucky enough to have all of that and recently completed an ambitious project 8 months in the making – a Non-Arduino powered by the giant of computing history – Intel's 8086 processor.

This Arduino-Style Board Uses Intel's 37-Year-Old 8086 Chip

last month · Technology / Gadgets: Gizmodo

Card-based computers keep getting more powerful, unless you're Matt Millman: this chap has decided to build an Arduino-style board powered by a giant of computing history, Intel's 8086 processor, which is now 37 years old.

RESTART: Russian Hackers Commit Largest Bank Heist In Human History – $1 Billion Gone!

Using malware, Russian hackers steal more than $1 Billion in what's being described as the biggest bank heist in human history. As much as £650 million is thought to have gone missing after the gang used computer viruses to infect …

Copyright © 2015 Regator, LLC