Team accelerates rendering with AI

Modern films and TV shows are filled with spectacular computer-generated sequences, which are produced by rendering systems that simulate the flow of light in a 3D scene. However, computing many light rays is an immensely compute-intensive and time-consuming process. The alternative is to render the images using only a few light rays, but this shortcut results in inaccuracies that show up as objectionable noise in the final image.
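The few-rays-versus-many-rays tradeoff comes from the Monte Carlo sampling at the heart of such renderers: the error of an estimate shrinks only as the square root of the sample count. A minimal sketch of that effect, using a toy one-dimensional "radiance" integral rather than a real renderer (the function and sample counts below are illustrative assumptions, not from the article):

```python
import random

def mc_estimate(f, n, seed=0):
    # Monte Carlo estimate of the integral of f over [0, 1]
    # using n uniform random samples ("light rays").
    rng = random.Random(seed)
    return sum(f(rng.random()) for _ in range(n)) / n

# Toy stand-in for per-pixel radiance; its true integral over [0, 1] is 1/3.
radiance = lambda x: x * x
true_value = 1 / 3

few = mc_estimate(radiance, 8)        # few samples: fast but noisy
many = mc_estimate(radiance, 100_000) # many samples: slow but accurate

print(f"error with 8 samples:      {abs(few - true_value):.4f}")
print(f"error with 100k samples:   {abs(many - true_value):.4f}")
```

Averaged over many random seeds, the low-sample estimate's error is roughly a hundred times larger here, which is the per-pixel noise the article describes; AI denoisers aim to recover the clean image from the cheap, noisy estimate.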
Related Posts


Computer graphics researchers crack realistic fabric

Technology / Gadgets : Gizmag

Computer scientists have come up with a new simple, accurate way to simulate the appearance of fabric that could change the way artists and animators in the film and computer game industries go about the business of rendering comput...

Disney's new rendering technique could usher in a new era of animation

Genres / Sci Fi : io9

An animation studio can spend days rendering a scene that features water, smoke and other substances that affect light (and its simulation) in complex ways. Now, a team led by Disney Research Zürich has developed a computational alg...

How to Get Your Favorite Songs on Your iPhone From Your Computer

Technology / Software : Systweak Software

Transferring your favorite tracks to your iPhone from your computer can be a little tricky but not impossible. Unlike Android, you cannot directly transfer music and videos to your iPhone by just connecting it with the cable. But th...

Record-breaking 45-qubit quantum computing simulation run at NERSC

Academics / Physics : Physorg: Physics

When two researchers from the Swiss Federal Institute of Technology (ETH Zurich) announced in April that they had successfully simulated a 45-qubit quantum circuit, the science community took notice: it was the largest ever simulati...

New system greatly speeds common parallel-computing algorithms

Technology : Physorg: Technology news

The chips in most modern desktop computers have four "cores," or processing units, which can run different computational tasks in parallel. But the chips of the future could have dozens or even hundreds of cores, and taking advantag...

Copyright © 2016 Regator, LLC