View Full Version : New display tech: Better Than High Definition



Sporkman
August 10th, 2007, 05:03 AM
http://www.technologyreview.com/Infotech/19141/



Better Than High Definition

New high-contrast displays could provide more-realistic images.

By Kate Greene

High-definition displays are increasingly popular. More and more people are experiencing high-definition movies and television in breathtaking color and detail. But another technology, called high dynamic range (HDR), is on the heels of high definition, and some experts think that it could be a quick successor. Whereas high-definition displays pump out more pixels, HDR displays provide more contrast. In other words, on an HDR display, the brightest whites are hundreds of thousands of times brighter than the darkest blacks; the contrast is key to making images on such a display appear more realistic. "A regular image just looks like a depiction of a scene," says Roland Fleming, a research scientist at the Max Planck Institute for Biological Cybernetics, in Tübingen, Germany. "But high-dynamic range looks like looking through a window."
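To get a rough sense of that scale: a contrast ratio translates into photographic stops (doublings of light) via a base-2 logarithm. The sketch below uses illustrative figures of my own choosing (1,000:1 for a conventional LCD, 200,000:1 for an HDR display), not numbers from the article.

import math

def contrast_to_stops(contrast_ratio):
    # Photographic stops = how many times the light doubles from black to white.
    return math.log2(contrast_ratio)

for name, ratio in [("conventional LCD", 1000), ("HDR display", 200000)]:
    print(f"{name}: {ratio}:1 contrast = {contrast_to_stops(ratio):.1f} stops")

A 200,000:1 display spans roughly 17.6 stops, versus about 10 for a 1,000:1 panel, which is why the difference is visible at a glance.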

Fleming, whose recent research on high-dynamic-range displays is being presented at SIGGRAPH, a graphics conference held this week in San Diego, suspects that this realism will draw people to the technology. And recently, manufacturers have started to pay attention to HDR. Major companies such as Philips and Samsung have demonstrated prototypes at trade shows. Jason Ledder, a representative for Samsung, says that the company is "doing a variety of research and trying to figure out when and where to incorporate [HDR] into products."

Earlier this year, Dolby bought BrightSide Technologies, a startup based in British Columbia that developed a novel HDR display capable of four hundred times more contrast than a conventional monitor--closer to what the human eye can perceive. While a traditional liquid-crystal display is illuminated by a single white backlight, a BrightSide display is illuminated by an array of tiny white light-emitting diodes (LEDs). This means that individual LEDs can be turned off or on, darkening or brightening different parts of the liquid-crystal display independently. Neither Dolby nor the other companies are providing specific timelines for a product, but Fleming has heard reports that displays could be available, for a few thousand dollars, within a year.
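A minimal sketch of that dual-modulation idea, assuming a grayscale image and a coarse grid of LED zones (the zone size and the max-based drive rule are illustrative choices, not BrightSide's actual algorithm):

import numpy as np

def local_dimming(image, zone=32):
    # image: 2-D array of linear luminance in [0, 1].
    h, w = image.shape
    backlight = np.zeros_like(image)
    for y in range(0, h, zone):
        for x in range(0, w, zone):
            block = image[y:y+zone, x:x+zone]
            # Drive each LED zone just bright enough for its brightest pixel.
            backlight[y:y+zone, x:x+zone] = block.max()
    # The LCD layer compensates for the non-uniform backlight behind it.
    lcd = np.divide(image, backlight, out=np.zeros_like(image), where=backlight > 0)
    return backlight, lcd

Zones containing only dark pixels get a nearly-off LED, so blacks stay truly black while bright zones stay fully lit; the product of the two layers reproduces the original image.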

One of the problems with introducing a new type of display, however, is overcoming the perception that there won't be any content that will take advantage of its potential, Fleming says. This is something that has plagued the market for high-definition displays: many people are waiting to buy a high-definition TV until there is more content, and providers are slow to churn out high-definition content until more people have the displays. Many experts believe that the same issue could pose a challenge for HDR products.

However, the research by Fleming and his colleagues at the University of Bristol, in the UK, and at the University of Central Florida suggests otherwise. "The key questions that everyone's been raising," he says, "are how [HDR] is going to make the transition and how it is going to show a regular image." Usually, he says, regular images can be processed using difficult-to-engineer software that adds contrast. His team's original plan was to determine people's perception of contrast on HDR displays to see how much extra information needs to be added to a regular image to make it appear as an HDR image on an HDR display. To the researchers' surprise, says Fleming, they learned that they didn't need complicated software at all. They surveyed people viewing low-contrast and high-contrast images, both on an HDR display. When the low-contrast images were processed with simple software that amplified pixels, the images were perceived as high contrast. In fact, Fleming says, the average person couldn't tell the difference between the low- and high-contrast images, and all the images looked significantly better than they would have on a regular display.

"The reason this is important," Fleming says, "is that it shows that the technology is ready to deploy immediately. There isn't a technology barrier between releasing these displays and having them widely adopted." He says that such a simple, pixel-amplifying algorithm could easily be incorporated into the display and automatically enhance low-contrast images in real time.

"I think the topic is interesting," says Paul Debevec, a professor of graphics research at the University of Southern California, in Los Angeles. "They're trying to get a handle on the implications of having these HDR displays and find out how it will change things."

In addition to new displays, Debevec says, there will eventually need to be HDR content because, while a low-contrast image looks great on an HDR display, a high-contrast image looks stunning. The basic premise behind producing an HDR image, he says, is to reshoot a scene under different lighting conditions and combine the shots using software. For example, a picture of a person standing in front of an open window would normally look like a dark silhouette surrounded by bright light. Different exposures gather different information, and in the end, the composite HDR image, which captures the bright light as well as the details in the shadows, looks more realistic. However, most cameras don't capture light this way, and while some animators and video-game makers are applying HDR to their work, moviemakers have yet to embrace it.
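A bare-bones version of that multi-exposure merge (real pipelines, including Debevec's own radiance-map work, also recover the camera's response curve; this sketch assumes pixel values are already linear):

import numpy as np

def merge_exposures(images, exposure_times):
    # Weighted average of per-shot radiance estimates; a "hat" weight
    # trusts midtones and ignores pixels clipped to pure black or white.
    num = np.zeros_like(images[0])
    den = np.zeros_like(images[0])
    for img, t in zip(images, exposure_times):
        weight = 1.0 - np.abs(2.0 * img - 1.0)
        num += weight * img / t
        den += weight
    return num / np.maximum(den, 1e-6)

# Illustrative scene: deep shadow, midtone, bright window, shot at three speeds.
scene = np.array([0.001, 0.05, 0.8])
times = [0.01, 0.04, 0.16]
shots = [np.clip(scene * t * 100, 0.0, 1.0) for t in times]
print(merge_exposures(shots, times))  # recovers all three levels, none clipped

Each exposure contributes the region it captured well: the short exposure saves the bright window from clipping, while the long one digs detail out of the shadows.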

triptoe
August 10th, 2007, 05:16 AM
sounds nice but I am personally waiting for OLED

bonzodog
August 10th, 2007, 09:02 AM
The question now arises -- how good are the cameras filming the scenes in movies?

The cameras themselves need to be improved drastically if we are to make full use of viewing technology like this.

Kosimo
August 10th, 2007, 09:44 AM
sounds nice but I am personally waiting for OLED

Me too! :)

chewearn
August 10th, 2007, 10:00 AM
Now they tell me my spanking new LCD will be obsolete in 12 months? Jeez...

popch
August 10th, 2007, 11:02 AM
The question now arises -- how good are the cameras filming the scenes in movies?

The cameras themselves need to be improved drastically if we are to make full use of viewing technology like this.

I do not think that the optics will change at all. However, the thingies that make bits from photons have to improve a bit.

What the article essentially says is that color depth is to be increased. 16 million colors (24 bits) in a linear space just are not enough. We already know that traditional film (or slides) has a color depth which exceeds our meager 16 million by an ample margin.
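To put rough numbers on that point: if you spread 8 bits per channel linearly across a high-contrast range, the jump between adjacent code values near black is enormous, exactly where the eye is most sensitive. The 200,000:1 ratio below is an assumed figure for illustration.

# Step between adjacent codes when luminance is encoded linearly.
peak, black = 200000.0, 1.0
print((peak - black) / 255)    # 8 bits: one step spans ~784x the black level
print((peak - black) / 65535)  # 16 bits: one step spans ~3x the black level

This is why HDR pipelines need more bits per channel, a log-like encoding, or both.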

Sporkman
August 10th, 2007, 02:42 PM
The cameras themselves need to be improved drastically if we are to make full use of viewing technology like this.

Not really:



They surveyed people viewing low-contrast and high-contrast images, both on an HDR display. When the low-contrast images were processed with simple software that amplified pixels, the images were perceived as high contrast. In fact, Fleming says, the average person couldn't tell the difference between the low- and high-contrast images, and all the images looked significantly better than they would have on a regular display.