HDMI vs DVI

Posted: Sat Oct 04, 2008 10:36 am
by woodchip
I'm a little confused here. I bought the Samsung T260, and on the back I can connect either DVI or HDMI.
My Sapphire 4870 vid card has only a DVI port, but it does come with an adapter to use HDMI. So which connection will give me the best visuals? As I understand it, DVI is an 8-bit connection but HDMI can be 8-bit or higher. Just don't know if I will see any difference if I adapt and use HDMI.
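For scale, here is a rough back-of-the-envelope sketch (Python, assuming the T260's native 1920x1200 at 60 Hz and roughly 8% blanking overhead; exact timings differ) of what the bit depth means for the link:

[code]
# Rough link-bandwidth arithmetic: what 1920x1200@60 needs at 8-bit vs deeper color.
# Assumption: ~8% blanking overhead (reduced-blanking timings); real timings vary.

ACTIVE_PIXELS = 1920 * 1200      # visible pixels per frame
BLANKING_OVERHEAD = 1.08         # assumed ~8% extra for blanking intervals
REFRESH_HZ = 60

pixel_clock_mhz = ACTIVE_PIXELS * BLANKING_OVERHEAD * REFRESH_HZ / 1e6
print(f"Approx. pixel clock:   {pixel_clock_mhz:.0f} MHz")
print("Single-link DVI limit: 165 MHz")

# TMDS carries one color component per channel per pixel clock, so deeper color
# raises the effective clock in proportion to the bit depth.
for bits_per_color in (8, 10, 12):
    effective_mhz = pixel_clock_mhz * bits_per_color / 8
    print(f"{bits_per_color:>2}-bit per color -> effective clock ~{effective_mhz:.0f} MHz")
[/code]

By that rough math, 8-bit just fits a single DVI/HDMI link at this resolution; anything deeper needs deep-color support at both ends of the cable, not just a different plug.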
Thanks

Posted: Sat Oct 04, 2008 11:55 am
by Krom
They are essentially the same, but HDMI implies support for all the fancy DRM encryption nonsense that the MPAA insisted on having. Of course there's no guarantee that the encryption actually works over a DVI adapter, and you are probably out of luck watching legal HD content anyway if your monitor doesn't fully support HDCP. (Pirated content should still work fine though, so no worries.)

Basically, every new standard added to display interfaces in recent history has nothing to do with actually improving the quality, bandwidth, or ease of use of the connection; it is all for the purpose of content protection and encryption.

Posted: Sat Oct 04, 2008 1:19 pm
by fliptw
HDCP is supported over DVI. Your monitor just needs to support HDCP to watch content protected with it.

Posted: Sat Oct 04, 2008 3:24 pm
by Grendel
HDMI also has an audio channel.

Posted: Sat Oct 04, 2008 4:18 pm
by fliptw
Audio channels... I'm pretty sure HDMI isn't mono.

Though I wouldn't put it past them...

Posted: Sat Oct 04, 2008 5:54 pm
by Krom
HDMI has room (bandwidth) for 8-channel lossless PCM audio. However, due to the strict content protection demanded by the industry and general laziness on both sides... no current PC implementations of HDMI support it; the best you can do over HDMI from a PC is the lossy audio formats.
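For scale, a quick bit of arithmetic (a Python sketch; the 8-channel/24-bit/96 kHz format is just a typical example, and the ceiling shown assumes the 8-channel/24-bit/192 kHz maximum):

[code]
# Quick arithmetic behind "room for 8-channel lossless PCM audio".
# Assumed example format: 8 channels, 24-bit samples, 96 kHz.

channels = 8
bits_per_sample = 24
sample_rate_hz = 96_000

pcm_mbps = channels * bits_per_sample * sample_rate_hz / 1e6
print(f"8ch / 24-bit / 96 kHz PCM: ~{pcm_mbps:.1f} Mbit/s")

# The 8ch / 24-bit / 192 kHz maximum works out to roughly:
ceiling_mbps = 8 * 24 * 192_000 / 1e6
print(f"8ch / 24-bit / 192 kHz:    ~{ceiling_mbps:.1f} Mbit/s")

# Compare a typical lossy track (e.g. 640 kbit/s Dolby Digital): 0.64 Mbit/s
[/code]

Even the full ceiling is a tiny fraction of the video bandwidth on the cable, so it really is a support and licensing problem rather than a capacity one.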

Posted: Sat Oct 04, 2008 6:09 pm
by woodchip
Looking at the monitor documentation a bit closer, it looks as though the HDMI connection is more for something like a standalone DVD player.

Posted: Sun Oct 05, 2008 2:26 pm
by Canuck
Basically the DVI signals are the same as the HDMI signals, at least to the extent that you can use a DVI to HDMI converter to provide video, but not audio, to your set. Your TV wants the whole deal on the HDMI input and doesn't allow analog audio in on that input like a Sony or Sharp would.

If you want to use your computer's HDMI out, that is a crapshoot, as manufacturers sometimes don't include the audio component or cabling needed to achieve this. If you just want video on the TV and audio on the stereo, then you're set; use the HDMI. *Edit:

Umm, maybe use the VGA out; I would after reading the specs on the vid card.

From ATI:
What a burn on the HDMI out! 1080i out max on HDMI...

Q: I have a query. Your product page says that the Radeon HD3850 series has "Hardware processed 1080p video playback of Blu-ray and HD DVDs", however the spec spreadsheet shows that the maximum HDMI output mode is 1080i. I am confused which is correct, or is it both - as in the card will decode 1080p video but is only able to output 1080i through the HDMI interface?
A: The "Hardware processed 1080p video playback of HD DVDs and Blu-ray" means that our GPU can support up to 1080p decode bandwidth. And the TV-out and HDMI maximum of 1080i means that the "display" mode can support up to 1080i. These are two different things. If you do not have HDMI 1080i support, just use a CRT monitor or traditional TV to see the HD DVD or Blu-ray DVD; our GPU can still decode 1080p video, and you can still see it on the CRT or traditional TV. (071228)
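To put numbers on the decode-vs-display distinction in that answer, a small Python sketch (blanking intervals ignored; the standard pixel clocks are quoted only for comparison):

[code]
# Active pixel-rate comparison: 1080p60 frames vs 1080i60 fields.
# Blanking intervals ignored; standard pixel clocks noted in the output.

def active_pixel_rate_mpx(width, lines, rate_hz):
    """Visible pixels pushed per second, in megapixels/s."""
    return width * lines * rate_hz / 1e6

p60 = active_pixel_rate_mpx(1920, 1080, 60)  # 1080p60: 60 full frames per second
i60 = active_pixel_rate_mpx(1920, 540, 60)   # 1080i60: 60 half-height fields per second

print(f"1080p60: ~{p60:.1f} Mpx/s (standard pixel clock 148.5 MHz)")
print(f"1080i60: ~{i60:.1f} Mpx/s (standard pixel clock 74.25 MHz)")
print(f"1080i60 moves {i60 / p60:.0%} of the pixels of 1080p60")
[/code]

So a card can decode full 1080p frames internally and still only drive a 1080i signal out of a given connector; the decode limit and the output limit are separate things.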


I have a converter box that takes the VGA and analog audio out and converts it to full HDMI 1.3a, if you run into difficulties. My cost was $350.00 or so.

Posted: Sun Oct 05, 2008 5:44 pm
by woodchip
Why would one be concerned about audio in the stream when most of us would have a sound card and separate speakers?

Re:

Posted: Sun Oct 05, 2008 9:21 pm
by Mr. Perfect
It seems like it's mostly to reduce cable clutter. Running one cable is easier than separate video and audio lines. Who knows, maybe they can use HDMI audio for additional nifty effects too.
Canuck wrote:I have a query, your product page says that the Radeon HD3850 series has "Hardware processed 1080p video playback of Blu-ray and HD DVDs" however the spec spreadsheet shows that the maximum HDMI output mode is 1080i.
Does that FAQ apply to the new 4870 though? Woodchip doesn't have a 3000 series part that the question is about.

Re:

Posted: Sun Oct 05, 2008 11:33 pm
by Grendel
woodchip wrote:Why would one be concerned about audio in the stream when most of us would have a sound card and separate speakers?
If you don't have an HD/BR player there's no concern. If you do and want full HD content played back, things get a bit more complicated (DRM); otherwise you should be fine using the DVI port.
Mr. Perfect wrote:Does that FAQ apply to the new 4870 though? Woodchip doesn't have a 3000 series part that the question is about.
No, the 4000 series does 1080p plus up to 7.1 audio over HDMI.

Posted: Mon Oct 06, 2008 12:07 pm
by Canuck
My mistake, they had that note in the FAQ for the 4870.
HDMI carries audio and video on one cable; great when it works.

Posted: Mon Oct 06, 2008 1:01 pm
by Foil
From my personal experience:

Mine is an HD3450 (not as fast as the 3850 or 3870, but with the same feature set), and it does everything as advertised via the HDMI adapter, including 1080p and 5.1 audio.

I use it along with a Blu-Ray / HD-DVD combo drive for hi-def movies, as well as streaming video from Netflix and such. I used to have issues with DRM/HDCP with my old video card, but the ATI 3xxx+ cards work just fine.

It plays both Blu-Ray and HD-DVD at 1080p just fine; it really is 1080p (1920x1080 at 60 Hz progressive), I've checked. The 5.1 audio also works perfectly through the Realtek chip on the video card over HDMI; I just had to install the Realtek driver.

It doesn't look like you're planning to use the HDMI audio, so there's no real advantage to using HDMI. However, if you do, and you have any issues with it, let me know. I did a lot of research when building my home-theater box.

[Edit: I've also used a DVI->HDMI converter cable to connect to my HDTV with my NVidia 8800 card. It worked just fine for 1080p, no DRM issues. The only difference I noted was that the ATI control center is a bit more flexible than the NVidia system about scaling to handle HDTV overscan.]
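On the overscan point, the scaling boils down to shrinking the desktop so a TV that crops the edges still shows the whole image; a tiny Python sketch, with the 5% crop being just an assumed typical figure:

[code]
# What "scaling to handle HDTV overscan" amounts to: underscan the desktop so a TV
# that crops ~5% of the picture on each axis still shows the whole image.
# The 5% overscan figure is an assumed typical value, not a spec number.

native_w, native_h = 1920, 1080
overscan = 0.05  # assumed fraction cropped per axis

scaled_w = round(native_w * (1 - overscan))
scaled_h = round(native_h * (1 - overscan))
print(f"Underscanned desktop: {scaled_w}x{scaled_h} inside the {native_w}x{native_h} signal")
[/code]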