HDCP compliant monitor?

So.... is this still a thing?
I'm looking to buy a monitor (in the end, two) for a new rig that I'm going to be building. But because of my immediate-gratification complex, I was looking around and found a monitor up the road at my local Staples for a not-bad price that I could put to use in the meantime as a second monitor for a laptop setup. However, I was reading the specs, and it says it's not HDCP compliant. It occurs to me I haven't heard anything about that in years.

Is that still a thing? Any insight on this? What am I losing (if anything) picking up some non-compliant monitors these days?
 

figmentPez

Staff member
Yes, it's still a thing. I'm not sure what all you'll have trouble with, but if you watch streaming services like Netflix or Amazon Prime on your computer, they won't play back in HD on that monitor, and possibly won't play at all as long as you have any non-HDCP monitor connected.
 
well that's annoying.
I don't know that I'll be doing any streaming or such with that rig but I don't want to limit myself anyhow.
Guess I either drop a few more dollars or convince myself that immediate gratification isn't worth it.
Thanks
 
Yes. Until such time as DRM goes away (don't hold your breath!), HDCP compatibility will be required if you want to play any hi-def content; otherwise your playback will be "dumbed down" to 480p...IF it plays at all.

Really, all that HDCP certification means is that a device can identify itself as an "official" member of a High-bandwidth Digital Content Protection chain, which basically means its manufacturer promises Intel* that the device will not allow the unencrypted audio/video stream to be diverted to any device that is not part of an "approved" HDCP chain...i.e., to be recorded or rebroadcast in the clear.

--Patrick
*HDCP was developed by Intel.
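The "every link in the chain must be approved" idea above can be sketched as a toy model. This is purely illustrative Python, not the actual HDCP handshake (which involves key exchange and per-link encryption); the device names and resolutions are assumptions for the example:

```python
# Toy model of an HDCP-style chain check: the source only sends the
# full-quality protected stream if EVERY device between it and the
# screen claims compliance; otherwise it downgrades (or refuses).
# Not the real protocol -- just the gist of the chain-of-trust idea.

def max_playback_resolution(chain):
    """Return the resolution a protected stream would play at for
    a given list of devices in the signal path."""
    if all(device["hdcp"] for device in chain):
        return "1080p"  # full-quality protected stream
    return "480p"       # "dumbed down" fallback, if it plays at all

chain = [
    {"name": "GPU",           "hdcp": True},
    {"name": "cable/adapter", "hdcp": True},
    {"name": "monitor",       "hdcp": False},  # e.g. the non-compliant Staples monitor
]
print(max_playback_resolution(chain))  # -> 480p
```

One non-compliant link (monitor, cable, adapter, splitter) is enough to break the whole chain, which is why the advice below is that *all* your parts need the sticker.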
 
oh man, don't even get me started on this, as someone who watches non-pirated media on my computer. do you know how hard it is to find a DisplayPort 1.2a 4K-compliant cable!? ask @PatrThom about the many nights I consulted with him trying to figure it out. I used no fewer than 6 computer/software professionals to find a cable that could do what was needed.

the short answer is: if you are going to watch legal streams/copies of media, all your parts should have the HDCP-compliant sticker.
 
Or just crack everything because fuck drm.
 

GasBandit

Staff member
It's still very much an issue in professional A/V. Mostly when people try to use MacBooks as a signal source into a conference room's presentation system.
 
Should be less of an issue now that Apple revised their HDMI adapter in August to support 4K60.
...except that the new and old ones still look the same (aside from a teeny-tiny model number, that is), so you're going to get some "Why won't mine do that?" people who are still using the older 4K30 adapter.

--Patrick
 
and now you have the assholes with the 4K 120Hz-plus monitors, and it's all BUT IT DOESN'T WORK RIGHT! dog, no one has a standard for that. and my ire for 4K UHD Blu-ray on PC is something legendary. AFAIK no video card on the market has support for Intel SGX protection. When I decide I want to start collecting/streaming real 4K video, I'll have to build or buy something just for that!

To be dead honest, I may just go to ripping to MKV when I start buying 4K stuff.
 