Custom Resolution Utility (CRU)
03-21-2019, 04:18 AM (Last edited: 03-21-2019, 04:20 AM by Twone)
Post: #3931
RE: Custom Resolution Utility (CRU)
(03-20-2019 07:06 PM)ToastyX Wrote:  
(03-19-2019 08:33 PM)Twone Wrote:  Hey! Can you tell me what the "SCDC present" feature is under HDMI 2.0 support? I just bought a Samsung NU7100 4K TV and connected it with a quality 10m cable to my PC with an RX 580. Every time I power the TV on or off, my main monitor blinks, sometimes the TV's resolution gets reset to its native 4K from the 1080p I've set it to, and sometimes it takes ages to find the signal, along with all sorts of weird problems. If I turn this feature off in CRU or delete the HDMI 2.0 block altogether, everything works normally, so I'm wondering whether it's safe to disable this and what it even is. Why is it enabled by default if it causes these problems?
SCDC is the Status and Control Data Channel. The HDMI 2.0 specification states that SCDC is required to support TMDS character rates greater than 340 Mcsc (same as the pixel clock MHz at 8 bpc), which means it's required to get 3840x2160 @ 60 Hz 4:4:4. Deleting the HDMI 2.0 data block would limit it to HDMI 1.4 speeds, which would also prevent 3840x2160 @ 60 Hz 4:4:4 from working, limiting it to either 30 Hz 4:4:4 or 60 Hz 4:2:0. When you say 10m, do you mean 10 meters? Long cables can cause signal and handshake problems at high data rates.

Yes, I mean 10 meters. I have the same problems with a shorter cable, like a 1-meter one: my GPU loses the signal to the TV for a couple of seconds whenever I power it on or off, which makes my main monitor blink and flicker until the signal stabilizes. It's fairly problematic, because with the SCDC feature enabled, the resolution gets reset to 4K whenever my GPU tries to reacquire the signal to the TV, and with the 10m cable that takes a long while, which can make my main monitor flicker for up to a minute. It does this with shorter cables too, even with an HDMI 1.4 cable I've tried; it just finds the signal quicker. However, if I disable "SCDC present" or delete the HDMI 2.0 block and switch to 1920x1080, which I mostly use anyway, the signal doesn't get dropped and the resolution doesn't change even when I power off the TV, so my main monitor doesn't blink or flicker, which is how I want it to be. I was just wondering what causes this behavior. I can't say it's the cable, as I've tried multiple. AMD driver issues perhaps?
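
As a quick sanity check of the 340 Mcsc threshold ToastyX quotes: at 8 bpc the TMDS character rate equals the pixel clock, and the pixel clock is the total pixels per frame (including blanking) times the refresh rate. A minimal Python sketch, assuming the standard CTA-861 totals for these modes:

Code:
# Sanity check of the 340 Mcsc SCDC threshold. At 8 bpc the TMDS
# character rate equals the pixel clock, so:
# pixel clock = htotal * vtotal * refresh rate.
# Totals below are the standard CTA-861 timings for these modes.
modes = [
    ("3840x2160 @ 60 Hz 4:4:4", 4400, 2250, 60, 1.0),
    ("3840x2160 @ 30 Hz 4:4:4", 4400, 2250, 30, 1.0),
    ("3840x2160 @ 60 Hz 4:2:0", 4400, 2250, 60, 0.5),  # 4:2:0 halves the TMDS rate
]
for name, htotal, vtotal, hz, factor in modes:
    mcsc = htotal * vtotal * hz * factor / 1e6
    need = "SCDC required" if mcsc > 340 else "SCDC not required"
    print(f"{name}: {mcsc:.0f} Mcsc -> {need}")

This prints 594 Mcsc for 4K 60 Hz 4:4:4 (SCDC required) and 297 Mcsc for the other two, which matches why deleting the HDMI 2.0 block still leaves 30 Hz 4:4:4 and 60 Hz 4:2:0 working.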
03-22-2019, 06:17 PM
Post: #3932
RE: Custom Resolution Utility (CRU)
(03-21-2019 04:18 AM)Twone Wrote:  Yes, I mean 10 meters. I have the same problems with a shorter cable, like a 1-meter one... I was just wondering what causes this behavior. I can't say it's the cable, as I've tried multiple. AMD driver issues perhaps?
It sounds more like a handshake problem. It's possible SCDC is not present when the TV is turned off, so the video card has to renegotiate the connection. SCDC and HDMI 2.0 are required to support HDMI 2.0 speeds. 1920x1080 doesn't require HDMI 2.0. Does anything change if you disable HDCP support for the TV in Radeon Settings?
03-23-2019, 01:13 AM (Last edited: 03-23-2019, 01:13 AM by nervous)
Post: #3933
RE: Custom Resolution Utility (CRU)
Why can't I use 1080 by 1080? I saw in a post on Reddit that I could use it.
03-23-2019, 01:24 PM
Post: #3934
RE: Custom Resolution Utility (CRU)
(03-23-2019 01:13 AM)nervous Wrote:  Why can't I use 1080 by 1080? I saw in a post on Reddit that I could use it.
You can.
03-23-2019, 02:29 PM (Last edited: 03-23-2019, 02:58 PM by Amazo)
Post: #3935
RE: Custom Resolution Utility (CRU)
Hi ToastyX

I have this TV:
https://www.rtings.com/tv/reviews/sony/x900f

According to this review, it supports 2560x1440@120Hz but I cannot manage to enable it.

I can run 3840x2160 @ 60 Hz and 1920x1080 @ 120 Hz 10-bit fine, but when I select 2560x1440 in Windows settings, it tells me "Desktop resolution 2560x1440, active signal 3840x2160 @ 60 Hz 8-bit with dithering".

Any tips? My card is an AMD RX Vega 64.

[Image: Untitled.jpg]
03-23-2019, 05:08 PM
Post: #3936
RE: Custom Resolution Utility (CRU)
(03-23-2019 02:29 PM)Amazo Wrote:  I have this TV:
https://www.rtings.com/tv/reviews/sony/x900f

According to this review, it supports 2560x1440@120Hz but I cannot manage to enable it.

I can run 3840x2160 @ 60 Hz and 1920x1080 @ 120 Hz 10-bit fine, but when I select 2560x1440 in Windows settings, it tells me "Desktop resolution 2560x1440, active signal 3840x2160 @ 60 Hz 8-bit with dithering".
You need HDMI 2.0 to get 3840x2160 @ 60 Hz 4:4:4 and 2560x1440 @ 120 Hz, but I don't see that in your screenshot. According to the article, "4k @ 60 Hz @ 4:4:4 or 4:2:2 is only possible on HDMI inputs 2 and 3, and only when 'HDMI Enhanced Format' is enabled."
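
For a rough sense of the bandwidth involved, here is a sketch comparing both modes against the HDMI 1.4 ceiling. The 2560x1440 totals are CVT reduced-blanking estimates, not necessarily what the TV actually accepts (CRU shows the exact totals); the 4K totals are the standard CTA-861 ones:

Code:
# Rough pixel-clock estimates. Both modes exceed the ~340 MHz TMDS
# limit of HDMI 1.4, which is why HDMI 2.0 must be active.
HDMI_14_LIMIT = 340  # MHz, max TMDS rate without HDMI 2.0/SCDC
modes = [
    ("2560x1440 @ 120 Hz", 2720, 1525, 120),  # CVT-RB estimate
    ("3840x2160 @ 60 Hz",  4400, 2250, 60),   # CTA-861 totals
]
for name, htotal, vtotal, hz in modes:
    mhz = htotal * vtotal * hz / 1e6
    link = "needs HDMI 2.0" if mhz > HDMI_14_LIMIT else "fits in HDMI 1.4"
    print(f"{name}: {mhz:.2f} MHz -> {link}")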
03-27-2019, 08:04 PM (Last edited: 03-28-2019, 02:29 PM by solewalker)
Post: #3937
RE: Custom Resolution Utility (CRU)
(03-06-2019 08:14 PM)Deusesque Wrote:  
(03-06-2019 04:49 PM)ToastyX Wrote:  The output is a stream of pixels sent at a fixed rate. The pixel clock is the number of pixels sent per second including blanking. FreeSync is like increasing the vertical blanking in CRU without changing the pixel clock. It always sends at the same rate as 75 Hz and varies the blanking to delay the next refresh. The monitor probably has a limit for how long the blanking period can be. Static 37 Hz doesn't have a delay beyond normal blanking. It just sends at a slower rate.

Appreciate the very clear explanation.
Your explanation got me thinking: what if I minimize the standard blanking so that the FreeSync-induced extra blanking falls JUST within the range of my monitor?

And it worked! It took a lot of tweaking of basically all the settings (front/back porch and sync, both horizontal and vertical), and now my monitor displays 37-75 Hz perfectly and I have full LFC! I only get very rare tearing if the FPS is between 37 and 38 (I suspect the actual limit falls at a decimal value in that range), but that is hardly noticeable in practice.

So thanks a lot for your help Toasty!

For others having my monitor (LG 29UM59-P), please find below the settings that worked for me.

Man, I have been having a similar problem for so long; your settings finally worked on my monitor too. I have an LG 22MP68VQ, and with your settings I am running 37-76 Hz, but I have to ask: which Radeon driver version are you using?
Previously I was using my own CRU settings of 37-74 Hz with 74 Hz set to "LCD reduced", but for some reason I got severe stuttering whenever my FPS dropped into the 32-37 range with any driver released after 18.6.1, so I am still using the 18.6.1 driver to this day. If your settings work even on newer drivers, then I would finally upgrade. Cheers!

Edit: I updated to 19.3.3 to give it a shot, and to my surprise it works even better than 18.6.1 with your settings, running 37-75 Hz with no issues so far.
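
To illustrate the blanking trick described above (the numbers here are illustrative assumptions, not the exact 29UM59-P timings): with the pixel clock fixed at the 75 Hz rate, FreeSync reaches lower refresh rates by stretching the vertical total, so shrinking the static blanking leaves more headroom for the stretch.

Code:
# Illustrative sketch: FreeSync keeps the pixel clock at the max-refresh
# rate and lowers the refresh by extending the vertical total.
htotal, vtotal_75, hz_max = 2720, 1111, 75      # assumed example totals
pclk = htotal * vtotal_75 * hz_max              # pixel clock stays fixed
vtotal_37 = pclk / (htotal * 37)                # total needed to hit 37 Hz
print(f"vertical total stretched from {vtotal_75} to {vtotal_37:.0f} lines")
print(f"LFC possible (max >= 2x min): {hz_max >= 2 * 37}")  # 75 >= 74 -> True

The last line shows why hitting a 37 Hz minimum matters: LFC needs the maximum refresh to be at least double the minimum, and 75/37 just clears that bar.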
03-27-2019, 10:52 PM
Post: #3938
RE: Custom Resolution Utility (CRU)
Hi guys,
I own an HTPC with an NVIDIA RTX 2080 GPU and Windows 10. My display is a projector (BenQ X12000H).
I use it with madVR, and I don't get a perfect refresh rate with my films.

So I've used your tool. In the image below you can see the refresh rate I get with the standard NVIDIA resolution and the one created with CRU...

[Image: cresolution.jpg]

My situation is better now (23.977 Hz) but still not exactly right.

These are the settings used in the CRU tool...

[Image: prg.jpg]

So what do I have to set to get a perfect 23.976 Hz?

I've also tried the custom resolution option in the NVIDIA Control Panel...

[Image: npl.jpg]

Anyway, even with the NVIDIA custom resolution, madVR continues to tell me that my display is at 23.977 Hz...

Any help would be appreciated.
03-29-2019, 09:02 PM
Post: #3939
RE: Custom Resolution Utility (CRU)
(03-27-2019 10:52 PM)actarusfleed Wrote:  So what do I have to set to get a perfect 23.976 Hz?
Hardware clocks are not guaranteed to be perfect. You would have to figure out how much the hardware clock is off by and adjust the refresh rate to make up the difference. You can adjust the timing parameters to nudge the refresh rate slightly lower. This calculates the totals needed to get a specific refresh rate: https://www.monitortests.com/pixelclock....=24&rate=1
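
For illustration, here is a rough sketch of the kind of search the linked calculator performs: find horizontal/vertical totals whose product lands on 24/1.001 Hz at an achievable pixel clock. The 10 kHz clock granularity is an assumption for the sketch; real hardware clock steps (and their accuracy, as noted above) vary.

Code:
# Sketch: search totals that hit 24/1.001 Hz at a rounded pixel clock.
# The 10 kHz pixel clock granularity here is an assumption.
target = 24000 / 1001                       # 23.976023... Hz
hactive, vactive = 1920, 1080
best = None
for htotal in range(hactive + 8, hactive + 1000, 8):
    for vtotal in range(vactive + 1, vactive + 200):
        pclk = round(htotal * vtotal * target / 10000) * 10000
        err = abs(pclk / (htotal * vtotal) - target)
        if best is None or err < best[0]:
            best = (err, htotal, vtotal, pclk)
err, htotal, vtotal, pclk = best
print(f"{htotal}x{vtotal} @ {pclk / 1e6:.2f} MHz "
      f"-> {pclk / (htotal * vtotal):.6f} Hz")

To correct for a hardware clock that runs fast or slow, you would measure the actual rate (madVR reports it) and bias the target accordingly before recomputing the totals.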
03-30-2019, 07:28 AM
Post: #3940
RE: Custom Resolution Utility (CRU)
(03-14-2019 12:15 AM)ToastyX Wrote:  
(03-13-2019 08:41 PM)Beagle Wrote:  I have a problem that has led me to this thread. I've just upgraded all my hardware except for my old 30" Apple Cinema Display, which has a native resolution of 2560x1600 and which I've been running at 1920x1200 for the past couple of years. For reasons that remain a mystery to me, the new system is using a resolution of 1280x800 and I can't seem to get it any higher.

I'm connecting the display via a DVI (dual link) to DisplayPort adapter to my RTX2080, and I've updated my drivers for the card which I initially thought would solve the problem.

I managed to set up a new profile in CRU, pressed OK, and ran restart.exe. I went to display settings and selected the new resolution; the display goes black for a few seconds, during which the power button on the display flashes intermittently as if it is working, but when the picture reappears it defaults back to 1280x800.
It sounds like you have a single-link DVI adapter, not dual-link. A dual-link DVI adapter would have a USB connection for power like this: https://www.amazon.com/dp/B00856WJH8/?tag=mtests-20#ad

The 30" Apple also doesn't have a scaler, so it only supports 2560x1600 and 1280x800 natively. The graphics driver automatically adds some common lower resolutions as scaled resolutions, which is how you had 1920x1200 before, but without 2560x1600 available, it only adds resolutions lower than 1280x800. When you add 1920x1200 with CRU, you're adding a non-scaled resolution that's sent to the monitor, but the monitor doesn't support 1920x1200, so you get a black screen. You need to get a proper dual-link DVI adapter. Then 1920x1200 will probably be available automatically without needing to use CRU.

This solution worked great for me and I never would have found it without you. Thanks for your help. My donation is scheduled for the beginning of April. Keep up the great work.
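
To put numbers on the link-bandwidth point: single-link DVI tops out at a 165 MHz pixel clock, which is why the native mode needs dual-link while 1280x800 works either way. A sketch using CVT reduced-blanking totals as approximations (the Apple panel's actual timings differ slightly):

Code:
# Why the 30" display needs dual-link DVI: single-link is limited to
# a 165 MHz pixel clock, and 2560x1600 @ 60 Hz is well past that.
modes = [
    ("2560x1600 @ 60 Hz (CVT-RB)", 2720, 1646, 60),
    ("1280x800 @ 60 Hz (CVT-RB)",  1440,  823, 60),
]
for name, htotal, vtotal, hz in modes:
    mhz = htotal * vtotal * hz / 1e6
    verdict = "dual-link required" if mhz > 165 else "single-link OK"
    print(f"{name}: {mhz:.1f} MHz -> {verdict}")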