Custom Resolution Utility (CRU)
02-01-2026, 11:21 PM
Post: #9431
RE: Custom Resolution Utility (CRU)
(01-31-2026 01:03 PM)djsolidsnake86 Wrote:  I have this problem: on my Panasonic 65LZ1000, I have to disable G-SYNC to run some games or see all screen resolutions, e.g. Vanquish (with G-SYNC I can't set 4K resolution) and RE6 and RE Revelations 2 (both won't start with G-SYNC on), while on an LG C3 there's no need to disable G-SYNC, everything starts, and all resolutions are available. Why is that?
That's something you should ask NVIDIA support about.
02-01-2026, 11:22 PM
Post: #9432
RE: Custom Resolution Utility (CRU)
(01-31-2026 04:58 AM)Grape Soda Wrote:  I just started using a dual-GPU setup with an HD 7750 and a 3070 for the passthrough feature of Windows 11. I was previously using a GTX 970 for interlaced resolutions and was able to create various high combinations that I can't seem to replicate with the HD 7750. With the 970 I could run 1600x1200i at 200 Hz and 1920x1440i at 160 Hz. When I create these combinations in CRU with the AMD card, they do not show up in the Windows display adapter list. It seems like I'm hitting a cap that I don't know how to get around. I was able to get 1920x1440i 160 Hz to show up briefly, but it vanished when I made a 60 Hz interlaced version of the resolution. I applied the ToastyX patch for AMD cards to see if that would help, but I'm still having issues getting these resolutions to show up. Is there something I'm missing? I'm currently connected to the card via the VGA port. Should I use the DVI-I or HDMI instead? I would prefer to stick with the VGA or DVI port since I do not have another DAC with a high pixel clock.
I remember older AMD GPUs had weird limitations where some resolution/refresh rate combinations wouldn't appear in some cases, but I couldn't figure out any particular limitation that would explain it. It wasn't a pixel clock limit but something else. DVI-I would behave the same as the VGA port. HDMI with an adapter might behave differently. Make sure you have GPU scaling disabled, and try making the highest resolution the first detailed resolution.
02-02-2026, 04:01 AM (Last edited: 02-02-2026, 04:04 AM by Sunspark)
Post: #9433
RE: Custom Resolution Utility (CRU)
When it comes to de-interlacing video for playback, AMD is completely busted. There might be overlap here with interlaced video modes.

I think it's worth trying the HDMI port at a lower refresh rate just to see if interlacing works properly, since the HDMI specification does include interlaced timings (e.g., 1080i) because of TV broadcasting.
02-03-2026, 05:57 PM (Last edited: 02-04-2026, 05:46 AM by Grape Soda)
Post: #9434
RE: Custom Resolution Utility (CRU)
(02-01-2026 11:22 PM)ToastyX Wrote:  
(01-31-2026 04:58 AM)Grape Soda Wrote:  [...]
I remember older AMD GPUs had weird limitations where some resolution/refresh rate combinations wouldn't appear in some cases, but I couldn't figure out any particular limitation that would explain it. It wasn't a pixel clock limit but something else. DVI-I would behave the same as the VGA port. HDMI with an adapter might behave differently. Make sure you have GPU scaling disabled, and try making the highest resolution the first detailed resolution.

Thanks for getting back to me! So it sounds like I just have to live with the limitations of this GPU? With that said, what other options do I have? Is there another card powered by the PCIe slot that doesn't have these limitations?

One thing I can't understand is how the 1920x1440i 160 Hz option was available for a while until I created another custom resolution at the same resolution, which made it vanish. According to the AMD control panel, scaling is off.

Edit: Just tried the HDMI port, and it seems to have the same limitation as the VGA, so I'm guessing it's the card itself. How strange :/
02-06-2026, 10:46 PM
Post: #9435
RE: Custom Resolution Utility (CRU)
(02-03-2026 05:57 PM)Grape Soda Wrote:  Thanks for getting back to me! So it sounds like I just have to live with the limitations of this GPU? With that said, what other options do I have? Is there another card powered by the PCIe slot that doesn't have these limitations?
You could try the GTX 970 or another older NVIDIA GPU. Problem is newer NVIDIA drivers have a weird issue where interlaced resolutions won't show up unless you also create the progressive equivalent, which might be hindered by the 400 MHz pixel clock limit.
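
For a rough sense of why: the pixel clock is approximately htotal × vtotal × refresh rate, and an interlaced mode scans only half the lines per field, so it needs about half the pixel clock of the equivalent progressive mode. A back-of-the-envelope sketch in Python (the blanking values below are illustrative reduced-blanking guesses, not CRU's exact timing output):

Code:
# Rough pixel clock estimate: pclk = htotal * vtotal * refresh.
# An interlaced mode scans ~half the lines per field, so its pixel
# clock is roughly half the progressive mode's at the same rate.
def pixel_clock_mhz(hactive, vactive, refresh_hz, interlaced=False,
                    hblank=160, vblank=90):
    # hblank/vblank are illustrative reduced-blanking guesses
    htotal = hactive + hblank
    vtotal = vactive + vblank
    pclk = htotal * vtotal * refresh_hz / 1e6
    return pclk / 2 if interlaced else pclk

print(pixel_clock_mhz(1920, 1440, 160))                   # ~509 MHz: over a 400 MHz DAC
print(pixel_clock_mhz(1920, 1440, 160, interlaced=True))  # ~255 MHz: fits

So 1920x1440 @ 160 Hz progressive would blow well past a 400 MHz analog limit even though the interlaced version fits.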
02-06-2026, 11:57 PM (Last edited: 02-07-2026, 03:27 AM by Grape Soda)
Post: #9436
RE: Custom Resolution Utility (CRU)
(02-06-2026 10:46 PM)ToastyX Wrote:  
(02-03-2026 05:57 PM)Grape Soda Wrote:  [...]
You could try the GTX 970 or another older NVIDIA GPU. Problem is newer NVIDIA drivers have a weird issue where interlaced resolutions won't show up unless you also create the progressive equivalent, which might be hindered by the 400 MHz pixel clock limit.

This is what I'm currently doing with an older PC and that exact card. I don't have access to a higher-clock DAC, so I hit limitations. Also, the 400 MHz pixel clock limit doesn't affect the progressive resolutions for some reason.

Now my main PC with the 3070 and 7750 works well aside from those strange limitations with some resolutions not showing up. But now I can't get regular resolutions to show up either. I tried creating 1600x1200 @ 85 Hz in CRU, and it just won't appear. The only resolutions that show up in the adapter view are scaled ones; for example, 1280x720 was being scaled from 1920x1440 @ 120 Hz. I do not have scaling enabled in the AMD app. What could be causing this?

Edit: I uninstalled my monitor and reset my PC to get a fresh slate. I don't know why creating resolutions with CRU makes all my standard resolutions disappear afterwards. I created 1920x1440i 140 Hz, and that resolution replaced all my other resolutions in the "all modes" section with 70 Hz interlaced versions -_-

So 640x480 70 Hz interlaced, 800x600 70 Hz interlaced, and so on. What is happening here? lol
02-07-2026, 04:01 AM (Last edited: 02-07-2026, 04:03 AM by EeK)
Post: #9437
RE: Custom Resolution Utility (CRU)
Thanks again for the answers, MUC.

(01-27-2026 10:13 PM)MUC Wrote:  As far as I could determine experimentally with an RTX 30 card, whether an EDID override gets ignored depends solely on this pixel clock limit (1350 MHz). DSC itself plays no role.

Good to know. My statement was based on what ToastyX has pinned in the first post of this thread.

Quote:The raw EDID of the S90F shows that 3840 x 2160 @ 144 Hz corresponds to the CVT-RB standard (1332.75 MHz). This works at 8-bit without DSC. It's a close call. However, it's true that the TV needs DSC if you want to output 10-bit (HDR). The interface shows a maximum of FRL5 (40 Gbps). At 10-bit, the signal would be at 112% of FRL5.

You can check it here: https://tomverbeure.github.io/video_timings_calculator

3840 x 1600 @ 144 Hz is no problem. The original EDID shows 987.25 MHz. That's also CVT-RB. At 10-bit, that means 83% of FRL5.

Dang, every review of the S90F mentions that it supports the full 48 Gbps bandwidth of HDMI 2.1 on all four HDMI ports. I guess that's a lie, then?

Is there any way I can create a custom resolution of 3840 x 2160 @ 144 Hz 10-bit and have DSC always disabled?

I know some people simply uncheck the box for "DSC 1.2a" in the HDMI 2.1 support data block under the CTA-861 extension block, but considering what you said, that may not work (and I don't want to break anything, as it was already a struggle to get my current setup working :P).
02-07-2026, 08:47 PM (Last edited: 02-07-2026, 09:15 PM by MUC)
Post: #9438
RE: Custom Resolution Utility (CRU)
(02-07-2026 04:01 AM)EeK Wrote:  Dang, every review of the S90F mentions that it supports the full 48 Gbps bandwidth of HDMI 2.1 on all four HDMI ports. I guess that's a lie, then?

No, you're right. Sorry, I confused the S95F with the S90F.

The edid-test.txt file you uploaded clearly shows that the S90F does indeed support 48 Gbps FRL6.

This means: 4K @ 144 Hz RGB 10-bit (HDR) CVT-RB = 94% FRL6 data payload.
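
If you want to double-check those percentages, the accounting is simple (a minimal sketch in Python, assuming RGB without DSC and ignoring audio/packet overhead; FRL uses 16b/18b encoding, so the usable payload is 16/18 of the raw link rate):

Code:
# HDMI FRL payload utilization: video data rate vs. usable link capacity.
# FRL uses 16b/18b encoding, so payload capacity = raw rate * 16/18.
def frl_utilization(pixel_clock_mhz, bits_per_component, frl_gbps):
    video_gbps = pixel_clock_mhz * 1e6 * 3 * bits_per_component / 1e9  # RGB, no DSC
    payload_gbps = frl_gbps * 16 / 18
    return video_gbps / payload_gbps

print(f"{frl_utilization(1332.75, 10, 48):.0%}")  # 4K 144 Hz 10-bit: ~94% of FRL6 (48 Gbps)
print(f"{frl_utilization(1332.75, 10, 40):.0%}")  # same signal: ~112% of FRL5 (40 Gbps)
print(f"{frl_utilization(987.25, 10, 40):.0%}")   # 3840x1600 144 Hz 10-bit: ~83% of FRL5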

The NVIDIA driver wouldn't need to enable DSC and could establish FRL6 without it. Whether this works depends on the result of the automatic link initialization (also called "link training"). However, with a high-quality, certified Ultra High Speed HDMI copper cable no longer than 3 meters (10 ft) and no shorter than 2 meters (6.6 ft), it should work.

Cable recommendation: https://zeskit.com/products/maya%E2%84%A...9567411360

You're using a native straight HDMI > HDMI connection, right?

(02-07-2026 04:01 AM)EeK Wrote:  I know some people simply uncheck the box for "DSC 1.2a" in the HDMI 2.1 support data block under the CTA-861 extension block

Yes, if the interface is exclusively HDMI, this method should rule out DSC. Please try it.
02-09-2026, 02:02 AM
Post: #9439
RE: Custom Resolution Utility (CRU)
(02-06-2026 11:57 PM)Grape Soda Wrote:  Edit: I uninstalled my monitor and reset my PC to get a fresh slate. I don't know why creating resolutions with CRU makes all my standard resolutions disappear afterwards. I created 1920x1440i 140 Hz, and that resolution replaced all my other resolutions in the "all modes" section with 70 Hz interlaced versions -_-

So 640x480 70 Hz interlaced, 800x600 70 Hz interlaced, and so on. What is happening here? lol
That's normal. The graphics driver will automatically add some common lower resolutions as scaled resolutions. You could delete all the scaled resolutions with SRE: https://www.monitortests.com/forum/Threa...Editor-SRE
Yesterday, 06:22 AM (Last edited: Yesterday, 06:40 AM by matoduri)
Post: #9440
RE: Custom Resolution Utility (CRU)
Hi ToastyX,

Thanks for this amazing piece of software!

I've been using CRU for some time to be able to use ULMB2 with my AMD graphics card by unchecking "Include slots if available" in the range limits for my PG27AQN monitor. It was a set-and-forget thing that I only needed to redo after a GPU driver update.

This weekend I upgraded my GPU from a 6800 XT to a 9070 XT and did a full driver uninstall, but now every time I restart my PC, even though the changed profile is set active, I need to launch restart64.exe to get the settings back into effect and be able to use ULMB2 on my monitor again (the ULMB2 toggle is grayed out in my monitor's menu without it). After every restart64.exe launch, a second profile for the same monitor appears in the drop-down menu. This doesn't happen for the second monitor, for which I'm not making any changes in CRU since I don't need to.

Also, I'm not sure what the asterisk next to the monitor name means; I haven't noticed whether it was there before, or whether it might have something to do with the settings being ignored on reboot.

Any ideas what might be going on?