Monitor Tests Forum

Full Version: DDC Handshake for Nvidia 3D Vision (not lightboost) on 2233RZ
Ah I see, I hadn't heard of those before.

And the increase of 5 in vertical total that seems to be essential for your Lightboost utility - did you stumble upon that number by trial and error? Or did you have some sort of systematic way to figure out what timings were being used?

I played around with NVAPI for a while, but I could not find any implemented function that retrieves the current custom timings in use. There seems to be a stub for it (using NVAPI_DISP_GetTimings with certain arguments), but the function hasn't been implemented yet... Bummer.

For this 3D glasses thing I am building a small board with a DVI deserializer on it. The goal is to use it to decode the VSYNC signal straight from the DVI cable and hook glasses up to that. The only way I can think of to figure out these timings is to hook up this same board and probe the HSYNC and VSYNC signals. Tallying them, it should at least be easy to figure out the vertical total (the number of HSYNCs per VSYNC). Then, to get clues about the rest, I would need to somehow measure the pixel clock - maybe with a high-speed clock divider chip.
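The arithmetic behind that tallying approach can be sketched as follows. This is my own illustration (function names are made up, not from any real tool), assuming you can count HSYNC pulses per frame and measure the sync frequencies externally; the example numbers are the 2233RZ 3D-mode totals that turn up later in this thread.

```python
# Sketch: recovering timing totals from sync-line measurements.

def vertical_total(hsyncs_per_frame):
    """The vertical total is simply the number of HSYNC pulses per VSYNC."""
    return hsyncs_per_frame

def horizontal_total(pixel_clock_hz, hsync_freq_hz):
    """Pixels per scanline = pixel clock / horizontal sync frequency."""
    return round(pixel_clock_hz / hsync_freq_hz)

# With a 329.8 MHz pixel clock and 1780x1544 totals, the line rate is
# ~185.3 kHz and the frame rate ~120 Hz:
hfreq = 329_800_000 / 1780
print(horizontal_total(329_800_000, hfreq))  # 1780
print(vertical_total(1544))                  # 1544
print(round(hfreq / 1544, 2))                # ~120 Hz
```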

Or just try out every single modeline in existence Big Grin


Anyway, before doing anything like that I will try sniffing the DDC line with a logic analyzer.
(08-06-2014 08:58 AM)NarcoticV Wrote: And the increase of 5 in vertical total that seems to be essential for your Lightboost utility - did you stumble upon that number by trial and error? Or did you have some sort of systematic way to figure out what timings were being used?
Someone else already figured it out, and the NVIDIA control panel will show the current timing parameters in the custom resolution dialog if it's opened with the timing standard set to Automatic.
That's interesting - the custom resolution window does indeed show the current timings. However, for me the results don't change when I turn 3D mode on or off. I tried this at two different resolutions - in both cases the screen was clearly in 3D mode, but the resolution window showed the same timings (after refreshing) as before.

It could be that NVidia patched this so the real timings are no longer shown, or the display is put into 3D mode purely through DDC.



--Edit--
I've finally hooked up a logic analyzer to the DDC line. But, I haven't been able to find the handshake between card and monitor.
Switching between 3D mode and normal mode doesn't seem to require any DDC communication. I only detect traffic on the DDC lines when I connect or disconnect the monitor. Of course, it could be that the handshake happens at connect time.

It also means there is no DDC command that enables/disables stereo mode directly, since I don't see any communication when it happens. It must be either the timings or something else. If it's the timings, then the NVidia custom resolution screen is "lying" to me in 3D mode.

In your experience, when does the handshake get initiated by the card? When you enable stereo mode for the first time, or when the monitor is connected?
Do you remember on which driver version the custom resolution window trick worked for reading the timings of LightBoost?
Are you capturing the correct address? Make sure you are capturing 0x37 (0x6E/0x6F) for DDC communications and not 0x50 (0xA0/0xA1) for the EDID. The handshake should happen any time the driver tries to enable stereoscopic 3D.
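The two notations in that hint are the same addresses seen from different sides: a 7-bit I2C address appears on the wire shifted left one bit, with the read/write flag in bit 0, which is why 0x37 shows up as 0x6E/0x6F in a raw logic-analyzer capture. A quick illustration (my own helper, not from any tool):

```python
# Convert a 7-bit I2C address to the 8-bit bytes seen on the wire.
def wire_bytes(addr7):
    write = (addr7 << 1) | 0  # R/W bit clear = write
    read = (addr7 << 1) | 1   # R/W bit set = read
    return write, read

print([hex(b) for b in wire_bytes(0x37)])  # DDC/CI -> ['0x6e', '0x6f']
print([hex(b) for b in wire_bytes(0x50)])  # EDID   -> ['0xa0', '0xa1']
```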

I don't remember what driver I was using to get the current timing parameters. It might have been 314.22 or 320.49. I don't think that's changed in the newer drivers.
I was just capturing any activity at all on the DDC lines, without looking at the contents yet - and there was no activity on the line when I enabled 3D mode after plugging in the monitor for the first time. The only time anything happens is when the monitor is plugged in.

I checked that the capturing worked by sending some DDC commands manually.

I'll definitely try this again a few times to double-check. And I'll check at plug-in time specifically for the address you mentioned.

About the driver version - I'm actually stuck with older drivers (301.42 now) because I need the "3d Vision Emulator" that fools the driver into thinking a 3D emitter is there. The emulator doesn't work with later versions as far as I'm aware. I'm guessing you have an actual emitter to work with? In any case, the driver gives me the same timings in the custom resolution screen regardless of whether 3D is on or off.

There are still some tricks up my sleeve for finding those timings straight from the cable - I'm sure it'll all be made clear eventually. Thanks again for all the useful tips and quick replies!
Well, I'm glad to say it's finally been tackled! Although I do feel a little bit stupid now...

Two years ago I tried to use Powerstrip to find the special timings needed for 3D mode on the 2233RZ. It gave me wrong information back.

Now I've updated to the latest Powerstrip and NVidia drivers and tried again during stereoscopic mode - and the correct timings for 3D mode rolled straight out. No DDC stuff is needed at all to make this work on a non-NVidia card, even after re-powering the monitor.

The mode is:

HORIZONTAL
Active: 1680
Front Porch: 20
Sync Width: 20
Back Porch: 60
Blanking: 100
Total: 1780

VERTICAL
Active: 1050
Front Porch: 3
Sync Width: 6
Back Porch: 485
Blanking: 494
Total: 1544

Both sync polarities positive.
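As a sanity check (my own arithmetic, not from Powerstrip), the totals follow from summing active + front porch + sync width + back porch, and the implied pixel clock at 120 Hz comes out just under 330 MHz:

```python
# Pixel clock = horizontal total x vertical total x refresh rate.
h_total = 1680 + 20 + 20 + 60   # active + front porch + sync + back porch
v_total = 1050 + 3 + 6 + 485
pclk_hz = h_total * v_total * 120

print(h_total, v_total)   # 1780 1544
print(pclk_hz / 1e6)      # ~329.8 MHz
```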


I tried this mode at 120Hz and 119Hz refresh rates, both work.

It would have been hell to find out these exact timings by measuring VSYNC and HSYNC off the cable... Thanks Powerstrip! Big Grin

And thanks very much ToastyX for all the advice. Secretly I was kind of disappointed no DDC sniffing, encryption cracking and ADL coding was actually needed. Sleepy

Now I'll start working on the DVI VSync extractor board. If anyone reading this is interested, here's a link to the Hackaday project page:

Hackaday: DVI Sync Extractor


In hindsight it kind of makes sense that a huge vertical back porch was needed. The Lightboost monitors only need a tiny amount of time to flash the screen to one eye of the user - but for the non-Lightboost version, to avoid ghosting, there needs to be a long stable image time where the shutterglass can be opened...
Would you have the Linux mode line for the 3D mode?

I managed to make nvidia 3dvision kit1 work in Linux using
http://users.csc.calpoly.edu/~zwood/teac...1/rsomers/

However in the default 120Hz mode there is a lot of ghosting in the upper part of the image.

I translated your captured readings into a modeline for xorg.conf and it works. The ghosting is gone and I have a clear 3D pulsar coming out of the screen. (demo from the tutorial)

However, the monitor displays an on-screen message saying "not optimal mode" and turns itself off after a few minutes. There is no way to switch this warning off using the buttons on the side or the menu options.

Would you have the modeline from PowerStrip? Maybe I converted the readings wrong?
Modeline "1680x1050_120" 238.460 1680 1728 1760 1820 1050 1053 1059 1544 +hsync +vsync
You have the wrong horizontal parameters and pixel clock. You added some weird 85 Hz mode.

The modeline should be:
Modeline "1680x1050_120" 329.80 1680 1700 1720 1780 1050 1053 1059 1544 +hsync +vsync
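For anyone converting Powerstrip-style readings themselves, a modeline's four numbers per axis are cumulative boundaries: active, active + front porch, + sync width, and + back porch = total. A sketch of the mapping (my own illustration):

```python
# Map porch/sync widths to the four cumulative modeline fields.
def modeline_fields(active, front_porch, sync_width, back_porch):
    sync_start = active + front_porch
    sync_end = sync_start + sync_width
    total = sync_end + back_porch
    return active, sync_start, sync_end, total

print(modeline_fields(1680, 20, 20, 60))  # (1680, 1700, 1720, 1780)
print(modeline_fields(1050, 3, 6, 485))   # (1050, 1053, 1059, 1544)
```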
(09-02-2014 06:44 PM)ToastyX Wrote: You have the wrong horizontal parameters and pixel clock. You added some weird 85 Hz mode.

The modeline should be:
Modeline "1680x1050_120" 329.80 1680 1700 1720 1780 1050 1053 1059 1544 +hsync +vsync

Thank you very much for replying. It's working properly now! But I had to still tweak the last value.

I applied the above line to my xorg.conf but had ghosting in the upper 1/4 of the image again... So in desperation I started incrementing the last number until I found a mode that has no ghosting and doesn't switch the monitor off.

So the correct Linux modeline is:
Modeline "1680x1050_120" 329.80 1680 1700 1720 1780 1050 1053 1059 1700 +hsync +vsync

Not sure how, but it works. So I'm very happy. Thank you guys again.
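For what it's worth, a bit of my own arithmetic on why the tweak might help: keeping the 329.80 MHz pixel clock while stretching the vertical total from 1544 to 1700 lowers the actual refresh rate to about 109 Hz, which may be what stops the "not optimal mode" complaint (an assumption on my part, not confirmed by the monitor's documentation):

```python
# Effective refresh rate = pixel clock / (h_total * v_total).
pclk_hz = 329_800_000
print(round(pclk_hz / (1780 * 1544), 2))  # ~120.0 Hz (original modeline)
print(round(pclk_hz / (1780 * 1700), 2))  # ~108.99 Hz (tweaked vertical total)
```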