
DVI Not Working


A sufficient PSU is also required; the Nvidia spec for this card lists "Minimum Recommended System Power (W): 500 W". But in theory you should never get a black screen either. The most likely cause is that the video card has decided not to use the standard, ancient 640 x 480 mode when it powers up.

The monitor returns different sets of EDID data depending on whether it is to display in digital or analog mode. This document describes the analog/digital confusion issue with DVI-I monitors: http://www.playtool.com/pages/dvitrouble/dvitrouble.html

DVI Not Detected

If Windows is visible then it's probably not a hardware problem, because the old, slow VGA mode puts far less strain on the DVI transmitter, cable, and receiver than any mode Windows normally uses. Solutions: lower the refresh rate - Lowering the screen refresh rate lowers the DVI pixel clock, which makes it much easier to transmit the digital image through the cable (see the sketch below). In theory it should never happen. This is a dodge rather than a solution, but it will allow you to keep using the computer while you try to figure out the problem.
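To make the refresh-rate point concrete, here is a minimal Python sketch (not from the original article) that estimates the pixel clock from the total timing. The 2160 x 1250 totals are the standard VESA timing for 1600x1200 @ 60 Hz; lowering the refresh rate scales the clock down linearly.

```python
# Pixel clock = horizontal total x vertical total x refresh rate.
# Totals below are the VESA DMT timing for 1600x1200 (2160 x 1250 total).

def pixel_clock_mhz(h_total: int, v_total: int, refresh_hz: float) -> float:
    return h_total * v_total * refresh_hz / 1e6

print(pixel_clock_mhz(2160, 1250, 60))  # 162.0 MHz - right at the single-link limit
print(pixel_clock_mhz(2160, 1250, 50))  # 135.0 MHz - much easier on a marginal cable
```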

If Windows is giving you a blank screen then you can try to boot in VGA mode and then set a new screen mode. DVI-I cables can cause more problems than DVI-D cables, so make sure which one you're running.

I am considering an RMA, but it is a last resort. It can be solved by adding a lower-valued resistor in parallel with the improperly-selected pullup resistor (a quick worked example follows below). Whichever monitor is plugged into HDMI works fine, but whatever is plugged into DVI will not work until I physically unplug the cable and plug it back in once the machine has booted.
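For reference, the parallel-resistor trick works because two resistors in parallel always combine to a value lower than either one. A quick sketch with purely illustrative values; the actual pullup values depend on the monitor's DDC circuitry.

```python
# Effective resistance of two resistors in parallel: R1*R2 / (R1 + R2).

def parallel_ohms(r1: float, r2: float) -> float:
    return r1 * r2 / (r1 + r2)

# e.g. paralleling a hypothetical 10 kOhm pullup with 4.7 kOhm gives ~3.2 kOhm,
# pulling the line harder toward the supply rail:
print(round(parallel_ohms(10_000, 4_700)))  # 3197
```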

If you have a dual link video card, a dual link monitor, and a single link cable, then the single link video modes will work properly and the dual link modes will not (the sketch below shows the 165 MHz rule of thumb). Use the monitor's on-screen display to manually select analog or digital - This is another way to force the monitor to always return an analog or digital EDID consistently. DVI tends to work perfectly until the signal quality drops below a certain threshold; once it goes below that level the image quality deteriorates rapidly and you end up with a corrupted screen. There's only one set of wires in the DVI-I connector for both its analog and digital sections, so the monitor has to figure out whether to return the analog or digital version of its EDID data.
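As a rule of thumb, a mode needs the second TMDS link (dual link) once its pixel clock exceeds the 165 MHz single-link ceiling. A small sketch, using the commonly quoted clocks for two well-known modes:

```python
# Single-link DVI carries pixel clocks up to 165 MHz; beyond that the
# dual-link wire pairs in the cable must be present and connected.

SINGLE_LINK_LIMIT_MHZ = 165.0

def needs_dual_link(pixel_clock_mhz: float) -> bool:
    return pixel_clock_mhz > SINGLE_LINK_LIMIT_MHZ

print(needs_dual_link(154.0))  # False: 1920x1200@60 reduced blanking fits one link
print(needs_dual_link(268.5))  # True: 2560x1600@60 needs a dual-link cable
```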

DVI Cable Not Detected

For that matter, I can't get the on-screen menu to display before it is done detecting. Use reduced blanking - This reduces the pixel clock, which may fix the corruption. I have a two-card setup here (one is a 560 with dual DVI) and I can confirm that Windows can't handle more than one monitor per card's DVI.

It might be prudent to first verify this point with Nvidia Support. I have not tried replacing the DVI cable, nor do I have a second DVI-enabled display to test with. As seems to so often be the case with electronics, we will never know. If you have a monitor which can display in both digital and analog mode, then it may have a VGA connector for analog mode and a DVI-D connector for digital mode.

  1. All help appreciated!
  2. Read your video card's manual, maybe?
  3. Solutions: install a monitor .INF file - All modern monitors can provide EDID data to the video card, which describes the capabilities of the monitor (you can inspect this data yourself; see the sketch after this list).
  4. Before deciding it is a hardware problem, you should search for the…
  5. Some DVI-I monitors support an "auto" option which allows the monitor to automatically decide whether to be analog or digital.
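To see what the monitor is actually reporting, you can read the raw EDID block yourself. Below is a minimal Python sketch; the sysfs path is hypothetical and varies by system, but the 8-byte header signature, the manufacturer-ID encoding in bytes 8-9, and the digital-input flag in bit 7 of byte 20 are part of the EDID standard.

```python
# Minimal EDID inspection: confirms the header, decodes the manufacturer ID,
# and reports whether the monitor claims a digital or analog input.

EDID_HEADER = bytes([0x00, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0x00])

def describe_edid(edid: bytes) -> str:
    if len(edid) < 128 or edid[:8] != EDID_HEADER:
        return "not a valid EDID block"
    # Bytes 8-9 pack three 5-bit letters (1 = 'A'), big-endian.
    raw = (edid[8] << 8) | edid[9]
    mfg = "".join(chr(((raw >> s) & 0x1F) + ord("A") - 1) for s in (10, 5, 0))
    # Byte 20 is the Video Input Definition; bit 7 set means digital input.
    mode = "digital" if edid[20] & 0x80 else "analog"
    return f"manufacturer {mfg}, {mode} input"

# Example (Linux; the exact connector name under /sys varies per machine):
with open("/sys/class/drm/card0-DVI-I-1/edid", "rb") as f:
    print(describe_edid(f.read()))
```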

Go analog - If you're running in digital mode and you've never gotten the monitor to work at all, then try analog mode. The display drivers aren't loaded until later in the boot process, so you can rule out any driver problems. I have reviewed my BIOS settings and I don't see anything in there that would seem related to my graphics adapter output mode detection (which would stand to reason, as it…

Check your DVI cable type - Check your cable to make sure you don't have a monitor accepting digital only while using an analog-only cable, or vice versa. Stop using a DVI-I cable - Sometimes switching to a DVI-D cable can convince the monitor to return only digital EDIDs.


Take a look and make sure you're using the right kind. This prevents the video card from reading the capabilities of the monitor. And yet, it does. FTR, the monitor is a 19" Suyama-branded one (not exactly well known) but it does work fine nonetheless.

In some monitors, all that is required to force the monitor to return the digital EDID is to use a DVI-D cable. When I use the DVI-to-VGA converter everything works, but when I use DVI-to-DVI is when I have problems.

LCD monitors usually support reduced blanking for modes which require pixel clocks anywhere close to the 165 MHz limit (worked through in the sketch below). Way too many people are having the same type of issue for them all to be monitor problems. For those of you who know some electronics, I'll explain why this happens.
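Here is why reduced blanking helps, worked through with the standard CVT timings for 1920x1200 @ 60 Hz: the visible pixels are identical, but the smaller blanking totals pull the pixel clock back under the 165 MHz single-link limit. (The nominal VESA figures are 193.25 MHz and 154.0 MHz; the sketch rounds slightly by using exactly 60 Hz.)

```python
# Same visible resolution, different blanking totals, very different clocks.

def pclk_mhz(h_total: int, v_total: int, refresh_hz: float) -> float:
    return h_total * v_total * refresh_hz / 1e6

print(pclk_mhz(2592, 1245, 60))  # CVT normal blanking: ~193.6 MHz - over the limit
print(pclk_mhz(2080, 1235, 60))  # CVT reduced blanking: ~154.1 MHz - fits single link
```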

Symptoms: BIOS and powerup screens are visible but the monitor does not display in Windows - This means that the Windows display driver is setting a screen mode that the monitor can't display. You're right ... it works with a VGA cable, but not a DVI cable. Telling your monitor to work as digital-only and then using a DVI-A or VGA cable is a sure way to a blank screen.

Did you try fiddling with the buttons at the front of your monitor? Stop using a DVI-I cable - A DVI-I cable used with a DVI-I video card can cause drivers to have a hard time selecting between analog and digital screen modes.

Flash the video BIOS - There may be a compatibility problem with the combination of video card and monitor. This page gives detailed instructions on how to uninstall your display driver, and this page tells you how to get a new display driver and install it. It's easy to think you're using a DVI-D cable when in fact you're using a DVI-I, or vice versa.

Go analog - This is a crummy solution, but it often works. Most LCDs default to a screen refresh rate of 60 Hz, which is the slowest rate supported in digital mode, so they're already running as slowly as possible.