
For a couple of years, I've had trouble with my homebuilt dual-monitor PC. It's set to turn the monitors off after several minutes of inactivity. The problem is, when I wake it up, about 5% of the time something goes wrong: only one monitor wakes up, or the computer bluescreens with an error in ati_something-or-other - obviously a display driver.

Recently I replaced one of the screens with a larger, significantly higher-resolution one, and now things are worse. Intermittently while using the computer, both screens will go black, and my computer will repeatedly make the "hardware removed" sound - you know, the one it makes any time you disconnect a USB device. Turning off the big monitor will allow the other one to come on, then I can sometimes turn the first one back on. Plugging in ONLY the big one seems to work fine. I've tried different cables and swapping ports. The monitor works fine on other computers, too.

UPDATE: it's gotten worse - I can now consistently run only one of the monitors at a time. When I connect both, it's always the same one that shuts off (the newer, bigger one), regardless of which monitor is attached to which DVI port. But either one connected alone works fine.

My top theories so far:

  1. The video card is bad, and gets unstable when trying to drive that many pixels at once.
  2. The power supply is screwy, and fails to provide enough power to the video card when it's trying to drive two monitors.

That second one is a wild guess. It assumes that a video card needs more power to drive more pixels / more monitors / bigger monitors. Is this true? If it were a power issue, wouldn't GPU-heavy games cause problems too? (They don't.)
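For what it's worth, here's a quick back-of-the-envelope sketch of the pixel throughput each display demands, compared against single-link DVI's 165 MHz pixel-clock ceiling. The blanking-overhead factors are rough assumptions (roughly approximating CVT reduced blanking), not measured values:

```python
# Rough pixel-clock estimate per display mode.
# The blanking factor is an assumption: ~1.11 roughly approximates
# CVT reduced blanking; traditional blanking would be closer to ~1.20.
SINGLE_LINK_DVI_MHZ = 165  # single-link DVI pixel-clock limit

def pixel_clock_mhz(width, height, refresh_hz=60, blanking=1.11):
    """Approximate pixel clock in MHz, including blanking overhead."""
    return width * height * refresh_hz * blanking / 1e6

displays = {
    "Acer B273HU (2048x1152)": (2048, 1152),
    "Planar PL2010M (1600x1200)": (1600, 1200),
}

for name, (w, h) in displays.items():
    mhz = pixel_clock_mhz(w, h)
    verdict = "within" if mhz < SINGLE_LINK_DVI_MHZ else "beyond"
    print(f"{name}: ~{mhz:.0f} MHz ({verdict} single-link DVI)")
```

By this estimate the bigger panel lands in the high 150s MHz - within spec, but close to the single-link ceiling - while the Planar sits comfortably below it. So more pixels do mean more work (higher pixel clock, more memory bandwidth, somewhat more power), though typically far less than a GPU-heavy game demands.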

Specs:

  • Motherboard: Asus M5A99FX Pro R2.0
  • GPU: Gigabyte Radeon R7 260x 2GB
  • CPU: AMD FX-6300
  • Power: Rosewill Capstone 450
  • Monitors: Acer B273HU (2048x1152), Planar PL2010M (1600x1200)

UPDATE #2

I replaced the power supply with a (higher power, more reliable) 550W EVGA SuperNova G2, and the problem persists. The only remaining option I can see is the graphics card, but there's gotta be a better way to solve this than to keep spending money until I happen to replace the right part :)

Josh
  • @JakeGould good point on providing some specs... but who ever heard of the make and model of the *display* being relevant for troubleshooting? "Oh, your monitor is a B273HU? Sorry, that one only works with NVIDIA cards." :) – Josh Sep 04 '15 at 23:48
  • **“…but who ever heard of the make and model of the display being relevant for troubleshooting?”** If I had a dime for every time I have asked for specs and someone said, “Why does that matter?” and then after providing specs the problem was pinpointed to an issue connected to something traceable from the specs I would be a fairly rich person. You are engaging in that kind of “I have a problem, I have decided that isn’t an issue…” behavior that closes you off from the obvious/simple stuff. – Giacomo1968 Sep 05 '15 at 01:20
  • FWIW, please look at [this answer to another question](http://superuser.com/a/968790/167207). Some modern displays are set by default to auto-detect which input is being used. And if you think about it, perhaps auto-detection could choke arbitrarily on some setups causing the display to think no input is selected and then… “Disconnected.” So maybe setting your displays for manual selection of input would solve this issue? Just an idea. – Giacomo1968 Sep 05 '15 at 18:41
  • @JakeGould, good thought, but neither of my displays has selectable inputs. Also, fwiw, even powering off the display doesn't trigger the "hardware disconnect" sound - the computer seems to recognize the monitor even when it's off, as long as it's powered. – Josh Sep 08 '15 at 20:45
  • check your events for anything to do with GPU (VPU?) recovery. Perhaps the disconnects are really the driver crashing and reinitializing. – Yorik Sep 08 '15 at 21:48

0 Answers