View Full Version : Dell Monitor / Radeon Card

02-04-2007, 02:59 PM
I recently replaced one of my dual monitors with a Dell 2407WFP. I have a Radeon 9200 graphics card and I thought I'd just add the new monitor and see what happens. It works fine most of the time, but when I do work with video editing (Adobe Premiere Pro 2.0) and encoding (e.g. Encore DVD), the problem starts.

A coloured static noise starts appearing on the Dell monitor. Over time it gradually gets worse until the whole screen is covered in noise. Eventually the monitor goes into power save mode. The other monitor is unaffected.

I'm picking that I need a better graphics card but I'd like to be confident about the cause before I spend my bucks. I've Googled but can't find anyone else with the same problem.

If I do need a new card, any recommendations? My main use is editing HDV video. I'm told that I should go with a PCI Express card which I believe means a new motherboard as well (The Radeon is AGP). I know stuff all about Mobos and graphics cards so I'm really in the dark here.

Also, I'd like to add a third monitor in the future. If I'm going to spend money now I might as well allow for the third monitor if possible (I already have a spare 17" which would do).

Here's my setup. I hope I've got it all right:

Windows XP Home SP2
Foxconn 865A01 Mobo
Intel Dual-Core Pentium 4 3.00GHz
Radeon 9200 Graphics Card
Realtek AC'97 Audio
3 Internal IDE HDs, 3 external USB HDs
Dell 2407WFP 24" (1920x1200)
Philips 190S 19" (1280x1024)

Edit: There's a photo of my PC innards here (http://www.dave.co.nz/mobo.jpg) if that helps.

02-04-2007, 05:27 PM
Maybe a faulty output on the card. Try swapping the monitors between outputs to see what happens!

Pete O'Neil
02-04-2007, 08:52 PM
Doesn't the 2407WFP require dual link DVI?

02-04-2007, 09:53 PM
Can't be, as it has a VGA input!

03-04-2007, 11:29 AM
Thanks guys. I have tried swapping things around a bit and changing some settings. There doesn't seem to be any physical fault as far as I can see. What I did discover is that the problem goes away if I use 16-bit colour mode instead of 32-bit.
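One possible explanation for the 16-bit fix (an assumption on my part, not something confirmed by the card's specs): dropping from 32-bit to 16-bit colour halves the bytes the card has to move per frame, which eases the load on its memory bandwidth. A rough back-of-envelope sketch:

```python
# Rough framebuffer scan-out bandwidth at 1920x1200, 60 Hz.
# This is only the refresh/scan-out cost, ignoring rendering work;
# figures are ballpark, not measured.

WIDTH, HEIGHT, REFRESH_HZ = 1920, 1200, 60

def scanout_mb_per_sec(bits_per_pixel):
    """Bytes per frame times refresh rate, in MB/s."""
    bytes_per_frame = WIDTH * HEIGHT * (bits_per_pixel // 8)
    return bytes_per_frame * REFRESH_HZ / 1e6

print(f"32-bit colour: {scanout_mb_per_sec(32):.0f} MB/s")
print(f"16-bit colour: {scanout_mb_per_sec(16):.0f} MB/s")
```

Halving the colour depth halves that figure, which might be why the noise disappears once the heavy encoding work starts competing for the same memory.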

The 2407WFP uses single-link DVI. I've read that 1920x1200 is at the very top end of what a single link can carry. One forum poster said it's technically out of spec with standard timings, but there's just enough headroom to make it work. Seems a bit dodgy to ship a monitor that's technically out of spec, if that's true.
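For anyone curious about the numbers behind that claim, here's a quick sketch. Single-link DVI tops out at a 165 MHz pixel clock; the timing totals below are the standard CVT figures for 1920x1200 at 60 Hz, so treat the exact values as approximate:

```python
# Does 1920x1200 @ 60 Hz fit down a single DVI link?
# Single-link DVI (TMDS) is limited to a 165 MHz pixel clock.

SINGLE_LINK_LIMIT_MHZ = 165.0

def pixel_clock_mhz(h_total, v_total, refresh_hz=60):
    """Pixel clock = total pixels per frame (incl. blanking) x refresh rate."""
    return h_total * v_total * refresh_hz / 1e6

# CVT standard blanking: 2592 x 1245 total -> exceeds the single-link limit
standard = pixel_clock_mhz(2592, 1245)
# CVT reduced blanking: 2080 x 1235 total -> squeezes under the limit
reduced = pixel_clock_mhz(2080, 1235)

print(f"standard blanking: {standard:.1f} MHz, fits: {standard <= SINGLE_LINK_LIMIT_MHZ}")
print(f"reduced blanking:  {reduced:.1f} MHz, fits: {reduced <= SINGLE_LINK_LIMIT_MHZ}")
```

So the mode only fits by using reduced blanking, which is presumably the "headroom" that poster was talking about.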

Until I get some confidence that a new mobo & graphics card will actually make a difference I might hobble along in 16-bit colour. I'd appreciate any more comments.

Speedy Gonzales
03-04-2007, 03:08 PM
Sounds like this: if you change the main display to 32-bit, it'll change to 1024x768 instead of 1440x900.

I can't see the difference between 16 and 32 bit anyway.

Only the resolution. Which looks crap.

03-04-2007, 03:46 PM
The resolution of the monitor is 1920x1200, and it stays the same in both 16-bit and 32-bit display modes. I did a couple of quick tests this morning and to be honest I can't really see a lot of difference either. I think maybe this will do me for now.

03-04-2007, 04:49 PM
Yeah, I haven't been able to see a visible difference between 32-bit and 16-bit either. Interesting to hear about 1920x1200 being slightly over spec. Perhaps it ran into that problem, and decreasing to 16-bit means the card can handle it.

Speedy Gonzales
03-04-2007, 04:57 PM
I think it only goes to 1024x768 if the 2nd monitor is enabled (if I switch to 32-bit); it's OK if I disable the 2nd monitor (TV).