davidsword · Posted July 4, 2007 (author)

Seriously, 1440x900 is nice, but it has been the biggest headache I've ever had with computers: 1440x900 in the XP thread, 1440x900 in Ubuntu, and now 1440x900 in Vista...

nVidia GeForce 5500 256 MB graphics
Samsung SyncMaster 941BW 19" wide monitor

Monitor & card drivers have been updated with Vista drivers. Here's the ***** part of it: the 1st monitor doesn't get it, but the second one does (yes, I have unchecked "Hide modes that this monitor cannot display" in Advanced settings). My next thought was to just use monitor "2" as the main monitor, but after a reboot it displays the "2nd" monitor with nothing on it... I'm in a pickle. I somehow figured it out a few months ago, but can't grasp what to do now. Any help would be greatly appreciated.

The monitor driver was released pre-Vista, and on Samsung's site they don't support my model: http://www.samsung.com/support/productsupp...load/index.aspx
Godmic · Posted July 4, 2007

You don't need dual high-res monitors, in my opinion. One high-res monitor is fine; just run the other at a lower resolution to hold all your PS tools and chat, keeping them out of your main window.
davidsword · Posted July 4, 2007 (author)

No no no, let me be more specific. I only have one monitor; the card itself has two ports (digital DVI & analog RGB / S-Video out), so it's registered as two monitors by default.
Linwe · Posted July 4, 2007

I don't get your problem... I'm using 1440x900 in Vista & XP (dual-boot). My video card is a 256 MB nVidia 6200A-LE, with 2 ports too; my monitor is a 19'' Samsung SyncMaster 940NW (BTW, it seems to be slightly older than your monitor, and mine is supported), and I haven't experienced any problems... Did you try Windows Update? I upgraded my driver to some sort of 'Plus' driver using it. Maybe that would work.
davidsword · Posted July 4, 2007 (author)

I think (6 hours deep) I've finally figured it out. If anyone's familiar with Samsung BW94X monitors, they have digital and analog inputs... I've had them both plugged in this whole time. After hitting [->] on the monitor, I realized it switches between analog and digital. So this whole time I was looking at digital, which doesn't support 1440x900, while analog sat there on "monitor 2" projecting 1440x900... "oops"
virstulte · Posted July 4, 2007

Digital and analog should support the same resolutions.

I was having a similar problem with my 20.5" widescreen 1680x1050 monitor. It would actually only display up to 1440x900 on DVI, which was frustrating because DVI is infinitely better than VGA. But if I plugged it in using VGA, I was able to display the native resolution of 1680x1050, no problem.

I spent FOREVER with video drivers, etc., trying to get that @#%#$ monitor to work at the native resolution, and after wasting hour after hour I gave up and just used VGA anyway.

Then, about 3 weeks or so ago, I made the switch to Mac. I plugged in the monitor, and it instantly (literally) recognized the monitor's native resolution, chose the proper color profile, and set me up with a dual monitor setup... all without me even doing so much as clicking the mouse. Just plugged it in and it worked. Amazing.

So, I would like to gently inform you that yes, your monitor does support all the same resolutions for digital as it does for analog; the issue is that your computer can't figure that out.
davidsword · Posted July 4, 2007 (author)

lol, that's insane how digital {the superior of the two} doesn't hold native resolutions. I'm 99% sure I'm grabbing a Mac in a week anyways, so we'll see how that works out. (BTW, my Vista color profile is once again wrong, projecting my images with a peachy tone overlay. *yay Windows!*)
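The "computer can't figure it out" failure described above usually comes down to EDID: the driver builds its mode list from the identification block the monitor reports over each connector, so a missing or unreadable EDID on one input hides modes the panel actually supports. Here is a minimal sketch, assuming the VESA EDID 1.3 byte layout, of how the native mode is read out of an EDID detailed timing descriptor (the descriptor bytes below are synthetic, built only for illustration):

```python
# Sketch: how a driver derives a monitor's native mode from EDID.
# The 18-byte "detailed timing descriptor" (the first one starts at EDID
# offset 54) encodes pixel clock and active resolution.

def parse_detailed_timing(d):
    """Parse the fields we care about from an 18-byte timing descriptor."""
    pixel_clock_khz = (d[0] | (d[1] << 8)) * 10   # stored in 10 kHz units, LE
    h_active = d[2] | ((d[4] & 0xF0) << 4)        # low 8 bits + high nibble
    v_active = d[5] | ((d[7] & 0xF0) << 4)        # low 8 bits + high nibble
    return pixel_clock_khz, h_active, v_active

# Synthetic descriptor for a 1440x900@60 mode (106.5 MHz pixel clock).
# Only the fields read above are filled in; the rest are left zeroed.
desc = bytearray(18)
clock = 10650                       # 106.5 MHz expressed in 10 kHz units
desc[0], desc[1] = clock & 0xFF, clock >> 8
desc[2] = 1440 & 0xFF               # H active, low 8 bits
desc[4] = (1440 >> 8) << 4          # H active, high 4 bits (upper nibble)
desc[5] = 900 & 0xFF                # V active, low 8 bits
desc[7] = (900 >> 8) << 4           # V active, high 4 bits (upper nibble)

print(parse_detailed_timing(desc))  # (106500, 1440, 900)
```

If one input returns no usable EDID, the driver falls back to a conservative default mode list, which would match the behavior in this thread: the same panel offering 1440x900 on one connector and not the other.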
deathmedic3rd · Posted July 4, 2007

Only have one port plugged in, and remove the 'other' monitor. Set the resolution to what you want... shouldn't be too hard.
Astyanax · Posted July 6, 2007

I think it's your video card. That isn't a video card that nVidia supports for Vista, if I recall correctly. I used to have problems with 1440x900 on my old monitor with my 7900, but that was way back in the early beta drivers (late last year), and they cleared up before I switched to ATi.
Linwe · Posted July 7, 2007

nVidia on Vista sucks. Just switched to a brand new ATI Radeon 9550. Way better! Anyway, hope you solved your problem, coldwerturkey!
Astyanax · Posted July 7, 2007

Well, it depends which nVidia card you have. I don't know about the 6 series, but the 7 and 8 series have no more performance issues, from what I hear. That said, I had a 7900 back in January, waited till March with no decent new drivers, bought an X1950 XTX, and never thought of nVidia again.
nixter · Posted September 3, 2007

I have a similar problem. I have an 8800 GTS with an Advent 19" widescreen monitor. This thing doesn't have a DVI port, so I've been using a VGA-DVI converter. When I first installed Vista and the card's supplied nVidia drivers, I got an option to use 1440x900, and all worked fine for about a month. Then one day I switched on the computer: the display was grainy and the 1440x900 option was gone!!! Try as I may, I can't get it to come back. I've installed countless drivers and tried RivaTuner to try to override it; I can't get it to work. Anyone got any ideas? It must be possible, otherwise how else would I have used it initially? Please help, this is driving me nuts.
AirForceOnes · Posted September 3, 2007

I'm having no problems; that's my resolution right now (19" monitor).
Astyanax · Posted September 4, 2007

nixter: first off, have you updated to the latest beta of nVidia's drivers?

Latest beta, 32-bit: http://www.nvidia.com/object/winvista_x86_163.44.html
Latest beta, 64-bit: http://www.nvidia.com/object/winvista_x64_163.44.html
zaimek · Posted September 4, 2007

From what I remember, Samsung has two versions of its drivers (digital/analog); check whether you have installed both of them.
nixter · Posted September 4, 2007

Astyanax: I have the latest drivers to my knowledge, version 163.44, to no avail. Any other ideas?

zaimek: I may be being stupid, but I have an Advent monitor, and I can't get any drivers for it anywhere.
zaimek · Posted September 5, 2007

nixter: sorry, I was referring to coldwerturkey's post. Did you update any drivers prior to when the problem occurred?
davidsword · Posted September 5, 2007 (author)

Yeah, first thing I did. The problem was my monitor reads DVI before RGB; I had them both plugged in, so it ignored the analog cord. Human error :S
Astyanax · Posted September 5, 2007

I had this issue in the Vista beta with my old 19" monitor; if you are using DVI, switch to VGA. And yes, that was with nVidia.
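For what it's worth, the DVI-to-VGA workaround isn't about bandwidth: single-link DVI tops out at a 165 MHz pixel clock, which comfortably covers both 1440x900 and 1680x1050 at 60 Hz. A rough sanity check, assuming roughly 20% blanking overhead on top of the active pixels (a typical ballpark, not an exact timing):

```python
# Rough check that single-link DVI's 165 MHz pixel-clock ceiling is nowhere
# near the limit for these modes -- so a mode refusing to appear over DVI
# points at detection (EDID/driver), not link bandwidth.

SINGLE_LINK_DVI_MHZ = 165.0
BLANKING_OVERHEAD = 1.20  # assumed ~20% extra for blanking intervals

def approx_pixel_clock_mhz(width, height, refresh_hz=60):
    """Estimate the pixel clock a mode needs, including blanking."""
    return width * height * refresh_hz * BLANKING_OVERHEAD / 1e6

for w, h in [(1440, 900), (1680, 1050)]:
    clk = approx_pixel_clock_mhz(w, h)
    fits = clk <= SINGLE_LINK_DVI_MHZ
    print(f"{w}x{h}@60: ~{clk:.0f} MHz (single-link DVI OK: {fits})")
# 1440x900@60: ~93 MHz (single-link DVI OK: True)
# 1680x1050@60: ~127 MHz (single-link DVI OK: True)
```

Since the link itself has plenty of headroom, falling back to VGA only works around the detection problem; it doesn't fix it.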
painkilleryusuf · Posted September 5, 2007

Yeah, I'm running the new ATI drivers and gotta say, they run flawlessly on my Vista.