
1440x900 resolution in Vista



Seriously, 1440x900 is nice, but it has been the biggest headache I've ever had with computers: 1440x900 in XP (its own thread), 1440x900 in Ubuntu, and now 1440x900 in Vista...

nVidia GeForce 5500, 256 MB graphics card

Samsung SyncMaster 941BW 19" Wide

Monitor and card drivers have both been updated with Vista drivers.

Here's the ***** part of it:

[screenshot of the Vista Display Settings dialog showing both monitors]

The 1st monitor doesn't get it, but the second one does (yes, I have unchecked "Hide modes that this monitor cannot display" in Advanced settings). The next thought was to just use monitor "2" as the main monitor, but after a reboot it displays the "2nd" monitor with nothing on it...

I'm in a pickle... I somehow figured it out a few months ago, but can't work out what to do now. Any help would be greatly appreciated.

---

The monitor driver was released pre-Vista, and on Samsung's site they don't support my model: http://www.samsung.com/support/productsupp...load/index.aspx
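
In case it helps anyone poking at the same problem, here's a rough sketch (Python + ctypes against the Win32 EnumDisplaySettingsW API) that dumps every mode each display device reports, so you can see which output is actually offering 1440x900. The \\.\DISPLAY1 and \\.\DISPLAY2 device names below are just the usual defaults and an assumption on my part; yours may differ.

# Sketch: enumerate every display mode the driver exposes per display device,
# to check which output actually reports 1440x900.
import ctypes
from ctypes import wintypes

class DEVMODEW(ctypes.Structure):
    # Display-oriented layout of the Win32 DEVMODEW structure.
    _fields_ = [
        ("dmDeviceName",         wintypes.WCHAR * 32),
        ("dmSpecVersion",        wintypes.WORD),
        ("dmDriverVersion",      wintypes.WORD),
        ("dmSize",               wintypes.WORD),
        ("dmDriverExtra",        wintypes.WORD),
        ("dmFields",             wintypes.DWORD),
        ("dmPositionX",          wintypes.LONG),
        ("dmPositionY",          wintypes.LONG),
        ("dmDisplayOrientation", wintypes.DWORD),
        ("dmDisplayFixedOutput", wintypes.DWORD),
        ("dmColor",              ctypes.c_short),
        ("dmDuplex",             ctypes.c_short),
        ("dmYResolution",        ctypes.c_short),
        ("dmTTOption",           ctypes.c_short),
        ("dmCollate",            ctypes.c_short),
        ("dmFormName",           wintypes.WCHAR * 32),
        ("dmLogPixels",          wintypes.WORD),
        ("dmBitsPerPel",         wintypes.DWORD),
        ("dmPelsWidth",          wintypes.DWORD),
        ("dmPelsHeight",         wintypes.DWORD),
        ("dmDisplayFlags",       wintypes.DWORD),
        ("dmDisplayFrequency",   wintypes.DWORD),
        ("dmICMMethod",          wintypes.DWORD),
        ("dmICMIntent",          wintypes.DWORD),
        ("dmMediaType",          wintypes.DWORD),
        ("dmDitherType",         wintypes.DWORD),
        ("dmReserved1",          wintypes.DWORD),
        ("dmReserved2",          wintypes.DWORD),
        ("dmPanningWidth",       wintypes.DWORD),
        ("dmPanningHeight",      wintypes.DWORD),
    ]

user32 = ctypes.windll.user32

def list_modes(device):
    """Return the (width, height, refresh) modes reported for one device."""
    modes = set()
    dm = DEVMODEW()
    dm.dmSize = ctypes.sizeof(DEVMODEW)
    i = 0
    while user32.EnumDisplaySettingsW(device, i, ctypes.byref(dm)):
        modes.add((dm.dmPelsWidth, dm.dmPelsHeight, dm.dmDisplayFrequency))
        i += 1
    return sorted(modes)

# Device names here are the common defaults; adjust if your system differs.
for dev in (r"\\.\DISPLAY1", r"\\.\DISPLAY2"):
    modes = list_modes(dev)
    has_wide = any(w == 1440 and h == 900 for (w, h, _) in modes)
    print(dev, "offers 1440x900:", has_wide, "| total modes:", len(modes))

If one device lists the mode and the other doesn't, the problem is on the driver/output side rather than the monitor itself.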

Link to post

I don't get your problem...

I'm using 1440x900 in Vista and XP (dual boot). My video card is a 256 MB nVidia 6200A-LE, also with 2 ports, and my monitor is a 19" Samsung SyncMaster 940NW (BTW, it seems to be slightly older than your monitor, and mine is supported), and I haven't experienced any problems...

Did you try Windows Update? I upgraded my driver to some sort of 'Plus' driver through it. Maybe that would work.

Link to post

I think (6 hours deep) I've finally figured it out. If anyone's familiar with Samsung BW94X monitors, they have both digital and analog inputs... I've had both plugged in this whole time. After hitting [->] on the monitor, I realized it switches between analog and digital, so this whole time I was looking at digital, which doesn't support 1440x900, while analog sat there on "monitor 2" projecting 1440x900... "Oops."

Link to post

Digital and analog should support the same resolutions.

I was having a similar problem with my 20.5" widescreen 1680x1050 monitor. It would actually only display up to 1440x900 over DVI, which was frustrating because DVI is infinitely better than VGA. But if I plugged it in using VGA, I was able to display the native resolution of 1680x1050 with no problem.

I spent FOREVER with video drivers, etc., trying to get that @#%#$ monitor to work at the native resolution, and after wasting hour after hour I gave up and just used VGA anyway.

Then, about 3 weeks or so ago, I made the switch to mac. I plugged in the monitor, and it instantly (literally) recognized the monitor's native resolution, chose the proper color profile, and set me up with a dual monitor setup...all without me even doing so much as clicking the mouse. Just plugged it in and it worked. Amazing.

So, I would like to gently inform you that yes, your monitor does support all the same resolutions over digital as it does over analog; the problem is that your computer can't figure that out.

Link to post

lol, that's insane how digital (the superior of the two) doesn't hold native resolutions. I'm 99% sure I'm grabbing a Mac in a week anyway, so we'll see how that works out. (BTW, my Vista color profile is once again wrong, projecting my images with a peachy tone overlay. *yay Windows!*)

Link to post

I think it's your video card. That isn't a video card that nVidia supports for Vista, if I recall correctly.

I used to have problems with 1440x900 on my old monitor with my 7900, but that was way back in the early beta drivers (late last year). They cleared up before I switched to ATi.

Link to post

Well, it depends which nVidia card you have. I don't know about the 6 series, but the 7 and 8 series have no more performance issues, from what I hear.

That doesn't stop me from saying that I had a 7900 back in January, waited until March with no decent new drivers, bought an X1950 XTX, and never thought about nVidia again.

Link to post
  • 1 month later...

I have a similar problem. I have an 8800 GTS with an Advent 19" widescreen monitor. Now, this monitor doesn't have a DVI port, so I've been using a VGA-to-DVI converter.

When I first installed Vista and the nVidia drivers supplied with the card, I got an option to use 1440x900, and all worked fine for about a month. Then one day I switched on the computer, the display was grainy, and the 1440x900 option was gone!!! Try as I may, I can't get it to come back; I've installed countless drivers and tried RivaTuner to try to override it, but I can't get it to work! Anyone got any ideas? It must be possible, otherwise how else would I have used it initially? Please help, this is driving me nuts.

Link to post

First off, have you updated to the latest beta of nVidia's drivers?

BTW: Latest Beta 32 Bit: http://www.nvidia.com/object/winvista_x86_163.44.html

Latest Beta 64 Bit: http://www.nvidia.com/object/winvista_x64_163.44.html
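
If the 163.44 install doesn't seem to change anything, it's also worth confirming which driver Windows actually loaded afterwards. Here's a rough sketch (Python, reading the standard "Display adapters" device class in the registry); the DriverDesc / DriverVersion value names are the usual ones, but treat the whole thing as an assumption rather than a guaranteed layout.

# Sketch: print the display adapter driver version Windows is actually using.
# {4d36e968-...} is the standard "Display adapters" setup class. For nVidia,
# the trailing digits of DriverVersion normally map to the release number
# (e.g. a value ending in 11.6344 corresponds to 163.44), though that
# mapping is informal.
import winreg

DISPLAY_CLASS = (r"SYSTEM\CurrentControlSet\Control\Class"
                 r"\{4d36e968-e325-11ce-bfc1-08002be10318}")

with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, DISPLAY_CLASS) as cls:
    i = 0
    while True:
        try:
            sub = winreg.EnumKey(cls, i)
        except OSError:
            break  # no more subkeys
        i += 1
        if not sub.isdigit():  # skip non-device entries such as "Properties"
            continue
        try:
            with winreg.OpenKey(cls, sub) as dev:
                desc, _ = winreg.QueryValueEx(dev, "DriverDesc")
                ver, _ = winreg.QueryValueEx(dev, "DriverVersion")
        except OSError:
            continue
        print(f"{sub}: {desc} -> driver {ver}")

If the version printed doesn't change after installing the beta, the new driver isn't actually the one in use.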

Link to post
