"Input not Supported" message

by PvtVoulge / January 10, 2009 11:42 AM PST

I have an Acer P221W monitor and just recently installed an Nvidia GeForce 9800 GT OC. I was running it with just the DVI cable, but would get "Input not Supported" when running at 1650x1080. I then used both the VGA and DVI (simultaneously connected) and still get the message. When running games, I will sometimes get a bunch of blue dots (usually around smoky areas) that flow with the graphics. If I change the resolution to somewhere around 1280x1024 it works fine, but looks incredibly pixelated. I just installed the latest drivers, but that didn't help. Thanks for any help!

I really doubt that is correct
by ChuckT / January 10, 2009 5:23 PM PST

I don't think you have a 1650x1080 resolution, and maybe that's your problem.

However, a resolution of 1680x1050 is likely.

You're right
by PvtVoulge / January 11, 2009 12:02 AM PST

My mistake. It is 1680x1050.

and was that the problem?
by ChuckT / January 11, 2009 4:54 AM PST
In reply to: You're right

Is your problem now resolved?
If not, then here's some info.
First, you do NOT have to have both the DVI and VGA cables connected at the same time. It doesn't hurt, but your monitor will only use the input from one or the other and ignore the one not in use. I have a couple of multi-input LCD monitors with both inputs connected at the same time, but my inputs come from different computers, so I use the monitor's own input switch to flip back and forth between them. (I do it that way because my current KVM, a Keyboard, Video, Mouse switch, does not switch DVI inputs.)

The resolution does not have to be identical between the inputs, either. However, it only makes sense to supply the monitor with its native resolution. In my case that is 1920x1200; my other large monitor is 1680x1050, just like yours.

Now, if you are getting an "Input not Supported" message (on the monitor), it is telling you one of two things. Either you are not connected to the correct input (possible if you have multiple inputs on the monitor, though it sounds like you are using both of a probable two anyway), or the signal you are sending to the monitor is wrong: the resolution is outside what the monitor accepts (it should be 1680x1050), or the refresh rate is too slow or, more likely, too fast for the monitor's circuits. Try to see if you can set your refresh rate to 60Hz; that is the most widely supported refresh rate for LCD monitors. 60Hz is not the best for tube-type monitors, since it can cause screen flicker (I can see it, some people can't) or even headaches and eyestrain. On LCD monitors, however, because of their relatively slow pixel response, you will not notice any flicker at 60Hz.

Since it sounds like you have selected the correct HxV "size" (1680x1050), I would guess that you have the refresh rate set higher than 60Hz. Not every LCD monitor is going to support 85 or 120Hz. Try 60Hz and see if that works.
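
If you want to double-check what modes the driver is actually offering before guessing, here is a minimal sketch that lists every mode the card reports and looks for 1680x1050 at 60Hz. It assumes a Windows machine with Python and the pywin32 package installed, neither of which has been confirmed in this thread:

# List every display mode the video driver reports for the primary
# display, then check whether 1680x1050 @ 60 Hz is among them.
import win32api

modes = set()
i = 0
try:
    while True:
        dm = win32api.EnumDisplaySettings(None, i)  # None = primary display
        modes.add((dm.PelsWidth, dm.PelsHeight, dm.DisplayFrequency))
        i += 1
except win32api.error:
    pass  # raised once there are no more modes to enumerate

for w, h, hz in sorted(modes):
    print("%dx%d @ %d Hz" % (w, h, hz))

print("1680x1050 @ 60 Hz offered:", (1680, 1050, 60) in modes)

If 1680x1050 at 60Hz is not in that list, the driver, not the monitor, is the thing to chase.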

Get back to us.

Hmm...
by PvtVoulge / January 11, 2009 6:24 AM PST

My video card and monitor refresh rates are both set to 60 Hertz. I have read elsewhere that this message usually pops up when the rate the card puts out does not match what the monitor supports. However, that is not the case here, and it is quite frustrating.
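
For reference, one way to confirm what the card is actually sending right now (a minimal sketch, again assuming Windows with Python and the pywin32 package, neither of which this thread confirms):

# Print the resolution and refresh rate the card is outputting right now.
import win32api
import win32con

dm = win32api.EnumDisplaySettings(None, win32con.ENUM_CURRENT_SETTINGS)
print("Current output: %dx%d @ %d Hz"
      % (dm.PelsWidth, dm.PelsHeight, dm.DisplayFrequency))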

re: card rate does not match the monitor rate
by ChuckT / January 11, 2009 6:52 AM PST
In reply to: Hmm...

That is inaccurate. The card can put out a selection of refresh rates and resolutions. The monitor can ACCEPT a range of refresh rates and resolutions.

It is only a matter of finding a common setting between the two. And since you know your monitor is 1680x1050, and that 60Hz is the most common refresh rate that US-bound LCD monitors support, it is only a matter of making those settings and seeing what happens.
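
If you want to force exactly those settings and see whether the driver takes them, here is a minimal sketch (same assumptions as before: Windows, Python, pywin32). It asks the driver to test the mode first, then applies it for the current session only:

# Try to force 1680x1050 @ 60 Hz. CDS_TEST asks the driver whether the
# mode would work without actually switching; passing 0 then applies it
# for this session only (nothing is written to the registry).
import win32api
import win32con

dm = win32api.EnumDisplaySettings(None, win32con.ENUM_CURRENT_SETTINGS)
dm.PelsWidth = 1680
dm.PelsHeight = 1050
dm.DisplayFrequency = 60
dm.Fields = (win32con.DM_PELSWIDTH | win32con.DM_PELSHEIGHT
             | win32con.DM_DISPLAYFREQUENCY)

if win32api.ChangeDisplaySettings(dm, win32con.CDS_TEST) == win32con.DISP_CHANGE_SUCCESSFUL:
    win32api.ChangeDisplaySettings(dm, 0)
    print("Mode applied: 1680x1050 @ 60 Hz")
else:
    print("The driver rejected 1680x1050 @ 60 Hz")

If the driver rejects a mode the monitor's spec sheet says it supports, that points back at the driver.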

If this doesn't work, then there is some other issue.
Perhaps this is not the right driver for your video card, and you need to find the correct one.
Perhaps the driver has been damaged, and you need to replace it.
Perhaps you are running another video interface "solution" (one that adds special features) that is interfering with your video driver.

I can keep on guessing, but I need more to go on.

by PvtVoulge / January 11, 2009 7:04 AM PST

When I installed the card, I installed the drivers that came with it. However, the message popped up with them too, so I downloaded the most recent drivers straight from Nvidia. I'll try uninstalling the downloaded drivers and doing a clean install. If that doesn't work, then I don't know what will.

>_>
by PvtVoulge / January 28, 2009 7:36 AM PST

That didn't work. I'm still getting the message. This is quite frustrating!

Swap out
by ChuckT / January 28, 2009 11:52 AM PST
In reply to: >_>

When all else fails, one of the next steps is to swap out possibly bad components.

If you have access to another monitor, try that.
If you only have a few bucks, try swapping out the video card. Many decent (but not GREAT!) video cards can be bought new, cheaply, at some stores. If you really want to go cheap, find a used video card on Craigslist.
