DVI Cable help
I just bought a Samsung SyncMaster S19B420 monitor, resolution 1440x900.
The DVI cable that came with it is much too short for my needs and I want to buy a longer one.
I need to know if the number of pins on the DVI cable matters. The one that came with the monitor has six rows of pins with two blank rows in the centre. Does it matter if I buy a cable with eight full rows, or do I need to buy a cable with six rows, just like the one that came with the monitor?
I don't know much about cables and would really appreciate it if I could get advice on the best possible DVI cable for my monitor. By that, I mean one that gives the best resolution.
I did phone Samsung to ask, but they were useless. Firstly, they couldn't tell me if the number of pins on the cable mattered; and then they told me I should take my monitor to a store and try out a bunch of different cables to see which works best.
Dan, thanks in advance for any help you can give. In the meanwhile I will read about the five different DVI cables you mentioned and see if I can figure out what I might need for my monitor.
When it comes to digital video.
The cable should not matter. I use Amazon Basics cables all the time. I'm sure you'll get a client who may want to try 5 cables, but here I only do that once, and diving into why DVI/HDMI is the same would be a dissertation, so I'll keep it short.
My understanding is that it came with BOTH cables, so...
try them both and you be the judge. But if your computer doesn't have both connections, then you have no choice but to use what you have.
Re: DVI Cable help
Thanks for the reply Oldartq.
Yes, the monitor did come with both DVI and VGA cables. I'm currently using the VGA cable with an extension and, in the less than two weeks since I've had the monitor, I've had issues with the monitor going into sleep mode while I'm typing, or the screen shifting when the monitor boots up and I have to press the AUTO button to reconfigure the screen. This should not be happening. I have read that Samsung monitors can be finicky with some cables or with cable extensions. I reckon the cable extension is a problem.
In any case, I want to buy a long DVI cable because I keep reading that DVI cables do give a noticeably better picture than the VGA. I won't know till I try it and compare the difference.
problem could be ...
....resolution setting or DPI setting. Look at the monitor manual and see what the preferred resolution and DPI settings are, and try those.
Re: problem could be ...
My resolution is set at the preferred resolution, but the manual doesn't tell me anything about the DPI setting; nor can I find any info about it on the Net. My default DPI setting is "Normal Size (96 DPI)". The only other choice is "Large Size (120 DPI)". There is also a CUSTOM SETTING. Is there a way I can find out what the Preferred DPI should be?
dpi sounds OK if 96
Maybe something is wrong with the cable itself.
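For what it's worth, the 96 DPI default lines up with the panel's physical pixel density. Here's a quick back-of-the-envelope check (assuming a 19-inch diagonal and the 1440x900 native resolution; both figures come from this monitor's class, not from anything measured):

```python
import math

# Estimate physical pixel density from resolution and diagonal size.
# Assumed: 19-inch diagonal, 1440x900 native resolution.
width_px, height_px = 1440, 900
diagonal_in = 19.0

diagonal_px = math.hypot(width_px, height_px)  # pixels along the diagonal
dpi = diagonal_px / diagonal_in

print(f"Physical pixel density: {dpi:.1f} DPI")  # roughly 89 DPI
```

The panel works out to roughly 89 physical DPI, so the "Normal Size (96 DPI)" setting is the closest match; 120 DPI would just render everything oversized.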
Re: DVI Cable help
Hi bob b,
Thanks for the reply.
Based on the wikipedia link, I need a "DVI-D" cable, which directly matches the one that came with the monitor. Luckily, Amazon has a 10-ft one going cheap, so I'll get it and see how it goes.
To answer you, I have to tell more.
There is not only DVI but flavors of DVI. More at http://en.wikipedia.org/wiki/Digital_Visual_Interface
Notice there are 5 variations in DVI.
While the Digital ones are better, many can't tell any difference over VGA.
In other words, not something to change to.
DVI vs. VGA
VGA cables, also sometimes called "15-pin," are the way we used to connect our monitors to our computers back in the Pleistocene. It's been around since 1987, and its standard was superseded in 1999 by DVI. VGA cables have fifteen pins arranged in three rows of five.
DVI cables, at least the ones supplied with virtually all consumer monitors (DVI-D Single-Link) have two banks of nine pins each and one flat paddle-type pin.
There is a VERY noticeable difference in the quality of your display with higher-resolution monitors when a DVI cable is used. While there are specifications in existence for several different types of DVI connectors, in real-world practice for consumer PC monitors and video cards only DVI-D is in use, so don't let the existence of other standards confuse you. It is highly likely that DVI-D will remain the standard for many years to come; in its "Dual-Link" configuration (24 pins plus one flat paddle, backward-compatible with Single-Link), it supports higher refresh rates and resolutions only available today on extremely expensive professional/scientific monitors.
By the way, with my EVGA nVidia GTX 460, a fairly low-end card, and my ViewSonic 27" 1080p monitor, I get far better results with DVI than I do with HDMI.
DVI cables are DEFINITELY better. VGA cables are prone to crosstalk and other artifacts that can muddy or misalign a picture on your screen, problems that DVI cables don't exhibit.
I am pretty sure that if you change to a DVI cable you will never change back.
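To put rough numbers on the Single-Link vs Dual-Link point: Single-Link DVI tops out at a 165 MHz pixel clock. This sketch uses an assumed ~25% pixel-clock overhead for blanking intervals (real timings vary, and reduced-blanking modes can squeeze a bit more through), so treat it as an estimate, not a spec lookup:

```python
SINGLE_LINK_MHZ = 165.0  # Single-Link DVI pixel clock ceiling

def needs_dual_link(width, height, refresh_hz, blanking_overhead=1.25):
    """Rough check: does this mode exceed Single-Link DVI bandwidth?

    blanking_overhead of ~1.25 is an assumption; actual video timings
    (e.g. CVT reduced blanking) can come in under this estimate.
    """
    pixel_clock_mhz = width * height * refresh_hz * blanking_overhead / 1e6
    return pixel_clock_mhz > SINGLE_LINK_MHZ

print(needs_dual_link(1440, 900, 60))   # False: well within Single-Link
print(needs_dual_link(2560, 1600, 60))  # True: 30" panels need Dual-Link
```

By this estimate a 1440x900 monitor at 60 Hz needs barely 100 MHz, so an ordinary Single-Link DVI-D cable is plenty for the original poster's setup.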
Definitely worth replacing VGA Chords for DVI chords
I'm a "whatever works" kind of guy. But I was honestly SHOCKED when I saw the difference that a DVI chord makes over VGA. Text is so much more clear.
It improves the musical quality too
James, you are going to get yourself into trouble again...
Better at what resolution?
I have two monitors, both at 1920x1080 resolution, one connected with VGA and the other with DVI-D. I see no difference at all between the two cables. Is it more noticeable playing games or something? For basic use I see no difference.
It's the interfaces, not the cables.
When I got a replacement monitor for my HP desktop (well, actually, a converted server), an LG 22" widescreen, it came with both VGA and DVI ports and cables for both. Initially, since the machine didn't have a DVI connection, I just used the VGA connectors and, of course, compared to the old monitor, it looked just great. However, as an ex-server, the on-board graphics capability of the machine was limited, so I fitted a new graphics card with 512 MB of memory and both VGA and DVI ports. Even over the VGA connection, the new card was a very significant improvement, but when I switched to using the DVI ports, the display was even better and seemed more responsive (subjective, no way to measure).
So, would I go out of my way to replace a VGA monitor with DVI for no other reason? As Bob says, probably not worth it. But if I was replacing the monitor anyway, yes, I would not consider one that didn't offer DVI. That said, if the machine you are connecting it to doesn't have a DVI port (e.g. a laptop possibly), then you would stick with VGA.
You could go along to your local computer shop and see if they can demonstrate the same monitor on both VGA and DVI to help you make up your mind.
Just put it this way,
On all the latest desktop PCs, as of right now, VGA ports are disappearing from monitor connectivity, and monitors and HD display screens will no longer provide VGA ports either.
Only DVI will remain, and dual DVI connectivity may well become a standard part of many manufacturers' desktop packages and pre-configurations. The home entertainment and productivity environment is now a very strong consideration, and the SOHO world is quickly becoming the predominant reality of everyday computing needs.
The technicians call it 'the much faster cable connection for today's demand for high-speed data transit', and the engineers call it 'VGA had its time; now let's have something way better'.
Getting my point? I do hope so. In my case, the cables will come with my newest PC purchase, and I don't think I will have any problems with that kind of transition.
Hope that Spring season comes anytime soon.
Big difference when connecting monitor to macbook air
I noticed a huge difference when I switched from a VGA cable to a DVI cable for the external monitor for my MacBook Air 11". Of course, I had to buy another adaptor from Apple to make the DVI connection. I have a 19" Samsung monitor I'm using to help expand the small MacBook Air 11" screen, and with the VGA cable I wasn't able to set the options on my Mac to the ideal setting for my monitor, 1440x900 at 60 Hz. That option simply wasn't available. With the VGA cable the image was fairly fuzzy and kind of hurt my eyes, not very crisp or clear. After buying the adaptor and connecting the DVI cable, I was able to choose the 1440x900 setting and got a much crisper and clearer image. I was worried this monitor wouldn't work, but just by switching the cable it is working great now.
DVI Vs VGA from technician perspective
Hi, while DVI delivers better quality, it also tends to wear out much more. Many times I was called to fix a screen because there was no signal, and it was actually the DVI cable that had to be replaced. I usually replace it with a VGA cable because in my company the employees don't work at high definition (don't worry, I don't get money for it).
That has more to do with
That has more to do with the digital nature of DVI. Either it works or it doesn't, there's essentially no in between like you can get with VGA. I wouldn't be at all surprised if the VGA cables fail at more or less the same rate because users will jam computers against walls or something and bend cables at all manner of angles they weren't meant to bend in.
I respect your experience but...
that doesn't make any sense. Cables wear out faster?
Difference between VGA and DVI.
VGA and DVI connectors are used to transmit video from a source (like a PC or tablet) to a display device (like a monitor, TV or projector). The primary difference between VGA and DVI is how the video signals travel. VGA connectors and cables carry analog signals, while DVI can carry both analog and digital. DVI is newer and offers a better, sharper display compared to VGA. You can easily tell them apart because VGA connectors (and ports) are blue while DVI connectors are white.
Pushing the Limits--->Quality
VGA= enhanced analog
If you are pushing the limit on cable length go for quality.
A good shielded cable will give you good results in a noisy world.
VGA just doesn't do 1280p....
Because the device I am connecting my monitor to is limited to VGA only. Where does VGA max out regarding 1080p etc.?
The "p" is for progressive scan.
Which is how most VGA settings work. But here's the rub: some DVD and Blu-ray players will not work over VGA. Why is explained on the web.
VGA maxes out beyond 1920x1080, so I don't see the issue with 1080p here. Maybe you should start a new post with the issue you encountered.
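The "VGA goes beyond 1920x1080" point roughly checks out: analog VGA has no hard cable-side resolution cap, and the practical ceiling comes from the video card's DAC, commonly rated around 400 MHz. A crude estimate (assuming ~25% pixel-clock overhead for blanking; the 400 MHz figure is a typical rating, not a universal one):

```python
DAC_LIMIT_MHZ = 400.0  # typical RAMDAC rating; an assumption, varies by card

def vga_pixel_clock_mhz(width, height, refresh_hz, blanking_overhead=1.25):
    """Rough pixel clock needed to drive an analog VGA mode.

    The 1.25 blanking overhead is an assumption; exact timings differ.
    """
    return width * height * refresh_hz * blanking_overhead / 1e6

clock = vga_pixel_clock_mhz(1920, 1080, 60)
print(f"1920x1080@60 needs ~{clock:.0f} MHz, DAC limit ~{DAC_LIMIT_MHZ:.0f} MHz")
```

1080p at 60 Hz lands around 150-160 MHz by this estimate, comfortably under a 400 MHz DAC, which is why analog signal quality (noise, cable length), not raw bandwidth, is usually what limits VGA in practice.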