56 total posts
(Page 1 of 2)
I have a 32 Inch HD tv as monitor
I have an Olevia Silver 32" 16:9 8ms HD LCD TV, Model 332H.
It is great if you use the DVI input and choose a native resolution for your display. Mine runs best at 1280x720.
If you have a decent Video card you will have lots of choices.
Just keep trying resolutions until you get the best picture possible.
Then tweak the colors and hue, and wow, it is really sharp and fast.
Playing DVDs on my computer through it is outstanding, with no ghosting.
Text is crisp (with the correct settings for your video card) and everything is distinct. I have heard some complain about using large screens like this, but it is usually people who didn't take the time to find the correct display resolution to get the sharpest screen.
I got mine here.
My video card is an ATI Radeon, 256 MB, a nothing-special card.
With DVI it is great. Tried the standard PC connection, not so good.
I would say DVI is a must for best display.
HDTV manual sez I shouldn't hook up PC using the HDMI port
Thanks for your reply Tim.
OK so I have this TV's manual (the TV mentioned above) and it says:
"NOTE: DO NOT CONNECT A PC USING THE HDMI PORT. Always use the TV's PCI IN (VGA) port to connect a pc.
- The HDMI port is not designed to support input from a PC.
- Only TV models that include a PC IN (VGA) port are suitable for a connection to a PC."
OK, that's a bit of a bummer. Wouldn't a VGA connection be analog, and thus not use the 1080i capability of this TV?
Why can't I connect my PC in this way? I have a Radeon X1950 xtx card with DVI output and HDTV support. I was going to get a DVI to HDMI cable for hooking it up to the TV. I hope this manual is not taking into consideration video cards such as this.
Do I risk seriously messing up the TV if I use the HDMI port given my video card specs?
Would hooking it up via VGA port be compromising the quality of the picture relative to the HDMI?
Toshiba tech support says...
Toshiba's tech support said not to connect the PC via the DVI-to-HDMI cable to the HDMI input of the TV. The guy didn't know how to explain why I shouldn't do that, given that my video card supports HDTV. I understand the manual says not to do so, but I'm one of those people that wants to know the "why." I really don't want to fry the TV either.
I have an ATI Radeon x1950 video card.
Anyone know the "why?" If my video card supports it, why can't I connect the tv in that way? Something with the way the signal is transferred between the devices?
Really, I'm curious too.
Just wanna let y'all know I am watching this thread for an answer too. Is it just a gimmick to sell their "computer monitor" displays?
Have a question about the HDMI hooking up to a tv
I bought the new Sony Google 32" TV, and I have a killer computer with all the good stuff. There is an HDMI connection on the TV and HDMI micro on the computer. The picture is fine and I don't have any problems except one: I cannot get the mouse to move without a delay. I've set the mouse settings to high, tried changing resolutions, and asked everyone at the Best Buy stores, and they have no clue what they're talking about, lol. So my question is: how can I get the mouse cursor on the Sony Google TV to move faster?
My bet is you'll have to ask if the Sony TV has a GAME MODE.
I bought the same TV and will be getting a Mac Pro to use with the Google TV as a monitor. I have read a lot of articles about what you're going through. It's called lag: you move your mouse and it moves just a bit later on the screen. It can become annoying, especially for me, since I make music and mix with a mouse. Now I wonder if I should even get the Mac Pro. I don't want to be stressing out over the lagging of the mouse. Hopefully there is some good news.
These are my thoughts on your problem. Even though this thread is two years old, I will post here so that anyone with this same problem can read up on what I think.
Sometimes this problem comes from the response time of your monitor/TV/mouse, or from the DPI you have set (but you talked about maxing your sensitivity, so maybe that is not it).
So really, the problem you are having is response time.
Okay, Best Buy are idiots. I know that and I'm in Australia; we don't even have Best Buy here. But if you guys are wondering why your mouse is moving slowly across the screen, try tweaking the response time (ms) of your mouse and see if that fixes it. Mice are designed for small monitors, so using a big-screen TV is like entering a skinny kid into a sumo tournament: he's out of his natural environment and not set up for what you're trying to make him do. Same with the mouse. I've got a DeathAdder 3.5G; I haven't tested it out, but I assume it would work fine with a big-screen TV.
DVI to HDMI connection from PC to HDTV
Here's why your manual says not to connect the PC to the HDMI input: your video card is capable of putting out higher resolutions that can kill your HDTV, which is limited to 1366x768. All you need to do is set the resolution to that or slightly lower and it works out just fine. Most good cards have a 1280x720 setting. I can't live without dual video outputs: one to my HDTV and one to a 20" digital ViewSonic. The monitor handles much higher resolutions, therefore fonts (and everything) are smaller and you can fit a lot on the screen, making for a larger desktop than the HDTV. So the smaller screen fits much more information; the bigger HDTV fits less information but in a much bigger format, and DVDs playing on the PC look great on the HDTV. P.Lambert SC
Vista Ultimate 64bit, Radeon x1950pro + Toshiba 40" zf355
Hi, I've got a problem. When I hook up my PC to this TV, it works only until the system starts loading; the signal disappears before the login screen. Do you know what is going on? I use a Prolink Exclusive DVI-HDMI cable. It runs properly only in safe/emergency mode.
HDMI inputs on TVs only accept CE resolutions, + overscan
Typically, HDMI inputs on TVs are indeed counterintuitive and even *broken* if you attempt to use them as a "PC display". There are two main technical reasons (limitations by design): the inputs only accept "Consumer Electronics" resolutions, and the input signal is subject to "overscan" plus further filtering no matter what.
The CE resolutions are 1920x1080 and 1280x720, each in an interlaced and a non-interlaced variety. The panel's native refresh rate should be 50 or 60 Hz (which corresponds to interlaced video at that field rate), while non-interlaced video material is typically 25 or 30 Hz. Anyway, let's abstract from all the frame-rate conversion quirks you can see even on demo screens in shops... the first important point is: 1366x768, the native display resolution typical of "HD-ready" TVs, is typically NOT accepted on the HDMI input!
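The set of "Consumer Electronics" formats described above can be sketched as a small table. This is a minimal illustration, assuming the standard CEA-861 video identification codes (VICs) for the common 720/1080 formats; the point it makes is that a PC-style mode like 1366x768 simply is not in the list.

```python
# A few common CEA-861 video formats (keyed by their VIC numbers) that
# TV HDMI inputs typically advertise. Note that 1366x768 is absent.
CEA_MODES = {
    4:  (1280, 720,  "p", 60),
    5:  (1920, 1080, "i", 60),
    16: (1920, 1080, "p", 60),
    19: (1280, 720,  "p", 50),
    20: (1920, 1080, "i", 50),
    31: (1920, 1080, "p", 50),
}

def tv_accepts(width, height):
    """True if a WxH mode matches any listed CE format."""
    return any((w, h) == (width, height) for w, h, _, _ in CEA_MODES.values())

print(tv_accepts(1280, 720))   # True  - a CE resolution
print(tv_accepts(1366, 768))   # False - the HD-ready panel's own native mode!
```

So an HD-ready TV's HDMI input will often refuse the one resolution that would map 1:1 onto its panel.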
Okay, from there you may easily conclude: HD-ready is so terribly out these days, anybody would go full-HD anyway, certainly when shopping for a "PC display TV". Hehe, beware of gotcha #2: most TVs will indeed accept 1920x1080 from a PC via the HDMI input, but the TV will overscan it! That is, it will zoom in on the picture a bit (just a few per cent), crop some pixels around the edge of your picture, and at the same time scale the visible portion by some non-integer multiplier. Next, the TV will likely throw in some "edge enhancement" filter for good measure...
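To put rough numbers on that zoom-and-crop: a short sketch, assuming a hypothetical 3% overscan (real sets vary from roughly 2% to 5%):

```python
def overscan_crop(width, height, overscan_pct):
    """Pixels lost per axis, and the non-integer scale factor applied,
    when a TV zooms the picture by overscan_pct percent."""
    scale = 1 + overscan_pct / 100            # e.g. 1.03 for 3 %
    lost_x = round(width - width / scale)     # source columns never shown
    lost_y = round(height - height / scale)   # source rows never shown
    return lost_x, lost_y, scale

lost_x, lost_y, scale = overscan_crop(1920, 1080, 3)
print(lost_x, lost_y, scale)   # 56 31 1.03
```

So at 3% overscan, roughly 56 columns and 31 rows of your desktop vanish off the edges, and everything that remains is resampled by a factor of 1.03, which is exactly what blurs fine text.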
The unavoidable overscanning is a relic of the old days, when the true edge of a PAL or NTSC analog signal would often carry snippets of digital data for service purposes, appearing as "digital garbage on the edges" if the picture ever got displayed whole. For that reason, analog CRT TVs always did overscan a bit, and the scaling didn't harm picture quality very much, owing to the analog picture re-composition on the CRT screen. Well, modern LCD/plasma TVs still overscan even the HD signals received via HDMI, even signals at the native resolution (that is, if the TV allows you to spoon-feed it the native resolution).
I've read rumours that HD broadcasters actually counter that by *shrinking* the actual visible content in the picture... It's a crazy world. No way for you to get a true 1:1 full-HD image from the camera all the way to your TV. The picture will always be scaled back'n'forth several times.
Note that analog VGA DB15 inputs are considered "PC inputs" by definition, do support a much greater number of resolutions, and if you provide the TV's native display resolution, the TV performs no overscan and no filtering on the input video (perhaps some proper level of color conversion to match the gamuts). Yes, it's an analog transmission, subject to noise and limited bandwidth. Then again, with modern semiconductors the noise should be below roughly -50 dB per color channel, and the RAMDACs on modern VGA cards typically have something on the order of a 400 MHz maximum pixel clock, whereas 1920x1080 @ 60 Hz is approximately 180 Mpix/s (considering some typical blank space around the visible region). The necessary analog bandwidth is theoretically even smaller (Mr. Nyquist would say one half the pixel clock), but even those 400 MHz are not a problem with modern silicon.
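The pixel-clock arithmetic above is easy to sanity-check. A minimal sketch, using the standard CEA-861 raster totals for 1080p60 (the ~180 Mpix/s quoted corresponds to a more generous blanking scheme, but it is the same ballpark):

```python
# Sanity-check the VGA bandwidth arithmetic for 1920x1080 @ 60 Hz.
# The CEA-861 raster for 1080p60 is 2200x1125 in total, blanking included.
h_total, v_total, refresh = 2200, 1125, 60

pixel_clock_mhz = h_total * v_total * refresh / 1e6
print(pixel_clock_mhz)        # 148.5 Mpix/s

# Nyquist: the minimum analog bandwidth is half the pixel clock,
# comfortably below a 400 MHz RAMDAC.
print(pixel_clock_mhz / 2)    # 74.25 MHz
```

Either way, the raw analog bandwidth is nowhere near the limiting factor on a modern card; the output filtering and cabling discussed next usually are.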
Practically, in most cases, I'd expect the bottleneck causing "horizontal pixel smear", ghosting of edges, etc., to be an output EMC-compliance filter on the analog VGA output of your graphics card. This is a fairly simple RLC filter. You can improve your VGA picture quality by desoldering/shorting that RLC filter, at the expense of voiding the EMC compliance of your VGA card (potentially irritating the FCC or your respective national EM compliance regulator). Some (old) VGA cables are also pretty bad. I've also seen an early LCD TV years ago (from some cheap no-name brand) that smeared pixels on the DB15 input for some technical cause of its own... try before you buy.
If you still manage to find an LCD TV with a *DVI* input, chances are that the DVI will behave much more sanely and support more resolutions than the "TV HDMI" input. Note that for single-link DVI, 1920x1080 with standard blanking is likely over spec (165 MHz max.), unless you massage your graphics card into some "reduced blanking" timings (which is possible) and you have the luck that your TV accepts that.
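Whether 1080p60 fits through a single DVI link depends entirely on the blanking. A sketch of the comparison, assuming the single-link TMDS ceiling of 165 MHz and the raster totals from the VESA CVT and CVT reduced-blanking formulas:

```python
def pclk_mhz(h_total, v_total, refresh_hz):
    """Pixel clock implied by a raster's total (visible + blanking) size."""
    return h_total * v_total * refresh_hz / 1e6

DVI_SINGLE_LINK_MHZ = 165  # single-link TMDS ceiling

# 1920x1080 @ 60 Hz with two different blanking schemes:
for name, h, v in [("CVT, full blanking   ", 2576, 1120),
                   ("CVT, reduced blanking", 2080, 1111)]:
    clk = pclk_mhz(h, v, 60)
    print(f"{name}: {clk:6.1f} MHz  fits single-link: {clk <= DVI_SINGLE_LINK_MHZ}")
```

Full CVT blanking lands around 173 MHz (over spec), while reduced blanking drops the clock to roughly 139 MHz, which is why the "massage your card into reduced-blanking timings" trick works.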
For the future, I'd be more optimistic if DisplayPort inputs start to appear on TV sets. DisplayPort comes squarely from the PC side of things, so there should be no reason to cripple such inputs with overscan + scaling + filtering.
VGA is perfectly fine
1. Analog or digital has nothing to do with 1080i
2. 1080i is a supported input for your TV, not its output. The TV is a native 720p display, so it just converts 1080i signals to this. You will get best picture quality by using a 720p signal with this TV.
3. It would probably work okay to use a DVI to HDMI cable with your video card/TV, but there's really no reason to. You will almost certainly get better results with a VGA cable.
Also, according to the Best Buy product details, this TV has a DVI input. If that's true, I don't know why you would want to convert to HDMI, and DVI would most likely be the best connection. And when I said you would get best quality with a 720p signal, I really meant a 1366x768 resolution from your computer. Make sure your video card supports this resolution or has the ability to enter it as a custom resolution. ATI drivers do not have a custom resolution feature to my knowledge, but considering the model and HDTV support, you should be fine there.
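The native-resolution point above can be made concrete: a 1366x768 panel must rescale any other input by a non-integer factor, and only its own native mode maps pixels 1:1. A quick illustration:

```python
# Per-axis scale factor a 1366x768 "HD-ready" panel must apply
# to a few common input resolutions.
native_w, native_h = 1366, 768

for w, h in [(1920, 1080), (1280, 720), (1366, 768)]:
    print(f"{w}x{h} -> x{native_w / w:.3f} horizontal, x{native_h / h:.3f} vertical")
```

Both 1080 and 720 inputs get resampled by awkward factors like 0.711 or 1.067; only 1366x768 comes out at exactly 1.0, which is why text looks sharp only at the native mode.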
I would like to disagree...
In my experience you are not going to get a better picture with VGA. I have had my PC hooked up to my 720p HD projector with both VGA and HDMI, and while they both put out ample resolution, the HDMI clearly gave a richer picture with blacker blacks and whiter whites.
re: I have a 32 Inch HD tv as monitor
I would recommend you get an HDMI video card (they are $30-50) and try again. I have a Radeon card in my entertainment PC connected to my 32" Panasonic Viera. I even forced my 720p TV to display 1080p (it needs a lot of adjusting, but it can and does work).
If not, I have used HDMI-to-DVI adapters, which work at both 720p and 1080p. You will need a robust video card to do it, one that has HDTV resolutions (most 2008-and-newer Nvidia/AMD cards can do this); the reason is so you can get the display on the TV to go edge to edge.
Using HD Plasma (50") as a monitor
I don't normally reply to these messages (I am not an expert), but I have done exactly what you have described. I have a Panasonic 50" plasma HDTV and a Dell computer (XP Home edition) with a video card that has a DVI connector. At first I used S-Video, and the picture was OK but fuzzy. Then I went to Fry's and purchased an HDMI-to-DVI cable (4 feet), and I couldn't believe my eyes. The picture was great. The HDMI end was on the TV and the DVI end on the computer. The only thing I noticed was that the computer picture was slightly larger than my TV screen, but I think that is a display function on my computer. When I say bigger, it was only 1/8". I can read everything perfectly, so I left it alone. I do my iTunes and e-mail on the big screen.
Sony Bravia KDL-XBR240
I have a Sony Bravia XBR2 40" connected to my computer via VGA and via HDMI-to-DVI. VGA looks really great, excellent I would say, better than the original 19" Dell monitor. No issues at all playing DVDs, games, etc. Just perfect. Resolution 1920x1080 (same as 1080p).
Connecting to the HDMI port, the signal is recognized as 1080p; actually this is the only way at present that I can get 1080p, but for some reason the picture comes out slightly larger. Though I have set it to 1920x1080, it seems more like 2048x1200, so almost all of the taskbar and half of the first row of icons are off the screen.
I called Dell and they were looking for updated drivers, but I decided not to wait, as I am planning to upgrade to Vista and I have seen HDMI drivers for my video card; can't wait.
In short, having my 1080p Sony connected to the computer is really great...
Hi, I think you may be experiencing overscan. TVs will zoom in on the image as a safety feature; different TVs zoom in by different amounts. I fixed this on my set by using the video card configuration. Do you have an Nvidia card? You may be able to adjust the image using the video card (left/right, up/down, zoom in/zoom out). Give it a shot. Your Sony may also have adjustments like these in its Setup menu.
Common problem though.
This is overscan, due to the HDMI input not being designed for computer use. You may be able to fix it with graphics card settings, but since you said VGA looks excellent, I'm not sure why you are using the HDMI to DVI. If your VGA is using the native resolution of the TV (1920x1080 from what you've said), then that is 1080p; it doesn't matter what the TV says it is. If you're getting the right resolution and it looks excellent, use the VGA!
Only 480 lines on Sony
I can only select 720 by 480 pixels in my display properties. I think the TV would support up to about 1360 by 765 according to the manual. I'm running Windows XP Pro, an NVIDIA GeForce FX 5600, and a Sony KDF-55WF655 (HDMI). I tried removing the monitors from the Device Manager and redetecting the hardware, but I end up with the Plug and Play Monitor at 720 by 480 resolution. Selecting another monitor manually, such as Digital Flat Panel 800x600, does not change the resolution choices. How can I get the full monitor resolution?
Any help will be sincerely appreciated.
Re: Only 480 lines on Sony
I resolved this by downloading the latest drivers for the Nvidia card. BTW the FX5600 works very well with my Sony. It has a nice selection of scan resolutions and allows trimming the overscan off the desktop.
Help on getting sound from the TV coming from the PC
Hello, I have HDMI from my PC to my TV, but no sound. Does anyone know what I need to do to get the sound going?
I have this hooked up from the PC in the back office to the flat-screen HDTV in the living room. Perfect picture, but no sound.
Thanks in advance.
Not all PCs can pipe sound over HDMI.
Be sure to create a NEW discussion and include the make/model of all parts of that PC.
LCD Big Screen
I purchased an Olevia 37" LT37HVS LCD TV/monitor; it has the whole works installed. I just hooked it up and it was astonishing: no problems, and it has split screen as well. Got it at TigerDirect, $999 with a $100 rebate.
HD TV purchased from Tigerdirect
Did you ever get the TigerDirect rebate?
Big Screens (HDTV) as monitors
I do this from time to time. The biggest bang for your buck would be a 1080p set, as you can then take your resolution up to 1920x1080.
I have a 1080i screen and, as one of the other respondents said, you're stuck at a maximum of 1280x720. But the net effect of using the HDTV as a monitor is great if you get all the screen content visible that you desire. I get clear picture resolution, and text and graphics are fantastic. One other thing I would point out is that you can also use the TV as an additional monitor (running two monitors on your PC), which gives you the ability to do detailed stuff on your small monitor while showing your big results on the big screen. I currently run my 1080i screen at 1024x768 via my laptop and I get a really good picture. Hope that helps.
Connect via VGA if possible
I wanted to connect my Sony Vaio desktop running MCE2005 to my HDTV Sony 55" LCD. I used a DVI to HDMI cable. The results were VERY disappointing. No matter what resolution I used, the text was blurry and difficult to read. Graphics are just "ok". Since this TV has no VGA input, this was my only option on this set.
So I then took the same PC and connected to my 37" Vizio HDTV LCD via its VGA input. This is AMAZING! Great picture. Clear, readable text. Score 1 for Vizio!
I was the first
I was the first to put a computer system together with an HDTV seven years ago. I use one of the VGA inputs on my 36-inch RCA MM36100 (with USB hub) in SVGA mode (800*600). You can't buy this HDTV anymore. There are far better ones on the market today.
The other VGA input is from my RCA DTC100 HDTV receiver (also obsolete). The component input used to be from my DVD player, but is now from my HD DVR. I play DVDs in my computer's DVD burner, now.
I haven't had a "computer monitor" this century. That was not a problem. The problem I saw in advance was the mouse, keyboard and game controller being so far away (12 feet through wood). So, I bought the mouse, game controller and keyboard first. Most on the market are only good to 6 feet and many require line-of-sight. Those would never do.
I am still using my RF Intel Wireless Series keyboard and mouse, but upgrading to Windows XP killed the game controller and crippled the rest.
I ordered the RF "Long Ranger" keyboard and mouse today. It is said to be good to 100 feet without barriers, but reviews complain about the layout and key feel.
People with modern HDTVs will not have the troubles I have had. Mine will not exceed 800*600, and some programs will not work at such low resolution. Get a life! I can't see the pixels from 12 feet away, so why do I need more?!
Not a problem with the Sony.
Your images were blurry on the Sony because you didn't hit the native resolution of the TV with your video card. The TV cannot adjust to resolutions outside its range, and it does the best it can when presented with an in-range but non-native resolution. All you have to do is find out the native resolution by looking at the specs at the back of the manual. If you can't find it there, go to the Sony website and see if they have it in their knowledge base. The Vizio was able to display your video card's output resolution natively, therefore the picture is excellent, as the Sony's would be if the resolution were right. Try it, but as folks said in previous posts, don't go PAST the native resolution. Good luck!
hdtv as monitor
Yes, you can do it and it should work fine. I have hooked my Vista PC (with an ATI Radeon X1600 Pro), via normal VGA, to a 42" Samsung plasma HDTV, and it runs at 1280x768 without any problems; the quality is absolutely brilliant. Vista can adjust the settings for you if you use the Media Center tutorial, but for me it worked just fine by letting the TV auto-adjust the image it received (I believe, though, that depends on the TV...). If you get an LCD, you would probably be able to get a higher resolution (plasmas have their limitations for the moment...), but also consider the graphics card.
Just as a personal finding, I will go from now on only for ATI cards, as Windows installed them without any problems or extra drivers needed, and the TV worked in perfect tandem with these cards (I have used three different ATI cards on three computers), whilst Nvidia caused me headaches with drivers not accepted by Windows and a miserable display (with all settings adjusted to very fine tuning, e.g. instead of a nice green colour I was receiving red for a field of grass...). If you wish, you can buy a VGA-HDMI adaptor, but make sure the TV has at least two HDMI ports, otherwise it would be very inconvenient to unplug the HDMI all the time to plug in the other one from your Sky or satellite box.
These being said, I wish you luck with your choice.