PC to HDTV with Dualview?

decorativeed · Full Member · Joined: Oct 19, 2009 · Messages: 13,167 · Location: Tameside
Does anyone have a set-up like this?

I recently bought one of these: Viera TX-L32S20B

It has a PC input (VGA), but the top resolution for that is only 1280x1024. Had I known that beforehand, it might have steered me clear of this TV; my dad's cheapo TV accepts the same input at its full 1920x1080 native resolution. Films look OK on it, but you can't really read any text. I really only want to watch video from my PC on the TV, so it's not a major issue, but it is quite annoying.

I set it up as a Dualview monitor, with the DVI output going to my 22" monitor, and apart from the resolution issue on the TV it worked really well.

To solve the resolution problem, I bought a DVI-to-HDMI cable and swapped the monitor onto the VGA output, but this caused Dualview to go a bit funny. I initially set it up and it worked well, although for some reason, despite the PC outputting 1920x1080 over DVI and the TV receiving it at the same resolution, the picture had a "zoomed in" look, with about 10% of the screen missing around all the edges. Then I switched the TV off, keeping everything else on, and when I switched it back on, all the graphics card settings were gone. I had to set it all up again.

The same thing happened this morning, except that this time it wouldn't let me set up Dualview with the TV as an extended desktop, only as a clone.

Does anyone have experience of doing this properly? I thought it would be a bit more straightforward, like my dad's set-up. Alternatively, can anyone suggest a graphics card that might handle this better, without being too expensive?
 
I just do HDMI to HDMI and it works fine

Except that my graphics card doesn't have an HDMI output. When I did that on my dad's set-up (his graphics card does), the picture was zoomed out, with thick black borders around it.

Do most graphics cards with HDMI out simply clone the view and settings of your monitor, or do they allow extended desktops and separate configuration?

My main problem is that the settings keep changing - now it's not even recognising the TV, which it did perfectly (other than the zooming/overscanning issue) yesterday. Why should turning the TV off change the graphics card settings?
 
Update - I've found there was an overscan function switched on by default. Switching it off has solved the zoom problem. The other problem is caused by switching the PC on before the TV - is there any way round that? It's not ideal to have to keep both of them on all the time.
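For reference, overscan crops a band off every edge, and it adds up fast. A quick sketch (Python, purely to illustrate; the 5%-per-edge figure is a typical value, not this TV's exact setting) of what's left of a 1080p frame:

```python
def overscan_visible(width, height, crop_per_edge):
    """Return the visible resolution after the TV crops a fraction
    of the picture from each edge (e.g. crop_per_edge=0.05 for 5%)."""
    return (int(width * (1 - 2 * crop_per_edge)),
            int(height * (1 - 2 * crop_per_edge)))

print(overscan_visible(1920, 1080, 0.05))  # (1728, 972)
```

So at 5% per edge you lose almost a fifth of the pixels, which is why text on a cloned 1080p desktop looks so soft until overscan is switched off.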
 
Good suggestion, I've just saved it and will give it a try later. Watching Big Lebowski on the other screen as I type!
 
Made a profile and it overcomes the problem I was having, thanks CD.

Also installed a program called DisplayFusion, which is really handy for switching things from screen to screen and having separate wallpapers and so on. Pleased with the set-up now.
 
I want a refund. Just tried turning my PC on before the TV and, when I do that, it cannot recognise that there is another monitor to output to and does not go into Dualview. Surely this is something to do with the HDMI signalling, but how can I overcome it?
 
Even when you switch to single-view and then back to dual-view? You have to switch to dual-view after the TV has turned on, or at least that's what my girlfriend has to do. If that's still not working then my expertise is exhausted I'm afraid!
 
Yep, that's what I've been doing. It just will not recognise the TV is connected unless I switch it on before the PC.
 
I'm not going into the PC input though, I'm going into the HDMI. When I did connect via the PC input (the VGA socket on the back of the TV) I didn't have this problem, but I did have shite resolution.
 
Like I say - it doesn't have a DVI input (very few TVs seem to have them, in my price range at least) and the VGA input is limited in resolution. I couldn't find a manual online and stupidly assumed that, like most other TVs I'd come across, it would allow that input at its native resolution. Why it only goes up to 1280x1024, I don't know.

The DVI-to-HDMI lead works fine and looks great, except for this issue. How can the connector on the TV end have any bearing on the graphics card end? If it were DVI-to-DVI, I'm sure there'd be no issue, just as there isn't when I use two monitors.

Edit: looks like the problem could be with the EDID:

Computers can sometimes lose the EDID - basically the electronic identification of a display, which lists its possible resolutions and frequencies. When displays are switched off or disconnected and then reconnected, the EDID can be lost and the computer can lose the ability to display the image properly.
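If it helps anyone digging into this: the EDID is just a 128-byte block the display sends over the cable's DDC lines, and the interesting bits (manufacturer and preferred resolution) are easy to pull out. A rough Python sketch, assuming you've already dumped the raw block with some EDID tool (real EDIDs also carry checksums and extension blocks, which this ignores):

```python
# Decode a couple of fields from a raw 128-byte EDID base block.
# Illustrative only; not a full EDID parser.

EDID_HEADER = bytes([0x00, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0x00])

def parse_edid(edid: bytes):
    if edid[:8] != EDID_HEADER:
        raise ValueError("not a valid EDID base block")
    # Manufacturer ID: three 5-bit letters packed big-endian into bytes 8-9
    word = (edid[8] << 8) | edid[9]
    mfg = "".join(chr(((word >> s) & 0x1F) + ord("A") - 1) for s in (10, 5, 0))
    # Preferred (native) timing: first detailed descriptor, starting at byte 54
    d = edid[54:72]
    h_active = d[2] | ((d[4] & 0xF0) << 4)  # lower 8 bits + upper 4 bits
    v_active = d[5] | ((d[7] & 0xF0) << 4)
    return mfg, h_active, v_active
```

Which explains the symptom: if the TV is off when the PC enumerates its displays, the card never receives this block over HDMI, so as far as it's concerned there's nothing connected.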