If I want to use the DVI connection from my LCD monitor to my video card, do I have to change anything in my video card setup?
How do I get the most out of it?
What is the difference between DVI and VGA in terms of quality, and how do I maximize it?
What resolution is ideal for a 19″ widescreen LCD?
Sorry for so many questions.
Please help.
Thanks in advance.
Do I need to adjust anything in my video card settings to get the most out of it?
I have set the resolution to 1280 x 1024 and the desktop gets elongated and stretched. Why is that?
Your monitor or your graphics card should have come with a removable DVI-to-VGA adapter. Look at the back of your PC and check whether the monitor cable runs straight into the video card or through an inch-long adapter; remove the adapter and you'll have DVI.
DVI is like high definition and VGA is like standard definition, and your monitor is HD compatible.
I use 1280×1024 as it looks the best and is well spaced out.
If you have a DVI output on the video card and a DVI input on the monitor, then just connect them via a DVI-D cable. If you are buying one you only need a single link cable.
Ideal resolution for a 19″ widescreen is 1440 by 900.
You do not need to do anything else to get the system to work.
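The stretching at 1280 x 1024 comes down to simple aspect-ratio arithmetic, which this little sketch works through (the numbers are just the resolutions mentioned above, not anything read from a driver):

```python
# Why 1280x1024 looks stretched on a 19" widescreen panel whose
# native mode is 1440x900: the two modes have different aspect ratios,
# and the panel scales the image to fill its whole surface.

def aspect(width, height):
    """Return the width-to-height ratio of a display mode."""
    return width / height

native = aspect(1440, 900)    # 16:10 widescreen panel -> 1.600
forced = aspect(1280, 1024)   # 5:4 image the card is sending -> 1.250

print(f"panel aspect:       {native:.3f}")
print(f"signal aspect:      {forced:.3f}")
# Stretching the 5:4 image across the 16:10 panel widens everything
# horizontally by the ratio of the two aspects:
print(f"horizontal stretch: {native / forced:.2f}x")
```

So everything on screen ends up 1.28 times wider than it should be, which is exactly the elongated look described in the question; running at the native 1440 x 900 avoids it.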
DVI-D should be better than VGA because it is a direct digital link.
For VGA, the video card converts the digital signal to analog and sends it out, clocked by the pixel clock. However, that pixel clock is not sent along with the data.
When the monitor receives the signal, it has to sample it and convert it back to a digital format to display on the screen. But since there is no pixel clock information, it does not know exactly where to sample the signal. Monitors usually do a good job of estimating the sample points, but sometimes the signal drifts or the estimate is wrong, and you end up with a blurry or jittery image.
DVI-D is digital in the PC, digital on the cable, and digital in the monitor: (usually) a perfect signal transfer. No sampling errors, no extra noise, no ghosting or ringing from the cable, and no conversion errors in the D-to-A and A-to-D converters.
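The sampling problem above can be illustrated with a toy sketch (this is not a model of real ADC hardware, just an analogy using linear interpolation to stand in for the smooth analog waveform):

```python
# Toy illustration of VGA sampling without a pixel clock: the card emits one
# analog level per pixel period, and the monitor must guess WHERE in each
# period to sample. On-centre sampling recovers the image exactly; sampling
# half a pixel off blends neighbouring pixels and smears sharp edges.

pixels = [0, 0, 0, 255, 255, 255]  # a sharp black-to-white edge

def sample(line, phase):
    """Sample the 'analog' line at positions i + phase, using linear
    interpolation between pixel levels as a crude analog stand-in."""
    out = []
    for i in range(len(line)):
        pos = i + phase
        lo = min(int(pos), len(line) - 1)
        hi = min(lo + 1, len(line) - 1)
        frac = pos - lo
        out.append(round(line[lo] * (1 - frac) + line[hi] * frac))
    return out

print(sample(pixels, 0.0))  # perfect phase: edge reproduced exactly
print(sample(pixels, 0.5))  # half a pixel off: a grey value appears at the edge
```

With the correct phase the output equals the input; with a half-pixel phase error the hard 0-to-255 transition picks up an intermediate grey value, which is the per-pixel version of the blur a misadjusted VGA monitor shows. DVI-D sidesteps all of this because the pixel values travel as numbers, never as a waveform to be re-sampled.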
Three weeks ago I purchased a Samsung SyncMaster 245BW 24″ LCD monitor. It worked fine the first 2 weeks with no problems. However, for the past week I keep getting flashing blue lines on the screen. At one point, there were over 40 horizontal and 20 vertical blue lines appearing on the screen at the same time. The only way to get rid of it is to power off and on.
http://www.youtube.com/watch?v=FjTn52Xmy…
You’ll see that I pan in on my 71″ DLP that’s getting a duplicate feed from my PC and there are no blue lines. I also connected a crappy 17″ CRT and it did not show these lines.
So I called up Samsung tech support and they said it was a driver problem. Based on the video, can a driver cause the flashing blue lines, or is the tech guy I spoke to blowing smoke?
Video Card Is:
XFX GeForce 8600GT nVidia 8600GT Chipset (540Mhz) 256MB (1.4Ghz) GDDR3 Dual DVI PCI-Express Graphics Card
Unfortunately, it sounds like your video card is not compatible with the spec of the screen; you may need to upgrade.
You are looking at around £25 to £50 for a graphics card that can handle that spec.