
Screen resolution vs. graphics card resolution

This is gonna be a dumb question. Are these two always equal? In other words, if the screen resolution says 1366x768, is the graphics card not capable of anything higher? Just wondering because I'd like to connect my (new) laptop to my TV, and a lower resolution wouldn't be so great.

Wondering if I need to get a laptop that says "full HD - 1920x1080"?

"Living tomorrow is everyone's sorrow.
Modern man's daydreams have turned into nightmares."
 
You'd need to look up the specs on your particular graphics card. Laptops can often drive external displays at a greater resolution than the built-in display. And, depending on your graphics card, laptops can support both displays at once (mirrored or extended).
 
So, is the maximum resolution setting (when you go into Control Panel\All Control Panel Items\Display\Screen Resolution) based on what your monitor can handle?

Because I've got an Intel HD Graphics 2500 on my desktop, and the max resolution offered is only 1280x1024 on my VERY OLD Samsung SyncMaster 191T. But that graphics card can do a higher resolution than that.

"Living tomorrow is everyone's sorrow.
Modern man's daydreams have turned into nightmares."
 
Yes, that max setting will be what the monitor or TV can handle. If you want to see the max the card can output, there should be a box that says 'check to show resolutions not supported'; that will show everything the video card can output. At a minimum it will support 1920 x 1200.
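If you'd rather script it than hunt through dialogs, here's a rough sketch that lists every mode the driver reports. It assumes the third-party pywin32 package; win32api.EnumDisplaySettings is a real call, but treat the end-of-list handling (catching win32api.error) as an assumption about how pywin32 signals it.

[code]
# List every display mode the driver reports (Windows, pywin32).
# The driver enumerates modes even if the attached monitor can't show them.
import win32api

i = 0
modes = set()
while True:
    try:
        dm = win32api.EnumDisplaySettings(None, i)  # None = current display
    except win32api.error:
        break  # assumed: pywin32 raises when the mode list is exhausted
    modes.add((dm.PelsWidth, dm.PelsHeight, dm.DisplayFrequency))
    i += 1

for w, h, hz in sorted(modes):
    print(f"{w}x{h} @ {hz}Hz")
[/code]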
 
There's usually a driver setting somewhere that determines whether it will allow you to display non-native resolutions. If this is set to off, it will only let you choose the native resolution of the display. Apart from things like CRT monitors, most displays nowadays have only one native resolution, so with scaling off you can't change resolution. With the setting on, you can choose any standard resolution and it will either be scaled to fit (making it blurry or blocky) or will under- or over-fill the display (i.e. you'll get black borders if you select a resolution smaller than native, or a pannable desktop larger than the screen for resolutions larger than native).
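To make the border case concrete, here's a throwaway sketch (the function name and numbers are just illustrative) of what "scaled to fit while preserving aspect ratio" works out to:

[code]
def fit_mode(native, selected):
    """Fit `selected` onto a `native` panel, preserving aspect ratio,
    and report the black border on each side (letterbox/pillarbox)."""
    nw, nh = native
    sw, sh = selected
    scale = min(nw / sw, nh / sh)                  # largest scale that fits
    dw, dh = round(sw * scale), round(sh * scale)  # drawn image size
    return dw, dh, (nw - dw) // 2, (nh - dh) // 2  # plus border x, border y

# A 5:4 mode (1280x1024) on a 16:9 panel (1920x1080) gets pillarboxed:
print(fit_mode((1920, 1080), (1280, 1024)))  # (1350, 1080, 285, 0)
[/code]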

The maximum resolution you see depends partly on what the monitor can handle and partly on what the other settings (allow non-native, scaling mode) are.

In some drivers (or is it in monitor properties?) there's also a setting that allows you to choose resolutions the monitor can't handle. Turning this on is rarely a good idea, unless you like blank screens.

Nelviticus
 
Why am I so ignorant on video issues? I guess I never had to support anyone who cared about video too much.
there should be a box that says 'check to show resolutions not supported'; that will show everything the video card can output.
I don't see that option in Windows 7. Am I missing it somewhere?

If you want to output video to your TV and bypass the internal monitor, is the resolution more "flexible", versus just the one native resolution that works well on the built-in display?


"Living tomorrow is everyone's sorrow.
Modern man's daydreams have turned into nightmares."
 
We're all ignorant of everything until we learn it.

If you click 'advanced settings' on the Screen Resolution page you'll get a dialog for the monitor's properties. On the 'monitor' tab there's a checkbox labelled 'hide modes that this monitor cannot display', although on my system that's disabled.

Regarding the TV question, it depends on the connection. Some (e.g. HDMI) are two-way connections that let the TV communicate back to the graphics card all the modes it supports. Others are one-way only, in which case your gfx card will probably let you choose any resolution and you'll just have to experiment to find one that works. You may have to click the 'detect' button in this case to get the card to recognise the TV is there.

Nelviticus
 
What's stopping you from plugging it in and seeing if it blows up? [bigsmile]
 
Any display manufactured since the late 90s contains Extended Display Identification Data (EDID), which it shares with the graphics controller. The EDID lets the video source know what resolutions, refresh rates, and other modes the display supports.
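If you're curious what's in there: the EDID base block is a fixed 128-byte layout, so pulling the preferred (native) mode out of it is only a few lines. A minimal sketch; the function name is mine, and where you read the raw bytes from is up to you (on Windows they sit in the registry under the monitor's Device Parameters\EDID value):

[code]
def parse_preferred_mode(edid: bytes):
    """Extract the preferred (native) mode from a 128-byte EDID base block.
    The first detailed timing descriptor starts at byte offset 54."""
    d = edid[54:72]
    pixel_clock_mhz = int.from_bytes(d[0:2], "little") / 100  # stored in 10 kHz units
    h_active = d[2] | ((d[4] & 0xF0) << 4)  # low 8 bits + upper 4 bits
    v_active = d[5] | ((d[7] & 0xF0) << 4)
    return h_active, v_active, pixel_clock_mhz

# A 1920x1080@60 panel typically reports (1920, 1080, 148.5).
[/code]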

According to Intel, the max resolution supported by the HD 2500 is 2560x1600 @ 60Hz. In clone mode, the max for each display is 1920x1200 @ 60Hz.



-Carl
"The glass is neither half-full nor half-empty: it's twice as big as it needs to be."

 
Thanks everybody!! I found the "hide modes that this monitor cannot display" checkbox, but strangely it gives me no more choices than before. 1280x1024 is the max available, and cdogg showed that my video card can do higher.

I would be connecting the laptop via HDMI, so that means I won't have to choose by guessing.

What's stopping you from plugging it in and seeing if it blows up?
Same reason I haven't tried to "see how fast my car will go". Consequences. Though less probable when we're talking about a laptop/TV connection!!!!

"Living tomorrow is everyone's sorrow.
Modern man's daydreams have turned into nightmares."
 
One thing you also need to be aware of is driver support for the monitor, display, or TV. Windows is probably smart enough to just use the default driver built into Windows if it cannot identify the monitor or display by name. There may also be limits imposed by the custom BIOS on the laptop. Often you can go to the manufacturer's website and download a driver for the display or monitor. Also, to save power, it is possible that a laptop's display may revert to 720p or SD or some other setting.

Then, when you attach a computer or laptop to an external monitor or TV, the cable you use may also be a limiting factor. A DVI cable will limit you to about 720p if it is converting to HDMI. HDMI is the default for HDTVs and should be used if available, because it carries both HD video and HD audio.

Do not use a resolution setting on the TV that is higher than the computer or laptop or you will overdrive the graphics on your system and it may skip or turn black from time to time and eventually overheat or damage something.

 
ceh4702 said:
Do not use a resolution setting on the TV that is higher than the computer or laptop or you will overdrive the graphics on your system and it may skip or turn black from time to time and eventually overheat or damage something.

Say what? Are you talking about laptops in general, or a specific GPU? Connecting monitors and TVs that have higher resolutions than the laptop's display is not an issue. For example, a laptop with a 1366x768 screen should not have an issue displaying 1920x1080 on an external monitor.

Maybe you meant to say "higher than the computer or laptop supports", but even then, that probably wouldn't be an issue, because on most laptops you are not able to "force" higher resolutions than the GPU supports. Software provided by Intel for the Intel HD series, for example, does not give you the option to force unsupported resolutions.



-Carl
"The glass is neither half-full nor half-empty: it's twice as big as it needs to be."

 
Also, ceh4702, you are wrong on the cables. DVI and HDMI use the exact same digital signal; the cable has no bearing on the video resolution, and both actually support higher than 1920x1080. Hell, my monitor is using DVI and it is 2048 x 1152. Since the digital audio and the digital video are carried on different pins of the cable, there are no issues. A dual-link DVI cable can support a resolution of 2560 x 1600; the link here is just such a cable from a five-second Google search (notice the resolution supported). A DVI-to-HDMI cable will support 1080p at 60Hz.
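The dual-link point is just bandwidth arithmetic: a single DVI link tops out at a 165 MHz pixel clock, and dual link doubles the lanes. A back-of-the-envelope sketch, using the CVT reduced-blanking rule of thumb (about 160 pixels of horizontal blanking, and roughly 460 microseconds of each frame reserved for vertical blanking) instead of exact timings:

[code]
# Rough check of which DVI link type a 60Hz mode needs.
SINGLE_LINK_MHZ = 165.0  # TMDS pixel-clock limit per DVI link

def pixel_clock_mhz(w, h, hz=60):
    h_total = w + 160                # CVT reduced-blanking horizontal total
    v_total = h / (1 - hz * 460e-6)  # reserve ~460us per frame for blanking
    return h_total * v_total * hz / 1e6

for w, h in [(1920, 1080), (1920, 1200), (2048, 1152), (2560, 1600)]:
    clk = pixel_clock_mhz(w, h)
    link = "single link" if clk <= SINGLE_LINK_MHZ else "dual link"
    print(f"{w}x{h}@60Hz: ~{clk:.0f} MHz -> {link}")
# 2560x1600 needs roughly 269 MHz, well past a single link's 165 MHz.
[/code]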
 
I'm learning a lot here, so thanks. I actually get to plug my new laptop into my TV tonight when my HDMI cable arrives. Very exciting. Finally getting updated from my (cough) Dell Latitude D800 with Windows XP that I take out into the field to be kicked around.

I live in a 1366x768 world so this talk about 2048 x 1152 is interesting.

"Living tomorrow is everyone's sorrow.
Modern man's daydreams have turned into nightmares."
 
"I live in a 1366x768 world so this talk about 2048 x 1152 is interesting."

That's a couple of horizontal steps up from me. Must be nice.

Could do higher, but I wouldn't be able to read it without leaning in to the screen.

Ed Fair
Give the wrong symptoms, get the wrong solutions.
 
How in the hell do you manage that?
A. I didn't know any better (was out there). Not really, but I just didn't care.
B. I couldn't see it if I had higher res.




C. Is this where I admit to still having a VCR?

"Living tomorrow is everyone's sorrow.
Modern man's daydreams have turned into nightmares."
 
"We're all ignorant of everything until we learn it."

or forget it.

Kinda revives memories of writing to the control registers in the graphics chips back before there was an IBM PC. Glad that the operating system has taken over the details.



Ed Fair
Give the wrong symptoms, get the wrong solutions.
 
See, that's where the beauty of using a newer Windows platform comes in handy. You don't need to change the resolution: you can change the DPI settings or the font size for easier reading, and you can also go to Control Panel > Display and change the whole display setting to 125% or 150%. I did this for my step-mother; she kept trying to change the screen resolution and then complaining that it looked pixelated, and this solves that issue. It's basically the magnifier, but for the whole display area instead of a little box you move around the screen.
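To put numbers on why this beats changing resolution: DPI scaling leaves the panel at its native pixel count (so nothing gets the non-native blur); it just shrinks how much fits on screen. A quick illustrative sketch:

[code]
# Effective workspace at a DPI scaling factor: the panel keeps its native
# pixels, but UI elements are drawn larger, so less fits on screen.
def effective_workspace(native_w, native_h, scale_pct):
    s = scale_pct / 100
    return round(native_w / s), round(native_h / s)

for pct in (100, 125, 150):
    print(f"{pct}%:", effective_workspace(1920, 1080, pct))
# 100%: (1920, 1080)  125%: (1536, 864)  150%: (1280, 720)
[/code]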
 
Right, I know Windows 7 (and maybe even Vista) uses larger, higher-resolution assets for desktop icons, the taskbar, window text, etc., which makes everything appear bigger by default. So higher-res monitors aren't fighting as much with small, hard-to-read icons and text off the bat like they would be in XP. Plus they don't suffer from as much blur when adjusting settings like DPI, as rclarke mentions.

Still, a 2560x1600 monitor, which is becoming more common now (see HP ZR30), does need some adjusting to get things to a readable size.

-Carl
"The glass is neither half-full nor half-empty: it's twice as big as it needs to be."

 