I have a Lenovo W520 laptop with two graphics cards:

[screenshot: Device Manager showing "Intel(R) HD Graphics Family" and "NVIDIA Quadro 1000M"]

I think Windows 7 (64-bit) is using my Intel graphics card (which I think is the integrated one), because I have a low graphics rating in the Windows Experience Index. Also, the Intel card has 750 MB of RAM while the NVIDIA has 2 GB.
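
For reference, a rough sketch of how the adapter list can be pulled straight from Windows (this assumes Python is installed; it only shells out to the built-in wmic tool, and Name, AdapterRAM and CurrentHorizontalResolution are standard Win32_VideoController properties):

    # List the video adapters Windows reports, with their RAM and whether
    # they are currently driving a display (a set resolution is a decent hint).
    import subprocess

    def list_video_adapters():
        output = subprocess.check_output(
            ["wmic", "path", "Win32_VideoController",
             "get", "Name,AdapterRAM,CurrentHorizontalResolution", "/format:list"],
            universal_newlines=True,
        )
        adapter = {}
        for line in output.splitlines():
            line = line.strip()
            if not line:
                # A blank line separates adapters in /format:list output.
                if adapter:
                    yield adapter
                    adapter = {}
                continue
            key, _, value = line.partition("=")
            adapter[key] = value
        if adapter:
            yield adapter

    if __name__ == "__main__":
        for a in list_video_adapters():
            ram = a.get("AdapterRAM", "")
            ram_mb = int(ram) // (1024 * 1024) if ram.isdigit() else "unknown"
            active = "yes" if a.get("CurrentHorizontalResolution") else "no/unknown"
            print("%-35s RAM: %s MB  driving a display: %s" % (a.get("Name"), ram_mb, active))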

  1. How do I know for certain which card Windows 7 is really using?
  2. How do I change it?
  3. Since this is a laptop and the display is built in, how would changing the graphics card affect the built-in display?
  • I think it knows which card to switch to based on demand? 3D games should use the NVidia, and most everything else should use the much lower-power Intel built-in video. – geoffc Sep 01 '11 at 02:24
  • I think you're on to something there, geoffc. @jstawski, is there any Lenovo-brand software running in the system tray, particularly one that manages power or other advanced features? – Hand-E-Food Sep 01 '11 at 02:55
  • You can also disable Nvidia Optimus in the BIOS. :) – John Aug 21 '12 at 21:12

1 Answer

geoffc is right. I found out from exploring the BIOS that my machine is using NVIDIA Optimus, a "new" technology for saving battery. The general idea is that it lets the driver pick the right graphics card based on demand: a 3D game will use the NVIDIA card, while surfing the net in Chrome will use the integrated Intel card.
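
If you want to double-check which adapter DirectX reports for the display itself, here is a rough sketch (assuming Python is installed; dxdiag ships with Windows and its /t switch writes a plain-text report):

    # Ask dxdiag for a text report and pull out the display-device lines.
    # dxdiag /t <file> runs silently, writes the report, then exits.
    import os
    import subprocess
    import tempfile

    report = os.path.join(tempfile.gettempdir(), "dxdiag_report.txt")
    subprocess.check_call(["dxdiag", "/t", report])

    with open(report, encoding="utf-8", errors="ignore") as f:
        for line in f:
            line = line.strip()
            # One "Card name:" / "Display Memory:" pair appears per display device.
            if line.startswith("Card name:") or line.startswith("Display Memory:"):
                print(line)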

There are two ways to manually use a specific card:

  1. Set it at the BIOS level.
  2. Change it in the NVIDIA Control Panel:

    [screenshot: NVIDIA Control Panel]
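
Whichever way you switch, a quick way to confirm the result afterwards is to list what Windows now sees (again assuming Python; wmic is built into Windows). On many Optimus machines, forcing discrete-only graphics in the BIOS makes the Intel adapter disappear from this list entirely:

    # Quick sanity check after switching: list the adapters Windows reports.
    import subprocess

    print(subprocess.check_output(
        ["wmic", "path", "Win32_VideoController", "get", "Name,Status"],
        universal_newlines=True,
    ))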

  • This is becoming a common way to pack greater graphics capabilities into laptops without sacrificing battery life. Running a full 3D accelerated graphics system in a laptop, even when it's just showing desktop stuff, uses significantly more power. By using "switchable" graphics, the more powerful device can be turned off and on as needed. – music2myear Sep 01 '11 at 18:08
  • That seems ill thought out. Instead of putting **two** graphics adapters in a laptop, they should just make sure that video manufacturers design their chips to use as little power as needed. A high-performance graphics adapter *should not* be using more power if it is doing simple rendering. – Synetech Sep 02 '11 at 03:13
  • @Synetech inc. I agree! – Jonas Stawski Sep 02 '11 at 19:25
  • @Synetech: The discrete adapter has its own GDDR memory chips, etc., which draw just as much power in 2D mode as at full load, even though barely any of the memory is in use. Clock generators, which Intel HD Graphics shares with the CPU cores, are separate on a discrete GPU and consume more power. Idle power for a discrete GPU just can't be as low as an integrated one's, no matter how much you optimize. – Ben Voigt May 06 '12 at 17:48
  • @Ben, that may be true for Intel HDG, but what about laptops that use other architectures like AMD? Do they also use dual adapters? I don’t know about these days, but I recall AMD specifically designing its mobile chipsets to be low-power in the past. – Synetech May 06 '12 at 20:25
  • @Synetech: Same thing there. Intel HD and AMD APU graphics idle with less power because they can take advantage of circuitry shared with the CPU for clocks and memory access. AMD and NVidia discrete GPUs, even the "mobile" editions which are optimized for power consumption, always will need more power, if only a little. – Ben Voigt May 06 '12 at 21:56
  • In my case, I could not change my card from the NVIDIA Control Panel. I had to change it in my BIOS (the NVIDIA card there was called "discrete graphics"). – Tripartio Feb 08 '21 at 16:03