I don't think the game is using my discrete GPU

In my NVIDIA Control Panel, the auto-select option is set to “integrated”, so I changed it to the NVIDIA GPU, but I swear the game is still using the integrated one. I get horrible performance, and some of the graphics settings are grayed out (the ones in the right column).

Hey there Stuffedtiger! Try resetting your in-game options to see if this forces the game to use your NVIDIA GPU instead of the integrated graphics.

If that doesn’t do the trick, please post a ‘DxDiag’ so we can further investigate the problem. Here are the instructions:

  1. Press Windows Key + R.
  2. Type DxDiag and press Enter.
  3. In the DxDiag window, click Save All Information.
  4. Name the file “dxdiag” and click Save.
  5. Open the saved file, then copy and paste the contents of the text document into your post, with four tilde (~) marks on the line directly above the DxDiag. It’ll look like this:

~~~~
DXDiag goes here

If you have issues pasting here, please use Pastebin and post the link (ex: Pastebin (dot) com/123456).
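If the steps above feel fiddly, the same report can be generated non-interactively: the stock `dxdiag` tool accepts a `/t` switch that writes the full report straight to a text file. A minimal sketch (the output filename is just an example; the helper returns False on non-Windows systems):

```python
# Sketch: dump the full DxDiag report to a text file without the GUI.
# "dxdiag /t <path>" is a documented switch of the stock Windows tool.
import platform
import subprocess

def save_dxdiag(path: str) -> bool:
    """Write the DxDiag report to `path`. Returns False off Windows."""
    if platform.system() != "Windows":
        return False  # dxdiag.exe only exists on Windows
    # With /t, dxdiag gathers its data, writes the report, and exits.
    subprocess.run(["dxdiag", "/t", path], check=True)
    return True

if __name__ == "__main__":
    print(save_dxdiag("dxdiag.txt"))
```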

There are reset options for my other games, but not for WC3. Anyway, here’s my DXDIAG: https://pastebin.com/raw/2pLhxfS5

As a quick band-aid fix when that happens, you can open Device Manager, expand the display adapters section, and disable the Intel integrated (Intel HD) graphics device. That way the game runs smoothly, because it’s now forced to use your NVIDIA graphics card.

I think the Intel driver is there to run lower-intensity work to save power and resources, and it also drives the display before you’ve installed your GPU driver, I guess. But I haven’t seen any downside of disabling the Intel driver myself.

Some people have reported that Warcraft III respects the graphics performance preference feature of Windows 10, using whichever GPU it specifies on hybrid-GPU systems. Add Warcraft III to the list of managed applications and then set it to use the discrete GPU.
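For reference, that Windows 10 Settings page stores its per-app choice under the registry key `HKCU\Software\Microsoft\DirectX\UserGpuPreferences`. A sketch of setting it programmatically, assuming a default install path for Warcraft III (substitute your own; the helper returns False off Windows):

```python
# Sketch: write the per-app GPU preference that the Windows 10
# "Graphics settings" page manages. "GpuPreference=2;" requests the
# high-performance (discrete) GPU; "GpuPreference=1;" power saving.
import platform

KEY_PATH = r"Software\Microsoft\DirectX\UserGpuPreferences"

def prefer_discrete_gpu(exe_path: str) -> bool:
    """Register exe_path for the high-performance GPU. False off Windows."""
    if platform.system() != "Windows":
        return False  # winreg and this registry key only exist on Windows
    import winreg  # stdlib, Windows-only
    with winreg.CreateKey(winreg.HKEY_CURRENT_USER, KEY_PATH) as key:
        winreg.SetValueEx(key, exe_path, 0, winreg.REG_SZ, "GpuPreference=2;")
    return True

if __name__ == "__main__":
    # Hypothetical install path; point this at your own Warcraft III exe.
    print(prefer_discrete_gpu(
        r"C:\Program Files (x86)\Warcraft III\x86_64\Warcraft III.exe"))
```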

I can’t do that, as this is a laptop and the screen is connected to the Intel chip.

I’ve already tried that. I even tried right-clicking the shortcut and using the “run with GPU” feature. Still no dice.

EDIT: I found the Windows 10 feature that controls the GPU used, as per DrSuperGood’s instructions, but it still didn’t work.

This doesn’t work for me either.

Mine is a laptop too, but it works for me.

https://imgur.com/a/iwUmsrv

I don’t know, maybe it only works for me?

I disabled the driver (in device manager) and this is what I get:

https://imgur.com/a/O1MO6uc

I’m not sure how I would get it to run on the NVIDIA GPU.

EDIT: I just tried to set the default global GPU in the NVIDIA control panel but the game still seems to be using the Intel chip.

Something seems wrong with your display…

If you are using the UK or US locale, then it is refreshing at 1 Hz, i.e. 1 FPS. That is basically a slide show, and no game would ever look good at such a slow refresh rate.

If you are using a central European locale, then it is refreshing at one thousand Hz (1,000 in UK/US notation). No such consumer display currently exists on the market, and even if it did, there is no CPU or GPU capable of achieving that in common applications.

For contrast, my display shows 60 Hz in that field. Some high-end gaming-oriented displays might show around 240 Hz.
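Assuming the field in the screenshot reads “1.000”, the two readings above come down to nothing more than the decimal-separator convention; a quick sketch:

```python
# The same on-screen string parses to wildly different refresh rates
# depending on whether "." is a decimal point or a thousands separator.
raw = "1.000"  # assumed value of the refresh-rate field

us_hz = float(raw)                   # UK/US: "." is the decimal point
eu_hz = float(raw.replace(".", ""))  # Central Europe: "." groups thousands

print(us_hz)  # 1.0
print(eu_hz)  # 1000.0
```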

That was a reply to Johaylon, since his suggestion was to disable my Intel driver (to force the use of the NVIDIA GPU). Of course, with the driver active I am on 144 Hz.

https://imgur.com/a/Do2QvSZ

I’m clueless about why this happens; it works for me. Sorry I couldn’t help more.

OK, so I might get flamed for this, but I never actually started a match of WC3 because the performance on the menu screen was such garbage. Plus, with some of the video options grayed out, I thought I was doing something wrong.

I’m pretty sure the game is using my dGPU, since I saw the dedicated memory usage go up when I launched it, and the in-game performance is good (I played the campaign). Are we saving frames on the menu for a reason?

In the Nvidia Control Panel, under “Desktop”, there’s an option to display the “GPU Activity Icon”:

Once enabled, an icon will appear in the system tray that will show which apps are using the discrete GPU. In the example below, I launched WC3 and SC2:

Thanks for that. I’ll remember that for the future.

Just some food for thought for everyone here.

It’s been a while since I bought a laptop, so things may have changed over the last eight or so years. But depending on the make of your laptop/iGPU/dGPU, it may not be beneficial to outright disable the iGPU.

For example, my laptop has an integrated Intel GPU and a discrete NVIDIA GPU. However, the NVIDIA is not an actual standalone GPU; it requires the Intel to work via a technology called “Optimus”, where the NVIDIA renders frames but the Intel actually drives the display. That means if I disable the integrated Intel, things get anywhere from worse to not working at all, because the discrete NVIDIA needs the Intel to render properly/optimally.

So in my case, I must leave both enabled and use the NVIDIA Control Panel to set which apps use the NVIDIA GPU. For me, anyway, WC3 automatically uses the NVIDIA without my having to set it manually. You can check which apps are using the NVIDIA GPU in real time with the tray icon from my reply two posts above.

True. That’s why I tried disabling it, and it worked, but I’m not playing anyway because I don’t want to disable my Intel entirely (even though it doesn’t seem to cause any problems). I’m waiting for newer technology.

So a lot of things ended up being said in this thread, but it seems like things are good for the OP, right?

The only thing I really saw wrong with anything you did in this whole thread was that your Intel drivers were old. If you end up with performance issues, I’d update those; otherwise you should be good to go.

Johaylon,

If you’re still having an issue, let’s move to a different thread for you so that we can troubleshoot things individually, all right? I’d repeat what I mentioned for the OP, though: make sure your drivers are totally up to date before going further. That’s often all it takes.

Yes, the game is indeed using my dGPU, as evidenced by the GPU activity icon from the NVIDIA Control Panel.

As an FYI, I’m using the drivers posted by my laptop manufacturer, so if they’re old, that’s why.