0 votes
258 views
in Technique by (71.8m points)

c++ - Forcing Machine to Use Dedicated Graphics Card?

I am part of a team developing an application using C++ with SDL and OpenGL.

On laptops, when the application is run, the dedicated graphics card is not used, and creating the GL context fails because the integrated graphics card does not support the OpenGL version we want.

I suspect this problem is specific to the laptop in question and not something we can solve through code, but if anyone knows of a solution, that would be great.
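For reference, the setup looks roughly like this (a trimmed-down sketch; the window parameters and the exact GL version requested here are placeholders):

#include <SDL.h>
#include <cstdio>

int main(int, char**)
{
  SDL_Init(SDL_INIT_VIDEO);

  // Ask for a core profile newer than the integrated GPU supports.
  SDL_GL_SetAttribute(SDL_GL_CONTEXT_MAJOR_VERSION, 4);
  SDL_GL_SetAttribute(SDL_GL_CONTEXT_MINOR_VERSION, 3);
  SDL_GL_SetAttribute(SDL_GL_CONTEXT_PROFILE_MASK, SDL_GL_CONTEXT_PROFILE_CORE);

  SDL_Window* window = SDL_CreateWindow("App",
                                        SDL_WINDOWPOS_CENTERED, SDL_WINDOWPOS_CENTERED,
                                        800, 600, SDL_WINDOW_OPENGL);

  // On the affected laptops this returns NULL because the integrated GPU
  // is selected and cannot provide the requested version.
  SDL_GLContext context = SDL_GL_CreateContext(window);
  if (!context)
    std::fprintf(stderr, "SDL_GL_CreateContext failed: %s\n", SDL_GetError());

  SDL_Quit();
  return 0;
}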


1 Answer

0 votes
by (71.8m points)

The easiest way from C++ to ensure that the dedicated graphics card is used instead of the integrated GPU on switchable-graphics laptops under Windows is to export the following symbols from the application executable (MSVC sample code; the drivers look for these exports in the .exe, not in a DLL):

Enable dedicated graphics for NVIDIA:

extern "C"
{
  // A non-zero value asks the NVIDIA Optimus driver to prefer the
  // high-performance (dedicated) GPU for this application.
  __declspec(dllexport) unsigned long NvOptimusEnablement = 0x00000001;
}

Enable dedicated graphics for AMD Radeon:

extern "C"
{
  // A value of 1 asks the AMD PowerXpress / switchable-graphics driver to
  // prefer the high-performance (dedicated) GPU for this application.
  __declspec(dllexport) int AmdPowerXpressRequestHighPerformance = 1;
}

Caveat: if the user has already created a driver profile forcing the application onto the integrated chipset, these exports will not override it.

I am unsure whether this works similarly under Linux / macOS (unlikely).
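To check which GPU actually ended up driving the context, you can log the GL strings right after context creation. A minimal sketch (assumes an SDL2 / OpenGL setup like the one in the question; glGetString is core OpenGL, so no extension loader is needed for these calls):

#include <SDL.h>
#include <SDL_opengl.h>
#include <cstdio>

// Call this once a GL context has been created and made current.
void LogActiveGpu()
{
  const char* vendor   = reinterpret_cast<const char*>(glGetString(GL_VENDOR));
  const char* renderer = reinterpret_cast<const char*>(glGetString(GL_RENDERER));
  const char* version  = reinterpret_cast<const char*>(glGetString(GL_VERSION));

  // On an Optimus / PowerXpress laptop, an "Intel" vendor string here means the
  // exports (or a driver profile) did not route the application to the dedicated GPU.
  std::printf("GL_VENDOR:   %s\n", vendor   ? vendor   : "(null)");
  std::printf("GL_RENDERER: %s\n", renderer ? renderer : "(null)");
  std::printf("GL_VERSION:  %s\n", version  ? version  : "(null)");
}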

