I built a simple OSG off-screen renderer that renders without popping up a window:
osg::ref_ptr<osg::GraphicsContext::Traits> traits = new osg::GraphicsContext::Traits;
traits->x = 0;
traits->y = 0;
traits->width = screenWidth;
traits->height = screenHeight;
if (offScreen) {
    traits->windowDecoration = false;
    traits->doubleBuffer = true;
    traits->pbuffer = true;
} else {
    traits->windowDecoration = true;
    traits->doubleBuffer = true;
    traits->pbuffer = false;
}
traits->sharedContext = 0;
std::cout << "DisplayName : " << traits->displayName() << std::endl;
traits->readDISPLAY();
osg::GraphicsContext* _gc = osg::GraphicsContext::createGraphicsContext(traits.get());
if (!_gc) {
    osg::notify(osg::NOTICE) << "Failed to create pbuffer, falling back to normal graphics window." << std::endl;
    traits->pbuffer = false;
    _gc = osg::GraphicsContext::createGraphicsContext(traits.get());
}
However, when I ssh into the server and run the application, it actually uses the client GPU rather than the server GPU. The server has four GeForce GPUs. I tried changing DISPLAY to hostname:0.0, but that didn't work.
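As a rough sketch of the failed attempt (here "hostname", "user@server", and "./myOsgApp" are placeholders, not names from the actual setup), note that a DISPLAY of the form hostname:0.0 directs X/GLX traffic to the X server running on that host, which is why it need not select the GPUs of the machine where the process runs:

```shell
# ssh user@server              # log in to the render server first
export DISPLAY=hostname:0.0    # X protocol now targets the X server on "hostname"
echo "DISPLAY is now: $DISPLAY"
# ./myOsgApp                   # hypothetical name for the off-screen renderer
```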
What should I do to make the application use the server GPU instead of the client GPU on Linux?