This is clearly a call to the WebGL gurus:

I am rendering with a second (depth) camera to a 2x2-pixel render target, but I only read back a single pixel.

I have tried:

renderer.readRenderTargetPixels() in Three.js (which I believe is the same as gl.readPixels in pure WebGL).

getImageData() after rendering to a canvas.

Both turn out to be painfully slow: in my case they add up to 25 ms, even for a 2x2-pixel render target. gl.readPixels in particular has already been called out as unreasonably inefficient here: What is the correct way of using gl.readPixels?

So I am looking for a solution, any solution, that reads that single pixel efficiently and puts it into a JS array or object. Thank you.
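For reference, the synchronous readback I am timing boils down to this (a sketch with hypothetical names; `renderer` and `target` stand for the THREE.WebGLRenderer and WebGLRenderTarget from the demo below):

```javascript
// Sketch of the synchronous 1-pixel readback being timed; the call
// blocks the CPU until the GPU has finished rendering into the target.
function readOnePixelSync(renderer, target) {
  const px = new Uint8Array(4);                            // one RGBA pixel
  const t0 = performance.now();
  renderer.readRenderTargetPixels(target, 0, 0, 1, 1, px); // full GPU stall here
  const t1 = performance.now();
  return { pixel: px, ms: t1 - t0 };                       // ms is the stall cost
}
```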
UPDATE:

- I updated the title and body to make them more accurate.
- I created a JSFiddle that demonstrates the readRenderTargetPixels() (equivalent to gl.readPixels()) latency. I tried to make it as simple as possible, but not... simpler :)

Notes / how to use it:
The demo runs in both normal and WebXR mode. You do not have to enter VR to see the huge latency. JSFiddle has a BUG that prevents the VRButton from showing on mobile. To enter VR on mobile you need to copy the code into an index.html and serve it over a secure local server (HTTPS) on your WiFi, or use a JSFiddle alternative that does not make the VRButton... invisible (I have tested many, and I could not find one that works!).
The demo shows render statistics for the last 10 frames (minimum and maximum render time in milliseconds), displayed on screen for convenience in both normal and WebXR mode, so it is easy to read on mobile as well.
There is a main camera and a depth camera. The depth camera has a resolution of 2x2 pixels and a FOV of 0.015625 degrees. It points at a rotating cube, measures the depth, and draws a ray and a dot on the cube's surface. I even optimized the code to eliminate expensive decoding math in JS.
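The depth decode that remains in the render loop is just the standard three.js RGBADepthPacking unpack followed by the perspective depth-to-viewZ conversion. Spelled out with the byte weights as plain fractions (the demo inlines them as decimal constants), it is:

```javascript
// Unpack a three.js RGBADepthPacking pixel (Uint8Array [r,g,b,a]) back
// into a normalized depth value in [0,1). Byte weights are 1/256^4,
// 1/256^3, 1/256^2 and 1/256, matching the inlined decimal constants.
function unpackRGBADepth(pb) {
  return pb[0] / 4294967296 +  // 1/256^4
         pb[1] / 16777216 +    // 1/256^3
         pb[2] / 65536 +       // 1/256^2
         pb[3] / 256;          // 1/256
}

// Convert packed perspective depth to view-space Z (negative, in world
// units), using the depth camera's near/far planes.
function perspectiveDepthToViewZ(d, near, far) {
  return (near * far) / ((far - near) * d - far);
}
```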
I am mostly interested in mobile, where the latency is much higher, so for your convenience I have provided QR codes. To test it on your phone, scan QR code #1 below, wait a minute for it to stabilize, then scan the second QR code to compare. To view/edit the code in JSFiddle, remove "show" from the URL and reload the page.
Please test the demo in Chrome. If you run it on a PC, it is also important to wait a minute before judging the latency. I noticed that Firefox on PC has far lower latency than Chrome on PC, and is more stable, but the feature I am interested in is not supported in FF, so Chrome is all I care about. On my PC, Chrome starts with a render time of about 5 ms (still long compared to the 1 to 2 ms without the feature), and after a while it doubles and triples. On mobile it is almost always high, between 15 and 30 ms (on a powerful mobile device).
readRenderTargetPixels() ON:
https://jsfiddle.net/dlllb/h3ywjeud/show
readRenderTargetPixels() OFF:
https://jsfiddle.net/dlllb/mbk4Lfy1/show
<!DOCTYPE html>
<html lang="en" >
<head>
<title>READPIXELS LATENCY</title>
<meta charset="UTF-8">
<meta name="viewport" content="width=device-width, initial-scale=1.0, maximum-scale=1.0, user-scalable=no, viewport-fit=cover" />
<meta name="mobile-web-app-capable" content="yes">
<meta name="HandheldFriendly" content="true" />
<style>
html{height: 100%}
body {text-align:center; background-color: #000000; height: 100%; margin: 0px; overflow: hidden}
#info { position: absolute; top: 0px; width: 800px; margin: 10px; text-align: left; z-index: 90; display:none}
</style>
</head>
<body>
<div id="container"></div>
<script type="module" defer>
import * as THREE from "https://threejs.org/build/three.module.js";
import { VRButton } from "https://threejs.org/examples/jsm/webxr/VRButton.js";
var win = {
x: window.innerWidth,
y: window.innerHeight
}
// multiply deg with deg2rad to find radians (rad = deg * Pi/180)
const deg2rad = 0.017453292519943295769236907685;
var tmp;
var startms = 0;
var endms = 0;
var deltams = 0;
var fcounter = 0;
var fbuffer = [0,0,0,0,0,0,0,0,0,0];
var maxms = 0;
var minms = 1000;
let avframes = 10;
//_________________________SCENE___________________________________________
var scene = new THREE.Scene();
scene.background = new THREE.Color( "#346189" );
//___________________xrRig __xrCam________________________________________
var xrRig = new THREE.Object3D();
scene.add(xrRig);
var fov = 50;
var aspect = win.x/win.y;
var cnear = 10;
var cfar = 4000;
var xrCam = new THREE.PerspectiveCamera( fov, aspect, cnear, cfar );
xrRig.add(xrCam);
xrRig.position.set(0, 20, 125);
//___________________ depthRig ____ depthCam ____________________________
var depthRig = new THREE.Object3D();
scene.add(depthRig);
var dres = 2;
var lfov = 0.015625;
var laspect = 1;
var lnear = 1;
var lfar = 2000;
var depthCam = new THREE.PerspectiveCamera( lfov, laspect, lnear, lfar );
depthRig.add(depthCam);
depthRig.position.set(40, 0, 50);
depthRig.rotateOnAxis(new THREE.Vector3(0,1,0), 40 * deg2rad)
// show camera cone (depth won't work)
// const helper = new THREE.CameraHelper( depthCam );
// scene.add( helper );
//_________________________________________________________________
var depthTarget = new THREE.WebGLRenderTarget( dres, dres );
depthTarget.texture.format = THREE.RGBAFormat;
// depthTarget.texture.minFilter = THREE.NearestFilter;
// depthTarget.texture.magFilter = THREE.NearestFilter;
// depthTarget.texture.generateMipmaps = false;
// depthTarget.stencilBuffer = false;
// depthTarget.depthBuffer = true;
// depthTarget.depthTexture = new THREE.DepthTexture();
// depthTarget.depthTexture.format = THREE.DepthFormat;
// depthTarget.depthTexture.type = THREE.UnsignedShortType;
var depthMaterial = new THREE.MeshDepthMaterial({depthPacking: THREE.RGBADepthPacking});
var pb = new Uint8Array(4);
var onpos = new THREE.Vector3();
//_________________________________________________________________
const Dlight = new THREE.DirectionalLight( 0xffffff, 1);
Dlight.position.set( 0, 1000, 1000 );
scene.add( Dlight );
//_________________________________________________________________
// *** WebGLRenderer XR ***
var renderer = new THREE.WebGLRenderer({ antialias: true, precision:'highp'});
renderer.setPixelRatio( window.devicePixelRatio );
renderer.setSize( win.x, win.y );
renderer.autoClear = false;
renderer.xr.enabled = true;
var cont = document.getElementById( 'container' );
cont.appendChild( renderer.domElement );
document.body.appendChild( VRButton.createButton( renderer ) );
//____________ LASER RAY VARS _________________________________
var startray = new THREE.Vector3();
var endray = new THREE.Vector3();
var raygeom = new THREE.BufferGeometry();
var points = [startray, endray];
raygeom.setFromPoints( points );
var rayline = new THREE.Line( raygeom, new THREE.MeshBasicMaterial({color: 0xff0000}) );
scene.add(rayline);
var marker = new THREE.Mesh(new THREE.SphereGeometry(0.8), new THREE.MeshBasicMaterial({color: 0xff0000}));
scene.add(marker);
//____________ CUBE _________________________________
var cubeGeometry = new THREE.BoxGeometry(40,40,40);
var material = new THREE.MeshStandardMaterial({color: "#eabf11"});
var cube = new THREE.Mesh(cubeGeometry, material);
scene.add(cube);
cube.position.set(0,0,0);
// ______________________VERTICAL_PLANE____________________________
var ccw = 500;
var cch = 150;
var vplane = new THREE.PlaneGeometry( ccw, cch ); // PlaneBufferGeometry is gone from current three.js builds
var vmap = new THREE.MeshBasicMaterial();
var m = new THREE.Mesh( vplane, vmap );
m.visible = false;
m.position.set(0, 150, -1000);
scene.add( m );
//_________ CANVAS _______________
var canvas = document.createElement("canvas");
var ctx = canvas.getContext("2d");
ctx.canvas.width = ccw;
ctx.canvas.height = cch;
ctx.font = "60px Arial";
ctx.textBaseline = "top";
var img, tex;
function drawtext(t1, t2){
ctx.clearRect(0, 0, ccw, cch);
ctx.fillStyle = "#346189";
ctx.fillRect(0, 0, ccw, cch);
ctx.fillStyle = "#ffffff";
ctx.fillText(t1, 100, 10, ccw);
ctx.fillText(t2, 100, 80, ccw);
img = ctx.getImageData(0, 0, ccw, cch);
tex = new THREE.Texture(img);
tex.needsUpdate = true;
m.material.map = tex;
m.material.needsUpdate = true;
tex.dispose();
m.visible = true;
}
//_________________________________
window.addEventListener('resize', onResize, false);
renderer.setAnimationLoop( render );
//_________handle_window_resizing___________________________
function onResize(){
if (renderer.xr.isPresenting) return;
win.x = window.innerWidth;
win.y = window.innerHeight;
xrCam.aspect = win.x / win.y;
xrCam.updateProjectionMatrix();
renderer.setSize( win.x, win.y );
}
// ____________________render_frame______________________________
function render() {
startms = Date.now();
renderer.clear();
cube.rotation.y += 0.01;
renderer.xr.enabled = false;
//---------------------- RENDER RGBA-Depth to depthTarget--------------------------
renderer.setClearColor("#ffffff", 1);
rayline.visible = false;
marker.visible = false;
renderer.setRenderTarget(depthTarget);
scene.overrideMaterial = depthMaterial;
renderer.render(scene, depthCam);
// ******* COMMENT-OUT THE FOLLOWING LINE TO COMPARE ******
renderer.readRenderTargetPixels(depthTarget, dres/2, dres/2, 1, 1, pb);
// unpack RGBADepthPacking on the CPU: pb[0]/256^4 + pb[1]/256^3 + pb[2]/256^2 + pb[3]/256
var dp = pb[0]*0.0000000002328306436538696
+ pb[1]*0.00000005960464477539063
+ pb[2]*0.0000152587890625
+ pb[3]*0.00390625;
var viewZ = (lnear * lfar) / ((lfar - lnear) * dp - lfar);
var midZ = viewZ;
if (viewZ < -lfar) {
midZ = -lfar;
}
onpos.set(0, 0, 0.5).applyMatrix4(depthCam.projectionMatrixInverse);
onpos.multiplyScalar(midZ / onpos.z);
onpos.applyMatrix4(depthCam.matrixWorld);
startray = new THREE.Vector3();
depthCam.getWorldPosition(startray);
raygeom.attributes.position.setXYZ(0, startray.x, startray.y, startray.z);
raygeom.attributes.position.setXYZ(1, onpos.x, onpos.y, onpos.z);
raygeom.attributes.position.needsUpdate = true;
//-------------------- RENDER NORMAL SCENE ------------------------
renderer.setClearColor("#346189", 1);
renderer.xr.enabled = true;
rayline.visible = true;
marker.visible = true;
marker.position.copy(onpos);
scene.overrideMaterial = null;
renderer.setRenderTarget(null);
renderer.render( scene, xrCam );
//------- delta time statistics for the last 10 frames -----------
endms = Date.now();
deltams = endms - startms;
// shift the last frame times one slot and track min/max (valid indices 0..avframes-1)
for (let f = avframes - 2; f >= 0; f--){
tmp = fbuffer[f];
minms = Math.min(tmp, minms);
maxms = Math.max(tmp, maxms);
fbuffer[f + 1] = tmp;
}
fbuffer[0] = deltams;
fcounter++;
if (fcounter === avframes){
fcounter = 0;
drawtext("max-ms:"+maxms, "min-ms:"+minms);
minms = 1000;
maxms = 0;
}
}//end render() _______________________________________________________________________________
</script>
</body>
</html>
UPDATE #2

Andre van Kammen's mod: the latency is still high on mobile (Xiaomi Redmi Note 9S).

Video taken with a webcam: https://www.bitchute.com/video/5WJxdo649KiF/
There is an article about Pixel Buffer Objects (PBO): http://www.songho.ca/opengl/gl_pbo.html It looked promising, since it is about reading GPU data asynchronously without stalling the GPU... until I tried the demo, which toggles PBO off and on with the space key, and got zero (0) difference on my PC: http://www.songho.ca/opengl/files/pboUnpack.zip
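For what it is worth, the WebGL2 equivalent of the PBO trick the article describes would look roughly like this (a hedged sketch only, names are illustrative and I have not verified it helps on mobile): start the read into a PIXEL_PACK_BUFFER, insert a fence, and fetch the data only after the fence has signalled, so getBufferSubData never stalls on the GPU:

```javascript
// Kick off an async 1-pixel read; gl must be a WebGL2RenderingContext.
// readPixels with a bound PIXEL_PACK_BUFFER returns immediately.
function startAsyncRead(gl, x, y) {
  const buf = gl.createBuffer();
  gl.bindBuffer(gl.PIXEL_PACK_BUFFER, buf);
  gl.bufferData(gl.PIXEL_PACK_BUFFER, 4, gl.STREAM_READ);
  gl.readPixels(x, y, 1, 1, gl.RGBA, gl.UNSIGNED_BYTE, 0); // into the PBO
  gl.bindBuffer(gl.PIXEL_PACK_BUFFER, null);
  const sync = gl.fenceSync(gl.SYNC_GPU_COMMANDS_COMPLETE, 0);
  gl.flush();
  return { buf, sync };
}

// Poll this once per frame; returns true and fills `out` (Uint8Array(4))
// only when the GPU has finished, so the copy itself does not stall.
function tryFinishAsyncRead(gl, pending, out) {
  if (gl.clientWaitSync(pending.sync, 0, 0) === gl.TIMEOUT_EXPIRED) {
    return false; // not ready yet, try again next frame
  }
  gl.deleteSync(pending.sync);
  gl.bindBuffer(gl.PIXEL_PACK_BUFFER, pending.buf);
  gl.getBufferSubData(gl.PIXEL_PACK_BUFFER, 0, out); // data is already resident
  gl.bindBuffer(gl.PIXEL_PACK_BUFFER, null);
  gl.deleteBuffer(pending.buf);
  return true;
}
```

The price is one-or-more frames of latency on the depth value itself, which may or may not be acceptable for the ray/marker drawing in the demo.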
So, apparently, 30 years after gl.readPixels was introduced, the technology still fails to provide a reliable and efficient way to read that damn pixel... which is a real shame. I design hardware, and one thing I have learned over the years is that every problem in electronics has a solution, and the same goes for software and every other field. Apparently, for some parts of the industry, it is sloppiness first, not performance first. Please prove me wrong.