I'm currently following this guide to render my scene to a texture to generate a depth / shadow map: http://www.opengl-tutorial.org/intermediate-tutorials/tutorial-14-render-to-texture/
The guide is in C++. I'm converting it to WebGL/JavaScript and so far have been successful, but unfortunately I've come across this error in Chrome:
WebGL: INVALID_OPERATION: texImage2D: ArrayBufferView not big enough for request
This is an error relating to:
gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGB, 1024, 768, 0, gl.RGB, gl.UNSIGNED_BYTE, new Uint8Array([0, 0, 0, 0]));
When the width and height of 1024 and 768 are both set to 1, it does not produce an error.
In the guide, it uses the following:
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, 1024, 768, 0, GL_RGB, GL_UNSIGNED_BYTE, 0);
There is a great answer to a similar question here: Error when creating textures in WebGL with the RGB format. It leads me to believe that, since the texture does not exist at the time the method is called, it cannot be bigger than 1 pixel, but I'm unsure whether this is correct. EDIT: This is not a duplicate of that question, for two reasons: 1) I would not have asked this question if it were a duplicate, and 2) the answer explains why it is not a duplicate.
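To make the comparison concrete, here is a minimal repro of my own (not from the guide), assuming the same gl.RGB / gl.UNSIGNED_BYTE format:
// 1x1 upload: RGB needs 3 bytes, 4 are supplied, so this is accepted.
gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGB, 1, 1, 0, gl.RGB, gl.UNSIGNED_BYTE, new Uint8Array([0, 0, 0, 0]));
// 1024x768 upload: RGB needs 1024 * 768 * 3 bytes, only 4 are supplied, so Chrome reports INVALID_OPERATION.
gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGB, 1024, 768, 0, gl.RGB, gl.UNSIGNED_BYTE, new Uint8Array([0, 0, 0, 0]));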
I'll dump the rest of the converted code from the guide below:
// shadow test
this.frameBuffer = gl.createFramebuffer();
gl.bindFramebuffer(gl.FRAMEBUFFER, this.frameBuffer);
this.texture = gl.createTexture();
gl.bindTexture(gl.TEXTURE_2D, this.texture);
gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGB, 1024, 768, 0, gl.RGB, gl.UNSIGNED_BYTE, new Uint8Array([0, 0, 0, 0]));
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.NEAREST);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MAG_FILTER, gl.NEAREST);
this.depthBuffer = gl.createRenderbuffer();
gl.bindRenderbuffer(gl.RENDERBUFFER, this.depthBuffer);
gl.renderbufferStorage(gl.RENDERBUFFER, gl.DEPTH_COMPONENT16, 1024, 768);
gl.framebufferRenderbuffer(gl.FRAMEBUFFER, gl.DEPTH_ATTACHMENT, gl.RENDERBUFFER, this.depthBuffer);
gl.framebufferTexture2D(gl.FRAMEBUFFER, gl.COLOR_ATTACHMENT0, gl.TEXTURE_2D, this.texture, 0);
gl.drawBuffers([gl.NONE, gl.COLOR_ATTACHMENT0_EXT]);
gl.bindFramebuffer(gl.FRAMEBUFFER, this.buffer);
gl.viewport(0, 0, 1024, 768);
I've tagged this with C++ and OpenGL for help in understanding how this method differs between WebGL/JavaScript and OpenGL/C++.
asked Jan 20, 2019 at 12:43 by user8826104
Use a WebGL guide instead of an OpenGL guide? BTW, this is really the same in OpenGL and WebGL. The difference is that WebGL catches the error, while OpenGL just reads past the end of the buffer you supply, causing both a potential security issue and a possible crash. In other words, your WebGL code is equivalent to something like this in C:
glTexImage2D(....., 1024, 768, ...., calloc(4, 1));
This would allocate 4 bytes, pass a pointer to them to OpenGL, and tell OpenGL to read 2359296 bytes (1024 * 768 * 3) from the memory where those 4 bytes you allocated are. – user128511, Jan 20, 2019 at 14:22
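As a quick sanity check (my own sketch, not part of the comment), the size WebGL expects for that upload can be computed directly, assuming gl.RGB with gl.UNSIGNED_BYTE (3 bytes per pixel):
// Bytes texImage2D needs for a 1024x768 RGB/UNSIGNED_BYTE upload.
var width = 1024, height = 768, bytesPerPixel = 3;
var required = width * height * bytesPerPixel; // 2359296 bytes
console.log(required + ' bytes required, but new Uint8Array([0, 0, 0, 0]) supplies only 4');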
1 Answer
The error you're getting has nothing to do with unpack alignment but with the fact that you simply cannot fill a 1024x768 texture with just 4 bytes. texImage2D requires you to either provide null (in which case a zero-filled buffer of the size of the texture is allocated and used to initialize the texture) or a buffer the size of the texture, which in your case would be 1024 * 768 * 3 bytes (which, coincidentally, is a multiple of 4, so you would not run into any unpacking issues).
Here's the relevant excerpt of the WebGL 1 specification:
If pixels is null, a buffer of sufficient size initialized to 0 is passed. [...] If pixels is non-null but its size is less than what is required by the specified width, height, format, type, and pixel storage parameters, generates an INVALID_OPERATION error.
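Putting that into practice, here is a minimal sketch of the two working variants for the texture in the question (assuming the same 1024x768 size and gl.RGB / gl.UNSIGNED_BYTE format):
// Option 1: pass null and let WebGL allocate and zero-fill a buffer of the correct size.
gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGB, 1024, 768, 0, gl.RGB, gl.UNSIGNED_BYTE, null);
// Option 2: supply an ArrayBufferView that covers the whole texture
// (3 bytes per pixel; a freshly constructed Uint8Array is already zero-filled).
gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGB, 1024, 768, 0, gl.RGB, gl.UNSIGNED_BYTE, new Uint8Array(1024 * 768 * 3));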