I'm trying to render to a single-channel texture attached to a framebuffer, but it's not working. I create the texture with a sized internal format:
glCreateTextures(GL_TEXTURE_2D, 1, &entityTexture);
glTextureStorage2D(entityTexture, 1, GL_R32F, viewport.size.x, viewport.size.y);
glTextureParameteri(entityTexture, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
glTextureParameteri(entityTexture, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
UPDATE: I then attach it to my framebuffer (GBuffer) together with some other attachments, and enable all four color attachments as draw buffers:
std::array<GLenum, 4> colorAttachments = {
GL_COLOR_ATTACHMENT0,
GL_COLOR_ATTACHMENT1,
GL_COLOR_ATTACHMENT2,
GL_COLOR_ATTACHMENT3,
};
glNamedFramebufferDrawBuffers(GBuffer, static_cast<GLsizei>(colorAttachments.size()), colorAttachments.data());
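The attachment calls themselves aren't shown above, so for comparison, here is a minimal sketch of what I'd expect them to look like (GBuffer and entityTexture are the names from above; binding the entity texture to GL_COLOR_ATTACHMENT3 is an assumption based on the shader's location = 3 output):

```cpp
// Attach the R32F texture as color attachment 3 (index assumed to match
// the shader's layout(location = 3) output).
glNamedFramebufferTexture(GBuffer, GL_COLOR_ATTACHMENT3, entityTexture, 0);

// Check completeness before drawing; GL_R32F is color-renderable,
// so this should report GL_FRAMEBUFFER_COMPLETE.
if (glCheckNamedFramebufferStatus(GBuffer, GL_DRAW_FRAMEBUFFER) != GL_FRAMEBUFFER_COMPLETE) {
    // handle the error
}
```

This is a fragment rather than a runnable program, since it needs a live GL context.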
At the top of the fragment shader I declare the outputs:
layout(location = 0) out vec4 gNormal;
layout(location = 1) out vec4 gColor;
layout(location = 2) out vec4 gMetallicRoughness;
layout(location = 3) out float gEntityID;
Then I write an arbitrary test value to it in the fragment shader (while drawing a bunch of meshes):
gEntityID = 12.3;
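For completeness, this is roughly how the attachment would be cleared each frame (a sketch, not shown above; note that the drawbuffer argument of glClearNamedFramebufferfv is an index into the draw-buffer list set earlier, 3 here, not the attachment enum):

```cpp
// Clear only draw buffer 3 (the R32F entity attachment) to a sentinel value;
// only the first component matters for a single-channel attachment.
const float clearValue[4] = { -1.0f, 0.0f, 0.0f, 0.0f };
glClearNamedFramebufferfv(GBuffer, GL_COLOR, 3, clearValue);
```

Again a fragment; it assumes a current GL context and the GBuffer from above.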
When I inspect the texture in RenderDoc, the image turns up black with a value of 0 everywhere. Does all this check out in terms of API usage?
question from:
https://stackoverflow.com/questions/65557538/opengl-render-to-single-channel-texture