
I am using C++, OpenGL 4.0 and the GLSL shading language.

I would like to know how to correctly blend a diffuse texture with a lightmap texture.

Suppose we have a room. Every object has a diffuse texture and a lightmap. On every forum such as gamedev.net or stackoverflow, people say these textures should be multiplied. In most cases that gives good results, but sometimes an object is very close to a light source (for example a white light bulb). For an object that close, the light source produces a white lightmap. But when we multiply the diffuse texture with a white lightmap, we just get the original diffuse texture color back.
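For reference, a minimal fragment-shader sketch of that multiplicative blend (the sampler names DiffuseMap and LightMap and the single TexCoord input are just illustrative assumptions):

#version 400

in  vec2 TexCoord;            // assuming both textures share one set of UVs
out vec4 FragColor;

uniform sampler2D DiffuseMap; // the object's diffuse texture
uniform sampler2D LightMap;   // the baked lightmap

void main()
{
  vec3 diffuse  = texture(DiffuseMap, TexCoord).rgb;
  vec3 lightmap = texture(LightMap, TexCoord).rgb;

  // the commonly recommended blend: modulate the diffuse color by the lightmap;
  // a pure white lightmap (1,1,1) just returns the diffuse color unchanged
  FragColor = vec4(diffuse * lightmap, 1.0);
}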

But if the light source is close to an object, then the light's color should dominate.

This means that if a strong white light is close to a red wall, then some parts of that wall should be white, not red!

I think I need more than just a lightmap. The lightmap has no information about light intensity, which means the brightest possible result is just the full diffuse color.

Maybe I should have 2 textures, a shadow map and a lightmap? Then the equation would look like this:

vec3 color = shadowmapColor * diffuseTextureColor + lightmapColor;

Is this a good approach?


2 Answers


Generally speaking, if you're still using lightmaps, you are probably also not using HDR rendering. And without that, what you want is not particularly reasonable. Unless your light map provides the light intensity as an HDR floating-point value (perhaps in a GL_R11F_G11F_B10F or GL_RGBA16F format), this is not going to work very well.

And of course, you'll have to do the usual stuff that you do with HDR, such as tone mapping and so forth.
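As a rough sketch (assuming the lightmap texels are floating-point and may exceed 1.0, and using Reinhard as a stand-in tone mapping operator, with illustrative names throughout):

in  vec2 TexCoord;
out vec4 FragColor;

uniform sampler2D DiffuseMap;   // LDR albedo
uniform sampler2D HDRLightMap;  // floating-point lightmap (e.g. GL_RGBA16F); texels can exceed 1.0

void main()
{
  vec3 albedo   = texture(DiffuseMap, TexCoord).rgb;
  vec3 lighting = texture(HDRLightMap, TexCoord).rgb;  // a nearby bulb can bake values well above 1.0

  vec3 hdrColor = albedo * lighting;  // the usual multiply, now in HDR

  // Reinhard tone mapping: very bright results are compressed toward white,
  // which gives the "the red wall turns white near the bulb" behavior
  vec3 ldrColor = hdrColor / (hdrColor + vec3(1.0));

  FragColor = vec4(ldrColor, 1.0);
}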

Lastly, your additive equation makes no sense. If the light map color represents the diffuse interaction between the light and the surface, then simply adding the light map color doesn't mean anything. The standard diffuse lighting equation is C * (dot(N, L) * I * D), where I is the light intensity, D is the distance attenuation factor, and C is the diffuse color. The value from the lightmap is presumably the parenthesized quantity. So adding it doesn't make sense.

It still needs to be multiplied with the surface's diffuse color. Any over-brightening will be due to the effective intensity of the light as a function of D.
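To make the relationship concrete, here is a small sketch (with hypothetical names) of that diffuse term, marking which part a lightmap would bake:

// C * (dot(N, L) * I * D), as above; the lightmap stores the parenthesized part
vec3 diffuseLighting(vec3  C,   // surface diffuse color (from the diffuse texture)
                     vec3  N,   // surface normal, normalized
                     vec3  L,   // direction to the light, normalized
                     vec3  I,   // light intensity (color)
                     float D)   // distance attenuation factor
{
  vec3 bakedTerm = max(dot(N, L), 0.0) * I * D;  // this is what the lightmap would store
  return C * bakedTerm;                          // the surface's diffuse color still multiplies it
}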

Answered 2013-09-04T14:23:40.307

What you need is the distance (or, to save some sqrt-ing, the squared distance) from the light source to the fragment being illuminated. Then you can, in the simplest case, interpolate linearly between the lightmap and light source contributions.

The distance is a simple calculation which can be done per vertex in your vertex shader:

in      vec4  VertexPosition; // let's assume world space for simplicity
uniform vec4  LightPosition;  // world space - might also be part of a uniform block etc.
out     float LightDistance;  // pass the distance to the fragment shader

// other stuff you need here ....

void main()
{
  // do stuff 
  LightDistance = length(VertexPosition - LightPosition);
}

In your fragment shader, you use the distance to compute the interpolation factor between the light source and lightmap contributions:

in      float     LightDistance;
const   float     MAX_DISTANCE = 10.0;
uniform sampler2D LightMap;

// other stuff ...

out vec4 FragColor;

void main()
{
  vec4 LightContribution;
  // calculate illumination (including shadow map evaluation) here
  // store in LightContribution

  vec4 LightMapContribution = texture(LightMap, /* tex coords here */);

  // The following DistanceFactor will map distances in the range [0, MAX_DISTANCE] to
  // [0,1]. The idea is that at LightDistance >= MAX_DISTANCE, the light source
  // doesn't contribute anymore.
  float DistanceFactor  = min(1.0, LightDistance / MAX_DISTANCE); 

  // linearly interpolate between LightContribution and LightMapContribution
  vec4 FinalContribution = mix(LightContribution, LightMapContribution, DistanceFactor);

  FragColor = WhatEverColor * vec4(FinalContribution.xyz, 1.0); // WhatEverColor: the surface color, e.g. sampled from the diffuse texture
}

HTH.

EDIT: To factor in Nicol Bolas' remarks, I assume that the LightMap stores the contribution encoded as an RGB color, one contribution per channel. If you actually have a single-channel lightmap which only stores monochromatic contributions, you'll have to either use the surface color, use the color of the light source, or reduce the light source contribution to a single channel.
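For the single-channel case, one possible sketch (the LightColor uniform is hypothetical and stands in for the light source's color) expands the monochrome value back to RGB like so:

uniform sampler2D LightMap;    // single-channel lightmap, e.g. GL_R16F
uniform vec3      LightColor;  // hypothetical uniform: the color of the light source

vec4 sampleLightMapContribution(vec2 texCoords)
{
  float intensity = texture(LightMap, texCoords).r;  // monochrome baked contribution
  return vec4(LightColor * intensity, 1.0);          // tint it with the light's color
}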

EDIT2: Although this works mathematically, it's definitely not physically sound. You might need some correction of the final contribution to make it at least physically plausible. If you're only aiming for effect, you can simply play around with correction factors until you're satisfied with the result.

Answered 2013-09-04T14:22:30.190