
I'm trying to Overlay Blend a stock image with the output of the camera feed, where the stock image has less than 100% opacity. I figured I could just place a GPUImageOpacityFilter in the filter stack and everything would be fine:

  1. GPUImageVideoCamera -> MY_GPUImageOverlayBlendFilter
  2. GPUImagePicture -> GPUImageOpacityFilter (Opacity 0.1f) -> MY_GPUImageOverlayBlendFilter
  3. MY_GPUImageOverlayBlendFilter -> GPUImageView

But what that produced was not a 0.1f-alpha version of the GPUImagePicture blended into the GPUImageVideoCamera output; instead, the GPUImagePicture's colors/contrast were softened and then blended. So I did some searching, and based on a suggestion I tried pulling a UIImage out of the opacity filter with imageFromCurrentlyProcessedOutput and feeding that to the blend filter:

  1. GPUImagePicture -> MY_GPUImageOpacityFilter (Opacity 0.1f)
  2. [MY_GPUImageOpacityFilter imageFromCurrentlyProcessedOutput] -> MY_alphaedImage
  3. GPUImagePicture (MY_alphaedImage) -> MY_GPUImageOverlayBlendFilter
  4. GPUImageVideoCamera -> MY_GPUImageOverlayBlendFilter
  5. MY_GPUImageOverlayBlendFilter -> GPUImageView

That works exactly as I expected. So why do I have to go through imageFromCurrentlyProcessedOutput? Shouldn't the opacity just carry through the filter chain? Here are code snippets for the two scenarios above:

First:

//Create the GPUPicture
UIImage *image = [UIImage imageNamed:@"someFile"];
GPUImagePicture *textureImage = [[[GPUImagePicture alloc] initWithImage:image] autorelease];

//Create the Opacity filter w/0.5 opacity
GPUImageOpacityFilter *opacityFilter = [[[GPUImageOpacityFilter alloc] init] autorelease];
opacityFilter.opacity = 0.5f;
[textureImage addTarget:opacityFilter];

//Create the blendFilter
GPUImageFilter *blendFilter = [[[GPUImageOverlayBlendFilter alloc] init] autorelease];

//Point the cameraDevice's output at the blendFilter
[self._videoCameraDevice addTarget:blendFilter];

//Point the opacityFilter's output at the blendFilter
[opacityFilter addTarget:blendFilter];

[textureImage processImage];

//Point the output of the blendFilter at our previewView
GPUImageView *filterView = (GPUImageView *)self.previewImageView;
[blendFilter addTarget:filterView];
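
(For completeness, self._videoCameraDevice above is a GPUImageVideoCamera created elsewhere, roughly like the sketch below; the preset and orientation here are placeholders, not necessarily what my project actually uses:)

//Rough setup of the camera device referenced in both snippets
self._videoCameraDevice = [[[GPUImageVideoCamera alloc]
    initWithSessionPreset:AVCaptureSessionPreset640x480
           cameraPosition:AVCaptureDevicePositionBack] autorelease];
self._videoCameraDevice.outputImageOrientation = UIInterfaceOrientationPortrait;
[self._videoCameraDevice startCameraCapture];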

Second:

//Create the GPUPicture
UIImage *image = [UIImage imageNamed:@"someFile"];
GPUImagePicture *textureImage = [[[GPUImagePicture alloc] initWithImage:image] autorelease];

//Create the Opacity filter w/0.5 opacity
GPUImageOpacityFilter *opacityFilter = [[[GPUImageOpacityFilter alloc] init] autorelease];
opacityFilter.opacity = 0.5f;
[textureImage addTarget:opacityFilter];

//Process the image so we get a UIImage with 0.5 opacity of the original
[textureImage processImage];
UIImage *processedImage = [opacityFilter imageFromCurrentlyProcessedOutput];
GPUImagePicture *processedTextureImage = [[[GPUImagePicture alloc] initWithImage:processedImage] autorelease];

//Create the blendFilter
GPUImageFilter *blendFilter = [[[GPUImageOverlayBlendFilter alloc] init] autorelease];

//Point the cameraDevice's output at the blendFilter
[self._videoCameraDevice addTarget:blendFilter];

//Point the processed texture image's output at the blendFilter
[processedTextureImage addTarget:blendFilter];

[processedTextureImage processImage];

//Point the output of the blendFilter at our previewView
GPUImageView *filterView = (GPUImageView *)self.previewImageView;
[blendFilter addTarget:filterView];
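
My working theory, which I haven't verified against the GPUImage source, is that GPUImageOpacityFilter only attenuates the alpha channel and leaves the color channels at full strength, while the round trip through imageFromCurrentlyProcessedOutput produces a UIImage whose color channels have been premultiplied by that alpha, so the reloaded GPUImagePicture hands the blend filter genuinely dimmed RGB values. Per pixel, something like this (a paraphrase of my assumption, not the library's actual shader code):

//In the live filter chain: only alpha is scaled, rgb stays at full strength
static inline void applyOpacity(float rgba[4], float opacity)
{
    rgba[3] *= opacity;
}

//After the UIImage round trip (assuming Core Graphics premultiplies alpha):
//the dimming is baked into the color channels themselves
static inline void premultiplyAlpha(float rgba[4])
{
    rgba[0] *= rgba[3];
    rgba[1] *= rgba[3];
    rgba[2] *= rgba[3];
}

If that's what's happening, the overlay blend sees full-strength RGB in the first scenario and premultiplied RGB in the second, which would explain the difference. Is that right, and is there a way to get the second result without leaving the GPU? (I've also wondered whether GPUImageAlphaBlendFilter and its mix property is the intended tool here, rather than an opacity filter feeding an overlay blend.)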