The term post-processing (or postproc for short) is used in the video and film industry for quality-improvement image processing (specifically digital image processing) methods used in video playback devices, such as stand-alone DVD-Video players; video playing software; and transcoding software. It is also commonly used in real-time 3D rendering (such as in video games) to apply additional effects.
Uses in video production
Video post-processing is the process of changing the perceived quality of a video on playback (done after the decoding process). Image scaling routines such as linear interpolation, bilinear interpolation, or cubic interpolation can, for example, be performed when resizing images; resizing involves either subsampling (reducing or shrinking an image) or zooming (enlarging an image). This helps reduce or hide image artifacts and flaws in the original film material. Post-processing always involves a trade-off between speed, smoothness, and sharpness.
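As an illustration of the scaling step described above, the following is a minimal sketch of bilinear-interpolation upscaling, assuming a single grayscale frame stored as a 2-D NumPy array of floats. Real players use heavily optimized (often hardware-accelerated) scalers; the function name `upscale_bilinear` is purely illustrative.

```python
# Minimal sketch of bilinear-interpolation upscaling for a grayscale frame.
import numpy as np

def upscale_bilinear(frame: np.ndarray, new_h: int, new_w: int) -> np.ndarray:
    h, w = frame.shape
    # Map each output pixel back to a (possibly fractional) source coordinate.
    ys = np.linspace(0, h - 1, new_h)
    xs = np.linspace(0, w - 1, new_w)
    y0 = np.floor(ys).astype(int)
    x0 = np.floor(xs).astype(int)
    y1 = np.minimum(y0 + 1, h - 1)
    x1 = np.minimum(x0 + 1, w - 1)
    wy = (ys - y0)[:, None]          # vertical interpolation weights
    wx = (xs - x0)[None, :]          # horizontal interpolation weights
    # Blend the four neighbouring source pixels for every output pixel.
    top    = frame[np.ix_(y0, x0)] * (1 - wx) + frame[np.ix_(y0, x1)] * wx
    bottom = frame[np.ix_(y1, x0)] * (1 - wx) + frame[np.ix_(y1, x1)] * wx
    return top * (1 - wy) + bottom * wy

# Example: enlarge a 480-line frame to 720 lines.
small = np.random.rand(480, 640)
large = upscale_bilinear(small, 720, 960)
```

The same back-mapping structure applies to the other interpolation kernels named above; only the weighting of neighbouring source pixels changes.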
- Image scaling and multivariate interpolation:
- SPP (Statistical-Post-Processing)
- Deblocking
- Deringing
- Sharpen / Unsharpen (often referred to as "soften")
- Requantization
- Luminance alterations
- Blurring / denoising
- Deinterlacing (a simplified sketch of the weave and bob methods follows this list)
  - Weave method
  - Bob method
  - Linear method
  - Yadif method
- Deflicking
- 2:3 pull-down (telecine) for conversion from 24 frames/s and 23.976 frames/s to 30 frames/s and 29.97 frames/s
- Inverse telecine (IVTC, also called 3:2 pull-up) for conversion from 30 frames/s and 29.97 frames/s back to 24 frames/s and 23.976 frames/s
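The weave and bob deinterlacing methods listed above can be illustrated with a short sketch. It assumes the two fields of an interlaced frame are available as NumPy arrays (the top field holding the even scan lines, the bottom field the odd lines); production deinterlacers such as yadif add motion adaptation on top of these basic ideas.

```python
# Simplified illustration of weave and bob deinterlacing.
import numpy as np

def weave(top_field: np.ndarray, bottom_field: np.ndarray) -> np.ndarray:
    """Interleave the two fields into one frame (sharp, but combs on motion)."""
    h, w = top_field.shape
    frame = np.empty((2 * h, w), dtype=top_field.dtype)
    frame[0::2] = top_field      # even lines come from the top field
    frame[1::2] = bottom_field   # odd lines come from the bottom field
    return frame

def bob(field: np.ndarray) -> np.ndarray:
    """Line-double a single field (no combing, but halved vertical detail)."""
    return np.repeat(field, 2, axis=0)

top = np.random.rand(240, 640)      # even scan lines of one interlaced frame
bottom = np.random.rand(240, 640)   # odd scan lines
full_weave = weave(top, bottom)     # 480 x 640 frame
full_bob = bob(top)                 # 480 x 640 frame from one field
```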
Uses in 3D rendering
Additionally, post-processing is commonly used in 3D rendering, especially for video games. Instead of rendering 3D objects directly to the display, the scene is first rendered to a buffer in the memory of the video card. Pixel shaders, and optionally vertex shaders, are then used to apply post-processing filters to the image buffer before it is displayed on the screen. Some post-processing effects also require multiple passes, gamma inputs, vertex manipulation, and depth buffer access. Post-processing allows effects to be used that require awareness of the entire image (since normally each 3D object is rendered in isolation). Such effects include the following (a simplified CPU-side sketch of one such full-screen pass appears after the list):
- Ambient occlusion (HBAO, screen-space ambient occlusion (SSAO), etc.) and screen-space reflections
- Anaglyph
- Anti-aliasing (FXAA, AGAA,[1] SMAA, MLAA, and custom methods; unlike sample-based AA such as MSAA and SSAA, these operate on the finished image)
- Bloom
- Blur (depth of field, motion blur, smart)
- Bloodlust effect (red vignetting with particles, etc.)
- Bokeh
- Bump mapping
- Cel shading
- Chromatic aberration
- Color correction
- Color grading
- Contrast adjustment
- Dynamic contrast
- Crepuscular rays
- Digital camera light compensation
- Dithering (including subpixel)
- Eye adaptation
- Film grain
- Filmic scene tone mapping
- Fog/mist
- Gamma correction
- Global illumination
- Glow
- Grayscale
- Haze (depth, heat)
- High-dynamic-range rendering
- Image distortion
- Infrared
- Lens flare (cubic lens distortion flare,[2] pseudo lens flare[3])
- Light scattering
- Night vision
- Outlines
- Particle effects
- Pixel vibrance
- Point-light attenuation
- Posterization and deposterization
- Scanline
- Screen borders
- Screen rotation
- Shading (ink, paint, sketch)
- Shadow mapping
- Sepia tone
- Sharpen/unsharpen (texture unsharp mask, LumaSharpen, sharpen, sharpen complex 1/2, adaptive-sharpen)
- Sobel operator
- Split screen
- Upscaling (e.g. xBR, Super xBR, SuperRes)
- Texture filtering (point, linear, bilinear, trilinear, anisotropic, and custom algorithms)
- Vignette
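The following is a CPU-side sketch of such a full-screen pass, assuming the scene has already been rendered into an RGB float buffer with values in the range 0 to 1. On real hardware the same per-pixel math would run in a pixel shader reading the rendered frame from a texture; vignette and gamma correction are used here only as representative effects from the list, and the function name `postprocess` is illustrative rather than an API from any particular engine.

```python
# Sketch of a full-screen post-processing pass (vignette + gamma correction)
# applied to an already-rendered RGB buffer, mimicking a pixel-shader pass.
import numpy as np

def postprocess(buffer: np.ndarray, gamma: float = 2.2,
                vignette_strength: float = 0.4) -> np.ndarray:
    h, w, _ = buffer.shape
    # Normalized pixel coordinates centred on the screen (as a shader derives from UVs).
    ys = (np.linspace(0.0, 1.0, h) - 0.5)[:, None]
    xs = (np.linspace(0.0, 1.0, w) - 0.5)[None, :]
    dist = np.sqrt(xs**2 + ys**2)                      # distance from screen centre
    vignette = 1.0 - vignette_strength * np.clip(dist / 0.7071, 0.0, 1.0) ** 2
    shaded = buffer * vignette[:, :, None]             # darken towards the corners
    return np.clip(shaded, 0.0, 1.0) ** (1.0 / gamma)  # gamma-correct for display

frame = np.random.rand(720, 1280, 3)   # stand-in for the rendered scene buffer
display_frame = postprocess(frame)
```

Because the pass only reads and writes the image buffer, it can be chained with any of the other effects above without knowledge of the underlying 3D geometry.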
References
edit- ^ "Aggregate G-Buffer Anti-Aliasing". Archived from the original on 2016-04-27. Retrieved 2016-01-16.
- ^ "//game dev log of martins upitis: GLSL Cubic Lens Distortion". October 13, 2011.
- ^ "john-chapman-graphics: Pseudo Lens Flare". February 22, 2013.
External links
- Videotranscoding Wiki (documentation on server-side usage of MPlayer for transcoding)