
Image quality perception varies widely among individuals. While some players struggle to notice the difference between a game rendered at high quality and one using lower settings, others find the flaws unbearable. To address this, Intel has introduced the Computer Graphics Visual Quality Metric (CGVQM), a tool that quantifies visual error and helps developers detect rendering anomalies.
CGVQM is designed to identify and rate distortions introduced by modern rendering techniques such as neural supersampling and path tracing. To build it, Intel collected 80 short video clips showcasing various artifacts and had 20 participants rate each clip's quality against a reference video. Using this data, a 3D CNN was trained to predict perceived quality, producing both a global quality score and detailed per-pixel error maps.
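To make the idea of a "global score plus error map" concrete, here is a toy sketch in NumPy. It is not the actual CGVQM model or its API: the real tool relies on a trained 3D CNN, while this stand-in simply uses per-pixel squared error against the reference clip and pools it into a single score.

```python
import numpy as np

def toy_quality_metric(reference, distorted):
    """Toy sketch (NOT the real CGVQM): compare a distorted clip
    against a reference and return a per-pixel error map plus one
    pooled quality score. The real tool replaces the squared-error
    stand-in with a trained 3D CNN's perceptual response."""
    # Clips are (frames, height, width) float arrays in [0, 1].
    error_map = (reference - distorted) ** 2       # per-pixel error
    global_score = 1.0 - float(error_map.mean())   # 1.0 = identical clips
    return error_map, global_score

# Two tiny synthetic "clips": the distorted one has added noise.
rng = np.random.default_rng(0)
ref = rng.random((8, 16, 16))
dist = np.clip(ref + rng.normal(0.0, 0.05, ref.shape), 0.0, 1.0)

emap, score = toy_quality_metric(ref, dist)
print(emap.shape, round(score, 3))  # error map has the clip's shape
```

The shape of the output mirrors what the article describes: a dense map that localizes where an artifact occurs, and a scalar that summarizes overall quality for quick comparisons across engine versions or upscaler settings.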
As Intel puts it: “Whether you’re evaluating engine updates or employing new upscaling techniques, having a perceptual metric that aligns with human judgment represents a significant advantage.”
Despite its current limitations, CGVQM is under active development, with ongoing improvements aimed at broadening its usefulness in real-world applications. The tool is available on GitHub as a PyTorch implementation.