Image Quality Measures

MSE
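
MSE stands for Mean Squared Error: the average of the squared pixel-wise differences between two images.

\[\text{MSE}(x,y) = \frac{1}{N}\sum_{i=1}^{N} (x_i - y_i)^2\]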

PSNR
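
PSNR stands for Peak Signal-to-Noise Ratio. It rescales MSE against the maximum possible pixel value \(L\) (255 for 8-bit images) and expresses the result in decibels; higher is better.

\[\text{PSNR}(x,y) = 10 \log_{10}\frac{L^2}{\text{MSE}(x,y)}\]

A minimal NumPy sketch of both metrics (the `max_val` default of 255 assumes 8-bit images):

```python
import numpy as np

def mse(x, y):
    """Mean squared error between two images of the same shape."""
    x = x.astype(np.float64)
    y = y.astype(np.float64)
    return np.mean((x - y) ** 2)

def psnr(x, y, max_val=255.0):
    """Peak signal-to-noise ratio in dB; max_val is the dynamic range."""
    err = mse(x, y)
    if err == 0:
        return float("inf")  # identical images
    return 10.0 * np.log10(max_val ** 2 / err)

# Example: compare an 8-bit image against a noisy copy.
img = np.random.randint(0, 256, (64, 64), dtype=np.uint8)
noisy = np.clip(img + np.random.normal(0, 5, img.shape), 0, 255).astype(np.uint8)
print(mse(img, noisy), psnr(img, noisy))
```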

SSIM

\[\text{SSIM}(x,y) = \left[ l(x,y)^\alpha \cdot c(x,y)^\beta \cdot s(x,y)^\gamma \right] \]

where l is the luminance comparison, c the contrast comparison, and s the structure comparison:

\[\begin{split}l(x,y)=\frac{2\mu_x\mu_y + c_1}{\mu^2_x + \mu^2_y + c_1} \\ c(x,y)=\frac{2\sigma_x\sigma_y + c_2}{\sigma_x^2 + \sigma_y^2 + c_2} \\ s(x,y)=\frac{\sigma_{xy} + c_3}{\sigma_x \sigma_y + c_3}\end{split}\]

\(c_1, c_2, c_3\): constants that depend on the dynamic range of the pixel values; they stabilize each ratio when its denominator is close to zero.

Setting \(\alpha = \beta = \gamma = 1\) and \(c_3 = c_2/2\) gives the commonly used form:

\[\text{SSIM}(x,y) = \frac{(2\mu_x\mu_y + c_1)(2\sigma_{xy} + c_2)}{(\mu_x^2 + \mu_y^2 + c_1)(\sigma_x^2 + \sigma_y^2 + c_2)}\]
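
A minimal sketch of this combined formula using global image statistics; the constants \(c_1 = (0.01 L)^2\) and \(c_2 = (0.03 L)^2\) follow the values commonly used in the original paper, and a practical implementation would apply the same formula over local sliding windows and average the results:

```python
import numpy as np

def ssim_global(x, y, data_range=255.0, k1=0.01, k2=0.03):
    """SSIM from global statistics, following the simplified formula above.
    (The full metric evaluates this inside local windows and averages.)"""
    x = x.astype(np.float64)
    y = y.astype(np.float64)
    c1 = (k1 * data_range) ** 2
    c2 = (k2 * data_range) ** 2
    mu_x, mu_y = x.mean(), y.mean()
    var_x, var_y = x.var(), y.var()
    cov_xy = ((x - mu_x) * (y - mu_y)).mean()
    num = (2 * mu_x * mu_y + c1) * (2 * cov_xy + c2)
    den = (mu_x ** 2 + mu_y ** 2 + c1) * (var_x + var_y + c2)
    return num / den

img = np.random.randint(0, 256, (64, 64), dtype=np.uint8)
print(ssim_global(img, img))  # identical images give SSIM = 1.0
```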

MOS

MOS stands for Mean Opinion Score: a subjective quality rating given by human observers (typically on a 1–5 scale) and averaged over many raters.

LPIPS

LPIPS stands for Learned Perceptual Image Patch Similarity, proposed in "The Unreasonable Effectiveness of Deep Features as a Perceptual Metric" (CVPR 2018). Higher means the two images are further apart / more different; lower means they are more similar.
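
A minimal usage sketch, assuming the `lpips` PyPI package released by the paper's authors (`pip install lpips`); inputs are RGB tensors scaled to [-1, 1]:

```python
import torch
import lpips

loss_fn = lpips.LPIPS(net='alex')  # AlexNet backbone; 'vgg' is also available

img0 = torch.rand(1, 3, 64, 64) * 2 - 1  # placeholder images in [-1, 1]
img1 = torch.rand(1, 3, 64, 64) * 2 - 1

d = loss_fn(img0, img1)  # perceptual distance; higher = more different
print(d.item())
```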

GAN

LPIPS can also be used to measure the average feature distance between generated samples. Higher LPIPS scores indicate better diversity among the generated images, as sketched below.
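
A hedged sketch of this diversity measure: average pairwise LPIPS over a batch of generated images (the batch here is a random stand-in for generator outputs):

```python
import itertools
import torch
import lpips

loss_fn = lpips.LPIPS(net='alex')
samples = torch.rand(8, 3, 64, 64) * 2 - 1  # stand-in for generator outputs in [-1, 1]

# Average LPIPS over all unordered pairs of samples.
dists = [loss_fn(samples[i:i+1], samples[j:j+1]).item()
         for i, j in itertools.combinations(range(len(samples)), 2)]
print(sum(dists) / len(dists))  # higher average => more diverse samples
```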