
[ECCV 2022] We investigated a broad range of neural network elements and developed a robust perceptual similarity metric. Our shift-tolerant perceptual similarity metric (ST-LPIPS) is consistent with human perception and is less susceptible to imperceptible misalignments between two images than existing metrics.

License: BSD 2-Clause "Simplified" License


ShiftTolerant-LPIPS

Update

[2023 Aug] Added ST-LPIPS to PyTorch Toolbox for Image Quality Assessment.

[2023 May] In the ST-LPIPS work (available in this repository), we developed a perceptual similarity metric that remains robust even in the presence of imperceptible pixel shifts. However, stronger corruptions can be generated via adversarial attacks. Consequently, in a separate study, we conducted a systematic investigation of the robustness of both traditional and learned perceptual similarity metrics against imperceptible adversarial perturbations. Our findings reveal that all metrics are susceptible to such attacks. For details, please consider reading our study on 'Attacking Perceptual Similarity Metrics' (TMLR'23 $\textcolor{red}{\text{Featured Certification}}$).

Shift-tolerant Perceptual Similarity Metric

Abhijay Ghildyal, Feng Liu. In ECCV, 2022. [Arxiv], [PyPI], [video]
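Conceptually, LPIPS-style metrics compare two images by unit-normalizing deep features along the channel dimension, taking a per-channel weighted squared difference, averaging spatially, and summing over layers. The following is a minimal pure-Python sketch of that idea, not the actual implementation (the feature extractor, weights, and the shift-tolerant network elements from the paper are all omitted):

```python
import math

def unit_normalize(v):
    """Scale a channel vector to unit L2 norm (epsilon guards zero vectors)."""
    n = math.sqrt(sum(x * x for x in v)) + 1e-10
    return [x / n for x in v]

def lpips_like_distance(feats0, feats1, weights):
    """feats0/feats1: list of layers; each layer is a list of per-pixel channel vectors.
    weights: one per-channel weight list per layer (hypothetical values)."""
    total = 0.0
    for layer0, layer1, w in zip(feats0, feats1, weights):
        per_pixel = []
        for v0, v1 in zip(layer0, layer1):
            u0, u1 = unit_normalize(v0), unit_normalize(v1)
            per_pixel.append(sum(wc * (a - b) ** 2 for wc, a, b in zip(w, u0, u1)))
        total += sum(per_pixel) / len(per_pixel)  # spatial average per layer
    return total

# Identical features give zero distance.
f = [[[1.0, 2.0], [3.0, 4.0]]]  # one layer, two "pixels", two channels
print(lpips_like_distance(f, f, [[0.5, 0.5]]))  # 0.0
```

In the real metric the features come from a pretrained AlexNet or VGG trunk, and the shift-tolerant variant modifies network elements (e.g. anti-aliased downsampling) so the distance changes little under small pixel shifts.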

Quick start

pip install stlpips_pytorch

from stlpips_pytorch import stlpips
from stlpips_pytorch import utils

path0 = "<dir>/ShiftTolerant-LPIPS/imgs/ex_p0.png"
path1 = "<dir>/ShiftTolerant-LPIPS/imgs/ex_ref.png"

img0 = utils.im2tensor(utils.load_image(path0))
img1 = utils.im2tensor(utils.load_image(path1))

stlpips_metric = stlpips.LPIPS(net="alex", variant="shift_tolerant")
# or for the vgg variant use `stlpips.LPIPS(net="vgg", variant="shift_tolerant")`

stlpips_distance = stlpips_metric(img0, img1).item()
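The original LPIPS utilities map 8-bit pixel values in [0, 255] to the [-1, 1] range the networks expect. Assuming `utils.im2tensor` follows that same convention (an assumption based on the LPIPS codebase this repository borrows from), the per-pixel transform is:

```python
def pixel_to_unit_range(p):
    """Map an 8-bit intensity (0..255) to [-1.0, 1.0].

    Assumption: mirrors the scaling in the original LPIPS im2tensor helper.
    """
    return p / (255.0 / 2.0) - 1.0

print(pixel_to_unit_range(0))    # -1.0
print(pixel_to_unit_range(255))  # 1.0
```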

Alternatively, clone this repository and run:

python lpips_2imgs.py

Training

nohup python -u ./train.py --from_scratch --train_trunk \
    --use_gpu --gpu_ids 0 \
    --net alex --variant vanilla --name alex_vanilla \
    > logs/train_alex_vanilla.out &

nohup python -u ./train.py --from_scratch --train_trunk \
    --use_gpu --gpu_ids 1 \
    --net alex --variant shift_tolerant --name alex_shift_tolerant \
    > logs/train_alex_shift_tolerant.out &

nohup python -u ./train.py --from_scratch --train_trunk \
    --use_gpu --gpu_ids 2 \
    --net vgg --variant vanilla --name vgg_vanilla \
    > logs/train_vgg_vanilla.out &

nohup python -u ./train.py --from_scratch --train_trunk \
    --use_gpu --gpu_ids 3 \
    --net vgg --variant shift_tolerant --name vgg_shift_tolerant \
    > logs/train_vgg_shift_tolerant.out &

Testing

Please download the original BAPPS dataset using this script (here). Then, update the path to the dataset in global_config.json.

To reproduce the results in the paper, run the following:

# bash n_pixel_shift_study/test_scripts/test.sh <net> <variant> <gpu_id> <img_resize> <batch_size>

# AlexNet Vanilla
nohup bash n_pixel_shift_study/test_scripts/test.sh alex vanilla 0 64 50 > logs/eval_alex_vanilla.out &

# AlexNet Shift-tolerant
nohup bash n_pixel_shift_study/test_scripts/test.sh alex shift_tolerant 1 64 50 > logs/eval_alex_shift_tolerant.out &

# Vgg Vanilla
nohup bash n_pixel_shift_study/test_scripts/test.sh vgg vanilla 2 64 50 > logs/eval_vgg_vanilla.out &

# Vgg Shift-tolerant
nohup bash n_pixel_shift_study/test_scripts/test.sh vgg shift_tolerant 3 64 50 > logs/eval_vgg_shift_tolerant.out &

Note: To train and test the models in this paper, we used Image.BICUBIC for resizing. The results are similar when other resizing methods are used; feel free to switch back to bilinear, as used in the original LPIPS work (here).

Other Evaluations

For other evaluations refer to ./n_pixel_shift_study/.

Citation

If you find this repository useful for your research, please cite the following:

@inproceedings{ghildyal2022stlpips,
  title={Shift-tolerant Perceptual Similarity Metric},
  author={Abhijay Ghildyal and Feng Liu},
  booktitle={European Conference on Computer Vision},
  year={2022}
}

Acknowledgements

This repository borrows from LPIPS, Anti-aliasedCNNs, and CNNsWithoutBorders. We thank the authors of these repositories for their incredible work and inspiration.

