

Adversarial Distortion Domain Translation


Abstract

Deep watermarking models optimize robustness by incorporating distortions between the encoder and decoder. To tackle non-differentiable distortions, current methods only train the decoder with distorted images, which breaks the joint optimization of the encoder-decoder, resulting in suboptimal performance. To address this problem, we propose an Adversarial Distortion Domain Translation (AD$^2$T) method by treating the distortion as an image-to-image translation task. AD$^2$T adopts conditional GANs to learn the non-differentiable distortion mappings. It employs generators to transform the encoded image into the distorted one to bridge the encoder-decoder for joint optimization. We also supervise the GANs to generate challenging distorted samples to augment the watermarking model via adversarial training. This further improves the model robustness by minimizing the maximum decoding loss. Extensive experiments demonstrate the superiority of our method when tested on non-differentiable distortions, including lossy compression and style transfers.
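The "minimizing the maximum decoding loss" objective mentioned above can be illustrated with a deliberately tiny toy sketch. Everything below is hypothetical and not the paper's code: a scalar `theta` stands in for the distortion generator (updated by gradient ascent to make decoding harder), and a scalar `w` stands in for the decoder (updated by descent to recover the message anyway).

```python
# Toy min-max sketch (hypothetical scalars, NOT the AD^2T implementation):
# theta plays the distortion generator (trained to MAXIMIZE decoding loss),
# w plays the decoder (trained to MINIMIZE it).
x, m = 1.0, 1.0         # encoded signal and target message bit
w, theta = 0.5, 0.0     # decoder weight, distortion strength
lr, budget = 0.05, 0.5  # step size and bounded distortion budget

def decode_loss(w, theta):
    # squared error between the decoded value and the message
    return (w * (x + theta) - m) ** 2

for _ in range(200):
    # ascent step on theta: push the distortion toward the worst case
    g_theta = 2.0 * (w * (x + theta) - m) * w
    theta = max(-budget, min(budget, theta + lr * g_theta))
    # descent step on w: learn to decode despite the distortion
    g_w = 2.0 * (w * (x + theta) - m) * (x + theta)
    w -= lr * g_w

print(decode_loss(w, theta))  # small: the decoder adapts to the worst-case distortion
```

The alternating updates mirror the paper's setup at a high level: the distortion is constrained to a budget (here a simple clamp), and the decoder is driven toward the equilibrium where even the worst-case distortion leaves the message recoverable.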

Visualizations


How to run

Install dependencies

# clone project
git clone https://github.com/zcx-language/AdversarialDistortionTranslation.git
cd AdversarialDistortionTranslation

# [OPTIONAL] create conda environment
conda create -n myenv python=3.9
conda activate myenv

# install pytorch according to instructions
# https://pytorch.org/get-started/

# install requirements
pip install -r requirements.txt

Train model

# train AdversarialDistortionGANS
python main.py model=adversarial_distortion_gans.yaml

Test model

# test saved checkpoints
python test.py model=adversarial_distortion_gans.yaml ckpt_path=$YOURPATH

You can override any parameter from the command line like this:

python main.py trainer.max_epochs=20 datamodule.batch_size=64
