TY - GEN
T1 - Artwork protection against neural style transfer using locally adaptive adversarial color attack
AU - Guo, Zhongliang
AU - Dong, Junhao
AU - Qian, Yifei
AU - Wang, Kaixuan
AU - Li, Weiye
AU - Guo, Ziheng
AU - Wang, Yuheng
AU - Li, Yanli
AU - Arandjelović, Ognjen
AU - Fang, Lei
PY - 2024/10/19
Y1 - 2024/10/19
N2 - Neural style transfer (NST) generates new images by combining the style of one image with the content of another. However, unauthorized NST can exploit artwork, raising concerns about artists’ rights and motivating the development of proactive protection methods. We propose the Locally Adaptive Adversarial Color Attack (LAACA), which empowers artists to protect their artwork from unauthorized style transfer by processing it before public release. Drawing on the intricacies of human visual perception and the role of different frequency components, our method strategically introduces frequency-adaptive perturbations into the image. These perturbations significantly degrade the generation quality of NST while keeping the visual change to the original image acceptable, discouraging potential infringers from using the protected artworks because of the poor quality of the resulting style transfer. Additionally, existing metrics often overlook the importance of color fidelity when evaluating color-mattered tasks, such as the quality of NST-generated images, which is crucial in the context of artistic works. To assess such tasks comprehensively, we propose the Aesthetic Color Distance Metric (ACDM), designed to quantify the color difference between images pre- and post-manipulation. Experimental results confirm that attacking NST with LAACA yields visually inferior style transfer, and that ACDM can efficiently measure color-mattered tasks. By providing artists with a tool to safeguard their intellectual property, our work helps alleviate the socio-technical challenges posed by the misuse of NST in the art community.
AB - Neural style transfer (NST) generates new images by combining the style of one image with the content of another. However, unauthorized NST can exploit artwork, raising concerns about artists’ rights and motivating the development of proactive protection methods. We propose the Locally Adaptive Adversarial Color Attack (LAACA), which empowers artists to protect their artwork from unauthorized style transfer by processing it before public release. Drawing on the intricacies of human visual perception and the role of different frequency components, our method strategically introduces frequency-adaptive perturbations into the image. These perturbations significantly degrade the generation quality of NST while keeping the visual change to the original image acceptable, discouraging potential infringers from using the protected artworks because of the poor quality of the resulting style transfer. Additionally, existing metrics often overlook the importance of color fidelity when evaluating color-mattered tasks, such as the quality of NST-generated images, which is crucial in the context of artistic works. To assess such tasks comprehensively, we propose the Aesthetic Color Distance Metric (ACDM), designed to quantify the color difference between images pre- and post-manipulation. Experimental results confirm that attacking NST with LAACA yields visually inferior style transfer, and that ACDM can efficiently measure color-mattered tasks. By providing artists with a tool to safeguard their intellectual property, our work helps alleviate the socio-technical challenges posed by the misuse of NST in the art community.
U2 - 10.3233/FAIA240643
DO - 10.3233/FAIA240643
M3 - Conference contribution
T3 - Frontiers in Artificial Intelligence and Applications
SP - 1414
EP - 1421
BT - 27th European Conference on Artificial Intelligence, 19–24 October 2024, Santiago de Compostela, Spain
A2 - Endriss, Ulle
A2 - Melo, Francesco S.
A2 - Bach, Kerstin
A2 - Bugarín-Diz, Alberto
A2 - Alonso-Moral, José M.
A2 - Barro, Senén
A2 - Heintz, Fredrik
PB - IOS Press
CY - Amsterdam
ER -