Interactive Neural Painting

1University of Trento, 2Picsart AI Research (PAIR), 3Politecnico di Milano, 4ETH Zurich, 5University of Illinois Urbana Champaign (UIUC), 6Fondazione Bruno Kessler (FBK).

Abstract

In the last few years, Neural Painting (NP) techniques have become capable of producing extremely realistic artworks. This paper advances the state of the art in this emerging research domain by proposing the first approach for Interactive NP. Considering a setting where a user looks at a scene and tries to reproduce it in a painting, our objective is to develop a computational framework that assists the user's creativity by suggesting the next strokes to paint, which can then be used to complete the artwork. To accomplish this task, we propose I-Paint, a novel method based on a conditional transformer Variational AutoEncoder (VAE) architecture with a two-stage decoder. To evaluate the proposed approach and stimulate research in this area, we also introduce two novel datasets. Our experiments show that our approach provides good stroke suggestions and compares favorably to the state of the art.

Overview

In the Interactive Neural Painting (INP) scenario, the model interacts with the user to paint a reference image. The model provides a set of stroke suggestions based on the reference image and the current state of the painting, while the user either selects a suggested set of strokes or directly draws new strokes on the canvas. The user retains complete control and can query the model multiple times until satisfied with a suggestion. The sunflower example includes a textual description of the process.
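The interaction loop described above can be sketched in a few lines of Python. This is a minimal, hypothetical illustration of the suggest-then-accept protocol, not the actual I-Paint implementation: the `Stroke` parameterization, the `suggest_strokes` stand-in, and the `user_accepts` callback are all simplifying assumptions made for this sketch.

```python
from dataclasses import dataclass
from typing import Callable, List, Tuple

@dataclass
class Stroke:
    # A simplified parameterized brush stroke (position, size, color);
    # a stand-in for the richer stroke parameterization used by NP methods.
    x: float
    y: float
    radius: float
    color: Tuple[float, float, float]

def suggest_strokes(reference, canvas, n_suggestions: int = 4) -> List[Stroke]:
    """Hypothetical stand-in for the model: propose the next strokes given
    the reference image and the current canvas state."""
    # A real model would condition on both images; here we emit placeholders
    # of decreasing size, mimicking a coarse-to-fine painting order.
    return [Stroke(x=0.5, y=0.5, radius=0.1 / (i + 1), color=(1.0, 0.8, 0.0))
            for i in range(n_suggestions)]

def interactive_round(reference, canvas,
                      user_accepts: Callable[[Stroke], bool]) -> List[Stroke]:
    """One round of the INP loop: the model suggests strokes and the user
    keeps the ones they like; rejected suggestions are simply discarded.
    The user can call this again until satisfied, or paint strokes directly."""
    suggestions = suggest_strokes(reference, canvas)
    return [s for s in suggestions if user_accepts(s)]
```

For example, a user who only accepts strokes above a minimum size would pass `lambda s: s.radius > 0.03` as the `user_accepts` callback and keep querying for new rounds until the painting is complete.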

Demo

BibTeX

@article{PERUZZO2023103778,
title = {Interactive Neural Painting},
journal = {Computer Vision and Image Understanding},
volume = {235},
pages = {103778},
year = {2023},
issn = {1077-3142},
doi = {10.1016/j.cviu.2023.103778},
url = {https://www.sciencedirect.com/science/article/pii/S1077314223001583},
author = {Elia Peruzzo and Willi Menapace and Vidit Goel and Federica Arrigoni and Hao Tang and Xingqian Xu and Arman Chopikyan and Nikita Orlov and Yuxiao Hu and Humphrey Shi and Nicu Sebe and Elisa Ricci}
}