
Image-based translucency transfer through correlation analysis over multi-scale spatial color distribution

Hideki Todo, Tatsuya Yatagawa, Masataka Sawayama, Yoshinori Dobashi, Masanori Kakimoto

The Visual Computer, Volume 35, Issue 6–8 (May 2019), pp. 811–822.

Figure 1: Results of the proposed material transfer. Given an input image (a) and a reference material image (b), our material transfer can reproduce various reference material styles.

Abstract

This paper introduces an image-based material transfer framework that requires only a single input image and a single reference image, as in ordinary color transfer methods. In contrast to previous material transfer methods, we focus on transferring the appearance of translucent objects. Such material transfer for translucent objects poses two challenging problems. First, the appearance of a translucent material is characterized not only by its colors but also by their spatial distribution. Unfortunately, traditional color transfer methods can hardly handle translucency because they consider only the colors of the objects. Second, temporal coherence of the transferred results cannot be maintained by traditional methods, nor even by recent neural style transfer methods, as we demonstrate in this paper. To address these problems, we propose a novel image-based material transfer method based on the analysis of spatial color distribution. We focus on "subbands," which represent multi-scale image structures, and find that the correlation between the color distribution and the subbands is a key feature for reproducing the appearance of translucent materials. Our method relies on standard principal component analysis (PCA) to harmonize the correlations of the input and reference images and thereby reproduce the translucent appearance. Because it considers the spatial color distribution of the input and reference images, our method can be naturally applied to video sequences in a frame-by-frame manner without any additional pre- or post-processing. Through experimental analyses, we demonstrate that the proposed method can be applied to a broad variety of translucent materials, and that the resulting appearances are perceptually similar to those of the reference images.
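
The sketch below illustrates the general idea of matching joint statistics of color and multi-scale subband responses with a PCA-based (whitening/coloring) transform; it is not the authors' implementation. The Laplacian-pyramid subbands, the LAB color space, the plain covariance matching, and all function names are illustrative assumptions.

```python
# Illustrative sketch: transfer appearance statistics by aligning per-pixel
# features (color channels + multi-scale subband responses) of an input image
# to those of a reference image via a PCA-based whitening/coloring transform.
import numpy as np
import cv2


def laplacian_subbands(gray, levels=2):
    """Band-pass (Laplacian pyramid) responses, resampled to full resolution."""
    bands, cur = [], gray.astype(np.float32)
    for _ in range(levels):
        down = cv2.pyrDown(cur)
        up = cv2.pyrUp(down, dstsize=(cur.shape[1], cur.shape[0]))
        bands.append(cur - up)          # detail lost at this scale
        cur = down
    h, w = gray.shape
    return [cv2.resize(b, (w, h), interpolation=cv2.INTER_LINEAR) for b in bands]


def features(img_bgr, levels=2):
    """Stack LAB color channels and luminance subbands into per-pixel vectors."""
    lab = cv2.cvtColor(img_bgr, cv2.COLOR_BGR2LAB).astype(np.float32)
    bands = laplacian_subbands(lab[..., 0], levels)
    feats = np.dstack([lab] + bands)            # H x W x (3 + levels)
    return feats.reshape(-1, feats.shape[-1]), lab.shape


def match_statistics(src, ref, eps=1e-6):
    """Align mean and covariance of src features to ref features (PCA-based)."""
    mu_s, mu_r = src.mean(0), ref.mean(0)
    cov_s = np.cov(src - mu_s, rowvar=False) + eps * np.eye(src.shape[1])
    cov_r = np.cov(ref - mu_r, rowvar=False) + eps * np.eye(ref.shape[1])
    es, Us = np.linalg.eigh(cov_s)
    er, Ur = np.linalg.eigh(cov_r)
    whiten = Us @ np.diag(es ** -0.5) @ Us.T    # decorrelate input features
    color = Ur @ np.diag(er ** 0.5) @ Ur.T      # impose reference correlations
    return (src - mu_s) @ whiten @ color + mu_r


def transfer(input_bgr, reference_bgr, levels=2):
    f_in, shape = features(input_bgr, levels)
    f_ref, _ = features(reference_bgr, levels)
    f_out = match_statistics(f_in, f_ref)
    lab_out = np.clip(f_out[:, :3].reshape(shape), 0, 255).astype(np.uint8)
    return cv2.cvtColor(lab_out, cv2.COLOR_LAB2BGR)
```

Because each frame is processed with the same global statistics-matching transform, a sketch of this kind can be run frame by frame on a video without extra temporal pre- or post-processing, which is the property the abstract highlights.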