---
license: apache-2.0
pipeline_tag: video-to-video
library_name: diffusers
---

# Refaçade: Editing Objects with a Given Reference Texture

Youze Huang<sup>1,*</sup> Penghui Ruan<sup>2,*</sup> Bojia Zi<sup>3,*</sup> Xianbiao Qi<sup>4,†</sup> Jianan Wang<sup>5</sup> Rong Xiao<sup>4</sup>

<sup>*</sup> Equal contribution. <sup>†</sup> Corresponding author.

Huggingface Model · Github · arXiv · Huggingface Space · Demo Page

## 🚀 Overview

**Refaçade** is a unified image–video retexturing model built on the Wan2.1-based VACE framework. It edits the surface material of specified objects in a video using user-provided reference textures, while preserving the original geometry and background. We use a **Jigsaw Permutation** to decouple structural information in the reference image, and a **Texture Remover** to disentangle the original object's appearance. Together, these components let users explore diverse retexturing possibilities.

---

## 🛠️ Installation

Our project is built upon [Wan2.1-based VACE](https://github.com/ali-vilab/VACE).

```bash
pip install -r requirements.txt
pip install wan@git+https://github.com/Wan-Video/Wan2.1
```

---

## 🏃‍♂️ Gradio Demo

You can use this Gradio demo to retexture objects. Note that you do not need to compile SAM2.

```bash
python app.py
```

---

## 📂 Download

First, download our checkpoints:

```shell
huggingface-cli download --resume-download fishze/Refacade --local-dir models
```

Next, download the SAM2 checkpoint [sam2_hiera_large.pt](https://huggingface.co/facebook/sam2-hiera-large) and place it at:

```shell
sam2/SAM2-Video-Predictor/checkpoints/
```

We recommend organizing the local directories as:

```text
Refacade
├── ...
├── examples
├── models
│   ├── refacade
│   │   └── ...
│   ├── texture_remover
│   │   └── ...
│   └── vae
│       └── ...
├── sam2
└── ...
```

---
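To build intuition for the Jigsaw Permutation mentioned in the overview, here is a minimal, illustrative NumPy sketch (not the authors' implementation): splitting the reference image into a grid of patches and shuffling them destroys global structure (object shape and layout) while keeping local texture statistics, which is the idea behind decoupling structure from appearance. The function name `jigsaw_permutation` and the grid size are our own illustrative choices.

```python
import numpy as np

def jigsaw_permutation(image: np.ndarray, grid: int = 4, seed: int = 0) -> np.ndarray:
    """Split an image into a grid x grid jigsaw of patches and shuffle them.

    Illustrative only: shuffling removes global structure from the
    reference image while preserving its local texture content.
    """
    h, w, _ = image.shape
    ph, pw = h // grid, w // grid
    # Crop so the image divides evenly into patches.
    image = image[: ph * grid, : pw * grid]
    # Extract patches in row-major order.
    patches = [
        image[i * ph : (i + 1) * ph, j * pw : (j + 1) * pw]
        for i in range(grid)
        for j in range(grid)
    ]
    # Randomly permute the patch order (seeded for reproducibility).
    rng = np.random.default_rng(seed)
    order = rng.permutation(len(patches))
    # Reassemble the shuffled patches into a single image.
    rows = [
        np.concatenate([patches[order[i * grid + j]] for j in range(grid)], axis=1)
        for i in range(grid)
    ]
    return np.concatenate(rows, axis=0)

# Example: shuffle a random 64x64 RGB "texture" image.
img = np.random.rand(64, 64, 3)
shuffled = jigsaw_permutation(img, grid=4)
```

The permutation is a pure rearrangement, so the shuffled image has the same shape and the same multiset of pixel values as the input.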