---
license: apache-2.0
pipeline_tag: text-to-image
library_name: transformers
---

# 🛡️DAA: Dynamic Attention Analysis for Backdoor Detection in Text-to-Image Diffusion Models

This repository contains artifacts and code related to the paper: [**Dynamic Attention Analysis for Backdoor Detection in Text-to-Image Diffusion Models**](https://huggingface.co/papers/2504.20518).

Code: https://github.com/Robin-WZQ/DAA

This study introduces a novel backdoor detection perspective, **Dynamic Attention Analysis (DAA)**, which shows that **dynamic features in attention maps** can serve as a much better indicator for backdoor detection in text-to-image diffusion models. When the dynamic evolution of cross-attention maps is examined, backdoor samples exhibit feature evolution patterns distinctly different from those of benign samples, particularly at the `<EOS>` token.
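
The repository linked above contains the full detection pipeline. As a quick illustration of the underlying signal only, the sketch below records the `<EOS>`-token cross-attention mass at every denoising step with 🤗 `diffusers`; this is a hedged approximation, not the paper's actual DAA features, and the `AttnMapRecorder` processor, model ID, prompt, and per-step scalar are all illustrative assumptions.

```python
import torch
from diffusers import StableDiffusionPipeline


class AttnMapRecorder:
    """Illustrative attention processor (not part of the DAA codebase) that
    replicates diffusers' default non-fused attention math and stores every
    cross-attention probability map it computes."""

    def __init__(self, store):
        self.store = store  # list of (batch*heads, image_tokens, text_tokens)

    def __call__(self, attn, hidden_states, encoder_hidden_states=None,
                 attention_mask=None, **kwargs):
        batch_size, seq_len, _ = hidden_states.shape
        attention_mask = attn.prepare_attention_mask(attention_mask, seq_len, batch_size)

        query = attn.to_q(hidden_states)
        is_cross = encoder_hidden_states is not None
        context = encoder_hidden_states if is_cross else hidden_states
        key, value = attn.to_k(context), attn.to_v(context)

        query, key, value = (attn.head_to_batch_dim(t) for t in (query, key, value))
        probs = attn.get_attention_scores(query, key, attention_mask)
        if is_cross:  # only cross-attention maps have text-token columns
            self.store.append(probs.detach().float().cpu())

        out = attn.batch_to_head_dim(torch.bmm(probs, value))
        return attn.to_out[1](attn.to_out[0](out))  # linear projection, then dropout


pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
).to("cuda")

maps = []
pipe.unet.set_attn_processor(AttnMapRecorder(maps))

prompt = "a photo of a cat"  # swap in a suspected trigger prompt to compare
pipe(prompt, num_inference_steps=50)

# In CLIP's tokenization the <EOS> token sits right after the last real token.
eos_idx = len(pipe.tokenizer(prompt).input_ids) - 1

# One scalar per processor call: mean <EOS> attention mass over the conditional
# half of the batch (the first half is the classifier-free-guidance branch).
per_call = torch.stack([m[m.shape[0] // 2:, :, eos_idx].mean() for m in maps])

# Fold the calls back into denoising steps (SD v1.5 has 16 cross-attention layers).
n_cross = sum(name.endswith("attn2.processor") for name in pipe.unet.attn_processors)
eos_trajectory = per_call.view(-1, n_cross).mean(dim=1)
print(eos_trajectory)
```

Under DAA's observation, plotting `eos_trajectory` for a benign prompt against a backdoor-triggered one should reveal visibly different evolution curves; the paper builds its detector on richer dynamic features of these maps.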

## 📄 Citation

If you find this project useful in your research, please consider citing:
```bibtex
@article{wang2025dynamicattentionanalysisbackdoor,
  title={Dynamic Attention Analysis for Backdoor Detection in Text-to-Image Diffusion Models},
  author={Zhongqi Wang and Jie Zhang and Shiguang Shan and Xilin Chen},
  journal={IEEE Transactions on Pattern Analysis and Machine Intelligence (TPAMI)},
  year={2025}
}
```