# 🥯 BAGEL-NHR-Edit-V2
🌐 NHR Website | 📜 [NHR Paper on arXiv](https://arxiv.org/abs/2507.14119) | 🤗 NHR-Edit Dataset (part1) | 🤗 NHR-Edit Dataset (part2)
This repository hosts the model weights for BAGEL, fine-tuned on the NHR-Edit and NHR-Edit-part2 datasets. For installation, usage instructions, and further documentation, please visit the official BAGEL GitHub repository.
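As a quick orientation before following the BAGEL instructions, the checkpoint can be fetched locally with `huggingface_hub`. This is a minimal sketch; the repo id below is a placeholder assumption, not the confirmed path, so substitute the id shown on this model page:

```python
# Minimal sketch: download the checkpoint with huggingface_hub.
# NOTE: the repo id is a hypothetical placeholder for illustration;
# loading and inference are documented in the BAGEL GitHub repository.
from huggingface_hub import snapshot_download

local_dir = snapshot_download(
    repo_id="your-org/BAGEL-NHR-Edit-V2",  # hypothetical repo id
    local_dir="BAGEL-NHR-Edit-V2",
)
print(f"Weights downloaded to: {local_dir}")
```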
## 🛠️ Training Setup
We performed parameter-efficient adaptation on the generation expert’s attention and FFN projection layers using LoRA.
LoRA parameters (a configuration sketch follows the list):
- `r = 16`
- `lora_alpha = 16`
- `dropout = 0.05`
- `bias = "none"`
- `target_modules`: `q_proj_moe_gen`, `k_proj_moe_gen`, `v_proj_moe_gen`, `o_proj_moe_gen`, `mlp_moe_gen.gate_proj`, `mlp_moe_gen.up_proj`, `mlp_moe_gen.down_proj`
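For concreteness, here is how the parameters above could be expressed with the PEFT library's `LoraConfig`. This is a sketch assuming PEFT-style fine-tuning, not the exact training script used for this model:

```python
# Sketch of the LoRA setup above, assuming Hugging Face PEFT conventions.
# The actual training code may differ; module names match the list above.
from peft import LoraConfig

lora_config = LoraConfig(
    r=16,
    lora_alpha=16,
    lora_dropout=0.05,  # "dropout" above maps to lora_dropout in PEFT
    bias="none",
    target_modules=[
        "q_proj_moe_gen",
        "k_proj_moe_gen",
        "v_proj_moe_gen",
        "o_proj_moe_gen",
        "mlp_moe_gen.gate_proj",
        "mlp_moe_gen.up_proj",
        "mlp_moe_gen.down_proj",
    ],
)
```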
### Metrics for GEdit-Bench-EN

SC = semantic consistency, PQ = perceptual quality, O = overall score; higher is better.

| Model | SC ↑ | PQ ↑ | O ↑ |
|---|---|---|---|
| BAGEL-7B-MoT | 7.610 ± 0.150 | 6.180 ± 0.150 | 6.530 ± 0.140 |
| BAGEL-NHR-Edit-V2 | 7.800 ± 0.070 | 6.560 ± 0.080 | 6.800 ± 0.070 |
Scoring model: `gpt-4.1-2025-04-14` (default temperature).
### Metrics for ImgEdit-Bench
| Model | Style | Extract | Remove | Background | Action | Adjust | Add | Replace | Compose | Overall ↑ |
|---|---|---|---|---|---|---|---|---|---|---|
| BAGEL-7B-MoT | 4.20 ± 0.05 | 1.59 ± 0.10 | 3.16 ± 0.10 | 3.29 ± 0.06 | 3.96 ± 0.17 | 3.51 ± 0.20 | 3.98 ± 0.02 | 3.54 ± 0.11 | 2.93 ± 0.26 | 3.30 ± 0.03 |
| BAGEL-NHR-Edit-V2 | 4.28 ± 0.04 | 1.65 ± 0.07 | 3.12 ± 0.06 | 3.31 ± 0.02 | 3.81 ± 0.17 | 3.48 ± 0.12 | 4.19 ± 0.03 | 3.51 ± 0.06 | 2.99 ± 0.21 | 3.33 ± 0.02 |
Scoring model: `gpt-4o-2024-11-20` (temperature = 0.0).
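Both benchmarks are scored with an LLM judge. As a rough illustration of how such judging is typically invoked via the OpenAI Python client, here is a hedged sketch; the prompt, rubric, and inputs are placeholders, and the real prompts and harness come from the benchmarks themselves:

```python
# Hedged sketch of LLM-as-judge scoring, NOT the official benchmark harness.
# The system prompt, rubric, and text inputs are placeholders; the real
# benchmarks supply their own prompts and pass images, not descriptions.
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

def judge_edit(instruction: str, before: str, after: str) -> str:
    """Ask the judge model for a numeric score (placeholder rubric)."""
    response = client.chat.completions.create(
        model="gpt-4o-2024-11-20",  # judge used for ImgEdit-Bench above
        temperature=0.0,            # deterministic scoring, per the note above
        messages=[
            {"role": "system", "content": "You grade image edits from 1 to 10."},
            {"role": "user", "content": (
                f"Instruction: {instruction}\n"
                f"Source image: {before}\n"
                f"Edited image: {after}\n"
                "Return only the numeric score."
            )},
        ],
    )
    return response.choices[0].message.content
```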
## License

BAGEL-NHR-Edit-V2 is licensed under the Apache 2.0 license. It is fine-tuned from ByteDance-Seed/BAGEL-7B-MoT, which is also licensed under Apache 2.0.
## ✍️ Citation
```bibtex
@article{Layer2025NoHumansRequired,
  author        = {Maksim Kuprashevich and Grigorii Alekseenko and Irina Tolstykh and Georgii Fedorov and Bulat Suleimanov and Vladimir Dokholyan and Aleksandr Gordeev},
  title         = {{NoHumansRequired: Autonomous High-Quality Image Editing Triplet Mining}},
  journal       = {arXiv preprint arXiv:2507.14119},
  year          = {2025},
  eprint        = {2507.14119},
  archivePrefix = {arXiv},
  primaryClass  = {cs.CV},
  url           = {https://arxiv.org/abs/2507.14119}
}
```