Error

#1
by qpqpqpqpqpqp - opened

Just "lora key not loaded:" spam, and it doesn't enhance eyes.

I haven't encountered this issue myself. Perhaps you can update ComfyUI and then use a LoRA loader node that doesn't take a CLIP input.

I'm using the latest ComfyUI version (did a git pull before trying) and got the same error. Which LoRA node do you use?

[ZImageTextEncoder] Loaded 4 templates
lora key not loaded: lora_unet_context_refiner_0_attention_to_k.alpha
lora key not loaded: lora_unet_context_refiner_0_attention_to_k.lora_down.weight
lora key not loaded: lora_unet_context_refiner_0_attention_to_k.lora_up.weight
lora key not loaded: lora_unet_context_refiner_0_attention_to_out_0.alpha
lora key not loaded: lora_unet_context_refiner_0_attention_to_out_0.lora_down.weight
lora key not loaded: lora_unet_context_refiner_0_attention_to_out_0.lora_up.weight
lora key not loaded: lora_unet_context_refiner_0_attention_to_q.alpha
lora key not loaded: lora_unet_context_refiner_0_attention_to_q.lora_down.weight
lora key not loaded: lora_unet_context_refiner_0_attention_to_q.lora_up.weight
lora key not loaded: lora_unet_context_refiner_0_attention_to_v.alpha

(image attached)

Thanks

I guess this is because the keys in the FP8/GGUF version have changed; I'm using the official original bf16 key structure.
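For anyone debugging this: the "lora key not loaded" message appears for every key whose name the loader cannot map to a module in the model, so a key-structure mismatch produces exactly this spam. A minimal, self-contained sketch of that matching logic (the key lists below are small illustrative samples, and the known-prefix table is a hypothetical stand-in for ComfyUI's real mapping, not its actual code):

```python
# Sketch: which LoRA keys would a loader reject?
# "lora key not loaded" is printed for keys that cannot be mapped to a
# module in the model. Both lists below are illustrative samples only.

def unmatched_lora_keys(lora_keys, known_prefixes):
    """Return the keys that start with none of the recognised prefixes."""
    return [k for k in lora_keys
            if not any(k.startswith(p) for p in known_prefixes)]

# Key names as stored in the LoRA file (kohya-style, from the log above):
lora_keys = [
    "lora_unet_context_refiner_0_attention_to_k.alpha",
    "lora_unet_context_refiner_0_attention_to_k.lora_down.weight",
    "lora_unet_transformer_blocks_0_attn_to_q.lora_up.weight",
]

# Prefixes the loader knows how to map (hypothetical stand-in for the
# real table inside ComfyUI's LoRA-loading code):
known_prefixes = ["lora_unet_transformer_blocks"]

for key in unmatched_lora_keys(lora_keys, known_prefixes):
    print("lora key not loaded:", key)  # prints the two context_refiner keys
```

If the `context_refiner` keys come back unmatched while `transformer_blocks` keys load, the problem is the naming scheme rather than the weights themselves.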

How did you train the LoRA? I'm very interested in the process!

I also tried using the bf16 model in my workflow to load your LoRA; still the same error.

qpqpqpqpqpqp changed discussion status to closed
qpqpqpqpqpqp changed discussion status to open

It looks like ComfyUI doesn't support this LoRA's key format. After updating, you can use this converted version:
https://huggingface.co/Kijai/Z-Image_comfy_fp8_scaled/blob/main/qinglong_detailedeye_lora_z-image_comfy.safetensors
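Such a conversion amounts to renaming each tensor so its key matches what the loader expects, while keeping the `.alpha` / `.lora_down.weight` / `.lora_up.weight` suffixes intact. A minimal sketch of the mechanics, assuming an explicit rename table (the target name below is a hypothetical placeholder for illustration, not the actual naming the linked converted file uses):

```python
# Sketch of LoRA key conversion: rename each tensor's base name via an
# explicit table while keeping the .alpha / .lora_down.weight suffix.
# The target name here is a hypothetical placeholder.

RENAME = {
    "lora_unet_context_refiner_0_attention_to_k":
        "diffusion_model.context_refiner.0.attention.to_k",
}

def convert_key(key):
    """Split off the suffix at the first '.', rename the base if known."""
    base, dot, suffix = key.partition(".")
    return RENAME.get(base, base) + dot + suffix

print(convert_key("lora_unet_context_refiner_0_attention_to_k.lora_down.weight"))
# -> diffusion_model.context_refiner.0.attention.to_k.lora_down.weight
```

In practice a converter applies such a rename to every tensor in the file and saves a new `.safetensors`, which is what the converted LoRA linked above provides.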

qpqpqpqpqpqp changed discussion status to closed
