Runtime error
Exit code: 1. Reason:

…e checkpoint to a meta parameter in the current model, which is a no-op. (Did you mean to pass `assign=True` to assign items in the state dictionary to their corresponding key in the module instead of copying them in place?)
  module._load_from_state_dict(
/usr/local/lib/python3.13/site-packages/torch/nn/modules/module.py:2586: UserWarning: for base_model.model.model.layers.35.mlp.down_proj.lora_A.default.weight: copying from a non-meta parameter in the checkpoint to a meta parameter in the current model, which is a no-op. (Did you mean to pass `assign=True` to assign items in the state dictionary to their corresponding key in the module instead of copying them in place?)
  module._load_from_state_dict(
/usr/local/lib/python3.13/site-packages/torch/nn/modules/module.py:2586: UserWarning: for base_model.model.model.layers.35.mlp.down_proj.lora_B.default.weight: copying from a non-meta parameter in the checkpoint to a meta parameter in the current model, which is a no-op. (Did you mean to pass `assign=True` to assign items in the state dictionary to their corresponding key in the module instead of copying them in place?)
  module._load_from_state_dict(

Traceback (most recent call last):
  File "/app/app.py", line 13, in <module>
    model = PeftModel.from_pretrained(base_model, "mistral-hackaton-2026/sentinel-trade-classifier-ministral-8b")
  File "/usr/local/lib/python3.13/site-packages/peft/peft_model.py", line 568, in from_pretrained
    load_result = model.load_adapter(
        model_id,
        ...<5 lines>...
        **kwargs,
    )
  File "/usr/local/lib/python3.13/site-packages/peft/peft_model.py", line 1427, in load_adapter
    self._update_offload(offload_index, adapters_weights)
    ~~~~~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.13/site-packages/peft/peft_model.py", line 1236, in _update_offload
    safe_module = dict(self.named_modules())[extended_prefix]
                  ~~~~~~~~~~~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^
KeyError: 'base_model.model.model.model.embed_tokens'