
CE-CoLLM Cloud LLM Partition

This repository hosts the cloud-side partition of the CE-CoLLM model presented in our IEEE ICWS 2025 paper, "CE-CoLLM: Efficient and Adaptive Large Language Models Through Cloud-Edge Collaboration".

Citation

Please cite the following paper when using CE-CoLLM:

@INPROCEEDINGS{jin2024cecollmefficientadaptivelarge,
  author={Jin, Hongpeng and Wu, Yanzhao},
  booktitle={2025 IEEE International Conference on Web Services (ICWS)},
  title={CE-CoLLM: Efficient and Adaptive Large Language Models Through Cloud-Edge Collaboration},
  year={2025},
  pages={316-323},
  keywords={Cloud computing;Accuracy;Web services;Large language models;Collaboration;Benchmark testing;Reliability engineering;Low latency communication;Edge computing;Software development management;Large Language Model;LLM Deployment;Cloud-Edge Collaboration;Cloud Services;Adaptive LLM Inference;Edge AI},
  doi={10.1109/ICWS67624.2025.00046},
  ISSN={2836-3868},
  month={July},
}