# CE-CoLLM Cloud LLM Partition
This repository hosts the cloud-side partition of the CE-CoLLM model introduced in our IEEE ICWS 2025 paper, "CE-CoLLM: Efficient and Adaptive Large Language Models Through Cloud-Edge Collaboration".
- Codebase: https://github.com/mlsysx/CE-CoLLM
- Edge-side partition: https://huggingface.co/hopenjin/CE-CoLLM_Edge_LLM_Partition
- Paper (IEEE Xplore): https://ieeexplore.ieee.org/abstract/document/11169709
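
To run cloud-edge collaborative inference, both partitions need to be fetched locally. Below is a minimal download sketch using `huggingface_hub`; the cloud-partition repo id (`hopenjin/CE-CoLLM_Cloud_LLM_Partition`) is an assumption that mirrors the edge naming, so verify it against this page's actual repo id. See the GitHub codebase linked above for loading the partitions and running inference.

```python
# Minimal sketch: download both CE-CoLLM partitions from the Hugging Face Hub.
from huggingface_hub import snapshot_download

# Assumed repo id for this cloud-side partition (mirrors the edge naming);
# verify against the repo id shown on this page.
cloud_dir = snapshot_download(
    repo_id="hopenjin/CE-CoLLM_Cloud_LLM_Partition",
    local_dir="ce-collm/cloud",
)

# Edge-side partition (repo id taken from the link above).
edge_dir = snapshot_download(
    repo_id="hopenjin/CE-CoLLM_Edge_LLM_Partition",
    local_dir="ce-collm/edge",
)

print("cloud partition at:", cloud_dir)
print("edge partition at:", edge_dir)
```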
## Citation
Please cite the following paper when using CE-CoLLM:
```bibtex
@INPROCEEDINGS{jin2024cecollmefficientadaptivelarge,
  author={Jin, Hongpeng and Wu, Yanzhao},
  booktitle={2025 IEEE International Conference on Web Services (ICWS)},
  title={CE-CoLLM: Efficient and Adaptive Large Language Models Through Cloud-Edge Collaboration},
  year={2025},
  pages={316-323},
  keywords={Cloud computing;Accuracy;Web services;Large language models;Collaboration;Benchmark testing;Reliability engineering;Low latency communication;Edge computing;Software development management;Large Language Model;LLM Deployment;Cloud-Edge Collaboration;Cloud Services;Adaptive LLM Inference;Edge AI},
  doi={10.1109/ICWS67624.2025.00046},
  ISSN={2836-3868},
  month={July},
}
```