Context size?

#1
by opiiopip - opened

Dear developers of the model Delta-Vector/Plesio-70B

I found this model via the roleplay tag, but I couldn't find the context size for this model. For roleplay, chat memory is crucial. Summaries of the chat history can only go so far; sooner or later, details the user clearly remembers are forgotten by the AI because of the limited context, and the immersion is lost.

On the Featherless AI deployment of this model (https://featherless.ai/models/Delta-Vector/Plesio-70B), I found the tag "Ctx length: 32768" -- but I'm a bit confused, because Featherless AI shows the same tag even for models trained on a much larger context, like https://huggingface.co/aifeifei798/DarkIdol-Llama-3.1-8B-Instruct-1.2-Uncensored?inference_provider=featherless-ai with a 128K context length.

So does Featherless AI somehow offer these models with a lower context length than was used in training?
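For reference, a model's trained context length can usually be read from the `max_position_embeddings` field in its `config.json` on the Hub; a provider can still choose to serve less than that. A minimal sketch (the values below are illustrative, not the actual Plesio-70B config):

```python
import json

# Illustrative config.json snippet -- download the real file from the
# model repo on Hugging Face to get the actual numbers.
config_json = '{"max_position_embeddings": 131072, "rope_theta": 500000.0}'

config = json.loads(config_json)
trained_ctx = config["max_position_embeddings"]
print(f"Trained context length: {trained_ctx} tokens")
# A hosting provider may still cap the served context below this value.
```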

Thank you very much in advance!

The model should be able to handle up to ~64K tokens; Featherless is compute-limited and can only serve a 32K context length.
