---
library_name: transformers
pipeline_tag: text-generation
license: mit
language:
- en
---
# Pretrained Historical Model (1750–1820)
This model was trained with the BabyLlama2 training recipe, which distills a student model from two teacher models. It was trained on 10M words from the Gutenberg corpus attributed to the time period 1750–1820.
## Model Sources
- **Repository:** https://github.com/comp-int-hum/historical-perspectival-lm
- **Paper (arXiv):** https://arxiv.org/abs/2504.05523
- **Paper (Hugging Face):** https://huggingface.co/papers/2504.05523
## Downloading the Model
Load the model like this:
```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model = AutoModelForCausalLM.from_pretrained("Hplm/student_1750_1820", torch_dtype=torch.float16)
tokenizer = AutoTokenizer.from_pretrained("Hplm/student_1750_1820")
```
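Once loaded, the model can be used for standard causal text generation. The snippet below is a minimal sketch: the prompt and decoding settings are illustrative choices, not part of this model card, and it loads in default precision so it also runs on CPU (pass `torch_dtype=torch.float16` on GPU, as in the loading example above).

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Load in default precision for a CPU-safe example; on GPU, pass
# torch_dtype=torch.float16 as shown in the loading snippet.
model = AutoModelForCausalLM.from_pretrained("Hplm/student_1750_1820")
tokenizer = AutoTokenizer.from_pretrained("Hplm/student_1750_1820")
model.eval()

# Greedy decoding (do_sample=False) keeps the continuation deterministic;
# the prompt itself is an arbitrary period-flavored example.
prompt = "It was in the year 1775 that"
inputs = tokenizer(prompt, return_tensors="pt")
with torch.no_grad():
    output_ids = model.generate(**inputs, max_new_tokens=40, do_sample=False)

text = tokenizer.decode(output_ids[0], skip_special_tokens=True)
print(text)
```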
## License

This model is released under the MIT license.
## Citation
```bibtex
@article{fittschen_diachroniclanguagemodels_2025,
  title = {Pretraining Language Models for Diachronic Linguistic Change Discovery},
  author = {Fittschen, Elisabeth and Li, Sabrina and Lippincott, Tom and Choshen, Leshem and Messner, Craig},
  year = {2025},
  month = apr,
  eprint = {2504.05523},
  primaryclass = {cs.CL},
  publisher = {arXiv},
  doi = {10.48550/arXiv.2504.05523},
  url = {https://arxiv.org/abs/2504.05523},
  urldate = {2025-04-14},
  archiveprefix = {arXiv},
  journal = {arXiv preprint arXiv:2504.05523}
}
```