NewsScope LoRA Adapter

This repository contains a LoRA adapter fine-tuned for schema-grounded claim extraction from news articles.

It produces structured JSON outputs with the following fields (an illustrative example follows the list):

  • domain
  • headline
  • key_points
  • whos_involved
  • how_it_unfolded
  • claims (2-3 verifiable claims with evidence)
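
An illustrative output (all values are invented for this example, and the exact structure of the claims entries is an assumption; the card does not specify it):

{
  "domain": "politics",
  "headline": "City council approves expanded transit budget",
  "key_points": ["Budget passed by a 7-2 vote", "Funds three new bus routes"],
  "whos_involved": ["City council", "Regional transit authority"],
  "how_it_unfolded": "The proposal was debated for two weeks before Tuesday's vote.",
  "claims": [
    {"claim": "The budget passed by a 7-2 vote.", "evidence": "Vote tally quoted in the article."},
    {"claim": "The budget funds three new bus routes.", "evidence": "Budget figures cited in the article."}
  ]
}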

Key Result (Human Evaluation)

  • NewsScope: 89.4% accuracy
  • GPT-4o-mini baseline: 93.7%
  • The reported difference is not statistically significant (p = 0.07).

Important: Llama License

You must accept the Meta Llama 3.1 license for the base model on Hugging Face:
meta-llama/Meta-Llama-3.1-8B-Instruct

Then either:

  • run huggingface-cli login, or
  • set HF_TOKEN in your environment (see the Python sketch below).
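
If you prefer to authenticate from Python instead of the CLI, huggingface_hub provides a login helper (the token value below is a placeholder):

from huggingface_hub import login

# Placeholder token: create one under Settings > Access Tokens on Hugging Face.
login(token="hf_...")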

Usage

from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel
import torch

# Load the gated base model (requires license acceptance; see above).
base = AutoModelForCausalLM.from_pretrained(
    "meta-llama/Meta-Llama-3.1-8B-Instruct",
    torch_dtype=torch.float16,
    device_map="auto",
)

# Attach the NewsScope LoRA adapter to the base model.
model = PeftModel.from_pretrained(base, "nidhipandya/NewsScope-lora")
tokenizer = AutoTokenizer.from_pretrained("meta-llama/Meta-Llama-3.1-8B-Instruct")
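
Once the adapter is loaded, a minimal generation sketch looks like the following (the instruction wording and decoding settings are assumptions; the card does not state the prompt format used in training):

import json

article = "..."  # placeholder: paste the full article text here

messages = [{
    "role": "user",
    "content": "Extract schema-grounded claims from this news article "
               "and answer with JSON only:\n\n" + article,
}]

# Build a Llama 3.1 chat prompt and generate deterministically.
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)
outputs = model.generate(input_ids=inputs, max_new_tokens=512, do_sample=False)

# Decode only the newly generated tokens and parse them as JSON.
reply = tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True)
result = json.loads(reply)  # may raise if the model wraps the JSON in extra text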

Training Details

  • Base model: meta-llama/Meta-Llama-3.1-8B-Instruct
  • LoRA rank: 16
  • Training set size: 315 articles (URLs + annotations; article text not publicly redistributed)
  • Notes: Reproducing training requires fetching the article text from the provided URLs, since the text itself cannot be redistributed for copyright reasons.
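
For orientation, a peft configuration consistent with these details might look like the sketch below; lora_alpha, lora_dropout, and target_modules are assumptions, since the card only reports the rank:

from peft import LoraConfig

lora_config = LoraConfig(
    r=16,                       # LoRA rank, as reported above
    lora_alpha=32,              # assumption: not stated in the card
    lora_dropout=0.05,          # assumption: not stated in the card
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj"],  # assumption: common Llama choice
    task_type="CAUSAL_LM",
)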

Citation

@article{pandyaNewsscope,
  title={NewsScope: Schema-Grounded Cross-Domain News Claim Extraction with Open Models},
  author={Pandya, Nidhi},
  journal={arXiv preprint arXiv:TBD},
  year={TBD}
}