Gemma 2 9B - Terraform Principal Architect

This is a fine-tuned LoRA adapter for Gemma 2 9B Instruct, specialized in generating production-ready Google Cloud Platform (GCP) Terraform code.

Training Performance

  • Eval Loss: 0.4558
  • BLEU Score: 0.3416
  • Training Steps: Final Checkpoint
  • Hardware: Vertex AI (GCP)

Usage

```python
import torch
from peft import PeftModel
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

base_model = "google/gemma-2-9b-it"
adapter_id = "AdarshRL/gemma2-9b-terraform-architect-adapter"

# Optional: 4-bit quantization (via bitsandbytes) so the 9B base model
# fits on a single consumer GPU. Passing load_in_4bit directly to
# from_pretrained is deprecated in recent transformers versions.
bnb_config = BitsAndBytesConfig(load_in_4bit=True, bnb_4bit_compute_dtype=torch.bfloat16)

tokenizer = AutoTokenizer.from_pretrained(base_model)
model = AutoModelForCausalLM.from_pretrained(base_model, device_map="auto", quantization_config=bnb_config)
model = PeftModel.from_pretrained(model, adapter_id)
```

⚠️ Limitations & Known Issues

While this model is specialized in GCP Terraform generation, users should be aware of the following technical constraints observed during evaluation:

1. Resource Dependency Logic

  • Issue: The model occasionally struggles with complex resource dependencies (e.g., trying to create a node_pool before the cluster is fully initialized).
  • Recommendation: Always verify that depends_on blocks are present for sub-resources like Node Pools, IAM Bindings, and Firewall rules.
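As a sketch of what to verify (resource names, the service account, and `var.project_id` are illustrative, not part of this model's output), an explicit `depends_on` on an IAM binding looks like:

```hcl
resource "google_service_account" "gke_nodes" {
  account_id   = "gke-node-sa"
  display_name = "GKE node service account"
}

resource "google_project_iam_member" "node_log_writer" {
  project = var.project_id
  role    = "roles/logging.logWriter"
  member  = "serviceAccount:${google_service_account.gke_nodes.email}"

  # The email reference already creates an implicit dependency; the
  # explicit depends_on guards against the ordering mistakes noted above.
  depends_on = [google_service_account.gke_nodes]
}
```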

2. Syntax Drift (Inline vs. Separate Resources)

  • Issue: The model may occasionally default to "Inline" syntax for GKE node pools (node_config inside the google_container_cluster resource).
  • Architect's Note: For production stability, it is highly recommended to manually refactor these into separate google_container_node_pool resources to avoid unnecessary cluster recreations.
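A minimal sketch of the recommended refactor (names and machine type are illustrative): remove the default node pool from the cluster and manage the pool as its own resource, so node changes do not force cluster recreation.

```hcl
resource "google_container_cluster" "primary" {
  name     = "prod-cluster"
  location = "us-central1"

  # Let a dedicated google_container_node_pool resource manage nodes
  remove_default_node_pool = true
  initial_node_count       = 1
}

resource "google_container_node_pool" "primary_nodes" {
  name       = "primary-pool"
  location   = "us-central1"
  cluster    = google_container_cluster.primary.name
  node_count = 3

  node_config {
    machine_type = "e2-standard-4"
  }
}
```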

3. Versioning

  • Issue: Fine-tuned on data that may favor older provider versions (roughly v4.x to v5.x).
  • Fix: Users should manually update the required_providers block to the latest version (e.g., ~> 7.0 as of 2026) and run terraform validate.
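A minimal required_providers block pinned to the version constraint mentioned above:

```hcl
terraform {
  required_providers {
    google = {
      source  = "hashicorp/google"
      version = "~> 7.0"
    }
  }
}
```

After updating the constraint, run `terraform init -upgrade` followed by `terraform validate`.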

4. Hallucination Risk

  • Issue: Like all 9B-parameter models, this model can occasionally hallucinate non-existent Terraform arguments or resource attributes in edge-case configurations.
  • Recommendation: Always run terraform validate (and review terraform plan output) before applying generated code.