
InventoryGemma

This repository contains a GGUF model file (InventoryGemma-268M-BF16.gguf) based on google/gemma-3-270m, fine-tuned to act as an expert grocery inventory manager.

Quickstart (run the GGUF)

Option A: llama.cpp

  1. Build or install llama.cpp (see the upstream project).
  2. Run an inference command like:
./llama-cli -m "InventoryGemma-268M-BF16.gguf" -p "Hello!"

Option B: GUI apps (LM Studio / similar)

  • Import/open the .gguf file in your app and start chatting.

Prompt format (expected input/output)

Provide the prompt in this shape (Markdown table works well):

You are an expert inventory manager for a grocery store.

Current inventory:

| Product | Current Stock | Average Weekly Sales |
|---|---:|---:|
| Apples | 52 | 48 |
| Oil | 379 | 44 |
| Sugar | 279 | 58 |
| Pasta | 302 | 6 |
| Cereal | 16 | 28 |
| Cheese | 47 | 2 |

Provide recommendations:
- List products that are low on stock (current stock < weekly sales) as urgent.
- Suggest restocking for products where current stock <= 3 * weekly sales.
- Recommended order quantity: enough to reach 4 weeks of stock, rounded up to nearest 10, minimum one week's sales.
- Provide a summary of total units to order.
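If you are assembling this prompt programmatically, a minimal sketch (the `build_prompt` helper is illustrative, not part of this repository):

```python
def build_prompt(inventory):
    """Render the expected prompt shape from (product, stock, weekly_sales) tuples."""
    lines = [
        "You are an expert inventory manager for a grocery store.",
        "",
        "Current inventory:",
        "",
        "| Product | Current Stock | Average Weekly Sales |",
        "|---|---:|---:|",
    ]
    for product, stock, weekly in inventory:
        lines.append(f"| {product} | {stock} | {weekly} |")
    lines += [
        "",
        "Provide recommendations:",
        "- List products that are low on stock (current stock < weekly sales) as urgent.",
        "- Suggest restocking for products where current stock <= 3 * weekly sales.",
        "- Recommended order quantity: enough to reach 4 weeks of stock, rounded up to nearest 10, minimum one week's sales.",
        "- Provide a summary of total units to order.",
    ]
    return "\n".join(lines)

prompt = build_prompt([("Apples", 52, 48), ("Cereal", 16, 28)])
```

The resulting string can be passed directly as the user prompt (e.g. via `-p` in llama-cli).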

Example output:

Urgent low stock items:
- Cereal

Recommended restocks:
- Apples: order 140 units (current: 52, weekly sales: 48)
- Cereal: order 100 units (current: 16, weekly sales: 28)

Summary: restock 2 products, total order quantity 240 units.
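The order quantities above follow directly from the rules in the prompt. A short Python sketch (the `recommend` helper is hypothetical, shown only to make the arithmetic explicit) reproduces the example numbers:

```python
import math

def recommend(inventory, weeks_target=4, restock_factor=3, round_to=10):
    """Apply the prompt's rules to (product, stock, weekly_sales) tuples."""
    urgent, restocks = [], []
    for product, stock, weekly in inventory:
        if stock < weekly:                       # urgent: less than one week on hand
            urgent.append(product)
        if stock <= restock_factor * weekly:     # restock threshold: <= 3 weeks on hand
            needed = weeks_target * weekly - stock   # top up to 4 weeks of stock
            qty = max(needed, weekly)                # at least one week's sales
            qty = math.ceil(qty / round_to) * round_to  # round up to nearest 10
            restocks.append((product, qty))
    return urgent, restocks

inventory = [
    ("Apples", 52, 48), ("Oil", 379, 44), ("Sugar", 279, 58),
    ("Pasta", 302, 6), ("Cereal", 16, 28), ("Cheese", 47, 2),
]
urgent, restocks = recommend(inventory)
total = sum(qty for _, qty in restocks)
# urgent == ["Cereal"]; restocks == [("Apples", 140), ("Cereal", 100)]; total == 240
```

Apples needs 4 × 48 − 52 = 140 units (already a multiple of 10); Cereal needs 4 × 28 − 16 = 96, rounded up to 100, for a total of 240 units, matching the example output.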

Notes

  • Runtimes differ: Some runtimes handle system prompts/templates differently; if results look off, try placing the instruction block at the start of the user prompt exactly as shown above.
  • Hardware: BF16 GGUF typically runs best on modern CPUs/GPUs supported by your runtime; exact performance depends on backend and quantization.

License

The repository is marked as MIT (see front matter). If the included weights have additional upstream license terms, document them here as well.

Model details

  • Repository: rajofearth/inventorygemma
  • Base model: google/gemma-3-270m
  • Parameters: 0.3B
  • Architecture: gemma3
  • Precision: BF16 (16-bit) GGUF