John1604's LLM for HIPAA
This is an LLM for HIPAA law. Ask it questions about HIPAA. It runs in both Ollama and LM Studio.
Use the model in Ollama
First, download and install Ollama.
Command
In the Windows command line, or in a terminal on Ubuntu, type:
ollama run hf.co/ling1000T/John1604-HIPAA-English-gguf:q6_k
(q6_k is the model quantization type; other quants such as q5_k_s, q4_k_m, etc. can also be used)
C:\Users\developer>ollama run hf.co/ling1000T/John1604-HIPAA-English-gguf:q6_k
pulling manifest
...
verifying sha256 digest
writing manifest
success
>>> in a form authorization, patient was asked to provide agreement
to let other third parties and provider to share patient information.
can patient refuse?
Yes, a patient has the right to refuse in a form authorization. Even though
some commenters suggested that patients might be confused about what they are agreeing to,
there is no indication that this confusion would lead patients to sign
authorizations without understanding or consent. The current rule allows
for conditional authorizations when the condition is clearly indicated—such
as requiring payment before treatment—but the Department maintains
that this standard should not be expanded in a way that undermines patient autonomy.
Therefore, patients retain control over their health information and can choose
not to agree to the sharing of their information with other third parties or providers.
>>> Send a message (/? for help)
After you run the command ollama run hf.co/ling1000T/John1604-HIPAA-English-gguf:q6_k once, the model appears in the Ollama UI. You can then select hf.co/ling1000T/John1604-HIPAA-English-gguf:q6_k from the model list and run it the same way as other pre-configured Ollama models.
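Besides the interactive prompt, the model can also be queried programmatically through Ollama's REST API. Below is a minimal sketch, assuming a standard local Ollama install listening on its default port 11434; the helper names are ours, not part of Ollama.

```python
import json
import urllib.request

# Default local Ollama endpoint (assumption: a standard install on port 11434).
OLLAMA_URL = "http://localhost:11434/api/generate"
MODEL = "hf.co/ling1000T/John1604-HIPAA-English-gguf:q6_k"

def build_payload(prompt: str) -> dict:
    # Non-streaming request body for Ollama's /api/generate endpoint.
    return {"model": MODEL, "prompt": prompt, "stream": False}

def ask(prompt: str) -> str:
    # POST the prompt and return the model's full text response.
    data = json.dumps(build_payload(prompt)).encode("utf-8")
    req = urllib.request.Request(
        OLLAMA_URL, data=data, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

# Example (requires Ollama to be running with the model already pulled):
# print(ask("Can a patient refuse a blanket authorization to share their information?"))
```

Setting "stream": False returns the whole answer in one JSON object; omit it to receive the response token by token.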
Use the model in LM Studio
Download and install LM Studio.
Discover models
In LM Studio, click the "Discover" icon; the "Mission Control" popup window will be displayed.
In the "Mission Control" search bar, type "ling1000T/John1604-HIPAA-English-gguf" and check "GGUF"; the model should be found.
Download the model.
Load the model.
Ask questions.
Quantized models
| Type | Bits | Quality | Description |
|---|---|---|---|
| Q2_K | 2-bit | 🟥 Low | Minimal footprint; only for tests |
| Q3_K_S | 3-bit | 🟧 Low | “Small” variant (less accurate) |
| Q3_K_M | 3-bit | 🟧 Low–Med | “Medium” variant |
| Q4_K_S | 4-bit | 🟨 Med | Small, faster, slightly less quality |
| Q4_K_M | 4-bit | 🟩 Med–High | “Medium” — best 4-bit balance |
| Q5_K_S | 5-bit | 🟩 High | Slightly smaller than Q5_K_M |
| Q5_K_M | 5-bit | 🟩🟩 High | Excellent general-purpose quant |
| Q6_K | 6-bit | 🟩🟩🟩 Very High | Almost FP16 quality, larger size |
| Q8_0 | 8-bit | 🟩🟩🟩🟩 Very High | Near-lossless |
| F16 | 16-bit | 🟩🟩🟩🟩 Reference | Full-precision baseline |
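As a rough rule of thumb, a quantized GGUF's file size scales with bits per weight. The sketch below estimates sizes for an ~8B-parameter model like this one; the bits-per-weight figures are nominal assumptions (real K-quants add per-block scale overhead), so treat the results as ballpark only.

```python
# Nominal effective bits per weight for common GGUF quant types
# (assumption: approximate values, not exact llama.cpp figures).
NOMINAL_BITS = {
    "Q2_K": 2.6, "Q3_K_M": 3.9, "Q4_K_M": 4.8,
    "Q5_K_M": 5.7, "Q6_K": 6.6, "Q8_0": 8.5, "F16": 16.0,
}

def approx_gguf_size_gb(n_params_billions: float, bits_per_weight: float) -> float:
    # Rough estimate: parameters * bits / 8 bytes, ignoring file metadata.
    return n_params_billions * bits_per_weight / 8.0

for quant, bits in NOMINAL_BITS.items():
    print(f"{quant}: ~{approx_gguf_size_gb(8.0, bits):.1f} GB")
```

This is why Q6_K of an 8B model lands in the 6-7 GB range while F16 needs about 16 GB.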
International Inventor's License
Non-commercial use is free of any fees.
For commercial use, if the company or individual does not make any profit, no fees are required.
For commercial use, if the company or individual has a net profit, they should pay 1% of the net profit or 0.5% of the sales revenue, whichever is less.
For commercial users, we provide product-related services.
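To make the fee rule concrete, here is a small illustrative calculation (the function name is ours, not part of the license text):

```python
def commercial_fee(net_profit: float, sales_revenue: float) -> float:
    # Per the license: no fee without a net profit; otherwise the lesser of
    # 1% of net profit and 0.5% of sales revenue.
    if net_profit <= 0:
        return 0.0
    return min(0.01 * net_profit, 0.005 * sales_revenue)

# A company with $100,000 net profit on $5,000,000 revenue:
# 1% of profit is $1,000; 0.5% of revenue is $25,000; the lesser applies.
print(commercial_fee(100_000, 5_000_000))
```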
Downloads last month: 33
Model tree for ling1000T/John1604-HIPAA-English-gguf
Base model: Qwen/Qwen3-8B-Base