Files changed (1)
  1. README.md +69 -84
README.md CHANGED
@@ -5,115 +5,100 @@ datasets:
  language:
  - en
  base_model:
- - mistralai/Mistral-8x7B
  tags:
- - transfer-orbits
  - hohmann-transfer-orbits
- - text-generation-inference
- - transformers
  ---
-
- # Mistral 7B Fine-Tuned on Titan-Hohmann-Transfer-Orbit Dataset
-
- ## Model Card
-
- ### Model Overview
-
- **Model Name**: `mistral-7b-titan-hohmann`
- **Model Type**: Transformer-based language model
- **Languages**: English
- **License**: MIT
-
- This model is based on the **Mistral-8x7B** foundation model and is being fine tuned on the **titan-hohmann-transfer-orbit** dataset. It is designed to assist with advanced orbital calculations, specifically focusing on space probe Hohmann transfer orbits, and includes support for native function calling and multimodal inputs.
-
- ---
-
- ### Model Details
-
- - **Developers**: A Taylor
- - **Model Architecture**: Transformer based with enhancements for code generation and multimodal processing
- - **Parameters**: 7 Billion
- - **Native Function Calling**: Supported
- - **Multimodal Capabilities**: Text and Image processing
-
- ---
-
- ### Intended Use
-
- - **Primary Applications**:
-   - Assist aerospace engineers with calculations related to Hohmann transfer orbits
-   - Facilitate research by offering computational assistance in orbital dynamics
- - **Usage Scenarios**:
-   - Calculating delta-v requirements
-   - Estimating transfer times between celestial bodies
-   - Visualizing orbital trajectories
-
- ---
-
- ### Training Data
-
- - **Dataset Name**: `titan-hohmann-transfer-orbit`
- - **Description**: A specialized dataset containing textual explanations, code snippets, and diagrams related to probe orbital mechanics and Hohmann transfer orbits.
- - **Data Modalities**:
-   - **Text**: Technical documentation and literature on orbital mechanics
-   - **Code**: Sample code for orbital calculations and simulations
-   - **Images**: Diagrams illustrating probe orbital paths and spacecraft maneuvers
-
  ---

- ### Training Procedure

- The model is currently being fine tuned to enhance its capabilities in handling advanced orbital calculations and multimodal inputs. Future enhancements will include:

- 1. **Multimodal Processing**: Enabling the model to interpret and generate both text and image data related to orbital mechanics.
- 2. **Native Function Calling**: Integrating the ability to execute predefined computational functions within responses for accurate and dynamic calculations.
- 3. **Domain-Specific Fine-Tuning**: Refining the model's parameters with the titan-hohmann-transfer-orbit dataset to enhance accuracy and expertise in calculating Hohmann transfer trajectories.
- 4. **Validation and Testing**: Verifying the model outputs against industry standards and benchmarks to ensure precision and reliability in aerospace applications.

- ---

- ### How to Use

- - **Input Format**:

- - **Examples**:

-
- ---

- ### Limitations

- - **Work in Progress**: The model is currently being fine-tuned; performance may improve over time.
- - **Domain Specificity**: Optimized for Hohman transfer orbits.
- - **Computational Resources**: Requires adequate computational power for optimal performance due to model size and complexity.

- ---

- ### Ethical Considerations

- - **Accuracy**: Users should verify critical calculations independently.
- - **Professional Use**: Intended to assist professionals; not a substitute for expert judgment.

- ---

- ### Acknowledgements

- - **Mistral AI**: For providing the Mistral 8B foundation model.
- - **Dataset Contributors**: A Taylor
- - **Open-Source Community**: Appreciation for the tools and libraries that made this project possible.

- ---
-
- ### License

- - **Model License**: MIT
- - **Dataset License**: MIT

- ### Future Work

- - **Next Version**: Calculations for mineral rich asteroids within the Van Allen belt, and other natural satellites beyond it, including significant inclination adjustments, will be included in version 2.0

- ---

  ### Contact Information

 
  language:
  - en
  base_model:
+ - mistralai/Pixtral-12B-Base-2409
  tags:
+ - mistral
+ - pixtral
+ - vlm
+ - multimodal
+ - image-text-to-text
+ - orbital-mechanics
  - hohmann-transfer-orbits
  ---
+ language:
+ - en
+ license: mit
+ library_name: transformers
+ pipeline_tag: image-text-to-text
+ base_model: mistralai/Pixtral-12B-Base-2409
+ datasets:
+ - Taylor658/titan-hohmann-transfer-orbit
  ---

+ # Pixtral 12B Fine-Tuned on Titan-Hohmann-Transfer-Orbit

+ > Updated to the latest suitable Mistral multimodal base: `mistralai/Pixtral-12B-Base-2409`.

+ ## Overview

+ A fine-tuned variant of Pixtral 12B for orbital mechanics, with an emphasis on Hohmann transfer orbits. It accepts multimodal (image + text) input and produces text output.

+ ## Model Details

+ - Base: `mistralai/Pixtral-12B-Base-2409`
+ - Type: Multimodal (Vision + Text)
+ - Params: ~12B (decoder) + vision encoder
+ - Languages: English
+ - License: MIT

+ ## Intended Use
45
 
46
+ - Hohmann transfer ∆v estimation
47
+ - Transfer-time approximations
48
+ - Orbit analysis aids and reasoning
49
 
50
+ ## Quickstart
51
 
52
+ ### vLLM (multimodal)
53
+ ```python
54
+ from vllm import LLM
55
+ from vllm.sampling_params import SamplingParams
56
 
57
+ llm = LLM(model="mistralai/Pixtral-12B-Base-2409", tokenizer_mode="mistral")
58
+ sampling = SamplingParams(max_tokens=512, temperature=0.2)
59
 
60
+ messages = [
61
+ {
62
+ "role": "user",
63
+ "content": [
64
+ {"type": "text", "text": "Given this diagram, estimate the delta-v for a Hohmann transfer to Titan."},
65
+ {"type": "image_url", "image_url": {"url": "https://example.com/orbit_diagram.png"}}
66
+ ]
67
+ }
68
+ ]
69
+ resp = llm.chat(messages, sampling_params=sampling)
70
+ print(resp[0].outputs[0].text)
71
+ ```

+ ### Transformers (text-only demo)
+ ```python
+ from transformers import AutoModelForCausalLM, AutoTokenizer

+ # Loading through Transformers assumes a Transformers-compatible checkpoint;
+ # the official Pixtral weights are primarily distributed in Mistral's native format.
+ model_id = "mistralai/Pixtral-12B-Base-2409"
+ tok = AutoTokenizer.from_pretrained(model_id)
+ model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype="auto", device_map="auto")

+ prompt = "Compute approximate delta-v for a Hohmann transfer to Titan. State assumptions."
+ inputs = tok(prompt, return_tensors="pt").to(model.device)
+ # do_sample=True is needed for temperature to take effect.
+ out = model.generate(**inputs, max_new_tokens=512, do_sample=True, temperature=0.2)
+ print(tok.decode(out[0], skip_special_tokens=True))
+ ```

+ ## Training Data

+ - Dataset: `Taylor658/titan-hohmann-transfer-orbit`
+ - Modalities: text (explanations), code (snippets), images (orbital diagrams)
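+
+ The dataset can be inspected directly. A minimal sketch, assuming the ID above resolves on the Hugging Face Hub and loads with the standard `datasets` API (split and column names are not guaranteed by this card):
+
+ ```python
+ from datasets import load_dataset
+
+ # Assumes the dataset ID from the list above is available on the Hub.
+ ds = load_dataset("Taylor658/titan-hohmann-transfer-orbit")
+ for split, data in ds.items():
+     print(split, len(data), data.column_names)
+ ```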
 

+ ## Limitations

+ - Optimized for Hohmann transfers and related reasoning
+ - Requires sufficient GPU VRAM for best throughput
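+
+ As a rough sizing note, ~12B parameters in bf16 come to roughly 24 GB for the decoder weights alone, before the vision encoder, activations, and KV cache.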

+ ## Acknowledgements

+ - Base model by Mistral AI (Pixtral 12B)
+ - Dataset by A Taylor

  ### Contact Information