DroneMamba-RCS Classifier
A high-performance Selective State Space Model (Mamba) for classifying Radar Cross Section (RCS) signatures of drones and objects.
Model Performance
- Test Accuracy: 90.88%
- Macro F1-Score: 0.8850
- Classes: 10 (F450, Heli, Hexa, M100, Mavic, P4P, Parrot, Walkera, Y600, battery)
- Test Set Size: 8,961 sequences (20% split)
Classification Classes
- F450
- Heli
- Hexa
- M100
- Mavic
- P4P
- Parrot
- Walkera
- Y600
- battery
Dataset Information
This model was trained on the Drone_RCS_Measurement dataset:
- Source: https://huggingface.co/datasets/Goorm-AI-04/Drone_RCS_Measurement
- Total Samples: 44,805 RCS sequences
- Preprocessing:
  - Resampled to 181 points (0°-180° coverage)
  - Normalized using `StandardScaler`
  - Grouped by frequency and azimuth angle
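A minimal sketch of this preprocessing, assuming raw sweeps sampled at arbitrary azimuth angles (the exact training pipeline may differ; the helper and the random data below are illustrative only):

```python
import numpy as np
from sklearn.preprocessing import StandardScaler

def resample_rcs(angles_deg, rcs_values, n_points=181):
    """Resample one RCS sweep onto a uniform 0°-180° grid via linear interpolation."""
    target_angles = np.linspace(0.0, 180.0, n_points)
    return np.interp(target_angles, angles_deg, rcs_values)

# Hypothetical raw sweeps (angle, value) pairs; replace with measurements
# grouped by frequency and azimuth angle as described above
raw_sweeps = [(np.sort(np.random.uniform(0, 180, 250)), np.random.randn(250))
              for _ in range(8)]

resampled = np.stack([resample_rcs(a, v) for a, v in raw_sweeps])  # [n_sweeps, 181]
scaler = StandardScaler().fit(resampled)        # fit on training sweeps only
normalized = scaler.transform(resampled)        # [n_sweeps, 181]
sequences = normalized[..., None]               # [n_sweeps, 181, 1] for the model
```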
Architecture Details
Model Specifications:
- Backbone: 6-Layer Mamba (Selective State Space Model)
- Hidden Dimension: 256 (`d_model`)
- State Dimension: 8 (`d_state`)
- Expansion Factor: 1.5
- Total Parameters: ~2.8M
- Input Shape: `[batch_size, 181, 1]`
- Output: Class logits `[batch_size, 10]`
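The shipped `model_code.py` defines the actual architecture. Purely to illustrate how the specifications above could fit together, here is a minimal sketch assuming the `mamba_ssm` package's `Mamba` block; the class name, pooling, and layer layout here are assumptions, not the repository's implementation:

```python
import torch.nn as nn
from mamba_ssm import Mamba  # assumes the mamba-ssm package is installed

class DroneMambaClassifierSketch(nn.Module):
    """Illustrative 6-layer Mamba encoder over [batch, 181, 1] RCS sequences."""
    def __init__(self, num_classes=10, d_model=256, d_state=8, expand=1.5,
                 depth=6, dropout=0.1):
        super().__init__()
        self.input_proj = nn.Linear(1, d_model)   # lift scalar RCS samples to d_model
        self.layers = nn.ModuleList(
            [Mamba(d_model=d_model, d_state=d_state, expand=expand) for _ in range(depth)]
        )
        self.norms = nn.ModuleList([nn.LayerNorm(d_model) for _ in range(depth)])
        self.dropout = nn.Dropout(dropout)
        self.head = nn.Linear(d_model, num_classes)

    def forward(self, x):                          # x: [batch, 181, 1]
        h = self.input_proj(x)
        for norm, layer in zip(self.norms, self.layers):
            h = h + layer(norm(h))                 # pre-norm residual blocks
        h = self.dropout(h.mean(dim=1))            # mean-pool over the angle axis
        return self.head(h)                        # [batch, num_classes]
```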
Training Configuration:
- Optimizer: AdamW (lr=5e-4, weight_decay=0.05)
- Scheduler: Cosine Annealing (eta_min=1e-6)
- Regularization: Dropout 0.1 + Weight Decay
- Batch Size: 1024
- Epochs: 200
- Hardware: NVIDIA GeForce RTX 5090
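The training script itself is not part of this repository. A rough sketch of how the configuration above (plus the gradient clipping noted under Training Insights) could map onto PyTorch, using placeholder data and an assumed `T_max` equal to the epoch count:

```python
import torch
import torch.nn as nn
from torch.utils.data import DataLoader, TensorDataset
from model_code import DroneMambaClassifier  # architecture shipped in this repo

model = DroneMambaClassifier(num_classes=10, d_model=256, depth=6)
optimizer = torch.optim.AdamW(model.parameters(), lr=5e-4, weight_decay=0.05)
scheduler = torch.optim.lr_scheduler.CosineAnnealingLR(optimizer, T_max=200, eta_min=1e-6)
criterion = nn.CrossEntropyLoss()

# Placeholder data; substitute the preprocessed RCS sequences and integer labels
train_loader = DataLoader(
    TensorDataset(torch.randn(4096, 181, 1), torch.randint(0, 10, (4096,))),
    batch_size=1024, shuffle=True,
)

for epoch in range(200):
    model.train()
    for x, y in train_loader:
        optimizer.zero_grad()
        loss = criterion(model(x), y)
        loss.backward()
        torch.nn.utils.clip_grad_norm_(model.parameters(), 1.0)  # norm clipping at 1.0
        optimizer.step()
    scheduler.step()
```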
Usage
Quick Start
```python
import torch
import joblib
import numpy as np
from model_code import DroneMambaClassifier

# 1. Load model and preprocessing
model = DroneMambaClassifier(num_classes=10, d_model=256, depth=6)
model.load_state_dict(torch.load("pytorch_model.bin", map_location='cpu'))
scaler = joblib.load("scaler.pkl")
model.eval()

# 2. Prepare your RCS data (181 points from 0° to 180°)
rcs_signal = np.random.randn(181, 1)  # Replace with your actual RCS measurements

# 3. Normalize and predict (scaler expects [n_samples, 181], as in the batch example)
normalized = scaler.transform(rcs_signal.reshape(1, -1)).reshape(181, 1)
x = torch.tensor(normalized, dtype=torch.float32).unsqueeze(0)  # [1, 181, 1]

with torch.no_grad():
    logits = model(x)
    probs = torch.softmax(logits, dim=1)

pred_class = logits.argmax(dim=1).item()
confidence = probs[0, pred_class].item()

print(f"Predicted Class: {pred_class}")
print(f"Confidence: {confidence*100:.2f}%")
```
Batch Prediction
```python
# For multiple sequences
rcs_batch = np.random.randn(10, 181, 1)  # 10 sequences
normalized_batch = scaler.transform(rcs_batch.reshape(-1, 181)).reshape(10, 181, 1)
x_batch = torch.tensor(normalized_batch, dtype=torch.float32)

with torch.no_grad():
    predictions = model(x_batch).argmax(dim=1)

print(predictions)  # Tensor of predicted classes
```
Repository Files
| File | Description |
|---|---|
| `pytorch_model.bin` | Model weights (state_dict) |
| `scaler.pkl` | StandardScaler fitted on training data |
| `model_code.py` | Model architecture definition |
| `confusion_matrix.png` | Test set confusion matrix visualization |
| `classification_report.txt` | Detailed per-class metrics |
| `per_class_metrics.csv` | Per-class accuracy table |
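If you are not cloning the repository, the individual files can also be fetched with `huggingface_hub`; the repo id below is taken from the citation URL and is an assumption:

```python
from huggingface_hub import hf_hub_download

repo_id = "Bombek1/DroneMamba-RCS"  # assumed from the citation URL
weights_path = hf_hub_download(repo_id=repo_id, filename="pytorch_model.bin")
scaler_path = hf_hub_download(repo_id=repo_id, filename="scaler.pkl")
code_path = hf_hub_download(repo_id=repo_id, filename="model_code.py")
```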
Technical Details
State Space Model (SSM) Formulation
The Mamba architecture implements a selective SSM with:
- Continuous-time state equation: `h'(t) = A h(t) + B x(t)`
- Discretization: time-varying `Δt` computed per token
- Selectivity: parameters `B`, `C`, `Δt` are input-dependent
- Efficient scanning: linear complexity in sequence length
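For intuition only, a naive single-channel Python version of that discretized recurrence might look like the sketch below; the real Mamba layer uses a fused parallel scan kernel, and the shapes and discretization here are simplified assumptions:

```python
import numpy as np

def selective_scan(x, A, B, C, dt):
    """Naive selective SSM scan for a single channel.
    x:  [L]    input sequence
    A:  [N]    diagonal state matrix (negative for stability)
    B:  [L, N] input-dependent input projection
    C:  [L, N] input-dependent output projection
    dt: [L]    per-token step sizes
    """
    L, N = B.shape
    h = np.zeros(N)
    y = np.zeros(L)
    for t in range(L):
        A_bar = np.exp(dt[t] * A)      # zero-order-hold discretization of A
        B_bar = dt[t] * B[t]           # simplified (Euler) discretization of B
        h = A_bar * h + B_bar * x[t]   # h_t = Ā h_{t-1} + B̄ x_t
        y[t] = C[t] @ h                # y_t = C_t h_t
    return y

L, N = 181, 8
y = selective_scan(np.random.randn(L), -np.abs(np.random.randn(N)),
                   np.random.randn(L, N), np.random.randn(L, N),
                   0.1 * np.abs(np.random.randn(L)))
```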
Why Mamba for RCS Classification?
- Sequential Nature: RCS varies smoothly across angles
- Long-range Dependencies: Pattern recognition across the full 180° sweep
- Efficiency: Linear complexity vs. O(L²) for transformers
- Inductive Bias: SSM structure matches physics of radar scattering
Training Insights
- Convergence: Model reached 94%+ accuracy after ~150 epochs
- Regularization Impact: Weight decay (0.05) prevented overfitting
- Class Balance: All classes have >90% accuracy (see confusion matrix)
- Gradient Clipping: Norm clipping at 1.0 stabilized training
Deployment
ONNX Export (Optional)
```python
# Export to ONNX for production deployment
dummy_input = torch.randn(1, 181, 1)
torch.onnx.export(
    model,
    dummy_input,
    "dronemamba.onnx",
    input_names=['rcs_signal'],
    output_names=['class_logits'],
    dynamic_axes={'rcs_signal': {0: 'batch'}, 'class_logits': {0: 'batch'}},
)
```
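To sanity-check the exported graph, the ONNX file can be run with `onnxruntime` (a hedged example; `onnxruntime` must be installed separately):

```python
import numpy as np
import onnxruntime as ort

session = ort.InferenceSession("dronemamba.onnx")
dummy = np.random.randn(1, 181, 1).astype(np.float32)
logits = session.run(["class_logits"], {"rcs_signal": dummy})[0]
print(logits.shape)  # expected: (1, 10)
```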
Citation
If you use this model in your research, please cite:
```bibtex
@model{dronemamba2024,
  title={DroneMamba-RCS: Selective State Space Model for Drone Classification},
  author={Bombek1},
  year={2024},
  url={https://huggingface.co/Bombek1/DroneMamba-RCS}
}

@dataset{drone_rcs_2024,
  title={Drone RCS Measurement Dataset},
  author={Goorm-AI-04},
  year={2024},
  url={https://huggingface.co/datasets/Goorm-AI-04/Drone_RCS_Measurement}
}
```
License
MIT License - Free to use with attribution.
Acknowledgments
- Dataset: Goorm-AI-04 team
- Architecture: Inspired by "Mamba: Linear-Time Sequence Modeling with Selective State Spaces" (Gu & Dao, 2023)
- Training Hardware: NVIDIA GeForce RTX 5090