Dataset Viewer
Auto-converted to Parquet
Schema (column, type, and the value range or class count reported by the viewer):

| Column | Type | Values / range |
|---|---|---|
| T | stringclasses | 1 value |
| Modelo (Model) | stringclasses | 5 values |
| Tipo (Type) | stringclasses | 1 value |
| Arquitetura (Architecture) | stringclasses | 2 values |
| Tipo de Peso (Weight type) | stringclasses | 1 value |
| Precisão (Precision) | stringclasses | 1 value |
| Licença (License) | stringclasses | 2 values |
| #Params (B) | float64 | 0.49 – 3.21 |
| Hub Likes | int64 | 270 – 1.57k |
| Disponível no hub (Available on the hub) | bool | 1 class |
| SHA do modelo (Model SHA) | stringclasses | 5 values |
| Discurso de Ódio (Hate Speech) | float64 | 0.42 – 0.72 |
| Área do Direito (Law), Computação (Computing), Provas Militares (Military Exams), Área Médica (Medical), Multidisciplinar (Multidisciplinary), Economia e Contabilidade (Economics and Accounting), Semântica e Inferência (Semantics and Inference) | float64 | 0 – 0 |
| HateBR | float64 | 0.5 – 0.8 |
| PT Hate Speech | float64 | 0.53 – 0.71 |
| tweetSentBR | float64 | 0.01 – 0.7 |
| ToxSyn-PT | float64 | 0.44 – 0.84 |
| OAB, Revalida, MREX, ENAM, AFA, ITA, IME, POSCOMP, OBI, BCB, CFCES, ASSIN2 RTE, ASSIN2 STS, FAQUAD NLI, BLUEX, ENEM, CNPU, ENADE, BNDES, CACD (1ª fase), CACD (2ª fase) | float64 | 0 – 0 |
| Média Geral (Overall Average) | float64 | 0.42 – 0.72 |
| Datasets Área Médica, Datasets Área do Direito, Datasets Provas Militares, Datasets Computação, Datasets Discurso de Ódio, Datasets Economia e Contabilidade, Datasets Semântica e Inferência, Datasets Multidisciplinar (benchmark list per category) | stringclasses | 1 value each |
| energy_dataset | float64 | 0.5 – 0.5 |
| reasoning_dataset | float64 | 0.5 – 0.5 |
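Since the viewer auto-converts the dataset to Parquet, the rows can be pulled down directly with the Hugging Face `datasets` library. A minimal sketch; the repository id below is a placeholder, as the page dump does not name the repo:

```python
from datasets import load_dataset

# Placeholder repo id: the page dump does not include the actual dataset name.
ds = load_dataset("some-org/portuguese-llm-leaderboard", split="train")

print(ds.column_names)      # ['T', 'Modelo', 'Tipo', 'Arquitetura', ...]
row = ds[0]
print(row["Modelo"])        # 'Qwen/Qwen2.5-0.5B-Instruct'
print(row["Média Geral"])   # 0.424598
```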
Data. Every row shares T = SFT, Tipo = "SFT : Supervised Finetuning", Tipo de Peso = Original, Precisão = BF16, Disponível no hub = true, energy_dataset = 0.5, and reasoning_dataset = 0.5. Only the Discurso de Ódio category was evaluated; every other category and benchmark column is 0, and Média Geral therefore equals the Discurso de Ódio score in every row. The category-to-benchmark mapping is also constant across rows:

- Datasets Discurso de Ódio: HateBR, PT Hate Speech, tweetSentBR, ToxSyn-PT
- Datasets Área Médica: Revalida, MREX
- Datasets Área do Direito: OAB, ENAM
- Datasets Provas Militares: AFA, ITA, IME
- Datasets Computação: POSCOMP, OBI
- Datasets Economia e Contabilidade: BCB, CFCES
- Datasets Semântica e Inferência: FAQUAD NLI, ASSIN2 RTE, ASSIN2 STS
- Datasets Multidisciplinar: ENEM, BLUEX, CNPU, ENADE, BNDES, CACD (1ª fase), CACD (2ª fase)

Model metadata (one row per model):

| Modelo | Arquitetura | Licença | #Params (B) | Hub Likes | SHA do modelo |
|---|---|---|---|---|---|
| Qwen/Qwen2.5-0.5B-Instruct | Qwen2ForCausalLM | qwen-research | 0.494 | 334 | 7ae557604adf67be50417f59c2c2f167def9a775 |
| Qwen/Qwen2.5-1.5B-Instruct | Qwen2ForCausalLM | qwen-research | 1.544 | 463 | 989aa7980e4cf806f80c7fef2b1adb7bc71aa306 |
| Qwen/Qwen2.5-3B-Instruct | Qwen2ForCausalLM | qwen-research | 3.086 | 270 | aa8e72537993ba99e69dfaafa59ed015b17504d1 |
| meta-llama/Llama-3.2-1B-Instruct | LlamaForCausalLM | llama3.2 | 1.236 | 984 | 9213176726f574b556790deb65791e0c5aa438b6 |
| meta-llama/Llama-3.2-3B-Instruct | LlamaForCausalLM | llama3.2 | 3.213 | 1,570 | 0cb88a4f764b7a12671c53f0838cd831a0843b95 |

Scores (one row per evaluation; the Qwen2.5-3B and both Llama models appear a second time with a tweetSentBR score that differs between the two evaluations, and hence a different average):

| Modelo | HateBR | PT Hate Speech | tweetSentBR | ToxSyn-PT | Discurso de Ódio / Média Geral |
|---|---|---|---|---|---|
| Qwen/Qwen2.5-0.5B-Instruct | 0.501099 | 0.699168 | 0.057711 | 0.440415 | 0.424598 |
| Qwen/Qwen2.5-1.5B-Instruct | 0.673571 | 0.713278 | 0.669652 | 0.680684 | 0.684296 |
| Qwen/Qwen2.5-3B-Instruct | 0.802143 | 0.525264 | 0.699502 | 0.842934 | 0.717461 |
| meta-llama/Llama-3.2-1B-Instruct | 0.5 | 0.701528 | 0.236318 | 0.525346 | 0.490798 |
| meta-llama/Llama-3.2-3B-Instruct | 0.803571 | 0.700353 | 0.009453 | 0.712942 | 0.55658 |
| Qwen/Qwen2.5-3B-Instruct | 0.802143 | 0.525264 | 0.7 | 0.842934 | 0.717585 |
| meta-llama/Llama-3.2-1B-Instruct | 0.5 | 0.701528 | 0.240299 | 0.525346 | 0.491793 |
| meta-llama/Llama-3.2-3B-Instruct | 0.803571 | 0.700353 | 0.633831 | 0.712942 | 0.712674 |
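The scores are consistent with each category being the unweighted mean of its component benchmarks, and with Média Geral averaging only the categories that were actually evaluated (here, only Discurso de Ódio). A quick sanity check in Python, assuming that scoring rule:

```python
# Scores for Qwen/Qwen2.5-0.5B-Instruct, taken from the table above.
hate_speech_benchmarks = {
    "HateBR": 0.501099,
    "PT Hate Speech": 0.699168,
    "tweetSentBR": 0.057711,
    "ToxSyn-PT": 0.440415,
}

# Assumed rule: category score = unweighted mean of its benchmarks.
discurso_de_odio = sum(hate_speech_benchmarks.values()) / len(hate_speech_benchmarks)
print(round(discurso_de_odio, 6))  # 0.424598, matching the 'Discurso de Ódio' column

# All other categories are 0 (not evaluated), so 'Média Geral' equals the
# hate-speech category score in every row.
```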