code-sim-roberta-small

RoBERTa-small์„ ์ฝ”๋“œ ์œ ์‚ฌ๋„ ๋ถ„๋ฅ˜ ํƒœ์Šคํฌ๋กœ ํŒŒ์ธํŠœ๋‹ํ•œ ๊ฐ€์ค‘์น˜์ž…๋‹ˆ๋‹ค.

Task : https://dacon.io/competitions/official/235900/overview/description

Description : Develop an AI algorithm that determines whether two pieces of code are similar (i.e., whether they can produce the same output).

Pretrained model used : "hosung1/roberta_small_mlm_from_scratch"

Datasets used : provided by Dacon

How to use

```python
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tok = AutoTokenizer.from_pretrained("hosung1/code-sim-roberta-small")
mdl = AutoModelForSequenceClassification.from_pretrained("hosung1/code-sim-roberta-small")
```
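Once loaded, the model scores a pair of code snippets via sequence-pair classification. A minimal inference sketch follows; the snippet texts are made-up examples, and the assumption that label index 1 means "similar" should be verified against the model's `id2label` config:

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tok = AutoTokenizer.from_pretrained("hosung1/code-sim-roberta-small")
mdl = AutoModelForSequenceClassification.from_pretrained("hosung1/code-sim-roberta-small")
mdl.eval()

code1 = "def add(a, b):\n    return a + b"
code2 = "def plus(x, y):\n    return x + y"

# Encode the two snippets as a sentence pair, truncating to the model's max length
inputs = tok(code1, code2, return_tensors="pt", truncation=True)
with torch.no_grad():
    logits = mdl(**inputs).logits

# Assumption: index 1 corresponds to the "similar" class
prob_similar = torch.softmax(logits, dim=-1)[0, 1].item()
print(f"similarity probability: {prob_similar:.3f}")
```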