shatonix committed
Commit ae7d591 · verified · 1 Parent(s): 2f51f62

Add new SentenceTransformer model

1_Pooling/config.json ADDED
@@ -0,0 +1,10 @@
+ {
+   "word_embedding_dimension": 768,
+   "pooling_mode_cls_token": true,
+   "pooling_mode_mean_tokens": false,
+   "pooling_mode_max_tokens": false,
+   "pooling_mode_mean_sqrt_len_tokens": false,
+   "pooling_mode_weightedmean_tokens": false,
+   "pooling_mode_lasttoken": false,
+   "include_prompt": true
+ }
README.md ADDED
@@ -0,0 +1,1097 @@
1
+ ---
2
+ tags:
3
+ - sentence-transformers
4
+ - sentence-similarity
5
+ - feature-extraction
6
+ - dense
7
+ - generated_from_trainer
8
+ - dataset_size:4500
9
+ - loss:MatryoshkaLoss
10
+ - loss:MultipleNegativesRankingLoss
11
+ base_model: ibm-granite/granite-embedding-english-r2
12
+ widget:
13
+ - source_sentence: You are tasked with creating a module in Elm to render an article
14
+ document. The module should include functions to render the body, navigation,
15
+ main content, titles, headings, subheadings, paragraphs, code blocks, images,
16
+ and references. You need to define the necessary types and functions to achieve
17
+ this.
18
+ sentences:
19
+ - '4605'
20
+ - "Context: \nAnswer: ```elm\nmodule ArticleRenderer exposing (Rendered, renderDocument,\
21
+ \ body, navigation, mainContent, title, heading, subheading, paragraph, codeBlock,\
22
+ \ imageBlock, renderReference)\n\nimport Html exposing (Html, div, nav, main,\
23
+ \ h1, h2, h3, p, img, pre, figure, a, span)\nimport Html.Attributes exposing (style,\
24
+ \ src, href, target, rel, hidden, type_)\nimport Html.Styled.Attributes as Attributes\
25
+ \ exposing (css)\nimport Document exposing (Inline, FlatInline, Key, Text, Link,\
26
+ \ Reference, Code, Image)\n\n\ntype alias Rendered msg =\n Html msg\n\n\n--\
27
+ \ Navigation\nnavigation : Rendered msg\nnavigation =\n Html.nav\n [\
28
+ \ css [ Css.marginBottom (rem 1) ]\n ]\n [ navLink \"Go back to\
29
+ \ overview\" Route.Index\n ]\n\n\nnavLink : String -> Route -> Rendered\
30
+ \ msg\nnavLink text route =\n viewLink\n { url = Route.toPath route\n\
31
+ \ , text = [ Html.text text ]\n , styles = [ Css.fontStyle Css.italic\
32
+ \ ]\n }\n\n\n-- Document rendering\nrenderDocument : Article -> Rendered\
33
+ \ msg\nrenderDocument article =\n article.document\n |> List.map renderBlock\n\
34
+ \ |> document\n\n\nrenderBlock : Block -> Rendered msg\nrenderBlock block\
35
+ \ =\n case block of\n Title content ->\n title content\n\n\
36
+ \ Heading contents ->\n heading contents\n\n Subheading\
37
+ \ contents ->\n subheading contents\n\n Paragraph contents ->\n\
38
+ \ paragraph [] contents\n\n CodeBlock code ->\n codeBlock\
39
+ \ code\n\n ImageBlock image ->\n imageBlock image\n\n\n-- Landmarks\n\
40
+ body : List (Rendered msg) -> Rendered msg\nbody content =\n Html.div\n \
41
+ \ [ css\n [ Css.padding (rem 1)\n , Css.maxWidth (rem\
42
+ \ 48)\n , Css.margin Css.auto\n ]\n ]\n content\n\
43
+ \n\nmainContent : List (Rendered msg) -> Rendered msg\nmainContent contents =\n\
44
+ \ Html.main_ [] contents\n\n\n-- Text\ntitle : String -> Rendered msg\ntitle\
45
+ \ text =\n Html.h1\n [ css\n [ headingStyle\n \
46
+ \ , Css.fontSize (rem 1.5)\n ]\n ]\n [ Html.text text\
47
+ \ ]\n\n\nheading : List (Inline Path) -> Rendered msg\nheading contents =\n \
48
+ \ Html.h2\n [ css\n [ headingStyle\n , Css.fontSize\
49
+ \ (rem 1.25)\n ]\n ]\n (List.map renderInline contents)\n\
50
+ \n\nsubheading : List (Inline Path) -> Rendered msg\nsubheading contents =\n \
51
+ \ Html.h3\n [ css\n [ headingStyle\n , Css.fontSize\
52
+ \ (rem 1.1)\n ]\n ]\n (List.map renderInline contents)\n\
53
+ \n\nparagraph : List Css.Style -> List (Inline Path) -> Rendered msg\nparagraph\
54
+ \ styles content =\n Html.p\n [ css (paragraphStyle :: styles)\n \
55
+ \ ]\n (List.map renderInline content)\n\n\ncodeBlock : Document.Code\
56
+ \ -> Rendered msg\ncodeBlock code =\n Html.pre [ css [ codeBackgroundStyle,\
57
+ \ Css.padding (em 1) ] ]\n [ Html.code [ css [ codeFontStyle ] ] [ Html.text\
58
+ \ code.src ] ]\n\n\nimageBlock : Document.Image Path -> Rendered msg\nimageBlock\
59
+ \ image =\n Html.figure\n [ css\n [ Css.margin2 paragraphSpacing\
60
+ \ zero\n , framedStyle\n ]\n ]\n [ Html.a\n\
61
+ \ [ Attributes.href <| Path.toAbsolute image.fallbackSource.source.src\n\
62
+ \ , Attributes.target \"_blank\"\n , Attributes.rel \"noopener\"\
63
+ \n ]\n [ Html.img\n [ Attributes.src (Path.toAbsolute\
64
+ \ image.fallbackSource.source.src)\n , Attributes.alt image.alt\n\
65
+ \ , Attributes.width image.fallbackSource.source.width\n \
66
+ \ , Attributes.height image.fallbackSource.source.height\n \
67
+ \ , css\n [ Css.display Css.block\n \
68
+ \ , Css.maxWidth (pct 100)\n , Css.width (pct 100)\n \
69
+ \ , Css.height Css.auto\n ]\n \
70
+ \ ]\n ]\n , Html.figcaption\n [ css\n \
71
+ \ [ Css.boxSizing Css.borderBox\n , Css.width (pct 100)\n \
72
+ \ , Css.padding (rem 0.5)\n ]\n ]\n \
73
+ \ [ Html.text image.caption ]\n ]\n\n\n-- Text styles\nheadingStyle\
74
+ \ : Css.Style\nheadingStyle =\n Css.batch\n [ headingFontFamily\n \
75
+ \ , Css.fontWeight Css.bold\n , Css.margin zero\n , Css.lineHeight\
76
+ \ (num 1.2)\n ]\n\n\nparagraphStyle : Css.Style\nparagraphStyle =\n \
77
+ \ Css.batch\n [ paragraphFontStyle\n , Css.margin zero\n \
78
+ \ ]\n\n\ncodeFontStyle : Css.Style\ncodeFontStyle =\n Css.batch\n [\
79
+ \ Css.whiteSpace Css.preWrap\n , Css.fontFamilies [ \"Source Code Pro\"\
80
+ , \"monospace\" ]\n ]\n\n\ncodeBackgroundStyle : Css.Style\ncodeBackgroundStyle\
81
+ \ =\n Css.batch\n [ Css.borderRadius (em 0.2)\n , Css.backgroundColor\
82
+ \ (Css.hsla 0 0 0.5 0.15)\n , Css.padding2 (em 0.05) (em 0.2)\n \
83
+ \ , Css.fontSize (em 0.95)\n ]\n\n\nframedStyle : Css.Style\nframedStyle\
84
+ \ =\n let\n spacing =\n 0.5\n in\n Css.batch\n \
85
+ \ [ framedBorderStyle\n , Css.borderRadius (rem spacing)\n ]\n\
86
+ \n\nframedBorderStyle : Css.Style\nframedBorderStyle =\n Css.border3 (px 1)\
87
+ \ Css.solid borderColor\n\n\nparagraphSpacing : Css.Rem\nparagraphSpacing =\n\
88
+ \ rem 1\n\n\nborderColor : Css.Color\nborderColor =\n Css.hsla 0 0 0 0.25\n\
89
+ \n\nparagraphFontStyle : Css.Style\nparagraphFontStyle =\n Css.batch\n \
90
+ \ [ Css.lineHeight (num 1.35)\n ]\n\n\n-- Helper functions\nrenderInline\
91
+ \ : Inline Path -> Rendered msg\nrenderInline inline =\n case inline of\n \
92
+ \ TextInline text ->\n renderText [] text\n\n LinkInline\
93
+ \ link ->\n renderLink link\n\n ReferenceInline reference ->\n\
94
+ \ renderReference reference\n\n CodeInline code ->\n \
95
+ \ renderCode code\n\n KeysInline keys ->\n renderKeys keys\n\
96
+ \n\nrenderText : List Css.Style -> Text -> Rendered msg\nrenderText extraStyles\
97
+ \ text =\n let\n italic =\n if text.style.emphasized then\n\
98
+ \ [ Css.fontStyle Css.italic ]\n\n else\n \
99
+ \ []\n\n styles =\n italic ++ extraStyles\n in\n \
100
+ \ if text.style.emphasized then\n Html.em [ css styles ] [ Html.text text.content\
101
+ \ ]\n\n else if List.isEmpty styles then\n Html.text text.content\n\n\
102
+ \ else\n Html.span [ css styles ] [ Html.text text.content ]\n\n\nrenderLink\
103
+ \ : Link -> Rendered msg\nrenderLink link =\n viewLink\n { text = List.map\
104
+ \ (renderText []) link.text\n , url = Url.toString link.url\n ,\
105
+ \ styles = []\n }\n\n\nviewLink : { text : List (Rendered msg), url : String,\
106
+ \ styles : List Css.Style } -> Rendered msg\nviewLink { text, url, styles } =\n\
107
+ \ let\n unvisitedColor =\n Css.rgb 22 22 162\n\n visitedColor\
108
+ \ =\n Css.inherit\n in\n Html.a\n [ Attributes.href url\n\
109
+ \ , css\n ([ Css.color unvisitedColor\n , Css.visited\n\
110
+ \ [ Css.color visitedColor\n ]\n , hover\n\
111
+ \ [ Css.textDecorationStyle Css.dotted\n ]\n \
112
+ \ ]\n ++ styles\n )\n ]\n text\n\
113
+ \n\nrenderReference : Reference Path -> Rendered msg\nrenderReference reference\
114
+ \ =\n viewLink\n { text =\n List.map\n (renderText\n\
115
+ \ [ Css.fontWeight Css.bold\n , Css.fontSize\
116
+ \ (em 0.8)\n ]\n )\n reference.text\n\
117
+ \ , url = Path.toAbsolute reference.path\n , styles = []\n \
118
+ \ }\n\n\nrenderCode : Code -> Rendered msg\nrenderCode code =\n Html.code\n\
119
+ \ [ css [ codeFontStyle, codeBackgroundStyle ]\n ]\n [ Html.text\
120
+ \ code.src ]\n\n\nrenderKeys : Keys -> Rendered msg\nrenderKeys keys =\n case\
121
+ \ keys of\n ( first, [] ) ->\n renderKey first\n\n (\
122
+ \ first, rest ) ->\n Html.kbd [ css [ Css.whiteSpace Css.preWrap ]\
123
+ \ ]\n (List.map renderKey (first :: rest)\n \
124
+ \ |> List.intersperse (Html.text \"+\\u{200B}\")\n )\n\n\nrenderKey\
125
+ \ : Key -> Rendered msg\nrenderKey key =\n let\n keyBorderColor =\n\
126
+ \ Css.hsl 0 0 0.75\n\n keyText =\n case key of\n\
127
+ \ Letter l ->\n String.fromChar l\n\n \
128
+ \ Ctrl ->\n \"Ctrl\"\n\n Shift ->\n\
129
+ \ \"Shift\\u{00A0}⇧\"\n\n Enter ->\n \
130
+ \ \"Enter\\u{00A0}↵\"\n\n Tab ->\n \
131
+ \ \"Tab\\u{00A0}↹\"\n\n Up ->\n \"↑\\u{00A0}up\"\
132
+ \n\n Down ->\n \"↓\\u{00A0}down\"\n in\n\
133
+ \ Html.kbd\n [ css\n [ codeFontStyle\n , Css.fontSize\
134
+ \ (em 0.8)\n , Css.padding2 (em 0) (em 0.1)\n , Css.border3\
135
+ \ (px 1) Css.solid keyBorderColor\n , Css.borderRadius (em 0.2)\n \
136
+ \ , Css.boxShadow5 Css.inset zero (px -1) zero keyBorderColor\n \
137
+ \ , Css.verticalAlign Css.center\n , Css.whiteSpace Css.pre\n\
138
+ \ ]\n ]\n [ Html.text keyText ]\n```"
139
+ - "Context: \nAnswer: 40"
140
+ - source_sentence: 'You are tasked with creating a graded object in a category `C`
141
+ using a type `β`. A graded object over `C` is essentially a function `β → C` into
142
+ the objects of `C`. Your goal is to define the `graded_object` in a category,
143
+ where you will project the `i`-th component of the graded object and create a
144
+ shift functor on the graded objects. Additionally, you need to construct the total
145
+ functor `graded_object β C ⥤ C` and prove that it is faithful.
146
+
147
+
148
+ Write a program that accomplishes the following tasks:
149
+
150
+ 1. Define the graded object in a category.
151
+
152
+ 2. Create a shift functor on the graded objects.
153
+
154
+ 3. Construct the total functor and prove that it is faithful.'
155
+ sentences:
156
+ - "Context: \nAnswer: 33"
157
+ - '1514'
158
+ - "Context: \nAnswer: ```haskell\nimport algebra.group.basic\nimport category_theory.pi.basic\n\
159
+ import category_theory.shift\n\nnamespace category_theory\n\nuniverses w v u\n\
160
+ \ndef graded_object (β : Type w) (C : Type u) : Type (max w u) := β → C\n\ninstance\
161
+ \ inhabited_graded_object (β : Type w) (C : Type u) [inhabited C] :\n inhabited\
162
+ \ (graded_object β C) :=\n⟨λ b, inhabited.default C⟩\n\ninstance category_of_graded_objects\
163
+ \ (β : Type w) : category.{(max w v)} (graded_object β C) :=\ncategory_theory.pi\
164
+ \ (λ _, C)\n\ndef eval {β : Type w} (b : β) : graded_object β C ⥤ C :=\n{ obj\
165
+ \ := λ X, X b,\n map := λ X Y f, f b }\n\nsection\nvariable (C)\n\ndef comap_eq\
166
+ \ {β γ : Type w} {f g : β → γ} (h : f = g) : comap (λ _, C) f ≅ comap (λ _, C)\
167
+ \ g :=\n{ hom := { app := λ X b, eq_to_hom begin dsimp [comap], subst h, end },\n\
168
+ \ inv := { app := λ X b, eq_to_hom begin dsimp [comap], subst h, end }, }\n\n\
169
+ lemma comap_eq_symm {β γ : Type w} {f g : β → γ} (h : f = g) :\n comap_eq C h.symm\
170
+ \ = (comap_eq C h).symm :=\nby tidy\n\nlemma comap_eq_trans {β γ : Type w} {f\
171
+ \ g h : β → γ} (k : f = g) (l : g = h) :\n comap_eq C (k.trans l) = comap_eq\
172
+ \ C k ≪≫ comap_eq C l :=\nbegin\n ext X b,\n simp,\nend\n\ndef comap_equiv {β\
173
+ \ γ : Type w} (e : β ≃ γ) :\n (graded_object β C) ≌ (graded_object γ C) :=\n\
174
+ { functor := comap (λ _, C) (e.symm : γ → β),\n inverse := comap (λ _, C) (e\
175
+ \ : β → γ),\n counit_iso := (comap_comp (λ _, C) _ _).trans (comap_eq C (by {\
176
+ \ ext, simp } )),\n unit_iso := (comap_eq C (by { ext, simp } )).trans (comap_comp\
177
+ \ _ _ _).symm,\n functor_unit_iso_comp' := λ X, by { ext b, dsimp, simp, } }\n\
178
+ \nend\n\ninstance has_shift {β : Type*} [add_comm_group β] (s : β) :\n has_shift\
179
+ \ (graded_object_with_shift s C) :=\n{ shift := comap_equiv C\n { to_fun := λ\
180
+ \ b, b-s,\n inv_fun := λ b, b+s,\n left_inv := λ x, (by simp),\n right_inv\
181
+ \ := λ x, (by simp), } }\n\ninstance has_zero_morphisms [has_zero_morphisms C]\
182
+ \ (β : Type w) :\n has_zero_morphisms.{(max w v)} (graded_object β C) :=\n{ has_zero\
183
+ \ := λ X Y,\n { zero := λ b, 0 } }\n\nlemma zero_apply [has_zero_morphisms C]\
184
+ \ (β : Type w) (X Y : graded_object β C) (b : β) :\n (0 : X ⟶ Y) b = 0 := rfl\n\
185
+ \nopen_locale zero_object\n\ninstance has_zero_object [has_zero_object C] [has_zero_morphisms\
186
+ \ C] (β : Type w) :\n has_zero_object.{(max w v)} (graded_object β C) :=\n{ zero\
187
+ \ := λ b, (0 : C),\n unique_to := λ X, ⟨⟨λ b, 0⟩, λ f, (by ext)⟩,\n unique_from\
188
+ \ := λ X, ⟨⟨λ b, 0⟩, λ f, (by ext)⟩, } }\n\nvariables (β : Type)\nvariables (C\
189
+ \ : Type u) [category.{v} C]\nvariables [has_coproducts C]\n\ndef total : graded_object\
190
+ \ β C ⥤ C :=\n{ obj := λ X, ∐ (λ i : ulift.{v} β, X i.down),\n map := λ X Y f,\
191
+ \ limits.sigma.map (λ i, f i.down) }\n\ninstance : faithful (total β C) :=\n{\
192
+ \ map_injective' := λ X Y f g w,\n begin\n classical,\n ext i,\n replace\
193
+ \ w := sigma.ι (λ i : ulift.{v} β, X i.down) ⟨i⟩ ≫= w,\n erw [colimit.ι_map,\
194
+ \ colimit.ι_map] at w,\n exact mono.right_cancellation _ _ w,\n end }\n\n\
195
+ end graded_object\n```"
196
+ - source_sentence: 'You are required to implement a CUDA kernel function for deterministic
197
+ devoxelization. The function takes input features of size (n, c), indices of size
198
+ (N, 8), weight of size (N, 8), and is expected to produce output features of size
199
+ (N, c). The kernel function should be able to perform the following steps:
200
+
201
+ 1. Calculate the index based on the thread and block dimensions.
202
+
203
+ 2. Iterate through each value of i (less than N) and j (less than c).
204
+
205
+ 3. Access the corresponding indices, weight, and feat based on the current iteration.
206
+
207
+ 4. Perform a calculation involving the indices, weight, and feat arrays to populate
208
+ the output array.'
209
+ sentences:
210
+ - "Context: \nAnswer: -15"
211
+ - "Context: \nAnswer: ```c\n#include <stdio.h>\n#include <stdlib.h>\n#include <thrust/device_vector.h>\n\
212
+ #include \"../common/gpu.cuh\"\n\n__global__ void deterministic_devoxelize_kernel(int\
213
+ \ N, int c, const int *__restrict__ indices, const float *__restrict__ weight,\
214
+ \ const float *__restrict__ feat, float *__restrict__ out){\n int index = blockIdx.x\
215
+ \ * blockDim.x + threadIdx.x;\n int i = index / c;\n int j = index % c;\n\
216
+ \ \n if(i < N){\n const int* indices_ = indices + 8 * i;\n \
217
+ \ const float *weight_ = weight + 8 * i;\n const float *feat_ = feat +\
218
+ \ j;\n \n float cur_feat;\n for(int k = 0; k < 8; k++){\n\
219
+ \ cur_feat = (indices_[k] >= 0) ? feat_[indices_[k] * c] : 0; \n \
220
+ \ out[i * c + j] += weight_[k] * cur_feat;\n }\n }\n}\n\n\
221
+ void deterministic_devoxelize_wrapper(int N, int c, const int * indices, const\
222
+ \ float * weight, const float * feat, float * out){\n deterministic_devoxelize_kernel<<<N,\
223
+ \ c>>>(N, c, indices, weight, feat, out);\n}\n```\nPlease note that the solution\
224
+ \ provided is a CUDA kernel function for deterministic devoxelization, along with\
225
+ \ a wrapper function for easier usage."
226
+ - '4073'
227
+ - source_sentence: 'You are tasked with implementing a grayscale morphological operation
228
+ with flat linear structuring elements in C++ using the CUDA framework. Your function
229
+ should take in an input volume, step vectors for line segments, the number of
230
+ steps for line segments, an operation to perform, and the block size for GPU processing.
231
+ The input volume is a numeric array, the lineSteps parameter is an int32 matrix
232
+ representing step vectors for line segments, and the lineLens parameter is an
233
+ int32 vector representing the number of steps for line segments.
234
+
235
+
236
+ Your function should validate the inputs and ensure that the number of steps vectors
237
+ and step lengths are equal. It should then allocate an output volume and extract
238
+ line segments from the input parameters. Finally, your function should perform
239
+ the specified morphological operation using flat linear structuring elements and
240
+ return the output volume with the result of the operation.'
241
+ sentences:
242
+ - '3896'
243
+ - "Context: \nAnswer: SMTP is a push protocol for sending emails, while IMAP (Internet\
244
+ \ Message Access Protocol) is a pull protocol for retrieving emails from a server.\
245
+ \ IMAP allows users to manage emails on the server (e.g., filtering, searching),\
246
+ \ whereas SMTP only transmits messages. IMAP is preferred for client access because\
247
+ \ it enables asynchronous retrieval, supports offline use, and maintains email\
248
+ \ state across devices, unlike SMTP's stateless nature."
249
+ - "Context: \nAnswer: ```cpp\n#include <stdexcept>\n#include <vector>\n\n#include\
250
+ \ \"mex.h\"\n\n#include \"view.cuh\"\n#include \"morph.cuh\"\n#include \"strel.cuh\"\
251
+ \n#include \"flat_linear_morph.cuh\"\n\n#include \"mex_common.cuh\"\n\ntemplate\
252
+ \ <class Ty>\nvoid flatLinearMorphOp(mxArray *mxRes, const mxArray *mxVol, const\
253
+ \ std::vector<gpho::LineSeg>& lines,\n int op, int3 blockSize);\n\nvoid mexFunction(int\
254
+ \ nlhs, mxArray *plhs[], int nrhs, const mxArray *prhs[])\n{\n // Validate\
255
+ \ inputs (to be implemented)\n\n // Allocate output (to be implemented)\n\n\
256
+ \ // Extract line segments (to be implemented)\n\n // Run function (to be\
257
+ \ implemented)\n}\n\ntemplate <class Ty>\nvoid flatLinearMorphOp(mxArray *mxRes,\
258
+ \ const mxArray *mxVol, const std::vector<gpho::LineSeg>& lines,\n int op,\
259
+ \ int3 blockSize)\n{\n gpho::HostView<Ty> res(static_cast<Ty *>(mxGetData(mxRes)),\
260
+ \ volSize(mxRes));\n gpho::HostView<const Ty> vol(static_cast<const Ty *>(mxGetData(mxVol)),\
261
+ \ volSize(mxVol));\n\n switch (op) {\n case MOP_DILATE:\n gpho::flatLinearDilateErode<gpho::MORPH_DILATE>(res,\
262
+ \ vol, lines, blockSize);\n break;\n case MOP_ERODE:\n gpho::flatLinearDilateErode<gpho::MORPH_ERODE>(res,\
263
+ \ vol, lines, blockSize);\n break;\n default:\n abortWithMsg(\"\
264
+ invalid morphology op\");\n }\n}\n```"
265
+ - source_sentence: Calculate $(-1)^{47} + 2^{(3^3+4^2-6^2)}$.
266
+ sentences:
267
+ - "Context: \nAnswer: 2"
268
+ - "Context: \nAnswer: 127"
269
+ - '4750'
270
+ pipeline_tag: sentence-similarity
271
+ library_name: sentence-transformers
272
+ metrics:
273
+ - cosine_accuracy@1
274
+ - cosine_accuracy@3
275
+ - cosine_accuracy@5
276
+ - cosine_accuracy@10
277
+ - cosine_precision@1
278
+ - cosine_precision@3
279
+ - cosine_precision@5
280
+ - cosine_precision@10
281
+ - cosine_recall@1
282
+ - cosine_recall@3
283
+ - cosine_recall@5
284
+ - cosine_recall@10
285
+ - cosine_ndcg@10
286
+ - cosine_mrr@10
287
+ - cosine_map@100
288
+ model-index:
289
+ - name: SentenceTransformer based on ibm-granite/granite-embedding-english-r2
290
+ results:
291
+ - task:
292
+ type: information-retrieval
293
+ name: Information Retrieval
294
+ dataset:
295
+ name: dim 768
296
+ type: dim_768
297
+ metrics:
298
+ - type: cosine_accuracy@1
299
+ value: 0.626
300
+ name: Cosine Accuracy@1
301
+ - type: cosine_accuracy@3
302
+ value: 0.706
303
+ name: Cosine Accuracy@3
304
+ - type: cosine_accuracy@5
305
+ value: 0.726
306
+ name: Cosine Accuracy@5
307
+ - type: cosine_accuracy@10
308
+ value: 0.758
309
+ name: Cosine Accuracy@10
310
+ - type: cosine_precision@1
311
+ value: 0.626
312
+ name: Cosine Precision@1
313
+ - type: cosine_precision@3
314
+ value: 0.23533333333333334
315
+ name: Cosine Precision@3
316
+ - type: cosine_precision@5
317
+ value: 0.1452
318
+ name: Cosine Precision@5
319
+ - type: cosine_precision@10
320
+ value: 0.07579999999999998
321
+ name: Cosine Precision@10
322
+ - type: cosine_recall@1
323
+ value: 0.626
324
+ name: Cosine Recall@1
325
+ - type: cosine_recall@3
326
+ value: 0.706
327
+ name: Cosine Recall@3
328
+ - type: cosine_recall@5
329
+ value: 0.726
330
+ name: Cosine Recall@5
331
+ - type: cosine_recall@10
332
+ value: 0.758
333
+ name: Cosine Recall@10
334
+ - type: cosine_ndcg@10
335
+ value: 0.6916401386587206
336
+ name: Cosine Ndcg@10
337
+ - type: cosine_mrr@10
338
+ value: 0.6704460317460319
339
+ name: Cosine Mrr@10
340
+ - type: cosine_map@100
341
+ value: 0.6750546261347222
342
+ name: Cosine Map@100
343
+ - task:
344
+ type: information-retrieval
345
+ name: Information Retrieval
346
+ dataset:
347
+ name: dim 512
348
+ type: dim_512
349
+ metrics:
350
+ - type: cosine_accuracy@1
351
+ value: 0.636
352
+ name: Cosine Accuracy@1
353
+ - type: cosine_accuracy@3
354
+ value: 0.7
355
+ name: Cosine Accuracy@3
356
+ - type: cosine_accuracy@5
357
+ value: 0.724
358
+ name: Cosine Accuracy@5
359
+ - type: cosine_accuracy@10
360
+ value: 0.758
361
+ name: Cosine Accuracy@10
362
+ - type: cosine_precision@1
363
+ value: 0.636
364
+ name: Cosine Precision@1
365
+ - type: cosine_precision@3
366
+ value: 0.23333333333333334
367
+ name: Cosine Precision@3
368
+ - type: cosine_precision@5
369
+ value: 0.1448
370
+ name: Cosine Precision@5
371
+ - type: cosine_precision@10
372
+ value: 0.07579999999999998
373
+ name: Cosine Precision@10
374
+ - type: cosine_recall@1
375
+ value: 0.636
376
+ name: Cosine Recall@1
377
+ - type: cosine_recall@3
378
+ value: 0.7
379
+ name: Cosine Recall@3
380
+ - type: cosine_recall@5
381
+ value: 0.724
382
+ name: Cosine Recall@5
383
+ - type: cosine_recall@10
384
+ value: 0.758
385
+ name: Cosine Recall@10
386
+ - type: cosine_ndcg@10
387
+ value: 0.694038077949631
388
+ name: Cosine Ndcg@10
389
+ - type: cosine_mrr@10
390
+ value: 0.6738849206349207
391
+ name: Cosine Mrr@10
392
+ - type: cosine_map@100
393
+ value: 0.6784599411787897
394
+ name: Cosine Map@100
395
+ - task:
396
+ type: information-retrieval
397
+ name: Information Retrieval
398
+ dataset:
399
+ name: dim 256
400
+ type: dim_256
401
+ metrics:
402
+ - type: cosine_accuracy@1
403
+ value: 0.638
404
+ name: Cosine Accuracy@1
405
+ - type: cosine_accuracy@3
406
+ value: 0.698
407
+ name: Cosine Accuracy@3
408
+ - type: cosine_accuracy@5
409
+ value: 0.712
410
+ name: Cosine Accuracy@5
411
+ - type: cosine_accuracy@10
412
+ value: 0.75
413
+ name: Cosine Accuracy@10
414
+ - type: cosine_precision@1
415
+ value: 0.638
416
+ name: Cosine Precision@1
417
+ - type: cosine_precision@3
418
+ value: 0.2326666666666667
419
+ name: Cosine Precision@3
420
+ - type: cosine_precision@5
421
+ value: 0.14239999999999997
422
+ name: Cosine Precision@5
423
+ - type: cosine_precision@10
424
+ value: 0.075
425
+ name: Cosine Precision@10
426
+ - type: cosine_recall@1
427
+ value: 0.638
428
+ name: Cosine Recall@1
429
+ - type: cosine_recall@3
430
+ value: 0.698
431
+ name: Cosine Recall@3
432
+ - type: cosine_recall@5
433
+ value: 0.712
434
+ name: Cosine Recall@5
435
+ - type: cosine_recall@10
436
+ value: 0.75
437
+ name: Cosine Recall@10
438
+ - type: cosine_ndcg@10
439
+ value: 0.6915490012428582
440
+ name: Cosine Ndcg@10
441
+ - type: cosine_mrr@10
442
+ value: 0.6731365079365079
443
+ name: Cosine Mrr@10
444
+ - type: cosine_map@100
445
+ value: 0.6780550682950801
446
+ name: Cosine Map@100
447
+ - task:
448
+ type: information-retrieval
449
+ name: Information Retrieval
450
+ dataset:
451
+ name: dim 128
452
+ type: dim_128
453
+ metrics:
454
+ - type: cosine_accuracy@1
455
+ value: 0.636
456
+ name: Cosine Accuracy@1
457
+ - type: cosine_accuracy@3
458
+ value: 0.698
459
+ name: Cosine Accuracy@3
460
+ - type: cosine_accuracy@5
461
+ value: 0.716
462
+ name: Cosine Accuracy@5
463
+ - type: cosine_accuracy@10
464
+ value: 0.74
465
+ name: Cosine Accuracy@10
466
+ - type: cosine_precision@1
467
+ value: 0.636
468
+ name: Cosine Precision@1
469
+ - type: cosine_precision@3
470
+ value: 0.23266666666666666
471
+ name: Cosine Precision@3
472
+ - type: cosine_precision@5
473
+ value: 0.1432
474
+ name: Cosine Precision@5
475
+ - type: cosine_precision@10
476
+ value: 0.074
477
+ name: Cosine Precision@10
478
+ - type: cosine_recall@1
479
+ value: 0.636
480
+ name: Cosine Recall@1
481
+ - type: cosine_recall@3
482
+ value: 0.698
483
+ name: Cosine Recall@3
484
+ - type: cosine_recall@5
485
+ value: 0.716
486
+ name: Cosine Recall@5
487
+ - type: cosine_recall@10
488
+ value: 0.74
489
+ name: Cosine Recall@10
490
+ - type: cosine_ndcg@10
491
+ value: 0.6863050743630661
492
+ name: Cosine Ndcg@10
493
+ - type: cosine_mrr@10
494
+ value: 0.6692539682539683
495
+ name: Cosine Mrr@10
496
+ - type: cosine_map@100
497
+ value: 0.6739281987227784
498
+ name: Cosine Map@100
499
+ - task:
500
+ type: information-retrieval
501
+ name: Information Retrieval
502
+ dataset:
503
+ name: dim 64
504
+ type: dim_64
505
+ metrics:
506
+ - type: cosine_accuracy@1
507
+ value: 0.628
508
+ name: Cosine Accuracy@1
509
+ - type: cosine_accuracy@3
510
+ value: 0.692
511
+ name: Cosine Accuracy@3
512
+ - type: cosine_accuracy@5
513
+ value: 0.714
514
+ name: Cosine Accuracy@5
515
+ - type: cosine_accuracy@10
516
+ value: 0.734
517
+ name: Cosine Accuracy@10
518
+ - type: cosine_precision@1
519
+ value: 0.628
520
+ name: Cosine Precision@1
521
+ - type: cosine_precision@3
522
+ value: 0.23066666666666666
523
+ name: Cosine Precision@3
524
+ - type: cosine_precision@5
525
+ value: 0.14279999999999998
526
+ name: Cosine Precision@5
527
+ - type: cosine_precision@10
528
+ value: 0.0734
529
+ name: Cosine Precision@10
530
+ - type: cosine_recall@1
531
+ value: 0.628
532
+ name: Cosine Recall@1
533
+ - type: cosine_recall@3
534
+ value: 0.692
535
+ name: Cosine Recall@3
536
+ - type: cosine_recall@5
537
+ value: 0.714
538
+ name: Cosine Recall@5
539
+ - type: cosine_recall@10
540
+ value: 0.734
541
+ name: Cosine Recall@10
542
+ - type: cosine_ndcg@10
543
+ value: 0.680595460049486
544
+ name: Cosine Ndcg@10
545
+ - type: cosine_mrr@10
546
+ value: 0.6635103174603174
547
+ name: Cosine Mrr@10
548
+ - type: cosine_map@100
549
+ value: 0.6681288217342812
550
+ name: Cosine Map@100
551
+ ---
552
+
553
+ # SentenceTransformer based on ibm-granite/granite-embedding-english-r2
554
+
555
+ This is a [sentence-transformers](https://www.SBERT.net) model finetuned from [ibm-granite/granite-embedding-english-r2](https://huggingface.co/ibm-granite/granite-embedding-english-r2). It maps sentences & paragraphs to a 768-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.
556
+
557
+ ## Model Details
558
+
559
+ ### Model Description
560
+ - **Model Type:** Sentence Transformer
561
+ - **Base model:** [ibm-granite/granite-embedding-english-r2](https://huggingface.co/ibm-granite/granite-embedding-english-r2) <!-- at revision 47ea694b257b703fee9253d75c2b1f2985180498 -->
562
+ - **Maximum Sequence Length:** 512 tokens
563
+ - **Output Dimensionality:** 768 dimensions
564
+ - **Similarity Function:** Cosine Similarity
565
+ <!-- - **Training Dataset:** Unknown -->
566
+ <!-- - **Language:** Unknown -->
567
+ <!-- - **License:** Unknown -->
568
+
569
+ ### Model Sources
570
+
571
+ - **Documentation:** [Sentence Transformers Documentation](https://sbert.net)
572
+ - **Repository:** [Sentence Transformers on GitHub](https://github.com/huggingface/sentence-transformers)
573
+ - **Hugging Face:** [Sentence Transformers on Hugging Face](https://huggingface.co/models?library=sentence-transformers)
574
+
575
+ ### Full Model Architecture
576
+
577
+ ```
578
+ SentenceTransformer(
579
+ (0): Transformer({'max_seq_length': 512, 'do_lower_case': False, 'architecture': 'ModernBertModel'})
580
+ (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
581
+ )
582
+ ```
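
The same two-module stack can be assembled by hand from the `sentence_transformers.models` building blocks. A minimal sketch (not taken from this repository) that reproduces the CLS-pooling configuration shown above:

```python
from sentence_transformers import SentenceTransformer, models

# Sketch: rebuild the Transformer + Pooling stack listed above by hand.
# CLS pooling over 768-dimensional ModernBERT token embeddings, max 512 tokens.
transformer = models.Transformer(
    "ibm-granite/granite-embedding-english-r2", max_seq_length=512
)
pooling = models.Pooling(
    transformer.get_word_embedding_dimension(), pooling_mode="cls"
)
model = SentenceTransformer(modules=[transformer, pooling])
```
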
583
+
584
+ ## Usage
585
+
586
+ ### Direct Usage (Sentence Transformers)
587
+
588
+ First install the Sentence Transformers library:
589
+
590
+ ```bash
591
+ pip install -U sentence-transformers
592
+ ```
593
+
594
+ Then you can load this model and run inference.
595
+ ```python
596
+ from sentence_transformers import SentenceTransformer
597
+
598
+ # Download from the 🤗 Hub
599
+ model = SentenceTransformer("shatonix/granite-embedding-math-cs")
600
+ # Run inference
601
+ sentences = [
602
+ 'Calculate $(-1)^{47} + 2^{(3^3+4^2-6^2)}$.',
603
+ 'Context: \nAnswer: 127',
604
+ '4750',
605
+ ]
606
+ embeddings = model.encode(sentences)
607
+ print(embeddings.shape)
608
+ # [3, 768]
609
+
610
+ # Get the similarity scores for the embeddings
611
+ similarities = model.similarity(embeddings, embeddings)
612
+ print(similarities)
613
+ # tensor([[ 1.0000, 0.5650, -0.0154],
614
+ # [ 0.5650, 1.0000, -0.0246],
615
+ # [-0.0154, -0.0246, 1.0000]])
616
+ ```
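
Because the model was trained with MatryoshkaLoss at dimensions 768/512/256/128/64 (see the evaluation tables below), the embeddings can also be truncated at load time. A minimal sketch using the standard `truncate_dim` constructor argument; the example sentences are arbitrary:

```python
from sentence_transformers import SentenceTransformer

# Sketch: load the model with embeddings truncated to 256 of the 768 dimensions.
model = SentenceTransformer("shatonix/granite-embedding-math-cs", truncate_dim=256)

sentences = [
    "Calculate $(-1)^{47} + 2^{(3^3+4^2-6^2)}$.",
    "Context: \nAnswer: 127",
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# (2, 256)
```
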
617
+
618
+ <!--
619
+ ### Direct Usage (Transformers)
620
+
621
+ <details><summary>Click to see the direct usage in Transformers</summary>
622
+
623
+ </details>
624
+ -->
625
+
626
+ <!--
627
+ ### Downstream Usage (Sentence Transformers)
628
+
629
+ You can finetune this model on your own dataset.
630
+
631
+ <details><summary>Click to expand</summary>
632
+
633
+ </details>
634
+ -->
635
+
636
+ <!--
637
+ ### Out-of-Scope Use
638
+
639
+ *List how the model may foreseeably be misused and address what users ought not to do with the model.*
640
+ -->
641
+
642
+ ## Evaluation
643
+
644
+ ### Metrics
645
+
646
+ #### Information Retrieval
647
+
648
+ * Dataset: `dim_768`
649
+ * Evaluated with [<code>InformationRetrievalEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.InformationRetrievalEvaluator) with these parameters:
650
+ ```json
651
+ {
652
+ "truncate_dim": 768
653
+ }
654
+ ```
655
+
656
+ | Metric | Value |
657
+ |:--------------------|:-----------|
658
+ | cosine_accuracy@1 | 0.626 |
659
+ | cosine_accuracy@3 | 0.706 |
660
+ | cosine_accuracy@5 | 0.726 |
661
+ | cosine_accuracy@10 | 0.758 |
662
+ | cosine_precision@1 | 0.626 |
663
+ | cosine_precision@3 | 0.2353 |
664
+ | cosine_precision@5 | 0.1452 |
665
+ | cosine_precision@10 | 0.0758 |
666
+ | cosine_recall@1 | 0.626 |
667
+ | cosine_recall@3 | 0.706 |
668
+ | cosine_recall@5 | 0.726 |
669
+ | cosine_recall@10 | 0.758 |
670
+ | **cosine_ndcg@10** | **0.6916** |
671
+ | cosine_mrr@10 | 0.6704 |
672
+ | cosine_map@100 | 0.6751 |
673
+
674
+ #### Information Retrieval
675
+
676
+ * Dataset: `dim_512`
677
+ * Evaluated with [<code>InformationRetrievalEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.InformationRetrievalEvaluator) with these parameters:
678
+ ```json
679
+ {
680
+ "truncate_dim": 512
681
+ }
682
+ ```
683
+
684
+ | Metric | Value |
685
+ |:--------------------|:----------|
686
+ | cosine_accuracy@1 | 0.636 |
687
+ | cosine_accuracy@3 | 0.7 |
688
+ | cosine_accuracy@5 | 0.724 |
689
+ | cosine_accuracy@10 | 0.758 |
690
+ | cosine_precision@1 | 0.636 |
691
+ | cosine_precision@3 | 0.2333 |
692
+ | cosine_precision@5 | 0.1448 |
693
+ | cosine_precision@10 | 0.0758 |
694
+ | cosine_recall@1 | 0.636 |
695
+ | cosine_recall@3 | 0.7 |
696
+ | cosine_recall@5 | 0.724 |
697
+ | cosine_recall@10 | 0.758 |
698
+ | **cosine_ndcg@10** | **0.694** |
699
+ | cosine_mrr@10 | 0.6739 |
700
+ | cosine_map@100 | 0.6785 |
701
+
702
+ #### Information Retrieval
703
+
704
+ * Dataset: `dim_256`
705
+ * Evaluated with [<code>InformationRetrievalEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.InformationRetrievalEvaluator) with these parameters:
706
+ ```json
707
+ {
708
+ "truncate_dim": 256
709
+ }
710
+ ```
711
+
712
+ | Metric | Value |
713
+ |:--------------------|:-----------|
714
+ | cosine_accuracy@1 | 0.638 |
715
+ | cosine_accuracy@3 | 0.698 |
716
+ | cosine_accuracy@5 | 0.712 |
717
+ | cosine_accuracy@10 | 0.75 |
718
+ | cosine_precision@1 | 0.638 |
719
+ | cosine_precision@3 | 0.2327 |
720
+ | cosine_precision@5 | 0.1424 |
721
+ | cosine_precision@10 | 0.075 |
722
+ | cosine_recall@1 | 0.638 |
723
+ | cosine_recall@3 | 0.698 |
724
+ | cosine_recall@5 | 0.712 |
725
+ | cosine_recall@10 | 0.75 |
726
+ | **cosine_ndcg@10** | **0.6915** |
727
+ | cosine_mrr@10 | 0.6731 |
728
+ | cosine_map@100 | 0.6781 |
729
+
730
+ #### Information Retrieval
731
+
732
+ * Dataset: `dim_128`
733
+ * Evaluated with [<code>InformationRetrievalEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.InformationRetrievalEvaluator) with these parameters:
734
+ ```json
735
+ {
736
+ "truncate_dim": 128
737
+ }
738
+ ```
739
+
740
+ | Metric | Value |
741
+ |:--------------------|:-----------|
742
+ | cosine_accuracy@1 | 0.636 |
743
+ | cosine_accuracy@3 | 0.698 |
744
+ | cosine_accuracy@5 | 0.716 |
745
+ | cosine_accuracy@10 | 0.74 |
746
+ | cosine_precision@1 | 0.636 |
747
+ | cosine_precision@3 | 0.2327 |
748
+ | cosine_precision@5 | 0.1432 |
749
+ | cosine_precision@10 | 0.074 |
750
+ | cosine_recall@1 | 0.636 |
751
+ | cosine_recall@3 | 0.698 |
752
+ | cosine_recall@5 | 0.716 |
753
+ | cosine_recall@10 | 0.74 |
754
+ | **cosine_ndcg@10** | **0.6863** |
755
+ | cosine_mrr@10 | 0.6693 |
756
+ | cosine_map@100 | 0.6739 |
757
+
758
+ #### Information Retrieval
759
+
760
+ * Dataset: `dim_64`
761
+ * Evaluated with [<code>InformationRetrievalEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.InformationRetrievalEvaluator) with these parameters:
762
+ ```json
763
+ {
764
+ "truncate_dim": 64
765
+ }
766
+ ```
767
+
768
+ | Metric | Value |
769
+ |:--------------------|:-----------|
770
+ | cosine_accuracy@1 | 0.628 |
771
+ | cosine_accuracy@3 | 0.692 |
772
+ | cosine_accuracy@5 | 0.714 |
773
+ | cosine_accuracy@10 | 0.734 |
774
+ | cosine_precision@1 | 0.628 |
775
+ | cosine_precision@3 | 0.2307 |
776
+ | cosine_precision@5 | 0.1428 |
777
+ | cosine_precision@10 | 0.0734 |
778
+ | cosine_recall@1 | 0.628 |
779
+ | cosine_recall@3 | 0.692 |
780
+ | cosine_recall@5 | 0.714 |
781
+ | cosine_recall@10 | 0.734 |
782
+ | **cosine_ndcg@10** | **0.6806** |
783
+ | cosine_mrr@10 | 0.6635 |
784
+ | cosine_map@100 | 0.6681 |
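
Tables like the ones above can be produced with `InformationRetrievalEvaluator`, run once per truncation dimension. A minimal sketch with toy query/corpus dictionaries; the actual evaluation data is not included in this commit:

```python
from sentence_transformers import SentenceTransformer
from sentence_transformers.evaluation import InformationRetrievalEvaluator

# Toy data for illustration only; replace with the real evaluation split.
queries = {"q1": "Calculate $(-1)^{47} + 2^{(3^3+4^2-6^2)}$."}
corpus = {"d1": "Context: \nAnswer: 127", "d2": "Context: \nAnswer: 40"}
relevant_docs = {"q1": {"d1"}}

model = SentenceTransformer("shatonix/granite-embedding-math-cs")
for dim in (768, 512, 256, 128, 64):
    evaluator = InformationRetrievalEvaluator(
        queries, corpus, relevant_docs, name=f"dim_{dim}", truncate_dim=dim
    )
    print(evaluator(model))  # dict of accuracy/precision/recall/NDCG/MRR/MAP values
```
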
785
+
786
+ <!--
787
+ ## Bias, Risks and Limitations
788
+
789
+ *What are the known or foreseeable issues stemming from this model? You could also flag here known failure cases or weaknesses of the model.*
790
+ -->
791
+
792
+ <!--
793
+ ### Recommendations
794
+
795
+ *What are recommendations with respect to the foreseeable issues? For example, filtering explicit content.*
796
+ -->
797
+
798
+ ## Training Details
799
+
800
+ ### Training Dataset
801
+
802
+ #### Unnamed Dataset
803
+
804
+ * Size: 4,500 training samples
805
+ * Columns: <code>anchor</code>, <code>positive</code>, and <code>id</code>
806
+ * Approximate statistics based on the first 1000 samples:
807
+ | | anchor | positive | id |
808
+ |:--------|:-----------------------------------------------------------------------------------|:------------------------------------------------------------------------------------|:--------------------------------------------------------------------------------|
809
+ | type | string | string | string |
810
+ | details | <ul><li>min: 8 tokens</li><li>mean: 80.08 tokens</li><li>max: 512 tokens</li></ul> | <ul><li>min: 9 tokens</li><li>mean: 165.53 tokens</li><li>max: 512 tokens</li></ul> | <ul><li>min: 3 tokens</li><li>mean: 3.81 tokens</li><li>max: 4 tokens</li></ul> |
811
+ * Samples:
812
+ | anchor | positive | id |
813
+ |:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:------------------|
814
+ | <code>Stella’s antique shop has 3 dolls, 2 clocks and 5 glasses for sale. She sells the dolls for $5 each. The clocks are priced at $15 each. The glasses are priced at $4 each. If she spent $40 to buy everything and she sells all of her merchandise, how much profit will she make?</code> | <code>Context: <br>Answer: 25</code> | <code>3430</code> |
815
+ | <code>You are tasked with creating a Ruby program that defines a service for creating a project in a Continuous Integration (CI) system. The service should be able to execute with valid parameters and handle specific scenarios.<br><br>The program should include the following:<br>- A class called `Ci::CreateProjectService` that defines the service for creating a project.<br>- A method within the `Ci::CreateProjectService` class called `execute` that takes in three parameters: `current_user` (representing the current user), `project` (representing the project to be created), and `ci_origin_project` (optional, representing the project to use as a template for settings and jobs).<br>- The `execute` method should handle the following scenarios:<br> 1. When executed with valid parameters, it should return a new instance of `Ci::Project` that is persisted.<br> 2. When executed without a project dump (empty string), it should raise an exception.<br> 3. When executed with a `ci_origin_project` for forking, it should use ...</code> | <code>Context: <br>Answer: ```ruby<br>class Ci::CreateProjectService<br> def execute(current_user, project, ci_origin_project = nil)<br> if project.empty?<br> raise StandardError, 'Project dump is required'<br> end<br><br> new_project = Ci::Project.new<br> new_project.save<br><br> if ci_origin_project<br> new_project.shared_runners_enabled = ci_origin_project.shared_runners_enabled<br> new_project.public = ci_origin_project.public<br> new_project.allow_git_fetch = ci_origin_project.allow_git_fetch<br> end<br><br> new_project<br> end<br>end<br>```</code> | <code>656</code> |
816
+ | <code>Why is the Insertion Sort algorithm considered optimal for nearly sorted datasets, and how does its time complexity compare to other quadratic sorting algorithms?</code> | <code>Context: <br>Answer: Insertion Sort operates in O(n²) time complexity in the worst case, but for nearly sorted datasets, it achieves O(n) time complexity. This is because it only requires a minimal number of swaps to place elements in order. For datasets where most elements are already in their correct positions, the number of inversions (pairs out of order) is small, reducing the number of comparisons and swaps. This contrasts with other quadratic algorithms like Selection Sort, which must scan the entire dataset for each element, leading to O(n²) operations regardless of initial order. The efficiency of Insertion Sort for nearly sorted data stems from its ability to leverage existing order, making it a better choice for such scenarios.</code> | <code>1305</code> |
817
+ * Loss: [<code>MatryoshkaLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#matryoshkaloss) with these parameters:
818
+ ```json
819
+ {
820
+ "loss": "MultipleNegativesRankingLoss",
821
+ "matryoshka_dims": [
822
+ 768,
823
+ 512,
824
+ 256,
825
+ 128,
826
+ 64
827
+ ],
828
+ "matryoshka_weights": [
829
+ 1,
830
+ 1,
831
+ 1,
832
+ 1,
833
+ 1
834
+ ],
835
+ "n_dims_per_step": -1
836
+ }
837
+ ```
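
In sentence-transformers these parameters correspond to wrapping the base ranking loss in `MatryoshkaLoss`. A minimal sketch (uniform weights are the default):

```python
from sentence_transformers import SentenceTransformer
from sentence_transformers.losses import MatryoshkaLoss, MultipleNegativesRankingLoss

# Sketch: apply MultipleNegativesRankingLoss at every Matryoshka dimension.
model = SentenceTransformer("ibm-granite/granite-embedding-english-r2")
base_loss = MultipleNegativesRankingLoss(model)
loss = MatryoshkaLoss(model, base_loss, matryoshka_dims=[768, 512, 256, 128, 64])
```
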
838
+
839
+ ### Training Hyperparameters
840
+ #### Non-Default Hyperparameters
841
+
842
+ - `eval_strategy`: epoch
843
+ - `per_device_train_batch_size`: 64
844
+ - `per_device_eval_batch_size`: 64
845
+ - `gradient_accumulation_steps`: 2
846
+ - `num_train_epochs`: 10
847
+ - `lr_scheduler_type`: cosine
848
+ - `warmup_ratio`: 0.1
849
+ - `bf16`: True
850
+ - `tf32`: True
851
+ - `dataloader_num_workers`: 4
852
+ - `load_best_model_at_end`: True
853
+ - `batch_sampler`: no_duplicates
854
+
855
+ #### All Hyperparameters
856
+ <details><summary>Click to expand</summary>
857
+
858
+ - `overwrite_output_dir`: False
859
+ - `do_predict`: False
860
+ - `eval_strategy`: epoch
861
+ - `prediction_loss_only`: True
862
+ - `per_device_train_batch_size`: 64
863
+ - `per_device_eval_batch_size`: 64
864
+ - `per_gpu_train_batch_size`: None
865
+ - `per_gpu_eval_batch_size`: None
866
+ - `gradient_accumulation_steps`: 2
867
+ - `eval_accumulation_steps`: None
868
+ - `torch_empty_cache_steps`: None
869
+ - `learning_rate`: 5e-05
870
+ - `weight_decay`: 0.0
871
+ - `adam_beta1`: 0.9
872
+ - `adam_beta2`: 0.999
873
+ - `adam_epsilon`: 1e-08
874
+ - `max_grad_norm`: 1.0
875
+ - `num_train_epochs`: 10
876
+ - `max_steps`: -1
877
+ - `lr_scheduler_type`: cosine
878
+ - `lr_scheduler_kwargs`: {}
879
+ - `warmup_ratio`: 0.1
880
+ - `warmup_steps`: 0
881
+ - `log_level`: passive
882
+ - `log_level_replica`: warning
883
+ - `log_on_each_node`: True
884
+ - `logging_nan_inf_filter`: True
885
+ - `save_safetensors`: True
886
+ - `save_on_each_node`: False
887
+ - `save_only_model`: False
888
+ - `restore_callback_states_from_checkpoint`: False
889
+ - `no_cuda`: False
890
+ - `use_cpu`: False
891
+ - `use_mps_device`: False
892
+ - `seed`: 42
893
+ - `data_seed`: None
894
+ - `jit_mode_eval`: False
895
+ - `bf16`: True
896
+ - `fp16`: False
897
+ - `fp16_opt_level`: O1
898
+ - `half_precision_backend`: auto
899
+ - `bf16_full_eval`: False
900
+ - `fp16_full_eval`: False
901
+ - `tf32`: True
902
+ - `local_rank`: 0
903
+ - `ddp_backend`: None
904
+ - `tpu_num_cores`: None
905
+ - `tpu_metrics_debug`: False
906
+ - `debug`: []
907
+ - `dataloader_drop_last`: False
908
+ - `dataloader_num_workers`: 4
909
+ - `dataloader_prefetch_factor`: None
910
+ - `past_index`: -1
911
+ - `disable_tqdm`: False
912
+ - `remove_unused_columns`: True
913
+ - `label_names`: None
914
+ - `load_best_model_at_end`: True
915
+ - `ignore_data_skip`: False
916
+ - `fsdp`: []
917
+ - `fsdp_min_num_params`: 0
918
+ - `fsdp_config`: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
919
+ - `fsdp_transformer_layer_cls_to_wrap`: None
920
+ - `accelerator_config`: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
921
+ - `parallelism_config`: None
922
+ - `deepspeed`: None
923
+ - `label_smoothing_factor`: 0.0
924
+ - `optim`: adamw_torch_fused
925
+ - `optim_args`: None
926
+ - `adafactor`: False
927
+ - `group_by_length`: False
928
+ - `length_column_name`: length
929
+ - `project`: huggingface
930
+ - `trackio_space_id`: trackio
931
+ - `ddp_find_unused_parameters`: None
932
+ - `ddp_bucket_cap_mb`: None
933
+ - `ddp_broadcast_buffers`: False
934
+ - `dataloader_pin_memory`: True
935
+ - `dataloader_persistent_workers`: False
936
+ - `skip_memory_metrics`: True
937
+ - `use_legacy_prediction_loop`: False
938
+ - `push_to_hub`: False
939
+ - `resume_from_checkpoint`: None
940
+ - `hub_model_id`: None
941
+ - `hub_strategy`: every_save
942
+ - `hub_private_repo`: None
943
+ - `hub_always_push`: False
944
+ - `hub_revision`: None
945
+ - `gradient_checkpointing`: False
946
+ - `gradient_checkpointing_kwargs`: None
947
+ - `include_inputs_for_metrics`: False
948
+ - `include_for_metrics`: []
949
+ - `eval_do_concat_batches`: True
950
+ - `fp16_backend`: auto
951
+ - `push_to_hub_model_id`: None
952
+ - `push_to_hub_organization`: None
953
+ - `mp_parameters`:
954
+ - `auto_find_batch_size`: False
955
+ - `full_determinism`: False
956
+ - `torchdynamo`: None
957
+ - `ray_scope`: last
958
+ - `ddp_timeout`: 1800
959
+ - `torch_compile`: False
960
+ - `torch_compile_backend`: None
961
+ - `torch_compile_mode`: None
962
+ - `include_tokens_per_second`: False
963
+ - `include_num_input_tokens_seen`: no
964
+ - `neftune_noise_alpha`: None
965
+ - `optim_target_modules`: None
966
+ - `batch_eval_metrics`: False
967
+ - `eval_on_start`: False
968
+ - `use_liger_kernel`: False
969
+ - `liger_kernel_config`: None
970
+ - `eval_use_gather_object`: False
971
+ - `average_tokens_across_devices`: True
972
+ - `prompts`: None
973
+ - `batch_sampler`: no_duplicates
974
+ - `multi_dataset_batch_sampler`: proportional
975
+ - `router_mapping`: {}
976
+ - `learning_rate_mapping`: {}
977
+
978
+ </details>
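
A minimal sketch of how the non-default hyperparameters above map onto `SentenceTransformerTrainingArguments`; the output directory is a placeholder, and `save_strategy="epoch"` is an assumption needed so that `load_best_model_at_end` has matching evaluation and save strategies:

```python
from sentence_transformers import SentenceTransformerTrainingArguments
from sentence_transformers.training_args import BatchSamplers

args = SentenceTransformerTrainingArguments(
    output_dir="outputs/granite-embedding-math-cs",  # placeholder
    eval_strategy="epoch",
    save_strategy="epoch",  # assumed; required for load_best_model_at_end
    per_device_train_batch_size=64,
    per_device_eval_batch_size=64,
    gradient_accumulation_steps=2,
    num_train_epochs=10,
    lr_scheduler_type="cosine",
    warmup_ratio=0.1,
    bf16=True,
    tf32=True,
    dataloader_num_workers=4,
    load_best_model_at_end=True,
    batch_sampler=BatchSamplers.NO_DUPLICATES,
)
```
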
979
+
980
+ ### Training Logs
981
+ | Epoch | Step | Training Loss | dim_768_cosine_ndcg@10 | dim_512_cosine_ndcg@10 | dim_256_cosine_ndcg@10 | dim_128_cosine_ndcg@10 | dim_64_cosine_ndcg@10 |
982
+ |:-------:|:-------:|:-------------:|:----------------------:|:----------------------:|:----------------------:|:----------------------:|:---------------------:|
983
+ | -1 | -1 | - | 0.6227 | 0.6213 | 0.6163 | 0.6036 | 0.5905 |
984
+ | 0.2817 | 10 | 10.3671 | - | - | - | - | - |
985
+ | 0.5634 | 20 | 8.1302 | - | - | - | - | - |
986
+ | 0.8451 | 30 | 6.6781 | - | - | - | - | - |
987
+ | 1.0 | 36 | - | 0.6371 | 0.6373 | 0.6368 | 0.6384 | 0.6297 |
988
+ | 1.1127 | 40 | 5.6041 | - | - | - | - | - |
989
+ | 1.3944 | 50 | 5.3589 | - | - | - | - | - |
990
+ | 1.6761 | 60 | 5.2615 | - | - | - | - | - |
991
+ | 1.9577 | 70 | 5.1322 | - | - | - | - | - |
992
+ | 2.0 | 72 | - | 0.6584 | 0.6599 | 0.6567 | 0.6590 | 0.6588 |
993
+ | 2.2254 | 80 | 4.2222 | - | - | - | - | - |
994
+ | 2.5070 | 90 | 3.6282 | - | - | - | - | - |
995
+ | 2.7887 | 100 | 3.5652 | - | - | - | - | - |
996
+ | 3.0 | 108 | - | 0.6679 | 0.6724 | 0.6750 | 0.6699 | 0.6645 |
997
+ | 3.0563 | 110 | 3.1212 | - | - | - | - | - |
998
+ | 3.3380 | 120 | 1.8016 | - | - | - | - | - |
999
+ | 3.6197 | 130 | 1.8941 | - | - | - | - | - |
1000
+ | 3.9014 | 140 | 1.8576 | - | - | - | - | - |
1001
+ | 4.0 | 144 | - | 0.6900 | 0.6923 | 0.6937 | 0.6863 | 0.6771 |
1002
+ | 4.1690 | 150 | 1.0872 | - | - | - | - | - |
1003
+ | 4.4507 | 160 | 0.7482 | - | - | - | - | - |
1004
+ | 4.7324 | 170 | 0.7307 | - | - | - | - | - |
1005
+ | 5.0 | 180 | 0.8322 | 0.6909 | 0.6988 | 0.6947 | 0.6873 | 0.6800 |
1006
+ | 5.2817 | 190 | 0.329 | - | - | - | - | - |
1007
+ | 5.5634 | 200 | 0.3246 | - | - | - | - | - |
1008
+ | 5.8451 | 210 | 0.274 | - | - | - | - | - |
1009
+ | 6.0 | 216 | - | 0.6898 | 0.6929 | 0.6904 | 0.6900 | 0.6801 |
1010
+ | 6.1127 | 220 | 0.2161 | - | - | - | - | - |
1011
+ | 6.3944 | 230 | 0.1178 | - | - | - | - | - |
1012
+ | 6.6761 | 240 | 0.1418 | - | - | - | - | - |
1013
+ | 6.9577 | 250 | 0.1319 | - | - | - | - | - |
1014
+ | 7.0 | 252 | - | 0.6920 | 0.6890 | 0.6910 | 0.6880 | 0.6789 |
1015
+ | 7.2254 | 260 | 0.0979 | - | - | - | - | - |
1016
+ | 7.5070 | 270 | 0.0653 | - | - | - | - | - |
1017
+ | 7.7887 | 280 | 0.0852 | - | - | - | - | - |
1018
+ | **8.0** | **288** | **-** | **0.6934** | **0.69** | **0.6934** | **0.6877** | **0.6825** |
1019
+ | 8.0563 | 290 | 0.08 | - | - | - | - | - |
1020
+ | 8.3380 | 300 | 0.0526 | - | - | - | - | - |
1021
+ | 8.6197 | 310 | 0.066 | - | - | - | - | - |
1022
+ | 8.9014 | 320 | 0.0549 | - | - | - | - | - |
1023
+ | 9.0 | 324 | - | 0.6911 | 0.6929 | 0.6905 | 0.6858 | 0.6802 |
1024
+ | 9.1690 | 330 | 0.0384 | - | - | - | - | - |
1025
+ | 9.4507 | 340 | 0.0523 | - | - | - | - | - |
1026
+ | 9.7324 | 350 | 0.0333 | - | - | - | - | - |
1027
+ | 10.0 | 360 | 0.0488 | 0.6916 | 0.6940 | 0.6915 | 0.6863 | 0.6806 |
1028
+
1029
+ * The bold row denotes the saved checkpoint.
1030
+
1031
+ ### Framework Versions
1032
+ - Python: 3.12.12
1033
+ - Sentence Transformers: 5.2.0
1034
+ - Transformers: 4.57.3
1035
+ - PyTorch: 2.9.1+cu128
1036
+ - Accelerate: 1.12.0
1037
+ - Datasets: 4.4.2
1038
+ - Tokenizers: 0.22.1
1039
+
1040
+ ## Citation
1041
+
1042
+ ### BibTeX
1043
+
1044
+ #### Sentence Transformers
1045
+ ```bibtex
1046
+ @inproceedings{reimers-2019-sentence-bert,
1047
+ title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
1048
+ author = "Reimers, Nils and Gurevych, Iryna",
1049
+ booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
1050
+ month = "11",
1051
+ year = "2019",
1052
+ publisher = "Association for Computational Linguistics",
1053
+ url = "https://arxiv.org/abs/1908.10084",
1054
+ }
1055
+ ```
1056
+
1057
+ #### MatryoshkaLoss
1058
+ ```bibtex
1059
+ @misc{kusupati2024matryoshka,
1060
+ title={Matryoshka Representation Learning},
1061
+ author={Aditya Kusupati and Gantavya Bhatt and Aniket Rege and Matthew Wallingford and Aditya Sinha and Vivek Ramanujan and William Howard-Snyder and Kaifeng Chen and Sham Kakade and Prateek Jain and Ali Farhadi},
1062
+ year={2024},
1063
+ eprint={2205.13147},
1064
+ archivePrefix={arXiv},
1065
+ primaryClass={cs.LG}
1066
+ }
1067
+ ```
1068
+
1069
+ #### MultipleNegativesRankingLoss
1070
+ ```bibtex
1071
+ @misc{henderson2017efficient,
1072
+ title={Efficient Natural Language Response Suggestion for Smart Reply},
1073
+ author={Matthew Henderson and Rami Al-Rfou and Brian Strope and Yun-hsuan Sung and Laszlo Lukacs and Ruiqi Guo and Sanjiv Kumar and Balint Miklos and Ray Kurzweil},
1074
+ year={2017},
1075
+ eprint={1705.00652},
1076
+ archivePrefix={arXiv},
1077
+ primaryClass={cs.CL}
1078
+ }
1079
+ ```
1080
+
1081
+ <!--
1082
+ ## Glossary
1083
+
1084
+ *Clearly define terms in order to be accessible across audiences.*
1085
+ -->
1086
+
1087
+ <!--
1088
+ ## Model Card Authors
1089
+
1090
+ *Lists the people who create the model card, providing recognition and accountability for the detailed work that goes into its construction.*
1091
+ -->
1092
+
1093
+ <!--
1094
+ ## Model Card Contact
1095
+
1096
+ *Provides a way for people who have updates to the Model Card, suggestions, or questions, to contact the Model Card authors.*
1097
+ -->
config.json ADDED
@@ -0,0 +1,48 @@
+ {
+   "architectures": [
+     "ModernBertModel"
+   ],
+   "attention_bias": false,
+   "attention_dropout": 0.0,
+   "attention_probs_dropout_prob": 0.1,
+   "bos_token_id": 50281,
+   "classifier_activation": "silu",
+   "classifier_bias": false,
+   "classifier_dropout": 0.0,
+   "classifier_pooling": "mean",
+   "cls_token_id": 50281,
+   "decoder_bias": true,
+   "deterministic_flash_attn": false,
+   "dtype": "float32",
+   "embedding_dropout": 0.0,
+   "eos_token_id": 50282,
+   "global_attn_every_n_layers": 3,
+   "global_rope_theta": 80000.0,
+   "gradient_checkpointing": false,
+   "hidden_act": "gelu",
+   "hidden_activation": "gelu",
+   "hidden_dropout_prob": 0.1,
+   "hidden_size": 768,
+   "initializer_cutoff_factor": 2.0,
+   "initializer_range": 0.02,
+   "intermediate_size": 1152,
+   "layer_norm_eps": 1e-05,
+   "local_attention": 128,
+   "local_rope_theta": 10000.0,
+   "max_position_embeddings": 8192,
+   "mlp_bias": false,
+   "mlp_dropout": 0.0,
+   "model_type": "modernbert",
+   "norm_bias": false,
+   "norm_eps": 1e-05,
+   "num_attention_heads": 12,
+   "num_hidden_layers": 22,
+   "pad_token_id": 50283,
+   "position_embedding_type": "absolute",
+   "repad_logits_with_grad": false,
+   "sep_token_id": 50282,
+   "sparse_pred_ignore_index": -100,
+   "sparse_prediction": false,
+   "transformers_version": "4.57.3",
+   "vocab_size": 50368
+ }
config_sentence_transformers.json ADDED
@@ -0,0 +1,14 @@
+ {
+   "model_type": "SentenceTransformer",
+   "__version__": {
+     "sentence_transformers": "5.2.0",
+     "transformers": "4.57.3",
+     "pytorch": "2.9.1+cu128"
+   },
+   "prompts": {
+     "query": "",
+     "document": ""
+   },
+   "default_prompt_name": null,
+   "similarity_fn_name": "cosine"
+ }
model.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:4947d92ab9822101b88590dff231f9160ff25f9092c778b2b4240f46eb0abb10
+ size 596070136
modules.json ADDED
@@ -0,0 +1,14 @@
+ [
+   {
+     "idx": 0,
+     "name": "0",
+     "path": "",
+     "type": "sentence_transformers.models.Transformer"
+   },
+   {
+     "idx": 1,
+     "name": "1",
+     "path": "1_Pooling",
+     "type": "sentence_transformers.models.Pooling"
+   }
+ ]
sentence_bert_config.json ADDED
@@ -0,0 +1,4 @@
+ {
+   "max_seq_length": 512,
+   "do_lower_case": false
+ }
special_tokens_map.json ADDED
@@ -0,0 +1,37 @@
1
+ {
2
+ "cls_token": {
3
+ "content": "[CLS]",
4
+ "lstrip": false,
5
+ "normalized": false,
6
+ "rstrip": false,
7
+ "single_word": false
8
+ },
9
+ "mask_token": {
10
+ "content": "[MASK]",
11
+ "lstrip": true,
12
+ "normalized": false,
13
+ "rstrip": false,
14
+ "single_word": false
15
+ },
16
+ "pad_token": {
17
+ "content": "[PAD]",
18
+ "lstrip": false,
19
+ "normalized": false,
20
+ "rstrip": false,
21
+ "single_word": false
22
+ },
23
+ "sep_token": {
24
+ "content": "[SEP]",
25
+ "lstrip": false,
26
+ "normalized": false,
27
+ "rstrip": false,
28
+ "single_word": false
29
+ },
30
+ "unk_token": {
31
+ "content": "[UNK]",
32
+ "lstrip": false,
33
+ "normalized": false,
34
+ "rstrip": false,
35
+ "single_word": false
36
+ }
37
+ }
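
The special tokens follow the BERT convention and map to the ids fixed in `config.json` (`[CLS]` 50281, `[SEP]` 50282, `[PAD]` 50283). A minimal sketch for checking how encoded sequences are framed, assuming a local checkout in the current directory:

```python
# Minimal sketch: inspect the special tokens and confirm [CLS] ... [SEP] framing.
# Assumes the repository files are in the current directory.
from transformers import AutoTokenizer

tok = AutoTokenizer.from_pretrained(".")
print(tok.cls_token, tok.cls_token_id)   # [CLS] 50281
print(tok.sep_token, tok.sep_token_id)   # [SEP] 50282
print(tok.pad_token, tok.pad_token_id)   # [PAD] 50283

ids = tok("an example sentence")["input_ids"]
print(ids[0] == tok.cls_token_id, ids[-1] == tok.sep_token_id)   # True True
```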
tokenizer.json ADDED
The diff for this file is too large to render. See raw diff
 
tokenizer_config.json ADDED
@@ -0,0 +1,945 @@
1
+ {
2
+ "added_tokens_decoder": {
3
+ "0": {
4
+ "content": "|||IP_ADDRESS|||",
5
+ "lstrip": false,
6
+ "normalized": true,
7
+ "rstrip": false,
8
+ "single_word": false,
9
+ "special": false
10
+ },
11
+ "1": {
12
+ "content": "<|padding|>",
13
+ "lstrip": false,
14
+ "normalized": false,
15
+ "rstrip": false,
16
+ "single_word": false,
17
+ "special": true
18
+ },
19
+ "50254": {
20
+ "content": " ",
21
+ "lstrip": false,
22
+ "normalized": true,
23
+ "rstrip": false,
24
+ "single_word": false,
25
+ "special": false
26
+ },
27
+ "50255": {
28
+ "content": " ",
29
+ "lstrip": false,
30
+ "normalized": true,
31
+ "rstrip": false,
32
+ "single_word": false,
33
+ "special": false
34
+ },
35
+ "50256": {
36
+ "content": " ",
37
+ "lstrip": false,
38
+ "normalized": true,
39
+ "rstrip": false,
40
+ "single_word": false,
41
+ "special": false
42
+ },
43
+ "50257": {
44
+ "content": " ",
45
+ "lstrip": false,
46
+ "normalized": true,
47
+ "rstrip": false,
48
+ "single_word": false,
49
+ "special": false
50
+ },
51
+ "50258": {
52
+ "content": " ",
53
+ "lstrip": false,
54
+ "normalized": true,
55
+ "rstrip": false,
56
+ "single_word": false,
57
+ "special": false
58
+ },
59
+ "50259": {
60
+ "content": " ",
61
+ "lstrip": false,
62
+ "normalized": true,
63
+ "rstrip": false,
64
+ "single_word": false,
65
+ "special": false
66
+ },
67
+ "50260": {
68
+ "content": " ",
69
+ "lstrip": false,
70
+ "normalized": true,
71
+ "rstrip": false,
72
+ "single_word": false,
73
+ "special": false
74
+ },
75
+ "50261": {
76
+ "content": " ",
77
+ "lstrip": false,
78
+ "normalized": true,
79
+ "rstrip": false,
80
+ "single_word": false,
81
+ "special": false
82
+ },
83
+ "50262": {
84
+ "content": " ",
85
+ "lstrip": false,
86
+ "normalized": true,
87
+ "rstrip": false,
88
+ "single_word": false,
89
+ "special": false
90
+ },
91
+ "50263": {
92
+ "content": " ",
93
+ "lstrip": false,
94
+ "normalized": true,
95
+ "rstrip": false,
96
+ "single_word": false,
97
+ "special": false
98
+ },
99
+ "50264": {
100
+ "content": " ",
101
+ "lstrip": false,
102
+ "normalized": true,
103
+ "rstrip": false,
104
+ "single_word": false,
105
+ "special": false
106
+ },
107
+ "50265": {
108
+ "content": " ",
109
+ "lstrip": false,
110
+ "normalized": true,
111
+ "rstrip": false,
112
+ "single_word": false,
113
+ "special": false
114
+ },
115
+ "50266": {
116
+ "content": " ",
117
+ "lstrip": false,
118
+ "normalized": true,
119
+ "rstrip": false,
120
+ "single_word": false,
121
+ "special": false
122
+ },
123
+ "50267": {
124
+ "content": " ",
125
+ "lstrip": false,
126
+ "normalized": true,
127
+ "rstrip": false,
128
+ "single_word": false,
129
+ "special": false
130
+ },
131
+ "50268": {
132
+ "content": " ",
133
+ "lstrip": false,
134
+ "normalized": true,
135
+ "rstrip": false,
136
+ "single_word": false,
137
+ "special": false
138
+ },
139
+ "50269": {
140
+ "content": " ",
141
+ "lstrip": false,
142
+ "normalized": true,
143
+ "rstrip": false,
144
+ "single_word": false,
145
+ "special": false
146
+ },
147
+ "50270": {
148
+ "content": " ",
149
+ "lstrip": false,
150
+ "normalized": true,
151
+ "rstrip": false,
152
+ "single_word": false,
153
+ "special": false
154
+ },
155
+ "50271": {
156
+ "content": " ",
157
+ "lstrip": false,
158
+ "normalized": true,
159
+ "rstrip": false,
160
+ "single_word": false,
161
+ "special": false
162
+ },
163
+ "50272": {
164
+ "content": " ",
165
+ "lstrip": false,
166
+ "normalized": true,
167
+ "rstrip": false,
168
+ "single_word": false,
169
+ "special": false
170
+ },
171
+ "50273": {
172
+ "content": " ",
173
+ "lstrip": false,
174
+ "normalized": true,
175
+ "rstrip": false,
176
+ "single_word": false,
177
+ "special": false
178
+ },
179
+ "50274": {
180
+ "content": " ",
181
+ "lstrip": false,
182
+ "normalized": true,
183
+ "rstrip": false,
184
+ "single_word": false,
185
+ "special": false
186
+ },
187
+ "50275": {
188
+ "content": " ",
189
+ "lstrip": false,
190
+ "normalized": true,
191
+ "rstrip": false,
192
+ "single_word": false,
193
+ "special": false
194
+ },
195
+ "50276": {
196
+ "content": " ",
197
+ "lstrip": false,
198
+ "normalized": true,
199
+ "rstrip": false,
200
+ "single_word": false,
201
+ "special": false
202
+ },
203
+ "50277": {
204
+ "content": "|||EMAIL_ADDRESS|||",
205
+ "lstrip": false,
206
+ "normalized": true,
207
+ "rstrip": false,
208
+ "single_word": false,
209
+ "special": false
210
+ },
211
+ "50278": {
212
+ "content": "|||PHONE_NUMBER|||",
213
+ "lstrip": false,
214
+ "normalized": true,
215
+ "rstrip": false,
216
+ "single_word": false,
217
+ "special": false
218
+ },
219
+ "50279": {
220
+ "content": "<|endoftext|>",
221
+ "lstrip": false,
222
+ "normalized": false,
223
+ "rstrip": false,
224
+ "single_word": false,
225
+ "special": true
226
+ },
227
+ "50280": {
228
+ "content": "[UNK]",
229
+ "lstrip": false,
230
+ "normalized": false,
231
+ "rstrip": false,
232
+ "single_word": false,
233
+ "special": true
234
+ },
235
+ "50281": {
236
+ "content": "[CLS]",
237
+ "lstrip": false,
238
+ "normalized": false,
239
+ "rstrip": false,
240
+ "single_word": false,
241
+ "special": true
242
+ },
243
+ "50282": {
244
+ "content": "[SEP]",
245
+ "lstrip": false,
246
+ "normalized": false,
247
+ "rstrip": false,
248
+ "single_word": false,
249
+ "special": true
250
+ },
251
+ "50283": {
252
+ "content": "[PAD]",
253
+ "lstrip": false,
254
+ "normalized": false,
255
+ "rstrip": false,
256
+ "single_word": false,
257
+ "special": true
258
+ },
259
+ "50284": {
260
+ "content": "[MASK]",
261
+ "lstrip": true,
262
+ "normalized": false,
263
+ "rstrip": false,
264
+ "single_word": false,
265
+ "special": true
266
+ },
267
+ "50285": {
268
+ "content": "[unused0]",
269
+ "lstrip": false,
270
+ "normalized": true,
271
+ "rstrip": false,
272
+ "single_word": false,
273
+ "special": false
274
+ },
275
+ "50286": {
276
+ "content": "[unused1]",
277
+ "lstrip": false,
278
+ "normalized": true,
279
+ "rstrip": false,
280
+ "single_word": false,
281
+ "special": false
282
+ },
283
+ "50287": {
284
+ "content": "[unused2]",
285
+ "lstrip": false,
286
+ "normalized": true,
287
+ "rstrip": false,
288
+ "single_word": false,
289
+ "special": false
290
+ },
291
+ "50288": {
292
+ "content": "[unused3]",
293
+ "lstrip": false,
294
+ "normalized": true,
295
+ "rstrip": false,
296
+ "single_word": false,
297
+ "special": false
298
+ },
299
+ "50289": {
300
+ "content": "[unused4]",
301
+ "lstrip": false,
302
+ "normalized": true,
303
+ "rstrip": false,
304
+ "single_word": false,
305
+ "special": false
306
+ },
307
+ "50290": {
308
+ "content": "[unused5]",
309
+ "lstrip": false,
310
+ "normalized": true,
311
+ "rstrip": false,
312
+ "single_word": false,
313
+ "special": false
314
+ },
315
+ "50291": {
316
+ "content": "[unused6]",
317
+ "lstrip": false,
318
+ "normalized": true,
319
+ "rstrip": false,
320
+ "single_word": false,
321
+ "special": false
322
+ },
323
+ "50292": {
324
+ "content": "[unused7]",
325
+ "lstrip": false,
326
+ "normalized": true,
327
+ "rstrip": false,
328
+ "single_word": false,
329
+ "special": false
330
+ },
331
+ "50293": {
332
+ "content": "[unused8]",
333
+ "lstrip": false,
334
+ "normalized": true,
335
+ "rstrip": false,
336
+ "single_word": false,
337
+ "special": false
338
+ },
339
+ "50294": {
340
+ "content": "[unused9]",
341
+ "lstrip": false,
342
+ "normalized": true,
343
+ "rstrip": false,
344
+ "single_word": false,
345
+ "special": false
346
+ },
347
+ "50295": {
348
+ "content": "[unused10]",
349
+ "lstrip": false,
350
+ "normalized": true,
351
+ "rstrip": false,
352
+ "single_word": false,
353
+ "special": false
354
+ },
355
+ "50296": {
356
+ "content": "[unused11]",
357
+ "lstrip": false,
358
+ "normalized": true,
359
+ "rstrip": false,
360
+ "single_word": false,
361
+ "special": false
362
+ },
363
+ "50297": {
364
+ "content": "[unused12]",
365
+ "lstrip": false,
366
+ "normalized": true,
367
+ "rstrip": false,
368
+ "single_word": false,
369
+ "special": false
370
+ },
371
+ "50298": {
372
+ "content": "[unused13]",
373
+ "lstrip": false,
374
+ "normalized": true,
375
+ "rstrip": false,
376
+ "single_word": false,
377
+ "special": false
378
+ },
379
+ "50299": {
380
+ "content": "[unused14]",
381
+ "lstrip": false,
382
+ "normalized": true,
383
+ "rstrip": false,
384
+ "single_word": false,
385
+ "special": false
386
+ },
387
+ "50300": {
388
+ "content": "[unused15]",
389
+ "lstrip": false,
390
+ "normalized": true,
391
+ "rstrip": false,
392
+ "single_word": false,
393
+ "special": false
394
+ },
395
+ "50301": {
396
+ "content": "[unused16]",
397
+ "lstrip": false,
398
+ "normalized": true,
399
+ "rstrip": false,
400
+ "single_word": false,
401
+ "special": false
402
+ },
403
+ "50302": {
404
+ "content": "[unused17]",
405
+ "lstrip": false,
406
+ "normalized": true,
407
+ "rstrip": false,
408
+ "single_word": false,
409
+ "special": false
410
+ },
411
+ "50303": {
412
+ "content": "[unused18]",
413
+ "lstrip": false,
414
+ "normalized": true,
415
+ "rstrip": false,
416
+ "single_word": false,
417
+ "special": false
418
+ },
419
+ "50304": {
420
+ "content": "[unused19]",
421
+ "lstrip": false,
422
+ "normalized": true,
423
+ "rstrip": false,
424
+ "single_word": false,
425
+ "special": false
426
+ },
427
+ "50305": {
428
+ "content": "[unused20]",
429
+ "lstrip": false,
430
+ "normalized": true,
431
+ "rstrip": false,
432
+ "single_word": false,
433
+ "special": false
434
+ },
435
+ "50306": {
436
+ "content": "[unused21]",
437
+ "lstrip": false,
438
+ "normalized": true,
439
+ "rstrip": false,
440
+ "single_word": false,
441
+ "special": false
442
+ },
443
+ "50307": {
444
+ "content": "[unused22]",
445
+ "lstrip": false,
446
+ "normalized": true,
447
+ "rstrip": false,
448
+ "single_word": false,
449
+ "special": false
450
+ },
451
+ "50308": {
452
+ "content": "[unused23]",
453
+ "lstrip": false,
454
+ "normalized": true,
455
+ "rstrip": false,
456
+ "single_word": false,
457
+ "special": false
458
+ },
459
+ "50309": {
460
+ "content": "[unused24]",
461
+ "lstrip": false,
462
+ "normalized": true,
463
+ "rstrip": false,
464
+ "single_word": false,
465
+ "special": false
466
+ },
467
+ "50310": {
468
+ "content": "[unused25]",
469
+ "lstrip": false,
470
+ "normalized": true,
471
+ "rstrip": false,
472
+ "single_word": false,
473
+ "special": false
474
+ },
475
+ "50311": {
476
+ "content": "[unused26]",
477
+ "lstrip": false,
478
+ "normalized": true,
479
+ "rstrip": false,
480
+ "single_word": false,
481
+ "special": false
482
+ },
483
+ "50312": {
484
+ "content": "[unused27]",
485
+ "lstrip": false,
486
+ "normalized": true,
487
+ "rstrip": false,
488
+ "single_word": false,
489
+ "special": false
490
+ },
491
+ "50313": {
492
+ "content": "[unused28]",
493
+ "lstrip": false,
494
+ "normalized": true,
495
+ "rstrip": false,
496
+ "single_word": false,
497
+ "special": false
498
+ },
499
+ "50314": {
500
+ "content": "[unused29]",
501
+ "lstrip": false,
502
+ "normalized": true,
503
+ "rstrip": false,
504
+ "single_word": false,
505
+ "special": false
506
+ },
507
+ "50315": {
508
+ "content": "[unused30]",
509
+ "lstrip": false,
510
+ "normalized": true,
511
+ "rstrip": false,
512
+ "single_word": false,
513
+ "special": false
514
+ },
515
+ "50316": {
516
+ "content": "[unused31]",
517
+ "lstrip": false,
518
+ "normalized": true,
519
+ "rstrip": false,
520
+ "single_word": false,
521
+ "special": false
522
+ },
523
+ "50317": {
524
+ "content": "[unused32]",
525
+ "lstrip": false,
526
+ "normalized": true,
527
+ "rstrip": false,
528
+ "single_word": false,
529
+ "special": false
530
+ },
531
+ "50318": {
532
+ "content": "[unused33]",
533
+ "lstrip": false,
534
+ "normalized": true,
535
+ "rstrip": false,
536
+ "single_word": false,
537
+ "special": false
538
+ },
539
+ "50319": {
540
+ "content": "[unused34]",
541
+ "lstrip": false,
542
+ "normalized": true,
543
+ "rstrip": false,
544
+ "single_word": false,
545
+ "special": false
546
+ },
547
+ "50320": {
548
+ "content": "[unused35]",
549
+ "lstrip": false,
550
+ "normalized": true,
551
+ "rstrip": false,
552
+ "single_word": false,
553
+ "special": false
554
+ },
555
+ "50321": {
556
+ "content": "[unused36]",
557
+ "lstrip": false,
558
+ "normalized": true,
559
+ "rstrip": false,
560
+ "single_word": false,
561
+ "special": false
562
+ },
563
+ "50322": {
564
+ "content": "[unused37]",
565
+ "lstrip": false,
566
+ "normalized": true,
567
+ "rstrip": false,
568
+ "single_word": false,
569
+ "special": false
570
+ },
571
+ "50323": {
572
+ "content": "[unused38]",
573
+ "lstrip": false,
574
+ "normalized": true,
575
+ "rstrip": false,
576
+ "single_word": false,
577
+ "special": false
578
+ },
579
+ "50324": {
580
+ "content": "[unused39]",
581
+ "lstrip": false,
582
+ "normalized": true,
583
+ "rstrip": false,
584
+ "single_word": false,
585
+ "special": false
586
+ },
587
+ "50325": {
588
+ "content": "[unused40]",
589
+ "lstrip": false,
590
+ "normalized": true,
591
+ "rstrip": false,
592
+ "single_word": false,
593
+ "special": false
594
+ },
595
+ "50326": {
596
+ "content": "[unused41]",
597
+ "lstrip": false,
598
+ "normalized": true,
599
+ "rstrip": false,
600
+ "single_word": false,
601
+ "special": false
602
+ },
603
+ "50327": {
604
+ "content": "[unused42]",
605
+ "lstrip": false,
606
+ "normalized": true,
607
+ "rstrip": false,
608
+ "single_word": false,
609
+ "special": false
610
+ },
611
+ "50328": {
612
+ "content": "[unused43]",
613
+ "lstrip": false,
614
+ "normalized": true,
615
+ "rstrip": false,
616
+ "single_word": false,
617
+ "special": false
618
+ },
619
+ "50329": {
620
+ "content": "[unused44]",
621
+ "lstrip": false,
622
+ "normalized": true,
623
+ "rstrip": false,
624
+ "single_word": false,
625
+ "special": false
626
+ },
627
+ "50330": {
628
+ "content": "[unused45]",
629
+ "lstrip": false,
630
+ "normalized": true,
631
+ "rstrip": false,
632
+ "single_word": false,
633
+ "special": false
634
+ },
635
+ "50331": {
636
+ "content": "[unused46]",
637
+ "lstrip": false,
638
+ "normalized": true,
639
+ "rstrip": false,
640
+ "single_word": false,
641
+ "special": false
642
+ },
643
+ "50332": {
644
+ "content": "[unused47]",
645
+ "lstrip": false,
646
+ "normalized": true,
647
+ "rstrip": false,
648
+ "single_word": false,
649
+ "special": false
650
+ },
651
+ "50333": {
652
+ "content": "[unused48]",
653
+ "lstrip": false,
654
+ "normalized": true,
655
+ "rstrip": false,
656
+ "single_word": false,
657
+ "special": false
658
+ },
659
+ "50334": {
660
+ "content": "[unused49]",
661
+ "lstrip": false,
662
+ "normalized": true,
663
+ "rstrip": false,
664
+ "single_word": false,
665
+ "special": false
666
+ },
667
+ "50335": {
668
+ "content": "[unused50]",
669
+ "lstrip": false,
670
+ "normalized": true,
671
+ "rstrip": false,
672
+ "single_word": false,
673
+ "special": false
674
+ },
675
+ "50336": {
676
+ "content": "[unused51]",
677
+ "lstrip": false,
678
+ "normalized": true,
679
+ "rstrip": false,
680
+ "single_word": false,
681
+ "special": false
682
+ },
683
+ "50337": {
684
+ "content": "[unused52]",
685
+ "lstrip": false,
686
+ "normalized": true,
687
+ "rstrip": false,
688
+ "single_word": false,
689
+ "special": false
690
+ },
691
+ "50338": {
692
+ "content": "[unused53]",
693
+ "lstrip": false,
694
+ "normalized": true,
695
+ "rstrip": false,
696
+ "single_word": false,
697
+ "special": false
698
+ },
699
+ "50339": {
700
+ "content": "[unused54]",
701
+ "lstrip": false,
702
+ "normalized": true,
703
+ "rstrip": false,
704
+ "single_word": false,
705
+ "special": false
706
+ },
707
+ "50340": {
708
+ "content": "[unused55]",
709
+ "lstrip": false,
710
+ "normalized": true,
711
+ "rstrip": false,
712
+ "single_word": false,
713
+ "special": false
714
+ },
715
+ "50341": {
716
+ "content": "[unused56]",
717
+ "lstrip": false,
718
+ "normalized": true,
719
+ "rstrip": false,
720
+ "single_word": false,
721
+ "special": false
722
+ },
723
+ "50342": {
724
+ "content": "[unused57]",
725
+ "lstrip": false,
726
+ "normalized": true,
727
+ "rstrip": false,
728
+ "single_word": false,
729
+ "special": false
730
+ },
731
+ "50343": {
732
+ "content": "[unused58]",
733
+ "lstrip": false,
734
+ "normalized": true,
735
+ "rstrip": false,
736
+ "single_word": false,
737
+ "special": false
738
+ },
739
+ "50344": {
740
+ "content": "[unused59]",
741
+ "lstrip": false,
742
+ "normalized": true,
743
+ "rstrip": false,
744
+ "single_word": false,
745
+ "special": false
746
+ },
747
+ "50345": {
748
+ "content": "[unused60]",
749
+ "lstrip": false,
750
+ "normalized": true,
751
+ "rstrip": false,
752
+ "single_word": false,
753
+ "special": false
754
+ },
755
+ "50346": {
756
+ "content": "[unused61]",
757
+ "lstrip": false,
758
+ "normalized": true,
759
+ "rstrip": false,
760
+ "single_word": false,
761
+ "special": false
762
+ },
763
+ "50347": {
764
+ "content": "[unused62]",
765
+ "lstrip": false,
766
+ "normalized": true,
767
+ "rstrip": false,
768
+ "single_word": false,
769
+ "special": false
770
+ },
771
+ "50348": {
772
+ "content": "[unused63]",
773
+ "lstrip": false,
774
+ "normalized": true,
775
+ "rstrip": false,
776
+ "single_word": false,
777
+ "special": false
778
+ },
779
+ "50349": {
780
+ "content": "[unused64]",
781
+ "lstrip": false,
782
+ "normalized": true,
783
+ "rstrip": false,
784
+ "single_word": false,
785
+ "special": false
786
+ },
787
+ "50350": {
788
+ "content": "[unused65]",
789
+ "lstrip": false,
790
+ "normalized": true,
791
+ "rstrip": false,
792
+ "single_word": false,
793
+ "special": false
794
+ },
795
+ "50351": {
796
+ "content": "[unused66]",
797
+ "lstrip": false,
798
+ "normalized": true,
799
+ "rstrip": false,
800
+ "single_word": false,
801
+ "special": false
802
+ },
803
+ "50352": {
804
+ "content": "[unused67]",
805
+ "lstrip": false,
806
+ "normalized": true,
807
+ "rstrip": false,
808
+ "single_word": false,
809
+ "special": false
810
+ },
811
+ "50353": {
812
+ "content": "[unused68]",
813
+ "lstrip": false,
814
+ "normalized": true,
815
+ "rstrip": false,
816
+ "single_word": false,
817
+ "special": false
818
+ },
819
+ "50354": {
820
+ "content": "[unused69]",
821
+ "lstrip": false,
822
+ "normalized": true,
823
+ "rstrip": false,
824
+ "single_word": false,
825
+ "special": false
826
+ },
827
+ "50355": {
828
+ "content": "[unused70]",
829
+ "lstrip": false,
830
+ "normalized": true,
831
+ "rstrip": false,
832
+ "single_word": false,
833
+ "special": false
834
+ },
835
+ "50356": {
836
+ "content": "[unused71]",
837
+ "lstrip": false,
838
+ "normalized": true,
839
+ "rstrip": false,
840
+ "single_word": false,
841
+ "special": false
842
+ },
843
+ "50357": {
844
+ "content": "[unused72]",
845
+ "lstrip": false,
846
+ "normalized": true,
847
+ "rstrip": false,
848
+ "single_word": false,
849
+ "special": false
850
+ },
851
+ "50358": {
852
+ "content": "[unused73]",
853
+ "lstrip": false,
854
+ "normalized": true,
855
+ "rstrip": false,
856
+ "single_word": false,
857
+ "special": false
858
+ },
859
+ "50359": {
860
+ "content": "[unused74]",
861
+ "lstrip": false,
862
+ "normalized": true,
863
+ "rstrip": false,
864
+ "single_word": false,
865
+ "special": false
866
+ },
867
+ "50360": {
868
+ "content": "[unused75]",
869
+ "lstrip": false,
870
+ "normalized": true,
871
+ "rstrip": false,
872
+ "single_word": false,
873
+ "special": false
874
+ },
875
+ "50361": {
876
+ "content": "[unused76]",
877
+ "lstrip": false,
878
+ "normalized": true,
879
+ "rstrip": false,
880
+ "single_word": false,
881
+ "special": false
882
+ },
883
+ "50362": {
884
+ "content": "[unused77]",
885
+ "lstrip": false,
886
+ "normalized": true,
887
+ "rstrip": false,
888
+ "single_word": false,
889
+ "special": false
890
+ },
891
+ "50363": {
892
+ "content": "[unused78]",
893
+ "lstrip": false,
894
+ "normalized": true,
895
+ "rstrip": false,
896
+ "single_word": false,
897
+ "special": false
898
+ },
899
+ "50364": {
900
+ "content": "[unused79]",
901
+ "lstrip": false,
902
+ "normalized": true,
903
+ "rstrip": false,
904
+ "single_word": false,
905
+ "special": false
906
+ },
907
+ "50365": {
908
+ "content": "[unused80]",
909
+ "lstrip": false,
910
+ "normalized": true,
911
+ "rstrip": false,
912
+ "single_word": false,
913
+ "special": false
914
+ },
915
+ "50366": {
916
+ "content": "[unused81]",
917
+ "lstrip": false,
918
+ "normalized": true,
919
+ "rstrip": false,
920
+ "single_word": false,
921
+ "special": false
922
+ },
923
+ "50367": {
924
+ "content": "[unused82]",
925
+ "lstrip": false,
926
+ "normalized": true,
927
+ "rstrip": false,
928
+ "single_word": false,
929
+ "special": false
930
+ }
931
+ },
932
+ "clean_up_tokenization_spaces": true,
933
+ "cls_token": "[CLS]",
934
+ "extra_special_tokens": {},
935
+ "mask_token": "[MASK]",
936
+ "model_input_names": [
937
+ "input_ids",
938
+ "attention_mask"
939
+ ],
940
+ "model_max_length": 8192,
941
+ "pad_token": "[PAD]",
942
+ "sep_token": "[SEP]",
943
+ "tokenizer_class": "PreTrainedTokenizerFast",
944
+ "unk_token": "[UNK]"
945
+ }
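
`tokenizer_config.json` registers the inherited added tokens (the PII placeholders, whitespace tokens, and `[unusedNN]` slots listed above) and sets `model_max_length` to 8192; the SentenceTransformer pipeline still truncates at 512 per `sentence_bert_config.json`. A short inspection sketch, assuming a local checkout in the current directory:

```python
# Minimal sketch: inspect what tokenizer_config.json registers.
# Assumes the repository files are in the current directory.
from transformers import AutoTokenizer

tok = AutoTokenizer.from_pretrained(".")
print(tok.model_max_length)                              # 8192
print(len(tok.get_added_vocab()))                        # added tokens declared above
print(tok.convert_ids_to_tokens([50281, 50282, 50283, 50284]))
# ['[CLS]', '[SEP]', '[PAD]', '[MASK]']
```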