sha (null) | last_modified (null) | library_name (string, 154 distinct values) | text (string, length 1-900k) | metadata (string, length 2-348k) | pipeline_tag (string, 45 distinct values) | id (string, length 5-122) | tags (sequence, length 1-1.84k) | created_at (string, length 25) | arxiv (sequence, length 0-201) | languages (sequence, length 0-1.83k) | tags_str (string, length 17-9.34k) | text_str (string, length 0-389k) | text_lists (sequence, length 0-722) | processed_texts (sequence, length 1-723) | tokens_length (sequence, length 1-723) | input_texts (sequence, length 1-61) | embeddings (sequence, length 768) |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
null | null | peft | ## Training procedure
The following `bitsandbytes` quantization config was used during training:
- quant_method: bitsandbytes
- load_in_8bit: True
- load_in_4bit: False
- llm_int8_threshold: 6.0
- llm_int8_skip_modules: None
- llm_int8_enable_fp32_cpu_offload: False
- llm_int8_has_fp16_weight: False
- bnb_4bit_quant_type: fp4
- bnb_4bit_use_double_quant: False
- bnb_4bit_compute_dtype: float32
### Framework versions
- PEFT 0.5.0
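
For illustration only (this sketch is not part of the original card), the quantization settings listed above map onto the `BitsAndBytesConfig` API in `transformers`, and the adapter in this repository can then be attached with `peft`. The base model id below is a placeholder, since the card does not say which model was fine-tuned.

```python
import torch
from transformers import AutoModelForCausalLM, BitsAndBytesConfig
from peft import PeftModel

# Mirror the bitsandbytes settings reported in the card.
bnb_config = BitsAndBytesConfig(
    load_in_8bit=True,
    load_in_4bit=False,
    llm_int8_threshold=6.0,
    llm_int8_skip_modules=None,
    llm_int8_enable_fp32_cpu_offload=False,
    llm_int8_has_fp16_weight=False,
    bnb_4bit_quant_type="fp4",
    bnb_4bit_use_double_quant=False,
    bnb_4bit_compute_dtype=torch.float32,
)

BASE_MODEL_ID = "<base-model-id>"  # placeholder: the card does not name the base model

# Load the quantized base model, then attach the LoRA adapter from this repository.
base_model = AutoModelForCausalLM.from_pretrained(BASE_MODEL_ID, quantization_config=bnb_config)
model = PeftModel.from_pretrained(base_model, "maridze/Saiga_2_13b_fine_tune_custom_data")
```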
| {"library_name": "peft"} | null | maridze/Saiga_2_13b_fine_tune_custom_data | [
"peft",
"region:us"
] | 2024-02-14T14:28:45+00:00 | [] | [] | TAGS
#peft #region-us
| ## Training procedure
The following 'bitsandbytes' quantization config was used during training:
- quant_method: bitsandbytes
- load_in_8bit: True
- load_in_4bit: False
- llm_int8_threshold: 6.0
- llm_int8_skip_modules: None
- llm_int8_enable_fp32_cpu_offload: False
- llm_int8_has_fp16_weight: False
- bnb_4bit_quant_type: fp4
- bnb_4bit_use_double_quant: False
- bnb_4bit_compute_dtype: float32
### Framework versions
- PEFT 0.5.0
| [
"## Training procedure\n\n\nThe following 'bitsandbytes' quantization config was used during training:\n- quant_method: bitsandbytes\n- load_in_8bit: True\n- load_in_4bit: False\n- llm_int8_threshold: 6.0\n- llm_int8_skip_modules: None\n- llm_int8_enable_fp32_cpu_offload: False\n- llm_int8_has_fp16_weight: False\n- bnb_4bit_quant_type: fp4\n- bnb_4bit_use_double_quant: False\n- bnb_4bit_compute_dtype: float32",
"### Framework versions\n\n\n- PEFT 0.5.0"
] | [
"TAGS\n#peft #region-us \n",
"## Training procedure\n\n\nThe following 'bitsandbytes' quantization config was used during training:\n- quant_method: bitsandbytes\n- load_in_8bit: True\n- load_in_4bit: False\n- llm_int8_threshold: 6.0\n- llm_int8_skip_modules: None\n- llm_int8_enable_fp32_cpu_offload: False\n- llm_int8_has_fp16_weight: False\n- bnb_4bit_quant_type: fp4\n- bnb_4bit_use_double_quant: False\n- bnb_4bit_compute_dtype: float32",
"### Framework versions\n\n\n- PEFT 0.5.0"
] | [
9,
164,
11
] | [
"passage: TAGS\n#peft #region-us \n## Training procedure\n\n\nThe following 'bitsandbytes' quantization config was used during training:\n- quant_method: bitsandbytes\n- load_in_8bit: True\n- load_in_4bit: False\n- llm_int8_threshold: 6.0\n- llm_int8_skip_modules: None\n- llm_int8_enable_fp32_cpu_offload: False\n- llm_int8_has_fp16_weight: False\n- bnb_4bit_quant_type: fp4\n- bnb_4bit_use_double_quant: False\n- bnb_4bit_compute_dtype: float32### Framework versions\n\n\n- PEFT 0.5.0"
] | [
-0.053269870579242706, 0.037288859486579895, -0.0023498369846493006, ..., -0.06681157648563385 (768-dimensional embedding vector; intermediate values elided)
] |
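
The `embeddings` column holds one fixed 768-dimensional vector per row. As a hedged illustration only (the dataset does not say which encoder produced these vectors), a 768-dimensional embedding of a model-card passage can be computed with any 768-dimensional sentence encoder, for example:

```python
from sentence_transformers import SentenceTransformer

# Assumption: a 768-dimensional sentence encoder; all-mpnet-base-v2 is one such model,
# not necessarily the one used to build this dataset.
encoder = SentenceTransformer("sentence-transformers/all-mpnet-base-v2")

passage = "TAGS #peft #region-us ## Training procedure ..."  # e.g. the input_texts passage above
embedding = encoder.encode(passage)
print(embedding.shape)  # (768,)
```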
null | null | transformers |
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed to the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
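
The card leaves this section empty. As a generic, hedged sketch only (it assumes the repository holds standard `transformers` causal-LM weights, which the `safetensors` tag suggests but the card does not confirm):

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "SlipTheTrap/model-unsloth.Q4_K_M"  # repository id for this entry

# Assumption: standard causal-LM weights that transformers can load directly.
tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(repo_id)

inputs = tokenizer("Hello, my name is", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```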
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed]
| {"library_name": "transformers", "tags": []} | null | SlipTheTrap/model-unsloth.Q4_K_M | [
"transformers",
"safetensors",
"arxiv:1910.09700",
"endpoints_compatible",
"region:us"
] | 2024-02-14T14:28:46+00:00 | [
"1910.09700"
] | [] | TAGS
#transformers #safetensors #arxiv-1910.09700 #endpoints_compatible #region-us
|
# Model Card for Model ID
## Model Details
### Model Description
This is the model card of a transformers model that has been pushed to the Hub. This model card has been automatically generated.
- Developed by:
- Funded by [optional]:
- Shared by [optional]:
- Model type:
- Language(s) (NLP):
- License:
- Finetuned from model [optional]:
### Model Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Downstream Use [optional]
### Out-of-Scope Use
## Bias, Risks, and Limitations
### Recommendations
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
## Training Details
### Training Data
### Training Procedure
#### Preprocessing [optional]
#### Training Hyperparameters
- Training regime:
#### Speeds, Sizes, Times [optional]
## Evaluation
### Testing Data, Factors & Metrics
#### Testing Data
#### Factors
#### Metrics
### Results
#### Summary
## Model Examination [optional]
## Environmental Impact
Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).
- Hardware Type:
- Hours used:
- Cloud Provider:
- Compute Region:
- Carbon Emitted:
## Technical Specifications [optional]
### Model Architecture and Objective
### Compute Infrastructure
#### Hardware
#### Software
## Citation [optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Model Card Authors [optional]
## Model Card Contact
| [
"# Model Card for Model ID",
"## Model Details",
"### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:",
"### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Downstream Use [optional]",
"### Out-of-Scope Use",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.",
"## How to Get Started with the Model\n\nUse the code below to get started with the model.",
"## Training Details",
"### Training Data",
"### Training Procedure",
"#### Preprocessing [optional]",
"#### Training Hyperparameters\n\n- Training regime:",
"#### Speeds, Sizes, Times [optional]",
"## Evaluation",
"### Testing Data, Factors & Metrics",
"#### Testing Data",
"#### Factors",
"#### Metrics",
"### Results",
"#### Summary",
"## Model Examination [optional]",
"## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:",
"## Technical Specifications [optional]",
"### Model Architecture and Objective",
"### Compute Infrastructure",
"#### Hardware",
"#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Model Card Authors [optional]",
"## Model Card Contact"
] | [
"TAGS\n#transformers #safetensors #arxiv-1910.09700 #endpoints_compatible #region-us \n",
"# Model Card for Model ID",
"## Model Details",
"### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:",
"### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Downstream Use [optional]",
"### Out-of-Scope Use",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.",
"## How to Get Started with the Model\n\nUse the code below to get started with the model.",
"## Training Details",
"### Training Data",
"### Training Procedure",
"#### Preprocessing [optional]",
"#### Training Hyperparameters\n\n- Training regime:",
"#### Speeds, Sizes, Times [optional]",
"## Evaluation",
"### Testing Data, Factors & Metrics",
"#### Testing Data",
"#### Factors",
"#### Metrics",
"### Results",
"#### Summary",
"## Model Examination [optional]",
"## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:",
"## Technical Specifications [optional]",
"### Model Architecture and Objective",
"### Compute Infrastructure",
"#### Hardware",
"#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Model Card Authors [optional]",
"## Model Card Contact"
] | [
31,
6,
3,
82,
28,
3,
4,
9,
9,
10,
42,
20,
3,
4,
5,
9,
11,
13,
3,
12,
5,
4,
5,
3,
4,
9,
53,
9,
8,
6,
3,
14,
8,
7,
9,
4
] | [
"passage: TAGS\n#transformers #safetensors #arxiv-1910.09700 #endpoints_compatible #region-us \n# Model Card for Model ID## Model Details### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Downstream Use [optional]### Out-of-Scope Use## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.## How to Get Started with the Model\n\nUse the code below to get started with the model.## Training Details### Training Data### Training Procedure#### Preprocessing [optional]#### Training Hyperparameters\n\n- Training regime:#### Speeds, Sizes, Times [optional]## Evaluation### Testing Data, Factors & Metrics#### Testing Data#### Factors#### Metrics### Results#### Summary## Model Examination [optional]## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:## Technical Specifications [optional]### Model Architecture and Objective### Compute Infrastructure#### Hardware#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Model Card Authors [optional]## Model Card Contact"
] | [
-0.06646376848220825, 0.2168014943599701, -0.00225935154594481, ..., -0.025013303384184837 (768-dimensional embedding vector; intermediate values elided)
] |
null | null | transformers |
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed to the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
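
The card leaves this section empty. A hedged sketch using the high-level `pipeline` API (it assumes a causal text-generation model, which the repository name suggests but the card does not confirm):

```python
from transformers import pipeline

# Assumption: the repository is loadable as a text-generation model.
generator = pipeline("text-generation", model="engrzulqarnain/pine_script_model2")

prompt = "// Pine Script indicator that"  # hypothetical prompt, for illustration only
print(generator(prompt, max_new_tokens=40)[0]["generated_text"])
```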
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed]
| {"library_name": "transformers", "tags": []} | null | engrzulqarnain/pine_script_model2 | [
"transformers",
"safetensors",
"arxiv:1910.09700",
"endpoints_compatible",
"region:us"
] | 2024-02-14T14:30:04+00:00 | [
"1910.09700"
] | [] | TAGS
#transformers #safetensors #arxiv-1910.09700 #endpoints_compatible #region-us
|
# Model Card for Model ID
## Model Details
### Model Description
This is the model card of a transformers model that has been pushed to the Hub. This model card has been automatically generated.
- Developed by:
- Funded by [optional]:
- Shared by [optional]:
- Model type:
- Language(s) (NLP):
- License:
- Finetuned from model [optional]:
### Model Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Downstream Use [optional]
### Out-of-Scope Use
## Bias, Risks, and Limitations
### Recommendations
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
## Training Details
### Training Data
### Training Procedure
#### Preprocessing [optional]
#### Training Hyperparameters
- Training regime:
#### Speeds, Sizes, Times [optional]
## Evaluation
### Testing Data, Factors & Metrics
#### Testing Data
#### Factors
#### Metrics
### Results
#### Summary
## Model Examination [optional]
## Environmental Impact
Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).
- Hardware Type:
- Hours used:
- Cloud Provider:
- Compute Region:
- Carbon Emitted:
## Technical Specifications [optional]
### Model Architecture and Objective
### Compute Infrastructure
#### Hardware
#### Software
## Citation [optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Model Card Authors [optional]
## Model Card Contact
| [
"# Model Card for Model ID",
"## Model Details",
"### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:",
"### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Downstream Use [optional]",
"### Out-of-Scope Use",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.",
"## How to Get Started with the Model\n\nUse the code below to get started with the model.",
"## Training Details",
"### Training Data",
"### Training Procedure",
"#### Preprocessing [optional]",
"#### Training Hyperparameters\n\n- Training regime:",
"#### Speeds, Sizes, Times [optional]",
"## Evaluation",
"### Testing Data, Factors & Metrics",
"#### Testing Data",
"#### Factors",
"#### Metrics",
"### Results",
"#### Summary",
"## Model Examination [optional]",
"## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:",
"## Technical Specifications [optional]",
"### Model Architecture and Objective",
"### Compute Infrastructure",
"#### Hardware",
"#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Model Card Authors [optional]",
"## Model Card Contact"
] | [
"TAGS\n#transformers #safetensors #arxiv-1910.09700 #endpoints_compatible #region-us \n",
"# Model Card for Model ID",
"## Model Details",
"### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:",
"### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Downstream Use [optional]",
"### Out-of-Scope Use",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.",
"## How to Get Started with the Model\n\nUse the code below to get started with the model.",
"## Training Details",
"### Training Data",
"### Training Procedure",
"#### Preprocessing [optional]",
"#### Training Hyperparameters\n\n- Training regime:",
"#### Speeds, Sizes, Times [optional]",
"## Evaluation",
"### Testing Data, Factors & Metrics",
"#### Testing Data",
"#### Factors",
"#### Metrics",
"### Results",
"#### Summary",
"## Model Examination [optional]",
"## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:",
"## Technical Specifications [optional]",
"### Model Architecture and Objective",
"### Compute Infrastructure",
"#### Hardware",
"#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Model Card Authors [optional]",
"## Model Card Contact"
] | [
31,
6,
3,
82,
28,
3,
4,
9,
9,
10,
42,
20,
3,
4,
5,
9,
11,
13,
3,
12,
5,
4,
5,
3,
4,
9,
53,
9,
8,
6,
3,
14,
8,
7,
9,
4
] | [
"passage: TAGS\n#transformers #safetensors #arxiv-1910.09700 #endpoints_compatible #region-us \n# Model Card for Model ID## Model Details### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Downstream Use [optional]### Out-of-Scope Use## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.## How to Get Started with the Model\n\nUse the code below to get started with the model.## Training Details### Training Data### Training Procedure#### Preprocessing [optional]#### Training Hyperparameters\n\n- Training regime:#### Speeds, Sizes, Times [optional]## Evaluation### Testing Data, Factors & Metrics#### Testing Data#### Factors#### Metrics### Results#### Summary## Model Examination [optional]## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:## Technical Specifications [optional]### Model Architecture and Objective### Compute Infrastructure#### Hardware#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Model Card Authors [optional]## Model Card Contact"
] | [
-0.06646376848220825,
0.2168014943599701,
-0.00225935154594481,
0.023818302899599075,
0.1271018385887146,
-0.001635765191167593,
0.04218708351254463,
0.13324736058712006,
-0.020175931975245476,
0.11144465953111649,
0.046588581055402756,
0.09377603232860565,
0.09928803145885468,
0.18404334783554077,
0.04859916493296623,
-0.2059975117444992,
0.007056170143187046,
-0.09090408682823181,
0.014076028019189835,
0.1116579994559288,
0.13719257712364197,
-0.10291384905576706,
0.08272874355316162,
-0.04045208916068077,
-0.02019004337489605,
0.00012576708104461432,
-0.09259183704853058,
-0.07032395154237747,
0.06885425746440887,
0.06264153122901917,
0.051234472543001175,
0.001456156256608665,
0.09140396863222122,
-0.2864592671394348,
0.017265573143959045,
0.08406311273574829,
0.0027674848679453135,
0.06290827691555023,
0.07236549258232117,
-0.07389893382787704,
0.11328595131635666,
-0.08021481335163116,
0.13019037246704102,
0.08625296503305435,
-0.062064990401268005,
-0.23071379959583282,
-0.07525765895843506,
0.0963398814201355,
0.12251301854848862,
0.06215599179267883,
-0.022921854630112648,
0.15455181896686554,
-0.06248689442873001,
0.012971068732440472,
0.1294165402650833,
-0.11526761949062347,
-0.05572471022605896,
0.061741601675748825,
0.11775490641593933,
0.10740239918231964,
-0.14110268652439117,
-0.0017287094378843904,
0.04900608956813812,
0.029121357947587967,
0.08589313924312592,
0.022661056369543076,
0.12003941088914871,
0.04652795568108559,
-0.13695219159126282,
-0.04037507623434067,
0.12011898308992386,
0.038862764835357666,
-0.06446044892072678,
-0.2168138176202774,
-0.006778308190405369,
-0.0601806715130806,
-0.014732478186488152,
-0.07019448280334473,
0.039128515869379044,
-0.02470310963690281,
0.07317749410867691,
-0.04465159401297569,
-0.1063927412033081,
-0.0421026237308979,
0.0892222449183464,
0.07748593389987946,
0.011527054943144321,
-0.02519804798066616,
0.04627908393740654,
0.13455867767333984,
0.05402068421244621,
-0.10399353504180908,
-0.07017925381660461,
-0.06942764669656754,
-0.09420394152402878,
-0.04035796597599983,
0.056760527193546295,
0.031942449510097504,
0.02665667235851288,
0.22703726589679718,
0.016653569415211678,
0.04155244305729866,
0.0224777739495039,
0.01032855175435543,
0.043662428855895996,
0.0955500528216362,
-0.05303520709276199,
-0.15660029649734497,
-0.04072032496333122,
0.09077946096658707,
-0.0027527001220732927,
-0.036689214408397675,
-0.03966725245118141,
0.03849169611930847,
0.06843466311693192,
0.13122352957725525,
0.07552056759595871,
-0.017929591238498688,
-0.04813180863857269,
-0.030096933245658875,
0.23523783683776855,
-0.1493375599384308,
0.04426715523004532,
-0.02271856553852558,
-0.01804111897945404,
-0.03908449783921242,
0.03597262129187584,
0.022118929773569107,
-0.000004518366949923802,
0.09706240892410278,
-0.058981191366910934,
-0.05378659814596176,
-0.10168042778968811,
-0.03272576630115509,
0.04088849574327469,
-0.013975566253066063,
-0.010589460842311382,
-0.09025166928768158,
-0.09490354359149933,
-0.04766594246029854,
0.05537205561995506,
-0.05123869329690933,
-0.03770573064684868,
0.009465423412621021,
-0.08151785284280777,
-0.005444355774670839,
-0.005417742300778627,
0.10699385404586792,
-0.03222226724028587,
0.04445803165435791,
-0.027600755915045738,
0.05225523188710213,
0.09919606149196625,
0.031576547771692276,
-0.0773419588804245,
0.0561848059296608,
-0.22559374570846558,
0.07503069192171097,
-0.11481974273920059,
0.04335082694888115,
-0.1704932004213333,
-0.042439818382263184,
0.005444696638733149,
0.0139949731528759,
0.013206101022660732,
0.12720820307731628,
-0.19255615770816803,
-0.01654396951198578,
0.13260798156261444,
-0.09212633967399597,
-0.118110790848732,
0.07884611934423447,
-0.029701577499508858,
0.1624738723039627,
0.04682036489248276,
-0.027025915682315826,
0.09224298596382141,
-0.16434773802757263,
-0.07092688232660294,
-0.00949116237461567,
-0.01727987825870514,
0.12109188735485077,
0.07512219995260239,
-0.05991523340344429,
0.046571120619773865,
0.02832140028476715,
-0.038078423589468,
-0.04424772411584854,
-0.050857074558734894,
-0.10884185880422592,
-0.01070026308298111,
-0.08987759798765182,
0.04065500199794769,
-0.01250192429870367,
-0.07916021347045898,
-0.029885273426771164,
-0.18612512946128845,
-0.0030564051121473312,
0.10038342326879501,
0.0035033065360039473,
-0.005652366206049919,
-0.08666291832923889,
0.026358824223279953,
-0.03112892620265484,
-0.008404186926782131,
-0.16764774918556213,
-0.04399421438574791,
0.046902090311050415,
-0.16094985604286194,
0.020117372274398804,
-0.06413903087377548,
0.06334125250577927,
0.03641495108604431,
-0.05590536445379257,
-0.0248766727745533,
-0.01730942726135254,
0.011945613659918308,
-0.05083848536014557,
-0.18994836509227753,
-0.056277405470609665,
-0.037882111966609955,
0.149809330701828,
-0.25956398248672485,
0.032966937869787216,
0.051140617579221725,
0.14649195969104767,
0.00406361510977149,
-0.05115427449345589,
0.01429014839231968,
-0.05360214412212372,
-0.054652128368616104,
-0.06746816635131836,
-0.006135428790003061,
-0.027576493099331856,
-0.05147203803062439,
0.019243421033024788,
-0.1755700707435608,
-0.021410830318927765,
0.09424154460430145,
0.12876708805561066,
-0.1486445665359497,
-0.018640631809830666,
-0.048725154250860214,
-0.06339836865663528,
-0.0715010017156601,
-0.07038594037294388,
0.10712739825248718,
0.0513901449739933,
0.04796046018600464,
-0.07435787469148636,
-0.07092321664094925,
0.02726263552904129,
0.006906150374561548,
-0.03382374346256256,
0.08727246522903442,
0.05199531093239784,
-0.09209315478801727,
0.0756213590502739,
0.1092359870672226,
0.07177663594484329,
0.09363535046577454,
0.01574566215276718,
-0.11756632477045059,
-0.028492970392107964,
0.036266472190618515,
0.02740776725113392,
0.1465986967086792,
-0.05952361226081848,
0.04016614332795143,
0.04494241625070572,
-0.04170418903231621,
0.022319864481687546,
-0.08787637203931808,
0.024075502529740334,
0.025203049182891846,
-0.0034381982404738665,
0.06284574419260025,
-0.02525499276816845,
-0.0050758360885083675,
0.07016654312610626,
0.047779910266399384,
0.04621000960469246,
0.009655474685132504,
-0.01720241829752922,
-0.1047825813293457,
0.16950392723083496,
-0.0951867327094078,
-0.269941508769989,
-0.17632324993610382,
0.026197833940386772,
0.04035249724984169,
-0.022378476336598396,
0.031619444489479065,
-0.07056326419115067,
-0.10630585998296738,
-0.1060405746102333,
-0.002429972169920802,
0.01714223250746727,
-0.06364088505506516,
-0.0741225928068161,
0.07348573952913284,
0.04382912442088127,
-0.14902326464653015,
0.038552410900592804,
0.055694397538900375,
-0.057955220341682434,
-0.0233661737293005,
0.09118817001581192,
0.12397737801074982,
0.14583967626094818,
-0.021366750821471214,
-0.028626007959246635,
0.029004426673054695,
0.19620531797409058,
-0.13469526171684265,
0.10371150821447372,
0.13814030587673187,
-0.04545360431075096,
0.08360563963651657,
0.1560150384902954,
0.029186224564909935,
-0.08317049592733383,
0.05044832453131676,
0.04082648828625679,
-0.043159641325473785,
-0.2666129767894745,
-0.0534592866897583,
0.012832709588110447,
-0.06255637854337692,
0.09786593168973923,
0.10183793306350708,
0.11542957276105881,
0.034910861402750015,
-0.07166364789009094,
-0.043925940990448,
-0.0058974819257855415,
0.11737963557243347,
-0.05490213260054588,
-0.012639665976166725,
0.07686592638492584,
-0.05086168646812439,
0.005355054512619972,
0.10266812145709991,
0.02973790094256401,
0.17442677915096283,
0.020399179309606552,
0.11231429129838943,
0.06195578724145889,
0.08633565157651901,
0.0007386076031252742,
0.02951662428677082,
0.05147615820169449,
0.017203815281391144,
-0.002300140680745244,
-0.10421168059110641,
-0.006156572140753269,
0.1449710875749588,
0.028103826567530632,
0.029669636860489845,
-0.0018948549404740334,
-0.005003341939300299,
0.05121048167347908,
0.1746254414319992,
-0.011592294089496136,
-0.22072425484657288,
-0.0845772922039032,
0.06936841458082199,
-0.06218599155545235,
-0.12968985736370087,
-0.026130788028240204,
0.045467354357242584,
-0.17519839107990265,
0.026703642681241035,
-0.027433741837739944,
0.0919293761253357,
-0.09345759451389313,
-0.02221956104040146,
0.03687324374914169,
0.084866963326931,
-0.014529162086546421,
0.08703910559415817,
-0.14498743414878845,
0.11886418610811234,
0.02978132851421833,
0.09024628251791,
-0.11081171780824661,
0.07909037172794342,
-0.007550720125436783,
0.009180475026369095,
0.19379350543022156,
-0.011335089802742004,
-0.03514958545565605,
-0.08774717897176743,
-0.11210042238235474,
-0.013537433929741383,
0.12687496840953827,
-0.1243172138929367,
0.08773399889469147,
-0.015198243781924248,
-0.044079482555389404,
0.00937260314822197,
-0.12100647389888763,
-0.17273177206516266,
-0.19628387689590454,
0.05585884302854538,
-0.09575839340686798,
0.025643249973654747,
-0.11914430558681488,
-0.07089093327522278,
-0.02952558360993862,
0.241120383143425,
-0.1745356321334839,
-0.06510113179683685,
-0.1468164622783661,
-0.046294767409563065,
0.1662203073501587,
-0.04437198117375374,
0.0718095526099205,
-0.0208172257989645,
0.20345525443553925,
0.005988610442727804,
-0.004939318168908358,
0.06724198162555695,
-0.08892562240362167,
-0.16873881220817566,
-0.06771010160446167,
0.1510489284992218,
0.11680185794830322,
0.04907919466495514,
-0.002248800592496991,
0.0011772146681323647,
-0.016943959519267082,
-0.1137804463505745,
-0.0033210667315870523,
0.16037839651107788,
0.03878779336810112,
0.025986969470977783,
-0.05243593826889992,
-0.08797456324100494,
-0.06899320334196091,
-0.06853509694337845,
0.06221301481127739,
0.19590823352336884,
-0.10376439243555069,
0.1700313836336136,
0.147536963224411,
-0.07305635511875153,
-0.23175598680973053,
0.035342130810022354,
0.04983805492520332,
0.0014306638622656465,
0.04886869341135025,
-0.18252557516098022,
0.10521943867206573,
0.019543392583727837,
-0.05505957826972008,
0.13485197722911835,
-0.1557481735944748,
-0.1552847921848297,
0.0722852572798729,
0.03904085233807564,
-0.22423844039440155,
-0.1354004591703415,
-0.09622503817081451,
-0.05825018882751465,
-0.14065024256706238,
0.06054598465561867,
-0.002136280992999673,
0.015948504209518433,
0.03500790148973465,
-0.0015643214574083686,
0.027123261243104935,
-0.058935679495334625,
0.18609118461608887,
-0.004065449349582195,
0.020676052197813988,
-0.060264769941568375,
-0.0478842556476593,
0.09839435666799545,
-0.06130504235625267,
0.12208222597837448,
0.004057085141539574,
0.01594383642077446,
-0.10362856835126877,
-0.048314861953258514,
-0.04328322783112526,
0.05154227837920189,
-0.07548051327466965,
-0.10070807486772537,
-0.043625857681035995,
0.08841723203659058,
0.07005169242620468,
-0.03383097052574158,
0.00549331633374095,
-0.07189501076936722,
0.10019614547491074,
0.17795267701148987,
0.17573626339435577,
0.009926567785441875,
-0.07241068035364151,
0.01677953451871872,
-0.04142116755247116,
0.044231921434402466,
-0.2513144314289093,
0.03756171092391014,
0.06098250672221184,
0.029438555240631104,
0.09217222779989243,
-0.020435843616724014,
-0.1820858269929886,
-0.04050002992153168,
0.08094815909862518,
-0.05452597141265869,
-0.22617179155349731,
-0.019085140898823738,
0.0954197570681572,
-0.2020406424999237,
-0.007372708059847355,
0.03995226323604584,
-0.048725228756666183,
-0.023169852793216705,
0.00010950004070764408,
0.06317184865474701,
0.002471912419423461,
0.09773622453212738,
0.0735151618719101,
0.09715340286493301,
-0.08337292820215225,
0.10562895983457565,
0.10150538384914398,
-0.09572599828243256,
0.03605884686112404,
0.06754924356937408,
-0.05300498008728027,
-0.043293699622154236,
0.03665391728281975,
0.033023297786712646,
0.005234600510448217,
-0.060321882367134094,
0.013913018628954887,
-0.036497246474027634,
0.044923391193151474,
0.08326134830713272,
0.03754979372024536,
-0.013354414142668247,
0.06462216377258301,
0.03401726484298706,
-0.10898099094629288,
0.10366570204496384,
0.01731540448963642,
0.04105307161808014,
-0.08384523540735245,
-0.019968897104263306,
0.035425446927547455,
0.030576206743717194,
-0.01765924133360386,
-0.02306121215224266,
-0.02860277332365513,
-0.01614218018949032,
-0.14299540221691132,
-0.023106401786208153,
-0.07243485748767853,
0.006181265693157911,
0.014656842686235905,
-0.031884219497442245,
-0.011233693920075893,
0.02475680410861969,
-0.06979699432849884,
-0.07426341623067856,
-0.006949664559215307,
0.09833318740129471,
-0.15115703642368317,
0.008848577737808228,
0.06907843053340912,
-0.11088496446609497,
0.08190931379795074,
-0.008411259390413761,
0.016245156526565552,
0.022527478635311127,
-0.15448406338691711,
0.05601610988378525,
0.0008648968650959432,
0.01916889287531376,
0.025886621326208115,
-0.16471809148788452,
0.004104440100491047,
-0.04661374166607857,
-0.02149827405810356,
-0.00004464812809601426,
-0.02647159807384014,
-0.12325995415449142,
0.06858719140291214,
-0.015622655861079693,
-0.035931166261434555,
-0.02701525390148163,
0.0539589487016201,
0.07888586074113846,
-0.027474910020828247,
0.10445091128349304,
-0.008690856397151947,
0.04941811040043831,
-0.16801609098911285,
-0.02470702864229679,
-0.04982255399227142,
0.019377702847123146,
0.009884213097393513,
-0.007693959400057793,
0.04183054715394974,
-0.00976533442735672,
0.21883612871170044,
-0.05075952783226967,
0.1607085019350052,
0.05847611650824547,
-0.017352959141135216,
-0.0007513365126214921,
0.06180921941995621,
0.05997028574347496,
0.04658793285489082,
0.009480604901909828,
0.023740366101264954,
-0.022450892254710197,
-0.006695089396089315,
-0.15932634472846985,
0.01890849508345127,
0.14999441802501678,
0.06301083415746689,
0.024745315313339233,
0.05866100639104843,
-0.12775006890296936,
-0.12135478109121323,
0.09311001747846603,
-0.026755332946777344,
0.00928465835750103,
-0.08245618641376495,
0.1358020007610321,
0.14980104565620422,
-0.14000412821769714,
0.05256148427724838,
-0.06134212389588356,
-0.05217423290014267,
-0.10388828068971634,
-0.12032219022512436,
-0.05887215584516525,
-0.053666237741708755,
0.002330566756427288,
-0.03760887682437897,
0.054546963423490524,
0.03344334661960602,
-0.009351172484457493,
-0.00022941511997487396,
0.13597318530082703,
-0.019751882180571556,
-0.0028988157864660025,
0.048313532024621964,
0.03693558648228645,
0.02373051457107067,
-0.05275435373187065,
0.02940409444272518,
0.02539868652820587,
0.032232340425252914,
0.06546790152788162,
0.033412106335163116,
-0.047448933124542236,
0.03804153576493263,
-0.0025254099164158106,
-0.11207924783229828,
0.019641218706965446,
-0.00460948096588254,
-0.0742158442735672,
0.1268945336341858,
0.0407399944961071,
0.010224059224128723,
-0.03741471841931343,
0.24361543357372284,
-0.06653323769569397,
-0.06378097087144852,
-0.13251738250255585,
0.10491154342889786,
-0.0027236645109951496,
0.06476365029811859,
0.023412218317389488,
-0.1284150779247284,
0.005243356805294752,
0.13858191668987274,
0.12181595712900162,
0.0045748427510261536,
0.009228081442415714,
0.0518609918653965,
0.0025186820421367884,
-0.06998204439878464,
0.054019294679164886,
0.06992026418447495,
0.12919506430625916,
-0.07847554981708527,
0.07680778950452805,
0.0006860480643808842,
-0.08370215445756912,
-0.02947772853076458,
0.11312682181596756,
-0.0409729965031147,
0.03491825982928276,
-0.047444481402635574,
0.10916327685117722,
-0.05787910893559456,
-0.29412412643432617,
0.02350960113108158,
-0.09588567912578583,
-0.15202060341835022,
-0.018367812037467957,
0.05944539234042168,
-0.02624768204987049,
0.018029648810625076,
0.06971040368080139,
-0.06011629104614258,
0.20098382234573364,
0.0335683599114418,
-0.07864278554916382,
-0.0664360448718071,
0.04837050288915634,
-0.06564252078533173,
0.2949807047843933,
0.008418165147304535,
0.02863333560526371,
0.10770907253026962,
-0.03253700211644173,
-0.18271861970424652,
0.010723991319537163,
0.1133992001414299,
-0.08056149631738663,
0.08200647681951523,
0.19000613689422607,
-0.012578671798110008,
0.1209007054567337,
0.05294662341475487,
-0.047376248985528946,
0.04217283055186272,
-0.03389401361346245,
-0.051268599927425385,
-0.10752558708190918,
0.058453381061553955,
-0.05909625440835953,
0.15447644889354706,
0.10152646154165268,
-0.05671518296003342,
-0.004550917539745569,
-0.05555408447980881,
0.04875178262591362,
0.01804669201374054,
0.12263146042823792,
0.02951994352042675,
-0.1865430772304535,
0.032826557755470276,
-0.01144319772720337,
0.10186848044395447,
-0.25588861107826233,
-0.08421015739440918,
0.08833149075508118,
-0.011924264021217823,
-0.05105875805020332,
0.10560628771781921,
0.057650718837976456,
0.04243382066488266,
-0.043439045548439026,
-0.10480839014053345,
-0.02186836116015911,
0.14663739502429962,
-0.1469624787569046,
-0.025013303384184837
] |
null | null | fastai |
# Amazing!
🥳 Congratulations on hosting your fastai model on the Hugging Face Hub!
# Some next steps
1. Fill out this model card with more information (see the template below and the [documentation here](https://huggingface.co/docs/hub/model-repos))!

2. Create a demo in Gradio or Streamlit using 🤗 Spaces ([documentation here](https://huggingface.co/docs/hub/spaces)).
3. Join the fastai community on the [Fastai Discord](https://discord.com/invite/YKrxeNn)!
Greetings fellow fastlearner 🤝! Don't forget to delete this content from your model card.
---
# Model card
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
| {"language": ["fa"], "tags": ["fastai"]} | null | saied/Persian-ULMFIT | [
"fastai",
"fa",
"region:us"
] | 2024-02-14T14:33:22+00:00 | [] | [
"fa"
] | TAGS
#fastai #fa #region-us
|
# Amazing!
Congratulations on hosting your fastai model on the Hugging Face Hub!
# Some next steps
1. Fill out this model card with more information (see the template below and the documentation here)!
2. Create a demo in Gradio or Streamlit using Spaces (documentation here).
3. Join the fastai community on the Fastai Discord!
Greetings fellow fastlearner ! Don't forget to delete this content from your model card.
---
# Model card
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
| [
"# Amazing!\n\n Congratulations on hosting your fastai model on the Hugging Face Hub!",
"# Some next steps\n1. Fill out this model card with more information (see the template below and the documentation here)!\n\n2. Create a demo in Gradio or Streamlit using Spaces (documentation here).\n\n3. Join the fastai community on the Fastai Discord!\n\nGreetings fellow fastlearner ! Don't forget to delete this content from your model card.\n\n\n---",
"# Model card",
"## Model description\nMore information needed",
"## Intended uses & limitations\nMore information needed",
"## Training and evaluation data\nMore information needed"
] | [
"TAGS\n#fastai #fa #region-us \n",
"# Amazing!\n\n Congratulations on hosting your fastai model on the Hugging Face Hub!",
"# Some next steps\n1. Fill out this model card with more information (see the template below and the documentation here)!\n\n2. Create a demo in Gradio or Streamlit using Spaces (documentation here).\n\n3. Join the fastai community on the Fastai Discord!\n\nGreetings fellow fastlearner ! Don't forget to delete this content from your model card.\n\n\n---",
"# Model card",
"## Model description\nMore information needed",
"## Intended uses & limitations\nMore information needed",
"## Training and evaluation data\nMore information needed"
] | [
11,
20,
79,
3,
6,
12,
8
] | [
"passage: TAGS\n#fastai #fa #region-us \n# Amazing!\n\n Congratulations on hosting your fastai model on the Hugging Face Hub!# Some next steps\n1. Fill out this model card with more information (see the template below and the documentation here)!\n\n2. Create a demo in Gradio or Streamlit using Spaces (documentation here).\n\n3. Join the fastai community on the Fastai Discord!\n\nGreetings fellow fastlearner ! Don't forget to delete this content from your model card.\n\n\n---# Model card## Model description\nMore information needed## Intended uses & limitations\nMore information needed## Training and evaluation data\nMore information needed"
] | [
-0.07724074274301529,
-0.04297415912151337,
0.00120126164983958,
0.11131846904754639,
0.17055968940258026,
0.11891557276248932,
0.07253091782331467,
0.07783500105142593,
0.08263041824102402,
0.011645482853055,
0.08786872029304504,
-0.06702897697687149,
0.10888686776161194,
0.28154560923576355,
0.058952875435352325,
-0.22573450207710266,
0.029088040813803673,
-0.003938938491046429,
0.0819852352142334,
0.06346841901540756,
0.13271698355674744,
-0.043618958443403244,
0.15414221584796906,
-0.009889268316328526,
-0.2070886641740799,
-0.05356448516249657,
-0.017832230776548386,
-0.02208881266415119,
0.11530052125453949,
-0.030383357778191566,
0.035783927887678146,
0.011852983385324478,
0.00034603691892698407,
-0.09030859172344208,
0.06423131376504898,
0.0343206413090229,
0.024316929280757904,
0.05029184743762016,
-0.06021585315465927,
0.09268084168434143,
0.07426448911428452,
-0.012581545859575272,
-0.11696666479110718,
0.09613441675901413,
-0.15265947580337524,
-0.21354497969150543,
-0.12631522119045258,
-0.10261627286672592,
0.04274481162428856,
0.007700959220528603,
-0.018246807157993317,
0.10631892085075378,
-0.13139040768146515,
-0.03005596436560154,
0.18585962057113647,
-0.16528929769992828,
-0.05753914266824722,
-0.009046398103237152,
0.0566687248647213,
-0.05293922871351242,
-0.05103430896997452,
0.09835143387317657,
0.09511292725801468,
-0.016680220142006874,
0.028042668476700783,
0.0023240118753165007,
0.03875488415360451,
0.0035703389439731836,
-0.05835474282503128,
0.06845047324895859,
-0.03178425133228302,
0.058183036744594574,
-0.11428073793649673,
-0.12163423746824265,
0.005190249066799879,
0.01833522319793701,
-0.06835892051458359,
-0.07233471423387527,
0.07838426530361176,
-0.0010675161611288786,
-0.04133881255984306,
-0.12762922048568726,
-0.06659941375255585,
-0.13570281863212585,
0.006402481812983751,
0.09704560786485672,
0.007085985504090786,
0.07420828938484192,
-0.0942743793129921,
0.05378043279051781,
-0.1971420794725418,
-0.057671744376420975,
-0.09120339900255203,
-0.11198118329048157,
0.02526927925646305,
-0.0498480349779129,
0.053422361612319946,
0.15817338228225708,
0.1466529369354248,
0.027625897899270058,
0.03673148155212402,
-0.02572208270430565,
0.035949137061834335,
0.04959540814161301,
0.02345437742769718,
0.018488219007849693,
-0.029778368771076202,
-0.18553531169891357,
0.00645152572542429,
-0.02636798471212387,
0.07215306162834167,
-0.07792928069829941,
-0.054919496178627014,
0.02161324955523014,
-0.13208596408367157,
0.09188459813594818,
-0.05128347873687744,
-0.009411539882421494,
0.007631052751094103,
0.005053338129073381,
0.2186342179775238,
0.038792479783296585,
-0.0004748805658891797,
0.004966374486684799,
-0.1445060819387436,
-0.05098734423518181,
-0.08905022591352463,
0.030468681827187538,
0.026350053027272224,
0.004756583832204342,
-0.06421785056591034,
0.03547186031937599,
-0.047809749841690063,
-0.01836506836116314,
0.015428737737238407,
-0.20427949726581573,
0.015221340581774712,
-0.09980765730142593,
-0.16059696674346924,
0.0650608167052269,
0.006805433426052332,
-0.08549825102090836,
0.08481396734714508,
0.003607661696150899,
0.027074195444583893,
-0.03527550399303436,
0.0038871332071721554,
0.06679941713809967,
-0.08243042230606079,
0.02949088253080845,
0.20813651382923126,
0.11349648237228394,
-0.0799049586057663,
-0.0044530113227665424,
-0.10417456924915314,
0.034339625388383865,
-0.14621126651763916,
0.043878670781850815,
-0.08497505635023117,
0.13142865896224976,
-0.047912660986185074,
0.008333748206496239,
-0.012899216264486313,
0.09225238114595413,
0.07764992862939835,
0.19021183252334595,
-0.22125941514968872,
-0.05489634722471237,
0.15963561832904816,
-0.12239290028810501,
-0.192134827375412,
0.20059102773666382,
0.0011093218345195055,
0.10763876140117645,
-0.010743550956249237,
0.14688920974731445,
-0.01611710712313652,
-0.14065654575824738,
-0.02983623929321766,
0.0022256332449615,
-0.2431543916463852,
-0.08089455217123032,
0.09650435298681259,
0.1224781721830368,
-0.06283548474311829,
0.025562113150954247,
0.003795612370595336,
0.16288051009178162,
-0.08152404427528381,
-0.04814014211297035,
-0.0037713737692683935,
-0.12271379679441452,
0.02222103625535965,
0.011739677749574184,
0.03104490041732788,
-0.0456361323595047,
-0.00910926517099142,
-0.08456320315599442,
0.128811776638031,
0.09863147139549255,
-0.0514916256070137,
-0.06992698460817337,
0.17489343881607056,
-0.07018062472343445,
-0.022691654041409492,
0.08439647406339645,
-0.083741694688797,
0.040287818759679794,
0.042790964245796204,
0.05138517916202545,
0.019890712574124336,
0.09304266422986984,
0.07675052434206009,
0.0047508650459349155,
0.035872235894203186,
0.12558364868164062,
-0.02868952788412571,
-0.052309051156044006,
-0.0012352113844826818,
0.04194658249616623,
-0.0061800177209079266,
0.3066197633743286,
-0.2069600522518158,
0.02341771312057972,
-0.05468139797449112,
0.06835450232028961,
0.06314035505056381,
0.008447383530437946,
0.07342472672462463,
-0.05540739372372627,
-0.007297186646610498,
-0.04290016368031502,
0.06346212327480316,
-0.06693818420171738,
-0.0499560609459877,
0.24035675823688507,
-0.03646744042634964,
0.052128035575151443,
0.11573713272809982,
-0.06061047688126564,
-0.07113483548164368,
-0.004798665642738342,
-0.00448480062186718,
0.025798605754971504,
-0.04207019880414009,
0.06187621131539345,
-0.08441711217164993,
-0.05420135334134102,
0.1749228835105896,
-0.044677186757326126,
0.07978039234876633,
0.0478455014526844,
-0.04022333025932312,
-0.041481323540210724,
0.0632605254650116,
0.1533086895942688,
-0.09598729014396667,
0.06795091181993484,
0.12764370441436768,
0.01786714792251587,
0.1605074405670166,
0.06916557997465134,
-0.08089348673820496,
-0.08433022350072861,
-0.012206518091261387,
-0.005924733821302652,
0.19468054175376892,
-0.07646019756793976,
-0.036154311150312424,
0.05038643255829811,
-0.02562061883509159,
0.06398483365774155,
-0.05395261570811272,
-0.08140560984611511,
0.029629213735461235,
-0.06384206563234329,
0.021541869267821312,
0.12150419503450394,
-0.07818896323442459,
0.04774351790547371,
0.03895958513021469,
-0.07056894898414612,
0.04058395326137543,
0.037625350058078766,
-0.01656467653810978,
0.0512145459651947,
0.07242290675640106,
-0.21814242005348206,
-0.0913972333073616,
-0.16998127102851868,
0.023452062159776688,
0.014541077427566051,
0.036671753972768784,
-0.10487090796232224,
0.017896346747875214,
-0.06464558094739914,
-0.07502266764640808,
0.054628413170576096,
-0.02251962013542652,
-0.09231710433959961,
-0.02914997562766075,
-0.01658657006919384,
-0.04820894077420235,
-0.025351349264383316,
-0.061395496129989624,
0.02789458818733692,
0.03883746638894081,
0.037488073110580444,
0.13605032861232758,
-0.020232753828167915,
-0.01475706696510315,
0.01229136809706688,
-0.013985070399940014,
0.17360875010490417,
-0.14734432101249695,
0.07577788084745407,
0.20626109838485718,
0.09413223713636398,
0.029979337006807327,
0.008850617334246635,
0.039656903594732285,
-0.07381806522607803,
-0.0013922672951593995,
0.04038931429386139,
-0.0957525447010994,
-0.06894644349813461,
-0.014296774752438068,
-0.037667710334062576,
0.20093807578086853,
-0.13139614462852478,
0.03276869282126427,
0.04084796458482742,
0.09956194460391998,
0.10818202048540115,
-0.04330020397901535,
-0.16317437589168549,
0.034847646951675415,
-0.25075554847717285,
-0.05045825242996216,
0.012099872343242168,
-0.0976661816239357,
-0.06082091107964516,
0.1885799765586853,
-0.0005292731220833957,
0.024337461218237877,
0.006132385227829218,
0.11273615062236786,
0.00987599790096283,
0.10970792174339294,
0.06395118683576584,
-0.057403720915317535,
0.0092231510207057,
-0.10597385466098785,
-0.06870118528604507,
-0.0447346493601799,
-0.06571932882070541,
0.06006611883640289,
0.10840344429016113,
-0.009314307011663914,
-0.047254402190446854,
0.05496339127421379,
0.09119942784309387,
0.06642972677946091,
0.1436622142791748,
-0.17337363958358765,
-0.02784920111298561,
0.030305294319987297,
-0.022582001984119415,
-0.052443020045757294,
-0.01869620755314827,
0.08442976325750351,
-0.04900141805410385,
-0.027333684265613556,
0.00764466542750597,
0.074522465467453,
0.0017730689141899347,
0.05018140375614166,
-0.019434861838817596,
0.18435855209827423,
-0.029026595875620842,
0.016293346881866455,
-0.1346675306558609,
0.13210122287273407,
0.03230607509613037,
-0.0013638531090691686,
-0.06709546595811844,
-0.05402897298336029,
0.1790953278541565,
0.021687351167201996,
0.12512953579425812,
0.004101420287042856,
-0.07029219716787338,
-0.1584477424621582,
-0.12304888665676117,
0.018424304202198982,
0.09052611142396927,
-0.009200171567499638,
-0.03225773945450783,
0.028242064639925957,
-0.03525569662451744,
-0.062234487384557724,
0.09434207528829575,
-0.12474444508552551,
-0.004958136472851038,
0.018356958404183388,
0.05337856709957123,
-0.08357779681682587,
0.0350898839533329,
0.036839742213487625,
-0.08236095309257507,
0.12937480211257935,
0.22207069396972656,
0.08718650788068771,
-0.10577774047851562,
-0.07658470422029495,
0.013682917691767216,
-0.029196549206972122,
0.0004153306654188782,
-0.012904618866741657,
0.04240335896611214,
-0.0037112513091415167,
0.003052563639357686,
0.12782642245292664,
-0.08142499625682831,
0.003900125389918685,
-0.07255664467811584,
0.06802693754434586,
-0.02511489763855934,
-0.006475217640399933,
-0.0023179270792752504,
-0.0362798310816288,
-0.033088039606809616,
-0.06949537247419357,
0.16688761115074158,
-0.05928203463554382,
-0.08097968995571136,
0.06495478749275208,
0.021145140752196312,
0.026390843093395233,
-0.05128517374396324,
-0.046344269067049026,
0.18338656425476074,
0.3217286467552185,
-0.06278600543737411,
0.09984125196933746,
0.13324470818042755,
0.030190853402018547,
-0.22200140357017517,
0.03238677605986595,
-0.1373230516910553,
0.035947706550359726,
0.024854877963662148,
-0.07590656727552414,
0.06565158069133759,
0.1274547427892685,
-0.04836515709757805,
0.23686204850673676,
-0.04717622697353363,
-0.0719030499458313,
-0.006176489405333996,
0.04704728350043297,
0.3068613111972809,
-0.1243673637509346,
-0.016751492395997047,
-0.11333309859037399,
-0.23146739602088928,
0.05693555250763893,
-0.18802288174629211,
0.13911153376102448,
-0.05328995734453201,
0.022173292934894562,
-0.010301581583917141,
-0.07077404111623764,
0.19584456086158752,
-0.13755236566066742,
0.06011626496911049,
-0.13559168577194214,
-0.1161780059337616,
-0.010564380325376987,
-0.08071701973676682,
0.15211473405361176,
-0.09945529699325562,
-0.026902567595243454,
-0.21906135976314545,
-0.003068700199946761,
-0.014893598854541779,
0.10631648451089859,
0.02251775935292244,
-0.07842139154672623,
-0.07553816586732864,
0.12467287480831146,
-0.07429064065217972,
0.03759608790278435,
-0.11820816993713379,
-0.04885045811533928,
-0.03600204735994339,
-0.053899798542261124,
0.05863996967673302,
-0.08927326649427414,
0.15612226724624634,
-0.044748902320861816,
-0.04707220196723938,
0.06897224485874176,
-0.2081194818019867,
0.028141207993030548,
0.03188589587807655,
-0.035692907869815826,
0.09982112795114517,
-0.029299158602952957,
-0.06230577081441879,
0.10909893363714218,
0.12930506467819214,
-0.06690258532762527,
-0.25575053691864014,
-0.083735890686512,
-0.018668601289391518,
0.03760862722992897,
0.07025256007909775,
0.048355598002672195,
-0.05710764601826668,
-0.010587718337774277,
-0.029949204996228218,
0.03538363054394722,
-0.11534317582845688,
-0.02141052857041359,
0.07489722222089767,
0.00187899440061301,
-0.10199051350355148,
0.08018384128808975,
-0.00556275574490428,
-0.009397455491125584,
-0.001878787181340158,
0.17228800058364868,
-0.016267452389001846,
-0.13056492805480957,
-0.05421723425388336,
0.24206751585006714,
-0.04235018417239189,
-0.0787903368473053,
-0.06505690515041351,
-0.025015462189912796,
-0.036863937973976135,
0.07933260500431061,
0.04216849058866501,
-0.020606674253940582,
0.08928929269313812,
0.07328636199235916,
-0.12350569665431976,
-0.045760978013277054,
-0.06245905160903931,
0.03340104967355728,
-0.09940372407436371,
0.058471255004405975,
0.015942394733428955,
0.1320076584815979,
-0.09498010575771332,
-0.017493458464741707,
-0.11283043771982193,
-0.05863802507519722,
-0.18346257507801056,
-0.051203399896621704,
-0.03528331220149994,
-0.006721954792737961,
0.034905124455690384,
0.02095761150121689,
-0.04663001000881195,
-0.041770290583372116,
-0.06500066816806793,
0.03923489898443222,
0.08452926576137543,
0.027127545326948166,
-0.04505304619669914,
0.04541100561618805,
0.04752608388662338,
0.011461393907666206,
0.1904630959033966,
0.061581093817949295,
0.06247791647911072,
-0.03503500297665596,
-0.20342396199703217,
-0.049347203224897385,
-0.00007943128002807498,
-0.08447685837745667,
0.12235590815544128,
-0.007883017882704735,
0.027054185047745705,
-0.06976208835840225,
0.020137522369623184,
0.029290927574038506,
0.11746817827224731,
-0.018632717430591583,
0.10259711742401123,
0.018200615420937538,
-0.06590469926595688,
-0.03413539379835129,
0.02491210214793682,
0.13960577547550201,
0.027098290622234344,
0.023291589692234993,
0.01665416918694973,
0.028812650591135025,
-0.047061480581760406,
0.029726101085543633,
-0.04320989549160004,
-0.146290585398674,
0.01985253393650055,
-0.04967118799686432,
-0.00046962464693933725,
-0.025546064600348473,
0.1956113874912262,
0.04671985283493996,
-0.05097762495279312,
-0.009402100928127766,
-0.0034317460376769304,
0.002006117021664977,
-0.03175342082977295,
-0.020031431689858437,
0.05889305844902992,
0.004581071902066469,
-0.05034523829817772,
0.12929661571979523,
0.04775991663336754,
0.0421903170645237,
0.08363793790340424,
0.10744763910770416,
-0.01932944729924202,
0.12905436754226685,
0.07301732152700424,
-0.02724931761622429,
-0.10100847482681274,
-0.045530252158641815,
-0.13046810030937195,
0.03636838495731354,
-0.06294208765029907,
0.1215246245265007,
0.12501274049282074,
-0.05405943840742111,
-0.035612545907497406,
-0.06596050411462784,
-0.02531271241605282,
-0.0821961984038353,
0.0551629401743412,
-0.03294352814555168,
-0.07123209536075592,
0.05954226478934288,
0.051752422004938126,
-0.0345902256667614,
0.09877125173807144,
0.023626506328582764,
-0.05420248210430145,
0.1289924532175064,
-0.07043928653001785,
0.12391778826713562,
0.0857820063829422,
-0.05284736678004265,
-0.12392101436853409,
-0.0008859134395606816,
-0.07764882594347,
-0.10004932433366776,
-0.0033899620175361633,
-0.006345012225210667,
-0.0752774253487587,
-0.05705921724438667,
0.10923701524734497,
-0.037135783582925797,
-0.09872278571128845,
-0.02045065350830555,
0.007779862731695175,
0.06386343389749527,
-0.029330682009458542,
-0.0030506509356200695,
0.03867972269654274,
0.024817129597067833,
0.141007199883461,
0.0018099170411005616,
0.06083756312727928,
-0.14227700233459473,
0.15979072451591492,
-0.14508290588855743,
-0.022913414984941483,
-0.19101668894290924,
-0.08954044431447983,
-0.02995009906589985,
0.23840481042861938,
0.26604902744293213,
-0.20281879603862762,
-0.02426106296479702,
0.00425832811743021,
-0.009192324243485928,
-0.07536238431930542,
0.13272343575954437,
0.024475371465086937,
0.002385091735050082,
-0.06547711789608002,
-0.01843218319118023,
0.011527267284691334,
-0.0687597244977951,
-0.03664166107773781,
0.19604063034057617,
0.02317810244858265,
0.07605783641338348,
-0.09212977439165115,
0.032282739877700806,
-0.17361289262771606,
-0.0800284743309021,
-0.02213824726641178,
-0.13024592399597168,
-0.10217876732349396,
-0.018133781850337982,
-0.023277925327420235,
0.10220370441675186,
0.02344435453414917,
-0.020886821672320366,
0.06274893134832382,
-0.049419570714235306,
0.008376802317798138,
-0.1431189328432083,
-0.014425204135477543,
0.0549762025475502,
-0.05160623788833618,
0.2397213727235794,
-0.023091308772563934,
-0.11793158948421478,
0.08183932304382324,
-0.03921804204583168,
-0.11932672560214996,
0.07145301252603531,
-0.022118384018540382,
-0.11051876842975616,
-0.04873227700591087,
0.17931786179542542,
-0.009733476676046848,
-0.15543730556964874,
0.03596103563904762,
-0.14998239278793335,
0.02452315390110016,
0.027938097715377808,
-0.01840374432504177,
-0.04601709544658661,
0.030649635940790176,
-0.025231769308447838,
0.10212156176567078,
0.14918898046016693,
0.01998094655573368,
-0.006970718502998352,
-0.06586845964193344,
0.10006711632013321,
0.06893465667963028,
-0.052236564457416534,
-0.10964984446763992,
-0.10679297894239426,
0.03158950060606003,
0.07306647300720215,
-0.08323679864406586,
-0.17674744129180908,
-0.028295403346419334,
-0.1227894127368927,
-0.001302450313232839,
0.04414788633584976,
0.07027674466371536,
0.29974985122680664,
0.06912046670913696,
-0.0015635360032320023,
-0.13055095076560974,
0.05269114673137665,
0.08383983373641968,
-0.028088655322790146,
-0.08084443211555481
] |
null | null | null | **Model Name**: UNet2D
**Dataset Name**: huggan/smithsonian_butterflies_subset
# Description:
The UNet2D model is a convolutional neural network architecture designed for image segmentation tasks. It has been trained on the huggan/smithsonian_butterflies_subset dataset, specifically curated for butterfly image segmentation. This model is intended for use in identifying and segmenting butterfly images, enabling researchers and enthusiasts to better understand and analyze butterfly populations.
# Training:
## Dataset:
- The model has been trained on the huggan/smithsonian_butterflies_subset dataset, which consists of high-resolution images of butterflies.
## Data Preprocessing:
- Images were preprocessed to standardize size, normalize pixel values, and enhance features relevant to butterfly segmentation.
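
As a rough illustration of the step above, the sketch below builds a torchvision preprocessing pipeline that resizes images and normalizes pixel values. The 256x256 target size and the mean/std of 0.5 are placeholder assumptions for illustration, not values documented for this model.

```python
from torchvision import transforms

# Illustrative preprocessing: standardize size, then normalize pixel values.
# The resolution and normalization statistics are assumed, not taken from
# the model card.
preprocess = transforms.Compose([
    transforms.Resize((256, 256)),              # standardize size
    transforms.ToTensor(),                      # HWC uint8 -> CHW float in [0, 1]
    transforms.Normalize(mean=[0.5, 0.5, 0.5],  # normalize pixel values
                         std=[0.5, 0.5, 0.5]),
])
```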
## Architecture:
- The UNet2D architecture employs a U-shaped design with skip connections to capture both low-level and high-level features in the image.
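
To make the U-shaped idea concrete, here is a toy two-level version: one downsampling stage, one upsampling stage, and a skip connection that concatenates features at matching resolution. This is only a sketch of the pattern; the actual UNet2D used for this model is deeper, and its exact layer configuration is not specified in this card.

```python
import torch
import torch.nn as nn

class TinyUNet(nn.Module):
    """Minimal U-shaped network illustrating encoder, decoder and one skip
    connection. Not the architecture shipped with this model."""
    def __init__(self, in_ch: int = 3, num_classes: int = 2, width: int = 16):
        super().__init__()
        self.enc = nn.Sequential(nn.Conv2d(in_ch, width, 3, padding=1), nn.ReLU())
        self.down = nn.MaxPool2d(2)
        self.mid = nn.Sequential(nn.Conv2d(width, width * 2, 3, padding=1), nn.ReLU())
        self.up = nn.ConvTranspose2d(width * 2, width, 2, stride=2)
        self.dec = nn.Sequential(nn.Conv2d(width * 2, width, 3, padding=1), nn.ReLU())
        self.head = nn.Conv2d(width, num_classes, 1)

    def forward(self, x):
        e = self.enc(x)                 # high-resolution (low-level) features
        m = self.mid(self.down(e))      # low-resolution (high-level) features
        u = self.up(m)                  # upsample back to the input resolution
        u = torch.cat([u, e], dim=1)    # skip connection joins both feature levels
        return self.head(self.dec(u))   # per-pixel class scores
```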
## Loss Function:
- The model was trained using a pixel-wise cross-entropy loss function to optimize the segmentation masks.
## Optimization:
- Stochastic Gradient Descent (SGD) was used as the optimization algorithm with a learning rate of 0.001.
## Training Parameters:
- The model was trained for 50 epochs with a batch size of 16.
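
Put together, the training recipe above (pixel-wise cross-entropy, SGD with a learning rate of 0.001, 50 epochs, batches of 16) corresponds to a loop along the following lines. This is a minimal sketch, assuming a PyTorch-style `model` and a `train_loader` yielding image/mask batches; neither is shipped with this card.

```python
import torch
import torch.nn as nn

def train(model, train_loader, device="cuda", epochs=50, lr=1e-3):
    """Hypothetical training loop matching the described hyperparameters.
    `train_loader` is assumed to yield (images, masks) batches of size 16."""
    criterion = nn.CrossEntropyLoss()                     # pixel-wise cross-entropy
    optimizer = torch.optim.SGD(model.parameters(), lr=lr)
    model.to(device).train()
    for epoch in range(epochs):
        running_loss = 0.0
        for images, masks in train_loader:
            images, masks = images.to(device), masks.to(device)
            optimizer.zero_grad()
            logits = model(images)                        # (N, C, H, W) class scores
            loss = criterion(logits, masks.long())        # masks: (N, H, W) class ids
            loss.backward()
            optimizer.step()
            running_loss += loss.item()
        print(f"epoch {epoch + 1}: mean loss {running_loss / len(train_loader):.4f}")
```

Common refinements such as momentum or a learning-rate schedule are not documented for this model and are therefore left out of the sketch.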
# Evaluation:
## Metrics:
- The model's performance was evaluated using metrics such as Intersection over Union (IoU) and Dice coefficient, providing insights into the accuracy of segmentation.
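
For reference, IoU is the size of the intersection of the predicted and ground-truth masks divided by the size of their union, and the Dice coefficient is twice the intersection divided by the sum of the two mask sizes. A generic implementation for binary masks might look like the following; it is not code distributed with the model.

```python
import torch

def iou_and_dice(pred_mask: torch.Tensor, true_mask: torch.Tensor, eps: float = 1e-7):
    """Compute IoU and Dice for two binary masks of identical shape."""
    pred = pred_mask.bool()
    true = true_mask.bool()
    intersection = (pred & true).sum().float()
    union = (pred | true).sum().float()
    iou = (intersection + eps) / (union + eps)
    dice = (2 * intersection + eps) / (pred.sum().float() + true.sum().float() + eps)
    return iou.item(), dice.item()
```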
## Validation Set:
- A separate validation set from the huggan/smithsonian_butterflies_subset was used to assess the model's generalization on unseen data.
## Fine-Tuning:
- Fine-tuning can be performed on specific butterfly species or environmental conditions to enhance performance in targeted scenarios.
# Generation Phase:
## Input:
During the generation phase, the model takes a seed image or a set of initial conditions related to butterfly features.
## Inference:
The UNet2D model uses its learned features to generate segmentation masks for the provided input, capturing butterfly boundaries and characteristics.
## Post-Processing:
The generated masks can be post-processed to obtain visually appealing and accurate representations of butterfly segments.
## Image Generation:
The final step involves combining the generated masks with the original input images to create realistic and detailed butterfly images.
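
A minimal sketch of this inference path, assuming the trained network and an already-preprocessed input tensor, is shown below; the function and variable names are placeholders rather than an API exposed by this repository.

```python
import torch

@torch.no_grad()
def segment(model, image, device="cuda"):
    """Run the network on a (1, C, H, W) float tensor and return a
    (1, H, W) tensor of per-pixel class ids."""
    model.to(device).eval()
    logits = model(image.to(device))   # (1, num_classes, H, W) class scores
    mask = logits.argmax(dim=1)        # per-pixel argmax -> segmentation mask
    return mask.cpu()
```

The returned mask can then be post-processed and composited with the original image for visualization, as described above.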
# Disclaimer:
The model's performance may vary based on the diversity of butterfly species and environmental conditions. Users are encouraged to validate results on specific use cases and consult domain experts for critical applications.
| {} | null | pt-sk/butterfly-UNet2DModel | [
"tensorboard",
"region:us"
] | 2024-02-14T14:34:11+00:00 | [] | [] | TAGS
#tensorboard #region-us
| Model Name: UNet2D
Dataset Name: huggan/smithsonian_butterflies_subset
# Description:
The UNet2D model is a convolutional neural network architecture designed for image segmentation tasks. It has been trained on the huggan/smithsonian_butterflies_subset dataset, specifically curated for butterfly image segmentation. This model is intended for use in identifying and segmenting butterfly images, enabling researchers and enthusiasts to better understand and analyze butterfly populations.
# Training:
## Dataset:
- The model has been trained on the huggan/smithsonian_butterflies_subset dataset, which consists of high-resolution images of butterflies.
Data Preprocessing:
- Images were preprocessed to standardize size, normalize pixel values, and enhance features relevant to butterfly segmentation.
Architecture:
- The UNet2D architecture employs a U-shaped design with skip connections to capture both low-level and high-level features in the image.
Loss Function:
- The model was trained using a pixel-wise cross-entropy loss function to optimize the segmentation masks.
Optimization:
- Stochastic Gradient Descent (SGD) was used as the optimization algorithm with a learning rate of 0.001.
Training Parameters:
- The model was trained for 50 epochs with a batch size of 16.
Evaluation:
## Metrics:
- The model's performance was evaluated using metrics such as Intersection over Union (IoU) and Dice coefficient, providing insights into the accuracy of segmentation.
Validation Set:
- A separate validation set from the huggan/smithsonian_butterflies_subset was used to assess the model's generalization on unseen data.
Fine-Tuning:
- Fine-tuning can be performed on specific butterfly species or environmental conditions to enhance performance in targeted scenarios.
Generation Phase:
## Input:
During the generation phase, the model takes a seed image or a set of initial conditions related to butterfly features.
Inference:
The UNet2D model uses its learned features to generate segmentation masks for the provided input, capturing butterfly boundaries and characteristics.
Post-Processing:
The generated masks can be post-processed to obtain visually appealing and accurate representations of butterfly segments.
Image Generation:
The final step involves combining the generated masks with the original input images to create realistic and detailed butterfly images.
Disclaimer:
The model's performance may vary based on the diversity of butterfly species and environmental conditions. Users are encouraged to validate results on specific use cases and consult domain experts for critical applications.
| [
"# Description:\nThe UNet2D model is a convolutional neural network architecture designed for image segmentation tasks. It has been trained on the huggan/smithsonian_butterflies_subset dataset, specifically curated for butterfly image segmentation. This model is intended for use in identifying and segmenting butterfly images, enabling researchers and enthusiasts to better understand and analyze butterfly populations.",
"# Training:",
"## Dataset:\n\n- The model has been trained on the huggan/smithsonian_butterflies_subset dataset, which consists of high-resolution images of butterflies.\nData Preprocessing:\n\n- Images were preprocessed to standardize size, normalize pixel values, and enhance features relevant to butterfly segmentation.\nArchitecture:\n\n- The UNet2D architecture employs a U-shaped design with skip connections to capture both low-level and high-level features in the image.\nLoss Function:\n\n- The model was trained using a pixel-wise cross-entropy loss function to optimize the segmentation masks.\nOptimization:\n\n- Stochastic Gradient Descent (SGD) was used as the optimization algorithm with a learning rate of 0.001.\nTraining Parameters:\n\n- The model was trained for 50 epochs with a batch size of 16.\nEvaluation:",
"## Metrics:\n\n- The model's performance was evaluated using metrics such as Intersection over Union (IoU) and Dice coefficient, providing insights into the accuracy of segmentation.\nValidation Set:\n\n- A separate validation set from the huggan/smithsonian_butterflies_subset was used to assess the model's generalization on unseen data.\nFine-Tuning:\n\n- Fine-tuning can be performed on specific butterfly species or environmental conditions to enhance performance in targeted scenarios.\nGeneration Phase:",
"## Input:\n\nDuring the generation phase, the model takes a seed image or a set of initial conditions related to butterfly features.\nInference:\n\nThe UNet2D model uses its learned features to generate segmentation masks for the provided input, capturing butterfly boundaries and characteristics.\nPost-Processing:\n\nThe generated masks can be post-processed to obtain visually appealing and accurate representations of butterfly segments.\nImage Generation:\n\nThe final step involves combining the generated masks with the original input images to create realistic and detailed butterfly images.\nDisclaimer:\nThe model's performance may vary based on the diversity of butterfly species and environmental conditions. Users are encouraged to validate results on specific use cases and consult domain experts for critical applications."
] | [
"TAGS\n#tensorboard #region-us \n",
"# Description:\nThe UNet2D model is a convolutional neural network architecture designed for image segmentation tasks. It has been trained on the huggan/smithsonian_butterflies_subset dataset, specifically curated for butterfly image segmentation. This model is intended for use in identifying and segmenting butterfly images, enabling researchers and enthusiasts to better understand and analyze butterfly populations.",
"# Training:",
"## Dataset:\n\n- The model has been trained on the huggan/smithsonian_butterflies_subset dataset, which consists of high-resolution images of butterflies.\nData Preprocessing:\n\n- Images were preprocessed to standardize size, normalize pixel values, and enhance features relevant to butterfly segmentation.\nArchitecture:\n\n- The UNet2D architecture employs a U-shaped design with skip connections to capture both low-level and high-level features in the image.\nLoss Function:\n\n- The model was trained using a pixel-wise cross-entropy loss function to optimize the segmentation masks.\nOptimization:\n\n- Stochastic Gradient Descent (SGD) was used as the optimization algorithm with a learning rate of 0.001.\nTraining Parameters:\n\n- The model was trained for 50 epochs with a batch size of 16.\nEvaluation:",
"## Metrics:\n\n- The model's performance was evaluated using metrics such as Intersection over Union (IoU) and Dice coefficient, providing insights into the accuracy of segmentation.\nValidation Set:\n\n- A separate validation set from the huggan/smithsonian_butterflies_subset was used to assess the model's generalization on unseen data.\nFine-Tuning:\n\n- Fine-tuning can be performed on specific butterfly species or environmental conditions to enhance performance in targeted scenarios.\nGeneration Phase:",
"## Input:\n\nDuring the generation phase, the model takes a seed image or a set of initial conditions related to butterfly features.\nInference:\n\nThe UNet2D model uses its learned features to generate segmentation masks for the provided input, capturing butterfly boundaries and characteristics.\nPost-Processing:\n\nThe generated masks can be post-processed to obtain visually appealing and accurate representations of butterfly segments.\nImage Generation:\n\nThe final step involves combining the generated masks with the original input images to create realistic and detailed butterfly images.\nDisclaimer:\nThe model's performance may vary based on the diversity of butterfly species and environmental conditions. Users are encouraged to validate results on specific use cases and consult domain experts for critical applications."
] | [
10,
99,
3,
202,
125,
174
] | [
"passage: TAGS\n#tensorboard #region-us \n# Description:\nThe UNet2D model is a convolutional neural network architecture designed for image segmentation tasks. It has been trained on the huggan/smithsonian_butterflies_subset dataset, specifically curated for butterfly image segmentation. This model is intended for use in identifying and segmenting butterfly images, enabling researchers and enthusiasts to better understand and analyze butterfly populations.# Training:## Dataset:\n\n- The model has been trained on the huggan/smithsonian_butterflies_subset dataset, which consists of high-resolution images of butterflies.\nData Preprocessing:\n\n- Images were preprocessed to standardize size, normalize pixel values, and enhance features relevant to butterfly segmentation.\nArchitecture:\n\n- The UNet2D architecture employs a U-shaped design with skip connections to capture both low-level and high-level features in the image.\nLoss Function:\n\n- The model was trained using a pixel-wise cross-entropy loss function to optimize the segmentation masks.\nOptimization:\n\n- Stochastic Gradient Descent (SGD) was used as the optimization algorithm with a learning rate of 0.001.\nTraining Parameters:\n\n- The model was trained for 50 epochs with a batch size of 16.\nEvaluation:## Metrics:\n\n- The model's performance was evaluated using metrics such as Intersection over Union (IoU) and Dice coefficient, providing insights into the accuracy of segmentation.\nValidation Set:\n\n- A separate validation set from the huggan/smithsonian_butterflies_subset was used to assess the model's generalization on unseen data.\nFine-Tuning:\n\n- Fine-tuning can be performed on specific butterfly species or environmental conditions to enhance performance in targeted scenarios.\nGeneration Phase:"
] | [
-0.07606441527605057,
-0.03632406145334244,
-0.0022434741258621216,
0.04519047960639,
0.00019704090664163232,
-0.003948956727981567,
0.0016885735094547272,
0.0437740720808506,
-0.19900958240032196,
0.08403562009334564,
-0.037212081253528595,
-0.012884223833680153,
0.10999617725610733,
0.03540203347802162,
0.039756890386343,
-0.3412697911262512,
0.018825380131602287,
-0.05072109028697014,
-0.047403667122125626,
0.044906917959451675,
0.09124236553907394,
-0.0634307786822319,
0.06390857696533203,
0.08384380489587784,
-0.05328214541077614,
-0.05748181417584419,
-0.007647754158824682,
0.032612789422273636,
0.10354983806610107,
0.04237653687596321,
0.0977102741599083,
-0.056873708963394165,
0.041576702147722244,
-0.10851453989744186,
0.017728528007864952,
0.12480176985263824,
0.016189169138669968,
0.04681574925780296,
0.028793763369321823,
0.08526014536619186,
0.14562533795833588,
-0.03321550041437149,
0.08572281152009964,
0.003080772003158927,
-0.06900447607040405,
-0.18479681015014648,
-0.15108680725097656,
-0.01696625165641308,
0.11375365406274796,
0.031851716339588165,
0.022912882268428802,
-0.011549636721611023,
-0.053010594099760056,
0.022741205990314484,
0.0029293543193489313,
-0.25944653153419495,
-0.019814196974039078,
-0.0485229417681694,
0.013555043376982212,
0.10715961456298828,
-0.11631821095943451,
-0.0083466162905097,
0.0007099154172465205,
-0.004735561087727547,
0.10129448771476746,
0.011255307123064995,
0.07770654559135437,
-0.06400729715824127,
-0.06623139977455139,
-0.04431058093905449,
0.031151410192251205,
0.03642972931265831,
-0.08743075281381607,
-0.08889810740947723,
-0.07018561661243439,
0.028868064284324646,
-0.02793573960661888,
-0.14897052943706512,
0.07562603801488876,
0.09350542724132538,
0.06893856078386307,
-0.1468047797679901,
-0.09140513837337494,
-0.03383379802107811,
-0.061197858303785324,
0.10171794891357422,
0.05327490344643593,
0.03759174421429634,
-0.04843044653534889,
0.04125603660941124,
-0.05661719664931297,
-0.04903054237365723,
-0.09172498434782028,
0.007078746799379587,
-0.06005740910768509,
0.009546986781060696,
-0.04510103538632393,
-0.16797220706939697,
-0.09184321761131287,
0.20125621557235718,
-0.0542149543762207,
0.026696935296058655,
0.024468958377838135,
0.026272429153323174,
-0.013804263435304165,
0.04647863283753395,
-0.055912360548973083,
0.003432734403759241,
0.08734169602394104,
-0.0277725700289011,
0.08910812437534332,
-0.06998169422149658,
-0.017197411507368088,
0.07032237201929092,
0.0249974112957716,
-0.03431370481848717,
0.01667984016239643,
-0.07492940127849579,
-0.08541739732027054,
-0.039995256811380386,
0.08896034210920334,
-0.0665106475353241,
0.05796840414404869,
0.027507059276103973,
0.03473881632089615,
0.005930822808295488,
0.013730566017329693,
0.003964459989219904,
0.004416932351887226,
0.07440684735774994,
-0.04950158670544624,
0.016229668632149696,
-0.07084948569536209,
-0.065852090716362,
-0.03718222305178642,
-0.041700031608343124,
-0.06322044134140015,
-0.07687975466251373,
-0.12771856784820557,
-0.07728064060211182,
0.039497941732406616,
-0.07649896293878555,
-0.009192773140966892,
0.0031697533559054136,
-0.04416441172361374,
0.04270140454173088,
0.04368539899587631,
-0.0015688205603510141,
0.005720316898077726,
0.03561144694685936,
-0.11663836240768433,
0.019580109044909477,
-0.07929141074419022,
0.03539959713816643,
-0.02524036541581154,
-0.01645086705684662,
-0.1176915168762207,
0.15609733760356903,
-0.016859598457813263,
-0.002844683825969696,
-0.05588060989975929,
-0.057007111608982086,
-0.16528046131134033,
0.00819469802081585,
0.030509155243635178,
0.08296762406826019,
-0.231397345662117,
-0.07908502966165543,
0.020402422174811363,
-0.14876613020896912,
0.10103142261505127,
0.12038201838731766,
-0.056360695511102676,
0.051496896892786026,
0.1225859746336937,
0.15066708624362946,
0.11947933584451675,
-0.01381562277674675,
0.0026977062225341797,
0.07820608466863632,
-0.07552216202020645,
0.07219817489385605,
0.03493713587522507,
-0.045950327068567276,
-0.021107178181409836,
0.002422519028186798,
-0.07210072875022888,
-0.037117209285497665,
0.053064052015542984,
-0.09480218589305878,
-0.006138639058917761,
-0.006664356682449579,
0.036309633404016495,
0.01841498352587223,
0.08088069409132004,
-0.018152639269828796,
-0.05788862332701683,
0.06645030528306961,
0.05054401978850365,
-0.06339603662490845,
0.09729412198066711,
-0.031733784824609756,
0.06992669403553009,
-0.01517690159380436,
-0.02603423222899437,
-0.14848634600639343,
-0.0020548615138977766,
0.05382377654314041,
0.002691150875762105,
0.0561760850250721,
0.09978651255369186,
0.04598094895482063,
0.0035807874519377947,
-0.047011103481054306,
0.04421636089682579,
-0.09112917631864548,
-0.04003732278943062,
-0.08487653732299805,
-0.12993283569812775,
-0.012107878923416138,
-0.0792565867304802,
0.00813216157257557,
-0.14937841892242432,
-0.039442144334316254,
0.09078847616910934,
0.06220623850822449,
0.06097448244690895,
-0.09924987703561783,
0.00017797558393795043,
-0.003527754917740822,
0.012380655854940414,
-0.05557127669453621,
0.01588337868452072,
0.07850044965744019,
-0.06938023120164871,
-0.07429978996515274,
-0.05974225699901581,
0.04853043332695961,
-0.001227743225172162,
0.024035828188061714,
-0.06584654748439789,
0.0037089830730110407,
0.021832620725035667,
-0.02599659375846386,
-0.06743280589580536,
-0.0038613846991211176,
0.23891368508338928,
-0.04065990075469017,
0.06801372766494751,
-0.10278070718050003,
0.013792880810797215,
-0.05769363045692444,
0.04911390691995621,
0.017149128019809723,
0.003079341258853674,
-0.0427754782140255,
-0.06314418464899063,
0.05243248492479324,
0.005310091655701399,
0.005854707211256027,
0.12293601036071777,
0.009113326668739319,
-0.04337048530578613,
0.0168289877474308,
0.05013303458690643,
0.0238480344414711,
0.10010076314210892,
0.0193129051476717,
0.04572746157646179,
0.0329129584133625,
0.07334381341934204,
0.017488636076450348,
-0.17085088789463043,
0.062356747686862946,
0.06615045666694641,
-0.0357653945684433,
0.06561418622732162,
-0.1028960645198822,
-0.014235475100576878,
0.06851033866405487,
0.06303636729717255,
0.07093976438045502,
0.003367701545357704,
0.003197574056684971,
-0.08686581999063492,
0.2041504830121994,
-0.11635340750217438,
-0.17604926228523254,
-0.06433363258838654,
0.18164268136024475,
-0.03189605474472046,
0.003929691854864359,
0.004706899169832468,
-0.09787364304065704,
-0.01930873468518257,
-0.08658641576766968,
-0.01419493556022644,
-0.0456097349524498,
0.027657318860292435,
-0.051408350467681885,
0.0000366993153875228,
0.02116355113685131,
-0.1468867063522339,
0.04696005955338478,
-0.09042715281248093,
-0.08935315907001495,
-0.031000908464193344,
0.0888710618019104,
-0.0003784160944633186,
0.13048380613327026,
0.06535662710666656,
-0.042930394411087036,
-0.023378146812319756,
0.1697496920824051,
-0.09103794395923615,
0.1025143414735794,
0.10775189846754074,
-0.03288967162370682,
0.0907265767455101,
0.08002099394798279,
0.025429721921682358,
-0.0802091583609581,
-0.03618568554520607,
0.03390016034245491,
-0.06907270848751068,
-0.12152543663978577,
-0.042838264256715775,
-0.06450857222080231,
-0.17992186546325684,
0.03033471293747425,
0.07308236509561539,
-0.044032786041498184,
0.04369194060564041,
0.014988230541348457,
0.06314290314912796,
0.053025033324956894,
0.015730896964669228,
0.046224288642406464,
0.004216829780489206,
0.08323376625776291,
-0.06026553362607956,
0.023817606270313263,
0.1218574270606041,
0.03290804475545883,
0.24144935607910156,
-0.02478921413421631,
0.10337044298648834,
0.04568065330386162,
0.08893956989049911,
0.034542858600616455,
0.03642993047833443,
-0.1368975192308426,
-0.01415980514138937,
-0.07549604028463364,
-0.03839791566133499,
-0.03614259883761406,
0.07136006653308868,
0.05228102579712868,
-0.011649813503026962,
0.025404928252100945,
-0.03640110045671463,
0.008776523172855377,
0.171365886926651,
0.10754165798425674,
-0.13547256588935852,
-0.08210789412260056,
0.028795821592211723,
-0.04299487918615341,
-0.07684973627328873,
-0.0014321391936391592,
0.15691934525966644,
-0.0651925653219223,
0.034446440637111664,
-0.06263622641563416,
0.05883041396737099,
-0.16188742220401764,
-0.04419901221990585,
-0.02907155454158783,
0.11212770640850067,
0.03455678001046181,
0.0010733954841271043,
-0.044700171798467636,
0.07568120956420898,
-0.007532457355409861,
0.17054711282253265,
-0.0028340218123048544,
0.04319232702255249,
0.09482037276029587,
0.06927286833524704,
0.057991206645965576,
0.06340617686510086,
-0.08432484418153763,
0.028868138790130615,
-0.15652582049369812,
0.07023314386606216,
0.08016534894704819,
-0.013414264656603336,
0.006038217805325985,
-0.035859253257513046,
0.05812183395028114,
-0.010000123642385006,
-0.1089368611574173,
-0.1645759642124176,
-0.16736018657684326,
0.054857317358255386,
-0.16630162298679352,
0.02174384891986847,
-0.03243470937013626,
-0.06539838016033173,
0.08521375060081482,
0.17620670795440674,
-0.14713069796562195,
-0.03940363973379135,
-0.12009606510400772,
0.021664699539542198,
0.0018523185281082988,
-0.0662553533911705,
0.07790695875883102,
-0.0002942034916486591,
0.08598753064870834,
-0.02149646170437336,
-0.09455224126577377,
0.04818953573703766,
-0.034152500331401825,
-0.14718486368656158,
-0.038846880197525024,
0.04271254688501358,
0.12418663501739502,
0.04174446687102318,
0.06382580101490021,
0.04881983622908592,
0.006537664216011763,
-0.11820200085639954,
0.0343020036816597,
0.1443188488483429,
0.02009752206504345,
0.02883780375123024,
-0.03980029746890068,
0.04076327383518219,
-0.004868664313107729,
0.058769602328538895,
0.12790171802043915,
0.10972584784030914,
-0.07185835391283035,
0.17796359956264496,
0.15544232726097107,
-0.13355626165866852,
-0.21968583762645721,
0.04959572106599808,
0.0030669940169900656,
0.04514751583337784,
0.025679191574454308,
-0.24615171551704407,
0.04582664743065834,
0.09610993415117264,
-0.05339517444372177,
0.007363170385360718,
-0.1557416468858719,
-0.10761558264493942,
0.009326942265033722,
0.060452595353126526,
0.18064671754837036,
-0.08742495626211166,
-0.08612088114023209,
-0.04101197421550751,
0.09043246507644653,
0.0837133526802063,
0.024976534768939018,
0.11277112364768982,
-0.016536055132746696,
-0.14892087876796722,
-0.029703877866268158,
0.028033047914505005,
0.11787834763526917,
-0.040539566427469254,
0.08777887374162674,
-0.023262619972229004,
0.18437819182872772,
0.007784456480294466,
-0.04805895686149597,
0.03744342178106308,
0.03544929623603821,
0.03361629694700241,
-0.19245599210262299,
0.010124855674803257,
-0.05922159180045128,
0.04854849353432655,
0.0016658074455335736,
-0.002603924833238125,
-0.2103155255317688,
0.036740608513355255,
0.05239385366439819,
0.01005731150507927,
0.019086606800556183,
0.04658833518624306,
-0.019790256395936012,
-0.0789385512471199,
0.15751171112060547,
-0.10171885788440704,
-0.18417121469974518,
0.0037792494986206293,
0.018320368602871895,
0.11451739817857742,
-0.13304948806762695,
-0.0024489308707416058,
0.04052709415555,
0.032858386635780334,
0.05143899843096733,
0.07108727097511292,
-0.07563776522874832,
0.09478858858346939,
0.07408786565065384,
0.006799107417464256,
-0.22222495079040527,
0.02351575903594494,
-0.11661436408758163,
-0.051670726388692856,
-0.007987807504832745,
0.10346303880214691,
-0.04654838517308235,
-0.022849788889288902,
-0.04623360186815262,
0.07230953127145767,
-0.0335676483809948,
0.04316306114196777,
0.08815506845712662,
-0.02732742764055729,
-0.04662725329399109,
0.12326020002365112,
0.011865860782563686,
-0.07997404038906097,
0.054703593254089355,
0.034464798867702484,
-0.05232555791735649,
-0.025976762175559998,
0.07046281546354294,
0.061091143637895584,
0.10554861277341843,
0.0000897214631550014,
-0.04460294172167778,
-0.11715094745159149,
0.0538502112030983,
0.019679268822073936,
0.025476112961769104,
0.024550506845116615,
-0.12574106454849243,
0.023111386224627495,
-0.05511845648288727,
-0.000042766132537508383,
-0.01917385496199131,
0.03707186132669449,
-0.06538024544715881,
0.02228621020913124,
0.04886747896671295,
-0.009211910888552666,
-0.02780681475996971,
-0.04812943935394287,
-0.14174051582813263,
-0.05729944631457329,
0.00046680992818437517,
-0.015077933669090271,
-0.04980240762233734,
0.0028083575889468193,
-0.014020237140357494,
0.0047654323279857635,
-0.005608313251286745,
0.05482442304491997,
-0.04635613411664963,
-0.04714062437415123,
-0.020771581679582596,
-0.000024261100406874903,
-0.1090867668390274,
0.07594731450080872,
0.05572972819209099,
-0.09577907621860504,
0.1188303604722023,
-0.01979808881878853,
0.015349282883107662,
0.004367128945887089,
-0.1073494479060173,
-0.064307801425457,
0.013953890651464462,
-0.0047694058157503605,
-0.01713978685438633,
-0.1187085285782814,
0.003058682195842266,
-0.009265448898077011,
0.02539212815463543,
-0.024549143388867378,
0.21511219441890717,
-0.05714191496372223,
0.07510584592819214,
-0.08339627832174301,
-0.027163607999682426,
-0.04018222913146019,
-0.02120933122932911,
0.04958892986178398,
0.02728811465203762,
0.16378085315227509,
0.012890474870800972,
-0.0015377314994111657,
-0.1753484159708023,
-0.04535433277487755,
0.006848985329270363,
-0.027731336653232574,
-0.006355298683047295,
-0.09482382982969284,
0.05432747304439545,
0.060173168778419495,
0.06826212257146835,
0.03229086846113205,
-0.14479364454746246,
0.0093723488971591,
0.010434349998831749,
0.026777228340506554,
0.024413561448454857,
0.0754815861582756,
0.06730964034795761,
-0.02641301602125168,
0.017586033791303635,
-0.0004398537566885352,
0.019395262002944946,
0.1589507907629013,
0.2072739154100418,
0.19627366960048676,
0.039445679634809494,
0.1055883839726448,
-0.02299022488296032,
-0.02000165730714798,
-0.24416716396808624,
0.06464133411645889,
-0.011436744593083858,
0.07040079683065414,
0.026022866368293762,
-0.06411433964967728,
0.15072114765644073,
-0.18585307896137238,
0.14810433983802795,
0.03075469098985195,
-0.09655588865280151,
-0.12757591903209686,
-0.17291492223739624,
-0.06922432035207748,
-0.10349196195602417,
0.003103397088125348,
-0.07496703416109085,
-0.0193990059196949,
0.18072859942913055,
0.010024411603808403,
0.040405403822660446,
0.06658276170492172,
-0.1369558423757553,
-0.006002397742122412,
0.05631740391254425,
-0.0011967032914981246,
0.021712761372327805,
0.12005620449781418,
-0.03922055661678314,
0.050990860909223557,
0.0023562610149383545,
0.08983015269041061,
0.008718148805201054,
0.04589544236660004,
0.04454734921455383,
0.02793000638484955,
-0.015344440005719662,
0.03379574045538902,
-0.0479620099067688,
0.031861789524555206,
0.16400662064552307,
0.08582514524459839,
-0.09961028397083282,
0.02709881030023098,
0.16607114672660828,
-0.03835761174559593,
-0.010439567267894745,
-0.18522299826145172,
0.14755839109420776,
-0.04493872821331024,
0.009121491573750973,
-0.008494524285197258,
-0.06724279373884201,
0.03702175244688988,
0.14594164490699768,
0.04970424249768257,
-0.07265681773424149,
0.012128360569477081,
-0.013431989587843418,
-0.02848542295396328,
-0.01397815439850092,
0.15709231793880463,
0.0162117388099432,
0.28957614302635193,
-0.05587908253073692,
-0.017537012696266174,
-0.03769618272781372,
-0.09142521023750305,
-0.13934406638145447,
0.1441478729248047,
-0.03751823306083679,
0.01843319460749626,
-0.07165036350488663,
0.06233161315321922,
0.10981498658657074,
-0.12201251834630966,
0.19907428324222565,
-0.04717433452606201,
-0.12548865377902985,
0.04235208034515381,
-0.001324192271567881,
-0.05815291777253151,
-0.0017672341782599688,
0.052780549973249435,
-0.010681035928428173,
0.09444528818130493,
0.017524437978863716,
-0.04324469342827797,
-0.08968029171228409,
0.09968861937522888,
0.01590796187520027,
0.2196975201368332,
0.03594139590859413,
0.015218035317957401,
0.036176856607198715,
0.07499006390571594,
-0.09535645693540573,
0.018538832664489746,
-0.06678583472967148,
0.003475482575595379,
0.012046139687299728,
0.15958687663078308,
0.0013508170377463102,
0.030506489798426628,
0.10861412435770035,
-0.05407373234629631,
0.055793631821870804,
-0.06198318302631378,
-0.09781861305236816,
-0.029336662963032722,
-0.007742481771856546,
-0.15636645257472992,
0.11710585653781891,
0.13362163305282593,
0.04095636308193207,
0.03822750598192215,
-0.04695676267147064,
0.02324594557285309,
0.04098710045218468,
0.176452174782753,
-0.023269323632121086,
-0.1514449268579483,
-0.04574442654848099,
0.11696558445692062,
0.01769096590578556,
-0.09297669678926468,
-0.15696343779563904,
-0.00023183795565273613,
-0.033097755163908005,
0.023599496111273766,
0.02795141004025936,
0.012106280773878098,
-0.020286643877625465,
-0.04894889518618584,
-0.11583802103996277,
-0.006039890926331282,
0.031829703599214554,
-0.06862309575080872,
0.016419177874922752
] |
null | null | null |
# Model Trained Using AutoTrain
This model was trained using AutoTrain. For more information, please visit [AutoTrain](https://hf.co/docs/autotrain).
# Usage
```python
from transformers import AutoModelForCausalLM, AutoTokenizer
model_path = "PATH_TO_THIS_REPO"
tokenizer = AutoTokenizer.from_pretrained(model_path)
model = AutoModelForCausalLM.from_pretrained(
model_path,
device_map="auto",
torch_dtype='auto'
).eval()
# Prompt content: "hi"
messages = [
{"role": "user", "content": "hi"}
]
input_ids = tokenizer.apply_chat_template(conversation=messages, tokenize=True, add_generation_prompt=True, return_tensors='pt')
output_ids = model.generate(input_ids.to(model.device))  # match the device the model was loaded on (avoids a hard-coded 'cuda')
response = tokenizer.decode(output_ids[0][input_ids.shape[1]:], skip_special_tokens=True)
# Model response: "Hello! How can I assist you today?"
print(response)
``` | {"license": "other", "tags": ["autotrain", "text-generation"], "widget": [{"text": "I love AutoTrain because "}]} | text-generation | rockyclh/llama2_7b_hf_entrepreneurship | [
"safetensors",
"autotrain",
"text-generation",
"license:other",
"endpoints_compatible",
"region:us"
] | 2024-02-14T14:35:32+00:00 | [] | [] | TAGS
#safetensors #autotrain #text-generation #license-other #endpoints_compatible #region-us
|
# Model Trained Using AutoTrain
This model was trained using AutoTrain. For more information, please visit AutoTrain.
# Usage
| [
"# Model Trained Using AutoTrain\n\nThis model was trained using AutoTrain. For more information, please visit AutoTrain.",
"# Usage"
] | [
"TAGS\n#safetensors #autotrain #text-generation #license-other #endpoints_compatible #region-us \n",
"# Model Trained Using AutoTrain\n\nThis model was trained using AutoTrain. For more information, please visit AutoTrain.",
"# Usage"
] | [
33,
29,
3
] | [
"passage: TAGS\n#safetensors #autotrain #text-generation #license-other #endpoints_compatible #region-us \n# Model Trained Using AutoTrain\n\nThis model was trained using AutoTrain. For more information, please visit AutoTrain.# Usage"
] | [
-0.03320549428462982,
0.03780708089470863,
-0.0005784488166682422,
0.037439193576574326,
0.13256101310253143,
-0.02594633586704731,
0.22870999574661255,
0.04971681907773018,
-0.04270017519593239,
-0.08776232600212097,
0.19642603397369385,
0.16802352666854858,
-0.04566871374845505,
0.18935616314411163,
-0.02990073338150978,
-0.2414124757051468,
0.021885043010115623,
-0.025850016623735428,
0.1327640414237976,
0.11522045731544495,
0.14238014817237854,
-0.07779128849506378,
0.06120644509792328,
0.04086628183722496,
-0.20404933393001556,
0.03463415056467056,
0.07968573272228241,
-0.11895040422677994,
0.18004877865314484,
0.032886918634176254,
0.13635416328907013,
0.01931498385965824,
0.14652439951896667,
-0.12186150997877121,
0.014377960003912449,
0.01464270893484354,
-0.015491045080125332,
0.055415596812963486,
0.08804452419281006,
-0.038794226944446564,
0.09763352572917938,
0.177653506398201,
0.10883878171443939,
0.04911845549941063,
-0.10558086633682251,
-0.014727416448295116,
-0.03310466557741165,
0.018835384398698807,
0.12075160443782806,
0.1193094402551651,
-0.01845790445804596,
0.20021599531173706,
-0.14986595511436462,
0.07329507917165756,
-0.0995626449584961,
-0.27255508303642273,
-0.0038277229759842157,
0.21143054962158203,
0.07346842437982559,
-0.025004452094435692,
-0.12620827555656433,
0.06475763022899628,
0.12761425971984863,
0.0030757547356188297,
0.06504988670349121,
-0.015198786742985249,
-0.055105701088905334,
-0.0015243350062519312,
-0.07397002726793289,
-0.004598719999194145,
0.18640007078647614,
-0.07974611967802048,
-0.031184203922748566,
-0.12737500667572021,
-0.019428882747888565,
0.04709514603018761,
0.011552144773304462,
-0.09352482110261917,
-0.0217994824051857,
0.11079124361276627,
-0.007622338831424713,
-0.02531961165368557,
-0.15207529067993164,
-0.05755603685975075,
-0.08864409476518631,
0.04077286645770073,
0.0017509139142930508,
0.011538662947714329,
-0.09947098046541214,
0.12073534727096558,
-0.029350996017456055,
-0.0943499282002449,
0.052897434681653976,
-0.1107030138373375,
0.04635190963745117,
-0.11982002854347229,
-0.03970254212617874,
-0.10856737196445465,
0.013430505990982056,
0.22841021418571472,
0.1669083684682846,
-0.015314205549657345,
-0.08587565273046494,
0.039016176015138626,
0.02371702343225479,
0.09614221751689911,
0.06376225501298904,
-0.015822242945432663,
0.06775996834039688,
-0.04785482585430145,
-0.017039362341165543,
-0.025495992973446846,
-0.1726902425289154,
0.032083623111248016,
0.01997307874262333,
0.07117509841918945,
-0.0760226845741272,
0.06040170043706894,
-0.01951628364622593,
0.055283352732658386,
0.05161101743578911,
-0.031190861016511917,
0.03744623437523842,
-0.052504897117614746,
0.01617865450680256,
-0.09791388362646103,
0.0286922138184309,
0.1180110052227974,
0.03286140412092209,
0.1336720734834671,
-0.09649777412414551,
-0.026225421577692032,
-0.1056324690580368,
-0.03878350928425789,
0.018166208639740944,
-0.0019215025240555406,
0.0628642737865448,
-0.19663763046264648,
-0.30395275354385376,
-0.027070891112089157,
0.053043100982904434,
-0.019671862944960594,
-0.05561401695013046,
-0.07015043497085571,
0.016289202496409416,
0.059536442160606384,
-0.02920805849134922,
0.054385289549827576,
-0.022419849410653114,
0.03813159465789795,
-0.07676586508750916,
-0.02052054926753044,
-0.06291672587394714,
0.006658008787781,
-0.14841435849666595,
-0.03448035567998886,
-0.030017102137207985,
0.006548900622874498,
-0.03775618225336075,
0.16895608603954315,
-0.011088937520980835,
0.047757651656866074,
-0.05747115612030029,
0.05074193328619003,
0.007877329364418983,
0.1440490484237671,
-0.1335235834121704,
0.005429679993540049,
0.1511751264333725,
-0.11302075535058975,
-0.10663392394781113,
0.09467647224664688,
-0.10317569971084595,
0.23649843037128448,
0.10416192561388016,
0.13955152034759521,
0.05125761032104492,
-0.12630151212215424,
0.11601320654153824,
0.03282208740711212,
-0.08780468255281448,
-0.062369491904973984,
-0.0006791196065023541,
-0.034443121403455734,
-0.22099432349205017,
0.031658004969358444,
0.11068084836006165,
0.07476310431957245,
-0.03403317928314209,
-0.08304393291473389,
-0.02895026095211506,
-0.058612581342458725,
0.03986813873052597,
0.016017582267522812,
0.12599535286426544,
-0.07699156552553177,
-0.02858225256204605,
0.032077912241220474,
0.038467586040496826,
0.07923582941293716,
-0.054815541952848434,
-0.057291675359010696,
-0.01996961608529091,
-0.023569827899336815,
-0.00915558822453022,
-0.0898597314953804,
-0.0620407834649086,
-0.006840218789875507,
0.1304454207420349,
0.03466487303376198,
0.07167287915945053,
0.0362425372004509,
0.052633073180913925,
-0.028641145676374435,
0.002677651820704341,
0.1629824936389923,
0.04459667578339577,
-0.12675853073596954,
-0.08582112193107605,
0.10815013945102692,
-0.07446087151765823,
0.1071702167391777,
-0.2590586841106415,
0.028333326801657677,
-0.11371348798274994,
0.08611167222261429,
-0.013308924622833729,
0.06491301208734512,
-0.08320876955986023,
0.024355897679924965,
-0.08930765837430954,
-0.008432179689407349,
0.05678462237119675,
0.04953930526971817,
-0.02282531000673771,
0.12372811883687973,
-0.1432238668203354,
0.21934939920902252,
0.1198250874876976,
-0.09310522675514221,
-0.11077594012022018,
-0.0739443302154541,
0.009118417277932167,
-0.005148864816874266,
-0.1179550290107727,
0.005491754971444607,
0.076014444231987,
-0.04686584323644638,
0.1847466230392456,
-0.034107014536857605,
-0.03428659960627556,
-0.015382813289761543,
-0.08532355725765228,
-0.009268855676054955,
-0.02073976956307888,
0.09649215638637543,
-0.2238936424255371,
0.1325010061264038,
0.16212041676044464,
-0.015046309679746628,
0.1718226969242096,
0.01847519353032112,
0.013679388910531998,
0.006052343640476465,
-0.04082776978611946,
-0.00007846848893677816,
0.02128027006983757,
0.0015916629927232862,
0.0011914868373423815,
0.007707077544182539,
0.02131907269358635,
0.030305195599794388,
-0.14438240230083466,
-0.05413905158638954,
0.010167223401367664,
0.052466847002506256,
0.00018202696810476482,
0.0614926852285862,
-0.08105885237455368,
0.05735839903354645,
-0.0333511158823967,
-0.11407014727592468,
0.12527471780776978,
0.0140310637652874,
-0.12375999987125397,
0.1809239387512207,
-0.09875242412090302,
-0.177916020154953,
-0.19897617399692535,
-0.11664178967475891,
0.025174645707011223,
0.09509945660829544,
0.06778308749198914,
-0.06591268628835678,
-0.0677633062005043,
-0.013884147629141808,
-0.13205823302268982,
0.015237858518958092,
-0.0303916335105896,
-0.10815607011318207,
0.06643082201480865,
0.002197817200794816,
-0.1106930822134018,
-0.04751880466938019,
0.012397545389831066,
-0.05212624743580818,
0.06534521281719208,
-0.032029394060373306,
0.06015416979789734,
0.12733860313892365,
-0.009645693004131317,
0.014830506406724453,
-0.03892328962683678,
0.1736617386341095,
-0.07863081991672516,
0.0028175772167742252,
0.11224561184644699,
-0.04382455348968506,
0.03531843051314354,
0.2027312070131302,
0.03458266332745552,
-0.07247956842184067,
0.06938916444778442,
-0.03509911522269249,
-0.05979844182729721,
-0.202435702085495,
-0.10123657435178757,
-0.007523522712290287,
-0.02823515795171261,
0.08373580127954483,
0.0565473809838295,
0.25448861718177795,
0.1288231760263443,
0.060374923050403595,
0.03997355327010155,
0.024889161810278893,
0.0913970097899437,
0.1029813289642334,
-0.027027886360883713,
0.16222402453422546,
-0.08429007232189178,
-0.14650671184062958,
0.048164136707782745,
-0.022769063711166382,
0.07281020283699036,
0.17174853384494781,
-0.06210782378911972,
0.04705783352255821,
0.11571547389030457,
0.13094793260097504,
0.12702703475952148,
0.07746905833482742,
-0.061997704207897186,
-0.006629003677517176,
0.0010869213147088885,
-0.04415592923760414,
0.14652740955352783,
-0.060009948909282684,
-0.06889448314905167,
-0.04306207224726677,
-0.003198902355507016,
0.04323491454124451,
0.05818231403827667,
0.026216039434075356,
-0.28657910227775574,
0.042942874133586884,
0.04888097196817398,
-0.05969006195664406,
-0.11467164009809494,
0.09232109785079956,
-0.027857046574354172,
-0.18361465632915497,
0.03563778102397919,
-0.033283449709415436,
0.09147034585475922,
0.062072351574897766,
0.04841171205043793,
-0.06585943698883057,
-0.0609852597117424,
-0.045712124556303024,
0.15376420319080353,
-0.33846980333328247,
0.20756816864013672,
-0.011205663904547691,
0.08115556091070175,
-0.10785048454999924,
0.010794016532599926,
0.08773794025182724,
0.19103488326072693,
0.12050216645002365,
-0.049261946231126785,
-0.19848455488681793,
-0.11937171965837479,
-0.08363119512796402,
-0.015415008179843426,
0.02001480758190155,
-0.008096402511000633,
0.0008919041720218956,
-0.11757626384496689,
0.0014032695908099413,
0.04126403480768204,
-0.0069845812395215034,
-0.17894983291625977,
-0.15384836494922638,
-0.03538630157709122,
0.030474675819277763,
0.10934672504663467,
-0.04776112735271454,
-0.0534328930079937,
-0.06292759627103806,
0.13548673689365387,
0.026695549488067627,
0.008182995021343231,
-0.1301279366016388,
-0.053804632276296616,
-0.044131867587566376,
-0.023950019851326942,
0.07710648328065872,
0.009424211457371712,
0.11959850043058395,
-0.08615647256374359,
-0.06447352468967438,
0.09218238294124603,
-0.12910714745521545,
-0.042984966188669205,
-0.12177132815122604,
0.03449074551463127,
-0.045684002339839935,
-0.01073586754500866,
0.11459703743457794,
0.04736353084445,
-0.07455705851316452,
-0.06686578691005707,
-0.016151487827301025,
-0.0162202138453722,
0.052238523960113525,
-0.10140960663557053,
-0.11989933252334595,
-0.12391869723796844,
-0.023699220269918442,
-0.11985665559768677,
0.1933230459690094,
0.14995472133159637,
-0.08873795717954636,
0.15256796777248383,
0.2099498212337494,
-0.11413656920194626,
-0.29302918910980225,
-0.05128840357065201,
-0.06601350009441376,
0.004299632739275694,
0.06156041473150253,
-0.10058135539293289,
0.1023014560341835,
0.016915474086999893,
-0.08869403600692749,
-0.016260353848338127,
-0.10926515609025955,
-0.16224952042102814,
0.22960300743579865,
-0.0020108406897634268,
0.18459931015968323,
-0.07568172365427017,
-0.05459576100111008,
-0.12268339842557907,
0.05030543729662895,
0.043312136083841324,
-0.06949128210544586,
0.04921199381351471,
0.045118432492017746,
0.04848489910364151,
0.02309754677116871,
-0.04944291338324547,
0.05402865633368492,
-0.07527824491262436,
0.09563448280096054,
-0.16834798455238342,
-0.019022751599550247,
0.05676575005054474,
-0.027846379205584526,
0.11607834696769714,
-0.040225449949502945,
0.045501600950956345,
-0.05838647112250328,
-0.07079911977052689,
0.02105431631207466,
0.07136379927396774,
-0.007516450714319944,
-0.11632271111011505,
0.009460309520363808,
0.0020681610330939293,
-0.007515698205679655,
-0.07468903809785843,
0.01720641367137432,
-0.009510648436844349,
0.14864802360534668,
0.13830016553401947,
0.2062399536371231,
-0.06995580345392227,
0.06706579029560089,
-0.03199863061308861,
-0.11711113899946213,
0.07805433124303818,
-0.07166967540979385,
0.004296483471989632,
0.05220668390393257,
-0.0538930743932724,
0.14611311256885529,
0.06082209199666977,
0.003751826472580433,
-0.01890469156205654,
0.16250212490558624,
-0.16876746714115143,
0.04684048146009445,
-0.0843876302242279,
0.1279323697090149,
0.04778100550174713,
-0.03293748199939728,
0.09026376903057098,
-0.07791304588317871,
-0.03329215198755264,
-0.0002585914626251906,
0.006090222392231226,
-0.038581836968660355,
0.06518552452325821,
0.04536600783467293,
0.02252393215894699,
-0.06704199314117432,
0.0445764996111393,
0.07239795476198196,
0.016518399119377136,
0.041721411049366,
0.015846284106373787,
-0.09952405095100403,
-0.09522253274917603,
0.04372299090027809,
0.26397231221199036,
-0.1863422393798828,
-0.09990737587213516,
0.004564397502690554,
-0.09345841407775879,
0.004960347898304462,
0.08620705455541611,
0.0809662714600563,
0.04341237619519234,
-0.03603934869170189,
-0.02565331570804119,
-0.11602527648210526,
0.08217493444681168,
-0.015696978196501732,
0.05509110167622566,
-0.16319575905799866,
0.06676459312438965,
-0.030968010425567627,
-0.008549565449357033,
-0.08279257267713547,
-0.010031647980213165,
-0.11571928858757019,
0.026098787784576416,
-0.10430167615413666,
-0.03189973905682564,
-0.041006896644830704,
-0.011233619414269924,
0.05850789323449135,
-0.011018243618309498,
-0.013110441155731678,
-0.01927962154150009,
-0.08805359154939651,
0.02887921780347824,
-0.0008198951254598796,
0.04547540098428726,
-0.05460818111896515,
-0.024217726662755013,
0.037278566509485245,
0.004562355112284422,
0.046250831335783005,
0.012032478116452694,
-0.0011190201621502638,
0.049139540642499924,
-0.14732354879379272,
0.009436994791030884,
0.06159417703747749,
-0.0016145178815349936,
0.0070913624949753284,
-0.028678715229034424,
0.005330502521246672,
0.09783722460269928,
0.018718764185905457,
0.04128317907452583,
-0.0048657008446753025,
-0.1091027706861496,
0.014511657878756523,
0.10307195782661438,
-0.14174701273441315,
-0.03145497664809227,
-0.052812907844781876,
0.01100962609052658,
-0.05524790287017822,
0.23351503908634186,
-0.11669892817735672,
0.04470064863562584,
-0.02692001312971115,
0.030550040304660797,
-0.05822846665978432,
-0.10757116973400116,
-0.12190251797437668,
-0.0954190194606781,
-0.042861051857471466,
0.007703589275479317,
0.2689315676689148,
0.1459355354309082,
-0.008143693208694458,
0.0415508970618248,
0.07256698608398438,
0.09993022680282593,
0.001325596240349114,
0.22187061607837677,
0.09407079964876175,
-0.011255222372710705,
-0.12900875508785248,
0.0802748054265976,
0.027718892320990562,
-0.10550516843795776,
0.0003671931044664234,
0.017833324149250984,
-0.07709381729364395,
0.05998256057500839,
0.04779348149895668,
-0.04618219658732414,
-0.11530262231826782,
-0.1887446641921997,
-0.1010153517127037,
0.01362328790128231,
-0.09494820982217789,
-0.00841664057224989,
0.17340072989463806,
-0.07381404936313629,
-0.020257510244846344,
-0.08453129231929779,
-0.042230453342199326,
-0.21403644979000092,
-0.1685105264186859,
-0.09951409697532654,
-0.07172851264476776,
0.054574232548475266,
-0.01444533746689558,
0.051937036216259,
0.0384058877825737,
0.03334033116698265,
-0.0690227821469307,
0.10118697583675385,
-0.11317354440689087,
0.006825347896665335,
-0.007538147736340761,
-0.042660877108573914,
0.007157159503549337,
-0.17031751573085785,
-0.023363124579191208,
-0.1397811770439148,
-0.04669688642024994,
-0.031707603484392166,
-0.04375086724758148,
0.0007692996296100318,
-0.003963754046708345,
-0.03139100596308708,
-0.009807240217924118,
-0.01006900705397129,
0.03744599595665932,
0.023235660046339035,
0.05043753236532211,
0.022183645516633987,
0.01541586872190237,
0.043549589812755585,
0.21836970746517181,
-0.03527946025133133,
-0.18426218628883362,
-0.12376350164413452,
0.24631790816783905,
0.03293769061565399,
0.11490416526794434,
-0.07057193666696548,
-0.01361043006181717,
0.07598087936639786,
0.31235218048095703,
0.2598150074481964,
-0.03414434567093849,
0.010121017694473267,
-0.03132476285099983,
-0.014958096668124199,
-0.0064048562198877335,
0.18490195274353027,
0.008828791789710522,
0.16826002299785614,
-0.0621221587061882,
0.059055350720882416,
-0.016177164390683174,
-0.07808512449264526,
-0.06689254939556122,
0.14256809651851654,
-0.036333873867988586,
-0.02151089534163475,
-0.01796986348927021,
0.08792226016521454,
-0.0589551106095314,
0.17949369549751282,
-0.09007178992033005,
-0.009130639024078846,
-0.04809116572141647,
0.053617071360349655,
0.11827872693538666,
-0.02074413187801838,
0.03285614401102066,
-0.03567332774400711,
-0.018393725156784058,
0.0029441264923661947,
-0.04050283133983612,
-0.07413910329341888,
-0.04345672205090523,
0.06311136484146118,
0.02551795169711113,
0.25671228766441345,
-0.009337767027318478,
0.05477561056613922,
0.07988451421260834,
-0.0020537625532597303,
-0.10351628065109253,
0.11267323791980743,
0.00224103475920856,
-0.029008302837610245,
0.12491703033447266,
-0.015443749725818634,
0.007564615458250046,
-0.01867114193737507,
-0.01239294558763504,
-0.15698960423469543,
0.14728498458862305,
-0.10142818093299866,
-0.08940913528203964,
-0.05584051460027695,
0.12545742094516754,
-0.032320525497198105,
0.16258437931537628,
0.05726946145296097,
-0.026426637545228004,
0.0021389273460954428,
-0.0331779383122921,
0.08067825436592102,
0.009919043630361557,
-0.09914126992225647,
-0.02203422784805298,
-0.17707498371601105,
-0.016973769292235374,
0.12876249849796295,
-0.02544221095740795,
-0.24601322412490845,
-0.07971391826868057,
-0.06824030727148056,
-0.04311496391892433,
-0.1386985182762146,
0.07398401945829391,
0.2028772532939911,
0.019287997856736183,
-0.01476763840764761,
-0.1369636058807373,
-0.021961720660328865,
0.019149890169501305,
-0.026857441291213036,
-0.10799262672662735
] |
null | null | null | # HeartRateVariability-rPPG
Welcome to the HeartRateVariability-rPPG repository! This machine learning model is designed to process video inputs to estimate Heart Rate Variability (HRV) metrics using the remote Photoplethysmography (rPPG) method. This innovative approach allows for non-contact measurement of heart rate variability by analyzing the subtle changes in skin color that occur with each heartbeat. Our model leverages the power of advanced signal processing and computer vision techniques to provide accurate HRV metrics, which are crucial for assessing stress, cardiovascular health, and overall well-being.
## Table of Contents
- [Project Description](#project-description)
- [Installation](#installation)
- [Usage](#usage)
- [Libraries and Resources](#libraries-and-resources)
- [How It Works](#how-it-works)
- [Contributing](#contributing)
- [License](#license)
- [Acknowledgments](#acknowledgments)
## Project Description
This project utilizes state-of-the-art algorithms and methodologies derived from key resources and libraries such as yarppg, HeartPy, OpenCV, and NeuroKit2 to extract, process, and analyze the video data for HRV metrics. The rPPG method applied here offers a unique advantage in remote health monitoring, fitness tracking, and psychological research by providing a non-invasive means of measuring heart rate variability through video analysis.
## Usage
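The project does not yet document a command-line tool or Python API, so the snippet below is only a hypothetical illustration of how a video-to-HRV call could look; `video_to_hrv` is a placeholder helper (a fuller sketch of it is given after the "How It Works" section), not a function exported by this repository.

```python
# Hypothetical usage: `video_to_hrv` is the illustrative helper sketched
# under "How It Works"; it is not an API shipped by this repository.
metrics = video_to_hrv("subject_recording.mp4")

print(f"Mean HR: {metrics['bpm']:.1f} bpm")
print(f"SDNN:    {metrics['sdnn']:.1f} ms")
print(f"RMSSD:   {metrics['rmssd']:.1f} ms")
```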
## Libraries and Resources
This project makes extensive use of the following libraries and resources:
- **[yarppg](https://github.com/SamProell/yarppg):** For insights and methodologies on implementing the rPPG technique.
- **[HeartPy](https://github.com/paulvangentcom/heartrate_analysis_python):** Used for heart rate signal processing and HRV metrics analysis.
- **[OpenCV](https://opencv.org/):** For video processing and computer vision operations.
- **[NeuroKit2](https://neurokit2.readthedocs.io/en/latest/):** An advanced tool for signal processing, including HRV metrics.
## How It Works
The model processes the input video to detect the subject's face and, specifically, the cheek region, where skin color changes are more pronounced and less affected by motion artifacts. It then applies signal processing techniques to extract the rPPG signal from the color variations over time. This signal is further processed to compute HRV metrics, which include but are not limited to SDNN, RMSSD, and frequency domain metrics.
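The minimal sketch below illustrates this pipeline with the libraries listed above (OpenCV for frame handling and face detection, HeartPy for peak detection and time-domain HRV metrics). It is an assumption-laden demonstration rather than this repository's exact implementation: the Haar-cascade detector, the cheek-ROI heuristic, and the 0.7–3.5 Hz band-pass are illustrative choices.

```python
# Illustrative rPPG pipeline (a sketch, not this repository's exact code).
# Assumptions: Haar-cascade face detection, a fixed cheek ROI inside the face
# box, green-channel averaging, and a 0.7-3.5 Hz band-pass before HRV analysis.
import cv2
import numpy as np
import heartpy as hp

def video_to_hrv(video_path: str) -> dict:
    cap = cv2.VideoCapture(video_path)
    fps = cap.get(cv2.CAP_PROP_FPS) or 30.0  # fall back if FPS metadata is missing
    face_detector = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
    )

    signal = []
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        faces = face_detector.detectMultiScale(gray, scaleFactor=1.3, minNeighbors=5)
        if len(faces) == 0:
            continue  # skip frames without a detected face
        x, y, w, h = faces[0]
        # Rough cheek region: central band in the lower half of the face box,
        # where pulsatile colour changes are strong and motion artifacts are low.
        roi = frame[y + h // 2 : y + (3 * h) // 4, x + w // 4 : x + (3 * w) // 4]
        # The green channel carries the strongest plethysmographic component.
        signal.append(float(roi[:, :, 1].mean()))
    cap.release()

    raw = np.asarray(signal)
    # Keep only the plausible heart-rate band (~0.7-3.5 Hz, i.e. 42-210 bpm).
    filtered = hp.filter_signal(raw, cutoff=[0.7, 3.5], sample_rate=fps,
                                order=3, filtertype="bandpass")
    # Peak detection and time-domain HRV metrics (SDNN, RMSSD, ...) via HeartPy.
    _, measures = hp.process(filtered, sample_rate=fps)
    return {"bpm": measures["bpm"],
            "sdnn": measures["sdnn"],
            "rmssd": measures["rmssd"]}
```

Frequency-domain metrics (LF, HF, LF/HF) could be computed on top of the detected peak series, for example with NeuroKit2's HRV functions, but they generally require longer recordings to be reliable.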
## Contributing
We welcome contributions from the community. If you'd like to contribute, please fork the repository and use a feature branch. Pull requests are warmly welcome.
## License
This project is licensed under the MIT License - see the [LICENSE.md](LICENSE) file for details.
## Acknowledgments
This project would not have been possible without the foundational work and insights provided by the following resources and libraries:
- The developers and contributors of **yarppg**, **HeartPy**, **OpenCV**, and **NeuroKit2** for their invaluable libraries and tools that made this project feasible.
- The academic and research community for advancing the field of non-contact HRV measurement through rPPG and related technologies.
Please note that this project is for educational and research purposes only and may not be suitable for clinical use. | {} | null | PulsePals/VideoToHRV | [
"region:us"
] | 2024-02-14T14:36:57+00:00 | [] | [] | TAGS
#region-us
| # HeartRateVariability-rPPG
Welcome to the HeartRateVariability-rPPG repository! This machine learning model is designed to process video inputs to estimate Heart Rate Variability (HRV) metrics using the remote Photoplethysmography (rPPG) method. This innovative approach allows for non-contact measurement of heart rate variability by analyzing the subtle changes in skin color that occur with each heartbeat. Our model leverages the power of advanced signal processing and computer vision techniques to provide accurate HRV metrics, which are crucial for assessing stress, cardiovascular health, and overall well-being.
## Table of Contents
- Project Description
- Installation
- Usage
- Libraries and Resources
- How It Works
- Contributing
- License
- Acknowledgments
## Project Description
This project utilizes state-of-the-art algorithms and methodologies derived from key resources and libraries such as yarppg, HeartPy, OpenCV, and NeuroKit2 to extract, process, and analyze the video data for HRV metrics. The rPPG method applied here offers a unique advantage in remote health monitoring, fitness tracking, and psychological research by providing a non-invasive means of measuring heart rate variability through video analysis.
## Usage
## Libraries and Resources
This project makes extensive use of the following libraries and resources:
- yarppg: For insights and methodologies on implementing the rPPG technique.
- HeartPy: Used for heart rate signal processing and HRV metrics analysis.
- OpenCV: For video processing and computer vision operations.
- NeuroKit2: An advanced tool for signal processing, including HRV metrics.
## How It Works
The model processes the input video to detect the subject's face and, specifically, the cheek region, where skin color changes are more pronounced and less affected by motion artifacts. It then applies signal processing techniques to extract the rPPG signal from the color variations over time. This signal is further processed to compute HRV metrics, which include but are not limited to SDNN, RMSSD, and frequency domain metrics.
## Contributing
We welcome contributions from the community. If you'd like to contribute, please fork the repository and use a feature branch. Pull requests are warmly welcome.
## License
This project is licensed under the MIT License - see the URL file for details.
## Acknowledgments
This project would not have been possible without the foundational work and insights provided by the following resources and libraries:
- The developers and contributors of yarppg, HeartPy, OpenCV, and NeuroKit2 for their invaluable libraries and tools that made this project feasible.
- The academic and research community for advancing the field of non-contact HRV measurement through rPPG and related technologies.
Please note that this project is for educational and research purposes only and may not be suitable for clinical use. | [
"# HeartRateVariability-rPPG\n\nWelcome to the HeartRateVariability-rPPG repository! This machine learning model is designed to process video inputs to estimate Heart Rate Variability (HRV) metrics using the remote Photoplethysmography (rPPG) method. This innovative approach allows for non-contact measurement of heart rate variability by analyzing the subtle changes in skin color that occur with each heartbeat. Our model leverages the power of advanced signal processing and computer vision techniques to provide accurate HRV metrics, which are crucial for assessing stress, cardiovascular health, and overall well-being.",
"## Table of Contents\n- Project Description\n- Installation\n- Usage\n- Libraries and Resources\n- How It Works\n- Contributing\n- License\n- Acknowledgments",
"## Project Description\n\nThis project utilizes state-of-the-art algorithms and methodologies derived from key resources and libraries such as yarppg, HeartPy, OpenCV, and NeuroKit2 to extract, process, and analyze the video data for HRV metrics. The rPPG method applied here offers a unique advantage in remote health monitoring, fitness tracking, and psychological research by providing a non-invasive means of measuring heart rate variability through video analysis.",
"## Usage",
"## Libraries and Resources\n\nThis project makes extensive use of the following libraries and resources:\n\n- yarppg: For insights and methodologies on implementing the rPPG technique.\n- HeartPy: Used for heart rate signal processing and HRV metrics analysis.\n- OpenCV: For video processing and computer vision operations.\n- NeuroKit2: An advanced tool for signal processing, including HRV metrics.",
"## How It Works\n\nThe model processes the input video to detect the subject's face and, specifically, the cheek region, where skin color changes are more pronounced and less affected by motion artifacts. It then applies signal processing techniques to extract the rPPG signal from the color variations over time. This signal is further processed to compute HRV metrics, which include but are not limited to SDNN, RMSSD, and frequency domain metrics.",
"## Contributing\n\nWe welcome contributions from the community. If you'd like to contribute, please fork the repository and use a feature branch. Pull requests are warmly welcome.",
"## License\n\nThis project is licensed under the MIT License - see the URL file for details.",
"## Acknowledgments\n\nThis project would not have been possible without the foundational work and insights provided by the following resources and libraries:\n\n- The developers and contributors of yarppg, HeartPy, OpenCV, and NeuroKit2 for their invaluable libraries and tools that made this project feasible.\n- The academic and research community for advancing the field of non-contact HRV measurement through rPPG and related technologies.\n\nPlease note that this project is for educational and research purposes only and may not be suitable for clinical use."
] | [
"TAGS\n#region-us \n",
"# HeartRateVariability-rPPG\n\nWelcome to the HeartRateVariability-rPPG repository! This machine learning model is designed to process video inputs to estimate Heart Rate Variability (HRV) metrics using the remote Photoplethysmography (rPPG) method. This innovative approach allows for non-contact measurement of heart rate variability by analyzing the subtle changes in skin color that occur with each heartbeat. Our model leverages the power of advanced signal processing and computer vision techniques to provide accurate HRV metrics, which are crucial for assessing stress, cardiovascular health, and overall well-being.",
"## Table of Contents\n- Project Description\n- Installation\n- Usage\n- Libraries and Resources\n- How It Works\n- Contributing\n- License\n- Acknowledgments",
"## Project Description\n\nThis project utilizes state-of-the-art algorithms and methodologies derived from key resources and libraries such as yarppg, HeartPy, OpenCV, and NeuroKit2 to extract, process, and analyze the video data for HRV metrics. The rPPG method applied here offers a unique advantage in remote health monitoring, fitness tracking, and psychological research by providing a non-invasive means of measuring heart rate variability through video analysis.",
"## Usage",
"## Libraries and Resources\n\nThis project makes extensive use of the following libraries and resources:\n\n- yarppg: For insights and methodologies on implementing the rPPG technique.\n- HeartPy: Used for heart rate signal processing and HRV metrics analysis.\n- OpenCV: For video processing and computer vision operations.\n- NeuroKit2: An advanced tool for signal processing, including HRV metrics.",
"## How It Works\n\nThe model processes the input video to detect the subject's face and, specifically, the cheek region, where skin color changes are more pronounced and less affected by motion artifacts. It then applies signal processing techniques to extract the rPPG signal from the color variations over time. This signal is further processed to compute HRV metrics, which include but are not limited to SDNN, RMSSD, and frequency domain metrics.",
"## Contributing\n\nWe welcome contributions from the community. If you'd like to contribute, please fork the repository and use a feature branch. Pull requests are warmly welcome.",
"## License\n\nThis project is licensed under the MIT License - see the URL file for details.",
"## Acknowledgments\n\nThis project would not have been possible without the foundational work and insights provided by the following resources and libraries:\n\n- The developers and contributors of yarppg, HeartPy, OpenCV, and NeuroKit2 for their invaluable libraries and tools that made this project feasible.\n- The academic and research community for advancing the field of non-contact HRV measurement through rPPG and related technologies.\n\nPlease note that this project is for educational and research purposes only and may not be suitable for clinical use."
] | [
6,
142,
36,
109,
3,
96,
105,
42,
19,
122
] | [
"passage: TAGS\n#region-us \n# HeartRateVariability-rPPG\n\nWelcome to the HeartRateVariability-rPPG repository! This machine learning model is designed to process video inputs to estimate Heart Rate Variability (HRV) metrics using the remote Photoplethysmography (rPPG) method. This innovative approach allows for non-contact measurement of heart rate variability by analyzing the subtle changes in skin color that occur with each heartbeat. Our model leverages the power of advanced signal processing and computer vision techniques to provide accurate HRV metrics, which are crucial for assessing stress, cardiovascular health, and overall well-being.## Table of Contents\n- Project Description\n- Installation\n- Usage\n- Libraries and Resources\n- How It Works\n- Contributing\n- License\n- Acknowledgments## Project Description\n\nThis project utilizes state-of-the-art algorithms and methodologies derived from key resources and libraries such as yarppg, HeartPy, OpenCV, and NeuroKit2 to extract, process, and analyze the video data for HRV metrics. The rPPG method applied here offers a unique advantage in remote health monitoring, fitness tracking, and psychological research by providing a non-invasive means of measuring heart rate variability through video analysis.## Usage## Libraries and Resources\n\nThis project makes extensive use of the following libraries and resources:\n\n- yarppg: For insights and methodologies on implementing the rPPG technique.\n- HeartPy: Used for heart rate signal processing and HRV metrics analysis.\n- OpenCV: For video processing and computer vision operations.\n- NeuroKit2: An advanced tool for signal processing, including HRV metrics.## How It Works\n\nThe model processes the input video to detect the subject's face and, specifically, the cheek region, where skin color changes are more pronounced and less affected by motion artifacts. It then applies signal processing techniques to extract the rPPG signal from the color variations over time. This signal is further processed to compute HRV metrics, which include but are not limited to SDNN, RMSSD, and frequency domain metrics."
] | [
-0.05353982746601105,
0.12268421053886414,
-0.005286328960210085,
-0.0032497532665729523,
0.04594491422176361,
0.03680294007062912,
-0.05312899127602577,
0.13569556176662445,
-0.05753284692764282,
0.11044321954250336,
-0.10859429091215134,
-0.0035910154692828655,
0.1133987158536911,
0.04632122069597244,
0.0992962196469307,
-0.2209325134754181,
0.020129065960645676,
-0.08720245212316513,
0.05678243935108185,
0.053031738847494125,
0.03659430146217346,
-0.08379926532506943,
0.0512169785797596,
0.027104051783680916,
0.014020525850355625,
-0.059407446533441544,
-0.0924091637134552,
-0.02079436369240284,
0.07118921726942062,
0.02728222869336605,
-0.03367256373167038,
0.006291356403380632,
0.06062108650803566,
-0.2804124057292938,
0.003896153997629881,
0.07379113882780075,
0.015135618858039379,
0.06402841210365295,
0.10596133023500443,
-0.056948792189359665,
0.18735471367835999,
-0.13405722379684448,
0.0938999131321907,
0.018125955015420914,
-0.06479743123054504,
-0.20808018743991852,
-0.08741464465856552,
0.011210724711418152,
0.022073037922382355,
0.062236443161964417,
-0.00903767067939043,
0.12781697511672974,
-0.08293768763542175,
-0.04293479397892952,
0.16022644937038422,
-0.12285999208688736,
-0.011520634405314922,
-0.09999585896730423,
0.03773343935608864,
0.13523927330970764,
-0.1846669614315033,
0.03446197509765625,
0.007316602393984795,
0.04554024338722229,
0.08164574205875397,
-0.022567376494407654,
0.047140076756477356,
0.004276625346392393,
-0.08277378976345062,
-0.04475339129567146,
0.043320100754499435,
-0.05778186768293381,
-0.03460068255662918,
-0.11524474620819092,
-0.01908738911151886,
0.07373204827308655,
0.005902642384171486,
0.013080589473247528,
0.04479492828249931,
0.007833168841898441,
0.028369281440973282,
-0.06245621666312218,
-0.10805442184209824,
-0.08168423920869827,
-0.02602466754615307,
0.018405184149742126,
0.06642473489046097,
0.04787139594554901,
0.06510420143604279,
0.12520664930343628,
0.00455117505043745,
-0.029433121904730797,
-0.06396736204624176,
-0.08118904381990433,
-0.09385202080011368,
-0.015667734667658806,
-0.050480179488658905,
-0.06525459885597229,
0.015138857997953892,
0.06645110994577408,
-0.09130726009607315,
0.04040466248989105,
-0.07530676573514938,
-0.029817305505275726,
0.15664534270763397,
0.017775045707821846,
-0.13352341949939728,
0.05144105106592178,
-0.06312574446201324,
-0.025433728471398354,
-0.02123681828379631,
-0.010071718133985996,
0.08777040988206863,
0.060043513774871826,
0.0022986335679888725,
0.02456861548125744,
-0.011236174032092094,
-0.03192467242479324,
-0.07437129318714142,
-0.05366375669836998,
0.2132001519203186,
-0.08338344097137451,
0.00802113488316536,
0.03364783152937889,
0.018950071185827255,
0.020837143063545227,
0.06096931919455528,
-0.034563932567834854,
-0.07151199132204056,
0.10148840397596359,
-0.019574617967009544,
-0.011440820060670376,
-0.09319997578859329,
-0.038765646517276764,
0.06422758102416992,
-0.06996966898441315,
-0.03312959149479866,
-0.008921702392399311,
-0.04053296148777008,
-0.022289570420980453,
0.027867116034030914,
-0.10980848968029022,
0.0786738321185112,
0.01890198327600956,
0.06636852771043777,
0.013047261163592339,
0.04282229021191597,
0.0800003856420517,
-0.02730398252606392,
0.029080722481012344,
-0.13574226200580597,
0.1109037697315216,
0.08716556429862976,
0.03573715686798096,
-0.07674875110387802,
0.018657149747014046,
0.008937888778746128,
0.09307896345853806,
-0.11558756977319717,
-0.1420036256313324,
-0.07484125345945358,
0.03226704150438309,
-0.05046013742685318,
0.023299625143408775,
-0.030276041477918625,
0.0007123157847672701,
-0.1550348699092865,
-0.036708977073431015,
0.17201553285121918,
-0.09702834486961365,
0.04136740788817406,
0.05983515828847885,
-0.03642622008919716,
0.017191829159855843,
0.03362051025032997,
0.11688368022441864,
0.07441733777523041,
-0.09719515591859818,
-0.019342496991157532,
-0.02314012311398983,
-0.03759031370282173,
0.1300765573978424,
0.056453071534633636,
-0.08079873770475388,
0.07543782144784927,
0.05044586583971977,
-0.054692938923835754,
-0.10385647416114807,
-0.024261673912405968,
-0.009791549295186996,
0.006243763957172632,
-0.0732509195804596,
0.05086000636219978,
-0.03895071893930435,
0.013829510658979416,
0.023880157619714737,
-0.03997832536697388,
0.02257326804101467,
0.08367058634757996,
0.005096998997032642,
0.06917937844991684,
-0.03838007152080536,
-0.06999433040618896,
-0.04636125639081001,
0.005136406049132347,
-0.16143858432769775,
0.09615979343652725,
0.04865168407559395,
-0.11876725405454636,
0.06629443913698196,
0.036221183836460114,
0.03215505927801132,
0.054451923817396164,
-0.06952174007892609,
0.03633390739560127,
-0.024466661736369133,
0.022237814962863922,
0.027907002717256546,
-0.19033753871917725,
0.05545986443758011,
-0.0876477062702179,
0.010046275332570076,
0.07428907603025436,
-0.060035765171051025,
0.0441494844853878,
0.1636040210723877,
0.0767563208937645,
-0.05350837856531143,
0.09072649478912354,
0.01878594234585762,
0.06678444147109985,
0.0007390980608761311,
-0.022857798263430595,
-0.023680778220295906,
0.0555918850004673,
0.08643882721662521,
-0.05426621064543724,
-0.15657459199428558,
0.047984421253204346,
0.11665911227464676,
-0.06137903034687042,
-0.010817570611834526,
0.014028118923306465,
-0.0369039848446846,
-0.11264832317829132,
-0.02765643037855625,
0.14772142469882965,
0.046578940004110336,
0.03658835217356682,
-0.039029862731695175,
-0.007562297862023115,
0.015618344768881798,
-0.04137444496154785,
0.009267534129321575,
0.02761857770383358,
0.06662389636039734,
-0.03044678457081318,
0.01932612434029579,
0.023104557767510414,
0.06174298748373985,
0.1888582408428192,
-0.001927923527546227,
-0.14860811829566956,
0.016398852691054344,
-0.036867406219244,
0.009671301580965519,
0.08315589278936386,
-0.006769656203687191,
0.051196154206991196,
0.06825198978185654,
0.042532727122306824,
0.035267096012830734,
-0.08092725276947021,
0.062305331230163574,
0.03519574552774429,
0.04474176466464996,
-0.03034999407827854,
0.02364649809896946,
0.014951287768781185,
0.0570153184235096,
0.08677751570940018,
0.20839020609855652,
-0.03831150010228157,
-0.0128316730260849,
-0.08760132640600204,
0.10339059680700302,
-0.18228352069854736,
-0.12313895672559738,
-0.09249743819236755,
-0.005971379112452269,
0.051617156714200974,
-0.005062716547399759,
0.029007628560066223,
-0.09158878773450851,
-0.059654295444488525,
-0.058022063225507736,
0.050264909863471985,
0.012459167279303074,
-0.07486371695995331,
0.021823011338710785,
0.03369946777820587,
-0.0013174881460145116,
-0.03467268869280815,
0.045382268726825714,
-0.056651074439287186,
-0.09314456582069397,
0.07824569940567017,
0.020139167085289955,
0.06296131759881973,
0.05924932286143303,
0.0925278514623642,
-0.05591612681746483,
0.00009445946488995105,
0.07787822186946869,
-0.054576948285102844,
0.10472716391086578,
-0.010094878263771534,
-0.05779806897044182,
0.07213596999645233,
0.09069052338600159,
0.0465070866048336,
0.005874627269804478,
-0.014571607112884521,
0.09143422544002533,
-0.022202929481863976,
-0.23665326833724976,
-0.05086283013224602,
-0.05050951614975929,
-0.06901564449071884,
-0.0006420655990950763,
0.025845298543572426,
0.015121646225452423,
-0.005517866462469101,
-0.02140597440302372,
-0.033824749290943146,
0.0007575903437100351,
0.0236461590975523,
0.1072506234049797,
-0.022936299443244934,
0.016283726319670677,
-0.05094816908240318,
0.04836927726864815,
0.15377940237522125,
0.07782645523548126,
0.31100884079933167,
-0.05840747803449631,
0.10658212751150131,
0.10831449925899506,
0.012541980482637882,
0.04875916615128517,
0.029636915773153305,
-0.06734217703342438,
0.04392683506011963,
-0.041223008185625076,
-0.05086296424269676,
-0.015456055290997028,
0.09465382993221283,
0.025506125763058662,
-0.14917100965976715,
0.025883452966809273,
-0.05789041146636009,
0.08939415961503983,
0.14941774308681488,
0.010432994924485683,
0.05697936564683914,
-0.012091393582522869,
0.02798924781382084,
-0.014300435781478882,
-0.0984669104218483,
-0.0027644429355859756,
0.038453564047813416,
-0.12182892858982086,
0.04563537985086441,
0.006624406669288874,
0.046565476804971695,
-0.14954973757266998,
-0.03390609100461006,
0.003983920440077782,
0.034625984728336334,
-0.03185918554663658,
0.04109784588217735,
0.022380631417036057,
0.06557243317365646,
0.031348004937171936,
0.08605019003152847,
-0.04015960916876793,
0.007674862165004015,
0.025509268045425415,
0.0007199559477157891,
0.1304052323102951,
0.043181564658880234,
-0.0879223495721817,
-0.020999092608690262,
-0.0800999104976654,
0.040607139468193054,
0.09235312789678574,
-0.10069721192121506,
0.014672552235424519,
-0.0040605757385492325,
-0.006636411417275667,
-0.08523983508348465,
-0.10558950901031494,
-0.077838234603405,
-0.20158661901950836,
0.09014113247394562,
-0.15060603618621826,
-0.00571897067129612,
-0.06500033289194107,
-0.028508532792329788,
-0.006272723898291588,
0.07332958281040192,
-0.15126214921474457,
-0.08429945260286331,
-0.1208997368812561,
-0.07474921643733978,
0.15767250955104828,
-0.032693441957235336,
0.06687045842409134,
-0.001440734602510929,
0.10291723161935806,
-0.0305547583848238,
-0.09827403724193573,
-0.005634627304971218,
-0.015718189999461174,
-0.20235666632652283,
-0.07839798182249069,
0.11804191023111343,
0.14335250854492188,
0.03015488013625145,
-0.04498457536101341,
0.08010870218276978,
-0.011812550947070122,
-0.06563575565814972,
0.10200853645801544,
0.3163677752017975,
-0.06546293944120407,
-0.016079647466540337,
-0.08861715346574783,
-0.15111151337623596,
-0.022687075659632683,
-0.08960156887769699,
0.0036014034412801266,
0.16072556376457214,
-0.035841163247823715,
0.2786208987236023,
0.2763323187828064,
-0.11387213319540024,
-0.21044336259365082,
-0.04782690852880478,
0.025454454123973846,
-0.042307306081056595,
0.13285404443740845,
-0.3001188039779663,
0.00884002260863781,
0.0162216629832983,
-0.026189446449279785,
0.017361653968691826,
-0.13154801726341248,
-0.07271454483270645,
-0.01610201969742775,
0.01947922818362713,
-0.06180846691131592,
0.010196011513471603,
-0.06025658920407295,
0.029699964448809624,
-0.1540536731481552,
0.002443223726004362,
-0.05403691157698631,
0.06710450351238251,
-0.08546540141105652,
0.0902189314365387,
0.031085265800356865,
-0.016070181503891945,
0.07435326278209686,
-0.08599161356687546,
0.011513881385326385,
-0.018517056480050087,
0.0851728543639183,
0.007611454930156469,
-0.04779095947742462,
0.007285038474947214,
-0.012136072851717472,
0.018614305183291435,
0.0020585262682288885,
-0.057839326560497284,
-0.09042598307132721,
-0.09210140258073807,
0.0006056044949218631,
-0.07318698614835739,
-0.08440599590539932,
0.0916723757982254,
0.07965816557407379,
-0.0659436285495758,
-0.14110511541366577,
-0.0818856731057167,
-0.11631948500871658,
0.18195314705371857,
0.07730566710233688,
0.018593942746520042,
-0.04585716128349304,
-0.06967311352491379,
-0.06404166668653488,
0.08468545228242874,
-0.15355344116687775,
0.06204361468553543,
0.06860130280256271,
0.03406277298927307,
0.06858105212450027,
-0.045842256397008896,
-0.1828208863735199,
0.03321344405412674,
0.07371501624584198,
-0.016663808375597,
-0.08205945789813995,
0.0453995019197464,
0.10076408833265305,
-0.08054033666849136,
-0.0405813604593277,
0.08277194947004318,
-0.010374543257057667,
-0.06109672412276268,
-0.04213608428835869,
0.0922834649682045,
0.038472529500722885,
0.09211926907300949,
0.009423725306987762,
-0.010143176652491093,
-0.06501421332359314,
0.15299244225025177,
0.0444321446120739,
-0.049945566803216934,
0.01796131767332554,
0.04809595271945,
-0.11255217343568802,
-0.0663832575082779,
0.09706883132457733,
0.05894804373383522,
0.018036654219031334,
-0.02510315366089344,
0.09506183117628098,
-0.06351765245199203,
0.023432999849319458,
0.10063359886407852,
-0.026323307305574417,
0.02340090274810791,
-0.06465979665517807,
0.03160122036933899,
-0.07277506589889526,
0.06806905567646027,
0.005574546754360199,
0.033531226217746735,
0.02810249850153923,
-0.09504841268062592,
0.02324840985238552,
-0.050323642790317535,
-0.035702597349882126,
-0.0675983652472496,
-0.02774055488407612,
-0.021734392270445824,
-0.2410622388124466,
-0.023463387042284012,
-0.0638154074549675,
-0.038797151297330856,
-0.0002462719567120075,
0.0396745465695858,
0.034247785806655884,
0.0447806790471077,
-0.025439631193876266,
-0.03921074420213699,
-0.049677830189466476,
0.0367012619972229,
-0.17936000227928162,
0.018833668902516365,
0.1063191145658493,
-0.07300169765949249,
0.06113646551966667,
-0.042178548872470856,
-0.0014959858963266015,
0.03565078601241112,
-0.12643063068389893,
-0.04880451038479805,
0.009938863106071949,
0.005370452534407377,
0.017754847183823586,
-0.0824044868350029,
-0.02980968914926052,
-0.02877596952021122,
-0.06719257682561874,
-0.022198805585503578,
0.1286536306142807,
-0.03628120198845863,
0.11686192452907562,
-0.010294130071997643,
-0.012073560617864132,
-0.019412217661738396,
0.011428164318203926,
0.09750065952539444,
0.015600471757352352,
0.005605485290288925,
-0.02256881445646286,
0.02811991237103939,
-0.07551595568656921,
-0.033482909202575684,
0.01967969909310341,
0.053650978952646255,
-0.024586165323853493,
-0.03475639596581459,
0.022979144006967545,
0.007073130924254656,
0.20628000795841217,
-0.08206745982170105,
-0.07973410189151764,
0.0233710128813982,
0.005038498900830746,
-0.1375269889831543,
0.02440212108194828,
-0.009238193742930889,
-0.02198854647576809,
-0.02258160151541233,
-0.0918596014380455,
-0.023183852434158325,
-0.11228294670581818,
-0.027137981727719307,
0.014968975447118282,
0.09215406328439713,
0.21006295084953308,
-0.10185074061155319,
0.007879840210080147,
-0.04431181028485298,
-0.10734677314758301,
0.03910339996218681,
-0.060444753617048264,
0.01108002569526434,
-0.011892106384038925,
-0.021708324551582336,
0.06396593898534775,
-0.14387427270412445,
0.08988648653030396,
-0.016406619921326637,
-0.01911155879497528,
0.014923528768122196,
-0.09970678389072418,
-0.035660065710544586,
-0.0810382068157196,
0.013876263052225113,
-0.04521365463733673,
0.1058412566781044,
0.014596497640013695,
0.05488128960132599,
0.01449534110724926,
0.13891257345676422,
-0.0970359742641449,
-0.06508920341730118,
0.013426094315946102,
0.06396985799074173,
-0.011256534606218338,
0.12724076211452484,
0.10452534258365631,
0.08113492280244827,
0.04343796521425247,
0.03920865058898926,
0.03783676400780678,
-0.036773454397916794,
0.0032562424894422293,
-0.06129911541938782,
-0.032959844917058945,
0.07628712803125381,
-0.047971125692129135,
-0.06413515657186508,
0.1672871708869934,
0.09454075992107391,
-0.02987831085920334,
0.015681898221373558,
0.22554752230644226,
-0.0394706092774868,
-0.017570484429597855,
-0.18131327629089355,
0.08074883371591568,
0.005818990059196949,
-0.0081558832898736,
0.04784282669425011,
-0.11083631217479706,
0.03149046748876572,
0.13302600383758545,
0.06565309315919876,
-0.014640859328210354,
-0.007739401888102293,
0.004541358444839716,
0.021879497915506363,
-0.023585602641105652,
0.060315195471048355,
-0.009347822517156601,
0.17538456618785858,
-0.05464975908398628,
0.07412736862897873,
-0.04189391806721687,
-0.07649920135736465,
-0.037138111889362335,
0.04975799098610878,
-0.02585315704345703,
-0.01517378631979227,
-0.08061203360557556,
0.04790865629911423,
-0.01360295433551073,
-0.2842071056365967,
0.12403859943151474,
-0.0007053185836412013,
-0.04969640448689461,
0.03251006826758385,
0.060221217572689056,
0.0033672614954411983,
-0.05468722805380821,
0.07565796375274658,
-0.0020207595080137253,
0.2689889967441559,
0.045551493763923645,
-0.002076897770166397,
-0.028217922896146774,
0.0681011900305748,
-0.11077169328927994,
0.16305787861347198,
0.03755761682987213,
-0.02753797546029091,
0.003694012761116028,
0.04511033371090889,
-0.0965242013335228,
0.08889862149953842,
0.007405819837003946,
-0.1415293663740158,
-0.0301919337362051,
0.21517835557460785,
0.019862428307533264,
0.05457230284810066,
0.09261316061019897,
0.0748371034860611,
0.02827637828886509,
-0.09486845880746841,
-0.010264404118061066,
-0.027657832950353622,
0.10622354596853256,
-0.07735982537269592,
0.09099049121141434,
0.0673753172159195,
0.02761078253388405,
0.04343404620885849,
-0.05166200175881386,
0.06479157507419586,
0.02197320945560932,
0.08723572641611099,
-0.028383472934365273,
-0.06742855906486511,
0.025852620601654053,
0.10897944867610931,
0.12540686130523682,
-0.1479390561580658,
-0.10899201780557632,
0.025752682238817215,
-0.015330926515161991,
0.05392866209149361,
0.07818471640348434,
0.012838644906878471,
0.00410864083096385,
-0.040419355034828186,
-0.11026809364557266,
0.052775245159864426,
0.012545071542263031,
-0.0655038133263588,
0.025307370349764824
] |
null | null | transformers | T5-3B model fine-tuned on the augmented Spider dataset proposed in the paper ["Improving Generalization in Semantic Parsing by Increasing Natural Language Variation"](https://arxiv.org/abs/2402.08666).
See more info [here](https://github.com/saparina/Text2SQL-NLVariation). | {} | text2text-generation | irisaparina/t5-3b-spider-nlvariation | [
"transformers",
"pytorch",
"t5",
"text2text-generation",
"arxiv:2402.08666",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | 2024-02-14T14:37:55+00:00 | [
"2402.08666"
] | [] | TAGS
#transformers #pytorch #t5 #text2text-generation #arxiv-2402.08666 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
| T5-3B model fine-tuned on the augmented Spider dataset proposed in the paper "Improving Generalization in Semantic Parsing by Increasing Natural Language Variation".
See more info here. | [] | [
"TAGS\n#transformers #pytorch #t5 #text2text-generation #arxiv-2402.08666 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n"
] | [
56
] | [
"passage: TAGS\n#transformers #pytorch #t5 #text2text-generation #arxiv-2402.08666 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n"
] | [
-0.03488323837518692,
0.023126889020204544,
-0.005981678608804941,
0.028553316369652748,
0.18102042376995087,
0.019610431045293808,
0.11544577032327652,
0.13621874153614044,
-0.04268293082714081,
-0.008546575903892517,
0.16626884043216705,
0.2124917060136795,
0.0032226790208369493,
0.06584501266479492,
-0.10048221051692963,
-0.23609256744384766,
0.02803168259561062,
0.07699702680110931,
0.015699662268161774,
0.1363850086927414,
0.07895831018686295,
-0.06281246989965439,
0.10601846873760223,
-0.03277762979269028,
-0.19284938275814056,
0.04823950305581093,
0.0759907215833664,
-0.12989868223667145,
0.13058911263942719,
0.057675961405038834,
0.12637151777744293,
0.03202763944864273,
-0.03605138137936592,
-0.10409647971391678,
0.021203676238656044,
0.033704835921525955,
-0.06451057642698288,
0.07998177409172058,
0.12948603928089142,
-0.07875394076108932,
0.105050228536129,
0.06988879293203354,
-0.029119884595274925,
0.06584786623716354,
-0.15254051983356476,
-0.018001984804868698,
-0.023814380168914795,
0.03730681911110878,
0.05449667200446129,
0.11565817147493362,
0.006772415246814489,
0.12980149686336517,
-0.06018964573740959,
0.12886331975460052,
0.18634964525699615,
-0.32122617959976196,
0.002257309388369322,
0.04348282143473625,
0.031511422246694565,
0.061855461448431015,
-0.022417305037379265,
0.03157487511634827,
0.048918239772319794,
0.03634991496801376,
0.019798824563622475,
-0.07455303519964218,
-0.1766558587551117,
0.048192963004112244,
-0.09106095880270004,
-0.06550178676843643,
0.22739820182323456,
-0.05915948376059532,
0.059765469282865524,
0.022915633395314217,
-0.1380602866411209,
-0.09242946654558182,
0.01144214067608118,
-0.008583607152104378,
-0.05470551550388336,
0.04824269562959671,
0.03272302448749542,
-0.044531311839818954,
-0.1449718326330185,
-0.01567097380757332,
-0.19500204920768738,
0.1353185623884201,
-0.008205907419323921,
0.06183527410030365,
-0.21710434556007385,
0.09474275261163712,
0.041190724819898605,
-0.09994293749332428,
0.0701305940747261,
-0.07862021028995514,
0.03831906616687775,
0.004136845003813505,
-0.06198529154062271,
-0.14305716753005981,
0.04150538891553879,
0.0843990221619606,
-0.013625956140458584,
0.0264403335750103,
-0.06731413304805756,
0.07217515259981155,
-0.009635492227971554,
0.07691345363855362,
-0.015548467636108398,
-0.015396049246191978,
0.04920589178800583,
-0.10239700973033905,
-0.0016395181883126497,
-0.07106328010559082,
-0.16522040963172913,
-0.0923801138997078,
0.1004025936126709,
0.07702528685331345,
0.018648972734808922,
0.0821513682603836,
-0.03348972648382187,
-0.03640962764620781,
0.027997443452477455,
-0.08427929878234863,
-0.022775502875447273,
-0.003880684031173587,
0.0004985451814718544,
0.1380399763584137,
0.027196846902370453,
-0.0012150637339800596,
-0.16925367712974548,
0.06548163294792175,
-0.09324919432401657,
-0.007658312097191811,
-0.028001615777611732,
-0.08690807223320007,
0.024706458672881126,
-0.11238520592451096,
0.003413876984268427,
-0.17914679646492004,
-0.1353474259376526,
0.012986977584660053,
0.007486438844352961,
-0.030673464760184288,
-0.04342986270785332,
-0.035425782203674316,
-0.0532485730946064,
0.07324612885713577,
-0.05995907261967659,
0.01580965332686901,
-0.03430836275219917,
0.10869800299406052,
-0.02347479946911335,
0.06031274423003197,
-0.11818661540746689,
0.07571303099393845,
-0.11520886421203613,
-0.019853321835398674,
-0.06827656179666519,
0.03367646411061287,
0.030202845111489296,
0.09976854920387268,
-0.052734509110450745,
-0.0455288402736187,
-0.06547661870718002,
0.0279878918081522,
0.011330061592161655,
0.19818848371505737,
-0.08042629808187485,
-0.09734280407428741,
0.18989750742912292,
-0.07949251681566238,
-0.14492276310920715,
0.08371467143297195,
0.018514089286327362,
0.024781906977295876,
0.06873687356710434,
0.15822191536426544,
0.01984333246946335,
-0.015785686671733856,
0.07354433089494705,
0.09706702083349228,
-0.11051934212446213,
-0.12361256033182144,
0.009364213794469833,
-0.016805389896035194,
-0.1293111890554428,
0.03765486925840378,
0.08755496144294739,
0.0654573068022728,
-0.05871496722102165,
-0.02692914940416813,
-0.05444343015551567,
-0.005607043858617544,
0.09500250965356827,
0.007588237524032593,
0.12442859262228012,
-0.05428202822804451,
-0.02525407075881958,
-0.018143687397241592,
-0.01838766224682331,
-0.015249173156917095,
0.04408123344182968,
-0.028597628697752953,
0.12123580276966095,
-0.0645032450556755,
0.03902249038219452,
-0.2002503126859665,
-0.05928943678736687,
-0.01678549498319626,
0.15454573929309845,
-0.007247289177030325,
0.08080949634313583,
0.05465013533830643,
-0.03313233330845833,
-0.013577233068645,
-0.02445007674396038,
0.13417965173721313,
-0.010457771830260754,
-0.08406493067741394,
-0.07168205082416534,
0.05313099920749664,
-0.046997543424367905,
-0.027027852833271027,
-0.10878720879554749,
0.0075691561214625835,
0.01531744934618473,
0.11400003731250763,
0.012810398824512959,
0.05924560874700546,
-0.024886412546038628,
0.027782360091805458,
-0.0954408124089241,
0.012125581502914429,
0.11165407299995422,
-0.0012706458801403642,
-0.048600368201732635,
0.21353735029697418,
-0.17989076673984528,
0.22554393112659454,
0.1790531724691391,
-0.2964553236961365,
-0.012183919548988342,
-0.05761776119470596,
-0.022723054513335228,
-0.00252151838503778,
0.07804711163043976,
-0.03207898139953613,
0.08441435545682907,
0.0024172733537852764,
0.20842726528644562,
-0.06873799860477448,
-0.04104651138186455,
0.021479448303580284,
-0.05455892160534859,
-0.01710606925189495,
0.08545398712158203,
0.07776128500699997,
-0.18789103627204895,
0.16382555663585663,
0.20322710275650024,
0.005923011340200901,
0.15125198662281036,
0.0037240793462842703,
-0.0591965913772583,
0.06402318179607391,
-0.0024685433600097895,
-0.01987261325120926,
-0.07868523895740509,
-0.15088507533073425,
-0.018342113122344017,
0.08259771019220352,
0.039321113377809525,
0.09465768933296204,
-0.11895642429590225,
-0.02675648406147957,
-0.009165432304143906,
0.006419542245566845,
-0.018683040514588356,
0.09044218063354492,
0.06862271577119827,
0.16033902764320374,
-0.004866187926381826,
-0.029230765998363495,
0.11413105577230453,
0.025102457031607628,
-0.130033478140831,
0.20507369935512543,
-0.13527898490428925,
-0.33023256063461304,
-0.1918414682149887,
-0.1611635982990265,
-0.04046640917658806,
0.043490953743457794,
0.09211930632591248,
-0.11971243470907211,
-0.023521367460489273,
-0.0040140096098184586,
0.0652272030711174,
-0.10737135261297226,
0.02464432641863823,
-0.056705307215452194,
0.04560283198952675,
-0.04554189741611481,
-0.07916536927223206,
-0.040457598865032196,
-0.01779010333120823,
-0.0379314087331295,
0.14234526455402374,
-0.11004046350717545,
0.06692367047071457,
0.2041129171848297,
-0.009392541833221912,
0.0470442920923233,
-0.03036171942949295,
0.16747425496578217,
-0.05070081725716591,
0.008348554372787476,
0.22642983496189117,
-0.0632607713341713,
0.0799543559551239,
0.11435458064079285,
-0.01671656034886837,
-0.0674959272146225,
0.029040439054369926,
-0.026708412915468216,
-0.0771590992808342,
-0.288202166557312,
-0.09832111746072769,
-0.11433882266283035,
0.07961969077587128,
0.07298963516950607,
0.05136489123106003,
0.1506611853837967,
0.06353873759508133,
-0.0014984100125730038,
0.044252071529626846,
-0.024400126188993454,
0.08823288977146149,
0.22454051673412323,
-0.00562738673761487,
0.1253916472196579,
-0.05837563797831535,
-0.09665365517139435,
0.08771000802516937,
0.05767456814646721,
0.1275123953819275,
0.035171590745449066,
0.05800265073776245,
-0.01042699720710516,
0.07983892410993576,
0.1276608109474182,
0.1561160832643509,
0.021526623517274857,
-0.022240392863750458,
-0.01922982931137085,
-0.0242279302328825,
-0.0474022813141346,
0.042324066162109375,
-0.028677603229880333,
-0.10821261256933212,
-0.09130849689245224,
-0.056915514171123505,
0.045776624232530594,
0.12005918473005295,
0.08145388960838318,
-0.2754218876361847,
-0.010376658290624619,
0.041860464960336685,
-0.03334631770849228,
-0.12606896460056305,
0.07822461426258087,
0.00145563087426126,
-0.11162479221820831,
0.05108507350087166,
-0.06858417391777039,
0.11141569167375565,
-0.02854723110795021,
0.09085290879011154,
-0.04205034673213959,
-0.05308544635772705,
0.009666514582931995,
0.10729387402534485,
-0.32622668147087097,
0.22120900452136993,
0.0005494045908562839,
-0.07396697252988815,
-0.09800945222377777,
-0.009924459271132946,
-0.018058467656373978,
0.10485460609197617,
0.10200710594654083,
0.00857409555464983,
-0.044727373868227005,
-0.10079942643642426,
-0.001911108149215579,
0.0160392876714468,
0.1416216343641281,
0.003232594346627593,
0.010607552714645863,
-0.06085557863116264,
-0.011270401068031788,
0.00042969020432792604,
-0.0063725681975483894,
0.012853298336267471,
-0.14923886954784393,
0.08164975047111511,
0.02902830019593239,
0.05877191200852394,
0.026785722002387047,
-0.032346732914447784,
-0.05592125281691551,
0.1978888064622879,
-0.0813334509730339,
-0.08734483271837234,
-0.12360762059688568,
-0.03776855766773224,
0.06624194979667664,
-0.0857410877943039,
0.05761401355266571,
-0.07809928804636002,
0.014261110685765743,
-0.04026739299297333,
-0.22564102709293365,
0.13722221553325653,
-0.08789744228124619,
-0.057472918182611465,
-0.03787403181195259,
0.16880212724208832,
-0.07846269756555557,
0.003603121731430292,
0.001010442734695971,
0.014881771057844162,
-0.09872760623693466,
-0.058419059962034225,
0.02714257314801216,
-0.02370346523821354,
0.07373539358377457,
0.014246928505599499,
-0.08771274238824844,
-0.026873556897044182,
-0.04421772435307503,
-0.010758713819086552,
0.32059186697006226,
0.13569305837154388,
-0.06032058224081993,
0.14715103805065155,
0.12952402234077454,
-0.08458166569471359,
-0.28935036063194275,
-0.07204092293977737,
-0.07258007675409317,
-0.023102762177586555,
-0.057471051812171936,
-0.16755197942256927,
0.06674350053071976,
0.00879679899662733,
0.01010393537580967,
0.1294938623905182,
-0.2612045705318451,
-0.09319140762090683,
0.14942041039466858,
0.01613568887114525,
0.3729253113269806,
-0.12496547400951385,
-0.09435328841209412,
-0.0535445399582386,
-0.16205169260501862,
0.1586177796125412,
-0.0034688571467995644,
0.0884246677160263,
-0.05701526254415512,
0.11731281131505966,
0.051532644778490067,
-0.04815584048628807,
0.0481397770345211,
0.0053951311856508255,
0.010143408551812172,
-0.1219218298792839,
-0.07271546870470047,
0.04707564786076546,
-0.01258976198732853,
0.03408622741699219,
-0.026837613433599472,
0.060781531035900116,
-0.13468347489833832,
-0.02403438463807106,
-0.0986112505197525,
0.051360417157411575,
0.027738235890865326,
-0.06127076968550682,
0.020215395838022232,
-0.07898403704166412,
0.007764906622469425,
-0.02181856893002987,
0.20409846305847168,
-0.041272081434726715,
0.17576421797275543,
0.185849130153656,
0.12658190727233887,
-0.11951465159654617,
0.03682965785264969,
-0.059769343584775925,
-0.06266480684280396,
0.08548760414123535,
-0.10746664553880692,
0.0515686497092247,
0.13653399050235748,
-0.0194852352142334,
0.057825081050395966,
0.10545878112316132,
0.013498139567673206,
-0.027026910334825516,
0.1267685741186142,
-0.2631306052207947,
0.038169100880622864,
-0.08460862934589386,
-0.008277020417153835,
0.036948706954717636,
0.07013141363859177,
0.18761853873729706,
0.0050191255286335945,
-0.022846926003694534,
-0.0005305687664076686,
-0.0032169255428016186,
-0.039859216660261154,
0.07955451309680939,
0.04236183315515518,
0.025808382779359818,
-0.11839942634105682,
0.09552287310361862,
0.03079117089509964,
-0.15160426497459412,
0.031213318929076195,
0.18906792998313904,
-0.13360579311847687,
-0.1073017418384552,
-0.0015372881898656487,
0.09650308638811111,
-0.19687221944332123,
-0.02933841571211815,
-0.0661778524518013,
-0.10521958023309708,
0.10845258831977844,
0.20702214539051056,
0.04157938063144684,
0.08099190145730972,
-0.05316930264234543,
-0.0566558837890625,
-0.05247151479125023,
-0.0028563435189425945,
0.002129538916051388,
0.038999415934085846,
-0.11052848398685455,
0.09109547734260559,
-0.035198815166950226,
0.1498023420572281,
-0.08088216185569763,
-0.03972676768898964,
-0.15023428201675415,
0.023700950667262077,
-0.15293079614639282,
-0.03884303569793701,
-0.04642169550061226,
-0.05244468152523041,
-0.02372957207262516,
-0.023414667695760727,
-0.05555560439825058,
-0.030533278360962868,
-0.11429408192634583,
0.018017904832959175,
-0.033102136105298996,
0.022623127326369286,
-0.06715137511491776,
-0.005159827880561352,
0.054639171808958054,
-0.0239662304520607,
0.12467198073863983,
0.132709801197052,
-0.10537438839673996,
0.11973454058170319,
-0.1301238238811493,
-0.09465966373682022,
0.098133884370327,
0.011746241711080074,
0.046187955886125565,
0.06032227724790573,
0.024032017216086388,
0.07159993052482605,
0.008548014797270298,
0.03509274497628212,
0.04312850534915924,
-0.11289303004741669,
0.021341174840927124,
-0.0340958908200264,
-0.14682652056217194,
-0.07872691005468369,
-0.024089578539133072,
0.039844710379838943,
0.008827128447592258,
0.11162237823009491,
-0.054870519787073135,
0.10855438560247421,
-0.07307892292737961,
0.014834600500762463,
0.00930056907236576,
-0.15918131172657013,
-0.06678343564271927,
-0.0704193040728569,
0.030138488858938217,
-0.010735527612268925,
0.15285079181194305,
0.009051257744431496,
0.02786356583237648,
0.03031553141772747,
0.0776771530508995,
-0.03861640393733978,
0.008567157201468945,
0.2021797150373459,
0.0752955973148346,
-0.06944630295038223,
-0.09943088889122009,
0.07548068463802338,
0.009114231914281845,
0.08034837990999222,
0.17286242544651031,
0.042209625244140625,
-0.011161329224705696,
0.10294386744499207,
-0.006761567201465368,
-0.015581062994897366,
-0.11771772056818008,
-0.13026878237724304,
-0.03178418427705765,
0.08804871141910553,
-0.010255834087729454,
0.13060788810253143,
0.19084341824054718,
-0.0005993306403979659,
0.017109675332903862,
-0.03964357450604439,
-0.05324425548315048,
-0.17051562666893005,
-0.16073697805404663,
-0.08176171034574509,
-0.11500862240791321,
-0.005551496054977179,
-0.09439010918140411,
0.07158341258764267,
0.08079083263874054,
0.06226884573698044,
-0.06078345701098442,
0.08340827375650406,
0.10056587308645248,
-0.12891729176044464,
0.07401765137910843,
-0.014043731614947319,
0.07417644560337067,
-0.010800107382237911,
-0.003027237020432949,
-0.06009596586227417,
-0.010318313725292683,
-0.02136692963540554,
0.04306381568312645,
-0.01756896823644638,
0.015612704679369926,
-0.13009275496006012,
-0.10302719473838806,
-0.02344292588531971,
0.08092331886291504,
-0.027788225561380386,
0.13421235978603363,
0.002544132061302662,
-0.009203603491187096,
0.018507041037082672,
0.2279818058013916,
-0.0826820656657219,
-0.057693321257829666,
-0.031586382538080215,
0.25082021951675415,
0.04332177713513374,
0.0885329470038414,
-0.01820177026093006,
-0.004596723709255457,
-0.07449798285961151,
0.33756017684936523,
0.26374003291130066,
-0.07300266623497009,
0.0133696673437953,
0.021075094118714333,
0.036251477897167206,
0.11372537910938263,
0.15737079083919525,
0.10687343031167984,
0.25967028737068176,
-0.07012772560119629,
-0.028974877670407295,
-0.041341233998537064,
0.02845371887087822,
-0.0705791488289833,
0.14727680385112762,
0.03921016678214073,
-0.07874979823827744,
-0.03061264380812645,
0.06992800533771515,
-0.20740963518619537,
0.12099026888608932,
-0.10039433091878891,
-0.16944783926010132,
-0.06630896776914597,
-0.0057978141121566296,
0.11561641842126846,
-0.007073116954416037,
0.0856672003865242,
-0.026052357628941536,
-0.08507587760686874,
0.048746801912784576,
0.018571887165308,
-0.20348837971687317,
-0.00018657023611012846,
0.04584376886487007,
-0.12093330174684525,
-0.018756361678242683,
-0.0122902886942029,
0.05038115009665489,
0.08453262597322464,
0.07803770154714584,
-0.06651539355516434,
0.038958970457315445,
-0.001192984520457685,
-0.017093773931264877,
0.038815323263406754,
0.052365344017744064,
0.026196500286459923,
-0.07333295792341232,
0.049803268164396286,
-0.1411696970462799,
0.03416447713971138,
-0.012296364642679691,
-0.026521898806095123,
0.0036330490838736296,
-0.008938276208937168,
-0.04176962003111839,
0.06401389092206955,
0.10523161292076111,
-0.015625225380063057,
-0.009230346418917179,
-0.08741772919893265,
-0.04888242855668068,
0.0009635284659452736,
-0.0925854817032814,
-0.0706326887011528,
-0.12737132608890533,
-0.07751539349555969,
0.12098696827888489,
-0.0032490440644323826,
-0.22549843788146973,
0.01666291430592537,
-0.09864027053117752,
0.028613118454813957,
-0.19077081978321075,
0.08549627661705017,
0.06312074512243271,
0.013068536296486855,
0.0026932701002806425,
0.0038085898850113153,
0.04471083730459213,
0.1110302284359932,
-0.11501909792423248,
-0.09550018608570099
] |
null | null | null | https://civitai.com/models/144934/yoinkoorlabs-nsfw-motion-module-v2 | {"license": "creativeml-openrail-m"} | null | LarryAIDraw/yoinkoorlabsNSFWMotion_godmodev20 | [
"license:creativeml-openrail-m",
"region:us"
] | 2024-02-14T14:38:53+00:00 | [] | [] | TAGS
#license-creativeml-openrail-m #region-us
| URL | [] | [
"TAGS\n#license-creativeml-openrail-m #region-us \n"
] | [
18
] | [
"passage: TAGS\n#license-creativeml-openrail-m #region-us \n"
] | [
-0.07587551325559616,
0.1441737711429596,
-0.0062791393138468266,
0.012048184871673584,
-0.001431003911420703,
-0.022854028269648552,
0.2091037780046463,
-0.018623588606715202,
0.08854977041482925,
-0.11491455882787704,
0.14648450911045074,
0.18939465284347534,
-0.10384178161621094,
0.0838744044303894,
-0.061768148094415665,
-0.13200531899929047,
0.029243366792798042,
-0.07651498913764954,
-0.0865340456366539,
0.028722204267978668,
0.056829702109098434,
-0.01273291651159525,
-0.003666024887934327,
-0.0012952570104971528,
-0.11045186221599579,
0.07173702865839005,
-0.029841862618923187,
-0.037320639938116074,
0.060927797108888626,
-0.04866224527359009,
0.04899880662560463,
0.11812204867601395,
-0.033462416380643845,
-0.13358792662620544,
0.004443002864718437,
-0.11795501410961151,
-0.13281011581420898,
0.007506446447223425,
0.121794693171978,
-0.0353701114654541,
0.12644833326339722,
0.17882929742336273,
0.0022871040273457766,
0.07042364031076431,
-0.1692226231098175,
-0.17680460214614868,
-0.04340395703911781,
-0.018681490793824196,
-0.026622790843248367,
0.0532202385365963,
0.11296376585960388,
0.0959911122918129,
-0.1474708467721939,
0.059626504778862,
0.08025065064430237,
-0.29932230710983276,
0.03342466056346893,
0.23123668134212494,
0.11160528659820557,
0.03646189346909523,
-0.04899992793798447,
0.06103713810443878,
0.037279851734638214,
-0.055691562592983246,
-0.011489230208098888,
-0.07466674596071243,
0.033063821494579315,
0.1203068420290947,
-0.048032116144895554,
-0.025952165946364403,
0.3207513689994812,
-0.011608880013227463,
0.004257023800164461,
0.03850623592734337,
-0.046627260744571686,
0.03471478819847107,
0.053042974323034286,
0.07628075033426285,
0.05806995555758476,
0.1503586620092392,
0.06162842735648155,
-0.11057397723197937,
-0.12041215598583221,
0.018044639378786087,
-0.14939343929290771,
0.16419777274131775,
-0.05087574943900108,
0.0932750254869461,
-0.11752020567655563,
0.018267955631017685,
-0.0651155412197113,
-0.03550999239087105,
-0.010290741920471191,
-0.14436741173267365,
0.09543514996767044,
-0.00750720826908946,
-0.044816359877586365,
-0.06333030760288239,
0.06353012472391129,
0.134693443775177,
0.06326734274625778,
-0.01916888915002346,
0.03110724687576294,
0.18312698602676392,
0.02453736774623394,
-0.039170458912849426,
0.02620672434568405,
0.14288429915905,
0.03429737314581871,
-0.1762668490409851,
-0.0059744445607066154,
-0.0644608810544014,
-0.1936662793159485,
-0.02320769429206848,
-0.19997692108154297,
0.16352415084838867,
-0.030033577233552933,
-0.016221072524785995,
-0.03707468882203102,
0.022218478843569756,
0.04353277385234833,
0.007484832778573036,
0.018807580694556236,
-0.044244956225156784,
-0.08294660598039627,
-0.08514150232076645,
-0.020517800003290176,
0.05681263282895088,
0.07853931933641434,
0.18057872354984283,
-0.12033670395612717,
0.0023163571022450924,
-0.04746192321181297,
-0.002028648741543293,
0.10751507431268692,
-0.1799560934305191,
0.05942503362894058,
-0.10612065345048904,
-0.21264076232910156,
-0.0035186251625418663,
0.11188323050737381,
0.02211635187268257,
0.00010340322478441522,
0.023470120504498482,
-0.042402785271406174,
-0.03322858735918999,
-0.06714189052581787,
-0.09123854339122772,
-0.07618846744298935,
0.0644230917096138,
-0.15088342130184174,
-0.06908489763736725,
-0.27447474002838135,
0.021657612174749374,
-0.11370886117219925,
0.030269425362348557,
0.09551744163036346,
-0.08233252167701721,
-0.11906278878450394,
0.24992190301418304,
0.07235409319400787,
0.07105377316474915,
-0.037106942385435104,
-0.02335505001246929,
-0.040998950600624084,
0.07576625794172287,
-0.051450882107019424,
0.006896975915879011,
0.06892602890729904,
-0.05309505760669708,
-0.13028347492218018,
-0.018723927438259125,
-0.04109232872724533,
0.13036558032035828,
-0.005558064207434654,
0.30143606662750244,
0.04775548353791237,
-0.18540549278259277,
0.20458267629146576,
0.13462620973587036,
-0.17578788101673126,
-0.3525811433792114,
0.10510481148958206,
-0.08032525330781937,
-0.12903624773025513,
0.02135874517261982,
0.05760384723544121,
0.08029629290103912,
-0.016704760491847992,
-0.03554001823067665,
0.003427563700824976,
-0.061561521142721176,
-0.016107140108942986,
0.031175263226032257,
0.09541988372802734,
-0.08737137913703918,
0.08379733562469482,
0.03426050394773483,
-0.0114505710080266,
0.14006270468235016,
-0.02073829248547554,
-0.0763879269361496,
0.02079492248594761,
0.04172089695930481,
-0.020384199917316437,
-0.056601639837026596,
-0.019958069548010826,
0.024005193263292313,
-0.017852509394288063,
0.10743143409490585,
0.29301881790161133,
0.0457768440246582,
-0.015894168987870216,
0.050522804260253906,
0.02892244979739189,
0.031187754124403,
0.04622279107570648,
0.002081167884171009,
-0.15730762481689453,
0.07284589111804962,
-0.05682012811303139,
-0.09314198791980743,
-0.03167767822742462,
-0.0017506676958873868,
0.0981268361210823,
-0.05222945287823677,
0.06663653254508972,
0.04907272756099701,
0.008146014995872974,
-0.0024776349309831858,
0.019724633544683456,
0.03505800664424896,
0.15693770349025726,
0.06973138451576233,
-0.09330075234174728,
0.2326427847146988,
-0.07795968651771545,
0.3451519012451172,
0.06519531458616257,
-0.17186447978019714,
0.0015280802035704255,
-0.16536928713321686,
-0.08274903148412704,
0.009426575154066086,
0.06846177577972412,
0.04244798794388771,
-0.06766051799058914,
-0.0681324228644371,
0.1076645776629448,
-0.05602144077420235,
-0.05967314541339874,
-0.09208252280950546,
-0.06438151746988297,
-0.09841792285442352,
0.11479154229164124,
0.17103825509548187,
-0.17601613700389862,
0.14707137644290924,
0.31644511222839355,
0.0033473046496510506,
0.20550797879695892,
-0.06598898768424988,
0.06533558666706085,
-0.11870601028203964,
0.06948951631784439,
-0.033792875707149506,
0.1264963299036026,
-0.10152938961982727,
0.04339653253555298,
0.01719778962433338,
0.05835990980267525,
0.12580721080303192,
-0.1375611275434494,
-0.2047722488641739,
0.05393601953983307,
0.04846670478582382,
-0.08490802347660065,
0.15654030442237854,
-0.07621043175458908,
0.03958071768283844,
-0.04002580791711807,
-0.10932640731334686,
0.16022461652755737,
-0.07396190613508224,
-0.03576399013400078,
0.04601873457431793,
-0.162797212600708,
0.04817049205303192,
-0.13655415177345276,
-0.20034807920455933,
-0.03256381303071976,
0.011739566922187805,
0.09091648459434509,
0.0064963698387146,
-0.045913100242614746,
0.008927296847105026,
-0.1321311742067337,
-0.24660253524780273,
-0.10214889049530029,
-0.04224977269768715,
0.1463703066110611,
-0.09529456496238708,
-0.08689732849597931,
-0.008191614411771297,
-0.027925807982683182,
0.0383632630109787,
0.0873899981379509,
-0.04390016943216324,
0.15604910254478455,
0.13776685297489166,
0.03233470022678375,
0.07692384719848633,
-0.0302706528455019,
0.16908830404281616,
0.07715359330177307,
-0.09182680398225784,
0.09044599533081055,
-0.006939579267054796,
0.07778391242027283,
0.26205286383628845,
0.13615888357162476,
-0.10827198624610901,
0.0021787171717733145,
-0.09298930317163467,
-0.13136249780654907,
-0.25473496317863464,
-0.03117409534752369,
-0.15477068722248077,
0.13437145948410034,
-0.08579761534929276,
0.08686056733131409,
0.13696706295013428,
0.05041143670678139,
0.10572081059217453,
0.018525123596191406,
-0.016791416332125664,
0.022843502461910248,
0.17746564745903015,
-0.02853401191532612,
-0.043541014194488525,
-0.14404186606407166,
-0.022182300686836243,
0.15260697901248932,
0.10192563384771347,
0.16757766902446747,
0.16616763174533844,
0.11930298805236816,
0.1956932544708252,
0.11704401671886444,
0.10304278880357742,
0.052189555019140244,
-0.013531852513551712,
-0.004093863070011139,
-0.01228472962975502,
-0.042497504502534866,
0.05230056867003441,
0.05571495369076729,
0.027585504576563835,
-0.19872500002384186,
0.02184155583381653,
-0.19329896569252014,
-0.02313016541302204,
-0.08243345469236374,
0.01644495315849781,
0.05239224433898926,
0.2096434086561203,
0.04210057109594345,
0.10118018835783005,
0.021744482219219208,
0.10573884844779968,
0.015865135937929153,
-0.07006605714559555,
-0.0065298317931592464,
-0.024272896349430084,
0.09974277764558792,
0.10174193233251572,
0.021700428798794746,
-0.016679642722010612,
-0.09889253973960876,
0.04607788100838661,
0.17424549162387848,
-0.17494839429855347,
0.3187439739704132,
-0.0007240860140882432,
-0.04524024948477745,
-0.04190666601061821,
-0.08219234645366669,
0.04142151027917862,
0.1647384762763977,
0.1017698273062706,
0.0333428718149662,
-0.14635729789733887,
-0.06874663382768631,
-0.029922528192400932,
-0.029125673696398735,
0.10087492316961288,
-0.06689736992120743,
-0.13817089796066284,
-0.025579528883099556,
0.0344909206032753,
0.003919827751815319,
0.21354736387729645,
-0.10228335112333298,
-0.15175104141235352,
0.00922450888901949,
0.13133007287979126,
-0.06745465099811554,
-0.04906000941991806,
0.09594502300024033,
-0.02669750526547432,
0.0972210094332695,
-0.0541548989713192,
0.002656505908817053,
-0.14727191627025604,
-0.2363637089729309,
0.010592032223939896,
-0.02335694245994091,
0.020698489621281624,
-0.07203120738267899,
-0.11125075072050095,
-0.1240958720445633,
-0.1789770871400833,
0.11374562233686447,
-0.06521226465702057,
0.09276589751243591,
-0.09726036339998245,
0.08684233576059341,
-0.08414942771196365,
0.02816055528819561,
-0.05099964141845703,
-0.0012100528692826629,
-0.09757094830274582,
-0.14613427221775055,
0.024435222148895264,
-0.13409870862960815,
-0.001014217734336853,
0.034934982657432556,
-0.11161556839942932,
0.14066044986248016,
0.13931402564048767,
-0.08724056929349899,
0.17418785393238068,
0.42831170558929443,
-0.05984934791922569,
0.25173598527908325,
0.2527628242969513,
-0.13718484342098236,
-0.2734082341194153,
-0.059651490300893784,
-0.23391994833946228,
-0.08160211890935898,
0.1082993745803833,
-0.1578003615140915,
0.015907390043139458,
0.05020333454012871,
-0.11690597236156464,
0.1467704027891159,
-0.32824045419692993,
-0.07495500147342682,
0.09672868996858597,
0.007048844825476408,
0.4732857048511505,
-0.1068139299750328,
-0.12494277954101562,
-0.07125994563102722,
-0.10485164821147919,
0.10395017266273499,
-0.07008004188537598,
0.08493339270353317,
-0.030203424394130707,
0.025772906839847565,
0.011868835426867008,
-0.04774972423911095,
0.14879614114761353,
-0.0427577942609787,
0.19098854064941406,
-0.11560776084661484,
0.0027590321842581034,
0.14695321023464203,
-0.03108292631804943,
0.038532279431819916,
-0.07178329676389694,
0.04545990377664566,
-0.042950090020895004,
-0.027814088389277458,
-0.018928585574030876,
0.11621513217687607,
-0.004339784849435091,
-0.1380559802055359,
-0.06945756077766418,
0.01972813345491886,
-0.07362999767065048,
-0.05320021137595177,
0.15675771236419678,
0.03502804413437843,
0.05609925836324692,
0.11970125883817673,
0.004991572815924883,
-0.146412655711174,
0.00884049292653799,
-0.07536338269710541,
0.01455683447420597,
0.04314182698726654,
-0.08771193772554398,
-0.050023581832647324,
0.11971840262413025,
0.021750157698988914,
0.0665673241019249,
0.06486256420612335,
-0.042168524116277695,
0.02131110616028309,
0.11186312884092331,
-0.12857086956501007,
-0.06895474344491959,
-0.017605429515242577,
0.2739332914352417,
0.20882153511047363,
0.06424131989479065,
0.011942589655518532,
0.03977527841925621,
0.08851079642772675,
0.025800030678510666,
-0.024320857599377632,
-0.027894796803593636,
-0.07533380389213562,
0.08076632767915726,
-0.026636533439159393,
-0.08794095367193222,
0.1338292956352234,
0.04866079241037369,
-0.0795087143778801,
-0.08115667849779129,
0.10095386952161789,
-0.03139214217662811,
-0.0645640566945076,
-0.04291141778230667,
0.16875873506069183,
-0.142974391579628,
-0.05379750579595566,
0.05253109708428383,
-0.06923473626375198,
0.03050602227449417,
0.1983366161584854,
0.06317481398582458,
0.10652732849121094,
0.020412208512425423,
-0.03693949803709984,
0.09139978885650635,
-0.008889229968190193,
-0.1458244025707245,
0.04242372885346413,
-0.1516965925693512,
-0.1209954097867012,
-0.03220202773809433,
0.059742625802755356,
-0.06468313187360764,
-0.0443362258374691,
-0.16110824048519135,
0.08512833714485168,
-0.059125129133462906,
-0.04787873104214668,
-0.07900126278400421,
-0.034204404801130295,
-0.011031275615096092,
-0.027199620380997658,
-0.08409348875284195,
0.0068776607513427734,
-0.22133535146713257,
0.051574207842350006,
0.04428314045071602,
0.017113016918301582,
-0.03435007482767105,
-0.08292978256940842,
0.07848229259252548,
0.04986674711108208,
0.10280575603246689,
0.03711284324526787,
-0.059191394597291946,
0.0037306465674191713,
-0.20414716005325317,
-0.038815271109342575,
0.04232484847307205,
-0.021390240639448166,
0.0267819594591856,
0.08142497390508652,
-0.03312315046787262,
0.05886727198958397,
-0.04134150594472885,
0.031092548742890358,
-0.12302310764789581,
-0.19250139594078064,
-0.07369648665189743,
0.0737677738070488,
-0.1768668293952942,
-0.007294799666851759,
-0.158339723944664,
0.12045895308256149,
0.0037357027176767588,
0.19128042459487915,
0.05877019464969635,
0.07969143241643906,
0.07085993885993958,
-0.03897101804614067,
0.1005023792386055,
-0.05584702640771866,
-0.09622103720903397,
-0.019361555576324463,
-0.12480172514915466,
-0.049345120787620544,
0.42032214999198914,
0.05109545961022377,
-0.34862402081489563,
0.03209015727043152,
0.10416815429925919,
0.09029489010572433,
0.0010600913083180785,
0.1751212626695633,
-0.02115757390856743,
0.00999172031879425,
-0.09422436356544495,
0.09467131644487381,
-0.0020058725494891405,
-0.11290951073169708,
0.0739678293466568,
0.09658773243427277,
0.08477838337421417,
-0.024424241855740547,
0.13553570210933685,
-0.010457966476678848,
0.03920025750994682,
-0.11343693733215332,
0.15077632665634155,
0.06773624569177628,
-0.05210328474640846,
0.062154389917850494,
0.1635616272687912,
0.05306112766265869,
0.07038675248622894,
0.04032095894217491,
0.0014122785069048405,
-0.1754148155450821,
-0.1602102369070053,
0.02099275030195713,
-0.05523645877838135,
0.07993361353874207,
0.02664482593536377,
0.06025690957903862,
0.05930217728018761,
0.08369890600442886,
-0.02683570235967636,
-0.012045243754982948,
-0.21370548009872437,
-0.059094905853271484,
-0.014421275816857815,
-0.06632379442453384,
-0.06530799716711044,
-0.13236206769943237,
-0.007965253666043282,
-0.11605394631624222,
-0.1677420735359192,
-0.11075370758771896,
0.06186629459261894,
-0.03134578466415405,
-0.07950954884290695,
-0.1361609846353531,
0.005552724003791809,
-0.051663242280483246,
0.0591781884431839,
0.020678075030446053,
0.14382748305797577,
-0.055859338492155075,
-0.007769476156681776,
0.03557850420475006,
0.17586101591587067,
0.03452156111598015,
-0.019137056544423103,
0.05009777843952179,
-0.11230028420686722,
-0.013903132639825344,
0.09447801858186722,
-0.05355257913470268,
0.03868480771780014,
0.05060523375868797,
0.14069905877113342,
0.3000718951225281,
-0.15852685272693634,
0.022173447534441948,
-0.0156106511130929,
0.027616411447525024,
0.03752091899514198,
0.10538272559642792,
-0.047601912170648575,
0.30318450927734375,
-0.03754459694027901,
0.015319152735173702,
-0.05392564833164215,
0.03960913047194481,
-0.0902356207370758,
0.13807453215122223,
0.07016881555318832,
-0.1437612622976303,
-0.11773919314146042,
0.13123241066932678,
-0.2251790165901184,
0.21079330146312714,
0.05835592746734619,
-0.018531115725636482,
0.0006959201418794692,
-0.017787374556064606,
0.20127902925014496,
-0.06664536148309708,
0.07648804783821106,
-0.10087135434150696,
-0.11177007853984833,
-0.14956814050674438,
0.008278977125883102,
-0.3149573504924774,
-0.07720612734556198,
0.10045251995325089,
0.1509818434715271,
0.17898774147033691,
-0.022407056763768196,
0.060840118676424026,
0.03429623693227768,
0.016734736040234566,
-0.09003262221813202,
0.09443855285644531,
0.08975303173065186,
-0.14206120371818542,
-0.09327292442321777,
-0.12793666124343872,
-0.015153053216636181,
-0.009946417063474655,
-0.008153465576469898,
0.0022670275066047907,
0.04026666656136513,
0.12014163285493851,
-0.04463301971554756,
-0.05576737970113754,
0.06202622875571251,
-0.09607529640197754,
0.03486022725701332,
-0.03752650320529938,
0.012558498419821262,
-0.07468373328447342,
-0.03885192796587944,
-0.04395401477813721,
0.06765811145305634,
-0.2736577093601227,
-0.04237256944179535,
0.10482975840568542,
-0.0006625195383094251,
0.22920070588588715,
0.053381726145744324,
-0.108866386115551,
-0.028044672682881355,
-0.11392955482006073,
0.06305203586816788,
-0.12086670845746994,
-0.0018355880165472627,
0.1538183093070984,
0.022182224318385124,
0.03804173693060875,
-0.16429899632930756,
0.040075428783893585,
-0.10011276602745056,
-0.03175477311015129,
-0.06921384483575821
] |
null | null | transformers |
# Uploaded model
- **Developed by:** kaykyramos
- **License:** apache-2.0
- **Finetuned from model:** unsloth/mistral-7b-instruct-v0.2-bnb-4bit
This Mistral model was trained 2x faster with [Unsloth](https://github.com/unslothai/unsloth) and Hugging Face's TRL library.
[<img src="https://raw.githubusercontent.com/unslothai/unsloth/main/images/unsloth%20made%20with%20love.png" width="200"/>](https://github.com/unslothai/unsloth)
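
A minimal inference sketch is below. It assumes the uploaded repository holds weights loadable through the standard `transformers` API and that the base model's Mistral-instruct chat template is preserved; if the repo only stores LoRA adapters, they would instead need to be attached to the base model with PEFT. The prompt text is purely illustrative.

```python
# Minimal usage sketch (assumption: merged weights loadable with plain transformers).
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "kaykyramos/Aura-MoE-instruct-pretreined"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

# Build a Mistral-instruct style prompt via the tokenizer's chat template.
messages = [{"role": "user", "content": "Summarize what Unsloth is in one sentence."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

# Generate and print only the newly produced tokens.
outputs = model.generate(inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```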
| {"language": ["en"], "license": "apache-2.0", "tags": ["text-generation-inference", "transformers", "unsloth", "mistral", "trl"], "base_model": "unsloth/mistral-7b-instruct-v0.2-bnb-4bit"} | null | kaykyramos/Aura-MoE-instruct-pretreined | [
"transformers",
"text-generation-inference",
"unsloth",
"mistral",
"trl",
"en",
"base_model:unsloth/mistral-7b-instruct-v0.2-bnb-4bit",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] | 2024-02-14T14:41:16+00:00 | [] | [
"en"
] | TAGS
#transformers #text-generation-inference #unsloth #mistral #trl #en #base_model-unsloth/mistral-7b-instruct-v0.2-bnb-4bit #license-apache-2.0 #endpoints_compatible #region-us
|
# Uploaded model
- Developed by: kaykyramos
- License: apache-2.0
- Finetuned from model: unsloth/mistral-7b-instruct-v0.2-bnb-4bit
This Mistral model was trained 2x faster with Unsloth and Hugging Face's TRL library.
<img src="URL width="200"/>
| [
"# Uploaded model\n\n- Developed by: kaykyramos\n- License: apache-2.0\n- Finetuned from model : unsloth/mistral-7b-instruct-v0.2-bnb-4bit\n\nThis mistral model was trained 2x faster with Unsloth and Huggingface's TRL library.\n\n<img src=\"URL width=\"200\"/>"
] | [
"TAGS\n#transformers #text-generation-inference #unsloth #mistral #trl #en #base_model-unsloth/mistral-7b-instruct-v0.2-bnb-4bit #license-apache-2.0 #endpoints_compatible #region-us \n",
"# Uploaded model\n\n- Developed by: kaykyramos\n- License: apache-2.0\n- Finetuned from model : unsloth/mistral-7b-instruct-v0.2-bnb-4bit\n\nThis mistral model was trained 2x faster with Unsloth and Huggingface's TRL library.\n\n<img src=\"URL width=\"200\"/>"
] | [
70,
85
] | [
"passage: TAGS\n#transformers #text-generation-inference #unsloth #mistral #trl #en #base_model-unsloth/mistral-7b-instruct-v0.2-bnb-4bit #license-apache-2.0 #endpoints_compatible #region-us \n# Uploaded model\n\n- Developed by: kaykyramos\n- License: apache-2.0\n- Finetuned from model : unsloth/mistral-7b-instruct-v0.2-bnb-4bit\n\nThis mistral model was trained 2x faster with Unsloth and Huggingface's TRL library.\n\n<img src=\"URL width=\"200\"/>"
] | [
-0.0665796548128128,
0.0034820972941815853,
-0.0036001813132315874,
0.08660577237606049,
0.09575395286083221,
0.04654282331466675,
0.09671671688556671,
0.10287674516439438,
0.03296175226569176,
-0.05720091611146927,
0.10731030255556107,
0.10153815150260925,
0.00198706635273993,
-0.0007807612419128418,
-0.019246289506554604,
-0.19405779242515564,
0.09021010249853134,
-0.03989207744598389,
-0.04532020911574364,
0.053788166493177414,
0.06752658635377884,
-0.0005811566370539367,
0.09447457641363144,
-0.07195773720741272,
-0.05239059776067734,
0.0020152940414845943,
-0.025669310241937637,
-0.015668902546167374,
0.012507683597505093,
0.06857582926750183,
0.001956605352461338,
0.03717328980565071,
0.0536981076002121,
-0.08791690319776535,
0.035603780299425125,
0.04502449557185173,
-0.023259375244379044,
0.0750679075717926,
-0.01882941834628582,
0.06354554742574692,
0.1253889501094818,
-0.04715915396809578,
-0.07138732075691223,
0.059004053473472595,
-0.04843607917428017,
-0.11384409666061401,
-0.08679743111133575,
0.10776925086975098,
0.04664033651351929,
0.056146059185266495,
0.04518051818013191,
0.10816778242588043,
-0.051623690873384476,
0.07553935796022415,
0.1667034924030304,
-0.24514718353748322,
-0.06484286487102509,
0.13430778682231903,
0.029029639437794685,
0.0443507544696331,
-0.018219051882624626,
0.0034255485516041517,
0.029285665601491928,
0.008991727605462074,
0.014450875110924244,
-0.08572448790073395,
-0.08216330409049988,
0.03602266311645508,
-0.10901287943124771,
-0.0045165722258389,
0.21738940477371216,
0.062297843396663666,
-0.035844456404447556,
0.0650985911488533,
-0.12726537883281708,
0.06550043076276779,
-0.050675686448812485,
0.05811616778373718,
0.039465099573135376,
0.08576542139053345,
-0.029802603647112846,
-0.10145643353462219,
-0.0551750622689724,
-0.04772888869047165,
-0.09773819148540497,
0.03621513396501541,
0.048037316650152206,
0.10742636024951935,
-0.06301024556159973,
0.06305649876594543,
-0.06900425255298615,
-0.11813206970691681,
-0.05704369768500328,
-0.0844128355383873,
0.0721723660826683,
0.048498596996068954,
-0.045985061675310135,
0.028623761609196663,
0.11563057452440262,
0.2049999088048935,
0.09071807563304901,
0.054708655923604965,
0.04071219637989998,
0.06182091683149338,
-0.06727702915668488,
0.06632088869810104,
-0.12394028156995773,
-0.038401950150728226,
0.11641053855419159,
0.016795523464679718,
0.0763387605547905,
-0.012707899324595928,
-0.10032064467668533,
-0.07089955359697342,
-0.001191399060189724,
0.01679035648703575,
0.06425628066062927,
0.09903863072395325,
0.0040499502792954445,
-0.05211086571216583,
0.14919312298297882,
-0.04847224801778793,
-0.022566374391317368,
0.015036595053970814,
-0.062429867684841156,
0.1590256690979004,
0.14388923346996307,
-0.009409436024725437,
-0.051095783710479736,
-0.07431917637586594,
-0.056048136204481125,
0.023508744314312935,
-0.02590397745370865,
-0.06976893544197083,
0.06548529863357544,
-0.0503675602376461,
0.009893096052110195,
-0.16474220156669617,
-0.24304436147212982,
0.026738358661532402,
0.13849323987960815,
-0.03624611347913742,
0.013066056184470654,
-0.042599450796842575,
-0.05316954478621483,
0.03584670275449753,
-0.030647411942481995,
-0.017698006704449654,
-0.07347884029150009,
0.025484181940555573,
-0.07607109844684601,
0.08611225336790085,
-0.19620735943317413,
0.037492476403713226,
-0.1097390279173851,
0.023026518523693085,
-0.0749133825302124,
0.07568072527647018,
-0.06650631129741669,
0.12714844942092896,
-0.12971697747707367,
-0.020970992743968964,
-0.033880360424518585,
-0.009130598045885563,
0.07307340949773788,
0.153781920671463,
-0.1355089247226715,
0.030801767483353615,
0.10856583714485168,
-0.037932008504867554,
-0.12775111198425293,
0.12819866836071014,
0.012087603099644184,
0.08645763248205185,
0.05547487363219261,
0.08314217627048492,
0.14816097915172577,
-0.08630570024251938,
0.06556753069162369,
0.1704024374485016,
-0.02446322701871395,
-0.09846630692481995,
0.06691847741603851,
0.01983519271016121,
-0.13232071697711945,
0.07747437804937363,
-0.07651665806770325,
0.11616622656583786,
-0.002573734149336815,
-0.051322199404239655,
-0.11042895913124084,
-0.07050501555204391,
0.04262366145849228,
-0.026372790336608887,
0.02960796467959881,
0.0016665689181536436,
-0.04817758873105049,
0.07762711495161057,
0.15216562151908875,
-0.06642483174800873,
0.055174533277750015,
-0.021703949198126793,
0.05603306367993355,
-0.0936490148305893,
0.07461579889059067,
-0.09680834412574768,
0.005680775735527277,
-0.017944561317563057,
-0.040892329066991806,
0.06805530935525894,
0.06854089349508286,
0.07111869007349014,
-0.01174379326403141,
-0.02990373596549034,
-0.007112166378647089,
0.08001359552145004,
0.003304536920040846,
-0.05772456154227257,
-0.11578377336263657,
0.002843132708221674,
-0.013495556078851223,
0.07104449719190598,
-0.0558609738945961,
0.04933395981788635,
-0.06226120889186859,
0.05351255461573601,
-0.03262665867805481,
0.07764704525470734,
0.037429120391607285,
-0.05218876525759697,
-0.013091888278722763,
-0.07696213573217392,
0.1046484187245369,
0.0544368140399456,
-0.09643739461898804,
0.08631287515163422,
-0.04299880936741829,
0.04045615717768669,
0.14930075407028198,
-0.022058822214603424,
0.06946257501840591,
0.028346218168735504,
-0.017697805538773537,
-0.040360551327466965,
0.08707807213068008,
0.005517997313290834,
-0.00326273194514215,
0.011114271357655525,
0.1123243123292923,
-0.07168658077716827,
-0.0010204093996435404,
0.010798475705087185,
-0.09918506443500519,
0.005505201406776905,
0.06968178600072861,
0.031817130744457245,
-0.1757332682609558,
0.03561050072312355,
0.27753305435180664,
-0.1306377500295639,
0.1065366119146347,
-0.04736259952187538,
-0.054314643144607544,
-0.004257291555404663,
0.013827063143253326,
0.0017573904478922486,
0.007482136599719524,
-0.10709972679615021,
0.02491617575287819,
0.053178396075963974,
-0.00041972316103056073,
0.03478449955582619,
-0.09626397490501404,
0.015957407653331757,
-0.028972571715712547,
-0.04672650620341301,
-0.051872618496418,
0.07606881856918335,
-0.07595234364271164,
0.03991270065307617,
-0.025041699409484863,
-0.06490334868431091,
0.054428502917289734,
0.025300052016973495,
-0.053355246782302856,
0.13566331565380096,
-0.1324787437915802,
-0.08021251857280731,
-0.19788436591625214,
-0.05878125876188278,
-0.14061607420444489,
-0.017950939014554024,
0.08792998641729355,
-0.0384373739361763,
-0.052181728184223175,
-0.09838490188121796,
-0.033872466534376144,
0.04342980682849884,
0.002973814494907856,
0.05473574250936508,
0.02894839458167553,
0.06756237894296646,
-0.1295669674873352,
-0.00966552086174488,
0.01179219875484705,
-0.08618659526109695,
0.05957019329071045,
-0.1020127534866333,
0.07630497217178345,
0.11280427128076553,
0.018372008576989174,
-0.02648012898862362,
0.05419345945119858,
0.15951262414455414,
0.05283472687005997,
0.08926204591989517,
0.2027612328529358,
0.022535542026162148,
0.10225407779216766,
0.11943219602108002,
0.01552765816450119,
-0.02968386374413967,
0.01792902685701847,
-0.041898105293512344,
-0.05430712178349495,
-0.18646089732646942,
-0.019704977050423622,
-0.09478821605443954,
0.048113901168107986,
0.06028497964143753,
0.04902028292417526,
-0.002737809205427766,
0.14847421646118164,
-0.06382866948843002,
0.11235006898641586,
0.05411165580153465,
0.09817966818809509,
0.11307922750711441,
0.020694050937891006,
0.05797501280903816,
-0.12434068322181702,
0.07023594528436661,
0.14470240473747253,
0.04865027219057083,
0.1513482630252838,
-0.023938074707984924,
0.09072158485651016,
0.05316928029060364,
0.16177910566329956,
0.009559930302202702,
0.11668500304222107,
-0.03284335136413574,
0.024665288627147675,
-0.06433132290840149,
-0.07050904631614685,
-0.06004326790571213,
0.06845323741436005,
-0.10860738158226013,
-0.01400679163634777,
0.03236992284655571,
0.09626297652721405,
0.08482018113136292,
0.19936244189739227,
0.07937707006931305,
-0.23425674438476562,
-0.11243043094873428,
0.07347319275140762,
0.06174783781170845,
-0.014890898950397968,
0.04072669893503189,
0.01643121801316738,
-0.002083387691527605,
0.05044511705636978,
-0.03825958073139191,
0.13775210082530975,
0.061899930238723755,
0.03500279784202576,
0.03107917495071888,
0.161076158285141,
0.05084798485040665,
0.09234730154275894,
-0.221003457903862,
0.07184392958879471,
0.009850109927356243,
0.030345413833856583,
-0.04299238324165344,
-0.0051764533855021,
0.11697255820035934,
0.15373212099075317,
0.034435953944921494,
0.0447150394320488,
-0.03721752390265465,
-0.012307700701057911,
-0.12937721610069275,
0.0599537268280983,
-0.0040146103128790855,
0.025950849056243896,
0.033961210399866104,
-0.08424055576324463,
-0.03815596178174019,
0.02154415473341942,
0.0948820561170578,
-0.10928712040185928,
-0.07558848708868027,
-0.00208861636929214,
0.06481674313545227,
-0.048718396574258804,
-0.03651464357972145,
0.026554949581623077,
-0.020303400233387947,
0.07141650468111038,
0.01294010505080223,
-0.07559823989868164,
-0.0762336477637291,
-0.07282272726297379,
0.1361909806728363,
-0.10403503477573395,
0.012300493195652962,
-0.06890176981687546,
-0.03866598755121231,
0.02447807416319847,
-0.22184666991233826,
0.08144032955169678,
-0.10492653399705887,
-0.039309632033109665,
0.012134273536503315,
0.012698314152657986,
-0.071059949696064,
0.004011945333331823,
-0.003540194593369961,
-0.028984688222408295,
-0.09414198249578476,
-0.11914157122373581,
-0.0842878445982933,
0.19899676740169525,
-0.030295642092823982,
0.03030504658818245,
-0.09686066955327988,
-0.04516991600394249,
-0.009358408860862255,
0.01935039460659027,
0.04866839572787285,
0.1816512793302536,
-0.04243447631597519,
0.06280490010976791,
0.26006418466567993,
-0.055079635232686996,
-0.29266005754470825,
-0.10421430319547653,
-0.06551943719387054,
-0.05307108163833618,
-0.046442627906799316,
-0.08206646144390106,
0.11713968217372894,
0.05939722806215286,
-0.02147219143807888,
0.07150998711585999,
-0.2679194211959839,
-0.10613620281219482,
0.11701300740242004,
0.03768498823046684,
0.31566134095191956,
-0.12131679058074951,
-0.03583550080657005,
-0.1398000866174698,
-0.2288280874490738,
0.029223378747701645,
-0.23189640045166016,
0.09602086991071701,
-0.04415057227015495,
0.032568998634815216,
-0.025437645614147186,
-0.02627616561949253,
0.12894012033939362,
0.022198781371116638,
0.07952766865491867,
-0.11204039305448532,
0.07610614597797394,
0.15745720267295837,
-0.1068376749753952,
0.17797811329364777,
-0.12004968523979187,
0.09341438859701157,
-0.05304982513189316,
0.023680657148361206,
-0.03327092528343201,
-0.006989678367972374,
0.002536711748689413,
-0.03584019094705582,
-0.04184403643012047,
-0.019473055377602577,
0.06911767274141312,
-0.007769509684294462,
0.1412787139415741,
0.0550217479467392,
-0.07288675755262375,
0.19281788170337677,
-0.004634648561477661,
-0.10814031958580017,
0.01372539158910513,
-0.05469130352139473,
-0.04463359713554382,
0.09056567400693893,
-0.23951572179794312,
0.03447991982102394,
0.06054462492465973,
-0.036383580416440964,
0.05035768076777458,
0.024125507101416588,
0.018990827724337578,
0.02348373457789421,
0.03004593588411808,
-0.09582016617059708,
-0.08466235548257828,
-0.03462175652384758,
0.011121232062578201,
-0.06426235288381577,
0.08174612373113632,
0.17116385698318481,
-0.07830434292554855,
0.013672303408384323,
0.01091836579144001,
0.033747583627700806,
-0.0975930467247963,
0.06742871552705765,
0.06597974896430969,
-0.018504925072193146,
-0.09806592017412186,
0.17458495497703552,
-0.017515309154987335,
0.006689220666885376,
-0.012301369570195675,
0.09151966124773026,
-0.16564594209194183,
-0.12286294251680374,
-0.0017753606662154198,
0.05008915066719055,
-0.13874469697475433,
-0.026157796382904053,
-0.034140218049287796,
-0.017093850299715996,
0.03684498369693756,
0.041580114513635635,
0.06726685911417007,
0.008508299477398396,
-0.04131711274385452,
-0.02707803249359131,
-0.011411371640861034,
0.02549820765852928,
0.06743721663951874,
0.05179440602660179,
-0.1568407118320465,
-0.04741740971803665,
-0.0373995266854763,
0.061146944761276245,
-0.041786111891269684,
0.002878080355003476,
-0.10790061950683594,
-0.014070647768676281,
-0.3407408595085144,
0.06353580951690674,
-0.09475020319223404,
0.03627900779247284,
-0.012522585690021515,
-0.05403948202729225,
-0.04380692169070244,
0.07568685710430145,
-0.06560247391462326,
-0.04361924156546593,
-0.02297419123351574,
0.04190228879451752,
-0.07418689131736755,
-0.05787384510040283,
0.017493784427642822,
-0.05309787392616272,
0.04778715968132019,
0.10775953531265259,
-0.11033665388822556,
0.055039338767528534,
-0.1729331761598587,
-0.08522102236747742,
0.03151015192270279,
0.04508219659328461,
0.008794408291578293,
0.04494199529290199,
-0.026338418945670128,
0.017774540930986404,
0.05419787019491196,
-0.043676942586898804,
0.08318226039409637,
-0.048541828989982605,
-0.04380270093679428,
-0.09822136908769608,
0.007423046976327896,
-0.06786112487316132,
-0.02330913208425045,
0.1343991756439209,
0.12754565477371216,
0.1672380566596985,
-0.045203279703855515,
-0.007924520410597324,
-0.1280876100063324,
-0.01811256632208824,
0.03693411499261856,
-0.11928870528936386,
-0.09155700355768204,
-0.11242378503084183,
0.009631319902837276,
-0.031401824206113815,
0.04701755940914154,
-0.042394526302814484,
-0.037855613976716995,
-0.023916546255350113,
0.04927273094654083,
-0.03363168612122536,
-0.025409655645489693,
0.24666956067085266,
0.03752942383289337,
0.04368438571691513,
-0.10812164098024368,
0.030239330604672432,
0.08761033415794373,
0.043007660657167435,
-0.0040868413634598255,
0.11214948445558548,
0.02192792110145092,
0.16790343821048737,
0.013207041658461094,
0.0769132524728775,
-0.011809616349637508,
0.05921505391597748,
0.009526611305773258,
0.11516138911247253,
-0.07058195024728775,
0.09700819849967957,
0.1541571021080017,
-0.07253225147724152,
-0.016721883788704872,
-0.03908166661858559,
-0.05125950276851654,
-0.11024568974971771,
-0.2211558222770691,
-0.1012493371963501,
-0.18892279267311096,
-0.018306175246834755,
-0.05061547830700874,
0.01503622904419899,
0.09490622580051422,
0.013238397426903248,
0.03284366801381111,
0.0349593460559845,
-0.05098670348525047,
-0.06405164301395416,
0.055649518966674805,
-0.04027663543820381,
-0.10079853236675262,
0.11613290756940842,
-0.037184037268161774,
0.09494728595018387,
-0.04224090278148651,
0.0006827715551480651,
0.054561398923397064,
0.09169675409793854,
0.06714195013046265,
-0.06662757694721222,
-0.09108014404773712,
-0.04702324792742729,
0.09196092933416367,
-0.023745989426970482,
0.08715390413999557,
0.07961937040090561,
-0.03025064244866371,
0.04814634844660759,
0.18699932098388672,
-0.08976438641548157,
-0.16118060052394867,
-0.1253592073917389,
0.0723181739449501,
-0.04568317160010338,
0.020427938550710678,
-0.016377083957195282,
-0.035612717270851135,
-0.0002900876570492983,
0.20356513559818268,
0.17967335879802704,
-0.08203262835741043,
-0.014357207342982292,
-0.011794949881732464,
0.012476077303290367,
-0.045886777341365814,
0.1602715253829956,
0.128485769033432,
-0.010228718630969524,
-0.043842773884534836,
-0.021326430141925812,
-0.00942020770162344,
-0.029873773455619812,
-0.12910579144954681,
-0.030843287706375122,
-0.08942291140556335,
-0.07296637445688248,
-0.005774007178843021,
0.023338841274380684,
-0.11225377768278122,
-0.048616837710142136,
-0.03403790667653084,
0.014268523082137108,
-0.01442915853112936,
-0.08777973800897598,
0.0729547068476677,
0.09395276010036469,
0.033881865441799164,
-0.10086991637945175,
0.06134381517767906,
0.1909705400466919,
-0.05782075971364975,
-0.149908646941185,
-0.05138060078024864,
0.051467735320329666,
0.03150051459670067,
0.0680689588189125,
0.032269880175590515,
0.034536246210336685,
0.06978167593479156,
0.003739498322829604,
-0.14771364629268646,
0.06893238425254822,
-0.02466719038784504,
-0.027058487758040428,
-0.004288620315492153,
-0.023416128009557724,
-0.09286175668239594,
0.009370674379169941,
0.04455308988690376,
-0.026016823947429657,
-0.03822425752878189,
0.1051405668258667,
-0.02187015488743782,
-0.06505380570888519,
-0.011403966695070267,
-0.08781816810369492,
0.11809476464986801,
0.08991482853889465,
-0.05779912322759628,
-0.043755631893873215,
-0.07230429351329803,
0.03859016299247742,
0.010943213477730751,
-0.10671564936637878,
-0.007832746021449566,
0.006200210656970739,
-0.03659878298640251,
-0.012065219692885876,
0.07866220921278,
-0.10719042271375656,
-0.04018474742770195,
-0.08248031139373779,
-0.014513140544295311,
-0.06685589253902435,
0.11211354285478592,
0.08049221336841583,
0.02339533157646656,
-0.01754174381494522,
-0.15283945202827454,
-0.025975024327635765,
0.06518233567476273,
-0.06190173327922821,
-0.11768250912427902
] |
null | null | transformers |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# flan-t5-base-samsum
This model is a fine-tuned version of [google/flan-t5-base](https://huggingface.co/google/flan-t5-base) on an unknown dataset (the dataset is not recorded in the card, though the model name suggests the SAMSum dialogue-summarization corpus).
It achieves the following results on the evaluation set:
- Loss: 1.3701
- Rouge1: 47.3692
- Rouge2: 23.8422
- Rougel: 39.8714
- Rougelsum: 43.5305
- Gen Len: 17.2393
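The card does not include usage code. For orientation, a minimal inference sketch follows; it assumes the checkpoint behaves as a standard seq2seq summarizer (consistent with the flan-t5 base model and the ROUGE metrics above), uses this repository's id, and feeds a made-up dialogue.
```python
# Illustrative only: repo id taken from this card's metadata, dialogue is invented.
from transformers import pipeline

summarizer = pipeline("summarization", model="kshantam9/flan-t5-base-samsum")

dialogue = (
    "Anna: Are we still on for lunch tomorrow?\n"
    "Ben: Yes, 12:30 at the usual place.\n"
    "Anna: Perfect, see you there!"
)
# The pipeline returns one dict per input; "summary_text" holds the generated summary.
print(summarizer(dialogue, max_length=60)[0]["summary_text"])
```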
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5
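For reference, the list above maps onto the standard `transformers` training arguments roughly as sketched below. This is a hedged reconstruction rather than the original training script: the output directory and the `predict_with_generate` flag are assumptions, and unlisted settings are left at library defaults.
```python
# Hedged reconstruction of the hyperparameters listed above (not the original script).
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="flan-t5-base-samsum",  # assumed output directory
    learning_rate=5e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    num_train_epochs=5,
    lr_scheduler_type="linear",
    adam_beta1=0.9,                    # Adam betas=(0.9, 0.999) from the list above
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    predict_with_generate=True,        # assumed: typically needed to score ROUGE on generated text
)
```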
### Training results
| Training Loss | Epoch | Step | Validation Loss | Rouge1 | Rouge2 | Rougel | Rougelsum | Gen Len |
|:-------------:|:-----:|:----:|:---------------:|:-------:|:-------:|:-------:|:---------:|:-------:|
| 1.4503 | 1.0 | 1842 | 1.3824 | 46.8175 | 23.0643 | 39.3286 | 43.071 | 17.3370 |
| 1.3502 | 2.0 | 3684 | 1.3725 | 47.089 | 23.1145 | 39.7933 | 43.3919 | 17.3944 |
| 1.2812 | 3.0 | 5526 | 1.3701 | 47.3692 | 23.8422 | 39.8714 | 43.5305 | 17.2393 |
| 1.231 | 4.0 | 7368 | 1.3719 | 47.5815 | 23.8343 | 40.0254 | 43.8344 | 17.2930 |
| 1.197 | 5.0 | 9210 | 1.3760 | 47.7141 | 23.987 | 40.1787 | 43.9113 | 17.3065 |
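The reported metrics at the top of the card match the epoch-3 row, which also has the lowest validation loss in this table. ROUGE columns like these are typically computed with the `evaluate` library and scaled by 100; the exact evaluation code is not recorded in the card, so the sketch below uses placeholder texts.
```python
# Minimal ROUGE sketch using the `evaluate` library; the texts are placeholders.
import evaluate

rouge = evaluate.load("rouge")
predictions = ["Anna and Ben will meet for lunch at 12:30."]
references = ["Anna and Ben confirm they are meeting for lunch tomorrow at 12:30."]

scores = rouge.compute(predictions=predictions, references=references, use_stemmer=True)
# Scores come back as fractions; the table above reports them multiplied by 100.
print({k: round(v * 100, 4) for k, v in scores.items()})
```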
### Framework versions
- Transformers 4.37.2
- Pytorch 2.2.0+cu118
- Datasets 2.17.0
- Tokenizers 0.15.2
| {"license": "apache-2.0", "tags": ["generated_from_trainer"], "metrics": ["rouge"], "base_model": "google/flan-t5-base", "model-index": [{"name": "flan-t5-base-samsum", "results": []}]} | text2text-generation | kshantam9/flan-t5-base-samsum | [
"transformers",
"tensorboard",
"safetensors",
"t5",
"text2text-generation",
"generated_from_trainer",
"base_model:google/flan-t5-base",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | 2024-02-14T14:46:12+00:00 | [] | [] | TAGS
#transformers #tensorboard #safetensors #t5 #text2text-generation #generated_from_trainer #base_model-google/flan-t5-base #license-apache-2.0 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
| flan-t5-base-samsum
===================
This model is a fine-tuned version of google/flan-t5-base on an unknown dataset.
It achieves the following results on the evaluation set:
* Loss: 1.3701
* Rouge1: 47.3692
* Rouge2: 23.8422
* Rougel: 39.8714
* Rougelsum: 43.5305
* Gen Len: 17.2393
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 5e-05
* train\_batch\_size: 8
* eval\_batch\_size: 8
* seed: 42
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* num\_epochs: 5
### Training results
### Framework versions
* Transformers 4.37.2
* Pytorch 2.2.0+cu118
* Datasets 2.17.0
* Tokenizers 0.15.2
| [
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 5e-05\n* train\\_batch\\_size: 8\n* eval\\_batch\\_size: 8\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 5",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.37.2\n* Pytorch 2.2.0+cu118\n* Datasets 2.17.0\n* Tokenizers 0.15.2"
] | [
"TAGS\n#transformers #tensorboard #safetensors #t5 #text2text-generation #generated_from_trainer #base_model-google/flan-t5-base #license-apache-2.0 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 5e-05\n* train\\_batch\\_size: 8\n* eval\\_batch\\_size: 8\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 5",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.37.2\n* Pytorch 2.2.0+cu118\n* Datasets 2.17.0\n* Tokenizers 0.15.2"
] | [
80,
98,
4,
33
] | [
"passage: TAGS\n#transformers #tensorboard #safetensors #t5 #text2text-generation #generated_from_trainer #base_model-google/flan-t5-base #license-apache-2.0 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 5e-05\n* train\\_batch\\_size: 8\n* eval\\_batch\\_size: 8\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 5### Training results### Framework versions\n\n\n* Transformers 4.37.2\n* Pytorch 2.2.0+cu118\n* Datasets 2.17.0\n* Tokenizers 0.15.2"
] | [
-0.09501560777425766,
0.09155642241239548,
-0.0018991392571479082,
0.11673951894044876,
0.11770115047693253,
0.002408863278105855,
0.18761299550533295,
0.11437595635652542,
-0.048143625259399414,
0.04104548320174217,
0.14312854409217834,
0.10952679812908173,
0.020868200808763504,
0.15218092501163483,
-0.06412363797426224,
-0.20196448266506195,
0.015727560967206955,
0.02590877376496792,
-0.03778844699263573,
0.13623052835464478,
0.09511331468820572,
-0.11118779331445694,
0.11298200488090515,
-0.002965379972010851,
-0.1484242081642151,
0.0058386968448758125,
0.028337787836790085,
-0.054582905024290085,
0.14219418168067932,
0.04103836417198181,
0.09107845276594162,
0.04443785175681114,
0.054914698004722595,
-0.1834973692893982,
0.014338134787976742,
0.05322951823472977,
-0.011908474378287792,
0.08760517090559006,
0.04552567005157471,
0.0006721749086864293,
0.06338969618082047,
-0.07823678851127625,
0.03789292275905609,
0.027557088062167168,
-0.12771597504615784,
-0.19424332678318024,
-0.07941354811191559,
0.03580579161643982,
0.07001789659261703,
0.09073803573846817,
-0.015627043321728706,
0.14165019989013672,
-0.006172184366732836,
0.10199635475873947,
0.2267705202102661,
-0.329679012298584,
-0.06318607181310654,
0.037764839828014374,
0.05869770795106888,
0.10140175372362137,
-0.08309674263000488,
0.0037995316088199615,
0.05419445037841797,
0.01777624525129795,
0.149794340133667,
-0.028848832473158836,
-0.015289842151105404,
-0.0009265984408557415,
-0.12416322529315948,
-0.036938805133104324,
0.19425687193870544,
0.07361918687820435,
-0.054498590528964996,
-0.07318989187479019,
-0.07733504474163055,
-0.1253947615623474,
-0.016438547521829605,
-0.014243083074688911,
0.04826665669679642,
-0.009242154657840729,
-0.08285046368837357,
-0.06655065715312958,
-0.11146726459264755,
-0.06420731544494629,
-0.04073043167591095,
0.12312870472669601,
0.0184017401188612,
-0.00034277091617695987,
-0.021599076688289642,
0.09717574715614319,
-0.018651623278856277,
-0.15110309422016144,
0.016444917768239975,
0.02151951752603054,
0.027733750641345978,
-0.03824722766876221,
-0.05040156468749046,
-0.1184094175696373,
0.029944106936454773,
0.13418957591056824,
-0.046695709228515625,
0.043392859399318695,
-0.010106934234499931,
0.041995804756879807,
-0.10876572877168655,
0.1739760935306549,
-0.0358460359275341,
-0.049468379467725754,
0.03910801187157631,
0.1054530143737793,
0.07535434514284134,
-0.01963038183748722,
-0.1348981261253357,
0.018319176509976387,
0.12123305350542068,
0.010276902467012405,
-0.03338275104761124,
0.07456787675619125,
-0.05980558320879936,
-0.017185863107442856,
0.018853766843676567,
-0.0850110650062561,
0.010267755016684532,
-0.011914859525859356,
-0.0474272184073925,
-0.07822676748037338,
0.033464159816503525,
0.03322509303689003,
0.00047957044444046915,
0.06695183366537094,
-0.08847207576036453,
-0.008061111904680729,
-0.06301765888929367,
-0.10428588837385178,
0.013453243300318718,
-0.07598178833723068,
0.022562947124242783,
-0.11476394534111023,
-0.2147301733493805,
-0.0011694326531141996,
0.05570392683148384,
-0.02792942523956299,
-0.0631089136004448,
-0.059224747121334076,
-0.06897468864917755,
0.011332610622048378,
-0.018390823155641556,
0.07806971669197083,
-0.06781582534313202,
0.1025322750210762,
0.057382114231586456,
0.052491694688797,
-0.0779217928647995,
0.030575938522815704,
-0.10907411575317383,
0.044557124376297,
-0.15242382884025574,
0.043091848492622375,
-0.021222814917564392,
0.07909766584634781,
-0.09546143561601639,
-0.06636784225702286,
-0.028496889397501945,
-0.008857045322656631,
0.06842736899852753,
0.10290506482124329,
-0.15010787546634674,
-0.06288117170333862,
0.17405495047569275,
-0.08543816953897476,
-0.19267918169498444,
0.13940070569515228,
-0.04529056325554848,
0.07903040200471878,
0.07354503124952316,
0.1951054185628891,
0.05553305894136429,
-0.08351194858551025,
0.009259307757019997,
-0.019865045323967934,
0.06634905189275742,
-0.040675267577171326,
0.09206093102693558,
-0.00916527584195137,
-0.010212291963398457,
0.012115012854337692,
-0.05949654430150986,
0.06381348520517349,
-0.06302022933959961,
-0.08101546764373779,
-0.045411236584186554,
-0.1082269698381424,
0.047794509679079056,
0.03479018807411194,
0.0650981217622757,
-0.11252939701080322,
-0.09601446241140366,
0.03173785284161568,
0.049104224890470505,
-0.08479667454957962,
0.015706786885857582,
-0.07277792692184448,
0.10139253735542297,
-0.08333490788936615,
-0.007844656705856323,
-0.139403834939003,
-0.05502615496516228,
0.02614571526646614,
-0.007425489369779825,
0.02267768234014511,
-0.027211938053369522,
0.08185965567827225,
0.0720476359128952,
-0.07249554246664047,
-0.04128172621130943,
-0.020784644410014153,
0.006911450065672398,
-0.11101745814085007,
-0.17366757988929749,
-0.009377163834869862,
-0.023080360144376755,
0.1655828356742859,
-0.21531404554843903,
0.05272915959358215,
0.006743571721017361,
0.08519389480352402,
0.04006959870457649,
-0.020341258496046066,
-0.02338092029094696,
0.038979049772024155,
-0.05020007863640785,
-0.07252510637044907,
0.06566983461380005,
0.02922854945063591,
-0.12784485518932343,
-0.001972236903384328,
-0.16117607057094574,
0.199088454246521,
0.13360635936260223,
-0.06821486353874207,
-0.049674879759550095,
-0.0027474933303892612,
-0.036029230803251266,
-0.03347719460725784,
-0.038844309747219086,
-0.0272319708019495,
0.12678614258766174,
0.002652536379173398,
0.16285815834999084,
-0.1085318848490715,
-0.04422545060515404,
0.026728658005595207,
-0.03486090153455734,
0.0016876190202310681,
0.09801273792982101,
0.04207485914230347,
-0.1367672085762024,
0.1421579122543335,
0.1929868757724762,
-0.05629503354430199,
0.13759015500545502,
-0.04462333396077156,
-0.054626498371362686,
-0.03134218603372574,
0.0288713201880455,
0.01612132042646408,
0.09748612344264984,
-0.1045282781124115,
0.009191040880978107,
0.011525669135153294,
0.005411813501268625,
0.012741629034280777,
-0.20222289860248566,
-0.026338469237089157,
0.05172111093997955,
-0.06372508406639099,
-0.007711536251008511,
-0.006780411582440138,
-0.025736957788467407,
0.08737587928771973,
0.009084629826247692,
-0.0681643933057785,
0.058585721999406815,
0.0028211623430252075,
-0.08786416798830032,
0.1903480887413025,
-0.0555158369243145,
-0.17257755994796753,
-0.1583300083875656,
-0.06171941012144089,
-0.07241499423980713,
0.03320188447833061,
0.07342851161956787,
-0.03879637271165848,
-0.039468083530664444,
-0.13413110375404358,
-0.003236875170841813,
0.022335512563586235,
0.023259852081537247,
0.02269553393125534,
-0.013088592328131199,
0.09218557178974152,
-0.09514725208282471,
-0.009671999141573906,
-0.0019227623706683517,
-0.02884325198829174,
0.03311443701386452,
-0.0010104374960064888,
0.11675139516592026,
0.10704568028450012,
-0.019478322938084602,
0.008249906823039055,
-0.033433061093091965,
0.23907479643821716,
-0.06300681084394455,
-0.003781513310968876,
0.15223732590675354,
-0.016472946852445602,
0.06109479442238808,
0.13306890428066254,
0.040171779692173004,
-0.0973261296749115,
0.030092936009168625,
0.018326548859477043,
-0.03449811413884163,
-0.20921002328395844,
-0.00469486927613616,
-0.04443884268403053,
0.01430077850818634,
0.10098365694284439,
0.03729113191366196,
0.06507311761379242,
0.07655301690101624,
0.01610235869884491,
0.09480421245098114,
0.014977337792515755,
0.08142077177762985,
0.12892213463783264,
0.05342836678028107,
0.12562412023544312,
-0.0451781190931797,
-0.053096525371074677,
0.038765061646699905,
0.00390227185562253,
0.17557670176029205,
0.014858079142868519,
0.2074207216501236,
0.03107577934861183,
0.1437946856021881,
-0.0077821952290833,
0.07581892609596252,
-0.0030553797259926796,
-0.022771427407860756,
-0.018927564844489098,
-0.0573246069252491,
-0.02846003882586956,
0.03709626942873001,
-0.08033923804759979,
0.0639619454741478,
-0.08089160174131393,
0.026109544560313225,
0.0540410652756691,
0.2625739574432373,
0.04306783527135849,
-0.3550211489200592,
-0.09118258208036423,
0.020260274410247803,
-0.0158840361982584,
-0.036203231662511826,
0.02227885089814663,
0.1379711627960205,
-0.0500517301261425,
0.0552157461643219,
-0.08629617094993591,
0.08585139364004135,
-0.034353967756032944,
0.04702002555131912,
0.059929683804512024,
0.06481549143791199,
-0.010927913710474968,
0.06553133577108383,
-0.281448096036911,
0.25015830993652344,
0.016164515167474747,
0.06294117867946625,
-0.048819176852703094,
0.0007098871283233166,
0.022159846499562263,
0.058949343860149384,
0.08737149834632874,
-0.018325474113225937,
-0.020284485071897507,
-0.15708114206790924,
-0.08710014075040817,
0.02927960641682148,
0.08685712516307831,
-0.06996549665927887,
0.10720083117485046,
-0.051577795296907425,
-0.005341623444110155,
0.06923943012952805,
0.02394993230700493,
-0.0784732773900032,
-0.09844817221164703,
0.0018708703573793173,
0.05716637149453163,
0.011046025902032852,
-0.08565326780080795,
-0.0922398716211319,
-0.12144391983747482,
0.14837561547756195,
-0.02290252409875393,
-0.04601072892546654,
-0.09937003254890442,
0.05581406131386757,
0.05528641119599342,
-0.0770789310336113,
0.042009592056274414,
0.0016631664475426078,
0.09456297010183334,
0.020946891978383064,
-0.05884551256895065,
0.12271945923566818,
-0.05128205567598343,
-0.17572471499443054,
-0.05374941974878311,
0.13314802944660187,
-0.017088869586586952,
0.036221593618392944,
0.0018178485333919525,
0.01868683472275734,
-0.04227009788155556,
-0.06858403980731964,
0.02963022142648697,
-0.042458049952983856,
0.045409467071294785,
-0.0074742319993674755,
-0.01860049180686474,
0.019775325432419777,
-0.06164269894361496,
-0.04752267524600029,
0.15710149705410004,
0.29193761944770813,
-0.06748533248901367,
-0.0028904012870043516,
0.044317565858364105,
-0.047261252999305725,
-0.1645304560661316,
0.009015812538564205,
0.0198376327753067,
0.008713724091649055,
0.06979367882013321,
-0.1291126161813736,
0.06767727434635162,
0.08410822600126266,
-0.02749444730579853,
0.10375244915485382,
-0.28484028577804565,
-0.14773307740688324,
0.09038081020116806,
0.16253827512264252,
0.12159683555364609,
-0.16852112114429474,
-0.06044529005885124,
-0.044694188982248306,
-0.1348838061094284,
0.11386224627494812,
-0.14100980758666992,
0.10446322709321976,
-0.003591664833948016,
0.04899589344859123,
0.007936791516840458,
-0.05042934790253639,
0.12600554525852203,
-0.035960711538791656,
0.0897315964102745,
-0.06595250219106674,
-0.0032209388446062803,
0.08218198269605637,
-0.06357738375663757,
0.03393430635333061,
-0.1514664590358734,
0.045635610818862915,
-0.04509766399860382,
-0.03587576001882553,
-0.04676460102200508,
0.03263886272907257,
-0.033833201974630356,
-0.05129126086831093,
-0.028583968058228493,
0.008519540540874004,
0.04624810069799423,
-0.005076006054878235,
0.16919660568237305,
0.007877596653997898,
0.13324956595897675,
0.15835146605968475,
0.0959249809384346,
-0.06375475972890854,
-0.014919158071279526,
-0.020543215796351433,
-0.04453504458069801,
0.041038285940885544,
-0.14903569221496582,
0.040416352450847626,
0.10704222321510315,
0.004853913094848394,
0.15289725363254547,
0.06320811063051224,
-0.03249968960881233,
0.01732243038713932,
0.0693737342953682,
-0.18908827006816864,
-0.16077199578285217,
-0.046524349600076675,
-0.04195093363523483,
-0.13251139223575592,
0.04674403369426727,
0.1406441628932953,
-0.06542903929948807,
0.0047555118799209595,
-0.007902909070253372,
0.007374797947704792,
-0.024944784119725227,
0.1530231386423111,
0.042829398065805435,
0.043470848351716995,
-0.07580842077732086,
0.08283479511737823,
0.052562691271305084,
-0.05831746757030487,
0.010301320813596249,
0.03532128036022186,
-0.09431241452693939,
-0.03929450362920761,
0.02535233087837696,
0.1672477126121521,
-0.03625284507870674,
-0.04990385100245476,
-0.1615796536207199,
-0.11071381717920303,
0.03822183981537819,
0.15733736753463745,
0.07807301729917526,
0.026436353102326393,
-0.01641070283949375,
-0.002858772873878479,
-0.08583401143550873,
0.12669166922569275,
0.03872624784708023,
0.08667345345020294,
-0.1693888008594513,
0.08622459322214127,
-0.004555772058665752,
0.00893361959606409,
-0.0230109840631485,
0.04298483580350876,
-0.09246223419904709,
-0.012291163206100464,
-0.12288152426481247,
0.007983971387147903,
-0.017938293516635895,
-0.0017491347389295697,
-0.007007995620369911,
-0.06814175099134445,
-0.06600065529346466,
0.015103714540600777,
-0.09223801642656326,
-0.039961885660886765,
0.03947596251964569,
0.054559506475925446,
-0.1221712976694107,
-0.035110242664813995,
0.03612643480300903,
-0.07761569321155548,
0.08527416735887527,
0.017686927691102028,
0.0025374821852892637,
0.03871168568730354,
-0.1565079241991043,
0.04459068551659584,
0.050477657467126846,
0.009070981293916702,
0.022655829787254333,
-0.09714469313621521,
-0.021281398832798004,
0.008606052957475185,
0.030216477811336517,
0.015845173969864845,
0.09062699973583221,
-0.1239921972155571,
-0.00604819692671299,
-0.01199718564748764,
-0.039283931255340576,
-0.055557575076818466,
0.025242816656827927,
0.06939943134784698,
0.01286650262773037,
0.21688587963581085,
-0.0893121287226677,
0.0029992288909852505,
-0.21256056427955627,
0.01805301196873188,
0.007999309338629246,
-0.1258343756198883,
-0.13195031881332397,
-0.06411861628293991,
0.0406523272395134,
-0.0632709488272667,
0.12496283650398254,
-0.012066146358847618,
0.04759214445948601,
0.030335940420627594,
-0.004899183288216591,
0.05310968682169914,
0.016535867005586624,
0.24226713180541992,
0.007045417558401823,
-0.037673063576221466,
0.037593208253383636,
0.022138671949505806,
0.10650566965341568,
0.07791741192340851,
0.15946754813194275,
0.1499491184949875,
-0.051378022879362106,
0.10581893473863602,
0.04597911611199379,
-0.012374681420624256,
-0.14320215582847595,
0.04853355884552002,
-0.029759643599390984,
0.1082535907626152,
-0.015675706788897514,
0.2224605530500412,
0.11141996830701828,
-0.1526045948266983,
0.006178641691803932,
-0.04321993514895439,
-0.06543390452861786,
-0.09141522645950317,
-0.09433472901582718,
-0.10168954730033875,
-0.1447320282459259,
-0.005054041743278503,
-0.10392912477254868,
0.010076349601149559,
0.08513461798429489,
-0.0005494281067512929,
-0.031282879412174225,
0.17678816616535187,
0.013612759299576283,
0.0019141181837767363,
0.05322757363319397,
-0.006601836998015642,
-0.04178227111697197,
-0.06883695721626282,
-0.10245627164840698,
0.005047149956226349,
0.000040581631765235215,
0.02478928118944168,
-0.03377969190478325,
-0.016057897359132767,
0.03192068636417389,
-0.021625177934765816,
-0.1115855872631073,
0.005034361965954304,
0.030740274116396904,
0.04257792979478836,
0.016905520111322403,
0.009105552919209003,
-0.004347328562289476,
0.0065868631936609745,
0.23233070969581604,
-0.07122877985239029,
-0.05877631902694702,
-0.09011081606149673,
0.14833518862724304,
0.005518568679690361,
-0.006553460843861103,
0.013717303983867168,
-0.09417436271905899,
0.043673474341630936,
0.21949966251850128,
0.16455158591270447,
-0.08562098443508148,
-0.0018781080143526196,
-0.013436871580779552,
-0.0068763974122703075,
-0.013336234726011753,
0.07241279631853104,
0.09401370584964752,
-0.008545738644897938,
-0.06283125281333923,
-0.013059183955192566,
-0.03870805352926254,
-0.0037407861091196537,
-0.033128272742033005,
0.07340574264526367,
0.017052490264177322,
0.011428125202655792,
-0.0447913222014904,
0.06477417051792145,
-0.03340320661664009,
-0.086939238011837,
0.010982982814311981,
-0.1968896985054016,
-0.13234105706214905,
-0.028377030044794083,
0.09746543318033218,
-0.01807285100221634,
0.04359379783272743,
-0.025329219177365303,
0.01714857667684555,
0.03591330721974373,
-0.02186412550508976,
-0.0780404582619667,
-0.038163091987371445,
0.05465797707438469,
-0.12675881385803223,
0.22903329133987427,
-0.03804082050919533,
0.0379020981490612,
0.1262393742799759,
0.029898039996623993,
-0.08765368163585663,
0.0935559794306755,
0.04272226616740227,
-0.03534509986639023,
0.049263376742601395,
0.07258438318967819,
-0.02325090765953064,
0.10839038342237473,
0.05314777046442032,
-0.08761171996593475,
0.010819426737725735,
-0.026428887620568275,
-0.05849801003932953,
-0.05407808721065521,
-0.055036548525094986,
-0.060947317630052567,
0.13826191425323486,
0.16166111826896667,
-0.0564461313188076,
0.0010869469260796905,
-0.047627463936805725,
0.022380415350198746,
0.08229195326566696,
0.03936229273676872,
-0.02466702274978161,
-0.2259872853755951,
0.0012963260523974895,
0.08344424515962601,
-0.004928205627948046,
-0.3112720847129822,
-0.0838363990187645,
-0.026187889277935028,
-0.03911565616726875,
-0.09110705554485321,
0.08693628013134003,
0.14453405141830444,
0.04415776580572128,
-0.05756336823105812,
-0.055050842463970184,
-0.08064354956150055,
0.1642914116382599,
-0.13089005649089813,
-0.09593484550714493
] |
null | null | transformers |
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
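No official snippet is provided. A hypothetical loading sketch is given below: the repository id comes from this card's metadata, the checkpoint is assumed to load as a standard Mistral-style causal LM (per the card's tags), and the chat-template branch is a guess based on the "conversational" tag.
```python
# Hypothetical sketch (not an official example); verify that the checkpoint
# actually loads this way before relying on it.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Imran1/AyaChatM3.5"  # repo id from this card's metadata

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)  # add dtype/device options as needed

prompt = "Hello! What can you do?"
if tokenizer.chat_template:  # the "conversational" tag suggests a chat template may be present
    input_ids = tokenizer.apply_chat_template(
        [{"role": "user", "content": prompt}],
        add_generation_prompt=True,
        return_tensors="pt",
    )
else:
    input_ids = tokenizer(prompt, return_tensors="pt").input_ids

output = model.generate(input_ids, max_new_tokens=128)
print(tokenizer.decode(output[0][input_ids.shape[-1]:], skip_special_tokens=True))
```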
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed]
| {"library_name": "transformers", "tags": []} | text-generation | Imran1/AyaChatM3.5 | [
"transformers",
"safetensors",
"mistral",
"text-generation",
"conversational",
"arxiv:1910.09700",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | 2024-02-14T14:46:33+00:00 | [
"1910.09700"
] | [] | TAGS
#transformers #safetensors #mistral #text-generation #conversational #arxiv-1910.09700 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
# Model Card for Model ID
## Model Details
### Model Description
This is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.
- Developed by:
- Funded by [optional]:
- Shared by [optional]:
- Model type:
- Language(s) (NLP):
- License:
- Finetuned from model [optional]:
### Model Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Downstream Use [optional]
### Out-of-Scope Use
## Bias, Risks, and Limitations
### Recommendations
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
## Training Details
### Training Data
### Training Procedure
#### Preprocessing [optional]
#### Training Hyperparameters
- Training regime:
#### Speeds, Sizes, Times [optional]
## Evaluation
### Testing Data, Factors & Metrics
#### Testing Data
#### Factors
#### Metrics
### Results
#### Summary
## Model Examination [optional]
## Environmental Impact
Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).
- Hardware Type:
- Hours used:
- Cloud Provider:
- Compute Region:
- Carbon Emitted:
## Technical Specifications [optional]
### Model Architecture and Objective
### Compute Infrastructure
#### Hardware
#### Software
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Model Card Authors [optional]
## Model Card Contact
| [
"# Model Card for Model ID",
"## Model Details",
"### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:",
"### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Downstream Use [optional]",
"### Out-of-Scope Use",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.",
"## How to Get Started with the Model\n\nUse the code below to get started with the model.",
"## Training Details",
"### Training Data",
"### Training Procedure",
"#### Preprocessing [optional]",
"#### Training Hyperparameters\n\n- Training regime:",
"#### Speeds, Sizes, Times [optional]",
"## Evaluation",
"### Testing Data, Factors & Metrics",
"#### Testing Data",
"#### Factors",
"#### Metrics",
"### Results",
"#### Summary",
"## Model Examination [optional]",
"## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:",
"## Technical Specifications [optional]",
"### Model Architecture and Objective",
"### Compute Infrastructure",
"#### Hardware",
"#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Model Card Authors [optional]",
"## Model Card Contact"
] | [
"TAGS\n#transformers #safetensors #mistral #text-generation #conversational #arxiv-1910.09700 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n",
"# Model Card for Model ID",
"## Model Details",
"### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:",
"### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Downstream Use [optional]",
"### Out-of-Scope Use",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.",
"## How to Get Started with the Model\n\nUse the code below to get started with the model.",
"## Training Details",
"### Training Data",
"### Training Procedure",
"#### Preprocessing [optional]",
"#### Training Hyperparameters\n\n- Training regime:",
"#### Speeds, Sizes, Times [optional]",
"## Evaluation",
"### Testing Data, Factors & Metrics",
"#### Testing Data",
"#### Factors",
"#### Metrics",
"### Results",
"#### Summary",
"## Model Examination [optional]",
"## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:",
"## Technical Specifications [optional]",
"### Model Architecture and Objective",
"### Compute Infrastructure",
"#### Hardware",
"#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Model Card Authors [optional]",
"## Model Card Contact"
] | [
60,
6,
3,
82,
28,
3,
4,
9,
9,
10,
42,
20,
3,
4,
5,
9,
11,
13,
3,
12,
5,
4,
5,
3,
4,
9,
53,
9,
8,
6,
3,
14,
8,
7,
9,
4
] | [
"passage: TAGS\n#transformers #safetensors #mistral #text-generation #conversational #arxiv-1910.09700 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n# Model Card for Model ID## Model Details### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Downstream Use [optional]### Out-of-Scope Use## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.## How to Get Started with the Model\n\nUse the code below to get started with the model.## Training Details### Training Data### Training Procedure#### Preprocessing [optional]#### Training Hyperparameters\n\n- Training regime:#### Speeds, Sizes, Times [optional]## Evaluation### Testing Data, Factors & Metrics#### Testing Data#### Factors#### Metrics### Results#### Summary## Model Examination [optional]## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:## Technical Specifications [optional]### Model Architecture and Objective### Compute Infrastructure#### Hardware#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Model Card Authors [optional]## Model Card Contact"
] | [
-0.04571164771914482,
0.1637648642063141,
-0.005522117950022221,
0.017756497487425804,
0.09821303188800812,
0.01318030059337616,
0.06541220843791962,
0.1127115860581398,
-0.017605241388082504,
0.1127321794629097,
0.030432263389229774,
0.09820804744958878,
0.1134178638458252,
0.14702944457530975,
-0.003594378475099802,
-0.22472713887691498,
0.052083637565374374,
-0.12124937027692795,
-0.03241228312253952,
0.1181139275431633,
0.14941681921482086,
-0.09871039539575577,
0.07234785705804825,
-0.030714161694049835,
-0.01334790326654911,
-0.03167412802577019,
-0.05947697162628174,
-0.045681875199079514,
0.046136777848005295,
0.0657167062163353,
0.06853367388248444,
0.007354621775448322,
0.08972878009080887,
-0.2669793367385864,
0.019881360232830048,
0.06918594241142273,
-0.0025153355672955513,
0.07059336453676224,
0.06344282627105713,
-0.07033728063106537,
0.10271385312080383,
-0.051166124641895294,
0.1467856466770172,
0.08377711474895477,
-0.09116126596927643,
-0.18892322480678558,
-0.08764564990997314,
0.0990586131811142,
0.17651304602622986,
0.04750865325331688,
-0.024397386237978935,
0.09895956516265869,
-0.0878119245171547,
0.015860557556152344,
0.052259236574172974,
-0.07261253148317337,
-0.05407591536641121,
0.061004482209682465,
0.07816638052463531,
0.06616047024726868,
-0.12551534175872803,
-0.02998468652367592,
0.005221198312938213,
0.011705057695508003,
0.07518111169338226,
0.01836656779050827,
0.15222862362861633,
0.03479425609111786,
-0.12653809785842896,
-0.04834689199924469,
0.0983143299818039,
0.03359128534793854,
-0.043975554406642914,
-0.247073233127594,
-0.031072303652763367,
-0.026882093399763107,
-0.030029185116291046,
-0.038772210478782654,
0.04153512790799141,
-0.006745535880327225,
0.08434242010116577,
-0.0040448750369250774,
-0.07344388216733932,
-0.03874153643846512,
0.06087949126958847,
0.0669754296541214,
0.029331250116229057,
-0.013996441848576069,
0.010876164771616459,
0.11490162461996078,
0.10806918889284134,
-0.12199585139751434,
-0.05589085817337036,
-0.06492951512336731,
-0.08786392956972122,
-0.04284887760877609,
0.033410828560590744,
0.03509693965315819,
0.05435176193714142,
0.2536843419075012,
0.009815474040806293,
0.06126174330711365,
0.03745805472135544,
0.007310505956411362,
0.059651583433151245,
0.10812553018331528,
-0.05987109988927841,
-0.10409316420555115,
-0.02881651371717453,
0.08857584744691849,
0.006609630770981312,
-0.03354408219456673,
-0.05052083358168602,
0.05901389569044113,
0.021856583654880524,
0.11749778687953949,
0.08884359151124954,
0.00984770804643631,
-0.07126569002866745,
-0.06146538630127907,
0.19450126588344574,
-0.16384615004062653,
0.04264351725578308,
0.03702449053525925,
-0.039683789014816284,
-0.0003956064465455711,
0.011445282027125359,
0.01843930408358574,
-0.023893611505627632,
0.09238249063491821,
-0.05498874559998512,
-0.04001082479953766,
-0.1106586754322052,
-0.0339570976793766,
0.034455835819244385,
0.010122774168848991,
-0.03529255837202072,
-0.03252722695469856,
-0.08346389979124069,
-0.07506290078163147,
0.09339368343353271,
-0.07379438728094101,
-0.04854428768157959,
-0.018830472603440285,
-0.0752616599202156,
0.02326788194477558,
0.02032634988427162,
0.07736726850271225,
-0.023358777165412903,
0.04288764297962189,
-0.054010841995477676,
0.05824148654937744,
0.11001134663820267,
0.035365406423807144,
-0.05824809893965721,
0.06025301292538643,
-0.2382364422082901,
0.09637492895126343,
-0.07412451505661011,
0.05830197036266327,
-0.15449334681034088,
-0.02627694234251976,
0.04870045557618141,
0.0076532382518053055,
-0.009597796015441418,
0.13436771929264069,
-0.21578943729400635,
-0.026375943794846535,
0.16865074634552002,
-0.10160042345523834,
-0.06946627050638199,
0.05867103114724159,
-0.049256108701229095,
0.10817171633243561,
0.03891118988394737,
-0.025492025539278984,
0.06244310364127159,
-0.12527504563331604,
0.007147894706577063,
-0.04992884770035744,
-0.016554534435272217,
0.1592475026845932,
0.07294736802577972,
-0.07235062122344971,
0.07110220938920975,
0.025814544409513474,
-0.027441376820206642,
-0.04532165080308914,
-0.016039686277508736,
-0.10585595667362213,
0.014911207370460033,
-0.061168964952230453,
0.01876060478389263,
-0.020111115649342537,
-0.08977947384119034,
-0.028080428019165993,
-0.1748371720314026,
-0.026230180636048317,
0.085477814078331,
-0.007464459165930748,
-0.018854627385735512,
-0.11770102381706238,
0.008567224256694317,
0.044854406267404556,
0.006109896115958691,
-0.13499478995800018,
-0.04764661565423012,
0.027907660230994225,
-0.16220368444919586,
0.033779170364141464,
-0.05184612050652504,
0.05056280270218849,
0.026674345135688782,
-0.029802238568663597,
-0.025906935334205627,
0.022987615317106247,
0.006545235402882099,
-0.011514187790453434,
-0.24465326964855194,
-0.026841215789318085,
-0.026506783440709114,
0.166712686419487,
-0.20777921378612518,
0.03577128052711487,
0.08057375997304916,
0.15318496525287628,
0.011457439512014389,
-0.04087435454130173,
0.005527274217456579,
-0.06868630647659302,
-0.025992877781391144,
-0.05823420733213425,
-0.002480053110048175,
-0.03337050974369049,
-0.04843711107969284,
0.04469521716237068,
-0.1662919819355011,
-0.03491327911615372,
0.09593124687671661,
0.06427760422229767,
-0.13986408710479736,
-0.023568401113152504,
-0.03526119887828827,
-0.049809779971838,
-0.047768235206604004,
-0.06002878025174141,
0.11181395500898361,
0.058611296117305756,
0.04419868439435959,
-0.059296321123838425,
-0.07637067884206772,
-0.0028071242850273848,
-0.014342374168336391,
-0.01986078731715679,
0.097631074488163,
0.06816094368696213,
-0.1381729394197464,
0.09227006882429123,
0.09810956567525864,
0.07738673686981201,
0.09273158758878708,
-0.02444581687450409,
-0.08119411021471024,
-0.0471174530684948,
0.03257923200726509,
0.018235107883810997,
0.1276484578847885,
-0.027872784063220024,
0.04268912971019745,
0.0421174094080925,
-0.018595336005091667,
0.013991083949804306,
-0.08597505837678909,
0.033884208649396896,
0.02703946642577648,
-0.0159194003790617,
0.04745442420244217,
-0.037611253559589386,
0.024539871141314507,
0.08754327148199081,
0.04615016281604767,
0.033831849694252014,
0.015717241913080215,
-0.05243339762091637,
-0.10873834043741226,
0.1642032116651535,
-0.12759798765182495,
-0.22238075733184814,
-0.13922695815563202,
0.003997850697487593,
0.036267586052417755,
-0.01646288111805916,
0.002834152430295944,
-0.060960907489061356,
-0.12132686376571655,
-0.08726011961698532,
0.015815909951925278,
0.050406474620103836,
-0.0912260189652443,
-0.060087788850069046,
0.056193675845861435,
0.037736181169748306,
-0.14546552300453186,
0.01776101253926754,
0.04850281774997711,
-0.09700650721788406,
-0.004754792433232069,
0.07885372638702393,
0.06784981489181519,
0.17673011124134064,
0.018112216144800186,
-0.021776698529720306,
0.031116241589188576,
0.20988549292087555,
-0.13491620123386383,
0.11005933582782745,
0.13349974155426025,
-0.09236859530210495,
0.08153878152370453,
0.20252206921577454,
0.04006611555814743,
-0.09986240416765213,
0.032548144459724426,
0.02142537757754326,
-0.027797512710094452,
-0.2441972941160202,
-0.07161470502614975,
-0.004515932407230139,
-0.06051458790898323,
0.07499068230390549,
0.09190185368061066,
0.08272628486156464,
0.011750337667763233,
-0.09449771046638489,
-0.08492138236761093,
0.06362129002809525,
0.10420511662960052,
0.02181125245988369,
-0.009744768962264061,
0.09036174416542053,
-0.03286943957209587,
0.01948373205959797,
0.08554471284151077,
0.0038120283279567957,
0.18320275843143463,
0.051725953817367554,
0.19073979556560516,
0.07944851368665695,
0.06951095163822174,
0.012023290619254112,
0.011227634735405445,
0.018135491758584976,
0.03228217363357544,
-0.003646562807261944,
-0.08350840210914612,
-0.02080707624554634,
0.1153142973780632,
0.0672341138124466,
0.012952476739883423,
0.01729460060596466,
-0.04021955281496048,
0.08128432929515839,
0.18377035856246948,
-0.0093126455321908,
-0.177269846200943,
-0.06024068966507912,
0.07718996703624725,
-0.09723462164402008,
-0.09738315641880035,
-0.01454379502683878,
0.030975129455327988,
-0.1702532023191452,
0.025819219648838043,
-0.023134231567382812,
0.11114585399627686,
-0.13745717704296112,
-0.020040949806571007,
0.07143081724643707,
0.07336213439702988,
0.004178736824542284,
0.055973317474126816,
-0.16574905812740326,
0.1074945405125618,
0.007851972244679928,
0.06788748502731323,
-0.0949488952755928,
0.10003086179494858,
-0.002759356750175357,
-0.016956903040409088,
0.13766175508499146,
0.003847390878945589,
-0.0742180123925209,
-0.07706846296787262,
-0.08544620126485825,
-0.010016623884439468,
0.12665624916553497,
-0.13990990817546844,
0.08602021634578705,
-0.03789570555090904,
-0.04160536453127861,
-0.0009961887262761593,
-0.09994571655988693,
-0.11771732568740845,
-0.18694964051246643,
0.060274846851825714,
-0.13818500936031342,
0.030693015083670616,
-0.1080726683139801,
-0.033236145973205566,
-0.03044886700809002,
0.18898600339889526,
-0.23496590554714203,
-0.07289838045835495,
-0.14654842019081116,
-0.10314314812421799,
0.14515270292758942,
-0.05135014280676842,
0.0824703797698021,
-0.007518251892179251,
0.16955603659152985,
0.01909777894616127,
-0.024870775640010834,
0.09702518582344055,
-0.09090493619441986,
-0.19369281828403473,
-0.07736486196517944,
0.1553725302219391,
0.13563397526741028,
0.03274888917803764,
-0.0031351360958069563,
0.03731042891740799,
-0.016484085470438004,
-0.119691863656044,
0.016338739544153214,
0.17828133702278137,
0.06005066633224487,
0.02449444867670536,
-0.025351086631417274,
-0.12034450471401215,
-0.07065033912658691,
-0.028268499299883842,
0.030481377616524696,
0.1794593334197998,
-0.06955225765705109,
0.18364831805229187,
0.147920161485672,
-0.05845186114311218,
-0.20284810662269592,
0.01105605997145176,
0.03317207098007202,
-0.00011460785754024982,
0.025185899809002876,
-0.19945523142814636,
0.08448769152164459,
0.004838644526898861,
-0.0498092919588089,
0.1281348466873169,
-0.17351724207401276,
-0.14425379037857056,
0.07726620137691498,
0.03829115256667137,
-0.1926836371421814,
-0.12892304360866547,
-0.09138946235179901,
-0.04540696740150452,
-0.18867050111293793,
0.09461917728185654,
0.031194355338811874,
0.009373899549245834,
0.030387504026293755,
0.030604345723986626,
0.01938873715698719,
-0.04181704297661781,
0.1860174536705017,
-0.023930367082357407,
0.028327496722340584,
-0.08596936613321304,
-0.07190530747175217,
0.0391114242374897,
-0.05227291211485863,
0.07252339273691177,
-0.023452037945389748,
0.00719826715067029,
-0.09769386798143387,
-0.04156304895877838,
-0.03843177855014801,
0.01581472158432007,
-0.09648153930902481,
-0.08523351699113846,
-0.04445706307888031,
0.09780744463205338,
0.09553340077400208,
-0.03473082184791565,
-0.024805041030049324,
-0.07508285343647003,
0.04805302992463112,
0.19605006277561188,
0.17889533936977386,
0.03904116898775101,
-0.07846304774284363,
-0.0033101453445851803,
-0.010484009049832821,
0.04490501061081886,
-0.20383046567440033,
0.06269704550504684,
0.05393069609999657,
0.019165942445397377,
0.11697915196418762,
-0.01937638409435749,
-0.15321338176727295,
-0.07137971371412277,
0.062210626900196075,
-0.05747547000646591,
-0.19925202429294586,
0.008424095809459686,
0.062047190964221954,
-0.16446428000926971,
-0.045800499618053436,
0.046785544604063034,
-0.004990153945982456,
-0.03839265555143356,
0.022938871756196022,
0.09231305122375488,
0.0029900665394961834,
0.07426668703556061,
0.052022483199834824,
0.0835016593337059,
-0.1060708537697792,
0.07922257483005524,
0.08730976283550262,
-0.08381073921918869,
0.022620677947998047,
0.10530175268650055,
-0.061487648636102676,
-0.03560204058885574,
0.017662353813648224,
0.08361397683620453,
0.018624287098646164,
-0.03893670439720154,
0.014383325353264809,
-0.1065717563033104,
0.059272702783346176,
0.08645539730787277,
0.03302672877907753,
0.01618802361190319,
0.034192394465208054,
0.04655340686440468,
-0.06840039044618607,
0.122025266289711,
0.032824426889419556,
0.017204686999320984,
-0.035474274307489395,
-0.04102595895528793,
0.01851540431380272,
-0.03368416428565979,
-0.005532157141715288,
-0.03097093477845192,
-0.07835554331541061,
-0.015077406540513039,
-0.16520504653453827,
-0.009829589165747166,
-0.05936548113822937,
0.012285472825169563,
0.031714752316474915,
-0.034721489995718,
0.008415459655225277,
0.009580436162650585,
-0.07713334262371063,
-0.06541574746370316,
-0.01965213567018509,
0.0961783304810524,
-0.1606777459383011,
0.022340767085552216,
0.08350874483585358,
-0.12098895758390427,
0.09293801337480545,
0.01664864458143711,
-0.00869405921548605,
0.02654755860567093,
-0.1516905426979065,
0.03389517217874527,
-0.03324367105960846,
0.009356614202260971,
0.04251125827431679,
-0.2180858999490738,
-0.0012979574967175722,
-0.034122150391340256,
-0.06511902064085007,
-0.008563618175685406,
-0.035606082528829575,
-0.1133907288312912,
0.10431582480669022,
0.007158213295042515,
-0.08918852359056473,
-0.031932637095451355,
0.02896781638264656,
0.08660420775413513,
-0.02103978954255581,
0.1533614844083786,
-0.008595003746449947,
0.07452014833688736,
-0.16158120334148407,
-0.019116591662168503,
-0.0044966633431613445,
0.021838920190930367,
-0.020337330177426338,
-0.011089952662587166,
0.043057333678007126,
-0.02310733124613762,
0.1769370436668396,
-0.034001484513282776,
0.02080564945936203,
0.06879838556051254,
0.02382824197411537,
-0.03270673379302025,
0.10420172661542892,
0.04176081717014313,
0.020029285922646523,
0.016749408096075058,
0.0014026050921529531,
-0.04661702737212181,
-0.03435906395316124,
-0.1965997964143753,
0.07266207784414291,
0.15759599208831787,
0.09697116911411285,
-0.019108884036540985,
0.07821404188871384,
-0.0993313267827034,
-0.10917975008487701,
0.12915705144405365,
-0.04755320027470589,
-0.004375945311039686,
-0.07154709100723267,
0.13273866474628448,
0.14712604880332947,
-0.18722544610500336,
0.07334931939840317,
-0.07133730500936508,
-0.04749078303575516,
-0.10922681540250778,
-0.194550022482872,
-0.05630992352962494,
-0.049111537635326385,
-0.015855323523283005,
-0.04727233946323395,
0.07431400567293167,
0.05443255603313446,
0.007043207995593548,
-0.0018872307846322656,
0.06250270456075668,
-0.02979675866663456,
-0.004455813206732273,
0.033084239810705185,
0.06524696946144104,
0.012280851602554321,
-0.028982065618038177,
0.017169395461678505,
-0.009704679250717163,
0.04565926641225815,
0.06593092530965805,
0.0490880124270916,
-0.02946917712688446,
0.01301988959312439,
-0.040264759212732315,
-0.10370729863643646,
0.044506072998046875,
-0.02268853597342968,
-0.081757090985775,
0.15341326594352722,
0.023376943543553352,
0.008703592233359814,
-0.018961627036333084,
0.23797030746936798,
-0.07337556779384613,
-0.09915944188833237,
-0.14910556375980377,
0.10603363811969757,
-0.037726908922195435,
0.05897798761725426,
0.04798928648233414,
-0.10144850611686707,
0.018896711990237236,
0.1251462697982788,
0.16306589543819427,
-0.03724272549152374,
0.020064668729901314,
0.030806828290224075,
0.005520908627659082,
-0.035788439214229584,
0.04845234379172325,
0.06755134463310242,
0.16263099014759064,
-0.046816933900117874,
0.09447267651557922,
0.0011601726291701198,
-0.09597980976104736,
-0.03777771443128586,
0.10832508653402328,
-0.014584118500351906,
0.018404638394713402,
-0.059979453682899475,
0.11911186575889587,
-0.06456011533737183,
-0.2371375411748886,
0.062140509486198425,
-0.06866546720266342,
-0.13664314150810242,
-0.023452885448932648,
0.08483598381280899,
-0.011404541321098804,
0.028394777327775955,
0.07356005162000656,
-0.07185159623622894,
0.20126941800117493,
0.03666449710726738,
-0.05399559810757637,
-0.054549336433410645,
0.0827551931142807,
-0.09896446764469147,
0.27000707387924194,
0.015913790091872215,
0.048061735928058624,
0.1041264757514,
-0.008932216092944145,
-0.13759581744670868,
0.019727399572730064,
0.0954047441482544,
-0.10358903557062149,
0.041838936507701874,
0.19829733669757843,
-0.0014832824235782027,
0.1230277270078659,
0.07854447513818741,
-0.07668869197368622,
0.0473078191280365,
-0.08185897022485733,
-0.06852826476097107,
-0.0918748751282692,
0.10061057657003403,
-0.07712632417678833,
0.14169210195541382,
0.13906599581241608,
-0.05018797889351845,
0.011615060269832611,
-0.031394075602293015,
0.04402702674269676,
0.0006254917825572193,
0.10420145094394684,
0.002576707163825631,
-0.18477243185043335,
0.02472778968513012,
0.006634650751948357,
0.10846512019634247,
-0.15925930440425873,
-0.09642539173364639,
0.03936212509870529,
0.004935122560709715,
-0.06595125794410706,
0.1294470727443695,
0.055943287909030914,
0.043614063411951065,
-0.039108045399188995,
-0.036952149122953415,
-0.006302761845290661,
0.13504701852798462,
-0.1053730770945549,
0.002390247769653797
] |
null | null | diffusers | ### Violet_Striped_Newt Dreambooth model trained by kimelyle with [TheLastBen's fast-DreamBooth](https://colab.research.google.com/github/TheLastBen/fast-stable-diffusion/blob/main/fast-DreamBooth.ipynb) notebook
Test the concept via A1111 Colab [fast-Colab-A1111](https://colab.research.google.com/github/TheLastBen/fast-stable-diffusion/blob/main/fast_stable_diffusion_AUTOMATIC1111.ipynb)
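Besides the A1111 notebook, the weights can be loaded directly with diffusers. The sketch below is hedged: the repository id comes from this card's metadata, and the trigger phrase is an assumption based on the concept name, so it may differ from the actual instance prompt used during training.
```python
# Hypothetical diffusers sketch; the trigger phrase is assumed from the concept name.
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "kimelyle/Cave-Crayfish-2",  # repo id from this card's metadata
    torch_dtype=torch.float16,
).to("cuda")

prompt = "a photo of Violet_Striped_Newt on a mossy rock"  # assumed instance prompt
image = pipe(prompt, num_inference_steps=30, guidance_scale=7.5).images[0]
image.save("violet_striped_newt.png")
```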
Sample pictures of this concept:
| {"license": "creativeml-openrail-m", "tags": ["text-to-image", "stable-diffusion"]} | text-to-image | kimelyle/Cave-Crayfish-2 | [
"diffusers",
"text-to-image",
"stable-diffusion",
"license:creativeml-openrail-m",
"endpoints_compatible",
"diffusers:StableDiffusionPipeline",
"region:us"
] | 2024-02-14T14:49:26+00:00 | [] | [] | TAGS
#diffusers #text-to-image #stable-diffusion #license-creativeml-openrail-m #endpoints_compatible #diffusers-StableDiffusionPipeline #region-us
| ### Violet_Striped_Newt Dreambooth model trained by kimelyle with TheLastBen's fast-DreamBooth notebook
Test the concept via A1111 Colab fast-Colab-A1111
Sample pictures of this concept:
| [
"### Violet_Striped_Newt Dreambooth model trained by kimelyle with TheLastBen's fast-DreamBooth notebook\n\n\nTest the concept via A1111 Colab fast-Colab-A1111\n\nSample pictures of this concept:"
] | [
"TAGS\n#diffusers #text-to-image #stable-diffusion #license-creativeml-openrail-m #endpoints_compatible #diffusers-StableDiffusionPipeline #region-us \n",
"### Violet_Striped_Newt Dreambooth model trained by kimelyle with TheLastBen's fast-DreamBooth notebook\n\n\nTest the concept via A1111 Colab fast-Colab-A1111\n\nSample pictures of this concept:"
] | [
56,
55
] | [
"passage: TAGS\n#diffusers #text-to-image #stable-diffusion #license-creativeml-openrail-m #endpoints_compatible #diffusers-StableDiffusionPipeline #region-us \n### Violet_Striped_Newt Dreambooth model trained by kimelyle with TheLastBen's fast-DreamBooth notebook\n\n\nTest the concept via A1111 Colab fast-Colab-A1111\n\nSample pictures of this concept:"
] | [
-0.06641834229230881,
0.038550082594156265,
-0.002690935041755438,
0.08203636854887009,
-0.01683942601084709,
-0.01951966993510723,
0.12152031809091568,
0.006827804259955883,
0.0023514176718890667,
0.042438607662916183,
0.1434394270181656,
0.06001898646354675,
-0.006235531996935606,
0.10102178156375885,
-0.02916635200381279,
-0.19611333310604095,
0.014443203806877136,
0.055801428854465485,
-0.06671919673681259,
0.05325128883123398,
0.06982598453760147,
-0.09389358758926392,
0.09828753024339676,
-0.010107272304594517,
-0.14855584502220154,
-0.016256315633654594,
-0.07913721352815628,
-0.0473734512925148,
0.07246742397546768,
0.04527891427278519,
0.059126466512680054,
0.12927360832691193,
0.007329490967094898,
-0.059123825281858444,
0.046757232397794724,
-0.03665022552013397,
-0.0036750556901097298,
0.049283381551504135,
0.019051766023039818,
0.08014373481273651,
0.09839171171188354,
0.12603774666786194,
0.0058097452856600285,
-0.010397343896329403,
-0.04582497850060463,
0.1253000795841217,
0.0018890151986852288,
0.047451674938201904,
0.013636389747262001,
0.010998410172760487,
0.018232066184282303,
0.06419175863265991,
0.004595848731696606,
0.10957109928131104,
0.15143655240535736,
-0.26713278889656067,
-0.10448460280895233,
0.21475692093372345,
0.148207426071167,
-0.07050681859254837,
-0.03460089489817619,
0.09032599627971649,
0.0558745339512825,
0.028667805716395378,
-0.01273300126194954,
-0.08642380684614182,
-0.05052004009485245,
-0.07078304141759872,
-0.10151134431362152,
0.036798931658267975,
0.14482514560222626,
0.005155651830136776,
-0.04739892855286598,
-0.013503934256732464,
-0.1116470918059349,
0.1265919953584671,
-0.05104284733533859,
-0.03499355912208557,
-0.0035636723041534424,
-0.0073837824165821075,
-0.07212541252374649,
-0.06767111271619797,
-0.09044868499040604,
-0.06399789452552795,
-0.050818853080272675,
0.1627829521894455,
-0.020859163254499435,
0.05225593224167824,
-0.07246708869934082,
0.14966769516468048,
-0.04250088706612587,
-0.15936748683452606,
-0.0029940009117126465,
-0.12990540266036987,
0.09307670593261719,
0.011848014779388905,
0.0001082078306353651,
-0.07664672285318375,
0.09915021806955338,
0.008362120017409325,
0.16271620988845825,
-0.020197179168462753,
0.07357082515954971,
0.08746248483657837,
0.015689220279455185,
-0.03323451057076454,
-0.026945466175675392,
-0.12375668436288834,
-0.012635746039450169,
0.007967920042574406,
0.007086880970746279,
-0.04071060195565224,
-0.07411783188581467,
0.006248038727790117,
-0.07838525623083115,
-0.013528356328606606,
0.04295303672552109,
-0.009425929747521877,
-0.056397538632154465,
-0.03956517577171326,
0.0934823676943779,
0.01943943277001381,
-0.028847981244325638,
-0.07585343718528748,
-0.06876826286315918,
0.03454645723104477,
0.09848897159099579,
-0.028485136106610298,
0.02022762969136238,
0.12337999045848846,
-0.10232368111610413,
-0.02183554880321026,
-0.034005604684352875,
-0.03364499285817146,
0.005904588848352432,
-0.08415495604276657,
0.05583339184522629,
-0.11703185737133026,
-0.17647898197174072,
0.002386843552812934,
0.07668566703796387,
-0.0796113833785057,
-0.014912496320903301,
-0.021597398445010185,
-0.1229923889040947,
0.016291934996843338,
0.049318235367536545,
-0.026452941820025444,
-0.012334730476140976,
0.048934753984212875,
0.038311611860990524,
0.1327662318944931,
-0.0911371037364006,
-0.016373341903090477,
-0.08685853332281113,
0.03003901243209839,
-0.1254134476184845,
0.03442903235554695,
-0.09429246932268143,
0.10754859447479248,
-0.026895830407738686,
-0.05116557329893112,
0.017942002043128014,
0.024036623537540436,
0.032981839030981064,
0.20392447710037231,
-0.1662045568227768,
-0.002717429306358099,
0.0775548592209816,
-0.11965954303741455,
-0.19870828092098236,
0.05508138984441757,
0.013157765381038189,
0.1820525974035263,
0.0065069948323071,
0.11622810363769531,
0.06490407139062881,
-0.25758570432662964,
0.006906038150191307,
0.03372785449028015,
-0.09310249984264374,
-0.06611265242099762,
0.01732087880373001,
0.10724537074565887,
-0.01996253803372383,
0.01699003018438816,
-0.03908100724220276,
0.05012546107172966,
-0.10299848765134811,
-0.020016662776470184,
-0.05058085918426514,
-0.06907561421394348,
0.04181050881743431,
0.005502508021891117,
0.04235859587788582,
-0.050528451800346375,
0.031249145045876503,
0.03173746168613434,
0.01156133133918047,
0.008759241551160812,
-0.030738750472664833,
-0.07270727306604385,
0.058535993099212646,
-0.03384210914373398,
-0.019265776500105858,
-0.05860178545117378,
-0.04317917302250862,
0.019062163308262825,
0.13072392344474792,
-0.002822374226525426,
0.18614892661571503,
0.047758977860212326,
0.08157292008399963,
0.007695184089243412,
-0.02957472950220108,
0.039425890892744064,
0.040259506553411484,
-0.0508105605840683,
-0.1510094255208969,
0.07221877574920654,
-0.0690564289689064,
-0.06072727590799332,
-0.11719055473804474,
0.03155459091067314,
0.08513811230659485,
0.17474214732646942,
0.0578787736594677,
0.015632202848792076,
0.048256661742925644,
-0.011462160386145115,
-0.03633366525173187,
-0.0469638854265213,
0.05428171157836914,
0.03020426072180271,
0.02031862549483776,
0.08348890393972397,
0.000658064556773752,
0.26794084906578064,
0.08038339763879776,
0.005741239059716463,
-0.051896240562200546,
0.011628505773842335,
-0.0344998724758625,
-0.011152786202728748,
-0.05126745253801346,
0.048629917204380035,
0.022123146802186966,
-0.02442983165383339,
0.13667012751102448,
-0.05032294988632202,
0.01078425906598568,
0.03951341658830643,
-0.08311772346496582,
-0.020907247439026833,
0.08406826853752136,
-0.01266208291053772,
-0.12465687841176987,
0.028567969799041748,
0.15939924120903015,
-0.0424872487783432,
0.17137248814105988,
0.01736670359969139,
0.020912975072860718,
-0.09205081313848495,
0.010065433569252491,
-0.02105778455734253,
0.2092631608247757,
-0.10501366853713989,
0.012509926222264767,
0.014856788329780102,
-0.056673914194107056,
0.030012117698788643,
-0.09589135646820068,
-0.04191828519105911,
0.0394279845058918,
0.030514005571603775,
0.1648908108472824,
0.09978404641151428,
-0.12165511399507523,
0.03287627547979355,
-0.07718867063522339,
-0.16840438544750214,
0.03025052696466446,
0.00623679906129837,
0.02639957331120968,
0.12543603777885437,
-0.04148046672344208,
-0.1757291555404663,
-0.10999539494514465,
-0.10050154477357864,
0.0020452511962503195,
-0.03635912016034126,
0.06473571807146072,
0.0259973406791687,
-0.04462898522615433,
-0.07317138463258743,
0.03284969553351402,
-0.02805327996611595,
0.021191222593188286,
0.024968648329377174,
0.02001313678920269,
-0.09713082760572433,
-0.04717402160167694,
0.0018483118619769812,
-0.018316518515348434,
0.1368895024061203,
0.13859570026397705,
-0.029339052736759186,
0.12876848876476288,
0.10821383446455002,
-0.03509252145886421,
-0.004483293741941452,
0.03132081776857376,
0.2616937756538391,
-0.026537425816059113,
0.1259952038526535,
0.15357708930969238,
0.08500035107135773,
0.07451440393924713,
0.15767252445220947,
0.06521893292665482,
-0.05656072497367859,
0.08062373101711273,
-0.0976010113954544,
-0.07415526360273361,
-0.09078508615493774,
-0.08814053237438202,
-0.016589563339948654,
0.10457447171211243,
-0.005868707317858934,
0.04434880241751671,
0.03088463470339775,
0.1475890725851059,
0.11146634072065353,
0.008379350416362286,
-0.08116896450519562,
0.0828850045800209,
0.17106769979000092,
-0.08867529034614563,
0.045922860503196716,
-0.06327176094055176,
-0.08742458373308182,
0.0950663685798645,
-0.013745852746069431,
0.028929388150572777,
-0.04493573680520058,
-0.07603761553764343,
0.07203606516122818,
0.10410236567258835,
0.12122445553541183,
0.10347853600978851,
-0.005066056735813618,
-0.08609530329704285,
-0.039321720600128174,
-0.10226650536060333,
0.05792562663555145,
0.07441753149032593,
-0.06859125941991806,
-0.05110207945108414,
0.04793865606188774,
0.08776941895484924,
-0.025752075016498566,
-0.0041350978426635265,
0.1885611116886139,
-0.2065115123987198,
-0.030455395579338074,
-0.028087811544537544,
0.04712672159075737,
-0.09850580990314484,
0.016237875446677208,
0.250525563955307,
-0.017023596912622452,
-0.010979306884109974,
-0.06210405379533768,
0.06627877056598663,
0.06213598698377609,
0.011084326542913914,
-0.055103860795497894,
-0.0021332851611077785,
-0.019566992297768593,
0.01802610233426094,
-0.13298141956329346,
0.10363146662712097,
-0.01869390718638897,
0.059514399617910385,
0.012218144722282887,
-0.02383466064929962,
0.032585445791482925,
0.1748635470867157,
0.13723845779895782,
-0.023986177518963814,
0.08609879016876221,
0.016954777762293816,
-0.10554983466863632,
-0.01539886835962534,
0.06974431872367859,
0.06663329154253006,
0.03791241720318794,
0.056070417165756226,
-0.02362174727022648,
-0.00639962637796998,
0.0057432567700743675,
-0.17879682779312134,
-0.039554961025714874,
0.06326747685670853,
0.061699029058218,
0.003580173710361123,
-0.052139513194561005,
-0.062080055475234985,
0.12185356765985489,
0.13201767206192017,
-0.06667131930589676,
-0.05313260480761528,
-0.092188261449337,
-0.09235527366399765,
0.09270089864730835,
-0.02107950672507286,
0.07890782505273819,
-0.0946609154343605,
0.03653210029006004,
-0.03714510053396225,
-0.06547076255083084,
0.042412012815475464,
-0.13602836430072784,
-0.07035751640796661,
-0.14367054402828217,
0.005932298954576254,
-0.03901221603155136,
-0.0019199474481865764,
0.026183048263192177,
-0.014092065393924713,
-0.09535692632198334,
-0.09024842828512192,
-0.006286682095378637,
0.013798377476632595,
-0.11669609695672989,
-0.048909854143857956,
0.010991094633936882,
0.06181120127439499,
0.0068992506712675095,
-0.009591449052095413,
0.06043995916843414,
0.2423652708530426,
-0.05451325699687004,
0.055458199232816696,
0.15970541536808014,
-0.04603542387485504,
-0.2741565406322479,
-0.12442925572395325,
-0.04886443167924881,
0.011120091192424297,
-0.033595189452171326,
-0.10366471856832504,
0.19868683815002441,
-0.006086275912821293,
-0.03240831196308136,
0.19493748247623444,
-0.34072205424308777,
-0.07900750637054443,
0.05987809970974922,
0.12299312651157379,
0.34449881315231323,
-0.11129540950059891,
-0.08385514467954636,
-0.020956920459866524,
-0.2672390937805176,
0.09786885976791382,
0.021803995594382286,
0.08064364641904831,
-0.10594597458839417,
0.028690211474895477,
-0.010030187666416168,
-0.0423477366566658,
0.13573633134365082,
-0.08746609091758728,
0.03215061500668526,
-0.10457572340965271,
-0.0022890660911798477,
0.16571365296840668,
-0.0352473258972168,
0.06942596286535263,
-0.053323131054639816,
0.1135043278336525,
-0.07423341274261475,
-0.02477562054991722,
-0.02881774865090847,
0.06139713525772095,
-0.05954604223370552,
-0.09664556384086609,
-0.10217645019292831,
0.035269953310489655,
-0.03689912334084511,
-0.015926651656627655,
-0.11802610009908676,
0.024419443681836128,
-0.12367616593837738,
0.21472753584384918,
-0.025628715753555298,
-0.08630244433879852,
-0.07101628184318542,
0.00029507401632145047,
-0.05774776265025139,
0.08425460755825043,
-0.10304250568151474,
-0.0952225923538208,
0.19147025048732758,
0.0682922825217247,
0.02236524596810341,
0.03989977389574051,
-0.027515357360243797,
0.0025568739511072636,
0.09410254657268524,
-0.16961655020713806,
-0.060446955263614655,
-0.054775420576334,
0.2167336791753769,
0.04001424461603165,
0.008745228871703148,
0.15007220208644867,
-0.11499465256929398,
0.03535004332661629,
-0.036520082503557205,
-0.04008501023054123,
-0.0012836650712415576,
0.1291932314634323,
0.04497070237994194,
0.05248143523931503,
-0.04338008910417557,
0.05625300109386444,
-0.05688348039984703,
-0.11349987983703613,
-0.09890798479318619,
0.08522799611091614,
-0.08509986847639084,
-0.0713215172290802,
0.06315889209508896,
0.15386153757572174,
-0.19661104679107666,
0.017146490514278412,
-0.10295984894037247,
-0.09790647029876709,
0.03966231271624565,
0.2322053760290146,
0.06833349168300629,
0.04917372390627861,
-0.04443395137786865,
-0.06530334800481796,
0.015439589507877827,
0.06850582361221313,
0.057217516005039215,
0.10144630819559097,
-0.1801186054944992,
-0.05063154548406601,
-0.060217924416065216,
0.04695615917444229,
-0.09274564683437347,
-0.019017953425645828,
-0.07994718849658966,
-0.013773346319794655,
-0.05445345118641853,
0.09541656076908112,
-0.05578022822737694,
-0.07082714885473251,
-0.021055109798908234,
-0.011717671528458595,
-0.02169531024992466,
0.007561421021819115,
-0.060179658234119415,
0.036376241594552994,
0.003358684480190277,
0.0028601009398698807,
-0.046786922961473465,
-0.04776574298739433,
0.052331335842609406,
-0.06238716095685959,
0.04648606851696968,
-0.05721794068813324,
-0.1032446026802063,
-0.04309297353029251,
-0.11587764322757721,
-0.04800600931048393,
0.08977408707141876,
-0.009045290760695934,
0.03866764158010483,
0.060989297926425934,
-0.03379964083433151,
-0.0347016267478466,
0.061986587941646576,
-0.0030593674164265394,
0.07012362778186798,
-0.09263219684362411,
-0.06874772161245346,
-0.03348588943481445,
-0.04192386195063591,
-0.09105490893125534,
0.00010090794239658862,
0.13623307645320892,
0.09439293295145035,
0.12582151591777802,
-0.09087119996547699,
0.04306621849536896,
-0.017152119427919388,
-0.0003463233297225088,
0.06453593820333481,
-0.10314184427261353,
0.10247141122817993,
-0.01977301388978958,
-0.0185967106372118,
-0.002188062760978937,
0.10043730586767197,
0.027265748009085655,
-0.22245480120182037,
-0.006654144264757633,
-0.0851367712020874,
-0.013633069582283497,
0.027062300592660904,
0.24522168934345245,
0.03818318620324135,
0.03934898227453232,
-0.15988729894161224,
0.08530384302139282,
0.07292632013559341,
0.15789707005023956,
0.04210306331515312,
0.08400565385818481,
0.024969082325696945,
0.14107249677181244,
0.02145584300160408,
0.08834320306777954,
-0.0419924333691597,
-0.019418703392148018,
-0.04848393052816391,
0.15362142026424408,
-0.061682481318712234,
0.022400321438908577,
0.0538121722638607,
0.017235249280929565,
-0.03435809537768364,
0.027611661702394485,
-0.08312152326107025,
-0.015008826740086079,
-0.02058311179280281,
-0.06918635219335556,
-0.07324551045894623,
0.025587305426597595,
-0.07882248610258102,
0.00977508444339037,
0.044238872826099396,
0.039019376039505005,
-0.0100631732493639,
0.11090924590826035,
-0.02459675632417202,
-0.012264564633369446,
0.15928268432617188,
-0.023536749184131622,
-0.0859101414680481,
0.0028035673312842846,
0.05812597647309303,
-0.09192369878292084,
0.07614085078239441,
-0.0828365683555603,
0.042675718665122986,
-0.022465188056230545,
-0.00923229567706585,
0.05178401246666908,
-0.0630408301949501,
-0.024100646376609802,
0.012453182600438595,
0.03940850496292114,
0.045032087713479996,
0.028846673667430878,
-0.007874447852373123,
0.002872152952477336,
0.16634714603424072,
-0.058862172067165375,
-0.14784999191761017,
-0.06645956635475159,
0.0707724466919899,
-0.09426701068878174,
0.08664726465940475,
-0.03237761929631233,
-0.022413162514567375,
-0.06379012763500214,
0.1765233427286148,
0.10360999405384064,
-0.1270081251859665,
-0.013973970897495747,
0.006536753382533789,
-0.0020501238759607077,
-0.02678241766989231,
0.033380281180143356,
-0.014077461324632168,
0.22928108274936676,
-0.100586898624897,
-0.09593671560287476,
-0.10390244424343109,
-0.08928786218166351,
-0.03618412837386131,
-0.17694659531116486,
0.07500998675823212,
-0.04578631371259689,
-0.12285307794809341,
0.1005437970161438,
-0.1665278673171997,
-0.05405128374695778,
0.1975567638874054,
-0.062351763248443604,
-0.053160410374403,
-0.04189402610063553,
0.12044209986925125,
0.02231891080737114,
0.08916069567203522,
-0.1103857010602951,
0.010570058599114418,
0.04003197327256203,
-0.030356494709849358,
-0.13810575008392334,
0.09227889031171799,
0.02390551008284092,
-0.21870645880699158,
0.12373048812150955,
0.005874559283256531,
0.07975174486637115,
0.07842680811882019,
-0.021656019613146782,
-0.12335260957479477,
0.05082147195935249,
0.00641535222530365,
-0.10633514076471329,
-0.04160630702972412,
0.09733060747385025,
0.03985327109694481,
-0.007229972630739212,
0.01289683859795332,
-0.12642869353294373,
-0.0209675133228302,
0.18633098900318146,
0.030917225405573845,
-0.13981279730796814,
0.05964827165007591,
-0.017299100756645203,
0.0673719272017479,
0.07147105783224106,
-0.04724118113517761,
-0.014467309229075909,
0.001991100376471877,
0.04628443345427513,
0.014385794289410114,
-0.09249278157949448,
0.03128251060843468,
-0.03495987132191658,
-0.010176555253565311,
-0.07982178032398224,
-0.02642117813229561,
-0.21732065081596375,
-0.07907360047101974,
-0.15054906904697418,
0.03894459083676338,
-0.008979486301541328,
0.08547873049974442,
0.21499507129192352,
0.02562081441283226,
0.028044989332556725,
-0.03683153912425041,
-0.0429266020655632,
0.010963743552565575,
-0.02193964459002018,
-0.1237839087843895
] |
null | null | null | https://civitai.com/models/173812/clorinde-or-genshin-impact | {"license": "creativeml-openrail-m"} | null | LarryAIDraw/Char-Genshin-Clorinde-v2 | [
"license:creativeml-openrail-m",
"region:us"
] | 2024-02-14T14:51:43+00:00 | [] | [] | TAGS
#license-creativeml-openrail-m #region-us
| URL | [] | [
"TAGS\n#license-creativeml-openrail-m #region-us \n"
] | [
18
] | [
"passage: TAGS\n#license-creativeml-openrail-m #region-us \n"
] | [
-0.07587551325559616,
0.1441737711429596,
-0.0062791393138468266,
0.012048184871673584,
-0.001431003911420703,
-0.022854028269648552,
0.2091037780046463,
-0.018623588606715202,
0.08854977041482925,
-0.11491455882787704,
0.14648450911045074,
0.18939465284347534,
-0.10384178161621094,
0.0838744044303894,
-0.061768148094415665,
-0.13200531899929047,
0.029243366792798042,
-0.07651498913764954,
-0.0865340456366539,
0.028722204267978668,
0.056829702109098434,
-0.01273291651159525,
-0.003666024887934327,
-0.0012952570104971528,
-0.11045186221599579,
0.07173702865839005,
-0.029841862618923187,
-0.037320639938116074,
0.060927797108888626,
-0.04866224527359009,
0.04899880662560463,
0.11812204867601395,
-0.033462416380643845,
-0.13358792662620544,
0.004443002864718437,
-0.11795501410961151,
-0.13281011581420898,
0.007506446447223425,
0.121794693171978,
-0.0353701114654541,
0.12644833326339722,
0.17882929742336273,
0.0022871040273457766,
0.07042364031076431,
-0.1692226231098175,
-0.17680460214614868,
-0.04340395703911781,
-0.018681490793824196,
-0.026622790843248367,
0.0532202385365963,
0.11296376585960388,
0.0959911122918129,
-0.1474708467721939,
0.059626504778862,
0.08025065064430237,
-0.29932230710983276,
0.03342466056346893,
0.23123668134212494,
0.11160528659820557,
0.03646189346909523,
-0.04899992793798447,
0.06103713810443878,
0.037279851734638214,
-0.055691562592983246,
-0.011489230208098888,
-0.07466674596071243,
0.033063821494579315,
0.1203068420290947,
-0.048032116144895554,
-0.025952165946364403,
0.3207513689994812,
-0.011608880013227463,
0.004257023800164461,
0.03850623592734337,
-0.046627260744571686,
0.03471478819847107,
0.053042974323034286,
0.07628075033426285,
0.05806995555758476,
0.1503586620092392,
0.06162842735648155,
-0.11057397723197937,
-0.12041215598583221,
0.018044639378786087,
-0.14939343929290771,
0.16419777274131775,
-0.05087574943900108,
0.0932750254869461,
-0.11752020567655563,
0.018267955631017685,
-0.0651155412197113,
-0.03550999239087105,
-0.010290741920471191,
-0.14436741173267365,
0.09543514996767044,
-0.00750720826908946,
-0.044816359877586365,
-0.06333030760288239,
0.06353012472391129,
0.134693443775177,
0.06326734274625778,
-0.01916888915002346,
0.03110724687576294,
0.18312698602676392,
0.02453736774623394,
-0.039170458912849426,
0.02620672434568405,
0.14288429915905,
0.03429737314581871,
-0.1762668490409851,
-0.0059744445607066154,
-0.0644608810544014,
-0.1936662793159485,
-0.02320769429206848,
-0.19997692108154297,
0.16352415084838867,
-0.030033577233552933,
-0.016221072524785995,
-0.03707468882203102,
0.022218478843569756,
0.04353277385234833,
0.007484832778573036,
0.018807580694556236,
-0.044244956225156784,
-0.08294660598039627,
-0.08514150232076645,
-0.020517800003290176,
0.05681263282895088,
0.07853931933641434,
0.18057872354984283,
-0.12033670395612717,
0.0023163571022450924,
-0.04746192321181297,
-0.002028648741543293,
0.10751507431268692,
-0.1799560934305191,
0.05942503362894058,
-0.10612065345048904,
-0.21264076232910156,
-0.0035186251625418663,
0.11188323050737381,
0.02211635187268257,
0.00010340322478441522,
0.023470120504498482,
-0.042402785271406174,
-0.03322858735918999,
-0.06714189052581787,
-0.09123854339122772,
-0.07618846744298935,
0.0644230917096138,
-0.15088342130184174,
-0.06908489763736725,
-0.27447474002838135,
0.021657612174749374,
-0.11370886117219925,
0.030269425362348557,
0.09551744163036346,
-0.08233252167701721,
-0.11906278878450394,
0.24992190301418304,
0.07235409319400787,
0.07105377316474915,
-0.037106942385435104,
-0.02335505001246929,
-0.040998950600624084,
0.07576625794172287,
-0.051450882107019424,
0.006896975915879011,
0.06892602890729904,
-0.05309505760669708,
-0.13028347492218018,
-0.018723927438259125,
-0.04109232872724533,
0.13036558032035828,
-0.005558064207434654,
0.30143606662750244,
0.04775548353791237,
-0.18540549278259277,
0.20458267629146576,
0.13462620973587036,
-0.17578788101673126,
-0.3525811433792114,
0.10510481148958206,
-0.08032525330781937,
-0.12903624773025513,
0.02135874517261982,
0.05760384723544121,
0.08029629290103912,
-0.016704760491847992,
-0.03554001823067665,
0.003427563700824976,
-0.061561521142721176,
-0.016107140108942986,
0.031175263226032257,
0.09541988372802734,
-0.08737137913703918,
0.08379733562469482,
0.03426050394773483,
-0.0114505710080266,
0.14006270468235016,
-0.02073829248547554,
-0.0763879269361496,
0.02079492248594761,
0.04172089695930481,
-0.020384199917316437,
-0.056601639837026596,
-0.019958069548010826,
0.024005193263292313,
-0.017852509394288063,
0.10743143409490585,
0.29301881790161133,
0.0457768440246582,
-0.015894168987870216,
0.050522804260253906,
0.02892244979739189,
0.031187754124403,
0.04622279107570648,
0.002081167884171009,
-0.15730762481689453,
0.07284589111804962,
-0.05682012811303139,
-0.09314198791980743,
-0.03167767822742462,
-0.0017506676958873868,
0.0981268361210823,
-0.05222945287823677,
0.06663653254508972,
0.04907272756099701,
0.008146014995872974,
-0.0024776349309831858,
0.019724633544683456,
0.03505800664424896,
0.15693770349025726,
0.06973138451576233,
-0.09330075234174728,
0.2326427847146988,
-0.07795968651771545,
0.3451519012451172,
0.06519531458616257,
-0.17186447978019714,
0.0015280802035704255,
-0.16536928713321686,
-0.08274903148412704,
0.009426575154066086,
0.06846177577972412,
0.04244798794388771,
-0.06766051799058914,
-0.0681324228644371,
0.1076645776629448,
-0.05602144077420235,
-0.05967314541339874,
-0.09208252280950546,
-0.06438151746988297,
-0.09841792285442352,
0.11479154229164124,
0.17103825509548187,
-0.17601613700389862,
0.14707137644290924,
0.31644511222839355,
0.0033473046496510506,
0.20550797879695892,
-0.06598898768424988,
0.06533558666706085,
-0.11870601028203964,
0.06948951631784439,
-0.033792875707149506,
0.1264963299036026,
-0.10152938961982727,
0.04339653253555298,
0.01719778962433338,
0.05835990980267525,
0.12580721080303192,
-0.1375611275434494,
-0.2047722488641739,
0.05393601953983307,
0.04846670478582382,
-0.08490802347660065,
0.15654030442237854,
-0.07621043175458908,
0.03958071768283844,
-0.04002580791711807,
-0.10932640731334686,
0.16022461652755737,
-0.07396190613508224,
-0.03576399013400078,
0.04601873457431793,
-0.162797212600708,
0.04817049205303192,
-0.13655415177345276,
-0.20034807920455933,
-0.03256381303071976,
0.011739566922187805,
0.09091648459434509,
0.0064963698387146,
-0.045913100242614746,
0.008927296847105026,
-0.1321311742067337,
-0.24660253524780273,
-0.10214889049530029,
-0.04224977269768715,
0.1463703066110611,
-0.09529456496238708,
-0.08689732849597931,
-0.008191614411771297,
-0.027925807982683182,
0.0383632630109787,
0.0873899981379509,
-0.04390016943216324,
0.15604910254478455,
0.13776685297489166,
0.03233470022678375,
0.07692384719848633,
-0.0302706528455019,
0.16908830404281616,
0.07715359330177307,
-0.09182680398225784,
0.09044599533081055,
-0.006939579267054796,
0.07778391242027283,
0.26205286383628845,
0.13615888357162476,
-0.10827198624610901,
0.0021787171717733145,
-0.09298930317163467,
-0.13136249780654907,
-0.25473496317863464,
-0.03117409534752369,
-0.15477068722248077,
0.13437145948410034,
-0.08579761534929276,
0.08686056733131409,
0.13696706295013428,
0.05041143670678139,
0.10572081059217453,
0.018525123596191406,
-0.016791416332125664,
0.022843502461910248,
0.17746564745903015,
-0.02853401191532612,
-0.043541014194488525,
-0.14404186606407166,
-0.022182300686836243,
0.15260697901248932,
0.10192563384771347,
0.16757766902446747,
0.16616763174533844,
0.11930298805236816,
0.1956932544708252,
0.11704401671886444,
0.10304278880357742,
0.052189555019140244,
-0.013531852513551712,
-0.004093863070011139,
-0.01228472962975502,
-0.042497504502534866,
0.05230056867003441,
0.05571495369076729,
0.027585504576563835,
-0.19872500002384186,
0.02184155583381653,
-0.19329896569252014,
-0.02313016541302204,
-0.08243345469236374,
0.01644495315849781,
0.05239224433898926,
0.2096434086561203,
0.04210057109594345,
0.10118018835783005,
0.021744482219219208,
0.10573884844779968,
0.015865135937929153,
-0.07006605714559555,
-0.0065298317931592464,
-0.024272896349430084,
0.09974277764558792,
0.10174193233251572,
0.021700428798794746,
-0.016679642722010612,
-0.09889253973960876,
0.04607788100838661,
0.17424549162387848,
-0.17494839429855347,
0.3187439739704132,
-0.0007240860140882432,
-0.04524024948477745,
-0.04190666601061821,
-0.08219234645366669,
0.04142151027917862,
0.1647384762763977,
0.1017698273062706,
0.0333428718149662,
-0.14635729789733887,
-0.06874663382768631,
-0.029922528192400932,
-0.029125673696398735,
0.10087492316961288,
-0.06689736992120743,
-0.13817089796066284,
-0.025579528883099556,
0.0344909206032753,
0.003919827751815319,
0.21354736387729645,
-0.10228335112333298,
-0.15175104141235352,
0.00922450888901949,
0.13133007287979126,
-0.06745465099811554,
-0.04906000941991806,
0.09594502300024033,
-0.02669750526547432,
0.0972210094332695,
-0.0541548989713192,
0.002656505908817053,
-0.14727191627025604,
-0.2363637089729309,
0.010592032223939896,
-0.02335694245994091,
0.020698489621281624,
-0.07203120738267899,
-0.11125075072050095,
-0.1240958720445633,
-0.1789770871400833,
0.11374562233686447,
-0.06521226465702057,
0.09276589751243591,
-0.09726036339998245,
0.08684233576059341,
-0.08414942771196365,
0.02816055528819561,
-0.05099964141845703,
-0.0012100528692826629,
-0.09757094830274582,
-0.14613427221775055,
0.024435222148895264,
-0.13409870862960815,
-0.001014217734336853,
0.034934982657432556,
-0.11161556839942932,
0.14066044986248016,
0.13931402564048767,
-0.08724056929349899,
0.17418785393238068,
0.42831170558929443,
-0.05984934791922569,
0.25173598527908325,
0.2527628242969513,
-0.13718484342098236,
-0.2734082341194153,
-0.059651490300893784,
-0.23391994833946228,
-0.08160211890935898,
0.1082993745803833,
-0.1578003615140915,
0.015907390043139458,
0.05020333454012871,
-0.11690597236156464,
0.1467704027891159,
-0.32824045419692993,
-0.07495500147342682,
0.09672868996858597,
0.007048844825476408,
0.4732857048511505,
-0.1068139299750328,
-0.12494277954101562,
-0.07125994563102722,
-0.10485164821147919,
0.10395017266273499,
-0.07008004188537598,
0.08493339270353317,
-0.030203424394130707,
0.025772906839847565,
0.011868835426867008,
-0.04774972423911095,
0.14879614114761353,
-0.0427577942609787,
0.19098854064941406,
-0.11560776084661484,
0.0027590321842581034,
0.14695321023464203,
-0.03108292631804943,
0.038532279431819916,
-0.07178329676389694,
0.04545990377664566,
-0.042950090020895004,
-0.027814088389277458,
-0.018928585574030876,
0.11621513217687607,
-0.004339784849435091,
-0.1380559802055359,
-0.06945756077766418,
0.01972813345491886,
-0.07362999767065048,
-0.05320021137595177,
0.15675771236419678,
0.03502804413437843,
0.05609925836324692,
0.11970125883817673,
0.004991572815924883,
-0.146412655711174,
0.00884049292653799,
-0.07536338269710541,
0.01455683447420597,
0.04314182698726654,
-0.08771193772554398,
-0.050023581832647324,
0.11971840262413025,
0.021750157698988914,
0.0665673241019249,
0.06486256420612335,
-0.042168524116277695,
0.02131110616028309,
0.11186312884092331,
-0.12857086956501007,
-0.06895474344491959,
-0.017605429515242577,
0.2739332914352417,
0.20882153511047363,
0.06424131989479065,
0.011942589655518532,
0.03977527841925621,
0.08851079642772675,
0.025800030678510666,
-0.024320857599377632,
-0.027894796803593636,
-0.07533380389213562,
0.08076632767915726,
-0.026636533439159393,
-0.08794095367193222,
0.1338292956352234,
0.04866079241037369,
-0.0795087143778801,
-0.08115667849779129,
0.10095386952161789,
-0.03139214217662811,
-0.0645640566945076,
-0.04291141778230667,
0.16875873506069183,
-0.142974391579628,
-0.05379750579595566,
0.05253109708428383,
-0.06923473626375198,
0.03050602227449417,
0.1983366161584854,
0.06317481398582458,
0.10652732849121094,
0.020412208512425423,
-0.03693949803709984,
0.09139978885650635,
-0.008889229968190193,
-0.1458244025707245,
0.04242372885346413,
-0.1516965925693512,
-0.1209954097867012,
-0.03220202773809433,
0.059742625802755356,
-0.06468313187360764,
-0.0443362258374691,
-0.16110824048519135,
0.08512833714485168,
-0.059125129133462906,
-0.04787873104214668,
-0.07900126278400421,
-0.034204404801130295,
-0.011031275615096092,
-0.027199620380997658,
-0.08409348875284195,
0.0068776607513427734,
-0.22133535146713257,
0.051574207842350006,
0.04428314045071602,
0.017113016918301582,
-0.03435007482767105,
-0.08292978256940842,
0.07848229259252548,
0.04986674711108208,
0.10280575603246689,
0.03711284324526787,
-0.059191394597291946,
0.0037306465674191713,
-0.20414716005325317,
-0.038815271109342575,
0.04232484847307205,
-0.021390240639448166,
0.0267819594591856,
0.08142497390508652,
-0.03312315046787262,
0.05886727198958397,
-0.04134150594472885,
0.031092548742890358,
-0.12302310764789581,
-0.19250139594078064,
-0.07369648665189743,
0.0737677738070488,
-0.1768668293952942,
-0.007294799666851759,
-0.158339723944664,
0.12045895308256149,
0.0037357027176767588,
0.19128042459487915,
0.05877019464969635,
0.07969143241643906,
0.07085993885993958,
-0.03897101804614067,
0.1005023792386055,
-0.05584702640771866,
-0.09622103720903397,
-0.019361555576324463,
-0.12480172514915466,
-0.049345120787620544,
0.42032214999198914,
0.05109545961022377,
-0.34862402081489563,
0.03209015727043152,
0.10416815429925919,
0.09029489010572433,
0.0010600913083180785,
0.1751212626695633,
-0.02115757390856743,
0.00999172031879425,
-0.09422436356544495,
0.09467131644487381,
-0.0020058725494891405,
-0.11290951073169708,
0.0739678293466568,
0.09658773243427277,
0.08477838337421417,
-0.024424241855740547,
0.13553570210933685,
-0.010457966476678848,
0.03920025750994682,
-0.11343693733215332,
0.15077632665634155,
0.06773624569177628,
-0.05210328474640846,
0.062154389917850494,
0.1635616272687912,
0.05306112766265869,
0.07038675248622894,
0.04032095894217491,
0.0014122785069048405,
-0.1754148155450821,
-0.1602102369070053,
0.02099275030195713,
-0.05523645877838135,
0.07993361353874207,
0.02664482593536377,
0.06025690957903862,
0.05930217728018761,
0.08369890600442886,
-0.02683570235967636,
-0.012045243754982948,
-0.21370548009872437,
-0.059094905853271484,
-0.014421275816857815,
-0.06632379442453384,
-0.06530799716711044,
-0.13236206769943237,
-0.007965253666043282,
-0.11605394631624222,
-0.1677420735359192,
-0.11075370758771896,
0.06186629459261894,
-0.03134578466415405,
-0.07950954884290695,
-0.1361609846353531,
0.005552724003791809,
-0.051663242280483246,
0.0591781884431839,
0.020678075030446053,
0.14382748305797577,
-0.055859338492155075,
-0.007769476156681776,
0.03557850420475006,
0.17586101591587067,
0.03452156111598015,
-0.019137056544423103,
0.05009777843952179,
-0.11230028420686722,
-0.013903132639825344,
0.09447801858186722,
-0.05355257913470268,
0.03868480771780014,
0.05060523375868797,
0.14069905877113342,
0.3000718951225281,
-0.15852685272693634,
0.022173447534441948,
-0.0156106511130929,
0.027616411447525024,
0.03752091899514198,
0.10538272559642792,
-0.047601912170648575,
0.30318450927734375,
-0.03754459694027901,
0.015319152735173702,
-0.05392564833164215,
0.03960913047194481,
-0.0902356207370758,
0.13807453215122223,
0.07016881555318832,
-0.1437612622976303,
-0.11773919314146042,
0.13123241066932678,
-0.2251790165901184,
0.21079330146312714,
0.05835592746734619,
-0.018531115725636482,
0.0006959201418794692,
-0.017787374556064606,
0.20127902925014496,
-0.06664536148309708,
0.07648804783821106,
-0.10087135434150696,
-0.11177007853984833,
-0.14956814050674438,
0.008278977125883102,
-0.3149573504924774,
-0.07720612734556198,
0.10045251995325089,
0.1509818434715271,
0.17898774147033691,
-0.022407056763768196,
0.060840118676424026,
0.03429623693227768,
0.016734736040234566,
-0.09003262221813202,
0.09443855285644531,
0.08975303173065186,
-0.14206120371818542,
-0.09327292442321777,
-0.12793666124343872,
-0.015153053216636181,
-0.009946417063474655,
-0.008153465576469898,
0.0022670275066047907,
0.04026666656136513,
0.12014163285493851,
-0.04463301971554756,
-0.05576737970113754,
0.06202622875571251,
-0.09607529640197754,
0.03486022725701332,
-0.03752650320529938,
0.012558498419821262,
-0.07468373328447342,
-0.03885192796587944,
-0.04395401477813721,
0.06765811145305634,
-0.2736577093601227,
-0.04237256944179535,
0.10482975840568542,
-0.0006625195383094251,
0.22920070588588715,
0.053381726145744324,
-0.108866386115551,
-0.028044672682881355,
-0.11392955482006073,
0.06305203586816788,
-0.12086670845746994,
-0.0018355880165472627,
0.1538183093070984,
0.022182224318385124,
0.03804173693060875,
-0.16429899632930756,
0.040075428783893585,
-0.10011276602745056,
-0.03175477311015129,
-0.06921384483575821
] |
null | null | null | https://civitai.com/models/304743/uta-one-piece-or-goofy-ai | {"license": "creativeml-openrail-m"} | null | LarryAIDraw/uta_one_piece | [
"license:creativeml-openrail-m",
"region:us"
] | 2024-02-14T14:52:10+00:00 | [] | [] | TAGS
#license-creativeml-openrail-m #region-us
| URL | [] | [
"TAGS\n#license-creativeml-openrail-m #region-us \n"
] | [
18
] | [
"passage: TAGS\n#license-creativeml-openrail-m #region-us \n"
] | [
-0.07587551325559616,
0.1441737711429596,
-0.0062791393138468266,
0.012048184871673584,
-0.001431003911420703,
-0.022854028269648552,
0.2091037780046463,
-0.018623588606715202,
0.08854977041482925,
-0.11491455882787704,
0.14648450911045074,
0.18939465284347534,
-0.10384178161621094,
0.0838744044303894,
-0.061768148094415665,
-0.13200531899929047,
0.029243366792798042,
-0.07651498913764954,
-0.0865340456366539,
0.028722204267978668,
0.056829702109098434,
-0.01273291651159525,
-0.003666024887934327,
-0.0012952570104971528,
-0.11045186221599579,
0.07173702865839005,
-0.029841862618923187,
-0.037320639938116074,
0.060927797108888626,
-0.04866224527359009,
0.04899880662560463,
0.11812204867601395,
-0.033462416380643845,
-0.13358792662620544,
0.004443002864718437,
-0.11795501410961151,
-0.13281011581420898,
0.007506446447223425,
0.121794693171978,
-0.0353701114654541,
0.12644833326339722,
0.17882929742336273,
0.0022871040273457766,
0.07042364031076431,
-0.1692226231098175,
-0.17680460214614868,
-0.04340395703911781,
-0.018681490793824196,
-0.026622790843248367,
0.0532202385365963,
0.11296376585960388,
0.0959911122918129,
-0.1474708467721939,
0.059626504778862,
0.08025065064430237,
-0.29932230710983276,
0.03342466056346893,
0.23123668134212494,
0.11160528659820557,
0.03646189346909523,
-0.04899992793798447,
0.06103713810443878,
0.037279851734638214,
-0.055691562592983246,
-0.011489230208098888,
-0.07466674596071243,
0.033063821494579315,
0.1203068420290947,
-0.048032116144895554,
-0.025952165946364403,
0.3207513689994812,
-0.011608880013227463,
0.004257023800164461,
0.03850623592734337,
-0.046627260744571686,
0.03471478819847107,
0.053042974323034286,
0.07628075033426285,
0.05806995555758476,
0.1503586620092392,
0.06162842735648155,
-0.11057397723197937,
-0.12041215598583221,
0.018044639378786087,
-0.14939343929290771,
0.16419777274131775,
-0.05087574943900108,
0.0932750254869461,
-0.11752020567655563,
0.018267955631017685,
-0.0651155412197113,
-0.03550999239087105,
-0.010290741920471191,
-0.14436741173267365,
0.09543514996767044,
-0.00750720826908946,
-0.044816359877586365,
-0.06333030760288239,
0.06353012472391129,
0.134693443775177,
0.06326734274625778,
-0.01916888915002346,
0.03110724687576294,
0.18312698602676392,
0.02453736774623394,
-0.039170458912849426,
0.02620672434568405,
0.14288429915905,
0.03429737314581871,
-0.1762668490409851,
-0.0059744445607066154,
-0.0644608810544014,
-0.1936662793159485,
-0.02320769429206848,
-0.19997692108154297,
0.16352415084838867,
-0.030033577233552933,
-0.016221072524785995,
-0.03707468882203102,
0.022218478843569756,
0.04353277385234833,
0.007484832778573036,
0.018807580694556236,
-0.044244956225156784,
-0.08294660598039627,
-0.08514150232076645,
-0.020517800003290176,
0.05681263282895088,
0.07853931933641434,
0.18057872354984283,
-0.12033670395612717,
0.0023163571022450924,
-0.04746192321181297,
-0.002028648741543293,
0.10751507431268692,
-0.1799560934305191,
0.05942503362894058,
-0.10612065345048904,
-0.21264076232910156,
-0.0035186251625418663,
0.11188323050737381,
0.02211635187268257,
0.00010340322478441522,
0.023470120504498482,
-0.042402785271406174,
-0.03322858735918999,
-0.06714189052581787,
-0.09123854339122772,
-0.07618846744298935,
0.0644230917096138,
-0.15088342130184174,
-0.06908489763736725,
-0.27447474002838135,
0.021657612174749374,
-0.11370886117219925,
0.030269425362348557,
0.09551744163036346,
-0.08233252167701721,
-0.11906278878450394,
0.24992190301418304,
0.07235409319400787,
0.07105377316474915,
-0.037106942385435104,
-0.02335505001246929,
-0.040998950600624084,
0.07576625794172287,
-0.051450882107019424,
0.006896975915879011,
0.06892602890729904,
-0.05309505760669708,
-0.13028347492218018,
-0.018723927438259125,
-0.04109232872724533,
0.13036558032035828,
-0.005558064207434654,
0.30143606662750244,
0.04775548353791237,
-0.18540549278259277,
0.20458267629146576,
0.13462620973587036,
-0.17578788101673126,
-0.3525811433792114,
0.10510481148958206,
-0.08032525330781937,
-0.12903624773025513,
0.02135874517261982,
0.05760384723544121,
0.08029629290103912,
-0.016704760491847992,
-0.03554001823067665,
0.003427563700824976,
-0.061561521142721176,
-0.016107140108942986,
0.031175263226032257,
0.09541988372802734,
-0.08737137913703918,
0.08379733562469482,
0.03426050394773483,
-0.0114505710080266,
0.14006270468235016,
-0.02073829248547554,
-0.0763879269361496,
0.02079492248594761,
0.04172089695930481,
-0.020384199917316437,
-0.056601639837026596,
-0.019958069548010826,
0.024005193263292313,
-0.017852509394288063,
0.10743143409490585,
0.29301881790161133,
0.0457768440246582,
-0.015894168987870216,
0.050522804260253906,
0.02892244979739189,
0.031187754124403,
0.04622279107570648,
0.002081167884171009,
-0.15730762481689453,
0.07284589111804962,
-0.05682012811303139,
-0.09314198791980743,
-0.03167767822742462,
-0.0017506676958873868,
0.0981268361210823,
-0.05222945287823677,
0.06663653254508972,
0.04907272756099701,
0.008146014995872974,
-0.0024776349309831858,
0.019724633544683456,
0.03505800664424896,
0.15693770349025726,
0.06973138451576233,
-0.09330075234174728,
0.2326427847146988,
-0.07795968651771545,
0.3451519012451172,
0.06519531458616257,
-0.17186447978019714,
0.0015280802035704255,
-0.16536928713321686,
-0.08274903148412704,
0.009426575154066086,
0.06846177577972412,
0.04244798794388771,
-0.06766051799058914,
-0.0681324228644371,
0.1076645776629448,
-0.05602144077420235,
-0.05967314541339874,
-0.09208252280950546,
-0.06438151746988297,
-0.09841792285442352,
0.11479154229164124,
0.17103825509548187,
-0.17601613700389862,
0.14707137644290924,
0.31644511222839355,
0.0033473046496510506,
0.20550797879695892,
-0.06598898768424988,
0.06533558666706085,
-0.11870601028203964,
0.06948951631784439,
-0.033792875707149506,
0.1264963299036026,
-0.10152938961982727,
0.04339653253555298,
0.01719778962433338,
0.05835990980267525,
0.12580721080303192,
-0.1375611275434494,
-0.2047722488641739,
0.05393601953983307,
0.04846670478582382,
-0.08490802347660065,
0.15654030442237854,
-0.07621043175458908,
0.03958071768283844,
-0.04002580791711807,
-0.10932640731334686,
0.16022461652755737,
-0.07396190613508224,
-0.03576399013400078,
0.04601873457431793,
-0.162797212600708,
0.04817049205303192,
-0.13655415177345276,
-0.20034807920455933,
-0.03256381303071976,
0.011739566922187805,
0.09091648459434509,
0.0064963698387146,
-0.045913100242614746,
0.008927296847105026,
-0.1321311742067337,
-0.24660253524780273,
-0.10214889049530029,
-0.04224977269768715,
0.1463703066110611,
-0.09529456496238708,
-0.08689732849597931,
-0.008191614411771297,
-0.027925807982683182,
0.0383632630109787,
0.0873899981379509,
-0.04390016943216324,
0.15604910254478455,
0.13776685297489166,
0.03233470022678375,
0.07692384719848633,
-0.0302706528455019,
0.16908830404281616,
0.07715359330177307,
-0.09182680398225784,
0.09044599533081055,
-0.006939579267054796,
0.07778391242027283,
0.26205286383628845,
0.13615888357162476,
-0.10827198624610901,
0.0021787171717733145,
-0.09298930317163467,
-0.13136249780654907,
-0.25473496317863464,
-0.03117409534752369,
-0.15477068722248077,
0.13437145948410034,
-0.08579761534929276,
0.08686056733131409,
0.13696706295013428,
0.05041143670678139,
0.10572081059217453,
0.018525123596191406,
-0.016791416332125664,
0.022843502461910248,
0.17746564745903015,
-0.02853401191532612,
-0.043541014194488525,
-0.14404186606407166,
-0.022182300686836243,
0.15260697901248932,
0.10192563384771347,
0.16757766902446747,
0.16616763174533844,
0.11930298805236816,
0.1956932544708252,
0.11704401671886444,
0.10304278880357742,
0.052189555019140244,
-0.013531852513551712,
-0.004093863070011139,
-0.01228472962975502,
-0.042497504502534866,
0.05230056867003441,
0.05571495369076729,
0.027585504576563835,
-0.19872500002384186,
0.02184155583381653,
-0.19329896569252014,
-0.02313016541302204,
-0.08243345469236374,
0.01644495315849781,
0.05239224433898926,
0.2096434086561203,
0.04210057109594345,
0.10118018835783005,
0.021744482219219208,
0.10573884844779968,
0.015865135937929153,
-0.07006605714559555,
-0.0065298317931592464,
-0.024272896349430084,
0.09974277764558792,
0.10174193233251572,
0.021700428798794746,
-0.016679642722010612,
-0.09889253973960876,
0.04607788100838661,
0.17424549162387848,
-0.17494839429855347,
0.3187439739704132,
-0.0007240860140882432,
-0.04524024948477745,
-0.04190666601061821,
-0.08219234645366669,
0.04142151027917862,
0.1647384762763977,
0.1017698273062706,
0.0333428718149662,
-0.14635729789733887,
-0.06874663382768631,
-0.029922528192400932,
-0.029125673696398735,
0.10087492316961288,
-0.06689736992120743,
-0.13817089796066284,
-0.025579528883099556,
0.0344909206032753,
0.003919827751815319,
0.21354736387729645,
-0.10228335112333298,
-0.15175104141235352,
0.00922450888901949,
0.13133007287979126,
-0.06745465099811554,
-0.04906000941991806,
0.09594502300024033,
-0.02669750526547432,
0.0972210094332695,
-0.0541548989713192,
0.002656505908817053,
-0.14727191627025604,
-0.2363637089729309,
0.010592032223939896,
-0.02335694245994091,
0.020698489621281624,
-0.07203120738267899,
-0.11125075072050095,
-0.1240958720445633,
-0.1789770871400833,
0.11374562233686447,
-0.06521226465702057,
0.09276589751243591,
-0.09726036339998245,
0.08684233576059341,
-0.08414942771196365,
0.02816055528819561,
-0.05099964141845703,
-0.0012100528692826629,
-0.09757094830274582,
-0.14613427221775055,
0.024435222148895264,
-0.13409870862960815,
-0.001014217734336853,
0.034934982657432556,
-0.11161556839942932,
0.14066044986248016,
0.13931402564048767,
-0.08724056929349899,
0.17418785393238068,
0.42831170558929443,
-0.05984934791922569,
0.25173598527908325,
0.2527628242969513,
-0.13718484342098236,
-0.2734082341194153,
-0.059651490300893784,
-0.23391994833946228,
-0.08160211890935898,
0.1082993745803833,
-0.1578003615140915,
0.015907390043139458,
0.05020333454012871,
-0.11690597236156464,
0.1467704027891159,
-0.32824045419692993,
-0.07495500147342682,
0.09672868996858597,
0.007048844825476408,
0.4732857048511505,
-0.1068139299750328,
-0.12494277954101562,
-0.07125994563102722,
-0.10485164821147919,
0.10395017266273499,
-0.07008004188537598,
0.08493339270353317,
-0.030203424394130707,
0.025772906839847565,
0.011868835426867008,
-0.04774972423911095,
0.14879614114761353,
-0.0427577942609787,
0.19098854064941406,
-0.11560776084661484,
0.0027590321842581034,
0.14695321023464203,
-0.03108292631804943,
0.038532279431819916,
-0.07178329676389694,
0.04545990377664566,
-0.042950090020895004,
-0.027814088389277458,
-0.018928585574030876,
0.11621513217687607,
-0.004339784849435091,
-0.1380559802055359,
-0.06945756077766418,
0.01972813345491886,
-0.07362999767065048,
-0.05320021137595177,
0.15675771236419678,
0.03502804413437843,
0.05609925836324692,
0.11970125883817673,
0.004991572815924883,
-0.146412655711174,
0.00884049292653799,
-0.07536338269710541,
0.01455683447420597,
0.04314182698726654,
-0.08771193772554398,
-0.050023581832647324,
0.11971840262413025,
0.021750157698988914,
0.0665673241019249,
0.06486256420612335,
-0.042168524116277695,
0.02131110616028309,
0.11186312884092331,
-0.12857086956501007,
-0.06895474344491959,
-0.017605429515242577,
0.2739332914352417,
0.20882153511047363,
0.06424131989479065,
0.011942589655518532,
0.03977527841925621,
0.08851079642772675,
0.025800030678510666,
-0.024320857599377632,
-0.027894796803593636,
-0.07533380389213562,
0.08076632767915726,
-0.026636533439159393,
-0.08794095367193222,
0.1338292956352234,
0.04866079241037369,
-0.0795087143778801,
-0.08115667849779129,
0.10095386952161789,
-0.03139214217662811,
-0.0645640566945076,
-0.04291141778230667,
0.16875873506069183,
-0.142974391579628,
-0.05379750579595566,
0.05253109708428383,
-0.06923473626375198,
0.03050602227449417,
0.1983366161584854,
0.06317481398582458,
0.10652732849121094,
0.020412208512425423,
-0.03693949803709984,
0.09139978885650635,
-0.008889229968190193,
-0.1458244025707245,
0.04242372885346413,
-0.1516965925693512,
-0.1209954097867012,
-0.03220202773809433,
0.059742625802755356,
-0.06468313187360764,
-0.0443362258374691,
-0.16110824048519135,
0.08512833714485168,
-0.059125129133462906,
-0.04787873104214668,
-0.07900126278400421,
-0.034204404801130295,
-0.011031275615096092,
-0.027199620380997658,
-0.08409348875284195,
0.0068776607513427734,
-0.22133535146713257,
0.051574207842350006,
0.04428314045071602,
0.017113016918301582,
-0.03435007482767105,
-0.08292978256940842,
0.07848229259252548,
0.04986674711108208,
0.10280575603246689,
0.03711284324526787,
-0.059191394597291946,
0.0037306465674191713,
-0.20414716005325317,
-0.038815271109342575,
0.04232484847307205,
-0.021390240639448166,
0.0267819594591856,
0.08142497390508652,
-0.03312315046787262,
0.05886727198958397,
-0.04134150594472885,
0.031092548742890358,
-0.12302310764789581,
-0.19250139594078064,
-0.07369648665189743,
0.0737677738070488,
-0.1768668293952942,
-0.007294799666851759,
-0.158339723944664,
0.12045895308256149,
0.0037357027176767588,
0.19128042459487915,
0.05877019464969635,
0.07969143241643906,
0.07085993885993958,
-0.03897101804614067,
0.1005023792386055,
-0.05584702640771866,
-0.09622103720903397,
-0.019361555576324463,
-0.12480172514915466,
-0.049345120787620544,
0.42032214999198914,
0.05109545961022377,
-0.34862402081489563,
0.03209015727043152,
0.10416815429925919,
0.09029489010572433,
0.0010600913083180785,
0.1751212626695633,
-0.02115757390856743,
0.00999172031879425,
-0.09422436356544495,
0.09467131644487381,
-0.0020058725494891405,
-0.11290951073169708,
0.0739678293466568,
0.09658773243427277,
0.08477838337421417,
-0.024424241855740547,
0.13553570210933685,
-0.010457966476678848,
0.03920025750994682,
-0.11343693733215332,
0.15077632665634155,
0.06773624569177628,
-0.05210328474640846,
0.062154389917850494,
0.1635616272687912,
0.05306112766265869,
0.07038675248622894,
0.04032095894217491,
0.0014122785069048405,
-0.1754148155450821,
-0.1602102369070053,
0.02099275030195713,
-0.05523645877838135,
0.07993361353874207,
0.02664482593536377,
0.06025690957903862,
0.05930217728018761,
0.08369890600442886,
-0.02683570235967636,
-0.012045243754982948,
-0.21370548009872437,
-0.059094905853271484,
-0.014421275816857815,
-0.06632379442453384,
-0.06530799716711044,
-0.13236206769943237,
-0.007965253666043282,
-0.11605394631624222,
-0.1677420735359192,
-0.11075370758771896,
0.06186629459261894,
-0.03134578466415405,
-0.07950954884290695,
-0.1361609846353531,
0.005552724003791809,
-0.051663242280483246,
0.0591781884431839,
0.020678075030446053,
0.14382748305797577,
-0.055859338492155075,
-0.007769476156681776,
0.03557850420475006,
0.17586101591587067,
0.03452156111598015,
-0.019137056544423103,
0.05009777843952179,
-0.11230028420686722,
-0.013903132639825344,
0.09447801858186722,
-0.05355257913470268,
0.03868480771780014,
0.05060523375868797,
0.14069905877113342,
0.3000718951225281,
-0.15852685272693634,
0.022173447534441948,
-0.0156106511130929,
0.027616411447525024,
0.03752091899514198,
0.10538272559642792,
-0.047601912170648575,
0.30318450927734375,
-0.03754459694027901,
0.015319152735173702,
-0.05392564833164215,
0.03960913047194481,
-0.0902356207370758,
0.13807453215122223,
0.07016881555318832,
-0.1437612622976303,
-0.11773919314146042,
0.13123241066932678,
-0.2251790165901184,
0.21079330146312714,
0.05835592746734619,
-0.018531115725636482,
0.0006959201418794692,
-0.017787374556064606,
0.20127902925014496,
-0.06664536148309708,
0.07648804783821106,
-0.10087135434150696,
-0.11177007853984833,
-0.14956814050674438,
0.008278977125883102,
-0.3149573504924774,
-0.07720612734556198,
0.10045251995325089,
0.1509818434715271,
0.17898774147033691,
-0.022407056763768196,
0.060840118676424026,
0.03429623693227768,
0.016734736040234566,
-0.09003262221813202,
0.09443855285644531,
0.08975303173065186,
-0.14206120371818542,
-0.09327292442321777,
-0.12793666124343872,
-0.015153053216636181,
-0.009946417063474655,
-0.008153465576469898,
0.0022670275066047907,
0.04026666656136513,
0.12014163285493851,
-0.04463301971554756,
-0.05576737970113754,
0.06202622875571251,
-0.09607529640197754,
0.03486022725701332,
-0.03752650320529938,
0.012558498419821262,
-0.07468373328447342,
-0.03885192796587944,
-0.04395401477813721,
0.06765811145305634,
-0.2736577093601227,
-0.04237256944179535,
0.10482975840568542,
-0.0006625195383094251,
0.22920070588588715,
0.053381726145744324,
-0.108866386115551,
-0.028044672682881355,
-0.11392955482006073,
0.06305203586816788,
-0.12086670845746994,
-0.0018355880165472627,
0.1538183093070984,
0.022182224318385124,
0.03804173693060875,
-0.16429899632930756,
0.040075428783893585,
-0.10011276602745056,
-0.03175477311015129,
-0.06921384483575821
] |
null | null | null | https://civitai.com/models/303514/shushu-suruga-or-mato-seihei-no-slave | {"license": "creativeml-openrail-m"} | null | LarryAIDraw/CHAR-ShushuSuruga | [
"license:creativeml-openrail-m",
"region:us"
] | 2024-02-14T14:52:34+00:00 | [] | [] | TAGS
#license-creativeml-openrail-m #region-us
| URL | [] | [
"TAGS\n#license-creativeml-openrail-m #region-us \n"
] | [
18
] | [
"passage: TAGS\n#license-creativeml-openrail-m #region-us \n"
] | [
-0.07587551325559616,
0.1441737711429596,
-0.0062791393138468266,
0.012048184871673584,
-0.001431003911420703,
-0.022854028269648552,
0.2091037780046463,
-0.018623588606715202,
0.08854977041482925,
-0.11491455882787704,
0.14648450911045074,
0.18939465284347534,
-0.10384178161621094,
0.0838744044303894,
-0.061768148094415665,
-0.13200531899929047,
0.029243366792798042,
-0.07651498913764954,
-0.0865340456366539,
0.028722204267978668,
0.056829702109098434,
-0.01273291651159525,
-0.003666024887934327,
-0.0012952570104971528,
-0.11045186221599579,
0.07173702865839005,
-0.029841862618923187,
-0.037320639938116074,
0.060927797108888626,
-0.04866224527359009,
0.04899880662560463,
0.11812204867601395,
-0.033462416380643845,
-0.13358792662620544,
0.004443002864718437,
-0.11795501410961151,
-0.13281011581420898,
0.007506446447223425,
0.121794693171978,
-0.0353701114654541,
0.12644833326339722,
0.17882929742336273, 0.0022871040273457766, 0.07042364031076431, -0.1692226231098175, …, -0.03175477311015129, -0.06921384483575821
] |
null | null | transformers |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# ocr8_bert-base-uncased
This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 1.0241
- Accuracy: 0.7767
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (an illustrative `Trainer` setup follows the list):
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 2
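A minimal sketch of how these values could map onto a `transformers` `Trainer` configuration; the toy dataset, the tokenization, and `num_labels=8` below are placeholder assumptions rather than details taken from this card:
```python
from datasets import Dataset
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

# Toy stand-in data: the real training set behind this card is not documented,
# and num_labels=8 is an assumption; replace both with your own.
raw = Dataset.from_dict(
    {"text": ["example document one", "example document two",
              "example document three", "example document four"],
     "label": [0, 1, 2, 3]}
)

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
encoded = raw.map(lambda batch: tokenizer(batch["text"], truncation=True), batched=True)

model = AutoModelForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=8
)

args = TrainingArguments(
    output_dir="ocr8_bert-base-uncased",
    learning_rate=2e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    num_train_epochs=2,
    lr_scheduler_type="linear",
    evaluation_strategy="epoch",  # matches the per-epoch validation rows below
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=encoded,
    eval_dataset=encoded,  # toy reuse; use a held-out split in practice
    tokenizer=tokenizer,   # enables dynamic padding via DataCollatorWithPadding
)
trainer.train()
```
The Adam betas and epsilon listed above are the optimizer defaults in `Trainer`, so they do not need to be set explicitly.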
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| No log | 1.0 | 52 | 1.2318 | 0.6602 |
| No log | 2.0 | 104 | 1.0241 | 0.7767 |
### Framework versions
- Transformers 4.37.2
- Pytorch 2.2.0
- Datasets 2.17.0
- Tokenizers 0.15.1
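As a quick usage sketch, assuming the checkpoint is available on the Hub under the repository id recorded for this card, it can be loaded through the `pipeline` API; the input sentence is a placeholder and the label names depend on the fine-tuned config:
```python
from transformers import pipeline

# Repository id as recorded for this card; adjust if you host the model elsewhere.
classifier = pipeline(
    "text-classification",
    model="sebastiencormier/ocr8_bert-base-uncased",
)

# Placeholder input; the output is a list like [{"label": ..., "score": ...}].
print(classifier("Example text to classify"))
```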
| {"license": "apache-2.0", "tags": ["generated_from_trainer"], "metrics": ["accuracy"], "base_model": "bert-base-uncased", "model-index": [{"name": "ocr8_bert-base-uncased", "results": []}]} | text-classification | sebastiencormier/ocr8_bert-base-uncased | [
"transformers",
"tensorboard",
"safetensors",
"bert",
"text-classification",
"generated_from_trainer",
"base_model:bert-base-uncased",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | 2024-02-14T14:52:38+00:00 | [] | [] | TAGS
#transformers #tensorboard #safetensors #bert #text-classification #generated_from_trainer #base_model-bert-base-uncased #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us
| ocr8\_bert-base-uncased
=======================
This model is a fine-tuned version of bert-base-uncased on an unspecified dataset.
It achieves the following results on the evaluation set:
* Loss: 1.0241
* Accuracy: 0.7767
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 2e-05
* train\_batch\_size: 16
* eval\_batch\_size: 16
* seed: 42
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* num\_epochs: 2
### Training results
### Framework versions
* Transformers 4.37.2
* Pytorch 2.2.0
* Datasets 2.17.0
* Tokenizers 0.15.1
| [
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 2",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.37.2\n* Pytorch 2.2.0\n* Datasets 2.17.0\n* Tokenizers 0.15.1"
] | [
"TAGS\n#transformers #tensorboard #safetensors #bert #text-classification #generated_from_trainer #base_model-bert-base-uncased #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 2",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.37.2\n* Pytorch 2.2.0\n* Datasets 2.17.0\n* Tokenizers 0.15.1"
] | [
68,
98,
4,
30
] | [
"passage: TAGS\n#transformers #tensorboard #safetensors #bert #text-classification #generated_from_trainer #base_model-bert-base-uncased #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 2### Training results### Framework versions\n\n\n* Transformers 4.37.2\n* Pytorch 2.2.0\n* Datasets 2.17.0\n* Tokenizers 0.15.1"
] | [
-0.0797819197177887, 0.09771517664194107, -0.002456550020724535, 0.10853520780801773, …, -0.13193966448307037, -0.09658011049032211
] |
null | null | null | https://civitai.com/models/303967/priscilla-guardian-tales | {"license": "creativeml-openrail-m"} | null | LarryAIDraw/Priscilla_-_Guardian_Tales | [
"license:creativeml-openrail-m",
"region:us"
] | 2024-02-14T14:53:16+00:00 | [] | [] | TAGS
#license-creativeml-openrail-m #region-us
| URL | [] | [
"TAGS\n#license-creativeml-openrail-m #region-us \n"
] | [
18
] | [
"passage: TAGS\n#license-creativeml-openrail-m #region-us \n"
] | [
-0.07587551325559616, 0.1441737711429596, -0.0062791393138468266, 0.012048184871673584, …, -0.03175477311015129, -0.06921384483575821
] |
null | null | ml-agents |
# **ppo** Agent playing **Huggy**
This is a trained model of a **ppo** agent playing **Huggy**
using the [Unity ML-Agents Library](https://github.com/Unity-Technologies/ml-agents).
## Usage (with ML-Agents)
The Documentation: https://unity-technologies.github.io/ml-agents/ML-Agents-Toolkit-Documentation/
We wrote a complete tutorial to learn to train your first agent using ML-Agents and publish it to the Hub:
- A *short tutorial* where you teach Huggy the Dog 🐶 to fetch the stick and then play with him directly in your
browser: https://huggingface.co/learn/deep-rl-course/unitbonus1/introduction
- A *longer tutorial* to understand how ML-Agents works:
https://huggingface.co/learn/deep-rl-course/unit5/introduction
### Resume the training
```bash
mlagents-learn <your_configuration_file_path.yaml> --run-id=<run_id> --resume
```
### Watch your Agent play
You can watch your agent **playing directly in your browser**
1. If the environment is part of ML-Agents official environments, go to https://huggingface.co/unity
2. Step 1: Find your model_id: WesleyRianSmith/SecondModel (a download sketch using this id follows these steps)
3. Step 2: Select your *.nn /*.onnx file
4. Click on Watch the agent play 👀
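To pull the trained files locally first (for example, the exported `.onnx` policy used in step 3), a minimal sketch with `huggingface_hub` is shown below; the exact file names inside the repository are not listed here, so the sketch inspects the repo before downloading:
```python
from huggingface_hub import list_repo_files, snapshot_download

repo_id = "WesleyRianSmith/SecondModel"

# See what the repository actually contains (the .onnx / .nn file name is an unknown here).
print(list_repo_files(repo_id))

# Download everything (run config, checkpoints, exported policy) into a local folder.
local_path = snapshot_download(repo_id=repo_id, local_dir="SecondModel")
print("Downloaded to:", local_path)
```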
| {"library_name": "ml-agents", "tags": ["Huggy", "deep-reinforcement-learning", "reinforcement-learning", "ML-Agents-Huggy"]} | reinforcement-learning | WesleyRianSmith/SecondModel | [
"ml-agents",
"Huggy",
"deep-reinforcement-learning",
"reinforcement-learning",
"ML-Agents-Huggy",
"region:us"
] | 2024-02-14T14:56:13+00:00 | [] | [] | TAGS
#ml-agents #Huggy #deep-reinforcement-learning #reinforcement-learning #ML-Agents-Huggy #region-us
|
# ppo Agent playing Huggy
This is a trained model of a ppo agent playing Huggy
using the Unity ML-Agents Library.
## Usage (with ML-Agents)
The Documentation: URL
We wrote a complete tutorial to learn to train your first agent using ML-Agents and publish it to the Hub:
- A *short tutorial* where you teach Huggy the Dog to fetch the stick and then play with him directly in your
browser: URL
- A *longer tutorial* to understand how ML-Agents works:
URL
### Resume the training
### Watch your Agent play
You can watch your agent playing directly in your browser
1. If the environment is part of ML-Agents official environments, go to URL
2. Step 1: Find your model_id: WesleyRianSmith/SecondModel
3. Step 2: Select your *.nn /*.onnx file
4. Click on Watch the agent play
| [
"# ppo Agent playing Huggy\n This is a trained model of a ppo agent playing Huggy\n using the Unity ML-Agents Library.\n\n ## Usage (with ML-Agents)\n The Documentation: URL\n\n We wrote a complete tutorial to learn to train your first agent using ML-Agents and publish it to the Hub:\n - A *short tutorial* where you teach Huggy the Dog to fetch the stick and then play with him directly in your\n browser: URL\n - A *longer tutorial* to understand how works ML-Agents:\n URL\n\n ### Resume the training\n \n\n ### Watch your Agent play\n You can watch your agent playing directly in your browser\n\n 1. If the environment is part of ML-Agents official environments, go to URL\n 2. Step 1: Find your model_id: WesleyRianSmith/SecondModel\n 3. Step 2: Select your *.nn /*.onnx file\n 4. Click on Watch the agent play"
] | [
"TAGS\n#ml-agents #Huggy #deep-reinforcement-learning #reinforcement-learning #ML-Agents-Huggy #region-us \n",
"# ppo Agent playing Huggy\n This is a trained model of a ppo agent playing Huggy\n using the Unity ML-Agents Library.\n\n ## Usage (with ML-Agents)\n The Documentation: URL\n\n We wrote a complete tutorial to learn to train your first agent using ML-Agents and publish it to the Hub:\n - A *short tutorial* where you teach Huggy the Dog to fetch the stick and then play with him directly in your\n browser: URL\n - A *longer tutorial* to understand how works ML-Agents:\n URL\n\n ### Resume the training\n \n\n ### Watch your Agent play\n You can watch your agent playing directly in your browser\n\n 1. If the environment is part of ML-Agents official environments, go to URL\n 2. Step 1: Find your model_id: WesleyRianSmith/SecondModel\n 3. Step 2: Select your *.nn /*.onnx file\n 4. Click on Watch the agent play"
] | [
36,
204
] | [
"passage: TAGS\n#ml-agents #Huggy #deep-reinforcement-learning #reinforcement-learning #ML-Agents-Huggy #region-us \n# ppo Agent playing Huggy\n This is a trained model of a ppo agent playing Huggy\n using the Unity ML-Agents Library.\n\n ## Usage (with ML-Agents)\n The Documentation: URL\n\n We wrote a complete tutorial to learn to train your first agent using ML-Agents and publish it to the Hub:\n - A *short tutorial* where you teach Huggy the Dog to fetch the stick and then play with him directly in your\n browser: URL\n - A *longer tutorial* to understand how works ML-Agents:\n URL\n\n ### Resume the training\n \n\n ### Watch your Agent play\n You can watch your agent playing directly in your browser\n\n 1. If the environment is part of ML-Agents official environments, go to URL\n 2. Step 1: Find your model_id: WesleyRianSmith/SecondModel\n 3. Step 2: Select your *.nn /*.onnx file\n 4. Click on Watch the agent play"
] | [
0.0086976932361722, 0.024542035534977913, -0.003611472202464938, 0.019523432478308678, …
-0.10502602905035019,
0.05430607497692108,
-0.1239175871014595,
0.048360634595155716,
0.0030353954061865807,
-0.031176084652543068,
0.052381861954927444,
-0.001106727053411305,
0.03692370653152466,
-0.056717902421951294,
0.09411095082759857,
0.03398558497428894,
-0.07798323035240173,
0.07643605768680573,
-0.048621539026498795,
-0.08196112513542175,
0.0680852010846138,
0.04525548592209816,
-0.02926572971045971,
-0.03700486198067665,
-0.0903053879737854,
0.017586002126336098,
-0.07952693104743958,
-0.01578020490705967,
0.155573308467865,
0.09929928928613663,
0.12197452783584595,
-0.06553317606449127,
-0.06874848157167435,
-0.03983701020479202,
-0.09310222417116165,
-0.044303230941295624,
0.13021032512187958,
0.014224160462617874,
0.04306374117732048,
0.027619438245892525,
0.05139829218387604,
0.09624072909355164,
0.08718396723270416,
0.011822888627648354,
-0.11020290106534958,
-0.016788415610790253,
0.04947818070650101,
0.05021096020936966,
-0.004755459725856781,
0.05610152333974838,
-0.013439428992569447,
0.021302487701177597,
-0.032146718353033066,
-0.023706775158643723,
-0.1271580308675766,
-0.06831888854503632,
0.027441948652267456,
-0.033354710787534714,
0.047050073742866516,
-0.015608429908752441,
-0.034250110387802124,
0.041170790791511536,
0.10505058616399765,
0.03656787797808647,
-0.015298374928534031,
-0.05836484581232071,
-0.11604787409305573,
0.07230015099048615,
-0.08954408019781113,
-0.29743486642837524,
-0.13023698329925537,
-0.10803098976612091,
-0.06144365295767784,
0.04117851331830025,
0.055638737976551056,
-0.15803419053554535,
-0.03865376487374306,
-0.11457890272140503,
-0.043572019785642624,
0.05925760418176651,
-0.06626419723033905,
0.18611088395118713,
0.10483463108539581,
0.027260608971118927,
-0.08048783987760544,
-0.017461242154240608,
0.008005538024008274,
-0.020248107612133026,
-0.00048327239346690476,
0.03451984375715256,
0.07272966206073761,
0.12005384266376495,
0.0711328536272049,
0.03860098868608475,
-0.01575944386422634,
0.1246894896030426,
-0.059372834861278534,
-0.029069503769278526,
0.16070014238357544,
-0.024623211473226547,
0.06533613801002502,
0.05941709131002426,
0.028111567720770836,
-0.039764486253261566,
0.06168633699417114,
0.0037247887812554836,
-0.05861278995871544,
-0.1783980429172516,
-0.1239980161190033,
-0.03132915496826172,
0.2050897628068924,
0.09491228312253952,
0.07866504788398743,
-0.08254464715719223,
-0.04932745173573494,
-0.007914994843304157,
-0.022173050791025162,
0.1399100422859192,
0.12376873195171356,
-0.0083152512088418,
-0.05878731608390808,
0.0012613078579306602,
-0.04541120305657387,
0.031769488006830215,
0.0948762595653534,
-0.03876064717769623,
0.07614476978778839,
0.028078805655241013,
0.016277698799967766,
0.038989096879959106,
-0.07022226601839066,
-0.057521671056747437,
0.05998513102531433,
0.03862421587109566,
-0.009790931828320026,
-0.020797275006771088,
-0.08560212701559067,
-0.06044676527380943,
0.10456357151269913,
0.11013717204332352,
-0.04346705600619316,
-0.12307409197092056,
0.10145679861307144,
0.08914021402597427,
0.1197611466050148,
0.03254832699894905,
-0.14038720726966858,
-0.07857128232717514,
0.01682213880121708,
-0.13061735033988953,
0.025006689131259918,
-0.009226029738783836,
0.04490720480680466,
-0.18460434675216675,
0.07423321902751923,
0.045460186898708344,
0.12918932735919952,
0.03567224740982056,
0.013305836357176304,
0.015468153171241283,
0.09218689799308777,
0.00838875025510788,
0.06427881121635437,
-0.131155863404274,
0.06845204532146454,
-0.007161901798099279,
0.08856754750013351,
-0.03088672272861004,
0.006417810916900635,
0.0848964974284172,
0.034565046429634094,
0.16409757733345032,
0.0395929329097271,
0.03158118575811386,
-0.0893043577671051,
-0.17040275037288666,
-0.05911466106772423,
-0.0026823163498193026,
-0.09359170496463776,
0.07725372165441513,
-0.012297563254833221,
-0.04605598375201225,
-0.10893912613391876,
0.120203398168087,
-0.022881444543600082,
-0.06709938496351242,
-0.019959433004260063,
-0.06031779944896698,
0.03985218703746796,
-0.06016991659998894,
-0.019516844302415848,
-0.04003295302391052,
0.2401534914970398,
0.12465649843215942,
-0.012995687313377857,
-0.09249621629714966,
-0.056779369711875916,
-0.029312796890735626,
-0.026124155148863792,
0.011057543568313122,
-0.004440296906977892,
0.15553070604801178,
-0.08688896149396896,
-0.03687985613942146,
-0.024924222379922867,
-0.11170408874750137,
-0.1083294004201889,
-0.0051733157597482204,
0.21541599929332733,
-0.012093344703316689,
0.10061749815940857,
0.008783646859228611,
0.03110731579363346,
-0.0021642448846250772,
-0.07995249330997467,
0.13350529968738556,
0.21598275005817413,
0.02890045754611492,
0.07001005113124847,
-0.11800137162208557,
0.05392559617757797,
-0.08521126210689545,
-0.03295987471938133,
0.21554820239543915,
0.3122294545173645,
-0.049195796251297,
0.22047756612300873,
0.07014241814613342,
-0.05728387087583542,
-0.21175524592399597,
-0.08454339951276779,
0.04776590317487717,
0.013644901104271412,
0.1503828763961792,
-0.15338367223739624,
0.03328896313905716,
0.008588098920881748,
-0.009829163551330566,
-0.03746883571147919,
-0.18115101754665375,
-0.09991852939128876,
-0.020921064540743828,
0.056288525462150574,
0.004703260958194733,
-0.0640345886349678,
-0.061178065836429596,
-0.06279685348272324,
-0.13044431805610657,
0.047579098492860794,
-0.14916984736919403,
0.07851429283618927,
0.0071461815387010574,
0.02352862060070038,
0.05084976553916931,
-0.021925853565335274,
0.1344052106142044,
-0.06484720855951309,
-0.005141747649759054,
-0.1022762656211853,
0.0091751953586936,
0.010430867783725262,
-0.11406304687261581,
0.11294443905353546,
-0.05153242126107216,
-0.030433572828769684,
-0.17780163884162903,
-0.047612372785806656,
-0.031012583523988724,
0.027433453127741814,
-0.014264160767197609,
-0.021372096613049507,
-0.014804129488766193,
0.07896754145622253,
0.09261169284582138,
0.040974561125040054,
0.03075638972222805,
-0.02151755802333355,
-0.007957939058542252,
0.07063356786966324,
0.11991918087005615,
0.005865676328539848,
-0.07437319308519363,
-0.04020781069993973,
-0.04038994759321213,
-0.020979294553399086,
-0.07425620406866074,
0.009018580429255962,
0.038352422416210175,
0.0139557383954525,
0.06437287479639053,
0.04439840465784073,
-0.08697297424077988,
-0.017232179641723633,
0.08604858815670013,
-0.09955156594514847,
-0.15517883002758026,
-0.060027360916137695,
-0.050425976514816284,
-0.043861743062734604,
-0.08895852416753769,
0.04798649623990059,
-0.02165031246840954,
0.004509442951530218,
0.042128972709178925,
0.026352031156420708,
-0.06843579560518265,
0.025461524724960327,
-0.03379811719059944,
0.03550814092159271,
-0.05759773775935173,
0.1501539945602417,
0.022892944514751434,
-0.024662617594003677,
0.02047577127814293,
0.21174894273281097,
-0.08171094954013824,
-0.06893077492713928,
-0.04637284204363823,
0.06418140232563019,
0.14219391345977783,
-0.03550604358315468,
-0.04489907622337341,
-0.09216144680976868,
0.08188898861408234,
-0.10231894999742508,
0.009266363456845284,
-0.07134879380464554,
0.009843739680945873,
0.08676151931285858,
-0.09548527747392654,
0.09673625230789185,
0.0152114387601614,
-0.05944113805890083,
-0.12325098365545273,
0.12577639520168304,
0.03723684698343277,
0.15897372364997864,
-0.02146434597671032,
-0.05071088671684265,
-0.11796999722719193,
0.015201006084680557,
-0.04276403784751892,
-0.015214387327432632,
-0.15871989727020264,
-0.011528571136295795,
-0.036717768758535385,
0.06684233993291855,
-0.008658191189169884,
0.04222363233566284,
-0.051909491419792175,
-0.075919508934021,
-0.06913042068481445,
0.08712311089038849,
-0.05647464096546173,
-0.038014236837625504,
0.02437366172671318,
-0.07130059599876404,
0.09464084357023239,
0.04755261912941933,
-0.015390963293612003,
-0.04767446964979172,
-0.05398926883935928,
-0.07375701516866684,
0.025047551840543747,
-0.029143663123250008,
0.030693070963025093,
-0.17413292825222015,
-0.008131877519190311,
-0.0424451045691967,
-0.11451776325702667,
0.00942440889775753,
0.09604532271623611,
-0.06968812644481659,
0.0321592278778553,
0.040795888751745224,
-0.13105036318302155,
-0.09190880507230759,
-0.0017091736663132906,
0.006418506149202585,
0.05898682773113251,
0.07544927299022675,
-0.0713556781411171,
0.15125125646591187,
-0.11137191951274872,
-0.012057316489517689,
0.002716048853471875,
0.02763974294066429,
-0.024317540228366852,
-0.1143094077706337,
0.030557725578546524,
-0.00825549103319645,
0.1161249428987503,
0.07289829105138779,
-0.01674421690404415,
0.033168189227581024,
0.02106192708015442,
0.10007092356681824,
0.000516776810400188,
0.03920436650514603,
-0.007405433338135481,
0.01349072065204382,
0.062348514795303345,
-0.005331939551979303,
0.06268823891878128,
-0.10917196422815323,
0.12512977421283722,
0.06489809602499008,
0.1486658900976181,
0.05579226464033127,
0.08109144866466522,
-0.09238725900650024,
-0.1874365210533142,
-0.05592789500951767,
-0.014430818147957325,
0.0370216965675354,
-0.07041298598051071,
0.21809904277324677,
0.11476980894804001,
-0.22613561153411865,
0.07195398211479187,
0.010005608201026917,
0.014710655435919762,
-0.09367715567350388,
-0.10467574745416641,
-0.0025195577181875706,
-0.19576269388198853,
0.07007312774658203,
-0.06010124459862709,
0.012018950656056404,
-0.06857606023550034,
-0.02764616720378399,
-0.00465408293530345,
0.05756964161992073,
-0.16126692295074463,
-0.044316839426755905,
0.07941614091396332,
-0.05635726824402809,
0.010308338329195976,
0.0018059584544971585,
-0.020608292892575264,
-0.030337143689393997,
-0.08034728467464447,
0.04679642245173454,
0.05629048869013786,
0.002226306824013591,
0.05517837032675743,
-0.02754489704966545,
-0.0582067146897316,
0.04054991528391838,
-0.010392358526587486,
0.0309943575412035,
0.10024400055408478,
0.06603258848190308,
-0.09301567077636719,
-0.0019481639610603452,
0.23328301310539246,
-0.06295902281999588,
-0.04793828725814819,
-0.10374180227518082,
0.14658205211162567,
-0.0077824704349040985,
-0.04308652505278587,
-0.04509411007165909,
-0.10668180882930756,
-0.08938430994749069,
0.23582717776298523,
0.11939100176095963,
-0.0748278945684433,
0.02437616139650345,
-0.04605737328529358,
0.020614255219697952,
-0.01050301268696785,
0.11885702610015869,
0.06112184002995491,
0.15115556120872498,
-0.05573244020342827,
-0.02315184846520424,
-0.015335096046328545,
-0.07745996117591858,
-0.1748393326997757,
-0.002118233824148774,
0.015794191509485245,
-0.026591414585709572,
-0.03507224842905998,
0.05110590532422066,
-0.1171480193734169,
-0.0786009207367897,
0.09789182245731354,
-0.08194268494844437,
-0.0716734379529953,
-0.025101367384195328,
0.004285442642867565,
0.048459239304065704,
0.12186107784509659,
0.04980633407831192,
0.02959924004971981,
0.13239890336990356,
-0.028323881328105927,
-0.06119292229413986,
0.011139776557683945,
0.09272744506597519,
-0.03250770643353462,
0.18529151380062103,
-0.038676802068948746,
0.030813150107860565,
0.044844042509794235,
0.020674200728535652,
-0.13917140662670135,
0.05905378609895706,
0.0377720482647419,
-0.163254052400589,
0.022632436826825142,
0.08794081956148148,
-0.06959468871355057,
-0.00027618141029961407,
0.0738103911280632,
-0.05365326255559921,
0.003418556647375226,
0.1405586153268814,
-0.025387490168213844,
-0.03948046639561653,
0.07499921321868896,
-0.14475604891777039,
0.10342772305011749,
0.14388129115104675,
-0.05177155137062073,
-0.005786023568361998,
-0.03451008349657059,
0.04669329524040222,
0.03808034956455231,
0.027004839852452278,
-0.03078084997832775,
-0.12726134061813354,
0.017191864550113678,
0.03591058775782585,
0.024176500737667084,
-0.27082914113998413,
-0.1492418348789215,
-0.05208609625697136,
-0.026453645899891853,
-0.0350038968026638,
0.1265811175107956,
0.12346834689378738,
-0.020783454179763794,
-0.01352655328810215,
-0.2261504977941513,
0.05671648681163788,
0.15865272283554077,
-0.09206484258174896,
-0.03363911807537079
] |
null | null | transformers |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# ocr8_distilbert-base-uncased
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 1.2347
- Accuracy: 0.7087
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 2
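
These settings map directly onto the standard `Trainer` API. The sketch below shows one way they could be wired together; the dataset, label count, and output directory are placeholders, since the card does not document the training data.

```python
# Minimal sketch of the training setup implied by the hyperparameters above.
# The dummy dataset and num_labels are assumptions (the training data is unspecified).
from datasets import Dataset
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

tokenizer = AutoTokenizer.from_pretrained("distilbert-base-uncased")
model = AutoModelForSequenceClassification.from_pretrained(
    "distilbert-base-uncased", num_labels=2  # assumption: real label count unknown
)

# Tiny dummy dataset standing in for the (unspecified) train/eval splits.
raw = Dataset.from_dict({"text": ["example one", "example two"], "label": [0, 1]})
encoded = raw.map(
    lambda ex: tokenizer(ex["text"], truncation=True, padding="max_length", max_length=64)
)

args = TrainingArguments(
    output_dir="ocr8_distilbert-base-uncased",
    learning_rate=2e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=2,
    evaluation_strategy="epoch",
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=encoded,
    eval_dataset=encoded,
    tokenizer=tokenizer,
)
trainer.train()
```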
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| No log | 1.0 | 52 | 1.4452 | 0.6019 |
| No log | 2.0 | 104 | 1.2347 | 0.7087 |
### Framework versions
- Transformers 4.37.2
- Pytorch 2.2.0
- Datasets 2.17.0
- Tokenizers 0.15.1
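
For reference, a minimal way to load the published checkpoint for inference (the label names and their meaning depend on the undocumented training data, so interpret the predicted labels accordingly):

```python
from transformers import pipeline

classifier = pipeline(
    "text-classification",
    model="sebastiencormier/ocr8_distilbert-base-uncased",
)
print(classifier("Example text to classify."))
```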
| {"license": "apache-2.0", "tags": ["generated_from_trainer"], "metrics": ["accuracy"], "base_model": "distilbert-base-uncased", "model-index": [{"name": "ocr8_distilbert-base-uncased", "results": []}]} | text-classification | sebastiencormier/ocr8_distilbert-base-uncased | [
"transformers",
"tensorboard",
"safetensors",
"distilbert",
"text-classification",
"generated_from_trainer",
"base_model:distilbert-base-uncased",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | 2024-02-14T14:56:23+00:00 | [] | [] | TAGS
#transformers #tensorboard #safetensors #distilbert #text-classification #generated_from_trainer #base_model-distilbert-base-uncased #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us
| ocr8\_distilbert-base-uncased
=============================
This model is a fine-tuned version of distilbert-base-uncased on the None dataset.
It achieves the following results on the evaluation set:
* Loss: 1.2347
* Accuracy: 0.7087
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 2e-05
* train\_batch\_size: 16
* eval\_batch\_size: 16
* seed: 42
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* num\_epochs: 2
### Training results
### Framework versions
* Transformers 4.37.2
* Pytorch 2.2.0
* Datasets 2.17.0
* Tokenizers 0.15.1
| [
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 2",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.37.2\n* Pytorch 2.2.0\n* Datasets 2.17.0\n* Tokenizers 0.15.1"
] | [
"TAGS\n#transformers #tensorboard #safetensors #distilbert #text-classification #generated_from_trainer #base_model-distilbert-base-uncased #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 2",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.37.2\n* Pytorch 2.2.0\n* Datasets 2.17.0\n* Tokenizers 0.15.1"
] | [
72,
98,
4,
30
] | [
"passage: TAGS\n#transformers #tensorboard #safetensors #distilbert #text-classification #generated_from_trainer #base_model-distilbert-base-uncased #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 2### Training results### Framework versions\n\n\n* Transformers 4.37.2\n* Pytorch 2.2.0\n* Datasets 2.17.0\n* Tokenizers 0.15.1"
] | [
-0.08926873654127121,
0.10998335480690002,
-0.003008445957675576,
0.11622580885887146,
0.13546361029148102,
0.011617263779044151,
0.15824685990810394,
0.1241612583398819,
-0.0728776603937149,
0.036667149513959885,
0.1243002787232399,
0.13492651283740997,
0.012294704094529152,
0.12253350019454956,
-0.08260267972946167,
-0.2234765589237213,
0.007743557449430227,
0.02551986835896969,
-0.06787396967411041,
0.11384408175945282,
0.10085876286029816,
-0.12201319634914398,
0.08760378509759903,
-0.013957970775663853,
-0.16014520823955536,
0.011286166496574879,
0.017782557755708694,
-0.057123105973005295,
0.12326359748840332,
0.03111259452998638,
0.12168850749731064,
0.03228840231895447,
0.08346685022115707,
-0.1866990029811859,
0.008949638344347477,
0.060641080141067505,
-0.00341395172290504,
0.08306409418582916,
0.0406012125313282,
-0.006102605722844601,
0.08265414834022522,
-0.09300684928894043,
0.06023038178682327,
0.01213532593101263,
-0.12203769385814667,
-0.22661131620407104,
-0.0833311378955841,
0.03248216584324837,
0.0973300188779831,
0.0731595903635025,
-0.010262645781040192,
0.1260586977005005,
-0.04532845690846443,
0.10145056247711182,
0.20004089176654816,
-0.3017774820327759,
-0.06320162862539291,
0.04651586338877678,
0.02555817738175392,
0.08914035558700562,
-0.10045526921749115,
-0.015617309138178825,
0.05522442236542702,
0.026431854814291,
0.13713249564170837,
-0.03063766285777092,
-0.06320463865995407,
0.001736846985295415,
-0.14084817469120026,
-0.016260750591754913,
0.16629521548748016,
0.04832417890429497,
-0.043793074786663055,
-0.05090697482228279,
-0.07515518367290497,
-0.10997745394706726,
-0.03842649236321449,
-0.0168691985309124,
0.05241378769278526,
-0.018508080393075943,
-0.06296207755804062,
-0.030030574649572372,
-0.09835762530565262,
-0.060929782688617706,
-0.05302850157022476,
0.14637728035449982,
0.034711603075265884,
0.009309844113886356,
-0.014828776940703392,
0.0955362319946289,
-0.02612057887017727,
-0.1491943597793579,
0.020271703600883484,
0.019771071150898933,
0.014676802791655064,
-0.047781795263290405,
-0.05145517736673355,
-0.0880870521068573,
0.024035481736063957,
0.15866811573505402,
-0.05737634375691414,
0.05290921777486801,
-0.004876693245023489,
0.04638580232858658,
-0.09839250147342682,
0.16375966370105743,
-0.03265410661697388,
-0.03388706594705582,
0.02536803111433983,
0.0837421864271164,
0.05877043306827545,
-0.014929710887372494,
-0.12795965373516083,
0.03789345175027847,
0.10832995921373367,
0.017832478508353233,
-0.05112747102975845,
0.0676228255033493,
-0.054418280720710754,
-0.019129613414406776,
0.04084443300962448,
-0.09682707488536835,
0.030978107824921608,
0.00034359542769379914,
-0.06222878023982048,
-0.0520375557243824,
0.029653511941432953,
0.0252204742282629,
0.002765434794127941,
0.10890829563140869,
-0.07464905828237534,
0.012872360646724701,
-0.08243989199399948,
-0.1239001527428627,
0.017645737156271935,
-0.08012057840824127,
0.02422146499156952,
-0.10622692853212357,
-0.19696764647960663,
-0.005230903625488281,
0.07226017117500305,
-0.031281113624572754,
-0.033290378749370575,
-0.05692625045776367,
-0.07604914903640747,
0.018459098413586617,
-0.016276847571134567,
0.07485166937112808,
-0.06258618086576462,
0.09666934609413147,
0.03680890426039696,
0.06635802239179611,
-0.062495846301317215,
0.04196435585618019,
-0.1108182966709137,
0.034678224474191666,
-0.1837867796421051,
0.03664547577500343,
-0.06929502636194229,
0.06548699736595154,
-0.08306700736284256,
-0.07337658852338791,
0.0022841475438326597,
-0.0032535314094275236,
0.068473681807518,
0.09256382286548615,
-0.17408700287342072,
-0.06158512830734253,
0.14403803646564484,
-0.09080757945775986,
-0.1406845599412918,
0.13845224678516388,
-0.06076953932642937,
0.050180256366729736,
0.06748602539300919,
0.20164018869400024,
0.07143031805753708,
-0.07824953645467758,
0.009550555609166622,
0.004625976551324129,
0.06290360540151596,
-0.027836095541715622,
0.07616737484931946,
0.002897188300266862,
0.006067011039704084,
0.013470789417624474,
-0.050840359181165695,
0.04516048729419708,
-0.07396241277456284,
-0.0939980298280716,
-0.04395344480872154,
-0.1014312133193016,
0.06569263339042664,
0.051725391298532486,
0.06870488077402115,
-0.11351328343153,
-0.08641940355300903,
0.06805786490440369,
0.0730888620018959,
-0.07413754612207413,
0.026085350662469864,
-0.07032324373722076,
0.08287811279296875,
-0.05320930853486061,
-0.012524441815912724,
-0.16007238626480103,
-0.036736078560352325,
0.023458898067474365,
-0.008170023560523987,
0.019877871498465538,
-0.0021719078067690134,
0.07352416217327118,
0.08080187439918518,
-0.07439074665307999,
-0.029838863760232925,
-0.009708479978144169,
0.015396040864288807,
-0.12263476103544235,
-0.19809359312057495,
-0.00812264159321785,
-0.037562232464551926,
0.12936612963676453,
-0.22733590006828308,
0.05196335166692734,
0.0002264039358124137,
0.09582049399614334,
0.0416586697101593,
-0.0073423502035439014,
-0.040914617478847504,
0.060607217252254486,
-0.05290140211582184,
-0.06812110543251038,
0.061239197850227356,
0.009604355320334435,
-0.10465332865715027,
-0.047215837985277176,
-0.14232473075389862,
0.18258355557918549,
0.1304166167974472,
-0.08276215940713882,
-0.06587540358304977,
0.007949452847242355,
-0.03613446280360222,
-0.02875368297100067,
-0.04062514752149582,
0.00410206476226449,
0.1279762238264084,
-0.010706719011068344,
0.1546172946691513,
-0.08953037112951279,
-0.03471137583255768,
0.01782185584306717,
-0.05211556702852249,
0.005459570325911045,
0.11025799065828323,
0.07404210418462753,
-0.11881019920110703,
0.1486174762248993,
0.2067631483078003,
-0.09571727365255356,
0.13579827547073364,
-0.0472286157310009,
-0.04863784462213516,
-0.026692403480410576,
0.007044097874313593,
0.010845634154975414,
0.10153281688690186,
-0.1146823912858963,
0.007195579819381237,
0.015327511355280876,
0.01634928584098816,
0.009496953338384628,
-0.2115468829870224,
-0.020887356251478195,
0.03684235364198685,
-0.05266826972365379,
0.011382010765373707,
-0.018603354692459106,
-0.010951214469969273,
0.09799563139677048,
-0.00948934257030487,
-0.09440761804580688,
0.05269159376621246,
-0.0043449969962239265,
-0.07463862001895905,
0.19957749545574188,
-0.09539637714624405,
-0.15827253460884094,
-0.13217122852802277,
-0.06512434780597687,
-0.06263192743062973,
0.03309796750545502,
0.07075757533311844,
-0.062446340918540955,
-0.04488082602620125,
-0.11194973438978195,
-0.005750475451350212,
0.025999929755926132,
0.018164345994591713,
0.027035627514123917,
-0.0018451075302436948,
0.08869156241416931,
-0.10355286300182343,
-0.00945084448903799,
-0.031644370406866074,
-0.04842795059084892,
0.03706330806016922,
0.03310619667172432,
0.10598734766244888,
0.13669195771217346,
-0.028727389872074127,
-0.004016880877315998,
-0.027004115283489227,
0.22637762129306793,
-0.05872900411486626,
-0.001165876630693674,
0.1343916356563568,
-0.02860669232904911,
0.058104775846004486,
0.13683390617370605,
0.06258900463581085,
-0.09634390473365784,
0.02017364278435707,
0.035407960414886475,
-0.03474221006035805,
-0.21608035266399384,
-0.03688240423798561,
-0.035257354378700256,
0.006485526915639639,
0.09455239027738571,
0.03515748307108879,
0.029318323358893394,
0.06389832496643066,
0.021100690588355064,
0.07843982428312302,
-0.0005097358953207731,
0.07360055297613144,
0.11827528476715088,
0.042940348386764526,
0.13245470821857452,
-0.04533745348453522,
-0.05640536919236183,
0.044650688767433167,
-0.0060392688028514385,
0.19727005064487457,
0.022779539227485657,
0.1354413479566574,
0.04809228330850601,
0.15638190507888794,
-0.006332395598292351,
0.06123470142483711,
-0.013218614272773266,
-0.031597938388586044,
-0.022014210000634193,
-0.05228841304779053,
-0.02484453283250332,
0.039802566170692444,
-0.08870066702365875,
0.05418580397963524,
-0.09560089558362961,
0.017404207959771156,
0.06763454526662827,
0.23564307391643524,
0.05079013481736183,
-0.31628429889678955,
-0.08558297157287598,
0.03837418183684349,
-0.02530944161117077,
-0.02007925882935524,
0.029434192925691605,
0.12170978635549545,
-0.04785851016640663,
0.04925205931067467,
-0.07750130444765091,
0.08504318445920944,
-0.035475365817546844,
0.04697280749678612,
0.052135273814201355,
0.08253123611211777,
-0.005146387033164501,
0.07509756833314896,
-0.2880243957042694,
0.26246464252471924,
0.019334519281983376,
0.07074800878763199,
-0.05197254940867424,
0.005251124035567045,
0.04002148285508156,
0.09441938996315002,
0.07351373136043549,
-0.01570369303226471,
-0.0654997006058693,
-0.17821235954761505,
-0.06192803010344505,
0.020869312807917595,
0.09432113915681839,
-0.04069112613797188,
0.09361796081066132,
-0.032673101872205734,
0.0018532571848481894,
0.08157292008399963,
-0.021014932543039322,
-0.08502376824617386,
-0.09368307888507843,
-0.008905177935957909,
0.04097491502761841,
-0.03878292441368103,
-0.07735822349786758,
-0.09634415060281754,
-0.1317199468612671,
0.15700528025627136,
-0.05365661904215813,
-0.038154732435941696,
-0.10143553465604782,
0.05536038056015968,
0.05737286061048508,
-0.07938631623983383,
0.034190718084573746,
0.005522001069039106,
0.086995929479599,
0.022261803969740868,
-0.06936872005462646,
0.12091349810361862,
-0.07417368143796921,
-0.17895004153251648,
-0.06689543277025223,
0.1073911264538765,
0.016967061907052994,
0.04384465888142586,
-0.003939361311495304,
0.015496612526476383,
-0.01773250661790371,
-0.07557760924100876,
0.027127359062433243,
-0.0004438577452674508,
0.05506158247590065,
0.0272329393774271,
-0.05900978669524193,
-0.005473320838063955,
-0.05892971530556679,
-0.02869773656129837,
0.1471795290708542,
0.2884332537651062,
-0.08215415477752686,
0.020938297733664513,
0.0705472007393837,
-0.06946388632059097,
-0.20971108973026276,
0.027015037834644318,
0.023768967017531395,
-0.005495409946888685,
0.05788213387131691,
-0.1498233526945114,
0.1039208471775055,
0.10089986026287079,
-0.031662601977586746,
0.10238587856292725,
-0.2900680601596832,
-0.13800883293151855,
0.12793584167957306,
0.1420518010854721,
0.11484784632921219,
-0.15970754623413086,
-0.041375480592250824,
-0.04193974286317825,
-0.08688896149396896,
0.1093466654419899,
-0.13819627463817596,
0.10881418734788895,
-0.006867789663374424,
0.05380350723862648,
0.005601470358669758,
-0.04984588548541069,
0.13666853308677673,
-0.003726549446582794,
0.11814361810684204,
-0.06583453714847565,
-0.007247914560139179,
0.06496827304363251,
-0.06235118582844734,
0.028761688619852066,
-0.12050696462392807,
0.04611263796687126,
-0.062082067131996155,
-0.022609131410717964,
-0.04229997470974922,
0.03958353027701378,
-0.03440290689468384,
-0.06512994319200516,
-0.04659460112452507,
0.027596991509199142,
0.0479687824845314,
-0.008913140743970871,
0.18172280490398407,
0.02407049760222435,
0.13958169519901276,
0.1621658205986023,
0.07797637581825256,
-0.07167762517929077,
-0.012460877187550068,
-0.013882449828088284,
-0.034095872193574905,
0.062411535531282425,
-0.15947523713111877,
0.046997521072626114,
0.12289862334728241,
0.006636141799390316,
0.15209618210792542,
0.06721685081720352,
-0.029003456234931946,
0.013890491798520088,
0.0606515072286129,
-0.17050805687904358,
-0.10738909244537354,
-0.009914337657392025,
-0.03018624149262905,
-0.11437971889972687,
0.06353311985731125,
0.12979957461357117,
-0.06692610681056976,
0.008534704335033894,
-0.00366361066699028,
0.021590765565633774,
-0.03339353948831558,
0.1827058494091034,
0.06169332191348076,
0.04253038018941879,
-0.0853588730096817,
0.10068171471357346,
0.05770397558808327,
-0.0706048533320427,
0.012739990837872028,
0.04828668758273125,
-0.08210878819227219,
-0.04941298067569733,
0.04924240708351135,
0.19624628126621246,
-0.02851487137377262,
-0.05208349600434303,
-0.14649862051010132,
-0.10707079619169235,
0.051426250487565994,
0.15943235158920288,
0.10214749723672867,
0.008074752055108547,
-0.04095737636089325,
0.011641246266663074,
-0.10445597767829895,
0.1249285563826561,
0.04896088317036629,
0.08532719314098358,
-0.15842580795288086,
0.11025864630937576,
-0.004672781098634005,
0.010790706612169743,
-0.028056327253580093,
0.043784696608781815,
-0.10934721678495407,
-0.009139272384345531,
-0.1374220848083496,
-0.003066734876483679,
-0.022742480039596558,
0.009717575274407864,
0.002509254263713956,
-0.060115277767181396,
-0.05702780932188034,
0.01335076242685318,
-0.09990077465772629,
-0.021870914846658707,
0.03508708253502846,
0.05293571203947067,
-0.12596069276332855,
-0.05630340427160263,
0.018375856801867485,
-0.06816056370735168,
0.0702814906835556,
0.01903885044157505,
0.013062783516943455,
0.049413811415433884,
-0.18195350468158722,
0.022122958675026894,
0.06071025878190994,
0.019478872418403625,
0.0444464311003685,
-0.08663713932037354,
-0.024595173075795174,
0.002546576550230384,
0.04597204178571701,
0.019292738288640976,
0.08880658447742462,
-0.1258571445941925,
0.013376898132264614,
-0.033503804355859756,
-0.07292158901691437,
-0.05150812119245529,
0.03277651593089104,
0.08672047406435013,
0.015028915368020535,
0.21930697560310364,
-0.09788388013839722,
0.01692311093211174,
-0.20010007917881012,
0.008953569456934929,
0.005229828413575888,
-0.12491333484649658,
-0.12467022240161896,
-0.0508919283747673,
0.04744887724518776,
-0.06565657258033752,
0.1371975839138031,
0.022460270673036575,
0.025334494188427925,
0.039426226168870926,
-0.03764406591653824,
0.03665042296051979,
0.022846432402729988,
0.21552275121212006,
0.028863048180937767,
-0.03905826807022095,
0.018020352348685265,
0.023438967764377594,
0.11510421335697174,
0.07957468926906586,
0.16547304391860962,
0.1687891185283661,
-0.05047261714935303,
0.09726493060588837,
0.04198668152093887,
-0.045426830649375916,
-0.1500409096479416,
0.06415248662233353,
-0.028207927942276,
0.10947396606206894,
-0.015843192115426064,
0.1938706934452057,
0.0910869687795639,
-0.16245640814304352,
0.019425487145781517,
-0.05261993780732155,
-0.0850987583398819,
-0.11023136973381042,
-0.07098947465419769,
-0.10283627361059189,
-0.1451779156923294,
-0.00784735381603241,
-0.11393815279006958,
0.02131550945341587,
0.09198518097400665,
0.0020332150161266327,
-0.02274199388921261,
0.15864664316177368,
-0.008149819448590279,
0.03513344004750252,
0.05927311256527901,
-0.002149451058357954,
-0.0461726114153862,
-0.06608077883720398,
-0.10039334744215012,
0.002057385863736272,
-0.0021215006709098816,
0.020179251208901405,
-0.04519597440958023,
-0.020094746723771095,
0.03467751666903496,
-0.016900377348065376,
-0.10895438492298126,
0.011455165222287178,
0.02706545777618885,
0.04911571741104126,
0.04639025405049324,
0.015199663117527962,
0.005939229391515255,
0.0031494644936174154,
0.22672364115715027,
-0.07946091890335083,
-0.065178282558918,
-0.096132293343544,
0.21650636196136475,
0.026440879330039024,
0.012307299301028252,
0.01543999370187521,
-0.09350261837244034,
0.01894436590373516,
0.20806846022605896,
0.19150367379188538,
-0.0893358513712883,
-0.002713070949539542,
-0.022125236690044403,
-0.009278379380702972,
-0.03385009616613388,
0.09255325049161911,
0.11099600791931152,
0.0014963657595217228,
-0.07013841718435287,
-0.0489102341234684,
-0.03906361013650894,
-0.007816681638360023,
-0.06527714431285858,
0.06098431348800659,
0.028029896318912506,
0.008445528335869312,
-0.04626801237463951,
0.0607743002474308,
-0.030458513647317886,
-0.11457160115242004,
0.037523750215768814,
-0.19330978393554688,
-0.15125994384288788,
-0.012444810941815376,
0.11977136135101318,
-0.009878015145659447,
0.044461436569690704,
-0.031929079443216324,
0.0034758853726089,
0.05714339017868042,
-0.029704073444008827,
-0.06139025464653969,
-0.0583818294107914,
0.06630783528089523,
-0.117778480052948,
0.23078592121601105,
-0.029258357360959053,
0.050177205353975296,
0.12560603022575378,
0.0406632237136364,
-0.07071354240179062,
0.08365996181964874,
0.04601852223277092,
-0.05666709318757057,
0.03745274990797043,
0.09272072464227676,
-0.04121397063136101,
0.12118438631296158,
0.06641913950443268,
-0.1305442601442337,
0.001553710550069809,
-0.028945166617631912,
-0.09436935931444168,
-0.051205676048994064,
-0.04011671245098114,
-0.06293879449367523,
0.1279955953359604,
0.18476663529872894,
-0.03519527241587639,
0.009957547299563885,
-0.048826079815626144,
0.02144162543118,
0.0661601647734642,
0.02481161430478096,
-0.0339943952858448,
-0.22979126870632172,
0.02324492670595646,
0.06944291293621063,
-0.0018917714478448033,
-0.27080708742141724,
-0.08645816892385483,
-0.0130172623321414,
-0.044301409274339676,
-0.0961085855960846,
0.08595990389585495,
0.11138355731964111,
0.043216850608587265,
-0.060799747705459595,
-0.08875515311956406,
-0.07906167209148407,
0.15013602375984192,
-0.1289762258529663,
-0.09081195294857025
] |
null | null | transformers |
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
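
A generic starting point, assuming the checkpoint follows the standard T5 `text2text-generation` interface indicated by the repository tags (the prompt is only a placeholder, since the intended task is not documented):

```python
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

model_id = "OmarHaroon01/t5_imdb_accelerator"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

inputs = tokenizer("This movie was surprisingly", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```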
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
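
As a rough illustration of what that calculator computes (every number below is a placeholder, since the hardware, runtime, and region are not reported):

```python
# emissions ≈ power draw (kW) × training time (h) × grid carbon intensity (kg CO2eq/kWh)
gpu_power_kw = 0.3        # placeholder: e.g. a single ~300 W accelerator
hours_used = 1.0          # placeholder: actual training time unknown
carbon_intensity = 0.4    # placeholder: kg CO2eq per kWh for the (unknown) region

emissions_kg = gpu_power_kw * hours_used * carbon_intensity
print(f"Estimated emissions: {emissions_kg:.3f} kg CO2eq")
```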
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed]
| {"library_name": "transformers", "tags": []} | text2text-generation | OmarHaroon01/t5_imdb_accelerator | [
"transformers",
"safetensors",
"t5",
"text2text-generation",
"arxiv:1910.09700",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | 2024-02-14T14:56:31+00:00 | [
"1910.09700"
] | [] | TAGS
#transformers #safetensors #t5 #text2text-generation #arxiv-1910.09700 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
# Model Card for Model ID
## Model Details
### Model Description
This is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.
- Developed by:
- Funded by [optional]:
- Shared by [optional]:
- Model type:
- Language(s) (NLP):
- License:
- Finetuned from model [optional]:
### Model Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Downstream Use [optional]
### Out-of-Scope Use
## Bias, Risks, and Limitations
### Recommendations
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
## Training Details
### Training Data
### Training Procedure
#### Preprocessing [optional]
#### Training Hyperparameters
- Training regime:
#### Speeds, Sizes, Times [optional]
## Evaluation
### Testing Data, Factors & Metrics
#### Testing Data
#### Factors
#### Metrics
### Results
#### Summary
## Model Examination [optional]
## Environmental Impact
Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).
- Hardware Type:
- Hours used:
- Cloud Provider:
- Compute Region:
- Carbon Emitted:
## Technical Specifications [optional]
### Model Architecture and Objective
### Compute Infrastructure
#### Hardware
#### Software
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Model Card Authors [optional]
## Model Card Contact
| [
"# Model Card for Model ID",
"## Model Details",
"### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:",
"### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Downstream Use [optional]",
"### Out-of-Scope Use",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.",
"## How to Get Started with the Model\n\nUse the code below to get started with the model.",
"## Training Details",
"### Training Data",
"### Training Procedure",
"#### Preprocessing [optional]",
"#### Training Hyperparameters\n\n- Training regime:",
"#### Speeds, Sizes, Times [optional]",
"## Evaluation",
"### Testing Data, Factors & Metrics",
"#### Testing Data",
"#### Factors",
"#### Metrics",
"### Results",
"#### Summary",
"## Model Examination [optional]",
"## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:",
"## Technical Specifications [optional]",
"### Model Architecture and Objective",
"### Compute Infrastructure",
"#### Hardware",
"#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Model Card Authors [optional]",
"## Model Card Contact"
] | [
"TAGS\n#transformers #safetensors #t5 #text2text-generation #arxiv-1910.09700 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n",
"# Model Card for Model ID",
"## Model Details",
"### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:",
"### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Downstream Use [optional]",
"### Out-of-Scope Use",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.",
"## How to Get Started with the Model\n\nUse the code below to get started with the model.",
"## Training Details",
"### Training Data",
"### Training Procedure",
"#### Preprocessing [optional]",
"#### Training Hyperparameters\n\n- Training regime:",
"#### Speeds, Sizes, Times [optional]",
"## Evaluation",
"### Testing Data, Factors & Metrics",
"#### Testing Data",
"#### Factors",
"#### Metrics",
"### Results",
"#### Summary",
"## Model Examination [optional]",
"## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:",
"## Technical Specifications [optional]",
"### Model Architecture and Objective",
"### Compute Infrastructure",
"#### Hardware",
"#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Model Card Authors [optional]",
"## Model Card Contact"
] | [
58,
6,
3,
82,
28,
3,
4,
9,
9,
10,
42,
20,
3,
4,
5,
9,
11,
13,
3,
12,
5,
4,
5,
3,
4,
9,
53,
9,
8,
6,
3,
14,
8,
7,
9,
4
] | [
"passage: TAGS\n#transformers #safetensors #t5 #text2text-generation #arxiv-1910.09700 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n# Model Card for Model ID## Model Details### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Downstream Use [optional]### Out-of-Scope Use## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.## How to Get Started with the Model\n\nUse the code below to get started with the model.## Training Details### Training Data### Training Procedure#### Preprocessing [optional]#### Training Hyperparameters\n\n- Training regime:#### Speeds, Sizes, Times [optional]## Evaluation### Testing Data, Factors & Metrics#### Testing Data#### Factors#### Metrics### Results#### Summary## Model Examination [optional]## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:## Technical Specifications [optional]### Model Architecture and Objective### Compute Infrastructure#### Hardware#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Model Card Authors [optional]## Model Card Contact"
] | [
-0.053328532725572586,
0.16120538115501404,
-0.005120371468365192,
0.022602224722504616,
0.09686747193336487,
0.013199392706155777,
0.07261143624782562,
0.11177206039428711,
-0.020693831145763397,
0.1128523200750351,
0.0323781855404377,
0.09778297692537308,
0.11381756514310837,
0.15530984103679657,
-0.0018252237932756543,
-0.23414164781570435,
0.051169246435165405,
-0.12603329122066498,
-0.039110470563173294,
0.11734651774168015,
0.14655858278274536,
-0.10434788465499878,
0.07780920714139938,
-0.029932111501693726,
-0.010786613449454308,
-0.030950399115681648,
-0.06109464541077614,
-0.04963193088769913,
0.05158040300011635,
0.07096312940120697,
0.06875279545783997,
0.009741154499351978,
0.09293358027935028,
-0.2676756680011749,
0.021060682833194733,
0.07436702400445938,
-0.0019205488497391343,
0.07644513249397278,
0.05394738167524338,
-0.07786445319652557,
0.08801496773958206,
-0.053122974932193756,
0.14802159368991852,
0.08166222274303436,
-0.09144649654626846,
-0.19256246089935303,
-0.08630277216434479,
0.10201671719551086,
0.17971307039260864,
0.050409309566020966,
-0.02338344417512417,
0.10295069962739944,
-0.08843041211366653,
0.012706292793154716,
0.059160783886909485,
-0.06515879184007645,
-0.05482804775238037,
0.0630323737859726,
0.08173035830259323,
0.0787791833281517,
-0.12468571215867996,
-0.018215585500001907,
0.011311499401926994,
0.00691694812849164,
0.08102929592132568,
0.022060219198465347,
0.14176861941814423,
0.03922285884618759,
-0.1292058527469635,
-0.047744158655405045,
0.10315844416618347,
0.04381343349814415,
-0.04969092458486557,
-0.24839195609092712,
-0.028692634776234627,
-0.03409173712134361,
-0.029329892247915268,
-0.041139665991067886,
0.04428756237030029,
-0.010770969092845917,
0.08322557806968689,
-0.008045176975429058,
-0.07979845255613327,
-0.03690612316131592,
0.06324487924575806,
0.05645342543721199,
0.024454401805996895,
-0.008984005078673363,
0.006743076257407665,
0.1175178587436676,
0.10636600106954575,
-0.12631633877754211,
-0.05289403349161148,
-0.06528059393167496,
-0.0853288322687149,
-0.04429693520069122,
0.03338160738348961,
0.04351643845438957,
0.04334709793329239,
0.24920088052749634,
0.011966975405812263,
0.05556565150618553,
0.03878911957144737,
0.011687099933624268,
0.06360286474227905,
0.11270952969789505,
-0.05845928564667702,
-0.09383665025234222,
-0.033332064747810364,
0.09301437437534332,
0.008503437042236328,
-0.0402098223567009,
-0.06047673895955086,
0.06078295037150383,
0.015703821554780006,
0.12211526930332184,
0.087046779692173,
0.002870776690542698,
-0.07195370644330978,
-0.06478150933980942,
0.19285908341407776,
-0.15949691832065582,
0.047871991991996765,
0.03357849270105362,
-0.040312062948942184,
-0.0005020854296162724,
0.01165273692458868,
0.023987481370568275,
-0.021567439660429955,
0.0924374982714653,
-0.05500924214720726,
-0.03761355206370354,
-0.10879732668399811,
-0.03591866046190262,
0.03197222575545311,
0.0022585385013371706,
-0.02967100404202938,
-0.033424828201532364,
-0.08920473605394363,
-0.0635172426700592,
0.09580977261066437,
-0.07413128018379211,
-0.05156254023313522,
-0.016345804557204247,
-0.0761859342455864,
0.026101797819137573,
0.01702207140624523,
0.08535456657409668,
-0.0213642455637455,
0.037230201065540314,
-0.05421315133571625,
0.06241346150636673,
0.10910454392433167,
0.0320611298084259,
-0.053984515368938446,
0.06094928830862045,
-0.2412392497062683,
0.10316064208745956,
-0.07156267017126083,
0.05108866095542908,
-0.15137021243572235,
-0.025331947952508926,
0.04665522649884224,
0.009590202011168003,
-0.011478574015200138,
0.14007656276226044,
-0.2198302298784256,
-0.029333066195249557,
0.1640782356262207,
-0.09730498492717743,
-0.08055570721626282,
0.059064920991659164,
-0.054139286279678345,
0.10999192297458649,
0.04003598168492317,
-0.023768696933984756,
0.06297750771045685,
-0.14250542223453522,
-0.0039275879971683025,
-0.041889119893312454,
-0.01720282807946205,
0.16010744869709015,
0.07506491243839264,
-0.06698185205459595,
0.077672079205513,
0.022212913259863853,
-0.023321649059653282,
-0.04393244534730911,
-0.022494852542877197,
-0.10826845467090607,
0.009565223939716816,
-0.06269361078739166,
0.02424052357673645,
-0.023944495245814323,
-0.0903024971485138,
-0.029575346037745476,
-0.1770460456609726,
-0.013402442447841167,
0.08679109811782837,
-0.010982494801282883,
-0.019886262714862823,
-0.11693590134382248,
0.012033592909574509,
0.032231178134679794,
0.0004325093177612871,
-0.13445010781288147,
-0.05658498778939247,
0.0273329745978117,
-0.16240260004997253,
0.031236927956342697,
-0.05114622414112091,
0.04928715154528618,
0.03406677767634392,
-0.03175085783004761,
-0.031348153948783875,
0.01572313904762268,
0.006510823033750057,
-0.013680041767656803,
-0.24737438559532166,
-0.02852414920926094,
-0.022412575781345367,
0.16979394853115082,
-0.2190135270357132,
0.04012007266283035,
0.07135825604200363,
0.15074580907821655,
0.006911954842507839,
-0.03669405356049538,
0.005606858059763908,
-0.0768459290266037,
-0.03284264728426933,
-0.0623927041888237,
-0.008401541970670223,
-0.03721899166703224,
-0.054593876004219055,
0.051287684589624405,
-0.16718235611915588,
-0.031153932213783264,
0.1028679683804512,
0.06780845671892166,
-0.13963541388511658,
-0.01705223321914673,
-0.04106766730546951,
-0.043112557381391525,
-0.05709490180015564,
-0.05539087578654289,
0.11148729920387268,
0.05757083371281624,
0.04828811436891556,
-0.06848311424255371,
-0.0756818875670433,
0.006132613401859999,
-0.0179264098405838,
-0.021222935989499092,
0.0928845927119255,
0.07583390921354294,
-0.12310270220041275,
0.09178637713193893,
0.10549022257328033,
0.0892157256603241,
0.10119049996137619,
-0.02137933485209942,
-0.08691582083702087,
-0.04892461374402046,
0.0229446180164814,
0.016364475712180138,
0.13983985781669617,
-0.016759416088461876,
0.05310053750872612,
0.04020100086927414,
-0.012910815887153149,
0.011883769184350967,
-0.09328193217515945,
0.02934250421822071,
0.03636814281344414,
-0.019501443952322006,
0.040251899510622025,
-0.03908125311136246,
0.020790016278624535,
0.08787564933300018,
0.04434992000460625,
0.03818633407354355,
0.013980780728161335,
-0.04370194673538208,
-0.11091572046279907,
0.17051653563976288,
-0.12536633014678955,
-0.239797443151474,
-0.14147889614105225,
0.001731917611323297,
0.041165996342897415,
-0.01159723661839962,
0.0031763319857418537,
-0.06770002096891403,
-0.11874829977750778,
-0.09346967190504074,
0.015001182444393635,
0.04228860139846802,
-0.080612413585186,
-0.05524664744734764,
0.05777253210544586,
0.040611669421195984,
-0.143319234251976,
0.020423002541065216,
0.04869217798113823,
-0.08989228308200836,
-0.00900039542466402,
0.08071441948413849,
0.06998268514871597,
0.17929090559482574,
0.009512054733932018,
-0.020932139828801155,
0.03292093798518181,
0.2157505750656128,
-0.13771237432956696,
0.11451084166765213,
0.14277678728103638,
-0.0911637470126152,
0.08293474465608597,
0.1991184800863266,
0.03884927183389664,
-0.10264625400304794,
0.03326369449496269,
0.022328944876790047,
-0.028676386922597885,
-0.2503291964530945,
-0.06918580830097198,
0.0007976540364325047,
-0.05238448083400726,
0.07527847588062286,
0.08888168632984161,
0.09494108706712723,
0.01729334332048893,
-0.09416709095239639,
-0.08025584369897842,
0.04901478812098503,
0.10409125685691833,
0.010409193113446236,
-0.01156378723680973,
0.09060908854007721,
-0.03323452174663544,
0.01843860000371933,
0.09313460439443588,
0.004041523206979036,
0.17060963809490204,
0.05550962686538696,
0.18336638808250427,
0.07643263041973114,
0.0721396952867508,
0.015671607106924057,
0.013079277239739895,
0.02304760180413723,
0.021578695625066757,
-0.0033059304114431143,
-0.0851421132683754,
-0.009511260315775871,
0.11862117052078247,
0.06801546365022659,
0.020754681900143623,
0.009507957845926285,
-0.033934496343135834,
0.08064714074134827,
0.17465052008628845,
-0.0009437129483558238,
-0.1870066076517105,
-0.06896740943193436,
0.08026526123285294,
-0.08972865343093872,
-0.10345284640789032,
-0.02900044620037079,
0.0354950949549675,
-0.17372116446495056,
0.02448408491909504,
-0.018045885488390923,
0.11108683049678802,
-0.1356782615184784,
-0.01890929788351059,
0.06319493800401688,
0.07008420675992966,
-0.0016097982879728079,
0.06208989396691322,
-0.16155508160591125,
0.10791012644767761,
0.01390943955630064,
0.06503470987081528,
-0.09786296635866165,
0.10111832618713379,
-0.006267238408327103,
-0.007413685787469149,
0.14043578505516052,
0.009255880489945412,
-0.07051325589418411,
-0.08343593031167984,
-0.0979004055261612,
-0.010649190284311771,
0.12877127528190613,
-0.14879846572875977,
0.08456916362047195,
-0.0322830006480217,
-0.04405250772833824,
0.005208021495491266,
-0.10768675804138184,
-0.12857580184936523,
-0.18887875974178314,
0.05537694692611694,
-0.13356289267539978,
0.033175256103277206,
-0.1055491715669632,
-0.0408647358417511,
-0.02885887771844864,
0.19630752503871918,
-0.22321896255016327,
-0.0670507624745369,
-0.15318840742111206,
-0.09096445143222809,
0.14798617362976074,
-0.049908362329006195,
0.08374498039484024,
-0.005065108183771372,
0.18742504715919495,
0.01894373446702957,
-0.024415504187345505,
0.1011786088347435,
-0.09638315439224243,
-0.19627197086811066,
-0.08534666895866394,
0.15457913279533386,
0.13537167012691498,
0.0351712740957737,
-0.004617651924490929,
0.03167666867375374,
-0.0189940445125103,
-0.12101218104362488,
0.022920187562704086,
0.17696480453014374,
0.07036592066287994,
0.024736741557717323,
-0.02639835514128208,
-0.11453131586313248,
-0.06600044667720795,
-0.032452553510665894,
0.02982977218925953,
0.18294402956962585,
-0.07586611062288284,
0.18679921329021454,
0.13732017576694489,
-0.05770440772175789,
-0.1956426501274109,
0.01923983357846737,
0.04058924317359924,
0.00837375782430172,
0.032165057957172394,
-0.20239581167697906,
0.08806682378053665,
0.0007347199134528637,
-0.05074144899845123,
0.13624143600463867,
-0.17552010715007782,
-0.15046143531799316,
0.06929060816764832,
0.03642011433839798,
-0.19279520213603973,
-0.12030941992998123,
-0.08865538984537125,
-0.05107492581009865,
-0.17776648700237274,
0.10758756101131439,
0.02193085290491581,
0.00676411809399724,
0.033654287457466125,
0.026140762493014336,
0.014790141955018044,
-0.0396585576236248,
0.19431912899017334,
-0.02348872646689415,
0.030807901173830032,
-0.08293910324573517,
-0.07001609355211258,
0.05941145867109299,
-0.05705835670232773,
0.0775861069560051,
-0.022215960547327995,
0.013414059765636921,
-0.10643109679222107,
-0.04425564035773277,
-0.03175993636250496,
0.015691282227635384,
-0.09722420573234558,
-0.08909335732460022,
-0.050057362765073776,
0.09262266010046005,
0.0974174216389656,
-0.035089656710624695,
-0.03564268350601196,
-0.07118509709835052,
0.039714183658361435,
0.18831974267959595,
0.17605267465114594,
0.046182651072740555,
-0.08030564337968826,
-0.004098092205822468,
-0.011694483458995819,
0.042484745383262634,
-0.21906526386737823,
0.062426332384347916,
0.05058585852384567,
0.014059843495488167,
0.1173645630478859,
-0.01779606007039547,
-0.15810294449329376,
-0.06761486083269119,
0.05993710458278656,
-0.06326820701360703,
-0.19225671887397766,
0.0032602818682789803,
0.055388111621141434,
-0.16711848974227905,
-0.04538320377469063,
0.0430813767015934,
-0.005750913172960281,
-0.039257556200027466,
0.01613711006939411,
0.08359149098396301,
0.0031580389477312565,
0.07040093839168549,
0.05520293489098549,
0.086640864610672,
-0.10250966250896454,
0.07937785238027573,
0.08386688679456711,
-0.08347215503454208,
0.028158824890851974,
0.09330378472805023,
-0.06144890934228897,
-0.029910072684288025,
0.032212331891059875,
0.08255140483379364,
0.012964491732418537,
-0.04401125758886337,
0.008184057660400867,
-0.10146338492631912,
0.0627170279622078,
0.09755739569664001,
0.03206513822078705,
0.011901181191205978,
0.03383762761950493,
0.04645882546901703,
-0.07481352984905243,
0.11842621862888336,
0.025973208248615265,
0.01822328381240368,
-0.04273592680692673,
-0.04516541585326195,
0.027133917436003685,
-0.02340707741677761,
-0.007566304877400398,
-0.03583317995071411,
-0.06988023966550827,
-0.01722576655447483,
-0.16493180394172668,
-0.01076561864465475,
-0.044063083827495575,
0.008020744659006596,
0.026847293600440025,
-0.0369400717318058,
0.008594665676355362,
0.009077225811779499,
-0.07577309012413025,
-0.06240518018603325,
-0.02245018258690834,
0.0914878100156784,
-0.16343435645103455,
0.023352261632680893,
0.08310231566429138,
-0.12098916620016098,
0.09322582185268402,
0.018653366714715958,
-0.0019369579385966063,
0.02680385299026966,
-0.15561461448669434,
0.0368269607424736,
-0.027320701628923416,
0.014671673998236656,
0.045705173164606094,
-0.21818207204341888,
-0.0014451020397245884,
-0.03558654710650444,
-0.059982262551784515,
-0.010693925432860851,
-0.037350837141275406,
-0.11245633661746979,
0.10088492184877396,
0.012412267737090588,
-0.08672942966222763,
-0.03157110512256622,
0.03652326017618179,
0.08053763210773468,
-0.02631879225373268,
0.15205731987953186,
-0.0010786735219880939,
0.07447176426649094,
-0.1738860309123993,
-0.0210786834359169,
-0.0090115275233984,
0.02177848480641842,
-0.016872623935341835,
-0.01564885675907135,
0.042430613189935684,
-0.026671668514609337,
0.18584245443344116,
-0.027355844154953957,
0.03733034059405327,
0.06316441297531128,
0.01770097203552723,
-0.021354418247938156,
0.10755398869514465,
0.06012963131070137,
0.02173144742846489,
0.019801700487732887,
0.0075409491546452045,
-0.041807159781455994,
-0.018543899059295654,
-0.19347810745239258,
0.07164526730775833,
0.14044208824634552,
0.08769161999225616,
-0.012164209969341755,
0.08067302405834198,
-0.10084949433803558,
-0.11743459850549698,
0.11121641099452972,
-0.059808436781167984,
-0.0022669173777103424,
-0.06652101874351501,
0.13155525922775269,
0.14582572877407074,
-0.19254228472709656,
0.07050827890634537,
-0.06511960923671722,
-0.05269601568579674,
-0.11906112730503082,
-0.1953776627779007,
-0.05703132599592209,
-0.054343048483133316,
-0.015079263597726822,
-0.05059242993593216,
0.07498416304588318,
0.05622640252113342,
0.010858895257115364,
0.0015552249969914556,
0.06971994787454605,
-0.019759170711040497,
0.001521410304121673,
0.032095473259687424,
0.06417544931173325,
0.014362066984176636,
-0.03133942559361458,
0.018592869862914085,
-0.008470231667160988,
0.03991629183292389,
0.0633486732840538,
0.04155107960104942,
-0.028110865503549576,
0.01659207232296467,
-0.0337030366063118,
-0.10854189842939377,
0.04278707876801491,
-0.028698457404971123,
-0.08063279837369919,
0.13984808325767517,
0.025403661653399467,
0.009562181308865547,
-0.022226108238101006,
0.241981640458107,
-0.07480388879776001,
-0.09265431761741638,
-0.14692139625549316,
0.1055137887597084,
-0.04348868504166603,
0.06415078788995743,
0.045384783297777176,
-0.10421041399240494,
0.012057800777256489,
0.12658540904521942,
0.1625804305076599,
-0.0438871793448925,
0.019560009241104126,
0.03037482313811779,
0.00398933095857501,
-0.03853052854537964,
0.05252939090132713,
0.06827457249164581,
0.14848913252353668,
-0.050116557627916336,
0.09223522990942001,
0.0050886585377156734,
-0.09908851981163025,
-0.034064266830682755,
0.11810369789600372,
-0.019035303965210915,
0.019260596483945847,
-0.05601469427347183,
0.11788773536682129,
-0.06368034332990646,
-0.233087420463562,
0.06406685709953308,
-0.07426205277442932,
-0.14131881296634674,
-0.024826664477586746,
0.07676053047180176,
-0.014309047721326351,
0.027850469574332237,
0.0722186341881752,
-0.07654546946287155,
0.19937579333782196,
0.03671684116125107,
-0.058611851185560226,
-0.05623113736510277,
0.07896319031715393,
-0.11419995129108429,
0.27488458156585693,
0.015893742442131042,
0.045155949890613556,
0.1038452610373497,
-0.013412448577582836,
-0.13435201346874237,
0.01833420805633068,
0.09638454020023346,
-0.08846497535705566,
0.04018587991595268,
0.20595665276050568,
-0.0028567397966980934,
0.11962885409593582,
0.07707620412111282,
-0.08087631314992905,
0.049051105976104736,
-0.09828304499387741,
-0.07230360060930252,
-0.08931835740804672,
0.09120666980743408,
-0.07232820242643356,
0.14308606088161469,
0.1311190128326416,
-0.05265164002776146,
0.00968363881111145,
-0.029376711696386337,
0.045510269701480865,
0.004632700700312853,
0.10403459519147873,
0.008749093860387802,
-0.1797543615102768,
0.02403045818209648,
0.01841445453464985,
0.10992073267698288,
-0.1701374351978302,
-0.09734909981489182,
0.043629229068756104,
-0.0012522460892796516,
-0.06121290475130081,
0.1290796846151352,
0.05957380682229996,
0.05011506378650665,
-0.043520737439394,
-0.0211784765124321,
-0.008504665456712246,
0.14072857797145844,
-0.10404830425977707,
-0.00016830587992444634
] |
null | null | null |
# PPO Agent Playing LunarLander-v2
This is a trained model of a PPO agent playing LunarLander-v2.
# Hyperparameters
```python
{'exp_name': 'ppo',
 'seed': 1,
 'torch_deterministic': True,
 'cuda': True,
 'track': False,
 'wandb_project_name': 'cleanRL',
 'wandb_entity': None,
 'capture_video': False,
 'env_id': 'LunarLander-v2',
 'total_timesteps': 50000,
 'learning_rate': 0.00025,
 'num_envs': 4,
 'num_steps': 128,
 'anneal_lr': True,
 'gae': True,
 'gamma': 0.99,
 'gae_lambda': 0.95,
 'num_minibatches': 4,
 'update_epochs': 4,
 'norm_adv': True,
 'clip_coef': 0.2,
 'clip_vloss': True,
 'ent_coef': 0.01,
 'vf_coef': 0.5,
 'max_grad_norm': 0.5,
 'target_kl': None,
 'repo_id': 'hugo-massonnat/ppo-LunarLander-v2-unit8',
 'batch_size': 512,
 'minibatch_size': 128}
```
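The `clip_coef`, `vf_coef` and `ent_coef` values above are the weights of the standard clipped-surrogate PPO objective. The sketch below is schematic PyTorch showing how those three hyperparameters enter the loss; it illustrates the technique and is not this repository's exact training code.
```python
import torch

def ppo_loss(new_logprob, old_logprob, advantages, values, returns, entropy,
             clip_coef=0.2, vf_coef=0.5, ent_coef=0.01):
    # Probability ratio between the updated policy and the policy that collected the rollouts.
    ratio = (new_logprob - old_logprob).exp()

    # Clipped surrogate policy objective.
    pg_loss1 = -advantages * ratio
    pg_loss2 = -advantages * torch.clamp(ratio, 1.0 - clip_coef, 1.0 + clip_coef)
    pg_loss = torch.max(pg_loss1, pg_loss2).mean()

    # Squared-error value loss and entropy bonus, weighted by vf_coef and ent_coef.
    v_loss = 0.5 * ((values - returns) ** 2).mean()
    return pg_loss + vf_coef * v_loss - ent_coef * entropy.mean()
```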
| {"tags": ["LunarLander-v2", "ppo", "deep-reinforcement-learning", "reinforcement-learning", "custom-implementation", "deep-rl-course"], "model-index": [{"name": "PPO", "results": [{"task": {"type": "reinforcement-learning", "name": "reinforcement-learning"}, "dataset": {"name": "LunarLander-v2", "type": "LunarLander-v2"}, "metrics": [{"type": "mean_reward", "value": "-113.86 +/- 36.79", "name": "mean_reward", "verified": false}]}]}]} | reinforcement-learning | hugo-massonnat/ppo-LunarLander-v2-unit8 | [
"tensorboard",
"LunarLander-v2",
"ppo",
"deep-reinforcement-learning",
"reinforcement-learning",
"custom-implementation",
"deep-rl-course",
"model-index",
"region:us"
] | 2024-02-14T14:57:43+00:00 | [] | [] | TAGS
#tensorboard #LunarLander-v2 #ppo #deep-reinforcement-learning #reinforcement-learning #custom-implementation #deep-rl-course #model-index #region-us
|
# PPO Agent Playing LunarLander-v2
This is a trained model of a PPO agent playing LunarLander-v2.
# Hyperparameters
| [
"# PPO Agent Playing LunarLander-v2\n\n This is a trained model of a PPO agent playing LunarLander-v2.\n \n # Hyperparameters"
] | [
"TAGS\n#tensorboard #LunarLander-v2 #ppo #deep-reinforcement-learning #reinforcement-learning #custom-implementation #deep-rl-course #model-index #region-us \n",
"# PPO Agent Playing LunarLander-v2\n\n This is a trained model of a PPO agent playing LunarLander-v2.\n \n # Hyperparameters"
] | [
51,
37
] | [
"passage: TAGS\n#tensorboard #LunarLander-v2 #ppo #deep-reinforcement-learning #reinforcement-learning #custom-implementation #deep-rl-course #model-index #region-us \n# PPO Agent Playing LunarLander-v2\n\n This is a trained model of a PPO agent playing LunarLander-v2.\n \n # Hyperparameters"
] | [
0.07948226481676102,
-0.021824665367603302,
-0.005334289278835058,
0.07425090670585632,
0.11451162397861481,
-0.051334477961063385,
0.11827225238084793,
0.05111894756555557,
0.0632978081703186,
0.08233953267335892,
0.09910695254802704,
0.11526558548212051,
0.02103434130549431,
0.12346389144659042,
0.10133372992277145,
-0.26653239130973816,
0.0048308540135622025,
-0.042133692651987076,
0.020121442154049873,
0.07062754780054092,
-0.028985055163502693,
-0.12164036184549332,
0.02042403817176819,
-0.008055811747908592,
0.04164125770330429,
0.03685355558991432,
-0.020250989124178886,
-0.07061084359884262,
0.1035412922501564,
-0.04342407360672951,
0.07646117359399796,
0.04053044691681862,
0.12915800511837006,
-0.11266650259494781,
0.03731851652264595,
0.047094929963350296,
-0.058420803397893906,
0.040810972452163696,
0.023221731185913086,
0.07433853298425674,
0.15582501888275146,
0.0008022422553040087,
0.10807766020298004,
-0.019928930327296257,
-0.15859591960906982,
-0.0564296655356884,
0.04013175517320633,
0.10688508301973343,
0.041339244693517685,
0.05763867497444153,
0.01518392562866211,
0.24210692942142487,
-0.07300914824008942,
0.0014766358071938157,
0.1963091939687729,
-0.2750851511955261,
-0.056198850274086,
0.2650637924671173,
0.08425293117761612,
0.09438422322273254,
-0.09869689494371414,
-0.0236953292042017,
0.007850034162402153,
0.013983802869915962,
-0.038732558488845825,
-0.07621388882398605,
0.1343805193901062,
0.06358266621828079,
-0.07906194031238556,
-0.05448254942893982,
0.09211132675409317,
0.015635671094059944,
0.03398676961660385,
0.0008897133520804346,
-0.015260354615747929,
0.03964465111494064,
-0.008004734292626381,
-0.08323223143815994,
0.067534439265728,
0.017411211505532265,
-0.059903185814619064,
-0.11101946979761124,
-0.11182308942079544,
-0.028280947357416153,
-0.08438915759325027,
0.16840966045856476,
-0.023494480177760124,
0.07285201549530029,
-0.06215810775756836,
0.06860414892435074,
-0.037912189960479736,
0.004227026831358671,
0.006380763836205006,
-0.049948662519454956,
-0.04539962485432625,
-0.025878654792904854,
0.006328459829092026,
0.011017742566764355,
0.11213880032300949,
-0.002449487103149295,
0.0508684441447258,
0.04856472462415695,
0.014653711579740047,
0.0942535474896431,
0.04126615449786186,
0.18958540260791779,
-0.006363034248352051,
0.0650586485862732,
0.062062907963991165,
0.017491057515144348,
0.022076671943068504,
-0.05142693966627121,
-0.1658715307712555,
0.0807771384716034,
-0.08260773122310638,
-0.028765955939888954,
0.09323479980230331,
-0.044928085058927536,
-0.1112084910273552,
-0.01773354969918728,
-0.07590804249048233,
-0.025731517001986504,
-0.01252016518265009,
0.01790926419198513,
-0.035574477165937424,
0.005672375671565533,
0.03449513763189316,
0.08204318583011627,
0.033907562494277954,
-0.08674118667840958,
0.00984077900648117,
0.012360874563455582,
-0.122767873108387,
-0.004771664272993803,
0.010288639925420284,
0.04804306477308273,
0.04491464048624039,
-0.1116413027048111,
-0.2020648866891861,
-0.08828215301036835,
0.053431469947099686,
-0.07537820190191269,
-0.15614600479602814,
-0.11512033641338348,
0.02302604168653488,
-0.10217837989330292,
-0.046169016510248184,
-0.0017400066135451198,
-0.019300667569041252,
0.05366985872387886,
-0.06531468033790588,
0.1828034669160843,
0.0271916463971138,
-0.00020129751646891236,
-0.14947181940078735,
0.019320663064718246,
-0.2362208217382431,
0.07685942947864532,
-0.04987453296780586,
0.07074880599975586,
-0.04584719240665436,
-0.09154892712831497,
-0.01864667609333992,
0.054014526307582855,
0.013841784559190273,
0.10950348526239395,
-0.1638582944869995,
-0.05129624530673027,
0.024843567982316017,
-0.08068934828042984,
-0.0030390452593564987,
-0.04837793856859207,
-0.04604795575141907,
0.1606992781162262,
0.018704978749155998,
0.14688511192798615,
-0.12919624149799347,
-0.09930720180273056,
0.19129104912281036,
0.03531093895435333,
-0.16984215378761292,
-0.036521974951028824,
0.09952033311128616,
0.019277004525065422,
-0.01849931664764881,
-0.05688142776489258,
-0.07599073648452759,
0.015944182872772217,
-0.08702079951763153,
-0.04182637855410576,
0.04013517126441002,
-0.042824242264032364,
0.14606650173664093,
0.10223949700593948,
0.07952884584665298,
-0.07538176327943802,
-0.007020880468189716,
0.08674140274524689,
0.06271850317716599,
0.045035574585199356,
0.03672485426068306,
-0.05614851415157318,
0.03206208720803261,
-0.025039123371243477,
-0.01738123595714569,
-0.13521039485931396,
0.0019960827194154263,
-0.06055765971541405,
0.1118607297539711,
0.13101612031459808,
0.28467631340026855,
0.10075046867132187,
0.02464960888028145,
0.07675616443157196,
-0.07042508572340012,
-0.10758408159017563,
0.002032244112342596,
0.0235405582934618,
-0.1785016655921936,
0.026378504931926727,
-0.07599464803934097,
-0.14044412970542908,
-0.1351996809244156,
-0.025685761123895645,
-0.17195537686347961,
0.02159930020570755,
0.054728612303733826,
-0.018639836460351944,
0.0013907389948144555,
0.12220112234354019,
0.013543038628995419,
-0.053733617067337036,
0.10188740491867065,
0.009542218409478664,
-0.05206648260354996,
-0.045367226004600525,
0.1050298660993576,
0.13431710004806519,
0.1365344226360321,
-0.2098493129014969,
0.008600602857768536,
0.1119711846113205,
-0.04708562791347504,
0.03519878163933754,
0.026510966941714287,
0.21071651577949524,
0.2740876078605652,
0.0374440960586071,
0.008118349127471447,
-0.05789022892713547,
0.0453064851462841,
-0.05260699614882469,
-0.11800429224967957,
-0.05410657823085785,
0.17159637808799744,
0.07862472534179688,
-0.006237224210053682,
0.09871696680784225,
0.07909595966339111,
0.037818074226379395,
0.16045765578746796,
0.03334520757198334,
-0.09544764459133148,
-0.03232238441705704,
-0.026171676814556122,
-0.0047440179623663425,
0.06791821867227554,
-0.0798373743891716,
-0.032012078911066055,
0.021649274975061417,
-0.13788609206676483,
0.018513672053813934,
-0.18612799048423767,
-0.1437452882528305,
0.03805195167660713,
0.043561313301324844,
-0.008401780389249325,
0.04065251722931862,
-0.0160639937967062,
0.05676067993044853,
0.03282754495739937,
-0.08861549198627472,
0.04405612871050835,
-0.005384152289479971,
0.009959283284842968,
0.03441033884882927,
-0.01767686940729618,
-0.21204280853271484,
-0.15340813994407654,
0.013550614938139915,
-0.05142427980899811,
0.05592547729611397,
-0.008550947532057762,
-0.19242143630981445,
0.025911282747983932,
-0.014332908205688,
0.02364996261894703,
-0.03164665028452873,
-0.03833974152803421,
0.1345074623823166,
0.14185978472232819,
-0.026165392249822617,
0.00023905932903289795,
-0.03341824188828468,
-0.14318081736564636,
-0.180479034781456,
0.06557876616716385,
0.0740460753440857,
0.006866236217319965,
0.1220167726278305,
0.004434254486113787,
0.026604121550917625,
-0.00636066310107708,
0.007762894034385681,
-0.07827747613191605,
-0.10268643498420715,
0.2943233549594879,
0.02490289881825447,
-0.022609207779169083,
-0.023361563682556152,
0.022680940106511116,
-0.005913543980568647,
0.020695405080914497,
-0.06731052696704865,
-0.11051533371210098,
-0.10214895755052567,
-0.018064133822917938,
-0.05326148122549057,
0.08696132898330688,
0.05207669362425804,
-0.0023201601579785347,
-0.058658841997385025,
0.0491698756814003,
0.15816207230091095,
0.0022554483730345964,
-0.07889559864997864,
0.00756099633872509,
0.06827649474143982,
-0.10357149690389633,
0.019141824916005135,
-0.011750275269150734,
-0.06115471199154854,
0.01578802429139614,
0.021844392642378807,
0.02698187716305256,
0.10298074781894684,
-0.21004606783390045,
0.04396829754114151,
0.06455216556787491,
0.025463011115789413,
0.08768844604492188,
0.05016043782234192,
-0.11047832667827606,
-0.016628960147500038,
-0.0343489907681942,
-0.16258354485034943,
0.1297316700220108,
0.14130131900310516,
0.06893892586231232,
0.039022352546453476,
0.04288983345031738,
-0.07514789700508118,
0.058336563408374786,
-0.03656633570790291,
-0.1470387876033783,
-0.018523573875427246,
0.03902188688516617,
0.03257647529244423,
0.038807060569524765,
0.10827972739934921,
0.10223158448934555,
-0.14332416653633118,
-0.03201044723391533,
0.06512229144573212,
-0.008886558935046196,
-0.04119880497455597,
0.004403908737003803,
-0.09832779318094254,
0.07498125731945038,
-0.0024919756688177586,
0.04813602566719055,
-0.20199769735336304,
0.16434083878993988,
-0.09330786764621735,
0.034300561994314194,
-0.04896155744791031,
-0.044333528727293015,
0.03555295243859291,
-0.09057865291833878,
0.20472288131713867,
0.0057462104596197605,
0.008313721977174282,
-0.12209630757570267,
-0.17661772668361664,
-0.034985676407814026,
-0.09205599129199982,
-0.07460658252239227,
0.02909865602850914,
0.0682184249162674,
0.029013507068157196,
-0.044006895273923874,
0.1327963024377823,
-0.007539169397205114,
0.08532623946666718,
-0.09495806694030762,
-0.09892267733812332,
-0.06850815564393997,
-0.09003753960132599,
-0.13165755569934845,
-0.069197878241539,
0.05082700401544571,
0.12665395438671112,
0.02109835296869278,
-0.02864154241979122,
0.016000375151634216,
-0.01131656114012003,
0.0060316757299005985,
-0.006539386231452227,
0.0482512004673481,
0.015850301831960678,
-0.05547862499952316,
-0.13189296424388885,
0.08252222090959549,
-0.06544385105371475,
-0.06556238979101181,
-0.023766927421092987,
0.09430349618196487,
0.09706855565309525,
0.1314772367477417,
-0.052682001143693924,
0.028886299580335617,
-0.03723334148526192,
-0.04484548792243004,
0.18565788865089417,
0.0040725888684391975,
-0.07140722125768661,
0.04510314390063286,
0.08041586726903915,
0.05989309027791023,
0.0390491709113121,
-0.031676698476076126,
0.20406655967235565,
0.15550298988819122,
-0.018378838896751404,
0.19636642932891846,
-0.017176153138279915,
-0.0269333329051733,
-0.20952188968658447,
0.006836839485913515,
-0.019357649609446526,
0.029477683827280998,
0.1340312361717224,
-0.1391998678445816,
0.02293945848941803,
-0.004865060094743967,
-0.02284914068877697,
-0.07053285837173462,
-0.3114997148513794,
-0.06468415260314941,
0.20102077722549438,
0.17379379272460938,
0.30399972200393677,
-0.10662104934453964,
0.05403600633144379,
0.02176249772310257,
0.035715505480766296,
0.03934846818447113,
-0.07645441591739655,
0.1000572219491005,
-0.11122481524944305,
0.16528162360191345,
0.08111181855201721,
-0.020749825984239578,
-0.02004031278192997,
-0.13701297342777252,
0.018633954226970673,
-0.12466508150100708,
-0.017992790788412094,
0.08779406547546387,
-0.003319771494716406,
-0.09328535199165344,
0.23242005705833435,
-0.06734555959701538,
-0.127778559923172,
-0.028943995013833046,
-0.057271506637334824,
-0.030531147494912148,
0.012628542259335518,
-0.09404513984918594,
0.005903336685150862,
0.1308545619249344,
-0.011834635399281979,
0.11608193069696426,
0.16071371734142303,
-0.035819161683321,
0.07980551570653915,
0.11671095341444016,
0.041628848761320114,
0.06653126329183578,
-0.16247588396072388,
-0.008802353404462337,
-0.0202709399163723,
0.029673689976334572,
-0.1328430324792862,
-0.08996491879224777,
0.037999510765075684,
0.055287107825279236,
-0.016219541430473328,
0.11157703399658203,
-0.02790040522813797,
0.0671137273311615,
0.05197756364941597,
-0.14911557734012604,
-0.21309031546115875,
0.043088413774967194,
-0.03457297012209892,
0.16741053760051727,
0.032527483999729156,
0.07026690244674683,
-0.1318490356206894,
0.005996404681354761,
-0.008010598830878735,
-0.02555401436984539,
-0.113502137362957,
-0.04016893729567528,
0.10736791044473648,
0.01890859194099903,
-0.05588224157691002,
0.11932288110256195,
0.053731534630060196,
0.07207717001438141,
0.022103527560830116,
0.036430660635232925,
0.10638459026813507,
-0.05759545415639877,
0.08525355905294418,
0.19163745641708374,
0.022084489464759827,
-0.050156377255916595,
-0.1069810688495636,
-0.142279252409935,
0.1059383824467659,
-0.029212607070803642,
0.06867408007383347,
-0.16743674874305725,
-0.09695854038000107,
0.03239866718649864,
-0.006085241679102182,
-0.045712824910879135,
-0.04037291929125786,
-0.029692232608795166,
-0.1638854742050171,
0.07177262753248215,
-0.026750473305583,
0.09733851999044418,
-0.07764898240566254,
-0.08057862520217896,
-0.1878826767206192,
0.0927230566740036,
0.11600489169359207,
-0.09250454604625702,
-0.07816965878009796,
0.0006463889149017632,
0.007188722491264343,
-0.05905555561184883,
-0.05547625944018364,
0.05128099024295807,
-0.1268264353275299,
0.03925716504454613,
0.02211940288543701,
0.07955963909626007,
-0.013168327510356903,
-0.022237133234739304,
0.053730763494968414,
-0.05526714771986008,
-0.004513209220021963,
-0.0007778665167279541,
-0.010598957538604736,
-0.04734821990132332,
-0.2539333701133728,
0.026826584711670876,
0.015074611641466618,
0.023000292479991913,
0.11450504511594772,
0.052672553807497025,
0.002142281737178564,
-0.022901082411408424,
-0.09921795129776001,
0.004082086030393839,
0.0676940307021141,
-0.0444176085293293,
0.02973432093858719,
0.04361078143119812,
-0.10892095416784286,
-0.011856138706207275,
-0.024206269532442093,
0.07134921103715897,
0.010941405780613422,
0.06965811550617218,
-0.07052738219499588,
0.09066002070903778,
-0.1813029795885086,
-0.042003389447927475,
0.02394963428378105,
0.0719861164689064,
0.12007027864456177,
-0.10232933610677719,
0.05554276332259178,
0.007666701916605234,
0.16984406113624573,
0.10653958469629288,
-0.002575549529865384,
-0.03601353242993355,
0.06471540033817291,
0.09858960658311844,
0.034707363694906235,
0.04066390544176102,
0.06345933675765991,
-0.010203788988292217,
0.10382732003927231,
0.10297582298517227,
0.14551296830177307,
0.050692107528448105,
0.15706492960453033,
0.03763074800372124,
0.008729667402803898,
0.07412492483854294,
0.0944521427154541,
0.08652419596910477,
-0.006242257542908192,
0.1731923371553421,
-0.007543493993580341,
-0.01751723699271679,
-0.03595760464668274,
0.16348356008529663,
0.06810002774000168,
-0.10502735525369644,
0.032236937433481216,
-0.05084357038140297,
0.025795334950089455,
-0.021152885630726814,
-0.15513712167739868,
-0.03436838835477829,
-0.2639841139316559,
0.12161721289157867,
-0.04934193193912506,
-0.00526955584064126,
0.0620683990418911,
-0.019800636917352676,
-0.053851764649152756,
-0.00036916558747179806,
0.0654521957039833,
0.026729213073849678,
0.01114212442189455,
-0.028801998123526573,
-0.021474527195096016,
-0.19075548648834229,
-0.11265835911035538,
-0.04041624069213867,
-0.13205185532569885,
-0.026539895683526993,
0.02738100476562977,
-0.05638997629284859,
0.00884995236992836,
-0.0025031883269548416,
-0.01385815255343914,
0.04824291169643402,
-0.052424367517232895,
0.045965224504470825,
0.051154542714357376,
0.06721315532922745,
-0.07684784382581711,
0.00411610584706068,
0.11700203269720078,
0.03185063600540161,
-0.09347992390394211,
0.055158115923404694,
0.12995439767837524,
-0.058530066162347794,
0.026019345968961716,
-0.007744444999843836,
-0.032847896218299866,
-0.09708602726459503,
0.19312189519405365,
0.11783043295145035,
-0.16847896575927734,
0.0006766151054762304,
-0.036616407334804535,
-0.01160040870308876,
-0.09233774989843369,
0.12344596534967422,
0.1592838317155838,
0.055998723953962326,
-0.15062640607357025,
-0.11043619364500046,
-0.10300665348768234,
0.06709197163581848,
-0.07569106668233871,
-0.07460284233093262,
0.15964122116565704,
-0.02457398921251297,
-0.10188330709934235,
0.03819292411208153,
-0.21867942810058594,
-0.01995755359530449,
0.19039398431777954,
-0.29568302631378174,
-0.11494400352239609,
-0.07910088449716568,
0.18586759269237518,
0.025469033047556877,
0.11436232179403305,
-0.023825788870453835,
-0.02012297883629799,
-0.221383735537529,
0.0029703411273658276,
-0.08713068813085556,
0.034245800226926804,
0.0651308074593544,
-0.09516268968582153,
0.24007263779640198,
-0.09044498205184937,
0.05269941687583923,
0.033750344067811966,
0.07691317796707153,
0.01018204540014267,
0.05163824185729027,
-0.048588331788778305,
-0.16688252985477448,
-0.09095858782529831,
0.014404932036995888,
0.03795035555958748,
0.0503084696829319,
0.09903772920370102,
-0.04082057997584343,
0.04713768512010574,
0.0953395888209343,
0.030845828354358673,
-0.004454230889678001,
0.052237071096897125,
-0.15630710124969482,
0.05534590780735016,
0.018921079114079475,
-0.025683825835585594,
0.02539582923054695,
-0.08227502554655075,
0.10333657264709473,
0.03491305932402611,
0.0618959404528141,
-0.0665573701262474,
0.03160114586353302,
-0.009742318652570248,
-0.12334126234054565,
-0.04329211637377739,
-0.18513770401477814,
-0.0893927589058876,
-0.1391412913799286,
-0.03897256776690483,
-0.04044290632009506,
-0.025919048115611076,
0.01644543558359146,
0.00776201207190752,
-0.0044921645894646645,
-0.11029971390962601,
0.07136444747447968,
0.11884529888629913,
-0.030008424073457718,
0.0031494214199483395
] |
null | null | transformers |

# 👑 AlphaMonarch-7B
**tl;dr: AlphaMonarch-7B is a new DPO merge that retains all the reasoning abilities of the very best merges and significantly improves its conversational abilities. Kind of the best of both worlds in a 7B model. 🎉**
AlphaMonarch-7B is a DPO fine-tune of [mlabonne/NeuralMonarch-7B](https://huggingface.co/mlabonne/NeuralMonarch-7B/) using the [argilla/OpenHermes2.5-dpo-binarized-alpha](https://huggingface.co/datasets/argilla/OpenHermes2.5-dpo-binarized-alpha) preference dataset.
It is based on a merge of the following models using [LazyMergekit](https://colab.research.google.com/drive/1obulZ1ROXHjYLn6PPZJwRR6GzgQogxxb?usp=sharing):
* [mlabonne/OmniTruthyBeagle-7B-v0](https://huggingface.co/mlabonne/OmniTruthyBeagle-7B-v0)
* [mlabonne/NeuBeagle-7B](https://huggingface.co/mlabonne/NeuBeagle-7B)
* [mlabonne/NeuralOmniBeagle-7B](https://huggingface.co/mlabonne/NeuralOmniBeagle-7B)
Special thanks to [Jon Durbin](https://huggingface.co/jondurbin), [Intel](https://huggingface.co/Intel), [Argilla](https://huggingface.co/argilla), and [Teknium](https://huggingface.co/teknium) for the preference datasets.
**Try the demo**: https://huggingface.co/spaces/mlabonne/AlphaMonarch-7B-GGUF-Chat
## 🔍 Applications
This model uses a context window of 8k. I recommend using it with the Mistral Instruct chat template (works perfectly with LM Studio).
It is one of the very best 7B models in terms of instruction following and reasoning abilities and can be used for conversations, RP, and storytelling. Note that it tends toward a quite formal and sophisticated style, but this can be changed by adjusting the prompt.
## ⚡ Quantized models
* **GGUF**: https://huggingface.co/mlabonne/AlphaMonarch-7B-GGUF
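If you prefer the GGUF build, a minimal [llama-cpp-python](https://github.com/abetlen/llama-cpp-python) sketch is shown below. The quant filename is an assumption (point `model_path` at whichever `.gguf` file you download from the repository above), and the 8k context matches the recommendation in the Applications section.
```python
# pip install llama-cpp-python
from llama_cpp import Llama

llm = Llama(
    model_path="./AlphaMonarch-7B.Q4_K_M.gguf",  # assumed filename: use the quant you downloaded
    n_ctx=8192,                                  # 8k context window, as recommended above
)

out = llm.create_chat_completion(
    messages=[{"role": "user", "content": "What is a large language model?"}],
    max_tokens=256,
    temperature=0.7,
)
print(out["choices"][0]["message"]["content"])
```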
## 🏆 Evaluation
### Nous
AlphaMonarch-7B is the best-performing 7B model on Nous' benchmark suite (evaluation performed using [LLM AutoEval](https://github.com/mlabonne/llm-autoeval)). See the entire leaderboard [here](https://huggingface.co/spaces/mlabonne/Yet_Another_LLM_Leaderboard).
| Model | Average | AGIEval | GPT4All | TruthfulQA | Bigbench |
|---|---:|---:|---:|---:|---:|
| [**AlphaMonarch-7B**](https://huggingface.co/mlabonne/AlphaMonarch-7B) [📄](https://gist.github.com/mlabonne/1d33c86824b3a11d2308e36db1ba41c1) | **62.74** | **45.37** | **77.01** | **78.39** | **50.2** |
| [NeuralMonarch-7B](https://huggingface.co/mlabonne/NeuralMonarch-7B) [📄](https://gist.github.com/mlabonne/64050c96c6aa261a8f5b403190c8dee4) | 62.73 | 45.31 | 76.99 | 78.35 | 50.28 |
| [Monarch-7B](https://huggingface.co/mlabonne/Monarch-7B) [📄](https://gist.github.com/mlabonne/0b8d057c5ece41e0290580a108c7a093) | 62.68 | 45.48 | 77.07 | 78.04 | 50.14 |
| [teknium/OpenHermes-2.5-Mistral-7B](https://huggingface.co/teknium/OpenHermes-2.5-Mistral-7B) [📄](https://gist.github.com/mlabonne/88b21dd9698ffed75d6163ebdc2f6cc8) | 52.42 | 42.75 | 72.99 | 52.99 | 40.94 |
| [mlabonne/NeuralHermes-2.5-Mistral-7B](https://huggingface.co/mlabonne/NeuralHermes-2.5-Mistral-7B) [📄](https://gist.github.com/mlabonne/14687f1eb3425b166db511f31f8e66f6) | 53.51 | 43.67 | 73.24 | 55.37 | 41.76 |
| [mlabonne/NeuralBeagle14-7B](https://huggingface.co/mlabonne/NeuralBeagle14-7B) [📄](https://gist.github.com/mlabonne/ad0c665bbe581c8420136c3b52b3c15c) | 60.25 | 46.06 | 76.77 | 70.32 | 47.86 |
| [mlabonne/NeuralOmniBeagle-7B](https://huggingface.co/mlabonne/NeuralOmniBeagle-7B) [📄](https://gist.github.com/mlabonne/0e49d591787185fa5ae92ca5d9d4a1fd) | 62.3 | 45.85 | 77.26 | 76.06 | 50.03 |
| [eren23/dpo-binarized-NeuralTrix-7B](https://huggingface.co/eren23/dpo-binarized-NeuralTrix-7B) [📄](https://gist.github.com/CultriX-Github/dbdde67ead233df0c7c56f1b091f728c) | 62.5 | 44.57 | 76.34 | 79.81 | 49.27 |
| [CultriX/NeuralTrix-7B-dpo](https://huggingface.co/CultriX/NeuralTrix-7B-dpo) [📄](https://gist.github.com/CultriX-Github/df0502599867d4043b45d9dafb5976e8) | 62.5 | 44.61 | 76.33 | 79.8 | 49.24 |
### EQ-bench
AlphaMonarch-7B also outperforms 70B and 120B parameter models on [EQ-bench](https://eqbench.com/) by [Samuel J. Paech](https://twitter.com/sam_paech), who kindly ran the evaluations.

### MT-Bench
```
########## First turn ##########
                          score
model              turn
gpt-4              1    8.95625
OmniBeagle-7B      1    8.31250
AlphaMonarch-7B    1    8.23750
claude-v1          1    8.15000
NeuralMonarch-7B   1    8.09375
gpt-3.5-turbo      1    8.07500
claude-instant-v1  1    7.80000

########## Second turn ##########
                           score
model              turn
gpt-4              2    9.025000
claude-instant-v1  2    8.012658
OmniBeagle-7B      2    7.837500
gpt-3.5-turbo      2    7.812500
claude-v1          2    7.650000
AlphaMonarch-7B    2    7.618750
NeuralMonarch-7B   2    7.375000

########## Average ##########
                        score
model
gpt-4               8.990625
OmniBeagle-7B       8.075000
gpt-3.5-turbo       7.943750
AlphaMonarch-7B     7.928125
claude-instant-v1   7.905660
claude-v1           7.900000
NeuralMonarch-7B    7.734375
NeuralBeagle14-7B   7.628125
```
### Open LLM Leaderboard
AlphaMonarch-7B is one of the best-performing non-merge 7B models on the Open LLM Leaderboard:

## 💻 Usage
```python
!pip install -qU transformers accelerate

from transformers import AutoTokenizer
import transformers
import torch

model = "mlabonne/AlphaMonarch-7B"
messages = [{"role": "user", "content": "What is a large language model?"}]

# Build a prompt with the model's chat template (Mistral Instruct format).
tokenizer = AutoTokenizer.from_pretrained(model)
prompt = tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)

# Load the model in float16 and spread it across the available devices.
pipeline = transformers.pipeline(
    "text-generation",
    model=model,
    torch_dtype=torch.float16,
    device_map="auto",
)

# Sample a completion.
outputs = pipeline(prompt, max_new_tokens=256, do_sample=True, temperature=0.7, top_k=50, top_p=0.95)
print(outputs[0]["generated_text"])
``` | {"language": ["en"], "license": "cc-by-nc-4.0", "tags": ["merge", "lazymergekit", "dpo", "rlhf"], "dataset": ["mlabonne/truthy-dpo-v0.1", "mlabonne/distilabel-intel-orca-dpo-pairs", "mlabonne/chatml-OpenHermes2.5-dpo-binarized-alpha"], "base_model": ["mlabonne/NeuralMonarch-7B"]} | text-generation | mlabonne/AlphaMonarch-7B | [
"transformers",
"safetensors",
"mistral",
"text-generation",
"merge",
"lazymergekit",
"dpo",
"rlhf",
"conversational",
"en",
"base_model:mlabonne/NeuralMonarch-7B",
"license:cc-by-nc-4.0",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | 2024-02-14T14:59:24+00:00 | [] | [
"en"
] | TAGS
#transformers #safetensors #mistral #text-generation #merge #lazymergekit #dpo #rlhf #conversational #en #base_model-mlabonne/NeuralMonarch-7B #license-cc-by-nc-4.0 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
| !image/jpeg
AlphaMonarch-7B
===============
tl;dr: AlphaMonarch-7B is a new DPO merge that retains all the reasoning abilities of the very best merges and significantly improves its conversational abilities. Kind of the best of both worlds in a 7B model.
AlphaMonarch-7B is a DPO fine-tuned of mlabonne/NeuralMonarch-7B using the argilla/OpenHermes2.5-dpo-binarized-alpha preference dataset.
It is based on a merge of the following models using LazyMergekit:
* mlabonne/OmniTruthyBeagle-7B-v0
* mlabonne/NeuBeagle-7B
* mlabonne/NeuralOmniBeagle-7B
Special thanks to Jon Durbin, Intel, Argilla, and Teknium for the preference datasets.
Try the demo: URL
Applications
------------
This model uses a context window of 8k. I recommend using it with the Mistral Instruct chat template (works perfectly with LM Studio).
It is one of the very best 7B models in terms of instructing following and reasoning abilities and can be used for conversations, RP, and storytelling. Note that it tends to have a quite formal and sophisticated style, but it can be changed by modifying the prompt.
Quantized models
----------------
* GGUF: URL
Evaluation
----------
### Nous
AlphaMonarch-7B is the best-performing 7B model on Nous' benchmark suite (evaluation performed using LLM AutoEval). See the entire leaderboard here.
### EQ-bench
AlphaMonarch-7B is also outperforming 70B and 120B parameter models on EQ-bench by Samuel J. Paech, who kindly ran the evaluations.
!image/png
### MT-Bench
### Open LLM Leaderboard
AlphaMonarch-7B is one of the best-performing non-merge 7B models on the Open LLM Leaderboard:
!image/png
Usage
-----
| [
"### Nous\n\n\nAlphaMonarch-7B is the best-performing 7B model on Nous' benchmark suite (evaluation performed using LLM AutoEval). See the entire leaderboard here.",
"### EQ-bench\n\n\nAlphaMonarch-7B is also outperforming 70B and 120B parameter models on EQ-bench by Samuel J. Paech, who kindly ran the evaluations.\n\n\n!image/png",
"### MT-Bench",
"### Open LLM Leaderboard\n\n\nAlphaMonarch-7B is one of the best-performing non-merge 7B models on the Open LLM Leaderboard:\n\n\n!image/png\n\n\nUsage\n-----"
] | [
"TAGS\n#transformers #safetensors #mistral #text-generation #merge #lazymergekit #dpo #rlhf #conversational #en #base_model-mlabonne/NeuralMonarch-7B #license-cc-by-nc-4.0 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n",
"### Nous\n\n\nAlphaMonarch-7B is the best-performing 7B model on Nous' benchmark suite (evaluation performed using LLM AutoEval). See the entire leaderboard here.",
"### EQ-bench\n\n\nAlphaMonarch-7B is also outperforming 70B and 120B parameter models on EQ-bench by Samuel J. Paech, who kindly ran the evaluations.\n\n\n!image/png",
"### MT-Bench",
"### Open LLM Leaderboard\n\n\nAlphaMonarch-7B is one of the best-performing non-merge 7B models on the Open LLM Leaderboard:\n\n\n!image/png\n\n\nUsage\n-----"
] | [
94,
42,
50,
6,
44
] | [
"passage: TAGS\n#transformers #safetensors #mistral #text-generation #merge #lazymergekit #dpo #rlhf #conversational #en #base_model-mlabonne/NeuralMonarch-7B #license-cc-by-nc-4.0 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n### Nous\n\n\nAlphaMonarch-7B is the best-performing 7B model on Nous' benchmark suite (evaluation performed using LLM AutoEval). See the entire leaderboard here.### EQ-bench\n\n\nAlphaMonarch-7B is also outperforming 70B and 120B parameter models on EQ-bench by Samuel J. Paech, who kindly ran the evaluations.\n\n\n!image/png### MT-Bench### Open LLM Leaderboard\n\n\nAlphaMonarch-7B is one of the best-performing non-merge 7B models on the Open LLM Leaderboard:\n\n\n!image/png\n\n\nUsage\n-----"
] | [
-0.1114223301410675,
0.1082129031419754,
-0.005081187933683395,
0.07829853892326355,
0.002796404529362917,
-0.0038671193178743124,
0.10511452704668045,
0.10792283713817596,
0.10606250911951065,
0.014470837078988552,
0.04959205538034439,
0.10308492183685303,
-0.009743045084178448,
0.1448356807231903,
-0.06508210301399231,
-0.11410586535930634,
0.02079635299742222,
0.02943636290729046,
0.006340586580336094,
0.06694881618022919,
0.06235138326883316,
-0.0681198462843895,
0.04002048447728157,
0.025612469762563705,
-0.0038240670692175627,
0.04003896564245224,
-0.017984909936785698,
-0.04673594981431961,
0.10367979854345322,
0.05823806673288345,
0.016174914315342903,
0.07910719513893127,
-0.015478297136723995,
-0.11419405788183212,
0.02948276326060295,
0.014828372746706009,
-0.007068213541060686,
0.07013270258903503,
0.05379396677017212,
-0.05813438445329666,
0.022483352571725845,
0.0032045203261077404,
-0.009886759333312511,
0.02338600903749466,
-0.05585512891411781,
-0.0995517373085022,
-0.0586407333612442,
0.0112191466614604,
0.06053619086742401,
0.07699763029813766,
-0.020292408764362335,
0.1955331265926361,
-0.06827723979949951,
0.050916872918605804,
0.19112752377986908,
-0.19966213405132294,
-0.008409755304455757,
0.009574922733008862,
-0.02343423292040825,
-0.0759122371673584,
-0.09501922130584717,
-0.020000532269477844,
0.0686478465795517,
-0.04613117501139641,
0.05541978031396866,
-0.017943236976861954,
0.1302441507577896,
-0.06091634929180145,
-0.13371138274669647,
0.0670325756072998,
0.15826310217380524,
0.09344260394573212,
-0.053428299725055695,
-0.11390513926744461,
-0.13148796558380127,
0.04841580241918564,
0.016721107065677643,
-0.009121950715780258,
0.044279035180807114,
-0.023991938680410385,
0.04300503432750702,
-0.04151204228401184,
-0.07110604643821716,
-0.07438595592975616,
-0.10816475749015808,
0.16953904926776886,
0.031876340508461,
0.03258032724261284,
0.0074191614985466,
0.10516931116580963,
-0.2412533313035965,
-0.11130642145872116,
-0.09375855326652527,
-0.09899654984474182,
-0.06227993592619896,
0.020147353410720825,
0.013176743872463703,
-0.08820293843746185,
0.07985147088766098,
0.17418916523456573,
-0.01688375324010849,
0.02923174761235714,
0.004838875494897366,
0.041350025683641434,
-0.003083971329033375,
0.12551988661289215,
-0.055830590426921844,
-0.01605069823563099,
0.11998912692070007,
0.08967489749193192,
0.09371918439865112,
0.027522319927811623,
-0.018561214208602905,
-0.044276442378759384,
0.028707655146718025,
-0.017393946647644043,
-0.01716141402721405,
0.030396251007914543,
-0.07210643589496613,
-0.02277500554919243,
0.1854695826768875,
-0.09228415042161942,
-0.02266068384051323,
-0.0034381214063614607,
-0.06723880022764206,
0.0016721688443794847,
0.10264728218317032,
-0.022185008972883224,
0.0025157583877444267,
0.015984278172254562,
-0.04080380126833916,
-0.08394762128591537,
-0.09947574138641357,
-0.056576069444417953,
0.009363001212477684,
-0.0011417445493862033,
0.03164086863398552,
-0.10910718142986298,
-0.17312981188297272,
0.020216694101691246,
-0.015471900813281536,
-0.022148357704281807,
-0.048532452434301376,
-0.04109445959329605,
-0.058080337941646576,
-0.032468799501657486,
-0.011729242280125618,
0.03197174891829491,
-0.037356920540332794,
0.06227369233965874,
0.11784912645816803,
0.05273428559303284,
-0.12414725869894028,
-0.015573525801301003,
-0.07968458533287048,
0.05423855781555176,
-0.18608543276786804,
0.07412025332450867,
-0.07376578450202942,
-0.026103675365447998,
-0.04241139814257622,
0.004913925658911467,
-0.11434455960988998,
0.0163353830575943,
0.07522595673799515,
0.11754671484231949,
-0.12034417688846588,
-0.02077936939895153,
0.13778749108314514,
-0.10076391696929932,
-0.15809328854084015,
0.1075536459684372,
-0.04146290197968483,
-0.011707855388522148,
0.054815854877233505,
0.169691801071167,
0.047109562903642654,
-0.12477549910545349,
-0.1166222095489502,
-0.02044093981385231,
0.020317478105425835,
-0.05284351482987404,
0.08789113909006119,
-0.005447064060717821,
0.031109049916267395,
0.01920352876186371,
-0.06953782588243484,
0.03670172020792961,
-0.02644553780555725,
-0.03467954695224762,
-0.039687629789114,
-0.07785985618829727,
0.02464456483721733,
-0.02636820264160633,
0.02097400464117527,
-0.08199773728847504,
-0.13645069301128387,
-0.008546044118702412,
0.13920427858829498,
-0.007390846963971853,
-0.01779736950993538,
-0.09958602488040924,
0.11813025176525116,
-0.12826569378376007,
0.010840702801942825,
-0.09208156913518906,
-0.05098412558436394,
0.010613881051540375,
-0.03784368932247162,
0.038952603936195374,
0.02505899965763092,
0.05155371502041817,
0.049992743879556656,
-0.03708087280392647,
-0.04581115022301674,
-0.004748801700770855,
-0.03362897038459778,
-0.029083888977766037,
-0.14742963016033173,
0.055330611765384674,
-0.04654902219772339,
0.13205982744693756,
-0.1582353711128235,
0.010569977574050426,
-0.030943211168050766,
0.1037677675485611,
0.052932094782590866,
0.005830295383930206,
0.06625888496637344,
-0.04911191016435623,
-0.028087886050343513,
-0.010646950453519821,
0.020926041528582573,
0.002568989060819149,
-0.1154329851269722,
0.03886518254876137,
-0.13631115853786469,
0.1414109766483307,
0.10880076140165329,
0.012961999513208866,
-0.018781568855047226,
0.02621810883283615,
-0.04320865869522095,
-0.027657849714159966,
-0.07292640954256058,
-0.04758773371577263,
0.11340660601854324,
0.032438378781080246,
0.11142272502183914,
-0.12037636339664459,
-0.06751931458711624,
-0.017294498160481453,
-0.01845470443367958,
0.0033980959560722113,
0.1205214411020279,
0.15798504650592804,
-0.08815459907054901,
0.04151197150349617,
0.13979066908359528,
-0.04511914402246475,
0.16316798329353333,
0.00568192545324564,
-0.05841412767767906,
-0.013784692622721195,
0.05230507627129555,
-0.017430709674954414,
0.16263456642627716,
-0.08613478392362595,
0.09237311035394669,
0.027428459376096725,
0.004230097867548466,
0.08371786028146744,
-0.14585661888122559,
-0.00600479356944561,
0.008794929832220078,
-0.030650721862912178,
-0.008055736310780048,
0.11550870537757874,
0.015909751877188683,
0.11075963824987411,
-0.02306569181382656,
-0.06333646923303604,
0.00014539471885655075,
-0.03615016117691994,
-0.0660347044467926,
0.17152582108974457,
-0.05703568458557129,
-0.087257981300354,
-0.02821647934615612,
-0.007977874018251896,
-0.10752503573894501,
0.010926018469035625,
0.03879893571138382,
-0.01617727056145668,
-0.05747915804386139,
-0.07306982576847076,
0.02975492551922798,
0.04514390975236893,
-0.016177548095583916,
0.06591804325580597,
-0.01388439629226923,
0.0733526423573494,
-0.10712075978517532,
-0.03709794580936432,
-0.05078727379441261,
-0.04241294041275978,
0.039266180247068405,
-0.008751505054533482,
0.1082644909620285,
0.07349266856908798,
0.008829070255160332,
0.0015643953811377287,
-0.021305223926901817,
0.19357198476791382,
-0.07224205881357193,
-0.0037212311290204525,
0.14319446682929993,
0.020596574991941452,
0.033951666206121445,
0.08225329220294952,
0.0023170115891844034,
-0.10555841773748398,
0.024683652445673943,
0.07226381450891495,
-0.05361083522439003,
-0.2026568353176117,
-0.02190997451543808,
-0.04289614036679268,
-0.0174216628074646,
-0.03474504500627518,
0.06918347626924515,
-0.031770769506692886,
0.04883534461259842,
0.00480067590251565,
-0.04168105870485306,
0.032125651836395264,
0.036286160349845886,
0.10973145812749863,
0.002168957842513919,
0.08939987421035767,
-0.04505443945527077,
-0.07876641303300858,
0.10131753236055374,
0.0802231952548027,
0.12230439484119415,
-0.06830627471208572,
0.0721062496304512,
0.0823792889714241,
0.016712041571736336,
0.02115132287144661,
0.07575759291648865,
-0.041221607476472855,
-0.012134287506341934,
-0.020924843847751617,
-0.06770854443311691,
-0.023994527757167816,
0.040993135422468185,
-0.05851364508271217,
0.07527605444192886,
-0.06648921221494675,
0.10209977626800537,
0.09184707701206207,
0.12380810081958771,
0.07252766191959381,
-0.17033670842647552,
-0.008309726603329182,
0.021496538072824478,
-0.01755450665950775,
-0.024505237117409706,
0.0601641871035099,
-0.000667692453134805,
-0.00739350076764822,
0.10266763716936111,
0.01707316003739834,
0.09221836924552917,
-0.11208024621009827,
0.004465798381716013,
-0.08631376922130585,
0.12182801216840744,
-0.01204246748238802,
0.06816700100898743,
-0.1624622493982315,
0.13712003827095032,
0.01285006944090128,
0.06782541424036026,
-0.025908008217811584,
-0.016217412427067757,
0.07435470074415207,
0.005965873133391142,
0.13310059905052185,
-0.03288036584854126,
0.08493176847696304,
-0.013274393044412136,
-0.16881337761878967,
0.035885024815797806,
-0.022969547659158707,
-0.03229139372706413,
0.0843045711517334,
0.013515647500753403,
-0.02443697489798069,
0.02703246660530567,
0.10486745089292526,
-0.1432994157075882,
-0.07211799174547195,
0.005794771946966648,
0.04743677377700806,
-0.08676598221063614,
-0.0900859460234642,
-0.0548371747136116,
-0.041503291577100754,
0.1099303662776947,
-0.09242712706327438,
-0.0847652480006218,
-0.08088614046573639,
0.022958582267165184,
0.06427847594022751,
-0.08891888707876205,
-0.04383016750216484,
-0.03884043172001839,
0.10167845338582993,
-0.019046438857913017,
-0.09619467705488205,
0.05667118728160858,
-0.06886431574821472,
-0.11265143007040024,
0.005601189099252224,
0.1442599892616272,
0.00993789080530405,
0.018293051049113274,
0.009629366919398308,
-0.021921994164586067,
-0.032635461539030075,
-0.10472215712070465,
0.03540761023759842,
0.13150563836097717,
-0.05373060330748558,
0.051974907517433167,
-0.0538654625415802,
-0.0024990129750221968,
-0.022187968716025352,
-0.014347824268043041,
0.11828987300395966,
0.347808837890625,
-0.030102862045168877,
0.036233846098184586,
0.09893781691789627,
-0.07653222233057022,
-0.14319324493408203,
-0.05262951925396919,
0.03330203890800476,
0.04205703362822533,
-0.03428598493337631,
-0.05020724982023239,
0.09188348054885864,
0.09289844334125519,
-0.02499234303832054,
0.07586737722158432,
-0.15884652733802795,
-0.1357896476984024,
0.12693312764167786,
0.06052219867706299,
0.21944184601306915,
-0.08282969146966934,
-0.02132815681397915,
-0.05452291667461395,
-0.06736983358860016,
0.11022678762674332,
-0.06674160063266754,
0.10670718550682068,
-0.042247895151376724,
0.03910724073648453,
0.02253600023686886,
-0.034272290766239166,
0.11062970012426376,
-0.05671973153948784,
0.06086289510130882,
-0.0038144723512232304,
0.0586385540664196,
-0.00373094086535275,
-0.06907577812671661,
0.1000450924038887,
-0.0006152402493171394,
0.058508507907390594,
-0.08140695840120316,
-0.07413703203201294,
-0.039134617894887924,
0.029806634411215782,
-0.01428536418825388,
-0.04795866832137108,
-0.03167400509119034,
0.04188775271177292,
0.02277054823935032,
0.014047826640307903,
0.06144360825419426,
-0.04522183910012245,
0.024149538949131966,
0.08843321353197098,
0.144229456782341,
-0.05793694034218788,
-0.04481622576713562,
0.0005891796317882836,
0.0032164091244339943,
0.04039962217211723,
-0.10353405773639679,
0.06966734677553177,
0.13965903222560883,
0.0228724367916584,
0.07241306453943253,
0.037232279777526855,
-0.13884922862052917,
0.026146676391363144,
0.08383403718471527,
-0.18184606730937958,
-0.10989424586296082,
-0.03420256823301315,
0.06581950187683105,
-0.00033589103259146214,
0.07066655904054642,
0.17490480840206146,
-0.06334172189235687,
-0.006148881744593382,
0.011964814737439156,
0.0015076476847752929,
-0.006623902823776007,
0.1413247138261795,
0.06511305272579193,
0.02793963812291622,
-0.05154929310083389,
0.07428054511547089,
-0.031119847670197487,
0.03803781792521477,
-0.01614336483180523,
0.053450807929039,
-0.033187441527843475,
-0.042858581990003586,
-0.04042380303144455,
0.23300835490226746,
-0.07781288772821426,
-0.05988198518753052,
-0.15270908176898956,
-0.04992866516113281,
0.05119305104017258,
0.15251590311527252,
0.08797652274370193,
-0.024630364030599594,
0.029920224100351334,
-0.008500110357999802,
-0.060623977333307266,
0.109956294298172,
0.010250731371343136,
0.04072798416018486,
-0.14207270741462708,
-0.15103767812252045,
-0.009791901335120201,
0.030711404979228973,
-0.04636017978191376,
0.004804122261703014,
-0.1116759404540062,
-0.012416747398674488,
-0.1592855602502823,
0.041486706584692,
-0.00847314391285181,
0.010585801675915718,
-0.012083522044122219,
-0.014475204050540924,
-0.07572019845247269,
0.03281313553452492,
-0.08428846299648285,
-0.012387127615511417,
0.03184903785586357,
0.030317211523652077,
-0.11227475851774216,
-0.03634665533900261,
-0.005249700974673033,
-0.049151740968227386,
0.08309012651443481,
-0.01483993511646986,
0.022196929901838303,
0.010242639109492302,
-0.07780420035123825,
-0.026263663545250893,
0.0493767075240612,
0.044621359556913376,
0.022676851600408554,
-0.11795973777770996,
0.04890540614724159,
0.026956506073474884,
-0.01916889287531376,
-0.04217098280787468,
0.05074435845017433,
-0.0926365926861763,
-0.05785699933767319,
-0.029853079468011856,
-0.016640208661556244,
-0.03826950117945671,
0.009333410300314426,
0.05638759955763817,
0.14761412143707275,
0.10446637123823166,
-0.04275596886873245,
0.034297503530979156,
-0.17594926059246063,
0.0023613162338733673,
0.03850288689136505,
-0.056890204548835754,
0.05325343832373619,
0.031088393181562424,
0.052931081503629684,
-0.06278178095817566,
0.23398765921592712,
-0.04994194582104683,
-0.009789843112230301,
0.009061842225492,
-0.061673521995544434,
-0.029532380402088165,
-0.02210819534957409,
0.11931923031806946,
-0.022359704598784447,
0.05774515122175217,
0.05489635095000267,
0.029017718508839607,
0.0928008034825325,
0.0373966209590435,
0.13770368695259094,
0.11748798936605453,
0.03660836070775986,
0.0664195865392685,
0.09291581809520721,
-0.01748463325202465,
-0.06444454938173294,
0.036041632294654846,
0.03514782339334488,
0.02629287727177143,
-0.022503938525915146,
0.07662450522184372,
0.10925602912902832,
-0.05208273231983185,
0.033919211477041245,
-0.005712602753192186,
-0.02505730837583542,
-0.14870132505893707,
-0.08275733143091202,
-0.0951179638504982,
-0.08289554715156555,
0.04739431291818619,
-0.13669325411319733,
-0.0734541267156601,
-0.003150675678625703,
0.05659203976392746,
-0.04143727198243141,
0.11467801779508591,
-0.07024715095758438,
-0.04742155224084854,
0.049823399633169174,
-0.012675299309194088,
-0.05280180647969246,
-0.006817979272454977,
-0.014045849442481995,
0.03685420751571655,
0.04504082724452019,
-0.016010230407118797,
-0.021817628294229507,
-0.05101165920495987,
0.024514539167284966,
-0.013863581232726574,
-0.03214939683675766,
-0.037369512021541595,
0.050347700715065,
0.07761910557746887,
0.11682768911123276,
-0.007089841645210981,
-0.0503237247467041,
-0.01772226206958294,
0.24168717861175537,
-0.02555210515856743,
-0.04584937542676926,
-0.10865088552236557,
0.13840436935424805,
-0.02825336530804634,
0.0016915126470848918,
0.026219919323921204,
-0.09798213839530945,
0.005582267418503761,
0.20356367528438568,
0.13493992388248444,
-0.11168718338012695,
0.00904099177569151,
-0.022186800837516785,
0.008805237710475922,
0.00926106609404087,
0.022909870371222496,
0.08316084742546082,
0.24163126945495605,
-0.028459355235099792,
-0.004090235568583012,
-0.056769683957099915,
0.032815203070640564,
0.0338592566549778,
0.1229192465543747,
0.061411142349243164,
0.030557522550225258,
-0.08259475976228714,
0.04339232295751572,
-0.01514551043510437,
-0.1523713767528534,
-0.09723573923110962,
-0.14033983647823334,
-0.15082789957523346,
-0.029607625678181648,
-0.015614659525454044,
-0.050935350358486176,
0.007558926474303007,
-0.05913424491882324,
-0.011809187941253185,
0.05708736926317215,
-0.01564764603972435,
-0.1182650625705719,
-0.008320781402289867,
-0.006305625196546316,
0.10456709563732147,
0.17599709331989288,
0.020243879407644272,
0.05745166540145874,
0.10852225124835968,
-0.028755897656083107,
-0.12247725576162338,
0.1383129507303238,
-0.01705499365925789,
-0.13376425206661224,
0.018480300903320312,
0.11602028459310532,
-0.029787084087729454,
0.07938216626644135,
0.06605881452560425,
-0.1053706482052803,
-0.017360065132379532,
0.005363171454519033,
-0.12600243091583252,
-0.10868716239929199,
0.12567049264907837,
-0.129108265042305,
0.1189006119966507,
0.11256156116724014,
-0.029177341610193253,
-0.007395091466605663,
-0.05245129391551018,
0.1370733231306076,
0.009474428370594978,
-0.05462038144469261,
0.0035216794349253178,
-0.16760851442813873,
0.04713796451687813,
0.07140057533979416,
-0.05640392005443573,
-0.2344399094581604,
-0.08318580687046051,
-0.09184558689594269,
-0.030130909755825996,
-0.04372701793909073,
0.07444357126951218,
0.0954255536198616,
0.06033320724964142,
-0.053810976445674896,
-0.07506133615970612,
-0.05142412334680557,
0.019898060709238052,
-0.10495232045650482,
-0.0820089802145958
] |
null | null | transformers |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# distilbert-base-uncased-finetuned-imdb
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 2.3734
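Since this is the cross-entropy loss of the masked-language-modeling objective, it corresponds to a perplexity of roughly exp(2.3734) ≈ 10.7:

```python
import math
print(math.exp(2.3734))  # ≈ 10.73
```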
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (a sketch of the corresponding `Trainer` setup follows the list):
- learning_rate: 2e-05
- train_batch_size: 64
- eval_batch_size: 64
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 3.0
- mixed_precision_training: Native AMP
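
A possible mapping of these settings onto a `TrainingArguments`/`Trainer` masked-language-modeling run is sketched below. The card does not name the dataset, masking probability, or sequence length, so the IMDB corpus, 15% masking, and 128-token truncation are assumptions (the model name and the 157 training steps per epoch at batch size 64 suggest roughly 10k training examples).

```python
from datasets import load_dataset
from transformers import (AutoModelForMaskedLM, AutoTokenizer,
                          DataCollatorForLanguageModeling, Trainer, TrainingArguments)

tokenizer = AutoTokenizer.from_pretrained("distilbert-base-uncased")
model = AutoModelForMaskedLM.from_pretrained("distilbert-base-uncased")

# Assumed corpus and subsample size: ~10k IMDB reviews gives 157 steps/epoch at batch size 64.
raw = load_dataset("imdb")
train_raw = raw["train"].shuffle(seed=42).select(range(10_000))
eval_raw = raw["test"].shuffle(seed=42).select(range(1_000))

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=128)

train_ds = train_raw.map(tokenize, batched=True, remove_columns=["text", "label"])
eval_ds = eval_raw.map(tokenize, batched=True, remove_columns=["text", "label"])

# 15% random masking is the usual MLM default; the card does not state the value used.
collator = DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm_probability=0.15)

args = TrainingArguments(
    output_dir="distilbert-base-uncased-finetuned-imdb",
    learning_rate=2e-5,
    per_device_train_batch_size=64,
    per_device_eval_batch_size=64,
    num_train_epochs=3.0,
    seed=42,
    lr_scheduler_type="linear",
    evaluation_strategy="epoch",
    fp16=True,  # "Native AMP" mixed precision
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=train_ds,
    eval_dataset=eval_ds,
    data_collator=collator,
)
trainer.train()
```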
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:----:|:---------------:|
| 2.5258 | 1.0 | 157 | 2.4221 |
| 2.4859 | 2.0 | 314 | 2.4164 |
| 2.4556 | 3.0 | 471 | 2.3682 |
### Framework versions
- Transformers 4.37.2
- Pytorch 2.2.0+cu118
- Datasets 2.16.1
- Tokenizers 0.15.1
| {"license": "apache-2.0", "tags": ["generated_from_trainer"], "base_model": "distilbert-base-uncased", "model-index": [{"name": "distilbert-base-uncased-finetuned-imdb", "results": []}]} | fill-mask | Fm505/distilbert-base-uncased-finetuned-imdb | [
"transformers",
"safetensors",
"distilbert",
"fill-mask",
"generated_from_trainer",
"base_model:distilbert-base-uncased",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | 2024-02-14T15:03:47+00:00 | [] | [] | TAGS
#transformers #safetensors #distilbert #fill-mask #generated_from_trainer #base_model-distilbert-base-uncased #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us
| distilbert-base-uncased-finetuned-imdb
======================================
This model is a fine-tuned version of distilbert-base-uncased on an unknown dataset.
It achieves the following results on the evaluation set:
* Loss: 2.3734
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 2e-05
* train\_batch\_size: 64
* eval\_batch\_size: 64
* seed: 42
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* num\_epochs: 3.0
* mixed\_precision\_training: Native AMP
### Training results
### Framework versions
* Transformers 4.37.2
* Pytorch 2.2.0+cu118
* Datasets 2.16.1
* Tokenizers 0.15.1
| [
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 64\n* eval\\_batch\\_size: 64\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 3.0\n* mixed\\_precision\\_training: Native AMP",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.37.2\n* Pytorch 2.2.0+cu118\n* Datasets 2.16.1\n* Tokenizers 0.15.1"
] | [
"TAGS\n#transformers #safetensors #distilbert #fill-mask #generated_from_trainer #base_model-distilbert-base-uncased #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 64\n* eval\\_batch\\_size: 64\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 3.0\n* mixed\\_precision\\_training: Native AMP",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.37.2\n* Pytorch 2.2.0+cu118\n* Datasets 2.16.1\n* Tokenizers 0.15.1"
] | [
68,
113,
4,
33
] | [
"passage: TAGS\n#transformers #safetensors #distilbert #fill-mask #generated_from_trainer #base_model-distilbert-base-uncased #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 64\n* eval\\_batch\\_size: 64\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 3.0\n* mixed\\_precision\\_training: Native AMP### Training results### Framework versions\n\n\n* Transformers 4.37.2\n* Pytorch 2.2.0+cu118\n* Datasets 2.16.1\n* Tokenizers 0.15.1"
] | [
-0.09967523068189621,
0.08503454923629761,
-0.003192664822563529,
0.1042095348238945,
0.11984850466251373,
0.017311735078692436,
0.14859037101268768,
0.10826661437749863,
-0.06483545154333115,
0.056283656507730484,
0.12611731886863708,
0.09691762179136276,
0.023243509232997894,
0.17960834503173828,
-0.057097453624010086,
-0.22146837413311005,
0.036584943532943726,
-0.001419536885805428,
-0.07020783424377441,
0.10772843658924103,
0.09083166718482971,
-0.1109369695186615,
0.07696598023176193,
0.008651407435536385,
-0.13149301707744598,
0.01461684051901102,
0.024115346372127533,
-0.0580960176885128,
0.11718732863664627,
0.02349243313074112,
0.13811655342578888,
0.020263316109776497,
0.0985620766878128,
-0.19735144078731537,
0.009720980189740658,
0.06806785613298416,
-0.0002822163514792919,
0.063956119120121,
0.021034734323620796,
0.005930785555392504,
0.060490306466817856,
-0.09415537863969803,
0.07270921766757965,
0.03209800645709038,
-0.13159404695034027,
-0.25669100880622864,
-0.09905841201543808,
0.03541920706629753,
0.09646488726139069,
0.08151543140411377,
-0.004540467634797096,
0.11879651993513107,
-0.054530125111341476,
0.07885075360536575,
0.2298669070005417,
-0.29949915409088135,
-0.06305332481861115,
0.00541033037006855,
0.03632689267396927,
0.03309828042984009,
-0.09128966182470322,
-0.02716994658112526,
0.049013007432222366,
0.026473527774214745,
0.14677181839942932,
-0.014959277585148811,
-0.058567386120557785,
-0.020670706406235695,
-0.1381426900625229,
-0.0314522311091423,
0.13010230660438538,
0.06053164601325989,
-0.0683927908539772,
-0.04597487300634384,
-0.07794889062643051,
-0.1345864087343216,
-0.05190245062112808,
-0.009776054881513119,
0.05231484770774841,
-0.01728174090385437,
-0.057266730815172195,
-0.01321149617433548,
-0.08343164622783661,
-0.08256509155035019,
-0.049716006964445114,
0.17965860664844513,
0.0458582304418087,
0.00878950860351324,
-0.0018596790032461286,
0.0819791629910469,
-0.07360030710697174,
-0.15271496772766113,
-0.006235258653759956,
0.01815415546298027,
-0.009145339950919151,
-0.03596702963113785,
-0.05345064774155617,
-0.03963969275355339,
0.031122952699661255,
0.21221376955509186,
-0.07917921990156174,
0.04622434824705124,
0.014512799680233002,
0.02964162640273571,
-0.11888270080089569,
0.14268918335437775,
-0.027979085221886635,
-0.021521545946598053,
0.02807530015707016,
0.08179371803998947,
0.07265914231538773,
-0.009929265826940536,
-0.09429562091827393,
0.022580944001674652,
0.07668451964855194,
0.02288239076733589,
-0.05687684938311577,
0.051763080060482025,
-0.07024890184402466,
0.012842557393014431,
0.041860371828079224,
-0.10091464221477509,
0.02895025722682476,
-0.0036004947032779455,
-0.03940330073237419,
-0.060837507247924805,
0.04952571168541908,
0.017482919618487358,
0.02090631052851677,
0.10349608212709427,
-0.08470393717288971,
-0.0019131225999444723,
-0.09878120571374893,
-0.12672725319862366,
0.016478165984153748,
-0.06397173553705215,
0.012826606631278992,
-0.10604258626699448,
-0.20578479766845703,
0.0046317861415445805,
0.06787482649087906,
-0.026778921484947205,
-0.024836396798491478,
-0.030634459108114243,
-0.08187749236822128,
0.03168095275759697,
-0.007523781154304743,
0.06904619187116623,
-0.07249223440885544,
0.08854452520608902,
0.058331914246082306,
0.07840926945209503,
-0.05424562841653824,
0.03497087210416794,
-0.10047704726457596,
0.05248202010989189,
-0.19847914576530457,
0.004155274014919996,
-0.07170974463224411,
0.05214977636933327,
-0.08380739390850067,
-0.08449029177427292,
-0.01293052826076746,
-0.00221520964987576,
0.09241589903831482,
0.10484172403812408,
-0.16332761943340302,
-0.06045025587081909,
0.1865071952342987,
-0.11245819181203842,
-0.12667742371559143,
0.129616841673851,
-0.044747475534677505,
0.02170569635927677,
0.03237752616405487,
0.13441286981105804,
0.061357270926237106,
-0.14084632694721222,
-0.006175795570015907,
-0.026313694193959236,
0.06415769457817078,
-0.004692322574555874,
0.08183011412620544,
-0.02167334593832493,
-0.012154249474406242,
0.012257316149771214,
-0.041994065046310425,
0.046282973140478134,
-0.0776098445057869,
-0.09216859936714172,
-0.052659641951322556,
-0.0940513163805008,
0.06854257732629776,
0.03026358224451542,
0.03494175523519516,
-0.11963536590337753,
-0.11821124702692032,
0.03839141130447388,
0.08742455393075943,
-0.06043180078268051,
0.02215624786913395,
-0.07909293472766876,
0.08777537941932678,
-0.04934941604733467,
-0.021386759355664253,
-0.1406714767217636,
-0.07479695975780487,
0.030377812683582306,
-0.0561806783080101,
0.007964497432112694,
-0.05926528945565224,
0.07837681472301483,
0.1010153740644455,
-0.06880653649568558,
-0.04874580726027489,
-0.0800684466958046,
0.01643388345837593,
-0.09033075720071793,
-0.2063126266002655,
-0.037803709506988525,
-0.032929256558418274,
0.13554109632968903,
-0.182423934340477,
0.04540323093533516,
-0.04292472079396248,
0.10256821662187576,
0.04100137948989868,
-0.01644233427941799,
-0.042377594858407974,
0.07296901941299438,
-0.03149400278925896,
-0.07128310948610306,
0.03997690603137016,
0.013901464641094208,
-0.0867927148938179,
-0.04633399844169617,
-0.14192208647727966,
0.16487006843090057,
0.11819584667682648,
-0.0367605946958065,
-0.09101660549640656,
0.019030561670660973,
-0.05357294902205467,
-0.03169980272650719,
-0.05864457041025162,
0.008316623978316784,
0.10585087537765503,
-0.007784191519021988,
0.13173624873161316,
-0.07876694947481155,
-0.019656183198094368,
0.03453236445784569,
-0.047650787979364395,
0.005686894990503788,
0.06834173202514648,
0.10661623626947403,
-0.07667290419340134,
0.1442878246307373,
0.20065493881702423,
-0.10481848567724228,
0.12692315876483917,
-0.0438602976500988,
-0.06996510922908783,
-0.03770028427243233,
0.0026757896412163973,
0.02419748157262802,
0.13573142886161804,
-0.09302158653736115,
0.024094583466649055,
0.012977932579815388,
0.01027713157236576,
-0.004079528618603945,
-0.20577472448349,
-0.01994767040014267,
0.042492400854825974,
-0.06210155785083771,
-0.03823903203010559,
-0.005713256541639566,
-0.004767207894474268,
0.08473093062639236,
0.0027225164230912924,
-0.07986012101173401,
0.03815761208534241,
0.002530647674575448,
-0.06409190595149994,
0.1952456682920456,
-0.1076221913099289,
-0.13080406188964844,
-0.13718734681606293,
-0.055851805955171585,
-0.04141690209507942,
0.026721255853772163,
0.06711340695619583,
-0.06661151349544525,
-0.06251239031553268,
-0.09712547063827515,
-0.004938831087201834,
0.04163968935608864,
0.041241828352212906,
0.028359122574329376,
-0.009887601248919964,
0.10853386670351028,
-0.09894461184740067,
-0.020438814535737038,
-0.01711234636604786,
-0.048333555459976196,
0.03883842006325722,
0.0507628433406353,
0.1184372529387474,
0.11111501604318619,
-0.020075082778930664,
-0.0074900235049426556,
-0.018772603943943977,
0.23423631489276886,
-0.05512867122888565,
-0.01132841408252716,
0.14785189926624298,
-0.008919297717511654,
0.06067141145467758,
0.15298762917518616,
0.057040244340896606,
-0.10096221417188644,
0.027417633682489395,
0.04349144548177719,
-0.021386688575148582,
-0.19688814878463745,
-0.03451526537537575,
-0.050537168979644775,
-0.02709716372191906,
0.09682431817054749,
0.02436479926109314,
0.02224048040807247,
0.053526077419519424,
0.0003786552988458425,
0.059109896421432495,
-0.015327472239732742,
0.08938194811344147,
0.1018197312951088,
0.05105630308389664,
0.12655004858970642,
-0.031949643045663834,
-0.04579800367355347,
0.033909332007169724,
-0.016832809895277023,
0.22112204134464264,
0.030297698453068733,
0.12962396442890167,
0.06379533559083939,
0.17680297791957855,
0.0008853139006532729,
0.07113789767026901,
0.015456270426511765,
-0.036977607756853104,
-0.009028789587318897,
-0.062255099415779114,
-0.017816642299294472,
0.04519200325012207,
-0.08172808587551117,
0.07777706533670425,
-0.11687725782394409,
0.029096776619553566,
0.04423383250832558,
0.28144359588623047,
0.061747558414936066,
-0.32701873779296875,
-0.10340669006109238,
0.018312716856598854,
-0.02903435379266739,
-0.03361104056239128,
0.011697950772941113,
0.0960972011089325,
-0.04935565963387489,
0.06723292171955109,
-0.0661555826663971,
0.07864651829004288,
0.03374362364411354,
0.034663762897253036,
0.061326440423727036,
0.12161369621753693,
-0.0026725553907454014,
0.04564709588885307,
-0.2653009593486786,
0.288644403219223,
0.015984080731868744,
0.09507652372121811,
-0.03829508647322655,
0.029112625867128372,
0.046474117785692215,
0.05148189887404442,
0.08890815824270248,
-0.020782629027962685,
-0.09472186118364334,
-0.17988142371177673,
-0.06469691544771194,
0.023006420582532883,
0.09733246266841888,
-0.021885359659790993,
0.11955635249614716,
-0.03893396630883217,
-0.010746000334620476,
0.08431750535964966,
0.02378249727189541,
-0.1437307894229889,
-0.08486264199018478,
-0.0025957226753234863,
0.047697629779577255,
-0.014922857284545898,
-0.1064763218164444,
-0.0973244458436966,
-0.08567486703395844,
0.1569274514913559,
-0.04134151339530945,
-0.02840820513665676,
-0.11302477866411209,
0.04963750019669533,
0.09066653996706009,
-0.07831751555204391,
0.07616525143384933,
-0.002439020434394479,
0.1127433180809021,
0.008368587121367455,
-0.051489513367414474,
0.1227102056145668,
-0.09342808276414871,
-0.1888788789510727,
-0.08143305033445358,
0.09893862158060074,
0.013978692702949047,
0.05280343443155289,
-0.010343060828745365,
0.03195343166589737,
-0.007015634328126907,
-0.07360956072807312,
0.03182031586766243,
0.009153240360319614,
0.05512065067887306,
0.04312713444232941,
-0.06986595690250397,
-0.016791783273220062,
-0.036289144307374954,
-0.025741536170244217,
0.11789269745349884,
0.31413984298706055,
-0.09244406968355179,
0.017904402688145638,
0.055398281663656235,
-0.04887446388602257,
-0.1943257600069046,
0.012720719911158085,
0.04497382417321205,
0.014443485997617245,
0.04272379353642464,
-0.15855135023593903,
0.08712989836931229,
0.10415035486221313,
-0.03490375354886055,
0.09966038912534714,
-0.268961638212204,
-0.1365315020084381,
0.12834830582141876,
0.13595394790172577,
0.11363466084003448,
-0.14838121831417084,
-0.03034619428217411,
-0.042151644825935364,
-0.10919105261564255,
0.08216217905282974,
-0.1118837371468544,
0.10867001116275787,
-0.01845364086329937,
0.04787342622876167,
-0.00242582312785089,
-0.05955643951892853,
0.1486298143863678,
-0.03716535121202469,
0.10907822847366333,
-0.05151628330349922,
0.03770780935883522,
0.05934647470712662,
-0.059727203100919724,
0.029102936387062073,
-0.10240922123193741,
0.04178749397397041,
-0.014137453399598598,
-0.025460075587034225,
-0.04967119172215462,
0.060331791639328,
-0.03797232359647751,
-0.059929098933935165,
-0.028894685208797455,
0.028522366657853127,
0.052482720464468,
-0.02183668315410614,
0.12777605652809143,
0.02621215023100376,
0.16870681941509247,
0.1086912676692009,
0.048560503870248795,
-0.0483379065990448,
-0.04479933902621269,
0.011639099568128586,
-0.0437619574368,
0.0737609788775444,
-0.1230960264801979,
0.04099518433213234,
0.10473033785820007,
0.018387001007795334,
0.1490199714899063,
0.07028783112764359,
-0.04086366295814514,
0.01275820191949606,
0.07590354233980179,
-0.1519855558872223,
-0.09371677786111832,
0.007464227732270956,
0.004685893189162016,
-0.13533100485801697,
0.033336933702230453,
0.11255349218845367,
-0.07888975739479065,
0.0064006163738667965,
-0.009344267658889294,
0.026777375489473343,
-0.04653444141149521,
0.180768683552742,
0.04154131934046745,
0.05335506796836853,
-0.07699883729219437,
0.08550214767456055,
0.03657439723610878,
-0.08509042859077454,
0.016866931691765785,
0.03944144770503044,
-0.07467532902956009,
-0.0317794531583786,
0.05974468216300011,
0.1672438532114029,
-0.021454889327287674,
-0.060796260833740234,
-0.14853501319885254,
-0.11649245023727417,
0.05862776190042496,
0.1834707260131836,
0.0764988586306572,
0.00306127336807549,
-0.0055377488024532795,
0.028070803731679916,
-0.09577096253633499,
0.10760915279388428,
0.050405655056238174,
0.08094511926174164,
-0.14098209142684937,
0.1058872863650322,
-0.0005975639214739203,
0.0006834070663899183,
-0.01642429456114769,
0.05635891854763031,
-0.1059284582734108,
-0.019106030464172363,
-0.13739435374736786,
-0.0021136091090738773,
-0.03986268490552902,
0.0003221177030354738,
0.009278170764446259,
-0.05544077232480049,
-0.06849990040063858,
0.031271953135728836,
-0.0936720222234726,
-0.04722683131694794,
0.03153860941529274,
0.052564457058906555,
-0.1273084431886673,
-0.0421600379049778,
0.04469463229179382,
-0.08590951561927795,
0.056949518620967865,
0.0358864888548851,
0.030333122238516808,
0.0413878932595253,
-0.16638317704200745,
0.005648497026413679,
0.05612587183713913,
0.010062577202916145,
0.037055306136608124,
-0.12678705155849457,
-0.018250320106744766,
-0.014451560564339161,
0.026668347418308258,
-0.0011334237642586231,
0.08419933170080185,
-0.13399475812911987,
-0.006244973745197058,
-0.00989325251430273,
-0.06245337426662445,
-0.041597120463848114,
-0.0012716937344521284,
0.10120595246553421,
-0.0009217667393386364,
0.20157423615455627,
-0.11434813588857651,
0.024475712329149246,
-0.2005232572555542,
0.011120716109871864,
-0.01750359497964382,
-0.08314089477062225,
-0.11630350351333618,
-0.03532400727272034,
0.049703918397426605,
-0.05445446819067001,
0.12727679312229156,
-0.024219989776611328,
0.04736778140068054,
0.038753997534513474,
-0.063714399933815,
0.03283483907580376,
0.024970727041363716,
0.21965384483337402,
0.035613689571619034,
-0.043617941439151764,
0.03443380072712898,
0.018053503707051277,
0.10564220696687698,
0.07428958266973495,
0.18308576941490173,
0.17474819719791412,
0.00416037579998374,
0.10553409159183502,
0.04400390014052391,
-0.07674209773540497,
-0.13569022715091705,
0.0524674691259861,
-0.019109902903437614,
0.10507699102163315,
-0.007198181934654713,
0.180153951048851,
0.10466351360082626,
-0.16535116732120514,
0.03528875485062599,
-0.033329155296087265,
-0.06935923546552658,
-0.12510475516319275,
-0.04380951076745987,
-0.09760600328445435,
-0.14882080256938934,
0.007967837154865265,
-0.10261284559965134,
0.028867188841104507,
0.07115381211042404,
-0.006444786209613085,
0.007500545121729374,
0.17958565056324005,
-0.003754005068913102,
0.03630824759602547,
0.03778619319200516,
0.004219302907586098,
-0.04061956703662872,
-0.04618069529533386,
-0.11109340935945511,
0.019207891076803207,
-0.02526320517063141,
0.023092327639460564,
-0.04290136322379112,
-0.04144930839538574,
0.0572982132434845,
0.005291911773383617,
-0.1082896813750267,
0.015064297243952751,
0.01625109650194645,
0.043042950332164764,
0.05923667550086975,
0.016688844189047813,
0.019914863631129265,
0.0020543131977319717,
0.19315952062606812,
-0.08354219049215317,
-0.08534331619739532,
-0.12285745143890381,
0.22567962110042572,
0.023786470293998718,
0.00009937932918546721,
0.015428371727466583,
-0.0796067863702774,
0.010301071219146252,
0.1753075271844864,
0.15783408284187317,
-0.04743095859885216,
0.0103244474157691,
-0.01278867106884718,
-0.020984340459108353,
-0.05985188111662865,
0.08090723305940628,
0.11503790318965912,
0.024657094851136208,
-0.06137092038989067,
-0.05298163741827011,
-0.04748216271400452,
-0.010743479244410992,
-0.07109513878822327,
0.03334967792034149,
0.0067295716144144535,
0.0023837918415665627,
-0.03364846110343933,
0.048520106822252274,
-0.01718151941895485,
-0.09973739087581635,
0.0752582773566246,
-0.17624928057193756,
-0.15246008336544037,
-0.01609824411571026,
0.09545065462589264,
0.011668434366583824,
0.04998252913355827,
-0.035814471542835236,
-0.00671601016074419,
0.10030090808868408,
-0.024900993332266808,
-0.04461164399981499,
-0.10668759047985077,
0.06852378696203232,
-0.07023344933986664,
0.25485098361968994,
-0.01876083016395569,
0.049814824014902115,
0.12831492722034454,
0.04945185407996178,
-0.08691862970590591,
0.10092911869287491,
0.05805274844169617,
-0.09053188562393188,
0.025879645720124245,
0.11372869461774826,
-0.05507819354534149,
0.11565994471311569,
0.045612309128046036,
-0.14255300164222717,
0.01095741055905819,
-0.03628357872366905,
-0.09764862060546875,
-0.04458678141236305,
-0.017931023612618446,
-0.06711477041244507,
0.13085171580314636,
0.1911061853170395,
-0.04291022568941116,
0.00006359977851388976,
-0.043848514556884766,
0.04881729185581207,
0.08262920379638672,
0.04596361145377159,
-0.04190820828080177,
-0.23430679738521576,
0.049885910004377365,
0.0565570667386055,
-0.0028555593453347683,
-0.2625422477722168,
-0.10406602174043655,
-0.0007653599604964256,
-0.046490419656038284,
-0.0944218561053276,
0.08322572708129883,
0.10982922464609146,
0.054687634110450745,
-0.05299846827983856,
-0.11294373869895935,
-0.07150762528181076,
0.15606407821178436,
-0.1358245611190796,
-0.0934496521949768
] |
null | null | ml-agents |
# **ppo** Agent playing **SnowballTarget**
This is a trained model of a **ppo** agent playing **SnowballTarget**
using the [Unity ML-Agents Library](https://github.com/Unity-Technologies/ml-agents).
## Usage (with ML-Agents)
The Documentation: https://unity-technologies.github.io/ml-agents/ML-Agents-Toolkit-Documentation/
We wrote a complete tutorial to help you train your first agent using ML-Agents and publish it to the Hub:
- A *short tutorial* where you teach Huggy the Dog 🐶 to fetch the stick and then play with him directly in your
browser: https://huggingface.co/learn/deep-rl-course/unitbonus1/introduction
- A *longer tutorial* to understand how ML-Agents works:
https://huggingface.co/learn/deep-rl-course/unit5/introduction
### Resume the training
```bash
mlagents-learn <your_configuration_file_path.yaml> --run-id=<run_id> --resume
```
### Watch your Agent play
You can watch your agent **playing directly in your browser**:

 1. If the environment is part of the ML-Agents official environments, go to https://huggingface.co/unity
 2. Find your model_id: juan9/ppo-SnowballTarget
 3. Select your *.nn / *.onnx file (a sketch for downloading it programmatically follows this list)
 4. Click on Watch the agent play 👀
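
If you prefer to fetch the trained policy programmatically instead of using the browser viewer, a minimal sketch with `huggingface_hub` is shown below; the exact filename of the exported `.onnx` policy inside the repository is not stated in this card, so the snippet discovers it by listing the repository files.

```python
# Sketch: locate and download the exported .onnx policy from the Hub.
# Requires `pip install huggingface_hub`; the filename is discovered at runtime.
from huggingface_hub import hf_hub_download, list_repo_files

repo_id = "juan9/ppo-SnowballTarget"

# Find the exported ONNX policy file(s) in the repository.
onnx_files = [f for f in list_repo_files(repo_id) if f.endswith(".onnx")]

# Download the first match into the local Hugging Face cache.
local_path = hf_hub_download(repo_id=repo_id, filename=onnx_files[0])
print(local_path)
```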
| {"library_name": "ml-agents", "tags": ["SnowballTarget", "deep-reinforcement-learning", "reinforcement-learning", "ML-Agents-SnowballTarget"]} | reinforcement-learning | juan9/ppo-SnowballTarget | [
"ml-agents",
"tensorboard",
"onnx",
"SnowballTarget",
"deep-reinforcement-learning",
"reinforcement-learning",
"ML-Agents-SnowballTarget",
"region:us"
] | 2024-02-14T15:04:54+00:00 | [] | [] | TAGS
#ml-agents #tensorboard #onnx #SnowballTarget #deep-reinforcement-learning #reinforcement-learning #ML-Agents-SnowballTarget #region-us
|
# ppo Agent playing SnowballTarget
This is a trained model of a ppo agent playing SnowballTarget
using the Unity ML-Agents Library.
## Usage (with ML-Agents)
The Documentation: URL
We wrote a complete tutorial to learn to train your first agent using ML-Agents and publish it to the Hub:
- A *short tutorial* where you teach Huggy the Dog to fetch the stick and then play with him directly in your
browser: URL
- A *longer tutorial* to understand how works ML-Agents:
URL
### Resume the training
### Watch your Agent play
You can watch your agent playing directly in your browser
1. If the environment is part of ML-Agents official environments, go to URL
2. Step 1: Find your model_id: juan9/ppo-SnowballTarget
3. Step 2: Select your *.nn /*.onnx file
4. Click on Watch the agent play
| [
"# ppo Agent playing SnowballTarget\n This is a trained model of a ppo agent playing SnowballTarget\n using the Unity ML-Agents Library.\n\n ## Usage (with ML-Agents)\n The Documentation: URL\n\n We wrote a complete tutorial to learn to train your first agent using ML-Agents and publish it to the Hub:\n - A *short tutorial* where you teach Huggy the Dog to fetch the stick and then play with him directly in your\n browser: URL\n - A *longer tutorial* to understand how works ML-Agents:\n URL\n\n ### Resume the training\n \n\n ### Watch your Agent play\n You can watch your agent playing directly in your browser\n\n 1. If the environment is part of ML-Agents official environments, go to URL\n 2. Step 1: Find your model_id: juan9/ppo-SnowballTarget\n 3. Step 2: Select your *.nn /*.onnx file\n 4. Click on Watch the agent play"
] | [
"TAGS\n#ml-agents #tensorboard #onnx #SnowballTarget #deep-reinforcement-learning #reinforcement-learning #ML-Agents-SnowballTarget #region-us \n",
"# ppo Agent playing SnowballTarget\n This is a trained model of a ppo agent playing SnowballTarget\n using the Unity ML-Agents Library.\n\n ## Usage (with ML-Agents)\n The Documentation: URL\n\n We wrote a complete tutorial to learn to train your first agent using ML-Agents and publish it to the Hub:\n - A *short tutorial* where you teach Huggy the Dog to fetch the stick and then play with him directly in your\n browser: URL\n - A *longer tutorial* to understand how works ML-Agents:\n URL\n\n ### Resume the training\n \n\n ### Watch your Agent play\n You can watch your agent playing directly in your browser\n\n 1. If the environment is part of ML-Agents official environments, go to URL\n 2. Step 1: Find your model_id: juan9/ppo-SnowballTarget\n 3. Step 2: Select your *.nn /*.onnx file\n 4. Click on Watch the agent play"
] | [
50,
205
] | [
"passage: TAGS\n#ml-agents #tensorboard #onnx #SnowballTarget #deep-reinforcement-learning #reinforcement-learning #ML-Agents-SnowballTarget #region-us \n# ppo Agent playing SnowballTarget\n This is a trained model of a ppo agent playing SnowballTarget\n using the Unity ML-Agents Library.\n\n ## Usage (with ML-Agents)\n The Documentation: URL\n\n We wrote a complete tutorial to learn to train your first agent using ML-Agents and publish it to the Hub:\n - A *short tutorial* where you teach Huggy the Dog to fetch the stick and then play with him directly in your\n browser: URL\n - A *longer tutorial* to understand how works ML-Agents:\n URL\n\n ### Resume the training\n \n\n ### Watch your Agent play\n You can watch your agent playing directly in your browser\n\n 1. If the environment is part of ML-Agents official environments, go to URL\n 2. Step 1: Find your model_id: juan9/ppo-SnowballTarget\n 3. Step 2: Select your *.nn /*.onnx file\n 4. Click on Watch the agent play"
] | [
-0.03678528591990471,
0.07222326844930649,
-0.0037923105992376804,
0.11336509138345718,
0.17021463811397552,
-0.02063927985727787,
0.1533733457326889,
0.10542189329862595,
0.1289433091878891,
0.07125405222177505,
0.08640003204345703,
0.09505504369735718,
0.06200467050075531,
0.15303146839141846,
0.07849393039941788,
-0.2169235199689865,
-0.05049910768866539,
-0.10281366854906082,
0.0027308736462146044,
0.07461635768413544,
0.041612181812524796,
-0.03444340080022812,
0.031046664342284203,
0.052286695688962936,
-0.008085994981229305,
0.003952100872993469,
-0.06648840010166168,
-0.0445169173181057,
0.06285472214221954,
-0.028576495125889778,
0.012993338517844677,
-0.05231165140867233,
0.09627629071474075,
-0.17276307940483093,
0.024518746882677078,
0.043109893798828125,
-0.015891224145889282,
-0.02347487583756447,
0.149744912981987,
0.02417454496026039,
0.10216426849365234,
-0.128680020570755,
0.09453927725553513,
0.07968901842832565,
-0.061686765402555466,
0.00218791957013309,
-0.07124917209148407,
0.059281185269355774,
0.209698885679245,
0.13874201476573944,
-0.003421763889491558,
0.07347049564123154,
-0.027493923902511597,
0.0561504028737545,
0.15655891597270966,
-0.2747141718864441,
-0.07381393760442734,
0.17132842540740967,
-0.04054580628871918,
0.03126297518610954,
-0.007706311531364918,
0.0454842671751976,
-0.01545011904090643,
0.021593596786260605,
-0.016888992860913277,
0.025845687836408615,
0.2502119541168213,
0.019334863871335983,
-0.08779127895832062,
-0.08348797261714935,
-0.007875845767557621,
0.03299051523208618,
-0.04800206795334816,
-0.18292278051376343,
0.012132318690419197,
0.12183620780706406,
0.008915740065276623,
0.02346062660217285,
0.05846555158495903,
0.013157124631106853,
-0.09414053708314896,
-0.14930124580860138,
-0.03620561212301254,
-0.05804254114627838,
0.11124300956726074,
0.09770563244819641,
-0.029861243441700935,
-0.008773989975452423,
0.03477253392338753,
0.07704078406095505,
0.11557171493768692,
-0.04855960234999657,
-0.02958293817937374,
-0.019038159400224686,
-0.14546717703342438,
-0.019460545852780342,
-0.029042018577456474,
-0.0123366704210639,
0.03268235921859741,
0.1479264497756958,
0.15771007537841797,
0.0277567096054554,
0.033162910491228104,
0.022684093564748764,
0.006918015424162149,
0.10756238549947739,
0.05253513157367706,
-0.0272687915712595,
0.009535024873912334,
0.012138314545154572,
0.06284784525632858,
-0.09673193842172623,
-0.1034572571516037,
0.05279281362891197,
-0.04393277317285538,
0.13237044215202332,
0.1602073758840561,
-0.03723180666565895,
-0.013661627657711506,
-0.04002530127763748,
0.030895987525582314,
-0.1413416713476181,
0.08173652738332748,
0.06059876084327698,
-0.051756732165813446,
-0.08810427784919739,
-0.06258371472358704,
0.055338066071271896,
-0.07916983962059021,
0.028031310066580772,
0.0038874002639204264,
0.06959395110607147,
0.008031215518712997,
-0.03598540276288986,
0.05711454525589943,
-0.11868918687105179,
-0.006793851498514414,
-0.16245625913143158,
-0.11066866666078568,
-0.08812214434146881,
0.03326401859521866,
-0.05167161300778389,
-0.11814678460359573,
-0.10484160482883453,
0.03456679359078407,
-0.07495512813329697,
0.03097042627632618,
-0.024556132033467293,
-0.06328355520963669,
-0.0357997789978981,
-0.1048269271850586,
0.05725220590829849,
0.179495707154274,
0.0014070892939344049,
-0.02994200959801674,
0.023135406896471977,
-0.15608105063438416,
0.15720611810684204,
-0.1414816677570343,
0.16156715154647827,
-0.08149664849042892,
0.04127171263098717,
0.1299816071987152,
-0.02410838007926941,
0.04196292161941528,
0.19050505757331848,
-0.11169634759426117,
-0.0788717046380043,
0.04458051919937134,
-0.07904721051454544,
-0.10839021950960159,
0.05492161214351654,
0.013141660951077938,
0.0588911734521389,
0.06096401438117027,
0.20242805778980255,
0.09578948467969894,
-0.2421831488609314,
0.047925349324941635,
0.004680690821260214,
-0.13543765246868134,
-0.0008638364379294217,
0.12444956600666046,
-0.07178066670894623,
0.011526880785822868,
-0.043667327612638474,
-0.12279532104730606,
0.1024790033698082,
-0.00658657168969512,
-0.06742635369300842,
0.03368714451789856,
-0.04850941151380539,
-0.05807879939675331,
-0.001685765921138227,
0.04259665682911873,
-0.044526003301143646,
-0.04826991260051727,
-0.026654556393623352,
0.027430666610598564,
0.0025187900755554438,
0.07490221410989761,
-0.033857788890600204,
0.1173534020781517,
-0.009721529670059681,
0.007157667074352503,
-0.09869924932718277,
-0.13711673021316528,
-0.008347340859472752,
0.019248101860284805,
0.07940208911895752,
-0.08858820796012878,
0.096357561647892,
0.07738939672708511,
0.04134513437747955,
-0.06809211522340775,
-0.06410358846187592,
0.017868513241410255,
-0.10058163106441498,
-0.11175078898668289,
-0.07460564374923706,
-0.06338635087013245,
0.1154012382030487,
-0.0915408730506897,
0.05905996263027191,
-0.060993727296590805,
0.0997304767370224,
-0.01658100076019764,
-0.07583002746105194,
0.04496866092085838,
-0.012067856267094612,
0.03606484830379486,
-0.10362182557582855,
0.10195504873991013,
0.06828802078962326,
-0.13111969828605652,
0.03291252627968788,
0.05063684657216072,
-0.09735235571861267,
0.11714588850736618,
0.03816114738583565,
-0.005283920094370842,
-0.043561603873968124,
-0.061008308082818985,
0.0019493112340569496,
-0.0731748566031456,
0.03394685313105583,
0.2103731632232666,
0.12761959433555603,
0.08230068534612656,
-0.03871060535311699,
-0.061612486839294434,
-0.02535969950258732,
-0.05863412842154503,
-0.06368830800056458,
0.13517796993255615,
0.023953044787049294,
-0.0354955792427063,
0.03730212524533272,
-0.009982781484723091,
0.08834417164325714,
0.12496665120124817,
-0.0020359517075121403,
-0.11908669769763947,
0.009974958375096321,
0.06246917322278023,
0.0673857256770134,
0.005549968220293522,
0.06144797056913376,
-0.028886476531624794,
-0.015495850704610348,
-0.06823434680700302,
-0.015258938074111938,
-0.10364770889282227,
-0.06163407117128372,
0.06567516177892685,
-0.010765106417238712,
0.01040069293230772,
-0.07949668914079666,
-0.0471331924200058,
0.024184782058000565,
0.10466551035642624,
-0.0074111083522439,
0.03558802977204323,
-0.0389675609767437,
-0.1302603781223297,
0.040865253657102585,
-0.08437835425138474,
-0.23276744782924652,
-0.11569521576166153,
-0.05414951592683792,
-0.06927618384361267,
0.02710905112326145,
0.06906618922948837,
-0.19204150140285492,
-0.0013083815574645996,
-0.09319116175174713,
-0.008634197525680065,
-0.008785986341536045,
-0.04490775614976883,
0.1351732313632965,
0.10777983069419861,
-0.02312530018389225,
-0.06204080581665039,
0.012061403132975101,
0.015027990564703941,
-0.05884868651628494,
-0.01936400681734085,
0.07192926108837128,
0.09955784678459167,
0.06949969381093979,
0.06210600957274437,
0.05623513460159302,
-0.02923334389925003,
0.15142692625522614,
-0.05388827621936798,
0.02926827035844326,
0.07124590128660202,
-0.009789654053747654,
0.06952673941850662,
0.009375440888106823,
0.02780444733798504,
0.004123697057366371,
0.009540856815874577,
0.005987789016216993,
-0.07632578909397125,
-0.21850000321865082,
-0.07575058192014694,
-0.002361224964261055,
0.17952390015125275,
0.16994160413742065,
0.09456983953714371,
-0.09449955821037292,
0.02149028703570366,
0.008434883318841457,
-0.09769205749034882,
0.1176406592130661,
0.132381871342659,
-0.06455889344215393,
-0.01533848699182272,
0.03038451448082924,
-0.03818416967988014,
0.052027441561222076,
0.05859651044011116,
-0.03505786508321762,
0.07453343272209167,
0.026688192039728165,
0.0027228950057178736,
-0.02969978004693985,
-0.05152487754821777,
-0.05872097611427307,
0.12434566766023636,
0.07624388486146927,
0.025100210681557655,
0.01270606741309166,
-0.06566527485847473,
-0.08093805611133575,
0.1371610164642334,
0.16402234137058258,
-0.07037327438592911,
-0.05045706406235695,
0.11555194854736328,
0.050716329365968704,
0.18168459832668304,
0.002894853474572301,
-0.10686491429805756,
-0.06622456014156342,
0.0014105496229603887,
-0.10538750886917114,
0.004702356643974781,
0.03554219752550125,
-0.0033812744077295065,
-0.16449491679668427,
0.055937498807907104,
-0.0024371230974793434,
0.11456715315580368,
0.02152232639491558,
-0.032357264310121536,
0.0615072026848793,
0.014659126289188862,
-0.024613579735159874,
0.04797667637467384,
-0.15786650776863098,
0.025136690586805344,
0.0014656118582934141,
0.10001428425312042,
-0.05985267087817192,
0.029518073424696922,
0.07941600680351257,
-0.0574256032705307,
0.17243747413158417,
0.04274635389447212,
-0.03813473507761955,
-0.12133734673261642,
-0.1706044226884842,
-0.058938104659318924,
-0.029402682557702065,
-0.10592596977949142,
0.06846746057271957,
0.03841102495789528,
-0.02070213109254837,
-0.10475035756826401,
0.03722177818417549,
-0.04873348027467728,
-0.12586189806461334,
-0.043671950697898865,
-0.08876708149909973,
0.048130664974451065,
-0.052556172013282776,
-0.07101640105247498,
-0.0851176381111145,
0.18229179084300995,
0.09651142358779907,
-0.09663373976945877,
-0.11657443642616272,
0.008645916357636452,
-0.057999588549137115,
-0.03225468844175339,
0.06852743029594421,
0.014781612902879715,
0.11404666304588318,
-0.11517460644245148,
-0.04945524036884308,
-0.028213031589984894,
-0.10716024786233902,
-0.0923725888133049,
0.025679729878902435,
0.1729038804769516,
0.036701615899801254,
0.08758635073900223,
-0.006734035909175873,
0.0986592248082161,
-0.010739893652498722,
-0.06298166513442993,
0.12304925173521042,
0.08775878697633743,
-0.03289146348834038,
0.045839402824640274,
0.02953748032450676,
0.0734049379825592,
-0.13147816061973572,
-0.01668115332722664,
0.2075512856245041,
0.26900675892829895,
-0.06693559139966965,
0.19984792172908783,
0.01049264706671238,
-0.0501200295984745,
-0.15032094717025757,
-0.061235588043928146,
0.01037096418440342,
-0.04957623407244682,
0.10692892968654633,
-0.1902269721031189,
0.08174271881580353,
0.0036138352006673813,
-0.012147106230258942,
0.03914410620927811,
-0.13714033365249634,
-0.08313854783773422,
0.019803743809461594,
0.0942874401807785,
-0.05630173534154892,
-0.09117324650287628,
-0.08049888908863068,
0.0132296122610569,
-0.09060195088386536,
0.023716045543551445,
-0.10530144721269608,
0.052647773176431656,
0.02045205608010292,
0.0314071960747242,
0.06075244024395943,
-0.0544709712266922,
0.1300293356180191,
-0.04002578929066658,
-0.06168534606695175,
-0.06299609690904617,
0.019551509991288185,
-0.01349432859569788,
-0.09928533434867859,
0.0416259765625,
-0.006825623102486134,
-0.022041508927941322,
-0.1905287504196167,
-0.05470775067806244,
0.023615317419171333,
0.03758645057678223,
-0.03774058818817139,
-0.07591576874256134,
-0.02381979115307331,
0.06147583946585655,
0.09096696972846985,
0.030136369168758392,
0.14461848139762878,
0.0007363948388956487,
0.0030780734959989786,
0.07586962729692459,
0.03691892325878143,
0.03920052573084831,
-0.11497174203395844,
-0.0648704543709755,
-0.0685470849275589,
0.0006068765069358051,
-0.05606772005558014,
-0.015408567152917385,
0.0522720068693161,
0.05625039339065552,
-0.01513749547302723,
0.054364755749702454,
-0.08592608571052551,
-0.0164538212120533,
0.024762993678450584,
-0.09185043722391129,
-0.10557019710540771,
-0.08490579575300217,
-0.11496303975582123,
0.026340549811720848,
-0.08836465328931808,
0.08921878039836884,
-0.0509624108672142,
-0.006805800832808018,
0.01405110489577055,
0.03373730555176735,
-0.005598338320851326,
0.03544207662343979,
0.020684033632278442,
0.030486498028039932,
-0.07275401055812836,
0.1309679001569748,
0.020868949592113495,
-0.039101626724004745,
0.04943614825606346,
0.18991905450820923,
-0.05762731656432152,
-0.06446555256843567,
-0.04908410459756851,
0.09273070842027664,
0.04084182158112526,
-0.023662980645895004,
-0.046289172023534775,
-0.04833582788705826,
0.11587385088205338,
-0.15497377514839172,
0.007331740111112595,
-0.11588425934314728,
0.006235855631530285,
0.054218895733356476,
-0.05473800748586655,
0.06892277300357819,
-0.021109774708747864,
-0.0653064027428627,
-0.14378534257411957,
0.07624826580286026,
0.02702195942401886,
0.09121240675449371,
-0.01111595332622528,
-0.023814311251044273,
-0.1424005627632141,
0.029228007420897484,
0.0010880433255806565,
0.008984988555312157,
-0.16590656340122223,
0.017138153314590454,
-0.008082916028797626,
0.034953564405441284,
0.029804740101099014,
0.06836552917957306,
-0.037193939089775085,
-0.09331444650888443,
-0.05689029395580292,
0.06748384982347488,
-0.07961423695087433,
-0.02085202746093273,
-0.03213251382112503,
-0.0796205997467041,
0.05253944918513298,
0.07615773379802704,
-0.019237574189901352,
-0.04989152401685715,
-0.0683194100856781,
0.018926184624433517,
-0.021459657698869705,
-0.04911506175994873,
0.05316515639424324,
-0.13395389914512634,
0.023309247568249702,
-0.06293486803770065,
-0.120618537068367,
0.03168425336480141,
0.11351443827152252,
-0.06335132569074631,
0.04921812564134598,
0.05643027648329735,
-0.09142373502254486,
-0.07074553519487381,
-0.0047636027447879314,
0.07686076313257217,
0.046886034309864044,
0.10137468576431274,
-0.08142755925655365,
0.2024770975112915,
-0.1059698760509491,
-0.03299078345298767,
0.012003968469798565,
0.06933613121509552,
0.015636419877409935,
-0.08463222533464432,
0.04137616604566574,
-0.008776084519922733,
0.0626300647854805,
0.07584960013628006,
0.019617591053247452,
0.05012509226799011,
0.03288162499666214,
0.13391518592834473,
0.01786103844642639,
0.08707015961408615,
-0.008839940652251244,
0.0181028600782156,
0.11869392544031143,
-0.002441630233079195,
0.07088755071163177,
-0.06076546385884285,
0.06575864553451538,
0.05557682737708092,
0.09781712293624878,
0.07789632678031921,
0.05769622325897217,
-0.09460945427417755,
-0.16216319799423218,
-0.03426442667841911,
0.042574476450681686,
0.02941535972058773,
-0.041559621691703796,
0.15125706791877747,
0.1344892531633377,
-0.20435725152492523,
0.014353699050843716,
0.0006476973067037761,
0.04302090033888817,
-0.07729655504226685,
-0.08729594945907593,
0.005774565972387791,
-0.1350155472755432,
0.09912341088056564,
-0.01774374209344387,
0.00421175779774785,
-0.043298423290252686,
0.002322270767763257,
0.022775257006287575,
0.03147086873650551,
-0.04337984696030617,
-0.0006613109726458788,
0.04855896160006523,
-0.02844976633787155,
0.010823814198374748,
-0.007278710603713989,
-0.07922347635030746,
-0.042201217263936996,
-0.06168964505195618,
-0.019026264548301697,
0.02408100664615631,
0.018413597717881203,
0.06242775544524193,
0.0125346090644598,
-0.062082674354314804,
0.0763353481888771,
-0.0015884785680100322,
0.017363982275128365,
0.21509724855422974,
0.0959654226899147,
-0.0455782487988472,
-0.0410735197365284,
0.2098669707775116,
-0.0306564848870039,
-0.049787070602178574,
-0.0864814817905426,
0.11914187669754028,
-0.04572062939405441,
-0.04191697761416435,
-0.041757941246032715,
-0.1613074541091919,
-0.06005474179983139,
0.16320571303367615,
0.11739373207092285,
-0.015041103586554527,
0.0012399073457345366,
-0.0617719441652298,
0.005673473700881004,
0.02685486152768135,
0.08310315757989883,
0.05686555430293083,
0.05420747771859169,
-0.09505932033061981,
-0.008239774033427238,
-0.06420274078845978,
-0.10259328037500381,
-0.20088931918144226,
0.05030672997236252,
0.027954384684562683,
-0.028045527637004852,
-0.015549386851489544,
0.11999042332172394,
-0.11290942132472992,
-0.09171360731124878,
0.12339875847101212,
-0.04971075430512428,
-0.07429421693086624,
-0.00192330963909626,
0.03803997114300728,
0.005224281456321478,
0.11772934347391129,
0.09088444709777832,
0.04197167977690697,
0.02746087871491909,
-0.013146796263754368,
-0.08754747360944748,
0.026731614023447037,
0.041473206132650375,
-0.13903358578681946,
0.24674300849437714,
-0.02713509276509285,
-0.0036741760559380054,
0.09757035970687866,
0.06520397961139679,
-0.18667054176330566,
0.0023049607407301664,
0.05190761014819145,
-0.17457516491413116,
0.018452877178788185,
0.0801280215382576,
-0.04329894855618477,
0.011815379373729229,
0.06234152242541313,
-0.04513998702168465,
0.007489461917430162,
0.18841306865215302,
0.04433669149875641,
-0.042088426649570465,
0.07901173084974289,
-0.14805100858211517,
0.09769488871097565,
0.09331143647432327,
-0.06163656339049339,
0.0029501395765691996,
-0.033945925533771515,
0.006811319384723902,
0.0032498135697096586,
0.0018159591127187014,
-0.021362580358982086,
-0.11325731873512268,
-0.02159229665994644,
-0.059258412569761276,
0.02575843594968319,
-0.20862367749214172,
-0.12552809715270996,
-0.053193509578704834,
-0.08601973950862885,
-0.044070806354284286,
0.08404342085123062,
0.07378803938627243,
-0.045846667140722275,
0.016247596591711044,
-0.1280696839094162,
0.027865367010235786,
0.14928878843784332,
-0.06950131058692932,
-0.0012356777442619205
] |
null | null | transformers |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# asbl_model
This model is a fine-tuned version of [microsoft/deberta-v3-base](https://huggingface.co/microsoft/deberta-v3-base) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 0.6721
- Accuracy: 0.5978
## Model description
More information needed
## Intended uses & limitations
More information needed
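
The intended uses are not documented here, but the model is tagged for text classification, so a minimal usage sketch could look like the following; the input sentence is a placeholder and the returned label names (for example `LABEL_0`/`LABEL_1`) depend on the checkpoint's configuration, since the classes are not described in this card:

```python
# Minimal text-classification sketch; the input text is a placeholder and the
# returned label names are whatever the checkpoint's config defines.
from transformers import pipeline

classifier = pipeline("text-classification", model="aravind-selvam/asbl_model")

print(classifier("Replace this with a sentence from the model's target domain."))
# e.g. [{'label': 'LABEL_0', 'score': 0.61}]
```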
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 12
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 4
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.6751 | 1.0 | 2052 | 0.6718 | 0.5846 |
| 0.6665 | 2.0 | 4104 | 0.6652 | 0.5996 |
| 0.6517 | 3.0 | 6156 | 0.6633 | 0.6046 |
| 0.6443 | 4.0 | 8208 | 0.6721 | 0.5978 |
### Framework versions
- Transformers 4.35.2
- Pytorch 2.1.0+cu121
- Datasets 2.17.0
- Tokenizers 0.15.1
| {"license": "mit", "tags": ["generated_from_trainer"], "metrics": ["accuracy"], "base_model": "microsoft/deberta-v3-base", "model-index": [{"name": "asbl_model", "results": []}]} | text-classification | aravind-selvam/asbl_model | [
"transformers",
"tensorboard",
"safetensors",
"deberta-v2",
"text-classification",
"generated_from_trainer",
"base_model:microsoft/deberta-v3-base",
"license:mit",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | 2024-02-14T15:04:54+00:00 | [] | [] | TAGS
#transformers #tensorboard #safetensors #deberta-v2 #text-classification #generated_from_trainer #base_model-microsoft/deberta-v3-base #license-mit #autotrain_compatible #endpoints_compatible #region-us
| asbl\_model
===========
This model is a fine-tuned version of microsoft/deberta-v3-base on the None dataset.
It achieves the following results on the evaluation set:
* Loss: 0.6721
* Accuracy: 0.5978
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 2e-05
* train\_batch\_size: 12
* eval\_batch\_size: 24
* seed: 42
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* num\_epochs: 4
### Training results
### Framework versions
* Transformers 4.35.2
* Pytorch 2.1.0+cu121
* Datasets 2.17.0
* Tokenizers 0.15.1
| [
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 12\n* eval\\_batch\\_size: 24\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 4",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.35.2\n* Pytorch 2.1.0+cu121\n* Datasets 2.17.0\n* Tokenizers 0.15.1"
] | [
"TAGS\n#transformers #tensorboard #safetensors #deberta-v2 #text-classification #generated_from_trainer #base_model-microsoft/deberta-v3-base #license-mit #autotrain_compatible #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 12\n* eval\\_batch\\_size: 24\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 4",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.35.2\n* Pytorch 2.1.0+cu121\n* Datasets 2.17.0\n* Tokenizers 0.15.1"
] | [
72,
98,
4,
33
] | [
"passage: TAGS\n#transformers #tensorboard #safetensors #deberta-v2 #text-classification #generated_from_trainer #base_model-microsoft/deberta-v3-base #license-mit #autotrain_compatible #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 12\n* eval\\_batch\\_size: 24\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 4### Training results### Framework versions\n\n\n* Transformers 4.35.2\n* Pytorch 2.1.0+cu121\n* Datasets 2.17.0\n* Tokenizers 0.15.1"
] | [
-0.0946415513753891,
0.0848933607339859,
-0.0025443134363740683,
0.10713937878608704,
0.152293398976326,
0.02249673195183277,
0.16325291991233826,
0.09864223003387451,
-0.06928317248821259,
0.05259036272764206,
0.13855046033859253,
0.13821996748447418,
0.02092110924422741,
0.13364407420158386,
-0.07379524409770966,
-0.23256658017635345,
0.017103906720876694,
0.03380677476525307,
-0.057962819933891296,
0.10446591675281525,
0.10434985160827637,
-0.12969282269477844,
0.09438607096672058,
-0.02142271399497986,
-0.17911343276500702,
0.01946225017309189,
0.026292484253644943,
-0.05238806828856468,
0.14176595211029053,
0.053665269166231155,
0.14761599898338318,
0.037482552230358124,
0.09192255884408951,
-0.19061702489852905,
0.008929217234253883,
0.05836627259850502,
-0.016236141324043274,
0.07179199159145355,
0.03810350224375725,
0.010731377638876438,
0.0781775638461113,
-0.10072635859251022,
0.07342623174190521,
0.012935477308928967,
-0.13864043354988098,
-0.22076983749866486,
-0.07870540022850037,
0.006108603440225124,
0.0919475182890892,
0.07834625244140625,
-0.01605663076043129,
0.141341894865036,
-0.04098863899707794,
0.09515465795993805,
0.19633464515209198,
-0.3012804687023163,
-0.0579088032245636,
0.06871140003204346,
0.05654552951455116,
0.09228397905826569,
-0.0935150682926178,
-0.005960893351584673,
0.07087534666061401,
0.025174204260110855,
0.1258767545223236,
-0.022757496684789658,
-0.00842432864010334,
-0.002058794954791665,
-0.14035767316818237,
-0.016039876267313957,
0.14197692275047302,
0.05096612125635147,
-0.04044770076870918,
-0.04974333569407463,
-0.06782857328653336,
-0.13615421950817108,
-0.031450942158699036,
-0.04499946907162666,
0.05927886813879013,
-0.032584983855485916,
-0.06732279807329178,
-0.01613292098045349,
-0.10919936746358871,
-0.07796905189752579,
-0.059243157505989075,
0.13377298414707184,
0.030314631760120392,
-0.004353587049990892,
-0.02832060493528843,
0.0984097346663475,
-0.028227193281054497,
-0.1408134549856186,
0.00991091225296259,
0.026690471917390823,
-0.0006261100061237812,
-0.05905788391828537,
-0.05970792844891548,
-0.09243693947792053,
0.019997544586658478,
0.14605727791786194,
-0.04188874363899231,
0.051974985748529434,
0.00453030364587903,
0.036679089069366455,
-0.09876639395952225,
0.18571840226650238,
-0.049569521099328995,
-0.052056580781936646,
0.023991500958800316,
0.07771600037813187,
0.0626668781042099,
-0.014383003115653992,
-0.14603795111179352,
0.006073995493352413,
0.12415073812007904,
0.02448653243482113,
-0.06347549706697464,
0.06964968889951706,
-0.06543236970901489,
-0.008074355311691761,
0.0368523932993412,
-0.08465156704187393,
0.02442731335759163,
-0.002643893240019679,
-0.056682463735342026,
-0.06277640163898468,
0.02084776945412159,
0.01664978265762329,
0.02238563634455204,
0.10581706464290619,
-0.08890948444604874,
-0.004677157383412123,
-0.09114570170640945,
-0.14512908458709717,
0.019249530509114265,
-0.09689310193061829,
0.036735471338033676,
-0.12437362223863602,
-0.1663505733013153,
-0.022665975615382195,
0.03847913071513176,
-0.03525141626596451,
-0.015068135224282742,
-0.06164686381816864,
-0.07285434007644653,
0.013914225623011589,
-0.02023925818502903,
0.05812026560306549,
-0.06162779778242111,
0.09345914423465729,
0.06054607778787613,
0.07580845803022385,
-0.07607943564653397,
0.03667836636304855,
-0.08561734855175018,
0.0330401211977005,
-0.18770825862884521,
0.03245905041694641,
-0.05925445631146431,
0.07729807496070862,
-0.07839301228523254,
-0.06881345808506012,
0.013675786554813385,
0.011577093042433262,
0.08037284016609192,
0.08876781165599823,
-0.19464071094989777,
-0.06052619218826294,
0.15952058136463165,
-0.09289079904556274,
-0.14202234148979187,
0.11905231326818466,
-0.05954454466700554,
0.058935269713401794,
0.08313752710819244,
0.18283797800540924,
0.06268316507339478,
-0.10896710306406021,
0.01869542896747589,
-0.014167643152177334,
0.047245509922504425,
-0.03165228292346001,
0.061578959226608276,
0.0016133125172927976,
0.029962699860334396,
0.009085509926080704,
-0.045378971844911575,
0.042948607355356216,
-0.07624977082014084,
-0.0747334212064743,
-0.02118328958749771,
-0.10081180930137634,
0.04611297324299812,
0.05130504071712494,
0.05959390476346016,
-0.12066050618886948,
-0.08184567838907242,
0.09777017682790756,
0.0667840838432312,
-0.06782925873994827,
0.010809291154146194,
-0.07943595945835114,
0.06555840373039246,
-0.06618912518024445,
-0.018860172480344772,
-0.1411016583442688,
-0.07398518919944763,
0.015050680376589298,
-0.0034275527577847242,
0.010561831295490265,
0.03488902375102043,
0.07247045636177063,
0.07742978632450104,
-0.06526894867420197,
-0.02294658124446869,
-0.02913091704249382,
0.02884386107325554,
-0.12207245826721191,
-0.19710250198841095,
-0.010300330817699432,
-0.03556746244430542,
0.12884050607681274,
-0.237945556640625,
0.05113065987825394,
0.023403175175189972,
0.09644179791212082,
0.05052384361624718,
-0.0049706739373505116,
-0.044551409780979156,
0.0693451538681984,
-0.04667140543460846,
-0.06889945268630981,
0.05023401603102684,
0.0011503342539072037,
-0.10706929862499237,
-0.0417572446167469,
-0.17638535797595978,
0.2082446664571762,
0.13684681057929993,
-0.07719754427671432,
-0.09004856646060944,
0.01670163683593273,
-0.024133730679750443,
-0.02662229910492897,
-0.03333127498626709,
-0.028598520904779434,
0.11309291422367096,
-0.017281439155340195,
0.14812952280044556,
-0.08284167945384979,
-0.03208911791443825,
0.026095129549503326,
-0.04164736717939377,
-0.0075643653981387615,
0.1039210706949234,
0.07405347377061844,
-0.13189755380153656,
0.1518326699733734,
0.17594322562217712,
-0.08092480897903442,
0.1405753791332245,
-0.025782639160752296,
-0.05263080820441246,
-0.03696644678711891,
-0.005940564442425966,
0.021784793585538864,
0.12531176209449768,
-0.12363376468420029,
-0.004050012212246656,
-0.005683471914380789,
0.014381980523467064,
0.010464407503604889,
-0.211037740111351,
-0.03254454955458641,
0.048196420073509216,
-0.048883870244026184,
0.0028004769701510668,
-0.019204771146178246,
-0.012046384625136852,
0.10118720680475235,
0.013023129664361477,
-0.08550813049077988,
0.045166827738285065,
-0.007872557267546654,
-0.08750087767839432,
0.21070295572280884,
-0.08335892111063004,
-0.16123180091381073,
-0.13284632563591003,
-0.05467211455106735,
-0.038183022290468216,
0.04991254210472107,
0.06448686122894287,
-0.0740697979927063,
-0.044082559645175934,
-0.09576672315597534,
0.03045727126300335,
0.01736827939748764,
0.02385462448000908,
0.00943220779299736,
0.00799164455384016,
0.08404630422592163,
-0.1117047518491745,
-0.007660156115889549,
-0.04447263851761818,
-0.04203332960605621,
0.035123810172080994,
0.015445340424776077,
0.12435319274663925,
0.13300861418247223,
-0.029284141957759857,
-0.01314378622919321,
-0.03867610543966293,
0.23869000375270844,
-0.07767091691493988,
-0.016542477533221245,
0.12630020081996918,
-0.03937695547938347,
0.03788987174630165,
0.1492353081703186,
0.05631760135293007,
-0.1217886209487915,
0.026492729783058167,
0.029002739116549492,
-0.03347235545516014,
-0.2094804346561432,
-0.02240731008350849,
-0.02826671488583088,
0.003408512333407998,
0.07660172134637833,
0.02426810748875141,
0.026311831548810005,
0.05661739036440849,
0.02425342984497547,
0.09330487251281738,
-0.001590269268490374,
0.0724836215376854,
0.1385299116373062,
0.030828051269054413,
0.13465772569179535,
-0.051842160522937775,
-0.06662584096193314,
0.03337319195270538,
0.0036034774966537952,
0.1871376931667328,
0.046701956540346146,
0.1400400698184967,
0.04919043928384781,
0.11847934126853943,
0.001696816529147327,
0.04587855935096741,
0.009089292958378792,
-0.04564478620886803,
-0.01664583571255207,
-0.044030919671058655,
-0.02432958036661148,
0.04392014443874359,
-0.07883058488368988,
0.03623346984386444,
-0.10896426439285278,
0.0011711408151313663,
0.051665596663951874,
0.1903403252363205,
0.06890860199928284,
-0.34940534830093384,
-0.09076602011919022,
0.03552888706326485,
-0.015347803942859173,
-0.024584617465734482,
0.015836093574762344,
0.12331633269786835,
-0.048150815069675446,
0.05067850276827812,
-0.059437189251184464,
0.07675480097532272,
-0.07068222761154175,
0.056099168956279755,
0.051480863243341446,
0.08252725005149841,
-0.014218234457075596,
0.06075570732355118,
-0.307682067155838,
0.2639124095439911,
0.021655255928635597,
0.06952857971191406,
-0.03983068838715553,
-0.006544685456901789,
0.018129907548427582,
0.09248774498701096,
0.06488869339227676,
-0.028465544804930687,
-0.05360109731554985,
-0.2074195295572281,
-0.039080798625946045,
0.022244403138756752,
0.10594111680984497,
-0.033134620636701584,
0.10396934300661087,
-0.02021760493516922,
0.007286840118467808,
0.0934370830655098,
-0.0056982142850756645,
-0.09211106598377228,
-0.08829225599765778,
-0.0058320434764027596,
0.029822198674082756,
-0.013562369160354137,
-0.07938259840011597,
-0.09660273045301437,
-0.12018641084432602,
0.136929452419281,
-0.07531770318746567,
-0.030091023072600365,
-0.09961095452308655,
0.07988474518060684,
0.04446392506361008,
-0.07860647141933441,
0.048657312989234924,
0.016664182767271996,
0.09108024090528488,
0.018628986552357674,
-0.049249567091464996,
0.14504799246788025,
-0.06225896626710892,
-0.17696994543075562,
-0.08582253754138947,
0.09403833746910095,
0.024557428434491158,
0.03856157138943672,
0.004194032400846481,
0.009656072594225407,
0.0009091017418541014,
-0.06686865538358688,
0.04337267577648163,
0.007985655218362808,
0.035259097814559937,
-0.002986819250509143,
-0.03944094479084015,
-0.012081308290362358,
-0.054692935198545456,
-0.0351492278277874,
0.14427916705608368,
0.2949938178062439,
-0.08937980234622955,
-0.011606189422309399,
0.0531839020550251,
-0.06626540422439575,
-0.2064560204744339,
0.064403235912323,
0.029909756034612656,
0.016758490353822708,
0.058826278895139694,
-0.13771119713783264,
0.07066113501787186,
0.1111132875084877,
-0.027231337502598763,
0.12592707574367523,
-0.2857850193977356,
-0.1497199684381485,
0.1239621564745903,
0.15727265179157257,
0.11155947297811508,
-0.1606512814760208,
-0.04256714507937431,
-0.06201157346367836,
-0.10491574555635452,
0.10258638858795166,
-0.11326312273740768,
0.10708490759134293,
-0.003422506619244814,
0.04841690510511398,
0.008185622282326221,
-0.05001449212431908,
0.14214542508125305,
-0.010629797354340553,
0.12460863590240479,
-0.07300008088350296,
-0.01905290223658085,
0.06288573145866394,
-0.05585610866546631,
0.016506463289260864,
-0.09040560573339462,
0.02846776321530342,
-0.04590163752436638,
-0.04141510650515556,
-0.03927725926041603,
0.036395274102687836,
-0.044498227536678314,
-0.06728427857160568,
-0.044181939214468,
0.02678963541984558,
0.020448671653866768,
-0.025974692776799202,
0.15310806035995483,
-0.003941092640161514,
0.13612708449363708,
0.13283175230026245,
0.08291906863451004,
-0.05070192739367485,
0.002001815242692828,
0.009553144685924053,
-0.03655696660280228,
0.06820684671401978,
-0.13788174092769623,
0.03626073896884918,
0.12169507890939713,
-0.006088967435061932,
0.14889603853225708,
0.06859448552131653,
-0.05024741217494011,
0.02443314902484417,
0.07136403024196625,
-0.14462992548942566,
-0.12552624940872192,
-0.0035097075160592794,
-0.054418355226516724,
-0.09853403270244598,
0.07685503363609314,
0.13595598936080933,
-0.06669130176305771,
0.006031804718077183,
-0.008643556386232376,
0.004015189595520496,
-0.0331910103559494,
0.18978038430213928,
0.07442810386419296,
0.04401219263672829,
-0.07245244830846786,
0.08361127972602844,
0.05433284863829613,
-0.08376353979110718,
0.01884625107049942,
0.03282297030091286,
-0.08153655380010605,
-0.04020366072654724,
0.029631545767188072,
0.20183074474334717,
-0.058381348848342896,
-0.053169697523117065,
-0.15876531600952148,
-0.12088532000780106,
0.04612424224615097,
0.1985984891653061,
0.09772269427776337,
0.024173036217689514,
-0.017967088147997856,
0.020803850144147873,
-0.10549203306436539,
0.11523060500621796,
0.025708239525556564,
0.09779652953147888,
-0.16746966540813446,
0.12345215678215027,
-0.004897424951195717,
-0.004545680247247219,
-0.02913890965282917,
0.056904446333646774,
-0.13367727398872375,
-0.007687981240451336,
-0.15737369656562805,
-0.009988897480070591,
-0.017251422628760338,
0.004702226258814335,
0.004920944571495056,
-0.058647193014621735,
-0.055429857224226,
0.0074996924959123135,
-0.09158658236265182,
-0.02281036786735058,
0.03877517208456993,
0.05162510648369789,
-0.11468332260847092,
-0.054914385080337524,
0.009191927500069141,
-0.06965984404087067,
0.07284469902515411,
-0.0004911230062134564,
0.03736719861626625,
0.04456526041030884,
-0.16137069463729858,
0.051794931292533875,
0.07100836932659149,
0.010112794116139412,
0.055639367550611496,
-0.08277398347854614,
-0.02228907123208046,
-0.007933316752314568,
0.04556666687130928,
0.01884475350379944,
0.09257520735263824,
-0.12312354147434235,
0.01140410453081131,
-0.02287367731332779,
-0.07387106865644455,
-0.04635853320360184,
0.02406507171690464,
0.08005671203136444,
0.003296057227998972,
0.19437099993228912,
-0.10120206326246262,
0.01825658418238163,
-0.19641302525997162,
0.002764555159956217,
0.003234172472730279,
-0.11104753613471985,
-0.10725443810224533,
-0.050556547939777374,
0.06073150783777237,
-0.0664554089307785,
0.14583666622638702,
0.01765078864991665,
0.04821716248989105,
0.04407629370689392,
-0.043754346668720245,
0.049024730920791626,
0.0352175235748291,
0.20016704499721527,
0.025454938411712646,
-0.03611613065004349,
0.0005632932297885418,
0.042751092463731766,
0.11754502356052399,
0.06838449090719223,
0.16912132501602173,
0.16373150050640106,
-0.06228192523121834,
0.11037055402994156,
0.0645129531621933,
-0.04629563167691231,
-0.1561141312122345,
0.04069092869758606,
-0.027942687273025513,
0.08718152344226837,
-0.015467912890017033,
0.16277500987052917,
0.11148417741060257,
-0.1522117406129837,
0.000776305387262255,
-0.062242500483989716,
-0.08354362100362778,
-0.11113930493593216,
-0.043524134904146194,
-0.1067151129245758,
-0.1395130604505539,
-0.0033286060206592083,
-0.1072651594877243,
-0.0016304507153108716,
0.09129934757947922,
-0.001977816689759493,
-0.02682694047689438,
0.17885755002498627,
0.02589714154601097,
0.028856078162789345,
0.05240612104535103,
0.017017697915434837,
-0.035015277564525604,
-0.07414382696151733,
-0.08484593778848648,
0.003472241573035717,
-0.004058193415403366,
0.0059359208680689335,
-0.05231605842709541,
-0.03738776221871376,
0.027990348637104034,
-0.0014312304556369781,
-0.10645153373479843,
0.010902571491897106,
0.04709836095571518,
0.0326949767768383,
0.046805113554000854,
0.012105600908398628,
0.0010726399486884475,
0.0018594536231830716,
0.19730940461158752,
-0.06715750694274902,
-0.06033850461244583,
-0.08367576450109482,
0.22704476118087769,
0.034023161977529526,
0.04070515185594559,
0.0013352942187339067,
-0.10666275024414062,
0.03301159664988518,
0.19781918823719025,
0.18986189365386963,
-0.08696390688419342,
0.0037271270994096994,
-0.012490407563745975,
-0.007343857549130917,
-0.0265743900090456,
0.0995214432477951,
0.09827907383441925,
0.0361919067800045,
-0.06303688883781433,
-0.04785420373082161,
-0.0343811996281147,
0.0033902458380907774,
-0.023093940690159798,
0.056120142340660095,
0.04218630865216255,
0.02396758273243904,
-0.051146000623703,
0.06941480189561844,
-0.020058074966073036,
-0.127733051776886,
0.07391971349716187,
-0.20233988761901855,
-0.14483772218227386,
-0.00995057076215744,
0.11423780769109726,
-0.02517031878232956,
0.06070356443524361,
-0.03477218374609947,
-0.0006032441160641611,
0.0637848898768425,
-0.024203822016716003,
-0.07544409483671188,
-0.06602184474468231,
0.05427873134613037,
-0.07423004508018494,
0.21165156364440918,
-0.05181441083550453,
0.060890115797519684,
0.13298273086547852,
0.057591892778873444,
-0.07461705058813095,
0.10404425859451294,
0.04722621291875839,
-0.05224394425749779,
0.040435582399368286,
0.07095970958471298,
-0.043963342905044556,
0.12551233172416687,
0.06711645424365997,
-0.15744024515151978,
0.01982480101287365,
-0.04092119261622429,
-0.10016602277755737,
-0.05844961851835251,
-0.029757972806692123,
-0.06472878158092499,
0.12894612550735474,
0.18237736821174622,
-0.036676760762929916,
0.012128072790801525,
-0.03502007946372032,
0.017137249931693077,
0.08107025921344757,
0.045570775866508484,
-0.03578391671180725,
-0.2383846789598465,
0.02933601662516594,
0.08225870132446289,
-0.007121483329683542,
-0.2800319194793701,
-0.08084575831890106,
-0.014494470320641994,
-0.043016932904720306,
-0.08320441842079163,
0.08770447969436646,
0.10165376961231232,
0.05459588021039963,
-0.06890539824962616,
-0.08974043279886246,
-0.0754510909318924,
0.16214539110660553,
-0.13703572750091553,
-0.11738985776901245
] |
null | null | transformers |
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
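Until the card is filled in, the sketch below shows one plausible way to load this repository. It assumes (from the repository name `mtc/meta-llama-Llama-2-7b-hf-pubmed-summarization-10k-last-lora-full-adapter`) that the weights are a LoRA/PEFT adapter trained on top of `meta-llama/Llama-2-7b-hf`; the base-model id, the PEFT loading pattern, and the summarization prompt are assumptions, not documented facts.

```python
# Hedged sketch, not the author's documented usage.
# Assumptions: the repo holds a PEFT/LoRA adapter for meta-llama/Llama-2-7b-hf,
# and the intended task is summarization of PubMed-style text.
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base_id = "meta-llama/Llama-2-7b-hf"  # assumed base model
adapter_id = "mtc/meta-llama-Llama-2-7b-hf-pubmed-summarization-10k-last-lora-full-adapter"

tokenizer = AutoTokenizer.from_pretrained(base_id)
base_model = AutoModelForCausalLM.from_pretrained(base_id, device_map="auto")
model = PeftModel.from_pretrained(base_model, adapter_id)  # attach the adapter weights

prompt = "Summarize the following article:\n<pubmed article text>\nSummary:"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=200)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```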
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
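The inputs listed above combine in a simple way; the snippet below is a rough, back-of-the-envelope sketch in the spirit of the calculator, with placeholder numbers rather than measurements for this model.

```python
# Rough carbon estimate in the spirit of Lacoste et al. (2019); all values are placeholders.
gpu_power_kw = 0.3      # assumed average accelerator draw, in kW
hours = 10.0            # assumed total training time
pue = 1.5               # assumed data-centre power usage effectiveness
grid_intensity = 0.4    # assumed kg CO2eq per kWh for the compute region

kg_co2eq = gpu_power_kw * hours * pue * grid_intensity
print(f"~{kg_co2eq:.2f} kg CO2eq")
```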
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed]
| {"library_name": "transformers", "tags": []} | null | mtc/meta-llama-Llama-2-7b-hf-pubmed-summarization-10k-last-lora-full-adapter | [
"transformers",
"safetensors",
"arxiv:1910.09700",
"endpoints_compatible",
"region:us"
] | 2024-02-14T15:06:59+00:00 | [
"1910.09700"
] | [] | TAGS
#transformers #safetensors #arxiv-1910.09700 #endpoints_compatible #region-us
|
# Model Card for Model ID
## Model Details
### Model Description
This is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.
- Developed by:
- Funded by [optional]:
- Shared by [optional]:
- Model type:
- Language(s) (NLP):
- License:
- Finetuned from model [optional]:
### Model Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Downstream Use [optional]
### Out-of-Scope Use
## Bias, Risks, and Limitations
### Recommendations
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
## Training Details
### Training Data
### Training Procedure
#### Preprocessing [optional]
#### Training Hyperparameters
- Training regime:
#### Speeds, Sizes, Times [optional]
## Evaluation
### Testing Data, Factors & Metrics
#### Testing Data
#### Factors
#### Metrics
### Results
#### Summary
## Model Examination [optional]
## Environmental Impact
Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).
- Hardware Type:
- Hours used:
- Cloud Provider:
- Compute Region:
- Carbon Emitted:
## Technical Specifications [optional]
### Model Architecture and Objective
### Compute Infrastructure
#### Hardware
#### Software
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Model Card Authors [optional]
## Model Card Contact
| [
"# Model Card for Model ID",
"## Model Details",
"### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:",
"### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Downstream Use [optional]",
"### Out-of-Scope Use",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.",
"## How to Get Started with the Model\n\nUse the code below to get started with the model.",
"## Training Details",
"### Training Data",
"### Training Procedure",
"#### Preprocessing [optional]",
"#### Training Hyperparameters\n\n- Training regime:",
"#### Speeds, Sizes, Times [optional]",
"## Evaluation",
"### Testing Data, Factors & Metrics",
"#### Testing Data",
"#### Factors",
"#### Metrics",
"### Results",
"#### Summary",
"## Model Examination [optional]",
"## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:",
"## Technical Specifications [optional]",
"### Model Architecture and Objective",
"### Compute Infrastructure",
"#### Hardware",
"#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Model Card Authors [optional]",
"## Model Card Contact"
] | [
"TAGS\n#transformers #safetensors #arxiv-1910.09700 #endpoints_compatible #region-us \n",
"# Model Card for Model ID",
"## Model Details",
"### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:",
"### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Downstream Use [optional]",
"### Out-of-Scope Use",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.",
"## How to Get Started with the Model\n\nUse the code below to get started with the model.",
"## Training Details",
"### Training Data",
"### Training Procedure",
"#### Preprocessing [optional]",
"#### Training Hyperparameters\n\n- Training regime:",
"#### Speeds, Sizes, Times [optional]",
"## Evaluation",
"### Testing Data, Factors & Metrics",
"#### Testing Data",
"#### Factors",
"#### Metrics",
"### Results",
"#### Summary",
"## Model Examination [optional]",
"## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:",
"## Technical Specifications [optional]",
"### Model Architecture and Objective",
"### Compute Infrastructure",
"#### Hardware",
"#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Model Card Authors [optional]",
"## Model Card Contact"
] | [
31,
6,
3,
82,
28,
3,
4,
9,
9,
10,
42,
20,
3,
4,
5,
9,
11,
13,
3,
12,
5,
4,
5,
3,
4,
9,
53,
9,
8,
6,
3,
14,
8,
7,
9,
4
] | [
"passage: TAGS\n#transformers #safetensors #arxiv-1910.09700 #endpoints_compatible #region-us \n# Model Card for Model ID## Model Details### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Downstream Use [optional]### Out-of-Scope Use## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.## How to Get Started with the Model\n\nUse the code below to get started with the model.## Training Details### Training Data### Training Procedure#### Preprocessing [optional]#### Training Hyperparameters\n\n- Training regime:#### Speeds, Sizes, Times [optional]## Evaluation### Testing Data, Factors & Metrics#### Testing Data#### Factors#### Metrics### Results#### Summary## Model Examination [optional]## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:## Technical Specifications [optional]### Model Architecture and Objective### Compute Infrastructure#### Hardware#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Model Card Authors [optional]## Model Card Contact"
] | [
-0.06646376848220825,
0.2168014943599701,
-0.00225935154594481,
0.023818302899599075,
0.1271018385887146,
-0.001635765191167593,
0.04218708351254463,
0.13324736058712006,
-0.020175931975245476,
0.11144465953111649,
0.046588581055402756,
0.09377603232860565,
0.09928803145885468,
0.18404334783554077,
0.04859916493296623,
-0.2059975117444992,
0.007056170143187046,
-0.09090408682823181,
0.014076028019189835,
0.1116579994559288,
0.13719257712364197,
-0.10291384905576706,
0.08272874355316162,
-0.04045208916068077,
-0.02019004337489605,
0.00012576708104461432,
-0.09259183704853058,
-0.07032395154237747,
0.06885425746440887,
0.06264153122901917,
0.051234472543001175,
0.001456156256608665,
0.09140396863222122,
-0.2864592671394348,
0.017265573143959045,
0.08406311273574829,
0.0027674848679453135,
0.06290827691555023,
0.07236549258232117,
-0.07389893382787704,
0.11328595131635666,
-0.08021481335163116,
0.13019037246704102,
0.08625296503305435,
-0.062064990401268005,
-0.23071379959583282,
-0.07525765895843506,
0.0963398814201355,
0.12251301854848862,
0.06215599179267883,
-0.022921854630112648,
0.15455181896686554,
-0.06248689442873001,
0.012971068732440472,
0.1294165402650833,
-0.11526761949062347,
-0.05572471022605896,
0.061741601675748825,
0.11775490641593933,
0.10740239918231964,
-0.14110268652439117,
-0.0017287094378843904,
0.04900608956813812,
0.029121357947587967,
0.08589313924312592,
0.022661056369543076,
0.12003941088914871,
0.04652795568108559,
-0.13695219159126282,
-0.04037507623434067,
0.12011898308992386,
0.038862764835357666,
-0.06446044892072678,
-0.2168138176202774,
-0.006778308190405369,
-0.0601806715130806,
-0.014732478186488152,
-0.07019448280334473,
0.039128515869379044,
-0.02470310963690281,
0.07317749410867691,
-0.04465159401297569,
-0.1063927412033081,
-0.0421026237308979,
0.0892222449183464,
0.07748593389987946,
0.011527054943144321,
-0.02519804798066616,
0.04627908393740654,
0.13455867767333984,
0.05402068421244621,
-0.10399353504180908,
-0.07017925381660461,
-0.06942764669656754,
-0.09420394152402878,
-0.04035796597599983,
0.056760527193546295,
0.031942449510097504,
0.02665667235851288,
0.22703726589679718,
0.016653569415211678,
0.04155244305729866,
0.0224777739495039,
0.01032855175435543,
0.043662428855895996,
0.0955500528216362,
-0.05303520709276199,
-0.15660029649734497,
-0.04072032496333122,
0.09077946096658707,
-0.0027527001220732927,
-0.036689214408397675,
-0.03966725245118141,
0.03849169611930847,
0.06843466311693192,
0.13122352957725525,
0.07552056759595871,
-0.017929591238498688,
-0.04813180863857269,
-0.030096933245658875,
0.23523783683776855,
-0.1493375599384308,
0.04426715523004532,
-0.02271856553852558,
-0.01804111897945404,
-0.03908449783921242,
0.03597262129187584,
0.022118929773569107,
-0.000004518366949923802,
0.09706240892410278,
-0.058981191366910934,
-0.05378659814596176,
-0.10168042778968811,
-0.03272576630115509,
0.04088849574327469,
-0.013975566253066063,
-0.010589460842311382,
-0.09025166928768158,
-0.09490354359149933,
-0.04766594246029854,
0.05537205561995506,
-0.05123869329690933,
-0.03770573064684868,
0.009465423412621021,
-0.08151785284280777,
-0.005444355774670839,
-0.005417742300778627,
0.10699385404586792,
-0.03222226724028587,
0.04445803165435791,
-0.027600755915045738,
0.05225523188710213,
0.09919606149196625,
0.031576547771692276,
-0.0773419588804245,
0.0561848059296608,
-0.22559374570846558,
0.07503069192171097,
-0.11481974273920059,
0.04335082694888115,
-0.1704932004213333,
-0.042439818382263184,
0.005444696638733149,
0.0139949731528759,
0.013206101022660732,
0.12720820307731628,
-0.19255615770816803,
-0.01654396951198578,
0.13260798156261444,
-0.09212633967399597,
-0.118110790848732,
0.07884611934423447,
-0.029701577499508858,
0.1624738723039627,
0.04682036489248276,
-0.027025915682315826,
0.09224298596382141,
-0.16434773802757263,
-0.07092688232660294,
-0.00949116237461567,
-0.01727987825870514,
0.12109188735485077,
0.07512219995260239,
-0.05991523340344429,
0.046571120619773865,
0.02832140028476715,
-0.038078423589468,
-0.04424772411584854,
-0.050857074558734894,
-0.10884185880422592,
-0.01070026308298111,
-0.08987759798765182,
0.04065500199794769,
-0.01250192429870367,
-0.07916021347045898,
-0.029885273426771164,
-0.18612512946128845,
-0.0030564051121473312,
0.10038342326879501,
0.0035033065360039473,
-0.005652366206049919,
-0.08666291832923889,
0.026358824223279953,
-0.03112892620265484,
-0.008404186926782131,
-0.16764774918556213,
-0.04399421438574791,
0.046902090311050415,
-0.16094985604286194,
0.020117372274398804,
-0.06413903087377548,
0.06334125250577927,
0.03641495108604431,
-0.05590536445379257,
-0.0248766727745533,
-0.01730942726135254,
0.011945613659918308,
-0.05083848536014557,
-0.18994836509227753,
-0.056277405470609665,
-0.037882111966609955,
0.149809330701828,
-0.25956398248672485,
0.032966937869787216,
0.051140617579221725,
0.14649195969104767,
0.00406361510977149,
-0.05115427449345589,
0.01429014839231968,
-0.05360214412212372,
-0.054652128368616104,
-0.06746816635131836,
-0.006135428790003061,
-0.027576493099331856,
-0.05147203803062439,
0.019243421033024788,
-0.1755700707435608,
-0.021410830318927765,
0.09424154460430145,
0.12876708805561066,
-0.1486445665359497,
-0.018640631809830666,
-0.048725154250860214,
-0.06339836865663528,
-0.0715010017156601,
-0.07038594037294388,
0.10712739825248718,
0.0513901449739933,
0.04796046018600464,
-0.07435787469148636,
-0.07092321664094925,
0.02726263552904129,
0.006906150374561548,
-0.03382374346256256,
0.08727246522903442,
0.05199531093239784,
-0.09209315478801727,
0.0756213590502739,
0.1092359870672226,
0.07177663594484329,
0.09363535046577454,
0.01574566215276718,
-0.11756632477045059,
-0.028492970392107964,
0.036266472190618515,
0.02740776725113392,
0.1465986967086792,
-0.05952361226081848,
0.04016614332795143,
0.04494241625070572,
-0.04170418903231621,
0.022319864481687546,
-0.08787637203931808,
0.024075502529740334,
0.025203049182891846,
-0.0034381982404738665,
0.06284574419260025,
-0.02525499276816845,
-0.0050758360885083675,
0.07016654312610626,
0.047779910266399384,
0.04621000960469246,
0.009655474685132504,
-0.01720241829752922,
-0.1047825813293457,
0.16950392723083496,
-0.0951867327094078,
-0.269941508769989,
-0.17632324993610382,
0.026197833940386772,
0.04035249724984169,
-0.022378476336598396,
0.031619444489479065,
-0.07056326419115067,
-0.10630585998296738,
-0.1060405746102333,
-0.002429972169920802,
0.01714223250746727,
-0.06364088505506516,
-0.0741225928068161,
0.07348573952913284,
0.04382912442088127,
-0.14902326464653015,
0.038552410900592804,
0.055694397538900375,
-0.057955220341682434,
-0.0233661737293005,
0.09118817001581192,
0.12397737801074982,
0.14583967626094818,
-0.021366750821471214,
-0.028626007959246635,
0.029004426673054695,
0.19620531797409058,
-0.13469526171684265,
0.10371150821447372,
0.13814030587673187,
-0.04545360431075096,
0.08360563963651657,
0.1560150384902954,
0.029186224564909935,
-0.08317049592733383,
0.05044832453131676,
0.04082648828625679,
-0.043159641325473785,
-0.2666129767894745,
-0.0534592866897583,
0.012832709588110447,
-0.06255637854337692,
0.09786593168973923,
0.10183793306350708,
0.11542957276105881,
0.034910861402750015,
-0.07166364789009094,
-0.043925940990448,
-0.0058974819257855415,
0.11737963557243347,
-0.05490213260054588,
-0.012639665976166725,
0.07686592638492584,
-0.05086168646812439,
0.005355054512619972,
0.10266812145709991,
0.02973790094256401,
0.17442677915096283,
0.020399179309606552,
0.11231429129838943,
0.06195578724145889,
0.08633565157651901,
0.0007386076031252742,
0.02951662428677082,
0.05147615820169449,
0.017203815281391144,
-0.002300140680745244,
-0.10421168059110641,
-0.006156572140753269,
0.1449710875749588,
0.028103826567530632,
0.029669636860489845,
-0.0018948549404740334,
-0.005003341939300299,
0.05121048167347908,
0.1746254414319992,
-0.011592294089496136,
-0.22072425484657288,
-0.0845772922039032,
0.06936841458082199,
-0.06218599155545235,
-0.12968985736370087,
-0.026130788028240204,
0.045467354357242584,
-0.17519839107990265,
0.026703642681241035,
-0.027433741837739944,
0.0919293761253357,
-0.09345759451389313,
-0.02221956104040146,
0.03687324374914169,
0.084866963326931,
-0.014529162086546421,
0.08703910559415817,
-0.14498743414878845,
0.11886418610811234,
0.02978132851421833,
0.09024628251791,
-0.11081171780824661,
0.07909037172794342,
-0.007550720125436783,
0.009180475026369095,
0.19379350543022156,
-0.011335089802742004,
-0.03514958545565605,
-0.08774717897176743,
-0.11210042238235474,
-0.013537433929741383,
0.12687496840953827,
-0.1243172138929367,
0.08773399889469147,
-0.015198243781924248,
-0.044079482555389404,
0.00937260314822197,
-0.12100647389888763,
-0.17273177206516266,
-0.19628387689590454,
0.05585884302854538,
-0.09575839340686798,
0.025643249973654747,
-0.11914430558681488,
-0.07089093327522278,
-0.02952558360993862,
0.241120383143425,
-0.1745356321334839,
-0.06510113179683685,
-0.1468164622783661,
-0.046294767409563065,
0.1662203073501587,
-0.04437198117375374,
0.0718095526099205,
-0.0208172257989645,
0.20345525443553925,
0.005988610442727804,
-0.004939318168908358,
0.06724198162555695,
-0.08892562240362167,
-0.16873881220817566,
-0.06771010160446167,
0.1510489284992218,
0.11680185794830322,
0.04907919466495514,
-0.002248800592496991,
0.0011772146681323647,
-0.016943959519267082,
-0.1137804463505745,
-0.0033210667315870523,
0.16037839651107788,
0.03878779336810112,
0.025986969470977783,
-0.05243593826889992,
-0.08797456324100494,
-0.06899320334196091,
-0.06853509694337845,
0.06221301481127739,
0.19590823352336884,
-0.10376439243555069,
0.1700313836336136,
0.147536963224411,
-0.07305635511875153,
-0.23175598680973053,
0.035342130810022354,
0.04983805492520332,
0.0014306638622656465,
0.04886869341135025,
-0.18252557516098022,
0.10521943867206573,
0.019543392583727837,
-0.05505957826972008,
0.13485197722911835,
-0.1557481735944748,
-0.1552847921848297,
0.0722852572798729,
0.03904085233807564,
-0.22423844039440155,
-0.1354004591703415,
-0.09622503817081451,
-0.05825018882751465,
-0.14065024256706238,
0.06054598465561867,
-0.002136280992999673,
0.015948504209518433,
0.03500790148973465,
-0.0015643214574083686,
0.027123261243104935,
-0.058935679495334625,
0.18609118461608887,
-0.004065449349582195,
0.020676052197813988,
-0.060264769941568375,
-0.0478842556476593,
0.09839435666799545,
-0.06130504235625267,
0.12208222597837448,
0.004057085141539574,
0.01594383642077446,
-0.10362856835126877,
-0.048314861953258514,
-0.04328322783112526,
0.05154227837920189,
-0.07548051327466965,
-0.10070807486772537,
-0.043625857681035995,
0.08841723203659058,
0.07005169242620468,
-0.03383097052574158,
0.00549331633374095,
-0.07189501076936722,
0.10019614547491074,
0.17795267701148987,
0.17573626339435577,
0.009926567785441875,
-0.07241068035364151,
0.01677953451871872,
-0.04142116755247116,
0.044231921434402466,
-0.2513144314289093,
0.03756171092391014,
0.06098250672221184,
0.029438555240631104,
0.09217222779989243,
-0.020435843616724014,
-0.1820858269929886,
-0.04050002992153168,
0.08094815909862518,
-0.05452597141265869,
-0.22617179155349731,
-0.019085140898823738,
0.0954197570681572,
-0.2020406424999237,
-0.007372708059847355,
0.03995226323604584,
-0.048725228756666183,
-0.023169852793216705,
0.00010950004070764408,
0.06317184865474701,
0.002471912419423461,
0.09773622453212738,
0.0735151618719101,
0.09715340286493301,
-0.08337292820215225,
0.10562895983457565,
0.10150538384914398,
-0.09572599828243256,
0.03605884686112404,
0.06754924356937408,
-0.05300498008728027,
-0.043293699622154236,
0.03665391728281975,
0.033023297786712646,
0.005234600510448217,
-0.060321882367134094,
0.013913018628954887,
-0.036497246474027634,
0.044923391193151474,
0.08326134830713272,
0.03754979372024536,
-0.013354414142668247,
0.06462216377258301,
0.03401726484298706,
-0.10898099094629288,
0.10366570204496384,
0.01731540448963642,
0.04105307161808014,
-0.08384523540735245,
-0.019968897104263306,
0.035425446927547455,
0.030576206743717194,
-0.01765924133360386,
-0.02306121215224266,
-0.02860277332365513,
-0.01614218018949032,
-0.14299540221691132,
-0.023106401786208153,
-0.07243485748767853,
0.006181265693157911,
0.014656842686235905,
-0.031884219497442245,
-0.011233693920075893,
0.02475680410861969,
-0.06979699432849884,
-0.07426341623067856,
-0.006949664559215307,
0.09833318740129471,
-0.15115703642368317,
0.008848577737808228,
0.06907843053340912,
-0.11088496446609497,
0.08190931379795074,
-0.008411259390413761,
0.016245156526565552,
0.022527478635311127,
-0.15448406338691711,
0.05601610988378525,
0.0008648968650959432,
0.01916889287531376,
0.025886621326208115,
-0.16471809148788452,
0.004104440100491047,
-0.04661374166607857,
-0.02149827405810356,
-0.00004464812809601426,
-0.02647159807384014,
-0.12325995415449142,
0.06858719140291214,
-0.015622655861079693,
-0.035931166261434555,
-0.02701525390148163,
0.0539589487016201,
0.07888586074113846,
-0.027474910020828247,
0.10445091128349304,
-0.008690856397151947,
0.04941811040043831,
-0.16801609098911285,
-0.02470702864229679,
-0.04982255399227142,
0.019377702847123146,
0.009884213097393513,
-0.007693959400057793,
0.04183054715394974,
-0.00976533442735672,
0.21883612871170044,
-0.05075952783226967,
0.1607085019350052,
0.05847611650824547,
-0.017352959141135216,
-0.0007513365126214921,
0.06180921941995621,
0.05997028574347496,
0.04658793285489082,
0.009480604901909828,
0.023740366101264954,
-0.022450892254710197,
-0.006695089396089315,
-0.15932634472846985,
0.01890849508345127,
0.14999441802501678,
0.06301083415746689,
0.024745315313339233,
0.05866100639104843,
-0.12775006890296936,
-0.12135478109121323,
0.09311001747846603,
-0.026755332946777344,
0.00928465835750103,
-0.08245618641376495,
0.1358020007610321,
0.14980104565620422,
-0.14000412821769714,
0.05256148427724838,
-0.06134212389588356,
-0.05217423290014267,
-0.10388828068971634,
-0.12032219022512436,
-0.05887215584516525,
-0.053666237741708755,
0.002330566756427288,
-0.03760887682437897,
0.054546963423490524,
0.03344334661960602,
-0.009351172484457493,
-0.00022941511997487396,
0.13597318530082703,
-0.019751882180571556,
-0.0028988157864660025,
0.048313532024621964,
0.03693558648228645,
0.02373051457107067,
-0.05275435373187065,
0.02940409444272518,
0.02539868652820587,
0.032232340425252914,
0.06546790152788162,
0.033412106335163116,
-0.047448933124542236,
0.03804153576493263,
-0.0025254099164158106,
-0.11207924783229828,
0.019641218706965446,
-0.00460948096588254,
-0.0742158442735672,
0.1268945336341858,
0.0407399944961071,
0.010224059224128723,
-0.03741471841931343,
0.24361543357372284,
-0.06653323769569397,
-0.06378097087144852,
-0.13251738250255585,
0.10491154342889786,
-0.0027236645109951496,
0.06476365029811859,
0.023412218317389488,
-0.1284150779247284,
0.005243356805294752,
0.13858191668987274,
0.12181595712900162,
0.0045748427510261536,
0.009228081442415714,
0.0518609918653965,
0.0025186820421367884,
-0.06998204439878464,
0.054019294679164886,
0.06992026418447495,
0.12919506430625916,
-0.07847554981708527,
0.07680778950452805,
0.0006860480643808842,
-0.08370215445756912,
-0.02947772853076458,
0.11312682181596756,
-0.0409729965031147,
0.03491825982928276,
-0.047444481402635574,
0.10916327685117722,
-0.05787910893559456,
-0.29412412643432617,
0.02350960113108158,
-0.09588567912578583,
-0.15202060341835022,
-0.018367812037467957,
0.05944539234042168,
-0.02624768204987049,
0.018029648810625076,
0.06971040368080139,
-0.06011629104614258,
0.20098382234573364,
0.0335683599114418,
-0.07864278554916382,
-0.0664360448718071,
0.04837050288915634,
-0.06564252078533173,
0.2949807047843933,
0.008418165147304535,
0.02863333560526371,
0.10770907253026962,
-0.03253700211644173,
-0.18271861970424652,
0.010723991319537163,
0.1133992001414299,
-0.08056149631738663,
0.08200647681951523,
0.19000613689422607,
-0.012578671798110008,
0.1209007054567337,
0.05294662341475487,
-0.047376248985528946,
0.04217283055186272,
-0.03389401361346245,
-0.051268599927425385,
-0.10752558708190918,
0.058453381061553955,
-0.05909625440835953,
0.15447644889354706,
0.10152646154165268,
-0.05671518296003342,
-0.004550917539745569,
-0.05555408447980881,
0.04875178262591362,
0.01804669201374054,
0.12263146042823792,
0.02951994352042675,
-0.1865430772304535,
0.032826557755470276,
-0.01144319772720337,
0.10186848044395447,
-0.25588861107826233,
-0.08421015739440918,
0.08833149075508118,
-0.011924264021217823,
-0.05105875805020332,
0.10560628771781921,
0.057650718837976456,
0.04243382066488266,
-0.043439045548439026,
-0.10480839014053345,
-0.02186836116015911,
0.14663739502429962,
-0.1469624787569046,
-0.025013303384184837
] |
null | null | transformers |
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
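As above, the code section is still a placeholder; the sketch below assumes (from the repository name and the text-generation tag) that this is a merged, standalone Llama-2-7b checkpoint that loads directly, with no separate adapter. The dtype, device settings, and prompt format are likewise assumptions.

```python
# Hedged sketch for the merged checkpoint; ids, dtype and prompt are assumptions.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "mtc/meta-llama-Llama-2-7b-hf-pubmed-summarization-10k-last_merged"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.float16, device_map="auto"
)

prompt = "Summarize the following article:\n<pubmed article text>\nSummary:"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=200, do_sample=False)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```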
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed]
| {"library_name": "transformers", "tags": []} | text-generation | mtc/meta-llama-Llama-2-7b-hf-pubmed-summarization-10k-last_merged | [
"transformers",
"safetensors",
"llama",
"text-generation",
"arxiv:1910.09700",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | 2024-02-14T15:07:01+00:00 | [
"1910.09700"
] | [] | TAGS
#transformers #safetensors #llama #text-generation #arxiv-1910.09700 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
# Model Card for Model ID
## Model Details
### Model Description
This is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.
- Developed by:
- Funded by [optional]:
- Shared by [optional]:
- Model type:
- Language(s) (NLP):
- License:
- Finetuned from model [optional]:
### Model Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Downstream Use [optional]
### Out-of-Scope Use
## Bias, Risks, and Limitations
### Recommendations
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
## Training Details
### Training Data
### Training Procedure
#### Preprocessing [optional]
#### Training Hyperparameters
- Training regime:
#### Speeds, Sizes, Times [optional]
## Evaluation
### Testing Data, Factors & Metrics
#### Testing Data
#### Factors
#### Metrics
### Results
#### Summary
## Model Examination [optional]
## Environmental Impact
Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).
- Hardware Type:
- Hours used:
- Cloud Provider:
- Compute Region:
- Carbon Emitted:
## Technical Specifications [optional]
### Model Architecture and Objective
### Compute Infrastructure
#### Hardware
#### Software
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Model Card Authors [optional]
## Model Card Contact
| [
"# Model Card for Model ID",
"## Model Details",
"### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:",
"### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Downstream Use [optional]",
"### Out-of-Scope Use",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.",
"## How to Get Started with the Model\n\nUse the code below to get started with the model.",
"## Training Details",
"### Training Data",
"### Training Procedure",
"#### Preprocessing [optional]",
"#### Training Hyperparameters\n\n- Training regime:",
"#### Speeds, Sizes, Times [optional]",
"## Evaluation",
"### Testing Data, Factors & Metrics",
"#### Testing Data",
"#### Factors",
"#### Metrics",
"### Results",
"#### Summary",
"## Model Examination [optional]",
"## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:",
"## Technical Specifications [optional]",
"### Model Architecture and Objective",
"### Compute Infrastructure",
"#### Hardware",
"#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Model Card Authors [optional]",
"## Model Card Contact"
] | [
"TAGS\n#transformers #safetensors #llama #text-generation #arxiv-1910.09700 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n",
"# Model Card for Model ID",
"## Model Details",
"### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:",
"### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Downstream Use [optional]",
"### Out-of-Scope Use",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.",
"## How to Get Started with the Model\n\nUse the code below to get started with the model.",
"## Training Details",
"### Training Data",
"### Training Procedure",
"#### Preprocessing [optional]",
"#### Training Hyperparameters\n\n- Training regime:",
"#### Speeds, Sizes, Times [optional]",
"## Evaluation",
"### Testing Data, Factors & Metrics",
"#### Testing Data",
"#### Factors",
"#### Metrics",
"### Results",
"#### Summary",
"## Model Examination [optional]",
"## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:",
"## Technical Specifications [optional]",
"### Model Architecture and Objective",
"### Compute Infrastructure",
"#### Hardware",
"#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Model Card Authors [optional]",
"## Model Card Contact"
] | [
56,
6,
3,
82,
28,
3,
4,
9,
9,
10,
42,
20,
3,
4,
5,
9,
11,
13,
3,
12,
5,
4,
5,
3,
4,
9,
53,
9,
8,
6,
3,
14,
8,
7,
9,
4
] | [
"passage: TAGS\n#transformers #safetensors #llama #text-generation #arxiv-1910.09700 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n# Model Card for Model ID## Model Details### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Downstream Use [optional]### Out-of-Scope Use## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.## How to Get Started with the Model\n\nUse the code below to get started with the model.## Training Details### Training Data### Training Procedure#### Preprocessing [optional]#### Training Hyperparameters\n\n- Training regime:#### Speeds, Sizes, Times [optional]## Evaluation### Testing Data, Factors & Metrics#### Testing Data#### Factors#### Metrics### Results#### Summary## Model Examination [optional]## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:## Technical Specifications [optional]### Model Architecture and Objective### Compute Infrastructure#### Hardware#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Model Card Authors [optional]## Model Card Contact"
] | [
-0.06061961501836777,
0.15481999516487122,
-0.004844071343541145,
0.02074851468205452,
0.0983177199959755,
0.007407687604427338,
0.07119518518447876,
0.11185134947299957,
-0.023851769044995308,
0.1167980208992958,
0.031993988901376724,
0.09781743586063385,
0.11217817664146423,
0.16186554729938507,
0.0015333457849919796,
-0.22897611558437347,
0.049678247421979904,
-0.125278040766716,
-0.0294334813952446,
0.11977242678403854,
0.1422213912010193,
-0.10954539477825165,
0.0752737894654274,
-0.038042325526475906,
-0.005828251596540213,
-0.0323176346719265,
-0.06205610930919647,
-0.05266609415411949,
0.05311284959316254,
0.06794639676809311,
0.07308239489793777,
0.01171939354389906,
0.09106900542974472,
-0.2724283039569855,
0.02348201349377632,
0.0805930644273758,
-0.0006441773730330169,
0.07586129754781723,
0.04993962123990059,
-0.08749990910291672,
0.07524524629116058,
-0.060156844556331635,
0.1498761922121048,
0.07955671846866608,
-0.09018243104219437,
-0.19217631220817566,
-0.07921334356069565,
0.09916994720697403,
0.1890910118818283,
0.05953684076666832,
-0.026427440345287323,
0.11642678081989288,
-0.08593545109033585,
0.013638701289892197,
0.06446459144353867,
-0.06054406240582466,
-0.055855002254247665,
0.06904532760381699,
0.08335285633802414,
0.08567540347576141,
-0.12976622581481934,
-0.010767064057290554,
0.015032444149255753,
0.008952446281909943,
0.08948688954114914,
0.017146794125437737,
0.1335189938545227,
0.040557652711868286,
-0.13501930236816406,
-0.043155476450920105,
0.09761431813240051,
0.03665134683251381,
-0.04888195917010307,
-0.2485782504081726,
-0.023432478308677673,
-0.04339504987001419,
-0.03198111802339554,
-0.03649339824914932,
0.043764639645814896,
-0.014506848528981209,
0.07738617807626724,
-0.004502781666815281,
-0.0837155357003212,
-0.04301247000694275,
0.07241875678300858,
0.06128999963402748,
0.02571401372551918,
-0.015821760520339012,
0.0059297760017216206,
0.12327717989683151,
0.11431120336055756,
-0.126715749502182,
-0.052547648549079895,
-0.06306339055299759,
-0.08449548482894897,
-0.044861067086458206,
0.030838407576084137,
0.037995077669620514,
0.045936476439237595,
0.23867325484752655,
0.007765117567032576,
0.053257301449775696,
0.04455438256263733,
0.014407169073820114,
0.06501194834709167,
0.11008983850479126,
-0.05894824117422104,
-0.09719445556402206,
-0.028582042083144188,
0.10156717151403427,
0.007986726239323616,
-0.04139331728219986,
-0.05712985619902611,
0.07059531658887863,
0.018587570637464523,
0.12360043078660965,
0.08000938594341278,
0.003056557849049568,
-0.0755772516131401,
-0.062465377151966095,
0.17764076590538025,
-0.15825673937797546,
0.04532013460993767,
0.03055616281926632,
-0.0341108962893486,
-0.009745313785970211,
0.012105142697691917,
0.025474950671195984,
-0.021481726318597794,
0.09522198140621185,
-0.05601342022418976,
-0.034448131918907166,
-0.11389608681201935,
-0.03694311901926994,
0.030394554138183594,
0.011153047904372215,
-0.02865210548043251,
-0.03502652049064636,
-0.08865131437778473,
-0.06405586749315262,
0.09101516753435135,
-0.07148737460374832,
-0.04784895107150078,
-0.016645915806293488,
-0.07833752781152725,
0.021804187446832657,
0.01691517047584057,
0.09064167737960815,
-0.0222476739436388,
0.03985358029603958,
-0.0550384595990181,
0.061440225690603256,
0.11723454296588898,
0.027987057343125343,
-0.05787884071469307,
0.061519939452409744,
-0.2424532175064087,
0.10252492874860764,
-0.07715212553739548,
0.04971238598227501,
-0.15203025937080383,
-0.02478341944515705,
0.03986154496669769,
0.01284773275256157,
-0.008251311257481575,
0.14196595549583435,
-0.21994100511074066,
-0.030957341194152832,
0.16964265704154968,
-0.10025953501462936,
-0.08109250664710999,
0.060782887041568756,
-0.05354252830147743,
0.11210215091705322,
0.04557164013385773,
-0.02375967986881733,
0.05775221437215805,
-0.14725260436534882,
-0.011030761525034904,
-0.041942402720451355,
-0.0180682260543108,
0.16207332909107208,
0.0703711211681366,
-0.06047816202044487,
0.07456906884908676,
0.01960151270031929,
-0.014246034435927868,
-0.04887177795171738,
-0.02822130173444748,
-0.1047162413597107,
0.01184528972953558,
-0.06102835759520531,
0.018109694123268127,
-0.021768750622868538,
-0.09445013850927353,
-0.029118487611413002,
-0.17402999103069305,
-0.0031633328180760145,
0.08821269869804382,
-0.011630427092313766,
-0.021509924903512,
-0.11245372891426086,
0.009332616813480854,
0.030967719852924347,
0.0002618339203763753,
-0.13677829504013062,
-0.06033218279480934,
0.026970699429512024,
-0.16097871959209442,
0.029791243374347687,
-0.05741601809859276,
0.04530094936490059,
0.04005871340632439,
-0.03433511033654213,
-0.03489551320672035,
0.010874404571950436,
0.010431389324367046,
-0.01894843392074108,
-0.25422003865242004,
-0.01882786676287651,
-0.0234990194439888,
0.1751047968864441,
-0.22956320643424988,
0.042598169296979904,
0.07489731162786484,
0.1460893303155899,
0.007349682506173849,
-0.03550100699067116,
0.015185600146651268,
-0.07262228429317474,
-0.03268764168024063,
-0.06316669285297394,
-0.01207790058106184,
-0.038400664925575256,
-0.05820201337337494,
0.04906858503818512,
-0.1686294972896576,
-0.030321966856718063,
0.10717973858118057,
0.06342670321464539,
-0.1473218947649002,
-0.02780107781291008,
-0.04056945815682411,
-0.04624456167221069,
-0.06676914542913437,
-0.05461418256163597,
0.11812574416399002,
0.056411582976579666,
0.04860803112387657,
-0.07140495628118515,
-0.07455260306596756,
0.008036690764129162,
-0.01956399530172348,
-0.014917809516191483,
0.09334591031074524,
0.07554110884666443,
-0.12264352291822433,
0.09177418053150177,
0.09668384492397308,
0.08576478064060211,
0.10314212739467621,
-0.014663571491837502,
-0.08914592862129211,
-0.040637146681547165,
0.02245822176337242,
0.016187267377972603,
0.15129362046718597,
-0.012961224652826786,
0.055492039769887924,
0.0358695350587368,
-0.014034898020327091,
0.011105312965810299,
-0.09736533463001251,
0.02655916102230549,
0.030835967510938644,
-0.016302183270454407,
0.03745110332965851,
-0.0447014644742012,
0.019208140671253204,
0.09039704501628876,
0.040895868092775345,
0.040978945791721344,
0.010155045427381992,
-0.04354988783597946,
-0.11037563532590866,
0.1787576973438263,
-0.12389461696147919,
-0.24818050861358643,
-0.13812170922756195,
0.010281167924404144,
0.04737642779946327,
-0.010411068797111511,
0.006690691225230694,
-0.06616118550300598,
-0.1175973042845726,
-0.09878289699554443,
0.018617089837789536,
0.045352302491664886,
-0.07590975612401962,
-0.06842505931854248,
0.06414616107940674,
0.03875524550676346,
-0.13939815759658813,
0.024007495492696762,
0.04662325978279114,
-0.08205481618642807,
-0.0029386086389422417,
0.0791812464594841,
0.06965780258178711,
0.17661017179489136,
0.013885351829230785,
-0.023669935762882233,
0.026634456589818,
0.20819635689258575,
-0.1436755359172821,
0.10975687950849533,
0.13545554876327515,
-0.08767466992139816,
0.08120133727788925,
0.1998777538537979,
0.03777998685836792,
-0.10680917650461197,
0.03608465939760208,
0.028374753892421722,
-0.028325283899903297,
-0.2502254545688629,
-0.06958996504545212,
0.0019060121849179268,
-0.05172049254179001,
0.07064855098724365,
0.08791537582874298,
0.09593888372182846,
0.016860228031873703,
-0.09976044297218323,
-0.07697858661413193,
0.046900223940610886,
0.10824491083621979,
-0.00015424020239152014,
-0.015208319760859013,
0.0904119610786438,
-0.03033481352031231,
0.01743943803012371,
0.09215071052312851,
0.0030607767403125763,
0.17535938322544098,
0.051709048449993134,
0.17189906537532806,
0.07866133749485016,
0.06444311141967773,
0.02004685252904892,
0.007725914940237999,
0.021817529574036598,
0.017227526754140854,
-0.0030957073904573917,
-0.08709781616926193,
-0.0034981227945536375,
0.1202581599354744,
0.049845851957798004,
0.029173865914344788,
0.012042860500514507,
-0.030704669654369354,
0.08337877690792084,
0.1770893782377243,
0.0029054484330117702,
-0.1893385946750641,
-0.07169844210147858,
0.07795937359333038,
-0.08648337423801422,
-0.10729733109474182,
-0.029470939189195633,
0.041069481521844864,
-0.1729043871164322,
0.016882894560694695,
-0.019335895776748657,
0.10788324475288391,
-0.13190391659736633,
-0.01772487722337246,
0.05657728388905525,
0.06932812184095383,
-0.009677323512732983,
0.06694949418306351,
-0.16090403497219086,
0.11770165711641312,
0.01751571334898472,
0.06636732816696167,
-0.09608277678489685,
0.09618937969207764,
-0.007830657996237278,
0.0041499207727611065,
0.1410749852657318,
0.010120149701833725,
-0.05952107161283493,
-0.09608154743909836,
-0.10546442121267319,
-0.009841260500252247,
0.1306990385055542,
-0.14852415025234222,
0.08813067525625229,
-0.02661319263279438,
-0.044553373008966446,
0.003614129964262247,
-0.12497276812791824,
-0.13103094696998596,
-0.18366187810897827,
0.05707118660211563,
-0.12947207689285278,
0.04045100137591362,
-0.10902881622314453,
-0.045833900570869446,
-0.02098964899778366,
0.20040063560009003,
-0.23137451708316803,
-0.06714103370904922,
-0.1551055610179901,
-0.08061286807060242,
0.14446212351322174,
-0.046455029398202896,
0.08550118654966354,
0.0008278203313238919,
0.19068008661270142,
0.021319707855582237,
-0.017237508669495583,
0.1072206199169159,
-0.10052918642759323,
-0.2010865956544876,
-0.09273224323987961,
0.15895552933216095,
0.13766798377037048,
0.03809428587555885,
-0.004381525795906782,
0.03171157464385033,
-0.02098114788532257,
-0.12076930701732635,
0.020226983353495598,
0.17317426204681396,
0.08982043713331223,
0.025265544652938843,
-0.02972041629254818,
-0.11267432570457458,
-0.07061342149972916,
-0.03774050623178482,
0.024755435064435005,
0.18072067201137543,
-0.07222156971693039,
0.18405316770076752,
0.13775517046451569,
-0.05534014105796814,
-0.19904261827468872,
0.021996473893523216,
0.04293542355298996,
0.0070380112156271935,
0.0323902890086174,
-0.20307663083076477,
0.09384101629257202,
0.0008334947633557022,
-0.05131231248378754,
0.1379684954881668,
-0.1823476254940033,
-0.151598259806633,
0.06042521819472313,
0.043563615530729294,
-0.19374065101146698,
-0.12374074012041092,
-0.08848230540752411,
-0.04693066328763962,
-0.15487661957740784,
0.10312657803297043,
0.0020827590487897396,
0.008401188999414444,
0.03778626397252083,
0.02252252586185932,
0.012139533646404743,
-0.04198719933629036,
0.1914343535900116,
-0.025891713798046112,
0.03347287327051163,
-0.0790715217590332,
-0.060851071029901505,
0.062408581376075745,
-0.058187782764434814,
0.0755455270409584,
-0.025226406753063202,
0.015947066247463226,
-0.10598332434892654,
-0.048235729336738586,
-0.02852320298552513,
0.019321219995617867,
-0.09431382268667221,
-0.09348297864198685,
-0.04829427972435951,
0.09367614984512329,
0.09042316675186157,
-0.03652578964829445,
-0.03649144619703293,
-0.078715980052948,
0.038977332413196564,
0.17627815902233124,
0.18159319460391998,
0.04659178853034973,
-0.07959239184856415,
-0.001915142871439457,
-0.014336181804537773,
0.04684065282344818,
-0.22077152132987976,
0.060553863644599915,
0.04557652771472931,
0.016117896884679794,
0.11537692695856094,
-0.0208132341504097,
-0.16198977828025818,
-0.06710557639598846,
0.061360616236925125,
-0.06944561004638672,
-0.17825035750865936,
0.0039279889315366745,
0.07344977557659149,
-0.16578389704227448,
-0.037031736224889755,
0.04200848564505577,
-0.01189455483108759,
-0.0403641052544117,
0.012352054007351398,
0.08063354343175888,
0.007078902795910835,
0.07699975371360779,
0.055281639099121094,
0.09124495089054108,
-0.10227900743484497,
0.07410510629415512,
0.08149529248476028,
-0.08644098788499832,
0.030720343813300133,
0.09573426842689514,
-0.06469762325286865,
-0.0346054881811142,
0.04237886518239975,
0.08354541659355164,
0.024281201884150505,
-0.04682289808988571,
0.0023111123591661453,
-0.09734189510345459,
0.05927345156669617,
0.11483542621135712,
0.03496333956718445,
0.011234734207391739,
0.03813567012548447,
0.04486291855573654,
-0.08093374222517014,
0.11926916986703873,
0.023795632645487785,
0.020354853942990303,
-0.04112942889332771,
-0.040553025901317596,
0.035851649940013885,
-0.026020776480436325,
-0.011440055444836617,
-0.035174157470464706,
-0.0722682997584343,
-0.014069457538425922,
-0.16000694036483765,
-0.0076758842915296555,
-0.03660871088504791,
0.005114538595080376,
0.022510098293423653,
-0.03652830421924591,
0.00792311318218708,
0.012217256240546703,
-0.06868947297334671,
-0.05553458258509636,
-0.023233558982610703,
0.09422210603952408,
-0.16494666039943695,
0.0220257006585598,
0.0823851153254509,
-0.12121747434139252,
0.09289738535881042,
0.016782134771347046,
0.00412249518558383,
0.026962365955114365,
-0.1545863002538681,
0.04763968288898468,
-0.020152103155851364,
0.013473534025251865,
0.04222847521305084,
-0.21637047827243805,
-0.004404853098094463,
-0.04015503451228142,
-0.05566934496164322,
-0.008993052877485752,
-0.0319182425737381,
-0.11338426172733307,
0.09645436704158783,
0.011025024577975273,
-0.08443772792816162,
-0.02965564839541912,
0.03353232145309448,
0.07690354436635971,
-0.027447547763586044,
0.1498211771249771,
-0.004663881380110979,
0.07559948414564133,
-0.17581342160701752,
-0.02282017655670643,
-0.011197620071470737,
0.022367527708411217,
-0.021871577948331833,
-0.01622559316456318,
0.04623444378376007,
-0.02704801969230175,
0.19120801985263824,
-0.024701936170458794,
0.049393873661756516,
0.06364397704601288,
0.009232889860868454,
-0.013832193799316883,
0.11151392012834549,
0.05708572641015053,
0.024334950372576714,
0.022262847051024437,
0.003451440716162324,
-0.04008655622601509,
-0.009981024079024792,
-0.18596695363521576,
0.06803664565086365,
0.14585918188095093,
0.09060460329055786,
-0.012669353745877743,
0.0707244873046875,
-0.10161512345075607,
-0.12005364894866943,
0.10127941519021988,
-0.06415384262800217,
-0.010188822634518147,
-0.06542414426803589,
0.14027701318264008,
0.14953285455703735,
-0.1886233240365982,
0.06583356112241745,
-0.06602055579423904,
-0.0566304549574852,
-0.11457879096269608,
-0.1930263340473175,
-0.057075321674346924,
-0.050602465867996216,
-0.018466074019670486,
-0.05384097993373871,
0.06939727067947388,
0.05750798434019089,
0.01126816775649786,
0.00868057832121849,
0.08568526059389114,
-0.009656033478677273,
0.00248199631460011,
0.030120067298412323,
0.06713981181383133,
0.016768986359238625,
-0.0321255661547184,
0.0179112758487463,
-0.00597198773175478,
0.034156378358602524,
0.059282708913087845,
0.03608176112174988,
-0.028436895459890366,
0.015559280291199684,
-0.034912437200546265,
-0.11309733241796494,
0.042801856994628906,
-0.029640642926096916,
-0.0749855786561966,
0.1347348988056183,
0.026981467381119728,
0.005015076603740454,
-0.023140020668506622,
0.2503887414932251,
-0.07436972856521606,
-0.09334370493888855,
-0.14373961091041565,
0.11701542884111404,
-0.04212593287229538,
0.0635172426700592,
0.03596310690045357,
-0.10810714215040207,
0.017985546961426735,
0.1320217251777649,
0.15442703664302826,
-0.04732590913772583,
0.019251897931098938,
0.028577854856848717,
0.00439635943621397,
-0.04075566306710243,
0.05177190154790878,
0.07100846618413925,
0.14500564336776733,
-0.05157303810119629,
0.08530787378549576,
0.002609728369861841,
-0.1021018698811531,
-0.041973695158958435,
0.11415864527225494,
-0.014296893030405045,
0.017620453611016273,
-0.057136841118335724,
0.124222531914711,
-0.05874236673116684,
-0.23697422444820404,
0.06316976249217987,
-0.0765061303973198,
-0.1432730257511139,
-0.024886758998036385,
0.071670763194561,
-0.016632623970508575,
0.02605951391160488,
0.07167234271764755,
-0.0754380151629448,
0.18880942463874817,
0.03957989811897278,
-0.05233397334814072,
-0.05954399332404137,
0.0744764655828476,
-0.11850855499505997,
0.27879106998443604,
0.010482731275260448,
0.051307905465364456,
0.1042102724313736,
-0.02021743729710579,
-0.13270841538906097,
0.023401619866490364,
0.09579801559448242,
-0.08917027711868286,
0.04087764397263527,
0.21448291838169098,
-0.00629545608535409,
0.11935057491064072,
0.07611140608787537,
-0.07468950748443604,
0.047562725841999054,
-0.11468592286109924,
-0.07639975845813751,
-0.08699081838130951,
0.09244474768638611,
-0.06785612553358078,
0.14258281886577606,
0.12599852681159973,
-0.05530165135860443,
0.011584274470806122,
-0.028389399871230125,
0.045467376708984375,
0.005578654818236828,
0.100032277405262,
0.011115525849163532,
-0.18496567010879517,
0.024811718612909317,
0.016259413212537766,
0.10884406417608261,
-0.18112654983997345,
-0.09105053544044495,
0.046958595514297485,
0.0005061255069449544,
-0.06443515419960022,
0.12483241409063339,
0.057313691824674606,
0.04654949903488159,
-0.0451689288020134,
-0.026830285787582397,
-0.006042256020009518,
0.14264579117298126,
-0.10707559436559677,
-0.005129707511514425
] |
null | null | diffusers |
<!-- This model card has been generated automatically according to the information the training script had access to. You
should probably proofread and complete it, then remove this comment. -->
# SDXL LoRA DreamBooth - rpaganini-dsense/sdxl-v1.0-id-rorro-http-1500
<Gallery />
## Model description
These are rpaganini-dsense/sdxl-v1.0-id-rorro-http-1500 LoRA adaptation weights for stabilityai/stable-diffusion-xl-base-1.0.
The weights were trained using [DreamBooth](https://dreambooth.github.io/).
LoRA for the text encoder was enabled: False.
Special VAE used for training: madebyollin/sdxl-vae-fp16-fix.
## Trigger words
You should use `a selfie photo of http woman` to trigger the image generation.
## Download model
Weights for this model are available in Safetensors format.
[Download](rpaganini-dsense/sdxl-v1.0-id-rorro-http-1500/tree/main) them in the Files & versions tab.
## Intended uses & limitations
#### How to use
```python
# TODO: add an example code snippet for running this diffusion pipeline
```
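
Until the official snippet above is filled in, the following is a minimal, hypothetical sketch of loading these LoRA weights with 🤗 Diffusers. The pipeline class, fp16 dtype, CUDA placement, step count, and guidance scale are assumptions drawn from the usual SDXL LoRA workflow, not settings validated for this checkpoint.

```python
# Hypothetical usage sketch, not an official snippet from this repository.
import torch
from diffusers import AutoencoderKL, DiffusionPipeline

# The fp16-fix VAE listed above avoids numerical issues when decoding in half precision.
vae = AutoencoderKL.from_pretrained("madebyollin/sdxl-vae-fp16-fix", torch_dtype=torch.float16)

pipe = DiffusionPipeline.from_pretrained(
    "stabilityai/stable-diffusion-xl-base-1.0",
    vae=vae,
    torch_dtype=torch.float16,
)
pipe.load_lora_weights("rpaganini-dsense/sdxl-v1.0-id-rorro-http-1500")
pipe.to("cuda")

# The trigger phrase from this card must appear in the prompt.
image = pipe(
    prompt="a selfie photo of http woman",
    num_inference_steps=30,   # assumed default, not tuned for this LoRA
    guidance_scale=7.5,       # assumed default, not tuned for this LoRA
).images[0]
image.save("sample.png")
```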
#### Limitations and bias
[TODO: provide examples of latent issues and potential remediations]
## Training details
[TODO: describe the data used to train the model] | {"license": "openrail++", "library_name": "diffusers", "tags": ["text-to-image", "stable-diffusion-xl", "stable-diffusion-xl-diffusers", "text-to-image", "diffusers", "lora", "template:sd-lora", "text-to-image", "stable-diffusion-xl", "stable-diffusion-xl-diffusers", "text-to-image", "diffusers", "lora", "template:sd-lora"], "base_model": "stabilityai/stable-diffusion-xl-base-1.0", "instance_prompt": "a selfie photo of http woman", "widget": []} | text-to-image | rpaganini-dsense/sdxl-v1.0-id-rorro-http-1500 | [
"diffusers",
"tensorboard",
"text-to-image",
"stable-diffusion-xl",
"stable-diffusion-xl-diffusers",
"lora",
"template:sd-lora",
"base_model:stabilityai/stable-diffusion-xl-base-1.0",
"license:openrail++",
"region:us"
] | 2024-02-14T15:10:19+00:00 | [] | [] | TAGS
#diffusers #tensorboard #text-to-image #stable-diffusion-xl #stable-diffusion-xl-diffusers #lora #template-sd-lora #base_model-stabilityai/stable-diffusion-xl-base-1.0 #license-openrail++ #region-us
|
# SDXL LoRA DreamBooth - rpaganini-dsense/sdxl-v1.0-id-rorro-http-1500
<Gallery />
## Model description
These are rpaganini-dsense/sdxl-v1.0-id-rorro-http-1500 LoRA adaption weights for stabilityai/stable-diffusion-xl-base-1.0.
The weights were trained using DreamBooth.
LoRA for the text encoder was enabled: False.
Special VAE used for training: madebyollin/sdxl-vae-fp16-fix.
## Trigger words
You should use a selfie photo of http woman to trigger the image generation.
## Download model
Weights for this model are available in Safetensors format.
Download them in the Files & versions tab.
## Intended uses & limitations
#### How to use
#### Limitations and bias
[TODO: provide examples of latent issues and potential remediations]
## Training details
[TODO: describe the data used to train the model] | [
"# SDXL LoRA DreamBooth - rpaganini-dsense/sdxl-v1.0-id-rorro-http-1500\n\n<Gallery />",
"## Model description\n\nThese are rpaganini-dsense/sdxl-v1.0-id-rorro-http-1500 LoRA adaption weights for stabilityai/stable-diffusion-xl-base-1.0.\n\nThe weights were trained using DreamBooth.\n\nLoRA for the text encoder was enabled: False.\n\nSpecial VAE used for training: madebyollin/sdxl-vae-fp16-fix.",
"## Trigger words\n\nYou should use a selfie photo of http woman to trigger the image generation.",
"## Download model\n\nWeights for this model are available in Safetensors format.\n\nDownload them in the Files & versions tab.",
"## Intended uses & limitations",
"#### How to use",
"#### Limitations and bias\n\n[TODO: provide examples of latent issues and potential remediations]",
"## Training details\n\n[TODO: describe the data used to train the model]"
] | [
"TAGS\n#diffusers #tensorboard #text-to-image #stable-diffusion-xl #stable-diffusion-xl-diffusers #lora #template-sd-lora #base_model-stabilityai/stable-diffusion-xl-base-1.0 #license-openrail++ #region-us \n",
"# SDXL LoRA DreamBooth - rpaganini-dsense/sdxl-v1.0-id-rorro-http-1500\n\n<Gallery />",
"## Model description\n\nThese are rpaganini-dsense/sdxl-v1.0-id-rorro-http-1500 LoRA adaption weights for stabilityai/stable-diffusion-xl-base-1.0.\n\nThe weights were trained using DreamBooth.\n\nLoRA for the text encoder was enabled: False.\n\nSpecial VAE used for training: madebyollin/sdxl-vae-fp16-fix.",
"## Trigger words\n\nYou should use a selfie photo of http woman to trigger the image generation.",
"## Download model\n\nWeights for this model are available in Safetensors format.\n\nDownload them in the Files & versions tab.",
"## Intended uses & limitations",
"#### How to use",
"#### Limitations and bias\n\n[TODO: provide examples of latent issues and potential remediations]",
"## Training details\n\n[TODO: describe the data used to train the model]"
] | [
82,
37,
102,
19,
28,
9,
5,
24,
16
] | [
"passage: TAGS\n#diffusers #tensorboard #text-to-image #stable-diffusion-xl #stable-diffusion-xl-diffusers #lora #template-sd-lora #base_model-stabilityai/stable-diffusion-xl-base-1.0 #license-openrail++ #region-us \n# SDXL LoRA DreamBooth - rpaganini-dsense/sdxl-v1.0-id-rorro-http-1500\n\n<Gallery />## Model description\n\nThese are rpaganini-dsense/sdxl-v1.0-id-rorro-http-1500 LoRA adaption weights for stabilityai/stable-diffusion-xl-base-1.0.\n\nThe weights were trained using DreamBooth.\n\nLoRA for the text encoder was enabled: False.\n\nSpecial VAE used for training: madebyollin/sdxl-vae-fp16-fix.## Trigger words\n\nYou should use a selfie photo of http woman to trigger the image generation.## Download model\n\nWeights for this model are available in Safetensors format.\n\nDownload them in the Files & versions tab.## Intended uses & limitations#### How to use#### Limitations and bias\n\n[TODO: provide examples of latent issues and potential remediations]## Training details\n\n[TODO: describe the data used to train the model]"
] | [
-0.025690842419862747,
0.12892138957977295,
-0.005138632375746965,
0.014234754256904125,
0.10495322197675705,
-0.016127021983265877,
0.1351710706949234,
0.10678780823945999,
0.09935148805379868,
0.10327479988336563,
-0.006231941748410463,
0.041726864874362946,
0.11279099434614182,
0.12939618527889252,
-0.06012549623847008,
-0.19564928114414215,
0.00953702162951231,
-0.024195075035095215,
-0.03200460225343704,
0.07160671800374985,
0.07611222565174103,
-0.06590587645769119,
0.07218249887228012,
0.010810742154717445,
-0.1018405333161354,
0.022045135498046875,
0.0010638403473421931,
0.00327532016672194,
0.01943543367087841,
0.056890495121479034,
0.06241130456328392,
0.06941454112529755,
0.03543755039572716,
-0.26337119936943054,
0.009080223739147186,
0.05712854489684105,
-0.025437045842409134,
0.04443957656621933,
0.07961804419755936,
-0.06737104058265686,
0.07182635366916656,
-0.16098731756210327,
0.057042866945266724,
0.056248072534799576,
-0.035927578806877136,
-0.11136343330144882,
-0.0799696296453476,
-0.015268834307789803,
0.07383940368890762,
0.07287122309207916,
-0.02066541649401188,
0.08140338957309723,
-0.009486018680036068,
0.052927061915397644,
0.25197461247444153,
-0.09855607897043228,
-0.05672536790370941,
0.12380502372980118,
0.04723697155714035,
0.026519259437918663,
-0.10566205531358719,
0.03510534390807152,
0.028310850262641907,
-0.00005113979568704963,
0.047263890504837036,
-0.05703480541706085,
0.0688607320189476,
-0.10499563813209534,
-0.10100322961807251,
-0.04269837960600853,
0.10887106508016586,
0.04371292144060135,
-0.061236146837472916,
-0.15964001417160034,
-0.06323965638875961,
0.15039591491222382,
-0.06301294267177582,
-0.07697732001543045,
0.011933833360671997,
-0.03282926231622696,
-0.0024706032127141953,
-0.11945905536413193,
-0.06102573499083519,
-0.020510071888566017,
0.06475101411342621,
0.12874458730220795,
0.012114419601857662,
0.020261311903595924,
0.006048842333257198,
0.08756446093320847,
0.008877644315361977,
-0.1276666820049286,
0.016352681443095207,
-0.05398426949977875,
-0.08150104433298111,
-0.0002485580334905535,
0.03998703509569168,
-0.07051382213830948,
0.06266915798187256,
0.05714615434408188,
-0.06649935990571976,
0.06542497873306274,
-0.05395817384123802,
0.023722687736153603,
0.006594917271286249,
0.08828330039978027,
0.013126915320754051,
-0.042606860399246216,
0.055578455328941345,
0.03837880119681358,
-0.02135542780160904,
-0.032796699553728104,
-0.03379201889038086,
-0.055449437350034714,
0.027699418365955353,
0.07436256110668182,
0.05081934109330177,
-0.00023538252571597695,
-0.0645795539021492,
-0.05771308019757271,
0.10586269199848175,
-0.11887555569410324,
0.04069161042571068,
-0.03987369313836098,
-0.02790154330432415,
0.07850536704063416,
0.08728661388158798,
-0.007669081445783377,
-0.0808565691113472,
0.046684104949235916,
-0.06778937578201294,
0.017346441745758057,
-0.08661295473575592,
-0.0807858556509018,
0.015963856130838394,
-0.17125946283340454,
-0.012220158241689205,
-0.084230437874794,
-0.198120579123497,
-0.06439398974180222,
0.008004517294466496,
-0.10637630522251129,
-0.025539172813296318,
-0.03256593644618988,
-0.03615723177790642,
0.00714073283597827,
0.03942202776670456,
0.030564680695533752,
0.006951935589313507,
0.08295457065105438,
-0.03106708452105522,
0.061488136649131775,
0.0925968661904335,
0.015916923061013222,
-0.07143820077180862,
0.07431504875421524,
-0.12592115998268127,
0.18022198975086212,
-0.07797323912382126,
-0.0009501446620561182,
-0.11332881450653076,
-0.05666004493832588,
-0.04204520583152771,
-0.009862768463790417,
0.012196507304906845,
0.14802151918411255,
-0.20514962077140808,
-0.03589438274502754,
0.18219129741191864,
-0.15216144919395447,
-0.03801440820097923,
0.036368656903505325,
-0.02178971841931343,
0.08727984875440598,
0.06168319284915924,
0.12917400896549225,
0.14217045903205872,
-0.14155592024326324,
-0.030611369758844376,
-0.05189547687768936,
-0.034997232258319855,
0.10354160517454147,
-0.004062427673488855,
0.022265871986746788,
0.067649245262146,
0.018893783912062645,
-0.0038429892156273127,
0.008504009805619717,
-0.011593146249651909,
-0.04653635993599892,
-0.015047726221382618,
-0.024492794647812843,
-0.03264857456088066,
0.019559431821107864,
-0.05830465629696846,
0.000800702313426882,
-0.06450258940458298,
0.05982718616724014,
0.04990730434656143,
-0.05277183651924133,
0.006974130403250456,
-0.06337085366249084,
0.038188062608242035,
-0.06679783761501312,
0.007793997880071402,
-0.15732744336128235,
-0.1354977935552597,
0.018995217978954315,
-0.005814222153276205,
0.051174066960811615,
0.06716074049472809,
0.08163943141698837,
0.0015801870031282306,
-0.045630499720573425,
0.019359705969691277,
0.01253301091492176,
-0.033306241035461426,
-0.03040124662220478,
-0.10428691655397415,
-0.007041143253445625,
-0.03470299765467644,
0.18320587277412415,
-0.208055779337883,
0.01814616285264492,
0.15799374878406525,
0.1412091851234436,
0.06343339383602142,
-0.0642680749297142,
0.043409187346696854,
-0.0010063598165288568,
0.012957041151821613,
-0.07114654034376144,
0.008121791295707226,
-0.008719678036868572,
-0.03392402082681656,
0.07865260541439056,
-0.14891523122787476,
-0.059955112636089325,
0.09315720200538635,
0.04011433199048042,
-0.08844590932130814,
-0.10799825191497803,
-0.029643990099430084,
-0.017476608976721764,
-0.07342897355556488,
-0.035368844866752625,
0.04957062005996704,
0.02216620370745659,
0.061097584664821625,
-0.05270827189087868,
-0.05874691531062126,
0.017666228115558624,
-0.0225509200245142,
-0.08062126487493515,
0.0593927726149559,
-0.009429258294403553,
-0.06473182141780853,
0.06760391592979431,
-0.03927038609981537,
-0.015489019453525543,
0.13073791563510895,
0.03608078509569168,
-0.06696397811174393,
0.00044057125342078507,
0.04494929313659668,
0.04877190664410591,
0.05010370910167694,
0.0070650773122906685,
0.06154628470540047,
0.057306740432977676,
-0.035457905381917953,
-0.01584766060113907,
-0.12464066594839096,
0.0013719761045649648,
0.04433104023337364,
-0.06064401566982269,
0.07833357155323029,
0.030145397409796715,
0.04473906755447388,
0.0793549120426178,
0.016374383121728897,
0.07030104100704193,
0.00025501559139229357,
-0.057923197746276855,
-0.10270216315984726,
0.08194449543952942,
-0.029818734154105186,
-0.2249804437160492,
-0.08881887793540955,
-0.05340948328375816,
0.006176862865686417,
-0.013662891462445259,
0.004194032400846481,
-0.031824659556150436,
-0.08081122487783432,
-0.06158169358968735,
0.025236038491129875,
0.007367216981947422,
-0.04502631351351738,
-0.019798746332526207,
0.03499018773436546,
0.08476236462593079,
-0.06550728529691696,
-0.013941648416221142,
0.01257959846407175,
-0.05353647097945213,
0.0447436086833477,
0.04279802367091179,
0.11902680993080139,
0.040534138679504395,
-0.0558403804898262,
0.028284674510359764,
-0.0007281150319613516,
0.22665764391422272,
-0.13299062848091125,
0.11280518770217896,
0.23718354105949402,
0.03701997175812721,
0.10237953811883926,
0.1267065703868866,
0.0009665176039561629,
-0.06982342153787613,
0.021785134449601173,
0.10972919315099716,
-0.08766810595989227,
-0.16279324889183044,
-0.07534904032945633,
-0.07146498560905457,
-0.07153453677892685,
0.10823950916528702,
0.07310005277395248,
0.12407791614532471,
0.14361700415611267,
-0.08578017354011536,
0.04855551943182945,
0.07006069272756577,
0.11768828332424164,
0.08483598381280899,
0.056509073823690414,
0.0584895946085453,
-0.040062315762043,
-0.05075555667281151,
0.07222320139408112,
0.025457940995693207,
0.22351554036140442,
-0.10206424444913864,
0.0294952429831028,
0.06585950404405594,
0.047576915472745895,
0.027139000594615936,
0.0313013419508934,
-0.012376808561384678,
-0.0038548088632524014,
-0.017646413296461105,
-0.11439631879329681,
-0.0038691957015544176,
0.08700791001319885,
0.019381100311875343,
-0.03905775770545006,
0.04148567095398903,
-0.03187580779194832,
0.02663097158074379,
0.11163477599620819,
0.011416574940085411,
-0.17765571177005768,
0.014908782206475735,
0.0781744122505188,
0.003880626056343317,
-0.07505585998296738,
-0.026826610788702965,
0.11007734388113022,
-0.1663382202386856,
0.12553773820400238,
-0.06948166340589523,
0.11016447842121124,
-0.1028730645775795,
-0.061535343527793884,
0.02009524405002594,
0.15032827854156494,
0.0023197727277874947,
0.09140800684690475,
-0.18392746150493622,
0.06149774789810181,
0.01830943673849106,
0.03377052769064903,
-0.07947885245084763,
0.03606943041086197,
0.013824699446558952,
-0.0970679223537445,
0.14748045802116394,
0.0001327408099314198,
0.030056236311793327,
-0.04999773949384689,
-0.036260541528463364,
-0.01796381175518036,
0.05893880873918533,
-0.040495991706848145,
0.08637116849422455,
-0.017570732161402702,
-0.025053435936570168,
-0.03707832843065262,
-0.0627104789018631,
-0.11282822489738464,
-0.18248167634010315,
0.010319137014448643,
-0.010131901130080223,
-0.04496876522898674,
-0.04036171734333038,
-0.013109717518091202,
-0.014371268451213837,
0.18847806751728058,
-0.02898339554667473,
-0.1249171793460846,
-0.14980453252792358,
0.029103700071573257,
0.10402468591928482,
-0.05802612006664276,
0.03943885490298271,
0.011522517539560795,
0.2126876264810562,
-0.09120059013366699,
-0.0956914871931076,
-0.01064485963433981,
-0.04907636344432831,
-0.11454986035823822,
-0.011096867732703686,
0.1447075605392456,
0.046711888164281845,
0.015016866847872734,
0.005038492381572723,
0.011556683108210564,
0.004446456208825111,
-0.08747632056474686,
0.05507770553231239,
0.06601136177778244,
0.011958446353673935,
0.06952743232250214,
0.025050247088074684,
-0.05063750222325325,
-0.12054683268070221,
0.006296941544860601,
0.035937532782554626,
0.25478461384773254,
-0.07020975649356842,
0.05288339778780937,
0.03204583004117012,
-0.03398098424077034,
-0.17352409660816193,
0.04741372913122177,
0.05665738880634308,
-0.01890658773481846,
0.05308982729911804,
-0.14569233357906342,
0.03215162083506584,
0.05538473278284073,
0.0016580679221078753,
0.18322689831256866,
-0.2238706350326538,
-0.11701516062021255,
-0.011326689273118973,
0.1521756798028946,
-0.010043257847428322,
-0.09332533180713654,
-0.0769694596529007,
-0.09316320717334747,
-0.1373397707939148,
0.12069697678089142,
-0.010771444998681545,
0.026047006249427795,
0.005671834107488394,
0.05198090150952339,
0.05372145399451256,
-0.03316063433885574,
0.13970428705215454,
-0.06463632732629776,
0.08450081944465637,
-0.10166563838720322,
-0.016632501035928726,
0.017151622101664543,
-0.03649640828371048,
0.07294964790344238,
-0.04821348190307617,
-0.008607178926467896,
-0.022311508655548096,
-0.04974854364991188,
-0.06918656080961227,
0.0305496659129858,
-0.040477048605680466,
-0.05847688764333725,
-0.09223967790603638,
0.06990986317396164,
0.09052659571170807,
-0.01551330741494894,
-0.0864417627453804,
-0.0903429314494133,
0.0006294297054409981,
0.18928802013397217,
0.08789023011922836,
0.0720798596739769,
-0.07902795076370239,
-0.028344392776489258,
0.0024011055938899517,
0.04481365531682968,
-0.05731242150068283,
0.02492259256541729,
0.06961625069379807,
0.04528115317225456,
0.1559482216835022,
-0.005253855604678392,
-0.11209030449390411,
0.005955573171377182,
0.04039381816983223,
-0.07785305380821228,
-0.16185186803340912,
-0.04706181213259697,
0.026607835665345192,
-0.0915108323097229,
-0.1440727859735489,
0.11773405224084854,
-0.014959757216274738,
-0.009190605022013187,
-0.002528318203985691,
0.05357753857970238,
0.011366493068635464,
0.10279262810945511,
0.03347812965512276,
0.0710657462477684,
-0.08918126672506332,
0.05669168010354042,
0.070685476064682,
-0.05473627150058746,
0.022452637553215027,
0.1210079938173294,
-0.0634787380695343,
0.009089792147278786,
-0.056983157992362976,
0.06927879899740219,
-0.048620857298374176,
-0.017759639769792557,
-0.048751357942819595,
-0.12812553346157074,
0.010648952797055244,
0.10606671124696732,
0.009928764775395393,
0.016051338985562325,
0.05162448063492775,
-0.01714659109711647,
-0.1516142636537552,
0.09056240320205688,
0.050414156168699265,
0.04091405123472214,
-0.18510036170482635,
-0.016461994498968124,
0.01954382285475731,
0.02869189903140068,
-0.043075911700725555,
-0.011788752861320972,
-0.06172047555446625,
-0.007746448740363121,
-0.033981531858444214,
0.09199424833059311,
-0.09434100985527039,
-0.01899249106645584,
-0.041529394686222076,
-0.012928820215165615,
-0.014659560285508633,
0.030282743275165558,
-0.052109479904174805,
-0.061996184289455414,
-0.006554595660418272,
0.012478209100663662,
-0.12726347148418427,
-0.05824664607644081,
0.062448617070913315,
-0.0896008312702179,
0.028339529410004616,
0.02835058607161045,
-0.04046783968806267,
-0.047937437891960144,
-0.14442259073257446,
0.010018709115684032,
0.10274052619934082,
0.009692367166280746,
0.03522832691669464,
-0.08997393399477005,
0.02733915112912655,
-0.010354515165090561,
-0.044416867196559906,
-0.00758766196668148,
-0.04609348997473717,
-0.1297272890806198,
0.07029195874929428,
-0.06611782312393188,
-0.08805448561906815,
-0.0740688219666481,
0.11290917545557022,
0.14789338409900665,
0.02327066659927368,
0.09275607019662857,
-0.08944744616746902,
0.09767480194568634,
-0.20169197022914886,
-0.025357160717248917,
0.02578076720237732,
0.011625437065958977,
0.049912214279174805,
-0.04846988990902901,
0.0390697680413723,
-0.012087179347872734,
0.1173093393445015,
0.06553962826728821,
0.005299743730574846,
0.023127982392907143,
-0.03711684048175812,
0.026547394692897797,
0.0011808137642219663,
0.033199697732925415,
0.0031743496656417847,
-0.013965766876935959,
0.03226498141884804,
0.001576049136929214,
0.018602969124913216,
-0.012018367648124695,
0.07333169132471085,
0.10156264901161194,
0.03239249810576439,
0.02509870007634163,
0.04472477734088898,
-0.005842490587383509,
-0.0330742783844471,
0.14399947226047516,
-0.020182427018880844,
-0.011056222952902317,
-0.09526719152927399,
0.025915568694472313,
0.16606825590133667,
-0.1639566868543625,
0.10747525840997696,
0.0480358824133873,
-0.05098225176334381,
-0.07627705484628677,
-0.22347061336040497,
-0.08445000648498535,
-0.07486634701490402,
0.03756179288029671,
-0.08032301813364029,
0.032530467957258224,
0.10049615800380707,
-0.02371404878795147,
-0.016775362193584442,
0.130295991897583,
-0.09251351654529572,
-0.060388922691345215,
0.02904554270207882,
0.01895415224134922,
0.011652957648038864,
0.055686697363853455,
0.034059397876262665,
0.08447618782520294,
0.07419472187757492,
0.04222739487886429,
0.049687791615724564,
0.059329889714717865,
0.03799271211028099,
-0.013976413756608963,
-0.08913733810186386,
0.009296835400164127,
0.029772743582725525,
0.03918367624282837,
0.16787777841091156,
0.04408792406320572,
0.011098315007984638,
-0.06711453944444656,
0.17873404920101166,
-0.03660504147410393,
-0.046153534203767776,
-0.09012146294116974,
0.05891227722167969,
-0.018603794276714325,
0.0033389776945114136,
0.02153312787413597,
-0.1202254518866539,
0.05259003862738609,
0.14679497480392456,
0.11897286772727966,
-0.019277848303318024,
0.027445001527667046,
-0.01522869523614645,
0.005514372140169144,
-0.02928929403424263,
0.02211662195622921,
0.03637048229575157,
0.19078640639781952,
-0.07089830189943314,
0.029326505959033966,
-0.020821280777454376,
-0.055857788771390915,
-0.04069945961236954,
0.031203705817461014,
-0.014806476421654224,
0.025481227785348892,
-0.06465891748666763,
0.08254978060722351,
-0.020444244146347046,
-0.16099698841571808,
0.15253019332885742,
-0.09469770640134811,
-0.05616801232099533,
-0.0038785715587437153,
-0.07045195251703262,
-0.0073295761831104755,
0.03731130436062813,
-0.03228257969021797,
-0.0016236946685239673,
0.05810251832008362,
0.024640066549181938,
-0.08844317495822906,
-0.011774622835218906,
0.03827233985066414,
-0.033178821206092834,
0.19085654616355896,
-0.03773891553282738,
0.04578184336423874,
0.060410358011722565,
-0.020847676321864128,
-0.07961568236351013,
0.04183490574359894,
0.02994907833635807,
-0.07925479859113693,
0.010516565293073654,
0.23903727531433105,
-0.034635093063116074,
0.011074038222432137,
0.09967949986457825,
-0.09762273728847504,
0.007706765551120043,
-0.019534457474946976,
-0.014112623408436775,
-0.08091996610164642,
0.06158946454524994,
-0.1336231678724289,
0.1010495126247406,
0.1665228009223938,
-0.002726147184148431,
0.03595777601003647,
-0.07366591691970825,
0.017435889691114426,
0.02093742974102497,
0.11673365533351898,
0.006577719934284687,
-0.028894314542412758,
-0.023196425288915634,
0.040319621562957764,
0.03596567362546921,
-0.21968437731266022,
-0.07263775169849396,
-0.06015828996896744,
-0.06921795755624771,
-0.03225706145167351,
0.09976204484701157,
0.03700311481952667,
0.044404540210962296,
-0.02029981091618538,
-0.22596508264541626,
0.03866768628358841,
0.12633661925792694,
-0.07977075129747391,
0.04345051944255829
] |
null | null | transformers |
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed to the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
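
Since the card does not yet provide a snippet, the sketch below is speculative: it assumes this repository (`AntoineGourru/Mistral_qlora_telecom_3_low`) hosts a standard causal-language-model checkpoint that the `transformers` Auto classes can load directly; the name suggests a Mistral QLoRA fine-tune for the telecom domain. If the repo instead ships only a PEFT adapter, load the corresponding base model first and attach the adapter with the `peft` library.

```python
# Speculative example; the model id, dtype, and generation settings are assumptions,
# not documented usage for this checkpoint.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "AntoineGourru/Mistral_qlora_telecom_3_low"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # assumption: half precision fits the intended hardware
    device_map="auto",          # requires the `accelerate` package
)

# Illustrative prompt only; the actual intended task of this fine-tune is not documented.
prompt = "Explain the difference between 4G and 5G core networks."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=200)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```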
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed]
| {"library_name": "transformers", "tags": []} | null | AntoineGourru/Mistral_qlora_telecom_3_low | [
"transformers",
"safetensors",
"arxiv:1910.09700",
"endpoints_compatible",
"region:us"
] | 2024-02-14T15:11:58+00:00 | [
"1910.09700"
] | [] | TAGS
#transformers #safetensors #arxiv-1910.09700 #endpoints_compatible #region-us
|
# Model Card for Model ID
## Model Details
### Model Description
This is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.
- Developed by:
- Funded by [optional]:
- Shared by [optional]:
- Model type:
- Language(s) (NLP):
- License:
- Finetuned from model [optional]:
### Model Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Downstream Use [optional]
### Out-of-Scope Use
## Bias, Risks, and Limitations
### Recommendations
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
## Training Details
### Training Data
### Training Procedure
#### Preprocessing [optional]
#### Training Hyperparameters
- Training regime:
#### Speeds, Sizes, Times [optional]
## Evaluation
### Testing Data, Factors & Metrics
#### Testing Data
#### Factors
#### Metrics
### Results
#### Summary
## Model Examination [optional]
## Environmental Impact
Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).
- Hardware Type:
- Hours used:
- Cloud Provider:
- Compute Region:
- Carbon Emitted:
## Technical Specifications [optional]
### Model Architecture and Objective
### Compute Infrastructure
#### Hardware
#### Software
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Model Card Authors [optional]
## Model Card Contact
| [
"# Model Card for Model ID",
"## Model Details",
"### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:",
"### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Downstream Use [optional]",
"### Out-of-Scope Use",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.",
"## How to Get Started with the Model\n\nUse the code below to get started with the model.",
"## Training Details",
"### Training Data",
"### Training Procedure",
"#### Preprocessing [optional]",
"#### Training Hyperparameters\n\n- Training regime:",
"#### Speeds, Sizes, Times [optional]",
"## Evaluation",
"### Testing Data, Factors & Metrics",
"#### Testing Data",
"#### Factors",
"#### Metrics",
"### Results",
"#### Summary",
"## Model Examination [optional]",
"## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:",
"## Technical Specifications [optional]",
"### Model Architecture and Objective",
"### Compute Infrastructure",
"#### Hardware",
"#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Model Card Authors [optional]",
"## Model Card Contact"
] | [
"TAGS\n#transformers #safetensors #arxiv-1910.09700 #endpoints_compatible #region-us \n",
"# Model Card for Model ID",
"## Model Details",
"### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:",
"### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Downstream Use [optional]",
"### Out-of-Scope Use",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.",
"## How to Get Started with the Model\n\nUse the code below to get started with the model.",
"## Training Details",
"### Training Data",
"### Training Procedure",
"#### Preprocessing [optional]",
"#### Training Hyperparameters\n\n- Training regime:",
"#### Speeds, Sizes, Times [optional]",
"## Evaluation",
"### Testing Data, Factors & Metrics",
"#### Testing Data",
"#### Factors",
"#### Metrics",
"### Results",
"#### Summary",
"## Model Examination [optional]",
"## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:",
"## Technical Specifications [optional]",
"### Model Architecture and Objective",
"### Compute Infrastructure",
"#### Hardware",
"#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Model Card Authors [optional]",
"## Model Card Contact"
] | [
31,
6,
3,
82,
28,
3,
4,
9,
9,
10,
42,
20,
3,
4,
5,
9,
11,
13,
3,
12,
5,
4,
5,
3,
4,
9,
53,
9,
8,
6,
3,
14,
8,
7,
9,
4
] | [
"passage: TAGS\n#transformers #safetensors #arxiv-1910.09700 #endpoints_compatible #region-us \n# Model Card for Model ID## Model Details### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Downstream Use [optional]### Out-of-Scope Use## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.## How to Get Started with the Model\n\nUse the code below to get started with the model.## Training Details### Training Data### Training Procedure#### Preprocessing [optional]#### Training Hyperparameters\n\n- Training regime:#### Speeds, Sizes, Times [optional]## Evaluation### Testing Data, Factors & Metrics#### Testing Data#### Factors#### Metrics### Results#### Summary## Model Examination [optional]## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:## Technical Specifications [optional]### Model Architecture and Objective### Compute Infrastructure#### Hardware#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Model Card Authors [optional]## Model Card Contact"
] | [
-0.06646376848220825,
0.2168014943599701,
-0.00225935154594481,
0.023818302899599075,
0.1271018385887146,
-0.001635765191167593,
0.04218708351254463,
0.13324736058712006,
-0.020175931975245476,
0.11144465953111649,
0.046588581055402756,
0.09377603232860565,
0.09928803145885468,
0.18404334783554077,
0.04859916493296623,
-0.2059975117444992,
0.007056170143187046,
-0.09090408682823181,
0.014076028019189835,
0.1116579994559288,
0.13719257712364197,
-0.10291384905576706,
0.08272874355316162,
-0.04045208916068077,
-0.02019004337489605,
0.00012576708104461432,
-0.09259183704853058,
-0.07032395154237747,
0.06885425746440887,
0.06264153122901917,
0.051234472543001175,
0.001456156256608665,
0.09140396863222122,
-0.2864592671394348,
0.017265573143959045,
0.08406311273574829,
0.0027674848679453135,
0.06290827691555023,
0.07236549258232117,
-0.07389893382787704,
0.11328595131635666,
-0.08021481335163116,
0.13019037246704102,
0.08625296503305435,
-0.062064990401268005,
-0.23071379959583282,
-0.07525765895843506,
0.0963398814201355,
0.12251301854848862,
0.06215599179267883,
-0.022921854630112648,
0.15455181896686554,
-0.06248689442873001,
0.012971068732440472,
0.1294165402650833,
-0.11526761949062347,
-0.05572471022605896,
0.061741601675748825,
0.11775490641593933,
0.10740239918231964,
-0.14110268652439117,
-0.0017287094378843904,
0.04900608956813812,
0.029121357947587967,
0.08589313924312592,
0.022661056369543076,
0.12003941088914871,
0.04652795568108559,
-0.13695219159126282,
-0.04037507623434067,
0.12011898308992386,
0.038862764835357666,
-0.06446044892072678,
-0.2168138176202774,
-0.006778308190405369,
-0.0601806715130806,
-0.014732478186488152,
-0.07019448280334473,
0.039128515869379044,
-0.02470310963690281,
0.07317749410867691,
-0.04465159401297569,
-0.1063927412033081,
-0.0421026237308979,
0.0892222449183464,
0.07748593389987946,
0.011527054943144321,
-0.02519804798066616,
0.04627908393740654,
0.13455867767333984,
0.05402068421244621,
-0.10399353504180908,
-0.07017925381660461,
-0.06942764669656754,
-0.09420394152402878,
-0.04035796597599983,
0.056760527193546295,
0.031942449510097504,
0.02665667235851288,
0.22703726589679718,
0.016653569415211678,
0.04155244305729866,
0.0224777739495039,
0.01032855175435543,
0.043662428855895996,
0.0955500528216362,
-0.05303520709276199,
-0.15660029649734497,
-0.04072032496333122,
0.09077946096658707,
-0.0027527001220732927,
-0.036689214408397675,
-0.03966725245118141,
0.03849169611930847,
0.06843466311693192,
0.13122352957725525,
0.07552056759595871,
-0.017929591238498688,
-0.04813180863857269,
-0.030096933245658875,
0.23523783683776855,
-0.1493375599384308,
0.04426715523004532,
-0.02271856553852558,
-0.01804111897945404,
-0.03908449783921242,
0.03597262129187584,
0.022118929773569107,
-0.000004518366949923802,
0.09706240892410278,
-0.058981191366910934,
-0.05378659814596176,
-0.10168042778968811,
-0.03272576630115509,
0.04088849574327469,
-0.013975566253066063,
-0.010589460842311382,
-0.09025166928768158,
-0.09490354359149933,
-0.04766594246029854,
0.05537205561995506,
-0.05123869329690933,
-0.03770573064684868,
0.009465423412621021,
-0.08151785284280777,
-0.005444355774670839,
-0.005417742300778627,
0.10699385404586792,
-0.03222226724028587,
0.04445803165435791,
-0.027600755915045738,
0.05225523188710213,
0.09919606149196625,
0.031576547771692276,
-0.0773419588804245,
0.0561848059296608,
-0.22559374570846558,
0.07503069192171097,
-0.11481974273920059,
0.04335082694888115,
-0.1704932004213333,
-0.042439818382263184,
0.005444696638733149,
0.0139949731528759,
0.013206101022660732,
0.12720820307731628,
-0.19255615770816803,
-0.01654396951198578,
0.13260798156261444,
-0.09212633967399597,
-0.118110790848732,
0.07884611934423447,
-0.029701577499508858,
0.1624738723039627,
0.04682036489248276,
-0.027025915682315826,
0.09224298596382141,
-0.16434773802757263,
-0.07092688232660294,
-0.00949116237461567,
-0.01727987825870514,
0.12109188735485077,
0.07512219995260239,
-0.05991523340344429,
0.046571120619773865,
0.02832140028476715,
-0.038078423589468,
-0.04424772411584854,
-0.050857074558734894,
-0.10884185880422592,
-0.01070026308298111,
-0.08987759798765182,
0.04065500199794769,
-0.01250192429870367,
-0.07916021347045898,
-0.029885273426771164,
-0.18612512946128845,
-0.0030564051121473312,
0.10038342326879501,
0.0035033065360039473,
-0.005652366206049919,
-0.08666291832923889,
0.026358824223279953,
-0.03112892620265484,
-0.008404186926782131,
-0.16764774918556213,
-0.04399421438574791,
0.046902090311050415,
-0.16094985604286194,
0.020117372274398804,
-0.06413903087377548,
0.06334125250577927,
0.03641495108604431,
-0.05590536445379257,
-0.0248766727745533,
-0.01730942726135254,
0.011945613659918308,
-0.05083848536014557,
-0.18994836509227753,
-0.056277405470609665,
-0.037882111966609955,
0.149809330701828,
-0.25956398248672485,
0.032966937869787216,
0.051140617579221725,
0.14649195969104767,
0.00406361510977149,
-0.05115427449345589,
0.01429014839231968,
-0.05360214412212372,
-0.054652128368616104,
-0.06746816635131836,
-0.006135428790003061,
-0.027576493099331856,
-0.05147203803062439,
0.019243421033024788,
-0.1755700707435608,
-0.021410830318927765,
0.09424154460430145,
0.12876708805561066,
-0.1486445665359497,
-0.018640631809830666,
-0.048725154250860214,
-0.06339836865663528,
-0.0715010017156601,
-0.07038594037294388,
0.10712739825248718,
0.0513901449739933,
0.04796046018600464,
-0.07435787469148636,
-0.07092321664094925,
0.02726263552904129,
0.006906150374561548,
-0.03382374346256256,
0.08727246522903442,
0.05199531093239784,
-0.09209315478801727,
0.0756213590502739,
0.1092359870672226,
0.07177663594484329,
0.09363535046577454,
0.01574566215276718,
-0.11756632477045059,
-0.028492970392107964,
0.036266472190618515,
0.02740776725113392,
0.1465986967086792,
-0.05952361226081848,
0.04016614332795143,
0.04494241625070572,
-0.04170418903231621,
0.022319864481687546,
-0.08787637203931808,
0.024075502529740334,
0.025203049182891846,
-0.0034381982404738665,
0.06284574419260025,
-0.02525499276816845,
-0.0050758360885083675,
0.07016654312610626,
0.047779910266399384,
0.04621000960469246,
0.009655474685132504,
-0.01720241829752922,
-0.1047825813293457,
0.16950392723083496,
-0.0951867327094078,
-0.269941508769989,
-0.17632324993610382,
0.026197833940386772,
0.04035249724984169,
-0.022378476336598396,
0.031619444489479065,
-0.07056326419115067,
-0.10630585998296738,
-0.1060405746102333,
-0.002429972169920802,
0.01714223250746727,
-0.06364088505506516,
-0.0741225928068161,
0.07348573952913284,
0.04382912442088127,
-0.14902326464653015,
0.038552410900592804,
0.055694397538900375,
-0.057955220341682434,
-0.0233661737293005,
0.09118817001581192,
0.12397737801074982,
0.14583967626094818,
-0.021366750821471214,
-0.028626007959246635,
0.029004426673054695,
0.19620531797409058,
-0.13469526171684265,
0.10371150821447372,
0.13814030587673187,
-0.04545360431075096,
0.08360563963651657,
0.1560150384902954,
0.029186224564909935,
-0.08317049592733383,
0.05044832453131676,
0.04082648828625679,
-0.043159641325473785,
-0.2666129767894745,
-0.0534592866897583,
0.012832709588110447,
-0.06255637854337692,
0.09786593168973923,
0.10183793306350708,
0.11542957276105881,
0.034910861402750015,
-0.07166364789009094,
-0.043925940990448,
-0.0058974819257855415,
0.11737963557243347,
-0.05490213260054588,
-0.012639665976166725,
0.07686592638492584,
-0.05086168646812439,
0.005355054512619972,
0.10266812145709991,
0.02973790094256401,
0.17442677915096283,
0.020399179309606552,
0.11231429129838943,
0.06195578724145889,
0.08633565157651901,
0.0007386076031252742,
0.02951662428677082,
0.05147615820169449,
0.017203815281391144,
-0.002300140680745244,
-0.10421168059110641,
-0.006156572140753269,
0.1449710875749588,
0.028103826567530632,
0.029669636860489845,
-0.0018948549404740334,
-0.005003341939300299,
0.05121048167347908,
0.1746254414319992,
-0.011592294089496136,
-0.22072425484657288,
-0.0845772922039032,
0.06936841458082199,
-0.06218599155545235,
-0.12968985736370087,
-0.026130788028240204,
0.045467354357242584,
-0.17519839107990265,
0.026703642681241035,
-0.027433741837739944,
0.0919293761253357,
-0.09345759451389313,
-0.02221956104040146,
0.03687324374914169,
0.084866963326931,
-0.014529162086546421,
0.08703910559415817,
-0.14498743414878845,
0.11886418610811234,
0.02978132851421833,
0.09024628251791,
-0.11081171780824661,
0.07909037172794342,
-0.007550720125436783,
0.009180475026369095,
0.19379350543022156,
-0.011335089802742004,
-0.03514958545565605,
-0.08774717897176743,
-0.11210042238235474,
-0.013537433929741383,
0.12687496840953827,
-0.1243172138929367,
0.08773399889469147,
-0.015198243781924248,
-0.044079482555389404,
0.00937260314822197,
-0.12100647389888763,
-0.17273177206516266,
-0.19628387689590454,
0.05585884302854538,
-0.09575839340686798,
0.025643249973654747,
-0.11914430558681488,
-0.07089093327522278,
-0.02952558360993862,
0.241120383143425,
-0.1745356321334839,
-0.06510113179683685,
-0.1468164622783661,
-0.046294767409563065,
0.1662203073501587,
-0.04437198117375374,
0.0718095526099205,
-0.0208172257989645,
0.20345525443553925,
0.005988610442727804,
-0.004939318168908358,
0.06724198162555695,
-0.08892562240362167,
-0.16873881220817566,
-0.06771010160446167,
0.1510489284992218,
0.11680185794830322,
0.04907919466495514,
-0.002248800592496991,
0.0011772146681323647,
-0.016943959519267082,
-0.1137804463505745,
-0.0033210667315870523,
0.16037839651107788,
0.03878779336810112,
0.025986969470977783,
-0.05243593826889992,
-0.08797456324100494,
-0.06899320334196091,
-0.06853509694337845,
0.06221301481127739,
0.19590823352336884,
-0.10376439243555069,
0.1700313836336136,
0.147536963224411,
-0.07305635511875153,
-0.23175598680973053,
0.035342130810022354,
0.04983805492520332,
0.0014306638622656465,
0.04886869341135025,
-0.18252557516098022,
0.10521943867206573,
0.019543392583727837,
-0.05505957826972008,
0.13485197722911835,
-0.1557481735944748,
-0.1552847921848297,
0.0722852572798729,
0.03904085233807564,
-0.22423844039440155,
-0.1354004591703415,
-0.09622503817081451,
-0.05825018882751465,
-0.14065024256706238,
0.06054598465561867,
-0.002136280992999673,
0.015948504209518433,
0.03500790148973465,
-0.0015643214574083686,
0.027123261243104935,
-0.058935679495334625,
0.18609118461608887,
-0.004065449349582195,
0.020676052197813988,
-0.060264769941568375,
-0.0478842556476593,
0.09839435666799545,
-0.06130504235625267,
0.12208222597837448,
0.004057085141539574,
0.01594383642077446,
-0.10362856835126877,
-0.048314861953258514,
-0.04328322783112526,
0.05154227837920189,
-0.07548051327466965,
-0.10070807486772537,
-0.043625857681035995,
0.08841723203659058,
0.07005169242620468,
-0.03383097052574158,
0.00549331633374095,
-0.07189501076936722,
0.10019614547491074,
0.17795267701148987,
0.17573626339435577,
0.009926567785441875,
-0.07241068035364151,
0.01677953451871872,
-0.04142116755247116,
0.044231921434402466,
-0.2513144314289093,
0.03756171092391014,
0.06098250672221184,
0.029438555240631104,
0.09217222779989243,
-0.020435843616724014,
-0.1820858269929886,
-0.04050002992153168,
0.08094815909862518,
-0.05452597141265869,
-0.22617179155349731,
-0.019085140898823738,
0.0954197570681572,
-0.2020406424999237,
-0.007372708059847355,
0.03995226323604584,
-0.048725228756666183,
-0.023169852793216705,
0.00010950004070764408,
0.06317184865474701,
0.002471912419423461,
0.09773622453212738,
0.0735151618719101,
0.09715340286493301,
-0.08337292820215225,
0.10562895983457565,
0.10150538384914398,
-0.09572599828243256,
0.03605884686112404,
0.06754924356937408,
-0.05300498008728027,
-0.043293699622154236,
0.03665391728281975,
0.033023297786712646,
0.005234600510448217,
-0.060321882367134094,
0.013913018628954887,
-0.036497246474027634,
0.044923391193151474,
0.08326134830713272,
0.03754979372024536,
-0.013354414142668247,
0.06462216377258301,
0.03401726484298706,
-0.10898099094629288,
0.10366570204496384,
0.01731540448963642,
0.04105307161808014,
-0.08384523540735245,
-0.019968897104263306,
0.035425446927547455,
0.030576206743717194,
-0.01765924133360386,
-0.02306121215224266,
-0.02860277332365513,
-0.01614218018949032,
-0.14299540221691132,
-0.023106401786208153,
-0.07243485748767853,
0.006181265693157911,
0.014656842686235905,
-0.031884219497442245,
-0.011233693920075893,
0.02475680410861969,
-0.06979699432849884,
-0.07426341623067856,
-0.006949664559215307,
0.09833318740129471,
-0.15115703642368317,
0.008848577737808228,
0.06907843053340912,
-0.11088496446609497,
0.08190931379795074,
-0.008411259390413761,
0.016245156526565552,
0.022527478635311127,
-0.15448406338691711,
0.05601610988378525,
0.0008648968650959432,
0.01916889287531376,
0.025886621326208115,
-0.16471809148788452,
0.004104440100491047,
-0.04661374166607857,
-0.02149827405810356,
-0.00004464812809601426,
-0.02647159807384014,
-0.12325995415449142,
0.06858719140291214,
-0.015622655861079693,
-0.035931166261434555,
-0.02701525390148163,
0.0539589487016201,
0.07888586074113846,
-0.027474910020828247,
0.10445091128349304,
-0.008690856397151947,
0.04941811040043831,
-0.16801609098911285,
-0.02470702864229679,
-0.04982255399227142,
0.019377702847123146,
0.009884213097393513,
-0.007693959400057793,
0.04183054715394974,
-0.00976533442735672,
0.21883612871170044,
-0.05075952783226967,
0.1607085019350052,
0.05847611650824547,
-0.017352959141135216,
-0.0007513365126214921,
0.06180921941995621,
0.05997028574347496,
0.04658793285489082,
0.009480604901909828,
0.023740366101264954,
-0.022450892254710197,
-0.006695089396089315,
-0.15932634472846985,
0.01890849508345127,
0.14999441802501678,
0.06301083415746689,
0.024745315313339233,
0.05866100639104843,
-0.12775006890296936,
-0.12135478109121323,
0.09311001747846603,
-0.026755332946777344,
0.00928465835750103,
-0.08245618641376495,
0.1358020007610321,
0.14980104565620422,
-0.14000412821769714,
0.05256148427724838,
-0.06134212389588356,
-0.05217423290014267,
-0.10388828068971634,
-0.12032219022512436,
-0.05887215584516525,
-0.053666237741708755,
0.002330566756427288,
-0.03760887682437897,
0.054546963423490524,
0.03344334661960602,
-0.009351172484457493,
-0.00022941511997487396,
0.13597318530082703,
-0.019751882180571556,
-0.0028988157864660025,
0.048313532024621964,
0.03693558648228645,
0.02373051457107067,
-0.05275435373187065,
0.02940409444272518,
0.02539868652820587,
0.032232340425252914,
0.06546790152788162,
0.033412106335163116,
-0.047448933124542236,
0.03804153576493263,
-0.0025254099164158106,
-0.11207924783229828,
0.019641218706965446,
-0.00460948096588254,
-0.0742158442735672,
0.1268945336341858,
0.0407399944961071,
0.010224059224128723,
-0.03741471841931343,
0.24361543357372284,
-0.06653323769569397,
-0.06378097087144852,
-0.13251738250255585,
0.10491154342889786,
-0.0027236645109951496,
0.06476365029811859,
0.023412218317389488,
-0.1284150779247284,
0.005243356805294752,
0.13858191668987274,
0.12181595712900162,
0.0045748427510261536,
0.009228081442415714,
0.0518609918653965,
0.0025186820421367884,
-0.06998204439878464,
0.054019294679164886,
0.06992026418447495,
0.12919506430625916,
-0.07847554981708527,
0.07680778950452805,
0.0006860480643808842,
-0.08370215445756912,
-0.02947772853076458,
0.11312682181596756,
-0.0409729965031147,
0.03491825982928276,
-0.047444481402635574,
0.10916327685117722,
-0.05787910893559456,
-0.29412412643432617,
0.02350960113108158,
-0.09588567912578583,
-0.15202060341835022,
-0.018367812037467957,
0.05944539234042168,
-0.02624768204987049,
0.018029648810625076,
0.06971040368080139,
-0.06011629104614258,
0.20098382234573364,
0.0335683599114418,
-0.07864278554916382,
-0.0664360448718071,
0.04837050288915634,
-0.06564252078533173,
0.2949807047843933,
0.008418165147304535,
0.02863333560526371,
0.10770907253026962,
-0.03253700211644173,
-0.18271861970424652,
0.010723991319537163,
0.1133992001414299,
-0.08056149631738663,
0.08200647681951523,
0.19000613689422607,
-0.012578671798110008,
0.1209007054567337,
0.05294662341475487,
-0.047376248985528946,
0.04217283055186272,
-0.03389401361346245,
-0.051268599927425385,
-0.10752558708190918,
0.058453381061553955,
-0.05909625440835953,
0.15447644889354706,
0.10152646154165268,
-0.05671518296003342,
-0.004550917539745569,
-0.05555408447980881,
0.04875178262591362,
0.01804669201374054,
0.12263146042823792,
0.02951994352042675,
-0.1865430772304535,
0.032826557755470276,
-0.01144319772720337,
0.10186848044395447,
-0.25588861107826233,
-0.08421015739440918,
0.08833149075508118,
-0.011924264021217823,
-0.05105875805020332,
0.10560628771781921,
0.057650718837976456,
0.04243382066488266,
-0.043439045548439026,
-0.10480839014053345,
-0.02186836116015911,
0.14663739502429962,
-0.1469624787569046,
-0.025013303384184837
] |
null | null | transformers |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# distilhubert-finetuned-gtzan
This model is a fine-tuned version of [ntu-spml/distilhubert](https://huggingface.co/ntu-spml/distilhubert) on the GTZAN dataset.
It achieves the following results on the evaluation set:
- Loss: 0.5295
- Accuracy: 0.83
## Model description
More information needed
## Intended uses & limitations
More information needed
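
A minimal inference sketch, assuming the checkpoint is used through the standard Transformers audio-classification pipeline (the audio file path is a placeholder):

```python
# Illustrative only: classify a music clip with the fine-tuned checkpoint.
from transformers import pipeline

classifier = pipeline("audio-classification", model="GeeDino/distilhubert-finetuned-gtzan")
predictions = classifier("path/to/clip.wav")  # placeholder path to a local audio file
print(predictions)  # list of {"label": ..., "score": ...} entries, one per genre
```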
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (an illustrative `Trainer` setup sketch follows the list):
- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 10
- mixed_precision_training: Native AMP
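
The listed values map onto a standard `TrainingArguments`/`Trainer` setup roughly as sketched below; the dataset objects and the classification head size are assumptions (GTZAN has 10 genre labels), not the authors' exact script.

```python
# Illustrative sketch of a Trainer setup matching the hyperparameters above.
from transformers import AutoModelForAudioClassification, Trainer, TrainingArguments

model = AutoModelForAudioClassification.from_pretrained("ntu-spml/distilhubert", num_labels=10)

training_args = TrainingArguments(
    output_dir="distilhubert-finetuned-gtzan",
    learning_rate=5e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=10,
    fp16=True,                    # Native AMP mixed precision
    evaluation_strategy="epoch",  # matches the per-epoch validation table below
)

trainer = Trainer(
    model=model,
    args=training_args,
    train_dataset=train_dataset,  # assumed: preprocessed GTZAN train split
    eval_dataset=eval_dataset,    # assumed: preprocessed GTZAN validation split
)
trainer.train()
```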
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 2.0004 | 1.0 | 113 | 1.8623 | 0.39 |
| 1.3378 | 2.0 | 226 | 1.2327 | 0.62 |
| 0.9874 | 3.0 | 339 | 0.9539 | 0.78 |
| 0.7984 | 4.0 | 452 | 0.7968 | 0.77 |
| 0.5491 | 5.0 | 565 | 0.7040 | 0.79 |
| 0.3278 | 6.0 | 678 | 0.6850 | 0.75 |
| 0.4007 | 7.0 | 791 | 0.5304 | 0.81 |
| 0.1203 | 8.0 | 904 | 0.5527 | 0.83 |
| 0.267 | 9.0 | 1017 | 0.5332 | 0.85 |
| 0.1416 | 10.0 | 1130 | 0.5295 | 0.83 |
### Framework versions
- Transformers 4.38.0.dev0
- Pytorch 2.2.0+cu118
- Datasets 2.17.0
- Tokenizers 0.15.2
| {"license": "apache-2.0", "tags": ["generated_from_trainer"], "datasets": ["marsyas/gtzan"], "metrics": ["accuracy"], "base_model": "ntu-spml/distilhubert", "model-index": [{"name": "distilhubert-finetuned-gtzan", "results": [{"task": {"type": "audio-classification", "name": "Audio Classification"}, "dataset": {"name": "GTZAN", "type": "marsyas/gtzan", "config": "all", "split": "train", "args": "all"}, "metrics": [{"type": "accuracy", "value": 0.83, "name": "Accuracy"}]}]}]} | audio-classification | GeeDino/distilhubert-finetuned-gtzan | [
"transformers",
"tensorboard",
"safetensors",
"hubert",
"audio-classification",
"generated_from_trainer",
"dataset:marsyas/gtzan",
"base_model:ntu-spml/distilhubert",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"region:us"
] | 2024-02-14T15:14:33+00:00 | [] | [] | TAGS
#transformers #tensorboard #safetensors #hubert #audio-classification #generated_from_trainer #dataset-marsyas/gtzan #base_model-ntu-spml/distilhubert #license-apache-2.0 #model-index #endpoints_compatible #region-us
| distilhubert-finetuned-gtzan
============================
This model is a fine-tuned version of ntu-spml/distilhubert on the GTZAN dataset.
It achieves the following results on the evaluation set:
* Loss: 0.5295
* Accuracy: 0.83
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 5e-05
* train\_batch\_size: 8
* eval\_batch\_size: 8
* seed: 42
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* lr\_scheduler\_warmup\_ratio: 0.1
* num\_epochs: 10
* mixed\_precision\_training: Native AMP
### Training results
### Framework versions
* Transformers 4.38.0.dev0
* Pytorch 2.2.0+cu118
* Datasets 2.17.0
* Tokenizers 0.15.2
| [
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 5e-05\n* train\\_batch\\_size: 8\n* eval\\_batch\\_size: 8\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_ratio: 0.1\n* num\\_epochs: 10\n* mixed\\_precision\\_training: Native AMP",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.38.0.dev0\n* Pytorch 2.2.0+cu118\n* Datasets 2.17.0\n* Tokenizers 0.15.2"
] | [
"TAGS\n#transformers #tensorboard #safetensors #hubert #audio-classification #generated_from_trainer #dataset-marsyas/gtzan #base_model-ntu-spml/distilhubert #license-apache-2.0 #model-index #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 5e-05\n* train\\_batch\\_size: 8\n* eval\\_batch\\_size: 8\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_ratio: 0.1\n* num\\_epochs: 10\n* mixed\\_precision\\_training: Native AMP",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.38.0.dev0\n* Pytorch 2.2.0+cu118\n* Datasets 2.17.0\n* Tokenizers 0.15.2"
] | [
78,
131,
4,
38
] | [
"passage: TAGS\n#transformers #tensorboard #safetensors #hubert #audio-classification #generated_from_trainer #dataset-marsyas/gtzan #base_model-ntu-spml/distilhubert #license-apache-2.0 #model-index #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 5e-05\n* train\\_batch\\_size: 8\n* eval\\_batch\\_size: 8\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_ratio: 0.1\n* num\\_epochs: 10\n* mixed\\_precision\\_training: Native AMP### Training results### Framework versions\n\n\n* Transformers 4.38.0.dev0\n* Pytorch 2.2.0+cu118\n* Datasets 2.17.0\n* Tokenizers 0.15.2"
] | [
-0.14417456090450287,
0.1378134787082672,
-0.0021908858325332403,
0.0682583898305893,
0.12133412063121796,
0.01276610977947712,
0.149664044380188,
0.09914003312587738,
-0.07175067067146301,
0.07450028508901596,
0.09853275865316391,
0.07674562931060791,
0.03691475838422775,
0.10893496870994568,
-0.053293466567993164,
-0.2658846974372864,
0.035722147673368454,
0.04160024970769882,
-0.13125304877758026,
0.10935661196708679,
0.10345185548067093,
-0.11467941850423813,
0.07156630605459213,
0.03290634602308273,
-0.14702454209327698,
-0.009124870412051678,
0.02525624819099903,
-0.0908794179558754,
0.0989801213145256,
0.034953705966472626,
0.10482800006866455,
0.030349675565958023,
0.09433634579181671,
-0.162052184343338,
0.015551930293440819,
0.06931182742118835,
0.02170187048614025,
0.10130324214696884,
0.08657396584749222,
0.006567487958818674,
0.06268101930618286,
-0.06516135483980179,
0.0704910159111023,
0.04152704030275345,
-0.10603142529726028,
-0.2964617908000946,
-0.10370276123285294,
0.07765856385231018,
0.10229391604661942,
0.07069358974695206,
-0.004212700761854649,
0.11547902971506119,
-0.007429072633385658,
0.08529651910066605,
0.21355858445167542,
-0.2581867277622223,
-0.07605718821287155,
-0.018831957131624222,
0.08928307145833969,
0.06654826551675797,
-0.10116088390350342,
-0.00437521655112505,
0.04983654245734215,
0.028303828090429306,
0.12778708338737488,
-0.004533736500889063,
0.024122439324855804,
-0.010947572067379951,
-0.1430506557226181,
-0.04082132875919342,
0.1740162968635559,
0.058915119618177414,
-0.0623292401432991,
-0.05116163566708565,
-0.05547179654240608,
-0.16487692296504974,
-0.048299893736839294,
0.005374670960009098,
0.02285381220281124,
-0.05324983224272728,
-0.08611448854207993,
-0.019510360434651375,
-0.07465122640132904,
-0.11076750606298447,
0.00872834213078022,
0.23517687618732452,
0.04401884973049164,
0.0038003188092261553,
-0.0254580769687891,
0.09881451725959778,
0.02461039274930954,
-0.18111924827098846,
-0.004589150659739971,
0.015230647288262844,
-0.03322336822748184,
-0.026722123846411705,
-0.03469129651784897,
-0.040884315967559814,
0.0075376033782958984,
0.18363045156002045,
-0.0790906772017479,
0.05809490755200386,
0.010003769770264626,
0.02282877452671528,
-0.08763477206230164,
0.17404693365097046,
-0.04938439652323723,
-0.033213481307029724,
0.01861117221415043,
0.10465648025274277,
0.04624851047992706,
-0.020644262433052063,
-0.0957183763384819,
0.021548118442296982,
0.10560042411088943,
0.028630860149860382,
-0.010526162572205067,
0.024190833792090416,
-0.04661531001329422,
-0.03962405025959015,
0.08392907679080963,
-0.08944337069988251,
0.02366010844707489,
0.016521938145160675,
-0.04949771985411644,
-0.031802911311388016,
0.012900938279926777,
0.01960223726928234,
0.018371911719441414,
0.11012528836727142,
-0.09577301144599915,
-0.03557155653834343,
-0.08246420323848724,
-0.11376763135194778,
0.0329345166683197,
-0.038662031292915344,
0.026604367420077324,
-0.09884385019540787,
-0.14009800553321838,
0.0008144913008436561,
0.04542626813054085,
-0.02222657948732376,
-0.06141859292984009,
-0.007760460954159498,
-0.09960227459669113,
0.05200713500380516,
-0.02177940122783184,
0.09349250793457031,
-0.05991723760962486,
0.10487089306116104,
0.07328968495130539,
0.04394249990582466,
-0.01878776215016842,
0.046927232295274734,
-0.05922141671180725,
0.06860078126192093,
-0.18639029562473297,
0.028384439647197723,
-0.09640339016914368,
0.03607138991355896,
-0.10093759745359421,
-0.11621876806020737,
0.03138986974954605,
-0.0012531676329672337,
0.08739632368087769,
0.09715282171964645,
-0.11893722414970398,
-0.09476117789745331,
0.12273219227790833,
-0.10395976901054382,
-0.14931906759738922,
0.11911531537771225,
-0.008430211804807186,
-0.04021548479795456,
0.04798891395330429,
0.14231069386005402,
0.14430983364582062,
-0.13072969019412994,
-0.033839598298072815,
-0.037187449634075165,
0.12093814462423325,
-0.018620822578668594,
0.1191922202706337,
0.01893663965165615,
-0.014935316517949104,
0.0020603241864591837,
-0.09451726078987122,
0.06061777099967003,
-0.08932074904441833,
-0.0783979594707489,
-0.037086427211761475,
-0.09075281769037247,
0.05088767409324646,
0.04500744119286537,
0.019938068464398384,
-0.08845304697751999,
-0.12046393752098083,
0.10136044770479202,
0.11926892399787903,
-0.07010917365550995,
0.014492037706077099,
-0.08881816267967224,
0.10071878135204315,
-0.08387138694524765,
-0.03229989483952522,
-0.16672495007514954,
-0.06495589017868042,
0.020862041041254997,
-0.07725854218006134,
0.01028656866401434,
-0.06123536825180054,
0.0696951374411583,
0.10395462065935135,
-0.06849461793899536,
-0.08974695950746536,
-0.09496258944272995,
0.012490536086261272,
-0.0683261975646019,
-0.20585770905017853,
-0.08387426286935806,
-0.028667112812399864,
0.12041281908750534,
-0.18193067610263824,
0.023970743641257286,
0.019534969702363014,
0.13643504679203033,
0.053212616592645645,
-0.031533367931842804,
-0.01470350194722414,
0.06488829851150513,
-0.03432246670126915,
-0.07009105384349823,
0.00975298322737217,
0.026767663657665253,
-0.10393129289150238,
-0.01250501349568367,
-0.147317573428154,
0.17602133750915527,
0.11238362640142441,
-0.0038838707841932774,
-0.05178610235452652,
0.014807375147938728,
-0.06753131747245789,
-0.042940665036439896,
-0.027718454599380493,
-0.022759903222322464,
0.08828794211149216,
0.014611609280109406,
0.12934604287147522,
-0.09167090803384781,
-0.034272175282239914,
0.047692421823740005,
-0.006038649473339319,
-0.024676678702235222,
0.09011019021272659,
0.048324838280677795,
-0.06874020397663116,
0.13516607880592346,
0.1539836972951889,
-0.08994703739881516,
0.15669408440589905,
-0.08235447108745575,
-0.09398019313812256,
-0.03909512981772423,
-0.022450650110840797,
0.03487484157085419,
0.15701717138290405,
-0.08134550601243973,
0.004023039247840643,
0.03132952004671097,
0.016539594158530235,
-0.004097987897694111,
-0.19653332233428955,
-0.0039062206633388996,
0.033301182091236115,
-0.0450262650847435,
-0.0580509752035141,
0.007570225279778242,
-0.010353575460612774,
0.0794685110449791,
0.008072562515735626,
-0.04334066063165665,
0.015611007809638977,
0.006871392019093037,
-0.06258410215377808,
0.1847832351922989,
-0.1043616533279419,
-0.13973690569400787,
-0.1834448128938675,
-0.033051520586013794,
-0.06226140260696411,
0.011533373035490513,
0.04822169616818428,
-0.08014175295829773,
-0.05775060877203941,
-0.050298649817705154,
0.04555242881178856,
-0.013801781460642815,
0.041223134845495224,
0.02634369023144245,
-0.0009164180955849588,
0.09610269963741302,
-0.10746368765830994,
0.018109310418367386,
0.011232353746891022,
-0.004920176230370998,
-0.006983474362641573,
0.049238190054893494,
0.10634530335664749,
0.13219398260116577,
0.014167992398142815,
0.004656955599784851,
-0.013137646950781345,
0.2196209877729416,
-0.1075146347284317,
-0.009412603452801704,
0.1653636395931244,
-0.03128008171916008,
0.038802530616521835,
0.10765990614891052,
0.06329869478940964,
-0.07144610583782196,
0.008900058455765247,
0.024289187043905258,
-0.027610652148723602,
-0.235499769449234,
-0.03330211713910103,
-0.053958211094141006,
0.0015866595786064863,
0.09113586694002151,
0.03674881160259247,
0.032002296298742294,
0.06148876994848251,
-0.03445154428482056,
0.04827399179339409,
-0.012989207170903683,
0.09502530843019485,
0.11693403124809265,
0.05105156823992729,
0.11990319192409515,
-0.0317079983651638,
-0.01889456994831562,
0.040490273386240005,
0.010372864082455635,
0.18359753489494324,
0.01427507121115923,
0.17483122646808624,
0.04583469033241272,
0.13716773688793182,
0.016145197674632072,
0.07414983958005905,
0.030629156157374382,
-0.01140446774661541,
0.016797522082924843,
-0.06446391344070435,
-0.015039350837469101,
0.019765788689255714,
-0.03392130881547928,
0.07769343256950378,
-0.1130669116973877,
0.015107309445738792,
0.013724733144044876,
0.28501611948013306,
0.05307930335402489,
-0.3504071533679962,
-0.14718160033226013,
0.024462107568979263,
-0.03451192006468773,
-0.07140319049358368,
0.025621924549341202,
0.11213624477386475,
-0.046786725521087646,
0.07705635577440262,
-0.07864221930503845,
0.09139332175254822,
-0.030301623046398163,
0.006386065389961004,
0.1100555807352066,
0.12142656743526459,
-0.004579933360219002,
0.0416005402803421,
-0.20922362804412842,
0.26128482818603516,
0.027891535311937332,
0.08630965650081635,
-0.022962668910622597,
0.027938250452280045,
0.032873254269361496,
0.050232771784067154,
0.0609285794198513,
-0.011992046609520912,
-0.10494616627693176,
-0.18955354392528534,
-0.0969015508890152,
0.011153706349432468,
0.1062348335981369,
-0.048025552183389664,
0.12023280560970306,
-0.046133603900671005,
-0.029696064069867134,
0.06560260057449341,
-0.0876123458147049,
-0.11036505550146103,
-0.07026215642690659,
0.036266591399908066,
0.03958994522690773,
0.04582499340176582,
-0.11082352697849274,
-0.1178707554936409,
-0.05965852364897728,
0.11091925948858261,
-0.08273947983980179,
-0.04327813535928726,
-0.13260920345783234,
0.05242867022752762,
0.15512311458587646,
-0.06828573346138,
0.07131550461053848,
0.004317313898354769,
0.16762274503707886,
0.02408352680504322,
-0.05119946226477623,
0.09313841164112091,
-0.09835619479417801,
-0.2234545350074768,
-0.0384492501616478,
0.16980284452438354,
0.026791749522089958,
0.052097126841545105,
-0.021193798631429672,
0.05059516057372093,
-0.007721565198153257,
-0.07877227663993835,
0.021408041939139366,
-0.02204533852636814,
0.03881458565592766,
0.012395404279232025,
-0.020938221365213394,
0.013572855852544308,
-0.036809515208005905,
-0.031801823526620865,
0.07940483093261719,
0.2816636264324188,
-0.07016459107398987,
-0.00025481998454779387,
0.05528726056218147,
-0.049118801951408386,
-0.16031630337238312,
0.034091830253601074,
0.09387240558862686,
0.020656146109104156,
0.017401281744241714,
-0.17973168194293976,
0.08412019908428192,
0.0690416544675827,
-0.04240689054131508,
0.12011882662773132,
-0.2584805488586426,
-0.13458628952503204,
0.12000852078199387,
0.12267482280731201,
0.012128638103604317,
-0.16487430036067963,
-0.06452258676290512,
-0.03653143346309662,
-0.15613475441932678,
0.12392601370811462,
-0.09015748649835587,
0.09474509954452515,
-0.009919744916260242,
0.04992666468024254,
0.015113763511180878,
-0.05092116445302963,
0.1640133112668991,
-0.016331037506461143,
0.06945284456014633,
-0.014331333339214325,
0.027417946606874466,
0.10450278222560883,
-0.062610924243927,
0.009667692705988884,
-0.06285958737134933,
0.0451056994497776,
-0.08062410354614258,
-0.027119630947709084,
-0.07619798183441162,
0.02492608316242695,
-0.04917988181114197,
-0.04353583604097366,
-0.022180624306201935,
0.03962007537484169,
0.0199040025472641,
-0.03223564848303795,
0.20423905551433563,
0.02022663690149784,
0.15898221731185913,
0.115326888859272,
0.0969327762722969,
-0.01802477240562439,
-0.08499132841825485,
0.003986779134720564,
-0.0472603403031826,
0.08285053819417953,
-0.1394137144088745,
0.04158084839582443,
0.1165970042347908,
0.052949462085962296,
0.10572008788585663,
0.06303279846906662,
-0.0607302300632,
0.012355589307844639,
0.07332927733659744,
-0.14411059021949768,
-0.12808841466903687,
-0.04568904638290405,
0.03708194941282272,
-0.1411116123199463,
0.043178435415029526,
0.12643523514270782,
-0.08053738623857498,
-0.012446022592484951,
0.02620146982371807,
0.004156365990638733,
-0.04117351397871971,
0.21276307106018066,
0.06810631603002548,
0.0746019259095192,
-0.09839674085378647,
0.09321781247854233,
0.03836073726415634,
-0.12808142602443695,
0.001407242612913251,
0.0546395480632782,
-0.07068338990211487,
-0.015303746797144413,
0.004923057276755571,
0.08279486745595932,
-0.05808211490511894,
-0.0720200166106224,
-0.15146969258785248,
-0.1317177414894104,
0.06153903529047966,
0.15008150041103363,
0.05156250670552254,
0.025277893990278244,
-0.029995933175086975,
0.061847615987062454,
-0.11417897045612335,
0.13869482278823853,
0.0697820633649826,
0.10985539108514786,
-0.1984713226556778,
0.11041973531246185,
-0.003384226933121681,
0.025260422378778458,
-0.019790656864643097,
0.02649909257888794,
-0.10001298785209656,
0.0017172731459140778,
-0.16954940557479858,
-0.010510746389627457,
-0.035596806555986404,
0.007012166082859039,
-0.01838708110153675,
-0.07675199210643768,
-0.0816197618842125,
0.041484586894512177,
-0.10002091526985168,
-0.04038950055837631,
0.016976546496152878,
0.04477265477180481,
-0.11820495873689651,
-0.014506577514111996,
0.05387164279818535,
-0.11940839141607285,
0.0705755278468132,
0.04785594344139099,
0.036207087337970734,
0.05247367173433304,
-0.050337351858615875,
-0.0011996403336524963,
0.0512179359793663,
0.010396802797913551,
0.038479290902614594,
-0.16323822736740112,
-0.010450114496052265,
-0.022175932303071022,
0.02742486633360386,
-0.011637059040367603,
0.06338027864694595,
-0.11200670152902603,
-0.023774221539497375,
-0.013929387554526329,
-0.01876063272356987,
-0.059237342327833176,
0.031838804483413696,
0.11615563929080963,
0.022929780185222626,
0.17902915179729462,
-0.07771915942430496,
0.006560640409588814,
-0.2133626788854599,
0.019501425325870514,
-0.026058221235871315,
-0.13034391403198242,
-0.10613919794559479,
-0.014841394498944283,
0.06846486777067184,
-0.055529531091451645,
0.08462747931480408,
-0.0594177208840847,
0.08665432035923004,
0.051401834934949875,
-0.010936226695775986,
0.015414347872138023,
0.048135120421648026,
0.23215895891189575,
0.032435670495033264,
-0.01678312197327614,
0.05517534911632538,
0.008472542278468609,
0.07809896022081375,
0.08852303773164749,
0.1440776288509369,
0.13640999794006348,
-0.00929509662091732,
0.07811091095209122,
0.07575210928916931,
-0.059920720756053925,
-0.18936799466609955,
0.03900472819805145,
-0.036811333149671555,
0.12601026892662048,
0.014432701282203197,
0.1949634701013565,
0.10699336230754852,
-0.1720283031463623,
0.033954329788684845,
-0.03662465512752533,
-0.07104099541902542,
-0.10501077026128769,
-0.03751669451594353,
-0.07955825328826904,
-0.15637655556201935,
0.005813021212816238,
-0.1293179988861084,
0.031342580914497375,
0.05212816223502159,
0.01413737889379263,
0.006496282760053873,
0.18252259492874146,
0.04211321845650673,
0.007205562200397253,
0.06752391159534454,
0.019194848835468292,
-0.03731696307659149,
-0.018522778525948524,
-0.1000278890132904,
0.04522887244820595,
-0.016521498560905457,
0.031659942120313644,
-0.05558733642101288,
-0.07747266441583633,
0.07116870582103729,
0.019716894254088402,
-0.11737555265426636,
0.026946745812892914,
0.016820503398776054,
0.06059278920292854,
0.04554890841245651,
-0.0036682405043393373,
0.014013009145855904,
-0.004706352483481169,
0.22672413289546967,
-0.09378539770841599,
-0.03564964979887009,
-0.12428838014602661,
0.2070855349302292,
-0.0191491786390543,
-0.02765580825507641,
0.04659261927008629,
-0.08089758455753326,
-0.02472875453531742,
0.13251712918281555,
0.12378182262182236,
-0.016059940680861473,
-0.023266827687621117,
0.0017325544031336904,
-0.01335328072309494,
-0.0697491466999054,
0.08226228505373001,
0.10832075774669647,
0.056605931371450424,
-0.05629292502999306,
-0.053973689675331116,
-0.05352696776390076,
-0.020800525322556496,
-0.005865718238055706,
0.07821495085954666,
0.00519918417558074,
-0.030427010729908943,
-0.030064303427934647,
0.08827326446771622,
-0.051342327147722244,
-0.14029110968112946,
0.01940772496163845,
-0.15702851116657257,
-0.17848625779151917,
-0.0473339781165123,
0.08534206449985504,
0.006055100820958614,
0.03553116321563721,
-0.006099517922848463,
-0.021673355251550674,
0.10093256086111069,
-0.004228788893669844,
-0.0402010902762413,
-0.11250101029872894,
0.10046375542879105,
-0.054650239646434784,
0.22147691249847412,
-0.037592317909002304,
0.05012517794966698,
0.10574121028184891,
0.03889956325292587,
-0.10494047403335571,
0.02530662901699543,
0.08084403723478317,
-0.15087197721004486,
0.024544447660446167,
0.1735786348581314,
-0.04359778016805649,
0.12518319487571716,
0.01843283325433731,
-0.13281913101673126,
-0.026256509125232697,
-0.05318525806069374,
-0.07527843117713928,
-0.05144338682293892,
-0.037351712584495544,
-0.05894182249903679,
0.13555066287517548,
0.16654452681541443,
-0.06084287539124489,
-0.03109271079301834,
-0.053690217435359955,
0.05019395798444748,
0.09329663962125778,
0.08222493529319763,
-0.0014754958683624864,
-0.2712819576263428,
0.010743487626314163,
0.02571941912174225,
-0.005077061243355274,
-0.2514696717262268,
-0.07862550765275955,
0.014151720330119133,
-0.041698649525642395,
-0.06945347040891647,
0.08581247925758362,
0.07064446806907654,
0.03562682494521141,
-0.06348598003387451,
-0.0475497767329216,
-0.07384973019361496,
0.16293302178382874,
-0.17732037603855133,
-0.07521972805261612
] |
null | null | transformers |
# EmertonMonarch-7B-slerp
EmertonMonarch-7B-slerp is a merge of the following models using [LazyMergekit](https://colab.research.google.com/drive/1obulZ1ROXHjYLn6PPZJwRR6GzgQogxxb?usp=sharing):
* [mlabonne/Monarch-7B](https://huggingface.co/mlabonne/Monarch-7B)
* [yleo/EmertonBeagle-7B-dpo](https://huggingface.co/yleo/EmertonBeagle-7B-dpo)
## 🧩 Configuration
```yaml
slices:
- sources:
- model: mlabonne/Monarch-7B
layer_range: [0, 32]
- model: yleo/EmertonBeagle-7B-dpo
layer_range: [0, 32]
merge_method: slerp
base_model: mlabonne/Monarch-7B
parameters:
t:
- filter: self_attn
value: [0, 0.5, 0.3, 0.7, 1]
- filter: mlp
value: [1, 0.5, 0.7, 0.3, 0]
- value: 0.5
dtype: bfloat16
```
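
For intuition, the `slerp` merge method interpolates each pair of weight tensors along the arc between them rather than along a straight line, with the blend factor `t` scheduled per layer and per module filter as in the config above. A minimal sketch of the idea (illustrative only, not mergekit's actual implementation):

```python
import torch

def slerp(t: float, a: torch.Tensor, b: torch.Tensor, eps: float = 1e-8) -> torch.Tensor:
    """Spherical linear interpolation between two weight tensors of the same shape."""
    a_flat, b_flat = a.flatten().float(), b.flatten().float()
    a_unit = a_flat / (a_flat.norm() + eps)
    b_unit = b_flat / (b_flat.norm() + eps)
    omega = torch.arccos(torch.clamp(a_unit @ b_unit, -1.0, 1.0))  # angle between the tensors
    if omega.abs() < eps:  # nearly parallel: fall back to plain linear interpolation
        return (1.0 - t) * a + t * b
    so = torch.sin(omega)
    mixed = (torch.sin((1.0 - t) * omega) / so) * a_flat + (torch.sin(t * omega) / so) * b_flat
    return mixed.reshape(a.shape).to(a.dtype)

# t = 0 keeps the base model's weights, t = 1 takes the other model's; the config above
# uses intermediate values that differ for self-attention and MLP blocks.
```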
## 💻 Usage
```python
# Install dependencies first (the leading "!" assumes a notebook cell, e.g. Colab)
!pip install -qU transformers accelerate
from transformers import AutoTokenizer
import transformers
import torch
model = "yleo/EmertonMonarch-7B-slerp"
messages = [{"role": "user", "content": "What is a large language model?"}]
tokenizer = AutoTokenizer.from_pretrained(model)
prompt = tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)
pipeline = transformers.pipeline(
"text-generation",
model=model,
torch_dtype=torch.float16,
device_map="auto",
)
outputs = pipeline(prompt, max_new_tokens=256, do_sample=True, temperature=0.7, top_k=50, top_p=0.95)
print(outputs[0]["generated_text"])
``` | {"license": "cc-by-nc-4.0", "tags": ["merge", "mergekit", "lazymergekit", "mlabonne/Monarch-7B", "yleo/EmertonBeagle-7B-dpo"], "base_model": ["mlabonne/Monarch-7B", "yleo/EmertonBeagle-7B-dpo"]} | text-generation | yleo/EmertonMonarch-7B-slerp | [
"transformers",
"safetensors",
"mistral",
"text-generation",
"merge",
"mergekit",
"lazymergekit",
"mlabonne/Monarch-7B",
"yleo/EmertonBeagle-7B-dpo",
"base_model:mlabonne/Monarch-7B",
"base_model:yleo/EmertonBeagle-7B-dpo",
"license:cc-by-nc-4.0",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | 2024-02-14T15:14:51+00:00 | [] | [] | TAGS
#transformers #safetensors #mistral #text-generation #merge #mergekit #lazymergekit #mlabonne/Monarch-7B #yleo/EmertonBeagle-7B-dpo #base_model-mlabonne/Monarch-7B #base_model-yleo/EmertonBeagle-7B-dpo #license-cc-by-nc-4.0 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
# EmertonMonarch-7B-slerp
EmertonMonarch-7B-slerp is a merge of the following models using LazyMergekit:
* mlabonne/Monarch-7B
* yleo/EmertonBeagle-7B-dpo
## Configuration
## Usage
| [
"# EmertonMonarch-7B-slerp\n\nEmertonMonarch-7B-slerp is a merge of the following models using LazyMergekit:\n* mlabonne/Monarch-7B\n* yleo/EmertonBeagle-7B-dpo",
"## Configuration",
"## Usage"
] | [
"TAGS\n#transformers #safetensors #mistral #text-generation #merge #mergekit #lazymergekit #mlabonne/Monarch-7B #yleo/EmertonBeagle-7B-dpo #base_model-mlabonne/Monarch-7B #base_model-yleo/EmertonBeagle-7B-dpo #license-cc-by-nc-4.0 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n",
"# EmertonMonarch-7B-slerp\n\nEmertonMonarch-7B-slerp is a merge of the following models using LazyMergekit:\n* mlabonne/Monarch-7B\n* yleo/EmertonBeagle-7B-dpo",
"## Configuration",
"## Usage"
] | [
125,
57,
4,
3
] | [
"passage: TAGS\n#transformers #safetensors #mistral #text-generation #merge #mergekit #lazymergekit #mlabonne/Monarch-7B #yleo/EmertonBeagle-7B-dpo #base_model-mlabonne/Monarch-7B #base_model-yleo/EmertonBeagle-7B-dpo #license-cc-by-nc-4.0 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n# EmertonMonarch-7B-slerp\n\nEmertonMonarch-7B-slerp is a merge of the following models using LazyMergekit:\n* mlabonne/Monarch-7B\n* yleo/EmertonBeagle-7B-dpo## Configuration## Usage"
] | [
-0.079913429915905,
-0.003251905320212245,
-0.005163630470633507,
0.019358104094862938,
0.036593299359083176,
0.045065443962812424,
0.16813676059246063,
0.049875371158123016,
0.06782770156860352,
0.024219488725066185,
0.0962124764919281,
0.19030244648456573,
-0.00797856505960226,
0.08435877412557602,
-0.041760142892599106,
-0.2206924855709076,
0.07953303307294846,
0.06110186502337456,
-0.004550281446427107,
0.07860899716615677,
0.09589411318302155,
-0.047275010496377945,
0.09667608141899109,
0.007630248088389635,
-0.04273923859000206,
-0.0031815278343856335,
-0.026311861351132393,
-0.05980442091822624,
0.10646714270114899,
0.05834370106458664,
0.09290100634098053,
0.07351642847061157,
-0.02088317647576332,
-0.08951012790203094,
0.025677399709820747,
-0.015043860301375389,
-0.02869051694869995,
0.06178199499845505,
0.03273819386959076,
-0.07337579131126404,
0.07496333122253418,
-0.03815394639968872,
0.0701623260974884,
0.011046876199543476,
-0.06702287495136261,
-0.1384403258562088,
-0.0543045736849308,
0.048010118305683136,
0.0668274536728859,
0.04126855731010437,
0.0030970233492553234,
0.09456527978181839,
-0.057013798505067825,
0.08817610889673233,
0.2277091145515442,
-0.3040910065174103,
-0.0004582832916639745,
0.11718253046274185,
0.09729872643947601,
-0.019558485597372055,
0.00575505755841732,
0.015068773180246353,
-0.02822069264948368,
0.0177045576274395,
0.10145976394414902,
-0.07894230633974075,
0.18550480902194977,
-0.08003171533346176,
-0.13695655763149261,
0.011235397309064865,
0.16624432802200317,
0.011990994215011597,
-0.023992707952857018,
-0.09338593482971191,
-0.10940646380186081,
0.06586136668920517,
-0.04209322854876518,
-0.05968843773007393,
0.04509262368083,
-0.0110790329053998,
0.013932759873569012,
-0.02069145441055298,
-0.014013044536113739,
-0.039270300418138504,
-0.1080564558506012,
0.19164758920669556,
0.019198555499315262,
0.007797444239258766,
-0.028137331828475,
0.07418935000896454,
-0.08306475728750229,
-0.11231497675180435,
-0.03775012865662575,
-0.06880401819944382,
0.06190185993909836,
0.02191907912492752,
-0.025817731395363808,
-0.09754202514886856,
0.08759185671806335,
0.2431638240814209,
-0.06656258553266525,
0.07094168663024902,
0.040474630892276764,
0.05559569224715233,
-0.026836464181542397,
0.05944586917757988,
-0.06553374230861664,
-0.11264952272176743,
0.062412913888692856,
0.036882515996694565,
0.1088280901312828,
-0.008445324376225471,
-0.09593600034713745,
-0.015107659623026848,
0.05118221417069435,
-0.013846350833773613,
0.013570720329880714,
0.09803210198879242,
-0.09050953388214111,
-0.04502413794398308,
0.19768217206001282,
-0.07866913825273514,
-0.006341961212456226,
-0.0209168903529644,
-0.04787876829504967,
0.05565765127539635,
0.11772903800010681,
0.010364385321736336,
0.00850595347583294,
0.12122631818056107,
-0.06838507205247879,
-0.0485118068754673,
-0.03797159716486931,
-0.10474416613578796,
0.015848778188228607,
-0.05621121823787689,
-0.004813600331544876,
-0.1317104697227478,
-0.2493142932653427,
0.02653956227004528,
0.028025634586811066,
-0.0026885787956416607,
0.003302741562947631,
-0.06859833002090454,
-0.00913157407194376,
-0.011345930397510529,
-0.010553154163062572,
-0.08870168030261993,
-0.020336544141173363,
0.016069959849119186,
0.016589365899562836,
0.0683809444308281,
-0.21715497970581055,
0.02941768616437912,
-0.09110010415315628,
0.10247671604156494,
-0.2864020764827728,
0.09330662339925766,
-0.05270447954535484,
0.020018955692648888,
-0.04917146638035774,
-0.014410125091671944,
-0.07906067371368408,
0.03809135779738426,
0.023704640567302704,
0.10937009751796722,
-0.06120915710926056,
-0.11871941387653351,
0.14489401876926422,
-0.14156943559646606,
-0.12189403176307678,
0.06444151699542999,
0.008883905597031116,
0.043736618012189865,
0.09519688785076141,
0.1922987550497055,
0.057876020669937134,
-0.0511687695980072,
-0.05601007491350174,
0.014670361764729023,
-0.011497639119625092,
-0.018074292689561844,
0.07837895303964615,
-0.0427805595099926,
-0.0541975200176239,
0.06251170486211777,
-0.042449451982975006,
0.04007333517074585,
-0.011506671085953712,
-0.004945416934788227,
-0.033199772238731384,
-0.03981220722198486,
0.15210379660129547,
-0.04333703592419624,
0.015001415275037289,
-0.08527417480945587,
-0.09595227241516113,
0.16915951669216156,
0.06527426838874817,
-0.05615357682108879,
-0.006708992645144463,
-0.09575647115707397,
0.09297149628400803,
-0.015721432864665985,
0.039980243891477585,
-0.09397976100444794,
-0.13932910561561584,
0.002071032766252756,
-0.05446520075201988,
0.0062536997720599174,
-0.03298228234052658,
0.06722118705511093,
0.07806815207004547,
-0.051888421177864075,
-0.052696920931339264,
0.051964446902275085,
0.03675006330013275,
-0.01656627096235752,
-0.1388239860534668,
-0.051557302474975586,
-0.05242817848920822,
0.2117813378572464,
-0.04993119835853577,
0.09132063388824463,
0.011945119127631187,
0.1470387727022171,
0.018093092367053032,
-0.006627040449529886,
-0.005404130555689335,
0.040858104825019836,
-0.005291400942951441,
0.00330817187204957,
0.0812339261174202,
-0.011015640571713448,
-0.14047199487686157,
0.05373343452811241,
-0.16852033138275146,
0.20161758363246918,
0.15057481825351715,
0.03244916349649429,
0.012897850945591927,
-0.029732195660471916,
0.020916299894452095,
-0.05137137696146965,
0.0676254853606224,
-0.11438434571027756,
0.09513357281684875,
0.015606043860316277,
0.10779095441102982,
-0.1046980768442154,
-0.029840948060154915,
-0.00729876896366477,
-0.04801584407687187,
-0.018412692472338676,
0.06922692060470581,
-0.013667326420545578,
-0.09389778226613998,
0.10339529812335968,
0.21635472774505615,
-0.013840374536812305,
0.10974087566137314,
0.014593330211937428,
0.024868866428732872,
-0.04059843346476555,
0.0018804644932970405,
-0.019903426989912987,
0.02907382696866989,
-0.16493713855743408,
0.04125102236866951,
0.03355555981397629,
0.005476709455251694,
0.04648308828473091,
-0.07511388510465622,
0.018924258649349213,
0.013118044473230839,
-0.016648979857563972,
0.023549377918243408,
0.0663665160536766,
0.007221778389066458,
0.06033198907971382,
0.01573677733540535,
-0.061657682061195374,
0.06167572736740112,
0.02323777601122856,
-0.06475995481014252,
0.17399808764457703,
-0.14022879302501678,
-0.24309982359409332,
-0.12368439137935638,
-0.07919773459434509,
-0.08621779084205627,
0.002640614751726389,
0.0841061919927597,
0.0022510048002004623,
-0.018567459657788277,
-0.10736130177974701,
0.12098455429077148,
0.0618860200047493,
0.0013882454950362444,
0.021466592326760292,
-0.02626710757613182,
-0.00016878522001206875,
-0.1392637938261032,
-0.048917725682258606,
-0.008681349456310272,
-0.03253423422574997,
0.09437000751495361,
-0.0707947388291359,
0.05780870094895363,
0.0992334634065628,
0.022832125425338745,
-0.013375716283917427,
-0.03379453346133232,
0.15953412652015686,
-0.0015676083276048303,
0.04216017946600914,
0.16015808284282684,
-0.049603722989559174,
0.06610576063394547,
0.13240700960159302,
0.043094851076602936,
-0.05735025554895401,
-0.014069355092942715,
-0.020366841927170753,
-0.06145145744085312,
-0.15717211365699768,
-0.1174471378326416,
-0.01583690010011196,
0.09629785269498825,
0.004780075047165155,
0.041663236916065216,
0.05308574438095093,
0.08387230336666107,
-0.03249070420861244,
-0.007171998731791973,
0.08147168159484863,
0.06584175676107407,
0.22021710872650146,
-0.017848066985607147,
0.12954704463481903,
-0.019667373970150948,
-0.04286753386259079,
0.029815321788191795,
0.00734360096976161,
0.03507191687822342,
0.025548996403813362,
0.0929425060749054,
0.04654966667294502,
0.06230684742331505,
0.04702412709593773,
0.08490612357854843,
-0.012727647088468075,
-0.012854818254709244,
-0.03291868045926094,
-0.08215034753084183,
-0.042085397988557816,
0.002782928291708231,
-0.07937375456094742,
0.07838296890258789,
-0.031407155096530914,
0.030409228056669235,
0.06142814829945564,
0.12228161096572876,
0.02496222034096718,
-0.31860947608947754,
-0.08479625731706619,
0.03021053597331047,
0.03235684335231781,
-0.014366032555699348,
-0.001565769431181252,
0.014548614621162415,
-0.06885753571987152,
0.14557717740535736,
0.006621749140322208,
0.04835442826151848,
-0.05295547470450401,
0.03883592039346695,
-0.028824545443058014,
0.10642711818218231,
0.009489563293755054,
0.020650021731853485,
-0.18162894248962402,
0.07330148667097092,
0.033807847648859024,
0.013076243922114372,
0.006815892178565264,
0.01944209448993206,
0.00925068836659193,
0.13478732109069824,
0.055371928960084915,
-0.012089396826922894,
0.07887871563434601,
-0.03854353353381157,
-0.10587189346551895,
0.016646863892674446,
0.047578465193510056,
-0.07486474514007568,
0.06430703401565552,
-0.0027942617889493704,
-0.04206108674407005,
0.03315258398652077,
0.011539283208549023,
-0.12161226570606232,
-0.09219814091920853,
0.053242068737745285,
0.0608476847410202,
0.10820577293634415,
-0.10001574456691742,
-0.03943729028105736,
-0.05781428888440132,
0.15483629703521729,
-0.08980335295200348,
-0.10360521823167801,
-0.06550183147192001,
-0.06937126070261002,
0.07886825501918793,
-0.08031141012907028,
0.09925742447376251,
-0.024425454437732697,
0.03715056553483009,
-0.048829857259988785,
-0.12569059431552887,
0.11686099320650101,
-0.07529442757368088,
-0.08083386719226837,
-0.036532603204250336,
0.15201686322689056,
0.003032771172001958,
0.01335090957581997,
0.013431528583168983,
0.01893007382750511,
-0.006549518089741468,
-0.06476705521345139,
-0.006716081872582436,
0.13095413148403168,
-0.03630934655666351,
0.0919330045580864,
-0.073330819606781,
-0.09043795615434647,
0.017954619601368904,
0.03997553884983063,
0.11617731302976608,
0.25197869539260864,
-0.02463592402637005,
0.06101872771978378,
0.1580444872379303,
-0.03193322941660881,
-0.21629339456558228,
-0.08345170319080353,
0.014655712060630322,
-0.018110409379005432,
0.000117323717859108,
-0.11644364148378372,
0.11801794916391373,
0.20932038128376007,
-0.007996032014489174,
0.06677281111478806,
-0.354960560798645,
-0.12041079998016357,
0.12376734614372253,
0.08185276389122009,
0.2066017985343933,
-0.12307677417993546,
-0.07725867629051208,
-0.08840303868055344,
-0.16683422029018402,
0.09474662691354752,
-0.1281917542219162,
0.0653565302491188,
-0.03960896655917168,
-0.028452984988689423,
-0.0030988408252596855,
-0.02389875240623951,
0.09278566390275955,
0.00481117470189929,
0.029984530061483383,
-0.0653940960764885,
-0.009528862312436104,
0.1433945745229721,
-0.03606482222676277,
0.10325584560632706,
-0.12011540681123734,
0.006984880194067955,
-0.009993509389460087,
-0.03409714624285698,
-0.05775875598192215,
0.10710449516773224,
-0.03728589415550232,
-0.055704567581415176,
-0.009820944629609585,
-0.028222547844052315,
-0.0386698879301548,
0.028460627421736717,
0.10246211290359497,
-0.01415847148746252,
0.0455477349460125,
0.1600460559129715,
0.07782553881406784,
-0.19017618894577026,
-0.045553624629974365,
-0.00040056410944089293,
-0.05234730616211891,
0.04603953659534454,
-0.0019249359611421824,
-0.006027427036315203,
0.07705079764127731,
0.006229695398360491,
0.0815386176109314,
0.05977202206850052,
-0.016125254333019257,
-0.0036578825674951077,
0.07777830958366394,
-0.17784430086612701,
-0.1456756293773651,
-0.00368114304728806,
0.020661214366555214,
-0.011494433507323265,
0.13741031289100647,
0.20610633492469788,
-0.0036319163627922535,
-0.017179694026708603,
0.011706260964274406,
-0.019377749413251877,
-0.07000589370727539,
0.13089890778064728,
-0.005800659768283367,
0.035856813192367554,
-0.09827370941638947,
0.05351600795984268,
0.005056871101260185,
-0.05730338767170906,
-0.035911425948143005,
0.08242394030094147,
-0.13302576541900635,
-0.07981996238231659,
-0.14724776148796082,
0.18494510650634766,
-0.06717272847890854,
-0.02646517939865589,
-0.11298020929098129,
-0.14515596628189087,
0.02423742413520813,
0.1770000010728836,
0.08528603613376617,
0.05068659782409668,
0.027624476701021194,
-0.024072224274277687,
-0.01006199512630701,
0.05624117702245712,
-0.0012564872158691287,
0.10903874039649963,
-0.10293526202440262,
-0.022274242714047432,
0.003910839557647705,
-0.010009746067225933,
-0.04939822480082512,
-0.005511715542525053,
-0.16040489077568054,
-0.041182734072208405,
-0.142052561044693,
-0.020139066502451897,
-0.12851938605308533,
-0.016846679151058197,
-0.021738151088356972,
0.023444153368473053,
0.021436495706439018,
-0.008372755721211433,
-0.018199343234300613,
-0.04431496188044548,
0.01757872849702835,
0.054423291236162186,
-0.1011064350605011,
-0.01942010596394539,
0.017635701224207878,
-0.05429399386048317,
0.08460117131471634,
0.06576059013605118,
0.002736779162660241,
-0.011915812268853188,
-0.10444150120019913,
-0.03910941258072853,
0.0971631333231926,
-0.016180481761693954,
0.01594049669802189,
-0.0526127927005291,
-0.026907609775662422,
0.015225128270685673,
-0.03317985683679581,
0.009462953545153141,
0.14831359684467316,
-0.10438282042741776,
0.057600896805524826,
-0.031803280115127563,
-0.054414331912994385,
-0.05278054624795914,
-0.029901014640927315,
0.09323274344205856,
0.030058391392230988,
0.15682531893253326,
-0.0611436665058136,
-0.002289406955242157,
-0.1452295035123825,
-0.024199558421969414,
0.010595392435789108,
-0.13185691833496094,
-0.00019807417993433774,
-0.03269793093204498,
-0.0038050503935664892,
-0.014826267026364803,
0.14648424088954926,
-0.035134680569171906,
-0.15048852562904358,
0.01099268440157175,
0.01306058932095766,
0.028888758271932602,
0.024851223453879356,
0.18813493847846985,
0.05608414486050606,
-0.011179951950907707,
-0.056063212454319,
0.07688459753990173,
0.05388915166258812,
-0.05820538103580475,
0.04833421856164932,
0.11822755634784698,
-0.028789497911930084,
0.0799475684762001,
0.10613131523132324,
0.06099672615528107,
-0.13732104003429413,
-0.012094639241695404,
0.03289046138525009,
0.04681865870952606,
0.011793393641710281,
0.19909809529781342,
0.09497099369764328,
-0.09217540919780731,
0.028182996436953545,
0.07388269156217575,
-0.020027948543429375,
-0.0869022086262703,
-0.0919371172785759,
-0.09217039495706558,
-0.16102883219718933,
-0.04257919639348984,
-0.10186710953712463,
-0.09794820845127106,
0.08928129822015762,
-0.012534559704363346,
0.017427315935492516,
0.1639990210533142,
-0.06623578071594238,
-0.05635159835219383,
-0.006588167976588011,
-0.017826536670327187,
-0.06651163101196289,
-0.010657653212547302,
-0.04778929427266121,
-0.02043265849351883,
0.05978274717926979,
-0.010974324308335781,
0.0005090284976176918,
-0.019418632611632347,
0.01766396500170231,
-0.040669795125722885,
-0.08830317854881287,
-0.009419864043593407,
0.07886376231908798,
-0.01224950049072504,
-0.005513298790901899,
-0.00430404394865036,
-0.10650262236595154,
0.0017098399112001061,
0.10601147264242172,
-0.02012692019343376,
-0.10657349973917007,
-0.013487336225807667,
0.15900637209415436,
0.021148620173335075,
0.07403592020273209,
0.0030524057801812887,
-0.05742906779050827,
0.011422236450016499,
0.10980153828859329,
0.26042717695236206,
-0.016296831890940666,
0.025474749505519867,
-0.007397180423140526,
0.009848780930042267,
0.029315151274204254,
0.042888909578323364,
0.06727880984544754,
0.19734369218349457,
-0.022829987108707428,
0.0398101843893528,
-0.0328327976167202,
-0.023155823349952698,
-0.11580397933721542,
-0.05352093651890755,
0.008137993514537811,
-0.031369395554065704,
-0.005547652021050453,
0.05255699157714844,
-0.06837009638547897,
-0.012698749080300331,
0.007300278637558222,
-0.11128544062376022,
-0.0744268000125885,
-0.07874151319265366,
0.03738439455628395,
-0.011634673923254013,
0.08431039750576019,
-0.01816931553184986,
-0.058167874813079834,
0.10512164980173111,
-0.01754937693476677,
-0.13408328592777252,
-0.05910440534353256,
-0.000007622211342095397,
-0.02632000297307968,
0.08150795847177505,
-0.017732158303260803,
0.05532456934452057,
0.10286162048578262,
0.037095122039318085,
-0.05338181182742119,
0.09207716584205627,
0.005449444055557251,
-0.02138829417526722,
0.032683808356523514,
-0.013922126032412052,
-0.023484691977500916,
0.15319857001304626,
0.032255761325359344,
-0.17935286462306976,
0.02525399439036846,
0.09562251716852188,
-0.10260418802499771,
-0.0383559986948967,
0.036487456411123276,
-0.05743439868092537,
0.07406370341777802,
0.13655421137809753,
-0.017406435683369637,
-0.01810469850897789,
-0.01882750168442726,
0.0321769192814827,
0.08604668825864792,
-0.026631414890289307,
-0.04534545913338661,
-0.17516553401947021,
-0.006989618297666311,
0.05806552246212959,
-0.0202474482357502,
-0.23184381425380707,
-0.10323549807071686,
-0.1393464207649231,
-0.008290141820907593,
-0.07590985298156738,
-0.0017073695780709386,
0.1159234493970871,
0.039832159876823425,
-0.0295284241437912,
-0.04738141596317291,
-0.03097183257341385,
0.11504102498292923,
-0.11645230650901794,
-0.11202189326286316
] |
null | null | transformers |
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed to the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
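
A minimal loading sketch, assuming the pre-quantized 4-bit weights load through the standard 🤗 Transformers API (with `bitsandbytes` and `accelerate` installed) and that the tokenizer ships a chat template; the prompt and generation settings are illustrative:

```python
# Illustrative only: load the 4-bit checkpoint and run a short chat-style generation.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "unsloth/tinyllama-chat-bnb-4bit"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

messages = [{"role": "user", "content": "Summarize what a language model is in one sentence."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```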
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed]
| {"library_name": "transformers", "tags": []} | text-generation | unsloth/tinyllama-chat-bnb-4bit | [
"transformers",
"safetensors",
"llama",
"text-generation",
"conversational",
"arxiv:1910.09700",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"4-bit",
"region:us"
] | 2024-02-14T15:15:09+00:00 | [
"1910.09700"
] | [] | TAGS
#transformers #safetensors #llama #text-generation #conversational #arxiv-1910.09700 #autotrain_compatible #endpoints_compatible #text-generation-inference #4-bit #region-us
|
# Model Card for Model ID
## Model Details
### Model Description
This is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.
- Developed by:
- Funded by [optional]:
- Shared by [optional]:
- Model type:
- Language(s) (NLP):
- License:
- Finetuned from model [optional]:
### Model Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Downstream Use [optional]
### Out-of-Scope Use
## Bias, Risks, and Limitations
### Recommendations
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
## Training Details
### Training Data
### Training Procedure
#### Preprocessing [optional]
#### Training Hyperparameters
- Training regime:
#### Speeds, Sizes, Times [optional]
## Evaluation
### Testing Data, Factors & Metrics
#### Testing Data
#### Factors
#### Metrics
### Results
#### Summary
## Model Examination [optional]
## Environmental Impact
Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).
- Hardware Type:
- Hours used:
- Cloud Provider:
- Compute Region:
- Carbon Emitted:
## Technical Specifications [optional]
### Model Architecture and Objective
### Compute Infrastructure
#### Hardware
#### Software
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Model Card Authors [optional]
## Model Card Contact
| [
"# Model Card for Model ID",
"## Model Details",
"### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:",
"### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Downstream Use [optional]",
"### Out-of-Scope Use",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.",
"## How to Get Started with the Model\n\nUse the code below to get started with the model.",
"## Training Details",
"### Training Data",
"### Training Procedure",
"#### Preprocessing [optional]",
"#### Training Hyperparameters\n\n- Training regime:",
"#### Speeds, Sizes, Times [optional]",
"## Evaluation",
"### Testing Data, Factors & Metrics",
"#### Testing Data",
"#### Factors",
"#### Metrics",
"### Results",
"#### Summary",
"## Model Examination [optional]",
"## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:",
"## Technical Specifications [optional]",
"### Model Architecture and Objective",
"### Compute Infrastructure",
"#### Hardware",
"#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Model Card Authors [optional]",
"## Model Card Contact"
] | [
"TAGS\n#transformers #safetensors #llama #text-generation #conversational #arxiv-1910.09700 #autotrain_compatible #endpoints_compatible #text-generation-inference #4-bit #region-us \n",
"# Model Card for Model ID",
"## Model Details",
"### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:",
"### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Downstream Use [optional]",
"### Out-of-Scope Use",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.",
"## How to Get Started with the Model\n\nUse the code below to get started with the model.",
"## Training Details",
"### Training Data",
"### Training Procedure",
"#### Preprocessing [optional]",
"#### Training Hyperparameters\n\n- Training regime:",
"#### Speeds, Sizes, Times [optional]",
"## Evaluation",
"### Testing Data, Factors & Metrics",
"#### Testing Data",
"#### Factors",
"#### Metrics",
"### Results",
"#### Summary",
"## Model Examination [optional]",
"## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:",
"## Technical Specifications [optional]",
"### Model Architecture and Objective",
"### Compute Infrastructure",
"#### Hardware",
"#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Model Card Authors [optional]",
"## Model Card Contact"
] | [
63,
6,
3,
82,
28,
3,
4,
9,
9,
10,
42,
20,
3,
4,
5,
9,
11,
13,
3,
12,
5,
4,
5,
3,
4,
9,
53,
9,
8,
6,
3,
14,
8,
7,
9,
4
] | [
"passage: TAGS\n#transformers #safetensors #llama #text-generation #conversational #arxiv-1910.09700 #autotrain_compatible #endpoints_compatible #text-generation-inference #4-bit #region-us \n# Model Card for Model ID## Model Details### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Downstream Use [optional]### Out-of-Scope Use## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.## How to Get Started with the Model\n\nUse the code below to get started with the model.## Training Details### Training Data### Training Procedure#### Preprocessing [optional]#### Training Hyperparameters\n\n- Training regime:#### Speeds, Sizes, Times [optional]## Evaluation### Testing Data, Factors & Metrics#### Testing Data#### Factors#### Metrics### Results#### Summary## Model Examination [optional]## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:## Technical Specifications [optional]### Model Architecture and Objective### Compute Infrastructure#### Hardware#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Model Card Authors [optional]## Model Card Contact"
] | [
-0.04188906401395798,
0.1868634819984436,
-0.005423274356871843,
0.01757637970149517,
0.09925421327352524,
0.005491955671459436,
0.05528821051120758,
0.11709722131490707,
-0.0533502921462059,
0.12540937960147858,
0.04112355411052704,
0.11151822656393051,
0.11794505268335342,
0.14256049692630768,
-0.0006609223200939596,
-0.21857817471027374,
0.049333877861499786,
-0.10810572654008865,
-0.011713704094290733,
0.12273083627223969,
0.14624762535095215,
-0.10107240080833435,
0.06873463094234467,
-0.03527786582708359,
-0.018390236422419548,
-0.03754361346364021,
-0.0631043016910553,
-0.0428098663687706,
0.04095882177352905,
0.05579971522092819,
0.06504834443330765,
0.0006914962432347238,
0.08312074840068817,
-0.28029099106788635,
0.019063476473093033,
0.06934154778718948,
-0.002301337430253625,
0.0655684545636177,
0.0665774792432785,
-0.06358441710472107,
0.10711093246936798,
-0.053030017763376236,
0.13777746260166168,
0.08432129770517349,
-0.09262416511774063,
-0.18061797320842743,
-0.09176573157310486,
0.1097564846277237,
0.17784090340137482,
0.05030388385057449,
-0.027255237102508545,
0.10309778153896332,
-0.08135051280260086,
0.01753326505422592,
0.049427784979343414,
-0.08851896971464157,
-0.05635679140686989,
0.06728223711252213,
0.09259241074323654,
0.05506645515561104,
-0.12643326818943024,
-0.033265259116888046,
0.005194802302867174,
0.01665104553103447,
0.07253342121839523,
0.021623503416776657,
0.14840956032276154,
0.03535414859652519,
-0.13311921060085297,
-0.05290580168366432,
0.10409694164991379,
0.04025519639253616,
-0.04047621786594391,
-0.2443436086177826,
-0.02754354290664196,
-0.02584495209157467,
-0.034590695053339005,
-0.04391676187515259,
0.04140663892030716,
-0.0025886280927807093,
0.08679894357919693,
-0.008228464052081108,
-0.07375224679708481,
-0.03308725357055664,
0.0671553984284401,
0.062167905271053314,
0.029431181028485298,
-0.020213494077324867,
0.022710734978318214,
0.10721412301063538,
0.09491091966629028,
-0.11595078557729721,
-0.058361902832984924,
-0.06589995324611664,
-0.0719945952296257,
-0.041349031031131744,
0.0347420796751976,
0.016205165535211563,
0.07120134681463242,
0.2571125328540802,
0.01922749914228916,
0.05696013197302818,
0.028859760612249374,
0.007498756982386112,
0.051865316927433014,
0.10615350306034088,
-0.06344283372163773,
-0.11439100652933121,
-0.0198503527790308,
0.08982639759778976,
0.019948948174715042,
-0.03634757548570633,
-0.0477963387966156,
0.06657220423221588,
0.04311024025082588,
0.11142025142908096,
0.09991209954023361,
0.020174458622932434,
-0.07410643249750137,
-0.06169014424085617,
0.19804425537586212,
-0.15571331977844238,
0.03830697387456894,
0.04228408634662628,
-0.0350167416036129,
-0.020801648497581482,
0.008958199061453342,
0.02559036947786808,
-0.03405916318297386,
0.08602763712406158,
-0.0542801134288311,
-0.04678044095635414,
-0.11117270588874817,
-0.03103097900748253,
0.044144779443740845,
0.009705442003905773,
-0.03457854315638542,
-0.03633419796824455,
-0.07459833472967148,
-0.08514720946550369,
0.08634167164564133,
-0.07002478837966919,
-0.05773063376545906,
-0.02634890377521515,
-0.08299881964921951,
0.02310962788760662,
0.020002737641334534,
0.0748848021030426,
-0.025510413572192192,
0.05559871718287468,
-0.0527077317237854,
0.05521956831216812,
0.102859266102314,
0.034644488245248795,
-0.059215348213911057,
0.0586630143225193,
-0.2292465716600418,
0.08570117503404617,
-0.06799479573965073,
0.06182733550667763,
-0.15809562802314758,
-0.02371748723089695,
0.03577778860926628,
0.0046258047223091125,
-0.005437619984149933,
0.1341741383075714,
-0.2101311832666397,
-0.02308833971619606,
0.16872304677963257,
-0.09519724547863007,
-0.07186034321784973,
0.051047924906015396,
-0.04659112170338631,
0.10240385681390762,
0.033361393958330154,
0.0017350923735648394,
0.061108484864234924,
-0.10985323041677475,
-0.011671986430883408,
-0.056128498166799545,
-0.0257779061794281,
0.1382409632205963,
0.07547777146100998,
-0.07926145941019058,
0.06575100868940353,
0.02194836176931858,
-0.020947322249412537,
-0.06538521498441696,
-0.018329601734876633,
-0.1007523387670517,
0.016174660995602608,
-0.06786375492811203,
0.010926736518740654,
-0.018007369711995125,
-0.09520112723112106,
-0.0293081384152174,
-0.16955751180648804,
-0.03186587616801262,
0.08120496571063995,
-0.0039502959698438644,
-0.014517090283334255,
-0.11174587905406952,
0.025241373106837273,
0.033142950385808945,
0.004078295081853867,
-0.13219983875751495,
-0.03811480104923248,
0.03414497151970863,
-0.14914745092391968,
0.03702550381422043,
-0.07292518764734268,
0.05196194723248482,
0.015052439644932747,
-0.02801859751343727,
-0.026396771892905235,
0.02275901474058628,
0.009010270237922668,
-0.016446102410554886,
-0.23674587905406952,
-0.02596096321940422,
-0.02948179468512535,
0.16272801160812378,
-0.2067878395318985,
0.03451593220233917,
0.08174202591180801,
0.15760692954063416,
0.003921739757061005,
-0.05103260651230812,
0.018777096644043922,
-0.07056388258934021,
-0.02478284202516079,
-0.05614311248064041,
0.0029366237577050924,
-0.018842201679944992,
-0.044184111058712006,
0.028629349544644356,
-0.17749272286891937,
-0.0478653647005558,
0.09850458055734634,
0.04734146222472191,
-0.1258343607187271,
-0.02524331584572792,
-0.03737637773156166,
-0.05151774734258652,
-0.0409732386469841,
-0.06301578134298325,
0.09911505877971649,
0.06279782205820084,
0.0382394976913929,
-0.06066882237792015,
-0.07980932295322418,
-0.004368528723716736,
-0.0154745914041996,
-0.024671558290719986,
0.09488890320062637,
0.07743741571903229,
-0.13006442785263062,
0.09399931877851486,
0.0845300555229187,
0.07865235209465027,
0.08858645707368851,
-0.02114405855536461,
-0.07441538572311401,
-0.03704451024532318,
0.037285782396793365,
0.019511960446834564,
0.12384209036827087,
-0.04054240137338638,
0.04314751178026199,
0.04097121208906174,
-0.027314281091094017,
0.0179463941603899,
-0.079317107796669,
0.03420396149158478,
0.022174343466758728,
-0.015434152446687222,
0.054325349628925323,
-0.037130847573280334,
0.019326936453580856,
0.08801604062318802,
0.05892675369977951,
0.04210146516561508,
0.015272390097379684,
-0.05278733745217323,
-0.11141471564769745,
0.15833337604999542,
-0.12401816993951797,
-0.21754354238510132,
-0.13233990967273712,
0.01126034650951624,
0.02784722112119198,
-0.014460757374763489,
0.004845886025577784,
-0.06044495850801468,
-0.10889866203069687,
-0.09166782349348068,
0.006674831733107567,
0.056104786694049835,
-0.08347801119089127,
-0.060731321573257446,
0.04706362634897232,
0.04077374190092087,
-0.14247475564479828,
0.020912135019898415,
0.04298027232289314,
-0.09156464785337448,
-0.011692359112203121,
0.07930251210927963,
0.07647769898176193,
0.18610550463199615,
0.021635528653860092,
-0.02018691971898079,
0.03035759925842285,
0.22015435993671417,
-0.1371978521347046,
0.11312834918498993,
0.13443483412265778,
-0.08657801896333694,
0.08107224106788635,
0.20937806367874146,
0.04128330200910568,
-0.09665821492671967,
0.030488910153508186,
0.030000556260347366,
-0.023369858041405678,
-0.2347419559955597,
-0.07046357542276382,
-0.0001642264542169869,
-0.06512240320444107,
0.07899942249059677,
0.09564073383808136,
0.07777895033359528,
0.017650140449404716,
-0.09577023237943649,
-0.09159492701292038,
0.059776052832603455,
0.10890021175146103,
0.014759724959731102,
-0.00850776955485344,
0.08850563317537308,
-0.03501661866903305,
0.014902367256581783,
0.08669138699769974,
0.005105787422508001,
0.15939538180828094,
0.04961811378598213,
0.1777653694152832,
0.08442499488592148,
0.07212188839912415,
0.0022480690386146307,
0.007672940380871296,
0.012716339901089668,
0.041776418685913086,
-0.00586339458823204,
-0.08367537707090378,
-0.025811415165662766,
0.11007143557071686,
0.06822896003723145,
0.01674085482954979,
0.013469134457409382,
-0.048928502947092056,
0.08716683089733124,
0.17909866571426392,
0.002402160782366991,
-0.18131543695926666,
-0.05789600685238838,
0.0750206708908081,
-0.09928116202354431,
-0.10272461175918579,
-0.008900126442313194,
0.01586969569325447,
-0.16637296974658966,
0.0353267602622509,
-0.020183570683002472,
0.10838212817907333,
-0.13442763686180115,
-0.017363429069519043,
0.07578733563423157,
0.0699351504445076,
-0.0022181386593729258,
0.05766081437468529,
-0.17974431812763214,
0.09947559982538223,
0.012100325897336006,
0.0704662874341011,
-0.09766525775194168,
0.09176629781723022,
-0.009300199337303638,
-0.030800525099039078,
0.1424117237329483,
-0.004621058702468872,
-0.0713675245642662,
-0.06180211901664734,
-0.09473362565040588,
-0.011214162223041058,
0.1266127973794937,
-0.1298455148935318,
0.09194545447826385,
-0.03333241492509842,
-0.03638272359967232,
-0.010627835057675838,
-0.08796215802431107,
-0.10934466868638992,
-0.17969419062137604,
0.0595061257481575,
-0.12724877893924713,
0.03739791736006737,
-0.1058867797255516,
-0.025679782032966614,
-0.025718556717038155,
0.1798698753118515,
-0.24040640890598297,
-0.07245399057865143,
-0.14371363818645477,
-0.09313122928142548,
0.13163676857948303,
-0.04657937213778496,
0.09096090495586395,
-0.016109757125377655,
0.16015022993087769,
0.02005946636199951,
-0.019326770678162575,
0.08680126816034317,
-0.08495312184095383,
-0.19593647122383118,
-0.07017680257558823,
0.16592541337013245,
0.11997250467538834,
0.03374743461608887,
0.0002775360771920532,
0.037539996206760406,
-0.020964371040463448,
-0.1182674989104271,
0.021811965852975845,
0.15264973044395447,
0.06795535981655121,
0.009478275664150715,
-0.02395930327475071,
-0.1102604866027832,
-0.0759085938334465,
-0.028979400172829628,
0.03204840421676636,
0.17042486369609833,
-0.07147930562496185,
0.17116157710552216,
0.14203238487243652,
-0.05904092639684677,
-0.20811264216899872,
-0.0021387117449194193,
0.026254096999764442,
-0.00908663496375084,
0.010671177878975868,
-0.18749478459358215,
0.08536157757043839,
-0.002555366139858961,
-0.054554641246795654,
0.1051621288061142,
-0.162221297621727,
-0.13799940049648285,
0.08243122696876526,
0.050192203372716904,
-0.187874898314476,
-0.13649384677410126,
-0.0961209312081337,
-0.0409083291888237,
-0.16012360155582428,
0.09442240744829178,
0.02092057839035988,
0.01192572433501482,
0.031539436429739,
0.014813697896897793,
0.024217691272497177,
-0.04841739311814308,
0.1757010519504547,
-0.018160223960876465,
0.021848153322935104,
-0.09544609487056732,
-0.08094970881938934,
0.017702657729387283,
-0.05052844062447548,
0.07107838243246078,
-0.01870221272110939,
0.011255724355578423,
-0.10348273068666458,
-0.036123502999544144,
-0.042589638382196426,
0.015495449304580688,
-0.09958311915397644,
-0.08601320534944534,
-0.047096606343984604,
0.09428968280553818,
0.09659279137849808,
-0.022826362401247025,
-0.0273736622184515,
-0.07949339598417282,
0.054964661598205566,
0.20859195291996002,
0.18808089196681976,
0.04425831884145737,
-0.062306828796863556,
-0.001769690657965839,
-0.014944184571504593,
0.0417049303650856,
-0.19710716605186462,
0.05912362411618233,
0.0562397725880146,
0.02090633660554886,
0.10401801019906998,
-0.019544212147593498,
-0.15806327760219574,
-0.07712605595588684,
0.06873630732297897,
-0.0639798492193222,
-0.2012440711259842,
0.010393884032964706,
0.05963889881968498,
-0.1750810444355011,
-0.03916953504085541,
0.046524349600076675,
-0.002354544820263982,
-0.039746832102537155,
0.023544801399111748,
0.09521713852882385,
0.004076723475009203,
0.0779079720377922,
0.07060851901769638,
0.08211179822683334,
-0.09903982281684875,
0.08303046971559525,
0.0972437709569931,
-0.07399678975343704,
0.029093235731124878,
0.10163311660289764,
-0.05663156509399414,
-0.0387692004442215,
0.03549791872501373,
0.08046255260705948,
0.02705121599137783,
-0.043615687638521194,
0.011865747161209583,
-0.09553902596235275,
0.06658727675676346,
0.10204542428255081,
0.029401643201708794,
0.017935337498784065,
0.044615767896175385,
0.04629816859960556,
-0.0760611966252327,
0.12436296790838242,
0.03220634162425995,
0.015023278072476387,
-0.04068984091281891,
-0.0437593087553978,
0.0095688970759511,
-0.031515296548604965,
-0.004716753493994474,
-0.02209470048546791,
-0.08724912256002426,
-0.015015563927590847,
-0.13236366212368011,
-0.008743971586227417,
-0.06157073378562927,
0.012826182879507542,
0.028436195105314255,
-0.031039604917168617,
0.0064733498729765415,
0.004978197161108255,
-0.0703253373503685,
-0.06705103069543839,
-0.014137581922113895,
0.09520672261714935,
-0.16576512157917023,
0.026662830263376236,
0.08306188136339188,
-0.11256496608257294,
0.10015136748552322,
0.01094723865389824,
-0.006087163463234901,
0.022484684363007545,
-0.1469746232032776,
0.036669302731752396,
-0.03779744729399681,
0.008942550048232079,
0.02391820214688778,
-0.20087900757789612,
0.0007856925949454308,
-0.033994678407907486,
-0.0705132707953453,
-0.009060341864824295,
-0.025304662063717842,
-0.11176600307226181,
0.10632134974002838,
0.0007318496936932206,
-0.08119626343250275,
-0.02982376329600811,
0.03208870068192482,
0.07710310071706772,
-0.028534123674035072,
0.15100301802158356,
-0.012797199189662933,
0.06572870165109634,
-0.1587447077035904,
-0.01154392957687378,
-0.0113789988681674,
0.015301055274903774,
-0.03572189435362816,
-0.007380445022135973,
0.050975777208805084,
-0.01390343438833952,
0.17424729466438293,
-0.03530619665980339,
0.01664605177938938,
0.0663379579782486,
0.04406430572271347,
-0.03276953473687172,
0.09988830238580704,
0.04999752715229988,
0.017141716554760933,
0.009620473720133305,
0.013025123625993729,
-0.04225568845868111,
-0.036079514771699905,
-0.19024775922298431,
0.06989775598049164,
0.1866932511329651,
0.09840615093708038,
-0.020935317501425743,
0.07415685057640076,
-0.10040472447872162,
-0.09374922513961792,
0.15041497349739075,
-0.037143923342227936,
-0.006523944437503815,
-0.07367441058158875,
0.13029973208904266,
0.14549610018730164,
-0.1814785897731781,
0.06606297194957733,
-0.07180570065975189,
-0.04206360504031181,
-0.10935494303703308,
-0.19550631940364838,
-0.06178471818566322,
-0.048242729157209396,
-0.01886189728975296,
-0.0473010279238224,
0.06674393266439438,
0.060638632625341415,
0.0006839959532953799,
-0.008536890149116516,
0.06868739426136017,
-0.03323382884263992,
-0.0032462377566844225,
0.028674080967903137,
0.0611579604446888,
0.00929503608494997,
-0.03787853941321373,
0.017562363296747208,
-0.012991433031857014,
0.0538424551486969,
0.07582131028175354,
0.050257325172424316,
-0.027161642909049988,
0.01914774253964424,
-0.040576957166194916,
-0.10661055147647858,
0.04927254468202591,
-0.0285146776586771,
-0.07324735075235367,
0.15458324551582336,
0.020231405273079872,
0.0046523697674274445,
-0.012391991913318634,
0.23933540284633636,
-0.06322796642780304,
-0.1032072901725769,
-0.1446000337600708,
0.0738082081079483,
-0.04255184531211853,
0.04970810189843178,
0.03866847604513168,
-0.11122773587703705,
0.025612516328692436,
0.1497742384672165,
0.15382428467273712,
-0.041335951536893845,
0.022738350555300713,
0.03465048223733902,
0.008969487622380257,
-0.023746641352772713,
0.0378718264400959,
0.06464387476444244,
0.15299609303474426,
-0.047938305884599686,
0.07862778007984161,
0.0008240414317697287,
-0.08734137564897537,
-0.03727345913648605,
0.11393242329359055,
-0.011411378160119057,
0.014004142954945564,
-0.058753952383995056,
0.11869291216135025,
-0.0715123638510704,
-0.21872852742671967,
0.04109719768166542,
-0.07026688009500504,
-0.1331670731306076,
-0.023679273203015327,
0.07693633437156677,
-0.012184051796793938,
0.02233847789466381,
0.07813236862421036,
-0.07187089323997498,
0.18990160524845123,
0.039505671709775925,
-0.05827925726771355,
-0.05084679275751114,
0.07384473830461502,
-0.07778072357177734,
0.29490548372268677,
0.015166562050580978,
0.04005192592740059,
0.11093740910291672,
-0.0153964227065444,
-0.1418374478816986,
0.023329483345150948,
0.09676562994718552,
-0.09851957112550735,
0.052737001329660416,
0.17991116642951965,
0.0022530595306307077,
0.1274615228176117,
0.07713494449853897,
-0.08902951329946518,
0.04603179916739464,
-0.07316387444734573,
-0.07014941424131393,
-0.09849017858505249,
0.10400687903165817,
-0.08915241807699203,
0.1447027027606964,
0.121854268014431,
-0.056043028831481934,
0.011541114188730717,
-0.03429169952869415,
0.046591032296419144,
-0.003980774898082018,
0.12048541754484177,
0.010917561128735542,
-0.19038154184818268,
0.02617969550192356,
-0.02745375595986843,
0.10133756697177887,
-0.16829264163970947,
-0.08677268028259277,
0.04525073990225792,
0.01023704931139946,
-0.07288476079702377,
0.1258174479007721,
0.05900004133582115,
0.029306763783097267,
-0.04840981587767601,
-0.022562798112630844,
-0.010515614412724972,
0.14134228229522705,
-0.10410626232624054,
-0.004606697708368301
] |
null | null | transformers |
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed to the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases, and limitations of the model. More information is needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
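Since the card does not yet document usage, the snippet below is only a minimal sketch under stated assumptions: it assumes this repository holds a PEFT/LoRA adapter trained on top of the `facebook/dinov2-base` backbone and that the adapter is loaded with the `peft` library. The base checkpoint, the pooled-feature usage, and the input file name are assumptions rather than details taken from this card.

```python
# Minimal sketch, not an official usage example for this repository.
# Assumptions: the repo holds a LoRA adapter for facebook/dinov2-base,
# loaded with PEFT; the image path below is a placeholder.
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModel
from peft import PeftModel

base_id = "facebook/dinov2-base"  # assumed base checkpoint
adapter_id = "tyemel/dinov2-base-finetuned-lora-dino_genre_augmentation_wfc2"

processor = AutoImageProcessor.from_pretrained(base_id)
backbone = AutoModel.from_pretrained(base_id)
model = PeftModel.from_pretrained(backbone, adapter_id)  # attach the LoRA weights
model.eval()

image = Image.open("example.jpg")  # placeholder input image
inputs = processor(images=image, return_tensors="pt")
with torch.no_grad():
    features = model(**inputs).pooler_output  # one embedding vector per image
print(features.shape)
```

If the adapter was instead trained with a task-specific head (for example, image classification), load the matching `AutoModelFor...` class as the base model before attaching the adapter.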
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed]
| {"library_name": "transformers", "tags": []} | null | tyemel/dinov2-base-finetuned-lora-dino_genre_augmentation_wfc2 | [
"transformers",
"safetensors",
"arxiv:1910.09700",
"endpoints_compatible",
"region:us"
] | 2024-02-14T15:16:21+00:00 | [
"1910.09700"
] | [] | TAGS
#transformers #safetensors #arxiv-1910.09700 #endpoints_compatible #region-us
|
# Model Card for Model ID
## Model Details
### Model Description
This is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.
- Developed by:
- Funded by [optional]:
- Shared by [optional]:
- Model type:
- Language(s) (NLP):
- License:
- Finetuned from model [optional]:
### Model Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Downstream Use [optional]
### Out-of-Scope Use
## Bias, Risks, and Limitations
### Recommendations
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
## Training Details
### Training Data
### Training Procedure
#### Preprocessing [optional]
#### Training Hyperparameters
- Training regime:
#### Speeds, Sizes, Times [optional]
## Evaluation
### Testing Data, Factors & Metrics
#### Testing Data
#### Factors
#### Metrics
### Results
#### Summary
## Model Examination [optional]
## Environmental Impact
Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).
- Hardware Type:
- Hours used:
- Cloud Provider:
- Compute Region:
- Carbon Emitted:
## Technical Specifications [optional]
### Model Architecture and Objective
### Compute Infrastructure
#### Hardware
#### Software
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Model Card Authors [optional]
## Model Card Contact
| [
"# Model Card for Model ID",
"## Model Details",
"### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:",
"### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Downstream Use [optional]",
"### Out-of-Scope Use",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.",
"## How to Get Started with the Model\n\nUse the code below to get started with the model.",
"## Training Details",
"### Training Data",
"### Training Procedure",
"#### Preprocessing [optional]",
"#### Training Hyperparameters\n\n- Training regime:",
"#### Speeds, Sizes, Times [optional]",
"## Evaluation",
"### Testing Data, Factors & Metrics",
"#### Testing Data",
"#### Factors",
"#### Metrics",
"### Results",
"#### Summary",
"## Model Examination [optional]",
"## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:",
"## Technical Specifications [optional]",
"### Model Architecture and Objective",
"### Compute Infrastructure",
"#### Hardware",
"#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Model Card Authors [optional]",
"## Model Card Contact"
] | [
"TAGS\n#transformers #safetensors #arxiv-1910.09700 #endpoints_compatible #region-us \n",
"# Model Card for Model ID",
"## Model Details",
"### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:",
"### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Downstream Use [optional]",
"### Out-of-Scope Use",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.",
"## How to Get Started with the Model\n\nUse the code below to get started with the model.",
"## Training Details",
"### Training Data",
"### Training Procedure",
"#### Preprocessing [optional]",
"#### Training Hyperparameters\n\n- Training regime:",
"#### Speeds, Sizes, Times [optional]",
"## Evaluation",
"### Testing Data, Factors & Metrics",
"#### Testing Data",
"#### Factors",
"#### Metrics",
"### Results",
"#### Summary",
"## Model Examination [optional]",
"## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:",
"## Technical Specifications [optional]",
"### Model Architecture and Objective",
"### Compute Infrastructure",
"#### Hardware",
"#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Model Card Authors [optional]",
"## Model Card Contact"
] | [
31,
6,
3,
82,
28,
3,
4,
9,
9,
10,
42,
20,
3,
4,
5,
9,
11,
13,
3,
12,
5,
4,
5,
3,
4,
9,
53,
9,
8,
6,
3,
14,
8,
7,
9,
4
] | [
"passage: TAGS\n#transformers #safetensors #arxiv-1910.09700 #endpoints_compatible #region-us \n# Model Card for Model ID## Model Details### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Downstream Use [optional]### Out-of-Scope Use## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.## How to Get Started with the Model\n\nUse the code below to get started with the model.## Training Details### Training Data### Training Procedure#### Preprocessing [optional]#### Training Hyperparameters\n\n- Training regime:#### Speeds, Sizes, Times [optional]## Evaluation### Testing Data, Factors & Metrics#### Testing Data#### Factors#### Metrics### Results#### Summary## Model Examination [optional]## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:## Technical Specifications [optional]### Model Architecture and Objective### Compute Infrastructure#### Hardware#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Model Card Authors [optional]## Model Card Contact"
] | [
-0.06646376848220825,
0.2168014943599701,
-0.00225935154594481,
0.023818302899599075,
0.1271018385887146,
-0.001635765191167593,
0.04218708351254463,
0.13324736058712006,
-0.020175931975245476,
0.11144465953111649,
0.046588581055402756,
0.09377603232860565,
0.09928803145885468,
0.18404334783554077,
0.04859916493296623,
-0.2059975117444992,
0.007056170143187046,
-0.09090408682823181,
0.014076028019189835,
0.1116579994559288,
0.13719257712364197,
-0.10291384905576706,
0.08272874355316162,
-0.04045208916068077,
-0.02019004337489605,
0.00012576708104461432,
-0.09259183704853058,
-0.07032395154237747,
0.06885425746440887,
0.06264153122901917,
0.051234472543001175,
0.001456156256608665,
0.09140396863222122,
-0.2864592671394348,
0.017265573143959045,
0.08406311273574829,
0.0027674848679453135,
0.06290827691555023,
0.07236549258232117,
-0.07389893382787704,
0.11328595131635666,
-0.08021481335163116,
0.13019037246704102,
0.08625296503305435,
-0.062064990401268005,
-0.23071379959583282,
-0.07525765895843506,
0.0963398814201355,
0.12251301854848862,
0.06215599179267883,
-0.022921854630112648,
0.15455181896686554,
-0.06248689442873001,
0.012971068732440472,
0.1294165402650833,
-0.11526761949062347,
-0.05572471022605896,
0.061741601675748825,
0.11775490641593933,
0.10740239918231964,
-0.14110268652439117,
-0.0017287094378843904,
0.04900608956813812,
0.029121357947587967,
0.08589313924312592,
0.022661056369543076,
0.12003941088914871,
0.04652795568108559,
-0.13695219159126282,
-0.04037507623434067,
0.12011898308992386,
0.038862764835357666,
-0.06446044892072678,
-0.2168138176202774,
-0.006778308190405369,
-0.0601806715130806,
-0.014732478186488152,
-0.07019448280334473,
0.039128515869379044,
-0.02470310963690281,
0.07317749410867691,
-0.04465159401297569,
-0.1063927412033081,
-0.0421026237308979,
0.0892222449183464,
0.07748593389987946,
0.011527054943144321,
-0.02519804798066616,
0.04627908393740654,
0.13455867767333984,
0.05402068421244621,
-0.10399353504180908,
-0.07017925381660461,
-0.06942764669656754,
-0.09420394152402878,
-0.04035796597599983,
0.056760527193546295,
0.031942449510097504,
0.02665667235851288,
0.22703726589679718,
0.016653569415211678,
0.04155244305729866,
0.0224777739495039,
0.01032855175435543,
0.043662428855895996,
0.0955500528216362,
-0.05303520709276199,
-0.15660029649734497,
-0.04072032496333122,
0.09077946096658707,
-0.0027527001220732927,
-0.036689214408397675,
-0.03966725245118141,
0.03849169611930847,
0.06843466311693192,
0.13122352957725525,
0.07552056759595871,
-0.017929591238498688,
-0.04813180863857269,
-0.030096933245658875,
0.23523783683776855,
-0.1493375599384308,
0.04426715523004532,
-0.02271856553852558,
-0.01804111897945404,
-0.03908449783921242,
0.03597262129187584,
0.022118929773569107,
-0.000004518366949923802,
0.09706240892410278,
-0.058981191366910934,
-0.05378659814596176,
-0.10168042778968811,
-0.03272576630115509,
0.04088849574327469,
-0.013975566253066063,
-0.010589460842311382,
-0.09025166928768158,
-0.09490354359149933,
-0.04766594246029854,
0.05537205561995506,
-0.05123869329690933,
-0.03770573064684868,
0.009465423412621021,
-0.08151785284280777,
-0.005444355774670839,
-0.005417742300778627,
0.10699385404586792,
-0.03222226724028587,
0.04445803165435791,
-0.027600755915045738,
0.05225523188710213,
0.09919606149196625,
0.031576547771692276,
-0.0773419588804245,
0.0561848059296608,
-0.22559374570846558,
0.07503069192171097,
-0.11481974273920059,
0.04335082694888115,
-0.1704932004213333,
-0.042439818382263184,
0.005444696638733149,
0.0139949731528759,
0.013206101022660732,
0.12720820307731628,
-0.19255615770816803,
-0.01654396951198578,
0.13260798156261444,
-0.09212633967399597,
-0.118110790848732,
0.07884611934423447,
-0.029701577499508858,
0.1624738723039627,
0.04682036489248276,
-0.027025915682315826,
0.09224298596382141,
-0.16434773802757263,
-0.07092688232660294,
-0.00949116237461567,
-0.01727987825870514,
0.12109188735485077,
0.07512219995260239,
-0.05991523340344429,
0.046571120619773865,
0.02832140028476715,
-0.038078423589468,
-0.04424772411584854,
-0.050857074558734894,
-0.10884185880422592,
-0.01070026308298111,
-0.08987759798765182,
0.04065500199794769,
-0.01250192429870367,
-0.07916021347045898,
-0.029885273426771164,
-0.18612512946128845,
-0.0030564051121473312,
0.10038342326879501,
0.0035033065360039473,
-0.005652366206049919,
-0.08666291832923889,
0.026358824223279953,
-0.03112892620265484,
-0.008404186926782131,
-0.16764774918556213,
-0.04399421438574791,
0.046902090311050415,
-0.16094985604286194,
0.020117372274398804,
-0.06413903087377548,
0.06334125250577927,
0.03641495108604431,
-0.05590536445379257,
-0.0248766727745533,
-0.01730942726135254,
0.011945613659918308,
-0.05083848536014557,
-0.18994836509227753,
-0.056277405470609665,
-0.037882111966609955,
0.149809330701828,
-0.25956398248672485,
0.032966937869787216,
0.051140617579221725,
0.14649195969104767,
0.00406361510977149,
-0.05115427449345589,
0.01429014839231968,
-0.05360214412212372,
-0.054652128368616104,
-0.06746816635131836,
-0.006135428790003061,
-0.027576493099331856,
-0.05147203803062439,
0.019243421033024788,
-0.1755700707435608,
-0.021410830318927765,
0.09424154460430145,
0.12876708805561066,
-0.1486445665359497,
-0.018640631809830666,
-0.048725154250860214,
-0.06339836865663528,
-0.0715010017156601,
-0.07038594037294388,
0.10712739825248718,
0.0513901449739933,
0.04796046018600464,
-0.07435787469148636,
-0.07092321664094925,
0.02726263552904129,
0.006906150374561548,
-0.03382374346256256,
0.08727246522903442,
0.05199531093239784,
-0.09209315478801727,
0.0756213590502739,
0.1092359870672226,
0.07177663594484329,
0.09363535046577454,
0.01574566215276718,
-0.11756632477045059,
-0.028492970392107964,
0.036266472190618515,
0.02740776725113392,
0.1465986967086792,
-0.05952361226081848,
0.04016614332795143,
0.04494241625070572,
-0.04170418903231621,
0.022319864481687546,
-0.08787637203931808,
0.024075502529740334,
0.025203049182891846,
-0.0034381982404738665,
0.06284574419260025,
-0.02525499276816845,
-0.0050758360885083675,
0.07016654312610626,
0.047779910266399384,
0.04621000960469246,
0.009655474685132504,
-0.01720241829752922,
-0.1047825813293457,
0.16950392723083496,
-0.0951867327094078,
-0.269941508769989,
-0.17632324993610382,
0.026197833940386772,
0.04035249724984169,
-0.022378476336598396,
0.031619444489479065,
-0.07056326419115067,
-0.10630585998296738,
-0.1060405746102333,
-0.002429972169920802,
0.01714223250746727,
-0.06364088505506516,
-0.0741225928068161,
0.07348573952913284,
0.04382912442088127,
-0.14902326464653015,
0.038552410900592804,
0.055694397538900375,
-0.057955220341682434,
-0.0233661737293005,
0.09118817001581192,
0.12397737801074982,
0.14583967626094818,
-0.021366750821471214,
-0.028626007959246635,
0.029004426673054695,
0.19620531797409058,
-0.13469526171684265,
0.10371150821447372,
0.13814030587673187,
-0.04545360431075096,
0.08360563963651657,
0.1560150384902954,
0.029186224564909935,
-0.08317049592733383,
0.05044832453131676,
0.04082648828625679,
-0.043159641325473785,
-0.2666129767894745,
-0.0534592866897583,
0.012832709588110447,
-0.06255637854337692,
0.09786593168973923,
0.10183793306350708,
0.11542957276105881,
0.034910861402750015,
-0.07166364789009094,
-0.043925940990448,
-0.0058974819257855415,
0.11737963557243347,
-0.05490213260054588,
-0.012639665976166725,
0.07686592638492584,
-0.05086168646812439,
0.005355054512619972,
0.10266812145709991,
0.02973790094256401,
0.17442677915096283,
0.020399179309606552,
0.11231429129838943,
0.06195578724145889,
0.08633565157651901,
0.0007386076031252742,
0.02951662428677082,
0.05147615820169449,
0.017203815281391144,
-0.002300140680745244,
-0.10421168059110641,
-0.006156572140753269,
0.1449710875749588,
0.028103826567530632,
0.029669636860489845,
-0.0018948549404740334,
-0.005003341939300299,
0.05121048167347908,
0.1746254414319992,
-0.011592294089496136,
-0.22072425484657288,
-0.0845772922039032,
0.06936841458082199,
-0.06218599155545235,
-0.12968985736370087,
-0.026130788028240204,
0.045467354357242584,
-0.17519839107990265,
0.026703642681241035,
-0.027433741837739944,
0.0919293761253357,
-0.09345759451389313,
-0.02221956104040146,
0.03687324374914169,
0.084866963326931,
-0.014529162086546421,
0.08703910559415817,
-0.14498743414878845,
0.11886418610811234,
0.02978132851421833,
0.09024628251791,
-0.11081171780824661,
0.07909037172794342,
-0.007550720125436783,
0.009180475026369095,
0.19379350543022156,
-0.011335089802742004,
-0.03514958545565605,
-0.08774717897176743,
-0.11210042238235474,
-0.013537433929741383,
0.12687496840953827,
-0.1243172138929367,
0.08773399889469147,
-0.015198243781924248,
-0.044079482555389404,
0.00937260314822197,
-0.12100647389888763,
-0.17273177206516266,
-0.19628387689590454,
0.05585884302854538,
-0.09575839340686798,
0.025643249973654747,
-0.11914430558681488,
-0.07089093327522278,
-0.02952558360993862,
0.241120383143425,
-0.1745356321334839,
-0.06510113179683685,
-0.1468164622783661,
-0.046294767409563065,
0.1662203073501587,
-0.04437198117375374,
0.0718095526099205,
-0.0208172257989645,
0.20345525443553925,
0.005988610442727804,
-0.004939318168908358,
0.06724198162555695,
-0.08892562240362167,
-0.16873881220817566,
-0.06771010160446167,
0.1510489284992218,
0.11680185794830322,
0.04907919466495514,
-0.002248800592496991,
0.0011772146681323647,
-0.016943959519267082,
-0.1137804463505745,
-0.0033210667315870523,
0.16037839651107788,
0.03878779336810112,
0.025986969470977783,
-0.05243593826889992,
-0.08797456324100494,
-0.06899320334196091,
-0.06853509694337845,
0.06221301481127739,
0.19590823352336884,
-0.10376439243555069,
0.1700313836336136,
0.147536963224411,
-0.07305635511875153,
-0.23175598680973053,
0.035342130810022354,
0.04983805492520332,
0.0014306638622656465,
0.04886869341135025,
-0.18252557516098022,
0.10521943867206573,
0.019543392583727837,
-0.05505957826972008,
0.13485197722911835,
-0.1557481735944748,
-0.1552847921848297,
0.0722852572798729,
0.03904085233807564,
-0.22423844039440155,
-0.1354004591703415,
-0.09622503817081451,
-0.05825018882751465,
-0.14065024256706238,
0.06054598465561867,
-0.002136280992999673,
0.015948504209518433,
0.03500790148973465,
-0.0015643214574083686,
0.027123261243104935,
-0.058935679495334625,
0.18609118461608887,
-0.004065449349582195,
0.020676052197813988,
-0.060264769941568375,
-0.0478842556476593,
0.09839435666799545,
-0.06130504235625267,
0.12208222597837448,
0.004057085141539574,
0.01594383642077446,
-0.10362856835126877,
-0.048314861953258514,
-0.04328322783112526,
0.05154227837920189,
-0.07548051327466965,
-0.10070807486772537,
-0.043625857681035995,
0.08841723203659058,
0.07005169242620468,
-0.03383097052574158,
0.00549331633374095,
-0.07189501076936722,
0.10019614547491074,
0.17795267701148987,
0.17573626339435577,
0.009926567785441875,
-0.07241068035364151,
0.01677953451871872,
-0.04142116755247116,
0.044231921434402466,
-0.2513144314289093,
0.03756171092391014,
0.06098250672221184,
0.029438555240631104,
0.09217222779989243,
-0.020435843616724014,
-0.1820858269929886,
-0.04050002992153168,
0.08094815909862518,
-0.05452597141265869,
-0.22617179155349731,
-0.019085140898823738,
0.0954197570681572,
-0.2020406424999237,
-0.007372708059847355,
0.03995226323604584,
-0.048725228756666183,
-0.023169852793216705,
0.00010950004070764408,
0.06317184865474701,
0.002471912419423461,
0.09773622453212738,
0.0735151618719101,
0.09715340286493301,
-0.08337292820215225,
0.10562895983457565,
0.10150538384914398,
-0.09572599828243256,
0.03605884686112404,
0.06754924356937408,
-0.05300498008728027,
-0.043293699622154236,
0.03665391728281975,
0.033023297786712646,
0.005234600510448217,
-0.060321882367134094,
0.013913018628954887,
-0.036497246474027634,
0.044923391193151474,
0.08326134830713272,
0.03754979372024536,
-0.013354414142668247,
0.06462216377258301,
0.03401726484298706,
-0.10898099094629288,
0.10366570204496384,
0.01731540448963642,
0.04105307161808014,
-0.08384523540735245,
-0.019968897104263306,
0.035425446927547455,
0.030576206743717194,
-0.01765924133360386,
-0.02306121215224266,
-0.02860277332365513,
-0.01614218018949032,
-0.14299540221691132,
-0.023106401786208153,
-0.07243485748767853,
0.006181265693157911,
0.014656842686235905,
-0.031884219497442245,
-0.011233693920075893,
0.02475680410861969,
-0.06979699432849884,
-0.07426341623067856,
-0.006949664559215307,
0.09833318740129471,
-0.15115703642368317,
0.008848577737808228,
0.06907843053340912,
-0.11088496446609497,
0.08190931379795074,
-0.008411259390413761,
0.016245156526565552,
0.022527478635311127,
-0.15448406338691711,
0.05601610988378525,
0.0008648968650959432,
0.01916889287531376,
0.025886621326208115,
-0.16471809148788452,
0.004104440100491047,
-0.04661374166607857,
-0.02149827405810356,
-0.00004464812809601426,
-0.02647159807384014,
-0.12325995415449142,
0.06858719140291214,
-0.015622655861079693,
-0.035931166261434555,
-0.02701525390148163,
0.0539589487016201,
0.07888586074113846,
-0.027474910020828247,
0.10445091128349304,
-0.008690856397151947,
0.04941811040043831,
-0.16801609098911285,
-0.02470702864229679,
-0.04982255399227142,
0.019377702847123146,
0.009884213097393513,
-0.007693959400057793,
0.04183054715394974,
-0.00976533442735672,
0.21883612871170044,
-0.05075952783226967,
0.1607085019350052,
0.05847611650824547,
-0.017352959141135216,
-0.0007513365126214921,
0.06180921941995621,
0.05997028574347496,
0.04658793285489082,
0.009480604901909828,
0.023740366101264954,
-0.022450892254710197,
-0.006695089396089315,
-0.15932634472846985,
0.01890849508345127,
0.14999441802501678,
0.06301083415746689,
0.024745315313339233,
0.05866100639104843,
-0.12775006890296936,
-0.12135478109121323,
0.09311001747846603,
-0.026755332946777344,
0.00928465835750103,
-0.08245618641376495,
0.1358020007610321,
0.14980104565620422,
-0.14000412821769714,
0.05256148427724838,
-0.06134212389588356,
-0.05217423290014267,
-0.10388828068971634,
-0.12032219022512436,
-0.05887215584516525,
-0.053666237741708755,
0.002330566756427288,
-0.03760887682437897,
0.054546963423490524,
0.03344334661960602,
-0.009351172484457493,
-0.00022941511997487396,
0.13597318530082703,
-0.019751882180571556,
-0.0028988157864660025,
0.048313532024621964,
0.03693558648228645,
0.02373051457107067,
-0.05275435373187065,
0.02940409444272518,
0.02539868652820587,
0.032232340425252914,
0.06546790152788162,
0.033412106335163116,
-0.047448933124542236,
0.03804153576493263,
-0.0025254099164158106,
-0.11207924783229828,
0.019641218706965446,
-0.00460948096588254,
-0.0742158442735672,
0.1268945336341858,
0.0407399944961071,
0.010224059224128723,
-0.03741471841931343,
0.24361543357372284,
-0.06653323769569397,
-0.06378097087144852,
-0.13251738250255585,
0.10491154342889786,
-0.0027236645109951496,
0.06476365029811859,
0.023412218317389488,
-0.1284150779247284,
0.005243356805294752,
0.13858191668987274,
0.12181595712900162,
0.0045748427510261536,
0.009228081442415714,
0.0518609918653965,
0.0025186820421367884,
-0.06998204439878464,
0.054019294679164886,
0.06992026418447495,
0.12919506430625916,
-0.07847554981708527,
0.07680778950452805,
0.0006860480643808842,
-0.08370215445756912,
-0.02947772853076458,
0.11312682181596756,
-0.0409729965031147,
0.03491825982928276,
-0.047444481402635574,
0.10916327685117722,
-0.05787910893559456,
-0.29412412643432617,
0.02350960113108158,
-0.09588567912578583,
-0.15202060341835022,
-0.018367812037467957,
0.05944539234042168,
-0.02624768204987049,
0.018029648810625076,
0.06971040368080139,
-0.06011629104614258,
0.20098382234573364,
0.0335683599114418,
-0.07864278554916382,
-0.0664360448718071,
0.04837050288915634,
-0.06564252078533173,
0.2949807047843933,
0.008418165147304535,
0.02863333560526371,
0.10770907253026962,
-0.03253700211644173,
-0.18271861970424652,
0.010723991319537163,
0.1133992001414299,
-0.08056149631738663,
0.08200647681951523,
0.19000613689422607,
-0.012578671798110008,
0.1209007054567337,
0.05294662341475487,
-0.047376248985528946,
0.04217283055186272,
-0.03389401361346245,
-0.051268599927425385,
-0.10752558708190918,
0.058453381061553955,
-0.05909625440835953,
0.15447644889354706,
0.10152646154165268,
-0.05671518296003342,
-0.004550917539745569,
-0.05555408447980881,
0.04875178262591362,
0.01804669201374054,
0.12263146042823792,
0.02951994352042675,
-0.1865430772304535,
0.032826557755470276,
-0.01144319772720337,
0.10186848044395447,
-0.25588861107826233,
-0.08421015739440918,
0.08833149075508118,
-0.011924264021217823,
-0.05105875805020332,
0.10560628771781921,
0.057650718837976456,
0.04243382066488266,
-0.043439045548439026,
-0.10480839014053345,
-0.02186836116015911,
0.14663739502429962,
-0.1469624787569046,
-0.025013303384184837
] |
null | null | transformers |
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed to the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases, and limitations of the model. More information is needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
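Since the card does not yet document usage, the snippet below is only a minimal sketch under stated assumptions: it assumes this repository holds a PEFT prompt-tuning adapter for the `bigscience/bloomz-560m` causal language model, as the repository name suggests. The base checkpoint and the example prompt are assumptions rather than details taken from this card.

```python
# Minimal sketch, not an official usage example for this repository.
# Assumptions: the repo holds a PEFT prompt-tuning adapter for
# bigscience/bloomz-560m; the prompt below is a placeholder.
import torch
from transformers import AutoTokenizer, AutoModelForCausalLM
from peft import PeftModel

base_id = "bigscience/bloomz-560m"  # assumed base checkpoint
adapter_id = "grasool/bloomz-560m_PROMPT_TUNING_CAUSAL_LM"

tokenizer = AutoTokenizer.from_pretrained(base_id)
base_model = AutoModelForCausalLM.from_pretrained(base_id)
model = PeftModel.from_pretrained(base_model, adapter_id)  # prepends the learned soft prompt
model.eval()

inputs = tokenizer("Tweet text: I love this movie! Label:", return_tensors="pt")  # placeholder prompt
with torch.no_grad():
    outputs = model.generate(**inputs, max_new_tokens=10)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

During generation, PEFT prepends the trained virtual tokens to the input embeddings, so the frozen base model weights are left unchanged.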
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed]
| {"library_name": "transformers", "tags": []} | null | grasool/bloomz-560m_PROMPT_TUNING_CAUSAL_LM | [
"transformers",
"safetensors",
"arxiv:1910.09700",
"endpoints_compatible",
"region:us"
] | 2024-02-14T15:18:46+00:00 | [
"1910.09700"
] | [] | TAGS
#transformers #safetensors #arxiv-1910.09700 #endpoints_compatible #region-us
|
# Model Card for Model ID
## Model Details
### Model Description
This is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.
- Developed by:
- Funded by [optional]:
- Shared by [optional]:
- Model type:
- Language(s) (NLP):
- License:
- Finetuned from model [optional]:
### Model Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Downstream Use [optional]
### Out-of-Scope Use
## Bias, Risks, and Limitations
### Recommendations
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
## Training Details
### Training Data
### Training Procedure
#### Preprocessing [optional]
#### Training Hyperparameters
- Training regime:
#### Speeds, Sizes, Times [optional]
## Evaluation
### Testing Data, Factors & Metrics
#### Testing Data
#### Factors
#### Metrics
### Results
#### Summary
## Model Examination [optional]
## Environmental Impact
Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).
- Hardware Type:
- Hours used:
- Cloud Provider:
- Compute Region:
- Carbon Emitted:
## Technical Specifications [optional]
### Model Architecture and Objective
### Compute Infrastructure
#### Hardware
#### Software
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Model Card Authors [optional]
## Model Card Contact
| [
"# Model Card for Model ID",
"## Model Details",
"### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:",
"### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Downstream Use [optional]",
"### Out-of-Scope Use",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.",
"## How to Get Started with the Model\n\nUse the code below to get started with the model.",
"## Training Details",
"### Training Data",
"### Training Procedure",
"#### Preprocessing [optional]",
"#### Training Hyperparameters\n\n- Training regime:",
"#### Speeds, Sizes, Times [optional]",
"## Evaluation",
"### Testing Data, Factors & Metrics",
"#### Testing Data",
"#### Factors",
"#### Metrics",
"### Results",
"#### Summary",
"## Model Examination [optional]",
"## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:",
"## Technical Specifications [optional]",
"### Model Architecture and Objective",
"### Compute Infrastructure",
"#### Hardware",
"#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Model Card Authors [optional]",
"## Model Card Contact"
] | [
"TAGS\n#transformers #safetensors #arxiv-1910.09700 #endpoints_compatible #region-us \n",
"# Model Card for Model ID",
"## Model Details",
"### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:",
"### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Downstream Use [optional]",
"### Out-of-Scope Use",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.",
"## How to Get Started with the Model\n\nUse the code below to get started with the model.",
"## Training Details",
"### Training Data",
"### Training Procedure",
"#### Preprocessing [optional]",
"#### Training Hyperparameters\n\n- Training regime:",
"#### Speeds, Sizes, Times [optional]",
"## Evaluation",
"### Testing Data, Factors & Metrics",
"#### Testing Data",
"#### Factors",
"#### Metrics",
"### Results",
"#### Summary",
"## Model Examination [optional]",
"## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:",
"## Technical Specifications [optional]",
"### Model Architecture and Objective",
"### Compute Infrastructure",
"#### Hardware",
"#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Model Card Authors [optional]",
"## Model Card Contact"
] | [
31,
6,
3,
82,
28,
3,
4,
9,
9,
10,
42,
20,
3,
4,
5,
9,
11,
13,
3,
12,
5,
4,
5,
3,
4,
9,
53,
9,
8,
6,
3,
14,
8,
7,
9,
4
] | [
"passage: TAGS\n#transformers #safetensors #arxiv-1910.09700 #endpoints_compatible #region-us \n# Model Card for Model ID## Model Details### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Downstream Use [optional]### Out-of-Scope Use## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.## How to Get Started with the Model\n\nUse the code below to get started with the model.## Training Details### Training Data### Training Procedure#### Preprocessing [optional]#### Training Hyperparameters\n\n- Training regime:#### Speeds, Sizes, Times [optional]## Evaluation### Testing Data, Factors & Metrics#### Testing Data#### Factors#### Metrics### Results#### Summary## Model Examination [optional]## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:## Technical Specifications [optional]### Model Architecture and Objective### Compute Infrastructure#### Hardware#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Model Card Authors [optional]## Model Card Contact"
] | [
-0.06646376848220825,
0.2168014943599701,
-0.00225935154594481,
0.023818302899599075,
0.1271018385887146,
-0.001635765191167593,
0.04218708351254463,
0.13324736058712006,
-0.020175931975245476,
0.11144465953111649,
0.046588581055402756,
0.09377603232860565,
0.09928803145885468,
0.18404334783554077,
0.04859916493296623,
-0.2059975117444992,
0.007056170143187046,
-0.09090408682823181,
0.014076028019189835,
0.1116579994559288,
0.13719257712364197,
-0.10291384905576706,
0.08272874355316162,
-0.04045208916068077,
-0.02019004337489605,
0.00012576708104461432,
-0.09259183704853058,
-0.07032395154237747,
0.06885425746440887,
0.06264153122901917,
0.051234472543001175,
0.001456156256608665,
0.09140396863222122,
-0.2864592671394348,
0.017265573143959045,
0.08406311273574829,
0.0027674848679453135,
0.06290827691555023,
0.07236549258232117,
-0.07389893382787704,
0.11328595131635666,
-0.08021481335163116,
0.13019037246704102,
0.08625296503305435,
-0.062064990401268005,
-0.23071379959583282,
-0.07525765895843506,
0.0963398814201355,
0.12251301854848862,
0.06215599179267883,
-0.022921854630112648,
0.15455181896686554,
-0.06248689442873001,
0.012971068732440472,
0.1294165402650833,
-0.11526761949062347,
-0.05572471022605896,
0.061741601675748825,
0.11775490641593933,
0.10740239918231964,
-0.14110268652439117,
-0.0017287094378843904,
0.04900608956813812,
0.029121357947587967,
0.08589313924312592,
0.022661056369543076,
0.12003941088914871,
0.04652795568108559,
-0.13695219159126282,
-0.04037507623434067,
0.12011898308992386,
0.038862764835357666,
-0.06446044892072678,
-0.2168138176202774,
-0.006778308190405369,
-0.0601806715130806,
-0.014732478186488152,
-0.07019448280334473,
0.039128515869379044,
-0.02470310963690281,
0.07317749410867691,
-0.04465159401297569,
-0.1063927412033081,
-0.0421026237308979,
0.0892222449183464,
0.07748593389987946,
0.011527054943144321,
-0.02519804798066616,
0.04627908393740654,
0.13455867767333984,
0.05402068421244621,
-0.10399353504180908,
-0.07017925381660461,
-0.06942764669656754,
-0.09420394152402878,
-0.04035796597599983,
0.056760527193546295,
0.031942449510097504,
0.02665667235851288,
0.22703726589679718,
0.016653569415211678,
0.04155244305729866,
0.0224777739495039,
0.01032855175435543,
0.043662428855895996,
0.0955500528216362,
-0.05303520709276199,
-0.15660029649734497,
-0.04072032496333122,
0.09077946096658707,
-0.0027527001220732927,
-0.036689214408397675,
-0.03966725245118141,
0.03849169611930847,
0.06843466311693192,
0.13122352957725525,
0.07552056759595871,
-0.017929591238498688,
-0.04813180863857269,
-0.030096933245658875,
0.23523783683776855,
-0.1493375599384308,
0.04426715523004532,
-0.02271856553852558,
-0.01804111897945404,
-0.03908449783921242,
0.03597262129187584,
0.022118929773569107,
-0.000004518366949923802,
0.09706240892410278,
-0.058981191366910934,
-0.05378659814596176,
-0.10168042778968811,
-0.03272576630115509,
0.04088849574327469,
-0.013975566253066063,
-0.010589460842311382,
-0.09025166928768158,
-0.09490354359149933,
-0.04766594246029854,
0.05537205561995506,
-0.05123869329690933,
-0.03770573064684868,
0.009465423412621021,
-0.08151785284280777,
-0.005444355774670839,
-0.005417742300778627,
0.10699385404586792,
-0.03222226724028587,
0.04445803165435791,
-0.027600755915045738,
0.05225523188710213,
0.09919606149196625,
0.031576547771692276,
-0.0773419588804245,
0.0561848059296608,
-0.22559374570846558,
0.07503069192171097,
-0.11481974273920059,
0.04335082694888115,
-0.1704932004213333,
-0.042439818382263184,
0.005444696638733149,
0.0139949731528759,
0.013206101022660732,
0.12720820307731628,
-0.19255615770816803,
-0.01654396951198578,
0.13260798156261444,
-0.09212633967399597,
-0.118110790848732,
0.07884611934423447,
-0.029701577499508858,
0.1624738723039627,
0.04682036489248276,
-0.027025915682315826,
0.09224298596382141,
-0.16434773802757263,
-0.07092688232660294,
-0.00949116237461567,
-0.01727987825870514,
0.12109188735485077,
0.07512219995260239,
-0.05991523340344429,
0.046571120619773865,
0.02832140028476715,
-0.038078423589468,
-0.04424772411584854,
-0.050857074558734894,
-0.10884185880422592,
-0.01070026308298111,
-0.08987759798765182,
0.04065500199794769,
-0.01250192429870367,
-0.07916021347045898,
-0.029885273426771164,
-0.18612512946128845,
-0.0030564051121473312,
0.10038342326879501,
0.0035033065360039473,
-0.005652366206049919,
-0.08666291832923889,
0.026358824223279953,
-0.03112892620265484,
-0.008404186926782131,
-0.16764774918556213,
-0.04399421438574791,
0.046902090311050415,
-0.16094985604286194,
0.020117372274398804,
-0.06413903087377548,
0.06334125250577927,
0.03641495108604431,
-0.05590536445379257,
-0.0248766727745533,
-0.01730942726135254,
0.011945613659918308,
-0.05083848536014557,
-0.18994836509227753,
-0.056277405470609665,
-0.037882111966609955,
0.149809330701828,
-0.25956398248672485,
0.032966937869787216,
0.051140617579221725,
0.14649195969104767,
0.00406361510977149,
-0.05115427449345589,
0.01429014839231968,
-0.05360214412212372,
-0.054652128368616104,
-0.06746816635131836,
-0.006135428790003061,
-0.027576493099331856,
-0.05147203803062439,
0.019243421033024788,
-0.1755700707435608,
-0.021410830318927765,
0.09424154460430145,
0.12876708805561066,
-0.1486445665359497,
-0.018640631809830666,
-0.048725154250860214,
-0.06339836865663528,
-0.0715010017156601,
-0.07038594037294388,
0.10712739825248718,
0.0513901449739933,
0.04796046018600464,
-0.07435787469148636,
-0.07092321664094925,
0.02726263552904129,
0.006906150374561548,
-0.03382374346256256,
0.08727246522903442,
0.05199531093239784,
-0.09209315478801727,
0.0756213590502739,
0.1092359870672226,
0.07177663594484329,
0.09363535046577454,
0.01574566215276718,
-0.11756632477045059,
-0.028492970392107964,
0.036266472190618515,
0.02740776725113392,
0.1465986967086792,
-0.05952361226081848,
0.04016614332795143,
0.04494241625070572,
-0.04170418903231621,
0.022319864481687546,
-0.08787637203931808,
0.024075502529740334,
0.025203049182891846,
-0.0034381982404738665,
0.06284574419260025,
-0.02525499276816845,
-0.0050758360885083675,
0.07016654312610626,
0.047779910266399384,
0.04621000960469246,
0.009655474685132504,
-0.01720241829752922,
-0.1047825813293457,
0.16950392723083496,
-0.0951867327094078,
-0.269941508769989,
-0.17632324993610382,
0.026197833940386772,
0.04035249724984169,
-0.022378476336598396,
0.031619444489479065,
-0.07056326419115067,
-0.10630585998296738,
-0.1060405746102333,
-0.002429972169920802,
0.01714223250746727,
-0.06364088505506516,
-0.0741225928068161,
0.07348573952913284,
0.04382912442088127,
-0.14902326464653015,
0.038552410900592804,
0.055694397538900375,
-0.057955220341682434,
-0.0233661737293005,
0.09118817001581192,
0.12397737801074982,
0.14583967626094818,
-0.021366750821471214,
-0.028626007959246635,
0.029004426673054695,
0.19620531797409058,
-0.13469526171684265,
0.10371150821447372,
0.13814030587673187,
-0.04545360431075096,
0.08360563963651657,
0.1560150384902954,
0.029186224564909935,
-0.08317049592733383,
0.05044832453131676,
0.04082648828625679,
-0.043159641325473785,
-0.2666129767894745,
-0.0534592866897583,
0.012832709588110447,
-0.06255637854337692,
0.09786593168973923,
0.10183793306350708,
0.11542957276105881,
0.034910861402750015,
-0.07166364789009094,
-0.043925940990448,
-0.0058974819257855415,
0.11737963557243347,
-0.05490213260054588,
-0.012639665976166725,
0.07686592638492584,
-0.05086168646812439,
0.005355054512619972,
0.10266812145709991,
0.02973790094256401,
0.17442677915096283,
0.020399179309606552,
0.11231429129838943,
0.06195578724145889,
0.08633565157651901,
0.0007386076031252742,
0.02951662428677082,
0.05147615820169449,
0.017203815281391144,
-0.002300140680745244,
-0.10421168059110641,
-0.006156572140753269,
0.1449710875749588,
0.028103826567530632,
0.029669636860489845,
-0.0018948549404740334,
-0.005003341939300299,
0.05121048167347908,
0.1746254414319992,
-0.011592294089496136,
-0.22072425484657288,
-0.0845772922039032,
0.06936841458082199,
-0.06218599155545235,
-0.12968985736370087,
-0.026130788028240204,
0.045467354357242584,
-0.17519839107990265,
0.026703642681241035,
-0.027433741837739944,
0.0919293761253357,
-0.09345759451389313,
-0.02221956104040146,
0.03687324374914169,
0.084866963326931,
-0.014529162086546421,
0.08703910559415817,
-0.14498743414878845,
0.11886418610811234,
0.02978132851421833,
0.09024628251791,
-0.11081171780824661,
0.07909037172794342,
-0.007550720125436783,
0.009180475026369095,
0.19379350543022156,
-0.011335089802742004,
-0.03514958545565605,
-0.08774717897176743,
-0.11210042238235474,
-0.013537433929741383,
0.12687496840953827,
-0.1243172138929367,
0.08773399889469147,
-0.015198243781924248,
-0.044079482555389404,
0.00937260314822197,
-0.12100647389888763,
-0.17273177206516266,
-0.19628387689590454,
0.05585884302854538,
-0.09575839340686798,
0.025643249973654747,
-0.11914430558681488,
-0.07089093327522278,
-0.02952558360993862,
0.241120383143425,
-0.1745356321334839,
-0.06510113179683685,
-0.1468164622783661,
-0.046294767409563065,
0.1662203073501587,
-0.04437198117375374,
0.0718095526099205,
-0.0208172257989645,
0.20345525443553925,
0.005988610442727804,
-0.004939318168908358,
0.06724198162555695,
-0.08892562240362167,
-0.16873881220817566,
-0.06771010160446167,
0.1510489284992218,
0.11680185794830322,
0.04907919466495514,
-0.002248800592496991,
0.0011772146681323647,
-0.016943959519267082,
-0.1137804463505745,
-0.0033210667315870523,
0.16037839651107788,
0.03878779336810112,
0.025986969470977783,
-0.05243593826889992,
-0.08797456324100494,
-0.06899320334196091,
-0.06853509694337845,
0.06221301481127739,
0.19590823352336884,
-0.10376439243555069,
0.1700313836336136,
0.147536963224411,
-0.07305635511875153,
-0.23175598680973053,
0.035342130810022354,
0.04983805492520332,
0.0014306638622656465,
0.04886869341135025,
-0.18252557516098022,
0.10521943867206573,
0.019543392583727837,
-0.05505957826972008,
0.13485197722911835,
-0.1557481735944748,
-0.1552847921848297,
0.0722852572798729,
0.03904085233807564,
-0.22423844039440155,
-0.1354004591703415,
-0.09622503817081451,
-0.05825018882751465,
-0.14065024256706238,
0.06054598465561867,
-0.002136280992999673,
0.015948504209518433,
0.03500790148973465,
-0.0015643214574083686,
0.027123261243104935,
-0.058935679495334625,
0.18609118461608887,
-0.004065449349582195,
0.020676052197813988,
-0.060264769941568375,
-0.0478842556476593,
0.09839435666799545,
-0.06130504235625267,
0.12208222597837448,
0.004057085141539574,
0.01594383642077446,
-0.10362856835126877,
-0.048314861953258514,
-0.04328322783112526,
0.05154227837920189,
-0.07548051327466965,
-0.10070807486772537,
-0.043625857681035995,
0.08841723203659058,
0.07005169242620468,
-0.03383097052574158,
0.00549331633374095,
-0.07189501076936722,
0.10019614547491074,
0.17795267701148987,
0.17573626339435577,
0.009926567785441875,
-0.07241068035364151,
0.01677953451871872,
-0.04142116755247116,
0.044231921434402466,
-0.2513144314289093,
0.03756171092391014,
0.06098250672221184,
0.029438555240631104,
0.09217222779989243,
-0.020435843616724014,
-0.1820858269929886,
-0.04050002992153168,
0.08094815909862518,
-0.05452597141265869,
-0.22617179155349731,
-0.019085140898823738,
0.0954197570681572,
-0.2020406424999237,
-0.007372708059847355,
0.03995226323604584,
-0.048725228756666183,
-0.023169852793216705,
0.00010950004070764408,
0.06317184865474701,
0.002471912419423461,
0.09773622453212738,
0.0735151618719101,
0.09715340286493301,
-0.08337292820215225,
0.10562895983457565,
0.10150538384914398,
-0.09572599828243256,
0.03605884686112404,
0.06754924356937408,
-0.05300498008728027,
-0.043293699622154236,
0.03665391728281975,
0.033023297786712646,
0.005234600510448217,
-0.060321882367134094,
0.013913018628954887,
-0.036497246474027634,
0.044923391193151474,
0.08326134830713272,
0.03754979372024536,
-0.013354414142668247,
0.06462216377258301,
0.03401726484298706,
-0.10898099094629288,
0.10366570204496384,
0.01731540448963642,
0.04105307161808014,
-0.08384523540735245,
-0.019968897104263306,
0.035425446927547455,
0.030576206743717194,
-0.01765924133360386,
-0.02306121215224266,
-0.02860277332365513,
-0.01614218018949032,
-0.14299540221691132,
-0.023106401786208153,
-0.07243485748767853,
0.006181265693157911,
0.014656842686235905,
-0.031884219497442245,
-0.011233693920075893,
0.02475680410861969,
-0.06979699432849884,
-0.07426341623067856,
-0.006949664559215307,
0.09833318740129471,
-0.15115703642368317,
0.008848577737808228,
0.06907843053340912,
-0.11088496446609497,
0.08190931379795074,
-0.008411259390413761,
0.016245156526565552,
0.022527478635311127,
-0.15448406338691711,
0.05601610988378525,
0.0008648968650959432,
0.01916889287531376,
0.025886621326208115,
-0.16471809148788452,
0.004104440100491047,
-0.04661374166607857,
-0.02149827405810356,
-0.00004464812809601426,
-0.02647159807384014,
-0.12325995415449142,
0.06858719140291214,
-0.015622655861079693,
-0.035931166261434555,
-0.02701525390148163,
0.0539589487016201,
0.07888586074113846,
-0.027474910020828247,
0.10445091128349304,
-0.008690856397151947,
0.04941811040043831,
-0.16801609098911285,
-0.02470702864229679,
-0.04982255399227142,
0.019377702847123146,
0.009884213097393513,
-0.007693959400057793,
0.04183054715394974,
-0.00976533442735672,
0.21883612871170044,
-0.05075952783226967,
0.1607085019350052,
0.05847611650824547,
-0.017352959141135216,
-0.0007513365126214921,
0.06180921941995621,
0.05997028574347496,
0.04658793285489082,
0.009480604901909828,
0.023740366101264954,
-0.022450892254710197,
-0.006695089396089315,
-0.15932634472846985,
0.01890849508345127,
0.14999441802501678,
0.06301083415746689,
0.024745315313339233,
0.05866100639104843,
-0.12775006890296936,
-0.12135478109121323,
0.09311001747846603,
-0.026755332946777344,
0.00928465835750103,
-0.08245618641376495,
0.1358020007610321,
0.14980104565620422,
-0.14000412821769714,
0.05256148427724838,
-0.06134212389588356,
-0.05217423290014267,
-0.10388828068971634,
-0.12032219022512436,
-0.05887215584516525,
-0.053666237741708755,
0.002330566756427288,
-0.03760887682437897,
0.054546963423490524,
0.03344334661960602,
-0.009351172484457493,
-0.00022941511997487396,
0.13597318530082703,
-0.019751882180571556,
-0.0028988157864660025,
0.048313532024621964,
0.03693558648228645,
0.02373051457107067,
-0.05275435373187065,
0.02940409444272518,
0.02539868652820587,
0.032232340425252914,
0.06546790152788162,
0.033412106335163116,
-0.047448933124542236,
0.03804153576493263,
-0.0025254099164158106,
-0.11207924783229828,
0.019641218706965446,
-0.00460948096588254,
-0.0742158442735672,
0.1268945336341858,
0.0407399944961071,
0.010224059224128723,
-0.03741471841931343,
0.24361543357372284,
-0.06653323769569397,
-0.06378097087144852,
-0.13251738250255585,
0.10491154342889786,
-0.0027236645109951496,
0.06476365029811859,
0.023412218317389488,
-0.1284150779247284,
0.005243356805294752,
0.13858191668987274,
0.12181595712900162,
0.0045748427510261536,
0.009228081442415714,
0.0518609918653965,
0.0025186820421367884,
-0.06998204439878464,
0.054019294679164886,
0.06992026418447495,
0.12919506430625916,
-0.07847554981708527,
0.07680778950452805,
0.0006860480643808842,
-0.08370215445756912,
-0.02947772853076458,
0.11312682181596756,
-0.0409729965031147,
0.03491825982928276,
-0.047444481402635574,
0.10916327685117722,
-0.05787910893559456,
-0.29412412643432617,
0.02350960113108158,
-0.09588567912578583,
-0.15202060341835022,
-0.018367812037467957,
0.05944539234042168,
-0.02624768204987049,
0.018029648810625076,
0.06971040368080139,
-0.06011629104614258,
0.20098382234573364,
0.0335683599114418,
-0.07864278554916382,
-0.0664360448718071,
0.04837050288915634,
-0.06564252078533173,
0.2949807047843933,
0.008418165147304535,
0.02863333560526371,
0.10770907253026962,
-0.03253700211644173,
-0.18271861970424652,
0.010723991319537163,
0.1133992001414299,
-0.08056149631738663,
0.08200647681951523,
0.19000613689422607,
-0.012578671798110008,
0.1209007054567337,
0.05294662341475487,
-0.047376248985528946,
0.04217283055186272,
-0.03389401361346245,
-0.051268599927425385,
-0.10752558708190918,
0.058453381061553955,
-0.05909625440835953,
0.15447644889354706,
0.10152646154165268,
-0.05671518296003342,
-0.004550917539745569,
-0.05555408447980881,
0.04875178262591362,
0.01804669201374054,
0.12263146042823792,
0.02951994352042675,
-0.1865430772304535,
0.032826557755470276,
-0.01144319772720337,
0.10186848044395447,
-0.25588861107826233,
-0.08421015739440918,
0.08833149075508118,
-0.011924264021217823,
-0.05105875805020332,
0.10560628771781921,
0.057650718837976456,
0.04243382066488266,
-0.043439045548439026,
-0.10480839014053345,
-0.02186836116015911,
0.14663739502429962,
-0.1469624787569046,
-0.025013303384184837
] |
null | null | transformers |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# trocr-base-printed_license_plates_ocr
This model is a fine-tuned version of [microsoft/trocr-base-printed](https://huggingface.co/microsoft/trocr-base-printed) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.1479
- Cer: 0.0343
## Model description
More information needed
## Intended uses & limitations
More information needed
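
A minimal inference sketch for reading a license plate, assuming the standard `TrOCRProcessor`/`VisionEncoderDecoderModel` API and the repo id recorded for this model (`mariovigliar/trocr-base-printed_license_plates_ocr`); the image path is a placeholder:

```python
from transformers import TrOCRProcessor, VisionEncoderDecoderModel
from PIL import Image

# Processor is loaded from the base checkpoint; the fine-tuned repo may not ship processor files.
processor = TrOCRProcessor.from_pretrained("microsoft/trocr-base-printed")
model = VisionEncoderDecoderModel.from_pretrained("mariovigliar/trocr-base-printed_license_plates_ocr")

image = Image.open("plate.jpg").convert("RGB")  # a cropped license-plate image
pixel_values = processor(images=image, return_tensors="pt").pixel_values

generated_ids = model.generate(pixel_values)
print(processor.batch_decode(generated_ids, skip_special_tokens=True)[0])
```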
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (a sketch of how they might map onto `Seq2SeqTrainingArguments` follows the list):
- learning_rate: 5e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 2
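
A hedged sketch of how these values could be passed to `Seq2SeqTrainingArguments`; the output directory and evaluation strategy below are assumptions, since the training script is not part of this card:

```python
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="trocr-base-printed_license_plates_ocr",  # assumed
    learning_rate=5e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=2,
    evaluation_strategy="epoch",   # assumed; the results table reports one evaluation per epoch
    predict_with_generate=True,    # needed so CER can be computed from generated text
)
```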
### Training results
| Training Loss | Epoch | Step | Validation Loss | Cer |
|:-------------:|:-----:|:----:|:---------------:|:------:|
| 0.356 | 1.0 | 500 | 0.1934 | 0.041 |
| 0.1558 | 2.0 | 1000 | 0.1479 | 0.0343 |
### Framework versions
- Transformers 4.37.2
- Pytorch 2.0.0+cu117
- Datasets 2.17.0
- Tokenizers 0.15.2
| {"tags": ["generated_from_trainer"], "base_model": "microsoft/trocr-base-printed", "model-index": [{"name": "trocr-base-printed_license_plates_ocr", "results": []}]} | null | mariovigliar/trocr-base-printed_license_plates_ocr | [
"transformers",
"safetensors",
"vision-encoder-decoder",
"generated_from_trainer",
"base_model:microsoft/trocr-base-printed",
"endpoints_compatible",
"region:us"
] | 2024-02-14T15:18:54+00:00 | [] | [] | TAGS
#transformers #safetensors #vision-encoder-decoder #generated_from_trainer #base_model-microsoft/trocr-base-printed #endpoints_compatible #region-us
| trocr-base-printed\_license\_plates\_ocr
========================================
This model is a fine-tuned version of microsoft/trocr-base-printed on an unknown dataset.
It achieves the following results on the evaluation set:
* Loss: 0.1479
* Cer: 0.0343
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 5e-05
* train\_batch\_size: 32
* eval\_batch\_size: 32
* seed: 42
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* num\_epochs: 2
### Training results
### Framework versions
* Transformers 4.37.2
* Pytorch 2.0.0+cu117
* Datasets 2.17.0
* Tokenizers 0.15.2
| [
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 5e-05\n* train\\_batch\\_size: 32\n* eval\\_batch\\_size: 32\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 2",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.37.2\n* Pytorch 2.0.0+cu117\n* Datasets 2.17.0\n* Tokenizers 0.15.2"
] | [
"TAGS\n#transformers #safetensors #vision-encoder-decoder #generated_from_trainer #base_model-microsoft/trocr-base-printed #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 5e-05\n* train\\_batch\\_size: 32\n* eval\\_batch\\_size: 32\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 2",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.37.2\n* Pytorch 2.0.0+cu117\n* Datasets 2.17.0\n* Tokenizers 0.15.2"
] | [
51,
98,
4,
33
] | [
"passage: TAGS\n#transformers #safetensors #vision-encoder-decoder #generated_from_trainer #base_model-microsoft/trocr-base-printed #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 5e-05\n* train\\_batch\\_size: 32\n* eval\\_batch\\_size: 32\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 2### Training results### Framework versions\n\n\n* Transformers 4.37.2\n* Pytorch 2.0.0+cu117\n* Datasets 2.17.0\n* Tokenizers 0.15.2"
] | [
-0.1102430671453476,
0.012449072673916817,
-0.00008127240289468318,
0.11082348227500916,
0.18419896066188812,
0.026625920087099075,
0.0931709036231041,
0.07611890137195587,
-0.1243775337934494,
0.0338287390768528,
0.13373728096485138,
0.14445039629936218,
0.009292034432291985,
0.1563904881477356,
-0.07109039276838303,
-0.2465847283601761,
0.008547796867787838,
0.04937528073787689,
-0.057525984942913055,
0.12199005484580994,
0.08221210539340973,
-0.149592325091362,
0.10165476053953171,
-0.036317698657512665,
-0.23172608017921448,
0.006732274312525988,
0.026557181030511856,
-0.043544646352529526,
0.1352699100971222,
0.034608144313097,
0.1386832296848297,
-0.013653676956892014,
0.09961273521184921,
-0.17930209636688232,
0.012037091888487339,
0.07085073739290237,
0.006014915183186531,
0.062009233981370926,
0.04909629002213478,
0.01188587211072445,
0.08835087716579437,
-0.11906711757183075,
0.054166678339242935,
0.02618417516350746,
-0.14525039494037628,
-0.24374160170555115,
-0.05524589493870735,
-0.003934276755899191,
0.07308180630207062,
0.10387039184570312,
-0.0059496089816093445,
0.15679840743541718,
-0.033511292189359665,
0.11719726771116257,
0.21820314228534698,
-0.23597952723503113,
-0.08904162049293518,
0.042833831161260605,
0.03583253547549248,
0.10956485569477081,
-0.12308501452207565,
0.00271890452131629,
0.07770262658596039,
0.04535135254263878,
0.11234922707080841,
-0.035060636699199677,
-0.0708170011639595,
-0.01098706852644682,
-0.1471703201532364,
-0.024667074903845787,
0.1264343410730362,
0.044391993433237076,
-0.04025513306260109,
-0.021676814183592796,
-0.08899924159049988,
-0.15186190605163574,
-0.04919140785932541,
-0.0032781537156552076,
0.04124484583735466,
-0.03791261464357376,
-0.10707157105207443,
-0.01646341010928154,
-0.11123376339673996,
-0.08198722451925278,
-0.04640662297606468,
0.1708453744649887,
0.02491004578769207,
0.030747126787900925,
-0.027892781421542168,
0.12140601873397827,
-0.0014647210482507944,
-0.14063473045825958,
-0.003164870198816061,
-0.007196183316409588,
0.005382992327213287,
-0.03436212241649628,
-0.07753943651914597,
-0.011891977861523628,
-0.01367686502635479,
0.09941229969263077,
-0.060727719217538834,
0.05017619580030441,
0.021382736042141914,
0.041756466031074524,
-0.1353190839290619,
0.18532752990722656,
-0.05204456299543381,
-0.006938019767403603,
-0.00045557794510386884,
0.0461706668138504,
0.031194912269711494,
0.003499058075249195,
-0.09055745601654053,
0.0034729621838778257,
0.10685335099697113,
-0.005354821681976318,
-0.08502104878425598,
0.07078265398740768,
-0.0463503934442997,
0.005125274881720543,
-0.00155976926907897,
-0.08255769312381744,
0.033425964415073395,
-0.023381536826491356,
-0.04800008609890938,
-0.03820851817727089,
0.05738167092204094,
0.00838017463684082,
0.0032892317976802588,
0.08678749948740005,
-0.07820899784564972,
0.042251843959093094,
-0.10726843774318695,
-0.10420320183038712,
-0.010359692387282848,
-0.05401315167546272,
0.03980796039104462,
-0.11735548079013824,
-0.16466955840587616,
-0.001037961570546031,
0.060112375766038895,
-0.022649483755230904,
0.027330417186021805,
-0.03129587322473526,
-0.10714920610189438,
0.004720376338809729,
-0.004884023685008287,
0.10994482785463333,
-0.051787082105875015,
0.13191017508506775,
0.10223905742168427,
0.07793381065130234,
-0.09478352218866348,
0.024450061842799187,
-0.1007213294506073,
0.015086036175489426,
-0.21180203557014465,
0.047406621277332306,
-0.024763550609350204,
0.07884089648723602,
-0.04587937518954277,
-0.12148147821426392,
-0.009591015987098217,
0.008176297880709171,
0.0819547176361084,
0.0985390916466713,
-0.18857018649578094,
-0.03773106634616852,
0.1403934359550476,
-0.11581841856241226,
-0.1287081241607666,
0.0987485945224762,
-0.044084880501031876,
0.04789561778306961,
0.06809259951114655,
0.15672974288463593,
0.06583043932914734,
-0.11711540818214417,
0.029871918261051178,
-0.03133585304021835,
0.016765670850872993,
-0.04302341490983963,
0.05404860898852348,
0.038846004754304886,
-0.024500994011759758,
0.027169818058609962,
-0.07213997095823288,
0.053573478013277054,
-0.12031903862953186,
-0.07933963090181351,
-0.0652490183711052,
-0.10521005094051361,
0.045627888292074203,
0.07222580909729004,
0.05084804818034172,
-0.12981319427490234,
-0.05219895765185356,
0.06848793476819992,
0.058810070157051086,
-0.08230070024728775,
0.017016079276800156,
-0.05895724147558212,
0.02123066782951355,
-0.08609901368618011,
-0.0370742566883564,
-0.16782791912555695,
-0.039660971611738205,
-0.00985985342413187,
0.02657785639166832,
-0.004157518967986107,
0.0021126619540154934,
0.08828131854534149,
0.10514846444129944,
-0.08251890540122986,
-0.055855896323919296,
-0.028480207547545433,
0.005769538227468729,
-0.11799251288175583,
-0.1597183644771576,
-0.02636372298002243,
-0.018322672694921494,
0.08734127134084702,
-0.22538158297538757,
0.02426968142390251,
-0.013011880218982697,
0.10118182003498077,
0.03309052810072899,
-0.022987771779298782,
-0.05023260787129402,
0.08483827114105225,
-0.0077772908844053745,
-0.08012954890727997,
0.054626088589429855,
0.0015733495820313692,
-0.06790227442979813,
-0.06437183171510696,
-0.1378888040781021,
0.21558518707752228,
0.14367012679576874,
-0.13378594815731049,
-0.08825492858886719,
0.033056050539016724,
-0.0507567897439003,
-0.02562159113585949,
-0.050804778933525085,
-0.003931385464966297,
0.13759472966194153,
-0.040217164903879166,
0.13252747058868408,
-0.0761256217956543,
-0.027371721342206,
0.030635040253400803,
-0.05106360465288162,
0.010455512441694736,
0.08978543430566788,
0.026507681235671043,
-0.14418327808380127,
0.12049403786659241,
0.14971832931041718,
-0.06907093524932861,
0.13521336019039154,
-0.03534923121333122,
-0.0626421719789505,
-0.01048522349447012,
0.0029715863056480885,
0.013742310926318169,
0.16133621335029602,
-0.0925404354929924,
0.010250356048345566,
-0.004660997539758682,
0.011593295261263847,
0.015625687316060066,
-0.24710847437381744,
-0.04301992803812027,
0.03178722411394119,
-0.026570521295070648,
0.009634389542043209,
-0.017879728227853775,
0.0007018735632300377,
0.09486911445856094,
-0.01031475979834795,
-0.04550745710730553,
0.045310620218515396,
-0.004111343063414097,
-0.07672271132469177,
0.2093420773744583,
-0.07807907462120056,
-0.15556472539901733,
-0.13624529540538788,
-0.03937897831201553,
-0.0176768247038126,
0.03901084512472153,
0.06903833895921707,
-0.08674255013465881,
-0.04824176803231239,
-0.08360079675912857,
0.006596050225198269,
0.027014445513486862,
0.03940312936902046,
0.03175635263323784,
0.002681019250303507,
0.10380790382623672,
-0.09140564501285553,
-0.005638046655803919,
-0.03705054521560669,
-0.036120083183050156,
0.05841764435172081,
0.034044332802295685,
0.14233185350894928,
0.11210034042596817,
-0.04324493557214737,
0.008919590152800083,
-0.019448822364211082,
0.25976476073265076,
-0.08923514187335968,
-0.03321628272533417,
0.14186082780361176,
-0.008474523201584816,
0.041673291474580765,
0.12424150109291077,
0.04755515605211258,
-0.12742170691490173,
0.03755176439881325,
0.0078482860699296,
-0.04330351576209068,
-0.1455683410167694,
-0.037610165774822235,
-0.052122846245765686,
-0.05514923483133316,
0.0787644237279892,
0.0212614294141531,
-0.006449416745454073,
0.06738512217998505,
0.02083640731871128,
0.06181217357516289,
-0.019775889813899994,
0.0763515904545784,
0.09473583847284317,
0.03028780408203602,
0.10278233140707016,
-0.06209804490208626,
-0.06560912728309631,
0.02635127864778042,
-0.03998425602912903,
0.19792114198207855,
-0.017230171710252762,
0.04643464460968971,
0.03749646618962288,
0.1702439785003662,
0.017452308908104897,
0.09547407925128937,
0.020539727061986923,
-0.05701598897576332,
-0.0007432780112139881,
-0.04554329439997673,
-0.05509798973798752,
0.01789642684161663,
-0.11446540802717209,
0.07703464478254318,
-0.10110227763652802,
-0.007933814078569412,
0.05650617182254791,
0.21755649149417877,
0.06699712574481964,
-0.37803107500076294,
-0.09814088046550751,
0.014351575635373592,
0.012877735309302807,
-0.06662347912788391,
-0.001404212205670774,
0.16130056977272034,
-0.06623024493455887,
0.04960525780916214,
-0.08719179779291153,
0.07134001702070236,
-0.01592416688799858,
0.04785655438899994,
0.056524358689785004,
0.0875461995601654,
0.006016289349645376,
0.0403880774974823,
-0.2806210517883301,
0.2691103518009186,
0.003290452528744936,
0.10254276543855667,
-0.04051418602466583,
-0.02802421897649765,
0.02369190752506256,
0.08523575961589813,
0.06485012918710709,
-0.01496194675564766,
-0.028211627155542374,
-0.23315973579883575,
-0.009874612092971802,
0.04472339525818825,
0.15390612185001373,
0.0015061893500387669,
0.09840825200080872,
-0.03914998844265938,
-0.0005230251117609441,
0.07531695067882538,
-0.050426408648490906,
-0.09293365478515625,
-0.0774734690785408,
-0.028819743543863297,
0.008476713672280312,
0.04886955767869949,
-0.08055652678012848,
-0.08734568953514099,
-0.07664337009191513,
0.12544554471969604,
-0.023525420576334,
-0.011586550623178482,
-0.12392908334732056,
0.09180063754320145,
0.08191642165184021,
-0.06260969489812851,
0.07763936370611191,
0.024436261504888535,
0.08358852565288544,
0.05454320088028908,
-0.06762450188398361,
0.10936019569635391,
-0.0660511925816536,
-0.1621997207403183,
-0.049393754452466965,
0.07710930705070496,
0.015713371336460114,
0.02240905910730362,
-0.003172841388732195,
0.01506759226322174,
-0.02013966254889965,
-0.08170770108699799,
0.032760221511125565,
-0.004283307120203972,
0.06555682420730591,
0.05195244774222374,
-0.026445381343364716,
0.0038363172207027674,
-0.04561633616685867,
-0.015323292464017868,
0.11917297542095184,
0.22401121258735657,
-0.07901332527399063,
-0.06685835868120193,
0.05018432438373566,
-0.04025996848940849,
-0.20751453936100006,
0.0767461284995079,
0.04112567380070686,
0.00704940827563405,
0.021726442500948906,
-0.08236514031887054,
0.11393226683139801,
0.08961386233568192,
-0.011347806081175804,
0.11975380033254623,
-0.2897959351539612,
-0.127651184797287,
0.07755080610513687,
0.21916109323501587,
0.12154048681259155,
-0.15420188009738922,
-0.01667296513915062,
-0.04201352596282959,
-0.1555408388376236,
0.04940777271986008,
-0.07917120307683945,
0.12510046362876892,
-0.02054317109286785,
0.037328869104385376,
-0.0026038147043436766,
-0.06102168932557106,
0.13776662945747375,
-0.0417630635201931,
0.1327233612537384,
-0.07933821529150009,
0.0033183307386934757,
0.0699576586484909,
-0.048748601227998734,
0.00871207844465971,
-0.006342991720885038,
0.06710200011730194,
-0.030016224831342697,
-0.02299327217042446,
-0.06821350008249283,
0.00955597311258316,
-0.016184087842702866,
-0.060390278697013855,
-0.027242386713624,
0.024076249450445175,
0.020107468590140343,
-0.030460910871624947,
0.1035647913813591,
-0.003096591914072633,
0.13008391857147217,
0.09368492662906647,
0.056733883917331696,
-0.06491054594516754,
-0.03340738266706467,
-0.0030029239133000374,
-0.02194533869624138,
0.05982589349150658,
-0.11377542465925217,
0.027626115828752518,
0.1286325305700302,
0.019359413534402847,
0.12613987922668457,
0.08776608109474182,
0.0011669180821627378,
0.04099227488040924,
0.07790327817201614,
-0.1614321619272232,
-0.1190488263964653,
-0.012415742501616478,
-0.03901602327823639,
-0.09445617347955704,
0.08937675505876541,
0.09259927272796631,
-0.08363084495067596,
0.015582060441374779,
-0.048826344311237335,
-0.006091169081628323,
-0.06490706652402878,
0.20333515107631683,
0.06517215818166733,
0.04839564859867096,
-0.10270949453115463,
0.07782170921564102,
0.005530226044356823,
-0.07782315462827682,
-0.015507066622376442,
0.06538610905408859,
-0.0886167585849762,
-0.03459596261382103,
0.12746880948543549,
0.1728590428829193,
-0.0708690658211708,
-0.04115036129951477,
-0.11696145683526993,
-0.12467887252569199,
0.04492770880460739,
0.22784844040870667,
0.09795791655778885,
0.016123641282320023,
-0.02717902511358261,
0.022158075124025345,
-0.12698060274124146,
0.06530104577541351,
0.037684500217437744,
0.09787369519472122,
-0.1719122678041458,
0.13729815185070038,
0.002397743286564946,
0.03051742911338806,
-0.04540108144283295,
0.04244999587535858,
-0.11297066509723663,
0.016896648332476616,
-0.12097695469856262,
-0.00826445035636425,
0.010094447061419487,
0.004714973270893097,
0.0036906967870891094,
-0.07980076968669891,
-0.06149478256702423,
0.034212060272693634,
-0.09681985527276993,
-0.03519466146826744,
0.05275136977434158,
0.016736624762415886,
-0.10745862126350403,
-0.04522017762064934,
0.028889192268252373,
-0.06418094784021378,
0.030871102586388588,
0.04777592420578003,
-0.004521280061453581,
0.05100395902991295,
-0.18643778562545776,
-0.018609602004289627,
0.1085837259888649,
-0.01731831021606922,
0.04414716735482216,
-0.04520599544048309,
-0.015147923491895199,
0.00940907746553421,
0.07824265956878662,
0.019054502248764038,
0.12462856620550156,
-0.12496288865804672,
0.002765155164524913,
-0.05622078850865364,
-0.0553540363907814,
-0.06120304763317108,
0.012270803563296795,
0.09343497455120087,
0.014341742731630802,
0.1635461300611496,
-0.10137970745563507,
0.018308408558368683,
-0.1994803249835968,
-0.015190616250038147,
-0.013253020122647285,
-0.1330285668373108,
-0.09563065320253372,
-0.04650173336267471,
0.07331162691116333,
-0.0270118098706007,
0.1049608364701271,
-0.018382282927632332,
0.09283359348773956,
0.02166702225804329,
-0.02862880751490593,
0.024192191660404205,
0.04218532517552376,
0.26191163063049316,
0.012370558455586433,
-0.04115239903330803,
0.07160239666700363,
0.08990661799907684,
0.11927656084299088,
0.09793968498706818,
0.156256303191185,
0.16415899991989136,
-0.07195907831192017,
0.12320501357316971,
0.018888423219323158,
-0.016926845535635948,
-0.12624169886112213,
-0.011349608190357685,
-0.053415462374687195,
0.06311483681201935,
-0.0362480990588665,
0.15043875575065613,
0.10626456141471863,
-0.17135761678218842,
0.014288647100329399,
-0.04472271353006363,
-0.0901976153254509,
-0.058239925652742386,
-0.008890701457858086,
-0.10308315604925156,
-0.1551472395658493,
0.016433093696832657,
-0.09906712919473648,
0.01591615006327629,
0.1536819338798523,
-0.005041466094553471,
-0.014901746064424515,
0.24249215424060822,
0.04998786747455597,
0.02486489899456501,
0.04478544741868973,
0.0069847628474235535,
-0.01609816774725914,
-0.044314444065093994,
-0.07685935497283936,
0.021092163398861885,
-0.03490106016397476,
0.03596007078886032,
-0.047624923288822174,
-0.07930239289999008,
0.043322958052158356,
-0.004949693568050861,
-0.10213043540716171,
0.02334294654428959,
0.05610742047429085,
0.04187212511897087,
0.029057038947939873,
0.0106091583147645,
0.0016326631885021925,
-0.01362670212984085,
0.24182185530662537,
-0.10199311375617981,
-0.1004987582564354,
-0.08446042984724045,
0.2439817488193512,
0.029811976477503777,
0.02683061733841896,
-0.002926197834312916,
-0.09921254217624664,
0.029391178861260414,
0.23353323340415955,
0.1375901848077774,
-0.0920201763510704,
-0.002326238201931119,
-0.018135081976652145,
-0.009301927872002125,
-0.03692851588129997,
0.11090558767318726,
0.14135998487472534,
0.0258502047508955,
-0.10087616741657257,
-0.04500753432512283,
-0.032601043581962585,
-0.01434633694589138,
-0.04208279773592949,
-0.0074269636534154415,
0.026604413986206055,
0.017228273674845695,
-0.07087454199790955,
0.07543931901454926,
-0.022613106295466423,
-0.08556073904037476,
0.1285969316959381,
-0.1835172474384308,
-0.1137532964348793,
-0.01114723552018404,
0.12435393780469894,
-0.024637704715132713,
0.049403585493564606,
-0.04863423854112625,
0.00630977563560009,
0.04026307538151741,
-0.013287006877362728,
-0.06710963696241379,
-0.09107466042041779,
0.021883457899093628,
-0.15317486226558685,
0.19738852977752686,
-0.03902198001742363,
0.042212143540382385,
0.11962955445051193,
0.04220554605126381,
-0.08014821261167526,
0.117674320936203,
0.008716600947082043,
-0.0978357270359993,
0.029860273003578186,
0.13159865140914917,
-0.041791439056396484,
0.07940824329853058,
0.02697547897696495,
-0.16840656101703644,
0.02431977540254593,
-0.03084409050643444,
-0.03083553910255432,
-0.022245297208428383,
-0.07105801999568939,
-0.06665074080228806,
0.13137398660182953,
0.18278896808624268,
-0.020885096862912178,
0.03552490472793579,
-0.06131584197282791,
0.031014028936624527,
0.07053162157535553,
0.03896684944629669,
-0.023229360580444336,
-0.24126693606376648,
0.010069919750094414,
0.08495429158210754,
-0.05029425770044327,
-0.2275150716304779,
-0.10444635152816772,
-0.01024833507835865,
-0.061253100633621216,
-0.0656074658036232,
0.08260003477334976,
0.10103900730609894,
0.04322587326169014,
-0.05489581823348999,
-0.14137482643127441,
-0.06391822546720505,
0.18706396222114563,
-0.13310271501541138,
-0.1063164621591568
] |
null | null | transformers |
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
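
A minimal sketch, assuming the repo id recorded for this row (`unsloth/tinyllama-chat`) and that the tokenizer ships a chat template:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "unsloth/tinyllama-chat"  # repo id from this row; assumed to include a chat template

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

messages = [{"role": "user", "content": "Explain what a tokenizer does in one sentence."}]
input_ids = tokenizer.apply_chat_template(messages, add_generation_prompt=True, return_tensors="pt")

output_ids = model.generate(input_ids, max_new_tokens=64)
print(tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True))
```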
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed]
| {"library_name": "transformers", "tags": []} | text-generation | unsloth/tinyllama-chat | [
"transformers",
"safetensors",
"llama",
"text-generation",
"conversational",
"arxiv:1910.09700",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | 2024-02-14T15:19:49+00:00 | [
"1910.09700"
] | [] | TAGS
#transformers #safetensors #llama #text-generation #conversational #arxiv-1910.09700 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
# Model Card for Model ID
## Model Details
### Model Description
This is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.
- Developed by:
- Funded by [optional]:
- Shared by [optional]:
- Model type:
- Language(s) (NLP):
- License:
- Finetuned from model [optional]:
### Model Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Downstream Use [optional]
### Out-of-Scope Use
## Bias, Risks, and Limitations
### Recommendations
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
## Training Details
### Training Data
### Training Procedure
#### Preprocessing [optional]
#### Training Hyperparameters
- Training regime:
#### Speeds, Sizes, Times [optional]
## Evaluation
### Testing Data, Factors & Metrics
#### Testing Data
#### Factors
#### Metrics
### Results
#### Summary
## Model Examination [optional]
## Environmental Impact
Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).
- Hardware Type:
- Hours used:
- Cloud Provider:
- Compute Region:
- Carbon Emitted:
## Technical Specifications [optional]
### Model Architecture and Objective
### Compute Infrastructure
#### Hardware
#### Software
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Model Card Authors [optional]
## Model Card Contact
| [
"# Model Card for Model ID",
"## Model Details",
"### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:",
"### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Downstream Use [optional]",
"### Out-of-Scope Use",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.",
"## How to Get Started with the Model\n\nUse the code below to get started with the model.",
"## Training Details",
"### Training Data",
"### Training Procedure",
"#### Preprocessing [optional]",
"#### Training Hyperparameters\n\n- Training regime:",
"#### Speeds, Sizes, Times [optional]",
"## Evaluation",
"### Testing Data, Factors & Metrics",
"#### Testing Data",
"#### Factors",
"#### Metrics",
"### Results",
"#### Summary",
"## Model Examination [optional]",
"## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:",
"## Technical Specifications [optional]",
"### Model Architecture and Objective",
"### Compute Infrastructure",
"#### Hardware",
"#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Model Card Authors [optional]",
"## Model Card Contact"
] | [
"TAGS\n#transformers #safetensors #llama #text-generation #conversational #arxiv-1910.09700 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n",
"# Model Card for Model ID",
"## Model Details",
"### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:",
"### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Downstream Use [optional]",
"### Out-of-Scope Use",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.",
"## How to Get Started with the Model\n\nUse the code below to get started with the model.",
"## Training Details",
"### Training Data",
"### Training Procedure",
"#### Preprocessing [optional]",
"#### Training Hyperparameters\n\n- Training regime:",
"#### Speeds, Sizes, Times [optional]",
"## Evaluation",
"### Testing Data, Factors & Metrics",
"#### Testing Data",
"#### Factors",
"#### Metrics",
"### Results",
"#### Summary",
"## Model Examination [optional]",
"## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:",
"## Technical Specifications [optional]",
"### Model Architecture and Objective",
"### Compute Infrastructure",
"#### Hardware",
"#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Model Card Authors [optional]",
"## Model Card Contact"
] | [
60,
6,
3,
82,
28,
3,
4,
9,
9,
10,
42,
20,
3,
4,
5,
9,
11,
13,
3,
12,
5,
4,
5,
3,
4,
9,
53,
9,
8,
6,
3,
14,
8,
7,
9,
4
] | [
"passage: TAGS\n#transformers #safetensors #llama #text-generation #conversational #arxiv-1910.09700 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n# Model Card for Model ID## Model Details### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Downstream Use [optional]### Out-of-Scope Use## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.## How to Get Started with the Model\n\nUse the code below to get started with the model.## Training Details### Training Data### Training Procedure#### Preprocessing [optional]#### Training Hyperparameters\n\n- Training regime:#### Speeds, Sizes, Times [optional]## Evaluation### Testing Data, Factors & Metrics#### Testing Data#### Factors#### Metrics### Results#### Summary## Model Examination [optional]## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:## Technical Specifications [optional]### Model Architecture and Objective### Compute Infrastructure#### Hardware#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Model Card Authors [optional]## Model Card Contact"
] | [
-0.04654794931411743,
0.16618601977825165,
-0.005445904564112425,
0.01853804849088192,
0.0981811136007309,
0.011998992413282394,
0.06433123350143433,
0.11398410052061081,
-0.0230073444545269,
0.11406639218330383,
0.03047988750040531,
0.10172267258167267,
0.11317981779575348,
0.14841650426387787,
-0.002152352826669812,
-0.22403094172477722,
0.050844956189394,
-0.12105348706245422,
-0.033293843269348145,
0.11749980598688126,
0.1483822613954544,
-0.09928343445062637,
0.07274559140205383,
-0.029687678441405296,
-0.012143402360379696,
-0.030057786032557487,
-0.05890674889087677,
-0.046214159578084946,
0.04651786759495735,
0.06640566885471344,
0.06770290434360504,
0.0071083661168813705,
0.09012923389673233,
-0.2696533799171448,
0.018959321081638336,
0.07145345956087112,
-0.002759667346253991,
0.06957992166280746,
0.06404146552085876,
-0.07107418030500412,
0.10337356477975845,
-0.05106033384799957,
0.14650006592273712,
0.08365883678197861,
-0.09081148356199265,
-0.1895141303539276,
-0.08866965025663376,
0.09882009029388428,
0.17572562396526337,
0.04925641790032387,
-0.02320658043026924,
0.09761467576026917,
-0.08769196271896362,
0.015438909642398357,
0.04981724172830582,
-0.07620415836572647,
-0.05378096550703049,
0.05986575037240982,
0.07907199114561081,
0.06627275794744492,
-0.12434766441583633,
-0.02885502204298973,
0.005009706597775221,
0.010980482213199139,
0.0769270583987236,
0.01728810742497444,
0.146672785282135,
0.0338633768260479,
-0.12615777552127838,
-0.04880760237574577,
0.09869225323200226,
0.03395522013306618,
-0.04422314465045929,
-0.24749068915843964,
-0.03152675926685333,
-0.030810698866844177,
-0.029386121779680252,
-0.03716538846492767,
0.04340358078479767,
-0.007673026993870735,
0.08638741075992584,
-0.0060646249912679195,
-0.07403432577848434,
-0.03937075287103653,
0.06169692054390907,
0.0672287791967392,
0.02999979443848133,
-0.013745363801717758,
0.010938193649053574,
0.11620724946260452,
0.1095694974064827,
-0.12054188549518585,
-0.05555335059762001,
-0.06393084675073624,
-0.08656639605760574,
-0.040790557861328125,
0.034162238240242004,
0.03456587344408035,
0.05349370837211609,
0.25305667519569397,
0.015654386952519417,
0.059652652591466904,
0.034477248787879944,
0.007892133668065071,
0.05848940089344978,
0.11044429242610931,
-0.06018859148025513,
-0.10444226115942001,
-0.02648012898862362,
0.08843598514795303,
0.008199662901461124,
-0.03287925571203232,
-0.05088530853390694,
0.06019928678870201,
0.01946467161178589,
0.11926145106554031,
0.09061790257692337,
0.010536285117268562,
-0.07121123373508453,
-0.061038948595523834,
0.1891259253025055,
-0.16544590890407562,
0.04322727024555206,
0.035097137093544006,
-0.03903156518936157,
0.00019933005387429148,
0.013914269395172596,
0.016625655815005302,
-0.025983380153775215,
0.09017423540353775,
-0.054113563150167465,
-0.04145489260554314,
-0.11186197400093079,
-0.03383193537592888,
0.033762916922569275,
0.008953776210546494,
-0.035059962421655655,
-0.033713940531015396,
-0.08351044356822968,
-0.07577689737081528,
0.09320491552352905,
-0.07346344739198685,
-0.04878907650709152,
-0.01804324984550476,
-0.07530532777309418,
0.022395428270101547,
0.019394835457205772,
0.07707412540912628,
-0.02362251654267311,
0.04399976506829262,
-0.05189276114106178,
0.05863580107688904,
0.11207318305969238,
0.03570080175995827,
-0.05736649036407471,
0.06062258034944534,
-0.23834340274333954,
0.09552820026874542,
-0.07409077137708664,
0.05591456592082977,
-0.153293639421463,
-0.024439791217446327,
0.04788333550095558,
0.008784620091319084,
-0.009650949388742447,
0.13416339457035065,
-0.21702027320861816,
-0.02536402828991413,
0.1717337965965271,
-0.10057014971971512,
-0.07069246470928192,
0.05619903281331062,
-0.04835370555520058,
0.10988964140415192,
0.03825836628675461,
-0.025690359994769096,
0.06171267107129097,
-0.1267417073249817,
0.003717758459970355,
-0.05005312338471413,
-0.017048977315425873,
0.1548657864332199,
0.07182947546243668,
-0.07217690348625183,
0.07399354875087738,
0.025708531960844994,
-0.0246540866792202,
-0.04625825211405754,
-0.015164627693593502,
-0.10536660254001617,
0.014689887873828411,
-0.06369215250015259,
0.014470234513282776,
-0.020807426422834396,
-0.09071163833141327,
-0.027962757274508476,
-0.17504668235778809,
-0.03014434315264225,
0.08651752024888992,
-0.008693269453942776,
-0.01803150773048401,
-0.1178668737411499,
0.009341353550553322,
0.04177580401301384,
0.0061247628182172775,
-0.13462838530540466,
-0.04812471568584442,
0.02780051715672016,
-0.1600649207830429,
0.034652888774871826,
-0.05392369255423546,
0.04932025074958801,
0.025790516287088394,
-0.028889117762446404,
-0.026493212208151817,
0.021633783355355263,
0.005992184858769178,
-0.011999987065792084,
-0.24343903362751007,
-0.028118690475821495,
-0.024888472631573677,
0.1682123839855194,
-0.20917098224163055,
0.03546025976538658,
0.07867541164159775,
0.15366052091121674,
0.011240328662097454,
-0.04177491366863251,
0.005974748637527227,
-0.06935794651508331,
-0.02736494317650795,
-0.05875484645366669,
-0.0047869328409433365,
-0.03310677409172058,
-0.04545191675424576,
0.04568447172641754,
-0.16510973870754242,
-0.032636504620313644,
0.09776268899440765,
0.06289951503276825,
-0.13922683894634247,
-0.020621931180357933,
-0.03630133345723152,
-0.049253206700086594,
-0.04911839962005615,
-0.0605199858546257,
0.10893940925598145,
0.05891856551170349,
0.04574795812368393,
-0.05928509309887886,
-0.07568105310201645,
-0.001827909960411489,
-0.013898161239922047,
-0.017864689230918884,
0.09759635478258133,
0.0751434788107872,
-0.13251115381717682,
0.09224759042263031,
0.09603385627269745,
0.07919023185968399,
0.09113933145999908,
-0.02355697751045227,
-0.08261934667825699,
-0.045987509191036224,
0.031442027539014816,
0.020124373957514763,
0.13039541244506836,
-0.024294709786772728,
0.04352088272571564,
0.042134687304496765,
-0.019369594752788544,
0.014752166345715523,
-0.08687400817871094,
0.033972494304180145,
0.028472330421209335,
-0.016721390187740326,
0.050190530717372894,
-0.03876714035868645,
0.02440318465232849,
0.08830609917640686,
0.045322712510824203,
0.03507532551884651,
0.015493292361497879,
-0.05206458270549774,
-0.1083620935678482,
0.16405931115150452,
-0.12714070081710815,
-0.22483378648757935,
-0.13936103880405426,
0.0037376401014626026,
0.035628627985715866,
-0.015835661441087723,
0.002417160663753748,
-0.059374887496232986,
-0.12220635265111923,
-0.08858037739992142,
0.015140829607844353,
0.04942670464515686,
-0.09028962254524231,
-0.06437795609235764,
0.058117836713790894,
0.03889724239706993,
-0.14560972154140472,
0.017612040042877197,
0.04854894429445267,
-0.09789852797985077,
-0.006774199660867453,
0.08094939589500427,
0.0698540136218071,
0.1770169734954834,
0.017703235149383545,
-0.021850809454917908,
0.032354529947042465,
0.20614571869373322,
-0.13538233935832977,
0.11083246022462845,
0.13607586920261383,
-0.09041404724121094,
0.08072979003190994,
0.19951270520687103,
0.03932560607790947,
-0.10153959691524506,
0.031980328261852264,
0.02283124253153801,
-0.0284719280898571,
-0.24526868760585785,
-0.07212468236684799,
-0.004402178805321455,
-0.058010730892419815,
0.07660572230815887,
0.09286724030971527,
0.08215958625078201,
0.012304253876209259,
-0.09310996532440186,
-0.08154371380805969,
0.05942574888467789,
0.10367169976234436,
0.024584239348769188,
-0.010839897207915783,
0.08998730033636093,
-0.034100502729415894,
0.019626356661319733,
0.0853661298751831,
0.005239574704319239,
0.17840281128883362,
0.05159219726920128,
0.18830420076847076,
0.07925192266702652,
0.07219027727842331,
0.009912233799695969,
0.013080619275569916,
0.018877580761909485,
0.03300119563937187,
-0.002769160782918334,
-0.08440786600112915,
-0.02248465269804001,
0.11566436290740967,
0.06668911874294281,
0.010815348476171494,
0.015172341838479042,
-0.04104290530085564,
0.07965951412916183,
0.1831512451171875,
-0.007656289264559746,
-0.1783534437417984,
-0.057547420263290405,
0.07553383708000183,
-0.09879875183105469,
-0.09854305535554886,
-0.013454320840537548,
0.03072015568614006,
-0.17046253383159637,
0.023390959948301315,
-0.02239842526614666,
0.1106182336807251,
-0.14194999635219574,
-0.020490378141403198,
0.07218493521213531,
0.07199500501155853,
0.004729843698441982,
0.05758659541606903,
-0.16417601704597473,
0.10671813786029816,
0.008950476534664631,
0.06779605895280838,
-0.09610627591609955,
0.1008887067437172,
-0.004196076653897762,
-0.02063460275530815,
0.1393408179283142,
0.002700034761801362,
-0.06884108483791351,
-0.0763031542301178,
-0.08754398673772812,
-0.009632662869989872,
0.12754282355308533,
-0.1419651061296463,
0.08767123520374298,
-0.037212442606687546,
-0.0424150750041008,
-0.0017086371080949903,
-0.10206665843725204,
-0.11638247221708298,
-0.18888559937477112,
0.06001543253660202,
-0.13492922484874725,
0.03152317553758621,
-0.10799519717693329,
-0.032371897250413895,
-0.030304040759801865,
0.19337286055088043,
-0.23447458446025848,
-0.07199826091527939,
-0.1475764364004135,
-0.10233612358570099,
0.1443224400281906,
-0.0501345656812191,
0.08485390990972519,
-0.007241467013955116,
0.16846685111522675,
0.019060896709561348,
-0.02531743235886097,
0.0971490666270256,
-0.09173708409070969,
-0.19302815198898315,
-0.07869284600019455,
0.15662524104118347,
0.13260218501091003,
0.031680017709732056,
-0.002461588243022561,
0.036563750356435776,
-0.015421539545059204,
-0.11935004591941833,
0.015969349071383476,
0.1787186712026596,
0.06237189099192619,
0.02331034652888775,
-0.027346095070242882,
-0.11273157596588135,
-0.06900003552436829,
-0.028530338779091835,
0.03054865077137947,
0.17762407660484314,
-0.07057618349790573,
0.18207968771457672,
0.14163152873516083,
-0.05922834202647209,
-0.20400173962116241,
0.010538800619542599,
0.03055560030043125,
0.0009220078936778009,
0.02591954916715622,
-0.20123432576656342,
0.08688826113939285,
0.004683020059019327,
-0.05110127478837967,
0.13194532692432404,
-0.17217805981636047,
-0.14451217651367188,
0.0765485092997551,
0.038384392857551575,
-0.19559739530086517,
-0.12913893163204193,
-0.09174312651157379,
-0.045869920402765274,
-0.18591414391994476,
0.09569250047206879,
0.0305706188082695,
0.010893458500504494,
0.03030681423842907,
0.029179483652114868,
0.019487828016281128,
-0.0418255440890789,
0.18391458690166473,
-0.024792250245809555,
0.026594700291752815,
-0.08539514988660812,
-0.06927408277988434,
0.03743394836783409,
-0.052842434495687485,
0.07349982857704163,
-0.023486759513616562,
0.007861839607357979,
-0.10348054021596909,
-0.042148489505052567,
-0.03735732287168503,
0.015448716469109058,
-0.09657872468233109,
-0.08514349907636642,
-0.045032672584056854,
0.09675803780555725,
0.09690850973129272,
-0.033646680414676666,
-0.028050623834133148,
-0.07533035427331924,
0.04412057250738144,
0.19926515221595764,
0.1785389482975006,
0.042153384536504745,
-0.08034496754407883,
-0.004150947090238333,
-0.010121207684278488,
0.04310847446322441,
-0.20463712513446808,
0.06283636391162872,
0.05450061708688736,
0.01973269321024418,
0.11436162889003754,
-0.019565396010875702,
-0.15359151363372803,
-0.07263088971376419,
0.06303015351295471,
-0.060181066393852234,
-0.19620554149150848,
0.00867035984992981,
0.060603946447372437,
-0.16371412575244904,
-0.04535605385899544,
0.04643881320953369,
-0.005620351992547512,
-0.038163937628269196,
0.021896906197071075,
0.09194854646921158,
0.0026654244866222143,
0.07427921891212463,
0.05387866869568825,
0.0827430784702301,
-0.10537070035934448,
0.08090532571077347,
0.08839722722768784,
-0.08452684432268143,
0.023530138656497,
0.10478579998016357,
-0.059433579444885254,
-0.03440561518073082,
0.020135708153247833,
0.08153781294822693,
0.01775863952934742,
-0.040019966661930084,
0.013229827396571636,
-0.10452935844659805,
0.05954122915863991,
0.08839859813451767,
0.032507482916116714,
0.016702456399798393,
0.03425082191824913,
0.04607953503727913,
-0.07238735258579254,
0.12142276018857956,
0.031868141144514084,
0.017129309475421906,
-0.036505792289972305,
-0.040896978229284286,
0.019542274996638298,
-0.03214648738503456,
-0.005015232600271702,
-0.03023446537554264,
-0.07695909589529037,
-0.014793801121413708,
-0.1626158058643341,
-0.011131818406283855,
-0.05648450180888176,
0.010329355485737324,
0.03204665705561638,
-0.032609567046165466,
0.008124498650431633,
0.009250079281628132,
-0.07695289701223373,
-0.0663459524512291,
-0.020460480824112892,
0.09540658444166183,
-0.16213038563728333,
0.022481130436062813,
0.08244425803422928,
-0.12187694013118744,
0.09281346201896667,
0.016204802319407463,
-0.006236857734620571,
0.025038830935955048,
-0.1475188434123993,
0.034843120723962784,
-0.03386561945080757,
0.010836300440132618,
0.04373383894562721,
-0.21569781005382538,
-0.00004886732858722098,
-0.033673107624053955,
-0.06639216095209122,
-0.009451326914131641,
-0.03672455996274948,
-0.11508306115865707,
0.1058407872915268,
0.007236586883664131,
-0.08753558248281479,
-0.03186136856675148,
0.029325377196073532,
0.0838974118232727,
-0.021959776058793068,
0.15145497024059296,
-0.008370938710868359,
0.07429654151201248,
-0.16209737956523895,
-0.018623165786266327,
-0.006028574425727129,
0.022658247500658035,
-0.01664556935429573,
-0.01111356820911169,
0.044031109660863876,
-0.022746501490473747,
0.17925859987735748,
-0.030318550765514374,
0.02272745408117771,
0.06815794110298157,
0.019072026014328003,
-0.030184008181095123,
0.10406795144081116,
0.04094860330224037,
0.02014910988509655,
0.018591465428471565,
0.003289656015112996,
-0.04647882282733917,
-0.03173251822590828,
-0.19407226145267487,
0.07288651913404465,
0.15608493983745575,
0.09729263186454773,
-0.016707008704543114,
0.07954329252243042,
-0.10199416428804398,
-0.1109243705868721,
0.12477338314056396,
-0.04797708988189697,
-0.002418199321255088,
-0.07150927931070328,
0.13247236609458923,
0.1437523066997528,
-0.1859612911939621,
0.07269313186407089,
-0.0699717253446579,
-0.04708027467131615,
-0.10980689525604248,
-0.19441905617713928,
-0.05561789125204086,
-0.049456022679805756,
-0.016053348779678345,
-0.04698808491230011,
0.07504211366176605,
0.054538097232580185,
0.006766852922737598,
-0.0023397188633680344,
0.06506035476922989,
-0.031050674617290497,
-0.0037882844917476177,
0.032597362995147705,
0.06591679900884628,
0.012734474614262581,
-0.030802709981799126,
0.016619903966784477,
-0.013545602560043335,
0.045626189559698105,
0.06578011065721512,
0.04976864159107208,
-0.02938537672162056,
0.014603170566260815,
-0.038539156317710876,
-0.10249634087085724,
0.043612558394670486,
-0.024421939626336098,
-0.0789753645658493,
0.15477414429187775,
0.023680059239268303,
0.007779473438858986,
-0.020137663930654526,
0.23901568353176117,
-0.0738423764705658,
-0.0964353010058403,
-0.14737580716609955,
0.10557299107313156,
-0.038081806153059006,
0.05800395458936691,
0.04625935107469559,
-0.10226529091596603,
0.018044332042336464,
0.1338089406490326,
0.16182038187980652,
-0.039008259773254395,
0.020095856860280037,
0.031135575845837593,
0.00566398398950696,
-0.03622615709900856,
0.04847532883286476,
0.06906453520059586,
0.16569648683071136,
-0.04632584750652313,
0.09100406616926193,
0.0019041687482967973,
-0.09579581767320633,
-0.038361791521310806,
0.11069868505001068,
-0.016052277758717537,
0.019335128366947174,
-0.05818064883351326,
0.11742528527975082,
-0.06386786699295044,
-0.23783175647258759,
0.06453443318605423,
-0.0684293657541275,
-0.13765870034694672,
-0.02378307841718197,
0.08207765966653824,
-0.012955902144312859,
0.027587108314037323,
0.0730307325720787,
-0.07240920513868332,
0.201939657330513,
0.03798431158065796,
-0.05499868467450142,
-0.055047210305929184,
0.0805421993136406,
-0.10008571296930313,
0.2739645540714264,
0.01557221356779337,
0.04601577669382095,
0.10384146869182587,
-0.009341772645711899,
-0.13838784396648407,
0.019836371764540672,
0.09581108391284943,
-0.10502193123102188,
0.04196618124842644,
0.19815568625926971,
-0.0014755994779989123,
0.12389086186885834,
0.07657600939273834,
-0.07551808655261993,
0.0478031262755394,
-0.08054235577583313,
-0.06760486960411072,
-0.09260394424200058,
0.09703279286623001,
-0.07772123068571091,
0.14251399040222168,
0.13876807689666748,
-0.05074559152126312,
0.012724342755973339,
-0.031311117112636566,
0.044293127954006195,
-0.00010600237874314189,
0.10321761667728424,
0.004272161517292261,
-0.1832672357559204,
0.024692710489034653,
0.005650998093187809,
0.10749758034944534,
-0.16033467650413513,
-0.09566054493188858,
0.042343202978372574,
0.003505636239424348,
-0.0672195628285408,
0.1290110945701599,
0.05665452033281326,
0.04342988133430481,
-0.03997718170285225,
-0.03521440550684929,
-0.0060732318088412285,
0.13561366498470306,
-0.10713256150484085,
0.0009933578548952937
] |
null | null | diffusers | # Victoria_Whitman_RH
<Gallery />
## Model description
Here's my RVC voice model of Victoria Whitman from Rainbow High.
## Trigger words
You should use `Please spare me` to trigger the image generation.
## Download model
[Download](/LegoClipStars/Victoria_Whitman_RH/tree/main) them in the Files & versions tab.
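Below is a minimal, untested sketch of how this LoRA could be tried with `diffusers` on the listed base model; the exact weight filename inside the repo is not confirmed here, and the prompt is only an example built around the trigger phrase.

```python
# Hypothetical usage sketch (not from the original card): load the base
# SDXL model, apply this LoRA, and prompt with the trigger phrase.
import torch
from diffusers import StableDiffusionXLPipeline

pipe = StableDiffusionXLPipeline.from_pretrained(
    "cagliostrolab/animagine-xl-3.0", torch_dtype=torch.float16
).to("cuda")
pipe.load_lora_weights("LegoClipStars/Victoria_Whitman_RH")

image = pipe("Please spare me, Victoria Whitman, Rainbow High style").images[0]
image.save("victoria_whitman.png")
```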
| {"license": "cc-by-4.0", "tags": ["text-to-image", "stable-diffusion", "lora", "diffusers", "template:sd-lora"], "widget": [{"text": "NEFT", "parameters": {"negative_prompt": "High school student"}, "output": {"url": "images/20240214_172335.jpg"}}], "base_model": "cagliostrolab/animagine-xl-3.0", "instance_prompt": "Please spare me"} | text-to-image | LegoClipStars/Victoria_Whitman_RH | [
"diffusers",
"text-to-image",
"stable-diffusion",
"lora",
"template:sd-lora",
"base_model:cagliostrolab/animagine-xl-3.0",
"license:cc-by-4.0",
"region:us"
] | 2024-02-14T15:21:58+00:00 | [] | [] | TAGS
#diffusers #text-to-image #stable-diffusion #lora #template-sd-lora #base_model-cagliostrolab/animagine-xl-3.0 #license-cc-by-4.0 #region-us
| # Victoria_Whitman_RH
<Gallery />
## Model description
Here's my RVC voice model of Victoria Whitman from Rainbow High.
## Trigger words
You should use 'Please spare me' to trigger the image generation.
## Download model
Download them in the Files & versions tab.
| [
"# Victoria_Whitman_RH\n\n<Gallery />",
"## Model description \n\nHere's my RVC voice model of Victoria Whitman from Rainbow High.",
"## Trigger words\n\nYou should use 'Please spare me' to trigger the image generation.",
"## Download model\n\n\nDownload them in the Files & versions tab."
] | [
"TAGS\n#diffusers #text-to-image #stable-diffusion #lora #template-sd-lora #base_model-cagliostrolab/animagine-xl-3.0 #license-cc-by-4.0 #region-us \n",
"# Victoria_Whitman_RH\n\n<Gallery />",
"## Model description \n\nHere's my RVC voice model of Victoria Whitman from Rainbow High.",
"## Trigger words\n\nYou should use 'Please spare me' to trigger the image generation.",
"## Download model\n\n\nDownload them in the Files & versions tab."
] | [
60,
14,
19,
18,
14
] | [
"passage: TAGS\n#diffusers #text-to-image #stable-diffusion #lora #template-sd-lora #base_model-cagliostrolab/animagine-xl-3.0 #license-cc-by-4.0 #region-us \n# Victoria_Whitman_RH\n\n<Gallery />## Model description \n\nHere's my RVC voice model of Victoria Whitman from Rainbow High.## Trigger words\n\nYou should use 'Please spare me' to trigger the image generation.## Download model\n\n\nDownload them in the Files & versions tab."
] | [
-0.10562440007925034,
0.026723016053438187,
-0.002767453668639064,
0.01982390508055687,
0.09928195178508759,
0.03919558599591255,
0.17369039356708527,
0.0527053102850914,
-0.040260083973407745,
0.06690999865531921,
0.10760296881198883,
0.07657768577337265,
0.006693325936794281,
0.02226218953728676,
-0.04901387169957161,
-0.2903635799884796,
0.005266352090984583,
0.011322662234306335,
0.04297603666782379,
0.05386186018586159,
0.04486406594514847,
-0.0174796711653471,
0.11212978512048721,
-0.016010530292987823,
-0.10398934036493301,
-0.017632508650422096,
0.027494871988892555,
-0.000706024351529777,
0.05257828161120415,
0.07257276028394699,
-0.046971142292022705,
0.16347075998783112,
0.12845677137374878,
-0.22005051374435425,
0.06937648355960846,
-0.004068923182785511,
-0.07392726838588715,
0.07951132953166962,
-0.01222616620361805,
-0.08292251080274582,
0.18423333764076233,
0.03954220935702324,
-0.06794114410877228,
0.08177610486745834,
-0.052486930042505264,
-0.035004742443561554,
-0.010076554492115974,
0.05003849044442177,
0.04319014772772789,
0.02833220362663269,
-0.0060394988395273685,
0.0727904736995697,
-0.03407483547925949,
0.008832401596009731,
0.2596619427204132,
-0.26981112360954285,
-0.045509688556194305,
0.20702233910560608,
0.12491413950920105,
0.12757480144500732,
-0.07715063542127609,
0.14289309084415436,
0.020038731396198273,
0.0016124618705362082,
-0.026392584666609764,
-0.059217993170022964,
0.06718551367521286,
-0.07474000751972198,
-0.05279555916786194,
0.0836457908153534,
0.3360990583896637,
0.08259781450033188,
-0.04640330746769905,
0.003549381159245968,
-0.05346541851758957,
0.10527680069208145,
-0.08627553284168243,
0.023799844086170197,
0.017598455771803856,
0.0204866174608469,
-0.15117424726486206,
-0.1292402744293213,
-0.09398391842842102,
-0.10982924699783325,
0.02694711647927761,
-0.08232101798057556,
-0.02937306836247444,
0.06545501947402954,
-0.029621563851833344,
0.0530608594417572,
-0.12092700600624084,
-0.11730808019638062,
0.012103229761123657,
-0.1255921572446823,
0.07410339266061783,
0.11659939587116241,
-0.02494215965270996,
-0.002235585357993841,
0.04842919483780861,
0.05886270850896835,
0.16465073823928833,
-0.018234817311167717,
-0.03159269690513611,
0.15174473822116852,
-0.04202600196003914,
0.002694264519959688,
-0.09843089431524277,
-0.05492420494556427,
0.07135583460330963,
0.00014039050438441336,
0.07814092934131622,
-0.08289141952991486,
-0.13229356706142426,
-0.03725380450487137,
-0.13681724667549133,
0.027654394507408142,
-0.012346443720161915,
0.0015730963787063956,
-0.07741669565439224,
-0.0009354164358228445,
0.17806752026081085,
0.06786586344242096,
-0.0027319572400301695,
0.01829748973250389,
0.04623732343316078,
0.15659458935260773,
0.1326676309108734,
0.041117820888757706,
0.07152162492275238,
-0.040641866624355316,
-0.08510555326938629,
-0.0421464741230011,
-0.025822967290878296,
-0.0010512516601011157,
-0.006296444684267044,
-0.0980486124753952,
0.01248844899237156,
-0.1294190138578415,
-0.10353594273328781,
-0.01758577860891819,
0.03060181625187397,
-0.0993000790476799,
-0.03041967563331127,
-0.08724623173475266,
-0.07024694979190826,
0.05080582946538925,
0.022612014785408974,
-0.053556427359580994,
-0.045620907098054886,
0.0858013778924942,
-0.02900438755750656,
0.12575119733810425,
-0.1402590274810791,
0.03831768408417702,
-0.04042787104845047,
0.04858006164431572,
-0.1598798930644989,
0.06907634437084198,
-0.05987393483519554,
0.034349773079156876,
-0.04641282558441162,
-0.0852094516158104,
-0.06723041087388992,
0.020878450945019722,
-0.02792702242732048,
0.2077498435974121,
-0.263558954000473,
-0.0469970777630806,
0.06321647018194199,
-0.1320478320121765,
0.007694287691265345,
0.059838999062776566,
0.025420185178518295,
0.10338316112756729,
0.07255943864583969,
0.16431182622909546,
-0.0036561908200383186,
-0.168134868144989,
0.08582524955272675,
0.03895723447203636,
-0.06922251731157303,
-0.08864285051822662,
0.016891708597540855,
0.020998306572437286,
0.020760701969265938,
0.026307517662644386,
-0.17443031072616577,
0.012612859718501568,
-0.0817512571811676,
0.011255920864641666,
0.024992257356643677,
-0.08766140788793564,
-0.02335698902606964,
0.03297419100999832,
-0.030672157183289528,
-0.006092485971748829,
-0.0732329860329628,
-0.0865328311920166,
0.04230690747499466,
-0.08547362685203552,
0.05813491716980934,
-0.0428641214966774,
0.11033303290605545,
-0.0031425929628312588,
-0.0049476586282253265,
-0.027209609746932983,
0.11549320816993713,
-0.04014570266008377,
0.19047804176807404,
0.1421188861131668,
0.0710892528295517,
0.04271208494901657,
0.04067855328321457,
-0.03803761675953865,
0.041057080030441284,
0.04350196570158005,
-0.053489893674850464,
0.02522708848118782,
-0.19219714403152466,
0.002374449511989951,
-0.02965831384062767,
0.14589491486549377,
-0.19333301484584808,
-0.018071893602609634,
0.03952864184975624,
0.004768517799675465,
0.010293823666870594,
-0.021364307031035423,
0.04973447322845459,
-0.027544036507606506,
-0.06079646199941635,
0.0420231819152832,
0.08168326318264008,
0.024031255394220352,
-0.0912722498178482,
0.2008875459432602,
-0.08886901289224625,
-0.04568655043840408,
0.13777540624141693,
-0.17794844508171082,
0.017475465312600136,
-0.03951435536146164,
0.020523326471447945,
0.052169714123010635,
0.017201043665409088,
-0.036896008998155594,
0.011534908786416054,
-0.030781349167227745,
0.0912511944770813,
-0.04065951332449913,
0.07320144027471542,
0.04604656621813774,
-0.04645640403032303,
-0.06642138212919235,
0.08428957313299179,
0.22140365839004517,
0.09488403797149658,
-0.04092077910900116,
0.10440775007009506,
-0.06714871525764465,
0.16760118305683136,
0.06133623421192169,
-0.03338543325662613,
0.0270096343010664,
-0.1065063327550888,
0.04621993005275726,
0.1694774329662323,
-0.07681450992822647,
-0.05738387629389763,
0.0480007603764534,
-0.07097838073968887,
-0.05225498974323273,
-0.1323111653327942,
-0.08873818814754486,
-0.00980902649462223,
-0.04062480106949806,
-0.0445699617266655,
0.0831766128540039,
-0.10055191069841385,
0.05445712432265282,
-0.08008906245231628,
-0.12302245199680328,
-0.012485786341130733,
-0.035707518458366394,
-0.0603407546877861,
0.07175138592720032,
-0.03480846807360649,
-0.06512511521577835,
-0.1962568759918213,
-0.06039726361632347,
-0.000022728221665602177,
0.05582929402589798,
0.07978446781635284,
-0.06432273983955383,
-0.05861577019095421,
-0.028806256130337715,
0.07781671732664108,
0.006775110960006714,
-0.043164368718862534,
-0.050697825849056244,
0.020168272778391838,
-0.00956494826823473,
-0.13448572158813477,
0.029364386573433876,
-0.0650540441274643,
-0.06375082582235336,
0.05165582895278931,
-0.07945112138986588,
0.18030771613121033,
0.11712955683469772,
0.07407087087631226,
0.028934121131896973,
0.03815343230962753,
0.197331041097641,
-0.06547579914331436,
0.09333141148090363,
0.14288094639778137,
0.023116163909435272,
0.03750644624233246,
0.13712498545646667,
0.04161308705806732,
-0.08528702706098557,
0.00558597594499588,
-0.06149715185165405,
-0.10501205921173096,
-0.08954384177923203,
-0.09141464531421661,
-0.053022559732198715,
0.019809285178780556,
-0.037384338676929474,
0.01206947397440672,
0.037155881524086,
0.1332060545682907,
-0.009654806926846504,
0.003895934671163559,
0.032218411564826965,
0.047515012323856354,
0.0949750691652298,
-0.03229041025042534,
0.08348681777715683,
-0.03374766930937767,
-0.024880312383174896,
0.07298222184181213,
-0.052768152207136154,
0.15225103497505188,
0.05205273628234863,
0.1644987165927887,
0.10329827666282654,
0.10838188976049423,
0.17061719298362732,
0.04835406318306923,
0.01435585506260395,
-0.01098833978176117,
-0.032899655401706696,
-0.10054630786180496,
0.05761103704571724,
0.05331579968333244,
0.005765449721366167,
-0.08996033668518066,
0.00522297527641058,
0.022089019417762756,
-0.07448721677064896,
0.05148160830140114,
0.09639829397201538,
-0.19562026858329773,
0.06960170716047287,
0.060878101736307144,
0.07416725903749466,
-0.027915969491004944,
0.0603783018887043,
0.0736776813864708,
-0.053470510989427567,
0.005325851961970329,
0.027289364486932755,
0.09309559315443039,
-0.1105019673705101,
0.008258689194917679,
-0.07021106034517288,
0.04089755937457085,
-0.011536892503499985,
0.027968941256403923,
-0.026031410321593285,
0.17736749351024628,
0.011226147413253784,
-0.04250805079936981,
0.025673331692814827,
-0.10490404069423676,
0.05364198237657547,
0.08321553468704224,
0.13159780204296112,
-0.013062428683042526,
0.04904763773083687,
-0.028803115710616112,
-0.09623952955007553,
0.014956643804907799,
0.1030409187078476,
-0.04933055862784386,
0.03552539274096489,
0.0022216252982616425,
-0.060734447091817856,
-0.0076263220980763435,
0.17223741114139557,
-0.14106012880802155,
-0.0976506844162941,
-0.01529125589877367,
0.1251552700996399,
0.1117875725030899,
0.009502731263637543,
-0.0594882033765316,
-0.05320978909730911,
-0.08147663623094559,
0.009276908822357655,
-0.002503408584743738,
0.0027851301711052656,
-0.05247493460774422,
0.08144450932741165,
-0.039430201053619385,
0.04373197257518768,
-0.011509956791996956,
0.10074958205223083,
-0.07628080248832703,
-0.08523473143577576,
0.01617361791431904,
-0.059418320655822754,
-0.12982600927352905,
-0.0629778578877449,
0.16878560185432434,
0.041104692965745926,
0.0656125470995903,
0.052121713757514954,
0.056237660348415375,
-0.05030342563986778,
-0.07409738749265671,
0.1138966903090477,
-0.10777395218610764,
-0.12068300694227219,
0.01108808908611536,
-0.05772235244512558,
-0.024632927030324936,
-0.06845001876354218,
0.010298086330294609,
0.06604259461164474,
0.28501832485198975,
-0.0758068785071373,
0.08189880847930908,
0.19871841371059418,
-0.02932930923998356,
-0.26690417528152466,
-0.04800092428922653,
-0.0731360986828804,
-0.015198225155472755,
0.08841690421104431,
-0.1289239525794983,
0.09356579929590225,
0.04006562754511833,
-0.03020060993731022,
0.20441247522830963,
-0.23724471032619476,
-0.0800260379910469,
0.046119749546051025,
0.061944231390953064,
0.21319441497325897,
-0.1874142289161682,
-0.08060669153928757,
-0.1278114765882492,
-0.14514446258544922,
0.08302082866430283,
-0.05772143229842186,
0.1058267280459404,
-0.009699338115751743,
0.011711753904819489,
-0.009968726895749569,
0.008423687890172005,
0.10384205728769302,
0.011900998651981354,
0.013257643207907677,
-0.07387549430131912,
0.022433897480368614,
0.15016178786754608,
0.02326376549899578,
0.0005260796169750392,
-0.22111108899116516,
-0.022591186687350273,
0.0024360138922929764,
-0.00209467438980937,
-0.0837060809135437,
-0.0019311439245939255,
-0.013609027490019798,
-0.02061799354851246,
-0.05864541977643967,
-0.008033095858991146,
-0.04111595079302788,
0.03878933563828468,
0.011519168503582478,
-0.12630629539489746,
0.024576885625720024,
0.07990048080682755,
0.06369288265705109,
0.0009701543021947145,
-0.08763444423675537,
-0.0542791448533535,
-0.07946356385946274,
0.07769612967967987,
-0.140825554728508,
-0.009232604876160622,
0.03801799193024635,
0.04384361207485199,
0.09166602790355682,
0.028495116159319878,
-0.05978524684906006,
0.09277337044477463,
0.15209078788757324,
-0.11115819215774536,
-0.09692500531673431,
-0.06118929013609886,
0.03906850144267082,
0.10244475305080414,
-0.03841251879930496,
0.11669304966926575,
-0.04443072900176048,
-0.006437279284000397,
0.0026729267556220293,
0.02178657241165638,
-0.08277978748083115,
0.01607697829604149,
0.11483100801706314,
0.008577396161854267,
-0.10283759236335754,
0.0490802600979805,
-0.01097217295318842,
0.054609715938568115,
-0.054229486733675,
0.13562148809432983,
-0.07610581815242767,
-0.061534810811281204,
-0.053290873765945435,
0.07226387411355972,
-0.18399181962013245,
0.03692585602402687,
-0.07264436781406403,
-0.08439921587705612,
-0.04996873810887337,
0.08055955916643143,
0.052083611488342285,
-0.03876492753624916,
-0.006588868331164122,
-0.04089759290218353,
-0.05282778665423393,
0.02487317845225334,
-0.003397288266569376,
0.0546465702354908,
-0.16038279235363007,
-0.12118012458086014,
0.01611022651195526,
0.04107077047228813,
-0.06224767491221428,
-0.05283794179558754,
-0.0493868850171566,
0.02240477129817009,
-0.09758815169334412,
0.03214984014630318,
-0.03689456731081009,
-0.059778206050395966,
-0.036185745149850845,
0.022207587957382202,
-0.05615420266985893,
-0.01810835301876068,
-0.05426154285669327,
-0.03573661297559738,
0.004645666107535362,
-0.008983534760773182,
-0.058512717485427856,
0.02451367862522602,
0.09024330228567123,
-0.07743655145168304,
0.06599976867437363,
0.027367793023586273,
-0.0611894428730011,
0.017966318875551224,
-0.24463333189487457,
-0.05686580389738083,
0.042328692972660065,
0.010251994244754314,
-0.00539544178172946,
0.09446258097887039,
-0.01034641545265913,
-0.026983268558979034,
0.05811198428273201,
-0.02870391495525837,
0.005817418452352285,
-0.06795560568571091,
0.07299556583166122,
-0.07123110443353653,
-0.024067102000117302,
-0.03432373329997063,
-0.016913212835788727,
0.0789506658911705,
0.08662066608667374,
0.09855551272630692,
-0.06713015586137772,
0.05126809701323509,
-0.061773836612701416,
0.03291291370987892,
0.015768421813845634,
-0.07358711212873459,
-0.04001523554325104,
-0.1333174705505371,
0.010822121985256672,
-0.02977246604859829,
0.08377222716808319,
0.02135835587978363,
-0.07032154500484467,
-0.03526483476161957,
0.1305750459432602,
0.01692480407655239,
-0.03125613182783127,
0.23856370151042938,
0.12553539872169495,
0.07100845128297806,
-0.062216151505708694,
0.08755442500114441,
0.0783500149846077,
0.08515093475580215,
-0.03006988950073719,
0.010877564549446106,
0.0629928782582283,
0.08313940465450287,
0.16736909747123718,
0.015718728303909302,
-0.010263307951390743,
0.0439113974571228,
0.0852590873837471,
0.06329164654016495,
-0.04728984460234642,
0.03709767013788223,
0.16271519660949707,
-0.05241266265511513,
0.0036878734827041626,
0.02255861461162567,
-0.03694714605808258,
-0.12984412908554077,
-0.22216740250587463,
-0.07886773347854614,
-0.24244453012943268,
0.08175007998943329,
-0.042755819857120514,
-0.042300764471292496,
0.09356275200843811,
0.004226847551763058,
0.013062546961009502,
-0.006255107466131449,
-0.060440242290496826,
-0.08443129807710648,
0.09104447066783905,
-0.04139377176761627,
-0.0693604126572609,
0.03704614192247391,
0.0341746024787426,
0.12021666765213013,
-0.071384958922863,
-0.0133071793243289,
0.06846282631158829,
0.003404957940801978,
0.03708009794354439,
-0.0495949350297451,
-0.152370423078537,
-0.01038464903831482,
0.028619330376386642,
0.002650961047038436,
0.24503491818904877,
0.029474852606654167,
-0.014290855266153812,
-0.0006690928130410612,
0.1326640397310257,
0.008058406412601471,
0.023383179679512978,
-0.047509633004665375,
-0.00008303311187773943,
-0.11297435313463211,
0.053804535418748856,
-0.08682059496641159,
-0.08557841181755066,
0.019101521000266075,
0.3397303521633148,
0.27869653701782227,
-0.11899449676275253,
0.009854094125330448,
0.06787669658660889,
0.0007560279336757958,
-0.022502213716506958,
0.010474120266735554,
0.09117802232503891,
0.253572553396225,
-0.09917448461055756,
-0.03927338123321533,
-0.08159655332565308,
-0.04224690794944763,
-0.031464651226997375,
-0.0752558559179306,
-0.0010226389858871698,
-0.06519216299057007,
-0.007101183291524649,
0.13048872351646423,
-0.191161647439003,
-0.10221870243549347,
0.06412332504987717,
-0.07308501750230789,
0.03846953064203262,
-0.10625061392784119,
-0.03241468966007233,
0.12232037633657455,
0.047125887125730515,
-0.12742501497268677,
-0.01877759024500847,
0.05669130012392998,
-0.00824737548828125,
-0.18652856349945068,
-0.0602697990834713,
-0.01928369514644146,
-0.18371683359146118,
0.08792012929916382,
-0.07708071917295456,
-0.0085215512663126,
-0.003657990600913763,
0.06404710561037064,
-0.016437718644738197,
0.13373535871505737,
0.0018110176315531135,
-0.111323781311512,
-0.06021786481142044,
0.22569209337234497,
-0.04156307503581047,
0.12385046482086182,
0.0074182539246976376,
-0.036146197468042374,
0.0742415189743042,
0.12835462391376495,
-0.07694198936223984,
-0.033787794411182404,
0.01568482629954815,
-0.15788255631923676,
0.055356770753860474,
0.026492517441511154,
0.013503368943929672,
-0.08619582653045654,
-0.029382016509771347,
0.002709185006096959,
0.07910978049039841,
-0.12694548070430756,
0.013050151988863945,
-0.04515361040830612,
-0.055390410125255585,
0.054844487458467484,
0.022944556549191475,
-0.26518380641937256,
-0.0028687305748462677,
-0.1539127379655838,
-0.015324458479881287,
-0.024211078882217407,
0.0484735369682312,
0.22655025124549866,
0.015359057113528252,
0.0060188667848706245,
-0.06739554554224014,
0.05766001343727112,
0.064011350274086,
-0.15340334177017212,
-0.10437414050102234
] |
null | null | diffusers | ### Nora_Appalachian_Cottontail Dreambooth model trained by kimelyle with [TheLastBen's fast-DreamBooth](https://colab.research.google.com/github/TheLastBen/fast-stable-diffusion/blob/main/fast-DreamBooth.ipynb) notebook
Test the concept via A1111 Colab [fast-Colab-A1111](https://colab.research.google.com/github/TheLastBen/fast-stable-diffusion/blob/main/fast_stable_diffusion_AUTOMATIC1111.ipynb)
Sample pictures of this concept:
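As a rough, unverified sketch, the checkpoint could be loaded as a standard Stable Diffusion pipeline (per the `StableDiffusionPipeline` tag); the instance prompt below is a placeholder, since the actual DreamBooth token is not stated in the card.

```python
# Hypothetical usage sketch: load the DreamBooth checkpoint and generate
# an image. The prompt token "nora_appalachian_cottontail" is assumed.
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "kimelyle/nora-appalachian-cottontail", torch_dtype=torch.float16
).to("cuda")

image = pipe("a photo of nora_appalachian_cottontail").images[0]
image.save("nora.png")
```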
| {"license": "creativeml-openrail-m", "tags": ["text-to-image", "stable-diffusion"]} | text-to-image | kimelyle/nora-appalachian-cottontail | [
"diffusers",
"text-to-image",
"stable-diffusion",
"license:creativeml-openrail-m",
"endpoints_compatible",
"diffusers:StableDiffusionPipeline",
"region:us"
] | 2024-02-14T15:24:53+00:00 | [] | [] | TAGS
#diffusers #text-to-image #stable-diffusion #license-creativeml-openrail-m #endpoints_compatible #diffusers-StableDiffusionPipeline #region-us
| ### Nora_Appalachian_Cottontail Dreambooth model trained by kimelyle with TheLastBen's fast-DreamBooth notebook
Test the concept via A1111 Colab fast-Colab-A1111
Sample pictures of this concept:
| [
"### Nora_Appalachian_Cottontail Dreambooth model trained by kimelyle with TheLastBen's fast-DreamBooth notebook\n\n\nTest the concept via A1111 Colab fast-Colab-A1111\n\nSample pictures of this concept:"
] | [
"TAGS\n#diffusers #text-to-image #stable-diffusion #license-creativeml-openrail-m #endpoints_compatible #diffusers-StableDiffusionPipeline #region-us \n",
"### Nora_Appalachian_Cottontail Dreambooth model trained by kimelyle with TheLastBen's fast-DreamBooth notebook\n\n\nTest the concept via A1111 Colab fast-Colab-A1111\n\nSample pictures of this concept:"
] | [
56,
57
] | [
"passage: TAGS\n#diffusers #text-to-image #stable-diffusion #license-creativeml-openrail-m #endpoints_compatible #diffusers-StableDiffusionPipeline #region-us \n### Nora_Appalachian_Cottontail Dreambooth model trained by kimelyle with TheLastBen's fast-DreamBooth notebook\n\n\nTest the concept via A1111 Colab fast-Colab-A1111\n\nSample pictures of this concept:"
] | [
-0.07445494085550308,
0.07362834364175797,
-0.0020805145613849163,
0.07308696210384369,
0.015484548173844814,
-0.011046695522964,
0.13008932769298553,
0.009817221201956272,
0.008744768798351288,
0.022441551089286804,
0.15773388743400574,
0.04339431971311569,
0.001483261352404952,
0.12965650856494904,
-0.06050420552492142,
-0.17346049845218658,
-0.003509979695081711,
0.042056821286678314,
-0.027996568009257317,
0.09322573989629745,
0.07167099416255951,
-0.07475994527339935,
0.09883002936840057,
-0.027186142280697823,
-0.14329202473163605,
-0.013693371787667274,
-0.10321362316608429,
-0.03255974501371384,
0.08491851389408112,
0.027144482359290123,
0.08202892541885376,
0.12606973946094513,
-0.005061754956841469,
-0.09717235714197159,
0.052777357399463654,
-0.05073295161128044,
-0.010430295951664448,
0.04535799100995064,
-0.0352494902908802,
0.03759589046239853,
0.07369991391897202,
0.10314073413610458,
0.0012776536168530583,
0.00016544577374588698,
-0.06798174232244492,
0.07360052317380905,
0.013475894927978516,
0.08171220123767853,
0.025569459423422813,
0.015778295695781708,
0.012176874093711376,
0.07640448957681656,
-0.01743028126657009,
0.10541483759880066,
0.16547313332557678,
-0.2269507199525833,
-0.09440919756889343,
0.24530962109565735,
0.12363296747207642,
-0.07545089721679688,
-0.03175533562898636,
0.06946408003568649,
0.0466063991189003,
0.038477931171655655,
-0.051234565675258636,
-0.08295278996229172,
-0.027862248942255974,
-0.07091072201728821,
-0.08788727223873138,
0.0366458036005497,
0.19277851283550262,
0.015997491776943207,
-0.04255440831184387,
-0.014819457195699215,
-0.08665326237678528,
0.10436265170574188,
-0.04939907044172287,
-0.048409413546323776,
-0.0214834064245224,
-0.004623545333743095,
-0.07364565879106522,
-0.06498811393976212,
-0.10110161453485489,
-0.055869512259960175,
-0.049723684787750244,
0.1388838291168213,
-0.025645779445767403,
0.04579804465174675,
-0.06625408679246902,
0.1642085760831833,
0.0037988622207194567,
-0.13863372802734375,
-0.01365630328655243,
-0.13464213907718658,
0.05994485318660736,
0.0005595083930529654,
0.01099927444010973,
-0.024904383346438408,
0.0895136147737503,
0.03150108829140663,
0.1741475909948349,
-0.0034411654341965914,
0.10710341483354568,
0.09174764156341553,
0.016042137518525124,
-0.005471601616591215,
0.036896929144859314,
-0.1550453156232834,
-0.02019835077226162,
0.016360178589820862,
0.0030360673554241657,
-0.037533462047576904,
-0.08003970235586166,
-0.03318183869123459,
-0.05122542753815651,
0.0006819299887865782,
0.04879641905426979,
-0.00592040317133069,
-0.06920292973518372,
-0.0315365232527256,
0.12966610491275787,
0.03940137103199959,
-0.03196967393159866,
-0.06129283457994461,
-0.08012005686759949,
0.03520921245217323,
0.1186058446764946,
0.00020913040498271585,
0.0296027734875679,
0.12215641140937805,
-0.0820624977350235,
-0.02714284136891365,
-0.010947774164378643,
0.011186020448803902,
0.004143645521253347,
-0.11714330315589905,
0.0617598257958889,
-0.14729928970336914,
-0.22649499773979187,
-0.009586503729224205,
0.07981770485639572,
-0.0697089359164238,
-0.04592001810669899,
-0.05738898366689682,
-0.10797202587127686,
0.02292841486632824,
0.028470225632190704,
-0.010159937664866447,
-0.016763143241405487,
0.0350056029856205,
0.02492905966937542,
0.10187606513500214,
-0.08658518642187119,
-0.027555033564567566,
-0.10699653625488281,
0.030422385782003403,
-0.07250527292490005,
0.029196111485362053,
-0.07451073080301285,
0.1482716202735901,
-0.03462446480989456,
-0.0526077002286911,
-0.027460193261504173,
0.011819840408861637,
0.006186296697705984,
0.21776659786701202,
-0.1370992213487625,
-0.010578718967735767,
0.07479570060968399,
-0.10956564545631409,
-0.20997723937034607,
0.05952530354261398,
0.0062123993411660194,
0.21216802299022675,
0.023750798776745796,
0.07955764979124069,
0.1247793510556221,
-0.29760560393333435,
0.0006036445847712457,
0.03950994834303856,
-0.11443764716386795,
-0.1099645346403122,
0.01605621911585331,
0.10612618178129196,
0.010077642276883125,
0.01871742680668831,
0.010231999680399895,
0.06552151590585709,
-0.08696542680263519,
-0.04199698567390442,
-0.06787028908729553,
-0.0676102265715599,
0.00423041544854641,
0.022045383229851723,
0.02535586804151535,
-0.055255740880966187,
0.01985028386116028,
-0.03300221636891365,
-0.007268122397363186,
0.04005841165781021,
-0.033349938690662384,
-0.059622589498758316,
0.040826041251420975,
-0.005253744777292013,
-0.029502471908926964,
-0.02544121816754341,
-0.07787073403596878,
0.0066113849170506,
0.10594458132982254,
0.017583860084414482,
0.1751570999622345,
0.047640491276979446,
0.059681277722120285,
0.006633146200329065,
-0.0278492271900177,
0.05638342350721359,
0.023460013791918755,
-0.030774665996432304,
-0.13106249272823334,
0.10937709361314774,
-0.05850598216056824,
-0.03723349794745445,
-0.11009318381547928,
0.026113159954547882,
0.06973330676555634,
0.15873581171035767,
0.050738077610731125,
0.023730315268039703,
0.03844399377703667,
0.0014723635977134109,
-0.039195094257593155,
-0.04289288446307182,
0.07764147967100143,
0.0416187047958374,
0.014406193979084492,
0.10851091146469116,
0.0008782520308159292,
0.23362182080745697,
0.06976204365491867,
0.0265874695032835,
-0.06348662823438644,
-0.08149631321430206,
-0.04819007217884064,
-0.027448290959000587,
-0.012876065447926521,
0.06748976558446884,
0.08593626320362091,
-0.01822712831199169,
0.13285507261753082,
-0.03288118541240692,
-0.002351648872718215,
0.025395656004548073,
-0.07154010236263275,
-0.017112964764237404,
0.08390432596206665,
-0.03105248510837555,
-0.1603061407804489,
0.029279205948114395,
0.15502339601516724,
-0.03560926020145416,
0.18815574049949646,
0.02196997031569481,
0.030824102461338043,
-0.10664444416761398,
-0.010752998292446136,
-0.03801076114177704,
0.21400213241577148,
-0.07929693162441254,
0.006141891703009605,
0.013273513875901699,
-0.012743747793138027,
0.04137222468852997,
-0.08289933204650879,
-0.05088413134217262,
0.03980482369661331,
0.02185843512415886,
0.13529075682163239,
0.10790719836950302,
-0.09612047672271729,
0.022917138412594795,
-0.049105070531368256,
-0.16617362201213837,
0.031212396919727325,
-0.00948242750018835,
0.020655132830142975,
0.12175372242927551,
-0.02190328575670719,
-0.199759379029274,
-0.1253083497285843,
-0.09675022959709167,
0.02910841442644596,
-0.0481715090572834,
0.0671062096953392,
0.04375794529914856,
-0.025174185633659363,
-0.06624244898557663,
0.06323801726102829,
-0.03220435231924057,
0.008670756593346596,
0.00005682411210727878,
0.04408428817987442,
-0.1001744195818901,
-0.043127626180648804,
-0.0012448065681383014,
-0.021066969260573387,
0.1618468016386032,
0.13102154433727264,
-0.06513255089521408,
0.11818013340234756,
0.09501656144857407,
-0.008253623731434345,
0.0032074281480163336,
0.02833200991153717,
0.27542197704315186,
-0.037656497210264206,
0.1253364384174347,
0.12401431798934937,
0.08192852884531021,
0.06548893451690674,
0.16664063930511475,
0.040316492319107056,
-0.04663095623254776,
0.09082400798797607,
-0.08902772516012192,
-0.09146848320960999,
-0.08934107422828674,
-0.0935753807425499,
-0.010693363845348358,
0.1022612601518631,
-0.002891009906306863,
0.04203835502266884,
0.032298266887664795,
0.1614605039358139,
0.10957105457782745,
-0.033363714814186096,
-0.07669483125209808,
0.0895790383219719,
0.16220515966415405,
-0.06707720458507538,
0.037730228155851364,
-0.07063047587871552,
-0.08930806815624237,
0.08197131007909775,
0.058133527636528015,
0.05484049394726753,
-0.05130298435688019,
-0.0505719892680645,
0.08105359971523285,
0.11302770674228668,
0.1300048530101776,
0.09061603993177414,
0.028907183557748795,
-0.08372951298952103,
-0.02571662701666355,
-0.07433489710092545,
0.07626241445541382,
0.0850156620144844,
-0.03023040108382702,
-0.03539188951253891,
0.05071476474404335,
0.06473949551582336,
-0.031771380454301834,
0.05124977231025696,
0.16639722883701324,
-0.2150334119796753,
-0.026679009199142456,
-0.02879973128437996,
0.05725519731640816,
-0.08892390131950378,
0.02678331546485424,
0.19329971075057983,
-0.04209601879119873,
-0.012212404981255531,
-0.07211869210004807,
0.07263345271348953,
0.03717968985438347,
-0.004422674886882305,
-0.05406051501631737,
-0.008389939554035664,
-0.02332601323723793,
0.0383530855178833,
-0.15123607218265533,
0.12014242261648178,
-0.031116727739572525,
0.05989297106862068,
0.022776197642087936,
-0.04366203770041466,
0.029163340106606483,
0.13465936481952667,
0.13412147760391235,
-0.024910544976592064,
0.059002287685871124,
0.01914706453680992,
-0.1382865309715271,
0.009510151110589504,
0.07752564549446106,
0.06252794712781906,
0.041707802563905716,
0.04090840369462967,
-0.014798164367675781,
-0.005105642601847649,
0.007773797959089279,
-0.15006086230278015,
-0.05298856273293495,
0.026677483692765236,
0.08272074908018112,
-0.0031998998019844294,
-0.035311259329319,
-0.05376563221216202,
0.0871644839644432,
0.11853469163179398,
-0.0945211872458458,
-0.04338811710476875,
-0.06736888736486435,
-0.14655181765556335,
0.09023823589086533,
-0.03150046616792679,
0.07010694593191147,
-0.10732819139957428,
0.026597699150443077,
-0.038164474070072174,
-0.08481820672750473,
0.031580954790115356,
-0.15145447850227356,
-0.07763878256082535,
-0.14239096641540527,
0.03030874766409397,
-0.019206758588552475,
-0.011866572313010693,
0.03284496068954468,
-0.010509824380278587,
-0.15581724047660828,
-0.08592326194047928,
0.01597488485276699,
-0.04244775325059891,
-0.0844329446554184,
-0.008504724130034447,
0.023952879011631012,
0.07873638719320297,
-0.010076653212308884,
-0.013737721368670464,
0.07385455816984177,
0.2624082565307617,
-0.055160053074359894,
0.05509914085268974,
0.148071750998497,
-0.045266102999448776,
-0.26209887862205505,
-0.12824797630310059,
-0.040919966995716095,
0.015184026211500168,
-0.05901758372783661,
-0.09820874035358429,
0.17910915613174438,
-0.008753570728003979,
-0.04273310303688049,
0.21284615993499756,
-0.3257221281528473,
-0.08218519389629364,
0.1005258560180664,
0.08584895730018616,
0.3802810609340668,
-0.10565321892499924,
-0.06302984058856964,
-0.01701871119439602,
-0.24954497814178467,
0.15062567591667175,
0.058490194380283356,
0.08548770099878311,
-0.11457589268684387,
0.03069322742521763,
-0.01368781179189682,
-0.05296824499964714,
0.1321226954460144,
-0.07069150358438492,
0.04620320349931717,
-0.11234907060861588,
0.0014213940594345331,
0.09804631769657135,
-0.053344856947660446,
0.06953433901071548,
-0.06872037798166275,
0.12126768380403519,
-0.04481273889541626,
-0.029850106686353683,
-0.041958194226026535,
0.08104628324508667,
-0.06130778789520264,
-0.10029393434524536,
-0.11573080718517303,
0.04467102140188217,
-0.030745794996619225,
-0.01279903668910265,
-0.08926157653331757,
0.01719929464161396,
-0.0915008932352066,
0.2328980714082718,
-0.02935279719531536,
-0.0755992904305458,
-0.07069972902536392,
-0.012190189212560654,
-0.05832685902714729,
0.05960807576775551,
-0.0653756856918335,
-0.08957464247941971,
0.21197092533111572,
0.054545801132917404,
0.05223007872700691,
0.03648095205426216,
-0.028102541342377663,
0.00258867465890944,
0.11641491949558258,
-0.1595626026391983,
-0.04145512357354164,
-0.07369854301214218,
0.126066654920578,
0.033220864832401276,
-0.01869886927306652,
0.14602315425872803,
-0.09438055753707886,
0.05391628295183182,
-0.0351150706410408,
-0.04355797916650772,
-0.008332714438438416,
0.10584257543087006,
0.042758259922266006,
0.06193789839744568,
-0.028662189841270447,
0.067612424492836,
-0.060329996049404144,
-0.09522674232721329,
-0.0983014702796936,
0.08188951015472412,
-0.0884387195110321,
-0.06964345276355743,
0.02216051146388054,
0.15136054158210754,
-0.21660469472408295,
0.004257391206920147,
-0.12363551557064056,
-0.08687475323677063,
0.02865569479763508,
0.20367854833602905,
0.07451706379652023,
0.0734059065580368,
-0.027446512132883072,
-0.07170969247817993,
-0.007556090131402016,
0.08184293657541275,
0.05363665148615837,
0.08334118127822876,
-0.21275006234645844,
-0.06989370286464691,
-0.049579910933971405,
0.05595098063349724,
-0.08940395712852478,
-0.04186710715293884,
-0.07103003561496735,
0.0048342421650886536,
-0.046871159225702286,
0.09791892021894455,
-0.05647244304418564,
-0.06383316963911057,
-0.01432635448873043,
-0.021167324855923653,
-0.03168873116374016,
-0.0034904780331999063,
-0.05550768971443176,
0.04504651576280594,
0.01381529588252306,
-0.0036602148320525885,
-0.03968719765543938,
-0.06042587384581566,
0.07178401947021484,
-0.057548560202121735,
0.045597199350595474,
-0.028985822573304176,
-0.10534381866455078,
-0.037822071462869644,
-0.14941038191318512,
0.00625890027731657,
0.10927597433328629,
-0.013113350607454777,
0.021800627931952477,
0.017271842807531357,
-0.03723769262433052,
-0.018333297222852707,
0.07347110658884048,
0.016997799277305603,
0.03527721017599106,
-0.10340835154056549,
-0.07467515021562576,
-0.011745822615921497,
-0.06873758137226105,
-0.08486848324537277,
-0.027808036655187607,
0.12725354731082916,
0.0981370359659195,
0.146804541349411,
-0.09545319527387619,
0.06168952211737633,
0.0008878822554834187,
-0.00503719924017787,
0.06433375179767609,
-0.08943714946508408,
0.07629391551017761,
-0.04329519346356392,
-0.039916470646858215,
0.019874311983585358,
0.13069842755794525,
-0.0023300754837691784,
-0.22419936954975128,
-0.007507979404181242,
-0.11713670194149017,
-0.060705360025167465,
0.021710222586989403,
0.20340965688228607,
0.03922765702009201,
0.021319303661584854,
-0.12807747721672058,
0.0847734808921814,
0.0652594193816185,
0.11599427461624146,
0.04141326621174812,
0.09948879480361938,
0.06243507191538811,
0.12936361134052277,
0.017649980261921883,
0.07052882015705109,
-0.004226116929203272,
0.01617956906557083,
-0.058403074741363525,
0.14569132030010223,
-0.05819958820939064,
-0.0018385635921731591,
0.04783325642347336,
0.005176198668777943,
-0.026142410933971405,
0.0023738166783005,
-0.0804639458656311,
-0.034402996301651,
0.008622980676591396,
-0.08318112045526505,
-0.08032304793596268,
0.030692854896187782,
-0.0574212409555912,
-0.030271198600530624,
0.05656074360013008,
0.045980054885149,
-0.03939659148454666,
0.09807900339365005,
0.000883345550391823,
-0.011924232356250286,
0.16845320165157318,
-0.031770072877407074,
-0.0902097150683403,
-0.021699896082282066,
0.0681418776512146,
-0.08425790071487427,
0.0535842701792717,
-0.08862198889255524,
0.03457064554095268,
-0.03688934072852135,
-0.029913993552327156,
0.02742752991616726,
-0.054741520434617996,
-0.025701487436890602,
0.03758315369486809,
0.030745046213269234,
0.03437190502882004,
0.03183051571249962,
0.00012625689851120114,
0.0026661644224077463,
0.19468244910240173,
-0.031002044677734375,
-0.15329977869987488,
-0.08233873546123505,
0.07193559408187866,
-0.10239644348621368,
0.0766114667057991,
-0.033967576920986176,
-0.005335276015102863,
-0.052903033792972565,
0.14291365444660187,
0.116013303399086,
-0.12534376978874207,
-0.004659036640077829,
0.007068224251270294,
0.0019392630783841014,
-0.025138655677437782,
0.027217017486691475,
0.02790846861898899,
0.23206891119480133,
-0.08929189294576645,
-0.07639726996421814,
-0.09790782630443573,
-0.07953149825334549,
-0.006822247989475727,
-0.15402951836585999,
0.05314423888921738,
-0.016273967921733856,
-0.13851888477802277,
0.10186257213354111,
-0.20376743376255035,
-0.025731738656759262,
0.18467308580875397,
-0.11281509697437286,
-0.06425464153289795,
-0.047935087233781815,
0.11265549808740616,
0.017600372433662415,
0.089471735060215,
-0.11382418125867844,
-0.011328289285302162,
0.01092974841594696,
-0.05381771922111511,
-0.13985992968082428,
0.06326350569725037,
0.010678157210350037,
-0.2341020405292511,
0.14070743322372437,
-0.02243228815495968,
0.05764583498239517,
0.08816066384315491,
-0.03849261254072189,
-0.12545731663703918,
0.053999509662389755,
0.011926761828362942,
-0.08547472208738327,
-0.022899549454450607,
0.092320017516613,
0.04709002375602722,
0.021912280470132828,
0.04393427446484566,
-0.11271513253450394,
-0.008650006726384163,
0.16641999781131744,
0.02074385993182659,
-0.15723879635334015,
0.06220448762178421,
-0.01138245314359665,
0.07950451225042343,
0.05480743572115898,
-0.049971044063568115,
-0.0007088800775818527,
0.0000752273335820064,
0.051778052002191544,
0.004298151936382055,
-0.08182810246944427,
0.036807216703891754,
-0.015350248664617538,
-0.0029761171899735928,
-0.02739034593105316,
-0.014994185417890549,
-0.257389634847641,
-0.0823015496134758,
-0.1596459001302719,
0.023696813732385635,
0.007412672974169254,
0.10851756483316422,
0.16570185124874115,
0.038631830364465714,
0.02816605381667614,
-0.008377929218113422,
-0.048313938081264496,
0.014092091470956802,
-0.030353814363479614,
-0.13736574351787567
] |
null | null | null |
# PPO Agent Playing CartPole-v1
This is a trained model of a PPO agent playing CartPole-v1.
# Hyperparameters
```python
{'exp_name': 'ppo',
 'seed': 1,
 'torch_deterministic': True,
 'cuda': True,
 'track': False,
 'wandb_project_name': 'cleanRL',
 'wandb_entity': None,
 'capture_video': False,
 'env_id': 'CartPole-v1',
 'total_timesteps': 50000,
 'learning_rate': 0.00025,
 'num_envs': 4,
 'num_steps': 128,
 'anneal_lr': True,
 'gae': True,
 'gamma': 0.99,
 'gae_lambda': 0.95,
 'num_minibatches': 4,
 'update_epochs': 4,
 'norm_adv': True,
 'clip_coef': 0.2,
 'clip_vloss': True,
 'ent_coef': 0.01,
 'vf_coef': 0.5,
 'max_grad_norm': 0.5,
 'target_kl': None,
 'repo_id': 'Overgrown7380/ppo-CartPole-v1',
 'batch_size': 512,
 'minibatch_size': 128}
```
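As a quick sanity check (not part of the original training script), the derived sizes above follow directly from the rollout settings:

```python
# Derived PPO rollout sizes from the listed hyperparameters.
num_envs = 4
num_steps = 128
num_minibatches = 4

batch_size = num_envs * num_steps               # 4 * 128 = 512
minibatch_size = batch_size // num_minibatches  # 512 // 4 = 128

assert batch_size == 512 and minibatch_size == 128
```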
| {"tags": ["CartPole-v1", "ppo", "deep-reinforcement-learning", "reinforcement-learning", "custom-implementation", "deep-rl-course"], "model-index": [{"name": "PPO", "results": [{"task": {"type": "reinforcement-learning", "name": "reinforcement-learning"}, "dataset": {"name": "CartPole-v1", "type": "CartPole-v1"}, "metrics": [{"type": "mean_reward", "value": "160.50 +/- 82.51", "name": "mean_reward", "verified": false}]}]}]} | reinforcement-learning | Overgrown7380/ppo-CartPole-v1 | [
"tensorboard",
"CartPole-v1",
"ppo",
"deep-reinforcement-learning",
"reinforcement-learning",
"custom-implementation",
"deep-rl-course",
"model-index",
"region:us"
] | 2024-02-14T15:26:21+00:00 | [] | [] | TAGS
#tensorboard #CartPole-v1 #ppo #deep-reinforcement-learning #reinforcement-learning #custom-implementation #deep-rl-course #model-index #region-us
|
# PPO Agent Playing CartPole-v1
This is a trained model of a PPO agent playing CartPole-v1.
# Hyperparameters
| [
"# PPO Agent Playing CartPole-v1\n\n This is a trained model of a PPO agent playing CartPole-v1.\n\n # Hyperparameters"
] | [
"TAGS\n#tensorboard #CartPole-v1 #ppo #deep-reinforcement-learning #reinforcement-learning #custom-implementation #deep-rl-course #model-index #region-us \n",
"# PPO Agent Playing CartPole-v1\n\n This is a trained model of a PPO agent playing CartPole-v1.\n\n # Hyperparameters"
] | [
51,
36
] | [
"passage: TAGS\n#tensorboard #CartPole-v1 #ppo #deep-reinforcement-learning #reinforcement-learning #custom-implementation #deep-rl-course #model-index #region-us \n# PPO Agent Playing CartPole-v1\n\n This is a trained model of a PPO agent playing CartPole-v1.\n\n # Hyperparameters"
] | [
-0.004879093263298273,
-0.053268589079380035,
-0.005098002962768078,
0.07334969937801361,
0.1777561753988266,
-0.038396865129470825,
0.14297015964984894,
0.08244865387678146,
0.09997384995222092,
0.09035326540470123,
0.1059093028306961,
0.16291606426239014,
0.04727339372038841,
0.13590997457504272,
0.09157068282365799,
-0.30968427658081055,
0.00408547418192029,
0.003679449437186122,
-0.04604669660329819,
0.1186426505446434,
0.020725054666399956,
-0.11653941124677658,
0.041666194796562195,
0.07010702043771744,
-0.050974249839782715,
0.014011003077030182,
-0.016095049679279327,
-0.11272101104259491,
0.09897422045469284,
-0.026063082739710808,
0.08666004240512848,
0.00034508894896134734,
0.12480676919221878,
-0.09643567353487015,
0.04538939893245697,
0.09414922446012497,
-0.06467069685459137,
0.05906341224908829,
0.010876919142901897,
-0.031389329582452774,
0.07894393056631088,
0.00017247258801944554,
0.09476213157176971,
-0.00817358959466219,
-0.16068053245544434,
-0.06650647521018982,
0.032433513551950455,
0.04522484540939331,
0.09700331091880798,
0.07551904767751694,
0.017124267295002937,
0.2459845244884491,
-0.11146722733974457,
-0.004057587124407291,
0.1765144020318985,
-0.38273483514785767,
-0.06979376077651978,
0.16279388964176178,
0.06927463412284851,
0.09216812998056412,
-0.0810844823718071,
-0.024574723094701767,
0.00557481124997139,
0.022949418053030968,
-0.01380628440529108,
-0.04777095094323158,
0.018658874556422234,
0.04569006338715553,
-0.13524354994297028,
-0.008603408001363277,
0.12342680245637894,
-0.00008464334678137675,
0.03494636341929436,
-0.0071600740775465965,
-0.04128574952483177,
-0.0692334920167923,
-0.03034450113773346,
-0.10930527001619339,
0.09515775740146637,
0.04257714003324509,
-0.0011546805035322905,
0.0005034031346440315,
-0.06961086392402649,
-0.017436109483242035,
-0.10763698816299438,
0.14605791866779327,
-0.007813513278961182,
0.02994344010949135,
-0.007597260642796755,
0.04507143422961235,
-0.13701359927654266,
0.022694434970617294,
0.028660546988248825,
-0.016641151160001755,
-0.06184710934758186,
-0.07638183236122131,
0.01605123095214367,
-0.05258476734161377,
0.09291502088308334,
0.018960420042276382,
0.039665330201387405,
0.03944318741559982,
0.043618619441986084,
0.07198116183280945,
0.10703267902135849,
0.16111579537391663,
-0.103221096098423,
0.17381423711776733,
0.03620865195989609,
0.03044985421001911,
-0.008281699381768703,
-0.05913712829351425,
-0.14506186544895172,
0.13327346742153168,
-0.006918599829077721,
-0.012964146211743355,
-0.015688301995396614,
0.01993255503475666,
-0.1394311487674713,
-0.013748332858085632,
-0.10034602880477905,
-0.052920278161764145,
-0.012290297076106071,
0.016496967524290085,
-0.07121220976114273,
0.11974801123142242,
-0.006320524960756302,
0.03624839335680008,
-0.04705633968114853,
0.020102931186556816,
-0.07316501438617706,
-0.003537173382937908,
-0.10739076882600784,
-0.04455853998661041,
0.050751298666000366,
0.0007905436214059591,
0.0005692050326615572,
-0.09255252778530121,
-0.11784593015909195,
-0.06631878763437271,
0.056279804557561874,
-0.040278349071741104,
-0.10668687522411346,
-0.10858064144849777,
0.0483064167201519,
-0.12098009139299393,
0.012252585031092167,
0.01745755784213543,
-0.03691673278808594,
0.05524871125817299,
-0.09846973419189453,
0.11137216538190842,
0.025424465537071228,
0.023771964013576508,
-0.11410197615623474,
0.010270664468407631,
-0.2889874279499054,
-0.004275968298316002,
-0.11028347164392471,
0.008037938736379147,
-0.08946748077869415,
-0.03660508990287781,
-0.04175717383623123,
0.08176011592149734,
-0.009923416189849377,
0.088219054043293,
-0.0626499354839325,
-0.09986653178930283,
0.10682560503482819,
-0.03623077645897865,
-0.05495396628975868,
0.04345869645476341,
-0.02742840349674225,
0.12940017879009247,
0.09059691429138184,
0.1916920393705368,
-0.07209257036447525,
-0.194810152053833,
0.09807803481817245,
0.0840335562825203,
-0.12613753974437714,
-0.03441683202981949,
0.0833493247628212,
-0.046824339777231216,
0.01835087686777115,
-0.029562918469309807,
-0.0729789212346077,
0.013639457523822784,
-0.12368898093700409,
0.013604939915239811,
0.07838290929794312,
-0.04451325163245201,
0.09414191544055939,
0.08455628156661987,
0.08474118262529373,
-0.03221198916435242,
-0.137506365776062,
0.03516169264912605,
0.041969913989305496,
-0.00808932539075613,
0.0017193031962960958,
-0.16498339176177979,
0.16603133082389832,
-0.06196242570877075,
-0.0057325041852891445,
-0.15319792926311493,
-0.10427401959896088,
-0.031267065554857254,
0.09206677228212357,
0.10521110892295837,
0.1510094851255417,
0.08560361713171005,
-0.008934620767831802,
0.09780433028936386,
-0.019225982949137688,
-0.02401897683739662,
0.019565658643841743,
-0.03196932375431061,
-0.18399155139923096,
0.019172528758645058,
-0.05717253312468529,
0.003533789189532399,
-0.058066945523023605,
0.0040851435624063015,
-0.022902006283402443,
0.008802728727459908,
0.02703857421875,
-0.010253684595227242,
-0.043418996036052704,
0.08139035105705261,
0.029596807435154915,
-0.08052238076925278,
0.14117023348808289,
-0.001214782940223813,
0.0581800639629364,
-0.05452844128012657,
-0.05158551037311554,
0.18343888223171234,
0.16450025141239166,
-0.23041456937789917,
-0.05166757106781006,
0.04556092992424965,
-0.03387729823589325,
0.057073261588811874,
-0.04804402217268944,
0.1480136215686798,
0.21703225374221802,
0.036160293966531754,
0.06227940320968628,
-0.08267539739608765,
0.04931255057454109,
0.028955863788723946,
-0.0773174911737442,
0.05161866173148155,
0.10072989016771317,
0.26084160804748535,
-0.016485532745718956,
0.11036565899848938,
0.14755627512931824,
0.034148938953876495,
0.12254941463470459,
-0.024094605818390846,
-0.06119551882147789,
-0.04161033406853676,
0.04928101599216461,
-0.020901871845126152,
0.03585442900657654,
-0.17524774372577667,
-0.009959354996681213,
-0.017488300800323486,
-0.10002518445253372,
0.014016618020832539,
-0.19032876193523407,
-0.0738009437918663,
0.007204416207969189,
0.03414149209856987,
0.021567946299910545,
0.023021142929792404,
-0.027325019240379333,
0.08576241135597229,
0.02462105266749859,
-0.1355176866054535,
0.07175250351428986,
0.027540041133761406,
-0.02122023142874241,
0.1371327042579651,
-0.05558187514543533,
-0.14105089008808136,
-0.015953142195940018,
-0.09038077294826508,
0.02670563943684101,
0.06648987531661987,
-0.0405830554664135,
-0.1725231409072876,
0.0014473540941253304,
0.014205146580934525,
0.05312950164079666,
0.00450565479695797,
-0.04461467266082764,
0.06662546098232269,
0.08561044931411743,
-0.054448265582323074,
-0.06488117575645447,
-0.042523834854364395,
-0.1161925420165062,
-0.21491117775440216,
0.011486858129501343,
-0.010688519105315208,
-0.02955492027103901,
0.2562185525894165,
0.023314470425248146,
0.06652859598398209,
-0.017599284648895264,
0.0018624486401677132,
-0.04591444507241249,
-0.07139933109283447,
0.16454476118087769,
-0.03048505075275898,
0.0055803656578063965,
-0.024878205731511116,
0.03318792209029198,
-0.039268504828214645,
-0.01617603376507759,
0.012402212247252464,
-0.08402779698371887,
-0.18313777446746826,
-0.06184569001197815,
-0.03711171820759773,
0.1419387012720108,
0.08243502676486969,
0.02845289371907711,
-0.08174657076597214,
0.0661219134926796,
0.12952229380607605,
0.02783835120499134,
-0.04430088773369789,
-0.02036169543862343,
0.094272181391716,
-0.09886091947555542,
0.012361646629869938,
-0.01734011061489582,
-0.10803284496068954,
0.04269475117325783,
-0.013095911592245102,
0.09271764010190964,
0.11186782270669937,
-0.00036001286935061216,
0.03768351674079895,
0.0831303671002388,
0.03892263397574425,
0.0911790281534195,
0.08403602987527847,
-0.1147247776389122,
-0.037051353603601456,
-0.014040433801710606,
-0.10196618735790253,
0.03182453662157059,
0.17152321338653564,
0.004120252560824156,
-0.057270556688308716,
0.0912473201751709,
-0.034518368542194366,
0.08860155940055847,
-0.03018788807094097,
-0.22772818803787231,
0.06025097519159317,
-0.047323860228061676,
0.017748957499861717,
-0.045258112251758575,
0.12160554528236389,
-0.06427639722824097,
-0.1740245819091797,
-0.06648560613393784,
0.01842343620955944,
0.021517690271139145,
-0.1084541454911232,
0.0011309711262583733,
-0.016537094488739967,
0.09486077725887299,
-0.027617240324616432,
0.06119350343942642,
-0.08461108803749084,
0.18558549880981445,
-0.0316321887075901,
0.07090424001216888,
-0.12411578744649887,
-0.04644077271223068,
0.04234891012310982,
-0.04343846067786217,
0.21959346532821655,
-0.009721633978188038,
0.1167549192905426,
-0.09800613671541214,
-0.11295241117477417,
-0.022806162014603615,
0.018750736489892006,
-0.04683797061443329,
0.010146685875952244,
0.022143030539155006,
0.0158755574375391,
-0.00712275505065918,
-0.0015142736956477165,
0.0025053033605217934,
0.018607567995786667,
-0.04539605975151062,
-0.10414505004882812,
0.0031124367378652096,
-0.048797257244586945,
-0.11872010678052902,
-0.21296188235282898,
0.13528093695640564,
0.19337601959705353,
0.02599218860268593,
-0.04890408739447594,
0.08912426233291626,
-0.07363592088222504,
-0.01923859305679798,
0.02067587710916996,
-0.01394637580960989,
0.05053963139653206,
-0.03894123062491417,
-0.07650179415941238,
0.1957497000694275,
-0.03852236643433571,
-0.023618174716830254,
-0.04514233022928238,
0.15933865308761597,
0.07191948592662811,
0.06652267277240753,
-0.030605658888816833,
0.01159533392637968,
0.05110948160290718,
-0.038397181779146194,
0.16713254153728485,
-0.025344232097268105,
-0.09375008195638657,
0.057611044496297836,
0.03605407103896141,
0.06852881610393524,
0.04896658658981323,
0.029369542375206947,
0.19403201341629028,
0.1948624849319458,
-0.0034709153696894646,
0.1447412222623825,
0.01122130174189806,
-0.049524303525686264,
-0.2257317304611206,
-0.030365614220499992,
0.03183531388640404,
0.003940147813409567,
0.16197948157787323,
-0.11311732232570648,
0.09940971434116364,
0.05467512458562851,
-0.01518184319138527,
0.014796463772654533,
-0.3779623806476593,
-0.035542141646146774,
0.20203492045402527,
0.13629621267318726,
0.28798437118530273,
-0.11130319535732269,
0.04862072318792343,
0.03498063609004021,
0.00997246615588665,
0.14061783254146576,
-0.16928575932979584,
0.09474216401576996,
-0.011930056847631931,
0.052062638103961945,
0.04381201043725014,
-0.0046874783001840115,
0.009734786115586758,
-0.07897170633077621,
-0.01260753907263279,
-0.09118814021348953,
-0.08579449355602264,
0.17453667521476746,
0.05983076989650726,
-0.10698884725570679,
0.18890205025672913,
-0.07481404393911362,
-0.14431631565093994,
-0.05105263739824295,
-0.015674425289034843,
0.03342440724372864,
-0.0006725095445290208,
-0.05749676749110222,
-0.013071183115243912,
0.07752269506454468,
-0.03053666651248932,
0.07517395168542862,
0.12476314604282379,
-0.012020100839436054,
0.0803857371211052,
0.13760598003864288,
0.08713838458061218,
0.038192879408597946,
-0.18429221212863922,
-0.04483243450522423,
-0.026529459282755852,
0.04743347689509392,
-0.1346912831068039,
-0.028621146455407143,
0.08046353608369827,
0.0412418358027935,
0.03467710316181183,
0.10316871851682663,
-0.07498378306627274,
0.0225164033472538,
0.033059071749448776,
-0.16800881922245026,
-0.18474186956882477,
0.040773846209049225,
-0.006085168570280075,
0.08235961198806763,
0.033765338361263275,
0.05959876626729965,
-0.07514463365077972,
-0.02667473629117012,
0.03925332427024841,
-0.02591698057949543,
-0.1051616445183754,
0.056738998740911484,
0.09470438957214355,
-0.008505056612193584,
-0.08280894160270691,
0.1233520433306694,
0.06753265112638474,
-0.09256496280431747,
0.02688448689877987,
0.01907486654818058,
-0.01051837857812643,
-0.08317705988883972,
0.096236951649189,
0.2479025274515152,
0.03088749572634697,
-0.032586727291345596,
-0.10720907151699066,
-0.171994149684906,
0.1138756275177002,
0.0652080550789833,
0.09152500331401825,
-0.09260109812021255,
-0.060834161937236786,
0.04114098846912384,
-0.07842009514570236,
-0.018979962915182114,
0.03198205307126045,
-0.05040104687213898,
-0.09793168306350708,
0.05380454286932945,
0.030097730457782745,
0.08453776687383652,
-0.0788675919175148,
-0.10397668927907944,
-0.18298394978046417,
0.07614149153232574,
0.056481603533029556,
-0.11105712503194809,
-0.043198809027671814,
-0.03718862310051918,
0.030012834817171097,
0.0032613843213766813,
-0.05961630865931511,
-0.015114767476916313,
-0.13593745231628418,
0.07557616382837296,
0.05942581593990326,
0.060136742889881134,
-0.03158582001924515,
-0.015616845339536667,
0.01775345765054226,
-0.029984313994646072,
0.039214663207530975,
-0.014155536890029907,
0.0064228433184325695,
0.013777287676930428,
-0.28584709763526917,
0.011614147573709488,
0.03662247583270073,
-0.015821535140275955,
0.07484413683414459,
-0.013321996666491032,
0.054203640669584274,
0.004009241703897715,
-0.07026302814483643,
-0.000051808881835313514,
0.011072422377765179,
-0.08469333499670029,
0.10085124522447586,
-0.02224084921181202,
-0.0779024288058281,
-0.0451645590364933,
-0.020019622519612312,
0.0026309089735150337,
0.03957454860210419,
0.1143956258893013,
-0.04068254306912422,
0.07455530017614365,
-0.16142232716083527,
-0.005104721058160067,
-0.007587803527712822,
0.053889233618974686,
0.08383380621671677,
-0.08010061830282211,
0.02726542204618454,
-0.018196336925029755,
0.18441873788833618,
0.1360616534948349,
-0.016972869634628296,
0.012520797550678253,
0.046380266547203064,
0.030953066423535347,
0.04696502909064293,
0.07573944330215454,
0.047440387308597565,
0.023338787257671356,
0.022828364744782448,
0.053823843598365784,
0.11377840489149094,
-0.027810778468847275,
0.14852945506572723,
-0.022325463593006134,
-0.04140667989850044,
0.08989053964614868,
0.05996384844183922,
0.02584362030029297,
-0.0056707244366407394,
0.22333309054374695,
-0.01234759297221899,
0.027750112116336823,
0.002983179409056902,
0.12526190280914307,
0.08957081288099289,
-0.049282293766736984,
-0.02991841360926628,
-0.06409192830324173,
-0.03569342568516731,
-0.10752855986356735,
-0.19642680883407593,
-0.06445988267660141,
-0.27970272302627563,
0.099894218146801,
-0.06311818212270737,
0.02516455389559269,
0.1649707704782486,
0.025802282616496086,
-0.008987406268715858,
0.08460691571235657,
-0.10170688480138779,
-0.028371557593345642,
0.057142872363328934,
-0.03985306993126869,
-0.01203426718711853,
-0.21272823214530945,
-0.10396099090576172,
-0.13333424925804138,
-0.103139728307724,
-0.03523807227611542,
0.006300660315901041,
-0.04225986450910568,
0.022673433646559715,
-0.04449579119682312,
-0.026972321793437004,
0.021258419379591942,
-0.03531299903988838,
-0.019903428852558136,
0.08502040803432465,
-0.00247888988815248,
-0.08383714407682419,
0.010132243856787682,
0.10210548341274261,
0.026078959926962852,
-0.0773373395204544,
-0.012925010174512863,
0.14302696287631989,
-0.011951901949942112,
0.04301919415593147,
-0.0032417369075119495,
0.0036823933478444815,
-0.010332739911973476,
0.2376859039068222,
0.14044520258903503,
-0.1484585404396057,
-0.017418598756194115,
-0.020826241001486778,
-0.0005514072836376727,
-0.03978477045893669,
0.14805418252944946,
0.09752041846513748,
0.0744171068072319,
-0.16734418272972107,
-0.07684502005577087,
-0.1035190224647522,
0.07242846488952637,
-0.13752183318138123,
-0.09808503836393356,
0.214148610830307,
0.005981049500405788,
-0.04543512314558029,
0.08297977596521378,
-0.19298547506332397,
-0.06420750916004181,
0.11323536932468414,
-0.17824698984622955,
-0.13982565701007843,
-0.07454214245080948,
0.08547235280275345,
0.08693820983171463,
0.16296127438545227,
-0.07050101459026337,
-0.05753713473677635,
-0.08074700832366943,
0.021522311493754387,
-0.1508437842130661,
0.001992675708606839,
0.03776737302541733,
-0.13227902352809906,
0.182963564991951,
-0.05954059585928917,
0.11718567460775375,
0.052957933396101,
0.04108313471078873,
0.03906266391277313,
0.029074331745505333,
-0.011078592389822006,
-0.11551868170499802,
-0.048984136432409286,
-0.06047533452510834,
0.014690612442791462,
0.037655480206012726,
0.04968557134270668,
-0.09210143983364105,
0.0755632072687149,
0.017098160460591316,
-0.012279175221920013,
-0.03515845164656639,
0.07773496955633163,
-0.166275754570961,
0.08074065297842026,
0.041757144033908844,
0.004675289615988731,
0.01274809055030346,
-0.0611170157790184,
0.0832490473985672,
0.054849740117788315,
-0.011265380308032036,
-0.06946953386068344,
-0.09695396572351456,
-0.010837951675057411,
-0.038931433111429214,
-0.07382240891456604,
-0.09820899367332458,
-0.03794136270880699,
-0.16925369203090668,
-0.02869848720729351,
-0.130319744348526,
0.02632369101047516,
0.03506185859441757,
0.05149315670132637,
-0.02994428388774395,
0.07583170384168625,
0.029270920902490616,
0.11956024914979935,
-0.0804939791560173,
-0.034947264939546585
] |
null | null | transformers |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# quiz3_distilbert
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.5149
- Accuracy: 0.7720
- F1: 0.7920
- Precision: 0.8003
- Recall: 0.7838
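
Since the card does not document the task or its label set, the following is only a hypothetical usage sketch; it assumes the fine-tuned checkpoint is hosted under the `amirbralin/quiz3_distilbert` repo id shown in this entry.

```python
# Hypothetical usage sketch (repo id taken from this entry; labels are undocumented).
from transformers import pipeline

classifier = pipeline("text-classification", model="amirbralin/quiz3_distilbert")
print(classifier("Example sentence to classify."))
```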
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1e-06
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5
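
A minimal sketch of how these values would typically map onto `transformers.TrainingArguments` (the model, tokenizer, and dataset wiring are not documented in this card and are left out; the Adam betas/epsilon above are the Trainer defaults):

```python
# Minimal sketch mirroring only the hyperparameters listed above.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="quiz3_distilbert",
    learning_rate=1e-6,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=5,
)
```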
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 | Precision | Recall |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:------:|:---------:|:------:|
| No log | 1.0 | 307 | 0.5163 | 0.7720 | 0.7904 | 0.8049 | 0.7765 |
| 0.4185 | 2.0 | 614 | 0.5204 | 0.7712 | 0.7941 | 0.7912 | 0.7971 |
| 0.4185 | 3.0 | 921 | 0.5157 | 0.7712 | 0.7917 | 0.7982 | 0.7853 |
| 0.4339 | 4.0 | 1228 | 0.5146 | 0.7744 | 0.7941 | 0.8030 | 0.7853 |
| 0.4381 | 5.0 | 1535 | 0.5149 | 0.7720 | 0.7920 | 0.8003 | 0.7838 |
### Framework versions
- Transformers 4.35.2
- Pytorch 2.1.0+cu121
- Datasets 2.17.0
- Tokenizers 0.15.1
| {"license": "apache-2.0", "tags": ["generated_from_trainer"], "metrics": ["accuracy", "f1", "precision", "recall"], "base_model": "distilbert-base-uncased", "model-index": [{"name": "quiz3_distilbert", "results": []}]} | text-classification | amirbralin/quiz3_distilbert | [
"transformers",
"tensorboard",
"safetensors",
"distilbert",
"text-classification",
"generated_from_trainer",
"base_model:distilbert-base-uncased",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | 2024-02-14T15:26:23+00:00 | [] | [] | TAGS
#transformers #tensorboard #safetensors #distilbert #text-classification #generated_from_trainer #base_model-distilbert-base-uncased #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us
| quiz3\_distilbert
=================
This model is a fine-tuned version of distilbert-base-uncased on an unknown dataset.
It achieves the following results on the evaluation set:
* Loss: 0.5149
* Accuracy: 0.7720
* F1: 0.7920
* Precision: 0.8003
* Recall: 0.7838
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 1e-06
* train\_batch\_size: 16
* eval\_batch\_size: 16
* seed: 42
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* num\_epochs: 5
### Training results
### Framework versions
* Transformers 4.35.2
* Pytorch 2.1.0+cu121
* Datasets 2.17.0
* Tokenizers 0.15.1
| [
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 1e-06\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 5",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.35.2\n* Pytorch 2.1.0+cu121\n* Datasets 2.17.0\n* Tokenizers 0.15.1"
] | [
"TAGS\n#transformers #tensorboard #safetensors #distilbert #text-classification #generated_from_trainer #base_model-distilbert-base-uncased #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 1e-06\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 5",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.35.2\n* Pytorch 2.1.0+cu121\n* Datasets 2.17.0\n* Tokenizers 0.15.1"
] | [
72,
98,
4,
33
] | [
"passage: TAGS\n#transformers #tensorboard #safetensors #distilbert #text-classification #generated_from_trainer #base_model-distilbert-base-uncased #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 1e-06\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 5### Training results### Framework versions\n\n\n* Transformers 4.35.2\n* Pytorch 2.1.0+cu121\n* Datasets 2.17.0\n* Tokenizers 0.15.1"
] | [
-0.10121794790029526,
0.10849687457084656,
-0.002683942671865225,
0.11278937011957169,
0.14344525337219238,
0.015307440422475338,
0.16096976399421692,
0.11345184594392776,
-0.06649868190288544,
0.04792138934135437,
0.12867797911167145,
0.1276552826166153,
0.01491527445614338,
0.11794900894165039,
-0.08142182976007462,
-0.21183905005455017,
0.009844031184911728,
0.02330554649233818,
-0.0624646358191967,
0.11465875804424286,
0.09254448860883713,
-0.12268923223018646,
0.08891849219799042,
-0.01705627329647541,
-0.16726548969745636,
0.005207315552979708,
0.0174532663077116,
-0.04799072444438934,
0.12475350499153137,
0.033180415630340576,
0.13270573318004608,
0.03679875656962395,
0.08477474004030228,
-0.19070906937122345,
0.01055945735424757,
0.06143561378121376,
-0.004980100318789482,
0.08199379593133926,
0.03525537624955177,
-0.008309946395456791,
0.0684790387749672,
-0.09376794844865799,
0.0621793232858181,
0.017882412299513817,
-0.12836965918540955,
-0.20675502717494965,
-0.08676086366176605,
0.033286068588495255,
0.09270384162664413,
0.07702130824327469,
-0.011414076201617718,
0.11576959490776062,
-0.054148707538843155,
0.09685610234737396,
0.20029789209365845,
-0.30548229813575745,
-0.062249764800071716,
0.04951600357890129,
0.02563987672328949,
0.08845861256122589,
-0.09906233102083206,
-0.020126737654209137,
0.06007659062743187,
0.02344440296292305,
0.12847812473773956,
-0.02420722506940365,
-0.05878700315952301,
0.0002846854622475803,
-0.14351950585842133,
-0.018368935212492943,
0.15288323163986206,
0.051083117723464966,
-0.04606734216213226,
-0.04680010676383972,
-0.0730985477566719,
-0.1331396847963333,
-0.040224839001894,
-0.008612176403403282,
0.04960773140192032,
-0.021653341129422188,
-0.05973825976252556,
-0.021570181474089622,
-0.0957750454545021,
-0.06548568606376648,
-0.05490470677614212,
0.14268548786640167,
0.03433391824364662,
0.004394685383886099,
-0.010407647117972374,
0.10001061111688614,
-0.026069343090057373,
-0.14581499993801117,
0.022180212661623955,
0.021659109741449356,
0.00323349772952497,
-0.05085572600364685,
-0.05059783533215523,
-0.0809604600071907,
0.02198006585240364,
0.1577625870704651,
-0.04995660111308098,
0.052257560193538666,
0.002408423461019993,
0.046190351247787476,
-0.10198047757148743,
0.1685650646686554,
-0.041523177176713943,
-0.03155946359038353,
0.024232491850852966,
0.09219672530889511,
0.055347565561532974,
-0.015233075246214867,
-0.12940697371959686,
0.03449622169137001,
0.10452470183372498,
0.02061655931174755,
-0.05149027332663536,
0.0685155913233757,
-0.05901641398668289,
-0.012874356471002102,
0.040616724640131,
-0.09452637284994125,
0.025914419442415237,
0.004910796880722046,
-0.05594973266124725,
-0.045645710080862045,
0.031831033527851105,
0.01996317133307457,
0.004923659842461348,
0.10794103145599365,
-0.08321143686771393,
0.010627209208905697,
-0.08280333131551743,
-0.13102878630161285,
0.016900906339287758,
-0.09531939029693604,
0.021152691915631294,
-0.1082535982131958,
-0.18250754475593567,
-0.015223346650600433,
0.06315308064222336,
-0.028991172090172768,
-0.03203524649143219,
-0.0648559108376503,
-0.07925350219011307,
0.01832621544599533,
-0.011423670686781406,
0.06764046102762222,
-0.06422092765569687,
0.09778342396020889,
0.03587295114994049,
0.06623940169811249,
-0.06026804447174072,
0.042603809386491776,
-0.10311456024646759,
0.04061461240053177,
-0.17994363605976105,
0.03735209256410599,
-0.07223028689622879,
0.07295333594083786,
-0.08085381984710693,
-0.0715339258313179,
0.0035265532787889242,
-0.0032742712646722794,
0.0756440982222557,
0.09985123574733734,
-0.17529886960983276,
-0.06268090009689331,
0.15282467007637024,
-0.08623063564300537,
-0.14086668193340302,
0.13735626637935638,
-0.058692749589681625,
0.04313623532652855,
0.06323163211345673,
0.19025535881519318,
0.07657959312200546,
-0.08546028286218643,
0.0036434405483305454,
0.0032537810038775206,
0.06817395240068436,
-0.031254496425390244,
0.06865496933460236,
0.00018324938719160855,
0.0016279872506856918,
0.01325446180999279,
-0.052950918674468994,
0.05068075656890869,
-0.07884176075458527,
-0.09046371281147003,
-0.04127654805779457,
-0.10482265800237656,
0.06559299677610397,
0.051180046051740646,
0.06236816942691803,
-0.1095428317785263,
-0.08727344125509262,
0.06097988039255142,
0.07466138899326324,
-0.07363278418779373,
0.022779572755098343,
-0.0679624155163765,
0.09160306304693222,
-0.06168187037110329,
-0.013848716393113136,
-0.15853522717952728,
-0.045477576553821564,
0.019154585897922516,
0.0029090691823512316,
0.017786895856261253,
-0.005335149355232716,
0.07069236040115356,
0.08284441381692886,
-0.06650552153587341,
-0.03193272277712822,
-0.013921920210123062,
0.015499023720622063,
-0.12408967316150665,
-0.20177587866783142,
-0.015072435140609741,
-0.035466454923152924,
0.1483743041753769,
-0.23398134112358093,
0.05295942723751068,
-0.0023092878982424736,
0.08780873566865921,
0.040178991854190826,
-0.011383126489818096,
-0.03745713084936142,
0.06991580128669739,
-0.04910387471318245,
-0.06992653012275696,
0.05981118604540825,
0.009371060878038406,
-0.10391124337911606,
-0.04445532336831093,
-0.1494286060333252,
0.18319328129291534,
0.13298924267292023,
-0.08125237375497818,
-0.07479484379291534,
0.008754899725317955,
-0.03453653305768967,
-0.026969274505972862,
-0.03794853389263153,
0.005433558952063322,
0.12793083488941193,
-0.006223155185580254,
0.1546262949705124,
-0.08582335710525513,
-0.03166574984788895,
0.022457430139183998,
-0.04648949205875397,
0.00971145462244749,
0.11752322316169739,
0.08988630771636963,
-0.10616599023342133,
0.1499408483505249,
0.19540907442569733,
-0.09490369260311127,
0.1305539757013321,
-0.04627962410449982,
-0.05226865038275719,
-0.02531939186155796,
0.009920225478708744,
0.014525830745697021,
0.10948200523853302,
-0.11679229885339737,
0.0006200583302415907,
0.009403534233570099,
0.013378415256738663,
0.010068133473396301,
-0.2152058333158493,
-0.025038503110408783,
0.04172010347247124,
-0.05096025392413139,
-0.0020760593470185995,
-0.024866120889782906,
-0.00787574052810669,
0.09768234193325043,
-0.005330485757440329,
-0.08886933326721191,
0.04574514180421829,
-0.0037227694410830736,
-0.07635562866926193,
0.20495449006557465,
-0.09303013235330582,
-0.14097721874713898,
-0.13491474092006683,
-0.06824702024459839,
-0.05552178621292114,
0.03103185072541237,
0.06231898441910744,
-0.06940926611423492,
-0.04168069735169411,
-0.10959157347679138,
-0.004373472649604082,
0.02981908991932869,
0.01891358010470867,
0.02113107033073902,
-0.005165772046893835,
0.08363731950521469,
-0.10084854066371918,
-0.006893283221870661,
-0.036048781126737595,
-0.05177467316389084,
0.037755805999040604,
0.02625323086977005,
0.11163891106843948,
0.148122638463974,
-0.025613104924559593,
-0.006946571636945009,
-0.02769496478140354,
0.22953350841999054,
-0.05870795249938965,
-0.0077184089459478855,
0.12513825297355652,
-0.03000832535326481,
0.05627429485321045,
0.1390390694141388,
0.06380990147590637,
-0.09825202077627182,
0.0186602883040905,
0.03378245607018471,
-0.03427950292825699,
-0.2173471748828888,
-0.03472483903169632,
-0.038170307874679565,
0.0068526132963597775,
0.09507537633180618,
0.029064400121569633,
0.022947324439883232,
0.06691696494817734,
0.02048744447529316,
0.08220516890287399,
-0.010212715715169907,
0.0703267976641655,
0.11408952623605728,
0.039522089064121246,
0.1306401789188385,
-0.04772838205099106,
-0.05137401074171066,
0.03997739776968956,
-0.004548690747469664,
0.20310908555984497,
0.022632885724306107,
0.14729838073253632,
0.05118778347969055,
0.15887318551540375,
-0.003068004734814167,
0.0617159828543663,
-0.010166461579501629,
-0.036760516464710236,
-0.015585427172482014,
-0.050306014716625214,
-0.03010045923292637,
0.0323067232966423,
-0.08406302332878113,
0.057368192821741104,
-0.10532594472169876,
0.0158893633633852,
0.06059538945555687,
0.2333851009607315,
0.05742907524108887,
-0.3211807310581207,
-0.09123788774013519,
0.0296799149364233,
-0.019411997869610786,
-0.020452722907066345,
0.028065385296940804,
0.1251915544271469,
-0.04652510955929756,
0.036364924162626266,
-0.0701504498720169,
0.08700110763311386,
-0.040620360523462296,
0.045824553817510605,
0.051480427384376526,
0.08606688678264618,
-0.01208100002259016,
0.06748142093420029,
-0.2830243706703186,
0.2644575834274292,
0.019164375960826874,
0.067796491086483,
-0.04566694423556328,
-0.000741059600841254,
0.03821861371397972,
0.09455706179141998,
0.07047563046216965,
-0.014578169211745262,
-0.05063919350504875,
-0.1933763474225998,
-0.06500523537397385,
0.021568791940808296,
0.09765829145908356,
-0.039979252964258194,
0.10103023052215576,
-0.028661057353019714,
0.0011625932529568672,
0.08050500601530075,
-0.013698228634893894,
-0.08194254338741302,
-0.09885435551404953,
-0.010064135305583477,
0.037356309592723846,
-0.037274766713380814,
-0.0784413069486618,
-0.09727057069540024,
-0.13664039969444275,
0.14985240995883942,
-0.06861110776662827,
-0.03534773364663124,
-0.10327646881341934,
0.0557425282895565,
0.05806611850857735,
-0.08102939277887344,
0.04197092354297638,
0.00410559494048357,
0.08096008002758026,
0.01581844873726368,
-0.06548845767974854,
0.12203843146562576,
-0.07336916029453278,
-0.1783807873725891,
-0.07030806690454483,
0.10654226690530777,
0.020066862925887108,
0.04477318376302719,
-0.009090358391404152,
0.011936066672205925,
-0.017226889729499817,
-0.07792635262012482,
0.025064785033464432,
0.0027695323806256056,
0.05124685913324356,
0.03162262588739395,
-0.05831018462777138,
-0.0068210517056286335,
-0.05991419777274132,
-0.024621184915304184,
0.15021120011806488,
0.2918873727321625,
-0.08488544076681137,
0.009370977990329266,
0.061568278819322586,
-0.06844663619995117,
-0.2086779773235321,
0.036761604249477386,
0.026885122060775757,
0.003928216639906168,
0.045984264463186264,
-0.1509300321340561,
0.10160014778375626,
0.10138650238513947,
-0.027568519115447998,
0.11795143783092499,
-0.2906166613101959,
-0.13612516224384308,
0.1279585212469101,
0.1464913785457611,
0.12138252705335617,
-0.160446897149086,
-0.04257933422923088,
-0.03833223506808281,
-0.10824933648109436,
0.10684359073638916,
-0.12878739833831787,
0.11104761809110641,
-0.00627377862110734,
0.05139957368373871,
0.006110801827162504,
-0.05214080587029457,
0.13692866265773773,
0.0008247254882007837,
0.11835799366235733,
-0.06141924485564232,
-0.016540059819817543,
0.05825554206967354,
-0.06079769507050514,
0.021083679050207138,
-0.11619207262992859,
0.04510151222348213,
-0.05947131663560867,
-0.021898383274674416,
-0.044676970690488815,
0.03390366956591606,
-0.04018785059452057,
-0.058032307773828506,
-0.04291559010744095,
0.026344481855630875,
0.04461132735013962,
-0.007541230879724026,
0.1653440147638321,
0.013670734129846096,
0.14486299455165863,
0.14555104076862335,
0.07518836855888367,
-0.06832261383533478,
-0.010945823043584824,
-0.007457596715539694,
-0.03619324788451195,
0.06369559466838837,
-0.16137509047985077,
0.04070211574435234,
0.12617850303649902,
0.01292323973029852,
0.14992308616638184,
0.07033639401197433,
-0.028576498851180077,
0.01332114078104496,
0.05993926525115967,
-0.1619824916124344,
-0.10823898762464523,
-0.007271658629179001,
-0.03221843019127846,
-0.11999117583036423,
0.0599081814289093,
0.12739017605781555,
-0.0678100734949112,
0.006759348791092634,
-0.006384191103279591,
0.015632720664143562,
-0.03300776332616806,
0.1817978322505951,
0.07053627818822861,
0.04686874896287918,
-0.08605154603719711,
0.0932336077094078,
0.0588991716504097,
-0.07821019738912582,
0.009416437707841396,
0.043587129563093185,
-0.08533044904470444,
-0.047834139317274094,
0.04296550154685974,
0.19367869198322296,
-0.030460117384791374,
-0.046186115592718124,
-0.14838449656963348,
-0.11439543217420578,
0.05454510450363159,
0.18039070069789886,
0.09821612387895584,
0.014992800541222095,
-0.03502317890524864,
0.011346257291734219,
-0.10996631532907486,
0.11958345025777817,
0.045329250395298004,
0.08931741863489151,
-0.15394841134548187,
0.11588060110807419,
-0.005420478526502848,
0.011024672538042068,
-0.024740226566791534,
0.04579845070838928,
-0.11770102381706238,
-0.008504625409841537,
-0.14258767664432526,
-0.0015955882845446467,
-0.022358383983373642,
0.010307316668331623,
0.0011009488953277469,
-0.054419685155153275,
-0.0550454743206501,
0.011195247992873192,
-0.09925538301467896,
-0.0241842158138752,
0.035043224692344666,
0.051111966371536255,
-0.1229093000292778,
-0.04944412037730217,
0.0212596096098423,
-0.07321304827928543,
0.06950036436319351,
0.018725745379924774,
0.01875239796936512,
0.04734025523066521,
-0.18584255874156952,
0.022972015663981438,
0.05620085448026657,
0.01813981868326664,
0.04751887544989586,
-0.08329412341117859,
-0.02172962948679924,
-0.005818838253617287,
0.043332040309906006,
0.018723126500844955,
0.08839926868677139,
-0.12334216386079788,
0.013408287428319454,
-0.028051195666193962,
-0.061398185789585114,
-0.05162149667739868,
0.03426454961299896,
0.08636065572500229,
0.010787087492644787,
0.2080513834953308,
-0.09664454311132431,
0.020229725167155266,
-0.20295631885528564,
0.004270973149687052,
0.0019696217495948076,
-0.11877559870481491,
-0.11729589104652405,
-0.05376913771033287,
0.05116454139351845,
-0.0626569390296936,
0.1329067498445511,
0.009817218407988548,
0.02791614644229412,
0.035358332097530365,
-0.02747352048754692,
0.034357305616140366,
0.027933411300182343,
0.2135666310787201,
0.03241397440433502,
-0.041101276874542236,
0.013179825618863106,
0.02675819583237171,
0.11381107568740845,
0.0805184543132782,
0.16738614439964294,
0.16609059274196625,
-0.04737955704331398,
0.09962451457977295,
0.042630381882190704,
-0.04988471046090126,
-0.13491198420524597,
0.06847930699586868,
-0.03907198831439018,
0.10557116568088531,
-0.016817428171634674,
0.19729672372341156,
0.08886385709047318,
-0.15448008477687836,
0.01898733712732792,
-0.049659378826618195,
-0.0857897698879242,
-0.10848147422075272,
-0.0619000568985939,
-0.09729479253292084,
-0.1440095752477646,
-0.006058102007955313,
-0.1125493273139,
0.012023195624351501,
0.10202023386955261,
0.0028273980133235455,
-0.016769347712397575,
0.16587810218334198,
0.003874819725751877,
0.03597233071923256,
0.06448289006948471,
-0.0003833488735835999,
-0.043977171182632446,
-0.07499638944864273,
-0.09782355278730392,
0.011870714835822582,
-0.010084718465805054,
0.024086587131023407,
-0.0465596504509449,
-0.02193159982562065,
0.04299632087349892,
-0.009386337362229824,
-0.1110650897026062,
0.012226617895066738,
0.02887762524187565,
0.04813952371478081,
0.053234584629535675,
0.011485445313155651,
0.010946343652904034,
0.000990072963759303,
0.2192821204662323,
-0.07538867741823196,
-0.0646853819489479,
-0.0979708656668663,
0.21333259344100952,
0.024389447644352913,
0.010831446386873722,
0.011934622190892696,
-0.09380260109901428,
0.029880622401833534,
0.2103576362133026,
0.18764431774616241,
-0.0964600220322609,
-0.0004302510351408273,
-0.016991840675473213,
-0.009119370952248573,
-0.034386422485113144,
0.0912385955452919,
0.11085443943738937,
0.00608203187584877,
-0.07342655956745148,
-0.05387148633599281,
-0.037955015897750854,
-0.009321549907326698,
-0.050700593739748,
0.05441456660628319,
0.028818562626838684,
0.009469437412917614,
-0.049536433070898056,
0.05874045565724373,
-0.037721700966358185,
-0.10737603157758713,
0.05117682367563248,
-0.19643937051296234,
-0.15185007452964783,
-0.021160613745450974,
0.11175902932882309,
-0.007605516817420721,
0.04398437961935997,
-0.03326214849948883,
-0.0004496549372561276,
0.06872965395450592,
-0.031769171357154846,
-0.06008480489253998,
-0.06276219338178635,
0.05837061256170273,
-0.1054256483912468,
0.22303993999958038,
-0.03256956860423088,
0.04488839581608772,
0.12723790109157562,
0.048882801085710526,
-0.0721021518111229,
0.08361142873764038,
0.046385083347558975,
-0.05973880738019943,
0.034183550626039505,
0.08488596975803375,
-0.04158170893788338,
0.12098633497953415,
0.061533983796834946,
-0.13454097509384155,
0.01183896791189909,
-0.0333896167576313,
-0.09528689086437225,
-0.053371552377939224,
-0.04084678739309311,
-0.05917266756296158,
0.1325230598449707,
0.18787863850593567,
-0.03565766662359238,
0.01031640823930502,
-0.042722515761852264,
0.01499985158443451,
0.06732244789600372,
0.037976447492837906,
-0.0362069196999073,
-0.22765406966209412,
0.027533134445548058,
0.06291575729846954,
-0.0011935088550671935,
-0.2825477123260498,
-0.0817604511976242,
-0.012147908098995686,
-0.043350353837013245,
-0.09466942399740219,
0.08840856701135635,
0.115601547062397,
0.05181148648262024,
-0.06292437762022018,
-0.08895543962717056,
-0.07488109916448593,
0.15622596442699432,
-0.12520626187324524,
-0.09613480418920517
] |
null | null | peft |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# Mistral-7B-v0.1-compliance-copilot-identity-exquisite-tulip-20
This model is a fine-tuned version of [mistralai/Mistral-7B-v0.1](https://huggingface.co/mistralai/Mistral-7B-v0.1) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.8188
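
Because this repository holds a PEFT adapter rather than full model weights, a minimal loading sketch (assuming the adapter files are hosted under this repo id) looks like:

```python
# Minimal sketch (assumption: the adapter is hosted under this repo id).
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base = AutoModelForCausalLM.from_pretrained("mistralai/Mistral-7B-v0.1")
tokenizer = AutoTokenizer.from_pretrained("mistralai/Mistral-7B-v0.1")
model = PeftModel.from_pretrained(
    base, "ripjar/Mistral-7B-v0.1-compliance-copilot-identity-exquisite-tulip-20"
)
```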
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 1
- eval_batch_size: 1
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 4
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: constant
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 3
- mixed_precision_training: Native AMP
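
A minimal sketch of the corresponding `transformers.TrainingArguments` (the LoRA/PEFT configuration and dataset are not documented in this card and are omitted):

```python
# Minimal sketch mirroring only the values listed above; everything else is assumed.
from transformers import TrainingArguments

args = TrainingArguments(
    output_dir="Mistral-7B-v0.1-compliance-copilot-identity-exquisite-tulip-20",
    learning_rate=1e-4,
    per_device_train_batch_size=1,
    per_device_eval_batch_size=1,
    gradient_accumulation_steps=4,
    lr_scheduler_type="constant",
    warmup_ratio=0.1,
    num_train_epochs=3,
    seed=42,
    fp16=True,  # "Native AMP" mixed-precision training
)
```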
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:----:|:---------------:|
| 0.8797 | 0.4 | 500 | 1.0525 |
| 0.9754 | 0.8 | 1000 | 1.0462 |
| 1.0449 | 1.21 | 1500 | 1.1108 |
| 0.7545 | 1.61 | 2000 | 1.0665 |
| 0.853 | 2.01 | 2500 | 0.8735 |
| 1.0528 | 2.41 | 3000 | 0.8771 |
| 0.9369 | 2.82 | 3500 | 0.8188 |
### Framework versions
- PEFT 0.8.1
- Transformers 4.37.2
- Pytorch 2.1.2+cu121
- Datasets 2.16.1
- Tokenizers 0.15.0 | {"license": "apache-2.0", "library_name": "peft", "tags": ["generated_from_trainer"], "base_model": "mistralai/Mistral-7B-v0.1", "model-index": [{"name": "Mistral-7B-v0.1-compliance-copilot-identity-exquisite-tulip-20", "results": []}]} | null | ripjar/Mistral-7B-v0.1-compliance-copilot-identity-exquisite-tulip-20 | [
"peft",
"safetensors",
"generated_from_trainer",
"base_model:mistralai/Mistral-7B-v0.1",
"license:apache-2.0",
"region:us"
] | 2024-02-14T15:28:59+00:00 | [] | [] | TAGS
#peft #safetensors #generated_from_trainer #base_model-mistralai/Mistral-7B-v0.1 #license-apache-2.0 #region-us
| Mistral-7B-v0.1-compliance-copilot-identity-exquisite-tulip-20
==============================================================
This model is a fine-tuned version of mistralai/Mistral-7B-v0.1 on an unknown dataset.
It achieves the following results on the evaluation set:
* Loss: 0.8188
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 0.0001
* train\_batch\_size: 1
* eval\_batch\_size: 1
* seed: 42
* gradient\_accumulation\_steps: 4
* total\_train\_batch\_size: 4
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: constant
* lr\_scheduler\_warmup\_ratio: 0.1
* num\_epochs: 3
* mixed\_precision\_training: Native AMP
### Training results
### Framework versions
* PEFT 0.8.1
* Transformers 4.37.2
* Pytorch 2.1.2+cu121
* Datasets 2.16.1
* Tokenizers 0.15.0
| [
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 0.0001\n* train\\_batch\\_size: 1\n* eval\\_batch\\_size: 1\n* seed: 42\n* gradient\\_accumulation\\_steps: 4\n* total\\_train\\_batch\\_size: 4\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: constant\n* lr\\_scheduler\\_warmup\\_ratio: 0.1\n* num\\_epochs: 3\n* mixed\\_precision\\_training: Native AMP",
"### Training results",
"### Framework versions\n\n\n* PEFT 0.8.1\n* Transformers 4.37.2\n* Pytorch 2.1.2+cu121\n* Datasets 2.16.1\n* Tokenizers 0.15.0"
] | [
"TAGS\n#peft #safetensors #generated_from_trainer #base_model-mistralai/Mistral-7B-v0.1 #license-apache-2.0 #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 0.0001\n* train\\_batch\\_size: 1\n* eval\\_batch\\_size: 1\n* seed: 42\n* gradient\\_accumulation\\_steps: 4\n* total\\_train\\_batch\\_size: 4\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: constant\n* lr\\_scheduler\\_warmup\\_ratio: 0.1\n* num\\_epochs: 3\n* mixed\\_precision\\_training: Native AMP",
"### Training results",
"### Framework versions\n\n\n* PEFT 0.8.1\n* Transformers 4.37.2\n* Pytorch 2.1.2+cu121\n* Datasets 2.16.1\n* Tokenizers 0.15.0"
] | [
45,
158,
4,
39
] | [
"passage: TAGS\n#peft #safetensors #generated_from_trainer #base_model-mistralai/Mistral-7B-v0.1 #license-apache-2.0 #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 0.0001\n* train\\_batch\\_size: 1\n* eval\\_batch\\_size: 1\n* seed: 42\n* gradient\\_accumulation\\_steps: 4\n* total\\_train\\_batch\\_size: 4\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: constant\n* lr\\_scheduler\\_warmup\\_ratio: 0.1\n* num\\_epochs: 3\n* mixed\\_precision\\_training: Native AMP### Training results### Framework versions\n\n\n* PEFT 0.8.1\n* Transformers 4.37.2\n* Pytorch 2.1.2+cu121\n* Datasets 2.16.1\n* Tokenizers 0.15.0"
] | [
-0.12177665531635284,
0.0628223866224289,
-0.0034671134781092405,
0.08186446130275726,
0.1408240646123886,
0.03292229026556015,
0.107639379799366,
0.11696294695138931,
-0.08860059082508087,
0.07559622079133987,
0.0996089056134224,
0.08202996104955673,
0.042389631271362305,
0.13021127879619598,
-0.02789536863565445,
-0.27009809017181396,
0.020974302664399147,
-0.004916554316878319,
-0.1043669804930687,
0.1252126395702362,
0.10252080857753754,
-0.11740578711032867,
0.04452669247984886,
-0.0024305705446749926,
-0.13675424456596375,
-0.004717509727925062,
-0.004015450831502676,
-0.05356350541114807,
0.11747397482395172,
-0.0001386342046316713,
0.1315288543701172,
0.024857668206095695,
0.1256755292415619,
-0.2077021598815918,
0.004423369187861681,
0.06498533487319946,
0.023122573271393776,
0.08782102167606354,
0.11883767694234848,
-0.0067987265065312386,
0.11628800630569458,
-0.0824299156665802,
0.07642533630132675,
0.03576067090034485,
-0.1323486864566803,
-0.3133811354637146,
-0.11010409891605377,
0.07670203596353531,
0.12657591700553894,
0.08821644634008408,
-0.011099900119006634,
0.09852645546197891,
-0.08764271438121796,
0.07536116242408752,
0.29815056920051575,
-0.2509367763996124,
-0.07057691365480423,
0.00864732638001442,
0.021811150014400482,
0.03956523537635803,
-0.10098651051521301,
-0.051491886377334595,
0.03536944091320038,
0.040421053767204285,
0.1177826076745987,
0.014273366890847683,
-0.04857997968792915,
0.012622583657503128,
-0.161234050989151,
-0.042663272470235825,
0.09137670695781708,
0.044397685676813126,
-0.045578740537166595,
-0.03838619217276573,
-0.04985470697283745,
-0.18580876290798187,
-0.05720674619078636,
0.012455104850232601,
0.03468127176165581,
-0.05359473451972008,
-0.043641526252031326,
0.03487808257341385,
-0.08400049805641174,
-0.10397698730230331,
0.012365998700261116,
0.19143785536289215,
0.07685939967632294,
0.0008114464581012726,
0.0026619748678058386,
0.13931745290756226,
0.001462990534491837,
-0.15931940078735352,
0.007876099087297916,
0.028553757816553116,
-0.05815345421433449,
-0.02615022286772728,
-0.043322641402482986,
0.010402156040072441,
0.015004541724920273,
0.16437305510044098,
-0.1153455451130867,
0.05720210820436478,
0.05298522114753723,
0.024211063981056213,
-0.10246951878070831,
0.14223916828632355,
-0.07118753343820572,
0.006015488877892494,
-0.008676202036440372,
0.1085854321718216,
0.019824029877781868,
0.007435406558215618,
-0.05150485411286354,
0.012897023931145668,
0.11148006469011307,
0.054734934121370316,
-0.027579091489315033,
0.02143533155322075,
-0.055435799062252045,
-0.0061411261558532715,
0.05186307802796364,
-0.10862218588590622,
0.03584558516740799,
0.03146364912390709,
-0.08731365203857422,
-0.02076684683561325,
0.026409750804305077,
-0.0027013684157282114,
-0.003997205290943384,
0.140279158949852,
-0.07523461431264877,
0.0214025117456913,
-0.09687984734773636,
-0.10397876799106598,
0.018306275829672813,
-0.0047744764015078545,
-0.0010980201186612248,
-0.0805499255657196,
-0.16210776567459106,
-0.021455174311995506,
0.04628833755850792,
-0.05755777284502983,
-0.04814722761511803,
-0.015650346875190735,
-0.10192249715328217,
0.02885463275015354,
-0.01439458318054676,
0.151011660695076,
-0.06282459944486618,
0.12376860529184341,
0.03610964119434357,
0.046107541769742966,
0.02628065086901188,
0.04581698775291443,
-0.08457806706428528,
0.046637795865535736,
-0.20685894787311554,
0.016454508528113365,
-0.08249560743570328,
0.05222560465335846,
-0.12210782617330551,
-0.11929541081190109,
0.024664774537086487,
-0.029270926490426064,
0.12212958931922913,
0.13717956840991974,
-0.14772935211658478,
-0.05959821119904518,
0.1382347047328949,
-0.10702908784151077,
-0.11206702888011932,
0.10613387823104858,
-0.018740613013505936,
-0.029334403574466705,
0.02191748470067978,
0.13054750859737396,
0.06584841012954712,
-0.11330816149711609,
0.0043664793483912945,
-0.05141666904091835,
0.10414911061525345,
0.0106700723990798,
0.0978369191288948,
-0.0129619799554348,
-0.01717226393520832,
0.0204518623650074,
-0.08268813788890839,
0.06750448793172836,
-0.12128835171461105,
-0.08677764236927032,
-0.050599511712789536,
-0.07375574856996536,
0.04222465306520462,
0.05912471562623978,
0.021691439673304558,
-0.08701447397470474,
-0.11695234477519989,
0.0520341582596302,
0.1282980591058731,
-0.05697442591190338,
0.02193094789981842,
-0.03348197042942047,
0.09637126326560974,
-0.03175460174679756,
-0.0209814440459013,
-0.18465927243232727,
-0.08790937066078186,
0.016674648970365524,
-0.0709039717912674,
-0.017585203051567078,
-0.07867363095283508,
0.07674851268529892,
0.10600734502077103,
-0.08038530498743057,
-0.07836202532052994,
-0.0933532565832138,
0.00025109489797614515,
-0.09639860689640045,
-0.24859142303466797,
-0.08891797065734863,
-0.018513796851038933,
0.12248772382736206,
-0.21547912061214447,
0.024842845275998116,
-0.03171705827116966,
0.1168268471956253,
0.00633647944778204,
-0.03386710211634636,
-0.025770995765924454,
0.07510498911142349,
-0.017798325046896935,
-0.06975740939378738,
0.03095661662518978,
-0.010966730304062366,
-0.08500833809375763,
-0.02060820907354355,
-0.09755662083625793,
0.12989532947540283,
0.10042494535446167,
-0.01314394362270832,
-0.12077296525239944,
-0.018980590626597404,
-0.08567468076944351,
-0.05410763621330261,
-0.05509064346551895,
0.02995903044939041,
0.10611490160226822,
0.022126877680420876,
0.11023348569869995,
-0.07416895776987076,
-0.04531038552522659,
0.0254222359508276,
-0.022789712995290756,
0.01609850861132145,
0.13576211035251617,
0.10000070184469223,
-0.04072604700922966,
0.12139208614826202,
0.1500501185655594,
-0.09108761698007584,
0.08653616905212402,
-0.08116381615400314,
-0.11036284267902374,
-0.03391890972852707,
0.025368427857756615,
0.017412805929780006,
0.14324766397476196,
-0.031845495104789734,
0.03774886950850487,
0.024128269404172897,
0.016570521518588066,
0.006321490742266178,
-0.21812540292739868,
-0.03741880878806114,
0.020917804911732674,
-0.05418343096971512,
-0.03607682138681412,
-0.030209746211767197,
0.023654645308852196,
0.11370045691728592,
-0.0077449120581150055,
-0.07622092962265015,
-0.008891220204532146,
-0.0005140879075042903,
-0.07566151767969131,
0.22623108327388763,
-0.08710501343011856,
-0.06093017756938934,
-0.11059550195932388,
0.001562097342684865,
-0.03290397673845291,
-0.03146728128194809,
0.04911201447248459,
-0.0952705442905426,
-0.041694771498441696,
-0.07054657489061356,
0.03166578337550163,
-0.0006458106799982488,
0.03358197957277298,
-0.028532207012176514,
-0.0033139337319880724,
0.09195351600646973,
-0.0965103805065155,
0.008121037855744362,
-0.03224153071641922,
-0.06286298483610153,
0.0467907078564167,
0.07079987227916718,
0.09444083273410797,
0.16312113404273987,
0.012887656688690186,
-0.000043830848881043494,
-0.01645704358816147,
0.20522721111774445,
-0.06475594639778137,
-0.028659440577030182,
0.12105897814035416,
0.020465966314077377,
0.068569615483284,
0.13920573890209198,
0.07614125311374664,
-0.08002522587776184,
0.00872031319886446,
0.03930043801665306,
-0.00966272410005331,
-0.23699608445167542,
-0.052992064505815506,
-0.02796187251806259,
-0.035673078149557114,
0.10538206994533539,
0.03560464456677437,
-0.017743196338415146,
0.04071648791432381,
-0.023704690858721733,
-0.010708450339734554,
-0.008486596867442131,
0.08355017751455307,
0.02968796342611313,
0.05628839507699013,
0.11486262828111649,
-0.02005579136312008,
-0.027357541024684906,
0.030336393043398857,
-0.03429492563009262,
0.25209179520606995,
-0.039133451879024506,
0.09404405206441879,
0.0583316870033741,
0.19716951251029968,
-0.011307347565889359,
0.0850088894367218,
0.024073442444205284,
-0.03425755724310875,
0.020718028768897057,
-0.07301320880651474,
0.0068628499284386635,
0.022918768227100372,
-0.041755978018045425,
0.0738438069820404,
-0.14096355438232422,
-0.035242997109889984,
0.03280335292220116,
0.31197547912597656,
0.057493481785058975,
-0.3235360085964203,
-0.09124857932329178,
0.0004320417356211692,
-0.02953396923840046,
-0.035551514476537704,
0.002092992188408971,
0.07789797335863113,
-0.07416199147701263,
0.05500521883368492,
-0.07292882353067398,
0.07974497228860855,
0.013113697059452534,
0.0031865090131759644,
0.09314979612827301,
0.11277351528406143,
-0.002368832705542445,
0.05864810198545456,
-0.24256817996501923,
0.3092867136001587,
-0.00045168495853431523,
0.07715601474046707,
-0.021859891712665558,
0.016974007710814476,
0.029775992035865784,
0.03177543357014656,
0.05549708008766174,
-0.010097168385982513,
-0.08561823517084122,
-0.23338693380355835,
-0.09850186109542847,
0.01597057655453682,
0.10840120911598206,
-0.04273379221558571,
0.1336529552936554,
-0.023065639659762383,
-0.015635957941412926,
0.05896637961268425,
-0.04862845316529274,
-0.1260349303483963,
-0.0464167445898056,
0.018307138234376907,
-0.018718082457780838,
0.019982310011982918,
-0.1124962866306305,
-0.11585824936628342,
-0.05959302559494972,
0.12547188997268677,
-0.05968286842107773,
-0.04854769632220268,
-0.14113955199718475,
0.08711851388216019,
0.1698647439479828,
-0.08186409622430801,
0.056358784437179565,
0.011919359676539898,
0.08752749115228653,
0.027305593714118004,
-0.04154324159026146,
0.10116913914680481,
-0.0892108827829361,
-0.23075248301029205,
-0.0517878457903862,
0.13519634306430817,
0.049437861889600754,
0.05771791189908981,
-0.04135751724243164,
0.04386257007718086,
-0.008945914916694164,
-0.10062088817358017,
0.02897573821246624,
0.011769739910960197,
0.053702495992183685,
0.05939898639917374,
-0.060946233570575714,
0.014377426356077194,
-0.0607081763446331,
-0.045289505273103714,
0.08505649119615555,
0.33451420068740845,
-0.09104123711585999,
0.020681466907262802,
0.06506776064634323,
-0.04498862847685814,
-0.15746772289276123,
0.0006159188924357295,
0.1131591722369194,
-0.009522249922156334,
0.06031296029686928,
-0.17857375741004944,
0.07940951734781265,
0.11584019660949707,
-0.03507092967629433,
0.0934249684214592,
-0.3440726697444916,
-0.12903867661952972,
0.11312749236822128,
0.14982351660728455,
0.01934799738228321,
-0.18236581981182098,
-0.04175742343068123,
0.0005867747822776437,
-0.12392288446426392,
0.07374287396669388,
-0.06482907384634018,
0.10053147375583649,
-0.042630285024642944,
0.04715285077691078,
0.020459040999412537,
-0.06888609379529953,
0.1680741161108017,
-0.020567839965224266,
0.1008046567440033,
-0.0212805587798357,
0.0232032872736454,
0.04531288892030716,
-0.05775325745344162,
0.04046463966369629,
-0.04709646478295326,
0.04137023538351059,
-0.07988446205854416,
-0.007987825199961662,
-0.1079791933298111,
0.034230638295412064,
-0.054858721792697906,
-0.049067288637161255,
-0.01565519906580448,
0.04630366712808609,
0.039724983274936676,
-0.031308162957429886,
0.12977565824985504,
0.027794070541858673,
0.18484964966773987,
0.09244967252016068,
0.018951082602143288,
0.0112527497112751,
-0.10953123867511749,
-0.0073394086211919785,
-0.025149963796138763,
0.05785500258207321,
-0.1425674557685852,
0.014938073232769966,
0.13509589433670044,
0.04944614693522453,
0.11419980227947235,
0.07602816820144653,
-0.07607796788215637,
-0.009123021736741066,
0.07535352557897568,
-0.1330363154411316,
-0.09072045981884003,
-0.0016051710117608309,
0.04717693850398064,
-0.13468046486377716,
0.03410743176937103,
0.09128765016794205,
-0.08685820549726486,
-0.023388344794511795,
0.004078652244061232,
0.04687543213367462,
-0.042407549917697906,
0.24883827567100525,
0.053871363401412964,
0.08407903462648392,
-0.09638810902833939,
0.09546027332544327,
0.06821085512638092,
-0.08915568888187408,
-0.0037671374157071114,
0.10644938796758652,
-0.06028149649500847,
-0.024999871850013733,
0.09904780983924866,
0.09598725289106369,
-0.04800201207399368,
-0.047335393726825714,
-0.1497335582971573,
-0.13975852727890015,
0.0751139298081398,
0.11599724739789963,
0.0628371462225914,
0.021436765789985657,
0.0073590464890003204,
0.02526811510324478,
-0.11417914181947708,
0.10487496107816696,
0.09399495273828506,
0.09247180074453354,
-0.13876290619373322,
0.14803647994995117,
0.007981885224580765,
0.028359809890389442,
-0.0033984307665377855,
0.03715918958187103,
-0.12019041925668716,
0.018651504069566727,
-0.14298087358474731,
-0.033399902284145355,
-0.03255598992109299,
-0.008553561754524708,
-0.006990132853388786,
-0.059445079416036606,
-0.04020892456173897,
0.0356699600815773,
-0.11532662808895111,
-0.04447444900870323,
-0.004774669650942087,
0.04422585293650627,
-0.14125385880470276,
-0.04022710397839546,
0.041361454874277115,
-0.10290302336215973,
0.07207326591014862,
0.066141776740551,
0.06083185598254204,
0.05535414442420006,
-0.13850419223308563,
0.002312877681106329,
0.04281511902809143,
-0.017617464065551758,
0.03728644177317619,
-0.17060577869415283,
-0.015634706243872643,
-0.04285799711942673,
0.018043451011180878,
0.0008340795175172389,
0.03259742632508278,
-0.1546272337436676,
-0.016382966190576553,
-0.019148483872413635,
-0.06670159846544266,
-0.048746656626462936,
0.012814279645681381,
0.07164108008146286,
0.02170424535870552,
0.1568630188703537,
-0.09448562562465668,
0.051928482949733734,
-0.23451271653175354,
-0.013233528472483158,
-0.04529096186161041,
-0.0726165771484375,
-0.11562094837427139,
-0.006081577856093645,
0.08183642476797104,
-0.05431484431028366,
0.07059770077466965,
-0.04054219275712967,
0.07435642182826996,
0.03995512053370476,
-0.03321029990911484,
0.038581471890211105,
0.052666425704956055,
0.184658944606781,
0.033622875809669495,
-0.023288311436772346,
0.05373552441596985,
0.0337115041911602,
0.06618726998567581,
0.09871374815702438,
0.18259966373443604,
0.15038125216960907,
0.033457569777965546,
0.06880179792642593,
0.04468127340078354,
-0.1273280531167984,
-0.11562945693731308,
0.059452421963214874,
-0.02392025850713253,
0.09813163429498672,
-0.01738480105996132,
0.21253593266010284,
0.10829201340675354,
-0.1877686083316803,
0.0460873544216156,
-0.04248913750052452,
-0.07472510635852814,
-0.10080975294113159,
-0.013609526678919792,
-0.0620899423956871,
-0.1654219925403595,
0.003956920001655817,
-0.10070973634719849,
0.038393743336200714,
0.10491601377725601,
0.019443951547145844,
0.029293326660990715,
0.14436796307563782,
0.10928575694561005,
0.025370486080646515,
0.048861999064683914,
0.03253962844610214,
-0.020546993240714073,
-0.0232703797519207,
-0.08125487715005875,
0.026305198669433594,
-0.045851726084947586,
0.02671106718480587,
-0.0390843041241169,
-0.09533344209194183,
0.0706973522901535,
0.027431368827819824,
-0.11463667452335358,
0.03856423869729042,
-0.009556188248097897,
0.0683508813381195,
0.07635830342769623,
0.019847974181175232,
0.030816208571195602,
-0.032010357826948166,
0.2619931697845459,
-0.08189389854669571,
-0.08057007938623428,
-0.0934443548321724,
0.30714187026023865,
0.0330556221306324,
-0.03195636346936226,
0.03989448398351669,
-0.0895097628235817,
-0.014772411435842514,
0.12267960608005524,
0.12810444831848145,
-0.03808924928307533,
-0.014755018055438995,
0.0010834907880052924,
-0.0290870051831007,
-0.051966045051813126,
0.09831680357456207,
0.12247904390096664,
0.0683387815952301,
-0.1069321408867836,
-0.0018397006206214428,
-0.06636916846036911,
-0.02790266089141369,
-0.059192441403865814,
0.058115698397159576,
0.022017959505319595,
-0.0026727255899459124,
-0.04765095189213753,
0.08880552649497986,
-0.034654222428798676,
-0.11859112977981567,
0.0654200091958046,
-0.17099420726299286,
-0.19062736630439758,
-0.03616008162498474,
0.09111260622739792,
0.023731932044029236,
0.06359434127807617,
-0.02389717288315296,
-0.01734968274831772,
0.09930416196584702,
-0.015327065251767635,
-0.03955019265413284,
-0.15747210383415222,
0.09990480542182922,
-0.049431879073381424,
0.23789428174495697,
-0.022852327674627304,
0.057928893715143204,
0.11381935328245163,
0.033327069133520126,
-0.13258686661720276,
0.04914672300219536,
0.08107152581214905,
-0.14386802911758423,
0.02623872086405754,
0.1556837260723114,
-0.04164639487862587,
0.07177925854921341,
0.018130552023649216,
-0.126771941781044,
-0.0007357891881838441,
-0.023088928312063217,
-0.05495133623480797,
-0.04017186537384987,
-0.022695867344737053,
-0.02767997421324253,
0.12278292328119278,
0.21480639278888702,
-0.06620676815509796,
0.006288663949817419,
-0.07498949021100998,
0.036896560341119766,
0.06471077352762222,
0.0960618183016777,
-0.008188068866729736,
-0.2527819573879242,
0.05507100000977516,
0.06516867876052856,
0.014527102932333946,
-0.20972256362438202,
-0.09617374837398529,
0.055704519152641296,
-0.0709640309214592,
-0.09978204220533371,
0.10510627925395966,
0.07442116737365723,
0.05549148842692375,
-0.04403362050652504,
-0.10810086131095886,
-0.07339084893465042,
0.16895611584186554,
-0.161523699760437,
-0.07946155220270157
] |
null | null | sample-factory |
A(n) **APPO** model trained on the **doom_health_gathering_supreme** environment.
This model was trained using Sample-Factory 2.0: https://github.com/alex-petrenko/sample-factory.
Documentation for how to use Sample-Factory can be found at https://www.samplefactory.dev/
## Downloading the model
After installing Sample-Factory, download the model with:
```
python -m sample_factory.huggingface.load_from_hub -r Facepalm0/rl_course_vizdoom_health_gathering_supreme
```
## Using the model
To run the model after download, use the `enjoy` script corresponding to this environment:
```
# standard Sample-Factory 2.0 VizDoom entry point; the original Colab launcher path is not usable outside the notebook
python -m sf_examples.vizdoom.enjoy_vizdoom --algo=APPO --env=doom_health_gathering_supreme --train_dir=./train_dir --experiment=rl_course_vizdoom_health_gathering_supreme
```
You can also upload models to the Hugging Face Hub using the same script with the `--push_to_hub` flag.
See https://www.samplefactory.dev/10-huggingface/huggingface/ for more details
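For example, a run like the following would evaluate the policy and upload it to the Hub (a sketch based on the Sample-Factory 2.0 Hugging Face integration; the repository name is a placeholder to replace with your own):
```
python -m sf_examples.vizdoom.enjoy_vizdoom --algo=APPO --env=doom_health_gathering_supreme --train_dir=./train_dir --experiment=rl_course_vizdoom_health_gathering_supreme --max_num_episodes=10 --push_to_hub --hf_repository=<your-username>/rl_course_vizdoom_health_gathering_supreme
```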
## Training with this model
To continue training with this model, use the `train` script corresponding to this environment:
```
# standard Sample-Factory 2.0 VizDoom training entry point
python -m sf_examples.vizdoom.train_vizdoom --algo=APPO --env=doom_health_gathering_supreme --train_dir=./train_dir --experiment=rl_course_vizdoom_health_gathering_supreme --restart_behavior=resume --train_for_env_steps=10000000000
```
Note that you may have to adjust `--train_for_env_steps` to a suitably high number, as the experiment will resume from the number of steps it had already completed.
| {"library_name": "sample-factory", "tags": ["deep-reinforcement-learning", "reinforcement-learning", "sample-factory"], "model-index": [{"name": "APPO", "results": [{"task": {"type": "reinforcement-learning", "name": "reinforcement-learning"}, "dataset": {"name": "doom_health_gathering_supreme", "type": "doom_health_gathering_supreme"}, "metrics": [{"type": "mean_reward", "value": "10.73 +/- 4.71", "name": "mean_reward", "verified": false}]}]}]} | reinforcement-learning | Facepalm0/rl_course_vizdoom_health_gathering_supreme | [
"sample-factory",
"tensorboard",
"deep-reinforcement-learning",
"reinforcement-learning",
"model-index",
"region:us"
] | 2024-02-14T15:30:12+00:00 | [] | [] | TAGS
#sample-factory #tensorboard #deep-reinforcement-learning #reinforcement-learning #model-index #region-us
|
A(n) APPO model trained on the doom_health_gathering_supreme environment.
This model was trained using Sample-Factory 2.0: URL
Documentation for how to use Sample-Factory can be found at URL
## Downloading the model
After installing Sample-Factory, download the model with:
## Using the model
To run the model after download, use the 'enjoy' script corresponding to this environment:
You can also upload models to the Hugging Face Hub using the same script with the '--push_to_hub' flag.
See URL for more details
## Training with this model
To continue training with this model, use the 'train' script corresponding to this environment:
Note, you may have to adjust '--train_for_env_steps' to a suitably high number as the experiment will resume at the number of steps it concluded at.
| [
"## Downloading the model\n\nAfter installing Sample-Factory, download the model with:",
"## Using the model\n\nTo run the model after download, use the 'enjoy' script corresponding to this environment:\n\n\n\nYou can also upload models to the Hugging Face Hub using the same script with the '--push_to_hub' flag.\nSee URL for more details",
"## Training with this model\n\nTo continue training with this model, use the 'train' script corresponding to this environment:\n\n\nNote, you may have to adjust '--train_for_env_steps' to a suitably high number as the experiment will resume at the number of steps it concluded at."
] | [
"TAGS\n#sample-factory #tensorboard #deep-reinforcement-learning #reinforcement-learning #model-index #region-us \n",
"## Downloading the model\n\nAfter installing Sample-Factory, download the model with:",
"## Using the model\n\nTo run the model after download, use the 'enjoy' script corresponding to this environment:\n\n\n\nYou can also upload models to the Hugging Face Hub using the same script with the '--push_to_hub' flag.\nSee URL for more details",
"## Training with this model\n\nTo continue training with this model, use the 'train' script corresponding to this environment:\n\n\nNote, you may have to adjust '--train_for_env_steps' to a suitably high number as the experiment will resume at the number of steps it concluded at."
] | [
34,
19,
59,
67
] | [
"passage: TAGS\n#sample-factory #tensorboard #deep-reinforcement-learning #reinforcement-learning #model-index #region-us \n## Downloading the model\n\nAfter installing Sample-Factory, download the model with:## Using the model\n\nTo run the model after download, use the 'enjoy' script corresponding to this environment:\n\n\n\nYou can also upload models to the Hugging Face Hub using the same script with the '--push_to_hub' flag.\nSee URL for more details## Training with this model\n\nTo continue training with this model, use the 'train' script corresponding to this environment:\n\n\nNote, you may have to adjust '--train_for_env_steps' to a suitably high number as the experiment will resume at the number of steps it concluded at."
] | [
-0.162887305021286,
-0.07949446886777878,
0.0013769814977422357,
0.0244897473603487,
0.13643795251846313,
0.08826540410518646,
0.13243556022644043,
0.07938782125711441,
0.19449298083782196,
0.07451266050338745,
0.12160012871026993,
0.06742649525403976,
0.02505551464855671,
0.31084391474723816,
0.08655242621898651,
-0.18235880136489868,
0.031082456931471825,
-0.06436605006456375,
-0.02882574498653412,
0.05590416118502617,
0.050910040736198425,
-0.06422623991966248,
0.11641133576631546,
-0.05714287608861923,
-0.15497641265392303,
0.08288847655057907,
0.008126083761453629,
0.03596968948841095,
0.12199652194976807,
-0.007729834411293268,
0.06358569860458374,
0.02508161962032318,
0.09885215014219284,
-0.08979995548725128,
0.05817115306854248,
0.037268251180648804,
-0.005583701189607382,
0.0697544738650322,
-0.02916712686419487,
0.01197513286024332,
0.20552261173725128,
0.051445573568344116,
-0.014811687171459198,
0.0707944929599762,
-0.04854035750031471,
0.005004523321986198,
0.024828260764479637,
0.08118943125009537,
0.1108563020825386,
-0.013300174847245216,
-0.015604399144649506,
0.2098497599363327,
-0.045419543981552124,
0.030687451362609863,
0.1803472340106964,
-0.13901305198669434,
-0.00587898213416338,
0.3598267436027527,
0.13591337203979492,
0.07389762997627258,
-0.05572221428155899,
0.065569669008255,
0.12957775592803955,
-0.013377981260418892,
-0.022062024101614952,
-0.037468962371349335,
0.01014290377497673,
0.02470328100025654,
-0.08271043002605438,
-0.03898613899946213,
0.18779566884040833,
0.027798498049378395,
-0.0647122785449028,
-0.11388745903968811,
-0.08383605629205704,
-0.01143614575266838,
-0.08729266375303268,
-0.06047317758202553,
0.061255209147930145,
0.06450130045413971,
-0.05541218817234039,
-0.16354843974113464,
-0.08759765326976776,
-0.14808951318264008,
0.09711641818284988,
-0.018818290904164314,
0.020023507997393608,
0.039053402841091156,
-0.13240769505500793,
0.13932685554027557,
-0.12239529192447662,
-0.005040881223976612,
-0.00391974626109004,
-0.10012788325548172,
-0.0298643596470356,
-0.02757178619503975,
-0.06954579800367355,
-0.08072661608457565,
0.06621979922056198,
0.1397300660610199,
0.1075919046998024,
0.04457515478134155,
-0.016096504405140877,
0.0929836705327034,
0.0659836158156395,
0.015487046912312508,
-0.046446919441223145,
-0.03190334141254425,
0.06750229746103287,
0.09463070333003998,
-0.0025161339435726404,
-0.04405781999230385,
-0.12502750754356384,
0.004669501446187496,
-0.05889439582824707,
0.07438734918832779,
-0.01944235898554325,
0.09347380697727203,
0.0012449703644961119,
-0.0658751055598259,
0.09675891697406769,
-0.056166794151067734,
-0.015024078078567982,
0.05717969685792923,
-0.09829384088516235,
-0.044000294059515,
0.02636338584125042,
-0.018662840127944946,
0.02191256918013096,
-0.08697114139795303,
-0.1281215101480484,
-0.0406981036067009,
-0.15496762096881866,
-0.0733695924282074,
0.020342092961072922,
-0.10162562131881714,
0.040819648653268814,
-0.08701786398887634,
-0.27291807532310486,
-0.016108427196741104,
0.05915366858243942,
0.0003154690202791244,
0.03663148358464241,
-0.06209208071231842,
0.0267410296946764,
-0.030988745391368866,
-0.013702943921089172,
0.12538094818592072,
-0.04706621542572975,
0.005733184050768614,
0.02853262610733509,
0.09092917293310165,
0.029396481812000275,
-0.011824010871350765,
-0.09237373620271683,
0.03002769686281681,
-0.1866937130689621,
0.0038047281559556723,
-0.051012441515922546,
0.14028684794902802,
-0.07785230129957199,
-0.0034444157499819994,
-0.07691079378128052,
0.06912831217050552,
0.052552226930856705,
0.21963854134082794,
-0.22059281170368195,
-0.09743031859397888,
0.1902308464050293,
-0.09678838402032852,
-0.1949385702610016,
0.06732125580310822,
-0.03079940192401409,
0.20069970190525055,
0.02597416751086712,
0.1891578733921051,
0.00020795770979020745,
-0.25584760308265686,
0.035303130745887756,
0.07686726003885269,
-0.2078019231557846,
-0.11653494834899902,
0.00783967413008213,
0.04216665402054787,
-0.050144799053668976,
0.023388857021927834,
-0.07392873615026474,
0.1217033788561821,
-0.023950038477778435,
-0.021695949137210846,
-0.009935722686350346,
-0.06940963864326477,
-0.039610356092453,
0.012346661649644375,
0.06086154654622078,
-0.02202412113547325,
-0.025860905647277832,
-0.05173748731613159,
0.16720648109912872,
-0.0795547217130661,
0.011736705899238586,
-0.11241740733385086,
0.1497063785791397,
0.007124151568859816,
0.025635361671447754,
-0.0980280190706253,
-0.014672551304101944,
0.044151511043310165,
0.08621654659509659,
0.011970171704888344,
0.1326037049293518,
0.06774137914180756,
0.01454958226531744,
0.042493220418691635,
-0.004039871972054243,
-0.0012205307139083743,
-0.10230473428964615,
-0.05593033879995346,
-0.11311958730220795,
-0.11286478489637375,
-0.09429361671209335,
0.08868816494941711,
-0.20066434144973755,
0.05826579034328461,
-0.15120604634284973,
0.047645486891269684,
0.038803353905677795,
-0.07772190868854523,
0.05121537670493126,
-0.08661998063325882,
-0.021283775568008423,
-0.08784573525190353,
0.0805407464504242,
-0.014386715367436409,
-0.08415807038545609,
0.006313080433756113,
-0.09094364196062088,
-0.08295580744743347,
0.09175937622785568,
0.013830476440489292,
0.0026490744203329086,
-0.1170414388179779,
-0.04695970565080643,
0.001149212708696723,
0.03873389959335327,
-0.0591595321893692,
0.08649469166994095,
0.06776818633079529,
0.09646541625261307,
-0.09070473909378052,
0.03797374665737152,
-0.020416714251041412,
-0.06236580014228821,
-0.045745182782411575,
0.014070805162191391,
0.1767948418855667,
-0.022993814200162888,
-0.01734299771487713,
-0.005982444155961275,
-0.048861317336559296,
0.20095843076705933,
-0.018403954803943634,
-0.11935548484325409,
0.0030399553943425417,
-0.01395543571561575,
-0.017944620922207832,
0.11660698801279068,
-0.13726668059825897,
-0.05182260647416115,
0.030854813754558563,
-0.06529976427555084,
0.10216285288333893,
-0.08242622762918472,
-0.0392029769718647,
-0.05685178562998772,
-0.043409593403339386,
0.046979792416095734,
0.12330524623394012,
-0.07290767133235931,
-0.009151018224656582,
-0.047789376229047775,
-0.03510203957557678,
-0.025379952043294907,
-0.05724980682134628,
-0.11478709429502487,
0.1582695096731186,
0.002751561114564538,
-0.09990474581718445,
-0.17415542900562286,
-0.08029486984014511,
-0.03834356367588043,
0.05337152257561684,
-0.034037429839372635,
-0.04430336132645607,
-0.01500723510980606,
-0.07299388945102692,
0.1465158462524414,
0.063304103910923,
-0.0472191721200943,
-0.01852818764746189,
0.08560720086097717,
0.04456184431910515,
-0.15394946932792664,
0.007078593596816063,
-0.08948076516389847,
-0.08794131129980087,
0.03091353550553322,
-0.08061819523572922,
0.012820594012737274,
0.11341627687215805,
0.03525753691792488,
0.02826494723558426,
0.01035099383443594,
0.23537762463092804,
-0.0369284451007843,
-0.01093987375497818,
0.19019025564193726,
0.0682438537478447,
0.020443644374608994,
0.055847786366939545,
0.027420951053500175,
-0.15370461344718933,
0.10424364358186722,
0.012530675157904625,
-0.044538769870996475,
-0.10689681768417358,
-0.04666181653738022,
-0.03360101953148842,
0.09803235530853271,
0.12185155600309372,
0.03158954530954361,
0.025155838578939438,
0.096546471118927,
0.02187134325504303,
-0.0098390718922019,
-0.11183010786771774,
0.05996714532375336,
-0.1770814210176468,
-0.043808963149785995,
0.00898060668259859,
-0.028755301609635353,
0.00010461114288773388,
0.0659034252166748,
0.026660064235329628,
0.12833580374717712,
0.0295290257781744,
0.06181740015745163,
0.0663255974650383,
0.10200989991426468,
0.01538698747754097,
0.1999037265777588,
-0.06215142831206322,
-0.1075027585029602,
-0.03758005052804947,
-0.04118350148200989,
-0.11916319280862808,
0.12439136207103729,
0.1381523460149765,
-0.030515994876623154,
-0.06625506281852722,
0.07200724631547928,
0.014589293859899044,
0.08729344606399536,
0.08250882476568222,
-0.29115065932273865,
-0.034177567809820175,
0.031450141221284866,
0.01114452164620161,
-0.04308335855603218,
0.010566305369138718,
0.10542299598455429,
-0.07616783678531647,
-0.09982791543006897,
-0.03972722589969635,
0.1055394783616066,
0.08046542853116989,
0.03702867403626442,
-0.10841067880392075,
0.20128826797008514,
-0.01744360849261284,
0.07004447281360626,
-0.07662706822156906,
0.1728198230266571,
0.018701205030083656,
0.05943213775753975,
-0.07497778534889221,
-0.009592941962182522,
0.1228223443031311,
0.03374773636460304,
0.09092900156974792,
-0.0056656887754797935,
-0.09995020180940628,
-0.13336431980133057,
-0.1216202825307846,
0.024986369535326958,
-0.000090524394181557,
-0.08169890940189362,
0.03341596573591232,
-0.016717763617634773,
0.017487963661551476,
-0.0027857583481818438,
0.23440547287464142,
-0.18267135322093964,
0.012482558377087116,
-0.054521817713975906,
0.02707577496767044,
-0.04300008341670036,
-0.0709642544388771,
-0.027162717655301094,
0.060507629066705704,
0.09744840115308762,
0.07921962440013885,
0.030401866883039474,
-0.07419665157794952,
0.1431404948234558,
0.06514685600996017,
-0.058246973901987076,
-0.01524845976382494,
0.01951364241540432,
0.1256532073020935,
-0.07438289374113083,
-0.10393836349248886,
0.10585980117321014,
-0.11736445128917694,
0.008749126456677914,
-0.05019083246588707,
0.04299405962228775,
0.02305823378264904,
0.011290842667222023,
0.007447924464941025,
-0.04279239848256111,
0.0015383695717900991,
-0.06904047727584839,
0.0778660774230957,
0.020559091120958328,
-0.0047941361553967,
-0.0006717707728967071,
-0.16239388287067413,
0.08390985429286957,
-0.04138755425810814,
0.052877847105264664,
0.1489589661359787,
0.27864590287208557,
-0.02386910282075405,
0.030926240608096123,
0.1617380678653717,
-0.01897917501628399,
-0.2491649091243744,
0.04654841497540474,
0.014908025041222572,
0.10310175269842148,
0.04640066251158714,
-0.19236695766448975,
0.11111847311258316,
0.009474517777562141,
-0.02225719392299652,
0.009804603643715382,
-0.24880149960517883,
-0.13740544021129608,
0.17525193095207214,
0.06902051717042923,
0.15983323752880096,
-0.03665107116103172,
-0.013587141409516335,
-0.061109546571969986,
-0.03419603407382965,
-0.026354335248470306,
-0.12708203494548798,
0.12749767303466797,
-0.017607107758522034,
0.047745801508426666,
0.027817612513899803,
-0.07676684111356735,
0.12058744579553604,
-0.017944786697626114,
0.13344953954219818,
-0.017018258571624756,
-0.031023232266306877,
0.042466819286346436,
-0.09033756703138351,
0.1662607043981552,
-0.10233280807733536,
0.057950668036937714,
-0.11091876775026321,
-0.03109682910144329,
-0.015322481282055378,
0.15654151141643524,
0.005544521380215883,
-0.0855189636349678,
-0.041066281497478485,
0.04975702613592148,
-0.05784251168370247,
0.05022609233856201,
-0.0021613158751279116,
-0.03506873920559883,
0.022246064618229866,
0.08415499329566956,
0.040208954364061356,
-0.10403558611869812,
-0.011038471013307571,
0.03089289739727974,
0.01896476000547409,
0.09993185102939606,
-0.20835483074188232,
-0.020152123644948006,
0.019231827929615974,
-0.015702085569500923,
0.13085414469242096,
0.04400704801082611,
-0.08080117404460907,
0.027568496763706207,
0.13726983964443207,
-0.061186157166957855,
-0.030986590310931206,
-0.04847807064652443,
-0.016679393127560616,
-0.12794725596904755,
-0.01594163477420807,
0.057148490101099014,
-0.04251079633831978,
0.02512725070118904,
-0.03424951806664467,
0.0004248716577421874,
-0.10717252641916275,
0.07036283612251282,
0.06859682500362396,
0.0642281174659729,
-0.07167360186576843,
0.09394960850477219,
-0.07811970263719559,
0.014289900660514832,
0.03734226152300835,
0.045441556721925735,
-0.06931920349597931,
-0.06820165365934372,
-0.05322124809026718,
0.27575042843818665,
-0.024388493970036507,
-0.02025510184466839,
-0.06021025776863098,
0.11942195147275925,
-0.057836465537548065,
-0.06673881411552429,
0.08716115355491638,
-0.007450808770954609,
-0.059019722044467926,
0.022327717393636703,
-0.0734894648194313,
-0.014457973651587963,
0.04693116992712021,
0.016375891864299774,
-0.11610891669988632,
0.1136312261223793,
0.031648989766836166,
0.02891513518989086,
-0.09186926484107971,
-0.0486464723944664,
-0.12123195827007294,
0.0032020595390349627,
-0.025323880836367607,
-0.06051601842045784,
-0.07913094758987427,
-0.0425749197602272,
0.049642790108919144,
0.018434861674904823,
-0.08444267511367798,
-0.0022111251018941402,
-0.12617166340351105,
0.006370943505316973,
0.006689207162708044,
0.10316617041826248,
-0.06351965665817261,
0.04670397937297821,
0.10049878805875778,
-0.07692139595746994,
0.09893755614757538,
0.0846271738409996,
-0.00729260453954339,
0.08929292112588882,
-0.20261284708976746,
-0.02319980226457119,
0.047821637243032455,
0.055264540016651154,
0.03154374286532402,
0.06104309484362602,
0.013487739488482475,
-0.05460033565759659,
0.04538526386022568,
-0.03539090231060982,
0.0028435050044208765,
-0.09104080498218536,
0.09713591635227203,
0.009731475263834,
-0.009716489352285862,
-0.060456521809101105,
-0.01384128537029028,
0.01817488856613636,
0.10404353588819504,
0.09692291915416718,
-0.07237115502357483,
-0.0035003575030714273,
-0.11786255985498428,
0.024597108364105225,
0.02565017342567444,
0.010576808825135231,
0.03638135641813278,
-0.11692339926958084,
0.03729743883013725,
-0.05475534871220589,
0.19700418412685394,
0.019796879962086678,
-0.10531783103942871,
-0.008661900646984577,
0.07250577956438065,
0.17378750443458557,
-0.006129021290689707,
0.21011123061180115,
0.05919691175222397,
0.09556611627340317,
0.0324610099196434,
0.11373614519834518,
0.11542147397994995,
0.004254546947777271,
0.10733281821012497,
0.0500684529542923,
-0.04822303727269173,
0.14306919276714325,
0.032827045768499374,
-0.017670227214694023,
0.0304852481931448,
0.04704435542225838,
-0.03187015652656555,
0.02075354754924774,
-0.06440161913633347,
0.11196915805339813,
0.13514995574951172,
-0.08471442013978958,
-0.0081911850720644,
0.04797748476266861,
-0.0438203290104866,
-0.1532401293516159,
-0.08671712130308151,
-0.024648865684866905,
-0.2236001342535019,
0.08533021807670593,
-0.06946314871311188,
-0.13578248023986816,
0.019155733287334442,
0.013867083936929703,
-0.028145823627710342,
0.11776147037744522,
-0.07801362872123718,
-0.03346126526594162,
0.020983682945370674,
-0.039618294686079025,
-0.09754771739244461,
-0.09402462840080261,
-0.07874704152345657,
0.03500581532716751,
-0.04535633698105812,
0.025271590799093246,
-0.05421067774295807,
0.015182215720415115,
0.10334893316030502,
-0.04038224741816521,
-0.041323766112327576,
-0.0359976626932621,
-0.035855069756507874,
-0.11793428659439087,
0.025968458503484726,
0.044103916734457016,
-0.03597194701433182,
-0.05585090070962906,
0.17637495696544647,
-0.04257858544588089,
-0.01666315644979477,
-0.1211012676358223,
0.14332374930381775,
-0.04330325871706009,
0.03261799365282059,
-0.10366860777139664,
-0.08559805154800415,
-0.10071583092212677,
0.27439257502555847,
0.2784624397754669,
-0.14349330961704254,
-0.009759977459907532,
0.02939503826200962,
0.004204166121780872,
-0.14250165224075317,
0.14376720786094666,
0.01570971868932247,
-0.024460898712277412,
-0.027595078572630882,
0.026391539722681046,
-0.007621914613991976,
-0.0827714279294014,
-0.03114704228937626,
-0.05752136558294296,
-0.006779014132916927,
-0.05148708075284958,
-0.034257955849170685,
0.06298708915710449,
-0.12136059254407883,
-0.09091135859489441,
-0.05560125410556793,
-0.0083417734131217,
-0.03344108536839485,
-0.07473809272050858,
-0.019548200070858,
0.07662302255630493,
0.14781777560710907,
-0.05502733215689659,
0.06005467101931572,
-0.004367031157016754,
-0.04969286173582077,
-0.13970479369163513,
-0.13660922646522522,
0.05449144169688225,
-0.129489928483963,
0.26909253001213074,
-0.050524767488241196,
-0.05207161232829094,
0.041712693870067596,
-0.03221052139997482,
-0.05838879942893982,
0.020522039383649826,
0.009778409264981747,
-0.05078497156500816,
-0.029240628704428673,
0.09255361557006836,
-0.033305004239082336,
0.009149706922471523,
-0.022496739402413368,
-0.22135144472122192,
0.0034119023475795984,
-0.05107501149177551,
0.028507398441433907,
-0.12569822371006012,
0.06501629203557968,
-0.09348012506961823,
0.12403472512960434,
0.07595156878232956,
-0.01166640967130661,
-0.036088403314352036,
-0.04733064025640488,
0.1257045865058899,
0.08392459154129028,
-0.02910126931965351,
-0.0870935395359993,
-0.16758979856967926,
-0.004611360374838114,
-0.0011314527364447713,
-0.08687946200370789,
-0.23090760409832,
-0.008421163074672222,
-0.031696807593107224,
0.0109195401892066,
-0.00838692206889391,
0.12826944887638092,
0.14749252796173096,
0.05249129980802536,
0.016358694061636925,
-0.12719306349754333,
0.041898638010025024,
0.08496948331594467,
-0.15762199461460114,
-0.1707899123430252
] |
null | null | transformers |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# passive_invoices_v4.4
This model is a fine-tuned version of [microsoft/layoutlmv3-base](https://huggingface.co/microsoft/layoutlmv3-base) on the sroie dataset.
It achieves the following results on the evaluation set:
- Loss: 0.1011
- Precision: 0.8806
- Recall: 0.9064
- F1: 0.8933
- Accuracy: 0.9781
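
As a quick, illustrative example (not part of the original training setup), a document image could be labeled with this checkpoint via the standard Transformers LayoutLMv3 API; the image path is a placeholder and the processor's built-in OCR assumes `pytesseract` is available:

```python
import torch
from PIL import Image
from transformers import AutoProcessor, AutoModelForTokenClassification

# apply_ocr=True lets the processor extract words and bounding boxes itself (requires pytesseract)
processor = AutoProcessor.from_pretrained("microsoft/layoutlmv3-base", apply_ocr=True)
model = AutoModelForTokenClassification.from_pretrained("atatavana/passive_invoices_v4.4")

image = Image.open("invoice.png").convert("RGB")  # placeholder path
encoding = processor(image, return_tensors="pt", truncation=True)

with torch.no_grad():
    logits = model(**encoding).logits

predicted_ids = logits.argmax(-1).squeeze().tolist()
labels = [model.config.id2label[i] for i in predicted_ids]
tokens = processor.tokenizer.convert_ids_to_tokens(encoding["input_ids"].squeeze().tolist())
print(list(zip(tokens, labels)))
```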
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (an illustrative `TrainingArguments` equivalent is sketched after the list):
- learning_rate: 1e-05
- train_batch_size: 2
- eval_batch_size: 2
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- training_steps: 17000
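
For reference, a roughly equivalent configuration with the Hugging Face `Trainer` API is sketched below; the optimizer settings are the Transformers defaults, `max_steps` corresponds to `training_steps`, and the 500-step evaluation cadence mirrors the results table.

```python
from transformers import TrainingArguments

# Mirrors the hyperparameters listed above; Adam betas (0.9, 0.999) and epsilon 1e-8 are the defaults.
training_args = TrainingArguments(
    output_dir="passive_invoices_v4.4",
    learning_rate=1e-5,
    per_device_train_batch_size=2,
    per_device_eval_batch_size=2,
    seed=42,
    lr_scheduler_type="linear",
    max_steps=17_000,            # training_steps: 17000
    evaluation_strategy="steps",
    eval_steps=500,              # metrics in the table are reported every 500 steps
)
```

The resulting `training_args` would then be passed to a `Trainer` together with the LayoutLMv3 model and a preprocessed sroie dataset.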
### Training results
| Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1 | Accuracy |
|:-------------:|:-----:|:-----:|:---------------:|:---------:|:------:|:------:|:--------:|
| 1.1039 | 0.28 | 500 | 0.8842 | 0.1507 | 0.0671 | 0.0929 | 0.7802 |
| 0.6047 | 0.55 | 1000 | 0.5921 | 0.3710 | 0.3162 | 0.3414 | 0.8441 |
| 0.4058 | 0.83 | 1500 | 0.4163 | 0.5222 | 0.5050 | 0.5134 | 0.8857 |
| 0.2917 | 1.11 | 2000 | 0.2980 | 0.6188 | 0.5958 | 0.6071 | 0.9174 |
| 0.2093 | 1.39 | 2500 | 0.2464 | 0.6655 | 0.6697 | 0.6676 | 0.9307 |
| 0.1802 | 1.66 | 3000 | 0.2127 | 0.7081 | 0.7092 | 0.7086 | 0.9383 |
| 0.1496 | 1.94 | 3500 | 0.2034 | 0.7044 | 0.7563 | 0.7294 | 0.9446 |
| 0.1223 | 2.22 | 4000 | 0.1817 | 0.7263 | 0.7635 | 0.7445 | 0.9493 |
| 0.107 | 2.49 | 4500 | 0.1603 | 0.7620 | 0.7874 | 0.7745 | 0.9564 |
| 0.0985 | 2.77 | 5000 | 0.1521 | 0.7659 | 0.8042 | 0.7846 | 0.9586 |
| 0.0929 | 3.05 | 5500 | 0.1491 | 0.7819 | 0.8181 | 0.7996 | 0.9611 |
| 0.0687 | 3.33 | 6000 | 0.1337 | 0.7980 | 0.8242 | 0.8109 | 0.9648 |
| 0.0721 | 3.6 | 6500 | 0.1244 | 0.8182 | 0.8529 | 0.8352 | 0.9680 |
| 0.0734 | 3.88 | 7000 | 0.1206 | 0.8274 | 0.8591 | 0.8430 | 0.9694 |
| 0.0627 | 4.16 | 7500 | 0.1197 | 0.8321 | 0.8640 | 0.8478 | 0.9693 |
| 0.0557 | 4.43 | 8000 | 0.1188 | 0.8412 | 0.8677 | 0.8542 | 0.9704 |
| 0.0557 | 4.71 | 8500 | 0.1073 | 0.8541 | 0.8728 | 0.8634 | 0.9736 |
| 0.0554 | 4.99 | 9000 | 0.1195 | 0.8395 | 0.8793 | 0.8589 | 0.9706 |
| 0.0431 | 5.27 | 9500 | 0.1114 | 0.8571 | 0.8793 | 0.8680 | 0.9739 |
| 0.0441 | 5.54 | 10000 | 0.1153 | 0.8588 | 0.8733 | 0.8660 | 0.9728 |
| 0.0526 | 5.82 | 10500 | 0.1077 | 0.8556 | 0.8808 | 0.8680 | 0.9735 |
| 0.0401 | 6.1 | 11000 | 0.1076 | 0.8651 | 0.8873 | 0.8761 | 0.9747 |
| 0.0373 | 6.37 | 11500 | 0.1048 | 0.8586 | 0.8889 | 0.8735 | 0.9746 |
| 0.0412 | 6.65 | 12000 | 0.0997 | 0.8697 | 0.8945 | 0.8819 | 0.9763 |
| 0.0321 | 6.93 | 12500 | 0.1081 | 0.8708 | 0.8988 | 0.8846 | 0.9757 |
| 0.0302 | 7.21 | 13000 | 0.1029 | 0.8712 | 0.8948 | 0.8828 | 0.9761 |
| 0.0324 | 7.48 | 13500 | 0.1053 | 0.8728 | 0.8987 | 0.8856 | 0.9762 |
| 0.0373 | 7.76 | 14000 | 0.1036 | 0.8732 | 0.9020 | 0.8873 | 0.9763 |
| 0.028 | 8.04 | 14500 | 0.1024 | 0.8783 | 0.9037 | 0.8908 | 0.9780 |
| 0.0272 | 8.31 | 15000 | 0.1043 | 0.8733 | 0.9052 | 0.8889 | 0.9774 |
| 0.0281 | 8.59 | 15500 | 0.1007 | 0.8767 | 0.9057 | 0.8910 | 0.9780 |
| 0.0317 | 8.87 | 16000 | 0.0987 | 0.8817 | 0.9076 | 0.8945 | 0.9786 |
| 0.0266 | 9.15 | 16500 | 0.1000 | 0.8808 | 0.9066 | 0.8935 | 0.9783 |
| 0.0256 | 9.42 | 17000 | 0.1011 | 0.8806 | 0.9064 | 0.8933 | 0.9781 |
### Framework versions
- Transformers 4.28.0
- Pytorch 2.1.0+cu121
- Datasets 2.2.2
- Tokenizers 0.13.3
| {"license": "cc-by-nc-sa-4.0", "tags": ["generated_from_trainer"], "datasets": ["sroie"], "metrics": ["precision", "recall", "f1", "accuracy"], "model-index": [{"name": "passive_invoices_v4.4", "results": [{"task": {"type": "token-classification", "name": "Token Classification"}, "dataset": {"name": "sroie", "type": "sroie", "config": "discharge", "split": "test", "args": "discharge"}, "metrics": [{"type": "precision", "value": 0.8805877909771338, "name": "Precision"}, {"type": "recall", "value": 0.90635380592268, "name": "Recall"}, {"type": "f1", "value": 0.8932850376149345, "name": "F1"}, {"type": "accuracy", "value": 0.9780829089524534, "name": "Accuracy"}]}]}]} | token-classification | atatavana/passive_invoices_v4.4 | [
"transformers",
"pytorch",
"tensorboard",
"layoutlmv3",
"token-classification",
"generated_from_trainer",
"dataset:sroie",
"license:cc-by-nc-sa-4.0",
"model-index",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | 2024-02-14T15:31:02+00:00 | [] | [] | TAGS
#transformers #pytorch #tensorboard #layoutlmv3 #token-classification #generated_from_trainer #dataset-sroie #license-cc-by-nc-sa-4.0 #model-index #autotrain_compatible #endpoints_compatible #region-us
| passive\_invoices\_v4.4
=======================
This model is a fine-tuned version of microsoft/layoutlmv3-base on the sroie dataset.
It achieves the following results on the evaluation set:
* Loss: 0.1011
* Precision: 0.8806
* Recall: 0.9064
* F1: 0.8933
* Accuracy: 0.9781
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 1e-05
* train\_batch\_size: 2
* eval\_batch\_size: 2
* seed: 42
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* training\_steps: 17000
### Training results
### Framework versions
* Transformers 4.28.0
* Pytorch 2.1.0+cu121
* Datasets 2.2.2
* Tokenizers 0.13.3
| [
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 1e-05\n* train\\_batch\\_size: 2\n* eval\\_batch\\_size: 2\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* training\\_steps: 17000",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.28.0\n* Pytorch 2.1.0+cu121\n* Datasets 2.2.2\n* Tokenizers 0.13.3"
] | [
"TAGS\n#transformers #pytorch #tensorboard #layoutlmv3 #token-classification #generated_from_trainer #dataset-sroie #license-cc-by-nc-sa-4.0 #model-index #autotrain_compatible #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 1e-05\n* train\\_batch\\_size: 2\n* eval\\_batch\\_size: 2\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* training\\_steps: 17000",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.28.0\n* Pytorch 2.1.0+cu121\n* Datasets 2.2.2\n* Tokenizers 0.13.3"
] | [
76,
98,
4,
35
] | [
"passage: TAGS\n#transformers #pytorch #tensorboard #layoutlmv3 #token-classification #generated_from_trainer #dataset-sroie #license-cc-by-nc-sa-4.0 #model-index #autotrain_compatible #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 1e-05\n* train\\_batch\\_size: 2\n* eval\\_batch\\_size: 2\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* training\\_steps: 17000### Training results### Framework versions\n\n\n* Transformers 4.28.0\n* Pytorch 2.1.0+cu121\n* Datasets 2.2.2\n* Tokenizers 0.13.3"
] | [
-0.11472612619400024,
0.10516137629747391,
-0.002045717788860202,
0.12537530064582825,
0.1558455377817154,
0.020754504948854446,
0.12298985570669174,
0.1307992935180664,
-0.06152460351586342,
0.026204155758023262,
0.13548597693443298,
0.14298225939273834,
0.027670303359627724,
0.15693378448486328,
-0.049888744950294495,
-0.25685322284698486,
-0.010953780263662338,
0.04692748934030533,
-0.05106453597545624,
0.13448268175125122,
0.09876840561628342,
-0.12783272564411163,
0.09653960168361664,
0.010442662052810192,
-0.20331355929374695,
-0.015105859376490116,
0.027878625318408012,
-0.05009457841515541,
0.146821066737175,
0.02900220826268196,
0.12337390333414078,
0.020900575444102287,
0.099802665412426,
-0.15579450130462646,
0.01090315356850624,
0.046813782304525375,
0.008267591707408428,
0.10334978252649307,
0.039833664894104004,
0.013457116670906544,
0.060841310769319534,
-0.07705759257078171,
0.05573158711194992,
0.01046417374163866,
-0.13163362443447113,
-0.2216448187828064,
-0.09171168506145477,
0.05262606590986252,
0.0936422273516655,
0.08205964416265488,
0.0012844522716477513,
0.150767982006073,
-0.05668679624795914,
0.08305242657661438,
0.169096440076828,
-0.2937656342983246,
-0.0699220523238182,
0.06810304522514343,
0.024940213188529015,
0.053099796175956726,
-0.10397697240114212,
-0.02000340260565281,
0.035726502537727356,
0.04000211879611015,
0.145084947347641,
-0.0309114009141922,
-0.025359366089105606,
0.015152690932154655,
-0.1323980689048767,
-0.041536398231983185,
0.1540806144475937,
0.047922614961862564,
-0.04004359245300293,
-0.06059476360678673,
-0.04581225663423538,
-0.12982194125652313,
-0.03751741349697113,
-0.003083195071667433,
0.038871973752975464,
-0.02555505372583866,
-0.10531184077262878,
-0.027477476745843887,
-0.10925082117319107,
-0.06450555473566055,
-0.05832303687930107,
0.11182891577482224,
0.01270776242017746,
0.01471524778753519,
-0.017305489629507065,
0.11301631480455399,
-0.0033682426437735558,
-0.1301174759864807,
0.031043056398630142,
0.020808126777410507,
-0.031310852617025375,
-0.0643778145313263,
-0.04184027388691902,
-0.05423774570226669,
-0.014031881466507912,
0.11433959007263184,
-0.02592453546822071,
0.028655529022216797,
0.02266087383031845,
0.04971560835838318,
-0.0982552170753479,
0.19394488632678986,
-0.046935662627220154,
-0.0275578610599041,
-0.0005639510345645249,
0.08558106422424316,
0.023239342495799065,
-0.012930799275636673,
-0.14897996187210083,
0.005902503617107868,
0.08760355412960052,
0.009174436330795288,
-0.045862454921007156,
0.060249052941799164,
-0.06565365195274353,
-0.040584858506917953,
0.058258961886167526,
-0.07640356570482254,
0.03194462135434151,
-0.016498452052474022,
-0.07729197293519974,
-0.0535440593957901,
0.002579262014478445,
0.026772083714604378,
0.009687887504696846,
0.11461113393306732,
-0.10417542606592178,
0.02598605304956436,
-0.08912680298089981,
-0.10438825190067291,
0.01648086868226528,
-0.09551196545362473,
0.01980312168598175,
-0.09578091651201248,
-0.19048888981342316,
-0.014223574660718441,
0.06583280116319656,
-0.04010872170329094,
-0.0719740241765976,
-0.04172276705503464,
-0.0638536661863327,
0.011923045851290226,
-0.015374135226011276,
0.12657834589481354,
-0.061558522284030914,
0.10145286470651627,
0.01509050838649273,
0.05867299437522888,
-0.04460155963897705,
0.04238652437925339,
-0.09601376205682755,
0.03348855674266815,
-0.15477973222732544,
0.038221053779125214,
-0.03369814530014992,
0.07041912525892258,
-0.11167622357606888,
-0.08908668905496597,
0.01756971888244152,
-0.012233960442245007,
0.062439609318971634,
0.09020037204027176,
-0.18390263617038727,
-0.06687523424625397,
0.14899088442325592,
-0.05900394544005394,
-0.1279173493385315,
0.12915053963661194,
-0.06661659479141235,
0.06018075346946716,
0.057326383888721466,
0.18021424114704132,
0.0771457850933075,
-0.09146387130022049,
0.020007716491818428,
0.012736883945763111,
0.055783744901418686,
-0.08234769105911255,
0.09724763035774231,
-0.0011223368346691132,
0.037132296711206436,
0.007089321501553059,
-0.05726972594857216,
0.05874404311180115,
-0.08054812997579575,
-0.09184709936380386,
-0.015928667038679123,
-0.08881986886262894,
0.05750558152794838,
0.06657421588897705,
0.06935598701238632,
-0.0869150385260582,
-0.09005746245384216,
0.06221138685941696,
0.0839172750711441,
-0.04641876742243767,
0.023820267990231514,
-0.08458410203456879,
0.07467078417539597,
-0.07431945949792862,
-0.02960790880024433,
-0.15763448178768158,
-0.03749604895710945,
0.00974937155842781,
0.03104179911315441,
0.011821513064205647,
0.02194800227880478,
0.06626692414283752,
0.05513345077633858,
-0.0654791072010994,
-0.02186022326350212,
-0.020408524200320244,
0.0012356913648545742,
-0.13007397949695587,
-0.18888762593269348,
-0.056985531002283096,
-0.03090202435851097,
0.16065596044063568,
-0.21758772432804108,
0.03512854501605034,
0.004163525532931089,
0.09713265299797058,
0.03995530679821968,
-0.022068724036216736,
-0.03279528766870499,
0.06694307178258896,
-0.03614724799990654,
-0.05969259515404701,
0.07880034297704697,
0.02040976658463478,
-0.11616497486829758,
-0.01422989834100008,
-0.11694139987230301,
0.1585492044687271,
0.12237538397312164,
-0.07599838078022003,
-0.07343191653490067,
-0.031268712133169174,
-0.04542938992381096,
-0.027820656076073647,
-0.04766308516263962,
0.009334757924079895,
0.14590150117874146,
0.013485674746334553,
0.16500549018383026,
-0.07176868617534637,
-0.04872850328683853,
0.028414467349648476,
-0.03195551410317421,
0.005215295124799013,
0.11234652251005173,
0.09320589900016785,
-0.12003657221794128,
0.15327687561511993,
0.15882334113121033,
-0.061700545251369476,
0.13453364372253418,
-0.036229752004146576,
-0.06446968764066696,
-0.047405969351530075,
-0.017506498843431473,
0.014692283235490322,
0.13129250705242157,
-0.09000927954912186,
-0.008454713970422745,
0.026108408346772194,
0.020713601261377335,
0.0005163585301488638,
-0.22627794742584229,
-0.045980505645275116,
0.039823826402425766,
-0.04046343266963959,
-0.024629896506667137,
-0.01358089130371809,
-0.009547563269734383,
0.09502624720335007,
0.02865665964782238,
-0.08976190537214279,
0.055496424436569214,
0.0034121142234653234,
-0.07650274783372879,
0.19746612012386322,
-0.06676819175481796,
-0.15803249180316925,
-0.14849591255187988,
-0.08461078256368637,
-0.037402037531137466,
0.0215971190482378,
0.033153340220451355,
-0.061148788779973984,
-0.021275650709867477,
-0.07832907140254974,
-0.022374803200364113,
-0.01905364915728569,
0.018978603184223175,
0.010818849317729473,
-0.0011281741317361593,
0.06875070929527283,
-0.07649371773004532,
-0.010422416031360626,
-0.03664155676960945,
-0.026610245928168297,
0.04132654517889023,
0.01490049809217453,
0.10980628430843353,
0.14578396081924438,
-0.014674809761345387,
0.01347065158188343,
-0.045828867703676224,
0.21761296689510345,
-0.08496580272912979,
-0.01816580817103386,
0.1478658765554428,
-0.026532264426350594,
0.057969819754362106,
0.13977356255054474,
0.07063709199428558,
-0.0823846161365509,
-0.0021766352001577616,
0.013988482765853405,
-0.046991392970085144,
-0.18877995014190674,
-0.038943029940128326,
-0.05867040157318115,
0.0009831913048401475,
0.10396828502416611,
0.019298041239380836,
0.014794344082474709,
0.0696306899189949,
0.03584863618016243,
0.08180775493383408,
-0.03646697476506233,
0.07693919539451599,
0.10429113358259201,
0.044940922409296036,
0.13649922609329224,
-0.03587767854332924,
-0.05096203833818436,
0.03875664994120598,
0.038386791944503784,
0.2078830748796463,
0.012431949377059937,
0.16212089359760284,
0.038409579545259476,
0.1619643270969391,
0.009541087783873081,
0.0457100011408329,
0.006992502138018608,
-0.03956134617328644,
-0.02071247063577175,
-0.028305480256676674,
-0.03083072230219841,
0.03555063158273697,
-0.017483271658420563,
0.03971176967024803,
-0.10198524594306946,
0.010614311322569847,
0.04567134380340576,
0.2416987419128418,
0.059448227286338806,
-0.34965386986732483,
-0.10358171164989471,
0.012196890078485012,
-0.025711428374052048,
-0.026667291298508644,
0.005714649334549904,
0.11486611515283585,
-0.10006319731473923,
0.026220863685011864,
-0.09015514701604843,
0.08894577622413635,
-0.0648791491985321,
0.03790232911705971,
0.08270230889320374,
0.07090513408184052,
-0.0011485021095722914,
0.0734802782535553,
-0.26267728209495544,
0.29697737097740173,
0.018307818099856377,
0.05568605288863182,
-0.06735441833734512,
-0.0038133899215608835,
0.026822390034794807,
0.07825938612222672,
0.09772732853889465,
-0.007404554635286331,
-0.05424680933356285,
-0.21762165427207947,
-0.057837069034576416,
0.006188353057950735,
0.06911562383174896,
-0.055989060550928116,
0.09336792677640915,
-0.037807315587997437,
0.003255600342527032,
0.06144636124372482,
0.011879445053637028,
-0.013974065892398357,
-0.09602383524179459,
0.008687326684594154,
0.026101529598236084,
-0.04496005177497864,
-0.06746172904968262,
-0.10689257830381393,
-0.0966370701789856,
0.1496216356754303,
-0.02548336423933506,
-0.0368550643324852,
-0.11631996929645538,
0.08682914823293686,
0.06908160448074341,
-0.08871719986200333,
0.027552708983421326,
0.00021586113143712282,
0.09934038668870926,
0.01683344691991806,
-0.04054837301373482,
0.1145118921995163,
-0.06810085475444794,
-0.16157281398773193,
-0.07373137772083282,
0.11773896962404251,
0.009414366446435452,
0.07178940623998642,
0.005757794715464115,
0.029901601374149323,
-0.03324967250227928,
-0.06381680816411972,
0.05277783423662186,
-0.026443125680088997,
0.07355418801307678,
-0.007114982232451439,
-0.028183715417981148,
0.028850873932242393,
-0.0552448146045208,
-0.04025006666779518,
0.17567813396453857,
0.2710682451725006,
-0.10551856458187103,
0.0278130155056715,
0.02411753125488758,
-0.060402072966098785,
-0.19274042546749115,
0.04969705268740654,
0.05085204914212227,
0.022104820236563683,
0.057051993906497955,
-0.16307322680950165,
0.075068399310112,
0.09168817102909088,
-0.031066257506608963,
0.0836174488067627,
-0.29934972524642944,
-0.12491613626480103,
0.08451323956251144,
0.12131001800298691,
0.09747970849275589,
-0.12719297409057617,
-0.03457731753587723,
-0.015605795197188854,
-0.11679776012897491,
0.1099676787853241,
-0.06160421669483185,
0.11328922212123871,
-0.013371736742556095,
0.08819445967674255,
0.011886189691722393,
-0.05769561231136322,
0.12969578802585602,
0.010289225727319717,
0.09037033468484879,
-0.05252480134367943,
-0.04638269171118736,
0.06014837697148323,
-0.051190778613090515,
0.004750795196741819,
-0.06770776957273483,
0.020543761551380157,
-0.1257573813199997,
-0.021663110703229904,
-0.07401712238788605,
0.019100116565823555,
-0.02899588830769062,
-0.06861474364995956,
-0.029074957594275475,
0.0601339116692543,
0.04708969220519066,
-0.015760034322738647,
0.15466640889644623,
0.011762778274714947,
0.13752709329128265,
0.12854905426502228,
0.0880158469080925,
-0.0569058395922184,
-0.0629602000117302,
-0.02175212651491165,
-0.03166807442903519,
0.052764635533094406,
-0.15010854601860046,
0.030579425394535065,
0.1310463547706604,
0.021422626450657845,
0.14338430762290955,
0.0703158900141716,
-0.02712949551641941,
0.02180924080312252,
0.06630033999681473,
-0.14789432287216187,
-0.09908383339643478,
-0.010032404214143753,
-0.03245871886610985,
-0.13444872200489044,
0.027807481586933136,
0.11982465535402298,
-0.06023254990577698,
-0.009398371912539005,
0.004246351309120655,
0.0037684838753193617,
-0.04635796323418617,
0.17963409423828125,
0.05680130422115326,
0.05668886750936508,
-0.08744322508573532,
0.0563226044178009,
0.06415246427059174,
-0.07278039306402206,
-0.0025401825550943613,
0.040701333433389664,
-0.09954972565174103,
-0.042307961732149124,
0.011872051283717155,
0.14346720278263092,
-0.08009056001901627,
-0.030395133420825005,
-0.14608153700828552,
-0.10180097818374634,
0.06230519711971283,
0.13222557306289673,
0.10361135751008987,
0.005459437146782875,
-0.04871312901377678,
-0.00043025120976381004,
-0.11530661582946777,
0.0991584062576294,
0.04278646036982536,
0.07517554610967636,
-0.15506593883037567,
0.16000793874263763,
-0.015140472911298275,
0.05030161514878273,
-0.019378263503313065,
0.03169766068458557,
-0.10130403935909271,
0.015584420412778854,
-0.10545826703310013,
-0.02067127265036106,
-0.03822823986411095,
-0.0009843348525464535,
-0.003923279233276844,
-0.05767333507537842,
-0.045538466423749924,
0.005064159166067839,
-0.11512971669435501,
-0.02032969705760479,
0.03835105150938034,
0.05218487232923508,
-0.09771653264760971,
-0.04481515660881996,
0.02187638357281685,
-0.05821474269032478,
0.07427500933408737,
0.00822126492857933,
0.02939869463443756,
0.031441982835531235,
-0.09120754152536392,
0.017266923561692238,
0.04013785347342491,
0.013854130171239376,
0.07148049026727676,
-0.09417981654405594,
-0.006970478221774101,
-0.01320091262459755,
0.034177277237176895,
0.030607474967837334,
0.08454694598913193,
-0.12905676662921906,
0.003196069737896323,
-0.005003135651350021,
-0.06731444597244263,
-0.06739247590303421,
0.05360542982816696,
0.0722476989030838,
0.044049013406038284,
0.2041216343641281,
-0.07152897864580154,
0.036856718361377716,
-0.20308251678943634,
-0.0026317937299609184,
-0.012741505168378353,
-0.10961756110191345,
-0.10452321916818619,
-0.07079573720693588,
0.06039412319660187,
-0.06194465979933739,
0.11452346295118332,
0.03972570598125458,
0.06065872684121132,
0.03997763618826866,
-0.0142905805259943,
0.03774024546146393,
0.014919099397957325,
0.1802697628736496,
0.029218940064311028,
-0.037109557539224625,
0.07250221818685532,
0.04175128787755966,
0.08292970806360245,
0.12504255771636963,
0.16881336271762848,
0.14093510806560516,
0.011873152107000351,
0.08079523593187332,
0.04332425445318222,
-0.04123745113611221,
-0.19664525985717773,
0.024009056389331818,
-0.04522927477955818,
0.10507268458604813,
-0.02642802707850933,
0.19637133181095123,
0.07845967262983322,
-0.1847168654203415,
0.025040345266461372,
-0.06209966167807579,
-0.08372186869382858,
-0.0974922701716423,
-0.10141582041978836,
-0.08313200622797012,
-0.11820151656866074,
0.002321413718163967,
-0.098109170794487,
0.01237744465470314,
0.14330770075321198,
-0.0032638683915138245,
-0.0153727438300848,
0.12574975192546844,
-0.00365554285235703,
0.02705415152013302,
0.05625314638018608,
0.010500537231564522,
-0.013958417810499668,
-0.09453894197940826,
-0.06553180515766144,
-0.014753162860870361,
-0.03343171253800392,
0.033351507037878036,
-0.07343065738677979,
-0.019606152549386024,
0.021249819546937943,
-0.013128791004419327,
-0.1089552491903305,
0.0057227835059165955,
0.02482595667243004,
0.06442037224769592,
0.041393015533685684,
0.010703419335186481,
0.026613403111696243,
-0.0151672950014472,
0.22865422070026398,
-0.07790245860815048,
-0.04550120234489441,
-0.11807490140199661,
0.24885962903499603,
0.007981895469129086,
-0.025007789954543114,
0.02690955065190792,
-0.0726422518491745,
0.02717616967856884,
0.22794437408447266,
0.1944398432970047,
-0.11200219392776489,
-0.0067359753884375095,
0.010743549093604088,
-0.009633312933146954,
-0.02469033934175968,
0.1148776113986969,
0.08741731941699982,
0.01161153893917799,
-0.09554441273212433,
-0.04503680393099785,
-0.06430614739656448,
-0.01870362088084221,
-0.019978154450654984,
0.05877803638577461,
0.03316816687583923,
0.017371241003274918,
-0.05129062011837959,
0.06607004255056381,
-0.04436109587550163,
-0.10360332578420639,
0.07123740017414093,
-0.21520459651947021,
-0.16222110390663147,
-0.011314271949231625,
0.08092325925827026,
-0.0014275184366852045,
0.0650225356221199,
-0.03492607921361923,
0.02173726074397564,
0.059060338884592056,
-0.020174458622932434,
-0.06883098185062408,
-0.07093590497970581,
0.10533315688371658,
-0.0873795598745346,
0.21160127222537994,
-0.05654818192124367,
0.06269479542970657,
0.12624287605285645,
0.055858105421066284,
-0.08417236059904099,
0.05018939450383186,
0.05499890446662903,
-0.03633236512541771,
0.03536035493016243,
0.09713154286146164,
-0.03568955883383751,
0.11393094807863235,
0.051774702966213226,
-0.13172701001167297,
0.015730559825897217,
-0.08781632781028748,
-0.05287845432758331,
-0.04876963421702385,
-0.03478880599141121,
-0.047942839562892914,
0.1501244604587555,
0.2043846696615219,
-0.03530283272266388,
-0.017308302223682404,
-0.05855875834822655,
0.004611734300851822,
0.07701636850833893,
0.02405489794909954,
-0.07569844275712967,
-0.20755450427532196,
0.0012407884933054447,
0.04859388992190361,
-0.015064350329339504,
-0.23776915669441223,
-0.09620483964681625,
0.0023821480572223663,
-0.06617759168148041,
-0.0740196630358696,
0.10215537250041962,
0.07901430875062943,
0.049086518585681915,
-0.0652848333120346,
-0.032977212220430374,
-0.06421712785959244,
0.1299734264612198,
-0.14074090123176575,
-0.086308553814888
] |
null | null | null |
# PPO Agent Playing LunarLander-v2
This is a trained model of a PPO agent playing LunarLander-v2.
# Hyperparameters
```python
{'exp_name': 'ppo'
'seed': 1
'torch_deterministic': True
'cuda': True
'track': False
'wandb_project_name': 'cleanRL'
'wandb_entity': None
'capture_video': False
'env_id': 'LunarLander-v2'
'total_timesteps': 500000
'learning_rate': 0.00025
'num_envs': 4
'num_steps': 128
'anneal_lr': True
'gae': True
'gamma': 0.99
'gae_lambda': 0.95
'num_minibatches': 4
'update_epochs': 4
'norm_adv': True
'clip_coef': 0.2
'clip_vloss': True
'ent_coef': 0.01
'vf_coef': 0.5
'max_grad_norm': 0.5
'target_kl': None
'repo_id': 'Overgrown7380/ppo-implement-LunarLander-v2'
'batch_size': 512
'minibatch_size': 128}
```
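
In this CleanRL-style implementation, `batch_size` and `minibatch_size` are derived from the rollout settings rather than set independently; the short sketch below reproduces that arithmetic:

```python
num_envs = 4
num_steps = 128
num_minibatches = 4

batch_size = int(num_envs * num_steps)                # 4 * 128 = 512
minibatch_size = int(batch_size // num_minibatches)   # 512 // 4 = 128

assert batch_size == 512 and minibatch_size == 128    # matches the values reported above
```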
| {"tags": ["LunarLander-v2", "ppo", "deep-reinforcement-learning", "reinforcement-learning", "custom-implementation", "deep-rl-course"], "model-index": [{"name": "PPO", "results": [{"task": {"type": "reinforcement-learning", "name": "reinforcement-learning"}, "dataset": {"name": "LunarLander-v2", "type": "LunarLander-v2"}, "metrics": [{"type": "mean_reward", "value": "-46.67 +/- 22.05", "name": "mean_reward", "verified": false}]}]}]} | reinforcement-learning | Overgrown7380/ppo-implement-LunarLander-v2 | [
"tensorboard",
"LunarLander-v2",
"ppo",
"deep-reinforcement-learning",
"reinforcement-learning",
"custom-implementation",
"deep-rl-course",
"model-index",
"region:us"
] | 2024-02-14T15:37:04+00:00 | [] | [] | TAGS
#tensorboard #LunarLander-v2 #ppo #deep-reinforcement-learning #reinforcement-learning #custom-implementation #deep-rl-course #model-index #region-us
|
# PPO Agent Playing LunarLander-v2
This is a trained model of a PPO agent playing LunarLander-v2.
# Hyperparameters
| [
"# PPO Agent Playing LunarLander-v2\n\n This is a trained model of a PPO agent playing LunarLander-v2.\n\n # Hyperparameters"
] | [
"TAGS\n#tensorboard #LunarLander-v2 #ppo #deep-reinforcement-learning #reinforcement-learning #custom-implementation #deep-rl-course #model-index #region-us \n",
"# PPO Agent Playing LunarLander-v2\n\n This is a trained model of a PPO agent playing LunarLander-v2.\n\n # Hyperparameters"
] | [
51,
37
] | [
"passage: TAGS\n#tensorboard #LunarLander-v2 #ppo #deep-reinforcement-learning #reinforcement-learning #custom-implementation #deep-rl-course #model-index #region-us \n# PPO Agent Playing LunarLander-v2\n\n This is a trained model of a PPO agent playing LunarLander-v2.\n\n # Hyperparameters"
] | [
0.07948226481676102,
-0.021824665367603302,
-0.005334289278835058,
0.07425090670585632,
0.11451162397861481,
-0.051334477961063385,
0.11827225238084793,
0.05111894756555557,
0.0632978081703186,
0.08233953267335892,
0.09910695254802704,
0.11526558548212051,
0.02103434130549431,
0.12346389144659042,
0.10133372992277145,
-0.26653239130973816,
0.0048308540135622025,
-0.042133692651987076,
0.020121442154049873,
0.07062754780054092,
-0.028985055163502693,
-0.12164036184549332,
0.02042403817176819,
-0.008055811747908592,
0.04164125770330429,
0.03685355558991432,
-0.020250989124178886,
-0.07061084359884262,
0.1035412922501564,
-0.04342407360672951,
0.07646117359399796,
0.04053044691681862,
0.12915800511837006,
-0.11266650259494781,
0.03731851652264595,
0.047094929963350296,
-0.058420803397893906,
0.040810972452163696,
0.023221731185913086,
0.07433853298425674,
0.15582501888275146,
0.0008022422553040087,
0.10807766020298004,
-0.019928930327296257,
-0.15859591960906982,
-0.0564296655356884,
0.04013175517320633,
0.10688508301973343,
0.041339244693517685,
0.05763867497444153,
0.01518392562866211,
0.24210692942142487,
-0.07300914824008942,
0.0014766358071938157,
0.1963091939687729,
-0.2750851511955261,
-0.056198850274086,
0.2650637924671173,
0.08425293117761612,
0.09438422322273254,
-0.09869689494371414,
-0.0236953292042017,
0.007850034162402153,
0.013983802869915962,
-0.038732558488845825,
-0.07621388882398605,
0.1343805193901062,
0.06358266621828079,
-0.07906194031238556,
-0.05448254942893982,
0.09211132675409317,
0.015635671094059944,
0.03398676961660385,
0.0008897133520804346,
-0.015260354615747929,
0.03964465111494064,
-0.008004734292626381,
-0.08323223143815994,
0.067534439265728,
0.017411211505532265,
-0.059903185814619064,
-0.11101946979761124,
-0.11182308942079544,
-0.028280947357416153,
-0.08438915759325027,
0.16840966045856476,
-0.023494480177760124,
0.07285201549530029,
-0.06215810775756836,
0.06860414892435074,
-0.037912189960479736,
0.004227026831358671,
0.006380763836205006,
-0.049948662519454956,
-0.04539962485432625,
-0.025878654792904854,
0.006328459829092026,
0.011017742566764355,
0.11213880032300949,
-0.002449487103149295,
0.0508684441447258,
0.04856472462415695,
… (remaining 666 values of the 768-dimensional embedding vector omitted)
] |
null | null | transformers |
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
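The card itself does not yet include a snippet. Judging only from the repository name (`mtc/meta-llama-Llama-2-7b-hf-arxiv-summarization-10k-last-lora-full-adapter`), this appears to be a LoRA adapter trained on top of `meta-llama/Llama-2-7b-hf` for arXiv summarization; the base-model name, prompt format, and generation settings in the sketch below are assumptions rather than details confirmed by this card.

```python
# Minimal sketch, assuming the repo holds a PEFT/LoRA adapter for meta-llama/Llama-2-7b-hf.
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base_id = "meta-llama/Llama-2-7b-hf"  # assumed base model, inferred from the repo name
adapter_id = "mtc/meta-llama-Llama-2-7b-hf-arxiv-summarization-10k-last-lora-full-adapter"

tokenizer = AutoTokenizer.from_pretrained(base_id)
base_model = AutoModelForCausalLM.from_pretrained(base_id, device_map="auto")
model = PeftModel.from_pretrained(base_model, adapter_id)  # attach the LoRA weights

# Hypothetical prompt format; the actual template used during fine-tuning is not documented here.
prompt = "Summarize the following paper:\n<paper text>\nSummary:"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

If the adapter has been merged into the base weights in a separate checkpoint, the same prompt can be used with a plain `AutoModelForCausalLM.from_pretrained` call on that merged repository instead.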
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed]
| {"library_name": "transformers", "tags": []} | null | mtc/meta-llama-Llama-2-7b-hf-arxiv-summarization-10k-last-lora-full-adapter | [
"transformers",
"safetensors",
"arxiv:1910.09700",
"endpoints_compatible",
"region:us"
] | 2024-02-14T15:41:37+00:00 | [
"1910.09700"
] | [] | TAGS
#transformers #safetensors #arxiv-1910.09700 #endpoints_compatible #region-us
|
# Model Card for Model ID
## Model Details
### Model Description
This is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.
- Developed by:
- Funded by [optional]:
- Shared by [optional]:
- Model type:
- Language(s) (NLP):
- License:
- Finetuned from model [optional]:
### Model Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Downstream Use [optional]
### Out-of-Scope Use
## Bias, Risks, and Limitations
### Recommendations
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
## Training Details
### Training Data
### Training Procedure
#### Preprocessing [optional]
#### Training Hyperparameters
- Training regime:
#### Speeds, Sizes, Times [optional]
## Evaluation
### Testing Data, Factors & Metrics
#### Testing Data
#### Factors
#### Metrics
### Results
#### Summary
## Model Examination [optional]
## Environmental Impact
Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).
- Hardware Type:
- Hours used:
- Cloud Provider:
- Compute Region:
- Carbon Emitted:
## Technical Specifications [optional]
### Model Architecture and Objective
### Compute Infrastructure
#### Hardware
#### Software
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Model Card Authors [optional]
## Model Card Contact
| [
"# Model Card for Model ID",
"## Model Details",
"### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:",
"### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Downstream Use [optional]",
"### Out-of-Scope Use",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.",
"## How to Get Started with the Model\n\nUse the code below to get started with the model.",
"## Training Details",
"### Training Data",
"### Training Procedure",
"#### Preprocessing [optional]",
"#### Training Hyperparameters\n\n- Training regime:",
"#### Speeds, Sizes, Times [optional]",
"## Evaluation",
"### Testing Data, Factors & Metrics",
"#### Testing Data",
"#### Factors",
"#### Metrics",
"### Results",
"#### Summary",
"## Model Examination [optional]",
"## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:",
"## Technical Specifications [optional]",
"### Model Architecture and Objective",
"### Compute Infrastructure",
"#### Hardware",
"#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Model Card Authors [optional]",
"## Model Card Contact"
] | [
"TAGS\n#transformers #safetensors #arxiv-1910.09700 #endpoints_compatible #region-us \n",
"# Model Card for Model ID",
"## Model Details",
"### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:",
"### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Downstream Use [optional]",
"### Out-of-Scope Use",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.",
"## How to Get Started with the Model\n\nUse the code below to get started with the model.",
"## Training Details",
"### Training Data",
"### Training Procedure",
"#### Preprocessing [optional]",
"#### Training Hyperparameters\n\n- Training regime:",
"#### Speeds, Sizes, Times [optional]",
"## Evaluation",
"### Testing Data, Factors & Metrics",
"#### Testing Data",
"#### Factors",
"#### Metrics",
"### Results",
"#### Summary",
"## Model Examination [optional]",
"## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:",
"## Technical Specifications [optional]",
"### Model Architecture and Objective",
"### Compute Infrastructure",
"#### Hardware",
"#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Model Card Authors [optional]",
"## Model Card Contact"
] | [
31, 6, 3, 82, 28, 3, 4, 9, 9, 10, 42, 20, 3, 4, 5, 9, 11, 13, 3, 12, 5, 4, 5, 3, 4, 9, 53, 9, 8, 6, 3, 14, 8, 7, 9, 4
] | [
"passage: TAGS\n#transformers #safetensors #arxiv-1910.09700 #endpoints_compatible #region-us \n# Model Card for Model ID## Model Details### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Downstream Use [optional]### Out-of-Scope Use## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.## How to Get Started with the Model\n\nUse the code below to get started with the model.## Training Details### Training Data### Training Procedure#### Preprocessing [optional]#### Training Hyperparameters\n\n- Training regime:#### Speeds, Sizes, Times [optional]## Evaluation### Testing Data, Factors & Metrics#### Testing Data#### Factors#### Metrics### Results#### Summary## Model Examination [optional]## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:## Technical Specifications [optional]### Model Architecture and Objective### Compute Infrastructure#### Hardware#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Model Card Authors [optional]## Model Card Contact"
] | [
… (768-dimensional embedding vector; individual float values omitted)
] |
null | null | transformers |
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
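This card does not yet include a snippet. Since the repository (`mtc/meta-llama-Llama-2-7b-hf-arxiv-summarization-10k-last_merged`) is tagged for text generation and its name suggests a fully merged checkpoint, a minimal sketch with the Transformers text-generation pipeline might look like the following; the prompt format and generation settings are assumptions, not details confirmed by this card.

```python
# Minimal sketch; the prompt format and generation settings below are assumptions.
from transformers import pipeline

generator = pipeline(
    "text-generation",
    model="mtc/meta-llama-Llama-2-7b-hf-arxiv-summarization-10k-last_merged",
    device_map="auto",
)

# Hypothetical prompt format; the actual template used during fine-tuning is not documented here.
prompt = "Summarize the following paper:\n<paper text>\nSummary:"
result = generator(prompt, max_new_tokens=256, do_sample=False)
print(result[0]["generated_text"])
```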
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed]
| {"library_name": "transformers", "tags": []} | text-generation | mtc/meta-llama-Llama-2-7b-hf-arxiv-summarization-10k-last_merged | [
"transformers",
"safetensors",
"llama",
"text-generation",
"arxiv:1910.09700",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | 2024-02-14T15:41:39+00:00 | [
"1910.09700"
] | [] | TAGS
#transformers #safetensors #llama #text-generation #arxiv-1910.09700 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
# Model Card for Model ID
## Model Details
### Model Description
This is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.
- Developed by:
- Funded by [optional]:
- Shared by [optional]:
- Model type:
- Language(s) (NLP):
- License:
- Finetuned from model [optional]:
### Model Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Downstream Use [optional]
### Out-of-Scope Use
## Bias, Risks, and Limitations
### Recommendations
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
## Training Details
### Training Data
### Training Procedure
#### Preprocessing [optional]
#### Training Hyperparameters
- Training regime:
#### Speeds, Sizes, Times [optional]
## Evaluation
### Testing Data, Factors & Metrics
#### Testing Data
#### Factors
#### Metrics
### Results
#### Summary
## Model Examination [optional]
## Environmental Impact
Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).
- Hardware Type:
- Hours used:
- Cloud Provider:
- Compute Region:
- Carbon Emitted:
## Technical Specifications [optional]
### Model Architecture and Objective
### Compute Infrastructure
#### Hardware
#### Software
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Model Card Authors [optional]
## Model Card Contact
| [
"# Model Card for Model ID",
"## Model Details",
"### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:",
"### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Downstream Use [optional]",
"### Out-of-Scope Use",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.",
"## How to Get Started with the Model\n\nUse the code below to get started with the model.",
"## Training Details",
"### Training Data",
"### Training Procedure",
"#### Preprocessing [optional]",
"#### Training Hyperparameters\n\n- Training regime:",
"#### Speeds, Sizes, Times [optional]",
"## Evaluation",
"### Testing Data, Factors & Metrics",
"#### Testing Data",
"#### Factors",
"#### Metrics",
"### Results",
"#### Summary",
"## Model Examination [optional]",
"## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:",
"## Technical Specifications [optional]",
"### Model Architecture and Objective",
"### Compute Infrastructure",
"#### Hardware",
"#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Model Card Authors [optional]",
"## Model Card Contact"
] | [
"TAGS\n#transformers #safetensors #llama #text-generation #arxiv-1910.09700 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n",
"# Model Card for Model ID",
"## Model Details",
"### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:",
"### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Downstream Use [optional]",
"### Out-of-Scope Use",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.",
"## How to Get Started with the Model\n\nUse the code below to get started with the model.",
"## Training Details",
"### Training Data",
"### Training Procedure",
"#### Preprocessing [optional]",
"#### Training Hyperparameters\n\n- Training regime:",
"#### Speeds, Sizes, Times [optional]",
"## Evaluation",
"### Testing Data, Factors & Metrics",
"#### Testing Data",
"#### Factors",
"#### Metrics",
"### Results",
"#### Summary",
"## Model Examination [optional]",
"## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:",
"## Technical Specifications [optional]",
"### Model Architecture and Objective",
"### Compute Infrastructure",
"#### Hardware",
"#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Model Card Authors [optional]",
"## Model Card Contact"
] | [
56, 6, 3, 82, 28, 3, 4, 9, 9, 10, 42, 20, 3, 4, 5, 9, 11, 13, 3, 12, 5, 4, 5, 3, 4, 9, 53, 9, 8, 6, 3, 14, 8, 7, 9, 4
] | [
"passage: TAGS\n#transformers #safetensors #llama #text-generation #arxiv-1910.09700 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n# Model Card for Model ID## Model Details### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Downstream Use [optional]### Out-of-Scope Use## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.## How to Get Started with the Model\n\nUse the code below to get started with the model.## Training Details### Training Data### Training Procedure#### Preprocessing [optional]#### Training Hyperparameters\n\n- Training regime:#### Speeds, Sizes, Times [optional]## Evaluation### Testing Data, Factors & Metrics#### Testing Data#### Factors#### Metrics### Results#### Summary## Model Examination [optional]## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:## Technical Specifications [optional]### Model Architecture and Objective### Compute Infrastructure#### Hardware#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Model Card Authors [optional]## Model Card Contact"
] | [
… (embedding-vector float values omitted)
-0.015208319760859013,
0.0904119610786438,
-0.03033481352031231,
0.01743943803012371,
0.09215071052312851,
0.0030607767403125763,
0.17535938322544098,
0.051709048449993134,
0.17189906537532806,
0.07866133749485016,
0.06444311141967773,
0.02004685252904892,
0.007725914940237999,
0.021817529574036598,
0.017227526754140854,
-0.0030957073904573917,
-0.08709781616926193,
-0.0034981227945536375,
0.1202581599354744,
0.049845851957798004,
0.029173865914344788,
0.012042860500514507,
-0.030704669654369354,
0.08337877690792084,
0.1770893782377243,
0.0029054484330117702,
-0.1893385946750641,
-0.07169844210147858,
0.07795937359333038,
-0.08648337423801422,
-0.10729733109474182,
-0.029470939189195633,
0.041069481521844864,
-0.1729043871164322,
0.016882894560694695,
-0.019335895776748657,
0.10788324475288391,
-0.13190391659736633,
-0.01772487722337246,
0.05657728388905525,
0.06932812184095383,
-0.009677323512732983,
0.06694949418306351,
-0.16090403497219086,
0.11770165711641312,
0.01751571334898472,
0.06636732816696167,
-0.09608277678489685,
0.09618937969207764,
-0.007830657996237278,
0.0041499207727611065,
0.1410749852657318,
0.010120149701833725,
-0.05952107161283493,
-0.09608154743909836,
-0.10546442121267319,
-0.009841260500252247,
0.1306990385055542,
-0.14852415025234222,
0.08813067525625229,
-0.02661319263279438,
-0.044553373008966446,
0.003614129964262247,
-0.12497276812791824,
-0.13103094696998596,
-0.18366187810897827,
0.05707118660211563,
-0.12947207689285278,
0.04045100137591362,
-0.10902881622314453,
-0.045833900570869446,
-0.02098964899778366,
0.20040063560009003,
-0.23137451708316803,
-0.06714103370904922,
-0.1551055610179901,
-0.08061286807060242,
0.14446212351322174,
-0.046455029398202896,
0.08550118654966354,
0.0008278203313238919,
0.19068008661270142,
0.021319707855582237,
-0.017237508669495583,
0.1072206199169159,
-0.10052918642759323,
-0.2010865956544876,
-0.09273224323987961,
0.15895552933216095,
0.13766798377037048,
0.03809428587555885,
-0.004381525795906782,
0.03171157464385033,
-0.02098114788532257,
-0.12076930701732635,
0.020226983353495598,
0.17317426204681396,
0.08982043713331223,
0.025265544652938843,
-0.02972041629254818,
-0.11267432570457458,
-0.07061342149972916,
-0.03774050623178482,
0.024755435064435005,
0.18072067201137543,
-0.07222156971693039,
0.18405316770076752,
0.13775517046451569,
-0.05534014105796814,
-0.19904261827468872,
0.021996473893523216,
0.04293542355298996,
0.0070380112156271935,
0.0323902890086174,
-0.20307663083076477,
0.09384101629257202,
0.0008334947633557022,
-0.05131231248378754,
0.1379684954881668,
-0.1823476254940033,
-0.151598259806633,
0.06042521819472313,
0.043563615530729294,
-0.19374065101146698,
-0.12374074012041092,
-0.08848230540752411,
-0.04693066328763962,
-0.15487661957740784,
0.10312657803297043,
0.0020827590487897396,
0.008401188999414444,
0.03778626397252083,
0.02252252586185932,
0.012139533646404743,
-0.04198719933629036,
0.1914343535900116,
-0.025891713798046112,
0.03347287327051163,
-0.0790715217590332,
-0.060851071029901505,
0.062408581376075745,
-0.058187782764434814,
0.0755455270409584,
-0.025226406753063202,
0.015947066247463226,
-0.10598332434892654,
-0.048235729336738586,
-0.02852320298552513,
0.019321219995617867,
-0.09431382268667221,
-0.09348297864198685,
-0.04829427972435951,
0.09367614984512329,
0.09042316675186157,
-0.03652578964829445,
-0.03649144619703293,
-0.078715980052948,
0.038977332413196564,
0.17627815902233124,
0.18159319460391998,
0.04659178853034973,
-0.07959239184856415,
-0.001915142871439457,
-0.014336181804537773,
0.04684065282344818,
-0.22077152132987976,
0.060553863644599915,
0.04557652771472931,
0.016117896884679794,
0.11537692695856094,
-0.0208132341504097,
-0.16198977828025818,
-0.06710557639598846,
0.061360616236925125,
-0.06944561004638672,
-0.17825035750865936,
0.0039279889315366745,
0.07344977557659149,
-0.16578389704227448,
-0.037031736224889755,
0.04200848564505577,
-0.01189455483108759,
-0.0403641052544117,
0.012352054007351398,
0.08063354343175888,
0.007078902795910835,
0.07699975371360779,
0.055281639099121094,
0.09124495089054108,
-0.10227900743484497,
0.07410510629415512,
0.08149529248476028,
-0.08644098788499832,
0.030720343813300133,
0.09573426842689514,
-0.06469762325286865,
-0.0346054881811142,
0.04237886518239975,
0.08354541659355164,
0.024281201884150505,
-0.04682289808988571,
0.0023111123591661453,
-0.09734189510345459,
0.05927345156669617,
0.11483542621135712,
0.03496333956718445,
0.011234734207391739,
0.03813567012548447,
0.04486291855573654,
-0.08093374222517014,
0.11926916986703873,
0.023795632645487785,
0.020354853942990303,
-0.04112942889332771,
-0.040553025901317596,
0.035851649940013885,
-0.026020776480436325,
-0.011440055444836617,
-0.035174157470464706,
-0.0722682997584343,
-0.014069457538425922,
-0.16000694036483765,
-0.0076758842915296555,
-0.03660871088504791,
0.005114538595080376,
0.022510098293423653,
-0.03652830421924591,
0.00792311318218708,
0.012217256240546703,
-0.06868947297334671,
-0.05553458258509636,
-0.023233558982610703,
0.09422210603952408,
-0.16494666039943695,
0.0220257006585598,
0.0823851153254509,
-0.12121747434139252,
0.09289738535881042,
0.016782134771347046,
0.00412249518558383,
0.026962365955114365,
-0.1545863002538681,
0.04763968288898468,
-0.020152103155851364,
0.013473534025251865,
0.04222847521305084,
-0.21637047827243805,
-0.004404853098094463,
-0.04015503451228142,
-0.05566934496164322,
-0.008993052877485752,
-0.0319182425737381,
-0.11338426172733307,
0.09645436704158783,
0.011025024577975273,
-0.08443772792816162,
-0.02965564839541912,
0.03353232145309448,
0.07690354436635971,
-0.027447547763586044,
0.1498211771249771,
-0.004663881380110979,
0.07559948414564133,
-0.17581342160701752,
-0.02282017655670643,
-0.011197620071470737,
0.022367527708411217,
-0.021871577948331833,
-0.01622559316456318,
0.04623444378376007,
-0.02704801969230175,
0.19120801985263824,
-0.024701936170458794,
0.049393873661756516,
0.06364397704601288,
0.009232889860868454,
-0.013832193799316883,
0.11151392012834549,
0.05708572641015053,
0.024334950372576714,
0.022262847051024437,
0.003451440716162324,
-0.04008655622601509,
-0.009981024079024792,
-0.18596695363521576,
0.06803664565086365,
0.14585918188095093,
0.09060460329055786,
-0.012669353745877743,
0.0707244873046875,
-0.10161512345075607,
-0.12005364894866943,
0.10127941519021988,
-0.06415384262800217,
-0.010188822634518147,
-0.06542414426803589,
0.14027701318264008,
0.14953285455703735,
-0.1886233240365982,
0.06583356112241745,
-0.06602055579423904,
-0.0566304549574852,
-0.11457879096269608,
-0.1930263340473175,
-0.057075321674346924,
-0.050602465867996216,
-0.018466074019670486,
-0.05384097993373871,
0.06939727067947388,
0.05750798434019089,
0.01126816775649786,
0.00868057832121849,
0.08568526059389114,
-0.009656033478677273,
0.00248199631460011,
0.030120067298412323,
0.06713981181383133,
0.016768986359238625,
-0.0321255661547184,
0.0179112758487463,
-0.00597198773175478,
0.034156378358602524,
0.059282708913087845,
0.03608176112174988,
-0.028436895459890366,
0.015559280291199684,
-0.034912437200546265,
-0.11309733241796494,
0.042801856994628906,
-0.029640642926096916,
-0.0749855786561966,
0.1347348988056183,
0.026981467381119728,
0.005015076603740454,
-0.023140020668506622,
0.2503887414932251,
-0.07436972856521606,
-0.09334370493888855,
-0.14373961091041565,
0.11701542884111404,
-0.04212593287229538,
0.0635172426700592,
0.03596310690045357,
-0.10810714215040207,
0.017985546961426735,
0.1320217251777649,
0.15442703664302826,
-0.04732590913772583,
0.019251897931098938,
0.028577854856848717,
0.00439635943621397,
-0.04075566306710243,
0.05177190154790878,
0.07100846618413925,
0.14500564336776733,
-0.05157303810119629,
0.08530787378549576,
0.002609728369861841,
-0.1021018698811531,
-0.041973695158958435,
0.11415864527225494,
-0.014296893030405045,
0.017620453611016273,
-0.057136841118335724,
0.124222531914711,
-0.05874236673116684,
-0.23697422444820404,
0.06316976249217987,
-0.0765061303973198,
-0.1432730257511139,
-0.024886758998036385,
0.071670763194561,
-0.016632623970508575,
0.02605951391160488,
0.07167234271764755,
-0.0754380151629448,
0.18880942463874817,
0.03957989811897278,
-0.05233397334814072,
-0.05954399332404137,
0.0744764655828476,
-0.11850855499505997,
0.27879106998443604,
0.010482731275260448,
0.051307905465364456,
0.1042102724313736,
-0.02021743729710579,
-0.13270841538906097,
0.023401619866490364,
0.09579801559448242,
-0.08917027711868286,
0.04087764397263527,
0.21448291838169098,
-0.00629545608535409,
0.11935057491064072,
0.07611140608787537,
-0.07468950748443604,
0.047562725841999054,
-0.11468592286109924,
-0.07639975845813751,
-0.08699081838130951,
0.09244474768638611,
-0.06785612553358078,
0.14258281886577606,
0.12599852681159973,
-0.05530165135860443,
0.011584274470806122,
-0.028389399871230125,
0.045467376708984375,
0.005578654818236828,
0.100032277405262,
0.011115525849163532,
-0.18496567010879517,
0.024811718612909317,
0.016259413212537766,
0.10884406417608261,
-0.18112654983997345,
-0.09105053544044495,
0.046958595514297485,
0.0005061255069449544,
-0.06443515419960022,
0.12483241409063339,
0.057313691824674606,
0.04654949903488159,
-0.0451689288020134,
-0.026830285787582397,
-0.006042256020009518,
0.14264579117298126,
-0.10707559436559677,
-0.005129707511514425
] |
null | null | transformers |
# OgnoMonarch-7B
OgnoMonarch-7B is a merge of the following models, created with [LazyMergekit](https://colab.research.google.com/drive/1obulZ1ROXHjYLn6PPZJwRR6GzgQogxxb?usp=sharing):
* [paulml/OGNO-7B](https://huggingface.co/paulml/OGNO-7B)
* [mlabonne/Monarch-7B](https://huggingface.co/mlabonne/Monarch-7B)
## 🧩 Configuration
```yaml
slices:
- sources:
- model: paulml/OGNO-7B
layer_range: [0, 32]
- model: mlabonne/Monarch-7B
layer_range: [0, 32]
merge_method: slerp
base_model: paulml/OGNO-7B
parameters:
t:
- filter: self_attn
value: [0, 0.5, 0.3, 0.7, 1]
- filter: mlp
value: [1, 0.5, 0.7, 0.3, 0]
- value: 0.5
dtype: bfloat16
```
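
A config like the one above can be turned into an actual merged checkpoint with the mergekit command-line tool. The sketch below is illustrative only: the `mergekit-yaml` entry point is assumed from the upstream mergekit project, and the file name and output directory are placeholders, not values specified by this card.

```python
# Minimal sketch: write the merge config to disk and invoke mergekit.
# Assumes `mergekit` is installed (pip install mergekit) and that the
# `mergekit-yaml` console script is available on PATH.
import subprocess
from pathlib import Path

config = """\
slices:
  - sources:
      - model: paulml/OGNO-7B
        layer_range: [0, 32]
      - model: mlabonne/Monarch-7B
        layer_range: [0, 32]
merge_method: slerp
base_model: paulml/OGNO-7B
parameters:
  t:
    - filter: self_attn
      value: [0, 0.5, 0.3, 0.7, 1]
    - filter: mlp
      value: [1, 0.5, 0.7, 0.3, 0]
    - value: 0.5
dtype: bfloat16
"""

Path("ogno_monarch.yaml").write_text(config)

# Output directory is a placeholder; choose any local path with enough disk space.
subprocess.run(["mergekit-yaml", "ogno_monarch.yaml", "./OgnoMonarch-7B"], check=True)
```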
## 💻 Usage
```python
# Install dependencies first (e.g. in a notebook cell): pip install -qU transformers accelerate
from transformers import AutoTokenizer
import transformers
import torch
model = "yleo/OgnoMonarch-7B"
messages = [{"role": "user", "content": "What is a large language model?"}]
tokenizer = AutoTokenizer.from_pretrained(model)
prompt = tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)
pipeline = transformers.pipeline(
"text-generation",
model=model,
torch_dtype=torch.float16,
device_map="auto",
)
outputs = pipeline(prompt, max_new_tokens=256, do_sample=True, temperature=0.7, top_k=50, top_p=0.95)
print(outputs[0]["generated_text"])
``` | {"license": "cc-by-nc-4.0", "tags": ["merge", "mergekit", "lazymergekit", "paulml/OGNO-7B", "mlabonne/Monarch-7B"], "base_model": ["paulml/OGNO-7B", "mlabonne/Monarch-7B"]} | text-generation | yleo/OgnoMonarch-7B | [
"transformers",
"safetensors",
"mistral",
"text-generation",
"merge",
"mergekit",
"lazymergekit",
"paulml/OGNO-7B",
"mlabonne/Monarch-7B",
"base_model:paulml/OGNO-7B",
"base_model:mlabonne/Monarch-7B",
"license:cc-by-nc-4.0",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | 2024-02-14T15:43:32+00:00 | [] | [] | TAGS
#transformers #safetensors #mistral #text-generation #merge #mergekit #lazymergekit #paulml/OGNO-7B #mlabonne/Monarch-7B #base_model-paulml/OGNO-7B #base_model-mlabonne/Monarch-7B #license-cc-by-nc-4.0 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
# OgnoMonarch-7B
OgnoMonarch-7B is a merge of the following models using LazyMergekit:
* paulml/OGNO-7B
* mlabonne/Monarch-7B
## Configuration
## Usage
| [
"# OgnoMonarch-7B\n\nOgnoMonarch-7B is a merge of the following models using LazyMergekit:\n* paulml/OGNO-7B\n* mlabonne/Monarch-7B",
"## Configuration",
"## Usage"
] | [
"TAGS\n#transformers #safetensors #mistral #text-generation #merge #mergekit #lazymergekit #paulml/OGNO-7B #mlabonne/Monarch-7B #base_model-paulml/OGNO-7B #base_model-mlabonne/Monarch-7B #license-cc-by-nc-4.0 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n",
"# OgnoMonarch-7B\n\nOgnoMonarch-7B is a merge of the following models using LazyMergekit:\n* paulml/OGNO-7B\n* mlabonne/Monarch-7B",
"## Configuration",
"## Usage"
] | [
113,
45,
4,
3
] | [
"passage: TAGS\n#transformers #safetensors #mistral #text-generation #merge #mergekit #lazymergekit #paulml/OGNO-7B #mlabonne/Monarch-7B #base_model-paulml/OGNO-7B #base_model-mlabonne/Monarch-7B #license-cc-by-nc-4.0 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n# OgnoMonarch-7B\n\nOgnoMonarch-7B is a merge of the following models using LazyMergekit:\n* paulml/OGNO-7B\n* mlabonne/Monarch-7B## Configuration## Usage"
] | [
-0.06144983321428299,
0.07672954350709915,
-0.004818749148398638,
0.033490922302007675,
0.05618608370423317,
0.041201911866664886,
0.1362975686788559,
0.08176293969154358,
-0.05698104202747345,
0.03494047001004219,
0.08734539151191711,
0.12629000842571259,
0.019862806424498558,
0.044409237802028656,
-0.08252304792404175,
-0.19173745810985565,
0.06823043525218964,
0.017197057604789734,
-0.11126182228326797,
0.03878562152385712,
0.11294735223054886,
0.013708644546568394,
0.15146799385547638,
0.00327818444930017,
-0.06509469449520111,
0.03608810529112816,
-0.046475600451231,
-0.04374607279896736,
0.04511343687772751,
0.10794711112976074,
0.002070476533845067,
0.07232502102851868,
-0.048045843839645386,
-0.09730993956327438,
0.03411324322223663,
-0.018240293487906456,
-0.04811154305934906,
0.06542812287807465,
0.0792165994644165,
-0.03452131524682045,
0.1586187779903412,
-0.0233354102820158,
0.0034229776356369257,
0.059812333434820175,
-0.1434575766324997,
-0.030417269095778465,
-0.07778667658567429,
0.03651126101613045,
0.0316922627389431,
0.04155438020825386,
-0.004446379374712706,
0.09694726020097733,
-0.0417025126516819,
0.05080706998705864,
0.12563997507095337,
-0.3297414779663086,
-0.033636368811130524,
0.19988621771335602,
0.08416593074798584,
0.016663290560245514,
0.009765797294676304,
0.07715224474668503,
0.022609557956457138,
0.015643713995814323,
0.050062473863363266,
-0.09871289879083633,
0.14241690933704376,
-0.05001259967684746,
-0.11482061445713043,
-0.024170242249965668,
0.22184059023857117,
0.030848177149891853,
-0.00315655255690217,
-0.09938795864582062,
-0.05156271159648895,
0.14590759575366974,
-0.04973967745900154,
0.021595526486635208,
0.04678189381957054,
0.03883003070950508,
0.024489667266607285,
-0.08283142745494843,
-0.035996533930301666,
-0.007669673301279545,
-0.1022723913192749,
0.07422668486833572,
-0.015053166076540947,
0.010107740759849548,
0.018518300727009773,
0.08112327009439468,
-0.24496042728424072,
-0.12274149805307388,
-0.040140990167856216,
-0.06447014957666397,
0.08366523683071136,
-0.011017401702702045,
-0.05557470768690109,
0.012206116691231728,
0.1391344517469406,
0.25190141797065735,
-0.06818502396345139,
0.03503163903951645,
-0.028135674074292183,
0.07298169285058975,
0.021270371973514557,
0.027843154966831207,
-0.08068053424358368,
-0.18113896250724792,
0.09222016483545303,
0.038542162626981735,
0.06599553674459457,
-0.0062484643422067165,
-0.11085227131843567,
-0.05410284176468849,
-0.060688309371471405,
0.01871296390891075,
0.09205196052789688,
0.09895971417427063,
-0.07055255025625229,
-0.04497312381863594,
0.28070393204689026,
-0.03372526541352272,
0.024152010679244995,
-0.007825147360563278,
-0.03707176446914673,
-0.0065380120649933815,
0.07638200372457504,
0.0798710361123085,
-0.022388942539691925,
0.027650851756334305,
-0.041438180953264236,
-0.053383711725473404,
-0.007374575361609459,
-0.06363587081432343,
0.008523663505911827,
-0.0019336147233843803,
-0.0021010497584939003,
-0.08709323406219482,
-0.17567498981952667,
-0.007043012417852879,
0.07676982134580612,
-0.05634765326976776,
-0.08386556804180145,
-0.06543967127799988,
-0.037354275584220886,
-0.016740065068006516,
0.012192212045192719,
0.03417757526040077,
-0.04343618080019951,
-0.024751141667366028,
0.03787222132086754,
0.004442861303687096,
-0.20994052290916443,
0.02252463437616825,
-0.11332978308200836,
0.08939605951309204,
-0.17457365989685059,
0.061493437737226486,
-0.05163860693573952,
0.05837016552686691,
-0.06940052658319473,
-0.019969364628195763,
-0.09702091664075851,
0.035542748868465424,
0.03127048537135124,
0.12368883937597275,
0.027701394632458687,
-0.07356841117143631,
0.01825934648513794,
-0.09808554500341415,
-0.1555505096912384,
0.10217931121587753,
0.03253732994198799,
0.048488929867744446,
0.032314617186784744,
0.16477695107460022,
0.05715412646532059,
-0.013847275637090206,
-0.02697104774415493,
0.016502996906638145,
0.0035913221072405577,
-0.0480419397354126,
0.1133430227637291,
-0.057391636073589325,
-0.08004897087812424,
0.0557682029902935,
-0.061993859708309174,
0.02469167672097683,
0.004907978232949972,
-0.03894798457622528,
-0.07221487164497375,
-0.07420508563518524,
-0.0008714453433640301,
-0.02607991360127926,
0.019970962777733803,
-0.07606136798858643,
-0.0723528042435646,
-0.0012341258116066456,
0.060696132481098175,
-0.03950945660471916,
-0.008503368124365807,
-0.03044082410633564,
0.1663394570350647,
-0.09486483782529831,
0.048228103667497635,
-0.06214958429336548,
-0.08758768439292908,
0.008909357711672783,
-0.008470838889479637,
0.06202720105648041,
-0.04332280158996582,
0.06613805890083313,
0.06404440850019455,
-0.08075173944234848,
-0.02679048478603363,
0.11943821609020233,
0.023540738970041275,
0.010978746227920055,
-0.17800404131412506,
-0.02595614083111286,
-0.037728652358055115,
0.1799616515636444,
-0.0697157084941864,
0.08999568969011307,
0.039831407368183136,
0.14889559149742126,
-0.016728131100535393,
0.018855951726436615,
0.04777229577302933,
0.034927040338516235,
-0.04689595848321915,
-0.0122719407081604,
0.08655540645122528,
0.025701021775603294,
-0.11840928345918655,
0.15221616625785828,
-0.15128374099731445,
0.16412986814975739,
0.15000513195991516,
-0.024659734219312668,
0.01842469722032547,
-0.02290925569832325,
0.008015982806682587,
-0.052497584372758865,
0.105677530169487,
-0.05986693874001503,
0.09744423627853394,
-0.05111987143754959,
0.12811321020126343,
-0.09024941176176071,
-0.03840493783354759,
0.012947270646691322,
-0.041382916271686554,
-0.07042887806892395,
0.08538100868463516,
0.06958060711622238,
-0.14759619534015656,
0.13291743397712708,
0.24094915390014648,
-0.009523379616439342,
0.08039804548025131,
0.0012294779298827052,
-0.00392721826210618,
-0.04102778062224388,
0.027133140712976456,
0.039069581776857376,
0.01193855982273817,
-0.12130986154079437,
0.046051476150751114,
0.04977229982614517,
0.03627314046025276,
0.08722619712352753,
-0.07043077051639557,
0.03220279514789581,
-0.0022989094723016024,
-0.00744804460555315,
0.059682697057724,
0.0785011351108551,
-0.018043706193566322,
0.057370834052562714,
0.01888415776193142,
-0.0736030638217926,
0.07866081595420837,
0.02472573332488537,
-0.052322693169116974,
0.15051133930683136,
-0.09926750510931015,
-0.192541241645813,
-0.18464981019496918,
-0.09270279854536057,
-0.15668433904647827,
-0.0005870113964192569,
0.06701885908842087,
0.042947061359882355,
-0.035670630633831024,
-0.1480349451303482,
0.07245112955570221,
-0.004432239104062319,
-0.05949020758271217,
-0.00920599140226841,
0.018240388482809067,
0.003299865871667862,
-0.11719358712434769,
-0.05497024953365326,
0.04811942204833031,
-0.08714696019887924,
0.06202127784490585,
-0.11011670529842377,
0.09571248292922974,
0.06797241419553757,
0.03084542416036129,
-0.003492939518764615,
-0.006611064076423645,
0.2022053748369217,
-0.02130340412259102,
0.03751743584871292,
0.15565568208694458,
-0.01614738442003727,
0.0562770739197731,
0.16922368109226227,
0.045607149600982666,
-0.05008172243833542,
-0.020928826183080673,
-0.028478723019361496,
-0.04610877484083176,
-0.13134171068668365,
-0.10485444962978363,
-0.08264245837926865,
0.08473480492830276,
0.024113185703754425,
0.06588494777679443,
0.10262534767389297,
0.07454612106084824,
-0.06271739304065704,
0.003958529327064753,
0.09534952789545059,
0.0986066684126854,
0.22007617354393005,
0.015050187706947327,
0.09581595659255981,
-0.0389786921441555,
-0.02536688558757305,
0.08796863257884979,
0.04489694535732269,
-0.01850035786628723,
0.053520530462265015,
0.15986184775829315,
0.011842578649520874,
0.08958130329847336,
0.0666106790304184,
0.003607378341257572,
-0.03027348406612873,
0.0023499340750277042,
-0.046496301889419556,
-0.09256602078676224,
0.05231953784823418,
0.036033645272254944,
-0.0019370423397049308,
0.011068292893469334,
-0.03504577651619911,
-0.09181971102952957,
0.07627850770950317,
0.13140206038951874,
0.04899336025118828,
-0.2799150049686432,
-0.0431412048637867,
0.05244138464331627,
0.049532487988471985,
-0.06095731630921364,
-0.005654494743794203,
-0.029337961226701736,
-0.08836937695741653,
0.12485045939683914,
0.01581948809325695,
0.12116874009370804,
0.006936643738299608,
0.02818184532225132,
0.00419352063909173,
0.09874539077281952,
-0.02129441313445568,
0.043629854917526245,
-0.1990703046321869,
0.07071082293987274,
0.03163175284862518,
0.014401615597307682,
0.0023000140208750963,
0.050521623343229294,
0.062426477670669556,
0.2215336114168167,
0.037317413836717606,
-0.015005120076239109,
-0.02961094118654728,
0.0019892111886292696,
-0.11532469838857651,
0.029410958290100098,
0.01321115531027317,
-0.11032510548830032,
0.04496688395738602,
-0.02726304717361927,
-0.05370672792196274,
0.022412093356251717,
0.014533160254359245,
-0.14602415263652802,
-0.1424275040626526,
0.09840455651283264,
0.05687542259693146,
0.10615458339452744,
-0.06969311088323593,
-0.005950773134827614,
-0.11154721677303314,
0.33697354793548584,
-0.012487269006669521,
-0.048622846603393555,
-0.07496979087591171,
-0.08759943395853043,
0.13655127584934235,
-0.03556608036160469,
0.06264469772577286,
-0.06979302316904068,
0.02774864062666893,
-0.04646066576242447,
-0.14660508930683136,
0.09336765110492706,
-0.08876701444387436,
-0.07418031990528107,
0.011232361197471619,
0.14704859256744385,
-0.11026664823293686,
0.02278166264295578,
0.0024674066808074713,
0.08267776668071747,
-0.055460114032030106,
-0.06245476007461548,
-0.0012565907090902328,
0.060728199779987335,
0.029257744550704956,
0.10539823025465012,
-0.05095553398132324,
-0.1755146086215973,
0.020917238667607307,
-0.0017650544177740812,
0.11001139879226685,
0.32151323556900024,
0.024027621373534203,
0.03432997316122055,
0.16498304903507233,
-0.06921696662902832,
-0.2016090750694275,
-0.04895323887467384,
-0.07511879503726959,
0.00617455318570137,
0.020298480987548828,
-0.10273931175470352,
0.06557796895503998,
0.14589859545230865,
-0.022781720384955406,
0.11084329336881638,
-0.3273261785507202,
-0.10749159008264542,
0.06291501224040985,
0.029322652146220207,
0.21384260058403015,
-0.15629549324512482,
-0.08187180757522583,
-0.06548559665679932,
-0.21286390721797943,
0.07392076402902603,
-0.08170410990715027,
0.09513087570667267,
-0.02280350774526596,
-0.04225282371044159,
0.0039972844533622265,
-0.03473959118127823,
0.15930908918380737,
-0.08710512518882751,
0.0014811953296884894,
-0.09329010546207428,
-0.012812664732336998,
0.1577850729227066,
-0.04720407724380493,
0.056714195758104324,
-0.19177542626857758,
0.02311549335718155,
-0.006498122587800026,
-0.038978952914476395,
-0.018070107325911522,
0.11629676818847656,
-0.037947606295347214,
-0.033279549330472946,
-0.022151313722133636,
0.008655951358377934,
-0.0241025909781456,
0.05907107889652252,
0.12689760327339172,
-0.05067269876599312,
0.04734061658382416,
0.15860317647457123,
0.09444667398929596,
-0.11154448986053467,
0.030789297074079514,
-0.0067079151049256325,
-0.05871745944023132,
0.027068497613072395,
-0.04745122045278549,
-0.0013747159391641617,
0.08932309597730637,
-0.041526682674884796,
0.06462599337100983,
0.04433002322912216,
0.003313021967187524,
-0.021717671304941177,
0.09680617600679398,
-0.17097871005535126,
-0.08180968463420868,
-0.056379418820142746,
-0.05833418294787407,
0.002788298297673464,
0.035305049270391464,
0.19614540040493011,
-0.024802325293421745,
-0.010828466154634953,
0.048232629895210266,
-0.004227026831358671,
-0.12017641216516495,
0.062307242304086685,
-0.031994301825761795,
-0.015321310609579086,
-0.08014477044343948,
0.06673406064510345,
0.048751357942819595,
-0.0274571031332016,
-0.035969629883766174,
0.11117851734161377,
-0.09142182767391205,
-0.10978855937719345,
-0.07117179036140442,
0.19960834085941315,
-0.05118991434574127,
-0.045902449637651443,
-0.12619970738887787,
-0.07086770981550217,
0.012853451073169708,
0.06733725965023041,
0.06405649334192276,
0.019396305084228516,
0.03492674604058266,
-0.036030665040016174,
-0.03492867201566696,
0.0406017042696476,
-0.011645548976957798,
0.11098292469978333,
-0.061118677258491516,
-0.0704798698425293,
-0.0142842847853899,
-0.008222117088735104,
-0.037799984216690063,
-0.0006658312631770968,
-0.14325399696826935,
-0.06951416283845901,
-0.11025175452232361,
-0.03224619850516319,
-0.14384058117866516,
-0.017230790108442307,
-0.022150158882141113,
-0.026463694870471954,
-0.026171009987592697,
0.01517571322619915,
-0.022465689107775688,
-0.06325256824493408,
-0.020488694310188293,
0.09764105826616287,
-0.08110086619853973,
0.003790037939324975,
0.0467168353497982,
-0.061428699642419815,
0.06567294895648956,
0.014577871188521385,
-0.03550739213824272,
-0.029064705595374107,
-0.18276306986808777,
-0.06933550536632538,
0.038124121725559235,
-0.008814269676804543,
-0.0017388564301654696,
-0.08467202633619308,
-0.01039628405123949,
0.07299986481666565,
-0.021302934736013412,
-0.013898484408855438,
0.12770944833755493,
-0.10457642376422882,
0.004240542184561491,
-0.04235496744513512,
-0.09442533552646637,
-0.006380707956850529,
0.005296849180012941,
0.0972919762134552,
0.03368080034852028,
0.13253054022789001,
-0.04598552733659744,
-0.0033415667712688446,
-0.12116639316082001,
-0.00829180609434843,
0.01612425036728382,
-0.18908987939357758,
-0.0965866968035698,
-0.028957873582839966,
0.015079117380082607,
-0.010455623269081116,
0.19761662185192108,
-0.017293712124228477,
-0.16176190972328186,
0.023802706971764565,
0.011768204160034657,
0.14477819204330444,
0.04772299900650978,
0.22321827709674835,
0.08761359751224518,
-0.003171172458678484,
-0.07344380021095276,
0.0952543169260025,
0.023480167612433434,
-0.04501863569021225,
0.018786724656820297,
0.09345076233148575,
0.027970489114522934,
0.06902234256267548,
0.10747090727090836,
0.005555136129260063,
-0.0460188053548336,
0.047531239688396454,
0.05524948239326477,
0.08124980330467224,
-0.0471901036798954,
0.13479112088680267,
0.14642123878002167,
-0.10277478396892548,
0.025648141279816628,
0.023639392107725143,
-0.013039454817771912,
-0.06731361150741577,
-0.1514691561460495,
-0.12285608798265457,
-0.12564288079738617,
-0.020728424191474915,
-0.11113988608121872,
-0.013154597021639347,
0.08984269946813583,
-0.0002845234703272581,
-0.018008675426244736,
0.11976141482591629,
-0.03589563071727753,
-0.03475871682167053,
0.026492344215512276,
-0.07723154127597809,
-0.013871492817997932,
-0.020945515483617783,
-0.054602574557065964,
-0.0018510724185034633,
0.005867213476449251,
-0.0026561615522950888,
0.03495151549577713,
0.004751322790980339,
0.036202844232320786,
-0.0812658816576004,
-0.09618651866912842,
-0.028210831806063652,
0.05861685425043106,
-0.011283274739980698,
0.05876258760690689,
0.019936958327889442,
-0.0786975622177124,
0.0527099147439003,
0.09403948485851288,
-0.013403289951384068,
-0.13625217974185944,
-0.08693628758192062,
0.15371891856193542,
-0.033471085131168365,
0.013335767202079296,
0.02104080840945244,
-0.04297269508242607,
0.031154638156294823,
0.14006009697914124,
0.33398205041885376,
-0.051298659294843674,
-0.00046167318942025304,
0.03798714280128479,
0.007451326586306095,
0.06520561128854752,
0.03214099258184433,
0.021190257743000984,
0.13627557456493378,
-0.07159847021102905,
-0.0036026702728122473,
-0.031379226595163345,
-0.022520389407873154,
-0.11473758518695831,
0.005276917014271021,
0.0288871880620718,
-0.0721244215965271,
0.03938240930438042,
0.07734648138284683,
-0.07732075452804565,
0.03289904072880745,
-0.02047818712890148,
-0.138166144490242,
-0.07931482046842575,
-0.08577042073011398,
0.0017051276518031955,
0.0008289376273751259,
0.06249242648482323,
-0.06241984665393829,
-0.04781057685613632,
0.015064151957631111,
-0.03983628749847412,
-0.0813632681965828,
-0.072576142847538,
-0.014335542917251587,
-0.08010324090719223,
0.06566783040761948,
-0.012753648683428764,
0.04825572669506073,
0.11167791485786438,
0.006968213245272636,
-0.08381108194589615,
0.04153231158852577,
0.01600971631705761,
-0.03697352111339569,
0.04005124047398567,
-0.011263426393270493,
-0.024235235527157784,
0.15167894959449768,
0.031190253794193268,
-0.08056683838367462,
0.0677705928683281,
0.03581756725907326,
-0.04055498167872429,
-0.02231418713927269,
0.01539311371743679,
-0.06137779727578163,
0.09876527637243271,
0.13704049587249756,
-0.004285928327590227,
-0.019916843622922897,
-0.036240894347429276,
0.05444783717393875,
0.08103185147047043,
0.004624612629413605,
-0.067422054708004,
-0.1801449954509735,
-0.011625699698925018,
0.09208625555038452,
-0.0024169397074729204,
-0.22450776398181915,
-0.08596731722354889,
-0.15615110099315643,
0.04159483686089516,
-0.08278869837522507,
0.03527030721306801,
0.20041099190711975,
-0.01926691085100174,
-0.02609376236796379,
-0.07840432971715927,
-0.024806004017591476,
0.08188572525978088,
-0.08007451891899109,
-0.0995323657989502
] |
null | null | transformers |
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed to the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
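
In the absence of author-provided instructions, the sketch below shows one plausible way to query this checkpoint. The repository id `SKNahin/NER_Deberta` is taken from this card's metadata; the entity label set and the aggregation setting are assumptions, not documented properties of the model.

```python
# Hypothetical usage sketch for a DeBERTa token-classification checkpoint.
# The repo id comes from this card's metadata; entity labels are unknown here.
from transformers import AutoTokenizer, AutoModelForTokenClassification, pipeline

model_id = "SKNahin/NER_Deberta"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForTokenClassification.from_pretrained(model_id)

ner = pipeline(
    "token-classification",
    model=model,
    tokenizer=tokenizer,
    aggregation_strategy="simple",  # merge sub-word pieces into whole-entity spans
)

print(ner("Hugging Face is based in New York City."))
```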
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed]
| {"library_name": "transformers", "tags": []} | token-classification | SKNahin/NER_Deberta | [
"transformers",
"safetensors",
"deberta",
"token-classification",
"arxiv:1910.09700",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | 2024-02-14T15:44:13+00:00 | [
"1910.09700"
] | [] | TAGS
#transformers #safetensors #deberta #token-classification #arxiv-1910.09700 #autotrain_compatible #endpoints_compatible #region-us
|
# Model Card for Model ID
## Model Details
### Model Description
This is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.
- Developed by:
- Funded by [optional]:
- Shared by [optional]:
- Model type:
- Language(s) (NLP):
- License:
- Finetuned from model [optional]:
### Model Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Downstream Use [optional]
### Out-of-Scope Use
## Bias, Risks, and Limitations
### Recommendations
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
## Training Details
### Training Data
### Training Procedure
#### Preprocessing [optional]
#### Training Hyperparameters
- Training regime:
#### Speeds, Sizes, Times [optional]
## Evaluation
### Testing Data, Factors & Metrics
#### Testing Data
#### Factors
#### Metrics
### Results
#### Summary
## Model Examination [optional]
## Environmental Impact
Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).
- Hardware Type:
- Hours used:
- Cloud Provider:
- Compute Region:
- Carbon Emitted:
## Technical Specifications [optional]
### Model Architecture and Objective
### Compute Infrastructure
#### Hardware
#### Software
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Model Card Authors [optional]
## Model Card Contact
| [
"# Model Card for Model ID",
"## Model Details",
"### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:",
"### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Downstream Use [optional]",
"### Out-of-Scope Use",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.",
"## How to Get Started with the Model\n\nUse the code below to get started with the model.",
"## Training Details",
"### Training Data",
"### Training Procedure",
"#### Preprocessing [optional]",
"#### Training Hyperparameters\n\n- Training regime:",
"#### Speeds, Sizes, Times [optional]",
"## Evaluation",
"### Testing Data, Factors & Metrics",
"#### Testing Data",
"#### Factors",
"#### Metrics",
"### Results",
"#### Summary",
"## Model Examination [optional]",
"## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:",
"## Technical Specifications [optional]",
"### Model Architecture and Objective",
"### Compute Infrastructure",
"#### Hardware",
"#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Model Card Authors [optional]",
"## Model Card Contact"
] | [
"TAGS\n#transformers #safetensors #deberta #token-classification #arxiv-1910.09700 #autotrain_compatible #endpoints_compatible #region-us \n",
"# Model Card for Model ID",
"## Model Details",
"### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:",
"### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Downstream Use [optional]",
"### Out-of-Scope Use",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.",
"## How to Get Started with the Model\n\nUse the code below to get started with the model.",
"## Training Details",
"### Training Data",
"### Training Procedure",
"#### Preprocessing [optional]",
"#### Training Hyperparameters\n\n- Training regime:",
"#### Speeds, Sizes, Times [optional]",
"## Evaluation",
"### Testing Data, Factors & Metrics",
"#### Testing Data",
"#### Factors",
"#### Metrics",
"### Results",
"#### Summary",
"## Model Examination [optional]",
"## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:",
"## Technical Specifications [optional]",
"### Model Architecture and Objective",
"### Compute Infrastructure",
"#### Hardware",
"#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Model Card Authors [optional]",
"## Model Card Contact"
] | [
49,
6,
3,
82,
28,
3,
4,
9,
9,
10,
42,
20,
3,
4,
5,
9,
11,
13,
3,
12,
5,
4,
5,
3,
4,
9,
53,
9,
8,
6,
3,
14,
8,
7,
9,
4
] | [
"passage: TAGS\n#transformers #safetensors #deberta #token-classification #arxiv-1910.09700 #autotrain_compatible #endpoints_compatible #region-us \n# Model Card for Model ID## Model Details### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Downstream Use [optional]### Out-of-Scope Use## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.## How to Get Started with the Model\n\nUse the code below to get started with the model.## Training Details### Training Data### Training Procedure#### Preprocessing [optional]#### Training Hyperparameters\n\n- Training regime:#### Speeds, Sizes, Times [optional]## Evaluation### Testing Data, Factors & Metrics#### Testing Data#### Factors#### Metrics### Results#### Summary## Model Examination [optional]## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:## Technical Specifications [optional]### Model Architecture and Objective### Compute Infrastructure#### Hardware#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Model Card Authors [optional]## Model Card Contact"
] | [
-0.07582134753465652,
0.1588554084300995,
-0.0037710394244641066,
0.02595895528793335,
0.11814748495817184,
0.009787439368665218,
0.07563809305429459,
0.10580293834209442,
-0.01635763794183731,
0.12473590672016144,
0.039848506450653076,
0.10315580666065216,
0.10913156718015671,
0.19314853847026825,
-0.0022752105724066496,
-0.2091170996427536,
0.0622878298163414,
-0.11678310483694077,
0.010382972657680511,
0.12100998312234879,
0.14090564846992493,
-0.11161413788795471,
0.06912626326084137,
-0.042492516338825226,
-0.018259624019265175,
-0.031717754900455475,
-0.06345462054014206,
-0.05417080596089363,
0.06679418683052063,
0.055831894278526306,
0.0655626431107521,
0.02029980719089508,
0.08335743844509125,
-0.2845470905303955,
0.019045550376176834,
0.07766879349946976,
0.002123442944139242,
0.0609569177031517,
0.07383845746517181,
-0.07475782185792923,
0.09538372606039047,
-0.0625537782907486,
0.15552008152008057,
0.07236507534980774,
-0.09746827185153961,
-0.1846913844347,
-0.08630084246397018,
0.10301622748374939,
0.17936977744102478,
0.055135540664196014,
-0.03497261554002762,
0.14088936150074005,
-0.07017937302589417,
0.016605326905846596,
0.06468700617551804,
-0.07290935516357422,
-0.05348815396428108,
0.06037161126732826,
0.0742330476641655,
0.09727266430854797,
-0.13022415339946747,
-0.010263546369969845,
0.04170728102326393,
0.018862606957554817,
0.11035116761922836,
0.01817622222006321,
0.1315252184867859,
0.02978314645588398,
-0.14483527839183807,
-0.06033659726381302,
0.10560949891805649,
0.03417903184890747,
-0.05956808477640152,
-0.24999532103538513,
-0.007466548588126898,
-0.03454916179180145,
-0.028890257701277733,
-0.04949915036559105,
0.04322859272360802,
-0.02760316990315914,
0.09133703261613846,
0.0029181726276874542,
-0.06773355603218079,
-0.05198833718895912,
0.09289798140525818,
0.06621047854423523,
0.028308380395174026,
-0.028138238936662674,
0.01582835055887699,
0.12160526216030121,
0.10461324453353882,
-0.1136874184012413,
-0.062054362148046494,
-0.06239290162920952,
-0.08454962074756622,
-0.047138769179582596,
0.0364193357527256,
0.06810244172811508,
0.0531260222196579,
0.2071358561515808,
-0.006076968740671873,
0.04858258739113808,
0.03387023136019707,
0.012895005755126476,
0.07188420742750168,
0.07482324540615082,
-0.05877743288874626,
-0.13807903230190277,
-0.02900126948952675,
0.1174110546708107,
0.007540363352745771,
-0.030544515699148178,
-0.035413552075624466,
0.058053579181432724,
0.051524870097637177,
0.12658938765525818,
0.06639471650123596,
0.015262601897120476,
-0.07834838330745697,
-0.0524817518889904,
0.18440617620944977,
-0.15845707058906555,
0.02396208792924881,
0.016546091064810753,
-0.051321811974048615,
-0.031176572665572166,
0.016762053593993187,
0.010351779870688915,
-0.027110083028674126,
0.09371508657932281,
-0.06443753838539124,
-0.0473901741206646,
-0.10924410820007324,
-0.053515657782554626,
0.031546659767627716,
-0.020617445930838585,
-0.024359513074159622,
-0.04260227829217911,
-0.12834255397319794,
-0.07842771708965302,
0.0685746818780899,
-0.0641399547457695,
-0.06342477351427078,
-0.0371922142803669,
-0.06433878093957901,
0.012278573587536812,
-0.003310238244011998,
0.1162533387541771,
-0.030190767720341682,
0.05034793168306351,
-0.05589374527335167,
0.06739762425422668,
0.136927992105484,
0.030936243012547493,
-0.06643268465995789,
0.06668223440647125,
-0.2147911936044693,
0.10741851478815079,
-0.08720431476831436,
0.02960197627544403,
-0.16360381245613098,
-0.019591867923736572,
0.03655751794576645,
0.035667140036821365,
-0.009463734924793243,
0.14313486218452454,
-0.18016517162322998,
-0.03503832593560219,
0.1872098445892334,
-0.12664759159088135,
-0.09166831523180008,
0.05657235532999039,
-0.06124594062566757,
0.13424597680568695,
0.058792535215616226,
-0.021857086569070816,
0.053700074553489685,
-0.13882024586200714,
-0.023953566327691078,
-0.06349705904722214,
-0.01692943274974823,
0.1536400020122528,
0.0591365285217762,
-0.04715650901198387,
0.029974887147545815,
0.01766093075275421,
-0.026321327313780785,
-0.049422211945056915,
-0.03356151655316353,
-0.09505396336317062,
0.010490937158465385,
-0.08083124458789825,
0.019366491585969925,
-0.024703530594706535,
-0.09222285449504852,
-0.03895638883113861,
-0.15414710342884064,
0.013223775662481785,
0.10018230229616165,
-0.003258682554587722,
-0.031090516597032547,
-0.10073129087686539,
-0.0023207797203212976,
0.015256554819643497,
-0.005745676811784506,
-0.14969810843467712,
-0.058194927871227264,
0.02466205134987831,
-0.17005768418312073,
0.02577390894293785,
-0.044527240097522736,
0.03756798058748245,
0.04352610558271408,
-0.04543379694223404,
-0.03406783938407898,
0.016558455303311348,
0.021964261308312416,
-0.023266246542334557,
-0.2578973174095154,
-0.01380295492708683,
-0.05108672380447388,
0.173564150929451,
-0.24888335168361664,
0.04812554270029068,
0.06300827115774155,
0.12234839797019958,
0.008925281465053558,
-0.041419293731451035,
0.03762400150299072,
-0.05390049144625664,
-0.03631659969687462,
-0.06965415179729462,
-0.008510426618158817,
-0.03528457134962082,
-0.045454367995262146,
0.03824324160814285,
-0.18064084649085999,
-0.026336830109357834,
0.11385028064250946,
0.07398278266191483,
-0.16777318716049194,
-0.07124035060405731,
-0.03505503386259079,
-0.06051163747906685,
-0.07906899601221085,
-0.05589747801423073,
0.0883159190416336,
0.04612462595105171,
0.05180882290005684,
-0.0675927922129631,
-0.060058094561100006,
0.012471389956772327,
-0.011969506740570068,
-0.0303481537848711,
0.08634323626756668,
0.06659138202667236,
-0.12965452671051025,
0.10808708518743515,
0.07467851042747498,
0.07148940861225128,
0.1054377406835556,
0.007283661514520645,
-0.0933234840631485,
-0.019234750419855118,
0.0287589393556118,
0.014921999536454678,
0.15269333124160767,
-0.06041413173079491,
0.03667184337973595,
0.0400078184902668,
-0.0238299872726202,
0.008937633596360683,
-0.0961742103099823,
0.020881228148937225,
0.0288530420511961,
-0.011045076884329319,
0.024098359048366547,
-0.05350760370492935,
0.015204832889139652,
0.10605532675981522,
0.033052194863557816,
0.029378650709986687,
0.016013238579034805,
-0.04219504073262215,
-0.12545357644557953,
0.1795472502708435,
-0.09741300344467163,
-0.24795091152191162,
-0.1238713338971138,
-0.004490654915571213,
0.03899503871798515,
-0.010224607773125172,
0.022714829072356224,
-0.05680026113986969,
-0.10976943373680115,
-0.10007549822330475,
0.033845312893390656,
0.06473647058010101,
-0.08570653945207596,
-0.06949411332607269,
0.05287871137261391,
0.042630840092897415,
-0.12978382408618927,
0.020360087975859642,
0.04170483723282814,
-0.07134698331356049,
0.006720427889376879,
0.05840947851538658,
0.07955096662044525,
0.1792677789926529,
0.010206159204244614,
-0.02045300044119358,
0.01256562676280737,
0.21880127489566803,
-0.1466495245695114,
0.09328395873308182,
0.141775980591774,
-0.06401344388723373,
0.08160442858934402,
0.20209866762161255,
0.030180789530277252,
-0.10358195006847382,
0.03875577077269554,
0.03624651953577995,
-0.034544941037893295,
-0.24263033270835876,
-0.07476656138896942,
0.005603810306638479,
-0.06450692564249039,
0.10198681056499481,
0.08651577681303024,
0.10705448687076569,
0.04512255638837814,
-0.11235596984624863,
-0.06125471368432045,
0.05285392701625824,
0.11886214464902878,
-0.02292727865278721,
-0.002029747236520052,
0.09566584974527359,
-0.023103266954421997,
0.022581107914447784,
0.09030848741531372,
0.02752724476158619,
0.18029075860977173,
0.04536215215921402,
0.13428856432437897,
0.08979280292987823,
0.05752971023321152,
0.01326517853885889,
0.018752997741103172,
0.018830908462405205,
0.03053511306643486,
-0.020200124010443687,
-0.08164151757955551,
-0.009718762710690498,
0.13676214218139648,
0.023310136049985886,
0.03856613487005234,
0.0037885159254074097,
-0.04802961274981499,
0.0742470771074295,
0.17449264228343964,
0.016129810363054276,
-0.22968627512454987,
-0.06662201136350632,
0.07476157695055008,
-0.07414892315864563,
-0.11967059224843979,
-0.018175605684518814,
0.028066670522093773,
-0.18243610858917236,
0.03899575024843216,
-0.024058742448687553,
0.1009836345911026,
-0.12115946412086487,
-0.02023388259112835,
0.03995189815759659,
0.059546250849962234,
-0.031342994421720505,
0.07075176388025284,
-0.19329877197742462,
0.1349882036447525,
0.008857686072587967,
0.06893958151340485,
-0.10071657598018646,
0.0806516632437706,
0.01673419401049614,
0.0031026999931782484,
0.1628592163324356,
-0.004056477919220924,
-0.06180410459637642,
-0.10156970471143723,
-0.085969477891922,
-0.01386621780693531,
0.0969386100769043,
-0.12537075579166412,
0.09289414435625076,
-0.005195782519876957,
-0.03387987241148949,
-0.002357541350647807,
-0.13537321984767914,
-0.13680528104305267,
-0.17812253534793854,
0.045711059123277664,
-0.12590152025222778,
0.04698599874973297,
-0.10713580250740051,
-0.05455242469906807,
-0.039548274129629135,
0.19008749723434448,
-0.21812191605567932,
-0.08449868857860565,
-0.14965391159057617,
-0.0647575631737709,
0.11447085440158844,
-0.04327482357621193,
0.0829252079129219,
0.011816042475402355,
0.19854897260665894,
-0.0006443834863603115,
-0.003518373239785433,
0.09804367274045944,
-0.09864502400159836,
-0.20901942253112793,
-0.10028961300849915,
0.1335003674030304,
0.13788676261901855,
0.039611831307411194,
0.004660561680793762,
0.023397788405418396,
-0.0015464697498828173,
-0.11214857548475266,
0.03287322819232941,
0.15810757875442505,
0.10996973514556885,
0.0365285724401474,
-0.025456419214606285,
-0.134415864944458,
-0.10064711421728134,
-0.04832622781395912,
0.007826434448361397,
0.1952485889196396,
-0.06921055167913437,
0.16170531511306763,
0.1607891321182251,
-0.06080915778875351,
-0.20997200906276703,
0.031182322651147842,
0.03326820209622383,
-0.0014625083422288299,
0.047811076045036316,
-0.20409850776195526,
0.07823505252599716,
0.017151953652501106,
-0.057960692793130875,
0.13218380510807037,
-0.18051816523075104,
-0.14781798422336578,
0.09219598770141602,
0.07739659398794174,
-0.19650183618068695,
-0.1307976245880127,
-0.09491356462240219,
-0.048903729766607285,
-0.0986228659749031,
0.09035423398017883,
-0.008246852084994316,
0.005571926478296518,
0.03216136619448662,
0.01862735114991665,
0.015546685084700584,
-0.0485796183347702,
0.19612248241901398,
-0.0004862592031713575,
0.051562365144491196,
-0.07412069290876389,
-0.07256710529327393,
0.03558439388871193,
-0.07121629267930984,
0.08863470703363419,
-0.01741286925971508,
0.006189014296978712,
-0.11463242024183273,
-0.06591381877660751,
-0.04193699732422829,
0.03264250233769417,
-0.08610426634550095,
-0.09879012405872345,
-0.04721202328801155,
0.10410426557064056,
0.09013193845748901,
-0.03732464835047722,
-0.06628967076539993,
-0.08886818587779999,
0.04777882248163223,
0.21941284835338593,
0.18009066581726074,
0.07462029904127121,
-0.07408244162797928,
-0.006450060289353132,
-0.02119174227118492,
0.06165141984820366,
-0.20728842914104462,
0.04867798089981079,
0.0371476411819458,
0.03147519752383232,
0.1294110119342804,
-0.026139631867408752,
-0.16417503356933594,
-0.05014671012759209,
0.05455317720770836,
-0.07321128249168396,
-0.15782763063907623,
0.007312591653317213,
0.08040127158164978,
-0.15300150215625763,
-0.0399666391313076,
0.04038683697581291,
-0.03046327270567417,
-0.031874220818281174,
0.002157751005142927,
0.08237545937299728,
0.020008308812975883,
0.10707804560661316,
0.06407194584608078,
0.10628766566514969,
-0.10405796021223068,
0.07105699181556702,
0.08430515974760056,
-0.10969457775354385,
0.03989127278327942,
0.057201582938432693,
-0.06448391079902649,
-0.035053741186857224,
0.03124030865728855,
0.08985407650470734,
0.025547293946146965,
-0.07282499223947525,
0.004657800309360027,
-0.11015290766954422,
0.06599664688110352,
0.1322651207447052,
0.04004150256514549,
0.0137094771489501,
0.042923275381326675,
0.033320680260658264,
-0.10062410682439804,
0.11902371048927307,
0.05028797686100006,
0.03921814635396004,
-0.055055614560842514,
-0.02014131471514702,
0.040924906730651855,
-0.020380757749080658,
-0.0170382559299469,
-0.038348399102687836,
-0.07091455906629562,
-0.012284004129469395,
-0.1709912270307541,
0.021539563313126564,
-0.0643090084195137,
0.01184446457773447,
0.016018250957131386,
-0.028759492561221123,
0.0066799623891711235,
0.013610909692943096,
-0.07205694168806076,
-0.04707391932606697,
-0.0035674909595400095,
0.10853823274374008,
-0.1683083474636078,
0.006899724714457989,
0.0807538852095604,
-0.12974224984645844,
0.08663125336170197,
0.0028446076903492212,
-0.0010952386073768139,
0.01968505047261715,
-0.1273455023765564,
0.06220780685544014,
-0.005134644918143749,
0.0054603819735348225,
0.03317052498459816,
-0.21593356132507324,
0.0022072684951126575,
-0.0508023202419281,
-0.06484345346689224,
-0.0030691337306052446,
-0.03311242535710335,
-0.11482230573892593,
0.10371197760105133,
0.01689654216170311,
-0.0779203251004219,
-0.021994326263666153,
0.04691578447818756,
0.10877161473035812,
-0.04842940717935562,
0.14637391269207,
-0.017828896641731262,
0.059009574353694916,
-0.18030768632888794,
-0.021076207980513573,
-0.017547423020005226,
0.01863904669880867,
-0.03460176661610603,
-0.004361789207905531,
0.05265168845653534,
-0.018918506801128387,
0.22060424089431763,
-0.02269667573273182,
0.028595924377441406,
0.06291016191244125,
-0.00589276198297739,
-0.013385145924985409,
0.09999861568212509,
0.045825641602277756,
0.01133752427995205,
0.023678036406636238,
0.006810458842664957,
-0.04093582555651665,
-0.006517220288515091,
-0.1356368511915207,
0.07505546510219574,
0.16590814292430878,
0.08395548164844513,
-0.005548030138015747,
0.048898518085479736,
-0.11000822484493256,
-0.10603458434343338,
0.09506268054246902,
-0.034125830978155136,
-0.017605112865567207,
-0.0482698529958725,
0.13643765449523926,
0.1578799933195114,
-0.19135646522045135,
0.06331014633178711,
-0.06746772676706314,
-0.056378982961177826,
-0.10576022416353226,
-0.17683377861976624,
-0.05985268950462341,
-0.03730166703462601,
-0.013985122554004192,
-0.06049531325697899,
0.058749984949827194,
0.10166352242231369,
0.014109558425843716,
0.006751217879354954,
0.08392417430877686,
-0.024542953819036484,
0.0066293105483055115,
0.039478328078985214,
0.06338965147733688,
0.015577137470245361,
-0.05866408720612526,
0.007817541249096394,
0.0013342619640752673,
0.03782322257757187,
0.05102457478642464,
0.030786391347646713,
-0.013074606657028198,
0.0072399284690618515,
-0.022229334339499474,
-0.10115569829940796,
0.04019651189446449,
-0.023845037445425987,
-0.053124140948057175,
0.1545589715242386,
0.024088291451334953,
-0.007400582078844309,
-0.020909536629915237,
0.23240384459495544,
-0.06599795818328857,
-0.07759324461221695,
-0.1394353210926056,
0.15041467547416687,
-0.03572135791182518,
0.05887357145547867,
0.04635220393538475,
-0.1078629195690155,
0.03809856250882149,
0.13536633551120758,
0.1431995928287506,
-0.04122826084494591,
0.011672002263367176,
0.009141024202108383,
0.003373268526047468,
-0.028277069330215454,
0.05200572684407234,
0.05153002589941025,
0.1268187165260315,
-0.06070486456155777,
0.09818972647190094,
-0.006758245639503002,
-0.09154469519853592,
-0.024163398891687393,
0.13355377316474915,
0.002042521024122834,
0.023738635703921318,
-0.08125443011522293,
0.12700043618679047,
-0.05307674780488014,
-0.2563377320766449,
0.07183146476745605,
-0.06298090517520905,
-0.15155023336410522,
-0.018484145402908325,
0.018409613519906998,
-0.004543652758002281,
0.024991953745484352,
0.06298631429672241,
-0.0635342076420784,
0.16134211421012878,
0.03548634424805641,
-0.06381282955408096,
-0.07436368614435196,
0.07651722431182861,
-0.07090579718351364,
0.3068118989467621,
0.006135785020887852,
0.05688599869608879,
0.09663521498441696,
-0.036371853202581406,
-0.1401294320821762,
0.03042936511337757,
0.09035595506429672,
-0.04897838085889816,
0.061034440994262695,
0.2049657553434372,
-0.012143735773861408,
0.11445674300193787,
0.07419917732477188,
-0.08735579252243042,
0.04469425231218338,
-0.0924081802368164,
-0.09559591859579086,
-0.09037184715270996,
0.09341852366924286,
-0.055895816534757614,
0.14971210062503815,
0.1230359748005867,
-0.04733745753765106,
0.009310171008110046,
-0.018910879269242287,
0.05801442265510559,
0.003590804524719715,
0.11238192021846771,
0.02891220524907112,
-0.19292861223220825,
0.030544010922312737,
-0.0002170102234231308,
0.10042719542980194,
-0.2395145148038864,
-0.0887673869729042,
0.042938072234392166,
-0.0011809789575636387,
-0.05844983085989952,
0.12348521500825882,
0.047044940292835236,
0.04414747655391693,
-0.05313320830464363,
-0.043901123106479645,
-0.011250684037804604,
0.16178865730762482,
-0.10214143991470337,
-0.0059404755011200905
] |
null | null | sentence-transformers |
# gapendl/geronimo-base
This is a [sentence-transformers](https://www.SBERT.net) model: it maps sentences and paragraphs to a 768-dimensional dense vector space and can be used for tasks like clustering or semantic search.
<!--- Describe your model here -->
## Usage (Sentence-Transformers)
Using this model becomes easy when you have [sentence-transformers](https://www.SBERT.net) installed:
```
pip install -U sentence-transformers
```
Then you can use the model like this:
```python
from sentence_transformers import SentenceTransformer
sentences = ["This is an example sentence", "Each sentence is converted"]
model = SentenceTransformer('gapendl/geronimo-base')
embeddings = model.encode(sentences)
print(embeddings)
```
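Since the card highlights clustering and semantic search, here is a minimal, hedged sketch of ranking candidate sentences against a query with this model; the query and corpus strings below are illustrative placeholders, not part of the original card:

```python
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer('gapendl/geronimo-base')

# Hypothetical query and corpus, for illustration only
query = "How do I reset my password?"
corpus = [
    "Click 'Forgot password' on the login page.",
    "Our office is open from 9 to 5.",
]

# Encode to tensors so the scores can be computed directly
query_emb = model.encode(query, convert_to_tensor=True)
corpus_emb = model.encode(corpus, convert_to_tensor=True)

# Cosine similarity between the query and each corpus sentence
scores = util.cos_sim(query_emb, corpus_emb)
print(scores)
```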
## Usage (HuggingFace Transformers)
Without [sentence-transformers](https://www.SBERT.net), you can use the model like this: first, pass your input through the transformer model, then apply the right pooling operation on top of the contextualized word embeddings.
```python
from transformers import AutoTokenizer, AutoModel
import torch
#Mean Pooling - Take attention mask into account for correct averaging
def mean_pooling(model_output, attention_mask):
token_embeddings = model_output[0] #First element of model_output contains all token embeddings
input_mask_expanded = attention_mask.unsqueeze(-1).expand(token_embeddings.size()).float()
return torch.sum(token_embeddings * input_mask_expanded, 1) / torch.clamp(input_mask_expanded.sum(1), min=1e-9)
# Sentences we want sentence embeddings for
sentences = ['This is an example sentence', 'Each sentence is converted']
# Load model from HuggingFace Hub
tokenizer = AutoTokenizer.from_pretrained('gapendl/geronimo-base')
model = AutoModel.from_pretrained('gapendl/geronimo-base')
# Tokenize sentences
encoded_input = tokenizer(sentences, padding=True, truncation=True, return_tensors='pt')
# Compute token embeddings
with torch.no_grad():
model_output = model(**encoded_input)
# Perform pooling. In this case, mean pooling.
sentence_embeddings = mean_pooling(model_output, encoded_input['attention_mask'])
print("Sentence embeddings:")
print(sentence_embeddings)
```
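If the embeddings are used for cosine-similarity search, a common optional extra step, not shown in the original card, is to L2-normalize them so that dot products equal cosine similarities. Continuing the snippet above:

```python
import torch.nn.functional as F

# Optional: L2-normalize the pooled embeddings (assumes `sentence_embeddings` from the snippet above)
sentence_embeddings = F.normalize(sentence_embeddings, p=2, dim=1)
print(sentence_embeddings)
```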
## Evaluation Results
<!--- Describe how your model was evaluated -->
For an automated evaluation of this model, see the *Sentence Embeddings Benchmark*: [https://seb.sbert.net](https://seb.sbert.net?model_name=gapendl/geronimo-base)
## Training
The model was trained with the parameters:
**DataLoader**:
`torch.utils.data.dataloader.DataLoader` of length 2062 with parameters:
```
{'batch_size': 48, 'sampler': 'torch.utils.data.sampler.RandomSampler', 'batch_sampler': 'torch.utils.data.sampler.BatchSampler'}
```
**Loss**:
`sentence_transformers.losses.MultipleNegativesRankingLoss.MultipleNegativesRankingLoss` with parameters:
```
{'scale': 20.0, 'similarity_fct': 'cos_sim'}
```
Parameters of the fit()-Method:
```
{
"epochs": 2,
"evaluation_steps": 0,
"evaluator": "NoneType",
"max_grad_norm": 1,
"optimizer_class": "<class 'torch.optim.adamw.AdamW'>",
"optimizer_params": {
"lr": 2e-05
},
"scheduler": "WarmupLinear",
"steps_per_epoch": null,
"warmup_steps": 412,
"weight_decay": 0.01
}
```
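For readers who want to reproduce a comparable setup, the following is a hedged sketch of how these parameters map onto the sentence-transformers `fit()` API. The base checkpoint and the `InputExample` pairs are placeholders; the actual training data and starting model are not documented in this card:

```python
from torch.utils.data import DataLoader
from sentence_transformers import SentenceTransformer, InputExample, losses

# Hypothetical (anchor, positive) pairs; the real training set is not described here
train_examples = [
    InputExample(texts=["an anchor sentence", "a matching positive sentence"]),
    InputExample(texts=["another anchor", "its positive"]),
]

model = SentenceTransformer("bert-base-uncased")  # placeholder base checkpoint

train_dataloader = DataLoader(train_examples, shuffle=True, batch_size=48)
train_loss = losses.MultipleNegativesRankingLoss(model, scale=20.0)

model.fit(
    train_objectives=[(train_dataloader, train_loss)],
    epochs=2,
    scheduler="WarmupLinear",
    warmup_steps=412,
    optimizer_params={"lr": 2e-05},
    weight_decay=0.01,
    max_grad_norm=1,
)
```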
## Full Model Architecture
```
SentenceTransformer(
(0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: BertModel
(1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False})
)
```
## Citing & Authors
<!--- Describe where people can find more information --> | {"library_name": "sentence-transformers", "tags": ["sentence-transformers", "feature-extraction", "sentence-similarity", "transformers"], "pipeline_tag": "sentence-similarity"} | sentence-similarity | gapendl/geronimo-base | [
"sentence-transformers",
"safetensors",
"bert",
"feature-extraction",
"sentence-similarity",
"transformers",
"endpoints_compatible",
"region:us"
] | 2024-02-14T15:44:53+00:00 | [] | [] | TAGS
#sentence-transformers #safetensors #bert #feature-extraction #sentence-similarity #transformers #endpoints_compatible #region-us
|
# gapendl/geronimo-base
This is a sentence-transformers model: It maps sentences & paragraphs to a 768 dimensional dense vector space and can be used for tasks like clustering or semantic search.
## Usage (Sentence-Transformers)
Using this model becomes easy when you have sentence-transformers installed:
Then you can use the model like this:
## Usage (HuggingFace Transformers)
Without sentence-transformers, you can use the model like this: First, you pass your input through the transformer model, then you have to apply the right pooling-operation on-top of the contextualized word embeddings.
## Evaluation Results
For an automated evaluation of this model, see the *Sentence Embeddings Benchmark*: URL
## Training
The model was trained with the parameters:
DataLoader:
'URL.dataloader.DataLoader' of length 2062 with parameters:
Loss:
'sentence_transformers.losses.MultipleNegativesRankingLoss.MultipleNegativesRankingLoss' with parameters:
Parameters of the fit()-Method:
## Full Model Architecture
## Citing & Authors
| [
"# gapendl/geronimo-base\n\nThis is a sentence-transformers model: It maps sentences & paragraphs to a 768 dimensional dense vector space and can be used for tasks like clustering or semantic search.",
"## Usage (Sentence-Transformers)\n\nUsing this model becomes easy when you have sentence-transformers installed:\n\n\n\nThen you can use the model like this:",
"## Usage (HuggingFace Transformers)\nWithout sentence-transformers, you can use the model like this: First, you pass your input through the transformer model, then you have to apply the right pooling-operation on-top of the contextualized word embeddings.",
"## Evaluation Results\n\n\n\nFor an automated evaluation of this model, see the *Sentence Embeddings Benchmark*: URL",
"## Training\nThe model was trained with the parameters:\n\nDataLoader:\n\n'URL.dataloader.DataLoader' of length 2062 with parameters:\n\n\nLoss:\n\n'sentence_transformers.losses.MultipleNegativesRankingLoss.MultipleNegativesRankingLoss' with parameters:\n \n\nParameters of the fit()-Method:",
"## Full Model Architecture",
"## Citing & Authors"
] | [
"TAGS\n#sentence-transformers #safetensors #bert #feature-extraction #sentence-similarity #transformers #endpoints_compatible #region-us \n",
"# gapendl/geronimo-base\n\nThis is a sentence-transformers model: It maps sentences & paragraphs to a 768 dimensional dense vector space and can be used for tasks like clustering or semantic search.",
"## Usage (Sentence-Transformers)\n\nUsing this model becomes easy when you have sentence-transformers installed:\n\n\n\nThen you can use the model like this:",
"## Usage (HuggingFace Transformers)\nWithout sentence-transformers, you can use the model like this: First, you pass your input through the transformer model, then you have to apply the right pooling-operation on-top of the contextualized word embeddings.",
"## Evaluation Results\n\n\n\nFor an automated evaluation of this model, see the *Sentence Embeddings Benchmark*: URL",
"## Training\nThe model was trained with the parameters:\n\nDataLoader:\n\n'URL.dataloader.DataLoader' of length 2062 with parameters:\n\n\nLoss:\n\n'sentence_transformers.losses.MultipleNegativesRankingLoss.MultipleNegativesRankingLoss' with parameters:\n \n\nParameters of the fit()-Method:",
"## Full Model Architecture",
"## Citing & Authors"
] | [
43,
53,
38,
64,
29,
86,
5,
6
] | [
"passage: TAGS\n#sentence-transformers #safetensors #bert #feature-extraction #sentence-similarity #transformers #endpoints_compatible #region-us \n# gapendl/geronimo-base\n\nThis is a sentence-transformers model: It maps sentences & paragraphs to a 768 dimensional dense vector space and can be used for tasks like clustering or semantic search.## Usage (Sentence-Transformers)\n\nUsing this model becomes easy when you have sentence-transformers installed:\n\n\n\nThen you can use the model like this:## Usage (HuggingFace Transformers)\nWithout sentence-transformers, you can use the model like this: First, you pass your input through the transformer model, then you have to apply the right pooling-operation on-top of the contextualized word embeddings.## Evaluation Results\n\n\n\nFor an automated evaluation of this model, see the *Sentence Embeddings Benchmark*: URL## Training\nThe model was trained with the parameters:\n\nDataLoader:\n\n'URL.dataloader.DataLoader' of length 2062 with parameters:\n\n\nLoss:\n\n'sentence_transformers.losses.MultipleNegativesRankingLoss.MultipleNegativesRankingLoss' with parameters:\n \n\nParameters of the fit()-Method:## Full Model Architecture## Citing & Authors"
] | [
-0.014033368788659573,
0.11551492661237717,
-0.00874051358550787,
0.04282158985733986,
0.11133800446987152,
0.02173498086631298,
0.16817434132099152,
0.08482642471790314,
-0.015068081207573414,
0.08676004409790039,
0.0184947457164526,
0.0943850502371788,
-0.009349246509373188,
0.011908608488738537,
0.03642291948199272,
-0.2688949704170227,
0.04739714041352272,
-0.07376070320606232,
0.0022697155363857746,
0.056984689086675644,
0.10875824093818665,
-0.07303411513566971,
0.04970161244273186,
-0.0105900252237916,
-0.060274433344602585,
0.03246834874153137,
-0.02961156517267227,
-0.02833077684044838,
0.07728154957294464,
0.06447198241949081,
0.05257634073495865,
0.0011356326285749674,
0.004545534495264292,
-0.2104051262140274,
0.01690405234694481,
0.0902571827173233,
-0.029400033876299858,
0.04784727841615677,
0.01236684899777174,
-0.05736839771270752,
0.1067781075835228,
-0.11324232816696167,
0.07104623317718506,
0.05205308645963669,
-0.13268131017684937,
-0.05998358875513077,
-0.03499117121100426,
0.010111267678439617,
0.09517550468444824,
0.08684457838535309,
-0.05400849133729935,
0.11457179486751556,
-0.03707040473818779,
0.09183678030967712,
0.14173667132854462,
-0.267341285943985,
-0.04374869167804718,
0.016859231516718864,
0.05056631192564964,
0.04195588454604149,
-0.12069246172904968,
0.025140410289168358,
-0.028068814426660538,
0.0386938713490963,
0.05908722057938576,
-0.022508421912789345,
0.05513002723455429,
-0.0008811819716356695,
-0.0977221131324768,
0.018084345385432243,
0.14757002890110016,
0.0255007091909647,
-0.020262086763978004,
-0.18677042424678802,
-0.08603498339653015,
0.11297842860221863,
-0.03756857290863991,
-0.0177605003118515,
0.043286848813295364,
0.05929279327392578,
-0.037769511342048645,
-0.11710187792778015,
-0.0735463947057724,
-0.00825018621981144,
-0.061509206891059875,
0.04385053366422653,
-0.002555529121309519,
-0.051838263869285583,
-0.011272561736404896,
0.05992735177278519,
-0.024536512792110443,
-0.11536846309900284,
-0.024256229400634766,
-0.04251890257000923,
-0.11046701669692993,
-0.004021305125206709,
-0.06047946214675903,
-0.0967273935675621,
0.05192233622074127,
0.16030848026275635,
0.06159156560897827,
0.03656507283449173,
-0.061628032475709915,
0.04793547838926315,
0.01804475672543049,
0.13537228107452393,
-0.037567876279354095,
-0.07831926643848419,
-0.013443411327898502,
0.011839560233056545,
0.01772608794271946,
-0.029764529317617416,
-0.04448697343468666,
0.0022811254020780325,
0.016677631065249443,
0.0673951730132103,
0.06655854731798172,
0.06838241964578629,
-0.028715213760733604,
-0.0457354411482811,
0.03188444301486015,
-0.1321244090795517,
0.035223979502916336,
0.03386959061026573,
-0.012825566343963146,
0.048441991209983826,
0.08190697431564331,
-0.02426792122423649,
-0.0784766748547554,
0.0003780247934628278,
-0.08621445298194885,
-0.01838058792054653,
-0.05308463051915169,
-0.12564532458782196,
-0.00457532424479723,
0.01697869412600994,
-0.061430856585502625,
-0.08876759558916092,
-0.1330403983592987,
-0.06650712341070175,
0.04832863435149193,
-0.04852776974439621,
0.005891704931855202,
-0.11386486142873764,
-0.006840068381279707,
0.010673180222511292,
0.012641272507607937,
-0.05870551988482475,
0.010657805018126965,
0.009160800836980343,
-0.05261889472603798,
0.054273445159196854,
0.05672868341207504,
0.032684724777936935,
-0.0984664186835289,
0.022063754498958588,
-0.17365524172782898,
0.16755364835262299,
-0.04995156079530716,
0.06792214512825012,
-0.10352792590856552,
0.04518508166074753,
-0.002026204252615571,
0.06033754721283913,
-0.0003999444888904691,
0.133629709482193,
-0.190458744764328,
-0.07354387640953064,
0.1851062923669815,
-0.07241939008235931,
-0.08984045684337616,
0.09542608261108398,
-0.04994553327560425,
0.10955013334751129,
0.13872472941875458,
0.12808239459991455,
0.07852613925933838,
-0.07108281552791595,
0.0038940857630223036,
0.015112265944480896,
-0.04125030338764191,
0.15243449807167053,
0.03748257830739021,
-0.07369150221347809,
0.09439654648303986,
0.0018749242881312966,
-0.04833212122321129,
0.01210679579526186,
0.015074475668370724,
-0.04066435247659683,
0.007531808689236641,
-0.03066398948431015,
0.04514171555638313,
-0.050017099827528,
0.010945036076009274,
-0.0009987718658521771,
-0.11835546791553497,
0.12677066028118134,
0.05616112798452377,
-0.09290847927331924,
0.04115242138504982,
-0.06977871805429459,
-0.01108553446829319,
0.0005010500317439437,
0.012760662473738194,
-0.18738602101802826,
-0.15099097788333893,
0.016707153990864754,
-0.022990772500634193,
0.11058586090803146,
0.00714408652856946,
0.059949856251478195,
0.054647743701934814,
-0.04477206990122795,
0.00042314871097914875,
0.03756112605333328,
0.012229855172336102,
-0.0649494156241417,
-0.14832140505313873,
0.0007152793114073575,
-0.05300605297088623,
0.05225873738527298,
-0.10961233079433441,
0.03484339267015457,
0.012922362424433231,
0.11002868413925171,
0.04968860372900963,
-0.028470555320382118,
-0.0049247234128415585,
-0.03981512412428856,
-0.008939719758927822,
-0.05064406991004944,
0.049386315047740936,
0.024928206577897072,
-0.12727180123329163,
0.08357265591621399,
-0.18628881871700287,
-0.11635246127843857,
0.07557450234889984,
-0.0177115760743618,
-0.05330127477645874,
-0.02818768285214901,
-0.014835089445114136,
-0.004355068784207106,
-0.058547619730234146,
-0.07709658145904541,
0.16997435688972473,
0.08668054640293121,
0.09997807443141937,
-0.04281093552708626,
-0.04017869755625725,
-0.05194297805428505,
-0.028787359595298767,
-0.043014947324991226,
0.08335988968610764,
-0.06189339607954025,
-0.16373629868030548,
0.07056848704814911,
0.07718221843242645,
-0.04736233502626419,
0.14200793206691742,
-0.01516545470803976,
-0.04515929892659187,
-0.05903206765651703,
0.028681857511401176,
0.033891260623931885,
-0.005139511544257402,
-0.08065561205148697,
0.005701624322682619,
0.037350933998823166,
0.02240527793765068,
0.025834400206804276,
-0.04758942127227783,
0.04644009470939636,
0.037309885025024414,
-0.0005046554142609239,
0.10676124691963196,
0.0089260870590806,
0.004305635113269091,
0.05435710772871971,
0.013462980277836323,
0.05939951166510582,
-0.020541507750749588,
-0.05046772584319115,
-0.10639940202236176,
0.15062682330608368,
-0.1329488605260849,
-0.19635266065597534,
-0.12313838303089142,
-0.016119282692670822,
-0.07300936430692673,
0.01587577722966671,
0.08717303723096848,
-0.056245554238557816,
-0.05199018865823746,
-0.08112446218729019,
0.07754051685333252,
0.07709873467683792,
-0.06394493579864502,
0.028889799490571022,
0.04947945103049278,
0.022089850157499313,
-0.13384990394115448,
-0.012441635131835938,
0.007497800514101982,
-0.05963510647416115,
-0.036000240594148636,
-0.019201762974262238,
0.0425882413983345,
0.09513486921787262,
0.08368992805480957,
0.002416192553937435,
-0.003650487633422017,
0.22291025519371033,
-0.0702299103140831,
0.06448951363563538,
0.11802032589912415,
-0.013183416798710823,
0.07279455661773682,
0.09453156590461731,
0.02884686551988125,
-0.0660276859998703,
0.044159214943647385,
0.08947282284498215,
-0.008969804272055626,
-0.17011958360671997,
-0.08596213161945343,
-0.06543731689453125,
-0.02589724399149418,
0.11237616091966629,
0.05518629029393196,
0.0066955131478607655,
0.02982960268855095,
-0.03975744917988777,
0.001827368512749672,
0.11051268130540848,
0.1210278645157814,
0.12282675504684448,
-0.024891434237360954,
0.0925082340836525,
-0.055389970541000366,
-0.06351743638515472,
0.04918927699327469,
-0.020766912028193474,
0.14405710995197296,
0.03335750848054886,
0.15486080944538116,
0.07117773592472076,
-0.044096291065216064,
-0.01686926744878292,
0.0726465955376625,
-0.03451508283615112,
0.026092534884810448,
-0.03609956055879593,
-0.10866402834653854,
0.004858081229031086,
0.06507053226232529,
0.0784178376197815,
-0.04134547337889671,
-0.022869884967803955,
0.04165012016892433,
0.15205131471157074,
0.1472829431295395,
0.04460780322551727,
-0.1831207126379013,
-0.03608408197760582,
0.05129357427358627,
-0.07132995128631592,
-0.06815728545188904,
-0.001494657015427947,
0.04686778411269188,
-0.11777766048908234,
0.06167043000459671,
-0.02263312041759491,
0.10345520824193954,
-0.055303361266851425,
0.03495904430747032,
-0.04704377055168152,
0.05565405264496803,
-0.009059978649020195,
0.06585567444562912,
-0.21034003794193268,
0.09295720607042313,
0.02954486943781376,
0.0727073922753334,
-0.047921210527420044,
0.03081473894417286,
0.07090285420417786,
0.016101639717817307,
0.17941994965076447,
-0.019396180287003517,
-0.046993859112262726,
0.052626173943281174,
-0.05069195479154587,
-0.0012817978858947754,
0.06887856125831604,
-0.12332404404878616,
0.08353807777166367,
-0.0436515286564827,
-0.036273080855607986,
-0.0012842098949477077,
0.06379285454750061,
-0.09742522239685059,
-0.1895618736743927,
0.0030064815655350685,
-0.008663170970976353,
0.01238517090678215,
-0.025736842304468155,
-0.01334336306899786,
0.007515360601246357,
0.20276285707950592,
-0.10151505470275879,
-0.06165940687060356,
-0.12209894508123398,
-0.02896772511303425,
0.10046659409999847,
-0.0818227231502533,
0.004183139652013779,
-0.019386274740099907,
0.1599716991186142,
-0.07304389774799347,
-0.08480207622051239,
0.0704839825630188,
-0.045058850198984146,
-0.08161579072475433,
-0.040796373039484024,
0.10906026512384415,
0.052732646465301514,
0.024134790524840355,
0.031027425080537796,
0.07250244170427322,
-0.007958662696182728,
-0.09776991605758667,
-0.04135648533701897,
0.13746798038482666,
-0.012989290058612823,
0.0644494816660881,
-0.13844972848892212,
-0.021031411364674568,
-0.09729254245758057,
0.0455281138420105,
0.2087484747171402,
0.2386002242565155,
-0.06609796732664108,
0.09978217631578445,
0.17317263782024384,
-0.12278016656637192,
-0.2056923806667328,
-0.08334026485681534,
0.005980812478810549,
0.02590429224073887,
0.0577792152762413,
-0.15853041410446167,
0.0894986242055893,
0.038639821112155914,
-0.002134536160156131,
-0.08965376019477844,
-0.21976278722286224,
-0.14555615186691284,
0.11145094782114029,
0.010398747399449348,
-0.008850783109664917,
-0.10857890546321869,
-0.06093740090727806,
-0.08831513673067093,
-0.013026602566242218,
0.11424317210912704,
-0.10549307614564896,
0.11127042770385742,
0.05858602374792099,
-0.018630990758538246,
0.04331345111131668,
-0.008639111183583736,
0.12144847214221954,
0.05896512418985367,
0.044510677456855774,
-0.04395584389567375,
-0.05272570624947548,
0.12112917006015778,
-0.08927988260984421,
0.10352656990289688,
-0.05359148979187012,
0.03545335680246353,
-0.06780505925416946,
-0.03736206889152527,
-0.04744986072182655,
0.03195877745747566,
-0.05285118147730827,
-0.04430749639868736,
-0.011757406406104565,
0.05961059406399727,
0.1219174712896347,
-0.0052728476002812386,
0.08221612125635147,
-0.06928984075784683,
0.06287337094545364,
0.15458442270755768,
0.08383535593748093,
0.06751861423254013,
-0.15821294486522675,
0.007246577180922031,
-0.006236158311367035,
0.05172732472419739,
-0.10517850518226624,
0.0878119021654129,
0.05302279442548752,
-0.0021769970189779997,
0.1491754651069641,
0.030915774405002594,
-0.09720969945192337,
-0.019415609538555145,
0.033678457140922546,
-0.11162136495113373,
-0.15047571063041687,
-0.03656819462776184,
-0.029502548277378082,
-0.09687145054340363,
-0.045088671147823334,
0.1697549670934677,
-0.0011476019863039255,
0.0019267287570983171,
0.03312881290912628,
0.03718703240156174,
-0.030015313997864723,
0.07527875900268555,
0.013307451270520687,
0.03854944184422493,
-0.048488274216651917,
0.1254071295261383,
0.07924538850784302,
-0.08377689868211746,
0.04568998143076897,
0.1581326276063919,
-0.06691848486661911,
-0.0780387669801712,
-0.04634099453687668,
0.16876336932182312,
-0.03692464157938957,
0.033152975142002106,
-0.056214720010757446,
-0.06357265263795853,
0.018107647076249123,
0.08048423379659653,
0.032196756452322006,
0.0655234307050705,
-0.0888148620724678,
0.005434762686491013,
-0.07051846385002136,
0.09059648215770721,
0.059376537799835205,
0.009103153832256794,
-0.043364960700273514,
0.07791272550821304,
-0.00072581967106089,
-0.018227530643343925,
-0.029071824625134468,
-0.049085553735494614,
-0.10893135517835617,
-0.00846013892441988,
-0.06401371210813522,
0.004268004093319178,
-0.09676717221736908,
-0.008874395862221718,
0.02464163303375244,
0.031474098563194275,
-0.0016524832462891936,
-0.005085066892206669,
-0.03807048872113228,
-0.07362942397594452,
-0.03666575625538826,
0.09585247188806534,
-0.16121231019496918,
-0.0025743539445102215,
0.02921951375901699,
-0.10665731132030487,
0.08583628386259079,
0.009780379943549633,
-0.04188616946339607,
0.030928315594792366,
-0.0878363773226738,
-0.0453735888004303,
-0.0025238608941435814,
0.009670591913163662,
0.03599671646952629,
-0.10744690895080566,
0.004103309474885464,
-0.05226026847958565,
0.035495322197675705,
0.009291010908782482,
0.058202892541885376,
-0.09649475663900375,
0.043246522545814514,
0.004895892459899187,
-0.03313262388110161,
-0.07947514206171036,
0.012170914560556412,
0.028954215347766876,
0.022763680666685104,
0.12803970277309418,
-0.07402771711349487,
0.08523698896169662,
-0.1314227283000946,
0.0031270894687622786,
0.030913587659597397,
-0.04754163697361946,
0.10010358691215515,
-0.09973654896020889,
0.05830777809023857,
-0.04829738661646843,
0.10161976516246796,
-0.04534050449728966,
0.026928037405014038,
0.07337585091590881,
0.007101756986230612,
-0.05275857076048851,
0.03560081496834755,
0.06879556179046631,
0.017929550260305405,
-0.0018290313892066479,
-0.051971424371004105,
-0.009776758961379528,
0.016000261530280113,
-0.005730141419917345,
0.07482652366161346,
0.12896883487701416,
0.05580192431807518,
0.07863041013479233,
0.09449707716703415,
0.0062693823128938675,
-0.0885310098528862,
0.0565384179353714,
0.007327998988330364,
0.05262937769293785,
-0.0741364136338234,
0.0011563936714082956,
0.135513573884964,
-0.13812999427318573,
0.1105484887957573,
0.012364293448626995,
-0.05925048887729645,
-0.09772931039333344,
-0.13938088715076447,
-0.07014497369527817,
-0.028598716482520103,
-0.0020291891414672136,
-0.12782661616802216,
0.010561410337686539,
0.00608484260737896,
0.014429643750190735,
0.006548967678099871,
0.13309834897518158,
-0.08018483966588974,
-0.0903213620185852,
0.08530554920434952,
-0.015477394685149193,
0.04477991908788681,
0.023946160450577736,
0.03053608536720276,
0.013658066280186176,
0.08097173273563385,
0.02627088874578476,
0.07177616655826569,
0.05724722892045975,
0.02148088440299034,
-0.09369009733200073,
-0.08231958746910095,
0.006483383011072874,
-0.00003592321809264831,
-0.053455933928489685,
0.09613277018070221,
0.04552348330616951,
-0.08875814080238342,
-0.009513067081570625,
0.2249070107936859,
-0.09768085926771164,
-0.11924154311418533,
-0.17558732628822327,
0.1593829244375229,
0.027974596247076988,
0.043107107281684875,
-0.03628304600715637,
-0.08965994417667389,
-0.019644683226943016,
0.15006396174430847,
0.1955193430185318,
-0.07397591322660446,
0.02815422974526882,
0.052032291889190674,
0.018060438334941864,
0.02852286770939827,
0.028297532349824905,
0.03173971176147461,
0.1800754964351654,
-0.04168060049414635,
0.09981363266706467,
-0.010511806234717369,
-0.07042170315980911,
-0.09159491211175919,
0.119674913585186,
0.0018663862720131874,
0.029479267075657845,
-0.02437606453895569,
0.10768408328294754,
-0.0574357844889164,
-0.1399495154619217,
-0.03687101975083351,
-0.08960910141468048,
-0.11180917918682098,
-0.03042447566986084,
0.05596036836504936,
0.029959674924612045,
0.08268731087446213,
0.040647879242897034,
-0.04238084703683853,
0.1396845281124115,
-0.0028720456175506115,
-0.041860613971948624,
-0.02092147432267666,
0.0313563235104084,
-0.06379552185535431,
0.15301068127155304,
0.004668009001761675,
-0.03098463825881481,
0.12201996892690659,
-0.00468303170055151,
-0.060743093490600586,
0.061004992574453354,
0.03637336567044258,
-0.06627524644136429,
0.09082029014825821,
0.09349151700735092,
-0.025850187987089157,
0.10637351870536804,
0.07463786751031876,
-0.16435027122497559,
0.06255944818258286,
0.02792445197701454,
-0.054056476801633835,
-0.06320115923881531,
0.0507163479924202,
-0.09556815773248672,
0.10387828201055527,
0.16535067558288574,
-0.022473620250821114,
-0.008481433615088463,
0.0012160774786025286,
0.007792350836098194,
0.02485770918428898,
0.04073561728000641,
-0.05996837839484215,
-0.11531368643045425,
0.0029205360915511847,
0.03800121322274208,
0.03932442516088486,
-0.2727819085121155,
-0.13039784133434296,
0.03219737112522125,
-0.0033867654856294394,
-0.043574485927820206,
0.117266945540905,
0.09538436681032181,
0.009186896495521069,
-0.029426729306578636,
-0.22291377186775208,
0.024089716374874115,
0.10755554586648941,
-0.11231602728366852,
-0.07420959323644638
] |
null | null | null | --inbrowser Automatically open the url in browser, if --share is used, the public url will be automatically open instead
--server_port Choose a specific server port, default=7860 (example --server_port 420 so the local url will be: http://127.0.0.1:420)
--share Creates a public URL
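As a hedged illustration only: assuming these flags are passed straight to the Gradio entry point (app.py, per the metadata below), a launch could look like the line below. The exact launcher script is not specified in this card.

```
python app.py --server_port 420 --share --inbrowser
```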
---
title: Stable Cascade
emoji: 👁
colorFrom: blue
colorTo: purple
sdk: gradio
sdk_version: 4.18.0
app_file: app.py
pinned: false
license: mit
hf_oauth: true
---
Check out the configuration reference at https://huggingface.co/docs/hub/spaces-config-reference
| {} | null | Aitrepreneur/Stable-Cascade | [
"region:us"
] | 2024-02-14T15:46:59+00:00 | [] | [] | TAGS
#region-us
| --inbrowser Automatically open the url in browser, if --share is used, the public url will be automatically open instead
--server_port Choose a specific server port, default=7860 (example --server_port 420 so the local url will be: http://127.0.0.1:420)
--share Creates a public URL
---
title: Stable Cascade
emoji:
colorFrom: blue
colorTo: purple
sdk: gradio
sdk_version: 4.18.0
app_file: URL
pinned: false
license: mit
hf_oauth: true
---
Check out the configuration reference at URL
| [] | [
"TAGS\n#region-us \n"
] | [
6
] | [
"passage: TAGS\n#region-us \n"
] | [
0.024608636274933815,
-0.026205500587821007,
-0.009666500613093376,
-0.10395516455173492,
0.08638657629489899,
0.059816278517246246,
0.01882290467619896,
0.020661840215325356,
0.23975107073783875,
-0.005599027033895254,
0.1219947561621666,
0.0015615287702530622,
-0.037353623658418655,
0.03733762726187706,
-0.0035912662278860807,
-0.17583473026752472,
0.03876631706953049,
-0.018274923786520958,
0.01843859627842903,
0.026470553129911423,
-0.07776834815740585,
-0.07564429938793182,
0.015296397730708122,
-0.10247814655303955,
-0.083692267537117,
0.11002834886312485,
0.031466204673051834,
-0.019670886918902397,
0.10779199749231339,
-0.04243955761194229,
0.18699054419994354,
-0.011512263678014278,
-0.11213519424200058,
-0.2536850869655609,
0.021806683391332626,
-0.01765260472893715,
-0.08747660368680954,
0.01506110467016697,
0.0665089413523674,
-0.09014441072940826,
-0.0588928684592247,
0.0795099288225174,
-0.01132340170443058,
0.04246443510055542,
-0.27593839168548584,
-0.12684126198291779,
-0.05297930911183357,
-0.1421966552734375,
0.08651168644428253,
0.04035491496324539,
0.008764253929257393,
0.15506891906261444,
-0.20897391438484192,
0.004104613792151213,
0.08255259692668915,
-0.2538507878780365,
0.05591634660959244,
0.17671173810958862,
0.03623908758163452,
0.18037272989749908,
0.0060391901060938835,
0.11029672622680664,
0.0716743916273117,
-0.024263937026262283,
-0.17590197920799255,
-0.08127854019403458,
-0.04696211963891983,
0.16642488539218903,
-0.06727185100317001,
-0.14248386025428772,
0.34701237082481384,
0.00015008423360995948,
0.009657775051891804,
0.16921205818653107,
-0.059524230659008026,
-0.09972117841243744,
0.07259953022003174,
0.016484731808304787,
0.018492350354790688,
0.1471305936574936,
0.16307872533798218,
-0.0458691343665123,
-0.13837823271751404,
-0.018630273640155792,
-0.22798998653888702,
0.17510560154914856,
-0.03248048573732376,
0.13137903809547424,
-0.27447956800460815,
0.01684025302529335,
-0.2570667266845703,
0.0032130838371813297,
0.04178816080093384,
-0.06004921346902847,
-0.0226522795855999,
-0.013265985064208508,
-0.08018817007541656,
0.004899587947875261,
0.06192673370242119,
0.1266920566558838,
-0.06128726154565811,
0.06128238886594772,
-0.09319206327199936,
0.141696035861969,
0.07166698575019836,
0.07868369668722153,
0.13037432730197906,
0.041205424815416336,
-0.07187089323997498,
-0.21872246265411377,
-0.0026476888451725245,
-0.06275863200426102,
-0.09502086788415909,
-0.0020165652967989445,
-0.11606067419052124,
0.17244569957256317,
-0.030802514404058456,
-0.09825427830219269,
-0.11208184063434601,
0.09148659557104111,
-0.032992321997880936,
-0.03437839448451996,
-0.03552987426519394,
-0.020977836102247238,
0.019381176680326462,
0.04704452306032181,
-0.1548958420753479,
-0.005131472367793322,
0.07039852440357208,
0.11502562463283539,
-0.1346137970685959,
-0.003783059772104025,
-0.07908964157104492,
0.03039063885807991,
0.07654735445976257,
-0.16510222852230072,
0.03158547356724739,
-0.1124754324555397,
-0.07531405985355377,
0.002912673633545637,
-0.015710093080997467,
-0.016202643513679504,
0.166526660323143,
-0.0020451415330171585,
0.0714716836810112,
-0.026345307007431984,
-0.05890209600329399,
-0.11243434250354767,
-0.08489254862070084,
0.05390460044145584,
0.03670717030763626,
0.03266148269176483,
-0.2193479984998703,
0.014805203303694725,
-0.12762966752052307,
0.1360815018415451,
-0.10566820204257965,
-0.04705966264009476,
-0.022842247039079666,
0.20562705397605896,
0.037286072969436646,
0.08762791007757187,
-0.22171171009540558,
0.039756543934345245,
-0.05404696613550186,
0.18480908870697021,
-0.1502426266670227,
-0.0799463614821434,
0.20813211798667908,
-0.07964949309825897,
-0.10115210711956024,
0.021235812455415726,
0.020391687750816345,
0.026287272572517395,
0.0766737088561058,
0.4564172327518463,
-0.09766800701618195,
-0.09146861732006073,
0.10178250074386597,
0.17055274546146393,
-0.12427149713039398,
-0.1827561855316162,
0.06446871906518936,
-0.16666454076766968,
-0.1973118633031845,
0.0018917324487119913,
0.09222044050693512,
0.038269978016614914,
-0.07875611633062363,
-0.020746968686580658,
0.06325206160545349,
-0.0007678253459744155,
0.09095914661884308,
0.03755716234445572,
0.09034032374620438,
-0.08716782182455063,
0.11115926504135132,
-0.05017651244997978,
0.004037132486701012,
0.1343354731798172,
0.027325427159667015,
-0.03223329409956932,
0.08694463223218918,
-0.0485352948307991,
0.05295134335756302,
-0.1662379503250122,
-0.15068690478801727,
0.03398871049284935,
0.06283251196146011,
0.03186952322721481,
0.1280253529548645,
0.08141885697841644,
-0.10732853412628174,
0.022690722718834877,
-0.004228927195072174,
0.058398615568876266,
0.03891623765230179,
0.006107209715992212,
0.008764320984482765,
0.0961301177740097,
-0.10607069730758667,
-0.13589619100093842,
-0.07336436957120895,
-0.014715781435370445,
0.14371353387832642,
-0.0302802175283432,
0.07690227776765823,
-0.004240254405885935,
0.00013200697139836848,
0.06930823624134064,
0.08137880265712738,
0.016412746161222458,
0.08971183747053146,
-0.05237193778157234,
-0.05160155147314072,
0.10863113403320312,
-0.13533565402030945,
0.17837053537368774,
0.14053137600421906,
-0.20532016456127167,
0.029453208670020103,
-0.06838275492191315,
0.03670361638069153,
-0.008162540383636951,
0.0975119024515152,
-0.08272241055965424,
-0.02106042578816414,
0.013134466484189034,
0.0052274600602686405,
-0.013007243163883686,
0.017682146281003952,
-0.07295988500118256,
-0.07787393033504486,
-0.10233919322490692,
0.08436838537454605,
0.11562882363796234,
-0.10282530635595322,
0.14214380085468292,
0.4384984076023102,
0.11495281755924225,
0.21582984924316406,
-0.09581480920314789,
-0.0412987545132637,
0.007486371789127588,
0.0001535322517156601,
-0.04476691037416458,
0.08031861484050751,
-0.15973517298698425,
-0.038901735097169876,
0.027348900213837624,
0.07128690183162689,
0.11475157737731934,
-0.14959022402763367,
-0.09639324247837067,
-0.00793045200407505,
0.0022841424215584993,
-0.1249532699584961,
0.023905446752905846,
-0.03974650055170059,
0.04015624523162842,
0.07232289016246796,
-0.021535737439990044,
0.13939237594604492,
-0.04166141897439957,
-0.0639561116695404,
0.07585346698760986,
-0.2017085999250412,
-0.23179671168327332,
-0.12309670448303223,
-0.14680525660514832,
0.04366797208786011,
0.05154111236333847,
0.01726446859538555,
-0.17635835707187653,
-0.015074856579303741,
0.07706750929355621,
0.07820965349674225,
-0.20886357128620148,
-0.022814949974417686,
-0.004290030337870121,
0.0895976573228836,
-0.10227091610431671,
-0.0017130117630586028,
-0.04419664293527603,
-0.10150232166051865,
0.0017003051470965147,
0.07279510796070099,
-0.137485533952713,
0.13807645440101624,
0.21589438617229462,
0.07225540280342102,
0.07359948754310608,
-0.019093448296189308,
0.09936179965734482,
-0.10856141895055771,
-0.16549113392829895,
0.08348225057125092,
-0.06234746053814888,
0.047262318432331085,
0.17534415423870087,
0.03307317942380905,
-0.13904969394207,
-0.015682822093367577,
-0.0402069091796875,
-0.15603256225585938,
-0.238995760679245,
-0.09178274869918823,
-0.1182505264878273,
0.16442428529262543,
0.0009358620154671371,
0.06651917099952698,
0.08258313685655594,
-0.022042419761419296,
0.16447891294956207,
-0.07379321753978729,
-0.07578866183757782,
-0.006978808436542749,
0.12375060468912125,
-0.056660156697034836,
-0.03080669604241848,
-0.10566964000463486,
-0.008295975625514984,
0.1151021271944046,
0.15304014086723328,
0.12214863300323486,
0.2957419455051422,
0.08268889784812927,
0.026645636186003685,
0.08958091586828232,
0.17622539401054382,
0.09495089203119278,
0.07838419824838638,
-0.045413073152303696,
-0.014814783819019794,
0.014317171648144722,
-0.04022889584302902,
0.010141594335436821,
0.14683100581169128,
-0.2679629921913147,
-0.006678564939647913,
-0.2710230350494385,
0.0965198427438736,
-0.10913380235433578,
0.11837165057659149,
-0.01015760749578476,
0.10194015502929688,
0.11082887649536133,
0.03233652561903,
-0.03858073800802231,
0.16613617539405823,
0.08450309932231903,
-0.11277695000171661,
0.001758623169735074,
0.03737903758883476,
0.09715615212917328,
-0.02818971499800682,
0.12721189856529236,
-0.11048974841833115,
-0.1464834064245224,
0.013753619976341724,
0.07152791321277618,
-0.15373679995536804,
0.3138748109340668,
0.012069208547472954,
-0.13481520116329193,
-0.01481647603213787,
-0.09957809001207352,
-0.006440147757530212,
0.1254177987575531,
0.09333524852991104,
0.07935678958892822,
-0.2185502052307129,
-0.13339371979236603,
0.05872276425361633,
-0.00575496768578887,
0.22408108413219452,
-0.034034017473459244,
-0.11356475204229355,
-0.027013886719942093,
0.04241163283586502,
-0.06043251231312752,
0.08524788916110992,
0.023536119610071182,
-0.08113526552915573,
-0.032957352697849274,
0.05323701351881027,
0.012368366122245789,
0.00524376705288887,
0.09360801428556442,
0.020107939839363098,
-0.0009265501867048442,
0.01785753294825554,
0.047885000705718994,
-0.0675911232829094,
-0.1984109878540039,
0.09357594698667526,
-0.05215044692158699,
0.0015536568826064467,
-0.08013670891523361,
-0.15122665464878082,
-0.08837161958217621,
-0.16009655594825745,
0.12540200352668762,
-0.034406669437885284,
0.12700119614601135,
-0.06619787961244583,
0.17341409623622894,
-0.07871770113706589,
0.04481020197272301,
-0.047349292784929276,
0.050332702696323395,
-0.007268077693879604,
-0.07756082713603973,
0.16585899889469147,
-0.15564003586769104,
0.01809087023139,
0.19572502374649048,
-0.018915493041276932,
0.07177707552909851,
0.021322092041373253,
-0.0636206790804863,
0.23147478699684143,
0.3014698624610901,
0.008138049393892288,
0.1665448248386383,
0.3018903136253357,
-0.07466315478086472,
-0.2642788887023926,
-0.05505012720823288,
-0.2841376066207886,
-0.05371501296758652,
0.10716094076633453,
-0.22523896396160126,
0.06986407935619354,
0.14383509755134583,
-0.06471995264291763,
0.30228954553604126,
-0.21825523674488068,
0.012589273042976856,
0.15434536337852478,
-0.08868814259767532,
0.5515313148498535,
-0.1133413165807724,
-0.17677772045135498,
-0.008122089318931103,
-0.08741296827793121,
0.10602109134197235,
-0.0340677872300148,
0.06877441704273224,
0.013465235009789467,
0.04797380417585373,
0.048932258039712906,
-0.03111894056200981,
0.22701001167297363,
0.008710170164704323,
0.09015397727489471,
-0.07378865778446198,
-0.18624304234981537,
0.11639340221881866,
-0.04359482601284981,
-0.08891059458255768,
0.0849778801202774,
-0.05942516401410103,
-0.11078983545303345,
0.04663389176130295,
-0.07950539886951447,
-0.024862350896000862,
0.08423490077257156,
-0.04678233340382576,
-0.042606171220541,
-0.008054176345467567,
-0.1618063747882843,
-0.0002289071271661669,
0.31360217928886414,
-0.07096036523580551,
0.16695955395698547,
0.03677211329340935,
0.00038613268407061696,
-0.11027684062719345,
0.030288029462099075,
-0.05203165486454964,
-0.021576624363660812,
0.09578979015350342,
-0.11096979677677155,
0.03204701095819473,
0.14160704612731934,
-0.04864364117383957,
0.05846960097551346,
0.09256096184253693,
-0.0849417969584465,
0.007583672646433115,
0.17753590643405914,
-0.17537221312522888,
-0.1273445188999176,
-0.006135711446404457,
-0.09862716495990753,
0.14055661857128143,
0.04394126310944557,
0.05191568285226822,
0.16669964790344238,
0.03967129811644554,
-0.029474308714270592,
-0.02817419543862343,
-0.1153380498290062,
-0.0201893113553524,
0.040153320878744125,
0.00045633706031367183,
-0.08791285753250122,
0.2262638509273529,
0.06409153342247009,
-0.1328488290309906,
-0.051157206296920776,
0.2161225974559784,
-0.06805316358804703,
-0.04911920800805092,
-0.223562553524971,
0.10752306133508682,
-0.07112517952919006,
-0.0965060144662857,
0.05453834682703018,
-0.02270081453025341,
0.005106312222778797,
0.181985542178154,
0.03941008821129799,
0.11070270836353302,
0.03738937899470329,
-0.02448922023177147,
0.15798696875572205,
-0.142850860953331,
-0.14191335439682007,
-0.025354057550430298,
-0.08757315576076508,
-0.13844476640224457,
-0.026804137974977493,
0.1617041826248169,
-0.09177309274673462,
-0.14772607386112213,
-0.2621181011199951,
0.10968475043773651,
-0.16432365775108337,
-0.10192688554525375,
-0.03469514101743698,
-0.08968492597341537,
0.0696166530251503,
0.030301768332719803,
-0.03093348816037178,
-0.06706760823726654,
-0.18593791127204895,
0.0816768929362297,
0.06349513679742813,
0.045533183962106705,
-0.017847947776317596,
0.0067379772663116455,
0.1720137596130371,
0.025955144315958023,
0.10040043294429779,
0.16762186586856842,
0.011397695168852806,
0.2246655523777008,
-0.1671202927827835,
-0.11496317386627197,
0.1336962729692459,
-0.026543032377958298,
0.06762003898620605,
0.16792191565036774,
-0.0772583931684494,
0.015526676550507545,
-0.028136352077126503,
0.07066910713911057,
-0.11003983020782471,
-0.105624258518219,
0.007937257178127766,
0.02567129209637642,
-0.2755882740020752,
-0.005599735304713249,
-0.19717298448085785,
0.14788752794265747,
0.02579621411859989,
0.03297143429517746,
0.10257530212402344,
0.10404334217309952,
0.08312062919139862,
-0.0017710148822516203,
0.03226327523589134,
-0.1176818460226059,
0.02753005363047123,
-0.059239376336336136,
-0.020663779228925705,
0.017624232918024063,
0.36952024698257446,
-0.03603357449173927,
-0.046802736818790436,
0.003710439894348383,
0.1307835876941681,
-0.02139742486178875,
0.017395347356796265,
0.13209912180900574,
0.12607666850090027,
-0.08595693111419678,
-0.1504845917224884,
0.04888554662466049,
-0.04565655067563057,
-0.02836887165904045,
0.1464131623506546,
0.05905961990356445,
0.1050296202301979,
0.0908031314611435,
-0.014463032595813274,
-0.00318976235575974,
0.012856799177825451,
-0.15486004948616028,
0.06223496049642563,
-0.010558074340224266,
0.012565906159579754,
0.017934376373887062,
0.15238402783870697,
-0.005540105979889631,
0.07739730179309845,
-0.09889880567789078,
0.004208535887300968,
-0.13498884439468384,
-0.07913459837436676,
0.03617347031831741,
-0.13393273949623108,
0.04141177982091904,
-0.01871878281235695,
0.029611799865961075,
0.30386561155319214,
0.02558239921927452,
-0.020639164373278618,
0.12512871623039246,
-0.1214587539434433,
-0.12050267308950424,
-0.001594188273884356,
-0.029960084706544876,
0.0791488066315651,
-0.02633434161543846,
-0.0997740775346756,
-0.1001306027173996,
-0.15166029334068298,
-0.09759195148944855,
0.05182836204767227,
-0.04993441700935364,
-0.059362251311540604,
-0.17634081840515137,
-0.05707859992980957,
-0.05147340148687363,
0.14025864005088806,
-0.12263951450586319,
0.15159130096435547,
-0.014490418136119843,
0.004084470681846142,
0.04405883327126503,
0.1950942426919937,
-0.03644494712352753,
0.08714226633310318,
0.0154351145029068,
0.1522706001996994,
-0.05119588226079941,
0.14720745384693146,
-0.10931728035211563,
-0.04014137014746666,
-0.06710435450077057,
0.21513493359088898,
0.25630924105644226,
-0.06136954948306084,
-0.008937356993556023,
-0.012760217301547527,
0.058654606342315674,
0.1073930487036705,
0.16049085557460785,
0.002326392102986574,
0.2802925705909729,
-0.03133585304021835,
0.04815128445625305,
0.02901598811149597,
0.013607407920062542,
-0.06336209923028946,
0.03397751972079277,
0.07539387792348862,
-0.035039983689785004,
-0.1412304788827896,
0.15837742388248444,
-0.21980468928813934,
0.18157227337360382,
0.11640069633722305,
-0.19996967911720276,
-0.013728445395827293,
-0.04882071167230606,
0.1689416468143463,
-0.0856364443898201,
0.1637246012687683,
-0.0903693437576294,
-0.2108195722103119,
-0.2056000679731369,
0.03867346793413162,
-0.34623071551322937,
-0.254462867975235,
0.10422009229660034,
0.1488201916217804,
0.04015883058309555,
-0.018507536500692368,
-0.019967829808592796,
-0.018367022275924683,
0.04877542704343796,
-0.0067357709631323814,
0.06014643982052803,
0.031397558748722076,
-0.02988368645310402,
-0.24127542972564697,
-0.029804671183228493,
0.023964406922459602,
-0.07093082368373871,
0.07464958727359772,
-0.06874357163906097,
-0.022495782002806664,
0.08059766888618469,
-0.03066304884850979,
0.03298592567443848,
-0.035373736172914505,
-0.16326889395713806,
0.027529051527380943,
0.03900543600320816,
0.036012712866067886,
0.00634160777553916,
0.0008072225609794259,
-0.03455270454287529,
0.0644603744149208,
-0.16716794669628143,
-0.16015739738941193,
0.14140215516090393,
-0.06745140254497528,
0.2779497504234314,
-0.05812826007604599,
-0.0809100940823555,
0.04766704887151718,
-0.03426874056458473,
0.1807648241519928,
-0.07756473124027252,
0.047254521399736404,
0.12766779959201813,
0.011127962730824947,
0.03121316432952881,
-0.3092964291572571,
0.11082969605922699,
-0.000795336440205574,
-0.006093299947679043,
-0.07581598311662674
] |
null | null | transformers |
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed to the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed]
| {"library_name": "transformers", "tags": []} | null | kaushalpowar/llama2_finetuned_5 | [
"transformers",
"safetensors",
"arxiv:1910.09700",
"endpoints_compatible",
"region:us"
] | 2024-02-14T15:49:08+00:00 | [
"1910.09700"
] | [] | TAGS
#transformers #safetensors #arxiv-1910.09700 #endpoints_compatible #region-us
|
# Model Card for Model ID
## Model Details
### Model Description
This is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.
- Developed by:
- Funded by [optional]:
- Shared by [optional]:
- Model type:
- Language(s) (NLP):
- License:
- Finetuned from model [optional]:
### Model Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Downstream Use [optional]
### Out-of-Scope Use
## Bias, Risks, and Limitations
### Recommendations
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
## Training Details
### Training Data
### Training Procedure
#### Preprocessing [optional]
#### Training Hyperparameters
- Training regime:
#### Speeds, Sizes, Times [optional]
## Evaluation
### Testing Data, Factors & Metrics
#### Testing Data
#### Factors
#### Metrics
### Results
#### Summary
## Model Examination [optional]
## Environmental Impact
Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).
- Hardware Type:
- Hours used:
- Cloud Provider:
- Compute Region:
- Carbon Emitted:
## Technical Specifications [optional]
### Model Architecture and Objective
### Compute Infrastructure
#### Hardware
#### Software
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Model Card Authors [optional]
## Model Card Contact
| [
"# Model Card for Model ID",
"## Model Details",
"### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:",
"### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Downstream Use [optional]",
"### Out-of-Scope Use",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.",
"## How to Get Started with the Model\n\nUse the code below to get started with the model.",
"## Training Details",
"### Training Data",
"### Training Procedure",
"#### Preprocessing [optional]",
"#### Training Hyperparameters\n\n- Training regime:",
"#### Speeds, Sizes, Times [optional]",
"## Evaluation",
"### Testing Data, Factors & Metrics",
"#### Testing Data",
"#### Factors",
"#### Metrics",
"### Results",
"#### Summary",
"## Model Examination [optional]",
"## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:",
"## Technical Specifications [optional]",
"### Model Architecture and Objective",
"### Compute Infrastructure",
"#### Hardware",
"#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Model Card Authors [optional]",
"## Model Card Contact"
] | [
"TAGS\n#transformers #safetensors #arxiv-1910.09700 #endpoints_compatible #region-us \n",
"# Model Card for Model ID",
"## Model Details",
"### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:",
"### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Downstream Use [optional]",
"### Out-of-Scope Use",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.",
"## How to Get Started with the Model\n\nUse the code below to get started with the model.",
"## Training Details",
"### Training Data",
"### Training Procedure",
"#### Preprocessing [optional]",
"#### Training Hyperparameters\n\n- Training regime:",
"#### Speeds, Sizes, Times [optional]",
"## Evaluation",
"### Testing Data, Factors & Metrics",
"#### Testing Data",
"#### Factors",
"#### Metrics",
"### Results",
"#### Summary",
"## Model Examination [optional]",
"## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:",
"## Technical Specifications [optional]",
"### Model Architecture and Objective",
"### Compute Infrastructure",
"#### Hardware",
"#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Model Card Authors [optional]",
"## Model Card Contact"
] | [
31,
6,
3,
82,
28,
3,
4,
9,
9,
10,
42,
20,
3,
4,
5,
9,
11,
13,
3,
12,
5,
4,
5,
3,
4,
9,
53,
9,
8,
6,
3,
14,
8,
7,
9,
4
] | [
"passage: TAGS\n#transformers #safetensors #arxiv-1910.09700 #endpoints_compatible #region-us \n# Model Card for Model ID## Model Details### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Downstream Use [optional]### Out-of-Scope Use## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.## How to Get Started with the Model\n\nUse the code below to get started with the model.## Training Details### Training Data### Training Procedure#### Preprocessing [optional]#### Training Hyperparameters\n\n- Training regime:#### Speeds, Sizes, Times [optional]## Evaluation### Testing Data, Factors & Metrics#### Testing Data#### Factors#### Metrics### Results#### Summary## Model Examination [optional]## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:## Technical Specifications [optional]### Model Architecture and Objective### Compute Infrastructure#### Hardware#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Model Card Authors [optional]## Model Card Contact"
] | [
-0.06646376848220825,
0.2168014943599701,
-0.00225935154594481,
0.023818302899599075,
0.1271018385887146,
-0.001635765191167593,
0.04218708351254463,
0.13324736058712006,
-0.020175931975245476,
0.11144465953111649,
0.046588581055402756,
0.09377603232860565,
0.09928803145885468,
0.18404334783554077,
0.04859916493296623,
-0.2059975117444992,
0.007056170143187046,
-0.09090408682823181,
0.014076028019189835,
0.1116579994559288,
0.13719257712364197,
-0.10291384905576706,
0.08272874355316162,
-0.04045208916068077,
-0.02019004337489605,
0.00012576708104461432,
-0.09259183704853058,
-0.07032395154237747,
0.06885425746440887,
0.06264153122901917,
0.051234472543001175,
0.001456156256608665,
0.09140396863222122,
-0.2864592671394348,
0.017265573143959045,
0.08406311273574829,
0.0027674848679453135,
0.06290827691555023,
0.07236549258232117,
-0.07389893382787704,
0.11328595131635666,
-0.08021481335163116,
0.13019037246704102,
0.08625296503305435,
-0.062064990401268005,
-0.23071379959583282,
-0.07525765895843506,
0.0963398814201355,
0.12251301854848862,
0.06215599179267883,
-0.022921854630112648,
0.15455181896686554,
-0.06248689442873001,
0.012971068732440472,
0.1294165402650833,
-0.11526761949062347,
-0.05572471022605896,
0.061741601675748825,
0.11775490641593933,
0.10740239918231964,
-0.14110268652439117,
-0.0017287094378843904,
0.04900608956813812,
0.029121357947587967,
0.08589313924312592,
0.022661056369543076,
0.12003941088914871,
0.04652795568108559,
-0.13695219159126282,
-0.04037507623434067,
0.12011898308992386,
0.038862764835357666,
-0.06446044892072678,
-0.2168138176202774,
-0.006778308190405369,
-0.0601806715130806,
-0.014732478186488152,
-0.07019448280334473,
0.039128515869379044,
-0.02470310963690281,
0.07317749410867691,
-0.04465159401297569,
-0.1063927412033081,
-0.0421026237308979,
0.0892222449183464,
0.07748593389987946,
0.011527054943144321,
-0.02519804798066616,
0.04627908393740654,
0.13455867767333984,
0.05402068421244621,
-0.10399353504180908,
-0.07017925381660461,
-0.06942764669656754,
-0.09420394152402878,
-0.04035796597599983,
0.056760527193546295,
0.031942449510097504,
0.02665667235851288,
0.22703726589679718,
0.016653569415211678,
0.04155244305729866,
0.0224777739495039,
0.01032855175435543,
0.043662428855895996,
0.0955500528216362,
-0.05303520709276199,
-0.15660029649734497,
-0.04072032496333122,
0.09077946096658707,
-0.0027527001220732927,
-0.036689214408397675,
-0.03966725245118141,
0.03849169611930847,
0.06843466311693192,
0.13122352957725525,
0.07552056759595871,
-0.017929591238498688,
-0.04813180863857269,
-0.030096933245658875,
0.23523783683776855,
-0.1493375599384308,
0.04426715523004532,
-0.02271856553852558,
-0.01804111897945404,
-0.03908449783921242,
0.03597262129187584,
0.022118929773569107,
-0.000004518366949923802,
0.09706240892410278,
-0.058981191366910934,
-0.05378659814596176,
-0.10168042778968811,
-0.03272576630115509,
0.04088849574327469,
-0.013975566253066063,
-0.010589460842311382,
-0.09025166928768158,
-0.09490354359149933,
-0.04766594246029854,
0.05537205561995506,
-0.05123869329690933,
-0.03770573064684868,
0.009465423412621021,
-0.08151785284280777,
-0.005444355774670839,
-0.005417742300778627,
0.10699385404586792,
-0.03222226724028587,
0.04445803165435791,
-0.027600755915045738,
0.05225523188710213,
0.09919606149196625,
0.031576547771692276,
-0.0773419588804245,
0.0561848059296608,
-0.22559374570846558,
0.07503069192171097,
-0.11481974273920059,
0.04335082694888115,
-0.1704932004213333,
-0.042439818382263184,
0.005444696638733149,
0.0139949731528759,
0.013206101022660732,
0.12720820307731628,
-0.19255615770816803,
-0.01654396951198578,
0.13260798156261444,
-0.09212633967399597,
-0.118110790848732,
0.07884611934423447,
-0.029701577499508858,
0.1624738723039627,
0.04682036489248276,
-0.027025915682315826,
0.09224298596382141,
-0.16434773802757263,
-0.07092688232660294,
-0.00949116237461567,
-0.01727987825870514,
0.12109188735485077,
0.07512219995260239,
-0.05991523340344429,
0.046571120619773865,
0.02832140028476715,
-0.038078423589468,
-0.04424772411584854,
-0.050857074558734894,
-0.10884185880422592,
-0.01070026308298111,
-0.08987759798765182,
0.04065500199794769,
-0.01250192429870367,
-0.07916021347045898,
-0.029885273426771164,
-0.18612512946128845,
-0.0030564051121473312,
0.10038342326879501,
0.0035033065360039473,
-0.005652366206049919,
-0.08666291832923889,
0.026358824223279953,
-0.03112892620265484,
-0.008404186926782131,
-0.16764774918556213,
-0.04399421438574791,
0.046902090311050415,
-0.16094985604286194,
0.020117372274398804,
-0.06413903087377548,
0.06334125250577927,
0.03641495108604431,
-0.05590536445379257,
-0.0248766727745533,
-0.01730942726135254,
0.011945613659918308,
-0.05083848536014557,
-0.18994836509227753,
-0.056277405470609665,
-0.037882111966609955,
0.149809330701828,
-0.25956398248672485,
0.032966937869787216,
0.051140617579221725,
0.14649195969104767,
0.00406361510977149,
-0.05115427449345589,
0.01429014839231968,
-0.05360214412212372,
-0.054652128368616104,
-0.06746816635131836,
-0.006135428790003061,
-0.027576493099331856,
-0.05147203803062439,
0.019243421033024788,
-0.1755700707435608,
-0.021410830318927765,
0.09424154460430145,
0.12876708805561066,
-0.1486445665359497,
-0.018640631809830666,
-0.048725154250860214,
-0.06339836865663528,
-0.0715010017156601,
-0.07038594037294388,
0.10712739825248718,
0.0513901449739933,
0.04796046018600464,
-0.07435787469148636,
-0.07092321664094925,
0.02726263552904129,
0.006906150374561548,
-0.03382374346256256,
0.08727246522903442,
0.05199531093239784,
-0.09209315478801727,
0.0756213590502739,
0.1092359870672226,
0.07177663594484329,
0.09363535046577454,
0.01574566215276718,
-0.11756632477045059,
-0.028492970392107964,
0.036266472190618515,
0.02740776725113392,
0.1465986967086792,
-0.05952361226081848,
0.04016614332795143,
0.04494241625070572,
-0.04170418903231621,
0.022319864481687546,
-0.08787637203931808,
0.024075502529740334,
0.025203049182891846,
-0.0034381982404738665,
0.06284574419260025,
-0.02525499276816845,
-0.0050758360885083675,
0.07016654312610626,
0.047779910266399384,
0.04621000960469246,
0.009655474685132504,
-0.01720241829752922,
-0.1047825813293457,
0.16950392723083496,
-0.0951867327094078,
-0.269941508769989,
-0.17632324993610382,
0.026197833940386772,
0.04035249724984169,
-0.022378476336598396,
0.031619444489479065,
-0.07056326419115067,
-0.10630585998296738,
-0.1060405746102333,
-0.002429972169920802,
0.01714223250746727,
-0.06364088505506516,
-0.0741225928068161,
0.07348573952913284,
0.04382912442088127,
-0.14902326464653015,
0.038552410900592804,
0.055694397538900375,
-0.057955220341682434,
-0.0233661737293005,
0.09118817001581192,
0.12397737801074982,
0.14583967626094818,
-0.021366750821471214,
-0.028626007959246635,
0.029004426673054695,
0.19620531797409058,
-0.13469526171684265,
0.10371150821447372,
0.13814030587673187,
-0.04545360431075096,
0.08360563963651657,
0.1560150384902954,
0.029186224564909935,
-0.08317049592733383,
0.05044832453131676,
0.04082648828625679,
-0.043159641325473785,
-0.2666129767894745,
-0.0534592866897583,
0.012832709588110447,
-0.06255637854337692,
0.09786593168973923,
0.10183793306350708,
0.11542957276105881,
0.034910861402750015,
-0.07166364789009094,
-0.043925940990448,
-0.0058974819257855415,
0.11737963557243347,
-0.05490213260054588,
-0.012639665976166725,
0.07686592638492584,
-0.05086168646812439,
0.005355054512619972,
0.10266812145709991,
0.02973790094256401,
0.17442677915096283,
0.020399179309606552,
0.11231429129838943,
0.06195578724145889,
0.08633565157651901,
0.0007386076031252742,
0.02951662428677082,
0.05147615820169449,
0.017203815281391144,
-0.002300140680745244,
-0.10421168059110641,
-0.006156572140753269,
0.1449710875749588,
0.028103826567530632,
0.029669636860489845,
-0.0018948549404740334,
-0.005003341939300299,
0.05121048167347908,
0.1746254414319992,
-0.011592294089496136,
-0.22072425484657288,
-0.0845772922039032,
0.06936841458082199,
-0.06218599155545235,
-0.12968985736370087,
-0.026130788028240204,
0.045467354357242584,
-0.17519839107990265,
0.026703642681241035,
-0.027433741837739944,
0.0919293761253357,
-0.09345759451389313,
-0.02221956104040146,
0.03687324374914169,
0.084866963326931,
-0.014529162086546421,
0.08703910559415817,
-0.14498743414878845,
0.11886418610811234,
0.02978132851421833,
0.09024628251791,
-0.11081171780824661,
0.07909037172794342,
-0.007550720125436783,
0.009180475026369095,
0.19379350543022156,
-0.011335089802742004,
-0.03514958545565605,
-0.08774717897176743,
-0.11210042238235474,
-0.013537433929741383,
0.12687496840953827,
-0.1243172138929367,
0.08773399889469147,
-0.015198243781924248,
-0.044079482555389404,
0.00937260314822197,
-0.12100647389888763,
-0.17273177206516266,
-0.19628387689590454,
0.05585884302854538,
-0.09575839340686798,
0.025643249973654747,
-0.11914430558681488,
-0.07089093327522278,
-0.02952558360993862,
0.241120383143425,
-0.1745356321334839,
-0.06510113179683685,
-0.1468164622783661,
-0.046294767409563065,
0.1662203073501587,
-0.04437198117375374,
0.0718095526099205,
-0.0208172257989645,
0.20345525443553925,
0.005988610442727804,
-0.004939318168908358,
0.06724198162555695,
-0.08892562240362167,
-0.16873881220817566,
-0.06771010160446167,
0.1510489284992218,
0.11680185794830322,
0.04907919466495514,
-0.002248800592496991,
0.0011772146681323647,
-0.016943959519267082,
-0.1137804463505745,
-0.0033210667315870523,
0.16037839651107788,
0.03878779336810112,
0.025986969470977783,
-0.05243593826889992,
-0.08797456324100494,
-0.06899320334196091,
-0.06853509694337845,
0.06221301481127739,
0.19590823352336884,
-0.10376439243555069,
0.1700313836336136,
0.147536963224411,
-0.07305635511875153,
-0.23175598680973053,
0.035342130810022354,
0.04983805492520332,
0.0014306638622656465,
0.04886869341135025,
-0.18252557516098022,
0.10521943867206573,
0.019543392583727837,
-0.05505957826972008,
0.13485197722911835,
-0.1557481735944748,
-0.1552847921848297,
0.0722852572798729,
0.03904085233807564,
-0.22423844039440155,
-0.1354004591703415,
-0.09622503817081451,
-0.05825018882751465,
-0.14065024256706238,
0.06054598465561867,
-0.002136280992999673,
0.015948504209518433,
0.03500790148973465,
-0.0015643214574083686,
0.027123261243104935,
-0.058935679495334625,
0.18609118461608887,
-0.004065449349582195,
0.020676052197813988,
-0.060264769941568375,
-0.0478842556476593,
0.09839435666799545,
-0.06130504235625267,
0.12208222597837448,
0.004057085141539574,
0.01594383642077446,
-0.10362856835126877,
-0.048314861953258514,
-0.04328322783112526,
0.05154227837920189,
-0.07548051327466965,
-0.10070807486772537,
-0.043625857681035995,
0.08841723203659058,
0.07005169242620468,
-0.03383097052574158,
0.00549331633374095,
-0.07189501076936722,
0.10019614547491074,
0.17795267701148987,
0.17573626339435577,
0.009926567785441875,
-0.07241068035364151,
0.01677953451871872,
-0.04142116755247116,
0.044231921434402466,
-0.2513144314289093,
0.03756171092391014,
0.06098250672221184,
0.029438555240631104,
0.09217222779989243,
-0.020435843616724014,
-0.1820858269929886,
-0.04050002992153168,
0.08094815909862518,
-0.05452597141265869,
-0.22617179155349731,
-0.019085140898823738,
0.0954197570681572,
-0.2020406424999237,
-0.007372708059847355,
0.03995226323604584,
-0.048725228756666183,
-0.023169852793216705,
0.00010950004070764408,
0.06317184865474701,
0.002471912419423461,
0.09773622453212738,
0.0735151618719101,
0.09715340286493301,
-0.08337292820215225,
0.10562895983457565,
0.10150538384914398,
-0.09572599828243256,
0.03605884686112404,
0.06754924356937408,
-0.05300498008728027,
-0.043293699622154236,
0.03665391728281975,
0.033023297786712646,
0.005234600510448217,
-0.060321882367134094,
0.013913018628954887,
-0.036497246474027634,
0.044923391193151474,
0.08326134830713272,
0.03754979372024536,
-0.013354414142668247,
0.06462216377258301,
0.03401726484298706,
-0.10898099094629288,
0.10366570204496384,
0.01731540448963642,
0.04105307161808014,
-0.08384523540735245,
-0.019968897104263306,
0.035425446927547455,
0.030576206743717194,
-0.01765924133360386,
-0.02306121215224266,
-0.02860277332365513,
-0.01614218018949032,
-0.14299540221691132,
-0.023106401786208153,
-0.07243485748767853,
0.006181265693157911,
0.014656842686235905,
-0.031884219497442245,
-0.011233693920075893,
0.02475680410861969,
-0.06979699432849884,
-0.07426341623067856,
-0.006949664559215307,
0.09833318740129471,
-0.15115703642368317,
0.008848577737808228,
0.06907843053340912,
-0.11088496446609497,
0.08190931379795074,
-0.008411259390413761,
0.016245156526565552,
0.022527478635311127,
-0.15448406338691711,
0.05601610988378525,
0.0008648968650959432,
0.01916889287531376,
0.025886621326208115,
-0.16471809148788452,
0.004104440100491047,
-0.04661374166607857,
-0.02149827405810356,
-0.00004464812809601426,
-0.02647159807384014,
-0.12325995415449142,
0.06858719140291214,
-0.015622655861079693,
-0.035931166261434555,
-0.02701525390148163,
0.0539589487016201,
0.07888586074113846,
-0.027474910020828247,
0.10445091128349304,
-0.008690856397151947,
0.04941811040043831,
-0.16801609098911285,
-0.02470702864229679,
-0.04982255399227142,
0.019377702847123146,
0.009884213097393513,
-0.007693959400057793,
0.04183054715394974,
-0.00976533442735672,
0.21883612871170044,
-0.05075952783226967,
0.1607085019350052,
0.05847611650824547,
-0.017352959141135216,
-0.0007513365126214921,
0.06180921941995621,
0.05997028574347496,
0.04658793285489082,
0.009480604901909828,
0.023740366101264954,
-0.022450892254710197,
-0.006695089396089315,
-0.15932634472846985,
0.01890849508345127,
0.14999441802501678,
0.06301083415746689,
0.024745315313339233,
0.05866100639104843,
-0.12775006890296936,
-0.12135478109121323,
0.09311001747846603,
-0.026755332946777344,
0.00928465835750103,
-0.08245618641376495,
0.1358020007610321,
0.14980104565620422,
-0.14000412821769714,
0.05256148427724838,
-0.06134212389588356,
-0.05217423290014267,
-0.10388828068971634,
-0.12032219022512436,
-0.05887215584516525,
-0.053666237741708755,
0.002330566756427288,
-0.03760887682437897,
0.054546963423490524,
0.03344334661960602,
-0.009351172484457493,
-0.00022941511997487396,
0.13597318530082703,
-0.019751882180571556,
-0.0028988157864660025,
0.048313532024621964,
0.03693558648228645,
0.02373051457107067,
-0.05275435373187065,
0.02940409444272518,
0.02539868652820587,
0.032232340425252914,
0.06546790152788162,
0.033412106335163116,
-0.047448933124542236,
0.03804153576493263,
-0.0025254099164158106,
-0.11207924783229828,
0.019641218706965446,
-0.00460948096588254,
-0.0742158442735672,
0.1268945336341858,
0.0407399944961071,
0.010224059224128723,
-0.03741471841931343,
0.24361543357372284,
-0.06653323769569397,
-0.06378097087144852,
-0.13251738250255585,
0.10491154342889786,
-0.0027236645109951496,
0.06476365029811859,
0.023412218317389488,
-0.1284150779247284,
0.005243356805294752,
0.13858191668987274,
0.12181595712900162,
0.0045748427510261536,
0.009228081442415714,
0.0518609918653965,
0.0025186820421367884,
-0.06998204439878464,
0.054019294679164886,
0.06992026418447495,
0.12919506430625916,
-0.07847554981708527,
0.07680778950452805,
0.0006860480643808842,
-0.08370215445756912,
-0.02947772853076458,
0.11312682181596756,
-0.0409729965031147,
0.03491825982928276,
-0.047444481402635574,
0.10916327685117722,
-0.05787910893559456,
-0.29412412643432617,
0.02350960113108158,
-0.09588567912578583,
-0.15202060341835022,
-0.018367812037467957,
0.05944539234042168,
-0.02624768204987049,
0.018029648810625076,
0.06971040368080139,
-0.06011629104614258,
0.20098382234573364,
0.0335683599114418,
-0.07864278554916382,
-0.0664360448718071,
0.04837050288915634,
-0.06564252078533173,
0.2949807047843933,
0.008418165147304535,
0.02863333560526371,
0.10770907253026962,
-0.03253700211644173,
-0.18271861970424652,
0.010723991319537163,
0.1133992001414299,
-0.08056149631738663,
0.08200647681951523,
0.19000613689422607,
-0.012578671798110008,
0.1209007054567337,
0.05294662341475487,
-0.047376248985528946,
0.04217283055186272,
-0.03389401361346245,
-0.051268599927425385,
-0.10752558708190918,
0.058453381061553955,
-0.05909625440835953,
0.15447644889354706,
0.10152646154165268,
-0.05671518296003342,
-0.004550917539745569,
-0.05555408447980881,
0.04875178262591362,
0.01804669201374054,
0.12263146042823792,
0.02951994352042675,
-0.1865430772304535,
0.032826557755470276,
-0.01144319772720337,
0.10186848044395447,
-0.25588861107826233,
-0.08421015739440918,
0.08833149075508118,
-0.011924264021217823,
-0.05105875805020332,
0.10560628771781921,
0.057650718837976456,
0.04243382066488266,
-0.043439045548439026,
-0.10480839014053345,
-0.02186836116015911,
0.14663739502429962,
-0.1469624787569046,
-0.025013303384184837
] |
null | null | null |
# Lora of lutzow/リュッツォウ/吕佐夫 (Azur Lane)
## What Is This?
This is the LoRA model of waifu lutzow/リュッツォウ/吕佐夫 (Azur Lane).
## How Is It Trained?
* This model is trained with [HCP-Diffusion](https://github.com/7eu7d7/HCP-Diffusion).
* The [auto-training framework](https://github.com/deepghs/cyberharem) is maintained by [DeepGHS Team](https://huggingface.co/deepghs).
* The base model used for training is [deepghs/animefull-latest](https://huggingface.co/deepghs/animefull-latest).
* Dataset used for training is the `stage3-p480-800` in [CyberHarem/lutzow_azurlane](https://huggingface.co/datasets/CyberHarem/lutzow_azurlane), which contains 167 images.
* Batch size is 4, resolution is 720x720, clustering into 5 buckets.
* Batch size for regularization dataset is 16, resolution is 720x720, clustering into 20 buckets.
* Trained for 1680 steps, 40 checkpoints were saved and evaluated.
* **Trigger word is `lutzow_azurlane`.**
* Pruned core tags for this waifu are `breasts, long_hair, large_breasts, hat, grey_hair, black_headwear, bangs`. You can add them to the prompt when some features of the waifu (e.g. hair color) are not stable.
## How to Use It?
### If You Are Using A1111 WebUI v1.7+
**Just use it like a classic LoRA**. The LoRA we provide is bundled with the embedding file.
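For example, once the safetensors file from your chosen step has been placed in the WebUI's LoRA folder (typically `models/Lora`, though this depends on your installation), a prompt along the lines of `<lora:lutzow_azurlane:0.8>, lutzow_azurlane, 1girl, solo` should activate both the LoRA and the bundled embedding. The file name and weight here are only illustrative assumptions, so adjust them to match how you saved the file.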
### If You Are Using A1111 WebUI v1.6 or Lower
After downloading the pt and safetensors files for the specified step, you need to use them together. The pt file will be used as an embedding, while the safetensors file will be loaded as the LoRA.
For example, if you want to use the model from step 672, you need to download [`672/lutzow_azurlane.pt`](https://huggingface.co/CyberHarem/lutzow_azurlane/resolve/main/672/lutzow_azurlane.pt) as the embedding and [`672/lutzow_azurlane.safetensors`](https://huggingface.co/CyberHarem/lutzow_azurlane/resolve/main/672/lutzow_azurlane.safetensors) as the LoRA. By using both files together, you can generate images of the desired character.
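If you prefer to fetch both files from a script instead of through the browser, here is a minimal sketch using the `huggingface_hub` client; the chosen step and the destination folders mentioned in the comments are placeholder assumptions, so adapt them to your own setup.

```python
# Minimal sketch: download the paired embedding (.pt) and LoRA (.safetensors)
# files for one step of this repository. Requires `pip install huggingface_hub`.
from huggingface_hub import hf_hub_download

step = 672  # any published step works the same way

embedding_path = hf_hub_download(
    repo_id="CyberHarem/lutzow_azurlane",
    filename=f"{step}/lutzow_azurlane.pt",
)
lora_path = hf_hub_download(
    repo_id="CyberHarem/lutzow_azurlane",
    filename=f"{step}/lutzow_azurlane.safetensors",
)

# Copy the .pt file into the WebUI `embeddings/` folder and the .safetensors
# file into `models/Lora/` (exact paths depend on your installation), then use
# the trigger word `lutzow_azurlane` in your prompt.
print(embedding_path)
print(lora_path)
```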
## Which Step Should I Use?
We selected 5 good steps for you to choose from. The best one is step 672.
1640 images (1.74 GiB) were generated for auto-testing.

The base model used for generating preview images is [Meina/MeinaMix_V11](https://huggingface.co/Meina/MeinaMix_V11).
Here are the previews of the recommended steps:
| Step | Epoch | CCIP | AI Corrupt | Bikini Plus | Score | Download | pattern_0 | pattern_1_0 | pattern_1_1 | pattern_2_0 | pattern_2_1 | pattern_2_2 | portrait_0 | portrait_1 | portrait_2 | full_body_0 | full_body_1 | profile_0 | profile_1 | free_0 | free_1 | shorts | maid_0 | maid_1 | miko | yukata | suit | china | bikini_0 | bikini_1 | bikini_2 | sit | squat | kneel | jump | crossed_arms | angry | smile | cry | grin | n_lie_0 | n_lie_1 | n_stand_0 | n_stand_1 | n_stand_2 | n_sex_0 | n_sex_1 |
|-------:|--------:|:----------|:-------------|:--------------|:----------|:----------------------------------------------------------------------------------------------------|:------------------------------------------|:----------------------------------------------|:----------------------------------------------|:----------------------------------------------|:----------------------------------------------|:----------------------------------------------|:--------------------------------------------|:--------------------------------------------|:--------------------------------------------|:----------------------------------------------|:----------------------------------------------|:------------------------------------------|:------------------------------------------|:------------------------------------|:------------------------------------|:------------------------------------|:------------------------------------|:------------------------------------|:--------------------------------|:------------------------------------|:--------------------------------|:----------------------------------|:----------------------------------------|:----------------------------------------|:----------------------------------------|:------------------------------|:----------------------------------|:----------------------------------|:--------------------------------|:------------------------------------------------|:----------------------------------|:----------------------------------|:------------------------------|:--------------------------------|:--------------------------------------|:--------------------------------------|:------------------------------------------|:------------------------------------------|:------------------------------------------|:--------------------------------------|:--------------------------------------|
| 672 | 17 | **0.920** | 0.981 | **0.851** | **0.742** | [Download](https://huggingface.co/CyberHarem/lutzow_azurlane/resolve/main/672/lutzow_azurlane.zip) |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |
| 630 | 16 | 0.903 | 0.974 | 0.849 | 0.722 | [Download](https://huggingface.co/CyberHarem/lutzow_azurlane/resolve/main/630/lutzow_azurlane.zip) |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |
| 1302 | 32 | 0.919 | 0.962 | 0.836 | 0.717 | [Download](https://huggingface.co/CyberHarem/lutzow_azurlane/resolve/main/1302/lutzow_azurlane.zip) |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |
| 1512 | 37 | 0.907 | **0.985** | 0.836 | 0.706 | [Download](https://huggingface.co/CyberHarem/lutzow_azurlane/resolve/main/1512/lutzow_azurlane.zip) |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |
| 1008 | 25 | 0.887 | 0.954 | 0.843 | 0.695 | [Download](https://huggingface.co/CyberHarem/lutzow_azurlane/resolve/main/1008/lutzow_azurlane.zip) |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |
## Anything Else?
Because the automation of LoRA training always annoys some people, this model is not recommended for the following groups, and we express our regret:
1. Individuals who cannot tolerate any deviations from the original character design, even in the slightest detail.
2. Individuals who are facing application scenarios with high demands for accuracy in recreating character outfits.
3. Individuals who cannot accept the potential randomness in AI-generated images based on the Stable Diffusion algorithm.
4. Individuals who are not comfortable with the fully automated process of training character models using LoRA, or those who believe that training character models must be done purely through manual operations to avoid disrespecting the characters.
5. Individuals who find the generated image content offensive to their values.
## All Steps
We have uploaded the files for all steps. You can check the images and metrics, and download them via the following links:
* [Steps From 1302 to 1680](all/0.md)
* [Steps From 882 to 1260](all/1.md)
* [Steps From 462 to 840](all/2.md)
* [Steps From 42 to 420](all/3.md)
| {"license": "mit", "tags": ["art", "not-for-all-audiences"], "datasets": ["CyberHarem/lutzow_azurlane"], "pipeline_tag": "text-to-image"} | text-to-image | CyberHarem/lutzow_azurlane | [
"art",
"not-for-all-audiences",
"text-to-image",
"dataset:CyberHarem/lutzow_azurlane",
"license:mit",
"region:us"
] | 2024-02-14T15:49:55+00:00 | [] | [] | TAGS
#art #not-for-all-audiences #text-to-image #dataset-CyberHarem/lutzow_azurlane #license-mit #region-us
| Lora of lutzow/リュッツォウ/吕佐夫 (Azur Lane)
=====================================
What Is This?
-------------
This is the LoRA model of waifu lutzow/リュッツォウ/吕佐夫 (Azur Lane).
How Is It Trained?
------------------
* This model is trained with HCP-Diffusion.
* The auto-training framework is maintained by DeepGHS Team.
* The base model used for training is deepghs/animefull-latest.
* Dataset used for training is the 'stage3-p480-800' in CyberHarem/lutzow\_azurlane, which contains 167 images.
* Batch size is 4, resolution is 720x720, clustering into 5 buckets.
* Batch size for regularization dataset is 16, resolution is 720x720, clustering into 20 buckets.
* Trained for 1680 steps, 40 checkpoints were saved and evaluated.
* Trigger word is 'lutzow\_azurlane'.
* Pruned core tags for this waifu are 'breasts, long\_hair, large\_breasts, hat, grey\_hair, black\_headwear, bangs'. You can add them to the prompt when some features of the waifu (e.g. hair color) are not stable.
How to Use It?
--------------
### If You Are Using A1111 WebUI v1.7+
Just use it like a classic LoRA. The LoRA we provide is bundled with the embedding file.
### If You Are Using A1111 WebUI v1.6 or Lower
After downloading the pt and safetensors files for the specified step, you need to use them simultaneously. The pt file will be used as an embedding, while the safetensors file will be loaded for Lora.
For example, if you want to use the model from step 672, you need to download '672/lutzow\_azurlane.pt' as the embedding and '672/lutzow\_azurlane.safetensors' for loading Lora. By using both files together, you can generate images for the desired characters.
Which Step Should I Use?
------------------------
We selected 5 good steps for you to choose from. The best one is step 672.
1640 images (1.74 GiB) were generated for auto-testing.
!Metrics Plot
The base model used for generating preview images is Meina/MeinaMix\_V11.
Here are the previews of the recommended steps:
Anything Else?
--------------
Because the automation of LoRA training always annoys some people, this model is not recommended for the following groups, and we express our regret:
1. Individuals who cannot tolerate any deviations from the original character design, even in the slightest detail.
2. Individuals who are facing the application scenarios with high demands for accuracy in recreating character outfits.
3. Individuals who cannot accept the potential randomness in AI-generated images based on the Stable Diffusion algorithm.
4. Individuals who are not comfortable with the fully automated process of training character models using LoRA, or those who believe that training character models must be done purely through manual operations to avoid disrespecting the characters.
5. Individuals who find the generated image content offensive to their values.
All Steps
---------
We have uploaded the files for all steps. You can check the images and metrics, and download them via the following links:
* Steps From 1302 to 1680
* Steps From 882 to 1260
* Steps From 462 to 840
* Steps From 42 to 420
| [
"### If You Are Using A1111 WebUI v1.7+\n\n\nJust use it like the classic LoRA. The LoRA we provided are bundled with the embedding file.",
"### If You Are Using A1111 WebUI v1.6 or Lower\n\n\nAfter downloading the pt and safetensors files for the specified step, you need to use them simultaneously. The pt file will be used as an embedding, while the safetensors file will be loaded for Lora.\n\n\nFor example, if you want to use the model from step 672, you need to download '672/lutzow\\_azurlane.pt' as the embedding and '672/lutzow\\_azurlane.safetensors' for loading Lora. By using both files together, you can generate images for the desired characters.\n\n\nWhich Step Should I Use?\n------------------------\n\n\nWe selected 5 good steps for you to choose. The best one is step 672.\n\n\n1640 images (1.74 GiB) were generated for auto-testing.\n\n\n!Metrics Plot\n\n\nThe base model used for generating preview images is Meina/MeinaMix\\_V11.\n\n\nHere are the preview of the recommended steps:\n\n\n\nAnything Else?\n--------------\n\n\nBecause the automation of LoRA training always annoys some people. So for the following groups, it is not recommended to use this model and we express regret:\n\n\n1. Individuals who cannot tolerate any deviations from the original character design, even in the slightest detail.\n2. Individuals who are facing the application scenarios with high demands for accuracy in recreating character outfits.\n3. Individuals who cannot accept the potential randomness in AI-generated images based on the Stable Diffusion algorithm.\n4. Individuals who are not comfortable with the fully automated process of training character models using LoRA, or those who believe that training character models must be done purely through manual operations to avoid disrespecting the characters.\n5. Individuals who finds the generated image content offensive to their values.\n\n\nAll Steps\n---------\n\n\nWe uploaded the files in all steps. you can check the images, metrics and download them in the following links:\n\n\n* Steps From 1302 to 1680\n* Steps From 882 to 1260\n* Steps From 462 to 840\n* Steps From 42 to 420"
] | [
"TAGS\n#art #not-for-all-audiences #text-to-image #dataset-CyberHarem/lutzow_azurlane #license-mit #region-us \n",
"### If You Are Using A1111 WebUI v1.7+\n\n\nJust use it like the classic LoRA. The LoRA we provided are bundled with the embedding file.",
"### If You Are Using A1111 WebUI v1.6 or Lower\n\n\nAfter downloading the pt and safetensors files for the specified step, you need to use them simultaneously. The pt file will be used as an embedding, while the safetensors file will be loaded for Lora.\n\n\nFor example, if you want to use the model from step 672, you need to download '672/lutzow\\_azurlane.pt' as the embedding and '672/lutzow\\_azurlane.safetensors' for loading Lora. By using both files together, you can generate images for the desired characters.\n\n\nWhich Step Should I Use?\n------------------------\n\n\nWe selected 5 good steps for you to choose. The best one is step 672.\n\n\n1640 images (1.74 GiB) were generated for auto-testing.\n\n\n!Metrics Plot\n\n\nThe base model used for generating preview images is Meina/MeinaMix\\_V11.\n\n\nHere are the preview of the recommended steps:\n\n\n\nAnything Else?\n--------------\n\n\nBecause the automation of LoRA training always annoys some people. So for the following groups, it is not recommended to use this model and we express regret:\n\n\n1. Individuals who cannot tolerate any deviations from the original character design, even in the slightest detail.\n2. Individuals who are facing the application scenarios with high demands for accuracy in recreating character outfits.\n3. Individuals who cannot accept the potential randomness in AI-generated images based on the Stable Diffusion algorithm.\n4. Individuals who are not comfortable with the fully automated process of training character models using LoRA, or those who believe that training character models must be done purely through manual operations to avoid disrespecting the characters.\n5. Individuals who finds the generated image content offensive to their values.\n\n\nAll Steps\n---------\n\n\nWe uploaded the files in all steps. you can check the images, metrics and download them in the following links:\n\n\n* Steps From 1302 to 1680\n* Steps From 882 to 1260\n* Steps From 462 to 840\n* Steps From 42 to 420"
] | [
45,
38,
474
] | [
"passage: TAGS\n#art #not-for-all-audiences #text-to-image #dataset-CyberHarem/lutzow_azurlane #license-mit #region-us \n### If You Are Using A1111 WebUI v1.7+\n\n\nJust use it like the classic LoRA. The LoRA we provided are bundled with the embedding file."
] | [
0.0073394132778048515,
-0.010549704544246197,
-0.0042297495529055595,
0.08621204644441605,
0.07123208045959473,
0.08026517182588577,
0.22900600731372833,
0.07918063551187515,
0.1270841658115387,
-0.06914390623569489,
0.08924972265958786,
0.0653306394815445,
-0.007307658903300762,
0.020691493526101112,
-0.02775726281106472,
-0.14925000071525574,
-0.06656995415687561,
-0.02907472662627697,
0.001029966864734888,
0.011822095140814781,
0.0763825848698616,
0.011350534856319427,
0.0983053594827652,
-0.05391181632876396,
-0.03249238058924675,
0.05609484389424324,
-0.0339987687766552,
-0.036517780274152756,
0.028185799717903137,
0.0897507518529892,
0.11711950600147247,
0.017909446731209755,
0.0677335262298584,
-0.16571062803268433,
0.06597708910703659,
-0.014129635877907276,
-0.10894203186035156,
-0.009180656634271145,
0.021355755627155304,
-0.044933926314115524,
0.1290672868490219,
0.013349992223083973,
-0.11924748867750168,
0.03933550417423248,
-0.12812228500843048,
-0.008912532590329647,
-0.051386021077632904,
0.04379778727889061,
0.1326754093170166,
0.0560804083943367,
0.02687346749007702,
0.05822771415114403,
-0.052131958305835724,
0.08595522493124008,
0.1232328861951828,
-0.13823392987251282,
-0.06625501066446304,
0.11601819843053818,
0.01454639807343483,
0.12207049131393433,
-0.09699149429798126,
0.10163366049528122,
0.07399102300405502,
-0.0522603765130043,
-0.15963104367256165,
-0.09648581594228745,
-0.21006472408771515,
-0.01065956149250269,
0.011101752519607544,
0.024471543729305267,
0.3999309837818146,
0.06158438324928284,
0.034277547150850296,
0.06820721924304962,
-0.06364794075489044,
0.0342935249209404,
-0.10016029328107834,
0.14880704879760742,
0.04227902740240097,
0.09197763353586197,
-0.039498355239629745,
-0.1008162572979927,
-0.11443856358528137,
-0.0645618587732315,
-0.08145926892757416,
-0.010330756194889545,
0.02134815789759159,
0.11520188301801682,
-0.193708598613739,
0.0009486867347732186,
-0.05080511420965195,
-0.12826430797576904,
0.01905081979930401,
-0.09303071349859238,
0.17164623737335205,
0.06673146039247513,
-0.014845563098788261,
0.01781531050801277,
0.25495070219039917,
0.12046147137880325,
0.19592420756816864,
0.059022530913352966,
-0.10573422163724899,
0.1294027715921402,
0.0344168059527874,
-0.08393371850252151,
-0.001171184005215764,
-0.1052955612540245,
0.13855476677417755,
-0.05065685510635376,
0.11296171694993973,
-0.05986609309911728,
-0.11841679364442825,
0.010813113301992416,
-0.1083839163184166,
0.07063797861337662,
0.044598326086997986,
0.014636116102337837,
-0.04723624512553215,
0.04906409978866577,
0.026942426338791847,
-0.03773865848779678,
-0.0012713189935311675,
-0.004888713825494051,
-0.05510490760207176,
0.05509406700730324,
0.11503610014915466,
0.038771871477365494,
0.05652187764644623,
-0.00032209529308602214,
-0.01843077503144741,
0.004030346870422363,
-0.043248604983091354,
0.003893878310918808,
0.05295451357960701,
0.051528651267290115,
0.09310412406921387,
-0.15953217446804047,
-0.07747848331928253,
-0.015475706197321415,
0.056513771414756775,
0.011103455908596516,
0.09634315222501755,
-0.01513929758220911,
0.05314942076802254,
0.009295864962041378,
-0.021802673116326332,
0.04365043342113495,
-0.10284428298473358,
0.0898536890745163,
-0.016628025099635124,
0.08798634260892868,
-0.20301134884357452,
-0.008695783093571663,
-0.044098060578107834,
0.01193234883248806,
0.06870225071907043,
-0.00621507503092289,
-0.11302205920219421,
0.1277761459350586,
-0.011517074890434742,
0.070284903049469,
-0.10428588837385178,
0.05034532770514488,
0.025693662464618683,
0.08533646166324615,
-0.10059595853090286,
0.009977396577596664,
0.1090412363409996,
-0.13709646463394165,
-0.1526215821504593,
0.08338592946529388,
-0.02503243461251259,
0.02913396619260311,
0.05171336233615875,
0.1718297004699707,
0.193865567445755,
-0.1921922266483307,
-0.02686205692589283,
0.06476309150457382,
-0.015102121978998184,
-0.07957886904478073,
-0.013262752443552017,
0.10361480712890625,
0.011329183354973793,
0.035274457186460495,
-0.031681329011917114,
0.11907965689897537,
-0.02768828347325325,
-0.08408498018980026,
-0.02924584597349167,
-0.08165759593248367,
-0.06540779769420624,
0.04845677316188812,
-0.013356470502912998,
-0.0559520460665226,
0.01662023365497589,
-0.15125922858715057,
0.16881129145622253,
0.021116044372320175,
0.025361277163028717,
-0.08227512985467911,
0.09469357877969742,
0.02472817152738571,
0.005139710381627083,
0.01401133369654417,
-0.06545480340719223,
-0.11301487684249878,
0.22979237139225006,
0.08919977396726608,
0.08839143812656403,
0.06076938286423683,
-0.04836544021964073,
-0.06539074331521988,
0.01713053323328495,
0.013942566700279713,
-0.04404613375663757,
0.01645275019109249,
-0.10787057131528854,
0.046346016228199005,
-0.010492178611457348,
0.023583047091960907,
-0.012024999596178532,
-0.032000795006752014,
0.07305030524730682,
0.013382511213421822,
-0.02625555358827114,
0.08950091898441315,
0.050368018448352814,
-0.018679741770029068,
-0.06509494036436081,
0.004478786140680313,
0.07528053224086761,
-0.007613369729369879,
-0.08322758227586746,
0.018006445840001106,
0.0003546955995261669,
0.03982284292578697,
0.20205308496952057,
-0.21744351089000702,
0.05169466882944107,
0.004351521376520395,
0.052854735404253006,
0.03693767264485359,
0.010519801639020443,
-0.03808224946260452,
0.03196847438812256,
-0.024697832763195038,
0.06513775140047073,
-0.01451992429792881,
0.06781215220689774,
-0.032010700553655624,
-0.13768112659454346,
-0.010901358909904957,
-0.019665004685521126,
0.1690601110458374,
-0.1598653942346573,
0.062431998550891876,
0.174008309841156,
-0.11843879520893097,
0.14246009290218353,
-0.0047291601076722145,
-0.0035578347742557526,
0.008997579105198383,
0.03945908322930336,
-0.0024166323710232973,
0.10505597293376923,
-0.08140198141336441,
-0.027251489460468292,
0.02104748971760273,
-0.08627461642026901,
0.03447508066892624,
-0.11986401677131653,
-0.11249924451112747,
-0.06912178546190262,
-0.03966350480914116,
-0.04800692945718765,
0.024299412965774536,
-0.05404302850365639,
0.07466777414083481,
-0.08998099714517593,
-0.07968927174806595,
-0.024020332843065262,
-0.08751275390386581,
0.018951237201690674,
0.005929275415837765,
-0.060366492718458176,
-0.13680967688560486,
-0.11557285487651825,
-0.09316329658031464,
-0.1492132842540741,
-0.006363512482494116,
0.06618504971265793,
-0.11115848273038864,
-0.040313322097063065,
0.021026916801929474,
-0.044794078916311264,
0.09426891803741455,
-0.074591726064682,
0.028990982100367546,
0.05318182334303856,
-0.03751131147146225,
-0.17284776270389557,
-0.001784110558219254,
-0.06317402422428131,
-0.059807565063238144,
0.15275101363658905,
-0.1556997150182724,
0.18116150796413422,
-0.031337399035692215,
0.0647864118218422,
0.0636390969157219,
0.029409507289528847,
0.12308325618505478,
-0.10934481024742126,
0.07186829298734665,
0.18544626235961914,
0.041662923991680145,
0.08067575097084045,
0.11724551022052765,
0.08186287432909012,
-0.11328133940696716,
0.036794811487197876,
0.08059234917163849,
-0.10041553527116776,
-0.0767858475446701,
-0.0606655590236187,
-0.1128462627530098,
-0.04881154000759125,
0.05708445608615875,
0.06147845461964607,
0.03925999999046326,
0.12122569978237152,
-0.05719257891178131,
0.0013964964309707284,
0.10125664621591568,
0.043123893439769745,
0.07165176421403885,
0.013638230971992016,
0.05301594361662865,
-0.15142999589443207,
-0.0483914352953434,
0.16032880544662476,
0.20989441871643066,
0.21941527724266052,
0.026279306039214134,
0.06472039222717285,
0.11975141614675522,
0.09859731793403625,
0.0939439907670021,
0.056234344840049744,
0.006962707731872797,
0.015345310792326927,
-0.06886723637580872,
-0.058686885982751846,
0.012766631320118904,
0.0032738461159169674,
-0.04120966047048569,
-0.1500442922115326,
0.09918110072612762,
-0.004672762472182512,
0.08287893235683441,
0.12552490830421448,
0.03620746359229088,
-0.09634683281183243,
0.160476416349411,
0.09866640716791153,
0.08744390308856964,
-0.06454510241746902,
0.13126471638679504,
0.041447561234235764,
-0.0037331765051931143,
0.17178580164909363,
0.036642007529735565,
0.15109778940677643,
-0.033802926540374756,
-0.07241132110357285,
-0.07460091263055801,
-0.05936776474118233,
0.0066893212497234344,
0.03907472640275955,
-0.23261433839797974,
0.09384974092245102,
0.0586014948785305,
0.015505319461226463,
-0.01257152110338211,
-0.05607353523373604,
0.18606361746788025,
0.14876794815063477,
0.07832153141498566,
0.02777131088078022,
-0.024119585752487183,
-0.02104533649981022,
-0.09061142802238464,
0.05613989010453224,
0.027547264471650124,
0.06553074717521667,
-0.027867676690220833,
-0.09851469844579697,
-0.01946009323000908,
-0.0019138484494760633,
0.027815645560622215,
-0.08581100404262543,
-0.10848987847566605,
-0.04283265769481659,
0.25068628787994385,
-0.05685001239180565,
0.04894206300377846,
0.054949481040239334,
0.019968844950199127,
-0.04262147471308708,
0.029919426888227463,
-0.03212146461009979,
-0.019361374899744987,
-0.04009077697992325,
0.0016006717924028635,
0.005922746378928423,
-0.05211881548166275,
-0.05137760192155838,
-0.030638625845313072,
-0.10530290752649307,
-0.10983099788427353,
-0.0018353888299316168,
-0.04515821859240532,
0.014908612705767155,
-0.014715811237692833,
0.027343623340129852,
-0.0957871600985527,
-0.029580838978290558,
0.027876706793904305,
0.029145915061235428,
-0.06788080930709839,
-0.1396821141242981,
-0.007723940536379814,
-0.01871560327708721,
-0.06984002143144608,
0.0350005216896534,
-0.10808447003364563,
-0.08974479138851166,
-0.05395277589559555,
-0.031135227531194687,
0.12419828772544861,
0.23116283118724823,
-0.019706081598997116,
0.0031454372219741344,
0.1392146497964859,
-0.10037773102521896,
-0.32751041650772095,
-0.16256114840507507,
-0.15682893991470337,
-0.10043428093194962,
0.03143872693181038,
-0.07772210240364075,
0.02557564526796341,
0.06855693459510803,
-0.03858962655067444,
0.21626313030719757,
-0.2092430740594864,
-0.09844319522380829,
0.08219033479690552,
0.08416178822517395,
0.32029345631599426,
-0.24758759140968323,
0.007576202508062124,
-0.11922798305749893,
-0.03441201150417328,
0.01768207736313343,
-0.08648506551980972,
0.11753284931182861,
0.04409828037023544,
0.08622792363166809,
0.0010787637438625097,
-0.002855223137885332,
0.14638538658618927,
-0.0688466727733612,
0.13466928899288177,
-0.12484609335660934,
-0.1053386852145195,
0.1859786957502365,
-0.03440980985760689,
0.004806374199688435,
-0.21493569016456604,
-0.03723704442381859,
-0.03897864371538162,
0.03431672602891922,
-0.009128888137638569,
0.06154676526784897,
-0.013674433343112469,
-0.009207298047840595,
-0.12962648272514343,
-0.020473353564739227,
-0.03290219232439995,
0.059161409735679626,
0.22191999852657318,
-0.06184346601366997,
-0.07965655624866486,
0.031953245401382446,
-0.001956627005711198,
0.1079115942120552,
0.007634117733687162,
-0.0573599711060524,
-0.042411647737026215,
0.09263313561677933,
-0.2021985948085785,
0.05732859671115875,
0.012489428743720055,
-0.0005627440987154841,
0.007224331609904766,
0.008965385146439075,
0.021589715033769608,
0.11203507333993912,
0.18385474383831024,
-0.008587509393692017,
-0.02441420592367649,
-0.023306487128138542,
0.02701055072247982,
0.12956197559833527,
-0.016966626048088074,
0.1157195121049881,
0.021603990346193314,
0.04391949623823166,
0.012087746523320675,
0.055465128272771835,
-0.08386167138814926,
-0.08576904237270355,
0.10260216891765594,
-0.04832194745540619,
-0.08557768911123276,
0.091383196413517,
0.04349929839372635,
0.083020880818367,
0.002219550544396043,
0.05093112587928772,
0.018259139731526375,
-0.12683692574501038,
0.012627427466213703,
0.21860428154468536,
-0.08386970311403275,
-0.06334905326366425,
-0.0646321251988411,
0.011464878916740417,
-0.12431047111749649,
0.08648113906383514,
0.03647875413298607,
-0.023950034752488136,
0.12256959825754166,
-0.04060588404536247,
-0.02620759792625904,
0.004680797923356295,
-0.056047093123197556,
0.03239698335528374,
-0.14868323504924774,
-0.18526118993759155,
0.04967127740383148,
-0.001370907062664628,
-0.06164924055337906,
-0.08642725646495819,
-0.09410344809293747,
0.07019975036382675,
-0.16691581904888153,
0.14537708461284637,
-0.06778952479362488,
0.06421030312776566,
-0.0366220586001873,
-0.054564010351896286,
-0.10851189494132996,
-0.020714761689305305,
-0.05155174061655998,
-0.021648062393069267,
0.052449703216552734,
0.017340565100312233,
-0.12292666733264923,
-0.11991926282644272,
0.06310015916824341,
-0.0030826490838080645,
-0.010055051185190678,
0.018034709617495537,
-0.06931954622268677,
0.032662034034729004,
-0.22105355560779572,
-0.06347852945327759,
0.08473557978868484,
0.043678294867277145,
-0.09170254319906235,
0.1308869570493698,
0.04051195830106735,
-0.02440621145069599,
0.04550618305802345,
0.00474341306835413,
0.18479254841804504,
-0.07977108657360077,
0.028783682733774185,
-0.12510553002357483,
-0.16319382190704346,
-0.023838356137275696,
0.030609549954533577,
0.22561678290367126,
0.08684763312339783,
0.12478306889533997,
-0.05101010575890541,
0.0229121632874012,
-0.014381743967533112,
0.07286912202835083,
0.01608128286898136,
-0.10051659494638443,
-0.05456254631280899,
-0.17302623391151428,
-0.06034862995147705,
-0.0651865303516388,
0.14726263284683228,
0.036684490740299225,
-0.14482399821281433,
-0.005697218701243401,
0.11271430552005768,
-0.17692410945892334,
-0.014086144044995308,
0.17806661128997803,
-0.045462820678949356,
0.031445909291505814,
-0.15582698583602905,
0.030408138409256935,
0.07951708137989044,
-0.0398559533059597,
-0.0012893539387732744,
0.12461811304092407,
0.008592082187533379,
0.0030728033743798733,
0.03976981341838837,
-0.03953726962208748,
0.08033052086830139,
-0.07217662781476974,
0.05942435562610626,
-0.010701708495616913,
-0.05386160686612129,
-0.11073947697877884,
0.1839739978313446,
-0.014013523235917091,
0.014070302248001099,
-0.056813206523656845,
0.0010080179199576378,
-0.09926800429821014,
-0.11055093258619308,
-0.07016263902187347,
-0.12461134046316147,
0.07090841233730316,
-0.06307900696992874,
0.014098810032010078,
-0.0006224437966011465,
0.014770043082535267,
-0.07006119191646576,
0.014809731394052505,
-0.19852423667907715,
-0.054976221174001694,
0.014092907309532166,
-0.01963287591934204,
-0.022199338302016258,
-0.04289180785417557,
-0.039222195744514465,
0.023258185014128685,
-0.05638603866100311,
-0.06538848578929901,
0.06570151448249817,
0.08407812565565109,
0.05971623584628105,
-0.16310788691043854,
-0.10513842850923538,
-0.07436707615852356,
0.033762458711862564,
0.08659717440605164,
0.18375730514526367,
0.039345454424619675,
-0.008738207630813122,
0.040286093950271606,
0.13631415367126465,
0.014890458434820175,
-0.09314759820699692,
-0.05637441202998161,
-0.13602939248085022,
-0.14007967710494995,
-0.012178177013993263,
-0.06993966549634933,
-0.02035694196820259,
0.029925202950835228,
0.25317656993865967,
0.1892089694738388,
-0.14936523139476776,
0.04386821389198303,
-0.08110599219799042,
0.04337944835424423,
-0.022020330652594566,
0.1613059639930725,
0.049435392022132874,
0.16231945157051086,
-0.033355142921209335,
-0.04222744703292847,
-0.06272610276937485,
0.022526061162352562,
-0.10144080221652985,
0.04281209036707878,
-0.017777571454644203,
-0.06760364025831223,
-0.06557179987430573,
0.10514934360980988,
-0.11747641861438751,
0.06189287081360817,
0.18281230330467224,
-0.1449393928050995,
-0.010392621159553528,
-0.038955673575401306,
0.06017112359404564,
0.11639378219842911,
0.021538808941841125,
-0.08293160796165466,
-0.024266023188829422,
-0.007948322221636772,
0.024848461151123047,
-0.180983766913414,
-0.11258315294981003,
-0.006397177465260029,
-0.11110244691371918,
0.12969578802585602,
-0.006899735890328884,
0.011363315396010876,
0.03427198529243469,
-0.06208315119147301,
-0.0030112704262137413,
0.17066223919391632,
0.02096932753920555,
-0.04559873044490814,
-0.031645093113183975,
-0.07625587284564972,
-0.10849780589342117,
0.07772058993577957,
0.0901295393705368,
0.05181007459759712,
-0.00437180558219552,
0.19197756052017212,
-0.019376767799258232,
-0.04202113673090935,
0.1365736722946167,
-0.17748664319515228,
0.08628100901842117,
-0.0010620784014463425,
-0.019084947183728218,
-0.07375967502593994,
-0.04413628205657005,
0.04170209541916847,
0.07290447503328323,
-0.17593254148960114,
-0.049171265214681625,
0.05603192374110222,
-0.09880358725786209,
0.05423866957426071,
0.04234907776117325,
-0.07693279534578323,
0.015632005408406258,
-0.11926818639039993,
0.0019525899551808834,
-0.10096772760152817,
0.04910609498620033,
0.19890375435352325,
-0.03505706414580345,
0.013551394455134869,
-0.14775650203227997,
0.05720609426498413,
-0.030718257650732994,
-0.041673604398965836,
-0.08219397068023682
] |
null | null | transformers |
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
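Since the snippet above has not been filled in yet, the following is a minimal, hedged sketch that assumes this checkpoint (published as `kaushalpowar/llama2_finetuned_5merged`) loads with the standard `transformers` causal-LM API; the prompt and generation settings are placeholders.

```python
# Minimal sketch (assumption: the checkpoint follows the standard
# transformers causal-LM interface). Requires `pip install transformers torch`.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "kaushalpowar/llama2_finetuned_5merged"  # repo id from this card's metadata

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype="auto")

prompt = "Hello, how are you?"  # placeholder prompt
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```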
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed]
| {"library_name": "transformers", "tags": []} | text-generation | kaushalpowar/llama2_finetuned_5merged | [
"transformers",
"safetensors",
"llama",
"text-generation",
"arxiv:1910.09700",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | 2024-02-14T15:50:00+00:00 | [
"1910.09700"
] | [] | TAGS
#transformers #safetensors #llama #text-generation #arxiv-1910.09700 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
# Model Card for Model ID
## Model Details
### Model Description
This is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.
- Developed by:
- Funded by [optional]:
- Shared by [optional]:
- Model type:
- Language(s) (NLP):
- License:
- Finetuned from model [optional]:
### Model Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Downstream Use [optional]
### Out-of-Scope Use
## Bias, Risks, and Limitations
### Recommendations
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
## Training Details
### Training Data
### Training Procedure
#### Preprocessing [optional]
#### Training Hyperparameters
- Training regime:
#### Speeds, Sizes, Times [optional]
## Evaluation
### Testing Data, Factors & Metrics
#### Testing Data
#### Factors
#### Metrics
### Results
#### Summary
## Model Examination [optional]
## Environmental Impact
Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).
- Hardware Type:
- Hours used:
- Cloud Provider:
- Compute Region:
- Carbon Emitted:
## Technical Specifications [optional]
### Model Architecture and Objective
### Compute Infrastructure
#### Hardware
#### Software
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Model Card Authors [optional]
## Model Card Contact
| [
"# Model Card for Model ID",
"## Model Details",
"### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:",
"### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Downstream Use [optional]",
"### Out-of-Scope Use",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.",
"## How to Get Started with the Model\n\nUse the code below to get started with the model.",
"## Training Details",
"### Training Data",
"### Training Procedure",
"#### Preprocessing [optional]",
"#### Training Hyperparameters\n\n- Training regime:",
"#### Speeds, Sizes, Times [optional]",
"## Evaluation",
"### Testing Data, Factors & Metrics",
"#### Testing Data",
"#### Factors",
"#### Metrics",
"### Results",
"#### Summary",
"## Model Examination [optional]",
"## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:",
"## Technical Specifications [optional]",
"### Model Architecture and Objective",
"### Compute Infrastructure",
"#### Hardware",
"#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Model Card Authors [optional]",
"## Model Card Contact"
] | [
"TAGS\n#transformers #safetensors #llama #text-generation #arxiv-1910.09700 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n",
"# Model Card for Model ID",
"## Model Details",
"### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:",
"### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Downstream Use [optional]",
"### Out-of-Scope Use",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.",
"## How to Get Started with the Model\n\nUse the code below to get started with the model.",
"## Training Details",
"### Training Data",
"### Training Procedure",
"#### Preprocessing [optional]",
"#### Training Hyperparameters\n\n- Training regime:",
"#### Speeds, Sizes, Times [optional]",
"## Evaluation",
"### Testing Data, Factors & Metrics",
"#### Testing Data",
"#### Factors",
"#### Metrics",
"### Results",
"#### Summary",
"## Model Examination [optional]",
"## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:",
"## Technical Specifications [optional]",
"### Model Architecture and Objective",
"### Compute Infrastructure",
"#### Hardware",
"#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Model Card Authors [optional]",
"## Model Card Contact"
] | [
56,
6,
3,
82,
28,
3,
4,
9,
9,
10,
42,
20,
3,
4,
5,
9,
11,
13,
3,
12,
5,
4,
5,
3,
4,
9,
53,
9,
8,
6,
3,
14,
8,
7,
9,
4
] | [
"passage: TAGS\n#transformers #safetensors #llama #text-generation #arxiv-1910.09700 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n# Model Card for Model ID## Model Details### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Downstream Use [optional]### Out-of-Scope Use## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.## How to Get Started with the Model\n\nUse the code below to get started with the model.## Training Details### Training Data### Training Procedure#### Preprocessing [optional]#### Training Hyperparameters\n\n- Training regime:#### Speeds, Sizes, Times [optional]## Evaluation### Testing Data, Factors & Metrics#### Testing Data#### Factors#### Metrics### Results#### Summary## Model Examination [optional]## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:## Technical Specifications [optional]### Model Architecture and Objective### Compute Infrastructure#### Hardware#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Model Card Authors [optional]## Model Card Contact"
] | [
-0.06061961501836777,
0.15481999516487122,
-0.004844071343541145,
0.02074851468205452,
0.0983177199959755,
0.007407687604427338,
0.07119518518447876,
0.11185134947299957,
-0.023851769044995308,
0.1167980208992958,
0.031993988901376724,
0.09781743586063385,
0.11217817664146423,
0.16186554729938507,
0.0015333457849919796,
-0.22897611558437347,
0.049678247421979904,
-0.125278040766716,
-0.0294334813952446,
0.11977242678403854,
0.1422213912010193,
-0.10954539477825165,
0.0752737894654274,
-0.038042325526475906,
-0.005828251596540213,
-0.0323176346719265,
-0.06205610930919647,
-0.05266609415411949,
0.05311284959316254,
0.06794639676809311,
0.07308239489793777,
0.01171939354389906,
0.09106900542974472,
-0.2724283039569855,
0.02348201349377632,
0.0805930644273758,
-0.0006441773730330169,
0.07586129754781723,
0.04993962123990059,
-0.08749990910291672,
0.07524524629116058,
-0.060156844556331635,
0.1498761922121048,
0.07955671846866608,
-0.09018243104219437,
-0.19217631220817566,
-0.07921334356069565,
0.09916994720697403,
0.1890910118818283,
0.05953684076666832,
-0.026427440345287323,
0.11642678081989288,
-0.08593545109033585,
0.013638701289892197,
0.06446459144353867,
-0.06054406240582466,
-0.055855002254247665,
0.06904532760381699,
0.08335285633802414,
0.08567540347576141,
-0.12976622581481934,
-0.010767064057290554,
0.015032444149255753,
0.008952446281909943,
0.08948688954114914,
0.017146794125437737,
0.1335189938545227,
0.040557652711868286,
-0.13501930236816406,
-0.043155476450920105,
0.09761431813240051,
0.03665134683251381,
-0.04888195917010307,
-0.2485782504081726,
-0.023432478308677673,
-0.04339504987001419,
-0.03198111802339554,
-0.03649339824914932,
0.043764639645814896,
-0.014506848528981209,
0.07738617807626724,
-0.004502781666815281,
-0.0837155357003212,
-0.04301247000694275,
0.07241875678300858,
0.06128999963402748,
0.02571401372551918,
-0.015821760520339012,
0.0059297760017216206,
0.12327717989683151,
0.11431120336055756,
-0.126715749502182,
-0.052547648549079895,
-0.06306339055299759,
-0.08449548482894897,
-0.044861067086458206,
0.030838407576084137,
0.037995077669620514,
0.045936476439237595,
0.23867325484752655,
0.007765117567032576,
0.053257301449775696,
0.04455438256263733,
0.014407169073820114,
0.06501194834709167,
0.11008983850479126,
-0.05894824117422104,
-0.09719445556402206,
-0.028582042083144188,
0.10156717151403427,
0.007986726239323616,
-0.04139331728219986,
-0.05712985619902611,
0.07059531658887863,
0.018587570637464523,
0.12360043078660965,
0.08000938594341278,
0.003056557849049568,
-0.0755772516131401,
-0.062465377151966095,
0.17764076590538025,
-0.15825673937797546,
0.04532013460993767,
0.03055616281926632,
-0.0341108962893486,
-0.009745313785970211,
0.012105142697691917,
0.025474950671195984,
-0.021481726318597794,
0.09522198140621185,
-0.05601342022418976,
-0.034448131918907166,
-0.11389608681201935,
-0.03694311901926994,
0.030394554138183594,
0.011153047904372215,
-0.02865210548043251,
-0.03502652049064636,
-0.08865131437778473,
-0.06405586749315262,
0.09101516753435135,
-0.07148737460374832,
-0.04784895107150078,
-0.016645915806293488,
-0.07833752781152725,
0.021804187446832657,
0.01691517047584057,
0.09064167737960815,
-0.0222476739436388,
0.03985358029603958,
-0.0550384595990181,
0.061440225690603256,
0.11723454296588898,
0.027987057343125343,
-0.05787884071469307,
0.061519939452409744,
-0.2424532175064087,
0.10252492874860764,
-0.07715212553739548,
0.04971238598227501,
-0.15203025937080383,
-0.02478341944515705,
0.03986154496669769,
0.01284773275256157,
-0.008251311257481575,
0.14196595549583435,
-0.21994100511074066,
-0.030957341194152832,
0.16964265704154968,
-0.10025953501462936,
-0.08109250664710999,
0.060782887041568756,
-0.05354252830147743,
0.11210215091705322,
0.04557164013385773,
-0.02375967986881733,
0.05775221437215805,
-0.14725260436534882,
-0.011030761525034904,
-0.041942402720451355,
-0.0180682260543108,
0.16207332909107208,
0.0703711211681366,
-0.06047816202044487,
0.07456906884908676,
0.01960151270031929,
-0.014246034435927868,
-0.04887177795171738,
-0.02822130173444748,
-0.1047162413597107,
0.01184528972953558,
-0.06102835759520531,
0.018109694123268127,
-0.021768750622868538,
-0.09445013850927353,
-0.029118487611413002,
-0.17402999103069305,
-0.0031633328180760145,
0.08821269869804382,
-0.011630427092313766,
-0.021509924903512,
-0.11245372891426086,
0.009332616813480854,
0.030967719852924347,
0.0002618339203763753,
-0.13677829504013062,
-0.06033218279480934,
0.026970699429512024,
-0.16097871959209442,
0.029791243374347687,
-0.05741601809859276,
0.04530094936490059,
0.04005871340632439,
-0.03433511033654213,
-0.03489551320672035,
0.010874404571950436,
0.010431389324367046,
-0.01894843392074108,
-0.25422003865242004,
-0.01882786676287651,
-0.0234990194439888,
0.1751047968864441,
-0.22956320643424988,
0.042598169296979904,
0.07489731162786484,
0.1460893303155899,
0.007349682506173849,
-0.03550100699067116,
0.015185600146651268,
-0.07262228429317474,
-0.03268764168024063,
-0.06316669285297394,
-0.01207790058106184,
-0.038400664925575256,
-0.05820201337337494,
0.04906858503818512,
-0.1686294972896576,
-0.030321966856718063,
0.10717973858118057,
0.06342670321464539,
-0.1473218947649002,
-0.02780107781291008,
-0.04056945815682411,
-0.04624456167221069,
-0.06676914542913437,
-0.05461418256163597,
0.11812574416399002,
0.056411582976579666,
0.04860803112387657,
-0.07140495628118515,
-0.07455260306596756,
0.008036690764129162,
-0.01956399530172348,
-0.014917809516191483,
0.09334591031074524,
0.07554110884666443,
-0.12264352291822433,
0.09177418053150177,
0.09668384492397308,
0.08576478064060211,
0.10314212739467621,
-0.014663571491837502,
-0.08914592862129211,
-0.040637146681547165,
0.02245822176337242,
0.016187267377972603,
0.15129362046718597,
-0.012961224652826786,
0.055492039769887924,
0.0358695350587368,
-0.014034898020327091,
0.011105312965810299,
-0.09736533463001251,
0.02655916102230549,
0.030835967510938644,
-0.016302183270454407,
0.03745110332965851,
-0.0447014644742012,
0.019208140671253204,
0.09039704501628876,
0.040895868092775345,
0.040978945791721344,
0.010155045427381992,
-0.04354988783597946,
-0.11037563532590866,
0.1787576973438263,
-0.12389461696147919,
-0.24818050861358643,
-0.13812170922756195,
0.010281167924404144,
0.04737642779946327,
-0.010411068797111511,
0.006690691225230694,
-0.06616118550300598,
-0.1175973042845726,
-0.09878289699554443,
0.018617089837789536,
0.045352302491664886,
-0.07590975612401962,
-0.06842505931854248,
0.06414616107940674,
0.03875524550676346,
-0.13939815759658813,
0.024007495492696762,
0.04662325978279114,
-0.08205481618642807,
-0.0029386086389422417,
0.0791812464594841,
0.06965780258178711,
0.17661017179489136,
0.013885351829230785,
-0.023669935762882233,
0.026634456589818,
0.20819635689258575,
-0.1436755359172821,
0.10975687950849533,
0.13545554876327515,
-0.08767466992139816,
0.08120133727788925,
0.1998777538537979,
0.03777998685836792,
-0.10680917650461197,
0.03608465939760208,
0.028374753892421722,
-0.028325283899903297,
-0.2502254545688629,
-0.06958996504545212,
0.0019060121849179268,
-0.05172049254179001,
0.07064855098724365,
0.08791537582874298,
0.09593888372182846,
0.016860228031873703,
-0.09976044297218323,
-0.07697858661413193,
0.046900223940610886,
0.10824491083621979,
-0.00015424020239152014,
-0.015208319760859013,
0.0904119610786438,
-0.03033481352031231,
0.01743943803012371,
0.09215071052312851,
0.0030607767403125763,
0.17535938322544098,
0.051709048449993134,
0.17189906537532806,
0.07866133749485016,
0.06444311141967773,
0.02004685252904892,
0.007725914940237999,
0.021817529574036598,
0.017227526754140854,
-0.0030957073904573917,
-0.08709781616926193,
-0.0034981227945536375,
0.1202581599354744,
0.049845851957798004,
0.029173865914344788,
0.012042860500514507,
-0.030704669654369354,
0.08337877690792084,
0.1770893782377243,
0.0029054484330117702,
-0.1893385946750641,
-0.07169844210147858,
0.07795937359333038,
-0.08648337423801422,
-0.10729733109474182,
-0.029470939189195633,
0.041069481521844864,
-0.1729043871164322,
0.016882894560694695,
-0.019335895776748657,
0.10788324475288391,
-0.13190391659736633,
-0.01772487722337246,
0.05657728388905525,
0.06932812184095383,
-0.009677323512732983,
0.06694949418306351,
-0.16090403497219086,
0.11770165711641312,
0.01751571334898472,
0.06636732816696167,
-0.09608277678489685,
0.09618937969207764,
-0.007830657996237278,
0.0041499207727611065,
0.1410749852657318,
0.010120149701833725,
-0.05952107161283493,
-0.09608154743909836,
-0.10546442121267319,
-0.009841260500252247,
0.1306990385055542,
-0.14852415025234222,
0.08813067525625229,
-0.02661319263279438,
-0.044553373008966446,
0.003614129964262247,
-0.12497276812791824,
-0.13103094696998596,
-0.18366187810897827,
0.05707118660211563,
-0.12947207689285278,
0.04045100137591362,
-0.10902881622314453,
-0.045833900570869446,
-0.02098964899778366,
0.20040063560009003,
-0.23137451708316803,
-0.06714103370904922,
-0.1551055610179901,
-0.08061286807060242,
0.14446212351322174,
-0.046455029398202896,
0.08550118654966354,
0.0008278203313238919,
0.19068008661270142,
0.021319707855582237,
-0.017237508669495583,
0.1072206199169159,
-0.10052918642759323,
-0.2010865956544876,
-0.09273224323987961,
0.15895552933216095,
0.13766798377037048,
0.03809428587555885,
-0.004381525795906782,
0.03171157464385033,
-0.02098114788532257,
-0.12076930701732635,
0.020226983353495598,
0.17317426204681396,
0.08982043713331223,
0.025265544652938843,
-0.02972041629254818,
-0.11267432570457458,
-0.07061342149972916,
-0.03774050623178482,
0.024755435064435005,
0.18072067201137543,
-0.07222156971693039,
0.18405316770076752,
0.13775517046451569,
-0.05534014105796814,
-0.19904261827468872,
0.021996473893523216,
0.04293542355298996,
0.0070380112156271935,
0.0323902890086174,
-0.20307663083076477,
0.09384101629257202,
0.0008334947633557022,
-0.05131231248378754,
0.1379684954881668,
-0.1823476254940033,
-0.151598259806633,
0.06042521819472313,
0.043563615530729294,
-0.19374065101146698,
-0.12374074012041092,
-0.08848230540752411,
-0.04693066328763962,
-0.15487661957740784,
0.10312657803297043,
0.0020827590487897396,
0.008401188999414444,
0.03778626397252083,
0.02252252586185932,
0.012139533646404743,
-0.04198719933629036,
0.1914343535900116,
-0.025891713798046112,
0.03347287327051163,
-0.0790715217590332,
-0.060851071029901505,
0.062408581376075745,
-0.058187782764434814,
0.0755455270409584,
-0.025226406753063202,
0.015947066247463226,
-0.10598332434892654,
-0.048235729336738586,
-0.02852320298552513,
0.019321219995617867,
-0.09431382268667221,
-0.09348297864198685,
-0.04829427972435951,
0.09367614984512329,
0.09042316675186157,
-0.03652578964829445,
-0.03649144619703293,
-0.078715980052948,
0.038977332413196564,
0.17627815902233124,
0.18159319460391998,
0.04659178853034973,
-0.07959239184856415,
-0.001915142871439457,
-0.014336181804537773,
0.04684065282344818,
-0.22077152132987976,
0.060553863644599915,
0.04557652771472931,
0.016117896884679794,
0.11537692695856094,
-0.0208132341504097,
-0.16198977828025818,
-0.06710557639598846,
0.061360616236925125,
-0.06944561004638672,
-0.17825035750865936,
0.0039279889315366745,
0.07344977557659149,
-0.16578389704227448,
-0.037031736224889755,
0.04200848564505577,
-0.01189455483108759,
-0.0403641052544117,
0.012352054007351398,
0.08063354343175888,
0.007078902795910835,
0.07699975371360779,
0.055281639099121094,
0.09124495089054108,
-0.10227900743484497,
0.07410510629415512,
0.08149529248476028,
-0.08644098788499832,
0.030720343813300133,
0.09573426842689514,
-0.06469762325286865,
-0.0346054881811142,
0.04237886518239975,
0.08354541659355164,
0.024281201884150505,
-0.04682289808988571,
0.0023111123591661453,
-0.09734189510345459,
0.05927345156669617,
0.11483542621135712,
0.03496333956718445,
0.011234734207391739,
0.03813567012548447,
0.04486291855573654,
-0.08093374222517014,
0.11926916986703873,
0.023795632645487785,
0.020354853942990303,
-0.04112942889332771,
-0.040553025901317596,
0.035851649940013885,
-0.026020776480436325,
-0.011440055444836617,
-0.035174157470464706,
-0.0722682997584343,
-0.014069457538425922,
-0.16000694036483765,
-0.0076758842915296555,
-0.03660871088504791,
0.005114538595080376,
0.022510098293423653,
-0.03652830421924591,
0.00792311318218708,
0.012217256240546703,
-0.06868947297334671,
-0.05553458258509636,
-0.023233558982610703,
0.09422210603952408,
-0.16494666039943695,
0.0220257006585598,
0.0823851153254509,
-0.12121747434139252,
0.09289738535881042,
0.016782134771347046,
0.00412249518558383,
0.026962365955114365,
-0.1545863002538681,
0.04763968288898468,
-0.020152103155851364,
0.013473534025251865,
0.04222847521305084,
-0.21637047827243805,
-0.004404853098094463,
-0.04015503451228142,
-0.05566934496164322,
-0.008993052877485752,
-0.0319182425737381,
-0.11338426172733307,
0.09645436704158783,
0.011025024577975273,
-0.08443772792816162,
-0.02965564839541912,
0.03353232145309448,
0.07690354436635971,
-0.027447547763586044,
0.1498211771249771,
-0.004663881380110979,
0.07559948414564133,
-0.17581342160701752,
-0.02282017655670643,
-0.011197620071470737,
0.022367527708411217,
-0.021871577948331833,
-0.01622559316456318,
0.04623444378376007,
-0.02704801969230175,
0.19120801985263824,
-0.024701936170458794,
0.049393873661756516,
0.06364397704601288,
0.009232889860868454,
-0.013832193799316883,
0.11151392012834549,
0.05708572641015053,
0.024334950372576714,
0.022262847051024437,
0.003451440716162324,
-0.04008655622601509,
-0.009981024079024792,
-0.18596695363521576,
0.06803664565086365,
0.14585918188095093,
0.09060460329055786,
-0.012669353745877743,
0.0707244873046875,
-0.10161512345075607,
-0.12005364894866943,
0.10127941519021988,
-0.06415384262800217,
-0.010188822634518147,
-0.06542414426803589,
0.14027701318264008,
0.14953285455703735,
-0.1886233240365982,
0.06583356112241745,
-0.06602055579423904,
-0.0566304549574852,
-0.11457879096269608,
-0.1930263340473175,
-0.057075321674346924,
-0.050602465867996216,
-0.018466074019670486,
-0.05384097993373871,
0.06939727067947388,
0.05750798434019089,
0.01126816775649786,
0.00868057832121849,
0.08568526059389114,
-0.009656033478677273,
0.00248199631460011,
0.030120067298412323,
0.06713981181383133,
0.016768986359238625,
-0.0321255661547184,
0.0179112758487463,
-0.00597198773175478,
0.034156378358602524,
0.059282708913087845,
0.03608176112174988,
-0.028436895459890366,
0.015559280291199684,
-0.034912437200546265,
-0.11309733241796494,
0.042801856994628906,
-0.029640642926096916,
-0.0749855786561966,
0.1347348988056183,
0.026981467381119728,
0.005015076603740454,
-0.023140020668506622,
0.2503887414932251,
-0.07436972856521606,
-0.09334370493888855,
-0.14373961091041565,
0.11701542884111404,
-0.04212593287229538,
0.0635172426700592,
0.03596310690045357,
-0.10810714215040207,
0.017985546961426735,
0.1320217251777649,
0.15442703664302826,
-0.04732590913772583,
0.019251897931098938,
0.028577854856848717,
0.00439635943621397,
-0.04075566306710243,
0.05177190154790878,
0.07100846618413925,
0.14500564336776733,
-0.05157303810119629,
0.08530787378549576,
0.002609728369861841,
-0.1021018698811531,
-0.041973695158958435,
0.11415864527225494,
-0.014296893030405045,
0.017620453611016273,
-0.057136841118335724,
0.124222531914711,
-0.05874236673116684,
-0.23697422444820404,
0.06316976249217987,
-0.0765061303973198,
-0.1432730257511139,
-0.024886758998036385,
0.071670763194561,
-0.016632623970508575,
0.02605951391160488,
0.07167234271764755,
-0.0754380151629448,
0.18880942463874817,
0.03957989811897278,
-0.05233397334814072,
-0.05954399332404137,
0.0744764655828476,
-0.11850855499505997,
0.27879106998443604,
0.010482731275260448,
0.051307905465364456,
0.1042102724313736,
-0.02021743729710579,
-0.13270841538906097,
0.023401619866490364,
0.09579801559448242,
-0.08917027711868286,
0.04087764397263527,
0.21448291838169098,
-0.00629545608535409,
0.11935057491064072,
0.07611140608787537,
-0.07468950748443604,
0.047562725841999054,
-0.11468592286109924,
-0.07639975845813751,
-0.08699081838130951,
0.09244474768638611,
-0.06785612553358078,
0.14258281886577606,
0.12599852681159973,
-0.05530165135860443,
0.011584274470806122,
-0.028389399871230125,
0.045467376708984375,
0.005578654818236828,
0.100032277405262,
0.011115525849163532,
-0.18496567010879517,
0.024811718612909317,
0.016259413212537766,
0.10884406417608261,
-0.18112654983997345,
-0.09105053544044495,
0.046958595514297485,
0.0005061255069449544,
-0.06443515419960022,
0.12483241409063339,
0.057313691824674606,
0.04654949903488159,
-0.0451689288020134,
-0.026830285787582397,
-0.006042256020009518,
0.14264579117298126,
-0.10707559436559677,
-0.005129707511514425
] |
null | null | peft |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# mistral_3
This model is a fine-tuned version of [ybelkada/mistral-7b-instruct-v0.1-sharded](https://huggingface.co/ybelkada/mistral-7b-instruct-v0.1-sharded) on an unknown dataset.
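Because this checkpoint is a PEFT adapter rather than a standalone model, inference requires attaching it to the base model named above. A minimal sketch, assuming both the base model and the adapter published at this card's Hub id (`anyiwang/mistral_3`) are accessible:

```python
# Hedged sketch: load the base model, then attach this card's PEFT adapter.
# The adapter repo id comes from this card's metadata; availability is assumed.
from peft import PeftModel
from transformers import AutoModelForCausalLM, AutoTokenizer

base_id = "ybelkada/mistral-7b-instruct-v0.1-sharded"
adapter_id = "anyiwang/mistral_3"

tokenizer = AutoTokenizer.from_pretrained(base_id)
base_model = AutoModelForCausalLM.from_pretrained(base_id, device_map="auto")
model = PeftModel.from_pretrained(base_model, adapter_id)

prompt = "Summarize what this adapter was trained to do."  # placeholder prompt
inputs = tokenizer(prompt, return_tensors="pt").to(base_model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```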
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (see the sketch after this list):
- learning_rate: 0.0004
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: cosine
- num_epochs: 8
- mixed_precision_training: Native AMP
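Expressed as Hugging Face Transformers `TrainingArguments`, those settings correspond roughly to the sketch below. The output directory and any settings not listed above (warmup, logging, gradient accumulation) are assumptions, and the dataset wiring plus the TRL `SFTTrainer` call are omitted.

```python
# Rough reconstruction of the hyperparameters listed above; values not stated
# in the card (e.g. output_dir) are assumptions for illustration only.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="mistral_3",        # assumed
    learning_rate=4e-4,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="cosine",
    num_train_epochs=8,
    fp16=True,                     # "Native AMP" mixed precision
)
```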
### Training results
### Framework versions
- PEFT 0.8.2
- Transformers 4.37.2
- Pytorch 2.2.0
- Datasets 2.17.0
- Tokenizers 0.15.2 | {"library_name": "peft", "tags": ["trl", "sft", "generated_from_trainer"], "base_model": "ybelkada/mistral-7b-instruct-v0.1-sharded", "model-index": [{"name": "mistral_3", "results": []}]} | null | anyiwang/mistral_3 | [
"peft",
"safetensors",
"trl",
"sft",
"generated_from_trainer",
"base_model:ybelkada/mistral-7b-instruct-v0.1-sharded",
"region:us"
] | 2024-02-14T15:52:45+00:00 | [] | [] | TAGS
#peft #safetensors #trl #sft #generated_from_trainer #base_model-ybelkada/mistral-7b-instruct-v0.1-sharded #region-us
|
# mistral_3
This model is a fine-tuned version of ybelkada/mistral-7b-instruct-v0.1-sharded on an unknown dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0004
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: cosine
- num_epochs: 8
- mixed_precision_training: Native AMP
### Training results
### Framework versions
- PEFT 0.8.2
- Transformers 4.37.2
- Pytorch 2.2.0
- Datasets 2.17.0
- Tokenizers 0.15.2 | [
"# mistral_3\n\nThis model is a fine-tuned version of ybelkada/mistral-7b-instruct-v0.1-sharded on an unknown dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 0.0004\n- train_batch_size: 8\n- eval_batch_size: 8\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: cosine\n- num_epochs: 8\n- mixed_precision_training: Native AMP",
"### Training results",
"### Framework versions\n\n- PEFT 0.8.2\n- Transformers 4.37.2\n- Pytorch 2.2.0\n- Datasets 2.17.0\n- Tokenizers 0.15.2"
] | [
"TAGS\n#peft #safetensors #trl #sft #generated_from_trainer #base_model-ybelkada/mistral-7b-instruct-v0.1-sharded #region-us \n",
"# mistral_3\n\nThis model is a fine-tuned version of ybelkada/mistral-7b-instruct-v0.1-sharded on an unknown dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 0.0004\n- train_batch_size: 8\n- eval_batch_size: 8\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: cosine\n- num_epochs: 8\n- mixed_precision_training: Native AMP",
"### Training results",
"### Framework versions\n\n- PEFT 0.8.2\n- Transformers 4.37.2\n- Pytorch 2.2.0\n- Datasets 2.17.0\n- Tokenizers 0.15.2"
] | [
50,
41,
6,
12,
8,
3,
103,
4,
36
] | [
"passage: TAGS\n#peft #safetensors #trl #sft #generated_from_trainer #base_model-ybelkada/mistral-7b-instruct-v0.1-sharded #region-us \n# mistral_3\n\nThis model is a fine-tuned version of ybelkada/mistral-7b-instruct-v0.1-sharded on an unknown dataset.## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 0.0004\n- train_batch_size: 8\n- eval_batch_size: 8\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: cosine\n- num_epochs: 8\n- mixed_precision_training: Native AMP### Training results### Framework versions\n\n- PEFT 0.8.2\n- Transformers 4.37.2\n- Pytorch 2.2.0\n- Datasets 2.17.0\n- Tokenizers 0.15.2"
] | [
-0.10200830549001694,
0.003961034584790468,
-0.0018557221628725529,
0.05921097472310066,
0.15534399449825287,
0.024959446862339973,
0.12322882562875748,
0.10478182137012482,
-0.039475005120038986,
0.07208707183599472,
0.06001714617013931,
0.020910583436489105,
0.05332499369978905,
0.13926556706428528,
-0.028652677312493324,
-0.262786328792572,
0.04477458819746971,
-0.024022400379180908,
-0.023270737379789352,
0.08932969719171524,
0.12304992228746414,
-0.10650849342346191,
0.04735393449664116,
0.00888937059789896,
-0.14539659023284912,
0.01440206728875637,
-0.012994802556931973,
-0.03901493176817894,
0.10611442476511002,
0.025402793660759926,
0.1545981615781784,
-0.00009497809514869004,
0.1363239586353302,
-0.2296246439218521,
0.017216280102729797,
0.07475695013999939,
0.041487421840429306,
0.07636242359876633,
0.07248661667108536,
0.014684654772281647,
0.08687832206487656,
-0.11494304984807968,
0.11785750836133957,
0.0320892296731472,
-0.0825885459780693,
-0.20725947618484497,
-0.1078578382730484,
0.060023002326488495,
0.11880724132061005,
0.07700375467538834,
0.010668613016605377,
0.14409220218658447,
-0.0920085608959198,
0.051568690687417984,
0.19435928761959076,
-0.2199385166168213,
-0.0817343071103096,
0.05575800687074661,
0.05233961343765259,
0.08526692539453506,
-0.10569760203361511,
-0.04046526178717613,
0.06627001613378525,
0.04570278152823448,
0.06804786622524261,
0.007450050208717585,
-0.0651051476597786,
-0.00018003385048359632,
-0.14752648770809174,
-0.012258544564247131,
0.1331206113100052,
0.04514262080192566,
-0.052440229803323746,
-0.07682238519191742,
-0.033048585057258606,
-0.05252848193049431,
-0.035402003675699234,
-0.03889364376664162,
0.00120257749222219,
-0.02412985824048519,
-0.012718564830720425,
-0.04599817469716072,
-0.09914212673902512,
-0.09883224964141846,
0.003950047772377729,
0.13143494725227356,
0.04757159203290939,
0.0015221507055684924,
-0.04216337203979492,
0.10875950008630753,
0.00006920010491739959,
-0.08543143421411514,
-0.009464081376791,
-0.01994687132537365,
-0.08236255496740341,
-0.07586635649204254,
-0.04556882381439209,
-0.04155543074011803,
0.00695763248950243,
0.11188066005706787,
-0.10828949511051178,
0.08933132886886597,
0.022822029888629913,
0.03499899059534073,
-0.024504804983735085,
0.08016470074653625,
-0.041380591690540314,
0.044474046677351,
-0.010848075151443481,
0.10016289353370667,
-0.01729772426187992,
-0.014927668496966362,
-0.07201314717531204,
-0.04806554317474365,
0.08036860078573227,
0.05046100914478302,
-0.05454019829630852,
0.008350348100066185,
-0.06669764965772629,
-0.022578177973628044,
0.016036855056881905,
-0.11989472806453705,
0.0354282520711422,
0.015417808666825294,
-0.05655570700764656,
-0.02746126428246498,
0.03017614781856537,
0.02964470349252224,
0.007565253879874945,
0.0716967061161995,
-0.07259073108434677,
0.024910565465688705,
-0.09907238185405731,
-0.07994698733091354,
0.013743222691118717,
-0.013760249130427837,
-0.015862947329878807,
-0.10670245438814163,
-0.1789328157901764,
-0.05991468206048012,
0.03581060841679573,
-0.04260283336043358,
0.0013116949703544378,
-0.050390664488077164,
-0.030352560803294182,
0.03945224732160568,
-0.007809814065694809,
0.1147235557436943,
-0.055028658360242844,
0.08251557499170303,
-0.056212279945611954,
-0.002236715517938137,
-0.026083074510097504,
0.02900109253823757,
-0.06236381083726883,
0.0442153736948967,
-0.10422363132238388,
0.05650500953197479,
-0.10663998872041702,
0.03596124425530434,
-0.13218578696250916,
-0.09786336123943329,
-0.01738949678838253,
-0.015211977064609528,
0.10995882004499435,
0.11058556288480759,
-0.20056480169296265,
-0.009449224919080734,
0.12417960166931152,
-0.11353409290313721,
-0.08074786514043808,
0.10691375285387039,
-0.06301485002040863,
0.08028008043766022,
0.03650618717074394,
0.1729179322719574,
0.12220299988985062,
-0.14090147614479065,
0.045132674276828766,
0.0022319976706057787,
0.09358092397451401,
0.06697889417409897,
0.06280367076396942,
-0.03267894312739372,
-0.08267946541309357,
0.0065829139202833176,
-0.06435274332761765,
0.048156820237636566,
-0.09620671719312668,
-0.07270577549934387,
-0.04601631313562393,
-0.05958971381187439,
0.0777185708284378,
0.019510764628648758,
0.02676430717110634,
-0.0698171928524971,
-0.05971153452992439,
0.16363154351711273,
0.1493891030550003,
-0.05456119403243065,
0.005059296730905771,
-0.06102372705936432,
0.03815118595957756,
-0.01366423349827528,
-0.03886820003390312,
-0.16904844343662262,
-0.12172909826040268,
0.03046215884387493,
-0.038698866963386536,
0.0006027885829098523,
0.028540078550577164,
0.07678969204425812,
0.07315774261951447,
-0.044366367161273956,
-0.05196056887507439,
-0.11543715000152588,
0.013839779421687126,
-0.11227382719516754,
-0.1896308809518814,
-0.05409589409828186,
-0.06264737993478775,
0.1637553572654724,
-0.25165098905563354,
0.021578248590230942,
-0.012967310845851898,
0.13208363950252533,
0.03695980831980705,
-0.04644022509455681,
-0.008662085048854351,
0.06796415895223618,
-0.00024952026433311403,
-0.08479912579059601,
0.04847541078925133,
0.017703600227832794,
-0.1280488818883896,
-0.015140109695494175,
-0.14964760839939117,
0.036302804946899414,
0.04752320423722267,
0.037953272461891174,
-0.10809267312288284,
-0.1400884985923767,
-0.06379644572734833,
-0.052648186683654785,
-0.08375561982393265,
-0.0023235438857227564,
0.2021922916173935,
0.006954955868422985,
0.11696099489927292,
-0.06276147067546844,
-0.03895288333296776,
-0.0057212128303945065,
-0.023860281333327293,
-0.01989067532122135,
0.08750119805335999,
0.02194332890212536,
-0.11717185378074646,
0.07086624205112457,
0.12608253955841064,
-0.06426600366830826,
0.17034310102462769,
-0.05499127879738808,
-0.10055824369192123,
-0.011523721739649773,
0.0552968755364418,
-0.004800036083906889,
0.10687173157930374,
-0.062297217547893524,
0.030913274735212326,
0.016804372891783714,
0.048927582800388336,
0.04076498746871948,
-0.18627865612506866,
-0.022619331255555153,
0.013954171910881996,
-0.02009054273366928,
-0.019500765949487686,
-0.01644314080476761,
0.028389984741806984,
0.08350341022014618,
0.019387174397706985,
-0.013077491894364357,
0.01880507543683052,
-0.007228209171444178,
-0.11602362245321274,
0.19898775219917297,
-0.15249086916446686,
-0.09119225293397903,
-0.11859015375375748,
0.07566533982753754,
-0.002397107193246484,
-0.029099659994244576,
0.027846766635775566,
-0.09598823636770248,
-0.03883400559425354,
-0.08523612469434738,
0.001228545792400837,
-0.05255836993455887,
-0.02472945675253868,
0.024883177131414413,
0.006526229903101921,
0.09182681888341904,
-0.1274535357952118,
0.014249531552195549,
-0.004893845412880182,
-0.08629921078681946,
0.02499157376587391,
0.02644762024283409,
0.07903874665498734,
0.1411011517047882,
-0.008879224769771099,
-0.005035943817347288,
-0.04936506226658821,
0.22476603090763092,
-0.05054634064435959,
-0.0258563794195652,
0.10935249924659729,
-0.006750343833118677,
0.05961323529481888,
0.12272103130817413,
0.04606293886899948,
-0.08925772458314896,
0.05106396973133087,
0.045960552990436554,
-0.005863426718860865,
-0.25414422154426575,
-0.05271969363093376,
-0.026124756783246994,
-0.10115424543619156,
0.09288931638002396,
0.048403315246105194,
-0.029145151376724243,
0.062323834747076035,
-0.046669851988554,
0.018335679545998573,
0.015026467852294445,
0.09220033884048462,
0.0203991886228323,
0.023167449980974197,
0.07103123515844345,
-0.022121351212263107,
-0.004024837631732225,
0.0574202686548233,
0.011808760464191437,
0.2700498700141907,
-0.015188418328762054,
0.05850556865334511,
0.06183135509490967,
0.16037051379680634,
-0.005601835902780294,
0.019847124814987183,
0.02117612026631832,
-0.01076518651098013,
-0.004440569784492254,
-0.06314447522163391,
-0.040177121758461,
0.05626499652862549,
-0.040134258568286896,
0.07091283053159714,
-0.12296762317419052,
0.0013324758037924767,
0.020327942445874214,
0.2676207423210144,
0.012104165740311146,
-0.2608336806297302,
-0.10274535417556763,
0.03348512947559357,
-0.018092241138219833,
-0.0989871472120285,
0.009659797884523869,
0.1632404327392578,
-0.1369221806526184,
0.030982116237282753,
-0.06286649405956268,
0.09597524255514145,
0.0007192690391093493,
-0.012548977509140968,
0.012205803766846657,
0.12640829384326935,
-0.01460253819823265,
0.08260859549045563,
-0.2308986485004425,
0.24552416801452637,
0.012438875623047352,
0.10308348387479782,
-0.022901959717273712,
0.022389784455299377,
0.03514156863093376,
0.08392693847417831,
0.0603271946310997,
0.013166254386305809,
-0.11151737719774246,
-0.20981672406196594,
-0.038739994168281555,
0.03688105195760727,
0.10684014111757278,
-0.013861249200999737,
0.05873047560453415,
-0.051015038043260574,
0.04292406886816025,
0.05148971453309059,
-0.07592441141605377,
-0.22956228256225586,
-0.10483304411172867,
0.005892490968108177,
0.01293521374464035,
0.017850376665592194,
-0.12955598533153534,
-0.09788575768470764,
-0.02271432615816593,
0.10860345512628555,
-0.009739270433783531,
-0.027709446847438812,
-0.14388220012187958,
0.06891314685344696,
0.11575628817081451,
-0.055075135082006454,
0.02436603233218193,
0.034979771822690964,
0.12974780797958374,
0.008801186457276344,
-0.04746926575899124,
0.06847880780696869,
-0.07800749689340591,
-0.1647108644247055,
-0.07313511520624161,
0.09322897344827652,
0.09767159074544907,
0.04688220098614693,
0.008206111378967762,
0.012961508706212044,
0.013362701050937176,
-0.09613189101219177,
0.006621799897402525,
0.18357889354228973,
0.03561310097575188,
0.09229659289121628,
-0.09928730130195618,
-0.0015231111319735646,
-0.04094545915722847,
-0.038289736956357956,
0.1323631852865219,
0.24322111904621124,
-0.08783634752035141,
0.06953644752502441,
0.10209359228610992,
-0.09268484264612198,
-0.17020408809185028,
0.09741190075874329,
0.14653906226158142,
0.02511611394584179,
0.04816757142543793,
-0.1823517084121704,
0.05028283968567848,
0.14475423097610474,
-0.01886032521724701,
0.02258826233446598,
-0.3415285646915436,
-0.11608058959245682,
0.07924884557723999,
0.14189015328884125,
0.03190085291862488,
-0.11706607043743134,
-0.04006289690732956,
-0.02645615115761757,
-0.1156807541847229,
0.03739487752318382,
-0.12536698579788208,
0.07634104043245316,
-0.011735672131180763,
0.07519076764583588,
0.0381297692656517,
-0.033601950854063034,
0.20111067593097687,
-0.02664020285010338,
0.11218871921300888,
-0.0561942458152771,
0.058058977127075195,
0.023749617859721184,
-0.07080117613077164,
0.03896038234233856,
-0.007002642843872309,
0.06607130169868469,
-0.13395754992961884,
-0.009511727839708328,
-0.06425488740205765,
0.07102309912443161,
-0.04840916022658348,
-0.07447360455989838,
-0.032424863427877426,
0.058763325214385986,
0.04393618553876877,
-0.02837943099439144,
0.0638417974114418,
-0.01987992785871029,
0.175504669547081,
0.030780835077166557,
0.10398004949092865,
-0.01774166151881218,
-0.05870414152741432,
0.000007263926818268374,
-0.028176849707961082,
0.0856696367263794,
-0.12798847258090973,
0.02307502180337906,
0.1197478175163269,
0.029425064101815224,
0.15998637676239014,
0.04241263121366501,
-0.07334738224744797,
0.039086759090423584,
0.04125432297587395,
-0.06947726756334305,
-0.1596737951040268,
-0.02067604847252369,
0.14175128936767578,
-0.14789652824401855,
0.0021328460425138474,
0.11422223597764969,
-0.07297253608703613,
-0.0217637587338686,
-0.02250541001558304,
0.013116181828081608,
-0.04778297245502472,
0.18023568391799927,
0.04000203683972359,
0.061996880918741226,
-0.053656503558158875,
0.09753554314374924,
0.07515066862106323,
-0.0822996199131012,
0.05403894931077957,
0.05963185802102089,
-0.08483948558568954,
-0.04082048684358597,
0.07287660241127014,
0.16528524458408356,
-0.01559174619615078,
-0.04605315998196602,
-0.05124552547931671,
-0.10607811063528061,
0.012277468107640743,
0.10676231980323792,
0.027369458228349686,
-0.009175716899335384,
-0.003751535899937153,
0.02528320997953415,
-0.10153741389513016,
0.062174249440431595,
0.045365482568740845,
0.08657453954219818,
-0.1567198783159256,
0.14905579388141632,
0.0002245494833914563,
0.024342842400074005,
-0.007548276335000992,
-0.005324737634509802,
-0.10969958454370499,
0.004669617395848036,
-0.17896206676959991,
0.023630240932106972,
-0.03754522278904915,
0.02581070177257061,
0.010344421491026878,
-0.05202610418200493,
-0.008500218391418457,
0.04529175907373428,
-0.07873713225126266,
-0.04024193063378334,
0.005570470355451107,
0.08989877253770828,
-0.08448614925146103,
-0.024840008467435837,
0.041224557906389236,
-0.0821814313530922,
0.06485346704721451,
0.06468181312084198,
0.028004657477140427,
0.04877479001879692,
-0.2036486268043518,
0.015480824746191502,
0.03700101375579834,
0.00796522293239832,
0.03612188622355461,
-0.0985533744096756,
-0.025154326111078262,
-0.04286292567849159,
0.03851953148841858,
0.013916688971221447,
0.061770226806402206,
-0.11881915479898453,
-0.06197147071361542,
-0.03832819685339928,
-0.07216896116733551,
-0.0797499492764473,
0.03634113818407059,
0.07196637243032455,
0.0692104771733284,
0.12406732141971588,
-0.10710565745830536,
0.05472971498966217,
-0.17493082582950592,
-0.035982318222522736,
-0.024095261469483376,
0.0052775610238313675,
-0.06963621824979782,
-0.05264291167259216,
0.06846585124731064,
-0.04923911392688751,
0.04960588738322258,
-0.04926561564207077,
0.07488786429166794,
0.01896614022552967,
-0.10826405137777328,
0.012881887145340443,
0.018070874735713005,
0.21605359017848969,
0.05266350880265236,
-0.0015313820913434029,
0.05283081531524658,
0.0008495149668306112,
0.03542697802186012,
0.10425566881895065,
0.13085603713989258,
0.17483238875865936,
-0.028626535087823868,
0.059504494071006775,
0.05147181823849678,
-0.1027085930109024,
-0.05815023556351662,
0.087138332426548,
0.018421992659568787,
0.059312913566827774,
-0.0638740062713623,
0.17512147128582,
0.14328297972679138,
-0.1885567307472229,
0.022015294060111046,
-0.0659208670258522,
-0.09822362661361694,
-0.10545741021633148,
-0.017175134271383286,
-0.07608441263437271,
-0.14635904133319855,
0.008257614448666573,
-0.11046285182237625,
0.009785190224647522,
0.06524582207202911,
0.010877693071961403,
0.044672660529613495,
0.13109561800956726,
0.023011960089206696,
0.009191169403493404,
0.04895547032356262,
0.006447621155530214,
0.018195608630776405,
-0.09865943342447281,
-0.10782323777675629,
0.08998265862464905,
-0.0394878163933754,
0.0418081134557724,
-0.045365817844867706,
0.00800368282943964,
0.029195284470915794,
-0.0048389495350420475,
-0.07853887975215912,
0.033463023602962494,
0.00971598643809557,
0.005265120416879654,
0.07103114575147629,
0.08559074997901917,
-0.01193843875080347,
-0.044264405965805054,
0.2831812798976898,
-0.07104667276144028,
-0.07659394294023514,
-0.14761565625667572,
0.22166100144386292,
0.01144652720540762,
0.006821500137448311,
0.03993341326713562,
-0.1082538366317749,
0.016746405512094498,
0.07337068021297455,
0.10563936829566956,
-0.03776618838310242,
0.005778565537184477,
-0.024329571053385735,
-0.025120992213487625,
-0.0955905169248581,
0.14554902911186218,
0.09854352474212646,
-0.007023531943559647,
-0.06797736883163452,
0.005489354021847248,
-0.021692747250199318,
0.0006914568948559463,
-0.0843116044998169,
0.02268766239285469,
-0.009393292479217052,
0.0024482710286974907,
-0.04302183538675308,
0.09874128550291061,
0.01839148998260498,
-0.1267867386341095,
0.01589077338576317,
-0.10640256851911545,
-0.14012612402439117,
-0.045031819492578506,
0.056280411779880524,
-0.00010781782475532964,
0.046353403478860855,
-0.0452689491212368,
0.014291171915829182,
0.1343599408864975,
-0.021686017513275146,
-0.015502063557505608,
-0.1423950344324112,
0.09285936504602432,
-0.02067144773900509,
0.22344817221164703,
-0.0035625984892249107,
0.06999007612466812,
0.10042683035135269,
0.042716074734926224,
-0.11972038447856903,
0.0607466846704483,
0.06785958260297775,
-0.04019668698310852,
0.01493892166763544,
0.14635758101940155,
-0.0557994544506073,
0.12689900398254395,
0.046703215688467026,
-0.17621064186096191,
0.0191985871642828,
-0.048896726220846176,
-0.026273570954799652,
-0.0677383691072464,
0.03921334818005562,
-0.0534651055932045,
0.15016920864582062,
0.16854172945022583,
-0.04129020497202873,
-0.014134617522358894,
-0.06368911266326904,
0.043568141758441925,
0.04552694782614708,
0.10915686935186386,
-0.026589155197143555,
-0.20085258781909943,
0.01720661111176014,
0.036311641335487366,
0.027469884604215622,
-0.22846761345863342,
-0.1035161167383194,
0.028670847415924072,
-0.05764922872185707,
-0.02991311252117157,
0.12187818437814713,
0.061067573726177216,
0.02744544856250286,
-0.03658747673034668,
-0.20594394207000732,
-0.03648236393928528,
0.15847565233707428,
-0.09073208272457123,
-0.04748658463358879
] |
null | null | sample-factory |
An **APPO** model trained on the **doom_health_gathering_supreme** environment.
This model was trained using Sample-Factory 2.0: https://github.com/alex-petrenko/sample-factory.
Documentation for how to use Sample-Factory can be found at https://www.samplefactory.dev/
## Downloading the model
After installing Sample-Factory, download the model with:
```
python -m sample_factory.huggingface.load_from_hub -r itsdhanoob/rl_course_vizdoom_health_gathering_supreme
```
## Using the model
To run the model after download, use the `enjoy` script corresponding to this environment:
```
python -m sf_examples.vizdoom.enjoy_vizdoom --algo=APPO --env=doom_health_gathering_supreme --train_dir=./train_dir --experiment=rl_course_vizdoom_health_gathering_supreme
```
You can also upload models to the Hugging Face Hub using the same script with the `--push_to_hub` flag.
See https://www.samplefactory.dev/10-huggingface/huggingface/ for more details
## Training with this model
To continue training with this model, use the `train` script corresponding to this environment:
```
python -m sf_examples.vizdoom.train_vizdoom --algo=APPO --env=doom_health_gathering_supreme --train_dir=./train_dir --experiment=rl_course_vizdoom_health_gathering_supreme --restart_behavior=resume --train_for_env_steps=10000000000
```
Note, you may have to adjust `--train_for_env_steps` to a suitably high number as the experiment will resume at the number of steps it concluded at.
| {"library_name": "sample-factory", "tags": ["deep-reinforcement-learning", "reinforcement-learning", "sample-factory"], "model-index": [{"name": "APPO", "results": [{"task": {"type": "reinforcement-learning", "name": "reinforcement-learning"}, "dataset": {"name": "doom_health_gathering_supreme", "type": "doom_health_gathering_supreme"}, "metrics": [{"type": "mean_reward", "value": "8.58 +/- 3.39", "name": "mean_reward", "verified": false}]}]}]} | reinforcement-learning | itsdhanoob/rl_course_vizdoom_health_gathering_supreme | [
"sample-factory",
"tensorboard",
"deep-reinforcement-learning",
"reinforcement-learning",
"model-index",
"region:us"
] | 2024-02-14T16:04:10+00:00 | [] | [] | TAGS
#sample-factory #tensorboard #deep-reinforcement-learning #reinforcement-learning #model-index #region-us
|
An APPO model trained on the doom_health_gathering_supreme environment.
This model was trained using Sample-Factory 2.0: URL
Documentation for how to use Sample-Factory can be found at URL
## Downloading the model
After installing Sample-Factory, download the model with:
## Using the model
To run the model after download, use the 'enjoy' script corresponding to this environment:
You can also upload models to the Hugging Face Hub using the same script with the '--push_to_hub' flag.
See URL for more details
## Training with this model
To continue training with this model, use the 'train' script corresponding to this environment:
Note, you may have to adjust '--train_for_env_steps' to a suitably high number as the experiment will resume at the number of steps it concluded at.
| [
"## Downloading the model\n\nAfter installing Sample-Factory, download the model with:",
"## Using the model\n\nTo run the model after download, use the 'enjoy' script corresponding to this environment:\n\n\n\nYou can also upload models to the Hugging Face Hub using the same script with the '--push_to_hub' flag.\nSee URL for more details",
"## Training with this model\n\nTo continue training with this model, use the 'train' script corresponding to this environment:\n\n\nNote, you may have to adjust '--train_for_env_steps' to a suitably high number as the experiment will resume at the number of steps it concluded at."
] | [
"TAGS\n#sample-factory #tensorboard #deep-reinforcement-learning #reinforcement-learning #model-index #region-us \n",
"## Downloading the model\n\nAfter installing Sample-Factory, download the model with:",
"## Using the model\n\nTo run the model after download, use the 'enjoy' script corresponding to this environment:\n\n\n\nYou can also upload models to the Hugging Face Hub using the same script with the '--push_to_hub' flag.\nSee URL for more details",
"## Training with this model\n\nTo continue training with this model, use the 'train' script corresponding to this environment:\n\n\nNote, you may have to adjust '--train_for_env_steps' to a suitably high number as the experiment will resume at the number of steps it concluded at."
] | [
34,
19,
59,
67
] | [
"passage: TAGS\n#sample-factory #tensorboard #deep-reinforcement-learning #reinforcement-learning #model-index #region-us \n## Downloading the model\n\nAfter installing Sample-Factory, download the model with:## Using the model\n\nTo run the model after download, use the 'enjoy' script corresponding to this environment:\n\n\n\nYou can also upload models to the Hugging Face Hub using the same script with the '--push_to_hub' flag.\nSee URL for more details## Training with this model\n\nTo continue training with this model, use the 'train' script corresponding to this environment:\n\n\nNote, you may have to adjust '--train_for_env_steps' to a suitably high number as the experiment will resume at the number of steps it concluded at."
] | [
-0.162887305021286,
-0.07949446886777878,
0.0013769814977422357,
0.0244897473603487,
0.13643795251846313,
0.08826540410518646,
0.13243556022644043,
0.07938782125711441,
0.19449298083782196,
0.07451266050338745,
0.12160012871026993,
0.06742649525403976,
0.02505551464855671,
0.31084391474723816,
0.08655242621898651,
-0.18235880136489868,
0.031082456931471825,
-0.06436605006456375,
-0.02882574498653412,
0.05590416118502617,
0.050910040736198425,
-0.06422623991966248,
0.11641133576631546,
-0.05714287608861923,
-0.15497641265392303,
0.08288847655057907,
0.008126083761453629,
0.03596968948841095,
0.12199652194976807,
-0.007729834411293268,
0.06358569860458374,
0.02508161962032318,
0.09885215014219284,
-0.08979995548725128,
0.05817115306854248,
0.037268251180648804,
-0.005583701189607382,
0.0697544738650322,
-0.02916712686419487,
0.01197513286024332,
0.20552261173725128,
0.051445573568344116,
-0.014811687171459198,
0.0707944929599762,
-0.04854035750031471,
0.005004523321986198,
0.024828260764479637,
0.08118943125009537,
0.1108563020825386,
-0.013300174847245216,
-0.015604399144649506,
0.2098497599363327,
-0.045419543981552124,
0.030687451362609863,
0.1803472340106964,
-0.13901305198669434,
-0.00587898213416338,
0.3598267436027527,
0.13591337203979492,
0.07389762997627258,
-0.05572221428155899,
0.065569669008255,
0.12957775592803955,
-0.013377981260418892,
-0.022062024101614952,
-0.037468962371349335,
0.01014290377497673,
0.02470328100025654,
-0.08271043002605438,
-0.03898613899946213,
0.18779566884040833,
0.027798498049378395,
-0.0647122785449028,
-0.11388745903968811,
-0.08383605629205704,
-0.01143614575266838,
-0.08729266375303268,
-0.06047317758202553,
0.061255209147930145,
0.06450130045413971,
-0.05541218817234039,
-0.16354843974113464,
-0.08759765326976776,
-0.14808951318264008,
0.09711641818284988,
-0.018818290904164314,
0.020023507997393608,
0.039053402841091156,
-0.13240769505500793,
0.13932685554027557,
-0.12239529192447662,
-0.005040881223976612,
-0.00391974626109004,
-0.10012788325548172,
-0.0298643596470356,
-0.02757178619503975,
-0.06954579800367355,
-0.08072661608457565,
0.06621979922056198,
0.1397300660610199,
0.1075919046998024,
0.04457515478134155,
-0.016096504405140877,
0.0929836705327034,
0.0659836158156395,
0.015487046912312508,
-0.046446919441223145,
-0.03190334141254425,
0.06750229746103287,
0.09463070333003998,
-0.0025161339435726404,
-0.04405781999230385,
-0.12502750754356384,
0.004669501446187496,
-0.05889439582824707,
0.07438734918832779,
-0.01944235898554325,
0.09347380697727203,
0.0012449703644961119,
-0.0658751055598259,
0.09675891697406769,
-0.056166794151067734,
-0.015024078078567982,
0.05717969685792923,
-0.09829384088516235,
-0.044000294059515,
0.02636338584125042,
-0.018662840127944946,
0.02191256918013096,
-0.08697114139795303,
-0.1281215101480484,
-0.0406981036067009,
-0.15496762096881866,
-0.0733695924282074,
0.020342092961072922,
-0.10162562131881714,
0.040819648653268814,
-0.08701786398887634,
-0.27291807532310486,
-0.016108427196741104,
0.05915366858243942,
0.0003154690202791244,
0.03663148358464241,
-0.06209208071231842,
0.0267410296946764,
-0.030988745391368866,
-0.013702943921089172,
0.12538094818592072,
-0.04706621542572975,
0.005733184050768614,
0.02853262610733509,
0.09092917293310165,
0.029396481812000275,
-0.011824010871350765,
-0.09237373620271683,
0.03002769686281681,
-0.1866937130689621,
0.0038047281559556723,
-0.051012441515922546,
0.14028684794902802,
-0.07785230129957199,
-0.0034444157499819994,
-0.07691079378128052,
0.06912831217050552,
0.052552226930856705,
0.21963854134082794,
-0.22059281170368195,
-0.09743031859397888,
0.1902308464050293,
-0.09678838402032852,
-0.1949385702610016,
0.06732125580310822,
-0.03079940192401409,
0.20069970190525055,
0.02597416751086712,
0.1891578733921051,
0.00020795770979020745,
-0.25584760308265686,
0.035303130745887756,
0.07686726003885269,
-0.2078019231557846,
-0.11653494834899902,
0.00783967413008213,
0.04216665402054787,
-0.050144799053668976,
0.023388857021927834,
-0.07392873615026474,
0.1217033788561821,
-0.023950038477778435,
-0.021695949137210846,
-0.009935722686350346,
-0.06940963864326477,
-0.039610356092453,
0.012346661649644375,
0.06086154654622078,
-0.02202412113547325,
-0.025860905647277832,
-0.05173748731613159,
0.16720648109912872,
-0.0795547217130661,
0.011736705899238586,
-0.11241740733385086,
0.1497063785791397,
0.007124151568859816,
0.025635361671447754,
-0.0980280190706253,
-0.014672551304101944,
0.044151511043310165,
0.08621654659509659,
0.011970171704888344,
0.1326037049293518,
0.06774137914180756,
0.01454958226531744,
0.042493220418691635,
-0.004039871972054243,
-0.0012205307139083743,
-0.10230473428964615,
-0.05593033879995346,
-0.11311958730220795,
-0.11286478489637375,
-0.09429361671209335,
0.08868816494941711,
-0.20066434144973755,
0.05826579034328461,
-0.15120604634284973,
0.047645486891269684,
0.038803353905677795,
-0.07772190868854523,
0.05121537670493126,
-0.08661998063325882,
-0.021283775568008423,
-0.08784573525190353,
0.0805407464504242,
-0.014386715367436409,
-0.08415807038545609,
0.006313080433756113,
-0.09094364196062088,
-0.08295580744743347,
0.09175937622785568,
0.013830476440489292,
0.0026490744203329086,
-0.1170414388179779,
-0.04695970565080643,
0.001149212708696723,
0.03873389959335327,
-0.0591595321893692,
0.08649469166994095,
0.06776818633079529,
0.09646541625261307,
-0.09070473909378052,
0.03797374665737152,
-0.020416714251041412,
-0.06236580014228821,
-0.045745182782411575,
0.014070805162191391,
0.1767948418855667,
-0.022993814200162888,
-0.01734299771487713,
-0.005982444155961275,
-0.048861317336559296,
0.20095843076705933,
-0.018403954803943634,
-0.11935548484325409,
0.0030399553943425417,
-0.01395543571561575,
-0.017944620922207832,
0.11660698801279068,
-0.13726668059825897,
-0.05182260647416115,
0.030854813754558563,
-0.06529976427555084,
0.10216285288333893,
-0.08242622762918472,
-0.0392029769718647,
-0.05685178562998772,
-0.043409593403339386,
0.046979792416095734,
0.12330524623394012,
-0.07290767133235931,
-0.009151018224656582,
-0.047789376229047775,
-0.03510203957557678,
-0.025379952043294907,
-0.05724980682134628,
-0.11478709429502487,
0.1582695096731186,
0.002751561114564538,
-0.09990474581718445,
-0.17415542900562286,
-0.08029486984014511,
-0.03834356367588043,
0.05337152257561684,
-0.034037429839372635,
-0.04430336132645607,
-0.01500723510980606,
-0.07299388945102692,
0.1465158462524414,
0.063304103910923,
-0.0472191721200943,
-0.01852818764746189,
0.08560720086097717,
0.04456184431910515,
-0.15394946932792664,
0.007078593596816063,
-0.08948076516389847,
-0.08794131129980087,
0.03091353550553322,
-0.08061819523572922,
0.012820594012737274,
0.11341627687215805,
0.03525753691792488,
0.02826494723558426,
0.01035099383443594,
0.23537762463092804,
-0.0369284451007843,
-0.01093987375497818,
0.19019025564193726,
0.0682438537478447,
0.020443644374608994,
0.055847786366939545,
0.027420951053500175,
-0.15370461344718933,
0.10424364358186722,
0.012530675157904625,
-0.044538769870996475,
-0.10689681768417358,
-0.04666181653738022,
-0.03360101953148842,
0.09803235530853271,
0.12185155600309372,
0.03158954530954361,
0.025155838578939438,
0.096546471118927,
0.02187134325504303,
-0.0098390718922019,
-0.11183010786771774,
0.05996714532375336,
-0.1770814210176468,
-0.043808963149785995,
0.00898060668259859,
-0.028755301609635353,
0.00010461114288773388,
0.0659034252166748,
0.026660064235329628,
0.12833580374717712,
0.0295290257781744,
0.06181740015745163,
0.0663255974650383,
0.10200989991426468,
0.01538698747754097,
0.1999037265777588,
-0.06215142831206322,
-0.1075027585029602,
-0.03758005052804947,
-0.04118350148200989,
-0.11916319280862808,
0.12439136207103729,
0.1381523460149765,
-0.030515994876623154,
-0.06625506281852722,
0.07200724631547928,
0.014589293859899044,
0.08729344606399536,
0.08250882476568222,
-0.29115065932273865,
-0.034177567809820175,
0.031450141221284866,
0.01114452164620161,
-0.04308335855603218,
0.010566305369138718,
0.10542299598455429,
-0.07616783678531647,
-0.09982791543006897,
-0.03972722589969635,
0.1055394783616066,
0.08046542853116989,
0.03702867403626442,
-0.10841067880392075,
0.20128826797008514,
-0.01744360849261284,
0.07004447281360626,
-0.07662706822156906,
0.1728198230266571,
0.018701205030083656,
0.05943213775753975,
-0.07497778534889221,
-0.009592941962182522,
0.1228223443031311,
0.03374773636460304,
0.09092900156974792,
-0.0056656887754797935,
-0.09995020180940628,
-0.13336431980133057,
-0.1216202825307846,
0.024986369535326958,
-0.000090524394181557,
-0.08169890940189362,
0.03341596573591232,
-0.016717763617634773,
0.017487963661551476,
-0.0027857583481818438,
0.23440547287464142,
-0.18267135322093964,
0.012482558377087116,
-0.054521817713975906,
0.02707577496767044,
-0.04300008341670036,
-0.0709642544388771,
-0.027162717655301094,
0.060507629066705704,
0.09744840115308762,
0.07921962440013885,
0.030401866883039474,
-0.07419665157794952,
0.1431404948234558,
0.06514685600996017,
-0.058246973901987076,
-0.01524845976382494,
0.01951364241540432,
0.1256532073020935,
-0.07438289374113083,
-0.10393836349248886,
0.10585980117321014,
-0.11736445128917694,
0.008749126456677914,
-0.05019083246588707,
0.04299405962228775,
0.02305823378264904,
0.011290842667222023,
0.007447924464941025,
-0.04279239848256111,
0.0015383695717900991,
-0.06904047727584839,
0.0778660774230957,
0.020559091120958328,
-0.0047941361553967,
-0.0006717707728967071,
-0.16239388287067413,
0.08390985429286957,
-0.04138755425810814,
0.052877847105264664,
0.1489589661359787,
0.27864590287208557,
-0.02386910282075405,
0.030926240608096123,
0.1617380678653717,
-0.01897917501628399,
-0.2491649091243744,
0.04654841497540474,
0.014908025041222572,
0.10310175269842148,
0.04640066251158714,
-0.19236695766448975,
0.11111847311258316,
0.009474517777562141,
-0.02225719392299652,
0.009804603643715382,
-0.24880149960517883,
-0.13740544021129608,
0.17525193095207214,
0.06902051717042923,
0.15983323752880096,
-0.03665107116103172,
-0.013587141409516335,
-0.061109546571969986,
-0.03419603407382965,
-0.026354335248470306,
-0.12708203494548798,
0.12749767303466797,
-0.017607107758522034,
0.047745801508426666,
0.027817612513899803,
-0.07676684111356735,
0.12058744579553604,
-0.017944786697626114,
0.13344953954219818,
-0.017018258571624756,
-0.031023232266306877,
0.042466819286346436,
-0.09033756703138351,
0.1662607043981552,
-0.10233280807733536,
0.057950668036937714,
-0.11091876775026321,
-0.03109682910144329,
-0.015322481282055378,
0.15654151141643524,
0.005544521380215883,
-0.0855189636349678,
-0.041066281497478485,
0.04975702613592148,
-0.05784251168370247,
0.05022609233856201,
-0.0021613158751279116,
-0.03506873920559883,
0.022246064618229866,
0.08415499329566956,
0.040208954364061356,
-0.10403558611869812,
-0.011038471013307571,
0.03089289739727974,
0.01896476000547409,
0.09993185102939606,
-0.20835483074188232,
-0.020152123644948006,
0.019231827929615974,
-0.015702085569500923,
0.13085414469242096,
0.04400704801082611,
-0.08080117404460907,
0.027568496763706207,
0.13726983964443207,
-0.061186157166957855,
-0.030986590310931206,
-0.04847807064652443,
-0.016679393127560616,
-0.12794725596904755,
-0.01594163477420807,
0.057148490101099014,
-0.04251079633831978,
0.02512725070118904,
-0.03424951806664467,
0.0004248716577421874,
-0.10717252641916275,
0.07036283612251282,
0.06859682500362396,
0.0642281174659729,
-0.07167360186576843,
0.09394960850477219,
-0.07811970263719559,
0.014289900660514832,
0.03734226152300835,
0.045441556721925735,
-0.06931920349597931,
-0.06820165365934372,
-0.05322124809026718,
0.27575042843818665,
-0.024388493970036507,
-0.02025510184466839,
-0.06021025776863098,
0.11942195147275925,
-0.057836465537548065,
-0.06673881411552429,
0.08716115355491638,
-0.007450808770954609,
-0.059019722044467926,
0.022327717393636703,
-0.0734894648194313,
-0.014457973651587963,
0.04693116992712021,
0.016375891864299774,
-0.11610891669988632,
0.1136312261223793,
0.031648989766836166,
0.02891513518989086,
-0.09186926484107971,
-0.0486464723944664,
-0.12123195827007294,
0.0032020595390349627,
-0.025323880836367607,
-0.06051601842045784,
-0.07913094758987427,
-0.0425749197602272,
0.049642790108919144,
0.018434861674904823,
-0.08444267511367798,
-0.0022111251018941402,
-0.12617166340351105,
0.006370943505316973,
0.006689207162708044,
0.10316617041826248,
-0.06351965665817261,
0.04670397937297821,
0.10049878805875778,
-0.07692139595746994,
0.09893755614757538,
0.0846271738409996,
-0.00729260453954339,
0.08929292112588882,
-0.20261284708976746,
-0.02319980226457119,
0.047821637243032455,
0.055264540016651154,
0.03154374286532402,
0.06104309484362602,
0.013487739488482475,
-0.05460033565759659,
0.04538526386022568,
-0.03539090231060982,
0.0028435050044208765,
-0.09104080498218536,
0.09713591635227203,
0.009731475263834,
-0.009716489352285862,
-0.060456521809101105,
-0.01384128537029028,
0.01817488856613636,
0.10404353588819504,
0.09692291915416718,
-0.07237115502357483,
-0.0035003575030714273,
-0.11786255985498428,
0.024597108364105225,
0.02565017342567444,
0.010576808825135231,
0.03638135641813278,
-0.11692339926958084,
0.03729743883013725,
-0.05475534871220589,
0.19700418412685394,
0.019796879962086678,
-0.10531783103942871,
-0.008661900646984577,
0.07250577956438065,
0.17378750443458557,
-0.006129021290689707,
0.21011123061180115,
0.05919691175222397,
0.09556611627340317,
0.0324610099196434,
0.11373614519834518,
0.11542147397994995,
0.004254546947777271,
0.10733281821012497,
0.0500684529542923,
-0.04822303727269173,
0.14306919276714325,
0.032827045768499374,
-0.017670227214694023,
0.0304852481931448,
0.04704435542225838,
-0.03187015652656555,
0.02075354754924774,
-0.06440161913633347,
0.11196915805339813,
0.13514995574951172,
-0.08471442013978958,
-0.0081911850720644,
0.04797748476266861,
-0.0438203290104866,
-0.1532401293516159,
-0.08671712130308151,
-0.024648865684866905,
-0.2236001342535019,
0.08533021807670593,
-0.06946314871311188,
-0.13578248023986816,
0.019155733287334442,
0.013867083936929703,
-0.028145823627710342,
0.11776147037744522,
-0.07801362872123718,
-0.03346126526594162,
0.020983682945370674,
-0.039618294686079025,
-0.09754771739244461,
-0.09402462840080261,
-0.07874704152345657,
0.03500581532716751,
-0.04535633698105812,
0.025271590799093246,
-0.05421067774295807,
0.015182215720415115,
0.10334893316030502,
-0.04038224741816521,
-0.041323766112327576,
-0.0359976626932621,
-0.035855069756507874,
-0.11793428659439087,
0.025968458503484726,
0.044103916734457016,
-0.03597194701433182,
-0.05585090070962906,
0.17637495696544647,
-0.04257858544588089,
-0.01666315644979477,
-0.1211012676358223,
0.14332374930381775,
-0.04330325871706009,
0.03261799365282059,
-0.10366860777139664,
-0.08559805154800415,
-0.10071583092212677,
0.27439257502555847,
0.2784624397754669,
-0.14349330961704254,
-0.009759977459907532,
0.02939503826200962,
0.004204166121780872,
-0.14250165224075317,
0.14376720786094666,
0.01570971868932247,
-0.024460898712277412,
-0.027595078572630882,
0.026391539722681046,
-0.007621914613991976,
-0.0827714279294014,
-0.03114704228937626,
-0.05752136558294296,
-0.006779014132916927,
-0.05148708075284958,
-0.034257955849170685,
0.06298708915710449,
-0.12136059254407883,
-0.09091135859489441,
-0.05560125410556793,
-0.0083417734131217,
-0.03344108536839485,
-0.07473809272050858,
-0.019548200070858,
0.07662302255630493,
0.14781777560710907,
-0.05502733215689659,
0.06005467101931572,
-0.004367031157016754,
-0.04969286173582077,
-0.13970479369163513,
-0.13660922646522522,
0.05449144169688225,
-0.129489928483963,
0.26909253001213074,
-0.050524767488241196,
-0.05207161232829094,
0.041712693870067596,
-0.03221052139997482,
-0.05838879942893982,
0.020522039383649826,
0.009778409264981747,
-0.05078497156500816,
-0.029240628704428673,
0.09255361557006836,
-0.033305004239082336,
0.009149706922471523,
-0.022496739402413368,
-0.22135144472122192,
0.0034119023475795984,
-0.05107501149177551,
0.028507398441433907,
-0.12569822371006012,
0.06501629203557968,
-0.09348012506961823,
0.12403472512960434,
0.07595156878232956,
-0.01166640967130661,
-0.036088403314352036,
-0.04733064025640488,
0.1257045865058899,
0.08392459154129028,
-0.02910126931965351,
-0.0870935395359993,
-0.16758979856967926,
-0.004611360374838114,
-0.0011314527364447713,
-0.08687946200370789,
-0.23090760409832,
-0.008421163074672222,
-0.031696807593107224,
0.0109195401892066,
-0.00838692206889391,
0.12826944887638092,
0.14749252796173096,
0.05249129980802536,
0.016358694061636925,
-0.12719306349754333,
0.041898638010025024,
0.08496948331594467,
-0.15762199461460114,
-0.1707899123430252
] |
null | null | peft |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
[<img src="https://raw.githubusercontent.com/OpenAccess-AI-Collective/axolotl/main/image/axolotl-badge-web.png" alt="Built with Axolotl" width="200" height="32"/>](https://github.com/OpenAccess-AI-Collective/axolotl)
<details><summary>See axolotl config</summary>
axolotl version: `0.4.0`
```yaml
adapter: qlora
base_model: TinyLlama/TinyLlama-1.1B-intermediate-step-1431k-3T
bf16: false
dataset_prepared_path: null
datasets:
- path: mhenrichsen/alpaca_2k_test
type: alpaca
debug: null
deepspeed: null
early_stopping_patience: null
evals_per_epoch: null
flash_attention: false
fp16: true
fsdp: null
fsdp_config: null
gradient_accumulation_steps: 1
gradient_checkpointing: true
group_by_length: false
is_llama_derived_model: true
learning_rate: 0.0002
load_in_4bit: true
load_in_8bit: false
local_rank: null
logging_steps: 1
lora_alpha: 16
lora_dropout: 0.05
lora_fan_in_fan_out: null
lora_model_dir: null
lora_r: 32
lora_target_linear: true
lora_target_modules: null
lr_scheduler: cosine
max_steps: 20
micro_batch_size: 1
mlflow_experiment_name: colab-example
model_type: LlamaForCausalLM
num_epochs: 4
optimizer: paged_adamw_32bit
output_dir: ./hug_test
pad_to_sequence_len: true
resume_from_checkpoint: null
sample_packing: true
saves_per_epoch: null
sequence_len: 1096
special_tokens: null
strict: false
tf32: false
tokenizer_type: LlamaTokenizer
train_on_inputs: false
val_set_size: 0.05
wandb_entity: null
wandb_log_model: null
wandb_name: null
wandb_project: null
wandb_watch: null
warmup_steps: 10
weight_decay: 0.0
xformers_attention: null
```
</details><br>
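To reproduce a run with the configuration above, axolotl 0.4.0 is typically launched through `accelerate`. The snippet below is only a minimal sketch: it assumes `axolotl` and `accelerate` are installed, and `hug_test.yml` is a placeholder name for wherever you save the YAML shown above.

```python
# Minimal sketch: save the YAML config above to a file and launch axolotl via accelerate.
# Assumes `axolotl` (0.4.0) and `accelerate` are installed; "hug_test.yml" is a placeholder path.
import subprocess

CONFIG_PATH = "hug_test.yml"  # the axolotl YAML config shown above, saved locally

subprocess.run(
    ["accelerate", "launch", "-m", "axolotl.cli.train", CONFIG_PATH],
    check=True,  # raise if the training process exits with a non-zero status
)
```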
# hug_test
This model is a fine-tuned version of [TinyLlama/TinyLlama-1.1B-intermediate-step-1431k-3T](https://huggingface.co/TinyLlama/TinyLlama-1.1B-intermediate-step-1431k-3T), trained on the mhenrichsen/alpaca_2k_test dataset specified in the axolotl config above.
It achieves the following results on the evaluation set:
- Loss: 1.2788
## Model description
More information needed
## Intended uses & limitations
More information needed
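This repository appears to contain a QLoRA adapter rather than full model weights, so it would normally be loaded on top of the TinyLlama base model with PEFT. The snippet below is an untested sketch: the 4-bit settings mirror the training config above, and the alpaca-style prompt format is an assumption based on the training data, not a documented interface.

```python
# Minimal sketch: load the QLoRA adapter on top of the TinyLlama base model.
# Assumes `transformers`, `peft`, and `bitsandbytes` are installed; the 4-bit settings
# mirror the training config above, and the prompt format is an assumption.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig
from peft import PeftModel

BASE_ID = "TinyLlama/TinyLlama-1.1B-intermediate-step-1431k-3T"
ADAPTER_ID = "joseagmz/hug_test"

bnb_config = BitsAndBytesConfig(load_in_4bit=True, bnb_4bit_compute_dtype=torch.float16)

base = AutoModelForCausalLM.from_pretrained(
    BASE_ID, quantization_config=bnb_config, device_map="auto"
)
model = PeftModel.from_pretrained(base, ADAPTER_ID)  # attach the LoRA adapter weights
tokenizer = AutoTokenizer.from_pretrained(BASE_ID)

prompt = "### Instruction:\nWrite a haiku about the sea.\n\n### Response:\n"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```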
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (an illustrative optimizer/scheduler sketch follows the list):
- learning_rate: 0.0002
- train_batch_size: 1
- eval_batch_size: 1
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: cosine
- lr_scheduler_warmup_steps: 10
- training_steps: 20
- mixed_precision_training: Native AMP
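For illustration only, the settings above map roughly onto the following PyTorch/transformers setup; the actual run used bitsandbytes' `paged_adamw_32bit` (per the axolotl config), so plain `AdamW` and the placeholder model here are stand-ins.

```python
# Rough equivalent of the optimizer/scheduler settings listed above.
# The real run used bitsandbytes' paged_adamw_32bit; plain AdamW is a stand-in,
# and the tiny Linear module is only a placeholder for the PEFT-wrapped model.
import torch
from transformers import get_cosine_schedule_with_warmup

model = torch.nn.Linear(8, 8)  # placeholder; in practice, the model being fine-tuned

optimizer = torch.optim.AdamW(
    model.parameters(), lr=2e-4, betas=(0.9, 0.999), eps=1e-8, weight_decay=0.0
)
# 10 warmup steps followed by cosine decay over the 20 total training steps.
scheduler = get_cosine_schedule_with_warmup(
    optimizer, num_warmup_steps=10, num_training_steps=20
)
```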
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:----:|:---------------:|
| 1.3201 | 0.05 | 20 | 1.2788 |
### Framework versions
- PEFT 0.8.2
- Transformers 4.38.0.dev0
- Pytorch 2.1.2+cu121
- Datasets 2.17.0
- Tokenizers 0.15.0 | {"license": "apache-2.0", "library_name": "peft", "tags": ["generated_from_trainer"], "base_model": "TinyLlama/TinyLlama-1.1B-intermediate-step-1431k-3T", "model-index": [{"name": "hug_test", "results": []}]} | null | joseagmz/hug_test | [
"peft",
"safetensors",
"llama",
"generated_from_trainer",
"base_model:TinyLlama/TinyLlama-1.1B-intermediate-step-1431k-3T",
"license:apache-2.0",
"4-bit",
"region:us"
] | 2024-02-14T16:07:04+00:00 | [] | [] | TAGS
#peft #safetensors #llama #generated_from_trainer #base_model-TinyLlama/TinyLlama-1.1B-intermediate-step-1431k-3T #license-apache-2.0 #4-bit #region-us
| <img src="URL alt="Built with Axolotl" width="200" height="32"/>
See axolotl config
axolotl version: '0.4.0'
hug\_test
=========
This model is a fine-tuned version of TinyLlama/TinyLlama-1.1B-intermediate-step-1431k-3T, trained on the mhenrichsen/alpaca_2k_test dataset.
It achieves the following results on the evaluation set:
* Loss: 1.2788
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 0.0002
* train\_batch\_size: 1
* eval\_batch\_size: 1
* seed: 42
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: cosine
* lr\_scheduler\_warmup\_steps: 10
* training\_steps: 20
* mixed\_precision\_training: Native AMP
### Training results
### Framework versions
* PEFT 0.8.2
* Transformers 4.38.0.dev0
* Pytorch 2.1.2+cu121
* Datasets 2.17.0
* Tokenizers 0.15.0
| [
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 0.0002\n* train\\_batch\\_size: 1\n* eval\\_batch\\_size: 1\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: cosine\n* lr\\_scheduler\\_warmup\\_steps: 10\n* training\\_steps: 20\n* mixed\\_precision\\_training: Native AMP",
"### Training results",
"### Framework versions\n\n\n* PEFT 0.8.2\n* Transformers 4.38.0.dev0\n* Pytorch 2.1.2+cu121\n* Datasets 2.17.0\n* Tokenizers 0.15.0"
] | [
"TAGS\n#peft #safetensors #llama #generated_from_trainer #base_model-TinyLlama/TinyLlama-1.1B-intermediate-step-1431k-3T #license-apache-2.0 #4-bit #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 0.0002\n* train\\_batch\\_size: 1\n* eval\\_batch\\_size: 1\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: cosine\n* lr\\_scheduler\\_warmup\\_steps: 10\n* training\\_steps: 20\n* mixed\\_precision\\_training: Native AMP",
"### Training results",
"### Framework versions\n\n\n* PEFT 0.8.2\n* Transformers 4.38.0.dev0\n* Pytorch 2.1.2+cu121\n* Datasets 2.17.0\n* Tokenizers 0.15.0"
] | [
63,
130,
4,
44
] | [
"passage: TAGS\n#peft #safetensors #llama #generated_from_trainer #base_model-TinyLlama/TinyLlama-1.1B-intermediate-step-1431k-3T #license-apache-2.0 #4-bit #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 0.0002\n* train\\_batch\\_size: 1\n* eval\\_batch\\_size: 1\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: cosine\n* lr\\_scheduler\\_warmup\\_steps: 10\n* training\\_steps: 20\n* mixed\\_precision\\_training: Native AMP### Training results### Framework versions\n\n\n* PEFT 0.8.2\n* Transformers 4.38.0.dev0\n* Pytorch 2.1.2+cu121\n* Datasets 2.17.0\n* Tokenizers 0.15.0"
] | [
-0.1360912024974823,
0.08209233731031418,
-0.004137488082051277,
0.10144400596618652,
0.1056944727897644,
0.014718879945576191,
0.10730763524770737,
0.12315692007541656,
-0.08877985179424286,
0.10782807320356369,
0.11671297997236252,
0.0749816745519638,
0.05895868316292763,
0.170396089553833,
-0.030039867386221886,
-0.2207590937614441,
0.021036798134446144,
-0.02696794457733631,
-0.05418599396944046,
0.12544670701026917,
0.08774270862340927,
-0.12925514578819275,
0.05179305002093315,
-0.02387751266360283,
-0.10906312614679337,
-0.050556447356939316,
-0.016448576003313065,
-0.035066064447164536,
0.12146554887294769,
-0.009849143214523792,
0.1310388296842575,
0.05323630943894386,
0.11977913975715637,
-0.20622403919696808,
0.008794363588094711,
0.06312159448862076,
0.02179144136607647,
0.0738590881228447,
0.08244249224662781,
0.0023204279132187366,
0.08015505224466324,
-0.11269722878932953,
0.06162257492542267,
0.006978798191994429,
-0.13543769717216492,
-0.20375265181064606,
-0.13109458982944489,
0.08404260873794556,
0.1105891689658165,
0.08971339464187622,
0.0008055068319663405,
0.09370478242635727,
-0.08956774324178696,
0.07969870418310165,
0.2618309259414673,
-0.25414639711380005,
-0.07267454266548157,
0.0204668790102005,
0.055650368332862854,
0.09210482984781265,
-0.091792032122612,
-0.05992183834314346,
0.0293468926101923,
0.04482914134860039,
0.10545267909765244,
-0.010971580632030964,
-0.07475259900093079,
-0.021942496299743652,
-0.15204432606697083,
-0.0790763720870018,
0.0702517181634903,
0.03219779580831528,
-0.04691414162516594,
-0.03258863836526871,
-0.052431125193834305,
-0.21966202557086945,
-0.06688594073057175,
0.013260046020150185,
0.042333584278821945,
-0.038151368498802185,
-0.027439404278993607,
0.01001439243555069,
-0.09975020587444305,
-0.08355509489774704,
-0.00025523113436065614,
0.107143335044384,
0.05953933298587799,
0.0032201337162405252,
0.008957715705037117,
0.12831014394760132,
0.03326411545276642,
-0.1541328728199005,
-0.01341236848384142,
0.028553567826747894,
-0.008343774825334549,
-0.020283672958612442,
-0.0264352485537529,
0.04424767196178436,
0.04728711023926735,
0.1480516940355301,
-0.13400334119796753,
0.04475496709346771,
0.02446635439991951,
0.04133804142475128,
-0.12026721239089966,
0.11625431478023529,
-0.07385901361703873,
-0.005571128334850073,
0.03729362040758133,
0.13941894471645355,
0.04994181916117668,
-0.0034550996497273445,
-0.05939175933599472,
0.012788485735654831,
0.11063652485609055,
0.08224228024482727,
-0.011803037486970425,
0.002978743752464652,
-0.0681813582777977,
0.005246944259852171,
0.07479342818260193,
-0.1052403524518013,
0.020457014441490173,
0.03442682698369026,
-0.056130070239305496,
-0.05965699255466461,
0.020752178505063057,
0.011594753712415695,
-0.022302815690636635,
0.09445763379335403,
-0.07055280357599258,
0.013286259956657887,
-0.07277444005012512,
-0.10420345515012741,
0.04635332524776459,
-0.0791657343506813,
-0.008470519445836544,
-0.09879232943058014,
-0.14545655250549316,
-0.06086936220526695,
0.013955592177808285,
-0.04419037699699402,
-0.053459081798791885,
-0.053214702755212784,
-0.10550545156002045,
0.0202175360172987,
-0.024834711104631424,
0.13178782165050507,
-0.07696732878684998,
0.12982924282550812,
-0.018142078071832657,
0.05529836565256119,
0.0006833295919932425,
0.035538364201784134,
-0.07424845546483994,
0.06308833509683609,
-0.1610117405653,
0.02633700706064701,
-0.07239066809415817,
0.05075729265809059,
-0.11458786576986313,
-0.09941425919532776,
0.0033347252756357193,
-0.039290573447942734,
0.12214487791061401,
0.12058904767036438,
-0.17317058145999908,
-0.024359382688999176,
0.20831318199634552,
-0.09853674471378326,
-0.11329035460948944,
0.11419129371643066,
-0.02105860412120819,
-0.013056882657110691,
0.03162361681461334,
0.17061252892017365,
0.0852997750043869,
-0.11495500802993774,
-0.008931544609367847,
-0.024135341867804527,
0.09940745681524277,
-0.02408423461019993,
0.09515884518623352,
-0.0043136137537658215,
0.02583313174545765,
0.0020713810808956623,
-0.07650704681873322,
0.05151631310582161,
-0.11220725625753403,
-0.08276734501123428,
-0.03724517300724983,
-0.08497398346662521,
0.011261132545769215,
0.047723203897476196,
0.010533135384321213,
-0.09671977907419205,
-0.08656026422977448,
-0.018869945779442787,
0.13529497385025024,
-0.061838146299123764,
0.03745550662279129,
-0.05746958404779434,
0.10632918030023575,
-0.012200151570141315,
-0.027791569009423256,
-0.17736956477165222,
-0.05900794640183449,
0.03487471491098404,
-0.006493990309536457,
-0.025099044665694237,
-0.06628181785345078,
0.05503137782216072,
0.08904756605625153,
-0.031522076576948166,
-0.05389149487018585,
-0.07335396856069565,
-0.01091640442609787,
-0.11070267856121063,
-0.22832635045051575,
-0.05234960466623306,
-0.02949904277920723,
0.17029425501823425,
-0.2077414095401764,
0.019259372726082802,
-0.006477476563304663,
0.09663892537355423,
0.01097849104553461,
-0.03187912330031395,
-0.013162103481590748,
0.0956932082772255,
-0.015862807631492615,
-0.07946117967367172,
0.060027990490198135,
0.019455110654234886,
-0.05870452895760536,
0.0176328644156456,
-0.12591086328029633,
0.08452154695987701,
0.10323291271924973,
0.02930166944861412,
-0.0968606024980545,
-0.022759921848773956,
-0.07289167493581772,
-0.041125666350126266,
-0.02997259423136711,
0.04270815849304199,
0.1034206971526146,
0.030047768726944923,
0.11984877288341522,
-0.09232470393180847,
-0.039130497723817825,
0.03717071935534477,
-0.012918622232973576,
0.0504978746175766,
0.14905746281147003,
0.08202851563692093,
-0.028573909774422646,
0.12201746553182602,
0.1208169162273407,
-0.06506498903036118,
0.06595245748758316,
-0.07509516924619675,
-0.10192137211561203,
-0.027292419224977493,
0.05153757333755493,
0.03418402373790741,
0.12984804809093475,
-0.04083535075187683,
0.026120221242308617,
0.023035401478409767,
0.016733216121792793,
0.003619810566306114,
-0.24314607679843903,
-0.0361454114317894,
0.015453807078301907,
-0.07381319254636765,
-0.03494082763791084,
-0.03127282112836838,
0.01644134148955345,
0.10719325393438339,
0.004698523785918951,
-0.07969909906387329,
-0.019813036546111107,
-0.0059280977584421635,
-0.0837000384926796,
0.21577978134155273,
-0.09325334429740906,
-0.14774289727210999,
-0.0845467820763588,
-0.002305096946656704,
0.01831509731709957,
-0.013541369698941708,
0.06665629893541336,
-0.05221015587449074,
-0.018759574741125107,
-0.10737741738557816,
-0.053210388869047165,
0.015242028050124645,
0.02007407508790493,
-0.02006901428103447,
0.014126158319413662,
0.06369780749082565,
-0.09493348747491837,
0.0021163399796932936,
-0.03075341321527958,
-0.04405231028795242,
0.056690946221351624,
0.04242244362831116,
0.10452056676149368,
0.16226765513420105,
0.015373185276985168,
-0.015261335298418999,
-0.0331280492246151,
0.2025972604751587,
-0.0697004571557045,
-0.021737689152359962,
0.1434333175420761,
0.005973448045551777,
0.07221149653196335,
0.14631113409996033,
0.04310739412903786,
-0.08623258769512177,
0.001681761466898024,
0.00741746136918664,
-0.028945554047822952,
-0.2468383014202118,
-0.053438205271959305,
-0.03789486363530159,
0.012967805378139019,
0.09626077115535736,
0.03663584962487221,
-0.035930484533309937,
0.03851550072431564,
-0.01690765656530857,
0.021436434239149094,
-0.0067323921248316765,
0.058566298335790634,
0.06262414157390594,
0.03372426703572273,
0.11414415389299393,
-0.027767762541770935,
-0.008493732661008835,
0.04149038717150688,
0.005129432305693626,
0.2337019145488739,
-0.04052995517849922,
0.1334950178861618,
0.07252068817615509,
0.20431937277317047,
-0.002425580518320203,
0.060955777764320374,
0.009591512382030487,
-0.018789146095514297,
0.009040473960340023,
-0.058027081191539764,
-0.04368217661976814,
0.006825354881584644,
-0.0378255620598793,
0.03890949487686157,
-0.1328885704278946,
0.035689741373062134,
0.022581737488508224,
0.3167385756969452,
0.07268016040325165,
-0.3240070939064026,
-0.0944194495677948,
-0.011372797191143036,
-0.024180786684155464,
-0.03943004831671715,
0.01123755518347025,
0.13381703197956085,
-0.07378608733415604,
0.007517893798649311,
-0.04563410207629204,
0.08204831928014755,
-0.03844169154763222,
0.00176235509570688,
0.04647192358970642,
0.07715712487697601,
0.005625084973871708,
0.04800791293382645,
-0.22776047885417938,
0.2941702604293823,
-0.0005596203845925629,
0.06926444172859192,
-0.02640143781900406,
-0.007613550405949354,
0.028346549719572067,
0.053070880472660065,
0.11055172234773636,
-0.0008057841332629323,
-0.06377466768026352,
-0.224630206823349,
-0.16005036234855652,
0.02933996357023716,
0.11218240112066269,
-0.031119566410779953,
0.12256129086017609,
-0.022762637585401535,
-0.001467683701775968,
0.030554288998246193,
-0.08529195934534073,
-0.08317343145608902,
-0.059090133756399155,
0.01379405241459608,
0.010741442441940308,
0.005863257218152285,
-0.11013159900903702,
-0.10524366796016693,
-0.03371297940611839,
0.11291442066431046,
-0.09314007312059402,
-0.04232544079422951,
-0.12157871574163437,
0.07029851526021957,
0.14562776684761047,
-0.08624144643545151,
0.060171160846948624,
0.003069171216338873,
0.07631723582744598,
0.015201382339000702,
-0.012349658645689487,
0.07956293225288391,
-0.08610492944717407,
-0.2369043231010437,
-0.06456948816776276,
0.15683068335056305,
0.04042849689722061,
0.07084476202726364,
-0.010984516702592373,
0.0259039718657732,
-0.011664321646094322,
-0.08875963091850281,
0.0033228935208171606,
0.03808784857392311,
0.07993114739656448,
0.015954675152897835,
-0.05092744901776314,
0.04915464296936989,
-0.06272384524345398,
-0.05252031236886978,
0.13745033740997314,
0.32897430658340454,
-0.09031227976083755,
0.06317101418972015,
0.045468688011169434,
-0.05209168791770935,
-0.20992186665534973,
-0.024215618148446083,
0.10743514448404312,
0.015533996745944023,
0.00857541337609291,
-0.15520460903644562,
0.050506141036748886,
0.1325676143169403,
-0.034150753170251846,
0.12861309945583344,
-0.36072322726249695,
-0.12415292114019394,
0.07381215691566467,
0.13065561652183533,
0.04648450389504433,
-0.173843652009964,
-0.050216104835271835,
0.01720222271978855,
-0.12266118079423904,
0.0713503360748291,
-0.07820204645395279,
0.11304692924022675,
-0.03145177289843559,
0.018770024180412292,
0.011010550893843174,
-0.05908549204468727,
0.15082047879695892,
-0.016801554709672928,
0.07645057886838913,
-0.02583603374660015,
0.005815194919705391,
0.038300465792417526,
-0.0735732689499855,
0.019368700683116913,
-0.04989897832274437,
0.058617208153009415,
-0.07602690905332565,
-0.007891998626291752,
-0.09386838972568512,
0.009456727653741837,
-0.05424601957201958,
-0.029671670868992805,
-0.03530122712254524,
0.04897737130522728,
0.05959371104836464,
0.0010842602932825685,
0.1069689616560936,
0.005578549578785896,
0.12898658215999603,
0.11059863120317459,
0.04515327885746956,
-0.03190845623612404,
-0.04151616990566254,
-0.007936635985970497,
-0.03543810546398163,
0.0345727913081646,
-0.14910557866096497,
0.018018081784248352,
0.14022381603717804,
0.04512866213917732,
0.09447132796049118,
0.05722591280937195,
-0.053702011704444885,
0.002860978012904525,
0.06905516982078552,
-0.12422837316989899,
-0.08875948190689087,
0.029161887243390083,
0.04247328266501427,
-0.1325048804283142,
0.00025137478951364756,
0.11687736958265305,
-0.07461021840572357,
-0.031685661524534225,
-0.014775075949728489,
0.040155403316020966,
-0.024853253737092018,
0.2228095829486847,
0.06151260435581207,
0.07922453433275223,
-0.09390652924776077,
0.06847316771745682,
0.07807723432779312,
-0.11901291459798813,
0.0015656452160328627,
0.11648595333099365,
-0.1030382588505745,
-0.03750505670905113,
0.07627761363983154,
0.06989257037639618,
-0.01281821634620428,
-0.05633150413632393,
-0.1312331259250641,
-0.11849414557218552,
0.08819720894098282,
0.16685135662555695,
0.06105797365307808,
0.04275597631931305,
0.02408536523580551,
0.008499900810420513,
-0.11489657312631607,
0.10900366306304932,
0.05591801181435585,
0.0785105973482132,
-0.10517722368240356,
0.16665415465831757,
0.009199685417115688,
0.043727461248636246,
-0.0009077903814613819,
0.04734398052096367,
-0.09971586614847183,
0.013881956227123737,
-0.14135828614234924,
0.03464968875050545,
-0.0375409759581089,
-0.014020917005836964,
-0.00474517373368144,
-0.03329002484679222,
-0.03772330656647682,
0.03612108901143074,
-0.09242887794971466,
-0.049335792660713196,
-0.038463279604911804,
0.051849234849214554,
-0.12189368158578873,
-0.04389576613903046,
0.028657354414463043,
-0.10194481909275055,
0.0855807438492775,
0.036369092762470245,
0.05217166244983673,
0.022400567308068275,
-0.1211196705698967,
0.008311193436384201,
0.046211615204811096,
0.009835872799158096,
0.03256797045469284,
-0.14146879315376282,
-0.011191043071448803,
-0.027788208797574043,
-0.006544926203787327,
0.014271392486989498,
0.06909507513046265,
-0.13399329781532288,
0.0021655105520039797,
-0.020295191556215286,
-0.04795059561729431,
-0.04431980848312378,
0.01391071081161499,
0.06712718307971954,
0.027794817462563515,
0.12824483215808868,
-0.08975853770971298,
0.055503927171230316,
-0.2172786146402359,
-0.02068117819726467,
-0.04147455096244812,
-0.05939028412103653,
-0.11520007252693176,
-0.02884775586426258,
0.09777476638555527,
-0.039115335792303085,
0.06422170996665955,
-0.027065575122833252,
0.07393614947795868,
0.021862808614969254,
-0.060904089361429214,
0.024203460663557053,
0.051278796046972275,
0.1645931601524353,
0.03891768679022789,
-0.04338475689291954,
0.04471900686621666,
-0.0014992760261520743,
0.08152031153440475,
0.07711523026227951,
0.2072983980178833,
0.17586000263690948,
0.09988117963075638,
0.07926207035779953,
0.04171212762594223,
-0.10974981635808945,
-0.12784849107265472,
0.049897126853466034,
-0.04564547911286354,
0.0782698467373848,
-0.01689596101641655,
0.2198997586965561,
0.09599392861127853,
-0.18658505380153656,
0.02521061711013317,
-0.05509693920612335,
-0.08104019612073898,
-0.09327524900436401,
-0.007560922764241695,
-0.0819706842303276,
-0.1278616040945053,
-0.016245732083916664,
-0.11597689241170883,
0.028736531734466553,
0.13142849504947662,
0.01815653219819069,
0.02971688099205494,
0.1166284829378128,
0.07052107155323029,
0.02888154610991478,
0.05623115971684456,
0.04080187901854515,
0.01605634018778801,
-0.044692784547805786,
-0.1109132319688797,
0.03180153667926788,
-0.04936928674578667,
0.050785087049007416,
-0.03951585665345192,
-0.029363946989178658,
0.06990746408700943,
0.009283872321248055,
-0.10333090275526047,
0.035654932260513306,
0.0015132986009120941,
0.07695692777633667,
0.09502402693033218,
0.0387992337346077,
0.02677006460726261,
0.0038730944506824017,
0.23525387048721313,
-0.05513367801904678,
-0.10133383423089981,
-0.09935184568166733,
0.2615320682525635,
0.022523460909724236,
-0.034684862941503525,
0.03651580959558487,
-0.08568443357944489,
-0.0009430064237676561,
0.18032920360565186,
0.12889771163463593,
-0.08364956825971603,
-0.006357617676258087,
0.013795559294521809,
-0.009714803658425808,
-0.05651198327541351,
0.11390367150306702,
0.1463940143585205,
0.07864146679639816,
-0.11141712963581085,
-0.06975513696670532,
-0.058156099170446396,
-0.01636878028512001,
-0.07519569247961044,
0.028769774362444878,
-0.004379949066787958,
0.007176767103374004,
-0.04557674750685692,
0.07189201563596725,
-0.04765496030449867,
-0.08872616291046143,
0.097129687666893,
-0.1998348832130432,
-0.18821386992931366,
-0.02568136341869831,
0.060389939695596695,
0.011049900203943253,
0.038296252489089966,
-0.03147729113698006,
-0.013915127143263817,
0.10613782703876495,
-0.03579062223434448,
-0.03922193497419357,
-0.08512534946203232,
0.06604098528623581,
-0.039557721465826035,
0.24088366329669952,
-0.03164484724402428,
0.06208416819572449,
0.11561770737171173,
0.06166217103600502,
-0.1362413763999939,
0.05906347930431366,
0.08449817448854446,
-0.07839402556419373,
0.0326649434864521,
0.11309020966291428,
-0.04664482921361923,
0.05247814208269119,
0.049806736409664154,
-0.1159658208489418,
0.0038179331459105015,
-0.006859015207737684,
-0.03662648797035217,
-0.04801032319664955,
-0.04689915478229523,
-0.03686479106545448,
0.12533435225486755,
0.16890275478363037,
-0.061414629220962524,
0.006821609102189541,
-0.04876116290688515,
0.007599945645779371,
0.05363279581069946,
0.10047231614589691,
-0.022127186879515648,
-0.2273896038532257,
0.046961359679698944,
0.029517292976379395,
0.022236766293644905,
-0.22473949193954468,
-0.08398032933473587,
0.025565842166543007,
-0.06548555940389633,
-0.10871601849794388,
0.10813791304826736,
0.03477159142494202,
0.04001802206039429,
-0.04204392805695534,
-0.06769318133592606,
-0.05763252452015877,
0.16811783611774445,
-0.13990981876850128,
-0.09975819289684296
] |
null | null | null |
# Lora of washington/ワシントン/华盛顿 (Azur Lane)
## What Is This?
This is the LoRA model of waifu washington/ワシントン/华盛顿 (Azur Lane).
## How Is It Trained?
* This model is trained with [HCP-Diffusion](https://github.com/7eu7d7/HCP-Diffusion).
* The [auto-training framework](https://github.com/deepghs/cyberharem) is maintained by [DeepGHS Team](https://huggingface.co/deepghs).
* The base model used for training is [deepghs/animefull-latest](https://huggingface.co/deepghs/animefull-latest).
* Dataset used for training is the `stage3-p480-800` in [CyberHarem/washington_azurlane](https://huggingface.co/datasets/CyberHarem/washington_azurlane), which contains 322 images.
* Batch size is 4, resolution is 720x720, clustering into 5 buckets.
* Batch size for regularization dataset is 11, resolution is 720x720, clustering into 20 buckets.
* Trained for 3240 steps; 40 checkpoints were saved and evaluated.
* **Trigger word is `washington_azurlane`.**
* Pruned core tags for this waifu are `blue_eyes, breasts, large_breasts, short_hair, grey_hair, hair_between_eyes, mole, mole_on_breast, bangs`. You can add them to the prompt when some features of waifu (e.g. hair color) are not stable.
## How to Use It?
### If You Are Using A1111 WebUI v1.7+
**Just use it like the classic LoRA**. The LoRA files we provide are bundled with their embedding files.
### If You Are Using A1111 WebUI v1.6 or Lower
After downloading the pt and safetensors files for the specified step, you need to use them together: the pt file is loaded as an embedding, while the safetensors file is loaded as the LoRA.
For example, if you want to use the model from step 2106, you need to download [`2106/washington_azurlane.pt`](https://huggingface.co/CyberHarem/washington_azurlane/resolve/main/2106/washington_azurlane.pt) as the embedding and [`2106/washington_azurlane.safetensors`](https://huggingface.co/CyberHarem/washington_azurlane/resolve/main/2106/washington_azurlane.safetensors) for loading Lora. By using both files together, you can generate images for the desired characters.
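If you prefer to fetch both files programmatically, the snippet below is a minimal sketch using `huggingface_hub` (an assumption, not part of the original workflow); it only downloads the two step-2106 files, and you still need to place them in your WebUI's embeddings and Lora folders.

```python
# Minimal sketch: fetch the step-2106 embedding (.pt) and LoRA (.safetensors) files.
# Assumes `huggingface_hub` is installed; afterwards, place the .pt file in your WebUI's
# embeddings folder and the .safetensors file in its Lora folder (locations vary by install).
from huggingface_hub import hf_hub_download

REPO_ID = "CyberHarem/washington_azurlane"
STEP = 2106

embedding_path = hf_hub_download(REPO_ID, f"{STEP}/washington_azurlane.pt")
lora_path = hf_hub_download(REPO_ID, f"{STEP}/washington_azurlane.safetensors")

print("embedding:", embedding_path)  # goes into the WebUI embeddings directory
print("LoRA:     ", lora_path)       # goes into the WebUI models/Lora directory
```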
## Which Step Should I Use?
We selected 5 good steps for you to choose from. The best one is step 2106.
1640 images (1.58 GiB) were generated for auto-testing.

The base model used for generating preview images is [Meina/MeinaMix_V11](https://huggingface.co/Meina/MeinaMix_V11).
Here are the preview of the recommended steps:
| Step | Epoch | CCIP | AI Corrupt | Bikini Plus | Score | Download | pattern_0_0 | pattern_0_1 | pattern_1 | pattern_2 | pattern_3_0 | pattern_3_1 | portrait_0 | portrait_1 | portrait_2 | full_body_0 | full_body_1 | profile_0 | profile_1 | free_0 | free_1 | shorts | maid_0 | maid_1 | miko | yukata | suit | china | bikini_0 | bikini_1 | bikini_2 | sit | squat | kneel | jump | crossed_arms | angry | smile | cry | grin | n_lie_0 | n_lie_1 | n_stand_0 | n_stand_1 | n_stand_2 | n_sex_0 | n_sex_1 |
|-------:|--------:|:----------|:-------------|:--------------|:----------|:------------------------------------------------------------------------------------------------------------|:----------------------------------------------|:----------------------------------------------|:------------------------------------------|:------------------------------------------|:----------------------------------------------|:----------------------------------------------|:--------------------------------------------|:--------------------------------------------|:--------------------------------------------|:----------------------------------------------|:----------------------------------------------|:------------------------------------------|:------------------------------------------|:------------------------------------|:------------------------------------|:------------------------------------|:------------------------------------|:------------------------------------|:--------------------------------|:------------------------------------|:--------------------------------|:----------------------------------|:----------------------------------------|:----------------------------------------|:----------------------------------------|:------------------------------|:----------------------------------|:----------------------------------|:--------------------------------|:------------------------------------------------|:----------------------------------|:----------------------------------|:------------------------------|:--------------------------------|:--------------------------------------|:--------------------------------------|:------------------------------------------|:------------------------------------------|:------------------------------------------|:--------------------------------------|:--------------------------------------|
| 2106 | 27 | **0.961** | **0.957** | 0.841 | **0.751** | [Download](https://huggingface.co/CyberHarem/washington_azurlane/resolve/main/2106/washington_azurlane.zip) |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |
| 1458 | 19 | 0.954 | 0.929 | **0.848** | 0.748 | [Download](https://huggingface.co/CyberHarem/washington_azurlane/resolve/main/1458/washington_azurlane.zip) |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |
| 1539 | 20 | 0.951 | 0.955 | 0.846 | 0.739 | [Download](https://huggingface.co/CyberHarem/washington_azurlane/resolve/main/1539/washington_azurlane.zip) |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |
| 1944 | 25 | 0.948 | 0.939 | 0.843 | 0.730 | [Download](https://huggingface.co/CyberHarem/washington_azurlane/resolve/main/1944/washington_azurlane.zip) |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |
| 2835 | 36 | 0.940 | 0.942 | 0.839 | 0.712 | [Download](https://huggingface.co/CyberHarem/washington_azurlane/resolve/main/2835/washington_azurlane.zip) |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |
## Anything Else?
Because the automation of LoRA training always annoys some people, we do not recommend this model for the following groups, and we express our regret:
1. Individuals who cannot tolerate any deviations from the original character design, even in the slightest detail.
2. Individuals who are facing the application scenarios with high demands for accuracy in recreating character outfits.
3. Individuals who cannot accept the potential randomness in AI-generated images based on the Stable Diffusion algorithm.
4. Individuals who are not comfortable with the fully automated process of training character models using LoRA, or those who believe that training character models must be done purely through manual operations to avoid disrespecting the characters.
5. Individuals who find the generated image content offensive to their values.
## All Steps
We uploaded the files for all steps. You can check the images and metrics, and download the files, at the following links:
* [Steps From 2511 to 3240](all/0.md)
* [Steps From 1701 to 2430](all/1.md)
* [Steps From 891 to 1620](all/2.md)
* [Steps From 81 to 810](all/3.md)
| {"license": "mit", "tags": ["art", "not-for-all-audiences"], "datasets": ["CyberHarem/washington_azurlane"], "pipeline_tag": "text-to-image"} | text-to-image | CyberHarem/washington_azurlane | [
"art",
"not-for-all-audiences",
"text-to-image",
"dataset:CyberHarem/washington_azurlane",
"license:mit",
"region:us"
] | 2024-02-14T16:07:29+00:00 | [] | [] | TAGS
#art #not-for-all-audiences #text-to-image #dataset-CyberHarem/washington_azurlane #license-mit #region-us
| Lora of washington/ワシントン/华盛顿 (Azur Lane)
========================================
What Is This?
-------------
This is the LoRA model of waifu washington/ワシントン/华盛顿 (Azur Lane).
How Is It Trained?
------------------
* This model is trained with HCP-Diffusion.
* The auto-training framework is maintained by DeepGHS Team.
* The base model used for training is deepghs/animefull-latest.
* Dataset used for training is the 'stage3-p480-800' in CyberHarem/washington\_azurlane, which contains 322 images.
* Batch size is 4, resolution is 720x720, clustering into 5 buckets.
* Batch size for regularization dataset is 11, resolution is 720x720, clustering into 20 buckets.
* Trained for 3240 steps, 40 checkpoints were saved and evaluated.
* Trigger word is 'washington\_azurlane'.
* Pruned core tags for this waifu are 'blue\_eyes, breasts, large\_breasts, short\_hair, grey\_hair, hair\_between\_eyes, mole, mole\_on\_breast, bangs'. You can add them to the prompt when some features of waifu (e.g. hair color) are not stable.
How to Use It?
--------------
### If You Are Using A1111 WebUI v1.7+
Just use it like the classic LoRA. The LoRA files we provide are bundled with their embedding files.
### If You Are Using A1111 WebUI v1.6 or Lower
After downloading the pt and safetensors files for the specified step, you need to use them simultaneously. The pt file will be used as an embedding, while the safetensors file will be loaded for Lora.
For example, if you want to use the model from step 2106, you need to download '2106/washington\_azurlane.pt' as the embedding and '2106/washington\_azurlane.safetensors' for loading Lora. By using both files together, you can generate images for the desired characters.
Which Step Should I Use?
------------------------
We selected 5 good steps for you to choose from. The best one is step 2106.
1640 images (1.58 GiB) were generated for auto-testing.
!Metrics Plot
The base model used for generating preview images is Meina/MeinaMix\_V11.
Here are the preview of the recommended steps:
Anything Else?
--------------
Because the automation of LoRA training always annoys some people, we do not recommend this model for the following groups, and we express our regret:
1. Individuals who cannot tolerate any deviations from the original character design, even in the slightest detail.
2. Individuals who are facing the application scenarios with high demands for accuracy in recreating character outfits.
3. Individuals who cannot accept the potential randomness in AI-generated images based on the Stable Diffusion algorithm.
4. Individuals who are not comfortable with the fully automated process of training character models using LoRA, or those who believe that training character models must be done purely through manual operations to avoid disrespecting the characters.
5. Individuals who find the generated image content offensive to their values.
All Steps
---------
We uploaded the files for all steps. You can check the images and metrics, and download the files, at the following links:
* Steps From 2511 to 3240
* Steps From 1701 to 2430
* Steps From 891 to 1620
* Steps From 81 to 810
| [
"### If You Are Using A1111 WebUI v1.7+\n\n\nJust use it like the classic LoRA. The LoRA we provided are bundled with the embedding file.",
"### If You Are Using A1111 WebUI v1.6 or Lower\n\n\nAfter downloading the pt and safetensors files for the specified step, you need to use them simultaneously. The pt file will be used as an embedding, while the safetensors file will be loaded for Lora.\n\n\nFor example, if you want to use the model from step 2106, you need to download '2106/washington\\_azurlane.pt' as the embedding and '2106/washington\\_azurlane.safetensors' for loading Lora. By using both files together, you can generate images for the desired characters.\n\n\nWhich Step Should I Use?\n------------------------\n\n\nWe selected 5 good steps for you to choose. The best one is step 2106.\n\n\n1640 images (1.58 GiB) were generated for auto-testing.\n\n\n!Metrics Plot\n\n\nThe base model used for generating preview images is Meina/MeinaMix\\_V11.\n\n\nHere are the preview of the recommended steps:\n\n\n\nAnything Else?\n--------------\n\n\nBecause the automation of LoRA training always annoys some people. So for the following groups, it is not recommended to use this model and we express regret:\n\n\n1. Individuals who cannot tolerate any deviations from the original character design, even in the slightest detail.\n2. Individuals who are facing the application scenarios with high demands for accuracy in recreating character outfits.\n3. Individuals who cannot accept the potential randomness in AI-generated images based on the Stable Diffusion algorithm.\n4. Individuals who are not comfortable with the fully automated process of training character models using LoRA, or those who believe that training character models must be done purely through manual operations to avoid disrespecting the characters.\n5. Individuals who finds the generated image content offensive to their values.\n\n\nAll Steps\n---------\n\n\nWe uploaded the files in all steps. you can check the images, metrics and download them in the following links:\n\n\n* Steps From 2511 to 3240\n* Steps From 1701 to 2430\n* Steps From 891 to 1620\n* Steps From 81 to 810"
] | [
"TAGS\n#art #not-for-all-audiences #text-to-image #dataset-CyberHarem/washington_azurlane #license-mit #region-us \n",
"### If You Are Using A1111 WebUI v1.7+\n\n\nJust use it like the classic LoRA. The LoRA we provided are bundled with the embedding file.",
"### If You Are Using A1111 WebUI v1.6 or Lower\n\n\nAfter downloading the pt and safetensors files for the specified step, you need to use them simultaneously. The pt file will be used as an embedding, while the safetensors file will be loaded for Lora.\n\n\nFor example, if you want to use the model from step 2106, you need to download '2106/washington\\_azurlane.pt' as the embedding and '2106/washington\\_azurlane.safetensors' for loading Lora. By using both files together, you can generate images for the desired characters.\n\n\nWhich Step Should I Use?\n------------------------\n\n\nWe selected 5 good steps for you to choose. The best one is step 2106.\n\n\n1640 images (1.58 GiB) were generated for auto-testing.\n\n\n!Metrics Plot\n\n\nThe base model used for generating preview images is Meina/MeinaMix\\_V11.\n\n\nHere are the preview of the recommended steps:\n\n\n\nAnything Else?\n--------------\n\n\nBecause the automation of LoRA training always annoys some people. So for the following groups, it is not recommended to use this model and we express regret:\n\n\n1. Individuals who cannot tolerate any deviations from the original character design, even in the slightest detail.\n2. Individuals who are facing the application scenarios with high demands for accuracy in recreating character outfits.\n3. Individuals who cannot accept the potential randomness in AI-generated images based on the Stable Diffusion algorithm.\n4. Individuals who are not comfortable with the fully automated process of training character models using LoRA, or those who believe that training character models must be done purely through manual operations to avoid disrespecting the characters.\n5. Individuals who finds the generated image content offensive to their values.\n\n\nAll Steps\n---------\n\n\nWe uploaded the files in all steps. you can check the images, metrics and download them in the following links:\n\n\n* Steps From 2511 to 3240\n* Steps From 1701 to 2430\n* Steps From 891 to 1620\n* Steps From 81 to 810"
] | [
45,
38,
476
] | [
"passage: TAGS\n#art #not-for-all-audiences #text-to-image #dataset-CyberHarem/washington_azurlane #license-mit #region-us \n### If You Are Using A1111 WebUI v1.7+\n\n\nJust use it like the classic LoRA. The LoRA we provided are bundled with the embedding file."
] | [
0.00717578548938036,
0.013200740329921246,
-0.003856855211779475,
0.06761985272169113,
0.06439193338155746,
0.08882088214159012,
0.21729914844036102,
0.08004958927631378,
0.13198846578598022,
-0.07757586240768433,
0.10513994097709656,
0.06263284385204315,
-0.0014127191388979554,
0.02969999611377716,
-0.04049365967512131,
-0.14664071798324585,
-0.06813479959964752,
-0.027685241773724556,
0.01022727508097887,
0.022176750004291534,
0.061584170907735825,
-0.0022652449551969767,
0.09183011204004288,
-0.048600614070892334,
-0.03395473584532738,
0.04888848587870598,
-0.034952327609062195,
-0.04032718017697334,
0.034674856811761856,
0.07428690791130066,
0.12575842440128326,
0.03154916688799858,
0.07035087794065475,
-0.1581050157546997,
0.06476368010044098,
-0.007427901495248079,
-0.10291558504104614,
0.004326204303652048,
0.014505764469504356,
-0.03349285572767258,
0.11925815790891647,
0.027823826298117638,
-0.10590260475873947,
0.04386201500892639,
-0.12948428094387054,
-0.02901134081184864,
-0.05572113022208214,
0.04037047550082207,
0.14314188063144684,
0.05657804012298584,
0.027980200946331024,
0.05964149162173271,
-0.04245547577738762,
0.08196373283863068,
0.11216007173061371,
-0.12734489142894745,
-0.05594291910529137,
0.10630114376544952,
0.005705666728317738,
0.12189814448356628,
-0.07755097001791,
0.08899082243442535,
0.062498126178979874,
-0.045266043394804,
-0.13833105564117432,
-0.0853196531534195,
-0.1900549679994583, 0.0042752488516271114, 0.00797868613153696, … (remaining values of this 768-dimensional embedding vector elided for readability) …, -0.072418712079525 ] |
null | null | peft |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# LoRA-phi-2-coach-lite
This model is a fine-tuned version of [microsoft/phi-2](https://huggingface.co/microsoft/phi-2) on the generator dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 1
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 4
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: cosine
- lr_scheduler_warmup_ratio: 0.05
- num_epochs: 1
- mixed_precision_training: Native AMP
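The list above only records the raw hyperparameter values. As a rough, hedged sketch (not the author's actual training script), they might be wired into a TRL `SFTTrainer` run roughly as follows; the LoRA rank/alpha values and the placeholder dataset are assumptions for illustration only:

```python
from datasets import Dataset
from peft import LoraConfig
from transformers import AutoModelForCausalLM, AutoTokenizer, TrainingArguments
from trl import SFTTrainer

# Base model named in this card.
model_id = "microsoft/phi-2"
model = AutoModelForCausalLM.from_pretrained(model_id)
tokenizer = AutoTokenizer.from_pretrained(model_id)

# LoRA settings are NOT documented in this card; these are assumed values.
peft_config = LoraConfig(r=16, lora_alpha=32, task_type="CAUSAL_LM")

# Placeholder stand-in for the "generator" dataset used for training.
train_dataset = Dataset.from_dict({"text": ["### Question: ...\n### Answer: ..."]})

args = TrainingArguments(
    output_dir="LoRA-phi-2-coach-lite",
    learning_rate=2e-5,
    per_device_train_batch_size=1,
    per_device_eval_batch_size=8,
    gradient_accumulation_steps=4,  # total train batch size of 4
    lr_scheduler_type="cosine",
    warmup_ratio=0.05,
    num_train_epochs=1,
    seed=42,
    fp16=True,  # mixed-precision training (native AMP); Adam betas/epsilon are the defaults listed above
)

trainer = SFTTrainer(
    model=model,
    tokenizer=tokenizer,
    args=args,
    train_dataset=train_dataset,
    dataset_text_field="text",
    peft_config=peft_config,
)
trainer.train()
```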
### Training results
### Framework versions
- PEFT 0.8.2
- Transformers 4.37.0
- Pytorch 2.1.2
- Datasets 2.17.0
- Tokenizers 0.15.1 | {"license": "mit", "library_name": "peft", "tags": ["trl", "sft", "generated_from_trainer"], "datasets": ["generator"], "base_model": "microsoft/phi-2", "model-index": [{"name": "LoRA-phi-2-coach-lite", "results": []}]} | null | Teapack1/LoRA-phi-2-coach-lite | [
"peft",
"tensorboard",
"safetensors",
"trl",
"sft",
"generated_from_trainer",
"dataset:generator",
"base_model:microsoft/phi-2",
"license:mit",
"region:us"
] | 2024-02-14T16:07:47+00:00 | [] | [] | TAGS
#peft #tensorboard #safetensors #trl #sft #generated_from_trainer #dataset-generator #base_model-microsoft/phi-2 #license-mit #region-us
|
# LoRA-phi-2-coach-lite
This model is a fine-tuned version of microsoft/phi-2 on the generator dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 1
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 4
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: cosine
- lr_scheduler_warmup_ratio: 0.05
- num_epochs: 1
- mixed_precision_training: Native AMP
### Training results
### Framework versions
- PEFT 0.8.2
- Transformers 4.37.0
- Pytorch 2.1.2
- Datasets 2.17.0
- Tokenizers 0.15.1 | [
"# LoRA-phi-2-coach-lite\n\nThis model is a fine-tuned version of microsoft/phi-2 on the generator dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 2e-05\n- train_batch_size: 1\n- eval_batch_size: 8\n- seed: 42\n- gradient_accumulation_steps: 4\n- total_train_batch_size: 4\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: cosine\n- lr_scheduler_warmup_ratio: 0.05\n- num_epochs: 1\n- mixed_precision_training: Native AMP",
"### Training results",
"### Framework versions\n\n- PEFT 0.8.2\n- Transformers 4.37.0\n- Pytorch 2.1.2\n- Datasets 2.17.0\n- Tokenizers 0.15.1"
] | [
"TAGS\n#peft #tensorboard #safetensors #trl #sft #generated_from_trainer #dataset-generator #base_model-microsoft/phi-2 #license-mit #region-us \n",
"# LoRA-phi-2-coach-lite\n\nThis model is a fine-tuned version of microsoft/phi-2 on the generator dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 2e-05\n- train_batch_size: 1\n- eval_batch_size: 8\n- seed: 42\n- gradient_accumulation_steps: 4\n- total_train_batch_size: 4\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: cosine\n- lr_scheduler_warmup_ratio: 0.05\n- num_epochs: 1\n- mixed_precision_training: Native AMP",
"### Training results",
"### Framework versions\n\n- PEFT 0.8.2\n- Transformers 4.37.0\n- Pytorch 2.1.2\n- Datasets 2.17.0\n- Tokenizers 0.15.1"
] | [
51,
31,
6,
12,
8,
3,
143,
4,
36
] | [
"passage: TAGS\n#peft #tensorboard #safetensors #trl #sft #generated_from_trainer #dataset-generator #base_model-microsoft/phi-2 #license-mit #region-us \n# LoRA-phi-2-coach-lite\n\nThis model is a fine-tuned version of microsoft/phi-2 on the generator dataset.## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 2e-05\n- train_batch_size: 1\n- eval_batch_size: 8\n- seed: 42\n- gradient_accumulation_steps: 4\n- total_train_batch_size: 4\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: cosine\n- lr_scheduler_warmup_ratio: 0.05\n- num_epochs: 1\n- mixed_precision_training: Native AMP### Training results### Framework versions\n\n- PEFT 0.8.2\n- Transformers 4.37.0\n- Pytorch 2.1.2\n- Datasets 2.17.0\n- Tokenizers 0.15.1"
] | [
-0.10091286897659302, 0.12480033189058304, -0.0038727582432329655, … (remaining values of this 768-dimensional embedding vector elided for readability) …, -0.052924931049346924 ] |
null | null | transformers |
# TinyPoliticaLlama-4x1.1B-nf4
TinyPoliticaLlama-4x1.1B-nf4 is a Mixture of Experts (MoE) made with the following models using [LazyMergekit](https://colab.research.google.com/drive/1obulZ1ROXHjYLn6PPZJwRR6GzgQogxxb?usp=sharing):
* [TinyLlama/TinyLlama-1.1B-Chat-v1.0](https://huggingface.co/TinyLlama/TinyLlama-1.1B-Chat-v1.0)
* [h4rz3rk4s3/TinyNewsLlama-1.1B](https://huggingface.co/h4rz3rk4s3/TinyNewsLlama-1.1B)
* [h4rz3rk4s3/TinyParlaMintLlama-1.1B](https://huggingface.co/h4rz3rk4s3/TinyParlaMintLlama-1.1B)
* [Tensoic/TinyLlama-1.1B-3T-openhermes](https://huggingface.co/Tensoic/TinyLlama-1.1B-3T-openhermes)
## 🧩 Configuration
```yaml
base_model: TinyLlama/TinyLlama-1.1B-Chat-v1.0
dtype: bfloat16
gate_mode: hidden
experts:
- source_model: TinyLlama/TinyLlama-1.1B-Chat-v1.0
positive_prompts: ["chat", "assistant", "tell me", "explain"]
- source_model: h4rz3rk4s3/TinyNewsLlama-1.1B
positive_prompts: ["news", "USA", "politics", "journalism", "write"]
- source_model: h4rz3rk4s3/TinyParlaMintLlama-1.1B
positive_prompts: ["speech", "politics", "EU", "europe", "write"]
- source_model: Tensoic/TinyLlama-1.1B-3T-openhermes
    positive_prompts: ["reason", "provide", "instruct", "summarize", "count"]
```
## 💻 Usage
```python
!pip install -qU transformers bitsandbytes accelerate
from transformers import AutoTokenizer
import transformers
import torch
model = "h4rz3rk4s3/TinyPoliticaLlama-4x1.1B-nf4"
tokenizer = AutoTokenizer.from_pretrained(model)
pipeline = transformers.pipeline(
"text-generation",
model=model,
model_kwargs={"torch_dtype": torch.float16, "load_in_4bit": True},
)
messages = [{"role": "user", "content": "Explain what a Mixture of Experts is in less than 100 words."}]
prompt = pipeline.tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)
outputs = pipeline(prompt, max_new_tokens=256, do_sample=True, temperature=0.7, top_k=50, top_p=0.95)
print(outputs[0]["generated_text"])
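
# --- Hedged addition, not part of the original card ---
# The "-nf4" suffix suggests NF4 4-bit weights; instead of relying on the
# load_in_4bit defaults above, the quantization can be made explicit:
from transformers import AutoModelForCausalLM, BitsAndBytesConfig

bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",             # explicit NF4 quantization
    bnb_4bit_compute_dtype=torch.float16,
)
quantized_model = AutoModelForCausalLM.from_pretrained(
    model,                                  # the repo id string defined above
    quantization_config=bnb_config,
    device_map="auto",
)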
``` | {"license": "apache-2.0", "tags": ["moe", "frankenmoe", "merge", "mergekit", "lazymergekit", "TinyLlama/TinyLlama-1.1B-Chat-v1.0", "h4rz3rk4s3/TinyNewsLlama-1.1B", "h4rz3rk4s3/TinyParlaMintLlama-1.1B", "Tensoic/TinyLlama-1.1B-3T-openhermes"], "base_model": ["TinyLlama/TinyLlama-1.1B-Chat-v1.0", "h4rz3rk4s3/TinyNewsLlama-1.1B", "h4rz3rk4s3/TinyParlaMintLlama-1.1B", "Tensoic/TinyLlama-1.1B-3T-openhermes"]} | text-generation | h4rz3rk4s3/TinyPoliticaLlama-4x1.1B-nf4 | [
"transformers",
"safetensors",
"mixtral",
"text-generation",
"moe",
"frankenmoe",
"merge",
"mergekit",
"lazymergekit",
"TinyLlama/TinyLlama-1.1B-Chat-v1.0",
"h4rz3rk4s3/TinyNewsLlama-1.1B",
"h4rz3rk4s3/TinyParlaMintLlama-1.1B",
"Tensoic/TinyLlama-1.1B-3T-openhermes",
"conversational",
"base_model:TinyLlama/TinyLlama-1.1B-Chat-v1.0",
"base_model:h4rz3rk4s3/TinyNewsLlama-1.1B",
"base_model:h4rz3rk4s3/TinyParlaMintLlama-1.1B",
"base_model:Tensoic/TinyLlama-1.1B-3T-openhermes",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | 2024-02-14T16:08:20+00:00 | [] | [] | TAGS
#transformers #safetensors #mixtral #text-generation #moe #frankenmoe #merge #mergekit #lazymergekit #TinyLlama/TinyLlama-1.1B-Chat-v1.0 #h4rz3rk4s3/TinyNewsLlama-1.1B #h4rz3rk4s3/TinyParlaMintLlama-1.1B #Tensoic/TinyLlama-1.1B-3T-openhermes #conversational #base_model-TinyLlama/TinyLlama-1.1B-Chat-v1.0 #base_model-h4rz3rk4s3/TinyNewsLlama-1.1B #base_model-h4rz3rk4s3/TinyParlaMintLlama-1.1B #base_model-Tensoic/TinyLlama-1.1B-3T-openhermes #license-apache-2.0 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
# TinyPoliticaLlama-4x1.1B-nf4
TinyPoliticaLlama-4x1.1B-nf4 is a Mixture of Experts (MoE) made with the following models using LazyMergekit:
* TinyLlama/TinyLlama-1.1B-Chat-v1.0
* h4rz3rk4s3/TinyNewsLlama-1.1B
* h4rz3rk4s3/TinyParlaMintLlama-1.1B
* Tensoic/TinyLlama-1.1B-3T-openhermes
## Configuration
## Usage
| [
"# TinyPoliticaLlama-4x1.1B-nf4\n\nTinyPoliticaLlama-4x1.1B-nf4 is a Mixure of Experts (MoE) made with the following models using LazyMergekit:\n* TinyLlama/TinyLlama-1.1B-Chat-v1.0\n* h4rz3rk4s3/TinyNewsLlama-1.1B\n* h4rz3rk4s3/TinyParlaMintLlama-1.1B\n* Tensoic/TinyLlama-1.1B-3T-openhermes",
"## Configuration",
"## Usage"
] | [
"TAGS\n#transformers #safetensors #mixtral #text-generation #moe #frankenmoe #merge #mergekit #lazymergekit #TinyLlama/TinyLlama-1.1B-Chat-v1.0 #h4rz3rk4s3/TinyNewsLlama-1.1B #h4rz3rk4s3/TinyParlaMintLlama-1.1B #Tensoic/TinyLlama-1.1B-3T-openhermes #conversational #base_model-TinyLlama/TinyLlama-1.1B-Chat-v1.0 #base_model-h4rz3rk4s3/TinyNewsLlama-1.1B #base_model-h4rz3rk4s3/TinyParlaMintLlama-1.1B #base_model-Tensoic/TinyLlama-1.1B-3T-openhermes #license-apache-2.0 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n",
"# TinyPoliticaLlama-4x1.1B-nf4\n\nTinyPoliticaLlama-4x1.1B-nf4 is a Mixure of Experts (MoE) made with the following models using LazyMergekit:\n* TinyLlama/TinyLlama-1.1B-Chat-v1.0\n* h4rz3rk4s3/TinyNewsLlama-1.1B\n* h4rz3rk4s3/TinyParlaMintLlama-1.1B\n* Tensoic/TinyLlama-1.1B-3T-openhermes",
"## Configuration",
"## Usage"
] | [
242,
125,
4,
3
] | [
"passage: TAGS\n#transformers #safetensors #mixtral #text-generation #moe #frankenmoe #merge #mergekit #lazymergekit #TinyLlama/TinyLlama-1.1B-Chat-v1.0 #h4rz3rk4s3/TinyNewsLlama-1.1B #h4rz3rk4s3/TinyParlaMintLlama-1.1B #Tensoic/TinyLlama-1.1B-3T-openhermes #conversational #base_model-TinyLlama/TinyLlama-1.1B-Chat-v1.0 #base_model-h4rz3rk4s3/TinyNewsLlama-1.1B #base_model-h4rz3rk4s3/TinyParlaMintLlama-1.1B #base_model-Tensoic/TinyLlama-1.1B-3T-openhermes #license-apache-2.0 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n# TinyPoliticaLlama-4x1.1B-nf4\n\nTinyPoliticaLlama-4x1.1B-nf4 is a Mixure of Experts (MoE) made with the following models using LazyMergekit:\n* TinyLlama/TinyLlama-1.1B-Chat-v1.0\n* h4rz3rk4s3/TinyNewsLlama-1.1B\n* h4rz3rk4s3/TinyParlaMintLlama-1.1B\n* Tensoic/TinyLlama-1.1B-3T-openhermes## Configuration## Usage"
] | [
-0.026751382276415825, 0.11679167300462723, -0.00819103792309761, … (remaining values of this 768-dimensional embedding vector elided for readability) …, -0.10225988924503326 ] |
null | null | null | # InfernoSaber Model Repository
## Overview
Welcome to InfernoSaber, an Automapper for BeatSaber with fully adjustable difficulty.
### Model List
Recommendation of the models/branches in the following order:
- expert_15: Current favorite, trained on curated high difficulty maps (8+ nps) with a like/dislike rate >90%. Usually generates good flow.
- easy_15: Trained on curated low difficulty maps (5- nps) with a like/dislike rate >90%. More creativity but less flow, especially for high difficulty maps.
- pp3_15: Trained on random maps from my personal collection, mainly ranked high PP maps.
### Model Details
- **Model Name**: InfernoSaber
- **Model Version**: v.1.7.0
- **Architecture**: Multiple custom DNN and Autoencoder with TF v.15
- **Training Objective**: Classification/Creation of song maps
- **Language**: Python
- **License**: Model and source code are free to use
- **Repository**: https://github.com/fred-brenner/InfernoSaber---BeatSaber-Automapper
## Intended Use
This repository contains the trained models for inference (go to the other branches at the top: "Files and versions", click on "main" -> select "expert_15" or similar).
All models are trained on different datasets with mixed genre.
The models are required by the source code on GitHub.
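If you prefer to fetch one of the branches programmatically instead of through the web UI, a minimal sketch using `huggingface_hub` could look like the following; the branch name and local folder are assumptions, so adjust them to your setup.
```python
from huggingface_hub import snapshot_download

# Download a single model branch (e.g. "expert_15") listed under "Files and versions".
# The revision argument selects the branch; swap in "easy_15" or "pp3_15" as needed.
local_dir = snapshot_download(
    repo_id="BierHerr/InfernoSaber",
    revision="expert_15",
    local_dir="inferno_saber_expert_15",  # assumed local target folder
)
print("Model files stored in:", local_dir)
```
The downloaded folder can then be used by the InfernoSaber source code from the repository linked above.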
## Inference and Training
The models can be run on most systems and do not require a GPU for inference.
For training / creation of new models, the following specs are required:
- 10-20 GB free RAM per 50 maps (mainly depending on the creativity of the maps)
- 8-15 GB VRAM per 50 maps (mainly depending on the creativity of the maps)
## Limitations and Ethical Considerations
The models and corresponding source code are *free to use*.
Selling the generated maps is prohibited, and labeling them as human-made is against the BeatSaver policies.
This project is not meant to replace human-made maps, but to serve as an extension that covers more unknown songs and massively reduces the amount of human effort required.
## Contact Information
If you have comments or suggestions, or want to contribute to new models/features, feel free to open a discussion.
Cheers!
| {"license": "mit"} | null | BierHerr/InfernoSaber | [
"license:mit",
"region:us"
] | 2024-02-14T16:09:39+00:00 | [] | [] | TAGS
#license-mit #region-us
| # InfernoSaber Model Repository
## Overview
Welcome to the InfernoSaber, an Automapper for BeatSaber with fully adjustable difficulty.
### Model List
Recommendation of the models/branches in the following order:
- expert_15: Current favorite, trained on curated high difficulty maps (8+ nps) with a like/dislike rate >90%. Usually generates good flow.
- easy_15: Trained on curated low difficulty maps (5- nps) with a like/dislike rate >90%. More creativity but less flow, especially for high difficulty maps.
- pp3_15: Trained on random maps from my personal collection, mainly ranked high PP maps.
### Model Details
- Model Name: InfernoSaber
- Model Version: v.1.7.0
- Architecture: Multiple custom DNN and Autoencoder with TF v.15
- Training Objective: Classification/Creation of song maps
- Language: Python
- License: Model and source code are free to use
- Repository: URL
## Intended Use
This repository contains the trained models for inference (go to the other branches at the top: "Files and versions", click on "main" -> select "expert_15" or similar).
All models are trained on different datasets with mixed genre.
The models are required by the source code at github.
## Inference and Training
The models can be run on most systems and do not require GPU for inference.
For training / creation of new models, the following specs are required:
- 10-20 GB free RAM per 50 maps (mainly depending on the creativity of the maps)
- 8-15 GB VRAM per 50 maps (mainly depending on the creativity of the maps)
## Limitations and Ethical Considerations
The models and corresponding source code are *free to use*.
Selling the generated maps is prohibited, and labeling them as human-made is against the BeatSaver policies.
This project is not meant to replace human-made maps, but to serve as an extension that covers more unknown songs and massively reduces the amount of human effort required.
## Contact Information
If you have comments, suggestions, or want to contribute to new models/features feel free to open a discussion.
Cheers!
| [
"# InfernoSaber Model Repository",
"## Overview\n\nWelcome to the InfernoSaber, an Automapper for BeatSaber with fully adjusteable difficulty.",
"### Model List\n\nRecommendation of the models/branches in the following order:\n- expert_15: Current favorite, trained on curated high difficulty maps (8+ nps) with a like/dislike rate >90%. Usually generates good flow.\n- easy_15: Trained on curated low difficulty maps (5- nps) with a like/dislike rate >90%. More creativity but less flow, especially for high difficulty maps.\n- pp3_15: Trained on random maps from my personal collection, mainly ranked high PP maps.",
"### Model Details\n\n- Model Name: InfernoSaber\n- Model Version: v.1.7.0\n- Architecture: Multiple custom DNN and Autoencoder with TF v.15\n- Training Objective: Classification/Creation of song maps\n- Language: Python\n- License: Model and source code are free to use\n- Repository: URL",
"## Intended Use\n\nThis repository contains the trained models for inference (go to the other branches at the top: \"Files and versions\", click on \"main\" -> select \"expert_15\" or similar).\nAll models are trained on different datasets with mixed genre.\nThe models are required by the source code at github.",
"## Inference and Training\n\nThe models can be run on most systems and do not require GPU for inference.\n\nFor training / creation of new models, the following specs are required:\n- 10-20 GB free RAM per 50 maps (mainly depending on the creativity of the maps)\n- 8-15 GB VRAM per 50 maps (mainly depending on the creativity of the maps)",
"## Limitations and Ethical Considerations\n\nThe models and corresponding source code are *free to use*.\nSelling the generated maps is prohibited and labeling as human made is against the beatsaver policies.\n\nThis project is not meant to be replacing human maps, but as extension to cover more unkown songs and massively reduce the amount of human effort required.",
"## Contact Information\n\nIf you have comments, suggestions, or want to contribute to new models/features feel free to open a discussion.\nCheers!"
] | [
"TAGS\n#license-mit #region-us \n",
"# InfernoSaber Model Repository",
"## Overview\n\nWelcome to the InfernoSaber, an Automapper for BeatSaber with fully adjusteable difficulty.",
"### Model List\n\nRecommendation of the models/branches in the following order:\n- expert_15: Current favorite, trained on curated high difficulty maps (8+ nps) with a like/dislike rate >90%. Usually generates good flow.\n- easy_15: Trained on curated low difficulty maps (5- nps) with a like/dislike rate >90%. More creativity but less flow, especially for high difficulty maps.\n- pp3_15: Trained on random maps from my personal collection, mainly ranked high PP maps.",
"### Model Details\n\n- Model Name: InfernoSaber\n- Model Version: v.1.7.0\n- Architecture: Multiple custom DNN and Autoencoder with TF v.15\n- Training Objective: Classification/Creation of song maps\n- Language: Python\n- License: Model and source code are free to use\n- Repository: URL",
"## Intended Use\n\nThis repository contains the trained models for inference (go to the other branches at the top: \"Files and versions\", click on \"main\" -> select \"expert_15\" or similar).\nAll models are trained on different datasets with mixed genre.\nThe models are required by the source code at github.",
"## Inference and Training\n\nThe models can be run on most systems and do not require GPU for inference.\n\nFor training / creation of new models, the following specs are required:\n- 10-20 GB free RAM per 50 maps (mainly depending on the creativity of the maps)\n- 8-15 GB VRAM per 50 maps (mainly depending on the creativity of the maps)",
"## Limitations and Ethical Considerations\n\nThe models and corresponding source code are *free to use*.\nSelling the generated maps is prohibited and labeling as human made is against the beatsaver policies.\n\nThis project is not meant to be replacing human maps, but as extension to cover more unkown songs and massively reduce the amount of human effort required.",
"## Contact Information\n\nIf you have comments, suggestions, or want to contribute to new models/features feel free to open a discussion.\nCheers!"
] | [
11,
10,
28,
131,
74,
78,
87,
82,
30
] | [
"passage: TAGS\n#license-mit #region-us \n# InfernoSaber Model Repository## Overview\n\nWelcome to the InfernoSaber, an Automapper for BeatSaber with fully adjusteable difficulty.### Model List\n\nRecommendation of the models/branches in the following order:\n- expert_15: Current favorite, trained on curated high difficulty maps (8+ nps) with a like/dislike rate >90%. Usually generates good flow.\n- easy_15: Trained on curated low difficulty maps (5- nps) with a like/dislike rate >90%. More creativity but less flow, especially for high difficulty maps.\n- pp3_15: Trained on random maps from my personal collection, mainly ranked high PP maps.### Model Details\n\n- Model Name: InfernoSaber\n- Model Version: v.1.7.0\n- Architecture: Multiple custom DNN and Autoencoder with TF v.15\n- Training Objective: Classification/Creation of song maps\n- Language: Python\n- License: Model and source code are free to use\n- Repository: URL## Intended Use\n\nThis repository contains the trained models for inference (go to the other branches at the top: \"Files and versions\", click on \"main\" -> select \"expert_15\" or similar).\nAll models are trained on different datasets with mixed genre.\nThe models are required by the source code at github.## Inference and Training\n\nThe models can be run on most systems and do not require GPU for inference.\n\nFor training / creation of new models, the following specs are required:\n- 10-20 GB free RAM per 50 maps (mainly depending on the creativity of the maps)\n- 8-15 GB VRAM per 50 maps (mainly depending on the creativity of the maps)## Limitations and Ethical Considerations\n\nThe models and corresponding source code are *free to use*.\nSelling the generated maps is prohibited and labeling as human made is against the beatsaver policies.\n\nThis project is not meant to be replacing human maps, but as extension to cover more unkown songs and massively reduce the amount of human effort required."
] | [
-0.046311113983392715,
0.1302507221698761,
-0.006208472419530153,
0.05631736293435097,
0.0832461565732956,
0.015474374406039715,
-0.05074725300073624,
0.08631758391857147,
0.040858540683984756,
0.10755830258131027,
-0.05584525689482689,
0.057290736585855484,
0.07661005854606628,
0.02616451308131218,
0.03343643248081207,
-0.18983113765716553,
0.017455697059631348,
-0.08525843173265457,
0.00005411090751294978,
0.05922115966677666,
0.08656726032495499,
-0.02221858501434326,
0.0742398351430893,
0.027669120579957962,
-0.08971365541219711,
-0.0007949906284920871,
0.024089189246296883,
-0.0004127054417040199,
0.0648064836859703,
0.07872875779867172,
0.019049599766731262,
-0.01752721145749092,
0.022143417969346046,
-0.14958885312080383,
0.021854795515537262,
0.14249704778194427,
-0.039503589272499084,
0.07217983901500702,
0.16007845103740692,
0.012666388414800167,
0.06692451983690262,
-0.059271086007356644,
0.05853920802474022,
0.09926581382751465,
-0.05442199110984802,
-0.11981482803821564,
-0.10265608876943588,
0.033602021634578705,
0.07321233302354813,
0.06372649222612381,
-0.01074482686817646,
0.0856638252735138,
-0.08487558364868164,
0.03034818172454834,
0.034389298409223557,
-0.15910008549690247,
-0.029692644253373146,
0.06778596341609955,
-0.0435863621532917,
0.03655087575316429,
-0.07726739346981049,
-0.0179140605032444,
-0.003738457802683115,
0.024346157908439636,
0.11086970567703247,
-0.008140246383845806,
0.01673399657011032,
-0.09740962833166122,
-0.08834146708250046,
-0.0614895336329937,
0.11971693485975266,
0.04476995766162872,
-0.09094619750976562,
-0.12361589074134827,
-0.052633244544267654,
0.07878266274929047,
-0.0023752949200570583,
-0.011704808101058006,
0.011447235941886902,
0.06172589212656021,
0.0644482672214508,
-0.09007859230041504,
-0.05492688715457916,
-0.031313735991716385,
0.012350929901003838,
0.0024471792858093977,
0.03189888969063759,
-0.002717977622523904,
0.09056918323040009,
0.09858614951372147,
-0.1373031735420227,
-0.0795384868979454,
-0.06946085393428802,
-0.09290197491645813,
-0.17819148302078247,
-0.05398694425821304,
-0.04548678919672966,
-0.008372188545763493,
-0.03750743344426155,
0.1391940414905548,
0.029471665620803833,
0.012475945986807346,
-0.05107996612787247,
0.01007546205073595,
0.06450126320123672,
0.0050095063634216785,
-0.04606899991631508,
-0.02778141386806965,
0.09043075889348984,
-0.05469747260212898,
0.023548804223537445,
-0.007905685342848301,
-0.021600013598799706,
-0.06430288404226303,
0.03877409175038338,
0.03388265147805214,
0.017837684601545334,
0.015442200936377048,
-0.04246599227190018,
-0.08712930977344513,
0.13031728565692902,
-0.10316955298185349,
0.03739286959171295,
0.038561511784791946,
-0.03598155081272125,
0.05058709904551506,
0.02433595433831215,
-0.010740147903561592,
-0.12383240461349487,
0.07347660511732101,
-0.05580544099211693,
-0.02008039504289627,
-0.08050482720136642,
-0.07961961627006531,
0.07726981490850449,
-0.028265370056033134,
-0.002383706159889698,
-0.0835503488779068,
-0.1495508849620819,
-0.03445350378751755,
0.053335290402173996,
-0.06640399247407913,
-0.08641789853572845,
0.026161663234233856,
-0.027414701879024506,
-0.026494378224015236,
0.009713789448142052,
0.027869362384080887,
-0.02127956598997116,
0.054786331951618195,
-0.10001803934574127,
0.0037009138613939285,
0.053364746272563934,
0.06599721312522888,
-0.06564901024103165,
-0.008910837583243847,
-0.2338654100894928,
0.0902232751250267,
-0.06203371286392212,
0.010579794645309448,
-0.07639594376087189,
-0.04516882076859474,
-0.02689051628112793,
0.005939291324466467,
-0.007773008197546005,
0.09204322844743729,
-0.1594438999891281,
-0.04317289963364601,
0.0723477452993393,
-0.0841747522354126,
0.011112457141280174,
0.15620404481887817,
-0.04834047704935074,
0.02316189743578434,
0.09750346839427948,
0.05615180358290672,
0.15221309661865234,
-0.023670271039009094,
-0.027056312188506126,
0.0016700163250789046,
-0.04597008228302002,
0.0630088821053505,
0.0403740257024765,
-0.02911338210105896,
-0.018845342099666595,
0.029864845797419548,
0.00037749725743196905,
0.003200686536729336,
0.027075855061411858,
-0.011660617776215076,
-0.05694902688264847,
-0.014353744685649872,
-0.02642691321671009,
0.00004938215352012776,
-0.03548291698098183,
-0.01580405980348587,
-0.0924856886267662,
-0.02402651309967041,
0.11404722929000854,
0.0033756475895643234,
0.00190438749268651,
-0.04825213924050331,
0.0643620565533638,
-0.020365595817565918,
0.009435899555683136,
-0.15738347172737122,
0.03271392732858658,
0.08767575025558472,
-0.12510539591312408,
0.08826400339603424,
-0.0016489377012476325,
0.01741674169898033,
0.04060713201761246,
-0.03806515038013458,
-0.012535128742456436,
-0.03374231606721878,
-0.04427656531333923,
0.01851685903966427,
-0.12283218652009964,
-0.06647837162017822,
-0.024760348722338676,
0.04724280908703804,
-0.10663115233182907,
-0.02171880193054676,
0.058269672095775604,
0.13783617317676544,
0.057727836072444916,
-0.07719264924526215,
0.016052335500717163,
-0.03499886766076088,
-0.018026340752840042,
-0.05211286246776581,
-0.02763030119240284,
0.03855561837553978,
0.01797865703701973,
0.013366390950977802,
-0.04231368377804756,
-0.019295113161206245,
0.07027743756771088,
-0.019871678203344345,
-0.06957612186670303,
-0.009230299852788448,
0.010839664377272129,
0.00373876397497952,
-0.07502321898937225,
-0.06488963216543198,
0.12278001755475998,
0.025494076311588287,
0.07206367701292038,
-0.10360515862703323,
-0.05075262859463692,
0.008930574171245098,
0.020366506651043892,
0.000856229045893997,
0.021883398294448853,
0.1325904130935669,
-0.14200431108474731,
0.06844069063663483,
0.0077897352166473866,
0.022012339904904366,
0.18156875669956207,
-0.022980598732829094,
-0.08693423867225647,
-0.01619683764874935,
0.03589213266968727,
0.025626489892601967,
0.03134530782699585,
-0.05337902903556824,
0.08885104209184647,
0.05699162930250168,
0.03833896666765213,
-0.009375781752169132,
-0.06463247537612915,
0.03262101858854294,
0.035811491310596466,
-0.00795693974941969,
-0.03854191675782204,
0.0024800996761769056,
0.028710277751088142,
0.05836579576134682,
-0.0359562411904335,
0.07680167257785797,
0.0332346148788929,
-0.0590483695268631,
-0.09543614089488983,
0.08711984008550644,
-0.1405574381351471,
-0.17246566712856293,
-0.1761416494846344,
0.02438700757920742,
-0.10365850478410721,
0.008312785066664219,
0.00447040144354105,
-0.03455528989434242,
-0.07609106600284576,
-0.05738525837659836,
0.08484014868736267,
-0.039046987891197205,
-0.07045943289995193,
-0.09048689156770706,
0.0648716613650322,
0.056354034692049026,
-0.0994901955127716,
0.0335291251540184,
0.03331129625439644,
-0.09874159097671509,
-0.012459314428269863,
0.022415155544877052,
0.030296413227915764,
0.08667917549610138,
0.05223977565765381,
-0.020163696259260178,
0.0038527760189026594,
0.2303851991891861,
-0.0795426219701767,
0.12538181245326996,
0.16672353446483612,
0.02408340759575367,
0.07539483904838562,
0.10528948158025742,
0.04021938517689705,
-0.03656427189707756,
0.025666678324341774,
0.06999514997005463,
-0.029222853481769562,
-0.2252693921327591,
-0.071287140250206,
-0.0927606150507927,
-0.07867201417684555,
0.06652069091796875,
0.04114367812871933,
0.07928074896335602,
0.0645546242594719,
-0.1298147588968277,
0.07931981980800629,
0.03297875076532364,
0.0614926777780056,
0.05610521882772446,
0.006171936634927988,
0.027979683130979538,
-0.02864924632012844,
0.046259842813014984,
0.14259609580039978,
0.010108056478202343,
0.212303027510643,
-0.03021007403731346,
0.1630740761756897,
-0.004318283870816231,
0.17872785031795502,
0.022700795903801918,
0.08715938776731491,
-0.03301786631345749,
0.04211343824863434,
-0.034188173711299896,
-0.06935256719589233,
-0.04050060361623764,
0.0733969435095787,
0.0710241049528122,
-0.03411036729812622,
-0.012646332383155823,
-0.08515440672636032,
0.02071550115942955,
0.18934990465641022,
-0.03277726471424103,
-0.06907457858324051,
-0.04539739713072777,
0.03914623707532883,
-0.06395036727190018,
-0.11587747931480408,
0.023942936211824417,
0.09795014560222626,
-0.08324984461069107,
0.032123178243637085,
-0.033387601375579834,
0.09896256774663925,
-0.06781938672065735,
-0.02094574272632599,
0.002586859045550227,
0.15329186618328094,
-0.011490792036056519,
0.09275710582733154,
-0.08807246387004852,
-0.001023988239467144,
0.042773522436618805,
0.11171648651361465,
-0.07231088727712631,
0.04918487370014191,
0.05842873081564903,
0.07086504995822906,
0.12513799965381622,
0.021816952154040337,
-0.08093816041946411,
-0.06647603958845139,
-0.04554877057671547,
-0.02154587395489216,
0.06284486502408981,
-0.10772936046123505,
0.03685209900140762,
-0.051228586584329605,
0.0031698334496468306,
-0.0041730767115950584,
-0.03777894005179405,
-0.1367202252149582,
-0.16156606376171112,
0.01982012763619423,
-0.09403639286756516,
0.061680614948272705,
-0.06342768669128418,
0.015041163191199303,
-0.016916576772928238,
0.09909647703170776,
-0.11563438922166824,
-0.06304416805505753,
-0.1647685170173645,
-0.059202153235673904,
0.11441777646541595,
-0.03256703540682793,
0.10387834161520004,
-0.011697416193783283,
0.1823156476020813,
-0.03813185915350914,
-0.12403048574924469,
0.024364937096834183,
-0.08039113879203796,
-0.1319977343082428,
-0.02473601885139942,
0.1564514935016632,
0.03836615011096001,
0.05045485496520996,
0.020909691229462624,
0.008391697891056538,
0.024065611883997917,
-0.10468883812427521,
0.03329139202833176,
0.11265262961387634,
-0.018294278532266617,
0.09960904717445374,
-0.08472229540348053,
-0.06282909959554672,
-0.0195821113884449,
-0.028398197144269943,
0.04332997277379036,
0.2111908495426178,
-0.055859439074993134,
0.11791329830884933,
0.15886647999286652,
-0.06447843462228775,
-0.167535200715065,
0.04309758543968201,
0.038408517837524414,
0.019432947039604187,
0.057160310447216034,
-0.27868422865867615,
0.019735731184482574,
0.04974875971674919,
-0.00806355569511652,
0.09556519240140915,
-0.1944153904914856,
-0.08297974616289139,
-0.006798309739679098,
0.03875211626291275,
0.022863369435071945,
-0.022750522941350937,
-0.04183885455131531,
-0.056891512125730515,
-0.07402696460485458,
0.13852262496948242,
-0.0668957456946373,
0.09436266869306564,
0.023828446865081787,
0.0024941004812717438,
0.04929142817854881,
-0.0099834855645895,
0.06129887327551842,
-0.005493671167641878,
0.134310781955719,
-0.0436730720102787,
0.11948155611753464,
-0.005361591000109911,
-0.016857992857694626,
0.06435022503137589,
0.005359637085348368,
0.018190771341323853,
-0.11981678754091263,
-0.05876297503709793,
-0.06516796350479126,
0.07495755702257156,
-0.04871233552694321,
-0.07714444398880005,
-0.08066108822822571,
0.09413929283618927,
0.10403428971767426,
-0.029827497899532318,
-0.04985540360212326,
0.003725688671693206,
-0.044227082282304764,
0.2089136838912964,
0.10311119258403778,
0.01679171435534954,
-0.13468171656131744,
0.011631040833890438,
-0.014603305608034134,
0.09203526377677917,
-0.1033572182059288,
0.06703586131334305,
0.03529162332415581,
0.05481190234422684,
0.09215768426656723,
-0.016508396714925766,
-0.17512571811676025,
-0.03560136258602142,
-0.0020744898356497288,
-0.10650252550840378,
-0.20099015533924103,
-0.05599936470389366,
-0.019740531221032143,
-0.030737048014998436,
-0.08808699995279312,
0.0838412195444107,
-0.04588209465146065,
-0.019943570718169212,
0.019013339653611183,
0.0445571094751358,
0.029027611017227173,
0.0550617091357708,
0.024135837331414223,
-0.012929449789226055,
-0.0958758145570755,
0.1372862160205841,
0.0598282516002655,
-0.04529038071632385,
0.02144746109843254,
0.05699707195162773,
-0.08857854455709457,
-0.07168927043676376,
-0.010722463950514793,
0.026013707742094994,
0.000010959965948131867,
-0.0019100694917142391,
0.03830364719033241,
-0.11840984225273132,
0.06766970455646515,
0.08321809768676758,
0.020496970042586327,
0.022978603839874268,
-0.06041104346513748,
0.006612721364945173,
-0.08377835154533386,
0.06419918686151505,
-0.003816246287897229,
0.018318044021725655,
-0.0996284931898117,
0.07075151056051254,
0.007736647967249155,
-0.0701977089047432,
-0.015484413132071495,
-0.02600971795618534,
-0.042945846915245056,
-0.02087048813700676,
-0.11486310511827469,
-0.025541407987475395,
-0.02423008345067501,
0.01747378520667553,
-0.009460381232202053,
0.03127560764551163,
-0.0038879152853041887,
0.050891220569610596,
-0.06594468653202057,
-0.05541129782795906,
-0.0543164499104023,
0.08954057842493057,
-0.07794322073459625,
0.015472974628210068,
0.08370698988437653,
-0.10013739764690399,
0.10717754065990448,
-0.01038396917283535,
-0.037420839071273804,
0.02674145996570587,
-0.14542491734027863,
-0.03641519695520401,
-0.005419955588877201,
0.06594054400920868,
0.023080898448824883,
-0.14749911427497864,
0.03913697972893715,
-0.03638821095228195,
-0.061178985983133316,
-0.03402051702141762,
0.011247166432440281,
-0.12170079350471497,
0.0030087409541010857,
0.04376516863703728,
-0.05464790016412735,
-0.0787271037697792,
0.03541911393404007,
0.10970798879861832,
0.01167064718902111,
0.1514984667301178,
0.022725557908415794,
0.03635799139738083,
-0.17784757912158966,
-0.013729682192206383,
0.033558037132024765,
-0.00460411049425602,
-0.02700042724609375,
-0.026519838720560074,
0.05464678630232811,
0.010965131223201752,
0.08898407965898514,
-0.021304504945874214,
-0.03893674910068512,
0.023361241444945335,
0.016264934092760086,
-0.0408339723944664,
0.009324932470917702,
0.06335508078336716,
-0.00628384156152606,
0.0003794307995121926,
0.005792923271656036,
-0.06402604281902313,
0.014454055577516556,
-0.0024932727683335543,
0.03722018748521805,
0.09827694296836853,
0.05899915471673012,
-0.024206822738051414,
0.08538009971380234,
-0.06285061687231064,
-0.04033483937382698,
0.11716384440660477,
0.04684542864561081,
0.07487143576145172,
-0.04745589569211006,
0.07284244894981384,
0.08126549422740936,
-0.1437215358018875,
0.13342179358005524,
-0.01472961064428091,
-0.028520165011286736,
-0.043332748115062714,
-0.26869362592697144,
-0.042218685150146484,
-0.08658048510551453,
0.018781786784529686,
-0.09265515953302383,
0.091950424015522,
0.06337307393550873,
0.0004762947210110724,
-0.04639468342065811,
0.0707913488149643,
-0.10580473393201828,
-0.11355707794427872,
0.09421197324991226,
0.004074124153703451,
-0.053572166711091995,
0.04183022305369377,
-0.04311683028936386,
0.052916426211595535,
0.004277316853404045,
0.05581880360841751,
0.052575841546058655,
0.05119338631629944,
0.01917312666773796,
-0.017494115978479385,
-0.031878065317869186,
0.025198103860020638,
-0.03415120765566826,
-0.05559203773736954,
0.22864115238189697,
0.039306435734033585,
-0.022258633747696877,
0.0025621415115892887,
0.17256374657154083,
-0.04064035415649414,
0.00988919660449028,
-0.14875318109989166,
0.05299974977970123,
0.029000427573919296,
0.011517231352627277,
0.09139107167720795,
-0.10860326886177063,
0.0296664759516716,
0.14471878111362457,
0.03750869631767273,
-0.02132805809378624,
-0.011030931025743484,
-0.007830352522432804,
-0.018389910459518433,
-0.041394129395484924,
0.13022741675376892,
0.028080686926841736,
0.13041356205940247,
-0.019953381270170212,
0.15082038938999176,
-0.010942788794636726,
-0.021374449133872986,
-0.040895797312259674,
0.15469840168952942,
-0.02096959389746189,
-0.035173624753952026,
-0.025806590914726257,
0.06790126860141754,
-0.047426436096429825,
-0.22266244888305664,
0.011084109544754028,
0.041348330676555634,
-0.08881112188100815,
-0.043445661664009094,
-0.0185186006128788,
0.039150189608335495,
0.05604720115661621,
-0.01961800642311573,
-0.01480663102120161,
0.19072851538658142,
-0.014515816234052181,
-0.0773870125412941,
-0.1066034734249115,
0.09097830206155777,
-0.02078825607895851,
0.18600700795650482,
0.04923881217837334,
0.036872997879981995,
0.04025653749704361,
0.0017104708822444081,
-0.09328292310237885,
0.023171715438365936,
0.022319642826914787,
-0.056825343519449234,
-0.00883000809699297,
0.2410440295934677,
0.013327649794518948,
0.13408546149730682,
0.04088272154331207,
-0.01420163456350565,
0.034962501376867294,
-0.044357020407915115,
-0.0208432599902153,
-0.08346538245677948,
0.09327653795480728,
-0.12500956654548645,
0.10836593061685562,
0.14640377461910248,
-0.012711318209767342,
-0.01765199564397335,
-0.09655557572841644,
0.03602353483438492,
0.04236244037747383,
0.09695187956094742,
-0.002787293866276741,
-0.09695610404014587,
0.028313200920820236,
0.06990642845630646,
0.046630606055259705,
-0.10634492337703705,
-0.09784390032291412,
-0.001735021360218525,
-0.033935174345970154,
-0.06824161112308502,
0.09814804047346115,
0.011441066861152649,
0.03079671412706375,
-0.021083658561110497,
-0.042997654527425766,
-0.027809130027890205,
0.046750787645578384,
-0.056613098829984665,
-0.009944443590939045
] |
null | null | transformers |
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed]
| {"library_name": "transformers", "tags": []} | null | smartbrain/tiny_lora_model | [
"transformers",
"safetensors",
"arxiv:1910.09700",
"endpoints_compatible",
"region:us"
] | 2024-02-14T16:11:48+00:00 | [
"1910.09700"
] | [] | TAGS
#transformers #safetensors #arxiv-1910.09700 #endpoints_compatible #region-us
|
# Model Card for Model ID
## Model Details
### Model Description
This is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.
- Developed by:
- Funded by [optional]:
- Shared by [optional]:
- Model type:
- Language(s) (NLP):
- License:
- Finetuned from model [optional]:
### Model Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Downstream Use [optional]
### Out-of-Scope Use
## Bias, Risks, and Limitations
### Recommendations
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
## Training Details
### Training Data
### Training Procedure
#### Preprocessing [optional]
#### Training Hyperparameters
- Training regime:
#### Speeds, Sizes, Times [optional]
## Evaluation
### Testing Data, Factors & Metrics
#### Testing Data
#### Factors
#### Metrics
### Results
#### Summary
## Model Examination [optional]
## Environmental Impact
Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).
- Hardware Type:
- Hours used:
- Cloud Provider:
- Compute Region:
- Carbon Emitted:
## Technical Specifications [optional]
### Model Architecture and Objective
### Compute Infrastructure
#### Hardware
#### Software
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Model Card Authors [optional]
## Model Card Contact
| [
"# Model Card for Model ID",
"## Model Details",
"### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:",
"### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Downstream Use [optional]",
"### Out-of-Scope Use",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.",
"## How to Get Started with the Model\n\nUse the code below to get started with the model.",
"## Training Details",
"### Training Data",
"### Training Procedure",
"#### Preprocessing [optional]",
"#### Training Hyperparameters\n\n- Training regime:",
"#### Speeds, Sizes, Times [optional]",
"## Evaluation",
"### Testing Data, Factors & Metrics",
"#### Testing Data",
"#### Factors",
"#### Metrics",
"### Results",
"#### Summary",
"## Model Examination [optional]",
"## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:",
"## Technical Specifications [optional]",
"### Model Architecture and Objective",
"### Compute Infrastructure",
"#### Hardware",
"#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Model Card Authors [optional]",
"## Model Card Contact"
] | [
"TAGS\n#transformers #safetensors #arxiv-1910.09700 #endpoints_compatible #region-us \n",
"# Model Card for Model ID",
"## Model Details",
"### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:",
"### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Downstream Use [optional]",
"### Out-of-Scope Use",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.",
"## How to Get Started with the Model\n\nUse the code below to get started with the model.",
"## Training Details",
"### Training Data",
"### Training Procedure",
"#### Preprocessing [optional]",
"#### Training Hyperparameters\n\n- Training regime:",
"#### Speeds, Sizes, Times [optional]",
"## Evaluation",
"### Testing Data, Factors & Metrics",
"#### Testing Data",
"#### Factors",
"#### Metrics",
"### Results",
"#### Summary",
"## Model Examination [optional]",
"## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:",
"## Technical Specifications [optional]",
"### Model Architecture and Objective",
"### Compute Infrastructure",
"#### Hardware",
"#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Model Card Authors [optional]",
"## Model Card Contact"
] | [
31,
6,
3,
82,
28,
3,
4,
9,
9,
10,
42,
20,
3,
4,
5,
9,
11,
13,
3,
12,
5,
4,
5,
3,
4,
9,
53,
9,
8,
6,
3,
14,
8,
7,
9,
4
] | [
"passage: TAGS\n#transformers #safetensors #arxiv-1910.09700 #endpoints_compatible #region-us \n# Model Card for Model ID## Model Details### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Downstream Use [optional]### Out-of-Scope Use## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.## How to Get Started with the Model\n\nUse the code below to get started with the model.## Training Details### Training Data### Training Procedure#### Preprocessing [optional]#### Training Hyperparameters\n\n- Training regime:#### Speeds, Sizes, Times [optional]## Evaluation### Testing Data, Factors & Metrics#### Testing Data#### Factors#### Metrics### Results#### Summary## Model Examination [optional]## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:## Technical Specifications [optional]### Model Architecture and Objective### Compute Infrastructure#### Hardware#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Model Card Authors [optional]## Model Card Contact"
] | [
-0.06646376848220825,
0.2168014943599701,
-0.00225935154594481,
0.023818302899599075,
0.1271018385887146,
-0.001635765191167593,
0.04218708351254463,
0.13324736058712006,
-0.020175931975245476,
0.11144465953111649,
0.046588581055402756,
0.09377603232860565,
0.09928803145885468,
0.18404334783554077,
0.04859916493296623,
-0.2059975117444992,
0.007056170143187046,
-0.09090408682823181,
0.014076028019189835,
0.1116579994559288,
0.13719257712364197,
-0.10291384905576706,
0.08272874355316162,
-0.04045208916068077,
-0.02019004337489605,
0.00012576708104461432,
-0.09259183704853058,
-0.07032395154237747,
0.06885425746440887,
0.06264153122901917,
0.051234472543001175,
0.001456156256608665,
0.09140396863222122,
-0.2864592671394348,
0.017265573143959045,
0.08406311273574829,
0.0027674848679453135,
0.06290827691555023,
0.07236549258232117,
-0.07389893382787704,
0.11328595131635666,
-0.08021481335163116,
0.13019037246704102,
0.08625296503305435,
-0.062064990401268005,
-0.23071379959583282,
-0.07525765895843506,
0.0963398814201355,
0.12251301854848862,
0.06215599179267883,
-0.022921854630112648,
0.15455181896686554,
-0.06248689442873001,
0.012971068732440472,
0.1294165402650833,
-0.11526761949062347,
-0.05572471022605896,
0.061741601675748825,
0.11775490641593933,
0.10740239918231964,
-0.14110268652439117,
-0.0017287094378843904,
0.04900608956813812,
0.029121357947587967,
0.08589313924312592,
0.022661056369543076,
0.12003941088914871,
0.04652795568108559,
-0.13695219159126282,
-0.04037507623434067,
0.12011898308992386,
0.038862764835357666,
-0.06446044892072678,
-0.2168138176202774,
-0.006778308190405369,
-0.0601806715130806,
-0.014732478186488152,
-0.07019448280334473,
0.039128515869379044,
-0.02470310963690281,
0.07317749410867691,
-0.04465159401297569,
-0.1063927412033081,
-0.0421026237308979,
0.0892222449183464,
0.07748593389987946,
0.011527054943144321,
-0.02519804798066616,
0.04627908393740654,
0.13455867767333984,
0.05402068421244621,
-0.10399353504180908,
-0.07017925381660461,
-0.06942764669656754,
-0.09420394152402878,
-0.04035796597599983,
0.056760527193546295,
0.031942449510097504,
0.02665667235851288,
0.22703726589679718,
0.016653569415211678,
0.04155244305729866,
0.0224777739495039,
0.01032855175435543,
0.043662428855895996,
0.0955500528216362,
-0.05303520709276199,
-0.15660029649734497,
-0.04072032496333122,
0.09077946096658707,
-0.0027527001220732927,
-0.036689214408397675,
-0.03966725245118141,
0.03849169611930847,
0.06843466311693192,
0.13122352957725525,
0.07552056759595871,
-0.017929591238498688,
-0.04813180863857269,
-0.030096933245658875,
0.23523783683776855,
-0.1493375599384308,
0.04426715523004532,
-0.02271856553852558,
-0.01804111897945404,
-0.03908449783921242,
0.03597262129187584,
0.022118929773569107,
-0.000004518366949923802,
0.09706240892410278,
-0.058981191366910934,
-0.05378659814596176,
-0.10168042778968811,
-0.03272576630115509,
0.04088849574327469,
-0.013975566253066063,
-0.010589460842311382,
-0.09025166928768158,
-0.09490354359149933,
-0.04766594246029854,
0.05537205561995506,
-0.05123869329690933,
-0.03770573064684868,
0.009465423412621021,
-0.08151785284280777,
-0.005444355774670839,
-0.005417742300778627,
0.10699385404586792,
-0.03222226724028587,
0.04445803165435791,
-0.027600755915045738,
0.05225523188710213,
0.09919606149196625,
0.031576547771692276,
-0.0773419588804245,
0.0561848059296608,
-0.22559374570846558,
0.07503069192171097,
-0.11481974273920059,
0.04335082694888115,
-0.1704932004213333,
-0.042439818382263184,
0.005444696638733149,
0.0139949731528759,
0.013206101022660732,
0.12720820307731628,
-0.19255615770816803,
-0.01654396951198578,
0.13260798156261444,
-0.09212633967399597,
-0.118110790848732,
0.07884611934423447,
-0.029701577499508858,
0.1624738723039627,
0.04682036489248276,
-0.027025915682315826,
0.09224298596382141,
-0.16434773802757263,
-0.07092688232660294,
-0.00949116237461567,
-0.01727987825870514,
0.12109188735485077,
0.07512219995260239,
-0.05991523340344429,
0.046571120619773865,
0.02832140028476715,
-0.038078423589468,
-0.04424772411584854,
-0.050857074558734894,
-0.10884185880422592,
-0.01070026308298111,
-0.08987759798765182,
0.04065500199794769,
-0.01250192429870367,
-0.07916021347045898,
-0.029885273426771164,
-0.18612512946128845,
-0.0030564051121473312,
0.10038342326879501,
0.0035033065360039473,
-0.005652366206049919,
-0.08666291832923889,
0.026358824223279953,
-0.03112892620265484,
-0.008404186926782131,
-0.16764774918556213,
-0.04399421438574791,
0.046902090311050415,
-0.16094985604286194,
0.020117372274398804,
-0.06413903087377548,
0.06334125250577927,
0.03641495108604431,
-0.05590536445379257,
-0.0248766727745533,
-0.01730942726135254,
0.011945613659918308,
-0.05083848536014557,
-0.18994836509227753,
-0.056277405470609665,
-0.037882111966609955,
0.149809330701828,
-0.25956398248672485,
0.032966937869787216,
0.051140617579221725,
0.14649195969104767,
0.00406361510977149,
-0.05115427449345589,
0.01429014839231968,
-0.05360214412212372,
-0.054652128368616104,
-0.06746816635131836,
-0.006135428790003061,
-0.027576493099331856,
-0.05147203803062439,
0.019243421033024788,
-0.1755700707435608,
-0.021410830318927765,
0.09424154460430145,
0.12876708805561066,
-0.1486445665359497,
-0.018640631809830666,
-0.048725154250860214,
-0.06339836865663528,
-0.0715010017156601,
-0.07038594037294388,
0.10712739825248718,
0.0513901449739933,
0.04796046018600464,
-0.07435787469148636,
-0.07092321664094925,
0.02726263552904129,
0.006906150374561548,
-0.03382374346256256,
0.08727246522903442,
0.05199531093239784,
-0.09209315478801727,
0.0756213590502739,
0.1092359870672226,
0.07177663594484329,
0.09363535046577454,
0.01574566215276718,
-0.11756632477045059,
-0.028492970392107964,
0.036266472190618515,
0.02740776725113392,
0.1465986967086792,
-0.05952361226081848,
0.04016614332795143,
0.04494241625070572,
-0.04170418903231621,
0.022319864481687546,
-0.08787637203931808,
0.024075502529740334,
0.025203049182891846,
-0.0034381982404738665,
0.06284574419260025,
-0.02525499276816845,
-0.0050758360885083675,
0.07016654312610626,
0.047779910266399384,
0.04621000960469246,
0.009655474685132504,
-0.01720241829752922,
-0.1047825813293457,
0.16950392723083496,
-0.0951867327094078,
-0.269941508769989,
-0.17632324993610382,
0.026197833940386772,
0.04035249724984169,
-0.022378476336598396,
0.031619444489479065,
-0.07056326419115067,
-0.10630585998296738,
-0.1060405746102333,
-0.002429972169920802,
0.01714223250746727,
-0.06364088505506516,
-0.0741225928068161,
0.07348573952913284,
0.04382912442088127,
-0.14902326464653015,
0.038552410900592804,
0.055694397538900375,
-0.057955220341682434,
-0.0233661737293005,
0.09118817001581192,
0.12397737801074982,
0.14583967626094818,
-0.021366750821471214,
-0.028626007959246635,
0.029004426673054695,
0.19620531797409058,
-0.13469526171684265,
0.10371150821447372,
0.13814030587673187,
-0.04545360431075096,
0.08360563963651657,
0.1560150384902954,
0.029186224564909935,
-0.08317049592733383,
0.05044832453131676,
0.04082648828625679,
-0.043159641325473785,
-0.2666129767894745,
-0.0534592866897583,
0.012832709588110447,
-0.06255637854337692,
0.09786593168973923,
0.10183793306350708,
0.11542957276105881,
0.034910861402750015,
-0.07166364789009094,
-0.043925940990448,
-0.0058974819257855415,
0.11737963557243347,
-0.05490213260054588,
-0.012639665976166725,
0.07686592638492584,
-0.05086168646812439,
0.005355054512619972,
0.10266812145709991,
0.02973790094256401,
0.17442677915096283,
0.020399179309606552,
0.11231429129838943,
0.06195578724145889,
0.08633565157651901,
0.0007386076031252742,
0.02951662428677082,
0.05147615820169449,
0.017203815281391144,
-0.002300140680745244,
-0.10421168059110641,
-0.006156572140753269,
0.1449710875749588,
0.028103826567530632,
0.029669636860489845,
-0.0018948549404740334,
-0.005003341939300299,
0.05121048167347908,
0.1746254414319992,
-0.011592294089496136,
-0.22072425484657288,
-0.0845772922039032,
0.06936841458082199,
-0.06218599155545235,
-0.12968985736370087,
-0.026130788028240204,
0.045467354357242584,
-0.17519839107990265,
0.026703642681241035,
-0.027433741837739944,
0.0919293761253357,
-0.09345759451389313,
-0.02221956104040146,
0.03687324374914169,
0.084866963326931,
-0.014529162086546421,
0.08703910559415817,
-0.14498743414878845,
0.11886418610811234,
0.02978132851421833,
0.09024628251791,
-0.11081171780824661,
0.07909037172794342,
-0.007550720125436783,
0.009180475026369095,
0.19379350543022156,
-0.011335089802742004,
-0.03514958545565605,
-0.08774717897176743,
-0.11210042238235474,
-0.013537433929741383,
0.12687496840953827,
-0.1243172138929367,
0.08773399889469147,
-0.015198243781924248,
-0.044079482555389404,
0.00937260314822197,
-0.12100647389888763,
-0.17273177206516266,
-0.19628387689590454,
0.05585884302854538,
-0.09575839340686798,
0.025643249973654747,
-0.11914430558681488,
-0.07089093327522278,
-0.02952558360993862,
0.241120383143425,
-0.1745356321334839,
-0.06510113179683685,
-0.1468164622783661,
-0.046294767409563065,
0.1662203073501587,
-0.04437198117375374,
0.0718095526099205,
-0.0208172257989645,
0.20345525443553925,
0.005988610442727804,
-0.004939318168908358,
0.06724198162555695,
-0.08892562240362167,
-0.16873881220817566,
-0.06771010160446167,
0.1510489284992218,
0.11680185794830322,
0.04907919466495514,
-0.002248800592496991,
0.0011772146681323647,
-0.016943959519267082,
-0.1137804463505745,
-0.0033210667315870523,
0.16037839651107788,
0.03878779336810112,
0.025986969470977783,
-0.05243593826889992,
-0.08797456324100494,
-0.06899320334196091,
-0.06853509694337845,
0.06221301481127739,
0.19590823352336884,
-0.10376439243555069,
0.1700313836336136,
0.147536963224411,
-0.07305635511875153,
-0.23175598680973053,
0.035342130810022354,
0.04983805492520332,
0.0014306638622656465,
0.04886869341135025,
-0.18252557516098022,
0.10521943867206573,
0.019543392583727837,
-0.05505957826972008,
0.13485197722911835,
-0.1557481735944748,
-0.1552847921848297,
0.0722852572798729,
0.03904085233807564,
-0.22423844039440155,
-0.1354004591703415,
-0.09622503817081451,
-0.05825018882751465,
-0.14065024256706238,
0.06054598465561867,
-0.002136280992999673,
0.015948504209518433,
0.03500790148973465,
-0.0015643214574083686,
0.027123261243104935,
-0.058935679495334625,
0.18609118461608887,
-0.004065449349582195,
0.020676052197813988,
-0.060264769941568375,
-0.0478842556476593,
0.09839435666799545,
-0.06130504235625267,
0.12208222597837448,
0.004057085141539574,
0.01594383642077446,
-0.10362856835126877,
-0.048314861953258514,
-0.04328322783112526,
0.05154227837920189,
-0.07548051327466965,
-0.10070807486772537,
-0.043625857681035995,
0.08841723203659058,
0.07005169242620468,
-0.03383097052574158,
0.00549331633374095,
-0.07189501076936722,
0.10019614547491074,
0.17795267701148987,
0.17573626339435577,
0.009926567785441875,
-0.07241068035364151,
0.01677953451871872,
-0.04142116755247116,
0.044231921434402466,
-0.2513144314289093,
0.03756171092391014,
0.06098250672221184,
0.029438555240631104,
0.09217222779989243,
-0.020435843616724014,
-0.1820858269929886,
-0.04050002992153168,
0.08094815909862518,
-0.05452597141265869,
-0.22617179155349731,
-0.019085140898823738,
0.0954197570681572,
-0.2020406424999237,
-0.007372708059847355,
0.03995226323604584,
-0.048725228756666183,
-0.023169852793216705,
0.00010950004070764408,
0.06317184865474701,
0.002471912419423461,
0.09773622453212738,
0.0735151618719101,
0.09715340286493301,
-0.08337292820215225,
0.10562895983457565,
0.10150538384914398,
-0.09572599828243256,
0.03605884686112404,
0.06754924356937408,
-0.05300498008728027,
-0.043293699622154236,
0.03665391728281975,
0.033023297786712646,
0.005234600510448217,
-0.060321882367134094,
0.013913018628954887,
-0.036497246474027634,
0.044923391193151474,
0.08326134830713272,
0.03754979372024536,
-0.013354414142668247,
0.06462216377258301,
0.03401726484298706,
-0.10898099094629288,
0.10366570204496384,
0.01731540448963642,
0.04105307161808014,
-0.08384523540735245,
-0.019968897104263306,
0.035425446927547455,
0.030576206743717194,
-0.01765924133360386,
-0.02306121215224266,
-0.02860277332365513,
-0.01614218018949032,
-0.14299540221691132,
-0.023106401786208153,
-0.07243485748767853,
0.006181265693157911,
0.014656842686235905,
-0.031884219497442245,
-0.011233693920075893,
0.02475680410861969,
-0.06979699432849884,
-0.07426341623067856,
-0.006949664559215307,
0.09833318740129471,
-0.15115703642368317,
0.008848577737808228,
0.06907843053340912,
-0.11088496446609497,
0.08190931379795074,
-0.008411259390413761,
0.016245156526565552,
0.022527478635311127,
-0.15448406338691711,
0.05601610988378525,
0.0008648968650959432,
0.01916889287531376,
0.025886621326208115,
-0.16471809148788452,
0.004104440100491047,
-0.04661374166607857,
-0.02149827405810356,
-0.00004464812809601426,
-0.02647159807384014,
-0.12325995415449142,
0.06858719140291214,
-0.015622655861079693,
-0.035931166261434555,
-0.02701525390148163,
0.0539589487016201,
0.07888586074113846,
-0.027474910020828247,
0.10445091128349304,
-0.008690856397151947,
0.04941811040043831,
-0.16801609098911285,
-0.02470702864229679,
-0.04982255399227142,
0.019377702847123146,
0.009884213097393513,
-0.007693959400057793,
0.04183054715394974,
-0.00976533442735672,
0.21883612871170044,
-0.05075952783226967,
0.1607085019350052,
0.05847611650824547,
-0.017352959141135216,
-0.0007513365126214921,
0.06180921941995621,
0.05997028574347496,
0.04658793285489082,
0.009480604901909828,
0.023740366101264954,
-0.022450892254710197,
-0.006695089396089315,
-0.15932634472846985,
0.01890849508345127,
0.14999441802501678,
0.06301083415746689,
0.024745315313339233,
0.05866100639104843,
-0.12775006890296936,
-0.12135478109121323,
0.09311001747846603,
-0.026755332946777344,
0.00928465835750103,
-0.08245618641376495,
0.1358020007610321,
0.14980104565620422,
-0.14000412821769714,
0.05256148427724838,
-0.06134212389588356,
-0.05217423290014267,
-0.10388828068971634,
-0.12032219022512436,
-0.05887215584516525,
-0.053666237741708755,
0.002330566756427288,
-0.03760887682437897,
0.054546963423490524,
0.03344334661960602,
-0.009351172484457493,
-0.00022941511997487396,
0.13597318530082703,
-0.019751882180571556,
-0.0028988157864660025,
0.048313532024621964,
0.03693558648228645,
0.02373051457107067,
-0.05275435373187065,
0.02940409444272518,
0.02539868652820587,
0.032232340425252914,
0.06546790152788162,
0.033412106335163116,
-0.047448933124542236,
0.03804153576493263,
-0.0025254099164158106,
-0.11207924783229828,
0.019641218706965446,
-0.00460948096588254,
-0.0742158442735672,
0.1268945336341858,
0.0407399944961071,
0.010224059224128723,
-0.03741471841931343,
0.24361543357372284,
-0.06653323769569397,
-0.06378097087144852,
-0.13251738250255585,
0.10491154342889786,
-0.0027236645109951496,
0.06476365029811859,
0.023412218317389488,
-0.1284150779247284,
0.005243356805294752,
0.13858191668987274,
0.12181595712900162,
0.0045748427510261536,
0.009228081442415714,
0.0518609918653965,
0.0025186820421367884,
-0.06998204439878464,
0.054019294679164886,
0.06992026418447495,
0.12919506430625916,
-0.07847554981708527,
0.07680778950452805,
0.0006860480643808842,
-0.08370215445756912,
-0.02947772853076458,
0.11312682181596756,
-0.0409729965031147,
0.03491825982928276,
-0.047444481402635574,
0.10916327685117722,
-0.05787910893559456,
-0.29412412643432617,
0.02350960113108158,
-0.09588567912578583,
-0.15202060341835022,
-0.018367812037467957,
0.05944539234042168,
-0.02624768204987049,
0.018029648810625076,
0.06971040368080139,
-0.06011629104614258,
0.20098382234573364,
0.0335683599114418,
-0.07864278554916382,
-0.0664360448718071,
0.04837050288915634,
-0.06564252078533173,
0.2949807047843933,
0.008418165147304535,
0.02863333560526371,
0.10770907253026962,
-0.03253700211644173,
-0.18271861970424652,
0.010723991319537163,
0.1133992001414299,
-0.08056149631738663,
0.08200647681951523,
0.19000613689422607,
-0.012578671798110008,
0.1209007054567337,
0.05294662341475487,
-0.047376248985528946,
0.04217283055186272,
-0.03389401361346245,
-0.051268599927425385,
-0.10752558708190918,
0.058453381061553955,
-0.05909625440835953,
0.15447644889354706,
0.10152646154165268,
-0.05671518296003342,
-0.004550917539745569,
-0.05555408447980881,
0.04875178262591362,
0.01804669201374054,
0.12263146042823792,
0.02951994352042675,
-0.1865430772304535,
0.032826557755470276,
-0.01144319772720337,
0.10186848044395447,
-0.25588861107826233,
-0.08421015739440918,
0.08833149075508118,
-0.011924264021217823,
-0.05105875805020332,
0.10560628771781921,
0.057650718837976456,
0.04243382066488266,
-0.043439045548439026,
-0.10480839014053345,
-0.02186836116015911,
0.14663739502429962,
-0.1469624787569046,
-0.025013303384184837
] |
null | null | null |
# Psyche 3B
Psyche's story encourages readers to embrace their vulnerabilities, cultivate empathy, and strive for personal growth.
Adapter model for MiniChat-2-3B.
Psyche, the Narrator, is designed to have a childlike intellect. She reviews emotional profiles and provides feedback on the instructions. She is half-decent at summarization tasks and can calmly respond to your queries midway through the conversation:
```
> Indicate your understanding of the scenario with a single word. Please reply with just "ok" for now.
OK
> Can you provide a concise summary of our conversation?
The visiting tourist heard about the Coffee Shop and found them on their first day in town.
```
Don't expect her to answer complex questions in a single turn.
## Quickstart
```python
from transformers import AutoModelForCausalLM

model = AutoModelForCausalLM.from_pretrained('GeneZC/MiniChat-2-3B')
model.load_adapter('twodgirl/Psyche-3B')  # loading a PEFT adapter this way requires the peft package
```
```
koboldcpp --model minichat-2-3b.q6_k.gguf --lora adapter-model.ggml.bin
```
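For a quick end-to-end check in Python, a minimal generation sketch is shown below; the tokenizer repo, the sample prompt, and the generation settings are illustrative assumptions rather than settings taken from this card.
```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# The adapter ships no tokenizer of its own, so the base model's tokenizer is assumed here.
tokenizer = AutoTokenizer.from_pretrained('GeneZC/MiniChat-2-3B')
model = AutoModelForCausalLM.from_pretrained('GeneZC/MiniChat-2-3B')
model.load_adapter('twodgirl/Psyche-3B')

# Build a request in the prompt format documented below.
prompt = (
    '<|system|>\n'
    'You are a helpful assistant. Briefly summarize the conversation.</s>\n'
    '<|user|>\n'
    "Hey there, I'm new in town. Any cool spots to meet folks?</s>\n"
    '<|assistant|>\n'
)
inputs = tokenizer(prompt, return_tensors='pt')
output_ids = model.generate(**inputs, max_new_tokens=128, do_sample=False)
# Keep only the newly generated tokens, dropping the prompt.
reply = tokenizer.decode(output_ids[0][inputs['input_ids'].shape[1]:], skip_special_tokens=True)
print(reply)
```
Greedy decoding is used here only to keep the sketch deterministic; the card does not prescribe any sampling settings.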
## Prompt format
She is not suited for roleplaying or prolonged conversations.
However, it's hoped that the model can be used to create datasets from smaller textual data.
```
<|system|>
The roleplay between USER and ASSISTANT begins.</s>
<|user|>
</s>
<|assistant|>
```
_Summarize_ a page-long conversation with large models.
You import a chatlog into this small model.
```
<|system|>
You are a helpful assistant. Briefly summarize the conversation.</s>
<|user|>
Hey there, I'm new in town. Any cool spots to meet folks?</s>
<|assistant|>
Check out the coffee shop on Main Street. Great vibes, lots of locals.</s>
<|user|>
Thanks! What about some fun activities?</s>
<|assistant|>
Hit up the arcade downtown. Retro games, pinball, chill people.</s>
<|user|>
Can you provide a concise summary of our conversation?</s>
<|assistant|>
```
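One way to assemble that summarization request from an imported chatlog is sketched below; the helper name and the list-of-turns structure are illustrative assumptions, not an API defined by this card.
```python
def build_summary_prompt(turns):
    """Render (role, message) pairs into the template above; roles are 'user' or 'assistant'."""
    lines = ['<|system|>',
             'You are a helpful assistant. Briefly summarize the conversation.</s>']
    for role, message in turns:
        lines.append(f'<|{role}|>')
        lines.append(f'{message}</s>')
    # Ask for the summary as a final user turn and leave the assistant turn open.
    lines += ['<|user|>',
              'Can you provide a concise summary of our conversation?</s>',
              '<|assistant|>',
              '']
    return '\n'.join(lines)

chatlog = [
    ('user', "Hey there, I'm new in town. Any cool spots to meet folks?"),
    ('assistant', 'Check out the coffee shop on Main Street. Great vibes, lots of locals.'),
]
print(build_summary_prompt(chatlog))
```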
*In adventure mode*, tell something that **you** do.
```
<|system|>
[Genres: Fantasy, Humor]</s>
<|user|>
You look at your reflection.</s>
<|assistant|>
```
_Assign a score to the summary._ Import the summary from a larger model that can understand and categorize text by personal traits.
This model is not suitable for understanding diverse personalities.
Communication Skills: *assertiveness, loquacity*
Social Skills: *empathy, kindness, patience*
Moral: *fidelity, integrity*
Personal Qualities: *confidence, creativity, wisdom*
Personal Issues: *bluntness, capriciousness, fragility, shyness, stubbornness, profanity*
Interpersonal Issues: *arrogance, bellicosity, cruelty*
```
<|system|>
Assign a score to each trait. Each score is an integer from 1 to 10. If the trait is not presented, set the score to 1. If the trait is over-represented, set the score to 10.</s>
<|user|>
[Traits: empathy, kindness, patience]
Yasuo shows empathy by delving into the user's personal life, asking about their family and experiences. He also demonstrates an understanding of their profession, indicating a shared understanding of her feelings.
Yasuo is relatively courteous and considerate in his exchange with the user, showing respect for her profession and experiences.
Yasuo demonstrates patience throughout the conversation; he listens and responds thoughtfully, willing to take on a lengthy discussion.</s>
<|assistant|>
```
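For dataset building, the trait-scoring request can be templated and the integer scores pulled out of the reply; the reply format is not specified by this card, so the regex-based parsing below is only a guess and the helper names are illustrative.
```python
import re

def build_trait_prompt(traits, analysis):
    # Render a trait-scoring request in the template above.
    return (
        '<|system|>\n'
        'Assign a score to each trait. Each score is an integer from 1 to 10. '
        'If the trait is not presented, set the score to 1. '
        'If the trait is over-represented, set the score to 10.</s>\n'
        '<|user|>\n'
        f"[Traits: {', '.join(traits)}]\n"
        f'{analysis}</s>\n'
        '<|assistant|>\n'
    )

def parse_scores(reply, traits):
    # Assume the reply lists one integer per trait, in order; clamp everything to the 1-10 range.
    numbers = [int(n) for n in re.findall(r'\d+', reply)]
    return {trait: min(max(n, 1), 10) for trait, n in zip(traits, numbers)}

traits = ['empathy', 'kindness', 'patience']
print(parse_scores('empathy: 7, kindness: 8, patience: 6', traits))
```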
| {"language": ["en"], "license": "apache-2.0", "tags": ["causal-lm", "llama2"], "datasets": ["allenai/soda", "grimulkan/LimaRP-augmented", "npc-engine/light-batch-summarize-dialogue", "IlyaGusev/pippa_scored", "grimulkan/physical-reasoning", "samsum", "grimulkan/theory-of-mind", "PocketDoc/Floyd-Text-Adventures"], "pipeline_tag": "text-generation", "base_model": "GeneZC/MiniChat-2-3B"} | text-generation | twodgirl/Psyche-3B | [
"safetensors",
"causal-lm",
"llama2",
"text-generation",
"en",
"dataset:allenai/soda",
"dataset:grimulkan/LimaRP-augmented",
"dataset:npc-engine/light-batch-summarize-dialogue",
"dataset:IlyaGusev/pippa_scored",
"dataset:grimulkan/physical-reasoning",
"dataset:samsum",
"dataset:grimulkan/theory-of-mind",
"dataset:PocketDoc/Floyd-Text-Adventures",
"base_model:GeneZC/MiniChat-2-3B",
"license:apache-2.0",
"region:us"
] | 2024-02-14T16:14:46+00:00 | [] | [
"en"
] | TAGS
#safetensors #causal-lm #llama2 #text-generation #en #dataset-allenai/soda #dataset-grimulkan/LimaRP-augmented #dataset-npc-engine/light-batch-summarize-dialogue #dataset-IlyaGusev/pippa_scored #dataset-grimulkan/physical-reasoning #dataset-samsum #dataset-grimulkan/theory-of-mind #dataset-PocketDoc/Floyd-Text-Adventures #base_model-GeneZC/MiniChat-2-3B #license-apache-2.0 #region-us
|
# Psyche 3B
Psyche's story encourages readers to embrace their vulnerabilities, cultivate empathy, and strive for personal growth.
Adapter model for MiniChat-2-3B.
Psyche, the Narrator, is designed to have a childlike intellect. She reviews emotional profiles and provides feedback to the instructions. She is half-decent in summarizing tasks, she can calmly respond to your queries midway through the conversation:
Don't expect her to answer complex questions in a single turn.
## Quickstart
## Prompt format
She is not suited for roleplaying or prolonged conversations.
However, it's hoped that the model can be used to create datasets from smaller textual data.
_Summarize_ a page-long conversation with large models.
You import a chatlog to this small model.
*In adventure mode*, tell something that you do.
_Assign a score to the summary._ Import the summary from a larger model that can understand and categorize text by personal traits.
This model is not suitable for understanding diverse personalities.
Communication Skills: *assertiveness, loquacity*
Social Skills: *empathy, kindness, patience*
Moral: *fidelity, integrity*
Personal Qualities: *confidence, creativity, wisdom*
Personal Issues: *bluntness, capriciousness, fragility, shyness, stubbornness, profanity*
Interpersonal Issues: *arrogance, bellicosity, cruelty*
| [
"# Psyche 3B\n\nPsyche's story encourages readers to embrace their vulnerabilities, cultivate empathy, and strive for personal growth.\n\nAdapter model for MiniChat-2-3B.\n\nPsyche, the Narrator, is designed to have a childlike intellect. She reviews emotional profiles and provides feedback to the instructions. She is half-decent in summarizing tasks, she can calmly respond to your queries midway through the conversation:\n\n\n\nDon't expect her to answer complex questions in a single turn.",
"## Quickstart",
"## Prompt format\n\nShe is not suited for roleplaying or prolonged conversations.\nHowever, it's hoped that the model can be used to create datasets from smaller textual data.\n\n\n\n_Summarize_ a page-long conversation with large models.\nYou import a chatlog to this small model.\n\n\n\n*In adventure mode*, tell something that you do.\n\n\n\n_Assign a score to the summary._ Import the summary from a larger model that can understand and categorize text by personal traits.\nThis model is not suitable for understanding diverse personalities.\n\nCommunication Skills: *assertiveness, loquacity*\n\nSocial Skills: *empathy, kindness, patience*\n\nMoral: *fidelity, integrity*\n\nPersonal Qualities: *confidence, creativity, wisdom*\n\nPersonal Issues: *bluntness, capriciousness, fragility, shyness, stubbornness, profanity*\n\nInterpersonal Issues: *arrogance, bellicosity, cruelty*"
] | [
"TAGS\n#safetensors #causal-lm #llama2 #text-generation #en #dataset-allenai/soda #dataset-grimulkan/LimaRP-augmented #dataset-npc-engine/light-batch-summarize-dialogue #dataset-IlyaGusev/pippa_scored #dataset-grimulkan/physical-reasoning #dataset-samsum #dataset-grimulkan/theory-of-mind #dataset-PocketDoc/Floyd-Text-Adventures #base_model-GeneZC/MiniChat-2-3B #license-apache-2.0 #region-us \n",
"# Psyche 3B\n\nPsyche's story encourages readers to embrace their vulnerabilities, cultivate empathy, and strive for personal growth.\n\nAdapter model for MiniChat-2-3B.\n\nPsyche, the Narrator, is designed to have a childlike intellect. She reviews emotional profiles and provides feedback to the instructions. She is half-decent in summarizing tasks, she can calmly respond to your queries midway through the conversation:\n\n\n\nDon't expect her to answer complex questions in a single turn.",
"## Quickstart",
"## Prompt format\n\nShe is not suited for roleplaying or prolonged conversations.\nHowever, it's hoped that the model can be used to create datasets from smaller textual data.\n\n\n\n_Summarize_ a page-long conversation with large models.\nYou import a chatlog to this small model.\n\n\n\n*In adventure mode*, tell something that you do.\n\n\n\n_Assign a score to the summary._ Import the summary from a larger model that can understand and categorize text by personal traits.\nThis model is not suitable for understanding diverse personalities.\n\nCommunication Skills: *assertiveness, loquacity*\n\nSocial Skills: *empathy, kindness, patience*\n\nMoral: *fidelity, integrity*\n\nPersonal Qualities: *confidence, creativity, wisdom*\n\nPersonal Issues: *bluntness, capriciousness, fragility, shyness, stubbornness, profanity*\n\nInterpersonal Issues: *arrogance, bellicosity, cruelty*"
] | [
162,
116,
3,
219
] | [
"passage: TAGS\n#safetensors #causal-lm #llama2 #text-generation #en #dataset-allenai/soda #dataset-grimulkan/LimaRP-augmented #dataset-npc-engine/light-batch-summarize-dialogue #dataset-IlyaGusev/pippa_scored #dataset-grimulkan/physical-reasoning #dataset-samsum #dataset-grimulkan/theory-of-mind #dataset-PocketDoc/Floyd-Text-Adventures #base_model-GeneZC/MiniChat-2-3B #license-apache-2.0 #region-us \n# Psyche 3B\n\nPsyche's story encourages readers to embrace their vulnerabilities, cultivate empathy, and strive for personal growth.\n\nAdapter model for MiniChat-2-3B.\n\nPsyche, the Narrator, is designed to have a childlike intellect. She reviews emotional profiles and provides feedback to the instructions. She is half-decent in summarizing tasks, she can calmly respond to your queries midway through the conversation:\n\n\n\nDon't expect her to answer complex questions in a single turn.## Quickstart## Prompt format\n\nShe is not suited for roleplaying or prolonged conversations.\nHowever, it's hoped that the model can be used to create datasets from smaller textual data.\n\n\n\n_Summarize_ a page-long conversation with large models.\nYou import a chatlog to this small model.\n\n\n\n*In adventure mode*, tell something that you do.\n\n\n\n_Assign a score to the summary._ Import the summary from a larger model that can understand and categorize text by personal traits.\nThis model is not suitable for understanding diverse personalities.\n\nCommunication Skills: *assertiveness, loquacity*\n\nSocial Skills: *empathy, kindness, patience*\n\nMoral: *fidelity, integrity*\n\nPersonal Qualities: *confidence, creativity, wisdom*\n\nPersonal Issues: *bluntness, capriciousness, fragility, shyness, stubbornness, profanity*\n\nInterpersonal Issues: *arrogance, bellicosity, cruelty*"
] | [
-0.04123295098543167,
-0.020266469568014145,
-0.00794302485883236,
0.03138703107833862,
0.06413251161575317,
-0.0007851346163079143,
0.07920659333467484,
0.1055576503276825,
0.04374438151717186,
0.12385782599449158,
-0.03514117747545242,
-0.10631078481674194,
0.07162626087665558,
0.012577507644891739,
0.006299884058535099,
-0.2629467248916626,
0.05172637477517128,
-0.05063287541270256,
0.05521785467863083,
0.08335836976766586,
0.10591721534729004,
0.011535459198057652,
0.07587368041276932,
-0.02054763399064541,
0.03390245512127876,
-0.06802365928888321,
-0.009606747888028622,
0.02444830909371376,
0.03271689638495445,
0.042435646057128906,
0.10284387320280075,
0.0037067667581140995,
-0.05840367451310158,
-0.2249140739440918,
0.010886005125939846,
0.03513164073228836,
-0.000591041287407279,
0.007288695313036442,
0.01242843084037304,
-0.04636517912149429,
0.15665851533412933,
-0.20934700965881348,
0.08141425251960754,
0.05916975811123848,
-0.12292861193418503,
-0.11126624792814255,
-0.03644350543618202,
0.06446388363838196,
0.05589764565229416,
-0.024209927767515182,
-0.05154116451740265,
0.06790301203727722,
-0.09984482079744339,
0.025351282209157944,
0.1358940303325653,
-0.1706853210926056,
-0.11065264046192169,
0.029329221695661545,
-0.007343563716858625,
0.09603402018547058,
-0.10107419639825821,
-0.003705996787175536,
-0.0628557875752449,
0.013289070688188076,
-0.033827491104602814,
-0.016277387738227844,
0.14291763305664062,
-0.032521869987249374,
-0.11831236630678177,
0.03073308989405632,
0.07151772081851959,
0.033022139221429825,
-0.050619788467884064,
-0.18528403341770172,
-0.03749451786279678,
0.09512975066900253,
-0.06189717352390289,
-0.10112602263689041,
0.04165438562631607,
-0.005540723446756601,
0.08900231868028641,
-0.06695450842380524,
-0.10325615853071213,
0.023049810901284218,
-0.06747333705425262,
0.07609301805496216,
-0.002020015846937895,
-0.007706766482442617,
0.014041535556316376,
-0.04980335757136345,
-0.03224226459860802,
-0.05932768061757088,
-0.07036150991916656,
-0.048151761293411255,
-0.13127882778644562,
-0.04974474385380745,
-0.06428764015436172,
-0.07682546973228455,
0.033766813576221466,
0.1473677009344101,
0.01232084259390831,
0.03098747506737709,
0.013864007778465748,
-0.00022092685685493052,
0.0724179595708847,
0.08894714713096619,
0.04363800957798958,
-0.07225678116083145,
-0.050209708511829376,
0.05020730942487717,
0.023836791515350342,
-0.04992923140525818,
-0.009466495364904404,
-0.019852790981531143,
0.06084016337990761,
0.06700050085783005,
0.06342495232820511,
0.03834922984242439,
-0.07482334226369858,
-0.03089972212910652,
0.1365816742181778,
-0.07519248872995377,
0.05357734113931656,
0.027486609295010567,
-0.015963969752192497,
0.11919918656349182,
-0.06730813533067703,
0.017718248069286346,
-0.04134645685553551,
0.07092957943677902,
-0.06524349004030228,
-0.01178663782775402,
-0.02857189066708088,
0.008615286089479923,
0.07004868239164352,
0.04575518146157265,
-0.07195653021335602,
-0.10921800136566162,
-0.11498795449733734,
-0.09325909614562988,
0.05524565279483795,
-0.15147867798805237,
-0.0038604838773608208,
-0.026675879955291748,
-0.07249277830123901,
0.038379423320293427,
0.03969498723745346,
-0.05528007447719574,
0.00793758686631918,
0.005260407458990812,
-0.057768985629081726,
0.03253936022520065,
0.08472981303930283,
0.004468496423214674,
-0.09601493179798126,
0.013619477860629559,
-0.20499369502067566,
0.11476368457078934,
-0.0650920495390892,
-0.09170746058225632,
-0.10400409996509552,
-0.05113380774855614,
-0.019562911242246628,
0.06147388368844986,
-0.0224369578063488,
0.12857390940189362,
-0.10920185595750809,
-0.026992762461304665,
0.12663573026657104,
-0.1629563421010971,
-0.0554998405277729,
0.15940777957439423,
0.04271990433335304,
0.09251607209444046,
0.10258794575929642,
0.16058243811130524,
0.03947925940155983,
-0.0590401291847229,
0.008775778114795685,
-0.012127192690968513,
-0.011805473826825619,
0.20542237162590027,
0.08674212545156479,
-0.06274271011352539,
0.05807368829846382,
-0.021832924336194992,
-0.058751802891492844,
-0.020284369587898254,
0.035031042993068695,
-0.05846833065152168,
0.00219629961065948,
0.024916213005781174,
0.051760610193014145,
0.004908958449959755,
-0.09837477654218674,
-0.01501806452870369,
-0.1353190392255783,
-0.02379986271262169,
0.012361765839159489,
-0.010843280702829361,
0.05037270113825798,
-0.11970154941082001,
0.07671243697404861,
0.12275651842355728,
-0.0007061291835270822,
-0.10437281429767609,
-0.136907160282135,
0.02434195950627327,
-0.11378997564315796,
0.10056313127279282,
0.04648352041840553,
0.0542508065700531,
-0.000274727470241487,
-0.04692739620804787,
0.02109522745013237,
0.030247673392295837,
0.02650570683181286,
-0.040184058248996735,
-0.16395892202854156,
0.03164508193731308,
0.0031698315870016813,
0.1991211324930191,
-0.08403123915195465,
-0.009672311134636402,
0.1252499371767044,
0.03249134123325348,
0.05608333274722099,
-0.020521238446235657,
0.0017597153782844543,
0.047066546976566315,
0.07139582931995392,
0.016115980222821236,
0.08109351247549057,
0.0014688435476273298,
-0.04994842782616615,
0.07936053723096848,
-0.16159407794475555,
-0.18620529770851135,
0.0006629150593653321,
0.020031897351145744,
-0.0979328528046608,
-0.09993072599172592,
0.005049361381679773,
-0.04436296597123146,
0.03196857124567032,
-0.04661281034350395,
0.1242876648902893,
0.05409780517220497,
0.014684848487377167,
-0.10483484715223312,
-0.0009681811206974089,
-0.046925585716962814,
-0.06734609603881836,
-0.014818643219769001,
0.041071224957704544,
-0.08043907582759857,
-0.14067427814006805,
0.012735391035676003,
0.05737603083252907,
-0.05536279082298279,
0.1330241560935974,
0.010967189446091652,
-0.07184002548456192,
-0.11031150072813034,
0.035382501780986786,
0.000953812850639224,
-0.008159207180142403,
-0.10132898390293121,
-0.012806001119315624,
0.050478409975767136,
0.040184225887060165,
-0.003111200174316764,
-0.034437086433172226,
0.03936006873846054,
0.03933078795671463,
0.012612349353730679,
0.04835103824734688,
0.09115824103355408,
0.05863690748810768,
0.10206229984760284,
0.036027941852808,
0.023281829431653023,
-0.010308095254004002,
-0.0823238343000412,
-0.11544986814260483,
0.11031056940555573,
-0.07856578379869461,
-0.18506713211536407,
-0.05207835137844086,
0.05699734017252922,
-0.015673629939556122,
-0.006448959466069937,
-0.003031026339158416,
-0.05828583985567093,
-0.028327863663434982,
-0.10057263821363449,
0.15878333151340485,
0.033794037997722626,
-0.08917143195867538,
-0.1480899304151535,
-0.0018261449877172709,
-0.001592718530446291,
-0.051196154206991196,
-0.014607761055231094,
0.01801280491054058,
-0.10713712871074677,
0.008829870261251926,
-0.040426742285490036,
0.01016941200941801,
0.09170286357402802,
0.057708121836185455,
-0.0673842653632164,
-0.060843657702207565,
0.19772733747959137,
-0.0909225344657898,
0.16701684892177582,
0.16232679784297943,
-0.03789849206805229,
0.09789208322763443,
0.2117689847946167,
0.04018402472138405,
-0.014936959370970726,
-0.03116314858198166,
0.048930030316114426,
-0.019932536408305168,
-0.1490364968776703,
-0.14257381856441498,
-0.01632382906973362,
0.0711267814040184,
-0.026934420689940453,
0.0031290799379348755,
0.07168421149253845,
0.028113162145018578,
-0.09838511794805527,
-0.08675333857536316,
0.0623672753572464,
0.1039881631731987,
0.184405118227005,
0.007463039364665747,
0.0385468490421772,
0.010088040493428707,
0.006837895605713129,
0.07093656808137894,
0.015976140275597572,
0.16394582390785217,
-0.031520500779151917,
0.21073973178863525,
0.07889553904533386,
0.011168577708303928,
0.008735730312764645,
-0.07497303187847137,
0.000879308208823204,
-0.020428001880645752,
-0.04868330433964729,
-0.04780983552336693,
-0.037357088178396225,
0.09259112924337387,
0.11030246317386627,
-0.04441603645682335,
-0.036808617413043976,
0.05313345789909363,
0.0998307317495346,
0.07409316301345825,
-0.009713105857372284,
0.03187033534049988,
-0.03604043647646904,
0.04539763927459717,
-0.04139227792620659,
-0.07625555992126465,
0.031049441546201706,
0.07228264957666397,
-0.18400172889232635,
-0.0002199995651608333,
-0.018421556800603867,
0.04807892069220543,
-0.09749458730220795,
-0.02349347621202469,
-0.02309997007250786,
0.03317635506391525,
0.030864166095852852,
0.10368452966213226,
-0.10063808411359787,
0.11246683448553085,
0.01123414933681488,
0.018642161041498184,
-0.05379994213581085,
-0.03393268957734108,
0.013535010628402233,
-0.016518453136086464,
0.09849919378757477,
0.05339636653661728,
-0.09982901066541672,
-0.022001955658197403,
0.038905829191207886,
0.000452143867732957,
0.16751131415367126,
-0.013143954798579216,
0.07897660881280899,
-0.09706149250268936,
-0.019690800458192825,
-0.08783385902643204,
0.005260461010038853,
-0.02990531176328659,
-0.1362847238779068,
0.05949937179684639,
-0.03158023580908775,
-0.03579563647508621,
-0.07720101624727249,
0.009000630117952824,
0.052493564784526825,
0.13437475264072418,
-0.03468207269906998,
-0.11338118463754654,
-0.07481717318296432,
-0.03901461884379387,
0.1112430989742279,
-0.07896680384874344,
0.04135792329907417,
-0.04385685548186302,
0.12207113951444626,
0.017853641882538795,
0.03189118579030037,
0.04771583899855614,
-0.021893328055739403,
-0.18198072910308838,
-0.07281570881605148,
0.15602129697799683,
0.09453212469816208,
0.07733366638422012,
0.016978701576590538,
0.07979805767536163,
-0.0419197753071785,
-0.04318056255578995,
-0.015708450227975845,
-0.00638108653947711,
-0.10261105746030807,
0.08657391369342804,
-0.008721353486180305,
0.01557577308267355,
-0.11391043663024902,
-0.049268219619989395,
0.1253546178340912,
0.2416759729385376,
-0.012147899717092514,
0.06448537111282349,
0.16446830332279205,
-0.04662094637751579,
-0.22318722307682037,
-0.06045033037662506,
0.10645990818738937,
-0.06760697811841965,
-0.015491965226829052,
-0.19781392812728882,
0.11756360530853271,
0.050450246781110764,
-0.005947141908109188,
-0.090766042470932,
-0.1741282343864441,
-0.07070428133010864,
0.010239996016025543,
0.007322733290493488,
-0.02454678900539875,
-0.15972985327243805,
-0.05948227271437645,
0.004965732805430889,
-0.012437601573765278,
0.09819383919239044,
-0.02645842730998993,
0.0480133481323719,
0.020246263593435287,
0.0929497703909874,
0.049752481281757355,
0.041766002774238586,
0.14809173345565796,
0.03392728045582771,
0.003991313278675079,
-0.07558607310056686,
-0.052745141088962555,
0.018746240064501762,
-0.05307051166892052,
0.06835740059614182,
-0.07950981706380844,
-0.027894597500562668,
-0.04949874058365822,
-0.016286112368106842,
-0.14031802117824554,
-0.024844542145729065,
-0.08823011815547943,
-0.005442137364298105,
-0.11199666559696198,
0.12328208982944489,
0.08698534965515137,
-0.00032332888804376125,
-0.035659559071063995,
-0.11703336238861084,
-0.008234048262238503,
0.0013598421355709434,
0.14034654200077057,
-0.027345551177859306,
-0.16294269263744354,
-0.03535314276814461,
0.005437975283712149,
0.05060449242591858,
-0.09357942640781403,
0.030238235369324684,
0.06811834126710892,
-0.014936844818294048,
0.14778360724449158,
-0.04357011243700981,
-0.08292286098003387,
-0.06613435596227646,
0.08190952986478806,
-0.01965130865573883,
-0.1879359632730484,
0.021069273352622986,
0.09924251586198807,
-0.148727148771286,
-0.15065594017505646,
0.12840230762958527,
0.0408090278506279,
-0.034335967153310776,
0.03524758294224739,
0.06523659825325012,
0.00679476885125041,
-0.0224250927567482,
0.008861065842211246,
0.0604020357131958,
-0.03921756520867348,
0.050536151975393295,
0.07279499620199203,
-0.17375831305980682,
0.053837116807699203,
0.1578693836927414,
-0.037554752081632614,
-0.0695720985531807,
-0.01799127832055092,
0.0044436827301979065,
-0.012903832830488682,
-0.004304254427552223,
0.0597507506608963,
-0.14045773446559906,
-0.004651425871998072,
0.16477461159229279,
0.008177747949957848,
-0.015299896709620953,
0.012767631560564041,
-0.009638695977628231,
0.004643899388611317,
0.08616147935390472,
-0.006334834732115269,
-0.005099785514175892,
-0.006401730701327324,
0.04609152674674988,
-0.019577644765377045,
-0.04123767837882042,
-0.0006659192149527371,
-0.04332893714308739,
-0.09419114887714386,
-0.022783834487199783,
-0.04782916605472565,
0.011426207609474659,
-0.07893639802932739,
0.0017361138015985489,
0.06344097852706909,
-0.012927119620144367,
0.01957898586988449,
-0.05016307160258293,
-0.030362453311681747,
-0.03595994785428047,
0.009076550602912903,
0.12919676303863525,
-0.15730901062488556,
-0.00865031499415636,
0.09204943478107452,
-0.07196321338415146,
0.04352531209588051,
-0.010913503356277943,
-0.044330839067697525,
0.023526398465037346,
-0.1859726458787918,
0.03456160053610802,
-0.06789383292198181,
0.015581873245537281,
0.022310229018330574,
-0.15772086381912231,
-0.005688928999006748,
-0.012439841404557228,
-0.00887361355125904,
0.06484430283308029,
0.10019291192293167,
-0.037165138870477676,
0.11321571469306946,
0.020606471225619316,
-0.10798949003219604,
-0.07073090970516205,
-0.02584192156791687,
-0.005440332926809788,
-0.06156238541007042,
0.13088299334049225,
-0.042576663196086884,
0.058414526283741,
-0.11798924952745438,
0.013567076995968819,
0.04444972053170204,
0.06831923872232437,
0.012064398266375065,
-0.1272607296705246,
0.016965080052614212,
-0.02751370705664158,
0.07397617399692535,
-0.009624394588172436,
-0.03244920074939728,
0.06185629591345787,
0.09073836356401443,
0.0038325702771544456,
0.02869534306228161,
-0.09817469865083694,
0.03756999224424362,
-0.06716307997703552,
-0.008745967410504818,
-0.050073057413101196,
-0.06530299037694931,
0.05927610769867897,
0.15099464356899261,
0.12080598622560501,
0.12915851175785065,
0.07795757800340652,
0.004703327547758818,
0.02523619681596756,
-0.05520118772983551,
0.036690618842840195,
0.006796334870159626,
0.01271104533225298,
-0.10221736878156662,
0.2008833885192871,
0.08542980998754501,
-0.1393059939146042,
0.10785014182329178,
-0.09693477302789688,
-0.06805621087551117,
-0.02150503173470497,
-0.21310433745384216,
-0.046107687056064606,
-0.011750658974051476,
0.0010534824104979634,
-0.09339500963687897,
0.007069272454828024,
0.06288952380418777,
0.02691931277513504,
-0.029155949130654335,
0.13237406313419342,
-0.17883256077766418,
-0.07212916761636734,
0.08015921711921692,
0.00796465389430523,
0.1145988404750824,
0.1730826199054718,
0.026607956737279892,
-0.011588606052100658,
0.03183360397815704,
0.03960337117314339,
0.09567860513925552,
-0.018076956272125244,
-0.029535256326198578,
-0.12336239218711853,
-0.12990741431713104,
0.028463175520300865,
-0.005755983758717775,
-0.028193699195981026,
0.21358345448970795,
0.0036504885647445917,
-0.006221263203769922,
-0.015744250267744064,
0.1304425597190857,
0.013156553730368614,
-0.06551499664783478,
-0.1659170687198639,
0.13476897776126862,
-0.05129669979214668,
-0.02463400550186634,
-0.002047785324975848,
-0.09544502198696136,
-0.024498051032423973,
0.13001064956188202,
0.1094682440161705,
-0.06865333020687103,
0.019471801817417145,
0.08733759075403214,
0.024404242634773254,
0.001988587900996208,
0.095500148832798,
0.008639207109808922,
0.2340833693742752,
-0.0609467551112175,
0.1000375896692276,
-0.05386413633823395,
-0.09569618105888367,
-0.05125606060028076,
0.08928840607404709,
-0.0021056360565125942,
0.045800238847732544,
-0.08533824235200882,
0.12324109673500061,
-0.08611195534467697,
-0.18976327776908875,
0.06820312142372131,
-0.08086707442998886,
-0.0850975513458252,
-0.03942064195871353,
0.04307620972394943,
0.04218972846865654,
0.09548243135213852,
0.04888521134853363,
-0.04304182529449463,
0.12977099418640137,
-0.009635554626584053,
-0.008444301784038544,
0.03169284388422966,
0.11106545478105545,
-0.12965431809425354,
0.1571897268295288,
0.020517023280262947,
0.14412051439285278,
0.12291339039802551,
-0.019847756251692772,
-0.04309806600213051,
0.03867886960506439,
0.06667594611644745,
-0.06535322219133377,
0.038607411086559296,
0.2310059666633606,
0.02267802506685257,
0.11700650304555893,
0.2081827074289322,
-0.01994457095861435,
0.0899241492152214,
0.07504455745220184,
-0.07347994297742844,
-0.0830320492386818,
0.1450750231742859,
-0.13319584727287292,
0.06938140094280243,
0.21644923090934753,
-0.005053181666880846,
-0.053166840225458145,
-0.035383500158786774,
-0.052703723311424255,
0.03116258978843689,
0.11012367904186249,
-0.04707346856594086,
-0.06971026211977005,
0.0314302034676075,
0.03329408913850784,
0.1650252640247345,
-0.133684903383255,
-0.05704547464847565,
-0.024661852046847343,
0.03161903843283653,
-0.03553590178489685,
0.1558937132358551,
-0.004784746561199427,
-0.019412854686379433,
-0.022933479398489,
-0.13116997480392456,
0.055515140295028687,
0.09648145735263824,
-0.0817335769534111,
0.0015065241605043411
] |
null | null | transformers |
# my-test-yelp-model
This model was trained from scratch on an unknown dataset.
It achieves the following results on the evaluation set:
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- optimizer: None
- training_precision: float32
### Training results
### Framework versions
- Transformers 4.38.0.dev0
- TensorFlow 2.15.0
- Datasets 2.17.1.dev0
- Tokenizers 0.15.1
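
This auto-generated card includes no usage example; a minimal TensorFlow inference sketch might look like the following, assuming the checkpoint carries the standard sequence-classification head and that its labels follow a Yelp-style rating scheme (neither is confirmed above).
```python
import tensorflow as tf
from transformers import AutoTokenizer, TFAutoModelForSequenceClassification

# Repo id taken from this card; the tokenizer is assumed to be stored alongside the weights.
repo = 'dah1214/my-test-yelp-model'
tokenizer = AutoTokenizer.from_pretrained(repo)
model = TFAutoModelForSequenceClassification.from_pretrained(repo)

inputs = tokenizer('The food was great but the wait was far too long.',
                   return_tensors='tf', truncation=True)
logits = model(**inputs).logits
predicted_class = int(tf.argmax(logits, axis=-1)[0])
print(predicted_class)  # Index into the model's label set (assumed to be star ratings).
```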
| {"tags": ["generated_from_keras_callback"], "model-index": [{"name": "my-test-yelp-model", "results": []}]} | text-classification | dah1214/my-test-yelp-model | [
"transformers",
"tf",
"safetensors",
"bert",
"text-classification",
"generated_from_keras_callback",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | 2024-02-14T16:17:41+00:00 | [] | [] | TAGS
#transformers #tf #safetensors #bert #text-classification #generated_from_keras_callback #autotrain_compatible #endpoints_compatible #region-us
|
# my-test-yelp-model
This model was trained from scratch on an unknown dataset.
It achieves the following results on the evaluation set:
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- optimizer: None
- training_precision: float32
### Training results
### Framework versions
- Transformers 4.38.0.dev0
- TensorFlow 2.15.0
- Datasets 2.17.1.dev0
- Tokenizers 0.15.1
| [
"# my-test-yelp-model\n\nThis model was trained from scratch on an unknown dataset.\nIt achieves the following results on the evaluation set:",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- optimizer: None\n- training_precision: float32",
"### Training results",
"### Framework versions\n\n- Transformers 4.38.0.dev0\n- TensorFlow 2.15.0\n- Datasets 2.17.1.dev0\n- Tokenizers 0.15.1"
] | [
"TAGS\n#transformers #tf #safetensors #bert #text-classification #generated_from_keras_callback #autotrain_compatible #endpoints_compatible #region-us \n",
"# my-test-yelp-model\n\nThis model was trained from scratch on an unknown dataset.\nIt achieves the following results on the evaluation set:",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- optimizer: None\n- training_precision: float32",
"### Training results",
"### Framework versions\n\n- Transformers 4.38.0.dev0\n- TensorFlow 2.15.0\n- Datasets 2.17.1.dev0\n- Tokenizers 0.15.1"
] | [
51,
35,
6,
12,
8,
3,
33,
4,
39
] | [
"passage: TAGS\n#transformers #tf #safetensors #bert #text-classification #generated_from_keras_callback #autotrain_compatible #endpoints_compatible #region-us \n# my-test-yelp-model\n\nThis model was trained from scratch on an unknown dataset.\nIt achieves the following results on the evaluation set:## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- optimizer: None\n- training_precision: float32### Training results### Framework versions\n\n- Transformers 4.38.0.dev0\n- TensorFlow 2.15.0\n- Datasets 2.17.1.dev0\n- Tokenizers 0.15.1"
] | [
-0.09886705875396729,
0.05689619109034538,
-0.0015971892280504107,
0.07635253667831421,
0.16059526801109314,
-0.005278492346405983,
0.1212036982178688,
0.09424368292093277,
-0.12359508126974106,
0.031840257346630096,
0.08553412556648254,
0.11080877482891083,
0.012444618158042431,
0.11729860305786133,
-0.05684971436858177,
-0.18557368218898773,
0.031527236104011536,
0.013152376748621464,
-0.06672772020101547,
0.10106105357408524,
0.09347254037857056,
-0.11317193508148193,
0.09346577525138855,
-0.016918513923883438,
-0.19229665398597717,
0.008247253485023975,
0.021537136286497116,
-0.08520831912755966,
0.10995952785015106,
0.02094619907438755,
0.10688896477222443,
0.04120205342769623,
0.06429296731948853,
-0.1281946748495102,
0.023557454347610474,
0.08162626624107361,
0.008088361471891403,
0.06690585613250732,
0.05713802948594093,
-0.0392691008746624,
0.08759991824626923,
-0.07198406755924225,
0.08266198635101318,
0.047184135764837265,
-0.10884281992912292,
-0.13789188861846924,
-0.08350218832492828,
0.020015738904476166,
0.06191322207450867,
0.10431261360645294,
-0.016743207350373268,
0.2704302966594696,
-0.06973574310541153,
0.09079357236623764,
0.10009419173002243,
-0.2958907186985016,
-0.06792944669723511,
0.05816297605633736,
0.06357945501804352,
0.04028885066509247,
-0.08295853435993195,
0.033371686935424805,
0.08270341157913208,
0.06303082406520844,
0.0838024839758873,
-0.03182469308376312,
-0.11213617771863937,
-0.007976224645972252,
-0.11958598345518112,
0.03772460296750069,
0.21237215399742126,
-0.018674250692129135,
-0.08260541409254074,
-0.048735931515693665,
-0.05817129835486412,
-0.06927419453859329,
-0.007103803101927042,
-0.07835523039102554,
0.03341119736433029,
-0.0057169352658092976,
-0.06449798494577408,
-0.041094329208135605,
-0.11169449239969254,
-0.07834745198488235,
-0.09913407266139984,
0.1729101836681366,
0.01805153489112854,
0.021084532141685486,
-0.09198272228240967,
0.09553741663694382,
-0.04462851211428642,
-0.09318459779024124,
-0.019245604053139687,
-0.02780359424650669,
-0.03496028110384941,
-0.10002711415290833,
-0.0723700001835823,
-0.2105032056570053,
0.04965311288833618,
0.08526264876127243,
-0.07690590620040894,
0.06888195127248764,
-0.07852189987897873,
0.02252260036766529,
-0.01199352741241455,
0.12374041974544525,
-0.014714778400957584,
0.025410566478967667,
0.029345069080591202,
-0.0007704957388341427,
0.00020821543876081705,
-0.015715135261416435,
-0.08778870105743408,
0.012522261589765549,
0.09813109040260315,
0.0553128682076931,
-0.048846494406461716,
0.09504300355911255,
-0.033656179904937744,
-0.0025357974227517843,
0.0019325127359479666,
-0.10031109303236008,
0.0250252652913332,
-0.0015470772050321102,
-0.04823325574398041,
-0.0022237133234739304,
0.08254029601812363,
-0.02563103847205639,
-0.05074237659573555,
0.008054275996983051,
-0.08558999001979828,
0.007804151624441147,
-0.07942942529916763,
-0.11110606789588928,
0.011307447217404842,
-0.07195957005023956,
-0.004490003455430269,
-0.12181834876537323,
-0.19421769678592682,
-0.033154651522636414,
0.043292898684740067,
-0.05113318935036659,
0.04612689092755318,
-0.06527306884527206,
-0.07964393496513367,
0.02019408531486988,
-0.0012259489158168435,
0.07649510353803635,
-0.04672561213374138,
0.05880158394575119,
0.016609597951173782,
0.0499565564095974,
-0.040084365755319595,
0.029022548347711563,
-0.1231464296579361,
0.028671802952885628,
-0.11536035686731339,
0.079923115670681,
-0.051465775817632675,
0.06222793832421303,
-0.09900917112827301,
-0.06035853549838066,
-0.015469607897102833,
0.020228872075676918,
0.08209987729787827,
0.1737801730632782,
-0.2212437093257904,
-0.011452341452240944,
0.1671970933675766,
-0.13594043254852295,
-0.11811107397079468,
0.08504921942949295,
-0.06036319211125374,
0.14321500062942505,
0.09704121947288513,
0.1405704915523529,
0.08807147294282913,
-0.10438168048858643,
0.03275499492883682,
0.011475018225610256,
-0.020127274096012115,
-0.006702965125441551,
-0.008669394999742508,
-0.008025672286748886,
-0.09977661073207855,
0.03007582016289234,
-0.0024833905044943094,
0.019482960924506187,
-0.09290482848882675,
-0.04720441624522209,
-0.06657794862985611,
-0.0903358981013298,
0.09116433560848236,
-0.0016156992642208934,
0.08161581307649612,
-0.09234754741191864,
-0.1014450341463089,
0.09453296661376953,
0.05335776507854462,
-0.042718637734651566,
0.015793142840266228,
-0.10744310170412064,
0.06661638617515564,
-0.07818031311035156,
-0.00662123691290617,
-0.20020923018455505,
-0.061992090195417404,
0.016449779272079468,
0.053354885429143906,
0.043318092823028564,
0.039459023624658585,
0.09322458505630493,
0.06700430810451508,
-0.04804830625653267,
0.000025687359084258787,
0.01916426420211792,
0.04026639834046364,
-0.11423943936824799,
-0.18399080634117126,
0.024242160841822624,
-0.07295434176921844,
0.06879456341266632,
-0.2572881281375885,
0.025562157854437828,
0.0561494342982769,
0.11574526131153107,
0.04639557749032974,
0.0016471939161419868,
-0.00006862890586489812,
0.04711122438311577,
-0.03692498803138733,
-0.07975374162197113,
0.053055539727211,
0.03945452347397804,
-0.11651427298784256,
0.02306850254535675,
-0.17888115346431732,
0.07655847072601318,
0.12809479236602783,
-0.05562933534383774,
-0.13105003535747528,
0.01969773881137371,
-0.03265281021595001,
-0.02540384978055954,
-0.02303161844611168,
0.0390782468020916,
0.14012578129768372,
0.0020420197397470474,
0.16263210773468018,
-0.04978200048208237,
-0.042244408279657364,
0.039516981691122055,
-0.043126918375492096,
-0.023963112384080887,
0.06415349990129471,
-0.024292033165693283,
-0.16433344781398773,
0.1078806146979332,
0.08154230564832687,
-0.03922353312373161,
0.14836281538009644,
-0.039311595261096954,
-0.042346689850091934,
-0.030916262418031693,
0.01219237968325615,
0.012960851192474365,
0.10530726611614227,
-0.13267362117767334,
-0.009800443425774574,
0.01810094341635704,
0.0462680384516716,
0.03646770119667053,
-0.16890563070774078,
0.006779086776077747,
0.022900350391864777,
0.013180337846279144,
-0.00048649500240571797,
0.019460728392004967,
-0.004277161322534084,
0.10691149532794952,
0.02482682652771473,
0.008103864267468452,
0.07348483800888062,
0.009602697566151619,
-0.11512547731399536,
0.2282017469406128,
-0.14091551303863525,
-0.14327913522720337,
-0.0902402251958847,
-0.0016291035572066903,
-0.04749751463532448,
0.022953582927584648,
0.04890260100364685,
-0.11987216770648956,
-0.04850554093718529,
-0.09169840067625046,
-0.01283406000584364,
-0.03492683172225952,
0.012297259643673897,
-0.02142752707004547,
0.0054372516460716724,
0.09191057831048965,
-0.11228542774915695,
-0.01965438760817051,
-0.01842508465051651,
-0.06635976582765579,
0.06029444932937622,
-0.01023510005325079,
0.08098528534173965,
0.13359171152114868,
-0.05688450112938881,
0.030744880437850952,
-0.0406256802380085,
0.20381703972816467,
-0.06559400260448456,
-0.00041359965689480305,
0.11514842510223389,
-0.030403494834899902,
0.01726492866873741,
0.07072177529335022,
0.014340476132929325,
-0.11812824010848999,
0.06048879399895668,
0.01930229924619198,
-0.04111722111701965,
-0.2269006073474884,
-0.05928179621696472,
-0.024207528680562973,
-0.02956075593829155,
0.03429083898663521,
0.05602303892374039,
0.08227167278528214,
0.07533462345600128,
0.06074568256735802,
0.06103229895234108,
-0.017525650560855865,
0.08726545423269272,
0.1007470190525055,
0.032030265778303146,
0.09059114754199982,
-0.06248554587364197,
-0.04443703219294548,
0.06666681170463562,
-0.0350211076438427,
0.20900262892246246,
0.020729128271341324,
0.021615777164697647,
0.07783155143260956,
0.07467901706695557,
0.01241176389157772,
0.09175287187099457,
0.047529906034469604,
-0.03395375609397888,
0.017874393612146378,
-0.07714991271495819,
-0.043799400329589844,
0.018205922096967697,
-0.09818780422210693,
0.039480991661548615,
-0.1212618425488472,
0.025496218353509903,
0.04517105594277382,
0.20986391603946686,
0.04697508364915848,
-0.3445958197116852,
-0.12623251974582672,
0.0016392040997743607,
-0.014323082752525806,
-0.0924556627869606,
0.004777390509843826,
0.1021031066775322,
-0.082049660384655,
0.06721453368663788,
-0.05843672528862953,
0.08219611644744873,
0.012268069200217724,
0.030151525512337685,
-0.006868151482194662,
0.08336251229047775,
-0.03246600925922394,
0.07813751697540283,
-0.27189937233924866,
0.21888838708400726,
0.024023212492465973,
0.13579605519771576,
-0.07322660088539124,
-0.013989115133881569,
0.022522686049342155,
0.1621394157409668,
0.12504355609416962,
-0.012795691378414631,
-0.07290645688772202,
-0.1284068375825882,
-0.03648544102907181,
0.01815023273229599,
0.0934407189488411,
0.0321088582277298,
0.10686223208904266,
-0.03057166561484337,
0.016891490668058395,
0.05749629810452461,
-0.04707269370555878,
-0.17635294795036316,
-0.06425869464874268,
0.007427383214235306,
0.016997363418340683,
-0.0107610784471035,
-0.06868866831064224,
-0.09170413762331009,
-0.017129795625805855,
0.18027368187904358,
-0.033670514822006226,
-0.047361768782138824,
-0.15045490860939026,
0.04219771549105644,
0.06389299035072327,
-0.024340718984603882,
0.005455296952277422,
-0.0015079026343300939,
0.10523989051580429,
0.03955287113785744,
-0.11216603219509125,
0.09714669734239578,
-0.08098292350769043,
-0.15277501940727234,
-0.049726713448762894,
0.0800110250711441,
0.07724880427122116,
0.016049029305577278,
0.030603798106312752,
0.030313977971673012,
-0.0020679922308772802,
-0.08436320722103119,
0.005072342231869698,
0.027341565117239952,
0.08220048993825912,
0.026153739541769028,
-0.05938125029206276,
-0.040328498929739,
-0.021562041714787483,
-0.007974452339112759,
0.1166033074259758,
0.18617987632751465,
-0.06927470117807388,
0.08391072601079941,
0.06632441282272339,
-0.11640796810388565,
-0.2503224313259125,
0.11201346665620804,
0.01234807912260294,
0.02748488448560238,
0.011251688934862614,
-0.1266716718673706,
0.11470232903957367,
0.010049591772258282,
-0.01195224653929472,
0.08613159507513046,
-0.22005242109298706,
-0.14375418424606323,
0.12623687088489532,
0.13099674880504608,
0.1649169772863388,
-0.12579591572284698,
-0.043041735887527466,
-0.06237123906612396,
-0.0281328447163105,
0.14589381217956543,
-0.24966123700141907,
0.08451898396015167,
-0.0031535339076071978,
0.06765058636665344,
0.03649120777845383,
-0.042824000120162964,
0.11471690237522125,
-0.023215174674987793,
0.11665204912424088,
-0.08918062597513199,
0.03853284940123558,
0.13381612300872803,
-0.04740867763757706,
0.08439839631319046,
-0.01297910325229168,
0.06166866049170494,
-0.06129482388496399,
-0.028908079490065575,
-0.06161531060934067,
0.073972687125206,
-0.03112545795738697,
-0.05693087726831436,
-0.03507149592041969,
0.03383469581604004,
0.05239412561058998,
-0.05127956345677376,
0.052690859884023666,
-0.01281151082366705,
0.18441472947597504,
0.16782785952091217,
0.19984397292137146,
-0.013270526193082333,
0.013338884338736534,
0.06396446377038956,
-0.0508442185819149,
0.07495391368865967,
-0.1409151405096054,
0.040022894740104675,
0.10712878406047821,
0.0023294398561120033,
0.11427748203277588,
0.08219510316848755,
-0.06012137606739998,
-0.0014800672652199864,
0.052480846643447876,
-0.1381894052028656,
-0.13496796786785126,
-0.015679938718676567,
-0.03153648599982262,
-0.05228723958134651,
0.07756812870502472,
0.16224157810211182,
-0.10791082680225372,
0.01212973240762949,
-0.019686652347445488,
-0.018505742773413658,
-0.07417555153369904,
0.17732688784599304,
0.05026915296912193,
0.04659675061702728,
-0.06836623698472977,
0.12951652705669403,
0.024783318862318993,
-0.07297854870557785,
0.08869700878858566,
0.02703055366873741,
-0.1240474134683609,
-0.08117544651031494,
0.1216532438993454,
0.24661895632743835,
-0.03263086453080177,
-0.08684065192937851,
-0.113162100315094,
-0.10769210755825043,
0.028245745226740837,
0.24075046181678772,
0.059835195541381836,
0.038150105625391006,
-0.053749945014715195,
0.028420401737093925,
-0.15155072510242462,
0.07660268247127533,
0.05553816258907318,
0.05128858983516693,
-0.14500969648361206,
0.18192514777183533,
-0.019452670589089394,
0.058580320328474045,
-0.0780901089310646,
-0.012471764348447323,
-0.12700362503528595,
0.007847229018807411,
-0.21029868721961975,
0.011803103610873222,
-0.02678617835044861,
-0.01569700799882412,
0.024230072274804115,
-0.024795351549983025,
-0.03369378671050072,
0.037062522023916245,
-0.07847478240728378,
0.011454288847744465,
0.02781022898852825,
0.02228650078177452,
-0.09594191610813141,
-0.022964127361774445,
-0.009987433440983295,
-0.0699102059006691,
0.04424195736646652,
0.05824486166238785,
-0.038479216396808624,
0.060727860778570175,
-0.15219412744045258,
0.015228296630084515,
0.0425235740840435,
-0.010471757501363754,
0.06006818264722824,
-0.05280604213476181,
-0.01024082861840725,
0.0161625724285841,
0.05421319603919983,
0.04251788184046745,
0.11355788260698318,
-0.07884746789932251,
-0.04195818677544594,
-0.03539305925369263,
-0.01201969850808382,
-0.05211811512708664,
0.051814720034599304,
0.07988996803760529,
0.03246987611055374,
0.14738328754901886,
-0.12015005946159363,
0.018556194379925728,
-0.14167426526546478,
-0.03173995763063431,
0.0016568260034546256,
-0.06933299452066422,
-0.056507695466279984,
-0.04075814411044121,
0.08554795384407043,
-0.06717485934495926,
0.13897572457790375,
0.026291968300938606,
0.10920508950948715,
0.03462434187531471,
-0.030292194336652756,
-0.04858187213540077,
0.03728582337498665,
0.19629134237766266,
0.019852453842759132,
-0.020087482407689095,
-0.015528608113527298,
0.04252491146326065,
0.06330705434083939,
-0.02303488925099373,
0.1556374430656433,
0.08529830724000931,
-0.09312453866004944,
0.10198335349559784,
0.015127538703382015,
-0.046773511916399,
-0.11556631326675415,
0.007388210855424404,
-0.06972145289182663,
0.10769801586866379,
-0.06728176027536392,
0.07786940783262253,
0.0896795243024826,
-0.09796709567308426,
0.0406193882226944,
-0.08400116860866547,
-0.08164413273334503,
-0.131351500749588,
-0.09035584330558777,
-0.09766645729541779,
-0.11300785839557648,
-0.0027637688908725977,
-0.08930151909589767,
-0.015069620683789253,
0.011554702185094357,
0.019949616864323616,
-0.031006217002868652,
0.20634447038173676,
-0.046935006976127625,
-0.009838992729783058,
0.08907362073659897,
-0.022544370964169502,
-0.005743343383073807,
-0.05033431574702263,
-0.011472487822175026,
0.02545243874192238,
-0.011096430011093616,
0.023437365889549255,
0.005469414871186018,
0.01772448793053627,
0.040209800004959106,
-0.022430134937167168,
-0.09845241904258728,
0.037569496780633926,
0.05568535998463631,
-0.0046564554795622826,
-0.0013111833250150084,
0.037474025040864944,
-0.035538677126169205,
-0.01777973212301731,
0.23044829070568085,
-0.09591256082057953,
-0.07596925646066666,
-0.12235940992832184,
0.2972774803638458,
0.03395606204867363,
0.021070005372166634,
0.03805476799607277,
-0.10306093841791153,
0.01132427528500557,
0.21801790595054626,
0.17697618901729584,
-0.04704868420958519,
0.00567311467602849,
0.0008751877467148006,
-0.009649792686104774,
-0.050596144050359726,
0.1436302214860916,
0.05616304278373718,
0.014306357130408287,
-0.062095485627651215,
0.0011794791789725423,
-0.030299199745059013,
-0.03731665387749672,
-0.049852561205625534,
0.06760269403457642,
0.010088733397424221,
0.027285443618893623,
-0.04199608415365219,
0.08307115733623505,
-0.06858541816473007,
-0.14348769187927246,
0.0742865651845932,
-0.11846941709518433,
-0.12307446449995041,
-0.03363046422600746,
0.00109284115023911,
-0.024614939466118813,
0.06569100171327591,
-0.02774694934487343,
0.008667867630720139,
0.12077951431274414,
-0.01575053669512272,
-0.03394386172294617,
-0.03293367475271225,
0.06658601015806198,
-0.07144404947757721,
0.1889711320400238,
-0.006053270306438208,
0.07238168269395828,
0.11211136728525162,
0.010196895338594913,
-0.10568271577358246,
0.09622462838888168,
0.011111306957900524,
-0.028399460017681122,
0.05375508964061737,
0.1057015210390091,
-0.026327261701226234,
0.03934089094400406,
0.022952720522880554,
-0.15696647763252258,
0.012851106934249401,
-0.04274914786219597,
-0.06379839032888412,
-0.07766152173280716,
0.004402376711368561,
-0.07130341976881027,
0.13506123423576355,
0.1589047610759735,
-0.03271019831299782,
0.042290035635232925,
-0.07441820204257965,
0.055783461779356,
0.03691716492176056,
-0.0024999482557177544,
-0.019267700612545013,
-0.20957808196544647,
0.016834961250424385,
0.11407579481601715,
0.008412256836891174,
-0.33323004841804504,
-0.047622665762901306,
0.0044747330248355865,
-0.03550456091761589,
-0.05804113298654556,
0.08313986659049988,
0.09928911924362183,
0.04818571358919144,
-0.07338577508926392,
-0.09037366509437561,
-0.03262491151690483,
0.13614985346794128,
-0.09842291474342346,
-0.07767673581838608
] |
null | null | transformers |
# my-test-yelp-tfmodel
This model was trained from scratch on an unknown dataset.
It achieves the following results on the evaluation set:
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- optimizer: None
- training_precision: float32
### Training results
### Framework versions
- Transformers 4.38.0.dev0
- TensorFlow 2.15.0
- Datasets 2.17.1.dev0
- Tokenizers 0.15.1
| {"tags": ["generated_from_keras_callback"], "model-index": [{"name": "my-test-yelp-tfmodel", "results": []}]} | text-classification | dah1214/my-test-yelp-tfmodel | [
"transformers",
"tf",
"bert",
"text-classification",
"generated_from_keras_callback",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | 2024-02-14T16:18:36+00:00 | [] | [] | TAGS
#transformers #tf #bert #text-classification #generated_from_keras_callback #autotrain_compatible #endpoints_compatible #region-us
|
# my-test-yelp-tfmodel
This model was trained from scratch on an unknown dataset.
It achieves the following results on the evaluation set:
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- optimizer: None
- training_precision: float32
### Training results
### Framework versions
- Transformers 4.38.0.dev0
- TensorFlow 2.15.0
- Datasets 2.17.1.dev0
- Tokenizers 0.15.1
| [
"# my-test-yelp-tfmodel\n\nThis model was trained from scratch on an unknown dataset.\nIt achieves the following results on the evaluation set:",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- optimizer: None\n- training_precision: float32",
"### Training results",
"### Framework versions\n\n- Transformers 4.38.0.dev0\n- TensorFlow 2.15.0\n- Datasets 2.17.1.dev0\n- Tokenizers 0.15.1"
] | [
"TAGS\n#transformers #tf #bert #text-classification #generated_from_keras_callback #autotrain_compatible #endpoints_compatible #region-us \n",
"# my-test-yelp-tfmodel\n\nThis model was trained from scratch on an unknown dataset.\nIt achieves the following results on the evaluation set:",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- optimizer: None\n- training_precision: float32",
"### Training results",
"### Framework versions\n\n- Transformers 4.38.0.dev0\n- TensorFlow 2.15.0\n- Datasets 2.17.1.dev0\n- Tokenizers 0.15.1"
] | [
46,
37,
6,
12,
8,
3,
33,
4,
39
] | [
"passage: TAGS\n#transformers #tf #bert #text-classification #generated_from_keras_callback #autotrain_compatible #endpoints_compatible #region-us \n# my-test-yelp-tfmodel\n\nThis model was trained from scratch on an unknown dataset.\nIt achieves the following results on the evaluation set:## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- optimizer: None\n- training_precision: float32### Training results### Framework versions\n\n- Transformers 4.38.0.dev0\n- TensorFlow 2.15.0\n- Datasets 2.17.1.dev0\n- Tokenizers 0.15.1"
] | [
-0.08889059722423553,
0.043165724724531174,
-0.0013792839599773288,
0.08635973930358887,
0.16000686585903168,
0.01678677462041378,
0.12205733358860016,
0.11260897666215897,
-0.13344533741474152,
0.006977423094213009,
0.07431142777204514,
0.10692296922206879,
0.011670962907373905,
0.11020848155021667,
-0.0410991832613945,
-0.22292165458202362,
0.010255051776766777,
0.0395084023475647,
-0.08738315850496292,
0.09746209532022476,
0.10219727456569672,
-0.09781637042760849,
0.10040878504514694,
-0.007127978838980198,
-0.24021396040916443,
0.02157033234834671,
0.039130181074142456,
-0.09503190964460373,
0.11337052285671234,
0.04858701303601265,
0.09750138968229294,
0.03029770217835903,
0.058104682713747025,
-0.0963207334280014,
0.02186656929552555,
0.08441153913736343,
-0.0028876063879579306,
0.08523660153150558,
0.06400995701551437,
-0.01209202129393816,
0.11529339104890823,
-0.050218284130096436,
0.07697945088148117,
0.05907473713159561,
-0.11126936227083206,
-0.14426738023757935,
-0.07712943851947784,
0.04357216879725456,
0.04028253257274628,
0.10238510370254517,
-0.008843885734677315,
0.27086666226387024,
-0.06557036191225052,
0.10424771904945374,
0.11065519601106644,
-0.29364439845085144,
-0.07556601613759995,
0.052734121680259705,
0.05490100011229515,
0.021629154682159424,
-0.07025042176246643,
0.038187794387340546,
0.07543309777975082,
0.06386129558086395,
0.08186817914247513,
-0.039962030947208405,
-0.1267259269952774,
0.00017128667968790978,
-0.11565084010362625,
0.041449181735515594,
0.21877965331077576,
-0.011591737158596516,
-0.08025777339935303,
-0.029281843453645706,
-0.06341329216957092,
-0.07715392112731934,
-0.001626643817871809,
-0.08538156002759933,
0.027953561395406723,
-0.008218779228627682,
-0.09413788467645645,
-0.06110571324825287,
-0.10351350903511047,
-0.06900563091039658,
-0.12136855721473694,
0.15972229838371277,
0.016079597175121307,
0.023955732583999634,
-0.11161044239997864,
0.10350514203310013,
-0.05286810174584389,
-0.09568533301353455,
-0.014865469187498093,
-0.024684015661478043,
-0.0716499462723732,
-0.10922028869390488,
-0.0814313143491745,
-0.2309575229883194,
0.01909647136926651,
0.09660545736551285,
-0.04595617204904556,
0.0670461356639862,
-0.0885465070605278,
0.01561659388244152,
-0.02044612728059292,
0.15133972465991974,
-0.034870695322752,
0.030089326202869415,
0.021778538823127747,
-0.04146883636713028,
-0.03300169110298157,
-0.024058014154434204,
-0.10903876274824142,
-0.00248443684540689,
0.08613864332437515,
0.03663420304656029,
-0.044400397688150406,
0.10161420702934265,
-0.019504493102431297,
-0.03362659737467766,
-0.017929544672369957,
-0.09393347054719925,
0.023453563451766968,
-0.018255440518260002,
-0.060542792081832886,
0.03160879388451576,
0.09634819626808167,
-0.03915967792272568,
-0.06299037486314774,
0.006231611128896475,
-0.10182104259729385,
-0.0019375524716451764,
-0.07531804591417313,
-0.11763733625411987,
0.016699520871043205,
-0.07673755288124084,
-0.00454439502209425,
-0.13138025999069214,
-0.17210721969604492,
-0.008378260768949986,
0.05826310068368912,
-0.05832288786768913,
0.03544455021619797,
-0.043261583894491196,
-0.07849801331758499,
0.028041182085871696,
0.008924716152250767,
0.09393095225095749,
-0.030679401010274887,
0.05392984673380852,
0.02431376650929451,
0.04556122049689293,
-0.05831921100616455,
0.03877222165465355,
-0.10028748959302902,
0.012845522724092007,
-0.13685671985149384,
0.09448482096195221,
-0.045737236738204956,
0.06437671929597855,
-0.11620953679084778,
-0.06998839974403381,
-0.01662641204893589,
0.007212814409285784,
0.08332757651805878,
0.1579393744468689,
-0.2185913622379303,
-0.026440314948558807,
0.14800161123275757,
-0.10863598436117172,
-0.12119796127080917,
0.08507129549980164,
-0.06603576987981796,
0.152324840426445,
0.09238678216934204,
0.13937118649482727,
0.10327135771512985,
-0.10890939831733704,
0.0477411150932312,
0.027010934427380562,
-0.01901669055223465,
-0.00904688611626625,
-0.010017403401434422,
-0.0002327476831851527,
-0.10027719289064407,
0.02705661952495575,
-0.008413268253207207,
0.041534241288900375,
-0.09743597358465195,
-0.04737505689263344,
-0.07025044411420822,
-0.08025623112916946,
0.09816792607307434,
0.0006010328652337193,
0.10806719958782196,
-0.06635899841785431,
-0.10783585160970688,
0.13692709803581238,
0.04773441329598427,
-0.03645921126008034,
0.022299185395240784,
-0.10921639204025269,
0.048507705330848694,
-0.07692967355251312,
-0.0014191449154168367,
-0.20616741478443146,
-0.07682149112224579,
0.02226053923368454,
0.07311679422855377,
0.07761700451374054,
0.05068338289856911,
0.09564118832349777,
0.05069408938288689,
-0.05692363902926445,
0.017120996490120888,
0.000952612841501832,
0.035659030079841614,
-0.11172228306531906,
-0.18174289166927338,
0.0036068852059543133,
-0.05891624465584755,
0.045639023184776306,
-0.22619161009788513,
0.012226646766066551,
0.06677427887916565,
0.13021127879619598,
0.05369566008448601,
0.0011781231733039021,
-0.00782361626625061,
0.040349870920181274,
-0.041328661143779755,
-0.08427941799163818,
0.03916824236512184,
0.031202249228954315,
-0.13580439984798431,
0.02210632711648941,
-0.14358121156692505,
0.08992946147918701,
0.12561966478824615,
-0.07813303172588348,
-0.14409703016281128,
0.03584350273013115,
-0.04005438834428787,
-0.028228776529431343,
-0.005537611898034811,
0.01643059030175209,
0.16047893464565277,
-0.004778755363076925,
0.15759184956550598,
-0.045617736876010895,
-0.05734407156705856,
0.03806010261178017,
-0.02182929962873459,
-0.023018524050712585,
0.05287003517150879,
-0.007467512506991625,
-0.17941834032535553,
0.09153151512145996,
0.06448414921760559,
-0.036178309470415115,
0.16499939560890198,
-0.046965766698122025,
-0.0651303231716156,
-0.047443509101867676,
-0.004524465650320053,
0.02464168146252632,
0.09679603576660156,
-0.1417844444513321,
-0.012887652963399887,
0.027858203276991844,
0.03800472244620323,
0.03656890615820885,
-0.1565513014793396,
0.0031773769296705723,
0.03369593247771263,
0.008891896344721317,
-0.015394081361591816,
0.031541988253593445,
-0.005539882928133011,
0.11033321917057037,
0.030461909249424934,
0.00810049194842577,
0.08294665813446045,
0.01846638321876526,
-0.11871928721666336,
0.21774359047412872,
-0.1389954686164856,
-0.13305263221263885,
-0.09747221320867538,
-0.008414276875555515,
-0.05262766778469086,
0.014115344732999802,
0.03856462240219116,
-0.12430703639984131,
-0.045681603252887726,
-0.06492786854505539,
0.00666838837787509,
-0.06512594223022461,
0.03673644736409187,
-0.03241787478327751,
-0.0004605415742844343,
0.08907857537269592,
-0.12344109266996384,
-0.010475938208401203,
-0.011685307137668133,
-0.06321903318166733,
0.04919998720288277,
-0.0350852832198143,
0.08691992610692978,
0.14973463118076324,
-0.06199580430984497,
0.044127099215984344,
-0.051836758852005005,
0.2155851125717163,
-0.07977675646543503,
0.020021142438054085,
0.10380931943655014,
-0.030303621664643288,
0.008050799369812012,
0.04750863462686539,
0.022748949006199837,
-0.10983256250619888,
0.060772672295570374,
0.02177618257701397,
-0.048360973596572876,
-0.24633239209651947,
-0.04184577614068985,
-0.04787402227520943,
-0.039849743247032166,
0.03808833286166191,
0.05499909445643425,
0.12224866449832916,
0.06748116761445999,
0.07275526225566864,
0.10727407038211823,
-0.03734719753265381,
0.08130889385938644,
0.11992263793945312,
0.0417238250374794,
0.08898663520812988,
-0.06924588978290558,
-0.03835345804691315,
0.07721955329179764,
-0.044702138751745224,
0.22667057812213898,
0.016835028305649757,
0.06430666148662567,
0.06775417923927307,
0.0748661607503891,
0.0009319952805526555,
0.09638623148202896,
0.05104238912463188,
-0.032345063984394073,
0.01681601256132126,
-0.07903069257736206,
-0.021318109706044197,
0.0166312362998724,
-0.06690381467342377,
0.026536820456385612,
-0.11897995322942734,
0.011396117508411407,
0.033956676721572876,
0.21969860792160034,
0.0316571407020092,
-0.33909377455711365,
-0.133775532245636,
-0.020004205405712128,
-0.025104045867919922,
-0.08322613686323166,
-0.009730266407132149,
0.09339594841003418,
-0.08566085249185562,
0.07188962399959564,
-0.06477732211351395,
0.09047138690948486,
0.023276396095752716,
0.02420072630047798,
0.026705771684646606,
0.08278697729110718,
-0.028037244454026222,
0.07411117106676102,
-0.2984030246734619,
0.2272130697965622,
0.031993359327316284,
0.12467356771230698,
-0.08273736387491226,
-0.00567085575312376,
0.01794363372027874,
0.15614895522594452,
0.11059282720088959,
-0.008507652208209038,
-0.051569148898124695,
-0.12961827218532562,
-0.014027681201696396,
0.005426236428320408,
0.10380785167217255,
0.024853572249412537,
0.09693965315818787,
-0.031207289546728134,
0.008911472745239735,
0.07626116275787354,
-0.008692339062690735,
-0.17987540364265442,
-0.06112354248762131,
0.030206579715013504,
-0.001405377872288227,
-0.044010285288095474,
-0.05481899157166481,
-0.09564853459596634,
0.0015023115556687117,
0.17699569463729858,
-0.005875044036656618,
-0.04214587062597275,
-0.17327497899532318,
0.05715441703796387,
0.08834388852119446,
-0.03657487779855728,
0.013964930549263954,
0.0016073491424322128,
0.09379275143146515,
0.045452360063791275,
-0.14922985434532166,
0.11159782111644745,
-0.07531245797872543,
-0.13606199622154236,
-0.043823424726724625,
0.07510954886674881,
0.0782921090722084,
0.03796721622347832,
0.020633593201637268,
0.020871330052614212,
0.01647118479013443,
-0.08230334520339966,
0.013744059950113297,
0.009650889784097672,
0.039682358503341675,
0.006939831655472517,
-0.061201971024274826,
-0.003146236529573798,
-0.020194387063384056,
0.004218699410557747,
0.11762827634811401,
0.14642608165740967,
-0.07923097908496857,
0.0792023092508316,
0.041169434785842896,
-0.12068429589271545,
-0.2212270051240921,
0.13853630423545837,
0.03884478285908699,
0.02535378746688366,
0.012589458376169205,
-0.15995970368385315,
0.11770592629909515,
-0.002803379436954856,
0.004659710917621851,
0.08687763661146164,
-0.24135330319404602,
-0.1473166048526764,
0.12274360656738281,
0.1101243868470192,
0.13006848096847534,
-0.12065131217241287,
-0.04202008619904518,
-0.049061983823776245,
-0.04990040138363838,
0.16877517104148865,
-0.22699390351772308,
0.09624845534563065,
0.0027124793268740177,
0.07905639708042145,
0.035941433161497116,
-0.028224945068359375,
0.08891986310482025,
0.007516522891819477,
0.10870987921953201,
-0.08474988490343094,
0.016336051747202873,
0.15733368694782257,
-0.03866451978683472,
0.09094599634408951,
0.031782835721969604,
0.0586448572576046,
-0.08752556890249252,
-0.02263190969824791,
-0.07336337864398956,
0.07756608724594116,
-0.029267646372318268,
-0.07511449605226517,
-0.04538208991289139,
0.03249514847993851,
0.05314663052558899,
-0.06165657564997673,
0.05504348874092102,
0.00009395374945597723,
0.1753910630941391,
0.16704070568084717,
0.19025033712387085,
-0.00952893029898405,
-0.02400358021259308,
0.05974418669939041,
-0.05031198263168335,
0.0816228911280632,
-0.15567904710769653,
0.03518105670809746,
0.11344113200902939,
0.010618330910801888,
0.1156647652387619,
0.0868886336684227,
-0.07208175212144852,
-0.009329196065664291,
0.044253550469875336,
-0.12732268869876862,
-0.12645223736763,
-0.046781253069639206,
-0.07716123759746552,
-0.04883357882499695,
0.06669303774833679,
0.17116905748844147,
-0.11281327903270721,
0.01529992837458849,
-0.000013439534995995928,
-0.04772176221013069,
-0.09107930958271027,
0.18928828835487366,
0.05833642557263374,
0.037956468760967255,
-0.06873775273561478,
0.13019394874572754,
0.029791217297315598,
-0.06434173136949539,
0.09272075444459915,
0.04424084722995758,
-0.11467549949884415,
-0.07051368802785873,
0.09898678958415985,
0.22765269875526428,
-0.08354422450065613,
-0.054937321692705154,
-0.11455211043357849,
-0.09501136839389801,
0.047334522008895874,
0.24842803180217743,
0.05149344727396965,
0.04335524141788483,
-0.08562179654836655,
0.03976394236087799,
-0.1375771164894104,
0.06172097101807594,
0.05241614580154419,
0.06075320392847061,
-0.14033158123493195,
0.2269665151834488,
-0.024343453347682953,
0.07162553817033768,
-0.08527122437953949,
-0.017242850735783577,
-0.1294693648815155,
0.005605556070804596,
-0.20657767355442047,
-0.014028947800397873,
-0.002497067442163825,
-0.016974110156297684,
0.019671915099024773,
-0.0398755706846714,
-0.03658849000930786,
0.04040374606847763,
-0.09030115604400635,
0.0033047571778297424,
0.041089609265327454,
0.017148936167359352,
-0.08557455986738205,
-0.02039084956049919,
-0.015197809785604477,
-0.06326620280742645,
0.05659056827425957,
0.06112498417496681,
-0.04167141765356064,
0.06387753784656525,
-0.13188894093036652,
0.0016774105606600642,
0.03557484224438667,
-0.004762386903166771,
0.07840114086866379,
-0.04603618010878563,
-0.001869357656687498,
-0.0018506565829738975,
0.07156219333410263,
0.042669784277677536,
0.09297147393226624,
-0.08506983518600464,
-0.06930393725633621,
-0.043233610689640045,
-0.018035652115941048,
-0.06341990828514099,
0.06653488427400589,
0.08407928794622421,
0.04181816801428795,
0.12429419159889221,
-0.10475053638219833,
0.02215493470430374,
-0.1529742330312729,
-0.036708276718854904,
0.0005076650995761156,
-0.06577274948358536,
-0.03985900431871414,
-0.05142071098089218,
0.0869666114449501,
-0.0768962949514389,
0.11859586089849472,
0.022870389744639397,
0.11155830323696136,
0.042270317673683167,
-0.020368797704577446,
-0.05308863893151283,
0.020588243380188942,
0.21325290203094482,
0.02636050246655941,
-0.01312868483364582,
-0.013130883686244488,
0.051843415945768356,
0.052663952112197876,
0.02033521980047226,
0.1767718493938446,
0.05289094150066376,
-0.11252813041210175,
0.1115075945854187,
0.032690007239580154,
-0.043683554977178574,
-0.1215575784444809,
0.0038573306519538164,
-0.056096844375133514,
0.13049015402793884,
-0.06917082518339157,
0.05155746638774872,
0.08420085161924362,
-0.10919932276010513,
0.05427663028240204,
-0.060689955949783325,
-0.08633362501859665,
-0.1255842000246048,
-0.0972081646323204,
-0.08686025440692902,
-0.11665958911180496,
-0.0036687578540295362,
-0.09460367262363434,
-0.003176062600687146,
-0.012345400638878345,
0.03375987708568573,
-0.032331470400094986,
0.21463213860988617,
-0.05655868351459503,
-0.024634679779410362,
0.11610069870948792,
-0.020092664286494255,
-0.01310813706368208,
-0.059916432946920395,
-0.000009622405741538387,
0.014911757782101631,
0.005544023588299751,
0.010130645707249641,
-0.008568506687879562,
0.013268496841192245,
0.038367144763469696,
-0.017142562195658684,
-0.09794601798057556,
0.03414643555879593,
0.05318881943821907,
-0.0071912663988769054,
-0.0048100450076162815,
0.031641390174627304,
-0.03443372994661331,
-0.02888626419007778,
0.21566155552864075,
-0.1059715747833252,
-0.056670986115932465,
-0.13719415664672852,
0.2947438359260559,
0.03265488147735596,
0.011676435358822346,
0.03637082502245903,
-0.0933554470539093,
-0.00811080913990736,
0.2329002171754837,
0.1917419284582138,
-0.04270632192492485,
-0.008164746686816216,
0.023165473714470863,
-0.007636902388185263,
-0.03733433410525322,
0.17580366134643555,
0.0374709889292717,
0.026465672999620438,
-0.05715055391192436,
-0.001190766110084951,
-0.027052197605371475,
-0.039886411279439926,
-0.022231921553611755,
0.08701547980308533,
0.03916875272989273,
0.01604176126420498,
-0.026977961882948875,
0.08807893097400665,
-0.08838821202516556,
-0.14533120393753052,
0.06495629996061325,
-0.1206662729382515,
-0.1301376074552536,
-0.04406953230500221,
-0.026412060484290123,
-0.015052465721964836,
0.07863359898328781,
-0.031801581382751465,
0.018735947087407112,
0.142486110329628,
-0.00646011158823967,
-0.07157653570175171,
-0.04525645077228546,
0.09320750087499619,
-0.07777146995067596,
0.16029441356658936,
-0.012464834377169609,
0.07030130922794342,
0.11065758764743805,
0.025641849264502525,
-0.09790129959583282,
0.0684216171503067,
0.0165303573012352,
-0.03344346210360527,
0.04917589947581291,
0.12432648986577988,
-0.025031307712197304,
0.04338718205690384,
0.012880120426416397,
-0.16532163321971893,
0.012473317794501781,
-0.04632715508341789,
-0.04438764601945877,
-0.08991850167512894,
-0.00960993766784668,
-0.0856395810842514,
0.13187579810619354,
0.20134294033050537,
-0.031464770436286926,
0.02664223685860634,
-0.09249072521924973,
0.044854264706373215,
0.056016288697719574,
-0.014272178523242474,
-0.03873061761260033,
-0.18561449646949768,
0.0054826741106808186,
0.09158536046743393,
0.0014793436275795102,
-0.2972817122936249,
-0.05419601500034332,
0.009608307853341103,
-0.048103563487529755,
-0.056300435215234756,
0.10179122537374496,
0.09942106902599335,
0.05197533592581749,
-0.058694012463092804,
-0.08044926077127457,
-0.035500939935445786,
0.11471696197986603,
-0.12930157780647278,
-0.05773692950606346
] |
null | null | transformers |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# videomae-base-finetuned-ucf101-subset
This model is a fine-tuned version of [MCG-NJU/videomae-base](https://huggingface.co/MCG-NJU/videomae-base) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.2049
- Accuracy: 0.8857
## Model description
More information needed
## Intended uses & limitations
More information needed
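
A minimal inference sketch is shown below. It is an assumption rather than documented usage: it pairs this checkpoint with the standard VideoMAE image processor from `transformers`, and the random 16-frame clip is only a placeholder for frames decoded from a real video (e.g. with decord or PyAV).

```python
# Sketch only: assumes the usual VideoMAE processor/classifier pairing and a
# clip sampled to 16 frames; replace the random array with real decoded frames.
import numpy as np
import torch
from transformers import AutoImageProcessor, VideoMAEForVideoClassification

checkpoint = "yasmina666/videomae-base-finetuned-ucf101-subset"
image_processor = AutoImageProcessor.from_pretrained(checkpoint)
model = VideoMAEForVideoClassification.from_pretrained(checkpoint)

# Placeholder clip: 16 frames, channels-first, 224x224.
video = list(np.random.randint(0, 256, (16, 3, 224, 224)))

inputs = image_processor(video, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

predicted_class = logits.argmax(-1).item()
print("Predicted class:", model.config.id2label[predicted_class])
```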
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (a hedged `TrainingArguments` sketch follows this list):
- learning_rate: 5e-05
- train_batch_size: 3
- eval_batch_size: 3
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- training_steps: 400
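
As a reference, these settings might map onto `transformers.TrainingArguments` roughly as sketched below; the dataset loading, collator, and `Trainer` wiring are omitted, and `output_dir` is an assumed name. The Adam betas and epsilon listed above are the library defaults.

```python
# Hypothetical mapping of the listed hyperparameters; not the original script.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="videomae-base-finetuned-ucf101-subset",  # assumed
    learning_rate=5e-5,
    per_device_train_batch_size=3,
    per_device_eval_batch_size=3,
    seed=42,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    max_steps=400,
)
```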
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 1.4998 | 0.25 | 100 | 1.2317 | 0.7429 |
| 0.874 | 1.25 | 200 | 0.5223 | 0.8 |
| 0.1615 | 2.25 | 300 | 0.5311 | 0.8143 |
| 0.4135 | 3.25 | 400 | 0.2049 | 0.8857 |
### Framework versions
- Transformers 4.35.2
- Pytorch 2.1.0+cu121
- Datasets 2.17.0
- Tokenizers 0.15.1
| {"license": "cc-by-nc-4.0", "tags": ["generated_from_trainer"], "metrics": ["accuracy"], "base_model": "MCG-NJU/videomae-base", "model-index": [{"name": "videomae-base-finetuned-ucf101-subset", "results": []}]} | video-classification | yasmina666/videomae-base-finetuned-ucf101-subset | [
"transformers",
"tensorboard",
"safetensors",
"videomae",
"video-classification",
"generated_from_trainer",
"base_model:MCG-NJU/videomae-base",
"license:cc-by-nc-4.0",
"endpoints_compatible",
"region:us"
] | 2024-02-14T16:23:56+00:00 | [] | [] | TAGS
#transformers #tensorboard #safetensors #videomae #video-classification #generated_from_trainer #base_model-MCG-NJU/videomae-base #license-cc-by-nc-4.0 #endpoints_compatible #region-us
| videomae-base-finetuned-ucf101-subset
=====================================
This model is a fine-tuned version of MCG-NJU/videomae-base on an unknown dataset.
It achieves the following results on the evaluation set:
* Loss: 0.2049
* Accuracy: 0.8857
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 5e-05
* train\_batch\_size: 3
* eval\_batch\_size: 3
* seed: 42
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* lr\_scheduler\_warmup\_ratio: 0.1
* training\_steps: 400
### Training results
### Framework versions
* Transformers 4.35.2
* Pytorch 2.1.0+cu121
* Datasets 2.17.0
* Tokenizers 0.15.1
| [
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 5e-05\n* train\\_batch\\_size: 3\n* eval\\_batch\\_size: 3\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_ratio: 0.1\n* training\\_steps: 400",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.35.2\n* Pytorch 2.1.0+cu121\n* Datasets 2.17.0\n* Tokenizers 0.15.1"
] | [
"TAGS\n#transformers #tensorboard #safetensors #videomae #video-classification #generated_from_trainer #base_model-MCG-NJU/videomae-base #license-cc-by-nc-4.0 #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 5e-05\n* train\\_batch\\_size: 3\n* eval\\_batch\\_size: 3\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_ratio: 0.1\n* training\\_steps: 400",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.35.2\n* Pytorch 2.1.0+cu121\n* Datasets 2.17.0\n* Tokenizers 0.15.1"
] | [
69,
115,
4,
33
] | [
"passage: TAGS\n#transformers #tensorboard #safetensors #videomae #video-classification #generated_from_trainer #base_model-MCG-NJU/videomae-base #license-cc-by-nc-4.0 #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 5e-05\n* train\\_batch\\_size: 3\n* eval\\_batch\\_size: 3\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_ratio: 0.1\n* training\\_steps: 400### Training results### Framework versions\n\n\n* Transformers 4.35.2\n* Pytorch 2.1.0+cu121\n* Datasets 2.17.0\n* Tokenizers 0.15.1"
] | [
-0.0874498039484024,
0.0884881392121315,
-0.0023074860218912363,
0.08449561148881912,
0.11268649250268936,
0.0009402279974892735,
0.14214852452278137,
0.15553675591945648,
-0.10143065452575684,
0.04398950934410095,
0.13222795724868774,
0.12660682201385498,
0.030590955168008804,
0.17974910140037537,
-0.033350810408592224,
-0.27104252576828003,
0.02213938720524311,
0.03807564079761505,
-0.051956579089164734,
0.12419676035642624,
0.0990377888083458,
-0.150787815451622,
0.0904880091547966,
-0.017432114109396935,
-0.21170884370803833,
-0.004579656757414341,
0.030359169468283653,
-0.06598023325204849,
0.12191091477870941,
0.010894875973463058,
0.06970278173685074,
0.038307368755340576,
0.107505664229393,
-0.21824370324611664,
0.013785401359200478,
0.06076051667332649,
-0.003114820923656225,
0.05920226871967316,
0.06169874593615532,
0.037431418895721436,
0.1044580340385437,
-0.11688762158155441,
0.0685056671500206,
0.029064854606986046,
-0.13529515266418457,
-0.2361927181482315,
-0.08240073174238205,
-0.012290732935070992,
0.09102322161197662,
0.0526651069521904,
-0.014882574789226055,
0.11144286394119263,
-0.05187759920954704,
0.11752411723136902,
0.264604389667511,
-0.2643386125564575,
-0.07839475572109222,
0.07849124819040298,
0.09298699349164963,
0.077733613550663,
-0.13519632816314697,
0.024501804262399673,
0.053594786673784256,
0.0018423862056806684,
0.14409340918064117,
-0.028805114328861237,
-0.0061749243177473545,
-0.020586522296071053,
-0.13208161294460297,
-0.06667640805244446,
0.11947070062160492,
0.07711769640445709,
-0.024675879627466202,
-0.0819985419511795,
-0.03009405918419361,
-0.1800958216190338,
-0.08780208230018616,
0.01826660893857479,
0.059234265238046646,
-0.060479164123535156,
-0.10922325402498245,
0.012609332799911499,
-0.08199518918991089,
-0.0907338410615921,
0.006586017087101936,
0.1405842900276184,
0.038232918828725815,
0.026112424209713936,
-0.07236725091934204,
0.09140045195817947,
-0.029723530635237694,
-0.1525333672761917,
0.003908230923116207,
0.01750747114419937,
-0.00775930704548955,
-0.03443673625588417,
-0.026863912120461464,
-0.058737702667713165,
0.0015437162946909666,
0.12931182980537415,
-0.09210408478975296,
0.07475433498620987,
-0.002108609303832054,
0.05712852627038956,
-0.08797786384820938,
0.1536358892917633,
-0.04717997834086418,
-0.0244887787848711,
-0.004969581495970488,
0.11356627196073532,
0.0134343970566988,
-0.01008300855755806,
-0.09452293813228607,
0.04822881892323494,
0.06870287656784058,
0.030868906527757645,
-0.03261331468820572,
0.07635846734046936,
-0.06339048594236374,
-0.0028072146233171225,
0.014064288698136806,
-0.08785742521286011,
0.041973743587732315,
-0.002781918505206704,
-0.06285645067691803,
-0.053297221660614014,
-0.009386705234646797,
0.007864375598728657,
0.028315344825387,
0.08183863013982773,
-0.07545938342809677,
0.0333438403904438,
-0.0968879833817482,
-0.12017260491847992,
0.0452047660946846,
-0.13862675428390503,
0.009981202892959118,
-0.0698503851890564,
-0.08833006769418716,
0.005998183973133564,
0.055164169520139694,
-0.02061302214860916,
-0.0032543959096074104,
-0.06532382220029831,
-0.08664031326770782,
0.03322513774037361,
0.001566013670526445,
0.08830223232507706,
-0.07753553241491318,
0.09207061678171158,
0.031368549913167953,
0.09693261981010437,
-0.016030659899115562,
0.028966106474399567,
-0.051179416477680206,
0.045923300087451935,
-0.22180289030075073,
0.043288636952638626,
-0.08818051964044571,
0.04521283134818077,
-0.09012513607740402,
-0.07534131407737732,
0.04602871090173721,
-0.00556286983191967,
0.03166208043694496,
0.11245115101337433,
-0.2291651964187622,
-0.06829307973384857,
0.17077450454235077,
-0.0859512910246849,
-0.1336878091096878,
0.09596902132034302,
-0.04737195000052452,
-0.002062680432572961,
0.03288011997938156,
0.1932438462972641,
0.044335104525089264,
-0.18258832395076752,
0.020476307719945908,
0.0022577198687940836,
0.03370458260178566,
0.0039582871831953526,
0.09368282556533813,
0.035912495106458664,
0.10647253692150116,
-0.01243309024721384,
-0.06277778744697571,
0.0366324856877327,
-0.1108681783080101,
-0.08679572492837906,
-0.026269542053341866,
-0.07128448039293289,
0.011510372161865234,
0.058482274413108826,
0.0415341891348362,
-0.10843103379011154,
-0.09840439260005951,
0.03478822857141495,
0.0754295140504837,
-0.07806546241044998,
0.0635482668876648,
-0.12046045064926147,
0.07266919314861298,
-0.06108986213803291,
-0.020095432177186012,
-0.1434023678302765,
-0.04519791528582573,
0.021993208676576614,
-0.02250065468251705,
-0.009938043542206287,
-0.028994308784604073,
0.06981850415468216,
0.07681465148925781,
-0.06330493092536926,
-0.026782019063830376,
-0.06021396443247795,
0.03192100673913956,
-0.0676528811454773,
-0.24702824652194977,
-0.03990256041288376,
-0.056281495839357376,
0.0714339166879654,
-0.18621303141117096,
0.016039332374930382,
0.1057366356253624,
0.13558146357536316,
0.07287930697202682,
-0.05331981182098389,
-0.0007420066394843161,
0.056084755808115005,
-0.010578709654510021,
-0.08037510514259338,
0.06456378102302551,
-0.0029181786812841892,
-0.08788881450891495,
-0.016860436648130417,
-0.13743330538272858,
0.1258213073015213,
0.13703037798404694,
-0.10478223860263824,
-0.05281812697649002,
0.015817904844880104,
-0.04456661641597748,
-0.01815924048423767,
-0.010062848217785358,
0.02501591295003891,
0.10446090251207352,
0.010491257533431053,
0.14278510212898254,
-0.09080547094345093,
-0.04837798327207565,
0.06774742901325226,
-0.03180000185966492,
-0.022721506655216217,
0.08765281736850739,
0.07499215751886368,
-0.07947064936161041,
0.12584908306598663,
0.16935306787490845,
-0.036614641547203064,
0.17597974836826324,
-0.0797799602150917,
-0.07878793776035309,
-0.03298057243227959,
0.0048425025306642056,
0.024373294785618782,
0.14824943244457245,
-0.10107848793268204,
-0.023691397160291672,
0.002782883122563362,
-0.005834608804434538,
-0.027185026556253433,
-0.2192079722881317,
-0.037154797464609146,
0.04154015704989433,
-0.07748963683843613,
-0.0193941630423069,
-0.007631506305187941,
-0.010831905528903008,
0.09822329878807068,
0.026943854987621307,
-0.06992153823375702,
0.03326214477419853,
-0.026612328365445137,
-0.06083410605788231,
0.17534059286117554,
-0.08646082878112793,
-0.15453894436359406,
-0.0942659005522728,
-0.08371850103139877,
-0.04011263698339462,
0.002027863869443536,
0.04621864855289459,
-0.10646502673625946,
-0.0324455127120018,
-0.0900481715798378,
-0.045197516679763794,
-0.0005475764046423137,
0.031581733375787735,
0.05961566045880318,
0.027436615899205208,
0.0794418677687645,
-0.09562576562166214,
-0.00795324519276619,
-0.033798698335886,
-0.06413528323173523,
0.054620008915662766,
0.043505776673555374,
0.13321641087532043,
0.10016371309757233,
-0.03553609549999237,
0.04108355939388275,
-0.051431115716695786,
0.23392082750797272,
-0.11889185756444931,
-0.006111877039074898,
0.1311948150396347,
-0.0192838404327631,
0.04836559668183327,
0.13937048614025116,
0.07517778873443604,
-0.10191803425550461,
-0.014714377000927925,
0.03254754841327667,
-0.04196390137076378,
-0.16481687128543854,
-0.005796049255877733,
-0.046444088220596313,
0.010725217871367931,
0.11689845472574234,
0.028134824708104134,
-0.003966502379626036,
0.03271828591823578,
0.018007172271609306,
0.022595373913645744,
0.03239166736602783,
0.10640934854745865,
0.07997547090053558,
0.04486656188964844,
0.1057717502117157,
-0.0488559789955616,
-0.013239836320281029,
0.03731169551610947,
0.03385791555047035,
0.21919918060302734,
0.025576332584023476,
0.16616927087306976,
0.09029527753591537,
0.11500649899244308,
0.030188463628292084,
0.016101326793432236,
-0.006793076638132334,
-0.04582468047738075,
-0.0031095435842871666,
-0.057664137333631516,
-0.01204624306410551,
0.032084036618471146,
-0.051717184484004974,
-0.017685018479824066,
-0.09870269894599915,
0.05812334641814232,
0.06688159704208374,
0.27530425786972046,
0.036563120782375336,
-0.3405172526836395,
-0.05783607438206673,
0.013333265669643879,
-0.023210037499666214,
-0.01883762702345848,
0.03934190422296524,
0.14633414149284363,
-0.06775157153606415,
0.10355670005083084,
-0.06973566859960556,
0.07541230320930481,
-0.06392315775156021,
0.02986108884215355,
0.11008051037788391,
0.060517776757478714,
-0.0021420451812446117,
0.03246616944670677,
-0.2871747314929962,
0.2935792803764343,
0.02088078111410141,
0.07265281677246094,
-0.03359281271696091,
-0.015538185834884644,
0.020340571179986,
0.07053578644990921,
0.1439007669687271,
-0.015503219328820705,
-0.1081080362200737,
-0.18761616945266724,
-0.035580314695835114,
0.013810359872877598,
0.13862046599388123,
-0.003994067199528217,
0.10401802510023117,
-0.022095365449786186,
-0.0179489366710186,
0.06224050745368004,
-0.10056139528751373,
-0.07951649278402328,
-0.07709413021802902,
-0.012399762868881226,
0.034907832741737366,
-0.008404618129134178,
-0.08936690539121628,
-0.08272848278284073,
-0.09645826369524002,
0.13376449048519135,
-0.10416809469461441,
-0.015739792957901955,
-0.11228620260953903,
0.041225921362638474,
0.04892105981707573,
-0.05811886116862297,
0.07515902817249298,
-0.009130059741437435,
0.15818209946155548,
-0.009622092358767986,
-0.05116329714655876,
0.13599129021167755,
-0.0752183198928833,
-0.18064820766448975,
-0.07480690628290176,
0.11705669015645981,
0.006798901595175266,
0.05787220224738121,
-0.014296360313892365,
0.03861866891384125,
0.01860089600086212,
-0.06703679263591766,
0.030859993770718575,
-0.01008706446737051,
0.04345690831542015,
-0.04524543136358261,
-0.03228702396154404,
0.011435200460255146,
-0.05819565802812576,
-0.015313873998820782,
0.15720435976982117,
0.34933120012283325,
-0.11609186977148056,
0.03837123513221741,
0.03743154928088188,
-0.046292439103126526,
-0.20035932958126068,
0.046193934977054596,
0.060677558183670044,
-0.04825613275170326,
0.025468317791819572,
-0.15096373856067657,
0.057679448276758194,
0.07685689628124237,
-0.015678949654102325,
0.08362914621829987,
-0.2832030653953552,
-0.131631001830101,
0.06293383985757828,
0.1778498739004135,
0.057387419044971466,
-0.13518886268138885,
-0.010383302345871925,
0.006787475664168596,
-0.1481611579656601,
0.11268292367458344,
-0.0835660994052887,
0.11659599095582962,
-0.022865770384669304,
0.025747058913111687,
-0.00038207339821383357,
-0.0598461888730526,
0.11378145962953568,
-0.012423847801983356,
0.1300153136253357,
-0.05318112298846245,
-0.05780029296875,
0.11423001438379288,
-0.07677605003118515,
-0.0006067405338399112,
-0.08098933100700378,
0.00301380455493927,
-0.10074196010828018,
0.002891191281378269,
-0.07441236823797226,
-0.019387664273381233,
-0.03280145674943924,
-0.0500122606754303,
-0.05724307894706726,
0.054646432399749756,
0.046129677444696426,
-0.003590585896745324,
0.226729154586792,
-0.023964401334524155,
0.11212523281574249,
0.16661393642425537,
0.0785006731748581,
-0.09586890041828156,
-0.07304488867521286,
-0.0038338189478963614,
-0.016370465978980064,
0.08223936706781387,
-0.1545187383890152,
0.044677186757326126,
0.12340275198221207,
0.026991527527570724,
0.1424308717250824,
0.0631701648235321,
-0.027911188080906868,
0.048165496438741684,
0.09291411191225052,
-0.13554158806800842,
-0.10428634285926819,
0.021593622863292694,
-0.017063593491911888,
-0.08670207113027573,
0.027138805016875267,
0.09323760122060776,
-0.06091291829943657,
0.029578840360045433,
-0.010597663931548595,
0.029627544805407524,
-0.051259443163871765,
0.14620129764080048,
0.04880363494157791,
0.06014535576105118,
-0.11564060300588608,
0.11268845945596695,
0.01550306472927332,
-0.0994366928935051,
0.00969888735562563,
0.09414292126893997,
-0.09233446419239044,
-0.01799703761935234,
0.03119058720767498,
0.13562718033790588,
-0.03974633663892746,
-0.05330018326640129,
-0.15160921216011047,
-0.12199204415082932,
0.07394472509622574,
0.18429377675056458,
0.059450458735227585,
0.012812754139304161,
-0.009753897786140442,
0.03167960047721863,
-0.12746506929397583,
0.09691263735294342,
-0.001193248899653554,
0.061539459973573685,
-0.16687829792499542,
0.11685707420110703,
0.012712567113339901,
0.03129654750227928,
-0.03229701519012451,
0.026597976684570312,
-0.08543794602155685,
0.028859268873929977,
-0.0651908740401268,
0.0037541266065090895,
-0.05066058784723282,
0.023702654987573624,
-0.01746124029159546,
-0.04352312907576561,
-0.0723523199558258,
0.02229641005396843,
-0.09943874180316925,
-0.023456700146198273,
0.03338092193007469,
0.044473595917224884,
-0.13479766249656677,
-0.03371192142367363,
0.005765052046626806,
-0.07966314256191254,
0.05370780825614929,
-0.0010162153048440814,
0.013058588840067387,
0.02978852018713951,
-0.15740607678890228,
-0.021032217890024185,
0.0795152485370636,
-0.017384981736540794,
0.039344750344753265,
-0.07141119986772537,
-0.0123166898265481,
-0.018834466114640236,
0.010701389983296394,
0.004867925308644772,
0.055740270763635635,
-0.11722305417060852,
0.019982876256108284,
-0.010610394179821014,
-0.04850579425692558,
-0.05840468779206276,
0.06723194569349289,
0.1131814569234848,
-0.006941268686205149,
0.17068585753440857,
-0.08616319298744202,
-0.001958769280463457,
-0.20496678352355957,
-0.010967408306896687,
0.019730165600776672,
-0.12342677265405655,
-0.09620156139135361,
-0.030934162437915802,
0.06847070157527924,
-0.07825607806444168,
0.14850637316703796,
-0.004985014908015728,
-0.009714119136333466,
0.06966134905815125,
-0.08842524886131287,
-0.022082988172769547,
0.03810373693704605,
0.17780311405658722,
0.02824750542640686,
-0.0334756039083004,
0.03242028132081032,
0.02284054085612297,
0.1071852371096611,
0.08784922957420349,
0.15669366717338562,
0.16478636860847473,
-0.01036983635276556,
0.08803325146436691,
0.06761670857667923,
-0.02595229633152485,
-0.1570972502231598,
0.1574329435825348,
-0.06337400525808334,
0.13742728531360626,
-0.024138113483786583,
0.14065884053707123,
0.16712015867233276,
-0.18421128392219543,
0.02783680148422718,
-0.01846185140311718,
-0.06777001172304153,
-0.08715864270925522,
-0.06996045261621475,
-0.08713068813085556,
-0.17684967815876007,
0.04326188564300537,
-0.12236610800027847,
0.091728575527668,
0.07345529645681381,
0.04571104794740677,
0.00382066797465086,
0.19144897162914276,
0.00659484788775444,
0.029485899955034256,
0.10320264846086502,
0.023222189396619797,
-0.03742944076657295,
-0.053833041340112686,
-0.0687909722328186,
0.04562738910317421,
-0.06005376949906349,
0.019011756405234337,
-0.02283305674791336,
-0.04538169503211975,
0.0617268942296505,
0.012444078922271729,
-0.11496403068304062,
0.04373730719089508,
0.0230430718511343,
0.0550251305103302,
0.04860766977071762,
0.005479895975440741,
0.014869538135826588,
0.006022067740559578,
0.18219952285289764,
-0.0672507956624031,
-0.05950552225112915,
-0.08950401097536087,
0.17883406579494476,
0.004644849803298712,
0.015876267105340958,
-0.012360124848783016,
-0.07240236550569534,
-0.011578771285712719,
0.1605883687734604,
0.17400787770748138,
-0.06114713475108147,
-0.005406996235251427,
-0.03869049996137619,
-0.0058269016444683075,
-0.039754800498485565,
0.09733947366476059,
0.056976575404405594,
-0.008399376645684242,
-0.07686720788478851,
-0.05796503648161888,
-0.0418894998729229,
-0.02290058508515358,
-0.01386372372508049,
0.013816282153129578,
0.05496658757328987,
0.0013391274260357022,
-0.08274541795253754,
0.05803712457418442,
-0.034340888261795044,
-0.10342799127101898,
0.10859254002571106,
-0.19551140069961548,
-0.115679070353508,
-0.004171674605458975,
0.09255272895097733,
-0.007444168906658888,
0.03171945735812187,
-0.007594443392008543,
-0.012332290410995483,
0.022658012807369232,
-0.0006802575080655515,
-0.04572901874780655,
-0.11312054842710495,
0.07850458472967148,
-0.12790660560131073,
0.239702507853508,
-0.04549084231257439,
0.04469168186187744,
0.09866368770599365,
0.014749766327440739,
-0.08401507884263992,
0.07965170592069626,
0.056263383477926254,
-0.09004581719636917,
-0.02949298731982708,
0.14959365129470825,
-0.04740098491311073,
0.15896548330783844,
0.05424540117383003,
-0.0883975699543953,
0.024183068424463272,
-0.0989910140633583,
-0.0904579758644104,
-0.043767645955085754,
-0.04352177307009697,
-0.03366564214229584,
0.13812008500099182,
0.19120511412620544,
-0.033364828675985336,
0.015312612056732178,
-0.0711432695388794,
0.028839362785220146,
0.11016257852315903,
0.017782168462872505,
-0.039244379848241806,
-0.21523742377758026,
0.03985045477747917,
0.09336348623037338,
0.0024853795766830444,
-0.1777527779340744,
-0.1103968545794487,
-0.004247434437274933,
-0.03925328329205513,
-0.06335805356502533,
0.0801776722073555,
0.09243215620517731,
0.055930979549884796,
-0.06235559657216072,
-0.13487085700035095,
-0.01595343090593815,
0.16306822001934052,
-0.13587352633476257,
-0.07430996745824814
] |
null | null | peft |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
[<img src="https://raw.githubusercontent.com/OpenAccess-AI-Collective/axolotl/main/image/axolotl-badge-web.png" alt="Built with Axolotl" width="200" height="32"/>](https://github.com/OpenAccess-AI-Collective/axolotl)
<details><summary>See axolotl config</summary>
axolotl version: `0.4.0`
```yaml
adapter: qlora
base_model: mistralai/Mistral-7B-v0.1
bf16: false
dataset_prepared_path: null
datasets:
- path: https://raw.githubusercontent.com/joseagmz/small_data/main/DSM-5.txt
type: completion
debug: null
deepspeed: null
early_stopping_patience: null
evals_per_epoch: null
flash_attention: false
fp16: true
fsdp: null
fsdp_config: null
gradient_accumulation_steps: 1
gradient_checkpointing: true
group_by_length: false
is_mistral_derived_model: true
learning_rate: 0.0002
load_in_4bit: true
load_in_8bit: false
local_rank: null
logging_steps: 1
lora_alpha: 16
lora_dropout: 0.05
lora_fan_in_fan_out: null
lora_model_dir: null
lora_r: 32
lora_target_linear: true
lora_target_modules: null
lr_scheduler: cosine
max_steps: 20
micro_batch_size: 1
mlflow_experiment_name: colab-example
model_type: MistralForCausalLM
num_epochs: 4
optimizer: paged_adamw_32bit
output_dir: ./DSM_output
pad_to_sequence_len: true
resume_from_checkpoint: null
sample_packing: true
saves_per_epoch: null
sequence_len: 1096
special_tokens: null
strict: false
tf32: false
tokenizer_type: LlamaTokenizer
train_on_inputs: false
val_set_size: 0.05
wandb_entity: null
wandb_log_model: null
wandb_name: null
wandb_project: null
wandb_watch: null
warmup_steps: 10
weight_decay: 0.0
xformers_attention: null
```
</details><br>
# DSM_output
This model is a fine-tuned version of [mistralai/Mistral-7B-v0.1](https://huggingface.co/mistralai/Mistral-7B-v0.1) on the DSM-5 text dataset listed in the axolotl config above.
It achieves the following results on the evaluation set:
- Loss: 2.6101
## Model description
More information needed
## Intended uses & limitations
More information needed
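
A minimal usage sketch is shown below. It is an assumption rather than documented usage: it loads the base model and attaches this repository as a LoRA adapter via PEFT; half precision, `device_map="auto"`, and the prompt are placeholders chosen for illustration.

```python
# Sketch only: applies this adapter on top of mistralai/Mistral-7B-v0.1.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base_id = "mistralai/Mistral-7B-v0.1"
adapter_id = "joseagmz/DSM_output"

tokenizer = AutoTokenizer.from_pretrained(base_id)
base = AutoModelForCausalLM.from_pretrained(
    base_id, torch_dtype=torch.float16, device_map="auto"
)
model = PeftModel.from_pretrained(base, adapter_id)
model.eval()

prompt = "Summarize how diagnostic criteria are organized."  # placeholder
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
with torch.no_grad():
    output_ids = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```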
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0002
- train_batch_size: 1
- eval_batch_size: 1
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: cosine
- lr_scheduler_warmup_steps: 10
- training_steps: 20
- mixed_precision_training: Native AMP
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:----:|:---------------:|
| 3.9112 | 0.02 | 20 | 2.6101 |
### Framework versions
- PEFT 0.8.2
- Transformers 4.38.0.dev0
- Pytorch 2.1.2+cu121
- Datasets 2.17.0
- Tokenizers 0.15.0 | {"license": "apache-2.0", "library_name": "peft", "tags": ["generated_from_trainer"], "base_model": "mistralai/Mistral-7B-v0.1", "model-index": [{"name": "DSM_output", "results": []}]} | null | joseagmz/DSM_output | [
"peft",
"safetensors",
"mistral",
"generated_from_trainer",
"base_model:mistralai/Mistral-7B-v0.1",
"license:apache-2.0",
"4-bit",
"region:us"
] | 2024-02-14T16:30:17+00:00 | [] | [] | TAGS
#peft #safetensors #mistral #generated_from_trainer #base_model-mistralai/Mistral-7B-v0.1 #license-apache-2.0 #4-bit #region-us
| <img src="URL alt="Built with Axolotl" width="200" height="32"/>
See axolotl config
axolotl version: '0.4.0'
DSM\_output
===========
This model is a fine-tuned version of mistralai/Mistral-7B-v0.1 on the DSM-5 text dataset listed in the axolotl config.
It achieves the following results on the evaluation set:
* Loss: 2.6101
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 0.0002
* train\_batch\_size: 1
* eval\_batch\_size: 1
* seed: 42
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: cosine
* lr\_scheduler\_warmup\_steps: 10
* training\_steps: 20
* mixed\_precision\_training: Native AMP
### Training results
### Framework versions
* PEFT 0.8.2
* Transformers 4.38.0.dev0
* Pytorch 2.1.2+cu121
* Datasets 2.17.0
* Tokenizers 0.15.0
| [
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 0.0002\n* train\\_batch\\_size: 1\n* eval\\_batch\\_size: 1\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: cosine\n* lr\\_scheduler\\_warmup\\_steps: 10\n* training\\_steps: 20\n* mixed\\_precision\\_training: Native AMP",
"### Training results",
"### Framework versions\n\n\n* PEFT 0.8.2\n* Transformers 4.38.0.dev0\n* Pytorch 2.1.2+cu121\n* Datasets 2.17.0\n* Tokenizers 0.15.0"
] | [
"TAGS\n#peft #safetensors #mistral #generated_from_trainer #base_model-mistralai/Mistral-7B-v0.1 #license-apache-2.0 #4-bit #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 0.0002\n* train\\_batch\\_size: 1\n* eval\\_batch\\_size: 1\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: cosine\n* lr\\_scheduler\\_warmup\\_steps: 10\n* training\\_steps: 20\n* mixed\\_precision\\_training: Native AMP",
"### Training results",
"### Framework versions\n\n\n* PEFT 0.8.2\n* Transformers 4.38.0.dev0\n* Pytorch 2.1.2+cu121\n* Datasets 2.17.0\n* Tokenizers 0.15.0"
] | [
51,
130,
4,
44
] | [
"passage: TAGS\n#peft #safetensors #mistral #generated_from_trainer #base_model-mistralai/Mistral-7B-v0.1 #license-apache-2.0 #4-bit #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 0.0002\n* train\\_batch\\_size: 1\n* eval\\_batch\\_size: 1\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: cosine\n* lr\\_scheduler\\_warmup\\_steps: 10\n* training\\_steps: 20\n* mixed\\_precision\\_training: Native AMP### Training results### Framework versions\n\n\n* PEFT 0.8.2\n* Transformers 4.38.0.dev0\n* Pytorch 2.1.2+cu121\n* Datasets 2.17.0\n* Tokenizers 0.15.0"
] | [
-0.14339207112789154,
0.09749188274145126,
-0.002712343353778124,
0.09908951073884964,
0.10643689334392548,
-0.0010761988814920187,
0.13607297837734222,
0.12393800169229507,
-0.08471038937568665,
0.09984428435564041,
0.13751348853111267,
0.09683908522129059,
0.05226800590753555,
0.22218021750450134,
-0.045383114367723465,
-0.23300085961818695,
0.048839014023542404,
-0.016485370695590973,
-0.024022718891501427,
0.12294382601976395,
0.08429724723100662,
-0.13502882421016693,
0.07050681114196777,
-0.01655612513422966,
-0.14913725852966309,
-0.04647308215498924,
-0.013067499734461308,
-0.03625136613845825,
0.11674799770116806,
-0.007075088564306498,
0.11433775722980499,
0.04389636218547821,
0.09690268337726593,
-0.2009267807006836,
0.01671936921775341,
0.057510726153850555,
0.01864583231508732,
0.09524302929639816,
0.06280418485403061,
-0.014339195564389229,
0.06334998458623886,
-0.10760712623596191,
0.07928735017776489,
0.01816590130329132,
-0.1434229463338852,
-0.2544080317020416,
-0.1449865847826004,
0.05784511938691139,
0.10007050633430481,
0.0719594731926918,
-0.0037851198576390743,
0.17147433757781982,
-0.04357225075364113,
0.0895649790763855,
0.2952117621898651,
-0.28179094195365906,
-0.07851884514093399,
0.024558749049901962,
0.050996337085962296,
0.11551543325185776,
-0.10244370996952057,
-0.04074399545788765,
0.05013702064752579,
0.04271519184112549,
0.1231919676065445,
-0.0012134979479014874,
-0.03127208352088928,
-0.014149118214845657,
-0.15262243151664734,
-0.06058155372738838,
0.10554227232933044,
0.031618453562259674,
-0.05380701273679733,
-0.04262582212686539,
-0.06264109164476395,
-0.20607730746269226,
-0.0626840814948082,
-0.0042574480175971985,
0.04130929335951805,
-0.03937140479683876,
-0.040904849767684937,
0.008897071704268456,
-0.08596434444189072,
-0.07844483852386475,
-0.008093129843473434,
0.12804804742336273,
0.052709195762872696,
0.011486612260341644,
-0.02235727570950985,
0.10877539217472076,
-0.025593791157007217,
-0.16692084074020386,
-0.021190941333770752,
0.002470490289852023,
-0.02230134978890419,
-0.0492548793554306,
-0.02783416211605072,
-0.010080216452479362,
0.04130737856030464,
0.17594926059246063,
-0.17558445036411285,
0.07731813937425613,
0.007802024018019438,
0.03430904075503349,
-0.09372961521148682,
0.12107067555189133,
-0.06340810656547546,
0.014140516519546509,
0.018585368990898132,
0.09703651815652847,
0.04460740461945534,
-0.023698437958955765,
-0.07026975601911545,
0.04558297246694565,
0.10188015550374985,
0.04681602492928505,
-0.034708261489868164,
0.012244454585015774,
-0.056935399770736694,
0.005279265809804201,
0.10235392302274704,
-0.09948226809501648,
0.03251510486006737,
0.0251797866076231,
-0.06140213832259178,
-0.07183795422315598,
0.009455381892621517,
0.025006795302033424,
0.0022572213783860207,
0.08369172364473343,
-0.08659864962100983,
0.027368757873773575,
-0.07529841363430023,
-0.12946850061416626,
0.039197150617837906,
-0.09228880703449249,
-0.006356931757181883,
-0.08604250103235245,
-0.14220015704631805,
-0.048672303557395935,
0.04187813028693199,
-0.05083368718624115,
-0.028819767758250237,
-0.05383742228150368,
-0.09493079781532288,
0.02685963362455368,
-0.016074584797024727,
0.10621000826358795,
-0.08629123121500015,
0.10211361944675446,
-0.026066510006785393,
0.06544192880392075,
-0.03797177970409393,
0.026760132983326912,
-0.08135078847408295,
0.05509379878640175,
-0.22385890781879425,
0.041542209684848785,
-0.08540307730436325,
0.0849178209900856,
-0.12556247413158417,
-0.09755414724349976,
0.034033019095659256,
-0.05012740194797516,
0.12576864659786224,
0.13434645533561707,
-0.20326706767082214,
-0.027042776346206665,
0.2003406286239624,
-0.10731173306703568,
-0.12049929797649384,
0.11737945675849915,
-0.045060768723487854,
0.0072940681129693985,
0.048523809760808945,
0.22151370346546173,
0.06782721728086472,
-0.1365097463130951,
0.012183722108602524,
-0.06002948805689812,
0.0890546441078186,
-0.01864613965153694,
0.07961229234933853,
-0.011485245078802109,
0.04032894968986511,
0.005553641356527805,
-0.05718892067670822,
0.0401025265455246,
-0.11296927183866501,
-0.08406618237495422,
-0.0416388213634491,
-0.08439343422651291,
0.02030722238123417,
0.041762277483940125,
0.03361112251877785,
-0.12060301750898361,
-0.07781723141670227,
0.03346233069896698,
0.10609374195337296,
-0.06045925244688988,
0.045381560921669006,
-0.06275498121976852,
0.12176000326871872,
-0.021056896075606346,
-0.02727375738322735,
-0.18029309809207916,
-0.04513135179877281,
0.035483960062265396,
-0.010595892556011677,
-0.01531616784632206,
-0.07264605909585953,
0.07840082794427872,
0.1057184487581253,
-0.041456833481788635,
-0.037849657237529755,
-0.03725532069802284,
0.013309670612215996,
-0.1162707507610321,
-0.25710049271583557,
-0.03475350886583328,
-0.05809365212917328,
0.13944676518440247,
-0.17799463868141174,
0.03833934664726257,
0.04346911981701851,
0.11121391505002975,
0.03199712187051773,
-0.04617875814437866,
0.0034280195832252502,
0.08342194557189941,
-0.013440658338367939,
-0.08488871157169342,
0.048985209316015244,
0.018658332526683807,
-0.07448864728212357,
0.007874655537307262,
-0.15003915131092072,
0.11299615353345871,
0.11477571725845337,
0.0811493769288063,
-0.09343580901622772,
-0.06511811167001724,
-0.06392987072467804,
-0.0312288086861372,
-0.03291326016187668,
0.06041223928332329,
0.12450458109378815,
0.027589362114667892,
0.11458109319210052,
-0.09947163611650467,
-0.04522152990102768,
0.047509826719760895,
-0.021115433424711227,
0.023145565763115883,
0.13290388882160187,
0.024636533111333847,
-0.10138428956270218,
0.13573208451271057,
0.13718321919441223,
-0.04929390177130699,
0.10231465846300125,
-0.07677826285362244,
-0.07166989147663116,
-0.030808359384536743,
0.04129670560359955,
0.01967410184442997,
0.15203234553337097,
-0.036056339740753174,
0.023556536063551903,
0.022980334237217903,
0.031733863055706024,
-0.0088678989559412,
-0.21155805885791779,
-0.03208715096116066,
0.015485190786421299,
-0.061101946979761124,
-0.06691066920757294,
-0.021226095035672188,
0.006446445360779762,
0.10099588334560394,
0.004342465661466122,
-0.07605672627687454,
-0.007077895570546389,
0.013122030533850193,
-0.0823613852262497,
0.2128092497587204,
-0.13672055304050446,
-0.08063049614429474,
-0.07066965103149414,
-0.03074384108185768,
-0.009222593158483505,
-0.012769862078130245,
0.08082344383001328,
-0.08527852594852448,
-0.019435802474617958,
-0.10879338532686234,
-0.07167820632457733,
0.04563207924365997,
0.013471432030200958,
0.005631611682474613,
-0.02195361815392971,
0.057434163987636566,
-0.10274171829223633,
-0.009935787878930569,
-0.03832687437534332,
0.00833191815763712,
0.07086314260959625,
0.023490803316235542,
0.1090736836194992,
0.14844590425491333,
-0.0003412218939047307,
0.018732454627752304,
-0.04215259850025177,
0.24882598221302032,
-0.0744718685746193,
-0.00871301256120205,
0.09733297675848007,
0.0010120637016370893,
0.08572878688573837,
0.15729601681232452,
0.05208095908164978,
-0.1000223383307457,
0.0002630585222505033,
-0.003589312080293894,
-0.04337781295180321,
-0.23048141598701477,
-0.03723489120602608,
-0.036275848746299744,
-0.02622712403535843,
0.09129194170236588,
0.035574913024902344,
-0.008528969250619411,
0.04158797860145569,
-0.005806955974549055,
-0.031777720898389816,
0.015270451083779335,
0.08314177393913269,
0.02056942693889141,
0.0389368012547493,
0.11130309104919434,
-0.037802860140800476,
0.011665231548249722,
0.04831387475132942,
0.009273293428122997,
0.2508815824985504,
-0.014385441318154335,
0.12787701189517975,
0.0691453218460083,
0.1919679194688797,
0.006591742392629385,
0.07012670487165451,
0.008882093243300915,
-0.022998306900262833,
-0.0072906906716525555,
-0.0690748542547226,
-0.027732182294130325,
0.034135375171899796,
-0.06883562356233597,
0.035248249769210815,
-0.10489542037248611,
0.028858833014965057,
0.053703323006629944,
0.3326539099216461,
0.04970258101820946,
-0.3521340489387512,
-0.10971421748399734,
0.003932766616344452,
-0.0003258800134062767,
-0.03847203403711319,
0.0003365691227372736,
0.14810070395469666,
-0.055842164903879166,
0.06194410100579262,
-0.058066949248313904,
0.091618113219738,
0.009296846576035023,
0.008942688815295696,
0.038338951766490936,
0.09028621017932892,
-0.024581415578722954,
0.016967378556728363,
-0.25277507305145264,
0.31301751732826233,
0.01542512234300375,
0.0832032710313797,
-0.02720293402671814,
-0.011415502987802029,
0.025997066870331764,
0.0950465053319931,
0.08530077338218689,
0.0033176832366734743,
-0.13468973338603973,
-0.19789129495620728,
-0.13353769481182098,
0.02397199720144272,
0.0854218378663063,
0.0009091131505556405,
0.09722838550806046,
0.005436442326754332,
0.003249003319069743,
0.03495870158076286,
-0.07154694944620132,
-0.09920767694711685,
-0.061372578144073486,
0.0008662007749080658,
0.003294786438345909,
0.0003621447831392288,
-0.09951420873403549,
-0.105925053358078,
-0.07044150680303574,
0.09251716732978821,
-0.04482891038060188,
-0.05296093970537186,
-0.1322900354862213,
0.042364031076431274,
0.11662859469652176,
-0.08250729739665985,
0.04649965837597847,
0.001625562901608646,
0.08788672834634781,
-0.017555436119437218,
-0.018373742699623108,
0.1046740785241127,
-0.0628800094127655,
-0.20973160862922668,
-0.056411806493997574,
0.11003835499286652,
0.07323043793439865,
0.06970534473657608,
-0.004353848751634359,
0.05141532048583031,
-0.006897009909152985,
-0.09332100301980972,
0.023178938776254654,
0.039912283420562744,
0.08457940071821213,
-0.024104148149490356,
-0.021868126466870308,
0.015245320275425911,
-0.07474129647016525,
-0.03916674479842186,
0.13071468472480774,
0.3288634121417999,
-0.10628638416528702,
0.08244869858026505,
0.07772926986217499,
-0.047699443995952606,
-0.17189690470695496,
0.020832041278481483,
0.08224235475063324,
-0.0005652729887515306,
0.016270365566015244,
-0.15498660504817963,
0.025172075256705284,
0.1195521429181099,
-0.03088422678411007,
0.09541380405426025,
-0.3332229554653168,
-0.12748035788536072,
0.08195775747299194,
0.13852719962596893,
0.062308527529239655,
-0.1626589298248291,
-0.042333632707595825,
0.006429576314985752,
-0.13251993060112,
0.06648585200309753,
-0.14348843693733215,
0.09514766186475754,
-0.020860841497778893,
0.051546476781368256,
0.006033864337950945,
-0.06794888526201248,
0.16127605736255646,
-0.0015391985652968287,
0.11371780931949615,
-0.04364651441574097,
0.01714801974594593,
0.06716550886631012,
-0.08658212423324585,
0.030792728066444397,
-0.06789208203554153,
0.05071193724870682,
-0.06342596560716629,
0.010084245353937149,
-0.0778932049870491,
0.012453318573534489,
-0.04261612892150879,
-0.03429858386516571,
-0.04890371859073639,
0.04725273326039314,
0.053796764463186264,
-0.019374873489141464,
0.1506648063659668,
0.0002075115335173905,
0.17673693597316742,
0.12598282098770142,
0.04945559799671173,
-0.09964026510715485,
-0.017183983698487282,
0.021175887435674667,
-0.04022517427802086,
0.05170849338173866,
-0.18165767192840576,
0.014403234235942364,
0.14104697108268738,
0.027623500674962997,
0.10002946853637695,
0.05020679160952568,
-0.07072722166776657,
0.02759925276041031,
0.05640498176217079,
-0.14913736283779144,
-0.11927706748247147,
0.04740225151181221,
0.07063417881727219,
-0.108186274766922,
0.0334748774766922,
0.11620905250310898,
-0.06786318123340607,
-0.028281833976507187,
-0.01611214503645897,
0.04297768324613571,
-0.02600017935037613,
0.22799226641654968,
0.04764856770634651,
0.07643072307109833,
-0.11305771768093109,
0.06475735455751419,
0.06152161583304405,
-0.0935487225651741,
0.03729111701250076,
0.10762716829776764,
-0.11768434941768646,
-0.033393118530511856,
0.0909624993801117,
0.12500710785388947,
-0.018338650465011597,
-0.04848825931549072,
-0.1286189705133438,
-0.12377389520406723,
0.08393760025501251,
0.20360885560512543,
0.06081046536564827,
0.02250969596207142,
0.017433006316423416,
-0.011273469775915146,
-0.09523110836744308,
0.09277979284524918,
0.04380268603563309,
0.07919593900442123,
-0.1111324355006218,
0.127350851893425,
-0.008641389198601246,
0.018755361437797546,
-0.009629356674849987,
0.02901133894920349,
-0.12003695219755173,
0.018864421173930168,
-0.146218940615654,
0.021753329783678055,
-0.055891700088977814,
0.003238694043830037,
-0.020335890352725983,
-0.05786241218447685,
-0.03980984538793564,
0.03671364113688469,
-0.1111922338604927,
-0.02661805972456932,
-0.013921000994741917,
0.0384242981672287,
-0.13888564705848694,
-0.0365326926112175,
0.023194337263703346,
-0.0831560268998146,
0.08097705245018005,
0.04336289316415787,
0.0052108545787632465,
0.024500638246536255,
-0.1253598928451538,
0.008333773352205753,
0.05570276454091072,
-0.017241185531020164,
0.05904567986726761,
-0.14881254732608795,
-0.030775586143136024,
-0.024701153859496117,
0.006295078434050083,
0.033953677862882614,
0.12098609656095505,
-0.11146362125873566,
0.003419033717364073,
-0.006705157458782196,
-0.06030168756842613,
-0.045557837933301926,
0.03915085643529892,
0.1101514995098114,
0.015991654247045517,
0.13626371324062347,
-0.0856124684214592,
0.03939993679523468,
-0.1954220086336136,
-0.025851234793663025,
-0.011904703453183174,
-0.1062704548239708,
-0.11929924041032791,
-0.015776358544826508,
0.0872400775551796,
-0.052757807075977325,
0.07195312529802322,
-0.030473781749606133,
0.03041093610227108,
0.036670442670583725,
-0.051784321665763855,
-0.020205825567245483,
0.04389568418264389,
0.1559763103723526,
0.01973893493413925,
-0.0351666621863842,
0.06719067692756653,
0.030578987672924995,
0.06795289367437363,
0.08465715497732162,
0.21302072703838348,
0.15617018938064575,
0.08813080936670303,
0.07933686673641205,
0.025890309363603592,
-0.08165113627910614,
-0.1279991865158081,
0.07638856768608093,
-0.04002824425697327,
0.08910734206438065,
-0.02324989251792431,
0.17625051736831665,
0.1405881941318512,
-0.19159410893917084,
0.03177862986922264,
-0.04861186817288399,
-0.08949445933103561,
-0.11106184870004654,
-0.019940059632062912,
-0.08382978290319443,
-0.15112194418907166,
-0.015356352552771568,
-0.11327498406171799,
0.03305915370583534,
0.08683440834283829,
0.018042996525764465,
0.04148497059941292,
0.1296422779560089,
0.07193846255540848,
0.03752196580171585,
0.03583723306655884,
0.02370874397456646,
-0.021293705329298973,
-0.04124082252383232,
-0.1114480122923851,
0.06198001652956009,
-0.052900560200214386,
0.04285229369997978,
-0.024459561333060265,
-0.009875879622995853,
0.06979218870401382,
-0.011591571383178234,
-0.10411228984594345,
0.02779383957386017,
0.032702527940273285,
0.06049191206693649,
0.0751703679561615,
0.03678351268172264,
0.008001508191227913,
-0.0016191614558920264,
0.20246168971061707,
-0.05418158322572708,
-0.0819435566663742,
-0.1287248283624649,
0.290010005235672,
0.027420829981565475,
-0.024004142731428146,
0.03611932322382927,
-0.07517022639513016,
0.005090121645480394,
0.12931670248508453,
0.16528946161270142,
-0.04117787629365921,
-0.0028651533648371696,
-0.022405855357646942,
-0.015680797398090363,
-0.059995125979185104,
0.1033189669251442,
0.13414302468299866,
0.004995562601834536,
-0.10408465564250946,
-0.023014770820736885,
-0.06360166519880295,
-0.03309284523129463,
-0.07287688553333282,
0.014592496678233147,
0.005492216907441616,
0.0030743791721761227,
-0.04895631596446037,
0.08690103143453598,
-0.040540482848882675,
-0.07747998088598251,
0.07371862232685089,
-0.19323621690273285,
-0.16924411058425903,
-0.008325979113578796,
0.05644868314266205,
0.007209781091660261,
0.042930807918310165,
-0.016001341864466667,
-0.008601497858762741,
0.11420179903507233,
-0.033740051090717316,
-0.03332999721169472,
-0.12336751818656921,
0.09429101645946503,
-0.11864388734102249,
0.2195662409067154,
-0.037359192967414856,
0.046546366065740585,
0.13128723204135895,
0.047140851616859436,
-0.13021285831928253,
0.058897607028484344,
0.06893961876630783,
-0.06936488300561905,
-0.01052011363208294,
0.0992516502737999,
-0.048665378242731094,
0.07373078167438507,
0.05181758478283882,
-0.10455788671970367,
0.0009040613076649606,
-0.0711866095662117,
-0.02906773053109646,
-0.029460720717906952,
-0.015011068433523178,
-0.034941185265779495,
0.12692028284072876,
0.15433834493160248,
-0.051745351403951645,
0.02106126770377159,
-0.052474748343229294,
0.024362850934267044,
0.050986506044864655,
0.02679135464131832,
-0.04249592870473862,
-0.26624947786331177,
0.048041582107543945,
0.06284419447183609,
0.0053169094026088715,
-0.22477076947689056,
-0.09160041064023972,
0.002866927767172456,
-0.04524052515625954,
-0.0751309022307396,
0.10405217111110687,
0.052329014986753464,
0.04640034958720207,
-0.055033113807439804,
-0.1042608916759491,
-0.06370382755994797,
0.16620425879955292,
-0.12198233604431152,
-0.06766664981842041
] |
null | null | transformers |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# volk_assistant
This model is a fine-tuned version of [google/flan-t5-base](https://huggingface.co/google/flan-t5-base) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0610
- Rouge1: 0.0
- Rouge2: 0.0
- Rougel: 0.0
- Rougelsum: 0.0
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0003
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 20
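
The listed setup can be approximated with the Hugging Face `Seq2SeqTrainer`; the sketch below is not the original training script, and the toy dataset, its column names (`input`/`target`), and the per-epoch evaluation strategy are assumptions added for illustration. The optimizer is left at the Trainer default (AdamW with betas=(0.9,0.999) and epsilon=1e-08), which matches the values above.

```python
# Reproduction sketch only — dataset contents and column names are placeholders.
from datasets import Dataset
from transformers import (
    AutoTokenizer,
    AutoModelForSeq2SeqLM,
    DataCollatorForSeq2Seq,
    Seq2SeqTrainingArguments,
    Seq2SeqTrainer,
)

tokenizer = AutoTokenizer.from_pretrained("google/flan-t5-base")
model = AutoModelForSeq2SeqLM.from_pretrained("google/flan-t5-base")

# Hypothetical toy data standing in for the (unknown) fine-tuning dataset.
raw = Dataset.from_dict({"input": ["example question"], "target": ["example answer"]})

def preprocess(batch):
    enc = tokenizer(batch["input"], truncation=True)
    enc["labels"] = tokenizer(text_target=batch["target"], truncation=True)["input_ids"]
    return enc

train_ds = eval_ds = raw.map(preprocess, remove_columns=raw.column_names)

args = Seq2SeqTrainingArguments(
    output_dir="volk_assistant",
    learning_rate=3e-4,               # 0.0003 as reported above
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=20,
    evaluation_strategy="epoch",      # assumption: matches the per-epoch table below
    predict_with_generate=True,       # needed to compute ROUGE from generations
)

trainer = Seq2SeqTrainer(
    model=model,
    args=args,
    train_dataset=train_ds,
    eval_dataset=eval_ds,
    tokenizer=tokenizer,
    data_collator=DataCollatorForSeq2Seq(tokenizer, model=model),
)
trainer.train()
```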
### Training results
| Training Loss | Epoch | Step | Validation Loss | Rouge1 | Rouge2 | Rougel | Rougelsum |
|:-------------:|:-----:|:----:|:---------------:|:------:|:------:|:------:|:---------:|
| No log | 1.0 | 4 | 0.2216 | 0.0 | 0.0 | 0.0 | 0.0 |
| No log | 2.0 | 8 | 0.1434 | 0.0 | 0.0 | 0.0 | 0.0 |
| No log | 3.0 | 12 | 0.1166 | 0.0 | 0.0 | 0.0 | 0.0 |
| No log | 4.0 | 16 | 0.0772 | 0.0 | 0.0 | 0.0 | 0.0 |
| No log | 5.0 | 20 | 0.0732 | 0.0 | 0.0 | 0.0 | 0.0 |
| No log | 6.0 | 24 | 0.0673 | 0.0 | 0.0 | 0.0 | 0.0 |
| No log | 7.0 | 28 | 0.0663 | 0.0 | 0.0 | 0.0 | 0.0 |
| No log | 8.0 | 32 | 0.0621 | 0.0 | 0.0 | 0.0 | 0.0 |
| No log | 9.0 | 36 | 0.0610 | 0.0 | 0.0 | 0.0 | 0.0 |
| No log | 10.0 | 40 | 0.0651 | 0.0 | 0.0 | 0.0 | 0.0 |
| No log | 11.0 | 44 | 0.0696 | 0.0 | 0.0 | 0.0 | 0.0 |
| No log | 12.0 | 48 | 0.0708 | 0.0 | 0.0 | 0.0 | 0.0 |
| No log | 13.0 | 52 | 0.0762 | 0.0 | 0.0 | 0.0 | 0.0 |
| No log | 14.0 | 56 | 0.0817 | 0.0 | 0.0 | 0.0 | 0.0 |
| No log | 15.0 | 60 | 0.0828 | 0.0 | 0.0 | 0.0 | 0.0 |
| No log | 16.0 | 64 | 0.0812 | 0.0 | 0.0 | 0.0 | 0.0 |
| No log | 17.0 | 68 | 0.0794 | 0.0 | 0.0 | 0.0 | 0.0 |
| No log | 18.0 | 72 | 0.0781 | 0.0 | 0.0 | 0.0 | 0.0 |
| No log | 19.0 | 76 | 0.0770 | 0.0 | 0.0 | 0.0 | 0.0 |
| No log | 20.0 | 80 | 0.0763 | 0.0 | 0.0 | 0.0 | 0.0 |
### Framework versions
- Transformers 4.37.2
- Pytorch 2.1.0+cu121
- Datasets 2.17.0
- Tokenizers 0.15.2
| {"license": "apache-2.0", "tags": ["generated_from_trainer"], "metrics": ["rouge"], "base_model": "google/flan-t5-base", "model-index": [{"name": "volk_assistant", "results": []}]} | text2text-generation | opelumen/volk_assistant | [
"transformers",
"tensorboard",
"safetensors",
"t5",
"text2text-generation",
"generated_from_trainer",
"base_model:google/flan-t5-base",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | 2024-02-14T16:32:34+00:00 | [] | [] | TAGS
#transformers #tensorboard #safetensors #t5 #text2text-generation #generated_from_trainer #base_model-google/flan-t5-base #license-apache-2.0 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
| volk\_assistant
===============
This model is a fine-tuned version of google/flan-t5-base on an unknown dataset.
It achieves the following results on the evaluation set:
* Loss: 0.0610
* Rouge1: 0.0
* Rouge2: 0.0
* Rougel: 0.0
* Rougelsum: 0.0
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 0.0003
* train\_batch\_size: 8
* eval\_batch\_size: 8
* seed: 42
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* num\_epochs: 20
### Training results
### Framework versions
* Transformers 4.37.2
* Pytorch 2.1.0+cu121
* Datasets 2.17.0
* Tokenizers 0.15.2
| [
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 0.0003\n* train\\_batch\\_size: 8\n* eval\\_batch\\_size: 8\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 20",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.37.2\n* Pytorch 2.1.0+cu121\n* Datasets 2.17.0\n* Tokenizers 0.15.2"
] | [
"TAGS\n#transformers #tensorboard #safetensors #t5 #text2text-generation #generated_from_trainer #base_model-google/flan-t5-base #license-apache-2.0 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 0.0003\n* train\\_batch\\_size: 8\n* eval\\_batch\\_size: 8\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 20",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.37.2\n* Pytorch 2.1.0+cu121\n* Datasets 2.17.0\n* Tokenizers 0.15.2"
] | [
80,
97,
4,
33
] | [
"passage: TAGS\n#transformers #tensorboard #safetensors #t5 #text2text-generation #generated_from_trainer #base_model-google/flan-t5-base #license-apache-2.0 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 0.0003\n* train\\_batch\\_size: 8\n* eval\\_batch\\_size: 8\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 20### Training results### Framework versions\n\n\n* Transformers 4.37.2\n* Pytorch 2.1.0+cu121\n* Datasets 2.17.0\n* Tokenizers 0.15.2"
] | [
-0.09367556124925613,
0.09084559977054596,
-0.0018313603941351175,
0.11850062757730484,
0.11552663147449493,
0.0005932221538387239,
0.1909739226102829,
0.11951612681150436,
-0.04825383797287941,
0.035428237169981,
0.1405571550130844,
0.10192251950502396,
0.018162043765187263,
0.13173848390579224,
-0.05455389246344566,
-0.20777572691440582,
0.013213039375841618,
0.025291118770837784,
-0.0460193008184433,
0.1355830729007721,
0.0946124717593193,
-0.10633713006973267,
0.11493929475545883,
-0.004634972661733627,
-0.14735132455825806,
0.013133461587131023,
0.03440139442682266,
-0.05218353495001793,
0.1463710367679596,
0.045646850019693375,
0.08748238533735275,
0.04426147788763046,
0.05972830578684807,
-0.17912043631076813,
0.014214163646101952,
0.06101902201771736,
-0.013536393642425537,
0.08927556127309799,
0.04848590865731239,
-0.0017459191149100661,
0.0782984271645546,
-0.06555570662021637,
0.03670068830251694,
0.03929819166660309,
-0.1310797780752182,
-0.20688416063785553,
-0.0762876570224762,
0.043994635343551636,
0.06478438526391983,
0.09497243911027908,
-0.015107548795640469,
0.12783317267894745,
-0.004535256884992123,
0.1065000593662262,
0.2354838252067566,
-0.3298100233078003,
-0.063090018928051,
0.05930311605334282,
0.06171388924121857,
0.10429172962903976,
-0.08450903743505478,
0.00942184403538704,
0.05677454546093941,
0.01838821917772293,
0.14854881167411804,
-0.03402969613671303,
-0.014558673836290836,
0.0030361649114638567,
-0.12163913995027542,
-0.0378933846950531,
0.19573171436786652,
0.07254207879304886,
-0.05297117680311203,
-0.06472498178482056,
-0.08366790413856506,
-0.12090238183736801,
-0.014999685809016228,
-0.013007828034460545,
0.04811018332839012,
-0.006108755711466074,
-0.0888485535979271,
-0.0675373375415802,
-0.11222781240940094,
-0.06874585151672363,
-0.04045262187719345,
0.11470671743154526,
0.016837604343891144,
-0.00047253252705559134,
-0.025501953437924385,
0.10547418147325516,
-0.013426941819489002,
-0.151221364736557,
0.01740063913166523,
0.024170534685254097,
0.02674907259643078,
-0.033332303166389465,
-0.05050850287079811,
-0.11555314809083939,
0.02856689877808094,
0.13037532567977905,
-0.04766666144132614,
0.04817500337958336,
-0.011345594190061092,
0.045251552015542984,
-0.11380328983068466,
0.17972232401371002,
-0.049271468073129654,
-0.05447382852435112,
0.037166908383369446,
0.10127057880163193,
0.07416436076164246,
-0.019217919558286667,
-0.14122770726680756,
0.011224808171391487,
0.11975332349538803,
0.012790678068995476,
-0.029897358268499374,
0.07983899116516113,
-0.048759300261735916,
-0.019792679697275162,
0.004013502970337868,
-0.08450990915298462,
0.006776771973818541,
-0.01316018681973219,
-0.050472911447286606,
-0.07472915947437286,
0.04130125045776367,
0.030392490327358246,
-0.0027103638276457787,
0.057237595319747925,
-0.09368495643138885,
-0.006413265131413937,
-0.062087979167699814,
-0.10447565466165543,
0.01345948688685894,
-0.07029516249895096,
0.019454432651400566,
-0.11720399558544159,
-0.21552716195583344,
0.00015932845417410135,
0.05843733996152878,
-0.029114089906215668,
-0.05669489875435829,
-0.058118660002946854,
-0.0637931376695633,
0.010259654372930527,
-0.017024703323841095,
0.07666826248168945,
-0.06635458022356033,
0.10743242502212524,
0.05608302727341652,
0.05221942067146301,
-0.07282474637031555,
0.031295184046030045,
-0.10821438580751419,
0.04730231687426567,
-0.1575353592634201,
0.04470561444759369,
-0.022080659866333008,
0.080081045627594,
-0.10144039243459702,
-0.06671735644340515,
-0.02810509316623211,
-0.005893992260098457,
0.07355347275733948,
0.10236529260873795,
-0.1548520028591156,
-0.06614170968532562,
0.1727496236562729,
-0.08198090642690659,
-0.19348785281181335,
0.14074677228927612,
-0.04023737460374832,
0.08847451955080032,
0.07301773130893707,
0.20728003978729248,
0.06601186096668243,
-0.0808776468038559,
0.013766147196292877,
-0.010656950995326042,
0.07000290602445602,
-0.04338111728429794,
0.08756964653730392,
-0.01104128547012806,
-0.010712231509387493,
0.016743317246437073,
-0.05699215456843376,
0.06887581944465637,
-0.06260278820991516,
-0.07723679393529892,
-0.047969672828912735,
-0.10714810341596603,
0.05080922320485115,
0.03834261745214462,
0.06473435461521149,
-0.10879582911729813,
-0.09668523073196411,
0.030951527878642082,
0.050518061965703964,
-0.08918894827365875,
0.024021374061703682,
-0.07083337008953094,
0.10523170977830887,
-0.07257729023694992,
-0.004304306581616402,
-0.13738203048706055,
-0.05644689500331879,
0.01795879937708378,
-0.0007336498238146305,
0.02704974263906479,
-0.015203434973955154,
0.08325725793838501,
0.06625154614448547,
-0.08124103397130966,
-0.040326718240976334,
-0.018928170204162598,
0.006679220125079155,
-0.11379699409008026,
-0.1725238859653473,
-0.01345480140298605,
-0.019718017429113388,
0.15267644822597504,
-0.2136041820049286,
0.056952979415655136,
0.005180777050554752,
0.08680237084627151,
0.03443160280585289,
-0.019201304763555527,
-0.02595512568950653,
0.03265918791294098,
-0.05239850655198097,
-0.08044543862342834,
0.0686192661523819,
0.028657296672463417,
-0.1291476935148239,
0.0000982775745796971,
-0.16191011667251587,
0.19703353941440582,
0.13174040615558624,
-0.07615852355957031,
-0.05417146906256676,
0.003142531029880047,
-0.03613562509417534,
-0.03432493656873703,
-0.03286579251289368,
-0.03631957992911339,
0.12634117901325226,
0.004579805303364992,
0.16737671196460724,
-0.10988378524780273,
-0.045288391411304474,
0.026392195373773575,
-0.03483732417225838,
0.0033993201795965433,
0.1042141541838646,
0.036865297704935074,
-0.1282653510570526,
0.14181679487228394,
0.19191604852676392,
-0.05761546269059181,
0.1346343606710434,
-0.045474614948034286,
-0.05717705935239792,
-0.031116411089897156,
0.026933444663882256,
0.015297649428248405,
0.10098294168710709,
-0.10720985382795334,
0.0096112210303545,
0.008223334327340126,
0.009995914995670319,
0.016315270215272903,
-0.1988515555858612,
-0.029261944815516472,
0.05101630091667175,
-0.06450258940458298,
-0.012266797944903374,
-0.005173663143068552,
-0.031853463500738144,
0.09088367223739624,
0.006145566701889038,
-0.06899305433034897,
0.05731382220983505,
0.0023471531458199024,
-0.08936034142971039,
0.1934899240732193,
-0.053288571536540985,
-0.17294347286224365,
-0.1637454628944397,
-0.057584188878536224,
-0.07920494675636292,
0.03282417356967926,
0.07309260219335556,
-0.04217236116528511,
-0.03697533905506134,
-0.13638059794902802,
0.0035367438104003668,
0.002091925125569105,
0.01778499037027359,
0.0159376859664917,
-0.007959128357470036,
0.09518563002347946,
-0.10352354496717453,
-0.007035794202238321,
0.0020303933415561914,
-0.03563627973198891,
0.024842562153935432,
-0.0065634590573608875,
0.11070957779884338,
0.11480250954627991,
-0.018943440169095993,
0.009352822788059711,
-0.03539589047431946,
0.23258034884929657,
-0.06021417677402496,
0.0011101121781393886,
0.16283243894577026,
-0.008708624169230461,
0.061010684818029404,
0.12726083397865295,
0.040883954614400864,
-0.09626661986112595,
0.029982950538396835,
0.025554820895195007,
-0.03851946070790291,
-0.21444979310035706,
0.0008368752896785736,
-0.04453638568520546,
0.03151630982756615,
0.0989200547337532,
0.03599996864795685,
0.0734509751200676,
0.07308001071214676,
0.014781762845814228,
0.10183846205472946,
0.01658891886472702,
0.08260814100503922,
0.14017465710639954,
0.042913779616355896,
0.12406649440526962,
-0.047770906239748,
-0.04540140926837921,
0.03664727136492729,
0.0052597480826079845,
0.1721419245004654,
0.01048937626183033,
0.20968768000602722,
0.028796909376978874,
0.1449708640575409,
-0.011380525305867195,
0.07577741891145706,
-0.005775236990302801,
-0.024716587737202644,
-0.018260164186358452,
-0.05643171817064285,
-0.029412826523184776,
0.03562769666314125,
-0.07882694154977798,
0.06352316588163376,
-0.07829230278730392,
0.02376147359609604,
0.0498545840382576,
0.26229575276374817,
0.048236194998025894,
-0.34788283705711365,
-0.09755296260118484,
0.02053825929760933,
-0.014555448666214943,
-0.03765182942152023,
0.03174727410078049,
0.14430314302444458,
-0.05078738555312157,
0.04634051397442818,
-0.08491109311580658,
0.08825650066137314,
-0.028641926124691963,
0.047698237001895905,
0.056449536234140396,
0.06277512013912201,
-0.014512740075588226,
0.07138436287641525,
-0.28707870841026306,
0.2521931827068329,
0.016486817970871925,
0.05258272588253021,
-0.043304961174726486,
0.0001207059613079764,
0.022757573053240776,
0.0748838409781456,
0.08947616070508957,
-0.014484584331512451,
-0.023494407534599304,
-0.1621878743171692,
-0.08388404548168182,
0.027174077928066254,
0.08630681782960892,
-0.07543912529945374,
0.10349193960428238,
-0.05787211284041405,
-0.0035534484777599573,
0.0748685970902443,
0.028011497110128403,
-0.08361760526895523,
-0.09770777821540833,
0.002572328317910433,
0.054639484733343124,
0.014717650599777699,
-0.08667626231908798,
-0.09097251296043396,
-0.12029460072517395,
0.15274955332279205,
-0.022773345932364464,
-0.048080768436193466,
-0.10163747519254684,
0.05807938799262047,
0.05997229367494583,
-0.0792020633816719,
0.03814248740673065,
0.0006262582610361278,
0.08436579257249832,
0.021589327603578568,
-0.06971874833106995,
0.12403109669685364,
-0.05149718001484871,
-0.17080695927143097,
-0.05424508824944496,
0.13937701284885406,
-0.01919632777571678,
0.035079196095466614,
-0.001945223775692284,
0.015146327205002308,
-0.04105718806385994,
-0.06821034103631973,
0.026152990758419037,
-0.03723657503724098,
0.047037288546562195,
-0.018519893288612366,
-0.023600980639457703,
0.037083808332681656,
-0.059680063277482986,
-0.048260338604450226,
0.15532968938350677,
0.2776298224925995,
-0.07241034507751465,
-0.007331266533583403,
0.049521684646606445,
-0.0487150214612484,
-0.16643579304218292,
0.011477652937173843,
0.015221046283841133,
0.0035776335280388594,
0.07286487519741058,
-0.13343144953250885,
0.0678265243768692,
0.08062677830457687,
-0.022579427808523178,
0.11278335005044937,
-0.29488512873649597,
-0.14994050562381744,
0.09050014615058899,
0.1606510877609253,
0.12036561965942383,
-0.16940079629421234,
-0.06025150045752525,
-0.04476052150130272,
-0.14436572790145874,
0.10701078921556473,
-0.1421368271112442,
0.10927949845790863,
-0.002910568844527006,
0.05446096882224083,
0.009582148864865303,
-0.0490504689514637,
0.12221162766218185,
-0.03687175735831261,
0.0872666984796524,
-0.06937652826309204,
-0.00682693300768733,
0.09085562825202942,
-0.06615166366100311,
0.032896555960178375,
-0.14934949576854706,
0.05112448334693909,
-0.04273902624845505,
-0.034295789897441864,
-0.04910697788000107,
0.033906713128089905,
-0.03819496929645538,
-0.0592694915831089,
-0.03241895139217377,
0.013067934662103653,
0.04381971433758736,
-0.009709601290524006,
0.17303645610809326,
-0.000892278621904552,
0.14114898443222046,
0.16513976454734802,
0.09419136494398117,
-0.06529048085212708,
-0.020182861015200615,
-0.02069728635251522,
-0.0427265428006649,
0.04161861538887024,
-0.16070671379566193,
0.041248783469200134,
0.10719285160303116,
0.009774119593203068,
0.14745508134365082,
0.06687810271978378,
-0.033375855535268784,
0.014410513453185558,
0.06192449852824211,
-0.1862214058637619,
-0.1705123335123062,
-0.05474308505654335,
-0.042496245354413986,
-0.13288961350917816,
0.056527409702539444,
0.14172929525375366,
-0.07176143676042557,
0.007016526535153389,
-0.00421698484569788,
0.0024187848903238773,
-0.03048255294561386,
0.14990660548210144,
0.04642442986369133,
0.044099874794483185,
-0.071163609623909,
0.08482250571250916,
0.04922787845134735,
-0.05732579156756401,
0.006953593343496323,
0.0383315309882164,
-0.09187980741262436,
-0.04138752445578575,
0.014143084175884724,
0.16503965854644775,
-0.04574987664818764,
-0.046581003814935684,
-0.16762326657772064,
-0.10808312147855759,
0.03611869364976883,
0.16410426795482635,
0.07563668489456177,
0.03479620814323425,
-0.023668412119150162,
0.00030711424187757075,
-0.08334522694349289,
0.12502379715442657,
0.036872368305921555,
0.08777011930942535,
-0.17429405450820923,
0.10101412236690521,
-0.0002662220213096589,
0.013625318184494972,
-0.02193237841129303,
0.03864812105894089,
-0.09716097265481949,
-0.007368128281086683,
-0.13762661814689636,
0.0000684900805936195,
-0.018107470124959946,
0.0006678990321233869,
-0.009081915020942688,
-0.0676812082529068,
-0.06344884634017944,
0.018477315083146095,
-0.09262776374816895,
-0.038952261209487915,
0.04082803428173065,
0.056143805384635925,
-0.1252780258655548,
-0.02975585125386715,
0.032033488154411316,
-0.07492523640394211,
0.08589798212051392,
0.011412905529141426,
0.00300126476213336,
0.03922048211097717,
-0.155101016163826,
0.050139810889959335,
0.039208658039569855,
0.007243696600198746,
0.020712709054350853,
-0.088508740067482,
-0.020102374255657196,
0.0006492913234978914,
0.034032709896564484,
0.01648736372590065,
0.08013945072889328,
-0.12368308007717133,
-0.005565709434449673,
-0.014666177332401276,
-0.04088156670331955,
-0.05758478119969368,
0.02933177724480629,
0.06508041173219681,
0.022341785952448845,
0.20834942162036896,
-0.09045924991369247,
0.00528707867488265,
-0.21819142997264862,
0.016820145770907402,
0.013711865991353989,
-0.12329622358083725,
-0.1237090602517128,
-0.07023697346448898,
0.03958559408783913,
-0.061981599777936935,
0.11148586124181747,
-0.009771565906703472,
0.05372420325875282,
0.03069278970360756,
-0.008877516724169254,
0.06155996769666672,
0.013224881142377853,
0.2464974969625473,
-0.00038659613346681,
-0.03425300866365433,
0.03573625534772873,
0.026451587677001953,
0.10294786840677261,
0.08968936651945114,
0.15766409039497375,
0.14289098978042603,
-0.06670531630516052,
0.10445814579725266,
0.040239233523607254,
-0.015110266394913197,
-0.14077728986740112,
0.045155614614486694,
-0.027438966557383537,
0.107254259288311,
-0.024897964671254158,
0.22032023966312408,
0.11908238381147385,
-0.15276171267032623,
0.011253181844949722,
-0.04518134519457817,
-0.06400521844625473,
-0.09055148810148239,
-0.09991070628166199,
-0.09620390087366104,
-0.15486715734004974,
-0.006325446534901857,
-0.1032448261976242,
0.013373800553381443,
0.08417879790067673,
0.004467658698558807,
-0.033669181168079376,
0.16890718042850494,
0.013166253454983234,
-0.005263449624180794,
0.05260738357901573,
-0.0005770425195805728,
-0.03965218737721443,
-0.07000814378261566,
-0.10039713978767395,
0.009874429553747177,
-0.0026163693983107805,
0.01662176288664341,
-0.032634638249874115,
-0.013816660270094872,
0.030721833929419518,
-0.02374386414885521,
-0.1059427410364151,
0.0038150029722601175,
0.031089289113879204,
0.04467691481113434,
0.02115628495812416,
0.009729157201945782,
-0.0051432461477816105,
0.0031829336658120155,
0.2471739500761032,
-0.07988535612821579,
-0.054347388446331024,
-0.0875198021531105,
0.16562491655349731,
-0.0006224307580851018,
-0.009073966182768345,
0.006649685557931662,
-0.09515829384326935,
0.041768480092287064,
0.23368746042251587,
0.1646064668893814,
-0.08625385165214539,
-0.006140556186437607,
-0.013025414198637009,
-0.0061237746849656105,
-0.00856162142008543,
0.07996407896280289,
0.09232846647500992,
-0.006843077950179577,
-0.06565473228693008,
-0.020025255158543587,
-0.034951351583004,
-0.00676740100607276,
-0.03373442590236664,
0.08716550469398499,
0.01300771813839674,
0.009638710878789425,
-0.046826597303152084,
0.06455069780349731,
-0.03663068264722824,
-0.08665069937705994,
0.011926987208425999,
-0.19447782635688782,
-0.12288688123226166,
-0.03268659859895706,
0.10223007202148438,
-0.01959419995546341,
0.0429343655705452,
-0.02564648538827896,
0.028148580342531204,
0.04275566712021828,
-0.021670471876859665,
-0.08295546472072601,
-0.04831584915518761,
0.05577346682548523,
-0.11860526353120804,
0.22857125103473663,
-0.04432506859302521,
0.029425622895359993,
0.1266350895166397,
0.03381900489330292,
-0.08703494817018509,
0.09053012728691101,
0.0428650788962841,
-0.033731263130903244,
0.04101641848683357,
0.06736791878938675,
-0.01852540671825409,
0.10831315070390701,
0.04463604465126991,
-0.09012097120285034,
0.014394555240869522,
-0.021564193069934845,
-0.05520027130842209,
-0.05720512941479683,
-0.05497971177101135,
-0.05919251590967178,
0.1408008337020874,
0.16224908828735352,
-0.060529038310050964,
-0.007454846519976854,
-0.05381530150771141,
0.015001037158071995,
0.0849640816450119,
0.029446173459291458,
-0.021078821271657944,
-0.22068825364112854,
0.0015100188320502639,
0.07652390748262405,
-0.01106933131814003,
-0.30983445048332214,
-0.08159074187278748,
-0.016235552728176117,
-0.04115520045161247,
-0.08870331197977066,
0.0900878831744194,
0.14399705827236176,
0.04371050372719765,
-0.053218502551317215,
-0.060721565037965775,
-0.07596644014120102,
0.165596604347229,
-0.13020490109920502,
-0.09688983112573624
] |
null | null | transformers |
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
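
Since this section is not filled in, the snippet below is only a minimal loading sketch: it assumes the repository contains a standard 🤗 Transformers config and weight files (the exact architecture is not stated in this card), and it should be adjusted once that is confirmed.

```python
# Minimal loading sketch — the model class is resolved from the repo config;
# nothing here is taken from the (empty) original section.
from transformers import AutoConfig, AutoModel

model_id = "readingrocket/clip-vit-base-patch32-001"

config = AutoConfig.from_pretrained(model_id)   # inspect the declared architecture first
print(config.architectures)

model = AutoModel.from_pretrained(model_id)
model.eval()
```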
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed]
| {"library_name": "transformers", "tags": []} | null | readingrocket/clip-vit-base-patch32-001 | [
"transformers",
"arxiv:1910.09700",
"endpoints_compatible",
"region:us"
] | 2024-02-14T16:38:34+00:00 | [
"1910.09700"
] | [] | TAGS
#transformers #arxiv-1910.09700 #endpoints_compatible #region-us
|
# Model Card for Model ID
## Model Details
### Model Description
This is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.
- Developed by:
- Funded by [optional]:
- Shared by [optional]:
- Model type:
- Language(s) (NLP):
- License:
- Finetuned from model [optional]:
### Model Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Downstream Use [optional]
### Out-of-Scope Use
## Bias, Risks, and Limitations
### Recommendations
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
## Training Details
### Training Data
### Training Procedure
#### Preprocessing [optional]
#### Training Hyperparameters
- Training regime:
#### Speeds, Sizes, Times [optional]
## Evaluation
### Testing Data, Factors & Metrics
#### Testing Data
#### Factors
#### Metrics
### Results
#### Summary
## Model Examination [optional]
## Environmental Impact
Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).
- Hardware Type:
- Hours used:
- Cloud Provider:
- Compute Region:
- Carbon Emitted:
## Technical Specifications [optional]
### Model Architecture and Objective
### Compute Infrastructure
#### Hardware
#### Software
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Model Card Authors [optional]
## Model Card Contact
| [
"# Model Card for Model ID",
"## Model Details",
"### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:",
"### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Downstream Use [optional]",
"### Out-of-Scope Use",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.",
"## How to Get Started with the Model\n\nUse the code below to get started with the model.",
"## Training Details",
"### Training Data",
"### Training Procedure",
"#### Preprocessing [optional]",
"#### Training Hyperparameters\n\n- Training regime:",
"#### Speeds, Sizes, Times [optional]",
"## Evaluation",
"### Testing Data, Factors & Metrics",
"#### Testing Data",
"#### Factors",
"#### Metrics",
"### Results",
"#### Summary",
"## Model Examination [optional]",
"## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:",
"## Technical Specifications [optional]",
"### Model Architecture and Objective",
"### Compute Infrastructure",
"#### Hardware",
"#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Model Card Authors [optional]",
"## Model Card Contact"
] | [
"TAGS\n#transformers #arxiv-1910.09700 #endpoints_compatible #region-us \n",
"# Model Card for Model ID",
"## Model Details",
"### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:",
"### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Downstream Use [optional]",
"### Out-of-Scope Use",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.",
"## How to Get Started with the Model\n\nUse the code below to get started with the model.",
"## Training Details",
"### Training Data",
"### Training Procedure",
"#### Preprocessing [optional]",
"#### Training Hyperparameters\n\n- Training regime:",
"#### Speeds, Sizes, Times [optional]",
"## Evaluation",
"### Testing Data, Factors & Metrics",
"#### Testing Data",
"#### Factors",
"#### Metrics",
"### Results",
"#### Summary",
"## Model Examination [optional]",
"## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:",
"## Technical Specifications [optional]",
"### Model Architecture and Objective",
"### Compute Infrastructure",
"#### Hardware",
"#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Model Card Authors [optional]",
"## Model Card Contact"
] | [
26,
6,
3,
82,
28,
3,
4,
9,
9,
10,
42,
20,
3,
4,
5,
9,
11,
13,
3,
12,
5,
4,
5,
3,
4,
9,
53,
9,
8,
6,
3,
14,
8,
7,
9,
4
] | [
"passage: TAGS\n#transformers #arxiv-1910.09700 #endpoints_compatible #region-us \n# Model Card for Model ID## Model Details### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Downstream Use [optional]### Out-of-Scope Use## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.## How to Get Started with the Model\n\nUse the code below to get started with the model.## Training Details### Training Data### Training Procedure#### Preprocessing [optional]#### Training Hyperparameters\n\n- Training regime:#### Speeds, Sizes, Times [optional]## Evaluation### Testing Data, Factors & Metrics#### Testing Data#### Factors#### Metrics### Results#### Summary## Model Examination [optional]## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:## Technical Specifications [optional]### Model Architecture and Objective### Compute Infrastructure#### Hardware#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Model Card Authors [optional]## Model Card Contact"
] | [
-0.08389580249786377,
0.19830818474292755,
-0.0013316317927092314,
0.02313883788883686,
0.11396584659814835,
0.01961737498641014,
0.053626976907253265,
0.14538456499576569,
0.0060051376931369305,
0.10656800121068954,
0.066679947078228,
0.09131570905447006,
0.09678101539611816,
0.20042605698108673,
0.04371999576687813,
-0.17659740149974823,
0.010636410675942898,
-0.06930278241634369,
-0.010073255747556686,
0.11651819199323654,
0.141214057803154,
-0.10151198506355286,
0.07627976685762405,
-0.03319970890879631,
-0.02870541252195835,
-0.0070160143077373505,
-0.07769215852022171,
-0.05755697935819626,
0.07573003321886063,
0.054863471537828445,
0.04207949340343475,
-0.0008347301045432687,
0.08447454124689102,
-0.2674994468688965,
0.013753628358244896,
0.07452993094921112,
0.010659529827535152,
0.05990942195057869,
0.07833302766084671,
-0.04036625102162361,
0.12881849706172943,
-0.06320446729660034,
0.13035163283348083,
0.0906217098236084,
-0.0681561604142189,
-0.24378153681755066,
-0.08239314705133438,
0.06505522131919861,
0.12533815205097198,
0.07694927603006363,
-0.02823091857135296,
0.16422191262245178,
-0.07247646898031235,
0.019290022552013397,
0.09481704235076904,
-0.1151006743311882,
-0.060644298791885376,
0.08318385481834412,
0.14101974666118622,
0.10340547561645508,
-0.1255619376897812,
-0.012289565056562424,
0.04275871813297272,
0.045979104936122894,
0.07389909774065018,
0.011339850723743439,
0.1143413558602333,
0.05629947781562805,
-0.13526225090026855,
-0.05700986459851265,
0.14547574520111084,
0.023872992023825645,
-0.057064127177000046,
-0.2138909548521042,
-0.002902575535699725,
-0.07730814069509506,
-0.011685127392411232,
-0.06846728920936584,
0.0291305985301733,
-0.01194276288151741,
0.060226380825042725,
-0.0496203787624836,
-0.09797755628824234,
-0.046314824372529984,
0.1015089675784111,
0.054820988327264786,
0.011354796588420868,
-0.01489334274083376,
0.03576440364122391,
0.13432876765727997,
0.04213530570268631,
-0.10012737661600113,
-0.07065672427415848,
-0.0701170489192009,
-0.09620913118124008,
-0.03947552293539047,
0.04272124543786049,
0.020167991518974304,
0.042202774435281754,
0.2283228635787964,
0.024096308276057243,
0.05459817871451378,
0.029667891561985016,
0.0026177873369306326,
0.03211980313062668,
0.1073630079627037,
-0.041210614144802094,
-0.188126802444458,
-0.03292805701494217,
0.0931866466999054,
-0.009821015410125256,
-0.028658604249358177,
-0.033444397151470184,
0.035014089196920395,
0.08379437029361725,
0.11821532249450684,
0.08875755965709686,
-0.012828069739043713,
-0.037612639367580414,
-0.03493109717965126,
0.2115669697523117,
-0.14141373336315155,
0.045799970626831055,
-0.022097334265708923,
-0.018195297569036484,
-0.06905751675367355,
0.030103791505098343,
0.01831657998263836,
-0.003142025787383318,
0.06966056674718857,
-0.061253178864717484,
-0.05794486775994301,
-0.11518853157758713,
-0.045523155480623245,
0.04711875319480896,
-0.024105608463287354,
-0.024469668045639992,
-0.07765042781829834,
-0.11219723522663116,
-0.06417357176542282,
0.06612563133239746,
-0.04156653955578804,
-0.03974827378988266,
0.005308232270181179,
-0.07131324708461761,
0.008387917652726173,
0.008993842639029026,
0.12122467905282974,
-0.030063031241297722,
0.05833350867033005,
-0.002476902212947607,
0.05916252359747887,
0.10643328726291656,
0.03227818012237549,
-0.08492200076580048,
0.057466037571430206,
-0.20633617043495178,
0.08371785283088684,
-0.11420095711946487,
0.034276340156793594,
-0.17048145830631256,
-0.024183684960007668,
0.008447963744401932,
0.023597201332449913,
0.023726604878902435,
0.1338067352771759,
-0.2097422182559967,
-0.016196569427847862,
0.14133213460445404,
-0.09649793803691864,
-0.12422871589660645,
0.07990546524524689,
-0.03459475561976433,
0.1747698187828064,
0.038475677371025085,
-0.019652999937534332,
0.09909367561340332,
-0.15559963881969452,
-0.05852397903800011,
-0.026064254343509674,
-0.008927824907004833,
0.08823978155851364,
0.07542291283607483,
-0.05844951793551445,
0.02285866066813469,
0.02562655322253704,
-0.04727208614349365,
-0.0268824752420187,
-0.05256075784564018,
-0.10127434879541397,
-0.023140445351600647,
-0.09642518311738968,
0.026515161618590355,
0.000058677000197349116,
-0.07310442626476288,
-0.028560271486639977,
-0.17347893118858337,
-0.02563360333442688,
0.10103316605091095,
0.004820956848561764,
-0.007559072691947222,
-0.08540112525224686,
0.022149885073304176,
-0.05362366884946823,
-0.006164622958749533,
-0.16996455192565918,
-0.03558015450835228,
0.051895126700401306,
-0.14917676150798798,
0.015460150316357613,
-0.07327745854854584,
0.07047311216592789,
0.02098717913031578,
-0.05859505757689476,
-0.03108096309006214,
0.0007694467785768211,
0.004292082041501999,
-0.06229274719953537,
-0.1903683841228485,
-0.058886781334877014,
-0.041500482708215714,
0.15720732510089874,
-0.24841000139713287,
0.0300158578902483,
0.03247617185115814,
0.13185922801494598,
0.007058668415993452,
-0.06344027817249298,
0.02096918225288391,
-0.04676475748419762,
-0.050621338188648224,
-0.06898977607488632,
-0.009901339188218117,
-0.014539826661348343,
-0.031393732875585556,
0.012980648316442966,
-0.14970256388187408,
-0.060514215379953384,
0.09452559798955917,
0.11224991828203201,
-0.14555825293064117,
0.00204002158716321,
-0.0460561066865921,
-0.07002599537372589,
-0.07487804442644119,
-0.0761631652712822,
0.07739497721195221,
0.044650159776210785,
0.049250341951847076,
-0.06317461282014847,
-0.06234706938266754,
0.023210179060697556,
0.005524294450879097,
-0.019023682922124863,
0.0948529988527298,
0.074309803545475,
-0.09122881293296814,
0.07973480224609375,
0.08461450785398483,
0.04414684325456619,
0.086973637342453,
0.005991141777485609,
-0.11396963149309158,
-0.03062884695827961,
0.037754856050014496,
0.024159027263522148,
0.15351562201976776,
-0.08692087233066559,
0.030462130904197693,
0.052177220582962036,
-0.03854219615459442,
0.03157065063714981,
-0.0923321321606636,
0.025362705811858177,
0.021495236083865166,
-0.006555700208991766,
0.05864228308200836,
-0.018769768998026848,
-0.01403577346354723,
0.06336429715156555,
0.05677810311317444,
0.044270504266023636,
0.02595379762351513,
-0.02093072421848774,
-0.1278371512889862,
0.16537296772003174,
-0.09028079360723495,
-0.2540280222892761,
-0.17074446380138397,
0.015454737469553947,
0.03706491366028786,
-0.021728800609707832,
0.039588842540979385,
-0.06286025792360306,
-0.10237989574670792,
-0.09417891502380371,
0.0029635571409016848,
0.023925531655550003,
-0.058347854763269424,
-0.0817074254155159,
0.060779985040426254,
0.04047083482146263,
-0.13689260184764862,
0.0349188968539238,
0.06170675903558731,
-0.03042641654610634,
0.0018567070364952087,
0.07321398705244064,
0.12743599712848663,
0.14838241040706635,
-0.006730219814926386,
-0.012446845881640911,
0.035035960376262665,
0.229813352227211,
-0.1490442156791687,
0.10630457103252411,
0.14053207635879517,
-0.021705523133277893,
0.06635113060474396,
0.1461038440465927,
0.023231739178299904,
-0.07546708732843399,
0.04147516191005707,
0.04027445614337921,
-0.04228919371962547,
-0.2589097023010254,
-0.05694316700100899,
-0.00946022942662239,
-0.07043391466140747,
0.09718906134366989,
0.09238530695438385,
0.11972260475158691,
0.0337289460003376,
-0.05568677559494972,
-0.025771914049983025,
-0.003401360474526882,
0.114128477871418,
-0.027640055865049362,
-0.004564122296869755,
0.07965842634439468,
-0.05878787487745285,
0.011684526689350605,
0.09941446036100388,
0.019347423687577248,
0.17601320147514343,
0.02533329278230667,
0.10681075602769852,
0.06725578010082245,
0.09347675740718842,
-0.0015635732561349869,
0.034774236381053925,
0.05337131395936012,
0.022044572979211807,
0.010453542694449425,
-0.09408048540353775,
-0.012431944720447063,
0.13713060319423676,
0.019816776737570763,
0.009031654335558414,
0.008926562033593655,
-0.01010479498654604,
0.03131420537829399,
0.20501568913459778,
0.0009575071162544191,
-0.22537250816822052,
-0.09500737488269806,
0.059459153562784195,
-0.06931101530790329,
-0.143676295876503,
-0.02094252221286297,
0.030270220711827278,
-0.17292405664920807,
0.016790566965937614,
-0.0316389761865139,
0.09112390875816345,
-0.07145322859287262,
-0.028050832450389862,
0.06891903281211853,
0.07569212466478348,
-0.012108199298381805,
0.07973295450210571,
-0.19069278240203857,
0.12254468351602554,
0.03037673607468605,
0.08605273067951202,
-0.11708726733922958,
0.07849059253931046,
-0.0019813794642686844,
-0.014807495288550854,
0.17999744415283203,
-0.014062200672924519,
-0.0586031936109066,
-0.08878950774669647,
-0.08704045414924622,
-0.011727320961654186,
0.10361312329769135,
-0.09322915226221085,
0.09586969763040543,
-0.02775636687874794,
-0.03705112263560295,
0.012418309226632118,
-0.10469507426023483,
-0.1636953055858612,
-0.18679304420948029,
0.06244563311338425,
-0.07802703976631165,
0.012347841635346413,
-0.11227322369813919,
-0.06334327906370163,
-0.01575082167983055,
0.23160123825073242,
-0.16648635268211365,
-0.07049825042486191,
-0.1498587429523468,
-0.03997112438082695,
0.17463743686676025,
-0.042160745710134506,
0.06849376112222672,
-0.021383514627814293,
0.1873992383480072,
-0.008081548847258091,
-0.013158116489648819,
0.06569221615791321,
-0.09637628495693207,
-0.16879262030124664,
-0.05748843029141426,
0.14160962402820587,
0.10863390564918518,
0.05731578543782234,
-0.0038195757661014795,
0.013171887956559658,
-0.03383830562233925,
-0.09896382689476013,
0.013824623078107834,
0.13817466795444489,
0.0034514935687184334,
0.00682973163202405,
-0.03995988517999649,
-0.07027145475149155,
-0.05825701728463173,
-0.07912654429674149,
0.057147104293107986,
0.187900573015213,
-0.09512355923652649,
0.1602867990732193,
0.12431421875953674,
-0.06468851119279861,
-0.2306901067495346,
0.03996593505144119,
0.04701630026102066,
0.007666614837944508,
0.022401191294193268,
-0.19138796627521515,
0.09788824617862701,
0.0009011493530124426,
-0.06807263940572739,
0.14616990089416504,
-0.16564498841762543,
-0.1461436152458191,
0.08002161979675293,
0.025075770914554596,
-0.22560662031173706,
-0.14821304380893707,
-0.1037549376487732,
-0.03735695406794548,
-0.13707835972309113,
0.048581719398498535,
0.02614329755306244,
0.019834673032164574,
0.025222565978765488,
0.005338077899068594,
0.029657263308763504,
-0.07272187620401382,
0.1870686560869217,
-0.020297454670071602,
0.0072362530045211315,
-0.050640691071748734,
-0.04617878794670105,
0.09227550774812698,
-0.06150037795305252,
0.11741586774587631,
0.018679620698094368,
0.018796883523464203,
-0.1431548148393631,
-0.049209367483854294,
-0.060803934931755066,
0.04456847906112671,
-0.07284719496965408,
-0.09393193572759628,
-0.04137463867664337,
0.08888561278581619,
0.07211937010288239,
-0.032792408019304276,
-0.0027768779546022415,
-0.07569456845521927,
0.09405932575464249,
0.184477761387825,
0.17357055842876434,
0.009977072477340698,
-0.07020942866802216,
0.024555526673793793,
-0.042279548943042755,
0.03349342197179794,
-0.24652716517448425,
0.03456863760948181,
0.066053606569767,
0.03803660348057747,
0.08509242534637451,
-0.016836483031511307,
-0.1781480610370636,
-0.04086102172732353,
0.08498652279376984,
-0.06206206604838371,
-0.19876568019390106,
-0.02703288197517395,
0.08424776047468185,
-0.20383712649345398,
-0.032998621463775635,
0.041543323546648026,
-0.03834589570760727,
-0.02396267279982567,
-0.002415500348433852,
0.06396626681089401,
-0.008327016606926918,
0.12156640738248825,
0.06747189164161682,
0.10266115516424179,
-0.09284433722496033,
0.08920657634735107,
0.10416955500841141,
-0.09140542894601822,
0.03545991703867912,
0.10264154523611069,
-0.05670900270342827,
-0.04460543021559715,
0.033935222774744034,
0.05925208330154419,
-0.028357384726405144,
-0.06409841030836105,
-0.000502707262057811,
-0.0359574519097805,
0.04993389546871185,
0.08058220148086548,
0.036113787442445755,
-0.01202210783958435,
0.06544706225395203,
0.028145326301455498,
-0.11693570017814636,
0.10949387401342392,
0.04405685141682625,
0.04509059712290764,
-0.07182393968105316,
-0.012280966155230999,
0.015999672934412956,
0.032540347427129745,
-0.019734015688300133,
-0.014576527290046215,
-0.03146412968635559,
-0.007561005651950836,
-0.1553635597229004,
-0.02064543403685093,
-0.06516171246767044,
0.006067827809602022,
0.022207623347640038,
-0.03830232471227646,
-0.012014663778245449,
0.01381110493093729,
-0.07979435473680496,
-0.07571027427911758,
-0.01700955256819725,
0.08539021760225296,
-0.1381402313709259,
0.006627439055591822,
0.07182712107896805,
-0.10980239510536194,
0.07347989827394485,
-0.0048679932951927185,
0.017079560086131096,
0.010923396795988083,
-0.11654401570558548,
0.04386281594634056,
-0.005810429807752371,
0.01551580335944891,
0.022556742653250694,
-0.171111062169075,
0.011553828604519367,
-0.038553636521101,
-0.03114982508122921,
0.011926400475203991,
-0.025060230866074562,
-0.11875922232866287,
0.08676479011774063,
-0.028097305446863174,
-0.037512701004743576,
-0.03292486071586609,
0.06296087801456451,
0.08736220002174377,
-0.011740099638700485,
0.09667140990495682,
-0.025766119360923767,
0.04818311333656311,
-0.1756584197282791,
-0.01910574547946453,
-0.050167568027973175,
0.02537350542843342,
-0.01759655587375164,
-0.0070639788173139095,
0.055272240191698074,
-0.004191063344478607,
0.20991376042366028,
-0.03921036794781685,
0.1548677533864975,
0.05199402943253517,
-0.009925156831741333,
0.010884369723498821,
0.05032730847597122,
0.06423956155776978,
0.031145188957452774,
0.00853167474269867,
0.04660189896821976,
-0.004552975296974182,
-0.020357951521873474,
-0.13699717819690704,
0.02791593410074711,
0.16117429733276367,
0.061918217688798904,
0.0392887257039547,
0.03704594820737839,
-0.1422400325536728,
-0.09538721293210983,
0.10306388139724731,
-0.0331864058971405,
0.014331420883536339,
-0.08317886292934418,
0.17621558904647827,
0.12328410148620605,
-0.1574767529964447,
0.0577850341796875,
-0.07234696298837662,
-0.05066767707467079,
-0.1024852767586708,
-0.11832084506750107,
-0.06293155997991562,
-0.06027044355869293,
-0.004747506696730852,
-0.042489297688007355,
0.05734556168317795,
0.026751231402158737,
-0.003270963439717889,
-0.006759525276720524,
0.12665949761867523,
-0.0249644722789526,
-0.004145825747400522,
0.04152364656329155,
0.0326087586581707,
0.019319625571370125,
-0.05872373282909393,
0.017997145652770996,
0.018602589145302773,
0.022180357947945595,
0.06835069507360458,
0.0260987039655447,
-0.059317342936992645,
0.044286735355854034,
0.00319746439345181,
-0.11313364654779434,
0.018146557733416557,
-0.00002245741598017048,
-0.05020225793123245,
0.13557326793670654,
0.04076748713850975,
0.01548024732619524,
-0.029270920902490616,
0.24342355132102966,
-0.07199113070964813,
-0.08681939542293549,
-0.13965600728988647,
0.11511493474245071,
-0.023563209921121597,
0.03755274787545204,
0.016542524099349976,
-0.12659503519535065,
0.011511262506246567,
0.18531471490859985,
0.12824349105358124,
0.012459068559110165,
-0.007656481582671404,
0.05736639350652695,
-0.0007639875984750688,
-0.05985576659440994,
0.05051197111606598,
0.0664999932050705,
0.16097788512706757,
-0.09069112688302994,
0.0652846097946167,
-0.008405503816902637,
-0.0831485390663147,
-0.027498632669448853,
0.11705785244703293,
-0.022675158455967903,
0.02148384228348732,
-0.03778035193681717,
0.11204422265291214,
-0.052532415837049484,
-0.2719486355781555,
0.02952493168413639,
-0.09503202140331268,
-0.13993041217327118,
-0.02591860294342041,
0.041448429226875305,
-0.03349510580301285,
0.01577647216618061,
0.06254769116640091,
-0.045389387756586075,
0.18837277591228485,
0.025987716391682625,
-0.08679025620222092,
-0.07755549252033234,
0.05874146893620491,
-0.08695939928293228,
0.2789687216281891,
0.003863075515255332,
0.04782010242342949,
0.12108923494815826,
-0.03053574077785015,
-0.18664880096912384,
0.014769754372537136,
0.11989909410476685,
-0.09114406257867813,
0.07780203968286514,
0.18139931559562683,
-0.005561648402363062,
0.12649618089199066,
0.04705416411161423,
-0.03877115994691849,
0.03976387158036232,
-0.02721380814909935,
-0.03821522742509842,
-0.12209630757570267,
0.05661242455244064,
-0.0612691193819046,
0.15957388281822205,
0.1158948540687561,
-0.05964287370443344,
0.001120698289014399,
-0.06126941740512848,
0.06300627440214157,
0.014774397015571594,
0.12115653604269028,
0.018452486023306847,
-0.2023056596517563,
0.05087360367178917,
-0.03283824771642685,
0.08166342973709106,
-0.254973828792572,
-0.08186668157577515,
0.07622263580560684,
-0.019022729247808456,
-0.04275642707943916,
0.12311509251594543,
0.06101066991686821,
0.03676839917898178,
-0.03853875398635864,
-0.08537755906581879,
-0.01412904355674982,
0.15376435220241547,
-0.14123432338237762,
-0.029574336484074593
] |
null | null | transformers |
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
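
The section is left as a placeholder; as a stopgap, here is a minimal sketch for loading the checkpoint as a CLIP text encoder. The repository tags indicate a `clip_text_model` stored in safetensors, but the presence of tokenizer files and the example prompt are assumptions.

```python
# Minimal loading sketch — assumes the repo ships CLIPTextModel weights plus a tokenizer.
import torch
from transformers import AutoTokenizer, CLIPTextModel

model_id = "readingrocket/clip-vit-large-patch14-001"

tokenizer = AutoTokenizer.from_pretrained(model_id)      # assumption: tokenizer files are present
text_encoder = CLIPTextModel.from_pretrained(model_id)
text_encoder.eval()

with torch.no_grad():
    tokens = tokenizer(["a photo of a cat"], padding=True, return_tensors="pt")
    hidden = text_encoder(**tokens).last_hidden_state    # (batch, seq_len, hidden_size)
```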
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed]
| {"library_name": "transformers", "tags": []} | null | readingrocket/clip-vit-large-patch14-001 | [
"transformers",
"safetensors",
"clip_text_model",
"arxiv:1910.09700",
"endpoints_compatible",
"region:us"
] | 2024-02-14T16:38:35+00:00 | [
"1910.09700"
] | [] | TAGS
#transformers #safetensors #clip_text_model #arxiv-1910.09700 #endpoints_compatible #region-us
|
# Model Card for Model ID
## Model Details
### Model Description
This is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.
- Developed by:
- Funded by [optional]:
- Shared by [optional]:
- Model type:
- Language(s) (NLP):
- License:
- Finetuned from model [optional]:
### Model Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Downstream Use [optional]
### Out-of-Scope Use
## Bias, Risks, and Limitations
### Recommendations
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
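A minimal, hedged loading sketch follows. The repo id and the choice of `CLIPTextModel`/`AutoTokenizer` are assumptions inferred from this card's `clip_text_model` tag, not instructions from the author; adjust them to the actual repo contents.

```python
# Hedged sketch: loading this checkpoint as a CLIP text encoder with transformers.
# The repo id and the classes used here are assumptions based on the card's tags.
from transformers import AutoTokenizer, CLIPTextModel

model_id = "readingrocket/clip-vit-large-patch14-001"  # assumed repo id for this card
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = CLIPTextModel.from_pretrained(model_id)

inputs = tokenizer(["a photo of a cat"], padding=True, return_tensors="pt")
outputs = model(**inputs)
text_features = outputs.pooler_output  # pooled text embedding, shape (batch, hidden_size)
```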
## Training Details
### Training Data
### Training Procedure
#### Preprocessing [optional]
#### Training Hyperparameters
- Training regime:
#### Speeds, Sizes, Times [optional]
## Evaluation
### Testing Data, Factors & Metrics
#### Testing Data
#### Factors
#### Metrics
### Results
#### Summary
## Model Examination [optional]
## Environmental Impact
Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).
- Hardware Type:
- Hours used:
- Cloud Provider:
- Compute Region:
- Carbon Emitted:
## Technical Specifications [optional]
### Model Architecture and Objective
### Compute Infrastructure
#### Hardware
#### Software
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Model Card Authors [optional]
## Model Card Contact
| [
"# Model Card for Model ID",
"## Model Details",
"### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:",
"### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Downstream Use [optional]",
"### Out-of-Scope Use",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.",
"## How to Get Started with the Model\n\nUse the code below to get started with the model.",
"## Training Details",
"### Training Data",
"### Training Procedure",
"#### Preprocessing [optional]",
"#### Training Hyperparameters\n\n- Training regime:",
"#### Speeds, Sizes, Times [optional]",
"## Evaluation",
"### Testing Data, Factors & Metrics",
"#### Testing Data",
"#### Factors",
"#### Metrics",
"### Results",
"#### Summary",
"## Model Examination [optional]",
"## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:",
"## Technical Specifications [optional]",
"### Model Architecture and Objective",
"### Compute Infrastructure",
"#### Hardware",
"#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Model Card Authors [optional]",
"## Model Card Contact"
] | [
"TAGS\n#transformers #safetensors #clip_text_model #arxiv-1910.09700 #endpoints_compatible #region-us \n",
"# Model Card for Model ID",
"## Model Details",
"### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:",
"### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Downstream Use [optional]",
"### Out-of-Scope Use",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.",
"## How to Get Started with the Model\n\nUse the code below to get started with the model.",
"## Training Details",
"### Training Data",
"### Training Procedure",
"#### Preprocessing [optional]",
"#### Training Hyperparameters\n\n- Training regime:",
"#### Speeds, Sizes, Times [optional]",
"## Evaluation",
"### Testing Data, Factors & Metrics",
"#### Testing Data",
"#### Factors",
"#### Metrics",
"### Results",
"#### Summary",
"## Model Examination [optional]",
"## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:",
"## Technical Specifications [optional]",
"### Model Architecture and Objective",
"### Compute Infrastructure",
"#### Hardware",
"#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Model Card Authors [optional]",
"## Model Card Contact"
] | [
37,
6,
3,
82,
28,
3,
4,
9,
9,
10,
42,
20,
3,
4,
5,
9,
11,
13,
3,
12,
5,
4,
5,
3,
4,
9,
53,
9,
8,
6,
3,
14,
8,
7,
9,
4
] | [
"passage: TAGS\n#transformers #safetensors #clip_text_model #arxiv-1910.09700 #endpoints_compatible #region-us \n# Model Card for Model ID## Model Details### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Downstream Use [optional]### Out-of-Scope Use## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.## How to Get Started with the Model\n\nUse the code below to get started with the model.## Training Details### Training Data### Training Procedure#### Preprocessing [optional]#### Training Hyperparameters\n\n- Training regime:#### Speeds, Sizes, Times [optional]## Evaluation### Testing Data, Factors & Metrics#### Testing Data#### Factors#### Metrics### Results#### Summary## Model Examination [optional]## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:## Technical Specifications [optional]### Model Architecture and Objective### Compute Infrastructure#### Hardware#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Model Card Authors [optional]## Model Card Contact"
] | [
-0.051232416182756424,
0.20436473190784454,
-0.003778991987928748,
0.024887878447771072,
0.10536987334489822,
-0.001453306176699698,
0.06099526584148407,
0.11824915558099747,
-0.013611295260488987,
0.11596455425024033,
0.02840925008058548,
0.08278053998947144,
0.11481429636478424,
0.1576504409313202,
0.005973390769213438,
-0.2363836020231247,
0.04203276336193085,
-0.08552176505327225,
0.018479354679584503,
0.11621345579624176,
0.13730907440185547,
-0.10445672273635864,
0.09055983275175095,
-0.012379788793623447,
-0.01423747930675745,
-0.008774486370384693,
-0.08013757318258286,
-0.07806821167469025,
0.07090700417757034,
0.08158472925424576,
0.0705459862947464,
0.02177765965461731,
0.08963412791490555,
-0.27688223123550415,
0.01753099635243416,
0.09009262174367905,
-0.004057284444570541,
0.06906062364578247,
0.07114355266094208,
-0.06956128776073456,
0.11227671056985855,
-0.08133985102176666,
0.1431182324886322,
0.07644931226968765,
-0.09005925059318542,
-0.19779187440872192,
-0.07411731034517288,
0.07298080623149872,
0.13059605658054352,
0.05751047283411026,
-0.025547100231051445,
0.1496819108724594,
-0.0888620987534523,
0.00967206060886383,
0.1287892609834671,
-0.07832050323486328,
-0.04991907253861427,
0.03380030021071434,
0.11140188574790955,
0.09074179083108902,
-0.11585110425949097,
0.0023137053940445185,
0.03823603689670563,
0.012235704809427261,
0.08567759394645691,
0.019634071737527847,
0.11227937787771225,
0.04098920524120331,
-0.13911332190036774,
-0.049576401710510254,
0.09808392077684402,
0.03075331263244152,
-0.05645253136754036,
-0.21738378703594208,
-0.02607138082385063,
-0.009872336871922016,
-0.021097227931022644,
-0.04671182855963707,
0.050928205251693726,
-0.04108458757400513,
0.06483147293329239,
-0.020635509863495827,
-0.08567088842391968,
-0.03312988206744194,
0.05753567814826965,
0.05699571594595909,
0.016875019297003746,
-0.010413682088255882,
0.01887732557952404,
0.1205807700753212,
0.07176177948713303,
-0.12136226892471313,
-0.07612016052007675,
-0.0655624121427536,
-0.09947585314512253,
-0.04881424829363823,
0.03228630870580673,
0.05963897705078125,
0.02739146538078785,
0.20707261562347412,
-0.01650421880185604,
0.043981824070215225,
0.03337625414133072,
0.00467774597927928,
0.06750867515802383,
0.09527754038572311,
-0.06231040507555008,
-0.14611545205116272,
-0.049231354147195816,
0.08712926506996155,
0.001687817508354783,
-0.03837509825825691,
-0.052478715777397156,
0.037983812391757965,
0.033747099339962006,
0.11710286140441895,
0.08384725451469421,
-0.0022946547251194715,
-0.04709390923380852,
-0.032934799790382385,
0.22323501110076904,
-0.14776796102523804,
0.043990202248096466,
-0.00030232040444388986,
-0.044632233679294586,
-0.027396410703659058,
0.02820945717394352,
0.02066325582563877,
-0.026622112840414047,
0.10134907066822052,
-0.06980768591165543,
-0.03479640930891037,
-0.11137524992227554,
-0.05948808789253235,
0.03165903314948082,
0.0014957344392314553,
-0.023197883740067482,
-0.050009001046419144,
-0.12027385830879211,
-0.04221020266413689,
0.0649915486574173,
-0.06361566483974457,
-0.05273035541176796,
0.012292823754251003,
-0.03969072178006172,
0.0013634969945997,
-0.006069946568459272,
0.11135037243366241,
-0.03555804491043091,
0.026816613972187042,
-0.03488432615995407,
0.07008364796638489,
0.10188078880310059,
0.040231652557849884,
-0.07285238802433014,
0.06405041366815567,
-0.22727127373218536,
0.09510360658168793,
-0.09021937847137451,
0.011650178581476212,
-0.15190055966377258,
-0.04530080407857895,
0.03695599362254143,
0.024820007383823395,
-0.0005037287482991815,
0.12461037188768387,
-0.19408556818962097,
-0.027022158727049828,
0.145589217543602,
-0.10885480046272278,
-0.09480426460504532,
0.075563445687294,
-0.05738705396652222,
0.11946038901805878,
0.04624507203698158,
-0.03387218713760376,
0.061208371073007584,
-0.14552444219589233,
-0.047071199864149094,
-0.017297288402915,
-0.008106756955385208,
0.13856607675552368,
0.06264059990644455,
-0.05583328753709793,
0.07475734502077103,
0.017170928418636322,
-0.028724199160933495,
-0.04803555831313133,
-0.04324330389499664,
-0.09188703447580338,
0.0006117025041021407,
-0.06822550296783447,
0.03741994872689247,
-0.026644723489880562,
-0.08861822634935379,
-0.031375084072351456,
-0.1744590401649475,
0.06975363940000534,
0.08446090668439865,
0.012319422326982021,
-0.01262954343110323,
-0.08815763890743256,
0.010465524159371853,
-0.015403496101498604,
-0.018844544887542725,
-0.16621537506580353,
-0.04754708707332611,
0.040500421077013016,
-0.18107487261295319,
0.03776957467198372,
-0.04308033362030983,
0.051096513867378235,
0.04174768924713135,
-0.04412797838449478,
0.004761035554111004,
-0.006134907249361277,
0.0149694150313735,
-0.02973899431526661,
-0.1931270807981491,
-0.03893629088997841,
-0.026102228090167046,
0.15442563593387604,
-0.22990268468856812,
0.03336458280682564,
0.07043547928333282,
0.1438058465719223,
-0.004182778764516115,
-0.040693044662475586,
0.02029518038034439,
-0.05470677837729454,
-0.058053165674209595,
-0.06362863630056381,
-0.006206595804542303,
-0.03241005912423134,
-0.03820691630244255,
0.06292017549276352,
-0.20151568949222565,
-0.03993739187717438,
0.10420793294906616,
0.10266537964344025,
-0.1502939760684967,
-0.020553868263959885,
-0.0464441180229187,
-0.062387917190790176,
-0.09869661182165146,
-0.06028065085411072,
0.16059117019176483,
0.045445479452610016,
0.05432417616248131,
-0.09244974702596664,
-0.06427840143442154,
0.00807530339807272,
0.000922447710763663,
-0.03807354345917702,
0.0771099179983139,
0.0820174366235733,
-0.09804119914770126,
0.08241340517997742,
0.07588730752468109,
0.0706230029463768,
0.09881036728620529,
0.01223856769502163,
-0.1084936186671257,
-0.02609558403491974,
0.00965067557990551,
0.017575431615114212,
0.14384663105010986,
-0.043364543467760086,
0.04089226946234703,
0.05358186736702919,
-0.035433150827884674,
0.016415225341916084,
-0.11442917585372925,
0.03408423066139221,
0.0460074357688427,
-0.009219459258019924,
0.01696793921291828,
-0.041596777737140656,
0.023288888856768608,
0.08730988949537277,
0.04220324754714966,
0.033727649599313736,
0.0040068961679935455,
-0.02183770015835762,
-0.10645344853401184,
0.17684708535671234,
-0.09422820806503296,
-0.3069242238998413,
-0.1496608704328537,
0.008157364092767239,
0.04620055481791496,
-0.02357938326895237,
0.02312551625072956,
-0.0499705970287323,
-0.10838872939348221,
-0.10923535376787186,
0.007910285145044327,
0.03326551243662834,
-0.07583785802125931,
-0.06779206544160843,
0.05752981826663017,
0.032018356025218964,
-0.14911724627017975,
0.033931341022253036,
0.05104735866189003,
-0.04438245669007301,
-0.015184029936790466,
0.08385400474071503,
0.10061254352331161,
0.1642886996269226,
-0.01957457698881626,
-0.02159314602613449,
0.01927577145397663,
0.21918433904647827,
-0.13835659623146057,
0.11352024972438812,
0.15502694249153137,
-0.060845308005809784,
0.09791775047779083,
0.18241046369075775,
0.019253501668572426,
-0.08264889568090439,
0.03652165085077286,
0.047497957944869995,
-0.05332659184932709,
-0.24243856966495514,
-0.054474517703056335,
0.005314461421221495,
-0.07191300392150879,
0.08818135410547256,
0.08932143449783325,
0.12383197247982025,
0.03833381086587906,
-0.08682088553905487,
-0.05832170322537422,
0.013259105384349823,
0.1092611774802208,
-0.030633587390184402,
-0.0023328352253884077,
0.08906691521406174,
-0.04763512685894966,
-0.0023902456741780043,
0.1063537448644638,
0.022022748365998268,
0.1848851889371872,
0.027622131630778313,
0.14155012369155884,
0.06653230637311935,
0.04510883241891861,
0.022526104003190994,
0.021753767505288124,
0.03830648958683014,
0.014352588914334774,
-0.016050659120082855,
-0.09393449127674103,
0.020367998629808426,
0.126719668507576,
0.05505383759737015,
0.02876087836921215,
0.012787995859980583,
-0.03397754579782486,
0.055249594151973724,
0.1727810502052307,
0.0018994795391336083,
-0.22684542834758759,
-0.05677860602736473,
0.07710637152194977,
-0.06740188598632812,
-0.12038815021514893,
-0.029651273041963577,
0.030011994764208794,
-0.18053343892097473,
0.03995291143655777,
-0.015702908858656883,
0.10823041945695877,
-0.12691307067871094,
-0.027780594304203987,
0.025076530873775482,
0.08361076563596725,
-0.03515695035457611,
0.08223618566989899,
-0.15955136716365814,
0.12807582318782806,
0.020472608506679535,
0.07141618430614471,
-0.11376422643661499,
0.09131114929914474,
0.002101482590660453,
0.005614803172647953,
0.1642366349697113,
-0.004712372552603483,
-0.08064562827348709,
-0.06579344719648361,
-0.08821536600589752,
-0.021633155643939972,
0.10488249361515045,
-0.10475372523069382,
0.08341490477323532,
-0.010112715885043144,
-0.04421776533126831,
0.009314844384789467,
-0.10521908104419708,
-0.15066970884799957,
-0.1936490684747696,
0.06264233589172363,
-0.09857151657342911,
0.009040809236466885,
-0.09719017148017883,
-0.06218426674604416,
-0.03620466589927673,
0.2169337421655655,
-0.1324661821126938,
-0.08885446935892105,
-0.1556459218263626,
-0.08793539553880692,
0.17066626250743866,
-0.03955378383398056,
0.08389080315828323,
-0.004847696516662836,
0.22614623606204987,
-0.0021034374367445707,
-0.011168522760272026,
0.07046638429164886,
-0.0883980542421341,
-0.1739228367805481,
-0.0800631120800972,
0.1277487426996231,
0.11715451627969742,
0.0507349856197834,
-0.004943216219544411,
0.01431850902736187,
-0.030363403260707855,
-0.10531038790941238,
-0.0050230431370437145,
0.12082348763942719,
0.06945854425430298,
0.03529510647058487,
-0.017756164073944092,
-0.10450491309165955,
-0.06299583613872528,
-0.04328043758869171,
0.03388093411922455,
0.1939132809638977,
-0.0832151398062706,
0.16827689111232758,
0.14099951088428497,
-0.05556853115558624,
-0.21031554043293,
0.04645504802465439,
0.04019912704825401,
-0.003373176557943225,
0.03869429603219032,
-0.18290947377681732,
0.07433620095252991,
0.029624532908201218,
-0.050264403223991394,
0.1452644318342209,
-0.16327492892742157,
-0.15770478546619415,
0.06752203404903412,
0.051284268498420715,
-0.20937110483646393,
-0.12502780556678772,
-0.08236128091812134,
-0.06984458863735199,
-0.15401558578014374,
0.09680766612291336,
-0.007822735235095024,
0.0007930613355711102,
0.04972194880247116,
0.03702351450920105,
0.01675218716263771,
-0.050367310643196106,
0.20358777046203613,
-0.010602201335132122,
0.027404647320508957,
-0.07915748655796051,
-0.09043405950069427,
0.0843893438577652,
-0.05567900463938713,
0.10072773694992065,
-0.016941368579864502,
0.011175313033163548,
-0.08364859223365784,
-0.05889030173420906,
-0.043409716337919235,
0.043681513518095016,
-0.08204232901334763,
-0.10806262493133545,
-0.06020430102944374,
0.08455660939216614,
0.08311260491609573,
-0.03688499331474304,
-0.0241681020706892,
-0.08448270708322525,
0.06320605427026749,
0.19819939136505127,
0.16627340018749237,
0.04173582047224045,
-0.08878371864557266,
0.01034871581941843,
-0.026294557377696037,
0.040590424090623856,
-0.2263050377368927,
0.041152071207761765,
0.046669330447912216,
0.03417179733514786,
0.11594212055206299,
-0.020685948431491852,
-0.16327506303787231,
-0.04322872683405876,
0.05485227331519127,
-0.04699348285794258,
-0.2043781727552414,
-0.009070083498954773,
0.0640672892332077,
-0.18450187146663666,
-0.0434892363846302,
0.024947630241513252,
-0.025326933711767197,
-0.031441617757081985,
0.008228014223277569,
0.0631331279873848,
0.030038760975003242,
0.09433695673942566,
0.0555247999727726,
0.10126970708370209,
-0.10870123654603958,
0.10083179175853729,
0.10242434591054916,
-0.09581031650304794,
0.020147429779171944,
0.07496211677789688,
-0.056478600949048996,
-0.02950763888657093,
0.032123226672410965,
0.0536724254488945,
-0.016753319650888443,
-0.06211940199136734,
-0.019456589594483376,
-0.09310055524110794,
0.06343130767345428,
0.14617778360843658,
0.02976672537624836,
-0.016456609591841698,
0.06383352726697922,
0.02665531449019909,
-0.09768950194120407,
0.10027070343494415,
0.01917506381869316,
0.036640629172325134,
-0.04706798866391182,
0.003697384148836136,
0.048219092190265656,
0.017291881144046783,
-0.019199755042791367,
-0.033080339431762695,
-0.04778473451733589,
-0.016753079369664192,
-0.1791423112154007,
0.00689761433750391,
-0.0645512193441391,
0.0042097000405192375,
0.010902770794928074,
-0.03785454109311104,
-0.01292155310511589,
0.02522699162364006,
-0.0763353705406189,
-0.05273051932454109,
-0.008273771032691002,
0.09433843195438385,
-0.14998462796211243,
0.00676126591861248,
0.08730819076299667,
-0.11155026406049728,
0.0662379264831543,
-0.011551082134246826,
-0.012704103253781796,
0.0008630965021438897,
-0.14022667706012726,
0.04654063284397125,
-0.008669032715260983,
0.020908290520310402,
0.0474558025598526,
-0.1695529967546463,
0.008806135505437851,
-0.04349026456475258,
-0.03609219938516617,
-0.016894787549972534,
-0.06370312720537186,
-0.11307510733604431,
0.10081591457128525,
-0.004303404595702887,
-0.06791272759437561,
-0.007579766679555178,
0.04750705137848854,
0.1030273586511612,
-0.04174215719103813,
0.1015220656991005,
0.0004969099536538124,
0.05913374945521355,
-0.17722371220588684,
-0.029529111459851265,
-0.024764811620116234,
0.002549501368775964,
0.015470228157937527,
-0.013796424493193626,
0.04362579062581062,
-0.007690896280109882,
0.247127503156662,
-0.028464844450354576,
0.1148422360420227,
0.059709228575229645,
0.024386543780565262,
0.0030695958994328976,
0.07727647572755814,
0.06945234537124634,
0.013996325433254242,
0.008568000979721546,
0.028022220358252525,
-0.026617173105478287,
-0.00880188774317503,
-0.1719008982181549,
0.06478574126958847,
0.13965553045272827,
0.08130571246147156,
0.014567406848073006,
0.06474722176790237,
-0.1151006892323494,
-0.1045185923576355,
0.06681827455759048,
-0.032170865684747696,
0.003436793340370059,
-0.06538334488868713,
0.1441062092781067,
0.1556253284215927,
-0.1578836292028427,
0.07879135757684708,
-0.035957351326942444,
-0.04681331291794777,
-0.10615081340074539,
-0.12704940140247345,
-0.06285175681114197,
-0.02792247198522091,
-0.0031114204321056604,
-0.0564565435051918,
0.06560389697551727,
0.07995190471410751,
-0.003869675099849701,
0.006418116856366396,
0.11214377731084824,
-0.02033410780131817,
-0.011803015135228634,
0.025787262246012688,
0.0469365268945694,
0.035506535321474075,
-0.05674617737531662,
0.012734832242131233,
0.016441838815808296,
0.027637474238872528,
0.05951123684644699,
0.025520699098706245,
-0.026177680119872093,
0.022076494991779327,
0.0016506966203451157,
-0.10037951916456223,
0.022503269836306572,
-0.02517148293554783,
-0.05866222828626633,
0.12848378717899323,
0.032200541347265244,
0.012288914062082767,
-0.03420720249414444,
0.21196803450584412,
-0.06736771762371063,
-0.07292116433382034,
-0.13097700476646423,
0.12689357995986938,
-0.025591185316443443,
0.060717374086380005,
0.05311565101146698,
-0.11538508534431458,
-0.005535853561013937,
0.1307114064693451,
0.11899431049823761,
-0.016699831932783127,
0.005964045878499746,
0.027054667472839355,
0.006748197600245476,
-0.044653113931417465,
0.04646079242229462,
0.047594450414180756,
0.14636047184467316,
-0.06906233727931976,
0.06479811668395996,
0.009637407958507538,
-0.07644528150558472,
-0.04486342892050743,
0.13352197408676147,
-0.014800913631916046,
0.02777041494846344,
-0.04723262041807175,
0.09884928911924362,
-0.07328233867883682,
-0.2783758044242859,
0.03538375720381737,
-0.09471236914396286,
-0.15350179374217987,
-0.016824468970298767,
0.028844891116023064,
-0.01893608085811138,
0.02661667950451374,
0.07257179170846939,
-0.05998701974749565,
0.16989903151988983,
0.03659095987677574,
-0.09277532249689102,
-0.06236744299530983,
0.06275524944067001,
-0.09371813386678696,
0.30026525259017944,
0.007056474220007658,
0.035340286791324615,
0.10405483841896057,
-0.02405174821615219,
-0.144112229347229,
0.03060785122215748,
0.10349760949611664,
-0.09616105258464813,
0.07185753434896469,
0.18740259110927582,
-0.01704815961420536,
0.10487563908100128,
0.0632273405790329,
-0.06386154145002365,
0.05948252975940704,
-0.06732751429080963,
-0.05496177077293396,
-0.09178970009088516,
0.0576414056122303,
-0.05840621516108513,
0.15151196718215942,
0.10769796371459961,
-0.04298880696296692,
-0.005678469315171242,
-0.039455339312553406,
0.0364319272339344,
0.014436799101531506,
0.120704285800457,
0.012343291193246841,
-0.16577959060668945,
0.03570205345749855,
-0.012451929040253162,
0.10738532245159149,
-0.23571598529815674,
-0.08566100150346756,
0.06817395985126495,
-0.03312031924724579,
-0.05132237821817398,
0.09912731498479843,
0.07505525648593903,
0.050415847450494766,
-0.0434066578745842,
-0.07863350212574005,
-0.00935285072773695,
0.15702253580093384,
-0.12681299448013306,
-0.012248565442860126
] |
null | null | null | # Hi everyone, this is GL, and we are launching an official XAI project for VLL's Company.
What is XAI? XAI is soon to be an advanced and free model alternative to ChatGPT, no matter whether it is advanced or not, or has an image generator or not. You can still give me money, but through donations. AND THE BEST PART! IT'S OPEN SOURCED! Meaning I don't care whether you change the file or not; it's up to you to figure out what you'll be doing! | {"language": ["en"], "license": "apache-2.0", "datasets": ["fka/awesome-chatgpt-prompts"]} | null | Guiding-light/XAI | [
"en",
"dataset:fka/awesome-chatgpt-prompts",
"license:apache-2.0",
"region:us"
] | 2024-02-14T16:39:24+00:00 | [] | [
"en"
] | TAGS
#en #dataset-fka/awesome-chatgpt-prompts #license-apache-2.0 #region-us
| # Hi everyone, this is GL, and we are launching an official XAI project for VLL's Company.
What is XAI? XAI is soon to be an advanced and free model alternative to ChatGPT, no matter whether it is advanced or not, or has an image generator or not. You can still give me money, but through donations. AND THE BEST PART! IT'S OPEN SOURCED! Meaning I don't care whether you change the file or not; it's up to you to figure out what you'll be doing! | [
"# HI everyone. this is GL and we are launching an official XAI project for VLL's Company\nwhat is XAI? XAI is an soon to be the advanced and free model alt for ChatGPT. no matter advanced or not, Image generator or not. You can still give me money but through the donations. AND THE BEST PART! ITS OPEN SOURCED! meaning I don't care if you change the file or not. its up to you to figure out what you'll be doing!"
] | [
"TAGS\n#en #dataset-fka/awesome-chatgpt-prompts #license-apache-2.0 #region-us \n",
"# HI everyone. this is GL and we are launching an official XAI project for VLL's Company\nwhat is XAI? XAI is an soon to be the advanced and free model alt for ChatGPT. no matter advanced or not, Image generator or not. You can still give me money but through the donations. AND THE BEST PART! ITS OPEN SOURCED! meaning I don't care if you change the file or not. its up to you to figure out what you'll be doing!"
] | [
34,
112
] | [
"passage: TAGS\n#en #dataset-fka/awesome-chatgpt-prompts #license-apache-2.0 #region-us \n# HI everyone. this is GL and we are launching an official XAI project for VLL's Company\nwhat is XAI? XAI is an soon to be the advanced and free model alt for ChatGPT. no matter advanced or not, Image generator or not. You can still give me money but through the donations. AND THE BEST PART! ITS OPEN SOURCED! meaning I don't care if you change the file or not. its up to you to figure out what you'll be doing!"
] | [
-0.07987333834171295,
0.05053267255425453,
-0.001050824299454689,
0.09686937928199768,
0.07602269947528839,
-0.015625575557351112,
0.06603828817605972,
0.10732048004865646,
0.09234254062175751,
-0.03344200924038887,
0.06710120290517807,
0.1076364517211914,
0.13336849212646484,
0.18601714074611664,
-0.006070977076888084,
-0.11469089239835739,
0.0070429532788693905,
0.04838885739445686,
-0.03889060392975807,
-0.007768805138766766,
0.09623395651578903,
-0.08503826707601547,
0.13312359154224396,
-0.0035290115047246218,
-0.12872932851314545,
-0.048560187220573425,
-0.05201520770788193,
-0.04460792988538742,
0.01383421290665865,
0.039461906999349594,
-0.0659429281949997,
0.03262735530734062,
-0.0003654249885585159,
-0.022655244916677475,
0.025004610419273376,
-0.09399464726448059,
-0.05073096975684166,
0.01632879674434662,
0.017267296090722084,
0.06950514018535614,
0.09468265622854233,
0.04893823340535164,
-0.16929388046264648,
0.06677694618701935,
-0.040790900588035583,
-0.21995462477207184,
-0.1006862074136734,
0.036782536655664444,
0.11721956729888916,
0.07951708883047104,
0.013814202509820461,
0.16316121816635132,
0.06473276019096375,
0.05087381973862648,
0.15772640705108643,
-0.33837443590164185,
-0.05076426640152931,
0.003304500598460436,
0.08458840847015381,
0.13569112122058868,
0.02260892651975155,
0.1307254284620285,
0.13178731501102448,
-0.018858056515455246,
-0.03033202886581421,
-0.04382425174117088,
-0.01747315004467964,
0.021715624257922173,
-0.08460620790719986,
0.006329342722892761,
0.372953861951828,
0.0326310470700264,
-0.08313784748315811,
0.05612839385867119,
0.07012049108743668,
-0.044726621359586716,
0.033373910933732986,
-0.013265085406601429,
0.0331086665391922,
0.0016830296954140067,
0.1493510752916336,
-0.15650759637355804,
-0.09392713755369186,
-0.08173611015081406,
-0.03701554983854294,
0.07440958172082901,
0.03575160354375839,
0.14641058444976807,
-0.05832705274224281,
0.024860866367816925,
0.004393049515783787,
-0.06739616394042969,
-0.07840071618556976,
-0.06483227759599686,
-0.020117558538913727,
0.1147470623254776,
0.1311824768781662,
0.012270881794393063,
0.10415511578321457,
-0.04823770001530647,
0.013933906331658363,
-0.05042541027069092,
-0.056137777864933014,
0.07530199736356735,
0.10193530470132828,
0.126710906624794,
0.017232738435268402,
-0.059662990272045135,
0.10422396659851074,
-0.09228023886680603,
0.011501933448016644,
-0.04686100408434868,
-0.12866564095020294,
0.03515244275331497,
-0.11901471763849258,
0.11870866268873215,
0.10981471091508865,
-0.00012121200416004285,
0.012678582221269608,
-0.050763778388500214,
0.1678629070520401,
-0.026704730466008186,
0.026858991011977196,
-0.049278926104307175,
-0.09321411699056625,
0.06575004011392593,
-0.04723446071147919,
0.03584606200456619,
-0.017943376675248146,
-0.2284255176782608,
-0.0025990738067775965,
-0.029378842562437057,
-0.08263685554265976,
-0.003314512548968196,
0.005552242044359446,
-0.11977951973676682,
-0.018000569194555283,
-0.13891102373600006,
-0.12277734279632568,
0.00597916916012764,
0.03552301228046417,
-0.004412077832967043,
-0.053231824189424515,
0.03670354187488556,
-0.00028512973221950233,
-0.11596183478832245,
-0.007586562540382147,
-0.02102859690785408,
-0.02973567694425583,
0.022658158093690872,
0.016550743952393532,
0.05622132122516632,
-0.18888849020004272,
0.0792367160320282,
-0.038801662623882294,
0.0450616218149662,
0.1348290741443634,
0.09142722934484482,
-0.14276601374149323,
0.13001929223537445,
0.02056601271033287,
0.011461196467280388,
0.13082124292850494,
-0.03499394282698631,
-0.028463011607527733,
0.05418466404080391,
-0.25892767310142517,
0.033154889941215515,
-0.02520633116364479,
-0.16768336296081543,
-0.2216508537530899,
0.09032530337572098,
0.07133210450410843,
-0.042842522263526917,
0.08512481302022934,
0.16378602385520935,
0.006116974167525768,
-0.10134056955575943,
-0.08881611377000809,
0.03846901282668114,
-0.13598854839801788,
-0.15987418591976166,
0.12188349664211273,
0.15721136331558228,
-0.16254496574401855,
0.04969024658203125,
-0.21960261464118958,
0.06115689128637314,
-0.041727907955646515,
-0.10563083738088608,
-0.04392793029546738,
-0.10564442723989487,
-0.12169443815946579,
0.026533519849181175,
0.020247789099812508,
0.02298278547823429,
0.007956957444548607,
-0.16103237867355347,
0.12144455313682556,
0.03073986992239952,
-0.03408654034137726,
-0.12238678336143494,
0.2609303593635559,
-0.02564416639506817,
0.014380178414285183,
0.06964191049337387,
-0.016134295612573624,
0.10675960779190063,
0.14037162065505981,
0.07663200050592422,
0.09043440967798233,
-0.026045111939311028,
0.03840526193380356,
0.09335019439458847,
-0.035162415355443954,
-0.0012257490307092667,
-0.00711559783667326,
0.02178996242582798,
-0.044101566076278687,
0.031308554112911224,
-0.03815639764070511,
-0.03237560763955116,
-0.04193764552474022,
-0.03783155605196953,
-0.06130371615290642,
-0.10247669368982315,
0.0694686621427536,
0.06029033660888672,
-0.008482631295919418,
-0.05000421404838562,
-0.05407742038369179,
0.013448262587189674,
-0.0009931011591106653,
0.008317218162119389,
-0.1612757295370102,
0.16482611000537872,
0.15054729580879211,
0.0768047422170639,
0.1038564071059227,
-0.05085066333413124,
0.0682004764676094,
0.007016060408204794,
-0.09187627583742142,
0.02071685716509819,
0.07927508652210236,
0.04533299431204796,
0.04000645503401756,
-0.07838866114616394,
0.021846774965524673,
-0.04127127304673195,
0.06056841462850571,
-0.016763517633080482,
-0.023521652445197105,
-0.026207296177744865,
0.07065887749195099,
0.15999062359333038,
-0.0010077947517856956,
0.0919807106256485,
0.10377003252506256,
0.061015281826257706,
0.03934188932180405,
-0.06502749025821686,
-0.027895063161849976,
-0.10226993262767792,
-0.004465751349925995,
0.04338293895125389,
0.1850980967283249,
-0.006161259952932596,
-0.0015208465047180653,
0.033323753625154495,
-0.04586029797792435,
0.007892098277807236,
-0.09304037690162659,
-0.08771757036447525,
-0.020071346312761307,
-0.10299595445394516,
0.049779001623392105,
0.02955736219882965,
-0.18224455416202545,
0.043045200407505035,
-0.02831372804939747,
0.013330343179404736,
-0.007724660448729992,
-0.009498674422502518,
0.004474842455238104,
-0.0031635721679776907,
-0.004809914156794548,
-0.21746142208576202,
-0.013622447848320007,
0.06248678267002106,
-0.02719498984515667,
0.01872718147933483,
0.054868731647729874,
-0.10154871642589569,
-0.05459045246243477,
0.01210301835089922,
-0.11184941977262497,
-0.08794219046831131,
0.007096339017152786,
-0.03256269544363022,
-0.01695982553064823,
-0.0987226814031601,
-0.03482284024357796,
-0.02598300389945507,
-0.057832133024930954,
-0.17447137832641602,
0.06329493224620819,
-0.03193019703030586,
0.17526715993881226,
0.04356743022799492,
-0.024752752855420113,
0.07466427981853485,
-0.027723519131541252,
0.1124633252620697,
-0.0725506916642189,
0.06662639230489731,
0.11453713476657867,
0.07934698462486267,
0.010446433909237385,
0.023227328434586525,
-0.002671323949471116,
-0.059168435633182526,
-0.04048505425453186,
-0.03777892887592316,
-0.0863967314362526,
-0.29042157530784607,
-0.027139555662870407,
-0.04852528125047684,
0.12583325803279877,
-0.15790307521820068,
0.11521092802286148,
0.1162661761045456,
0.11466959863901138,
0.05280733108520508,
0.05342560634016991,
-0.16989260911941528,
-0.030314570292830467,
-0.14810848236083984,
-0.021412011235952377,
0.01773613505065441,
-0.08547024428844452,
0.04758751019835472,
0.15970498323440552,
0.09598138928413391,
0.25515565276145935,
0.02729424089193344,
0.11257076263427734,
0.13085117936134338,
0.19641758501529694,
0.0482213981449604,
-0.0476633757352829,
-0.06115986406803131,
-0.018368342891335487,
-0.052019525319337845,
-0.07263629138469696,
0.06919682025909424,
0.05887049436569214,
0.0546482652425766,
-0.0046668583527207375,
0.0306744072586298,
0.030095843598246574,
0.08621443063020706,
0.059483617544174194,
-0.07699596881866455,
-0.017649168148636818,
0.05291163921356201,
0.067245714366436,
0.04654775559902191,
0.043302495032548904,
0.04380153492093086,
0.04781503602862358,
-0.068636953830719,
0.013391394168138504,
-0.020133141428232193,
0.09641337394714355,
-0.04982300102710724,
-0.040675144642591476,
0.0695965364575386,
-0.07955867797136307,
0.05923421308398247,
0.053874239325523376,
-0.15717579424381256,
0.13630411028862,
0.0055632381699979305,
0.017250346019864082,
-0.02039414830505848,
-0.010312503203749657,
0.02102028764784336,
0.15773312747478485,
0.17816156148910522,
-0.009942864999175072,
-0.19130919873714447,
0.0058691855520009995,
-0.12515275180339813,
0.09543195366859436,
-0.09943930059671402,
-0.010655018500983715,
-0.06092062592506409,
0.012601899914443493,
0.016523204743862152,
-0.05861665681004524,
0.05416132137179375,
-0.1732291728258133,
-0.0012641993816941977,
0.049695443361997604,
0.0789569616317749,
0.10173223167657852,
0.011830391362309456,
-0.025634415447711945,
-0.10978754609823227,
0.14770197868347168,
-0.03191515430808067,
-0.08074156194925308,
-0.03720557689666748,
-0.0016814745031297207,
0.005890657193958759,
-0.0602555014193058,
0.012394413352012634,
-0.012448790483176708,
-0.005583065561950207,
0.04300020635128021,
-0.0693918764591217,
-0.035526927560567856,
-0.053426340222358704,
0.014710882678627968,
0.030573803931474686,
0.004609431605786085,
0.07212870568037033,
0.0509403832256794,
0.04866784065961838,
-0.04501564800739288,
-0.04140384495258331,
-0.15516208112239838,
0.13036401569843292,
0.08771687000989914,
-0.09490661323070526,
0.048096928745508194,
-0.022364698350429535,
-0.19904284179210663,
-0.10955128818750381,
-0.12124048918485641,
0.1255321055650711,
0.08674933016300201,
-0.1641031801700592,
0.10842210054397583,
0.07170369476079941,
-0.09871810674667358,
-0.15202783048152924,
-0.027185998857021332,
-0.057363975793123245,
0.02194097824394703,
0.0019680571276694536,
-0.08896283805370331,
0.01413195300847292,
0.02294122613966465,
-0.07767970860004425,
0.19935722649097443,
-0.24633072316646576,
0.01694403402507305,
-0.023694932460784912,
0.03769693523645401,
0.021942341700196266,
-0.19554480910301208,
-0.07259657233953476,
0.017437167465686798,
0.03501230105757713,
0.15389080345630646,
-0.003490094793960452,
0.08031771332025528,
-0.08030326664447784,
0.005425582639873028,
-0.011469215154647827,
-0.00591999851167202,
0.13353760540485382,
-0.04290950670838356,
0.03539702296257019,
-0.0807216614484787,
-0.1410539746284485,
-0.018376410007476807,
-0.009709032252430916,
0.18255358934402466,
-0.10617773979902267,
0.00781058007851243,
-0.10949604958295822,
-0.018063774332404137,
-0.005264995153993368,
0.11155742406845093,
0.046061690896749496,
-0.11919241398572922,
-0.11538761109113693,
0.04012925177812576,
-0.0538342148065567,
0.06006981059908867,
0.005097414832562208,
0.00576585391536355,
-0.08706111460924149,
0.014249804429709911,
-0.0469597727060318,
-0.04216974973678589,
-0.014029174111783504,
-0.12250997871160507,
0.016074873507022858,
0.132553368806839,
-0.1007709950208664,
-0.02861148864030838,
0.10978153347969055,
-0.05436486005783081,
0.08262899518013,
-0.009854592382907867,
-0.10222584009170532,
0.14550136029720306,
0.031941551715135574,
-0.03743140026926994,
-0.19403153657913208,
-0.09212743490934372,
0.0720779150724411,
0.1650126576423645,
0.1484648734331131,
-0.023615777492523193,
-0.09707798063755035,
-0.018890127539634705,
-0.00345286400988698,
0.061662230640649796,
-0.049206484109163284,
-0.05483861267566681,
0.10509135574102402,
-0.07249157875776291,
-0.10075997561216354,
0.010868215933442116,
0.06156738102436066,
0.009790790267288685,
0.0323471836745739,
0.07140394300222397,
0.059356689453125,
-0.13737605512142181,
0.005911021493375301,
0.07483174651861191,
-0.07054188847541809,
-0.12622161209583282,
0.03776317089796066,
-0.09629744291305542,
-0.011153346858918667,
-0.09590242803096771,
0.028193362057209015,
0.03339951112866402,
0.09909138083457947,
0.020597195252776146,
0.026587892323732376,
0.06210053712129593,
-0.052960965782403946,
-0.0029252679087221622,
-0.20304854214191437,
-0.06195738539099693,
0.09918533265590668,
0.2162235975265503,
-0.06421265751123428,
-0.050494711846113205,
-0.046640798449516296,
0.0049572899006307125,
-0.10048959404230118,
0.11987162381410599,
-0.008298630826175213,
-0.0197772029787302,
0.010766998864710331,
-0.10837990045547485,
-0.08316260576248169,
0.1014675721526146,
-0.045908547937870026,
0.03020946867763996,
0.023561304435133934,
0.0410052165389061,
-0.05591728910803795,
-0.0798865482211113,
0.04530056565999985,
0.07709693908691406,
0.09677005559206009,
-0.00023743082419969141,
0.003192922566086054,
-0.03623083606362343,
-0.21175052225589752,
0.09292437136173248,
0.025449492037296295,
0.06584411859512329,
0.1097111627459526,
0.006390695925801992,
0.024930395185947418,
0.028374042361974716,
-0.11807520687580109,
-0.03259486332535744,
0.020449861884117126,
-0.09804549813270569,
-0.10038483142852783,
-0.0021086183842271566,
-0.08710073679685593,
0.010114430449903011,
0.054686617106199265,
0.041126661002635956,
0.0786314457654953,
0.0462082140147686,
0.031100546941161156,
0.12596601247787476,
-0.06221403181552887,
0.03251374140381813,
-0.008570296689867973,
-0.007827653549611568,
-0.061056334525346756,
-0.04368620738387108,
-0.025511080399155617,
-0.03561616316437721,
0.12575292587280273,
0.19959045946598053,
-0.10709092766046524,
-0.052061934024095535,
-0.02567356824874878,
0.11810560524463654,
-0.057928506284952164,
0.0708756223320961,
-0.027103951200842857,
0.1219087466597557,
-0.012732869945466518,
0.018518375232815742,
-0.06575535237789154,
-0.08513224869966507,
0.03395301103591919,
-0.07583468407392502,
0.11842186003923416,
-0.016815442591905594,
0.13829924166202545,
-0.04398932307958603,
-0.14732874929904938,
-0.14268766343593597,
-0.045795463025569916,
0.08204423636198044,
-0.006092760246247053,
0.09593905508518219,
0.10898330807685852,
-0.07604062557220459,
0.023807112127542496,
0.02962893806397915,
-0.026723617687821388,
-0.043952856212854385,
-0.15176725387573242,
-0.06520664691925049,
-0.1372513622045517,
0.001957325264811516,
-0.03074771538376808,
-0.0051446459256112576,
-0.06599532812833786,
0.0005218299338594079,
-0.08506684750318527,
-0.002499312860891223,
0.03804519400000572,
0.04026784747838974,
0.012575034983456135,
0.0009776438819244504,
-0.08654200285673141,
-0.08142805844545364,
0.11908016353845596,
-0.06776813417673111,
-0.005618448834866285,
-0.06798961013555527,
0.049700405448675156,
-0.02858537621796131,
0.09811115264892578,
-0.0357435941696167,
0.00971266720443964,
-0.0664566233754158,
-0.05282200872898102,
0.04762481898069382,
0.11846651881933212,
0.05674219876527786,
-0.011106333695352077,
0.07229974120855331,
0.03981946408748627,
0.06621198356151581,
0.057040926069021225,
-0.052442632615566254,
-0.024767665192484856,
-0.17159011960029602,
-0.06925883889198303,
-0.043703772127628326,
0.03795802593231201,
0.003582171630114317,
0.2108481377363205,
0.2207401543855667,
-0.05747028440237045,
-0.04183071106672287,
-0.10519839078187943,
0.00039462134009227157,
-0.12799398601055145,
0.09971105307340622,
0.041536904871463776,
0.3204924762248993,
-0.06248725950717926,
0.02594129927456379,
-0.20277199149131775,
0.06417414546012878,
-0.0995992049574852,
-0.0545199029147625,
0.015087871812283993,
0.02781837061047554,
-0.007707206532359123,
0.09011499583721161,
-0.08856363594532013,
-0.028712289407849312,
0.012290854007005692,
0.027156712487339973,
-0.021338384598493576,
0.04784636199474335,
0.06985080242156982,
0.1012922003865242,
-0.04901360347867012,
-0.07749251276254654,
0.051475685089826584,
0.019550548866391182,
0.017348086461424828,
-0.1419011801481247,
0.032012537121772766,
0.19029003381729126,
-0.10941068083047867,
0.28189289569854736,
-0.025738446041941643,
-0.0014682001201435924,
-0.04084136337041855,
0.003835823619738221,
-0.1180596649646759,
0.11087588965892792,
-0.0013925826642662287,
-0.04496759548783302,
0.014402041211724281,
-0.01324146706610918,
0.014991523697972298,
0.06225268170237541,
0.05434079468250275,
0.22602391242980957,
0.002252075355499983,
0.03462154418230057,
0.04678871110081673,
-0.06638247519731522,
0.10865989327430725,
-0.1406928449869156,
0.12619683146476746,
0.04712383449077606,
-0.05409398302435875,
-0.06371378898620605,
-0.06778779625892639,
0.0604669563472271,
0.007588752079755068,
-0.018828386440873146,
0.004988667089492083,
0.014866050332784653,
-0.005796775221824646,
0.1890675127506256,
0.04974917694926262,
0.012830150313675404,
-0.01841386780142784,
-0.1080210730433464,
-0.042714715003967285,
-0.021940449252724648,
-0.05656483396887779,
0.21565376222133636,
-0.032876089215278625,
-0.045844465494155884,
0.017874909564852715,
-0.013320107012987137,
-0.057616982609033585,
-0.012078664265573025,
-0.03749769553542137
] |
null | null | peft |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# sft-microsoft-phi2-on-dialogsum
This model is a fine-tuned version of [microsoft/phi-2](https://huggingface.co/microsoft/phi-2) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 1.3041
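
Below is a hedged sketch of how a PEFT adapter like this one is typically loaded on top of `microsoft/phi-2` for inference. Only the two repo ids come from this card; the prompt format and generation settings are illustrative assumptions rather than the author's confirmed workflow.

```python
# Hedged sketch: loading this PEFT adapter onto the microsoft/phi-2 base model.
# Only the two repo ids are taken from this card; the rest is a standard PEFT setup.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base = AutoModelForCausalLM.from_pretrained("microsoft/phi-2")
tokenizer = AutoTokenizer.from_pretrained("microsoft/phi-2")
model = PeftModel.from_pretrained(base, "ghost613/sft-microsoft-phi2-on-dialogsum")
model.eval()

# Assumed dialogsum-style prompt; adapt to the format used during fine-tuning.
prompt = "Summarize the following dialogue:\n#Person1#: Hello, how are you?\nSummary:"
inputs = tokenizer(prompt, return_tensors="pt")
with torch.no_grad():
    output = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```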
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 2
- eval_batch_size: 2
- seed: 42
- gradient_accumulation_steps: 5
- total_train_batch_size: 10
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 50
- training_steps: 500
- mixed_precision_training: Native AMP
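
For reference, a hedged sketch of how these values might map onto `transformers.TrainingArguments` is shown below. Only the numeric hyperparameters come from this card; the output directory, the optimizer name, and the use of `fp16` for "Native AMP" are assumptions.

```python
# Hedged sketch: the hyperparameters above expressed as TrainingArguments.
# Dataset preparation, the LoRA config, and the trainer itself are omitted.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="sft-microsoft-phi2-on-dialogsum",  # assumed output path
    learning_rate=5e-5,
    per_device_train_batch_size=2,
    per_device_eval_batch_size=2,
    gradient_accumulation_steps=5,   # effective train batch size: 2 * 5 = 10
    lr_scheduler_type="linear",
    warmup_steps=50,
    max_steps=500,
    seed=42,
    fp16=True,                       # "Native AMP" mixed precision (assumed fp16)
    optim="adamw_torch",             # Adam with betas=(0.9, 0.999), epsilon=1e-08
)
```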
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:----:|:---------------:|
| 2.088 | 0.05 | 10 | 2.0210 |
| 1.9945 | 0.1 | 20 | 1.9228 |
| 1.8836 | 0.15 | 30 | 1.7275 |
| 1.6446 | 0.2 | 40 | 1.4587 |
| 1.4515 | 0.25 | 50 | 1.3866 |
| 1.4383 | 0.3 | 60 | 1.3570 |
| 1.3909 | 0.35 | 70 | 1.3467 |
| 1.3326 | 0.4 | 80 | 1.3409 |
| 1.3776 | 0.45 | 90 | 1.3379 |
| 1.294 | 0.5 | 100 | 1.3333 |
| 1.3302 | 0.55 | 110 | 1.3299 |
| 1.3272 | 0.6 | 120 | 1.3272 |
| 1.3121 | 0.65 | 130 | 1.3251 |
| 1.3259 | 0.7 | 140 | 1.3233 |
| 1.3359 | 0.75 | 150 | 1.3220 |
| 1.2652 | 0.8 | 160 | 1.3214 |
| 1.3379 | 0.85 | 170 | 1.3201 |
| 1.3724 | 0.9 | 180 | 1.3181 |
| 1.296 | 0.95 | 190 | 1.3173 |
| 1.3137 | 1.0 | 200 | 1.3166 |
| 1.3418 | 1.05 | 210 | 1.3157 |
| 1.279 | 1.1 | 220 | 1.3158 |
| 1.2952 | 1.15 | 230 | 1.3151 |
| 1.3373 | 1.2 | 240 | 1.3140 |
| 1.3266 | 1.25 | 250 | 1.3127 |
| 1.3438 | 1.3 | 260 | 1.3118 |
| 1.3005 | 1.35 | 270 | 1.3108 |
| 1.2956 | 1.4 | 280 | 1.3102 |
| 1.3305 | 1.45 | 290 | 1.3100 |
| 1.3162 | 1.5 | 300 | 1.3090 |
| 1.3043 | 1.55 | 310 | 1.3085 |
| 1.3038 | 1.6 | 320 | 1.3082 |
| 1.2942 | 1.65 | 330 | 1.3077 |
| 1.2853 | 1.7 | 340 | 1.3074 |
| 1.2924 | 1.75 | 350 | 1.3066 |
| 1.3229 | 1.8 | 360 | 1.3061 |
| 1.2597 | 1.85 | 370 | 1.3059 |
| 1.3058 | 1.9 | 380 | 1.3058 |
| 1.3168 | 1.95 | 390 | 1.3058 |
| 1.2991 | 2.0 | 400 | 1.3055 |
| 1.2974 | 2.05 | 410 | 1.3055 |
| 1.3091 | 2.1 | 420 | 1.3053 |
| 1.2637 | 2.15 | 430 | 1.3050 |
| 1.3002 | 2.2 | 440 | 1.3047 |
| 1.2989 | 2.25 | 450 | 1.3044 |
| 1.2609 | 2.3 | 460 | 1.3044 |
| 1.2734 | 2.35 | 470 | 1.3044 |
| 1.2927 | 2.4 | 480 | 1.3043 |
| 1.3172 | 2.45 | 490 | 1.3042 |
| 1.3341 | 2.5 | 500 | 1.3041 |
### Framework versions
- PEFT 0.7.1
- Transformers 4.36.2
- Pytorch 2.1.2
- Datasets 2.15.0
- Tokenizers 0.15.1 | {"license": "mit", "library_name": "peft", "tags": ["generated_from_trainer"], "base_model": "microsoft/phi-2", "model-index": [{"name": "sft-microsoft-phi2-on-dialogsum", "results": []}]} | null | ghost613/sft-microsoft-phi2-on-dialogsum | [
"peft",
"safetensors",
"generated_from_trainer",
"base_model:microsoft/phi-2",
"license:mit",
"region:us"
] | 2024-02-14T16:43:14+00:00 | [] | [] | TAGS
#peft #safetensors #generated_from_trainer #base_model-microsoft/phi-2 #license-mit #region-us
| sft-microsoft-phi2-on-dialogsum
===============================
This model is a fine-tuned version of microsoft/phi-2 on an unknown dataset.
It achieves the following results on the evaluation set:
* Loss: 1.3041
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 5e-05
* train\_batch\_size: 2
* eval\_batch\_size: 2
* seed: 42
* gradient\_accumulation\_steps: 5
* total\_train\_batch\_size: 10
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* lr\_scheduler\_warmup\_steps: 50
* training\_steps: 500
* mixed\_precision\_training: Native AMP
### Training results
### Framework versions
* PEFT 0.7.1
* Transformers 4.36.2
* Pytorch 2.1.2
* Datasets 2.15.0
* Tokenizers 0.15.1
| [
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 5e-05\n* train\\_batch\\_size: 2\n* eval\\_batch\\_size: 2\n* seed: 42\n* gradient\\_accumulation\\_steps: 5\n* total\\_train\\_batch\\_size: 10\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 50\n* training\\_steps: 500\n* mixed\\_precision\\_training: Native AMP",
"### Training results",
"### Framework versions\n\n\n* PEFT 0.7.1\n* Transformers 4.36.2\n* Pytorch 2.1.2\n* Datasets 2.15.0\n* Tokenizers 0.15.1"
] | [
"TAGS\n#peft #safetensors #generated_from_trainer #base_model-microsoft/phi-2 #license-mit #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 5e-05\n* train\\_batch\\_size: 2\n* eval\\_batch\\_size: 2\n* seed: 42\n* gradient\\_accumulation\\_steps: 5\n* total\\_train\\_batch\\_size: 10\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 50\n* training\\_steps: 500\n* mixed\\_precision\\_training: Native AMP",
"### Training results",
"### Framework versions\n\n\n* PEFT 0.7.1\n* Transformers 4.36.2\n* Pytorch 2.1.2\n* Datasets 2.15.0\n* Tokenizers 0.15.1"
] | [
35,
158,
4,
36
] | [
"passage: TAGS\n#peft #safetensors #generated_from_trainer #base_model-microsoft/phi-2 #license-mit #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 5e-05\n* train\\_batch\\_size: 2\n* eval\\_batch\\_size: 2\n* seed: 42\n* gradient\\_accumulation\\_steps: 5\n* total\\_train\\_batch\\_size: 10\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 50\n* training\\_steps: 500\n* mixed\\_precision\\_training: Native AMP### Training results### Framework versions\n\n\n* PEFT 0.7.1\n* Transformers 4.36.2\n* Pytorch 2.1.2\n* Datasets 2.15.0\n* Tokenizers 0.15.1"
] | [
-0.1189567968249321,
0.08719213306903839,
-0.003132718149572611,
0.08770846575498581,
0.11823828518390656,
0.014019089750945568,
0.09797748923301697,
0.1490527242422104,
-0.08658260852098465,
0.07962153106927872,
0.10460751503705978,
0.10678038001060486,
0.060023386031389236,
0.2163049280643463,
-0.033528074622154236,
-0.2795961797237396,
0.027920888736844063,
-0.02352244220674038,
-0.0506320521235466,
0.12554717063903809,
0.09635897725820541,
-0.12089792639017105,
0.05399823188781738,
-0.025003116577863693,
-0.13026323914527893,
-0.006808651611208916,
-0.0077193197794258595,
-0.03141563758254051,
0.11027292162179947,
-0.0024219327606260777,
0.10077943652868271,
0.0359201617538929,
0.10574215650558472,
-0.24305494129657745,
0.007581059820950031,
0.0654742643237114,
0.026708196848630905,
0.07386644184589386,
0.10253425687551498,
-0.021801792085170746,
0.11046519130468369,
-0.10509338229894638,
0.0678512379527092,
0.028168480843305588,
-0.15240538120269775,
-0.33105215430259705,
-0.11512316763401031,
0.048779767006635666,
0.12339936941862106,
0.06563758105039597,
-0.02263455279171467,
0.1183348000049591,
-0.0676545798778534,
0.08585193008184433,
0.2821284532546997,
-0.2790813148021698,
-0.06550737470388412,
0.014443917199969292,
0.03314101696014404,
0.07495620101690292,
-0.11109349131584167,
-0.045568063855171204,
0.0367887020111084,
0.034643352031707764,
0.12869574129581451,
0.0033653208520263433,
-0.047010403126478195,
-0.0036118547432124615,
-0.15253104269504547,
-0.059070806950330734,
0.10733897984027863,
0.04073769971728325,
-0.038495104759931564,
-0.0882691890001297,
-0.04294209182262421,
-0.1751715987920761,
-0.053739771246910095,
0.005407326389104128,
0.043036457151174545,
-0.04510926827788353,
-0.045547161251306534,
0.0013404153287410736,
-0.0749446302652359,
-0.08530472218990326,
0.021310266107320786,
0.1266871988773346,
0.07014148682355881,
0.0022862947080284357,
0.0025713974609971046,
0.11699549853801727,
-0.011986160650849342,
-0.14050304889678955,
-0.011607998050749302,
0.0014894054038450122,
-0.04368855431675911,
-0.037545882165431976,
-0.03465129807591438,
0.02577885426580906,
0.015306474640965462,
0.15531164407730103,
-0.1414242833852768,
0.06877411901950836,
0.009892519563436508,
0.023562978953123093,
-0.08297337591648102,
0.10844264924526215,
-0.05434418097138405,
0.010492783039808273,
-0.024890011176466942,
0.09898851066827774,
0.01702507585287094,
0.0011538562830537558,
-0.059835515916347504,
0.047196585685014725,
0.11296890676021576,
0.06911374628543854,
-0.04014046490192413,
0.025688258931040764,
-0.07098271697759628,
-0.008749694563448429,
0.04883797839283943,
-0.11902520805597305,
0.05381697416305542,
0.020713288336992264,
-0.08051799982786179,
-0.08713825047016144,
-0.0072224740870296955,
0.008210543543100357,
-0.01925414614379406,
0.10880176723003387,
-0.06477134674787521,
0.034432440996170044,
-0.08821132034063339,
-0.12061313539743423,
0.023955022916197777,
-0.07368175685405731,
0.00456559332087636,
-0.05821193754673004,
-0.16604560613632202,
-0.03911135345697403,
0.047255631536245346,
-0.07079671323299408,
-0.0400521345436573,
-0.050832927227020264,
-0.08540669083595276,
0.007813659496605396,
-0.0313224233686924,
0.14001648128032684,
-0.07795926183462143,
0.11471809446811676,
0.01608278974890709,
0.06218884885311127,
0.005818929523229599,
0.04135054349899292,
-0.09533466398715973,
0.04447329789400101,
-0.21536040306091309,
0.05052045360207558,
-0.08473066985607147,
0.05032628774642944,
-0.12599903345108032,
-0.11075161397457123,
-0.0006341848056763411,
-0.019899623468518257,
0.1063050702214241,
0.1345665156841278,
-0.1925051212310791,
-0.04601004719734192,
0.22151950001716614,
-0.11703009903430939,
-0.1152753233909607,
0.09420083463191986,
-0.03209712728857994,
0.01622381992638111,
0.04330601915717125,
0.23369109630584717,
0.034791819751262665,
-0.13203008472919464,
0.024885268881917,
-0.05587068945169449,
0.09191462397575378,
0.006755353882908821,
0.07866925746202469,
-0.030148759484291077,
0.030790887773036957,
0.014576630666851997,
-0.04976809024810791,
0.039147719740867615,
-0.11335919052362442,
-0.08215561509132385,
-0.034494880586862564,
-0.08102864027023315,
0.018421975895762444,
0.0634000301361084,
0.03268313407897949,
-0.11889483034610748,
-0.08848918229341507,
0.04234980046749115,
0.09305176138877869,
-0.06475412100553513,
0.0328369140625,
-0.05989747866988182,
0.08543810248374939,
-0.006670475006103516,
-0.018979836255311966,
-0.1835632473230362,
-0.07418433576822281,
0.03980633243918419,
-0.045637186616659164,
-0.015539607964456081,
-0.07349848002195358,
0.07839532196521759,
0.08337206393480301,
-0.07257912307977676,
-0.06085887551307678,
-0.04814481362700462,
0.0015895914984866977,
-0.1084257960319519,
-0.24043738842010498,
-0.06428426504135132,
-0.03847784921526909,
0.10841459780931473,
-0.2246704399585724,
0.030852586030960083,
0.005386393517255783,
0.12096680700778961,
0.0321233794093132,
-0.051390718668699265,
-0.001765035092830658,
0.06982097774744034,
-0.01209582295268774,
-0.07765962928533554,
0.05523912236094475,
0.0007506599649786949,
-0.06987996399402618,
0.008090793155133724,
-0.11772975325584412,
0.11892180889844894,
0.1018000990152359,
0.018925083801150322,
-0.10148060321807861,
-0.06050489842891693,
-0.08232787251472473,
-0.0398377999663353,
-0.05692063271999359,
0.04603040590882301,
0.09060141444206238,
0.012731168419122696,
0.11465591192245483,
-0.0999566987156868,
-0.04739491641521454,
0.035534705966711044,
-0.030641961842775345,
0.010391906835138798,
0.14174434542655945,
0.040525171905756,
-0.0700770765542984,
0.12842172384262085,
0.13862447440624237,
-0.006991908885538578,
0.10233235359191895,
-0.07530773431062698,
-0.10601300001144409,
-0.02974729984998703,
0.046106934547424316,
0.01410833839327097,
0.15197767317295074,
-0.03785531967878342,
0.022516539320349693,
0.019860604777932167,
0.035806868225336075,
0.009128399193286896,
-0.21184921264648438,
-0.02912699244916439,
0.014215536415576935,
-0.08153524249792099,
-0.04602692648768425,
-0.03132795915007591,
0.02131180837750435,
0.11243195086717606,
-0.006987620610743761,
-0.041490089148283005,
-0.010049989446997643,
-0.005638001952320337,
-0.08282259106636047,
0.20460253953933716,
-0.097804956138134,
-0.11706428974866867,
-0.06575658917427063,
-0.02858807146549225,
-0.011654416099190712,
-0.016232062131166458,
0.06839688867330551,
-0.08089117705821991,
-0.02922200784087181,
-0.11330390721559525,
-0.010766584426164627,
0.006180507596582174,
0.013506945222616196,
-0.03372490778565407,
0.008062917739152908,
0.08875270932912827,
-0.08639735728502274,
0.0011558395344763994,
-0.03808808699250221,
-0.03820077329874039,
0.06198251247406006,
0.035984139889478683,
0.09905332326889038,
0.12613555788993835,
0.010250731371343136,
0.033235009759664536,
-0.048357103019952774,
0.2197345495223999,
-0.07101967185735703,
-0.021615972742438316,
0.09379013627767563,
0.004820963833481073,
0.07147074490785599,
0.1610681116580963,
0.045252613723278046,
-0.12118761241436005,
0.011258265934884548,
0.04480794072151184,
-0.02153650112450123,
-0.22554273903369904,
-0.0532258003950119,
-0.03142503276467323,
-0.015587422996759415,
0.11642885208129883,
0.041282206773757935,
-0.03417051210999489,
0.019677363336086273,
-0.014607168734073639,
-0.04433146491646767,
0.020078878849744797,
0.07431931793689728,
0.012902536429464817,
0.031485795974731445,
0.11293032765388489,
-0.0286780446767807,
-0.01656019501388073,
0.045490607619285583,
-0.006163945887237787,
0.27100831270217896,
-0.025963492691516876,
0.09308944642543793,
0.06466881185770035,
0.20472899079322815,
-0.003078247420489788,
0.06775099039077759,
0.01597491465508938,
-0.017789201810956,
0.004473572596907616,
-0.060617510229349136,
-0.003694961778819561,
0.03929997980594635,
-0.0339234322309494,
0.025287901982665062,
-0.14537018537521362,
-0.02816709876060486,
0.05382270738482475,
0.3203917443752289,
0.08411504328250885,
-0.3143596649169922,
-0.06764432042837143,
0.01040541473776102,
-0.028172612190246582,
-0.044118497520685196,
0.0024580340832471848,
0.10464087873697281,
-0.08156102895736694,
0.07702828198671341,
-0.06554076820611954,
0.08856555074453354,
-0.022905055433511734,
0.01451889332383871,
0.06585193425416946,
0.06794759631156921,
-0.012069565244019032,
0.056564755737781525,
-0.27557000517845154,
0.32684004306793213,
0.009896359406411648,
0.07238682359457016,
-0.04030165076255798,
-0.0034442765172570944,
0.009377947077155113,
0.01771143451333046,
0.11254642903804779,
-0.00248510530218482,
-0.1481328010559082,
-0.21539899706840515,
-0.10994438827037811,
0.016054444015026093,
0.1355070024728775,
-0.030451113358139992,
0.11539982259273529,
-0.01138317957520485,
0.006011881399899721,
0.0383303202688694,
-0.07938219606876373,
-0.11404164135456085,
-0.05783861503005028,
-0.0000787838944233954,
-0.02881208434700966,
0.01388558465987444,
-0.09995197504758835,
-0.09507139027118683,
-0.04417745769023895,
0.1418561190366745,
-0.03760918602347374,
-0.05080714076757431,
-0.13810911774635315,
0.09156154096126556,
0.12956926226615906,
-0.07248884439468384,
0.04131421446800232,
0.019017329439520836,
0.08902288973331451,
0.028507564216852188,
-0.03450687602162361,
0.12996423244476318,
-0.06246637925505638,
-0.21197691559791565,
-0.0496021993458271,
0.14034676551818848,
0.05366920679807663,
0.05390943959355354,
-0.01594303362071514,
0.04187902435660362,
0.0074081216007471085,
-0.0853366106748581,
0.052400026470422745,
0.010514753870666027,
0.07065451890230179,
0.04528123512864113,
-0.07374260574579239,
0.03752424195408821,
-0.07175621390342712,
-0.04626872390508652,
0.11204066872596741,
0.335021048784256,
-0.10133955627679825,
0.06224571168422699,
0.04445084556937218,
-0.06183403730392456,
-0.18405716121196747,
0.0186797846108675,
0.08460211008787155,
-0.006522847339510918,
0.06703418493270874,
-0.1832560896873474,
0.03783031925559044,
0.11669702082872391,
-0.03577359393239021,
0.08951765298843384,
-0.31863096356391907,
-0.13061556220054626,
0.09113876521587372,
0.14037688076496124,
-0.005786046851426363,
-0.17643025517463684,
-0.05497756600379944,
0.018137222155928612,
-0.06869849562644958,
0.07666663080453873,
-0.0914861336350441,
0.10025513172149658,
-0.022797726094722748,
0.0276163499802351,
0.02197333239018917,
-0.05667947232723236,
0.14216428995132446,
-0.014718628488481045,
0.10836639255285263,
-0.038488298654556274,
0.048652347177267075,
0.040548428893089294,
-0.06883067637681961,
0.04354095831513405,
-0.040605224668979645,
0.04091241583228111,
-0.09838003665208817,
-0.011318488977849483,
-0.09510064870119095,
0.0260281041264534,
-0.046616457402706146,
-0.04401657357811928,
-0.03370115906000137,
0.06806279718875885,
0.06694039702415466,
-0.011775818653404713,
0.13441944122314453,
-0.011364887468516827,
0.18380574882030487,
0.12310187518596649,
0.050008539110422134,
-0.03560244292020798,
-0.04411689564585686,
0.004042463377118111,
-0.022845081984996796,
0.030946284532546997,
-0.14315450191497803,
0.033911459147930145,
0.1375783085823059,
0.028383268043398857,
0.1239214614033699,
0.053237564861774445,
-0.0685960128903389,
-0.0034358755219727755,
0.0718071460723877,
-0.14181311428546906,
-0.13814115524291992,
0.02182191237807274,
0.01666099578142166,
-0.12754546105861664,
0.025629442185163498,
0.08393324166536331,
-0.05494484305381775,
-0.02498943917453289,
-0.013766455464065075,
0.060461752116680145,
-0.026577899232506752,
0.22415363788604736,
0.03548293933272362,
0.07537395507097244,
-0.1114019826054573,
0.1055065393447876,
0.05765829235315323,
-0.10840858519077301,
0.028760064393281937,
0.10425711423158646,
-0.08046567440032959,
-0.02610941044986248,
0.09076102823019028,
0.1171507015824318,
0.0014402122469618917,
-0.05018656328320503,
-0.12367068976163864,
-0.13372060656547546,
0.08342832326889038,
0.1246219053864479,
0.06085417792201042,
0.016759609803557396,
0.03233572840690613,
0.012702701613307,
-0.10770036280155182,
0.10646156966686249,
0.08074645698070526,
0.07094191014766693,
-0.12092640995979309,
0.1349404901266098,
0.0061719948425889015,
0.028382014483213425,
-0.011982993222773075,
0.04007303714752197,
-0.13074864447116852,
0.01831893064081669,
-0.0959937646985054,
-0.014335575513541698,
-0.04905588924884796,
-0.0070825680159032345,
-0.010219232179224491,
-0.0614212304353714,
-0.04209137335419655,
0.02232913114130497,
-0.12002266943454742,
-0.04442937299609184,
-0.011024704203009605,
0.04974706843495369,
-0.14168916642665863,
-0.044618770480155945,
0.02017795667052269,
-0.08759640157222748,
0.08038037270307541,
0.04396980628371239,
0.025150233879685402,
0.043670374900102615,
-0.10154373943805695,
0.017996232956647873,
0.05108707770705223,
-0.02124708518385887,
0.04379451274871826,
-0.15127018094062805,
-0.028525875881314278,
-0.00766929192468524,
0.020774882286787033,
0.0210945513099432,
0.05757250264286995,
-0.14093990623950958,
-0.00685137091204524,
-0.024261051788926125,
-0.06471835821866989,
-0.03086644969880581,
0.034578606486320496,
0.06277605146169662,
0.022759269922971725,
0.15387074649333954,
-0.09364873170852661,
0.03879246860742569,
-0.2326085865497589,
-0.01670686900615692,
-0.024960588663816452,
-0.08236230909824371,
-0.08521126210689545,
-0.01656811311841011,
0.09098386019468307,
-0.05327485129237175,
0.1420198529958725,
-0.004728986881673336,
0.05813443660736084,
0.043504469096660614,
-0.06402426958084106,
0.023518912494182587,
0.05060877278447151,
0.17712898552417755,
0.008091080002486706,
-0.04493088647723198,
0.08408956974744797,
0.03456352278590202,
0.052463360130786896,
0.12519732117652893,
0.22120946645736694,
0.17997713387012482,
0.06225653365254402,
0.057628605514764786,
0.04190509021282196,
-0.1087406724691391,
-0.12366852164268494,
0.08159489929676056,
-0.0018659877823665738,
0.10050580650568008,
-0.024750197306275368,
0.20373067259788513,
0.11191649734973907,
-0.21342499554157257,
0.06042121723294258,
-0.03692205250263214,
-0.08748157322406769,
-0.11902960389852524,
-0.05098283290863037,
-0.08491306751966476,
-0.16707146167755127,
-0.009232657961547375,
-0.1157742440700531,
0.04227438196539879,
0.08943292498588562,
0.013438640162348747,
0.030849311500787735,
0.13740980625152588,
0.07314740121364594,
0.030601374804973602,
0.055476538836956024,
0.023621397092938423,
-0.010944284498691559,
-0.03769697993993759,
-0.0913061797618866,
0.030780717730522156,
-0.03378509730100632,
0.034087471663951874,
-0.029699373990297318,
-0.07182073593139648,
0.05426036939024925,
-0.010665087029337883,
-0.10249637812376022,
0.02222617156803608,
0.015460091643035412,
0.056115783751010895,
0.07713327556848526,
0.037422921508550644,
-0.005564833525568247,
-0.0240100659430027,
0.2586657702922821,
-0.07656405121088028,
-0.0689597874879837,
-0.09813819825649261,
0.307557076215744,
0.03505357727408409,
-0.016205988824367523,
0.031846996396780014,
-0.08600860834121704,
0.008648505434393883,
0.15254350006580353,
0.16493920981884003,
-0.03847748786211014,
-0.0031533280853182077,
-0.020690737292170525,
-0.014566609635949135,
-0.025029899552464485,
0.10435476899147034,
0.12239345908164978,
0.03252865746617317,
-0.0949317216873169,
-0.02238539047539234,
-0.05521964654326439,
-0.03113798424601555,
-0.07337040454149246,
0.06386744976043701,
0.034895703196525574,
0.007711644284427166,
-0.04821367934346199,
0.10560457408428192,
-0.057025305926799774,
-0.07460962980985641,
0.052442532032728195,
-0.1830846220254898,
-0.17715756595134735,
-0.021525384858250618,
0.053358495235443115,
0.015173125080764294,
0.04614279419183731,
-0.016980906948447227,
0.001663379603996873,
0.0839688777923584,
-0.019373323768377304,
-0.04313311353325844,
-0.14906686544418335,
0.08014169335365295,
-0.12768840789794922,
0.24054685235023499,
-0.024167682975530624,
0.02645793929696083,
0.11747857183218002,
0.02184435725212097,
-0.12464317679405212,
0.06932006031274796,
0.06366046518087387,
-0.08159948885440826,
0.020005477592349052,
0.14707235991954803,
-0.04324650019407272,
0.0778588056564331,
0.04413812234997749,
-0.11866553127765656,
0.01148560643196106,
-0.05046636611223221,
-0.06207425147294998,
-0.039997659623622894,
-0.012725500389933586,
-0.03859923779964447,
0.12086952477693558,
0.2067970484495163,
-0.057449065148830414,
0.033490169793367386,
-0.07001391798257828,
0.022356929257512093,
0.04469779506325722,
0.08410897850990295,
-0.010968226008117199,
-0.2646096646785736,
0.03509020432829857,
0.10054326802492142,
0.007857133634388447,
-0.2201489806175232,
-0.08285602927207947,
0.028290005400776863,
-0.05607222765684128,
-0.09184736758470535,
0.12894737720489502,
0.07588285952806473,
0.05213402211666107,
-0.0558391697704792,
-0.12327702343463898,
-0.04549938067793846,
0.186030313372612,
-0.13639536499977112,
-0.06527870148420334
] |
null | null | transformers |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# Whisper Small Ro - Sarbu Vlad
This model is a fine-tuned version of [openai/whisper-small](https://huggingface.co/openai/whisper-small) on the Common Voice 16.1 dataset.
It achieves the following results on the evaluation set:
- Loss: 0.2920
- Wer: 18.6647
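The WER above is the word error rate (in percent) on the Common Voice 16.1 Romanian test split. As a hedged illustration (the transcripts below are placeholders, not outputs of this model), such a score is typically computed with the 🤗 Evaluate library:

```python
import evaluate  # requires the `evaluate` and `jiwer` packages

wer_metric = evaluate.load("wer")

# Hypothetical predictions/references, only to show the call signature.
predictions = ["buna ziua tuturor", "acesta este un test"]
references = ["bună ziua tuturor", "acesta este un test"]

# `compute` returns a fraction; multiply by 100 to match the value reported above.
wer = 100 * wer_metric.compute(predictions=predictions, references=references)
print(f"WER: {wer:.2f}")
```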
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
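For orientation, a minimal sketch of loading the Romanian test split of Common Voice 16.1 with 🤗 Datasets is shown below; the dataset is gated on the Hub, so the snippet assumes access has already been requested and a token is configured.

```python
from datasets import load_dataset, Audio

# Gated dataset: accept the terms on the Hub and log in (`huggingface-cli login`) first.
common_voice_test = load_dataset(
    "mozilla-foundation/common_voice_16_1", "ro", split="test"
)

# Whisper expects 16 kHz audio, so resample the audio column accordingly.
common_voice_test = common_voice_test.cast_column("audio", Audio(sampling_rate=16_000))

print(common_voice_test)
```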
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 32
- eval_batch_size: 16
- seed: 42
- distributed_type: multi-GPU
- num_devices: 3
- total_train_batch_size: 96
- total_eval_batch_size: 48
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
- training_steps: 4000
- mixed_precision_training: Native AMP
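As a rough sketch, these settings map onto `Seq2SeqTrainingArguments` from 🤗 Transformers roughly as follows; the output directory, evaluation/save cadence and logging backend are illustrative assumptions, not values taken from the original run.

```python
from transformers import Seq2SeqTrainingArguments

# Per-device train batch size 32 on 3 GPUs yields the effective batch size of 96.
training_args = Seq2SeqTrainingArguments(
    output_dir="./whisper-small-ro",   # assumed path
    per_device_train_batch_size=32,
    per_device_eval_batch_size=16,
    learning_rate=1e-5,
    lr_scheduler_type="linear",
    warmup_steps=500,
    max_steps=4000,
    seed=42,
    fp16=True,                         # "Native AMP" mixed precision
    evaluation_strategy="steps",       # assumption: matches the 500-step eval cadence below
    eval_steps=500,
    save_steps=500,
    predict_with_generate=True,        # needed to compute WER during evaluation
    report_to=["tensorboard"],         # assumed logging backend
)
# Adam betas (0.9, 0.999) and epsilon 1e-08 are the optimizer defaults, so no extra flags are needed.
```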
### Training results
| Training Loss | Epoch | Step | Validation Loss | Wer |
|:-------------:|:-----:|:----:|:---------------:|:-------:|
| 0.1437 | 3.91 | 500 | 0.2167 | 20.5100 |
| 0.0268 | 7.81 | 1000 | 0.2202 | 18.6557 |
| 0.008 | 11.72 | 1500 | 0.2478 | 18.6829 |
| 0.0037 | 15.62 | 2000 | 0.2644 | 18.6708 |
| 0.0024 | 19.53 | 2500 | 0.2761 | 18.6405 |
| 0.0018 | 23.44 | 3000 | 0.2844 | 18.6859 |
| 0.0016 | 27.34 | 3500 | 0.2900 | 18.6799 |
| 0.0014 | 31.25 | 4000 | 0.2920 | 18.6647 |
### Framework versions
- Transformers 4.37.2
- Pytorch 2.2.0
- Datasets 2.17.0
- Tokenizers 0.15.1
| {"language": ["ro"], "license": "apache-2.0", "tags": ["hf-asr-leaderboard", "generated_from_trainer"], "datasets": ["mozilla-foundation/common_voice_16_1"], "metrics": ["wer"], "base_model": "openai/whisper-small", "model-index": [{"name": "Whisper Small Ro - Sarbu Vlad", "results": [{"task": {"type": "automatic-speech-recognition", "name": "Automatic Speech Recognition"}, "dataset": {"name": "Common Voice 16.1", "type": "mozilla-foundation/common_voice_16_1", "args": "config: ro, split: test"}, "metrics": [{"type": "wer", "value": 18.664730616813383, "name": "Wer"}]}]}]} | automatic-speech-recognition | VladS159/whisper_small_ro_VladS_02_14_24_4000_steps | [
"transformers",
"tensorboard",
"safetensors",
"whisper",
"automatic-speech-recognition",
"hf-asr-leaderboard",
"generated_from_trainer",
"ro",
"dataset:mozilla-foundation/common_voice_16_1",
"base_model:openai/whisper-small",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"region:us"
] | 2024-02-14T16:46:03+00:00 | [] | [
"ro"
] | TAGS
#transformers #tensorboard #safetensors #whisper #automatic-speech-recognition #hf-asr-leaderboard #generated_from_trainer #ro #dataset-mozilla-foundation/common_voice_16_1 #base_model-openai/whisper-small #license-apache-2.0 #model-index #endpoints_compatible #region-us
| Whisper Small Ro - Sarbu Vlad
=============================
This model is a fine-tuned version of openai/whisper-small on the Common Voice 16.1 dataset.
It achieves the following results on the evaluation set:
* Loss: 0.2920
* Wer: 18.6647
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 1e-05
* train\_batch\_size: 32
* eval\_batch\_size: 16
* seed: 42
* distributed\_type: multi-GPU
* num\_devices: 3
* total\_train\_batch\_size: 96
* total\_eval\_batch\_size: 48
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* lr\_scheduler\_warmup\_steps: 500
* training\_steps: 4000
* mixed\_precision\_training: Native AMP
### Training results
### Framework versions
* Transformers 4.37.2
* Pytorch 2.2.0
* Datasets 2.17.0
* Tokenizers 0.15.1
| [
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 1e-05\n* train\\_batch\\_size: 32\n* eval\\_batch\\_size: 16\n* seed: 42\n* distributed\\_type: multi-GPU\n* num\\_devices: 3\n* total\\_train\\_batch\\_size: 96\n* total\\_eval\\_batch\\_size: 48\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 500\n* training\\_steps: 4000\n* mixed\\_precision\\_training: Native AMP",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.37.2\n* Pytorch 2.2.0\n* Datasets 2.17.0\n* Tokenizers 0.15.1"
] | [
"TAGS\n#transformers #tensorboard #safetensors #whisper #automatic-speech-recognition #hf-asr-leaderboard #generated_from_trainer #ro #dataset-mozilla-foundation/common_voice_16_1 #base_model-openai/whisper-small #license-apache-2.0 #model-index #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 1e-05\n* train\\_batch\\_size: 32\n* eval\\_batch\\_size: 16\n* seed: 42\n* distributed\\_type: multi-GPU\n* num\\_devices: 3\n* total\\_train\\_batch\\_size: 96\n* total\\_eval\\_batch\\_size: 48\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 500\n* training\\_steps: 4000\n* mixed\\_precision\\_training: Native AMP",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.37.2\n* Pytorch 2.2.0\n* Datasets 2.17.0\n* Tokenizers 0.15.1"
] | [
103,
179,
4,
30
] | [
"passage: TAGS\n#transformers #tensorboard #safetensors #whisper #automatic-speech-recognition #hf-asr-leaderboard #generated_from_trainer #ro #dataset-mozilla-foundation/common_voice_16_1 #base_model-openai/whisper-small #license-apache-2.0 #model-index #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 1e-05\n* train\\_batch\\_size: 32\n* eval\\_batch\\_size: 16\n* seed: 42\n* distributed\\_type: multi-GPU\n* num\\_devices: 3\n* total\\_train\\_batch\\_size: 96\n* total\\_eval\\_batch\\_size: 48\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 500\n* training\\_steps: 4000\n* mixed\\_precision\\_training: Native AMP### Training results### Framework versions\n\n\n* Transformers 4.37.2\n* Pytorch 2.2.0\n* Datasets 2.17.0\n* Tokenizers 0.15.1"
] | [
-0.10245386511087418,
0.11498093605041504,
-0.005243919789791107,
0.06165464222431183,
0.0881020799279213,
0.025434982031583786,
0.12781086564064026,
0.15295271575450897,
-0.05229659751057625,
0.10609976202249527,
0.07488463073968887,
0.03406110405921936,
0.09280393272638321,
0.13744933903217316,
-0.015795374289155006,
-0.28390100598335266,
0.018861917778849602,
-0.029762474820017815,
-0.1359098255634308,
0.10062438249588013,
0.09425438195466995,
-0.09766270965337753,
0.03429122641682625,
0.011866944842040539,
-0.07144320011138916,
-0.00462875934317708,
-0.032001812011003494,
-0.043530769646167755,
0.09263453632593155,
0.04322837293148041,
0.06669842451810837,
0.030096566304564476,
0.09510219842195511,
-0.25292667746543884,
0.0033096608240157366,
0.06876663863658905,
0.031643643975257874,
0.06869889795780182,
0.11333807557821274,
0.008655653335154057,
0.0741690844297409,
-0.06653670221567154,
0.067064568400383,
0.0446680523455143,
-0.10511677712202072,
-0.3265019655227661,
-0.06949858367443085,
0.05238385498523712,
0.14746515452861786,
0.04725751280784607,
-0.03126620128750801,
0.05857442319393158,
-0.04850354418158531,
0.0778072252869606,
0.22728583216667175,
-0.22008490562438965,
-0.07278762012720108,
-0.0029857123736292124,
0.052320148795843124,
0.053454745560884476,
-0.0971815213561058,
-0.022819792851805687,
0.011828655377030373,
0.002554620848968625,
0.08890731632709503,
0.027772819623351097,
0.04292468726634979,
-0.00035985204158350825,
-0.13549278676509857,
-0.05043720826506615,
0.12245146930217743,
0.07403726130723953,
-0.010038324631750584,
-0.1194569543004036,
-0.03570941835641861,
-0.1475614607334137,
-0.05464920401573181,
0.007629237603396177,
0.023525195196270943,
-0.03633377328515053,
-0.051311690360307693,
0.029452743008732796,
-0.05201204866170883,
-0.09374537318944931,
0.0593043752014637,
0.13565205037593842,
0.05790400505065918,
-0.02062656544148922,
0.006569521967321634,
0.10276807844638824,
0.02981390990316868,
-0.17829850316047668,
-0.011897335760295391,
0.023956693708896637,
-0.10319821536540985,
-0.020163273438811302,
-0.015652906149625778,
0.029004834592342377,
0.045705344527959824,
0.1759229153394699,
-0.019050506874918938,
0.0821281373500824,
0.022945335134863853,
0.008166031911969185,
-0.08401339501142502,
0.15013229846954346,
-0.0648682564496994,
-0.06371515244245529,
-0.04864410310983658,
0.13917532563209534,
-0.005941534880548716,
-0.0053860461339354515,
-0.05503762885928154,
0.026435356587171555,
0.09063433110713959,
0.051513802260160446,
0.004789081867784262,
0.026936236768960953,
-0.065631203353405,
-0.022772055119276047,
0.029186991974711418,
-0.12696880102157593,
0.04129858314990997,
0.04992358013987541,
-0.06572216749191284,
-0.047125253826379776,
-0.006753539200872183,
0.022992359474301338,
-0.007892586290836334,
0.07832734286785126,
-0.047833673655986786,
-0.026511389762163162,
-0.08233962953090668,
-0.08099406957626343,
0.02171560749411583,
-0.005531064234673977,
0.0060346778482198715,
-0.04482780024409294,
-0.11779718101024628,
-0.048295244574546814,
0.06174825131893158,
-0.07671304047107697,
-0.06451442092657089,
-0.040133435279130936,
-0.07465262711048126,
0.04474423825740814,
-0.01543914433568716,
0.16161468625068665,
-0.04344065859913826,
0.0874333530664444,
0.04245564341545105,
0.06287376582622528,
0.12378665059804916,
0.05118221789598465,
-0.040558021515607834,
0.07572475075721741,
-0.14598357677459717,
0.07518036663532257,
-0.10941460728645325,
0.05547862499952316,
-0.13482221961021423,
-0.09484566003084183,
-0.007812689989805222,
-0.0005407929420471191,
0.08541258424520493,
0.11632950603961945,
-0.15203183889389038,
-0.08224673569202423,
0.19595074653625488,
-0.09024090319871902,
-0.1211734488606453,
0.13461993634700775,
-0.015181035734713078,
-0.026434967294335365,
0.015381919220089912,
0.16009217500686646,
0.13819117844104767,
-0.08460716903209686,
0.0204878319054842,
-0.03353338688611984,
0.1309356540441513,
0.0654301792383194,
0.11205699294805527,
-0.02987796813249588,
0.030951296910643578,
0.0009683617972768843,
-0.06575071066617966,
0.048726897686719894,
-0.07269416749477386,
-0.08951324969530106,
-0.016739211976528168,
-0.07738237828016281,
0.009820186533033848,
0.057061176747083664,
0.014761801809072495,
-0.08457231521606445,
-0.13547109067440033,
0.011753293685615063,
0.10536852478981018,
-0.08638159185647964,
0.0006090717506594956,
-0.07784658670425415,
0.048896368592977524,
-0.002397847594693303,
-0.00230123708024621,
-0.13041572272777557,
-0.04403551667928696,
0.04446465149521828,
-0.09981487691402435,
-0.0075135426595807076,
-0.019528424367308617,
0.08795404434204102,
0.07028282433748245,
-0.05030173063278198,
-0.06272771954536438,
-0.03494078665971756,
0.0045209904201328754,
-0.06075859069824219,
-0.22758260369300842,
-0.07811179012060165,
-0.024592285975813866,
0.14990463852882385,
-0.2137652188539505,
0.02597910165786743,
0.0398905985057354,
0.1488090306520462,
0.02596304938197136,
-0.045161884278059006,
-0.002942371182143688,
0.029467971995472908,
-0.014516395516693592,
-0.08254111558198929,
0.013031413778662682,
-0.010038265958428383,
-0.11104703694581985,
0.0036564550828188658,
-0.15809857845306396,
0.10958188027143478,
0.08034761250019073,
0.026644010096788406,
-0.08243745565414429,
-0.02938130497932434,
-0.059736382216215134,
-0.0625181719660759,
-0.023109156638383865,
-0.035609208047389984,
0.1371319591999054,
0.012998837046325207,
0.10305114090442657,
-0.08428410440683365,
-0.07224825769662857,
0.021011972799897194,
-0.0030480974819511175,
-0.014628911390900612,
0.14075453579425812,
0.02313975617289543,
-0.0695377066731453,
0.11118346452713013,
0.082829050719738,
-0.05535763129591942,
0.13867560029029846,
-0.09143075346946716,
-0.0840199813246727,
-0.04305240511894226,
0.0426645427942276,
0.04147711023688316,
0.10386720299720764,
-0.11885210871696472,
-0.0019135246984660625,
0.027209170162677765,
0.00921170599758625,
0.014412005431950092,
-0.17716385424137115,
-0.007989214733242989,
0.055930666625499725,
-0.07817214727401733,
0.012918265536427498,
-0.02369392104446888,
-0.010446407832205296,
0.08706248551607132,
0.004822572227567434,
-0.05553944408893585,
-0.024108365178108215,
-0.04414563253521919,
-0.08661988377571106,
0.17377640306949615,
-0.07362597435712814,
-0.115269735455513,
-0.13334359228610992,
-0.0011779186315834522,
0.004834623541682959,
-0.009910227730870247,
0.02221921645104885,
-0.09795660525560379,
-0.05050310119986534,
-0.0842989832162857,
0.007793615572154522,
-0.022435542196035385,
0.017908398061990738,
0.051229801028966904,
0.016502952203154564,
0.0937524288892746,
-0.08804996311664581,
0.02249724604189396,
0.004161949269473553,
-0.03900128975510597,
0.002802435774356127,
0.011447899043560028,
0.07302907854318619,
0.13830024003982544,
0.03876905515789986,
0.030036473646759987,
-0.01945827528834343,
0.18294839560985565,
-0.11027931421995163,
0.012253482826054096,
0.1267687976360321,
-0.009219907224178314,
0.05200648680329323,
0.16214342415332794,
0.04785145819187164,
-0.07315549999475479,
0.016022639349102974,
0.03235108405351639,
-0.016068238765001297,
-0.217827707529068,
-0.0042914291843771935,
-0.0472712479531765,
0.00428239069879055,
0.1201794445514679,
0.0321912057697773,
-0.01055559515953064,
0.035024963319301605,
-0.04428819939494133,
-0.006474839989095926,
0.05835512652993202,
0.061738356947898865,
0.04672762379050255,
0.04371074587106705,
0.11653522402048111,
-0.009488570503890514,
-0.04094049334526062,
0.024643706157803535,
0.0030991581734269857,
0.20847725868225098,
-0.005803888663649559,
0.19505372643470764,
0.03275715187191963,
0.13210242986679077,
-0.0018507703207433224,
0.054513804614543915,
0.00047788117080926895,
-0.005000158213078976,
0.02207767404615879,
-0.05839346721768379,
-0.003504085121676326,
0.05303533002734184,
0.044751349836587906,
0.029717419296503067,
-0.07806725800037384,
0.007493536453694105,
0.032139115035533905,
0.2900448441505432,
0.06867536902427673,
-0.2737055718898773,
-0.07563036680221558,
0.04065098613500595,
-0.07257089763879776,
-0.03475575894117355,
0.016945883631706238,
0.1219744086265564,
-0.07297375053167343,
0.07726588100194931,
-0.07031098753213882,
0.08084074407815933,
-0.047769542783498764,
0.006099276710301638,
0.10986151546239853,
0.08030767738819122,
0.005195127334445715,
0.07061012089252472,
-0.2208395153284073,
0.2717728614807129,
-0.014054175466299057,
0.050080761313438416,
-0.045272838324308395,
0.05910729616880417,
0.016396790742874146,
-0.03557508811354637,
0.07786539942026138,
-0.012588088400661945,
-0.1435065120458603,
-0.18777646124362946,
-0.09030096232891083,
0.012101452797651291,
0.1275489777326584,
-0.09740307182073593,
0.11992351710796356,
-0.04417010024189949,
-0.03483090177178383,
0.05390741676092148,
-0.06369003653526306,
-0.08986274152994156,
-0.09679728001356125,
0.042813677340745926,
-0.024093322455883026,
0.04595445841550827,
-0.10854669660329819,
-0.0976056158542633,
-0.08781678974628448,
0.14486005902290344,
-0.12463755905628204,
-0.04663658142089844,
-0.13470618426799774,
0.07471922785043716,
0.17767848074436188,
-0.06662576645612717,
0.0399596281349659,
0.016755148768424988,
0.1368332803249359,
0.03058585897088051,
-0.01775529608130455,
0.11795099079608917,
-0.08491121232509613,
-0.2391156107187271,
-0.052374035120010376,
0.1634918749332428,
0.027069134637713432,
0.05885440856218338,
-0.025037923827767372,
0.04223352670669556,
-0.0119960131123662,
-0.07317843288183212,
0.06673608720302582,
0.017204364761710167,
0.0007391286198981106,
0.04029068723320961,
-0.02616678737103939,
-0.009151501581072807,
-0.05926036089658737,
-0.04835234582424164,
0.06980346143245697,
0.29000160098075867,
-0.08358082175254822,
0.0351296104490757,
0.03365528956055641,
-0.05349850654602051,
-0.14143329858779907,
-0.02924305573105812,
0.11901668459177017,
0.028732962906360626,
0.03244759514927864,
-0.1861838549375534,
0.03933316469192505,
0.07266934961080551,
-0.039766594767570496,
0.05351269245147705,
-0.3049468994140625,
-0.1441345512866974,
0.09241873025894165,
0.08695628494024277,
-0.05133358761668205,
-0.18052257597446442,
-0.07216379791498184,
-0.012123825028538704,
-0.0819069892168045,
0.03730051591992378,
-0.018575143069028854,
0.09611813724040985,
-0.005488600116223097,
0.008707614615559578,
0.026615766808390617,
-0.05558324605226517,
0.1714596003293991,
-0.009568282403051853,
0.06052921712398529,
-0.029195334762334824,
0.03717890381813049,
0.027483046054840088,
-0.06746380776166916,
0.01815883442759514,
-0.09453240036964417,
0.027960803359746933,
-0.1228547990322113,
-0.0338631309568882,
-0.07085735350847244,
-0.0017210035584867,
-0.053748954087495804,
-0.027450358495116234,
-0.013027462176978588,
0.060436613857746124,
0.0771174281835556,
0.00002022513945121318,
0.09753989428281784,
-0.031348519027233124,
0.1537759006023407,
0.14052654802799225,
0.10638624429702759,
0.05486183613538742,
-0.09043335169553757,
-0.0016882071504369378,
0.006474500987678766,
0.0422738678753376,
-0.12542693316936493,
0.055689357221126556,
0.12908698618412018,
0.03507937863469124,
0.12761682271957397,
0.04512288421392441,
-0.09758450090885162,
-0.01643962413072586,
0.0648629292845726,
-0.08098506182432175,
-0.1603166162967682,
-0.01841019280254841,
0.040141403675079346,
-0.16412805020809174,
-0.013861139304935932,
0.09944160282611847,
-0.04140953719615936,
-0.0070192557759583,
0.012398256920278072,
0.06366570293903351,
-0.018196893855929375,
0.22588828206062317,
0.03433863818645477,
0.10509683936834335,
-0.08146508038043976,
0.06586692482233047,
0.0606803372502327,
-0.07615137100219727,
0.02990831807255745,
0.09485946595668793,
-0.046402882784605026,
-0.016186006367206573,
0.03936580941081047,
0.0719989761710167,
0.053894221782684326,
-0.04554159566760063,
-0.12809935212135315,
-0.14856784045696259,
0.06934933364391327,
0.07379405200481415,
0.029725827276706696,
0.026271147653460503,
0.007948276586830616,
0.03960825875401497,
-0.07562976330518723,
0.1515664905309677,
0.09786277264356613,
0.07524978369474411,
-0.12427598983049393,
0.09248378127813339,
0.0038404888473451138,
-0.016228294000029564,
-0.0037145367823541164,
0.030971575528383255,
-0.11750154197216034,
0.009338708594441414,
-0.11595720052719116,
-0.008774494752287865,
-0.04129168018698692,
-0.001735095283947885,
0.005706694908440113,
-0.06845549494028091,
-0.037221670150756836,
0.017887897789478302,
-0.10060219466686249,
-0.045670926570892334,
-0.03580104932188988,
0.06612128764390945,
-0.10932332277297974,
-0.02342994697391987,
0.05923452973365784,
-0.125524640083313,
0.08930221945047379,
0.016318686306476593,
0.036541152745485306,
0.005074055399745703,
-0.08865267783403397,
0.020148715004324913,
0.002240869915112853,
0.01560112927109003,
0.013943399302661419,
-0.1860990971326828,
-0.005959087982773781,
-0.039882905781269073,
0.0023138930555433035,
-0.007384737953543663,
0.013984618708491325,
-0.1227656826376915,
0.019015561789274216,
-0.02928510122001171,
-0.059241194278001785,
-0.03877457603812218,
0.044067688286304474,
0.06325288116931915,
0.0033652421552687883,
0.1420639306306839,
-0.0793672502040863,
0.0635901391506195,
-0.25633829832077026,
0.001262434758245945,
0.0019470701226964593,
-0.07149077206850052,
-0.06203007698059082,
-0.01710369810461998,
0.08640244603157043,
-0.056344639509916306,
0.1179223582148552,
-0.019894525408744812,
0.03646194562315941,
0.03377455472946167,
-0.0914502665400505,
0.0925687849521637,
0.06448002904653549,
0.17796513438224792,
0.03870585188269615,
-0.022487450391054153,
0.06299501657485962,
-0.027633264660835266,
0.04178078845143318,
0.0481480173766613,
0.12402933835983276,
0.16254842281341553,
0.015977175906300545,
0.041778117418289185,
0.09237448871135712,
-0.1346670538187027,
-0.1246013268828392,
0.1176910400390625,
-0.06645487248897552,
0.11664626002311707,
-0.03332307189702988,
0.17114141583442688,
0.11712797731161118,
-0.20580308139324188,
0.053000617772340775,
-0.04542030766606331,
-0.07810845971107483,
-0.10691497474908829,
-0.09765737503767014,
-0.08605191856622696,
-0.15868785977363586,
0.00755138136446476,
-0.1009409949183464,
0.03556513786315918,
0.0791788399219513,
0.03359464183449745,
0.03575407713651657,
0.12178922444581985,
0.07356227189302444,
0.024444200098514557,
0.07644832879304886,
0.021527709439396858,
-0.01449313573539257,
-0.01951451599597931,
-0.10100608319044113,
0.04569118842482567,
-0.004507715813815594,
0.04282442107796669,
-0.044405728578567505,
-0.08805276453495026,
0.050963226705789566,
0.024352023378014565,
-0.10630439966917038,
0.01891428418457508,
-0.032940905541181564,
0.03869566321372986,
0.04724723473191261,
0.026397081092000008,
0.009924247860908508,
-0.026114895939826965,
0.1992442011833191,
-0.08745569735765457,
-0.06105823814868927,
-0.12026906758546829,
0.20446202158927917,
-0.026681644842028618,
-0.00177281117066741,
0.027416091412305832,
-0.06754519790410995,
-0.003922198433429003,
0.14058367908000946,
0.14418555796146393,
-0.02402542158961296,
-0.016891898587346077,
0.016500696539878845,
-0.01113336905837059,
-0.01978844404220581,
0.07286936789751053,
0.10716217756271362,
0.026076985523104668,
-0.04747708514332771,
-0.012181705795228481,
-0.007475045509636402,
-0.06592658162117004,
-0.03874482586979866,
0.07957276701927185,
0.03421792387962341,
0.015199982561171055,
-0.025299398228526115,
0.11053460836410522,
-0.028887245804071426,
-0.13157862424850464,
0.043126266449689865,
-0.1734083890914917,
-0.18755699694156647,
-0.03800256550312042,
0.083675317466259,
0.02654280699789524,
0.044692281633615494,
0.00821499153971672,
-0.02892477437853813,
0.08314058929681778,
0.005577548407018185,
-0.03592976927757263,
-0.09281488507986069,
0.07569602131843567,
-0.12676487863063812,
0.20656326413154602,
-0.03689346835017204,
0.03136823698878288,
0.11614294350147247,
0.002723759738728404,
-0.0975448340177536,
0.021118178963661194,
0.09471746534109116,
-0.11682691425085068,
0.03600194677710533,
0.18845300376415253,
-0.03156644105911255,
0.13392925262451172,
0.05200110003352165,
-0.10576748102903366,
-0.0012136301957070827,
-0.06117110699415207,
-0.05859903246164322,
-0.07262320071458817,
-0.0017199040157720447,
-0.039153944700956345,
0.1477936953306198,
0.224056214094162,
-0.083420030772686,
-0.020038004964590073,
-0.033735547214746475,
0.01861344277858734,
0.02814347855746746,
0.14220397174358368,
-0.021325621753931046,
-0.2701627016067505,
0.017111562192440033,
0.011511641554534435,
0.03230268880724907,
-0.19954752922058105,
-0.08807244896888733,
0.02594721131026745,
-0.04766760393977165,
-0.06650509685277939,
0.11611700803041458,
0.0973329097032547,
0.053629081696271896,
-0.053495634347200394,
-0.12779107689857483,
-0.03444803133606911,
0.1691582202911377,
-0.17010532319545746,
-0.04839826002717018
] |
null | null | transformers |
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
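Since the repository id is not filled in above, the following is only a hedged sketch with a placeholder checkpoint name; replace `"your-username/your-wav2vec2-model"` with the actual model id once it is known.

```python
from transformers import pipeline

# Placeholder model id — substitute the real repository name.
asr = pipeline(
    "automatic-speech-recognition",
    model="your-username/your-wav2vec2-model",
)

# Accepts a path/URL to an audio file or a 16 kHz numpy array.
result = asr("sample.wav")
print(result["text"])
```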
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] | {"library_name": "transformers", "tags": []} | automatic-speech-recognition | spsither/wav2vec2_run9.595 | [
"transformers",
"safetensors",
"wav2vec2",
"automatic-speech-recognition",
"arxiv:1910.09700",
"endpoints_compatible",
"region:us"
] | 2024-02-14T16:47:16+00:00 | [
"1910.09700"
] | [] | TAGS
#transformers #safetensors #wav2vec2 #automatic-speech-recognition #arxiv-1910.09700 #endpoints_compatible #region-us
|
# Model Card for Model ID
## Model Details
### Model Description
This is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.
- Developed by:
- Funded by [optional]:
- Shared by [optional]:
- Model type:
- Language(s) (NLP):
- License:
- Finetuned from model [optional]:
### Model Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Downstream Use [optional]
### Out-of-Scope Use
## Bias, Risks, and Limitations
### Recommendations
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
## Training Details
### Training Data
### Training Procedure
#### Preprocessing [optional]
#### Training Hyperparameters
- Training regime:
#### Speeds, Sizes, Times [optional]
## Evaluation
### Testing Data, Factors & Metrics
#### Testing Data
#### Factors
#### Metrics
### Results
#### Summary
## Model Examination [optional]
## Environmental Impact
Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).
- Hardware Type:
- Hours used:
- Cloud Provider:
- Compute Region:
- Carbon Emitted:
## Technical Specifications [optional]
### Model Architecture and Objective
### Compute Infrastructure
#### Hardware
#### Software
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Model Card Authors [optional]
## Model Card Contact
| [
"# Model Card for Model ID",
"## Model Details",
"### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:",
"### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Downstream Use [optional]",
"### Out-of-Scope Use",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.",
"## How to Get Started with the Model\n\nUse the code below to get started with the model.",
"## Training Details",
"### Training Data",
"### Training Procedure",
"#### Preprocessing [optional]",
"#### Training Hyperparameters\n\n- Training regime:",
"#### Speeds, Sizes, Times [optional]",
"## Evaluation",
"### Testing Data, Factors & Metrics",
"#### Testing Data",
"#### Factors",
"#### Metrics",
"### Results",
"#### Summary",
"## Model Examination [optional]",
"## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:",
"## Technical Specifications [optional]",
"### Model Architecture and Objective",
"### Compute Infrastructure",
"#### Hardware",
"#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Model Card Authors [optional]",
"## Model Card Contact"
] | [
"TAGS\n#transformers #safetensors #wav2vec2 #automatic-speech-recognition #arxiv-1910.09700 #endpoints_compatible #region-us \n",
"# Model Card for Model ID",
"## Model Details",
"### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:",
"### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Downstream Use [optional]",
"### Out-of-Scope Use",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.",
"## How to Get Started with the Model\n\nUse the code below to get started with the model.",
"## Training Details",
"### Training Data",
"### Training Procedure",
"#### Preprocessing [optional]",
"#### Training Hyperparameters\n\n- Training regime:",
"#### Speeds, Sizes, Times [optional]",
"## Evaluation",
"### Testing Data, Factors & Metrics",
"#### Testing Data",
"#### Factors",
"#### Metrics",
"### Results",
"#### Summary",
"## Model Examination [optional]",
"## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:",
"## Technical Specifications [optional]",
"### Model Architecture and Objective",
"### Compute Infrastructure",
"#### Hardware",
"#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Model Card Authors [optional]",
"## Model Card Contact"
] | [
47,
6,
3,
82,
28,
3,
4,
9,
9,
10,
42,
20,
3,
4,
5,
9,
11,
13,
3,
12,
5,
4,
5,
3,
4,
9,
53,
9,
8,
6,
3,
14,
8,
7,
9,
4
] | [
"passage: TAGS\n#transformers #safetensors #wav2vec2 #automatic-speech-recognition #arxiv-1910.09700 #endpoints_compatible #region-us \n# Model Card for Model ID## Model Details### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Downstream Use [optional]### Out-of-Scope Use## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.## How to Get Started with the Model\n\nUse the code below to get started with the model.## Training Details### Training Data### Training Procedure#### Preprocessing [optional]#### Training Hyperparameters\n\n- Training regime:#### Speeds, Sizes, Times [optional]## Evaluation### Testing Data, Factors & Metrics#### Testing Data#### Factors#### Metrics### Results#### Summary## Model Examination [optional]## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:## Technical Specifications [optional]### Model Architecture and Objective### Compute Infrastructure#### Hardware#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Model Card Authors [optional]## Model Card Contact"
] | [
-0.06877388060092926,
0.1546701192855835,
-0.0037609888240695,
0.013798683881759644,
0.11170210689306259,
0.0049477447755634785,
0.07622946053743362,
0.1076156347990036,
-0.024175573140382767,
0.12644733488559723,
0.04164152219891548,
0.09870775043964386,
0.11074616760015488,
0.18980292975902557,
0.0015578214079141617,
-0.20271944999694824,
0.06667982041835785,
-0.11557482928037643,
0.02210802026093006,
0.12125445902347565,
0.14131462574005127,
-0.10717527568340302,
0.06805222481489182,
-0.03453851491212845,
-0.022604284808039665,
-0.03256304934620857,
-0.06200181692838669,
-0.0628168061375618,
0.06936536729335785,
0.060818396508693695,
0.06474827229976654,
0.023958178237080574,
0.07868874818086624,
-0.2985154092311859,
0.020363550633192062,
0.07747753709554672,
0.005190075840801001,
0.0596587099134922,
0.07716850191354752,
-0.06847380846738815,
0.11357854306697845,
-0.0553223080933094,
0.15529125928878784,
0.07729580253362656,
-0.09200245141983032,
-0.18732582032680511,
-0.08171983063220978,
0.09086527675390244,
0.16344711184501648,
0.05807739868760109,
-0.035454582422971725,
0.14257195591926575,
-0.08119463175535202,
0.015228749252855778,
0.06432900577783585,
-0.07448869198560715,
-0.04995284602046013,
0.044303327798843384,
0.07393822818994522,
0.09027253836393356,
-0.12936420738697052,
-0.005840824451297522,
0.04285894334316254,
0.01751609519124031,
0.1045890524983406,
0.0271924901753664,
0.10937820374965668,
0.030452799052000046,
-0.13982591032981873,
-0.06308452039957047,
0.12294159829616547,
0.03608649969100952,
-0.05978325754404068,
-0.24299637973308563,
-0.007494248915463686,
-0.030862024053931236,
-0.022421855479478836,
-0.0449565127491951,
0.040200937539339066,
-0.03043903410434723,
0.0803007185459137,
0.005218773614615202,
-0.07346875220537186,
-0.0566013865172863,
0.08528164029121399,
0.0660456046462059,
0.024965541437268257,
-0.02511134371161461,
0.022877119481563568,
0.11602471768856049,
0.09200266003608704,
-0.11191211640834808,
-0.07020656764507294,
-0.06118712201714516,
-0.09110330045223236,
-0.04440220445394516,
0.03338851034641266,
0.07138838618993759,
0.04954010248184204,
0.19076436758041382,
0.006971653085201979,
0.05134076997637749,
0.026316070929169655,
0.018496420234441757,
0.061533693224191666,
0.06859898567199707,
-0.05315755307674408,
-0.12085959315299988,
-0.043275654315948486,
0.1195915937423706,
0.008576745167374611,
-0.03422791138291359,
-0.034871865063905716,
0.05920550227165222,
0.05124519392848015,
0.11922229826450348,
0.06299308687448502,
0.015805674716830254,
-0.06944610923528671,
-0.041848812252283096,
0.17807698249816895,
-0.15696440637111664,
0.01886504516005516,
0.019594965502619743,
-0.05179493874311447,
-0.028022583574056625,
0.01927095092833042,
0.011918062344193459,
-0.028684133663773537,
0.09848573058843613,
-0.06384129822254181,
-0.037289999425411224,
-0.10494036227464676,
-0.051826175302267075,
0.03436095267534256,
-0.01885044015944004,
-0.030469300225377083,
-0.04276524484157562,
-0.11668366193771362,
-0.07342278957366943,
0.06446365267038345,
-0.06070359796285629,
-0.06312011927366257,
-0.04004829749464989,
-0.05974921956658363,
0.01184001937508583,
-0.0018999426392838359,
0.12804386019706726,
-0.03126852586865425,
0.04724927991628647,
-0.05154479295015335,
0.07010733336210251,
0.13001501560211182,
0.0328618623316288,
-0.06312436610460281,
0.06317896395921707,
-0.20583610236644745,
0.10645388811826706,
-0.0948607325553894,
0.026716187596321106,
-0.16420963406562805,
-0.024270139634609222,
0.02872021123766899,
0.03977278992533684,
-0.014035328291356564,
0.13902691006660461,
-0.1889396458864212,
-0.037479519844055176,
0.1823769360780716,
-0.1340419203042984,
-0.09025664627552032,
0.06442771852016449,
-0.056058306246995926,
0.1311984360218048,
0.051679398864507675,
-0.016549112275242805,
0.050827931612730026,
-0.14181455969810486,
-0.021199021488428116,
-0.05750836804509163,
-0.01345672644674778,
0.14918801188468933,
0.06591099500656128,
-0.060217004269361496,
0.03262941166758537,
0.02008114755153656,
-0.02076314203441143,
-0.052245598286390305,
-0.03416990861296654,
-0.09862805157899857,
0.003799794940277934,
-0.08055862784385681,
0.018423959612846375,
-0.026528598740696907,
-0.08738208562135696,
-0.0410190187394619,
-0.1575777381658554,
-0.001173238386400044,
0.1026405617594719,
0.0026203012093901634,
-0.02646641992032528,
-0.10305316001176834,
0.001408840762451291,
0.015838710591197014,
-0.010245922021567822,
-0.14677146077156067,
-0.04217318072915077,
0.026863576844334602,
-0.16719304025173187,
0.031281016767024994,
-0.045817263424396515,
0.03617605194449425,
0.042714666575193405,
-0.04341552406549454,
-0.026187991723418236,
0.011214246973395348,
0.01926763355731964,
-0.01759723760187626,
-0.24584431946277618,
-0.01623428985476494,
-0.05088721215724945,
0.17665798962116241,
-0.2476477026939392,
0.04387471452355385,
0.07402390241622925,
0.1185368224978447,
0.006659833248704672,
-0.0473252609372139,
0.03859061002731323,
-0.04956425726413727,
-0.039547327905893326,
-0.06162410229444504,
-0.002731422893702984,
-0.034249331802129745,
-0.04925791174173355,
0.04766050726175308,
-0.19274261593818665,
-0.0254798773676157,
0.1145588755607605,
0.07196282595396042,
-0.16417020559310913,
-0.0721944123506546,
-0.03388380631804466,
-0.060263555496931076,
-0.0855790227651596,
-0.05511211231350899,
0.10627889633178711,
0.042532145977020264,
0.053568705916404724,
-0.07193132489919662,
-0.0538090355694294,
0.014475145377218723,
-0.008023109287023544,
-0.03674730286002159,
0.08616615831851959,
0.07892905920743942,
-0.111492820084095,
0.0967666357755661,
0.06781410425901413,
0.06170906499028206,
0.10836543887853622,
0.0035758649464696646,
-0.09838994592428207,
-0.013410377316176891,
0.028753211721777916,
0.013008177280426025,
0.1445195972919464,
-0.08268706500530243,
0.02993486076593399,
0.04475158452987671,
-0.029572229832410812,
0.014260980300605297,
-0.10948343575000763,
0.020612964406609535,
0.03188888356089592,
-0.01410164125263691,
0.016051514074206352,
-0.05129382014274597,
0.013738108798861504,
0.10363461822271347,
0.031123731285333633,
0.025897923856973648,
0.016665659844875336,
-0.04273077845573425,
-0.12888197600841522,
0.17441782355308533,
-0.09573886543512344,
-0.24906472861766815,
-0.13649064302444458,
0.0033230632543563843,
0.04450872540473938,
-0.01420661062002182,
0.019941311329603195,
-0.06085766479372978,
-0.10865217447280884,
-0.10793688893318176,
0.02346382476389408,
0.04952440410852432,
-0.08567548543214798,
-0.05095811188220978,
0.05441328510642052,
0.03898037597537041,
-0.12600500881671906,
0.024548007175326347,
0.04095667228102684,
-0.07147589325904846,
0.005656755063682795,
0.061115942895412445,
0.08382482826709747,
0.1812773495912552,
0.012779363431036472,
-0.015533777885138988,
0.01035984791815281,
0.21022020280361176,
-0.14754468202590942,
0.08923394232988358,
0.142924964427948,
-0.06379926204681396,
0.07994367927312851,
0.20067699253559113,
0.030222468078136444,
-0.0959763154387474,
0.0354040265083313,
0.03157598897814751,
-0.03929230570793152,
-0.24485765397548676,
-0.07799134403467178,
0.004727535881102085,
-0.06941798329353333,
0.0999692752957344,
0.08970286697149277,
0.11357339471578598,
0.04878859966993332,
-0.10688808560371399,
-0.07536104321479797,
0.04997042194008827,
0.11770502477884293,
-0.025654911994934082,
0.0004288276832085103,
0.09490229189395905,
-0.032173965126276016,
0.024045821279287338,
0.09091470390558243,
0.01785297878086567,
0.1891387403011322,
0.045389045029878616,
0.13416282832622528,
0.08966030925512314,
0.05892613157629967,
0.02283613197505474,
0.020396918058395386,
0.022836502641439438,
0.028627371415495872,
-0.02071341499686241,
-0.08800762891769409,
-0.01406664215028286,
0.1445012241601944,
0.03501417487859726,
0.03224355727434158,
0.005818283185362816,
-0.03822546452283859,
0.07026989012956619,
0.16923215985298157,
0.01291902456432581,
-0.22557523846626282,
-0.06553208827972412,
0.07285686582326889,
-0.07819344103336334,
-0.10939628630876541,
-0.00628721434623003,
0.039236925542354584,
-0.1781243532896042,
0.0453440323472023,
-0.016895415261387825,
0.09935811161994934,
-0.11019659787416458,
-0.022818224504590034,
0.03339223191142082,
0.06351818144321442,
-0.033710017800331116,
0.07605454325675964,
-0.20844414830207825,
0.14833855628967285,
0.007355031557381153,
0.06984888762235641,
-0.10627210140228271,
0.07959222793579102,
0.018262188881635666,
0.0005360859213396907,
0.16532482206821442,
-0.0075689139775931835,
-0.07650822401046753,
-0.08155251294374466,
-0.07923656702041626,
-0.010918287560343742,
0.10160883516073227,
-0.10205793380737305,
0.08789419382810593,
-0.006757213734090328,
-0.030893130227923393,
-0.00026032759342342615,
-0.11519953608512878,
-0.1342930644750595,
-0.18055365979671478,
0.04992220178246498,
-0.10558607429265976,
0.04552379995584488,
-0.11181014776229858,
-0.062069665640592575,
-0.04111560434103012,
0.18840233981609344,
-0.20550832152366638,
-0.07671810686588287,
-0.14316488802433014,
-0.08166468888521194,
0.11773297190666199,
-0.036535169929265976,
0.08007847517728806,
0.008441719226539135,
0.20702308416366577,
-0.00666013965383172,
0.002528243465349078,
0.08686443418264389,
-0.09668374806642532,
-0.2072489857673645,
-0.09340810775756836,
0.14340825378894806,
0.12398830056190491,
0.045563604682683945,
-0.0001787850633263588,
0.021285003051161766,
-0.004406071733683348,
-0.11160994321107864,
0.036765191704034805,
0.1599014699459076,
0.08414851129055023,
0.041826896369457245,
-0.023910723626613617,
-0.15188267827033997,
-0.1039518192410469,
-0.06143968924880028,
0.022748636081814766,
0.18740743398666382,
-0.06844107806682587,
0.17012163996696472,
0.157639279961586,
-0.061386726796627045,
-0.20854754745960236,
0.031976643949747086,
0.03363525867462158,
-0.008795025758445263,
0.0332365483045578,
-0.20113597810268402,
0.06802120804786682,
0.01531505398452282,
-0.057996444404125214,
0.1332528293132782,
-0.16826434433460236,
-0.15160627663135529,
0.08843177556991577,
0.07692008465528488,
-0.20126505196094513,
-0.12921905517578125,
-0.09711465984582901,
-0.05218008533120155,
-0.10807206481695175,
0.08772927522659302,
-0.006655422504991293,
0.007214459590613842,
0.037578340619802475,
0.02635364979505539,
0.015357093885540962,
-0.05328182876110077,
0.19721722602844238,
0.0011987579055130482,
0.044046565890312195,
-0.07511261850595474,
-0.077226422727108,
0.034381043165922165,
-0.06312628090381622,
0.07982822507619858,
-0.020660031586885452,
0.0017429457511752844,
-0.11481664329767227,
-0.06663372367620468,
-0.05009456351399422,
0.029989875853061676,
-0.08466581255197525,
-0.09467059373855591,
-0.051657307893037796,
0.09798348695039749,
0.09048279374837875,
-0.03396918624639511,
-0.06807554513216019,
-0.10042613744735718,
0.06601390987634659,
0.22872091829776764,
0.18910692632198334,
0.06991440057754517,
-0.06895517557859421,
-0.0038870053831487894,
-0.026509825140237808,
0.05879383906722069,
-0.20851773023605347,
0.044600993394851685,
0.036500073969364166,
0.032537586987018585,
0.13215065002441406,
-0.02442602440714836,
-0.16357013583183289,
-0.043075863271951675,
0.056227099150419235,
-0.06633396446704865,
-0.16863006353378296,
0.005107434932142496,
0.09075167030096054,
-0.15091724693775177,
-0.04752274975180626,
0.030901111662387848,
-0.03220430761575699,
-0.02397167682647705,
0.00030637482996098697,
0.08078145235776901,
0.020850084722042084,
0.1107739508152008,
0.06640642136335373,
0.11335843801498413,
-0.10278842598199844,
0.08162284642457962,
0.08386309444904327,
-0.11347422748804092,
0.04244251549243927,
0.05978094041347504,
-0.06325716525316238,
-0.03386267274618149,
0.016484335064888,
0.0787876546382904,
0.03214597329497337,
-0.08122093230485916,
0.0026990212500095367,
-0.11556044965982437,
0.06788678467273712,
0.14209748804569244,
0.03322440758347511,
0.007564007304608822,
0.04558844491839409,
0.031089849770069122,
-0.09967122226953506,
0.10952559113502502,
0.0327114500105381,
0.03264835476875305,
-0.052766215056180954,
0.007493352517485619,
0.044093240052461624,
-0.012370331212878227,
-0.01659340038895607,
-0.04159332811832428,
-0.062125492841005325,
-0.004501889459788799,
-0.15752804279327393,
0.029296958819031715,
-0.06990371644496918,
0.009181820787489414,
0.0195058211684227,
-0.03118128329515457,
0.001035416848026216,
0.014971627853810787,
-0.0777391716837883,
-0.03601877763867378,
-0.00462498189881444,
0.10573451966047287,
-0.15904870629310608,
0.012398114427924156,
0.0838126391172409,
-0.12594857811927795,
0.0813586562871933,
-0.0006106876535341144,
-0.01206875778734684,
0.022131776437163353,
-0.14767099916934967,
0.06096983700990677,
-0.00651735020801425,
0.005330943502485752,
0.022080490365624428,
-0.20231451094150543,
0.0010611782781779766,
-0.046166326850652695,
-0.0580565482378006,
-0.006821162533015013,
-0.034208331257104874,
-0.10881488770246506,
0.10119375586509705,
0.01840946450829506,
-0.0807829275727272,
-0.019118202850222588,
0.049314580857753754,
0.10984907299280167,
-0.05423201248049736,
0.13843025267124176,
-0.022093484178185463,
0.05561875179409981,
-0.17508383095264435,
-0.015010466799139977,
-0.01884511485695839,
0.01675039529800415,
-0.032699406147003174,
-0.0063448576256632805,
0.053761400282382965,
-0.021795762702822685,
0.23006084561347961,
-0.03329315781593323,
0.022746775299310684,
0.0662616565823555,
-0.007395898457616568,
-0.02466614730656147,
0.09141410142183304,
0.05831921473145485,
0.019823938608169556,
0.023462723940610886,
0.009678727947175503,
-0.051977336406707764,
-0.011846045032143593,
-0.1287335902452469,
0.08032830059528351,
0.17006289958953857,
0.0832807645201683,
-0.0011417492059990764,
0.05661620944738388,
-0.11824764311313629,
-0.08884397894144058,
0.10315068811178207,
-0.03696487843990326,
-0.008325101807713509,
-0.05479050800204277,
0.14003127813339233,
0.16284166276454926,
-0.1792466789484024,
0.06529472023248672,
-0.06703231483697891,
-0.054111137986183167,
-0.1079135313630104,
-0.1702733039855957,
-0.06385406106710434,
-0.04134172946214676,
-0.003200325183570385,
-0.056672241538763046,
0.07026970386505127,
0.10425727069377899,
0.015394158661365509,
0.007145122159272432,
0.08924684673547745,
-0.034410521388053894,
0.003967431839555502,
0.04615078866481781,
0.05031316727399826,
0.015370454639196396,
-0.06289559602737427,
0.003805057378485799,
0.012086667120456696,
0.03619912639260292,
0.05767577514052391,
0.03358588367700577,
-0.015441972762346268,
0.00826429296284914,
-0.019517268985509872,
-0.0962890237569809,
0.0407244898378849,
-0.028659315779805183,
-0.04762914776802063,
0.14599058032035828,
0.023316938430070877,
-0.005744231399148703,
-0.019850272685289383,
0.22833019495010376,
-0.06841307878494263,
-0.08293036371469498,
-0.13890130817890167,
0.1406106948852539,
-0.04129096865653992,
0.054532211273908615,
0.048289187252521515,
-0.10287833213806152,
0.031274814158678055,
0.14709845185279846,
0.14302049577236176,
-0.028337303549051285,
0.01196619775146246,
0.009999874047935009,
0.005250520538538694,
-0.026724260300397873,
0.052909236401319504,
0.049603480845689774,
0.12155342847108841,
-0.06124946475028992,
0.09144628793001175,
-0.0038096080534160137,
-0.08695073425769806,
-0.01940424181520939,
0.13583695888519287,
-0.001434069243259728,
0.020704632624983788,
-0.08129720389842987,
0.11675985902547836,
-0.06527755409479141,
-0.2561015188694,
0.060353249311447144,
-0.06762448698282242,
-0.14944049715995789,
-0.018578823655843735,
0.027211744338274002,
0.0003355915832798928,
0.021279368549585342,
0.06146527826786041,
-0.06275594234466553,
0.15064457058906555,
0.03758588433265686,
-0.07729688286781311,
-0.07095571607351303,
0.07545747607946396,
-0.0798204317688942,
0.2952599823474884,
0.007051850203424692,
0.05692324787378311,
0.09223286807537079,
-0.033274851739406586,
-0.1323377937078476,
0.049896061420440674,
0.09064158797264099,
-0.06194010376930237,
0.06410481035709381,
0.20840007066726685,
-0.011975160799920559,
0.12260035425424576,
0.07416624575853348,
-0.08735647797584534,
0.05223854258656502,
-0.07405798882246017,
-0.09430453926324844,
-0.08655916899442673,
0.08934324234724045,
-0.06278510391712189,
0.15317323803901672,
0.12562185525894165,
-0.04725475609302521,
0.0027636797167360783,
-0.025733815506100655,
0.054841578006744385,
-0.0038393251597881317,
0.11300427466630936,
0.026762498542666435,
-0.19724777340888977,
0.03347480297088623,
-0.01826278306543827,
0.10099007189273834,
-0.2592698633670807,
-0.08135145157575607,
0.039587851613759995,
-0.009570525959134102,
-0.05378785356879234,
0.11855222284793854,
0.06144152209162712,
0.04968099668622017,
-0.0558135025203228,
-0.05388732627034187,
0.0009833982912823558,
0.1646765172481537,
-0.10682281851768494,
-0.0031281758565455675
] |
null | null | transformers |
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed to the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
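Until the authors complete this section, here is a minimal sketch of how such a checkpoint is typically loaded. It assumes the standard 🤗 Transformers causal-LM API; the repository id comes from this card's metadata, while the prompt, device placement, and generation settings are illustrative assumptions rather than documented choices.

```python
# Minimal sketch, assuming the standard AutoModelForCausalLM workflow; adjust device/dtype to your hardware.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "AIMH-DHgroup/llama-2-7b-chat-paragraphs-q4-2"  # taken from this card's metadata

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

# Illustrative prompt; the card does not document the expected prompt format.
inputs = tokenizer("Split the following text into paragraphs:", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```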
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
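As a rough illustration of the estimate behind these fields, the sketch below multiplies hardware power draw, runtime, and grid carbon intensity in the same spirit as the calculator; every number is a placeholder, not a measurement for this model.

```python
# Back-of-the-envelope CO2eq estimate in the spirit of the ML Impact calculator.
# All values are placeholders; substitute the actual hardware, runtime, and region figures.
power_draw_kw = 0.3       # assumed average draw of the accelerator(s), in kW
hours_used = 24.0         # assumed total compute time, in hours
carbon_intensity = 0.4    # assumed kg CO2eq per kWh for the compute region
pue = 1.1                 # assumed data-center power usage effectiveness

co2eq_kg = power_draw_kw * hours_used * pue * carbon_intensity
print(f"Estimated emissions: {co2eq_kg:.1f} kg CO2eq")
```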
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed]
| {"library_name": "transformers", "tags": []} | text-generation | AIMH-DHgroup/llama-2-7b-chat-paragraphs-q4-2 | [
"transformers",
"safetensors",
"llama",
"text-generation",
"conversational",
"arxiv:1910.09700",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | 2024-02-14T16:52:22+00:00 | [
"1910.09700"
] | [] | TAGS
#transformers #safetensors #llama #text-generation #conversational #arxiv-1910.09700 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
# Model Card for Model ID
## Model Details
### Model Description
This is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.
- Developed by:
- Funded by [optional]:
- Shared by [optional]:
- Model type:
- Language(s) (NLP):
- License:
- Finetuned from model [optional]:
### Model Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Downstream Use [optional]
### Out-of-Scope Use
## Bias, Risks, and Limitations
### Recommendations
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
## Training Details
### Training Data
### Training Procedure
#### Preprocessing [optional]
#### Training Hyperparameters
- Training regime:
#### Speeds, Sizes, Times [optional]
## Evaluation
### Testing Data, Factors & Metrics
#### Testing Data
#### Factors
#### Metrics
### Results
#### Summary
## Model Examination [optional]
## Environmental Impact
Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).
- Hardware Type:
- Hours used:
- Cloud Provider:
- Compute Region:
- Carbon Emitted:
## Technical Specifications [optional]
### Model Architecture and Objective
### Compute Infrastructure
#### Hardware
#### Software
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Model Card Authors [optional]
## Model Card Contact
| [
"# Model Card for Model ID",
"## Model Details",
"### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:",
"### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Downstream Use [optional]",
"### Out-of-Scope Use",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.",
"## How to Get Started with the Model\n\nUse the code below to get started with the model.",
"## Training Details",
"### Training Data",
"### Training Procedure",
"#### Preprocessing [optional]",
"#### Training Hyperparameters\n\n- Training regime:",
"#### Speeds, Sizes, Times [optional]",
"## Evaluation",
"### Testing Data, Factors & Metrics",
"#### Testing Data",
"#### Factors",
"#### Metrics",
"### Results",
"#### Summary",
"## Model Examination [optional]",
"## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:",
"## Technical Specifications [optional]",
"### Model Architecture and Objective",
"### Compute Infrastructure",
"#### Hardware",
"#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Model Card Authors [optional]",
"## Model Card Contact"
] | [
"TAGS\n#transformers #safetensors #llama #text-generation #conversational #arxiv-1910.09700 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n",
"# Model Card for Model ID",
"## Model Details",
"### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:",
"### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Downstream Use [optional]",
"### Out-of-Scope Use",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.",
"## How to Get Started with the Model\n\nUse the code below to get started with the model.",
"## Training Details",
"### Training Data",
"### Training Procedure",
"#### Preprocessing [optional]",
"#### Training Hyperparameters\n\n- Training regime:",
"#### Speeds, Sizes, Times [optional]",
"## Evaluation",
"### Testing Data, Factors & Metrics",
"#### Testing Data",
"#### Factors",
"#### Metrics",
"### Results",
"#### Summary",
"## Model Examination [optional]",
"## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:",
"## Technical Specifications [optional]",
"### Model Architecture and Objective",
"### Compute Infrastructure",
"#### Hardware",
"#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Model Card Authors [optional]",
"## Model Card Contact"
] | [
60,
6,
3,
82,
28,
3,
4,
9,
9,
10,
42,
20,
3,
4,
5,
9,
11,
13,
3,
12,
5,
4,
5,
3,
4,
9,
53,
9,
8,
6,
3,
14,
8,
7,
9,
4
] | [
"passage: TAGS\n#transformers #safetensors #llama #text-generation #conversational #arxiv-1910.09700 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n# Model Card for Model ID## Model Details### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Downstream Use [optional]### Out-of-Scope Use## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.## How to Get Started with the Model\n\nUse the code below to get started with the model.## Training Details### Training Data### Training Procedure#### Preprocessing [optional]#### Training Hyperparameters\n\n- Training regime:#### Speeds, Sizes, Times [optional]## Evaluation### Testing Data, Factors & Metrics#### Testing Data#### Factors#### Metrics### Results#### Summary## Model Examination [optional]## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:## Technical Specifications [optional]### Model Architecture and Objective### Compute Infrastructure#### Hardware#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Model Card Authors [optional]## Model Card Contact"
] | [
-0.04654794931411743,
0.16618601977825165,
-0.005445904564112425,
0.01853804849088192,
0.0981811136007309,
0.011998992413282394,
0.06433123350143433,
0.11398410052061081,
-0.0230073444545269,
0.11406639218330383,
0.03047988750040531,
0.10172267258167267,
0.11317981779575348,
0.14841650426387787,
-0.002152352826669812,
-0.22403094172477722,
0.050844956189394,
-0.12105348706245422,
-0.033293843269348145,
0.11749980598688126,
0.1483822613954544,
-0.09928343445062637,
0.07274559140205383,
-0.029687678441405296,
-0.012143402360379696,
-0.030057786032557487,
-0.05890674889087677,
-0.046214159578084946,
0.04651786759495735,
0.06640566885471344,
0.06770290434360504,
0.0071083661168813705,
0.09012923389673233,
-0.2696533799171448,
0.018959321081638336,
0.07145345956087112,
-0.002759667346253991,
0.06957992166280746,
0.06404146552085876,
-0.07107418030500412,
0.10337356477975845,
-0.05106033384799957,
0.14650006592273712,
0.08365883678197861,
-0.09081148356199265,
-0.1895141303539276,
-0.08866965025663376,
0.09882009029388428,
0.17572562396526337,
0.04925641790032387,
-0.02320658043026924,
0.09761467576026917,
-0.08769196271896362,
0.015438909642398357,
0.04981724172830582,
-0.07620415836572647,
-0.05378096550703049,
0.05986575037240982,
0.07907199114561081,
0.06627275794744492,
-0.12434766441583633,
-0.02885502204298973,
0.005009706597775221,
0.010980482213199139,
0.0769270583987236,
0.01728810742497444,
0.146672785282135,
0.0338633768260479,
-0.12615777552127838,
-0.04880760237574577,
0.09869225323200226,
0.03395522013306618,
-0.04422314465045929,
-0.24749068915843964,
-0.03152675926685333,
-0.030810698866844177,
-0.029386121779680252,
-0.03716538846492767,
0.04340358078479767,
-0.007673026993870735,
0.08638741075992584,
-0.0060646249912679195,
-0.07403432577848434,
-0.03937075287103653,
0.06169692054390907,
0.0672287791967392,
0.02999979443848133,
-0.013745363801717758,
0.010938193649053574,
0.11620724946260452,
0.1095694974064827,
-0.12054188549518585,
-0.05555335059762001,
-0.06393084675073624,
-0.08656639605760574,
-0.040790557861328125,
0.034162238240242004,
0.03456587344408035,
0.05349370837211609,
0.25305667519569397,
0.015654386952519417,
0.059652652591466904,
0.034477248787879944,
0.007892133668065071,
0.05848940089344978,
0.11044429242610931,
-0.06018859148025513,
-0.10444226115942001,
-0.02648012898862362,
0.08843598514795303,
0.008199662901461124,
-0.03287925571203232,
-0.05088530853390694,
0.06019928678870201,
0.01946467161178589,
0.11926145106554031,
0.09061790257692337,
0.010536285117268562,
-0.07121123373508453,
-0.061038948595523834,
0.1891259253025055,
-0.16544590890407562,
0.04322727024555206,
0.035097137093544006,
-0.03903156518936157,
0.00019933005387429148,
0.013914269395172596,
0.016625655815005302,
-0.025983380153775215,
0.09017423540353775,
-0.054113563150167465,
-0.04145489260554314,
-0.11186197400093079,
-0.03383193537592888,
0.033762916922569275,
0.008953776210546494,
-0.035059962421655655,
-0.033713940531015396,
-0.08351044356822968,
-0.07577689737081528,
0.09320491552352905,
-0.07346344739198685,
-0.04878907650709152,
-0.01804324984550476,
-0.07530532777309418,
0.022395428270101547,
0.019394835457205772,
0.07707412540912628,
-0.02362251654267311,
0.04399976506829262,
-0.05189276114106178,
0.05863580107688904,
0.11207318305969238,
0.03570080175995827,
-0.05736649036407471,
0.06062258034944534,
-0.23834340274333954,
0.09552820026874542,
-0.07409077137708664,
0.05591456592082977,
-0.153293639421463,
-0.024439791217446327,
0.04788333550095558,
0.008784620091319084,
-0.009650949388742447,
0.13416339457035065,
-0.21702027320861816,
-0.02536402828991413,
0.1717337965965271,
-0.10057014971971512,
-0.07069246470928192,
0.05619903281331062,
-0.04835370555520058,
0.10988964140415192,
0.03825836628675461,
-0.025690359994769096,
0.06171267107129097,
-0.1267417073249817,
0.003717758459970355,
-0.05005312338471413,
-0.017048977315425873,
0.1548657864332199,
0.07182947546243668,
-0.07217690348625183,
0.07399354875087738,
0.025708531960844994,
-0.0246540866792202,
-0.04625825211405754,
-0.015164627693593502,
-0.10536660254001617,
0.014689887873828411,
-0.06369215250015259,
0.014470234513282776,
-0.020807426422834396,
-0.09071163833141327,
-0.027962757274508476,
-0.17504668235778809,
-0.03014434315264225,
0.08651752024888992,
-0.008693269453942776,
-0.01803150773048401,
-0.1178668737411499,
0.009341353550553322,
0.04177580401301384,
0.0061247628182172775,
-0.13462838530540466,
-0.04812471568584442,
0.02780051715672016,
-0.1600649207830429,
0.034652888774871826,
-0.05392369255423546,
0.04932025074958801,
0.025790516287088394,
-0.028889117762446404,
-0.026493212208151817,
0.021633783355355263,
0.005992184858769178,
-0.011999987065792084,
-0.24343903362751007,
-0.028118690475821495,
-0.024888472631573677,
0.1682123839855194,
-0.20917098224163055,
0.03546025976538658,
0.07867541164159775,
0.15366052091121674,
0.011240328662097454,
-0.04177491366863251,
0.005974748637527227,
-0.06935794651508331,
-0.02736494317650795,
-0.05875484645366669,
-0.0047869328409433365,
-0.03310677409172058,
-0.04545191675424576,
0.04568447172641754,
-0.16510973870754242,
-0.032636504620313644,
0.09776268899440765,
0.06289951503276825,
-0.13922683894634247,
-0.020621931180357933,
-0.03630133345723152,
-0.049253206700086594,
-0.04911839962005615,
-0.0605199858546257,
0.10893940925598145,
0.05891856551170349,
0.04574795812368393,
-0.05928509309887886,
-0.07568105310201645,
-0.001827909960411489,
-0.013898161239922047,
-0.017864689230918884,
0.09759635478258133,
0.0751434788107872,
-0.13251115381717682,
0.09224759042263031,
0.09603385627269745,
0.07919023185968399,
0.09113933145999908,
-0.02355697751045227,
-0.08261934667825699,
-0.045987509191036224,
0.031442027539014816,
0.020124373957514763,
0.13039541244506836,
-0.024294709786772728,
0.04352088272571564,
0.042134687304496765,
-0.019369594752788544,
0.014752166345715523,
-0.08687400817871094,
0.033972494304180145,
0.028472330421209335,
-0.016721390187740326,
0.050190530717372894,
-0.03876714035868645,
0.02440318465232849,
0.08830609917640686,
0.045322712510824203,
0.03507532551884651,
0.015493292361497879,
-0.05206458270549774,
-0.1083620935678482,
0.16405931115150452,
-0.12714070081710815,
-0.22483378648757935,
-0.13936103880405426,
0.0037376401014626026,
0.035628627985715866,
-0.015835661441087723,
0.002417160663753748,
-0.059374887496232986,
-0.12220635265111923,
-0.08858037739992142,
0.015140829607844353,
0.04942670464515686,
-0.09028962254524231,
-0.06437795609235764,
0.058117836713790894,
0.03889724239706993,
-0.14560972154140472,
0.017612040042877197,
0.04854894429445267,
-0.09789852797985077,
-0.006774199660867453,
0.08094939589500427,
0.0698540136218071,
0.1770169734954834,
0.017703235149383545,
-0.021850809454917908,
0.032354529947042465,
0.20614571869373322,
-0.13538233935832977,
0.11083246022462845,
0.13607586920261383,
-0.09041404724121094,
0.08072979003190994,
0.19951270520687103,
0.03932560607790947,
-0.10153959691524506,
0.031980328261852264,
0.02283124253153801,
-0.0284719280898571,
-0.24526868760585785,
-0.07212468236684799,
-0.004402178805321455,
-0.058010730892419815,
0.07660572230815887,
0.09286724030971527,
0.08215958625078201,
0.012304253876209259,
-0.09310996532440186,
-0.08154371380805969,
0.05942574888467789,
0.10367169976234436,
0.024584239348769188,
-0.010839897207915783,
0.08998730033636093,
-0.034100502729415894,
0.019626356661319733,
0.0853661298751831,
0.005239574704319239,
0.17840281128883362,
0.05159219726920128,
0.18830420076847076,
0.07925192266702652,
0.07219027727842331,
0.009912233799695969,
0.013080619275569916,
0.018877580761909485,
0.03300119563937187,
-0.002769160782918334,
-0.08440786600112915,
-0.02248465269804001,
0.11566436290740967,
0.06668911874294281,
0.010815348476171494,
0.015172341838479042,
-0.04104290530085564,
0.07965951412916183,
0.1831512451171875,
-0.007656289264559746,
-0.1783534437417984,
-0.057547420263290405,
0.07553383708000183,
-0.09879875183105469,
-0.09854305535554886,
-0.013454320840537548,
0.03072015568614006,
-0.17046253383159637,
0.023390959948301315,
-0.02239842526614666,
0.1106182336807251,
-0.14194999635219574,
-0.020490378141403198,
0.07218493521213531,
0.07199500501155853,
0.004729843698441982,
0.05758659541606903,
-0.16417601704597473,
0.10671813786029816,
0.008950476534664631,
0.06779605895280838,
-0.09610627591609955,
0.1008887067437172,
-0.004196076653897762,
-0.02063460275530815,
0.1393408179283142,
0.002700034761801362,
-0.06884108483791351,
-0.0763031542301178,
-0.08754398673772812,
-0.009632662869989872,
0.12754282355308533,
-0.1419651061296463,
0.08767123520374298,
-0.037212442606687546,
-0.0424150750041008,
-0.0017086371080949903,
-0.10206665843725204,
-0.11638247221708298,
-0.18888559937477112,
0.06001543253660202,
-0.13492922484874725,
0.03152317553758621,
-0.10799519717693329,
-0.032371897250413895,
-0.030304040759801865,
0.19337286055088043,
-0.23447458446025848,
-0.07199826091527939,
-0.1475764364004135,
-0.10233612358570099,
0.1443224400281906,
-0.0501345656812191,
0.08485390990972519,
-0.007241467013955116,
0.16846685111522675,
0.019060896709561348,
-0.02531743235886097,
0.0971490666270256,
-0.09173708409070969,
-0.19302815198898315,
-0.07869284600019455,
0.15662524104118347,
0.13260218501091003,
0.031680017709732056,
-0.002461588243022561,
0.036563750356435776,
-0.015421539545059204,
-0.11935004591941833,
0.015969349071383476,
0.1787186712026596,
0.06237189099192619,
0.02331034652888775,
-0.027346095070242882,
-0.11273157596588135,
-0.06900003552436829,
-0.028530338779091835,
0.03054865077137947,
0.17762407660484314,
-0.07057618349790573,
0.18207968771457672,
0.14163152873516083,
-0.05922834202647209,
-0.20400173962116241,
0.010538800619542599,
0.03055560030043125,
0.0009220078936778009,
0.02591954916715622,
-0.20123432576656342,
0.08688826113939285,
0.004683020059019327,
-0.05110127478837967,
0.13194532692432404,
-0.17217805981636047,
-0.14451217651367188,
0.0765485092997551,
0.038384392857551575,
-0.19559739530086517,
-0.12913893163204193,
-0.09174312651157379,
-0.045869920402765274,
-0.18591414391994476,
0.09569250047206879,
0.0305706188082695,
0.010893458500504494,
0.03030681423842907,
0.029179483652114868,
0.019487828016281128,
-0.0418255440890789,
0.18391458690166473,
-0.024792250245809555,
0.026594700291752815,
-0.08539514988660812,
-0.06927408277988434,
0.03743394836783409,
-0.052842434495687485,
0.07349982857704163,
-0.023486759513616562,
0.007861839607357979,
-0.10348054021596909,
-0.042148489505052567,
-0.03735732287168503,
0.015448716469109058,
-0.09657872468233109,
-0.08514349907636642,
-0.045032672584056854,
0.09675803780555725,
0.09690850973129272,
-0.033646680414676666,
-0.028050623834133148,
-0.07533035427331924,
0.04412057250738144,
0.19926515221595764,
0.1785389482975006,
0.042153384536504745,
-0.08034496754407883,
-0.004150947090238333,
-0.010121207684278488,
0.04310847446322441,
-0.20463712513446808,
0.06283636391162872,
0.05450061708688736,
0.01973269321024418,
0.11436162889003754,
-0.019565396010875702,
-0.15359151363372803,
-0.07263088971376419,
0.06303015351295471,
-0.060181066393852234,
-0.19620554149150848,
0.00867035984992981,
0.060603946447372437,
-0.16371412575244904,
-0.04535605385899544,
0.04643881320953369,
-0.005620351992547512,
-0.038163937628269196,
0.021896906197071075,
0.09194854646921158,
0.0026654244866222143,
0.07427921891212463,
0.05387866869568825,
0.0827430784702301,
-0.10537070035934448,
0.08090532571077347,
0.08839722722768784,
-0.08452684432268143,
0.023530138656497,
0.10478579998016357,
-0.059433579444885254,
-0.03440561518073082,
0.020135708153247833,
0.08153781294822693,
0.01775863952934742,
-0.040019966661930084,
0.013229827396571636,
-0.10452935844659805,
0.05954122915863991,
0.08839859813451767,
0.032507482916116714,
0.016702456399798393,
0.03425082191824913,
0.04607953503727913,
-0.07238735258579254,
0.12142276018857956,
0.031868141144514084,
0.017129309475421906,
-0.036505792289972305,
-0.040896978229284286,
0.019542274996638298,
-0.03214648738503456,
-0.005015232600271702,
-0.03023446537554264,
-0.07695909589529037,
-0.014793801121413708,
-0.1626158058643341,
-0.011131818406283855,
-0.05648450180888176,
0.010329355485737324,
0.03204665705561638,
-0.032609567046165466,
0.008124498650431633,
0.009250079281628132,
-0.07695289701223373,
-0.0663459524512291,
-0.020460480824112892,
0.09540658444166183,
-0.16213038563728333,
0.022481130436062813,
0.08244425803422928,
-0.12187694013118744,
0.09281346201896667,
0.016204802319407463,
-0.006236857734620571,
0.025038830935955048,
-0.1475188434123993,
0.034843120723962784,
-0.03386561945080757,
0.010836300440132618,
0.04373383894562721,
-0.21569781005382538,
-0.00004886732858722098,
-0.033673107624053955,
-0.06639216095209122,
-0.009451326914131641,
-0.03672455996274948,
-0.11508306115865707,
0.1058407872915268,
0.007236586883664131,
-0.08753558248281479,
-0.03186136856675148,
0.029325377196073532,
0.0838974118232727,
-0.021959776058793068,
0.15145497024059296,
-0.008370938710868359,
0.07429654151201248,
-0.16209737956523895,
-0.018623165786266327,
-0.006028574425727129,
0.022658247500658035,
-0.01664556935429573,
-0.01111356820911169,
0.044031109660863876,
-0.022746501490473747,
0.17925859987735748,
-0.030318550765514374,
0.02272745408117771,
0.06815794110298157,
0.019072026014328003,
-0.030184008181095123,
0.10406795144081116,
0.04094860330224037,
0.02014910988509655,
0.018591465428471565,
0.003289656015112996,
-0.04647882282733917,
-0.03173251822590828,
-0.19407226145267487,
0.07288651913404465,
0.15608493983745575,
0.09729263186454773,
-0.016707008704543114,
0.07954329252243042,
-0.10199416428804398,
-0.1109243705868721,
0.12477338314056396,
-0.04797708988189697,
-0.002418199321255088,
-0.07150927931070328,
0.13247236609458923,
0.1437523066997528,
-0.1859612911939621,
0.07269313186407089,
-0.0699717253446579,
-0.04708027467131615,
-0.10980689525604248,
-0.19441905617713928,
-0.05561789125204086,
-0.049456022679805756,
-0.016053348779678345,
-0.04698808491230011,
0.07504211366176605,
0.054538097232580185,
0.006766852922737598,
-0.0023397188633680344,
0.06506035476922989,
-0.031050674617290497,
-0.0037882844917476177,
0.032597362995147705,
0.06591679900884628,
0.012734474614262581,
-0.030802709981799126,
0.016619903966784477,
-0.013545602560043335,
0.045626189559698105,
0.06578011065721512,
0.04976864159107208,
-0.02938537672162056,
0.014603170566260815,
-0.038539156317710876,
-0.10249634087085724,
0.043612558394670486,
-0.024421939626336098,
-0.0789753645658493,
0.15477414429187775,
0.023680059239268303,
0.007779473438858986,
-0.020137663930654526,
0.23901568353176117,
-0.0738423764705658,
-0.0964353010058403,
-0.14737580716609955,
0.10557299107313156,
-0.038081806153059006,
0.05800395458936691,
0.04625935107469559,
-0.10226529091596603,
0.018044332042336464,
0.1338089406490326,
0.16182038187980652,
-0.039008259773254395,
0.020095856860280037,
0.031135575845837593,
0.00566398398950696,
-0.03622615709900856,
0.04847532883286476,
0.06906453520059586,
0.16569648683071136,
-0.04632584750652313,
0.09100406616926193,
0.0019041687482967973,
-0.09579581767320633,
-0.038361791521310806,
0.11069868505001068,
-0.016052277758717537,
0.019335128366947174,
-0.05818064883351326,
0.11742528527975082,
-0.06386786699295044,
-0.23783175647258759,
0.06453443318605423,
-0.0684293657541275,
-0.13765870034694672,
-0.02378307841718197,
0.08207765966653824,
-0.012955902144312859,
0.027587108314037323,
0.0730307325720787,
-0.07240920513868332,
0.201939657330513,
0.03798431158065796,
-0.05499868467450142,
-0.055047210305929184,
0.0805421993136406,
-0.10008571296930313,
0.2739645540714264,
0.01557221356779337,
0.04601577669382095,
0.10384146869182587,
-0.009341772645711899,
-0.13838784396648407,
0.019836371764540672,
0.09581108391284943,
-0.10502193123102188,
0.04196618124842644,
0.19815568625926971,
-0.0014755994779989123,
0.12389086186885834,
0.07657600939273834,
-0.07551808655261993,
0.0478031262755394,
-0.08054235577583313,
-0.06760486960411072,
-0.09260394424200058,
0.09703279286623001,
-0.07772123068571091,
0.14251399040222168,
0.13876807689666748,
-0.05074559152126312,
0.012724342755973339,
-0.031311117112636566,
0.044293127954006195,
-0.00010600237874314189,
0.10321761667728424,
0.004272161517292261,
-0.1832672357559204,
0.024692710489034653,
0.005650998093187809,
0.10749758034944534,
-0.16033467650413513,
-0.09566054493188858,
0.042343202978372574,
0.003505636239424348,
-0.0672195628285408,
0.1290110945701599,
0.05665452033281326,
0.04342988133430481,
-0.03997718170285225,
-0.03521440550684929,
-0.0060732318088412285,
0.13561366498470306,
-0.10713256150484085,
0.0009933578548952937
] |
null | null | transformers |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# Dagobert42/mobilebert-uncased-biored-finetuned
This model is a fine-tuned version of [mobilebert-uncased](https://huggingface.co/mobilebert-uncased) on the bigbio/biored dataset.
It achieves the following results on the evaluation set:
- Loss: 0.7632
- Accuracy: 0.7385
- Precision: 0.2012
- Recall: 0.2384
- F1: 0.215
- Weighted F1: 0.7009
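For readers who want to try the checkpoint directly, the snippet below is a minimal inference sketch; it assumes the standard 🤗 Transformers token-classification pipeline, takes the repository id from this card, and uses an aggregation strategy and example sentence that are illustrative choices only.

```python
# Minimal inference sketch; aggregation strategy and example text are illustrative, not prescribed by the card.
from transformers import pipeline

ner = pipeline(
    "token-classification",
    model="Dagobert42/mobilebert-uncased-biored-finetuned",
    aggregation_strategy="simple",
)
print(ner("Mutations in BRCA1 are associated with an increased risk of breast cancer."))
```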
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 50
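For reproducibility, the sketch below maps the hyperparameters listed above onto `TrainingArguments`; it is an approximate reconstruction (dataset loading, tokenization, label alignment, and the `Trainer` call are omitted), not the authors' original script.

```python
# Approximate reconstruction of the listed hyperparameters; not the authors' original training script.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="mobilebert-uncased-biored-finetuned",  # assumed output directory
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=50,
)
```

A `Trainer` would then be constructed with the token-classification model, these arguments, and the tokenized bigbio/biored splits.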
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Precision | Recall | F1 | Weighted F1 |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:---------:|:------:|:------:|:-----------:|
| No log | 1.0 | 25 | 1.2345 | 0.7114 | 0.1016 | 0.1429 | 0.1188 | 0.5914 |
| No log | 2.0 | 50 | 1.0379 | 0.7114 | 0.1016 | 0.1429 | 0.1188 | 0.5914 |
| No log | 3.0 | 75 | 1.0300 | 0.7114 | 0.1016 | 0.1429 | 0.1188 | 0.5914 |
| No log | 4.0 | 100 | 1.0228 | 0.7114 | 0.1016 | 0.1429 | 0.1188 | 0.5914 |
| No log | 5.0 | 125 | 1.0144 | 0.7114 | 0.1016 | 0.1429 | 0.1188 | 0.5914 |
| No log | 6.0 | 150 | 0.9994 | 0.7114 | 0.1016 | 0.1429 | 0.1188 | 0.5914 |
| No log | 7.0 | 175 | 0.9681 | 0.7114 | 0.1016 | 0.1429 | 0.1188 | 0.5914 |
| No log | 8.0 | 200 | 0.8869 | 0.7147 | 0.2167 | 0.1487 | 0.1303 | 0.6007 |
| No log | 9.0 | 225 | 0.8511 | 0.7242 | 0.2064 | 0.1716 | 0.1598 | 0.6298 |
| No log | 10.0 | 250 | 0.8187 | 0.7287 | 0.157 | 0.1991 | 0.1754 | 0.653 |
| No log | 11.0 | 275 | 0.8046 | 0.7317 | 0.1581 | 0.2035 | 0.1775 | 0.6581 |
| No log | 12.0 | 300 | 0.7900 | 0.732 | 0.1935 | 0.2126 | 0.1887 | 0.6688 |
| No log | 13.0 | 325 | 0.7865 | 0.734 | 0.2312 | 0.2129 | 0.1828 | 0.6664 |
| No log | 14.0 | 350 | 0.7758 | 0.7346 | 0.1604 | 0.2148 | 0.1819 | 0.6672 |
| No log | 15.0 | 375 | 0.7958 | 0.7376 | 0.2086 | 0.2141 | 0.1884 | 0.6697 |
| No log | 16.0 | 400 | 0.7757 | 0.733 | 0.2002 | 0.2347 | 0.2122 | 0.6904 |
| No log | 17.0 | 425 | 0.7874 | 0.7393 | 0.2067 | 0.2196 | 0.2119 | 0.6828 |
| No log | 18.0 | 450 | 0.7915 | 0.735 | 0.2043 | 0.2391 | 0.2197 | 0.6959 |
### Framework versions
- Transformers 4.35.2
- Pytorch 2.0.1+cu117
- Datasets 2.12.0
- Tokenizers 0.15.0
| {"language": ["en"], "license": "mit", "tags": ["low-resource NER", "token_classification", "biomedicine", "medical NER", "generated_from_trainer"], "datasets": ["medicine"], "metrics": ["accuracy", "precision", "recall", "f1"], "base_model": "mobilebert-uncased", "model-index": [{"name": "Dagobert42/mobilebert-uncased-biored-finetuned", "results": []}]} | token-classification | Dagobert42/mobilebert-uncased-biored-finetuned | [
"transformers",
"safetensors",
"mobilebert",
"token-classification",
"low-resource NER",
"token_classification",
"biomedicine",
"medical NER",
"generated_from_trainer",
"en",
"dataset:medicine",
"base_model:mobilebert-uncased",
"license:mit",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | 2024-02-14T16:55:45+00:00 | [] | [
"en"
] | TAGS
#transformers #safetensors #mobilebert #token-classification #low-resource NER #token_classification #biomedicine #medical NER #generated_from_trainer #en #dataset-medicine #base_model-mobilebert-uncased #license-mit #autotrain_compatible #endpoints_compatible #region-us
| Dagobert42/mobilebert-uncased-biored-finetuned
==============================================
This model is a fine-tuned version of mobilebert-uncased on the bigbio/biored dataset.
It achieves the following results on the evaluation set:
* Loss: 0.7632
* Accuracy: 0.7385
* Precision: 0.2012
* Recall: 0.2384
* F1: 0.215
* Weighted F1: 0.7009
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 2e-05
* train\_batch\_size: 8
* eval\_batch\_size: 8
* seed: 42
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* num\_epochs: 50
### Training results
### Framework versions
* Transformers 4.35.2
* Pytorch 2.0.1+cu117
* Datasets 2.12.0
* Tokenizers 0.15.0
| [
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 8\n* eval\\_batch\\_size: 8\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 50",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.35.2\n* Pytorch 2.0.1+cu117\n* Datasets 2.12.0\n* Tokenizers 0.15.0"
] | [
"TAGS\n#transformers #safetensors #mobilebert #token-classification #low-resource NER #token_classification #biomedicine #medical NER #generated_from_trainer #en #dataset-medicine #base_model-mobilebert-uncased #license-mit #autotrain_compatible #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 8\n* eval\\_batch\\_size: 8\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 50",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.35.2\n* Pytorch 2.0.1+cu117\n* Datasets 2.12.0\n* Tokenizers 0.15.0"
] | [
92,
98,
4,
33
] | [
"passage: TAGS\n#transformers #safetensors #mobilebert #token-classification #low-resource NER #token_classification #biomedicine #medical NER #generated_from_trainer #en #dataset-medicine #base_model-mobilebert-uncased #license-mit #autotrain_compatible #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 8\n* eval\\_batch\\_size: 8\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 50### Training results### Framework versions\n\n\n* Transformers 4.35.2\n* Pytorch 2.0.1+cu117\n* Datasets 2.12.0\n* Tokenizers 0.15.0"
] | [
-0.08097153156995773,
0.1780368536710739,
-0.003434477373957634,
0.03900350630283356,
0.10920730233192444,
-0.013689043931663036,
0.127739816904068,
0.15261468291282654,
-0.13246671855449677,
0.1404716670513153,
0.1532735675573349,
0.10488846153020859,
0.01892741024494171,
0.2132045477628708,
-0.057157788425683975,
-0.20446500182151794,
0.019224321469664574,
0.036097344011068344,
-0.008367073722183704,
0.10888916999101639,
0.07132579386234283,
-0.1219954714179039,
0.09249742329120636,
0.03709828853607178,
-0.11886825412511826,
-0.009006956592202187,
0.0431474968791008,
-0.0546145923435688,
0.08032983541488647,
-0.038780078291893005,
0.08529281616210938,
0.04164474084973335,
0.06200701743364334,
-0.20348688960075378,
0.006393082439899445,
0.02837960422039032,
0.0031772498041391373,
0.11107538640499115,
0.003803649917244911,
-0.0812494307756424,
0.10589096695184708,
-0.1855250746011734,
0.07252795994281769,
0.028081869706511497,
-0.0958651676774025,
-0.2557307779788971,
-0.12673696875572205,
0.09548138827085495,
0.06022555008530617,
0.057397328317165375,
0.005870939698070288,
0.20028667151927948,
-0.04359220713376999,
0.07790557295084,
0.2834473252296448,
-0.3124329447746277,
-0.0643133670091629,
0.015372103080153465,
0.07103316485881805,
0.06128998473286629,
-0.0939418375492096,
0.0023595651146024466,
0.054400164633989334,
0.01979631558060646,
0.15575838088989258,
0.007124272640794516,
0.018922099843621254,
-0.003522770944982767,
-0.14220790565013885,
-0.023805253207683563,
0.13059286773204803,
0.028596078976988792,
-0.039888784289360046,
-0.07603815197944641,
-0.050454553216695786,
-0.18604524433612823,
-0.03894544392824173,
-0.05787501111626625,
0.046948399394750595,
-0.09171289205551147,
-0.08916489034891129,
0.016074659302830696,
-0.03494036942720413,
-0.049761462956666946,
-0.021978113800287247,
0.10002051293849945,
0.054112277925014496,
-0.009289190173149109,
-0.003937620669603348,
0.056456319987773895,
-0.016154391691088676,
-0.13506998121738434,
-0.021782059222459793,
0.016585012897849083,
0.006905100308358669,
-0.07867785543203354,
-0.0026727078948169947,
-0.00840875506401062,
0.08063796162605286,
0.18346086144447327,
-0.07849102467298508,
0.026888815686106682,
0.005282958038151264,
0.0033527992200106382,
-0.0867108404636383,
0.05684042349457741,
-0.05036703124642372,
-0.0496172271668911,
0.031271934509277344,
0.11752676963806152,
0.07473460584878922,
0.032292552292346954,
-0.07567290216684341,
0.044418785721063614,
0.107542984187603,
0.037868402898311615,
-0.06412279605865479,
0.03165166825056076,
-0.05741143599152565,
-0.019865596666932106,
0.09753885865211487,
-0.06897950172424316,
0.0059160529635846615,
0.023238345980644226,
-0.025074427947402,
-0.022770538926124573,
0.006522809155285358,
0.002208951860666275,
-0.002334890654310584,
0.06235688179731369,
-0.09700164943933487,
-0.029602788388729095,
-0.04580189287662506,
-0.08937716484069824,
0.02353876642882824,
-0.07144095748662949,
-0.00624542310833931,
-0.11332793533802032,
-0.09795060008764267,
-0.021858030930161476,
0.012291781604290009,
-0.043181806802749634,
-0.05618662014603615,
-0.07209622859954834,
-0.07112002372741699,
0.031357597559690475,
-0.00992633681744337,
0.002905926899984479,
-0.07498563081026077,
0.07410348206758499,
0.0168015006929636,
0.09749836474657059,
-0.004388132132589817,
0.00605040742084384,
-0.08593014627695084,
0.06279265880584717,
-0.17962133884429932,
-0.0014754312578588724,
-0.07576645910739899,
0.036358483135700226,
-0.14262205362319946,
-0.09950698912143707,
-0.00623977230861783,
-0.078519806265831,
0.08295909315347672,
0.12105347961187363,
-0.10035574436187744,
-0.100201316177845,
0.20040346682071686,
-0.06400797516107559,
-0.1788105070590973,
0.11280917376279831,
-0.035314273089170456,
0.018612073734402657,
0.06850342452526093,
0.2274353951215744,
0.1076192706823349,
-0.1179543286561966,
-0.09291528165340424,
-0.09307599812746048,
0.053939878940582275,
-0.05814654752612114,
0.09692217409610748,
-0.00954813789576292,
0.012440362013876438,
-0.001892761094495654,
-0.05777725204825401,
0.03711853176355362,
-0.1207881048321724,
-0.06480034440755844,
-0.016343099996447563,
-0.12289400398731232,
0.05236721411347389,
0.012848441489040852,
0.07122500240802765,
-0.12002532184123993,
-0.03598465397953987,
0.07662990689277649,
0.10532863438129425,
-0.06253421306610107,
0.02232566475868225,
-0.09772556275129318,
0.10511717945337296,
-0.07768166065216064,
-0.02678311988711357,
-0.12380807101726532,
-0.07665127515792847,
0.04869038239121437,
-0.08793162554502487,
-0.035242851823568344,
-0.10395097732543945,
0.049341000616550446,
0.10260149091482162,
-0.06984741240739822,
-0.02548089250922203,
-0.025465862825512886,
0.037318721413612366,
-0.09207230806350708,
-0.21368730068206787,
0.01759377308189869,
-0.04619738832116127,
0.10080619901418686,
-0.204654723405838,
0.03732703626155853,
0.023827072232961655,
0.15059849619865417,
0.06121531501412392,
-0.05610344931483269,
0.022321492433547974,
0.04411759600043297,
-0.03530985489487648,
-0.06213495507836342,
0.048897720873355865,
-0.035251472145318985,
-0.07687360793352127,
-0.00432342104613781,
-0.10392710566520691,
0.16379733383655548,
0.07220026105642319,
0.056616634130477905,
-0.07934259623289108,
-0.041016608476638794,
-0.062125951051712036,
-0.03010784089565277,
-0.07597710937261581,
0.03190706670284271,
0.07991335541009903,
0.022013498470187187,
0.14483758807182312,
-0.07447929680347443,
-0.055414117872714996,
0.04774005338549614,
-0.03695913031697273,
0.014066549018025398,
0.11597969383001328,
0.0538434274494648,
-0.18820346891880035,
0.122951939702034,
0.1624670773744583,
-0.03137947618961334,
0.12950000166893005,
-0.030318496748805046,
-0.05814054608345032,
-0.06304413080215454,
-0.006044607609510422,
0.06085299700498581,
0.08390665054321289,
-0.05301966890692711,
0.0012193035800009966,
0.02578260563313961,
0.01527694147080183,
-0.0040978239849209785,
-0.10376598685979843,
-0.007171614095568657,
0.015438751317560673,
-0.06030214577913284,
-0.01208257582038641,
0.023012975230813026,
-0.006926689762622118,
0.11740100383758545,
0.03572385013103485,
-0.021991204470396042,
0.029077552258968353,
0.01974008046090603,
-0.09509339183568954,
0.20240989327430725,
-0.08450890332460403,
-0.09444174915552139,
-0.08481886982917786,
-0.06770150363445282,
-0.039379969239234924,
-0.002194575499743223,
0.0380135253071785,
-0.07350041717290878,
-0.018402256071567535,
-0.08759348094463348,
-0.07204775512218475,
0.07403972744941711,
0.0353759229183197,
0.018630532547831535,
0.01261298917233944,
0.11251085251569748,
-0.07393334060907364,
-0.010896923951804638,
-0.05420795828104019,
-0.024664951488375664,
0.04464937746524811,
-0.035940755158662796,
0.10676891356706619,
0.10728769749403,
-0.022576777264475822,
0.01928439736366272,
-0.02932046353816986,
0.22020764648914337,
-0.0473644845187664,
-0.05103529244661331,
0.10680945962667465,
0.05477694422006607,
0.06343939900398254,
0.0990486666560173,
0.05103312432765961,
-0.09010828286409378,
0.0267977062612772,
0.04133700206875801,
-0.016744019463658333,
-0.24397771060466766,
-0.04759111627936363,
-0.034835174679756165,
-0.045782435685396194,
0.09529735893011093,
0.03670674189925194,
0.006387886591255665,
0.03864471614360809,
0.01272517442703247,
0.0312696173787117,
-0.03575172647833824,
0.11478165537118912,
0.11432851105928421,
0.060448192059993744,
0.12683117389678955,
-0.015527753159403801,
-0.0588124580681324,
0.044603150337934494,
-0.0066962470300495625,
0.24453580379486084,
0.013273732736706734,
0.19044050574302673,
0.07624814659357071,
0.15093131363391876,
0.030514275655150414,
0.03757258877158165,
0.018582148477435112,
-0.020043842494487762,
0.011932946741580963,
-0.06355296075344086,
-0.03187457472085953,
0.006791382562369108,
-0.05689578503370285,
0.060543302446603775,
-0.09540140628814697,
0.06929011642932892,
0.08205560594797134,
0.22355832159519196,
0.07241298258304596,
-0.3319413959980011,
-0.07930111140012741,
0.006497479043900967,
-0.031907178461551666,
-0.02253396064043045,
0.020628293976187706,
0.13123013079166412,
-0.062387850135564804,
0.10586025565862656,
-0.07257547974586487,
0.07046277076005936,
-0.046973954886198044,
0.053230758756399155,
0.014313588850200176,
0.04132063686847687,
-0.030506879091262817,
0.07056330889463425,
-0.22173362970352173,
0.2402694970369339,
0.01250078808516264,
0.053389012813568115,
-0.025743944570422173,
-0.05047224462032318,
0.004883871879428625,
0.11956027150154114,
0.12295334041118622,
0.033217668533325195,
-0.08392610400915146,
-0.14315608143806458,
-0.1391880214214325,
0.016998248174786568,
0.10744970291852951,
-0.039771951735019684,
0.12396407127380371,
-0.0012695064069703221,
-0.0161216389387846,
0.013102958910167217,
-0.0867333859205246,
-0.10333418846130371,
-0.07188455760478973,
0.013414542190730572,
-0.006817504297941923,
0.007679054979234934,
-0.09377969056367874,
-0.09788627177476883,
-0.04998970776796341,
0.1619691401720047,
-0.10598815977573395,
-0.02693944424390793,
-0.13361141085624695,
0.06656250357627869,
0.06432698667049408,
-0.11052420735359192,
0.03743641823530197,
0.014974919147789478,
0.09539704769849777,
0.01303771510720253,
-0.06405875086784363,
0.08850399404764175,
-0.0645841434597969,
-0.23491546511650085,
-0.0685301125049591,
0.12011265009641647,
0.046792466193437576,
0.06147307530045509,
0.007614958565682173,
0.04363515228033066,
-0.0075782909989356995,
-0.0651853159070015,
0.07983999699354172,
0.08165706694126129,
0.11772049963474274,
0.020571354776620865,
-0.04981253296136856,
-0.0010404043132439256,
-0.052443407475948334,
-0.03321357071399689,
0.159915491938591,
0.30499836802482605,
-0.08033998310565948,
0.06834256649017334,
0.058883875608444214,
-0.06456605345010757,
-0.1767183095216751,
0.011383435688912868,
0.08931083977222443,
-0.0026569850742816925,
0.045881036669015884,
-0.15164639055728912,
0.09145661443471909,
0.11146949231624603,
-0.026032239198684692,
0.014399363659322262,
-0.20238913595676422,
-0.12042734771966934,
0.04734989255666733,
0.12070314586162567,
0.11665615439414978,
-0.12393160909414291,
-0.06507104635238647,
-0.004308762960135937,
-0.06974153965711594,
0.14878195524215698,
-0.15479151904582977,
0.10190125554800034,
-0.060717687010765076,
0.009059750474989414,
0.011356845498085022,
-0.06920495629310608,
0.10479658842086792,
0.015824493020772934,
0.07207800447940826,
-0.012439030222594738,
-0.04346682131290436,
0.10157344490289688,
-0.07088663429021835,
0.025338225066661835,
-0.06178423762321472,
0.046537403017282486,
-0.09211976826190948,
-0.01749141700565815,
-0.0813327431678772,
0.05294344946742058,
-0.0175943523645401,
-0.028889089822769165,
-0.09778498858213425,
0.061708964407444,
0.10474546998739243,
-0.00039281946374103427,
0.21385085582733154,
0.012219104915857315,
0.14014461636543274,
0.11648821830749512,
0.10410405695438385,
-0.03272221237421036,
0.014395263977348804,
0.011707715690135956,
-0.057999614626169205,
0.03185637667775154,
-0.15258407592773438,
0.04226050153374672,
0.1307303011417389,
0.028180912137031555,
0.1066136583685875,
0.039688240736722946,
-0.06894844025373459,
-0.0019914961885660887,
0.08127778768539429,
-0.15871009230613708,
-0.10239270329475403,
-0.007926058024168015,
0.003975837491452694,
-0.1660124659538269,
0.06873509287834167,
0.11717096716165543,
-0.08684448897838593,
-0.016792044043540955,
-0.03460017219185829,
0.026098202913999557,
-0.025918496772646904,
0.21619458496570587,
0.07185874134302139,
0.06809244304895401,
-0.08087675273418427,
0.0169273279607296,
0.07934603095054626,
-0.04524625837802887,
0.04709077998995781,
0.03989415988326073,
-0.13534075021743774,
-0.01954703778028488,
0.07220570743083954,
0.21397829055786133,
-0.023512791842222214,
-0.043052662163972855,
-0.1439499706029892,
-0.09059108048677444,
0.07346243411302567,
0.21052144467830658,
0.06551479548215866,
0.006475046742707491,
-0.008851987309753895,
-0.026119055226445198,
-0.12862727046012878,
0.15015539526939392,
-0.006826671306043863,
0.10483846068382263,
-0.10680817812681198,
0.0876583680510521,
-0.018316814675927162,
0.02505389042198658,
-0.03451132774353027,
0.07005161792039871,
-0.13013194501399994,
-0.038641590625047684,
-0.09648452699184418,
0.01740219071507454,
-0.0420752689242363,
-0.01055125705897808,
-0.02067759819328785,
-0.033215831965208054,
-0.06571771949529648,
0.01616220735013485,
-0.07915185391902924,
-0.03194442763924599,
0.032546259462833405,
0.06676816195249557,
-0.1241220012307167,
-0.014530484564602375,
0.03265386447310448,
-0.08064108341932297,
0.09800416231155396,
0.008858630433678627,
0.04915971681475639,
0.012829761952161789,
-0.04545465484261513,
-0.020137913525104523,
0.0634503960609436,
0.04925718531012535,
0.043880775570869446,
-0.17738276720046997,
-0.016975294798612595,
0.010826606303453445,
0.0032475306652486324,
0.047960638999938965,
0.07225243002176285,
-0.10453581809997559,
-0.016382748261094093,
-0.046025387942790985,
-0.003597237402573228,
-0.05712667107582092,
0.020578086376190186,
0.03647088631987572,
-0.018269391730427742,
0.2013462781906128,
-0.09536582231521606,
-0.01656370237469673,
-0.18268989026546478,
-0.020253071561455727,
-0.011612669564783573,
-0.14460863173007965,
-0.1379411518573761,
0.00790199264883995,
0.07167277485132217,
-0.028191326186060905,
0.13708584010601044,
-0.07157938927412033,
-0.03790033236145973,
0.04699920490384102,
-0.04514418542385101,
-0.0042634084820747375,
0.009130987338721752,
0.1413588970899582,
0.051373355090618134,
-0.04764845222234726,
0.04353643208742142,
-0.016118619590997696,
0.08507177233695984,
0.046751756221055984,
0.1978132128715515,
0.15677037835121155,
-0.02711336687207222,
0.006988299544900656,
0.013113663531839848,
-0.06361114233732224,
-0.17349210381507874,
0.07372914254665375,
-0.03713439777493477,
0.07661916315555573,
0.03212518244981766,
0.09542707353830338,
0.17095349729061127,
-0.17271609604358673,
0.006768183317035437,
-0.03529294952750206,
-0.0851261168718338,
-0.1291559934616089,
-0.07041437923908234,
-0.11498555541038513,
-0.1504754275083542,
-0.0057607111521065235,
-0.13370907306671143,
0.014001824893057346,
0.06366648524999619,
0.02213091403245926,
0.03211045637726784,
0.14927418529987335,
-0.007119824178516865,
0.06543779373168945,
0.041910961270332336,
-0.006317255087196827,
-0.04307451471686363,
-0.05426088348031044,
-0.06278293579816818,
0.032353274524211884,
-0.022565754130482674,
0.0582529678940773,
-0.027748804539442062,
-0.0010453364811837673,
0.03645269200205803,
-0.004346097353845835,
-0.10969360172748566,
0.024501735344529152,
0.005988811142742634,
0.0015022921143099666,
0.03138573467731476,
0.05718915909528732,
-0.015594865195453167,
0.004087990149855614,
0.1680852472782135,
-0.047102101147174835,
-0.05404853820800781,
-0.10661189258098602,
0.22116675972938538,
0.07923615723848343,
0.007281723897904158,
0.011259474791586399,
-0.0552191324532032,
0.023698095232248306,
0.18321764469146729,
0.13914382457733154,
-0.010811992920935154,
-0.007969814352691174,
-0.011162430979311466,
-0.01673971116542816,
0.005839650519192219,
0.05532896891236305,
0.10678950697183609,
-0.02612883597612381,
-0.0385301448404789,
-0.05437812954187393,
-0.0650983676314354,
-0.005968252196907997,
-0.09331117570400238,
0.03604140877723694,
0.06333072483539581,
0.007302708923816681,
-0.0380793958902359,
0.03298075497150421,
-0.037374816834926605,
-0.04482891410589218,
0.049743976444005966,
-0.18864072859287262,
-0.16171620786190033,
-0.04601099342107773,
0.04970138147473335,
-0.013129575178027153,
0.0013411942636594176,
-0.00013398121518548578,
-0.0067864819429814816,
0.08426759392023087,
-0.009759411215782166,
-0.09191425144672394,
-0.06203911453485489,
0.07566435635089874,
-0.09547435492277145,
0.21206027269363403,
-0.008691644296050072,
0.02685946226119995,
0.1364525407552719,
-0.030806254595518112,
-0.12721534073352814,
0.09983114153146744,
0.0335707813501358,
-0.042429860681295395,
0.03514575585722923,
0.13042601943016052,
-0.01726662926375866,
0.11746121197938919,
0.04333306849002838,
-0.09882184118032455,
-0.04185185953974724,
-0.03128831461071968,
-0.0357615128159523,
-0.051236264407634735,
0.01710568368434906,
-0.01689504273235798,
0.12752871215343475,
0.159420907497406,
-0.06591439247131348,
0.047986794263124466,
-0.0480249784886837,
0.04025576263666153,
0.08387967944145203,
0.06771916151046753,
-0.006372004747390747,
-0.2726849317550659,
0.03398074582219124,
0.021467121317982674,
0.024910081177949905,
-0.28192219138145447,
-0.08228965848684311,
-0.030302900820970535,
-0.014666137285530567,
-0.09034939855337143,
0.10962414741516113,
0.07679924368858337,
0.042098890990018845,
-0.0772375613451004,
-0.03722313791513443,
-0.07383053004741669,
0.15211300551891327,
-0.14902827143669128,
-0.07809405028820038
] |
null | null | transformers |
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
This modelcard aims to be a base template for new models. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/modelcard_template.md?plain=1).
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
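Pending details from the model author, a minimal sketch is shown below. It assumes the standard 🤗 `transformers` text-generation API and the repository id `FelixChao/Capricorn-7B` listed in this card's metadata; the prompt and generation settings are illustrative only.

```python
# Minimal sketch (assumed usage, not confirmed by the model author):
# load the checkpoint with transformers and run a short greedy generation.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "FelixChao/Capricorn-7B"  # repository id from this card's metadata

tokenizer = AutoTokenizer.from_pretrained(model_id)
# device_map="auto" requires `accelerate`; drop it to load on CPU instead.
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype="auto", device_map="auto")

prompt = "Write one sentence introducing yourself."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output_ids = model.generate(**inputs, max_new_tokens=64, do_sample=False)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```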
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
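As a rough illustration only (hardware type and hours are not reported above), the calculator's estimate boils down to power draw × hours × grid carbon intensity, optionally scaled by data-center efficiency. The numbers below are hypothetical placeholders, not measurements for this model.

```python
# Back-of-envelope CO2eq estimate in the spirit of Lacoste et al. (2019).
# Every value here is a hypothetical placeholder, NOT a reported figure for this model.
gpu_power_kw = 0.3       # e.g. one ~300 W accelerator
hours_used = 24.0        # training wall-clock hours (unknown for this model)
carbon_intensity = 0.4   # kg CO2eq per kWh for the hosting region (varies widely)
pue = 1.2                # data-center power usage effectiveness multiplier

energy_kwh = gpu_power_kw * hours_used * pue
co2eq_kg = energy_kwh * carbon_intensity
print(f"~{energy_kwh:.1f} kWh, ~{co2eq_kg:.1f} kg CO2eq (hypothetical inputs)")
```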
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] | {"license": "apache-2.0"} | text-generation | FelixChao/Capricorn-7B | [
"transformers",
"safetensors",
"mistral",
"text-generation",
"arxiv:1910.09700",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | 2024-02-14T16:55:59+00:00 | [
"1910.09700"
] | [] | TAGS
#transformers #safetensors #mistral #text-generation #arxiv-1910.09700 #license-apache-2.0 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
# Model Card for Model ID
This modelcard aims to be a base template for new models. It has been generated using this raw template.
## Model Details
### Model Description
- Developed by:
- Funded by [optional]:
- Shared by [optional]:
- Model type:
- Language(s) (NLP):
- License:
- Finetuned from model [optional]:
### Model Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Downstream Use [optional]
### Out-of-Scope Use
## Bias, Risks, and Limitations
### Recommendations
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
## Training Details
### Training Data
### Training Procedure
#### Preprocessing [optional]
#### Training Hyperparameters
- Training regime:
#### Speeds, Sizes, Times [optional]
## Evaluation
### Testing Data, Factors & Metrics
#### Testing Data
#### Factors
#### Metrics
### Results
#### Summary
## Model Examination [optional]
## Environmental Impact
Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).
- Hardware Type:
- Hours used:
- Cloud Provider:
- Compute Region:
- Carbon Emitted:
## Technical Specifications [optional]
### Model Architecture and Objective
### Compute Infrastructure
#### Hardware
#### Software
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Model Card Authors [optional]
## Model Card Contact
| [
"# Model Card for Model ID\n\n\n\nThis modelcard aims to be a base template for new models. It has been generated using this raw template.",
"## Model Details",
"### Model Description\n\n\n\n\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:",
"### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Downstream Use [optional]",
"### Out-of-Scope Use",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.",
"## How to Get Started with the Model\n\nUse the code below to get started with the model.",
"## Training Details",
"### Training Data",
"### Training Procedure",
"#### Preprocessing [optional]",
"#### Training Hyperparameters\n\n- Training regime:",
"#### Speeds, Sizes, Times [optional]",
"## Evaluation",
"### Testing Data, Factors & Metrics",
"#### Testing Data",
"#### Factors",
"#### Metrics",
"### Results",
"#### Summary",
"## Model Examination [optional]",
"## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:",
"## Technical Specifications [optional]",
"### Model Architecture and Objective",
"### Compute Infrastructure",
"#### Hardware",
"#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Model Card Authors [optional]",
"## Model Card Contact"
] | [
"TAGS\n#transformers #safetensors #mistral #text-generation #arxiv-1910.09700 #license-apache-2.0 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n",
"# Model Card for Model ID\n\n\n\nThis modelcard aims to be a base template for new models. It has been generated using this raw template.",
"## Model Details",
"### Model Description\n\n\n\n\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:",
"### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Downstream Use [optional]",
"### Out-of-Scope Use",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.",
"## How to Get Started with the Model\n\nUse the code below to get started with the model.",
"## Training Details",
"### Training Data",
"### Training Procedure",
"#### Preprocessing [optional]",
"#### Training Hyperparameters\n\n- Training regime:",
"#### Speeds, Sizes, Times [optional]",
"## Evaluation",
"### Testing Data, Factors & Metrics",
"#### Testing Data",
"#### Factors",
"#### Metrics",
"### Results",
"#### Summary",
"## Model Examination [optional]",
"## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:",
"## Technical Specifications [optional]",
"### Model Architecture and Objective",
"### Compute Infrastructure",
"#### Hardware",
"#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Model Card Authors [optional]",
"## Model Card Contact"
] | [
64,
29,
3,
54,
28,
3,
4,
9,
9,
10,
42,
20,
3,
4,
5,
9,
11,
13,
3,
12,
5,
4,
5,
3,
4,
9,
53,
9,
8,
6,
3,
14,
8,
7,
9,
4
] | [
"passage: TAGS\n#transformers #safetensors #mistral #text-generation #arxiv-1910.09700 #license-apache-2.0 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n# Model Card for Model ID\n\n\n\nThis modelcard aims to be a base template for new models. It has been generated using this raw template.## Model Details### Model Description\n\n\n\n\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Downstream Use [optional]### Out-of-Scope Use## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.## How to Get Started with the Model\n\nUse the code below to get started with the model.## Training Details### Training Data### Training Procedure#### Preprocessing [optional]#### Training Hyperparameters\n\n- Training regime:#### Speeds, Sizes, Times [optional]## Evaluation### Testing Data, Factors & Metrics#### Testing Data#### Factors#### Metrics### Results#### Summary## Model Examination [optional]## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:## Technical Specifications [optional]### Model Architecture and Objective### Compute Infrastructure#### Hardware#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Model Card Authors [optional]## Model Card Contact"
] | [
-0.05098743364214897,
0.18119150400161743,
-0.005546347238123417,
0.017277177423238754,
0.09535548090934753,
0.012573855929076672,
0.06850744038820267,
0.10837393999099731,
-0.018621204420924187,
0.10630100220441818,
0.025171682238578796,
0.08657454699277878,
0.11164019256830215,
0.14977188408374786,
-0.0008030658354982734,
-0.23681508004665375,
0.04359348490834236,
-0.1266832947731018,
-0.02974628657102585,
0.11300726234912872,
0.15117916464805603,
-0.09237026423215866,
0.07796194404363632,
-0.026965906843543053,
-0.009547654539346695,
-0.029042350128293037,
-0.059507694095373154,
-0.045686688274145126,
0.045702628791332245,
0.0660756453871727,
0.06690677255392075,
0.00041688629426062107,
0.09292758256196976,
-0.26813268661499023,
0.02008185349404812,
0.0664597749710083,
-0.0012793500209227204,
0.07728859782218933,
0.056240301579236984,
-0.07440850883722305,
0.09802885353565216,
-0.048399534076452255,
0.1422666758298874,
0.08229760825634003,
-0.090943343937397,
-0.18338368833065033,
-0.09160289168357849,
0.09443580359220505,
0.18546488881111145,
0.05044415965676308,
-0.021347247064113617,
0.08999335765838623,
-0.08516385406255722,
0.005957415793091059,
0.05279853194952011,
-0.07463932782411575,
-0.05264570191502571,
0.06564901769161224,
0.07107672840356827,
0.07033055275678635,
-0.12056542932987213,
-0.023800790309906006,
0.008731144480407238,
0.010739918798208237,
0.07595204561948776,
0.021117212250828743,
0.1447419822216034,
0.03368578106164932,
-0.12334440648555756,
-0.042693156749010086,
0.1282794326543808,
0.04175516963005066,
-0.04885222390294075,
-0.24373401701450348,
-0.02595498040318489,
-0.022665152326226234,
-0.03248772770166397,
-0.03886206075549126,
0.0448467992246151,
0.002830381039530039,
0.09096220880746841,
-0.018628129735589027,
-0.07835695892572403,
-0.029315827414393425,
0.06328904628753662,
0.04903733730316162,
0.02423769235610962,
-0.013522015884518623,
0.005825151689350605,
0.1200474426150322,
0.0931648537516594,
-0.12698234617710114,
-0.0494733527302742,
-0.0625515952706337,
-0.07075663655996323,
-0.04515935108065605,
0.03235740587115288,
0.0397205576300621,
0.051469046622514725,
0.2505846917629242,
0.014960609376430511,
0.05136311799287796,
0.04400661960244179,
0.012981995940208435,
0.05845816805958748,
0.1074749305844307,
-0.05520668625831604,
-0.10785319656133652,
-0.02126893773674965,
0.08356046676635742,
0.015483551658689976,
-0.034644994884729385,
-0.05889322608709335,
0.05063844472169876,
0.01071514468640089,
0.12029755860567093,
0.09420150518417358,
-0.0020220226142555475,
-0.07016023993492126,
-0.0628134235739708,
0.19292974472045898,
-0.16400295495986938,
0.0473443940281868,
0.029314545914530754,
-0.0365741103887558,
-0.001238433294929564,
0.007521297782659531,
0.028401484712958336,
-0.0246182419359684,
0.08293787389993668,
-0.05586033686995506,
-0.03929503634572029,
-0.1104612872004509,
-0.02357238531112671,
0.03238487243652344,
0.02302432991564274,
-0.030460160225629807,
-0.0243829432874918,
-0.08628323674201965,
-0.07036904245615005,
0.09920728206634521,
-0.07587142288684845,
-0.06134931370615959,
-0.021133728325366974,
-0.07577592134475708,
0.025946343317627907,
0.02078121155500412,
0.06897282600402832,
-0.01944863609969616,
0.030176213011145592,
-0.052048422396183014,
0.05514887720346451,
0.09825149923563004,
0.033667098730802536,
-0.0565333254635334,
0.06152923405170441,
-0.23782724142074585,
0.10050790756940842,
-0.05885869637131691,
0.05631159618496895,
-0.15306192636489868,
-0.018819015473127365,
0.04125715419650078,
0.0019174626795575023,
-0.011306731961667538,
0.13673482835292816,
-0.21845147013664246,
-0.023373503237962723,
0.16045258939266205,
-0.09470109641551971,
-0.07761828601360321,
0.05749881640076637,
-0.049321115016937256,
0.11392152309417725,
0.04453955963253975,
-0.020814193412661552,
0.06921925395727158,
-0.13386112451553345,
0.008098425343632698,
-0.03911484777927399,
-0.009868915192782879,
0.14855000376701355,
0.07630777359008789,
-0.07961857318878174,
0.06684968620538712,
0.02708585001528263,
-0.036954548209905624,
-0.04370904341340065,
-0.012701238505542278,
-0.11127448827028275,
0.006379514001309872,
-0.057350050657987595,
0.012217618525028229,
-0.029090026393532753,
-0.09010448306798935,
-0.02476555109024048,
-0.16960756480693817,
-0.021764647215604782,
0.0858255922794342,
-0.008199622854590416,
-0.020143786445260048,
-0.11031179130077362,
0.01924167200922966,
0.03676341474056244,
-0.0024156419094651937,
-0.12995371222496033,
-0.05207955837249756,
0.02718118391931057,
-0.1663770228624344,
0.035771261900663376,
-0.054468683898448944,
0.04823897406458855,
0.031382765620946884,
-0.031815558671951294,
-0.02675204910337925,
0.018448038026690483,
0.0020498076919466257,
-0.010357827879488468,
-0.241690531373024,
-0.026199771091341972,
-0.024984603747725487,
0.16617265343666077,
-0.21149884164333344,
0.03924327343702316,
0.06548555940389633,
0.14468514919281006,
0.00908830389380455,
-0.03872833773493767,
0.00325130857527256,
-0.07682507485151291,
-0.02375245839357376,
-0.06197876110672951,
-0.006265480071306229,
-0.029592575505375862,
-0.056297365576028824,
0.0474017933011055,
-0.16155876219272614,
-0.03318116441369057,
0.0943460464477539,
0.06758622825145721,
-0.13294118642807007,
-0.030522173270583153,
-0.03080666810274124,
-0.04256385564804077,
-0.05002477392554283,
-0.05415209010243416,
0.11267362534999847,
0.056675393134355545,
0.04424947872757912,
-0.06770579516887665,
-0.07564353197813034,
-0.0059568556025624275,
-0.024364864453673363,
-0.021629072725772858,
0.0876426175236702,
0.06839006394147873,
-0.11861623078584671,
0.09229876846075058,
0.10877985507249832,
0.07871770113706589,
0.09374723583459854,
-0.02219315804541111,
-0.08298259973526001,
-0.05079648643732071,
0.03239036351442337,
0.0106898108497262,
0.12816645205020905,
-0.011133784428238869,
0.055088043212890625,
0.039088018238544464,
-0.013070518150925636,
0.017940703779459,
-0.09119090437889099,
0.02961762435734272,
0.030785411596298218,
-0.02225067839026451,
0.0391770638525486,
-0.03700961172580719,
0.01944563537836075,
0.08695637434720993,
0.049476608633995056,
0.04310998693108559,
0.011310328729450703,
-0.0507609024643898,
-0.11596490442752838,
0.16487768292427063,
-0.1310916393995285,
-0.22092364728450775,
-0.15299998223781586,
0.012620022520422935,
0.03313823789358139,
-0.011554974131286144,
0.0014600668800994754,
-0.06272313743829727,
-0.11571827530860901,
-0.0922229140996933,
0.01242272648960352,
0.05040787532925606,
-0.09326481074094772,
-0.059908997267484665,
0.05969405174255371,
0.039082664996385574,
-0.1428091824054718,
0.020056407898664474,
0.05564342439174652,
-0.09713013470172882,
-0.019626006484031677,
0.07774446159601212,
0.06338733434677124,
0.17780159413814545,
0.011641085147857666,
-0.023729432374238968,
0.03950480371713638,
0.22507025301456451,
-0.1325329691171646,
0.11577421426773071,
0.14521914720535278,
-0.0826810896396637,
0.08551780879497528,
0.20404234528541565,
0.043351635336875916,
-0.09732220321893692,
0.03335292264819145,
0.01921760104596615,
-0.022651171311736107,
-0.2431226670742035,
-0.07194231450557709,
-0.0063305203802883625,
-0.07651257514953613,
0.07641167938709259,
0.09124799072742462,
0.08904707431793213,
0.01695016771554947,
-0.09704168140888214,
-0.0876987874507904,
0.06060664728283882,
0.10453283786773682,
0.020114712417125702,
-0.011224123649299145,
0.08347021788358688,
-0.034951116889715195,
0.016350839287042618,
0.08870627731084824,
0.014143294654786587,
0.1766888052225113,
0.06362227350473404,
0.1846485137939453,
0.07776474207639694,
0.07361125946044922,
0.01232505775988102,
0.00872134417295456,
0.01642940193414688,
0.02886364236474037,
-0.003145672148093581,
-0.08890394866466522,
-0.008976506069302559,
0.11655743420124054,
0.05968529358506203,
0.023473775014281273,
0.013177607208490372,
-0.045911915600299835,
0.07631321251392365,
0.19093434512615204,
-0.0009153723949566483,
-0.17904233932495117,
-0.06162286922335625,
0.08616357296705246,
-0.08956354111433029,
-0.10062851011753082,
-0.023238040506839752,
0.03308749198913574,
-0.17246176302433014,
0.02651992067694664,
-0.018314840272068977,
0.1141413077712059,
-0.13121241331100464,
-0.022874070331454277,
0.0691872388124466,
0.07217199355363846,
0.003470273455604911,
0.06070614606142044,
-0.1480747014284134,
0.1045488715171814,
0.014614338986575603,
0.07220901548862457,
-0.08818291872739792,
0.10457614064216614,
-0.0062637836672365665,
-0.0010398438898846507,
0.12917250394821167,
0.012704622000455856,
-0.08315494656562805,
-0.07081159949302673,
-0.08896414935588837,
-0.005403646733611822,
0.1177091896533966,
-0.14973147213459015,
0.0827237069606781,
-0.039415400475263596,
-0.04000493884086609,
0.0012276272755116224,
-0.11071229726076126,
-0.1310325264930725,
-0.19558793306350708,
0.05247398838400841,
-0.12856444716453552,
0.04142291471362114,
-0.10432637482881546,
-0.02850436232984066,
-0.01316818967461586,
0.18452438712120056,
-0.23713962733745575,
-0.07338899374008179,
-0.14616690576076508,
-0.10958345979452133,
0.15083934366703033,
-0.05047786235809326,
0.08364177495241165,
-0.009070590138435364,
0.17759136855602264,
0.02192075550556183,
-0.02559496834874153,
0.09046733379364014,
-0.09006041288375854,
-0.18359091877937317,
-0.07509500533342361,
0.1547618806362152,
0.1364850252866745,
0.03137340396642685,
-0.00009236243204213679,
0.03403643146157265,
-0.025934826582670212,
-0.12580923736095428,
0.012445051223039627,
0.17905670404434204,
0.07442319393157959,
0.02061055228114128,
-0.03987543284893036,
-0.11310092359781265,
-0.06931383907794952,
-0.026115750893950462,
0.03206980600953102,
0.18743392825126648,
-0.06983482837677002,
0.18250030279159546,
0.1439635306596756,
-0.06464575976133347,
-0.1876271665096283,
0.01272567454725504,
0.03231727331876755,
0.007271517999470234,
0.03296826779842377,
-0.20840111374855042,
0.09315025806427002,
0.005137091036885977,
-0.05071651563048363,
0.1370120495557785,
-0.17887669801712036,
-0.14646044373512268,
0.07223373651504517,
0.028904680162668228,
-0.19215600192546844,
-0.11724954843521118,
-0.08848059177398682,
-0.05218749865889549,
-0.17890824377536774,
0.0928690955042839,
0.035659998655319214,
0.006328529212623835,
0.029228543862700462,
0.04099976271390915,
0.017570940777659416,
-0.034000616520643234,
0.1953754723072052,
-0.025309110060334206,
0.03348270058631897,
-0.08410539478063583,
-0.0634893923997879,
0.04695089906454086,
-0.057213250547647476,
0.08370446413755417,
-0.029913514852523804,
0.014560804702341557,
-0.10276637226343155,
-0.038615114986896515,
-0.028055282309651375,
0.01990450732409954,
-0.09439650923013687,
-0.08324754238128662,
-0.04971373453736305,
0.09117461740970612,
0.09475960582494736,
-0.034476231783628464,
-0.025337999686598778,
-0.07224450260400772,
0.050935011357069016,
0.17326311767101288,
0.1753956526517868,
0.04247729107737541,
-0.0770951434969902,
0.00020656864217016846,
-0.009295777417719364,
0.04424283653497696,
-0.20921464264392853,
0.06171146035194397,
0.04861802980303764,
0.015397347509860992,
0.11190910637378693,
-0.02175346575677395,
-0.1547207087278366,
-0.06734275072813034,
0.0655275210738182,
-0.05564737319946289,
-0.19119581580162048,
-0.0015418418915942311,
0.05998261272907257,
-0.1725488007068634,
-0.04902401193976402,
0.043246448040008545,
-0.004179911222308874,
-0.04368313401937485,
0.019003113731741905,
0.09280961006879807,
-0.004413523245602846,
0.06393697112798691,
0.05462450534105301,
0.07884405553340912,
-0.09972895681858063,
0.07550083100795746,
0.08475533127784729,
-0.06779135018587112,
0.021509729325771332,
0.09533023089170456,
-0.05755164474248886,
-0.032326262444257736,
0.03258265554904938,
0.07859939336776733,
0.012596184387803078,
-0.041238002479076385,
0.01913696900010109,
-0.09541146457195282,
0.05581831559538841,
0.07808537036180496,
0.03262934461236,
0.011093069799244404,
0.03359393775463104,
0.03784278780221939,
-0.0644177570939064,
0.11518306285142899,
0.02587002143263817,
0.014913653023540974,
-0.037573423236608505,
-0.05676570162177086,
0.021193647757172585,
-0.033675044775009155,
-0.00463699409738183,
-0.038403820246458054,
-0.06861646473407745,
-0.019439343363046646,
-0.16266533732414246,
-0.012191741727292538,
-0.05826016142964363,
0.013241472654044628,
0.03000813163816929,
-0.03695882856845856,
0.004327703267335892,
0.007653922773897648,
-0.07277437299489975,
-0.06814969331026077,
-0.028041290119290352,
0.09671425074338913,
-0.1584416776895523,
0.0319603756070137,
0.08783707767724991,
-0.11725916713476181,
0.08885382115840912,
0.016516920179128647,
-0.00037986721144989133,
0.030176682397723198,
-0.15937243402004242,
0.03337664157152176,
-0.03232354298233986,
0.012355729006230831,
0.04085836187005043,
-0.22666671872138977,
-0.001465921988710761,
-0.030821116641163826,
-0.060066789388656616,
-0.012130483984947205,
-0.020190633833408356,
-0.11788560450077057,
0.09418036788702011,
0.010323213413357735,
-0.09105202555656433,
-0.034208908677101135,
0.03431219235062599,
0.09318374842405319,
-0.02494099922478199,
0.15748997032642365,
-0.004250435158610344,
0.07382263988256454,
-0.16479109227657318,
-0.019081395119428635,
-0.012243317440152168,
0.0292672086507082,
-0.02226962335407734,
-0.014464865438640118,
0.034872230142354965,
-0.020914284512400627,
0.17916569113731384,
-0.027987400069832802,
0.02451128326356411,
0.06518793851137161,
0.02345157414674759,
-0.018241364508867264,
0.10278960317373276,
0.056563593447208405,
0.02294408157467842,
0.018069501966238022,
0.010073256678879261,
-0.038073886185884476,
-0.03105715662240982,
-0.2051125019788742,
0.06024419143795967,
0.14490185678005219,
0.0803639218211174,
-0.013404328376054764,
0.07971688359975815,
-0.09786573052406311,
-0.10358753800392151,
0.12096197158098221,
-0.037024762481451035,
-0.01178735215216875,
-0.06821750849485397,
0.12925268709659576,
0.15778256952762604,
-0.1957709789276123,
0.07497909665107727,
-0.060612305998802185,
-0.046610940247774124,
-0.12327694147825241,
-0.20910800993442535,
-0.05508040264248848,
-0.047508079558610916,
-0.02215556427836418,
-0.05385427549481392,
0.06967004388570786,
0.05055632442235947,
0.0030300726648420095,
-0.002720841206610203,
0.055767983198165894,
-0.022411813959479332,
-0.013799558393657207,
0.028742652386426926,
0.06666351854801178,
0.022012729197740555,
-0.03296925500035286,
0.019523845985531807,
-0.0068005952052772045,
0.041506461799144745,
0.06027986854314804,
0.05017794668674469,
-0.02476835623383522,
0.015370680019259453,
-0.04556899890303612,
-0.10415440052747726,
0.04394756630063057,
-0.02317158132791519,
-0.07887127995491028,
0.14471043646335602,
0.026771942153573036,
0.0015075445408001542,
-0.017098277807235718,
0.24631071090698242,
-0.07347014546394348,
-0.09652724862098694,
-0.15004828572273254,
0.10078779608011246,
-0.042118173092603683,
0.061743881553411484,
0.04199874401092529,
-0.10357179492712021,
0.019095564261078835,
0.11929444968700409,
0.1714634746313095,
-0.04140309989452362,
0.019019711762666702,
0.02337595634162426,
0.0031649600714445114,
-0.036799654364585876,
0.04888617619872093,
0.06792740523815155,
0.1605997532606125,
-0.049703408032655716,
0.10666686296463013,
-0.0016597626963630319,
-0.089536651968956,
-0.04966892674565315,
0.11527734249830246,
-0.025868743658065796,
0.013937955722212791,
-0.0522877499461174,
0.13030187785625458,
-0.059832725673913956,
-0.21440553665161133,
0.05762794613838196,
-0.060670070350170135,
-0.14221197366714478,
-0.022004295140504837,
0.07185368239879608,
-0.016225198283791542,
0.02670050971210003,
0.069224514067173,
-0.07622170448303223,
0.20599152147769928,
0.038055576384067535,
-0.054234836250543594,
-0.06403503566980362,
0.08333315700292587,
-0.10256686806678772,
0.27284157276153564,
0.020483549684286118,
0.043097127228975296,
0.1018582135438919,
-0.011639008298516273,
-0.13639964163303375,
0.01857207715511322,
0.09395286440849304,
-0.08456414937973022,
0.044102296233177185,
0.19214273989200592,
-0.0023759787436574697,
0.12534835934638977,
0.08313769847154617,
-0.0665183737874031,
0.04617252200841904,
-0.09977047890424728,
-0.06693964451551437,
-0.0931321457028389,
0.09399199485778809,
-0.08075552433729172,
0.14307093620300293,
0.1299404501914978,
-0.0568833202123642,
0.009563712403178215,
-0.027316907420754433,
0.04724361002445221,
0.009638158604502678,
0.09812209010124207,
0.008474581874907017,
-0.1704498678445816,
0.019718635827302933,
0.01983269676566124,
0.10463733971118927,
-0.15966567397117615,
-0.09304337203502655,
0.04343968629837036,
0.003504045307636261,
-0.059013694524765015,
0.1283906102180481,
0.05894576758146286,
0.043327439576387405,
-0.041388269513845444,
-0.03907050937414169,
-0.007298397831618786,
0.13337571918964386,
-0.1050582155585289,
-0.002106369473040104
] |
null | null | transformers |
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
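Pending details from the model author, a minimal loading sketch is shown below. It assumes, from the repository name alone and not from anything stated in this card, that the repo holds a PEFT LoRA adapter trained on top of `facebook/dinov2-base`; verify the repository contents before relying on this.

```python
# Minimal sketch under the ASSUMPTION that this repo stores a PEFT LoRA adapter
# for the DINOv2 base backbone (inferred from the repo name, not stated in the card).
from transformers import AutoImageProcessor, AutoModel
from peft import PeftModel
from PIL import Image

base_id = "facebook/dinov2-base"  # assumed base backbone
adapter_id = "tyemel/dinov2-base-finetuned-lora-dino_genre_augmentation"

processor = AutoImageProcessor.from_pretrained(base_id)
base_model = AutoModel.from_pretrained(base_id)
model = PeftModel.from_pretrained(base_model, adapter_id)

image = Image.open("example.jpg")  # placeholder image path
inputs = processor(images=image, return_tensors="pt")
features = model(**inputs).last_hidden_state  # patch/CLS features from the adapted backbone
```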
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed]
| {"library_name": "transformers", "tags": []} | null | tyemel/dinov2-base-finetuned-lora-dino_genre_augmentation | [
"transformers",
"safetensors",
"arxiv:1910.09700",
"endpoints_compatible",
"region:us"
] | 2024-02-14T16:56:17+00:00 | [
"1910.09700"
] | [] | TAGS
#transformers #safetensors #arxiv-1910.09700 #endpoints_compatible #region-us
|
# Model Card for Model ID
## Model Details
### Model Description
This is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.
- Developed by:
- Funded by [optional]:
- Shared by [optional]:
- Model type:
- Language(s) (NLP):
- License:
- Finetuned from model [optional]:
### Model Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Downstream Use [optional]
### Out-of-Scope Use
## Bias, Risks, and Limitations
### Recommendations
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
## Training Details
### Training Data
### Training Procedure
#### Preprocessing [optional]
#### Training Hyperparameters
- Training regime:
#### Speeds, Sizes, Times [optional]
## Evaluation
### Testing Data, Factors & Metrics
#### Testing Data
#### Factors
#### Metrics
### Results
#### Summary
## Model Examination [optional]
## Environmental Impact
Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).
- Hardware Type:
- Hours used:
- Cloud Provider:
- Compute Region:
- Carbon Emitted:
## Technical Specifications [optional]
### Model Architecture and Objective
### Compute Infrastructure
#### Hardware
#### Software
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Model Card Authors [optional]
## Model Card Contact
| [
"# Model Card for Model ID",
"## Model Details",
"### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:",
"### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Downstream Use [optional]",
"### Out-of-Scope Use",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.",
"## How to Get Started with the Model\n\nUse the code below to get started with the model.",
"## Training Details",
"### Training Data",
"### Training Procedure",
"#### Preprocessing [optional]",
"#### Training Hyperparameters\n\n- Training regime:",
"#### Speeds, Sizes, Times [optional]",
"## Evaluation",
"### Testing Data, Factors & Metrics",
"#### Testing Data",
"#### Factors",
"#### Metrics",
"### Results",
"#### Summary",
"## Model Examination [optional]",
"## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:",
"## Technical Specifications [optional]",
"### Model Architecture and Objective",
"### Compute Infrastructure",
"#### Hardware",
"#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Model Card Authors [optional]",
"## Model Card Contact"
] | [
"TAGS\n#transformers #safetensors #arxiv-1910.09700 #endpoints_compatible #region-us \n",
"# Model Card for Model ID",
"## Model Details",
"### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:",
"### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Downstream Use [optional]",
"### Out-of-Scope Use",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.",
"## How to Get Started with the Model\n\nUse the code below to get started with the model.",
"## Training Details",
"### Training Data",
"### Training Procedure",
"#### Preprocessing [optional]",
"#### Training Hyperparameters\n\n- Training regime:",
"#### Speeds, Sizes, Times [optional]",
"## Evaluation",
"### Testing Data, Factors & Metrics",
"#### Testing Data",
"#### Factors",
"#### Metrics",
"### Results",
"#### Summary",
"## Model Examination [optional]",
"## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:",
"## Technical Specifications [optional]",
"### Model Architecture and Objective",
"### Compute Infrastructure",
"#### Hardware",
"#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Model Card Authors [optional]",
"## Model Card Contact"
] | [
31,
6,
3,
82,
28,
3,
4,
9,
9,
10,
42,
20,
3,
4,
5,
9,
11,
13,
3,
12,
5,
4,
5,
3,
4,
9,
53,
9,
8,
6,
3,
14,
8,
7,
9,
4
] | [
"passage: TAGS\n#transformers #safetensors #arxiv-1910.09700 #endpoints_compatible #region-us \n# Model Card for Model ID## Model Details### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Downstream Use [optional]### Out-of-Scope Use## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.## How to Get Started with the Model\n\nUse the code below to get started with the model.## Training Details### Training Data### Training Procedure#### Preprocessing [optional]#### Training Hyperparameters\n\n- Training regime:#### Speeds, Sizes, Times [optional]## Evaluation### Testing Data, Factors & Metrics#### Testing Data#### Factors#### Metrics### Results#### Summary## Model Examination [optional]## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:## Technical Specifications [optional]### Model Architecture and Objective### Compute Infrastructure#### Hardware#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Model Card Authors [optional]## Model Card Contact"
] | [
-0.06646376848220825,
0.2168014943599701,
-0.00225935154594481,
0.023818302899599075,
0.1271018385887146,
-0.001635765191167593,
0.04218708351254463,
0.13324736058712006,
-0.020175931975245476,
0.11144465953111649,
0.046588581055402756,
0.09377603232860565,
0.09928803145885468,
0.18404334783554077,
0.04859916493296623,
-0.2059975117444992,
0.007056170143187046,
-0.09090408682823181,
0.014076028019189835,
0.1116579994559288,
0.13719257712364197,
-0.10291384905576706,
0.08272874355316162,
-0.04045208916068077,
-0.02019004337489605,
0.00012576708104461432,
-0.09259183704853058,
-0.07032395154237747,
0.06885425746440887,
0.06264153122901917,
0.051234472543001175,
0.001456156256608665,
0.09140396863222122,
-0.2864592671394348,
0.017265573143959045,
0.08406311273574829,
0.0027674848679453135,
0.06290827691555023,
0.07236549258232117,
-0.07389893382787704,
0.11328595131635666,
-0.08021481335163116,
0.13019037246704102,
0.08625296503305435,
-0.062064990401268005,
-0.23071379959583282,
-0.07525765895843506,
0.0963398814201355,
0.12251301854848862,
0.06215599179267883,
-0.022921854630112648,
0.15455181896686554,
-0.06248689442873001,
0.012971068732440472,
0.1294165402650833,
-0.11526761949062347,
-0.05572471022605896,
0.061741601675748825,
0.11775490641593933,
0.10740239918231964,
-0.14110268652439117,
-0.0017287094378843904,
0.04900608956813812,
0.029121357947587967,
0.08589313924312592,
0.022661056369543076,
0.12003941088914871,
0.04652795568108559,
-0.13695219159126282,
-0.04037507623434067,
0.12011898308992386,
0.038862764835357666,
-0.06446044892072678,
-0.2168138176202774,
-0.006778308190405369,
-0.0601806715130806,
-0.014732478186488152,
-0.07019448280334473,
0.039128515869379044,
-0.02470310963690281,
0.07317749410867691,
-0.04465159401297569,
-0.1063927412033081,
-0.0421026237308979,
0.0892222449183464,
0.07748593389987946,
0.011527054943144321,
-0.02519804798066616,
0.04627908393740654,
0.13455867767333984,
0.05402068421244621,
-0.10399353504180908,
-0.07017925381660461,
-0.06942764669656754,
-0.09420394152402878,
-0.04035796597599983,
0.056760527193546295,
0.031942449510097504,
0.02665667235851288,
0.22703726589679718,
0.016653569415211678,
0.04155244305729866,
0.0224777739495039,
0.01032855175435543,
0.043662428855895996,
0.0955500528216362,
-0.05303520709276199,
-0.15660029649734497,
-0.04072032496333122,
0.09077946096658707,
-0.0027527001220732927,
-0.036689214408397675,
-0.03966725245118141,
0.03849169611930847,
0.06843466311693192,
0.13122352957725525,
0.07552056759595871,
-0.017929591238498688,
-0.04813180863857269,
-0.030096933245658875,
0.23523783683776855,
-0.1493375599384308,
0.04426715523004532,
-0.02271856553852558,
-0.01804111897945404,
-0.03908449783921242,
0.03597262129187584,
0.022118929773569107,
-0.000004518366949923802,
0.09706240892410278,
-0.058981191366910934,
-0.05378659814596176,
-0.10168042778968811,
-0.03272576630115509,
0.04088849574327469,
-0.013975566253066063,
-0.010589460842311382,
-0.09025166928768158,
-0.09490354359149933,
-0.04766594246029854,
0.05537205561995506,
-0.05123869329690933,
-0.03770573064684868,
0.009465423412621021,
-0.08151785284280777,
-0.005444355774670839,
-0.005417742300778627,
0.10699385404586792,
-0.03222226724028587,
0.04445803165435791,
-0.027600755915045738,
0.05225523188710213,
0.09919606149196625,
0.031576547771692276,
-0.0773419588804245,
0.0561848059296608,
-0.22559374570846558,
0.07503069192171097,
-0.11481974273920059,
0.04335082694888115,
-0.1704932004213333,
-0.042439818382263184,
0.005444696638733149,
0.0139949731528759,
0.013206101022660732,
0.12720820307731628,
-0.19255615770816803,
-0.01654396951198578,
0.13260798156261444,
-0.09212633967399597,
-0.118110790848732,
0.07884611934423447,
-0.029701577499508858,
0.1624738723039627,
0.04682036489248276,
-0.027025915682315826,
0.09224298596382141,
-0.16434773802757263,
-0.07092688232660294,
-0.00949116237461567,
-0.01727987825870514,
0.12109188735485077,
0.07512219995260239,
-0.05991523340344429,
0.046571120619773865,
0.02832140028476715,
-0.038078423589468,
-0.04424772411584854,
-0.050857074558734894,
-0.10884185880422592,
-0.01070026308298111,
-0.08987759798765182,
0.04065500199794769,
-0.01250192429870367,
-0.07916021347045898,
-0.029885273426771164,
-0.18612512946128845,
-0.0030564051121473312,
0.10038342326879501,
0.0035033065360039473,
-0.005652366206049919,
-0.08666291832923889,
0.026358824223279953,
-0.03112892620265484,
-0.008404186926782131,
-0.16764774918556213,
-0.04399421438574791,
0.046902090311050415,
-0.16094985604286194,
0.020117372274398804,
-0.06413903087377548,
0.06334125250577927,
0.03641495108604431,
-0.05590536445379257,
-0.0248766727745533,
-0.01730942726135254,
0.011945613659918308,
-0.05083848536014557,
-0.18994836509227753,
-0.056277405470609665,
-0.037882111966609955,
0.149809330701828,
-0.25956398248672485,
0.032966937869787216,
0.051140617579221725,
0.14649195969104767,
0.00406361510977149,
-0.05115427449345589,
0.01429014839231968,
-0.05360214412212372,
-0.054652128368616104,
-0.06746816635131836,
-0.006135428790003061,
-0.027576493099331856,
-0.05147203803062439,
0.019243421033024788,
-0.1755700707435608,
-0.021410830318927765,
0.09424154460430145,
0.12876708805561066,
-0.1486445665359497,
-0.018640631809830666,
-0.048725154250860214,
-0.06339836865663528,
-0.0715010017156601,
-0.07038594037294388,
0.10712739825248718,
0.0513901449739933,
0.04796046018600464,
-0.07435787469148636,
-0.07092321664094925,
0.02726263552904129,
0.006906150374561548,
-0.03382374346256256,
0.08727246522903442,
0.05199531093239784,
-0.09209315478801727,
0.0756213590502739,
0.1092359870672226,
0.07177663594484329,
0.09363535046577454,
0.01574566215276718,
-0.11756632477045059,
-0.028492970392107964,
0.036266472190618515,
0.02740776725113392,
0.1465986967086792,
-0.05952361226081848,
0.04016614332795143,
0.04494241625070572,
-0.04170418903231621,
0.022319864481687546,
-0.08787637203931808,
0.024075502529740334,
0.025203049182891846,
-0.0034381982404738665,
0.06284574419260025,
-0.02525499276816845,
-0.0050758360885083675,
0.07016654312610626,
0.047779910266399384,
0.04621000960469246,
0.009655474685132504,
-0.01720241829752922,
-0.1047825813293457,
0.16950392723083496,
-0.0951867327094078,
-0.269941508769989,
-0.17632324993610382,
0.026197833940386772,
0.04035249724984169,
-0.022378476336598396,
0.031619444489479065,
-0.07056326419115067,
-0.10630585998296738,
-0.1060405746102333,
-0.002429972169920802,
0.01714223250746727,
-0.06364088505506516,
-0.0741225928068161,
0.07348573952913284,
0.04382912442088127,
-0.14902326464653015,
0.038552410900592804,
0.055694397538900375,
-0.057955220341682434,
-0.0233661737293005,
0.09118817001581192,
0.12397737801074982,
0.14583967626094818,
-0.021366750821471214,
-0.028626007959246635,
0.029004426673054695,
0.19620531797409058,
-0.13469526171684265,
0.10371150821447372,
0.13814030587673187,
-0.04545360431075096,
0.08360563963651657,
0.1560150384902954,
0.029186224564909935,
-0.08317049592733383,
0.05044832453131676,
0.04082648828625679,
-0.043159641325473785,
-0.2666129767894745,
-0.0534592866897583,
0.012832709588110447,
-0.06255637854337692,
0.09786593168973923,
0.10183793306350708,
0.11542957276105881,
0.034910861402750015,
-0.07166364789009094,
-0.043925940990448,
-0.0058974819257855415,
0.11737963557243347,
-0.05490213260054588,
-0.012639665976166725,
0.07686592638492584,
-0.05086168646812439,
0.005355054512619972,
0.10266812145709991,
0.02973790094256401,
0.17442677915096283,
0.020399179309606552,
0.11231429129838943,
0.06195578724145889,
0.08633565157651901,
0.0007386076031252742,
0.02951662428677082,
0.05147615820169449,
0.017203815281391144,
-0.002300140680745244,
-0.10421168059110641,
-0.006156572140753269,
0.1449710875749588,
0.028103826567530632,
0.029669636860489845,
-0.0018948549404740334,
-0.005003341939300299,
0.05121048167347908,
0.1746254414319992,
-0.011592294089496136,
-0.22072425484657288,
-0.0845772922039032,
0.06936841458082199,
-0.06218599155545235,
-0.12968985736370087,
-0.026130788028240204,
0.045467354357242584,
-0.17519839107990265,
0.026703642681241035,
-0.027433741837739944,
0.0919293761253357,
-0.09345759451389313,
-0.02221956104040146,
0.03687324374914169,
0.084866963326931,
-0.014529162086546421,
0.08703910559415817,
-0.14498743414878845,
0.11886418610811234,
0.02978132851421833,
0.09024628251791,
-0.11081171780824661,
0.07909037172794342,
-0.007550720125436783,
0.009180475026369095,
0.19379350543022156,
-0.011335089802742004,
-0.03514958545565605,
-0.08774717897176743,
-0.11210042238235474,
-0.013537433929741383,
0.12687496840953827,
-0.1243172138929367,
0.08773399889469147,
-0.015198243781924248,
-0.044079482555389404,
0.00937260314822197,
-0.12100647389888763,
-0.17273177206516266,
-0.19628387689590454,
0.05585884302854538,
-0.09575839340686798,
0.025643249973654747,
-0.11914430558681488,
-0.07089093327522278,
-0.02952558360993862,
0.241120383143425,
-0.1745356321334839,
-0.06510113179683685,
-0.1468164622783661,
-0.046294767409563065,
0.1662203073501587,
-0.04437198117375374,
0.0718095526099205,
-0.0208172257989645,
0.20345525443553925,
0.005988610442727804,
-0.004939318168908358,
0.06724198162555695,
-0.08892562240362167,
-0.16873881220817566,
-0.06771010160446167,
0.1510489284992218,
0.11680185794830322,
0.04907919466495514,
-0.002248800592496991,
0.0011772146681323647,
-0.016943959519267082,
-0.1137804463505745,
-0.0033210667315870523,
0.16037839651107788,
0.03878779336810112,
0.025986969470977783,
-0.05243593826889992,
-0.08797456324100494,
-0.06899320334196091,
-0.06853509694337845,
0.06221301481127739,
0.19590823352336884,
-0.10376439243555069,
0.1700313836336136,
0.147536963224411,
-0.07305635511875153,
-0.23175598680973053,
0.035342130810022354,
0.04983805492520332,
0.0014306638622656465,
0.04886869341135025,
-0.18252557516098022,
0.10521943867206573,
0.019543392583727837,
-0.05505957826972008,
0.13485197722911835,
-0.1557481735944748,
-0.1552847921848297,
0.0722852572798729,
0.03904085233807564,
-0.22423844039440155,
-0.1354004591703415,
-0.09622503817081451,
-0.05825018882751465,
-0.14065024256706238,
0.06054598465561867,
-0.002136280992999673,
0.015948504209518433,
0.03500790148973465,
-0.0015643214574083686,
0.027123261243104935,
-0.058935679495334625,
0.18609118461608887,
-0.004065449349582195,
0.020676052197813988,
-0.060264769941568375,
-0.0478842556476593,
0.09839435666799545,
-0.06130504235625267,
0.12208222597837448,
0.004057085141539574,
0.01594383642077446,
-0.10362856835126877,
-0.048314861953258514,
-0.04328322783112526,
0.05154227837920189,
-0.07548051327466965,
-0.10070807486772537,
-0.043625857681035995,
0.08841723203659058,
0.07005169242620468,
-0.03383097052574158,
0.00549331633374095,
-0.07189501076936722,
0.10019614547491074,
0.17795267701148987,
0.17573626339435577,
0.009926567785441875,
-0.07241068035364151,
0.01677953451871872,
-0.04142116755247116,
0.044231921434402466,
-0.2513144314289093,
0.03756171092391014,
0.06098250672221184,
0.029438555240631104,
0.09217222779989243,
-0.020435843616724014,
-0.1820858269929886,
-0.04050002992153168,
0.08094815909862518,
-0.05452597141265869,
-0.22617179155349731,
-0.019085140898823738,
0.0954197570681572,
-0.2020406424999237,
-0.007372708059847355,
0.03995226323604584,
-0.048725228756666183,
-0.023169852793216705,
0.00010950004070764408,
0.06317184865474701,
0.002471912419423461,
0.09773622453212738,
0.0735151618719101,
0.09715340286493301,
-0.08337292820215225,
0.10562895983457565,
0.10150538384914398,
-0.09572599828243256,
0.03605884686112404,
0.06754924356937408,
-0.05300498008728027,
-0.043293699622154236,
0.03665391728281975,
0.033023297786712646,
0.005234600510448217,
-0.060321882367134094,
0.013913018628954887,
-0.036497246474027634,
0.044923391193151474,
0.08326134830713272,
0.03754979372024536,
-0.013354414142668247,
0.06462216377258301,
0.03401726484298706,
-0.10898099094629288,
0.10366570204496384,
0.01731540448963642,
0.04105307161808014,
-0.08384523540735245,
-0.019968897104263306,
0.035425446927547455,
0.030576206743717194,
-0.01765924133360386,
-0.02306121215224266,
-0.02860277332365513,
-0.01614218018949032,
-0.14299540221691132,
-0.023106401786208153,
-0.07243485748767853,
0.006181265693157911,
0.014656842686235905,
-0.031884219497442245,
-0.011233693920075893,
0.02475680410861969,
-0.06979699432849884,
-0.07426341623067856,
-0.006949664559215307,
0.09833318740129471,
-0.15115703642368317,
0.008848577737808228,
0.06907843053340912,
-0.11088496446609497,
0.08190931379795074,
-0.008411259390413761,
0.016245156526565552,
0.022527478635311127,
-0.15448406338691711,
0.05601610988378525,
0.0008648968650959432,
0.01916889287531376,
0.025886621326208115,
-0.16471809148788452,
0.004104440100491047,
-0.04661374166607857,
-0.02149827405810356,
-0.00004464812809601426,
-0.02647159807384014,
-0.12325995415449142,
0.06858719140291214,
-0.015622655861079693,
-0.035931166261434555,
-0.02701525390148163,
0.0539589487016201,
0.07888586074113846,
-0.027474910020828247,
0.10445091128349304,
-0.008690856397151947,
0.04941811040043831,
-0.16801609098911285,
-0.02470702864229679,
-0.04982255399227142,
0.019377702847123146,
0.009884213097393513,
-0.007693959400057793,
0.04183054715394974,
-0.00976533442735672,
0.21883612871170044,
-0.05075952783226967,
0.1607085019350052,
0.05847611650824547,
-0.017352959141135216,
-0.0007513365126214921,
0.06180921941995621,
0.05997028574347496,
0.04658793285489082,
0.009480604901909828,
0.023740366101264954,
-0.022450892254710197,
-0.006695089396089315,
-0.15932634472846985,
0.01890849508345127,
0.14999441802501678,
0.06301083415746689,
0.024745315313339233,
0.05866100639104843,
-0.12775006890296936,
-0.12135478109121323,
0.09311001747846603,
-0.026755332946777344,
0.00928465835750103,
-0.08245618641376495,
0.1358020007610321,
0.14980104565620422,
-0.14000412821769714,
0.05256148427724838,
-0.06134212389588356,
-0.05217423290014267,
-0.10388828068971634,
-0.12032219022512436,
-0.05887215584516525,
-0.053666237741708755,
0.002330566756427288,
-0.03760887682437897,
0.054546963423490524,
0.03344334661960602,
-0.009351172484457493,
-0.00022941511997487396,
0.13597318530082703,
-0.019751882180571556,
-0.0028988157864660025,
0.048313532024621964,
0.03693558648228645,
0.02373051457107067,
-0.05275435373187065,
0.02940409444272518,
0.02539868652820587,
0.032232340425252914,
0.06546790152788162,
0.033412106335163116,
-0.047448933124542236,
0.03804153576493263,
-0.0025254099164158106,
-0.11207924783229828,
0.019641218706965446,
-0.00460948096588254,
-0.0742158442735672,
0.1268945336341858,
0.0407399944961071,
0.010224059224128723,
-0.03741471841931343,
0.24361543357372284,
-0.06653323769569397,
-0.06378097087144852,
-0.13251738250255585,
0.10491154342889786,
-0.0027236645109951496,
0.06476365029811859,
0.023412218317389488,
-0.1284150779247284,
0.005243356805294752,
0.13858191668987274,
0.12181595712900162,
0.0045748427510261536,
0.009228081442415714,
0.0518609918653965,
0.0025186820421367884,
-0.06998204439878464,
0.054019294679164886,
0.06992026418447495,
0.12919506430625916,
-0.07847554981708527,
0.07680778950452805,
0.0006860480643808842,
-0.08370215445756912,
-0.02947772853076458,
0.11312682181596756,
-0.0409729965031147,
0.03491825982928276,
-0.047444481402635574,
0.10916327685117722,
-0.05787910893559456,
-0.29412412643432617,
0.02350960113108158,
-0.09588567912578583,
-0.15202060341835022,
-0.018367812037467957,
0.05944539234042168,
-0.02624768204987049,
0.018029648810625076,
0.06971040368080139,
-0.06011629104614258,
0.20098382234573364,
0.0335683599114418,
-0.07864278554916382,
-0.0664360448718071,
0.04837050288915634,
-0.06564252078533173,
0.2949807047843933,
0.008418165147304535,
0.02863333560526371,
0.10770907253026962,
-0.03253700211644173,
-0.18271861970424652,
0.010723991319537163,
0.1133992001414299,
-0.08056149631738663,
0.08200647681951523,
0.19000613689422607,
-0.012578671798110008,
0.1209007054567337,
0.05294662341475487,
-0.047376248985528946,
0.04217283055186272,
-0.03389401361346245,
-0.051268599927425385,
-0.10752558708190918,
0.058453381061553955,
-0.05909625440835953,
0.15447644889354706,
0.10152646154165268,
-0.05671518296003342,
-0.004550917539745569,
-0.05555408447980881,
0.04875178262591362,
0.01804669201374054,
0.12263146042823792,
0.02951994352042675,
-0.1865430772304535,
0.032826557755470276,
-0.01144319772720337,
0.10186848044395447,
-0.25588861107826233,
-0.08421015739440918,
0.08833149075508118,
-0.011924264021217823,
-0.05105875805020332,
0.10560628771781921,
0.057650718837976456,
0.04243382066488266,
-0.043439045548439026,
-0.10480839014053345,
-0.02186836116015911,
0.14663739502429962,
-0.1469624787569046,
-0.025013303384184837
] |
null | null | transformers |
# OpenBuddy - Open Multilingual Chatbot
GitHub and Usage Guide: [https://github.com/OpenBuddy/OpenBuddy](https://github.com/OpenBuddy/OpenBuddy)
Website and Demo: [https://openbuddy.ai](https://openbuddy.ai)
Evaluation result of this model: [Evaluation.txt](Evaluation.txt)

# Copyright Notice
Base model: https://huggingface.co/mistralai/Mixtral-8x7B-v0.1
License: Apache 2.0
## Disclaimer
All OpenBuddy models have inherent limitations and may potentially produce outputs that are erroneous, harmful, offensive, or otherwise undesirable. Users should not use these models in critical or high-stakes situations that may lead to personal injury, property damage, or significant losses. Examples of such scenarios include, but are not limited to, the medical field, controlling software and hardware systems that may cause harm, and making important financial or legal decisions.
OpenBuddy is provided "as-is" without any warranty of any kind, either express or implied, including, but not limited to, the implied warranties of merchantability, fitness for a particular purpose, and non-infringement. In no event shall the authors, contributors, or copyright holders be liable for any claim, damages, or other liabilities, whether in an action of contract, tort, or otherwise, arising from, out of, or in connection with the software or the use or other dealings in the software.
By using OpenBuddy, you agree to these terms and conditions, and acknowledge that you understand the potential risks associated with its use. You also agree to indemnify and hold harmless the authors, contributors, and copyright holders from any claims, damages, or liabilities arising from your use of OpenBuddy.
## 免责声明
所有OpenBuddy模型均存在固有的局限性,可能产生错误的、有害的、冒犯性的或其他不良的输出。用户在关键或高风险场景中应谨慎行事,不要使用这些模型,以免导致人身伤害、财产损失或重大损失。此类场景的例子包括但不限于医疗领域、可能导致伤害的软硬件系统的控制以及进行重要的财务或法律决策。
OpenBuddy按“原样”提供,不附带任何种类的明示或暗示的保证,包括但不限于适销性、特定目的的适用性和非侵权的暗示保证。在任何情况下,作者、贡献者或版权所有者均不对因软件或使用或其他软件交易而产生的任何索赔、损害赔偿或其他责任(无论是合同、侵权还是其他原因)承担责任。
使用OpenBuddy即表示您同意这些条款和条件,并承认您了解其使用可能带来的潜在风险。您还同意赔偿并使作者、贡献者和版权所有者免受因您使用OpenBuddy而产生的任何索赔、损害赔偿或责任的影响。 | {"language": ["zh", "en", "fr", "de", "ja", "ko", "it", "ru"], "license": "apache-2.0", "library_name": "transformers", "pipeline_tag": "text-generation", "inference": false} | text-generation | LoneStriker/openbuddy-mixtral-7bx8-v18.1-32k-GGUF | [
"transformers",
"gguf",
"text-generation",
"zh",
"en",
"fr",
"de",
"ja",
"ko",
"it",
"ru",
"license:apache-2.0",
"region:us"
] | 2024-02-14T16:58:08+00:00 | [] | [
"zh",
"en",
"fr",
"de",
"ja",
"ko",
"it",
"ru"
] | TAGS
#transformers #gguf #text-generation #zh #en #fr #de #ja #ko #it #ru #license-apache-2.0 #region-us
|
# OpenBuddy - Open Multilingual Chatbot
GitHub and Usage Guide: URL
Website and Demo: URL
Evaluation result of this model: URL
!Demo
# Copyright Notice
Base model: URL
License: Apache 2.0
## Disclaimer
All OpenBuddy models have inherent limitations and may potentially produce outputs that are erroneous, harmful, offensive, or otherwise undesirable. Users should not use these models in critical or high-stakes situations that may lead to personal injury, property damage, or significant losses. Examples of such scenarios include, but are not limited to, the medical field, controlling software and hardware systems that may cause harm, and making important financial or legal decisions.
OpenBuddy is provided "as-is" without any warranty of any kind, either express or implied, including, but not limited to, the implied warranties of merchantability, fitness for a particular purpose, and non-infringement. In no event shall the authors, contributors, or copyright holders be liable for any claim, damages, or other liabilities, whether in an action of contract, tort, or otherwise, arising from, out of, or in connection with the software or the use or other dealings in the software.
By using OpenBuddy, you agree to these terms and conditions, and acknowledge that you understand the potential risks associated with its use. You also agree to indemnify and hold harmless the authors, contributors, and copyright holders from any claims, damages, or liabilities arising from your use of OpenBuddy.
## 免责声明
所有OpenBuddy模型均存在固有的局限性,可能产生错误的、有害的、冒犯性的或其他不良的输出。用户在关键或高风险场景中应谨慎行事,不要使用这些模型,以免导致人身伤害、财产损失或重大损失。此类场景的例子包括但不限于医疗领域、可能导致伤害的软硬件系统的控制以及进行重要的财务或法律决策。
OpenBuddy按“原样”提供,不附带任何种类的明示或暗示的保证,包括但不限于适销性、特定目的的适用性和非侵权的暗示保证。在任何情况下,作者、贡献者或版权所有者均不对因软件或使用或其他软件交易而产生的任何索赔、损害赔偿或其他责任(无论是合同、侵权还是其他原因)承担责任。
使用OpenBuddy即表示您同意这些条款和条件,并承认您了解其使用可能带来的潜在风险。您还同意赔偿并使作者、贡献者和版权所有者免受因您使用OpenBuddy而产生的任何索赔、损害赔偿或责任的影响。 | [
"# OpenBuddy - Open Multilingual Chatbot\n\nGitHub and Usage Guide: URL\n\nWebsite and Demo: URL\n\nEvaluation result of this model: URL\n\n!Demo",
"# Copyright Notice\n\nBase model: URL\n\nLicense: Apache 2.0",
"## Disclaimer\n\nAll OpenBuddy models have inherent limitations and may potentially produce outputs that are erroneous, harmful, offensive, or otherwise undesirable. Users should not use these models in critical or high-stakes situations that may lead to personal injury, property damage, or significant losses. Examples of such scenarios include, but are not limited to, the medical field, controlling software and hardware systems that may cause harm, and making important financial or legal decisions.\n\nOpenBuddy is provided \"as-is\" without any warranty of any kind, either express or implied, including, but not limited to, the implied warranties of merchantability, fitness for a particular purpose, and non-infringement. In no event shall the authors, contributors, or copyright holders be liable for any claim, damages, or other liabilities, whether in an action of contract, tort, or otherwise, arising from, out of, or in connection with the software or the use or other dealings in the software.\n\nBy using OpenBuddy, you agree to these terms and conditions, and acknowledge that you understand the potential risks associated with its use. You also agree to indemnify and hold harmless the authors, contributors, and copyright holders from any claims, damages, or liabilities arising from your use of OpenBuddy.",
"## 免责声明\n\n所有OpenBuddy模型均存在固有的局限性,可能产生错误的、有害的、冒犯性的或其他不良的输出。用户在关键或高风险场景中应谨慎行事,不要使用这些模型,以免导致人身伤害、财产损失或重大损失。此类场景的例子包括但不限于医疗领域、可能导致伤害的软硬件系统的控制以及进行重要的财务或法律决策。\n\nOpenBuddy按“原样”提供,不附带任何种类的明示或暗示的保证,包括但不限于适销性、特定目的的适用性和非侵权的暗示保证。在任何情况下,作者、贡献者或版权所有者均不对因软件或使用或其他软件交易而产生的任何索赔、损害赔偿或其他责任(无论是合同、侵权还是其他原因)承担责任。\n\n使用OpenBuddy即表示您同意这些条款和条件,并承认您了解其使用可能带来的潜在风险。您还同意赔偿并使作者、贡献者和版权所有者免受因您使用OpenBuddy而产生的任何索赔、损害赔偿或责任的影响。"
] | [
"TAGS\n#transformers #gguf #text-generation #zh #en #fr #de #ja #ko #it #ru #license-apache-2.0 #region-us \n",
"# OpenBuddy - Open Multilingual Chatbot\n\nGitHub and Usage Guide: URL\n\nWebsite and Demo: URL\n\nEvaluation result of this model: URL\n\n!Demo",
"# Copyright Notice\n\nBase model: URL\n\nLicense: Apache 2.0",
"## Disclaimer\n\nAll OpenBuddy models have inherent limitations and may potentially produce outputs that are erroneous, harmful, offensive, or otherwise undesirable. Users should not use these models in critical or high-stakes situations that may lead to personal injury, property damage, or significant losses. Examples of such scenarios include, but are not limited to, the medical field, controlling software and hardware systems that may cause harm, and making important financial or legal decisions.\n\nOpenBuddy is provided \"as-is\" without any warranty of any kind, either express or implied, including, but not limited to, the implied warranties of merchantability, fitness for a particular purpose, and non-infringement. In no event shall the authors, contributors, or copyright holders be liable for any claim, damages, or other liabilities, whether in an action of contract, tort, or otherwise, arising from, out of, or in connection with the software or the use or other dealings in the software.\n\nBy using OpenBuddy, you agree to these terms and conditions, and acknowledge that you understand the potential risks associated with its use. You also agree to indemnify and hold harmless the authors, contributors, and copyright holders from any claims, damages, or liabilities arising from your use of OpenBuddy.",
"## 免责声明\n\n所有OpenBuddy模型均存在固有的局限性,可能产生错误的、有害的、冒犯性的或其他不良的输出。用户在关键或高风险场景中应谨慎行事,不要使用这些模型,以免导致人身伤害、财产损失或重大损失。此类场景的例子包括但不限于医疗领域、可能导致伤害的软硬件系统的控制以及进行重要的财务或法律决策。\n\nOpenBuddy按“原样”提供,不附带任何种类的明示或暗示的保证,包括但不限于适销性、特定目的的适用性和非侵权的暗示保证。在任何情况下,作者、贡献者或版权所有者均不对因软件或使用或其他软件交易而产生的任何索赔、损害赔偿或其他责任(无论是合同、侵权还是其他原因)承担责任。\n\n使用OpenBuddy即表示您同意这些条款和条件,并承认您了解其使用可能带来的潜在风险。您还同意赔偿并使作者、贡献者和版权所有者免受因您使用OpenBuddy而产生的任何索赔、损害赔偿或责任的影响。"
] | [
41,
35,
13,
298,
234
] | [
"passage: TAGS\n#transformers #gguf #text-generation #zh #en #fr #de #ja #ko #it #ru #license-apache-2.0 #region-us \n# OpenBuddy - Open Multilingual Chatbot\n\nGitHub and Usage Guide: URL\n\nWebsite and Demo: URL\n\nEvaluation result of this model: URL\n\n!Demo# Copyright Notice\n\nBase model: URL\n\nLicense: Apache 2.0## Disclaimer\n\nAll OpenBuddy models have inherent limitations and may potentially produce outputs that are erroneous, harmful, offensive, or otherwise undesirable. Users should not use these models in critical or high-stakes situations that may lead to personal injury, property damage, or significant losses. Examples of such scenarios include, but are not limited to, the medical field, controlling software and hardware systems that may cause harm, and making important financial or legal decisions.\n\nOpenBuddy is provided \"as-is\" without any warranty of any kind, either express or implied, including, but not limited to, the implied warranties of merchantability, fitness for a particular purpose, and non-infringement. In no event shall the authors, contributors, or copyright holders be liable for any claim, damages, or other liabilities, whether in an action of contract, tort, or otherwise, arising from, out of, or in connection with the software or the use or other dealings in the software.\n\nBy using OpenBuddy, you agree to these terms and conditions, and acknowledge that you understand the potential risks associated with its use. You also agree to indemnify and hold harmless the authors, contributors, and copyright holders from any claims, damages, or liabilities arising from your use of OpenBuddy."
] | [
-0.03697323426604271,
-0.029472285881638527,
-0.0053304932080209255,
0.016129594296216965,
0.004611433017998934,
-0.09047933667898178,
0.08073660731315613,
0.045750536024570465,
0.10729382932186127,
0.04665878787636757,
0.006476307287812233,
0.008009321056306362,
0.0016217043157666922,
-0.032549407333135605,
0.04010904207825661,
-0.1554938405752182,
0.02262653224170208,
-0.04265068843960762,
0.09706089645624161,
0.02209785208106041,
0.06784394383430481,
0.0247510839253664,
0.01947847008705139,
0.06915978342294693,
0.03061055578291416,
-0.06442208588123322,
0.03511536493897438,
0.025066513568162918,
0.06864312291145325,
0.13961532711982727,
-0.02130497433245182,
-0.02575528807938099,
-0.03569721430540085,
-0.22132283449172974,
0.009901950135827065,
0.057496849447488785,
-0.05670004338026047,
-0.01574586145579815,
0.03523952141404152,
0.055471353232860565,
0.18698304891586304,
0.007051859516650438,
-0.02859860099852085,
0.07928745448589325,
-0.08779294043779373,
-0.15255622565746307,
-0.03538923338055611,
0.056295666843652725,
0.0696907639503479,
0.11473201215267181,
-0.05372263863682747,
0.11066631227731705,
-0.004664476960897446,
0.04547299072146416,
0.16814281046390533,
-0.1821821630001068,
0.01030491478741169,
0.013468666933476925,
0.11075601726770401,
-0.0008872105972841382,
-0.040345944464206696,
0.0019187584985047579,
0.03294442221522331,
0.044326864182949066,
0.017134064808487892,
-0.03667125105857849,
0.05072138085961342,
-0.021114422008395195,
-0.11585438251495361,
-0.014219640754163265,
0.26145341992378235,
0.01282793004065752,
-0.132010817527771,
-0.10041745752096176,
0.019001808017492294,
0.15126308798789978,
0.06630124896764755,
0.02013704925775528,
0.028625383973121643,
0.03439950570464134,
0.1042085587978363,
-0.035163745284080505,
-0.09305118769407272,
0.003725900547578931,
-0.11209296435117722,
0.1384609341621399,
0.033488679677248,
0.06552993506193161,
-0.04983709752559662,
-0.0255623497068882,
-0.20469602942466736,
-0.03804049268364906,
-0.04141038656234741,
-0.05708226561546326,
-0.012384464964270592,
0.018042126670479774,
-0.20188046991825104,
-0.15218770503997803,
0.0753304585814476,
0.11293771117925644,
-0.06018722802400589,
-0.035133246332407,
-0.05874018743634224,
0.09620171785354614,
0.11452166736125946,
-0.06374472379684448,
-0.0566980242729187,
0.021004756912589073,
0.058092448860406876,
0.027511391788721085,
0.13181626796722412,
-0.03204241022467613,
-0.03554859384894371,
0.04902425408363342,
-0.09016342461109161,
0.0019934545271098614,
0.08808177709579468,
0.03991546854376793,
-0.03186987712979317,
-0.02706071361899376,
0.10439342260360718,
-0.03998195379972458,
-0.08171441406011581,
-0.026329128071665764,
-0.05594881251454353,
-0.14910487830638885,
0.06152819097042084,
0.06479211896657944,
-0.028234807774424553,
0.011456109583377838,
-0.05160258710384369,
0.00895190890878439,
-0.0823599100112915,
-0.08937311917543411,
0.10252351313829422,
0.04807791858911514,
-0.0007561745005659759,
-0.10340973734855652,
-0.10223181545734406,
-0.02506188489496708,
0.04228528216481209,
0.0026221880689263344,
-0.05703023448586464,
0.07409628480672836,
0.0317130871117115,
-0.05114240199327469,
-0.00325488462112844,
-0.1629914790391922,
-0.03050404228270054,
0.05371726676821709,
-0.04255823791027069,
0.009695162065327168,
-0.10040809959173203,
-0.007251880597323179,
-0.1437595784664154,
0.026357343420386314,
-0.12255410104990005,
0.005895990412682295,
-0.09393081814050674,
-0.01718760095536709,
-0.02494066022336483,
0.01129019632935524,
-0.061922237277030945,
0.11936810612678528,
-0.014703006483614445,
0.08647250384092331,
-0.05430271103978157,
-0.024022404104471207,
0.15297867357730865,
-0.15778842568397522,
-0.06460422277450562,
0.15435539186000824,
-0.017798589542508125,
0.10588184744119644,
0.06908077746629715,
0.0959378331899643,
0.02979009971022606,
-0.09635986387729645,
-0.08811867982149124,
-0.006465303711593151,
-0.003036878537386656,
0.10305176675319672,
0.033037107437849045,
-0.11086047440767288,
-0.027453377842903137,
-0.0036732028238475323,
0.0396052785217762,
-0.03029034659266472,
0.03836855664849281,
-0.016605986282229424,
-0.010553231462836266,
-0.0260931346565485,
0.032146576792001724,
-0.02381610870361328,
-0.010750715620815754,
0.028897657990455627,
-0.0339711531996727,
0.01741798222064972,
0.051165204495191574,
0.02494242787361145,
0.04451991990208626,
-0.06907246261835098,
-0.05253749340772629,
-0.00968439131975174,
-0.007703965995460749,
-0.12295964360237122,
-0.03891776129603386,
0.048604536801576614,
-0.1579977571964264,
0.15835659205913544,
0.11873363703489304,
0.03030848689377308,
0.01945735700428486,
-0.03164428472518921,
0.08707093447446823,
-0.054556697607040405,
-0.020915906876325607,
-0.043734535574913025,
-0.20325802266597748,
0.07546490430831909,
-0.07780914008617401,
0.009667612612247467,
-0.08823218941688538,
0.030497180297970772,
0.098456010222435,
-0.0195928867906332,
0.05340728163719177,
-0.0464792437851429,
0.048582661896944046,
0.023717835545539856,
0.014186719432473183,
0.05148741975426674,
0.02781955897808075,
0.008025059476494789,
-0.17171712219715118,
0.16061732172966003,
-0.1263205111026764,
0.035224560648202896,
0.06044141575694084,
-0.07534630596637726,
-0.03434904292225838,
-0.15049946308135986,
-0.0006188464467413723,
-0.0592319518327713,
-0.0326901376247406,
-0.07263559848070145,
0.05790955573320389,
-0.0012593204155564308,
-0.026194030418992043,
-0.10714682936668396,
-0.016479624435305595,
0.038784414529800415,
-0.15449956059455872,
0.05217789486050606,
0.02502056024968624,
0.07355677336454391,
-0.14351974427700043,
0.054231543093919754,
0.030766194686293602,
-0.17950965464115143,
0.12084870785474777,
0.016730567440390587,
-0.03868982568383217,
-0.05903612822294235,
0.04146008938550949,
0.06714501976966858,
0.16492989659309387,
-0.06203797459602356,
0.05062778294086456,
0.011765819042921066,
0.033003468066453934,
0.02176126465201378,
-0.07985105365514755,
-0.008560116402804852,
-0.003079549642279744,
0.03181493654847145,
-0.04591207206249237,
0.006673538591712713,
-0.10141757130622864,
0.0738462507724762,
-0.04148882254958153,
0.039032138884067535,
0.07616020739078522,
-0.04232843965291977,
-0.1296796053647995,
0.1017957478761673,
-0.09481588006019592,
-0.20691287517547607,
-0.06093771755695343,
0.008872350677847862,
-0.08364677429199219,
0.011693613603711128,
0.06396438926458359,
-0.0597190260887146,
-0.0494469590485096,
-0.07364904135465622,
0.012515596114099026,
-0.027713028714060783,
-0.07060971856117249,
-0.06067781522870064,
0.03629753366112709,
0.00421692943200469,
-0.04050179570913315,
-0.006989399902522564,
-0.027843616902828217,
-0.1329376995563507,
0.004196470603346825,
-0.043487295508384705,
0.06914865970611572,
0.07872536778450012,
0.1364513486623764,
-0.07114563882350922,
-0.08307944238185883,
0.06440195441246033,
-0.02537807635962963,
0.018808644264936447,
0.2277158945798874,
-0.06999509781599045,
0.09884103387594223,
0.11821725219488144,
0.04528988152742386,
-0.021454134956002235,
0.059226084500551224,
0.0996919721364975,
-0.04941590130329132,
-0.13976295292377472,
-0.05119152367115021,
-0.10552474111318588,
0.09200811386108398,
-0.016422491520643234,
-0.006507740821689367,
0.2136407047510147,
0.032774198800325394,
-0.05457112565636635,
0.08517980575561523,
0.08507169038057327,
0.06909329444169998,
0.14333178102970123,
-0.02064208686351776,
0.11211101710796356,
-0.01781184785068035,
0.06540366262197495,
0.1344051957130432,
-0.00903908722102642,
0.3319772183895111,
0.002842132467776537,
0.08734006434679031,
0.17465676367282867,
0.015363196842372417,
-0.027100572362542152,
-0.0880749374628067,
-0.08409645408391953,
0.04061382636427879,
-0.08049435168504715,
-0.10173516720533371,
-0.06593970209360123,
0.10700125247240067,
-0.053161464631557465,
-0.07247103750705719,
0.020960107445716858,
-0.03587298095226288,
0.06874290108680725,
0.010504985228180885,
-0.009595820680260658,
0.042839907109737396,
-0.028972070664167404,
0.040341220796108246,
-0.042023926973342896,
0.006551033817231655,
0.07452218234539032,
0.09238653630018234,
-0.060091812163591385,
0.04231550917029381,
-0.03483648598194122,
0.04090730473399162,
0.009719003923237324,
0.04701070860028267,
-0.06047828868031502,
0.043683841824531555,
-0.02865583635866642,
0.052992530167102814,
-0.2938360869884491,
0.07800358533859253,
0.0020815206225961447,
-0.022799132391810417,
-0.0696340948343277,
0.022581571713089943,
0.07754677534103394,
0.056214891374111176,
0.09973160177469254,
0.04570915177464485,
-0.091171033680439,
0.06445523351430893,
0.03985300287604332,
0.047789670526981354,
0.00519132474437356,
-0.013760332949459553,
0.02294398844242096,
0.01727387309074402,
0.06571314483880997,
-0.01711510494351387,
0.15928508341312408,
-0.14591394364833832,
-0.13558071851730347,
0.0860125944018364,
-0.054565317928791046,
-0.030983544886112213,
-0.12447545677423477,
0.019427094608545303,
0.11093847453594208,
0.03468022868037224,
-0.10916357487440109,
-0.03292699530720711,
-0.05392443761229515,
-0.08386774361133575,
0.04820820316672325,
-0.043690748512744904,
0.05962183699011803,
-0.019701894372701645,
-0.009897754527628422,
-0.11675793677568436,
-0.037860412150621414,
0.06313994526863098,
-0.12557709217071533,
-0.11197500675916672,
-0.09758373349905014,
-0.013514316640794277,
0.12121812999248505,
0.02494054101407528,
-0.02772621251642704,
0.012945186346769333,
-0.03602912649512291,
-0.11625729501247406,
-0.015405615791678429,
0.10899262875318527,
0.06424162536859512,
0.05361701175570488,
-0.0670110434293747,
-0.05085102468729019,
-0.06843835860490799,
-0.11469779908657074,
-0.03995397314429283,
0.12876181304454803,
-0.00977236870676279,
0.09281416237354279,
0.2067248523235321,
-0.0881117433309555,
-0.20624199509620667,
-0.022308530285954475,
0.017462339252233505,
-0.05931713059544563,
0.09231895953416824,
-0.1616208255290985,
-0.005944120232015848,
0.05654898285865784,
-0.06326262652873993,
0.08816511929035187,
-0.09432822465896606,
-0.07787279039621353,
-0.001102988957427442,
0.0353272408246994,
0.15875284373760223,
-0.12494193017482758,
-0.0702168270945549,
-0.014411170035600662,
-0.047864776104688644,
0.1169169470667839,
-0.1646287441253662,
0.031142424792051315,
0.019252318888902664,
-0.06458651274442673,
0.011921684257686138,
-0.023134276270866394,
0.09824158251285553,
-0.037652868777513504,
0.041363779455423355,
-0.05745525658130646,
0.04232471436262131,
-0.021778350695967674,
-0.049232516437768936,
0.06203153729438782,
-0.14962050318717957,
-0.03409174457192421,
-0.07473315298557281,
-0.05668340250849724,
-0.08328056335449219,
0.10581760853528976,
-0.03727645426988602,
-0.11016061156988144,
-0.0722222700715065,
0.07533594220876694,
0.007790472824126482,
0.016312263906002045,
-0.0038955125492066145,
-0.14303775131702423,
0.04819396138191223,
0.16692869365215302,
0.1773696392774582,
0.11001662909984589,
-0.14597342908382416,
-0.007027748506516218,
-0.06416545808315277,
0.1268957555294037,
-0.18660438060760498,
-0.00123973423615098,
-0.009942665696144104,
0.0025731942150741816,
0.08901236951351166,
0.0017366474494338036,
-0.13144747912883759,
0.08540835976600647,
0.04822295159101486,
-0.010746880434453487,
-0.0500626303255558,
-0.006901429034769535,
0.22420388460159302,
-0.057070162147283554,
0.011720280162990093,
0.12974683940410614,
-0.08910360932350159,
0.01800045743584633,
-0.022600539028644562,
0.11067517101764679,
-0.007667867466807365,
-0.0037462504114955664,
0.044503115117549896,
-0.01842597872018814,
-0.028312023729085922,
0.07618877291679382,
0.017848791554570198,
-0.06748097389936447,
0.08668865263462067,
-0.027583783492445946,
-0.02158641256392002,
-0.056245751678943634,
-0.12779000401496887,
-0.03763575851917267,
-0.07874467223882675,
-0.1433197557926178,
0.017116734758019447,
-0.06756553053855896,
-0.03753123804926872,
0.19110901653766632,
-0.005471671000123024,
0.033975619822740555,
0.0380009301006794,
0.06139590963721275,
0.011856785044074059,
0.07991781085729599,
0.024630818516016006,
0.022460203617811203,
0.023588715121150017,
-0.06778550893068314,
0.046710480004549026,
-0.03174327686429024,
-0.013467478565871716,
0.03897226229310036,
-0.13817362487316132,
-0.04208947345614433,
-0.14295686781406403,
0.03438405692577362,
-0.05337528884410858,
-0.039578892290592194,
0.002408895641565323,
0.02451719343662262,
0.03785973787307739,
0.021938258782029152,
-0.0016840501921251416,
0.024099554866552353,
0.02790527231991291,
0.07032041251659393,
-0.12935538589954376,
-0.030689634382724762,
0.11615610867738724,
-0.0420561321079731,
0.09900080412626266,
-0.03789179027080536,
-0.05048011243343353,
0.01827378198504448,
-0.059759628027677536,
0.15459421277046204,
-0.03475523740053177,
0.01629408448934555,
-0.09073130041360855,
-0.07213892787694931,
0.025425536558032036,
-0.023107610642910004,
-0.022429466247558594,
-0.0051780021749436855,
0.06218605116009712,
-0.08647934347391129,
0.059245653450489044,
0.10411284863948822,
-0.03603232651948929,
-0.05982910469174385,
-0.021905025467276573,
0.00007051092688925564,
0.04234696924686432,
0.042830705642700195,
-0.029229870066046715,
0.0016919721383601427,
-0.12978769838809967,
0.016430968418717384,
0.05540399253368378,
0.039893556386232376,
-0.08495479822158813,
-0.05871975049376488,
0.002961358753964305,
0.0022285350132733583,
0.1479102522134781,
-0.023301400244235992,
-0.06531982123851776,
0.01531065721064806,
0.06472034752368927,
0.10970491915941238,
-0.02137502282857895,
-0.13725903630256653,
-0.05595759302377701,
0.007580493576824665,
-0.16996370255947113,
-0.0008463281556032598,
-0.056670892983675,
-0.2511979639530182,
0.05319972336292267,
0.004312219098210335,
0.11650142073631287,
-0.023093685507774353,
0.04549102485179901,
-0.06629133224487305,
-0.04731906205415726,
-0.001472181174904108,
0.08881768584251404,
0.006848530378192663,
-0.016668614000082016,
0.046522967517375946,
0.18522919714450836,
-0.017990829423069954,
0.14093855023384094,
0.014826225116848946,
-0.008234639652073383,
-0.06996562331914902,
-0.23196864128112793,
0.01768799126148224,
-0.025916503742337227,
-0.008931358344852924,
-0.07882989943027496,
0.022692373022437096,
0.08172830194234848,
-0.02129504829645157,
-0.012419033795595169,
0.04038091003894806,
-0.10677140206098557,
-0.06434737145900726,
-0.03294995054602623,
-0.042803939431905746,
0.014894784428179264,
0.11508534103631973,
0.011580108664929867,
0.025651486590504646,
0.004639219492673874,
0.04440324753522873,
0.09114566445350647,
0.01380141917616129,
0.0205056294798851,
-0.010069402866065502,
0.0054301852360367775,
0.01422901265323162,
-0.04688558727502823,
0.013973204419016838,
0.29627862572669983,
0.043551862239837646,
-0.013739570043981075,
0.05563993752002716,
0.1774521917104721,
-0.05728055164217949,
-0.08036967366933823,
-0.1293264478445053,
0.2235563099384308,
-0.08281933516263962,
-0.018792228773236275,
-0.06844981014728546,
-0.03474777936935425,
0.11948982626199722,
0.14521335065364838,
0.018400071188807487,
-0.08661212027072906,
0.00753869628533721,
0.035073455423116684,
-0.0031842864118516445,
0.0366065539419651,
0.03231804817914963,
-0.04240105673670769,
0.4451933801174164,
-0.0772005021572113,
0.15052776038646698,
0.030360683798789978,
0.048310745507478714,
-0.06262088567018509,
0.032308898866176605,
0.023385053500533104,
-0.052594300359487534,
-0.05467918515205383,
0.052321821451187134,
-0.07106595486402512,
-0.09782639145851135,
-0.01134014967828989,
0.03488621860742569,
0.04350650683045387,
-0.008068308234214783,
0.12640531361103058,
-0.01246324647217989,
0.06768595427274704,
0.004057582933455706,
-0.02918163686990738,
0.07493217289447784,
-0.024540409445762634,
-0.060731060802936554,
0.03783319517970085,
0.08494401723146439,
0.12778852880001068,
0.08903779089450836,
0.004395937547087669,
0.1848153918981552,
0.04552828520536423,
0.04175823554396629,
0.02873041108250618,
0.0544012114405632,
-0.025339817628264427,
-0.06794239580631256,
-0.09135021269321442,
-0.02317146211862564,
0.049743667244911194,
0.13160645961761475,
0.08647208660840988,
0.014599247835576534,
0.03391402214765549,
0.12387298792600632,
-0.03261658549308777,
-0.04155883565545082,
0.12381693720817566,
-0.13286858797073364,
0.1296035647392273,
0.05477728322148323,
-0.011606705375015736,
-0.06016544997692108,
-0.067694291472435,
-0.0022991348523646593,
0.013796886429190636,
0.030835898593068123,
0.015090450644493103,
0.013984479010105133,
0.05624321475625038,
0.15627047419548035,
0.0383811816573143,
-0.189866840839386,
-0.07839830219745636,
0.08703365921974182,
0.004409478977322578,
-0.01605076529085636,
-0.04257480427622795,
0.11929679661989212,
-0.03540525585412979,
-0.05889776721596718,
-0.08636102080345154,
0.040348250418901443,
-0.006337996572256088,
-0.019910914823412895,
-0.065192312002182
] |
null | null | peft |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# mistral-verilog-v2-end
This model is a fine-tuned version of [mistralai/Mistral-7B-v0.1](https://huggingface.co/mistralai/Mistral-7B-v0.1) on an unknown dataset.
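Since this card describes a PEFT fine-tune of Mistral-7B-v0.1 rather than a full set of model weights, a minimal, hedged loading sketch looks like the following. The adapter id `emilgoh/mistral-verilog-v2` is this repository; the Verilog prompt is only a placeholder.

```python
# Hedged sketch: attach this PEFT adapter to the Mistral-7B-v0.1 base model.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base = AutoModelForCausalLM.from_pretrained(
    "mistralai/Mistral-7B-v0.1", torch_dtype=torch.bfloat16, device_map="auto"
)
tokenizer = AutoTokenizer.from_pretrained("mistralai/Mistral-7B-v0.1")
model = PeftModel.from_pretrained(base, "emilgoh/mistral-verilog-v2")  # adapter from this repo

prompt = "// Verilog: 8-bit synchronous counter with enable\nmodule counter("
inputs = tokenizer(prompt, return_tensors="pt").to(base.device)
output = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```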
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (a hedged sketch of a matching trainer setup follows this list):
- learning_rate: 1e-05
- train_batch_size: 1
- eval_batch_size: 8
- seed: 42
- distributed_type: multi-GPU
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: cosine
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 1
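A hedged sketch of how these values might be wired into a TRL `SFTTrainer` run is shown below. The training dataset, its text field, and the LoRA configuration are placeholders, since none of them are documented in this card, and the multi-GPU launch (e.g. via `accelerate launch` or `torchrun`) is assumed to happen outside the script.

```python
# Hedged sketch only: maps the listed hyperparameters onto transformers/TRL classes.
from datasets import load_dataset
from peft import LoraConfig
from transformers import TrainingArguments
from trl import SFTTrainer

train_dataset = load_dataset("json", data_files="verilog_train.jsonl")["train"]  # placeholder dataset

args = TrainingArguments(
    output_dir="mistral-verilog-v2-end",
    learning_rate=1e-5,
    per_device_train_batch_size=1,
    per_device_eval_batch_size=8,
    seed=42,
    num_train_epochs=1,
    lr_scheduler_type="cosine",
    warmup_ratio=0.1,
    optim="adamw_torch",  # Adam with betas=(0.9, 0.999) and epsilon=1e-08 are the defaults
)

lora_config = LoraConfig(task_type="CAUSAL_LM")  # placeholder: the real adapter config is not documented

trainer = SFTTrainer(
    model="mistralai/Mistral-7B-v0.1",
    args=args,
    train_dataset=train_dataset,
    peft_config=lora_config,
    dataset_text_field="text",  # placeholder field name
)
trainer.train()
```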
### Training results
### Framework versions
- PEFT 0.8.2
- Transformers 4.37.2
- Pytorch 2.1.0+cu121
- Datasets 2.17.0
- Tokenizers 0.15.1 | {"license": "apache-2.0", "library_name": "peft", "tags": ["trl", "sft", "generated_from_trainer"], "base_model": "mistralai/Mistral-7B-v0.1", "model-index": [{"name": "mistral-verilog-v2-end", "results": []}]} | null | emilgoh/mistral-verilog-v2 | [
"peft",
"safetensors",
"trl",
"sft",
"generated_from_trainer",
"base_model:mistralai/Mistral-7B-v0.1",
"license:apache-2.0",
"region:us"
] | 2024-02-14T17:00:17+00:00 | [] | [] | TAGS
#peft #safetensors #trl #sft #generated_from_trainer #base_model-mistralai/Mistral-7B-v0.1 #license-apache-2.0 #region-us
|
# mistral-verilog-v2-end
This model is a fine-tuned version of mistralai/Mistral-7B-v0.1 on an unknown dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 1
- eval_batch_size: 8
- seed: 42
- distributed_type: multi-GPU
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: cosine
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 1
### Training results
### Framework versions
- PEFT 0.8.2
- Transformers 4.37.2
- Pytorch 2.1.0+cu121
- Datasets 2.17.0
- Tokenizers 0.15.1 | [
"# mistral-verilog-v2-end\n\nThis model is a fine-tuned version of mistralai/Mistral-7B-v0.1 on an unknown dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 1e-05\n- train_batch_size: 1\n- eval_batch_size: 8\n- seed: 42\n- distributed_type: multi-GPU\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: cosine\n- lr_scheduler_warmup_ratio: 0.1\n- num_epochs: 1",
"### Training results",
"### Framework versions\n\n- PEFT 0.8.2\n- Transformers 4.37.2\n- Pytorch 2.1.0+cu121\n- Datasets 2.17.0\n- Tokenizers 0.15.1"
] | [
"TAGS\n#peft #safetensors #trl #sft #generated_from_trainer #base_model-mistralai/Mistral-7B-v0.1 #license-apache-2.0 #region-us \n",
"# mistral-verilog-v2-end\n\nThis model is a fine-tuned version of mistralai/Mistral-7B-v0.1 on an unknown dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 1e-05\n- train_batch_size: 1\n- eval_batch_size: 8\n- seed: 42\n- distributed_type: multi-GPU\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: cosine\n- lr_scheduler_warmup_ratio: 0.1\n- num_epochs: 1",
"### Training results",
"### Framework versions\n\n- PEFT 0.8.2\n- Transformers 4.37.2\n- Pytorch 2.1.0+cu121\n- Datasets 2.17.0\n- Tokenizers 0.15.1"
] | [
51,
39,
6,
12,
8,
3,
116,
4,
39
] | [
"passage: TAGS\n#peft #safetensors #trl #sft #generated_from_trainer #base_model-mistralai/Mistral-7B-v0.1 #license-apache-2.0 #region-us \n# mistral-verilog-v2-end\n\nThis model is a fine-tuned version of mistralai/Mistral-7B-v0.1 on an unknown dataset.## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 1e-05\n- train_batch_size: 1\n- eval_batch_size: 8\n- seed: 42\n- distributed_type: multi-GPU\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: cosine\n- lr_scheduler_warmup_ratio: 0.1\n- num_epochs: 1### Training results### Framework versions\n\n- PEFT 0.8.2\n- Transformers 4.37.2\n- Pytorch 2.1.0+cu121\n- Datasets 2.17.0\n- Tokenizers 0.15.1"
] | [
-0.11359327286481857,
0.0880155935883522,
-0.0028370898216962814,
0.0795043334364891,
0.12074995785951614,
0.033435624092817307,
0.101921945810318,
0.11425859481096268,
-0.03673119470477104,
0.08388425409793854,
0.09039237350225449,
0.029681198298931122,
0.0493280366063118,
0.14629754424095154,
-0.034460537135601044,
-0.21127861738204956,
0.038434699177742004,
-0.08211428672075272,
-0.055558063089847565,
0.08969762921333313,
0.1080133393406868,
-0.09684580564498901,
0.09309829026460648,
0.009358590468764305,
-0.11746221035718918,
-0.029730109497904778,
-0.037864625453948975,
-0.04365094378590584,
0.09979233145713806,
-0.0011635205009952188,
0.1012883335351944,
0.020519712939858437,
0.15076380968093872,
-0.21054592728614807,
0.007254802621901035,
0.07169798016548157,
0.035782340914011,
0.10214801877737045,
0.04525545984506607,
0.007484034169465303,
0.08599215745925903,
-0.12110772728919983,
0.10304734110832214,
0.03366663679480553,
-0.08060204237699509,
-0.09689872711896896,
-0.11238467693328857,
0.07229697704315186,
0.07851666212081909,
0.09424500912427902,
0.020986337214708328,
0.1440989226102829,
-0.09386878460645676,
0.045137669891119,
0.23414795100688934,
-0.24664396047592163,
-0.05474180728197098,
0.07912988215684891,
0.04872838780283928,
0.06682773679494858,
-0.11023879796266556,
-0.01700361631810665,
0.04196562618017197,
0.02207060158252716,
0.07347484678030014,
0.003924813121557236,
-0.07087991386651993,
-0.007129293400794268,
-0.1305246204137802,
-0.028747962787747383,
0.14707693457603455,
0.03903549537062645,
-0.055236417800188065,
-0.07806898653507233,
-0.04915998876094818,
-0.09057796746492386,
-0.013378300704061985,
-0.03438984975218773,
0.03284840285778046,
-0.04506317153573036,
0.008235587738454342,
-0.037979982793331146,
-0.09807506948709488,
-0.08688626438379288,
0.005025280639529228,
0.08302301168441772,
0.04221444949507713,
0.028956321999430656,
-0.01904142089188099,
0.12297181040048599,
-0.020998142659664154,
-0.11081770807504654,
-0.03493291884660721,
-0.022112984210252762,
-0.10028527677059174,
-0.054540544748306274,
-0.033650659024715424,
0.0005812437739223242,
0.03194960951805115,
0.16275645792484283,
-0.09971331804990768,
0.06835351884365082,
0.06375103443861008,
0.011198507621884346,
-0.022377202287316322,
0.10647862404584885,
-0.06870625913143158,
-0.021639831364154816,
0.018180590122938156,
0.11373742669820786,
0.022856563329696655,
-0.004791371989995241,
-0.0670664981007576,
-0.04130854457616806,
0.055423904210329056,
0.07315273582935333,
-0.046322617679834366,
0.020038682967424393,
-0.05245131254196167,
-0.020715434104204178,
0.11536233872175217,
-0.11166416853666306,
0.01608239859342575,
0.025681661441922188,
-0.09523536264896393,
-0.057207174599170685,
0.046555690467357635,
0.01727188378572464,
-0.01852709800004959,
0.07897043228149414,
-0.091164231300354,
-0.01184607483446598,
-0.056931883096694946,
-0.05090702325105667,
0.022456055507063866,
-0.08917895704507828,
-0.024583807215094566,
-0.08396089822053909,
-0.1980455219745636,
-0.04574928805232048,
0.04164373129606247,
-0.0655401423573494,
-0.03447685018181801,
-0.044440772384405136,
-0.07455340027809143,
0.028697768226265907,
0.002404287923127413,
0.14593173563480377,
-0.06492212414741516,
0.05365196615457535,
-0.06307514756917953,
0.020538246259093285,
0.012027186341583729,
0.027674347162246704,
-0.07537976652383804,
0.030985744670033455,
-0.10526356101036072,
0.048494603484869,
-0.0965685322880745,
0.03350675851106644,
-0.14851048588752747,
-0.08351451903581619,
0.03153224661946297,
-0.04851829633116722,
0.06701796501874924,
0.13637381792068481,
-0.16764727234840393,
0.013400350697338581,
0.10062738507986069,
-0.08949590474367142,
-0.08364970982074738,
0.10297974199056625,
-0.03431094065308571,
0.013360599055886269,
0.0374596044421196,
0.13863638043403625,
0.11088015884160995,
-0.19495509564876556,
-0.004293140023946762,
0.02846784144639969,
0.032234497368335724,
0.018591778352856636,
0.07352878898382187,
-0.010636892169713974,
0.009635686874389648,
0.008216925896704197,
-0.08337387442588806,
-0.005786308087408543,
-0.07830527424812317,
-0.09596335887908936,
-0.0702550932765007,
-0.09203337877988815,
0.017643850296735764,
0.0022271883208304644,
0.01234703790396452,
-0.05446799099445343,
-0.10083969682455063,
0.0181900467723608,
0.1718861609697342,
-0.027996787801384926,
0.005780040752142668,
-0.07432949542999268,
0.0625573992729187,
0.018222643062472343,
-0.03459058329463005,
-0.17617787420749664,
-0.12160517275333405,
0.04056413471698761,
-0.07112770527601242,
0.002842741785570979,
0.019132940098643303,
0.06461428105831146,
0.08668377995491028,
-0.0357482023537159,
-0.022271104156970978,
-0.06917617470026016,
0.005452276673167944,
-0.0882796049118042,
-0.20997969806194305,
-0.029543105512857437,
-0.050293318927288055,
0.20236432552337646,
-0.21878640353679657,
0.003959777764976025,
-0.01912299543619156,
0.13749778270721436,
0.033474501222372055,
-0.050699908286333084,
0.011996674351394176,
0.007558254990726709,
0.00011814672325272113,
-0.10029726475477219,
0.027961138635873795,
0.000314126955345273,
-0.08824948966503143,
-0.011752866208553314,
-0.13713473081588745,
-0.006051366683095694,
0.05241122841835022,
0.12283024936914444,
-0.11028964817523956,
-0.06488528847694397,
-0.06477782875299454,
-0.046107061207294464,
-0.07749305665493011,
0.017529809847474098,
0.16525311768054962,
0.015364178456366062,
0.10615924745798111,
-0.06290467828512192,
-0.05739973485469818,
0.011284511536359787,
0.0015620740596204996,
0.0018046250334009528,
0.09537077695131302,
0.033453505486249924,
-0.13846607506275177,
0.06923593580722809,
0.10729147493839264,
-0.07516986131668091,
0.10670371353626251,
-0.05373352766036987,
-0.07907077670097351,
-0.024982541799545288,
0.03485342860221863,
-0.00167043786495924,
0.1198807805776596,
-0.01709488034248352,
0.04630593582987785,
0.040288619697093964,
0.024684326723217964,
0.011771561577916145,
-0.17054349184036255,
-0.014265041798353195,
0.00265081156976521,
-0.05247826501727104,
-0.037290289998054504,
0.005813537165522575,
0.015696655958890915,
0.0815441682934761,
0.012045670300722122,
-0.05553777888417244,
-0.0005200738087296486,
-0.012817594222724438,
-0.0782519057393074,
0.19913698732852936,
-0.1335172802209854,
-0.08667613565921783,
-0.10431627929210663,
0.10647991299629211,
-0.01745186187326908,
-0.02496105432510376,
0.016043389216065407,
-0.07182904332876205,
-0.0675848051905632,
-0.11217250674962997,
-0.03950052708387375,
-0.0061968471854925156,
-0.006548861972987652,
0.020941682159900665,
0.018931686878204346,
0.06444068998098373,
-0.13345378637313843,
0.013771734200417995,
-0.029549572616815567,
-0.09232421964406967,
0.011075587011873722,
0.04602903127670288,
0.06808486580848694,
0.12419889867305756,
0.005422370973974466,
-0.0032249256037175655,
-0.019178878515958786,
0.23935352265834808,
-0.06325764209032059,
0.0005440632230602205,
0.08413787931203842,
0.018712110817432404,
0.06150488927960396,
0.12578165531158447,
0.02617563307285309,
-0.07771555334329605,
0.024937056005001068,
0.052179113030433655,
-0.012856749817728996,
-0.27047768235206604,
-0.05525365471839905,
-0.003093644045293331,
-0.06495358049869537,
0.07912781834602356,
0.06261193007230759,
0.0028786533512175083,
0.055394578725099564,
-0.03403189033269882,
-0.025429164990782738,
0.018200581893324852,
0.06228531897068024,
0.028391417115926743,
0.04768984764814377,
0.0691227987408638,
-0.02770206145942211,
0.020717233419418335,
0.07335726916790009,
0.018992895260453224,
0.29127630591392517,
-0.04918699339032173,
0.1352505385875702,
0.02902832254767418,
0.1439923644065857,
-0.030787458643317223,
0.04183954745531082,
0.016609184443950653,
-0.00426546111702919,
-0.007532415445894003,
-0.07000337541103363,
-0.013043854385614395,
0.05103211849927902,
-0.008192169480025768,
0.03981185704469681,
-0.0750327929854393,
0.0848003402352333,
0.026091020554304123,
0.2805892825126648,
0.030740920454263687,
-0.28473496437072754,
-0.08616454899311066,
0.004913468845188618,
-0.010283133015036583,
-0.04866473004221916,
0.013230511918663979,
0.1541113406419754,
-0.1192818209528923,
0.06052510440349579,
-0.05172383785247803,
0.07740844041109085,
-0.02002822607755661,
-0.015878740698099136,
0.0339277982711792,
0.14806409180164337,
-0.009940577670931816,
0.07275920361280441,
-0.16506272554397583,
0.23783740401268005,
0.018143558874726295,
0.11417217552661896,
-0.049976758658885956,
0.016113203018903732,
0.024714292958378792,
0.12904146313667297,
0.12700186669826508,
0.011739988811314106,
-0.04469236359000206,
-0.15091854333877563,
-0.12537327408790588,
0.02242732048034668,
0.09568027406930923,
-0.01637294515967369,
0.05458517745137215,
-0.05010005086660385,
-0.0016344600589945912,
0.04237431660294533,
-0.06787361949682236,
-0.18787577748298645,
-0.100876584649086,
0.03230006992816925,
0.013702603988349438,
-0.04111497476696968,
-0.08435077220201492,
-0.09118232131004333,
-0.024920839816331863,
0.0823429748415947,
-0.04278818890452385,
-0.03265959024429321,
-0.13476982712745667,
0.007571921218186617,
0.15897372364997864,
-0.06577276438474655,
0.02297583967447281,
-0.007583422586321831,
0.09875094145536423,
0.024155091494321823,
-0.05090506747364998,
0.0533599853515625,
-0.08323481678962708,
-0.2098533660173416,
-0.06627334654331207,
0.09866993874311447,
0.08584216982126236,
0.05242037773132324,
0.010609284974634647,
0.03308236598968506,
-0.013899392448365688,
-0.09834980964660645,
0.024912267923355103,
0.13412931561470032,
0.07607939094305038,
0.05440172925591469,
-0.07008369266986847,
0.013619531877338886,
-0.024269862100481987,
-0.03306764364242554,
0.09837986528873444,
0.2606453001499176,
-0.08826570212841034,
0.12188661098480225,
0.10700003802776337,
-0.06828804314136505,
-0.18202008306980133,
0.026468142867088318,
0.11962491273880005,
0.015792038291692734,
0.042982302606105804,
-0.1538924127817154,
0.08030175417661667,
0.14116501808166504,
-0.0383114218711853,
0.055758021771907806,
-0.3429234027862549,
-0.13500462472438812,
0.0794118121266365,
0.08177047222852707,
0.02053348906338215,
-0.11533107608556747,
-0.02589668706059456,
-0.030213892459869385,
-0.1306980550289154,
0.09529325366020203,
-0.108157217502594,
0.1148313656449318,
-0.025765307247638702,
0.06531093269586563,
0.02574576437473297,
-0.0485617071390152,
0.1777905374765396,
0.02914663963019848,
0.08908366411924362,
-0.05372482165694237,
-0.0032189583871513605,
0.06687986850738525,
-0.08361045271158218,
0.07603202015161514,
-0.06353046000003815,
0.08918844908475876,
-0.09492886066436768,
0.0013647512532770634,
-0.06575031578540802,
0.07328207790851593,
-0.04394017159938812,
-0.05077352374792099,
-0.054377272725105286,
0.07130890339612961,
0.06509263068437576,
-0.04014592617750168,
0.09246040880680084,
0.02984166517853737,
0.06886086612939835,
0.1199561208486557,
0.0803673043847084,
-0.006396345794200897,
-0.08367390930652618,
0.007427528966218233,
-0.014043068513274193,
0.07597549259662628,
-0.12257393449544907,
0.013736101798713207,
0.11637145280838013,
0.03257104009389877,
0.11771371215581894,
0.019560687243938446,
-0.08221323788166046,
0.006048860494047403,
0.04257189482450485,
-0.10552169382572174,
-0.15376055240631104,
0.01691916212439537,
0.025179145857691765,
-0.11728999763727188,
0.019907105714082718,
0.1248101145029068,
-0.07175285369157791,
-0.006938338745385408,
-0.015437648631632328,
0.04807986319065094,
-0.01854453980922699,
0.21063652634620667,
0.04164835810661316,
0.06275802105665207,
-0.08535906672477722,
0.13981100916862488,
0.07453936338424683,
-0.05557537451386452,
0.0419083870947361,
0.06757748872041702,
-0.12788043916225433,
-0.04080693796277046,
0.07903623580932617,
0.1145705133676529,
-0.053131695836782455,
-0.04392942413687706,
-0.05164053663611412,
-0.08266261965036392,
0.043878715485334396,
0.10273299366235733,
0.044034771621227264,
0.011552216485142708,
-0.008055266924202442,
0.003681682515889406,
-0.10707017779350281,
0.08458322286605835,
0.05125835910439491,
0.0657227411866188,
-0.14062954485416412,
0.07754441350698471,
-0.006713632959872484,
0.019411228597164154,
-0.01172658521682024,
0.020021459087729454,
-0.0827120840549469,
-0.02537268027663231,
-0.170386403799057,
0.043637752532958984,
-0.020923864096403122,
0.01736278086900711,
0.004829598590731621,
-0.05284593626856804,
-0.018497193232178688,
0.059587761759757996,
-0.06984546035528183,
-0.047971587628126144,
-0.014713305979967117,
0.06319980323314667,
-0.14309385418891907,
-0.03192347660660744,
0.028016917407512665,
-0.0756840631365776,
0.09104351699352264,
0.0725943073630333,
0.017500320449471474,
0.02700410969555378,
-0.17610183358192444,
-0.008100598119199276,
0.02793900854885578,
0.022837592288851738,
0.060630958527326584,
-0.12683497369289398,
-0.030997896566987038,
-0.05258471518754959,
0.008800427429378033,
0.008324983529746532,
0.08321786671876907,
-0.11807060986757278,
-0.028755629435181618,
-0.027090132236480713,
-0.05914904549717903,
-0.07447352260351181,
0.019152291119098663,
0.10297880321741104,
0.028134001418948174,
0.15991143882274628,
-0.07087618112564087,
0.0356658473610878,
-0.15381982922554016,
-0.032310277223587036,
0.007404169999063015,
-0.01640901155769825,
-0.10634401440620422,
-0.004192857537418604,
0.0824943482875824,
-0.05811114236712456,
0.06281241029500961,
-0.041194330900907516,
0.05240774527192116,
0.03180947154760361,
-0.021119797602295876,
-0.03407495468854904,
0.020900782197713852,
0.1256592720746994,
0.06440271437168121,
0.0001031184583553113,
0.07202781736850739,
-0.037046656012535095,
0.05823769047856331,
0.01985401101410389,
0.19173312187194824,
0.1260347068309784,
0.005139671731740236,
0.07404879480600357,
0.059617698192596436,
-0.1428462117910385,
-0.08706753700971603,
0.12125523388385773,
-0.05509920045733452,
0.09549383819103241,
-0.06414654850959778,
0.12320895493030548,
0.1139516830444336,
-0.16861507296562195,
0.0191020704805851,
-0.07520700246095657,
-0.10927034169435501,
-0.12135441601276398,
-0.04591609537601471,
-0.0786384791135788,
-0.10802134871482849,
0.014557952992618084,
-0.10415039211511612,
0.05329807847738266,
0.08961939066648483,
0.025322135537862778,
0.038983479142189026,
0.12104957550764084,
-0.02532879263162613,
0.02246798761188984,
0.026500863954424858,
0.03605198860168457,
-0.007976748049259186,
-0.04180685803294182,
-0.07914526015520096,
0.07226883620023727,
-0.034431200474500656,
0.05487215146422386,
-0.03635362535715103,
0.0263435710221529,
0.05288740620017052,
0.005902571603655815,
-0.05870329216122627,
0.029965462163090706,
0.014734391123056412,
0.010809187777340412,
0.04549116641283035,
0.0698464959859848,
0.020104970782995224,
-0.02127915248274803,
0.28121310472488403,
-0.06089509651064873,
-0.08165982365608215,
-0.14277532696723938,
0.19060415029525757,
0.02913295105099678,
0.0056354571133852005,
0.07417062669992447,
-0.10244544595479965,
-0.028008239343762398,
0.10989966988563538,
0.1701377034187317,
-0.06840114295482635,
-0.01874784380197525,
-0.026410989463329315,
-0.019055690616369247,
-0.07361498475074768,
0.105537548661232,
0.08843290060758591,
0.016232768073678017,
-0.07313738763332367,
0.030639318749308586,
0.0008283320930786431,
-0.026913469657301903,
-0.0886581763625145,
0.03942846134305,
-0.024992546066641808,
0.002456225687637925,
-0.046544719487428665,
0.051397182047367096,
0.00007425474905176088,
-0.193333700299263,
0.04713521897792816,
-0.12248886376619339,
-0.16770458221435547,
-0.03536905720829964,
0.07664380222558975,
-0.0009357200469821692,
0.05828671529889107,
-0.020491259172558784,
-0.02363436669111252,
0.1487046331167221,
-0.03400249406695366,
-0.06187143921852112,
-0.13531944155693054,
0.08027205616235733,
-0.023594507947564125,
0.24171431362628937,
0.0072325970977544785,
0.09083796292543411,
0.09026438742876053,
0.01295617874711752,
-0.16992570459842682,
0.039388686418533325,
0.08435937762260437,
-0.08598020672798157,
0.023372111842036247,
0.1418241560459137,
-0.06651004403829575,
0.09902862459421158,
0.033440832048654556,
-0.098798468708992,
-0.01564802974462509,
0.003761555999517441,
-0.008326892741024494,
-0.06303750723600388,
0.009222788736224174,
-0.05219234153628349,
0.16708721220493317,
0.16801966726779938,
-0.03976169601082802,
0.007085609715431929,
-0.05968331918120384,
0.040566690266132355,
0.057015374302864075,
0.043195709586143494,
-0.015645049512386322,
-0.19214676320552826,
0.04714595153927803,
0.02928812988102436,
0.05211737006902695,
-0.15918470919132233,
-0.09231528639793396,
0.01539702620357275,
-0.04889625683426857,
-0.03239239752292633,
0.10284290462732315,
0.040640149265527725,
0.030873557552695274,
-0.02637035772204399,
-0.08384338766336441,
-0.03692132234573364,
0.12432616204023361,
-0.13710607588291168,
-0.07519316673278809
] |
null | null | transformers |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# marian-finetuned-kde4-en-to-fr
This model is a fine-tuned version of [Helsinki-NLP/opus-mt-en-fr](https://huggingface.co/Helsinki-NLP/opus-mt-en-fr) on the kde4 dataset.
It achieves the following results on the evaluation set:
- Loss: 0.7717
- Bleu: 56.2780
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 32
- eval_batch_size: 64
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10
- mixed_precision_training: Native AMP
### Training results
### Framework versions
- Transformers 4.35.2
- Pytorch 2.1.0+cu121
- Datasets 2.17.0
- Tokenizers 0.15.1
| {"license": "apache-2.0", "tags": ["translation", "generated_from_trainer"], "datasets": ["kde4"], "metrics": ["bleu"], "base_model": "Helsinki-NLP/opus-mt-en-fr", "model-index": [{"name": "marian-finetuned-kde4-en-to-fr", "results": [{"task": {"type": "text2text-generation", "name": "Sequence-to-sequence Language Modeling"}, "dataset": {"name": "kde4", "type": "kde4", "config": "en-fr", "split": "train", "args": "en-fr"}, "metrics": [{"type": "bleu", "value": 56.27801913553065, "name": "Bleu"}]}]}]} | translation | Surbhit/marian-finetuned-kde4-en-to-fr | [
"transformers",
"tensorboard",
"safetensors",
"marian",
"text2text-generation",
"translation",
"generated_from_trainer",
"dataset:kde4",
"base_model:Helsinki-NLP/opus-mt-en-fr",
"license:apache-2.0",
"model-index",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | 2024-02-14T17:04:54+00:00 | [] | [] | TAGS
#transformers #tensorboard #safetensors #marian #text2text-generation #translation #generated_from_trainer #dataset-kde4 #base_model-Helsinki-NLP/opus-mt-en-fr #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #region-us
|
# marian-finetuned-kde4-en-to-fr
This model is a fine-tuned version of Helsinki-NLP/opus-mt-en-fr on the kde4 dataset.
It achieves the following results on the evaluation set:
- Loss: 0.7717
- Bleu: 56.2780
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 32
- eval_batch_size: 64
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10
- mixed_precision_training: Native AMP
### Training results
### Framework versions
- Transformers 4.35.2
- Pytorch 2.1.0+cu121
- Datasets 2.17.0
- Tokenizers 0.15.1
| [
"# marian-finetuned-kde4-en-to-fr\n\nThis model is a fine-tuned version of Helsinki-NLP/opus-mt-en-fr on the kde4 dataset.\nIt achieves the following results on the evaluation set:\n- Loss: 0.7717\n- Bleu: 56.2780",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 2e-05\n- train_batch_size: 32\n- eval_batch_size: 64\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- num_epochs: 10\n- mixed_precision_training: Native AMP",
"### Training results",
"### Framework versions\n\n- Transformers 4.35.2\n- Pytorch 2.1.0+cu121\n- Datasets 2.17.0\n- Tokenizers 0.15.1"
] | [
"TAGS\n#transformers #tensorboard #safetensors #marian #text2text-generation #translation #generated_from_trainer #dataset-kde4 #base_model-Helsinki-NLP/opus-mt-en-fr #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #region-us \n",
"# marian-finetuned-kde4-en-to-fr\n\nThis model is a fine-tuned version of Helsinki-NLP/opus-mt-en-fr on the kde4 dataset.\nIt achieves the following results on the evaluation set:\n- Loss: 0.7717\n- Bleu: 56.2780",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 2e-05\n- train_batch_size: 32\n- eval_batch_size: 64\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- num_epochs: 10\n- mixed_precision_training: Native AMP",
"### Training results",
"### Framework versions\n\n- Transformers 4.35.2\n- Pytorch 2.1.0+cu121\n- Datasets 2.17.0\n- Tokenizers 0.15.1"
] | [
94,
73,
6,
12,
8,
3,
103,
4,
33
] | [
"passage: TAGS\n#transformers #tensorboard #safetensors #marian #text2text-generation #translation #generated_from_trainer #dataset-kde4 #base_model-Helsinki-NLP/opus-mt-en-fr #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #region-us \n# marian-finetuned-kde4-en-to-fr\n\nThis model is a fine-tuned version of Helsinki-NLP/opus-mt-en-fr on the kde4 dataset.\nIt achieves the following results on the evaluation set:\n- Loss: 0.7717\n- Bleu: 56.2780## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 2e-05\n- train_batch_size: 32\n- eval_batch_size: 64\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- num_epochs: 10\n- mixed_precision_training: Native AMP### Training results### Framework versions\n\n- Transformers 4.35.2\n- Pytorch 2.1.0+cu121\n- Datasets 2.17.0\n- Tokenizers 0.15.1"
] | [
-0.11566229164600372,
0.10733175277709961,
-0.0037832316011190414,
0.07883384823799133,
0.08642066270112991,
0.023925185203552246,
0.08523435145616531,
0.14701443910598755,
-0.04311110079288483,
0.07773956656455994,
0.05477011576294899,
0.07469512522220612,
0.07256149500608444,
0.14304736256599426,
-0.04697411507368088,
-0.20723652839660645,
0.04459540545940399,
-0.013831560499966145,
-0.11005272716283798,
0.08238302916288376,
0.11641708761453629,
-0.07795289903879166,
0.04272221773862839,
0.028061751276254654,
-0.08024006336927414,
0.02500026486814022,
-0.00019965336832683533,
-0.0543309822678566,
0.08600214868783951,
0.06544613093137741,
0.10150439292192459,
0.02909691073000431,
0.08402449637651443,
-0.22585532069206238,
0.0065286182798445225,
0.06659013777971268,
0.023704182356595993,
0.06427492946386337,
0.10137856751680374,
0.00797863956540823,
0.14774943888187408,
-0.12094009667634964,
0.08367272466421127,
0.008540336973965168,
-0.04449104145169258,
-0.17036674916744232,
-0.08725452423095703,
0.0697905644774437,
0.1145218014717102,
0.09020823240280151,
-0.01775733195245266,
0.10581783205270767,
-0.09884846210479736,
0.09290479123592377,
0.14605562388896942,
-0.2580016553401947,
-0.04715880751609802,
0.06360357999801636,
0.0408688448369503,
0.010252730920910835,
-0.08684717863798141,
-0.007541245315223932,
0.04090465232729912,
0.0014624278992414474,
0.032143693417310715,
-0.013657012954354286,
-0.05418039858341217,
0.007320764008909464,
-0.10176217555999756,
-0.019562900066375732,
0.17217232286930084,
0.06328245252370834,
-0.01784379966557026,
-0.11897191405296326,
-0.02025677263736725,
-0.08164864033460617,
-0.00029387781978584826,
-0.057920899242162704,
0.01243954710662365,
-0.037505678832530975,
-0.021167857572436333,
-0.025883985683321953,
-0.0869867131114006,
-0.06189524009823799,
0.019594144076108932,
0.09567102789878845,
0.054137635976076126,
0.00965297780930996,
-0.017385046929121017,
0.11149648576974869,
-0.012906810268759727,
-0.11025302857160568,
-0.04128085449337959,
-0.012007917277514935,
-0.10556913912296295,
-0.051879558712244034,
-0.03992193564772606,
-0.08966967463493347,
0.024689724668860435,
0.0866001695394516,
-0.04979582875967026,
0.08012263476848602,
0.03212197497487068,
0.0045697554014623165,
-0.023499002680182457,
0.12414161115884781,
-0.05556555837392807,
-0.06980223953723907,
-0.006973235867917538,
0.09296468645334244,
-0.012514833360910416,
-0.00013987637066747993,
-0.05080423876643181,
-0.03809996694326401,
0.07849457114934921,
0.05752784013748169,
-0.054636359214782715,
0.04785674810409546,
-0.05093023553490639,
-0.007940652780234814,
-0.007732335943728685,
-0.14406955242156982,
0.04872943088412285,
0.025158418342471123,
-0.10850612074136734,
-0.011254841461777687,
0.045689892023801804,
-0.015992818400263786,
-0.05274365842342377,
0.10229037702083588,
-0.040327560156583786,
-0.01723634824156761,
-0.07761085778474808,
-0.08284536749124527,
0.014865581877529621,
-0.025701895356178284,
-0.004994600545614958,
-0.08358050137758255,
-0.14961360394954681,
-0.06112043187022209,
0.061169106513261795,
-0.05809566751122475,
-0.028884997591376305,
-0.0830010250210762,
-0.06709892302751541,
0.045836009085178375,
-0.011557206511497498,
0.10048503428697586,
-0.044254135340452194,
0.046887751668691635,
-0.038943205028772354,
0.027559053152799606,
0.06381280720233917,
0.02640806883573532,
-0.04842279851436615,
0.04556482657790184,
-0.08214852958917618,
0.10297828912734985,
-0.10383288562297821,
-0.018986137583851814,
-0.11756458133459091,
-0.07397062331438065,
0.0018895670073106885,
0.004856526851654053,
0.10668335855007172,
0.12123364955186844,
-0.18079660832881927,
-0.02076900377869606,
0.13240672647953033,
-0.09267682582139969,
-0.07264705747365952,
0.07062824815511703,
-0.03673255443572998,
0.02356809191405773,
0.057723261415958405,
0.15089751780033112,
0.15465176105499268,
-0.11454442143440247,
-0.046288471668958664,
0.020251993089914322,
0.048270922154188156,
0.027787648141384125,
0.034353695809841156,
0.004121843259781599,
-0.008750312961637974,
0.037037193775177,
-0.06975042819976807,
0.026803500950336456,
-0.04483998566865921,
-0.08393315970897675,
-0.043311260640621185,
-0.059649888426065445,
-0.005085889250040054,
0.023734549060463905,
0.04562956094741821,
-0.05926577001810074,
-0.10137399286031723,
0.10367443412542343,
0.14740602672100067,
-0.07309849560260773,
0.017353756353259087,
-0.08780151605606079,
0.08201032876968384,
-0.024314355105161667,
-0.007832936942577362,
-0.16502641141414642,
-0.08953113108873367,
0.01952228881418705,
-0.12702423334121704,
-0.025713322684168816,
0.05443890020251274,
0.0783013179898262,
0.057601314038038254,
-0.047310635447502136,
-0.042950715869665146,
-0.0713561475276947,
0.005047690123319626,
-0.07514003664255142,
-0.19760547578334808,
-0.012316159904003143,
-0.02757980301976204,
0.15742355585098267,
-0.2651865780353546,
-0.0053076064214110374,
0.019605349749326706,
0.1697753518819809,
0.014619380235671997,
-0.03409448266029358,
-0.007243876811116934,
0.013195074163377285,
-0.010241590440273285,
-0.08227778226137161,
0.032615069299936295,
0.009940283372998238,
-0.10771003365516663,
0.035871487110853195,
-0.12702316045761108,
-0.021965600550174713,
0.06422235071659088,
0.1074042022228241,
-0.12487087398767471,
-0.026315180584788322,
-0.05130043998360634,
-0.05153120309114456,
-0.056891366839408875,
0.014963961206376553,
0.15790817141532898,
0.016584066674113274,
0.12329518049955368,
-0.06479830294847488,
-0.0746721625328064,
0.004307262599468231,
-0.0004055570752825588,
-0.04004592448472977,
0.13150008022785187,
0.022629326209425926,
-0.1176433190703392,
0.047604139894247055,
0.053171198815107346,
-0.07023808360099792,
0.19148258864879608,
-0.050072263926267624,
-0.10181073844432831,
-0.04419548064470291,
0.04561196267604828,
0.009775334037840366,
0.14266426861286163,
-0.03894202783703804,
0.040009237825870514,
0.028088174760341644,
0.04392202943563461,
0.045151595026254654,
-0.15396331250667572,
0.004612652584910393,
0.033773522824048996,
-0.0541803240776062,
0.031100038439035416,
0.011666619218885899,
0.026096461340785027,
0.08821391314268112,
-0.0048044282011687756,
-0.048013221472501755,
-0.006503545679152012,
-0.032911818474531174,
-0.07415638118982315,
0.19870516657829285,
-0.12453339993953705,
-0.15582644939422607,
-0.13062340021133423,
0.09034334868192673,
-0.06018679589033127,
-0.052091531455516815,
0.013531149365007877,
-0.08332518488168716,
-0.07405826449394226,
-0.11207995563745499,
0.019824910908937454,
-0.026042921468615532,
-0.016167614609003067,
-0.019564401358366013,
0.04634798318147659,
0.08349570631980896,
-0.13945184648036957,
0.022956451401114464,
-0.013655777089297771,
-0.05452820658683777,
-0.006545696873217821,
0.02625207044184208,
0.042667437344789505,
0.11090689897537231,
-0.01497098058462143,
0.042667455971241,
-0.013802935369312763,
0.161943718791008,
-0.07712503522634506,
0.021338116377592087,
0.05918703228235245,
0.022956551983952522,
0.02974303811788559,
0.1436649113893509,
0.007945111021399498,
-0.0755317360162735,
0.020931372418999672,
0.07778049260377884,
0.009779318235814571,
-0.2696828842163086,
-0.031953733414411545,
-0.04138292744755745,
-0.03197615221142769,
0.0840296670794487,
0.05524888262152672,
-0.02434774674475193,
0.058073241263628006,
-0.0005712928832508624,
0.0010228377068415284,
0.04351689666509628,
0.05359581485390663,
0.08450828492641449,
0.051827121526002884,
0.07610146701335907,
-0.045106176286935806,
0.011442923918366432,
0.07039479911327362,
0.026073355227708817,
0.25129175186157227,
-0.07884878665208817,
0.06251587718725204,
0.055647075176239014,
0.13208669424057007,
-0.017872948199510574,
0.043663837015628815,
0.023341283202171326,
-0.006165439262986183,
0.0413016714155674,
-0.0654364749789238,
-0.007402464281767607,
0.03419487923383713,
0.009532677941024303,
0.029815226793289185,
-0.08387133479118347,
0.055367160588502884,
0.02017601579427719,
0.21847595274448395,
0.08999238908290863,
-0.24905145168304443,
-0.07253307104110718,
0.016246557235717773,
-0.011813540942966938,
-0.07965224236249924,
0.010123401880264282,
0.14565719664096832,
-0.12621885538101196,
0.06458587199449539,
-0.06608562171459198,
0.08070431649684906,
-0.06151026114821434,
-0.05752267688512802,
0.057027772068977356,
0.10190428048372269,
0.0051246401853859425,
0.09483151882886887,
-0.18589870631694794,
0.22774633765220642,
0.0032075161579996347,
0.12055288255214691,
-0.022978810593485832,
0.04417673870921135,
0.02649698778986931,
0.06719006597995758,
0.1124434843659401,
0.03108402155339718,
-0.14321644604206085,
-0.15027019381523132,
-0.09196993708610535,
0.030777741223573685,
0.0636046975851059,
-0.018310729414224625,
0.07387632131576538,
-0.0533648356795311,
-0.0029027655255049467,
0.021257005631923676,
-0.0460575632750988,
-0.1762838214635849,
-0.15190058946609497,
0.0024286098778247833,
0.03289347141981125,
-0.022744877263903618,
-0.0881178230047226,
-0.10925894975662231,
-0.023936588317155838,
0.21503306925296783,
-0.017450429499149323,
-0.04735856130719185,
-0.12760768830776215,
0.050377704203128815,
0.1315198391675949,
-0.06859958916902542,
0.007222488056868315,
0.011095443740487099,
0.12085139751434326,
0.020855814218521118,
-0.057275619357824326,
0.04851292446255684,
-0.06366019695997238,
-0.14263439178466797,
-0.0427020899951458,
0.12611272931098938,
0.05904604494571686,
0.024646587669849396,
0.03353355452418327,
0.012255196459591389,
0.025387998670339584,
-0.07648178189992905,
-0.025707552209496498,
0.11204695701599121,
0.05262893810868263,
0.056761521846055984,
-0.07399420440196991,
-0.017841165885329247,
-0.04049697518348694,
-0.04518001526594162,
0.10119973123073578,
0.2213808000087738,
-0.08961273729801178,
0.0783163532614708,
0.03113807551562786,
-0.10645097494125366,
-0.16837550699710846,
0.05056777223944664,
0.12373325228691101,
0.06625894457101822,
0.01981537975370884,
-0.12922009825706482,
0.06674881279468536,
0.09087452292442322,
-0.01648642309010029,
0.034348659217357635,
-0.30711862444877625,
-0.1511627435684204,
0.09305030107498169,
0.13136179745197296,
0.01603016071021557,
-0.08689635246992111,
-0.04106789827346802,
-0.030565759167075157,
-0.11366289854049683,
0.0787070021033287,
-0.05057393014431,
0.09917566180229187,
-0.016656318679451942,
0.029347103089094162,
0.048492785543203354,
-0.03574027493596077,
0.16439345479011536,
0.007648013532161713,
0.060810066759586334,
-0.06390135735273361,
0.09470023214817047,
0.03586887940764427,
-0.10002446919679642,
0.07564355432987213,
-0.04949057474732399,
0.0642186850309372,
-0.16290301084518433,
-0.0221049003303051,
-0.06063997000455856,
0.08446674048900604,
-0.060349781066179276,
-0.033551838248968124,
-0.06427163630723953,
0.08640774339437485,
0.1041647270321846,
-0.0340573787689209,
0.05321762338280678,
-0.006969660986214876,
0.06192983314394951,
0.08655005693435669,
0.08889451622962952,
0.07863914966583252,
-0.09347305446863174,
0.035033367574214935,
-0.00010102830128744245,
0.057445067912340164,
-0.09342027455568314,
0.031309179961681366,
0.13187669217586517,
-0.0065974523313343525,
0.1341772824525833,
0.013747187331318855,
-0.08507760614156723,
-0.0052286856807768345,
0.06908543407917023,
-0.0947624146938324,
-0.12045789510011673,
-0.0122879259288311,
-0.042398642748594284,
-0.07800517976284027,
-0.021264124661684036,
0.1306532472372055,
-0.07143057882785797,
0.0019163702381774783,
-0.01414825115352869,
0.028727086260914803,
-0.04096032306551933,
0.19693641364574432,
0.017272448167204857,
0.06645461916923523,
-0.05171271786093712,
0.12668342888355255,
0.04095533490180969,
-0.11476081609725952,
0.05556872859597206,
0.037014540284872055,
-0.09316854923963547,
-0.029392492026090622,
0.07824114710092545,
0.16250085830688477,
0.014117926359176636,
-0.09542106091976166,
-0.08907124400138855,
-0.10313962399959564,
0.03836105391383171,
0.05237254500389099,
0.03548344597220421,
0.030425546690821648,
-0.01081746444106102,
-0.02785715088248253,
-0.12777192890644073,
0.11761568486690521,
0.09883643686771393,
0.03806716203689575,
-0.12886974215507507,
0.0897555723786354,
0.006503426004201174,
0.013424496166408062,
-0.0005934311775490642,
0.015082521364092827,
-0.09757085889577866,
-0.02981521561741829,
-0.12055560946464539,
0.024090666323900223,
-0.0368453674018383,
0.008629810996353626,
-0.015703104436397552,
-0.007010337430983782,
-0.048555441200733185,
0.05221725255250931,
-0.06967591494321823,
-0.03740261122584343,
-0.015141356736421585,
0.06812284141778946,
-0.1156148687005043,
-0.026706041768193245,
0.006174969021230936,
-0.10795751959085464,
0.050814948976039886,
0.047034852206707,
0.02756987139582634,
0.041920579969882965,
-0.10255858302116394,
0.026192842051386833,
0.0011869193986058235,
0.029801517724990845,
0.033497437834739685,
-0.1549326479434967,
0.022213879972696304,
-0.007693542633205652,
0.024948298931121826,
0.005590681917965412,
0.009376495145261288,
-0.11237616091966629,
-0.046556055545806885,
-0.07188443839550018,
-0.03932718560099602,
-0.06430013477802277,
0.07371991872787476,
0.05489601567387581,
0.05783938616514206,
0.17424853146076202,
-0.1020316630601883,
0.024677816778421402,
-0.17090843617916107,
-0.03749686852097511,
0.02230922318994999,
-0.020017195492982864,
-0.020859282463788986,
-0.030670875683426857,
0.08342979848384857,
-0.05572108179330826,
0.09898030012845993,
0.0035445354878902435,
0.030592819675803185,
0.035685185343027115,
-0.1264447271823883,
0.01049575861543417,
0.03155507147312164,
0.13950012624263763,
0.007170714903622866,
0.000761022325605154,
0.04544728249311447,
-0.022058425471186638,
0.025765584781765938,
0.08731591701507568,
0.1356573849916458,
0.17234669625759125,
-0.015001771040260792,
0.09451204538345337,
0.039657384157180786,
-0.0888129249215126,
-0.0634625181555748,
0.09520821273326874,
-0.10979990661144257,
0.11728247255086899,
-0.0352647490799427,
0.1569673717021942,
0.08877598494291306,
-0.20457793772220612,
0.0748346820473671,
-0.07646045833826065,
-0.12864261865615845,
-0.12203200906515121,
-0.12689274549484253,
-0.10178551077842712,
-0.12090516090393066,
0.03908892348408699,
-0.11763160675764084,
0.03527263179421425,
0.04636943340301514,
0.05988139659166336,
-0.003639311995357275,
0.13034161925315857,
-0.045507218688726425,
0.005128085147589445,
0.08246204257011414,
0.024485019966959953,
0.005780987907201052,
-0.0431586392223835,
-0.029282499104738235,
0.02470826916396618,
-0.035387732088565826,
0.030573029071092606,
-0.022569136694073677,
0.020622143521904945,
0.028823111206293106,
0.026226289570331573,
-0.043512627482414246,
0.018087811768054962,
-0.0015158524038270116,
0.0572725348174572,
0.04240730032324791,
0.09128499776124954,
-0.018801704049110413,
-0.04777275770902634,
0.27257150411605835,
-0.04567199945449829,
-0.07744460552930832,
-0.16219031810760498,
0.11175847798585892,
0.04846441373229027,
0.016401147469878197,
0.061981379985809326,
-0.11247025430202484,
-0.012901394627988338,
0.12640301883220673,
0.12355662137269974,
-0.006653378251940012,
-0.02228081040084362,
-0.024605082347989082,
-0.020208103582262993,
-0.04405607655644417,
0.10085383802652359,
0.06260323524475098,
0.04572957754135132,
-0.0283419918268919,
-0.00868457742035389,
-0.03801128640770912,
-0.013276596553623676,
-0.022297821938991547,
0.12223885208368301,
0.0033612377010285854,
-0.03223917633295059,
-0.03640644997358322,
0.0741596445441246,
0.0059390258975327015,
-0.17790904641151428,
0.06765247881412506,
-0.14451368153095245,
-0.1778515875339508,
-0.057002272456884384,
0.09502865374088287,
-0.03335103765130043,
0.050881773233413696,
-0.019372321665287018,
-0.00825403816998005,
0.12446745485067368,
-0.008631322532892227,
-0.05272124335169792,
-0.1167202964425087,
0.01771656423807144,
-0.057268064469099045,
0.21917179226875305,
0.03658452257514,
0.061421800404787064,
0.11133365333080292,
0.007865598425269127,
-0.11410722881555557,
0.05567217245697975,
0.08010849356651306,
-0.07100282609462738,
0.054239582270383835,
0.16521519422531128,
-0.04682028666138649,
0.09640233218669891,
0.048261333256959915,
-0.09081467986106873,
0.02636461704969406,
-0.08333903551101685,
-0.03262238949537277,
-0.08726668357849121,
0.03937636688351631,
-0.05618371069431305,
0.1435554325580597,
0.24088898301124573,
-0.031076354905962944,
0.000778853427618742,
-0.07936521619558334,
0.04966326430439949,
0.015199247747659683,
0.09242710471153259,
0.03027508780360222,
-0.1731349527835846,
0.02123359777033329,
0.010554703883826733,
0.05241512507200241,
-0.2199030965566635,
-0.10242754220962524,
0.014437209814786911,
-0.09654983133077621,
-0.05212731659412384,
0.14183500409126282,
0.053613319993019104,
0.04761567339301109,
-0.042729005217552185,
-0.10580593347549438,
-0.014597315341234207,
0.12027202546596527,
-0.10350962728261948,
-0.08346009999513626
] |
null | null | transformers |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# DataSnipper_FinerDistilBert_FullSequence
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0913
- Precision: 0.8606
- Recall: 0.8141
- F1: 0.8367
- Accuracy: 0.9170
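As a usage illustration (not part of the original card), the checkpoint can be queried through the token-classification pipeline. The model id is assumed from the repository name and the input sentence is arbitrary:

```python
from transformers import pipeline

# Minimal inference sketch; aggregation_strategy="simple" merges sub-word
# pieces back into whole entity spans.
tagger = pipeline(
    "token-classification",
    model="gvisser/DataSnipper_FinerDistilBert_FullSequence",
    aggregation_strategy="simple",
)

print(tagger("Net revenue increased to $12.4 million for the quarter."))
```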
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5
### Training results
| Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1 | Accuracy |
|:-------------:|:-----:|:------:|:---------------:|:---------:|:------:|:------:|:--------:|
| 0.0108 | 1.0 | 56274 | 0.1081 | 0.7908 | 0.7278 | 0.7580 | 0.8778 |
| 0.0082 | 2.0 | 112548 | 0.0950 | 0.8250 | 0.7870 | 0.8056 | 0.9022 |
| 0.0066 | 3.0 | 168822 | 0.0893 | 0.8471 | 0.7902 | 0.8177 | 0.9065 |
| 0.0052 | 4.0 | 225096 | 0.0898 | 0.8585 | 0.8107 | 0.8339 | 0.9160 |
| 0.0043 | 5.0 | 281370 | 0.0913 | 0.8606 | 0.8141 | 0.8367 | 0.9170 |
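For context, precision/recall/F1/accuracy figures like those above are typically computed with the `seqeval` metric over aligned word-level labels. The sketch below is illustrative, not taken from this training run; `label_list` is a placeholder for the model's label names:

```python
import numpy as np
import evaluate

seqeval = evaluate.load("seqeval")

def compute_metrics(eval_pred, label_list):
    """Illustrative seqeval-based metric computation for token classification."""
    logits, labels = eval_pred
    predictions = np.argmax(logits, axis=-1)
    # Drop positions labelled -100 (special tokens / sub-word continuations).
    references = [
        [label_list[l] for l in label if l != -100] for label in labels
    ]
    hypotheses = [
        [label_list[p] for p, l in zip(pred, label) if l != -100]
        for pred, label in zip(predictions, labels)
    ]
    results = seqeval.compute(predictions=hypotheses, references=references)
    return {
        "precision": results["overall_precision"],
        "recall": results["overall_recall"],
        "f1": results["overall_f1"],
        "accuracy": results["overall_accuracy"],
    }
```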
### Framework versions
- Transformers 4.37.2
- Pytorch 2.2.0+cu121
- Datasets 2.17.0
- Tokenizers 0.15.2
| {"license": "apache-2.0", "tags": ["generated_from_trainer"], "metrics": ["precision", "recall", "f1", "accuracy"], "base_model": "distilbert-base-uncased", "model-index": [{"name": "DataSnipper_FinerDistilBert_FullSequence", "results": []}]} | token-classification | gvisser/DataSnipper_FinerDistilBert_FullSequence | [
"transformers",
"safetensors",
"distilbert",
"token-classification",
"generated_from_trainer",
"base_model:distilbert-base-uncased",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | 2024-02-14T17:06:03+00:00 | [] | [] | TAGS
#transformers #safetensors #distilbert #token-classification #generated_from_trainer #base_model-distilbert-base-uncased #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us
| DataSnipper\_FinerDistilBert\_FullSequence
==========================================
This model is a fine-tuned version of distilbert-base-uncased on an unknown dataset.
It achieves the following results on the evaluation set:
* Loss: 0.0913
* Precision: 0.8606
* Recall: 0.8141
* F1: 0.8367
* Accuracy: 0.9170
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 1e-05
* train\_batch\_size: 16
* eval\_batch\_size: 16
* seed: 42
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* num\_epochs: 5
### Training results
### Framework versions
* Transformers 4.37.2
* Pytorch 2.2.0+cu121
* Datasets 2.17.0
* Tokenizers 0.15.2
| [
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 1e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 5",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.37.2\n* Pytorch 2.2.0+cu121\n* Datasets 2.17.0\n* Tokenizers 0.15.2"
] | [
"TAGS\n#transformers #safetensors #distilbert #token-classification #generated_from_trainer #base_model-distilbert-base-uncased #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 1e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 5",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.37.2\n* Pytorch 2.2.0+cu121\n* Datasets 2.17.0\n* Tokenizers 0.15.2"
] | [
69,
98,
4,
33
] | [
"passage: TAGS\n#transformers #safetensors #distilbert #token-classification #generated_from_trainer #base_model-distilbert-base-uncased #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 1e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 5### Training results### Framework versions\n\n\n* Transformers 4.37.2\n* Pytorch 2.2.0+cu121\n* Datasets 2.17.0\n* Tokenizers 0.15.2"
] | [
-0.10113692283630371,
0.09854251146316528,
-0.0020981444977223873,
0.11675982177257538,
0.15007472038269043,
0.016674140468239784,
0.14083372056484222,
0.09336518496274948,
-0.0693931058049202,
0.041790250688791275,
0.13069306313991547,
0.13961884379386902,
-0.0011912722839042544,
0.13946881890296936,
-0.07611535489559174,
-0.21230889856815338,
0.025403767824172974,
0.018102366477251053,
-0.04222672060132027,
0.11217347532510757,
0.10180122405290604,
-0.1293199211359024,
0.09126437455415726,
-0.007971265353262424,
-0.17155222594738007,
-0.00022502655338030308,
0.023572787642478943,
-0.0455649197101593,
0.13050049543380737,
0.01731264963746071,
0.13252678513526917,
0.018569624051451683,
0.09969417005777359,
-0.19356080889701843,
0.0024367424193769693,
0.04927242547273636,
0.006795316468924284,
0.0711490735411644,
0.02471722848713398,
0.004580073989927769,
0.0558457188308239,
-0.07950488477945328,
0.06281482428312302,
0.01838003844022751,
-0.1283608227968216,
-0.214202418923378,
-0.0921553522348404,
0.04026437923312187,
0.10212742537260056,
0.0672823041677475,
-0.0053784726187586784,
0.12785440683364868,
-0.08132804185152054,
0.08214996755123138,
0.20774583518505096,
-0.3187797963619232,
-0.061046354472637177,
0.06047280505299568,
0.01761031150817871,
0.06182611361145973,
-0.1016804501414299,
-0.038692228496074677,
0.06446174532175064,
0.025591878220438957,
0.1356004923582077,
-0.02588149905204773,
-0.08707619458436966,
0.0010585482232272625,
-0.14727142453193665,
-0.020844124257564545,
0.15810266137123108,
0.058772340416908264,
-0.06382808089256287,
-0.04062242805957794,
-0.06430820375680923,
-0.14323489367961884,
-0.03953394293785095,
-0.02633344754576683,
0.05337339639663696,
-0.019248012453317642,
-0.04852132126688957,
0.009325094521045685,
-0.09006806463003159,
-0.08040658384561539,
-0.05062445253133774,
0.1796986311674118,
0.04095114395022392,
0.0019953136797994375,
0.008783551864326,
0.10810713469982147,
-0.05596086382865906,
-0.12605291604995728,
0.003599392483010888,
0.012983068823814392,
0.011925836093723774,
-0.06345134228467941,
-0.053952332586050034,
-0.02343347854912281,
0.024455750361084938,
0.18170471489429474,
-0.060842473059892654,
0.03265024349093437,
0.03092184104025364,
0.03605886921286583,
-0.10286176204681396,
0.15441347658634186,
-0.030840417370200157,
-0.03516792505979538,
0.028754951432347298,
0.07419814169406891,
0.04713215306401253,
0.00743524543941021,
-0.11141318082809448,
0.017455004155635834,
0.11334340274333954,
0.021392803639173508,
-0.08321678638458252,
0.07136589288711548,
-0.06410149484872818,
-0.000059044330555479974,
0.038653500378131866,
-0.08604523539543152,
0.022454196587204933,
-0.00955484714359045,
-0.04920737072825432,
-0.07041358947753906,
0.023291025310754776,
0.03156294301152229,
0.0249774269759655,
0.10578681528568268,
-0.0918976217508316,
0.0038726951461285353,
-0.0836319625377655,
-0.10687141865491867,
0.004159584175795317,
-0.08303997665643692,
0.03754490613937378,
-0.11468464136123657,
-0.19986547529697418,
-0.0013040258781984448,
0.05494625121355057,
-0.016841687262058258,
-0.036362022161483765,
-0.052660372108221054,
-0.07457007467746735,
0.004409406334161758,
-0.013923502527177334,
0.05320937931537628,
-0.0688907727599144,
0.09471829980611801,
0.04849359765648842,
0.06244314834475517,
-0.05422411486506462,
0.0343795008957386,
-0.1260780543088913,
0.04210924729704857,
-0.18839238584041595,
0.01286033820360899,
-0.07105755805969238,
0.07131480425596237,
-0.07090634107589722,
-0.08042381703853607,
0.007495918311178684,
-0.006634373217821121,
0.06929462403059006,
0.10204397886991501,
-0.14947305619716644,
-0.04815734550356865,
0.16269730031490326,
-0.09929778426885605,
-0.1466144472360611,
0.124217189848423,
-0.058309826999902725,
0.057038139551877975,
0.061650682240724564,
0.16269551217556,
0.06781866401433945,
-0.0930105596780777,
-0.003462308319285512,
-0.008444111794233322,
0.051643237471580505,
-0.02985774353146553,
0.06965915113687515,
0.006337613798677921,
-0.030373552814126015,
0.024808626621961594,
-0.06765013188123703,
0.04896319657564163,
-0.0878525823354721,
-0.08653268218040466,
-0.05172258988022804,
-0.11399015039205551,
0.07163461297750473,
0.04055226594209671,
0.055797141045331955,
-0.1164117082953453,
-0.07462561875581741,
0.07074572890996933,
0.0851360633969307,
-0.05667656287550926,
0.008884628303349018,
-0.07102666795253754,
0.08791039884090424,
-0.05799850448966026,
-0.02748906798660755,
-0.14470136165618896,
-0.059275832027196884,
0.02551497146487236,
0.00027363805565983057,
0.004649180918931961,
-0.0197808388620615,
0.06539490818977356,
0.09793636947870255,
-0.06903085112571716,
-0.04242952913045883,
-0.02860141173005104,
0.025984324514865875,
-0.11958400160074234,
-0.18420639634132385,
-0.027743225917220116,
-0.028307132422924042,
0.13990941643714905,
-0.2288532853126526,
0.04937785863876343,
-0.026036275550723076,
0.09216881543397903,
0.03674257919192314,
-0.01134558767080307,
-0.04970931634306908,
0.07220610976219177,
-0.04569542407989502,
-0.06545663625001907,
0.051009465008974075,
0.012989477254450321,
-0.08197513967752457,
-0.06467898935079575,
-0.12338275462388992,
0.20016448199748993,
0.1220288947224617,
-0.07832857966423035,
-0.08933991938829422,
-0.011116014793515205,
-0.043893229216337204,
-0.027516238391399384,
-0.056606657803058624,
0.013299091719090939,
0.11171884089708328,
-0.020653581246733665,
0.14044013619422913,
-0.08402866125106812,
-0.028317062184214592,
0.019089294597506523,
-0.056188467890024185,
0.025199010968208313,
0.08868040889501572,
0.09582410007715225,
-0.1152808889746666,
0.15478582680225372,
0.19035279750823975,
-0.08926327526569366,
0.11031673848628998,
-0.04518962651491165,
-0.048196062445640564,
-0.029739446938037872,
0.013307985849678516,
0.004604261368513107,
0.11404528468847275,
-0.10493362694978714,
0.01465608086436987,
0.010270214639604092,
0.02732907049357891,
-0.0022595704067498446,
-0.2138078510761261,
-0.03404950723052025,
0.03999265655875206,
-0.04128933697938919,
-0.0012755251955240965,
-0.017825324088335037,
-0.008266553282737732,
0.0884658694267273,
-0.002113251481205225,
-0.095208540558815,
0.05570576339960098,
0.00850037019699812,
-0.07787914574146271,
0.2045956552028656,
-0.09226491302251816,
-0.09906017035245895,
-0.1265895515680313,
-0.07116532325744629,
-0.05185812711715698,
0.03778505697846413,
0.06957171112298965,
-0.057095833122730255,
-0.047042448073625565,
-0.09720488637685776,
-0.0051005007699131966,
0.04735046997666359,
0.027263019233942032,
0.015720965340733528,
-0.002759548369795084,
0.08837098628282547,
-0.09638544917106628,
-0.018350059166550636,
-0.0326586589217186,
-0.06273680925369263,
0.037421196699142456,
0.03551129996776581,
0.11260012537240982,
0.12813186645507812,
-0.0285627543926239,
-0.01456755492836237,
-0.02548890747129917,
0.2569655478000641,
-0.04677347093820572,
-0.021809663623571396,
0.131544291973114,
-0.01607261225581169,
0.04925645515322685,
0.14277496933937073,
0.061143554747104645,
-0.10922848433256149,
0.023354116827249527,
0.029153283685445786,
-0.027620496228337288,
-0.19210968911647797,
-0.04402365908026695,
-0.03560673072934151,
-0.011049332097172737,
0.0952649936079979,
0.023100735619664192,
0.024080919101834297,
0.0793062075972557,
0.023364372551441193,
0.08488041907548904,
-0.01633259654045105,
0.08585768938064575,
0.11275374889373779,
0.049457039684057236,
0.12291491776704788,
-0.033878978341817856,
-0.05709240213036537,
0.032871052622795105,
-0.0015911925584077835,
0.20859019458293915,
0.03336828574538231,
0.10501167178153992,
0.05869598314166069,
0.18407005071640015,
-0.0035021970979869366,
0.0759219229221344,
-0.002705323975533247,
-0.04532482475042343,
-0.01422678492963314,
-0.049336064606904984,
-0.04289810359477997,
0.04432777315378189,
-0.10354619473218918,
0.0761365070939064,
-0.11299426853656769,
0.023801511153578758,
0.06108316034078598,
0.2545101046562195,
0.04958837106823921,
-0.330807626247406,
-0.1021120697259903,
0.02246319130063057,
-0.02523583360016346,
-0.02995501458644867,
0.03715746849775314,
0.10375472158193588,
-0.05865012854337692,
0.028196614235639572,
-0.056403081864118576,
0.07531296461820602,
-0.017942657694220543,
0.04484333097934723,
0.05727943032979965,
0.08528082817792892,
-0.0025988535489887,
0.06474139541387558,
-0.25924333930015564,
0.262197881937027,
0.011606968939304352,
0.07735446095466614,
-0.03818734735250473,
0.000735593494027853,
0.0351586639881134,
0.10979370772838593,
0.08044447749853134,
-0.008379229344427586,
-0.06782318651676178,
-0.2180105596780777,
-0.04196811467409134,
0.024550536647439003,
0.08704078942537308,
-0.0374862439930439,
0.10421399027109146,
-0.04160409793257713,
0.006334047298878431,
0.08114242553710938,
-0.00888415053486824,
-0.09677289426326752,
-0.07449567317962646,
-0.027997741475701332,
0.042853567749261856,
0.015489079058170319,
-0.08883555233478546,
-0.08748546242713928,
-0.12880314886569977,
0.14749249815940857,
-0.052309535443782806,
-0.028853515163064003,
-0.09901222586631775,
0.03901628404855728,
0.05713581293821335,
-0.07182841747999191,
0.06608500331640244,
0.006041824351996183,
0.08490435779094696,
0.02480200119316578,
-0.04405693709850311,
0.12276533991098404,
-0.08467937260866165,
-0.18743853271007538,
-0.07493564486503601,
0.09706556797027588,
0.022401228547096252,
0.038388632237911224,
0.003737193299457431,
0.017742475494742393,
-0.011357279494404793,
-0.0848853588104248,
0.015602889470756054,
-0.0017331377603113651,
0.06462827324867249,
0.03268802911043167,
-0.07048829644918442,
0.007873921655118465,
-0.04790030047297478,
-0.029439399018883705,
0.14623112976551056,
0.30592048168182373,
-0.09358472377061844,
-0.0027074753306806087,
0.06298509240150452,
-0.053368933498859406,
-0.18973635137081146,
0.024319389835000038,
0.027511952444911003,
-0.0024159958120435476,
0.05454498529434204,
-0.13564114272594452,
0.1332717090845108,
0.12144836038351059,
-0.03138576075434685,
0.10118959844112396,
-0.2644922435283661,
-0.1271534264087677,
0.13651646673679352,
0.15613295137882233,
0.135247141122818,
-0.138701930642128,
-0.02737896330654621,
-0.03380706161260605,
-0.1394719034433365,
0.09794019162654877,
-0.09836193919181824,
0.10373443365097046,
-0.01645408570766449,
0.055338066071271896,
0.002443612553179264,
-0.05052873492240906,
0.13543574512004852,
0.010455094277858734,
0.12823840975761414,
-0.05825074389576912,
-0.01851074770092964,
0.0366588830947876,
-0.0587020181119442,
0.029486577957868576,
-0.08872634917497635,
0.05223063752055168,
-0.049564119428396225,
-0.025792323052883148,
-0.043891336768865585,
0.04531339555978775,
-0.034055713564157486,
-0.07633236050605774,
-0.04094740375876427,
0.03290732577443123,
0.04551256448030472,
-0.015387533232569695,
0.1367899477481842,
0.029579970985651016,
0.15107092261314392,
0.12206294387578964,
0.06139418110251427,
-0.08273641020059586,
-0.01709972694516182,
-0.009823168627917767,
-0.03889748081564903,
0.06722202897071838,
-0.1249266192317009,
0.04486856237053871,
0.11771813780069351,
0.014492487534880638,
0.14672861993312836,
0.07496268302202225,
-0.009421116672456264,
0.009037988260388374,
0.061659276485443115,
-0.1690969169139862,
-0.07889754325151443,
0.0009726437856443226,
-0.021400652825832367,
-0.11380566656589508,
0.07350572198629379,
0.11189605295658112,
-0.0827033668756485,
0.004143127240240574,
-0.01834283210337162,
0.014804855920374393,
-0.04776899144053459,
0.17121414840221405,
0.06218581646680832,
0.04964567348361015,
-0.07994332164525986,
0.08051375299692154,
0.04506276920437813,
-0.04103487357497215,
-0.007986416108906269,
0.012320922687649727,
-0.09754344075918198,
-0.0420246347784996,
0.04680229723453522,
0.17306968569755554,
-0.05094918608665466,
-0.05086538568139076,
-0.13084204494953156,
-0.11928031593561172,
0.05008842796087265,
0.16952219605445862,
0.10930637270212173,
0.022271646186709404,
-0.021340930834412575,
0.015530860982835293,
-0.10987075418233871,
0.10668368637561798,
0.03206905350089073,
0.0959741473197937,
-0.1737944781780243,
0.10696781426668167,
-0.009688634425401688,
0.003250132780522108,
-0.022452013567090034,
0.04783458635210991,
-0.1159784346818924,
-0.008652113378047943,
-0.1262560784816742,
-0.013529917225241661,
-0.037026405334472656,
0.019468823447823524,
0.00901876762509346,
-0.061793550848960876,
-0.05373500660061836,
0.020707810297608376,
-0.0951860323548317,
-0.015763651579618454,
0.04577120393514633,
0.0664779543876648,
-0.11900107562541962,
-0.04317989572882652,
0.03288652002811432,
-0.07136719673871994,
0.06493659317493439,
0.026195675134658813,
0.028404194861650467,
0.04931388795375824,
-0.1789710819721222,
0.022401724010705948,
0.07287143170833588,
0.007401564158499241,
0.05181780457496643,
-0.10540416836738586,
-0.019735699519515038,
-0.001375487307086587,
0.03585163876414299,
0.015528406947851181,
0.0841788798570633,
-0.13344420492649078,
-0.00451866677030921,
-0.017125781625509262,
-0.05880851298570633,
-0.05122503265738487,
-0.004239323083311319,
0.10843699425458908,
-0.013717589899897575,
0.2142067849636078,
-0.08982975035905838,
0.00806793849915266,
-0.1927909553050995,
-0.0030423463322222233,
-0.007610765285789967,
-0.1092221587896347,
-0.14894014596939087,
-0.052009642124176025,
0.035350341349840164,
-0.047783706337213516,
0.15037405490875244,
-0.007310809567570686,
0.029212763532996178,
0.03021721914410591,
-0.04006804898381233,
0.03926840052008629,
0.024658946320414543,
0.22900353372097015,
0.037772390991449356,
-0.04402206838130951,
0.016081321984529495,
0.03386549651622772,
0.10898381471633911,
0.041353289037942886,
0.15995965898036957,
0.16527655720710754,
-0.05812922120094299,
0.09371285885572433,
0.03455974906682968,
-0.055169448256492615,
-0.15644097328186035,
0.041837841272354126,
-0.03127061203122139,
0.08165405690670013,
-0.01680627092719078,
0.21303454041481018,
0.08489222079515457,
-0.16545335948467255,
0.013979032635688782,
-0.0519595667719841,
-0.07166720926761627,
-0.10674158483743668,
-0.04324154183268547,
-0.09511349350214005,
-0.16110031306743622,
0.002134589944034815,
-0.10927196592092514,
0.00273802038282156,
0.109650619328022,
-0.008075064048171043,
-0.011503289453685284,
0.1772337257862091,
-0.008649967610836029,
0.0529261976480484,
0.03502858057618141,
-0.005608833394944668,
-0.04320024698972702,
-0.08344655483961105,
-0.10879389196634293,
0.0021408507600426674,
-0.02820540778338909,
0.024267995730042458,
-0.061276260763406754,
-0.034068889915943146,
0.03229904919862747,
-0.00008390981383854523,
-0.09539683163166046,
0.015395112335681915,
0.025669440627098083,
0.034089505672454834,
0.026179049164056778,
0.006592662539333105,
0.02032569609582424,
0.005202570464462042,
0.21143530309200287,
-0.07523131370544434,
-0.07796729356050491,
-0.10117048770189285,
0.234890416264534,
0.042739614844322205,
0.022867286577820778,
0.02163182944059372,
-0.08916454017162323,
0.03835449740290642,
0.19245608150959015,
0.15360964834690094,
-0.08158852159976959,
0.002828132826834917,
-0.021735668182373047,
-0.020810240879654884,
-0.04860666021704674,
0.08686874061822891,
0.11434551328420639,
0.0006799077382311225,
-0.06972233951091766,
-0.03699274733662605,
-0.04939800128340721,
-0.005705561488866806,
-0.048593707382678986,
0.041958022862672806,
0.018905311822891235,
0.012312875129282475,
-0.05427217110991478,
0.04686006158590317,
-0.031014911830425262,
-0.08821333944797516,
0.07054778933525085,
-0.17172537744045258,
-0.14420484006404877,
-0.016732588410377502,
0.09541327506303787,
-0.005953568499535322,
0.046278584748506546,
-0.040056049823760986,
-0.009434536099433899,
0.07391951978206635,
-0.0269415695220232,
-0.04834304377436638,
-0.08781854063272476,
0.06456413865089417,
-0.05005756393074989,
0.24928098917007446,
-0.029112549498677254,
0.054792094975709915,
0.12332935631275177,
0.04532260447740555,
-0.08691318333148956,
0.0976971685886383,
0.04996814951300621,
-0.07193738222122192,
0.025000201538205147,
0.06820585578680038,
-0.04126310721039772,
0.13930962979793549,
0.04770207032561302,
-0.16027268767356873,
0.012324318289756775,
-0.01813783496618271,
-0.08696658909320831,
-0.05839855596423149,
-0.0404290147125721,
-0.05169910192489624,
0.13761289417743683,
0.17830443382263184,
-0.04192258417606354,
0.005427184514701366,
-0.04754709452390671,
0.047377198934555054,
0.07567080855369568,
0.027246778830885887,
-0.032485853880643845,
-0.23523759841918945,
0.04040403291583061,
0.09328963607549667,
-0.014577069319784641,
-0.24050123989582062,
-0.09802444279193878,
-0.005739409476518631,
-0.0490448921918869,
-0.09084048867225647,
0.08294884860515594,
0.1138637512922287,
0.05386406183242798,
-0.062018126249313354,
-0.11093130707740784,
-0.08749385923147202,
0.15601852536201477,
-0.11171318590641022,
-0.1073419451713562
] |
null | null | transformers |
# Wiederchat-7b
Wiederchat-7b is a merge of the following models using [LazyMergekit](https://colab.research.google.com/drive/1obulZ1ROXHjYLn6PPZJwRR6GzgQogxxb?usp=sharing):
* [mlabonne/OmniTruthyBeagle-7B-v0](https://huggingface.co/mlabonne/OmniTruthyBeagle-7B-v0)
* [mayflowergmbh/Wiedervereinigung-7b-dpo-laser](https://huggingface.co/mayflowergmbh/Wiedervereinigung-7b-dpo-laser)
* [cognitivecomputations/openchat-3.5-0106-laser](https://huggingface.co/cognitivecomputations/openchat-3.5-0106-laser)
## 🧩 Configuration
```yaml
models:
- model: mistralai/Mistral-7B-v0.1
# no parameters necessary for base model
- model: mlabonne/OmniTruthyBeagle-7B-v0
parameters:
density: 0.60
weight: 0.30
- model: mayflowergmbh/Wiedervereinigung-7b-dpo-laser
parameters:
density: 0.65
weight: 0.40
- model: cognitivecomputations/openchat-3.5-0106-laser
parameters:
density: 0.6
weight: 0.3
merge_method: dare_ties
base_model: mistralai/Mistral-7B-v0.1
parameters:
int8_mask: true
dtype: bfloat16
random_seed: 0
```
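To reproduce the merge, this configuration is in mergekit's YAML format and can typically be passed to its CLI (for example `mergekit-yaml config.yaml ./Wiederchat-7b`); the exact flags depend on the installed mergekit version, so treat that command as a sketch rather than a verified invocation.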
## 💻 Usage
```python
!pip install -qU transformers accelerate

from transformers import AutoTokenizer
import transformers
import torch

# Repository id of the merged model.
model = "johannhartmann/Wiederchat-7b"
messages = [{"role": "user", "content": "What is a large language model?"}]

# Build a chat-formatted prompt using the model's chat template.
tokenizer = AutoTokenizer.from_pretrained(model)
prompt = tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)

# Text-generation pipeline in half precision, placed on the available device(s).
pipeline = transformers.pipeline(
    "text-generation",
    model=model,
    torch_dtype=torch.float16,
    device_map="auto",
)

# Sample a response and print the full generated text (prompt included).
outputs = pipeline(prompt, max_new_tokens=256, do_sample=True, temperature=0.7, top_k=50, top_p=0.95)
print(outputs[0]["generated_text"])
``` | {"tags": ["merge", "mergekit", "lazymergekit", "mlabonne/OmniTruthyBeagle-7B-v0", "mayflowergmbh/Wiedervereinigung-7b-dpo-laser", "cognitivecomputations/openchat-3.5-0106-laser"], "base_model": ["mlabonne/OmniTruthyBeagle-7B-v0", "mayflowergmbh/Wiedervereinigung-7b-dpo-laser", "cognitivecomputations/openchat-3.5-0106-laser"]} | text-generation | johannhartmann/Wiederchat-7b | [
"transformers",
"safetensors",
"mistral",
"text-generation",
"merge",
"mergekit",
"lazymergekit",
"mlabonne/OmniTruthyBeagle-7B-v0",
"mayflowergmbh/Wiedervereinigung-7b-dpo-laser",
"cognitivecomputations/openchat-3.5-0106-laser",
"base_model:mlabonne/OmniTruthyBeagle-7B-v0",
"base_model:mayflowergmbh/Wiedervereinigung-7b-dpo-laser",
"base_model:cognitivecomputations/openchat-3.5-0106-laser",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | 2024-02-14T17:09:39+00:00 | [] | [] | TAGS
#transformers #safetensors #mistral #text-generation #merge #mergekit #lazymergekit #mlabonne/OmniTruthyBeagle-7B-v0 #mayflowergmbh/Wiedervereinigung-7b-dpo-laser #cognitivecomputations/openchat-3.5-0106-laser #base_model-mlabonne/OmniTruthyBeagle-7B-v0 #base_model-mayflowergmbh/Wiedervereinigung-7b-dpo-laser #base_model-cognitivecomputations/openchat-3.5-0106-laser #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
# Wiederchat-7b
Wiederchat-7b is a merge of the following models using LazyMergekit:
* mlabonne/OmniTruthyBeagle-7B-v0
* mayflowergmbh/Wiedervereinigung-7b-dpo-laser
* cognitivecomputations/openchat-3.5-0106-laser
## Configuration
## Usage
| [
"# Wiederchat-7b\n\nWiederchat-7b is a merge of the following models using LazyMergekit:\n* mlabonne/OmniTruthyBeagle-7B-v0\n* mayflowergmbh/Wiedervereinigung-7b-dpo-laser\n* cognitivecomputations/openchat-3.5-0106-laser",
"## Configuration",
"## Usage"
] | [
"TAGS\n#transformers #safetensors #mistral #text-generation #merge #mergekit #lazymergekit #mlabonne/OmniTruthyBeagle-7B-v0 #mayflowergmbh/Wiedervereinigung-7b-dpo-laser #cognitivecomputations/openchat-3.5-0106-laser #base_model-mlabonne/OmniTruthyBeagle-7B-v0 #base_model-mayflowergmbh/Wiedervereinigung-7b-dpo-laser #base_model-cognitivecomputations/openchat-3.5-0106-laser #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n",
"# Wiederchat-7b\n\nWiederchat-7b is a merge of the following models using LazyMergekit:\n* mlabonne/OmniTruthyBeagle-7B-v0\n* mayflowergmbh/Wiedervereinigung-7b-dpo-laser\n* cognitivecomputations/openchat-3.5-0106-laser",
"## Configuration",
"## Usage"
] | [
176,
75,
4,
3
] | [
"passage: TAGS\n#transformers #safetensors #mistral #text-generation #merge #mergekit #lazymergekit #mlabonne/OmniTruthyBeagle-7B-v0 #mayflowergmbh/Wiedervereinigung-7b-dpo-laser #cognitivecomputations/openchat-3.5-0106-laser #base_model-mlabonne/OmniTruthyBeagle-7B-v0 #base_model-mayflowergmbh/Wiedervereinigung-7b-dpo-laser #base_model-cognitivecomputations/openchat-3.5-0106-laser #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n# Wiederchat-7b\n\nWiederchat-7b is a merge of the following models using LazyMergekit:\n* mlabonne/OmniTruthyBeagle-7B-v0\n* mayflowergmbh/Wiedervereinigung-7b-dpo-laser\n* cognitivecomputations/openchat-3.5-0106-laser## Configuration## Usage"
] | [
-0.12149081379175186,
0.16418877243995667,
-0.0044039273634552956,
0.013053088448941708,
0.004124176688492298,
0.04044724255800247,
0.14744925498962402,
0.10922110825777054,
0.09621044993400574,
0.15248943865299225,
0.10126747190952301,
0.08851867914199829,
0.05697198584675789,
0.10909445583820343,
-0.030651496723294258,
-0.23717252910137177,
0.07518047094345093,
0.02215891145169735,
-0.02655802294611931,
0.0681389644742012,
0.09861747920513153,
-0.0830051451921463,
0.08869162946939468,
0.01994813047349453,
-0.06432449817657471,
0.005039632320404053,
-0.04873891547322273,
-0.008039074949920177,
0.09638828039169312,
0.09051914513111115,
-0.0284365676343441,
0.04827611893415451,
-0.02700277417898178,
-0.12213945388793945,
0.011685799807310104,
-0.019072962924838066,
-0.009497025050222874,
0.06665180623531342,
0.09302165359258652,
-0.05488291010260582,
0.0680566132068634,
-0.13223515450954437,
0.052707262337207794,
0.0685507208108902,
-0.06386317312717438,
-0.04791587218642235,
-0.06835944205522537,
0.08973181992769241,
0.05397215485572815,
0.045815277844667435,
-0.02608884871006012,
0.10930520296096802,
0.07315076142549515,
0.11410853266716003,
0.21544763445854187,
-0.26935911178588867,
-0.06157257407903671,
0.13796056807041168,
0.012358294799923897,
-0.04035232961177826,
-0.05619869381189346,
0.04779819771647453,
0.021664971485733986,
0.004599172621965408,
-0.010046768933534622,
-0.04056645929813385,
0.1289159208536148,
-0.0635666698217392,
-0.10140837728977203,
-0.009096977300941944,
0.20010222494602203,
0.03982122987508774,
0.02424682304263115,
-0.11531359702348709,
-0.09357036650180817,
0.07668136805295944,
-0.05230326950550079,
-0.01989796943962574,
0.01656477153301239,
-0.07263275235891342,
0.037200748920440674,
-0.04727822542190552,
-0.07027266919612885,
0.040390703827142715,
-0.01797197386622429,
0.1674693375825882,
0.006637532263994217,
-0.025052493438124657,
0.058380696922540665,
0.06742794811725616,
-0.0736909732222557,
-0.14805439114570618,
-0.018573414534330368,
-0.015671830624341965,
-0.03138871490955353,
0.00038436095928773284,
-0.06649221479892731,
-0.03697076812386513,
0.08274491876363754,
0.1824062317609787,
0.0515713207423687,
0.11989414691925049,
0.012944490648806095,
0.03260040283203125,
0.024327397346496582,
0.011216854676604271,
-0.11642248183488846,
-0.1523827463388443,
-0.015015466138720512,
0.05313193053007126,
0.04345519095659256,
0.010005936957895756,
-0.005007532425224781,
-0.040694914758205414,
0.046553172171115875,
-0.013380291871726513,
0.03418770432472229,
0.07178928703069687,
-0.05532919242978096,
-0.050240252166986465,
0.1110895574092865,
-0.09176740795373917,
-0.015987204387784004,
0.01525793131440878,
-0.04947131127119064,
0.07385037839412689,
0.07501741498708725,
0.004321210086345673,
0.012925943359732628,
0.05937570706009865,
-0.11925801634788513,
-0.01815730519592762,
0.01101648434996605,
-0.033889394253492355,
0.03574858978390694,
-0.0857025533914566,
-0.03710201010107994,
-0.07117486000061035,
-0.14075027406215668,
-0.05549199506640434,
0.056539010256528854,
-0.06350777298212051,
-0.03770911321043968,
-0.06474988907575607,
-0.020010754466056824,
0.024965623393654823,
0.026694292202591896,
0.03517976030707359,
0.013752521947026253,
0.005845281761139631,
0.02040482684969902,
0.04717915505170822,
-0.06395288556814194,
0.013330538757145405,
-0.04178856685757637,
0.0958501324057579,
-0.21582357585430145,
0.04249267280101776,
-0.11579239368438721,
0.03555840253829956,
-0.14066538214683533,
-0.028552351519465446,
0.033594466745853424,
-0.012467057444155216,
0.04780665412545204,
0.1651376336812973,
-0.21194417774677277,
-0.06942913681268692,
0.09895116090774536,
-0.11397934705018997,
-0.09000974148511887,
0.0923406258225441,
0.008662000298500061,
-0.009136713109910488,
0.04711657762527466,
0.214350163936615,
0.09869035333395004,
-0.08931608498096466,
-0.046106260269880295,
-0.008303336799144745,
0.07814206182956696,
0.06342799216508865,
0.04708445444703102,
-0.0007890560082159936,
-0.08530557155609131,
0.0818827673792839,
-0.07219678163528442,
0.011440329253673553,
-0.05548371747136116,
-0.058127887547016144,
-0.08793830126523972,
-0.07853482663631439,
0.0896335244178772,
0.015376389026641846,
0.01729549467563629,
-0.11778814345598221,
-0.03192780539393425,
0.05574345588684082,
0.11989930272102356,
-0.05234839767217636,
-0.05298212915658951,
-0.06299363821744919,
0.04740842059254646,
-0.03812276944518089,
0.027484504505991936,
-0.12487314641475677,
-0.08086802810430527,
0.0044222609139978886,
-0.1136484444141388,
0.0398833304643631,
0.06579310446977615,
0.08492080122232437,
0.054044000804424286,
-0.07012760639190674,
-0.047795362770557404,
0.09980523586273193,
0.019097167998552322,
-0.043580688536167145,
-0.2106664776802063,
-0.04864182323217392,
-0.05519997701048851,
0.19383151829242706,
-0.16112060844898224,
0.013717129826545715,
0.054639868438243866,
0.24710822105407715,
-0.013435115106403828,
-0.02669430524110794,
0.033268511295318604,
-0.03450830653309822,
0.01696021668612957,
-0.020057380199432373,
0.03597515448927879,
-0.01900240033864975,
-0.1034378707408905,
0.10202105343341827,
-0.1185813844203949,
0.09608288109302521,
0.07648595422506332,
0.06540778279304504,
-0.0534457229077816,
-0.04710952937602997,
-0.061565544456243515,
-0.08565308153629303,
0.06620840728282928,
-0.0992337018251419,
0.019440580159425735,
0.03488042578101158,
0.08904428780078888,
-0.04154331609606743,
-0.03320934996008873,
0.030800726264715195,
0.00003699032822623849,
-0.11922325193881989,
0.13785842061042786,
-0.08346184343099594,
-0.2235807478427887,
0.06932132691144943,
0.10552701354026794,
0.03511790558695793,
0.14211589097976685,
0.019548580050468445,
0.005697223357856274,
-0.10961500555276871,
-0.022338682785630226,
-0.019994255155324936,
0.06474562734365463,
-0.13262885808944702,
0.01074989140033722,
0.05849042907357216,
-0.00029264463228173554,
0.03191492334008217,
-0.05841178447008133,
0.036593373864889145,
-0.009536693803966045,
0.02325812168419361,
0.2277134209871292,
0.08914438635110855,
-0.014084558933973312,
0.08273445814847946,
0.016108881682157516,
-0.035764019936323166,
-0.0015295371413230896,
-0.028265148401260376,
-0.08958104252815247,
0.15216311812400818,
-0.12646417319774628,
-0.24730893969535828,
-0.07436450570821762,
-0.00966125912964344,
-0.12275952845811844,
-0.04947622865438461,
0.021096838638186455,
0.027438240125775337,
-0.06888844072818756,
-0.10363123565912247,
0.017818769440054893,
0.012837552465498447,
-0.007049348205327988,
-0.030337294563651085,
-0.0211982149630785,
0.04510350525379181,
-0.13690795004367828,
-0.019826842471957207,
0.03404749557375908,
-0.028133945539593697,
0.02094501256942749,
-0.044744908809661865,
0.031007224693894386,
0.06417514383792877,
0.07056110352277756,
0.0065450952388346195,
-0.0009688480058684945,
0.2730439305305481,
-0.05345114693045616,
0.06232236325740814,
0.0773995891213417,
-0.024636954069137573,
0.03860742971301079,
0.13213272392749786,
0.018679825589060783,
-0.09792014956474304,
-0.0042295679450035095,
0.02907092496752739,
-0.0005049618193879724,
-0.13957415521144867,
-0.17283104360103607,
-0.062425851821899414,
0.06353104114532471,
0.12196958065032959,
0.05554327741265297,
0.059337370097637177,
0.051714908331632614,
-0.061046991497278214,
-0.023799806833267212,
0.011912470683455467,
0.07206490635871887,
0.2462419718503952,
-0.03722899779677391,
0.061464302241802216,
-0.04219849407672882,
-0.03213142231106758,
0.041892461478710175,
0.03589076176285744,
0.12109190970659256,
0.07831501215696335,
0.08677221834659576,
0.05716478452086449,
0.051867369562387466,
-0.042484100908041,
0.05000833049416542,
0.01015377789735794,
0.008224954828619957,
-0.03878995403647423,
-0.12120410799980164,
0.004671216942369938,
0.07017206400632858,
0.045250289142131805,
0.035929977893829346,
-0.04993385449051857,
-0.02281014993786812,
0.0660390853881836,
0.15819138288497925,
0.08809866011142731,
-0.19652187824249268,
-0.09027441591024399,
0.056467294692993164,
-0.051839374005794525,
-0.040246617048978806,
-0.0007405159412883222,
0.0676397755742073,
-0.07282686978578568,
0.11686095595359802,
-0.01930956542491913,
0.03946453705430031,
-0.0890987366437912,
0.010303646326065063,
-0.0723872259259224,
0.0631202831864357,
0.04070048779249191,
0.04689908027648926,
-0.2116754949092865,
0.2211112529039383,
0.0382126159965992,
-0.005047145765274763,
-0.0533570870757103,
-0.018094377592206,
0.023653684183955193,
0.037886761128902435,
0.09959419071674347,
-0.0020945563446730375,
0.10220392048358917,
0.07933197170495987,
-0.0591764859855175,
0.009740376845002174,
0.0963221862912178,
-0.07319269329309464,
0.041903361678123474,
0.023720048367977142,
-0.08076869696378708,
-0.004750873893499374,
0.09816696494817734,
-0.18564099073410034,
-0.11731764674186707,
0.14572449028491974,
0.05990966781973839,
0.00604089442640543,
-0.08511202037334442,
-0.10752228647470474,
-0.16700305044651031,
0.19966217875480652,
-0.075710229575634,
-0.028598561882972717,
-0.11908982694149017,
0.03227059543132782,
0.22025637328624725,
-0.033942945301532745,
0.0750424712896347,
-0.024887816980481148,
0.1472991406917572,
-0.05439717695116997,
-0.08625788241624832,
0.04045682027935982,
-0.11404706537723541,
-0.16148342192173004,
-0.030724121257662773,
0.15667657554149628,
-0.01749131828546524,
0.029358170926570892,
-0.005172324366867542,
0.07724826782941818,
-0.00546038243919611,
-0.07360230386257172,
0.06407871097326279,
0.055074114352464676,
-0.050263047218322754,
0.0152029013261199,
0.02401074767112732,
-0.11953001469373703,
-0.06764046102762222,
0.01388140581548214,
0.07528317719697952,
0.22537367045879364,
-0.040564294904470444,
0.07034111768007278,
0.10567592829465866,
-0.043429162353277206,
-0.17795565724372864,
0.012197820469737053,
0.05322030931711197,
0.03264782577753067,
-0.019978288561105728,
-0.12725220620632172,
0.13970832526683807,
0.06582485884428024,
-0.034553784877061844,
0.07080146670341492,
-0.29967668652534485,
-0.1255088597536087,
0.07011289149522781,
0.06518381088972092,
0.03347739577293396,
-0.10844232141971588,
-0.10459680110216141,
-0.05775630846619606,
-0.18593823909759521,
0.12198266386985779,
0.009840385988354683,
0.10573510825634003,
-0.026698734611272812,
0.063470259308815,
0.031232425943017006,
-0.05927497521042824,
0.1539705991744995,
0.020517632365226746,
0.03923839330673218,
-0.06634507328271866,
-0.08122327923774719,
0.020070917904376984,
-0.030522115528583527,
0.10392250120639801,
-0.06592395901679993,
0.058877695351839066,
-0.03816336393356323,
0.00040731881745159626,
-0.11896771937608719,
0.11758270114660263,
-0.06976840645074844,
-0.04682682454586029,
-0.008268730714917183,
0.10582902282476425,
0.040138304233551025,
0.052500270307064056,
0.07977711409330368,
-0.0112397950142622,
-0.0008340585045516491,
0.1831267923116684,
0.09437103569507599,
-0.0448758490383625,
-0.0677318200469017,
-0.027686651796102524,
-0.0327618233859539,
0.02151876874268055,
-0.07982790470123291,
-0.024146828800439835,
0.14822956919670105,
0.054770052433013916,
0.10708989202976227,
-0.008664621040225029,
-0.06824928522109985,
-0.07019101828336716,
0.05197744444012642,
-0.2217644453048706,
-0.18733082711696625,
-0.0033117630518972874,
0.17275795340538025,
-0.031039545312523842,
0.055082935839891434,
0.21088020503520966,
-0.04405035823583603,
-0.0026994531508535147,
0.016764897853136063,
-0.0044855521991848946,
-0.026253661140799522,
0.15200206637382507,
0.032688163220882416,
0.07537910342216492,
-0.10455937683582306,
0.022540217265486717,
-0.002187058562412858,
-0.15115977823734283,
0.04075499624013901,
0.09163027256727219,
-0.08994575589895248,
-0.08274085074663162,
-0.16712498664855957,
0.12374068796634674,
0.005041224416345358,
0.005564957857131958,
-0.042330268770456314,
-0.11361754685640335,
0.03369557484984398,
0.19360806047916412,
0.06216111406683922,
0.05208662152290344,
0.009233363904058933,
-0.04916414991021156,
-0.023005980998277664,
0.09569931775331497,
0.04985004663467407,
0.07845345884561539,
-0.05827711895108223,
0.06750056892633438,
-0.05681690201163292,
0.015892393887043,
-0.022552428767085075,
-0.004423016216605902,
-0.1270081102848053,
-0.07637669891119003,
-0.11366094648838043,
-0.03468738868832588,
-0.13124410808086395,
-0.027033573016524315,
-0.0016839480958878994,
-0.07031713426113129,
-0.01648787036538124,
-0.0036972847301512957,
-0.05819832906126976,
-0.02715742215514183,
-0.006656445097178221,
0.1120704934000969,
-0.04822993278503418,
-0.03149248659610748,
0.062466658651828766,
-0.09445635229349136,
0.08136647194623947,
0.035614773631095886,
-0.029158880934119225,
-0.023624274879693985,
-0.10772187262773514,
-0.04224056005477905,
0.01723974384367466,
0.01953873783349991,
0.05097181722521782,
-0.12690308690071106,
-0.03215321898460388,
-0.009195749647915363,
0.028868209570646286,
-0.013887577690184116,
0.07080738991498947,
-0.10901166498661041,
0.010436180979013443,
-0.03079608641564846,
-0.0936150848865509,
-0.05368490517139435,
-0.04568978771567345,
0.03550872206687927,
-0.023764722049236298,
0.07365347445011139,
-0.0751698836684227,
0.055444493889808655,
-0.1502472311258316,
0.0038616119418293238,
-0.0014881361275911331,
-0.14301395416259766,
0.003997978754341602,
-0.01962297223508358,
0.059951119124889374,
-0.020735379308462143,
0.11142758280038834,
-0.07933583110570908,
-0.08664775639772415,
0.05368899181485176,
-0.03749469667673111,
-0.10125739127397537,
0.0464865081012249,
0.11874415725469589,
0.030349774286150932,
0.0010764029575511813,
-0.01913311518728733,
0.04573233425617218,
-0.004759622272104025,
-0.008549055084586143,
0.022693926468491554,
0.11253257840871811,
0.004741750657558441,
0.08619857579469681,
0.1263279914855957,
-0.05349890887737274,
-0.049677774310112,
0.07244008779525757,
-0.008831096813082695,
0.14159901440143585,
-0.03428546339273453,
0.15323342382907867,
0.08927107602357864,
-0.1423138827085495,
0.06700161099433899,
-0.03401726484298706,
-0.018089843913912773,
-0.0028501253109425306,
-0.11254630982875824,
-0.10951437056064606,
-0.040266022086143494,
0.005125736352056265,
-0.12016850709915161,
0.016049666330218315,
-0.03300366923213005,
0.06386170536279678,
-0.004209831822663546,
0.1648687869310379,
0.0223870687186718,
-0.005302485078573227,
0.1259949952363968,
0.023092247545719147,
-0.05558726191520691,
-0.1074780747294426,
-0.025103991851210594,
-0.06716103106737137,
0.07694480568170547,
-0.051954977214336395,
0.021658364683389664,
-0.0005145516479387879,
-0.02463945373892784,
0.00697575556114316,
-0.10959465056657791,
-0.0007956316112540662,
-0.00763425137847662,
0.01417047530412674,
0.035516057163476944,
0.00471138721331954,
-0.026406750082969666,
-0.05730525031685829,
0.07927373051643372,
-0.056323613971471786,
-0.10475897043943405,
-0.05321482941508293,
0.09996261447668076,
-0.028164980933070183,
0.034959420561790466,
0.01350346114486456,
-0.04758187755942345,
0.035412006080150604,
0.15739350020885468,
0.19468319416046143,
-0.008307176642119884,
0.01901046745479107,
0.053231772035360336,
-0.00015330924361478537,
0.028428494930267334,
-0.02943115495145321,
0.021527616307139397,
0.2435225546360016,
-0.06418987363576889,
0.0505187101662159,
-0.026005787774920464,
-0.07362113147974014,
-0.05279577150940895,
-0.03475544601678848,
0.028123710304498672,
0.01356690376996994,
0.007089483551681042,
0.07713458687067032,
-0.07956209778785706,
-0.08106529712677002,
0.048567034304142,
-0.1900048404932022,
-0.12596134841442108,
-0.06246552988886833,
0.04120761901140213,
0.03294682502746582,
0.08902222663164139,
-0.052178118377923965,
-0.04734019190073013,
0.1079932376742363,
-0.024718696251511574,
-0.04595052823424339,
-0.05777394399046898,
0.016459934413433075,
-0.06426824629306793,
0.1191108301281929,
-0.013605390675365925,
0.07719983905553818,
0.11061792820692062,
0.0017711864784359932,
-0.14746296405792236,
-0.014847490936517715,
0.06506667286157608,
-0.0935329869389534,
0.01479639858007431,
0.13737869262695312,
0.009678268805146217,
0.12463332712650299,
0.04598686099052429,
-0.14526836574077606,
0.012958303093910217,
0.19495265185832977,
0.01583668775856495,
-0.07660819590091705,
0.11253618448972702,
-0.05634353682398796,
0.15332075953483582,
0.15547755360603333,
-0.0666005089879036,
-0.0544125996530056,
-0.029740160331130028,
0.03166738152503967,
0.05843832343816757,
0.04727382957935333,
-0.044917259365320206,
-0.23721110820770264,
0.04370510205626488,
0.032948318868875504,
-0.010482167825102806,
-0.2073877900838852,
-0.13643963634967804,
-0.05256304517388344,
-0.02595455013215542,
-0.01325133629143238,
0.05472276359796524,
0.09973881393671036,
-0.007095668464899063,
-0.012549355626106262,
-0.1406734734773636,
-0.012269035913050175,
0.1165284737944603,
-0.1083144024014473,
-0.09111379086971283
] |
null | null | transformers |
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed to the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
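Since the snippet above is still a placeholder, here is a minimal, unofficial sketch of how such a checkpoint is typically loaded with 🤗 transformers. It assumes the repository `Rooney88/bio_llama-2_13b` (the id recorded in this card's metadata) hosts a causal language model; the Auto class, half-precision dtype, and prompt below are illustrative assumptions rather than details confirmed by the card.

```python
# Unofficial sketch: assumes a causal language model; adjust the Auto class
# if the actual architecture differs (the card does not state the model type).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Rooney88/bio_llama-2_13b"  # repo id taken from this card's metadata

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # assumption: half precision to fit a 13B model in memory
    device_map="auto",          # requires `accelerate`; places layers on available devices
)

prompt = "Summarize the role of hemoglobin in oxygen transport."  # illustrative prompt
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

If the checkpoint turns out to use a different head (e.g. sequence classification), swapping in the matching `AutoModelFor...` class is the only change needed.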
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
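As a complement to the checklist above, the calculator's underlying arithmetic is simple enough to sketch directly: emissions scale with hardware power draw, runtime, data-center overhead (PUE), and the carbon intensity of the local grid. The numbers below are placeholder assumptions for illustration, not measurements for this model.

```python
# Back-of-the-envelope estimate in the spirit of the ML CO2 Impact calculator:
# kg CO2eq ~= (power draw in kW) * hours * PUE * grid carbon intensity (kg CO2eq per kWh).
def estimate_co2eq_kg(gpu_power_watts: float, hours: float,
                      pue: float = 1.1, grid_kgco2_per_kwh: float = 0.4) -> float:
    """Rough training-emissions estimate from hardware power and runtime."""
    energy_kwh = (gpu_power_watts / 1000.0) * hours * pue
    return energy_kwh * grid_kgco2_per_kwh

# Example with assumed values: one 350 W GPU for 24 hours on a ~0.4 kgCO2eq/kWh grid.
print(f"{estimate_co2eq_kg(350, 24):.2f} kg CO2eq")
```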
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed]
| {"library_name": "transformers", "tags": []} | null | Rooney88/bio_llama-2_13b | [
"transformers",
"safetensors",
"arxiv:1910.09700",
"endpoints_compatible",
"region:us"
] | 2024-02-14T17:09:52+00:00 | [
"1910.09700"
] | [] | TAGS
#transformers #safetensors #arxiv-1910.09700 #endpoints_compatible #region-us
|
# Model Card for Model ID
## Model Details
### Model Description
This is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.
- Developed by:
- Funded by [optional]:
- Shared by [optional]:
- Model type:
- Language(s) (NLP):
- License:
- Finetuned from model [optional]:
### Model Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Downstream Use [optional]
### Out-of-Scope Use
## Bias, Risks, and Limitations
### Recommendations
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
## Training Details
### Training Data
### Training Procedure
#### Preprocessing [optional]
#### Training Hyperparameters
- Training regime:
#### Speeds, Sizes, Times [optional]
## Evaluation
### Testing Data, Factors & Metrics
#### Testing Data
#### Factors
#### Metrics
### Results
#### Summary
## Model Examination [optional]
## Environmental Impact
Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).
- Hardware Type:
- Hours used:
- Cloud Provider:
- Compute Region:
- Carbon Emitted:
## Technical Specifications [optional]
### Model Architecture and Objective
### Compute Infrastructure
#### Hardware
#### Software
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Model Card Authors [optional]
## Model Card Contact
| [
"# Model Card for Model ID",
"## Model Details",
"### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:",
"### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Downstream Use [optional]",
"### Out-of-Scope Use",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.",
"## How to Get Started with the Model\n\nUse the code below to get started with the model.",
"## Training Details",
"### Training Data",
"### Training Procedure",
"#### Preprocessing [optional]",
"#### Training Hyperparameters\n\n- Training regime:",
"#### Speeds, Sizes, Times [optional]",
"## Evaluation",
"### Testing Data, Factors & Metrics",
"#### Testing Data",
"#### Factors",
"#### Metrics",
"### Results",
"#### Summary",
"## Model Examination [optional]",
"## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:",
"## Technical Specifications [optional]",
"### Model Architecture and Objective",
"### Compute Infrastructure",
"#### Hardware",
"#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Model Card Authors [optional]",
"## Model Card Contact"
] | [
"TAGS\n#transformers #safetensors #arxiv-1910.09700 #endpoints_compatible #region-us \n",
"# Model Card for Model ID",
"## Model Details",
"### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:",
"### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Downstream Use [optional]",
"### Out-of-Scope Use",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.",
"## How to Get Started with the Model\n\nUse the code below to get started with the model.",
"## Training Details",
"### Training Data",
"### Training Procedure",
"#### Preprocessing [optional]",
"#### Training Hyperparameters\n\n- Training regime:",
"#### Speeds, Sizes, Times [optional]",
"## Evaluation",
"### Testing Data, Factors & Metrics",
"#### Testing Data",
"#### Factors",
"#### Metrics",
"### Results",
"#### Summary",
"## Model Examination [optional]",
"## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:",
"## Technical Specifications [optional]",
"### Model Architecture and Objective",
"### Compute Infrastructure",
"#### Hardware",
"#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Model Card Authors [optional]",
"## Model Card Contact"
] | [
31,
6,
3,
82,
28,
3,
4,
9,
9,
10,
42,
20,
3,
4,
5,
9,
11,
13,
3,
12,
5,
4,
5,
3,
4,
9,
53,
9,
8,
6,
3,
14,
8,
7,
9,
4
] | [
"passage: TAGS\n#transformers #safetensors #arxiv-1910.09700 #endpoints_compatible #region-us \n# Model Card for Model ID## Model Details### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Downstream Use [optional]### Out-of-Scope Use## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.## How to Get Started with the Model\n\nUse the code below to get started with the model.## Training Details### Training Data### Training Procedure#### Preprocessing [optional]#### Training Hyperparameters\n\n- Training regime:#### Speeds, Sizes, Times [optional]## Evaluation### Testing Data, Factors & Metrics#### Testing Data#### Factors#### Metrics### Results#### Summary## Model Examination [optional]## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:## Technical Specifications [optional]### Model Architecture and Objective### Compute Infrastructure#### Hardware#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Model Card Authors [optional]## Model Card Contact"
] | [
-0.06646376848220825,
0.2168014943599701,
-0.00225935154594481,
0.023818302899599075,
0.1271018385887146,
-0.001635765191167593,
0.04218708351254463,
0.13324736058712006,
-0.020175931975245476,
0.11144465953111649,
0.046588581055402756,
0.09377603232860565,
0.09928803145885468,
0.18404334783554077,
0.04859916493296623,
-0.2059975117444992,
0.007056170143187046,
-0.09090408682823181,
0.014076028019189835,
0.1116579994559288,
0.13719257712364197,
-0.10291384905576706,
0.08272874355316162,
-0.04045208916068077,
-0.02019004337489605,
0.00012576708104461432,
-0.09259183704853058,
-0.07032395154237747,
0.06885425746440887,
0.06264153122901917,
0.051234472543001175,
0.001456156256608665,
0.09140396863222122,
-0.2864592671394348,
0.017265573143959045,
0.08406311273574829,
0.0027674848679453135,
0.06290827691555023,
0.07236549258232117,
-0.07389893382787704,
0.11328595131635666,
-0.08021481335163116,
0.13019037246704102,
0.08625296503305435,
-0.062064990401268005,
-0.23071379959583282,
-0.07525765895843506,
0.0963398814201355,
0.12251301854848862,
0.06215599179267883,
-0.022921854630112648,
0.15455181896686554,
-0.06248689442873001,
0.012971068732440472,
0.1294165402650833,
-0.11526761949062347,
-0.05572471022605896,
0.061741601675748825,
0.11775490641593933,
0.10740239918231964,
-0.14110268652439117,
-0.0017287094378843904,
0.04900608956813812,
0.029121357947587967,
0.08589313924312592,
0.022661056369543076,
0.12003941088914871,
0.04652795568108559,
-0.13695219159126282,
-0.04037507623434067,
0.12011898308992386,
0.038862764835357666,
-0.06446044892072678,
-0.2168138176202774,
-0.006778308190405369,
-0.0601806715130806,
-0.014732478186488152,
-0.07019448280334473,
0.039128515869379044,
-0.02470310963690281,
0.07317749410867691,
-0.04465159401297569,
-0.1063927412033081,
-0.0421026237308979,
0.0892222449183464,
0.07748593389987946,
0.011527054943144321,
-0.02519804798066616,
0.04627908393740654,
0.13455867767333984,
0.05402068421244621,
-0.10399353504180908,
-0.07017925381660461,
-0.06942764669656754,
-0.09420394152402878,
-0.04035796597599983,
0.056760527193546295,
0.031942449510097504,
0.02665667235851288,
0.22703726589679718,
0.016653569415211678,
0.04155244305729866,
0.0224777739495039,
0.01032855175435543,
0.043662428855895996,
0.0955500528216362,
-0.05303520709276199,
-0.15660029649734497,
-0.04072032496333122,
0.09077946096658707,
-0.0027527001220732927,
-0.036689214408397675,
-0.03966725245118141,
0.03849169611930847,
0.06843466311693192,
0.13122352957725525,
0.07552056759595871,
-0.017929591238498688,
-0.04813180863857269,
-0.030096933245658875,
0.23523783683776855,
-0.1493375599384308,
0.04426715523004532,
-0.02271856553852558,
-0.01804111897945404,
-0.03908449783921242,
0.03597262129187584,
0.022118929773569107,
-0.000004518366949923802,
0.09706240892410278,
-0.058981191366910934,
-0.05378659814596176,
-0.10168042778968811,
-0.03272576630115509,
0.04088849574327469,
-0.013975566253066063,
-0.010589460842311382,
-0.09025166928768158,
-0.09490354359149933,
-0.04766594246029854,
0.05537205561995506,
-0.05123869329690933,
-0.03770573064684868,
0.009465423412621021,
-0.08151785284280777,
-0.005444355774670839,
-0.005417742300778627,
0.10699385404586792,
-0.03222226724028587,
0.04445803165435791,
-0.027600755915045738,
0.05225523188710213,
0.09919606149196625,
0.031576547771692276,
-0.0773419588804245,
0.0561848059296608,
-0.22559374570846558,
0.07503069192171097,
-0.11481974273920059,
0.04335082694888115,
-0.1704932004213333,
-0.042439818382263184,
0.005444696638733149,
0.0139949731528759,
0.013206101022660732,
0.12720820307731628,
-0.19255615770816803,
-0.01654396951198578,
0.13260798156261444,
-0.09212633967399597,
-0.118110790848732,
0.07884611934423447,
-0.029701577499508858,
0.1624738723039627,
0.04682036489248276,
-0.027025915682315826,
0.09224298596382141,
-0.16434773802757263,
-0.07092688232660294,
-0.00949116237461567,
-0.01727987825870514,
0.12109188735485077,
0.07512219995260239,
-0.05991523340344429,
0.046571120619773865,
0.02832140028476715,
-0.038078423589468,
-0.04424772411584854,
-0.050857074558734894,
-0.10884185880422592,
-0.01070026308298111,
-0.08987759798765182,
0.04065500199794769,
-0.01250192429870367,
-0.07916021347045898,
-0.029885273426771164,
-0.18612512946128845,
-0.0030564051121473312,
0.10038342326879501,
0.0035033065360039473,
-0.005652366206049919,
-0.08666291832923889,
0.026358824223279953,
-0.03112892620265484,
-0.008404186926782131,
-0.16764774918556213,
-0.04399421438574791,
0.046902090311050415,
-0.16094985604286194,
0.020117372274398804,
-0.06413903087377548,
0.06334125250577927,
0.03641495108604431,
-0.05590536445379257,
-0.0248766727745533,
-0.01730942726135254,
0.011945613659918308,
-0.05083848536014557,
-0.18994836509227753,
-0.056277405470609665,
-0.037882111966609955,
0.149809330701828,
-0.25956398248672485,
0.032966937869787216,
0.051140617579221725,
0.14649195969104767,
0.00406361510977149,
-0.05115427449345589,
0.01429014839231968,
-0.05360214412212372,
-0.054652128368616104,
-0.06746816635131836,
-0.006135428790003061,
-0.027576493099331856,
-0.05147203803062439,
0.019243421033024788,
-0.1755700707435608,
-0.021410830318927765,
0.09424154460430145,
0.12876708805561066,
-0.1486445665359497,
-0.018640631809830666,
-0.048725154250860214,
-0.06339836865663528,
-0.0715010017156601,
-0.07038594037294388,
0.10712739825248718,
0.0513901449739933,
0.04796046018600464,
-0.07435787469148636,
-0.07092321664094925,
0.02726263552904129,
0.006906150374561548,
-0.03382374346256256,
0.08727246522903442,
0.05199531093239784,
-0.09209315478801727,
0.0756213590502739,
0.1092359870672226,
0.07177663594484329,
0.09363535046577454,
0.01574566215276718,
-0.11756632477045059,
-0.028492970392107964,
0.036266472190618515,
0.02740776725113392,
0.1465986967086792,
-0.05952361226081848,
0.04016614332795143,
0.04494241625070572,
-0.04170418903231621,
0.022319864481687546,
-0.08787637203931808,
0.024075502529740334,
0.025203049182891846,
-0.0034381982404738665,
0.06284574419260025,
-0.02525499276816845,
-0.0050758360885083675,
0.07016654312610626,
0.047779910266399384,
0.04621000960469246,
0.009655474685132504,
-0.01720241829752922,
-0.1047825813293457,
0.16950392723083496,
-0.0951867327094078,
-0.269941508769989,
-0.17632324993610382,
0.026197833940386772,
0.04035249724984169,
-0.022378476336598396,
0.031619444489479065,
-0.07056326419115067,
-0.10630585998296738,
-0.1060405746102333,
-0.002429972169920802,
0.01714223250746727,
-0.06364088505506516,
-0.0741225928068161,
0.07348573952913284,
0.04382912442088127,
-0.14902326464653015,
0.038552410900592804,
0.055694397538900375,
-0.057955220341682434,
-0.0233661737293005,
0.09118817001581192,
0.12397737801074982,
0.14583967626094818,
-0.021366750821471214,
-0.028626007959246635,
0.029004426673054695,
0.19620531797409058,
-0.13469526171684265,
0.10371150821447372,
0.13814030587673187,
-0.04545360431075096,
0.08360563963651657,
0.1560150384902954,
0.029186224564909935,
-0.08317049592733383,
0.05044832453131676,
0.04082648828625679,
-0.043159641325473785,
-0.2666129767894745,
-0.0534592866897583,
0.012832709588110447,
-0.06255637854337692,
0.09786593168973923,
0.10183793306350708,
0.11542957276105881,
0.034910861402750015,
-0.07166364789009094,
-0.043925940990448,
-0.0058974819257855415,
0.11737963557243347,
-0.05490213260054588,
-0.012639665976166725,
0.07686592638492584,
-0.05086168646812439,
0.005355054512619972,
0.10266812145709991,
0.02973790094256401,
0.17442677915096283,
0.020399179309606552,
0.11231429129838943,
0.06195578724145889,
0.08633565157651901,
0.0007386076031252742,
0.02951662428677082,
0.05147615820169449,
0.017203815281391144,
-0.002300140680745244,
-0.10421168059110641,
-0.006156572140753269,
0.1449710875749588,
0.028103826567530632,
0.029669636860489845,
-0.0018948549404740334,
-0.005003341939300299,
0.05121048167347908,
0.1746254414319992,
-0.011592294089496136,
-0.22072425484657288,
-0.0845772922039032,
0.06936841458082199,
-0.06218599155545235,
-0.12968985736370087,
-0.026130788028240204,
0.045467354357242584,
-0.17519839107990265,
0.026703642681241035,
-0.027433741837739944,
0.0919293761253357,
-0.09345759451389313,
-0.02221956104040146,
0.03687324374914169,
0.084866963326931,
-0.014529162086546421,
0.08703910559415817,
-0.14498743414878845,
0.11886418610811234,
0.02978132851421833,
0.09024628251791,
-0.11081171780824661,
0.07909037172794342,
-0.007550720125436783,
0.009180475026369095,
0.19379350543022156,
-0.011335089802742004,
-0.03514958545565605,
-0.08774717897176743,
-0.11210042238235474,
-0.013537433929741383,
0.12687496840953827,
-0.1243172138929367,
0.08773399889469147,
-0.015198243781924248,
-0.044079482555389404,
0.00937260314822197,
-0.12100647389888763,
-0.17273177206516266,
-0.19628387689590454,
0.05585884302854538,
-0.09575839340686798,
0.025643249973654747,
-0.11914430558681488,
-0.07089093327522278,
-0.02952558360993862,
0.241120383143425,
-0.1745356321334839,
-0.06510113179683685,
-0.1468164622783661,
-0.046294767409563065,
0.1662203073501587,
-0.04437198117375374,
0.0718095526099205,
-0.0208172257989645,
0.20345525443553925,
0.005988610442727804,
-0.004939318168908358,
0.06724198162555695,
-0.08892562240362167,
-0.16873881220817566,
-0.06771010160446167,
0.1510489284992218,
0.11680185794830322,
0.04907919466495514,
-0.002248800592496991,
0.0011772146681323647,
-0.016943959519267082,
-0.1137804463505745,
-0.0033210667315870523,
0.16037839651107788,
0.03878779336810112,
0.025986969470977783,
-0.05243593826889992,
-0.08797456324100494,
-0.06899320334196091,
-0.06853509694337845,
0.06221301481127739,
0.19590823352336884,
-0.10376439243555069,
0.1700313836336136,
0.147536963224411,
-0.07305635511875153,
-0.23175598680973053,
0.035342130810022354,
0.04983805492520332,
0.0014306638622656465,
0.04886869341135025,
-0.18252557516098022,
0.10521943867206573,
0.019543392583727837,
-0.05505957826972008,
0.13485197722911835,
-0.1557481735944748,
-0.1552847921848297,
0.0722852572798729,
0.03904085233807564,
-0.22423844039440155,
-0.1354004591703415,
-0.09622503817081451,
-0.05825018882751465,
-0.14065024256706238,
0.06054598465561867,
-0.002136280992999673,
0.015948504209518433,
0.03500790148973465,
-0.0015643214574083686,
0.027123261243104935,
-0.058935679495334625,
0.18609118461608887,
-0.004065449349582195,
0.020676052197813988,
-0.060264769941568375,
-0.0478842556476593,
0.09839435666799545,
-0.06130504235625267,
0.12208222597837448,
0.004057085141539574,
0.01594383642077446,
-0.10362856835126877,
-0.048314861953258514,
-0.04328322783112526,
0.05154227837920189,
-0.07548051327466965,
-0.10070807486772537,
-0.043625857681035995,
0.08841723203659058,
0.07005169242620468,
-0.03383097052574158,
0.00549331633374095,
-0.07189501076936722,
0.10019614547491074,
0.17795267701148987,
0.17573626339435577,
0.009926567785441875,
-0.07241068035364151,
0.01677953451871872,
-0.04142116755247116,
0.044231921434402466,
-0.2513144314289093,
0.03756171092391014,
0.06098250672221184,
0.029438555240631104,
0.09217222779989243,
-0.020435843616724014,
-0.1820858269929886,
-0.04050002992153168,
0.08094815909862518,
-0.05452597141265869,
-0.22617179155349731,
-0.019085140898823738,
0.0954197570681572,
-0.2020406424999237,
-0.007372708059847355,
0.03995226323604584,
-0.048725228756666183,
-0.023169852793216705,
0.00010950004070764408,
0.06317184865474701,
0.002471912419423461,
0.09773622453212738,
0.0735151618719101,
0.09715340286493301,
-0.08337292820215225,
0.10562895983457565,
0.10150538384914398,
-0.09572599828243256,
0.03605884686112404,
0.06754924356937408,
-0.05300498008728027,
-0.043293699622154236,
0.03665391728281975,
0.033023297786712646,
0.005234600510448217,
-0.060321882367134094,
0.013913018628954887,
-0.036497246474027634,
0.044923391193151474,
0.08326134830713272,
0.03754979372024536,
-0.013354414142668247,
0.06462216377258301,
0.03401726484298706,
-0.10898099094629288,
0.10366570204496384,
0.01731540448963642,
0.04105307161808014,
-0.08384523540735245,
-0.019968897104263306,
0.035425446927547455,
0.030576206743717194,
-0.01765924133360386,
-0.02306121215224266,
-0.02860277332365513,
-0.01614218018949032,
-0.14299540221691132,
-0.023106401786208153,
-0.07243485748767853,
0.006181265693157911,
0.014656842686235905,
-0.031884219497442245,
-0.011233693920075893,
0.02475680410861969,
-0.06979699432849884,
-0.07426341623067856,
-0.006949664559215307,
0.09833318740129471,
-0.15115703642368317,
0.008848577737808228,
0.06907843053340912,
-0.11088496446609497,
0.08190931379795074,
-0.008411259390413761,
0.016245156526565552,
0.022527478635311127,
-0.15448406338691711,
0.05601610988378525,
0.0008648968650959432,
0.01916889287531376,
0.025886621326208115,
-0.16471809148788452,
0.004104440100491047,
-0.04661374166607857,
-0.02149827405810356,
-0.00004464812809601426,
-0.02647159807384014,
-0.12325995415449142,
0.06858719140291214,
-0.015622655861079693,
-0.035931166261434555,
-0.02701525390148163,
0.0539589487016201,
0.07888586074113846,
-0.027474910020828247,
0.10445091128349304,
-0.008690856397151947,
0.04941811040043831,
-0.16801609098911285,
-0.02470702864229679,
-0.04982255399227142,
0.019377702847123146,
0.009884213097393513,
-0.007693959400057793,
0.04183054715394974,
-0.00976533442735672,
0.21883612871170044,
-0.05075952783226967,
0.1607085019350052,
0.05847611650824547,
-0.017352959141135216,
-0.0007513365126214921,
0.06180921941995621,
0.05997028574347496,
0.04658793285489082,
0.009480604901909828,
0.023740366101264954,
-0.022450892254710197,
-0.006695089396089315,
-0.15932634472846985,
0.01890849508345127,
0.14999441802501678,
0.06301083415746689,
0.024745315313339233,
0.05866100639104843,
-0.12775006890296936,
-0.12135478109121323,
0.09311001747846603,
-0.026755332946777344,
0.00928465835750103,
-0.08245618641376495,
0.1358020007610321,
0.14980104565620422,
-0.14000412821769714,
0.05256148427724838,
-0.06134212389588356,
-0.05217423290014267,
-0.10388828068971634,
-0.12032219022512436,
-0.05887215584516525,
-0.053666237741708755,
0.002330566756427288,
-0.03760887682437897,
0.054546963423490524,
0.03344334661960602,
-0.009351172484457493,
-0.00022941511997487396,
0.13597318530082703,
-0.019751882180571556,
-0.0028988157864660025,
0.048313532024621964,
0.03693558648228645,
0.02373051457107067,
-0.05275435373187065,
0.02940409444272518,
0.02539868652820587,
0.032232340425252914,
0.06546790152788162,
0.033412106335163116,
-0.047448933124542236,
0.03804153576493263,
-0.0025254099164158106,
-0.11207924783229828,
0.019641218706965446,
-0.00460948096588254,
-0.0742158442735672,
0.1268945336341858,
0.0407399944961071,
0.010224059224128723,
-0.03741471841931343,
0.24361543357372284,
-0.06653323769569397,
-0.06378097087144852,
-0.13251738250255585,
0.10491154342889786,
-0.0027236645109951496,
0.06476365029811859,
0.023412218317389488,
-0.1284150779247284,
0.005243356805294752,
0.13858191668987274,
0.12181595712900162,
0.0045748427510261536,
0.009228081442415714,
0.0518609918653965,
0.0025186820421367884,
-0.06998204439878464,
0.054019294679164886,
0.06992026418447495,
0.12919506430625916,
-0.07847554981708527,
0.07680778950452805,
0.0006860480643808842,
-0.08370215445756912,
-0.02947772853076458,
0.11312682181596756,
-0.0409729965031147,
0.03491825982928276,
-0.047444481402635574,
0.10916327685117722,
-0.05787910893559456,
-0.29412412643432617,
0.02350960113108158,
-0.09588567912578583,
-0.15202060341835022,
-0.018367812037467957,
0.05944539234042168,
-0.02624768204987049,
0.018029648810625076,
0.06971040368080139,
-0.06011629104614258,
0.20098382234573364,
0.0335683599114418,
-0.07864278554916382,
-0.0664360448718071,
0.04837050288915634,
-0.06564252078533173,
0.2949807047843933,
0.008418165147304535,
0.02863333560526371,
0.10770907253026962,
-0.03253700211644173,
-0.18271861970424652,
0.010723991319537163,
0.1133992001414299,
-0.08056149631738663,
0.08200647681951523,
0.19000613689422607,
-0.012578671798110008,
0.1209007054567337,
0.05294662341475487,
-0.047376248985528946,
0.04217283055186272,
-0.03389401361346245,
-0.051268599927425385,
-0.10752558708190918,
0.058453381061553955,
-0.05909625440835953,
0.15447644889354706,
0.10152646154165268,
-0.05671518296003342,
-0.004550917539745569,
-0.05555408447980881,
0.04875178262591362,
0.01804669201374054,
0.12263146042823792,
0.02951994352042675,
-0.1865430772304535,
0.032826557755470276,
-0.01144319772720337,
0.10186848044395447,
-0.25588861107826233,
-0.08421015739440918,
0.08833149075508118,
-0.011924264021217823,
-0.05105875805020332,
0.10560628771781921,
0.057650718837976456,
0.04243382066488266,
-0.043439045548439026,
-0.10480839014053345,
-0.02186836116015911,
0.14663739502429962,
-0.1469624787569046,
-0.025013303384184837
] |
null | null | transformers |
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed to the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
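Because the card does not state the model type, the following is only a minimal, unofficial sketch that loads the repository `manahilfatima31/VAST_NLP_2` (the id recorded in this card's metadata) with the generic Auto classes and inspects its configuration; the task-specific `AutoModelFor...` class should be substituted once the intended task is known.

```python
# Unofficial sketch: loads the backbone with generic Auto classes and prints
# the recorded architecture so the right task-specific class can be chosen.
from transformers import AutoConfig, AutoModel, AutoTokenizer

model_id = "manahilfatima31/VAST_NLP_2"  # repo id taken from this card's metadata

config = AutoConfig.from_pretrained(model_id)
print(config.architectures)  # concrete model class(es), if recorded in config.json

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModel.from_pretrained(model_id)

inputs = tokenizer("A short test sentence.", return_tensors="pt")
outputs = model(**inputs)
print(outputs.last_hidden_state.shape)  # hidden-state output of the bare backbone
```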
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed]
| {"library_name": "transformers", "tags": []} | null | manahilfatima31/VAST_NLP_2 | [
"transformers",
"safetensors",
"arxiv:1910.09700",
"endpoints_compatible",
"region:us"
] | 2024-02-14T17:16:19+00:00 | [
"1910.09700"
] | [] | TAGS
#transformers #safetensors #arxiv-1910.09700 #endpoints_compatible #region-us
|
# Model Card for Model ID
## Model Details
### Model Description
This is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.
- Developed by:
- Funded by [optional]:
- Shared by [optional]:
- Model type:
- Language(s) (NLP):
- License:
- Finetuned from model [optional]:
### Model Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Downstream Use [optional]
### Out-of-Scope Use
## Bias, Risks, and Limitations
### Recommendations
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
## Training Details
### Training Data
### Training Procedure
#### Preprocessing [optional]
#### Training Hyperparameters
- Training regime:
#### Speeds, Sizes, Times [optional]
## Evaluation
### Testing Data, Factors & Metrics
#### Testing Data
#### Factors
#### Metrics
### Results
#### Summary
## Model Examination [optional]
## Environmental Impact
Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).
- Hardware Type:
- Hours used:
- Cloud Provider:
- Compute Region:
- Carbon Emitted:
## Technical Specifications [optional]
### Model Architecture and Objective
### Compute Infrastructure
#### Hardware
#### Software
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Model Card Authors [optional]
## Model Card Contact
| [
"# Model Card for Model ID",
"## Model Details",
"### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:",
"### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Downstream Use [optional]",
"### Out-of-Scope Use",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.",
"## How to Get Started with the Model\n\nUse the code below to get started with the model.",
"## Training Details",
"### Training Data",
"### Training Procedure",
"#### Preprocessing [optional]",
"#### Training Hyperparameters\n\n- Training regime:",
"#### Speeds, Sizes, Times [optional]",
"## Evaluation",
"### Testing Data, Factors & Metrics",
"#### Testing Data",
"#### Factors",
"#### Metrics",
"### Results",
"#### Summary",
"## Model Examination [optional]",
"## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:",
"## Technical Specifications [optional]",
"### Model Architecture and Objective",
"### Compute Infrastructure",
"#### Hardware",
"#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Model Card Authors [optional]",
"## Model Card Contact"
] | [
"TAGS\n#transformers #safetensors #arxiv-1910.09700 #endpoints_compatible #region-us \n",
"# Model Card for Model ID",
"## Model Details",
"### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:",
"### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Downstream Use [optional]",
"### Out-of-Scope Use",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.",
"## How to Get Started with the Model\n\nUse the code below to get started with the model.",
"## Training Details",
"### Training Data",
"### Training Procedure",
"#### Preprocessing [optional]",
"#### Training Hyperparameters\n\n- Training regime:",
"#### Speeds, Sizes, Times [optional]",
"## Evaluation",
"### Testing Data, Factors & Metrics",
"#### Testing Data",
"#### Factors",
"#### Metrics",
"### Results",
"#### Summary",
"## Model Examination [optional]",
"## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:",
"## Technical Specifications [optional]",
"### Model Architecture and Objective",
"### Compute Infrastructure",
"#### Hardware",
"#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Model Card Authors [optional]",
"## Model Card Contact"
] | [
31,
6,
3,
82,
28,
3,
4,
9,
9,
10,
42,
20,
3,
4,
5,
9,
11,
13,
3,
12,
5,
4,
5,
3,
4,
9,
53,
9,
8,
6,
3,
14,
8,
7,
9,
4
] | [
"passage: TAGS\n#transformers #safetensors #arxiv-1910.09700 #endpoints_compatible #region-us \n# Model Card for Model ID## Model Details### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Downstream Use [optional]### Out-of-Scope Use## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.## How to Get Started with the Model\n\nUse the code below to get started with the model.## Training Details### Training Data### Training Procedure#### Preprocessing [optional]#### Training Hyperparameters\n\n- Training regime:#### Speeds, Sizes, Times [optional]## Evaluation### Testing Data, Factors & Metrics#### Testing Data#### Factors#### Metrics### Results#### Summary## Model Examination [optional]## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:## Technical Specifications [optional]### Model Architecture and Objective### Compute Infrastructure#### Hardware#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Model Card Authors [optional]## Model Card Contact"
] | [
-0.06646376848220825,
0.2168014943599701,
-0.00225935154594481,
0.023818302899599075,
0.1271018385887146,
-0.001635765191167593,
0.04218708351254463,
0.13324736058712006,
-0.020175931975245476,
0.11144465953111649,
0.046588581055402756,
0.09377603232860565,
0.09928803145885468,
0.18404334783554077,
0.04859916493296623,
-0.2059975117444992,
0.007056170143187046,
-0.09090408682823181,
0.014076028019189835,
0.1116579994559288,
0.13719257712364197,
-0.10291384905576706,
0.08272874355316162,
-0.04045208916068077,
-0.02019004337489605,
0.00012576708104461432,
-0.09259183704853058,
-0.07032395154237747,
0.06885425746440887,
0.06264153122901917,
0.051234472543001175,
0.001456156256608665,
0.09140396863222122,
-0.2864592671394348,
0.017265573143959045,
0.08406311273574829,
0.0027674848679453135,
0.06290827691555023,
0.07236549258232117,
-0.07389893382787704,
0.11328595131635666,
-0.08021481335163116,
0.13019037246704102,
0.08625296503305435,
-0.062064990401268005,
-0.23071379959583282,
-0.07525765895843506,
0.0963398814201355,
0.12251301854848862,
0.06215599179267883,
-0.022921854630112648,
0.15455181896686554,
-0.06248689442873001,
0.012971068732440472,
0.1294165402650833,
-0.11526761949062347,
-0.05572471022605896,
0.061741601675748825,
0.11775490641593933,
0.10740239918231964,
-0.14110268652439117,
-0.0017287094378843904,
0.04900608956813812,
0.029121357947587967,
0.08589313924312592,
0.022661056369543076,
0.12003941088914871,
0.04652795568108559,
-0.13695219159126282,
-0.04037507623434067,
0.12011898308992386,
0.038862764835357666,
-0.06446044892072678,
-0.2168138176202774,
-0.006778308190405369,
-0.0601806715130806,
-0.014732478186488152,
-0.07019448280334473,
0.039128515869379044,
-0.02470310963690281,
0.07317749410867691,
-0.04465159401297569,
-0.1063927412033081,
-0.0421026237308979,
0.0892222449183464,
0.07748593389987946,
0.011527054943144321,
-0.02519804798066616,
0.04627908393740654,
0.13455867767333984,
0.05402068421244621,
-0.10399353504180908,
-0.07017925381660461,
-0.06942764669656754,
-0.09420394152402878,
-0.04035796597599983,
0.056760527193546295,
0.031942449510097504,
0.02665667235851288,
0.22703726589679718,
0.016653569415211678,
0.04155244305729866,
0.0224777739495039,
0.01032855175435543,
0.043662428855895996,
0.0955500528216362,
-0.05303520709276199,
-0.15660029649734497,
-0.04072032496333122,
0.09077946096658707,
-0.0027527001220732927,
-0.036689214408397675,
-0.03966725245118141,
0.03849169611930847,
0.06843466311693192,
0.13122352957725525,
0.07552056759595871,
-0.017929591238498688,
-0.04813180863857269,
-0.030096933245658875,
0.23523783683776855,
-0.1493375599384308,
0.04426715523004532,
-0.02271856553852558,
-0.01804111897945404,
-0.03908449783921242,
0.03597262129187584,
0.022118929773569107,
-0.000004518366949923802,
0.09706240892410278,
-0.058981191366910934,
-0.05378659814596176,
-0.10168042778968811,
-0.03272576630115509,
0.04088849574327469,
-0.013975566253066063,
-0.010589460842311382,
-0.09025166928768158,
-0.09490354359149933,
-0.04766594246029854,
0.05537205561995506,
-0.05123869329690933,
-0.03770573064684868,
0.009465423412621021,
-0.08151785284280777,
-0.005444355774670839,
-0.005417742300778627,
0.10699385404586792,
-0.03222226724028587,
0.04445803165435791,
-0.027600755915045738,
0.05225523188710213,
0.09919606149196625,
0.031576547771692276,
-0.0773419588804245,
0.0561848059296608,
-0.22559374570846558,
0.07503069192171097,
-0.11481974273920059,
0.04335082694888115,
-0.1704932004213333,
-0.042439818382263184,
0.005444696638733149,
0.0139949731528759,
0.013206101022660732,
0.12720820307731628,
-0.19255615770816803,
-0.01654396951198578,
0.13260798156261444,
-0.09212633967399597,
-0.118110790848732,
0.07884611934423447,
-0.029701577499508858,
0.1624738723039627,
0.04682036489248276,
-0.027025915682315826,
0.09224298596382141,
-0.16434773802757263,
-0.07092688232660294,
-0.00949116237461567,
-0.01727987825870514,
0.12109188735485077,
0.07512219995260239,
-0.05991523340344429,
0.046571120619773865,
0.02832140028476715,
-0.038078423589468,
-0.04424772411584854,
-0.050857074558734894,
-0.10884185880422592,
-0.01070026308298111,
-0.08987759798765182,
0.04065500199794769,
-0.01250192429870367,
-0.07916021347045898,
-0.029885273426771164,
-0.18612512946128845,
-0.0030564051121473312,
0.10038342326879501,
0.0035033065360039473,
-0.005652366206049919,
-0.08666291832923889,
0.026358824223279953,
-0.03112892620265484,
-0.008404186926782131,
-0.16764774918556213,
-0.04399421438574791,
0.046902090311050415,
-0.16094985604286194,
0.020117372274398804,
-0.06413903087377548,
0.06334125250577927,
0.03641495108604431,
-0.05590536445379257,
-0.0248766727745533,
-0.01730942726135254,
0.011945613659918308,
-0.05083848536014557,
-0.18994836509227753,
-0.056277405470609665,
-0.037882111966609955,
0.149809330701828,
-0.25956398248672485,
0.032966937869787216,
0.051140617579221725,
0.14649195969104767,
0.00406361510977149,
-0.05115427449345589,
0.01429014839231968,
-0.05360214412212372,
-0.054652128368616104,
-0.06746816635131836,
-0.006135428790003061,
-0.027576493099331856,
-0.05147203803062439,
0.019243421033024788,
-0.1755700707435608,
-0.021410830318927765,
0.09424154460430145,
0.12876708805561066,
-0.1486445665359497,
-0.018640631809830666,
-0.048725154250860214,
-0.06339836865663528,
-0.0715010017156601,
-0.07038594037294388,
0.10712739825248718,
0.0513901449739933,
0.04796046018600464,
-0.07435787469148636,
-0.07092321664094925,
0.02726263552904129,
0.006906150374561548,
-0.03382374346256256,
0.08727246522903442,
0.05199531093239784,
-0.09209315478801727,
0.0756213590502739,
0.1092359870672226,
0.07177663594484329,
0.09363535046577454,
0.01574566215276718,
-0.11756632477045059,
-0.028492970392107964,
0.036266472190618515,
0.02740776725113392,
0.1465986967086792,
-0.05952361226081848,
0.04016614332795143,
0.04494241625070572,
-0.04170418903231621,
0.022319864481687546,
-0.08787637203931808,
0.024075502529740334,
0.025203049182891846,
-0.0034381982404738665,
0.06284574419260025,
-0.02525499276816845,
-0.0050758360885083675,
0.07016654312610626,
0.047779910266399384,
0.04621000960469246,
0.009655474685132504,
-0.01720241829752922,
-0.1047825813293457,
0.16950392723083496,
-0.0951867327094078,
-0.269941508769989,
-0.17632324993610382,
0.026197833940386772,
0.04035249724984169,
-0.022378476336598396,
0.031619444489479065,
-0.07056326419115067,
-0.10630585998296738,
-0.1060405746102333,
-0.002429972169920802,
0.01714223250746727,
-0.06364088505506516,
-0.0741225928068161,
0.07348573952913284,
0.04382912442088127,
-0.14902326464653015,
0.038552410900592804,
0.055694397538900375,
-0.057955220341682434,
-0.0233661737293005,
0.09118817001581192,
0.12397737801074982,
0.14583967626094818,
-0.021366750821471214,
-0.028626007959246635,
0.029004426673054695,
0.19620531797409058,
-0.13469526171684265,
0.10371150821447372,
0.13814030587673187,
-0.04545360431075096,
0.08360563963651657,
0.1560150384902954,
0.029186224564909935,
-0.08317049592733383,
0.05044832453131676,
0.04082648828625679,
-0.043159641325473785,
-0.2666129767894745,
-0.0534592866897583,
0.012832709588110447,
-0.06255637854337692,
0.09786593168973923,
0.10183793306350708,
0.11542957276105881,
0.034910861402750015,
-0.07166364789009094,
-0.043925940990448,
-0.0058974819257855415,
0.11737963557243347,
-0.05490213260054588,
-0.012639665976166725,
0.07686592638492584,
-0.05086168646812439,
0.005355054512619972,
0.10266812145709991,
0.02973790094256401,
0.17442677915096283,
0.020399179309606552,
0.11231429129838943,
0.06195578724145889,
0.08633565157651901,
0.0007386076031252742,
0.02951662428677082,
0.05147615820169449,
0.017203815281391144,
-0.002300140680745244,
-0.10421168059110641,
-0.006156572140753269,
0.1449710875749588,
0.028103826567530632,
0.029669636860489845,
-0.0018948549404740334,
-0.005003341939300299,
0.05121048167347908,
0.1746254414319992,
-0.011592294089496136,
-0.22072425484657288,
-0.0845772922039032,
0.06936841458082199,
-0.06218599155545235,
-0.12968985736370087,
-0.026130788028240204,
0.045467354357242584,
-0.17519839107990265,
0.026703642681241035,
-0.027433741837739944,
0.0919293761253357,
-0.09345759451389313,
-0.02221956104040146,
0.03687324374914169,
0.084866963326931,
-0.014529162086546421,
0.08703910559415817,
-0.14498743414878845,
0.11886418610811234,
0.02978132851421833,
0.09024628251791,
-0.11081171780824661,
0.07909037172794342,
-0.007550720125436783,
0.009180475026369095,
0.19379350543022156,
-0.011335089802742004,
-0.03514958545565605,
-0.08774717897176743,
-0.11210042238235474,
-0.013537433929741383,
0.12687496840953827,
-0.1243172138929367,
0.08773399889469147,
-0.015198243781924248,
-0.044079482555389404,
0.00937260314822197,
-0.12100647389888763,
-0.17273177206516266,
-0.19628387689590454,
0.05585884302854538,
-0.09575839340686798,
0.025643249973654747,
-0.11914430558681488,
-0.07089093327522278,
-0.02952558360993862,
0.241120383143425,
-0.1745356321334839,
-0.06510113179683685,
-0.1468164622783661,
-0.046294767409563065,
0.1662203073501587,
-0.04437198117375374,
0.0718095526099205,
-0.0208172257989645,
0.20345525443553925,
0.005988610442727804,
-0.004939318168908358,
0.06724198162555695,
-0.08892562240362167,
-0.16873881220817566,
-0.06771010160446167,
0.1510489284992218,
0.11680185794830322,
0.04907919466495514,
-0.002248800592496991,
0.0011772146681323647,
-0.016943959519267082,
-0.1137804463505745,
-0.0033210667315870523,
0.16037839651107788,
0.03878779336810112,
0.025986969470977783,
-0.05243593826889992,
-0.08797456324100494,
-0.06899320334196091,
-0.06853509694337845,
0.06221301481127739,
0.19590823352336884,
-0.10376439243555069,
0.1700313836336136,
0.147536963224411,
-0.07305635511875153,
-0.23175598680973053,
0.035342130810022354,
0.04983805492520332,
0.0014306638622656465,
0.04886869341135025,
-0.18252557516098022,
0.10521943867206573,
0.019543392583727837,
-0.05505957826972008,
0.13485197722911835,
-0.1557481735944748,
-0.1552847921848297,
0.0722852572798729,
0.03904085233807564,
-0.22423844039440155,
-0.1354004591703415,
-0.09622503817081451,
-0.05825018882751465,
-0.14065024256706238,
0.06054598465561867,
-0.002136280992999673,
0.015948504209518433,
0.03500790148973465,
-0.0015643214574083686,
0.027123261243104935,
-0.058935679495334625,
0.18609118461608887,
-0.004065449349582195,
0.020676052197813988,
-0.060264769941568375,
-0.0478842556476593,
0.09839435666799545,
-0.06130504235625267,
0.12208222597837448,
0.004057085141539574,
0.01594383642077446,
-0.10362856835126877,
-0.048314861953258514,
-0.04328322783112526,
0.05154227837920189,
-0.07548051327466965,
-0.10070807486772537,
-0.043625857681035995,
0.08841723203659058,
0.07005169242620468,
-0.03383097052574158,
0.00549331633374095,
-0.07189501076936722,
0.10019614547491074,
0.17795267701148987,
0.17573626339435577,
0.009926567785441875,
-0.07241068035364151,
0.01677953451871872,
-0.04142116755247116,
0.044231921434402466,
-0.2513144314289093,
0.03756171092391014,
0.06098250672221184,
0.029438555240631104,
0.09217222779989243,
-0.020435843616724014,
-0.1820858269929886,
-0.04050002992153168,
0.08094815909862518,
-0.05452597141265869,
-0.22617179155349731,
-0.019085140898823738,
0.0954197570681572,
-0.2020406424999237,
-0.007372708059847355,
0.03995226323604584,
-0.048725228756666183,
-0.023169852793216705,
0.00010950004070764408,
0.06317184865474701,
0.002471912419423461,
0.09773622453212738,
0.0735151618719101,
0.09715340286493301,
-0.08337292820215225,
0.10562895983457565,
0.10150538384914398,
-0.09572599828243256,
0.03605884686112404,
0.06754924356937408,
-0.05300498008728027,
-0.043293699622154236,
0.03665391728281975,
0.033023297786712646,
0.005234600510448217,
-0.060321882367134094,
0.013913018628954887,
-0.036497246474027634,
0.044923391193151474,
0.08326134830713272,
0.03754979372024536,
-0.013354414142668247,
0.06462216377258301,
0.03401726484298706,
-0.10898099094629288,
0.10366570204496384,
0.01731540448963642,
0.04105307161808014,
-0.08384523540735245,
-0.019968897104263306,
0.035425446927547455,
0.030576206743717194,
-0.01765924133360386,
-0.02306121215224266,
-0.02860277332365513,
-0.01614218018949032,
-0.14299540221691132,
-0.023106401786208153,
-0.07243485748767853,
0.006181265693157911,
0.014656842686235905,
-0.031884219497442245,
-0.011233693920075893,
0.02475680410861969,
-0.06979699432849884,
-0.07426341623067856,
-0.006949664559215307,
0.09833318740129471,
-0.15115703642368317,
0.008848577737808228,
0.06907843053340912,
-0.11088496446609497,
0.08190931379795074,
-0.008411259390413761,
0.016245156526565552,
0.022527478635311127,
-0.15448406338691711,
0.05601610988378525,
0.0008648968650959432,
0.01916889287531376,
0.025886621326208115,
-0.16471809148788452,
0.004104440100491047,
-0.04661374166607857,
-0.02149827405810356,
-0.00004464812809601426,
-0.02647159807384014,
-0.12325995415449142,
0.06858719140291214,
-0.015622655861079693,
-0.035931166261434555,
-0.02701525390148163,
0.0539589487016201,
0.07888586074113846,
-0.027474910020828247,
0.10445091128349304,
-0.008690856397151947,
0.04941811040043831,
-0.16801609098911285,
-0.02470702864229679,
-0.04982255399227142,
0.019377702847123146,
0.009884213097393513,
-0.007693959400057793,
0.04183054715394974,
-0.00976533442735672,
0.21883612871170044,
-0.05075952783226967,
0.1607085019350052,
0.05847611650824547,
-0.017352959141135216,
-0.0007513365126214921,
0.06180921941995621,
0.05997028574347496,
0.04658793285489082,
0.009480604901909828,
0.023740366101264954,
-0.022450892254710197,
-0.006695089396089315,
-0.15932634472846985,
0.01890849508345127,
0.14999441802501678,
0.06301083415746689,
0.024745315313339233,
0.05866100639104843,
-0.12775006890296936,
-0.12135478109121323,
0.09311001747846603,
-0.026755332946777344,
0.00928465835750103,
-0.08245618641376495,
0.1358020007610321,
0.14980104565620422,
-0.14000412821769714,
0.05256148427724838,
-0.06134212389588356,
-0.05217423290014267,
-0.10388828068971634,
-0.12032219022512436,
-0.05887215584516525,
-0.053666237741708755,
0.002330566756427288,
-0.03760887682437897,
0.054546963423490524,
0.03344334661960602,
-0.009351172484457493,
-0.00022941511997487396,
0.13597318530082703,
-0.019751882180571556,
-0.0028988157864660025,
0.048313532024621964,
0.03693558648228645,
0.02373051457107067,
-0.05275435373187065,
0.02940409444272518,
0.02539868652820587,
0.032232340425252914,
0.06546790152788162,
0.033412106335163116,
-0.047448933124542236,
0.03804153576493263,
-0.0025254099164158106,
-0.11207924783229828,
0.019641218706965446,
-0.00460948096588254,
-0.0742158442735672,
0.1268945336341858,
0.0407399944961071,
0.010224059224128723,
-0.03741471841931343,
0.24361543357372284,
-0.06653323769569397,
-0.06378097087144852,
-0.13251738250255585,
0.10491154342889786,
-0.0027236645109951496,
0.06476365029811859,
0.023412218317389488,
-0.1284150779247284,
0.005243356805294752,
0.13858191668987274,
0.12181595712900162,
0.0045748427510261536,
0.009228081442415714,
0.0518609918653965,
0.0025186820421367884,
-0.06998204439878464,
0.054019294679164886,
0.06992026418447495,
0.12919506430625916,
-0.07847554981708527,
0.07680778950452805,
0.0006860480643808842,
-0.08370215445756912,
-0.02947772853076458,
0.11312682181596756,
-0.0409729965031147,
0.03491825982928276,
-0.047444481402635574,
0.10916327685117722,
-0.05787910893559456,
-0.29412412643432617,
0.02350960113108158,
-0.09588567912578583,
-0.15202060341835022,
-0.018367812037467957,
0.05944539234042168,
-0.02624768204987049,
0.018029648810625076,
0.06971040368080139,
-0.06011629104614258,
0.20098382234573364,
0.0335683599114418,
-0.07864278554916382,
-0.0664360448718071,
0.04837050288915634,
-0.06564252078533173,
0.2949807047843933,
0.008418165147304535,
0.02863333560526371,
0.10770907253026962,
-0.03253700211644173,
-0.18271861970424652,
0.010723991319537163,
0.1133992001414299,
-0.08056149631738663,
0.08200647681951523,
0.19000613689422607,
-0.012578671798110008,
0.1209007054567337,
0.05294662341475487,
-0.047376248985528946,
0.04217283055186272,
-0.03389401361346245,
-0.051268599927425385,
-0.10752558708190918,
0.058453381061553955,
-0.05909625440835953,
0.15447644889354706,
0.10152646154165268,
-0.05671518296003342,
-0.004550917539745569,
-0.05555408447980881,
0.04875178262591362,
0.01804669201374054,
0.12263146042823792,
0.02951994352042675,
-0.1865430772304535,
0.032826557755470276,
-0.01144319772720337,
0.10186848044395447,
-0.25588861107826233,
-0.08421015739440918,
0.08833149075508118,
-0.011924264021217823,
-0.05105875805020332,
0.10560628771781921,
0.057650718837976456,
0.04243382066488266,
-0.043439045548439026,
-0.10480839014053345,
-0.02186836116015911,
0.14663739502429962,
-0.1469624787569046,
-0.025013303384184837
] |
null | null | transformers |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# agriparts-5
This model is a fine-tuned version of [gpt2](https://huggingface.co/gpt2) on an unknown dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
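No concrete usage guidance is provided here. As a minimal sketch, assuming the checkpoint is published under the hub id `Gan1108/agriparts-5` and behaves like a standard GPT-2 causal language model, it could be loaded for generation as follows (the prompt is purely illustrative):

```python
from transformers import pipeline

# Assumed hub id for this fine-tune; substitute a local path if the weights live elsewhere.
generator = pipeline("text-generation", model="Gan1108/agriparts-5")

# Illustrative prompt only; the training data for this checkpoint is not documented.
print(generator("The spare parts catalogue lists", max_new_tokens=50)[0]["generated_text"])
```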
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0005
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 8
- total_train_batch_size: 64
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: cosine
- lr_scheduler_warmup_steps: 1000
- num_epochs: 3
- mixed_precision_training: Native AMP
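For reference, these settings roughly correspond to a `transformers.TrainingArguments` configuration like the sketch below. This is a reconstruction, not the exact training script: the output directory is an assumption, and the model, tokenizer, and dataset wiring are omitted.

```python
from transformers import TrainingArguments

# Approximate translation of the hyperparameters listed above; Adam betas/epsilon are the defaults.
training_args = TrainingArguments(
    output_dir="agriparts-5",        # assumed
    learning_rate=5e-4,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    gradient_accumulation_steps=8,   # 8 * 8 = 64 effective train batch size
    lr_scheduler_type="cosine",
    warmup_steps=1000,
    num_train_epochs=3,
    fp16=True,                       # "Native AMP" mixed precision
)
```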
### Training results
### Framework versions
- Transformers 4.37.2
- Pytorch 2.1.2+cu121
- Datasets 2.16.1
- Tokenizers 0.15.1
| {"license": "mit", "tags": ["generated_from_trainer"], "base_model": "gpt2", "model-index": [{"name": "agriparts-5", "results": []}]} | text-generation | Gan1108/agriparts-5 | [
"transformers",
"safetensors",
"gpt2",
"text-generation",
"generated_from_trainer",
"base_model:gpt2",
"license:mit",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | 2024-02-14T17:20:15+00:00 | [] | [] | TAGS
#transformers #safetensors #gpt2 #text-generation #generated_from_trainer #base_model-gpt2 #license-mit #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
# agriparts-5
This model is a fine-tuned version of gpt2 on an unknown dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0005
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 8
- total_train_batch_size: 64
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: cosine
- lr_scheduler_warmup_steps: 1000
- num_epochs: 3
- mixed_precision_training: Native AMP
### Training results
### Framework versions
- Transformers 4.37.2
- Pytorch 2.1.2+cu121
- Datasets 2.16.1
- Tokenizers 0.15.1
| [
"# agriparts-5\n\nThis model is a fine-tuned version of gpt2 on an unknown dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 0.0005\n- train_batch_size: 8\n- eval_batch_size: 8\n- seed: 42\n- gradient_accumulation_steps: 8\n- total_train_batch_size: 64\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: cosine\n- lr_scheduler_warmup_steps: 1000\n- num_epochs: 3\n- mixed_precision_training: Native AMP",
"### Training results",
"### Framework versions\n\n- Transformers 4.37.2\n- Pytorch 2.1.2+cu121\n- Datasets 2.16.1\n- Tokenizers 0.15.1"
] | [
"TAGS\n#transformers #safetensors #gpt2 #text-generation #generated_from_trainer #base_model-gpt2 #license-mit #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n",
"# agriparts-5\n\nThis model is a fine-tuned version of gpt2 on an unknown dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 0.0005\n- train_batch_size: 8\n- eval_batch_size: 8\n- seed: 42\n- gradient_accumulation_steps: 8\n- total_train_batch_size: 64\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: cosine\n- lr_scheduler_warmup_steps: 1000\n- num_epochs: 3\n- mixed_precision_training: Native AMP",
"### Training results",
"### Framework versions\n\n- Transformers 4.37.2\n- Pytorch 2.1.2+cu121\n- Datasets 2.16.1\n- Tokenizers 0.15.1"
] | [
68,
26,
6,
12,
8,
3,
141,
4,
33
] | [
"passage: TAGS\n#transformers #safetensors #gpt2 #text-generation #generated_from_trainer #base_model-gpt2 #license-mit #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n# agriparts-5\n\nThis model is a fine-tuned version of gpt2 on an unknown dataset.## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 0.0005\n- train_batch_size: 8\n- eval_batch_size: 8\n- seed: 42\n- gradient_accumulation_steps: 8\n- total_train_batch_size: 64\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: cosine\n- lr_scheduler_warmup_steps: 1000\n- num_epochs: 3\n- mixed_precision_training: Native AMP### Training results### Framework versions\n\n- Transformers 4.37.2\n- Pytorch 2.1.2+cu121\n- Datasets 2.16.1\n- Tokenizers 0.15.1"
] | [
-0.10518927872180939,
0.08880318701267242,
-0.0017757306341081858,
0.07936244457960129,
0.1461421698331833,
0.04422188922762871,
0.1091366708278656,
0.1342860460281372,
-0.08484011143445969,
0.07409052550792694,
0.07259728014469147,
0.04601900279521942,
0.0592225156724453,
0.14109519124031067,
-0.019191432744264603,
-0.259949654340744,
0.02057407610118389,
-0.022551221773028374,
-0.11053524166345596,
0.1081300601363182,
0.12499090284109116,
-0.08515574783086777,
0.07839201390743256,
0.03130535036325455,
-0.16959287226200104,
0.000802390044555068,
-0.01072048768401146,
-0.0612146332859993,
0.1122935563325882,
0.046102579683065414,
0.07205839455127716,
0.012914096936583519,
0.13185912370681763,
-0.1993255317211151,
-0.001137427636422217,
0.0924583375453949,
0.031672608107328415,
0.08346223831176758,
0.05717340484261513,
-0.004045479465276003,
0.07834472507238388,
-0.10981961339712143,
0.08678246289491653,
0.03754914551973343,
-0.11094365268945694,
-0.19222651422023773,
-0.09600681066513062,
0.05707598850131035,
0.08322019875049591,
0.0890488475561142,
-0.0008799760253168643,
0.10235686600208282,
-0.07824182510375977,
0.04695761576294899,
0.1790546476840973,
-0.26888155937194824,
-0.06848573684692383,
0.06039980426430702,
0.03652667626738548,
0.054799892008304596,
-0.0987696424126625,
-0.0015828172909095883,
0.044264186173677444,
0.025198446586728096,
0.07745630294084549,
0.011312436312437057,
-0.09928055852651596,
0.0012600532500073314,
-0.11575315147638321,
-0.039668962359428406,
0.12951423227787018,
0.030443690717220306,
-0.05383816361427307,
-0.12275242060422897,
-0.03527143970131874,
-0.11318164318799973,
-0.015403520315885544,
-0.034848641604185104,
0.02701135165989399,
-0.014092202298343182,
-0.0764697715640068,
-0.051347874104976654,
-0.07878672331571579,
-0.09614374488592148,
0.000948712695389986,
0.1332850456237793,
0.03298164904117584,
0.019230203703045845,
-0.021113965660333633,
0.14108408987522125,
-0.06666453182697296,
-0.11763697117567062,
-0.041629113256931305,
-0.010457548312842846,
-0.09347221255302429,
-0.05326643958687782,
-0.048623841255903244,
0.01592114008963108,
0.01713188737630844,
0.1834350824356079,
-0.0612659826874733,
0.07283239811658859,
0.023178966715931892,
-0.005918371491134167,
-0.03990136459469795,
0.16007903218269348,
-0.05053091421723366,
-0.026708431541919708,
0.007309075444936752,
0.07031819224357605,
-0.006444456055760384,
-0.007784311193972826,
-0.07232179492712021,
-0.030557123944163322,
0.07015968859195709,
0.04372626543045044,
-0.054813336580991745,
0.05194125324487686,
-0.046826571226119995,
-0.018096324056386948,
0.047904375940561295,
-0.11530256271362305,
0.03339706361293793,
-0.009181821718811989,
-0.07143057882785797,
-0.03766302391886711,
0.019373144954442978,
0.025911737233400345,
-0.007544017396867275,
0.06586824357509613,
-0.0770738422870636,
-0.018781239166855812,
-0.09231758117675781,
-0.058297205716371536,
0.006794068496674299,
-0.0421241857111454,
-0.015211119316518307,
-0.053144946694374084,
-0.21875815093517303,
-0.04450631141662598,
0.037989452481269836,
-0.06720709800720215,
-0.058664675801992416,
-0.017945289611816406,
-0.044314198195934296,
0.04214693233370781,
-0.012399629689753056,
0.14412859082221985,
-0.05111326649785042,
0.05800287798047066,
0.011296074837446213,
0.029942290857434273,
0.028432128950953484,
0.032252371311187744,
-0.07977116852998734,
0.035820648074150085,
-0.1391676664352417,
0.10326959192752838,
-0.0912427082657814,
0.01757988892495632,
-0.1171407476067543,
-0.08692187070846558,
-0.007163552567362785,
-0.016699757426977158,
0.0612250491976738,
0.11803240329027176,
-0.16251245141029358,
-0.03694455325603485,
0.17977608740329742,
-0.09607832133769989,
-0.08964066952466965,
0.08422474563121796,
-0.039254654198884964,
0.04156104475259781,
0.07064968347549438,
0.12336116284132004,
0.10823101550340652,
-0.10577114671468735,
-0.009172261692583561,
0.0031791259534657,
0.0707593709230423,
0.05927922949194908,
0.07046885043382645,
-0.023195641115307808,
0.01720604859292507,
0.02592420019209385,
-0.07073160260915756,
-0.00036348626599647105,
-0.08331403136253357,
-0.0759834498167038,
-0.0591910183429718,
-0.06908708065748215,
0.02630336582660675,
0.016753394156694412,
0.03871260583400726,
-0.06788557767868042,
-0.13391095399856567,
0.07408525049686432,
0.11869841814041138,
-0.06747212260961533,
0.009209606796503067,
-0.0808395966887474,
0.010998246259987354,
0.0025977271143347025,
-0.021654026582837105,
-0.18131166696548462,
-0.15123699605464935,
0.03360782936215401,
-0.05732455849647522,
0.025270462036132812,
-0.013727937825024128,
0.0780370682477951,
0.0670352354645729,
-0.05175120383501053,
0.003848477965220809,
-0.08792468160390854,
-0.01733325980603695,
-0.08477617055177689,
-0.1929021030664444,
-0.04564918577671051,
-0.023509563878178596,
0.21592091023921967,
-0.2264353185892105,
0.024295106530189514,
0.03536473214626312,
0.1386076658964157,
0.02420203760266304,
-0.07121509313583374,
-0.022423937916755676,
0.012896919623017311,
-0.00898054800927639,
-0.1113303154706955,
0.01733333244919777,
0.01926703006029129,
-0.08293914049863815,
-0.011431816034018993,
-0.12143430858850479,
0.044846128672361374,
0.07409113645553589,
0.07310076057910919,
-0.11324448883533478,
-0.06324926018714905,
-0.06874033063650131,
-0.035339731723070145,
-0.08861034363508224,
-0.013775903731584549,
0.20403213798999786,
-0.0017742076888680458,
0.11165228486061096,
-0.06845450401306152,
-0.07242054492235184,
0.005607618950307369,
0.006293372251093388,
0.03170958533883095,
0.06428402662277222,
0.05786494165658951,
-0.09307895600795746,
0.1007085070014,
0.045328568667173386,
-0.031988076865673065,
0.12570492923259735,
-0.046235039830207825,
-0.08122675865888596,
-0.012669103220105171,
0.01197966281324625,
0.0034214919432997704,
0.10108473896980286,
-0.09442362934350967,
0.013057864271104336,
0.03562283143401146,
0.036076612770557404,
0.040733881294727325,
-0.19208116829395294,
-0.005600752774626017,
0.024347836151719093,
-0.05846615880727768,
-0.0009103290503844619,
-0.021458877250552177,
0.01683877222239971,
0.08651652932167053,
0.027925685048103333,
0.026554210111498833,
0.026295075193047523,
0.0004944347310811281,
-0.0892384946346283,
0.21020400524139404,
-0.10674329847097397,
-0.13559453189373016,
-0.11658826470375061,
0.044785600155591965,
-0.034664686769247055,
-0.005962675902992487,
0.002552068093791604,
-0.08492276817560196,
-0.05845039337873459,
-0.07151292264461517,
0.008188614621758461,
-0.006186333484947681,
0.007316695526242256,
0.04040928930044174,
0.008715655654668808,
0.07489072531461716,
-0.13306058943271637,
0.0094865458086133,
-0.008012128993868828,
-0.12974034249782562,
0.020362941548228264,
0.03519270196557045,
0.06461741775274277,
0.13373981416225433,
-0.007488439325243235,
0.02169700711965561,
-0.033642034977674484,
0.2370942085981369,
-0.10653899610042572,
-0.009565011598169804,
0.10768420249223709,
0.022402582690119743,
0.041109636425971985,
0.09782347083091736,
0.03546486794948578,
-0.11007890105247498,
0.030487241223454475,
0.07422545552253723,
-0.03900473564863205,
-0.25081735849380493,
-0.029602741822600365,
-0.033019475638866425,
-0.07529012858867645,
0.10572104156017303,
0.054924268275499344,
0.02690754644572735,
0.07617509365081787,
-0.03237130865454674,
0.06257183104753494,
0.028374768793582916,
0.08090323954820633,
0.0998358353972435,
0.0626244843006134,
0.11541303992271423,
-0.021999191492795944,
-0.01596651040017605,
0.07472196966409683,
0.015064149163663387,
0.2789067029953003,
-0.025747030973434448,
0.10583781450986862,
0.04117536544799805,
0.10950390249490738,
-0.004452122375369072,
0.03998316824436188,
0.01108674518764019,
-0.018274378031492233,
0.005021423101425171,
-0.06309919059276581,
-0.018259920179843903,
0.0433371439576149,
-0.058390986174345016,
0.00730983167886734,
-0.07158824801445007,
0.025510013103485107,
0.027839193120598793,
0.2532149851322174,
0.04583786800503731,
-0.3068157732486725,
-0.09455569088459015,
0.013967446982860565,
-0.02332892268896103,
-0.0840548500418663,
0.007408262696117163,
0.08909619599580765,
-0.14852942526340485,
0.08320222795009613,
-0.06837990134954453,
0.09003528952598572,
-0.03579707071185112,
-0.012129327282309532,
0.09756836295127869,
0.106270931661129,
-0.015852700918912888,
0.0883466973900795,
-0.19947534799575806,
0.2082238793373108,
0.01386308390647173,
0.09804719686508179,
-0.06861423701047897,
0.05572070926427841,
0.01624349132180214,
0.07080761343240738,
0.0863647609949112,
-0.006784107070416212,
-0.07637201249599457,
-0.14542800188064575,
-0.07618957757949829,
0.052325110882520676,
0.12321962416172028,
-0.05626019462943077,
0.08481334894895554,
-0.04817134141921997,
0.026657599955797195,
0.04968525469303131,
-0.06665200740098953,
-0.17311836779117584,
-0.14193427562713623,
0.03075544163584709,
-0.0001597248628968373,
0.04033338278532028,
-0.09536229819059372,
-0.09247767180204391,
-0.03879864513874054,
0.202443465590477,
-0.01533757895231247,
-0.06129753962159157,
-0.1558048576116562,
0.0836515799164772,
0.14162741601467133,
-0.05119151994585991,
0.04707248508930206,
0.018053283914923668,
0.15405498445034027,
0.04464840516448021,
-0.06764669716358185,
0.07895581424236298,
-0.06896454840898514,
-0.2182442843914032,
-0.0382150374352932,
0.12011189013719559,
0.06981609761714935,
0.03801032900810242,
0.014717566780745983,
0.03936118632555008,
0.002331683412194252,
-0.11640004813671112,
0.023306390270590782,
0.07254388928413391,
0.04914860799908638,
0.05035778507590294,
-0.044010620564222336,
0.04434604197740555,
-0.013003508560359478,
-0.030154023319482803,
0.10863694548606873,
0.2074228972196579,
-0.08222322911024094,
0.08011563867330551,
0.03341027349233627,
-0.0765724927186966,
-0.1702863872051239,
0.0798853263258934,
0.1380911022424698,
0.02976784110069275,
0.017685312777757645,
-0.21451328694820404,
0.11682779341936111,
0.14312215149402618,
-0.036812715232372284,
0.06622014939785004,
-0.2881423234939575,
-0.15034255385398865,
0.05678822472691536,
0.07414855808019638,
-0.009074762463569641,
-0.1415156126022339,
-0.04607443884015083,
-0.043727826327085495,
-0.15917985141277313,
0.14078907668590546,
-0.10871211439371109,
0.10940322279930115,
-0.009246126748621464,
0.08628284186124802,
0.017973702400922775,
-0.03172384947538376,
0.1647951900959015,
0.03812863305211067,
0.07472895085811615,
-0.04163019731640816,
0.05142807960510254,
0.09712348878383636,
-0.04807908087968826,
0.012026460841298103,
-0.030012009665369987,
0.06742549687623978,
-0.13662831485271454,
-0.033506136387586594,
-0.07388638705015182,
0.05226869136095047,
-0.04440850391983986,
-0.06887111067771912,
-0.05935078114271164,
0.05474331974983215,
0.03928995877504349,
-0.03864365071058273,
0.05869244039058685,
-0.005127707030624151,
0.1385597288608551,
0.06748254597187042,
0.08964981138706207,
-0.018308989703655243,
-0.0752725824713707,
0.012594154104590416,
-0.011092402040958405,
0.051276225596666336,
-0.14916390180587769,
0.0449952594935894,
0.11649524420499802,
0.04984729737043381,
0.1356705129146576,
0.049234528094530106,
-0.06099952757358551,
0.017692532390356064,
0.058254290372133255,
-0.09890444576740265,
-0.11751098930835724,
-0.006373237352818251,
-0.001508074114099145,
-0.119418203830719,
0.029406927525997162,
0.11848051846027374,
-0.04136922210454941,
-0.021165624260902405,
-0.014117058366537094,
0.015428725630044937,
-0.030447736382484436,
0.20130302011966705,
0.013119995594024658,
0.06387651711702347,
-0.09939859807491302,
0.10480138659477234,
0.055497318506240845,
-0.05753724277019501,
0.028664302080869675,
0.05767105519771576,
-0.0954023227095604,
-0.007180035579949617,
0.051014773547649384,
0.0921693816781044,
-0.06718603521585464,
-0.04079608991742134,
-0.06996816396713257,
-0.11733980476856232,
0.047144562005996704,
0.06892544776201248,
0.04116804897785187,
0.02000272087752819,
-0.01138002797961235,
0.03422784060239792,
-0.12120908498764038,
0.0604909248650074,
0.04742938280105591,
0.07741168886423111,
-0.12269758433103561,
0.11680595576763153,
0.009599682874977589,
-0.00006978632882237434,
-0.01605767197906971,
0.019900936633348465,
-0.09533830732107162,
-0.028455790132284164,
-0.1310848593711853,
0.0017158458940684795,
-0.023074598982930183,
0.006104113534092903,
-0.0028283242136240005,
-0.048646435141563416,
-0.04895760491490364,
0.03659684583544731,
-0.07995559275150299,
-0.058438196778297424,
-0.00960150919854641,
0.05060139670968056,
-0.1308886557817459,
0.00433354964479804,
0.04385334998369217,
-0.09930529445409775,
0.09862355887889862,
0.06371727585792542,
0.04727137088775635,
0.037141866981983185,
-0.1294003129005432,
0.004557596519589424,
0.03374103829264641,
0.009453170001506805,
0.0353870652616024,
-0.1037561222910881,
0.010859630070626736,
-0.03200271725654602,
0.03510637581348419,
0.026993468403816223,
0.026510225608944893,
-0.11936145275831223,
-0.009076609276235104,
-0.04560435190796852,
-0.039888668805360794,
-0.05516112595796585,
0.03962086886167526,
0.08421685546636581,
0.0433155857026577,
0.1343885064125061,
-0.08327415585517883,
0.015676189213991165,
-0.21599872410297394,
-0.030923722311854362,
-0.0043535116128623486,
-0.013434141874313354,
-0.07145988196134567,
-0.024114055559039116,
0.08143211156129837,
-0.03336752578616142,
0.17352519929409027,
-0.015235676430165768,
0.0722457766532898,
0.0386291928589344,
-0.03122337907552719,
0.008556229062378407,
0.016708282753825188,
0.19203728437423706,
0.08492367714643478,
-0.004338426049798727,
0.08844480663537979,
-0.00570372398942709,
0.04668379947543144,
0.008010256104171276,
0.19208675622940063,
0.1047019362449646,
-0.054195504635572433,
0.04690999165177345,
0.05912717059254646,
-0.12511099874973297,
-0.12327109277248383,
0.11754193902015686,
-0.05913925915956497,
0.09081396460533142,
-0.05916247144341469,
0.1417149156332016,
0.10587792098522186,
-0.18802131712436676,
0.03651903197169304,
-0.05778646096587181,
-0.10242173820734024,
-0.13152678310871124,
-0.047122277319431305,
-0.090831458568573,
-0.16177722811698914,
0.03486904501914978,
-0.13932952284812927,
0.057649124413728714,
0.10841464251279831,
0.017256436869502068,
0.030919240787625313,
0.13808958232402802,
-0.02843940630555153,
-0.009519982151687145,
0.04416767135262489,
0.016085322946310043,
0.025795605033636093,
-0.05786747485399246,
-0.08284411579370499,
0.046304576098918915,
0.01295674592256546,
0.08001073449850082,
-0.05634224787354469,
0.003194198478013277,
0.028086625039577484,
-0.0006059250445105135,
-0.04977502301335335,
0.012854145839810371,
0.02626699022948742,
0.02350691147148609,
0.0022348612546920776,
0.026544297114014626,
0.004408035892993212,
-0.03208516165614128,
0.2919228672981262,
-0.07395608723163605,
-0.09290627390146255,
-0.15733249485492706,
0.20676904916763306,
0.011479683220386505,
-0.002266510156914592,
0.07311812788248062,
-0.09448309987783432,
-0.0006939990562386811,
0.13333472609519958,
0.14294038712978363,
-0.05702243745326996,
-0.0195892546325922,
-0.011653220281004906,
-0.02378986030817032,
-0.031407058238983154,
0.13231152296066284,
0.07920178025960922,
0.044763822108507156,
-0.06168031319975853,
-0.01577906496822834,
0.013132250867784023,
-0.029027312994003296,
-0.04906829446554184,
0.09654957056045532,
-0.008962487801909447,
0.00402285810559988,
-0.03569363057613373,
0.08104681223630905,
0.010224464349448681,
-0.1466568261384964,
0.04450523853302002,
-0.14942236244678497,
-0.1819503903388977,
-0.020001187920570374,
0.01397382840514183,
-0.026120612397789955,
0.05810320004820824,
-0.00047963575343601406,
-0.0253159087151289,
0.11181306838989258,
-0.005339642055332661,
-0.06480508297681808,
-0.09566739946603775,
0.08847884833812714,
-0.03950171545147896,
0.2365870326757431,
-0.014056841842830181,
0.07519226521253586,
0.10603338479995728,
0.008917710743844509,
-0.13591895997524261,
0.0203112605959177,
0.053064633160829544,
-0.0938490629196167,
0.018548760563135147,
0.13147182762622833,
-0.048583388328552246,
0.053440485149621964,
0.05344317480921745,
-0.13570041954517365,
0.00537390261888504,
-0.01785200461745262,
-0.019201477989554405,
-0.09909766167402267,
-0.025634394958615303,
-0.07359416037797928,
0.1589617133140564,
0.19897744059562683,
-0.040922198444604874,
0.014653780497610569,
-0.07090739160776138,
0.04656761884689331,
0.051364731043577194,
0.1168251633644104,
-0.03638481721282005,
-0.23997798562049866,
0.0199450496584177,
0.06808729469776154,
0.007416222710162401,
-0.21675480902194977,
-0.08490599691867828,
0.03974597156047821,
-0.056440841406583786,
-0.043703142553567886,
0.13984327018260956,
0.06330178678035736,
0.02751578763127327,
-0.03171171620488167,
-0.13023272156715393,
-0.03965922072529793,
0.14839377999305725,
-0.14497053623199463,
-0.053326524794101715
] |
null | null | transformers |
# MoMoAlpaca-72b
MoMoAlpaca-72b is a merge of the following models using [LazyMergekit](https://colab.research.google.com/drive/1obulZ1ROXHjYLn6PPZJwRR6GzgQogxxb?usp=sharing):
* [ibivibiv/alpaca-dragon-72b-v1](https://huggingface.co/ibivibiv/alpaca-dragon-72b-v1)
* [moreh/MoMo-72B-lora-1.8.7-DPO](https://huggingface.co/moreh/MoMo-72B-lora-1.8.7-DPO)
## 🧩 Configuration
```yaml
slices:
- sources:
- model: ibivibiv/alpaca-dragon-72b-v1
layer_range: [0, 80]
- model: moreh/MoMo-72B-lora-1.8.7-DPO
layer_range: [0, 80]
merge_method: slerp
base_model: ibivibiv/alpaca-dragon-72b-v1
parameters:
t:
- filter: self_attn
value: [0, 0.5, 0.3, 0.7, 1]
- filter: mlp
value: [1, 0.5, 0.7, 0.3, 0]
- value: 0.5
dtype: float32
```
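For intuition: the `slerp` merge method interpolates each pair of matching weight tensors along a great-circle path rather than a straight line, and the `t` values above set a per-layer-group interpolation factor (self-attention vs. MLP blocks, with 0.5 elsewhere). The snippet below is only an illustrative sketch of that operation on two tensors; it is not mergekit's actual implementation.

```python
import torch

def slerp(w0: torch.Tensor, w1: torch.Tensor, t: float, eps: float = 1e-8) -> torch.Tensor:
    """Spherical linear interpolation between two weight tensors (illustrative sketch only)."""
    v0, v1 = w0.flatten().float(), w1.flatten().float()
    dot = torch.clamp(torch.dot(v0 / (v0.norm() + eps), v1 / (v1.norm() + eps)), -1.0, 1.0)
    theta = torch.arccos(dot)
    if theta.abs() < 1e-4:                         # nearly parallel: fall back to plain lerp
        return (1 - t) * w0 + t * w1
    s0 = torch.sin((1 - t) * theta) / torch.sin(theta)
    s1 = torch.sin(t * theta) / torch.sin(theta)
    return (s0 * v0 + s1 * v1).reshape(w0.shape).to(w0.dtype)
```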
## 💻 Usage
```python
!pip install -qU transformers accelerate

from transformers import AutoTokenizer
import transformers
import torch

model = "paulml/MoMoAlpaca-72b"
messages = [{"role": "user", "content": "What is a large language model?"}]

# Format the chat messages with the tokenizer's chat template.
tokenizer = AutoTokenizer.from_pretrained(model)
prompt = tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)

# Load the merged model for generation (fp16 weights, automatic device placement).
pipeline = transformers.pipeline(
    "text-generation",
    model=model,
    torch_dtype=torch.float16,
    device_map="auto",
)

# Sample a response from the model.
outputs = pipeline(prompt, max_new_tokens=256, do_sample=True, temperature=0.7, top_k=50, top_p=0.95)
print(outputs[0]["generated_text"])
``` | {"tags": ["merge", "mergekit", "lazymergekit", "ibivibiv/alpaca-dragon-72b-v1", "moreh/MoMo-72B-lora-1.8.7-DPO"], "base_model": ["ibivibiv/alpaca-dragon-72b-v1", "moreh/MoMo-72B-lora-1.8.7-DPO"]} | text-generation | paulml/MoMoAlpaca-72b | [
"transformers",
"safetensors",
"llama",
"text-generation",
"merge",
"mergekit",
"lazymergekit",
"ibivibiv/alpaca-dragon-72b-v1",
"moreh/MoMo-72B-lora-1.8.7-DPO",
"base_model:ibivibiv/alpaca-dragon-72b-v1",
"base_model:moreh/MoMo-72B-lora-1.8.7-DPO",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | 2024-02-14T17:22:59+00:00 | [] | [] | TAGS
#transformers #safetensors #llama #text-generation #merge #mergekit #lazymergekit #ibivibiv/alpaca-dragon-72b-v1 #moreh/MoMo-72B-lora-1.8.7-DPO #base_model-ibivibiv/alpaca-dragon-72b-v1 #base_model-moreh/MoMo-72B-lora-1.8.7-DPO #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
# MoMoAlpaca-72b
MoMoAlpaca-72b is a merge of the following models using LazyMergekit:
* ibivibiv/alpaca-dragon-72b-v1
* moreh/MoMo-72B-lora-1.8.7-DPO
## Configuration
## Usage
| [
"# MoMoAlpaca-72b\n\nMoMoAlpaca-72b is a merge of the following models using LazyMergekit:\n* ibivibiv/alpaca-dragon-72b-v1\n* moreh/MoMo-72B-lora-1.8.7-DPO",
"## Configuration",
"## Usage"
] | [
"TAGS\n#transformers #safetensors #llama #text-generation #merge #mergekit #lazymergekit #ibivibiv/alpaca-dragon-72b-v1 #moreh/MoMo-72B-lora-1.8.7-DPO #base_model-ibivibiv/alpaca-dragon-72b-v1 #base_model-moreh/MoMo-72B-lora-1.8.7-DPO #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n",
"# MoMoAlpaca-72b\n\nMoMoAlpaca-72b is a merge of the following models using LazyMergekit:\n* ibivibiv/alpaca-dragon-72b-v1\n* moreh/MoMo-72B-lora-1.8.7-DPO",
"## Configuration",
"## Usage"
] | [
138,
66,
4,
3
] | [
"passage: TAGS\n#transformers #safetensors #llama #text-generation #merge #mergekit #lazymergekit #ibivibiv/alpaca-dragon-72b-v1 #moreh/MoMo-72B-lora-1.8.7-DPO #base_model-ibivibiv/alpaca-dragon-72b-v1 #base_model-moreh/MoMo-72B-lora-1.8.7-DPO #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n# MoMoAlpaca-72b\n\nMoMoAlpaca-72b is a merge of the following models using LazyMergekit:\n* ibivibiv/alpaca-dragon-72b-v1\n* moreh/MoMo-72B-lora-1.8.7-DPO## Configuration## Usage"
] | [
-0.062039170414209366,
0.0733378455042839,
-0.0061091212555766106,
0.0040327818132936954,
0.11086922138929367,
0.022956574335694313,
0.21291886270046234,
0.11940356343984604,
-0.0809999406337738,
0.020705578848719597,
0.016030583530664444,
0.14658509194850922,
0.061522603034973145,
0.1337243914604187,
-0.030541623011231422,
-0.227370023727417,
0.07833224534988403,
0.022180093452334404,
-0.04302578419446945,
0.09006798267364502,
0.10381966829299927,
-0.04311009868979454,
0.09748341888189316,
0.033341892063617706,
-0.05736791342496872,
0.020332863554358482,
-0.05233461782336235,
-0.07249458879232407,
0.07235164195299149,
0.07552742958068848,
0.08464207500219345,
0.04641858860850334,
-0.039142511785030365,
-0.10966397076845169,
0.02738041616976261,
0.002404050435870886,
-0.016727523878216743,
0.027721522375941277,
0.1291549652814865,
-0.027586188167333603,
0.09390678256750107,
-0.05164945498108864,
-0.009124393574893475,
0.053796373307704926,
-0.07801123708486557,
-0.2149786651134491,
-0.07738842070102692,
0.01910657249391079,
0.07319195568561554,
0.04445631057024002,
-0.02136174403131008,
0.12447416037321091,
0.02663632109761238,
0.051877520978450775,
0.24504674971103668,
-0.3335929811000824,
-0.022953763604164124,
0.1453176885843277,
0.0734524205327034,
-0.05872330442070961,
0.01949230022728443,
0.054323505610227585,
-0.012222819961607456,
-0.007122789043933153,
-0.015439169481396675,
-0.10668221116065979,
0.19945108890533447,
-0.09830337762832642,
-0.07761198282241821,
0.028493085876107216,
0.1591816246509552,
-0.007404080126434565,
-0.049110136926174164,
-0.07650452852249146,
-0.07901857793331146,
0.12434292584657669,
-0.08753812313079834,
-0.03038959391415119,
0.07641483098268509,
0.008342309854924679,
0.07585329562425613,
-0.03602368384599686,
-0.0489269383251667,
-0.010724703781306744,
-0.09817001223564148,
0.12908951938152313,
0.014978817664086819,
-0.012688776478171349,
-0.08762095868587494,
0.025769589468836784,
-0.04000167176127434,
-0.13820812106132507,
0.002426503924652934,
-0.04914874956011772,
0.03903403878211975,
0.03283609077334404,
-0.03426513075828552,
-0.1258080154657364,
0.15052592754364014,
0.09878114610910416,
-0.09585551917552948,
0.0768943652510643,
-0.03032372146844864,
0.043985914438962936,
0.030941423028707504,
-0.0019275173544883728,
-0.09824636578559875,
-0.1647297888994217,
0.05414074286818504,
0.04484540596604347,
0.05885202810168266,
-0.008515041321516037,
-0.1169387698173523,
-0.015194248408079147,
0.07156085968017578,
0.038516998291015625,
0.08229809999465942,
0.06691163033246994,
-0.07441836595535278,
-0.059778906404972076,
0.04286479577422142,
-0.09574577957391739,
0.023964257910847664,
-0.019541524350643158,
-0.02451075054705143,
0.07575029134750366,
0.0142085961997509,
0.08084706217050552,
-0.04585302248597145,
0.036045026034116745,
-0.050899919122457504,
-0.017042001709342003,
-0.05383623391389847,
-0.09115514159202576,
0.020284395664930344,
-0.02684563584625721,
-0.031429894268512726,
-0.14103539288043976,
-0.16086041927337646,
-0.022876406088471413,
0.029699375852942467,
-0.05495302751660347,
0.0010463364887982607,
-0.040025241672992706,
0.010119901038706303,
-0.012199818156659603,
0.0062436000443995,
0.01661684922873974,
0.003842778969556093,
0.013972495682537556,
0.059294283390045166,
0.04954075813293457,
-0.09321193397045135,
0.026833627372980118,
-0.08084240555763245,
0.1269780993461609,
-0.17414702475070953,
0.08007195591926575,
-0.018498677760362625,
0.04750169441103935,
-0.13748498260974884,
0.03876705467700958,
-0.002290169708430767,
0.04370471462607384,
0.059431444853544235,
0.1403275430202484,
-0.07550062984228134,
-0.06972548365592957,
0.17468880116939545,
-0.14052580296993256,
-0.132298082113266,
0.06108681112527847,
0.03748864680528641,
0.09202893078327179,
0.06136709079146385,
0.14092372357845306,
0.06511906534433365,
-0.018079223111271858,
-0.03428594022989273,
-0.03515608608722687,
0.03968287631869316,
-0.03308267891407013,
0.12512746453285217,
-0.04310492426156998,
-0.07520262897014618,
0.04304532706737518,
0.008396023884415627,
0.03821275383234024,
-0.015266657806932926,
-0.05404168739914894,
-0.03109104372560978,
-0.04863501712679863,
0.05415915697813034,
-0.03809921070933342,
0.03733723238110542,
-0.05574290081858635,
-0.04390166699886322,
0.06969260424375534,
0.07449513673782349,
-0.014152011834084988,
-0.027553964406251907,
-0.1338152438402176,
0.12833498418331146,
-0.12324705719947815,
0.07136540114879608,
-0.09290368109941483,
-0.07886697351932526,
0.007306857034564018,
-0.050343260169029236,
-0.01677173748612404,
-0.03697950020432472,
0.0661582201719284,
0.04941922426223755,
-0.03295786678791046,
-0.03529601916670799,
0.08303368836641312,
0.025244617834687233,
-0.0156306941062212,
-0.15269851684570312,
-0.01150947529822588,
-0.03685610368847847,
0.16106005012989044,
-0.03434543311595917,
0.0729820728302002,
0.033876169472932816,
0.17127625644207,
-0.010942073538899422,
0.0374053530395031,
0.09487200528383255,
0.0316908061504364,
-0.044678520411252975,
-0.026776587590575218,
0.09912476688623428,
0.010524445213377476,
-0.14649447798728943,
0.16957277059555054,
-0.13412028551101685,
0.19379492104053497,
0.15083721280097961,
0.03570926561951637,
0.028072001412510872,
-0.05554947629570961,
0.02778553031384945,
-0.0483660027384758,
0.059916071593761444,
-0.05463365465402603,
0.08000766485929489,
0.04074149578809738,
0.13730396330356598,
-0.12431218475103378,
0.023930871859192848,
0.03287692740559578,
-0.08415162563323975,
-0.07177770882844925,
0.05215473845601082,
0.037996307015419006,
-0.1798071414232254,
0.1382395178079605,
0.19926205277442932,
0.05450192466378212,
0.13974669575691223,
0.007439866662025452,
0.0031266706064343452,
-0.09634213894605637,
0.04393499344587326,
0.033804915845394135,
-0.018291370943188667,
-0.09770210832357407,
0.03599104657769203,
0.05878572165966034,
-0.0004823979688808322,
0.06817791610956192,
-0.10244888067245483,
-0.009390813298523426,
0.03357161208987236,
0.009862711653113365,
0.1110217422246933,
0.05960283428430557,
-0.00802539475262165,
0.09778113663196564,
0.020816665142774582,
-0.027900349348783493,
0.02596300095319748,
0.024768365547060966,
-0.04741440340876579,
0.15446558594703674,
-0.1285897195339203,
-0.3706437647342682,
-0.11838720738887787,
-0.10904232412576675,
-0.12909378111362457,
-0.002354998840019107,
0.04553946480154991,
-0.10492251068353653,
-0.012381622567772865,
-0.06525234133005142,
0.08883090317249298,
-0.014554581604897976,
0.006948194000869989,
0.020184345543384552,
0.054240938276052475,
0.020091742277145386,
-0.08909877389669418,
-0.05848831683397293,
0.07514414191246033,
-0.06530977785587311,
0.10467234253883362,
-0.08755254745483398,
0.11048821359872818,
0.05167066678404808,
0.020775901153683662,
-0.026011843234300613,
0.01568005047738552,
0.1151776909828186,
-0.05676359310746193,
0.0573384054005146,
0.250492662191391,
0.011383814737200737,
0.07333773374557495,
0.10389174520969391,
0.02292524464428425,
-0.08051040768623352,
-0.014798141084611416,
-0.007511347532272339,
-0.04967757314443588,
-0.1609426587820053,
-0.05612371116876602,
-0.050873901695013046,
0.029429158195853233,
0.023333869874477386,
0.0558069571852684,
0.021530404686927795,
0.08426082879304886,
-0.036236926913261414,
0.05757565051317215,
0.0010218103416264057,
0.06741549074649811,
0.12154599279165268,
-0.017480377107858658,
0.11052592843770981,
-0.03805804252624512,
-0.05983662232756615,
0.057091884315013885,
0.08045229315757751,
0.02522221952676773,
0.07022924721240997,
0.0930936187505722,
0.0597873292863369,
0.041796423494815826,
0.0592130571603775,
0.064185731112957,
-0.0306682251393795,
-0.01233554258942604,
-0.053923171013593674,
-0.10289748758077621,
-0.0438125841319561,
0.03203323855996132,
-0.004646227695047855,
-0.006278282962739468,
-0.02275446243584156,
0.050648465752601624,
0.04838905110955238,
0.15778401494026184,
0.012436737306416035,
-0.24578414857387543,
0.002511401427909732,
0.07154779136180878,
0.06462369859218597,
-0.0559387169778347,
-0.027153704315423965,
0.0451466329395771,
-0.0801444724202156,
0.1835935264825821,
-0.018931856378912926,
0.08580712229013443,
-0.041389528661966324,
0.010607941076159477,
-0.04825839400291443,
0.09299679845571518,
0.000621603976469487,
0.04915370047092438,
-0.2862745225429535,
0.03637509420514107,
0.05777374655008316,
-0.0002032655756920576,
-0.04321962222456932,
0.05902988091111183,
0.031021520495414734,
0.10324805229902267,
0.045973170548677444,
0.001682901056483388,
0.011036241427063942,
0.004874789156019688,
-0.11082997173070908,
0.023893563076853752,
0.016251837834715843,
-0.0416782908141613,
0.11148951202630997,
-0.03163066878914833,
-0.02814577706158161,
-0.014345457777380943,
0.05726056173443794,
-0.13132885098457336,
-0.09460903704166412,
0.09599429368972778,
0.09172312915325165,
0.052184879779815674,
-0.1239565759897232,
-0.007298183161765337,
0.023525359109044075,
0.25708991289138794,
-0.025437604635953903,
-0.1028091236948967,
-0.0951618105173111,
-0.01311209425330162,
0.07992511242628098,
-0.03031410463154316,
0.009792941622436047,
-0.04872819408774376,
0.11684336513280869,
-0.08784929662942886,
-0.09287524968385696,
0.08543389290571213,
-0.08520647883415222,
-0.12062761932611465,
-0.06072777509689331,
0.09975659102201462,
-0.10336652398109436,
0.029301472008228302,
0.008680782280862331,
-0.008019877597689629,
0.003057752503082156,
-0.0383911207318306,
-0.007496558129787445,
0.13405540585517883,
-0.006290704943239689,
0.03274143114686012,
-0.09229255467653275,
-0.114386186003685,
-0.03635412082076073,
-0.07103712856769562,
0.17914864420890808,
0.27903714776039124,
0.03648291155695915,
0.0508895218372345,
0.16968974471092224,
-0.06832918524742126,
-0.20245005190372467,
-0.10112948715686798,
0.02125365659594536,
0.005644399672746658,
-0.04134984314441681,
-0.1734078973531723,
0.027340782806277275,
0.15144580602645874,
-0.011598628014326096,
0.12318778783082962,
-0.3337705731391907,
-0.11923237144947052,
0.05705412104725838,
0.0980190858244896,
0.23583190143108368,
-0.22155866026878357,
-0.08369629830121994,
-0.143681138753891,
-0.06572464108467102,
0.09524893015623093,
-0.1059916615486145,
0.09837227314710617,
-0.03562473505735397,
-0.028017643839120865,
0.04059246927499771,
-0.03386843577027321,
0.1406131535768509,
-0.08770697563886642,
0.005938021931797266,
-0.11478199064731598,
-0.03467455133795738,
0.21357759833335876,
-0.018733063712716103,
0.05445015802979469,
-0.1544262170791626,
-0.0009004083694890141,
-0.1132977157831192,
-0.03366965427994728,
-0.025261033326387405,
0.03578011691570282,
-0.04973478615283966,
-0.05299673601984978,
0.0020057414658367634,
0.04678820073604584,
-0.009268239140510559,
0.02167411521077156,
0.09139237552881241,
-0.053851768374443054,
0.021638933569192886,
0.21401867270469666,
0.04170471429824829,
-0.05231519043445587,
-0.03038983978331089,
-0.025998078286647797,
-0.024369293823838234,
0.04736775904893875,
-0.06830928474664688,
-0.023210588842630386,
0.050713155418634415,
-0.040646471083164215,
0.07844054698944092,
0.01358796562999487,
-0.07788634300231934,
-0.0032983773853629827,
0.11754291504621506,
-0.10154526680707932,
-0.1776839643716812,
-0.025721389800310135,
0.13874341547489166,
-0.015955492854118347,
0.007236976642161608,
0.22159476578235626,
0.006974605843424797,
-0.01476206909865141,
0.011622834950685501,
0.030975064262747765,
-0.0614728182554245,
0.15077535808086395,
-0.023398444056510925,
0.04023435339331627,
-0.09646125882863998,
0.06530921906232834,
0.04775627329945564,
-0.004852867219597101,
0.003520986996591091,
0.0852523148059845,
-0.07601819187402725,
-0.08077641576528549,
-0.01635134033858776,
0.21970033645629883,
-0.0027492111548781395,
-0.07071319222450256,
-0.13825368881225586,
-0.13463497161865234,
0.04031487554311752,
-0.04574993625283241,
0.04225437343120575,
-0.0029665862675756216,
0.06504756212234497,
-0.10519003868103027,
-0.013695543631911278,
0.02418685518205166,
-0.02380889281630516,
0.07915300875902176,
-0.122635118663311,
-0.015449482016265392,
0.023919997736811638,
0.010938366875052452,
-0.030884889885783195,
-0.0009864738676697016,
-0.1515132188796997,
-0.03266463801264763,
-0.17132890224456787,
0.007504620123654604,
-0.14155173301696777,
-0.060915350914001465,
-0.011784081347286701,
0.02094339206814766,
-0.00806917529553175,
-0.023215249180793762,
-0.04358983412384987,
-0.054356031119823456,
-0.051609963178634644,
0.05747712403535843,
-0.048395056277513504,
-0.028675271198153496,
-0.0026358612813055515,
-0.0650840550661087,
0.03690197318792343,
0.01924494095146656,
0.003402934642508626,
-0.06278093159198761,
-0.06832144409418106,
-0.026728641241788864,
0.08614617586135864,
0.007164470385760069,
0.02899298071861267,
-0.090961754322052,
0.005090577062219381,
0.025391802191734314,
-0.0436229445040226,
-0.00860777497291565,
0.18476292490959167,
-0.1303177773952484,
-0.009527230635285378,
-0.016990115866065025,
-0.07525422424077988,
-0.013650856912136078,
0.0109783336520195,
0.11191988736391068,
0.01542806439101696,
0.182674840092659,
-0.05202721431851387,
0.041739676147699356,
-0.13435545563697815,
-0.015027002431452274,
-0.04643937572836876,
-0.13184736669063568,
0.10410887002944946,
-0.052026715129613876,
-0.002447225619107485,
-0.011190886609256268,
0.09120091795921326,
0.030042950063943863,
-0.061177272349596024,
0.004921192303299904,
-0.05502855405211449,
0.043606244027614594,
0.03130355104804039,
0.1600932776927948,
0.09629841148853302,
0.014987473376095295,
-0.06317561119794846,
0.07359402626752853,
0.078739695250988,
-0.0147669967263937,
0.0029838886111974716,
0.14772965013980865,
0.05481244623661041,
0.09768632799386978,
0.13827459514141083,
-0.017709925770759583,
-0.11241105943918228,
0.05752633884549141,
-0.05053766444325447,
0.08804120868444443,
-0.017480263486504555,
0.13420778512954712,
0.18901827931404114,
-0.08506608754396439,
0.0478244349360466,
0.005695861764252186,
0.019848357886075974,
-0.07965268939733505,
-0.09453471750020981,
-0.1269676685333252,
-0.10418704897165298,
-0.044437773525714874,
-0.08010473847389221,
-0.044202324002981186,
-0.011076531372964382,
-0.0037444764748215675,
-0.007598368916660547,
0.13323624432086945,
0.006945472676306963,
-0.10455160588026047,
0.025013040751218796,
-0.02121778577566147,
-0.04232149198651314,
0.02426467090845108,
-0.07056897133588791,
0.015481041744351387,
0.057810310274362564,
0.0135524682700634,
0.045589007437229156,
-0.026268044486641884,
0.0584159754216671,
-0.04811881482601166,
-0.11276627331972122,
-0.009281500242650509,
-0.014969399198889732,
-0.004265861585736275,
-0.021046210080385208,
0.03311718627810478,
-0.06294471770524979,
0.006198907736688852,
0.06782516092061996,
0.019907157868146896,
-0.10379435867071152,
-0.03073486126959324,
0.12086384743452072,
-0.026772189885377884,
0.04492892697453499,
-0.012804592959582806,
-0.042719654738903046,
-0.017715493217110634,
0.15371480584144592,
0.2872733771800995,
-0.044042497873306274,
-0.0115361288189888,
-0.008045190945267677,
0.014043357223272324,
0.02617134526371956,
0.11656520515680313,
0.0471796914935112,
0.14970721304416656,
-0.024353206157684326,
0.08981764316558838,
-0.09254934638738632,
-0.06284268200397491,
-0.0807749330997467,
0.06539967656135559,
0.03374207019805908,
0.0012970800744369626,
0.0266250129789114,
0.08564469963312149,
-0.026784440502524376,
-0.04469072446227074,
0.054735951125621796,
-0.12567003071308136,
-0.1296113282442093,
-0.06908200681209564,
0.04394499585032463,
0.01048933994024992,
0.0990186259150505,
-0.010828092694282532,
-0.011502564884722233,
0.01606033742427826,
-0.01159762218594551,
-0.10082422196865082,
-0.06263373792171478,
0.026873894035816193,
-0.1072353646159172,
0.07426397502422333,
-0.015776343643665314,
-0.05149003490805626,
0.12772563099861145,
-0.010177378542721272,
-0.09796192497015,
0.07903757691383362,
0.00807843916118145,
0.02362729236483574,
0.09710073471069336,
0.01567843370139599,
-0.037795163691043854,
0.06714700162410736,
0.07690945267677307,
-0.10365516692399979,
0.021862639114260674,
0.08037754893302917,
-0.015499907545745373,
-0.04820405691862106,
0.09475340694189072,
-0.0860801711678505,
0.07466907054185867,
0.16693055629730225,
-0.047470517456531525,
0.017813164740800858,
-0.0337824784219265,
0.08127965033054352,
0.04153180122375488,
0.020705802366137505,
-0.05734872445464134,
-0.2107645720243454,
-0.044145796447992325,
0.0626717135310173,
0.05423499271273613,
-0.23602056503295898,
-0.0627746433019638,
-0.07950954139232635,
-0.03340636193752289,
-0.1104988306760788,
0.03027639351785183,
0.1065078005194664,
0.028523704037070274,
-0.04776609688997269,
-0.18925614655017853,
-0.027976399287581444,
0.0994805172085762,
-0.08106449991464615,
-0.07726655155420303
] |
null | null | transformers |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# legalbench_task_classification
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 1.7115
- Accuracy: 0.9346
## Model description
More information needed
## Intended uses & limitations
More information needed
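No usage guidance is given here. As a minimal sketch, assuming the fine-tuned checkpoint is available under the hub id `prithviraj-maurya/legalbench_task_classification` with its label mapping intact, it could be queried like this (the input sentence is purely illustrative, and the card does not document the label set):

```python
from transformers import pipeline

# Assumed hub id; the label set returned by the classifier is not documented in this card.
classifier = pipeline("text-classification", model="prithviraj-maurya/legalbench_task_classification")
print(classifier("Does this clause limit the seller's liability for consequential damages?"))
```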
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 2
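These settings map onto a `TrainingArguments` setup along the lines of the sketch below; it is a reconstruction under assumptions (output directory, metric wiring), not the original training code.

```python
import numpy as np
import evaluate
from transformers import TrainingArguments

accuracy = evaluate.load("accuracy")

def compute_metrics(eval_pred):
    # The reported accuracy (0.9346) would come from a metric hook like this one.
    logits, labels = eval_pred
    return accuracy.compute(predictions=np.argmax(logits, axis=-1), references=labels)

training_args = TrainingArguments(
    output_dir="legalbench_task_classification",  # assumed
    learning_rate=2e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=2,
)
# These arguments, a DistilBERT sequence-classification model, and the (undocumented)
# dataset would then be passed to transformers.Trainer.
```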
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| No log | 1.0 | 489 | 2.3560 | 0.9223 |
| 3.6023 | 2.0 | 978 | 1.7115 | 0.9346 |
### Framework versions
- Transformers 4.35.2
- Pytorch 2.1.0+cu121
- Datasets 2.17.0
- Tokenizers 0.15.1
| {"license": "apache-2.0", "tags": ["generated_from_trainer"], "metrics": ["accuracy"], "base_model": "distilbert-base-uncased", "model-index": [{"name": "legalbench_task_classification", "results": []}]} | text-classification | prithviraj-maurya/legalbench_task_classification | [
"transformers",
"tensorboard",
"safetensors",
"distilbert",
"text-classification",
"generated_from_trainer",
"base_model:distilbert-base-uncased",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | 2024-02-14T17:26:13+00:00 | [] | [] | TAGS
#transformers #tensorboard #safetensors #distilbert #text-classification #generated_from_trainer #base_model-distilbert-base-uncased #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us
| legalbench\_task\_classification
================================
This model is a fine-tuned version of distilbert-base-uncased on an unspecified dataset.
It achieves the following results on the evaluation set:
* Loss: 1.7115
* Accuracy: 0.9346
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 2e-05
* train\_batch\_size: 16
* eval\_batch\_size: 16
* seed: 42
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* num\_epochs: 2
### Training results
### Framework versions
* Transformers 4.35.2
* Pytorch 2.1.0+cu121
* Datasets 2.17.0
* Tokenizers 0.15.1
| [
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 2",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.35.2\n* Pytorch 2.1.0+cu121\n* Datasets 2.17.0\n* Tokenizers 0.15.1"
] | [
"TAGS\n#transformers #tensorboard #safetensors #distilbert #text-classification #generated_from_trainer #base_model-distilbert-base-uncased #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 2",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.35.2\n* Pytorch 2.1.0+cu121\n* Datasets 2.17.0\n* Tokenizers 0.15.1"
] | [
72,
98,
4,
33
] | [
"passage: TAGS\n#transformers #tensorboard #safetensors #distilbert #text-classification #generated_from_trainer #base_model-distilbert-base-uncased #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 2### Training results### Framework versions\n\n\n* Transformers 4.35.2\n* Pytorch 2.1.0+cu121\n* Datasets 2.17.0\n* Tokenizers 0.15.1"
] | [
-0.10009026527404785,
0.11160969734191895,
-0.0027496120892465115,
0.11274062842130661,
0.1415753960609436,
0.015253284014761448,
0.16106432676315308,
0.11433778703212738,
-0.06629235297441483,
0.04736238718032837,
0.12742555141448975,
0.12664781510829926,
0.015125550329685211,
0.11821281164884567,
-0.08220787346363068,
-0.2123468965291977,
0.009614093229174614,
0.022669987753033638,
-0.06324856728315353,
0.11473038047552109,
0.09355748444795609,
-0.12131164222955704,
0.08896705508232117,
-0.01618121564388275,
-0.16749274730682373,
0.004479446914047003,
0.018598522990942,
-0.04856426268815994,
0.12484274804592133,
0.033396780490875244,
0.13292871415615082,
0.03729959577322006,
0.08550615608692169,
-0.19182592630386353,
0.01019546203315258,
0.06083536520600319,
-0.005236768163740635,
0.08214868605136871,
0.036239225417375565,
-0.008861588314175606,
0.0689060389995575,
-0.09362684935331345,
0.06169769540429115,
0.01730477251112461,
-0.1281389594078064,
-0.20565582811832428,
-0.08748676627874374,
0.0345340222120285,
0.09289178997278214,
0.07540102303028107,
-0.010323443450033665,
0.11617514491081238,
-0.053170982748270035,
0.095461905002594,
0.20109839737415314,
-0.30546998977661133,
-0.061592064797878265,
0.045971520245075226,
0.025511497631669044,
0.08938869833946228,
-0.09764400869607925,
-0.019794879481196404,
0.05909418314695358,
0.023731868714094162,
0.1283196061849594,
-0.02459762804210186,
-0.05919263884425163,
-0.00016075087478384376,
-0.14285771548748016,
-0.018168844282627106,
0.1520322561264038,
0.04991479963064194,
-0.04669204354286194,
-0.04606898874044418,
-0.07427772879600525,
-0.12956564128398895,
-0.03992857038974762,
-0.009009100496768951,
0.049566734582185745,
-0.020227845758199692,
-0.059866465628147125,
-0.02504808083176613,
-0.09613154083490372,
-0.06567185372114182,
-0.05519591644406319,
0.14483514428138733,
0.03390048071742058,
0.005107732489705086,
-0.010454843752086163,
0.09989625215530396,
-0.025157025083899498,
-0.14584025740623474,
0.021651286631822586,
0.02088100090622902,
0.0038876524195075035,
-0.05028783529996872,
-0.05066544935107231,
-0.08078357577323914,
0.021042698994278908,
0.15937280654907227,
-0.05065108463168144,
0.051086973398923874,
0.0014541882555931807,
0.047361548990011215,
-0.10235797613859177,
0.16860710084438324,
-0.04100237786769867,
-0.03205214813351631,
0.024882985278964043,
0.0915633961558342,
0.057007674127817154,
-0.015842808410525322,
-0.13049520552158356,
0.032934628427028656,
0.10394022613763809,
0.021081632003188133,
-0.05064450576901436,
0.0666612759232521,
-0.05926511809229851,
-0.013549057766795158,
0.04191214591264725,
-0.0947137102484703,
0.02687942422926426,
0.0050791483372449875,
-0.055928558111190796,
-0.04534420371055603,
0.03286640718579292,
0.021833768114447594,
0.006006360054016113,
0.10760451853275299,
-0.08141431957483292,
0.01132600475102663,
-0.08155348896980286,
-0.12982062995433807,
0.017577555030584335,
-0.0940791442990303,
0.019820084795355797,
-0.10942970961332321,
-0.182551771402359,
-0.013015341944992542,
0.06347040086984634,
-0.028870299458503723,
-0.03233719989657402,
-0.06380421668291092,
-0.078241266310215,
0.01850624568760395,
-0.0117806913331151,
0.06485705077648163,
-0.06430386006832123,
0.09751855581998825,
0.03426656126976013,
0.06650882959365845,
-0.06269234418869019,
0.041505325585603714,
-0.10320577770471573,
0.04153472185134888,
-0.17908348143100739,
0.03730727359652519,
-0.0701146274805069,
0.07037714123725891,
-0.08094709366559982,
-0.0708991065621376,
0.003395860316231847,
-0.0031841725576668978,
0.0748145654797554,
0.09878626465797424,
-0.17704352736473083,
-0.06244032829999924,
0.15334439277648926,
-0.0872621238231659,
-0.140264093875885,
0.13796526193618774,
-0.05843111872673035,
0.043838705867528915,
0.06329336762428284,
0.19044199585914612,
0.07654380053281784,
-0.08543851226568222,
0.005410780198872089,
0.004115215502679348,
0.06722275167703629,
-0.031219158321619034,
0.070389024913311,
-0.0014077028026804328,
0.002353328512981534,
0.01235162653028965,
-0.05414658784866333,
0.05086473375558853,
-0.07783375680446625,
-0.09105043858289719,
-0.04125034809112549,
-0.10483746230602264,
0.06753041595220566,
0.04880185052752495,
0.06205010786652565,
-0.10885307937860489,
-0.08726228773593903,
0.0645194873213768,
0.07480038702487946,
-0.07397060096263885,
0.024549057707190514,
-0.06873343884944916,
0.09160102158784866,
-0.06046400964260101,
-0.015125409699976444,
-0.15745609998703003,
-0.04514223709702492,
0.019264424219727516,
0.001665001269429922,
0.018174780532717705,
-0.006223998032510281,
0.07070334255695343,
0.08369956910610199,
-0.066886305809021,
-0.03152180463075638,
-0.01429915614426136,
0.015469240956008434,
-0.1237948089838028,
-0.20028281211853027,
-0.014332279562950134,
-0.03495370224118233,
0.14990484714508057,
-0.23330263793468475,
0.05275983363389969,
-0.0014653714606538415,
0.08926111459732056,
0.04123585671186447,
-0.011081450618803501,
-0.0382685661315918,
0.06879528611898422,
-0.04978092014789581,
-0.06990444660186768,
0.06004130840301514,
0.010039977729320526,
-0.10479165613651276,
-0.04227975010871887,
-0.14891855418682098,
0.18198205530643463,
0.1326628476381302,
-0.08023189008235931,
-0.07286596298217773,
0.008080147206783295,
-0.03473601117730141,
-0.02693113684654236,
-0.03794703260064125,
0.004339942708611488,
0.12792786955833435,
-0.005620430689305067,
0.1549476534128189,
-0.08662332594394684,
-0.03190510720014572,
0.02045891247689724,
-0.04817867651581764,
0.009009075351059437,
0.11460258066654205,
0.08953093737363815,
-0.10654234886169434,
0.14863590896129608,
0.1966652274131775,
-0.09579864889383316,
0.13008049130439758,
-0.044558241963386536,
-0.05200420320034027,
-0.02535102516412735,
0.009945621713995934,
0.015098042786121368,
0.10877082496881485,
-0.11376207321882248,
0.002323142485693097,
0.010002020746469498,
0.013573698699474335,
0.010600542649626732,
-0.21604084968566895,
-0.024857569485902786,
0.041889939457178116,
-0.05180535838007927,
-0.0009464218164794147,
-0.024861477315425873,
-0.007254310883581638,
0.09820067137479782,
-0.0046440837904810905,
-0.08839879184961319,
0.046969007700681686,
-0.004161829128861427,
-0.07780678570270538,
0.20285628736019135,
-0.09424959868192673,
-0.14353874325752258,
-0.136247456073761,
-0.06719893962144852,
-0.05483291298151016,
0.03242390975356102,
0.06164886802434921,
-0.06720905750989914,
-0.0413714237511158,
-0.10994987934827805,
-0.0038263166788965464,
0.02994501031935215,
0.01971830241382122,
0.021163634955883026,
-0.0041278693825006485,
0.08493589609861374,
-0.10040725022554398,
-0.007009963504970074,
-0.035088714212179184,
-0.05126739665865898,
0.036921508610248566,
0.02629355527460575,
0.11170574277639389,
0.14838765561580658,
-0.026169780641794205,
-0.007356948684900999,
-0.02684454247355461,
0.2277047336101532,
-0.057898301631212234,
-0.006551099009811878,
0.12595953047275543,
-0.03257483243942261,
0.057313669472932816,
0.13939939439296722,
0.06310484558343887,
-0.09776148200035095,
0.01858990639448166,
0.033038195222616196,
-0.03442737087607384,
-0.2168537825345993,
-0.03605645149946213,
-0.037701673805713654,
0.005479462910443544,
0.09510310739278793,
0.02876218780875206,
0.022190002724528313,
0.06635619699954987,
0.01961098052561283,
0.08239655941724777,
-0.009019116871058941,
0.07063143700361252,
0.11288672685623169,
0.04081778973340988,
0.13017606735229492,
-0.04684014245867729,
-0.05115062743425369,
0.041642650961875916,
-0.0042169177904725075,
0.2003905326128006,
0.0220237597823143,
0.14608754217624664,
0.05093924328684807,
0.15894715487957,
-0.0022009992972016335,
0.06014814227819443,
-0.009494240395724773,
-0.03497191518545151,
-0.016108987852931023,
-0.05087079852819443,
-0.03094988688826561,
0.03465942293405533,
-0.08259386569261551,
0.05672406777739525,
-0.1041090115904808,
0.017503704875707626,
0.061649076640605927,
0.23341302573680878,
0.057969726622104645,
-0.3213752806186676,
-0.09073582291603088,
0.031684309244155884,
-0.01917683705687523,
-0.0202624574303627,
0.027752656489610672,
0.1269034743309021,
-0.047208063304424286,
0.03733751177787781,
-0.07054189592599869,
0.08604463934898376,
-0.040981508791446686,
0.045002531260252,
0.05053624510765076,
0.08522043377161026,
-0.010817477479577065,
0.06771192699670792,
-0.2801041603088379,
0.2626741826534271,
0.01932024583220482,
0.06679442524909973,
-0.04502832517027855,
-0.0007432673592120409,
0.038670118898153305,
0.09499242156744003,
0.07015971839427948,
-0.013889702968299389,
-0.05170217901468277,
-0.1914292722940445,
-0.06628420948982239,
0.02106204256415367,
0.09809369593858719,
-0.04034658521413803,
0.10071901977062225,
-0.029586344957351685,
0.001242064288817346,
0.08039773255586624,
-0.015724388882517815,
-0.08190225064754486,
-0.09895296394824982,
-0.009710212238132954,
0.03654253110289574,
-0.03733869642019272,
-0.07948805391788483,
-0.09687858074903488,
-0.13472087681293488,
0.15368349850177765,
-0.06931055337190628,
-0.03623994067311287,
-0.10330533236265182,
0.05360981076955795,
0.05774465203285217,
-0.0812373086810112,
0.04176609218120575,
0.004727487452328205,
0.08489906042814255,
0.015254193916916847,
-0.06560662388801575,
0.1214565858244896,
-0.07314290851354599,
-0.17908340692520142,
-0.07064146548509598,
0.1085202693939209,
0.02034112438559532,
0.044864311814308167,
-0.007845807820558548,
0.011815092526376247,
-0.015469148755073547,
-0.07717083394527435,
0.024365276098251343,
0.006563421804457903,
0.051179081201553345,
0.03143497556447983,
-0.05799384042620659,
-0.005367659498006105,
-0.059524066746234894,
-0.02369713969528675,
0.1517072468996048,
0.29242467880249023,
-0.0852581337094307,
0.012720471248030663,
0.06060279160737991,
-0.06821858137845993,
-0.2099754959344864,
0.03617997467517853,
0.026787200942635536,
0.003276597475633025,
0.046961162239313126,
-0.14983652532100677,
0.09996900707483292,
0.10124941170215607,
-0.028385361656546593,
0.11567051708698273,
-0.2916131317615509,
-0.13699433207511902,
0.12642161548137665,
0.14551100134849548,
0.11912427097558975,
-0.15840892493724823,
-0.043518200516700745,
-0.04048480838537216,
-0.1073233112692833,
0.10863247513771057,
-0.130152627825737,
0.10920567810535431,
-0.006229448597878218,
0.052068691700696945,
0.006383421365171671,
-0.051350031048059464,
0.13909493386745453,
0.00015163270290941,
0.11582647264003754,
-0.06167277321219444,
-0.01650283858180046,
0.057735852897167206,
-0.0614519938826561,
0.019843582063913345,
-0.11577825248241425,
0.04522329196333885,
-0.059798985719680786,
-0.022418437525629997,
-0.043459877371788025,
0.03408924117684364,
-0.038980647921562195,
-0.05832085758447647,
-0.0435505174100399,
0.025482069700956345,
0.04564632102847099,
-0.006727216765284538,
0.16419431567192078,
0.013863928616046906,
0.14243409037590027,
0.14659887552261353,
0.07618455588817596,
-0.06763956695795059,
-0.009343961253762245,
-0.007993433624505997,
-0.03653949499130249,
0.06360691785812378,
-0.1603987067937851,
0.04251156747341156,
0.12486596405506134,
0.012759150937199593,
0.14976902306079865,
0.06948457658290863,
-0.029369689524173737,
0.014263210818171501,
0.06035095080733299,
-0.16278547048568726,
-0.10671593993902206,
-0.007427621632814407,
-0.03091421350836754,
-0.11983177065849304,
0.05809660255908966,
0.12876492738723755,
-0.06673786044120789,
0.006823989097028971,
-0.006109589245170355,
0.016246333718299866,
-0.03403685986995697,
0.1792205572128296,
0.07038932293653488,
0.04577841982245445,
-0.08499542623758316,
0.0936291441321373,
0.05812530964612961,
-0.07477720826864243,
0.00964799989014864,
0.04325258359313011,
-0.0850289985537529,
-0.0470457561314106,
0.04329242929816246,
0.19152553379535675,
-0.031727246940135956,
-0.047710660845041275,
-0.14683076739311218,
-0.11350087821483612,
0.05323666334152222,
0.17866814136505127,
0.09845881164073944,
0.014988671988248825,
-0.03558935970067978,
0.010650796815752983,
-0.10852736234664917,
0.11822500079870224,
0.04612217843532562,
0.08916711807250977,
-0.1540316641330719,
0.11775215715169907,
-0.005936783272773027,
0.0114206001162529,
-0.024891121312975883,
0.04640721157193184,
-0.11647232621908188,
-0.008681119419634342,
-0.14550799131393433,
-0.00037763683940283954,
-0.022005828097462654,
0.010138279758393764,
0.0012918752618134022,
-0.055392488837242126,
-0.05583125725388527,
0.010399307124316692,
-0.09914622455835342,
-0.024522384628653526,
0.033487122505903244,
0.05021888017654419,
-0.12403592467308044,
-0.04981870576739311,
0.022189725190401077,
-0.07399234920740128,
0.07011789828538895,
0.019453847780823708,
0.019320186227560043,
0.04797353595495224,
-0.18523812294006348,
0.02077445387840271,
0.05753251165151596,
0.01908203959465027,
0.046946752816438675,
-0.08311384171247482,
-0.02243722788989544,
-0.004839141387492418,
0.04290022328495979,
0.018894214183092117,
0.09103463590145111,
-0.12261652946472168,
0.014223176054656506,
-0.029105188325047493,
-0.062234386801719666,
-0.05124817043542862,
0.03374502435326576,
0.08635517954826355,
0.012226982042193413,
0.20820190012454987,
-0.09658023715019226,
0.01971360109746456,
-0.2016584277153015,
0.004533878993242979,
0.001660775626078248,
-0.11658434569835663,
-0.1173790767788887,
-0.05340830609202385,
0.04990297555923462,
-0.06205129250884056,
0.13329827785491943,
0.009332354180514812,
0.02695404551923275,
0.03574696555733681,
-0.028277989476919174,
0.0354156456887722,
0.02761734090745449,
0.2133827805519104,
0.03283149003982544,
-0.040875229984521866,
0.012669187970459461,
0.024484559893608093,
0.11487449705600739,
0.07695475220680237,
0.16871732473373413,
0.16582489013671875,
-0.04702377691864967,
0.09964647144079208,
0.04306356608867645,
-0.049398165196180344,
-0.1370856612920761,
0.0669572725892067,
-0.03933822736144066,
0.10356904566287994,
-0.015703417360782623,
0.19917190074920654,
0.08974086493253708,
-0.1564660668373108,
0.01724410057067871,
-0.04813627898693085,
-0.08600019663572311,
-0.10917353630065918,
-0.06193744018673897,
-0.09831425547599792,
-0.14226721227169037,
-0.006498123053461313,
-0.11136320233345032,
0.012461462989449501,
0.10075247287750244,
0.0013426815858110785,
-0.016451068222522736,
0.16360020637512207,
0.0009217091137543321,
0.034934721887111664,
0.06272921711206436,
0.00034898470039479434,
-0.042865533381700516,
-0.07273416966199875,
-0.09973816573619843,
0.011972316540777683,
-0.007648703176528215,
0.02483350783586502,
-0.04588986933231354,
-0.02015053853392601,
0.04208378866314888,
-0.01041507814079523,
-0.11112939566373825,
0.01187768392264843,
0.029025403782725334,
0.048285216093063354,
0.05076390504837036,
0.013157417066395283,
0.01073569618165493,
0.0014613887760788202,
0.21901117265224457,
-0.07584387809038162,
-0.06596171855926514,
-0.09814036637544632,
0.21399378776550293,
0.023397162556648254,
0.011966140940785408,
0.012584710493683815,
-0.09443554282188416,
0.02815418131649494,
0.20947939157485962,
0.18842986226081848,
-0.09905103594064713,
-0.0001692796213319525,
-0.018171221017837524,
-0.008922252804040909,
-0.03459518030285835,
0.0915725901722908,
0.11012741923332214,
0.006350040435791016,
-0.07247549295425415,
-0.05520479753613472,
-0.038122765719890594,
-0.008277290500700474,
-0.05145679786801338,
0.05651894211769104,
0.026052454486489296,
0.009636245667934418,
-0.05015645921230316,
0.05856948718428612,
-0.0374404713511467,
-0.10917237401008606,
0.04973645135760307,
-0.1963282823562622,
-0.15246431529521942,
-0.01912187784910202,
0.11154837906360626,
-0.009332713671028614,
0.04433213546872139,
-0.03408404439687729,
-0.001189050730317831,
0.07034170627593994,
-0.031212879344820976,
-0.061599768698215485,
-0.05933063477277756,
0.05827198922634125,
-0.10562718659639359,
0.2238629013299942,
-0.03165009617805481,
0.04682733491063118,
0.12623748183250427,
0.049673039466142654,
-0.07102157920598984,
0.08304491639137268,
0.04691373556852341,
-0.05905837565660477,
0.03411668539047241,
0.0856429934501648,
-0.04292154684662819,
0.12025368958711624,
0.06405256688594818,
-0.13447809219360352,
0.010677926242351532,
-0.03322536498308182,
-0.09649685025215149,
-0.05268259719014168,
-0.042917050421237946,
-0.05988189950585365,
0.13188210129737854,
0.18581262230873108,
-0.03586168587207794,
0.0109806377440691,
-0.04149792715907097,
0.015592525713145733,
0.06741455942392349,
0.03780566528439522,
-0.03523185849189758,
-0.22708064317703247,
0.026214618235826492,
0.06231316551566124,
-0.0005959990085102618,
-0.28178611397743225,
-0.08286678045988083,
-0.013081220909953117,
-0.04308035597205162,
-0.09467852115631104,
0.08970261365175247,
0.11370586603879929,
0.050186578184366226,
-0.06245345249772072,
-0.0884915366768837,
-0.07498291879892349,
0.15508313477039337,
-0.12546521425247192,
-0.09586970508098602
] |
null | null | peft |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# outputs
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on an unknown dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 16
- total_train_batch_size: 256
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 2
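
For reference, here is a minimal sketch of how these hyperparameters could be wired together with `transformers` and `peft`. The LoRA settings, the task head, and the output directory below are illustrative assumptions only; the original training script and adapter configuration are not part of this card.

```python
from transformers import AutoModelForSequenceClassification, AutoTokenizer, TrainingArguments
from peft import LoraConfig, TaskType, get_peft_model

base = "distilbert-base-uncased"
tokenizer = AutoTokenizer.from_pretrained(base)
# A sequence-classification head is assumed here purely for illustration.
model = AutoModelForSequenceClassification.from_pretrained(base)

# Adapter values are placeholders; the card does not state the PEFT config.
peft_config = LoraConfig(
    task_type=TaskType.SEQ_CLS,
    r=8,
    lora_alpha=16,
    lora_dropout=0.1,
    target_modules=["q_lin", "v_lin"],  # DistilBERT attention projections
)
model = get_peft_model(model, peft_config)

args = TrainingArguments(
    output_dir="outputs",
    learning_rate=2e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    gradient_accumulation_steps=16,   # 16 x 16 = total train batch size of 256
    num_train_epochs=2,
    lr_scheduler_type="linear",
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
# Trainer(model=model, args=args, train_dataset=..., eval_dataset=...) would
# complete the setup once a dataset is available.
```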
### Training results
### Framework versions
- PEFT 0.8.2
- Transformers 4.38.0.dev0
- Pytorch 2.1.0+cu121
- Datasets 2.17.0
- Tokenizers 0.15.1 | {"license": "apache-2.0", "library_name": "peft", "tags": ["generated_from_trainer"], "base_model": "distilbert-base-uncased", "model-index": [{"name": "outputs", "results": []}]} | null | MaggieZhang/outputs | [
"peft",
"tensorboard",
"safetensors",
"generated_from_trainer",
"base_model:distilbert-base-uncased",
"license:apache-2.0",
"region:us"
] | 2024-02-14T17:28:14+00:00 | [] | [] | TAGS
#peft #tensorboard #safetensors #generated_from_trainer #base_model-distilbert-base-uncased #license-apache-2.0 #region-us
|
# outputs
This model is a fine-tuned version of distilbert-base-uncased on an unknown dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 16
- total_train_batch_size: 256
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 2
### Training results
### Framework versions
- PEFT 0.8.2
- Transformers 4.38.0.dev0
- Pytorch 2.1.0+cu121
- Datasets 2.17.0
- Tokenizers 0.15.1 | [
"# outputs\n\nThis model is a fine-tuned version of distilbert-base-uncased on an unknown dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 2e-05\n- train_batch_size: 16\n- eval_batch_size: 8\n- seed: 42\n- gradient_accumulation_steps: 16\n- total_train_batch_size: 256\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- num_epochs: 2",
"### Training results",
"### Framework versions\n\n- PEFT 0.8.2\n- Transformers 4.38.0.dev0\n- Pytorch 2.1.0+cu121\n- Datasets 2.17.0\n- Tokenizers 0.15.1"
] | [
"TAGS\n#peft #tensorboard #safetensors #generated_from_trainer #base_model-distilbert-base-uncased #license-apache-2.0 #region-us \n",
"# outputs\n\nThis model is a fine-tuned version of distilbert-base-uncased on an unknown dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 2e-05\n- train_batch_size: 16\n- eval_batch_size: 8\n- seed: 42\n- gradient_accumulation_steps: 16\n- total_train_batch_size: 256\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- num_epochs: 2",
"### Training results",
"### Framework versions\n\n- PEFT 0.8.2\n- Transformers 4.38.0.dev0\n- Pytorch 2.1.0+cu121\n- Datasets 2.17.0\n- Tokenizers 0.15.1"
] | [
47,
30,
6,
12,
8,
3,
113,
4,
44
] | [
"passage: TAGS\n#peft #tensorboard #safetensors #generated_from_trainer #base_model-distilbert-base-uncased #license-apache-2.0 #region-us \n# outputs\n\nThis model is a fine-tuned version of distilbert-base-uncased on an unknown dataset.## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 2e-05\n- train_batch_size: 16\n- eval_batch_size: 8\n- seed: 42\n- gradient_accumulation_steps: 16\n- total_train_batch_size: 256\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- num_epochs: 2### Training results### Framework versions\n\n- PEFT 0.8.2\n- Transformers 4.38.0.dev0\n- Pytorch 2.1.0+cu121\n- Datasets 2.17.0\n- Tokenizers 0.15.1"
] | [
-0.12047240138053894,
0.17920246720314026,
-0.0018790540052577853,
0.08525816351175308,
0.13527439534664154,
0.011641837656497955,
0.12551917135715485,
0.1281789392232895,
-0.06853970140218735,
0.12576824426651,
0.10699616372585297,
0.02109338343143463,
0.04884577915072441,
0.14823442697525024,
-0.011166050098836422,
-0.24993585050106049,
0.020376140251755714,
-0.02285321243107319,
-0.06505843997001648,
0.0762670561671257,
0.09254223108291626,
-0.0817565843462944,
0.0625796988606453,
0.004392287228256464,
-0.15440011024475098,
0.01707284152507782,
-0.034236449748277664,
-0.04987545683979988,
0.06226223334670067,
0.006423440761864185,
0.08987145125865936,
-0.02377503551542759,
0.10885914415121078,
-0.20749442279338837,
-0.0036571172531694174,
0.08072695136070251,
0.03860848397016525,
0.09149447083473206,
0.07778449356555939,
0.0194531362503767,
0.10456617176532745,
-0.14550141990184784,
0.07339294999837875,
0.021365316584706306,
-0.06292225420475006,
-0.13717904686927795,
-0.09696907550096512,
0.10478031635284424,
0.09986181557178497,
0.07944577187299728,
0.003232154995203018,
0.11590252816677094,
-0.06914728134870529,
0.044397544115781784,
0.17112624645233154,
-0.27675390243530273,
-0.07497716695070267,
0.05156150832772255,
0.03176946938037872,
0.07391506433486938,
-0.11220008134841919,
-0.04019991680979729,
0.07667160779237747,
0.028159931302070618,
0.06689827144145966,
0.01724528893828392,
0.024296678602695465,
-0.006500932388007641,
-0.14569567143917084,
-0.026313036680221558,
0.18563906848430634,
0.07771475613117218,
-0.05288352444767952,
-0.10591592639684677,
-0.056478455662727356,
-0.11736980825662613,
-0.008795327506959438,
-0.025111371651291847,
0.018384691327810287,
-0.022348733618855476,
-0.04106403887271881,
-0.04644790291786194,
-0.06441433727741241,
-0.06327497959136963,
0.015689337626099586,
0.09681886434555054,
0.06057504191994667,
0.0212666355073452,
-0.01700252667069435,
0.0956747978925705,
0.015317736193537712,
-0.11644625663757324,
-0.005680542439222336,
-0.013546141795814037,
-0.060390494763851166,
-0.04494110494852066,
-0.035181429237127304,
-0.004598252009600401,
0.006946139968931675,
0.13950487971305847,
-0.06613177806138992,
0.058713480830192566,
0.033907029777765274,
0.024889929220080376,
-0.015812620520591736,
0.10681464523077011,
-0.06886275857686996,
-0.02146322652697563,
0.006358813494443893,
0.12406603246927261,
0.03549138829112053,
-0.0033955704420804977,
-0.09085036814212799,
0.0010395442368462682,
0.0928734838962555,
0.04177667200565338,
-0.035982634872198105,
-0.01503272820264101,
-0.04605141654610634,
-0.03147821128368378,
0.045589227229356766,
-0.12030965834856033,
0.055805955082178116,
0.0018213014118373394,
-0.07642171531915665,
0.012338310480117798,
0.014610012993216515,
-0.0016608339501544833,
-0.02808082103729248,
0.07878441363573074,
-0.10789243131875992,
-0.007275111973285675,
-0.08129604160785675,
-0.03783122077584267,
0.028442436829209328,
-0.04923294857144356,
-0.008479130454361439,
-0.0806477814912796,
-0.18267540633678436,
-0.031762298196554184,
0.0415380522608757,
-0.06595391035079956,
-0.05919996276497841,
-0.04566855728626251,
-0.06855922937393188,
0.022990480065345764,
-0.0055229864083230495,
0.05755205824971199,
-0.03494725003838539,
0.07402604818344116,
-0.004008691757917404,
0.027077915146946907,
0.013181882910430431,
0.03510066121816635,
-0.08667278289794922,
0.05228516459465027,
-0.11775929480791092,
0.06576751172542572,
-0.08533202856779099,
0.031230803579092026,
-0.11181432008743286,
-0.08758071810007095,
-0.05400360748171806,
-0.017398323863744736,
0.0803937315940857,
0.125011146068573,
-0.1537613421678543,
-0.026887807995080948,
0.16297663748264313,
-0.07887934893369675,
-0.08614951372146606,
0.11714135855436325,
-0.03153048828244209,
0.0035949936136603355,
0.04413336515426636,
0.1630467176437378,
0.09946642816066742,
-0.1257694512605667,
-0.0379011295735836,
0.021209340542554855,
0.08982352167367935,
-0.018971653655171394,
0.08653318881988525,
-0.039829351007938385,
-0.022543421015143394,
0.008847550489008427,
-0.052620917558670044,
0.01537191029638052,
-0.08383329212665558,
-0.08445227891206741,
-0.06502789258956909,
-0.09532301872968674,
0.02772609330713749,
0.020262273028492928,
0.04950037971138954,
-0.0605362243950367,
-0.1110580638051033,
0.11322332918643951,
0.15138721466064453,
-0.057601649314165115,
0.008217453956604004,
-0.0565326027572155,
0.09469467401504517,
-0.08880890160799026,
-0.02957741916179657,
-0.16850711405277252,
-0.09121581166982651,
0.057887136936187744,
-0.07125942409038544,
0.021990898996591568,
0.0024083827156573534,
0.05099565163254738,
0.08852630853652954,
-0.03093096800148487,
-0.04181481525301933,
-0.0906829833984375,
-0.015265562571585178,
-0.10207143425941467,
-0.16690665483474731,
-0.04334650933742523,
-0.03573564067482948,
0.15004493296146393,
-0.24581177532672882,
0.010455346666276455,
-0.05447717383503914,
0.14912068843841553,
0.031614869832992554,
-0.07591897994279861,
-0.005449399352073669,
0.04096798971295357,
-0.023262597620487213,
-0.10020633041858673,
0.027263298630714417,
0.014288662932813168,
-0.07408157736063004,
-0.0993323028087616,
-0.11774924397468567,
0.06970284134149551,
0.0751349925994873,
0.0973278135061264,
-0.07687222212553024,
-0.04136042669415474,
-0.07824503630399704,
-0.04834140092134476,
-0.08559209108352661,
-0.004018253646790981,
0.12446422874927521,
0.0019429728854447603,
0.1197144016623497,
-0.08134575188159943,
-0.048177458345890045,
0.01408273447304964,
-0.014138293452560902,
-0.03894559666514397,
0.07351647317409515,
0.08479631692171097,
-0.07825832068920135,
0.1027609333395958,
0.101716049015522,
-0.07506901025772095,
0.13820910453796387,
-0.06012854725122452,
-0.11982383579015732,
-0.03708093613386154,
0.05978411063551903,
-0.000636923301499337,
0.13744176924228668,
-0.0575883649289608,
0.013285882771015167,
0.018061773851513863,
0.036272644996643066,
0.04754265397787094,
-0.1654772013425827,
-0.012464114464819431,
0.010342542082071304,
-0.03770698979496956,
-0.00543717946857214,
-0.011265533976256847,
-0.0015161933843046427,
0.07655354589223862,
0.021245181560516357,
-0.02599966526031494,
0.0280922818928957,
-0.0000617396435700357,
-0.08297369629144669,
0.16521680355072021,
-0.12146798521280289,
-0.11738499253988266,
-0.15316985547542572,
0.0898297056555748,
-0.06638481467962265,
-0.011468530632555485,
0.008914005011320114,
-0.06411191821098328,
-0.039021771401166916,
-0.10211512446403503,
-0.03598640114068985,
-0.05786596238613129,
0.006697687320411205,
0.07284612953662872,
0.008785857819020748,
0.11709491163492203,
-0.11145101487636566,
0.017910711467266083,
-0.006980041041970253,
-0.05246997997164726,
-0.02956564724445343,
0.05049865320324898,
0.10130415111780167,
0.11718562245368958,
0.01281724777072668,
0.024621395394206047,
-0.022962890565395355,
0.2319587916135788,
-0.0517951138317585,
-0.017772657796740532,
0.10022629052400589,
0.01001705415546894,
0.06373673677444458,
0.10207797586917877,
0.035724200308322906,
-0.09155141562223434,
0.0347856804728508,
0.06734805554151535,
-0.01031766552478075,
-0.2260446548461914,
-0.045822709798812866,
-0.033358294516801834,
-0.03793825954198837,
0.120041623711586,
0.06811854988336563,
-0.005126049276441336,
0.0566440150141716,
-0.01702699065208435,
0.06823600083589554,
-0.06395122408866882,
0.08309359848499298,
0.03562070429325104,
0.04265525937080383,
0.0750037282705307,
-0.03562726825475693,
-0.01719452440738678,
0.06646623462438583,
0.013179237022995949,
0.257547527551651,
-0.009095095098018646,
0.08012844622135162,
0.0325794517993927,
0.1849045306444168,
-0.029811935499310493,
0.03417513519525528,
0.0106436088681221,
-0.0008983018342405558,
0.005837913602590561,
-0.06778202205896378,
-0.02656959369778633,
0.04810457304120064,
-0.010026639327406883,
0.09594355523586273,
-0.11551278829574585,
0.05704433470964432,
0.024947457015514374,
0.2517106533050537,
0.06575290858745575,
-0.3026996850967407,
-0.08718626201152802,
0.025338761508464813,
-0.02897493913769722,
-0.07258518785238266,
0.03868912532925606,
0.14431482553482056,
-0.09909184277057648,
0.028599396347999573,
-0.06950446963310242,
0.07818403840065002,
-0.03332280367612839,
-0.014803286641836166,
0.05078820884227753,
0.14983761310577393,
-0.00017706277139950544,
0.0901651531457901,
-0.17767204344272614,
0.19238200783729553,
0.016856646165251732,
0.1144152283668518,
-0.036709338426589966,
0.042565613985061646,
0.013898384757339954,
0.05079556256532669,
0.10742218047380447,
0.004939527250826359,
-0.06332676112651825,
-0.1463506668806076,
-0.11669450998306274,
0.025536198168992996,
0.10194020718336105,
-0.04859587550163269,
0.07303857803344727,
-0.04503864422440529,
0.014931019395589828,
0.039093129336833954,
-0.06634213030338287,
-0.17896875739097595,
-0.12736362218856812,
0.009541147388517857,
0.013881148770451546,
-0.04495741426944733,
-0.09645813703536987,
-0.09776594489812851,
-0.0426856130361557,
0.1381460428237915,
-0.02476668544113636,
-0.033945102244615555,
-0.12627729773521423,
0.07878542691469193,
0.13006214797496796,
-0.06186452507972717,
0.017587972804903984,
0.001933382940478623,
0.12480828166007996,
0.029934275895357132,
-0.07588663697242737,
0.05000082775950432,
-0.08052445948123932,
-0.1559268981218338,
-0.06824907660484314,
0.13248243927955627,
0.055293649435043335,
0.030526284128427505,
-0.016382532194256783,
-0.001207604305818677,
0.01133048627525568,
-0.09669635444879532,
0.020331503823399544,
0.10246140509843826,
0.06961483508348465,
0.05862888693809509,
-0.0815805196762085,
0.05887855589389801,
-0.028444545343518257,
-0.015964733436703682,
0.09555734694004059,
0.23958463966846466,
-0.08383224159479141,
0.04515326768159866,
0.07033999264240265,
-0.06931939721107483,
-0.15440596640110016,
0.05037400498986244,
0.1147550493478775,
0.030251486226916313,
0.03576377034187317,
-0.18572130799293518,
0.10981181263923645,
0.12600177526474,
-0.03269263356924057,
0.08521896600723267,
-0.34903645515441895,
-0.10329615324735641,
0.08267480880022049,
0.09274008870124817,
0.039372630417346954,
-0.1554105430841446,
-0.06483323127031326,
-0.00980931706726551,
-0.05845755338668823,
0.08787205815315247,
-0.09471195936203003,
0.09072228521108627,
-0.006310527678579092,
0.07302504777908325,
0.019606953486800194,
-0.03861468285322189,
0.16489851474761963,
0.02436840906739235,
0.07293649762868881,
-0.03647693991661072,
0.04271998628973961,
0.035883236676454544,
-0.08833298087120056,
0.07288118451833725,
-0.05223320052027702,
0.08499854803085327,
-0.14647996425628662,
-0.01580175757408142,
-0.04634105786681175,
0.06670699268579483,
-0.05067072808742523,
-0.0547461211681366,
-0.05648946762084961,
0.04711813107132912,
0.050567321479320526,
-0.010979101993143559,
0.08564148843288422,
0.0655711367726326,
0.08505173027515411,
0.08222830295562744,
0.03513002768158913,
0.006548604462295771,
-0.1182725727558136,
-0.010668550617992878,
-0.02682938240468502,
0.08158577233552933,
-0.1447850465774536,
0.0018515018746256828,
0.11321916431188583,
0.026404283940792084,
0.13561755418777466,
0.036202915012836456,
-0.08141171187162399,
-0.0034882035106420517,
0.0494215190410614,
-0.11274786293506622,
-0.1613585352897644,
-0.021755937486886978,
0.009827923960983753,
-0.15304741263389587,
0.024287696927785873,
0.09578067809343338,
-0.07329101860523224,
-0.008539896458387375,
-0.009284848347306252,
0.025531983003020287,
-0.030927563086152077,
0.16580873727798462,
0.06062452122569084,
0.048138976097106934,
-0.05597364529967308,
0.14707313477993011,
0.06574961543083191,
-0.08684545755386353,
0.05833509564399719,
0.040814951062202454,
-0.07144955545663834,
-0.009222318418323994,
0.038419730961322784,
0.14971977472305298,
-0.005496532656252384,
-0.054811473935842514,
-0.07271594554185867,
-0.057969722896814346,
0.04838843271136284,
0.026989832520484924,
0.05114659294486046,
-0.011888704262673855,
-0.04246486723423004,
0.04204295948147774,
-0.14643153548240662,
0.10599071532487869,
0.07089895755052567,
0.08126291632652283,
-0.1908700317144394,
0.06614934653043747,
-0.0077931541018188,
0.027131197974085808,
-0.007092390675097704,
0.026971623301506042,
-0.10228221118450165,
-0.03701046109199524,
-0.10741432756185532,
-0.00018252150039188564,
-0.05926209315657616,
0.0009289863519370556,
0.005165533162653446,
-0.04210390895605087,
-0.04395770654082298,
0.04498940333724022,
-0.054892074316740036,
-0.06498534977436066,
0.0142744742333889,
0.06587184965610504,
-0.12310056388378143,
0.007704433053731918,
0.01772245578467846,
-0.10260289907455444,
0.07447749376296997,
0.059116706252098083,
0.0463377945125103,
0.017420170828700066,
-0.08652933686971664,
0.030395444482564926,
0.042708974331617355,
0.010775359347462654,
0.047030337154865265,
-0.09247792512178421,
-0.008468644693493843,
-0.023987244814634323,
0.038526829332113266,
-0.004065937828272581,
0.03879980742931366,
-0.13074949383735657,
-0.056716252118349075,
-0.06669541448354721,
-0.02964133396744728,
-0.061783045530319214,
0.03125780075788498,
0.09960421174764633,
0.03396448865532875,
0.16095131635665894,
-0.08251078426837921,
0.02695990353822708,
-0.18979133665561676,
-0.019975483417510986,
-0.006924187298864126,
0.00435109855607152,
-0.05695182830095291,
-0.002218939596787095,
0.06214658543467522,
-0.04731990769505501,
0.10162847489118576,
-0.039904508739709854,
0.059364985674619675,
0.022851472720503807,
-0.054986827075481415,
0.011910393834114075,
0.01056709885597229,
0.21474416553974152,
0.060202647000551224,
-0.01097374502569437,
0.045264482498168945,
-0.02556045539677143,
0.08372825384140015,
0.0626378282904625,
0.1441536396741867,
0.14637261629104614,
-0.06244727596640587,
0.0934033915400505,
0.06000050529837608,
-0.10751146078109741,
-0.15692083537578583,
0.1126844584941864,
-0.014832492917776108,
0.09551358968019485,
-0.03032585419714451,
0.14448319375514984,
0.12268415838479996,
-0.15945926308631897,
0.011582124046981335,
-0.029569610953330994,
-0.11082029342651367,
-0.11235789954662323,
-0.07392242550849915,
-0.06522845476865768,
-0.14080750942230225,
0.01822817139327526,
-0.11573809385299683,
0.01584659144282341,
0.0664445161819458,
0.0023193820379674435,
-0.004774675704538822,
0.1666376143693924,
-0.010684607550501823,
0.020486807450652122,
0.0526595301926136,
0.01968374289572239,
-0.01088904868811369,
-0.03285133093595505,
-0.08570632338523865,
0.03593149036169052,
-0.021671270951628685,
0.08853893727064133,
-0.05233481526374817,
0.011569058522582054,
0.06418730318546295,
0.00319432420656085,
-0.05707845091819763,
0.021175816655158997,
0.0096797626465559,
0.004154882859438658,
0.07020486146211624,
0.06142602860927582,
-0.01814871095120907,
-0.04640929028391838,
0.24843154847621918,
-0.08563467115163803,
-0.02256561629474163,
-0.14583061635494232,
0.17553222179412842,
0.015413135290145874,
-0.0013240898260846734,
0.04049331694841385,
-0.13194817304611206,
-0.020130105316638947,
0.1476302593946457,
0.12825459241867065,
-0.053160469979047775,
-0.03718451038002968,
-0.004351507872343063,
-0.026598675176501274,
-0.0925111472606659,
0.10902661085128784,
0.091647669672966,
0.06117573752999306,
-0.027957232668995857,
-0.01744459941983223,
-0.021982256323099136,
-0.021621420979499817,
-0.10494130104780197,
0.05151853337883949,
0.0011932202614843845,
0.005974171683192253,
-0.03326358273625374,
0.07207897305488586,
-0.02201049216091633,
-0.15026450157165527,
0.0405060350894928,
-0.10678628087043762,
-0.17891651391983032,
-0.014203615486621857,
0.07128017395734787,
-0.020789848640561104,
0.061232294887304306,
-0.032694004476070404,
-0.004786509554833174,
0.1585981547832489,
-0.028432121500372887,
-0.06097176671028137,
-0.09044717252254486,
0.03974885493516922,
-0.034584835171699524,
0.2373722791671753,
0.011681552976369858,
0.05762756988406181,
0.10580302774906158,
0.034586504101753235,
-0.1547331064939499,
0.02249862253665924,
0.07563189417123795,
-0.03399820625782013,
0.020127907395362854,
0.1446765959262848,
-0.04062683880329132,
0.10559751838445663,
0.05250396579504013,
-0.09623867273330688,
-0.05326433479785919,
-0.048054445534944534,
-0.012018455192446709,
-0.10344771295785904,
0.0004193454224150628,
-0.0690908432006836,
0.1620692014694214,
0.17090216279029846,
-0.05523510277271271,
-0.014174510724842548,
-0.04263943433761597,
0.05295928195118904,
0.04122339189052582,
0.10009676218032837,
0.01191467884927988,
-0.19965657591819763,
0.019597604870796204,
-0.0038237334229052067,
0.030657334253191948,
-0.24999265372753143,
-0.0833604708313942,
0.04466506838798523,
-0.05229804664850235,
-0.0680173709988594,
0.10694331675767899,
0.05731476843357086,
0.03323252871632576,
-0.05131065472960472,
-0.1017153263092041,
-0.06982232630252838,
0.1363655924797058,
-0.12845966219902039,
-0.06349318474531174
] |
null | null | null | This model is a fine-tuned version of TinyLlama on WizardVicuna Dataset. It should be fully compatible with Vicuna-v1.5 series thanks to https://huggingface.co/Jiayi-Pan/Tiny-Vicuna-1B. I just made it llamafile version. Here is how the model was running in case if you are interested
https://github.com/Mozilla-Ocho/llamafile/issues/242#issuecomment-1930700064
To run on Windows, just run the attached exe as is. On Linux, rename the file to remove the .exe extension, make it executable with chmod, and run the llamafile as you would any other binary. That's it. | {"language": ["en"], "license": "apache-2.0", "datasets": ["PocketDoc/Wizard-Vicuna-Refined"]} | null | shaikatasif/tiny-vicuna | [
"en",
"dataset:PocketDoc/Wizard-Vicuna-Refined",
"license:apache-2.0",
"region:us"
] | 2024-02-14T17:29:22+00:00 | [] | [
"en"
] | TAGS
#en #dataset-PocketDoc/Wizard-Vicuna-Refined #license-apache-2.0 #region-us
 | This model is a fine-tuned version of TinyLlama on the WizardVicuna dataset. It should be fully compatible with the Vicuna-v1.5 series thanks to URL; I just made a llamafile version of it. Here is how the model was running, in case you are interested:
URL
To run on Windows, just run the attached exe as is. On Linux, rename the file to remove the .exe extension, make it executable with chmod, and run the llamafile as you would any other binary. That's it. | [] | [
"TAGS\n#en #dataset-PocketDoc/Wizard-Vicuna-Refined #license-apache-2.0 #region-us \n"
] | [
36
] | [
"passage: TAGS\n#en #dataset-PocketDoc/Wizard-Vicuna-Refined #license-apache-2.0 #region-us \n"
] | [
-0.04065193980932236,
0.20565184950828552,
-0.006771218962967396,
0.038200799375772476,
-0.015541200526058674,
0.03911488503217697,
0.14986218512058258,
0.11175134032964706,
0.01140625961124897,
-0.07091840356588364,
0.1258852481842041,
0.14898255467414856,
0.03557760268449783,
0.0501120388507843,
-0.00248126732185483,
-0.11279702186584473,
0.08938039839267731,
-0.02392551861703396,
-0.06952997297048569,
0.05083751678466797,
0.08825872838497162,
0.007352655287832022,
0.012368698604404926,
0.000610561459325254,
-0.05533330515027046,
0.0038122397381812334,
0.03507957234978676,
-0.0664869099855423,
0.068569116294384,
-0.06181665509939194,
0.043629348278045654,
0.050546418875455856,
-0.025764649733901024,
-0.1737300604581833,
0.02338271215558052,
0.00766171608120203,
-0.03832025080919266,
0.05556989461183548,
0.02775990031659603,
0.02730950340628624,
0.0897052139043808,
0.03146466240286827,
-0.01907753385603428,
0.05217037349939346,
-0.12063242495059967,
-0.28805238008499146,
-0.2066711187362671,
0.05152679234743118,
-0.0009236939367838204,
0.01951305940747261,
0.057123322039842606,
0.15427303314208984,
-0.03961610049009323,
-0.01904170587658882,
0.14339488744735718,
-0.33913424611091614,
0.01201923843473196,
0.1432393491268158,
0.028486106544733047,
0.02211083099246025,
-0.005535247270017862,
0.045824892818927765,
0.07604087889194489,
-0.021125027909874916,
-0.016569633036851883,
-0.056451790034770966,
-0.19040998816490173,
0.08268244564533234,
-0.035995062440633774,
-0.04981192573904991,
0.44894129037857056,
0.03182481974363327,
-0.036415908485651016,
0.10981561988592148,
-0.023046011105179787,
0.10688446462154388,
0.006903613451868296,
0.023459386080503464,
0.07638508081436157,
0.15065523982048035,
0.13151493668556213,
-0.0459255687892437,
-0.12147077172994614,
-0.026942964643239975,
-0.15361620485782623,
0.012323467060923576,
-0.0058492268435657024,
0.10558630526065826,
-0.1510908603668213,
0.007536272052675486,
-0.0010651095071807504,
-0.0517720989882946,
-0.026319067925214767,
-0.07154905796051025,
0.10522766411304474,
0.03347277641296387,
-0.026285996660590172,
0.09367569535970688,
0.1565772145986557,
0.21514864265918732,
0.11590085178613663,
-0.02297661267220974,
-0.13032706081867218,
0.14052806794643402,
0.0025279715191572905,
-0.017364298924803734,
0.019984394311904907,
0.06278087943792343,
0.12989450991153717,
-0.14128878712654114,
0.12500903010368347,
-0.043035052716732025,
-0.09057706594467163,
0.0012618985492736101,
-0.11826054751873016,
0.13438013195991516,
0.0895080715417862,
-0.07072822749614716,
-0.09262094646692276,
0.010391741059720516,
0.1397656500339508,
-0.0303224828094244,
-0.0028699240647256374,
-0.03383881598711014,
-0.0011547653703019023,
0.03824710100889206,
0.027065863832831383,
0.04504857584834099,
0.04631728678941727,
0.05626463145017624,
-0.09067505598068237,
-0.022412417456507683,
0.028567180037498474,
0.03912801668047905,
0.14216968417167664,
-0.060722775757312775,
0.0802692174911499,
-0.07696936279535294,
-0.22386209666728973,
0.036628883332014084,
0.07476790249347687,
0.010507040657103062,
-0.05876274034380913,
0.08376556634902954,
-0.004458433948457241,
-0.024936210364103317,
-0.05276445299386978,
0.002296447055414319,
-0.08581150323152542,
0.06153895705938339,
-0.13367454707622528,
0.015031379647552967,
-0.17618122696876526,
0.005468165036290884,
-0.11697220802307129,
0.0573546476662159,
0.09459161758422852,
-0.10147769749164581,
-0.1289672553539276,
0.15737535059452057,
-0.06808285415172577,
0.03570634126663208,
-0.003455858211964369,
-0.03189223259687424,
-0.03828040510416031,
0.06406773626804352,
-0.25467172265052795,
-0.005675958935171366,
0.16815058887004852,
-0.1612640917301178,
-0.25740090012550354,
0.009881575591862202,
0.003008368657901883,
0.09865527600049973,
0.02607887051999569,
0.2635950446128845,
-0.016294563189148903,
-0.11336734145879745,
0.013139510527253151,
0.059853386133909225,
-0.1141304224729538,
-0.18763013184070587,
0.1201794371008873,
-0.11610151827335358,
-0.13406498730182648,
0.02637309394776821,
-0.07247541844844818,
0.023183252662420273,
0.021003209054470062,
-0.09765824675559998,
-0.058243971318006516,
-0.06352750957012177,
-0.04828181490302086,
-0.059439558535814285,
0.007391741964966059,
-0.07503156363964081,
0.06374891102313995,
-0.09836269915103912,
0.09514964371919632,
0.10315243154764175,
0.051614902913570404,
-0.03149096667766571,
0.03127824887633324,
0.03900672122836113,
0.01969340443611145,
-0.052319347858428955,
0.018130626529455185,
0.019319361075758934,
-0.05444686487317085,
0.01853209175169468,
0.05543358996510506,
0.017051363363862038,
-0.07472081482410431,
0.0037571676075458527,
0.04149936884641647,
0.0007629782194271684,
0.07326541841030121,
0.04682409018278122,
-0.14293041825294495,
0.052196651697158813,
-0.04200782626867294,
0.020137842744588852,
0.06603064388036728,
0.004572713747620583,
0.06680584698915482,
0.008043976500630379,
-0.012457878328859806,
0.03496837615966797,
0.016420086845755577,
-0.06959586590528488,
0.0027547143399715424,
-0.02803170122206211,
0.08162162452936172,
0.06635020673274994,
-0.08456943929195404,
0.11692284047603607,
0.003257663920521736,
0.12743502855300903,
0.17643801867961884,
-0.027648678049445152,
0.11218202114105225,
-0.0429082028567791,
-0.005631242413073778,
-0.035147570073604584,
0.02254047989845276,
0.022061556577682495,
-0.13559217751026154,
0.0016711857169866562,
0.0038246172480285168,
-0.04004520922899246,
0.016024576500058174,
-0.026252906769514084,
-0.08024724572896957,
-0.057773202657699585,
-0.035244785249233246,
0.17065373063087463,
-0.11009979248046875,
0.12778089940547943,
0.42838603258132935,
0.05616345256567001,
0.029436588287353516,
-0.11872585117816925,
-0.03372570127248764,
-0.05140870809555054,
-0.03672713786363602,
0.012831395491957664,
0.11552894115447998,
-0.08825734257698059,
0.05526382848620415,
0.10398141294717789,
0.045540694147348404,
0.026217669248580933,
-0.1218324676156044,
-0.12073048204183578,
0.007780368439853191,
-0.03093482181429863,
-0.12493643909692764,
0.0924568921327591,
-0.09330593794584274,
0.05041880160570145,
-0.03050922229886055,
-0.03937937691807747,
0.13377967476844788,
-0.0011956386733800173,
-0.03577835112810135,
0.05641871690750122,
-0.2129143923521042,
-0.08461247384548187,
-0.050048939883708954,
-0.11096987873315811,
-0.0031483087223023176,
-0.005054826848208904,
0.08766837418079376,
-0.016200155019760132,
-0.04048072174191475,
0.015166239812970161,
-0.09821261465549469,
-0.10869147628545761,
-0.008999970741569996,
0.05026475340127945,
0.07186290621757507,
0.026734333485364914,
-0.11119117587804794,
-0.024077683687210083,
0.04740935191512108,
0.010168222710490227,
0.07242065668106079,
-0.05717206001281738,
0.09530723094940186,
0.06415124237537384,
0.04984833672642708,
0.036367323249578476,
-0.025405097752809525,
0.11857396364212036,
0.003273551817983389,
-0.05707002431154251,
0.16333706676959991,
-0.0029690004885196686,
0.019205864518880844,
0.1143927127122879,
0.06107447296380997,
-0.08978161960840225,
-0.008297394961118698,
-0.04873590171337128,
-0.07803108543157578,
-0.31594690680503845,
-0.05698389559984207,
-0.0795195922255516,
0.14565865695476532,
0.025671755895018578,
0.08956694602966309,
0.034726765006780624,
0.06187457963824272,
0.018581654876470566,
0.0032611668575555086,
-0.03764352574944496,
0.018619049340486526,
0.11564125865697861,
-0.05525317043066025,
-0.031878940761089325,
-0.11299349367618561,
0.055095553398132324,
0.16243699193000793,
0.15665528178215027,
0.14270910620689392,
0.2220376431941986,
0.11195967346429825,
0.1332758665084839,
0.1860845386981964,
0.015917856246232986,
0.06204747036099434,
0.12774452567100525,
0.004343479871749878,
-0.06608463823795319,
-0.0336296372115612,
-0.015689227730035782,
0.05377010256052017,
-0.06832437217235565,
-0.15061765909194946,
0.05042785406112671,
-0.10098405182361603,
0.0649123564362526,
0.08747034519910812,
0.0451534278690815,
-0.009486513212323189,
0.09862153977155685,
0.09366311877965927,
0.10808990895748138,
-0.01067088358104229,
0.11197654902935028,
-0.10070781409740448,
-0.04252656549215317,
0.08762843906879425,
0.006612986326217651,
0.08602745085954666,
0.024782897904515266,
-0.0009232253069058061,
-0.08332375437021255,
-0.12583141028881073,
0.048197370022535324,
0.13408628106117249,
-0.1970318704843521,
0.19066843390464783,
0.0028577721677720547,
-0.02440478838980198,
-0.03194258362054825,
-0.040512435138225555,
0.08219844847917557,
0.12323020398616791,
0.14067956805229187,
0.06936297565698624,
-0.196129709482193,
0.1083482950925827,
-0.10926418006420135,
0.011659855023026466,
-0.03198513761162758,
-0.002392528112977743,
-0.11583833396434784,
-0.029847348108887672,
0.04314562305808067,
0.01559646986424923,
0.13738121092319489,
-0.10877885669469833,
-0.07888159155845642,
0.0631154403090477,
0.12024762481451035,
-0.002473070751875639,
-0.115567147731781,
0.015406531281769276,
0.0023005991242825985,
0.11328993737697601,
-0.014823424629867077,
0.004673405550420284,
-0.04612292721867561,
-0.07817675918340683,
0.0793285146355629,
0.0047987522557377815,
0.006101482082158327,
-0.030438046902418137,
-0.03259839117527008,
-0.09991764277219772,
-0.15623965859413147,
0.09097357839345932,
-0.09746670722961426,
-0.0018928013741970062,
-0.09071414172649384,
0.060906387865543365,
-0.06552977114915848,
0.025431334972381592,
-0.015126703307032585,
0.029232919216156006,
-0.06550522148609161,
-0.09223341941833496,
0.12275092303752899,
-0.03649396076798439,
-0.026283923536539078,
0.0005807637935504317,
0.014558285474777222,
0.11340966075658798,
0.0204643364995718,
-0.12237507104873657,
0.16053912043571472,
0.3005616068840027,
-0.07854479551315308,
0.17915502190589905,
0.15857529640197754,
-0.13842816650867462,
-0.22233924269676208,
-0.15962675213813782,
-0.23852857947349548,
-0.09952768683433533,
0.10249722003936768,
-0.15616129338741302,
0.08713650703430176,
0.16851171851158142,
-0.13828584551811218,
0.19237719476222992,
-0.2058243304491043,
-0.026540439575910568,
0.19122779369354248,
-0.0027399882674217224,
0.302613765001297,
-0.16223971545696259,
-0.07875518500804901,
-0.08154784888029099,
-0.10117039829492569,
0.1261902153491974,
-0.2413134127855301,
0.03307533636689186,
-0.012784686870872974,
-0.05184220150113106,
-0.057675886899232864,
-0.019978905096650124,
0.19467422366142273,
0.008504638448357582,
0.06135533004999161,
-0.04190180450677872,
-0.013315005227923393,
0.18177530169487,
-0.012258084490895271,
0.009443266317248344,
-0.09139836579561234,
0.01937894895672798,
0.029159370809793472,
0.04339070990681648,
-0.017085354775190353,
0.10358909517526627,
-0.031690966337919235,
-0.05693252757191658,
-0.11326955258846283,
-0.0038799159228801727,
-0.04980139061808586,
-0.029540354385972023,
0.218303844332695,
0.0977027416229248,
-0.037552155554294586,
0.03775055333971977,
-0.09162868559360504,
-0.09704558551311493,
-0.04536645486950874,
-0.05952416732907295,
-0.06915636360645294,
0.046172674745321274,
-0.20438043773174286,
-0.011592062190175056,
0.06814046204090118,
-0.02787277102470398,
0.05212054401636124,
0.0473606251180172,
-0.093813955783844,
0.006775333546102047,
0.09721986949443817,
-0.1178177148103714,
-0.01332809217274189,
0.047238726168870926,
0.15550823509693146,
0.05718783661723137,
-0.008886550553143024,
0.08848168700933456,
0.003278914839029312,
-0.0197873767465353,
0.03235391527414322,
0.10813351720571518,
-0.10656162351369858,
0.006729118525981903,
0.06998391449451447,
-0.02318205125629902,
-0.12978880107402802,
0.24717977643013,
0.03179166465997696,
-0.07317691296339035,
-0.004389162175357342,
-0.012500116601586342,
-0.06746069341897964,
-0.08434602618217468,
-0.041552163660526276,
0.03868618980050087,
-0.051877208054065704,
-0.13842792809009552,
0.0021351450122892857,
-0.09792912006378174,
-0.012098798528313637,
-0.06255056709051132,
0.0836169570684433,
0.10743944346904755,
0.009577615186572075,
-0.0774884819984436,
0.05926663428544998,
-0.03269542008638382,
-0.075952909886837,
0.01009275671094656,
-0.10762651264667511,
-0.20338134467601776,
-0.0037675935309380293,
0.06795795261859894,
0.0013797361170873046,
-0.012454132549464703,
-0.034756459295749664,
0.03747440129518509,
-0.138505220413208,
0.03768133372068405,
-0.09847237169742584,
-0.009854596108198166,
0.06921659409999847,
-0.035485487431287766,
-0.020037459209561348,
0.009298260323703289,
-0.13159678876399994,
-0.026909109205007553,
0.003560612676665187,
0.07781948149204254,
-0.07180895656347275,
-0.06340119987726212,
0.12041721493005753,
0.01444518007338047,
0.0977066308259964,
0.10937173664569855,
-0.009755510836839676,
0.0915268063545227,
-0.16858991980552673,
-0.08442588895559311,
0.08303208649158478,
0.046623170375823975,
0.012072252109646797,
-0.07838433980941772,
-0.08993186056613922,
0.06953607499599457,
-0.08422597497701645,
0.011044228449463844,
-0.033615633845329285,
-0.11439144611358643,
-0.14243747293949127,
-0.012111597694456577,
-0.10385702550411224,
0.029710080474615097,
-0.14320391416549683,
0.21158382296562195,
0.04313940927386284,
0.12679798901081085,
0.05963902547955513,
0.015118790790438652,
-0.01995486579835415,
0.030500704422593117,
-0.047337718307971954,
-0.10228457301855087,
-0.11722495406866074,
0.011041924357414246,
-0.08405901491641998,
-0.060006700456142426,
0.29354214668273926,
-0.05709419399499893,
-0.1622234284877777,
0.033815134316682816,
0.06157093122601509,
-0.02668861672282219,
-0.002629045397043228,
0.28092560172080994,
0.02483420819044113,
0.0036562993191182613,
-0.04618396982550621,
-0.014324672520160675,
0.03221457451581955,
-0.0411842055618763,
0.03821177035570145,
0.031380392611026764,
0.09112224727869034,
0.02617330476641655,
0.03682860732078552,
-0.05769549682736397,
-0.016308192163705826,
-0.03115926682949066,
0.1333075761795044,
0.053545139729976654,
0.08154954761266708,
0.028407294303178787,
0.15032824873924255,
-0.035969845950603485,
-0.011431172490119934,
-0.03960491344332695,
-0.020185576751828194,
-0.12334998697042465,
-0.09015814960002899,
-0.03836555406451225,
-0.09943415969610214,
0.019910041242837906,
-0.05118833854794502,
0.01710847206413746,
0.15215232968330383,
0.015724871307611465,
-0.05757550150156021,
-0.08152003586292267,
-0.054502177983522415,
-0.043472908437252045,
-0.0143012385815382,
-0.026900211349129677,
-0.09154341369867325,
-0.08665218204259872,
-0.0483386255800724,
-0.020948808640241623,
-0.018404655158519745,
-0.0464058443903923,
0.027781447395682335,
0.041044216603040695,
0.043881483376026154,
-0.16022756695747375,
-0.05170160531997681,
-0.08013354241847992,
0.020176270976662636,
-0.01716834306716919,
0.15584413707256317,
0.0600569024682045,
0.025003766641020775,
0.11556259542703629,
0.1039869412779808,
-0.010103829205036163,
-0.08577442169189453,
-0.05740746110677719,
-0.0269162617623806,
-0.033473461866378784,
0.010615520179271698,
-0.007283180020749569,
0.01323525607585907,
-0.014895965345203876,
0.20483407378196716,
0.23582372069358826,
-0.050157029181718826,
-0.000466947618406266,
-0.04011328145861626,
0.009870232082903385,
-0.0018470525974407792,
0.0622912161052227,
0.07332401722669601,
-0.009257261641323566,
-0.053604234009981155,
-0.0022044077049940825,
-0.07520642131567001,
0.006373229436576366,
-0.1355132758617401,
0.01798943616449833,
0.0021940364968031645,
-0.1133328303694725,
-0.009889810346066952,
0.13757658004760742,
-0.043428391218185425,
0.030230019241571426,
0.06514348834753036,
-0.024810023605823517,
-0.008926300331950188,
-0.007255904376506805,
0.09954683482646942,
0.06379503011703491,
0.041234783828258514,
-0.11091136932373047,
-0.04693412035703659,
0.06752507388591766,
-0.011916201561689377,
-0.25374025106430054,
-0.11091332137584686,
0.10506246238946915,
-0.03813415765762329,
0.26773911714553833,
-0.005237323697656393,
0.07119833678007126,
0.05478231981396675,
0.06488141417503357,
-0.17735035717487335,
0.07099626958370209,
0.025219295173883438,
0.05492173880338669,
-0.06066673994064331,
-0.15296371281147003,
-0.06364994496107101,
-0.06576903164386749,
0.0799085795879364,
0.07451152801513672,
-0.006400031037628651,
0.22234125435352325,
-0.03555581718683243,
-0.026905512437224388,
0.01866389997303486,
-0.11432937532663345,
0.07708533108234406,
-0.03086867928504944,
-0.05357865244150162,
-0.03961943835020065,
-0.039510853588581085,
-0.023652125149965286,
0.05949201062321663,
-0.19209203124046326,
-0.018941637128591537,
0.15125831961631775,
-0.010156376287341118,
0.13208724558353424,
0.004239550791680813,
0.007637950591742992,
-0.027987461537122726,
-0.10528473556041718,
-0.013970482163131237,
-0.07379698008298874,
0.02516399696469307,
0.0766037255525589,
-0.02991575002670288,
-0.0017156675457954407,
0.020496848970651627,
0.021214917302131653,
-0.0017871313029900193,
-0.04177013784646988,
-0.07195238769054413
] |
null | null | null |
# **Q-Learning** Agent playing **FrozenLake-v1**
  This is a trained model of a **Q-Learning** agent playing **FrozenLake-v1**.
## Usage
```python
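# `load_from_hub` is assumed to be the helper from the Hugging Face Deep RL
# course notebook (it downloads the pickle from the Hub and unpickles it);
# it is not a library import.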
model = load_from_hub(repo_id="KevStrider/q-FrozenLake-v1-4x4-noSlippery", filename="q-learning.pkl")
# Don't forget to check if you need to add additional attributes (is_slippery=False etc)
env = gym.make(model["env_id"])
```
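
Below is a minimal sketch of rolling out the greedy policy from the loaded Q-table, continuing from the snippet above. It assumes the pickled dictionary exposes the table under the `qtable` key (as in the Deep RL course notebook) and that `env` follows the Gymnasium-style `reset`/`step` API; adjust accordingly for older `gym` versions.

```python
import numpy as np

qtable = np.array(model["qtable"])            # assumed key, see note above
state, info = env.reset()
total_reward, done = 0.0, False
while not done:
    action = int(np.argmax(qtable[state]))    # greedy action for this state
    state, reward, terminated, truncated, info = env.step(action)
    total_reward += reward
    done = terminated or truncated
print("Episode return:", total_reward)
```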
| {"tags": ["FrozenLake-v1-4x4-no_slippery", "q-learning", "reinforcement-learning", "custom-implementation"], "model-index": [{"name": "q-FrozenLake-v1-4x4-noSlippery", "results": [{"task": {"type": "reinforcement-learning", "name": "reinforcement-learning"}, "dataset": {"name": "FrozenLake-v1-4x4-no_slippery", "type": "FrozenLake-v1-4x4-no_slippery"}, "metrics": [{"type": "mean_reward", "value": "1.00 +/- 0.00", "name": "mean_reward", "verified": false}]}]}]} | reinforcement-learning | KevStrider/q-FrozenLake-v1-4x4-noSlippery | [
"FrozenLake-v1-4x4-no_slippery",
"q-learning",
"reinforcement-learning",
"custom-implementation",
"model-index",
"region:us"
] | 2024-02-14T17:30:45+00:00 | [] | [] | TAGS
#FrozenLake-v1-4x4-no_slippery #q-learning #reinforcement-learning #custom-implementation #model-index #region-us
|
# Q-Learning Agent playing FrozenLake-v1
 This is a trained model of a Q-Learning agent playing FrozenLake-v1.
## Usage
| [
"# Q-Learning Agent playing1 FrozenLake-v1\n This is a trained model of a Q-Learning agent playing FrozenLake-v1 .\n\n ## Usage"
] | [
"TAGS\n#FrozenLake-v1-4x4-no_slippery #q-learning #reinforcement-learning #custom-implementation #model-index #region-us \n",
"# Q-Learning Agent playing1 FrozenLake-v1\n This is a trained model of a Q-Learning agent playing FrozenLake-v1 .\n\n ## Usage"
] | [
40,
39
] | [
"passage: TAGS\n#FrozenLake-v1-4x4-no_slippery #q-learning #reinforcement-learning #custom-implementation #model-index #region-us \n# Q-Learning Agent playing1 FrozenLake-v1\n This is a trained model of a Q-Learning agent playing FrozenLake-v1 .\n\n ## Usage"
] | [
0.04578453302383423,
-0.08074592798948288,
-0.00430759321898222,
0.10720831900835037,
0.05034215748310089,
-0.040469273924827576,
0.11997015029191971,
0.018999949097633362,
0.20601962506771088,
-0.010012076236307621,
0.1455274522304535,
0.007022971753031015,
-0.006192410364747047,
0.1867983490228653,
0.04572829231619835,
-0.26324528455734253,
0.01831899583339691,
-0.09495259821414948,
-0.07281816750764847,
0.11870454251766205,
0.05470194295048714,
-0.01901467889547348,
-0.0007633853238075972,
0.056141503155231476,
-0.0673527717590332,
0.0007737681735306978,
0.031996939331293106,
-0.012976245954632759,
0.19804789125919342,
-0.02254498563706875,
0.06641989201307297,
0.054705578833818436,
0.0758768692612648,
-0.1998077929019928,
0.0358855277299881,
-0.04215473681688309,
-0.09439758956432343,
-0.03934839740395546,
-0.018780618906021118,
0.05878105387091637,
0.053356342017650604,
0.03858819976449013,
0.058354366570711136,
0.09384993463754654,
-0.0773480236530304,
0.04328357055783272,
0.04280758649110794,
0.024811049923300743,
0.04589218273758888,
-0.0237203948199749,
-0.027002155780792236,
0.08246652781963348,
-0.22182892262935638,
0.10318073630332947,
-0.010159241035580635,
-0.5270710587501526,
-0.00633762264624238,
0.24088262021541595,
0.11517096310853958,
0.05707438662648201,
-0.06903956830501556,
0.10566288232803345,
0.03913382440805435,
-0.007209456991404295,
0.03210983797907829,
0.02150118350982666,
0.12817370891571045,
0.06009242683649063,
-0.09581366181373596,
0.040699947625398636,
0.13722525537014008,
0.012822695076465607,
0.020306183025240898,
-0.08888901025056839,
0.0410032719373703,
-0.03461858257651329,
-0.007679527159780264,
-0.09758518636226654,
0.05478060990571976,
0.012466507963836193,
-0.0934976264834404,
-0.09247440844774246,
-0.04236573353409767,
-0.06708304584026337,
0.11252415925264359,
0.046419668942689896,
-0.0874939113855362,
0.03884070739150047,
-0.06760413944721222,
0.05918780341744423,
-0.16863860189914703,
0.02074250765144825,
-0.06627868115901947,
-0.09376336634159088,
-0.11799788475036621,
-0.01683047041296959,
-0.07946427166461945,
0.009092256426811218,
0.056664444506168365,
0.1447116881608963,
0.22076484560966492,
0.06690320372581482,
0.09728849679231644,
0.07456006109714508,
0.06531001627445221,
0.1538129299879074,
0.10918238013982773,
0.019075315445661545,
-0.015266558155417442,
0.0948706716299057,
-0.06445580720901489,
-0.1351388692855835,
-0.15579092502593994,
0.005488025024533272,
0.0983937531709671,
0.08871900290250778,
-0.044080477207899094,
-0.006702381651848555,
-0.024641724303364754,
0.08566431701183319,
-0.11314457654953003,
-0.024612564593553543,
-0.002267979085445404,
0.06882024556398392,
-0.024801667779684067,
0.020378148183226585,
-0.06242705136537552,
0.12715265154838562,
0.04222423583269119,
-0.059924717992544174,
-0.055308472365140915,
-0.03053177334368229,
-0.014276440255343914,
-0.027539284899830818,
0.02446848154067993,
-0.07659092545509338,
0.04767750948667526,
-0.16766095161437988,
-0.042871296405792236,
-0.04784649610519409,
0.025697942823171616,
-0.03907240927219391,
-0.13557587563991547,
-0.17699143290519714,
-0.048906855285167694,
-0.022438718006014824,
0.03549358621239662,
-0.038111843168735504,
0.006551501806825399,
-0.006318534724414349,
-0.1583600640296936,
0.09783563017845154,
0.09784027189016342,
-0.03643378987908363,
-0.02749447710812092,
0.056263517588377,
-0.07194498926401138,
0.1561182290315628,
-0.21054518222808838,
-0.054014235734939575,
-0.044764336198568344,
-0.06595750898122787,
0.19673264026641846,
0.012690845876932144,
-0.01202624011784792,
0.19873127341270447,
-0.29073721170425415,
-0.06078760325908661,
0.12533614039421082,
-0.07834373414516449,
-0.0936407670378685,
0.06941844522953033,
-0.04206686094403267,
0.023345354944467545,
0.046047765761613846,
0.36345911026000977,
-0.02069227211177349,
-0.16197136044502258,
-0.021782705560326576,
0.13971707224845886,
-0.1184760183095932,
0.059895481914281845,
0.04240793362259865,
0.12543781101703644,
-0.04250509291887283,
-0.018672896549105644,
-0.09023164212703705,
0.05999075248837471,
-0.05241934582591057,
-0.09016361832618713,
-0.03393383324146271,
-0.07645075023174286,
0.13294468820095062,
-0.0629684180021286,
0.05601520463824272,
-0.03255095332860947,
-0.07133250683546066,
-0.050324998795986176,
-0.016492370516061783,
0.04460815340280533,
0.05951254442334175,
-0.12794871628284454,
0.11029167473316193,
0.13025271892547607,
-0.0006193425506353378,
-0.07498852163553238,
-0.17872096598148346,
0.003240168560296297,
0.009576505981385708,
0.039837226271629333,
0.17141658067703247,
0.12209978699684143,
0.033295199275016785,
0.008770671673119068,
-0.06389404833316803,
-0.18276847898960114,
0.058129217475652695,
-0.056212130934000015,
-0.14230976998806,
-0.052409034222364426,
-0.0728459507226944,
0.017381802201271057,
-0.0859743058681488,
-0.017379917204380035,
0.021926190704107285,
0.006908397190272808,
0.02990424446761608,
-0.026645656675100327,
-0.049561817198991776,
0.021254703402519226,
0.06490101665258408,
-0.0037617047782987356,
0.12023693323135376,
0.008277264423668385,
-0.18308481574058533,
0.07930773496627808,
0.08478537946939468,
0.09196605533361435,
0.013250201940536499,
0.02685922384262085,
-0.021522263064980507,
-0.08061408251523972,
-0.054420311003923416,
0.02957955375313759,
0.11417073011398315,
0.1317172348499298,
0.2361993044614792,
0.08753683418035507,
0.04697408527135849,
-0.02164587564766407,
-0.016415923833847046,
0.002810494042932987,
-0.06318057328462601,
-0.029935607686638832,
0.10614971816539764,
0.05865858122706413,
-0.067733034491539,
-0.04576427489519119,
0.09590928256511688,
0.02732124738395214,
0.21205885708332062,
-0.03342745825648308,
0.01286078616976738,
-0.10957037657499313,
-0.06550975888967514,
-0.031982194632291794,
0.09201868623495102,
0.09498392790555954,
0.009755023755133152,
-0.022056059911847115,
-0.04259001836180687,
0.0012916827108711004,
-0.1334889680147171,
-0.10375088453292847,
0.026475343853235245,
0.013400445692241192,
-0.11206940561532974,
0.11674030870199203,
-0.11352457851171494,
0.039504457265138626,
0.06024791672825813,
-0.13837239146232605,
0.04428480193018913,
-0.029713207855820656,
-0.07886212319135666,
0.16866780817508698,
-0.11075661331415176,
-0.094340018928051,
-0.08831550180912018,
0.004082420375198126,
0.0075836325995624065,
-0.03922267258167267,
-0.009283260442316532,
-0.19952571392059326,
-0.005375816952437162,
-0.03544965013861656,
0.013616434298455715,
-0.06988783925771713,
-0.11287739872932434,
-0.010957922786474228,
0.07084179669618607,
-0.043388739228248596,
-0.07803605496883392,
0.007967432029545307,
-0.08923084288835526,
-0.10623309016227722,
0.028189711272716522,
0.019765101373195648,
-0.022883659228682518,
0.16152891516685486,
0.01816628873348236,
0.05626589432358742,
-0.03298520669341087,
0.30665266513824463,
-0.038163769990205765,
0.08371731638908386,
-0.02993497997522354,
-0.07433546334505081,
0.06130730360746384,
-0.022327827289700508,
0.06086638569831848,
-0.020221687853336334,
-0.02362890914082527,
0.0077952733263373375,
-0.08579335361719131,
-0.18365982174873352,
-0.05417544022202492,
0.03724347800016403,
0.195254847407341,
0.031118987128138542,
0.01910330168902874,
-0.0488768145442009,
-0.010547760874032974,
0.1665220558643341,
-0.10005921125411987,
0.04030545800924301,
-0.05366240441799164,
0.11506262421607971,
-0.08640182018280029,
0.06195629760622978,
0.020486772060394287,
0.04266135022044182,
-0.04877188801765442,
0.09486009180545807,
0.0826394334435463,
0.1121082529425621,
-0.02206910029053688,
0.046257395297288895,
0.019012698903679848,
0.07383184134960175,
0.11073657125234604,
0.0368414968252182,
-0.0729052945971489,
0.001982470043003559,
-0.006313489284366369,
-0.039427030831575394,
0.11933320760726929,
0.17963355779647827,
-0.11991413682699203,
-0.05106910318136215,
0.27167606353759766,
0.0031242913100868464,
0.19481229782104492,
-0.01315275114029646,
0.043591804802417755,
-0.04484925419092178,
0.04572054371237755,
-0.05338600277900696,
-0.04086209088563919,
0.2094656229019165,
0.08045925945043564,
-0.17165091633796692,
-0.08549032360315323,
-0.05912299454212189,
0.07081323862075806,
0.10728751868009567,
0.0013539529172703624,
-0.04156802222132683,
0.0004610282776411623,
0.0014198932331055403,
0.08339415490627289,
-0.14520122110843658,
0.11816094070672989,
-0.03172019124031067,
0.05612684786319733,
0.017555562779307365,
-0.045326150953769684,
0.04264266416430473,
0.07474290579557419,
0.26618310809135437,
0.0904107540845871,
-0.040318213403224945,
-0.0892091691493988,
-0.12260187417268753,
0.010461576282978058,
0.029102616012096405,
-0.03534553572535515,
0.0037547778338193893,
-0.020087555050849915,
0.0318896509706974,
0.008264793083071709,
0.016230624169111252,
-0.08987458795309067,
-0.03175399824976921,
-0.027736429125070572,
-0.023839212954044342,
0.10733365267515182,
-0.09495144337415695,
-0.1444292515516281,
-0.15713949501514435,
0.04191131144762039,
-0.0766405463218689,
-0.056593164801597595,
-0.054507751017808914,
-0.05239389091730118,
-0.0311186034232378,
-0.03773957118391991,
0.09099467098712921,
-0.0021037792321294546,
0.14807306230068207,
-0.1920108050107956,
-0.04220759496092796,
0.051812779158353806,
-0.07607918977737427,
-0.08729588985443115,
0.03410962224006653,
0.12136995792388916,
0.05116051807999611,
0.11504370719194412,
0.013609255664050579,
0.09567681699991226,
0.0045484392903745174,
-0.06713183224201202,
0.15302421152591705,
-0.14069625735282898,
-0.27875974774360657,
-0.03836318850517273,
0.016946332529187202,
0.1615200787782669,
-0.05613167956471443,
0.031766023486852646,
0.3335736393928528,
0.27782970666885376,
-0.1428707242012024,
0.25916144251823425,
0.019178593531250954,
0.004398873541504145,
-0.19130495190620422,
-0.10125631093978882,
0.025324683636426926,
0.04740457236766815,
0.12032642960548401,
-0.14564448595046997,
-0.010732659138739109,
-0.04543145373463631,
-0.025908485054969788,
0.10386138409376144,
-0.12300799041986465,
-0.07263197749853134,
0.07765276730060577,
0.039809420704841614,
0.1808302253484726,
0.03932500258088112,
0.0014799144119024277,
0.13626977801322937,
0.06612244248390198,
0.019124457612633705,
0.05216038227081299,
0.08028066903352737,
-0.018944554030895233,
0.14207926392555237,
0.05448179319500923,
-0.02551644667983055,
0.052681710571050644,
-0.0054580713622272015,
-0.03219012916088104,
0.015605825930833817,
-0.183198019862175,
-0.10147556662559509,
-0.0561356320977211,
-0.10798973590135574,
-0.04978342354297638,
0.056853994727134705,
-0.12395523488521576,
-0.007896827533841133,
-0.03841273859143257,
0.03718273714184761,
-0.07831971347332001,
-0.09360362589359283,
-0.036494381725788116,
0.1351792961359024,
0.07210618257522583,
0.04471297934651375,
0.035655103623867035,
-0.07390819489955902,
0.07097936421632767,
0.21671734750270844,
0.08159157633781433,
0.028919655829668045,
-0.19545674324035645,
-0.024042490869760513,
-0.0803457647562027,
0.06306298077106476,
-0.08856996893882751,
-0.016788700595498085,
0.11923003196716309,
0.08616556972265244,
0.05413002520799637,
0.09640096127986908,
-0.045083072036504745,
0.021686913445591927,
0.02684609219431877,
-0.15131035447120667,
-0.18501274287700653,
-0.08534606546163559,
-0.03519878163933754,
0.11561143398284912,
-0.06398691236972809,
0.10897188633680344,
-0.13615410029888153,
0.010051886551082134,
-0.006060056854039431,
0.02693452313542366,
-0.03596206381917,
-0.11251141875982285,
0.15348562598228455,
0.11999429017305374,
-0.06767056882381439,
0.03127254918217659,
-0.09527092427015305,
-0.04423454403877258,
0.12686803936958313,
-0.013623855076730251,
-0.0371493324637413,
-0.054547641426324844,
-0.03628576174378395,
0.15247689187526703,
-0.03436964750289917,
0.008244883269071579,
-0.041229065507650375,
-0.18217355012893677,
0.0798322781920433,
0.09045056998729706,
0.019827889278531075,
-0.031874191015958786,
-0.09797266125679016,
-0.010231015272438526,
-0.0011165260802954435,
0.11730700731277466,
-0.10696814209222794,
-0.10933240503072739,
-0.15144047141075134,
0.06713984161615372,
-0.0007159380475059152,
0.18502596020698547,
-0.06394898891448975,
-0.08904669433832169,
-0.12429379671812057,
0.02344517596065998,
-0.0027384376153349876,
-0.042264558374881744,
0.01618490368127823,
0.07992301136255264,
-0.04095321521162987,
0.02075677551329136,
-0.06651144474744797,
0.06372585147619247,
-0.11786920577287674,
0.09625071287155151,
0.01063506118953228,
0.016993753612041473,
-0.0417880080640316,
-0.01618220843374729,
0.039470795542001724,
-0.057925306260585785,
0.07921463251113892,
0.011758086271584034,
0.0010938759660348296,
0.10196787863969803,
-0.0034960443153977394,
0.06409632414579391,
-0.05372481048107147,
-0.023290161043405533,
0.06578411161899567,
-0.05874887853860855,
-0.03370826691389084,
-0.1573946475982666,
-0.0709633082151413,
0.020051732659339905,
-0.04775108024477959,
0.002077929675579071,
0.03673801198601723,
0.062159497290849686,
-0.06937079131603241,
-0.12125655263662338,
-0.043812792748212814,
-0.028638383373618126,
0.021301284432411194,
0.10829301923513412,
-0.07526551932096481,
0.1547859013080597,
-0.052787959575653076,
-0.00020603960729204118,
0.07437096536159515,
0.04048224538564682,
0.01393822580575943,
-0.10422444343566895,
-0.04698587954044342,
-0.11035211384296417,
0.1502903699874878,
-0.007902312092483044,
-0.03533121198415756,
0.03719403222203255,
-0.11946307867765427,
-0.1572723090648651,
0.03418220207095146,
0.10199101269245148,
0.0448341928422451,
0.025807438418269157,
0.027079269289970398,
-0.04042419046163559,
-0.021270349621772766,
-0.07034418731927872,
0.0882953479886055,
-0.12085357308387756,
-0.09669415652751923,
0.09555385261774063,
0.12178351730108261,
-0.0036850625183433294,
-0.07441367954015732,
0.11554073542356491,
-0.021787192672491074,
0.05525410920381546,
-0.02971339225769043,
0.10308072715997696,
0.0796005055308342,
-0.12273547053337097,
0.005693064536899328,
-0.036891788244247437,
-0.0741485133767128,
-0.12975730001926422,
0.019545545801520348,
-0.061916105449199677,
-0.13383042812347412,
0.12179028987884521,
-0.09376577287912369,
0.030037038028240204,
-0.10506992787122726,
0.021338803693652153,
0.01864001713693142,
0.061665527522563934,
-0.10988292098045349,
0.08575301617383957,
0.13424484431743622,
-0.043199893087148666,
-0.07184189558029175,
-0.12455986440181732,
-0.05022053420543671,
-0.04231856390833855,
-0.13957437872886658,
-0.11600435525178909,
0.0100301094353199,
-0.023418782278895378,
-0.05818291753530502,
0.0015462689334526658,
-0.03659068048000336,
0.008594646118581295,
0.021907730028033257,
0.04032021388411522,
-0.02693161368370056,
0.05134565755724907,
-0.057569269090890884,
-0.052510857582092285,
0.11489357799291611,
0.04113486409187317,
-0.03561042994260788,
-0.052359987050294876,
0.12997733056545258,
-0.11959461867809296,
0.07662346214056015,
-0.020313527435064316,
0.017129231244325638,
-0.06435854732990265,
0.17131924629211426,
0.11673715710639954,
-0.1367570012807846,
-0.005008010193705559,
-0.08210669457912445,
0.020409544929862022,
0.023555370047688484,
0.13693512976169586,
-0.03411718085408211,
-0.0012358218664303422,
-0.1580323874950409,
0.018575575202703476,
-0.18557456135749817,
-0.03716109320521355,
0.04671547934412956,
0.09917585551738739,
0.15293832123279572,
-0.0034432117827236652,
-0.1263325810432434,
0.10424192249774933,
-0.2118520885705948,
0.0907607227563858,
0.05121984705328941,
-0.11874113976955414,
-0.06765396893024445,
-0.06795281916856766,
0.1198519766330719,
0.009196433238685131,
0.2040700763463974,
-0.013615905307233334,
-0.09132910519838333,
-0.07060808688402176,
-0.01980910450220108,
-0.030524181202054024,
0.09714830666780472,
0.041414931416511536,
0.04653804749250412,
0.12821412086486816,
0.00368314771912992,
0.07533777505159378,
0.060310911387205124,
0.02759413793683052,
-0.012300663627684116,
0.04076618701219559,
0.08261215686798096,
-0.14588621258735657,
-0.1659701019525528,
0.1326720416545868,
0.025149408727884293,
0.11792458593845367,
0.03658788278698921,
-0.1549617499113083,
0.06687124073505402,
0.2523096203804016,
-0.11147607117891312,
0.02505038119852543,
0.12737524509429932,
-0.0366884209215641,
0.0672016367316246,
0.1144871786236763,
-0.02633814327418804,
-0.05217865854501724,
-0.011363590136170387,
0.10233135521411896,
0.028660254552960396,
-0.04646271467208862,
-0.02340836264193058,
-0.03373933956027031,
-0.019070526584982872,
-0.011738128960132599,
-0.0909019410610199,
-0.1543993502855301,
-0.10471053421497345,
-0.16619662940502167,
0.04399140924215317,
-0.04626438021659851,
0.13418889045715332,
0.09469578415155411,
-0.012723101302981377,
0.04568437114357948,
0.028575526550412178,
0.07275456190109253,
0.07916246354579926,
-0.02939477376639843,
-0.036159269511699677
] |
null | null | null | GGUF quants for : https://huggingface.co/alchemonaut/QuartetAnemoi-70B-t0.0001
Available : Q3_K_M, IQ3_XXS.
Otw : IQ2_XS
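A rough sketch for running one of these quants locally with llama-cpp-python (the file name is illustrative — substitute whichever quant you actually downloaded):
```python
# pip install llama-cpp-python
from llama_cpp import Llama

llm = Llama(
    model_path="QuartetAnemoi-70B-t0.0001.Q3_K_M.gguf",  # illustrative local file name
    n_ctx=16384,       # context size; tested as functional up to 16k
    n_gpu_layers=-1,   # offload all layers to GPU if VRAM allows, else lower this
)

out = llm("Write a short scene set in a snowstorm.", max_tokens=256)
print(out["choices"][0]["text"])
```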
I recommend you folks try this model, because it's quite an efficient merge of Miqu, WinterGoddess, AuroraNights, and XWin.
Miqu's Theta Rope of 1,000,000, hence the 32k context, is functional up to 16k according to my tests, and probably above (I need a smaller quant to test, which is on the way). | {} | null | Nexesenex/alchemonaut_QuartetAnemoi-70B-iMat.GGUF | [
"gguf",
"region:us"
] | 2024-02-14T17:31:44+00:00 | [] | [] | TAGS
#gguf #region-us
| GGUF quants for : URL
Available : Q3_K_M, IQ3_XXS.
Otw : IQ2_XS
I recommend you folks try this model, because it's quite an efficient merge of Miqu, WinterGoddess, AuroraNights, and XWin.
Miqu's Theta Rope of 1,000,000, hence the 32k context, is functional up to 16k according to my tests, and probably above (I need a smaller quant to test, which is on the way). | [] | [
"TAGS\n#gguf #region-us \n"
] | [
9
] | [
"passage: TAGS\n#gguf #region-us \n"
] | [
0.030724648386240005,
0.026499787345528603,
-0.010017825290560722,
-0.05703527107834816,
0.08247160166501999,
0.07200847566127777,
0.01814177818596363,
0.020192064344882965,
0.2235025018453598,
0.017216520383954048,
0.1496623009443283,
-0.031233953312039375,
0.006174509879201651,
0.05538657680153847,
0.039407629519701004,
-0.19438467919826508,
0.058440499007701874,
-0.02356063388288021,
-0.020945189520716667,
0.01803453452885151,
-0.05310691148042679,
-0.04108472168445587,
0.022135348990559578,
-0.07881014049053192,
-0.15867982804775238,
0.0678698718547821,
0.017852067947387695,
0.0007025183876976371,
0.0820731669664383,
0.05882885307073593,
0.09657382220029831,
-0.024203501641750336,
-0.15220364928245544,
-0.18796531856060028,
0.0366438589990139,
-0.02974788099527359,
-0.10282598435878754,
0.022019000723958015,
0.029453158378601074,
-0.06967076659202576,
0.02238346077501774,
0.1427535116672516,
-0.10206039994955063,
0.051592033356428146,
-0.27165159583091736,
-0.1715938150882721,
-0.06585682183504105,
-0.025845954194664955,
-0.007345964200794697,
0.01241085771471262,
-0.0010092189768329263,
0.047266922891139984,
-0.20188692212104797,
-0.005631127394735813,
0.09329266101121902,
-0.25229454040527344,
0.02776304818689823,
0.21345718204975128,
-0.010520953685045242,
0.09873088449239731,
-0.05590669438242912,
0.14438565075397491,
0.03173782303929329,
-0.019559340551495552,
-0.1924813836812973,
-0.070224329829216,
-0.07177317887544632,
0.162109375,
-0.0823177620768547,
-0.11764442175626755,
0.24176421761512756,
0.009283576160669327,
-0.026472626253962517,
0.15598991513252258,
-0.029037300497293472,
-0.009749599732458591,
0.04555726423859596,
0.01668328419327736,
-0.010545015335083008,
0.1551385223865509,
0.17108163237571716,
-0.08598228543996811,
-0.10847756266593933,
-0.030579885467886925,
-0.2373785674571991,
0.2470305860042572,
-0.01911027915775776,
0.12945520877838135,
-0.20086053013801575,
0.018443629145622253,
-0.3247532844543457,
-0.0012029389617964625,
-0.010316703468561172,
-0.028618358075618744,
-0.006935348734259605,
0.009301352314651012,
-0.050316113978624344,
0.0739501491189003,
0.14580395817756653,
0.1393439620733261,
-0.11465669423341751,
0.060509420931339264,
-0.052172139286994934,
0.14876529574394226,
0.05827285721898079,
0.061183393001556396,
0.04079163819551468,
0.07037676870822906,
-0.008353544399142265,
-0.21633195877075195,
-0.029873060062527657,
-0.07057386636734009,
-0.08445251733064651,
-0.0130265261977911,
-0.13896764814853668,
0.11386743932962418,
-0.022273007780313492,
-0.07913482189178467,
-0.06810981780290604,
0.07626928389072418,
0.017650218680500984,
-0.008536403998732567,
-0.035703565925359726,
-0.012481719255447388,
0.022218508645892143,
-0.014872739091515541,
-0.1519843488931656,
0.02295425534248352,
0.10455024242401123,
0.07257117331027985,
-0.1489023119211197,
-0.011344035156071186,
-0.017298875376582146,
0.06959983706474304,
0.03884255141019821,
-0.10402916371822357,
0.04283881187438965,
-0.10747409611940384,
-0.08414466679096222,
0.022628657519817352,
-0.005062851123511791,
-0.0418001152575016,
0.13524691760540009,
0.03997812792658806,
0.040150050073862076,
-0.016940169036388397,
-0.04259050637483597,
-0.048133596777915955,
-0.07602019608020782,
0.07334327697753906,
0.05418020859360695,
0.027240034192800522,
-0.1915341019630432,
0.01154522504657507,
-0.048245880752801895,
0.09175369143486023,
-0.11856856942176819,
0.014575321227312088,
-0.08105122298002243,
0.1604209989309311,
0.0349995456635952,
0.09055875241756439,
-0.19562625885009766,
0.02605881541967392,
-0.06191767752170563,
0.1854621320962906,
-0.04451294615864754,
-0.11786319315433502,
0.2698904871940613,
-0.09105797111988068,
-0.040079716593027115,
0.056803084909915924,
0.06560484319925308,
-0.06272535026073456,
0.068723164498806,
0.4434472322463989,
-0.06556011736392975,
-0.07118581980466843,
0.05080527812242508,
0.17805561423301697,
-0.1262815296649933,
-0.09372174739837646,
0.09990617632865906,
-0.1480535864830017,
-0.211008220911026,
0.030864350497722626,
0.028955968096852303,
0.1494358479976654,
-0.06205282360315323,
-0.012456154450774193,
0.058214303106069565,
-0.013022401370108128,
0.046677324920892715,
0.03563477098941803,
0.11109840869903564,
-0.06493768095970154,
0.06851828098297119,
-0.16232267022132874,
0.016065504401922226,
0.1209988072514534,
-0.015012580901384354,
-0.04126624017953873,
0.14286154508590698,
-0.03809087723493576,
0.07199656218290329,
-0.07730832695960999,
-0.1804673671722412,
0.027612121775746346,
0.05621999502182007,
0.028122514486312866,
0.09176547825336456,
0.09526687115430832,
-0.039257392287254333,
0.0013902259524911642,
0.0329861082136631,
0.061223939061164856,
-0.007701692637056112,
0.015235940925776958,
-0.015374142676591873,
0.12888981401920319,
-0.07010363042354584,
-0.04155188798904419,
-0.09715848416090012,
-0.00889967754483223,
0.2288777232170105,
-0.01933911070227623,
0.02257734164595604,
-0.06854789704084396,
0.033186767250299454,
-0.0012386917369440198,
0.09506335854530334,
-0.017756229266524315,
0.06063338369131088,
-0.022011179476976395,
-0.06201287358999252,
0.11652727425098419,
-0.043086208403110504,
0.24556174874305725,
0.10792262107133865,
-0.07513239979743958,
-0.01741042546927929,
-0.0871582105755806,
-0.007020947523415089,
0.022898653522133827,
0.08814648538827896,
-0.04863424599170685,
0.06471672654151917,
-0.037898752838373184,
-0.0013588295551016927,
0.018808960914611816,
-0.008487841114401817,
-0.030526969581842422,
-0.04284367710351944,
-0.08270563185214996,
0.09057542681694031,
0.0691855251789093,
-0.13670015335083008,
0.17748047411441803,
0.2472171038389206,
0.1500423550605774,
0.2487964630126953,
-0.06485911458730698,
-0.014139159582555294,
-0.02016172744333744,
0.03673918917775154,
-0.020436765626072884,
0.13109654188156128,
-0.18929845094680786,
-0.032152432948350906,
0.02558354288339615,
0.029807843267917633,
0.10872193425893784,
-0.1365325003862381,
-0.1145850270986557,
-0.0379912331700325,
-0.047677598893642426,
-0.08257206529378891,
0.07034620642662048,
-0.12104500830173492,
0.03338077291846275,
0.07256745547056198,
0.0073080710135400295,
0.12201625853776932,
0.015417544171214104,
-0.055278971791267395,
0.0998256728053093,
-0.14543165266513824,
-0.2384990155696869,
-0.04642500355839729,
-0.10990478098392487,
0.001206184271723032,
0.05318264663219452,
0.016633260995149612,
-0.21265560388565063,
-0.01741623878479004,
0.11141498386859894,
0.06650645285844803,
-0.18111048638820648,
0.024138791486620903,
0.029385030269622803,
-0.004455238115042448,
-0.10212790220975876,
-0.012687300331890583,
-0.05387670546770096,
-0.11039627343416214,
-0.0691843032836914,
0.08163908869028091,
-0.06936442852020264,
0.11164893209934235,
0.1582336574792862,
0.11141853034496307,
0.11249161511659622,
-0.011774544604122639,
0.1976311057806015,
-0.14119699597358704,
-0.14489109814167023,
0.06405922025442123,
-0.014498869888484478,
0.03640124574303627,
0.08232609927654266,
0.04930112138390541,
-0.14269955456256866,
-0.04848511889576912,
-0.007545206230133772,
-0.1497725397348404,
-0.1323675513267517,
-0.05164776369929314,
-0.10658133774995804,
0.12379065901041031,
-0.06248227879405022,
0.10150982439517975,
0.11162466555833817,
0.017522823065519333,
0.11151766777038574,
-0.06246228888630867,
-0.054680291563272476,
-0.04807431995868683,
0.06297076493501663,
-0.05410824716091156,
-0.04205694422125816,
-0.06721562892198563,
-0.008002115413546562,
0.1349310278892517,
0.10885956883430481,
0.07581131905317307,
0.2265089601278305,
0.02780294418334961,
0.05355561524629593,
0.040789585560560226,
0.16015571355819702,
0.015284501947462559,
-0.0046128155663609505,
-0.08788388222455978,
-0.014365277253091335,
-0.0019687749445438385,
-0.031080376356840134,
-0.006052241660654545,
0.1340780407190323,
-0.2559821307659149,
0.03235609456896782,
-0.2989844083786011,
0.11946471780538559,
-0.1565471589565277,
0.07426489144563675,
0.05220162868499756,
0.030080994591116905,
0.08841689676046371,
0.035069406032562256,
-0.02871096506714821,
0.09149409085512161,
0.11694692075252533,
-0.12628670036792755,
0.01540512777864933,
0.04918349161744118,
0.052707213908433914,
-0.0142430504783988,
0.0931062400341034,
-0.11024625599384308,
-0.0737583339214325,
-0.0024255106691271067,
0.07025767862796783,
-0.2099330574274063,
0.23986183106899261,
0.03523903712630272,
-0.10871971398591995,
-0.021638909354805946,
-0.0547538623213768,
0.03316742554306984,
0.08983159810304642,
0.1342458724975586,
0.11251148581504822,
-0.11371640861034393,
-0.12470904737710953,
0.029020745307207108,
0.03679748624563217,
0.1757190227508545,
-0.09047917276620865,
-0.14164063334465027,
0.001811441034078598,
0.05263577029109001,
-0.053646381944417953,
0.07645093649625778,
-0.05327983945608139,
-0.0941789522767067,
0.03495060279965401,
0.04520740360021591,
0.00641082925722003,
-0.019971303641796112,
0.08110581338405609,
-0.02520396187901497,
0.085345059633255,
-0.04878882318735123,
0.00847524031996727,
-0.10202991217374802,
-0.03634759038686752,
0.04376819357275963,
-0.0722225159406662,
0.01614394783973694,
-0.09818518906831741,
-0.15651735663414001,
-0.08556577563285828,
-0.15303048491477966,
0.12497064471244812,
-0.052672382444143295,
0.10244213044643402,
-0.047614291310310364,
0.147609144449234,
-0.013274060562252998,
0.030878636986017227,
-0.05167607590556145,
0.028036773204803467,
0.011671020649373531,
-0.14858771860599518,
0.20959575474262238,
-0.1476162225008011,
-0.023819662630558014,
0.16589532792568207,
0.05426561459898949,
0.1161220371723175,
0.04555299133062363,
-0.0879630371928215,
0.23518426716327667,
0.2702784240245819,
-0.0007818902959115803,
0.17838320136070251,
0.2352202981710434,
-0.026693791151046753,
-0.2436053603887558,
-0.07260585576295853,
-0.2063993662595749,
-0.039628319442272186,
0.0004186074365861714,
-0.282958060503006,
0.06042884290218353,
0.17210599780082703,
-0.07570867985486984,
0.4319494664669037,
-0.22352926433086395,
0.03153151646256447,
0.13982820510864258,
-0.04242865741252899,
0.6181237101554871,
-0.1820172369480133,
-0.16550765931606293,
0.052592549473047256,
-0.1248052790760994,
0.11609237641096115,
-0.005267696920782328,
0.10048385709524155,
-0.00011838242062367499,
-0.02595684304833412,
0.03428659215569496,
-0.0409976989030838,
0.23620888590812683,
0.018790103495121002,
0.045043930411338806,
-0.09004033356904984,
-0.1538960188627243,
0.10746775567531586,
0.02556895837187767,
-0.10341835021972656,
0.03920651972293854,
-0.06092366203665733,
-0.10915451496839523,
0.011575369164347649,
-0.08317004889249802,
0.03433287888765335,
0.09550272673368454,
-0.050003789365291595,
-0.0652989074587822,
0.024777809157967567,
-0.16975140571594238,
0.028226720169186592,
0.1660151481628418,
-0.08661750704050064,
0.17001861333847046,
-0.04084239527583122,
-0.0947834923863411,
-0.15362800657749176,
-0.020637191832065582,
-0.07918675988912582,
-0.01597081869840622,
0.10419487953186035,
-0.11003783345222473,
0.006433290895074606,
0.09035904705524445,
0.002910176757723093,
0.07882846146821976,
0.09883374720811844,
-0.08716033399105072,
0.05550702288746834,
0.1730797290802002,
-0.21496161818504333,
-0.1694899946451187,
-0.04902869462966919,
-0.1887752115726471,
0.2065081000328064,
0.03903897479176521,
0.04895683750510216,
0.16432031989097595,
0.015995748341083527,
-0.010867753997445107,
-0.020683420822024345,
-0.11664224416017532,
0.00450828718021512,
0.04868127405643463,
-0.005741522181779146,
-0.11094820499420166,
0.13042977452278137,
0.05625306814908981,
-0.010265284217894077,
-0.04014173522591591,
0.1808832287788391,
-0.06324239075183868,
-0.06105973571538925,
-0.29144585132598877,
0.07338178157806396,
-0.10203809291124344,
-0.033191971480846405,
0.08307401835918427,
-0.024927617982029915,
-0.0012370682088658214,
0.14441034197807312,
0.009444275870919228,
0.1295502781867981,
0.031338974833488464,
0.03218937665224075,
0.14084547758102417,
-0.13805074989795685,
-0.14429166913032532,
-0.029582731425762177,
-0.08434601873159409,
-0.12847381830215454,
-0.016780147328972816,
0.1751313954591751,
-0.08363176882266998,
-0.12467111647129059,
-0.2756369411945343,
0.049299292266368866,
-0.0641724020242691,
-0.1138453483581543,
-0.03101496584713459,
-0.06544762849807739,
0.052310146391391754,
-0.040101904422044754,
0.014005003497004509,
-0.023109296336770058,
-0.14451682567596436,
0.0458921417593956,
0.06695213168859482,
0.03172319754958153,
-0.02931683138012886,
0.0015236766776069999,
0.15014788508415222,
0.026510147377848625,
0.16621503233909607,
0.22043149173259735,
0.061838917434215546,
0.20056213438510895,
-0.2713247239589691,
-0.10004157572984695,
0.10868333280086517,
-0.07527677714824677,
0.021882841363549232,
0.13841275870800018,
-0.01911449432373047,
-0.0495067797601223,
-0.03201347589492798,
0.08917038887739182,
-0.017281996086239815,
-0.08984966576099396,
-0.04857974499464035,
-0.003589637577533722,
-0.18503929674625397,
-0.0007536212215200067,
-0.15319249033927917,
0.1420021951198578,
0.04460230842232704,
-0.062356118112802505,
0.07465137541294098,
0.05997058004140854,
0.03977793827652931,
0.006764960940927267,
0.018739836290478706,
-0.14650356769561768,
0.01704270951449871,
-0.025170978158712387,
-0.006106532644480467,
0.03402095288038254,
0.34655115008354187,
-0.0466112419962883,
-0.07675225287675858,
-0.019784720614552498,
0.1001124382019043,
0.13863220810890198,
-0.009452453814446926,
0.13600659370422363,
0.13898764550685883,
-0.07470680773258209,
-0.12456237524747849,
0.10025309771299362,
-0.04034053534269333,
-0.15969179570674896,
0.12802298367023468,
-0.0435095950961113,
-0.016280202195048332,
0.04011611267924309,
-0.03383811563253403,
-0.08241409808397293,
0.04869242012500763,
-0.08193223923444748,
-0.03468599542975426,
-0.03921830281615257,
-0.019609715789556503,
-0.02835456281900406,
0.179523304104805,
-0.03646359592676163,
0.07318142801523209,
-0.02748848870396614,
0.010194642469286919,
-0.10395175963640213,
-0.1028568297624588,
0.05173351243138313,
-0.12340104579925537,
0.07964924722909927,
-0.03694985434412956,
0.030445387586951256,
0.22815105319023132,
0.02754553034901619,
0.015633730217814445,
0.13255921006202698,
-0.00819331593811512,
-0.0877854973077774,
0.03996758162975311,
-0.044342756271362305,
0.021794743835926056,
-0.030855976045131683,
-0.07628626376390457,
-0.0880078375339508,
-0.10075201094150543,
-0.049825526773929596,
0.03320961445569992,
-0.030442843213677406,
-0.05212388187646866,
-0.14976045489311218,
-0.02720625326037407,
-0.07237301766872406,
0.11920249462127686,
-0.09342960268259048,
0.08832328021526337,
-0.012045936658978462,
0.0026839354541152716,
0.037163145840168,
0.1505078673362732,
0.010094218887388706,
0.10494716465473175,
0.006677085533738136,
0.09218452870845795,
-0.06759306788444519,
0.14643312990665436,
-0.12665413320064545,
-0.02135086990892887,
-0.03415476530790329,
0.2331210970878601,
0.20847657322883606,
-0.11358945816755295,
0.009311644360423088,
0.03202449902892113,
0.04839635267853737,
0.185939759016037,
0.12599588930606842,
0.01761433109641075,
0.33329761028289795,
-0.059357043355703354,
-0.02227349951863289,
0.05721667781472206,
-0.00022221643303055316,
-0.06214975565671921,
0.0716261938214302,
0.08921460807323456,
0.013963594101369381,
-0.1257423460483551,
0.11072274297475815,
-0.21343208849430084,
0.15216094255447388,
0.07192383706569672,
-0.18375952541828156,
-0.009178245440125465,
-0.05186039209365845,
0.008210902102291584,
-0.027973614633083344,
0.13407447934150696,
-0.07003656774759293,
-0.1739543378353119,
-0.19977876543998718,
0.060681428760290146,
-0.35512542724609375,
-0.20812080800533295,
0.06384200602769852,
0.1383514702320099,
0.10808566957712173,
-0.06061858683824539,
-0.013316533528268337,
0.006446295417845249,
0.01029437780380249,
-0.019556531682610512,
0.028526417911052704,
-0.008326482027769089,
-0.05453765019774437,
-0.25444141030311584,
-0.006056090816855431,
0.0625600665807724,
-0.15240277349948883,
0.05618175491690636,
-0.017780732363462448,
-0.008800189942121506,
0.13029517233371735,
-0.021711476147174835,
0.03442413732409477,
0.00029493181500583887,
-0.16273388266563416,
0.031801287084817886,
0.035038504749536514,
0.03614772483706474,
-0.010639974847435951,
-0.04227915778756142,
-0.002239778870716691,
0.07848605513572693,
-0.054354216903448105,
-0.1438787877559662,
0.11021588742733002,
-0.026462025940418243,
0.21526864171028137,
-0.06517954170703888,
-0.033111389726400375,
0.023098714649677277,
-0.07031320035457611,
0.2018292248249054,
-0.03690796345472336,
0.05650625377893448,
0.1586160659790039,
0.018734993413090706,
0.019857894629240036,
-0.30062609910964966,
0.08813683688640594,
-0.024517416954040527,
0.006894893944263458,
-0.05270370468497276
] |
null | null | null |
# **Q-Learning** Agent playing **Taxi-v3**
This is a trained model of a **Q-Learning** agent playing **Taxi-v3**.
## Usage
```python
model = load_from_hub(repo_id="KevStrider/Taxi-v3", filename="q-learning.pkl")
# Don't forget to check if you need to add additional attributes (is_slippery=False etc)
env = gym.make(model["env_id"])
```
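The reported mean reward can be checked with a small greedy evaluation loop along these lines (a sketch: it assumes the pickled dict exposes `"env_id"` and `"qtable"` keys and the classic `gym` (<0.26) API, and the 100-episode count is arbitrary):
```python
import pickle

import gym
import numpy as np
from huggingface_hub import hf_hub_download

path = hf_hub_download(repo_id="KevStrider/Taxi-v3", filename="q-learning.pkl")
with open(path, "rb") as f:
    model = pickle.load(f)

env = gym.make(model["env_id"])
qtable = np.array(model["qtable"])

returns = []
for _ in range(100):                       # evaluate 100 greedy episodes
    state, done, ep_return = env.reset(), False, 0.0
    while not done:
        state, reward, done, _ = env.step(int(np.argmax(qtable[state])))
        ep_return += reward
    returns.append(ep_return)

print(f"mean_reward = {np.mean(returns):.2f} +/- {np.std(returns):.2f}")
```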
| {"tags": ["Taxi-v3", "q-learning", "reinforcement-learning", "custom-implementation"], "model-index": [{"name": "Taxi-v3", "results": [{"task": {"type": "reinforcement-learning", "name": "reinforcement-learning"}, "dataset": {"name": "Taxi-v3", "type": "Taxi-v3"}, "metrics": [{"type": "mean_reward", "value": "7.46 +/- 2.65", "name": "mean_reward", "verified": false}]}]}]} | reinforcement-learning | KevStrider/Taxi-v3 | [
"Taxi-v3",
"q-learning",
"reinforcement-learning",
"custom-implementation",
"model-index",
"region:us"
] | 2024-02-14T17:31:55+00:00 | [] | [] | TAGS
#Taxi-v3 #q-learning #reinforcement-learning #custom-implementation #model-index #region-us
|
# Q-Learning Agent playing Taxi-v3
This is a trained model of a Q-Learning agent playing Taxi-v3.
## Usage
| [
"# Q-Learning Agent playing1 Taxi-v3\n This is a trained model of a Q-Learning agent playing Taxi-v3 .\n\n ## Usage"
] | [
"TAGS\n#Taxi-v3 #q-learning #reinforcement-learning #custom-implementation #model-index #region-us \n",
"# Q-Learning Agent playing1 Taxi-v3\n This is a trained model of a Q-Learning agent playing Taxi-v3 .\n\n ## Usage"
] | [
32,
33
] | [
"passage: TAGS\n#Taxi-v3 #q-learning #reinforcement-learning #custom-implementation #model-index #region-us \n# Q-Learning Agent playing1 Taxi-v3\n This is a trained model of a Q-Learning agent playing Taxi-v3 .\n\n ## Usage"
] | [
0.048862796276807785,
-0.16549694538116455,
-0.005485367961227894,
0.02960980497300625,
0.1345081776380539,
-0.01784728653728962,
0.11895976960659027,
0.07759871333837509,
-0.07461097836494446,
-0.055395450443029404,
0.1418241262435913,
0.09088201075792313,
0.055222880095243454,
0.05699880048632622,
0.09511256217956543,
-0.27440664172172546,
0.048217080533504486,
-0.02918700873851776,
0.05621987581253052,
0.11878681182861328,
0.0670095682144165,
-0.040441032499074936,
0.061956584453582764,
0.11818158626556396,
-0.1018151044845581,
-0.007344264071434736,
0.035402704030275345,
-0.09440053254365921,
0.17413531243801117,
0.07204403728246689,
0.12337774783372879,
0.05132639780640602,
0.179361954331398,
-0.12762396037578583,
0.024310702458024025,
-0.0010275895474478602,
-0.10138072073459625,
-0.03909514099359512,
-0.012415820732712746,
-0.08349097520112991,
0.03230205550789833,
0.23522862792015076,
0.07199250161647797,
0.06632792949676514,
-0.17707863450050354,
-0.06584878265857697,
-0.04375573247671127,
0.069611094892025,
0.14951466023921967,
0.03758616745471954,
-0.033800311386585236,
0.1684885323047638,
-0.2564343810081482,
0.05066783353686333,
0.037275806069374084,
-0.42313119769096375,
0.017119819298386574,
0.1507398933172226,
0.15090937912464142,
0.06909667700529099,
-0.10573802888393402,
0.013512322679162025,
0.051325585693120956,
-0.0005318621988408267,
0.024325110018253326,
0.006554204970598221,
0.15601307153701782,
0.08537693321704865,
-0.1487821787595749,
-0.058576688170433044,
0.17441977560520172,
-0.03788546845316887,
-0.02613203600049019,
-0.039745692163705826,
0.0067160045728087425,
-0.06427708268165588,
-0.004067842848598957,
-0.1777995079755783,
0.00734262028709054,
0.06666424125432968,
-0.014348524622619152,
0.014901017770171165,
-0.035522811114788055,
-0.0966939702630043,
-0.023098144680261612,
-0.08592145889997482,
0.01677769608795643,
-0.006319406442344189,
-0.10187895596027374,
0.05002119392156601,
-0.061138734221458435,
0.0014382408699020743,
-0.05123179033398628,
-0.15047866106033325,
-0.049055423587560654,
-0.03481535613536835,
0.1474713832139969,
-0.0044205985032022,
-0.01873963139951229,
-0.03164304047822952,
0.15474793314933777,
0.049551334232091904,
-0.05370146036148071,
0.05625450983643532,
0.07605006545782089,
0.23867930471897125,
0.10401605814695358,
0.10196955502033234,
-0.06798075139522552,
0.10180158913135529,
-0.12330973148345947,
-0.08915644884109497,
-0.17508824169635773,
0.11820860952138901,
0.00015364694991149008,
0.1317785084247589,
-0.12023144960403442,
0.07898581773042679,
-0.067511186003685,
0.013453764840960503,
0.01636839471757412,
0.0820009782910347,
-0.012399360537528992,
0.10676060616970062,
-0.005061192903667688,
-0.06941985338926315,
0.014177112840116024,
0.05935845896601677,
0.03754841163754463,
-0.038601722568273544,
-0.03192409873008728,
-0.05762290954589844,
-0.05065649375319481,
-0.10128600150346756,
-0.06447898596525192,
0.018573462963104248,
-0.007677143905311823,
-0.1833900660276413,
-0.06407523155212402,
0.00897200871258974,
0.015712225809693336,
-0.03988850116729736,
-0.05148044601082802,
-0.15265507996082306,
-0.042461175471544266,
-0.015450406819581985,
-0.03500641882419586,
-0.06214277446269989,
-0.0383245050907135,
0.046435944736003876,
-0.07560601085424423,
0.013364278711378574,
0.023342855274677277,
0.05405820533633232,
-0.025881100445985794,
0.06068144738674164,
-0.08357544988393784,
0.09493788331747055,
-0.1540430635213852,
-0.03271956741809845,
-0.025445878505706787,
-0.041183918714523315,
0.1752462536096573,
0.06099751964211464,
-0.015994304791092873,
0.15260063111782074,
-0.17141541838645935,
-0.058121129870414734,
0.15596486628055573,
0.008629098534584045,
-0.09967197477817535,
-0.003560945624485612,
-0.09397093951702118,
0.1428760588169098,
0.08571921288967133,
0.2478504776954651,
0.12005335837602615,
-0.22748184204101562,
0.055358242243528366,
0.12515293061733246,
-0.14365963637828827,
0.10365243256092072,
0.07344598323106766,
0.005470725707709789,
-0.18886831402778625,
-0.06843198090791702,
-0.06121627986431122,
0.1053021252155304,
-0.08522345870733261,
-0.0776243582367897,
0.09323626756668091,
-0.05086790770292282,
0.24641476571559906,
-0.028281206265091896,
0.06174173951148987,
-0.026681531220674515,
-0.1389324963092804,
-0.01723906397819519,
0.060955192893743515,
0.05258452147245407,
-0.024835573509335518,
-0.25895482301712036,
0.13646544516086578,
0.048650871962308884,
0.025074828416109085,
0.004106190986931324,
-0.05691491439938545,
0.016934165731072426,
0.1511998474597931,
0.020012924447655678,
0.13717477023601532,
0.027723990380764008,
0.0706823319196701,
-0.006239562761038542,
-0.10560829937458038,
-0.04169593006372452,
0.061916545033454895,
-0.08518962562084198,
-0.06641357392072678,
0.011197872459888458,
-0.06935211271047592,
-0.11783787608146667,
-0.12166737765073776,
-0.026334572583436966,
-0.02980303019285202,
-0.07444227486848831,
0.02368103712797165,
0.06536602973937988,
-0.06702698022127151,
-0.0023908785078674555,
0.007125476840883493,
-0.011537045240402222,
0.16434046626091003,
0.011393417604267597,
-0.007796820718795061,
0.1328643560409546,
-0.11533161997795105,
0.12461213022470474,
0.049438029527664185,
-0.024806302040815353,
-0.04662557691335678,
0.0014137453399598598,
-0.057529181241989136,
0.029044216498732567,
-0.04390640929341316,
0.02774495631456375,
0.20111067593097687,
0.02772962674498558,
0.11389166116714478,
-0.0656520202755928,
0.04385066404938698,
-0.007961965166032314,
-0.009693224914371967,
0.018563594669103622,
0.07608018070459366,
0.07813210040330887,
-0.1324140727519989,
0.02262016013264656,
0.22455167770385742,
0.1385764330625534,
0.18313980102539062,
-0.010877152904868126,
0.06325667351484299,
-0.04875868931412697,
0.027505528181791306,
0.024100203067064285,
0.10314226150512695,
-0.10732068121433258,
-0.0322517491877079,
-0.025407759472727776,
0.023599207401275635,
-0.08197105675935745,
-0.1055799350142479,
-0.090115025639534,
0.01222382951527834,
-0.03125503659248352,
-0.15570329129695892,
0.13300658762454987,
-0.10451057553291321,
0.01802753657102585,
0.04692702740430832,
-0.22163605690002441,
0.11530312895774841,
0.014291439205408096,
-0.10303618758916855,
0.11281087249517441,
-0.12051989883184433,
-0.08699832111597061,
-0.05777236074209213,
-0.18658851087093353,
0.05280197039246559,
0.04673841595649719,
0.05166793242096901,
-0.18521739542484283,
0.024835903197526932,
0.05545609071850777,
0.13426995277404785,
-0.09743253141641617,
-0.07142634689807892,
-0.15038461983203888,
0.016068490222096443,
-0.033661190420389175,
-0.16029728949069977,
-0.005609163548797369,
-0.032781440764665604,
-0.18849676847457886,
-0.04539939761161804,
-0.15086813271045685,
-0.034627582877874374,
0.20464378595352173,
0.026907702907919884,
0.09480511397123337,
-0.07926445454359055,
0.3802889585494995,
-0.042039383202791214,
-0.06146497279405594,
-0.01321389526128769,
-0.07072482258081436,
0.02512686513364315,
0.13271741569042206,
0.0036099457647651434,
-0.017886579036712646,
-0.0037857077550143003,
0.0024592927657067776,
-0.06234965845942497,
-0.13400450348854065,
0.0028710351325571537,
0.03905198723077774,
0.1874423623085022,
0.004639793653041124,
0.06659388542175293,
0.03133883699774742,
0.057546284049749374,
0.07748064398765564,
0.030926106497645378,
0.0011591583024710417,
-0.01591806672513485,
0.06604493409395218,
-0.11684755235910416,
0.042466625571250916,
-0.030429253354668617,
-0.10143838077783585,
-0.013183288276195526,
0.07950251549482346,
0.12755028903484344,
0.17849206924438477,
-0.04790908098220825,
0.17489230632781982,
0.13580141961574554,
0.16576050221920013,
0.049315933138132095,
-0.020801831036806107,
-0.08773037046194077,
-0.06118565797805786,
0.004774159751832485,
-0.031952597200870514,
0.04869702458381653,
0.3231290578842163,
0.037619613111019135,
-0.09036035090684891,
0.11149907857179642,
0.009480619803071022,
0.05359881371259689,
0.022797370329499245,
-0.11162138730287552,
0.11170321702957153,
0.07968773692846298,
-0.06341761350631714,
-0.07602835446596146,
0.16758501529693604,
-0.1109386757016182,
-0.26646625995635986,
-0.11410990357398987,
-0.012305386364459991,
0.07903840392827988,
0.005651174578815699,
0.05498376116156578,
-0.11829282343387604,
-0.16034497320652008,
-0.034191906452178955,
0.1335442066192627,
-0.3077351450920105,
0.2065143585205078,
-0.0198091771453619,
0.06707923114299774,
-0.039657969027757645,
-0.07026876509189606,
0.09694647043943405,
0.13174086809158325,
0.29124146699905396,
0.01396956667304039,
0.04841272905468941,
-0.15176129341125488,
-0.0976925864815712,
0.0018439020495861769,
0.015482662245631218,
-0.02563396655023098,
0.028520405292510986,
-0.0540912002325058,
0.008404579944908619,
-0.018086453899741173,
0.2102297693490982,
-0.11316607892513275,
0.004344627261161804,
-0.06968966871500015,
-0.11707738786935806,
0.19409789144992828,
-0.07178345322608948,
-0.04543264955282211,
-0.14959357678890228,
-0.15512511134147644,
-0.004174166824668646,
-0.02413962036371231,
-0.019664527848362923,
-0.17603960633277893,
-0.18804074823856354,
-0.05204557999968529,
-0.005645004566758871,
-0.003464865731075406,
0.05867868289351463,
-0.07517234236001968,
-0.04805335775017738,
0.1009904220700264,
-0.07743175327777863,
-0.056063808500766754,
-0.1103200614452362,
0.1391381323337555,
0.06248528137803078,
0.16743235290050507,
0.05907081440091133,
0.0006117874872870743,
0.11471151560544968,
-0.02913086675107479,
0.11103474348783493,
-0.11291708797216415,
-0.17145049571990967,
-0.08334989100694656,
-0.018775060772895813,
0.09519003331661224,
-0.04789286106824875,
0.0028788831550627947,
0.2550160884857178,
0.14880181849002838,
-0.0897710770368576,
0.27680760622024536,
0.04414956644177437,
-0.09375058114528656,
-0.18432219326496124,
-0.15961645543575287,
0.03759992495179176,
0.060025621205568314,
0.13095876574516296,
-0.057205069810152054,
-0.08483537286520004,
-0.08492398262023926,
-0.07478608191013336,
-0.13140805065631866,
-0.24232175946235657,
-0.030598774552345276,
0.22874866425991058,
0.08656918257474899,
0.08219650387763977,
-0.012482990510761738,
-0.01186054851859808,
0.00526038184762001,
0.02680150233209133,
0.12018456310033798,
-0.13341329991817474,
0.11107480525970459,
0.022198403254151344,
0.044267985969781876,
0.009712530300021172,
0.07929777354001999,
0.03375575691461563,
-0.003218587953597307,
-0.0006439819699153304,
-0.0988350659608841,
-0.2596651017665863,
0.0816885456442833,
-0.01623627357184887,
-0.09960969537496567,
0.014988959766924381,
0.02061903104186058,
-0.2089255303144455,
0.011128270998597145,
-0.019883770495653152,
-0.03150356933474541,
-0.06483490765094757,
-0.10664787143468857,
-0.056551624089479446,
0.04928823933005333,
0.10853826254606247,
0.011660109274089336,
0.05354316532611847,
-0.0404130220413208,
0.07917837053537369,
0.0826287642121315,
0.15132710337638855,
0.06795957684516907,
-0.190711110830307,
-0.10953907668590546,
-0.0414445661008358,
0.12121522426605225,
-0.12505418062210083,
0.036917757242918015,
0.053161121904850006,
-0.016534561291337013,
0.14621229469776154,
0.1070784479379654,
-0.07452095299959183,
0.11915595084428787,
0.08904775977134705,
-0.04094788804650307,
-0.23367151618003845,
-0.07120766490697861,
0.11133213341236115,
0.07195597887039185,
-0.03961895406246185,
0.018120890483260155,
-0.04960581287741661,
-0.013980977237224579,
0.048759616911411285,
-0.0538676381111145,
-0.07230538129806519,
0.004421027842909098,
0.1247575581073761,
0.1029362753033638,
-0.04655474051833153,
0.01296416949480772,
0.037371400743722916,
0.003788623260334134,
0.04730486497282982,
0.0407949760556221,
-0.08269952982664108,
-0.04124005511403084,
0.02782733179628849,
0.37552911043167114,
-0.010165480896830559,
-0.020456433296203613,
0.018555615097284317,
-0.19949445128440857,
0.09135842323303223,
0.13205479085445404,
0.04697350412607193,
0.004247748292982578,
-0.08139242231845856,
0.026877427473664284,
-0.010625290684401989,
0.09936143457889557,
-0.07806670665740967,
-0.05493134260177612,
-0.21631066501140594,
-0.025010565295815468,
0.017490221187472343,
0.24077683687210083,
-0.08458559215068817,
-0.12801732122898102,
-0.20628872513771057,
0.13128381967544556,
-0.11333390325307846,
-0.03695881739258766,
-0.024473199620842934,
0.03926658630371094,
-0.01989821158349514,
0.06291737407445908,
-0.0710630789399147,
0.006373001262545586,
-0.11024709790945053,
0.055267609655857086,
0.04204455390572548,
0.1229788213968277,
0.014207782223820686,
0.02016810141503811,
0.05822525918483734,
-0.01837925612926483,
0.07173580676317215,
-0.06203491613268852,
-0.04550490900874138,
0.14224006235599518,
-0.020255116745829582,
-0.04152837023139,
-0.0483345128595829,
-0.036874305456876755,
0.11981741338968277,
-0.05059147998690605,
-0.007141099311411381,
-0.054929375648498535,
-0.06906463205814362,
0.03462086617946625,
-0.009175732731819153,
-0.008798843249678612,
0.06801853328943253,
0.04024988040328026,
-0.026994358748197556,
0.005263668950647116,
0.03447828069329262,
-0.10330043733119965,
-0.04955084249377251,
0.16955432295799255,
-0.0749620869755745,
0.10274054110050201,
-0.031069839373230934,
0.018015999346971512,
0.005847334861755371,
-0.022399673238396645,
-0.015360680408775806,
-0.1457086056470871,
-0.06137600541114807,
-0.09489979594945908,
0.11565322428941727,
0.08146517723798752,
0.03358805552124977,
0.04274565726518631,
0.019532648846507072,
-0.04414922371506691,
-0.038583990186452866,
0.12961317598819733,
0.08133101463317871,
0.012996876612305641,
0.01137041300535202,
0.01941833831369877,
-0.020302120596170425,
0.0028480992186814547,
-0.01250747125595808,
-0.07239153981208801,
-0.05874783173203468,
0.09400010108947754,
0.1600283533334732,
-0.06127211079001427,
-0.13325586915016174,
-0.020593497902154922,
0.04988488554954529,
0.0014717020094394684,
-0.08777432143688202,
0.04833676666021347,
0.15805292129516602,
-0.05623878911137581,
0.03216489031910896,
-0.09984751045703888,
-0.07263360917568207,
-0.16060975193977356,
-0.10029061883687973,
-0.06092562898993492,
-0.28350353240966797,
0.09752398729324341,
0.006392303854227066,
-0.014731393195688725,
0.059529416263103485,
0.051305368542671204,
-0.052508849650621414,
0.07068239152431488,
-0.18146829307079315,
-0.007054794579744339,
0.03497592359781265,
-0.13212306797504425,
0.02475893869996071,
-0.2378365397453308,
0.10198072344064713,
-0.04623803123831749,
-0.1519704908132553,
-0.04004510119557381,
0.0641569048166275,
-0.09540136158466339,
-0.01822364516556263,
-0.0475153923034668,
-0.01922670193016529,
0.01624443754553795,
-0.009348669089376926,
-0.031147832050919533,
0.13716529309749603,
0.02827494591474533,
-0.03268734738230705,
0.005254602525383234,
0.0223685409873724,
0.03955082967877388,
-0.0969657450914383,
-0.05986930429935455,
0.08311155438423157,
-0.031056145206093788,
0.14728976786136627,
0.000341245875461027,
0.04181376099586487,
-0.06758682429790497,
0.2593761384487152,
0.2023983597755432,
-0.12479214370250702,
0.008118697442114353,
-0.021801479160785675,
0.012670028023421764,
-0.041751839220523834,
0.13110700249671936,
0.013386172242462635,
0.12186761200428009,
-0.17513342201709747,
-0.01036517322063446,
-0.0818324014544487,
-0.04501292482018471,
0.06702108681201935,
0.14714950323104858,
0.15742522478103638,
0.03436789661645889,
-0.07328428328037262,
0.06722653657197952,
-0.30119743943214417,
0.20540550351142883,
-0.1346001923084259,
-0.01498429011553526,
-0.040251150727272034,
-0.058389630168676376,
0.061147745698690414,
0.11309876292943954,
0.10832664370536804,
-0.021150551736354828,
-0.0905047357082367,
-0.04486766457557678,
-0.039378076791763306,
-0.13019338250160217,
-0.02718670479953289,
0.1654091775417328,
0.06799814850091934,
0.31520840525627136,
-0.017577875405550003,
0.07702425122261047,
0.034410297870635986,
0.06451138854026794,
0.004519328009337187,
0.09537279605865479,
0.07960964739322662,
-0.06345855444669724,
-0.07373003661632538,
-0.001637450186535716,
0.05033271387219429,
0.14567798376083374,
-0.03826142102479935,
-0.18691548705101013,
0.15858715772628784,
0.07192251086235046,
-0.13762691617012024,
-0.05777517706155777,
0.08409425616264343,
-0.0739973932504654,
0.0550808347761631,
0.08115427941083908,
0.015876613557338715,
-0.017793258652091026,
-0.004664506763219833,
0.06074233725667,
0.024694660678505898,
-0.02343848906457424,
0.003570882137864828,
-0.08337053656578064,
-0.04151543974876404,
0.07267895340919495,
-0.0844460055232048,
-0.20546193420886993,
-0.0957019031047821,
-0.07551700621843338,
0.030557552352547646,
-0.0649830624461174,
0.12575586140155792,
0.1717868149280548,
0.0593598335981369,
-0.03307248651981354,
-0.10721943527460098,
-0.035562749952077866,
0.07602505385875702,
-0.044773899018764496,
-0.09409699589014053
] |
null | null | transformers |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# flan-t5-ellis-way
This model is a fine-tuned version of [google/flan-t5-base](https://huggingface.co/google/flan-t5-base) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 0.6531
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 1
- eval_batch_size: 1
- seed: 42
- gradient_accumulation_steps: 16
- total_train_batch_size: 16
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
- num_epochs: 75
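These settings map roughly onto a `Seq2SeqTrainingArguments` configuration like the sketch below (an approximation rather than the original script; the 500-step evaluation/save cadence is inferred from the results table, and dataset preprocessing is omitted):
```python
from transformers import Seq2SeqTrainingArguments

# Approximate reconstruction of the hyperparameters listed above.
training_args = Seq2SeqTrainingArguments(
    output_dir="flan-t5-ellis-way",
    learning_rate=5e-5,
    per_device_train_batch_size=1,
    per_device_eval_batch_size=1,
    gradient_accumulation_steps=16,   # effective train batch size of 16
    warmup_steps=500,
    lr_scheduler_type="linear",
    num_train_epochs=75,
    seed=42,
    evaluation_strategy="steps",      # assumed: the table reports eval every 500 steps
    eval_steps=500,
    save_steps=500,
    logging_steps=500,
)
```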
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:-----:|:---------------:|
| 1.0349 | 1.71 | 500 | 0.8773 |
| 0.8842 | 3.42 | 1000 | 0.7534 |
| 0.7695 | 5.13 | 1500 | 0.7030 |
| 0.7223 | 6.84 | 2000 | 0.6711 |
| 0.6589 | 8.55 | 2500 | 0.6593 |
| 0.642 | 10.26 | 3000 | 0.6443 |
| 0.6166 | 11.97 | 3500 | 0.6221 |
| 0.6062 | 13.68 | 4000 | 0.6149 |
| 0.5644 | 15.39 | 4500 | 0.6141 |
| 0.5534 | 17.1 | 5000 | 0.6099 |
| 0.5526 | 18.81 | 5500 | 0.6035 |
| 0.5034 | 20.52 | 6000 | 0.6035 |
| 0.4833 | 22.23 | 6500 | 0.6032 |
| 0.4813 | 23.94 | 7000 | 0.6034 |
| 0.4561 | 25.65 | 7500 | 0.6036 |
| 0.4517 | 27.36 | 8000 | 0.6029 |
| 0.4179 | 29.07 | 8500 | 0.6077 |
| 0.4501 | 30.78 | 9000 | 0.5990 |
| 0.4035 | 32.49 | 9500 | 0.6078 |
| 0.3941 | 34.2 | 10000 | 0.6116 |
| 0.3953 | 35.91 | 10500 | 0.6041 |
| 0.3901 | 37.61 | 11000 | 0.6112 |
| 0.3899 | 39.32 | 11500 | 0.6195 |
| 0.3666 | 41.03 | 12000 | 0.6198 |
| 0.3563 | 42.74 | 12500 | 0.6214 |
| 0.3591 | 44.45 | 13000 | 0.6254 |
| 0.3551 | 46.16 | 13500 | 0.6270 |
| 0.3387 | 47.87 | 14000 | 0.6275 |
| 0.3391 | 49.58 | 14500 | 0.6359 |
| 0.3332 | 51.29 | 15000 | 0.6335 |
| 0.3375 | 53.0 | 15500 | 0.6343 |
| 0.3287 | 54.71 | 16000 | 0.6363 |
| 0.3225 | 56.42 | 16500 | 0.6375 |
| 0.3202 | 58.13 | 17000 | 0.6432 |
| 0.3046 | 59.84 | 17500 | 0.6401 |
| 0.3055 | 61.55 | 18000 | 0.6465 |
| 0.3207 | 63.26 | 18500 | 0.6489 |
| 0.3113 | 64.97 | 19000 | 0.6490 |
| 0.3178 | 66.68 | 19500 | 0.6508 |
| 0.3175 | 68.39 | 20000 | 0.6519 |
| 0.3017 | 70.1 | 20500 | 0.6517 |
| 0.3001 | 71.81 | 21000 | 0.6542 |
| 0.3027 | 73.52 | 21500 | 0.6531 |
### Framework versions
- Transformers 4.37.2
- Pytorch 2.2.0+cu121
- Datasets 2.17.0
- Tokenizers 0.15.2
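A minimal inference sketch for the resulting checkpoint (the prompt is only illustrative, since the expected input format is not documented here):
```python
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("wayminder/flan-t5-ellis-way")
model = AutoModelForSeq2SeqLM.from_pretrained("wayminder/flan-t5-ellis-way")

inputs = tokenizer("Summarize: the quick brown fox jumps over the lazy dog.", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```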
| {"license": "apache-2.0", "tags": ["generated_from_trainer"], "base_model": "google/flan-t5-base", "model-index": [{"name": "flan-t5-ellis-way", "results": []}]} | text2text-generation | wayminder/flan-t5-ellis-way | [
"transformers",
"tensorboard",
"safetensors",
"t5",
"text2text-generation",
"generated_from_trainer",
"base_model:google/flan-t5-base",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | 2024-02-14T17:32:31+00:00 | [] | [] | TAGS
#transformers #tensorboard #safetensors #t5 #text2text-generation #generated_from_trainer #base_model-google/flan-t5-base #license-apache-2.0 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
| flan-t5-ellis-way
=================
This model is a fine-tuned version of google/flan-t5-base on an unspecified dataset.
It achieves the following results on the evaluation set:
* Loss: 0.6531
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 5e-05
* train\_batch\_size: 1
* eval\_batch\_size: 1
* seed: 42
* gradient\_accumulation\_steps: 16
* total\_train\_batch\_size: 16
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* lr\_scheduler\_warmup\_steps: 500
* num\_epochs: 75
### Training results
### Framework versions
* Transformers 4.37.2
* Pytorch 2.2.0+cu121
* Datasets 2.17.0
* Tokenizers 0.15.2
| [
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 5e-05\n* train\\_batch\\_size: 1\n* eval\\_batch\\_size: 1\n* seed: 42\n* gradient\\_accumulation\\_steps: 16\n* total\\_train\\_batch\\_size: 16\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 500\n* num\\_epochs: 75",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.37.2\n* Pytorch 2.2.0+cu121\n* Datasets 2.17.0\n* Tokenizers 0.15.2"
] | [
"TAGS\n#transformers #tensorboard #safetensors #t5 #text2text-generation #generated_from_trainer #base_model-google/flan-t5-base #license-apache-2.0 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 5e-05\n* train\\_batch\\_size: 1\n* eval\\_batch\\_size: 1\n* seed: 42\n* gradient\\_accumulation\\_steps: 16\n* total\\_train\\_batch\\_size: 16\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 500\n* num\\_epochs: 75",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.37.2\n* Pytorch 2.2.0+cu121\n* Datasets 2.17.0\n* Tokenizers 0.15.2"
] | [
80,
144,
4,
33
] | [
"passage: TAGS\n#transformers #tensorboard #safetensors #t5 #text2text-generation #generated_from_trainer #base_model-google/flan-t5-base #license-apache-2.0 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 5e-05\n* train\\_batch\\_size: 1\n* eval\\_batch\\_size: 1\n* seed: 42\n* gradient\\_accumulation\\_steps: 16\n* total\\_train\\_batch\\_size: 16\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 500\n* num\\_epochs: 75### Training results### Framework versions\n\n\n* Transformers 4.37.2\n* Pytorch 2.2.0+cu121\n* Datasets 2.17.0\n* Tokenizers 0.15.2"
] | [
-0.10438457131385803,
0.13534966111183167,
-0.0023310251999646425,
0.08196992427110672,
0.12009105086326599,
0.021420611068606377,
0.13912978768348694,
0.12298177927732468,
-0.06743498891592026,
0.11266716569662094,
0.13621652126312256,
0.09760888665914536,
0.06132268160581589,
0.14816027879714966,
-0.015057001262903214,
-0.2898314595222473,
0.0017132553039118648,
-0.01128512341529131,
-0.1326903998851776,
0.13015425205230713,
0.0761440247297287,
-0.11009635031223297,
0.08232224732637405,
-0.0060238768346607685,
-0.1274728924036026,
-0.011524522677063942,
-0.0180315300822258,
-0.047429200261831284,
0.1126730740070343,
0.0402725487947464,
0.09256996214389801,
0.03982293978333473,
0.089250847697258,
-0.2386721521615982,
0.01021066214889288,
0.06825660169124603,
0.0041893115267157555,
0.09299001097679138,
0.0904509425163269,
-0.0015465907054021955,
0.1355474889278412,
-0.07790882140398026,
0.06520731747150421,
0.05250544473528862,
-0.10399610549211502,
-0.21234722435474396,
-0.09151480346918106,
0.08486292511224747,
0.12302014231681824,
0.0872950628399849,
-0.024759728461503983,
0.07541880756616592,
-0.07060479372739792,
0.08785132318735123,
0.24690547585487366,
-0.3018312454223633,
-0.08316339552402496,
0.05328203737735748,
0.06753180921077728,
0.05928373709321022,
-0.12077260762453079,
-0.009264704771339893,
0.04610526189208031,
0.0026229918003082275,
0.10954021662473679,
0.01869639754295349,
0.08005135506391525,
0.01245088316500187,
-0.149846613407135,
-0.04120558127760887,
0.1433592438697815,
0.1072821095585823,
-0.020978210493922234,
-0.09106003493070602,
-0.04938726872205734,
-0.21935266256332397,
-0.02851136401295662,
-0.017141452059149742,
0.03778063878417015,
-0.036224815994501114,
-0.09542670100927353,
-0.0021096498239785433,
-0.0791897103190422,
-0.09209434688091278,
0.019408702850341797,
0.10790886729955673,
0.04661714285612106,
-0.02863086573779583,
0.008104351349174976,
0.12237618118524551,
0.07640930265188217,
-0.15298177301883698,
0.0046826135367155075,
0.020088057965040207,
-0.06415610015392303,
-0.02646988071501255,
-0.005765080917626619,
-0.022223947569727898,
0.02937944419682026,
0.16741183400154114,
-0.038807835429906845,
0.05787329748272896,
0.04022880643606186,
0.036691270768642426,
-0.0964234247803688,
0.1494721621274948,
-0.0773538276553154,
-0.08579745888710022,
-0.02926536463201046,
0.12223287671804428,
0.027038244530558586,
-0.012616107240319252,
-0.09501253813505173,
0.033650703728199005,
0.10967192053794861,
0.04764276370406151,
0.0020891691092401743,
0.03877627104520798,
-0.0664730817079544,
-0.016670536249876022,
0.033621445298194885,
-0.08218161761760712,
0.0425214059650898,
0.02158774621784687,
-0.06421171873807907,
-0.04138392210006714,
0.011122268624603748,
0.01589592918753624,
0.0022267100866883993,
0.13124577701091766,
-0.10707701742649078,
-0.029826266691088676,
-0.08717658370733261,
-0.09052176028490067,
0.029201850295066833,
-0.06764143705368042,
-0.001979082589969039,
-0.08196523785591125,
-0.13253532350063324,
-0.05131940543651581,
0.05526828020811081,
-0.05500173568725586,
-0.08607540279626846,
-0.06756088882684708,
-0.09666318446397781,
0.041511885821819305,
-0.005641200579702854,
0.1260179877281189,
-0.052455127239227295,
0.11612304300069809,
0.016634637489914894,
0.06622451543807983,
0.05181554704904556,
0.040437132120132446,
-0.0703362300992012,
0.05720622465014458,
-0.1569472998380661,
0.04648231342434883,
-0.07576577365398407,
0.07864820957183838,
-0.12315454334020615,
-0.10082415491342545,
-0.05017973855137825,
-0.006371956318616867,
0.07765369862318039,
0.12980344891548157,
-0.1567840725183487,
-0.08819284290075302,
0.1864040642976761,
-0.07474984973669052,
-0.1528046876192093,
0.127212792634964,
-0.00795044470578432,
-0.009813227690756321,
0.024891342967748642,
0.1405288577079773,
0.07513725012540817,
-0.09110413491725922,
-0.025322267785668373,
-0.039453182369470596,
0.0889117568731308,
0.0057083964347839355,
0.09937252104282379,
-0.03212786838412285,
0.006115937139838934,
0.004678551107645035,
-0.053379397839307785,
0.059895239770412445,
-0.09802195429801941,
-0.08773868530988693,
-0.02734002098441124,
-0.08964695781469345,
0.028993375599384308,
0.05889061465859413,
0.04809693619608879,
-0.08139067143201828,
-0.13317173719406128,
0.016205858439207077,
0.08471275120973587,
-0.07253523170948029,
0.006487289909273386,
-0.049410488456487656,
0.09724319726228714,
-0.036990467458963394,
0.0033453558571636677,
-0.13397188484668732,
-0.06813930720090866,
0.0331723652780056,
-0.037282269448041916,
0.003859658259898424,
-0.023791780695319176,
0.0751742571592331,
0.0743122398853302,
-0.06257539987564087,
-0.06948399543762207,
-0.06438060849905014,
-0.007647254969924688,
-0.09170781821012497,
-0.24735382199287415,
-0.051467061042785645,
-0.01306748203933239,
0.1513516902923584,
-0.257214218378067,
0.03644543141126633,
0.0008513853535987437,
0.1369543820619583,
0.03807539865374565,
-0.05076614394783974,
-0.013730388134717941,
0.04398112744092941,
-0.054761677980422974,
-0.0792892724275589,
0.03726537525653839,
-0.0019135837210342288,
-0.11718758195638657,
-0.026138069108128548,
-0.13235831260681152,
0.15520493686199188,
0.11480163037776947,
-0.0012079537846148014,
-0.10142382234334946,
-0.04125819355249405,
-0.08738525956869125,
-0.04865032061934471,
-0.030149204656481743,
-0.006265830714255571,
0.08337241411209106,
0.02042965777218342,
0.12176933139562607,
-0.08585164695978165,
-0.06127702817320824,
0.04221876710653305,
0.0038517138455063105,
-0.022565282881259918,
0.14450377225875854,
0.1169913113117218,
-0.06827279925346375,
0.14149700105190277,
0.1243753433227539,
-0.05035585165023804,
0.15468548238277435,
-0.05237017199397087,
-0.09083042293787003,
-0.03564124181866646,
0.04758961498737335,
0.034668080508708954,
0.12142709642648697,
-0.12020640820264816,
0.004529993049800396,
0.003667007200419903,
0.00976710394024849,
0.017756260931491852,
-0.18470823764801025,
-0.027165325358510017,
0.04012637957930565,
-0.06094661355018616,
0.002386427251622081,
-0.01777936890721321,
-0.025606881827116013,
0.09810622036457062,
0.025880342349410057,
-0.03733491152524948,
0.003023682162165642,
0.005892888177186251,
-0.09215317666530609,
0.2076578587293625,
-0.07156362384557724,
-0.1294785439968109,
-0.15370148420333862,
0.02979685179889202,
-0.04203535243868828,
-0.0004474434826988727,
0.0405452735722065,
-0.09116407483816147,
-0.0244612954556942,
-0.0876116007566452,
0.029383838176727295,
-0.034595806151628494,
0.04766068607568741,
0.008079451508820057,
0.022434892132878304,
0.06341540813446045,
-0.09289020299911499,
0.024291004985570908,
-0.013366500847041607,
-0.04836907610297203,
0.009337459690868855,
0.030977409332990646,
0.12821124494075775,
0.1473086178302765,
0.02304127998650074,
0.020133743062615395,
-0.03141416981816292,
0.188825786113739,
-0.1012420803308487,
0.014773130416870117,
0.09017327427864075,
0.014850957319140434,
0.053056780248880386,
0.14535577595233917,
0.050967179238796234,
-0.09261949360370636,
0.042387351393699646,
0.060836631804704666,
-0.016948586329817772,
-0.23317714035511017,
-0.00649246433749795,
-0.04752357676625252,
0.02883598953485489,
0.11793775856494904,
0.052447475492954254,
0.02635766565799713,
0.05509796738624573,
-0.017986126244068146,
0.0403059758245945,
-0.010873451828956604,
0.07656700164079666,
0.04178246110677719,
0.042765211313962936,
0.11556561291217804,
-0.031138485297560692,
-0.03522045165300369,
0.03469153121113777,
-0.010984583757817745,
0.21481548249721527,
-0.0028677349910140038,
0.16388265788555145,
0.05221981555223465,
0.16057398915290833,
-0.0002930839837063104,
0.06478675454854965,
0.017237510532140732,
-0.02965550124645233,
0.015482822433114052,
-0.061399612575769424,
-0.013311282731592655,
0.05730694532394409,
0.02157404087483883,
0.06053841859102249,
-0.11290895938873291,
0.0421937070786953,
0.043426092714071274,
0.30549779534339905,
0.05756552517414093,
-0.31545284390449524,
-0.10020508617162704,
0.012215561233460903,
-0.04353339225053787,
-0.03398582711815834,
0.031944528222084045,
0.1234283447265625,
-0.07279908657073975,
0.05562971532344818,
-0.07952030748128891,
0.08349346369504929,
-0.05672835558652878,
-0.0011183456517755985,
0.07708197832107544,
0.09553438425064087,
-0.016859468072652817,
0.06807344406843185,
-0.2610246539115906,
0.28607869148254395,
-0.0013309749774634838,
0.06469089537858963,
-0.03980989754199982,
0.027071800082921982,
0.014740366488695145,
0.01327082421630621,
0.10689584165811539,
-0.007141789887100458,
-0.07178518176078796,
-0.16919437050819397,
-0.10603939741849899,
0.010159761644899845,
0.12776042520999908,
-0.12210361659526825,
0.11694053560495377,
-0.0383358970284462,
-0.029712490737438202,
0.05505922809243202,
-0.035914283245801926,
-0.11313210427761078,
-0.10672929883003235,
0.009975246153771877,
-0.02950005605816841,
0.048527102917432785,
-0.09570248425006866,
-0.1099744588136673,
-0.09146331250667572,
0.176345556974411,
-0.147415891289711,
-0.017301080748438835,
-0.1317778378725052,
0.10598064213991165,
0.12263140827417374,
-0.08198542892932892,
0.04649816080927849,
-0.006562403403222561,
0.10705853253602982,
0.021807117387652397,
-0.02833649516105652,
0.12318843603134155,
-0.08205725997686386,
-0.23485684394836426,
-0.06874074786901474,
0.14956650137901306,
0.024569595232605934,
0.05195215344429016,
-0.0333225354552269,
0.014934681355953217,
-0.01031024381518364,
-0.0962672010064125,
0.06640958040952682,
0.011734274215996265,
0.04248674958944321,
0.006975285708904266,
-0.05429839715361595,
0.036536891013383865,
-0.046688299626111984,
-0.06425168365240097,
0.1145230233669281,
0.30702611804008484,
-0.09927062690258026,
0.01355757750570774,
0.05080285295844078,
-0.0478399395942688,
-0.15348614752292633,
0.017618006095290184,
0.09160919487476349,
0.027094220742583275,
0.017993951216340065,
-0.18989746272563934,
0.07543700188398361,
0.10358317941427231,
-0.030593784525990486,
0.12062007188796997,
-0.313944011926651,
-0.1413053423166275,
0.07138118147850037,
0.13009051978588104,
-0.0013971981825307012,
-0.18982593715190887,
-0.07516615837812424,
0.0002928640169557184,
-0.102055624127388,
0.0910574272274971,
-0.025493983179330826,
0.11095606535673141,
-0.017141399905085564,
-0.0008306876989081502,
0.016902657225728035,
-0.06771101802587509,
0.12687601149082184,
-0.022502796724438667,
0.07070775330066681,
-0.03236451372504234,
0.0010609502205625176,
0.022682055830955505,
-0.07068710774183273,
0.020478852093219757,
-0.10817203670740128,
0.04648645222187042,
-0.07782410830259323,
-0.0338178351521492,
-0.07429011911153793,
0.04222920536994934,
-0.07096988707780838,
-0.052739545702934265,
-0.037193819880485535,
0.041998475790023804,
0.05884890258312225,
-0.005832391325384378,
0.1441950500011444,
0.003777617122977972,
0.1678406000137329,
0.09709358960390091,
0.0788879469037056,
-0.0055659557692706585,
-0.07008831948041916,
-0.01826511323451996,
-0.022840818390250206,
0.05384019762277603,
-0.1390260010957718,
0.008584779687225819,
0.1403745859861374,
0.04303710162639618,
0.13822223246097565,
0.06920505315065384,
-0.0697714164853096,
-0.011340533383190632,
0.08971922099590302,
-0.13056321442127228,
-0.1650969535112381,
-0.03768650069832802,
-0.039996806532144547,
-0.15757660567760468,
0.02448505163192749,
0.08799339830875397,
-0.06559423357248306,
0.003140731481835246,
0.009188264608383179,
0.0336206778883934,
-0.010188134387135506,
0.18052908778190613,
0.052736323326826096,
0.08462682366371155,
-0.07420499622821808,
0.08500814437866211,
0.050864577293395996,
-0.1314699947834015,
0.021580928936600685,
0.09079878777265549,
-0.0729508101940155,
-0.014657231979072094,
0.03669881075620651,
0.106057308614254,
-0.01529223658144474,
-0.034552983939647675,
-0.13821101188659668,
-0.12003189325332642,
0.07520899921655655,
0.10903584212064743,
0.03981881961226463,
0.03721768036484718,
-0.004649979993700981,
0.05100167542695999,
-0.1080571785569191,
0.1322988122701645,
0.05861285328865051,
0.0924019142985344,
-0.17661456763744354,
0.1030745580792427,
0.011689020320773125,
-0.0036296944599598646,
-0.0063864863477647305,
0.0483916699886322,
-0.138761505484581,
-0.021930065006017685,
-0.052159521728754044,
-0.04345100745558739,
-0.057649485766887665,
-0.012956607155501842,
-0.0016932012513279915,
-0.058381762355566025,
-0.058484043926000595,
0.009456428699195385,
-0.09529776126146317,
-0.0623445101082325,
0.0018225417006760836,
0.07055804133415222,
-0.12409654259681702,
0.002601068466901779,
0.04066552221775055,
-0.10998430848121643,
0.09090456366539001,
0.01767302118241787,
0.05358601361513138,
0.017364704981446266,
-0.13042065501213074,
0.0665246844291687,
0.03039972111582756,
-0.014340081252157688,
0.027940386906266212,
-0.13492196798324585,
-0.01104345079511404,
-0.04178416728973389,
0.021075600758194923,
0.0003787505847867578,
0.018835460767149925,
-0.14957112073898315,
-0.009815995581448078,
-0.029561929404735565,
-0.051439668983221054,
-0.05558587238192558,
0.040661923587322235,
0.06782419979572296,
-0.01564808376133442,
0.17109329998493195,
-0.07986010611057281,
0.03017016500234604,
-0.23966872692108154,
0.007706812582910061,
-0.002344117034226656,
-0.06553628295660019,
-0.07500743120908737,
-0.026588356122374535,
0.0646020695567131,
-0.058922600001096725,
0.08907707035541534,
-0.04007609561085701,
0.04224622994661331,
0.03531686216592789,
-0.05829726159572601,
0.05090923607349396,
0.043162744492292404,
0.19865550100803375,
0.02931649051606655,
-0.03168930113315582,
0.0349494144320488,
0.022724183276295662,
0.07370207458734512,
0.05252229422330856,
0.18557140231132507,
0.12735623121261597,
-0.04487411677837372,
0.08804228156805038,
0.06267646700143814,
-0.11000461131334305,
-0.1655142605304718,
0.11455276608467102,
-0.06888436526060104,
0.12545689940452576,
-0.021604541689157486,
0.18812616169452667,
0.14416709542274475,
-0.18129780888557434,
0.01794113963842392,
-0.019949592649936676,
-0.07512181997299194,
-0.10116773843765259,
-0.07501422613859177,
-0.0829920545220375,
-0.17942313849925995,
0.021358661353588104,
-0.10789893567562103,
0.017383864149451256,
0.039700768887996674,
0.0303132813423872,
0.016134921461343765,
0.1559259295463562,
0.08258645236492157,
0.027414575219154358,
0.07061842828989029,
0.0331539511680603,
-0.038986366242170334,
-0.051207419484853745,
-0.09899178147315979,
0.004776451736688614,
-0.04782469943165779,
0.03474702313542366,
-0.05316564068198204,
-0.09035264700651169,
0.07042370736598969,
0.03612789511680603,
-0.09123976528644562,
0.022053154185414314,
-0.005026532337069511,
0.042642805725336075,
0.06038558855652809,
0.0054907468147575855,
-0.008831486105918884,
-0.026034321635961533,
0.23881325125694275,
-0.08882694691419601,
-0.023555034771561623,
-0.11521376669406891,
0.20596149563789368,
0.004184718709439039,
-0.00806670542806387,
0.012896537780761719,
-0.09757651388645172,
0.007301603443920612,
0.185929074883461,
0.15373408794403076,
-0.017363592982292175,
-0.02623753808438778,
0.03250068426132202,
-0.011879781261086464,
-0.032566461712121964,
0.07096507400274277,
0.11437317728996277,
0.0810791403055191,
-0.054158952087163925,
-0.03327684849500656,
-0.04058951884508133,
-0.049705397337675095,
-0.017565689980983734,
0.08833078294992447,
0.046151433140039444,
-0.0015672787558287382,
-0.03644941747188568,
0.09423036128282547,
-0.0630798190832138,
-0.11210919916629791,
0.060385480523109436,
-0.18750250339508057,
-0.16734635829925537,
-0.046747006475925446,
0.08941397815942764,
-0.013504471629858017,
0.0562683641910553,
0.01010266412049532,
-0.029061347246170044,
0.07337348908185959,
-0.0072300187312066555,
-0.08239046484231949,
-0.0839809700846672,
0.036748457700014114,
-0.0596197135746479,
0.23474089801311493,
-0.047793205827474594,
-0.011855976656079292,
0.14020173251628876,
0.039153166115283966,
-0.09823630005121231,
0.037688739597797394,
0.08053634315729141,
-0.08804920315742493,
0.050259076058864594,
0.14229120314121246,
-0.022835809737443924,
0.11403925716876984,
0.04547598212957382,
-0.1064089685678482,
0.01287802867591381,
-0.09411062300205231,
-0.05938946083188057,
-0.08179256319999695,
0.013596398755908012,
-0.03593040257692337,
0.14819733798503876,
0.21949920058250427,
-0.06234214827418327,
-0.008871199563145638,
-0.05706268548965454,
0.028469523414969444,
0.04313679039478302,
0.12275847792625427,
-0.004537437576800585,
-0.25675255060195923,
0.024671334773302078,
0.028880372643470764,
0.004383987281471491,
-0.2662543058395386,
-0.08657258003950119,
0.02146022580564022,
-0.04237307235598564,
-0.1005447581410408,
0.09429441392421722,
0.10739334672689438,
0.05905952304601669,
-0.05335592105984688,
-0.09638314694166183,
-0.05577170103788376,
0.18096095323562622,
-0.1689801961183548,
-0.07534142583608627
] |
null | null | transformers |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bert-finetuned-ner
This model is a fine-tuned version of [bert-base-cased](https://huggingface.co/bert-base-cased) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0541
- Precision: 0.9319
- Recall: 0.9493
- F1: 0.9406
- Accuracy: 0.9867
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 3
### Training results
| Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1 | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:---------:|:------:|:------:|:--------:|
| 0.2272 | 1.0 | 878 | 0.0760 | 0.8874 | 0.9234 | 0.9051 | 0.9784 |
| 0.0454 | 2.0 | 1756 | 0.0542 | 0.9231 | 0.9472 | 0.9350 | 0.9859 |
| 0.0272 | 3.0 | 2634 | 0.0541 | 0.9319 | 0.9493 | 0.9406 | 0.9867 |
### Framework versions
- Transformers 4.38.0.dev0
- Pytorch 2.1.2+cu118
- Datasets 2.17.0
- Tokenizers 0.15.1
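Assuming the checkpoint is published under the repository id shown in this card's metadata, a minimal inference sketch with the `token-classification` pipeline could look like the following; the example sentence and the aggregation strategy are illustrative, not part of the original training setup.

```python
from transformers import pipeline

# Repository id taken from this card's metadata; substitute a local path if needed.
ner = pipeline(
    "token-classification",
    model="sjunique/bert-finetuned-ner",
    aggregation_strategy="simple",  # merge word pieces into whole entity spans
)

for entity in ner("Hugging Face is based in New York City."):
    print(entity["entity_group"], entity["word"], round(entity["score"], 3))
```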
| {"license": "apache-2.0", "tags": ["generated_from_trainer"], "metrics": ["precision", "recall", "f1", "accuracy"], "base_model": "bert-base-cased", "model-index": [{"name": "bert-finetuned-ner", "results": []}]} | token-classification | sjunique/bert-finetuned-ner | [
"transformers",
"tensorboard",
"safetensors",
"bert",
"token-classification",
"generated_from_trainer",
"base_model:bert-base-cased",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | 2024-02-14T17:32:33+00:00 | [] | [] | TAGS
#transformers #tensorboard #safetensors #bert #token-classification #generated_from_trainer #base_model-bert-base-cased #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us
| bert-finetuned-ner
==================
This model is a fine-tuned version of bert-base-cased on an unknown dataset.
It achieves the following results on the evaluation set:
* Loss: 0.0541
* Precision: 0.9319
* Recall: 0.9493
* F1: 0.9406
* Accuracy: 0.9867
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 2e-05
* train\_batch\_size: 16
* eval\_batch\_size: 16
* seed: 42
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* num\_epochs: 3
### Training results
### Framework versions
* Transformers 4.38.0.dev0
* Pytorch 2.1.2+cu118
* Datasets 2.17.0
* Tokenizers 0.15.1
| [
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 3",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.38.0.dev0\n* Pytorch 2.1.2+cu118\n* Datasets 2.17.0\n* Tokenizers 0.15.1"
] | [
"TAGS\n#transformers #tensorboard #safetensors #bert #token-classification #generated_from_trainer #base_model-bert-base-cased #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 3",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.38.0.dev0\n* Pytorch 2.1.2+cu118\n* Datasets 2.17.0\n* Tokenizers 0.15.1"
] | [
68,
98,
4,
38
] | [
"passage: TAGS\n#transformers #tensorboard #safetensors #bert #token-classification #generated_from_trainer #base_model-bert-base-cased #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 3### Training results### Framework versions\n\n\n* Transformers 4.38.0.dev0\n* Pytorch 2.1.2+cu118\n* Datasets 2.17.0\n* Tokenizers 0.15.1"
] | [
-0.0989266112446785,
0.1062382161617279,
-0.0023277616128325462,
0.10977478325366974,
0.14203456044197083,
0.018664777278900146,
0.14720261096954346,
0.10603427141904831,
-0.07032617926597595,
0.044423338025808334,
0.12549547851085663,
0.13445459306240082,
0.010635889135301113,
0.12812943756580353,
-0.05623188987374306,
-0.21772940456867218,
0.014658607542514801,
0.0373956523835659,
-0.05833791196346283,
0.10863738507032394,
0.09229157865047455,
-0.13375437259674072,
0.08725453913211823,
0.003016249043866992,
-0.18250736594200134,
0.009087920188903809,
0.02181868627667427,
-0.051317986100912094,
0.13416217267513275,
0.027515798807144165,
0.14270620048046112,
0.00971996784210205,
0.09811896085739136,
-0.1941760778427124,
0.008107094094157219,
0.06704003363847733,
0.002691297559067607,
0.08467921614646912,
0.03448319435119629,
0.0234680138528347,
0.06120859459042549,
-0.07710456103086472,
0.05927274748682976,
0.013996540568768978,
-0.10781213641166687,
-0.22098210453987122,
-0.09004828333854675,
0.05326146259903908,
0.09440824389457703,
0.07096856087446213,
-0.0015788563759997487,
0.13684535026550293,
-0.0548907071352005,
0.08019338548183441,
0.20673461258411407,
-0.31907162070274353,
-0.06272675096988678,
0.05274365842342377,
0.03280499204993248,
0.07244869321584702,
-0.09048835188150406,
-0.027242226526141167,
0.06884928047657013,
0.03350341320037842,
0.1431528776884079,
-0.028660306707024574,
-0.06982974708080292,
0.009490608237683773,
-0.15583683550357819,
-0.02212240733206272,
0.140568807721138,
0.049536723643541336,
-0.04825776070356369,
-0.037347424775362015,
-0.07162588089704514,
-0.14243751764297485,
-0.03859945759177208,
-0.028769107535481453,
0.05415268987417221,
-0.021483642980456352,
-0.051400892436504364,
-0.011656087823212147,
-0.1022917777299881,
-0.08096323162317276,
-0.06713096797466278,
0.13832306861877441,
0.04590744152665138,
0.010233295150101185,
-0.011628888547420502,
0.09757208824157715,
-0.04251214489340782,
-0.12233833223581314,
0.016820553690195084,
0.023815184831619263,
0.011142420582473278,
-0.05465320870280266,
-0.05498993769288063,
-0.05605322867631912,
0.02908218279480934,
0.15427124500274658,
-0.042172592133283615,
0.03894219547510147,
0.01911332458257675,
0.0477195680141449,
-0.09835212677717209,
0.17540526390075684,
-0.05532395839691162,
-0.019875381141901016,
0.03188576549291611,
0.06693082302808762,
0.051994144916534424,
0.0002209718804806471,
-0.1250731348991394,
0.026667902246117592,
0.1119878962635994,
0.0108652887865901,
-0.07068498432636261,
0.07355104386806488,
-0.04651347175240517,
-0.005190522409975529,
0.03792564198374748,
-0.08634151518344879,
0.029839124530553818,
-0.0028183062095195055,
-0.05222087725996971,
-0.055163439363241196,
0.02728661336004734,
0.020686855539679527,
0.002127516781911254,
0.10428541153669357,
-0.08949984610080719,
0.003897721180692315,
-0.09538374096155167,
-0.12126388400793076,
0.027102291584014893,
-0.07750018686056137,
0.02785015106201172,
-0.1119149699807167,
-0.16317829489707947,
0.0030292326118797064,
0.05958918109536171,
-0.017815498635172844,
-0.04132857546210289,
-0.03692532330751419,
-0.07432075589895248,
0.0058412556536495686,
-0.024308517575263977,
0.08421787619590759,
-0.06414609402418137,
0.0858897864818573,
0.037255801260471344,
0.06149286404252052,
-0.057248812168836594,
0.0374118834733963,
-0.0925978422164917,
0.04066237807273865,
-0.18801657855510712,
0.008636231534183025,
-0.07466593384742737,
0.06186086684465408,
-0.08917226642370224,
-0.07562302052974701,
0.01797442138195038,
0.002680856268852949,
0.07990849018096924,
0.07050169259309769,
-0.159791499376297,
-0.05674975365400314,
0.15671047568321228,
-0.08399628847837448,
-0.142501562833786,
0.128151535987854,
-0.05682915449142456,
0.03426538407802582,
0.05802842229604721,
0.17249198257923126,
0.06246979162096977,
-0.10382049530744553,
-0.00025508805993013084,
0.0035690704826265574,
0.06052948534488678,
-0.06895564496517181,
0.06503555178642273,
0.012206604704260826,
-0.0006468545761890709,
0.01780368946492672,
-0.06576529145240784,
0.055477600544691086,
-0.07724218815565109,
-0.08290795236825943,
-0.046183031052351,
-0.10442575812339783,
0.03513578325510025,
0.04718886688351631,
0.05987011268734932,
-0.10242947936058044,
-0.09090261161327362,
0.08525046706199646,
0.07779350876808167,
-0.0693444013595581,
0.019573824480175972,
-0.07829436659812927,
0.0982823446393013,
-0.07963237166404724,
-0.02162305824458599,
-0.14884719252586365,
-0.05512110888957977,
0.023136354982852936,
-0.016968175768852234,
0.008996544405817986,
0.00046900921734049916,
0.06341228634119034,
0.0792754739522934,
-0.05567798763513565,
-0.020684296265244484,
-0.01958470605313778,
0.021647628396749496,
-0.12541460990905762,
-0.194368377327919,
-0.03420006111264229,
-0.03037051483988762,
0.12089257687330246,
-0.21158061921596527,
0.045598965138196945,
-0.011687260121107101,
0.09205756336450577,
0.025598784908652306,
-0.0028317600954324007,
-0.05097409337759018,
0.07260700315237045,
-0.0415157787501812,
-0.06210163235664368,
0.05776108428835869,
0.01476467214524746,
-0.07864010334014893,
-0.03930850327014923,
-0.1283232569694519,
0.17630122601985931,
0.1348673552274704,
-0.0837259292602539,
-0.07194364815950394,
-0.009485269896686077,
-0.045254576951265335,
-0.03045424446463585,
-0.04542660340666771,
0.008740739896893501,
0.13822060823440552,
-0.012401923537254333,
0.14659033715724945,
-0.08143623173236847,
-0.04149815812706947,
0.023145942017436028,
-0.039984751492738724,
0.018220560625195503,
0.09138024598360062,
0.12951235473155975,
-0.0954468846321106,
0.15421831607818604,
0.1776101291179657,
-0.10832283645868301,
0.1001100093126297,
-0.048879630863666534,
-0.05750790983438492,
-0.023396920412778854,
-0.003625171259045601,
0.004994423594325781,
0.125851571559906,
-0.11444101482629776,
0.0036167227663099766,
0.011561726219952106,
0.01625698432326317,
0.013239964842796326,
-0.22427809238433838,
-0.02119184285402298,
0.03547549992799759,
-0.045575447380542755,
-0.0007282973965629935,
-0.02860616333782673,
-0.009960547089576721,
0.0919085443019867,
0.0032194312661886215,
-0.0984349474310875,
0.04347282275557518,
-0.0008737976895645261,
-0.07976347953081131,
0.204599067568779,
-0.08455492556095123,
-0.10758785903453827,
-0.12714962661266327,
-0.07663490623235703,
-0.03980978950858116,
0.02468237839639187,
0.05708549916744232,
-0.06515628099441528,
-0.04791815206408501,
-0.09752253443002701,
-0.003673602594062686,
0.04137725755572319,
0.03485467657446861,
0.015145118348300457,
-0.009846900589764118,
0.08702212572097778,
-0.10375143587589264,
-0.012938138097524643,
-0.049137018620967865,
-0.05339827388525009,
0.021333154290914536,
0.03741654381155968,
0.11408524960279465,
0.1470441073179245,
-0.027910582721233368,
-0.007608558051288128,
-0.024979501962661743,
0.23476245999336243,
-0.05056992545723915,
-0.01587560586631298,
0.1293792426586151,
-0.03157688304781914,
0.04715345799922943,
0.1365281641483307,
0.06668462604284286,
-0.09034541994333267,
0.011460104025900364,
0.038256559520959854,
-0.027590312063694,
-0.21291454136371613,
-0.03605316951870918,
-0.03995600342750549,
-0.009995263069868088,
0.09307503700256348,
0.03325099125504494,
0.028908513486385345,
0.07128184288740158,
0.03248882666230202,
0.09762080758810043,
-0.029674334451556206,
0.073602594435215,
0.13025450706481934,
0.041463859379291534,
0.12513761222362518,
-0.04254763573408127,
-0.05677209049463272,
0.033452995121479034,
0.025540295988321304,
0.20613068342208862,
0.018048910424113274,
0.13259491324424744,
0.05897291749715805,
0.17670299112796783,
-0.001760643208399415,
0.06728647649288177,
-0.02177302911877632,
-0.03759428486227989,
-0.01532242726534605,
-0.04541704058647156,
-0.028698161244392395,
0.03649899736046791,
-0.09439679235219955,
0.05818961188197136,
-0.09476787596940994,
0.017382264137268066,
0.05744286626577377,
0.2592451870441437,
0.04350943863391876,
-0.330951988697052,
-0.09278029203414917,
0.020423859357833862,
-0.030788548290729523,
-0.023090220987796783,
0.032753318548202515,
0.12231817841529846,
-0.04370507597923279,
0.019582344219088554,
-0.06875781714916229,
0.08470448851585388,
-0.03652581572532654,
0.039490096271038055,
0.07639501988887787,
0.10653655976057053,
0.003924083895981312,
0.06669686734676361,
-0.2591719627380371,
0.2635769844055176,
0.013097710907459259,
0.0729670375585556,
-0.051043279469013214,
0.010015820153057575,
0.03223827853798866,
0.0916469395160675,
0.06924129277467728,
-0.017115775495767593,
-0.06512245535850525,
-0.19467635452747345,
-0.06261606514453888,
0.03287616744637489,
0.07016560435295105,
-0.014970473945140839,
0.09427318722009659,
-0.03652140125632286,
0.0011479767272248864,
0.08679799735546112,
0.0001553771726321429,
-0.07282944023609161,
-0.08569074422121048,
-0.019810887053608894,
0.04256526008248329,
-0.03593727573752403,
-0.08196595311164856,
-0.100868821144104,
-0.13596966862678528,
0.15798993408679962,
-0.06143123656511307,
-0.027054455131292343,
-0.09724355489015579,
0.06300521641969681,
0.0534772127866745,
-0.0756925642490387,
0.056716982275247574,
0.006056127138435841,
0.09517350047826767,
0.02392709068953991,
-0.0636732205748558,
0.12406042218208313,
-0.08152833580970764,
-0.16087913513183594,
-0.0782582089304924,
0.08477155864238739,
0.022229444235563278,
0.04796859622001648,
0.0011697873705998063,
0.008168794214725494,
-0.014774422161281109,
-0.08055189251899719,
0.014326279051601887,
0.006851856131106615,
0.06695606559515,
-0.011719600297510624,
-0.06912672519683838,
0.007614597678184509,
-0.04738432914018631,
-0.026412568986415863,
0.15222808718681335,
0.25490719079971313,
-0.09554591774940491,
-0.007133870851248503,
0.060528699308633804,
-0.06670620292425156,
-0.20411787927150726,
0.03198244422674179,
0.035600364208221436,
0.0010938235791400075,
0.02775782346725464,
-0.13978123664855957,
0.13918311893939972,
0.11038237065076828,
-0.03145783022046089,
0.09817315638065338,
-0.27258729934692383,
-0.1343255341053009,
0.14041337370872498,
0.1573282927274704,
0.10130857676267624,
-0.1467454880475998,
-0.029026582837104797,
-0.02756933495402336,
-0.12354560941457748,
0.11054801195859909,
-0.12023641169071198,
0.09809403121471405,
-0.004241865128278732,
0.06722955405712128,
-0.004344641696661711,
-0.06050988659262657,
0.12231330573558807,
0.0008713475544936955,
0.10912702977657318,
-0.05605287849903107,
-0.016554322093725204,
0.042324453592300415,
-0.05387469381093979,
0.02080097235739231,
-0.09961560368537903,
0.03807699680328369,
-0.05339062958955765,
-0.03012738935649395,
-0.04652995988726616,
0.03289729356765747,
-0.03084714151918888,
-0.0700683444738388,
-0.04072396457195282,
0.02816266380250454,
0.03940165787935257,
-0.016173068434000015,
0.14029711484909058,
0.02931499294936657,
0.13958890736103058,
0.13372619450092316,
0.07018925249576569,
-0.08439727872610092,
-0.03356914967298508,
-0.0015394339570775628,
-0.03915034234523773,
0.07697822898626328,
-0.1492180973291397,
0.038826920092105865,
0.1253976672887802,
0.004287369083613157,
0.1375315934419632,
0.0731399655342102,
-0.033706605434417725,
0.004212252330034971,
0.05876749008893967,
-0.1585507094860077,
-0.10132883489131927,
0.013904456049203873,
-0.01462119072675705,
-0.11020267754793167,
0.07326454669237137,
0.11093366146087646,
-0.08342818170785904,
-0.0015576269943267107,
-0.0030710874125361443,
0.008252288214862347,
-0.04697762429714203,
0.18123437464237213,
0.07384492456912994,
0.045442283153533936,
-0.07722850143909454,
0.0738314613699913,
0.05138272047042847,
-0.06983612477779388,
0.009692582301795483,
0.024671325460076332,
-0.08768384903669357,
-0.04273039475083351,
0.0647716298699379,
0.18381112813949585,
-0.043023474514484406,
-0.06406824290752411,
-0.13975223898887634,
-0.11909067630767822,
0.060804758220911026,
0.18685349822044373,
0.10524073988199234,
0.011533310636878014,
-0.03694665804505348,
0.01792133040726185,
-0.11035869270563126,
0.10873980820178986,
0.02320992201566696,
0.09204281121492386,
-0.15388846397399902,
0.11364978551864624,
0.001056428300216794,
0.015182198956608772,
-0.02712484449148178,
0.05487537756562233,
-0.11292271316051483,
-0.003465181915089488,
-0.15472768247127533,
-0.0021186573430895805,
-0.02901829592883587,
0.013159970752894878,
0.011387786827981472,
-0.05960383266210556,
-0.06578906625509262,
0.026133596897125244,
-0.10005611926317215,
-0.018085015937685966,
0.04124421998858452,
0.05506763979792595,
-0.12743300199508667,
-0.03930964320898056,
0.023403460159897804,
-0.06526913493871689,
0.06722377240657806,
0.014962724409997463,
0.02884291671216488,
0.05309844762086868,
-0.16612008213996887,
0.02073996141552925,
0.07151105254888535,
0.022491587325930595,
0.05745351314544678,
-0.09496990591287613,
-0.009820941835641861,
0.0037622966337949038,
0.03243838623166084,
0.006735218223184347,
0.08597744256258011,
-0.12483493238687515,
-0.008689983747899532,
-0.02374635450541973,
-0.05912651866674423,
-0.0543769896030426,
0.008471950888633728,
0.09925816208124161,
0.0066707441583275795,
0.19561068713665009,
-0.08655297756195068,
0.00998789444565773,
-0.19880498945713043,
0.005478298757225275,
-0.005629718769341707,
-0.1095694825053215,
-0.13013169169425964,
-0.05037517845630646,
0.04474015533924103,
-0.05595250800251961,
0.15927578508853912,
0.010083583183586597,
0.02482106164097786,
0.03413887321949005,
-0.03686005249619484,
0.0371394157409668,
0.0332554429769516,
0.21795108914375305,
0.023837806656956673,
-0.03321564942598343,
0.014152629300951958,
0.031818751245737076,
0.10986443608999252,
0.07530368864536285,
0.16109034419059753,
0.15934038162231445,
-0.030164001509547234,
0.10186395049095154,
0.05658246949315071,
-0.05639107525348663,
-0.14491161704063416,
0.05203017592430115,
-0.047837160527706146,
0.09920231997966766,
-0.019875159487128258,
0.21368585526943207,
0.08504606783390045,
-0.158607617020607,
0.011673280969262123,
-0.05817728862166405,
-0.0783730298280716,
-0.11164236813783646,
-0.05245509371161461,
-0.08907677233219147,
-0.14967209100723267,
-0.002477964386343956,
-0.10854651778936386,
-0.007712837774306536,
0.10597459971904755,
0.006420407444238663,
-0.01718497835099697,
0.16368018090724945,
0.009009924717247486,
0.046157531440258026,
0.03283029794692993,
0.00911315530538559,
-0.03610965237021446,
-0.09041105210781097,
-0.09117069095373154,
-0.0021059392020106316,
-0.016349708661437035,
0.026365282014012337,
-0.055770643055438995,
-0.016678813844919205,
0.04616698622703552,
-0.0002737686736509204,
-0.09168485552072525,
0.007881986908614635,
0.014657517895102501,
0.0422561839222908,
0.041553761810064316,
0.003541715443134308,
0.02145378105342388,
0.001845150371082127,
0.18966273963451385,
-0.07351849973201752,
-0.06289410591125488,
-0.11113231629133224,
0.23070932924747467,
0.01718927174806595,
0.012379745952785015,
0.018086381256580353,
-0.08105017989873886,
0.012325872667133808,
0.22324669361114502,
0.17709803581237793,
-0.08496896922588348,
-0.005730603821575642,
-0.0018789630848914385,
-0.016058621928095818,
-0.055513378232717514,
0.09939619898796082,
0.12692615389823914,
0.02670476771891117,
-0.07622051984071732,
-0.05565190687775612,
-0.04038919135928154,
-0.0007854178547859192,
-0.0394676998257637,
0.047060009092092514,
0.03190726041793823,
0.010793282650411129,
-0.04861912503838539,
0.03735782578587532,
-0.029778597876429558,
-0.11200930923223495,
0.06614656746387482,
-0.1766616255044937,
-0.15437564253807068,
-0.005356441251933575,
0.11595245450735092,
-0.015977049246430397,
0.04619923233985901,
-0.03500168025493622,
0.0043857810087502,
0.09113405644893646,
-0.02894188091158867,
-0.06900280714035034,
-0.07200012356042862,
0.07149002701044083,
-0.06977922469377518,
0.24608054757118225,
-0.033146023750305176,
0.060825664550065994,
0.13125677406787872,
0.050652045756578445,
-0.085382379591465,
0.07686514407396317,
0.05157987028360367,
-0.08488857746124268,
0.02128121815621853,
0.057517245411872864,
-0.04170394688844681,
0.11300599575042725,
0.046319980174303055,
-0.1313825398683548,
0.010255631059408188,
-0.07010866701602936,
-0.08130127936601639,
-0.055477675050497055,
-0.04315495863556862,
-0.06722313165664673,
0.1364213526248932,
0.17876076698303223,
-0.03260122984647751,
-0.0010261840652674437,
-0.05626802146434784,
0.02780514396727085,
0.06392652541399002,
0.027706103399395943,
-0.03629830852150917,
-0.2244952917098999,
0.04000912979245186,
0.05662291869521141,
-0.014942744746804237,
-0.24962042272090912,
-0.09025359153747559,
-0.002159091643989086,
-0.052963986992836,
-0.09450092166662216,
0.07359028607606888,
0.11521132290363312,
0.05825580283999443,
-0.07199908792972565,
-0.08179628103971481,
-0.08289074897766113,
0.1372617483139038,
-0.12764666974544525,
-0.10179606080055237
] |
null | null | transformers |
# Phi-2 model fine-tuned for named entity recognition task
The model was fine-tuned using one quarter of the CoNLL 2012 OntoNotes v5 dataset.
- Dataset Source: [conll2012_ontonotesv5](https://huggingface.co/datasets/conll2012_ontonotesv5)
- Subset Used: English_v12
- Number of Examples: 87,265
The prompts and expected outputs were constructed as described in [1].
Example input:
```md
Instruct: I am an excelent linquist. The task is to label organization entities in the given sentence. Below are some examples
Input: A spokesman for B. A. T said of the amended filings that,`` It would appear that nothing substantive has changed.
Output: A spokesman for @@B. A. T## said of the amended filings that,`` It would appear that nothing substantive has changed.
Input: Since NBC's interest in the Qintex bid for MGM / UA was disclosed, Mr. Wright has n't been available for comment.
Output: Since @@NBC##'s interest in the @@Qintex## bid for @@MGM / UA## was disclosed, Mr. Wright has n't been available for comment.
Input: You know news organizations demand total transparency whether you're General Motors or United States government /.
Output: You know news organizations demand total transparency whether you're @@General Motors## or United States government /.
Input: We respectfully invite you to watch a special edition of Across China.
Output:
```
Expected output:
```md
We respectfully invite you to watch a special edition of @@Across China##.
```
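The `@@ ... ##` delimiters can be turned back into plain entity strings with a small post-processing step; the helper below is a minimal sketch (the function name is illustrative, not something the model provides):

```python
import re

def extract_entities(labeled_text: str) -> list[str]:
    """Return the spans wrapped in @@ ... ## markers."""
    return re.findall(r"@@(.+?)##", labeled_text)

print(extract_entities(
    "We respectfully invite you to watch a special edition of @@Across China##."
))
# ['Across China']
```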
This model is trained to recognize the following named entity categories:
- person
- nationalities or religious or political groups
- facility
- organization
- geopolitical entity
- location
- product
- date
- time expression
- percentage
- monetary value
- quantity
- event
- work of art
- law/legal reference
- language name
# Model Trained Using AutoTrain
This model was trained using **SFT** AutoTrain trainer. For more information, please visit [AutoTrain](https://hf.co/docs/autotrain).
Hyperparameters:
```json
{
"model": "microsoft/phi-2",
"valid_split": null,
"add_eos_token": false,
"block_size": 1024,
"model_max_length": 1024,
"padding": "right",
"trainer": "sft",
"use_flash_attention_2": false,
"disable_gradient_checkpointing": false,
"evaluation_strategy": "epoch",
"save_total_limit": 1,
"save_strategy": "epoch",
"auto_find_batch_size": false,
"mixed_precision": "bf16",
"lr": 0.0002,
"epochs": 1,
"batch_size": 1,
"warmup_ratio": 0.1,
"gradient_accumulation": 4,
"optimizer": "adamw_torch",
"scheduler": "linear",
"weight_decay": 0.01,
"max_grad_norm": 1.0,
"seed": 42,
"apply_chat_template": false,
"quantization": "int4",
"target_modules": null,
"merge_adapter": false,
"peft": true,
"lora_r": 16,
"lora_alpha": 32,
"lora_dropout": 0.05,
"dpo_beta": 0.1,
}
```
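Roughly, the quantization and PEFT fields above correspond to loading the base model in 4-bit and attaching a LoRA adapter. The sketch below is an approximation of that setup, not AutoTrain's internal code: the `target_modules` choice is illustrative (the original config leaves it unset), and a transformers release with native Phi support is assumed.

```python
import torch
from peft import LoraConfig, get_peft_model
from transformers import AutoModelForCausalLM, BitsAndBytesConfig

# 4-bit base model ("quantization": "int4") with bf16 compute ("mixed_precision": "bf16").
base = AutoModelForCausalLM.from_pretrained(
    "microsoft/phi-2",
    quantization_config=BitsAndBytesConfig(
        load_in_4bit=True,
        bnb_4bit_compute_dtype=torch.bfloat16,
    ),
)

# LoRA adapter mirroring lora_r / lora_alpha / lora_dropout above.
# target_modules is an illustrative choice; the AutoTrain config leaves it null.
lora = LoraConfig(
    r=16,
    lora_alpha=32,
    lora_dropout=0.05,
    target_modules=["q_proj", "k_proj", "v_proj", "dense"],
    task_type="CAUSAL_LM",
)
model = get_peft_model(base, lora)
model.print_trainable_parameters()
```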
# Usage
```python
from transformers import AutoModelForCausalLM, AutoTokenizer
model_path = "pahautelman/phi2-ner-v1"
tokenizer = AutoTokenizer.from_pretrained(model_path)
model = AutoModelForCausalLM.from_pretrained(
model_path
).eval()
prompt = 'Label the person entities in the given sentence: Russian President Vladimir Putin is due to arrive in Havana a few hours from now to become the first post-Soviet leader to visit Cuba.'
inputs = tokenizer.encode(prompt, add_special_tokens=False, return_tensors='pt')
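# Greedy decoding (do_sample=False); max_new_tokens caps the length of the labeled output.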
outputs = model.generate(
inputs.to(model.device),
max_new_tokens=9,
do_sample=False,
)
output = tokenizer.batch_decode(outputs)[0]
# Model response: "Output: Russian President, Vladimir Putin"
print(output)
```
# References:
[1] Wang et al., GPT-NER: Named entity recognition via large language models 2023 | {"language": ["en"], "license": "mit", "tags": ["autotrain", "text-generation", "transformers", "named entity recognition"], "datasets": ["conll2012_ontonotesv5"], "widget": [{"text": "I love AutoTrain because "}]} | text-generation | pahautelman/phi2-ner-v1 | [
"transformers",
"safetensors",
"phi",
"text-generation",
"autotrain",
"named entity recognition",
"conversational",
"custom_code",
"en",
"dataset:conll2012_ontonotesv5",
"license:mit",
"autotrain_compatible",
"endpoints_compatible",
"4-bit",
"region:us"
] | 2024-02-14T17:35:00+00:00 | [] | [
"en"
] | TAGS
#transformers #safetensors #phi #text-generation #autotrain #named entity recognition #conversational #custom_code #en #dataset-conll2012_ontonotesv5 #license-mit #autotrain_compatible #endpoints_compatible #4-bit #region-us
|
# Phi-2 model fine-tuned for named entity recognition task
The model was fine-tuned using one quarter of the CoNLL 2012 OntoNotes v5 dataset.
- Dataset Source: conll2012_ontonotesv5
- Subset Used: English_v12
- Number of Examples: 87,265
The prompts and expected outputs were constructed as described in [1].
Example input:
Expected output:
This model is trained to recognize the named entity categories
- person
- nationalities or religious or political groups
- facility
- organization
- geopolitical entity
- location
- product
- date
- time expression
- percentage
- monetary value
- quantity
- event
- work of art
- law/legal reference
- language name
# Model Trained Using AutoTrain
This model was trained using SFT AutoTrain trainer. For more information, please visit AutoTrain.
Hyperparameters:
# Usage
# References:
[1] Wang et al., GPT-NER: Named entity recognition via large language models 2023 | [
"# Phi-2 model fine-tuned for named entity recognition task\nThe model was fine-tuned using one quarter of the ConLL 2012 OntoNotes v5 dataset.\n- Dataset Source: conll2012_ontonotesv5\n- Subset Used: English_v12\n- Number of Examples: 87,265\n \nThe prompts and expected outputs were constructed as described in [1].\n\nExample input:\n\nExpected output:\n\n\nThis model is trained to recognize the named entity categories\n- person\n- nationalities or religious or political groups\n- facility\n- organization\n- geopolitical entity\n- location\n- product\n- date\n- time expression\n- percentage\n- monetary value\n- quantity\n- event\n- work of art\n- law/legal reference\n- language name",
"# Model Trained Using AutoTrain\n\nThis model was trained using SFT AutoTrain trainer. For more information, please visit AutoTrain.\n\nHyperparameters:",
"# Usage",
"# References:\n[1] Wang et al., GPT-NER: Named entity recognition via large language models 2023"
] | [
"TAGS\n#transformers #safetensors #phi #text-generation #autotrain #named entity recognition #conversational #custom_code #en #dataset-conll2012_ontonotesv5 #license-mit #autotrain_compatible #endpoints_compatible #4-bit #region-us \n",
"# Phi-2 model fine-tuned for named entity recognition task\nThe model was fine-tuned using one quarter of the ConLL 2012 OntoNotes v5 dataset.\n- Dataset Source: conll2012_ontonotesv5\n- Subset Used: English_v12\n- Number of Examples: 87,265\n \nThe prompts and expected outputs were constructed as described in [1].\n\nExample input:\n\nExpected output:\n\n\nThis model is trained to recognize the named entity categories\n- person\n- nationalities or religious or political groups\n- facility\n- organization\n- geopolitical entity\n- location\n- product\n- date\n- time expression\n- percentage\n- monetary value\n- quantity\n- event\n- work of art\n- law/legal reference\n- language name",
"# Model Trained Using AutoTrain\n\nThis model was trained using SFT AutoTrain trainer. For more information, please visit AutoTrain.\n\nHyperparameters:",
"# Usage",
"# References:\n[1] Wang et al., GPT-NER: Named entity recognition via large language models 2023"
] | [
78,
160,
37,
3,
25
] | [
"passage: TAGS\n#transformers #safetensors #phi #text-generation #autotrain #named entity recognition #conversational #custom_code #en #dataset-conll2012_ontonotesv5 #license-mit #autotrain_compatible #endpoints_compatible #4-bit #region-us \n# Phi-2 model fine-tuned for named entity recognition task\nThe model was fine-tuned using one quarter of the ConLL 2012 OntoNotes v5 dataset.\n- Dataset Source: conll2012_ontonotesv5\n- Subset Used: English_v12\n- Number of Examples: 87,265\n \nThe prompts and expected outputs were constructed as described in [1].\n\nExample input:\n\nExpected output:\n\n\nThis model is trained to recognize the named entity categories\n- person\n- nationalities or religious or political groups\n- facility\n- organization\n- geopolitical entity\n- location\n- product\n- date\n- time expression\n- percentage\n- monetary value\n- quantity\n- event\n- work of art\n- law/legal reference\n- language name# Model Trained Using AutoTrain\n\nThis model was trained using SFT AutoTrain trainer. For more information, please visit AutoTrain.\n\nHyperparameters:# Usage# References:\n[1] Wang et al., GPT-NER: Named entity recognition via large language models 2023"
] | [
-0.04224191606044769,
0.1203203946352005,
-0.005330866202712059,
0.04963976517319679,
0.09563716500997543,
-0.03779733180999756,
0.14072075486183167,
0.06293730437755585,
-0.04557732865214348,
0.12614655494689941,
0.024571403861045837,
0.056399788707494736,
0.0950995683670044,
0.1304740607738495,
-0.0015784086426720023,
-0.22945110499858856,
0.06128966063261032,
-0.06325969845056534,
0.04928738251328468,
0.10957637429237366,
0.08283507078886032,
-0.07500236481428146,
0.11014825105667114,
0.012489352375268936,
-0.09509969502687454,
0.029067980125546455,
-0.016779549419879913,
-0.07416409254074097,
0.12218751013278961,
0.08932969719171524,
0.01180438231676817,
-0.022725414484739304,
0.05976579338312149,
-0.17312680184841156,
0.006138380151242018,
0.026074254885315895,
-0.035571739077568054,
-0.00045195751590654254,
0.07917510718107224,
0.025546889752149582,
0.1607300490140915,
-0.037056975066661835,
0.0419471338391304,
0.03401568531990051,
-0.1140134260058403,
-0.06972198188304901,
-0.10063467174768448,
0.09967514872550964,
0.055789340287446976,
0.08137643337249756,
-0.043391164392232895,
0.17245420813560486,
-0.09610259532928467,
0.04773435369133949,
0.067563995718956,
-0.24024061858654022,
-0.042752139270305634,
0.08056589961051941,
-0.011019035242497921,
0.09034345299005508,
-0.046300873160362244,
0.014516597613692284,
0.025619566440582275,
0.037784602493047714,
0.037298575043678284,
-0.028340619057416916,
-0.0924646183848381,
0.02772996574640274,
-0.1500881463289261,
-0.02475035935640335,
0.13307900726795197,
0.03596755489706993,
0.0011721127666532993,
-0.13693086802959442,
-0.06012044847011566,
0.051828399300575256,
-0.0017353525618091226,
-0.08013541996479034,
0.04630804434418678,
-0.01113192643970251,
0.06797919422388077,
0.04156610742211342,
-0.07554075121879578,
0.027313238009810448,
-0.12029357999563217,
0.026811882853507996,
0.01974201202392578,
0.07527635991573334,
-0.053252290934324265,
0.011525413952767849,
-0.1517360657453537,
-0.08639755100011826,
-0.05038357526063919,
-0.005227990448474884,
-0.09432117640972137,
-0.028894709423184395,
0.04231124743819237,
0.027022864669561386,
0.04219774156808853,
0.04338974133133888,
-0.0769578292965889,
0.05466926470398903,
-0.020177064463496208,
0.008904707618057728,
0.04341933876276016,
0.19112086296081543,
-0.06529859453439713,
-0.0892915278673172,
0.0021311063319444656,
-0.015734896063804626,
0.010924474336206913,
0.03792346641421318,
-0.08712491393089294,
0.028130102902650833,
0.019167765974998474,
0.06339969485998154,
-0.012338392436504364,
0.062213778495788574,
-0.040146201848983765,
-0.0007921184878796339,
0.14005577564239502,
-0.12312076985836029,
0.018848344683647156,
0.0091850021854043,
-0.060566313564777374,
0.029330085963010788,
-0.045251306146383286,
0.018358595669269562,
-0.05762581154704094,
0.04711387678980827,
-0.04632408171892166,
0.00750297075137496,
-0.043430618941783905,
-0.07198456674814224,
0.03571512922644615,
0.03365781903266907,
-0.05352279916405678,
-0.12610644102096558,
-0.15879224240779877,
-0.06393436342477798,
-0.004210211336612701,
-0.0816904753446579,
-0.032125234603881836,
-0.015106030739843845,
0.014976456761360168,
-0.01845484785735607,
-0.020065657794475555,
0.0053283642046153545,
-0.04963771253824234,
0.007096551358699799,
-0.021117303520441055,
0.04457651451230049,
0.10358960926532745,
0.02542453072965145,
-0.12669987976551056,
-0.005361457820981741,
-0.11430409550666809,
0.0906725600361824,
-0.03135453164577484,
0.07229559123516083,
-0.1367242932319641,
-0.030369892716407776,
-0.009204603731632233,
0.02348402887582779,
-0.055709585547447205,
0.18471066653728485,
-0.15442892909049988,
-0.10740244388580322,
0.18175145983695984,
-0.12611038982868195,
-0.033091552555561066,
0.09125585108995438,
-0.007946422323584557,
0.045287489891052246,
0.1264892965555191,
0.1264597773551941,
-0.030601534992456436,
-0.1399904191493988,
-0.0037653720937669277,
-0.024577977135777473,
-0.02508366107940674,
0.053308628499507904,
0.036296721547842026,
-0.05332588404417038,
0.02434602938592434,
0.015002441592514515,
-0.04441913589835167,
-0.011929353699088097,
-0.03603582829236984,
-0.04408827796578407,
0.020283857360482216,
-0.054348934441804886,
-0.000033439951948821545,
-0.03311305120587349,
0.043059416115283966,
-0.012136098928749561,
-0.05804974585771561,
0.03652187064290047,
0.026333753019571304,
0.012099017389118671,
-0.004261663183569908,
-0.06713508814573288,
0.0802813246846199,
0.056271594017744064,
0.0033909676130861044,
-0.137124702334404,
-0.09264667332172394,
0.0755765438079834,
-0.07853160053491592,
0.1145675852894783,
-0.01357599813491106,
0.045946381986141205,
0.02315285988152027,
0.000022284273654804565,
0.03590352088212967,
0.06746399402618408,
-0.022443972527980804,
-0.04207047447562218,
-0.10339310765266418,
0.033160991966724396,
-0.007647693157196045,
0.0725369080901146,
-0.13140231370925903,
0.007432109676301479,
0.07845111191272736,
0.07114475965499878,
0.027260608971118927,
-0.03979269415140152,
-0.012048223987221718,
0.006054208148270845,
0.03492414206266403,
-0.04317411780357361,
-0.00026496537611819804,
-0.0032053047325462103,
-0.022426320239901543,
0.12122653424739838,
-0.17245954275131226,
0.06560248881578445,
0.08249256014823914,
0.07157687097787857,
-0.0785180926322937,
-0.06053810194134712,
-0.04144233092665672,
-0.01423724740743637,
-0.06251263618469238,
-0.005137997213751078,
0.1610635221004486,
-0.005432634614408016,
0.0640302300453186,
-0.0765225738286972,
-0.07742296159267426,
-0.009599332697689533,
-0.00672551617026329,
-0.05224671587347984,
0.1005157083272934,
0.042802758514881134,
-0.11605273187160492,
0.11844594031572342,
0.07650212943553925,
0.014274651184678078,
0.12627241015434265,
-0.024993080645799637,
-0.03940560296177864,
-0.015038824640214443,
0.0017456315690651536,
-0.002085513435304165,
0.008295761421322823,
-0.01098355371505022,
-0.03142470493912697,
0.052395109087228775,
0.02063739486038685,
-0.008957725018262863,
-0.11352464556694031,
0.03532526642084122,
0.004965638276189566,
-0.0274143498390913,
0.04943033680319786,
-0.005607430823147297,
0.0002786550030577928,
0.12013793736696243,
0.006194784305989742,
-0.011527384631335735,
-0.0036903235595673323,
-0.020500097423791885,
-0.1481229066848755,
0.16899973154067993,
-0.10568996518850327,
-0.2040064036846161,
-0.09643517434597015,
-0.01897272653877735,
-0.037687089294195175,
-0.018672099336981773,
0.020242495462298393,
-0.10844843834638596,
-0.09668955206871033,
-0.06415223330259323,
0.13949371874332428,
0.0014779639896005392,
-0.05351262167096138,
-0.04659716784954071,
-0.019464142620563507,
-0.021982470527291298,
-0.12538868188858032,
-0.03832350671291351,
-0.04428309202194214,
-0.126058891415596,
0.07285168766975403,
-0.03295260667800903,
0.047346048057079315,
0.16522030532360077,
0.030350415036082268,
0.008908321149647236,
-0.045600056648254395,
0.25642552971839905,
-0.12024044990539551,
0.08628951758146286,
0.1465730369091034,
0.03664668649435043,
0.07879842072725296,
0.19769011437892914,
0.041891805827617645,
-0.06365957111120224,
0.023422889411449432,
0.05361682549118996,
-0.035803262144327164,
-0.22656171023845673,
-0.10892799496650696,
-0.07126498967409134,
-0.031420402228832245,
0.035561271011829376,
0.0895063579082489,
0.06081307306885719,
0.022464420646429062,
-0.036265257745981216,
0.044206440448760986,
0.09196140617132187,
0.04497704654932022,
0.16083164513111115,
0.06540055572986603,
0.09440402686595917,
-0.02972625568509102,
-0.052978768944740295,
0.11475547403097153,
-0.04342770203948021,
0.26617082953453064,
0.08856607973575592,
0.054177653044462204,
0.060201484709978104,
0.04721783846616745,
0.056364648044109344,
-0.005144485738128424,
0.0062801418825984,
0.011237350292503834,
-0.017889948561787605,
-0.10360339283943176,
0.024896206334233284,
0.09495799243450165,
0.03521829843521118,
-0.06684784591197968,
0.048045117408037186,
-0.05023803561925888,
0.06970620900392532,
0.21154193580150604,
0.016363613307476044,
-0.22801484167575836,
-0.04730828478932381,
0.04301619902253151,
-0.08626101911067963,
-0.01995343156158924,
0.044167257845401764,
-0.01718955673277378,
-0.18302802741527557,
0.12147482484579086,
-0.00023880222579464316,
0.09314024448394775,
-0.05600394308567047,
0.020236041396856308,
0.04681320860981941,
-0.0022194532211869955,
-0.013471233658492565,
0.057958558201789856,
-0.21208693087100983,
0.27966421842575073,
0.023110629990696907,
0.009767933748662472,
-0.06973738223314285,
0.05044916644692421,
0.010956658981740475,
0.09726531058549881,
0.13181079924106598,
0.009511817246675491,
-0.06197619438171387,
-0.039546899497509,
-0.00931947585195303,
0.01961268112063408,
0.047993265092372894,
-0.03838638588786125,
0.05005504935979843,
-0.014319885522127151,
0.0035176503006368876,
-0.03289584070444107,
-0.008298169821500778,
-0.11822731047868729,
-0.10254693031311035,
0.018718162551522255,
-0.04391849786043167,
0.07035986334085464,
-0.036926109343767166,
-0.03827890753746033,
-0.18219013512134552,
0.13557347655296326,
-0.0399535670876503,
-0.047459013760089874,
-0.1114187091588974,
-0.05458657816052437,
0.06634637713432312,
-0.09258560836315155,
0.038513462990522385,
-0.031657446175813675,
0.08016344904899597,
-0.03779347985982895,
-0.10620396584272385,
0.08879686146974564,
-0.060173045843839645,
-0.12390629947185516,
-0.03760919347405434,
0.0895971953868866,
0.08906713128089905,
0.04931823909282684,
-0.01240785513073206,
0.07123037427663803,
-0.007492842618376017,
-0.09364841878414154,
0.034864574670791626,
0.10173247009515762,
0.05325967073440552,
0.017604777589440346,
-0.04485391080379486,
-0.1858062744140625,
-0.07707322388887405,
-0.06641511619091034,
0.03844355791807175,
0.18239769339561462,
-0.05451742932200432,
0.09611069411039352,
0.19538640975952148,
-0.0654715746641159,
-0.21249593794345856,
0.01728198677301407,
0.0027633458375930786,
0.03225351870059967,
0.07348297536373138,
-0.14292749762535095,
0.08122409880161285,
0.10208413749933243,
-0.06612742692232132,
-0.03999464213848114,
-0.2429766058921814,
-0.13105186820030212,
0.0899091437458992,
-0.008056257851421833,
-0.021050723269581795,
-0.15055377781391144,
-0.09043951332569122,
0.002967404667288065,
-0.17714808881282806,
0.06297899037599564,
-0.06742793321609497,
0.00035181400016881526,
0.01982170343399048,
0.1338064819574356,
0.004853986669331789,
-0.023094417527318,
0.11593599617481232,
0.07289852946996689,
0.047957923263311386,
-0.08975835889577866,
-0.07953692227602005,
0.06169220805168152,
-0.034769028425216675,
0.07112081348896027,
0.05143536627292633,
0.01032843068242073,
-0.0925220176577568,
-0.03891393169760704,
-0.07384061068296432,
0.07209199666976929,
-0.01163915079087019,
-0.0601377971470356,
-0.0918421819806099,
0.08223114162683487,
0.045159753412008286,
-0.054068729281425476,
0.017913706600666046,
-0.06739997863769531,
0.04293309152126312,
0.15424881875514984,
0.09162069112062454,
0.012246658094227314,
-0.04907925799489021,
-0.033080898225307465,
-0.012294302694499493,
0.05978638306260109,
-0.09766604006290436,
0.03571934625506401,
0.09773716330528259,
0.009180725552141666,
0.13462823629379272,
-0.0020904159173369408,
-0.08368853479623795,
0.017451219260692596,
0.04027509689331055,
-0.0630423054099083,
-0.1451577991247177,
0.009118565358221531,
-0.03590516746044159,
-0.03214015066623688,
0.004384112078696489,
0.06574301421642303,
-0.03257719799876213,
-0.03751321882009506,
-0.0036689341068267822,
0.070838063955307,
-0.020470168441534042,
0.10972293466329575,
-0.03010147251188755,
0.02418150007724762,
-0.108231320977211,
0.06665679067373276,
0.12051873654127121,
-0.01940908655524254,
-0.015460354276001453,
0.10756385326385498,
-0.09340140968561172,
-0.0392950177192688,
0.05638553202152252,
0.0731365904211998,
-0.09377939254045486,
-0.05373426526784897,
0.02766534313559532,
-0.1744004338979721,
0.051810216158628464,
0.07244839519262314,
0.025468194857239723,
0.0064530097879469395,
-0.009071160107851028,
0.006135639268904924,
-0.009271879680454731,
0.04998216778039932,
0.06771746277809143,
0.013839109800755978,
-0.06506787240505219,
0.019206855446100235,
0.010534295812249184,
0.022289715707302094,
-0.03831375017762184,
-0.04393872991204262,
-0.13863882422447205,
0.02242378517985344,
-0.05730850622057915,
0.03197634592652321,
-0.05563293769955635,
0.021658727899193764,
-0.019688675180077553,
-0.03187926858663559,
-0.052410561591386795,
0.0306034367531538,
-0.09037370979785919,
0.022319737821817398,
-0.04329754039645195,
0.1009320393204689,
-0.14447319507598877,
-0.004670352209359407,
0.08149509131908417,
-0.04173865169286728,
0.1134737953543663,
0.0058142393827438354,
-0.035773321986198425,
0.07927058637142181,
-0.14577539265155792,
0.0037602391093969345,
0.07375428080558777,
0.03502257913351059,
0.042750533670186996,
-0.01016149390488863,
0.0038726336788386106,
0.019276153296232224,
-0.024692999199032784,
0.020603716373443604,
-0.05950658768415451,
-0.07029100507497787,
0.06575523316860199,
-0.08164168894290924,
-0.11950164288282394,
-0.04140710085630417,
0.04652617499232292,
0.02181324176490307,
-0.03447870537638664,
0.08034802973270416,
-0.09452120959758759,
0.007888711988925934,
-0.12039610743522644,
-0.036346159875392914,
0.03128500282764435,
-0.04430770501494408,
-0.13649962842464447,
-0.08580911159515381,
0.08630634844303131,
0.02980109304189682,
0.23397429287433624,
0.0574183389544487,
0.03632911667227745,
0.06379171460866928,
0.06302592903375626,
-0.03212656080722809,
0.011418030597269535,
0.03305355831980705,
0.02368815988302231,
0.020261092111468315,
0.005406651180237532,
-0.02392551861703396,
-0.07565721869468689,
-0.0611310638487339,
0.17738144099712372,
0.10001011192798615,
-0.005191467236727476,
0.017588479444384575,
0.06491811573505402,
-0.06092660501599312,
-0.0994095429778099,
-0.015900129452347755,
-0.14454936981201172,
0.05430465191602707,
-0.07288306951522827,
0.09169535338878632,
0.15633955597877502,
-0.1282610446214676,
0.08520830422639847,
-0.03266075998544693,
-0.03790014982223511,
-0.13737338781356812,
-0.16458170115947723,
-0.09194689244031906,
-0.10837586969137192,
0.030843380838632584,
-0.06346046179533005,
0.07627130299806595,
0.09979195892810822,
0.059340544044971466,
-0.009843844920396805,
0.03683391213417053,
-0.07606521248817444,
-0.07912440598011017,
0.04887143522500992,
-0.010597600601613522,
0.01231355220079422,
-0.056353069841861725,
0.022042810916900635,
-0.01850193738937378,
-0.010676936246454716,
0.04865233600139618,
0.050304871052503586,
-0.019075937569141388,
0.0030348049476742744,
-0.06881946325302124,
-0.01482284627854824,
0.0015861508436501026,
-0.0016585905104875565,
-0.0813286229968071,
0.07066258788108826,
0.040311262011528015,
-0.007504800334572792,
0.02248830534517765,
0.12797723710536957,
-0.037317629903554916,
-0.1302035003900528,
-0.11019953340291977,
0.27139049768447876,
-0.02480006404221058,
0.06801552325487137,
0.02510761469602585,
-0.07894565165042877,
-0.01140967570245266,
0.23096215724945068,
0.19390444457530975,
-0.021373828873038292,
0.01533462293446064,
0.0011883398983627558,
0.015630880370736122,
0.01723954640328884,
0.07713797688484192,
0.02160579338669777,
0.2937866449356079,
-0.06607422232627869,
0.08034064620733261,
-0.015579679980874062,
-0.04045481234788895,
-0.09348249435424805,
0.10353802889585495,
-0.0015620412304997444,
-0.019191043451428413,
-0.03960825875401497,
0.1016298308968544,
-0.05863643065094948,
-0.1673031449317932,
0.04775485768914223,
-0.07357002794742584,
-0.09929300844669342,
0.013919465243816376,
-0.04485589265823364,
0.047213297337293625,
0.06467166543006897,
0.015012620016932487,
-0.06700614839792252,
0.10757103562355042,
0.046174779534339905,
-0.14045493304729462,
-0.10818330198526382,
0.09895221889019012,
-0.036350708454847336,
0.17657729983329773,
0.021071771159768105,
0.10755511373281479,
0.04991159215569496,
0.001436908612959087,
-0.055456794798374176,
0.11005790531635284,
0.05062015727162361,
-0.09619778394699097,
-0.032434217631816864,
0.004044003784656525,
0.018374620005488396,
0.061416205018758774,
0.07816598564386368,
-0.0492122620344162,
0.0717773362994194,
0.00981146190315485,
-0.0035437874030321836,
-0.09794802963733673,
0.05592536926269531,
-0.0718761682510376,
0.10668913275003433,
0.1344834268093109,
0.025015920400619507,
0.026529060676693916,
-0.060541313141584396,
0.05941973626613617,
0.011827412992715836,
0.08308041840791702,
-0.008820767514407635,
-0.14149720966815948,
0.017850330099463463,
-0.030187536031007767,
0.08736298978328705,
-0.1272355169057846,
-0.07496645301580429,
-0.06364516168832779,
0.0291595458984375,
-0.07695141434669495,
0.13497449457645416,
0.08144250512123108,
-0.0025305638555437326,
-0.007815673016011715,
-0.25145378708839417,
0.03684575855731964,
0.13979026675224304,
-0.06360506266355515,
-0.018469247967004776
] |
null | null | transformers |
# Introduction
This model, vistagi/Mixtral-8x7b-v0.1-sft, is trained on the Ultrachat-200K dataset through supervised fine-tuning, using Mixtral-8x7b-v0.1 as the base model.
Training is done in bfloat16 precision using LoRA.
## Details
Libraries used (a setup sketch follows this list):
- torch
- deepspeed
- pytorch lightning
- transformers
- peft
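
The card names the libraries above but not the training script itself, so the following is only a minimal sketch of the stated configuration: the base model loaded in bfloat16 with `transformers` and wrapped with a LoRA adapter from `peft`, plus the SFT split of Ultrachat-200K. The LoRA rank, alpha, dropout, and target modules are illustrative assumptions (the card does not state them), and the actual run's PyTorch Lightning / DeepSpeed training loop is not shown.

```python
# Illustrative sketch only; not the authors' training code.
import torch
from datasets import load_dataset
from peft import LoraConfig, get_peft_model
from transformers import AutoModelForCausalLM, AutoTokenizer

base_id = "mistralai/Mixtral-8x7B-v0.1"  # assumed Hub id of the base model named in the card

tokenizer = AutoTokenizer.from_pretrained(base_id)
model = AutoModelForCausalLM.from_pretrained(
    base_id,
    torch_dtype=torch.bfloat16,  # bfloat16 precision, as stated above
    device_map="auto",
)

# LoRA adapter; these hyperparameters are assumed, not taken from the card.
lora_cfg = LoraConfig(
    r=16,
    lora_alpha=32,
    lora_dropout=0.05,
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj"],
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, lora_cfg)
model.print_trainable_parameters()

# Supervised fine-tuning split of Ultrachat-200K (split name from the dataset card).
train_ds = load_dataset("HuggingFaceH4/ultrachat_200k", split="train_sft")
```

From here, the adapter would be trained with the stack listed above (PyTorch Lightning for the loop, DeepSpeed for sharding), which is outside the scope of this sketch.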
| {"language": ["en"], "license": "apache-2.0", "datasets": ["HuggingFaceH4/ultrachat_200k"]} | text-generation | vistagi/Mixtral-8x7b-v0.1-sft | [
"transformers",
"safetensors",
"mixtral",
"text-generation",
"en",
"dataset:HuggingFaceH4/ultrachat_200k",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | 2024-02-14T17:41:16+00:00 | [] | [
"en"
] | TAGS
#transformers #safetensors #mixtral #text-generation #en #dataset-HuggingFaceH4/ultrachat_200k #license-apache-2.0 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
# Introduction
This model, vistagi/Mixtral-8x7b-v0.1-sft, is trained on the Ultrachat-200K dataset through supervised fine-tuning, using Mixtral-8x7b-v0.1 as the base model.
Training is done in bfloat16 precision using LoRA.
## Details
Libraries used:
- torch
- deepspeed
- pytorch lightning
- transformers
- peft
| [
"# Introduction\nThis model vistagi/Mixtral-8x7b-v0.1-sft is trained with Ultrachat-200K dataset through supervised finetuning using Mixtral-8x7b-v0.1 as the baseline model.\nThe training is done with bfloat16 precision using LoRA.",
"## Details\nUsed Librarys\n- torch\n- deepspeed\n- pytorch lightning\n- transformers\n- peft"
] | [
"TAGS\n#transformers #safetensors #mixtral #text-generation #en #dataset-HuggingFaceH4/ultrachat_200k #license-apache-2.0 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n",
"# Introduction\nThis model vistagi/Mixtral-8x7b-v0.1-sft is trained with Ultrachat-200K dataset through supervised finetuning using Mixtral-8x7b-v0.1 as the baseline model.\nThe training is done with bfloat16 precision using LoRA.",
"## Details\nUsed Librarys\n- torch\n- deepspeed\n- pytorch lightning\n- transformers\n- peft"
] | [
73,
69,
24
] | [
"passage: TAGS\n#transformers #safetensors #mixtral #text-generation #en #dataset-HuggingFaceH4/ultrachat_200k #license-apache-2.0 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n# Introduction\nThis model vistagi/Mixtral-8x7b-v0.1-sft is trained with Ultrachat-200K dataset through supervised finetuning using Mixtral-8x7b-v0.1 as the baseline model.\nThe training is done with bfloat16 precision using LoRA.## Details\nUsed Librarys\n- torch\n- deepspeed\n- pytorch lightning\n- transformers\n- peft"
] | [
-0.09817255288362503,
-0.0018793162889778614,
-0.00037362496368587017,
0.07763585448265076,
0.11094570904970169,
0.016183873638510704,
0.13136892020702362,
0.056100260466337204,
-0.08558167517185211,
-0.03679661452770233,
0.10784606635570526,
0.08328355848789215,
0.02471780776977539,
0.15569433569908142,
-0.0907125249505043,
-0.17424441874027252,
0.035611238330602646,
0.0144552793353796,
-0.035358935594558716,
0.10511649399995804,
0.12216523289680481,
-0.09440134465694427,
0.06694541871547699,
-0.051895782351493835,
-0.17486943304538727,
0.018308313563466072,
-0.0013709991471841931,
-0.0845545306801796,
0.1438961774110794,
0.1233416348695755,
0.07955071330070496,
0.01352099608629942,
0.06797154992818832,
-0.1096629649400711,
0.051495566964149475,
0.009207290597259998,
-0.01750832609832287,
0.07757739722728729,
0.038936588913202286,
-0.007145592477172613,
0.13641661405563354,
-0.04269285500049591,
0.04221692308783531,
0.02635415643453598,
-0.07101287692785263,
-0.12590865790843964,
-0.04578803852200508,
0.028619199991226196,
0.10524524748325348,
0.08465386182069778,
0.01814029924571514,
0.14271114766597748,
-0.024492869153618813,
0.12455184757709503,
0.12296037375926971,
-0.2886030972003937,
-0.09906899183988571,
0.05675322934985161,
0.03388233482837677,
0.0794529840350151,
-0.07786227017641068,
0.032973118126392365,
0.07139725238084793,
0.012980707921087742,
0.14044253528118134,
-0.05058656632900238,
-0.2329152226448059,
-0.007769792340695858,
-0.10210531204938889,
0.005447770934551954,
0.2557879388332367,
-0.027227789163589478,
-0.08837805688381195,
-0.0930733010172844,
-0.10489189624786377,
0.027287302538752556,
-0.05597289651632309,
0.006761624943464994,
-0.019109368324279785,
0.0018588437233120203,
-0.0061893705278635025,
-0.015910020098090172,
-0.10032255947589874,
-0.09329135715961456,
-0.030834129080176353,
0.1674615442752838,
0.008014818653464317,
0.07188200950622559,
-0.07754158973693848,
0.10789308696985245,
0.01566649228334427,
-0.11554522067308426,
-0.014953292906284332,
-0.05249942094087601,
0.023568114265799522,
-0.0052559711039066315,
-0.07049049437046051,
-0.04350951686501503,
0.07744020223617554,
0.1468462347984314,
0.04375264793634415,
0.013809608295559883,
0.006942952051758766,
0.04002835601568222,
-0.029072221368551254,
0.06207166239619255,
0.03288252279162407,
-0.13101425766944885,
0.1514102816581726,
-0.013990337960422039,
0.12848760187625885,
-0.017803331837058067,
-0.053237222135066986,
-0.08993442356586456,
0.033013127744197845,
0.07003520429134369,
-0.019336774945259094,
0.018092362210154533,
-0.0003936226130463183,
0.011121876537799835,
0.06076914817094803,
-0.11779513210058212,
-0.007652971427887678,
-0.008905550464987755,
-0.019771307706832886,
-0.036041807383298874,
0.10277891159057617,
-0.024514703080058098,
-0.016760075464844704,
0.07761650532484055,
-0.026682522147893906,
-0.017501918599009514,
-0.07508756220340729,
-0.09683425724506378,
0.03236893564462662,
0.008709192276000977,
0.034159522503614426,
-0.1769552379846573,
-0.25171059370040894,
0.04373328760266304,
0.046063829213380814,
-0.0388200581073761,
-0.028211554512381554,
-0.027576372027397156,
-0.023211773484945297,
0.051313284784555435,
-0.023737037554383278,
0.08430313318967819,
-0.0429290235042572,
0.08368480950593948,
0.06261841952800751,
0.06860581040382385,
-0.18908481299877167,
0.007596477400511503,
-0.11136948317289352,
0.020305708050727844,
-0.06050916388630867,
0.02498931996524334,
-0.04465381056070328,
0.05979574844241142,
-0.07038465887308121,
-0.00874695461243391,
0.02001023106276989,
0.022891851142048836,
0.06998863816261292,
0.1428796350955963,
-0.17839013040065765,
-0.006960035767406225,
0.11153876781463623,
-0.18278570473194122,
-0.1470162272453308,
0.14024995267391205,
-0.04181358590722084,
0.08532465249300003,
0.08009196817874908,
0.038985755294561386,
0.17183740437030792,
-0.04164257273077965,
-0.012589170597493649,
0.06723561137914658,
0.021342352032661438,
-0.13496004045009613,
0.05871974676847458,
0.11093292385339737,
-0.14601655304431915,
0.06016531214118004,
-0.02146853692829609,
0.08382567763328552,
-0.02844746597111225,
-0.09158942848443985,
-0.11352447420358658,
-0.08055245876312256,
0.06189033389091492,
-0.038242194801568985,
0.040153734385967255,
-0.07494550198316574,
0.0008146626059897244,
0.004572645761072636,
0.14499056339263916,
-0.04520515352487564,
0.011260398663580418,
-0.033942777663469315,
0.023281484842300415,
-0.07081305980682373,
0.04585794359445572,
-0.11219924688339233,
-0.05365364998579025,
-0.03275274485349655,
0.10184243321418762,
-0.020629899576306343,
-0.04411882534623146,
0.07100075483322144,
0.11966577917337418,
-0.04913850873708725,
-0.03693745285272598,
0.0418258011341095,
-0.008605149574577808,
-0.06343045085668564,
-0.1100677102804184,
-0.02260085940361023,
-0.07094477862119675,
0.06797998398542404,
-0.16274525225162506,
0.04347561299800873,
-0.07110827416181564,
0.047079019248485565,
0.0018445859896019101,
0.037644512951374054,
-0.0029286362696439028,
0.036923207342624664,
-0.043630924075841904,
-0.060108330100774765,
0.05202750116586685,
0.05305342748761177,
-0.10327435284852982,
0.0039008776657283306,
-0.07845201343297958,
0.2646404206752777,
0.13302123546600342,
0.025604398921132088,
-0.015180602669715881,
-0.04221661016345024,
-0.011919470503926277,
-0.015860890969634056,
-0.09098173677921295,
0.015424251556396484,
-0.005724403541535139,
-0.0002298822219017893,
0.13653141260147095,
-0.07085525989532471,
-0.008409567177295685,
-0.0005112977232784033,
-0.04941077530384064,
0.023591741919517517,
0.07555458694696426,
0.022426841780543327,
-0.1245766133069992,
0.0894269198179245,
0.17927156388759613,
-0.08042369782924652,
0.12729832530021667,
-0.02679186500608921,
-0.021280338987708092,
0.022776160389184952,
0.04180189594626427,
0.0027897104155272245,
0.09880022704601288,
-0.02395954728126526,
0.03708859160542488,
0.04799453541636467,
-0.04199904948472977,
-0.008465562015771866,
-0.15562869608402252,
-0.019230706617236137,
0.011799653060734272,
-0.014878617599606514,
0.021362008526921272,
0.04861179739236832,
-0.05599889159202576,
0.09223582595586777,
-0.06359829008579254,
-0.08270218223333359,
0.05958576500415802,
0.005339480936527252,
-0.0820491835474968,
0.18202944099903107,
-0.08699500560760498,
-0.2190963625907898,
-0.17482967674732208,
-0.03649190813302994,
-0.07906357198953629,
-0.009742475114762783,
0.0860135406255722,
-0.03279285132884979,
-0.07478715479373932,
-0.0803079605102539,
-0.004009217023849487,
0.04669922962784767,
-0.005371618550270796,
-0.012005402706563473,
0.00986835639923811,
0.016622653231024742,
-0.15914981067180634,
0.009708329103887081,
0.015709515661001205,
-0.05312018468976021,
0.09607192128896713,
0.03815142810344696,
0.08400626480579376,
0.06296220421791077,
-0.018531562760472298,
-0.008764653466641903,
-0.012303624302148819,
0.21943990886211395,
-0.016277577728033066,
0.029369613155722618,
0.2549877464771271,
-0.03492116928100586,
0.03281083703041077,
0.09910967200994492,
0.034906912595033646,
-0.09210210293531418,
0.05706506967544556,
-0.08085449784994125,
-0.056764792650938034,
-0.134910449385643,
-0.10533525794744492,
-0.07152443379163742,
0.04632072150707245,
0.05844530463218689,
0.06840726733207703,
-0.0313430093228817,
0.12107548117637634,
-0.03767341375350952,
0.03858625143766403,
0.05213698372244835,
0.09097666293382645,
0.11274728178977966,
0.015157767571508884,
0.11983369290828705,
-0.10999102890491486,
-0.040598008781671524,
0.06822182983160019,
-0.006773042492568493,
0.14218226075172424,
0.0009870362700894475,
0.02425110712647438,
0.002928702160716057,
0.16474969685077667,
0.0786147490143776,
0.19399848580360413,
0.022115299478173256,
-0.026386762037873268,
-0.030139371752738953,
-0.0722617655992508,
-0.09415226429700851,
0.004663437604904175,
-0.20425845682621002,
0.11926083266735077,
-0.039879266172647476,
0.05435362458229065,
0.03559248149394989,
0.21141405403614044,
0.011872011236846447,
-0.30617693066596985,
-0.07981850951910019,
0.051314420998096466,
0.0023412774316966534,
-0.10200725495815277,
0.012710487470030785,
0.1067575141787529,
-0.07657033205032349,
0.032097671180963516,
-0.08094997704029083,
0.10131629556417465,
0.028891490772366524,
0.0014376073377206922,
-0.09755577147006989,
0.07350742816925049,
-0.016507670283317566,
0.09035752713680267,
-0.29492950439453125,
0.0995788648724556,
0.012028088793158531,
0.1224541887640953,
-0.04330596700310707,
0.008165331557393074,
0.04968960955739021,
0.15972989797592163,
0.08340271562337875,
-0.012538119219243526,
-0.0189654678106308,
-0.06198856234550476,
-0.06440699845552444,
0.0726667046546936,
-0.025807766243815422,
0.04831906408071518,
0.06426568329334259,
-0.07640928030014038,
-0.007638199254870415,
0.03778696432709694,
0.06803644448518753,
-0.09406650066375732,
-0.14092563092708588,
-0.050743598490953445,
0.11080237478017807,
0.0563921183347702,
-0.0854320079088211,
-0.042464159429073334,
0.0321798101067543,
0.1046568974852562,
0.00766321225091815,
-0.062261585146188736,
-0.12378524988889694,
-0.0017731167608872056,
0.05555472522974014,
-0.04470941051840782,
0.07151234894990921,
0.0031206635758280754,
0.11850225180387497,
-0.08073224872350693,
-0.15725918114185333,
0.020755615085363388,
-0.15102322399616241,
-0.07938817143440247,
-0.003239189740270376,
0.07365305721759796,
-0.04783867299556732,
-0.021375801414251328,
0.048055499792099,
0.013307304121553898,
-0.06328458338975906,
-0.11056100577116013,
-0.03579651564359665,
0.11588985472917557,
-0.031222855672240257,
0.027210287749767303,
-0.04691290110349655,
-0.15951314568519592,
0.010425498709082603,
-0.05440966784954071,
0.1758631467819214,
0.19743342697620392,
-0.04854946210980415,
0.08009898662567139,
0.12031232565641403,
-0.04373549669981003,
-0.24555310606956482,
-0.08197807520627975,
-0.03927525505423546,
0.019464246928691864,
-0.00031690779724158347,
-0.09621583670377731,
0.16457070410251617,
0.03173685446381569,
-0.041172027587890625,
0.1059289425611496,
-0.27714502811431885,
-0.10475993156433105,
0.18860189616680145,
0.15119370818138123,
0.32489585876464844,
-0.10682718455791473,
-0.030215825885534286,
-0.18015725910663605,
-0.11762140691280365,
0.12116251140832901,
-0.19384253025054932,
0.08841355890035629,
-0.07269701361656189,
-0.013479609042406082,
-0.004194281995296478,
-0.02398265339434147,
0.11879397928714752,
-0.10507184267044067,
0.11083248257637024,
-0.06985559314489365,
0.07232822477817535,
0.03879176825284958,
-0.039966437965631485,
0.09491576999425888,
-0.18328934907913208,
0.07049333304166794,
-0.048415932804346085,
-0.019014611840248108,
0.0023585213348269463,
0.0885133296251297,
-0.0047447336837649345,
-0.04867178946733475,
-0.03460675850510597,
-0.03163735941052437,
-0.01175110787153244,
-0.02353442832827568,
0.041558638215065,
0.03378775343298912,
0.05695908144116402,
0.12185441702604294,
0.07446786016225815,
-0.004758247174322605,
-0.02511838637292385,
-0.01298062689602375,
-0.04738710820674896,
0.1102546975016594,
-0.11490287631750107,
0.030904345214366913,
0.060965247452259064,
0.006487667094916105,
0.06220148876309395,
0.05718007683753967,
0.030576717108488083,
0.0052277580834925175,
0.07935589551925659,
-0.17521698772907257,
-0.07687842100858688,
-0.04331094026565552,
0.07350559532642365,
-0.02968801185488701,
0.07520433515310287,
0.154060497879982,
-0.12690474092960358,
-0.020320257171988487,
-0.029446810483932495,
0.023645225912332535,
-0.05599817633628845,
0.15374186635017395,
-0.0043539972975850105,
0.046283528208732605,
-0.1246153861284256,
0.11873040348291397,
-0.0005277332384139299,
0.03878416866064072,
-0.0009548835805617273,
0.09053714573383331,
-0.10865556448698044,
-0.07979561388492584,
0.07559428364038467,
0.14604419469833374,
-0.09104310721158981,
-0.09009984135627747,
-0.07186006754636765,
-0.1390073150396347,
0.009073308669030666,
0.1585208922624588,
0.08366739004850388,
0.001832474721595645,
-0.0619066022336483,
-0.022833647206425667,
-0.0822003036737442,
0.1020212471485138,
0.05057111755013466,
0.07954438030719757,
-0.14266268908977509,
0.07188518345355988,
-0.019326606765389442,
0.012270750477910042,
-0.04567752033472061,
-0.01599581353366375,
-0.07923684269189835,
0.017894823104143143,
-0.1880309134721756,
0.03250462934374809,
-0.04550972208380699,
0.026955602690577507,
-0.023890383541584015,
0.016951652243733406,
-0.006745229475200176,
0.025830229744315147,
-0.03820659965276718,
0.00986820925027132,
-0.016162369400262833,
0.015149310231208801,
-0.11424043774604797,
-0.04145945608615875,
0.01717863231897354,
-0.07220683991909027,
0.056815482676029205,
0.06504285335540771,
-0.026363715529441833,
0.06997713446617126,
-0.18921126425266266,
-0.0749376118183136,
0.0951576679944992,
-0.006798420567065477,
-0.003918858245015144,
-0.04171979799866676,
0.008990470319986343,
0.05391771346330643,
-0.0007920676725916564,
-0.00022732897195965052,
0.13253693282604218,
-0.10428906977176666,
-0.0325777642428875,
-0.09236891567707062,
-0.04863667115569115,
-0.07539157569408417,
-0.011755137704312801,
0.1041010171175003,
0.08133363723754883,
0.1934732347726822,
-0.1168757975101471,
-0.007655244320631027,
-0.1430724561214447,
0.006882248446345329,
-0.02985924482345581,
-0.14591240882873535,
-0.20699672400951385,
-0.06375002861022949,
0.03595494478940964,
-0.0023528370074927807,
0.07602435350418091,
-0.016662653535604477,
-0.04336703196167946,
-0.019822411239147186,
-0.003030573483556509,
-0.025590818375349045,
-0.0004082595696672797,
0.34686601161956787,
0.05209844186902046,
0.016711264848709106,
-0.03744250535964966,
0.06573088467121124,
0.1083989068865776,
0.04760325700044632,
0.04851822927594185,
0.14192327857017517,
-0.04525839537382126,
0.1491236388683319,
0.010202444158494473,
-0.0489431694149971,
0.04411773383617401,
-0.05788165703415871,
-0.005149430595338345,
0.05111963674426079,
-0.032609954476356506,
0.11929653584957123,
0.16370238363742828,
-0.08486757427453995,
-0.004245375283062458,
-0.029408907517790794,
-0.048527300357818604,
-0.12368453294038773,
-0.0662161111831665,
-0.09916989505290985,
-0.12559540569782257,
-0.007505293004214764,
-0.11771724373102188,
-0.011219135485589504,
0.005923568271100521,
0.018176458775997162,
-0.033447638154029846,
0.11087796092033386,
0.06056618690490723,
-0.02636113576591015,
0.0640614777803421,
-0.018640225753188133,
-0.006114432588219643,
0.014571333304047585,
-0.043374862521886826,
0.013734863139688969,
-0.04099273681640625,
0.008250278420746326,
0.0200995821505785,
0.025168728083372116,
0.08582521229982376,
-0.030220089480280876,
-0.08339273929595947,
-0.009027466177940369,
0.04393557086586952,
0.06123512610793114,
0.1131550669670105,
0.05647284537553787,
-0.024624118581414223,
0.019982459023594856,
0.27317097783088684,
-0.08194303512573242,
-0.16169321537017822,
-0.06889358162879944,
0.12272590398788452,
-0.01684761233627796,
0.02864275500178337,
0.009480527602136135,
-0.02775101363658905,
0.015512646175920963,
0.14228089153766632,
0.16816411912441254,
-0.04495704174041748,
0.027127500623464584,
-0.06394173204898834,
-0.009376289322972298,
-0.04227045923471451,
0.16511638462543488,
0.12994439899921417,
0.1637834757566452,
-0.08927295356988907,
0.05230351909995079,
-0.04914778470993042,
-0.02825913205742836,
-0.05791671201586723,
0.07830913364887238,
-0.052771467715501785,
-0.03289749100804329,
-0.01134448777884245,
0.06587466597557068,
-0.016770683228969574,
-0.06068721041083336,
0.017210280522704124,
-0.04306255653500557,
-0.08375927805900574,
-0.05470467731356621,
0.027382267639040947,
0.027588119730353355,
0.028393134474754333,
-0.06285054981708527,
0.05196195840835571,
0.06350768357515335,
-0.006658349186182022,
-0.09197759628295898,
-0.07963971048593521,
0.07683524489402771,
-0.09688644111156464,
0.12537425756454468,
0.011262374930083752,
0.06975618749856949,
0.055712927132844925,
0.01032657828181982,
-0.11622647196054459,
0.133940652012825,
-0.03111962042748928,
-0.061442624777555466,
0.09462877362966537,
0.0607471689581871,
-0.03529467061161995,
0.07889015227556229,
-0.0007749907672405243,
-0.08617661893367767,
-0.005236166529357433,
0.08299898356199265,
-0.044304102659225464,
-0.07209818810224533,
0.028692137449979782,
-0.055776067078113556,
0.11395730823278427,
0.08863814175128937,
-0.0573912039399147,
0.003581579774618149,
-0.06924674659967422,
0.10595669597387314,
0.047627344727516174,
-0.022830810397863388,
0.03852437064051628,
-0.14503392577171326,
-0.009247432462871075,
0.06646700948476791,
0.004347987473011017,
-0.2328895926475525,
-0.07005245983600616,
-0.13095292448997498,
-0.05671849846839905,
-0.09415130317211151,
0.043779559433460236,
0.18598835170269012,
0.0007648827158845961,
-0.027868706732988358,
-0.16213999688625336,
-0.0425870418548584,
0.08841657638549805,
-0.0729689821600914,
-0.08549347519874573
] |
null | null | peft |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# llama2-7b-None-test
This model is a fine-tuned version of [meta-llama/Llama-2-7b-hf](https://huggingface.co/meta-llama/Llama-2-7b-hf) on an unknown dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (a `TrainingArguments` sketch follows the list):
- learning_rate: 0.0003
- train_batch_size: 4
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 32
- total_train_batch_size: 128
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 10
- num_epochs: 3
- mixed_precision_training: Native AMP
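
As a reproduction aid, the sketch below maps the list above onto `transformers.TrainingArguments`. It is illustrative rather than the card's actual script: the output directory is assumed, `fp16=True` is taken as the meaning of "Native AMP", and the PEFT/LoRA adapter settings and dataset are not specified by the card.

```python
# Hypothetical TrainingArguments mirroring the hyperparameters listed above.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="llama2-7b-None-test",   # assumed output path
    learning_rate=3e-4,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=8,
    gradient_accumulation_steps=32,     # 4 x 32 = 128 effective train batch size
    seed=42,
    lr_scheduler_type="linear",
    warmup_steps=10,
    num_train_epochs=3,
    fp16=True,                          # "Native AMP" mixed precision
    adam_beta1=0.9,                     # Adam betas / epsilon as listed above
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```

Together with a `peft`-wrapped base model and a dataset, these arguments would typically be passed to a `transformers.Trainer`.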
### Training results
### Framework versions
- PEFT 0.8.2
- Transformers 4.37.2
- Pytorch 2.1.2
- Datasets 2.14.6
- Tokenizers 0.15.1 | {"library_name": "peft", "tags": ["generated_from_trainer"], "base_model": "meta-llama/Llama-2-7b-hf", "model-index": [{"name": "llama2-7b-None-test", "results": []}]} | null | th135/llama2-7b-None-test | [
"peft",
"safetensors",
"llama",
"generated_from_trainer",
"base_model:meta-llama/Llama-2-7b-hf",
"region:us"
] | 2024-02-14T17:46:04+00:00 | [] | [] | TAGS
#peft #safetensors #llama #generated_from_trainer #base_model-meta-llama/Llama-2-7b-hf #region-us
|
# llama2-7b-None-test
This model is a fine-tuned version of meta-llama/Llama-2-7b-hf on an unknown dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0003
- train_batch_size: 4
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 32
- total_train_batch_size: 128
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 10
- num_epochs: 3
- mixed_precision_training: Native AMP
### Training results
### Framework versions
- PEFT 0.8.2
- Transformers 4.37.2
- Pytorch 2.1.2
- Datasets 2.14.6
- Tokenizers 0.15.1 | [
"# llama2-7b-None-test\n\nThis model is a fine-tuned version of meta-llama/Llama-2-7b-hf on an unknown dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 0.0003\n- train_batch_size: 4\n- eval_batch_size: 8\n- seed: 42\n- gradient_accumulation_steps: 32\n- total_train_batch_size: 128\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_steps: 10\n- num_epochs: 3\n- mixed_precision_training: Native AMP",
"### Training results",
"### Framework versions\n\n- PEFT 0.8.2\n- Transformers 4.37.2\n- Pytorch 2.1.2\n- Datasets 2.14.6\n- Tokenizers 0.15.1"
] | [
"TAGS\n#peft #safetensors #llama #generated_from_trainer #base_model-meta-llama/Llama-2-7b-hf #region-us \n",
"# llama2-7b-None-test\n\nThis model is a fine-tuned version of meta-llama/Llama-2-7b-hf on an unknown dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 0.0003\n- train_batch_size: 4\n- eval_batch_size: 8\n- seed: 42\n- gradient_accumulation_steps: 32\n- total_train_batch_size: 128\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_steps: 10\n- num_epochs: 3\n- mixed_precision_training: Native AMP",
"### Training results",
"### Framework versions\n\n- PEFT 0.8.2\n- Transformers 4.37.2\n- Pytorch 2.1.2\n- Datasets 2.14.6\n- Tokenizers 0.15.1"
] | [
42,
41,
6,
12,
8,
3,
140,
4,
36
] | [
"passage: TAGS\n#peft #safetensors #llama #generated_from_trainer #base_model-meta-llama/Llama-2-7b-hf #region-us \n# llama2-7b-None-test\n\nThis model is a fine-tuned version of meta-llama/Llama-2-7b-hf on an unknown dataset.## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 0.0003\n- train_batch_size: 4\n- eval_batch_size: 8\n- seed: 42\n- gradient_accumulation_steps: 32\n- total_train_batch_size: 128\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_steps: 10\n- num_epochs: 3\n- mixed_precision_training: Native AMP### Training results### Framework versions\n\n- PEFT 0.8.2\n- Transformers 4.37.2\n- Pytorch 2.1.2\n- Datasets 2.14.6\n- Tokenizers 0.15.1"
] | [
-0.11682146787643433,
0.14590983092784882,
-0.004159829579293728,
0.08251394331455231,
0.10931850969791412,
0.009757233783602715,
0.1097850352525711,
0.1547539383172989,
-0.05355985090136528,
0.09694687277078629,
0.08782782405614853,
0.03372162953019142,
0.0549343004822731,
0.14364655315876007,
-0.024203255772590637,
-0.2358199656009674,
0.009646538645029068,
-0.046174563467502594,
-0.05567330867052078,
0.09678255766630173,
0.09419461339712143,
-0.08811954408884048,
0.06865741312503815,
0.009148212149739265,
-0.07683027535676956,
0.02905297465622425,
-0.03922457993030548,
-0.057066816836595535,
0.08032871782779694,
-0.0005293281865306199,
0.06882816553115845,
0.014235374517738819,
0.10452721267938614,
-0.19583672285079956,
-0.0013370568631216884,
0.07184384763240814,
0.04537520930171013,
0.0916002094745636,
0.08201558142900467,
-0.015344621613621712,
0.08475227653980255,
-0.16376090049743652,
0.08754750341176987,
0.029405122622847557,
-0.05585269629955292,
-0.16847430169582367,
-0.07853040099143982,
0.07118015736341476,
0.10004325956106186,
0.07989326119422913,
0.018168844282627106,
0.14981666207313538,
-0.083583764731884,
0.06668157875537872,
0.2644679546356201,
-0.2647570073604584,
-0.06568454951047897,
0.013737681321799755,
0.04017236828804016,
0.0698949545621872,
-0.11400043219327927,
-0.019864166155457497,
0.029001984745264053,
0.022520020604133606,
0.08823265135288239,
-0.004263428505510092,
-0.02259431779384613,
0.0020739452447742224,
-0.12333790957927704,
-0.012183199636638165,
0.13629817962646484,
0.039152685552835464,
-0.04366328567266464,
-0.12048772722482681,
-0.060925569385290146,
-0.11303478479385376,
-0.004478633403778076,
0.0003419026907067746,
0.027408046647906303,
-0.07485514879226685,
-0.04313141852617264,
-0.032875899225473404,
-0.042453132569789886,
-0.07122436165809631,
0.02231116034090519,
0.13179652392864227,
0.059704236686229706,
0.02217927947640419,
0.00907212682068348,
0.12353123724460602,
0.014562848024070263,
-0.13540419936180115,
-0.04287909343838692,
-0.012127470225095749,
-0.11242450773715973,
-0.03380420804023743,
-0.03397156670689583,
-0.022273985669016838,
0.01855793036520481,
0.14518393576145172,
-0.05828899145126343,
0.09701976925134659,
0.03365379199385643,
-0.006175790447741747,
-0.02643301524221897,
0.11521535366773605,
-0.0545172244310379,
-0.025269893929362297,
-0.003351435298100114,
0.11944984644651413,
0.022758815437555313,
0.00024888134794309735,
-0.04887549579143524,
-0.028186053037643433,
0.09479720145463943,
0.07528131455183029,
-0.054523032158613205,
0.0015736942877992988,
-0.0337601862847805,
-0.03157071769237518,
0.03502662852406502,
-0.15145914256572723,
0.028181185945868492,
0.014142468571662903,
-0.09668605029582977,
-0.014314573258161545,
0.010405465960502625,
-0.02700151316821575,
-0.04615052044391632,
0.07615760713815689,
-0.06903423368930817,
0.0036193314008414745,
-0.06229130178689957,
-0.03930889815092087,
0.010827343910932541,
-0.062158212065696716,
-0.026874488219618797,
-0.0626722201704979,
-0.19186429679393768,
-0.05078409984707832,
0.02688925340771675,
-0.08959618955850601,
-0.026928242295980453,
-0.02238885499536991,
-0.048265282064676285,
0.026788881048560143,
-0.01918199472129345,
0.10282734781503677,
-0.051510900259017944,
0.07878156751394272,
-0.004834381397813559,
0.03794745355844498,
0.08124438673257828,
0.026987293735146523,
-0.07764479517936707,
0.058353736996650696,
-0.13423749804496765,
0.08368987590074539,
-0.10060150921344757,
0.011292030103504658,
-0.13214053213596344,
-0.09342250227928162,
0.0036363534163683653,
-0.03900377079844475,
0.07498838752508163,
0.13807590305805206,
-0.15055304765701294,
-0.014277920126914978,
0.15530112385749817,
-0.08149342238903046,
-0.07423247396945953,
0.09249348193407059,
-0.03543560206890106,
-0.010731428861618042,
0.04377333074808121,
0.17590448260307312,
0.12464191764593124,
-0.16321800649166107,
-0.013878555968403816,
0.01574498414993286,
0.07694806903600693,
0.017410825937986374,
0.08926169574260712,
-0.012938644737005234,
0.020456477999687195,
0.014181015081703663,
-0.06542415171861649,
-0.007479270454496145,
-0.06799780577421188,
-0.07546057552099228,
-0.05743454396724701,
-0.08720909059047699,
0.04094209149479866,
0.014496915973722935,
0.03211505711078644,
-0.07225821167230606,
-0.11687695235013962,
0.08015687018632889,
0.15986359119415283,
-0.06340000778436661,
0.002442966913804412,
-0.06831390410661697,
0.06600413471460342,
-0.03329057618975639,
-0.038185734301805496,
-0.175712451338768,
-0.10287745296955109,
0.04151390120387077,
-0.10587222874164581,
0.016532734036445618,
-0.009538905695080757,
0.0708194226026535,
0.07274052500724792,
-0.044894639402627945,
-0.03813111037015915,
-0.06004445254802704,
-0.005197427235543728,
-0.09066388756036758,
-0.17024652659893036,
-0.054161686450242996,
-0.018027296289801598,
0.14838635921478271,
-0.214851513504982,
0.006092744879424572,
-0.009801879525184631,
0.1643153727054596,
0.025672685354948044,
-0.05715490132570267,
0.027894074097275734,
0.024056894704699516,
0.00597187876701355,
-0.10877923667430878,
0.03350827470421791,
-0.015932710841298103,
-0.09370269626379013,
-0.03858043625950813,
-0.12948305904865265,
0.06260988861322403,
0.038225606083869934,
0.12462399899959564,
-0.10044381022453308,
-0.07026772201061249,
-0.06381334364414215,
-0.05456990748643875,
-0.08628587424755096,
-0.0005839311052113771,
0.1827506422996521,
0.036778319627046585,
0.13008229434490204,
-0.083347387611866,
-0.0758441835641861,
-0.0009017029078677297,
0.002469068393111229,
-0.0010765026090666652,
0.1040940135717392,
0.016009045764803886,
-0.11848142743110657,
0.061747655272483826,
0.10151362419128418,
-0.06620652228593826,
0.14238446950912476,
-0.06542389839887619,
-0.10389016568660736,
-0.03700999170541763,
0.038745082914829254,
-0.008025867864489555,
0.11999905109405518,
-0.02088211104273796,
0.020636389032006264,
0.030109819024801254,
0.0224527046084404,
0.015059823170304298,
-0.15407899022102356,
-0.010045566596090794,
0.01871240697801113,
-0.04293033108115196,
0.008275263011455536,
0.004129388369619846,
0.03499910980463028,
0.08257029950618744,
0.014612335711717606,
-0.03520463407039642,
0.008443072438240051,
-0.01918388530611992,
-0.07589921355247498,
0.18638433516025543,
-0.09601858258247375,
-0.1416543573141098,
-0.10687705129384995,
0.07672005146741867,
-0.06361699104309082,
-0.05140808969736099,
0.008100937120616436,
-0.06291107833385468,
-0.04217949137091637,
-0.11274893581867218,
-0.06193620339035988,
-0.028415294364094734,
-0.008854412473738194,
0.024576911702752113,
0.01788455992937088,
0.11336208879947662,
-0.1119089126586914,
0.009280344471335411,
-0.0033568681683391333,
-0.04758533462882042,
0.0009047559578903019,
0.03919767588376999,
0.07161948084831238,
0.10361169278621674,
-0.0095775555819273,
0.03685048595070839,
-0.028140654787421227,
0.19995015859603882,
-0.07053499668836594,
-0.008386917412281036,
0.12611842155456543,
0.01638864539563656,
0.07245004177093506,
0.09188608080148697,
0.025119997560977936,
-0.07396675646305084,
0.014728630892932415,
0.07455268502235413,
-0.02178860828280449,
-0.242408886551857,
-0.026435567066073418,
-0.019791021943092346,
-0.058676011860370636,
0.11206810176372528,
0.06545761227607727,
-0.01787932589650154,
0.03870740532875061,
-0.01669510081410408,
-0.03379986807703972,
-0.014945773407816887,
0.08533697575330734,
0.09567335247993469,
0.04704395309090614,
0.09970260411500931,
-0.026980524882674217,
-0.018047239631414413,
0.043641697615385056,
0.0321720726788044,
0.23520879447460175,
-0.06328944116830826,
0.09354206919670105,
0.01371682807803154,
0.13466578722000122,
-0.037745650857686996,
0.025936467573046684,
0.02297341637313366,
-0.003187171183526516,
0.011996072717010975,
-0.08356678485870361,
-0.02346561662852764,
0.039523884654045105,
-0.00979714933782816,
0.048649851232767105,
-0.09902682155370712,
0.04082756116986275,
0.012405471876263618,
0.2784125804901123,
0.057471007108688354,
-0.28875240683555603,
-0.05715888366103172,
0.008105696178972721,
-0.04413970187306404,
-0.06500202417373657,
0.02373035065829754,
0.11765507608652115,
-0.13398973643779755,
0.08949410915374756,
-0.07555750012397766,
0.0780438631772995,
-0.05492105707526207,
-0.005087768193334341,
0.05190359801054001,
0.1203223168849945,
-0.010266385972499847,
0.09157814830541611,
-0.16343653202056885,
0.18846414983272552,
0.021057212725281715,
0.07974430918693542,
-0.05271632596850395,
0.017256271094083786,
0.021440688520669937,
0.05012587457895279,
0.11375085264444351,
-0.0025311103090643883,
-0.06373891234397888,
-0.15420353412628174,
-0.11989804357290268,
0.02764175273478031,
0.1071455329656601,
-0.031056959182024002,
0.0808100625872612,
-0.04799030348658562,
-0.00598652008920908,
0.01373850554227829,
-0.09414330124855042,
-0.13893264532089233,
-0.10370541363954544,
0.03829297050833702,
0.012630906887352467,
-0.027853410691022873,
-0.09142975509166718,
-0.10468629002571106,
-0.02461797185242176,
0.14938321709632874,
-0.011568118818104267,
-0.07172207534313202,
-0.14871695637702942,
0.03506765514612198,
0.15726035833358765,
-0.06400211900472641,
-0.00129093904979527,
0.009365895763039589,
0.1159263625741005,
0.04837876930832863,
-0.07122916728258133,
0.05257679894566536,
-0.05063614249229431,
-0.1886415332555771,
-0.05530384182929993,
0.14829863607883453,
0.0488368421792984,
0.04730286821722984,
-0.011188888922333717,
0.011624089442193508,
0.01717543788254261,
-0.08163833618164062,
0.014106429181993008,
0.08901964873075485,
0.05462713912129402,
0.06396318227052689,
-0.05536866933107376,
0.10608814656734467,
-0.03627274930477142,
-0.0069380151107907295,
0.09461550414562225,
0.22771692276000977,
-0.07886537164449692,
0.10870535671710968,
0.04351270571351051,
-0.0603344589471817,
-0.1536066234111786,
0.014055910520255566,
0.1319212168455124,
0.026685357093811035,
0.059385914355516434,
-0.1669529378414154,
0.08456502109766006,
0.12981903553009033,
-0.02904532290995121,
0.05246545001864433,
-0.35044464468955994,
-0.11808328330516815,
0.06931302696466446,
0.0933610051870346,
0.008353129029273987,
-0.1429712325334549,
-0.042704738676548004,
-0.02245069295167923,
-0.0629514679312706,
0.05705072358250618,
-0.09086445719003677,
0.11905016750097275,
-0.041376709938049316,
0.06235179305076599,
0.03957921266555786,
-0.041509758681058884,
0.1542399376630783,
0.015348553657531738,
0.0767417773604393,
-0.03525770083069801,
0.06219397112727165,
0.04306907206773758,
-0.09518074989318848,
0.06788844615221024,
-0.08580091595649719,
0.07897447049617767,
-0.17491455376148224,
-0.010947556234896183,
-0.06558611243963242,
0.06379000097513199,
-0.04343606159090996,
-0.03566054254770279,
-0.048614077270030975,
0.061212796717882156,
0.07381822168827057,
-0.019922219216823578,
0.09703001379966736,
0.001974006649106741,
0.08340068906545639,
0.14821450412273407,
0.07400426268577576,
0.007563514169305563,
-0.15288551151752472,
-0.00414986303076148,
0.0017805235693231225,
0.05147853121161461,
-0.13647979497909546,
0.029162459075450897,
0.13615098595619202,
0.050295423716306686,
0.10716436058282852,
0.013479217886924744,
-0.07182124257087708,
-0.013263163156807423,
0.02912415750324726,
-0.09636978805065155,
-0.10020093619823456,
-0.013425404205918312,
0.0158877894282341,
-0.13926643133163452,
0.013908789493143559,
0.1297779083251953,
-0.04982253164052963,
-0.012469807639718056,
-0.006758870556950569,
0.024059848859906197,
-0.0035625281743705273,
0.21231266856193542,
0.039665963500738144,
0.08115343004465103,
-0.07065719366073608,
0.11541006714105606,
0.06382632255554199,
-0.04403913393616676,
0.06538960337638855,
0.08866225928068161,
-0.0802336186170578,
-0.011022754944860935,
0.05121229588985443,
0.10733482241630554,
-0.03376834839582443,
-0.045835841447114944,
-0.09762109816074371,
-0.09050913900136948,
0.04720340669155121,
0.09239321947097778,
0.04339749738574028,
-0.015459885820746422,
-0.011118676513433456,
-0.011056861840188503,
-0.11482200026512146,
0.1129474863409996,
0.05742541328072548,
0.07024432718753815,
-0.15429119765758514,
0.07664826512336731,
0.004155699163675308,
0.05298735573887825,
-0.014570608735084534,
0.007822202518582344,
-0.09564968198537827,
-0.011447962373495102,
-0.1598757803440094,
0.012159682810306549,
-0.022109178826212883,
0.010014834813773632,
-0.007989011704921722,
-0.03139713406562805,
-0.025841042399406433,
0.046298351138830185,
-0.06366068124771118,
-0.056886982172727585,
0.0048367721028625965,
0.052147675305604935,
-0.13177774846553802,
-0.021067043766379356,
0.020655253902077675,
-0.10445290803909302,
0.07710198312997818,
0.05250126123428345,
0.013323182240128517,
-0.006919139996170998,
-0.04655642434954643,
0.01033608615398407,
0.013155207969248295,
0.018642347306013107,
0.036341503262519836,
-0.12153391540050507,
-0.0024304543621838093,
-0.041004400700330734,
0.028575919568538666,
0.0237213596701622,
0.032140254974365234,
-0.1302897334098816,
-0.03285987302660942,
-0.06095841899514198,
-0.04170505702495575,
-0.057185906916856766,
0.0436834990978241,
0.06953204423189163,
0.03373373672366142,
0.13079535961151123,
-0.08092265576124191,
0.05461510270833969,
-0.1909627616405487,
-0.04591044783592224,
0.007093149237334728,
-0.01641855388879776,
-0.03098953701555729,
-0.013926503248512745,
0.08879632502794266,
-0.05588819086551666,
0.11192573606967926,
-0.012834180146455765,
0.0924946591258049,
0.037173911929130554,
-0.06144905090332031,
0.013887305743992329,
0.010322285816073418,
0.1474170833826065,
0.04286978021264076,
-0.0022018025629222393,
0.11622826755046844,
-0.042555633932352066,
0.04538383334875107,
0.04908492788672447,
0.1521684229373932,
0.15661010146141052,
-0.040427062660455704,
0.05423921346664429,
0.05133635178208351,
-0.11664018779993057,
-0.13963012397289276,
0.10192859172821045,
-0.0035400190390646458,
0.10566798597574234,
-0.0445171482861042,
0.13966362178325653,
0.11435835808515549,
-0.17191082239151,
0.032783009111881256,
-0.0393226332962513,
-0.11592258512973785,
-0.12532849609851837,
-0.057949092239141464,
-0.08023357391357422,
-0.11906275153160095,
0.021720794960856438,
-0.0953202098608017,
0.044207241386175156,
0.09030712395906448,
0.019697673618793488,
0.02351684868335724,
0.15367139875888824,
0.0024000543635338545,
0.00384596292860806,
0.06505312025547028,
0.03512127697467804,
0.015214957296848297,
-0.017883898690342903,
-0.06052805855870247,
0.06182413175702095,
-0.026472264900803566,
0.05989239737391472,
-0.04655584320425987,
0.034002047032117844,
0.03186476230621338,
-0.005428932141512632,
-0.07092013955116272,
0.02341114729642868,
0.019104108214378357,
0.036913394927978516,
0.05205902084708214,
0.06739705055952072,
-0.009486195631325245,
-0.06655152142047882,
0.26670724153518677,
-0.084600530564785,
-0.04092501848936081,
-0.11599072813987732,
0.2438352406024933,
0.03072168678045273,
-0.015107093378901482,
0.059977464377880096,
-0.10396406054496765,
-0.01829475909471512,
0.13482309877872467,
0.12606783211231232,
-0.05679669976234436,
-0.021907620131969452,
-0.02716807834804058,
-0.018948351964354515,
-0.049583930522203445,
0.109752357006073,
0.07930517941713333,
0.027290815487504005,
-0.05318253114819527,
0.016549857333302498,
-0.006712648086249828,
-0.03333699330687523,
-0.12272848188877106,
0.060305867344141006,
0.005742932669818401,
0.014665361493825912,
-0.031126776710152626,
0.06482229381799698,
0.0034540598280727863,
-0.16300824284553528,
0.05761483684182167,
-0.1330079883337021,
-0.19230100512504578,
-0.024966223165392876,
0.029829468578100204,
-0.014082619920372963,
0.03616748005151749,
-0.026194872334599495,
-0.01315099373459816,
0.14775824546813965,
-0.017422949895262718,
-0.040991950780153275,
-0.11118493974208832,
0.06871868669986725,
-0.10238111019134521,
0.19605380296707153,
0.007244368549436331,
0.08277815580368042,
0.10396848618984222,
-0.008012158796191216,
-0.14073657989501953,
0.03501740097999573,
0.09516420215368271,
-0.08798740804195404,
0.032834332436323166,
0.1743701547384262,
-0.019913606345653534,
0.12843969464302063,
0.048029422760009766,
-0.06119810789823532,
-0.0435652993619442,
-0.023100409656763077,
0.017884887754917145,
-0.0824306532740593,
0.006806737277656794,
-0.03906991332769394,
0.16214564442634583,
0.19722892343997955,
-0.04876819625496864,
0.0022031888365745544,
-0.06157471239566803,
0.030639391392469406,
0.039903245866298676,
0.03327784314751625,
-0.0030897916294634342,
-0.1799599677324295,
0.03499511256814003,
0.0191153846681118,
0.03890825808048248,
-0.24224992096424103,
-0.09027576446533203,
0.04083792865276337,
-0.06933712959289551,
-0.029834255576133728,
0.11724898219108582,
0.010436005890369415,
0.03133554011583328,
-0.027744106948375702,
-0.11961694806814194,
-0.0421653613448143,
0.13475655019283295,
-0.17010065913200378,
-0.03900494426488876
] |
null | null | peft |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# mistral_4
This model is a fine-tuned version of [ybelkada/mistral-7b-instruct-v0.1-sharded](https://huggingface.co/ybelkada/mistral-7b-instruct-v0.1-sharded) on an unknown dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (a `TrainingArguments` sketch follows the list):
- learning_rate: 0.0002
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: cosine
- num_epochs: 10
- mixed_precision_training: Native AMP
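
For reference, the listed values can be expressed as `transformers.TrainingArguments`; the sketch below is illustrative only (output directory assumed, `fp16=True` taken as the meaning of "Native AMP"). Given the card's `trl`/`sft` tags, these arguments would typically be passed to `trl`'s `SFTTrainer` together with a `LoraConfig`, neither of which the card specifies.

```python
# Hypothetical TrainingArguments mirroring the hyperparameters listed above.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="mistral_4",       # assumed output path
    learning_rate=2e-4,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="cosine",
    num_train_epochs=10,
    fp16=True,                    # "Native AMP" mixed precision
    adam_beta1=0.9,               # Adam betas / epsilon as listed above
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```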
### Training results
### Framework versions
- PEFT 0.8.2
- Transformers 4.37.2
- Pytorch 2.2.0
- Datasets 2.17.0
- Tokenizers 0.15.2 | {"library_name": "peft", "tags": ["trl", "sft", "generated_from_trainer"], "base_model": "ybelkada/mistral-7b-instruct-v0.1-sharded", "model-index": [{"name": "mistral_4", "results": []}]} | null | anyiwang/mistral_4 | [
"peft",
"safetensors",
"trl",
"sft",
"generated_from_trainer",
"base_model:ybelkada/mistral-7b-instruct-v0.1-sharded",
"region:us"
] | 2024-02-14T17:46:23+00:00 | [] | [] | TAGS
#peft #safetensors #trl #sft #generated_from_trainer #base_model-ybelkada/mistral-7b-instruct-v0.1-sharded #region-us
|
# mistral_4
This model is a fine-tuned version of ybelkada/mistral-7b-instruct-v0.1-sharded on an unknown dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0002
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: cosine
- num_epochs: 10
- mixed_precision_training: Native AMP
### Training results
### Framework versions
- PEFT 0.8.2
- Transformers 4.37.2
- Pytorch 2.2.0
- Datasets 2.17.0
- Tokenizers 0.15.2 | [
"# mistral_4\n\nThis model is a fine-tuned version of ybelkada/mistral-7b-instruct-v0.1-sharded on an unknown dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 0.0002\n- train_batch_size: 8\n- eval_batch_size: 8\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: cosine\n- num_epochs: 10\n- mixed_precision_training: Native AMP",
"### Training results",
"### Framework versions\n\n- PEFT 0.8.2\n- Transformers 4.37.2\n- Pytorch 2.2.0\n- Datasets 2.17.0\n- Tokenizers 0.15.2"
] | [
"TAGS\n#peft #safetensors #trl #sft #generated_from_trainer #base_model-ybelkada/mistral-7b-instruct-v0.1-sharded #region-us \n",
"# mistral_4\n\nThis model is a fine-tuned version of ybelkada/mistral-7b-instruct-v0.1-sharded on an unknown dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 0.0002\n- train_batch_size: 8\n- eval_batch_size: 8\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: cosine\n- num_epochs: 10\n- mixed_precision_training: Native AMP",
"### Training results",
"### Framework versions\n\n- PEFT 0.8.2\n- Transformers 4.37.2\n- Pytorch 2.2.0\n- Datasets 2.17.0\n- Tokenizers 0.15.2"
] | [
50,
41,
6,
12,
8,
3,
103,
4,
36
] | [
"passage: TAGS\n#peft #safetensors #trl #sft #generated_from_trainer #base_model-ybelkada/mistral-7b-instruct-v0.1-sharded #region-us \n# mistral_4\n\nThis model is a fine-tuned version of ybelkada/mistral-7b-instruct-v0.1-sharded on an unknown dataset.## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 0.0002\n- train_batch_size: 8\n- eval_batch_size: 8\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: cosine\n- num_epochs: 10\n- mixed_precision_training: Native AMP### Training results### Framework versions\n\n- PEFT 0.8.2\n- Transformers 4.37.2\n- Pytorch 2.2.0\n- Datasets 2.17.0\n- Tokenizers 0.15.2"
] | [
-0.10337264835834503,
0.006215794011950493,
-0.001789325033314526,
0.0589677132666111,
0.15485607087612152,
0.024661660194396973,
0.122022345662117,
0.10370431840419769,
-0.03875783830881119,
0.07222878187894821,
0.06050911173224449,
0.019691182300448418,
0.05346105992794037,
0.13962464034557343,
-0.02817060798406601,
-0.262439101934433,
0.04388393461704254,
-0.02197706513106823,
-0.023118125274777412,
0.08932846784591675,
0.12260407209396362,
-0.10758233815431595,
0.04834804683923721,
0.009517268277704716,
-0.1472058892250061,
0.014542456716299057,
-0.01180010475218296,
-0.038507021963596344,
0.10677209496498108,
0.023931588977575302,
0.15563884377479553,
-0.0016158894868567586,
0.1375308483839035,
-0.22651930153369904,
0.017075620591640472,
0.07380184531211853,
0.04182357341051102,
0.07784227281808853,
0.07114922255277634,
0.013560948893427849,
0.08596766740083694,
-0.11704763025045395,
0.11641354858875275,
0.031220970675349236,
-0.08262176811695099,
-0.20933224260807037,
-0.10710372030735016,
0.060312122106552124,
0.11796801537275314,
0.07700551301240921,
0.009835158474743366,
0.14242087304592133,
-0.0901828482747078,
0.05085735768079758,
0.19254645705223083,
-0.22196851670742035,
-0.08155807107686996,
0.05443694815039635,
0.053038131445646286,
0.08513971418142319,
-0.10770121961832047,
-0.04111854359507561,
0.06866102665662766,
0.04426225274801254,
0.06896253675222397,
0.007649783045053482,
-0.06452878564596176,
0.0005022608092986047,
-0.14663222432136536,
-0.010872065089643002,
0.13430839776992798,
0.04579004645347595,
-0.0527285672724247,
-0.07451902329921722,
-0.03368290141224861,
-0.051827237010002136,
-0.03572468087077141,
-0.0390949621796608,
0.0024862673599272966,
-0.02407955750823021,
-0.012910435907542706,
-0.04775889217853546,
-0.09928417205810547,
-0.09866531193256378,
0.0032498519867658615,
0.1311064511537552,
0.047070372849702835,
0.0018703984096646309,
-0.0420706570148468,
0.10813391208648682,
-0.001810224144719541,
-0.08633924275636673,
-0.008354033343493938,
-0.018132321536540985,
-0.0814436748623848,
-0.0757456049323082,
-0.04562411084771156,
-0.03799774497747421,
0.006461271084845066,
0.11102264374494553,
-0.10832057893276215,
0.08826020359992981,
0.023112833499908447,
0.03447101637721062,
-0.0251633208245039,
0.08081954717636108,
-0.041712187230587006,
0.04247117415070534,
-0.009520944207906723,
0.09880697727203369,
-0.01733573153614998,
-0.014226874336600304,
-0.07268641144037247,
-0.05025118589401245,
0.07962599396705627,
0.04885334149003029,
-0.05538642406463623,
0.008346492424607277,
-0.06515853106975555,
-0.02086462266743183,
0.01465445477515459,
-0.12135925889015198,
0.03646618500351906,
0.0144332991912961,
-0.058310240507125854,
-0.027242224663496017,
0.03002922609448433,
0.029377374798059464,
0.0076901670545339584,
0.07108276337385178,
-0.07468761503696442,
0.024852022528648376,
-0.09998003393411636,
-0.0803261399269104,
0.012915010564029217,
-0.0153382308781147,
-0.015492914244532585,
-0.10554129630327225,
-0.18069294095039368,
-0.05969438701868057,
0.03478049859404564,
-0.04181191325187683,
0.0006863681483082473,
-0.05268433690071106,
-0.031800899654626846,
0.03944379836320877,
-0.006819132715463638,
0.11582417786121368,
-0.05402616038918495,
0.0829983800649643,
-0.0553533211350441,
-0.0013788575306534767,
-0.028029317036271095,
0.028246091678738594,
-0.06388083100318909,
0.04445512965321541,
-0.10409299284219742,
0.055932678282260895,
-0.10560792684555054,
0.03686010092496872,
-0.13210664689540863,
-0.09771011024713516,
-0.018715301528573036,
-0.015840530395507812,
0.10999235510826111,
0.11136600375175476,
-0.20171275734901428,
-0.009703820571303368,
0.12720872461795807,
-0.11257486790418625,
-0.0810292512178421,
0.1060682013630867,
-0.06434255838394165,
0.08273648470640182,
0.03858178108930588,
0.17407062649726868,
0.11986999958753586,
-0.14124003052711487,
0.04288206994533539,
0.002502785762771964,
0.09208354353904724,
0.06680405884981155,
0.06228973716497421,
-0.03220329061150551,
-0.083177849650383,
0.006302901543676853,
-0.06441062688827515,
0.0471021868288517,
-0.09610269218683243,
-0.07290066033601761,
-0.04743048548698425,
-0.06059272959828377,
0.08012470602989197,
0.0191176887601614,
0.027221523225307465,
-0.06921228766441345,
-0.060811400413513184,
0.16650310158729553,
0.1496778130531311,
-0.05436931923031807,
0.0044854688458144665,
-0.06114275008440018,
0.03846197575330734,
-0.01646745204925537,
-0.03914026543498039,
-0.16919468343257904,
-0.11931213736534119,
0.029645886272192,
-0.03913192078471184,
0.0025577887427061796,
0.02892068587243557,
0.07792153209447861,
0.07292833179235458,
-0.04443078115582466,
-0.052019715309143066,
-0.11630630493164062,
0.013992189429700375,
-0.11284984648227692,
-0.18967723846435547,
-0.05293823778629303,
-0.06184903904795647,
0.167352095246315,
-0.25421127676963806,
0.02167338691651821,
-0.013650160282850266,
0.1327713429927826,
0.03669483959674835,
-0.04606243595480919,
-0.009569479152560234,
0.06628235429525375,
0.00016244534344878048,
-0.0845855325460434,
0.047826752066612244,
0.01811722107231617,
-0.12895971536636353,
-0.01707228273153305,
-0.1493241935968399,
0.03461204096674919,
0.04856346547603607,
0.03721722960472107,
-0.10710162669420242,
-0.1384902000427246,
-0.06328704953193665,
-0.05236035957932472,
-0.0840529277920723,
-0.0032734349370002747,
0.20191170275211334,
0.005900191143155098,
0.11713024228811264,
-0.06325466185808182,
-0.0390244722366333,
-0.005822141654789448,
-0.024584593251347542,
-0.017468951642513275,
0.08803189545869827,
0.019076982513070107,
-0.11767800152301788,
0.07110142707824707,
0.12634046375751495,
-0.065721295773983,
0.1708741933107376,
-0.055221863090991974,
-0.10148989409208298,
-0.01333630084991455,
0.05333716794848442,
-0.0061828759498894215,
0.1082717776298523,
-0.06267492473125458,
0.03053361177444458,
0.017201364040374756,
0.04742904007434845,
0.040219198912382126,
-0.1870562732219696,
-0.02142537198960781,
0.013491062447428703,
-0.019624127075076103,
-0.02187572605907917,
-0.017073802649974823,
0.028281310573220253,
0.08384675532579422,
0.019448695704340935,
-0.011735129170119762,
0.01810576021671295,
-0.006454815622419119,
-0.11513639241456985,
0.19852522015571594,
-0.15288731455802917,
-0.08844896405935287,
-0.11766643822193146,
0.07756655663251877,
-0.0004115824121981859,
-0.028531545773148537,
0.0261512603610754,
-0.09623593837022781,
-0.037756096571683884,
-0.0833800658583641,
0.0004976270720362663,
-0.05364157259464264,
-0.025098415091633797,
0.02426331490278244,
0.00717892125248909,
0.09182938188314438,
-0.12695206701755524,
0.014949704520404339,
-0.005114018451422453,
-0.0869145616889,
0.022619659081101418,
0.027672627940773964,
0.07906775176525116,
0.14128918945789337,
-0.011583597399294376,
-0.00486096041277051,
-0.04870016127824783,
0.22563214600086212,
-0.051559511572122574,
-0.023612210527062416,
0.10851018130779266,
-0.006585732102394104,
0.05973655730485916,
0.12210965156555176,
0.046115774661302567,
-0.08896319568157196,
0.05020616576075554,
0.04518600180745125,
-0.004499801900237799,
-0.25483155250549316,
-0.05211193487048149,
-0.024815058335661888,
-0.10208138078451157,
0.09321761131286621,
0.046902112662792206,
-0.028447233140468597,
0.06341329216957092,
-0.04612885043025017,
0.02134450152516365,
0.012971612624824047,
0.09163552522659302,
0.020226893946528435,
0.02259140834212303,
0.07046306878328323,
-0.023856526240706444,
-0.004968945402652025,
0.05620591342449188,
0.011244754306972027,
0.27087152004241943,
-0.013945433311164379,
0.059186480939388275,
0.06046794727444649,
0.16167764365673065,
-0.005845725070685148,
0.019717657938599586,
0.021461984142661095,
-0.010274054482579231,
-0.0032968975137919188,
-0.06314333528280258,
-0.03963378816843033,
0.05729199945926666,
-0.04088391736149788,
0.0727773979306221,
-0.12210151553153992,
0.0003023453464265913,
0.019875844940543175,
0.263919860124588,
0.012519828975200653,
-0.2620271146297455,
-0.10492577403783798,
0.03382217884063721,
-0.016352161765098572,
-0.10037713497877121,
0.009059403091669083,
0.16377991437911987,
-0.13561569154262543,
0.030971406027674675,
-0.06303726136684418,
0.0967380478978157,
0.0007971783052198589,
-0.011783137917518616,
0.01029207557439804,
0.1272677183151245,
-0.014811011962592602,
0.08371758460998535,
-0.22936837375164032,
0.24654847383499146,
0.01342978049069643,
0.104887954890728,
-0.022979319095611572,
0.02102394588291645,
0.03470654785633087,
0.08443046361207962,
0.061670608818531036,
0.01375835482031107,
-0.11259101331233978,
-0.20751063525676727,
-0.03861650824546814,
0.03720827400684357,
0.10715404897928238,
-0.014421668834984303,
0.05920276418328285,
-0.05043245851993561,
0.04238877445459366,
0.05125502124428749,
-0.07730136811733246,
-0.23079358041286469,
-0.1030389592051506,
0.004282772075384855,
0.01357520092278719,
0.01818849705159664,
-0.12834177911281586,
-0.09695174545049667,
-0.022535433992743492,
0.10721303522586823,
-0.012470382265746593,
-0.02761477418243885,
-0.14311201870441437,
0.07065211236476898,
0.11668410152196884,
-0.05560881271958351,
0.0244889073073864,
0.03599180281162262,
0.13164101541042328,
0.00907344464212656,
-0.046912480145692825,
0.06788238883018494,
-0.0761818140745163,
-0.16295753419399261,
-0.07254540175199509,
0.09332893788814545,
0.0977049395442009,
0.04696078971028328,
0.009154240600764751,
0.013569151982665062,
0.011253073811531067,
-0.09570546448230743,
0.007295412942767143,
0.18250195682048798,
0.035789553076028824,
0.09194548428058624,
-0.10019440203905106,
-0.002310779644176364,
-0.04056400805711746,
-0.03748605027794838,
0.13166220486164093,
0.24145032465457916,
-0.08891843259334564,
0.06795424968004227,
0.10234737396240234,
-0.09378474950790405,
-0.1704847812652588,
0.09904774278402328,
0.14595438539981842,
0.026414941996335983,
0.04589180648326874,
-0.18355418741703033,
0.05115588381886482,
0.14346209168434143,
-0.01859118603169918,
0.023816818371415138,
-0.33942046761512756,
-0.11595098674297333,
0.07909861952066422,
0.14036297798156738,
0.03207307681441307,
-0.11530941724777222,
-0.0409570150077343,
-0.028831537812948227,
-0.11614429950714111,
0.037035536020994186,
-0.12909281253814697,
0.07495085895061493,
-0.01104784570634365,
0.07939565926790237,
0.03848927840590477,
-0.032811522483825684,
0.2010136842727661,
-0.0251179039478302,
0.11297707259654999,
-0.0557200089097023,
0.05650795251131058,
0.025914093479514122,
-0.06990455836057663,
0.03973982110619545,
-0.005413677077740431,
0.06617297977209091,
-0.13306120038032532,
-0.008426230400800705,
-0.063511922955513,
0.07156642526388168,
-0.04820823669433594,
-0.07446098327636719,
-0.03277973085641861,
0.05820194259285927,
0.04310975968837738,
-0.027835164219141006,
0.062053751200437546,
-0.019339123740792274,
0.17479203641414642,
0.027584265917539597,
0.10433576256036758,
-0.015600290149450302,
-0.058317288756370544,
0.0021135651040822268,
-0.027970990166068077,
0.0863669291138649,
-0.13197878003120422,
0.022929400205612183,
0.1205100268125534,
0.030493127182126045,
0.16004014015197754,
0.041915345937013626,
-0.0729956403374672,
0.038614701479673386,
0.040365274995565414,
-0.06970222294330597,
-0.1620815396308899,
-0.02052895724773407,
0.13951744139194489,
-0.14788973331451416,
0.002574182813987136,
0.1135149821639061,
-0.07457549124956131,
-0.021955620497465134,
-0.023601898923516273,
0.012073843739926815,
-0.049306634813547134,
0.17946504056453705,
0.04113874211907387,
0.06150616332888603,
-0.05381961539387703,
0.09821463376283646,
0.07758620381355286,
-0.08366880565881729,
0.05396343022584915,
0.056876104325056076,
-0.0847276970744133,
-0.04022432491183281,
0.07213140279054642,
0.1686733216047287,
-0.014782448299229145,
-0.047140076756477356,
-0.0510488823056221,
-0.10459993034601212,
0.012020409107208252,
0.1064491868019104,
0.02693895250558853,
-0.009244709275662899,
-0.003670152509585023,
0.025449059903621674,
-0.10272513329982758,
0.06280021369457245,
0.046197112649679184,
0.08655966818332672,
-0.1557409018278122,
0.15246358513832092,
-0.0005885094869881868,
0.02406742237508297,
-0.00757675850763917,
-0.004174717236310244,
-0.10803421586751938,
0.004120722413063049,
-0.17840023338794708,
0.02450062893331051,
-0.03654945641756058,
0.024804135784506798,
0.010010073892772198,
-0.052713003009557724,
-0.008594752289354801,
0.044653113931417465,
-0.07883801311254501,
-0.039778996258974075,
0.007121212314814329,
0.0893743485212326,
-0.0845099687576294,
-0.024563241750001907,
0.041905973106622696,
-0.08150094747543335,
0.06385646760463715,
0.06682084500789642,
0.028872933238744736,
0.049224574118852615,
-0.20292578637599945,
0.016533302143216133,
0.03618454560637474,
0.008380367420613766,
0.03660057112574577,
-0.09889880567789078,
-0.025226041674613953,
-0.04352929815649986,
0.04045220836997032,
0.013063234277069569,
0.06196910887956619,
-0.11826188117265701,
-0.06270037591457367,
-0.03789031133055687,
-0.07101920247077942,
-0.08028954267501831,
0.03646593168377876,
0.0714026391506195,
0.0684865191578865,
0.12328524887561798,
-0.10572142153978348,
0.05424702540040016,
-0.17561693489551544,
-0.03588307648897171,
-0.024851299822330475,
0.004120181314647198,
-0.06947892159223557,
-0.05369690805673599,
0.06883453577756882,
-0.05020106956362724,
0.048741064965724945,
-0.04950939491391182,
0.07491974532604218,
0.018829142674803734,
-0.1066734790802002,
0.009285783395171165,
0.018411127850413322,
0.21818676590919495,
0.052903350442647934,
-0.001985691487789154,
0.051904648542404175,
0.0011282323393970728,
0.0352940559387207,
0.10621880739927292,
0.12911807000637054,
0.1746978610754013,
-0.027316628023982048,
0.060384009033441544,
0.05114862695336342,
-0.10159514099359512,
-0.058767061680555344,
0.0905451700091362,
0.0184214748442173,
0.05842888727784157,
-0.06324365735054016,
0.17743095755577087,
0.1437743902206421,
-0.18805299699306488,
0.022895431146025658,
-0.06536433100700378,
-0.09944599121809006,
-0.10515092313289642,
-0.017600156366825104,
-0.07646728307008743,
-0.14469367265701294,
0.008487126789987087,
-0.1103997603058815,
0.010412308387458324,
0.06352095305919647,
0.008961752988398075,
0.04339352250099182,
0.13274751603603363,
0.021573014557361603,
0.008538791909813881,
0.049207452684640884,
0.007000009994953871,
0.017741259187459946,
-0.09953625500202179,
-0.10704183578491211,
0.0920887216925621,
-0.04000731185078621,
0.04173225536942482,
-0.04637802019715309,
0.008257081732153893,
0.029475469142198563,
-0.0035149960312992334,
-0.07743743807077408,
0.03305261954665184,
0.010678993538022041,
0.00661084707826376,
0.0687677264213562,
0.08571168035268784,
-0.012139420956373215,
-0.04537387192249298,
0.2844032943248749,
-0.07214077562093735,
-0.07684493809938431,
-0.14623041450977325,
0.21965345740318298,
0.01139330118894577,
0.006197964306920767,
0.04032072797417641,
-0.1089073121547699,
0.01921582780778408,
0.07510922849178314,
0.10454878211021423,
-0.036782629787921906,
0.004954392556101084,
-0.025369932875037193,
-0.025389807298779488,
-0.09590140730142593,
0.1436505913734436,
0.09780371189117432,
-0.0091379564255476,
-0.06822861731052399,
0.005390261299908161,
-0.02247123047709465,
0.001140994019806385,
-0.08534034341573715,
0.02377086877822876,
-0.011433889158070087,
0.0012024232419207692,
-0.042873580008745193,
0.09876471012830734,
0.019190024584531784,
-0.13023684918880463,
0.01589992269873619,
-0.10620708763599396,
-0.14004801213741302,
-0.04497940465807915,
0.05797440931200981,
0.001495108474045992,
0.04602193087339401,
-0.0457729808986187,
0.014151371084153652,
0.13670183718204498,
-0.021389111876487732,
-0.015529042109847069,
-0.1416955143213272,
0.09178055822849274,
-0.023440631106495857,
0.2231423407793045,
-0.003172996686771512,
0.0713529959321022,
0.10009098798036575,
0.04253759980201721,
-0.1196022629737854,
0.060087744146585464,
0.06730786710977554,
-0.039240762591362,
0.01564645767211914,
0.14747850596904755,
-0.05549449473619461,
0.1238340511918068,
0.046350955963134766,
-0.17635901272296906,
0.01861233450472355,
-0.045253925025463104,
-0.02746964804828167,
-0.06790799647569656,
0.03881033509969711,
-0.05317975580692291,
0.1503496617078781,
0.1679898351430893,
-0.04184658080339432,
-0.013221114873886108,
-0.0641382709145546,
0.04313709959387779,
0.04742544889450073,
0.10846609622240067,
-0.026320846751332283,
-0.19949166476726532,
0.015592155046761036,
0.03718414157629013,
0.0275910384953022,
-0.22815091907978058,
-0.10185766965150833,
0.028148626908659935,
-0.05808671563863754,
-0.028743572533130646,
0.12076766788959503,
0.06166354939341545,
0.027767164632678032,
-0.03648023679852486,
-0.20641735196113586,
-0.035561271011829376,
0.15835872292518616,
-0.08958007395267487,
-0.04801149293780327
] |
null | null | transformers |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# my_awesome_qa_model
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 1.8077
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 3
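
Below is a hedged sketch, not part of the original card, of how the hyperparameters listed above might be expressed with the `transformers` `TrainingArguments` API. The output directory and the evaluation strategy are illustrative assumptions; everything else mirrors the values reported in the list.

```python
# Sketch only: mapping the reported hyperparameters onto TrainingArguments.
# output_dir and evaluation_strategy are assumptions, not stated in the card.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="my_awesome_qa_model",    # assumed output path
    learning_rate=2e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    num_train_epochs=3,
    lr_scheduler_type="linear",          # Adam betas/epsilon left at their defaults, matching the card
    evaluation_strategy="epoch",         # assumption; the card only reports per-epoch validation loss
)
```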
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:----:|:---------------:|
| No log | 1.0 | 250 | 2.3463 |
| 2.794 | 2.0 | 500 | 1.8910 |
| 2.794 | 3.0 | 750 | 1.8077 |
### Framework versions
- Transformers 4.37.2
- Pytorch 2.1.0+cu121
- Datasets 2.17.0
- Tokenizers 0.15.1
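
As a minimal usage sketch (not taken from the original card), the fine-tuned checkpoint could be loaded with the `transformers` question-answering pipeline. The model id `farfalla/my_awesome_qa_model` comes from this record's metadata; the question and context strings are made-up examples.

```python
# Load the fine-tuned DistilBERT QA checkpoint and run a single query.
from transformers import pipeline

qa = pipeline("question-answering", model="farfalla/my_awesome_qa_model")

result = qa(
    question="What architecture is the model based on?",
    context="my_awesome_qa_model is a fine-tuned version of distilbert-base-uncased.",
)
print(result["answer"], result["score"])  # extracted span and its confidence score
```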
| {"license": "apache-2.0", "tags": ["generated_from_trainer"], "base_model": "distilbert-base-uncased", "model-index": [{"name": "my_awesome_qa_model", "results": []}]} | question-answering | farfalla/my_awesome_qa_model | [
"transformers",
"tensorboard",
"safetensors",
"distilbert",
"question-answering",
"generated_from_trainer",
"base_model:distilbert-base-uncased",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] | 2024-02-14T17:46:52+00:00 | [] | [] | TAGS
#transformers #tensorboard #safetensors #distilbert #question-answering #generated_from_trainer #base_model-distilbert-base-uncased #license-apache-2.0 #endpoints_compatible #region-us
| my\_awesome\_qa\_model
======================
This model is a fine-tuned version of distilbert-base-uncased on an unknown dataset.
It achieves the following results on the evaluation set:
* Loss: 1.8077
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 2e-05
* train\_batch\_size: 16
* eval\_batch\_size: 16
* seed: 42
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* num\_epochs: 3
### Training results
### Framework versions
* Transformers 4.37.2
* Pytorch 2.1.0+cu121
* Datasets 2.17.0
* Tokenizers 0.15.1
| [
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 3",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.37.2\n* Pytorch 2.1.0+cu121\n* Datasets 2.17.0\n* Tokenizers 0.15.1"
] | [
"TAGS\n#transformers #tensorboard #safetensors #distilbert #question-answering #generated_from_trainer #base_model-distilbert-base-uncased #license-apache-2.0 #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 3",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.37.2\n* Pytorch 2.1.0+cu121\n* Datasets 2.17.0\n* Tokenizers 0.15.1"
] | [
65,
98,
4,
33
] | [
"passage: TAGS\n#transformers #tensorboard #safetensors #distilbert #question-answering #generated_from_trainer #base_model-distilbert-base-uncased #license-apache-2.0 #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 3### Training results### Framework versions\n\n\n* Transformers 4.37.2\n* Pytorch 2.1.0+cu121\n* Datasets 2.17.0\n* Tokenizers 0.15.1"
] | [
-0.1079578772187233,
0.08763980865478516,
-0.0019621143583208323,
0.10341255366802216,
0.12614955008029938,
0.021183563396334648,
0.14030638337135315,
0.1130698099732399,
-0.08146797865629196,
0.05760328471660614,
0.13386547565460205,
0.11614212393760681,
-0.0010275463573634624,
0.08917620778083801,
-0.06141696497797966,
-0.20077796280384064,
0.004093899391591549,
0.03878962993621826,
-0.08985836058855057,
0.11510021239519119,
0.08729863911867142,
-0.13721224665641785,
0.07706473767757416,
-0.005252608098089695,
-0.17405426502227783,
0.026438353583216667,
0.007451459765434265,
-0.03731966391205788,
0.12098446488380432,
0.02302885800600052,
0.13247452676296234,
0.02306237258017063,
0.07921166718006134,
-0.19601735472679138,
0.015795690938830376,
0.061106666922569275,
-0.002603008644655347,
0.07791982591152191,
0.025356560945510864,
0.00629910733550787,
0.08533596992492676,
-0.08035677671432495,
0.06192811205983162,
0.029701558873057365,
-0.12525951862335205,
-0.24548351764678955,
-0.1022774800658226,
0.052221160382032394,
0.09598281234502792,
0.09667747467756271,
-0.012235932052135468,
0.14284324645996094,
-0.07576734572649002,
0.08507441729307175,
0.23668980598449707,
-0.3127151429653168,
-0.07500237971544266,
0.04209037870168686,
0.03609500825405121,
0.07135437428951263,
-0.10207263380289078,
-0.030968399718403816,
0.07605142146348953,
0.02661088854074478,
0.09998829662799835,
-0.03796201944351196,
-0.08013312518596649,
0.025841889902949333,
-0.1454339623451233,
-0.010773399844765663,
0.1624334156513214,
0.06543325632810593,
-0.045340608805418015,
-0.03454513102769852,
-0.063564233481884,
-0.11902239918708801,
-0.02963925153017044,
-0.0380592904984951,
0.042180873453617096,
-0.03628644719719887,
-0.07894314080476761,
-0.01673061214387417,
-0.10153656452894211,
-0.09933217614889145,
-0.05122080817818642,
0.15787987411022186,
0.0414079949259758,
0.016491375863552094,
-0.023028390482068062,
0.10171841084957123,
-0.03466477617621422,
-0.14449843764305115,
0.002443211153149605,
0.021154526621103287,
-0.005706069525331259,
-0.050078656524419785,
-0.045911431312561035,
-0.0534600093960762,
0.03653931990265846,
0.18721002340316772,
-0.07968150824308395,
0.04092005640268326,
0.027010025456547737,
0.03858250752091408,
-0.09839488565921783,
0.1500435620546341,
-0.06256569921970367,
-0.017830118536949158,
-0.0018423255532979965,
0.08145970851182938,
0.033781055361032486,
0.004189325030893087,
-0.089665487408638,
0.019188188016414642,
0.08908883482217789,
0.019356081262230873,
-0.0449339896440506,
0.054494958370923996,
-0.05003219470381737,
-0.00040126912062987685,
0.01677119918167591,
-0.08018811047077179,
0.02755165286362171,
0.00874569732695818,
-0.060966260731220245,
-0.04023066535592079,
0.024771112948656082,
0.022089257836341858,
0.021505486220121384,
0.09314974397420883,
-0.0978613942861557,
0.00813241209834814,
-0.09164196252822876,
-0.10598386824131012,
0.023505231365561485,
-0.06532797962427139,
0.03309416025876999,
-0.09245709329843521,
-0.17023494839668274,
-0.013551461510360241,
0.05922982096672058,
-0.030474387109279633,
-0.014799722470343113,
-0.03865351155400276,
-0.09020908921957016,
0.0008138086413964629,
-0.013011627830564976,
0.08984459936618805,
-0.058956798166036606,
0.10181625187397003,
0.06126866862177849,
0.0699855238199234,
-0.03698963671922684,
0.030636988580226898,
-0.10944872349500656,
0.04139633849263191,
-0.18797966837882996,
0.003984891809523106,
-0.08641743659973145,
0.07289350777864456,
-0.09084009379148483,
-0.08535929769277573,
-0.016254007816314697,
0.006539325229823589,
0.0881168320775032,
0.10214412212371826,
-0.15106181800365448,
-0.05815037712454796,
0.148003488779068,
-0.09143427759408951,
-0.17716358602046967,
0.12513279914855957,
-0.047256968915462494,
0.06066618487238884,
0.0486370213329792,
0.16919341683387756,
0.06290459632873535,
-0.11700344830751419,
-0.017890803515911102,
0.003587662009522319,
0.059311218559741974,
-0.027697164565324783,
0.06983974575996399,
-0.016477065160870552,
0.0036333235912024975,
0.008215698413550854,
-0.06462837010622025,
0.03496193140745163,
-0.09898227453231812,
-0.08959127962589264,
-0.05475960299372673,
-0.1089145615696907,
0.04224627465009689,
0.06800667941570282,
0.056415509432554245,
-0.11482112109661102,
-0.10043847560882568,
0.0774381011724472,
0.08820438385009766,
-0.07181283086538315,
0.019134219735860825,
-0.07872435450553894,
0.08115661889314651,
-0.07177772372961044,
-0.02882152982056141,
-0.15635594725608826,
-0.053422488272190094,
0.0035561344120651484,
-0.0038196067325770855,
0.0064107878133654594,
0.014823948964476585,
0.08407282084226608,
0.06625182181596756,
-0.07336990535259247,
-0.045395378023386,
-0.05134711042046547,
0.013827234506607056,
-0.11478529870510101,
-0.20930363237857819,
-0.02668589912354946,
-0.03260941803455353,
0.1105661690235138,
-0.21417927742004395,
0.042914122343063354,
-0.0027070229407399893,
0.09319404512643814,
0.035662941634655,
-0.008854249492287636,
-0.046256173402071,
0.06529150158166885,
-0.030676813796162605,
-0.061146676540374756,
0.039574429392814636,
0.003142300993204117,
-0.09590459614992142,
-0.07621210068464279,
-0.11605701595544815,
0.17367379367351532,
0.12780000269412994,
-0.09184027463197708,
-0.07925602793693542,
-0.002761205192655325,
-0.06189825013279915,
-0.04021107032895088,
-0.03744383156299591,
0.004364399239420891,
0.11405923217535019,
-0.01421559602022171,
0.11729323863983154,
-0.08462566137313843,
-0.038485087454319,
0.009775741957128048,
-0.05127548798918724,
0.019547976553440094,
0.10796801745891571,
0.1117251068353653,
-0.081390380859375,
0.14698009192943573,
0.18587777018547058,
-0.11141511052846909,
0.11617311090230942,
-0.0654597356915474,
-0.08602005243301392,
-0.03317912295460701,
0.022672584280371666,
0.009516389109194279,
0.14659282565116882,
-0.13773559033870697,
0.019626738503575325,
0.014542412012815475,
0.010319598019123077,
0.02093718759715557,
-0.21572169661521912,
-0.045099757611751556,
0.026997068896889687,
-0.04558418691158295,
-0.013135450892150402,
-0.009701218456029892,
-0.008238356560468674,
0.08762362599372864,
-0.011737208813428879,
-0.06223534420132637,
0.03968314081430435,
-0.00263650412671268,
-0.07476041465997696,
0.210761159658432,
-0.07432426512241364,
-0.08286245167255402,
-0.11323168873786926,
-0.02658006176352501,
-0.03739188611507416,
0.012821050360798836,
0.06812896579504013,
-0.07864625751972198,
-0.034366391599178314,
-0.09309583902359009,
0.013437951914966106,
0.035155534744262695,
0.026941116899251938,
0.03337613493204117,
0.0014104206347838044,
0.09522728621959686,
-0.11167403310537338,
0.007496316451579332,
-0.04288000985980034,
-0.0683012455701828,
0.03175167739391327,
0.041434451937675476,
0.1289648413658142,
0.13266317546367645,
-0.006799431052058935,
-0.0034988534171134233,
-0.019179342314600945,
0.24632693827152252,
-0.06996584683656693,
-0.02550959587097168,
0.13380858302116394,
-0.009382988326251507,
0.04521271213889122,
0.13220326602458954,
0.07229889184236526,
-0.09762103110551834,
0.026602644473314285,
0.05790786072611809,
-0.019087817519903183,
-0.22741417586803436,
-0.015021627768874168,
-0.035907335579395294,
0.0006985376821830869,
0.07628485560417175,
0.031232578679919243,
0.04270970821380615,
0.07607122510671616,
0.019571565091609955,
0.0458909310400486,
-0.033833328634500504,
0.06337186694145203,
0.09435555338859558,
0.037152137607336044,
0.11685654520988464,
-0.04909558221697807,
-0.052371662110090256,
0.028975659981369972,
0.0016075856983661652,
0.22602519392967224,
0.010626540519297123,
0.13689634203910828,
0.07851652801036835,
0.18062299489974976,
-0.019139207899570465,
0.06615877896547318,
-0.014573010616004467,
-0.06555186957120895,
0.001708787283860147,
-0.055621396750211716,
-0.00017163378652185202,
0.04196846857666969,
-0.08856267482042313,
0.08691498637199402,
-0.098223477602005,
0.024556485936045647,
0.07178925722837448,
0.244070902466774,
0.0565110445022583,
-0.2960689663887024,
-0.09915676712989807,
0.01800783909857273,
-0.024558864533901215,
-0.018187614157795906,
0.02865000255405903,
0.13217481970787048,
-0.03832488879561424,
0.011472643353044987,
-0.0620817206799984,
0.08190985023975372,
0.011459979228675365,
0.03931395336985588,
0.06811609119176865,
0.08993252366781235,
-0.011013898067176342,
0.07281957566738129,
-0.2836052477359772,
0.2717476189136505,
0.01925056055188179,
0.09410490840673447,
-0.04434746503829956,
-0.0047549097798764706,
0.01829407550394535,
0.06845501810312271,
0.09035717695951462,
-0.02690468728542328,
-0.0394635833799839,
-0.1731618046760559,
-0.04470261186361313,
0.037691209465265274,
0.09619221836328506,
-0.028674203902482986,
0.10700350254774094,
-0.019911963492631912,
0.011559884995222092,
0.09085223078727722,
-0.011863641440868378,
-0.10737388581037521,
-0.07728314399719238,
-0.01664567179977894,
0.0064710951410233974,
-0.03958025202155113,
-0.09762688726186752,
-0.09770943224430084,
-0.11090271174907684,
0.13516497611999512,
-0.044538941234350204,
-0.021154869347810745,
-0.10042822360992432,
0.06963378936052322,
0.0912652462720871,
-0.07541369646787643,
0.03978581354022026,
0.021414978429675102,
0.054797496646642685,
0.03999511897563934,
-0.04735485836863518,
0.11886489391326904,
-0.08075416833162308,
-0.17606396973133087,
-0.061818793416023254,
0.10837610810995102,
0.03770598769187927,
0.045281149446964264,
-0.011241069063544273,
0.004922180902212858,
-0.023850344121456146,
-0.09154654294252396,
0.02500142529606819,
-0.027446528896689415,
0.07187540829181671,
0.016060784459114075,
-0.04647016525268555,
0.04587600752711296,
-0.049050863832235336,
-0.02025841735303402,
0.13496647775173187,
0.29501667618751526,
-0.08467547595500946,
-0.01717042736709118,
0.0626881793141365,
-0.04876435920596123,
-0.19560013711452484,
0.06310991942882538,
0.03228176757693291,
-0.004511831793934107,
0.05742659419775009,
-0.13983096182346344,
0.14491748809814453,
0.11258776485919952,
-0.027074260637164116,
0.11148685961961746,
-0.30757078528404236,
-0.12536178529262543,
0.1293194442987442,
0.15466010570526123,
0.10060099512338638,
-0.17425775527954102,
-0.03436605632305145,
-0.014030381105840206,
-0.14593110978603363,
0.08121512085199356,
-0.15451566874980927,
0.09450095891952515,
-0.014294955879449844,
0.048651322722435,
0.004190343432128429,
-0.07074429094791412,
0.14836987853050232,
-0.0014397071208804846,
0.11916863918304443,
-0.04686429724097252,
-0.012334862723946571,
0.07179057598114014,
-0.04476006701588631,
0.03221740946173668,
-0.08921004086732864,
0.05430687963962555,
-0.05277353152632713,
-0.026328114792704582,
-0.06338362395763397,
0.0444340705871582,
-0.05156704783439636,
-0.06668784469366074,
-0.04629887640476227,
0.031232062727212906,
0.025636333972215652,
-0.011674152687191963,
0.1439708173274994,
0.018912920728325844,
0.1529214233160019,
0.11539846658706665,
0.0782497227191925,
-0.0703849345445633,
-0.05379251390695572,
0.011466693133115768,
-0.031105032190680504,
0.0735807716846466,
-0.15534017980098724,
0.04171266034245491,
0.1302478313446045,
0.03247127681970596,
0.13903555274009705,
0.06965480744838715,
-0.04696248844265938,
0.011324635706841946,
0.056834202259778976,
-0.1626703292131424,
-0.1340324729681015,
0.00002757364018179942,
-0.04736484959721565,
-0.13953174650669098,
0.08312419801950455,
0.09649445861577988,
-0.05822139233350754,
0.00909530371427536,
-0.0009642209624871612,
0.003173605538904667,
-0.061057280749082565,
0.19070598483085632,
0.08569896966218948,
0.050324928015470505,
-0.07307818531990051,
0.0835479125380516,
0.02573074772953987,
-0.0808318555355072,
0.002578729996457696,
0.02989996038377285,
-0.06100577488541603,
-0.04711603373289108,
0.056641679257154465,
0.17590466141700745,
-0.019638240337371826,
-0.052817586809396744,
-0.15319590270519257,
-0.10319401323795319,
0.052231863141059875,
0.17720772325992584,
0.09928938746452332,
0.011824246495962143,
-0.015560406260192394,
0.03764193505048752,
-0.12035990506410599,
0.11392180621623993,
0.038055967539548874,
0.08253764361143112,
-0.15228131413459778,
0.10657879710197449,
-0.004222555551677942,
0.0139384800568223,
-0.021709267050027847,
0.05366765335202217,
-0.12366961687803268,
0.004869056865572929,
-0.16115553677082062,
-0.03251505270600319,
-0.04248606041073799,
0.00038847618270665407,
0.016223041340708733,
-0.0780189260840416,
-0.07620929181575775,
0.02351570315659046,
-0.10268716514110565,
-0.014264335855841637,
0.06091580539941788,
0.05153145641088486,
-0.14620642364025116,
-0.03192108869552612,
0.04174608364701271,
-0.06939243525266647,
0.06183396279811859,
0.03298024833202362,
0.02971559762954712,
0.05153597518801689,
-0.18594549596309662,
0.023890774697065353,
0.05430162698030472,
0.008871623314917088,
0.05333299934864044,
-0.09231086075305939,
-0.02903369627892971,
-0.0045629809610545635,
0.05498158186674118,
0.009396152570843697,
0.04742884263396263,
-0.12208909541368484,
-0.006536278873682022,
-0.04056916385889053,
-0.06674061715602875,
-0.060410063713788986,
0.012389244511723518,
0.10631756484508514,
0.006140247918665409,
0.2003929764032364,
-0.08034875988960266,
0.01971765235066414,
-0.21828030049800873,
0.006421102210879326,
0.0009078580187633634,
-0.07881254702806473,
-0.0955696851015091,
-0.039872486144304276,
0.053369294852018356,
-0.0663081631064415,
0.13214325904846191,
-0.03903914615511894,
0.032977212220430374,
0.036492105573415756,
-0.057273950427770615,
0.06256604194641113,
0.026984356343746185,
0.2606726586818695,
0.017259154468774796,
-0.026294171810150146,
0.014427727088332176,
0.03836307302117348,
0.09351077675819397,
0.07587286084890366,
0.17643071711063385,
0.18330498039722443,
-0.05255421623587608,
0.08821975439786911,
0.05581057444214821,
-0.05995520204305649,
-0.1149095892906189,
0.06790944933891296,
-0.02508026361465454,
0.06954899430274963,
-0.015715045854449272,
0.21034452319145203,
0.11567226052284241,
-0.1590530425310135,
0.012815553694963455,
-0.055513326078653336,
-0.08700840920209885,
-0.10042070597410202,
-0.03069598414003849,
-0.08540433645248413,
-0.17624931037425995,
0.016721447929739952,
-0.1270364373922348,
0.001148656359873712,
0.09995061159133911,
0.010746840387582779,
-0.015361901372671127,
0.21557922661304474,
0.030385876074433327,
0.05448276549577713,
0.03283162787556648,
-0.0024415564257651567,
-0.0372224859893322,
-0.07298217713832855,
-0.07409697026014328,
0.015388483181595802,
-0.03344576433300972,
0.02075573429465294,
-0.05580935627222061,
-0.057537730783224106,
0.04017962887883186,
-0.0042317514307796955,
-0.09807205945253372,
0.0029443101957440376,
0.029380904510617256,
0.04867691919207573,
0.056518010795116425,
0.015711145475506783,
0.024574652314186096,
-0.012388857081532478,
0.2190667688846588,
-0.08333645015954971,
-0.07089068740606308,
-0.11979837715625763,
0.19840994477272034,
0.0243499968200922,
0.013210986740887165,
0.01698974519968033,
-0.10116712003946304,
0.03034425526857376,
0.20663133263587952,
0.15971197187900543,
-0.08094584941864014,
-0.001310704625211656,
0.005433080717921257,
-0.010555037297308445,
-0.0763191431760788,
0.057375382632017136,
0.1257193237543106,
0.02287502959370613,
-0.08134513348340988,
-0.06990339607000351,
-0.045439016073942184,
-0.018028438091278076,
-0.042209699749946594,
0.039740122854709625,
0.04455721750855446,
0.007943157106637955,
-0.046565938740968704,
0.062073349952697754,
-0.03301776200532913,
-0.12776967883110046,
0.07783675193786621,
-0.1744745969772339,
-0.14384151995182037,
-0.01794668473303318,
0.1254720836877823,
-0.013855235651135445,
0.052839767187833786,
-0.04130734130740166,
-0.011217271909117699,
0.07731734216213226,
-0.0200643390417099,
-0.06387891620397568,
-0.08929909020662308,
0.08235743641853333,
-0.09098142385482788,
0.23255473375320435,
-0.03186991810798645,
0.06700818985700607,
0.14000444114208221,
0.034111734479665756,
-0.08409174531698227,
0.08454953879117966,
0.06634993851184845,
-0.10755924880504608,
0.011245387606322765,
0.07832124829292297,
-0.03013608790934086,
0.12205903232097626,
0.059464290738105774,
-0.15211297571659088,
0.00570900272578001,
-0.03888072445988655,
-0.072417251765728,
-0.07995426654815674,
-0.031972456723451614,
-0.06563299149274826,
0.1329375058412552,
0.18982073664665222,
-0.04638466611504555,
0.016028661280870438,
-0.044259920716285706,
0.045687485486269,
0.07130192965269089,
0.050666313618421555,
-0.03129382058978081,
-0.22839848697185516,
0.0557306744158268,
0.05764351412653923,
-0.02459322102367878,
-0.24021756649017334,
-0.0928330197930336,
0.021973462775349617,
-0.05945591256022453,
-0.07546738535165787,
0.06229817494750023,
0.130509614944458,
0.06353496015071869,
-0.05861582234501839,
-0.10889451205730438,
-0.08606508374214172,
0.15865565836429596,
-0.1339999884366989,
-0.09514786303043365
] |