| Column | Type |
| --- | --- |
| sha | null |
| last_modified | null |
| library_name | stringclasses (154 values) |
| text | stringlengths (1-900k) |
| metadata | stringlengths (2-348k) |
| pipeline_tag | stringclasses (45 values) |
| id | stringlengths (5-122) |
| tags | sequencelengths (1-1.84k) |
| created_at | stringlengths (25-25) |
| arxiv | sequencelengths (0-201) |
| languages | sequencelengths (0-1.83k) |
| tags_str | stringlengths (17-9.34k) |
| text_str | stringlengths (0-389k) |
| text_lists | sequencelengths (0-722) |
| processed_texts | sequencelengths (1-723) |
| tokens_length | sequencelengths (1-723) |
| input_texts | sequencelengths (1-61) |
| embeddings | sequencelengths (768-768) |
# dddsaty/KoSOLAR-10.7B_DPO_Adapter_Attach
**Explanation**
- Starting from the base model, applied DPO to a small number of layers using the open dataset listed below, and saved the resulting adapter weights
- Attached the base model and the tuned adapter together (see the sketch after this list)
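The two bullets above correspond roughly to the hedged sketch below: DPO with trl's `DPOTrainer` (circa trl 0.7, whose keyword layout this assumes) restricted to a few layers via a LoRA adapter, followed by re-attaching and merging the saved adapter with PEFT. The target modules, hyperparameters, output paths, and the assumption that the dataset exposes `prompt`/`chosen`/`rejected` columns are illustrative, not details taken from this card.

```python
from datasets import load_dataset
from peft import LoraConfig, PeftModel
from transformers import AutoModelForCausalLM, AutoTokenizer, TrainingArguments
from trl import DPOTrainer

base_id = "yanolja/KoSOLAR-10.7B-v0.3"
tokenizer = AutoTokenizer.from_pretrained(base_id)
model = AutoModelForCausalLM.from_pretrained(base_id, torch_dtype="auto", device_map="auto")

train_dataset = load_dataset("We-Want-GPU/Yi-Ko-DPO-Orca-DPO-Pairs", split="train")

# Restricting DPO to "a small number of layers" is approximated here by a LoRA
# adapter over a few projection modules; the card does not say which layers were tuned.
peft_config = LoraConfig(
    r=16,
    lora_alpha=32,
    lora_dropout=0.05,
    target_modules=["q_proj", "v_proj"],  # assumption
    task_type="CAUSAL_LM",
)

trainer = DPOTrainer(
    model,
    ref_model=None,  # with a PEFT adapter, the frozen base model serves as the reference
    beta=0.1,        # illustrative value
    args=TrainingArguments(
        output_dir="dpo_out",
        per_device_train_batch_size=1,
        gradient_accumulation_steps=8,
        learning_rate=5e-6,
        num_train_epochs=1,
    ),
    train_dataset=train_dataset,  # assumed to expose prompt/chosen/rejected columns
    tokenizer=tokenizer,
    peft_config=peft_config,
)
trainer.train()
trainer.model.save_pretrained("dpo_adapter")  # saves only the adapter weights

# Attach the saved adapter to a fresh copy of the base model and fold it in
base = AutoModelForCausalLM.from_pretrained(base_id, torch_dtype="auto", device_map="auto")
merged = PeftModel.from_pretrained(base, "dpo_adapter").merge_and_unload()
merged.save_pretrained("KoSOLAR-10.7B_DPO_Adapter_Attach")
```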
**Base Model**
- [yanolja/KoSOLAR-10.7B-v0.3](https://huggingface.co/yanolja/KoSOLAR-10.7B-v0.3)
**Used Corpus**
- [We-Want-GPU/Yi-Ko-DPO-Orca-DPO-Pairs](https://huggingface.co/datasets/We-Want-GPU/Yi-Ko-DPO-Orca-DPO-Pairs)
**Log**
- 2024.02.13: Initial version uploaded
**LICENSE**
- Apache 2.0

**Record metadata**
- Metadata: `{"language": ["ko"], "license": "apache-2.0", "datasets": "We-Want-GPU/Yi-Ko-DPO-Orca-DPO-Pairs", "pipeline_tag": "text-generation"}`
- Pipeline tag: text-generation
- Model id: dddsaty/KoSOLAR-10.7B_DPO_Adapter_Attach
- Tags: transformers, safetensors, llama, text-generation, conversational, ko, dataset:We-Want-GPU/Yi-Ko-DPO-Orca-DPO-Pairs, license:apache-2.0, autotrain_compatible, endpoints_compatible, text-generation-inference, region:us
- Created at: 2024-02-13T08:44:18+00:00
- Languages: ko
# controlnet-saeu5407/controlnet-landmark
These are ControlNet weights trained on stabilityai/stable-diffusion-2-1-base with a new type of conditioning.
You can find some example images below.
prompt: a women wearing white shirt

prompt: a women wearing white shirt

| {"license": "creativeml-openrail-m", "tags": ["stable-diffusion", "stable-diffusion-diffusers", "text-to-image", "diffusers", "controlnet"], "base_model": "stabilityai/stable-diffusion-2-1-base", "inference": true} | text-to-image | saeu5407/controlnet-landmark | [
"diffusers",
"safetensors",
"stable-diffusion",
"stable-diffusion-diffusers",
"text-to-image",
"controlnet",
"base_model:stabilityai/stable-diffusion-2-1-base",
"license:creativeml-openrail-m",
"diffusers:ControlNetModel",
"region:us"
] | 2024-02-13T08:44:30+00:00 | [] | [] | TAGS
# Model Trained Using AutoTrain
This model was trained using AutoTrain. For more information, please visit [AutoTrain](https://hf.co/docs/autotrain).
# Usage
```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_path = "PATH_TO_THIS_REPO"

tokenizer = AutoTokenizer.from_pretrained(model_path)
model = AutoModelForCausalLM.from_pretrained(
    model_path,
    device_map="auto",
    torch_dtype="auto",
).eval()

# Prompt content: "hi"
messages = [
    {"role": "user", "content": "hi"}
]

input_ids = tokenizer.apply_chat_template(
    conversation=messages, tokenize=True, add_generation_prompt=True, return_tensors="pt"
)
# Move inputs to whichever device device_map="auto" placed the model on
output_ids = model.generate(input_ids.to(model.device))
# Decode only the newly generated tokens, skipping the prompt tokens
response = tokenizer.decode(output_ids[0][input_ids.shape[1]:], skip_special_tokens=True)

# Model response: "Hello! How can I assist you today?"
print(response)
```

**Record metadata**
- Metadata: `{"license": "other", "tags": ["autotrain", "text-generation"], "widget": [{"text": "I love AutoTrain because "}]}`
- Pipeline tag: text-generation
- Model id: adarshheg/Llama-2-13b-finetuned-v1
- Tags: transformers, safetensors, llama, text-generation, autotrain, license:other, autotrain_compatible, endpoints_compatible, text-generation-inference, region:us
- Created at: 2024-02-13T08:45:39+00:00
null | null | transformers |
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
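Since the card leaves this section blank, here is a minimal sketch under stated assumptions: the repo id is taken from this row's metadata, and treating the checkpoint as a causal language model is a guess, because the card does not state the model type.

```python
# Minimal sketch, not the author's documented usage. The repo id comes from
# this row's metadata; AutoModelForCausalLM is an assumption, since the card
# leaves "Model type" blank -- swap the Auto class to match the actual task.
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "mahiatlinux/lora_model"  # from this row's metadata

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(repo_id)

inputs = tokenizer("Hello, how are you?", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```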
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed]
| {"library_name": "transformers", "tags": []} | null | mahiatlinux/lora_model | [
"transformers",
"safetensors",
"arxiv:1910.09700",
"endpoints_compatible",
"region:us"
] | 2024-02-13T08:47:39+00:00 | [
"1910.09700"
] | [] | TAGS
#transformers #safetensors #arxiv-1910.09700 #endpoints_compatible #region-us
|
# Model Card for Model ID
## Model Details
### Model Description
This is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.
- Developed by:
- Funded by [optional]:
- Shared by [optional]:
- Model type:
- Language(s) (NLP):
- License:
- Finetuned from model [optional]:
### Model Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Downstream Use [optional]
### Out-of-Scope Use
## Bias, Risks, and Limitations
### Recommendations
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
## Training Details
### Training Data
### Training Procedure
#### Preprocessing [optional]
#### Training Hyperparameters
- Training regime:
#### Speeds, Sizes, Times [optional]
## Evaluation
### Testing Data, Factors & Metrics
#### Testing Data
#### Factors
#### Metrics
### Results
#### Summary
## Model Examination [optional]
## Environmental Impact
Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).
- Hardware Type:
- Hours used:
- Cloud Provider:
- Compute Region:
- Carbon Emitted:
## Technical Specifications [optional]
### Model Architecture and Objective
### Compute Infrastructure
#### Hardware
#### Software
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Model Card Authors [optional]
## Model Card Contact
| [
"# Model Card for Model ID",
"## Model Details",
"### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:",
"### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Downstream Use [optional]",
"### Out-of-Scope Use",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.",
"## How to Get Started with the Model\n\nUse the code below to get started with the model.",
"## Training Details",
"### Training Data",
"### Training Procedure",
"#### Preprocessing [optional]",
"#### Training Hyperparameters\n\n- Training regime:",
"#### Speeds, Sizes, Times [optional]",
"## Evaluation",
"### Testing Data, Factors & Metrics",
"#### Testing Data",
"#### Factors",
"#### Metrics",
"### Results",
"#### Summary",
"## Model Examination [optional]",
"## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:",
"## Technical Specifications [optional]",
"### Model Architecture and Objective",
"### Compute Infrastructure",
"#### Hardware",
"#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Model Card Authors [optional]",
"## Model Card Contact"
] | [
"TAGS\n#transformers #safetensors #arxiv-1910.09700 #endpoints_compatible #region-us \n",
"# Model Card for Model ID",
"## Model Details",
"### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:",
"### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Downstream Use [optional]",
"### Out-of-Scope Use",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.",
"## How to Get Started with the Model\n\nUse the code below to get started with the model.",
"## Training Details",
"### Training Data",
"### Training Procedure",
"#### Preprocessing [optional]",
"#### Training Hyperparameters\n\n- Training regime:",
"#### Speeds, Sizes, Times [optional]",
"## Evaluation",
"### Testing Data, Factors & Metrics",
"#### Testing Data",
"#### Factors",
"#### Metrics",
"### Results",
"#### Summary",
"## Model Examination [optional]",
"## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:",
"## Technical Specifications [optional]",
"### Model Architecture and Objective",
"### Compute Infrastructure",
"#### Hardware",
"#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Model Card Authors [optional]",
"## Model Card Contact"
] | [
31,
6,
3,
82,
28,
3,
4,
9,
9,
10,
42,
20,
3,
4,
5,
9,
11,
13,
3,
12,
5,
4,
5,
3,
4,
9,
53,
9,
8,
6,
3,
14,
8,
7,
9,
4
] | [
"passage: TAGS\n#transformers #safetensors #arxiv-1910.09700 #endpoints_compatible #region-us \n# Model Card for Model ID## Model Details### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Downstream Use [optional]### Out-of-Scope Use## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.## How to Get Started with the Model\n\nUse the code below to get started with the model.## Training Details### Training Data### Training Procedure#### Preprocessing [optional]#### Training Hyperparameters\n\n- Training regime:#### Speeds, Sizes, Times [optional]## Evaluation### Testing Data, Factors & Metrics#### Testing Data#### Factors#### Metrics### Results#### Summary## Model Examination [optional]## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:## Technical Specifications [optional]### Model Architecture and Objective### Compute Infrastructure#### Hardware#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Model Card Authors [optional]## Model Card Contact"
] | [
-0.06646376848220825,
0.2168014943599701,
-0.00225935154594481,
0.023818302899599075,
0.1271018385887146,
-0.001635765191167593,
0.04218708351254463,
0.13324736058712006,
-0.020175931975245476,
0.11144465953111649,
0.046588581055402756,
0.09377603232860565,
0.09928803145885468,
0.18404334783554077,
0.04859916493296623,
-0.2059975117444992,
0.007056170143187046,
-0.09090408682823181,
0.014076028019189835,
0.1116579994559288,
0.13719257712364197,
-0.10291384905576706,
0.08272874355316162,
-0.04045208916068077,
-0.02019004337489605,
0.00012576708104461432,
-0.09259183704853058,
-0.07032395154237747,
0.06885425746440887,
0.06264153122901917,
0.051234472543001175,
0.001456156256608665,
0.09140396863222122,
-0.2864592671394348,
0.017265573143959045,
0.08406311273574829,
0.0027674848679453135,
0.06290827691555023,
0.07236549258232117,
-0.07389893382787704,
0.11328595131635666,
-0.08021481335163116,
0.13019037246704102,
0.08625296503305435,
-0.062064990401268005,
-0.23071379959583282,
-0.07525765895843506,
0.0963398814201355,
0.12251301854848862,
0.06215599179267883,
-0.022921854630112648,
0.15455181896686554,
-0.06248689442873001,
0.012971068732440472,
0.1294165402650833,
-0.11526761949062347,
-0.05572471022605896,
0.061741601675748825,
0.11775490641593933,
0.10740239918231964,
-0.14110268652439117,
-0.0017287094378843904,
0.04900608956813812,
0.029121357947587967,
0.08589313924312592,
0.022661056369543076,
0.12003941088914871,
0.04652795568108559,
-0.13695219159126282,
-0.04037507623434067,
0.12011898308992386,
0.038862764835357666,
-0.06446044892072678,
-0.2168138176202774,
-0.006778308190405369,
-0.0601806715130806,
-0.014732478186488152,
-0.07019448280334473,
0.039128515869379044,
-0.02470310963690281,
0.07317749410867691,
-0.04465159401297569,
-0.1063927412033081,
-0.0421026237308979,
0.0892222449183464,
0.07748593389987946,
0.011527054943144321,
-0.02519804798066616,
0.04627908393740654,
0.13455867767333984,
0.05402068421244621,
-0.10399353504180908,
-0.07017925381660461,
-0.06942764669656754,
-0.09420394152402878,
-0.04035796597599983,
0.056760527193546295,
0.031942449510097504,
0.02665667235851288,
0.22703726589679718,
0.016653569415211678,
0.04155244305729866,
0.0224777739495039,
0.01032855175435543,
0.043662428855895996,
0.0955500528216362,
-0.05303520709276199,
-0.15660029649734497,
-0.04072032496333122,
0.09077946096658707,
-0.0027527001220732927,
-0.036689214408397675,
-0.03966725245118141,
0.03849169611930847,
0.06843466311693192,
0.13122352957725525,
0.07552056759595871,
-0.017929591238498688,
-0.04813180863857269,
-0.030096933245658875,
0.23523783683776855,
-0.1493375599384308,
0.04426715523004532,
-0.02271856553852558,
-0.01804111897945404,
-0.03908449783921242,
0.03597262129187584,
0.022118929773569107,
-0.000004518366949923802,
0.09706240892410278,
-0.058981191366910934,
-0.05378659814596176,
-0.10168042778968811,
-0.03272576630115509,
0.04088849574327469,
-0.013975566253066063,
-0.010589460842311382,
-0.09025166928768158,
-0.09490354359149933,
-0.04766594246029854,
0.05537205561995506,
-0.05123869329690933,
-0.03770573064684868,
0.009465423412621021,
-0.08151785284280777,
-0.005444355774670839,
-0.005417742300778627,
0.10699385404586792,
-0.03222226724028587,
0.04445803165435791,
-0.027600755915045738,
0.05225523188710213,
0.09919606149196625,
0.031576547771692276,
-0.0773419588804245,
0.0561848059296608,
-0.22559374570846558,
0.07503069192171097,
-0.11481974273920059,
0.04335082694888115,
-0.1704932004213333,
-0.042439818382263184,
0.005444696638733149,
0.0139949731528759,
0.013206101022660732,
0.12720820307731628,
-0.19255615770816803,
-0.01654396951198578,
0.13260798156261444,
-0.09212633967399597,
-0.118110790848732,
0.07884611934423447,
-0.029701577499508858,
0.1624738723039627,
0.04682036489248276,
-0.027025915682315826,
0.09224298596382141,
-0.16434773802757263,
-0.07092688232660294,
-0.00949116237461567,
-0.01727987825870514,
0.12109188735485077,
0.07512219995260239,
-0.05991523340344429,
0.046571120619773865,
0.02832140028476715,
-0.038078423589468,
-0.04424772411584854,
-0.050857074558734894,
-0.10884185880422592,
-0.01070026308298111,
-0.08987759798765182,
0.04065500199794769,
-0.01250192429870367,
-0.07916021347045898,
-0.029885273426771164,
-0.18612512946128845,
-0.0030564051121473312,
0.10038342326879501,
0.0035033065360039473,
-0.005652366206049919,
-0.08666291832923889,
0.026358824223279953,
-0.03112892620265484,
-0.008404186926782131,
-0.16764774918556213,
-0.04399421438574791,
0.046902090311050415,
-0.16094985604286194,
0.020117372274398804,
-0.06413903087377548,
0.06334125250577927,
0.03641495108604431,
-0.05590536445379257,
-0.0248766727745533,
-0.01730942726135254,
0.011945613659918308,
-0.05083848536014557,
-0.18994836509227753,
-0.056277405470609665,
-0.037882111966609955,
0.149809330701828,
-0.25956398248672485,
0.032966937869787216,
0.051140617579221725,
0.14649195969104767,
0.00406361510977149,
-0.05115427449345589,
0.01429014839231968,
-0.05360214412212372,
-0.054652128368616104,
-0.06746816635131836,
-0.006135428790003061,
-0.027576493099331856,
-0.05147203803062439,
0.019243421033024788,
-0.1755700707435608,
-0.021410830318927765,
0.09424154460430145,
0.12876708805561066,
-0.1486445665359497,
-0.018640631809830666,
-0.048725154250860214,
-0.06339836865663528,
-0.0715010017156601,
-0.07038594037294388,
0.10712739825248718,
0.0513901449739933,
0.04796046018600464,
-0.07435787469148636,
-0.07092321664094925,
0.02726263552904129,
0.006906150374561548,
-0.03382374346256256,
0.08727246522903442,
0.05199531093239784,
-0.09209315478801727,
0.0756213590502739,
0.1092359870672226,
0.07177663594484329,
0.09363535046577454,
0.01574566215276718,
-0.11756632477045059,
-0.028492970392107964,
0.036266472190618515,
0.02740776725113392,
0.1465986967086792,
-0.05952361226081848,
0.04016614332795143,
0.04494241625070572,
-0.04170418903231621,
0.022319864481687546,
-0.08787637203931808,
0.024075502529740334,
0.025203049182891846,
-0.0034381982404738665,
0.06284574419260025,
-0.02525499276816845,
-0.0050758360885083675,
0.07016654312610626,
0.047779910266399384,
0.04621000960469246,
0.009655474685132504,
-0.01720241829752922,
-0.1047825813293457,
0.16950392723083496,
-0.0951867327094078,
-0.269941508769989,
-0.17632324993610382,
0.026197833940386772,
0.04035249724984169,
-0.022378476336598396,
0.031619444489479065,
-0.07056326419115067,
-0.10630585998296738,
-0.1060405746102333,
-0.002429972169920802,
0.01714223250746727,
-0.06364088505506516,
-0.0741225928068161,
0.07348573952913284,
0.04382912442088127,
-0.14902326464653015,
0.038552410900592804,
0.055694397538900375,
-0.057955220341682434,
-0.0233661737293005,
0.09118817001581192,
0.12397737801074982,
0.14583967626094818,
-0.021366750821471214,
-0.028626007959246635,
0.029004426673054695,
0.19620531797409058,
-0.13469526171684265,
0.10371150821447372,
0.13814030587673187,
-0.04545360431075096,
0.08360563963651657,
0.1560150384902954,
0.029186224564909935,
-0.08317049592733383,
0.05044832453131676,
0.04082648828625679,
-0.043159641325473785,
-0.2666129767894745,
-0.0534592866897583,
0.012832709588110447,
-0.06255637854337692,
0.09786593168973923,
0.10183793306350708,
0.11542957276105881,
0.034910861402750015,
-0.07166364789009094,
-0.043925940990448,
-0.0058974819257855415,
0.11737963557243347,
-0.05490213260054588,
-0.012639665976166725,
0.07686592638492584,
-0.05086168646812439,
0.005355054512619972,
0.10266812145709991,
0.02973790094256401,
0.17442677915096283,
0.020399179309606552,
0.11231429129838943,
0.06195578724145889,
0.08633565157651901,
0.0007386076031252742,
0.02951662428677082,
0.05147615820169449,
0.017203815281391144,
-0.002300140680745244,
-0.10421168059110641,
-0.006156572140753269,
0.1449710875749588,
0.028103826567530632,
0.029669636860489845,
-0.0018948549404740334,
-0.005003341939300299,
0.05121048167347908,
0.1746254414319992,
-0.011592294089496136,
-0.22072425484657288,
-0.0845772922039032,
0.06936841458082199,
-0.06218599155545235,
-0.12968985736370087,
-0.026130788028240204,
0.045467354357242584,
-0.17519839107990265,
0.026703642681241035,
-0.027433741837739944,
0.0919293761253357,
-0.09345759451389313,
-0.02221956104040146,
0.03687324374914169,
0.084866963326931,
-0.014529162086546421,
0.08703910559415817,
-0.14498743414878845,
0.11886418610811234,
0.02978132851421833,
0.09024628251791,
-0.11081171780824661,
0.07909037172794342,
-0.007550720125436783,
0.009180475026369095,
0.19379350543022156,
-0.011335089802742004,
-0.03514958545565605,
-0.08774717897176743,
-0.11210042238235474,
-0.013537433929741383,
0.12687496840953827,
-0.1243172138929367,
0.08773399889469147,
-0.015198243781924248,
-0.044079482555389404,
0.00937260314822197,
-0.12100647389888763,
-0.17273177206516266,
-0.19628387689590454,
0.05585884302854538,
-0.09575839340686798,
0.025643249973654747,
-0.11914430558681488,
-0.07089093327522278,
-0.02952558360993862,
0.241120383143425,
-0.1745356321334839,
-0.06510113179683685,
-0.1468164622783661,
-0.046294767409563065,
0.1662203073501587,
-0.04437198117375374,
0.0718095526099205,
-0.0208172257989645,
0.20345525443553925,
0.005988610442727804,
-0.004939318168908358,
0.06724198162555695,
-0.08892562240362167,
-0.16873881220817566,
-0.06771010160446167,
0.1510489284992218,
0.11680185794830322,
0.04907919466495514,
-0.002248800592496991,
0.0011772146681323647,
-0.016943959519267082,
-0.1137804463505745,
-0.0033210667315870523,
0.16037839651107788,
0.03878779336810112,
0.025986969470977783,
-0.05243593826889992,
-0.08797456324100494,
-0.06899320334196091,
-0.06853509694337845,
0.06221301481127739,
0.19590823352336884,
-0.10376439243555069,
0.1700313836336136,
0.147536963224411,
-0.07305635511875153,
-0.23175598680973053,
0.035342130810022354,
0.04983805492520332,
0.0014306638622656465,
0.04886869341135025,
-0.18252557516098022,
0.10521943867206573,
0.019543392583727837,
-0.05505957826972008,
0.13485197722911835,
-0.1557481735944748,
-0.1552847921848297,
0.0722852572798729,
0.03904085233807564,
-0.22423844039440155,
-0.1354004591703415,
-0.09622503817081451,
-0.05825018882751465,
-0.14065024256706238,
0.06054598465561867,
-0.002136280992999673,
0.015948504209518433,
0.03500790148973465,
-0.0015643214574083686,
0.027123261243104935,
-0.058935679495334625,
0.18609118461608887,
-0.004065449349582195,
0.020676052197813988,
-0.060264769941568375,
-0.0478842556476593,
0.09839435666799545,
-0.06130504235625267,
0.12208222597837448,
0.004057085141539574,
0.01594383642077446,
-0.10362856835126877,
-0.048314861953258514,
-0.04328322783112526,
0.05154227837920189,
-0.07548051327466965,
-0.10070807486772537,
-0.043625857681035995,
0.08841723203659058,
0.07005169242620468,
-0.03383097052574158,
0.00549331633374095,
-0.07189501076936722,
0.10019614547491074,
0.17795267701148987,
0.17573626339435577,
0.009926567785441875,
-0.07241068035364151,
0.01677953451871872,
-0.04142116755247116,
0.044231921434402466,
-0.2513144314289093,
0.03756171092391014,
0.06098250672221184,
0.029438555240631104,
0.09217222779989243,
-0.020435843616724014,
-0.1820858269929886,
-0.04050002992153168,
0.08094815909862518,
-0.05452597141265869,
-0.22617179155349731,
-0.019085140898823738,
0.0954197570681572,
-0.2020406424999237,
-0.007372708059847355,
0.03995226323604584,
-0.048725228756666183,
-0.023169852793216705,
0.00010950004070764408,
0.06317184865474701,
0.002471912419423461,
0.09773622453212738,
0.0735151618719101,
0.09715340286493301,
-0.08337292820215225,
0.10562895983457565,
0.10150538384914398,
-0.09572599828243256,
0.03605884686112404,
0.06754924356937408,
-0.05300498008728027,
-0.043293699622154236,
0.03665391728281975,
0.033023297786712646,
0.005234600510448217,
-0.060321882367134094,
0.013913018628954887,
-0.036497246474027634,
0.044923391193151474,
0.08326134830713272,
0.03754979372024536,
-0.013354414142668247,
0.06462216377258301,
0.03401726484298706,
-0.10898099094629288,
0.10366570204496384,
0.01731540448963642,
0.04105307161808014,
-0.08384523540735245,
-0.019968897104263306,
0.035425446927547455,
0.030576206743717194,
-0.01765924133360386,
-0.02306121215224266,
-0.02860277332365513,
-0.01614218018949032,
-0.14299540221691132,
-0.023106401786208153,
-0.07243485748767853,
0.006181265693157911,
0.014656842686235905,
-0.031884219497442245,
-0.011233693920075893,
0.02475680410861969,
-0.06979699432849884,
-0.07426341623067856,
-0.006949664559215307,
0.09833318740129471,
-0.15115703642368317,
0.008848577737808228,
0.06907843053340912,
-0.11088496446609497,
0.08190931379795074,
-0.008411259390413761,
0.016245156526565552,
0.022527478635311127,
-0.15448406338691711,
0.05601610988378525,
0.0008648968650959432,
0.01916889287531376,
0.025886621326208115,
-0.16471809148788452,
0.004104440100491047,
-0.04661374166607857,
-0.02149827405810356,
-0.00004464812809601426,
-0.02647159807384014,
-0.12325995415449142,
0.06858719140291214,
-0.015622655861079693,
-0.035931166261434555,
-0.02701525390148163,
0.0539589487016201,
0.07888586074113846,
-0.027474910020828247,
0.10445091128349304,
-0.008690856397151947,
0.04941811040043831,
-0.16801609098911285,
-0.02470702864229679,
-0.04982255399227142,
0.019377702847123146,
0.009884213097393513,
-0.007693959400057793,
0.04183054715394974,
-0.00976533442735672,
0.21883612871170044,
-0.05075952783226967,
0.1607085019350052,
0.05847611650824547,
-0.017352959141135216,
-0.0007513365126214921,
0.06180921941995621,
0.05997028574347496,
0.04658793285489082,
0.009480604901909828,
0.023740366101264954,
-0.022450892254710197,
-0.006695089396089315,
-0.15932634472846985,
0.01890849508345127,
0.14999441802501678,
0.06301083415746689,
0.024745315313339233,
0.05866100639104843,
-0.12775006890296936,
-0.12135478109121323,
0.09311001747846603,
-0.026755332946777344,
0.00928465835750103,
-0.08245618641376495,
0.1358020007610321,
0.14980104565620422,
-0.14000412821769714,
0.05256148427724838,
-0.06134212389588356,
-0.05217423290014267,
-0.10388828068971634,
-0.12032219022512436,
-0.05887215584516525,
-0.053666237741708755,
0.002330566756427288,
-0.03760887682437897,
0.054546963423490524,
0.03344334661960602,
-0.009351172484457493,
-0.00022941511997487396,
0.13597318530082703,
-0.019751882180571556,
-0.0028988157864660025,
0.048313532024621964,
0.03693558648228645,
0.02373051457107067,
-0.05275435373187065,
0.02940409444272518,
0.02539868652820587,
0.032232340425252914,
0.06546790152788162,
0.033412106335163116,
-0.047448933124542236,
0.03804153576493263,
-0.0025254099164158106,
-0.11207924783229828,
0.019641218706965446,
-0.00460948096588254,
-0.0742158442735672,
0.1268945336341858,
0.0407399944961071,
0.010224059224128723,
-0.03741471841931343,
0.24361543357372284,
-0.06653323769569397,
-0.06378097087144852,
-0.13251738250255585,
0.10491154342889786,
-0.0027236645109951496,
0.06476365029811859,
0.023412218317389488,
-0.1284150779247284,
0.005243356805294752,
0.13858191668987274,
0.12181595712900162,
0.0045748427510261536,
0.009228081442415714,
0.0518609918653965,
0.0025186820421367884,
-0.06998204439878464,
0.054019294679164886,
0.06992026418447495,
0.12919506430625916,
-0.07847554981708527,
0.07680778950452805,
0.0006860480643808842,
-0.08370215445756912,
-0.02947772853076458,
0.11312682181596756,
-0.0409729965031147,
0.03491825982928276,
-0.047444481402635574,
0.10916327685117722,
-0.05787910893559456,
-0.29412412643432617,
0.02350960113108158,
-0.09588567912578583,
-0.15202060341835022,
-0.018367812037467957,
0.05944539234042168,
-0.02624768204987049,
0.018029648810625076,
0.06971040368080139,
-0.06011629104614258,
0.20098382234573364,
0.0335683599114418,
-0.07864278554916382,
-0.0664360448718071,
0.04837050288915634,
-0.06564252078533173,
0.2949807047843933,
0.008418165147304535,
0.02863333560526371,
0.10770907253026962,
-0.03253700211644173,
-0.18271861970424652,
0.010723991319537163,
0.1133992001414299,
-0.08056149631738663,
0.08200647681951523,
0.19000613689422607,
-0.012578671798110008,
0.1209007054567337,
0.05294662341475487,
-0.047376248985528946,
0.04217283055186272,
-0.03389401361346245,
-0.051268599927425385,
-0.10752558708190918,
0.058453381061553955,
-0.05909625440835953,
0.15447644889354706,
0.10152646154165268,
-0.05671518296003342,
-0.004550917539745569,
-0.05555408447980881,
0.04875178262591362,
0.01804669201374054,
0.12263146042823792,
0.02951994352042675,
-0.1865430772304535,
0.032826557755470276,
-0.01144319772720337,
0.10186848044395447,
-0.25588861107826233,
-0.08421015739440918,
0.08833149075508118,
-0.011924264021217823,
-0.05105875805020332,
0.10560628771781921,
0.057650718837976456,
0.04243382066488266,
-0.043439045548439026,
-0.10480839014053345,
-0.02186836116015911,
0.14663739502429962,
-0.1469624787569046,
-0.025013303384184837
] |
null | null | transformers |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# santoshdahal/whispher-ne-medium
This model is a fine-tuned version of [openai/whisper-medium](https://huggingface.co/openai/whisper-medium) on the Common Voice 11.0 dataset.
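The card gives no usage snippet, so here is a minimal inference sketch; the repo id comes from this row's metadata, and `sample.wav` is a hypothetical input file.

```python
# Minimal inference sketch, assuming a local audio file; the repo id comes
# from this row's metadata and "sample.wav" is a hypothetical input.
from transformers import pipeline

asr = pipeline(
    "automatic-speech-recognition",
    model="santoshdahal/whisper-medium-nepali",
)
result = asr("sample.wav")  # the pipeline decodes and resamples the file itself
print(result["text"])
```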
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
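The row's metadata does, however, name the `mozilla-foundation/common_voice_11_0` dataset. A hedged sketch of pulling the Nepali split follows; the `"ne-NP"` config name is an assumption, and the dataset is gated on the Hub, so you must accept its terms and log in (`huggingface-cli login`) first.

```python
# Hedged sketch: "ne-NP" as the Nepali config name is an assumption; the
# dataset is gated, so accept its terms on the Hub and log in beforehand.
from datasets import Audio, load_dataset

cv = load_dataset(
    "mozilla-foundation/common_voice_11_0",
    "ne-NP",
    split="train",
)
cv = cv.cast_column("audio", Audio(sampling_rate=16_000))  # Whisper expects 16 kHz
print(cv[0]["sentence"])
```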
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 100
- num_epochs: 3
- mixed_precision_training: Native AMP
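The hyperparameters above map almost one-to-one onto `transformers.Seq2SeqTrainingArguments`; a sketch of that mapping, with `output_dir` as a placeholder of my own:

```python
# Sketch reconstructing the run config above with Seq2SeqTrainingArguments;
# output_dir is a placeholder, everything else mirrors the list. Adam with
# betas (0.9, 0.999) and epsilon 1e-8 is the Trainer's default optimizer,
# and fp16=True corresponds to "mixed_precision_training: Native AMP".
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="./whisper-medium-nepali",  # placeholder
    learning_rate=1e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    warmup_steps=100,
    num_train_epochs=3,
    fp16=True,
)
```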
### Framework versions
- Transformers 4.38.0.dev0
- Pytorch 2.1.2+cu121
- Datasets 2.15.0
- Tokenizers 0.15.0
| {"language": ["np"], "license": "apache-2.0", "tags": ["hf-asr-leaderboard", "generated_from_trainer"], "datasets": ["mozilla-foundation/common_voice_11_0"], "base_model": "openai/whisper-medium", "model-index": [{"name": "santoshdahal/whispher-ne-medium", "results": []}]} | automatic-speech-recognition | santoshdahal/whisper-medium-nepali | [
"transformers",
"safetensors",
"whisper",
"automatic-speech-recognition",
"hf-asr-leaderboard",
"generated_from_trainer",
"np",
"dataset:mozilla-foundation/common_voice_11_0",
"base_model:openai/whisper-medium",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] | 2024-02-13T08:49:03+00:00 | [] | [
"np"
] | TAGS
#transformers #safetensors #whisper #automatic-speech-recognition #hf-asr-leaderboard #generated_from_trainer #np #dataset-mozilla-foundation/common_voice_11_0 #base_model-openai/whisper-medium #license-apache-2.0 #endpoints_compatible #region-us
|
# santoshdahal/whispher-ne-medium
This model is a fine-tuned version of openai/whisper-medium on the Common Voice 11.0 dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 100
- num_epochs: 3
- mixed_precision_training: Native AMP
### Framework versions
- Transformers 4.38.0.dev0
- Pytorch 2.1.2+cu121
- Datasets 2.15.0
- Tokenizers 0.15.0
| [
"# santoshdahal/whispher-ne-medium\n\nThis model is a fine-tuned version of openai/whisper-medium on the Common Voice 11.0 dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 1e-05\n- train_batch_size: 16\n- eval_batch_size: 8\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_steps: 100\n- num_epochs: 3\n- mixed_precision_training: Native AMP",
"### Framework versions\n\n- Transformers 4.38.0.dev0\n- Pytorch 2.1.2+cu121\n- Datasets 2.15.0\n- Tokenizers 0.15.0"
] | [
"TAGS\n#transformers #safetensors #whisper #automatic-speech-recognition #hf-asr-leaderboard #generated_from_trainer #np #dataset-mozilla-foundation/common_voice_11_0 #base_model-openai/whisper-medium #license-apache-2.0 #endpoints_compatible #region-us \n",
"# santoshdahal/whispher-ne-medium\n\nThis model is a fine-tuned version of openai/whisper-medium on the Common Voice 11.0 dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 1e-05\n- train_batch_size: 16\n- eval_batch_size: 8\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_steps: 100\n- num_epochs: 3\n- mixed_precision_training: Native AMP",
"### Framework versions\n\n- Transformers 4.38.0.dev0\n- Pytorch 2.1.2+cu121\n- Datasets 2.15.0\n- Tokenizers 0.15.0"
] | [
96,
42,
6,
12,
8,
3,
118,
38
] | [
"passage: TAGS\n#transformers #safetensors #whisper #automatic-speech-recognition #hf-asr-leaderboard #generated_from_trainer #np #dataset-mozilla-foundation/common_voice_11_0 #base_model-openai/whisper-medium #license-apache-2.0 #endpoints_compatible #region-us \n# santoshdahal/whispher-ne-medium\n\nThis model is a fine-tuned version of openai/whisper-medium on the Common Voice 11.0 dataset.## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 1e-05\n- train_batch_size: 16\n- eval_batch_size: 8\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_steps: 100\n- num_epochs: 3\n- mixed_precision_training: Native AMP### Framework versions\n\n- Transformers 4.38.0.dev0\n- Pytorch 2.1.2+cu121\n- Datasets 2.15.0\n- Tokenizers 0.15.0"
] | [
-0.11720985174179077,
0.16260674595832825,
-0.002518775872886181,
0.02464471198618412,
0.12164223194122314,
0.011919183656573296,
0.11956443637609482,
0.1099800169467926,
-0.036785539239645004,
0.11107814311981201,
0.0682455450296402,
0.04145319014787674,
0.06719842553138733,
0.12649276852607727,
-0.018313385546207428,
-0.2522006034851074,
0.03771863505244255,
-0.017399661242961884,
-0.03949730470776558,
0.07414153218269348,
0.11002103239297867,
-0.060138069093227386,
0.051153529435396194,
0.045361775904893875,
-0.11160861700773239,
0.0033427285961806774,
-0.04791780188679695,
-0.05172006040811539,
0.07207825779914856,
0.028944458812475204,
-0.0011736484011635184,
0.006364823319017887,
0.06185005232691765,
-0.26503822207450867,
0.010268502868711948,
0.04872521385550499,
0.027308376505970955,
0.05645066872239113,
0.05626767873764038,
0.016744378954172134,
0.07457197457551956,
-0.09707134962081909,
0.07352212816476822,
0.07265029847621918,
-0.04253387823700905,
-0.19843609631061554,
-0.07905556261539459,
0.08938128501176834,
0.10042712092399597,
0.0874318927526474,
-0.010188092477619648,
0.12748271226882935,
-0.009716944769024849,
0.07051140069961548,
0.148736372590065,
-0.2327622026205063,
-0.05191722884774208,
-0.007756673730909824,
0.059464458376169205,
0.08241463452577591,
-0.07907037436962128,
0.007178196217864752,
0.05743394419550896,
0.019412171095609665,
0.04924257472157478,
0.0067545101046562195,
0.05431273207068443,
-0.06174818053841591,
-0.12930920720100403,
-0.04777142405509949,
0.2405143678188324,
0.09513457119464874,
-0.06212255358695984,
-0.14229382574558258,
-0.03645893558859825,
-0.06307968497276306,
-0.038379233330488205,
-0.03237752243876457,
0.006116121541708708,
-0.022717641666531563,
-0.012227003462612629,
-0.05314825102686882,
-0.05120523273944855,
-0.04265948385000229,
0.06358072906732559,
0.18899281322956085,
0.042359475046396255,
-0.020834175869822502,
0.005396321415901184,
0.05364726856350899,
-0.0017558897379785776,
-0.1419837921857834,
-0.054575998336076736,
0.010759590193629265,
-0.05205493047833443,
-0.034097298979759216,
-0.038614172488451004,
-0.0062579079531133175,
0.04983791336417198,
0.17803670465946198,
0.014174206182360649,
0.08978376537561417,
0.01675538159906864,
-0.013054532930254936,
-0.0006417699041776359,
0.12147528678178787,
-0.008767674677073956,
-0.07069747149944305,
0.026389330625534058,
0.10994599759578705,
0.0636158436536789,
-0.028613032773137093,
-0.08739109337329865,
-0.00040579467895440757,
0.10325386375188828,
0.06254533678293228,
0.019972888752818108,
0.003447698662057519,
-0.053134217858314514,
-0.0612453818321228,
0.06939435005187988,
-0.13589806854724884,
0.027230001986026764,
-0.001088203163817525,
-0.005956633016467094,
-0.023403115570545197,
-0.0011046425206586719,
0.03992556408047676,
-0.033598095178604126,
0.022527161985635757,
-0.07849648594856262,
-0.03863697126507759,
-0.02968851663172245,
-0.0124607402831316,
0.03212001547217369,
-0.024005314335227013,
0.025544073432683945,
-0.08368577808141708,
-0.12481541931629181,
-0.04759623110294342,
0.026103826239705086,
-0.029776504263281822,
-0.12062907218933105,
-0.022772882133722305,
-0.03973681479692459,
0.029086390510201454,
-0.0134400250390172,
0.06998102366924286,
-0.042176175862550735,
0.04098319262266159,
0.03888404369354248,
-0.01637713983654976,
0.018221817910671234,
0.04468582198023796,
-0.05337373539805412,
0.07015541195869446,
-0.07363723963499069,
0.10478930175304413,
-0.1173805296421051,
0.049953754991292953,
-0.12048129737377167,
-0.0789896547794342,
-0.04107219725847244,
-0.023880407214164734,
0.05152985081076622,
0.11885133385658264,
-0.09124144911766052,
-0.05503018572926521,
0.1331234574317932,
-0.0789308175444603,
-0.13355949521064758,
0.13562195003032684,
0.00994883757084608,
-0.008947550319135189,
0.04650038108229637,
0.16315001249313354,
0.15976053476333618,
-0.11561734229326248,
-0.03180558606982231,
0.0030359907541424036,
0.10969637334346771,
0.028422892093658447,
0.10197418928146362,
-0.04820141941308975,
-0.03337638080120087,
0.020229239016771317,
-0.04934709891676903,
0.0037395842373371124,
-0.037986017763614655,
-0.08174628764390945,
-0.05440876632928848,
-0.0839763656258583,
0.0018099042354151607,
0.009194218553602695,
-0.014979943633079529,
-0.07574769109487534,
-0.11754649877548218,
0.04553713649511337,
0.1467507928609848,
-0.07329630851745605,
0.015223347581923008,
-0.07967602461576462,
0.06102224439382553,
-0.04986347258090973,
-0.011393032968044281,
-0.15508978068828583,
-0.10530658811330795,
0.06520706415176392,
-0.11106118559837341,
0.0471380352973938,
-0.06365025043487549,
0.03902413696050644,
0.07044970989227295,
-0.021964147686958313,
-0.052022822201251984,
-0.07524434477090836,
0.0012963231420144439,
-0.07316137105226517,
-0.15147198736667633,
-0.07794205099344254,
-0.0300055593252182,
0.24327446520328522,
-0.19737869501113892,
0.01676032692193985,
0.020127521827816963,
0.13156187534332275,
0.007860534824430943,
-0.07827464491128922,
0.02445289120078087,
0.019361447542905807,
-0.0091474037617445,
-0.0909184142947197,
0.009377223439514637,
0.03888348862528801,
-0.12157196551561356,
-0.03350896015763283,
-0.19436663389205933,
0.0854819044470787,
0.06361160427331924,
0.10117229074239731,
-0.017480194568634033,
-0.05514230579137802,
-0.05210759863257408,
-0.05158543959259987,
-0.04720442742109299,
-0.037332285195589066,
0.1599297821521759,
-0.0062053147703409195,
0.08897977322340012,
-0.06820067763328552,
-0.055803339928388596,
0.023479951545596123,
0.002396080642938614,
-0.05884786322712898,
0.060881320387125015,
0.002875041915103793,
-0.12773020565509796,
0.08855217695236206,
0.0710626021027565,
-0.03403376042842865,
0.12704062461853027,
-0.05372830107808113,
-0.07388006150722504,
-0.02542073465883732,
0.023676184937357903,
0.010118368081748486,
0.08216378092765808,
-0.16771407425403595,
0.012325560674071312,
0.045822516083717346,
0.009436742402613163,
0.0410599559545517,
-0.1246720552444458,
0.030090922489762306,
0.03785054385662079,
-0.03436700999736786,
0.006023484747856855,
0.0014991600764915347,
-0.027453413233160973,
0.03812739998102188,
0.029080010950565338,
0.005262287333607674,
0.02041558548808098,
-0.014861301518976688,
-0.1090906634926796,
0.1474774032831192,
-0.11559309810400009,
-0.1409599930047989,
-0.1577489674091339,
0.015766562893986702,
-0.08678824454545975,
-0.00560406781733036,
0.04562085121870041,
-0.05734899267554283,
-0.06739931553602219,
-0.07822182774543762,
-0.03955943137407303,
-0.07106666266918182,
-0.00124304776545614,
0.10011595487594604,
-0.011822293512523174,
0.09945449978113174,
-0.11153507232666016,
-0.0009798503015190363,
0.023621780797839165,
-0.03404370695352554,
-0.06404273957014084,
0.012651164084672928,
0.10257460922002792,
0.11694686114788055,
0.04559668153524399,
0.019891606643795967,
-0.01568884216248989,
0.266680508852005,
-0.09580119699239731,
-0.013474729843437672,
0.15746665000915527,
-0.002740794327110052,
0.0628022849559784,
0.11564704775810242,
0.02685542032122612,
-0.0822080746293068,
0.023042742162942886,
0.027090394869446754,
-0.017569055780768394,
-0.23280984163284302,
-0.0652695968747139,
-0.06109783798456192,
-0.08003942668437958,
0.09857792407274246,
0.07314185798168182,
0.04828903079032898,
0.07846282422542572,
-0.07679527997970581,
0.03969307243824005,
0.003923905547708273,
0.10851911455392838,
0.12684492766857147,
0.04914787411689758,
0.08061627298593521,
-0.019077828153967857,
-0.0007689267513342202,
0.06221774220466614,
0.032113369554281235,
0.1841304749250412,
0.05132612958550453,
0.19380301237106323,
0.027418721467256546,
0.15551306307315826,
-0.007642160635441542,
0.007686158642172813,
0.028225334361195564,
0.021304849535226822,
0.028886497020721436,
-0.07609013468027115,
-0.05823388695716858,
0.05382142961025238,
0.054720859974622726,
0.037828411906957626,
-0.06404227763414383,
0.06033874303102493,
-0.0026298232842236757,
0.28148671984672546,
0.032086875289678574,
-0.2838555574417114,
-0.10433939099311829,
0.031196443364024162,
-0.04146267846226692,
-0.0768766701221466,
0.016101432964205742,
0.06661747395992279,
-0.12432779371738434,
0.1005699634552002,
-0.03587910532951355,
0.0927824079990387,
-0.04646514728665352,
-0.009583243168890476,
0.036238737404346466,
0.12586210668087006,
0.003696363652125001,
0.08974387496709824,
-0.15175436437129974,
0.16931244730949402,
0.018082166090607643,
0.09223508089780807,
-0.06199871003627777,
0.06208482012152672,
0.02761639654636383,
0.06633583456277847,
0.0868491604924202,
0.00893966294825077,
-0.0799141675233841,
-0.1253035068511963,
-0.1110963448882103,
0.03343256935477257,
0.11859910190105438,
-0.07688697427511215,
0.07254967093467712,
-0.059869445860385895,
0.0011854165932163596,
0.01805109903216362,
-0.0074464427307248116,
-0.1893790364265442,
-0.16733895242214203,
0.052432723343372345,
0.07779111713171005,
0.05050555244088173,
-0.09059002250432968,
-0.09897264838218689,
-0.04320353642106056,
0.16950343549251556,
-0.04342346265912056,
-0.0542544461786747,
-0.14709950983524323,
0.012522268109023571,
0.12042713165283203,
-0.07705177366733551,
0.03142588585615158,
-0.00707183126360178,
0.1908225119113922,
0.006021804641932249,
-0.06464594602584839,
0.05951875075697899,
-0.09693273156881332,
-0.15597350895404816,
-0.01662418805062771,
0.19384925067424774,
0.010375459678471088,
0.04358523339033127,
0.016408903524279594,
0.023580029606819153,
0.0033602246548980474,
-0.07911573350429535,
0.03387536108493805,
0.07443032413721085,
0.021753773093223572,
0.062135905027389526,
-0.040358759462833405,
0.014006387442350388,
-0.07531020790338516,
-0.027236150577664375,
0.11682979017496109,
0.2274046540260315,
-0.04514067620038986,
0.041589654982089996,
0.08481908589601517,
-0.07121824473142624,
-0.13159385323524475,
-0.012390774674713612,
0.12645314633846283,
0.01657726988196373,
0.04504423961043358,
-0.1902247965335846,
0.11087805777788162,
0.0965014100074768,
-0.049197275191545486,
0.05123647302389145,
-0.23927727341651917,
-0.13145051896572113,
0.09769438207149506,
0.07334008067846298,
-0.00993530172854662,
-0.1431470364332199,
-0.08895258605480194,
-0.0664524957537651,
-0.13666656613349915,
0.047831352800130844,
-0.02905929647386074,
0.09737894684076309,
0.01749972626566887,
0.057286519557237625,
0.028745360672473907,
-0.020719382911920547,
0.1742393523454666,
0.034997276961803436,
0.014131668023765087,
-0.04421207681298256,
0.07586685568094254,
0.024313004687428474,
-0.06987324357032776,
0.06719190627336502,
-0.07689247280359268,
0.059841740876436234,
-0.13559921085834503,
-0.03821665048599243,
-0.0349719375371933,
0.05724809691309929,
-0.037748850882053375,
-0.026861054822802544,
-0.018155962228775024,
0.01398793701082468,
0.07364928722381592,
0.0008348356932401657,
0.13441841304302216,
0.004068917594850063,
0.09624866396188736,
0.13159987330436707,
0.14035171270370483,
-0.04719996079802513,
-0.1365266889333725,
-0.055208802223205566,
-0.03297070786356926,
0.05206265673041344,
-0.05895004794001579,
0.04105746001005173,
0.07418414205312729,
0.053094156086444855,
0.12557466328144073,
0.00955499242991209,
-0.08361838757991791,
0.006701803300529718,
0.032899077981710434,
-0.06113241985440254,
-0.2626889944076538,
-0.06791378557682037,
0.05754777044057846,
-0.1811845749616623,
0.0308479443192482,
0.12082581967115402,
-0.03455014154314995,
-0.02194216102361679,
-0.012342189438641071,
0.02414088323712349,
-0.013811028562486172,
0.15494197607040405,
0.03488975763320923,
0.07944150269031525,
-0.08297151327133179,
0.1006556823849678,
0.07103250920772552,
-0.06870678812265396,
0.07019244879484177,
0.04547600448131561,
-0.0631144717335701,
-0.02023153193295002,
-0.0027142027392983437,
0.04524435102939606,
0.009392539039254189,
-0.06294248253107071,
-0.0460439957678318,
-0.10067815333604813,
0.03280111774802208,
0.023847142234444618,
0.03616364300251007,
-0.018214156851172447,
-0.013741383329033852,
0.03016914799809456,
-0.11603531241416931,
0.10208716243505478,
0.034218478947877884,
0.08930021524429321,
-0.18062633275985718,
-0.007266632281243801,
0.014974692836403847,
0.03504341468214989,
-0.004605810157954693,
-0.032392363995313644,
-0.054513849318027496,
-0.031638018786907196,
-0.12246707826852798,
0.009868809953331947,
-0.06995274871587753,
0.023183220997452736,
-0.008213872089982033,
-0.06526671350002289,
-0.06148703023791313,
0.08462207019329071,
-0.06133325397968292,
-0.08752654492855072,
-0.003900205483660102,
0.07781452685594559,
-0.10000468790531158,
0.023965155705809593,
0.05189449340105057,
-0.12493555247783661,
0.09120742976665497,
0.06618919223546982,
0.01019167061895132,
0.014175835065543652,
-0.03537898138165474,
-0.008414283394813538,
0.038443032652139664,
0.04172845557332039,
0.020992537960410118,
-0.14886890351772308,
-0.028341498225927353,
0.009310818277299404,
0.011200027540326118,
-0.013519814237952232,
0.07949984818696976,
-0.11168457567691803,
-0.05563262477517128,
-0.05150771886110306,
-0.04197808727622032,
-0.04978172853589058,
0.01796409673988819,
0.06852703541517258,
0.005105593241751194,
0.16223260760307312,
-0.0616815984249115,
0.02406190149486065,
-0.1973273605108261,
-0.008132820948958397,
-0.015265950001776218,
-0.03307687118649483,
-0.07433830201625824,
-0.014820675365626812,
0.0634588971734047,
-0.02990226447582245,
0.10530780255794525,
-0.08014483004808426,
0.08333536982536316,
0.036780767142772675,
0.011865864507853985,
0.0381600484251976,
-0.010633517988026142,
0.22161635756492615,
0.08435104042291641,
-0.0028187090065330267,
0.10772200673818588,
-0.043873388320207596,
0.06119145452976227,
0.04890996590256691,
0.07814372330904007,
0.12174353748559952,
-0.008491573855280876,
0.07101623713970184,
0.07571128010749817,
-0.07416976243257523,
-0.16764473915100098,
0.07425533980131149,
-0.01714623160660267,
0.12822210788726807,
-0.0035424635279923677,
0.13221003115177155,
0.14762060344219208,
-0.154582217335701,
0.035760387778282166,
-0.02329777181148529,
-0.08533866703510284,
-0.12250003963708878,
-0.15369467437267303,
-0.09337548911571503,
-0.1468837410211563,
0.014371576718986034,
-0.12271200120449066,
0.005252826493233442,
0.07058179378509521,
-0.003315854584798217,
0.019546909257769585,
0.15844927728176117,
-0.04172546789050102,
-0.01527042593806982,
0.053652506321668625,
-0.03017086163163185,
-0.021631862968206406,
-0.028964471071958542,
-0.08978200703859329,
0.08278903365135193,
-0.0009242126834578812,
0.10146060585975647,
-0.037347134202718735,
-0.01887552998960018,
0.06338071823120117,
-0.02026309445500374,
-0.09512682259082794,
0.021528169512748718,
0.0029635122045874596,
0.00995275005698204,
0.05368839576840401,
0.03226533904671669,
-0.018891984596848488,
-0.01949726603925228,
0.21533019840717316,
-0.05011799558997154,
-0.04884391650557518,
-0.128143772482872,
0.12824022769927979,
0.0059045832604169846,
-0.020415427163243294,
0.06551007181406021,
-0.09036724269390106,
0.008029651828110218,
0.11008711904287338,
0.10023490339517593,
-0.014735305681824684,
-0.017505081370472908,
-0.008056452497839928,
-0.01607019454240799,
-0.058715131133794785,
0.08640814572572708,
0.0974993109703064,
0.019322682172060013,
-0.006340425461530685,
0.021350979804992676,
0.002177319023758173,
-0.056997861713171005,
-0.10566490888595581,
0.050761062651872635,
-0.018670352175831795,
-0.014507818035781384,
0.0013448323588818312,
0.1058058813214302,
-0.010719929821789265,
-0.1225161999464035,
-0.03807491064071655,
-0.10472838580608368,
-0.1778467446565628,
-0.0372333861887455,
0.061300069093704224,
0.027456481009721756,
0.043687500059604645,
0.00982412789016962,
-0.027549071237444878,
0.141792431473732,
-0.009153325110673904,
-0.05675709620118141,
-0.07946904003620148,
0.08377354592084885,
-0.07257601618766785,
0.22241535782814026,
0.0018765598069876432,
0.056431617587804794,
0.1027105525135994,
0.0261194109916687,
-0.14997893571853638,
0.032780930399894714,
0.07637830078601837,
-0.0769379734992981,
0.06943503022193909,
0.1662828028202057,
-0.04282115772366524,
0.10014985501766205,
0.05063872039318085,
-0.10805759578943253,
-0.042285509407520294,
-0.034963902086019516,
0.011697795242071152,
-0.06504078954458237,
0.01897936314344406,
-0.09265230596065521,
0.16070245206356049,
0.13189232349395752,
-0.08054342865943909,
-0.05819735303521156,
-0.0371304452419281,
0.03192192688584328,
0.06842198967933655,
0.12124360352754593,
-0.010088286362588406,
-0.2452523410320282,
0.004109550733119249,
-0.033393699675798416,
0.01734241656959057,
-0.21921129524707794,
-0.09909074008464813,
0.020609885454177856,
-0.02966897562146187,
-0.036063052713871,
0.09648726135492325,
0.07922720164060593,
-0.014997672289609909,
-0.055629514157772064,
-0.1345127522945404,
-0.05101177096366882,
0.1406661570072174,
-0.15493467450141907,
-0.049006279557943344
] |
null | null | peft | ## Training procedure
The following `bitsandbytes` quantization config was used during training:
- load_in_8bit: False
- load_in_4bit: True
- llm_int8_threshold: 6.0
- llm_int8_skip_modules: None
- llm_int8_enable_fp32_cpu_offload: False
- llm_int8_has_fp16_weight: False
- bnb_4bit_quant_type: nf4
- bnb_4bit_use_double_quant: False
- bnb_4bit_compute_dtype: float16
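For reference, the same quantization config expressed as a `transformers.BitsAndBytesConfig` — a sketch whose field names mirror the list above one-to-one:

```python
# Sketch: the quantization config above rebuilt field-for-field with
# transformers.BitsAndBytesConfig (load_in_8bit=False is the default).
import torch
from transformers import BitsAndBytesConfig

bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    llm_int8_threshold=6.0,
    llm_int8_skip_modules=None,
    llm_int8_enable_fp32_cpu_offload=False,
    llm_int8_has_fp16_weight=False,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_use_double_quant=False,
    bnb_4bit_compute_dtype=torch.float16,
)
# Pass quantization_config=bnb_config to AutoModel*.from_pretrained() when
# loading the base model this adapter was trained against (the card does
# not name that base model, so it is left unspecified here).
```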
### Framework versions
- PEFT 0.4.0 | {"library_name": "peft", "pipeline_tag": "text-generation"} | text-generation | SJ182120/l2_python | [
"peft",
"text-generation",
"region:us"
] | 2024-02-13T08:51:31+00:00 | [] | [] | TAGS
#peft #text-generation #region-us
| ## Training procedure
The following 'bitsandbytes' quantization config was used during training:
- load_in_8bit: False
- load_in_4bit: True
- llm_int8_threshold: 6.0
- llm_int8_skip_modules: None
- llm_int8_enable_fp32_cpu_offload: False
- llm_int8_has_fp16_weight: False
- bnb_4bit_quant_type: nf4
- bnb_4bit_use_double_quant: False
- bnb_4bit_compute_dtype: float16
### Framework versions
- PEFT 0.4.0 | [
"## Training procedure\n\n\nThe following 'bitsandbytes' quantization config was used during training:\n- load_in_8bit: False\n- load_in_4bit: True\n- llm_int8_threshold: 6.0\n- llm_int8_skip_modules: None\n- llm_int8_enable_fp32_cpu_offload: False\n- llm_int8_has_fp16_weight: False\n- bnb_4bit_quant_type: nf4\n- bnb_4bit_use_double_quant: False\n- bnb_4bit_compute_dtype: float16",
"### Framework versions\n\n\n- PEFT 0.4.0"
] | [
"TAGS\n#peft #text-generation #region-us \n",
"## Training procedure\n\n\nThe following 'bitsandbytes' quantization config was used during training:\n- load_in_8bit: False\n- load_in_4bit: True\n- llm_int8_threshold: 6.0\n- llm_int8_skip_modules: None\n- llm_int8_enable_fp32_cpu_offload: False\n- llm_int8_has_fp16_weight: False\n- bnb_4bit_quant_type: nf4\n- bnb_4bit_use_double_quant: False\n- bnb_4bit_compute_dtype: float16",
"### Framework versions\n\n\n- PEFT 0.4.0"
] | [
14,
154,
11
] | [
"passage: TAGS\n#peft #text-generation #region-us \n## Training procedure\n\n\nThe following 'bitsandbytes' quantization config was used during training:\n- load_in_8bit: False\n- load_in_4bit: True\n- llm_int8_threshold: 6.0\n- llm_int8_skip_modules: None\n- llm_int8_enable_fp32_cpu_offload: False\n- llm_int8_has_fp16_weight: False\n- bnb_4bit_quant_type: nf4\n- bnb_4bit_use_double_quant: False\n- bnb_4bit_compute_dtype: float16### Framework versions\n\n\n- PEFT 0.4.0"
] | [
-0.05881572142243385,
0.011197815649211407,
-0.0027571548707783222,
0.12589456140995026,
0.11085706949234009,
0.06660623103380203,
0.12763667106628418,
0.12437505275011063,
0.027253007516264915,
0.08653977513313293,
0.10197463631629944,
0.043686240911483765,
0.08022744208574295,
0.1561698466539383,
-0.027632534503936768,
-0.03967539966106415,
0.055549927055835724,
-0.005329824518412352,
0.016411805525422096,
0.08900198340415955,
0.05316228047013283,
-0.027680499479174614,
0.04068896919488907,
-0.0967017337679863,
-0.1751086264848709,
0.007201737258583307,
0.0056825424544513226,
0.025455325841903687,
0.04269948974251747,
0.04027232155203819,
0.05007233843207359,
-0.0037090247496962547,
-0.03768859803676605,
-0.21217304468154907,
-0.004186705686151981,
0.11228235810995102,
-0.030067583546042442,
0.07225141674280167,
-0.08224170655012131,
0.12040094286203384,
-0.003866250393912196,
-0.04515907168388367,
-0.009020119905471802,
0.04115600138902664,
-0.08619213849306107,
-0.10962390154600143,
-0.055680133402347565,
0.05948550999164581,
0.013362925499677658,
0.06881589442491531,
-0.015644213184714317,
0.15736238658428192,
-0.11635260283946991,
0.10409986227750778,
0.10079976916313171,
-0.22611679136753082,
-0.03253475949168205,
0.1273968517780304,
-0.019978994503617287,
0.16964295506477356,
-0.07277907431125641,
-0.08157005906105042,
0.08483396470546722,
0.05174221470952034,
-0.06462931632995605,
-0.001764305168762803,
-0.09723103791475296,
0.0007373450789600611,
-0.13250893354415894,
-0.04994067922234535,
0.17658554017543793,
0.021309267729520798,
-0.04031899571418762,
-0.041669998317956924,
-0.09257596731185913,
-0.35366091132164,
0.022184373810887337,
0.014121217653155327,
-0.07372593879699707,
0.05293095111846924,
0.030091846361756325,
-0.03629710525274277,
-0.005775159690529108,
-0.09149530529975891,
-0.03122434951364994,
0.09358777850866318,
0.03611838072538376,
0.039214570075273514,
-0.00104164844378829,
0.09963862597942352,
-0.11594081670045853,
-0.03337165713310242,
-0.03539630398154259,
-0.03809250146150589,
-0.022430038079619408,
0.002994721755385399,
-0.07830601185560226,
0.1641528159379959,
0.08272913098335266,
0.07621414214372635,
-0.18922950327396393,
0.13167354464530945,
-0.05106198787689209,
0.06192479655146599,
-0.04527236148715019,
0.002310642506927252,
-0.10096359252929688,
0.11694393306970596,
0.006416523829102516,
0.13870564103126526,
0.02805371582508087,
-0.05132335051894188,
-0.08321142196655273,
-0.0054443818517029285,
0.11598705500364304,
0.014898242428898811,
-0.11097488552331924,
0.025811748579144478,
-0.12944792211055756,
-0.0317189022898674,
0.06605634838342667,
-0.0814901739358902,
0.0071325236931443214,
0.04928812012076378,
-0.05109953507781029,
0.019321683794260025,
0.1059962585568428,
-0.055470552295446396,
-0.07010381668806076,
-0.04075784608721733,
-0.09164997935295105,
0.0014889700105413795,
-0.09080446511507034,
-0.13538354635238647,
0.05236659571528435,
-0.17107760906219482,
-0.01754668541252613,
-0.036966871470212936,
-0.09263300895690918,
0.023772627115249634,
0.008492665365338326,
-0.0872214064002037,
0.05155628174543381,
-0.1062537357211113,
-0.16817007958889008,
-0.023935386911034584,
0.012595133855938911,
0.01168390829116106,
-0.03166382759809494,
0.10512995719909668,
0.03190753608942032,
0.10651034861803055,
-0.19400353729724884,
-0.0009164002840407193,
0.013396083377301693,
0.0612972192466259,
0.0288090780377388,
0.13743333518505096,
-0.09254538267850876,
-0.022310545668005943,
-0.05905919522047043,
-0.0706859827041626,
-0.11278921365737915,
-0.007111814804375172,
0.11822677403688431,
0.10515100508928299,
-0.16595681011676788,
-0.02197483368217945,
0.09264646470546722,
-0.03292210027575493,
-0.08168843388557434,
0.1512005627155304,
-0.036678247153759,
0.11651269346475601,
-0.01852087862789631,
0.09312769770622253,
0.21885336935520172,
-0.11144550144672394,
0.013898308388888836,
0.11944220960140228,
0.05695997551083565,
0.005223315674811602,
-0.0021800738759338856,
0.06130006164312363,
-0.16044586896896362,
0.04566182196140289,
0.08802935481071472,
0.027865957468748093,
-0.0551101490855217,
-0.056713078171014786,
-0.03226421773433685,
-0.04497651383280754,
0.12856896221637726,
0.012565732933580875,
0.005666473414748907,
-0.08119973540306091,
-0.0746416449546814,
0.12509800493717194,
0.11896125972270966,
-0.027334723621606827,
0.0032789765391498804,
-0.10217958688735962,
0.0032003652304410934,
-0.028917375952005386,
0.04036465659737587,
-0.1225801631808281,
0.02327653020620346,
0.07309805601835251,
0.05080091953277588,
0.011764182709157467,
0.0380081906914711,
0.058842070400714874,
0.0248578991740942,
-0.05147070810198784,
0.02575971744954586,
-0.04024398326873779,
-0.010153966024518013,
-0.09388139843940735,
-0.08575032651424408,
0.008606086485087872,
-0.009998362511396408,
0.20687274634838104,
-0.16108472645282745,
0.045504119247198105,
0.1139059066772461,
-0.009723220951855183,
-0.02540266141295433,
-0.031853750348091125,
-0.0654454156756401,
0.09873996675014496,
-0.013720872811973095,
-0.02672441489994526,
0.04957606643438339,
0.03335459902882576,
-0.06231309100985527,
-0.14289100468158722,
-0.1188836470246315,
0.03744763880968094,
0.13689714670181274,
0.05626381188631058,
-0.06007016822695732,
-0.060669757425785065,
-0.016761476173996925,
-0.030358534306287766,
0.06240035593509674,
-0.06414227932691574,
0.02662777341902256,
0.004450374748557806,
0.07938659191131592,
-0.10644973814487457,
-0.029465284198522568,
0.07243990898132324,
-0.016531988978385925,
-0.04110553488135338,
0.124102883040905,
-0.011800275184214115,
-0.06747709959745407,
0.07441500574350357,
0.0501236766576767,
-0.13798129558563232,
0.11945606768131256,
0.0014309396501630545,
-0.019796505570411682,
-0.08421660214662552,
0.18273760378360748,
0.01824500970542431,
0.10148128122091293,
-0.16414664685726166,
0.09546143561601639,
-0.009288528934121132,
0.013040040619671345,
0.08409480005502701,
-0.18855534493923187,
-0.0025965420063585043,
-0.04492464289069176,
-0.10329965502023697,
-0.059771694242954254,
-0.00875019934028387,
0.014908094890415668,
0.05609450489282608,
-0.006125994957983494,
0.06188476085662842,
0.13812272250652313,
-0.024652784690260887,
-0.09473837167024612,
0.17944352328777313,
-0.22092348337173462,
-0.22601404786109924,
-0.24786701798439026,
-0.014143545180559158,
-0.14161531627178192,
-0.0267991553992033,
-0.033965665847063065,
-0.0961129441857338,
0.03159759193658829,
-0.09088070690631866,
-0.015915539115667343,
-0.03519374132156372,
0.008121411316096783,
0.04775024205446243,
0.006911728996783495,
0.15508657693862915,
-0.08029204607009888,
0.020809650421142578,
0.05023948475718498,
-0.02995353750884533,
0.10771969705820084,
-0.10328380018472672,
-0.028618859127163887,
0.12483133375644684,
-0.009994493797421455,
0.035472217947244644,
0.00011722915223799646,
0.3151676058769226,
0.00625992938876152,
0.030641749501228333,
0.10019954293966293,
-0.000419061747379601,
0.0658683180809021,
0.0803287923336029,
0.009124977514147758,
-0.10952199250459671,
0.07542693614959717,
0.04653116315603256,
-0.09201350808143616,
-0.154190331697464,
-0.0426674447953701,
-0.0744854137301445,
0.025570759549736977,
0.07532978057861328,
0.07795606553554535,
0.09981152415275574,
0.06807941943407059,
0.011005481705069542,
0.11637180298566818,
0.03382246941328049,
0.002112910384312272,
0.13259576261043549,
-0.03128501772880554,
0.05084922909736633,
-0.025560354813933372,
0.0349469892680645,
0.07214953750371933,
0.12327124923467636,
0.07037850469350815,
-0.08584535866975784,
0.028556587174534798,
0.05466320365667343,
0.2334834486246109,
-0.01122377160936594,
0.10947742313146591,
-0.06953107565641403,
-0.002204582095146179,
0.002263392321765423,
-0.04707268252968788,
-0.07716907560825348,
0.028960416093468666,
-0.042615704238414764,
0.07884593307971954,
-0.020807882770895958,
-0.031190279871225357,
0.08482342213392258,
0.09909810870885849,
0.15141646564006805,
-0.28519365191459656,
-0.11020099371671677,
-0.003077890956774354,
0.11524161696434021,
-0.10653336346149445,
0.02933037094771862,
0.2171388417482376,
0.012019132263958454,
-0.05875641852617264,
-0.037782903760671616,
0.047680143266916275,
-0.0025596285704523325,
0.030673576518893242,
0.10779093950986862,
0.13530980050563812,
-0.004418906755745411,
0.08285203576087952,
-0.30476459860801697,
0.01925586722791195,
0.05470427870750427,
0.03805522993206978,
-0.05143482983112335,
0.004429546184837818,
-0.04942825436592102,
-0.05244961008429527,
0.0490093007683754,
0.0061435881070792675,
0.1781231015920639,
-0.26357439160346985,
-0.06223435327410698,
-0.01465378887951374,
0.13256517052650452,
0.06903000175952911,
0.041718658059835434,
0.017992768436670303,
0.057827435433864594,
0.06127482280135155,
0.042297378182411194,
-0.055972859263420105,
-0.10980167984962463,
0.004393668845295906,
0.16815561056137085,
-0.12493569403886795,
-0.04595203697681427,
-0.05480054020881653,
-0.025372803211212158,
0.06761212646961212,
-0.1707804948091507,
-0.06787270307540894,
-0.06464764475822449,
0.024276241660118103,
0.13281120359897614,
-0.019562888890504837,
-0.01676190085709095,
-0.026003019884228706,
0.035132259130477905,
-0.049619160592556,
-0.10031118243932724,
0.10864138603210449,
-0.03442714735865593,
-0.1388426125049591,
-0.00808817707002163,
0.15319451689720154,
0.07737308740615845,
-0.013959160074591637,
-0.08494644612073898,
-0.047704827040433884,
0.02073710598051548,
-0.15545380115509033,
-0.0009582064230926335,
0.09235426783561707,
-0.06056572496891022,
0.1023498997092247,
-0.11422499269247055,
0.20928551256656647,
-0.061420660465955734,
0.09678772836923599,
0.08257210999727249,
0.3105289041996002,
-0.07746589928865433,
0.028599712997674942,
0.06306972354650497,
-0.03940698504447937,
-0.2546375095844269,
0.03940539434552193,
0.042575106024742126,
0.044821854680776596,
-0.06173386424779892,
-0.17220547795295715,
0.020235378295183182,
0.0861479789018631,
0.015978027135133743,
0.21665577590465546,
-0.3239070177078247,
-0.067665234208107,
0.03700398653745651,
0.061543140560388565,
0.18266460299491882,
-0.061855949461460114,
-0.007746312767267227,
-0.027031145989894867,
-0.0006121171172708273,
0.16259776055812836,
-0.13808467984199524,
0.11944621801376343,
-0.02270696684718132,
0.04095324128866196,
0.010821640491485596,
-0.02987377531826496,
0.15153926610946655,
0.011350316926836967,
0.08452794700860977,
0.0066027105785906315,
-0.027082854881882668,
0.07064864784479141,
-0.07299751043319702,
0.06075851246714592,
-0.10255613178014755,
0.0744672417640686,
-0.07602608948945999,
0.006593442056328058,
-0.06693880259990692,
-0.013958260416984558,
-0.08044703304767609,
-0.030051574110984802,
-0.09441012889146805,
0.055598001927137375,
-0.00376864243298769,
-0.01712554320693016,
-0.01915314607322216,
0.03512020409107208,
0.07535658031702042,
0.4490441679954529,
-0.041369423270225525,
-0.04448173940181732,
0.10311397910118103,
0.11341592669487,
-0.02220841869711876,
0.10408508032560349,
-0.13947589695453644,
0.03768465295433998,
0.1175806000828743,
-0.0023214814718812704,
0.13872022926807404,
0.09752149134874344,
-0.10865655541419983,
-0.015972204506397247,
0.04115288332104683,
-0.15005064010620117,
-0.07635878026485443,
-0.04430030286312103,
-0.002281248802319169,
-0.10295980423688889,
0.02364729344844818,
0.11684981733560562,
-0.034884385764598846,
0.05889495462179184,
0.028456954285502434,
0.049207136034965515,
-0.1364605724811554,
0.12878288328647614,
0.03700036555528641,
0.07615123689174652,
-0.09992527961730957,
0.09652400016784668,
0.023303348571062088,
0.00825932715088129,
0.04725231975317001,
0.027966376394033432,
-0.10512086749076843,
-0.001554601825773716,
-0.03789567947387695,
-0.08284953981637955,
0.11354126036167145,
-0.04939194396138191,
-0.040133170783519745,
-0.12646333873271942,
0.012889940291643143,
0.09744752943515778,
0.04964281618595123,
0.09888484328985214,
-0.030033277347683907,
0.0075920321978628635,
-0.12556973099708557,
0.06605581194162369,
-0.035500358790159225,
0.023017941042780876,
-0.12867692112922668,
0.06691409647464752,
-0.022538913413882256,
0.06888579577207565,
-0.02716882899403572,
-0.02122713252902031,
-0.22216403484344482,
0.01898934505879879,
-0.046384815126657486,
0.015130899846553802,
0.03883083164691925,
0.03256089612841606,
0.016385694965720177,
0.07026076316833496,
-0.027662817388772964,
0.029234861955046654,
-0.03697348013520241,
-0.05532633140683174,
0.029561443254351616,
0.000859746418427676,
-0.04387318715453148,
-0.03446122631430626,
0.036215778440237045,
-0.11598259955644608,
0.05110352486371994,
0.04179312288761139,
-0.07250756770372391,
0.08318465203046799,
0.0367065854370594,
0.029018918052315712,
0.09824129194021225,
0.0500800795853138,
0.03615472465753555,
-0.03558721765875816,
0.02863329090178013,
-0.011567543260753155,
0.00535937212407589,
0.06053660809993744,
0.13280972838401794,
-0.05545606091618538,
-0.05218265950679779,
-0.13534586131572723,
-0.017654743045568466,
-0.04460085928440094,
0.04957935959100723,
0.13777801394462585,
0.10554702579975128,
0.0904281958937645,
-0.09167193621397018,
-0.017379432916641235,
-0.13618789613246918,
-0.07471699267625809,
0.06517311185598373,
-0.061843886971473694,
-0.013707146979868412,
-0.04696909338235855,
0.07232685387134552,
-0.0007585143903270364,
0.14544054865837097,
-0.08391784876585007,
-0.09559306502342224,
-0.05065767467021942,
-0.16526760160923004,
-0.14026571810245514,
-0.00825378019362688,
0.25463220477104187,
0.0322333462536335,
-0.034513041377067566,
-0.07385747879743576,
-0.001654492923989892,
0.07827775925397873,
0.14663274586200714,
0.06168665736913681,
0.06337334215641022,
-0.11659637093544006,
0.11688853055238724,
0.038669653236866,
-0.039098165929317474,
0.13086941838264465,
0.3022078275680542,
-0.07057863473892212,
0.02253899537026882,
-0.09504877775907516,
0.08245986700057983,
0.03789769858121872,
-0.13405705988407135,
0.013447491452097893,
-0.03060263767838478,
-0.16508647799491882,
-0.12470903247594833,
0.009751017205417156,
-0.06427910923957825,
-0.1924532949924469,
-0.024802647531032562,
-0.11324028670787811,
-0.06466472893953323,
0.11631932854652405,
0.04117546230554581,
-0.02213331125676632,
0.1888004094362259,
-0.07154331356287003,
0.02065609209239483,
0.011281643994152546,
-0.018731145188212395,
-0.0010611722245812416,
-0.026093561202287674,
-0.09227901697158813,
0.15296483039855957,
0.0077853393740952015,
0.10320281237363815,
0.022113660350441933,
0.08986940234899521,
0.04709361121058464,
-0.04839078709483147,
-0.04243874549865723,
0.0018743014661595225,
0.026814065873622894,
-0.054975178092718124,
0.13271668553352356,
0.057239994406700134,
-0.09841584414243698,
-0.07825161516666412,
-0.0140769612044096,
-0.08489853143692017,
-0.028453143313527107,
-0.1411457359790802,
0.25284892320632935,
-0.04393581673502922,
0.10782010108232498,
-0.025014596059918404,
-0.05708682909607887,
-0.10928280651569366,
0.1616043746471405,
0.15175794064998627,
-0.1328047662973404,
-0.013763121329247952,
0.07707972079515457,
0.0021979939192533493,
-0.05992339178919792,
0.14983594417572021,
0.09661564230918884,
-0.01768977753818035,
0.03258506581187248,
-0.005139987915754318,
-0.02565227448940277,
-0.0015975008718669415,
-0.010132165625691414,
-0.017926402390003204,
0.0026933092158287764,
0.021905606612563133,
-0.13038411736488342,
-0.006390224676579237,
-0.09876580536365509,
-0.06608805060386658,
0.15829384326934814,
-0.12009911984205246,
-0.07943340390920639,
-0.035133689641952515,
-0.08415786176919937,
-0.08983651548624039,
0.021428463980555534,
-0.11201931536197662,
0.05978894978761673,
0.0751558244228363,
-0.06481432914733887,
-0.015317801386117935,
-0.06633755564689636,
-0.009677664376795292,
0.0014995323726907372,
0.056292008608579636,
-0.010303734801709652,
0.06371618807315826,
0.1000191941857338,
-0.02336752787232399,
-0.04457669332623482,
0.10046066343784332,
0.007258089724928141,
-0.030224421992897987,
-0.1370682716369629,
0.037784844636917114,
-0.03127046674489975,
0.09586665034294128,
0.0295368991792202,
-0.06781578809022903,
-0.02999231405556202,
-0.15733802318572998,
-0.025585172697901726,
-0.13041628897190094,
-0.08406594395637512,
-0.07614273577928543,
0.09592524915933609,
0.20122450590133667,
-0.05998660624027252,
0.030010296031832695,
-0.03475714474916458,
0.02168632671236992,
-0.05580853298306465,
0.04468223452568054,
0.0031677638180553913,
-0.15440766513347626,
0.03098267689347267,
-0.013649947941303253,
0.012241546995937824,
-0.3267116844654083,
0.017840273678302765,
-0.010885944589972496,
-0.011035164818167686,
-0.06182963028550148,
0.13602295517921448,
0.0065087806433439255,
0.06010688096284866,
-0.05984212085604668,
-0.25982925295829773,
-0.05355921387672424,
0.14448481798171997,
-0.0009282905957661569,
-0.07896853983402252
] |
null | null | transformers |
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed to the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
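Since no code is provided, here is a minimal loading sketch (an assumption: it presumes this repository is a causal language model usable via the standard `transformers` auto classes; the repo id is taken from this row's metadata):

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "yeniceriSGK/falcon-1b-pibrain-v2"  # repo id from the row metadata

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(repo_id)

# Quick smoke test: generate a short continuation.
inputs = tokenizer("Hello, world!", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```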
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed]
| {"library_name": "transformers", "tags": []} | null | yeniceriSGK/falcon-1b-pibrain-v2 | [
"transformers",
"safetensors",
"arxiv:1910.09700",
"endpoints_compatible",
"region:us"
] | 2024-02-13T08:51:54+00:00 | [
"1910.09700"
] | [] | TAGS
#transformers #safetensors #arxiv-1910.09700 #endpoints_compatible #region-us
|
# Model Card for Model ID
## Model Details
### Model Description
This is the model card of a transformers model that has been pushed to the Hub. This model card has been automatically generated.
- Developed by:
- Funded by [optional]:
- Shared by [optional]:
- Model type:
- Language(s) (NLP):
- License:
- Finetuned from model [optional]:
### Model Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Downstream Use [optional]
### Out-of-Scope Use
## Bias, Risks, and Limitations
### Recommendations
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
## Training Details
### Training Data
### Training Procedure
#### Preprocessing [optional]
#### Training Hyperparameters
- Training regime:
#### Speeds, Sizes, Times [optional]
## Evaluation
### Testing Data, Factors & Metrics
#### Testing Data
#### Factors
#### Metrics
### Results
#### Summary
## Model Examination [optional]
## Environmental Impact
Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).
- Hardware Type:
- Hours used:
- Cloud Provider:
- Compute Region:
- Carbon Emitted:
## Technical Specifications [optional]
### Model Architecture and Objective
### Compute Infrastructure
#### Hardware
#### Software
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Model Card Authors [optional]
## Model Card Contact
| [
"# Model Card for Model ID",
"## Model Details",
"### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:",
"### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Downstream Use [optional]",
"### Out-of-Scope Use",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.",
"## How to Get Started with the Model\n\nUse the code below to get started with the model.",
"## Training Details",
"### Training Data",
"### Training Procedure",
"#### Preprocessing [optional]",
"#### Training Hyperparameters\n\n- Training regime:",
"#### Speeds, Sizes, Times [optional]",
"## Evaluation",
"### Testing Data, Factors & Metrics",
"#### Testing Data",
"#### Factors",
"#### Metrics",
"### Results",
"#### Summary",
"## Model Examination [optional]",
"## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:",
"## Technical Specifications [optional]",
"### Model Architecture and Objective",
"### Compute Infrastructure",
"#### Hardware",
"#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Model Card Authors [optional]",
"## Model Card Contact"
] | [
"TAGS\n#transformers #safetensors #arxiv-1910.09700 #endpoints_compatible #region-us \n",
"# Model Card for Model ID",
"## Model Details",
"### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:",
"### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Downstream Use [optional]",
"### Out-of-Scope Use",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.",
"## How to Get Started with the Model\n\nUse the code below to get started with the model.",
"## Training Details",
"### Training Data",
"### Training Procedure",
"#### Preprocessing [optional]",
"#### Training Hyperparameters\n\n- Training regime:",
"#### Speeds, Sizes, Times [optional]",
"## Evaluation",
"### Testing Data, Factors & Metrics",
"#### Testing Data",
"#### Factors",
"#### Metrics",
"### Results",
"#### Summary",
"## Model Examination [optional]",
"## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:",
"## Technical Specifications [optional]",
"### Model Architecture and Objective",
"### Compute Infrastructure",
"#### Hardware",
"#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Model Card Authors [optional]",
"## Model Card Contact"
] | [
31,
6,
3,
82,
28,
3,
4,
9,
9,
10,
42,
20,
3,
4,
5,
9,
11,
13,
3,
12,
5,
4,
5,
3,
4,
9,
53,
9,
8,
6,
3,
14,
8,
7,
9,
4
] | [
"passage: TAGS\n#transformers #safetensors #arxiv-1910.09700 #endpoints_compatible #region-us \n# Model Card for Model ID## Model Details### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Downstream Use [optional]### Out-of-Scope Use## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.## How to Get Started with the Model\n\nUse the code below to get started with the model.## Training Details### Training Data### Training Procedure#### Preprocessing [optional]#### Training Hyperparameters\n\n- Training regime:#### Speeds, Sizes, Times [optional]## Evaluation### Testing Data, Factors & Metrics#### Testing Data#### Factors#### Metrics### Results#### Summary## Model Examination [optional]## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:## Technical Specifications [optional]### Model Architecture and Objective### Compute Infrastructure#### Hardware#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Model Card Authors [optional]## Model Card Contact"
] | [
-0.06646376848220825,
0.2168014943599701,
-0.00225935154594481,
0.023818302899599075,
0.1271018385887146,
-0.001635765191167593,
0.04218708351254463,
0.13324736058712006,
-0.020175931975245476,
0.11144465953111649,
0.046588581055402756,
0.09377603232860565,
0.09928803145885468,
0.18404334783554077,
0.04859916493296623,
-0.2059975117444992,
0.007056170143187046,
-0.09090408682823181,
0.014076028019189835,
0.1116579994559288,
0.13719257712364197,
-0.10291384905576706,
0.08272874355316162,
-0.04045208916068077,
-0.02019004337489605,
0.00012576708104461432,
-0.09259183704853058,
-0.07032395154237747,
0.06885425746440887,
0.06264153122901917,
0.051234472543001175,
0.001456156256608665,
0.09140396863222122,
-0.2864592671394348,
0.017265573143959045,
0.08406311273574829,
0.0027674848679453135,
0.06290827691555023,
0.07236549258232117,
-0.07389893382787704,
0.11328595131635666,
-0.08021481335163116,
0.13019037246704102,
0.08625296503305435,
-0.062064990401268005,
-0.23071379959583282,
-0.07525765895843506,
0.0963398814201355,
0.12251301854848862,
0.06215599179267883,
-0.022921854630112648,
0.15455181896686554,
-0.06248689442873001,
0.012971068732440472,
0.1294165402650833,
-0.11526761949062347,
-0.05572471022605896,
0.061741601675748825,
0.11775490641593933,
0.10740239918231964,
-0.14110268652439117,
-0.0017287094378843904,
0.04900608956813812,
0.029121357947587967,
0.08589313924312592,
0.022661056369543076,
0.12003941088914871,
0.04652795568108559,
-0.13695219159126282,
-0.04037507623434067,
0.12011898308992386,
0.038862764835357666,
-0.06446044892072678,
-0.2168138176202774,
-0.006778308190405369,
-0.0601806715130806,
-0.014732478186488152,
-0.07019448280334473,
0.039128515869379044,
-0.02470310963690281,
0.07317749410867691,
-0.04465159401297569,
-0.1063927412033081,
-0.0421026237308979,
0.0892222449183464,
0.07748593389987946,
0.011527054943144321,
-0.02519804798066616,
0.04627908393740654,
0.13455867767333984,
0.05402068421244621,
-0.10399353504180908,
-0.07017925381660461,
-0.06942764669656754,
-0.09420394152402878,
-0.04035796597599983,
0.056760527193546295,
0.031942449510097504,
0.02665667235851288,
0.22703726589679718,
0.016653569415211678,
0.04155244305729866,
0.0224777739495039,
0.01032855175435543,
0.043662428855895996,
0.0955500528216362,
-0.05303520709276199,
-0.15660029649734497,
-0.04072032496333122,
0.09077946096658707,
-0.0027527001220732927,
-0.036689214408397675,
-0.03966725245118141,
0.03849169611930847,
0.06843466311693192,
0.13122352957725525,
0.07552056759595871,
-0.017929591238498688,
-0.04813180863857269,
-0.030096933245658875,
0.23523783683776855,
-0.1493375599384308,
0.04426715523004532,
-0.02271856553852558,
-0.01804111897945404,
-0.03908449783921242,
0.03597262129187584,
0.022118929773569107,
-0.000004518366949923802,
0.09706240892410278,
-0.058981191366910934,
-0.05378659814596176,
-0.10168042778968811,
-0.03272576630115509,
0.04088849574327469,
-0.013975566253066063,
-0.010589460842311382,
-0.09025166928768158,
-0.09490354359149933,
-0.04766594246029854,
0.05537205561995506,
-0.05123869329690933,
-0.03770573064684868,
0.009465423412621021,
-0.08151785284280777,
-0.005444355774670839,
-0.005417742300778627,
0.10699385404586792,
-0.03222226724028587,
0.04445803165435791,
-0.027600755915045738,
0.05225523188710213,
0.09919606149196625,
0.031576547771692276,
-0.0773419588804245,
0.0561848059296608,
-0.22559374570846558,
0.07503069192171097,
-0.11481974273920059,
0.04335082694888115,
-0.1704932004213333,
-0.042439818382263184,
0.005444696638733149,
0.0139949731528759,
0.013206101022660732,
0.12720820307731628,
-0.19255615770816803,
-0.01654396951198578,
0.13260798156261444,
-0.09212633967399597,
-0.118110790848732,
0.07884611934423447,
-0.029701577499508858,
0.1624738723039627,
0.04682036489248276,
-0.027025915682315826,
0.09224298596382141,
-0.16434773802757263,
-0.07092688232660294,
-0.00949116237461567,
-0.01727987825870514,
0.12109188735485077,
0.07512219995260239,
-0.05991523340344429,
0.046571120619773865,
0.02832140028476715,
-0.038078423589468,
-0.04424772411584854,
-0.050857074558734894,
-0.10884185880422592,
-0.01070026308298111,
-0.08987759798765182,
0.04065500199794769,
-0.01250192429870367,
-0.07916021347045898,
-0.029885273426771164,
-0.18612512946128845,
-0.0030564051121473312,
0.10038342326879501,
0.0035033065360039473,
-0.005652366206049919,
-0.08666291832923889,
0.026358824223279953,
-0.03112892620265484,
-0.008404186926782131,
-0.16764774918556213,
-0.04399421438574791,
0.046902090311050415,
-0.16094985604286194,
0.020117372274398804,
-0.06413903087377548,
0.06334125250577927,
0.03641495108604431,
-0.05590536445379257,
-0.0248766727745533,
-0.01730942726135254,
0.011945613659918308,
-0.05083848536014557,
-0.18994836509227753,
-0.056277405470609665,
-0.037882111966609955,
0.149809330701828,
-0.25956398248672485,
0.032966937869787216,
0.051140617579221725,
0.14649195969104767,
0.00406361510977149,
-0.05115427449345589,
0.01429014839231968,
-0.05360214412212372,
-0.054652128368616104,
-0.06746816635131836,
-0.006135428790003061,
-0.027576493099331856,
-0.05147203803062439,
0.019243421033024788,
-0.1755700707435608,
-0.021410830318927765,
0.09424154460430145,
0.12876708805561066,
-0.1486445665359497,
-0.018640631809830666,
-0.048725154250860214,
-0.06339836865663528,
-0.0715010017156601,
-0.07038594037294388,
0.10712739825248718,
0.0513901449739933,
0.04796046018600464,
-0.07435787469148636,
-0.07092321664094925,
0.02726263552904129,
0.006906150374561548,
-0.03382374346256256,
0.08727246522903442,
0.05199531093239784,
-0.09209315478801727,
0.0756213590502739,
0.1092359870672226,
0.07177663594484329,
0.09363535046577454,
0.01574566215276718,
-0.11756632477045059,
-0.028492970392107964,
0.036266472190618515,
0.02740776725113392,
0.1465986967086792,
-0.05952361226081848,
0.04016614332795143,
0.04494241625070572,
-0.04170418903231621,
0.022319864481687546,
-0.08787637203931808,
0.024075502529740334,
0.025203049182891846,
-0.0034381982404738665,
0.06284574419260025,
-0.02525499276816845,
-0.0050758360885083675,
0.07016654312610626,
0.047779910266399384,
0.04621000960469246,
0.009655474685132504,
-0.01720241829752922,
-0.1047825813293457,
0.16950392723083496,
-0.0951867327094078,
-0.269941508769989,
-0.17632324993610382,
0.026197833940386772,
0.04035249724984169,
-0.022378476336598396,
0.031619444489479065,
-0.07056326419115067,
-0.10630585998296738,
-0.1060405746102333,
-0.002429972169920802,
0.01714223250746727,
-0.06364088505506516,
-0.0741225928068161,
0.07348573952913284,
0.04382912442088127,
-0.14902326464653015,
0.038552410900592804,
0.055694397538900375,
-0.057955220341682434,
-0.0233661737293005,
0.09118817001581192,
0.12397737801074982,
0.14583967626094818,
-0.021366750821471214,
-0.028626007959246635,
0.029004426673054695,
0.19620531797409058,
-0.13469526171684265,
0.10371150821447372,
0.13814030587673187,
-0.04545360431075096,
0.08360563963651657,
0.1560150384902954,
0.029186224564909935,
-0.08317049592733383,
0.05044832453131676,
0.04082648828625679,
-0.043159641325473785,
-0.2666129767894745,
-0.0534592866897583,
0.012832709588110447,
-0.06255637854337692,
0.09786593168973923,
0.10183793306350708,
0.11542957276105881,
0.034910861402750015,
-0.07166364789009094,
-0.043925940990448,
-0.0058974819257855415,
0.11737963557243347,
-0.05490213260054588,
-0.012639665976166725,
0.07686592638492584,
-0.05086168646812439,
0.005355054512619972,
0.10266812145709991,
0.02973790094256401,
0.17442677915096283,
0.020399179309606552,
0.11231429129838943,
0.06195578724145889,
0.08633565157651901,
0.0007386076031252742,
0.02951662428677082,
0.05147615820169449,
0.017203815281391144,
-0.002300140680745244,
-0.10421168059110641,
-0.006156572140753269,
0.1449710875749588,
0.028103826567530632,
0.029669636860489845,
-0.0018948549404740334,
-0.005003341939300299,
0.05121048167347908,
0.1746254414319992,
-0.011592294089496136,
-0.22072425484657288,
-0.0845772922039032,
0.06936841458082199,
-0.06218599155545235,
-0.12968985736370087,
-0.026130788028240204,
0.045467354357242584,
-0.17519839107990265,
0.026703642681241035,
-0.027433741837739944,
0.0919293761253357,
-0.09345759451389313,
-0.02221956104040146,
0.03687324374914169,
0.084866963326931,
-0.014529162086546421,
0.08703910559415817,
-0.14498743414878845,
0.11886418610811234,
0.02978132851421833,
0.09024628251791,
-0.11081171780824661,
0.07909037172794342,
-0.007550720125436783,
0.009180475026369095,
0.19379350543022156,
-0.011335089802742004,
-0.03514958545565605,
-0.08774717897176743,
-0.11210042238235474,
-0.013537433929741383,
0.12687496840953827,
-0.1243172138929367,
0.08773399889469147,
-0.015198243781924248,
-0.044079482555389404,
0.00937260314822197,
-0.12100647389888763,
-0.17273177206516266,
-0.19628387689590454,
0.05585884302854538,
-0.09575839340686798,
0.025643249973654747,
-0.11914430558681488,
-0.07089093327522278,
-0.02952558360993862,
0.241120383143425,
-0.1745356321334839,
-0.06510113179683685,
-0.1468164622783661,
-0.046294767409563065,
0.1662203073501587,
-0.04437198117375374,
0.0718095526099205,
-0.0208172257989645,
0.20345525443553925,
0.005988610442727804,
-0.004939318168908358,
0.06724198162555695,
-0.08892562240362167,
-0.16873881220817566,
-0.06771010160446167,
0.1510489284992218,
0.11680185794830322,
0.04907919466495514,
-0.002248800592496991,
0.0011772146681323647,
-0.016943959519267082,
-0.1137804463505745,
-0.0033210667315870523,
0.16037839651107788,
0.03878779336810112,
0.025986969470977783,
-0.05243593826889992,
-0.08797456324100494,
-0.06899320334196091,
-0.06853509694337845,
0.06221301481127739,
0.19590823352336884,
-0.10376439243555069,
0.1700313836336136,
0.147536963224411,
-0.07305635511875153,
-0.23175598680973053,
0.035342130810022354,
0.04983805492520332,
0.0014306638622656465,
0.04886869341135025,
-0.18252557516098022,
0.10521943867206573,
0.019543392583727837,
-0.05505957826972008,
0.13485197722911835,
-0.1557481735944748,
-0.1552847921848297,
0.0722852572798729,
0.03904085233807564,
-0.22423844039440155,
-0.1354004591703415,
-0.09622503817081451,
-0.05825018882751465,
-0.14065024256706238,
0.06054598465561867,
-0.002136280992999673,
0.015948504209518433,
0.03500790148973465,
-0.0015643214574083686,
0.027123261243104935,
-0.058935679495334625,
0.18609118461608887,
-0.004065449349582195,
0.020676052197813988,
-0.060264769941568375,
-0.0478842556476593,
0.09839435666799545,
-0.06130504235625267,
0.12208222597837448,
0.004057085141539574,
0.01594383642077446,
-0.10362856835126877,
-0.048314861953258514,
-0.04328322783112526,
0.05154227837920189,
-0.07548051327466965,
-0.10070807486772537,
-0.043625857681035995,
0.08841723203659058,
0.07005169242620468,
-0.03383097052574158,
0.00549331633374095,
-0.07189501076936722,
0.10019614547491074,
0.17795267701148987,
0.17573626339435577,
0.009926567785441875,
-0.07241068035364151,
0.01677953451871872,
-0.04142116755247116,
0.044231921434402466,
-0.2513144314289093,
0.03756171092391014,
0.06098250672221184,
0.029438555240631104,
0.09217222779989243,
-0.020435843616724014,
-0.1820858269929886,
-0.04050002992153168,
0.08094815909862518,
-0.05452597141265869,
-0.22617179155349731,
-0.019085140898823738,
0.0954197570681572,
-0.2020406424999237,
-0.007372708059847355,
0.03995226323604584,
-0.048725228756666183,
-0.023169852793216705,
0.00010950004070764408,
0.06317184865474701,
0.002471912419423461,
0.09773622453212738,
0.0735151618719101,
0.09715340286493301,
-0.08337292820215225,
0.10562895983457565,
0.10150538384914398,
-0.09572599828243256,
0.03605884686112404,
0.06754924356937408,
-0.05300498008728027,
-0.043293699622154236,
0.03665391728281975,
0.033023297786712646,
0.005234600510448217,
-0.060321882367134094,
0.013913018628954887,
-0.036497246474027634,
0.044923391193151474,
0.08326134830713272,
0.03754979372024536,
-0.013354414142668247,
0.06462216377258301,
0.03401726484298706,
-0.10898099094629288,
0.10366570204496384,
0.01731540448963642,
0.04105307161808014,
-0.08384523540735245,
-0.019968897104263306,
0.035425446927547455,
0.030576206743717194,
-0.01765924133360386,
-0.02306121215224266,
-0.02860277332365513,
-0.01614218018949032,
-0.14299540221691132,
-0.023106401786208153,
-0.07243485748767853,
0.006181265693157911,
0.014656842686235905,
-0.031884219497442245,
-0.011233693920075893,
0.02475680410861969,
-0.06979699432849884,
-0.07426341623067856,
-0.006949664559215307,
0.09833318740129471,
-0.15115703642368317,
0.008848577737808228,
0.06907843053340912,
-0.11088496446609497,
0.08190931379795074,
-0.008411259390413761,
0.016245156526565552,
0.022527478635311127,
-0.15448406338691711,
0.05601610988378525,
0.0008648968650959432,
0.01916889287531376,
0.025886621326208115,
-0.16471809148788452,
0.004104440100491047,
-0.04661374166607857,
-0.02149827405810356,
-0.00004464812809601426,
-0.02647159807384014,
-0.12325995415449142,
0.06858719140291214,
-0.015622655861079693,
-0.035931166261434555,
-0.02701525390148163,
0.0539589487016201,
0.07888586074113846,
-0.027474910020828247,
0.10445091128349304,
-0.008690856397151947,
0.04941811040043831,
-0.16801609098911285,
-0.02470702864229679,
-0.04982255399227142,
0.019377702847123146,
0.009884213097393513,
-0.007693959400057793,
0.04183054715394974,
-0.00976533442735672,
0.21883612871170044,
-0.05075952783226967,
0.1607085019350052,
0.05847611650824547,
-0.017352959141135216,
-0.0007513365126214921,
0.06180921941995621,
0.05997028574347496,
0.04658793285489082,
0.009480604901909828,
0.023740366101264954,
-0.022450892254710197,
-0.006695089396089315,
-0.15932634472846985,
0.01890849508345127,
0.14999441802501678,
0.06301083415746689,
0.024745315313339233,
0.05866100639104843,
-0.12775006890296936,
-0.12135478109121323,
0.09311001747846603,
-0.026755332946777344,
0.00928465835750103,
-0.08245618641376495,
0.1358020007610321,
0.14980104565620422,
-0.14000412821769714,
0.05256148427724838,
-0.06134212389588356,
-0.05217423290014267,
-0.10388828068971634,
-0.12032219022512436,
-0.05887215584516525,
-0.053666237741708755,
0.002330566756427288,
-0.03760887682437897,
0.054546963423490524,
0.03344334661960602,
-0.009351172484457493,
-0.00022941511997487396,
0.13597318530082703,
-0.019751882180571556,
-0.0028988157864660025,
0.048313532024621964,
0.03693558648228645,
0.02373051457107067,
-0.05275435373187065,
0.02940409444272518,
0.02539868652820587,
0.032232340425252914,
0.06546790152788162,
0.033412106335163116,
-0.047448933124542236,
0.03804153576493263,
-0.0025254099164158106,
-0.11207924783229828,
0.019641218706965446,
-0.00460948096588254,
-0.0742158442735672,
0.1268945336341858,
0.0407399944961071,
0.010224059224128723,
-0.03741471841931343,
0.24361543357372284,
-0.06653323769569397,
-0.06378097087144852,
-0.13251738250255585,
0.10491154342889786,
-0.0027236645109951496,
0.06476365029811859,
0.023412218317389488,
-0.1284150779247284,
0.005243356805294752,
0.13858191668987274,
0.12181595712900162,
0.0045748427510261536,
0.009228081442415714,
0.0518609918653965,
0.0025186820421367884,
-0.06998204439878464,
0.054019294679164886,
0.06992026418447495,
0.12919506430625916,
-0.07847554981708527,
0.07680778950452805,
0.0006860480643808842,
-0.08370215445756912,
-0.02947772853076458,
0.11312682181596756,
-0.0409729965031147,
0.03491825982928276,
-0.047444481402635574,
0.10916327685117722,
-0.05787910893559456,
-0.29412412643432617,
0.02350960113108158,
-0.09588567912578583,
-0.15202060341835022,
-0.018367812037467957,
0.05944539234042168,
-0.02624768204987049,
0.018029648810625076,
0.06971040368080139,
-0.06011629104614258,
0.20098382234573364,
0.0335683599114418,
-0.07864278554916382,
-0.0664360448718071,
0.04837050288915634,
-0.06564252078533173,
0.2949807047843933,
0.008418165147304535,
0.02863333560526371,
0.10770907253026962,
-0.03253700211644173,
-0.18271861970424652,
0.010723991319537163,
0.1133992001414299,
-0.08056149631738663,
0.08200647681951523,
0.19000613689422607,
-0.012578671798110008,
0.1209007054567337,
0.05294662341475487,
-0.047376248985528946,
0.04217283055186272,
-0.03389401361346245,
-0.051268599927425385,
-0.10752558708190918,
0.058453381061553955,
-0.05909625440835953,
0.15447644889354706,
0.10152646154165268,
-0.05671518296003342,
-0.004550917539745569,
-0.05555408447980881,
0.04875178262591362,
0.01804669201374054,
0.12263146042823792,
0.02951994352042675,
-0.1865430772304535,
0.032826557755470276,
-0.01144319772720337,
0.10186848044395447,
-0.25588861107826233,
-0.08421015739440918,
0.08833149075508118,
-0.011924264021217823,
-0.05105875805020332,
0.10560628771781921,
0.057650718837976456,
0.04243382066488266,
-0.043439045548439026,
-0.10480839014053345,
-0.02186836116015911,
0.14663739502429962,
-0.1469624787569046,
-0.025013303384184837
] |
null | null | diffusers |
# SDXL LoRA DreamBooth - philipp-zettl/logo_LoRA
## Model description
An attempt to train a LoRA on a small set of images to generate logos for marketing purposes. | {"license": "openrail", "library_name": "diffusers", "tags": ["stable-diffusion-xl", "stable-diffusion-xl-diffusers", "text-to-image", "diffusers", "lora", "template:sd-lora"], "datasets": ["logo-wizard/modern-logo-dataset"], "base_model": "stabilityai/stable-diffusion-xl-base-1.0", "instance_prompt": "a modern logo for a vegan coffee shop"} | text-to-image | philipp-zettl/logo_LoRA | [
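A minimal inference sketch (an assumption, not part of the original card: it presumes the LoRA weights load on top of the SDXL base model named in the card's metadata, and reuses the card's instance prompt):

```python
import torch
from diffusers import DiffusionPipeline

# Load the SDXL base model, then apply the LoRA weights on top of it.
pipe = DiffusionPipeline.from_pretrained(
    "stabilityai/stable-diffusion-xl-base-1.0",
    torch_dtype=torch.float16,
).to("cuda")
pipe.load_lora_weights("philipp-zettl/logo_LoRA")

# The instance prompt the LoRA was trained with, per the metadata.
image = pipe("a modern logo for a vegan coffee shop").images[0]
image.save("logo.png")
```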
"diffusers",
"tensorboard",
"stable-diffusion-xl",
"stable-diffusion-xl-diffusers",
"text-to-image",
"lora",
"template:sd-lora",
"dataset:logo-wizard/modern-logo-dataset",
"base_model:stabilityai/stable-diffusion-xl-base-1.0",
"license:openrail",
"has_space",
"region:us"
] | 2024-02-13T08:53:00+00:00 | [] | [] | TAGS
#diffusers #tensorboard #stable-diffusion-xl #stable-diffusion-xl-diffusers #text-to-image #lora #template-sd-lora #dataset-logo-wizard/modern-logo-dataset #base_model-stabilityai/stable-diffusion-xl-base-1.0 #license-openrail #has_space #region-us
|
# SDXL LoRA DreamBooth - philipp-zettl/logo_LoRA
## Model description
An attempt to train a LoRA on a small set of images to generate logos for marketing purposes. | [
"# SDXL LoRA DreamBooth - philipp-zettl/logo_LoRA",
"## Model description \nAn attempt to train a LoRA on a small set of images to generate logos for marketing purposes."
] | [
"TAGS\n#diffusers #tensorboard #stable-diffusion-xl #stable-diffusion-xl-diffusers #text-to-image #lora #template-sd-lora #dataset-logo-wizard/modern-logo-dataset #base_model-stabilityai/stable-diffusion-xl-base-1.0 #license-openrail #has_space #region-us \n",
"# SDXL LoRA DreamBooth - philipp-zettl/logo_LoRA",
"## Model description \nAn attempt to train a LoRA on a small set of images to generate logos for marketing purposes."
] | [
100,
20,
25
] | [
"passage: TAGS\n#diffusers #tensorboard #stable-diffusion-xl #stable-diffusion-xl-diffusers #text-to-image #lora #template-sd-lora #dataset-logo-wizard/modern-logo-dataset #base_model-stabilityai/stable-diffusion-xl-base-1.0 #license-openrail #has_space #region-us \n# SDXL LoRA DreamBooth - philipp-zettl/logo_LoRA## Model description \nAn attempt to train a LoRA on a small set of images to generate logos for marketing purposes."
] | [
-0.03765374422073364,
-0.022215813398361206,
-0.0043585095554590225,
0.03156299889087677,
0.06714826822280884,
0.009770878590643406,
0.19327454268932343,
0.054762471467256546,
0.032866183668375015,
0.049538690596818924,
0.12838995456695557,
0.06134859845042229,
-0.010622368194162846,
0.19488736987113953,
-0.0939544141292572,
-0.21427106857299805,
-0.04149992763996124,
0.006279880180954933,
-0.026061678305268288,
0.04708453640341759,
0.06317757815122604,
-0.08318961411714554,
0.13722681999206543,
-0.052505653351545334,
-0.17472364008426666,
-0.024256911128759384,
-0.0061716726049780846,
-0.038687046617269516,
0.07001151144504547,
0.061598002910614014,
0.06943824887275696,
0.09804577380418777,
0.08840985596179962,
-0.15619032084941864,
0.060414496809244156,
0.007165774703025818,
-0.028465915471315384,
0.06382233649492264,
-0.018009766936302185,
-0.06820995360612869,
0.12554122507572174,
-0.051074277609586716,
-0.022394029423594475,
0.005523455794900656,
-0.07788002490997314,
-0.07043411582708359,
-0.03711273893713951,
0.031792715191841125,
0.02040640451014042,
-0.01880311779677868,
0.04667974263429642,
-0.04641362279653549,
-0.02090326137840748,
0.06650834530591965,
0.19260211288928986,
-0.24770157039165497,
-0.07372424006462097,
0.14821161329746246,
0.07835257798433304,
0.07277451455593109,
-0.06168908253312111,
0.19174587726593018,
0.032524313777685165,
-0.02071518450975418,
0.06849518418312073,
-0.07891274243593216,
0.06512469798326492,
-0.007346327882260084,
-0.04857733100652695,
0.0584455244243145,
0.25039350986480713,
0.013211648911237717,
-0.023358315229415894,
-0.11277790367603302,
-0.021977590397000313,
0.12415730953216553,
-0.07530605792999268,
-0.03078320249915123,
0.04059181362390518,
-0.018056156113743782,
-0.04666878283023834,
-0.17969785630702972,
-0.11454775184392929,
-0.06744257360696793,
-0.05492697283625603,
0.1694987714290619,
-0.013116379268467426,
0.08853462338447571,
-0.05540025606751442,
0.08679480850696564,
-0.05038249492645264,
-0.13413435220718384,
0.0010745711624622345,
-0.08370271325111389,
0.02834993600845337,
0.08408704400062561,
0.006405779160559177,
-0.08350618183612823,
0.10651548206806183,
0.022064073011279106,
0.15173310041427612,
-0.04239703342318535,
-0.0035509697627276182,
0.13355228304862976,
-0.004581620916724205,
0.005411894526332617,
-0.03746713697910309,
-0.09114468097686768,
0.036400169134140015,
0.03501826897263527,
0.07792479544878006,
-0.051684245467185974,
-0.1426009237766266,
0.019743967801332474,
-0.0875997394323349,
0.02003701776266098,
0.04372173175215721,
0.0481017529964447,
-0.06712361425161362,
0.006710978224873543,
0.03557076305150986,
-0.022348806262016296,
0.040914155542850494,
-0.06027304381132126,
-0.03394933044910431,
0.14780639111995697,
0.1268073469400406,
-0.02038484811782837,
0.08420506864786148,
0.08147545903921127,
-0.047310248017311096,
0.008896577171981335,
-0.06288900971412659,
-0.03375954180955887,
-0.022115273401141167,
-0.12886899709701538,
0.03108290024101734,
-0.1165095791220665,
-0.19949960708618164,
0.030198421329259872,
0.058100782334804535,
-0.09126956015825272,
0.024781987071037292,
-0.03135804086923599,
-0.0763189047574997,
0.036280687898397446,
0.010202120058238506,
0.07375109940767288,
-0.028345050290226936,
0.06384233385324478,
0.007250686641782522,
0.14509615302085876,
-0.16712775826454163,
0.01715243048965931,
-0.01606464758515358,
0.04898783192038536,
-0.07477312535047531,
0.0701439306139946,
-0.06387746334075928,
0.026962991803884506,
-0.040583815425634384,
-0.02905755676329136,
-0.09573162347078323,
0.10653205215930939,
0.0032497828360646963,
0.10297073423862457,
-0.22841839492321014,
-0.028684979304671288,
0.111972875893116,
-0.09586545825004578,
-0.09592118859291077,
0.027606191113591194,
-0.0028010618407279253,
0.1429246962070465,
0.048187922686338425,
0.14035259187221527,
0.0491611510515213,
-0.3128741979598999,
0.045371126383543015,
0.03706005588173866,
-0.1140989139676094,
-0.13263842463493347,
-0.03177986294031143,
0.10205728560686111,
0.0975521132349968,
0.08623337000608444,
-0.20261608064174652,
0.03995366394519806,
-0.08689058572053909,
-0.03393872082233429,
-0.04684353992342949,
-0.06661204993724823,
0.009467869997024536,
0.04050232097506523,
0.045549023896455765,
-0.006378751248121262,
-0.08835330605506897,
0.07037091255187988,
0.05283290520310402,
-0.018967239186167717,
0.02054809033870697,
-0.056340575218200684,
0.18018554151058197,
-0.06470233201980591,
-0.010885331779718399,
-0.06397786736488342,
-0.09367188811302185,
-0.03372025117278099,
0.23334930837154388,
0.07473693788051605,
0.23335590958595276,
0.08133076131343842,
0.001397225889377296,
-0.04281176999211311,
-0.0096737090498209,
0.024079471826553345,
-0.02555043250322342,
-0.021971415728330612,
-0.1718573272228241,
0.011303427629172802,
-0.10667546838521957,
-0.052041735500097275,
-0.12175167351961136,
0.011628508567810059,
0.006854387000203133,
0.10784295201301575,
0.1120639517903328,
0.03533054515719414,
0.022285833954811096,
0.023279720917344093,
-0.04497353360056877,
-0.021106533706188202,
0.0765165239572525,
-0.008748278953135014,
-0.06236385926604271,
0.10626987367868423,
-0.08014070242643356,
0.2327437400817871,
0.1119762659072876,
-0.03679263964295387,
-0.04202745854854584,
-0.0777173712849617,
-0.009880494326353073,
0.03968031331896782,
-0.03541846200823784,
-0.07236608862876892,
0.05384243652224541,
-0.013228215277194977,
0.07086725533008575,
-0.01950065605342388,
0.03671211376786232,
0.017043204978108406,
-0.10046065598726273,
-0.07826373726129532,
0.06189560517668724,
0.0981060042977333,
0.0028332474175840616,
0.10095199942588806,
0.10718578100204468,
-0.031692180782556534,
0.16197209060192108,
-0.010060328990221024,
-0.038805797696113586,
-0.048306919634342194,
0.031462833285331726,
0.02133212797343731,
0.19930869340896606,
0.04803355410695076,
-0.07443135231733322,
0.01953265815973282,
-0.08825813233852386,
0.028166180476546288,
-0.10930408537387848,
-0.06318295001983643,
0.007095724809914827,
-0.07295381277799606,
0.19065308570861816,
0.10538259148597717,
-0.07896075397729874,
0.1030602678656578,
-0.04605931043624878,
-0.15447796881198883,
0.0208633653819561,
-0.005704791750758886,
0.05106818675994873,
0.03138774260878563,
-0.026679012924432755,
-0.17500093579292297,
-0.12722274661064148,
0.01238546334207058,
-0.04704936593770981,
-0.03644317388534546,
0.03886101022362709,
-0.013506731949746609,
-0.038494303822517395,
-0.050617776811122894,
0.044240936636924744,
0.02760026976466179,
-0.009042501449584961,
-0.06028120219707489,
0.01278842892497778,
-0.07865256816148758,
-0.08924458920955658,
-0.0174038577824831,
-0.10725206881761551,
0.035277578979730606,
0.0863126739859581,
0.022027336061000824,
0.1532296985387802,
0.04027664288878441,
0.018071383237838745,
0.027614813297986984,
0.018162259832024574,
… (remaining components of the 768-dimensional embedding vector omitted)
] |
null | null | transformers |
# jaLLAbi
jaLLAbi is a merge of the following models using [mergekit](https://github.com/cg123/mergekit):
* [openchat/openchat-3.5-0106](https://huggingface.co/openchat/openchat-3.5-0106)
* [machinists/Mistral-7B-SQL](https://huggingface.co/machinists/Mistral-7B-SQL)
## 🧩 Configuration
\```yaml
slices:
  - sources:
      - model: openchat/openchat-3.5-0106
        layer_range: [0, 32]
      - model: machinists/Mistral-7B-SQL
        layer_range: [0, 32]
merge_method: slerp
base_model: openchat/openchat-3.5-0106
parameters:
  t:
    - filter: self_attn
      value: [0, 0.5, 0.3, 0.7, 1]
    - filter: mlp
      value: [1, 0.5, 0.7, 0.3, 0]
    - value: 0.5
dtype: bfloat16
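# How to read t (mergekit slerp): t=0 keeps the base model (openchat-3.5-0106),
# t=1 takes the other model (Mistral-7B-SQL); each five-value list is interpolated
# across the 32 layers, so self-attention and MLP sublayers blend in opposite
# directions, and the final bare value (0.5) covers all remaining tensors.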
\``` | {"license": "apache-2.0", "tags": ["merge", "mergekit", "lazymergekit", "openchat/openchat-3.5-0106", "machinists/Mistral-7B-SQL"]} | text-generation | AbacusResearch/jaLLAbi | [
"transformers",
"safetensors",
"mixtral",
"text-generation",
"merge",
"mergekit",
"lazymergekit",
"openchat/openchat-3.5-0106",
"machinists/Mistral-7B-SQL",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | 2024-02-13T08:53:42+00:00 | [] | [] | TAGS
#transformers #safetensors #mixtral #text-generation #merge #mergekit #lazymergekit #openchat/openchat-3.5-0106 #machinists/Mistral-7B-SQL #license-apache-2.0 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
# jaLLAbi
jaLLAbi is a merge of the following models using mergekit:
* openchat/openchat-3.5-0106
* machinists/Mistral-7B-SQL
## Configuration
\ | [
"# jaLLAbi\n\njaLLAbi is a merge of the following models using mergekit:\n* openchat/openchat-3.5-0106\n* machinists/Mistral-7B-SQL",
"## Configuration\n\n\\"
] | [
"TAGS\n#transformers #safetensors #mixtral #text-generation #merge #mergekit #lazymergekit #openchat/openchat-3.5-0106 #machinists/Mistral-7B-SQL #license-apache-2.0 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n",
"# jaLLAbi\n\njaLLAbi is a merge of the following models using mergekit:\n* openchat/openchat-3.5-0106\n* machinists/Mistral-7B-SQL",
"## Configuration\n\n\\"
] | [
89,
41,
6
] | [
"passage: TAGS\n#transformers #safetensors #mixtral #text-generation #merge #mergekit #lazymergekit #openchat/openchat-3.5-0106 #machinists/Mistral-7B-SQL #license-apache-2.0 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n# jaLLAbi\n\njaLLAbi is a merge of the following models using mergekit:\n* openchat/openchat-3.5-0106\n* machinists/Mistral-7B-SQL## Configuration\n\n\\"
] | [
… (768-dimensional embedding vector omitted)
] |
null | null | transformers |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# roberta-base-detect-cheapfake-combined-train-test-contradict-context
This model is a fine-tuned version of [roberta-base](https://huggingface.co/roberta-base) on the None dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (a hedged `TrainingArguments` sketch follows the list):
- learning_rate: 5e-06
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5
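
For reference, a minimal sketch of how these values map onto 🤗 `TrainingArguments`; the output directory is assumed from this repo's name, and the (undocumented) train/eval datasets are omitted:

\```python
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

tokenizer = AutoTokenizer.from_pretrained("roberta-base")
# num_labels=2 assumes binary cheapfake detection; the card does not document the label set.
model = AutoModelForSequenceClassification.from_pretrained("roberta-base", num_labels=2)

# Mirrors the hyperparameters listed above; the Adam betas/epsilon are the optimizer defaults.
args = TrainingArguments(
    output_dir="roberta-base-detect-cheapfake-combined-train-test-contradict-context",
    learning_rate=5e-6,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    num_train_epochs=5,
    seed=42,
    lr_scheduler_type="linear",
)

# train_dataset / eval_dataset are not documented for this run and are left out here.
trainer = Trainer(model=model, args=args, tokenizer=tokenizer)
\```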
### Framework versions
- Transformers 4.37.0
- Pytorch 2.1.2
- Datasets 2.1.0
- Tokenizers 0.15.1
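
### Example usage

The card ships without an inference snippet, so the following is a hedged sketch: the caption/context pairing is inferred from the model name, and the label names are not documented, so outputs will carry generic `LABEL_0`/`LABEL_1` ids.

\```python
from transformers import pipeline

clf = pipeline(
    "text-classification",
    model="hoanghoavienvo/roberta-base-detect-cheapfake-combined-train-test-contradict-context",
)

# Hypothetical input: a caption paired with a possibly contradicting context.
print(clf({"text": "A protest in Paris.", "text_pair": "The photo shows a 2016 sports parade."}))
\```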
| {"license": "mit", "tags": ["generated_from_trainer"], "base_model": "roberta-base", "model-index": [{"name": "roberta-base-detect-cheapfake-combined-train-test-contradict-context", "results": []}]} | text-classification | hoanghoavienvo/roberta-base-detect-cheapfake-combined-train-test-contradict-context | [
"transformers",
"tensorboard",
"safetensors",
"roberta",
"text-classification",
"generated_from_trainer",
"base_model:roberta-base",
"license:mit",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | 2024-02-13T08:53:52+00:00 | [] | [] | TAGS
#transformers #tensorboard #safetensors #roberta #text-classification #generated_from_trainer #base_model-roberta-base #license-mit #autotrain_compatible #endpoints_compatible #region-us
|
# roberta-base-detect-cheapfake-combined-train-test-contradict-context
This model is a fine-tuned version of roberta-base on the None dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-06
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5
### Framework versions
- Transformers 4.37.0
- Pytorch 2.1.2
- Datasets 2.1.0
- Tokenizers 0.15.1
| [
"# roberta-base-detect-cheapfake-combined-train-test-contradict-context\n\nThis model is a fine-tuned version of roberta-base on the None dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 5e-06\n- train_batch_size: 32\n- eval_batch_size: 32\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- num_epochs: 5",
"### Framework versions\n\n- Transformers 4.37.0\n- Pytorch 2.1.2\n- Datasets 2.1.0\n- Tokenizers 0.15.1"
] | [
"TAGS\n#transformers #tensorboard #safetensors #roberta #text-classification #generated_from_trainer #base_model-roberta-base #license-mit #autotrain_compatible #endpoints_compatible #region-us \n",
"# roberta-base-detect-cheapfake-combined-train-test-contradict-context\n\nThis model is a fine-tuned version of roberta-base on the None dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 5e-06\n- train_batch_size: 32\n- eval_batch_size: 32\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- num_epochs: 5",
"### Framework versions\n\n- Transformers 4.37.0\n- Pytorch 2.1.2\n- Datasets 2.1.0\n- Tokenizers 0.15.1"
] | [
63,
49,
6,
12,
8,
3,
90,
30
] | [
"passage: TAGS\n#transformers #tensorboard #safetensors #roberta #text-classification #generated_from_trainer #base_model-roberta-base #license-mit #autotrain_compatible #endpoints_compatible #region-us \n# roberta-base-detect-cheapfake-combined-train-test-contradict-context\n\nThis model is a fine-tuned version of roberta-base on the None dataset.## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 5e-06\n- train_batch_size: 32\n- eval_batch_size: 32\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- num_epochs: 5### Framework versions\n\n- Transformers 4.37.0\n- Pytorch 2.1.2\n- Datasets 2.1.0\n- Tokenizers 0.15.1"
] | [
… (768-dimensional embedding vector omitted)
] |
null | null | transformers |
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
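
While the official snippet is still marked as missing, a minimal sketch for a SpeechT5 text-to-speech checkpoint follows; it assumes this repo (`Sagicc/speecht5_finetuned_rs`) is a TTS fine-tune, the example text is arbitrary, and the random speaker embedding is only a stand-in for a real x-vector of the target speaker.

\```python
import soundfile as sf
import torch
from transformers import SpeechT5ForTextToSpeech, SpeechT5HifiGan, SpeechT5Processor

processor = SpeechT5Processor.from_pretrained("Sagicc/speecht5_finetuned_rs")
model = SpeechT5ForTextToSpeech.from_pretrained("Sagicc/speecht5_finetuned_rs")
vocoder = SpeechT5HifiGan.from_pretrained("microsoft/speecht5_hifigan")

inputs = processor(text="Hello from SpeechT5!", return_tensors="pt")

# SpeechT5 conditions generation on a 512-dim speaker x-vector; a random tensor is
# used here only as a placeholder for a real embedding of the target speaker.
speaker_embeddings = torch.randn(1, 512)

speech = model.generate_speech(inputs["input_ids"], speaker_embeddings, vocoder=vocoder)
sf.write("speech.wav", speech.numpy(), samplerate=16000)  # SpeechT5 outputs 16 kHz audio
\```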
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed]
| {"library_name": "transformers", "tags": []} | text-to-audio | Sagicc/speecht5_finetuned_rs | [
"transformers",
"safetensors",
"speecht5",
"text-to-audio",
"arxiv:1910.09700",
"endpoints_compatible",
"region:us"
] | 2024-02-13T08:54:27+00:00 | [
"1910.09700"
] | [] | TAGS
#transformers #safetensors #speecht5 #text-to-audio #arxiv-1910.09700 #endpoints_compatible #region-us
|
# Model Card for Model ID
## Model Details
### Model Description
This is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.
- Developed by:
- Funded by [optional]:
- Shared by [optional]:
- Model type:
- Language(s) (NLP):
- License:
- Finetuned from model [optional]:
### Model Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Downstream Use [optional]
### Out-of-Scope Use
## Bias, Risks, and Limitations
### Recommendations
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
## Training Details
### Training Data
### Training Procedure
#### Preprocessing [optional]
#### Training Hyperparameters
- Training regime:
#### Speeds, Sizes, Times [optional]
## Evaluation
### Testing Data, Factors & Metrics
#### Testing Data
#### Factors
#### Metrics
### Results
#### Summary
## Model Examination [optional]
## Environmental Impact
Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).
- Hardware Type:
- Hours used:
- Cloud Provider:
- Compute Region:
- Carbon Emitted:
## Technical Specifications [optional]
### Model Architecture and Objective
### Compute Infrastructure
#### Hardware
#### Software
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Model Card Authors [optional]
## Model Card Contact
| [
"# Model Card for Model ID",
"## Model Details",
"### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:",
"### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Downstream Use [optional]",
"### Out-of-Scope Use",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.",
"## How to Get Started with the Model\n\nUse the code below to get started with the model.",
"## Training Details",
"### Training Data",
"### Training Procedure",
"#### Preprocessing [optional]",
"#### Training Hyperparameters\n\n- Training regime:",
"#### Speeds, Sizes, Times [optional]",
"## Evaluation",
"### Testing Data, Factors & Metrics",
"#### Testing Data",
"#### Factors",
"#### Metrics",
"### Results",
"#### Summary",
"## Model Examination [optional]",
"## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:",
"## Technical Specifications [optional]",
"### Model Architecture and Objective",
"### Compute Infrastructure",
"#### Hardware",
"#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Model Card Authors [optional]",
"## Model Card Contact"
] | [
"TAGS\n#transformers #safetensors #speecht5 #text-to-audio #arxiv-1910.09700 #endpoints_compatible #region-us \n",
"# Model Card for Model ID",
"## Model Details",
"### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:",
"### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Downstream Use [optional]",
"### Out-of-Scope Use",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.",
"## How to Get Started with the Model\n\nUse the code below to get started with the model.",
"## Training Details",
"### Training Data",
"### Training Procedure",
"#### Preprocessing [optional]",
"#### Training Hyperparameters\n\n- Training regime:",
"#### Speeds, Sizes, Times [optional]",
"## Evaluation",
"### Testing Data, Factors & Metrics",
"#### Testing Data",
"#### Factors",
"#### Metrics",
"### Results",
"#### Summary",
"## Model Examination [optional]",
"## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:",
"## Technical Specifications [optional]",
"### Model Architecture and Objective",
"### Compute Infrastructure",
"#### Hardware",
"#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Model Card Authors [optional]",
"## Model Card Contact"
] | [
43,
6,
3,
82,
28,
3,
4,
9,
9,
10,
42,
20,
3,
4,
5,
9,
11,
13,
3,
12,
5,
4,
5,
3,
4,
9,
53,
9,
8,
6,
3,
14,
8,
7,
9,
4
] | [
"passage: TAGS\n#transformers #safetensors #speecht5 #text-to-audio #arxiv-1910.09700 #endpoints_compatible #region-us \n# Model Card for Model ID## Model Details### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Downstream Use [optional]### Out-of-Scope Use## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.## How to Get Started with the Model\n\nUse the code below to get started with the model.## Training Details### Training Data### Training Procedure#### Preprocessing [optional]#### Training Hyperparameters\n\n- Training regime:#### Speeds, Sizes, Times [optional]## Evaluation### Testing Data, Factors & Metrics#### Testing Data#### Factors#### Metrics### Results#### Summary## Model Examination [optional]## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:## Technical Specifications [optional]### Model Architecture and Objective### Compute Infrastructure#### Hardware#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Model Card Authors [optional]## Model Card Contact"
] | [
-0.06167911738157272,
0.16548340022563934,
-0.004593416582792997,
0.019097663462162018,
0.09994615614414215,
0.002740920288488269,
0.06759516149759293,
0.10583428293466568,
-0.01988399587571621,
0.13707755506038666,
0.011960740201175213,
0.10068204253911972,
0.1161942332983017,
0.16732290387153625,
-0.00771849462762475,
-0.2263876497745514,
0.06144936382770538,
-0.11805395781993866,
0.012750252149999142,
0.118165522813797,
0.1339596062898636,
-0.10760717839002609,
0.07227543741464615,
-0.013376031070947647,
0.0005576065741479397,
-0.027094220742583275,
-0.05605489760637283,
-0.06493523716926575,
0.05241605266928673,
0.07014591246843338,
0.05178326740860939,
0.02368778921663761,
0.07964014261960983,
-0.2901970446109772,
0.018180103972554207,
0.0773867592215538,
0.015448064543306828,
0.06619779765605927,
0.09308769553899765,
-0.06924589723348618,
0.10869299620389938,
-0.03215329721570015,
0.13847891986370087,
0.07598221302032471,
-0.0921265035867691,
-0.18712389469146729,
-0.07246603071689606,
0.051920533180236816,
0.13668860495090485,
0.0665576383471489,
-0.04359521344304085,
0.15289154648780823,
-0.10822506994009018,
0.007361260242760181,
0.10588200390338898,
-0.07336096465587616,
-0.05219945311546326,
0.02498004399240017,
0.10959106683731079,
0.09674584865570068,
-0.12768928706645966,
-0.012142281047999859,
0.03570681810379028,
0.014749986119568348,
0.08549235016107559,
0.01979222148656845,
0.13741254806518555,
0.020408878102898598,
-0.1449318379163742,
-0.05154357850551605,
0.11342272162437439,
0.03485376387834549,
-0.053398992866277695,
-0.22903786599636078,
-0.01638220064342022,
-0.006920823361724615,
-0.03845500573515892,
-0.03769728168845177,
0.045008584856987,
-0.030674787238240242,
0.07523441314697266,
0.019595064222812653,
-0.07039128988981247,
-0.043325118720531464,
0.07271972298622131,
0.0864376500248909,
0.0241733156144619,
-0.015566462650895119,
0.026961123570799828,
0.10896003991365433,
0.0906837061047554,
-0.13148175179958344,
-0.056169867515563965,
-0.074687160551548,
-0.09304957091808319,
-0.04976755753159523,
0.03307120129466057,
0.05410008877515793,
0.0552053265273571,
0.20170217752456665,
0.011198613792657852,
0.05030069500207901,
0.02470220811665058,
0.0114042479544878,
0.08047524839639664,
0.06761832535266876,
-0.06249874830245972,
-0.1343008577823639,
-0.04767819121479988,
0.1132669523358345,
0.006536044180393219,
-0.033678505569696426,
-0.02766772173345089,
0.05639803409576416,
0.04546935856342316,
0.10846012830734253,
0.09350898116827011,
0.000864118046592921,
-0.08865325897932053,
-0.04707770794630051,
0.21121209859848022,
-0.14767716825008392,
0.025280676782131195,
0.02044081874191761,
-0.04957907274365425,
-0.020100636407732964,
-0.0013741545844823122,
0.02771921269595623,
-0.03292099013924599,
0.11321970075368881,
-0.07292300462722778,
-0.03406337648630142,
-0.10584533214569092,
-0.05950983613729477,
0.029707642272114754,
0.02767513133585453,
-0.024991624057292938,
-0.02983229048550129,
-0.08861798793077469,
-0.07416834682226181,
0.07502078264951706,
-0.07578491419553757,
-0.07663456350564957,
-0.0065037552267313,
-0.04771161451935768,
0.013278372585773468,
-0.0034616817720234394,
0.11998847126960754,
-0.040840357542037964,
0.03891460970044136,
-0.051055047661066055,
0.07202231138944626,
0.14668677747249603,
0.039076924324035645,
-0.08992552012205124,
0.06323104351758957,
-0.23776060342788696,
0.10818395018577576,
-0.10731272399425507,
0.019443191587924957,
-0.14286819100379944,
-0.031096577644348145,
0.010259192436933517,
0.029197290539741516,
-0.013150593265891075,
0.13081489503383636,
-0.20270639657974243,
-0.040322739630937576,
0.16239479184150696,
-0.1299779862165451,
-0.0869366005063057,
0.06876348704099655,
-0.05601797625422478,
0.10219244658946991,
0.0402742475271225,
-0.016141079366207123,
0.054179877042770386,
-0.13621863722801208,
-0.03311752527952194,
-0.04220837354660034,
-0.0035726046189665794,
0.1627722829580307,
0.06913509964942932,
-0.0661623477935791,
0.04646996408700943,
0.01671091839671135,
-0.021378520876169205,
-0.037426162511110306,
-0.03594345226883888,
-0.08993098139762878,
0.013640277087688446,
-0.06918895989656448,
0.026745032519102097,
-0.008960003033280373,
-0.08941419422626495,
-0.04280177131295204,
-0.1677020639181137,
-0.01621413044631481,
0.08714034408330917,
0.015129243955016136,
-0.022880811244249344,
-0.08109569549560547,
0.011607972905039787,
0.0012574708089232445,
-0.02600337751209736,
-0.16758106648921967,
-0.048478491604328156,
0.0437433160841465,
-0.19944363832473755,
0.024584902450442314,
-0.040842991322278976,
0.03591115400195122,
0.031232377514243126,
-0.04084841161966324,
-0.013472743332386017,
0.016233524307608604,
0.01822774112224579,
-0.014329759404063225,
-0.23140236735343933,
-0.01615961641073227,
-0.03535204380750656,
0.15334218740463257,
-0.22351068258285522,
0.020466743037104607,
0.06747693568468094,
0.13979218900203705,
0.01158983912318945,
-0.05479741096496582,
0.028993243351578712,
-0.054839182645082474,
-0.045804496854543686,
-0.055449094623327255,
-0.012154782190918922,
-0.022898439317941666,
-0.030789179727435112,
0.06091178581118584,
-0.19192905724048615,
-0.03513779491186142,
0.11404303461313248,
0.05675783008337021,
-0.15447133779525757,
-0.04180677607655525,
-0.03867857903242111,
-0.06885110586881638,
-0.09899334609508514,
-0.052088070660829544,
0.10217426717281342,
0.0510837584733963,
0.04436998814344406,
-0.08110510557889938,
-0.04860338568687439,
0.012579403817653656,
-0.019155830144882202,
-0.025103533640503883,
0.08265406638383865,
0.08895497024059296,
-0.10449666529893875,
0.08904232829809189,
0.06852846592664719,
0.0675421804189682,
0.09204627573490143,
-0.004029807634651661,
-0.11263967305421829,
-0.013809462077915668,
0.011600077152252197,
0.014785882085561752,
0.12681783735752106,
-0.0559755377471447,
0.05606510490179062,
0.05637483671307564,
-0.024016454815864563,
0.015552452765405178,
-0.10461848229169846,
0.029330970719456673,
0.03294013440608978,
-0.0031076744198799133,
0.02048010192811489,
-0.03663692623376846,
0.019858507439494133,
0.0915752500295639,
0.034693483263254166,
0.031055932864546776,
0.012983591295778751,
-0.04245990142226219,
-0.1126476377248764,
0.16175472736358643,
-0.09511085599660873,
-0.262237548828125,
-0.12163412570953369,
-0.014436004683375359,
0.048990245908498764,
-0.016396086663007736,
0.014904003590345383,
-0.041093356907367706,
-0.11958764493465424,
-0.09756673872470856,
0.018072670325636864,
0.05451248213648796,
-0.09027399867773056,
-0.05986254662275314,
0.05341104790568352,
0.04128555208444595,
-0.1249379962682724,
0.027149846777319908,
0.04481068626046181,
-0.04373796656727791,
-0.011254833079874516,
0.0753229483962059,
0.09212097525596619,
0.17986233532428741,
0.02658134326338768,
-0.026226161047816277,
0.029820308089256287,
0.24225081503391266,
-0.1508846879005432,
0.09975794702768326,
0.14357402920722961,
-0.07127903401851654,
0.0947582945227623,
0.21705219149589539,
0.036358531564474106,
-0.08079371601343155,
0.046123240143060684,
0.04316700994968414,
-0.03487853333353996,
-0.24535785615444183,
-0.07814157009124756,
-0.003946501761674881,
-0.08014543354511261,
0.07988967001438141,
0.08827400952577591,
0.11501660943031311,
0.055602192878723145,
-0.09838440269231796,
-0.07304178178310394,
0.04065968841314316,
0.10669359564781189,
-0.012147650122642517,
0.01107509434223175,
0.09397588670253754,
-0.026495514437556267,
0.006272942293435335,
0.10411658138036728,
-0.001983852591365576,
0.19068185985088348,
0.03722454980015755,
0.15556569397449493,
0.08323909342288971,
0.053902432322502136,
0.025087548419833183,
0.01554166804999113,
0.03377028554677963,
0.017029643058776855,
-0.010178633034229279,
-0.09146630764007568,
0.009826160036027431,
0.12940752506256104,
0.062371209263801575,
0.031172001734375954,
0.02540227770805359,
-0.03539096564054489,
0.06440598517656326,
0.1607149988412857,
0.009393136017024517,
-0.202750563621521,
-0.039394162595272064,
0.08372968435287476,
-0.0893591120839119,
-0.11664912849664688,
-0.0007576621137559414,
0.021753160282969475,
-0.17870621383190155,
0.04986803978681564,
-0.017884990200400352,
0.10999759286642075,
-0.11610829085111618,
-0.02926469035446644,
0.05590568855404854,
0.08341801166534424,
-0.035009145736694336,
0.07892953604459763,
-0.18237504363059998,
0.12593483924865723,
0.00918023195117712,
0.059552956372499466,
-0.10854919254779816,
0.09576401114463806,
0.019139986485242844,
-0.013917059637606144,
0.16873063147068024,
-0.007535479497164488,
-0.08365084230899811,
-0.045635730028152466,
-0.07673026621341705,
-0.022854501381516457,
0.10513032972812653,
-0.09903683513402939,
0.08059301227331161,
-0.01738971471786499,
-0.03904140740633011,
-0.0032601123675704002,
-0.12286004424095154,
-0.1511862725019455,
-0.18046987056732178,
0.07015654444694519,
-0.11293238401412964,
0.020599322393536568,
-0.11062124371528625,
-0.06499124318361282,
-0.047406889498233795,
0.19597364962100983,
-0.13840043544769287,
-0.08751856535673141,
-0.14411881566047668,
-0.09414899349212646,
0.15941599011421204,
-0.04024198651313782,
0.09103775024414062,
0.004244681913405657,
0.21662510931491852,
0.011199125088751316,
-0.0030722864903509617,
0.09248382598161697,
-0.09600712358951569,
-0.20458319783210754,
-0.08677847683429718,
0.14288555085659027,
0.12799601256847382,
0.046403974294662476,
-0.018540190532803535,
0.032712943851947784,
-0.02208361215889454,
-0.11312771588563919,
0.01658632420003414,
0.1088055819272995,
0.06246071681380272,
0.04286294803023338,
0.0041239457204937935,
-0.1525641232728958,
-0.08867873251438141,
-0.05204599350690842,
0.009888927452266216,
0.1847924143075943,
-0.06636099517345428,
0.16044402122497559,
0.15668396651744843,
-0.04995095357298851,
-0.20933379232883453,
0.034748345613479614,
0.05230649933218956,
-0.00987466424703598,
0.05893637239933014,
-0.18686597049236298,
0.08693484961986542,
0.014266173355281353,
-0.0648329108953476,
0.15024936199188232,
-0.17203234136104584,
-0.1519346833229065,
0.08157461881637573,
0.05907076224684715,
-0.2273063212633133,
-0.1250568926334381,
-0.10276850312948227,
-0.061700839549303055,
-0.14479182660579681,
0.0732736736536026,
0.015570375137031078,
-0.00019644292478915304,
0.051765237003564835,
0.016510382294654846,
0.020424112677574158,
-0.05057976394891739,
0.2035384625196457,
0.002489325124770403,
0.029817067086696625,
-0.08044003695249557,
-0.08752407133579254,
0.0447240024805069,
-0.04010516032576561,
0.06199098378419876,
-0.007123992312699556,
0.0006209314451552927,
-0.07103875279426575,
-0.06293250620365143,
-0.06258604675531387,
0.0262419655919075,
-0.08497549593448639,
-0.09944456070661545,
-0.06725189834833145,
0.10138873755931854,
0.09479286521673203,
-0.03241611272096634,
-0.05216696485877037,
-0.0930715948343277,
0.04046650230884552,
0.22360840439796448,
0.19076663255691528,
0.06722480058670044,
-0.08785752207040787,
0.005118930712342262,
-0.02236255444586277,
0.0450364351272583,
-0.17726388573646545,
0.056881166994571686,
0.04331925883889198,
0.026429545134305954,
0.11926604062318802,
-0.023719599470496178,
-0.16705875098705292,
-0.043999046087265015,
0.05742347240447998,
-0.055079732090234756,
-0.18982692062854767,
-0.012035333551466465,
0.06702776253223419,
-0.17519685626029968,
-0.07758849859237671,
0.012915488332509995,
-0.00994193460792303,
-0.031156150624155998,
0.008984750136733055,
0.07718369364738464,
0.023616446182131767,
0.11048536002635956,
0.06622365117073059,
0.09973084926605225,
-0.11397983133792877,
0.09215400367975235,
0.08957680314779282,
-0.09825357794761658,
0.004264728631824255,
0.07989643514156342,
-0.05516355484724045,
-0.02421298436820507,
0.022927649319171906,
0.07039427012205124,
0.03355913609266281,
-0.057881902903318405,
-0.0131098423153162,
-0.11349627375602722,
0.05688451603055,
0.121361643075943,
0.03505857661366463,
-0.007830782793462276,
0.04882485792040825,
0.031029390171170235,
-0.08691712468862534,
0.12325524538755417,
0.059084340929985046,
0.04143647849559784,
-0.06310784071683884,
-0.02777070924639702,
0.039047181606292725,
-0.021320687606930733,
-0.021001920104026794,
-0.030338548123836517,
-0.055356092751026154,
-0.00751672824844718,
-0.1686430722475052,
0.022748786956071854,
-0.08493874222040176,
0.0033119560685008764,
0.015885615721344948,
-0.043597884476184845,
-0.013173247687518597,
0.010756236501038074,
-0.08633670955896378,
-0.04398580268025398,
-0.0028673342894762754,
0.10262144356966019,
-0.1500122994184494,
0.008598659187555313,
0.09958469867706299,
-0.1219477429986,
0.06875897198915482,
0.0009574598516337574,
-0.011421050876379013,
0.011812449432909489,
-0.14687880873680115,
0.037942662835121155,
-0.0030714983586221933,
0.015035992488265038,
0.030879078432917595,
-0.18846383690834045,
-0.0014494924107566476,
-0.03991689160466194,
-0.04869832843542099,
-0.023775417357683182,
-0.08024434745311737,
-0.12044926732778549,
0.10584390163421631,
0.01124478131532669,
-0.09261500090360641,
-0.015082105994224548,
0.04159711301326752,
0.11273670941591263,
-0.04763614013791084,
0.14161771535873413,
-0.010532353073358536,
0.06576383858919144,
-0.17819805443286896,
-0.014620738103985786,
-0.01061546616256237,
0.011424031108617783,
-0.005006077699363232,
-0.0023361491039395332,
0.05289961025118828,
-0.014277007430791855,
0.2403629869222641,
-0.021734731271862984,
0.04695047810673714,
0.06137616187334061,
0.021372836083173752,
0.010653541423380375,
0.09202362596988678,
0.05641287565231323,
0.023071452975273132,
0.00670867133885622,
0.015673208981752396,
-0.047076862305402756,
-0.02324189990758896,
-0.1606062650680542,
0.0718972384929657,
0.15766695141792297,
0.0951700285077095,
-0.013449890539050102,
0.06473343074321747,
-0.11728892475366592,
-0.10423410683870316,
0.1021803691983223,
-0.04491652175784111,
-0.0006307702278718352,
-0.050389364361763,
0.15571917593479156,
0.14174163341522217,
-0.16775572299957275,
0.07407859712839127,
-0.05846787616610527,
-0.05416850000619888,
-0.12001023441553116,
-0.1721457839012146,
-0.06830571591854095,
-0.04530676454305649,
-0.0011672941036522388,
-0.05874517187476158,
0.0865936204791069,
0.11224820464849472,
-0.0002561230503488332,
0.0006645999383181334,
0.09735790640115738,
-0.03758581727743149,
-0.0189666710793972,
0.03851906582713127,
0.04490121826529503,
0.020799053832888603,
-0.04939786344766617,
0.013447504490613937,
0.007503087166696787,
0.037598274648189545,
0.05245101824402809,
0.025861544534564018,
-0.04639001935720444,
0.020399359986186028,
-0.011709860526025295,
-0.10881736874580383,
0.03785376995801926,
-0.03712807223200798,
-0.057738449424505234,
0.14757823944091797,
0.028817616403102875,
0.02395014651119709,
-0.02489541284739971,
0.2239140272140503,
-0.07530294358730316,
-0.07603023946285248,
-0.14248964190483093,
0.10133551061153412,
-0.05165180191397667,
0.06086588650941849,
0.05936092883348465,
-0.11202768236398697,
0.014482755213975906,
0.13496603071689606,
0.1264360249042511,
-0.03303531929850578,
0.008262017741799355,
0.031042123213410378,
0.004532141610980034,
-0.0428909957408905,
0.04112871736288071,
0.0375734344124794,
0.14108939468860626,
-0.06696342676877975,
0.07585976272821426,
-0.010344007983803749,
-0.09073739498853683,
-0.02462323196232319,
0.12893642485141754,
0.009150884114205837,
0.027105161920189857,
-0.08226025849580765,
0.12261601537466049,
-0.07169446349143982,
-0.2403191775083542,
0.03586927056312561,
-0.0507417693734169,
-0.15592887997627258,
-0.023326456546783447,
0.028075914829969406,
0.00763680599629879,
0.02350701205432415,
0.06463916599750519,
-0.06884550303220749,
0.1652672439813614,
0.0423705093562603,
-0.07087545841932297,
-0.060932062566280365,
0.077521912753582,
-0.09796937555074692,
0.29068225622177124,
0.012665039859712124,
0.05931966006755829,
0.08941308408975601,
-0.017797626554965973,
-0.12795063853263855,
0.0434376560151577,
0.10241130739450455,
-0.10249767452478409,
0.06670315563678741,
0.20217923820018768,
-0.0011505897855386138,
0.11591868847608566,
0.08216839283704758,
-0.07882541418075562,
0.0643257200717926,
-0.08226620405912399,
-0.0812302678823471,
-0.09903064370155334,
0.07298213988542557,
-0.06974713504314423,
0.14509135484695435,
0.12227024137973785,
-0.04315946251153946,
-0.0011773762525990605,
-0.02769840881228447,
0.04610813409090042,
0.005601190961897373,
0.13444000482559204,
0.015432958491146564,
-0.18767565488815308,
0.033016908913850784,
0.014415440149605274,
0.10305216163396835,
-0.20714892446994781,
-0.07635559141635895,
0.039395298808813095,
-0.025580987334251404,
-0.05728061869740486,
0.10401864349842072,
0.046351358294487,
0.039458729326725006,
-0.04885422810912132,
-0.04630141332745552,
-0.00017566129099577665,
0.16165021061897278,
-0.11629605293273926,
-0.0033297778572887182
] |
null | null | null | <style>
.title-container {
display: flex;
justify-content: center;
align-items: center;
height: 100vh; /* Adjust this value to position the title vertically */
}
.title {
font-size: 2.5em;
text-align: center;
color: RED;
font-family: 'Helvetica Neue', sans-serif;
text-transform: uppercase;
letter-spacing: 0.1em;
padding: 0.5em 0;
background: transparent;
}
.title span {
background: -webkit-linear-gradient(45deg, RED,BLACK);
-webkit-background-clip: text;
-webkit-text-fill-color: transparent;
}
.custom-table {
table-layout: fixed;
width: 100%;
border-collapse: collapse;
margin-top: 2em;
}
.custom-table td {
width: 50%;
vertical-align: top;
padding: 10px;
box-shadow: 0px 0px 0px 0px rgba(0, 0, 0, 0.15);
}
.custom-image-container {
position: relative;
width: 100%;
margin-bottom: 0em;
overflow: hidden;
border-radius: 10px;
transition: transform .7s;
/* Smooth transition for the container */
}
.custom-image-container:hover {
transform: scale(1.05);
/* Scale the container on hover */
}
.custom-image {
width: auto;
height: auto;
object-fit: cover;
border-radius: 10px;
transition: transform .7s;
margin-bottom: 0em;
}
.nsfw-filter {
filter: blur(8px); /* Apply a blur effect */
transition: filter 0.3s ease; /* Smooth transition for the blur effect */
}
.custom-image-container:hover .nsfw-filter {
filter: none; /* Remove the blur effect on hover */
}
.overlay {
position: absolute;
bottom: 0;
left: 0;
right: 0;
color: white;
width: 100%;
height: 40%;
display: flex;
flex-direction: column;
justify-content: center;
align-items: center;
font-size: 1vw;
font-weight: bold;
text-align: center;
opacity: 0;
/* Hidden by default; revealed on hover */
background: linear-gradient(0deg, rgba(0, 0, 0, 0.8) 60%, rgba(0, 0, 0, 0) 100%);
transition: opacity .5s;
}
.custom-image-container:hover .overlay {
opacity: 1;
/* Fade the overlay in on hover */
}
.overlay-text {
background: linear-gradient(45deg, #000000, #0000);
-webkit-background-clip: text;
color: transparent;
/* Fallback for browsers that do not support this effect */
text-shadow: 2px 2px 4px rgba(0, 0, 0, 0.7);
/* Enhanced text shadow for better legibility */
}
.overlay-subtext {
font-size: 0.75em;
margin-top: 0.5em;
font-style: italic;
}
.overlay,
.overlay-subtext {
text-shadow: 2px 2px 4px rgba(0, 0, 0, 0.5);
}
</style>
<h1 class="title">
<span>Manga</span>
</h1>
<table class="custom-table">
<tr>
<td>
<div class="custom-image-container">
<img class="custom-image" src="https://enhanceai.s3.amazonaws.com/6554c034-54c0-4598-92a5-c51d1d8e2737_1.png" alt="sample1">
</div>
<div class="custom-image-container">
<img class="custom-image" src="https://enhanceai.s3.amazonaws.com/e5f99d8a-8e26-4169-80c3-b49b22ba0a06_2.png" alt="sample4">
</div>
</td>
<td>
<div class="custom-image-container">
<img class="custom-image" src="https://enhanceai.s3.amazonaws.com/642efe75-e842-4c76-a037-3a513ed20b1e_1.png" alt="sample2">
</div>
<div class="custom-image-container">
<img class="custom-image" src="https://enhanceai.s3.amazonaws.com/ee801a23-9acb-48af-bb31-92b370f5b9e5_1.png" alt="sample3">
</div>
</td>
<td>
<div class="custom-image-container">
<img class="custom-image" src="https://enhanceai.s3.amazonaws.com/76321b32-5913-4cc4-9ce4-46df67ce8d4f_2.png" alt="sample1">
</div>
<div class="custom-image-container">
<img class="custom-image" src="https://enhanceai.s3.amazonaws.com/6c110f5f-4de4-4ed5-95ca-49267a32ceec_1.png" alt="sample1">
</div>
</td>
</tr>
</table>
## Overview
**Manga** is an advanced manga image generation AI model from [EnhanceAi](https://enhanceai.art).
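A minimal usage sketch, assuming the `enhanceaiart/manga` repository loads with diffusers' SDXL pipeline (this row is tagged `stable-diffusion-xl`); the prompt, dtype, and step count are illustrative choices, not values from the card.

```python
# A minimal sketch, assuming the repo works with StableDiffusionXLPipeline.
import torch
from diffusers import StableDiffusionXLPipeline

pipe = StableDiffusionXLPipeline.from_pretrained(
    "enhanceaiart/manga", torch_dtype=torch.float16
).to("cuda")

image = pipe(
    "a manga-style ink portrait of a wandering samurai, dramatic shading",
    num_inference_steps=30,
).images[0]
image.save("manga_sample.png")
```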
## Model Details
- **Try now:** [EnhanceAi](https://enhanceai.art) (200 free image generations)
- **Developed by:** Pranav Ajay & Kushal Saho | {"tags": ["text-to-image", "stable-diffusion", "safetensors", "stable-diffusion-xl", "text-generator", "image-generator", "ai", "image-to-image", "inpainting", "image-to-inpainting", "manga"]} | text-to-image | enhanceaiart/manga | [
"text-to-image",
"stable-diffusion",
"safetensors",
"stable-diffusion-xl",
"text-generator",
"image-generator",
"ai",
"image-to-image",
"inpainting",
"image-to-inpainting",
"manga",
"region:us"
] | 2024-02-13T08:57:51+00:00 | [] | [] | TAGS
#text-to-image #stable-diffusion #safetensors #stable-diffusion-xl #text-generator #image-generator #ai #image-to-image #inpainting #image-to-inpainting #manga #region-us
| <style>
.title-container {
display: flex;
justify-content: center;
align-items: center;
height: 100vh; /* Adjust this value to position the title vertically */
}
.title {
font-size: 2.5em;
text-align: center;
color: RED;
font-family: 'Helvetica Neue', sans-serif;
text-transform: uppercase;
letter-spacing: 0.1em;
padding: 0.5em 0;
background: transparent;
}
.title span {
background: -webkit-linear-gradient(45deg, RED,BLACK);
-webkit-background-clip: text;
-webkit-text-fill-color: transparent;
}
.custom-table {
table-layout: fixed;
width: 100%;
border-collapse: collapse;
margin-top: 2em;
}
.custom-table td {
width: 50%;
vertical-align: top;
padding: 10px;
box-shadow: 0px 0px 0px 0px rgba(0, 0, 0, 0.15);
}
.custom-image-container {
position: relative;
width: 100%;
margin-bottom: 0em;
overflow: hidden;
border-radius: 10px;
transition: transform .7s;
/* Smooth transition for the container */
}
.custom-image-container:hover {
transform: scale(1.05);
/* Scale the container on hover */
}
.custom-image {
width: auto;
height: auto;
object-fit: cover;
border-radius: 10px;
transition: transform .7s;
margin-bottom: 0em;
}
.nsfw-filter {
filter: blur(8px); /* Apply a blur effect */
transition: filter 0.3s ease; /* Smooth transition for the blur effect */
}
.custom-image-container:hover .nsfw-filter {
filter: none; /* Remove the blur effect on hover */
}
.overlay {
position: absolute;
bottom: 0;
left: 0;
right: 0;
color: white;
width: 100%;
height: 40%;
display: flex;
flex-direction: column;
justify-content: center;
align-items: center;
font-size: 1vw;
font-weight: bold;
text-align: center;
opacity: 0;
/* Hidden by default; revealed on hover */
background: linear-gradient(0deg, rgba(0, 0, 0, 0.8) 60%, rgba(0, 0, 0, 0) 100%);
transition: opacity .5s;
}
.custom-image-container:hover .overlay {
opacity: 1;
/* Fade the overlay in on hover */
}
.overlay-text {
background: linear-gradient(45deg, #000000, #0000);
-webkit-background-clip: text;
color: transparent;
/* Fallback for browsers that do not support this effect */
text-shadow: 2px 2px 4px rgba(0, 0, 0, 0.7);
/* Enhanced text shadow for better legibility */
}
.overlay-subtext {
font-size: 0.75em;
margin-top: 0.5em;
font-style: italic;
}
.overlay,
.overlay-subtext {
text-shadow: 2px 2px 4px rgba(0, 0, 0, 0.5);
}
</style>
<h1 class="title">
<span>Manga</span>
</h1>
<table class="custom-table">
<tr>
<td>
<div class="custom-image-container">
<img class="custom-image" src="URL alt="sample1">
</div>
<div class="custom-image-container">
<img class="custom-image" src="URL alt="sample4">
</div>
</td>
<td>
<div class="custom-image-container">
<img class="custom-image" src="URL alt="sample2">
</div>
<div class="custom-image-container">
<img class="custom-image" src="URL alt="sample3">
</div>
</td>
<td>
<div class="custom-image-container">
<img class="custom-image" src="URL alt="sample1">
</div>
<div class="custom-image-container">
<img class="custom-image" src="URL alt="sample1">
</div>
</td>
</tr>
</table>
## Overview
Manga is an advanced manga image generation AI model from EnhanceAi
## Model Details
- Try now: EnhanceAi (200 free image generations)
- Developed by: Pranav Ajay & Kushal Saho | [
"## Overview \n\nManga is Advance Manga Image Generate Ai Model From EnhanceAi",
"## Model Details\n\n- Try Now: EnhanceAi 200 Image Generate Free\n- Developer By: Pranav Ajay & Kushal Saho"
] | [
"TAGS\n#text-to-image #stable-diffusion #safetensors #stable-diffusion-xl #text-generator #image-generator #ai #image-to-image #inpainting #image-to-inpainting #manga #region-us \n",
"## Overview \n\nManga is Advance Manga Image Generate Ai Model From EnhanceAi",
"## Model Details\n\n- Try Now: EnhanceAi 200 Image Generate Free\n- Developer By: Pranav Ajay & Kushal Saho"
] | [
66,
17,
29
] | [
"passage: TAGS\n#text-to-image #stable-diffusion #safetensors #stable-diffusion-xl #text-generator #image-generator #ai #image-to-image #inpainting #image-to-inpainting #manga #region-us \n## Overview \n\nManga is Advance Manga Image Generate Ai Model From EnhanceAi## Model Details\n\n- Try Now: EnhanceAi 200 Image Generate Free\n- Developer By: Pranav Ajay & Kushal Saho"
] | [
-0.03894796594977379,
0.003951326012611389,
-0.0010059450287371874,
-0.006467417813837528,
0.13284404575824738,
0.0403798446059227,
0.2652876675128937,
0.03678659349679947,
-0.04571264609694481,
0.09892116487026215,
0.03469298407435417,
0.011454527266323566,
0.11678346991539001,
0.16348429024219513,
-0.029543530195951462,
-0.29774776101112366,
0.10715034604072571,
0.045845888555049896,
0.04999106004834175,
0.08395833522081375,
0.09495700895786285,
-0.09124720841646194,
0.09055004268884659,
-0.005549744702875614,
-0.06980885565280914,
-0.05902848392724991,
0.005256759934127331,
-0.12337254732847214,
0.02694605104625225,
0.050431232899427414,
0.0528283566236496,
0.041025854647159576,
0.08885741233825684,
-0.13220183551311493,
0.06136278063058853,
-0.037047430872917175,
-0.018727676942944527,
0.03764868900179863,
0.1410854011774063,
0.04977281019091606,
0.08160172402858734,
-0.019904015585780144,
-0.1182359978556633,
0.017073452472686768,
-0.07437683641910553,
0.19015395641326904,
0.0013852565316483378,
0.09381686896085739,
0.22264643013477325,
0.06831805408000946,
-0.05619777366518974,
-0.036974262446165085,
0.02666599676012993,
-0.019597142934799194,
0.028300004079937935,
-0.23077808320522308,
-0.12300188839435577,
0.06814827024936676,
0.09197165071964264,
0.17117524147033691,
-0.07793419808149338,
0.15006761252880096,
0.09597416967153549,
-0.010029650293290615,
0.022703418508172035,
-0.02645755000412464,
0.23202259838581085,
-0.046234533190727234,
-0.038980185985565186,
0.10444542020559311,
0.1367899477481842,
0.0342523455619812,
-0.019092243164777756,
-0.13754495978355408,
-0.055066466331481934,
0.13492020964622498,
-0.11113111674785614,
-0.04311245679855347,
-0.02264142967760563,
0.0034952375572174788,
0.04116922244429588,
-0.054709792137145996,
-0.11849042773246765,
-0.09499463438987732,
0.009055765345692635,
0.18002919852733612,
0.019531797617673874,
0.01019135769456625,
-0.11784683167934418,
0.024922596290707588,
-0.00456614326685667,
-0.12677110731601715,
0.04961203783750534,
-0.11861400306224823,
0.09261103719472885,
0.09300751984119415,
0.0468357689678669,
-0.3348565995693207,
0.06557799130678177,
-0.06280253827571869,
-0.05576041713356972,
-0.020137043669819832,
-0.026392178609967232,
0.14260849356651306,
0.040343184024095535,
-0.0214761383831501,
-0.12424995750188828,
0.004769385326653719,
0.04420452564954758,
0.14712326228618622,
0.1301230639219284,
-0.05744526535272598,
-0.10724394768476486,
-0.00963714811950922,
-0.09556365013122559,
-0.036938074976205826,
-0.03370433673262596,
0.02739592082798481,
-0.07798907905817032,
-0.004933664575219154,
0.17775653302669525,
-0.06841321289539337,
-0.06661156564950943,
-0.034456074237823486,
-0.030161386355757713,
0.0668504387140274,
-0.04982810467481613,
-0.0016873273998498917,
-0.0019493559375405312,
0.03852264955639839,
-0.08212243765592575,
0.0026391802821308374,
-0.02455783262848854,
0.002295617712661624,
-0.03924374282360077,
0.10186228156089783,
-0.01678626611828804,
-0.1443382203578949,
-0.08750315755605698,
0.013944040052592754,
0.02697775885462761,
-0.07106515765190125,
0.037159308791160583,
-0.008110792376101017,
0.019335966557264328,
0.040425918996334076,
0.030751308426260948,
-0.12667042016983032,
-0.035273049026727676,
0.04229260981082916,
0.056625913828611374,
0.0796213448047638,
0.0020281216129660606,
0.016780486330389977,
-0.06832852214574814,
0.068231500685215,
-0.14759624004364014,
-0.017108438536524773,
0.015942338854074478,
0.05407455563545227,
-0.039790596812963486,
0.00980670191347599,
-0.050876107066869736,
0.07065995782613754,
0.03251202777028084,
0.2775227129459381,
-0.161372572183609,
-0.0034316228702664375,
0.08209600299596786,
-0.14513328671455383,
-0.17784668505191803,
0.07250680774450302,
0.023655610159039497,
0.0888967365026474,
0.015116874128580093,
0.09044792503118515,
-0.15741518139839172,
-0.06678086519241333,
-0.024499481543898582,
-0.030461886897683144,
-0.01886453852057457,
0.021934453397989273,
0.15229949355125427,
0.08552917093038559,
-0.05681052803993225,
0.043649353086948395,
-0.13405805826187134,
0.05673529580235481,
-0.07631242275238037,
-0.06056014820933342,
0.06984948366880417,
-0.04090365022420883,
0.022002171725034714,
-0.002924831584095955,
0.1005820706486702,
0.06322048604488373,
-0.07934610545635223,
-0.1626996099948883,
0.06214376911520958,
-0.06339991837739944,
0.022165263071656227,
-0.08046182990074158,
0.20494161546230316,
-0.1708691120147705,
0.021631158888339996,
-0.030689233914017677,
0.057398002594709396,
0.02700839936733246,
0.08792684972286224,
0.0434102900326252,
-0.12239687889814377,
0.026298291981220245,
0.08257386833429337,
-0.025159485638141632,
-0.06159742549061775,
-0.01588127203285694,
-0.08297433704137802,
0.0029368966352194548,
-0.12317019701004028,
0.08299756050109863,
-0.07459396123886108,
0.010801253840327263,
-0.14787597954273224,
0.02539464272558689,
-0.0348144955933094,
0.06401839107275009,
0.005979539826512337,
0.031538303941488266,
0.12682202458381653,
-0.03457837924361229,
-0.148299902677536,
0.018116995692253113,
0.08822647482156754,
-0.01873258873820305,
-0.19395045936107635,
0.19727373123168945,
-0.1490805447101593,
0.09296728670597076,
0.10200423002243042,
-0.05696956068277359,
-0.03427772596478462,
-0.11772716790437698,
-0.012004735879600048,
0.019774723798036575,
-0.055050428956747055,
0.07539729028940201,
-0.08423665165901184,
0.030154580250382423,
0.13713183999061584,
-0.06761369854211807,
0.14540058374404907,
0.03285065293312073,
-0.07734403014183044,
-0.11811570078134537,
0.03597623482346535,
0.16363997757434845,
0.011146707460284233,
0.08805843442678452,
0.18655277788639069,
0.021942241117358208,
0.1874987781047821,
0.05873258411884308,
-0.10526753962039948,
0.0051374551840126514,
0.0378616526722908,
0.0534980483353138,
0.18969537317752838,
0.002666325541213155,
-0.03156587854027748,
0.02190655656158924,
-0.04981773719191551,
0.04152682051062584,
-0.09845778346061707,
-0.13306355476379395,
-0.02835967019200325,
-0.017386870458722115,
0.1774316430091858,
0.05362390726804733,
-0.10566365718841553,
0.09753619134426117,
-0.03459026291966438,
0.019957376644015312,
0.00872848927974701,
0.004051524214446545,
-0.045607853680849075,
0.08927546441555023,
0.01611337997019291,
-0.33128976821899414,
-0.1317339688539505,
0.050179023295640945,
-0.0975177213549614,
0.011657810769975185,
0.0008328488329425454,
-0.1981920450925827,
-0.06485938280820847,
-0.12287700921297073,
-0.008823541924357414,
-0.0118253780528903,
-0.05888162553310394,
-0.014305118471384048,
0.02071365714073181,
-0.035051874816417694,
-0.029974568635225296,
0.012230724096298218,
-0.02238980308175087,
-0.004007370211184025,
0.16417133808135986,
0.017526142299175262,
0.2309812754392624,
0.14569398760795593,
0.00952286459505558,
-0.00669315317645669,
0.03510424122214317,
0.08304260671138763,
-0.13727132976055145,
0.10281197726726532,
0.19920210540294647,
0.02167055942118168,
0.08287801593542099,
0.18892408907413483,
0.028432006016373634,
-0.059703897684812546,
0.06352909654378891,
-0.08742822706699371,
-0.04775301367044449,
-0.12206617742776871,
-0.03651171550154686,
-0.05501602590084076,
0.03354676440358162,
-0.04136170819401741,
0.06033805012702942,
-0.023249970749020576,
0.18191379308700562,
0.0027948233764618635,
-0.011043989099562168,
0.04403916746377945,
0.07726823538541794,
-0.08309383690357208,
-0.013939677737653255,
0.032654788345098495,
0.007120396941900253,
-0.06121159344911575,
0.06841214001178741,
-0.005704034119844437,
0.06053698807954788,
-0.0038081153761595488,
0.0371432863175869,
0.029900947585701942,
0.09426627308130264,
0.17146459221839905,
0.009581048041582108,
0.018433721736073494,
-0.04888443276286125,
-0.07673894613981247,
-0.08588004112243652,
0.013846752233803272,
0.1609281599521637,
-0.12858940660953522,
-0.07230951637029648,
-0.008897162042558193,
0.09913996607065201,
0.08626323938369751,
-0.024356752634048462,
0.025776248425245285,
-0.17194853723049164,
0.0588613860309124,
0.07617345452308655,
0.06570105999708176,
-0.05249371752142906,
0.04174052178859711,
0.16678617894649506,
-0.04290285333991051,
0.009450674057006836,
-0.13869863748550415,
0.10570871829986572,
0.041872356086969376,
0.001198437763378024,
-0.0638926774263382,
0.004777080845087767,
-0.055027320981025696,
0.016828058287501335,
-0.11668375879526138,
0.18085813522338867,
0.021463828161358833,
0.04457440227270126,
0.0349821038544178,
-0.014482153579592705,
0.10403567552566528,
0.15447375178337097,
0.19274906814098358,
-0.028278127312660217,
-0.041228484362363815,
-0.07328485697507858,
-0.12127275764942169,
0.06023690477013588,
0.09468094259500504,
-0.00017590948846191168,
0.022480102255940437,
0.023734625428915024,
-0.010297736153006554,
-0.08975912630558014,
0.22456225752830505,
-0.2992854416370392,
-0.12665516138076782,
0.08181682974100113,
0.026949550956487656,
0.13620586693286896,
-0.0365961492061615,
-0.029519088566303253,
0.0021499970462173223,
0.07289446890354156,
0.06186360493302345,
-0.03908165544271469,
-0.07743828743696213,
0.009458865970373154,
-0.0020387833938002586,
-0.06331767141819,
-0.0620974525809288,
-0.1010163426399231,
0.13907967507839203,
-0.08858420699834824,
-0.013136613182723522,
-0.026335353031754494,
-0.03563496842980385,
-0.04552987590432167,
-0.09169210493564606,
-0.039924293756484985,
0.018076805397868156,
-0.009696616791188717,
0.009439910762012005,
-0.048825062811374664,
0.08621098101139069,
-0.04265441745519638,
0.031057853251695633,
0.04861831292510033,
0.015859343111515045,
-0.029992327094078064,
-0.18007346987724304,
-0.1851004809141159,
-0.14768891036510468,
-0.14624671638011932,
0.05430802330374718,
0.23180007934570312,
0.020295260474085808,
-0.005888016894459724,
0.24756622314453125,
-0.05433431267738342,
-0.1730884611606598,
-0.1024250015616417,
-0.15428434312343597,
0.07396973669528961,
0.08949172496795654,
-0.11549656838178635,
0.0032317021396011114,
0.028572386130690575,
-0.06462319195270538,
0.058875612914562225,
-0.28169918060302734,
-0.08956373482942581,
0.004618722479790449,
0.153089702129364,
0.16517411172389984,
-0.2791057825088501,
-0.03212490305304527,
-0.061294589191675186,
0.051970213651657104,
-0.024314643815159798,
0.048450130969285965,
0.07637156546115875,
-0.08594342321157455,
-0.04831334203481674,
-0.015966298058629036,
-0.056118402630090714,
0.1554180085659027,
-0.10670538246631622,
0.039836108684539795,
-0.14351008832454681,
-0.07512549310922623,
0.05292942002415657,
-0.01592356711626053,
0.08389534801244736,
-0.14773690700531006,
-0.0322011299431324,
-0.06631693243980408,
-0.01956440508365631,
0.012682618573307991,
-0.037014737725257874,
0.011362208984792233,
-0.13168884813785553,
-0.06350483000278473,
0.18058325350284576,
-0.019010214135050774,
0.059048451483249664,
0.18937566876411438,
-0.057949453592300415,
0.0896618664264679,
-0.05249591916799545,
0.036150336265563965,
0.0058304741978645325,
0.05222560465335846,
-0.1149715930223465,
-0.014810547232627869,
0.12621687352657318,
-0.1637924164533615,
-0.016525598242878914,
0.08822576701641083,
-0.017404405400156975,
0.15187132358551025,
0.008222919888794422,
-0.0021573121193796396,
0.16752751171588898,
0.18062207102775574,
-0.09362774342298508,
-0.13467442989349365,
-0.06602757424116135,
0.17292483150959015,
0.10169098526239395,
-0.04617198556661606,
0.11375317722558975,
-0.07977326214313507,
-0.012579934671521187,
-0.02876206673681736,
0.043809592723846436,
0.018605930730700493,
0.015663698315620422,
0.01988472044467926,
-0.023786455392837524,
0.021209122613072395,
0.08819545060396194,
0.07904549688100815,
-0.12824490666389465,
-0.027123501524329185,
0.13790254294872284,
-0.05507434532046318,
-0.08658672869205475,
-0.004144306760281324,
0.0988885834813118,
-0.0751374214887619,
-0.007753170561045408,
-0.12449131906032562,
-0.03196834772825241,
-0.046363379806280136,
-0.05249018594622612,
-0.0013198289088904858,
-0.09180816262960434,
0.04509827122092247,
-0.0254830215126276,
-0.04462617635726929,
0.021322496235370636,
0.12576767802238464,
0.030973324552178383,
-0.12432567775249481,
-0.08296439051628113,
0.0673157274723053,
0.09267795830965042,
-0.1045844629406929,
-0.054379213601350784,
-0.08281727135181427,
-0.005580109544098377,
-0.245885968208313,
0.08330126106739044,
-0.18153443932533264,
-0.14882728457450867,
-0.08176811039447784,
0.02738441340625286,
-0.07848713546991348,
-0.0538460798561573,
-0.0383053794503212,
0.0231869388371706,
-0.0522099994122982,
0.035288743674755096,
-0.0244633536785841,
0.006009673234075308,
0.054063890129327774,
-0.0666205883026123,
0.0322604738175869,
-0.052851684391498566,
-0.07918328046798706,
-0.07632845640182495,
-0.24337075650691986,
-0.0028569335117936134,
-0.013670134358108044,
-0.038690146058797836,
-0.035808365792036057,
0.08226880431175232,
0.005649912636727095,
0.03624321520328522,
-0.020402085036039352,
-0.015927551314234734,
0.07711459696292877,
-0.05851089954376221,
0.0313202328979969,
-0.10659299790859222,
-0.023845931515097618,
-0.10087202489376068,
0.05828041583299637,
0.07993220537900925,
0.04717016592621803,
0.03004084900021553,
-0.10312443971633911,
0.09264293313026428,
-0.08070621639490128,
0.06511005759239197,
0.0349382683634758,
-0.04050793871283531,
-0.038464054465293884,
-0.07117493450641632,
-0.015444203279912472,
-0.08500108867883682,
0.07388203591108322,
0.0779256597161293,
-0.15649199485778809,
0.06079970300197601,
-0.02081718109548092,
0.020041214302182198,
0.022431323304772377,
0.12589339911937714,
0.07050638645887375,
0.06478702276945114,
-0.058926742523908615,
0.005461432505398989,
0.038649991154670715,
0.034700796008110046,
-0.026607850566506386,
0.09302403032779694,
0.045684680342674255,
0.12206794321537018,
-0.07615067064762115,
0.04454673081636429,
-0.04561702162027359,
0.026623565703630447,
-0.09231885522603989,
0.0634390115737915,
0.024916909635066986,
0.06903256475925446,
0.32425475120544434,
-0.05799271538853645,
0.01922229677438736,
-0.05338713526725769,
-0.060001496225595474,
-0.023109721019864082,
-0.19193048775196075,
-0.08001501113176346,
-0.13870468735694885,
0.12113789469003677,
-0.03708263486623764,
-0.003473267424851656,
0.13707126677036285,
0.03138202428817749,
-0.008035664446651936,
0.1562149077653885,
0.1717124581336975,
-0.09114369004964828,
0.15180106461048126,
-0.007110243197530508,
0.0013808367075398564,
0.018344389274716377,
-0.024777568876743317,
-0.023890400305390358,
-0.047137219458818436,
-0.018574412912130356,
0.06374641507863998,
-0.0038104925770312548,
0.08055140823125839,
-0.01761801913380623,
-0.09490668773651123,
-0.011118302121758461,
-0.04023725539445877,
0.07198368012905121,
0.18105080723762512,
0.04062409698963165,
-0.04935740679502487,
0.014209054410457611,
0.1847156286239624,
0.05513397976756096,
0.09575428813695908,
-0.12679439783096313,
0.0014652658719569445,
-0.0729798749089241,
-0.0029313492123037577,
-0.03756319731473923,
-0.09222253412008286,
-0.01757669262588024,
0.29350870847702026,
0.14856772124767303,
-0.2714213728904724,
-0.08332116156816483,
-0.06544943153858185,
0.009080508723855019,
-0.07929663360118866,
0.10240395367145538,
-0.022874055430293083,
0.23239544034004211,
-0.12198704481124878,
0.015790972858667374,
-0.1602223515510559,
-0.048694901168346405,
-0.019150154665112495,
0.039767779409885406,
0.09708423912525177,
0.015083042904734612,
-0.11371982842683792,
0.13137775659561157,
-0.14927051961421967,
-0.05931016430258751,
-0.04285431653261185,
-0.03442833572626114,
-0.003512901719659567,
-0.03736760839819908,
0.07428750395774841,
0.02100517600774765,
0.05804307013750076,
-0.07086675614118576,
0.10422728955745697,
-0.12076541036367416,
0.04805764555931091,
0.01975390687584877,
0.0597328245639801,
0.06170155480504036,
-0.09714531153440475,
0.23328907787799835,
0.024285640567541122,
-0.13127146661281586,
0.009078902192413807,
-0.01684815064072609,
-0.01808655634522438,
0.11974498629570007,
-0.04801792651414871,
0.07786157727241516,
-0.023499099537730217,
0.21657314896583557,
0.024787109345197678,
-0.009420578368008137,
0.08950881659984589,
0.011176727712154388,
0.08729691803455353,
0.0704897865653038,
-0.04583717882633209,
-0.059099189937114716,
0.09514274448156357,
-0.11807557195425034,
0.10229738056659698,
0.06562481820583344,
0.00399848772212863,
-0.0207863487303257,
-0.016344789415597916,
0.12390447407960892,
-0.012904515489935875,
0.04893980175256729,
0.055121954530477524,
-0.10235176980495453,
-0.07889781147241592,
0.0019879364408552647,
-0.013287913985550404,
-0.3426607549190521,
-0.05913301557302475,
-0.11939506977796555,
0.027958348393440247,
-0.11363213509321213,
0.08117332309484482,
0.22237788140773773,
-0.03881171718239784,
-0.01829879730939865,
-0.17034092545509338,
0.06579706817865372,
-0.0024308105930685997,
-0.01575635001063347,
-0.06708208471536636
] |
null | null | null |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# results_tcm_faq
This model is a fine-tuned version of [NousResearch/Llama-2-7b-chat-hf](https://huggingface.co/NousResearch/Llama-2-7b-chat-hf) on the None dataset.
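A minimal inference sketch, assuming the `ksh-nyp/results_tcm_faq` weights load directly with `transformers`; if the repo publishes only a PEFT adapter, load the base model first and attach the adapter with `peft` instead. The prompt is illustrative.

```python
# A minimal sketch, assuming merged full weights in the repo;
# swap in peft.PeftModel.from_pretrained(...) if only an adapter is published.
from transformers import AutoTokenizer, AutoModelForCausalLM

model_id = "ksh-nyp/results_tcm_faq"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

prompt = "What conditions is acupuncture commonly used to treat?"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```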
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0002
- train_batch_size: 4
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: constant
- lr_scheduler_warmup_ratio: 0.03
- num_epochs: 1
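A hedged sketch of how the settings above might map onto `transformers.TrainingArguments` (4.31 API); `output_dir` and `logging_steps` are assumptions, not values from the card.

```python
# Sketch mapping the listed hyperparameters onto TrainingArguments.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="./results_tcm_faq",  # assumed
    learning_rate=2e-4,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="constant",
    warmup_ratio=0.03,
    num_train_epochs=1,
    logging_steps=10,  # assumed
)
```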
### Training results
### Framework versions
- Transformers 4.31.0
- Pytorch 2.1.0+cu121
- Datasets 2.16.1
- Tokenizers 0.13.3
| {"tags": ["generated_from_trainer"], "base_model": "NousResearch/Llama-2-7b-chat-hf", "model-index": [{"name": "results_tcm_faq", "results": []}]} | null | ksh-nyp/results_tcm_faq | [
"generated_from_trainer",
"base_model:NousResearch/Llama-2-7b-chat-hf",
"region:us"
] | 2024-02-13T09:01:49+00:00 | [] | [] | TAGS
#generated_from_trainer #base_model-NousResearch/Llama-2-7b-chat-hf #region-us
|
# results_tcm_faq
This model is a fine-tuned version of NousResearch/Llama-2-7b-chat-hf on the None dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0002
- train_batch_size: 4
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: constant
- lr_scheduler_warmup_ratio: 0.03
- num_epochs: 1
### Training results
### Framework versions
- Transformers 4.31.0
- Pytorch 2.1.0+cu121
- Datasets 2.16.1
- Tokenizers 0.13.3
| [
"# results_tcm_faq\n\nThis model is a fine-tuned version of NousResearch/Llama-2-7b-chat-hf on the None dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 0.0002\n- train_batch_size: 4\n- eval_batch_size: 8\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: constant\n- lr_scheduler_warmup_ratio: 0.03\n- num_epochs: 1",
"### Training results",
"### Framework versions\n\n- Transformers 4.31.0\n- Pytorch 2.1.0+cu121\n- Datasets 2.16.1\n- Tokenizers 0.13.3"
] | [
"TAGS\n#generated_from_trainer #base_model-NousResearch/Llama-2-7b-chat-hf #region-us \n",
"# results_tcm_faq\n\nThis model is a fine-tuned version of NousResearch/Llama-2-7b-chat-hf on the None dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 0.0002\n- train_batch_size: 4\n- eval_batch_size: 8\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: constant\n- lr_scheduler_warmup_ratio: 0.03\n- num_epochs: 1",
"### Training results",
"### Framework versions\n\n- Transformers 4.31.0\n- Pytorch 2.1.0+cu121\n- Datasets 2.16.1\n- Tokenizers 0.13.3"
] | [
33,
38,
6,
12,
8,
3,
105,
4,
33
] | [
"passage: TAGS\n#generated_from_trainer #base_model-NousResearch/Llama-2-7b-chat-hf #region-us \n# results_tcm_faq\n\nThis model is a fine-tuned version of NousResearch/Llama-2-7b-chat-hf on the None dataset.## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 0.0002\n- train_batch_size: 4\n- eval_batch_size: 8\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: constant\n- lr_scheduler_warmup_ratio: 0.03\n- num_epochs: 1### Training results### Framework versions\n\n- Transformers 4.31.0\n- Pytorch 2.1.0+cu121\n- Datasets 2.16.1\n- Tokenizers 0.13.3"
] | [
-0.1309565305709839,
0.08537376672029495,
-0.0024686078540980816,
0.0737111046910286,
0.12361260503530502,
0.028825266286730766,
0.11988576501607895,
0.10149338096380234,
-0.07296408712863922,
0.05352530628442764,
0.07366365939378738,
-0.0016783778555691242,
0.03967062383890152,
0.11254502087831497,
-0.0047167083248496056,
-0.23485612869262695,
0.018970943987369537,
0.007047283463180065,
-0.07537269592285156,
0.10518734157085419,
0.09260687977075577,
-0.08517665416002274,
0.0770808607339859,
0.04719506949186325,
-0.19365696609020233,
-0.035302188247442245,
-0.039405886083841324,
-0.07611127942800522,
0.09414390474557877,
-0.01547860074788332,
0.09599855542182922,
0.04665816202759743,
0.07707630842924118,
-0.13162168860435486,
0.004506684374064207,
0.058233458548784256,
0.03641830384731293,
0.12462512403726578,
0.028251400217413902,
-0.0251548383384943,
0.12034900486469269,
-0.07333605736494064,
0.06722858548164368,
0.0577501505613327,
-0.10660160332918167,
-0.14787551760673523,
-0.07246220856904984,
0.08812073618173599,
0.059963028877973557,
0.09887732565402985,
0.018973812460899353,
0.16084341704845428,
-0.11321468651294708,
0.049449484795331955,
0.22230207920074463,
-0.2371731996536255,
-0.07032687216997147,
0.01921866647899151,
0.012417039833962917,
0.05511665344238281,
-0.1259579211473465,
-0.021448686718940735,
0.04603699594736099,
0.014088992029428482,
0.06971108168363571,
0.021783508360385895,
-0.06923006474971771,
-0.02300776168704033,
-0.10167276859283447,
-0.01858743466436863,
0.20659981667995453,
0.05964409559965134,
-0.07373210787773132,
-0.0005079279071651399,
-0.0713624358177185,
-0.17266294360160828,
-0.020088888704776764,
-0.012075267732143402,
0.02836514450609684,
-0.06666874140501022,
-0.09783143550157547,
-0.06805302947759628,
-0.09909629076719284,
-0.07804182916879654,
-0.009026135317981243,
0.2298266589641571,
0.0407898835837841,
0.041175540536642075,
-0.04967799782752991,
0.12339049577713013,
-0.022006846964359283,
-0.13049498200416565,
-0.025190113112330437,
-0.01690283603966236,
-0.03284101188182831,
-0.01824997365474701,
-0.08879592269659042,
0.010768263600766659,
0.01788729801774025,
0.1420258730649948,
-0.11652485281229019,
0.02829820290207863,
0.04591517895460129,
0.02500179037451744,
-0.038263507187366486,
0.10914844274520874,
-0.03470761328935623,
0.011587178334593773,
0.023817582055926323,
0.07366327941417694,
0.009912698529660702,
-0.030736476182937622,
-0.07719840854406357,
-0.031496811658144,
0.08714530616998672,
0.03745861351490021,
-0.02152498997747898,
0.01450769230723381,
-0.016676796600222588,
-0.051525283604860306,
0.04414509981870651,
-0.0799877941608429,
-0.00569445826113224,
0.017791442573070526,
-0.08713917434215546,
0.03726782649755478,
0.03599696233868599,
-0.0057183546014130116,
-0.036446668207645416,
0.03127332404255867,
-0.13102993369102478,
-0.00878568273037672,
-0.06448373198509216,
-0.035024408251047134,
-0.0003036066482309252,
-0.06023731082677841,
-0.0004715739923994988,
-0.09902092069387436,
-0.16680407524108887,
0.0006030238000676036,
0.036439038813114166,
-0.06290006637573242,
-0.07360749691724777,
-0.005737547297030687,
-0.10660196095705032,
0.017002351582050323,
0.004114414565265179,
0.08809486776590347,
-0.03943025693297386,
0.09519389271736145,
0.04818565025925636,
0.01856287755072117,
-0.021630479022860527,
0.02541767619550228,
-0.10359111428260803,
0.028559109196066856,
-0.11292348057031631,
0.05785743147134781,
-0.08097638934850693,
0.012057427316904068,
-0.089532770216465,
-0.09687032550573349,
-0.012532155960798264,
-0.04830603301525116,
0.07886743545532227,
0.16889609396457672,
-0.10870205610990524,
-0.07052834331989288,
0.11297352612018585,
-0.0964791551232338,
… (remaining 598 values of this row's 768-dimensional embedding vector elided) …
] |
null | null | transformers |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# Model Description
This model is a fine-tuned version of [distilroberta-base](https://huggingface.co/distilroberta-base) on the [CoNLL-2003 dataset](https://huggingface.co/datasets/conll2003).
It achieves the following results on the evaluation set for the Named Entity Recognition (NER)/token-classification task:
- Loss: 0.0585
- F1: 0.9536
# Model Performance
- 1st Place: This fine-tuned model surpasses the previous best score (F1: 94.6%) on the [Named Entity Recognition (NER) on CoNLL 2003 (English)](https://paperswithcode.com/sota/named-entity-recognition-ner-on-conll-2003) benchmark.
- 6th Place: This fine-tuned model ranks 6th on the [Token Classification on conll2003 leaderboard](https://paperswithcode.com/sota/token-classification-on-conll2003).
## Model Usage
```python
from transformers import AutoTokenizer, AutoModelForTokenClassification, pipeline

tokenizer = AutoTokenizer.from_pretrained("jinhybr/distilroberta-ConLL2003")
model = AutoModelForTokenClassification.from_pretrained("jinhybr/distilroberta-ConLL2003")

# grouped_entities=True merges word-piece tokens back into whole entity spans
nlp = pipeline("ner", model=model, tokenizer=tokenizer, grouped_entities=True)

example = "My name is Tao Jin and live in Canada"
ner_results = nlp(example)
print(ner_results)
# [{'entity_group': 'PER', 'score': 0.99686015, 'word': ' Tao Jin', 'start': 11, 'end': 18}, {'entity_group': 'LOC', 'score': 0.9996836, 'word': ' Canada', 'start': 31, 'end': 37}]
```
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training; a hedged `TrainingArguments` sketch follows the list:
- learning_rate: 5e-05
- train_batch_size: 32
- eval_batch_size: 16
- seed: 24
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 6.0
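As an illustration only, the hyperparameters above map onto the 🤗 `TrainingArguments` API roughly as follows. This is a hedged sketch rather than the actual training script: the `output_dir` value is a hypothetical placeholder, and the remaining arguments simply restate the documented values.
```python
from transformers import TrainingArguments

# Restates the documented hyperparameters; output_dir is a hypothetical placeholder.
training_args = TrainingArguments(
    output_dir="distilroberta-conll2003",
    learning_rate=5e-05,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=16,
    seed=24,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-08,
    lr_scheduler_type="linear",
    num_train_epochs=6.0,
)
```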
### Training results
| Training Loss | Epoch | Step | Validation Loss | F1 |
|:-------------:|:-----:|:----:|:---------------:|:------:|
| 0.1666 | 1.0 | 439 | 0.0621 | 0.9345 |
| 0.0499 | 2.0 | 878 | 0.0564 | 0.9391 |
| 0.0273 | 3.0 | 1317 | 0.0553 | 0.9469 |
| 0.0167 | 4.0 | 1756 | 0.0553 | 0.9492 |
| 0.0103 | 5.0 | 2195 | 0.0572 | 0.9516 |
| 0.0068 | 6.0 | 2634 | 0.0585 | 0.9536 |
### Framework versions
- Transformers 4.35.2
- Pytorch 2.1.0+cu121
- Datasets 2.17.0
- Tokenizers 0.15.1
| {"license": "apache-2.0", "tags": ["generated_from_trainer"], "metrics": ["f1"], "base_model": "distilroberta-base", "model-index": [{"name": "distilroberta-ConLL2003", "results": []}]} | token-classification | jinhybr/distilroberta-ConLL2003 | [
"transformers",
"safetensors",
"xlm-roberta",
"token-classification",
"generated_from_trainer",
"base_model:distilroberta-base",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"region:us"
] | 2024-02-13T09:03:42+00:00 | [] | [] | TAGS
#transformers #safetensors #xlm-roberta #token-classification #generated_from_trainer #base_model-distilroberta-base #license-apache-2.0 #autotrain_compatible #endpoints_compatible #has_space #region-us
| Model Description
=================
This model is a fine-tuned version of distilroberta-base on the CoNLL-2003 dataset.
It achieves the following results on the evaluation set for the Named Entity Recognition (NER)/token-classification task:
* Loss: 0.0585
* F1: 0.9536
Model Performance
=================
* 1st Place: This fine-tuned model surpasses the previous best score (F1: 94.6%) on Named Entity Recognition (NER) on CoNLL 2003 (English).
* 6th Place: This fine-tuned model ranks 6th on the Token Classification on conll2003 leaderboard.
Model Usage
-----------
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 5e-05
* train\_batch\_size: 32
* eval\_batch\_size: 16
* seed: 24
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* num\_epochs: 6.0
### Training results
### Framework versions
* Transformers 4.35.2
* Pytorch 2.1.0+cu121
* Datasets 2.17.0
* Tokenizers 0.15.1
| [
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 5e-05\n* train\\_batch\\_size: 32\n* eval\\_batch\\_size: 16\n* seed: 24\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 6.0",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.35.2\n* Pytorch 2.1.0+cu121\n* Datasets 2.17.0\n* Tokenizers 0.15.1"
] | [
"TAGS\n#transformers #safetensors #xlm-roberta #token-classification #generated_from_trainer #base_model-distilroberta-base #license-apache-2.0 #autotrain_compatible #endpoints_compatible #has_space #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 5e-05\n* train\\_batch\\_size: 32\n* eval\\_batch\\_size: 16\n* seed: 24\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 6.0",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.35.2\n* Pytorch 2.1.0+cu121\n* Datasets 2.17.0\n* Tokenizers 0.15.1"
] | [
72,
99,
4,
33
] | [
"passage: TAGS\n#transformers #safetensors #xlm-roberta #token-classification #generated_from_trainer #base_model-distilroberta-base #license-apache-2.0 #autotrain_compatible #endpoints_compatible #has_space #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 5e-05\n* train\\_batch\\_size: 32\n* eval\\_batch\\_size: 16\n* seed: 24\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 6.0### Training results### Framework versions\n\n\n* Transformers 4.35.2\n* Pytorch 2.1.0+cu121\n* Datasets 2.17.0\n* Tokenizers 0.15.1"
] | [
… (768 values of this row's embedding vector elided) …
] |
null | null | null |
gguf version of mbx-7b-v3 | {"license": "unknown"} | null | Slayery/mbx-7b-v3-gguf | [
"gguf",
"license:unknown",
"region:us"
] | 2024-02-13T09:07:08+00:00 | [] | [] | TAGS
#gguf #license-unknown #region-us
|
gguf version of mbx-7b-v3 | [] | [
"TAGS\n#gguf #license-unknown #region-us \n"
] | [
16
] | [
"passage: TAGS\n#gguf #license-unknown #region-us \n"
] | [
… (768 values of this row's embedding vector elided) …
] |
null | null | transformers |
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
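Since the auto-generated card ships without a snippet, the following is a minimal, hedged loading sketch inferred only from the repository id and its `text-generation`/`llama` tags; it is an assumption, not an author-provided example:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "aidonuts/pernicious-001-ep1"  # taken from this repository's id

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(repo_id)

# Short greedy generation as a smoke test; the prompt and length are arbitrary.
inputs = tokenizer("Hello, my name is", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```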
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed]
| {"library_name": "transformers", "tags": []} | text-generation | aidonuts/pernicious-001-ep1 | [
"transformers",
"safetensors",
"llama",
"text-generation",
"conversational",
"arxiv:1910.09700",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | 2024-02-13T09:08:33+00:00 | [
"1910.09700"
] | [] | TAGS
#transformers #safetensors #llama #text-generation #conversational #arxiv-1910.09700 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
# Model Card for Model ID
## Model Details
### Model Description
This is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.
- Developed by:
- Funded by [optional]:
- Shared by [optional]:
- Model type:
- Language(s) (NLP):
- License:
- Finetuned from model [optional]:
### Model Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Downstream Use [optional]
### Out-of-Scope Use
## Bias, Risks, and Limitations
### Recommendations
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
## Training Details
### Training Data
### Training Procedure
#### Preprocessing [optional]
#### Training Hyperparameters
- Training regime:
#### Speeds, Sizes, Times [optional]
## Evaluation
### Testing Data, Factors & Metrics
#### Testing Data
#### Factors
#### Metrics
### Results
#### Summary
## Model Examination [optional]
## Environmental Impact
Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).
- Hardware Type:
- Hours used:
- Cloud Provider:
- Compute Region:
- Carbon Emitted:
## Technical Specifications [optional]
### Model Architecture and Objective
### Compute Infrastructure
#### Hardware
#### Software
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Model Card Authors [optional]
## Model Card Contact
| [
"# Model Card for Model ID",
"## Model Details",
"### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:",
"### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Downstream Use [optional]",
"### Out-of-Scope Use",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.",
"## How to Get Started with the Model\n\nUse the code below to get started with the model.",
"## Training Details",
"### Training Data",
"### Training Procedure",
"#### Preprocessing [optional]",
"#### Training Hyperparameters\n\n- Training regime:",
"#### Speeds, Sizes, Times [optional]",
"## Evaluation",
"### Testing Data, Factors & Metrics",
"#### Testing Data",
"#### Factors",
"#### Metrics",
"### Results",
"#### Summary",
"## Model Examination [optional]",
"## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:",
"## Technical Specifications [optional]",
"### Model Architecture and Objective",
"### Compute Infrastructure",
"#### Hardware",
"#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Model Card Authors [optional]",
"## Model Card Contact"
] | [
"TAGS\n#transformers #safetensors #llama #text-generation #conversational #arxiv-1910.09700 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n",
"# Model Card for Model ID",
"## Model Details",
"### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:",
"### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Downstream Use [optional]",
"### Out-of-Scope Use",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.",
"## How to Get Started with the Model\n\nUse the code below to get started with the model.",
"## Training Details",
"### Training Data",
"### Training Procedure",
"#### Preprocessing [optional]",
"#### Training Hyperparameters\n\n- Training regime:",
"#### Speeds, Sizes, Times [optional]",
"## Evaluation",
"### Testing Data, Factors & Metrics",
"#### Testing Data",
"#### Factors",
"#### Metrics",
"### Results",
"#### Summary",
"## Model Examination [optional]",
"## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:",
"## Technical Specifications [optional]",
"### Model Architecture and Objective",
"### Compute Infrastructure",
"#### Hardware",
"#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Model Card Authors [optional]",
"## Model Card Contact"
] | [
60,
6,
3,
82,
28,
3,
4,
9,
9,
10,
42,
20,
3,
4,
5,
9,
11,
13,
3,
12,
5,
4,
5,
3,
4,
9,
53,
9,
8,
6,
3,
14,
8,
7,
9,
4
] | [
"passage: TAGS\n#transformers #safetensors #llama #text-generation #conversational #arxiv-1910.09700 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n# Model Card for Model ID## Model Details### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Downstream Use [optional]### Out-of-Scope Use## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.## How to Get Started with the Model\n\nUse the code below to get started with the model.## Training Details### Training Data### Training Procedure#### Preprocessing [optional]#### Training Hyperparameters\n\n- Training regime:#### Speeds, Sizes, Times [optional]## Evaluation### Testing Data, Factors & Metrics#### Testing Data#### Factors#### Metrics### Results#### Summary## Model Examination [optional]## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:## Technical Specifications [optional]### Model Architecture and Objective### Compute Infrastructure#### Hardware#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Model Card Authors [optional]## Model Card Contact"
] | [
-0.04654794931411743,
0.16618601977825165,
-0.005445904564112425,
0.01853804849088192,
0.0981811136007309,
0.011998992413282394,
0.06433123350143433,
0.11398410052061081,
-0.0230073444545269,
0.11406639218330383,
0.03047988750040531,
0.10172267258167267,
0.11317981779575348,
0.14841650426387787,
-0.002152352826669812,
-0.22403094172477722,
0.050844956189394,
-0.12105348706245422,
-0.033293843269348145,
0.11749980598688126,
0.1483822613954544,
-0.09928343445062637,
0.07274559140205383,
-0.029687678441405296,
-0.012143402360379696,
-0.030057786032557487,
-0.05890674889087677,
-0.046214159578084946,
0.04651786759495735,
0.06640566885471344,
0.06770290434360504,
0.0071083661168813705,
0.09012923389673233,
-0.2696533799171448,
0.018959321081638336,
0.07145345956087112,
-0.002759667346253991,
0.06957992166280746,
0.06404146552085876,
-0.07107418030500412,
0.10337356477975845,
-0.05106033384799957,
0.14650006592273712,
0.08365883678197861,
-0.09081148356199265,
-0.1895141303539276,
-0.08866965025663376,
0.09882009029388428,
0.17572562396526337,
0.04925641790032387,
-0.02320658043026924,
0.09761467576026917,
-0.08769196271896362,
0.015438909642398357,
0.04981724172830582,
-0.07620415836572647,
-0.05378096550703049,
0.05986575037240982,
0.07907199114561081,
0.06627275794744492,
-0.12434766441583633,
-0.02885502204298973,
0.005009706597775221,
0.010980482213199139,
0.0769270583987236,
0.01728810742497444,
0.146672785282135,
0.0338633768260479,
-0.12615777552127838,
-0.04880760237574577,
0.09869225323200226,
0.03395522013306618,
-0.04422314465045929,
-0.24749068915843964,
-0.03152675926685333,
-0.030810698866844177,
-0.029386121779680252,
-0.03716538846492767,
0.04340358078479767,
-0.007673026993870735,
0.08638741075992584,
-0.0060646249912679195,
-0.07403432577848434,
-0.03937075287103653,
0.06169692054390907,
0.0672287791967392,
0.02999979443848133,
-0.013745363801717758,
0.010938193649053574,
0.11620724946260452,
0.1095694974064827,
-0.12054188549518585,
-0.05555335059762001,
-0.06393084675073624,
-0.08656639605760574,
-0.040790557861328125,
0.034162238240242004,
0.03456587344408035,
0.05349370837211609,
0.25305667519569397,
0.015654386952519417,
0.059652652591466904,
0.034477248787879944,
0.007892133668065071,
0.05848940089344978,
0.11044429242610931,
-0.06018859148025513,
-0.10444226115942001,
-0.02648012898862362,
0.08843598514795303,
0.008199662901461124,
-0.03287925571203232,
-0.05088530853390694,
0.06019928678870201,
0.01946467161178589,
0.11926145106554031,
0.09061790257692337,
0.010536285117268562,
-0.07121123373508453,
-0.061038948595523834,
0.1891259253025055,
-0.16544590890407562,
0.04322727024555206,
0.035097137093544006,
-0.03903156518936157,
0.00019933005387429148,
0.013914269395172596,
0.016625655815005302,
-0.025983380153775215,
0.09017423540353775,
-0.054113563150167465,
-0.04145489260554314,
-0.11186197400093079,
-0.03383193537592888,
0.033762916922569275,
0.008953776210546494,
-0.035059962421655655,
-0.033713940531015396,
-0.08351044356822968,
-0.07577689737081528,
0.09320491552352905,
-0.07346344739198685,
-0.04878907650709152,
-0.01804324984550476,
-0.07530532777309418,
0.022395428270101547,
0.019394835457205772,
0.07707412540912628,
-0.02362251654267311,
0.04399976506829262,
-0.05189276114106178,
0.05863580107688904,
0.11207318305969238,
0.03570080175995827,
-0.05736649036407471,
0.06062258034944534,
-0.23834340274333954,
0.09552820026874542,
-0.07409077137708664,
0.05591456592082977,
-0.153293639421463,
-0.024439791217446327,
0.04788333550095558,
0.008784620091319084,
-0.009650949388742447,
0.13416339457035065,
-0.21702027320861816,
-0.02536402828991413,
0.1717337965965271,
-0.10057014971971512,
-0.07069246470928192,
0.05619903281331062,
-0.04835370555520058,
0.10988964140415192,
0.03825836628675461,
-0.025690359994769096,
0.06171267107129097,
-0.1267417073249817,
0.003717758459970355,
-0.05005312338471413,
-0.017048977315425873,
0.1548657864332199,
0.07182947546243668,
-0.07217690348625183,
0.07399354875087738,
0.025708531960844994,
-0.0246540866792202,
-0.04625825211405754,
-0.015164627693593502,
-0.10536660254001617,
0.014689887873828411,
-0.06369215250015259,
0.014470234513282776,
-0.020807426422834396,
-0.09071163833141327,
-0.027962757274508476,
-0.17504668235778809,
-0.03014434315264225,
0.08651752024888992,
-0.008693269453942776,
-0.01803150773048401,
-0.1178668737411499,
0.009341353550553322,
0.04177580401301384,
0.0061247628182172775,
-0.13462838530540466,
-0.04812471568584442,
0.02780051715672016,
-0.1600649207830429,
0.034652888774871826,
-0.05392369255423546,
0.04932025074958801,
0.025790516287088394,
-0.028889117762446404,
-0.026493212208151817,
0.021633783355355263,
0.005992184858769178,
-0.011999987065792084,
-0.24343903362751007,
-0.028118690475821495,
-0.024888472631573677,
0.1682123839855194,
-0.20917098224163055,
0.03546025976538658,
0.07867541164159775,
0.15366052091121674,
0.011240328662097454,
-0.04177491366863251,
0.005974748637527227,
-0.06935794651508331,
-0.02736494317650795,
-0.05875484645366669,
-0.0047869328409433365,
-0.03310677409172058,
-0.04545191675424576,
0.04568447172641754,
-0.16510973870754242,
-0.032636504620313644,
0.09776268899440765,
0.06289951503276825,
-0.13922683894634247,
-0.020621931180357933,
-0.03630133345723152,
-0.049253206700086594,
-0.04911839962005615,
-0.0605199858546257,
0.10893940925598145,
0.05891856551170349,
0.04574795812368393,
-0.05928509309887886,
-0.07568105310201645,
-0.001827909960411489,
-0.013898161239922047,
-0.017864689230918884,
0.09759635478258133,
0.0751434788107872,
-0.13251115381717682,
0.09224759042263031,
0.09603385627269745,
0.07919023185968399,
0.09113933145999908,
-0.02355697751045227,
-0.08261934667825699,
-0.045987509191036224,
0.031442027539014816,
0.020124373957514763,
0.13039541244506836,
-0.024294709786772728,
0.04352088272571564,
0.042134687304496765,
-0.019369594752788544,
0.014752166345715523,
-0.08687400817871094,
0.033972494304180145,
0.028472330421209335,
-0.016721390187740326,
0.050190530717372894,
-0.03876714035868645,
0.02440318465232849,
0.08830609917640686,
0.045322712510824203,
0.03507532551884651,
0.015493292361497879,
-0.05206458270549774,
-0.1083620935678482,
0.16405931115150452,
-0.12714070081710815,
-0.22483378648757935,
-0.13936103880405426,
0.0037376401014626026,
0.035628627985715866,
-0.015835661441087723,
0.002417160663753748,
-0.059374887496232986,
-0.12220635265111923,
-0.08858037739992142,
0.015140829607844353,
0.04942670464515686,
-0.09028962254524231,
-0.06437795609235764,
0.058117836713790894,
0.03889724239706993,
-0.14560972154140472,
0.017612040042877197,
0.04854894429445267,
-0.09789852797985077,
-0.006774199660867453,
0.08094939589500427,
0.0698540136218071,
0.1770169734954834,
0.017703235149383545,
-0.021850809454917908,
0.032354529947042465,
0.20614571869373322,
-0.13538233935832977,
0.11083246022462845,
0.13607586920261383,
-0.09041404724121094,
0.08072979003190994,
0.19951270520687103,
0.03932560607790947,
-0.10153959691524506,
0.031980328261852264,
0.02283124253153801,
-0.0284719280898571,
-0.24526868760585785,
-0.07212468236684799,
-0.004402178805321455,
-0.058010730892419815,
0.07660572230815887,
0.09286724030971527,
0.08215958625078201,
0.012304253876209259,
-0.09310996532440186,
-0.08154371380805969,
0.05942574888467789,
0.10367169976234436,
0.024584239348769188,
-0.010839897207915783,
0.08998730033636093,
-0.034100502729415894,
0.019626356661319733,
0.0853661298751831,
0.005239574704319239,
0.17840281128883362,
0.05159219726920128,
0.18830420076847076,
0.07925192266702652,
0.07219027727842331,
0.009912233799695969,
0.013080619275569916,
0.018877580761909485,
0.03300119563937187,
-0.002769160782918334,
-0.08440786600112915,
-0.02248465269804001,
0.11566436290740967,
0.06668911874294281,
0.010815348476171494,
0.015172341838479042,
-0.04104290530085564,
0.07965951412916183,
0.1831512451171875,
-0.007656289264559746,
-0.1783534437417984,
-0.057547420263290405,
0.07553383708000183,
-0.09879875183105469,
-0.09854305535554886,
-0.013454320840537548,
0.03072015568614006,
-0.17046253383159637,
0.023390959948301315,
-0.02239842526614666,
0.1106182336807251,
-0.14194999635219574,
-0.020490378141403198,
0.07218493521213531,
0.07199500501155853,
0.004729843698441982,
0.05758659541606903,
-0.16417601704597473,
0.10671813786029816,
0.008950476534664631,
0.06779605895280838,
-0.09610627591609955,
0.1008887067437172,
-0.004196076653897762,
-0.02063460275530815,
0.1393408179283142,
0.002700034761801362,
-0.06884108483791351,
-0.0763031542301178,
-0.08754398673772812,
-0.009632662869989872,
0.12754282355308533,
-0.1419651061296463,
0.08767123520374298,
-0.037212442606687546,
-0.0424150750041008,
-0.0017086371080949903,
-0.10206665843725204,
-0.11638247221708298,
-0.18888559937477112,
0.06001543253660202,
-0.13492922484874725,
0.03152317553758621,
-0.10799519717693329,
-0.032371897250413895,
-0.030304040759801865,
0.19337286055088043,
-0.23447458446025848,
-0.07199826091527939,
-0.1475764364004135,
-0.10233612358570099,
0.1443224400281906,
-0.0501345656812191,
0.08485390990972519,
-0.007241467013955116,
0.16846685111522675,
0.019060896709561348,
-0.02531743235886097,
0.0971490666270256,
-0.09173708409070969,
-0.19302815198898315,
-0.07869284600019455,
0.15662524104118347,
0.13260218501091003,
0.031680017709732056,
-0.002461588243022561,
0.036563750356435776,
-0.015421539545059204,
-0.11935004591941833,
0.015969349071383476,
0.1787186712026596,
0.06237189099192619,
0.02331034652888775,
-0.027346095070242882,
-0.11273157596588135,
-0.06900003552436829,
-0.028530338779091835,
0.03054865077137947,
0.17762407660484314,
-0.07057618349790573,
0.18207968771457672,
0.14163152873516083,
-0.05922834202647209,
-0.20400173962116241,
0.010538800619542599,
0.03055560030043125,
0.0009220078936778009,
0.02591954916715622,
-0.20123432576656342,
0.08688826113939285,
0.004683020059019327,
-0.05110127478837967,
0.13194532692432404,
-0.17217805981636047,
-0.14451217651367188,
0.0765485092997551,
0.038384392857551575,
-0.19559739530086517,
-0.12913893163204193,
-0.09174312651157379,
-0.045869920402765274,
-0.18591414391994476,
0.09569250047206879,
0.0305706188082695,
0.010893458500504494,
0.03030681423842907,
0.029179483652114868,
0.019487828016281128,
-0.0418255440890789,
0.18391458690166473,
-0.024792250245809555,
0.026594700291752815,
-0.08539514988660812,
-0.06927408277988434,
0.03743394836783409,
-0.052842434495687485,
0.07349982857704163,
-0.023486759513616562,
0.007861839607357979,
-0.10348054021596909,
-0.042148489505052567,
-0.03735732287168503,
0.015448716469109058,
-0.09657872468233109,
-0.08514349907636642,
-0.045032672584056854,
0.09675803780555725,
0.09690850973129272,
-0.033646680414676666,
-0.028050623834133148,
-0.07533035427331924,
0.04412057250738144,
0.19926515221595764,
0.1785389482975006,
0.042153384536504745,
-0.08034496754407883,
-0.004150947090238333,
-0.010121207684278488,
0.04310847446322441,
-0.20463712513446808,
0.06283636391162872,
0.05450061708688736,
0.01973269321024418,
0.11436162889003754,
-0.019565396010875702,
-0.15359151363372803,
-0.07263088971376419,
0.06303015351295471,
-0.060181066393852234,
-0.19620554149150848,
0.00867035984992981,
0.060603946447372437,
-0.16371412575244904,
-0.04535605385899544,
0.04643881320953369,
-0.005620351992547512,
-0.038163937628269196,
0.021896906197071075,
0.09194854646921158,
0.0026654244866222143,
0.07427921891212463,
0.05387866869568825,
0.0827430784702301,
-0.10537070035934448,
0.08090532571077347,
0.08839722722768784,
-0.08452684432268143,
0.023530138656497,
0.10478579998016357,
-0.059433579444885254,
-0.03440561518073082,
0.020135708153247833,
0.08153781294822693,
0.01775863952934742,
-0.040019966661930084,
0.013229827396571636,
-0.10452935844659805,
0.05954122915863991,
0.08839859813451767,
0.032507482916116714,
0.016702456399798393,
0.03425082191824913,
0.04607953503727913,
-0.07238735258579254,
0.12142276018857956,
0.031868141144514084,
0.017129309475421906,
-0.036505792289972305,
-0.040896978229284286,
0.019542274996638298,
-0.03214648738503456,
-0.005015232600271702,
-0.03023446537554264,
-0.07695909589529037,
-0.014793801121413708,
-0.1626158058643341,
-0.011131818406283855,
-0.05648450180888176,
0.010329355485737324,
0.03204665705561638,
-0.032609567046165466,
0.008124498650431633,
0.009250079281628132,
-0.07695289701223373,
-0.0663459524512291,
-0.020460480824112892,
0.09540658444166183,
-0.16213038563728333,
0.022481130436062813,
0.08244425803422928,
-0.12187694013118744,
0.09281346201896667,
0.016204802319407463,
-0.006236857734620571,
0.025038830935955048,
-0.1475188434123993,
0.034843120723962784,
-0.03386561945080757,
0.010836300440132618,
0.04373383894562721,
-0.21569781005382538,
-0.00004886732858722098,
-0.033673107624053955,
-0.06639216095209122,
-0.009451326914131641,
-0.03672455996274948,
-0.11508306115865707,
0.1058407872915268,
0.007236586883664131,
-0.08753558248281479,
-0.03186136856675148,
0.029325377196073532,
0.0838974118232727,
-0.021959776058793068,
0.15145497024059296,
-0.008370938710868359,
0.07429654151201248,
-0.16209737956523895,
-0.018623165786266327,
-0.006028574425727129,
0.022658247500658035,
-0.01664556935429573,
-0.01111356820911169,
0.044031109660863876,
-0.022746501490473747,
0.17925859987735748,
-0.030318550765514374,
0.02272745408117771,
0.06815794110298157,
0.019072026014328003,
-0.030184008181095123,
0.10406795144081116,
0.04094860330224037,
0.02014910988509655,
0.018591465428471565,
0.003289656015112996,
-0.04647882282733917,
-0.03173251822590828,
-0.19407226145267487,
0.07288651913404465,
0.15608493983745575,
0.09729263186454773,
-0.016707008704543114,
0.07954329252243042,
-0.10199416428804398,
-0.1109243705868721,
0.12477338314056396,
-0.04797708988189697,
-0.002418199321255088,
-0.07150927931070328,
0.13247236609458923,
0.1437523066997528,
-0.1859612911939621,
0.07269313186407089,
-0.0699717253446579,
-0.04708027467131615,
-0.10980689525604248,
-0.19441905617713928,
-0.05561789125204086,
-0.049456022679805756,
-0.016053348779678345,
-0.04698808491230011,
0.07504211366176605,
0.054538097232580185,
0.006766852922737598,
-0.0023397188633680344,
0.06506035476922989,
-0.031050674617290497,
-0.0037882844917476177,
0.032597362995147705,
0.06591679900884628,
0.012734474614262581,
-0.030802709981799126,
0.016619903966784477,
-0.013545602560043335,
0.045626189559698105,
0.06578011065721512,
0.04976864159107208,
-0.02938537672162056,
0.014603170566260815,
-0.038539156317710876,
-0.10249634087085724,
0.043612558394670486,
-0.024421939626336098,
-0.0789753645658493,
0.15477414429187775,
0.023680059239268303,
0.007779473438858986,
-0.020137663930654526,
0.23901568353176117,
-0.0738423764705658,
-0.0964353010058403,
-0.14737580716609955,
0.10557299107313156,
-0.038081806153059006,
0.05800395458936691,
0.04625935107469559,
-0.10226529091596603,
0.018044332042336464,
0.1338089406490326,
0.16182038187980652,
-0.039008259773254395,
0.020095856860280037,
0.031135575845837593,
0.00566398398950696,
-0.03622615709900856,
0.04847532883286476,
0.06906453520059586,
0.16569648683071136,
-0.04632584750652313,
0.09100406616926193,
0.0019041687482967973,
-0.09579581767320633,
-0.038361791521310806,
0.11069868505001068,
-0.016052277758717537,
0.019335128366947174,
-0.05818064883351326,
0.11742528527975082,
-0.06386786699295044,
-0.23783175647258759,
0.06453443318605423,
-0.0684293657541275,
-0.13765870034694672,
-0.02378307841718197,
0.08207765966653824,
-0.012955902144312859,
0.027587108314037323,
0.0730307325720787,
-0.07240920513868332,
0.201939657330513,
0.03798431158065796,
-0.05499868467450142,
-0.055047210305929184,
0.0805421993136406,
-0.10008571296930313,
0.2739645540714264,
0.01557221356779337,
0.04601577669382095,
0.10384146869182587,
-0.009341772645711899,
-0.13838784396648407,
0.019836371764540672,
0.09581108391284943,
-0.10502193123102188,
0.04196618124842644,
0.19815568625926971,
-0.0014755994779989123,
0.12389086186885834,
0.07657600939273834,
-0.07551808655261993,
0.0478031262755394,
-0.08054235577583313,
-0.06760486960411072,
-0.09260394424200058,
0.09703279286623001,
-0.07772123068571091,
0.14251399040222168,
0.13876807689666748,
-0.05074559152126312,
0.012724342755973339,
-0.031311117112636566,
0.044293127954006195,
-0.00010600237874314189,
0.10321761667728424,
0.004272161517292261,
-0.1832672357559204,
0.024692710489034653,
0.005650998093187809,
0.10749758034944534,
-0.16033467650413513,
-0.09566054493188858,
0.042343202978372574,
0.003505636239424348,
-0.0672195628285408,
0.1290110945701599,
0.05665452033281326,
0.04342988133430481,
-0.03997718170285225,
-0.03521440550684929,
-0.0060732318088412285,
0.13561366498470306,
-0.10713256150484085,
0.0009933578548952937
] |
null | null | transformers |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# allenaitk-instruct-base-def-pos-raj1
This model is a fine-tuned version of [allenai/tk-instruct-base-def-pos](https://huggingface.co/allenai/tk-instruct-base-def-pos) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 0.1353
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: cosine
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 4
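For reference, the list above maps directly onto the transformers Trainer API. The sketch below is an illustration, not the original training script: the output directory name is an assumption, and the model/dataset wiring is omitted.

```python
# Sketch: the hyperparameters above expressed as Seq2SeqTrainingArguments.
from transformers import Seq2SeqTrainingArguments

args = Seq2SeqTrainingArguments(
    output_dir="allenaitk-instruct-base-def-pos-raj1",  # assumed name
    learning_rate=5e-05,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="cosine",
    warmup_ratio=0.1,
    num_train_epochs=4,
    # Adam betas=(0.9, 0.999) and epsilon=1e-08 match the Trainer defaults
    # (adam_beta1 / adam_beta2 / adam_epsilon), so no override is needed.
)
```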
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:-----:|:---------------:|
| 0.1772 | 1.0 | 5737 | 0.1493 |
| 0.144 | 2.0 | 11474 | 0.1344 |
| 0.1205 | 3.0 | 17211 | 0.1349 |
| 0.106 | 4.0 | 22948 | 0.1353 |
### Framework versions
- Transformers 4.33.0
- Pytorch 2.0.0
- Datasets 2.1.0
- Tokenizers 0.13.3
| {"license": "apache-2.0", "tags": ["generated_from_trainer"], "base_model": "allenai/tk-instruct-base-def-pos", "model-index": [{"name": "allenaitk-instruct-base-def-pos-raj1", "results": []}]} | text2text-generation | Raj12334/allenaitk-instruct-base-def-pos-raj1 | [
"transformers",
"pytorch",
"t5",
"text2text-generation",
"generated_from_trainer",
"base_model:allenai/tk-instruct-base-def-pos",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | 2024-02-13T09:08:50+00:00 | [] | [] | TAGS
#transformers #pytorch #t5 #text2text-generation #generated_from_trainer #base_model-allenai/tk-instruct-base-def-pos #license-apache-2.0 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
| allenaitk-instruct-base-def-pos-raj1
====================================
This model is a fine-tuned version of allenai/tk-instruct-base-def-pos on an unspecified dataset.
It achieves the following results on the evaluation set:
* Loss: 0.1353
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 5e-05
* train\_batch\_size: 8
* eval\_batch\_size: 8
* seed: 42
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: cosine
* lr\_scheduler\_warmup\_ratio: 0.1
* num\_epochs: 4
### Training results
### Framework versions
* Transformers 4.33.0
* Pytorch 2.0.0
* Datasets 2.1.0
* Tokenizers 0.13.3
| [
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 5e-05\n* train\\_batch\\_size: 8\n* eval\\_batch\\_size: 8\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: cosine\n* lr\\_scheduler\\_warmup\\_ratio: 0.1\n* num\\_epochs: 4",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.33.0\n* Pytorch 2.0.0\n* Datasets 2.1.0\n* Tokenizers 0.13.3"
] | [
"TAGS\n#transformers #pytorch #t5 #text2text-generation #generated_from_trainer #base_model-allenai/tk-instruct-base-def-pos #license-apache-2.0 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 5e-05\n* train\\_batch\\_size: 8\n* eval\\_batch\\_size: 8\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: cosine\n* lr\\_scheduler\\_warmup\\_ratio: 0.1\n* num\\_epochs: 4",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.33.0\n* Pytorch 2.0.0\n* Datasets 2.1.0\n* Tokenizers 0.13.3"
] | [
82,
117,
4,
30
] | [
"passage: TAGS\n#transformers #pytorch #t5 #text2text-generation #generated_from_trainer #base_model-allenai/tk-instruct-base-def-pos #license-apache-2.0 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 5e-05\n* train\\_batch\\_size: 8\n* eval\\_batch\\_size: 8\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: cosine\n* lr\\_scheduler\\_warmup\\_ratio: 0.1\n* num\\_epochs: 4### Training results### Framework versions\n\n\n* Transformers 4.33.0\n* Pytorch 2.0.0\n* Datasets 2.1.0\n* Tokenizers 0.13.3"
] | [
-0.12202219665050507,
0.13696101307868958,
-0.0025786382611840963,
0.1117657795548439,
0.0994374081492424,
0.02996065653860569,
0.14857177436351776,
0.17506231367588043,
-0.09123524278402328,
0.04948614910244942,
0.14300785958766937,
0.13233913481235504,
0.05300695821642876,
0.18927142024040222,
-0.05531538277864456,
-0.23873268067836761,
0.036723826080560684,
0.026203293353319168,
-0.03487721458077431,
0.13991011679172516,
0.10003683716058731,
-0.10090163350105286,
0.1033477634191513,
0.010778645053505898,
-0.16552244126796722,
-0.023654071614146233,
-0.017111826688051224,
-0.07138460874557495,
0.10678254067897797,
0.0003883034223690629,
0.06890062987804413,
0.05210549384355545,
0.05771402642130852,
-0.14753012359142303,
0.004632450174540281,
0.05381190404295921,
0.0023070857860147953,
0.10244721174240112,
0.052101872861385345,
-0.01660558395087719,
0.13766467571258545,
-0.09277716279029846,
0.05580497533082962,
0.01811683177947998,
-0.11926105618476868,
-0.22551937401294708,
-0.10481812804937363,
0.08680559694766998,
0.06525584310293198,
0.07849276810884476,
0.001686332281678915,
0.16470922529697418,
-0.008096802979707718,
0.09913384169340134,
0.250686913728714,
-0.30706000328063965,
-0.06017562747001648,
-0.011751564219594002,
0.02554803527891636,
0.07805953174829483,
-0.07334382832050323,
-0.01920229382812977,
0.025371944531798363,
0.05119195953011513,
0.13376738131046295,
-0.019041085615754128,
-0.019350355491042137,
-0.021634964272379875,
-0.12196867913007736,
-0.09650366008281708,
0.22297324240207672,
0.041428014636039734,
-0.054660532623529434,
-0.05870992690324783,
-0.0787319764494896,
-0.16896869242191315,
-0.013119366019964218,
-0.0011027893051505089,
0.02695128507912159,
-0.033487215638160706,
-0.0855521410703659,
-0.04405853897333145,
-0.08568623661994934,
-0.05539482459425926,
-0.009639721363782883,
0.13861137628555298,
0.048889994621276855,
0.03438907861709595,
-0.02893771603703499,
0.11738271266222,
0.0349072702229023,
-0.1782217025756836,
-0.02449471689760685,
0.0011017873184755445,
0.020729904994368553,
-0.02414766699075699,
-0.04683661460876465,
-0.05205399915575981,
0.023754898458719254,
0.16071973741054535,
-0.07869572937488556,
0.044861312955617905,
-0.00856457557529211,
0.028319429606199265,
-0.07115025073289871,
0.14425644278526306,
-0.012427168898284435,
-0.033600885421037674,
0.025358345359563828,
0.1002565398812294,
0.061420559883117676,
-0.029190322384238243,
-0.10368829220533371,
0.0020431960001587868,
0.1438366323709488,
0.02944590523838997,
-0.005820187274366617,
0.052970871329307556,
-0.055429767817258835,
-0.06485876441001892,
0.05935161933302879,
-0.10939588397741318,
0.00039006711449474096,
-0.0008257183362729847,
-0.07948955148458481,
-0.042455676943063736,
0.028686674311757088,
0.01235644519329071,
-0.04323983192443848,
0.03470994904637337,
-0.08555298298597336,
-0.007317604962736368,
-0.05121916905045509,
-0.09000871330499649,
0.021600496023893356,
-0.1037641167640686,
0.0005464997957460582,
-0.10607007145881653,
-0.20157399773597717,
-0.004098000004887581,
0.06933467090129852,
-0.04343174025416374,
-0.10055876523256302,
-0.05291351303458214,
-0.08302848041057587,
0.032603669911623,
-0.023195916786789894,
0.09446877241134644,
-0.06576209515333176,
0.0973932221531868,
0.03302755206823349,
0.06427552551031113,
-0.03880905732512474,
0.04749559611082077,
-0.09273317456245422,
0.044724225997924805,
-0.15376350283622742,
0.07912278175354004,
-0.018218761309981346,
0.036162614822387695,
-0.11613085865974426,
-0.10170494765043259,
0.018961139023303986,
-0.04847099632024765,
0.0843135267496109,
0.12711676955223083,
-0.1278565376996994,
-0.06362931430339813,
0.17006921768188477,
-0.0937466099858284,
-0.16563567519187927,
0.12014137208461761,
-0.033268772065639496,
0.024441268295049667,
0.05529682710766792,
0.1804170310497284,
0.07923872768878937,
-0.07685436308383942,
-0.012863821350038052,
-0.021108334884047508,
0.0730859786272049,
-0.05694519355893135,
0.11603401601314545,
0.00005726738163502887,
0.02108626626431942,
0.01556319184601307,
-0.06035371497273445,
0.023410867899656296,
-0.08514589816331863,
-0.10094068944454193,
-0.06027570739388466,
-0.08933813124895096,
0.04047870635986328,
0.03578098118305206,
0.07125262171030045,
-0.1090184822678566,
-0.10072441399097443,
0.011364258825778961,
0.10379808396100998,
-0.08819600939750671,
0.0425964817404747,
-0.07392377406358719,
0.11205590516328812,
-0.07310280203819275,
-0.012405370362102985,
-0.17359037697315216,
-0.04633826017379761,
0.03722827136516571,
-0.010017662309110165,
-0.0008436256321147084,
-0.03527645766735077,
0.06603730469942093,
0.08086370676755905,
-0.04906414449214935,
-0.05467097833752632,
-0.06927384436130524,
-0.008735007606446743,
-0.10565090924501419,
-0.19148854911327362,
-0.07553225010633469,
-0.031164677813649178,
0.16558295488357544,
-0.15105150640010834,
0.039579566568136215,
0.011138672940433025,
0.11765140295028687,
0.02488742023706436,
-0.034267354756593704,
-0.01382265705615282,
0.03820404037833214,
-0.05802036076784134,
-0.0738007053732872,
0.06598199158906937,
0.03524031117558479,
-0.106581911444664,
0.00545671209692955,
-0.1234629899263382,
0.12626276910305023,
0.11934154480695724,
-0.017758553847670555,
-0.04600165784358978,
-0.009340577758848667,
-0.0771435871720314,
-0.03329197317361832,
-0.01496835146099329,
0.00770490150898695,
0.12281511723995209,
0.009626858867704868,
0.1462017446756363,
-0.10441064834594727,
-0.05438915640115738,
0.035515327006578445,
-0.02325380966067314,
-0.01554778777062893,
0.09370258450508118,
0.020064853131771088,
-0.13168945908546448,
0.1348063051700592,
0.16487616300582886,
-0.03914446756243706,
0.12361416220664978,
-0.054854314774274826,
-0.08132980763912201,
-0.03842445835471153,
0.008586716838181019,
0.01326169166713953,
0.09031886607408524,
-0.11218122392892838,
0.0007689431658945978,
0.048662785440683365,
0.02347615361213684,
0.01048725750297308,
-0.18191106617450714,
-0.004071277566254139,
0.033828988671302795,
-0.06891655921936035,
-0.029570387676358223,
-0.00013624379062093794,
-0.0005698782042600214,
0.09708032011985779,
0.027214907109737396,
-0.05498449504375458,
0.05234351009130478,
0.006475168280303478,
-0.0815962627530098,
0.17651547491550446,
-0.08128862828016281,
-0.16239485144615173,
-0.11662989854812622,
-0.04138936847448349,
-0.09790154546499252,
0.008627147413790226,
0.07503321021795273,
-0.03629542142152786,
-0.024456994608044624,
-0.10579303652048111,
-0.023921547457575798,
-0.019609762355685234,
0.04150974377989769,
0.05917312949895859,
-0.01739288680255413,
0.06918991357088089,
-0.09683781862258911,
-0.0316789411008358,
-0.007817675359547138,
-0.02979574352502823,
0.0466824546456337,
0.0008328240946866572,
0.09519036114215851,
0.09771527349948883,
-0.012002116069197655,
0.042988236993551254,
-0.02627614699304104,
0.22957171499729156,
-0.051464710384607315,
-0.01983073353767395,
0.14551697671413422,
-0.009567571803927422,
0.08342034369707108,
0.1090908944606781,
0.027437083423137665,
-0.07249633222818375,
-0.0003143677022308111,
-0.002431423170492053,
-0.04060647264122963,
-0.20874519646167755,
-0.03163230046629906,
-0.06029484048485756,
0.010448559187352657,
0.13308162987232208,
0.03569307178258896,
0.04851435869932175,
0.07163363695144653,
-0.01413076650351286,
0.05685969442129135,
-0.007937598042190075,
0.10170169919729233,
0.08816558867692947,
0.07478753477334976,
0.12200472503900528,
-0.03992151468992233,
-0.011665149591863155,
0.054845187813043594,
0.02631603553891182,
0.2026318460702896,
-0.04206790775060654,
0.18075498938560486,
0.04390624538064003,
0.1975049376487732,
-0.005478803068399429,
0.08644542098045349,
-0.0009896019473671913,
0.022616110742092133,
-0.020534945651888847,
-0.06119133159518242,
-0.06608451902866364,
0.03238830342888832,
-0.03228466585278511,
0.05859929695725441,
-0.12155608087778091,
0.03166866675019264,
0.04411318898200989,
0.3277585506439209,
0.07026097178459167,
-0.38275784254074097,
-0.10855156928300858,
0.012542945332825184,
-0.01174803078174591,
-0.039787407964468,
-0.001830922905355692,
0.08550234138965607,
-0.10316481441259384,
0.0826164036989212,
-0.08135104179382324,
0.08646328747272491,
-0.05270016938447952,
0.019232310354709625,
0.059980690479278564,
0.07973069697618484,
0.01495220698416233,
0.06110001355409622,
-0.2731199860572815,
0.25500285625457764,
0.012512062676250935,
0.06580360978841782,
-0.09054470807313919,
0.016981899738311768,
0.022811733186244965,
0.03512733429670334,
0.08078667521476746,
0.0004315174592193216,
-0.06678146868944168,
-0.13841459155082703,
-0.12049967050552368,
0.024926211684942245,
0.07447376102209091,
-0.06160023435950279,
0.12202831357717514,
-0.03225739300251007,
0.0020640729926526546,
0.027678508311510086,
0.02634504623711109,
-0.06229956075549126,
-0.10072192549705505,
0.04406352341175079,
0.03230341151356697,
-0.0012222378281876445,
-0.06819324940443039,
-0.10790097713470459,
-0.03599530830979347,
0.17718134820461273,
0.018822399899363518,
-0.08766373246908188,
-0.12163534015417099,
0.05297275632619858,
0.09641603380441666,
-0.09567402303218842,
0.02227579429745674,
-0.013723562471568584,
0.10469397157430649,
0.018822714686393738,
-0.0934005081653595,
0.0776265412569046,
-0.06274163722991943,
-0.2000919133424759,
-0.020050879567861557,
0.13571517169475555,
0.01504767220467329,
0.05252888426184654,
0.003548896173015237,
0.03932977467775345,
-0.042684946209192276,
-0.06557910889387131,
0.04161757230758667,
0.003791490336880088,
0.10925311595201492,
-0.007520237937569618,
-0.028365105390548706,
0.035924915224313736,
-0.06379689276218414,
-0.013249648734927177,
0.16776996850967407,
0.26584339141845703,
-0.07772167026996613,
0.06646616756916046,
0.04155224561691284,
-0.050802432000637054,
-0.14355218410491943,
-0.030888881534337997,
0.07925491780042648,
0.003406816627830267,
0.022666621953248978,
-0.17929238080978394,
0.029789652675390244,
0.08957533538341522,
-0.02410772442817688,
0.08555310219526291,
-0.32042810320854187,
-0.1197681725025177,
0.08257357776165009,
0.12369594722986221,
0.09961890429258347,
-0.15381041169166565,
-0.05202678218483925,
-0.014834048226475716,
-0.1564313918352127,
0.1376049518585205,
-0.10387654602527618,
0.12228798121213913,
-0.04231991618871689,
0.06570462882518768,
0.012075469829142094,
-0.05177779495716095,
0.1379626840353012,
0.01517527922987938,
0.04533018171787262,
-0.06597708910703659,
0.023364651948213577,
0.11474671959877014,
-0.09495027363300323,
0.06214592233300209,
-0.11465507000684738,
0.06021532788872719,
-0.13610205054283142,
-0.00936118047684431,
-0.08888378739356995,
0.03237225115299225,
-0.03042028099298477,
-0.04813697934150696,
-0.012631483376026154,
0.014746838249266148,
0.08770950883626938,
-0.012479024939239025,
0.17830625176429749,
0.04655400291085243,
0.14203345775604248,
0.16658687591552734,
0.0747123584151268,
-0.053490716964006424,
-0.041208576411008835,
-0.03705417737364769,
-0.030593011528253555,
0.04331526532769203,
-0.12828464806079865,
0.03184053301811218,
0.12176267057657242,
0.014095094054937363,
0.12361805140972137,
0.06340254843235016,
-0.04527217894792557,
0.02591644413769245,
0.05679002404212952,
-0.18879888951778412,
-0.08258400112390518,
-0.024274857714772224,
-0.024870194494724274,
-0.15054866671562195,
0.04805538058280945,
0.1428736001253128,
-0.06188889592885971,
-0.0279754176735878,
-0.007363323122262955,
0.0269839596003294,
-0.03836444392800331,
0.16727446019649506,
0.03334918990731239,
0.0587368942797184,
-0.10623634606599808,
0.1146620586514473,
0.04135604947805405,
-0.07462283968925476,
0.05355098843574524,
0.08099164068698883,
-0.10899905115365982,
-0.029984138906002045,
0.03803601488471031,
0.11732829362154007,
-0.06537062674760818,
-0.04850421100854874,
-0.14503639936447144,
-0.09066715836524963,
0.0872151181101799,
0.09340078383684158,
0.07673756778240204,
0.038063760846853256,
-0.02855008654296398,
-0.00045070267515257,
-0.10662974417209625,
0.1011524498462677,
0.06501522660255432,
0.07632574439048767,
-0.1379895806312561,
0.11119578033685684,
-0.012141289189457893,
0.07516718655824661,
-0.01394928339868784,
0.0033492238726466894,
-0.07641670852899551,
-0.00937474612146616,
-0.12418704479932785,
0.004800464957952499,
-0.05278755724430084,
-0.008344228379428387,
-0.01712454855442047,
-0.05972812697291374,
-0.06529076397418976,
0.03362377732992172,
-0.11078832298517227,
-0.05917109176516533,
-0.002911214018240571,
0.03611080348491669,
-0.13033387064933777,
-0.034418731927871704,
0.01325965765863657,
-0.10699331760406494,
0.10185462981462479,
0.07872457057237625,
-0.011089959181845188,
0.020193733274936676,
-0.017409982159733772,
0.0012952393153682351,
0.03335573896765709,
0.004347721580415964,
0.052983853965997696,
-0.12603464722633362,
-0.020003091543912888,
0.018597401678562164,
0.004390617832541466,
0.020055221393704414,
0.1330716907978058,
-0.11729583144187927,
-0.00836208090186119,
0.0018532988615334034,
-0.04800418019294739,
-0.07276250422000885,
0.05698598176240921,
0.08597724884748459,
0.029299870133399963,
0.1919803023338318,
-0.06164196878671646,
0.025830594822764397,
-0.21357959508895874,
-0.007015937007963657,
0.008111980743706226,
-0.11094436049461365,
-0.09835804253816605,
-0.041491858661174774,
0.06771332025527954,
-0.0655176118016243,
0.08405588567256927,
-0.02242325246334076,
0.04281678423285484,
0.02626037783920765,
-0.004735935013741255,
0.010629654861986637,
0.009725844487547874,
0.21518626809120178,
0.027316385880112648,
-0.034041643142700195,
0.08906345069408417,
0.01244006585329771,
0.0813349261879921,
0.09666302800178528,
0.2100146859884262,
0.1201133131980896,
0.050760988146066666,
0.10522489994764328,
0.039235759526491165,
-0.03347061201930046,
-0.19679230451583862,
0.008865704759955406,
-0.008992135524749756,
0.1369585543870926,
0.007255116477608681,
0.19165854156017303,
0.10847869515419006,
-0.1873413622379303,
0.02405736967921257,
-0.023849353194236755,
-0.07427532225847244,
-0.10709156841039658,
-0.11117386072874069,
-0.08945876359939575,
-0.14093610644340515,
-0.007241873070597649,
-0.11568186432123184,
0.04567721486091614,
0.08836889266967773,
0.006845871452242136,
-0.00672838045284152,
0.11203880608081818,
0.03335873410105705,
-0.0013935824390500784,
0.05180250108242035,
-0.00023459414660464972,
-0.01857469230890274,
-0.03394228219985962,
-0.10207343101501465,
0.007198259700089693,
0.016053596511483192,
0.06356814503669739,
-0.03524866700172424,
-0.0085302060469985,
0.04484890401363373,
-0.024170037358999252,
-0.1213637962937355,
0.01608440838754177,
0.01440876629203558,
0.06604500114917755,
0.036820851266384125,
0.015813037753105164,
-0.0012000651331618428,
-0.005009863525629044,
0.20665276050567627,
-0.07469002157449722,
-0.058839745819568634,
-0.1251097470521927,
0.21059788763523102,
0.019864505156874657,
-0.06138511002063751,
0.0656360611319542,
-0.08234740048646927,
-0.020309116691350937,
0.16242720186710358,
0.19168531894683838,
-0.046068225055933,
-0.012417145073413849,
-0.007274045143276453,
-0.007037838455289602,
-0.005556111689656973,
0.1045132428407669,
0.10549405217170715,
0.003525710664689541,
-0.08156155049800873,
-0.009243090637028217,
-0.048253826797008514,
-0.0022603014949709177,
-0.05273859202861786,
0.08040148764848709,
-0.005108275916427374,
-0.013961086049675941,
-0.032151635736227036,
0.04175301641225815,
-0.04760957509279251,
-0.07509934902191162,
0.025545360520482063,
-0.19764310121536255,
-0.17148466408252716,
-0.017413778230547905,
0.05426323786377907,
0.008420294150710106,
0.06370801478624344,
-0.007497407961636782,
0.011034782975912094,
0.10570507496595383,
-0.01709629036486149,
-0.08675945550203323,
-0.08119282871484756,
0.121120385825634,
-0.12692825496196747,
0.20072130858898163,
-0.037968482822179794,
0.04925288259983063,
0.1274266242980957,
0.04054725542664528,
-0.12505267560482025,
0.026790399104356766,
0.061049409210681915,
-0.019382163882255554,
0.02293090894818306,
0.1275172084569931,
-0.04047123342752457,
0.06029092147946358,
0.05075240880250931,
-0.09001171588897705,
-0.04169246926903725,
-0.01989479921758175,
-0.004435477778315544,
-0.03418170288205147,
-0.05631951615214348,
-0.04686841368675232,
0.1392199844121933,
0.1813254952430725,
-0.06111942231655121,
-0.0076372516341507435,
-0.07144799828529358,
0.02306249365210533,
0.07763224840164185,
0.0024238082114607096,
-0.042203452438116074,
-0.24643424153327942,
-0.008852938190102577,
0.0715046152472496,
0.007153748534619808,
-0.24380157887935638,
-0.09790004789829254,
0.0076623233035206795,
-0.056687865406274796,
-0.10079973191022873,
0.09690006077289581,
0.05241852253675461,
0.035552531480789185,
-0.051843635737895966,
0.018830474466085434,
-0.08475786447525024,
0.15703487396240234,
-0.16713185608386993,
-0.08103054016828537
] |
null | null | ml-agents |
# **ppo** Agent playing **SnowballTarget**
This is a trained model of a **ppo** agent playing **SnowballTarget**
using the [Unity ML-Agents Library](https://github.com/Unity-Technologies/ml-agents).
## Usage (with ML-Agents)
The Documentation: https://unity-technologies.github.io/ml-agents/ML-Agents-Toolkit-Documentation/
We wrote a complete tutorial on how to train your first agent using ML-Agents and publish it to the Hub:
- A *short tutorial* where you teach Huggy the Dog 🐶 to fetch the stick and then play with him directly in your
browser: https://huggingface.co/learn/deep-rl-course/unitbonus1/introduction
- A *longer tutorial* to understand how ML-Agents works:
https://huggingface.co/learn/deep-rl-course/unit5/introduction
### Resume the training
```bash
mlagents-learn <your_configuration_file_path.yaml> --run-id=<run_id> --resume
```
### Watch your Agent play
You can watch your agent **playing directly in your browser**
1. If the environment is part of the ML-Agents official environments, go to https://huggingface.co/unity
2. Find your model_id: haihuynh/ppo-SnowballTarget
3. Select your *.nn or *.onnx file (this file can also be inspected locally, as sketched below)
4. Click on Watch the agent play 👀
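
The *.onnx file from step 3 can also be pulled down and inspected outside the browser. The snippet below is a minimal sketch, not part of the original card: the exact file name inside the repository is an assumption, and the tensor names are printed rather than hard-coded because ML-Agents export conventions vary between versions.

```python
# Sketch: download the exported policy and inspect its ONNX signature.
from huggingface_hub import hf_hub_download
import onnxruntime as ort

# The file name is an assumption; adjust it to the actual *.onnx in the repo.
path = hf_hub_download("haihuynh/ppo-SnowballTarget", "SnowballTarget.onnx")
session = ort.InferenceSession(path)

# List the tensors the policy expects and produces.
for tensor in session.get_inputs():
    print("input :", tensor.name, tensor.shape)
for tensor in session.get_outputs():
    print("output:", tensor.name, tensor.shape)
```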
| {"library_name": "ml-agents", "tags": ["SnowballTarget", "deep-reinforcement-learning", "reinforcement-learning", "ML-Agents-SnowballTarget"]} | reinforcement-learning | haihuynh/ppo-SnowballTarget | [
"ml-agents",
"tensorboard",
"onnx",
"SnowballTarget",
"deep-reinforcement-learning",
"reinforcement-learning",
"ML-Agents-SnowballTarget",
"region:us"
] | 2024-02-13T09:10:08+00:00 | [] | [] | TAGS
#ml-agents #tensorboard #onnx #SnowballTarget #deep-reinforcement-learning #reinforcement-learning #ML-Agents-SnowballTarget #region-us
|
# ppo Agent playing SnowballTarget
This is a trained model of a ppo agent playing SnowballTarget
using the Unity ML-Agents Library.
## Usage (with ML-Agents)
The Documentation: URL
We wrote a complete tutorial on how to train your first agent using ML-Agents and publish it to the Hub:
- A *short tutorial* where you teach Huggy the Dog to fetch the stick and then play with him directly in your
browser: URL
- A *longer tutorial* to understand how ML-Agents works:
URL
### Resume the training
### Watch your Agent play
You can watch your agent playing directly in your browser
1. If the environment is part of the ML-Agents official environments, go to URL
2. Find your model_id: haihuynh/ppo-SnowballTarget
3. Select your *.nn or *.onnx file
4. Click on Watch the agent play
| [
"# ppo Agent playing SnowballTarget\n This is a trained model of a ppo agent playing SnowballTarget\n using the Unity ML-Agents Library.\n\n ## Usage (with ML-Agents)\n The Documentation: URL\n\n We wrote a complete tutorial to learn to train your first agent using ML-Agents and publish it to the Hub:\n - A *short tutorial* where you teach Huggy the Dog to fetch the stick and then play with him directly in your\n browser: URL\n - A *longer tutorial* to understand how works ML-Agents:\n URL\n\n ### Resume the training\n \n\n ### Watch your Agent play\n You can watch your agent playing directly in your browser\n\n 1. If the environment is part of ML-Agents official environments, go to URL\n 2. Step 1: Find your model_id: haihuynh/ppo-SnowballTarget\n 3. Step 2: Select your *.nn /*.onnx file\n 4. Click on Watch the agent play"
] | [
"TAGS\n#ml-agents #tensorboard #onnx #SnowballTarget #deep-reinforcement-learning #reinforcement-learning #ML-Agents-SnowballTarget #region-us \n",
"# ppo Agent playing SnowballTarget\n This is a trained model of a ppo agent playing SnowballTarget\n using the Unity ML-Agents Library.\n\n ## Usage (with ML-Agents)\n The Documentation: URL\n\n We wrote a complete tutorial to learn to train your first agent using ML-Agents and publish it to the Hub:\n - A *short tutorial* where you teach Huggy the Dog to fetch the stick and then play with him directly in your\n browser: URL\n - A *longer tutorial* to understand how works ML-Agents:\n URL\n\n ### Resume the training\n \n\n ### Watch your Agent play\n You can watch your agent playing directly in your browser\n\n 1. If the environment is part of ML-Agents official environments, go to URL\n 2. Step 1: Find your model_id: haihuynh/ppo-SnowballTarget\n 3. Step 2: Select your *.nn /*.onnx file\n 4. Click on Watch the agent play"
] | [
50,
207
] | [
"passage: TAGS\n#ml-agents #tensorboard #onnx #SnowballTarget #deep-reinforcement-learning #reinforcement-learning #ML-Agents-SnowballTarget #region-us \n# ppo Agent playing SnowballTarget\n This is a trained model of a ppo agent playing SnowballTarget\n using the Unity ML-Agents Library.\n\n ## Usage (with ML-Agents)\n The Documentation: URL\n\n We wrote a complete tutorial to learn to train your first agent using ML-Agents and publish it to the Hub:\n - A *short tutorial* where you teach Huggy the Dog to fetch the stick and then play with him directly in your\n browser: URL\n - A *longer tutorial* to understand how works ML-Agents:\n URL\n\n ### Resume the training\n \n\n ### Watch your Agent play\n You can watch your agent playing directly in your browser\n\n 1. If the environment is part of ML-Agents official environments, go to URL\n 2. Step 1: Find your model_id: haihuynh/ppo-SnowballTarget\n 3. Step 2: Select your *.nn /*.onnx file\n 4. Click on Watch the agent play"
] | [
-0.03226843848824501,
0.0680922195315361,
-0.00343026639893651,
0.10073773562908173,
0.15903246402740479,
-0.015684643760323524,
0.16562408208847046,
0.09643834084272385,
0.13228346407413483,
0.05576049163937569,
0.08662696182727814,
0.08498673141002655,
0.06953591853380203,
0.12674659490585327,
0.08923020958900452,
-0.23318228125572205,
-0.03776201605796814,
-0.09804106503725052,
-0.023350680246949196,
0.07237159460783005,
0.04820490628480911,
-0.03611293435096741,
0.03158538416028023,
0.06690961867570877,
0.0015995725989341736,
-0.002032109070569277,
-0.0782003402709961,
-0.042560264468193054,
0.07475624233484268,
-0.02295634336769581,
0.01706058904528618,
-0.04953780397772789,
0.10120552778244019,
-0.16924981772899628,
0.027975210919976234,
0.03878234699368477,
-0.007206525653600693,
-0.020280834287405014,
0.1387191116809845,
0.04390585049986839,
0.11473813652992249,
-0.11951158195734024,
0.09070172905921936,
0.0748797282576561,
-0.0493326373398304,
0.025027120485901833,
-0.06698931008577347,
0.05697903782129288,
0.21610571444034576,
0.13881422579288483,
-0.0031415566336363554,
0.07271980494260788,
-0.041209910064935684,
0.05804502218961716,
0.1604757457971573,
-0.27812227606773376,
-0.06857863068580627,
0.18013860285282135,
-0.053439248353242874,
0.01878577098250389,
-0.024447279050946236,
0.044892873615026474,
-0.015502477064728737,
0.022013908252120018,
-0.018498258665204048,
0.03439580276608467,
0.2663120925426483,
0.02920500561594963,
-0.09073292464017868,
-0.07966065406799316,
-0.009404558688402176,
0.03616885840892792,
-0.04067496210336685,
-0.18896302580833435,
0.010987048037350178,
0.12167049944400787,
0.010628565214574337,
0.028852814808487892,
0.05268819257616997,
0.01531409751623869,
-0.09385403245687485,
-0.15612861514091492,
-0.04285518825054169,
-0.04770756512880325,
0.10892318189144135,
0.07999119907617569,
-0.02340151183307171,
-0.005668157245963812,
0.03840131685137749,
0.07180947810411453,
0.11631696671247482,
-0.039746351540088654,
-0.04178839549422264,
-0.012555482797324657,
-0.15727096796035767,
-0.02063107118010521,
-0.03262801840901375,
-0.018100854009389877,
0.04605662450194359,
0.13808250427246094,
0.12958872318267822,
0.040701501071453094,
0.03196660429239273,
0.023164963349699974,
0.007545530796051025,
0.11997836083173752,
0.04656391218304634,
-0.026189111173152924,
0.0037847820203751326,
0.022665193304419518,
0.06486087292432785,
-0.08937037736177444,
-0.10267597436904907,
0.05559422820806503,
-0.038626208901405334,
0.12567169964313507,
0.14507979154586792,
-0.02994190715253353,
-0.00025854771956801414,
-0.036025162786245346,
0.05473377928137779,
-0.1404709368944168,
0.07258182764053345,
0.05795833095908165,
-0.032943129539489746,
-0.08679909259080887,
-0.07288346439599991,
0.06545375287532806,
-0.0749790221452713,
0.03111473098397255,
0.0009823581203818321,
0.07732018083333969,
0.018516231328248978,
-0.016172409057617188,
0.04911100119352341,
-0.11911153793334961,
-0.015057522803544998,
-0.1590774804353714,
-0.11471033841371536,
-0.07280420511960983,
0.03793457895517349,
-0.048899680376052856,
-0.12033939361572266,
-0.10154884308576584,
0.037968095391988754,
-0.07819975912570953,
0.03580531105399132,
-0.04631439596414566,
-0.06475143134593964,
-0.03866814076900482,
-0.10979826003313065,
0.05753428488969803,
0.15836173295974731,
-0.002243620343506336,
-0.03539621829986572,
0.028293898329138756,
-0.16251800954341888,
0.16076253354549408,
-0.1323661357164383,
0.16633351147174835,
-0.07589954137802124,
0.03443753719329834,
0.1385277360677719,
-0.03028825670480728,
0.049778103828430176,
0.1938319206237793,
-0.10373543947935104,
-0.07468534260988235,
0.03838472440838814,
-0.09019933640956879,
-0.10900095850229263,
0.0666554868221283,
0.013083443976938725,
0.04920423403382301,
0.06071214750409126,
0.2151956856250763,
0.07799891382455826,
-0.23455534875392914,
0.04595652595162392,
0.009144800715148449,
-0.13087424635887146,
0.010877341963350773,
0.11313868314027786,
-0.06745576858520508,
-0.005732966121286154,
-0.03856458514928818,
-0.11014697700738907,
0.10264122486114502,
-0.006971433758735657,
-0.07993247359991074,
0.03268272802233696,
-0.054733116179704666,
-0.05576693266630173,
-0.0017134568188339472,
0.04568362236022949,
-0.032701823860406876,
-0.03807339444756508,
-0.007497759535908699,
0.03880110755562782,
-0.002562449313700199,
0.07852431386709213,
-0.035384614020586014,
0.10720140486955643,
-0.0038445238023996353,
0.00575345428660512,
-0.08775980770587921,
-0.13380888104438782,
-0.013944711536169052,
0.005280737299472094,
0.08157937973737717,
-0.08797063678503036,
0.10178540647029877,
… (remaining values of the 768-dimensional embedding vector omitted) …
] |
null | null | diffusers | ### Christmas-Tree Dreambooth model trained by Kruti23 following the "Build your own Gen AI model" session by NxtWave.
Project Submission Code: 112110090
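
A minimal generation sketch with `diffusers` (the prompt is illustrative; the repository id comes from this card's metadata):

```python
import torch
from diffusers import StableDiffusionPipeline

# Load this Dreambooth checkpoint (id taken from the card's metadata).
pipe = StableDiffusionPipeline.from_pretrained(
    "Kruti23/christmas-tree", torch_dtype=torch.float16
).to("cuda")

# Illustrative prompt for the trained concept.
image = pipe("a photo of a christmas-tree in a snowy garden").images[0]
image.save("christmas-tree.png")
```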
Sample pictures of this concept:
| {"license": "creativeml-openrail-m", "tags": ["NxtWave-GenAI-Webinar", "text-to-image", "stable-diffusion"]} | text-to-image | Kruti23/christmas-tree | [
"diffusers",
"safetensors",
"NxtWave-GenAI-Webinar",
"text-to-image",
"stable-diffusion",
"license:creativeml-openrail-m",
"endpoints_compatible",
"diffusers:StableDiffusionPipeline",
"region:us"
] | 2024-02-13T09:10:55+00:00 | [] | [] | TAGS
#diffusers #safetensors #NxtWave-GenAI-Webinar #text-to-image #stable-diffusion #license-creativeml-openrail-m #endpoints_compatible #diffusers-StableDiffusionPipeline #region-us
| ### Christmas-Tree Dreambooth model trained by Kruti23 following the "Build your own Gen AI model" session by NxtWave.
Project Submission Code: 112110090
Sample pictures of this concept:
!0.png)
| [
"### Christmas-Tree Dreambooth model trained by Kruti23 following the \"Build your own Gen AI model\" session by NxtWave.\n\nProject Submission Code: 112110090\n\nSample pictures of this concept:\n\n !0.png)"
] | [
"TAGS\n#diffusers #safetensors #NxtWave-GenAI-Webinar #text-to-image #stable-diffusion #license-creativeml-openrail-m #endpoints_compatible #diffusers-StableDiffusionPipeline #region-us \n",
"### Christmas-Tree Dreambooth model trained by Kruti23 following the \"Build your own Gen AI model\" session by NxtWave.\n\nProject Submission Code: 112110090\n\nSample pictures of this concept:\n\n !0.png)"
] | [
73,
53
] | [
"passage: TAGS\n#diffusers #safetensors #NxtWave-GenAI-Webinar #text-to-image #stable-diffusion #license-creativeml-openrail-m #endpoints_compatible #diffusers-StableDiffusionPipeline #region-us \n### Christmas-Tree Dreambooth model trained by Kruti23 following the \"Build your own Gen AI model\" session by NxtWave.\n\nProject Submission Code: 112110090\n\nSample pictures of this concept:\n\n !0.png)"
] | [
… (768-dimensional embedding vector omitted) …
] |
null | null | transformers |
# whisper-tiny-finetuned-minds14
This model is a fine-tuned version of [openai/whisper-tiny](https://huggingface.co/openai/whisper-tiny) on the PolyAI/minds14 dataset.
It achieves the following results on the evaluation set:
- Loss: 0.6785
- Wer Ortho: 0.3607
- Wer: 0.3624
## Model description
More information needed
## Intended uses & limitations
More information needed
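
A minimal transcription sketch with the `transformers` pipeline (the checkpoint id comes from this repository; the audio file path is a placeholder):

```python
from transformers import pipeline

# Checkpoint id taken from this repository; the audio path below is a placeholder.
asr = pipeline(
    "automatic-speech-recognition",
    model="arshsin/whisper-tiny-finetuned-minds14",
)
result = asr("path/to/audio.wav")
print(result["text"])
```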
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (mirrored in the sketch after this list):
- learning_rate: 1e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: constant_with_warmup
- lr_scheduler_warmup_ratio: 0.1
- lr_scheduler_warmup_steps: 50
- training_steps: 500
- mixed_precision_training: Native AMP
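
As a rough sketch, the settings above map onto `Seq2SeqTrainingArguments` as follows (`output_dir` is an assumption, not taken from a training script; the Adam betas and epsilon listed above are the optimizer defaults):

```python
from transformers import Seq2SeqTrainingArguments

# Sketch reconstructed from the hyperparameter list above; output_dir is assumed.
# Adam betas=(0.9, 0.999) and epsilon=1e-08 are the optimizer's default values.
training_args = Seq2SeqTrainingArguments(
    output_dir="whisper-tiny-finetuned-minds14",  # assumed
    learning_rate=1e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    lr_scheduler_type="constant_with_warmup",
    warmup_ratio=0.1,   # warmup_steps takes precedence when both are set
    warmup_steps=50,
    max_steps=500,
    fp16=True,  # "Native AMP" mixed precision
)
```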
### Training results
| Training Loss | Epoch | Step | Validation Loss | Wer Ortho | Wer |
|:-------------:|:-----:|:----:|:---------------:|:---------:|:------:|
| 3.8342 | 1.0 | 28 | 2.7013 | 0.4859 | 0.3669 |
| 1.52 | 2.0 | 56 | 0.6447 | 0.3822 | 0.3624 |
| 0.4282 | 3.0 | 84 | 0.5154 | 0.3573 | 0.3521 |
| 0.2511 | 4.0 | 112 | 0.5017 | 0.3452 | 0.3430 |
| 0.1461 | 5.0 | 140 | 0.5106 | 0.3620 | 0.3572 |
| 0.0829 | 6.0 | 168 | 0.5399 | 0.3641 | 0.3592 |
| 0.0423 | 7.0 | 196 | 0.5596 | 0.3573 | 0.3527 |
| 0.0199 | 8.0 | 224 | 0.5846 | 0.3627 | 0.3598 |
| 0.0093 | 9.0 | 252 | 0.6006 | 0.3594 | 0.3572 |
| 0.0056 | 10.0 | 280 | 0.6207 | 0.3345 | 0.3301 |
| 0.0037 | 11.0 | 308 | 0.6238 | 0.3560 | 0.3534 |
| 0.0021 | 12.0 | 336 | 0.6377 | 0.3486 | 0.3482 |
| 0.0016 | 13.0 | 364 | 0.6485 | 0.3594 | 0.3579 |
| 0.0013 | 14.0 | 392 | 0.6621 | 0.3567 | 0.3572 |
| 0.0011 | 15.0 | 420 | 0.6617 | 0.3587 | 0.3605 |
| 0.0009 | 16.0 | 448 | 0.6682 | 0.3560 | 0.3559 |
| 0.0008 | 17.0 | 476 | 0.6741 | 0.3627 | 0.3624 |
| 0.0008 | 17.86 | 500 | 0.6785 | 0.3607 | 0.3624 |
### Framework versions
- Transformers 4.37.2
- Pytorch 2.1.2
- Datasets 2.1.0
- Tokenizers 0.15.1
| {"license": "apache-2.0", "tags": ["generated_from_trainer"], "datasets": ["PolyAI/minds14"], "metrics": ["wer"], "base_model": "openai/whisper-tiny", "model-index": [{"name": "whisper-tiny-finetuned-minds14", "results": [{"task": {"type": "automatic-speech-recognition", "name": "Automatic Speech Recognition"}, "dataset": {"name": "PolyAI/minds14", "type": "PolyAI/minds14", "config": "en-US", "split": "train", "args": "en-US"}, "metrics": [{"type": "wer", "value": 0.3624031007751938, "name": "Wer"}]}]}]} | automatic-speech-recognition | arshsin/whisper-tiny-finetuned-minds14 | [
"transformers",
"safetensors",
"whisper",
"automatic-speech-recognition",
"generated_from_trainer",
"dataset:PolyAI/minds14",
"base_model:openai/whisper-tiny",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"region:us"
] | 2024-02-13T09:11:32+00:00 | [] | [] | TAGS
#transformers #safetensors #whisper #automatic-speech-recognition #generated_from_trainer #dataset-PolyAI/minds14 #base_model-openai/whisper-tiny #license-apache-2.0 #model-index #endpoints_compatible #region-us
| whisper-tiny-finetuned-minds14
==============================
This model is a fine-tuned version of openai/whisper-tiny on the PolyAI/minds14 dataset.
It achieves the following results on the evaluation set:
* Loss: 0.6785
* Wer Ortho: 0.3607
* Wer: 0.3624
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 1e-05
* train\_batch\_size: 16
* eval\_batch\_size: 16
* seed: 42
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: constant\_with\_warmup
* lr\_scheduler\_warmup\_ratio: 0.1
* lr\_scheduler\_warmup\_steps: 50
* training\_steps: 500
* mixed\_precision\_training: Native AMP
### Training results
### Framework versions
* Transformers 4.37.2
* Pytorch 2.1.2
* Datasets 2.1.0
* Tokenizers 0.15.1
| [
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 1e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: constant\\_with\\_warmup\n* lr\\_scheduler\\_warmup\\_ratio: 0.1\n* lr\\_scheduler\\_warmup\\_steps: 50\n* training\\_steps: 500\n* mixed\\_precision\\_training: Native AMP",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.37.2\n* Pytorch 2.1.2\n* Datasets 2.1.0\n* Tokenizers 0.15.1"
] | [
"TAGS\n#transformers #safetensors #whisper #automatic-speech-recognition #generated_from_trainer #dataset-PolyAI/minds14 #base_model-openai/whisper-tiny #license-apache-2.0 #model-index #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 1e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: constant\\_with\\_warmup\n* lr\\_scheduler\\_warmup\\_ratio: 0.1\n* lr\\_scheduler\\_warmup\\_steps: 50\n* training\\_steps: 500\n* mixed\\_precision\\_training: Native AMP",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.37.2\n* Pytorch 2.1.2\n* Datasets 2.1.0\n* Tokenizers 0.15.1"
] | [
79,
155,
4,
30
] | [
"passage: TAGS\n#transformers #safetensors #whisper #automatic-speech-recognition #generated_from_trainer #dataset-PolyAI/minds14 #base_model-openai/whisper-tiny #license-apache-2.0 #model-index #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 1e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: constant\\_with\\_warmup\n* lr\\_scheduler\\_warmup\\_ratio: 0.1\n* lr\\_scheduler\\_warmup\\_steps: 50\n* training\\_steps: 500\n* mixed\\_precision\\_training: Native AMP### Training results### Framework versions\n\n\n* Transformers 4.37.2\n* Pytorch 2.1.2\n* Datasets 2.1.0\n* Tokenizers 0.15.1"
] | [
… (768-dimensional embedding vector omitted) …
] |
null | null | transformers |
# Model Card for Mistral-7B-Instruct-v0.2
The Mistral-7B-Instruct-v0.2 Large Language Model (LLM) is an improved instruct fine-tuned version of [Mistral-7B-Instruct-v0.1](https://huggingface.co/mistralai/Mistral-7B-Instruct-v0.1).
For full details of this model please read our [paper](https://arxiv.org/abs/2310.06825) and [release blog post](https://mistral.ai/news/la-plateforme/).
## Instruction format
In order to leverage instruction fine-tuning, your prompt should be surrounded by `[INST]` and `[/INST]` tokens. The very first instruction should begin with a begin-of-sentence token id. The next instructions should not. The assistant generation will be ended by the end-of-sentence token id.
E.g.
```
text = "<s>[INST] What is your favourite condiment? [/INST]"
"Well, I'm quite partial to a good squeeze of fresh lemon juice. It adds just the right amount of zesty flavour to whatever I'm cooking up in the kitchen!</s> "
"[INST] Do you have mayonnaise recipes? [/INST]"
```
This format is available as a [chat template](https://huggingface.co/docs/transformers/main/chat_templating) via the `apply_chat_template()` method:
```python
from transformers import AutoModelForCausalLM, AutoTokenizer
device = "cuda" # the device to load the model onto
model = AutoModelForCausalLM.from_pretrained("mistralai/Mistral-7B-Instruct-v0.2")
tokenizer = AutoTokenizer.from_pretrained("mistralai/Mistral-7B-Instruct-v0.2")
messages = [
{"role": "user", "content": "What is your favourite condiment?"},
{"role": "assistant", "content": "Well, I'm quite partial to a good squeeze of fresh lemon juice. It adds just the right amount of zesty flavour to whatever I'm cooking up in the kitchen!"},
{"role": "user", "content": "Do you have mayonnaise recipes?"}
]
# Render the conversation with the model's chat template and tokenize it.
encodeds = tokenizer.apply_chat_template(messages, return_tensors="pt")

model_inputs = encodeds.to(device)
model.to(device)

# Sample up to 1000 new tokens, then decode the whole sequence.
generated_ids = model.generate(model_inputs, max_new_tokens=1000, do_sample=True)
decoded = tokenizer.batch_decode(generated_ids)
print(decoded[0])
```
## Model Architecture
This instruction model is based on Mistral-7B-v0.1, a transformer model with the following architecture choices:
- Grouped-Query Attention
- Sliding-Window Attention (illustrated by the toy mask after this list)
- Byte-fallback BPE tokenizer
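
As a toy illustration of the sliding-window idea (not Mistral's implementation; the window size and sequence length below are arbitrary):

```python
import torch

def sliding_window_causal_mask(seq_len: int, window: int) -> torch.Tensor:
    """Boolean mask: position i may attend to position j iff i - window < j <= i."""
    i = torch.arange(seq_len).unsqueeze(1)
    j = torch.arange(seq_len).unsqueeze(0)
    return (j <= i) & (j > i - window)

# 8 tokens with a window of 4: each token sees itself and its 3 predecessors.
print(sliding_window_causal_mask(8, 4).int())
```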
## Troubleshooting
- If you see the following error:
```
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/transformers/models/auto/auto_factory.py", line 482, in from_pretrained
    config, kwargs = AutoConfig.from_pretrained(
  File "/transformers/models/auto/configuration_auto.py", line 1022, in from_pretrained
    config_class = CONFIG_MAPPING[config_dict["model_type"]]
  File "/transformers/models/auto/configuration_auto.py", line 723, in __getitem__
    raise KeyError(key)
KeyError: 'mistral'
```
Installing transformers from source should solve the issue:

```
pip install git+https://github.com/huggingface/transformers
```

This should not be required after transformers-v4.33.4.
## Limitations
The Mistral 7B Instruct model is a quick demonstration that the base model can be easily fine-tuned to achieve compelling performance.
It does not have any moderation mechanisms. We're looking forward to engaging with the community on ways to
make the model finely respect guardrails, allowing for deployment in environments requiring moderated outputs.
## The Mistral AI Team
Albert Jiang, Alexandre Sablayrolles, Arthur Mensch, Blanche Savary, Chris Bamford, Devendra Singh Chaplot, Diego de las Casas, Emma Bou Hanna, Florian Bressand, Gianna Lengyel, Guillaume Bour, Guillaume Lample, Lélio Renard Lavaud, Louis Ternon, Lucile Saulnier, Marie-Anne Lachaux, Pierre Stock, Teven Le Scao, Théophile Gervet, Thibaut Lavril, Thomas Wang, Timothée Lacroix, William El Sayed. | {"license": "apache-2.0", "tags": ["finetuned"], "pipeline_tag": "text-generation", "inference": false} | text-generation | moc1pher/mistral-orient | [
"transformers",
"pytorch",
"safetensors",
"mistral",
"text-generation",
"finetuned",
"conversational",
"arxiv:2310.06825",
"license:apache-2.0",
"autotrain_compatible",
"text-generation-inference",
"region:us"
] | 2024-02-13T09:14:07+00:00 | [
"2310.06825"
] | [] | TAGS
#transformers #pytorch #safetensors #mistral #text-generation #finetuned #conversational #arxiv-2310.06825 #license-apache-2.0 #autotrain_compatible #text-generation-inference #region-us
|
# Model Card for Mistral-7B-Instruct-v0.2
The Mistral-7B-Instruct-v0.2 Large Language Model (LLM) is an improved instruct fine-tuned version of Mistral-7B-Instruct-v0.1.
For full details of this model please read our paper and release blog post.
## Instruction format
In order to leverage instruction fine-tuning, your prompt should be surrounded by '[INST]' and '[/INST]' tokens. The very first instruction should begin with a begin of sentence id. The next instructions should not. The assistant generation will be ended by the end-of-sentence token id.
E.g.
This format is available as a chat template via the 'apply_chat_template()' method:
## Model Architecture
This instruction model is based on Mistral-7B-v0.1, a transformer model with the following architecture choices:
- Grouped-Query Attention
- Sliding-Window Attention
- Byte-fallback BPE tokenizer
## Troubleshooting
- If you see the following error:
Installing transformers from source should solve the issue
pip install git+URL
This should not be required after transformers-v4.33.4.
## Limitations
The Mistral 7B Instruct model is a quick demonstration that the base model can be easily fine-tuned to achieve compelling performance.
It does not have any moderation mechanisms. We're looking forward to engaging with the community on ways to
make the model finely respect guardrails, allowing for deployment in environments requiring moderated outputs.
## The Mistral AI Team
Albert Jiang, Alexandre Sablayrolles, Arthur Mensch, Blanche Savary, Chris Bamford, Devendra Singh Chaplot, Diego de las Casas, Emma Bou Hanna, Florian Bressand, Gianna Lengyel, Guillaume Bour, Guillaume Lample, Lélio Renard Lavaud, Louis Ternon, Lucile Saulnier, Marie-Anne Lachaux, Pierre Stock, Teven Le Scao, Théophile Gervet, Thibaut Lavril, Thomas Wang, Timothée Lacroix, William El Sayed. | [
"# Model Card for Mistral-7B-Instruct-v0.2\n\nThe Mistral-7B-Instruct-v0.2 Large Language Model (LLM) is an improved instruct fine-tuned version of Mistral-7B-Instruct-v0.1.\n\nFor full details of this model please read our paper and release blog post.",
"## Instruction format\n\nIn order to leverage instruction fine-tuning, your prompt should be surrounded by '[INST]' and '[/INST]' tokens. The very first instruction should begin with a begin of sentence id. The next instructions should not. The assistant generation will be ended by the end-of-sentence token id.\n\nE.g.\n\n\nThis format is available as a chat template via the 'apply_chat_template()' method:",
"## Model Architecture\nThis instruction model is based on Mistral-7B-v0.1, a transformer model with the following architecture choices:\n- Grouped-Query Attention\n- Sliding-Window Attention\n- Byte-fallback BPE tokenizer",
"## Troubleshooting\n- If you see the following error:\n\n\nInstalling transformers from source should solve the issue\npip install git+URL\n\nThis should not be required after transformers-v4.33.4.",
"## Limitations\n\nThe Mistral 7B Instruct model is a quick demonstration that the base model can be easily fine-tuned to achieve compelling performance. \nIt does not have any moderation mechanisms. We're looking forward to engaging with the community on ways to\nmake the model finely respect guardrails, allowing for deployment in environments requiring moderated outputs.",
"## The Mistral AI Team\n\nAlbert Jiang, Alexandre Sablayrolles, Arthur Mensch, Blanche Savary, Chris Bamford, Devendra Singh Chaplot, Diego de las Casas, Emma Bou Hanna, Florian Bressand, Gianna Lengyel, Guillaume Bour, Guillaume Lample, Lélio Renard Lavaud, Louis Ternon, Lucile Saulnier, Marie-Anne Lachaux, Pierre Stock, Teven Le Scao, Théophile Gervet, Thibaut Lavril, Thomas Wang, Timothée Lacroix, William El Sayed."
] | [
"TAGS\n#transformers #pytorch #safetensors #mistral #text-generation #finetuned #conversational #arxiv-2310.06825 #license-apache-2.0 #autotrain_compatible #text-generation-inference #region-us \n",
"# Model Card for Mistral-7B-Instruct-v0.2\n\nThe Mistral-7B-Instruct-v0.2 Large Language Model (LLM) is an improved instruct fine-tuned version of Mistral-7B-Instruct-v0.1.\n\nFor full details of this model please read our paper and release blog post.",
"## Instruction format\n\nIn order to leverage instruction fine-tuning, your prompt should be surrounded by '[INST]' and '[/INST]' tokens. The very first instruction should begin with a begin of sentence id. The next instructions should not. The assistant generation will be ended by the end-of-sentence token id.\n\nE.g.\n\n\nThis format is available as a chat template via the 'apply_chat_template()' method:",
"## Model Architecture\nThis instruction model is based on Mistral-7B-v0.1, a transformer model with the following architecture choices:\n- Grouped-Query Attention\n- Sliding-Window Attention\n- Byte-fallback BPE tokenizer",
"## Troubleshooting\n- If you see the following error:\n\n\nInstalling transformers from source should solve the issue\npip install git+URL\n\nThis should not be required after transformers-v4.33.4.",
"## Limitations\n\nThe Mistral 7B Instruct model is a quick demonstration that the base model can be easily fine-tuned to achieve compelling performance. \nIt does not have any moderation mechanisms. We're looking forward to engaging with the community on ways to\nmake the model finely respect guardrails, allowing for deployment in environments requiring moderated outputs.",
"## The Mistral AI Team\n\nAlbert Jiang, Alexandre Sablayrolles, Arthur Mensch, Blanche Savary, Chris Bamford, Devendra Singh Chaplot, Diego de las Casas, Emma Bou Hanna, Florian Bressand, Gianna Lengyel, Guillaume Bour, Guillaume Lample, Lélio Renard Lavaud, Louis Ternon, Lucile Saulnier, Marie-Anne Lachaux, Pierre Stock, Teven Le Scao, Théophile Gervet, Thibaut Lavril, Thomas Wang, Timothée Lacroix, William El Sayed."
] | [
68,
70,
105,
56,
42,
85,
125
] | [
"passage: TAGS\n#transformers #pytorch #safetensors #mistral #text-generation #finetuned #conversational #arxiv-2310.06825 #license-apache-2.0 #autotrain_compatible #text-generation-inference #region-us \n# Model Card for Mistral-7B-Instruct-v0.2\n\nThe Mistral-7B-Instruct-v0.2 Large Language Model (LLM) is an improved instruct fine-tuned version of Mistral-7B-Instruct-v0.1.\n\nFor full details of this model please read our paper and release blog post.## Instruction format\n\nIn order to leverage instruction fine-tuning, your prompt should be surrounded by '[INST]' and '[/INST]' tokens. The very first instruction should begin with a begin of sentence id. The next instructions should not. The assistant generation will be ended by the end-of-sentence token id.\n\nE.g.\n\n\nThis format is available as a chat template via the 'apply_chat_template()' method:## Model Architecture\nThis instruction model is based on Mistral-7B-v0.1, a transformer model with the following architecture choices:\n- Grouped-Query Attention\n- Sliding-Window Attention\n- Byte-fallback BPE tokenizer## Troubleshooting\n- If you see the following error:\n\n\nInstalling transformers from source should solve the issue\npip install git+URL\n\nThis should not be required after transformers-v4.33.4.## Limitations\n\nThe Mistral 7B Instruct model is a quick demonstration that the base model can be easily fine-tuned to achieve compelling performance. \nIt does not have any moderation mechanisms. We're looking forward to engaging with the community on ways to\nmake the model finely respect guardrails, allowing for deployment in environments requiring moderated outputs."
] | [
-0.09122976660728455,
0.09995874762535095,
-0.006308325100690126,
0.029533402994275093,
0.08990995585918427,
-0.003773544216528535,
0.08337631076574326,
0.08826310932636261,
0.0517084077000618,
0.054058656096458435,
-0.018877001479268074,
0.07797109335660934,
0.0678364634513855,
0.11602136492729187,
0.009831574745476246,
-0.20886167883872986,
0.03951217606663704,
-0.04025792330503464,
0.09019631892442703,
0.06990981847047806,
0.13127665221691132,
-0.026330798864364624,
0.01257265079766512,
0.04588649421930313,
-0.06649493426084518,
0.014841371215879917,
0.004760208074003458,
-0.029666366055607796,
0.08171582221984863,
0.06000325456261635,
0.046539340168237686,
0.029706109315156937,
-0.008716116659343243,
-0.17006389796733856,
0.02086634933948517,
0.094186931848526,
-0.039122045040130615,
0.052450016140937805,
0.08241581171751022,
-0.005354501772671938,
0.08867881447076797,
-0.10352065414190292,
-0.015174277126789093,
0.06410208344459534,
-0.041976556181907654,
-0.14058099687099457,
-0.060432516038417816,
0.10539232939481735,
0.10704273730516434,
0.08087848126888275,
-0.01136627048254013,
0.04308357834815979,
0.025535574182868004,
0.0944424644112587,
0.14173464477062225,
-0.1892741620540619,
-0.03308636695146561,
0.05607762187719345,
0.03748226538300514,
0.11741676926612854,
-0.058054689317941666,
-0.01577284000813961,
-0.007755413185805082,
0.029794029891490936,
0.020998695865273476,
-0.018926361575722694,
-0.04058540239930153,
-0.040962979197502136,
-0.12559473514556885,
-0.03655114024877548,
0.20572802424430847,
0.00918054673820734,
-0.07827356457710266,
-0.14934350550174713,
-0.08444869518280029,
0.11483647674322128,
0.04466954246163368,
-0.05631684139370918,
0.05203638970851898,
0.052616722881793976,
0.09125620871782303,
-0.12824536859989166,
-0.11384213715791702,
-0.0477135144174099,
-0.026983344927430153,
0.07531532645225525,
-0.005512096919119358,
0.042371802031993866,
-0.036691345274448395,
0.11074173450469971,
-0.08680159598588943,
-0.07763858139514923,
-0.09926757961511612,
-0.014465011656284332,
-0.06412360817193985,
-0.030816078186035156,
-0.001521610771305859,
-0.10430551320314407,
0.04863782972097397,
0.19592222571372986,
-0.11526430398225784,
0.05876130238175392,
0.019939415156841278,
0.03386683389544487,
0.011422291398048401,
0.1450784057378769,
-0.009345019236207008,
-0.03683842346072197,
0.06807894259691238,
0.03602974861860275,
0.1030249148607254,
0.006391557864844799,
-0.05393271893262863,
-0.09957163035869598,
0.04421057552099228,
0.04006526619195938,
0.07497607916593552,
0.051922064274549484,
-0.05662759020924568,
-0.025310173630714417,
0.26935672760009766,
-0.09875214099884033,
0.0210324227809906,
0.005812100134789944,
-0.04055633023381233,
0.02057819254696369,
0.07767920941114426,
-0.03446538373827934,
-0.07680903375148773,
-0.03877895697951317,
-0.06771595031023026,
-0.02544291876256466,
-0.07025010138750076,
-0.0937042310833931,
0.015423011034727097,
-0.03561444953083992,
-0.05331877991557121,
-0.1285807192325592,
-0.22477620840072632,
-0.003443832276389003,
0.005950997117906809,
-0.01175892073661089,
0.00036745399120263755,
-0.02977934665977955,
-0.0560557097196579,
-0.020429646596312523,
-0.043966617435216904,
-0.018873801454901695,
-0.05414580553770065,
0.023493366315960884,
-0.014504307880997658,
0.03621775284409523,
-0.041580069810152054,
-0.002030450152233243,
-0.035914838314056396,
0.02084091491997242,
-0.23978149890899658,
0.11739853769540787,
-0.0365278497338295,
-0.009982343763113022,
-0.08326455950737,
-0.01378034520894289,
0.07733245939016342,
0.0006590727134607732,
0.04440942406654358,
0.11925916373729706,
-0.14457012712955475,
0.00017851535812951624,
0.13597238063812256,
-0.1747773289680481,
0.0035689121577888727,
0.12913428246974945,
0.038906652480363846,
-0.021937232464551926,
0.07987452298402786,
0.12739397585391998,
0.0916777029633522,
-0.05820862948894501,
-0.10835675895214081,
0.06764467805624008,
-0.10098790377378464,
-0.00909836683422327,
0.016224337741732597,
-0.012687094509601593,
0.06922647356987,
0.03896632790565491,
-0.08153177052736282,
0.04328412935137749,
0.026183338835835457,
0.03425585478544235,
-0.016736533492803574,
-0.03775324672460556,
0.01444784551858902,
-0.02167140133678913,
-0.05011240392923355,
-0.03274926170706749,
-0.12294188886880875,
0.07225214689970016,
0.13708068430423737,
-0.01793021708726883,
0.011290013790130615,
-0.09402501583099365,
0.11633113026618958,
-0.03436492756009102,
0.001635813619941473,
-0.1428476721048355,
-0.081349216401577,
0.046051498502492905,
-0.005028255749493837,
-0.034682925790548325,
0.07700513303279877,
0.0411263033747673,
0.049591682851314545,
0.029178237542510033,
-0.06364486366510391,
0.028140179812908173,
-0.026263220235705376,
-0.06001611053943634,
-0.06844783574342728,
-0.032731302082538605,
-0.037791743874549866,
0.1564401239156723,
-0.1757822185754776,
0.06411924958229065,
0.15055224299430847,
0.11750984191894531,
0.010728646069765091,
-0.07762043178081512,
0.031414955854415894,
-0.05891864001750946,
-0.026723431423306465,
-0.06931784749031067,
0.03714114427566528,
0.06772715598344803,
-0.006032122764736414,
0.0629381537437439,
-0.19366289675235748,
-0.14648154377937317,
0.03846127912402153,
0.022368954494595528,
-0.018691910430788994,
-0.05776825547218323,
-0.040307559072971344,
-0.017559722065925598,
-0.05450120568275452,
-0.13784809410572052,
0.21932722628116608,
0.0023007115814834833,
0.13210082054138184,
-0.06373266875743866,
-0.07977675646543503,
-0.02543029561638832,
-0.02316652238368988,
-0.035438671708106995,
0.05071575939655304,
-0.14285901188850403,
-0.055836595594882965,
0.05152415484189987,
-0.013500388711690903,
-0.07159879058599472,
0.0962795540690422,
0.038629014045000076,
-0.020208803936839104,
0.0004354751145001501,
0.09678313881158829,
0.021046947687864304,
0.060663819313049316,
-0.09226226806640625,
-0.023523595184087753,
0.028608111664652824,
0.043440647423267365,
0.0389854870736599,
-0.11066831648349762,
0.07958953827619553,
0.017685992643237114,
-0.06076524406671524,
0.04019007831811905,
0.02976926416158676,
-0.027299771085381508,
0.04563915356993675,
-0.017612917348742485,
0.017345082014799118,
0.02690138854086399,
-0.06877323985099792,
-0.12751956284046173,
0.14540037512779236,
-0.11572582274675369,
-0.19297891855239868,
-0.15160994231700897,
-0.04611416533589363,
-0.06150191277265549,
0.011887121945619583,
0.08417364954948425,
-0.0453672893345356,
-0.06522446870803833,
-0.08504261076450348,
-0.0032968062441796064,
-0.01405313704162836,
-0.06643698364496231,
-0.030977055430412292,
-0.06045777350664139,
0.08013821393251419,
-0.1550554484128952,
-0.008888275362551212,
0.012450462207198143,
-0.12500730156898499,
0.035465043038129807,
0.012831589207053185,
0.048551592975854874,
0.07016462087631226,
-0.02762899547815323,
0.004421699326485395,
-0.014463755302131176,
0.20392952859401703,
-0.026795165613293648,
0.1318570375442505,
0.25430479645729065,
-0.005930204875767231,
0.10298644006252289,
0.07107214629650116,
-0.029025109484791756,
-0.032265182584524155,
0.010028330609202385,
-0.014280416071414948,
-0.012318518944084644,
-0.19846276938915253,
-0.03888711333274841,
-0.047828707844018936,
-0.044576212763786316,
0.02656431868672371,
0.0810781717300415,
0.10366176068782806,
0.06835824251174927,
-0.05313859134912491,
0.0631624162197113,
0.08509429544210434,
0.11750930547714233,
0.1223234087228775,
0.000195670232642442,
0.040309082716703415,
-0.04118220508098602,
0.06078090891242027,
0.1078772097826004,
0.04703846946358681,
0.15417000651359558,
-0.10982297360897064,
0.21288932859897614,
0.04911128804087639,
0.03376590833067894,
0.029084596782922745,
0.045300573110580444,
-0.08957281708717346,
0.028821531683206558,
-0.0017345900414511561,
-0.06944859772920609,
-0.031264401972293854,
0.06728219240903854,
-0.06453339010477066,
0.05665815994143486,
-0.058640800416469574,
-0.01331192534416914,
0.07211027294397354,
0.19794955849647522,
0.0019011014373973012,
-0.16905081272125244,
-0.08375648409128189,
0.0536385104060173,
-0.04080980643630028,
-0.09874904900789261,
-0.005599504336714745,
0.1848110407590866,
-0.06547076255083084,
0.009154211729764938,
-0.011043529026210308,
0.08978407829999924,
-0.1510777473449707,
0.0002503507712390274,
-0.012387998402118683,
0.2345670908689499,
0.0021288737189024687,
0.07589662820100784,
-0.07418205589056015,
0.05435143783688545,
0.01932256482541561,
0.09498977661132812,
-0.0641138032078743,
0.07026970386505127,
0.05825149640440941,
0.07229281961917877,
0.10921912640333176,
0.0057350327260792255,
0.01774989254772663,
-0.03461211919784546,
-0.05528680607676506,
0.010121883824467659,
0.019923923537135124,
-0.05707584694027901,
0.029049381613731384,
-0.038597121834754944,
-0.013737550936639309,
-0.011083430610597134,
-0.026645664125680923,
-0.12138506770133972,
-0.14152124524116516,
0.010888484306633472,
0.1163974180817604,
0.05122136324644089,
-0.07804708182811737,
0.000575966783799231,
-0.08305352926254272,
0.18970656394958496,
-0.12416958808898926,
-0.13076066970825195,
-0.12083401530981064,
-0.005819360725581646,
0.0410379022359848,
-0.050455015152692795,
0.020454684272408485,
-0.014301690272986889,
0.1544983834028244,
-0.019070880487561226,
-0.03670553117990494,
-0.006890958175063133,
-0.12655970454216003,
-0.07137329131364822,
-0.0186800304800272,
0.020194249227643013,
0.08031488209962845,
-0.009871170856058598,
0.011273177340626717,
-0.00846424512565136,
-0.03624071553349495,
-0.09521675854921341,
-0.030311748385429382,
0.21689844131469727,
0.05854581296443939,
0.0672607347369194,
-0.0047690593637526035,
-0.16504588723182678,
-0.04229012504220009,
-0.0029318782035261393,
0.0359661839902401,
0.2621054947376251,
-0.0683789998292923,
0.04860452562570572,
0.15795662999153137,
-0.06608088314533234,
-0.14389196038246155,
0.007474810350686312,
0.03195760399103165,
0.019309960305690765,
-0.014696224592626095,
-0.11517390608787537,
0.09723737090826035,
0.08571474999189377,
-0.017502067610621452,
0.09324395656585693,
-0.17202329635620117,
-0.07628153264522552,
0.021344000473618507,
0.09512648731470108,
0.044108930975198746,
-0.07580165565013885,
-0.06978519260883331,
0.012014346197247505,
-0.11320611834526062,
-0.03327067941427231,
-0.06510777026414871,
0.042904164642095566,
-0.025522761046886444,
0.012446695007383823,
0.04274255782365799,
-0.05502016842365265,
0.1081153079867363,
-0.02963438630104065,
0.037992071360349655,
-0.07998600602149963,
0.056002743542194366,
0.05874663591384888,
-0.09115762263536453,
0.11143956333398819,
-0.11406860500574112,
0.07450882345438004,
-0.055888231843709946,
-0.02766232192516327,
-0.07362660020589828,
0.07660730183124542,
-0.0006531607359647751,
-0.05538879707455635,
-0.016863074153661728,
0.012752714566886425,
0.03668127581477165,
0.017889320850372314,
-0.04557008296251297,
-0.060106970369815826,
0.03382512181997299,
0.10340806096792221,
0.12306386232376099,
-0.06758695840835571,
-0.11058469116687775,
-0.0267763864248991,
-0.005948980804532766,
0.05983709916472435,
-0.0629810243844986,
0.003143324749544263,
0.05833889916539192,
0.022405333817005157,
0.1424931138753891,
0.03365277871489525,
-0.11394240707159042,
0.01333173643797636,
0.04145481437444687,
-0.09323354065418243,
-0.1075136736035347,
-0.059797462075948715,
0.16341142356395721,
-0.07242707908153534,
0.04656002297997475,
0.1447622925043106,
-0.015587040223181248,
-0.015022117644548416,
0.020994344726204872,
0.014920483343303204,
0.0011200710432603955,
0.0338810570538044,
-0.0036196729633957148,
0.06271446496248245,
-0.03255683183670044,
0.06066170707345009,
0.06373757123947144,
-0.0782044380903244,
0.01491590216755867,
0.12704876065254211,
-0.1371438056230545,
-0.06572407484054565,
-0.12210627645254135,
0.02530255727469921,
0.0003578366304282099,
-0.07092497497797012,
-0.04702071473002434,
-0.03636427968740463,
-0.006187040824443102,
0.10176940262317657,
0.045467622578144073,
-0.02080599218606949,
0.0008707234519533813,
0.05064806342124939,
-0.07312093675136566,
0.08478564769029617,
-0.02416825108230114,
0.040943291038274765,
-0.12376252561807632,
0.027192043140530586,
0.037982210516929626,
-0.0038375111762434244,
0.007782239932566881,
-0.042000435292720795,
-0.08363431692123413,
-0.022644486278295517,
-0.14608678221702576,
0.06190226599574089,
-0.09263260662555695,
-0.005807201378047466,
0.016090059652924538,
-0.012165832333266735,
0.016686253249645233,
0.07973022013902664,
0.00377894239500165,
-0.05487150326371193,
-0.0025948130059987307,
0.07414401322603226,
-0.13408970832824707,
-0.04036888852715492,
-0.029023658484220505,
-0.10044552385807037,
0.15776945650577545,
0.025232046842575073,
-0.0507599301636219,
-0.02877403050661087,
-0.1662212610244751,
0.0010303775779902935,
-0.006965376902371645,
0.036178793758153915,
0.028203964233398438,
-0.061236534267663956,
0.02067500911653042,
0.008977696299552917,
-0.0706329271197319,
-0.0443856343626976,
0.0426940880715847,
-0.05284399911761284,
0.06799537688493729,
-0.011827332898974419,
0.013159328140318394,
-0.12983830273151398,
0.04114779457449913,
0.13717180490493774,
0.05448373779654503,
0.09325843304395676,
-0.0627380907535553,
0.015427319332957268,
-0.11392536759376526,
-0.013688191771507263,
0.048120420426130295,
-0.01714637503027916,
-0.010919734835624695,
-0.014123428612947464,
0.03370531275868416,
-0.04520808160305023,
0.028490690514445305,
-0.03887898474931717,
0.02481120266020298,
0.03688593953847885,
-0.004172487650066614,
-0.0407419428229332,
-0.037135541439056396,
0.06235098838806152,
-0.0422561876475811,
0.0266500823199749,
0.01741548627614975,
0.00519227422773838,
0.03847751393914223,
-0.050803568214178085,
0.09081755578517914,
0.08846168965101242,
-0.048427384346723557,
0.09972386062145233,
-0.016603082418441772,
-0.05013977363705635,
-0.1843680441379547,
-0.003739334410056472,
-0.0449356846511364,
0.048155076801776886,
-0.08019787818193436,
0.08246185630559921,
0.18118244409561157,
-0.07753710448741913,
0.02953098714351654,
0.01954956352710724,
-0.03809278458356857,
-0.05501939728856087,
-0.2097969651222229,
-0.008180273696780205,
-0.07623868435621262,
-0.03949728608131409,
-0.09863108396530151,
0.04925449192523956,
0.0655006617307663,
0.008290954865515232,
0.036034900695085526,
0.060999445617198944,
0.005514058284461498,
-0.04270507022738457,
0.024764632806181908,
-0.045442525297403336,
0.04631679877638817,
-0.01102666836231947,
0.002645237138494849,
0.06961356103420258,
0.03025110252201557,
0.06427391618490219,
0.061276182532310486,
0.10112068802118301,
0.001364362658932805,
-0.0063048070296645164,
-0.07291340827941895,
-0.010296468622982502,
0.05256003886461258,
-0.03847275301814079,
0.12808924913406372,
0.0733470767736435,
-0.043211985379457474,
-0.0011656488059088588,
0.15687476098537445,
-0.027920696884393692,
-0.11538483947515488,
-0.09307684004306793,
0.19428744912147522,
-0.03368787840008736,
-0.03636116907000542,
0.02421930618584156,
-0.15978281199932098,
0.02618887647986412,
0.17679788172245026,
0.03450819104909897,
0.007341380696743727,
0.0037615629844367504,
-0.014763301238417625,
-0.018802229315042496,
-0.013065815903246403,
0.04995555430650711,
0.05023469030857086,
0.31053775548934937,
-0.015703294426202774,
0.0905599296092987,
-0.00009221016807714477,
0.007851412519812584,
-0.047310374677181244,
0.02620859257876873,
-0.07397934794425964,
0.01677018590271473,
-0.013444129377603531,
0.02743341214954853,
0.002264605136588216,
-0.12977886199951172,
-0.0178713146597147,
0.005487839225679636,
-0.04351862892508507,
0.010454866103827953,
0.08410380780696869,
-0.04949912056326866,
0.056694719940423965,
-0.015378156676888466,
-0.00844633486121893,
0.19177784025669098,
-0.056605853140354156,
-0.09166677296161652,
-0.004189594183117151,
0.025683943182229996,
-0.02766363136470318,
0.1757136732339859,
-0.025558756664395332,
0.10769180953502655,
0.11273365467786789,
0.03160439431667328,
-0.16835029423236847,
0.04338384047150612,
-0.03177490830421448,
-0.13471943140029907,
0.013020462356507778,
0.11624234914779663,
-0.03241545706987381,
0.04916926845908165,
0.024915173649787903,
-0.11342480033636093,
0.01749606244266033,
0.08364526927471161,
-0.012924551032483578,
-0.10274139791727066,
0.007744533009827137,
-0.054367441684007645,
0.11630494147539139,
0.05799400806427002,
-0.003428651951253414,
-0.04229886457324028,
-0.04676029458642006,
0.0087592713534832,
0.028061937540769577,
0.10476013273000717,
0.005520725157111883,
-0.11812025308609009,
0.02290538139641285,
0.01690605841577053,
0.045369066298007965,
-0.14347852766513824,
-0.07641860097646713,
-0.041906438767910004,
-0.04565022885799408,
-0.07123474776744843,
0.0511469766497612,
0.1101405918598175,
0.06813566386699677,
-0.018292635679244995,
0.06714203953742981,
-0.03753669559955597,
0.10779227316379547,
-0.09860462695360184,
-0.07551570236682892
] |
null | null | diffusers | ### My-Lion-xgb Dreambooth model trained by XKJKJHXJ following the "Build your own Gen AI model" session by NxtWave.
Project Submission Code: 820423622002
Sample pictures of this concept:


| {"license": "creativeml-openrail-m", "tags": ["NxtWave-GenAI-Webinar", "text-to-image", "stable-diffusion"]} | text-to-image | XKJKJHXJ/my-lion-xgb | [
"diffusers",
"safetensors",
"NxtWave-GenAI-Webinar",
"text-to-image",
"stable-diffusion",
"license:creativeml-openrail-m",
"endpoints_compatible",
"diffusers:StableDiffusionPipeline",
"region:us"
] | 2024-02-13T09:14:55+00:00 | [] | [] | TAGS
#diffusers #safetensors #NxtWave-GenAI-Webinar #text-to-image #stable-diffusion #license-creativeml-openrail-m #endpoints_compatible #diffusers-StableDiffusionPipeline #region-us
| ### My-Lion-xgb Dreambooth model trained by XKJKJHXJ following the "Build your own Gen AI model" session by NxtWave.
Project Submission Code: 820423622002
Sample pictures of this concept:
| [
"### My-Lion-xgb Dreambooth model trained by XKJKJHXJ following the \"Build your own Gen AI model\" session by NxtWave.\n\nProject Submission Code: 820423622002\n\nSample pictures of this concept:\n\n \n !0\n !1"
] | [
"TAGS\n#diffusers #safetensors #NxtWave-GenAI-Webinar #text-to-image #stable-diffusion #license-creativeml-openrail-m #endpoints_compatible #diffusers-StableDiffusionPipeline #region-us \n",
"### My-Lion-xgb Dreambooth model trained by XKJKJHXJ following the \"Build your own Gen AI model\" session by NxtWave.\n\nProject Submission Code: 820423622002\n\nSample pictures of this concept:\n\n \n !0\n !1"
] | [
73,
61
] | [
"passage: TAGS\n#diffusers #safetensors #NxtWave-GenAI-Webinar #text-to-image #stable-diffusion #license-creativeml-openrail-m #endpoints_compatible #diffusers-StableDiffusionPipeline #region-us \n### My-Lion-xgb Dreambooth model trained by XKJKJHXJ following the \"Build your own Gen AI model\" session by NxtWave.\n\nProject Submission Code: 820423622002\n\nSample pictures of this concept:\n\n \n !0\n !1"
] | [
-0.09972495585680008,
0.1396876871585846,
-0.001307588187046349,
0.01507405936717987,
0.059103406965732574,
-0.043390460312366486,
0.19504773616790771,
0.008234490640461445,
0.03484516963362694,
0.029355674982070923,
0.1212623119354248,
0.08151569217443466,
0.016207603737711906,
0.21143148839473724,
-0.040318895131349564,
-0.09162629395723343,
0.03803097829222679,
0.08349788933992386,
0.006381342187523842,
0.051762860268354416,
0.07728438824415207,
-0.08097448945045471,
0.1355699598789215,
-0.035279061645269394,
-0.18087884783744812,
-0.010474666021764278,
-0.030210785567760468,
-0.05548229068517685,
0.06150542199611664,
0.0399971604347229,
0.03712074086070061,
0.10677903890609741,
0.015507756732404232,
-0.04869725555181503,
0.03725495934486389,
0.014314346015453339,
-0.04870707169175148,
0.046805500984191895,
0.026819709688425064,
0.06558568775653839,
0.14201903343200684,
0.054694872349500656,
-0.07340651750564575,
0.027467554435133934,
-0.0747344046831131,
-0.06606213748455048,
0.008958752267062664,
0.14167025685310364,
0.13406480848789215,
0.07329759746789932,
0.006652934942394495,
0.10941065847873688,
0.06832412630319595,
0.12141337990760803,
0.16245850920677185,
-0.2623169422149658,
-0.09389500319957733,
0.1645413190126419,
0.11930093914270401,
0.0687195211648941,
-0.042341090738773346,
0.10929543524980545,
0.10598880052566528,
-0.027786538004875183,
0.04993749037384987,
-0.08031687885522842,
0.053039997816085815,
-0.06443779915571213,
-0.11943846195936203,
0.02499573864042759,
0.2231394350528717,
0.043009355664253235,
-0.0432109460234642,
-0.0566425584256649,
-0.09224135428667068,
0.02151188813149929,
-0.035982150584459305,
-0.023649323731660843,
-0.0427791066467762,
-0.002737312810495496,
-0.01691914163529873,
-0.06943587213754654,
-0.10424251854419708,
-0.06082478165626526,
-0.0024097992572933435,
0.18582142889499664,
0.00446403818204999,
0.07818493247032166,
-0.11383022367954254,
0.09399308264255524,
-0.0063231042586266994,
-0.12878569960594177,
0.012757686898112297,
-0.09906104952096939,
0.03739723190665245,
0.05787086486816406,
0.0658220425248146,
-0.05528866872191429,
0.05206182599067688,
-0.01576360873878002,
0.03433958441019058,
-0.030220702290534973,
0.024338847026228905,
0.09747209399938583,
0.028973782435059547,
-0.06174207106232643,
-0.09214691817760468,
-0.11823961138725281,
0.014599048532545567,
-0.04638702794909477,
0.019780665636062622,
-0.03343501314520836,
-0.08312349021434784,
0.0057694087736308575,
-0.07195701450109482,
0.05138971656560898,
0.029607413336634636,
0.0473262257874012,
0.008377429097890854,
-0.03709046542644501,
0.16755278408527374,
0.05282887816429138,
-0.012930667959153652,
-0.046392884105443954,
0.002883817534893751,
0.033946629613637924,
0.07399414479732513,
-0.01583636738359928,
0.0053752874955534935,
0.003913667984306812,
-0.08420537412166595,
-0.029466310515999794,
-0.04596319794654846,
-0.04324979707598686,
0.007089759688824415,
-0.15528613328933716,
0.045094236731529236,
-0.16595147550106049,
-0.09172999858856201,
0.05766897648572922,
0.05349389836192131,
0.0005652917316183448,
-0.055166278034448624,
-0.048092007637023926,
-0.10051457583904266,
0.0033846877049654722,
-0.021007776260375977,
-0.002786154393106699,
-0.012980705127120018,
0.043574512004852295,
0.03435857221484184,
0.09775196015834808,
-0.22727255523204803,
-0.002476580673828721,
-0.047694358974695206,
0.04404019936919212,
-0.009401744231581688,
-0.01113122422248125,
-0.06527562439441681,
0.05000758916139603,
0.0030749323777854443,
0.010603084228932858,
0.03206027299165726,
0.026195131242275238,
0.01993626356124878,
0.11434612423181534,
-0.1499631255865097,
-0.003969736397266388,
0.18909092247486115,
-0.13841387629508972,
-0.19545018672943115,
0.08082592487335205,
0.037288643419742584,
0.08743442595005035,
0.06649158149957657,
0.10675376653671265,
0.06256137788295746,
-0.21973712742328644,
-0.038060422986745834,
0.01856650412082672,
-0.12177077680826187,
-0.19326429069042206,
0.006924287881702185,
0.14370955526828766,
-0.031502410769462585,
0.01588706113398075,
-0.11059944331645966,
0.09924360364675522,
-0.08767370134592056,
-0.03189409151673317,
-0.04295041412115097,
-0.11922420561313629,
-0.05004604905843735,
-0.01741916686296463,
0.00435542780905962,
-0.01826709508895874,
0.004187361802905798,
-0.10695334523916245,
0.06873048841953278,
-0.03735867515206337,
-0.026378802955150604,
-0.1140846312046051,
0.06985063850879669,
-0.11834810674190521,
0.011715127155184746,
-0.0020144309382885695,
-0.026469307020306587,
0.053105805069208145,
0.10944667458534241,
-0.011836585588753223,
0.17150664329528809,
0.06495720148086548,
0.07590118795633316,
-0.022737102583050728,
-0.08602370321750641,
0.1020839512348175,
0.012630674988031387,
-0.061565980315208435,
-0.14913548529148102,
0.07449349761009216,
-0.06884046643972397,
-0.046121954917907715,
-0.18074160814285278,
0.02352800965309143,
-0.010660737752914429,
0.09625653922557831,
0.06538930535316467,
-0.0020958140958100557,
0.030152752995491028,
-0.005727013107389212,
-0.06474292278289795,
-0.00953388400375843,
0.047642581164836884,
0.022826073691248894,
-0.09246453642845154,
0.2216269075870514,
-0.15307630598545074,
0.19622796773910522,
0.08154141157865524,
-0.046858202666044235,
0.0058267610147595406,
0.08524052053689957,
-0.06837046146392822,
-0.007552927825599909,
0.009526497684419155,
-0.0015657276380807161,
0.00016151303134392947,
-0.04575793817639351,
0.10689960420131683,
-0.061538711190223694,
-0.0163321141153574,
0.06331503391265869,
-0.050590600818395615,
-0.03250729665160179,
0.08830762654542923,
0.06794318556785583,
-0.1261463463306427,
0.13085953891277313,
0.11863131821155548,
-0.006644703913480043,
0.19707059860229492,
0.02023329772055149,
-0.0008663904154673219,
-0.08152768015861511,
0.07400260120630264,
0.013409139588475227,
0.2332771122455597,
-0.0678531751036644,
0.019687505438923836,
0.01900111511349678,
-0.030509693548083305,
0.06276163458824158,
-0.08441101014614105,
-0.06379475444555283,
-0.010098082013428211,
-0.0382319912314415,
0.11596976965665817,
0.09655897319316864,
-0.1319931298494339,
0.10124752670526505,
-0.09821496158838272,
-0.11562240123748779,
0.027099907398223877,
-0.013495977967977524,
-0.046117912977933884,
0.07379613071680069,
-0.033582888543605804,
-0.17922307550907135,
-0.11946479231119156,
-0.09312039613723755,
-0.049169499427080154,
-0.008914140984416008,
0.04861206188797951,
-0.02288760431110859,
-0.03334484621882439,
-0.07921163737773895,
-0.08136196434497833,
-0.05013326182961464,
0.01761575974524021,
0.05969199538230896,
0.017750965431332588,
-0.01946161314845085,
-0.056317828595638275,
0.005638350732624531,
-0.036511436104774475,
0.009855453856289387,
0.09910257905721664,
0.03568466007709503,
0.17054250836372375,
0.09227404743432999,
-0.00016644792049191892,
-0.014351327903568745,
0.017649348825216293,
0.20117168128490448,
-0.04434289038181305,
0.10475480556488037,
0.12395184487104416,
0.032202355563640594,
0.05679270997643471,
0.13406263291835785,
0.01678546704351902,
-0.08209814131259918,
0.03617812693119049,
-0.05653921514749527,
-0.10134215652942657,
-0.12288536131381989,
-0.0543811097741127,
-0.0512002557516098,
0.15048320591449738,
-0.03243262320756912,
0.07132846862077713,
0.09460022300481796,
0.14194418489933014,
0.006317631341516972,
0.0034498318564146757,
-0.05665339156985283,
0.07976629585027695,
-0.09120440483093262,
-0.037337467074394226,
0.032735634595155716,
-0.08752309530973434,
-0.04599611461162567,
0.09283579140901566,
0.04466860741376877,
0.13411688804626465,
0.023056160658597946,
0.05974258854985237,
0.10173213481903076,
0.08530135452747345,
0.1405482441186905,
0.09490823745727539,
-0.04599767550826073,
-0.05162292346358299,
-0.01771215721964836,
-0.09092994034290314,
0.0970466136932373,
0.05699724331498146,
-0.05744832009077072,
-0.029293067753314972,
0.06808607280254364,
0.06341922283172607,
-0.020928628742694855,
0.08356878161430359,
0.0976305827498436,
-0.22013403475284576,
0.03555573895573616,
0.04244265705347061,
0.042563341557979584,
-0.07206059992313385,
0.0034333092626184225,
0.2823881208896637,
-0.015822680667042732,
0.05613155663013458,
-0.03046499751508236,
0.07971328496932983,
0.03171004727482796,
-0.012874835170805454,
-0.06214578077197075,
0.023430485278367996,
-0.009155750274658203,
0.01907406374812126,
-0.2381693571805954,
0.1604180485010147,
-0.01175896730273962,
0.0745939314365387,
-0.00718162814155221,
-0.03090674616396427,
-0.024847237393260002,
0.12195021659135818,
0.17439471185207367,
0.020795324817299843,
-0.043835435062646866,
-0.02425769530236721,
-0.12538571655750275,
0.03318782150745392,
0.016108227893710136,
0.011162797920405865,
0.04043714329600334,
0.08678936958312988,
-0.04184408485889435,
0.00032717938302084804,
0.02490818314254284,
-0.18138791620731354,
-0.08572528511285782,
0.005735989194363356,
0.23065060377120972,
0.02567697875201702,
-0.03845533728599548,
0.03634030744433403,
-0.03480324149131775,
0.09934929013252258,
-0.19899140298366547,
-0.08118179440498352,
-0.06952395290136337,
-0.09649205952882767,
-0.009783156216144562,
-0.06061060354113579,
-0.011613738723099232,
-0.06363189220428467,
0.08056962490081787,
-0.059479519724845886,
-0.12129012495279312,
0.019668323919177055,
-0.15673509240150452,
-0.09088223427534103,
-0.09456173330545425,
0.04741369187831879,
0.04945218935608864,
-0.023839440196752548,
0.004617556929588318,
-0.06979484111070633,
-0.040000464767217636,
-0.10821117460727692,
0.04496108368039131,
0.08250743895769119,
-0.10931839793920517,
-0.04220898076891899,
-0.0559663362801075,
-0.06597078591585159,
-0.02989054098725319,
-0.05747019499540329,
0.06816496700048447,
0.26474082469940186,
-0.06508909910917282,
0.049492672085762024,
0.2140355408191681,
-0.06748595088720322,
-0.22972384095191956,
-0.12567438185214996,
-0.06061824783682823,
-0.018879279494285583,
-0.004602924454957247,
-0.08901138603687286,
0.13080012798309326,
0.01619448885321617,
-0.04554979130625725,
0.25106289982795715,
-0.23753996193408966,
-0.038334764540195465,
0.021513354033231735,
0.13614654541015625,
0.27733391523361206,
-0.15292391180992126,
-0.03159574419260025,
-0.012814449146389961,
-0.1313517540693283,
0.18080711364746094,
-0.0054670884273946285,
0.0584469810128212,
-0.06370490044355392,
0.01734229549765587,
-0.019028887152671814,
-0.048751406371593475,
0.08683472871780396,
-0.0371481329202652,
0.08161064982414246,
-0.0751112699508667,
0.028753960505127907,
0.14810962975025177,
-0.031895942986011505,
0.04683423414826393,
-0.15214300155639648,
0.032599981874227524,
-0.12248463928699493,
-0.003305829595774412,
-0.042694807052612305,
0.024716949090361595,
-0.05019174516201019,
-0.10415225476026535,
-0.05029311403632164,
-0.012454194948077202,
-0.011857313103973866,
0.04108360782265663,
-0.017304999753832817,
0.006424656603485346,
-0.007149083074182272,
0.15558254718780518,
0.03283800557255745,
-0.041949257254600525,
-0.011547195725142956,
-0.08137566596269608,
-0.0399131216108799,
0.13175855576992035,
-0.05200241506099701,
-0.022519774734973907,
0.09987631440162659,
-0.0031939588952809572,
0.04438795894384384,
0.027277572080492973,
-0.06082027032971382,
0.057238876819610596,
0.09578121453523636,
-0.18277637660503387,
-0.1223982498049736,
-0.034280333667993546,
0.16975335776805878,
0.09598612040281296,
0.11609319597482681,
0.11986927688121796,
-0.09417214244604111,
0.034377314150333405,
-0.057761795818805695,
0.022577879950404167,
-0.04035618156194687,
0.05736140161752701,
0.012262719683349133,
0.04716140404343605,
-0.06477769464254379,
0.028788747265934944,
-0.04007923603057861,
-0.031169766560196877,
-0.022677117958664894,
0.05196345970034599,
-0.07792334258556366,
-0.07705064117908478,
0.05165671557188034,
0.20525890588760376,
-0.11863716691732407,
-0.099049411714077,
-0.016870805993676186,
-0.07119987159967422,
0.023023594170808792,
0.11398220807313919,
0.0016097835032269359,
0.024292556568980217,
0.057823602110147476,
-0.00581745570525527,
-0.07856081426143646,
0.05934206023812294,
-0.02394956909120083,
0.11936940997838974,
-0.22755080461502075,
-0.060025278478860855,
-0.008330310694873333,
0.05264277756214142,
-0.07911336421966553,
0.0060958475805819035,
-0.09532345086336136,
0.017823832109570503,
-0.036081671714782715,
0.07260762155056,
-0.10051693022251129,
-0.07255638390779495,
-0.04043557494878769,
-0.005318512208759785,
-0.05361943691968918,
0.03510519862174988,
-0.041086576879024506,
0.05353359505534172,
0.053559113293886185,
-0.0025641038082540035,
-0.017917437478899956,
-0.022859113290905952,
-0.030642036348581314,
-0.041149213910102844,
0.07523636519908905,
-0.015300377272069454,
-0.08589146286249161,
-0.04500459507107735,
-0.23462705314159393,
0.01969527080655098,
0.08683054894208908,
0.005791565403342247,
0.02092982642352581,
0.08998998999595642,
0.009498799219727516,
0.0261330995708704,
0.034615181386470795,
-0.04737046733498573,
-0.004775101784616709,
-0.08838118612766266,
-0.011254329234361649,
-0.022726338356733322,
-0.003379305824637413,
-0.05806918814778328,
-0.0036890541668981314,
0.10331882536411285,
0.06165264546871185,
0.1195102334022522,
-0.07950142025947571,
0.03192610293626785,
-0.040691934525966644,
0.0387238934636116,
0.08920673280954361,
-0.0812276154756546,
0.06144650653004646,
-0.03396974131464958,
-0.022052466869354248,
-0.01268579438328743,
0.10024234652519226,
-0.06335403025150299,
-0.2672671973705292,
-0.036842141300439835,
-0.19369840621948242,
-0.04991777241230011,
-0.013785061426460743,
0.26727429032325745,
0.00715539138764143,
0.013887855224311352,
-0.11795198917388916,
0.055951736867427826,
0.0780239924788475,
0.09040763974189758,
0.0029200578574091196,
0.08838842809200287,
0.01549359131604433,
0.08642268925905228,
0.06768148392438889,
-0.019297460094094276,
-0.05809498578310013,
-0.014309301041066647,
-0.16545192897319794,
0.1203736960887909,
-0.033566784113645554,
0.078482985496521,
0.15509860217571259,
0.009649019688367844,
-0.029381101951003075,
0.09552336484193802,
-0.013332446105778217,
-0.040425557643175125,
-0.17554350197315216,
-0.062780000269413,
-0.09192068874835968,
0.019694149494171143,
-0.052874404937028885,
-0.012313156388700008,
-0.05371568724513054,
0.047629185020923615,
-0.07968665659427643,
0.054637789726257324,
0.09755396097898483,
-0.011759479530155659,
0.0970541462302208,
-0.0054474747739732265,
-0.06404759734869003,
0.025168048217892647,
0.047906748950481415,
-0.014884940348565578,
0.019967496395111084,
-0.009982810355722904,
0.05933300033211708,
-0.03860839083790779,
0.07662614434957504,
0.035030387341976166,
-0.05953779071569443,
-0.03310314193367958,
-0.01581592485308647,
0.029591847211122513,
0.07506690919399261,
0.022763747721910477,
-0.027616538107395172,
0.0177958682179451,
0.07760626077651978,
-0.0022892188280820847,
-0.03641465678811073,
-0.05021504685282707,
0.04744032770395279,
-0.13309495151042938,
0.06607437878847122,
-0.05807620286941528,
0.0018729883013293147,
-0.06100183352828026,
0.22742445766925812,
0.13763725757598877,
-0.07220286875963211,
0.010390188544988632,
-0.09279115498065948,
0.014560630545020103,
-0.08628979325294495,
0.08856944739818573,
0.027380242943763733,
0.2979317605495453,
-0.0455271415412426,
-0.02133316360414028,
-0.10523584485054016,
-0.022153150290250778,
-0.05918426066637039,
-0.07433245331048965,
-0.003471019444987178,
-0.03352634236216545,
-0.0921512022614479,
0.058238569647073746,
-0.1967972069978714,
-0.025066571310162544,
0.06398633122444153,
-0.0006029411451891065,
0.0017768045654520392,
-0.016017485409975052,
0.1145465150475502,
0.02307361550629139,
0.019705263897776604,
-0.10174187272787094,
0.02335573360323906,
0.03351951390504837,
-0.03031923808157444,
-0.07228975743055344,
0.06528522074222565,
-0.012258583679795265,
-0.15343517065048218,
0.17177656292915344,
-0.014180171303451061,
0.010410752147436142,
0.07799536734819412,
-0.07384985685348511,
-0.13676871359348297,
0.12125959247350693,
-0.04093492031097412,
-0.09119324386119843,
-0.026536500081419945,
0.12674500048160553,
-0.006320111453533173,
0.042498886585235596,
-0.004441617056727409,
-0.05354538559913635,
-0.04653789848089218,
0.09850933402776718,
0.024293113499879837,
-0.09284669905900955,
0.05019992217421532,
-0.02700311318039894,
0.09795806556940079,
-0.02800842374563217,
-0.04679201915860176,
-0.015456583350896835,
-0.03232462704181671,
0.05268285423517227,
-0.0011182470479980111,
-0.00795888900756836,
0.06258341670036316,
-0.14796067774295807,
-0.022501206025481224,
0.07392314821481705,
0.06715290248394012,
-0.1757032424211502,
0.034888945519924164,
-0.17824722826480865,
-0.00797492079436779,
-0.04128590598702431,
-0.014931621961295605,
0.2524835765361786,
0.017127670347690582,
-0.00015911750961095095,
-0.09367632120847702,
-0.02635676972568035,
0.05190243199467659,
-0.014812711626291275,
-0.12496388703584671
] |
null | null | null | ### My-Pet-CAT Dreambooth model trained by shafi4 following the "Build your own Gen AI model" session by NxtWave.
Project Submission Code: 21KT1A0559
Sample pictures of this concept:
| {"license": "creativeml-openrail-m", "tags": ["NxtWave-GenAI-Webinar", "text-to-image", "stable-diffusion"]} | text-to-image | shafi4/my-pet-cat | [
"safetensors",
"NxtWave-GenAI-Webinar",
"text-to-image",
"stable-diffusion",
"license:creativeml-openrail-m",
"region:us"
] | 2024-02-13T09:19:09+00:00 | [] | [] | TAGS
#safetensors #NxtWave-GenAI-Webinar #text-to-image #stable-diffusion #license-creativeml-openrail-m #region-us
| ### My-Pet-CAT Dreambooth model trained by shafi4 following the "Build your own Gen AI model" session by NxtWave.
Project Submission Code: 21KT1A0559
Sample pictures of this concept:
| [
"### My-Pet-CAT Dreambooth model trained by shafi4 following the \"Build your own Gen AI model\" session by NxtWave.\n\nProject Submission Code: 21KT1A0559\n\nSample pictures of this concept:"
] | [
"TAGS\n#safetensors #NxtWave-GenAI-Webinar #text-to-image #stable-diffusion #license-creativeml-openrail-m #region-us \n",
"### My-Pet-CAT Dreambooth model trained by shafi4 following the \"Build your own Gen AI model\" session by NxtWave.\n\nProject Submission Code: 21KT1A0559\n\nSample pictures of this concept:"
] | [
48,
52
] | [
"passage: TAGS\n#safetensors #NxtWave-GenAI-Webinar #text-to-image #stable-diffusion #license-creativeml-openrail-m #region-us \n### My-Pet-CAT Dreambooth model trained by shafi4 following the \"Build your own Gen AI model\" session by NxtWave.\n\nProject Submission Code: 21KT1A0559\n\nSample pictures of this concept:"
] | [
-0.09547961503267288,
0.25353506207466125,
-0.0018489903304725885,
-0.02238844893872738,
0.054078008979558945,
-0.011454240418970585,
0.18957339227199554,
0.01447191834449768,
0.13257694244384766,
0.044047772884368896,
0.15994460880756378,
0.03299612179398537,
0.03689482808113098,
0.20571216940879822,
0.0692005604505539,
-0.1162906065583229,
0.12626761198043823,
0.029165277257561684,
0.023066578432917595,
0.044265277683734894,
0.041151855140924454,
-0.09159304201602936,
0.09772692620754242,
-0.00770975835621357,
-0.10574448108673096,
-0.04036026820540428,
-0.09201618283987045,
0.018889011815190315,
0.055462080985307693,
-0.030339177697896957,
0.09922811388969421,
0.12956340610980988,
0.051235590130090714,
-0.04151799529790878,
0.04466564208269119,
0.009884725324809551,
-0.0646790936589241,
0.060330621898174286,
0.13719141483306885,
-0.01771717518568039,
0.12051450461149216,
0.12842676043510437,
-0.06249016150832176,
0.05749770626425743,
-0.02876281924545765,
-0.05645553022623062,
0.05148611590266228,
0.08219803124666214,
0.08160552382469177,
0.08931811153888702,
-0.03184182196855545,
0.10620038956403732,
0.003385860938578844,
0.12523247301578522,
0.09665083140134811,
-0.22677192091941833,
-0.0896461084485054,
0.19680823385715485,
0.03197653591632843,
0.014120278880000114,
-0.03949049487709999,
0.10801128298044205,
0.09687908738851547,
-0.04368773475289345,
0.04318151995539665,
-0.013394998386502266,
0.005985800642520189,
-0.12824803590774536,
-0.12850037217140198,
0.012291277758777142,
0.2021903097629547,
0.07873567193746567,
-0.019043942913413048,
0.027593944221735,
-0.10538825392723083,
-0.03747764974832535,
-0.05298684537410736,
-0.030707556754350662,
-0.03269298002123833,
0.028267983347177505,
0.004501886200159788,
-0.04873200133442879,
-0.11521150171756744,
-0.07376881688833237,
-0.013854012824594975,
0.083962082862854,
0.013028820976614952,
0.04648859053850174,
-0.091460220515728,
0.09974288195371628,
0.02092115208506584,
-0.07996858656406403,
0.03240220621228218,
-0.0783892497420311,
0.01023143995553255,
0.052649203687906265,
0.03792322054505348,
-0.003911761566996574,
0.12836119532585144,
-0.030964544042944908,
0.11465174704790115,
-0.019290320575237274,
0.03021467849612236,
0.06849697232246399,
0.002831693971529603,
-0.07448821514844894,
-0.1222713515162468,
-0.10641268640756607,
-0.009534374810755253,
-0.0035368804819881916,
-0.034346334636211395,
-0.040915295481681824,
-0.11599434912204742,
0.06552679091691971,
-0.014443988911807537,
0.0568758025765419,
0.04521942511200905,
0.06408562511205673,
-0.008617295883595943,
-0.01919848844408989,
0.15193291008472443,
0.06769895553588867,
-0.030146628618240356,
0.017901888117194176,
-0.004051376599818468,
-0.07809256762266159,
0.030030744150280952,
-0.016185937449336052,
0.005185781978070736,
-0.0516275130212307,
-0.11507415026426315,
-0.04818340390920639,
-0.07481800019741058,
-0.03905818611383438,
-0.017282312735915184,
-0.07969463616609573,
0.042211491614580154,
-0.17765864729881287,
-0.08621610701084137,
0.03244175761938095,
0.046953197568655014,
-0.024449985474348068,
-0.043310876935720444,
-0.09834349900484085,
-0.10454043000936508,
-0.043187499046325684,
-0.015534510836005211,
0.0022490224801003933,
-0.03799053281545639,
0.028436977416276932,
-0.07907894253730774,
0.09517651051282883,
-0.24657154083251953,
0.008322905749082565,
-0.13536734879016876,
0.0007534359465353191,
0.10374633967876434,
-0.04709743708372116,
-0.03511608764529228,
0.1409253478050232,
0.00708203436806798,
0.01991528458893299,
-0.058987535536289215,
-0.015730194747447968,
-0.010469228029251099,
0.15121521055698395,
-0.09499367326498032,
0.04957381263375282,
0.13827016949653625,
-0.11315277218818665,
-0.18149614334106445,
0.11004570871591568,
0.05856567993760109,
0.16485629975795746,
0.0937780886888504,
0.20285075902938843,
0.11795686930418015,
-0.08668883144855499,
-0.06561484932899475,
0.014264681376516819,
-0.13914905488491058,
-0.13994020223617554,
0.0075520980171859264,
0.1381382793188095,
-0.16657456755638123,
0.009767400100827217,
-0.08162441849708557,
0.08721314370632172,
-0.10528197884559631,
-0.04488412290811539,
0.005229524802416563,
-0.16520041227340698,
-0.02894432656466961,
0.02089804597198963,
0.045976750552654266,
-0.054824404418468475,
0.05686233937740326,
-0.18452343344688416,
0.059204526245594025,
-0.01829191856086254,
-0.04425078630447388,
-0.11841804534196854,
0.0974939689040184,
0.0012866788310930133,
0.035234738141298294,
-0.03216048330068588,
-0.11322356015443802,
0.04892536252737045,
0.050328329205513,
0.039218224585056305,
0.18901170790195465,
0.024058017879724503,
0.07760381698608398,
0.016206961125135422,
-0.09564097225666046,
0.12002600729465485,
0.026707082986831665,
-0.06121134012937546,
-0.15659047663211823,
0.10809271037578583,
-0.062471792101860046,
0.023939888924360275,
-0.14366693794727325,
0.05862204730510712,
0.052340760827064514,
0.12547241151332855,
0.010539122857153416,
-0.011606345884501934,
0.025142954662442207,
-0.03341492637991905,
-0.06681415438652039,
-0.011664347723126411,
0.07437852770090103,
0.011784697882831097,
-0.12037529051303864,
0.12708653509616852,
-0.15458378195762634,
0.21445608139038086,
0.11728597432374954,
0.0008884224225766957,
-0.008565494790673256,
0.10504459589719772,
-0.03983873128890991,
0.018784256651997566,
0.0069641838781535625,
-0.006975631695240736,
-0.09327427297830582,
-0.0784691870212555,
0.0962335616350174,
-0.0727536752820015,
0.009358699433505535,
0.05490799620747566,
-0.03530978038907051,
-0.01962529867887497,
0.0982981026172638,
0.021541204303503036,
-0.2066190391778946,
0.14342646300792694,
0.1358330249786377,
0.02324681729078293,
0.20283451676368713,
0.09703186899423599,
0.01748192124068737,
-0.04437931627035141,
0.08450046181678772,
-0.024854302406311035,
0.2663766145706177,
-0.07610873132944107,
0.06747955083847046,
0.022405026480555534,
-0.011743350885808468,
0.062483642250299454,
-0.1636253446340561,
-0.0690295621752739,
-0.06355898827314377,
-0.04418047517538071,
0.03590799495577812,
0.04377079755067825,
-0.12688180804252625,
0.12650510668754578,
-0.11123542487621307,
-0.21121883392333984,
0.012694185599684715,
-0.021815087646245956,
-0.07067467272281647,
0.09381256997585297,
-0.06354116648435593,
-0.26179781556129456,
-0.08222953230142593,
-0.0191984660923481,
-0.02923029288649559,
0.02699071541428566,
0.04349489510059357,
-0.07158315181732178,
-0.026774462312459946,
-0.11252491176128387,
-0.10827389359474182,
-0.11415162682533264,
0.029966523870825768,
0.0039045666344463825,
0.05193804204463959,
-0.003087997902184725,
-0.015793709084391594,
0.017304597422480583,
-0.06851417571306229,
-0.0032795369625091553,
0.11345068365335464,
-0.034862928092479706,
0.1761433184146881,
0.12276137620210648,
-0.021800871938467026,
-0.02114497311413288,
-0.01571739837527275,
0.25417619943618774,
-0.07039039582014084,
0.06351169943809509,
0.026701470836997032,
0.05590655282139778,
0.07723276317119598,
0.2276604175567627,
0.05039052665233612,
-0.12659236788749695,
0.05778742581605911,
-0.05938371270895004,
-0.1272418200969696,
-0.09691938012838364,
-0.08829604089260101,
-0.023316066712141037,
0.1688586175441742,
-0.00699664605781436,
0.08567237108945847,
0.1263732612133026,
0.17711040377616882,
-0.039121270179748535,
-0.1243656724691391,
-0.06159660592675209,
0.08623945713043213,
-0.09709648787975311,
-0.07221399992704391,
0.05102771148085594,
-0.0881841853260994,
-0.0662071704864502,
0.06294243782758713,
0.04273116961121559,
0.16912920773029327,
0.09927545487880707,
-0.01815878413617611,
0.056890323758125305,
0.19951127469539642,
0.0945781022310257,
0.10638222098350525,
0.003323080949485302,
-0.07275906950235367,
-0.06041539087891579,
-0.061866480857133865,
0.1292058378458023,
0.10407136380672455,
-0.02751454897224903,
0.009401773102581501,
0.06459573656320572,
0.007313128095120192,
0.0012250760337337852,
0.05276276543736458,
0.11556842923164368,
-0.2760794162750244,
0.036455150693655014,
0.0041832649149000645,
0.07325902581214905,
-0.06795290112495422,
0.02308639883995056,
0.20676390826702118,
0.012150619179010391,
0.0358944870531559,
-0.02045566588640213,
0.0513320155441761,
0.1307549625635147,
0.02781485766172409,
-0.13926979899406433,
-0.003972496371716261,
-0.029116671532392502,
0.08771611750125885,
-0.14587350189685822,
0.212310791015625,
-0.030976427718997,
0.025260139256715775,
-0.010486859828233719,
-0.07380624860525131,
0.012519841082394123,
0.21756011247634888,
0.18219827115535736,
0.011012188158929348,
-0.09386061877012253,
-0.10609577596187592,
-0.11580400168895721,
0.0492774099111557,
0.06263745576143265,
-0.06452884525060654,
0.03351180627942085,
0.08142927289009094,
-0.031895171850919724,
-0.00682805897668004,
0.01893249899148941,
-0.15010185539722443,
-0.05893645063042641,
0.0000729403254808858,
0.22454652190208435,
0.12791424989700317,
-0.00960833951830864,
0.005851740948855877,
-0.03566362336277962,
0.03461975231766701,
-0.232415109872818,
-0.03023596666753292,
-0.03358977288007736,
-0.095687635242939,
-0.014447687193751335,
-0.027621768414974213,
0.005253656767308712,
-0.09090597927570343,
0.03840838745236397,
-0.031026341021060944,
-0.09432712197303772,
0.027298152446746826,
-0.19375188648700714,
-0.1394263654947281,
-0.14464855194091797,
0.02101743593811989,
0.05051935091614723,
0.03597232326865196,
0.017121553421020508,
-0.0519263781607151,
-0.056084271520376205,
-0.07896021753549576,
0.01364215649664402,
0.03574822098016739,
-0.12076897919178009,
-0.04350867494940758,
-0.058423496782779694,
-0.10483023524284363,
-0.1140572652220726,
-0.051864758133888245,
0.040195293724536896,
0.28337186574935913,
-0.0915888100862503,
0.05505519360303879,
0.27678340673446655,
-0.0437219962477684,
-0.2662002742290497,
-0.1147283986210823,
-0.09919583052396774,
-0.03915611654520035,
0.04565137252211571,
-0.1467575877904892,
0.11186346411705017,
0.018089504912495613,
-0.09563473612070084,
0.19707630574703217,
-0.2037992626428604,
-0.06229244917631149,
0.09696705639362335,
0.15759779512882233,
0.3140411972999573,
-0.15072345733642578,
-0.010598338209092617,
-0.001925932359881699,
0.06588217616081238,
0.2707287073135376,
0.0034977186005562544,
0.056670404970645905,
-0.004437311086803675,
-0.015444249846041203,
0.007120522204786539,
0.007132567465305328,
0.14606855809688568,
-0.0364803746342659,
0.05682205408811569,
-0.056893542408943176,
0.1018909141421318,
0.11983097344636917,
-0.01596503145992756,
0.03474915772676468,
-0.08460540324449539,
-0.0001950619334820658,
-0.02998114563524723,
0.002701233606785536,
-0.027256246656179428,
0.02312183752655983,
-0.018848750740289688,
-0.12498171627521515,
-0.14560478925704956,
0.03096415475010872,
0.01523195393383503,
0.02596771903336048,
-0.016818422824144363,
-0.0030992208048701286,
-0.022938279435038567,
0.14586690068244934,
0.0524410605430603,
-0.10364656150341034,
0.0840342715382576,
-0.06336542963981628,
-0.08547906577587128,
0.15460991859436035,
-0.0017361087957397103,
-0.059438373893499374,
0.1080421581864357,
-0.017498720437288284,
0.011125141754746437,
0.021545855328440666,
-0.07313203066587448,
-0.001036127214320004,
0.1567562371492386,
-0.14676830172538757,
-0.2742615342140198,
-0.014387478120625019,
0.20851801335811615,
0.08484310656785965,
0.11139822751283646,
0.07634060084819794,
-0.09325097501277924,
0.027982281520962715,
-0.04581218957901001,
0.008522002957761288,
0.05205044522881508,
0.04185360670089722,
-0.0036660218611359596,
0.04562455415725708,
-0.08393383771181107,
0.03003285452723503,
-0.00022160749358590692,
-0.068634994328022,
-0.019513854756951332,
-0.011731849052011967,
-0.09669081121683121,
-0.08295559138059616,
-0.0091831274330616,
0.11541572213172913,
-0.07697582244873047,
-0.10813801735639572,
-0.09166844189167023,
-0.06726289540529251,
-0.0036985380575060844,
0.10757602751255035,
0.0483408197760582,
0.02322346158325672,
0.0702114999294281,
-0.002956319833174348,
-0.04178265854716301,
0.058308620005846024,
-0.05558779090642929,
0.08543073385953903,
-0.23013797402381897,
-0.06625588983297348,
0.04185499995946884,
0.012265292927622795,
-0.08266119658946991,
-0.03839268907904625,
-0.08399738371372223,
0.011980336159467697,
0.06569743901491165,
0.08209626376628876,
-0.11839553713798523,
-0.07107725739479065,
-0.023463066667318344,
-0.009220258332788944,
-0.0658901184797287,
-0.007247254252433777,
-0.07907800376415253,
0.07325715571641922,
0.04737149178981781,
0.053906604647636414,
0.009858413599431515,
0.005842986982315779,
-0.005777533166110516,
0.00140050845220685,
0.0690428614616394,
-0.042961105704307556,
-0.09257708489894867,
0.000853674893733114,
-0.26767852902412415,
0.01900605857372284,
0.08040979504585266,
0.028199253603816032,
-0.02546747401356697,
0.11012955754995346,
-0.05305292457342148,
0.0006882853340357542,
0.04677275940775871,
-0.001900176634080708,
0.01564609259366989,
-0.09965856373310089,
-0.05491314455866814,
-0.035466160625219345,
-0.0032803742215037346,
-0.037946850061416626,
-0.05174020305275917,
0.04854729026556015,
0.00017836202459875494,
0.14091502130031586,
-0.07418335974216461,
0.03471099212765694,
-0.052653051912784576,
0.03362492099404335,
0.049863144755363464,
-0.06466545909643173,
-0.04413313418626785,
-0.07180548459291458,
-0.022540250793099403,
-0.006754038389772177,
0.11561085283756256,
-0.023204592987895012,
-0.2218787521123886,
-0.03383282944560051,
-0.03584466874599457,
0.0052977558225393295,
-0.02533135749399662,
0.2418231964111328,
0.02876042015850544,
0.013867263682186604,
-0.11028389632701874,
0.05268021672964096,
0.04617733135819435,
0.12324792891740799,
0.03856983408331871,
0.03583556041121483,
0.07423122227191925,
0.08614230901002884,
0.052188657224178314,
0.02494022250175476,
-0.039424020797014236,
0.03417932614684105,
-0.10481926798820496,
0.09242160618305206,
-0.006123917642980814,
0.09787748754024506,
0.18331782519817352,
0.03532203659415245,
-0.014136651530861855,
0.07399103045463562,
0.007209408562630415,
-0.06188768893480301,
-0.21427448093891144,
-0.054076891392469406,
-0.1040731742978096,
-0.0012606766540557146,
-0.049602825194597244,
-0.050624992698431015,
0.05008653551340103,
0.034151893109083176,
-0.059962570667266846,
0.10740384459495544,
0.020027227699756622,
0.00026991203776560724,
0.04640093073248863,
-0.008217507973313332,
-0.038801390677690506,
-0.0076259891502559185,
-0.04688479006290436,
0.03485659137368202,
0.04842747002840042,
-0.020499901846051216,
0.04392484575510025,
-0.0007961581577546895,
0.004726836923509836,
0.004321379587054253,
-0.05698918551206589,
-0.054579220712184906,
-0.020464526489377022,
0.0036472282372415066,
0.08857528120279312,
0.011675035580992699,
-0.06720235198736191,
0.018061060458421707,
0.0859602615237236,
0.007250037509948015,
0.03493493050336838,
-0.03685729578137398,
0.2033536434173584,
-0.1462218463420868,
0.06784335523843765,
-0.011096864938735962,
0.008929689414799213,
-0.07135681807994843,
0.21798011660575867,
0.16453170776367188,
-0.11236017197370529,
-0.02115137130022049,
-0.10856062918901443,
0.006382573861628771,
-0.11509915441274643,
0.09628475457429886,
0.007635950110852718,
0.23573602735996246,
-0.015317898243665695,
0.01928718201816082,
-0.10283934324979782,
-0.012185611762106419,
-0.0862220972776413,
-0.12147442251443863,
0.05558378994464874,
0.0032764466013759375,
-0.14891377091407776,
0.09315911680459976,
-0.17477400600910187,
0.0008322689100168645,
0.07671834528446198,
0.02194095589220524,
-0.04367318004369736,
0.006935780867934227,
0.09561288356781006,
0.038897477090358734,
0.055641405284404755,
-0.1571713536977768,
0.054358597844839096,
-0.007965808734297752,
-0.05510478839278221,
-0.12040819227695465,
0.017287123948335648,
-0.02322465553879738,
-0.22875912487506866,
0.33589982986450195,
-0.026835564523935318,
-0.008864259347319603,
0.06618563830852509,
-0.11199450492858887,
-0.18019725382328033,
0.12798283994197845,
0.0035751129034906626,
-0.025230824947357178,
0.011243719607591629,
0.11488310247659683,
0.008774958550930023,
0.05290519446134567,
0.06602800637483597,
-0.040679123252630234,
-0.020139489322900772,
0.04733632504940033,
0.006268169265240431,
-0.06127820163965225,
0.07090341299772263,
-0.03925975784659386,
0.06693403422832489,
-0.0035975670907646418,
-0.06388943642377853,
-0.03811018913984299,
0.011039748787879944,
0.011853517033159733,
0.000985650229267776,
-0.06539580225944519,
0.057182446122169495,
-0.1474834382534027,
-0.04158303514122963,
0.027892757207155228,
0.0070481617003679276,
-0.2162504494190216,
-0.012307614088058472,
-0.19081655144691467,
0.019297722727060318,
-0.03286036103963852,
0.00230273581109941,
0.1682763695716858,
-0.0021381613332778215,
0.014714251272380352,
-0.08402314782142639,
-0.06394803524017334,
0.025529900565743446,
0.006137564778327942,
-0.16494140028953552
] |
null | null | transformers |
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
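Until the authors fill this section in, the sketch below shows the standard 🤗 Transformers loading pattern for this repository. It is a hedged placeholder: the repository id comes from this row's metadata, but the dtype, device placement, and chat-template usage are assumptions, not verified against the checkpoint.

```python
# Hedged sketch: standard Transformers loading pattern for a Mistral-style
# instruct fine-tune. Only the repo id comes from this card; everything
# else (dtype, device_map, chat template) is an assumption.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "shankz7/mistral_7B_instruct_v0.2_customer_specification_finetune"

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(
    repo_id,
    torch_dtype=torch.bfloat16,  # assumption: adjust to your hardware
    device_map="auto",           # requires `accelerate` to be installed
)

# Mistral-Instruct checkpoints usually ship a chat template; fall back to
# plain-text prompting if this one does not.
messages = [{"role": "user", "content": "Summarize the customer specification."}]
inputs = tokenizer.apply_chat_template(messages, return_tensors="pt").to(model.device)
outputs = model.generate(inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```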
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed]
| {"library_name": "transformers", "tags": []} | null | shankz7/mistral_7B_instruct_v0.2_customer_specification_finetune | [
"transformers",
"safetensors",
"arxiv:1910.09700",
"endpoints_compatible",
"region:us"
] | 2024-02-13T09:19:27+00:00 | [
"1910.09700"
] | [] | TAGS
#transformers #safetensors #arxiv-1910.09700 #endpoints_compatible #region-us
|
# Model Card for Model ID
## Model Details
### Model Description
This is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.
- Developed by:
- Funded by [optional]:
- Shared by [optional]:
- Model type:
- Language(s) (NLP):
- License:
- Finetuned from model [optional]:
### Model Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Downstream Use [optional]
### Out-of-Scope Use
## Bias, Risks, and Limitations
### Recommendations
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
## Training Details
### Training Data
### Training Procedure
#### Preprocessing [optional]
#### Training Hyperparameters
- Training regime:
#### Speeds, Sizes, Times [optional]
## Evaluation
### Testing Data, Factors & Metrics
#### Testing Data
#### Factors
#### Metrics
### Results
#### Summary
## Model Examination [optional]
## Environmental Impact
Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).
- Hardware Type:
- Hours used:
- Cloud Provider:
- Compute Region:
- Carbon Emitted:
## Technical Specifications [optional]
### Model Architecture and Objective
### Compute Infrastructure
#### Hardware
#### Software
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Model Card Authors [optional]
## Model Card Contact
| [
"# Model Card for Model ID",
"## Model Details",
"### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:",
"### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Downstream Use [optional]",
"### Out-of-Scope Use",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.",
"## How to Get Started with the Model\n\nUse the code below to get started with the model.",
"## Training Details",
"### Training Data",
"### Training Procedure",
"#### Preprocessing [optional]",
"#### Training Hyperparameters\n\n- Training regime:",
"#### Speeds, Sizes, Times [optional]",
"## Evaluation",
"### Testing Data, Factors & Metrics",
"#### Testing Data",
"#### Factors",
"#### Metrics",
"### Results",
"#### Summary",
"## Model Examination [optional]",
"## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:",
"## Technical Specifications [optional]",
"### Model Architecture and Objective",
"### Compute Infrastructure",
"#### Hardware",
"#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Model Card Authors [optional]",
"## Model Card Contact"
] | [
"TAGS\n#transformers #safetensors #arxiv-1910.09700 #endpoints_compatible #region-us \n",
"# Model Card for Model ID",
"## Model Details",
"### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:",
"### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Downstream Use [optional]",
"### Out-of-Scope Use",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.",
"## How to Get Started with the Model\n\nUse the code below to get started with the model.",
"## Training Details",
"### Training Data",
"### Training Procedure",
"#### Preprocessing [optional]",
"#### Training Hyperparameters\n\n- Training regime:",
"#### Speeds, Sizes, Times [optional]",
"## Evaluation",
"### Testing Data, Factors & Metrics",
"#### Testing Data",
"#### Factors",
"#### Metrics",
"### Results",
"#### Summary",
"## Model Examination [optional]",
"## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:",
"## Technical Specifications [optional]",
"### Model Architecture and Objective",
"### Compute Infrastructure",
"#### Hardware",
"#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Model Card Authors [optional]",
"## Model Card Contact"
] | [
31,
6,
3,
82,
28,
3,
4,
9,
9,
10,
42,
20,
3,
4,
5,
9,
11,
13,
3,
12,
5,
4,
5,
3,
4,
9,
53,
9,
8,
6,
3,
14,
8,
7,
9,
4
] | [
"passage: TAGS\n#transformers #safetensors #arxiv-1910.09700 #endpoints_compatible #region-us \n# Model Card for Model ID## Model Details### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Downstream Use [optional]### Out-of-Scope Use## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.## How to Get Started with the Model\n\nUse the code below to get started with the model.## Training Details### Training Data### Training Procedure#### Preprocessing [optional]#### Training Hyperparameters\n\n- Training regime:#### Speeds, Sizes, Times [optional]## Evaluation### Testing Data, Factors & Metrics#### Testing Data#### Factors#### Metrics### Results#### Summary## Model Examination [optional]## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:## Technical Specifications [optional]### Model Architecture and Objective### Compute Infrastructure#### Hardware#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Model Card Authors [optional]## Model Card Contact"
] | [
-0.06646376848220825,
0.2168014943599701,
-0.00225935154594481,
0.023818302899599075,
0.1271018385887146,
-0.001635765191167593,
0.04218708351254463,
0.13324736058712006,
-0.020175931975245476,
0.11144465953111649,
0.046588581055402756,
0.09377603232860565,
0.09928803145885468,
0.18404334783554077,
0.04859916493296623,
-0.2059975117444992,
0.007056170143187046,
-0.09090408682823181,
0.014076028019189835,
0.1116579994559288,
0.13719257712364197,
-0.10291384905576706,
0.08272874355316162,
-0.04045208916068077,
-0.02019004337489605,
0.00012576708104461432,
-0.09259183704853058,
-0.07032395154237747,
0.06885425746440887,
0.06264153122901917,
0.051234472543001175,
0.001456156256608665,
0.09140396863222122,
-0.2864592671394348,
0.017265573143959045,
0.08406311273574829,
0.0027674848679453135,
0.06290827691555023,
0.07236549258232117,
-0.07389893382787704,
0.11328595131635666,
-0.08021481335163116,
0.13019037246704102,
0.08625296503305435,
-0.062064990401268005,
-0.23071379959583282,
-0.07525765895843506,
0.0963398814201355,
0.12251301854848862,
0.06215599179267883,
-0.022921854630112648,
0.15455181896686554,
-0.06248689442873001,
0.012971068732440472,
0.1294165402650833,
-0.11526761949062347,
-0.05572471022605896,
0.061741601675748825,
0.11775490641593933,
0.10740239918231964,
-0.14110268652439117,
-0.0017287094378843904,
0.04900608956813812,
0.029121357947587967,
0.08589313924312592,
0.022661056369543076,
0.12003941088914871,
0.04652795568108559,
-0.13695219159126282,
-0.04037507623434067,
0.12011898308992386,
0.038862764835357666,
-0.06446044892072678,
-0.2168138176202774,
-0.006778308190405369,
-0.0601806715130806,
-0.014732478186488152,
-0.07019448280334473,
0.039128515869379044,
-0.02470310963690281,
0.07317749410867691,
-0.04465159401297569,
-0.1063927412033081,
-0.0421026237308979,
0.0892222449183464,
0.07748593389987946,
0.011527054943144321,
-0.02519804798066616,
0.04627908393740654,
0.13455867767333984,
0.05402068421244621,
-0.10399353504180908,
-0.07017925381660461,
-0.06942764669656754,
-0.09420394152402878,
-0.04035796597599983,
0.056760527193546295,
0.031942449510097504,
0.02665667235851288,
0.22703726589679718,
0.016653569415211678,
0.04155244305729866,
0.0224777739495039,
0.01032855175435543,
0.043662428855895996,
0.0955500528216362,
-0.05303520709276199,
-0.15660029649734497,
-0.04072032496333122,
0.09077946096658707,
-0.0027527001220732927,
-0.036689214408397675,
-0.03966725245118141,
0.03849169611930847,
0.06843466311693192,
0.13122352957725525,
0.07552056759595871,
-0.017929591238498688,
-0.04813180863857269,
-0.030096933245658875,
0.23523783683776855,
-0.1493375599384308,
0.04426715523004532,
-0.02271856553852558,
-0.01804111897945404,
-0.03908449783921242,
0.03597262129187584,
0.022118929773569107,
-0.000004518366949923802,
0.09706240892410278,
-0.058981191366910934,
-0.05378659814596176,
-0.10168042778968811,
-0.03272576630115509,
0.04088849574327469,
-0.013975566253066063,
-0.010589460842311382,
-0.09025166928768158,
-0.09490354359149933,
-0.04766594246029854,
0.05537205561995506,
-0.05123869329690933,
-0.03770573064684868,
0.009465423412621021,
-0.08151785284280777,
-0.005444355774670839,
-0.005417742300778627,
0.10699385404586792,
-0.03222226724028587,
0.04445803165435791,
-0.027600755915045738,
0.05225523188710213,
0.09919606149196625,
0.031576547771692276,
-0.0773419588804245,
0.0561848059296608,
-0.22559374570846558,
0.07503069192171097,
-0.11481974273920059,
0.04335082694888115,
-0.1704932004213333,
-0.042439818382263184,
0.005444696638733149,
0.0139949731528759,
0.013206101022660732,
0.12720820307731628,
-0.19255615770816803,
-0.01654396951198578,
0.13260798156261444,
-0.09212633967399597,
-0.118110790848732,
0.07884611934423447,
-0.029701577499508858,
0.1624738723039627,
0.04682036489248276,
-0.027025915682315826,
0.09224298596382141,
-0.16434773802757263,
-0.07092688232660294,
-0.00949116237461567,
-0.01727987825870514,
0.12109188735485077,
0.07512219995260239,
-0.05991523340344429,
0.046571120619773865,
0.02832140028476715,
-0.038078423589468,
-0.04424772411584854,
-0.050857074558734894,
-0.10884185880422592,
-0.01070026308298111,
-0.08987759798765182,
0.04065500199794769,
-0.01250192429870367,
-0.07916021347045898,
-0.029885273426771164,
-0.18612512946128845,
-0.0030564051121473312,
0.10038342326879501,
0.0035033065360039473,
-0.005652366206049919,
-0.08666291832923889,
0.026358824223279953,
-0.03112892620265484,
-0.008404186926782131,
-0.16764774918556213,
-0.04399421438574791,
0.046902090311050415,
-0.16094985604286194,
0.020117372274398804,
-0.06413903087377548,
0.06334125250577927,
0.03641495108604431,
-0.05590536445379257,
-0.0248766727745533,
-0.01730942726135254,
0.011945613659918308,
-0.05083848536014557,
-0.18994836509227753,
-0.056277405470609665,
-0.037882111966609955,
0.149809330701828,
-0.25956398248672485,
0.032966937869787216,
0.051140617579221725,
0.14649195969104767,
0.00406361510977149,
-0.05115427449345589,
0.01429014839231968,
-0.05360214412212372,
-0.054652128368616104,
-0.06746816635131836,
-0.006135428790003061,
-0.027576493099331856,
-0.05147203803062439,
0.019243421033024788,
-0.1755700707435608,
-0.021410830318927765,
0.09424154460430145,
0.12876708805561066,
-0.1486445665359497,
-0.018640631809830666,
-0.048725154250860214,
-0.06339836865663528,
-0.0715010017156601,
-0.07038594037294388,
0.10712739825248718,
0.0513901449739933,
0.04796046018600464,
-0.07435787469148636,
-0.07092321664094925,
0.02726263552904129,
0.006906150374561548,
-0.03382374346256256,
0.08727246522903442,
0.05199531093239784,
-0.09209315478801727,
0.0756213590502739,
0.1092359870672226,
0.07177663594484329,
0.09363535046577454,
0.01574566215276718,
-0.11756632477045059,
-0.028492970392107964,
0.036266472190618515,
0.02740776725113392,
0.1465986967086792,
-0.05952361226081848,
0.04016614332795143,
0.04494241625070572,
-0.04170418903231621,
0.022319864481687546,
-0.08787637203931808,
0.024075502529740334,
0.025203049182891846,
-0.0034381982404738665,
0.06284574419260025,
-0.02525499276816845,
-0.0050758360885083675,
0.07016654312610626,
0.047779910266399384,
0.04621000960469246,
0.009655474685132504,
-0.01720241829752922,
-0.1047825813293457,
0.16950392723083496,
-0.0951867327094078,
-0.269941508769989,
-0.17632324993610382,
0.026197833940386772,
0.04035249724984169,
-0.022378476336598396,
0.031619444489479065,
-0.07056326419115067,
-0.10630585998296738,
-0.1060405746102333,
-0.002429972169920802,
0.01714223250746727,
-0.06364088505506516,
-0.0741225928068161,
0.07348573952913284,
0.04382912442088127,
-0.14902326464653015,
0.038552410900592804,
0.055694397538900375,
-0.057955220341682434,
-0.0233661737293005,
0.09118817001581192,
0.12397737801074982,
0.14583967626094818,
-0.021366750821471214,
-0.028626007959246635,
0.029004426673054695,
0.19620531797409058,
-0.13469526171684265,
0.10371150821447372,
0.13814030587673187,
-0.04545360431075096,
0.08360563963651657,
0.1560150384902954,
0.029186224564909935,
-0.08317049592733383,
0.05044832453131676,
0.04082648828625679,
-0.043159641325473785,
-0.2666129767894745,
-0.0534592866897583,
0.012832709588110447,
-0.06255637854337692,
0.09786593168973923,
0.10183793306350708,
0.11542957276105881,
0.034910861402750015,
-0.07166364789009094,
-0.043925940990448,
-0.0058974819257855415,
0.11737963557243347,
-0.05490213260054588,
-0.012639665976166725,
0.07686592638492584,
-0.05086168646812439,
0.005355054512619972,
0.10266812145709991,
0.02973790094256401,
0.17442677915096283,
0.020399179309606552,
0.11231429129838943,
0.06195578724145889,
0.08633565157651901,
0.0007386076031252742,
0.02951662428677082,
0.05147615820169449,
0.017203815281391144,
-0.002300140680745244,
-0.10421168059110641,
-0.006156572140753269,
0.1449710875749588,
0.028103826567530632,
0.029669636860489845,
-0.0018948549404740334,
-0.005003341939300299,
0.05121048167347908,
0.1746254414319992,
-0.011592294089496136,
-0.22072425484657288,
-0.0845772922039032,
0.06936841458082199,
-0.06218599155545235,
-0.12968985736370087,
-0.026130788028240204,
0.045467354357242584,
-0.17519839107990265,
0.026703642681241035,
-0.027433741837739944,
0.0919293761253357,
-0.09345759451389313,
-0.02221956104040146,
0.03687324374914169,
0.084866963326931,
-0.014529162086546421,
0.08703910559415817,
-0.14498743414878845,
0.11886418610811234,
0.02978132851421833,
0.09024628251791,
-0.11081171780824661,
0.07909037172794342,
-0.007550720125436783,
0.009180475026369095,
0.19379350543022156,
-0.011335089802742004,
-0.03514958545565605,
-0.08774717897176743,
-0.11210042238235474,
-0.013537433929741383,
0.12687496840953827,
-0.1243172138929367,
0.08773399889469147,
-0.015198243781924248,
-0.044079482555389404,
0.00937260314822197,
-0.12100647389888763,
-0.17273177206516266,
-0.19628387689590454,
0.05585884302854538,
-0.09575839340686798,
0.025643249973654747,
-0.11914430558681488,
-0.07089093327522278,
-0.02952558360993862,
0.241120383143425,
-0.1745356321334839,
-0.06510113179683685,
-0.1468164622783661,
-0.046294767409563065,
0.1662203073501587,
-0.04437198117375374,
0.0718095526099205,
-0.0208172257989645,
0.20345525443553925,
0.005988610442727804,
-0.004939318168908358,
0.06724198162555695,
-0.08892562240362167,
-0.16873881220817566,
-0.06771010160446167,
0.1510489284992218,
0.11680185794830322,
0.04907919466495514,
-0.002248800592496991,
0.0011772146681323647,
-0.016943959519267082,
-0.1137804463505745,
-0.0033210667315870523,
0.16037839651107788,
0.03878779336810112,
0.025986969470977783,
-0.05243593826889992,
-0.08797456324100494,
-0.06899320334196091,
-0.06853509694337845,
0.06221301481127739,
0.19590823352336884,
-0.10376439243555069,
0.1700313836336136,
0.147536963224411,
-0.07305635511875153,
-0.23175598680973053,
0.035342130810022354,
0.04983805492520332,
0.0014306638622656465,
0.04886869341135025,
-0.18252557516098022,
0.10521943867206573,
0.019543392583727837,
-0.05505957826972008,
0.13485197722911835,
-0.1557481735944748,
-0.1552847921848297,
0.0722852572798729,
0.03904085233807564,
-0.22423844039440155,
-0.1354004591703415,
-0.09622503817081451,
-0.05825018882751465,
-0.14065024256706238,
0.06054598465561867,
-0.002136280992999673,
0.015948504209518433,
0.03500790148973465,
-0.0015643214574083686,
0.027123261243104935,
-0.058935679495334625,
0.18609118461608887,
-0.004065449349582195,
0.020676052197813988,
-0.060264769941568375,
-0.0478842556476593,
0.09839435666799545,
-0.06130504235625267,
0.12208222597837448,
0.004057085141539574,
0.01594383642077446,
-0.10362856835126877,
-0.048314861953258514,
-0.04328322783112526,
0.05154227837920189,
-0.07548051327466965,
-0.10070807486772537,
-0.043625857681035995,
0.08841723203659058,
0.07005169242620468,
-0.03383097052574158,
0.00549331633374095,
-0.07189501076936722,
0.10019614547491074,
0.17795267701148987,
0.17573626339435577,
0.009926567785441875,
-0.07241068035364151,
0.01677953451871872,
-0.04142116755247116,
0.044231921434402466,
-0.2513144314289093,
0.03756171092391014,
0.06098250672221184,
0.029438555240631104,
0.09217222779989243,
-0.020435843616724014,
-0.1820858269929886,
-0.04050002992153168,
0.08094815909862518,
-0.05452597141265869,
-0.22617179155349731,
-0.019085140898823738,
0.0954197570681572,
-0.2020406424999237,
-0.007372708059847355,
0.03995226323604584,
-0.048725228756666183,
-0.023169852793216705,
0.00010950004070764408,
0.06317184865474701,
0.002471912419423461,
0.09773622453212738,
0.0735151618719101,
0.09715340286493301,
-0.08337292820215225,
0.10562895983457565,
0.10150538384914398,
-0.09572599828243256,
0.03605884686112404,
0.06754924356937408,
-0.05300498008728027,
-0.043293699622154236,
0.03665391728281975,
0.033023297786712646,
0.005234600510448217,
-0.060321882367134094,
0.013913018628954887,
-0.036497246474027634,
0.044923391193151474,
0.08326134830713272,
0.03754979372024536,
-0.013354414142668247,
0.06462216377258301,
0.03401726484298706,
-0.10898099094629288,
0.10366570204496384,
0.01731540448963642,
0.04105307161808014,
-0.08384523540735245,
-0.019968897104263306,
0.035425446927547455,
0.030576206743717194,
-0.01765924133360386,
-0.02306121215224266,
-0.02860277332365513,
-0.01614218018949032,
-0.14299540221691132,
-0.023106401786208153,
-0.07243485748767853,
0.006181265693157911,
0.014656842686235905,
-0.031884219497442245,
-0.011233693920075893,
0.02475680410861969,
-0.06979699432849884,
-0.07426341623067856,
-0.006949664559215307,
0.09833318740129471,
-0.15115703642368317,
0.008848577737808228,
0.06907843053340912,
-0.11088496446609497,
0.08190931379795074,
-0.008411259390413761,
0.016245156526565552,
0.022527478635311127,
-0.15448406338691711,
0.05601610988378525,
0.0008648968650959432,
0.01916889287531376,
0.025886621326208115,
-0.16471809148788452,
0.004104440100491047,
-0.04661374166607857,
-0.02149827405810356,
-0.00004464812809601426,
-0.02647159807384014,
-0.12325995415449142,
0.06858719140291214,
-0.015622655861079693,
-0.035931166261434555,
-0.02701525390148163,
0.0539589487016201,
0.07888586074113846,
-0.027474910020828247,
0.10445091128349304,
-0.008690856397151947,
0.04941811040043831,
-0.16801609098911285,
-0.02470702864229679,
-0.04982255399227142,
0.019377702847123146,
0.009884213097393513,
-0.007693959400057793,
0.04183054715394974,
-0.00976533442735672,
0.21883612871170044,
-0.05075952783226967,
0.1607085019350052,
0.05847611650824547,
-0.017352959141135216,
-0.0007513365126214921,
0.06180921941995621,
0.05997028574347496,
0.04658793285489082,
0.009480604901909828,
0.023740366101264954,
-0.022450892254710197,
-0.006695089396089315,
-0.15932634472846985,
0.01890849508345127,
0.14999441802501678,
0.06301083415746689,
0.024745315313339233,
0.05866100639104843,
-0.12775006890296936,
-0.12135478109121323,
0.09311001747846603,
-0.026755332946777344,
0.00928465835750103,
-0.08245618641376495,
0.1358020007610321,
0.14980104565620422,
-0.14000412821769714,
0.05256148427724838,
-0.06134212389588356,
-0.05217423290014267,
-0.10388828068971634,
-0.12032219022512436,
-0.05887215584516525,
-0.053666237741708755,
0.002330566756427288,
-0.03760887682437897,
0.054546963423490524,
0.03344334661960602,
-0.009351172484457493,
-0.00022941511997487396,
0.13597318530082703,
-0.019751882180571556,
-0.0028988157864660025,
0.048313532024621964,
0.03693558648228645,
0.02373051457107067,
-0.05275435373187065,
0.02940409444272518,
0.02539868652820587,
0.032232340425252914,
0.06546790152788162,
0.033412106335163116,
-0.047448933124542236,
0.03804153576493263,
-0.0025254099164158106,
-0.11207924783229828,
0.019641218706965446,
-0.00460948096588254,
-0.0742158442735672,
0.1268945336341858,
0.0407399944961071,
0.010224059224128723,
-0.03741471841931343,
0.24361543357372284,
-0.06653323769569397,
-0.06378097087144852,
-0.13251738250255585,
0.10491154342889786,
-0.0027236645109951496,
0.06476365029811859,
0.023412218317389488,
-0.1284150779247284,
0.005243356805294752,
0.13858191668987274,
0.12181595712900162,
0.0045748427510261536,
0.009228081442415714,
0.0518609918653965,
0.0025186820421367884,
-0.06998204439878464,
0.054019294679164886,
0.06992026418447495,
0.12919506430625916,
-0.07847554981708527,
0.07680778950452805,
0.0006860480643808842,
-0.08370215445756912,
-0.02947772853076458,
0.11312682181596756,
-0.0409729965031147,
0.03491825982928276,
-0.047444481402635574,
0.10916327685117722,
-0.05787910893559456,
-0.29412412643432617,
0.02350960113108158,
-0.09588567912578583,
-0.15202060341835022,
-0.018367812037467957,
0.05944539234042168,
-0.02624768204987049,
0.018029648810625076,
0.06971040368080139,
-0.06011629104614258,
0.20098382234573364,
0.0335683599114418,
-0.07864278554916382,
-0.0664360448718071,
0.04837050288915634,
-0.06564252078533173,
0.2949807047843933,
0.008418165147304535,
0.02863333560526371,
0.10770907253026962,
-0.03253700211644173,
-0.18271861970424652,
0.010723991319537163,
0.1133992001414299,
-0.08056149631738663,
0.08200647681951523,
0.19000613689422607,
-0.012578671798110008,
0.1209007054567337,
0.05294662341475487,
-0.047376248985528946,
0.04217283055186272,
-0.03389401361346245,
-0.051268599927425385,
-0.10752558708190918,
0.058453381061553955,
-0.05909625440835953,
0.15447644889354706,
0.10152646154165268,
-0.05671518296003342,
-0.004550917539745569,
-0.05555408447980881,
0.04875178262591362,
0.01804669201374054,
0.12263146042823792,
0.02951994352042675,
-0.1865430772304535,
0.032826557755470276,
-0.01144319772720337,
0.10186848044395447,
-0.25588861107826233,
-0.08421015739440918,
0.08833149075508118,
-0.011924264021217823,
-0.05105875805020332,
0.10560628771781921,
0.057650718837976456,
0.04243382066488266,
-0.043439045548439026,
-0.10480839014053345,
-0.02186836116015911,
0.14663739502429962,
-0.1469624787569046,
-0.025013303384184837
] |
null | null | transformers |
**Explanation**
- Attached the DPO-applied adapter to the base model; a hedged sketch of the attach step follows the corpus link below
**Base Model**
- [TomGrc/FusionNet_7Bx2_MoE_v0.1](https://huggingface.co/TomGrc/FusionNet_7Bx2_MoE_v0.1)
**Adapter Base Model**
- [yanolja/KoSOLAR-10.7B-v0.3](https://huggingface.co/yanolja/KoSOLAR-10.7B-v0.3)
**Adapter Corpus**
- [We-Want-GPU/Yi-Ko-DPO-Orca-DPO-Pairs](https://huggingface.co/datasets/We-Want-GPU/Yi-Ko-DPO-Orca-DPO-Pairs)
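The attach step described above can be reproduced with 🤗 PEFT along the lines of the sketch below. This is a hedged illustration, not the author's published script: the adapter repository is not named in this card (the placeholder below is hypothetical), and the merge call and save path are assumptions.

```python
# Hedged sketch of "attach adapter to base model, then merge" with PEFT.
# The base repo id comes from this card; the adapter id is a placeholder.
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base_id = "TomGrc/FusionNet_7Bx2_MoE_v0.1"
adapter_id = "path/to/dpo-adapter"  # placeholder: adapter repo not named here

base = AutoModelForCausalLM.from_pretrained(base_id, device_map="auto")
model = PeftModel.from_pretrained(base, adapter_id)

# Fold the adapter weights into the base so the result can be saved and
# pushed as a plain checkpoint.
merged = model.merge_and_unload()
merged.save_pretrained("FusionNet_7Bx2_MoE_Ko_DPO_Adapter_Attach")
AutoTokenizer.from_pretrained(base_id).save_pretrained(
    "FusionNet_7Bx2_MoE_Ko_DPO_Adapter_Attach"
)
```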
**Score**
|Average|ARC|HellaSwag|MMLU|TruthfulQA|Winogrande|GSM8K|
|:---:|:---:|:---:|:---:|:---:|:---:|:---:|
|76.09|73.89|88.94|65.03|71.24|87.61|69.83|
**Log**
- 2024.02.13: Initial version Upload
**LICENSE**
- MIT | {"language": ["en", "ko"], "license": "mit", "datasets": "We-Want-GPU/Yi-Ko-DPO-Orca-DPO-Pairs", "pipeline_tag": "text-generation"} | text-generation | dddsaty/FusionNet_7Bx2_MoE_Ko_DPO_Adapter_Attach | [
"transformers",
"safetensors",
"mixtral",
"text-generation",
"en",
"ko",
"dataset:We-Want-GPU/Yi-Ko-DPO-Orca-DPO-Pairs",
"license:mit",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | 2024-02-13T09:19:43+00:00 | [] | [
"en",
"ko"
] | TAGS
#transformers #safetensors #mixtral #text-generation #en #ko #dataset-We-Want-GPU/Yi-Ko-DPO-Orca-DPO-Pairs #license-mit #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
| Explanation
* Attached the DPO-applied adapter to the base model
Base Model
* TomGrc/FusionNet\_7Bx2\_MoE\_v0.1
Adapter Base Model
* yanolja/KoSOLAR-10.7B-v0.3
Adapter Corpus
* We-Want-GPU/Yi-Ko-DPO-Orca-DPO-Pairs
Score
Log
* 2024.02.13: Initial version Upload
LICENSE
* MIT
| [] | [
"TAGS\n#transformers #safetensors #mixtral #text-generation #en #ko #dataset-We-Want-GPU/Yi-Ko-DPO-Orca-DPO-Pairs #license-mit #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n"
] | [
85
] | [
"passage: TAGS\n#transformers #safetensors #mixtral #text-generation #en #ko #dataset-We-Want-GPU/Yi-Ko-DPO-Orca-DPO-Pairs #license-mit #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n"
] | [
-0.046664606779813766,
0.059917815029621124,
-0.00500042550265789,
0.011129749938845634,
0.1146635189652443,
0.011073892936110497,
0.21043069660663605,
0.13031752407550812,
0.030823150649666786,
-0.02661469206213951,
0.15297667682170868,
0.17015135288238525,
0.03434661403298378,
0.1782059222459793,
-0.12273801118135452,
-0.23281948268413544,
0.08248388767242432,
0.00028128703706897795,
-0.03129761293530464,
0.10560238361358643,
0.11201027035713196,
-0.036341000348329544,
0.05055634304881096,
-0.07882699370384216,
-0.10960982739925385,
-0.03313927724957466,
0.006809750106185675,
-0.12841349840164185,
0.07741746306419373,
0.07458439469337463,
0.10962742567062378,
0.10900817066431046,
-0.051426179707050323,
-0.1568818837404251,
0.030079472810029984,
0.011434436775743961,
-0.05130412057042122,
0.01577933505177498,
0.045249488204717636,
-0.024335132911801338,
0.023119166493415833,
-0.05707593262195587,
-0.0522405281662941,
0.025369906798005104,
-0.11336687952280045,
-0.13606612384319305,
-0.07850134372711182,
0.02866470068693161,
0.07551908493041992,
0.054206546396017075,
-0.005449431017041206,
0.15227201581001282,
-0.03475922346115112,
0.07329630106687546,
0.006343552842736244,
-0.260618656873703,
0.006349627859890461,
0.10733316093683243,
0.11807218939065933,
0.05738072097301483,
-0.015325402840971947,
0.040853992104530334,
0.05612572655081749,
-0.022496284916996956,
-0.027454053983092308,
-0.028526343405246735,
-0.007967358455061913,
0.03868028149008751,
-0.07143991440534592,
-0.007750876247882843,
0.2679624855518341,
-0.006137555930763483,
0.011420661583542824,
-0.057645078748464584,
-0.06110234558582306,
-0.03889154642820358,
0.020127400755882263,
-0.044046398252248764,
-0.021969959139823914,
0.07472791522741318,
0.024572961032390594,
0.0116821164265275,
-0.14110662043094635,
-0.023290134966373444,
-0.24087251722812653,
0.24943292140960693,
0.01749546267092228,
0.025393273681402206,
-0.13701611757278442,
0.04990620166063309,
0.0490419901907444,
-0.10517403483390808,
-0.016769202426075935,
-0.10423127561807632,
0.08117146790027618,
-0.030885379761457443,
-0.04218423366546631,
-0.08086869865655899,
0.1797313541173935,
0.11171109229326248,
-0.005169276613742113,
-0.03912866488099098,
-0.04217267408967018,
0.06385746598243713,
0.045496903359889984,
0.0030834018252789974,
-0.02231135219335556,
-0.017342446371912956,
0.06787662953138351,
-0.0537521168589592,
0.05024930462241173,
-0.03606240823864937,
-0.10968027263879776,
-0.03056001476943493,
0.0062749129720032215,
0.12318041175603867,
0.05991173908114433,
0.10475802421569824,
-0.05462035536766052,
0.010486788116395473,
0.1533413678407669,
-0.0595405250787735,
-0.004377750214189291,
-0.01076924242079258,
0.0310285072773695,
0.06942342966794968,
-0.040603552013635635,
0.059019122272729874,
-0.016864249482750893,
0.04734831675887108,
-0.01792505942285061,
-0.053111542016267776,
-0.01582282781600952,
-0.05834062024950981,
0.064693383872509,
-0.012240327894687653,
0.05221567675471306,
-0.22336696088314056,
-0.19366635382175446,
0.018681732937693596,
0.015384722501039505,
-0.03473367914557457,
-0.01850614883005619,
-0.021031897515058517,
-0.05984019115567207,
0.0011433751787990332,
-0.07349152863025665,
-0.10652117431163788,
-0.08304215967655182,
0.10670500248670578,
-0.013845395296812057,
0.04883592575788498,
-0.18471533060073853,
0.044814787805080414,
-0.10833880305290222,
0.007076219189912081,
-0.05989877134561539,
0.06312089413404465,
-0.0642731562256813,
0.1169724315404892,
-0.030221309512853622,
0.03982837125658989,
-0.034202151000499725,
0.049398016184568405,
-0.027221117168664932,
0.20211277902126312,
-0.14424794912338257,
-0.07537595182657242,
0.22415576875209808,
-0.14720918238162994,
-0.19694986939430237,
0.12768486142158508,
0.02700994722545147,
0.06539055705070496,
0.09247437864542007,
0.16249249875545502,
0.11184714734554291,
-0.06315504014492035,
-0.011398455128073692,
0.05957735702395439,
-0.08646302670240402,
-0.10662553459405899,
0.08648470044136047,
0.022566745057702065,
-0.10523270815610886,
0.0552297867834568,
0.04687955975532532,
0.0816265419125557,
-0.05378689244389534,
-0.06683144718408585,
-0.04813091456890106,
-0.03123878687620163,
0.048433948308229446,
-0.024441726505756378,
0.04177432507276535,
-0.08666199445724487,
-0.023973722010850906,
-0.046218711882829666,
0.02286987006664276,
-0.02814427949488163,
-0.0020401477813720703,
-0.09657403081655502,
0.08932202309370041,
-0.053483832627534866,
0.060110483318567276,
-0.07872792333364487,
-0.08212018013000488,
0.014646842144429684,
0.01667783409357071,
-0.04183600842952728,
0.06662656366825104,
0.06141499802470207,
0.007893203757703304,
-0.020410116761922836,
-0.0014075144426897168,
0.17276887595653534,
0.05541939660906792,
-0.01802535355091095,
-0.10373612493276596,
0.11982300877571106,
-0.07997960597276688,
0.08907504379749298,
-0.11174410581588745,
0.014426344074308872,
0.108958899974823,
0.117643803358078,
0.008403286337852478,
0.09027167409658432,
-0.02397211268544197,
0.057052306830883026,
-0.06521309167146683,
-0.006455334834754467,
0.09671138226985931,
0.015302062034606934,
-0.10086223483085632,
0.18549802899360657,
-0.12763358652591705,
0.23347991704940796,
0.21176713705062866,
-0.052602387964725494,
0.011232146061956882,
-0.10708710551261902,
-0.006922066677361727,
-0.027233917266130447,
0.04563415050506592,
-0.015952643007040024,
-0.061453867703676224,
0.0007445674273185432,
0.09703516960144043,
-0.043136514723300934,
0.021286513656377792,
0.013888035900890827,
-0.06389258801937103,
-0.05915592610836029,
0.0394134558737278,
0.04886365681886673,
-0.17184774577617645,
0.1968689113855362,
0.23367609083652496,
0.03871134668588638,
0.2116004228591919,
-0.07448004931211472,
0.013517542742192745,
-0.0030782222747802734,
0.04942344129085541,
-0.009602940641343594,
-0.0027305514086037874,
-0.10696592926979065,
0.030430948361754417,
0.06908825039863586,
0.02499815635383129,
0.047079890966415405,
-0.11950905621051788,
-0.07174301147460938,
-0.003018774325028062,
-0.04812931641936302,
-0.025241583585739136,
0.07173039019107819,
-0.011170039884746075,
0.1080951914191246,
-0.034325648099184036,
-0.07337096333503723,
0.17630629241466522,
0.014235180802643299,
-0.05813618376851082,
0.15988625586032867,
-0.1431785523891449,
-0.2785952091217041,
-0.14353413879871368,
-0.09418037533760071,
-0.10272476822137833,
0.03482675552368164,
0.1260756105184555,
-0.05968009680509567,
-0.030313881114125252,
-0.020975753664970398,
0.08848412334918976,
0.027120284736156464,
0.015708470717072487,
-0.05753016844391823,
0.07340073585510254,
-0.07669775187969208,
-0.10192020237445831,
-0.056426748633384705,
0.035619739443063736,
-0.04251880943775177,
0.21575689315795898,
-0.09474309533834457,
0.09564884006977081,
0.10529229044914246,
-0.0029771903064101934,
-0.024650929495692253,
-0.05501248687505722,
0.049177564680576324,
-0.06258637458086014,
0.007593921851366758,
0.18078237771987915,
-0.026094624772667885,
0.044084783643484116,
0.21289224922657013,
0.0013068030821159482,
-0.07405862212181091,
0.045333217829465866,
-0.051980286836624146,
-0.037008948624134064,
-0.2692442834377289,
-0.10986092686653137,
-0.07312298566102982,
0.11190960556268692,
-0.03086450695991516,
0.06884817034006119,
0.10394240915775299,
0.11263829469680786,
-0.03530189022421837,
-0.01777188666164875,
0.01752513088285923,
0.058699533343315125,
0.1813478171825409,
-0.02026626281440258,
0.12506406009197235,
-0.08292903006076813,
-0.08769719302654266,
0.10539619624614716,
0.1143455058336258,
0.1314869523048401,
0.11313220858573914,
0.03394008055329323,
0.055020011961460114,
0.06655032187700272,
0.13540129363536835,
0.06838358938694,
0.12385162711143494,
-0.033756256103515625,
-0.018242958933115005,
-0.04274632781744003,
-0.025798900052905083,
0.04718580096960068,
-0.041657816618680954,
-0.11511687934398651,
-0.021087080240249634,
-0.038555946201086044,
0.1062353327870369,
0.12584103643894196,
0.053802624344825745,
-0.224233478307724,
0.009801669046282768,
0.07599926739931107,
0.012823636643588543,
-0.06302964687347412,
0.0874815434217453,
-0.016598723828792572,
-0.046524520963430405,
0.1064552366733551,
-0.03702562674880028,
0.0543065220117569,
-0.09105721116065979,
0.02946881763637066,
-0.04525500908493996,
-0.026595475152134895,
0.015106147155165672,
0.10225308686494827,
-0.3396171033382416,
0.22339750826358795,
-0.02148568443953991,
0.027679063379764557,
-0.08159086853265762,
0.017667517066001892,
0.006492948159575462,
0.16472040116786957,
0.10427527129650116,
-0.008559774607419968,
-0.07401462644338608,
-0.052295755594968796,
-0.06703407317399979,
0.07331469655036926,
0.08020050078630447,
-0.035810697823762894,
0.0032155828084796667,
-0.028535297140479088,
0.005431304685771465,
-0.014773678034543991,
0.03716402128338814,
-0.06777659058570862,
-0.11051786690950394,
0.05605042353272438,
0.07950868457555771,
0.18159013986587524,
-0.052346765995025635,
-0.01485873106867075,
-0.1652989387512207,
0.1586543619632721,
-0.03340129181742668,
-0.08708814531564713,
-0.059538599103689194,
-0.10458989441394806,
-0.0028271714691072702,
-0.04103265330195427,
-0.020240692421793938,
-0.02986009791493416,
0.0187457837164402,
-0.03758818656206131,
-0.1640724539756775,
0.10512836277484894,
-0.08543077111244202,
-0.0577198788523674,
-0.0721278190612793,
0.0742337629199028,
-0.07159484177827835,
-0.03467907756567001,
0.0097659882158041,
-0.0022303429432213306,
-0.046792127192020416,
-0.08537351340055466,
0.012697067111730576,
0.043939679861068726,
0.05182391777634621,
0.04008382931351662,
-0.053574129939079285,
-0.11907356977462769,
-0.01763775199651718,
-0.11949280649423599,
0.1790599226951599,
0.32489925622940063,
-0.024201123043894768,
0.10879597067832947,
0.2364288717508316,
-0.06498484313488007,
-0.3310905694961548,
-0.14243489503860474,
-0.13344939053058624,
-0.02804686315357685,
-0.08592598140239716,
-0.11058273911476135,
0.06705872714519501,
0.12473788857460022,
-0.043988849967718124,
0.04854439198970795,
-0.1779346913099289,
-0.10193924605846405,
0.17095795273780823,
0.0465671606361866,
0.3494671881198883,
-0.22115586698055267,
-0.07969937473535538,
-0.11793075501918793,
-0.14539973437786102,
0.21476416289806366,
-0.07730896025896072,
0.0828205794095993,
-0.011749533005058765,
-0.015023904852569103,
0.0037244290579110384,
-0.03131287172436714,
0.1878703236579895,
-0.06387889385223389,
0.049194999039173126,
-0.12296388298273087,
0.02487400360405445,
0.11063747107982635,
-0.004609477706253529,
0.025004779919981956,
-0.08027462661266327,
0.012536265887320042,
-0.02272614650428295,
-0.048717744648456573,
-0.005240270402282476,
0.04648888111114502,
0.04134683310985565,
-0.08906813710927963,
-0.019301777705550194,
0.01934811845421791,
-0.03154156729578972,
-0.03928442671895027,
0.17542791366577148,
-0.01317478809505701,
0.09858465939760208,
0.06200459226965904,
0.09183680266141891,
-0.13025543093681335,
0.10053806006908417,
-0.06162923574447632,
-0.10952281951904297,
0.0920151025056839,
-0.07528302073478699,
0.018131135031580925,
0.10128238052129745,
-0.05134091153740883,
0.07707352936267853,
0.0657711923122406,
0.013211976736783981,
0.008875134401023388,
0.1343049854040146,
-0.1819031983613968,
-0.11538273841142654,
-0.025051647797226906,
-0.006369379349052906,
0.06347683072090149,
0.10237989574670792,
0.15159857273101807,
-0.010478023439645767,
-0.00521222036331892,
0.002713344292715192,
0.031192217022180557,
-0.07852207869291306,
0.07757186144590378,
0.021063759922981262,
0.016540681943297386,
-0.11065764725208282,
0.10983170568943024,
0.03750389441847801,
-0.1335986703634262,
0.011346573010087013,
0.002959935925900936,
-0.13128679990768433,
-0.12410126626491547,
-0.02885577827692032,
0.12399537116289139,
-0.0628284141421318,
-0.14267481863498688,
-0.023897036910057068,
-0.15968598425388336,
0.01813594065606594,
0.08755390346050262,
0.06116882711648941,
0.10622499138116837,
0.02237698994576931,
-0.033644407987594604,
-0.09406744688749313,
0.04046779125928879,
-0.01678265631198883,
0.05697352811694145,
-0.14877085387706757,
-0.03794873133301735,
-0.09112654626369476,
0.06442222744226456,
-0.07748305797576904,
-0.008442864753305912,
-0.17773698270320892,
-0.014683968387544155,
-0.1645481139421463,
-0.01151390839368105,
-0.10555122792720795,
0.0034887227229774,
0.03293720260262489,
-0.04938272386789322,
-0.0079918522387743,
-0.05924350395798683,
-0.06647919118404388,
0.02515191212296486,
0.006001731846481562,
0.0684194415807724,
-0.0988675057888031,
-0.08439360558986664,
0.025823213160037994,
-0.030892163515090942,
0.11110154539346695,
0.13182561099529266,
-0.0572207048535347,
0.007835720665752888,
-0.21230252087116241,
-0.009266441687941551,
0.10828906297683716,
0.01626577228307724,
0.0003565080987755209,
-0.029244042932987213,
0.0016359362052753568,
0.14699575304985046,
-0.037901487201452255,
0.08854575455188751,
0.04163186997175217,
-0.09896041452884674,
-0.003739154664799571,
-0.07337147742509842,
-0.07886047661304474,
-0.03006340190768242,
-0.047411490231752396,
0.11622658371925354,
0.027543380856513977,
0.14826656877994537,
-0.063511423766613,
0.02134851925075054,
-0.05189547315239906,
-0.0008187047787941992,
-0.008174916729331017,
-0.17098543047904968,
-0.014978992752730846,
-0.051042407751083374,
0.0033379439264535904,
0.03357098624110222,
0.25945866107940674,
0.034160204231739044,
-0.08420221507549286,
0.0314120352268219,
-0.032403841614723206,
0.06033613905310631,
0.01988542079925537,
0.2621113061904907,
0.09962693601846695,
-0.012425526045262814,
-0.13295942544937134,
0.04236998036503792,
0.035837456583976746,
-0.08651039004325867,
0.05532364919781685,
0.09202510118484497,
-0.04089118540287018,
0.07652760297060013,
0.05298833176493645,
-0.004717118106782436,
-0.017639486119151115,
-0.08518409729003906,
-0.040600668638944626,
0.035854220390319824,
0.00593451177701354,
0.002179572358727455,
0.12898355722427368,
-0.04056719318032265,
-0.017789412289857864,
-0.07263986021280289,
-0.04707314446568489,
-0.13056764006614685,
-0.14609001576900482,
-0.10823802649974823,
-0.16001707315444946,
0.018038099631667137,
-0.05857529118657112,
-0.026339231058955193,
0.06411760300397873,
0.058435920625925064,
-0.07660458236932755,
0.09400241076946259,
0.0387791283428669,
-0.037065014243125916,
0.056562118232250214,
-0.048167359083890915,
0.0018922338495031,
-0.057955414056777954,
-0.08860424906015396,
-0.03413943573832512,
0.01178143173456192,
-0.05447852611541748,
0.08520897477865219,
-0.0016705786110833287,
0.05861911177635193,
-0.14939521253108978,
-0.09998832643032074,
-0.055906638503074646,
0.07541751861572266,
-0.03286011144518852,
0.15258006751537323,
0.04351775720715523,
-0.0015515796840190887,
0.12047828733921051,
0.1707025021314621,
0.013341199606657028,
-0.16328448057174683,
-0.07397553324699402,
0.13674680888652802,
-0.02452365681529045,
0.10445606708526611,
0.014303011819720268,
-0.033688224852085114,
-0.04647999629378319,
0.16854123771190643,
0.26312676072120667,
-0.017862292006611824,
0.03576957434415817,
0.01428620982915163,
0.017204703763127327,
0.002631225623190403,
0.11835947632789612,
0.09160876274108887,
0.16877855360507965,
-0.04616062343120575,
0.015227803960442543,
-0.0011358127230778337,
0.00968221202492714,
-0.08070576936006546,
0.06164605915546417,
0.01412744726985693,
-0.06903552263975143,
-0.01723547652363777,
0.10868439823389053,
-0.14899496734142303,
0.06712004542350769,
-0.014934106729924679,
-0.09351471066474915,
-0.040511131286621094,
-0.02080387994647026,
0.17188891768455505,
-0.03641151636838913,
0.02209162339568138,
-0.04322747513651848,
-0.03213962912559509,
0.05430396646261215,
-0.010286344215273857,
-0.12238558381795883,
0.07158643752336502,
0.02196291647851467,
0.020918171852827072,
0.025127308443188667,
-0.0045374478213489056,
0.13091953098773956,
0.09410779923200607,
0.04984939843416214,
-0.09875570982694626,
0.11696135252714157,
0.004715949762612581,
-0.03532760590314865,
0.07308799028396606,
-0.0739593580365181,
-0.00942815188318491,
-0.043275732547044754,
0.10698327422142029,
-0.03118038922548294,
0.0295732244849205,
0.054624322801828384,
-0.06955882161855698,
-0.0692700445652008,
0.09484860301017761,
-0.0871962234377861,
0.0813244879245758,
0.0342402309179306,
-0.020746242254972458,
0.04181111603975296,
-0.04196745529770851,
0.062056299299001694,
0.00932996068149805,
-0.090482197701931,
-0.009657097980380058,
-0.10804528743028641,
-0.09926386922597885,
0.1717008799314499,
0.0964430719614029,
-0.20921590924263,
-0.006167755927890539,
-0.10143466293811798,
0.019408725202083588,
-0.1438116580247879,
0.020239923149347305,
0.12408032268285751,
0.003361395327374339,
-0.044410381466150284,
-0.10299069434404373,
-0.0032766107469797134,
0.0577346608042717,
-0.10353954136371613,
-0.10940323770046234
] |
null | null | ml-agents |
# **ppo** Agent playing **Pyramids**
This is a trained model of a **ppo** agent playing **Pyramids**
using the [Unity ML-Agents Library](https://github.com/Unity-Technologies/ml-agents).
## Usage (with ML-Agents)
The Documentation: https://unity-technologies.github.io/ml-agents/ML-Agents-Toolkit-Documentation/
We wrote a complete tutorial to learn to train your first agent using ML-Agents and publish it to the Hub:
- A *short tutorial* where you teach Huggy the Dog 🐶 to fetch the stick and then play with him directly in your
browser: https://huggingface.co/learn/deep-rl-course/unitbonus1/introduction
- A *longer tutorial* to understand how ML-Agents works:
https://huggingface.co/learn/deep-rl-course/unit5/introduction
### Resume the training
```bash
mlagents-learn <your_configuration_file_path.yaml> --run-id=<run_id> --resume
```
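For example, with the Pyramids configuration used throughout the course this might look like the line below; the config path and run id are assumptions, so substitute the ones from your original run:

```bash
# Assumed paths/ids — replace with the ones from your original run.
mlagents-learn ./config/ppo/PyramidsRND.yaml --run-id=Pyramids-ppo --resume
```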
### Watch your Agent play
You can watch your agent **playing directly in your browser**
1. If the environment is part of ML-Agents official environments, go to https://huggingface.co/unity
2. Step 1: Find your model_id: lambdavi/ppo-Pyramids
3. Step 2: Select your *.nn /*.onnx file
4. Click on Watch the agent play 👀
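To run the agent locally instead, the Hugging Face integration of ML-Agents provides a download helper; availability depends on the ml-agents version you installed, and the local directory name below is an assumption:

```bash
# Pull the trained .onnx and config from the Hub (helper from the HF
# ML-Agents integration; directory name is arbitrary).
mlagents-load-from-hf --repo-id="lambdavi/ppo-Pyramids" --local-dir="./downloads"
```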
| {"library_name": "ml-agents", "tags": ["Pyramids", "deep-reinforcement-learning", "reinforcement-learning", "ML-Agents-Pyramids"]} | reinforcement-learning | lambdavi/ppo-Pyramids | [
"ml-agents",
"tensorboard",
"onnx",
"Pyramids",
"deep-reinforcement-learning",
"reinforcement-learning",
"ML-Agents-Pyramids",
"region:us"
] | 2024-02-13T09:20:11+00:00 | [] | [] | TAGS
#ml-agents #tensorboard #onnx #Pyramids #deep-reinforcement-learning #reinforcement-learning #ML-Agents-Pyramids #region-us
|
# ppo Agent playing Pyramids
This is a trained model of a ppo agent playing Pyramids
using the Unity ML-Agents Library.
## Usage (with ML-Agents)
The Documentation: URL
We wrote a complete tutorial to learn to train your first agent using ML-Agents and publish it to the Hub:
- A *short tutorial* where you teach Huggy the Dog to fetch the stick and then play with him directly in your
browser: URL
- A *longer tutorial* to understand how ML-Agents works:
URL
### Resume the training
### Watch your Agent play
You can watch your agent playing directly in your browser
1. If the environment is part of ML-Agents official environments, go to URL
2. Step 1: Find your model_id: lambdavi/ppo-Pyramids
3. Step 2: Select your *.nn /*.onnx file
4. Click on Watch the agent play
| [
"# ppo Agent playing Pyramids\n This is a trained model of a ppo agent playing Pyramids\n using the Unity ML-Agents Library.\n\n ## Usage (with ML-Agents)\n The Documentation: URL\n\n We wrote a complete tutorial to learn to train your first agent using ML-Agents and publish it to the Hub:\n - A *short tutorial* where you teach Huggy the Dog to fetch the stick and then play with him directly in your\n browser: URL\n - A *longer tutorial* to understand how works ML-Agents:\n URL\n\n ### Resume the training\n \n\n ### Watch your Agent play\n You can watch your agent playing directly in your browser\n\n 1. If the environment is part of ML-Agents official environments, go to URL\n 2. Step 1: Find your model_id: lambdavi/ppo-Pyramids\n 3. Step 2: Select your *.nn /*.onnx file\n 4. Click on Watch the agent play"
] | [
"TAGS\n#ml-agents #tensorboard #onnx #Pyramids #deep-reinforcement-learning #reinforcement-learning #ML-Agents-Pyramids #region-us \n",
"# ppo Agent playing Pyramids\n This is a trained model of a ppo agent playing Pyramids\n using the Unity ML-Agents Library.\n\n ## Usage (with ML-Agents)\n The Documentation: URL\n\n We wrote a complete tutorial to learn to train your first agent using ML-Agents and publish it to the Hub:\n - A *short tutorial* where you teach Huggy the Dog to fetch the stick and then play with him directly in your\n browser: URL\n - A *longer tutorial* to understand how works ML-Agents:\n URL\n\n ### Resume the training\n \n\n ### Watch your Agent play\n You can watch your agent playing directly in your browser\n\n 1. If the environment is part of ML-Agents official environments, go to URL\n 2. Step 1: Find your model_id: lambdavi/ppo-Pyramids\n 3. Step 2: Select your *.nn /*.onnx file\n 4. Click on Watch the agent play"
] | [
48,
204
] | [
"passage: TAGS\n#ml-agents #tensorboard #onnx #Pyramids #deep-reinforcement-learning #reinforcement-learning #ML-Agents-Pyramids #region-us \n# ppo Agent playing Pyramids\n This is a trained model of a ppo agent playing Pyramids\n using the Unity ML-Agents Library.\n\n ## Usage (with ML-Agents)\n The Documentation: URL\n\n We wrote a complete tutorial to learn to train your first agent using ML-Agents and publish it to the Hub:\n - A *short tutorial* where you teach Huggy the Dog to fetch the stick and then play with him directly in your\n browser: URL\n - A *longer tutorial* to understand how works ML-Agents:\n URL\n\n ### Resume the training\n \n\n ### Watch your Agent play\n You can watch your agent playing directly in your browser\n\n 1. If the environment is part of ML-Agents official environments, go to URL\n 2. Step 1: Find your model_id: lambdavi/ppo-Pyramids\n 3. Step 2: Select your *.nn /*.onnx file\n 4. Click on Watch the agent play"
] | [
-0.027923114597797394,
0.040443502366542816,
-0.0036825030110776424,
0.07278509438037872,
0.16060513257980347,
-0.02350984700024128,
0.1549682915210724,
0.13551795482635498,
0.1987728774547577,
0.11450391262769699,
0.022366363555192947,
0.1040366068482399,
0.06963277608156204,
0.13406656682491302,
0.060884371399879456,
-0.18236082792282104,
-0.0406881682574749,
-0.054431620985269547,
0.08311916142702103,
0.08061409741640091,
0.05113481730222702,
-0.07101015746593475,
0.07677759230136871,
0.030957145616412163,
-0.007837093435227871,
0.0012737608049064875,
-0.08903950452804565,
-0.021193915978074074,
0.04831046983599663,
-0.016396790742874146,
-0.020941030234098434,
-0.05411788821220398,
0.09207381308078766,
-0.14277485013008118,
0.029017100110650063,
0.08513844758272171,
-0.000550071825273335,
0.003962442744523287,
0.12349969893693924,
0.012185235507786274,
0.06999233365058899,
-0.09126263856887817,
0.05488013103604317,
0.04227808117866516,
-0.06433449685573578,
-0.01222448330372572,
-0.134755939245224,
0.05186084285378456,
0.20399585366249084,
0.13825079798698425,
0.0022118387278169394,
0.1400034874677658,
0.0068574510514736176,
0.04110196605324745,
0.177867591381073,
-0.29977595806121826,
-0.06739044934511185,
0.0742824599146843,
-0.0059427376836538315,
0.04460008442401886,
0.003992192912846804,
0.046121418476104736,
-0.0479789599776268,
0.04024225100874901,
0.007898501120507717,
-0.019085166975855827,
0.15585288405418396,
-0.03774046152830124,
-0.10215182602405548,
-0.07233905047178268,
0.11218158155679703,
0.03378375619649887,
-0.02464248239994049,
-0.16709761321544647,
-0.023064637556672096,
0.11081618070602417,
-0.02522512525320053,
0.02690955251455307,
0.05701647326350212,
-0.005142875015735626,
0.042487118393182755,
-0.11734042316675186,
-0.03748677670955658,
-0.06479275226593018,
0.036303892731666565,
0.10870246589183807,
0.03246486559510231,
-0.034914739429950714,
0.05568169429898262,
0.053985532373189926,
0.04010332003235817,
-0.05409513786435127,
-0.01793593168258667,
-0.016057072207331657,
-0.11292586475610733,
-0.03845439478754997,
0.03139352798461914,
-0.08225998282432556,
0.039104387164115906,
0.06087515875697136,
0.08168614655733109,
0.03186696022748947,
-0.003135502338409424,
0.05990670993924141,
0.01093169767409563,
0.12164661288261414,
0.0010997159406542778,
0.04833320900797844,
0.0543203204870224,
0.05201810970902443,
0.04358593374490738,
-0.05263917148113251,
-0.0692955031991005,
0.07179044187068939,
-0.061340730637311935,
0.10500622540712357,
0.12609915435314178,
0.016824670135974884,
-0.03694765642285347,
-0.061872001737356186,
-0.0345124825835228,
-0.15493988990783691,
0.062209226191043854,
0.05409812927246094,
-0.028354274109005928,
-0.06967716664075851,
-0.008797275833785534,
-0.014330880716443062,
-0.10835432261228561,
-0.021879777312278748,
-0.01453273557126522,
0.05725012347102165,
-0.02455519698560238,
-0.037207428365945816,
0.05000855028629303,
-0.015475606545805931,
-0.038921184837818146,
-0.1811618059873581,
-0.1804620623588562,
-0.07797490060329437,
0.03844297304749489,
-0.06262463331222534,
-0.08147185295820236,
-0.02153482660651207,
0.05671871080994606,
-0.10530985891819,
0.006478180177509785,
-0.04563860222697258,
-0.054407622665166855,
-0.002801747526973486,
-0.029111748561263084,
0.04812135174870491,
0.1792842000722885,
0.045652203261852264,
-0.022500669583678246,
0.0588800422847271,
-0.23588702082633972,
0.13441802561283112,
-0.1212167963385582,
0.16727769374847412,
-0.10042265057563782,
0.04425491765141487,
0.06578105688095093,
0.0056757573038339615,
0.03354652598500252,
0.1702435165643692,
-0.08941710740327835,
-0.06182921677827835,
0.09297100454568863,
-0.04255354031920433,
-0.1571432501077652,
0.054153501987457275,
0.02930772863328457,
0.09371374547481537,
0.07254600524902344,
0.2189013808965683,
0.1316821277141571,
-0.22407078742980957,
0.05360384285449982,
0.0035927542485296726,
-0.08776871114969254,
0.00046993489377200603,
0.11250607669353485,
-0.10772386193275452,
-0.035633672028779984,
-0.023678716272115707,
-0.1754463016986847,
0.060342635959386826,
-0.017811590805649757,
-0.04626433551311493,
0.040659427642822266,
-0.053816694766283035,
-0.04628049582242966,
0.019148176535964012,
0.05236027017235756,
-0.0008966251625679433,
-0.05040114000439644,
-0.0711967945098877,
0.07488429546356201,
-0.03503294289112091,
0.04069841280579567,
-0.04495972767472267,
0.19144687056541443,
-0.028598064556717873,
0.04478554055094719,
-0.15246643126010895,
-0.11820868402719498,
0.029235519468784332,
0.04309200495481491,
0.09390374273061752,
-0.1440863162279129,
0.0722765401005745,
0.08222968876361847,
0.03779124468564987,
-0.05590861663222313,
-0.060122739523649216,
0.004486052319407463,
-0.09042495489120483,
-0.08870518207550049,
-0.062457840889692307,
-0.0451931357383728,
0.04288575425744057,
-0.08899737894535065,
0.05996636301279068,
-0.13997580111026764,
0.07114667445421219,
-0.0041011301800608635,
-0.04532599076628685,
0.04600636661052704,
0.013141203671693802,
0.037471529096364975,
-0.07359953224658966,
0.09285038709640503,
0.013409283012151718,
-0.0348999910056591,
0.010645574890077114,
-0.011994139291346073,
-0.042510274797677994,
0.09733615070581436,
-0.014091387391090393,
-0.015341812744736671,
0.009098338894546032,
-0.0326119139790535,
0.011980408802628517,
-0.0880119577050209,
-0.013446474447846413,
0.2121632695198059,
0.09805672615766525,
0.10372143983840942,
-0.07123979926109314,
-0.04022863134741783,
-0.023620914667844772,
-0.05277564376592636,
-0.029410842806100845,
0.1504659503698349,
0.07283596694469452,
-0.04494824633002281,
0.06747609376907349,
0.0413266122341156,
0.073832206428051,
0.03823310509324074,
-0.022082623094320297,
-0.12107938528060913,
0.01460212655365467,
0.09512493759393692,
0.05771021172404289,
0.01717982068657875,
0.005990351550281048,
-0.015474588610231876,
0.014783737249672413,
-0.04410500451922417,
-0.010244972072541714,
-0.11518886685371399,
-0.04737401753664017,
0.03453635051846504,
-0.020171789452433586,
0.019029036164283752,
-0.031429510563611984,
-0.014306909404695034,
0.06232442706823349,
0.06652165949344635,
0.027031129226088524,
-0.010139282792806625,
-0.05572882294654846,
-0.11486479640007019,
0.084511399269104,
-0.09026190638542175,
-0.25595203042030334,
-0.082145556807518,
-0.07079320400953293,
-0.057941876351833344,
0.03057367168366909,
0.04272700101137161,
-0.12948612868785858,
-0.011553885415196419,
-0.08987518399953842,
0.021613504737615585,
0.022064249962568283,
-0.04631112888455391,
0.1728454828262329,
0.08196964859962463,
0.014195314608514309,
-0.06556709855794907,
-0.02658584527671337,
-0.004995883442461491,
-0.039261139929294586,
-0.0066938563250005245,
0.043073318898677826,
0.06968596577644348,
0.06926825642585754,
0.0720321536064148,
0.05820050835609436,
-0.010411940515041351,
0.10898621380329132,
-0.05580271780490875,
-0.029141990467905998,
0.15478748083114624,
-0.0003319014795124531,
0.06255688518285751,
0.04159718006849289,
0.03911856561899185,
-0.02568177878856659,
0.012060092762112617,
-0.0013562939129769802,
-0.0342915840446949,
-0.20578433573246002,
-0.10466513782739639,
-0.04948408529162407,
0.12271758168935776,
0.11737849563360214,
0.09992159903049469,
-0.12957823276519775,
0.0005775538738816977,
0.010522965341806412,
-0.00907276850193739,
0.10026592761278152,
0.1101495772600174,
-0.009288647212088108,
-0.04119971767067909,
-0.0175851471722126,
-0.04301375895738602,
0.020701956003904343,
0.05686041712760925,
0.018662378191947937,
0.1292596459388733,
0.04053926095366478,
0.05581902712583542,
0.02899823524057865,
-0.04461041837930679,
-0.05224192142486572,
0.07017644494771957,
0.0200144462287426,
0.014024239964783192,
0.0002769248094409704,
-0.08356054127216339,
-0.046816349029541016,
0.061516884714365005,
0.12504535913467407,
-0.004668071400374174,
-0.09172237664461136,
0.06795521080493927,
0.09279394149780273,
0.15171502530574799,
-0.0010572816245257854,
-0.18955406546592712,
-0.05225437507033348,
-0.004412222653627396,
-0.10156494379043579,
0.00590805197134614,
-0.0012964385095983744,
-0.03971729427576065,
-0.18291807174682617,
0.034177474677562714,
-0.005597712472081184,
0.1255858838558197,
-0.08159054070711136,
-0.015653949230909348,
0.042478349059820175,
0.042257122695446014,
-0.003224687185138464,
0.05478654429316521,
-0.1489022672176361,
0.10806519538164139,
0.01056518405675888,
0.08816792815923691,
-0.05753203481435776,
0.02524208463728428,
0.09119068086147308,
-0.026965223252773285,
0.1932131052017212,
0.021975737065076828,
0.029401501640677452,
-0.09577915072441101,
-0.1857258826494217,
-0.05113566666841507,
-0.040894150733947754,
-0.11117639392614365,
0.07170934975147247,
0.027647322043776512,
-0.035961076617240906,
-0.10885186493396759,
0.06348039209842682,
-0.061272405087947845,
-0.07547463476657867,
0.0016503154765814543,
-0.08330132812261581,
-0.025231067091226578,
-0.036386292427778244,
-0.03583476319909096,
-0.12163286656141281,
0.18664614856243134,
0.07259412109851837,
-0.07652468234300613,
-0.09860716015100479,
-0.03470229730010033,
-0.04788593947887421,
-0.04618455842137337,
0.011672742664813995,
0.004564856179058552,
0.09257424622774124,
-0.0660582110285759,
-0.08362281322479248,
0.002696069423109293,
-0.11887809634208679,
-0.07120882719755173,
-0.043822549283504486,
0.20785759389400482,
0.021326113492250443,
0.06095629930496216,
-0.004753987770527601,
0.03231959417462349,
-0.013513092882931232,
-0.06116974353790283,
0.1376575529575348,
0.18147659301757812,
0.012629352509975433,
0.07573148608207703,
-0.07793568819761276,
0.03981371968984604,
-0.12211304157972336,
0.01651918701827526,
0.2094457596540451,
0.27794820070266724,
-0.038285933434963226,
0.1510244905948639,
0.014525498263537884,
-0.07243373990058899,
-0.17230327427387238,
-0.04882839694619179,
0.030380703508853912,
-0.015462394803762436,
0.1261262148618698,
-0.1937543749809265,
0.058505285531282425,
0.005500104743987322,
-0.02914106845855713,
0.03830409049987793,
-0.26272499561309814,
-0.07648361474275589,
0.050097137689590454,
0.09798518568277359,
-0.05775707960128784,
-0.1014295220375061,
-0.08340850472450256,
0.016826258972287178,
-0.11397553980350494,
0.038753777742385864,
-0.1728196144104004,
0.07257114350795746,
-0.0011728242971003056,
0.05020350590348244,
0.04229620844125748,
-0.027421239763498306,
0.12017092853784561,
-0.043973710387945175,
-0.03500105068087578,
-0.055973708629608154,
0.041156522929668427,
0.014220784418284893,
-0.08372685313224792,
0.070884108543396,
0.0004720394790638238,
-0.02957019954919815,
-0.21184225380420685,
-0.030531255528330803,
-0.010117876343429089,
0.05721893906593323,
-0.0022389376536011696,
-0.011520692147314548,
-0.00014075095532462,
0.058735087513923645,
0.07413797825574875,
0.04297196865081787,
0.10936896502971649,
0.01484921108931303,
0.009595080278813839,
0.049838438630104065,
0.05237897112965584,
0.08350657671689987,
-0.16412828862667084,
-0.05057527869939804,
-0.04073341563344002,
0.008840376511216164,
-0.03896361216902733,
0.001975883962586522,
0.05888015031814575,
0.023005086928606033,
0.031091351062059402,
0.05505404993891716,
-0.11664989590644836,
0.006505418568849564,
0.042008787393569946,
-0.1131288930773735,
-0.18730661273002625,
-0.04685279726982117,
-0.08289437741041183,
-0.008919424377381802,
-0.04430193081498146,
0.0462721511721611,
-0.031523287296295166,
-0.02213503234088421,
0.04376358166337013,
0.037560414522886276,
-0.04240258038043976,
0.04911307245492935,
-0.009052447974681854,
0.022472307085990906,
-0.06632774323225021,
0.18078260123729706,
0.06301423162221909,
-0.007835685275495052,
0.018850741907954216,
0.2188095897436142,
-0.07414734363555908,
-0.09289980679750443,
-0.029681775718927383,
0.08488907665014267,
0.14643441140651703,
-0.003393573220819235,
-0.04872289299964905,
-0.08110100030899048,
0.08299762010574341,
-0.09861177206039429,
0.016868455335497856,
-0.14472152292728424,
0.011034351773560047,
0.050392068922519684,
-0.05754762887954712,
0.09467315673828125,
-0.007954751141369343,
-0.02690555527806282,
-0.13352490961551666,
0.017612846568226814,
0.03078395687043667,
0.1400798112154007,
-0.016550231724977493,
-0.04840995743870735,
-0.13770295679569244,
0.043747469782829285,
0.008366145193576813,
-0.0013924711383879185,
-0.17339766025543213,
-0.030420122668147087,
-0.013182139955461025,
0.04398539662361145,
0.00161056371871382,
0.0604652538895607,
-0.05079354718327522,
-0.09586642682552338,
-0.03669726476073265,
0.11359374225139618,
-0.07392806559801102,
-0.03191174939274788,
0.014524390920996666,
-0.07823847979307175,
0.07596369087696075,
0.07008690387010574,
-0.004351882729679346,
-0.02376566082239151,
-0.09415256232023239,
-0.06359389424324036,
-0.01032909844070673,
0.0019098898628726602,
0.0641782209277153,
-0.17993953824043274,
0.0360439270734787,
-0.046476732939481735,
-0.11481156200170517,
0.003855648450553417,
0.13254886865615845,
-0.07805770635604858,
0.017443297430872917,
0.03581710159778595,
-0.03600991517305374,
-0.05675378441810608,
0.022043630480766296,
0.02291686274111271,
0.06050862744450569,
0.06085255742073059,
-0.07519329339265823,
0.17198263108730316,
-0.12571409344673157,
-0.02253003790974617,
0.006757116410881281,
0.017163941636681557,
0.039861246943473816,
-0.08119810372591019,
0.05888622999191284,
-0.038032956421375275,
0.09629816561937332,
0.0905676856637001,
0.005350738298147917,
0.03232581913471222,
0.03788672015070915,
0.09201732277870178,
0.009149588644504547,
0.03484193608164787,
-0.0269161444157362,
0.012411407195031643,
0.07724510133266449,
0.005169093608856201,
0.07173772901296616,
-0.021108057349920273,
0.1227564588189125,
0.11294537037611008,
0.10099627077579498,
0.04759814217686653,
0.08742183446884155,
-0.08998758345842361,
-0.20932455360889435,
-0.10887084156274796,
0.020147351548075676,
0.05628997087478638,
-0.06186909228563309,
0.16227950155735016,
0.08612311631441116,
-0.1772773414850235,
0.05476667359471321,
-0.024607479572296143,
0.010056944563984871,
-0.0788600817322731,
-0.14553776383399963,
0.0004873780417256057,
-0.14606383442878723,
0.061235763132572174,
-0.030696209520101547,
-0.00758894719183445,
-0.027212774381041527,
-0.02588873729109764,
-0.01811746135354042,
0.09365376830101013,
-0.044804010540246964,
-0.05064244940876961,
0.06783057749271393,
-0.026759527623653412,
0.011180518195033073,
-0.059174057096242905,
-0.028546059504151344,
-0.04823232814669609,
-0.09359583258628845,
0.017278123646974564,
0.04539654776453972,
-0.024669993668794632,
0.07710395753383636,
-0.03160965442657471,
-0.0826815739274025,
0.031916696578264236,
-0.0136305782943964,
-0.030597902834415436,
0.1562069058418274,
0.07479219883680344,
-0.0866735577583313,
-0.02150203101336956,
0.1894344836473465,
-0.02371423877775669,
0.024106958881020546,
-0.07632757723331451,
0.21333420276641846,
-0.024390343576669693,
-0.07389649003744125,
-0.020444348454475403,
-0.13736575841903687,
-0.06243981793522835,
0.2355760931968689,
0.14177633821964264,
-0.07268539816141129,
0.02557232603430748,
-0.033042192459106445,
0.0036539032589644194,
-0.02212582342326641,
0.09791959077119827,
0.0790831446647644,
0.11840895563364029,
-0.08087439835071564,
0.02265971526503563,
-0.03212476521730423,
-0.07068471610546112,
-0.2134903073310852,
-0.008618881925940514,
0.03629204258322716,
-0.003938743844628334,
-0.01620911993086338,
0.10198662430047989,
-0.13309580087661743,
-0.11884450167417526,
0.0762479230761528,
-0.09525628387928009,
-0.10304202884435654,
-0.03768443316221237,
-0.024322787299752235,
0.029507271945476532,
0.08726433664560318,
0.02434263937175274,
0.03412799537181854,
0.08263258635997772,
-0.008029324002563953,
-0.04662623628973961,
-0.02294093370437622,
0.08676113933324814,
-0.107146717607975,
0.2530556917190552,
-0.0398838184773922,
0.03353193774819374,
0.059568844735622406,
0.02910543978214264,
-0.16520212590694427,
0.029920563101768494,
0.05282948538661003,
-0.13732866942882538,
0.031559817492961884,
0.0801340639591217,
-0.049669161438941956,
0.02997896447777748,
0.07129544019699097,
0.011837444268167019,
0.014879059977829456,
0.07351268082857132,
0.036424420773983,
-0.0582783967256546,
0.05433133244514465,
-0.15597152709960938,
0.10567614436149597,
0.12570008635520935,
-0.056384794414043427,
0.011184237897396088,
-0.02801620587706566,
0.024851033464074135,
0.04275127127766609,
0.06253869831562042,
-0.04551403596997261,
-0.13861054182052612,
-0.004114563576877117,
0.01909974217414856,
0.0652809739112854,
-0.22885234653949738,
-0.12435141950845718,
-0.048989009112119675,
-0.06956428289413452,
-0.053215786814689636,
0.08509252965450287,
0.1504819691181183,
-0.026226762682199478,
-0.019900567829608917,
-0.16006077826023102,
0.023034971207380295,
0.1496458500623703,
-0.09365565329790115,
-0.019534239545464516
] |
null | null | transformers |
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed to the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
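As an illustrative sketch only (the card does not document usage): the repository id below comes from this card's metadata, and loading with the standard causal-LM classes is an assumption based on the `llama` / `text-generation` tags.

```python
# Hypothetical usage sketch; the causal-LM classes are assumed from the
# "llama" / "text-generation" tags, not from documented instructions.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "aidonuts/pernicious-001-ep2"  # repository id from this card's metadata
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

inputs = tokenizer("Hello, how are you?", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```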
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed]
| {"library_name": "transformers", "tags": []} | text-generation | aidonuts/pernicious-001-ep2 | [
"transformers",
"safetensors",
"llama",
"text-generation",
"conversational",
"arxiv:1910.09700",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | 2024-02-13T09:21:33+00:00 | [
"1910.09700"
] | [] | TAGS
#transformers #safetensors #llama #text-generation #conversational #arxiv-1910.09700 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
# Model Card for Model ID
## Model Details
### Model Description
This is the model card of a transformers model that has been pushed to the Hub. This model card has been automatically generated.
- Developed by:
- Funded by [optional]:
- Shared by [optional]:
- Model type:
- Language(s) (NLP):
- License:
- Finetuned from model [optional]:
### Model Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Downstream Use [optional]
### Out-of-Scope Use
## Bias, Risks, and Limitations
### Recommendations
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
## Training Details
### Training Data
### Training Procedure
#### Preprocessing [optional]
#### Training Hyperparameters
- Training regime:
#### Speeds, Sizes, Times [optional]
## Evaluation
### Testing Data, Factors & Metrics
#### Testing Data
#### Factors
#### Metrics
### Results
#### Summary
## Model Examination [optional]
## Environmental Impact
Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).
- Hardware Type:
- Hours used:
- Cloud Provider:
- Compute Region:
- Carbon Emitted:
## Technical Specifications [optional]
### Model Architecture and Objective
### Compute Infrastructure
#### Hardware
#### Software
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Model Card Authors [optional]
## Model Card Contact
| [
"# Model Card for Model ID",
"## Model Details",
"### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:",
"### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Downstream Use [optional]",
"### Out-of-Scope Use",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.",
"## How to Get Started with the Model\n\nUse the code below to get started with the model.",
"## Training Details",
"### Training Data",
"### Training Procedure",
"#### Preprocessing [optional]",
"#### Training Hyperparameters\n\n- Training regime:",
"#### Speeds, Sizes, Times [optional]",
"## Evaluation",
"### Testing Data, Factors & Metrics",
"#### Testing Data",
"#### Factors",
"#### Metrics",
"### Results",
"#### Summary",
"## Model Examination [optional]",
"## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:",
"## Technical Specifications [optional]",
"### Model Architecture and Objective",
"### Compute Infrastructure",
"#### Hardware",
"#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Model Card Authors [optional]",
"## Model Card Contact"
] | [
"TAGS\n#transformers #safetensors #llama #text-generation #conversational #arxiv-1910.09700 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n",
"# Model Card for Model ID",
"## Model Details",
"### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:",
"### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Downstream Use [optional]",
"### Out-of-Scope Use",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.",
"## How to Get Started with the Model\n\nUse the code below to get started with the model.",
"## Training Details",
"### Training Data",
"### Training Procedure",
"#### Preprocessing [optional]",
"#### Training Hyperparameters\n\n- Training regime:",
"#### Speeds, Sizes, Times [optional]",
"## Evaluation",
"### Testing Data, Factors & Metrics",
"#### Testing Data",
"#### Factors",
"#### Metrics",
"### Results",
"#### Summary",
"## Model Examination [optional]",
"## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:",
"## Technical Specifications [optional]",
"### Model Architecture and Objective",
"### Compute Infrastructure",
"#### Hardware",
"#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Model Card Authors [optional]",
"## Model Card Contact"
] | [
60,
6,
3,
82,
28,
3,
4,
9,
9,
10,
42,
20,
3,
4,
5,
9,
11,
13,
3,
12,
5,
4,
5,
3,
4,
9,
53,
9,
8,
6,
3,
14,
8,
7,
9,
4
] | [
"passage: TAGS\n#transformers #safetensors #llama #text-generation #conversational #arxiv-1910.09700 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n# Model Card for Model ID## Model Details### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Downstream Use [optional]### Out-of-Scope Use## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.## How to Get Started with the Model\n\nUse the code below to get started with the model.## Training Details### Training Data### Training Procedure#### Preprocessing [optional]#### Training Hyperparameters\n\n- Training regime:#### Speeds, Sizes, Times [optional]## Evaluation### Testing Data, Factors & Metrics#### Testing Data#### Factors#### Metrics### Results#### Summary## Model Examination [optional]## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:## Technical Specifications [optional]### Model Architecture and Objective### Compute Infrastructure#### Hardware#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Model Card Authors [optional]## Model Card Contact"
] | [
-0.04654794931411743,
0.16618601977825165,
-0.005445904564112425,
0.01853804849088192,
0.0981811136007309,
0.011998992413282394,
0.06433123350143433,
0.11398410052061081,
-0.0230073444545269,
0.11406639218330383,
0.03047988750040531,
0.10172267258167267,
0.11317981779575348,
0.14841650426387787,
-0.002152352826669812,
-0.22403094172477722,
0.050844956189394,
-0.12105348706245422,
-0.033293843269348145,
0.11749980598688126,
0.1483822613954544,
-0.09928343445062637,
0.07274559140205383,
-0.029687678441405296,
-0.012143402360379696,
-0.030057786032557487,
-0.05890674889087677,
-0.046214159578084946,
0.04651786759495735,
0.06640566885471344,
0.06770290434360504,
0.0071083661168813705,
0.09012923389673233,
-0.2696533799171448,
0.018959321081638336,
0.07145345956087112,
-0.002759667346253991,
0.06957992166280746,
0.06404146552085876,
-0.07107418030500412,
0.10337356477975845,
-0.05106033384799957,
0.14650006592273712,
0.08365883678197861,
-0.09081148356199265,
-0.1895141303539276,
-0.08866965025663376,
0.09882009029388428,
0.17572562396526337,
0.04925641790032387,
-0.02320658043026924,
0.09761467576026917,
-0.08769196271896362,
0.015438909642398357,
0.04981724172830582,
-0.07620415836572647,
-0.05378096550703049,
0.05986575037240982,
0.07907199114561081,
0.06627275794744492,
-0.12434766441583633,
-0.02885502204298973,
0.005009706597775221,
0.010980482213199139,
0.0769270583987236,
0.01728810742497444,
0.146672785282135,
0.0338633768260479,
-0.12615777552127838,
-0.04880760237574577,
0.09869225323200226,
0.03395522013306618,
-0.04422314465045929,
-0.24749068915843964,
-0.03152675926685333,
-0.030810698866844177,
-0.029386121779680252,
-0.03716538846492767,
0.04340358078479767,
-0.007673026993870735,
0.08638741075992584,
-0.0060646249912679195,
-0.07403432577848434,
-0.03937075287103653,
0.06169692054390907,
0.0672287791967392,
0.02999979443848133,
-0.013745363801717758,
0.010938193649053574,
0.11620724946260452,
0.1095694974064827,
-0.12054188549518585,
-0.05555335059762001,
-0.06393084675073624,
-0.08656639605760574,
-0.040790557861328125,
0.034162238240242004,
0.03456587344408035,
0.05349370837211609,
0.25305667519569397,
0.015654386952519417,
0.059652652591466904,
0.034477248787879944,
0.007892133668065071,
0.05848940089344978,
0.11044429242610931,
-0.06018859148025513,
-0.10444226115942001,
-0.02648012898862362,
0.08843598514795303,
0.008199662901461124,
-0.03287925571203232,
-0.05088530853390694,
0.06019928678870201,
0.01946467161178589,
0.11926145106554031,
0.09061790257692337,
0.010536285117268562,
-0.07121123373508453,
-0.061038948595523834,
0.1891259253025055,
-0.16544590890407562,
0.04322727024555206,
0.035097137093544006,
-0.03903156518936157,
0.00019933005387429148,
0.013914269395172596,
0.016625655815005302,
-0.025983380153775215,
0.09017423540353775,
-0.054113563150167465,
-0.04145489260554314,
-0.11186197400093079,
-0.03383193537592888,
0.033762916922569275,
0.008953776210546494,
-0.035059962421655655,
-0.033713940531015396,
-0.08351044356822968,
-0.07577689737081528,
0.09320491552352905,
-0.07346344739198685,
-0.04878907650709152,
-0.01804324984550476,
-0.07530532777309418,
0.022395428270101547,
0.019394835457205772,
0.07707412540912628,
-0.02362251654267311,
0.04399976506829262,
-0.05189276114106178,
0.05863580107688904,
0.11207318305969238,
0.03570080175995827,
-0.05736649036407471,
0.06062258034944534,
-0.23834340274333954,
0.09552820026874542,
-0.07409077137708664,
0.05591456592082977,
-0.153293639421463,
-0.024439791217446327,
0.04788333550095558,
0.008784620091319084,
-0.009650949388742447,
0.13416339457035065,
-0.21702027320861816,
-0.02536402828991413,
0.1717337965965271,
-0.10057014971971512,
-0.07069246470928192,
0.05619903281331062,
-0.04835370555520058,
0.10988964140415192,
0.03825836628675461,
-0.025690359994769096,
0.06171267107129097,
-0.1267417073249817,
0.003717758459970355,
-0.05005312338471413,
-0.017048977315425873,
0.1548657864332199,
0.07182947546243668,
-0.07217690348625183,
0.07399354875087738,
0.025708531960844994,
-0.0246540866792202,
-0.04625825211405754,
-0.015164627693593502,
-0.10536660254001617,
0.014689887873828411,
-0.06369215250015259,
0.014470234513282776,
-0.020807426422834396,
-0.09071163833141327,
-0.027962757274508476,
-0.17504668235778809,
-0.03014434315264225,
0.08651752024888992,
-0.008693269453942776,
-0.01803150773048401,
-0.1178668737411499,
0.009341353550553322,
0.04177580401301384,
0.0061247628182172775,
-0.13462838530540466,
-0.04812471568584442,
0.02780051715672016,
-0.1600649207830429,
0.034652888774871826,
-0.05392369255423546,
0.04932025074958801,
0.025790516287088394,
-0.028889117762446404,
-0.026493212208151817,
0.021633783355355263,
0.005992184858769178,
-0.011999987065792084,
-0.24343903362751007,
-0.028118690475821495,
-0.024888472631573677,
0.1682123839855194,
-0.20917098224163055,
0.03546025976538658,
0.07867541164159775,
0.15366052091121674,
0.011240328662097454,
-0.04177491366863251,
0.005974748637527227,
-0.06935794651508331,
-0.02736494317650795,
-0.05875484645366669,
-0.0047869328409433365,
-0.03310677409172058,
-0.04545191675424576,
0.04568447172641754,
-0.16510973870754242,
-0.032636504620313644,
0.09776268899440765,
0.06289951503276825,
-0.13922683894634247,
-0.020621931180357933,
-0.03630133345723152,
-0.049253206700086594,
-0.04911839962005615,
-0.0605199858546257,
0.10893940925598145,
0.05891856551170349,
0.04574795812368393,
-0.05928509309887886,
-0.07568105310201645,
-0.001827909960411489,
-0.013898161239922047,
-0.017864689230918884,
0.09759635478258133,
0.0751434788107872,
-0.13251115381717682,
0.09224759042263031,
0.09603385627269745,
0.07919023185968399,
0.09113933145999908,
-0.02355697751045227,
-0.08261934667825699,
-0.045987509191036224,
0.031442027539014816,
0.020124373957514763,
0.13039541244506836,
-0.024294709786772728,
0.04352088272571564,
0.042134687304496765,
-0.019369594752788544,
0.014752166345715523,
-0.08687400817871094,
0.033972494304180145,
0.028472330421209335,
-0.016721390187740326,
0.050190530717372894,
-0.03876714035868645,
0.02440318465232849,
0.08830609917640686,
0.045322712510824203,
0.03507532551884651,
0.015493292361497879,
-0.05206458270549774,
-0.1083620935678482,
0.16405931115150452,
-0.12714070081710815,
-0.22483378648757935,
-0.13936103880405426,
0.0037376401014626026,
0.035628627985715866,
-0.015835661441087723,
0.002417160663753748,
-0.059374887496232986,
-0.12220635265111923,
-0.08858037739992142,
0.015140829607844353,
0.04942670464515686,
-0.09028962254524231,
-0.06437795609235764,
0.058117836713790894,
0.03889724239706993,
-0.14560972154140472,
0.017612040042877197,
0.04854894429445267,
-0.09789852797985077,
-0.006774199660867453,
0.08094939589500427,
0.0698540136218071,
0.1770169734954834,
0.017703235149383545,
-0.021850809454917908,
0.032354529947042465,
0.20614571869373322,
-0.13538233935832977,
0.11083246022462845,
0.13607586920261383,
-0.09041404724121094,
0.08072979003190994,
0.19951270520687103,
0.03932560607790947,
-0.10153959691524506,
0.031980328261852264,
0.02283124253153801,
-0.0284719280898571,
-0.24526868760585785,
-0.07212468236684799,
-0.004402178805321455,
-0.058010730892419815,
0.07660572230815887,
0.09286724030971527,
0.08215958625078201,
0.012304253876209259,
-0.09310996532440186,
-0.08154371380805969,
0.05942574888467789,
0.10367169976234436,
0.024584239348769188,
-0.010839897207915783,
0.08998730033636093,
-0.034100502729415894,
0.019626356661319733,
0.0853661298751831,
0.005239574704319239,
0.17840281128883362,
0.05159219726920128,
0.18830420076847076,
0.07925192266702652,
0.07219027727842331,
0.009912233799695969,
0.013080619275569916,
0.018877580761909485,
0.03300119563937187,
-0.002769160782918334,
-0.08440786600112915,
-0.02248465269804001,
0.11566436290740967,
0.06668911874294281,
0.010815348476171494,
0.015172341838479042,
-0.04104290530085564,
0.07965951412916183,
0.1831512451171875,
-0.007656289264559746,
-0.1783534437417984,
-0.057547420263290405,
0.07553383708000183,
-0.09879875183105469,
-0.09854305535554886,
-0.013454320840537548,
0.03072015568614006,
-0.17046253383159637,
0.023390959948301315,
-0.02239842526614666,
0.1106182336807251,
-0.14194999635219574,
-0.020490378141403198,
0.07218493521213531,
0.07199500501155853,
0.004729843698441982,
0.05758659541606903,
-0.16417601704597473,
0.10671813786029816,
0.008950476534664631,
0.06779605895280838,
-0.09610627591609955,
0.1008887067437172,
-0.004196076653897762,
-0.02063460275530815,
0.1393408179283142,
0.002700034761801362,
-0.06884108483791351,
-0.0763031542301178,
-0.08754398673772812,
-0.009632662869989872,
0.12754282355308533,
-0.1419651061296463,
0.08767123520374298,
-0.037212442606687546,
-0.0424150750041008,
-0.0017086371080949903,
-0.10206665843725204,
-0.11638247221708298,
-0.18888559937477112,
0.06001543253660202,
-0.13492922484874725,
0.03152317553758621,
-0.10799519717693329,
-0.032371897250413895,
-0.030304040759801865,
0.19337286055088043,
-0.23447458446025848,
-0.07199826091527939,
-0.1475764364004135,
-0.10233612358570099,
0.1443224400281906,
-0.0501345656812191,
0.08485390990972519,
-0.007241467013955116,
0.16846685111522675,
0.019060896709561348,
-0.02531743235886097,
0.0971490666270256,
-0.09173708409070969,
-0.19302815198898315,
-0.07869284600019455,
0.15662524104118347,
0.13260218501091003,
0.031680017709732056,
-0.002461588243022561,
0.036563750356435776,
-0.015421539545059204,
-0.11935004591941833,
0.015969349071383476,
0.1787186712026596,
0.06237189099192619,
0.02331034652888775,
-0.027346095070242882,
-0.11273157596588135,
-0.06900003552436829,
-0.028530338779091835,
0.03054865077137947,
0.17762407660484314,
-0.07057618349790573,
0.18207968771457672,
0.14163152873516083,
-0.05922834202647209,
-0.20400173962116241,
0.010538800619542599,
0.03055560030043125,
0.0009220078936778009,
0.02591954916715622,
-0.20123432576656342,
0.08688826113939285,
0.004683020059019327,
-0.05110127478837967,
0.13194532692432404,
-0.17217805981636047,
-0.14451217651367188,
0.0765485092997551,
0.038384392857551575,
-0.19559739530086517,
-0.12913893163204193,
-0.09174312651157379,
-0.045869920402765274,
-0.18591414391994476,
0.09569250047206879,
0.0305706188082695,
0.010893458500504494,
0.03030681423842907,
0.029179483652114868,
0.019487828016281128,
-0.0418255440890789,
0.18391458690166473,
-0.024792250245809555,
0.026594700291752815,
-0.08539514988660812,
-0.06927408277988434,
0.03743394836783409,
-0.052842434495687485,
0.07349982857704163,
-0.023486759513616562,
0.007861839607357979,
-0.10348054021596909,
-0.042148489505052567,
-0.03735732287168503,
0.015448716469109058,
-0.09657872468233109,
-0.08514349907636642,
-0.045032672584056854,
0.09675803780555725,
0.09690850973129272,
-0.033646680414676666,
-0.028050623834133148,
-0.07533035427331924,
0.04412057250738144,
0.19926515221595764,
0.1785389482975006,
0.042153384536504745,
-0.08034496754407883,
-0.004150947090238333,
-0.010121207684278488,
0.04310847446322441,
-0.20463712513446808,
0.06283636391162872,
0.05450061708688736,
0.01973269321024418,
0.11436162889003754,
-0.019565396010875702,
-0.15359151363372803,
-0.07263088971376419,
0.06303015351295471,
-0.060181066393852234,
-0.19620554149150848,
0.00867035984992981,
0.060603946447372437,
-0.16371412575244904,
-0.04535605385899544,
0.04643881320953369,
-0.005620351992547512,
-0.038163937628269196,
0.021896906197071075,
0.09194854646921158,
0.0026654244866222143,
0.07427921891212463,
0.05387866869568825,
0.0827430784702301,
-0.10537070035934448,
0.08090532571077347,
0.08839722722768784,
-0.08452684432268143,
0.023530138656497,
0.10478579998016357,
-0.059433579444885254,
-0.03440561518073082,
0.020135708153247833,
0.08153781294822693,
0.01775863952934742,
-0.040019966661930084,
0.013229827396571636,
-0.10452935844659805,
0.05954122915863991,
0.08839859813451767,
0.032507482916116714,
0.016702456399798393,
0.03425082191824913,
0.04607953503727913,
-0.07238735258579254,
0.12142276018857956,
0.031868141144514084,
0.017129309475421906,
-0.036505792289972305,
-0.040896978229284286,
0.019542274996638298,
-0.03214648738503456,
-0.005015232600271702,
-0.03023446537554264,
-0.07695909589529037,
-0.014793801121413708,
-0.1626158058643341,
-0.011131818406283855,
-0.05648450180888176,
0.010329355485737324,
0.03204665705561638,
-0.032609567046165466,
0.008124498650431633,
0.009250079281628132,
-0.07695289701223373,
-0.0663459524512291,
-0.020460480824112892,
0.09540658444166183,
-0.16213038563728333,
0.022481130436062813,
0.08244425803422928,
-0.12187694013118744,
0.09281346201896667,
0.016204802319407463,
-0.006236857734620571,
0.025038830935955048,
-0.1475188434123993,
0.034843120723962784,
-0.03386561945080757,
0.010836300440132618,
0.04373383894562721,
-0.21569781005382538,
-0.00004886732858722098,
-0.033673107624053955,
-0.06639216095209122,
-0.009451326914131641,
-0.03672455996274948,
-0.11508306115865707,
0.1058407872915268,
0.007236586883664131,
-0.08753558248281479,
-0.03186136856675148,
0.029325377196073532,
0.0838974118232727,
-0.021959776058793068,
0.15145497024059296,
-0.008370938710868359,
0.07429654151201248,
-0.16209737956523895,
-0.018623165786266327,
-0.006028574425727129,
0.022658247500658035,
-0.01664556935429573,
-0.01111356820911169,
0.044031109660863876,
-0.022746501490473747,
0.17925859987735748,
-0.030318550765514374,
0.02272745408117771,
0.06815794110298157,
0.019072026014328003,
-0.030184008181095123,
0.10406795144081116,
0.04094860330224037,
0.02014910988509655,
0.018591465428471565,
0.003289656015112996,
-0.04647882282733917,
-0.03173251822590828,
-0.19407226145267487,
0.07288651913404465,
0.15608493983745575,
0.09729263186454773,
-0.016707008704543114,
0.07954329252243042,
-0.10199416428804398,
-0.1109243705868721,
0.12477338314056396,
-0.04797708988189697,
-0.002418199321255088,
-0.07150927931070328,
0.13247236609458923,
0.1437523066997528,
-0.1859612911939621,
0.07269313186407089,
-0.0699717253446579,
-0.04708027467131615,
-0.10980689525604248,
-0.19441905617713928,
-0.05561789125204086,
-0.049456022679805756,
-0.016053348779678345,
-0.04698808491230011,
0.07504211366176605,
0.054538097232580185,
0.006766852922737598,
-0.0023397188633680344,
0.06506035476922989,
-0.031050674617290497,
-0.0037882844917476177,
0.032597362995147705,
0.06591679900884628,
0.012734474614262581,
-0.030802709981799126,
0.016619903966784477,
-0.013545602560043335,
0.045626189559698105,
0.06578011065721512,
0.04976864159107208,
-0.02938537672162056,
0.014603170566260815,
-0.038539156317710876,
-0.10249634087085724,
0.043612558394670486,
-0.024421939626336098,
-0.0789753645658493,
0.15477414429187775,
0.023680059239268303,
0.007779473438858986,
-0.020137663930654526,
0.23901568353176117,
-0.0738423764705658,
-0.0964353010058403,
-0.14737580716609955,
0.10557299107313156,
-0.038081806153059006,
0.05800395458936691,
0.04625935107469559,
-0.10226529091596603,
0.018044332042336464,
0.1338089406490326,
0.16182038187980652,
-0.039008259773254395,
0.020095856860280037,
0.031135575845837593,
0.00566398398950696,
-0.03622615709900856,
0.04847532883286476,
0.06906453520059586,
0.16569648683071136,
-0.04632584750652313,
0.09100406616926193,
0.0019041687482967973,
-0.09579581767320633,
-0.038361791521310806,
0.11069868505001068,
-0.016052277758717537,
0.019335128366947174,
-0.05818064883351326,
0.11742528527975082,
-0.06386786699295044,
-0.23783175647258759,
0.06453443318605423,
-0.0684293657541275,
-0.13765870034694672,
-0.02378307841718197,
0.08207765966653824,
-0.012955902144312859,
0.027587108314037323,
0.0730307325720787,
-0.07240920513868332,
0.201939657330513,
0.03798431158065796,
-0.05499868467450142,
-0.055047210305929184,
0.0805421993136406,
-0.10008571296930313,
0.2739645540714264,
0.01557221356779337,
0.04601577669382095,
0.10384146869182587,
-0.009341772645711899,
-0.13838784396648407,
0.019836371764540672,
0.09581108391284943,
-0.10502193123102188,
0.04196618124842644,
0.19815568625926971,
-0.0014755994779989123,
0.12389086186885834,
0.07657600939273834,
-0.07551808655261993,
0.0478031262755394,
-0.08054235577583313,
-0.06760486960411072,
-0.09260394424200058,
0.09703279286623001,
-0.07772123068571091,
0.14251399040222168,
0.13876807689666748,
-0.05074559152126312,
0.012724342755973339,
-0.031311117112636566,
0.044293127954006195,
-0.00010600237874314189,
0.10321761667728424,
0.004272161517292261,
-0.1832672357559204,
0.024692710489034653,
0.005650998093187809,
0.10749758034944534,
-0.16033467650413513,
-0.09566054493188858,
0.042343202978372574,
0.003505636239424348,
-0.0672195628285408,
0.1290110945701599,
0.05665452033281326,
0.04342988133430481,
-0.03997718170285225,
-0.03521440550684929,
-0.0060732318088412285,
0.13561366498470306,
-0.10713256150484085,
0.0009933578548952937
] |
null | null | transformers |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# distilbert-base-uncased-finetuned-emotion
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.2269
- Accuracy: 0.9215
- F1: 0.9216
## Model description
More information needed
## Intended uses & limitations
More information needed
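
For illustration, a minimal inference sketch: the repository id is the one this card is published under, but the emotion label names depend on the (undocumented) training dataset, so treat the output labels as unverified.

```python
# Hypothetical inference sketch; the label set is an assumption, since the
# training dataset is not documented in this card.
from transformers import pipeline

classifier = pipeline(
    "text-classification",
    model="gK29382231121/distilbert-base-uncased-finetuned-emotion",
)
print(classifier("I'm thrilled with how this turned out!"))
```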
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (see the sketch after this list):
- learning_rate: 2e-05
- train_batch_size: 64
- eval_batch_size: 64
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 2
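
A hedged reconstruction of these settings as `TrainingArguments`; `output_dir` is a placeholder, and the Adam betas/epsilon listed above match the library defaults, so they are not set explicitly.

```python
# Sketch of the listed hyperparameters; output_dir and anything not stated
# in this card are placeholders, not the values actually used.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="distilbert-base-uncased-finetuned-emotion",  # placeholder
    learning_rate=2e-5,
    per_device_train_batch_size=64,
    per_device_eval_batch_size=64,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=2,  # Adam betas=(0.9, 0.999), epsilon=1e-8 are the defaults
)
```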
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:------:|
| 0.8758 | 1.0 | 250 | 0.3253 | 0.905 | 0.9045 |
| 0.2571 | 2.0 | 500 | 0.2269 | 0.9215 | 0.9216 |
### Framework versions
- Transformers 4.37.2
- Pytorch 2.1.0+cu118
- Datasets 2.17.0
- Tokenizers 0.15.2
| {"license": "apache-2.0", "tags": ["generated_from_trainer"], "metrics": ["accuracy", "f1"], "base_model": "distilbert-base-uncased", "model-index": [{"name": "distilbert-base-uncased-finetuned-emotion", "results": []}]} | text-classification | gK29382231121/distilbert-base-uncased-finetuned-emotion | [
"transformers",
"safetensors",
"distilbert",
"text-classification",
"generated_from_trainer",
"base_model:distilbert-base-uncased",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | 2024-02-13T09:22:10+00:00 | [] | [] | TAGS
#transformers #safetensors #distilbert #text-classification #generated_from_trainer #base_model-distilbert-base-uncased #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us
| distilbert-base-uncased-finetuned-emotion
=========================================
This model is a fine-tuned version of distilbert-base-uncased on an unknown dataset.
It achieves the following results on the evaluation set:
* Loss: 0.2269
* Accuracy: 0.9215
* F1: 0.9216
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 2e-05
* train\_batch\_size: 64
* eval\_batch\_size: 64
* seed: 42
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* num\_epochs: 2
### Training results
### Framework versions
* Transformers 4.37.2
* Pytorch 2.1.0+cu118
* Datasets 2.17.0
* Tokenizers 0.15.2
| [
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 64\n* eval\\_batch\\_size: 64\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 2",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.37.2\n* Pytorch 2.1.0+cu118\n* Datasets 2.17.0\n* Tokenizers 0.15.2"
] | [
"TAGS\n#transformers #safetensors #distilbert #text-classification #generated_from_trainer #base_model-distilbert-base-uncased #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 64\n* eval\\_batch\\_size: 64\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 2",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.37.2\n* Pytorch 2.1.0+cu118\n* Datasets 2.17.0\n* Tokenizers 0.15.2"
] | [
68,
98,
4,
33
] | [
"passage: TAGS\n#transformers #safetensors #distilbert #text-classification #generated_from_trainer #base_model-distilbert-base-uncased #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 64\n* eval\\_batch\\_size: 64\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 2### Training results### Framework versions\n\n\n* Transformers 4.37.2\n* Pytorch 2.1.0+cu118\n* Datasets 2.17.0\n* Tokenizers 0.15.2"
] | [
-0.09214076399803162,
0.0979614183306694,
-0.0019159578951075673,
0.11677741259336472,
0.15470454096794128,
0.022053299471735954,
0.14488713443279266,
0.10451043397188187,
-0.07812443375587463,
0.038024142384529114,
0.1212630644440651,
0.13728956878185272,
0.0008458215743303299,
0.13793832063674927,
-0.08614712208509445,
-0.2184818685054779,
0.019058190286159515,
0.014677203260362148,
-0.03191200643777847,
0.1162770614027977,
0.10323835164308548,
-0.1202312782406807,
0.08896412700414658,
-0.0185630414634943,
-0.1688167303800583,
0.00728527270257473,
0.01783973164856434,
-0.05052690580487251,
0.12824097275733948,
0.02549724094569683,
0.1317271590232849,
0.02296377718448639,
0.09418884664773941,
-0.20642651617527008,
0.004607985727488995,
0.0508185438811779,
-0.006197625771164894,
0.0644790306687355,
0.017613844946026802,
-0.009341774508357048,
0.08116341382265091,
-0.08997412025928497,
0.06109797954559326,
0.02097882144153118,
-0.1266992837190628,
-0.202153742313385,
-0.0900067612528801,
0.040170591324567795,
0.09884747862815857,
0.08108542114496231,
-0.009479704312980175,
0.11558306962251663,
-0.07796543836593628,
0.08883481472730637,
0.20139087736606598,
-0.30668002367019653,
-0.0565466545522213,
0.04679226875305176,
0.013577079400420189,
0.07291530817747116,
-0.10038579255342484,
-0.0371166355907917,
0.0696059912443161,
0.023798270151019096,
0.11797206848859787,
-0.02216978743672371,
-0.09733173996210098,
-0.004047262016683817,
-0.14487309753894806,
-0.02200390212237835,
0.1676570177078247,
0.05182085186243057,
-0.058719903230667114,
-0.04312751442193985,
-0.06790058314800262,
-0.1367795616388321,
-0.039065927267074585,
-0.011378336697816849,
0.051360853016376495,
-0.017587725073099136,
-0.043126195669174194,
0.0039132884703576565,
-0.09182313829660416,
-0.07848374545574188,
-0.05674368515610695,
0.17115822434425354,
0.03966744616627693,
0.00008520644041709602,
0.0037859957665205,
0.10368739813566208,
-0.04828431084752083,
-0.13001713156700134,
0.0024943246971815825,
0.01279019471257925,
0.021307989954948425,
-0.05802799016237259,
-0.0622284822165966,
-0.03247525915503502,
0.023300660774111748,
0.1795789897441864,
-0.07102327793836594,
0.040589671581983566,
0.01668870821595192,
0.03611242398619652,
-0.09870550781488419,
0.15705940127372742,
-0.021799346432089806,
-0.03313063830137253,
0.032551586627960205,
0.07535580545663834,
0.05694076418876648,
-0.0008547616889700294,
-0.1212940365076065,
0.01714593730866909,
0.10487373918294907,
0.023250797763466835,
-0.08511919528245926,
0.07631529867649078,
-0.062191251665353775,
0.002985188737511635,
0.03964745253324509,
-0.09354182332754135,
0.023847438395023346,
0.0015657312469556928,
-0.04959145188331604,
-0.06546270102262497,
0.034631434828042984,
0.026652267202734947,
0.010448381304740906,
0.10484755784273148,
-0.07935138046741486,
0.004955092910677195,
-0.08735012263059616,
-0.11493679136037827,
0.012255971319973469,
-0.07650608569383621,
0.029728010296821594,
-0.11747612059116364,
-0.2066560536623001,
-0.00806803721934557,
0.04977551847696304,
-0.0176662839949131,
-0.03228681907057762,
-0.06534910202026367,
-0.07673638314008713,
0.013689016923308372,
-0.014676940627396107,
0.053186897188425064,
-0.07333176583051682,
0.09731674939393997,
0.04870256409049034,
0.06465727835893631,
-0.06462902575731277,
0.03900951147079468,
-0.11461776494979858,
0.02866794355213642,
-0.1840626299381256,
0.02812691032886505,
-0.06947609037160873,
0.06335465610027313,
-0.06365758180618286,
-0.08229292929172516,
0.008796009235084057,
0.00258540827780962,
0.06574788689613342,
0.10281989723443985,
-0.15724419057369232,
-0.05393987149000168,
0.16887424886226654,
-0.10897549986839294,
-0.1430080533027649,
0.12458840012550354,
-0.05949115753173828,
0.05348331108689308,
0.06716746091842651,
0.1709614396095276,
0.06512381136417389,
-0.08559641242027283,
-0.00771466875448823,
-0.006262487731873989,
0.055262211710214615,
-0.024853045120835304,
0.06482574343681335,
0.005822577513754368,
-0.02924882434308529,
0.021734854206442833,
-0.05786655843257904,
0.04910619929432869,
-0.0809270441532135,
-0.08352638781070709,
-0.05134519189596176,
-0.10622981190681458,
0.0676221027970314,
0.0432758666574955,
0.056164879351854324,
-0.11864395439624786,
-0.07968715578317642,
0.07402101904153824,
0.08584079891443253,
-0.06451117992401123,
0.015812337398529053,
-0.06584993749856949,
0.08230677247047424,
-0.03136461228132248,
-0.021882593631744385,
-0.14925643801689148,
-0.04833558201789856,
0.023942938074469566,
0.012751767411828041,
0.006064596585929394,
-0.029220785945653915,
0.06157534569501877,
0.09256212413311005,
-0.07394301146268845,
-0.039473436772823334,
-0.03080667369067669,
0.023670664057135582,
-0.11318852752447128,
-0.1861543208360672,
-0.014074863865971565,
-0.029918352141976357,
0.1542530357837677,
-0.2256270796060562,
0.05498193949460983,
-0.015190916135907173,
0.08109080791473389,
0.029732637107372284,
-0.00420091487467289,
-0.045256394892930984,
0.07905649393796921,
-0.048702068626880646,
-0.060924794524908066,
0.05453893169760704,
0.015046171844005585,
-0.08435575664043427,
-0.05522746965289116,
-0.12919583916664124,
0.18417905271053314,
0.1318083554506302,
-0.08065412938594818,
-0.08653907477855682,
-0.009186175651848316,
-0.04194572567939758,
-0.025824032723903656,
-0.05579899996519089,
0.004428796004503965,
0.12847299873828888,
-0.02294495701789856,
0.14836853742599487,
-0.08510530740022659,
-0.026717083528637886,
0.013088980689644814,
-0.05107207968831062,
0.022728661075234413,
0.09720800817012787,
0.0962570384144783,
-0.11428207904100418,
0.15248946845531464,
0.1836712509393692,
-0.09789538383483887,
0.12543493509292603,
-0.04495696723461151,
-0.046645112335681915,
-0.019139179959893227,
0.008072813972830772,
0.004492676816880703,
0.09831798821687698,
-0.11827641725540161,
0.013801555149257183,
0.0057557630352675915,
0.02191774733364582,
0.008722018450498581,
-0.2141803503036499,
-0.029996881261467934,
0.03774775564670563,
-0.0505577027797699,
-0.010182890109717846,
-0.0242917463183403,
-0.004888314288109541,
0.0947437733411789,
-0.006944174878299236,
-0.08895372599363327,
0.05426090210676193,
0.0009600659250281751,
-0.08358079195022583,
0.21123802661895752,
-0.1026676744222641,
-0.12062529474496841,
-0.13120132684707642,
-0.07182929664850235,
-0.047776468098163605,
0.038425564765930176,
0.07549307495355606,
-0.06770462542772293,
-0.045349303632974625,
-0.10675603151321411,
-0.0030329322908073664,
0.04457472264766693,
0.024106159806251526,
0.01529905665665865,
0.0049713170155882835,
0.07817046344280243,
-0.10301294177770615,
-0.019768137484788895,
-0.03791046142578125,
-0.06589019298553467,
0.0445077158510685,
0.021337611600756645,
0.109464131295681,
0.13613975048065186,
-0.030517717823386192,
-0.011033058166503906,
-0.030981717631220818,
0.23307958245277405,
-0.046609919518232346,
-0.020708365365862846,
0.13443200290203094,
-0.01899462379515171,
0.04898643121123314,
0.1416487842798233,
0.05496283620595932,
-0.10837315022945404,
0.03272866830229759,
0.027123738080263138,
-0.024334652349352837,
-0.20725086331367493,
-0.05109674856066704,
-0.03698455169796944,
-0.004293567035347223,
0.09096182137727737,
0.024653291329741478,
0.023254403844475746,
0.06664629280567169,
0.016793016344308853,
0.07658322900533676,
-0.0013634038623422384,
0.07973328232765198,
0.12888768315315247,
0.04218679293990135,
0.12255778163671494,
-0.03980773314833641,
-0.05119454488158226,
0.03769850358366966,
-0.01519042532891035,
0.20039570331573486,
0.021510079503059387,
0.10962718725204468,
0.05477299913764,
0.15788771212100983,
-0.000005086020792077761,
0.07481946051120758,
-0.0008891391917131841,
-0.03857406601309776,
-0.014038004912436008,
-0.04979623109102249,
-0.04928017407655716,
0.043557219207286835,
-0.11163098365068436,
0.0709911361336708,
-0.12328408658504486,
0.023553455248475075,
0.06474108248949051,
0.24025294184684753,
0.05062933638691902,
-0.32430562376976013,
-0.10175544768571854,
0.02860475331544876,
-0.02434716373682022,
-0.028740564361214638,
0.036968786269426346,
0.10694535076618195,
-0.05744660645723343,
0.041102975606918335,
-0.048659298568964005,
0.07913398742675781,
-0.02479924075305462,
0.04320352524518967,
0.047373417764902115,
0.0798698216676712,
-0.00619714567437768,
0.06900984048843384,
-0.27010631561279297,
0.25715765357017517,
0.009235536679625511,
0.07927612960338593,
-0.0400768406689167,
0.003979280591011047,
0.039521969854831696,
0.11349084228277206,
0.07360760867595673,
-0.013267054222524166,
-0.06425933539867401,
-0.19786705076694489,
-0.04955010861158371,
0.026334794238209724,
0.09000588208436966,
-0.028815478086471558,
0.1006225049495697,
-0.03952106833457947,
0.004507935140281916,
0.08376539498567581,
-0.016329079866409302,
-0.09566224366426468,
-0.08893327414989471,
-0.028915688395500183,
0.04218122363090515,
0.010862396098673344,
-0.08830466866493225,
-0.09212386608123779,
-0.11568295210599899,
0.15349619090557098,
-0.05692629516124725,
-0.035698696970939636,
-0.09592626988887787,
0.047443944960832596,
0.04849888011813164,
-0.0720510482788086,
0.06382066756486893,
0.011485248804092407,
0.08627290278673172,
0.01745595410466194,
-0.051660582423210144,
0.11781322211027145,
-0.08134134858846664,
-0.18784987926483154,
-0.07367781549692154,
0.0983370989561081,
0.018516508862376213,
0.039059922099113464,
0.0032695357222110033,
0.012979644350707531,
-0.013540918007493019,
-0.08632734417915344,
0.006875602062791586,
0.01746967062354088,
0.07119587063789368,
0.04973205551505089,
-0.08494739234447479,
-0.015491790138185024,
-0.04804697260260582,
-0.02960202284157276,
0.15938697755336761,
0.2920039892196655,
-0.08676372468471527,
0.005889860447496176,
0.059445977210998535,
-0.06127595156431198,
-0.20488926768302917,
0.024864673614501953,
0.032883547246456146,
-0.0007841621991246939,
0.034910958260297775,
-0.1413918137550354,
0.12157493829727173,
0.11655659228563309,
-0.025476494804024696,
0.09957773238420486,
-0.27131298184394836,
-0.12977859377861023,
0.13092398643493652,
0.14874666929244995,
0.13824641704559326,
-0.14406466484069824,
-0.026816414669156075,
-0.0502212718129158,
-0.12475377321243286,
0.10305912792682648,
-0.11244819313287735,
0.10987017303705215,
-0.017176449298858643,
0.05046897381544113,
0.0005860378150828183,
-0.047367531806230545,
0.1313169151544571,
0.007854793220758438,
0.120839424431324,
-0.06417819112539291,
-0.014716201461851597,
0.039751894772052765,
-0.05900077894330025,
0.030573226511478424,
-0.10700673609972,
0.05103655159473419,
-0.06159425154328346,
-0.028948063030838966,
-0.03949490189552307,
0.043421901762485504,
-0.037870362401008606,
-0.06767034530639648,
-0.03659960627555847,
0.022457130253314972,
0.04956521466374397,
-0.012726273387670517,
0.13027538359165192,
0.01757558062672615,
0.14754195511341095,
0.1261582374572754,
0.0702800303697586,
-0.07614850252866745,
0.0010974399046972394,
-0.0030122329480946064,
-0.03857323154807091,
0.06167798489332199,
-0.141588494181633,
0.04373874515295029,
0.11531568318605423,
0.014958188869059086,
0.15552687644958496,
0.07861645519733429,
-0.006360987201333046,
0.0097619304433465,
0.06634658575057983,
-0.16642045974731445,
-0.07190676033496857,
-0.0033139213919639587,
-0.026879116892814636,
-0.1084543988108635,
0.06274615973234177,
0.11063331365585327,
-0.07671200484037399,
0.005450420081615448,
-0.019215324893593788,
0.018530724570155144,
-0.04258790984749794,
0.16436834633350372,
0.06311806291341782,
0.04637830704450607,
-0.0836443305015564,
0.09027465432882309,
0.04548586905002594,
-0.05095353350043297,
0.009000384248793125,
0.03195412456989288,
-0.09705367684364319,
-0.046525988727808,
0.04999262094497681,
0.17941373586654663,
-0.042945001274347305,
-0.058036722242832184,
-0.1334039568901062,
-0.12153556942939758,
0.05331060662865639,
0.1824653595685959,
0.10824401676654816,
0.021416449919342995,
-0.02737431414425373,
0.013141822069883347,
-0.11510904133319855,
0.10598357766866684,
0.031135912984609604,
0.08891858160495758,
-0.15357623994350433,
0.10911022126674652,
-0.00472806254401803,
-0.0016454639844596386,
-0.02305869199335575,
0.04790397733449936,
-0.11724685877561569,
-0.00757031561806798,
-0.1330975741147995,
-0.0005471805925481021,
-0.027942528948187828,
0.019130147993564606,
0.007290527690201998,
-0.05042346939444542,
-0.052923038601875305,
0.01999559812247753,
-0.09210266172885895,
-0.019435787573456764,
0.035591479390859604,
0.06794250011444092,
-0.12437935173511505,
-0.04514829441905022,
0.02950088307261467,
-0.0759221538901329,
0.06846272200345993,
0.036417052149772644,
0.023973088711500168,
0.05127771943807602,
-0.19254566729068756,
0.017268402501940727,
0.07711680978536606,
0.012443716637790203,
0.04075251892209053,
-0.10175126045942307,
-0.009981688112020493,
0.00370059534907341,
0.03110189363360405,
0.022608347237110138,
0.09324097633361816,
-0.12957952916622162,
0.010288229212164879,
-0.01953965239226818,
-0.06297441571950912,
-0.04977572709321976,
0.007553390227258205,
0.10603775084018707,
-0.010748830623924732,
0.2087804526090622,
-0.10074779391288757,
0.008513232693076134,
-0.19043485820293427,
0.0028507986571639776,
-0.00682100560516119,
-0.10983011871576309,
-0.15105055272579193,
-0.05374665930867195,
0.039942290633916855,
-0.046963226050138474,
0.1539946347475052,
-0.00037106420495547354,
0.02421722188591957,
0.03181363642215729,
-0.04112671688199043,
0.043747272342443466,
0.026265693828463554,
0.23860928416252136,
0.03491399064660072,
-0.04554495960474014,
0.01766023226082325,
0.026655901223421097,
0.11685533076524734,
0.04935058578848839,
0.1647305190563202,
0.16805708408355713,
-0.06031470373272896,
0.09855034947395325,
0.03573150187730789,
-0.054694171994924545,
-0.13715310394763947,
0.04555597901344299,
-0.0316079705953598,
0.08574938029050827,
-0.021146783605217934,
0.20128761231899261,
0.07932665944099426,
-0.16052095592021942,
0.013387076556682587,
-0.05209862440824509,
-0.07747078686952591,
-0.1095784604549408,
-0.023137392476201057,
-0.10212947428226471,
-0.1617632806301117,
0.0036947058979421854,
-0.11969462037086487,
0.005187069531530142,
0.09442558139562607,
-0.007101317867636681,
-0.012987051159143448,
0.16529299318790436,
-0.008882135152816772,
0.03783179074525833,
0.05063353106379509,
-0.007684787269681692,
-0.0422668494284153,
-0.08070237934589386,
-0.10810684412717819,
0.008226203732192516,
-0.026384111493825912,
0.024477597326040268,
-0.04620185121893883,
-0.027624431997537613,
0.037447139620780945,
-0.007048159837722778,
-0.09568671882152557,
0.01625915616750717,
0.030924096703529358,
0.04458214342594147,
0.04817992448806763,
0.012992224656045437,
0.011016524396836758,
0.013765628449618816,
0.2159663736820221,
-0.07297251373529434,
-0.08747649192810059,
-0.09877234697341919,
0.24086931347846985,
0.05073096230626106,
0.025288833305239677,
0.018779728561639786,
-0.09202266484498978,
0.03020375221967697,
0.1923431009054184,
0.16448107361793518,
-0.07641180604696274,
0.004659534897655249,
-0.018482740968465805,
-0.014931896701455116,
-0.03326602280139923,
0.08958589285612106,
0.12395880371332169,
0.0055262125097215176,
-0.06493004411458969,
-0.046414244920015335,
-0.0390622578561306,
-0.005720081739127636,
-0.047878116369247437,
0.04908059164881706,
0.015858177095651627,
0.0017257190775126219,
-0.04886384680867195,
0.052335165441036224,
-0.0371059849858284,
-0.08982816338539124,
0.06479167193174362,
-0.1871408224105835,
-0.14439533650875092,
-0.011851496994495392,
0.10194070637226105,
-0.004701440688222647,
0.047204479575157166,
-0.03417659550905228,
-0.011543585918843746,
0.08302055299282074,
-0.029469963163137436,
-0.06075419485569,
-0.07542803883552551,
0.0506351999938488,
-0.0743495374917984,
0.23618391156196594,
-0.030116187408566475,
0.06092466786503792,
0.12346386164426804,
0.049916960299015045,
-0.07423224300146103,
0.1103966236114502,
0.04240882396697998,
-0.06746746599674225,
0.03339756280183792,
0.06872401386499405,
-0.050298020243644714,
0.12936265766620636,
0.05169029161334038,
-0.1547669768333435,
0.019099464640021324,
-0.016349265351891518,
-0.10140876471996307,
-0.05914544686675072,
-0.037042200565338135,
-0.059878911823034286,
0.13481488823890686,
0.18942180275917053,
-0.039078883826732635,
0.010014832019805908,
-0.044162411242723465,
0.03858164697885513,
0.06833138316869736,
0.028328748419880867,
-0.03017929196357727,
-0.23593519628047943,
0.0350717157125473,
0.09331156313419342,
-0.005604958161711693,
-0.26071348786354065,
-0.08879108726978302,
-0.012590361759066582,
-0.047186125069856644,
-0.09942891448736191,
0.08563127368688583,
0.11023639887571335,
0.04809743911027908,
-0.059514377266168594,
-0.10986161231994629,
-0.07993253320455551,
0.16136842966079712,
-0.11848853528499603,
-0.11030925065279007
] |
null | null | transformers |
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
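In the meantime, the following is a minimal, unofficial sketch of a quickstart. It assumes the standard 🤗 `transformers` text-generation API, takes the repo id and `gpt2` architecture from this card's metadata, and uses a purely illustrative prompt.

```python
# A hypothetical quickstart sketch, not provided by the model authors.
from transformers import AutoTokenizer, AutoModelForCausalLM

model_id = "Turtle344/Myanmar_GPT_finetuned_health_qa"  # repo id from the card metadata
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

prompt = "ကျန်းမာရေးမေးခွန်း: "  # illustrative Burmese prompt; replace with your own
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64, do_sample=True, top_p=0.9)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```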
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] | {"language": ["my"], "library_name": "transformers"} | text-generation | Turtle344/Myanmar_GPT_finetuned_health_qa | [
"transformers",
"safetensors",
"gpt2",
"text-generation",
"my",
"arxiv:1910.09700",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | 2024-02-13T09:24:59+00:00 | [
"1910.09700"
] | [
"my"
] | TAGS
#transformers #safetensors #gpt2 #text-generation #my #arxiv-1910.09700 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
# Model Card for Model ID
## Model Details
### Model Description
This is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.
- Developed by:
- Funded by [optional]:
- Shared by [optional]:
- Model type:
- Language(s) (NLP):
- License:
- Finetuned from model [optional]:
### Model Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Downstream Use [optional]
### Out-of-Scope Use
## Bias, Risks, and Limitations
### Recommendations
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
## Training Details
### Training Data
### Training Procedure
#### Preprocessing [optional]
#### Training Hyperparameters
- Training regime:
#### Speeds, Sizes, Times [optional]
## Evaluation
### Testing Data, Factors & Metrics
#### Testing Data
#### Factors
#### Metrics
### Results
#### Summary
## Model Examination [optional]
## Environmental Impact
Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).
- Hardware Type:
- Hours used:
- Cloud Provider:
- Compute Region:
- Carbon Emitted:
## Technical Specifications [optional]
### Model Architecture and Objective
### Compute Infrastructure
#### Hardware
#### Software
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Model Card Authors [optional]
## Model Card Contact
| [
"# Model Card for Model ID",
"## Model Details",
"### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:",
"### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Downstream Use [optional]",
"### Out-of-Scope Use",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.",
"## How to Get Started with the Model\n\nUse the code below to get started with the model.",
"## Training Details",
"### Training Data",
"### Training Procedure",
"#### Preprocessing [optional]",
"#### Training Hyperparameters\n\n- Training regime:",
"#### Speeds, Sizes, Times [optional]",
"## Evaluation",
"### Testing Data, Factors & Metrics",
"#### Testing Data",
"#### Factors",
"#### Metrics",
"### Results",
"#### Summary",
"## Model Examination [optional]",
"## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:",
"## Technical Specifications [optional]",
"### Model Architecture and Objective",
"### Compute Infrastructure",
"#### Hardware",
"#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Model Card Authors [optional]",
"## Model Card Contact"
] | [
"TAGS\n#transformers #safetensors #gpt2 #text-generation #my #arxiv-1910.09700 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n",
"# Model Card for Model ID",
"## Model Details",
"### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:",
"### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Downstream Use [optional]",
"### Out-of-Scope Use",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.",
"## How to Get Started with the Model\n\nUse the code below to get started with the model.",
"## Training Details",
"### Training Data",
"### Training Procedure",
"#### Preprocessing [optional]",
"#### Training Hyperparameters\n\n- Training regime:",
"#### Speeds, Sizes, Times [optional]",
"## Evaluation",
"### Testing Data, Factors & Metrics",
"#### Testing Data",
"#### Factors",
"#### Metrics",
"### Results",
"#### Summary",
"## Model Examination [optional]",
"## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:",
"## Technical Specifications [optional]",
"### Model Architecture and Objective",
"### Compute Infrastructure",
"#### Hardware",
"#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Model Card Authors [optional]",
"## Model Card Contact"
] | [
59,
6,
3,
82,
28,
3,
4,
9,
9,
10,
42,
20,
3,
4,
5,
9,
11,
13,
3,
12,
5,
4,
5,
3,
4,
9,
53,
9,
8,
6,
3,
14,
8,
7,
9,
4
] | [
"passage: TAGS\n#transformers #safetensors #gpt2 #text-generation #my #arxiv-1910.09700 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n# Model Card for Model ID## Model Details### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Downstream Use [optional]### Out-of-Scope Use## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.## How to Get Started with the Model\n\nUse the code below to get started with the model.## Training Details### Training Data### Training Procedure#### Preprocessing [optional]#### Training Hyperparameters\n\n- Training regime:#### Speeds, Sizes, Times [optional]## Evaluation### Testing Data, Factors & Metrics#### Testing Data#### Factors#### Metrics### Results#### Summary## Model Examination [optional]## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:## Technical Specifications [optional]### Model Architecture and Objective### Compute Infrastructure#### Hardware#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Model Card Authors [optional]## Model Card Contact"
] | [
-0.050027936697006226,
0.16474796831607819,
-0.005256314761936665,
0.023480162024497986,
0.09612338244915009,
0.015085194259881973,
0.06767778098583221,
0.11168179661035538,
-0.017857782542705536,
0.11609584838151932,
0.03267510235309601,
0.09662002325057983,
0.11471638083457947,
0.1502790004014969,
-0.00046198334894143045,
-0.2294265478849411,
0.0488833524286747,
-0.12625961005687714,
-0.03868892416357994,
0.11614766716957092,
0.14925898611545563,
-0.09960314631462097,
0.07651636004447937,
-0.031067850068211555,
-0.00693482393398881,
-0.03363160043954849,
-0.058763593435287476,
-0.047068364918231964,
0.048381462693214417,
0.07101240009069443,
0.06409071385860443,
0.004078144673258066,
0.09360705316066742,
-0.26812097430229187,
0.01946207880973816,
0.07168101519346237,
-0.0035911027807742357,
0.07360919564962387,
0.05811300873756409,
-0.075057253241539,
0.09645725041627884,
-0.05185258761048317,
0.14562873542308807,
0.08199485391378403,
-0.09359484910964966,
-0.1937980204820633,
-0.09161742031574249,
0.10345915704965591,
0.18180930614471436,
0.046683937311172485,
-0.02396625094115734,
0.09682846814393997,
-0.08748012781143188,
0.009899373166263103,
0.054137662053108215,
-0.06782261282205582,
-0.05273483321070671,
0.06359346956014633,
0.08039257675409317,
0.0737549439072609,
-0.12343647330999374,
-0.022005517035722733,
0.007054837886244059,
0.008627086877822876,
0.08273728936910629,
0.0220766793936491,
0.15205468237400055,
0.03948210924863815,
-0.12660613656044006,
-0.05027563497424126,
0.10404495894908905,
0.04285689443349838,
-0.047203999012708664,
-0.25151509046554565,
-0.028758972883224487,
-0.02873173914849758,
-0.029612528160214424,
-0.039311666041612625,
0.043947868049144745,
-0.0075396522879600525,
0.08358395099639893,
-0.009502860717475414,
-0.07541881501674652,
-0.04042796790599823,
0.06448647379875183,
0.06153983622789383,
0.026783280074596405,
-0.012019796296954155,
0.011285427957773209,
0.11826885491609573,
0.10342902690172195,
-0.12343187630176544,
-0.055625107139348984,
-0.06324746459722519,
-0.08156139403581619,
-0.04220743477344513,
0.037431750446558,
0.042773302644491196,
0.04932383820414543,
0.2501077651977539,
0.013180410489439964,
0.05480695143342018,
0.038909513503313065,
0.00957733765244484,
0.06348294764757156,
0.11560738831758499,
-0.060599759221076965,
-0.09657138586044312,
-0.027744242921471596,
0.08652771264314651,
0.01120567787438631,
-0.0377381257712841,
-0.055390141904354095,
0.06218383088707924,
0.015383407473564148,
0.12091340869665146,
0.09377853572368622,
0.004842707887291908,
-0.07228046655654907,
-0.06479840725660324,
0.19804774224758148,
-0.16037599742412567,
0.047879066318273544,
0.03616773709654808,
-0.04011683911085129,
-0.006215163506567478,
0.01000446081161499,
0.02223590575158596,
-0.02301623858511448,
0.08893629908561707,
-0.055309731513261795,
-0.04036511853337288,
-0.10809312015771866,
-0.032323118299245834,
0.03398643434047699,
0.010111612267792225,
-0.033305007964372635,
-0.03133244439959526,
-0.08380616456270218,
-0.07147164642810822,
0.09519615024328232,
-0.07421132177114487,
-0.05262557417154312,
-0.017739402130246162,
-0.07255282253026962,
0.025211334228515625,
0.019831139594316483,
0.07846344262361526,
-0.021520689129829407,
0.04264945909380913,
-0.05563519522547722,
0.05962551757693291,
0.11153828352689743,
0.03383200988173485,
-0.055622316896915436,
0.06171237677335739,
-0.24270670115947723,
0.09758982062339783,
-0.0710858479142189,
0.053941115736961365,
-0.15170030295848846,
-0.026066219434142113,
0.04978783056139946,
0.006414515431970358,
-0.013322651386260986,
0.13732922077178955,
-0.21359477937221527,
-0.027973202988505363,
0.1669851392507553,
-0.09585407376289368,
-0.07805746793746948,
0.05799942463636398,
-0.05055795609951019,
0.10654383897781372,
0.04082268103957176,
-0.025040464475750923,
0.06222570687532425,
-0.13479861617088318,
-0.0014198875287547708,
-0.047894641757011414,
-0.02079744078218937,
0.15788690745830536,
0.0774504542350769,
-0.07105650007724762,
0.07653696835041046,
0.023984640836715698,
-0.027575641870498657,
-0.045474857091903687,
-0.016135215759277344,
-0.10837390273809433,
0.010876641608774662,
-0.06308722496032715,
0.01784658245742321,
-0.023467710241675377,
-0.09289564937353134,
-0.029393047094345093,
-0.17648854851722717,
-0.026166146621108055,
0.08561204373836517,
-0.009356572292745113,
-0.01821500062942505,
-0.11507151275873184,
0.01334168016910553,
0.03697510063648224,
0.003883840050548315,
-0.12995710968971252,
-0.05271145701408386,
0.027370581403374672,
-0.16429691016674042,
0.034567296504974365,
-0.05636884272098541,
0.050430186092853546,
0.029958050698041916,
-0.033219922333955765,
-0.029138339683413506,
0.019550498574972153,
0.0051649450324475765,
-0.011968139559030533,
-0.2456217110157013,
-0.027244921773672104,
-0.02145005576312542,
0.16947484016418457,
-0.21498709917068481,
0.03720460087060928,
0.07385016232728958,
0.1509069949388504,
0.00899495743215084,
-0.04162772372364998,
0.006179061718285084,
-0.07684190571308136,
-0.029944073408842087,
-0.06235410273075104,
-0.007475883234292269,
-0.03604045882821083,
-0.05323837324976921,
0.04966186732053757,
-0.16728930175304413,
-0.03148381784558296,
0.09977582842111588,
0.06474105268716812,
-0.13687220215797424,
-0.01833443157374859,
-0.038453057408332825,
-0.04376931115984917,
-0.0542934350669384,
-0.057917725294828415,
0.10512920469045639,
0.05703093111515045,
0.043937187641859055,
-0.06442828476428986,
-0.07887491583824158,
0.0019488829420879483,
-0.01839604601264,
-0.021913951262831688,
0.09501618891954422,
0.07625814527273178,
-0.126447856426239,
0.09389124810695648,
0.10223960131406784,
0.0893523097038269,
0.09571781009435654,
-0.02223181538283825,
-0.08475581556558609,
-0.05032327398657799,
0.026346847414970398,
0.018355706706643105,
0.1345573216676712,
-0.01452932320535183,
0.05123843252658844,
0.04170922935009003,
-0.013238015584647655,
0.011355235241353512,
-0.090174101293087,
0.030547333881258965,
0.0318179652094841,
-0.020568078383803368,
0.040081366896629333,
-0.03946930542588234,
0.020224466919898987,
0.08842084556818008,
0.048865120857954025,
0.043731577694416046,
0.014730525203049183,
-0.04688551276922226,
-0.11232394725084305,
0.1658431440591812,
-0.12630289793014526,
-0.23393084108829498,
-0.1414857655763626,
0.0012103788321837783,
0.03697152063250542,
-0.0101776709780097,
0.002784519223496318,
-0.06655111163854599,
-0.11821088194847107,
-0.08970573544502258,
0.013177535496652126,
0.04651108756661415,
-0.08744523674249649,
-0.062383223325014114,
0.05917976796627045,
0.03988218307495117,
-0.14535534381866455,
0.017937207594513893,
0.04890121892094612,
-0.09446868300437927,
-0.008470512926578522,
0.08386951684951782,
0.06891493499279022,
0.17969219386577606,
0.015723396092653275,
-0.020486973226070404,
0.033630773425102234,
0.21424317359924316,
-0.13868694007396698,
0.11445405334234238,
0.13941779732704163,
-0.08899342268705368,
0.08465798199176788,
0.1972147673368454,
0.040454063564538956,
-0.10250882804393768,
0.032274406403303146,
0.018989186733961105,
-0.03012417070567608,
-0.24601931869983673,
-0.07199491560459137,
-0.0009032037341967225,
-0.05718288570642471,
0.07615386694669724,
0.09231015294790268,
0.09016262739896774,
0.015849417075514793,
-0.09527456015348434,
-0.0800856426358223,
0.05403657630085945,
0.10447722673416138,
0.01403232291340828,
-0.011341944336891174,
0.08738230913877487,
-0.03221329301595688,
0.019152658060193062,
0.09192562848329544,
0.0015805772272869945,
0.17290568351745605,
0.05680093541741371,
0.18587352335453033,
0.07768648862838745,
0.07139948755502701,
0.010344353504478931,
0.010746655985713005,
0.01986384019255638,
0.028385909274220467,
-0.003948506433516741,
-0.08578839153051376,
-0.0125935273244977,
0.11948346346616745,
0.0718657448887825,
0.013354934751987457,
0.014085279777646065,
-0.043420031666755676,
0.0822971761226654,
0.17455342411994934,
-0.0016380405286327004,
-0.18000802397727966,
-0.06300204992294312,
0.0823143944144249,
-0.09416370838880539,
-0.10087089985609055,
-0.02497442439198494,
0.027849232777953148,
-0.17399466037750244,
0.024642182514071465,
-0.017257755622267723,
0.11150366067886353,
-0.14095109701156616,
-0.020777858793735504,
0.065652996301651,
0.06895755976438522,
0.001703139510937035,
0.061374157667160034,
-0.1580723077058792,
0.10568492114543915,
0.012128042988479137,
0.0679168626666069,
-0.09677726030349731,
0.10052824020385742,
-0.005584191530942917,
-0.01643131859600544,
0.13733883202075958,
0.00983541738241911,
-0.07105584442615509,
-0.07991203665733337,
-0.09464412927627563,
-0.009314688853919506,
0.12753425538539886,
-0.15136787295341492,
0.08518610894680023,
-0.032676421105861664,
-0.043905675411224365,
-0.00011145674216095358,
-0.1081976369023323,
-0.12278328090906143,
-0.18657366931438446,
0.05838552117347717,
-0.13678398728370667,
0.03979263827204704,
-0.10717664659023285,
-0.03283962607383728,
-0.03160460293292999,
0.19279107451438904,
-0.22879460453987122,
-0.07042109966278076,
-0.15047919750213623,
-0.09862979501485825,
0.1445441097021103,
-0.0517747737467289,
0.08496206998825073,
-0.0056128366850316525,
0.17687392234802246,
0.021953562274575233,
-0.022166036069393158,
0.09915217012166977,
-0.0935419574379921,
-0.1971602737903595,
-0.08049463480710983,
0.15944358706474304,
0.1358121782541275,
0.03507259860634804,
-0.0037945411168038845,
0.037499669939279556,
-0.018789149820804596,
-0.12417293339967728,
0.0241480004042387,
0.1768360286951065,
0.067517951130867,
0.024204043671488762,
-0.02292693220078945,
-0.10995736718177795,
-0.0674448236823082,
-0.03357062116265297,
0.028750281780958176,
0.1812824010848999,
-0.07155410945415497,
0.1883368045091629,
0.1448865830898285,
-0.05865113437175751,
-0.19569028913974762,
0.009250068105757236,
0.032289594411849976,
0.003142351284623146,
0.03268567845225334,
-0.20322321355342865,
0.08684394508600235,
0.0020255313720554113,
-0.05320986360311508,
0.1383202224969864,
-0.1694873869419098,
-0.1494223177433014,
0.07170457392930984,
0.03663245588541031,
-0.19858869910240173,
-0.12217120081186295,
-0.09244102239608765,
-0.049983181059360504,
-0.18514080345630646,
0.10299309343099594,
0.032166171818971634,
0.008180810138583183,
0.031135594472289085,
0.027528658509254456,
0.01635144278407097,
-0.04037967696785927,
0.19183005392551422,
-0.026670929044485092,
0.027661850675940514,
-0.08582589030265808,
-0.07177740335464478,
0.045644015073776245,
-0.056142788380384445,
0.07827942073345184,
-0.02500017359852791,
0.00978175550699234,
-0.10604862868785858,
-0.04276496171951294,
-0.03224313259124756,
0.014153984375298023,
-0.0957200676202774,
-0.08862278610467911,
-0.0503210574388504,
0.09521231800317764,
0.09585000574588776,
-0.034942321479320526,
-0.03485199809074402,
-0.06989708542823792,
0.04093492403626442,
0.1845279484987259,
0.1762910634279251,
0.043463703244924545,
-0.08190630376338959,
-0.007321304641664028,
-0.010983508080244064,
0.04218963533639908,
-0.21721041202545166,
0.06483815610408783,
0.049478679895401,
0.018266484141349792,
0.11753053218126297,
-0.01990492083132267,
-0.15549395978450775,
-0.06976567208766937,
0.062165405601263046,
-0.05942438170313835,
-0.19788597524166107,
0.004264907445758581,
0.050639912486076355,
-0.1677471399307251,
-0.046875931322574615,
0.042514264583587646,
-0.0034587772097438574,
-0.040564294904470444,
0.01941085048019886,
0.09103266894817352,
0.0037896670401096344,
0.06964824348688126,
0.058282677084207535,
0.08317959308624268,
-0.10298577696084976,
0.07800132781267166,
0.08795404434204102,
-0.08119598031044006,
0.02621092088520527,
0.0963074192404747,
-0.0579676516354084,
-0.03074178658425808,
0.027504565194249153,
0.08081906288862228,
0.01724325492978096,
-0.04197880998253822,
0.012430759146809578,
-0.10100654512643814,
0.06343277543783188,
0.08504259586334229,
0.03228013589978218,
0.015797702595591545,
0.033924560993909836,
0.04932043328881264,
-0.06969481706619263,
0.12132792919874191,
0.029456952586770058,
0.017248518764972687,
-0.04189243167638779,
-0.050019770860672,
0.024869205430150032,
-0.027694858610630035,
-0.005530920810997486,
-0.03501863777637482,
-0.07428453117609024,
-0.017179444432258606,
-0.1635059416294098,
-0.014649666845798492,
-0.05030691623687744,
0.009999682195484638,
0.02838853746652603,
-0.037808794528245926,
0.009258810430765152,
0.008937002159655094,
-0.07566045969724655,
-0.06705565005540848,
-0.023068036884069443,
0.09430728852748871,
-0.16433237493038177,
0.022471176460385323,
0.08655921369791031,
-0.12020260840654373,
0.09458251297473907,
0.019944947212934494,
-0.003488474991172552,
0.026671748608350754,
-0.15260228514671326,
0.03862147778272629,
-0.03288261964917183,
0.012332449667155743,
0.04386809468269348,
-0.22567100822925568,
-0.0003594512236304581,
-0.036579277366399765,
-0.06302624940872192,
-0.008392655290663242,
-0.03859637677669525,
-0.11527056246995926,
0.10570613294839859,
0.005931496154516935,
-0.08894413709640503,
-0.03250240907073021,
0.0323784202337265,
0.08161021769046783,
-0.02496093325316906,
0.15208588540554047,
-0.00207732361741364,
0.07444743067026138,
-0.17001521587371826,
-0.0186258926987648,
-0.008170838467776775,
0.022739017382264137,
-0.01730337366461754,
-0.009926589205861092,
0.0417211577296257,
-0.02374940738081932,
0.18326538801193237,
-0.026155810803174973,
0.023897062987089157,
0.06509087234735489,
0.028239112347364426,
-0.02348138764500618,
0.1029142364859581,
0.04977674037218094,
0.019744493067264557,
0.017750872299075127,
0.004704901948571205,
-0.04269330948591232,
-0.02411082573235035,
-0.2009672075510025,
0.06824636459350586,
0.1413951963186264,
0.0945001170039177,
-0.016895292326807976,
0.08102800697088242,
-0.10148470103740692,
-0.11793510615825653,
0.11845827102661133,
-0.05291564017534256,
-0.005178646184504032,
-0.06771177798509598,
0.13147090375423431,
0.14465214312076569,
-0.19057928025722504,
0.07167289406061172,
-0.06657368689775467,
-0.048708755522966385,
-0.11428135633468628,
-0.19877813756465912,
-0.057406406849622726,
-0.05249326303601265,
-0.014654957689344883,
-0.046307582408189774,
0.07600142061710358,
0.0558621920645237,
0.0066533759236335754,
-0.0010272628860548139,
0.06499047577381134,
-0.025001700967550278,
0.0006535093416459858,
0.02878933772444725,
0.06477060168981552,
0.012734908610582352,
-0.02708892710506916,
0.020003756508231163,
-0.0124588618054986,
0.04115289822220802,
0.06620694696903229,
0.04819779098033905,
-0.031203847378492355,
0.014304676093161106,
-0.03705102950334549,
-0.10658574849367142,
0.04060226306319237,
-0.025969676673412323,
-0.08082456141710281,
0.15038928389549255,
0.02329815737903118,
0.009871776215732098,
-0.02009480819106102,
0.24151544272899628,
-0.07493261992931366,
-0.09359162300825119,
-0.145912304520607,
0.1050858348608017,
-0.04417093098163605,
0.05931282788515091,
0.04718651622533798,
-0.10557620972394943,
0.015843620523810387,
0.12844666838645935,
0.16308201849460602,
-0.03887132555246353,
0.020848779007792473,
0.030801787972450256,
0.00451048044487834,
-0.03638087585568428,
0.04893702641129494,
0.06769941002130508,
0.1598430573940277,
-0.050012264400720596,
0.09363209456205368,
0.0014203250175341964,
-0.09646835923194885,
-0.03729995712637901,
0.11791293323040009,
-0.018567455932497978,
0.020269213244318962,
-0.05626360699534416,
0.11993836611509323,
-0.0624820776283741,
-0.2327124923467636,
0.06224925443530083,
-0.06696385145187378,
-0.1380520761013031,
-0.020930388942360878,
0.07783205062150955,
-0.013049895875155926,
0.028939034789800644,
0.0744941383600235,
-0.07557165622711182,
0.19887301325798035,
0.03869112953543663,
-0.05862116441130638,
-0.05374279245734215,
0.08125589787960052,
-0.10287419706583023,
0.2806376814842224,
0.016432832926511765,
0.048905149102211,
0.10398142039775848,
-0.014833793975412846,
-0.1373276710510254,
0.01922636665403843,
0.09618207812309265,
-0.09676505625247955,
0.04219788312911987,
0.20332688093185425,
-0.0013902755454182625,
0.11704205721616745,
0.07854630798101425,
-0.07418767362833023,
0.047525618225336075,
-0.08811801671981812,
-0.06975209712982178,
-0.09316657483577728,
0.0959140956401825,
-0.0771951898932457,
0.14154987037181854,
0.13074006140232086,
-0.05291276052594185,
0.009896296076476574,
-0.031781889498233795,
0.04675227776169777,
0.0034188064746558666,
0.10277023166418076,
0.009294440969824791,
-0.18240419030189514,
0.022705385461449623,
0.018488753587007523,
0.10790926963090897,
-0.16417165100574493,
-0.09700500220060349,
0.04471738636493683,
0.0022752759978175163,
-0.05986823886632919,
0.12864463031291962,
0.05974240601062775,
0.04506097733974457,
-0.041915711015462875,
-0.023578429594635963,
-0.010399113409221172,
0.1371832937002182,
-0.10372728109359741,
0.0024029547348618507
] |
null | null | transformers | # [MaziyarPanahi/samantha-1.1-westlake-7b-GGUF](https://huggingface.co/MaziyarPanahi/samantha-1.1-westlake-7b-GGUF)
- Model creator: [cognitivecomputations](https://huggingface.co/cognitivecomputations)
- Original model: [cognitivecomputations/samantha-1.1-westlake-7b](https://huggingface.co/cognitivecomputations/samantha-1.1-westlake-7b)
## Description
[MaziyarPanahi/samantha-1.1-westlake-7b-GGUF](https://huggingface.co/MaziyarPanahi/samantha-1.1-westlake-7b-GGUF) contains GGUF format model files for [cognitivecomputations/samantha-1.1-westlake-7b](https://huggingface.co/cognitivecomputations/samantha-1.1-westlake-7b).
## How to use
Thanks to [TheBloke](https://huggingface.co/TheBloke) for preparing an amazing README on how to use GGUF models:
### About GGUF
GGUF is a new format introduced by the llama.cpp team on August 21st 2023. It is a replacement for GGML, which is no longer supported by llama.cpp.
Here is an incomplete list of clients and libraries that are known to support GGUF:
* [llama.cpp](https://github.com/ggerganov/llama.cpp). The source project for GGUF. Offers a CLI and a server option.
* [text-generation-webui](https://github.com/oobabooga/text-generation-webui), the most widely used web UI, with many features and powerful extensions. Supports GPU acceleration.
* [KoboldCpp](https://github.com/LostRuins/koboldcpp), a fully featured web UI, with GPU accel across all platforms and GPU architectures. Especially good for storytelling.
* [GPT4All](https://gpt4all.io/index.html), a free and open source local running GUI, supporting Windows, Linux and macOS with full GPU accel.
* [LM Studio](https://lmstudio.ai/), an easy-to-use and powerful local GUI for Windows and macOS (Silicon), with GPU acceleration. Linux available, in beta as of 27/11/2023.
* [LoLLMS Web UI](https://github.com/ParisNeo/lollms-webui), a great web UI with many interesting and unique features, including a full model library for easy model selection.
* [Faraday.dev](https://faraday.dev/), an attractive and easy to use character-based chat GUI for Windows and macOS (both Silicon and Intel), with GPU acceleration.
* [llama-cpp-python](https://github.com/abetlen/llama-cpp-python), a Python library with GPU accel, LangChain support, and OpenAI-compatible API server.
* [candle](https://github.com/huggingface/candle), a Rust ML framework with a focus on performance, including GPU support, and ease of use.
* [ctransformers](https://github.com/marella/ctransformers), a Python library with GPU accel, LangChain support, and OpenAI-compatible API server. Note, as of time of writing (November 27th 2023), ctransformers has not been updated in a long time and does not support many recent models.
### Explanation of quantisation methods
<details>
<summary>Click to see details</summary>
The new methods available are:
* GGML_TYPE_Q2_K - "type-1" 2-bit quantization in super-blocks containing 16 blocks, each block having 16 weights. Block scales and mins are quantized with 4 bits. This ends up effectively using 2.5625 bits per weight (bpw)
* GGML_TYPE_Q3_K - "type-0" 3-bit quantization in super-blocks containing 16 blocks, each block having 16 weights. Scales are quantized with 6 bits. This ends up using 3.4375 bpw.
* GGML_TYPE_Q4_K - "type-1" 4-bit quantization in super-blocks containing 8 blocks, each block having 32 weights. Scales and mins are quantized with 6 bits. This ends up using 4.5 bpw.
* GGML_TYPE_Q5_K - "type-1" 5-bit quantization. Same super-block structure as GGML_TYPE_Q4_K resulting in 5.5 bpw
* GGML_TYPE_Q6_K - "type-0" 6-bit quantization. Super-blocks with 16 blocks, each block having 16 weights. Scales are quantized with 8 bits. This ends up using 6.5625 bpw
</details>
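As a quick sanity check on these figures, here is a back-of-envelope sketch of the Q4_K number. It is not llama.cpp source code, and it assumes one fp16 scale and one fp16 min per 256-weight super-block; under those assumptions the arithmetic reproduces the 4.5 bpw quoted above.

```python
# Rough bpw arithmetic for GGML_TYPE_Q4_K (a sketch under the assumptions above)
weights = 8 * 32                  # 8 blocks of 32 weights per super-block
weight_bits = weights * 4         # 4-bit quantized weights
block_meta_bits = 8 * (6 + 6)     # 6-bit scale + 6-bit min per block
super_meta_bits = 16 + 16         # assumed fp16 super-block scale and min
bpw = (weight_bits + block_meta_bits + super_meta_bits) / weights
print(bpw)  # 4.5 bits per weight, matching the list above
```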
## How to download GGUF files
**Note for manual downloaders:** You almost never want to clone the entire repo! Multiple different quantisation formats are provided, and most users only want to pick and download a single file.
The following clients/libraries will automatically download models for you, providing a list of available models to choose from:
* LM Studio
* LoLLMS Web UI
* Faraday.dev
### In `text-generation-webui`
Under Download Model, you can enter the model repo: [MaziyarPanahi/samantha-1.1-westlake-7b-GGUF](https://huggingface.co/MaziyarPanahi/samantha-1.1-westlake-7b-GGUF) and below it, a specific filename to download, such as: samantha-1.1-westlake-7b-GGUF.Q4_K_M.gguf.
Then click Download.
### On the command line, including multiple files at once
I recommend using the `huggingface-hub` Python library:
```shell
pip3 install huggingface-hub
```
Then you can download any individual model file to the current directory, at high speed, with a command like this:
```shell
huggingface-cli download MaziyarPanahi/samantha-1.1-westlake-7b-GGUF samantha-1.1-westlake-7b-GGUF.Q4_K_M.gguf --local-dir . --local-dir-use-symlinks False
```
<details>
<summary>More advanced huggingface-cli download usage (click to read)</summary>
You can also download multiple files at once with a pattern:
```shell
huggingface-cli download [MaziyarPanahi/samantha-1.1-westlake-7b-GGUF](https://huggingface.co/MaziyarPanahi/samantha-1.1-westlake-7b-GGUF) --local-dir . --local-dir-use-symlinks False --include='*Q4_K*gguf'
```
For more documentation on downloading with `huggingface-cli`, please see: [HF -> Hub Python Library -> Download files -> Download from the CLI](https://huggingface.co/docs/huggingface_hub/guides/download#download-from-the-cli).
To accelerate downloads on fast connections (1Gbit/s or higher), install `hf_transfer`:
```shell
pip3 install hf_transfer
```
And set environment variable `HF_HUB_ENABLE_HF_TRANSFER` to `1`:
```shell
HF_HUB_ENABLE_HF_TRANSFER=1 huggingface-cli download MaziyarPanahi/samantha-1.1-westlake-7b-GGUF samantha-1.1-westlake-7b-GGUF.Q4_K_M.gguf --local-dir . --local-dir-use-symlinks False
```
Windows Command Line users: You can set the environment variable by running `set HF_HUB_ENABLE_HF_TRANSFER=1` before the download command.
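For PowerShell sessions, the equivalent (standard PowerShell syntax, not specific to this repo) would be:

```shell
# PowerShell equivalent of the cmd.exe `set` command above
$env:HF_HUB_ENABLE_HF_TRANSFER = 1
```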
</details>
## Example `llama.cpp` command
Make sure you are using `llama.cpp` from commit [d0cee0d](https://github.com/ggerganov/llama.cpp/commit/d0cee0d36d5be95a0d9088b674dbb27354107221) or later.
```shell
./main -ngl 35 -m samantha-1.1-westlake-7b-GGUF.Q4_K_M.gguf --color -c 32768 --temp 0.7 --repeat_penalty 1.1 -n -1 -p "<|im_start|>system
{system_message}<|im_end|>
<|im_start|>user
{prompt}<|im_end|>
<|im_start|>assistant"
```
Change `-ngl 35` to the number of layers to offload to GPU. Remove it if you don't have GPU acceleration.
Change `-c 32768` to the desired sequence length. For extended sequence models - eg 8K, 16K, 32K - the necessary RoPE scaling parameters are read from the GGUF file and set by llama.cpp automatically. Note that longer sequence lengths require much more resources, so you may need to reduce this value.
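For example, on a machine with limited RAM you could pass `-c 8192` (or lower) in the command above and leave the other flags unchanged.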
If you want to have a chat-style conversation, replace the `-p <PROMPT>` argument with `-i -ins`
For other parameters and how to use them, please refer to [the llama.cpp documentation](https://github.com/ggerganov/llama.cpp/blob/master/examples/main/README.md)
## How to run in `text-generation-webui`
Further instructions can be found in the text-generation-webui documentation, here: [text-generation-webui/docs/04 ‐ Model Tab.md](https://github.com/oobabooga/text-generation-webui/blob/main/docs/04%20%E2%80%90%20Model%20Tab.md#llamacpp).
## How to run from Python code
You can use GGUF models from Python using the [llama-cpp-python](https://github.com/abetlen/llama-cpp-python) or [ctransformers](https://github.com/marella/ctransformers) libraries. Note that at the time of writing (Nov 27th 2023), ctransformers has not been updated for some time and is not compatible with some recent models. Therefore I recommend you use llama-cpp-python.
### How to load this model in Python code, using llama-cpp-python
For full documentation, please see: [llama-cpp-python docs](https://abetlen.github.io/llama-cpp-python/).
#### First install the package
Run one of the following commands, according to your system:
```shell
# Base llama-cpp-python with no GPU acceleration
pip install llama-cpp-python
# With NVidia CUDA acceleration
CMAKE_ARGS="-DLLAMA_CUBLAS=on" pip install llama-cpp-python
# Or with OpenBLAS acceleration
CMAKE_ARGS="-DLLAMA_BLAS=ON -DLLAMA_BLAS_VENDOR=OpenBLAS" pip install llama-cpp-python
# Or with CLBLast acceleration
CMAKE_ARGS="-DLLAMA_CLBLAST=on" pip install llama-cpp-python
# Or with AMD ROCm GPU acceleration (Linux only)
CMAKE_ARGS="-DLLAMA_HIPBLAS=on" pip install llama-cpp-python
# Or with Metal GPU acceleration for macOS systems only
CMAKE_ARGS="-DLLAMA_METAL=on" pip install llama-cpp-python
# On Windows, to set the variables CMAKE_ARGS in PowerShell, follow this format; eg for NVidia CUDA:
$env:CMAKE_ARGS = "-DLLAMA_CUBLAS=on"
pip install llama-cpp-python
```
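A quick way to confirm the package imports cleanly after installation (a generic smoke test, not specific to this model):

```shell
python3 -c "from llama_cpp import Llama; print('llama-cpp-python OK')"
```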
#### Simple llama-cpp-python example code
```python
from llama_cpp import Llama
# Set gpu_layers to the number of layers to offload to GPU. Set to 0 if no GPU acceleration is available on your system.
llm = Llama(
model_path="./samantha-1.1-westlake-7b-GGUF.Q4_K_M.gguf", # Download the model file first
n_ctx=32768, # The max sequence length to use - note that longer sequence lengths require much more resources
n_threads=8, # The number of CPU threads to use, tailor to your system and the resulting performance
n_gpu_layers=35 # The number of layers to offload to GPU, if you have GPU acceleration available
)
# Simple inference example
output = llm(
"<|im_start|>system
{system_message}<|im_end|>
<|im_start|>user
{prompt}<|im_end|>
<|im_start|>assistant", # Prompt
max_tokens=512, # Generate up to 512 tokens
stop=["</s>"], # Example stop token - not necessarily correct for this specific model! Please check before using.
echo=True # Whether to echo the prompt
)
# Chat Completion API
llm = Llama(model_path="./samantha-1.1-westlake-7b-GGUF.Q4_K_M.gguf", chat_format="llama-2") # Set chat_format according to the model you are using
llm.create_chat_completion(
messages = [
{"role": "system", "content": "You are a story writing assistant."},
{
"role": "user",
"content": "Write a story about llamas."
}
]
)
```
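One note on `chat_format` in the example above: the prompt template shown earlier in this README is ChatML-style (`<|im_start|>` / `<|im_end|>`), so llama-cpp-python's built-in `"chatml"` format is likely a closer match than `"llama-2"` for this model; verify against the model's own chat template before relying on it.

```python
# Likely a better fit for the ChatML prompt shown above (unverified assumption)
llm = Llama(model_path="./samantha-1.1-westlake-7b-GGUF.Q4_K_M.gguf", chat_format="chatml")
```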
## How to use with LangChain
Here are guides on using llama-cpp-python and ctransformers with LangChain:
* [LangChain + llama-cpp-python](https://python.langchain.com/docs/integrations/llms/llamacpp)
* [LangChain + ctransformers](https://python.langchain.com/docs/integrations/providers/ctransformers) | {"tags": ["quantized", "2-bit", "3-bit", "4-bit", "5-bit", "6-bit", "8-bit", "GGUF", "transformers", "pytorch", "mistral", "text-generation", "conversational", "dataset:cognitivecomputations/samantha-data", "license:apache-2.0", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us"], "model_name": "samantha-1.1-westlake-7b-GGUF", "base_model": "cognitivecomputations/samantha-1.1-westlake-7b", "inference": false, "model_creator": "cognitivecomputations", "pipeline_tag": "text-generation", "quantized_by": "MaziyarPanahi"} | text-generation | MaziyarPanahi/samantha-1.1-westlake-7b-GGUF | [
"transformers",
"gguf",
"mistral",
"quantized",
"2-bit",
"3-bit",
"4-bit",
"5-bit",
"6-bit",
"8-bit",
"GGUF",
"pytorch",
"text-generation",
"conversational",
"dataset:cognitivecomputations/samantha-data",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us",
"base_model:cognitivecomputations/samantha-1.1-westlake-7b"
] | 2024-02-13T09:25:28+00:00 | [] | [] | TAGS
#transformers #gguf #mistral #quantized #2-bit #3-bit #4-bit #5-bit #6-bit #8-bit #GGUF #pytorch #text-generation #conversational #dataset-cognitivecomputations/samantha-data #license-apache-2.0 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us #base_model-cognitivecomputations/samantha-1.1-westlake-7b
| # MaziyarPanahi/samantha-1.1-westlake-7b-GGUF
- Model creator: cognitivecomputations
- Original model: cognitivecomputations/samantha-1.1-westlake-7b
## Description
MaziyarPanahi/samantha-1.1-westlake-7b-GGUF contains GGUF format model files for cognitivecomputations/samantha-1.1-westlake-7b.
## How to use
Thanks to TheBloke for preparing an amazing README on how to use GGUF models:
### About GGUF
GGUF is a new format introduced by the URL team on August 21st 2023. It is a replacement for GGML, which is no longer supported by URL.
Here is an incomplete list of clients and libraries that are known to support GGUF:
* URL. The source project for GGUF. Offers a CLI and a server option.
* text-generation-webui, the most widely used web UI, with many features and powerful extensions. Supports GPU acceleration.
* KoboldCpp, a fully featured web UI, with GPU accel across all platforms and GPU architectures. Especially good for storytelling.
* GPT4All, a free and open source local running GUI, supporting Windows, Linux and macOS with full GPU accel.
* LM Studio, an easy-to-use and powerful local GUI for Windows and macOS (Silicon), with GPU acceleration. Linux available, in beta as of 27/11/2023.
* LoLLMS Web UI, a great web UI with many interesting and unique features, including a full model library for easy model selection.
* URL, an attractive and easy to use character-based chat GUI for Windows and macOS (both Silicon and Intel), with GPU acceleration.
* llama-cpp-python, a Python library with GPU accel, LangChain support, and OpenAI-compatible API server.
* candle, a Rust ML framework with a focus on performance, including GPU support, and ease of use.
* ctransformers, a Python library with GPU accel, LangChain support, and OpenAI-compatible API server. Note, as of time of writing (November 27th 2023), ctransformers has not been updated in a long time and does not support many recent models.
### Explanation of quantisation methods
<details>
<summary>Click to see details</summary>
The new methods available are:
* GGML_TYPE_Q2_K - "type-1" 2-bit quantization in super-blocks containing 16 blocks, each block having 16 weights. Block scales and mins are quantized with 4 bits. This ends up effectively using 2.5625 bits per weight (bpw)
* GGML_TYPE_Q3_K - "type-0" 3-bit quantization in super-blocks containing 16 blocks, each block having 16 weights. Scales are quantized with 6 bits. This ends up using 3.4375 bpw.
* GGML_TYPE_Q4_K - "type-1" 4-bit quantization in super-blocks containing 8 blocks, each block having 32 weights. Scales and mins are quantized with 6 bits. This ends up using 4.5 bpw.
* GGML_TYPE_Q5_K - "type-1" 5-bit quantization. Same super-block structure as GGML_TYPE_Q4_K resulting in 5.5 bpw
* GGML_TYPE_Q6_K - "type-0" 6-bit quantization. Super-blocks with 16 blocks, each block having 16 weights. Scales are quantized with 8 bits. This ends up using 6.5625 bpw
</details>
## How to download GGUF files
Note for manual downloaders: You almost never want to clone the entire repo! Multiple different quantisation formats are provided, and most users only want to pick and download a single file.
The following clients/libraries will automatically download models for you, providing a list of available models to choose from:
* LM Studio
* LoLLMS Web UI
* URL
### In 'text-generation-webui'
Under Download Model, you can enter the model repo: MaziyarPanahi/samantha-1.1-westlake-7b-GGUF and below it, a specific filename to download, such as: samantha-1.1-westlake-7b-GGUF.Q4_K_M.gguf.
Then click Download.
### On the command line, including multiple files at once
I recommend using the 'huggingface-hub' Python library:
Then you can download any individual model file to the current directory, at high speed, with a command like this:
<details>
<summary>More advanced huggingface-cli download usage (click to read)</summary>
You can also download multiple files at once with a pattern:
For more documentation on downloading with 'huggingface-cli', please see: HF -> Hub Python Library -> Download files -> Download from the CLI.
To accelerate downloads on fast connections (1Gbit/s or higher), install 'hf_transfer':
And set environment variable 'HF_HUB_ENABLE_HF_TRANSFER' to '1':
Windows Command Line users: You can set the environment variable by running 'set HF_HUB_ENABLE_HF_TRANSFER=1' before the download command.
</details>
## Example 'URL' command
Make sure you are using 'URL' from commit d0cee0d or later.
Change '-ngl 35' to the number of layers to offload to GPU. Remove it if you don't have GPU acceleration.
Change '-c 32768' to the desired sequence length. For extended sequence models - eg 8K, 16K, 32K - the necessary RoPE scaling parameters are read from the GGUF file and set by URL automatically. Note that longer sequence lengths require much more resources, so you may need to reduce this value.
If you want to have a chat-style conversation, replace the '-p <PROMPT>' argument with '-i -ins'
For other parameters and how to use them, please refer to the URL documentation
## How to run in 'text-generation-webui'
Further instructions can be found in the text-generation-webui documentation, here: text-generation-webui/docs/04 ‐ Model URL.
## How to run from Python code
You can use GGUF models from Python using the llama-cpp-python or ctransformers libraries. Note that at the time of writing (Nov 27th 2023), ctransformers has not been updated for some time and is not compatible with some recent models. Therefore I recommend you use llama-cpp-python.
### How to load this model in Python code, using llama-cpp-python
For full documentation, please see: llama-cpp-python docs.
#### First install the package
Run one of the following commands, according to your system:
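For instance (build flags as of late 2023; treat them as assumptions and check the llama-cpp-python docs for your platform):

```bash
# CPU-only build
pip install llama-cpp-python

# With NVIDIA CUDA acceleration
CMAKE_ARGS="-DLLAMA_CUBLAS=on" pip install llama-cpp-python

# With Metal acceleration on macOS
CMAKE_ARGS="-DLLAMA_METAL=on" pip install llama-cpp-python
```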
#### Simple llama-cpp-python example code
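A minimal sketch, assuming the Q4_K_M file from above sits in the working directory (the prompt format is an assumption; adjust it to the model's chat template):

```python
from llama_cpp import Llama

# Load the GGUF model. n_gpu_layers=-1 offloads all layers to the GPU;
# set it to 0 for CPU-only inference.
llm = Llama(
    model_path="./samantha-1.1-westlake-7b.Q4_K_M.gguf",
    n_ctx=32768,      # context length; reduce this if you run out of memory
    n_gpu_layers=-1,
)

output = llm(
    "USER: Tell me about yourself.\nASSISTANT:",
    max_tokens=256,
    stop=["USER:"],   # stop generating when the model starts a new user turn
    echo=False,
)
print(output["choices"][0]["text"])
```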
## How to use with LangChain
Here are guides on using llama-cpp-python and ctransformers with LangChain:
* LangChain + llama-cpp-python
* LangChain + ctransformers | [
"# MaziyarPanahi/samantha-1.1-westlake-7b-GGUF\n- Model creator: cognitivecomputations\n- Original model: cognitivecomputations/samantha-1.1-westlake-7b",
"## Description\nMaziyarPanahi/samantha-1.1-westlake-7b-GGUF contains GGUF format model files for cognitivecomputations/samantha-1.1-westlake-7b.",
"## How to use\nThanks to TheBloke for preparing an amazing README on how to use GGUF models:",
"### About GGUF\n\nGGUF is a new format introduced by the URL team on August 21st 2023. It is a replacement for GGML, which is no longer supported by URL.\n\nHere is an incomplete list of clients and libraries that are known to support GGUF:\n\n* URL. The source project for GGUF. Offers a CLI and a server option.\n* text-generation-webui, the most widely used web UI, with many features and powerful extensions. Supports GPU acceleration.\n* KoboldCpp, a fully featured web UI, with GPU accel across all platforms and GPU architectures. Especially good for story telling.\n* GPT4All, a free and open source local running GUI, supporting Windows, Linux and macOS with full GPU accel.\n* LM Studio, an easy-to-use and powerful local GUI for Windows and macOS (Silicon), with GPU acceleration. Linux available, in beta as of 27/11/2023.\n* LoLLMS Web UI, a great web UI with many interesting and unique features, including a full model library for easy model selection.\n* URL, an attractive and easy to use character-based chat GUI for Windows and macOS (both Silicon and Intel), with GPU acceleration.\n* llama-cpp-python, a Python library with GPU accel, LangChain support, and OpenAI-compatible API server.\n* candle, a Rust ML framework with a focus on performance, including GPU support, and ease of use.\n* ctransformers, a Python library with GPU accel, LangChain support, and OpenAI-compatible AI server. Note, as of time of writing (November 27th 2023), ctransformers has not been updated in a long time and does not support many recent models.",
"### Explanation of quantisation methods\n\n<details>\n <summary>Click to see details</summary>\n\nThe new methods available are:\n\n* GGML_TYPE_Q2_K - \"type-1\" 2-bit quantization in super-blocks containing 16 blocks, each block having 16 weight. Block scales and mins are quantized with 4 bits. This ends up effectively using 2.5625 bits per weight (bpw)\n* GGML_TYPE_Q3_K - \"type-0\" 3-bit quantization in super-blocks containing 16 blocks, each block having 16 weights. Scales are quantized with 6 bits. This end up using 3.4375 bpw.\n* GGML_TYPE_Q4_K - \"type-1\" 4-bit quantization in super-blocks containing 8 blocks, each block having 32 weights. Scales and mins are quantized with 6 bits. This ends up using 4.5 bpw.\n* GGML_TYPE_Q5_K - \"type-1\" 5-bit quantization. Same super-block structure as GGML_TYPE_Q4_K resulting in 5.5 bpw\n* GGML_TYPE_Q6_K - \"type-0\" 6-bit quantization. Super-blocks with 16 blocks, each block having 16 weights. Scales are quantized with 8 bits. This ends up using 6.5625 bpw",
"## How to download GGUF files\n\nNote for manual downloaders: You almost never want to clone the entire repo! Multiple different quantisation formats are provided, and most users only want to pick and download a single file.\n\nThe following clients/libraries will automatically download models for you, providing a list of available models to choose from:\n\n* LM Studio\n* LoLLMS Web UI\n* URL",
"### In 'text-generation-webui'\n\nUnder Download Model, you can enter the model repo: MaziyarPanahi/samantha-1.1-westlake-7b-GGUF and below it, a specific filename to download, such as: samantha-1.1-westlake-7b-GGUF.Q4_K_M.gguf.\n\nThen click Download.",
"### On the command line, including multiple files at once\n\nI recommend using the 'huggingface-hub' Python library:\n\n\n\nThen you can download any individual model file to the current directory, at high speed, with a command like this:\n\n\n</details>\n<details>\n <summary>More advanced huggingface-cli download usage (click to read)</summary>\n\nYou can also download multiple files at once with a pattern:\n\n\n\nFor more documentation on downloading with 'huggingface-cli', please see: HF -> Hub Python Library -> Download files -> Download from the CLI.\n\nTo accelerate downloads on fast connections (1Gbit/s or higher), install 'hf_transfer':\n\n\n\nAnd set environment variable 'HF_HUB_ENABLE_HF_TRANSFER' to '1':\n\n\n\nWindows Command Line users: You can set the environment variable by running 'set HF_HUB_ENABLE_HF_TRANSFER=1' before the download command.\n</details>",
"## Example 'URL' command\n\nMake sure you are using 'URL' from commit d0cee0d or later.\n\n\n\nChange '-ngl 32' to the number of layers to offload to GPU. Remove it if you don't have GPU acceleration.\n\nChange '-c 32768' to the desired sequence length. For extended sequence models - eg 8K, 16K, 32K - the necessary RoPE scaling parameters are read from the GGUF file and set by URL automatically. Note that longer sequence lengths require much more resources, so you may need to reduce this value.\n\nIf you want to have a chat-style conversation, replace the '-p <PROMPT>' argument with '-i -ins'\n\nFor other parameters and how to use them, please refer to the URL documentation",
"## How to run in 'text-generation-webui'\n\nFurther instructions can be found in the text-generation-webui documentation, here: text-generation-webui/docs/04 ‐ Model URL.",
"## How to run from Python code\n\nYou can use GGUF models from Python using the llama-cpp-python or ctransformers libraries. Note that at the time of writing (Nov 27th 2023), ctransformers has not been updated for some time and is not compatible with some recent models. Therefore I recommend you use llama-cpp-python.",
"### How to load this model in Python code, using llama-cpp-python\n\nFor full documentation, please see: llama-cpp-python docs.",
"#### First install the package\n\nRun one of the following commands, according to your system:",
"#### Simple llama-cpp-python example code",
"## How to use with LangChain\n\nHere are guides on using llama-cpp-python and ctransformers with LangChain:\n\n* LangChain + llama-cpp-python\n* LangChain + ctransformers"
] | [
"TAGS\n#transformers #gguf #mistral #quantized #2-bit #3-bit #4-bit #5-bit #6-bit #8-bit #GGUF #pytorch #text-generation #conversational #dataset-cognitivecomputations/samantha-data #license-apache-2.0 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us #base_model-cognitivecomputations/samantha-1.1-westlake-7b \n",
"# MaziyarPanahi/samantha-1.1-westlake-7b-GGUF\n- Model creator: cognitivecomputations\n- Original model: cognitivecomputations/samantha-1.1-westlake-7b",
"## Description\nMaziyarPanahi/samantha-1.1-westlake-7b-GGUF contains GGUF format model files for cognitivecomputations/samantha-1.1-westlake-7b.",
"## How to use\nThanks to TheBloke for preparing an amazing README on how to use GGUF models:",
"### About GGUF\n\nGGUF is a new format introduced by the URL team on August 21st 2023. It is a replacement for GGML, which is no longer supported by URL.\n\nHere is an incomplete list of clients and libraries that are known to support GGUF:\n\n* URL. The source project for GGUF. Offers a CLI and a server option.\n* text-generation-webui, the most widely used web UI, with many features and powerful extensions. Supports GPU acceleration.\n* KoboldCpp, a fully featured web UI, with GPU accel across all platforms and GPU architectures. Especially good for story telling.\n* GPT4All, a free and open source local running GUI, supporting Windows, Linux and macOS with full GPU accel.\n* LM Studio, an easy-to-use and powerful local GUI for Windows and macOS (Silicon), with GPU acceleration. Linux available, in beta as of 27/11/2023.\n* LoLLMS Web UI, a great web UI with many interesting and unique features, including a full model library for easy model selection.\n* URL, an attractive and easy to use character-based chat GUI for Windows and macOS (both Silicon and Intel), with GPU acceleration.\n* llama-cpp-python, a Python library with GPU accel, LangChain support, and OpenAI-compatible API server.\n* candle, a Rust ML framework with a focus on performance, including GPU support, and ease of use.\n* ctransformers, a Python library with GPU accel, LangChain support, and OpenAI-compatible AI server. Note, as of time of writing (November 27th 2023), ctransformers has not been updated in a long time and does not support many recent models.",
"### Explanation of quantisation methods\n\n<details>\n <summary>Click to see details</summary>\n\nThe new methods available are:\n\n* GGML_TYPE_Q2_K - \"type-1\" 2-bit quantization in super-blocks containing 16 blocks, each block having 16 weight. Block scales and mins are quantized with 4 bits. This ends up effectively using 2.5625 bits per weight (bpw)\n* GGML_TYPE_Q3_K - \"type-0\" 3-bit quantization in super-blocks containing 16 blocks, each block having 16 weights. Scales are quantized with 6 bits. This end up using 3.4375 bpw.\n* GGML_TYPE_Q4_K - \"type-1\" 4-bit quantization in super-blocks containing 8 blocks, each block having 32 weights. Scales and mins are quantized with 6 bits. This ends up using 4.5 bpw.\n* GGML_TYPE_Q5_K - \"type-1\" 5-bit quantization. Same super-block structure as GGML_TYPE_Q4_K resulting in 5.5 bpw\n* GGML_TYPE_Q6_K - \"type-0\" 6-bit quantization. Super-blocks with 16 blocks, each block having 16 weights. Scales are quantized with 8 bits. This ends up using 6.5625 bpw",
"## How to download GGUF files\n\nNote for manual downloaders: You almost never want to clone the entire repo! Multiple different quantisation formats are provided, and most users only want to pick and download a single file.\n\nThe following clients/libraries will automatically download models for you, providing a list of available models to choose from:\n\n* LM Studio\n* LoLLMS Web UI\n* URL",
"### In 'text-generation-webui'\n\nUnder Download Model, you can enter the model repo: MaziyarPanahi/samantha-1.1-westlake-7b-GGUF and below it, a specific filename to download, such as: samantha-1.1-westlake-7b-GGUF.Q4_K_M.gguf.\n\nThen click Download.",
"### On the command line, including multiple files at once\n\nI recommend using the 'huggingface-hub' Python library:\n\n\n\nThen you can download any individual model file to the current directory, at high speed, with a command like this:\n\n\n</details>\n<details>\n <summary>More advanced huggingface-cli download usage (click to read)</summary>\n\nYou can also download multiple files at once with a pattern:\n\n\n\nFor more documentation on downloading with 'huggingface-cli', please see: HF -> Hub Python Library -> Download files -> Download from the CLI.\n\nTo accelerate downloads on fast connections (1Gbit/s or higher), install 'hf_transfer':\n\n\n\nAnd set environment variable 'HF_HUB_ENABLE_HF_TRANSFER' to '1':\n\n\n\nWindows Command Line users: You can set the environment variable by running 'set HF_HUB_ENABLE_HF_TRANSFER=1' before the download command.\n</details>",
"## Example 'URL' command\n\nMake sure you are using 'URL' from commit d0cee0d or later.\n\n\n\nChange '-ngl 32' to the number of layers to offload to GPU. Remove it if you don't have GPU acceleration.\n\nChange '-c 32768' to the desired sequence length. For extended sequence models - eg 8K, 16K, 32K - the necessary RoPE scaling parameters are read from the GGUF file and set by URL automatically. Note that longer sequence lengths require much more resources, so you may need to reduce this value.\n\nIf you want to have a chat-style conversation, replace the '-p <PROMPT>' argument with '-i -ins'\n\nFor other parameters and how to use them, please refer to the URL documentation",
"## How to run in 'text-generation-webui'\n\nFurther instructions can be found in the text-generation-webui documentation, here: text-generation-webui/docs/04 ‐ Model URL.",
"## How to run from Python code\n\nYou can use GGUF models from Python using the llama-cpp-python or ctransformers libraries. Note that at the time of writing (Nov 27th 2023), ctransformers has not been updated for some time and is not compatible with some recent models. Therefore I recommend you use llama-cpp-python.",
"### How to load this model in Python code, using llama-cpp-python\n\nFor full documentation, please see: llama-cpp-python docs.",
"#### First install the package\n\nRun one of the following commands, according to your system:",
"#### Simple llama-cpp-python example code",
"## How to use with LangChain\n\nHere are guides on using llama-cpp-python and ctransformers with LangChain:\n\n* LangChain + llama-cpp-python\n* LangChain + ctransformers"
] | [
122,
46,
45,
26,
401,
323,
84,
83,
218,
182,
49,
77,
36,
19,
12,
50
] | [
"passage: TAGS\n#transformers #gguf #mistral #quantized #2-bit #3-bit #4-bit #5-bit #6-bit #8-bit #GGUF #pytorch #text-generation #conversational #dataset-cognitivecomputations/samantha-data #license-apache-2.0 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us #base_model-cognitivecomputations/samantha-1.1-westlake-7b \n# MaziyarPanahi/samantha-1.1-westlake-7b-GGUF\n- Model creator: cognitivecomputations\n- Original model: cognitivecomputations/samantha-1.1-westlake-7b## Description\nMaziyarPanahi/samantha-1.1-westlake-7b-GGUF contains GGUF format model files for cognitivecomputations/samantha-1.1-westlake-7b.## How to use\nThanks to TheBloke for preparing an amazing README on how to use GGUF models:",
"passage: ### About GGUF\n\nGGUF is a new format introduced by the URL team on August 21st 2023. It is a replacement for GGML, which is no longer supported by URL.\n\nHere is an incomplete list of clients and libraries that are known to support GGUF:\n\n* URL. The source project for GGUF. Offers a CLI and a server option.\n* text-generation-webui, the most widely used web UI, with many features and powerful extensions. Supports GPU acceleration.\n* KoboldCpp, a fully featured web UI, with GPU accel across all platforms and GPU architectures. Especially good for story telling.\n* GPT4All, a free and open source local running GUI, supporting Windows, Linux and macOS with full GPU accel.\n* LM Studio, an easy-to-use and powerful local GUI for Windows and macOS (Silicon), with GPU acceleration. Linux available, in beta as of 27/11/2023.\n* LoLLMS Web UI, a great web UI with many interesting and unique features, including a full model library for easy model selection.\n* URL, an attractive and easy to use character-based chat GUI for Windows and macOS (both Silicon and Intel), with GPU acceleration.\n* llama-cpp-python, a Python library with GPU accel, LangChain support, and OpenAI-compatible API server.\n* candle, a Rust ML framework with a focus on performance, including GPU support, and ease of use.\n* ctransformers, a Python library with GPU accel, LangChain support, and OpenAI-compatible AI server. Note, as of time of writing (November 27th 2023), ctransformers has not been updated in a long time and does not support many recent models.### Explanation of quantisation methods\n\n<details>\n <summary>Click to see details</summary>\n\nThe new methods available are:\n\n* GGML_TYPE_Q2_K - \"type-1\" 2-bit quantization in super-blocks containing 16 blocks, each block having 16 weight. Block scales and mins are quantized with 4 bits. This ends up effectively using 2.5625 bits per weight (bpw)\n* GGML_TYPE_Q3_K - \"type-0\" 3-bit quantization in super-blocks containing 16 blocks, each block having 16 weights. Scales are quantized with 6 bits. This end up using 3.4375 bpw.\n* GGML_TYPE_Q4_K - \"type-1\" 4-bit quantization in super-blocks containing 8 blocks, each block having 32 weights. Scales and mins are quantized with 6 bits. This ends up using 4.5 bpw.\n* GGML_TYPE_Q5_K - \"type-1\" 5-bit quantization. Same super-block structure as GGML_TYPE_Q4_K resulting in 5.5 bpw\n* GGML_TYPE_Q6_K - \"type-0\" 6-bit quantization. Super-blocks with 16 blocks, each block having 16 weights. Scales are quantized with 8 bits. This ends up using 6.5625 bpw## How to download GGUF files\n\nNote for manual downloaders: You almost never want to clone the entire repo! Multiple different quantisation formats are provided, and most users only want to pick and download a single file.\n\nThe following clients/libraries will automatically download models for you, providing a list of available models to choose from:\n\n* LM Studio\n* LoLLMS Web UI\n* URL### In 'text-generation-webui'\n\nUnder Download Model, you can enter the model repo: MaziyarPanahi/samantha-1.1-westlake-7b-GGUF and below it, a specific filename to download, such as: samantha-1.1-westlake-7b-GGUF.Q4_K_M.gguf.\n\nThen click Download."
] | [
-0.0609760507941246,
0.1698022186756134,
-0.0023831145372241735,
0.07491575181484222,
0.07622450590133667,
0.04535375535488129,
0.030573731288313866,
0.10258933901786804,
0.11608521640300751,
0.07735765725374222,
0.07163591682910919,
0.03394551947712898,
0.061034172773361206,
0.12590888142585754,
0.0921248197555542,
-0.2050369381904602,
0.029250990599393845,
0.001483510248363018,
0.017813226208090782,
0.03521192446351051,
0.030836304649710655,
-0.049803730100393295,
0.07735301554203033,
-0.023508500307798386,
-0.051351867616176605,
-0.06368570774793625,
-0.03928610682487488,
0.005172773730009794,
0.06641632318496704,
0.03383626788854599,
-0.07035158574581146,
-0.03873158246278763,
-0.015789441764354706,
-0.12729723751544952,
0.018465694040060043,
0.03256556764245033,
-0.002830717246979475,
0.03788160905241966,
-0.03637745976448059,
0.028272517025470734,
0.1329270601272583,
-0.06672674417495728,
0.007133868522942066,
0.047769878059625626,
-0.08130882680416107,
-0.10203073918819427,
-0.10751502215862274,
0.015763066709041595,
0.02620638534426689,
0.027465995401144028,
0.004079071804881096,
0.035807859152555466,
-0.0010536843910813332,
0.04188215732574463,
0.19124439358711243,
-0.2067209780216217,
-0.04903443157672882,
0.10820450633764267,
0.026186242699623108,
0.05153608322143555,
-0.0940866619348526,
0.04575956612825394,
0.0034295436926186085,
0.025689320638775826,
0.039747778326272964,
-0.044595010578632355,
0.07760095596313477,
0.002421671524643898,
-0.10379375517368317,
-0.012743578292429447,
0.12484654784202576,
-0.010357968509197235,
-0.04764022305607796,
-0.06739652156829834,
-0.056065768003463745,
-0.025746628642082214,
-0.03249034285545349,
0.024239452555775642,
0.013947054743766785,
-0.0038864417001605034,
0.05262986570596695,
-0.1256842166185379,
-0.024867553263902664,
-0.034537047147750854,
-0.014674920588731766,
0.22251510620117188,
0.01683359406888485,
0.048191919922828674,
0.03993070870637894,
0.1327705681324005,
-0.1608743965625763,
-0.03973189368844032,
-0.10449421405792236,
0.0038572102785110474,
-0.056789956986904144,
0.029886173084378242,
0.02236957661807537,
0.04974694550037384,
0.009253869764506817,
0.12680260837078094,
0.0005803853273391724,
0.0813591331243515,
0.08141164481639862,
-0.014982540160417557,
-0.021833986043930054,
0.13795077800750732,
-0.05282970890402794,
-0.13184289634227753,
0.0640653520822525,
0.03139527887105942,
0.08650792390108109,
-0.05039847642183304,
-0.05810985341668129,
0.009135396219789982,
-0.028603939339518547,
0.029670823365449905,
0.007189975585788488,
0.043786585330963135,
-0.0469462051987648,
-0.048924755305051804,
0.1718495786190033,
-0.07764242589473724,
0.03388853743672371,
-0.00776376947760582,
-0.036804236471652985,
0.007598742842674255,
0.033722493797540665,
-0.034199539572000504,
-0.02675909921526909,
0.00403139553964138,
-0.10981760919094086,
-0.028180435299873352,
-0.08223018050193787,
-0.01805327646434307,
0.04652215912938118,
-0.08078986406326294,
-0.022911984473466873,
-0.06738899648189545,
-0.245856374502182,
0.026250379160046577,
0.04545506834983826,
-0.04863090068101883,
-0.01892455480992794,
-0.008184971287846565,
-0.02836550585925579,
0.010055627673864365,
0.011659828945994377,
0.07386687397956848,
-0.03617848455905914,
0.037359632551670074,
0.0338151752948761,
0.05989208444952965,
-0.12000779807567596,
-0.0034910812973976135,
-0.03236197307705879,
0.06892667710781097,
-0.10231821238994598,
0.12370371073484421,
-0.10193042457103729,
0.038858771324157715,
-0.05331413075327873,
-0.010523534379899502,
-0.02857663854956627,
-0.04133564978837967,
0.04083067551255226,
0.08621395379304886,
-0.1450532078742981,
-0.056647952646017075,
0.10935601592063904,
-0.10640808939933777,
-0.07867196947336197,
0.11780929565429688,
-0.0031521536875516176,
0.03301934897899628,
0.1089000478386879,
0.1146697923541069,
0.1407182514667511,
-0.05882478505373001,
-0.08079567551612854,
-0.02621321752667427,
0.0523950457572937,
0.01838982291519642,
0.0484667643904686,
0.0324205607175827,
-0.06601312756538391,
0.07498560100793839,
-0.11668310314416885,
0.06537254899740219,
0.012867932207882404,
-0.05026550963521004,
-0.035963088274002075,
-0.09163697808980942,
0.058340176939964294,
-0.006002938840538263,
-0.0240952055901289,
-0.0160799752920866,
-0.07277996838092804,
-0.01865486055612564,
0.12871576845645905,
-0.053822778165340424,
0.013849201612174511,
-0.08066073805093765,
0.14260569214820862,
-0.06852874159812927,
0.04941023141145706,
-0.04669860377907753,
-0.08703841269016266,
0.08025138080120087,
-0.08927097916603088,
0.06575660407543182,
-0.05087823048233986,
0.05313289910554886,
0.05802442878484726,
-0.05027921497821808,
0.022998696193099022,
-0.014782879501581192,
-0.022472096607089043,
-0.06206835061311722,
-0.036534614861011505,
0.002187022939324379,
-0.022048121318221092,
0.12319914251565933,
-0.07287360727787018,
0.011182906106114388,
0.10397730767726898,
0.04525250196456909,
0.031586114317178726,
-0.11584382504224777,
0.041181307286024094,
-0.020221587270498276,
0.021585920825600624,
-0.05652245879173279,
0.016540158540010452,
0.006953928619623184,
-0.09083903580904007,
0.041048694401979446,
-0.1173972338438034,
-0.016371551901102066,
0.06400151550769806,
0.16676923632621765,
0.029679108411073685,
0.001553952693939209,
0.00004943530075252056,
-0.041229523718357086,
0.010111598297953606,
-0.05325772985816002,
0.17814278602600098,
-0.007222415879368782,
0.042041003704071045,
-0.052040599286556244,
-0.027037374675273895,
0.0067980061285197735,
0.037757426500320435,
-0.0035200542770326138,
0.06321540474891663,
0.06216694787144661,
-0.04842359945178032,
0.04888582602143288,
0.032621387392282486,
-0.04432384669780731,
0.15787763893604279,
0.024396846070885658,
-0.036997340619564056,
-0.06389379501342773,
-0.020307494327425957,
0.012097387574613094,
0.1517806351184845,
-0.15662045776844025,
-0.009308524429798126,
0.018669430166482925,
0.010913300327956676,
0.050836920738220215,
-0.1084652915596962,
0.027489718049764633,
-0.04006333649158478,
-0.06785830110311508,
0.09083180874586105,
0.020495794713497162,
-0.08637276291847229,
0.036502644419670105,
0.04516301676630974,
0.09105876088142395,
0.014827915467321873,
0.010217087343335152,
-0.07425905019044876,
0.13112345337867737,
-0.11291584372520447,
-0.19803541898727417,
-0.1258319914340973,
-0.03617661073803902,
-0.0662415400147438,
-0.002811770886182785,
-0.0013622171245515347,
-0.028892355039715767,
-0.06305739283561707,
-0.06924880295991898,
0.04040471464395523,
0.028382306918501854,
-0.00832965224981308,
0.05136757716536522,
-0.0821266770362854,
-0.013740967959165573,
-0.10565365850925446,
0.01290400791913271,
0.01569337025284767,
-0.08763650059700012,
0.02669168822467327,
0.012177325785160065,
0.0744524672627449,
0.09074915945529938,
0.0408964678645134,
0.0016745124012231827,
-0.011389912106096745,
0.22234228253364563,
-0.0999971479177475,
0.11242833733558655,
0.0937805324792862,
0.048136062920093536,
0.04496490955352783,
-0.0035188328474760056,
0.011406526900827885,
-0.09669774770736694,
0.006693986710160971,
0.021878866478800774,
-0.10134848952293396,
-0.12549099326133728,
-0.06516905128955841,
-0.07045824825763702,
0.0733114704489708,
0.032556477934122086,
0.07484619319438934,
-0.03281053900718689,
0.0872398316860199,
-0.004706087522208691,
0.0579284131526947,
0.009203625842928886,
0.059129498898983,
0.1100156158208847,
-0.013288537040352821,
0.03892047330737114,
-0.06752673536539078,
0.04016011208295822,
0.10355818271636963,
0.1449756622314453,
0.15038494765758514,
-0.07180927693843842,
0.1795562505722046,
0.005825158208608627,
0.04669688642024994,
-0.0029184978920966387,
0.026348650455474854,
-0.06347586959600449,
0.006628594361245632,
-0.04550711065530777,
-0.05440641567111015,
-0.09504617750644684,
0.05638152360916138,
0.029504794627428055,
-0.012835624627768993,
0.03302477300167084,
0.0336417481303215,
0.06800433248281479,
0.11005264520645142,
0.028755251318216324,
-0.14173835515975952,
-0.13233211636543274,
0.05916450545191765,
-0.048727646470069885,
-0.05323530733585358,
0.02801382914185524,
0.1090555191040039,
-0.06785707920789719,
0.05768638476729393,
-0.0430607795715332,
0.03976311534643173,
-0.12490703910589218,
-0.03197942674160004,
0.010188020765781403,
0.12115955352783203,
0.02283434197306633,
0.06285616010427475,
-0.14494282007217407,
0.01767854019999504,
0.028947103768587112,
0.05692926421761513,
-0.06240835785865784,
0.0052774641662836075,
0.06716901063919067,
-0.01851193606853485,
0.06095140054821968,
0.0332203283905983,
0.05719539895653725,
-0.007246422581374645,
-0.10981108248233795,
0.05822886526584625,
0.02399824932217598,
-0.06663776934146881,
0.0672033280134201,
-0.01903659477829933,
0.00967013742774725,
-0.04479815810918808,
-0.030806275084614754,
-0.03482335805892944,
-0.1724090576171875,
0.109917551279068,
0.026828326284885406,
-0.04443271830677986,
-0.09198389202356339,
-0.05308499187231064,
-0.04542804881930351,
0.12345369160175323,
-0.09179125726222992,
-0.0655192881822586,
-0.09540607035160065,
-0.008863958530128002,
0.13926568627357483,
-0.08236785233020782,
0.03402761369943619,
-0.02649853564798832,
0.07922479510307312,
-0.04044453799724579,
-0.07662327587604523,
0.035620324313640594,
-0.06620410829782486,
-0.13833282887935638,
-0.014213853515684605,
0.12987437844276428,
0.06802554428577423,
0.04223770648241043,
-0.02467767521739006,
0.0248362235724926,
-0.0007374961860477924,
-0.13546767830848694,
0.03659580275416374,
0.16354724764823914,
-0.10093840211629868,
0.07578572630882263,
0.008998233824968338,
0.056282974779605865,
-0.028597038239240646,
-0.02041405811905861,
0.06969308853149414,
0.134086474776268,
-0.046044379472732544,
0.14293786883354187,
0.10572154074907303,
-0.0847606211900711,
-0.23933345079421997,
-0.0027135415002703667,
0.0034115053713321686,
0.012743259780108929,
-0.05930013582110405,
-0.20865842700004578,
0.10603891313076019,
0.07135634869337082,
-0.030412696301937103,
0.2463231086730957,
-0.2188207358121872,
-0.08260121196508408,
-0.04506197199225426,
0.07807054370641708,
0.16176119446754456,
-0.14200636744499207,
-0.05945131182670593,
0.019104236736893654,
-0.15652823448181152,
0.08764238655567169,
-0.008545048534870148,
0.13383066654205322,
-0.03277882933616638,
0.11791004240512848,
-0.0024283332750201225,
-0.04760405421257019,
0.15357285737991333,
-0.01916031539440155,
0.0023563108406960964,
-0.04717899486422539,
0.00875171460211277,
0.03830965980887413,
-0.07044487446546555,
0.10464456677436829,
-0.07282939553260803,
0.023399630561470985,
-0.04159965366125107,
-0.03423076868057251,
-0.07676301151514053,
0.03748828172683716,
0.002694046124815941,
-0.040357805788517,
-0.1051921397447586,
0.0727006196975708,
0.003640875918790698,
0.04202892631292343,
-0.02282046526670456,
0.01053663995116949,
-0.023250285536050797,
0.0502089224755764,
0.08577650785446167,
-0.16429203748703003,
-0.06588973104953766,
-0.01433504931628704,
-0.010102749802172184,
0.050629206001758575,
-0.14870452880859375,
0.024819564074277878,
0.09438082575798035,
0.02401292696595192,
0.0952913761138916,
0.012318083085119724,
-0.1391681432723999,
0.04520966112613678,
0.047964926809072495,
-0.13668598234653473,
-0.22408030927181244,
-0.02268742397427559,
-0.002791192615404725,
-0.05193521827459335,
0.05917186662554741,
0.14542487263679504,
-0.015042757615447044,
-0.0184485986828804,
-0.0076794857159256935,
0.0683000385761261,
-0.013590771704912186,
0.09870221465826035,
0.02390287071466446,
0.020586304366588593,
-0.09980276226997375,
0.07092870771884918,
0.0282706618309021,
-0.08736881613731384,
0.02880132384598255,
0.14291831851005554,
-0.08748301863670349,
-0.06412481516599655,
-0.18402400612831116,
-0.019138218834996223,
-0.013987072743475437,
-0.00005551427602767944,
-0.02271345630288124,
-0.039654091000556946,
0.014727394096553326,
0.06719435751438141,
0.024847619235515594,
0.05626577138900757,
-0.03819744288921356,
0.07468364387750626,
-0.04490780830383301,
0.07147626578807831,
-0.04168018698692322,
0.061961621046066284,
-0.10750260949134827,
-0.025158647447824478,
0.014281758107244968,
0.05890302360057831,
-0.023149149492383003,
-0.029657436534762383,
-0.06524677574634552,
-0.026034504175186157,
-0.08302151411771774,
0.0017475225031375885,
-0.11934340745210648,
0.016022469848394394,
0.0008197423303499818,
-0.0228851530700922,
-0.026917632669210434,
0.04861046373844147,
-0.055083826184272766,
-0.016315806657075882,
-0.03076121024787426,
0.00874706357717514,
-0.037964656949043274,
-0.003319600597023964,
0.08032873272895813,
-0.05623120442032814,
0.1455342173576355,
0.006892792880535126,
0.006575069855898619,
0.06086036190390587,
-0.08211186528205872,
0.03366401419043541,
0.017548108473420143,
-0.0025582374073565006,
-0.005154465325176716,
-0.12438777089118958,
0.022284824401140213,
-0.0203020591288805,
0.050757281482219696,
0.007306637242436409,
0.11353696882724762,
-0.06961789727210999,
-0.0014663152396678925,
-0.02728579379618168,
-0.03003762662410736,
-0.03330603241920471,
0.033384405076503754,
0.056387558579444885,
0.01974046416580677,
0.033208899199962616,
-0.031151756644248962,
-0.010764943435788155,
-0.09455014765262604,
-0.0021863083820790052,
0.0020289886742830276,
-0.04722200706601143,
-0.03746367618441582,
-0.02693944424390793,
0.04078543186187744,
0.005316130816936493,
0.19695192575454712,
-0.06002318114042282,
-0.08588504791259766,
-0.018750544637441635,
-0.047517016530036926,
0.03636198863387108,
-0.007848814129829407,
0.08701703697443008,
0.033579885959625244,
-0.027659853920340538,
-0.0037727737799286842,
0.04073915258049965,
0.022350508719682693,
0.010739283636212349,
0.07604780048131943,
-0.02087699994444847,
0.04509929567575455,
0.09927379339933395,
-0.009581165388226509,
-0.050687115639448166,
-0.09454792737960815,
0.07671608030796051,
-0.11068683862686157,
0.07129581272602081,
-0.06086985021829605,
0.14131911098957062,
0.08687125146389008,
-0.1075037270784378,
0.05709460377693176,
0.020256653428077698,
-0.07769894599914551,
-0.041546840220689774,
-0.14493513107299805,
-0.04679262638092041,
-0.08670737594366074,
0.012164732441306114,
-0.09412902593612671,
0.003505610628053546,
0.027741966769099236,
0.015619052574038506,
-0.017257176339626312,
0.17659196257591248,
-0.011454282328486443,
-0.006631794385612011,
0.06622055172920227,
0.035976093262434006,
-0.0640396848320961,
0.06785658001899719,
-0.03730253502726555,
-0.023205533623695374,
0.015507977455854416,
0.0782921314239502,
0.026595616713166237,
-0.00884753093123436,
0.044360581785440445,
0.014554956927895546,
-0.017135530710220337,
-0.026025710627436638,
0.03629017993807793,
0.011757277883589268,
0.11102262884378433,
0.002264192095026374,
-0.0665246993303299,
0.006142209284007549,
0.09876780211925507,
-0.03598304092884064,
0.008468206971883774,
-0.10601308941841125,
0.10450438410043716,
-0.06867331266403198,
-0.01323763094842434,
-0.007494600489735603,
-0.05614270642399788,
0.017380332574248314,
0.18343804776668549,
0.14911742508411407,
-0.06844551116228104,
-0.021132979542016983,
0.013563872314989567,
-0.0031800782307982445,
-0.01671290397644043,
0.10573025047779083,
0.03983006626367569,
0.23393744230270386,
-0.0043755220249295235,
-0.013185491785407066,
-0.00476117292419076,
-0.03557756170630455,
-0.05137038975954056,
0.03868871182203293,
-0.07491405308246613,
0.05203591659665108,
-0.07978026568889618,
0.000263778492808342,
-0.06524334102869034,
-0.15758146345615387,
0.010874789208173752,
-0.14770179986953735,
-0.08701633661985397,
-0.004826591350138187,
-0.05850495398044586,
-0.008935011923313141,
0.04710347205400467,
0.020880339667201042,
0.008322201669216156,
0.09045708179473877,
0.014332616701722145,
-0.16012051701545715,
-0.017646152526140213,
0.07251419126987457,
0.001926202210597694,
0.22153782844543457,
-0.030028726905584335,
0.010202588513493538,
0.08855447173118591,
-0.017823476344347,
-0.14948666095733643,
0.056154437363147736,
0.026558849960565567,
-0.10414660722017288,
-0.04173273220658302,
0.12456433475017548,
-0.004120930097997189,
0.07073105871677399,
0.06485201418399811,
0.0835212990641594,
-0.0028838887810707092,
0.07548202574253082,
0.01979595050215721,
-0.07925613224506378,
-0.012266742065548897,
-0.13555435836315155,
0.16550782322883606,
0.11326629668474197,
-0.01701967790722847,
-0.012581584975123405,
-0.06438765674829483,
0.035884223878383636,
-0.021049559116363525,
0.07383197546005249,
-0.03681512922048569,
-0.13309870660305023,
0.0019645406864583492,
-0.05977807939052582,
0.015514959581196308,
-0.23823955655097961,
-0.060341160744428635,
-0.03606990724802017,
0.006373491138219833,
0.006264045834541321,
0.1235116496682167,
0.06509993970394135,
0.0002874215133488178,
-0.030459638684988022,
-0.1556546837091446,
-0.02983780764043331,
0.05689911171793938,
-0.11495466530323029,
-0.07340739667415619
] |
null | null | null |
# Model Trained Using AutoTrain
This model was trained using AutoTrain. For more information, please visit [AutoTrain](https://hf.co/docs/autotrain).
# Usage
```python
from transformers import AutoModelForCausalLM, AutoTokenizer
model_path = "PATH_TO_THIS_REPO"
tokenizer = AutoTokenizer.from_pretrained(model_path)
model = AutoModelForCausalLM.from_pretrained(
model_path,
device_map="auto",
torch_dtype='auto'
).eval()
# Prompt content: "hi"
messages = [
{"role": "user", "content": "hi"}
]
input_ids = tokenizer.apply_chat_template(conversation=messages, tokenize=True, add_generation_prompt=True, return_tensors='pt')
output_ids = model.generate(input_ids.to('cuda'))
response = tokenizer.decode(output_ids[0][input_ids.shape[1]:], skip_special_tokens=True)
# Model response: "Hello! How can I assist you today?"
print(response)
``` | {"license": "other", "tags": ["autotrain", "text-generation"], "widget": [{"text": "I love AutoTrain because "}]} | text-generation | kouki13/llama2 | [
"safetensors",
"autotrain",
"text-generation",
"license:other",
"endpoints_compatible",
"region:us"
] | 2024-02-13T09:26:40+00:00 | [] | [] | TAGS
#safetensors #autotrain #text-generation #license-other #endpoints_compatible #region-us
|
# Model Trained Using AutoTrain
This model was trained using AutoTrain. For more information, please visit AutoTrain.
# Usage
| [
"# Model Trained Using AutoTrain\n\nThis model was trained using AutoTrain. For more information, please visit AutoTrain.",
"# Usage"
] | [
"TAGS\n#safetensors #autotrain #text-generation #license-other #endpoints_compatible #region-us \n",
"# Model Trained Using AutoTrain\n\nThis model was trained using AutoTrain. For more information, please visit AutoTrain.",
"# Usage"
] | [
33,
29,
3
] | [
"passage: TAGS\n#safetensors #autotrain #text-generation #license-other #endpoints_compatible #region-us \n# Model Trained Using AutoTrain\n\nThis model was trained using AutoTrain. For more information, please visit AutoTrain.# Usage"
] | [
-0.03320549428462982,
0.03780708089470863,
-0.0005784488166682422,
0.037439193576574326,
0.13256101310253143,
-0.02594633586704731,
0.22870999574661255,
0.04971681907773018,
-0.04270017519593239,
-0.08776232600212097,
0.19642603397369385,
0.16802352666854858,
-0.04566871374845505,
0.18935616314411163,
-0.02990073338150978,
-0.2414124757051468,
0.021885043010115623,
-0.025850016623735428,
0.1327640414237976,
0.11522045731544495,
0.14238014817237854,
-0.07779128849506378,
0.06120644509792328,
0.04086628183722496,
-0.20404933393001556,
0.03463415056467056,
0.07968573272228241,
-0.11895040422677994,
0.18004877865314484,
0.032886918634176254,
0.13635416328907013,
0.01931498385965824,
0.14652439951896667,
-0.12186150997877121,
0.014377960003912449,
0.01464270893484354,
-0.015491045080125332,
0.055415596812963486,
0.08804452419281006,
-0.038794226944446564,
0.09763352572917938,
0.177653506398201,
0.10883878171443939,
0.04911845549941063,
-0.10558086633682251,
-0.014727416448295116,
-0.03310466557741165,
0.018835384398698807,
0.12075160443782806,
0.1193094402551651,
-0.01845790445804596,
0.20021599531173706,
-0.14986595511436462,
0.07329507917165756,
-0.0995626449584961,
-0.27255508303642273,
-0.0038277229759842157,
0.21143054962158203,
0.07346842437982559,
-0.025004452094435692,
-0.12620827555656433,
0.06475763022899628,
0.12761425971984863,
0.0030757547356188297,
0.06504988670349121,
-0.015198786742985249,
-0.055105701088905334,
-0.0015243350062519312,
-0.07397002726793289,
-0.004598719999194145,
0.18640007078647614,
-0.07974611967802048,
-0.031184203922748566,
-0.12737500667572021,
-0.019428882747888565,
0.04709514603018761,
0.011552144773304462,
-0.09352482110261917,
-0.0217994824051857,
0.11079124361276627,
-0.007622338831424713,
-0.02531961165368557,
-0.15207529067993164,
-0.05755603685975075,
-0.08864409476518631,
0.04077286645770073,
0.0017509139142930508,
0.011538662947714329,
-0.09947098046541214,
0.12073534727096558,
-0.029350996017456055,
-0.0943499282002449,
0.052897434681653976,
-0.1107030138373375,
0.04635190963745117,
-0.11982002854347229,
-0.03970254212617874,
-0.10856737196445465,
0.013430505990982056,
0.22841021418571472,
0.1669083684682846,
-0.015314205549657345,
-0.08587565273046494,
0.039016176015138626,
0.02371702343225479,
0.09614221751689911,
0.06376225501298904,
-0.015822242945432663,
0.06775996834039688,
-0.04785482585430145,
-0.017039362341165543,
-0.025495992973446846,
-0.1726902425289154,
0.032083623111248016,
0.01997307874262333,
0.07117509841918945,
-0.0760226845741272,
0.06040170043706894,
-0.01951628364622593,
0.055283352732658386,
0.05161101743578911,
-0.031190861016511917,
0.03744623437523842,
-0.052504897117614746,
0.01617865450680256,
-0.09791388362646103,
0.0286922138184309,
0.1180110052227974,
0.03286140412092209,
0.1336720734834671,
-0.09649777412414551,
-0.026225421577692032,
-0.1056324690580368,
-0.03878350928425789,
0.018166208639740944,
-0.0019215025240555406,
0.0628642737865448,
-0.19663763046264648,
-0.30395275354385376,
-0.027070891112089157,
0.053043100982904434,
-0.019671862944960594,
-0.05561401695013046,
-0.07015043497085571,
0.016289202496409416,
0.059536442160606384,
-0.02920805849134922,
0.054385289549827576,
-0.022419849410653114,
0.03813159465789795,
-0.07676586508750916,
-0.02052054926753044,
-0.06291672587394714,
0.006658008787781,
-0.14841435849666595,
-0.03448035567998886,
-0.030017102137207985,
0.006548900622874498,
-0.03775618225336075,
0.16895608603954315,
-0.011088937520980835,
0.047757651656866074,
-0.05747115612030029,
0.05074193328619003,
0.007877329364418983,
0.1440490484237671,
-0.1335235834121704,
0.005429679993540049,
0.1511751264333725,
-0.11302075535058975,
-0.10663392394781113,
0.09467647224664688,
-0.10317569971084595,
0.23649843037128448,
0.10416192561388016,
0.13955152034759521,
0.05125761032104492,
-0.12630151212215424,
0.11601320654153824,
0.03282208740711212,
-0.08780468255281448,
-0.062369491904973984,
-0.0006791196065023541,
-0.034443121403455734,
-0.22099432349205017,
0.031658004969358444,
0.11068084836006165,
0.07476310431957245,
-0.03403317928314209,
-0.08304393291473389,
-0.02895026095211506,
-0.058612581342458725,
0.03986813873052597,
0.016017582267522812,
0.12599535286426544,
-0.07699156552553177,
-0.02858225256204605,
0.032077912241220474,
0.038467586040496826,
0.07923582941293716,
-0.054815541952848434,
-0.057291675359010696,
-0.01996961608529091,
-0.023569827899336815,
-0.00915558822453022,
-0.0898597314953804,
-0.0620407834649086,
-0.006840218789875507,
0.1304454207420349,
0.03466487303376198,
0.07167287915945053,
0.0362425372004509,
0.052633073180913925,
-0.028641145676374435,
0.002677651820704341,
0.1629824936389923,
0.04459667578339577,
-0.12675853073596954,
-0.08582112193107605,
0.10815013945102692,
-0.07446087151765823,
0.1071702167391777,
-0.2590586841106415,
0.028333326801657677,
-0.11371348798274994,
0.08611167222261429,
-0.013308924622833729,
0.06491301208734512,
-0.08320876955986023,
0.024355897679924965,
-0.08930765837430954,
-0.008432179689407349,
0.05678462237119675,
0.04953930526971817,
-0.02282531000673771,
0.12372811883687973,
-0.1432238668203354,
0.21934939920902252,
0.1198250874876976,
-0.09310522675514221,
-0.11077594012022018,
-0.0739443302154541,
0.009118417277932167,
-0.005148864816874266,
-0.1179550290107727,
0.005491754971444607,
0.076014444231987,
-0.04686584323644638,
0.1847466230392456,
-0.034107014536857605,
-0.03428659960627556,
-0.015382813289761543,
-0.08532355725765228,
-0.009268855676054955,
-0.02073976956307888,
0.09649215638637543,
-0.2238936424255371,
0.1325010061264038,
0.16212041676044464,
-0.015046309679746628,
0.1718226969242096,
0.01847519353032112,
0.013679388910531998,
0.006052343640476465,
-0.04082776978611946,
-0.00007846848893677816,
0.02128027006983757,
0.0015916629927232862,
0.0011914868373423815,
0.007707077544182539,
0.02131907269358635,
0.030305195599794388,
-0.14438240230083466,
-0.05413905158638954,
0.010167223401367664,
0.052466847002506256,
0.00018202696810476482,
0.0614926852285862,
-0.08105885237455368,
0.05735839903354645,
-0.0333511158823967,
-0.11407014727592468,
0.12527471780776978,
0.0140310637652874,
-0.12375999987125397,
0.1809239387512207,
-0.09875242412090302,
-0.177916020154953,
-0.19897617399692535,
-0.11664178967475891,
0.025174645707011223,
0.09509945660829544,
0.06778308749198914,
-0.06591268628835678,
-0.0677633062005043,
-0.013884147629141808,
-0.13205823302268982,
0.015237858518958092,
-0.0303916335105896,
-0.10815607011318207,
0.06643082201480865,
0.002197817200794816,
-0.1106930822134018,
-0.04751880466938019,
0.012397545389831066,
-0.05212624743580818,
0.06534521281719208,
-0.032029394060373306,
0.06015416979789734,
0.12733860313892365,
-0.009645693004131317,
0.014830506406724453,
-0.03892328962683678,
0.1736617386341095,
-0.07863081991672516,
0.0028175772167742252,
0.11224561184644699,
-0.04382455348968506,
0.03531843051314354,
0.2027312070131302,
0.03458266332745552,
-0.07247956842184067,
0.06938916444778442,
-0.03509911522269249,
-0.05979844182729721,
-0.202435702085495,
-0.10123657435178757,
-0.007523522712290287,
-0.02823515795171261,
0.08373580127954483,
0.0565473809838295,
0.25448861718177795,
0.1288231760263443,
0.060374923050403595,
0.03997355327010155,
0.024889161810278893,
0.0913970097899437,
0.1029813289642334,
-0.027027886360883713,
0.16222402453422546,
-0.08429007232189178,
-0.14650671184062958,
0.048164136707782745,
-0.022769063711166382,
0.07281020283699036,
0.17174853384494781,
-0.06210782378911972,
0.04705783352255821,
0.11571547389030457,
0.13094793260097504,
0.12702703475952148,
0.07746905833482742,
-0.061997704207897186,
-0.006629003677517176,
0.0010869213147088885,
-0.04415592923760414,
0.14652740955352783,
-0.060009948909282684,
-0.06889448314905167,
-0.04306207224726677,
-0.003198902355507016,
0.04323491454124451,
0.05818231403827667,
0.026216039434075356,
-0.28657910227775574,
0.042942874133586884,
0.04888097196817398,
-0.05969006195664406,
-0.11467164009809494,
0.09232109785079956,
-0.027857046574354172,
-0.18361465632915497,
0.03563778102397919,
-0.033283449709415436,
0.09147034585475922,
0.062072351574897766,
0.04841171205043793,
-0.06585943698883057,
-0.0609852597117424,
-0.045712124556303024,
0.15376420319080353,
-0.33846980333328247,
0.20756816864013672,
-0.011205663904547691,
0.08115556091070175,
-0.10785048454999924,
0.010794016532599926,
0.08773794025182724,
0.19103488326072693,
0.12050216645002365,
-0.049261946231126785,
-0.19848455488681793,
-0.11937171965837479,
-0.08363119512796402,
-0.015415008179843426,
0.02001480758190155,
-0.008096402511000633,
0.0008919041720218956,
-0.11757626384496689,
0.0014032695908099413,
0.04126403480768204,
-0.0069845812395215034,
-0.17894983291625977,
-0.15384836494922638,
-0.03538630157709122,
0.030474675819277763,
0.10934672504663467,
-0.04776112735271454,
-0.0534328930079937,
-0.06292759627103806,
0.13548673689365387,
0.026695549488067627,
0.008182995021343231,
-0.1301279366016388,
-0.053804632276296616,
-0.044131867587566376,
-0.023950019851326942,
0.07710648328065872,
0.009424211457371712,
0.11959850043058395,
-0.08615647256374359,
-0.06447352468967438,
0.09218238294124603,
-0.12910714745521545,
-0.042984966188669205,
-0.12177132815122604,
0.03449074551463127,
-0.045684002339839935,
-0.01073586754500866,
0.11459703743457794,
0.04736353084445,
-0.07455705851316452,
-0.06686578691005707,
-0.016151487827301025,
-0.0162202138453722,
0.052238523960113525,
-0.10140960663557053,
-0.11989933252334595,
-0.12391869723796844,
-0.023699220269918442,
-0.11985665559768677,
0.1933230459690094,
0.14995472133159637,
-0.08873795717954636,
0.15256796777248383,
0.2099498212337494,
-0.11413656920194626,
-0.29302918910980225,
-0.05128840357065201,
-0.06601350009441376,
0.004299632739275694,
0.06156041473150253,
-0.10058135539293289,
0.1023014560341835,
0.016915474086999893,
-0.08869403600692749,
-0.016260353848338127,
-0.10926515609025955,
-0.16224952042102814,
0.22960300743579865,
-0.0020108406897634268,
0.18459931015968323,
-0.07568172365427017,
-0.05459576100111008,
-0.12268339842557907,
0.05030543729662895,
0.043312136083841324,
-0.06949128210544586,
0.04921199381351471,
0.045118432492017746,
0.04848489910364151,
0.02309754677116871,
-0.04944291338324547,
0.05402865633368492,
-0.07527824491262436,
0.09563448280096054,
-0.16834798455238342,
-0.019022751599550247,
0.05676575005054474,
-0.027846379205584526,
0.11607834696769714,
-0.040225449949502945,
0.045501600950956345,
-0.05838647112250328,
-0.07079911977052689,
0.02105431631207466,
0.07136379927396774,
-0.007516450714319944,
-0.11632271111011505,
0.009460309520363808,
0.0020681610330939293,
-0.007515698205679655,
-0.07468903809785843,
0.01720641367137432,
-0.009510648436844349,
0.14864802360534668,
0.13830016553401947,
0.2062399536371231,
-0.06995580345392227,
0.06706579029560089,
-0.03199863061308861,
-0.11711113899946213,
0.07805433124303818,
-0.07166967540979385,
0.004296483471989632,
0.05220668390393257,
-0.0538930743932724,
0.14611311256885529,
0.06082209199666977,
0.003751826472580433,
-0.01890469156205654,
0.16250212490558624,
-0.16876746714115143,
0.04684048146009445,
-0.0843876302242279,
0.1279323697090149,
0.04778100550174713,
-0.03293748199939728,
0.09026376903057098,
-0.07791304588317871,
-0.03329215198755264,
-0.0002585914626251906,
0.006090222392231226,
-0.038581836968660355,
0.06518552452325821,
0.04536600783467293,
0.02252393215894699,
-0.06704199314117432,
0.0445764996111393,
0.07239795476198196,
0.016518399119377136,
0.041721411049366,
0.015846284106373787,
-0.09952405095100403,
-0.09522253274917603,
0.04372299090027809,
0.26397231221199036,
-0.1863422393798828,
-0.09990737587213516,
0.004564397502690554,
-0.09345841407775879,
0.004960347898304462,
0.08620705455541611,
0.0809662714600563,
0.04341237619519234,
-0.03603934869170189,
-0.02565331570804119,
-0.11602527648210526,
0.08217493444681168,
-0.015696978196501732,
0.05509110167622566,
-0.16319575905799866,
0.06676459312438965,
-0.030968010425567627,
-0.008549565449357033,
-0.08279257267713547,
-0.010031647980213165,
-0.11571928858757019,
0.026098787784576416,
-0.10430167615413666,
-0.03189973905682564,
-0.041006896644830704,
-0.011233619414269924,
0.05850789323449135,
-0.011018243618309498,
-0.013110441155731678,
-0.01927962154150009,
-0.08805359154939651,
0.02887921780347824,
-0.0008198951254598796,
0.04547540098428726,
-0.05460818111896515,
-0.024217726662755013,
0.037278566509485245,
0.004562355112284422,
0.046250831335783005,
0.012032478116452694,
-0.0011190201621502638,
0.049139540642499924,
-0.14732354879379272,
0.009436994791030884,
0.06159417703747749,
-0.0016145178815349936,
0.0070913624949753284,
-0.028678715229034424,
0.005330502521246672,
0.09783722460269928,
0.018718764185905457,
0.04128317907452583,
-0.0048657008446753025,
-0.1091027706861496,
0.014511657878756523,
0.10307195782661438,
-0.14174701273441315,
-0.03145497664809227,
-0.052812907844781876,
0.01100962609052658,
-0.05524790287017822,
0.23351503908634186,
-0.11669892817735672,
0.04470064863562584,
-0.02692001312971115,
0.030550040304660797,
-0.05822846665978432,
-0.10757116973400116,
-0.12190251797437668,
-0.0954190194606781,
-0.042861051857471466,
0.007703589275479317,
0.2689315676689148,
0.1459355354309082,
-0.008143693208694458,
0.0415508970618248,
0.07256698608398438,
0.09993022680282593,
0.001325596240349114,
0.22187061607837677,
0.09407079964876175,
-0.011255222372710705,
-0.12900875508785248,
0.0802748054265976,
0.027718892320990562,
-0.10550516843795776,
0.0003671931044664234,
0.017833324149250984,
-0.07709381729364395,
0.05998256057500839,
0.04779348149895668,
-0.04618219658732414,
-0.11530262231826782,
-0.1887446641921997,
-0.1010153517127037,
0.01362328790128231,
-0.09494820982217789,
-0.00841664057224989,
0.17340072989463806,
-0.07381404936313629,
-0.020257510244846344,
-0.08453129231929779,
-0.042230453342199326,
-0.21403644979000092,
-0.1685105264186859,
-0.09951409697532654,
-0.07172851264476776,
0.054574232548475266,
-0.01444533746689558,
0.051937036216259,
0.0384058877825737,
0.03334033116698265,
-0.0690227821469307,
0.10118697583675385,
-0.11317354440689087,
0.006825347896665335,
-0.007538147736340761,
-0.042660877108573914,
0.007157159503549337,
-0.17031751573085785,
-0.023363124579191208,
-0.1397811770439148,
-0.04669688642024994,
-0.031707603484392166,
-0.04375086724758148,
0.0007692996296100318,
-0.003963754046708345,
-0.03139100596308708,
-0.009807240217924118,
-0.01006900705397129,
0.03744599595665932,
0.023235660046339035,
0.05043753236532211,
0.022183645516633987,
0.01541586872190237,
0.043549589812755585,
0.21836970746517181,
-0.03527946025133133,
-0.18426218628883362,
-0.12376350164413452,
0.24631790816783905,
0.03293769061565399,
0.11490416526794434,
-0.07057193666696548,
-0.01361043006181717,
0.07598087936639786,
0.31235218048095703,
0.2598150074481964,
-0.03414434567093849,
0.010121017694473267,
-0.03132476285099983,
-0.014958096668124199,
-0.0064048562198877335,
0.18490195274353027,
0.008828791789710522,
0.16826002299785614,
-0.0621221587061882,
0.059055350720882416,
-0.016177164390683174,
-0.07808512449264526,
-0.06689254939556122,
0.14256809651851654,
-0.036333873867988586,
-0.02151089534163475,
-0.01796986348927021,
0.08792226016521454,
-0.0589551106095314,
0.17949369549751282,
-0.09007178992033005,
-0.009130639024078846,
-0.04809116572141647,
0.053617071360349655,
0.11827872693538666,
-0.02074413187801838,
0.03285614401102066,
-0.03567332774400711,
-0.018393725156784058,
0.0029441264923661947,
-0.04050283133983612,
-0.07413910329341888,
-0.04345672205090523,
0.06311136484146118,
0.02551795169711113,
0.25671228766441345,
-0.009337767027318478,
0.05477561056613922,
0.07988451421260834,
-0.0020537625532597303,
-0.10351628065109253,
0.11267323791980743,
0.00224103475920856,
-0.029008302837610245,
0.12491703033447266,
-0.015443749725818634,
0.007564615458250046,
-0.01867114193737507,
-0.01239294558763504,
-0.15698960423469543,
0.14728498458862305,
-0.10142818093299866,
-0.08940913528203964,
-0.05584051460027695,
0.12545742094516754,
-0.032320525497198105,
0.16258437931537628,
0.05726946145296097,
-0.026426637545228004,
0.0021389273460954428,
-0.0331779383122921,
0.08067825436592102,
0.009919043630361557,
-0.09914126992225647,
-0.02203422784805298,
-0.17707498371601105,
-0.016973769292235374,
0.12876249849796295,
-0.02544221095740795,
-0.24601322412490845,
-0.07971391826868057,
-0.06824030727148056,
-0.04311496391892433,
-0.1386985182762146,
0.07398401945829391,
0.2028772532939911,
0.019287997856736183,
-0.01476763840764761,
-0.1369636058807373,
-0.021961720660328865,
0.019149890169501305,
-0.026857441291213036,
-0.10799262672662735
] |
null | null | transformers | # Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
We fine-tuned the [jphme/em_german_leo_mistral](https://huggingface.co/jphme/em_german_leo_mistral) with a set of ca. 2000 newspaper articles which have been simplified by the Austrian Press Agency.
Our aim was to have a model which can simplify German-language text.
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
- **Developed by:** Members of the [Public Interest AI research group](https://publicinterest.ai/), [HIIG Berlin](https://www.hiig.de/)
- **Model type:** simplification model, text generation
- **Language(s) (NLP):** German
- **License:** Apache 2.0
- **Finetuned from model:** jphme/em_german_leo_mistral
### Model Sources
<!-- Provide the basic links for the model. -->
- **Repository:** https://github.com/fhewett/simba
<!-- - **Paper [optional]:** [More Information Needed] -->
- **Project website:** https://publicinterest.ai/tool/simba
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
This model works best for simplifying German-language newspaper articles (news items, not commentaries or editorials). It may work for other types of texts.
### Downstream Use
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
We have fine-tuned using only newspaper articles. We have not yet performed extensive out-of-domain testing, but believe that the model's capabilities could be improved by fine-tuning on more diverse data. Contact us if you have a dataset which you think could work (parallel texts, German standard & German simplified).
<!-- ### Out-of-Scope Use -->
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
As with most text generation models, the model sometimes produces information that is incorrect.
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Please check manually that your output text corresponds to the input text, as factual inconsistencies may have arisen.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
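Until official example code is added, here is a minimal sketch with the Transformers library (the prompt format follows the base model's `em_german` convention and should be treated as an assumption):

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "hiig-piai/simba-v01c"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

# Ask the model to simplify a German text; replace <Artikeltext> with your input.
prompt = ("Du bist ein hilfreicher Assistent. "
          "USER: Vereinfache den folgenden Text: <Artikeltext> ASSISTANT:")
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=512)
print(tokenizer.decode(outputs[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))
```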
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
A sample of the data used to train our model can be found [here](https://github.com/fhewett/apa-rst/tree/main/original_texts).
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
<!-- #### Speeds, Sizes, Times [optional] -->
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
#### Summary
For now, we have manually checked the performance of our model on a small sample of texts. Whilst it seems to produce good summaries of all texts, it only seems to simplify newspaper articles (i.e. texts similar to our training data). We have not yet applied any large-scale metrics-based evaluation.
<!-- ## Citation [optional]
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]-->
## Model Card Contact
simba -at- hiig.de | {"language": ["de"], "license": "apache-2.0", "tags": ["german", "deutsch", "simplification", "vereinfachung"], "pipeline_tag": "text-generation"} | text-generation | hiig-piai/simba-v01c | [
"transformers",
"safetensors",
"mistral",
"text-generation",
"german",
"deutsch",
"simplification",
"vereinfachung",
"conversational",
"de",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | 2024-02-13T09:27:22+00:00 | [] | [
"de"
] | TAGS
#transformers #safetensors #mistral #text-generation #german #deutsch #simplification #vereinfachung #conversational #de #license-apache-2.0 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
| # Model Card for Model ID
We fine-tuned the jphme/em_german_leo_mistral with a set of ca. 2000 newspaper articles which have been simplified by the Austrian Press Agency.
Our aim was to have a model which can simplify German-language text.
## Model Details
### Model Description
- Developed by: Members of the Public Interest AI research group, HIIG Berlin
- Model type: simplification model, text generation
- Language(s) (NLP): German
- License: Apache 2.0
- Finetuned from model: jphme/em_german_leo_mistral
### Model Sources
- Repository: URL
- Project website: URL
## Uses
### Direct Use
This model works best for simplifying German-language newspaper articles (news items, not commentaries or editorials). It may work for other types of texts.
### Downstream Use
We have fine-tuned using only newspaper articles. We have not yet performed extensive out-of-domain testing, but believe that the model's capabilities could be improved by fine-tuning on more diverse data. Contact us if you have a dataset which you think could work (parallel texts, German standard & German simplified).
## Bias, Risks, and Limitations
As with most text generation models, the model sometimes produces information that is incorrect.
### Recommendations
Please check manually that your output text corresponds to the input text, as factual inconsistencies may have arisen.
## How to Get Started with the Model
Use the code below to get started with the model.
## Training Details
### Training Data
A sample of the data used to train our model can be found here.
#### Training Hyperparameters
- Training regime:
## Evaluation
#### Summary
For now, we have manually checked the performance of our model on a small sample of texts. Whilst it seems to produce good summaries of all texts, it only seems to simplify newspaper articles (i.e. texts similar to our training data). We have not yet applied any large-scale metrics-based evaluation.
## Model Card Contact
simba -at- URL | [
"# Model Card for Model ID\n\n\n\nWe fine-tuned the jphme/em_german_leo_mistral with a set of ca. 2000 newspaper articles which have been simplified by the Austrian Press Agency. \nOur aim was to have a model which can simplify German-language text.",
"## Model Details",
"### Model Description\n\n\n\n\n\n- Developed by: Members of the Public Interest AI research group, HIIG Berlin\n- Model type: simplification model, text generation\n- Language(s) (NLP): German\n- License: Apache 2.0\n- Finetuned from model: jphme/em_german_leo_mistral",
"### Model Sources\n\n\n\n- Repository: URL\n\n- Project website: URL",
"## Uses",
"### Direct Use\n\n\n\nThis model works best for simplifying German-language newspaper articles (news items, not commentaries or editorials). It may work for other types of texts.",
"### Downstream Use\n\n\nWe have fine-tuned using only newspaper articles. We have not yet performed extensive out-of-domain testing, but believe that the model's capabilities could be improved by fine-tuning on more diverse data. Contact us if you have a dataset which you think could work (parallel texts, German standard & German simplified).",
"## Bias, Risks, and Limitations\n\n\n\nAs with most text generation models, the model sometimes produces information that is incorrect.",
"### Recommendations\n\n\n\nPlease check manually that your output text corresponds to the input text, as factual inconsistencies may have arisen.",
"## How to Get Started with the Model\n\nUse the code below to get started with the model.",
"## Training Details",
"### Training Data\n\n\n\nA sample of the data used to train our model can be found here.",
"#### Training Hyperparameters\n\n- Training regime:",
"## Evaluation",
"#### Summary\n\nFor now, we have manually checked the performance of our model on a small sample of texts. Whilst it seems to produce good summaries of all texts, it only seems to simplify newspaper articles (i.e. similar to our training data). We have not yet applied any large-scale metrics based evaluation.",
"## Model Card Contact\n\nsimba -at- URL"
] | [
"TAGS\n#transformers #safetensors #mistral #text-generation #german #deutsch #simplification #vereinfachung #conversational #de #license-apache-2.0 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n",
"# Model Card for Model ID\n\n\n\nWe fine-tuned the jphme/em_german_leo_mistral with a set of ca. 2000 newspaper articles which have been simplified by the Austrian Press Agency. \nOur aim was to have a model which can simplify German-language text.",
"## Model Details",
"### Model Description\n\n\n\n\n\n- Developed by: Members of the Public Interest AI research group, HIIG Berlin\n- Model type: simplification model, text generation\n- Language(s) (NLP): German\n- License: Apache 2.0\n- Finetuned from model: jphme/em_german_leo_mistral",
"### Model Sources\n\n\n\n- Repository: URL\n\n- Project website: URL",
"## Uses",
"### Direct Use\n\n\n\nThis model works best for simplifying German-language newspaper articles (news items, not commentaries or editorials). It may work for other types of texts.",
"### Downstream Use\n\n\nWe have fine-tuned using only newspaper articles. We have not yet performed extensive out-of-domain testing, but believe that the model's capabilities could be improved by fine-tuning on more diverse data. Contact us if you have a dataset which you think could work (parallel texts, German standard & German simplified).",
"## Bias, Risks, and Limitations\n\n\n\nAs with most text generation models, the model sometimes produces information that is incorrect.",
"### Recommendations\n\n\n\nPlease check manually that your output text corresponds to the input text, as factual inconsistencies may have arisen.",
"## How to Get Started with the Model\n\nUse the code below to get started with the model.",
"## Training Details",
"### Training Data\n\n\n\nA sample of the data used to train our model can be found here.",
"#### Training Hyperparameters\n\n- Training regime:",
"## Evaluation",
"#### Summary\n\nFor now, we have manually checked the performance of our model on a small sample of texts. Whilst it seems to produce good summaries of all texts, it only seems to simplify newspaper articles (i.e. similar to our training data). We have not yet applied any large-scale metrics based evaluation.",
"## Model Card Contact\n\nsimba -at- URL"
] | [
76,
64,
3,
69,
16,
3,
39,
84,
27,
33,
20,
3,
19,
11,
3,
75,
10
] | [
"passage: TAGS\n#transformers #safetensors #mistral #text-generation #german #deutsch #simplification #vereinfachung #conversational #de #license-apache-2.0 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n# Model Card for Model ID\n\n\n\nWe fine-tuned the jphme/em_german_leo_mistral with a set of ca. 2000 newspaper articles which have been simplified by the Austrian Press Agency. \nOur aim was to have a model which can simplify German-language text.## Model Details### Model Description\n\n\n\n\n\n- Developed by: Members of the Public Interest AI research group, HIIG Berlin\n- Model type: simplification model, text generation\n- Language(s) (NLP): German\n- License: Apache 2.0\n- Finetuned from model: jphme/em_german_leo_mistral### Model Sources\n\n\n\n- Repository: URL\n\n- Project website: URL## Uses### Direct Use\n\n\n\nThis model works best for simplifying German-language newspaper articles (news items, not commentaries or editorials). It may work for other types of texts.### Downstream Use\n\n\nWe have fine-tuned using only newspaper articles. We have not yet performed extensive out-of-domain testing, but believe that the model's capabilities could be improved by fine-tuning on more diverse data. Contact us if you have a dataset which you think could work (parallel texts, German standard & German simplified).## Bias, Risks, and Limitations\n\n\n\nAs with most text generation models, the model sometimes produces information that is incorrect.### Recommendations\n\n\n\nPlease check manually that your output text corresponds to the input text, as factual inconsistencies may have arisen.## How to Get Started with the Model\n\nUse the code below to get started with the model.## Training Details### Training Data\n\n\n\nA sample of the data used to train our model can be found here.#### Training Hyperparameters\n\n- Training regime:## Evaluation"
] | [
-0.07368363440036774,
0.10658423602581024,
-0.0013999267248436809,
-0.004904432222247124,
0.058464113622903824,
0.01787254959344864,
0.12052600085735321,
0.061760663986206055,
-0.0038016049657016993,
0.046045560389757156,
0.01323364581912756,
-0.06746524572372437,
0.029858918860554695,
0.044573716819286346,
-0.004719214979559183,
-0.19454830884933472,
0.05620681867003441,
-0.08387388288974762,
0.04237198457121849,
0.05480450391769409,
0.14764748513698578,
-0.05369716137647629,
0.04729758948087692,
-0.004122001118957996,
0.03581458330154419,
0.04471508413553238,
-0.033733926713466644,
0.03221734240651131,
0.031815849244594574,
0.05261394754052162,
0.06444817036390305,
-0.007399707566946745,
0.04890936613082886,
-0.1536591649055481,
0.009156680665910244,
0.04059825465083122,
-0.035470422357320786,
0.016706928610801697,
0.06800195574760437,
-0.03375646471977234,
0.19770357012748718,
-0.1307787299156189,
-0.033796824514865875,
0.0995972603559494,
-0.0638338103890419,
-0.03239504620432854,
-0.13858923316001892,
0.13123562932014465,
0.14424555003643036,
0.13389749825000763,
-0.061267659068107605,
0.08367670327425003,
-0.07977883517742157,
0.005236039869487286,
-0.01287703774869442,
-0.13332980871200562,
-0.05124231427907944,
0.018641933798789978,
-0.002385323401540518,
-0.0049973600544035435,
-0.08876927942037582,
0.03365987911820412,
0.06751872599124908,
0.026634974405169487,
-0.07761257141828537,
-0.02703467383980751,
0.15348400175571442,
-0.0484028197824955,
-0.11043655872344971,
-0.03698962926864624,
0.09498964995145798,
0.024562617763876915,
-0.10665367543697357,
-0.21457166969776154,
0.013412948697805405,
0.12699364125728607,
0.015944769605994225,
-0.033961985260248184,
0.0008775779278948903,
0.020216183736920357,
0.10804712772369385,
-0.11929117143154144,
-0.1164650097489357,
0.023298028856515884,
0.01575501821935177,
0.1790967732667923,
0.012374254874885082,
-0.01398494467139244,
-0.009351386688649654,
0.12206932157278061,
-0.04393915459513664,
-0.1048787459731102,
-0.061745401471853256,
-0.07345576584339142,
-0.028351014479994774,
-0.04273984581232071,
-0.0577746257185936,
-0.15271802246570587,
0.006327959708869457,
0.1893208771944046,
-0.02596387267112732,
0.0016914068255573511,
0.03958800435066223,
0.02537734992802143,
0.08278632164001465,
0.1466214954853058,
-0.042948704212903976,
-0.1258237063884735,
-0.0027359100058674812,
0.01778869330883026,
0.04930247366428375,
0.05443023145198822,
-0.039682235568761826,
0.001826498773880303,
-0.022782176733016968,
0.031317539513111115,
0.013678952120244503,
0.02070421352982521,
-0.02886984683573246,
-0.024400947615504265,
0.12814272940158844,
-0.15209205448627472,
-0.004595912527292967,
-0.04054637998342514,
-0.02624603919684887,
0.19412097334861755,
0.0007507439004257321,
-0.05249888077378273,
-0.09708942472934723,
0.07880649715662003,
-0.056764740496873856,
-0.01781211793422699,
-0.09802336245775223,
-0.06983201205730438,
0.01497315801680088,
-0.053759731352329254,
-0.04049333557486534,
-0.05611356720328331,
-0.12853994965553284,
-0.09864378720521927,
0.08154188841581345,
-0.05647452548146248,
0.04349005967378616,
-0.04602853208780289,
-0.08572064340114594,
0.019648199900984764,
0.036329638212919235,
-0.1417420208454132,
-0.019679905846714973,
-0.027172185480594635,
-0.04870425537228584,
0.0021285011898726225,
-0.046776991337537766,
0.011669020168483257,
-0.12010041624307632,
0.012133528478443623,
-0.2577517628669739,
0.1249806359410286,
-0.09771154075860977,
0.05413699522614479,
-0.09200464189052582,
0.015301871113479137,
0.06092981621623039,
0.06303476542234421,
0.010314642451703548,
0.1491241455078125,
-0.16895072162151337,
-0.02213769219815731,
0.12461401522159576,
-0.15799817442893982,
-0.02009078860282898,
0.10358433425426483,
-0.05092691630125046,
0.12617963552474976,
0.15604594349861145,
0.018576955422759056,
0.15244077146053314,
-0.1832873523235321,
-0.12684611976146698,
-0.008789462968707085,
0.009574880823493004,
0.15213105082511902,
0.049460526555776596,
-0.10601548850536346,
0.01763850264251232,
0.026257697492837906,
-0.1383458375930786,
-0.007266656961292028,
0.016416378319263458,
-0.038554299622774124,
-0.0014699911698698997,
-0.050025615841150284,
0.10347723960876465,
-0.0035647631157189608,
-0.027096016332507133,
-0.029182996600866318,
-0.18713024258613586,
0.07941961288452148,
0.10268191248178482,
-0.018201585859060287,
-0.008825795724987984,
-0.04666668176651001,
-0.02874002791941166,
0.016022736206650734,
0.009712479077279568,
-0.11617628484964371,
-0.058644235134124756,
-0.020859260112047195,
-0.053758297115564346,
0.1326623260974884,
0.028677381575107574,
0.061567097902297974,
0.039880868047475815,
-0.0783294141292572,
-0.01795896142721176,
-0.10343037545681,
0.005081505049020052,
0.004000613931566477,
-0.12976956367492676,
-0.03000335581600666,
-0.05655987933278084,
0.14877775311470032,
-0.14901240170001984,
0.003857534611597657,
0.021620601415634155,
0.1380687952041626,
0.04111819714307785,
-0.00770441722124815,
-0.019529717043042183,
-0.00525701092556119,
-0.03355805575847626,
-0.02170923724770546,
-0.01821981929242611,
-0.01717999577522278,
-0.08017206192016602,
0.03437088802456856,
0.034554846584796906,
-0.1850811243057251,
0.05385841056704521,
0.02600463479757309,
-0.12292670458555222,
0.03521354869008064,
-0.03263304755091667,
-0.007351425476372242,
-0.09011850506067276,
-0.17099593579769135,
0.14820833504199982,
0.006031688302755356,
0.0505622997879982,
-0.07043284922838211,
-0.06205276399850845,
-0.0550382025539875,
-0.04064721614122391,
-0.037519801408052444,
0.10359344631433487,
-0.01433371938765049,
-0.1613154411315918,
0.06862448900938034,
0.09223725646734238,
-0.07347206771373749,
0.19381104409694672,
-0.03495864197611809,
-0.1306261420249939,
-0.00022794620599597692,
0.06341670453548431,
0.01005767472088337,
0.12107384204864502,
0.02168428525328636,
0.059165503829717636,
0.007348678074777126,
0.012377106584608555,
0.06282667815685272,
-0.04938287287950516,
0.045314207673072815,
-0.005285406019538641,
-0.019390517845749855,
0.005959239322692156,
0.004056790843605995,
0.0008646667120046914,
0.0628887414932251,
0.01616516150534153,
0.047559093683958054,
0.02267417684197426,
-0.04328422248363495,
-0.09996863454580307,
0.1373739093542099,
-0.064280666410923,
-0.1719205379486084,
-0.17332111299037933,
0.10048352926969528,
-0.09258595108985901,
0.0154842808842659,
0.012340889312326908,
-0.09165901690721512,
-0.1112980917096138,
-0.09210734069347382,
0.04477822780609131,
0.0818432942032814,
-0.07291556894779205,
-0.09119787067174911,
0.03825850412249565,
0.06355602294206619,
-0.12058383971452713,
-0.00804427731782198,
-0.007140908855944872,
-0.07772007584571838,
-0.066074438393116,
0.05690469965338707,
0.07405558973550797,
0.08315212279558182,
-0.014494682662189007,
-0.036089811474084854,
0.014966449700295925,
0.2054821252822876,
-0.13956831395626068,
0.11044841259717941,
0.11931360512971878,
-0.0975397452712059,
0.07280747592449188,
0.1499849408864975,
0.04605624079704285,
-0.04089592024683952,
0.02730501815676689,
0.05779633671045303,
-0.002284137299284339,
-0.2219555675983429,
-0.11431828141212463,
-0.0509374737739563,
-0.06989860534667969,
0.004018477164208889,
0.06149519607424736,
0.01749328151345253,
0.028645532205700874,
-0.1238187626004219,
-0.03311068192124367,
0.029673123732209206,
0.046153903007507324,
0.05321540683507919,
0.015148712322115898,
0.010233237408101559,
-0.07882804423570633,
-0.0677509531378746,
0.10927847027778625,
-0.05031268671154976,
0.2075624167919159,
0.022385211661458015,
0.06257264316082001,
0.09438527375459671,
-0.067422054708004,
-0.0035657871048897505,
0.06229178234934807,
0.019545024260878563,
-0.03517317399382591,
-0.0038087042048573494,
-0.09713761508464813,
0.041735611855983734,
0.12796224653720856,
-0.040170975029468536,
0.040860943496227264,
-0.03533811867237091,
-0.05394590273499489,
0.09380366653203964,
0.10190489143133163,
0.10537417232990265,
-0.08036085963249207,
-0.10585150867700577,
0.04408193379640579,
-0.10440649092197418,
-0.02194741554558277,
-0.010805306024849415,
0.05115733668208122,
-0.11676305532455444,
0.07954569160938263,
0.011349424719810486,
0.10916177928447723,
-0.09869778901338577,
-0.01468459889292717,
0.005607874598354101,
0.051131993532180786,
-0.008427536115050316,
0.08704107999801636,
-0.11540749669075012,
0.15384261310100555,
0.02173442766070366,
0.15868791937828064,
-0.06651856005191803,
0.07958984375,
0.06921732425689697,
0.024813758209347725,
0.12421081960201263,
0.027257975190877914,
-0.012548334896564484,
0.021055513992905617,
-0.07382446527481079,
-0.02951996587216854,
0.09544595330953598,
-0.08688104897737503,
0.12336993962526321,
-0.026468677446246147,
0.00561318127438426,
-0.05813460797071457,
-0.01858580857515335,
-0.28454264998435974,
-0.17144379019737244,
0.029188722372055054,
-0.06486289948225021,
0.0399300716817379,
-0.0458081029355526,
-0.010005413554608822,
-0.030873959884047508,
0.21687795221805573,
-0.11664500087499619,
-0.0891069695353508,
-0.13017557561397552,
-0.06248759850859642,
0.08698643743991852,
-0.0649939700961113,
0.018277470022439957,
0.062054555863142014,
0.0694732666015625,
-0.029633572325110435,
-0.07596063613891602,
-0.028879383578896523,
-0.0946618989109993,
-0.1133246049284935,
0.0012906211195513606,
0.10375244915485382,
0.15356694161891937,
0.04306930676102638,
0.05512082949280739,
0.009151960723102093,
0.03084847889840603,
-0.15702906250953674,
-0.007708631455898285,
0.11980409175157547,
0.03844694793224335,
0.10879691690206528,
-0.0721302479505539,
-0.16710937023162842,
-0.05858872830867767,
0.0006117767770774662,
0.017357733100652695,
0.2250506579875946,
-0.028690291568636894,
0.1248641163110733,
0.14663614332675934,
-0.06378048658370972,
-0.271306574344635,
-0.006802396848797798,
0.05771733820438385,
0.05415527895092964,
0.05099794641137123,
-0.12642794847488403,
0.13936357200145721,
0.030202599242329597,
-0.04340988025069237,
0.042149174958467484,
-0.21745599806308746,
-0.1239783838391304,
0.08824041485786438,
-0.035946864634752274,
-0.03215858340263367,
-0.057280533015728,
-0.07671456038951874,
-0.09041006118059158,
-0.09674379229545593,
0.030544590204954147,
-0.141323521733284,
0.047631580382585526,
0.04069818556308746,
-0.0027509459760040045,
0.01834777183830738,
-0.007267098408192396,
0.1627098172903061,
0.06801854074001312,
0.03585507720708847,
-0.0817878395318985,
0.006121316459029913,
0.13631464540958405,
-0.032401103526353836,
0.16949063539505005,
-0.09255639463663101,
0.015979010611772537,
-0.1826421618461609,
-0.019989939406514168,
-0.037390295416116714,
0.0742364227771759,
-0.050120510160923004,
-0.06050571799278259,
-0.09663646668195724,
0.1118137389421463,
0.051397014409303665,
-0.0011324778897687793,
0.014658712781965733,
-0.0789712518453598,
0.004518222063779831,
0.11297279596328735,
0.192876398563385,
0.0656692236661911,
-0.07329103350639343,
0.003736570943146944,
-0.02516043558716774,
0.06259146332740784,
-0.06524096429347992,
0.027029484510421753,
0.047353994101285934,
0.018195150420069695,
0.12506218254566193,
-0.024547673761844635,
-0.0971021056175232,
0.02024218812584877,
0.08269024640321732,
-0.12004753202199936,
-0.17560602724552155,
-0.10881081968545914,
-0.06451565772294998,
-0.07658515125513077,
0.048274535685777664,
0.18033744394779205,
-0.07826504856348038,
0.022374458611011505,
0.008115638047456741,
0.028163563460111618,
-0.03557386249303818,
0.07588064670562744,
0.02117413841187954,
0.045468173921108246,
-0.034354351460933685,
0.09176146239042282,
0.05246011167764664,
-0.05448361113667488,
0.05474824830889702,
0.08404858410358429,
-0.05789460614323616,
-0.04227850213646889,
-0.02986971288919449,
0.06647642701864243,
-0.04352418705821037,
-0.04612326994538307,
-0.0211650300770998,
-0.058972492814064026,
-0.002977688331156969,
0.030602270737290382,
0.08709312975406647,
0.017947984859347343,
-0.018655145540833473,
0.004336607176810503,
-0.04408976435661316,
0.05907505378127098,
0.12926195561885834,
0.0010363430483266711,
-0.03972640261054039,
0.003967112861573696,
-0.03528613597154617,
-0.03722945973277092,
-0.04338958114385605,
-0.01450431253761053,
-0.10427644848823547,
-0.040539637207984924,
-0.21834371984004974,
0.027890244498848915,
-0.03774839639663696,
-0.03359825164079666,
0.006035021971911192,
-0.01247957069426775,
0.009327230975031853,
0.019008085131645203,
-0.05361879616975784,
-0.050332870334386826,
0.015752112492918968,
0.08916798233985901,
-0.11130991578102112,
0.0765112116932869,
0.04961302876472473,
-0.06272710114717484,
0.04072955623269081,
-0.022956641390919685,
-0.02434564381837845,
0.04742048308253288,
-0.1463141143321991,
-0.027628464624285698,
-0.04985441267490387,
-0.039006639271974564,
0.022608138620853424,
-0.059673238545656204,
-0.0009934211848303676,
0.0369662269949913,
-0.02080387994647026,
0.024136221036314964,
-0.05158917233347893,
-0.05088526010513306,
0.051860831677913666,
0.055425360798835754,
-0.04071316123008728,
-0.045154038816690445,
0.06203484162688255,
0.1453646570444107,
0.006360925734043121,
0.2005193531513214,
-0.10389169305562973,
0.030115202069282532,
-0.09723532199859619,
0.015728464350104332,
0.03372542932629585,
0.021919431164860725,
-0.07692144811153412,
-0.046775758266448975,
0.006422317586839199,
-0.031123192980885506,
0.12034419924020767,
-0.025878435000777245,
0.00926631037145853,
0.06989201158285141,
0.026647699996829033,
-0.03663694113492966,
0.03958425298333168,
0.10174554586410522,
-0.0028918504249304533,
0.025635823607444763,
-0.1627679020166397,
-0.034951940178871155,
-0.05692782998085022,
-0.1706416755914688,
0.1857139617204666,
0.12546302378177643,
-0.0067346226423978806,
0.12197988480329514,
0.03315456584095955,
0.015847083181142807,
-0.10895513743162155,
0.03312075510621071,
0.062473081052303314,
0.042610153555870056,
-0.0777302160859108,
0.10011840611696243,
0.16652561724185944,
-0.1335247904062271,
0.09865640848875046,
0.01331639476120472,
-0.006415056064724922,
-0.047123461961746216,
-0.20087186992168427,
-0.03127119317650795,
-0.08307119458913803,
0.018208637833595276,
-0.072916679084301,
0.0480295829474926,
0.035018227994441986,
0.03299945220351219,
-0.06426577270030975,
0.131845623254776,
-0.1384338140487671,
-0.10538040846586227,
0.12278221547603607,
-0.019283216446638107,
0.06228847801685333,
0.005476769525557756,
0.006821765098720789,
-0.015394042246043682,
0.06570488214492798,
0.03487752005457878,
0.0752677470445633,
0.033362094312906265,
-0.0363585501909256,
-0.0726534053683281,
-0.06681960076093674,
0.024799488484859467,
0.05488422140479088,
0.06171860545873642,
0.1643799990415573,
0.03330424427986145,
-0.04946374520659447,
-0.037774525582790375,
0.14671070873737335,
-0.08075626194477081,
-0.17003905773162842,
-0.13986161351203918,
0.1455419659614563,
0.037243034690618515,
0.0540141686797142,
0.004997284617275,
-0.17698688805103302,
0.07304278016090393,
0.14882151782512665,
0.16196776926517487,
0.009409650228917599,
0.015297402627766132,
-0.06353918462991714,
0.0037384831812232733,
0.027354586869478226,
0.06963057070970535,
-0.10148237645626068,
0.3347306251525879,
-0.021644584834575653,
0.0717465877532959,
-0.015195018611848354,
-0.08842877298593521,
-0.05762390419840813,
0.12852360308170319,
-0.013672384433448315,
-0.03747742995619774,
-0.07592891901731491,
0.17393790185451508,
-0.05357467755675316,
-0.1539357453584671,
-0.01040808018296957,
0.009792015887796879,
-0.08633266389369965,
-0.03454066440463066,
0.04958231747150421,
0.03425215929746628,
0.07734034955501556,
0.027289722114801407,
-0.0032640129793435335,
0.211160346865654,
-0.007620313670486212,
0.010105500929057598,
-0.04619628190994263,
0.06905406713485718,
-0.06774557381868362,
0.20168088376522064,
0.041438546031713486,
0.04901232197880745,
0.10246549546718597,
-0.015389414504170418,
-0.08176122605800629,
0.037130605429410934,
0.008859517052769661,
-0.12651647627353668,
0.06280159205198288,
0.16732200980186462,
-0.0311907809227705,
0.058418579399585724,
0.06639640778303146,
-0.09350089728832245,
0.06632804125547409,
0.0081194331869483,
-0.07557743042707443,
-0.13429465889930725,
0.16430160403251648,
-0.1083141565322876,
0.15648208558559418,
0.20128095149993896,
-0.024665096774697304,
0.044622424989938736,
-0.026540828868746758,
0.09784706681966782,
0.01976168341934681,
0.021329201757907867,
0.019400177523493767,
-0.13970628380775452,
0.010578556917607784,
0.07237444818019867,
0.03524329513311386,
-0.2117597758769989,
-0.05101301148533821,
0.007448370568454266,
0.03854105621576309,
-0.03750404343008995,
0.083518847823143,
0.13796567916870117,
0.004855806473642588,
-0.00019141138182021677,
-0.11835647374391556,
-0.023476647213101387,
0.01825857348740101,
-0.026056354865431786,
-0.011555799283087254
] |
null | null | transformers |
# stablelm-2-1.6-disticoder-v0.1
This model is a fine-tuned version of [stabilityai/stablelm-2-1_6b](https://huggingface.co/stabilityai/stablelm-2-1_6b) on the argilla/DistiCoder-dpo-binarized dataset.
It achieves the following results on the evaluation set:
- Loss: 1.1315
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- distributed_type: multi-GPU
- num_devices: 4
- gradient_accumulation_steps: 4
- total_train_batch_size: 128
- total_eval_batch_size: 32
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: cosine
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 3
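For orientation, here is a sketch of how these values might map onto `transformers.TrainingArguments`. It is reconstructed from the reported numbers, not taken from the original training script; the output directory name and the precision flag are assumptions.

```
# Sketch only: the hyperparameters above mapped onto TrainingArguments.
# With 4 GPUs, a per-device batch size of 8 and gradient accumulation
# of 4, the effective train batch size is 8 * 4 * 4 = 128, as reported.
# The Adam betas (0.9, 0.999) and epsilon (1e-08) are the defaults.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="stablelm-2-1_6-sft-disticoder-v01",  # assumed name
    learning_rate=5e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    gradient_accumulation_steps=4,
    num_train_epochs=3,
    lr_scheduler_type="cosine",
    warmup_ratio=0.1,
    seed=42,
    bf16=True,  # assumption: the card does not state the precision regime
)
```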
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:----:|:---------------:|
| 1.7319 | 0.44 | 5 | 1.5441 |
| 1.3425 | 0.89 | 10 | 1.2968 |
| 1.1709 | 1.33 | 15 | 1.2151 |
| 1.0994 | 1.78 | 20 | 1.1605 |
| 1.0287 | 2.22 | 25 | 1.1382 |
| 1.0303 | 2.67 | 30 | 1.1315 |
### Framework versions
- Transformers 4.37.2
- Pytorch 2.1.1+cu121
- Datasets 2.16.1
- Tokenizers 0.15.2
| {"license": "other", "tags": ["choo-choo", "trl", "sft", "generated_from_trainer", "trl", "sft", "generated_from_trainer"], "datasets": ["generator"], "base_model": "stabilityai/stablelm-2-1_6b", "model-index": [{"name": "stablelm-2-1.6-disticoder-v0.1", "results": []}]} | text-generation | plaguss/stablelm-2-1_6-sft-disticoder-v01 | [
"transformers",
"safetensors",
"stablelm_epoch",
"text-generation",
"choo-choo",
"trl",
"sft",
"generated_from_trainer",
"conversational",
"custom_code",
"dataset:generator",
"base_model:stabilityai/stablelm-2-1_6b",
"license:other",
"autotrain_compatible",
"region:us"
] | 2024-02-13T09:28:10+00:00 | [] | [] | TAGS
#transformers #safetensors #stablelm_epoch #text-generation #choo-choo #trl #sft #generated_from_trainer #conversational #custom_code #dataset-generator #base_model-stabilityai/stablelm-2-1_6b #license-other #autotrain_compatible #region-us
| stablelm-2-1.6-disticoder-v0.1
==============================
This model is a fine-tuned version of stabilityai/stablelm-2-1\_6b on the argilla/DistiCoder-dpo-binarized dataset.
It achieves the following results on the evaluation set:
* Loss: 1.1315
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 5e-05
* train\_batch\_size: 8
* eval\_batch\_size: 8
* seed: 42
* distributed\_type: multi-GPU
* num\_devices: 4
* gradient\_accumulation\_steps: 4
* total\_train\_batch\_size: 128
* total\_eval\_batch\_size: 32
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: cosine
* lr\_scheduler\_warmup\_ratio: 0.1
* num\_epochs: 3
### Training results
### Framework versions
* Transformers 4.37.2
* Pytorch 2.1.1+cu121
* Datasets 2.16.1
* Tokenizers 0.15.2
| [
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 5e-05\n* train\\_batch\\_size: 8\n* eval\\_batch\\_size: 8\n* seed: 42\n* distributed\\_type: multi-GPU\n* num\\_devices: 4\n* gradient\\_accumulation\\_steps: 4\n* total\\_train\\_batch\\_size: 128\n* total\\_eval\\_batch\\_size: 32\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: cosine\n* lr\\_scheduler\\_warmup\\_ratio: 0.1\n* num\\_epochs: 3",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.37.2\n* Pytorch 2.1.1+cu121\n* Datasets 2.16.1\n* Tokenizers 0.15.2"
] | [
"TAGS\n#transformers #safetensors #stablelm_epoch #text-generation #choo-choo #trl #sft #generated_from_trainer #conversational #custom_code #dataset-generator #base_model-stabilityai/stablelm-2-1_6b #license-other #autotrain_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 5e-05\n* train\\_batch\\_size: 8\n* eval\\_batch\\_size: 8\n* seed: 42\n* distributed\\_type: multi-GPU\n* num\\_devices: 4\n* gradient\\_accumulation\\_steps: 4\n* total\\_train\\_batch\\_size: 128\n* total\\_eval\\_batch\\_size: 32\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: cosine\n* lr\\_scheduler\\_warmup\\_ratio: 0.1\n* num\\_epochs: 3",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.37.2\n* Pytorch 2.1.1+cu121\n* Datasets 2.16.1\n* Tokenizers 0.15.2"
] | [
90,
179,
4,
33
] | [
"passage: TAGS\n#transformers #safetensors #stablelm_epoch #text-generation #choo-choo #trl #sft #generated_from_trainer #conversational #custom_code #dataset-generator #base_model-stabilityai/stablelm-2-1_6b #license-other #autotrain_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 5e-05\n* train\\_batch\\_size: 8\n* eval\\_batch\\_size: 8\n* seed: 42\n* distributed\\_type: multi-GPU\n* num\\_devices: 4\n* gradient\\_accumulation\\_steps: 4\n* total\\_train\\_batch\\_size: 128\n* total\\_eval\\_batch\\_size: 32\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: cosine\n* lr\\_scheduler\\_warmup\\_ratio: 0.1\n* num\\_epochs: 3### Training results### Framework versions\n\n\n* Transformers 4.37.2\n* Pytorch 2.1.1+cu121\n* Datasets 2.16.1\n* Tokenizers 0.15.2"
] | [
-0.1185331717133522,
0.15032316744327545,
-0.00445615453645587,
0.07589433342218399,
0.09429056942462921,
0.05068193003535271,
0.1348089724779129,
0.13301333785057068,
-0.07700782269239426,
0.1444130837917328,
0.10084797441959381,
0.054630935192108154,
0.07537854462862015,
0.17750276625156403,
-0.03890414163470268,
-0.2674148976802826,
0.033818598836660385,
-0.026820795610547066,
-0.1462756097316742,
0.09957308322191238,
0.0875704362988472,
-0.1020454615354538,
0.06654615700244904,
-0.0049669332802295685,
-0.0996883288025856,
-0.03446127474308014,
-0.05059215798974037,
-0.029009776189923286,
0.08516298979520798,
0.04778236523270607,
0.08866014331579208,
0.03881823271512985,
0.09467808902263641,
-0.23751692473888397,
-0.0032109383028000593,
0.07422196120023727,
0.005096470471471548,
0.0772501602768898,
0.09688372910022736,
0.0004756799025926739,
0.09549957513809204,
-0.0721827819943428,
0.04912407696247101,
0.02492130920290947,
-0.125730499625206,
-0.2405063807964325,
-0.08891504257917404,
0.07069449126720428,
0.12332478165626526,
0.03998405858874321,
-0.021941613405942917,
0.08309654146432877,
-0.04500393569469452,
0.0746951624751091,
0.22553297877311707,
-0.2529822885990143,
-0.08317306637763977,
0.05649612471461296,
0.03851044550538063,
0.056678202003240585,
-0.09522081166505814,
-0.03418900445103645,
0.0489320307970047,
0.009000815451145172,
0.07162333279848099,
0.025439202785491943,
0.03287206217646599,
-0.0020707689691334963,
-0.1452665776014328,
-0.07834314554929733,
0.16469219326972961,
0.08322805911302567,
-0.021149663254618645,
-0.08196447789669037,
-0.06054705008864403,
-0.1866457313299179,
-0.023461518809199333,
-0.013438528403639793,
0.02143789455294609,
-0.037398990243673325,
-0.06534536927938461,
0.05384671688079834,
-0.06704676151275635,
-0.08914485573768616,
0.04592074453830719,
0.1436101347208023,
0.06511992961168289,
-0.00984821654856205,
0.009823312982916832,
0.12203940749168396,
0.034630272537469864,
-0.18431171774864197,
-0.022716166451573372,
0.016321765258908272,
-0.07628730684518814,
-0.031781114637851715,
-0.016556059941649437,
0.07642178237438202,
0.05101167410612106,
0.19432878494262695,
-0.06756051629781723,
0.063649982213974,
0.0420963317155838,
0.0011418589856475592,
-0.06296055018901825,
0.13118170201778412,
-0.07117228209972382,
-0.054266657680273056,
-0.04894383251667023,
0.127227783203125,
0.003790228394791484,
-0.004017490427941084,
-0.052088018506765366,
0.026432592421770096,
0.10889558494091034,
0.0468859039247036,
-0.011438129469752312,
0.04589928314089775,
-0.07760463654994965,
-0.024919798597693443,
0.086207315325737,
-0.10740730911493301,
0.02725314535200596,
0.04937976971268654,
-0.07163937389850616,
-0.04144040122628212,
-0.006549521815031767,
0.004747830331325531,
0.00934427510946989,
0.07245544344186783,
-0.07935941219329834,
-0.053262047469615936,
-0.0743965432047844,
-0.07931730151176453,
0.03378282114863396,
-0.031043093651533127,
0.005264895502477884,
-0.06051713228225708,
-0.1357220858335495,
-0.04251532629132271,
0.05016941577196121,
-0.07603184133768082,
-0.06995192915201187,
-0.05344439670443535,
-0.10559540241956711,
0.04071308299899101,
-0.009358986280858517,
0.11544346809387207,
-0.05553695186972618,
0.07807408273220062,
0.04756636545062065,
0.06742465496063232,
0.10310117900371552,
0.029385730624198914,
-0.04677582532167435,
0.0894021987915039,
-0.14780132472515106,
0.05800072103738785,
-0.097971610724926,
0.0353638119995594,
-0.11973775178194046,
-0.0996050089597702,
-0.016388995572924614,
-0.010346467606723309,
0.07596345990896225,
0.12775811553001404,
-0.1427871435880661,
-0.06403318792581558,
0.20776869356632233,
-0.10570860654115677,
-0.14582446217536926,
0.11273571103811264,
-0.01090178918093443,
-0.0349712111055851,
0.007210302632302046,
0.10117996484041214,
0.13798260688781738,
-0.0937347412109375,
-0.040402475744485855,
-0.016882315278053284,
0.11839161813259125,
0.06445873528718948,
0.11666903644800186,
-0.011176465079188347,
0.0330786295235157,
0.010148701258003712,
-0.05835341289639473,
0.010153913870453835,
-0.10533886402845383,
-0.08609676361083984,
-0.028695816174149513,
-0.08073142915964127,
0.00968993827700615,
0.04489782825112343,
0.025856979191303253,
-0.08208729326725006,
-0.12884290516376495,
-0.006238920148462057,
0.1285019963979721,
-0.09378346800804138,
-0.007715631742030382,
-0.06487011909484863,
0.07220793515443802,
-0.017978988587856293,
-0.00634870957583189,
-0.13483378291130066,
-0.11391016840934753,
0.05984564125537872,
-0.056982818990945816,
-0.03953595831990242,
0.008614019490778446,
0.0761091411113739,
0.09666183590888977,
-0.04971729964017868,
-0.07395470887422562,
-0.045389752835035324,
-0.009003954939544201,
-0.06102251634001732,
-0.2153330147266388,
-0.08571623265743256,
-0.031696733087301254,
0.1706300526857376,
-0.2305278331041336,
0.030798453837633133,
0.05303702503442764,
0.1603703647851944,
0.023252014070749283,
-0.045472968369722366,
-0.015200918540358543,
0.026723865419626236,
-0.05495584011077881,
-0.08688453584909439,
0.005949617363512516,
-0.009336207993328571,
-0.08796609938144684,
-0.028632499277591705,
-0.14925415813922882,
0.12276733666658401,
0.08320832252502441,
0.05426851660013199,
-0.09432805329561234,
-0.03987634554505348,
-0.07603355497121811,
-0.05625991150736809,
-0.00850620772689581,
-0.035147156566381454,
0.09393332898616791,
0.001680854125879705,
0.0997910276055336,
-0.08645965903997421,
-0.06523770838975906,
0.02688516303896904,
-0.004848710726946592,
-0.03291096165776253,
0.13698653876781464,
0.04773006960749626,
-0.08290643244981766,
0.13158465921878815,
0.09275900572538376,
-0.037548553198575974,
0.11973147839307785,
-0.08452847599983215,
-0.08703101426362991,
-0.05663640797138214,
0.06112772598862648,
0.04637297987937927,
0.09889237582683563,
-0.08083289116621017,
0.02376052550971508,
0.023334583267569542,
0.03021552599966526,
0.020080098882317543,
-0.1704399287700653,
-0.007606410421431065,
0.0417170487344265,
-0.08263273537158966,
0.02725553885102272,
-0.026299767196178436,
-0.017767855897545815,
0.08632544428110123,
-0.00002645091626618523,
-0.05340418964624405,
-0.006039679981768131,
-0.02297491580247879,
-0.08157099783420563,
0.21373452246189117,
-0.08619074523448944,
-0.10707666724920273,
-0.14694562554359436,
0.037152983248233795,
-0.02644890360534191,
0.007241979707032442,
0.024039922282099724,
-0.06959795951843262,
-0.04846048355102539,
-0.08884141594171524,
0.026812128722667694,
-0.009111540392041206,
0.02755429595708847,
0.0665723904967308,
0.01383287739008665,
0.07569360733032227,
-0.10710015147924423,
0.01987585797905922,
0.0058846240863204,
-0.07055360823869705,
0.031212057918310165,
-0.0015866645844653249,
0.09656984359025955,
0.13357791304588318,
0.03958463668823242,
0.024803772568702698,
-0.009132497943937778,
0.2027266025543213,
-0.08363986015319824,
0.0007811654359102249,
0.10149131715297699,
-0.007560576777905226,
0.059510838240385056,
0.1455351561307907,
0.0365467369556427,
-0.08520293235778809,
0.019080283120274544,
0.03144257888197899,
-0.016664301976561546,
-0.21917986869812012,
-0.018888862803578377,
-0.03389206528663635,
0.03884072229266167,
0.11730007082223892,
0.03148519992828369,
-0.015009178780019283,
0.05926637351512909,
-0.05021687224507332,
0.016601959243416786,
0.02739587612450123,
0.0662749782204628,
0.02006194181740284,
0.05289691314101219,
0.10470646619796753,
-0.01642410084605217,
-0.03806013613939285,
0.040572475641965866,
0.017968760803341866,
0.2216818928718567,
-0.026969941332936287,
0.1960504949092865,
0.04695999249815941,
0.1468796730041504,
-0.010724110528826714,
0.07317188382148743,
0.0001479526690673083,
-0.0007975357002578676,
0.005011213943362236,
-0.058184485882520676,
-0.0077356318943202496,
0.05066563934087753,
0.007944703102111816,
0.0426018126308918,
-0.10370982438325882,
0.0433611199259758,
0.046785250306129456,
0.26175448298454285,
0.0930614247918129,
-0.3272106349468231,
-0.08764118701219559,
0.03509122133255005,
-0.03132351115345955,
-0.031364139169454575,
0.013979388400912285,
0.14970366656780243,
-0.07836055010557175,
0.07430452853441238,
-0.05355781689286232,
0.07230940461158752,
-0.04649779573082924,
-0.0027036918327212334,
0.10300374776124954,
0.0847286656498909,
0.008484205231070518,
0.07286608964204788,
-0.19689509272575378,
0.26892316341400146,
-0.00791962444782257,
0.04622010886669159,
-0.04352790117263794,
0.058891307562589645,
-0.0008674877462908626,
-0.007740136235952377,
0.06917424499988556,
-0.01421208307147026,
-0.11789900064468384,
-0.17946307361125946,
-0.11801332980394363,
0.02732478268444538,
0.1362232267856598,
-0.12746858596801758,
0.13949714601039886,
-0.01899908110499382,
-0.02865809202194214,
0.05991357937455177,
-0.05572100356221199,
-0.09450041502714157,
-0.0966915413737297,
0.040856774896383286,
-0.04151025414466858,
0.03869905695319176,
-0.10041245073080063,
-0.09791439771652222,
-0.10465915501117706,
0.15628361701965332,
-0.08887293189764023,
-0.03232595697045326,
-0.1257813423871994,
0.06920908391475677,
0.17671418190002441,
-0.0889565497636795,
0.0397845134139061,
0.009364979341626167,
0.13167804479599,
0.03673754632472992,
-0.02563227340579033,
0.09748321026563644,
-0.08521029353141785,
-0.26352155208587646,
-0.05149615556001663,
0.14880487322807312,
0.03310108557343483,
0.04172881692647934,
-0.023165715858340263,
0.0304409246891737,
0.0008308175601996481,
-0.09701935946941376,
0.04523879662156105,
0.017971862107515335,
0.07404189556837082,
0.05036400631070137,
-0.03920508921146393,
0.005037694238126278,
-0.03205438330769539,
-0.028801582753658295,
0.061859507113695145,
0.33560338616371155,
-0.08631392568349838,
-0.012593215331435204,
0.03430413827300072,
-0.05248230695724487,
-0.14801916480064392,
-0.028626281768083572,
0.12228897958993912,
0.021459463983774185,
0.015199205838143826,
-0.1809089034795761,
0.058399319648742676,
0.10723916441202164,
-0.030078645795583725,
0.07965550571680069,
-0.30277955532073975,
-0.13242964446544647,
0.08468075841665268,
0.08798953145742416,
-0.04270484670996666,
-0.189968079328537,
-0.07359705865383148,
0.002384216757491231,
-0.158543199300766,
0.09329277276992798,
-0.04259946197271347,
0.09198843687772751,
-0.01553032174706459,
-0.012615730054676533,
0.017588188871741295,
-0.059030067175626755,
0.18743687868118286,
0.037446029484272,
0.06864938884973526,
-0.0276049692183733,
0.006490182597190142,
0.06310214847326279,
-0.07438457012176514,
0.023635635152459145,
-0.08084504306316376,
0.055248551070690155,
-0.11552800983190536,
-0.024184811860322952,
-0.06028163060545921,
0.017282987013459206,
-0.06645813584327698,
-0.029601218178868294,
-0.03986344859004021,
0.05017000064253807,
0.07064272463321686,
-0.0079400185495615,
0.1108107641339302,
0.029875895008444786,
0.15413030982017517,
0.1580229103565216,
0.06184384599328041,
0.04791238158941269,
-0.06221603602170944,
-0.008738613687455654,
0.00384361669421196,
0.035693489015102386,
-0.11073999106884003,
0.03535929694771767,
0.13947470486164093,
0.02647409774363041,
0.1319352388381958,
0.057716839015483856,
-0.07545655220746994,
-0.004311462864279747,
0.07439247518777847,
-0.12102554738521576,
-0.1308220624923706,
-0.016752155497670174,
-0.018641838803887367,
-0.1682293862104416,
0.03846965357661247,
0.10889625549316406,
-0.033770594745874405,
0.0007328175124712288,
-0.011418602429330349,
0.05513133108615875,
-0.015301715582609177,
0.2192644625902176,
0.04030805826187134,
0.10358764231204987,
-0.07840991020202637,
0.07965141534805298,
0.05194474011659622,
-0.07345227152109146,
0.014675824902951717,
0.052354391664266586,
-0.06562432646751404,
-0.016881762072443962,
0.05575789511203766,
0.0897594541311264,
0.015121394768357277,
-0.04476780816912651,
-0.13269342482089996,
-0.12432507425546646,
0.07219424098730087,
0.09627509862184525,
0.05380436033010483,
0.07194206118583679,
0.02306784875690937,
0.044256746768951416,
-0.08899489045143127,
0.13481177389621735,
0.10298673063516617,
0.09901832789182663,
-0.14901961386203766,
0.11691659688949585,
-0.01355204451829195,
-0.005025937687605619,
-0.00338856247253716,
0.033030811697244644,
-0.1416918933391571,
-0.025131165981292725,
-0.09501250088214874,
0.007547945249825716,
-0.056670356541872025,
-0.0026901632081717253,
0.01562550850212574,
-0.05847495049238205,
-0.04650469869375229,
0.011276140809059143,
-0.09124249964952469,
-0.0441267229616642,
-0.029594892635941505,
0.07528269290924072,
-0.12551338970661163,
-0.013430041261017323,
0.05206716060638428,
-0.136133074760437,
0.1020686998963356,
0.03242875263094902,
0.05445658788084984,
0.010605773888528347,
-0.08014878630638123,
0.04170325770974159,
0.030533136799931526,
0.01564491167664528,
0.027356110513210297,
-0.15590107440948486,
0.008927141316235065,
-0.038688093423843384,
-0.0025185649283230305,
-0.004314246587455273,
0.02621925063431263,
-0.12342538684606552,
0.019284501671791077,
-0.03942316398024559,
-0.049464620649814606,
-0.050490014255046844,
0.0351535826921463,
0.07605142891407013,
-0.0205742996186018,
0.15278245508670807,
-0.05934227257966995,
0.040980033576488495,
-0.2574860155582428,
-0.014951402321457863,
0.013237744569778442,
-0.06961803138256073,
-0.07456609606742859,
-0.003644767450168729,
0.08496589958667755,
-0.059391047805547714,
0.1048695296049118,
-0.057458773255348206,
0.0135544054210186,
0.009700620546936989,
-0.08552809059619904,
0.0775679424405098,
0.06957992911338806,
0.16261059045791626,
0.04583070054650307,
-0.01765543781220913,
0.02949102781713009,
-0.019320523366332054,
0.06292398273944855,
-0.019701911136507988,
0.16556452214717865,
0.11594170331954956,
-0.04616089165210724,
0.0727134495973587,
0.08470166474580765,
-0.15032729506492615,
-0.1357220709323883,
0.08369442820549011,
-0.08661910146474838,
0.10219213366508484,
-0.01242574118077755,
0.13383305072784424,
0.09208185970783234,
-0.21725820004940033,
0.015997467562556267,
-0.04733447730541229,
-0.08659043908119202,
-0.1051807776093483,
-0.07031645625829697,
-0.09741708636283875,
-0.1651747077703476,
0.016383105888962746,
-0.13696405291557312,
0.021150749176740646,
0.11273790150880814,
0.02387249283492565,
0.033268123865127563,
0.13820943236351013,
0.06629978865385056,
0.03798874467611313,
0.034029725939035416,
0.03764960914850235,
-0.0035635249223560095,
-0.02764655463397503,
-0.10439702868461609,
0.01944565959274769,
0.0016441169427707791,
0.05170775577425957,
-0.05165758356451988,
-0.03875957056879997,
0.059076085686683655,
0.02551202103495598,
-0.09408997744321823,
0.016745613887906075,
-0.0216749869287014,
0.025646517053246498,
0.026173576712608337,
0.020352328196167946,
0.00838801171630621,
-0.018474280834197998,
0.16986879706382751,
-0.07922189682722092,
-0.05049962177872658,
-0.14093583822250366,
0.2123441845178604,
-0.015246380120515823,
0.015952132642269135,
0.05870935693383217,
-0.07845646888017654,
-0.017268892377614975,
0.1115526333451271,
0.1467963308095932,
-0.027121039107441902,
-0.01634969376027584,
0.03320461884140968,
-0.012959049083292484,
-0.002940251026302576,
0.0816177949309349,
0.10942169278860092,
0.06857002526521683,
-0.05171378701925278,
-0.04409416764974594,
-0.00714357104152441,
-0.03467334061861038,
-0.03204498067498207,
0.0645001009106636,
0.006351863965392113,
0.014044343493878841,
-0.023331761360168457,
0.07930263131856918,
-0.04413623362779617,
-0.11719474196434021,
0.06320776045322418,
-0.19335772097110748,
-0.18562807142734528,
-0.04431585222482681,
0.069793701171875,
-0.0004948531277477741,
0.06932482123374939,
-0.0056912777945399284,
-0.05381440371274948,
0.10888935625553131,
-0.00020214257529005408,
-0.056536439806222916,
-0.0843575969338417,
0.05458132550120354,
-0.10221856832504272,
0.22140027582645416,
-0.037893496453762054,
0.051358550786972046,
0.1269640177488327,
-0.009585877880454063,
-0.10764483362436295,
0.02122916840016842,
0.09976588189601898,
-0.09062755107879639,
0.027252724394202232,
0.13114990293979645,
-0.04844697192311287,
0.12377040088176727,
0.06389212608337402,
-0.11695989221334457,
-0.016388429328799248,
-0.0007913523004390299,
-0.024235567077994347,
-0.08130580931901932,
-0.020737074315547943,
-0.04342224821448326,
0.1523861438035965,
0.22192521393299103,
-0.0732802078127861,
-0.010139748454093933,
-0.016316749155521393,
0.062456414103507996,
0.036253683269023895,
0.1420353204011917,
0.0007606249419040978,
-0.28510114550590515,
0.021895788609981537,
0.010552539490163326,
0.03196893259882927,
-0.19364184141159058,
-0.09066344797611237,
0.035199109464883804,
-0.039145004004240036,
-0.07577573508024216,
0.11324154585599899,
0.0784635990858078,
0.05422507971525192,
-0.06031776964664459,
-0.052672095596790314,
-0.07820677757263184,
0.16319550573825836,
-0.1600501388311386,
-0.08407843858003616
] |
null | null | transformers |
# Model Card for Mistral-chem-v0.1
The Mistral-chem-v0.1 Large Language Model (LLM) is a pretrained generative chemical molecule model with 14.55M parameters x 64 experts = 931.5M parameters.
It is derived from the Mistral-7B-v0.1 model and simplified in the same way as our DNA models: the number of layers and the hidden size were reduced.
The model was pretrained using 250k molecule SMILES strings from the ZINC database.
*This version v0.1 of Mistral-chem corresponds to a simple model, which was primarily designed for low computational resources (the aim was not to get the best accuracy results). Moreover, the model was trained on only 250k molecules.*
For full details of this model please read our [github repo](https://github.com/raphaelmourad/Mistral-chem).
## Model Architecture
Like Mistral-7B-v0.1, it is a transformer model, with the following architecture choices:
- Grouped-Query Attention
- Sliding-Window Attention
- Byte-fallback BPE tokenizer
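To illustrate the scale of this simplification, a hypothetical Mixtral-style configuration with the dimensions implied by this card is sketched below. Only the hidden size (256, visible in the embedding shape further down) and the expert count (64) are stated; everything else is an assumption, not the actual checkpoint config.
```
# Illustrative only: most values below are assumptions.
from transformers import MixtralConfig

config = MixtralConfig(
    hidden_size=256,         # matches the 256-dim embeddings shown below
    num_local_experts=64,    # "x 64 experts" from this card
    num_hidden_layers=8,     # assumption
    num_attention_heads=8,   # assumption
    num_key_value_heads=4,   # grouped-query attention (assumption)
    sliding_window=256,      # sliding-window attention (assumption)
)
```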
## Load the model from huggingface:
```
import torch
from transformers import AutoTokenizer, AutoModel
tokenizer = AutoTokenizer.from_pretrained("RaphaelMourad/Mistral-chem-v0.1", trust_remote_code=True)
model = AutoModel.from_pretrained("RaphaelMourad/Mistral-chem-v0.1", trust_remote_code=True)
```
## Calculate the embedding of a molecule
```
dna = "CCCCC[C@H](Br)CC"
inputs = tokenizer(dna, return_tensors = 'pt')["input_ids"]
hidden_states = model(inputs)[0] # [1, sequence_length, 256]
# embedding with max pooling
embedding_max = torch.max(hidden_states[0], dim=0)[0]
print(embedding_max.shape) # expect to be 256
```
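As a usage sketch (assuming `tokenizer` and `model` from the snippets above are in scope), pooled embeddings can be compared between molecules, for example with cosine similarity:
```
# Sketch: compare two molecules via cosine similarity of their pooled
# embeddings. Assumes `tokenizer` and `model` were loaded as above.
import torch.nn.functional as F

def embed(smiles):
    ids = tokenizer(smiles, return_tensors="pt")["input_ids"]
    hidden = model(ids)[0]                 # [1, seq_len, 256]
    return torch.max(hidden[0], dim=0)[0]  # max pooling -> [256]

similarity = F.cosine_similarity(embed("CCO"), embed("CCN"), dim=0)
print(float(similarity))  # values near 1.0 indicate similar embeddings
```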
## Troubleshooting
Ensure you are utilizing a stable version of Transformers, 4.34.0 or newer.
## Notice
Mistral-chem is a pretrained base model for chemistry.
## Contact
Raphaël Mourad. [email protected]
| {"license": "apache-2.0", "tags": ["pretrained"]} | text-generation | RaphaelMourad/mixtral-chem-v0.1 | [
"transformers",
"safetensors",
"mixtral",
"text-generation",
"pretrained",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | 2024-02-13T09:33:54+00:00 | [] | [] | TAGS
#transformers #safetensors #mixtral #text-generation #pretrained #license-apache-2.0 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
# Model Card for Mistral-chem-v0.1
The Mistral-chem-v0.1 Large Language Model (LLM) is a pretrained generative chemical molecule model with 14.55M parameters x 64 experts = 931.5M parameters.
It is derived from the Mistral-7B-v0.1 model and simplified in the same way as our DNA models: the number of layers and the hidden size were reduced.
The model was pretrained using 250k molecule SMILES strings from the ZINC database.
*This version v0.1 of Mistral-chem corresponds to a simple model, which was primarily designed for low computational resources (the aim was not to get the best accuracy results). Moreover, the model was trained on only 250k molecules.*
For full details of this model please read our github repo.
## Model Architecture
Like Mistral-7B-v0.1, it is a transformer model, with the following architecture choices:
- Grouped-Query Attention
- Sliding-Window Attention
- Byte-fallback BPE tokenizer
## Load the model from huggingface:
## Calculate the embedding of a molecule
## Troubleshooting
Ensure you are utilizing a stable version of Transformers, 4.34.0 or newer.
## Notice
Mistral-chem is a pretrained base model for chemistry.
## Contact
Raphaël Mourad. URL@URL
| [
"# Model Card for Mistral-chem-v0.1\n\nThe Mistral-chem-v0.1 Large Language Model (LLM) is a pretrained generative chemical molecule model with 14.55M parameters x 64 experts = 931.5M parameters. \nIt is derived from Mistral-7B-v0.1 model, which was simplified for DNA: the number of layers and the hidden size were reduced. \nThe model was pretrained using 250k molecule SMILES strings from the Zinc database. \n\n*This version v0.1 of Mistral-chem corresponds to a simple model, which was primarly designed for low computational resources (the aim was not to get the best accuracy results). Moreover, the model was trained on only 250k molecules.*\n\nFor full details of this model please read our github repo.",
"## Model Architecture\n\nLike Mistral-7B-v0.1, it is a transformer model, with the following architecture choices:\n- Grouped-Query Attention\n- Sliding-Window Attention\n- Byte-fallback BPE tokenizer",
"## Load the model from huggingface:",
"## Calculate the embedding of a DNA sequence",
"## Troubleshooting\n\nEnsure you are utilizing a stable version of Transformers, 4.34.0 or newer.",
"## Notice\n\nMistral-chem is a pretrained base model for chemistry.",
"## Contact\n \nRaphaël Mourad. URL@URL"
] | [
"TAGS\n#transformers #safetensors #mixtral #text-generation #pretrained #license-apache-2.0 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n",
"# Model Card for Mistral-chem-v0.1\n\nThe Mistral-chem-v0.1 Large Language Model (LLM) is a pretrained generative chemical molecule model with 14.55M parameters x 64 experts = 931.5M parameters. \nIt is derived from Mistral-7B-v0.1 model, which was simplified for DNA: the number of layers and the hidden size were reduced. \nThe model was pretrained using 250k molecule SMILES strings from the Zinc database. \n\n*This version v0.1 of Mistral-chem corresponds to a simple model, which was primarly designed for low computational resources (the aim was not to get the best accuracy results). Moreover, the model was trained on only 250k molecules.*\n\nFor full details of this model please read our github repo.",
"## Model Architecture\n\nLike Mistral-7B-v0.1, it is a transformer model, with the following architecture choices:\n- Grouped-Query Attention\n- Sliding-Window Attention\n- Byte-fallback BPE tokenizer",
"## Load the model from huggingface:",
"## Calculate the embedding of a DNA sequence",
"## Troubleshooting\n\nEnsure you are utilizing a stable version of Transformers, 4.34.0 or newer.",
"## Notice\n\nMistral-chem is a pretrained base model for chemistry.",
"## Contact\n \nRaphaël Mourad. URL@URL"
] | [
59,
181,
53,
10,
13,
25,
19,
11
] | [
"passage: TAGS\n#transformers #safetensors #mixtral #text-generation #pretrained #license-apache-2.0 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n# Model Card for Mistral-chem-v0.1\n\nThe Mistral-chem-v0.1 Large Language Model (LLM) is a pretrained generative chemical molecule model with 14.55M parameters x 64 experts = 931.5M parameters. \nIt is derived from Mistral-7B-v0.1 model, which was simplified for DNA: the number of layers and the hidden size were reduced. \nThe model was pretrained using 250k molecule SMILES strings from the Zinc database. \n\n*This version v0.1 of Mistral-chem corresponds to a simple model, which was primarly designed for low computational resources (the aim was not to get the best accuracy results). Moreover, the model was trained on only 250k molecules.*\n\nFor full details of this model please read our github repo.## Model Architecture\n\nLike Mistral-7B-v0.1, it is a transformer model, with the following architecture choices:\n- Grouped-Query Attention\n- Sliding-Window Attention\n- Byte-fallback BPE tokenizer## Load the model from huggingface:## Calculate the embedding of a DNA sequence## Troubleshooting\n\nEnsure you are utilizing a stable version of Transformers, 4.34.0 or newer.## Notice\n\nMistral-chem is a pretrained base model for chemistry.## Contact\n \nRaphaël Mourad. URL@URL"
] | [
-0.0780758187174797,
0.10492459684610367,
-0.002791703911498189,
0.028276639059185982,
0.05462189391255379,
-0.025497881695628166,
0.05923859030008316,
0.14061017334461212,
0.006530938670039177,
0.14173394441604614,
0.018327388912439346,
0.021209005266427994,
0.05506026744842529,
0.16781291365623474,
0.07831087708473206,
-0.21711759269237518,
0.04630988463759422,
-0.058602865785360336,
0.017507383599877357,
0.06594888865947723,
0.10447948426008224,
-0.07636270672082901,
0.07160616666078568,
-0.024443794041872025,
-0.06413013488054276,
0.020359396934509277,
-0.023930761963129044,
-0.032900501042604446,
0.08032427728176117,
0.036695696413517,
0.03611133247613907,
0.04094426706433296,
0.05366844683885574,
-0.17855671048164368,
0.0254150852560997,
0.0648857057094574,
0.045384570956230164,
0.10365590453147888,
0.04267403110861778,
0.017592202872037888,
0.09473627805709839,
-0.1367156207561493,
-0.005274757742881775,
0.05493756756186485,
-0.0497128926217556,
-0.016113141551613808,
-0.11529216915369034,
0.06856480240821838,
0.0320238396525383,
0.04937049746513367,
0.009322556667029858,
0.09606077522039413,
0.04898064211010933,
0.06037114933133125,
0.24508094787597656,
-0.24341146647930145,
-0.04686758667230606,
0.10127218067646027,
0.04129436984658241,
0.027208592742681503,
-0.08170779049396515,
0.030848445370793343,
0.0019207063596695662,
-0.006175203714519739,
0.046072542667388916,
-0.019703827798366547,
0.07643754035234451,
-0.07628155499696732,
-0.1096113920211792,
-0.021569162607192993,
0.13911660015583038,
-0.0028168053831905127,
-0.1033165380358696,
-0.12863261997699738,
-0.11412705481052399,
0.007780001033097506,
-0.012294591404497623,
-0.03987408056855202,
0.05967408046126366,
0.00837691593915224,
0.012830574065446854,
-0.060282353311777115,
-0.04004015028476715,
-0.047992952167987823,
-0.033555470407009125,
0.07776886224746704,
-0.01967141218483448,
0.014705675654113293,
-0.017716368660330772,
0.09550882875919342,
-0.09855350852012634,
-0.07470410317182541,
-0.10845978558063507,
-0.08599935472011566,
-0.02746361494064331,
-0.03212385252118111,
0.009720297530293465,
-0.023969510570168495,
0.052677325904369354,
0.19278669357299805,
0.010523896664381027,
0.04912969470024109,
0.0235847607254982,
-0.017460767179727554,
0.03531432896852493,
0.11384545266628265,
-0.05177708715200424,
-0.10392449051141739,
0.0994967445731163,
0.006879798602312803,
0.050597842782735825,
0.017351442947983742,
-0.045123059302568436,
-0.043097127228975296,
0.06650891900062561,
0.05475733429193497,
0.0357295460999012,
0.0274739321321249,
0.005636442918330431,
-0.0503028966486454,
0.20904697477817535,
-0.11547950655221939,
0.011199873872101307,
0.004962504841387272,
-0.01344023272395134,
0.08842034637928009,
0.04758818820118904,
-0.0909377783536911,
-0.04249104857444763,
-0.07420452684164047,
-0.10720068216323853,
-0.05548291280865669,
-0.08779103308916092,
-0.09298034012317657,
0.03408880904316902,
-0.030102401971817017,
-0.03517117351293564,
-0.10828238725662231,
-0.18799777328968048,
-0.020454196259379387,
0.00010645310976542532,
-0.041473548859357834,
-0.023120654746890068,
0.007208044640719891,
-0.037503186613321304,
0.026534948498010635,
-0.0023625900503247976,
0.023291025310754776,
-0.004574903286993504,
0.01726272888481617,
-0.04298835247755051,
0.08238018304109573,
0.004399148281663656,
0.008761459030210972,
-0.117129385471344,
0.08921364694833755,
-0.25496479868888855,
0.08038923889398575,
-0.026953833177685738,
0.00474828016012907,
-0.11863283812999725,
-0.061798304319381714,
-0.032746318727731705,
-0.06175579875707626,
0.06949089467525482,
0.12126932293176651,
-0.15985527634620667,
-0.02886880561709404,
0.09993257373571396,
-0.1286945790052414,
0.017102641984820366,
0.010440651327371597,
0.004446288105100393,
0.06745164096355438,
0.09740149974822998,
0.08781412988901138,
0.06671830266714096,
-0.014545156620442867,
-0.1473376452922821,
0.019325649365782738,
-0.03452793508768082,
0.07221170514822006,
0.025528529658913612,
-0.019951902329921722,
-0.056512657552957535,
0.045193545520305634,
0.0023508097510784864,
-0.05097755044698715,
-0.03769921511411667,
-0.01731489598751068,
-0.06103118509054184,
-0.013490108773112297,
0.025503510609269142,
-0.05634712800383568,
-0.005306926090270281,
-0.01607784442603588,
-0.07679414749145508,
0.09900949895381927,
0.14828239381313324,
-0.06309623271226883,
0.01679052598774433,
-0.028602171689271927,
0.09115880727767944,
-0.09335029125213623,
0.012048620730638504,
-0.16514354944229126,
-0.056455496698617935,
0.0718579813838005,
-0.16975706815719604,
-0.022163374349474907,
-0.031880468130111694,
0.038114115595817566,
0.08093372732400894,
-0.02403099089860916,
0.01891651749610901,
-0.02293955720961094,
0.018581606447696686,
-0.08298662304878235,
-0.12971168756484985,
-0.10287895053625107,
-0.058824408799409866,
0.09021814912557602,
-0.08889634162187576,
0.010453394614160061,
0.006739750970155001,
0.11836393177509308,
0.012497959658503532,
-0.05271945521235466,
0.05623918026685715,
-0.032565828412771225,
-0.011568671092391014,
-0.09514670819044113,
0.020530985668301582,
0.014668485149741173,
-0.0013747455086559057,
-0.06519483774900436,
-0.07544199377298355,
-0.15737053751945496,
0.07476188242435455,
0.11912279576063156,
-0.021678093820810318,
-0.05326968431472778,
-0.034166909754276276,
-0.029024634510278702,
-0.020476164296269417,
-0.03914715349674225,
0.037515681236982346,
0.0027039404958486557,
0.06683910638093948,
-0.09317275136709213,
-0.008483423851430416,
0.022216688841581345,
-0.005513366311788559,
-0.030974196270108223,
0.05600462853908539,
-0.08133111149072647,
-0.1637113094329834,
0.05500718951225281,
0.025547398254275322,
0.08413393795490265,
0.0750943124294281,
0.057713255286216736,
-0.08600424975156784,
-0.04944472759962082,
-0.04758916795253754,
-0.006826961878687143,
0.11793486028909683,
0.01690676249563694,
0.023859277367591858,
0.015200003050267696,
0.007937639020383358,
0.01777292788028717,
-0.05050394684076309,
0.06702080368995667,
0.014650420285761356,
-0.03859034925699234,
-0.007929380983114243,
-0.0105569027364254,
-0.04377784952521324,
0.07909475266933441,
0.040306735783815384,
0.06756310164928436,
-0.016915328800678253,
-0.03041931986808777,
-0.10128505527973175,
0.15668149292469025,
-0.10863666981458664,
-0.21866577863693237,
-0.10655562579631805,
0.033176857978105545,
-0.06088881567120552,
0.020833628252148628,
0.003173443488776684,
-0.008517992682754993,
-0.05769316107034683,
-0.09831779450178146,
-0.01671522669494152,
-0.005230254493653774,
0.029131578281521797,
-0.014513464644551277,
-0.010221255011856556,
0.08260403573513031,
-0.11240730434656143,
0.009340197779238224,
-0.00668382691219449,
-0.08227386325597763,
0.02086629718542099,
-0.023965181782841682,
0.041273485869169235,
0.08498246222734451,
-0.012477438896894455,
-0.022281093522906303,
-0.013298289850354195,
0.17832444608211517,
-0.003989880438894033,
0.10103220492601395,
0.15754856169223785,
0.0349515937268734,
0.05512739717960358,
0.07336512953042984,
0.00043014614493586123,
-0.029847174882888794,
0.02738569863140583,
0.009705839678645134,
-0.051676757633686066,
-0.1343052238225937,
-0.04122331365942955,
-0.012482683174312115,
-0.04115411639213562,
0.05617694556713104,
0.09041069447994232,
-0.007439009379595518,
0.027755549177527428,
-0.040440239012241364,
-0.011970672756433487,
0.007155281491577625,
0.10327879339456558,
0.0022415374405682087,
0.006262501236051321,
0.04979103058576584,
-0.038669832050800323,
0.06374338269233704,
0.10584718734025955,
0.026940949261188507,
0.1785435527563095,
-0.008632051758468151,
0.145364910364151,
0.017837388440966606,
0.054959964007139206,
0.06010778993368149,
0.09780889749526978,
-0.019452428445219994,
0.03273463249206543,
-0.026098666712641716,
-0.09273514151573181,
-0.040445175021886826,
0.026009835302829742,
-0.09195507317781448,
0.12244715541601181,
-0.0670250952243805,
0.001430350006558001,
0.059908218681812286,
0.23842988908290863,
0.073942631483078,
-0.15323859453201294,
-0.12377042323350906,
0.06923598796129227,
-0.03736012056469917,
-0.10904606431722641,
-0.014188406988978386,
0.06085682660341263,
-0.0994492769241333,
0.0229458287358284,
-0.03562324494123459,
0.08272401988506317,
-0.07215892523527145,
0.014066481962800026,
-0.025364220142364502,
0.1306169033050537,
-0.0297843124717474,
0.08039609342813492,
-0.15657024085521698,
0.03962212800979614,
0.029569435864686966,
0.08421030640602112,
-0.047654889523983,
0.03541964292526245,
0.031348757445812225,
0.04219347611069679,
0.14079861342906952,
0.04100409895181656,
-0.09401706606149673,
-0.06579265743494034,
-0.22219586372375488,
0.005973303224891424,
0.037246208637952805,
-0.029129914939403534,
0.13551710546016693,
-0.01771068014204502,
-0.02793315052986145,
-0.019405797123908997,
0.027501875534653664,
-0.15137477219104767,
-0.07150302827358246,
0.08310781419277191,
-0.10050606727600098,
0.011584504507482052,
-0.07462654262781143,
-0.03909135237336159,
0.0239317137748003,
0.15719547867774963,
-0.11299687623977661,
-0.0804450660943985,
-0.13253143429756165,
-0.0038768432568758726,
0.14673633873462677,
-0.06568612158298492,
0.08960337191820145,
-0.004720099736005068,
0.1398746520280838,
0.0124129643663764,
-0.15781328082084656,
0.04381224885582924,
-0.06207943707704544,
-0.146185502409935,
-0.018733492121100426,
0.07976842671632767,
0.030915632843971252,
0.060977764427661896,
0.032548967748880386,
0.06186547875404358,
-0.03095168247818947,
-0.08604985475540161,
0.022065896540880203,
0.21011725068092346,
0.022821906954050064,
0.03870319575071335,
-0.08901189267635345,
-0.019619310274720192,
-0.03671993687748909,
0.026822663843631744,
0.03736673668026924,
0.2398214042186737,
-0.07598603516817093,
0.10537535697221756,
0.1368888020515442,
-0.11859169602394104,
-0.18177755177021027,
0.02389155887067318,
-0.0021524119656533003,
0.05104827880859375,
-0.04252159595489502,
-0.2146303802728653,
0.0649113729596138,
0.05236700177192688,
-0.021769052371382713,
0.0785730704665184,
-0.24905778467655182,
-0.1098167672753334,
0.047015443444252014,
0.010726243257522583,
0.13799819350242615,
-0.07447084784507751,
-0.0794014111161232,
-0.08073614537715912,
0.03302551060914993,
0.03851762413978577,
-0.12749478220939636,
0.10300226509571075,
-0.046411845833063126,
-0.02940431982278824,
0.03656594082713127,
-0.03415814787149429,
0.14690008759498596,
-0.026910219341516495,
0.05360003188252449,
-0.03866960480809212,
0.003023920115083456,
0.09998920559883118,
-0.07599665969610214,
0.11752568930387497,
0.07474281638860703,
0.09737709909677505,
-0.013295025564730167,
-0.031206443905830383,
-0.07998697459697723,
0.08816391974687576,
-0.029511239379644394,
-0.05038014426827431,
-0.07569324225187302,
0.07151512801647186,
0.043939825147390366,
-0.010364554822444916,
-0.018093151971697807,
-0.03499077260494232,
-0.00814145989716053,
0.16295483708381653,
0.09807140380144119,
0.03534334897994995,
-0.04574865847826004,
0.09739027172327042,
-0.03130141645669937,
0.05800289288163185,
-0.05412091687321663,
0.02158529870212078,
0.09645199775695801,
0.03743484988808632,
0.036139074712991714,
0.01734696514904499,
-0.09565699100494385,
-0.043945375829935074,
0.03653906658291817,
-0.13663731515407562,
-0.10756432265043259,
-0.022329483181238174,
0.0640207976102829,
-0.13178978860378265,
0.02077174000442028,
0.14212974905967712,
-0.04661555960774422,
-0.05241037532687187,
0.013474932871758938,
0.047007422894239426,
-0.006122649647295475,
0.09968403726816177,
0.027383195236325264,
0.021521147340536118,
-0.06877081841230392,
0.13713888823986053,
0.04199987277388573,
-0.047195643186569214,
0.012177285738289356,
0.07359305769205093,
-0.13055765628814697,
-0.04117771238088608,
-0.0225936658680439,
0.05792750418186188,
-0.030446700751781464,
0.01605209894478321,
-0.056897856295108795,
-0.08601988852024078,
0.030661247670650482,
0.14394988119602203,
0.033780209720134735,
0.056520428508520126,
-0.036105524748563766,
0.048841290175914764,
-0.08736308664083481,
0.08733205497264862,
-0.012643839232623577,
0.07160332053899765,
-0.07317757606506348,
0.06430895626544952,
-0.022587891668081284,
0.03507072851061821,
-0.0481860488653183,
-0.0406172014772892,
-0.05721491575241089,
-0.06136694923043251,
-0.12244345247745514,
0.015657847747206688,
-0.10183681547641754,
-0.060868605971336365,
-0.029271243140101433,
0.032557230442762375,
-0.002449874533340335,
0.028197381645441055,
-0.019042422994971275,
-0.05847626551985741,
-0.08517768234014511,
0.0407634899020195,
-0.08370500802993774,
-0.02198254130780697,
0.02193671278655529,
-0.08905291557312012,
0.12642057240009308,
0.07317192852497101,
0.041258081793785095,
-0.011180794797837734,
-0.01883881539106369,
-0.004487772937864065,
-0.02123761549592018,
0.018332375213503838,
-0.009264066815376282,
-0.2063811719417572,
-0.005046174395829439,
-0.009319931268692017,
-0.06023899465799332,
-0.05050167441368103,
0.04063980281352997,
-0.09638770669698715,
-0.055006030946969986,
-0.026333000510931015,
0.04844595119357109,
-0.049285147339105606,
0.009328375570476055,
0.12101238965988159,
0.03202369436621666,
0.05572571977972984,
-0.043613333255052567,
0.008171061053872108,
-0.20287883281707764,
-0.0196278914809227,
-0.0013001938350498676,
-0.059245698153972626,
-0.033236317336559296,
-0.021097436547279358,
0.05805037170648575,
0.008869444020092487,
0.12774384021759033,
-0.005870984401553869,
0.004046889487653971,
0.005905992351472378,
-0.002253824146464467,
-0.01442542765289545,
0.04474232345819473,
0.21272802352905273,
0.04312873259186745,
0.00508399261161685,
0.07386049628257751,
0.06414661556482315,
-0.03871401026844978,
0.08430509269237518,
0.1380729228258133,
0.11279026418924332,
0.12822093069553375,
0.07046213001012802,
0.0001544211700093001,
-0.05189388617873192,
-0.060910601168870926,
0.0300876684486866,
0.014137513004243374,
0.0418015792965889,
-0.04186248779296875,
0.04155018925666809,
0.11851479113101959,
-0.14681577682495117,
0.09099526703357697,
-0.011117463000118732,
-0.06173919886350632,
-0.10520530492067337,
-0.15650439262390137,
-0.08310577273368835,
-0.02795472741127014,
-0.06399735808372498,
-0.1376795619726181,
0.01470875646919012,
0.0493471659719944,
-0.026492375880479813,
0.011502983048558235,
0.1019086241722107,
-0.04882802814245224,
-0.07354088127613068,
0.03822463005781174,
0.020862851291894913,
0.003774242242798209,
0.005958351772278547,
-0.008372609503567219,
0.07570808380842209,
-0.005781961604952812,
0.040591925382614136,
0.041619449853897095,
0.1358264535665512,
0.031279224902391434,
0.02537984400987625,
-0.07184405624866486,
-0.014713888056576252,
-0.006105240434408188,
-0.005188404582440853,
0.12450875341892242,
0.05078811198472977,
-0.07129133492708206,
-0.025365464389324188,
0.18023505806922913,
-0.0551045797765255,
-0.026672998443245888,
-0.06065363064408302,
0.2005472630262375,
-0.023515215143561363,
0.039151959121227264,
-0.002749812323600054,
-0.0849684476852417,
0.04100647568702698,
0.11495272815227509,
0.10705941915512085,
-0.00018267556151840836,
-0.025091826915740967,
-0.022337811067700386,
-0.020176924765110016,
0.025357484817504883,
0.061758652329444885,
0.03320964053273201,
0.13893325626850128,
-0.06120733916759491,
0.09041596204042435,
-0.01543935015797615,
-0.012922074645757675,
-0.11509393155574799,
0.06926263868808746,
-0.015244641341269016,
0.0025358337443321943,
-0.05569172650575638,
0.003686237148940563,
-0.0037193046882748604,
-0.13885340094566345,
0.04307191073894501,
-0.0383063480257988,
-0.08601260930299759,
0.0160025954246521,
-0.06840050220489502,
0.022159626707434654,
0.03968964144587517,
-0.038905929774045944,
0.0580107718706131,
0.2682558596134186,
-0.0036950192879885435,
-0.055095233023166656,
-0.1149953231215477,
0.08842833340167999,
-0.04126572236418724,
0.1481405794620514,
0.0032769429963082075,
0.04170744866132736,
0.08581478893756866,
-0.0027713109739124775,
-0.22112397849559784,
0.02000097930431366,
-0.005352627951651812,
-0.10746733099222183,
0.0656982883810997,
0.1734703779220581,
0.0034634515177458525,
0.10144244879484177,
0.014285636134445667,
0.02051166072487831,
-0.011878369376063347,
0.12173575907945633,
-0.04064028337597847,
-0.0696525052189827,
0.02526356838643551,
-0.08284714818000793,
0.16831617057323456,
0.13464508950710297,
-0.06129986047744751,
0.013742136768996716,
-0.03935288265347481,
0.03865320235490799,
0.014826140366494656,
0.034707941114902496,
-0.017247244715690613,
-0.12710700929164886,
0.02509305812418461,
-0.012518860399723053,
0.02112925797700882,
-0.19261419773101807,
-0.11916565895080566,
-0.004599079489707947,
-0.05311475321650505,
-0.045898642390966415,
0.10271692276000977,
0.011602739803493023,
0.06152556836605072,
-0.04372032731771469,
0.030788617208600044,
-0.024819599464535713,
0.11386723816394806,
-0.13064467906951904,
-0.1097201257944107
] |
null | null | transformers |
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
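Until the card is filled in, the following is a speculative loading sketch. Everything in it beyond the repo id is an assumption inferred from the repo name (`flan-large-lora-stance-human6`): that the base is Flan-T5-large, that any LoRA weights were merged before upload, and that stance detection is posed as text-to-text generation.

```python
# Speculative sketch only -- base architecture, merged-adapter assumption,
# and prompt format are all guesses from the repo name, not documented.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_id = "kenchenxingyu/flan-large-lora-stance-human6"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

prompt = (
    "Classify the stance of the text toward the target as favor, against, or none.\n"
    "Target: <target>\nText: <text>"
)
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=8)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```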
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed]
| {"library_name": "transformers", "tags": []} | null | kenchenxingyu/flan-large-lora-stance-human6 | [
"transformers",
"safetensors",
"arxiv:1910.09700",
"endpoints_compatible",
"region:us"
] | 2024-02-13T09:35:38+00:00 | [
"1910.09700"
] | [] | TAGS
#transformers #safetensors #arxiv-1910.09700 #endpoints_compatible #region-us
|
# Model Card for Model ID
## Model Details
### Model Description
This is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.
- Developed by:
- Funded by [optional]:
- Shared by [optional]:
- Model type:
- Language(s) (NLP):
- License:
- Finetuned from model [optional]:
### Model Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Downstream Use [optional]
### Out-of-Scope Use
## Bias, Risks, and Limitations
### Recommendations
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
## Training Details
### Training Data
### Training Procedure
#### Preprocessing [optional]
#### Training Hyperparameters
- Training regime:
#### Speeds, Sizes, Times [optional]
## Evaluation
### Testing Data, Factors & Metrics
#### Testing Data
#### Factors
#### Metrics
### Results
#### Summary
## Model Examination [optional]
## Environmental Impact
Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).
- Hardware Type:
- Hours used:
- Cloud Provider:
- Compute Region:
- Carbon Emitted:
## Technical Specifications [optional]
### Model Architecture and Objective
### Compute Infrastructure
#### Hardware
#### Software
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Model Card Authors [optional]
## Model Card Contact
| [
"# Model Card for Model ID",
"## Model Details",
"### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:",
"### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Downstream Use [optional]",
"### Out-of-Scope Use",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.",
"## How to Get Started with the Model\n\nUse the code below to get started with the model.",
"## Training Details",
"### Training Data",
"### Training Procedure",
"#### Preprocessing [optional]",
"#### Training Hyperparameters\n\n- Training regime:",
"#### Speeds, Sizes, Times [optional]",
"## Evaluation",
"### Testing Data, Factors & Metrics",
"#### Testing Data",
"#### Factors",
"#### Metrics",
"### Results",
"#### Summary",
"## Model Examination [optional]",
"## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:",
"## Technical Specifications [optional]",
"### Model Architecture and Objective",
"### Compute Infrastructure",
"#### Hardware",
"#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Model Card Authors [optional]",
"## Model Card Contact"
] | [
"TAGS\n#transformers #safetensors #arxiv-1910.09700 #endpoints_compatible #region-us \n",
"# Model Card for Model ID",
"## Model Details",
"### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:",
"### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Downstream Use [optional]",
"### Out-of-Scope Use",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.",
"## How to Get Started with the Model\n\nUse the code below to get started with the model.",
"## Training Details",
"### Training Data",
"### Training Procedure",
"#### Preprocessing [optional]",
"#### Training Hyperparameters\n\n- Training regime:",
"#### Speeds, Sizes, Times [optional]",
"## Evaluation",
"### Testing Data, Factors & Metrics",
"#### Testing Data",
"#### Factors",
"#### Metrics",
"### Results",
"#### Summary",
"## Model Examination [optional]",
"## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:",
"## Technical Specifications [optional]",
"### Model Architecture and Objective",
"### Compute Infrastructure",
"#### Hardware",
"#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Model Card Authors [optional]",
"## Model Card Contact"
] | [
31,
6,
3,
82,
28,
3,
4,
9,
9,
10,
42,
20,
3,
4,
5,
9,
11,
13,
3,
12,
5,
4,
5,
3,
4,
9,
53,
9,
8,
6,
3,
14,
8,
7,
9,
4
] | [
"passage: TAGS\n#transformers #safetensors #arxiv-1910.09700 #endpoints_compatible #region-us \n# Model Card for Model ID## Model Details### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Downstream Use [optional]### Out-of-Scope Use## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.## How to Get Started with the Model\n\nUse the code below to get started with the model.## Training Details### Training Data### Training Procedure#### Preprocessing [optional]#### Training Hyperparameters\n\n- Training regime:#### Speeds, Sizes, Times [optional]## Evaluation### Testing Data, Factors & Metrics#### Testing Data#### Factors#### Metrics### Results#### Summary## Model Examination [optional]## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:## Technical Specifications [optional]### Model Architecture and Objective### Compute Infrastructure#### Hardware#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Model Card Authors [optional]## Model Card Contact"
] | [
-0.06646376848220825,
0.2168014943599701,
-0.00225935154594481,
0.023818302899599075,
0.1271018385887146,
-0.001635765191167593,
0.04218708351254463,
0.13324736058712006,
-0.020175931975245476,
0.11144465953111649,
0.046588581055402756,
0.09377603232860565,
0.09928803145885468,
0.18404334783554077,
0.04859916493296623,
-0.2059975117444992,
0.007056170143187046,
-0.09090408682823181,
0.014076028019189835,
0.1116579994559288,
0.13719257712364197,
-0.10291384905576706,
0.08272874355316162,
-0.04045208916068077,
-0.02019004337489605,
0.00012576708104461432,
-0.09259183704853058,
-0.07032395154237747,
0.06885425746440887,
0.06264153122901917,
0.051234472543001175,
0.001456156256608665,
0.09140396863222122,
-0.2864592671394348,
0.017265573143959045,
0.08406311273574829,
0.0027674848679453135,
0.06290827691555023,
0.07236549258232117,
-0.07389893382787704,
0.11328595131635666,
-0.08021481335163116,
0.13019037246704102,
0.08625296503305435,
-0.062064990401268005,
-0.23071379959583282,
-0.07525765895843506,
0.0963398814201355,
0.12251301854848862,
0.06215599179267883,
-0.022921854630112648,
0.15455181896686554,
-0.06248689442873001,
0.012971068732440472,
0.1294165402650833,
-0.11526761949062347,
-0.05572471022605896,
0.061741601675748825,
0.11775490641593933,
0.10740239918231964,
-0.14110268652439117,
-0.0017287094378843904,
0.04900608956813812,
0.029121357947587967,
0.08589313924312592,
0.022661056369543076,
0.12003941088914871,
0.04652795568108559,
-0.13695219159126282,
-0.04037507623434067,
0.12011898308992386,
0.038862764835357666,
-0.06446044892072678,
-0.2168138176202774,
-0.006778308190405369,
-0.0601806715130806,
-0.014732478186488152,
-0.07019448280334473,
0.039128515869379044,
-0.02470310963690281,
0.07317749410867691,
-0.04465159401297569,
-0.1063927412033081,
-0.0421026237308979,
0.0892222449183464,
0.07748593389987946,
0.011527054943144321,
-0.02519804798066616,
0.04627908393740654,
0.13455867767333984,
0.05402068421244621,
-0.10399353504180908,
-0.07017925381660461,
-0.06942764669656754,
-0.09420394152402878,
-0.04035796597599983,
0.056760527193546295,
0.031942449510097504,
0.02665667235851288,
0.22703726589679718,
0.016653569415211678,
0.04155244305729866,
0.0224777739495039,
0.01032855175435543,
0.043662428855895996,
0.0955500528216362,
-0.05303520709276199,
-0.15660029649734497,
-0.04072032496333122,
0.09077946096658707,
-0.0027527001220732927,
-0.036689214408397675,
-0.03966725245118141,
0.03849169611930847,
0.06843466311693192,
0.13122352957725525,
0.07552056759595871,
-0.017929591238498688,
-0.04813180863857269,
-0.030096933245658875,
0.23523783683776855,
-0.1493375599384308,
0.04426715523004532,
-0.02271856553852558,
-0.01804111897945404,
-0.03908449783921242,
0.03597262129187584,
0.022118929773569107,
-0.000004518366949923802,
0.09706240892410278,
-0.058981191366910934,
-0.05378659814596176,
-0.10168042778968811,
-0.03272576630115509,
0.04088849574327469,
-0.013975566253066063,
-0.010589460842311382,
-0.09025166928768158,
-0.09490354359149933,
-0.04766594246029854,
0.05537205561995506,
-0.05123869329690933,
-0.03770573064684868,
0.009465423412621021,
-0.08151785284280777,
-0.005444355774670839,
-0.005417742300778627,
0.10699385404586792,
-0.03222226724028587,
0.04445803165435791,
-0.027600755915045738,
0.05225523188710213,
0.09919606149196625,
0.031576547771692276,
-0.0773419588804245,
0.0561848059296608,
-0.22559374570846558,
0.07503069192171097,
-0.11481974273920059,
0.04335082694888115,
-0.1704932004213333,
-0.042439818382263184,
0.005444696638733149,
0.0139949731528759,
0.013206101022660732,
0.12720820307731628,
-0.19255615770816803,
-0.01654396951198578,
0.13260798156261444,
-0.09212633967399597,
-0.118110790848732,
0.07884611934423447,
-0.029701577499508858,
0.1624738723039627,
0.04682036489248276,
-0.027025915682315826,
0.09224298596382141,
-0.16434773802757263,
-0.07092688232660294,
-0.00949116237461567,
-0.01727987825870514,
0.12109188735485077,
0.07512219995260239,
-0.05991523340344429,
0.046571120619773865,
0.02832140028476715,
-0.038078423589468,
-0.04424772411584854,
-0.050857074558734894,
-0.10884185880422592,
-0.01070026308298111,
-0.08987759798765182,
0.04065500199794769,
-0.01250192429870367,
-0.07916021347045898,
-0.029885273426771164,
-0.18612512946128845,
-0.0030564051121473312,
0.10038342326879501,
0.0035033065360039473,
-0.005652366206049919,
-0.08666291832923889,
0.026358824223279953,
-0.03112892620265484,
-0.008404186926782131,
-0.16764774918556213,
-0.04399421438574791,
0.046902090311050415,
-0.16094985604286194,
0.020117372274398804,
-0.06413903087377548,
0.06334125250577927,
0.03641495108604431,
-0.05590536445379257,
-0.0248766727745533,
-0.01730942726135254,
0.011945613659918308,
-0.05083848536014557,
-0.18994836509227753,
-0.056277405470609665,
-0.037882111966609955,
0.149809330701828,
-0.25956398248672485,
0.032966937869787216,
0.051140617579221725,
0.14649195969104767,
0.00406361510977149,
-0.05115427449345589,
0.01429014839231968,
-0.05360214412212372,
-0.054652128368616104,
-0.06746816635131836,
-0.006135428790003061,
-0.027576493099331856,
-0.05147203803062439,
0.019243421033024788,
-0.1755700707435608,
-0.021410830318927765,
0.09424154460430145,
0.12876708805561066,
-0.1486445665359497,
-0.018640631809830666,
-0.048725154250860214,
-0.06339836865663528,
-0.0715010017156601,
-0.07038594037294388,
0.10712739825248718,
0.0513901449739933,
0.04796046018600464,
-0.07435787469148636,
-0.07092321664094925,
0.02726263552904129,
0.006906150374561548,
-0.03382374346256256,
0.08727246522903442,
0.05199531093239784,
-0.09209315478801727,
0.0756213590502739,
0.1092359870672226,
0.07177663594484329,
0.09363535046577454,
0.01574566215276718,
-0.11756632477045059,
-0.028492970392107964,
0.036266472190618515,
0.02740776725113392,
0.1465986967086792,
-0.05952361226081848,
0.04016614332795143,
0.04494241625070572,
-0.04170418903231621,
0.022319864481687546,
-0.08787637203931808,
0.024075502529740334,
0.025203049182891846,
-0.0034381982404738665,
0.06284574419260025,
-0.02525499276816845,
-0.0050758360885083675,
0.07016654312610626,
0.047779910266399384,
0.04621000960469246,
0.009655474685132504,
-0.01720241829752922,
-0.1047825813293457,
0.16950392723083496,
-0.0951867327094078,
-0.269941508769989,
-0.17632324993610382,
0.026197833940386772,
0.04035249724984169,
-0.022378476336598396,
0.031619444489479065,
-0.07056326419115067,
-0.10630585998296738,
-0.1060405746102333,
-0.002429972169920802,
0.01714223250746727,
-0.06364088505506516,
-0.0741225928068161,
0.07348573952913284,
0.04382912442088127,
-0.14902326464653015,
0.038552410900592804,
0.055694397538900375,
-0.057955220341682434,
-0.0233661737293005,
0.09118817001581192,
0.12397737801074982,
0.14583967626094818,
-0.021366750821471214,
-0.028626007959246635,
0.029004426673054695,
0.19620531797409058,
-0.13469526171684265,
0.10371150821447372,
0.13814030587673187,
-0.04545360431075096,
0.08360563963651657,
0.1560150384902954,
0.029186224564909935,
-0.08317049592733383,
0.05044832453131676,
0.04082648828625679,
-0.043159641325473785,
-0.2666129767894745,
-0.0534592866897583,
0.012832709588110447,
-0.06255637854337692,
0.09786593168973923,
0.10183793306350708,
0.11542957276105881,
0.034910861402750015,
-0.07166364789009094,
-0.043925940990448,
-0.0058974819257855415,
0.11737963557243347,
-0.05490213260054588,
-0.012639665976166725,
0.07686592638492584,
-0.05086168646812439,
0.005355054512619972,
0.10266812145709991,
0.02973790094256401,
0.17442677915096283,
0.020399179309606552,
0.11231429129838943,
0.06195578724145889,
0.08633565157651901,
0.0007386076031252742,
0.02951662428677082,
0.05147615820169449,
0.017203815281391144,
-0.002300140680745244,
-0.10421168059110641,
-0.006156572140753269,
0.1449710875749588,
0.028103826567530632,
0.029669636860489845,
-0.0018948549404740334,
-0.005003341939300299,
0.05121048167347908,
0.1746254414319992,
-0.011592294089496136,
-0.22072425484657288,
-0.0845772922039032,
0.06936841458082199,
-0.06218599155545235,
-0.12968985736370087,
-0.026130788028240204,
0.045467354357242584,
-0.17519839107990265,
0.026703642681241035,
-0.027433741837739944,
0.0919293761253357,
-0.09345759451389313,
-0.02221956104040146,
0.03687324374914169,
0.084866963326931,
-0.014529162086546421,
0.08703910559415817,
-0.14498743414878845,
0.11886418610811234,
0.02978132851421833,
0.09024628251791,
-0.11081171780824661,
0.07909037172794342,
-0.007550720125436783,
0.009180475026369095,
0.19379350543022156,
-0.011335089802742004,
-0.03514958545565605,
-0.08774717897176743,
-0.11210042238235474,
-0.013537433929741383,
0.12687496840953827,
-0.1243172138929367,
0.08773399889469147,
-0.015198243781924248,
-0.044079482555389404,
0.00937260314822197,
-0.12100647389888763,
-0.17273177206516266,
-0.19628387689590454,
0.05585884302854538,
-0.09575839340686798,
0.025643249973654747,
-0.11914430558681488,
-0.07089093327522278,
-0.02952558360993862,
0.241120383143425,
-0.1745356321334839,
-0.06510113179683685,
-0.1468164622783661,
-0.046294767409563065,
0.1662203073501587,
-0.04437198117375374,
0.0718095526099205,
-0.0208172257989645,
0.20345525443553925,
0.005988610442727804,
-0.004939318168908358,
0.06724198162555695,
-0.08892562240362167,
-0.16873881220817566,
-0.06771010160446167,
0.1510489284992218,
0.11680185794830322,
0.04907919466495514,
-0.002248800592496991,
0.0011772146681323647,
-0.016943959519267082,
-0.1137804463505745,
-0.0033210667315870523,
0.16037839651107788,
0.03878779336810112,
0.025986969470977783,
-0.05243593826889992,
-0.08797456324100494,
-0.06899320334196091,
-0.06853509694337845,
0.06221301481127739,
0.19590823352336884,
-0.10376439243555069,
0.1700313836336136,
0.147536963224411,
-0.07305635511875153,
-0.23175598680973053,
0.035342130810022354,
0.04983805492520332,
0.0014306638622656465,
0.04886869341135025,
-0.18252557516098022,
0.10521943867206573,
0.019543392583727837,
-0.05505957826972008,
0.13485197722911835,
-0.1557481735944748,
-0.1552847921848297,
0.0722852572798729,
0.03904085233807564,
-0.22423844039440155,
-0.1354004591703415,
-0.09622503817081451,
-0.05825018882751465,
-0.14065024256706238,
0.06054598465561867,
-0.002136280992999673,
0.015948504209518433,
0.03500790148973465,
-0.0015643214574083686,
0.027123261243104935,
-0.058935679495334625,
0.18609118461608887,
-0.004065449349582195,
0.020676052197813988,
-0.060264769941568375,
-0.0478842556476593,
0.09839435666799545,
-0.06130504235625267,
0.12208222597837448,
0.004057085141539574,
0.01594383642077446,
-0.10362856835126877,
-0.048314861953258514,
-0.04328322783112526,
0.05154227837920189,
-0.07548051327466965,
-0.10070807486772537,
-0.043625857681035995,
0.08841723203659058,
0.07005169242620468,
-0.03383097052574158,
0.00549331633374095,
-0.07189501076936722,
0.10019614547491074,
0.17795267701148987,
0.17573626339435577,
0.009926567785441875,
-0.07241068035364151,
0.01677953451871872,
-0.04142116755247116,
0.044231921434402466,
-0.2513144314289093,
0.03756171092391014,
0.06098250672221184,
0.029438555240631104,
0.09217222779989243,
-0.020435843616724014,
-0.1820858269929886,
-0.04050002992153168,
0.08094815909862518,
-0.05452597141265869,
-0.22617179155349731,
-0.019085140898823738,
0.0954197570681572,
-0.2020406424999237,
-0.007372708059847355,
0.03995226323604584,
-0.048725228756666183,
-0.023169852793216705,
0.00010950004070764408,
0.06317184865474701,
0.002471912419423461,
0.09773622453212738,
0.0735151618719101,
0.09715340286493301,
-0.08337292820215225,
0.10562895983457565,
0.10150538384914398,
-0.09572599828243256,
0.03605884686112404,
0.06754924356937408,
-0.05300498008728027,
-0.043293699622154236,
0.03665391728281975,
0.033023297786712646,
0.005234600510448217,
-0.060321882367134094,
0.013913018628954887,
-0.036497246474027634,
0.044923391193151474,
0.08326134830713272,
0.03754979372024536,
-0.013354414142668247,
0.06462216377258301,
0.03401726484298706,
-0.10898099094629288,
0.10366570204496384,
0.01731540448963642,
0.04105307161808014,
-0.08384523540735245,
-0.019968897104263306,
0.035425446927547455,
0.030576206743717194,
-0.01765924133360386,
-0.02306121215224266,
-0.02860277332365513,
-0.01614218018949032,
-0.14299540221691132,
-0.023106401786208153,
-0.07243485748767853,
0.006181265693157911,
0.014656842686235905,
-0.031884219497442245,
-0.011233693920075893,
0.02475680410861969,
-0.06979699432849884,
-0.07426341623067856,
-0.006949664559215307,
0.09833318740129471,
-0.15115703642368317,
0.008848577737808228,
0.06907843053340912,
-0.11088496446609497,
0.08190931379795074,
-0.008411259390413761,
0.016245156526565552,
0.022527478635311127,
-0.15448406338691711,
0.05601610988378525,
0.0008648968650959432,
0.01916889287531376,
0.025886621326208115,
-0.16471809148788452,
0.004104440100491047,
-0.04661374166607857,
-0.02149827405810356,
-0.00004464812809601426,
-0.02647159807384014,
-0.12325995415449142,
0.06858719140291214,
-0.015622655861079693,
-0.035931166261434555,
-0.02701525390148163,
0.0539589487016201,
0.07888586074113846,
-0.027474910020828247,
0.10445091128349304,
-0.008690856397151947,
0.04941811040043831,
-0.16801609098911285,
-0.02470702864229679,
-0.04982255399227142,
0.019377702847123146,
0.009884213097393513,
-0.007693959400057793,
0.04183054715394974,
-0.00976533442735672,
0.21883612871170044,
-0.05075952783226967,
0.1607085019350052,
0.05847611650824547,
-0.017352959141135216,
-0.0007513365126214921,
0.06180921941995621,
0.05997028574347496,
0.04658793285489082,
0.009480604901909828,
0.023740366101264954,
-0.022450892254710197,
-0.006695089396089315,
-0.15932634472846985,
0.01890849508345127,
0.14999441802501678,
0.06301083415746689,
0.024745315313339233,
0.05866100639104843,
-0.12775006890296936,
-0.12135478109121323,
0.09311001747846603,
-0.026755332946777344,
0.00928465835750103,
-0.08245618641376495,
0.1358020007610321,
0.14980104565620422,
-0.14000412821769714,
0.05256148427724838,
-0.06134212389588356,
-0.05217423290014267,
-0.10388828068971634,
-0.12032219022512436,
-0.05887215584516525,
-0.053666237741708755,
0.002330566756427288,
-0.03760887682437897,
0.054546963423490524,
0.03344334661960602,
-0.009351172484457493,
-0.00022941511997487396,
0.13597318530082703,
-0.019751882180571556,
-0.0028988157864660025,
0.048313532024621964,
0.03693558648228645,
0.02373051457107067,
-0.05275435373187065,
0.02940409444272518,
0.02539868652820587,
0.032232340425252914,
0.06546790152788162,
0.033412106335163116,
-0.047448933124542236,
0.03804153576493263,
-0.0025254099164158106,
-0.11207924783229828,
0.019641218706965446,
-0.00460948096588254,
-0.0742158442735672,
0.1268945336341858,
0.0407399944961071,
0.010224059224128723,
-0.03741471841931343,
0.24361543357372284,
-0.06653323769569397,
-0.06378097087144852,
-0.13251738250255585,
0.10491154342889786,
-0.0027236645109951496,
0.06476365029811859,
0.023412218317389488,
-0.1284150779247284,
0.005243356805294752,
0.13858191668987274,
0.12181595712900162,
0.0045748427510261536,
0.009228081442415714,
0.0518609918653965,
0.0025186820421367884,
-0.06998204439878464,
0.054019294679164886,
0.06992026418447495,
0.12919506430625916,
-0.07847554981708527,
0.07680778950452805,
0.0006860480643808842,
-0.08370215445756912,
-0.02947772853076458,
0.11312682181596756,
-0.0409729965031147,
0.03491825982928276,
-0.047444481402635574,
0.10916327685117722,
-0.05787910893559456,
-0.29412412643432617,
0.02350960113108158,
-0.09588567912578583,
-0.15202060341835022,
-0.018367812037467957,
0.05944539234042168,
-0.02624768204987049,
0.018029648810625076,
0.06971040368080139,
-0.06011629104614258,
0.20098382234573364,
0.0335683599114418,
-0.07864278554916382,
-0.0664360448718071,
0.04837050288915634,
-0.06564252078533173,
0.2949807047843933,
0.008418165147304535,
0.02863333560526371,
0.10770907253026962,
-0.03253700211644173,
-0.18271861970424652,
0.010723991319537163,
0.1133992001414299,
-0.08056149631738663,
0.08200647681951523,
0.19000613689422607,
-0.012578671798110008,
0.1209007054567337,
0.05294662341475487,
-0.047376248985528946,
0.04217283055186272,
-0.03389401361346245,
-0.051268599927425385,
-0.10752558708190918,
0.058453381061553955,
-0.05909625440835953,
0.15447644889354706,
0.10152646154165268,
-0.05671518296003342,
-0.004550917539745569,
-0.05555408447980881,
0.04875178262591362,
0.01804669201374054,
0.12263146042823792,
0.02951994352042675,
-0.1865430772304535,
0.032826557755470276,
-0.01144319772720337,
0.10186848044395447,
-0.25588861107826233,
-0.08421015739440918,
0.08833149075508118,
-0.011924264021217823,
-0.05105875805020332,
0.10560628771781921,
0.057650718837976456,
0.04243382066488266,
-0.043439045548439026,
-0.10480839014053345,
-0.02186836116015911,
0.14663739502429962,
-0.1469624787569046,
-0.025013303384184837
] |
null | null | transformers |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# distilbert-base-uncased_emotion_ft
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on the emotion dataset.
It achieves the following results on the evaluation set:
- Loss: 0.1529
- Accuracy: 0.934
- F1: 0.9345
- Precision: 0.9052
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 64
- eval_batch_size: 64
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 4
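For readers who want to reproduce this run, the list above maps onto 🤗 `TrainingArguments` roughly as follows (a sketch: the output directory and evaluation strategy are placeholders, not taken from this card):

```python
# Sketch mapping the listed hyperparameters onto TrainingArguments.
# output_dir and evaluation_strategy are assumptions, not from the card.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="distilbert-base-uncased_emotion_ft",  # placeholder
    learning_rate=2e-5,
    per_device_train_batch_size=64,
    per_device_eval_batch_size=64,
    num_train_epochs=4,
    seed=42,
    lr_scheduler_type="linear",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    evaluation_strategy="epoch",  # consistent with the per-epoch rows below
)
```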
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 | Precision |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:------:|:---------:|
| No log | 1.0 | 250 | 0.2728 | 0.9155 | 0.9138 | 0.9034 |
| 0.5164 | 2.0 | 500 | 0.1793 | 0.9275 | 0.9280 | 0.8951 |
| 0.5164 | 3.0 | 750 | 0.1552 | 0.935 | 0.9354 | 0.9036 |
| 0.1258 | 4.0 | 1000 | 0.1529 | 0.934 | 0.9345 | 0.9052 |
### Framework versions
- Transformers 4.28.0
- Pytorch 2.1.2+cu121
- Datasets 2.17.0
- Tokenizers 0.13.3
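A short inference example (added here, not part of the auto-generated card); the repo id matches this model's Hub path, while the sample sentence is arbitrary:

```python
# Minimal inference sketch using the text-classification pipeline.
from transformers import pipeline

classifier = pipeline(
    "text-classification",
    model="shykennys/distilbert-base-uncased_emotion_ft",
)
print(classifier("I can't wait to see you again!"))
# -> e.g. [{'label': 'joy', 'score': ...}] (labels come from the emotion dataset)
```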
| {"license": "apache-2.0", "tags": ["generated_from_trainer"], "datasets": ["emotion"], "metrics": ["accuracy", "f1", "precision"], "model-index": [{"name": "distilbert-base-uncased_emotion_ft", "results": [{"task": {"type": "text-classification", "name": "Text Classification"}, "dataset": {"name": "emotion", "type": "emotion", "config": "split", "split": "validation", "args": "split"}, "metrics": [{"type": "accuracy", "value": 0.934, "name": "Accuracy"}, {"type": "f1", "value": 0.9344783366934866, "name": "F1"}, {"type": "precision", "value": 0.9052089351876242, "name": "Precision"}]}]}]} | text-classification | shykennys/distilbert-base-uncased_emotion_ft | [
"transformers",
"pytorch",
"distilbert",
"text-classification",
"generated_from_trainer",
"dataset:emotion",
"license:apache-2.0",
"model-index",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | 2024-02-13T09:37:14+00:00 | [] | [] | TAGS
#transformers #pytorch #distilbert #text-classification #generated_from_trainer #dataset-emotion #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #region-us
| distilbert-base-uncased\_emotion\_ft
====================================
This model is a fine-tuned version of distilbert-base-uncased on the emotion dataset.
It achieves the following results on the evaluation set:
* Loss: 0.1529
* Accuracy: 0.934
* F1: 0.9345
* Precision: 0.9052
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 2e-05
* train\_batch\_size: 64
* eval\_batch\_size: 64
* seed: 42
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* num\_epochs: 4
### Training results
### Framework versions
* Transformers 4.28.0
* Pytorch 2.1.2+cu121
* Datasets 2.17.0
* Tokenizers 0.13.3
| [
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 64\n* eval\\_batch\\_size: 64\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 4",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.28.0\n* Pytorch 2.1.2+cu121\n* Datasets 2.17.0\n* Tokenizers 0.13.3"
] | [
"TAGS\n#transformers #pytorch #distilbert #text-classification #generated_from_trainer #dataset-emotion #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 64\n* eval\\_batch\\_size: 64\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 4",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.28.0\n* Pytorch 2.1.2+cu121\n* Datasets 2.17.0\n* Tokenizers 0.13.3"
] | [
63,
98,
4,
35
] | [
"passage: TAGS\n#transformers #pytorch #distilbert #text-classification #generated_from_trainer #dataset-emotion #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 64\n* eval\\_batch\\_size: 64\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 4### Training results### Framework versions\n\n\n* Transformers 4.28.0\n* Pytorch 2.1.2+cu121\n* Datasets 2.17.0\n* Tokenizers 0.13.3"
] | [
-0.10957523435354233,
0.1155039593577385,
-0.0015673587331548333,
0.1455605924129486,
0.18671713769435883,
0.04941437020897865,
0.08846377581357956,
0.11618949472904205,
-0.09674284607172012,
0.025930246338248253,
0.11277841776609421,
0.1660252809524536,
0.009656376205384731,
0.12763597071170807,
-0.07047660648822784,
-0.2749861776828766,
-0.003352479776367545,
0.03945140168070793,
-0.00802378449589014,
0.13511323928833008,
0.10662263631820679,
-0.1240731030702591,
0.10884790122509003,
-0.009946496225893497,
-0.19904513657093048,
0.008003314957022667,
0.016687139868736267,
-0.038037337362766266,
0.13961659371852875,
0.022923054173588753,
0.09349028766155243,
0.002274205442517996,
0.09392205625772476,
-0.21875235438346863,
0.01623399741947651,
0.035460472106933594,
0.002444726414978504,
0.08204136043787003,
0.031007030978798866,
-0.023359334096312523,
0.15907755494117737,
-0.08251636475324631,
0.04420579597353935,
0.018433351069688797,
-0.11666549742221832,
-0.22116880118846893,
-0.08426821976900101,
0.03541924059391022,
0.06357626616954803,
0.12527714669704437,
-0.019569499418139458,
0.14213772118091583,
-0.10987895727157593,
0.09488339722156525,
0.22572198510169983,
-0.24124819040298462,
-0.06699077785015106,
0.011086142621934414,
0.00001591429281688761,
0.045802611857652664,
-0.12106300890445709,
-0.04113924503326416,
0.06405901163816452,
0.049081455916166306,
0.11009619385004044,
-0.041094060987234116,
-0.1154525876045227,
0.015441272407770157,
-0.12101619690656662,
-0.03552034869790077,
0.17506389319896698,
0.06561104953289032,
-0.02643350139260292,
-0.03428090363740921,
-0.0546734444797039,
-0.1502387523651123,
-0.027312012389302254,
-0.003119113389402628,
0.059016261249780655,
-0.015475105494260788,
-0.0863066092133522,
0.034727875143289566,
-0.11307030916213989,
-0.030943481251597404,
-0.06732002645730972,
0.11415448784828186,
0.006081657949835062,
0.014914299361407757,
-0.007720406632870436,
0.08743476122617722,
-0.010601185262203217,
-0.12673084437847137,
0.01052639540284872,
0.014581887982785702,
0.027733633294701576,
-0.04682965204119682,
-0.08658187091350555,
-0.019534235820174217,
-0.006587423849850893,
0.10672464221715927,
-0.06228775903582573,
0.042161546647548676,
0.0521596223115921,
0.01820230670273304,
-0.053469467908144,
0.2082284539937973,
-0.039179276674985886,
-0.04814213514328003,
0.002330135554075241,
0.04846559092402458,
0.026244331151247025,
-0.00022157923376653343,
-0.12196346372365952,
0.01893758215010166,
0.0941782221198082,
0.0002503823779989034,
-0.11128275096416473,
0.08631737530231476,
-0.08071237802505493,
-0.01476588100194931,
-0.01922987587749958,
-0.07369902729988098,
0.0375279001891613,
0.010740249417722225,
-0.08046066761016846,
0.009789452888071537,
0.011591482907533646,
0.010933018289506435,
-0.007091095671057701,
0.09068892896175385,
-0.08496975898742676,
0.03526996448636055,
-0.09387071430683136,
-0.10181847959756851,
0.02967807650566101,
-0.05952019616961479,
0.048229511827230453,
-0.10255827009677887,
-0.21108654141426086,
-0.028898021206259727,
0.0631043091416359,
-0.0217698123306036,
-0.056559763848781586,
-0.0747552365064621,
-0.06408865749835968,
0.017006848007440567,
-0.010404113680124283,
0.08848287910223007,
-0.07266006618738174,
0.08224806189537048,
0.04144725576043129,
0.08911546319723129,
-0.057759396731853485,
0.05057120695710182,
-0.12953093647956848,
-0.0011581189464777708,
-0.1563558280467987,
0.05846515670418739,
-0.03636033833026886,
0.09958156943321228,
-0.05948735401034355,
-0.11711028218269348,
0.05494767427444458,
-0.010863817296922207,
0.06520221382379532,
0.11268506199121475,
-0.1959269493818283,
-0.08589215576648712,
0.14827698469161987,
-0.06314490735530853,
-0.11662915349006653,
0.12298480421304703,
-0.07369869947433472,
0.07550350576639175,
0.09403254836797714,
0.1751401126384735,
0.03215785697102547,
-0.05076507478952408,
-0.024039151147007942,
0.012072107754647732,
0.05507396534085274,
-0.03828688710927963,
0.05449029803276062,
0.044993773102760315,
0.008614012971520424,
0.04351351037621498,
-0.009660874493420124,
0.08076498657464981,
-0.10065478831529617,
-0.0940122976899147,
-0.037186142057180405,
-0.10771217197179794,
0.04120401665568352,
0.0853661373257637,
0.06475458294153214,
-0.12042080610990524,
-0.06110028550028801,
0.04889644309878349,
0.10286835581064224,
-0.056506186723709106,
0.025074299424886703,
-0.06364498287439346,
0.04918716102838516,
0.009925211779773235,
-0.019607948139309883,
-0.17590555548667908,
0.021324362605810165,
0.0013589720474556088,
0.054162949323654175,
0.004959731362760067,
0.010184905491769314,
0.05985983461141586,
0.036746904253959656,
-0.06264983862638474,
-0.017758456990122795,
-0.024804245680570602,
0.004392583854496479,
-0.11441536992788315,
-0.2170713245868683,
-0.011505134403705597,
-0.01829448714852333,
0.1871926188468933,
-0.21391883492469788,
0.028937526047229767,
-0.01642606593668461,
0.05902445688843727,
0.007849697954952717,
-0.015293106436729431,
-0.040645208209753036,
0.06676416099071503,
-0.04989401623606682,
-0.04565143957734108,
0.06796872615814209,
0.0141745675355196,
-0.08069388568401337,
-0.042563267052173615,
-0.09877142310142517,
0.15220105648040771,
0.1352301985025406,
-0.11500038206577301,
-0.07022473216056824,
0.001158463186584413,
-0.056857310235500336,
-0.005866293795406818,
-0.03754441440105438,
0.048349253833293915,
0.18196071684360504,
-0.01683042384684086,
0.14201459288597107,
-0.05633847415447235,
-0.014276327565312386,
0.014278847724199295,
-0.04832633212208748,
0.02091313526034355,
0.12153004854917526,
0.10883335769176483,
-0.0874185636639595,
0.14653994143009186,
0.1428689807653427,
-0.09809575974941254,
0.13205105066299438,
-0.03237004950642586,
-0.04938104748725891,
-0.02814118191599846,
-0.06878551840782166,
-0.02856985107064247,
0.09330985695123672,
-0.15120349824428558,
-0.007662154734134674,
0.02045859396457672,
0.00853903777897358,
-0.0023306733928620815,
-0.2135952115058899,
-0.05614067241549492,
0.04423770681023598,
-0.03244780749082565,
-0.03056332655251026,
-0.018691465258598328,
0.004721696022897959,
0.10549908131361008,
0.007647041231393814,
-0.08807815611362457,
0.03817835450172424,
0.007240302860736847,
-0.08814964443445206,
0.2049877643585205,
-0.11716510355472565,
-0.16544070839881897,
-0.09710298478603363,
-0.08737082779407501,
-0.05041927471756935,
0.014265245757997036,
0.07010973989963531,
-0.11945855617523193,
-0.031887490302324295,
-0.0725325420498848,
0.03162526711821556,
0.014021055772900581,
0.02151060849428177,
0.04463040828704834,
-0.011455846019089222,
0.05233072116971016,
-0.09818699955940247,
-0.022572508081793785,
-0.05351836979389191,
-0.05249440670013428,
0.06036689877510071,
0.01115836575627327,
0.12034296989440918,
0.16537727415561676,
-0.007349393796175718,
0.008702345192432404,
-0.03984219580888748,
0.26547667384147644,
-0.06789552420377731,
-0.029054995626211166,
0.13066978752613068,
-0.002996032824739814,
0.04493614286184311,
0.13141165673732758,
0.05882761627435684,
-0.1133432686328888,
0.02554323710501194,
0.036335691809654236,
-0.03255274146795273,
-0.201851025223732,
-0.05252990126609802,
-0.047393057495355606,
0.006685181055217981,
0.0631345883011818,
0.016496868804097176,
0.03727814182639122,
0.07442964613437653,
0.046563394367694855,
0.05840560793876648,
-0.06305031478404999,
0.054352350533008575,
0.1374031901359558,
0.024773409590125084,
0.0995207279920578,
-0.033585138618946075,
-0.05062377452850342,
0.0702318400144577,
-0.041056256741285324,
0.1983710527420044,
-0.006611341144889593,
0.12710784375667572,
0.03632041811943054,
0.16386301815509796,
-0.033473219722509384,
0.07708216458559036,
-0.0004019102780148387,
-0.044629063457250595,
-0.03730421140789986,
-0.020412379875779152,
-0.07685333490371704,
0.03932742029428482,
-0.07010789215564728,
0.08507746458053589,
-0.1515870839357376,
0.00800351146608591,
0.07574433088302612,
0.26986315846443176,
0.04364100471138954,
-0.3364903926849365,
-0.1210692971944809,
0.009328621439635754,
-0.038398873060941696,
-0.008108982816338539,
0.020614437758922577,
0.07131402939558029,
-0.09403225034475327,
0.04821139574050903,
-0.0362294502556324,
0.09043996036052704,
-0.05982716754078865,
0.07413699477910995,
0.038594771176576614,
0.08087748289108276,
0.010659557767212391,
0.08424483239650726,
-0.2947792410850525,
0.2534107267856598,
-0.00810191035270691,
0.06497480720281601,
-0.08091261982917786,
-0.01184703130275011,
0.06537502259016037,
0.09609895944595337,
0.04414305463433266,
0.002331349765881896,
-0.018654609099030495,
-0.19306962192058563,
-0.018869740888476372,
0.04321875795722008,
0.06184743344783783,
-0.015425941906869411,
0.08417142927646637,
-0.029332710430026054,
0.00955874752253294,
0.07974360138177872,
0.028162291273474693,
-0.057931382209062576,
-0.09090303629636765,
-0.022990215569734573,
0.04704443737864494,
-0.015600204467773438,
-0.045724231749773026,
-0.11931299418210983,
-0.09721405059099197,
0.15155331790447235,
0.044371142983436584,
-0.019604355096817017,
-0.10979019105434418,
0.09539537876844406,
0.04761390760540962,
-0.08890844881534576,
0.02132950723171234,
0.014055398292839527,
0.07736022025346756,
0.03865500167012215,
-0.07547705620527267,
0.1149003654718399,
-0.07096195220947266,
-0.1644326150417328,
-0.06037333607673645,
0.07393228262662888,
0.04916252940893173,
0.0796942189335823,
0.0051424261182546616,
-0.0009198336047120392,
-0.055252786725759506,
-0.08910170942544937,
0.020211011171340942,
0.03489546850323677,
0.04808255285024643,
0.034120142459869385,
-0.03045644611120224,
0.015471961349248886,
-0.0673486515879631,
-0.034935854375362396,
0.19833578169345856,
0.2386687695980072,
-0.08954256027936935,
0.016977472230792046,
0.021722478792071342,
-0.06465774774551392,
-0.19735032320022583,
0.06227459758520126,
0.05611370876431465,
0.010798720642924309,
0.025975901633501053,
-0.18192289769649506,
0.1453780233860016,
0.08655456453561783,
-0.007691761013120413,
0.07976078987121582,
-0.26665908098220825,
-0.11523359268903732,
0.13031858205795288,
0.1260422021150589,
0.16414883732795715,
-0.13671711087226868,
0.0122136902064085,
-0.06720056384801865,
-0.12372434884309769,
0.11754109710454941,
-0.08741986751556396,
0.11424801498651505,
-0.014346311800181866,
0.14064686000347137,
0.0068334476090967655,
-0.0329703725874424,
0.12770220637321472,
0.041200120002031326,
0.10479382425546646,
-0.07635101675987244,
-0.021439481526613235,
0.016250506043434143,
-0.039976149797439575,
0.02684229612350464,
-0.09443951398134232,
0.028718117624521255,
-0.12265197187662125,
-0.028627296909689903,
-0.09216779470443726,
0.02566373161971569,
-0.026948656886816025,
-0.06901425868272781,
-0.05524992570281029,
0.03063664399087429,
0.08762503415346146,
-0.006828391924500465,
0.07102887332439423,
0.020691733807325363,
0.09297869354486465,
0.08027885109186172,
0.09191276133060455,
-0.08809100091457367,
-0.041810568422079086,
-0.006898632273077965,
-0.01486660074442625,
0.048490751534700394,
-0.16430968046188354,
0.028864163905382156,
0.13761098682880402,
0.009432867169380188,
0.1662234365940094,
0.08525241166353226,
-0.019765513017773628,
0.01667318306863308,
0.07029632478952408,
-0.14970706403255463,
-0.06803136318922043,
-0.015509257093071938,
-0.07447769492864609,
-0.12254655361175537,
0.018939148634672165,
0.0829823762178421,
-0.07152431458234787,
-0.005153258331120014,
-0.025090772658586502,
0.015569963492453098,
-0.05620339885354042,
0.1653214991092682,
0.04448036476969719,
0.019839588552713394,
-0.12278029322624207,
0.06752966344356537,
0.01987207680940628,
-0.09545724838972092,
0.028452662751078606,
0.07665861397981644,
-0.08429449796676636,
-0.05325903370976448,
0.09557914733886719,
0.21082277595996857,
-0.0960102453827858,
-0.050118766725063324,
-0.1383187174797058,
-0.12895193696022034,
0.08277668058872223,
0.147922083735466,
0.12458623945713043,
0.013107525184750557,
-0.07535496354103088,
0.013204246759414673,
-0.12437768280506134,
0.063705675303936,
0.06408239901065826,
0.04230353981256485,
-0.13550983369350433,
0.1301393359899521,
-0.009844956919550896,
0.04304855689406395,
-0.02740413323044777,
0.018398283049464226,
-0.09641941636800766,
0.0046887774951756,
-0.1496344357728958,
-0.023496318608522415,
-0.0476854145526886,
0.020174484699964523,
0.01238661352545023,
-0.04850855469703674,
-0.04659693315625191,
0.002793748863041401,
-0.12286591529846191,
-0.018324406817555428,
0.037723079323768616,
0.07366707175970078,
-0.11104287207126617,
-0.04684515297412872,
0.026100939139723778,
-0.062005266547203064,
0.09730188548564911,
0.06898394227027893,
0.013331742957234383,
0.0756540298461914,
-0.18082796037197113,
-0.0014747988898307085,
0.0991440936923027,
0.008704593405127525,
0.07305693626403809,
-0.06861980259418488,
-0.010665862821042538,
0.006241464521735907,
0.05584780126810074,
0.02142920345067978,
0.08539486676454544,
-0.11519933491945267,
0.008597702719271183,
0.006555729545652866,
-0.08238188922405243,
-0.06322184205055237,
0.024043744429945946,
0.08408865332603455,
0.009513869881629944,
0.20369140803813934,
-0.07847325503826141,
0.03354437276721001,
-0.20523221790790558,
0.002020003739744425,
-0.01642012782394886,
-0.10718482732772827,
-0.1637532114982605,
-0.07460491359233856,
0.06588800996541977,
-0.04416094347834587,
0.1398112177848816,
0.04374057054519653,
0.03307079151272774,
0.011494459584355354,
0.012689732946455479,
0.02825338765978813,
0.0090336874127388,
0.2003679722547531,
0.04580879211425781,
-0.05159951373934746,
0.06747230142354965,
0.06456563621759415,
0.12910380959510803,
0.1281774491071701,
0.19757144153118134,
0.13479891419410706,
0.03185226395726204,
0.11070632189512253,
0.014007211662828922,
-0.033643629401922226,
-0.1372695416212082,
0.004359454847872257,
-0.06263713538646698,
0.10806780308485031,
-0.02189444564282894,
0.22573760151863098,
0.039187006652355194,
-0.1555841863155365,
0.05546140298247337,
-0.08695310354232788,
-0.08578979969024658,
-0.10190063714981079,
-0.02952645532786846,
-0.09378514438867569,
-0.16378207504749298,
0.0020933353807777166,
-0.14329195022583008,
0.00934809260070324,
0.09365704655647278,
0.002653011353686452,
-0.0433962307870388,
0.11525096744298935,
-0.0001004791192826815,
0.021353306248784065,
0.08646490424871445,
-0.0060415612533688545,
-0.06560572236776352,
-0.10497012734413147,
-0.056779276579618454,
-0.019362835213541985,
-0.015788014978170395,
0.047574784606695175,
-0.04913879558444023,
-0.07101558148860931,
0.028086058795452118,
-0.03475792333483696,
-0.10055124759674072,
0.012688787654042244,
0.02194829471409321,
0.0597555935382843,
0.04190470278263092,
0.0012409627670422196,
0.022064633667469025,
0.005562343634665012,
0.2071109265089035,
-0.07967577874660492,
-0.040605779737234116,
-0.11979120969772339,
0.2614520490169525,
0.02409188635647297,
-0.00847665872424841,
0.028826074674725533,
-0.0748862698674202,
-0.015872633084654808,
0.23906353116035461,
0.20528066158294678,
-0.11244872212409973,
-0.008565574884414673,
-0.027429740875959396,
0.005186484660953283,
-0.052260622382164,
0.110774926841259,
0.15180477499961853,
-0.004714897833764553,
-0.09995263069868088,
-0.016953138634562492,
-0.06189150735735893,
-0.02808999828994274,
-0.0202131699770689,
0.03342815116047859,
0.05475250631570816,
0.008408638648688793,
-0.04111814871430397,
0.05647978559136391,
-0.099273182451725,
-0.08005142956972122,
0.06792514771223068,
-0.21145692467689514,
-0.15397273004055023,
-0.015937436372041702,
0.08509520441293716,
0.048697929829359055,
0.0800035148859024,
-0.016431378200650215,
-0.002645643427968025,
0.117641881108284,
-0.028454957529902458,
-0.12428157031536102,
-0.08032490313053131,
0.11378723382949829,
-0.1316385716199875,
0.18199096620082855,
-0.06240404397249222,
0.05213576927781105,
0.12598100304603577,
0.07409706711769104,
-0.06659049540758133,
0.0804297924041748,
0.04183397814631462,
-0.06175728514790535,
0.01615973189473152,
0.10328328609466553,
-0.04408210888504982,
0.07651318609714508,
0.04349488019943237,
-0.16186243295669556,
0.015078894793987274,
-0.035612933337688446,
-0.056856367737054825,
-0.04916609823703766,
-0.026963086798787117,
-0.06604232639074326,
0.11765444278717041,
0.22687256336212158,
-0.035288915038108826,
0.012160087935626507,
-0.06663110107183456,
0.007262803614139557,
0.038125280290842056,
-0.002861590590327978,
-0.07401587069034576,
-0.20804981887340546,
0.006127515807747841,
0.0839976891875267,
-0.013729589059948921,
-0.23246710002422333,
-0.08485031872987747,
-0.017158152535557747,
-0.06471166759729385,
-0.08071030676364899,
0.08737026154994965,
0.049324117600917816,
0.04797031730413437,
-0.045921504497528076,
-0.07547707855701447,
-0.06349248439073563,
0.17375092208385468,
-0.1276097595691681,
-0.10077737271785736
] |
null | null | transformers |
# distilbert-base-uncased-finetuned-emotion
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on the emotion dataset.
It achieves the following results on the evaluation set:
- Loss: 0.2098
- Accuracy: 0.9255
- F1: 0.9256
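To try the checkpoint against these numbers, it can be loaded with the standard `pipeline` API. A minimal sketch — the example sentence and the printed label are illustrative only:

```python
from transformers import pipeline

# Load the fine-tuned checkpoint from the Hub.
classifier = pipeline(
    "text-classification",
    model="tiredbear/distilbert-base-uncased-finetuned-emotion",
)

# Labels follow the six classes of the emotion dataset
# (sadness, joy, love, anger, fear, surprise).
print(classifier("I can't wait to see you again!"))
```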
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 64
- eval_batch_size: 64
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 2
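For reference, these settings correspond roughly to the following `TrainingArguments` — a sketch only; model initialization, dataset tokenization, and the `compute_metrics` function are assumed to be defined elsewhere:

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="distilbert-base-uncased-finetuned-emotion",
    learning_rate=2e-5,
    per_device_train_batch_size=64,
    per_device_eval_batch_size=64,
    num_train_epochs=2,
    seed=42,
    lr_scheduler_type="linear",       # Adam betas/epsilon above are the defaults
    evaluation_strategy="epoch",      # assumption: per-epoch eval, matching the results table
)
```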
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:------:|
| 0.8453 | 1.0 | 250 | 0.3061 | 0.91 | 0.9094 |
| 0.2487 | 2.0 | 500 | 0.2098 | 0.9255 | 0.9256 |
### Framework versions
- Transformers 4.35.2
- Pytorch 2.1.0+cu121
- Datasets 2.17.0
- Tokenizers 0.15.1
| {"license": "apache-2.0", "tags": ["generated_from_trainer"], "datasets": ["emotion"], "metrics": ["accuracy", "f1"], "base_model": "distilbert-base-uncased", "model-index": [{"name": "distilbert-base-uncased-finetuned-emotion", "results": [{"task": {"type": "text-classification", "name": "Text Classification"}, "dataset": {"name": "emotion", "type": "emotion", "config": "split", "split": "validation", "args": "split"}, "metrics": [{"type": "accuracy", "value": 0.9255, "name": "Accuracy"}, {"type": "f1", "value": 0.9256033121528526, "name": "F1"}]}]}]} | text-classification | tiredbear/distilbert-base-uncased-finetuned-emotion | [
"transformers",
"tensorboard",
"safetensors",
"distilbert",
"text-classification",
"generated_from_trainer",
"dataset:emotion",
"base_model:distilbert-base-uncased",
"license:apache-2.0",
"model-index",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | 2024-02-13T09:42:11+00:00 | [] | [] | TAGS
#transformers #tensorboard #safetensors #distilbert #text-classification #generated_from_trainer #dataset-emotion #base_model-distilbert-base-uncased #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #region-us
| distilbert-base-uncased-finetuned-emotion
=========================================
This model is a fine-tuned version of distilbert-base-uncased on the emotion dataset.
It achieves the following results on the evaluation set:
* Loss: 0.2098
* Accuracy: 0.9255
* F1: 0.9256
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 2e-05
* train\_batch\_size: 64
* eval\_batch\_size: 64
* seed: 42
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* num\_epochs: 2
### Training results
### Framework versions
* Transformers 4.35.2
* Pytorch 2.1.0+cu121
* Datasets 2.17.0
* Tokenizers 0.15.1
| [
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 64\n* eval\\_batch\\_size: 64\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 2",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.35.2\n* Pytorch 2.1.0+cu121\n* Datasets 2.17.0\n* Tokenizers 0.15.1"
] | [
"TAGS\n#transformers #tensorboard #safetensors #distilbert #text-classification #generated_from_trainer #dataset-emotion #base_model-distilbert-base-uncased #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 64\n* eval\\_batch\\_size: 64\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 2",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.35.2\n* Pytorch 2.1.0+cu121\n* Datasets 2.17.0\n* Tokenizers 0.15.1"
] | [
82,
98,
4,
33
] | [
"passage: TAGS\n#transformers #tensorboard #safetensors #distilbert #text-classification #generated_from_trainer #dataset-emotion #base_model-distilbert-base-uncased #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 64\n* eval\\_batch\\_size: 64\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 2### Training results### Framework versions\n\n\n* Transformers 4.35.2\n* Pytorch 2.1.0+cu121\n* Datasets 2.17.0\n* Tokenizers 0.15.1"
] | [
-0.12298925966024399,
0.16415931284427643,
-0.0022249515168368816,
0.12838715314865112,
0.12877148389816284,
0.01904618926346302,
0.1514195203781128,
0.12019442021846771,
-0.03650584816932678,
0.046193916350603104,
0.12899209558963776,
0.13060440123081207,
0.020094528794288635,
0.13745972514152527,
-0.08555696159601212,
-0.20673809945583344,
0.021901700645685196,
0.034984759986400604,
-0.019554216414690018,
0.1276279240846634,
0.10021701455116272,
-0.11031892150640488,
0.10418032854795456,
-0.0035885514225810766,
-0.13512423634529114,
-0.007379281334578991,
0.016138669103384018,
-0.038977932184934616,
0.12101870775222778,
0.019757069647312164,
0.08555757254362106,
0.03709016367793083,
0.061780042946338654,
-0.2063295543193817,
0.01633494906127453,
0.037308480590581894,
-0.006699043791741133,
0.08305735141038895,
0.033081576228141785,
-0.027233779430389404,
0.06545687466859818,
-0.09334614872932434,
0.04899849742650986,
0.025416752323508263,
-0.12426608800888062,
-0.2230583131313324,
-0.08652891963720322,
0.05607512220740318,
0.06954199075698853,
0.09302328526973724,
-0.02298552542924881,
0.12882556021213531,
-0.03278733789920807,
0.09309118241071701,
0.1929847002029419,
-0.2785753309726715,
-0.06253271549940109,
0.019933240488171577,
0.024857191368937492,
0.07781456410884857,
-0.11607067286968231,
-0.02765176258981228,
0.05328065901994705,
0.025419652462005615,
0.1408209651708603,
-0.03000331111252308,
-0.0000100845136330463,
-0.008367571979761124,
-0.122127965092659,
-0.0467156320810318,
0.19661131501197815,
0.08830983191728592,
-0.05298612639307976,
-0.08292418718338013,
-0.062137067317962646,
-0.12344267219305038,
-0.02369128353893757,
-0.015646615996956825,
0.048551563173532486,
-0.0006837965920567513,
-0.07165957242250443,
-0.018645696341991425,
-0.10995084792375565,
-0.039818160235881805,
-0.014501640573143959,
0.13194607198238373,
0.011023708619177341,
0.0013879250036552548,
0.006495350040495396,
0.09365515410900116,
-0.005687214434146881,
-0.15282344818115234,
0.011868472211062908,
0.008473686873912811,
0.02234680764377117,
-0.03086315095424652,
-0.06160522252321243,
-0.08252810686826706,
0.007355652283877134,
0.12234635651111603,
-0.06391282379627228,
0.0381796732544899,
0.02186567336320877,
0.02776717022061348,
-0.0757814571261406,
0.18540902435779572,
-0.03730515390634537,
-0.04703554883599281,
0.018810899928212166,
0.12801603972911835,
0.06705563515424728,
-0.01113202329725027,
-0.11687757074832916,
0.036215994507074356,
0.12639904022216797,
0.008259239606559277,
-0.06383469700813293,
0.07295916974544525,
-0.09291710704565048,
-0.03045615740120411,
0.040017351508140564,
-0.08345825970172882,
0.010006575845181942,
0.010459926910698414,
-0.051640018820762634,
-0.05533228814601898,
0.02033049426972866,
0.023422755300998688,
0.008880945853888988,
0.05069858580827713,
-0.08697323501110077,
-0.004888592287898064,
-0.06064574047923088,
-0.09836641699075699,
0.013786204159259796,
-0.07303544878959656,
0.0294688418507576,
-0.10141290724277496,
-0.22318731248378754,
-0.023288331925868988,
0.06568637490272522,
-0.015711693093180656,
-0.05751540884375572,
-0.0765494853258133,
-0.05566942319273949,
0.01963070221245289,
-0.0009356812224723399,
0.02586907520890236,
-0.06492088735103607,
0.08844626694917679,
0.043852370232343674,
0.06694947183132172,
-0.04401993751525879,
0.03836948052048683,
-0.12762954831123352,
0.04506474360823631,
-0.12825657427310944,
0.064083531498909,
-0.03988830745220184,
0.08981011807918549,
-0.07992232590913773,
-0.07603347301483154,
0.015640147030353546,
-0.02602706104516983,
0.05740893632173538,
0.1261247992515564,
-0.16774190962314606,
-0.07420198619365692,
0.16842208802700043,
-0.08345673978328705,
-0.15481406450271606,
0.13748565316200256,
-0.05880170688033104,
0.07106537371873856,
0.07528718560934067,
0.20262359082698822,
0.06439574062824249,
-0.048542819917201996,
-0.022134937345981598,
-0.005324368830770254,
0.07184375822544098,
-0.00817684456706047,
0.09434540569782257,
0.008535358123481274,
-0.004509931895881891,
0.022837214171886444,
-0.05818382650613785,
0.060154400765895844,
-0.06362628936767578,
-0.10318101197481155,
-0.03899693489074707,
-0.1149534285068512,
0.07968122512102127,
0.06385963410139084,
0.05263492837548256,
-0.10782752186059952,
-0.08684656769037247,
0.011198734864592552,
0.0920170322060585,
-0.08252817392349243,
0.01560035441070795,
-0.0750403180718422,
0.0779593288898468,
-0.06215186044573784,
-0.012691549956798553,
-0.14994220435619354,
0.0015186409000307322,
0.025656571611762047,
0.008467240259051323,
-0.003673878498375416,
-0.010716867633163929,
0.07532812654972076,
0.057596828788518906,
-0.07332060486078262,
-0.06730838119983673,
-0.036077093333005905,
0.009738643653690815,
-0.09620614349842072,
-0.20298953354358673,
-0.013542743399739265,
-0.04094722494482994,
0.2085779309272766,
-0.22458098828792572,
0.055817391723394394,
0.001140425680205226,
0.06879791617393494,
0.03843928128480911,
-0.03587650880217552,
-0.0029504087287932634,
0.03329670801758766,
-0.04915175959467888,
-0.07146980613470078,
0.07085344940423965,
0.02826550044119358,
-0.1311282217502594,
-0.0149794090539217,
-0.14575673639774323,
0.16261208057403564,
0.11418330669403076,
-0.03650867938995361,
-0.048777587711811066,
-0.0063068438321352005,
-0.04165167361497879,
-0.020295223221182823,
-0.018865853548049927,
0.009229526855051517,
0.1351911425590515,
0.007918436080217361,
0.1456819772720337,
-0.08592311292886734,
-0.021741749718785286,
0.017022565007209778,
-0.04692652076482773,
-0.013722056522965431,
0.11608871072530746,
0.013309158384799957,
-0.14549824595451355,
0.14500054717063904,
0.1914234459400177,
-0.06624388694763184,
0.14039725065231323,
-0.042862098664045334,
-0.04359167069196701,
-0.049233049154281616,
-0.00044229652849026024,
0.008725951425731182,
0.10347870737314224,
-0.11197403818368912,
0.002216064603999257,
0.01330116018652916,
-0.00366339017637074,
-0.00963506568223238,
-0.19666317105293274,
-0.038123928010463715,
0.06050406023859978,
-0.04967494308948517,
0.004861168097704649,
-0.009077387861907482,
-0.02050962671637535,
0.08195244520902634,
0.005004735663533211,
-0.06806423515081406,
0.05337528884410858,
-0.008175288327038288,
-0.08379372954368591,
0.19792059063911438,
-0.08355582505464554,
-0.1820179522037506,
-0.1360347718000412,
-0.04509621486067772,
-0.08249577134847641,
0.03641597554087639,
0.06329063326120377,
-0.07827824354171753,
-0.02807299606502056,
-0.11779822409152985,
-0.022146549075841904,
0.026902813464403152,
0.01316885743290186,
0.046345897018909454,
-0.02449585497379303,
0.09087736159563065,
-0.09302917122840881,
-0.015289775095880032,
-0.015670262277126312,
-0.02726726606488228,
0.043986979871988297,
0.0003908520156983286,
0.11644674837589264,
0.1344602108001709,
-0.0024111997336149216,
0.001982764108106494,
-0.02674761414527893,
0.24015909433364868,
-0.060976624488830566,
-0.021268535405397415,
0.14057914912700653,
-0.025824066251516342,
0.0664115697145462,
0.14003711938858032,
0.04754038155078888,
-0.10216362029314041,
0.01825735531747341,
0.02516135387122631,
-0.022555982694029808,
-0.19683779776096344,
-0.023665999993681908,
-0.03820670023560524,
0.013110728934407234,
0.0923033356666565,
0.02500477060675621,
0.05578959360718727,
0.07980433851480484,
0.011307960376143456,
0.046082641929388046,
-0.01593147963285446,
0.07934863120317459,
0.11102430522441864,
0.03545748442411423,
0.10672835260629654,
-0.02969278022646904,
-0.036512281745672226,
0.05040682479739189,
-0.005518865305930376,
0.17186596989631653,
0.0012750305468216538,
0.19665370881557465,
0.0376354418694973,
0.16429176926612854,
-0.03006909415125847,
0.057630978524684906,
-0.009824772365391254,
-0.02506519854068756,
-0.029358426108956337,
-0.042727939784526825,
-0.07352055609226227,
0.047611966729164124,
-0.06342733651399612,
0.09695445001125336,
-0.11881578713655472,
0.016130482777953148,
0.06686265766620636,
0.27346566319465637,
0.05122973397374153,
-0.3381480276584625,
-0.118779256939888,
0.03163718804717064,
-0.012638452462852001,
-0.024628782644867897,
0.006265256088227034,
0.11430396884679794,
-0.0633852407336235,
0.053050447255373,
-0.07694634050130844,
0.0762980729341507,
-0.06483963876962662,
0.05347886681556702,
0.017030466347932816,
0.05816536024212837,
-0.003607850056141615,
0.07206951081752777,
-0.2507160007953644,
0.237730011343956,
0.011494971811771393,
0.06829703599214554,
-0.05151974409818649,
-0.004263938404619694,
0.06421706080436707,
0.0902511328458786,
0.08569177985191345,
-0.0013113958993926644,
0.00467272661626339,
-0.17602375149726868,
-0.06630519032478333,
0.022091830149292946,
0.0539868026971817,
-0.06566138565540314,
0.09515029937028885,
-0.03607471287250519,
0.005177485756576061,
0.06704428791999817,
0.042426832020282745,
-0.07040222734212875,
-0.10223818570375443,
0.002481512725353241,
0.05932764708995819,
0.0026643802411854267,
-0.08312251418828964,
-0.10207563638687134,
-0.09500055015087128,
0.15390591323375702,
-0.012838419526815414,
-0.04031655564904213,
-0.1032203808426857,
0.05126528441905975,
0.04013681784272194,
-0.08564738184213638,
0.019136466085910797,
-0.003202908206731081,
0.11475201696157455,
0.01415063627064228,
-0.05189846456050873,
0.10044047981500626,
-0.06313786655664444,
-0.1765647828578949,
-0.05559659004211426,
0.10916933417320251,
0.02679434046149254,
0.04905441030859947,
0.0072315763682127,
0.006358620710670948,
-0.04452360421419144,
-0.06758596748113632,
0.04355176165699959,
0.02181336283683777,
0.04135998710989952,
0.01441817544400692,
-0.01724773831665516,
-0.004459582734853029,
-0.07943835109472275,
-0.029633482918143272,
0.16276660561561584,
0.2990800440311432,
-0.0724276676774025,
0.008852411061525345,
0.053898707032203674,
-0.053623735904693604,
-0.17211347818374634,
0.02427157387137413,
0.03177283704280853,
0.015426634810864925,
0.0598289929330349,
-0.14455188810825348,
0.0749862790107727,
0.06510473042726517,
-0.0287938192486763,
0.0798645168542862,
-0.2513330578804016,
-0.12475714832544327,
0.12642055749893188,
0.14921393990516663,
0.1332807093858719,
-0.15596584975719452,
-0.038069210946559906,
-0.04890201613306999,
-0.10739458352327347,
0.10282620787620544,
-0.10546713322401047,
0.10821212828159332,
-0.006642368156462908,
0.07144784927368164,
0.012308468110859394,
-0.037015244364738464,
0.1497841775417328,
-0.0053465948440134525,
0.10033758729696274,
-0.05874332785606384,
-0.012331431731581688,
0.07007947564125061,
-0.07364233583211899,
0.03307785093784332,
-0.140011727809906,
0.04514949768781662,
-0.10610543936491013,
-0.03220716491341591,
-0.06697490066289902,
0.02192913554608822,
-0.03276041895151138,
-0.06069423258304596,
-0.03603912517428398,
0.04129990562796593,
0.09074567258358002,
0.004461248405277729,
0.13149239122867584,
0.012619505636394024,
0.11043127626180649,
0.15007495880126953,
0.09350668638944626,
-0.045194126665592194,
-0.03603368252515793,
-0.031068192794919014,
-0.0329701267182827,
0.051759686321020126,
-0.15761122107505798,
0.038663070648908615,
0.10641254484653473,
0.008402816019952297,
0.17831099033355713,
0.05722370743751526,
-0.0326821506023407,
0.015341658145189285,
0.061104100197553635,
-0.1644946038722992,
-0.12632444500923157,
-0.03550735488533974,
-0.023713625967502594,
-0.16105681657791138,
0.02984657511115074,
0.11732334643602371,
-0.05834845080971718,
-0.0014169812202453613,
-0.015113011002540588,
0.02170756831765175,
-0.021074876189231873,
0.14048346877098083,
0.04846009239554405,
0.03503165766596794,
-0.09008875489234924,
0.09292999655008316,
0.039195068180561066,
-0.0952715054154396,
0.0248859915882349,
0.012673599645495415,
-0.0900382325053215,
-0.04714033380150795,
0.03031144291162491,
0.19745920598506927,
-0.040961854159832,
-0.051432423293590546,
-0.15959081053733826,
-0.10266458988189697,
0.051544420421123505,
0.1154305636882782,
0.09952162951231003,
0.023818695917725563,
-0.04972377419471741,
-0.0019818672444671392,
-0.09349818527698517,
0.11611457169055939,
0.06941138207912445,
0.061568208038806915,
-0.16531093418598175,
0.0715322345495224,
-0.01668120175600052,
0.014330887235701084,
-0.016147714108228683,
0.01841355860233307,
-0.07997079938650131,
-0.019082441926002502,
-0.13620004057884216,
0.0056248800829052925,
-0.04218827933073044,
0.017805742099881172,
-0.0014917199732735753,
-0.05678331479430199,
-0.04375004023313522,
0.010821771807968616,
-0.09677516669034958,
-0.03369671106338501,
0.04078563302755356,
0.0690656453371048,
-0.11939346790313721,
-0.051640816032886505,
0.029735703021287918,
-0.07973416894674301,
0.08691716939210892,
0.03120613284409046,
0.009364795871078968,
0.03232576325535774,
-0.15181834995746613,
0.04097861424088478,
0.061356496065855026,
0.0010428341338410974,
0.018393993377685547,
-0.10278753936290741,
-0.026722000911831856,
-0.0025697823148220778,
0.007415526546537876,
0.016234593465924263,
0.10535606741905212,
-0.11115217953920364,
0.007499100640416145,
0.011769969016313553,
-0.037147048860788345,
-0.06605912744998932,
0.037295207381248474,
0.07689788937568665,
0.021435894072055817,
0.22531521320343018,
-0.08321985602378845,
0.01358514092862606,
-0.20898322761058807,
0.008152627386152744,
0.004840954206883907,
-0.11456567794084549,
-0.1384253352880478,
-0.06088176742196083,
0.03864532336592674,
-0.05609184131026268,
0.1072925254702568,
-0.005791793577373028,
0.02195131406188011,
0.019817862659692764,
0.0024811180774122477,
0.05772358551621437,
0.00891666579991579,
0.2178145945072174,
0.016372209414839745,
-0.04919351637363434,
0.0604318343102932,
0.027111297473311424,
0.11198686808347702,
0.11162023991346359,
0.1395191252231598,
0.1593293696641922,
-0.004752764478325844,
0.10206945240497589,
0.011167536489665508,
0.003676127642393112,
-0.14276158809661865,
0.04454519972205162,
-0.03397553041577339,
0.10183025151491165,
0.004331657197326422,
0.23346753418445587,
0.08796288818120956,
-0.16622917354106903,
0.03672078251838684,
-0.04876039922237396,
-0.07527824491262436,
-0.07965926080942154,
-0.10067258030176163,
-0.09405376762151718,
-0.13725221157073975,
-0.005322910379618406,
-0.12106827646493912,
0.0037962363567203283,
0.07040152698755264,
-0.00980472657829523,
-0.04000251367688179,
0.14826920628547668,
-0.004319100175052881,
0.01474982313811779,
0.08450791984796524,
-0.01017678715288639,
-0.07189807295799255,
-0.06506330519914627,
-0.08639872819185257,
0.020230503752827644,
-0.0006633383454754949,
0.04267972707748413,
-0.04681497439742088,
-0.024334469810128212,
0.030356328934431076,
-0.016857890412211418,
-0.12031431496143341,
0.010171568021178246,
0.02395612746477127,
0.04880282282829285,
0.044744398444890976,
0.015952477231621742,
0.00639567943289876,
0.01082262210547924,
0.2127608209848404,
-0.06947115808725357,
-0.01553473062813282,
-0.1102161779999733,
0.16648617386817932,
0.0038185124285519123,
-0.013057523407042027,
0.02416670322418213,
-0.09783343225717545,
0.042867351323366165,
0.1887388378381729,
0.16507670283317566,
-0.09282473474740982,
0.004849682096391916,
-0.03368780389428139,
-0.0017054337076842785,
-0.04403488710522652,
0.06038481742143631,
0.09466515481472015,
-0.05141542851924896,
-0.08203145116567612,
0.002195686334744096,
-0.04958672448992729,
-0.023554332554340363,
-0.023252420127391815,
0.04895400255918503,
0.02790220081806183,
0.01331760548055172,
-0.05317055433988571,
0.055212270468473434,
-0.04259690269827843,
-0.08908868581056595,
0.05131212994456291,
-0.1913837343454361,
-0.13959868252277374,
-0.04022086411714554,
0.07758738100528717,
0.016668090596795082,
0.053110647946596146,
-0.011013668961822987,
0.0170731358230114,
0.08375632762908936,
-0.02727385051548481,
-0.07615333795547485,
-0.057335834950208664,
0.07826893031597137,
-0.10985586792230606,
0.2154049277305603,
-0.046016622334718704,
0.01907510496675968,
0.12245169281959534,
0.03985793516039848,
-0.0819580927491188,
0.08338607102632523,
0.046950891613960266,
-0.02591324783861637,
0.033736422657966614,
0.10738630592823029,
-0.03027603216469288,
0.12614227831363678,
0.05635765939950943,
-0.12484896183013916,
0.002519885776564479,
-0.021606534719467163,
-0.07779785990715027,
-0.05144015699625015,
-0.02547837793827057,
-0.05639272183179855,
0.13560938835144043,
0.18014462292194366,
-0.05217359960079193,
-0.003382803173735738,
-0.04032931104302406,
0.01732526533305645,
0.07948071509599686,
0.016613582149147987,
-0.03237352892756462,
-0.20988033711910248,
0.01582900993525982,
0.07813900709152222,
0.009509092196822166,
-0.3007434606552124,
-0.10395227372646332,
-0.01963624358177185,
-0.04528871551156044,
-0.06816833466291428,
0.09115426242351532,
0.08285064995288849,
0.048586610704660416,
-0.051645874977111816,
-0.054796792566776276,
-0.07190260291099548,
0.16186556220054626,
-0.11737532913684845,
-0.089300237596035
] |
null | null | diffusers | ### My-Pet-cat-XZG Dreambooth model trained by shafi4 following the "Build your own Gen AI model" session by NxtWave.
Project Submission Code: 21KT1A0559
Sample pictures of this concept:
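The concept can be tried with the standard `StableDiffusionPipeline`. A minimal sketch — the instance token in the prompt is an assumption and should match the identifier chosen during training:

```python
import torch
from diffusers import StableDiffusionPipeline

# Load the fine-tuned Dreambooth weights from the Hub.
pipe = StableDiffusionPipeline.from_pretrained(
    "shafi4/my-pet-cat-xzg", torch_dtype=torch.float16
).to("cuda")

# "my-pet-cat-xzg" as the instance token is an assumption; replace it with
# the exact identifier used when the model was trained.
image = pipe("a photo of my-pet-cat-xzg cat sitting on a sofa").images[0]
image.save("my-pet-cat.png")
```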
| {"license": "creativeml-openrail-m", "tags": ["NxtWave-GenAI-Webinar", "text-to-image", "stable-diffusion"]} | text-to-image | shafi4/my-pet-cat-xzg | [
"diffusers",
"safetensors",
"NxtWave-GenAI-Webinar",
"text-to-image",
"stable-diffusion",
"license:creativeml-openrail-m",
"endpoints_compatible",
"diffusers:StableDiffusionPipeline",
"region:us"
] | 2024-02-13T09:42:57+00:00 | [] | [] | TAGS
#diffusers #safetensors #NxtWave-GenAI-Webinar #text-to-image #stable-diffusion #license-creativeml-openrail-m #endpoints_compatible #diffusers-StableDiffusionPipeline #region-us
| ### My-Pet-cat-XZG Dreambooth model trained by shafi4 following the "Build your own Gen AI model" session by NxtWave.
Project Submission Code: 21KT1A0559
Sample pictures of this concept:
| [
"### My-Pet-cat-XZG Dreambooth model trained by shafi4 following the \"Build your own Gen AI model\" session by NxtWave.\n\nProject Submission Code: 21KT1A0559\n\nSample pictures of this concept:"
] | [
"TAGS\n#diffusers #safetensors #NxtWave-GenAI-Webinar #text-to-image #stable-diffusion #license-creativeml-openrail-m #endpoints_compatible #diffusers-StableDiffusionPipeline #region-us \n",
"### My-Pet-cat-XZG Dreambooth model trained by shafi4 following the \"Build your own Gen AI model\" session by NxtWave.\n\nProject Submission Code: 21KT1A0559\n\nSample pictures of this concept:"
] | [
73,
56
] | [
"passage: TAGS\n#diffusers #safetensors #NxtWave-GenAI-Webinar #text-to-image #stable-diffusion #license-creativeml-openrail-m #endpoints_compatible #diffusers-StableDiffusionPipeline #region-us \n### My-Pet-cat-XZG Dreambooth model trained by shafi4 following the \"Build your own Gen AI model\" session by NxtWave.\n\nProject Submission Code: 21KT1A0559\n\nSample pictures of this concept:"
] | [
-0.11079604923725128,
0.12257757037878036,
-0.0009406926110386848,
-0.016832545399665833,
0.056608863174915314,
-0.02287440001964569,
0.17800168693065643,
0.006988749373704195,
0.06342680752277374,
0.03606550022959709,
0.12461967766284943,
0.07338154315948486,
0.03270207345485687,
0.19509629905223846,
-0.029260652139782906,
-0.10422861576080322,
0.05077655613422394,
0.09189683198928833,
0.02436392568051815,
0.0708155706524849,
0.07532859593629837,
-0.0724271610379219,
0.1202562227845192,
-0.0184318944811821,
-0.1441134810447693,
-0.012151535600423813,
-0.02961132675409317,
-0.03992687910795212,
0.045083142817020416,
0.038813576102256775,
0.055304933339357376,
0.1291254460811615,
0.03050042688846588,
-0.034420665353536606,
0.038283850997686386,
0.016420213505625725,
-0.06597287207841873,
0.04556847736239433,
0.05591031163930893,
0.03174131363630295,
0.12245593965053558,
0.04020310565829277,
-0.06014611944556236,
0.04835839942097664,
-0.05015287175774574,
-0.05157874897122383,
0.03259975090622902,
0.10907544940710068,
0.11500085890293121,
0.09296438097953796,
-0.00023716312716715038,
0.10646574944257736,
0.055140212178230286,
0.11915092915296555,
0.14390361309051514,
-0.2941153943538666,
-0.09360231459140778,
0.1832304298877716,
0.10425411909818649,
0.06253707408905029,
-0.04981616139411926,
0.11301314830780029,
0.10555046051740646,
-0.029018891975283623,
0.042106993496418,
-0.06105150654911995,
0.07351550459861755,
-0.08914334326982498,
-0.11252740025520325,
0.014536191709339619,
0.24422408640384674,
0.06653039902448654,
-0.034935448318719864,
-0.06161278858780861,
-0.08914799243211746,
-0.0032848804257810116,
-0.05549084395170212,
-0.024035697802901268,
-0.0600910559296608,
0.017260296270251274,
-0.007151402998715639,
-0.03222782537341118,
-0.12814953923225403,
-0.06517894566059113,
-0.022256925702095032,
0.14573267102241516,
-0.0005504075670614839,
0.060279250144958496,
-0.12014490365982056,
0.0943494513630867,
0.01612229086458683,
-0.12122971564531326,
0.006411634851247072,
-0.08986695855855942,
0.03666459023952484,
0.06954006105661392,
0.0418819785118103,
-0.051817167550325394,
0.06855224817991257,
0.009012144058942795,
0.05591379851102829,
-0.006444271188229322,
0.035209909081459045,
0.08873919397592545,
0.022640520706772804,
-0.05980588123202324,
-0.08726014196872711,
-0.11765467375516891,
-0.003302263794466853,
-0.04581005498766899,
-0.004916287958621979,
-0.045630574226379395,
-0.0911712497472763,
0.01004770863801241,
-0.04348001256585121,
0.039053693413734436,
0.048355914652347565,
0.06742940098047256,
-0.00841712485998869,
-0.037612855434417725,
0.18613804876804352,
0.049716558307409286,
-0.016576118767261505,
-0.023162994533777237,
0.0044405702501535416,
0.06414242833852768,
0.05689264088869095,
-0.02005269192159176,
0.0039008420426398516,
0.003695561084896326,
-0.0967138409614563,
-0.0528801791369915,
-0.032786332070827484,
-0.03119966946542263,
-0.006887597031891346,
-0.15169069170951843,
0.04444415494799614,
-0.17552168667316437,
-0.09410353004932404,
0.0426769033074379,
0.06733555346727371,
-0.017626158893108368,
-0.0354471318423748,
-0.046460025012493134,
-0.08582179248332977,
0.01875537820160389,
-0.01967674121260643,
-0.015588665381073952,
-0.025288814678788185,
0.03916971758008003,
-0.011231631971895695,
0.08622913807630539,
-0.23470152914524078,
0.005508304573595524,
-0.05038272216916084,
0.02659880928695202,
0.018613561987876892,
-0.01490844041109085,
-0.05747876316308975,
0.06690449267625809,
-0.0054624564945697784,
-0.009196391329169273,
-0.026305342093110085,
-0.0021754279732704163,
-0.0003618116315919906,
0.12933337688446045,
-0.08391741663217545,
0.008003266528248787,
0.14816917479038239,
-0.12369377911090851,
-0.16895218193531036,
0.08504724502563477,
0.06074603646993637,
0.10262294113636017,
0.06769351661205292,
0.11002255976200104,
0.10727233439683914,
-0.20662769675254822,
-0.046212274581193924,
0.051914915442466736,
-0.13611917197704315,
-0.15998968482017517,
0.005624893121421337,
0.14977093040943146,
-0.058345966041088104,
0.013481199741363525,
-0.08963029086589813,
0.08229125291109085,
-0.08606448769569397,
-0.026836078613996506,
-0.03833640366792679,
-0.12545032799243927,
-0.015149938873946667,
0.009713162668049335,
0.022011887282133102,
-0.01841207966208458,
0.0003033889806829393,
-0.15196163952350616,
0.0555695965886116,
-0.03823248669505119,
-0.01717953197658062,
-0.14069107174873352,
0.08921433985233307,
-0.06179872527718544,
0.011915492825210094,
-0.0205878596752882,
-0.06452279537916183,
0.0382559634745121,
0.13538561761379242,
-0.016339890658855438,
0.1797798126935959,
0.04838748276233673,
0.056595731526613235,
-0.013568775728344917,
-0.09088683873414993,
0.11671469360589981,
0.02086736261844635,
-0.05013499781489372,
-0.1684914231300354,
0.07862623035907745,
-0.05625004693865776,
-0.007456632796674967,
-0.16494405269622803,
0.040583137422800064,
0.05806441232562065,
0.09935609996318817,
0.031849704682826996,
-0.003242978360503912,
0.01696663722395897,
-0.03292986750602722,
-0.0640062540769577,
-0.002131835324689746,
0.06067516282200813,
0.034024614840745926,
-0.08831137418746948,
0.16027389466762543,
-0.13855953514575958,
0.20392972230911255,
0.09490185230970383,
-0.019014853984117508,
-0.012742065824568272,
0.06315793842077255,
-0.06319902837276459,
-0.0033699043560773134,
0.02293243817985058,
-0.025256555527448654,
-0.03329003229737282,
-0.06727274507284164,
0.09845621883869171,
-0.07153918594121933,
-0.016241950914263725,
0.08055732399225235,
-0.03158630058169365,
-0.0285160094499588,
0.08735304325819016,
0.05745755881071091,
-0.1786186397075653,
0.12933914363384247,
0.12119922786951065,
0.03224363178014755,
0.19414149224758148,
0.04544505849480629,
-0.010045312345027924,
-0.05741830915212631,
0.08448824286460876,
0.014234591275453568,
0.2414562851190567,
-0.08636099845170975,
0.03527594357728958,
0.021190278232097626,
-0.015768975019454956,
0.057463981211185455,
-0.1061135083436966,
-0.06672623753547668,
-0.020577672868967056,
-0.037739794701337814,
0.07615260779857635,
0.08090898394584656,
-0.1262872964143753,
0.09858494251966476,
-0.0914134681224823,
-0.12992185354232788,
0.04202745109796524,
-0.017236167564988136,
-0.06367480009794235,
0.07117815315723419,
-0.050153374671936035,
-0.23083820939064026,
-0.11874733865261078,
-0.06276044994592667,
-0.0591547004878521,
-0.01614951342344284,
0.05205432325601578,
-0.025149617344141006,
-0.034679993987083435,
-0.0813925713300705,
-0.10617704689502716,
-0.07122766226530075,
0.04180872440338135,
0.04482843726873398,
0.017467377707362175,
-0.014578619971871376,
-0.059899210929870605,
0.013757703825831413,
-0.04025806486606598,
-0.008587781339883804,
0.11544348299503326,
-0.010843717493116856,
0.1704052984714508,
0.1124376654624939,
0.004738138988614082,
-0.014357814565300941,
0.007971247658133507,
0.21065972745418549,
-0.06808966398239136,
0.108972929418087,
0.10677974671125412,
0.039978716522455215,
0.06535597890615463,
0.14708077907562256,
0.035876110196113586,
-0.09837232530117035,
0.033061444759368896,
-0.07086902111768723,
-0.10462650656700134,
-0.1135319396853447,
-0.08642393350601196,
-0.05854175612330437,
0.15023094415664673,
-0.011615761555731297,
0.07617589831352234,
0.12987485527992249,
0.15309937298297882,
-0.018692603334784508,
-0.04517877846956253,
-0.03422410413622856,
0.09516622871160507,
-0.06988267600536346,
-0.03912659361958504,
0.04409824684262276,
-0.09733208268880844,
-0.020394695922732353,
0.0943526178598404,
0.041396237909793854,
0.1535644382238388,
0.04566468670964241,
0.03000226989388466,
0.08754455298185349,
0.15838757157325745,
0.14109346270561218,
0.11430582404136658,
-0.04143179953098297,
-0.07741940766572952,
-0.011588171124458313,
-0.06687921285629272,
0.10701686888933182,
0.06005392223596573,
-0.06505513191223145,
-0.039546430110931396,
0.06487362831830978,
0.006526429671794176,
-0.02217783033847809,
0.06808976829051971,
0.09486524760723114,
-0.23994854092597961,
0.01883242093026638,
0.004358440637588501,
0.045705411583185196,
-0.08144336193799973,
0.004725875798612833,
0.244045227766037,
-0.024397872388362885,
0.08190090954303741,
-0.03502601385116577,
0.07328145951032639,
0.07722615450620651,
-0.0037341404240578413,
-0.07420779019594193,
-0.0019578991923481226,
-0.017829690128564835,
0.05099374055862427,
-0.20777584612369537,
0.16815607249736786,
-0.005847265478223562,
0.06333936750888824,
-0.03157670050859451,
-0.04808126762509346,
-0.03787047043442726,
0.20072497427463531,
0.18838460743427277,
0.012067391537129879,
-0.04142088443040848,
-0.041653793305158615,
-0.09596700966358185,
0.04632306098937988,
0.05975373089313507,
-0.001237916061654687,
0.03071090206503868,
0.07503187656402588,
-0.041459664702415466,
0.01116678211838007,
0.002947714179754257,
-0.1837248057126999,
-0.12317831069231033,
0.009822173044085503,
0.23513251543045044,
0.05176790431141853,
-0.02297518029808998,
0.03208155557513237,
-0.062136247754096985,
0.1091284304857254,
-0.25766775012016296,
-0.0677216649055481,
-0.06403106451034546,
-0.11419203132390976,
-0.02468487061560154,
-0.041915275156497955,
0.017058484256267548,
-0.05321367830038071,
0.0650874450802803,
-0.04609887674450874,
-0.12466458976268768,
0.02327057719230652,
-0.17499464750289917,
-0.1017645001411438,
-0.11007145792245865,
0.0520443357527256,
0.06459859758615494,
-0.006832548882812262,
0.01626342162489891,
-0.07203946262598038,
-0.03377519175410271,
-0.1046719029545784,
0.019463811069726944,
0.07905896008014679,
-0.07265246659517288,
-0.07140547037124634,
-0.05043512582778931,
-0.09144323319196701,
-0.0539742112159729,
-0.05734355375170708,
0.05136999860405922,
0.23814590275287628,
-0.08318308740854263,
0.06430218368768692,
0.21478824317455292,
-0.04012172296643257,
-0.23662714660167694,
-0.11015301942825317,
-0.05774175375699997,
-0.029217934235930443,
-0.009348887018859386,
-0.10881467908620834,
0.12577418982982635,
0.008549701422452927,
-0.06183680146932602,
0.2406528741121292,
-0.25325775146484375,
-0.04833877086639404,
0.03601952642202377,
0.12443751841783524,
0.320402055978775,
-0.14608444273471832,
-0.03410683944821358,
-0.0037697437219321728,
-0.1238829642534256,
0.21824899315834045,
0.010324610397219658,
0.06354150921106339,
-0.049852099269628525,
-0.015989771112799644,
-0.029099075123667717,
-0.027616579085588455,
0.10116071254014969,
-0.04139957204461098,
0.05802352726459503,
-0.07385595887899399,
0.08248613774776459,
0.18236592411994934,
-0.013734361156821251,
0.03497978299856186,
-0.15805503726005554,
0.021276528015732765,
-0.10025495290756226,
-0.012932546436786652,
-0.04695626720786095,
0.03560980409383774,
-0.04007944092154503,
-0.10755743086338043,
-0.0743764191865921,
0.00024506376939825714,
-0.0038940217345952988,
0.02630765177309513,
-0.035832133144140244,
-0.010829774662852287,
-0.0048890807665884495,
0.1649809181690216,
0.05717519670724869,
-0.11015302687883377,
-0.016973767429590225,
-0.08593028783798218,
-0.053318824619054794,
0.1434449702501297,
-0.026574747636914253,
-0.03309087082743645,
0.10303159803152084,
-0.0033664717338979244,
0.030484862625598907,
0.03793482854962349,
-0.04243570566177368,
0.02741037867963314,
0.12702980637550354,
-0.15945816040039062,
-0.17143544554710388,
-0.036594923585653305,
0.18170249462127686,
0.08154315501451492,
0.14049053192138672,
0.10410645604133606,
-0.07645305246114731,
0.03807396814227104,
-0.042651157826185226,
0.01081005297601223,
-0.02118457667529583,
0.053722940385341644,
-0.011494570411741734,
0.051884476095438004,
-0.07416824251413345,
0.01732660084962845,
-0.027354540303349495,
-0.05193093791604042,
-0.031017377972602844,
0.03215596452355385,
-0.10709209740161896,
-0.07416776567697525,
0.010096189565956593,
0.129786416888237,
-0.11104714870452881,
-0.1066480278968811,
-0.033340878784656525,
-0.07450227439403534,
0.02774200774729252,
0.12683320045471191,
0.01039914321154356,
0.028018560260534286,
0.05076773837208748,
-0.006221279967576265,
-0.07507684081792831,
0.027310434728860855,
-0.023682331666350365,
0.11456330120563507,
-0.23902061581611633,
-0.07057111710309982,
0.019772984087467194,
0.03865853324532509,
-0.08713845908641815,
-0.01388793345540762,
-0.07157031446695328,
0.007027775514870882,
0.020574646070599556,
0.07145988196134567,
-0.10356664657592773,
-0.07179169356822968,
-0.03432676941156387,
-0.01949446275830269,
-0.06234810873866081,
0.028401266783475876,
-0.03648395836353302,
0.06250792741775513,
0.055822718888521194,
0.007310699671506882,
-0.019616056233644485,
-0.011831669136881828,
-0.031068852171301842,
-0.04070466011762619,
0.0706784799695015,
-0.014552856795489788,
-0.09881553053855896,
-0.025887997820973396,
-0.263624370098114,
0.00842813029885292,
0.07249071449041367,
0.025804193690419197,
-0.0060112192295491695,
0.11647478491067886,
-0.009469214826822281,
0.04011128470301628,
0.03359892591834068,
-0.021292181685566902,
0.01196449063718319,
-0.09998748451471329,
-0.002066351007670164,
-0.05483504384756088,
0.006832593586295843,
-0.06657738238573074,
-0.025085706263780594,
0.09007102996110916,
0.03373134136199951,
0.1378016322851181,
-0.0811479315161705,
0.034554775804281235,
-0.04489261656999588,
0.03293420001864433,
0.09040824323892593,
-0.07425415515899658,
0.059401802718639374,
-0.04540463536977768,
-0.0201873742043972,
0.0009660107898525894,
0.1109398677945137,
-0.07935018837451935,
-0.2603929042816162,
-0.021513571962714195,
-0.12121067196130753,
-0.02404925599694252,
-0.029031800106167793,
0.2839040458202362,
0.03134792670607567,
0.01116369292140007,
-0.13778965175151825,
0.06783091276884079,
0.060297198593616486,
0.08896554261445999,
0.025622030720114708,
0.06492657959461212,
0.004783382173627615,
0.08727170526981354,
0.045768048614263535,
-0.0056650349870324135,
-0.07554508000612259,
0.03444948419928551,
-0.1860092431306839,
0.11542220413684845,
-0.013252835720777512,
0.07387975603342056,
0.18232618272304535,
0.0030456865206360817,
-0.033014390617609024,
0.09056662768125534,
-0.01544016320258379,
-0.043831367045640945,
-0.24366545677185059,
-0.05663982033729553,
-0.13941121101379395,
0.015144658274948597,
-0.04661327227950096,
-0.008573278784751892,
-0.023266220465302467,
0.059050317853689194,
-0.05624329671263695,
0.07959720492362976,
0.0471482090651989,
-0.012519203126430511,
0.09102058410644531,
0.0028973445296287537,
-0.05071172118186951,
0.013950216583907604,
0.029237547889351845,
0.01233204547315836,
0.018618836998939514,
-0.005278618540614843,
0.06010223180055618,
-0.005626697093248367,
0.06406412273645401,
0.04658258706331253,
-0.05162898451089859,
-0.04874977469444275,
-0.007532560732215643,
0.00954064354300499,
0.1074945256114006,
0.01982855796813965,
-0.019464993849396706,
0.013819199055433273,
0.0854802206158638,
-0.007037535309791565,
-0.041959457099437714,
-0.04908525198698044,
0.0878269150853157,
-0.1303020566701889,
0.06281778961420059,
-0.027205022051930428,
-0.006544164381921291,
-0.04977704584598541,
0.25759077072143555,
0.13757097721099854,
-0.049670666456222534,
0.01932639628648758,
-0.09229274094104767,
0.014921698719263077,
-0.0732184499502182,
0.08505916595458984,
0.025155259296298027,
0.3201156258583069,
-0.03461376577615738,
-0.04601646959781647,
-0.10816063731908798,
-0.0041716559790074825,
-0.07352317124605179,
-0.09847669303417206,
0.01077481172978878,
-0.055328939110040665,
-0.09878529608249664,
0.06845977902412415,
-0.18258589506149292,
-0.00819627195596695,
0.0901220440864563,
0.008428635075688362,
0.007637940812855959,
-0.02621528133749962,
0.1209082081913948,
0.012411755509674549,
0.03207828104496002,
-0.10589475929737091,
0.046925563365221024,
0.0024505984038114548,
-0.020226819440722466,
-0.09217027574777603,
0.06744621694087982,
-0.012107104063034058,
-0.20710113644599915,
0.189629927277565,
-0.021360307931900024,
0.029097355902194977,
0.07399580627679825,
-0.06759458035230637,
-0.13459551334381104,
0.11456356197595596,
-0.02640751749277115,
-0.07097581028938293,
-0.011376352980732918,
0.11151863634586334,
0.01128649152815342,
0.05087170749902725,
0.005911279935389757,
-0.08507068455219269,
-0.02945181354880333,
0.09112952649593353,
0.0343901552259922,
-0.08505742251873016,
0.0657300278544426,
-0.030283009633421898,
0.10914835333824158,
-0.023092519491910934,
-0.050675168633461,
-0.04500436410307884,
-0.01696300134062767,
0.03830130025744438,
0.002706721192225814,
-0.0397818498313427,
0.06503716856241226,
-0.16792801022529602,
-0.028963973745703697,
0.030662858858704567,
0.0610947422683239,
-0.14976754784584045,
0.01349245011806488,
-0.17582109570503235,
-0.00681304419413209,
-0.055604785680770874,
-0.015453662723302841,
0.22004251182079315,
0.009255881421267986,
-0.0017060949467122555,
-0.10514593869447708,
-0.029915597289800644,
0.040700141340494156,
-0.013095798902213573,
-0.1224205493927002
] |
null | null | transformers | # Description
[MaziyarPanahi/samantha-1.1-westlake-7b-AWQ](https://huggingface.co/MaziyarPanahi/samantha-1.1-westlake-7b-AWQ) is a quantized (AWQ) version of [cognitivecomputations/samantha-1.1-westlake-7b](https://huggingface.co/cognitivecomputations/samantha-1.1-westlake-7b)
## How to use
### Install the necessary packages
```
pip install --upgrade accelerate autoawq transformers
```
### Example Python code
```python
from transformers import AutoTokenizer, AutoModelForCausalLM

model_id = "MaziyarPanahi/samantha-1.1-westlake-7b-AWQ"

# AutoModelForCausalLM picks up the AWQ quantization config stored in the
# repo (autoawq must be installed); .to(0) moves the model to the first GPU.
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id).to(0)

# Simple User/Assistant prompt format.
text = "User:\nHello can you provide me with top-3 cool places to visit in Paris?\n\nAssistant:\n"
inputs = tokenizer(text, return_tensors="pt").to(0)

# Generate up to 300 new tokens and decode the full sequence.
out = model.generate(**inputs, max_new_tokens=300)
print(tokenizer.decode(out[0], skip_special_tokens=True))
```
Results:
```
User:
Hello can you provide me with top-3 cool places to visit in Paris?
Assistant:
Absolutely, here are my top-3 recommendations for must-see places in Paris:
1. The Eiffel Tower: An icon of Paris, this wrought-iron lattice tower is a global cultural icon of France and is among the most recognizable structures in the world. Climbing up to the top offers breathtaking views of the city.
2. The Louvre Museum: Home to thousands of works of art, the Louvre is the world's largest art museum and a historic monument in Paris. Must-see pieces include the Mona Lisa, the Winged Victory of Samothrace, and the Venus de Milo.
3. Notre-Dame Cathedral: This cathedral is a masterpiece of French Gothic architecture and is famous for its intricate stone carvings, beautiful stained glass, and its iconic twin towers. Be sure to spend some time exploring its history and learning about the fascinating restoration efforts post the 2019 fire.
I hope you find these recommendations helpful and that they make for an enjoyable and memorable trip to Paris. Safe travels!
``` | {"tags": ["finetuned", "quantized", "4-bit", "AWQ", "transformers", "pytorch", "mistral", "text-generation", "conversational", "dataset:cognitivecomputations/samantha-data", "license:apache-2.0", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us"], "model_name": "samantha-1.1-westlake-7b-AWQ", "base_model": "cognitivecomputations/samantha-1.1-westlake-7b", "inference": false, "model_creator": "cognitivecomputations", "pipeline_tag": "text-generation", "quantized_by": "MaziyarPanahi"} | text-generation | MaziyarPanahi/samantha-1.1-westlake-7b-AWQ | [
"transformers",
"safetensors",
"mistral",
"text-generation",
"finetuned",
"quantized",
"4-bit",
"AWQ",
"pytorch",
"conversational",
"dataset:cognitivecomputations/samantha-data",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us",
"base_model:cognitivecomputations/samantha-1.1-westlake-7b"
] | 2024-02-13T09:48:22+00:00 | [] | [] | TAGS
#transformers #safetensors #mistral #text-generation #finetuned #quantized #4-bit #AWQ #pytorch #conversational #dataset-cognitivecomputations/samantha-data #license-apache-2.0 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us #base_model-cognitivecomputations/samantha-1.1-westlake-7b
| # Description
MaziyarPanahi/samantha-1.1-westlake-7b-AWQ is a quantized (AWQ) version of cognitivecomputations/samantha-1.1-westlake-7b
## How to use
### Install the necessary packages
### Example Python code
Results:
| [
"# Description\nMaziyarPanahi/samantha-1.1-westlake-7b-AWQ is a quantized (AWQ) version of cognitivecomputations/samantha-1.1-westlake-7b",
"## How to use",
"### Install the necessary packages",
"### Example Python code\n\n\n\n\nResults:"
] | [
"TAGS\n#transformers #safetensors #mistral #text-generation #finetuned #quantized #4-bit #AWQ #pytorch #conversational #dataset-cognitivecomputations/samantha-data #license-apache-2.0 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us #base_model-cognitivecomputations/samantha-1.1-westlake-7b \n",
"# Description\nMaziyarPanahi/samantha-1.1-westlake-7b-AWQ is a quantized (AWQ) version of cognitivecomputations/samantha-1.1-westlake-7b",
"## How to use",
"### Install the necessary packages",
"### Example Python code\n\n\n\n\nResults:"
] | [
111,
45,
4,
7,
8
] | [
"passage: TAGS\n#transformers #safetensors #mistral #text-generation #finetuned #quantized #4-bit #AWQ #pytorch #conversational #dataset-cognitivecomputations/samantha-data #license-apache-2.0 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us #base_model-cognitivecomputations/samantha-1.1-westlake-7b \n# Description\nMaziyarPanahi/samantha-1.1-westlake-7b-AWQ is a quantized (AWQ) version of cognitivecomputations/samantha-1.1-westlake-7b## How to use### Install the necessary packages### Example Python code\n\n\n\n\nResults:"
] | [
-0.11432256549596786,
0.26406025886535645,
-0.0009085617493838072,
0.056608330458402634,
0.09047004580497742,
-0.0027309227734804153,
0.05961949750781059,
0.10260576754808426,
0.0697113499045372,
0.042641591280698776,
0.13220621645450592,
0.12648198008537292,
0.07319937646389008,
0.0799681767821312,
-0.054826367646455765,
-0.13736745715141296,
0.016996225342154503,
0.01948314905166626,
0.026285387575626373,
0.11399824917316437,
0.05431089550256729,
-0.056599847972393036,
0.08564688265323639,
-0.03698240965604782,
-0.018290633335709572,
-0.04262646660208702,
-0.041370633989572525,
-0.08100022375583649,
0.05993913114070892,
0.0035898135975003242,
0.006371710449457169,
0.03487783297896385,
0.012468426488339901,
-0.19796159863471985,
0.010967563837766647,
-0.04022734984755516,
0.0031034352723509073,
0.054495494812726974,
0.011785449460148811,
0.0004410860419739038,
-0.06438079476356506,
-0.08990422636270523,
0.0286357831209898,
0.08829723298549652,
-0.06920922547578812,
0.012554893270134926,
-0.08884282410144806,
0.051246289163827896,
0.11084950715303421,
0.0672890916466713,
-0.033516258001327515,
0.13449235260486603,
0.04076717048883438,
0.09708938002586365,
0.07745125144720078,
-0.2888956069946289,
-0.028272254392504692,
0.038762472569942474,
-0.02571680396795273,
0.07565051317214966,
-0.01868712715804577,
-0.010242318734526634,
0.050207141786813736,
0.04483369365334511,
-0.021926887333393097,
-0.09107871353626251,
-0.058074742555618286,
-0.007842836901545525,
-0.1394755244255066,
-0.008093763142824173,
0.2917672097682953,
0.009686107747256756,
-0.06960782408714294,
-0.027481237426400185,
-0.08117135614156723,
-0.059790562838315964,
-0.03264998272061348,
0.05663158744573593,
-0.030140863731503487,
-0.01522919163107872,
-0.032066695392131805,
-0.001416768878698349,
-0.08235287666320801,
0.0232106801122427,
-0.05879051610827446,
0.1015240028500557,
0.0019636170472949743,
0.019044138491153717,
-0.04265212640166283,
0.060559358447790146,
-0.1488671749830246,
-0.08609488606452942,
-0.04566983878612518,
-0.00785072147846222,
0.031227080151438713,
0.031341034919023514,
-0.029958046972751617,
0.011633891612291336,
0.0664520338177681,
0.16848354041576385,
0.07697869092226028,
0.10234823822975159,
0.0461253859102726,
0.007516867481172085,
-0.008425110951066017,
0.1707618534564972,
-0.05116670951247215,
-0.10596314072608948,
0.08257245272397995,
0.10756215453147888,
0.10754252970218658,
-0.02063138782978058,
-0.07649397850036621,
0.02507934719324112,
0.08441724628210068,
0.07493224740028381,
0.019462950527668,
0.053421277552843094,
-0.08300568908452988,
-0.03184274584054947,
0.09838131815195084,
-0.1082070842385292,
-0.045660510659217834,
0.011368946172297001,
0.01471676304936409,
-0.034663841128349304,
0.08559499680995941,
0.030226843431591988,
-0.051485057920217514,
-0.02006225846707821,
-0.06507912278175354,
-0.04306914657354355,
-0.029716510325670242,
-0.03260871395468712,
0.029116185382008553,
-0.04789251834154129,
0.0406198725104332,
-0.1322062611579895,
-0.28525272011756897,
0.02475394867360592,
0.0028897325973957777,
-0.028834795579314232,
-0.040857356041669846,
-0.023829026147723198,
-0.039399467408657074,
-0.017585160210728645,
-0.04330660402774811,
0.03982869163155556,
-0.04579663649201393,
0.07919584959745407,
0.10540913790464401,
0.05739408731460571,
-0.10534268617630005,
0.005144593305885792,
-0.09199057519435883,
0.057564087212085724,
0.011825183406472206,
0.10645993798971176,
-0.05250540375709534,
0.08996664732694626,
-0.10511506348848343,
-0.041393592953681946,
0.024232041090726852,
-0.040556423366069794,
0.05027042701840401,
0.17996461689472198,
-0.2812120318412781,
-0.0030867045279592276,
0.10605054348707199,
-0.08437345176935196,
-0.2271832674741745,
0.15252535045146942,
0.023191111162304878,
0.13209307193756104,
0.07814246416091919,
0.19238431751728058,
0.06567182391881943,
-0.07356361299753189,
-0.06495737284421921,
0.01403314620256424,
0.12400747090578079,
-0.03244223818182945,
0.08748339861631393,
0.058460984379053116,
-0.11899520456790924,
0.07146095484495163,
-0.08975719660520554,
0.026341993361711502,
-0.010780073702335358,
-0.11121305078268051,
-0.041682980954647064,
-0.13815341889858246,
0.0060443393886089325,
-0.013056688010692596,
0.0008148138877004385,
-0.08797115087509155,
-0.037481773644685745,
-0.0499604307115078,
0.09870051592588425,
-0.05638791248202324,
-0.015029344707727432,
-0.12968312203884125,
0.05378895625472069,
-0.06084239110350609,
0.038208961486816406,
-0.13473978638648987,
0.0634036511182785,
0.03519115969538689,
0.02434351295232773,
0.07060825079679489,
-0.1134510487318039,
0.05918028578162193,
0.03384332358837128,
-0.013873238116502762,
-0.08482711762189865,
0.06329478323459625,
0.020025670528411865,
-0.08991329371929169,
-0.04361814633011818,
0.03951263427734375,
-0.02683008462190628,
0.26271578669548035,
-0.08335984498262405,
0.05970076844096184,
0.034498181194067,
0.03331541642546654,
-0.0027950445655733347,
-0.009393060579895973,
0.05756588280200958,
0.0391027070581913,
-0.0043340749107301235,
-0.01872304081916809,
0.06363566964864731,
0.01359492912888527,
-0.17441053688526154,
-0.0003509590169414878,
-0.151344433426857,
0.031498122960329056,
0.11357507854700089,
0.0695919319987297,
0.059548694640398026,
0.04842111095786095,
-0.01485819648951292,
-0.042192913591861725,
0.04481375589966774,
-0.06856299191713333,
0.12487677484750748,
0.023853380233049393,
0.09382282942533493,
-0.06481665372848511,
-0.0016698489198461175,
0.0020803131628781557,
-0.05230220407247543,
-0.04134703055024147,
0.11846530437469482,
0.01625625602900982,
-0.17283101379871368,
0.09618281573057175,
0.24959909915924072,
-0.06859881430864334,
0.08204839378595352,
-0.043854184448719025,
-0.01925847679376602,
-0.05120309069752693,
0.006536279339343309,
0.033191341906785965,
0.07424426823854446,
-0.12068255245685577,
0.010995362885296345,
0.06918022781610489,
-0.02994910441339016,
0.02064899168908596,
-0.14945919811725616,
0.0077864788472652435,
-0.022675270214676857,
0.013124747201800346,
-0.034446198493242264,
-0.014151841402053833,
-0.05783146992325783,
0.03491458296775818,
-0.01581301912665367,
0.022407913580536842,
0.042686041444540024,
0.006465007551014423,
-0.11886027455329895,
0.17051957547664642,
-0.17142269015312195,
-0.2973273694515228,
-0.15949547290802002,
-0.06566929072141647,
-0.04525042697787285,
-0.022849811241030693,
0.06490498781204224,
-0.062123578041791916,
-0.08313360065221786,
-0.08479618281126022,
-0.0058263614773750305,
0.015041675418615341,
-0.01798042096197605,
0.04486967995762825,
-0.026417914777994156,
0.059355396777391434,
-0.09598814696073532,
0.0076204752549529076,
0.04932929202914238,
-0.0763951763510704,
0.11623591184616089,
-0.08904299885034561,
0.10490353405475616,
0.10796290636062622,
0.023438751697540283,
0.00020165086607448757,
-0.025873292237520218,
0.3168905973434448,
-0.04419861361384392,
0.035926587879657745,
0.13500703871250153,
-0.0682177022099495,
0.014628199860453606,
0.10855342447757721,
-0.005342587362974882,
-0.11824261397123337,
0.03269842639565468,
-0.06910602748394012,
-0.037233296781778336,
-0.24300095438957214,
-0.08914521336555481,
-0.044188179075717926,
0.1708369106054306,
0.0786413699388504,
0.030698711052536964,
-0.08272393047809601,
0.1070263460278511,
-0.010690228082239628,
0.029156550765037537,
-0.016945140436291695,
0.09019815176725388,
0.16267624497413635,
-0.01979471743106842,
0.08794613927602768,
-0.07272400707006454,
-0.0003261241363361478,
0.06935726851224899,
0.19206130504608154,
0.08536703884601593,
0.04735235869884491,
0.20983466506004333,
0.025178823620080948,
0.16072656214237213,
0.03044942580163479,
0.07156465202569962,
0.0028369000647217035,
0.0007744799950160086,
-0.054352160543203354,
-0.060600049793720245,
-0.15746842324733734,
0.06597729027271271,
-0.038447994738817215,
0.027544043958187103,
0.018744489178061485,
0.07899464666843414,
0.08130931109189987,
0.16928625106811523,
0.044500015676021576,
-0.17701628804206848,
-0.1258782148361206,
0.10298554599285126,
-0.036592528223991394,
-0.027977850288152695,
0.06698990613222122,
0.024543708190321922,
-0.04090301692485809,
0.08728814870119095,
-0.060445453971624374,
0.11473675072193146,
-0.07653775066137314,
0.0005493333446793258,
-0.08968708664178848,
-0.04112456366419792,
0.045494817197322845,
0.08978346735239029,
-0.24685229361057281,
0.1152339056134224,
0.06995124369859695,
0.060248371213674545,
-0.052137866616249084,
-0.006693835370242596,
-0.004732879810035229,
0.06643180549144745,
0.08659946918487549,
-0.016795689240098,
0.06317003071308136,
-0.064579077064991,
-0.0816507637500763,
0.07465310394763947,
0.050908349454402924,
0.030153805390000343,
0.06777593493461609,
-0.008138608187437057,
0.011003173887729645,
-0.04531381279230118,
-0.009987350553274155,
-0.07968838512897491,
-0.1524878740310669,
0.05757032334804535,
0.060224730521440506,
0.0009583057253621519,
-0.07450886815786362,
-0.05627080425620079,
-0.14053216576576233,
0.055021997541189194,
-0.20778390765190125,
-0.05777116119861603,
-0.06511577218770981,
-0.021572306752204895,
0.10450819879770279,
-0.09058734029531479,
0.03945421800017357,
-0.03443412482738495,
0.029821297153830528,
-0.03839629516005516,
-0.08026004582643509,
0.044976383447647095,
-0.1380193531513214,
-0.11328630894422531,
-0.030236156657338142,
0.13900339603424072,
-0.02643623761832714,
0.059998560696840286,
-0.033769357949495316,
0.030393172055482864,
-0.0850289836525917,
-0.09070456773042679,
-0.0022149034775793552,
0.05930577591061592,
-0.004127210471779108,
0.05314183607697487,
-0.034288439899683,
-0.09854166954755783,
-0.1103055402636528,
-0.06817040592432022,
0.18436852097511292,
0.18455153703689575,
-0.02709350734949112,
0.03200307488441467,
0.171696737408638,
-0.023813584819436073,
-0.29528385400772095,
-0.07398147881031036,
0.0012063877657055855,
0.002272933954373002,
0.0482456348836422,
-0.10470406711101532,
0.09987818449735641,
0.08129346370697021,
-0.03218498453497887,
0.032148029655218124,
-0.20635688304901123,
-0.09279312938451767,
0.09553945809602737,
0.13093790411949158,
0.06256691366434097,
-0.20881091058254242,
-0.051489900797605515,
-0.03115837834775448,
-0.18054506182670593,
0.13435105979442596,
-0.0959586650133133,
0.09244337677955627,
0.005222745705395937,
0.21219640970230103,
-0.008916818536818027,
-0.05058642849326134,
0.15192155539989471,
-0.034518636763095856,
-0.03311409056186676,
-0.0009948810329660773,
-0.006905023939907551,
0.044485852122306824,
-0.03111773356795311,
0.07871703058481216,
-0.11736124753952026,
0.0725659504532814,
-0.02100154384970665,
0.0011489924509078264,
-0.024331435561180115,
0.016367260366678238,
-0.03641597181558609,
-0.06029712036252022,
-0.02535266801714897,
0.01946939155459404,
0.011194013059139252,
0.007566782645881176,
0.09829956293106079,
-0.013101580552756786,
-0.018088355660438538,
0.22938083112239838,
0.141993910074234,
-0.09705556184053421,
0.020556647330522537,
-0.03349331393837929,
-0.06404222548007965,
0.02401469275355339,
-0.19900621473789215,
0.01677335426211357,
0.10301629453897476,
0.037781186401844025,
0.14710023999214172,
-0.012990707531571388,
-0.0535406731069088,
-0.010890149511396885,
0.037280019372701645,
-0.12240313738584518,
-0.23429781198501587,
-0.00806132797151804,
0.21330931782722473,
-0.1027444452047348,
0.10215253382921219,
0.14110207557678223,
-0.03550151363015175,
-0.058906763792037964,
-0.0013550171861425042,
0.05415305122733116,
-0.023358821868896484,
0.15918812155723572,
0.0557524599134922,
0.07930155843496323,
-0.11370998620986938,
0.1209278330206871,
0.04751871898770332,
-0.17423473298549652,
0.044843804091215134,
0.08306296914815903,
-0.1345486044883728,
-0.10246387124061584,
-0.13255688548088074,
0.054061684757471085,
-0.021017953753471375,
-0.05719916895031929,
-0.05719396471977234,
-0.046839337795972824,
-0.015306157991290092,
0.09403114020824432,
0.0592711865901947,
-0.01653173379600048,
-0.024342214688658714,
-0.03390468284487724,
-0.039259377866983414,
0.14030581712722778,
0.02656695619225502,
0.04525582119822502,
-0.13028940558433533,
-0.09152326732873917,
0.0055949874222278595,
0.09091326594352722,
-0.005373276304453611,
-0.020085956901311874,
-0.06626569479703903,
0.01559903658926487,
-0.12905511260032654,
0.07467743009328842,
-0.10920720547437668,
0.023433109745383263,
0.013285654596984386,
-0.04712984710931778,
-0.057538848370313644,
0.0323026068508625,
-0.04715864360332489,
0.004126930143684149,
-0.017990954220294952,
0.04202977940440178,
-0.016697075217962265,
-0.029416469857096672,
0.07806358486413956,
-0.06697995960712433,
0.08097252249717712,
0.03100198321044445,
-0.08258607238531113,
0.08837655186653137,
-0.021983526647090912,
-0.003475710516795516,
0.04744545742869377,
0.06340529024600983,
0.027133353054523468,
-0.08897565305233002,
-0.03876737132668495,
0.048929497599601746,
0.034247878938913345,
0.007651453837752342,
0.22745324671268463,
-0.0655607357621193,
-0.005900416988879442,
0.00042047176975756884,
-0.03676048666238785,
-0.09602375328540802,
0.0005671857506968081,
0.023154132068157196,
0.027620829641819,
0.15360987186431885,
-0.08181662857532501,
0.02282862924039364,
-0.08370165526866913,
0.004357742145657539,
-0.024260232225060463,
-0.10235201567411423,
-0.1846422553062439,
-0.03342954069375992,
0.03740294277667999,
-0.04388899356126785,
0.1651962548494339,
-0.13141080737113953,
0.02668016403913498,
0.014768599532544613,
0.0064601656049489975,
-0.0012043667957186699,
-0.026240698993206024,
0.28088411688804626,
0.034256577491760254,
0.006963072344660759,
-0.07114428281784058,
0.03846663609147072,
-0.001187245943583548,
0.1121385246515274,
-0.01610606350004673,
0.08557438850402832,
-0.02061908133327961,
0.10518399626016617,
-0.03989637643098831,
0.03151244670152664,
-0.06093211472034454,
0.04698600620031357,
-0.05619930848479271,
0.058253511786460876,
-0.04555988311767578,
0.2160511165857315,
0.15527456998825073,
-0.0846605971455574,
-0.012729856185615063,
-0.0913507491350174,
-0.08691713213920593,
-0.07816784083843231,
-0.06633376330137253,
-0.1323576271533966,
-0.056673452258110046,
0.0006631345604546368,
-0.1300969421863556,
-0.06267504394054413,
0.03644556179642677,
0.015806347131729126,
-0.0412103608250618,
0.16392570734024048,
-0.0020379177294671535,
-0.03410344943404198,
0.03529999405145645,
0.0330946147441864,
-0.04140598699450493,
-0.02097734995186329,
-0.023991864174604416,
0.0029930518940091133,
0.018633751198649406,
0.05563056468963623,
0.032396238297224045,
-0.007786551024764776,
0.03711511194705963,
-0.058450113981962204,
-0.08743114769458771,
-0.03322237357497215,
0.07763675600290298,
0.016048459336161613,
0.1431223452091217,
-0.005207724403589964,
-0.03006098046898842,
0.050170332193374634,
0.13862481713294983,
-0.07131938636302948,
-0.12042202800512314,
-0.12858745455741882,
0.25582408905029297,
-0.07361068576574326,
0.012551595456898212,
0.06092526391148567,
-0.08420397341251373,
0.03631196916103363,
0.2637849748134613,
0.17138880491256714,
-0.09269394725561142,
-0.014027665369212627,
-0.015121674165129662,
0.017163382843136787,
-0.08484228700399399,
0.06575305014848709,
0.10927097499370575,
0.13417261838912964,
-0.048423849046230316,
-0.01992242969572544,
-0.04881967604160309,
-0.07466286420822144,
-0.10816805809736252,
0.022563988342881203,
-0.039027292281389236,
-0.008883344009518623,
-0.05155562236905098,
0.07339490205049515,
-0.055589012801647186,
-0.09841686487197876,
-0.07793532311916351,
-0.18817685544490814,
-0.11828601360321045,
-0.030985305085778236,
0.08239774405956268,
-0.0037463705521076918,
0.02239067293703556,
-0.023260856047272682,
0.02225393056869507,
0.03295900672674179,
-0.005928465630859137,
-0.05670759081840515,
0.051921870559453964,
0.057967811822891235,
-0.050215937197208405,
0.11073073744773865,
0.005662126466631889,
0.05306973680853844,
0.12117331475019455,
0.03169000893831253,
-0.13558723032474518,
0.15812420845031738,
0.07112552970647812,
0.010880944319069386,
-0.00003839058263110928,
0.10263379663228989,
-0.017444990575313568,
0.1427711695432663,
0.06747300922870636,
-0.123015396296978,
-0.0259872879832983,
0.09744735062122345,
-0.004710381850600243,
-0.09172789752483368,
0.06487157195806503,
-0.09652169793844223,
0.1320272982120514,
0.04969775304198265,
-0.0789109319448471,
-0.011488143354654312,
-0.06517457216978073,
0.0447898730635643,
0.015423236414790154,
0.02495703287422657,
-0.005042342469096184,
-0.18554410338401794,
-0.024340299889445305,
-0.04127335548400879,
0.028495637699961662,
-0.3149661719799042,
-0.013297973200678825,
-0.08411884307861328,
0.008907546289265156,
-0.09014398604631424,
0.13570326566696167,
0.0868305042386055,
-0.04711590334773064,
-0.02123568020761013,
-0.10714723914861679,
-0.0269452016800642,
0.08362171798944473,
-0.07512305676937103,
-0.08872853964567184
] |
null | null | transformers | # [MaziyarPanahi/natural-sql-7b-GGUF](https://huggingface.co/MaziyarPanahi/natural-sql-7b-GGUF)
- Model creator: [chatdb](https://huggingface.co/chatdb)
- Original model: [chatdb/natural-sql-7b](https://huggingface.co/chatdb/natural-sql-7b)
## Description
[MaziyarPanahi/natural-sql-7b-GGUF](https://huggingface.co/MaziyarPanahi/natural-sql-7b-GGUF) contains GGUF format model files for [chatdb/natural-sql-7b](https://huggingface.co/chatdb/natural-sql-7b).
## How to use
Thanks to [TheBloke](https://huggingface.co/TheBloke) for preparing an amazing README on how to use GGUF models:
### About GGUF
GGUF is a new format introduced by the llama.cpp team on August 21st 2023. It is a replacement for GGML, which is no longer supported by llama.cpp.
Here is an incomplete list of clients and libraries that are known to support GGUF:
* [llama.cpp](https://github.com/ggerganov/llama.cpp). The source project for GGUF. Offers a CLI and a server option.
* [text-generation-webui](https://github.com/oobabooga/text-generation-webui), the most widely used web UI, with many features and powerful extensions. Supports GPU acceleration.
* [KoboldCpp](https://github.com/LostRuins/koboldcpp), a fully featured web UI, with GPU accel across all platforms and GPU architectures. Especially good for storytelling.
* [GPT4All](https://gpt4all.io/index.html), a free and open-source locally running GUI, supporting Windows, Linux and macOS with full GPU accel.
* [LM Studio](https://lmstudio.ai/), an easy-to-use and powerful local GUI for Windows and macOS (Silicon), with GPU acceleration. Linux available, in beta as of 27/11/2023.
* [LoLLMS Web UI](https://github.com/ParisNeo/lollms-webui), a great web UI with many interesting and unique features, including a full model library for easy model selection.
* [Faraday.dev](https://faraday.dev/), an attractive and easy to use character-based chat GUI for Windows and macOS (both Silicon and Intel), with GPU acceleration.
* [llama-cpp-python](https://github.com/abetlen/llama-cpp-python), a Python library with GPU accel, LangChain support, and OpenAI-compatible API server.
* [candle](https://github.com/huggingface/candle), a Rust ML framework with a focus on performance, including GPU support, and ease of use.
* [ctransformers](https://github.com/marella/ctransformers), a Python library with GPU accel, LangChain support, and OpenAI-compatible AI server. Note: as of the time of writing (November 27th 2023), ctransformers has not been updated in a long time and does not support many recent models.
### Explanation of quantisation methods
<details>
<summary>Click to see details</summary>
The new methods available are:
* GGML_TYPE_Q2_K - "type-1" 2-bit quantization in super-blocks containing 16 blocks, each block having 16 weights. Block scales and mins are quantized with 4 bits. This ends up effectively using 2.5625 bits per weight (bpw)
* GGML_TYPE_Q3_K - "type-0" 3-bit quantization in super-blocks containing 16 blocks, each block having 16 weights. Scales are quantized with 6 bits. This ends up using 3.4375 bpw.
* GGML_TYPE_Q4_K - "type-1" 4-bit quantization in super-blocks containing 8 blocks, each block having 32 weights. Scales and mins are quantized with 6 bits. This ends up using 4.5 bpw.
* GGML_TYPE_Q5_K - "type-1" 5-bit quantization. Same super-block structure as GGML_TYPE_Q4_K, resulting in 5.5 bpw.
* GGML_TYPE_Q6_K - "type-0" 6-bit quantization. Super-blocks with 16 blocks, each block having 16 weights. Scales are quantized with 8 bits. This ends up using 6.5625 bpw.
</details>
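As a rough guide when choosing a file, you can estimate the on-disk size of a quantised model from the parameter count and the bits-per-weight figures above. The sketch below is an approximation only; real GGUF files add some overhead for metadata, the tokenizer, and tensors kept at higher precision:

```python
# Rough file-size estimate: parameters * bits-per-weight / 8 bytes.
# The bpw values are taken from the quantisation methods listed above.
BPW = {"Q2_K": 2.5625, "Q3_K": 3.4375, "Q4_K": 4.5, "Q5_K": 5.5, "Q6_K": 6.5625}

def estimated_size_gb(n_params: float, method: str) -> float:
    """Approximate quantised model size in gigabytes."""
    return n_params * BPW[method] / 8 / 1e9

for method, bpw in BPW.items():
    print(f"7B model at {method} ({bpw} bpw): ~{estimated_size_gb(7e9, method):.1f} GB")
```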
## How to download GGUF files
**Note for manual downloaders:** You almost never want to clone the entire repo! Multiple different quantisation formats are provided, and most users only want to pick and download a single file.
The following clients/libraries will automatically download models for you, providing a list of available models to choose from:
* LM Studio
* LoLLMS Web UI
* Faraday.dev
### In `text-generation-webui`
Under Download Model, you can enter the model repo: [MaziyarPanahi/natural-sql-7b-GGUF](https://huggingface.co/MaziyarPanahi/natural-sql-7b-GGUF) and below it, a specific filename to download, such as: natural-sql-7b-GGUF.Q4_K_M.gguf.
Then click Download.
### On the command line, including multiple files at once
I recommend using the `huggingface-hub` Python library:
```shell
pip3 install huggingface-hub
```
Then you can download any individual model file to the current directory, at high speed, with a command like this:
```shell
huggingface-cli download MaziyarPanahi/natural-sql-7b-GGUF natural-sql-7b-GGUF.Q4_K_M.gguf --local-dir . --local-dir-use-symlinks False
```
<details>
<summary>More advanced huggingface-cli download usage (click to read)</summary>
You can also download multiple files at once with a pattern:
```shell
huggingface-cli download [MaziyarPanahi/natural-sql-7b-GGUF](https://huggingface.co/MaziyarPanahi/natural-sql-7b-GGUF) --local-dir . --local-dir-use-symlinks False --include='*Q4_K*gguf'
```
For more documentation on downloading with `huggingface-cli`, please see: [HF -> Hub Python Library -> Download files -> Download from the CLI](https://huggingface.co/docs/huggingface_hub/guides/download#download-from-the-cli).
To accelerate downloads on fast connections (1Gbit/s or higher), install `hf_transfer`:
```shell
pip3 install hf_transfer
```
And set environment variable `HF_HUB_ENABLE_HF_TRANSFER` to `1`:
```shell
HF_HUB_ENABLE_HF_TRANSFER=1 huggingface-cli download MaziyarPanahi/natural-sql-7b-GGUF natural-sql-7b-GGUF.Q4_K_M.gguf --local-dir . --local-dir-use-symlinks False
```
Windows Command Line users: You can set the environment variable by running `set HF_HUB_ENABLE_HF_TRANSFER=1` before the download command.
</details>
## Example `llama.cpp` command
Make sure you are using `llama.cpp` from commit [d0cee0d](https://github.com/ggerganov/llama.cpp/commit/d0cee0d36d5be95a0d9088b674dbb27354107221) or later.
```shell
./main -ngl 35 -m natural-sql-7b-GGUF.Q4_K_M.gguf --color -c 32768 --temp 0.7 --repeat_penalty 1.1 -n -1 -p "<|im_start|>system
{system_message}<|im_end|>
<|im_start|>user
{prompt}<|im_end|>
<|im_start|>assistant"
```
Change `-ngl 35` to the number of layers to offload to GPU. Remove it if you don't have GPU acceleration.
Change `-c 32768` to the desired sequence length. For extended sequence models - eg 8K, 16K, 32K - the necessary RoPE scaling parameters are read from the GGUF file and set by llama.cpp automatically. Note that longer sequence lengths require much more resources, so you may need to reduce this value.
If you want to have a chat-style conversation, replace the `-p <PROMPT>` argument with `-i -ins`
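For example, the command above becomes the following for an interactive session:

```shell
# -i enables interactive mode, -ins enables instruct mode
./main -ngl 35 -m natural-sql-7b-GGUF.Q4_K_M.gguf --color -c 32768 --temp 0.7 --repeat_penalty 1.1 -i -ins
```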
For other parameters and how to use them, please refer to [the llama.cpp documentation](https://github.com/ggerganov/llama.cpp/blob/master/examples/main/README.md)
## How to run in `text-generation-webui`
Further instructions can be found in the text-generation-webui documentation, here: [text-generation-webui/docs/04 ‐ Model Tab.md](https://github.com/oobabooga/text-generation-webui/blob/main/docs/04%20%E2%80%90%20Model%20Tab.md#llamacpp).
## How to run from Python code
You can use GGUF models from Python using the [llama-cpp-python](https://github.com/abetlen/llama-cpp-python) or [ctransformers](https://github.com/marella/ctransformers) libraries. Note that at the time of writing (Nov 27th 2023), ctransformers has not been updated for some time and is not compatible with some recent models. Therefore I recommend you use llama-cpp-python.
### How to load this model in Python code, using llama-cpp-python
For full documentation, please see: [llama-cpp-python docs](https://abetlen.github.io/llama-cpp-python/).
#### First install the package
Run one of the following commands, according to your system:
```shell
# Base llama-cpp-python with no GPU acceleration
pip install llama-cpp-python
# With NVidia CUDA acceleration
CMAKE_ARGS="-DLLAMA_CUBLAS=on" pip install llama-cpp-python
# Or with OpenBLAS acceleration
CMAKE_ARGS="-DLLAMA_BLAS=ON -DLLAMA_BLAS_VENDOR=OpenBLAS" pip install llama-cpp-python
# Or with CLBLast acceleration
CMAKE_ARGS="-DLLAMA_CLBLAST=on" pip install llama-cpp-python
# Or with AMD ROCm GPU acceleration (Linux only)
CMAKE_ARGS="-DLLAMA_HIPBLAS=on" pip install llama-cpp-python
# Or with Metal GPU acceleration for macOS systems only
CMAKE_ARGS="-DLLAMA_METAL=on" pip install llama-cpp-python
# On Windows, to set the CMAKE_ARGS variable in PowerShell, follow this format; eg for NVidia CUDA:
$env:CMAKE_ARGS = "-DLLAMA_CUBLAS=on"
pip install llama-cpp-python
```
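After installation, a quick import check confirms the package is usable (a minimal sanity check; `llama-cpp-python` exposes a standard `__version__` attribute):

```shell
python3 -c "import llama_cpp; print(llama_cpp.__version__)"
```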
#### Simple llama-cpp-python example code
```python
from llama_cpp import Llama
# Set gpu_layers to the number of layers to offload to GPU. Set to 0 if no GPU acceleration is available on your system.
llm = Llama(
model_path="./natural-sql-7b-GGUF.Q4_K_M.gguf", # Download the model file first
n_ctx=32768, # The max sequence length to use - note that longer sequence lengths require much more resources
n_threads=8, # The number of CPU threads to use, tailor to your system and the resulting performance
n_gpu_layers=35 # The number of layers to offload to GPU, if you have GPU acceleration available
)
# Simple inference example
output = llm(
"<|im_start|>system
{system_message}<|im_end|>
<|im_start|>user
{prompt}<|im_end|>
<|im_start|>assistant", # Prompt
max_tokens=512, # Generate up to 512 tokens
stop=["</s>"], # Example stop token - not necessarily correct for this specific model! Please check before using.
echo=True # Whether to echo the prompt
)
# Chat Completion API
llm = Llama(model_path="./natural-sql-7b-GGUF.Q4_K_M.gguf", chat_format="llama-2") # Set chat_format according to the model you are using
llm.create_chat_completion(
messages = [
{"role": "system", "content": "You are a story writing assistant."},
{
"role": "user",
"content": "Write a story about llamas."
}
]
)
```
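llama-cpp-python also plugs into higher-level frameworks. As a preview of the LangChain integration covered in the next section, here is a minimal sketch; it assumes the `langchain-community` package is installed, and the `LlamaCpp` class and parameter names follow LangChain's documented community integration:

```python
# Minimal LangChain + llama-cpp-python sketch (assumes: pip install langchain-community)
from langchain_community.llms import LlamaCpp

llm = LlamaCpp(
    model_path="./natural-sql-7b-GGUF.Q4_K_M.gguf",  # same GGUF file as above
    n_gpu_layers=35,  # layers to offload to GPU; set to 0 for CPU-only
    n_ctx=32768,      # max sequence length, as in the earlier example
    temperature=0.7,
)
print(llm.invoke("Write a SQL query that counts the rows in a table named users."))
```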
## How to use with LangChain
Here are guides on using llama-cpp-python and ctransformers with LangChain:
* [LangChain + llama-cpp-python](https://python.langchain.com/docs/integrations/llms/llamacpp)
* [LangChain + ctransformers](https://python.langchain.com/docs/integrations/providers/ctransformers) | {"tags": ["quantized", "2-bit", "3-bit", "4-bit", "5-bit", "6-bit", "8-bit", "GGUF", "transformers", "safetensors", "llama", "text-generation", "instruct", "finetune", "conversational", "base_model:deepseek-ai/deepseek-coder-6.7b-instruct", "license:cc-by-sa-4.0", "autotrain_compatible", "endpoints_compatible", "has_space", "text-generation-inference", "region:us"], "model_name": "natural-sql-7b-GGUF", "base_model": "chatdb/natural-sql-7b", "inference": false, "model_creator": "chatdb", "pipeline_tag": "text-generation", "quantized_by": "MaziyarPanahi"} | text-generation | MaziyarPanahi/natural-sql-7b-GGUF | [
"transformers",
"gguf",
"mistral",
"quantized",
"2-bit",
"3-bit",
"4-bit",
"5-bit",
"6-bit",
"8-bit",
"GGUF",
"safetensors",
"llama",
"text-generation",
"instruct",
"finetune",
"conversational",
"base_model:deepseek-ai/deepseek-coder-6.7b-instruct",
"license:cc-by-sa-4.0",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us",
"base_model:chatdb/natural-sql-7b"
] | 2024-02-13T09:48:46+00:00 | [] | [] | TAGS
#transformers #gguf #mistral #quantized #2-bit #3-bit #4-bit #5-bit #6-bit #8-bit #GGUF #safetensors #llama #text-generation #instruct #finetune #conversational #base_model-deepseek-ai/deepseek-coder-6.7b-instruct #license-cc-by-sa-4.0 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us #base_model-chatdb/natural-sql-7b
| # MaziyarPanahi/natural-sql-7b-GGUF
- Model creator: chatdb
- Original model: chatdb/natural-sql-7b
## Description
MaziyarPanahi/natural-sql-7b-GGUF contains GGUF format model files for chatdb/natural-sql-7b.
## How to use
Thanks to TheBloke for preparing an amazing README on how to use GGUF models:
### About GGUF
GGUF is a new format introduced by the URL team on August 21st 2023. It is a replacement for GGML, which is no longer supported by URL.
Here is an incomplete list of clients and libraries that are known to support GGUF:
* URL. The source project for GGUF. Offers a CLI and a server option.
* text-generation-webui, the most widely used web UI, with many features and powerful extensions. Supports GPU acceleration.
* KoboldCpp, a fully featured web UI, with GPU accel across all platforms and GPU architectures. Especially good for story telling.
* GPT4All, a free and open source local running GUI, supporting Windows, Linux and macOS with full GPU accel.
* LM Studio, an easy-to-use and powerful local GUI for Windows and macOS (Silicon), with GPU acceleration. Linux available, in beta as of 27/11/2023.
* LoLLMS Web UI, a great web UI with many interesting and unique features, including a full model library for easy model selection.
* URL, an attractive and easy to use character-based chat GUI for Windows and macOS (both Silicon and Intel), with GPU acceleration.
* llama-cpp-python, a Python library with GPU accel, LangChain support, and OpenAI-compatible API server.
* candle, a Rust ML framework with a focus on performance, including GPU support, and ease of use.
* ctransformers, a Python library with GPU accel, LangChain support, and OpenAI-compatible AI server. Note, as of time of writing (November 27th 2023), ctransformers has not been updated in a long time and does not support many recent models.
### Explanation of quantisation methods
<details>
<summary>Click to see details</summary>
The new methods available are:
* GGML_TYPE_Q2_K - "type-1" 2-bit quantization in super-blocks containing 16 blocks, each block having 16 weight. Block scales and mins are quantized with 4 bits. This ends up effectively using 2.5625 bits per weight (bpw)
* GGML_TYPE_Q3_K - "type-0" 3-bit quantization in super-blocks containing 16 blocks, each block having 16 weights. Scales are quantized with 6 bits. This end up using 3.4375 bpw.
* GGML_TYPE_Q4_K - "type-1" 4-bit quantization in super-blocks containing 8 blocks, each block having 32 weights. Scales and mins are quantized with 6 bits. This ends up using 4.5 bpw.
* GGML_TYPE_Q5_K - "type-1" 5-bit quantization. Same super-block structure as GGML_TYPE_Q4_K resulting in 5.5 bpw
* GGML_TYPE_Q6_K - "type-0" 6-bit quantization. Super-blocks with 16 blocks, each block having 16 weights. Scales are quantized with 8 bits. This ends up using 6.5625 bpw
## How to download GGUF files
Note for manual downloaders: You almost never want to clone the entire repo! Multiple different quantisation formats are provided, and most users only want to pick and download a single file.
The following clients/libraries will automatically download models for you, providing a list of available models to choose from:
* LM Studio
* LoLLMS Web UI
* URL
### In 'text-generation-webui'
Under Download Model, you can enter the model repo: MaziyarPanahi/natural-sql-7b-GGUF and below it, a specific filename to download, such as: natural-sql-7b-GGUF.Q4_K_M.gguf.
Then click Download.
### On the command line, including multiple files at once
I recommend using the 'huggingface-hub' Python library:
Then you can download any individual model file to the current directory, at high speed, with a command like this:
</details>
<details>
<summary>More advanced huggingface-cli download usage (click to read)</summary>
You can also download multiple files at once with a pattern:
For more documentation on downloading with 'huggingface-cli', please see: HF -> Hub Python Library -> Download files -> Download from the CLI.
To accelerate downloads on fast connections (1Gbit/s or higher), install 'hf_transfer':
And set environment variable 'HF_HUB_ENABLE_HF_TRANSFER' to '1':
Windows Command Line users: You can set the environment variable by running 'set HF_HUB_ENABLE_HF_TRANSFER=1' before the download command.
</details>
## Example 'URL' command
Make sure you are using 'URL' from commit d0cee0d or later.
Change '-ngl 32' to the number of layers to offload to GPU. Remove it if you don't have GPU acceleration.
Change '-c 32768' to the desired sequence length. For extended sequence models - eg 8K, 16K, 32K - the necessary RoPE scaling parameters are read from the GGUF file and set by URL automatically. Note that longer sequence lengths require much more resources, so you may need to reduce this value.
If you want to have a chat-style conversation, replace the '-p <PROMPT>' argument with '-i -ins'
For other parameters and how to use them, please refer to the URL documentation
## How to run in 'text-generation-webui'
Further instructions can be found in the text-generation-webui documentation, here: text-generation-webui/docs/04 ‐ Model URL.
## How to run from Python code
You can use GGUF models from Python using the llama-cpp-python or ctransformers libraries. Note that at the time of writing (Nov 27th 2023), ctransformers has not been updated for some time and is not compatible with some recent models. Therefore I recommend you use llama-cpp-python.
### How to load this model in Python code, using llama-cpp-python
For full documentation, please see: llama-cpp-python docs.
#### First install the package
Run one of the following commands, according to your system:
#### Simple llama-cpp-python example code
## How to use with LangChain
Here are guides on using llama-cpp-python and ctransformers with LangChain:
* LangChain + llama-cpp-python
* LangChain + ctransformers | [
"# MaziyarPanahi/natural-sql-7b-GGUF\n- Model creator: chatdb\n- Original model: chatdb/natural-sql-7b",
"## Description\nMaziyarPanahi/natural-sql-7b-GGUF contains GGUF format model files for chatdb/natural-sql-7b.",
"## How to use\nThanks to TheBloke for preparing an amazing README on how to use GGUF models:",
"### About GGUF\n\nGGUF is a new format introduced by the URL team on August 21st 2023. It is a replacement for GGML, which is no longer supported by URL.\n\nHere is an incomplete list of clients and libraries that are known to support GGUF:\n\n* URL. The source project for GGUF. Offers a CLI and a server option.\n* text-generation-webui, the most widely used web UI, with many features and powerful extensions. Supports GPU acceleration.\n* KoboldCpp, a fully featured web UI, with GPU accel across all platforms and GPU architectures. Especially good for story telling.\n* GPT4All, a free and open source local running GUI, supporting Windows, Linux and macOS with full GPU accel.\n* LM Studio, an easy-to-use and powerful local GUI for Windows and macOS (Silicon), with GPU acceleration. Linux available, in beta as of 27/11/2023.\n* LoLLMS Web UI, a great web UI with many interesting and unique features, including a full model library for easy model selection.\n* URL, an attractive and easy to use character-based chat GUI for Windows and macOS (both Silicon and Intel), with GPU acceleration.\n* llama-cpp-python, a Python library with GPU accel, LangChain support, and OpenAI-compatible API server.\n* candle, a Rust ML framework with a focus on performance, including GPU support, and ease of use.\n* ctransformers, a Python library with GPU accel, LangChain support, and OpenAI-compatible AI server. Note, as of time of writing (November 27th 2023), ctransformers has not been updated in a long time and does not support many recent models.",
"### Explanation of quantisation methods\n\n<details>\n <summary>Click to see details</summary>\n\nThe new methods available are:\n\n* GGML_TYPE_Q2_K - \"type-1\" 2-bit quantization in super-blocks containing 16 blocks, each block having 16 weight. Block scales and mins are quantized with 4 bits. This ends up effectively using 2.5625 bits per weight (bpw)\n* GGML_TYPE_Q3_K - \"type-0\" 3-bit quantization in super-blocks containing 16 blocks, each block having 16 weights. Scales are quantized with 6 bits. This end up using 3.4375 bpw.\n* GGML_TYPE_Q4_K - \"type-1\" 4-bit quantization in super-blocks containing 8 blocks, each block having 32 weights. Scales and mins are quantized with 6 bits. This ends up using 4.5 bpw.\n* GGML_TYPE_Q5_K - \"type-1\" 5-bit quantization. Same super-block structure as GGML_TYPE_Q4_K resulting in 5.5 bpw\n* GGML_TYPE_Q6_K - \"type-0\" 6-bit quantization. Super-blocks with 16 blocks, each block having 16 weights. Scales are quantized with 8 bits. This ends up using 6.5625 bpw",
"## How to download GGUF files\n\nNote for manual downloaders: You almost never want to clone the entire repo! Multiple different quantisation formats are provided, and most users only want to pick and download a single file.\n\nThe following clients/libraries will automatically download models for you, providing a list of available models to choose from:\n\n* LM Studio\n* LoLLMS Web UI\n* URL",
"### In 'text-generation-webui'\n\nUnder Download Model, you can enter the model repo: MaziyarPanahi/natural-sql-7b-GGUF and below it, a specific filename to download, such as: natural-sql-7b-GGUF.Q4_K_M.gguf.\n\nThen click Download.",
"### On the command line, including multiple files at once\n\nI recommend using the 'huggingface-hub' Python library:\n\n\n\nThen you can download any individual model file to the current directory, at high speed, with a command like this:\n\n\n</details>\n<details>\n <summary>More advanced huggingface-cli download usage (click to read)</summary>\n\nYou can also download multiple files at once with a pattern:\n\n\n\nFor more documentation on downloading with 'huggingface-cli', please see: HF -> Hub Python Library -> Download files -> Download from the CLI.\n\nTo accelerate downloads on fast connections (1Gbit/s or higher), install 'hf_transfer':\n\n\n\nAnd set environment variable 'HF_HUB_ENABLE_HF_TRANSFER' to '1':\n\n\n\nWindows Command Line users: You can set the environment variable by running 'set HF_HUB_ENABLE_HF_TRANSFER=1' before the download command.\n</details>",
"## Example 'URL' command\n\nMake sure you are using 'URL' from commit d0cee0d or later.\n\n\n\nChange '-ngl 32' to the number of layers to offload to GPU. Remove it if you don't have GPU acceleration.\n\nChange '-c 32768' to the desired sequence length. For extended sequence models - eg 8K, 16K, 32K - the necessary RoPE scaling parameters are read from the GGUF file and set by URL automatically. Note that longer sequence lengths require much more resources, so you may need to reduce this value.\n\nIf you want to have a chat-style conversation, replace the '-p <PROMPT>' argument with '-i -ins'\n\nFor other parameters and how to use them, please refer to the URL documentation",
"## How to run in 'text-generation-webui'\n\nFurther instructions can be found in the text-generation-webui documentation, here: text-generation-webui/docs/04 ‐ Model URL.",
"## How to run from Python code\n\nYou can use GGUF models from Python using the llama-cpp-python or ctransformers libraries. Note that at the time of writing (Nov 27th 2023), ctransformers has not been updated for some time and is not compatible with some recent models. Therefore I recommend you use llama-cpp-python.",
"### How to load this model in Python code, using llama-cpp-python\n\nFor full documentation, please see: llama-cpp-python docs.",
"#### First install the package\n\nRun one of the following commands, according to your system:",
"#### Simple llama-cpp-python example code",
"## How to use with LangChain\n\nHere are guides on using llama-cpp-python and ctransformers with LangChain:\n\n* LangChain + llama-cpp-python\n* LangChain + ctransformers"
] | [
"TAGS\n#transformers #gguf #mistral #quantized #2-bit #3-bit #4-bit #5-bit #6-bit #8-bit #GGUF #safetensors #llama #text-generation #instruct #finetune #conversational #base_model-deepseek-ai/deepseek-coder-6.7b-instruct #license-cc-by-sa-4.0 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us #base_model-chatdb/natural-sql-7b \n",
"# MaziyarPanahi/natural-sql-7b-GGUF\n- Model creator: chatdb\n- Original model: chatdb/natural-sql-7b",
"## Description\nMaziyarPanahi/natural-sql-7b-GGUF contains GGUF format model files for chatdb/natural-sql-7b.",
"## How to use\nThanks to TheBloke for preparing an amazing README on how to use GGUF models:",
"### About GGUF\n\nGGUF is a new format introduced by the URL team on August 21st 2023. It is a replacement for GGML, which is no longer supported by URL.\n\nHere is an incomplete list of clients and libraries that are known to support GGUF:\n\n* URL. The source project for GGUF. Offers a CLI and a server option.\n* text-generation-webui, the most widely used web UI, with many features and powerful extensions. Supports GPU acceleration.\n* KoboldCpp, a fully featured web UI, with GPU accel across all platforms and GPU architectures. Especially good for story telling.\n* GPT4All, a free and open source local running GUI, supporting Windows, Linux and macOS with full GPU accel.\n* LM Studio, an easy-to-use and powerful local GUI for Windows and macOS (Silicon), with GPU acceleration. Linux available, in beta as of 27/11/2023.\n* LoLLMS Web UI, a great web UI with many interesting and unique features, including a full model library for easy model selection.\n* URL, an attractive and easy to use character-based chat GUI for Windows and macOS (both Silicon and Intel), with GPU acceleration.\n* llama-cpp-python, a Python library with GPU accel, LangChain support, and OpenAI-compatible API server.\n* candle, a Rust ML framework with a focus on performance, including GPU support, and ease of use.\n* ctransformers, a Python library with GPU accel, LangChain support, and OpenAI-compatible AI server. Note, as of time of writing (November 27th 2023), ctransformers has not been updated in a long time and does not support many recent models.",
"### Explanation of quantisation methods\n\n<details>\n <summary>Click to see details</summary>\n\nThe new methods available are:\n\n* GGML_TYPE_Q2_K - \"type-1\" 2-bit quantization in super-blocks containing 16 blocks, each block having 16 weight. Block scales and mins are quantized with 4 bits. This ends up effectively using 2.5625 bits per weight (bpw)\n* GGML_TYPE_Q3_K - \"type-0\" 3-bit quantization in super-blocks containing 16 blocks, each block having 16 weights. Scales are quantized with 6 bits. This end up using 3.4375 bpw.\n* GGML_TYPE_Q4_K - \"type-1\" 4-bit quantization in super-blocks containing 8 blocks, each block having 32 weights. Scales and mins are quantized with 6 bits. This ends up using 4.5 bpw.\n* GGML_TYPE_Q5_K - \"type-1\" 5-bit quantization. Same super-block structure as GGML_TYPE_Q4_K resulting in 5.5 bpw\n* GGML_TYPE_Q6_K - \"type-0\" 6-bit quantization. Super-blocks with 16 blocks, each block having 16 weights. Scales are quantized with 8 bits. This ends up using 6.5625 bpw",
"## How to download GGUF files\n\nNote for manual downloaders: You almost never want to clone the entire repo! Multiple different quantisation formats are provided, and most users only want to pick and download a single file.\n\nThe following clients/libraries will automatically download models for you, providing a list of available models to choose from:\n\n* LM Studio\n* LoLLMS Web UI\n* URL",
"### In 'text-generation-webui'\n\nUnder Download Model, you can enter the model repo: MaziyarPanahi/natural-sql-7b-GGUF and below it, a specific filename to download, such as: natural-sql-7b-GGUF.Q4_K_M.gguf.\n\nThen click Download.",
"### On the command line, including multiple files at once\n\nI recommend using the 'huggingface-hub' Python library:\n\n\n\nThen you can download any individual model file to the current directory, at high speed, with a command like this:\n\n\n</details>\n<details>\n <summary>More advanced huggingface-cli download usage (click to read)</summary>\n\nYou can also download multiple files at once with a pattern:\n\n\n\nFor more documentation on downloading with 'huggingface-cli', please see: HF -> Hub Python Library -> Download files -> Download from the CLI.\n\nTo accelerate downloads on fast connections (1Gbit/s or higher), install 'hf_transfer':\n\n\n\nAnd set environment variable 'HF_HUB_ENABLE_HF_TRANSFER' to '1':\n\n\n\nWindows Command Line users: You can set the environment variable by running 'set HF_HUB_ENABLE_HF_TRANSFER=1' before the download command.\n</details>",
"## Example 'URL' command\n\nMake sure you are using 'URL' from commit d0cee0d or later.\n\n\n\nChange '-ngl 32' to the number of layers to offload to GPU. Remove it if you don't have GPU acceleration.\n\nChange '-c 32768' to the desired sequence length. For extended sequence models - eg 8K, 16K, 32K - the necessary RoPE scaling parameters are read from the GGUF file and set by URL automatically. Note that longer sequence lengths require much more resources, so you may need to reduce this value.\n\nIf you want to have a chat-style conversation, replace the '-p <PROMPT>' argument with '-i -ins'\n\nFor other parameters and how to use them, please refer to the URL documentation",
"## How to run in 'text-generation-webui'\n\nFurther instructions can be found in the text-generation-webui documentation, here: text-generation-webui/docs/04 ‐ Model URL.",
"## How to run from Python code\n\nYou can use GGUF models from Python using the llama-cpp-python or ctransformers libraries. Note that at the time of writing (Nov 27th 2023), ctransformers has not been updated for some time and is not compatible with some recent models. Therefore I recommend you use llama-cpp-python.",
"### How to load this model in Python code, using llama-cpp-python\n\nFor full documentation, please see: llama-cpp-python docs.",
"#### First install the package\n\nRun one of the following commands, according to your system:",
"#### Simple llama-cpp-python example code",
"## How to use with LangChain\n\nHere are guides on using llama-cpp-python and ctransformers with LangChain:\n\n* LangChain + llama-cpp-python\n* LangChain + ctransformers"
] | [
137,
34,
35,
26,
401,
323,
84,
75,
218,
182,
49,
77,
36,
19,
12,
50
] | [
"passage: TAGS\n#transformers #gguf #mistral #quantized #2-bit #3-bit #4-bit #5-bit #6-bit #8-bit #GGUF #safetensors #llama #text-generation #instruct #finetune #conversational #base_model-deepseek-ai/deepseek-coder-6.7b-instruct #license-cc-by-sa-4.0 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us #base_model-chatdb/natural-sql-7b \n# MaziyarPanahi/natural-sql-7b-GGUF\n- Model creator: chatdb\n- Original model: chatdb/natural-sql-7b## Description\nMaziyarPanahi/natural-sql-7b-GGUF contains GGUF format model files for chatdb/natural-sql-7b.## How to use\nThanks to TheBloke for preparing an amazing README on how to use GGUF models:",
"passage: ### About GGUF\n\nGGUF is a new format introduced by the URL team on August 21st 2023. It is a replacement for GGML, which is no longer supported by URL.\n\nHere is an incomplete list of clients and libraries that are known to support GGUF:\n\n* URL. The source project for GGUF. Offers a CLI and a server option.\n* text-generation-webui, the most widely used web UI, with many features and powerful extensions. Supports GPU acceleration.\n* KoboldCpp, a fully featured web UI, with GPU accel across all platforms and GPU architectures. Especially good for story telling.\n* GPT4All, a free and open source local running GUI, supporting Windows, Linux and macOS with full GPU accel.\n* LM Studio, an easy-to-use and powerful local GUI for Windows and macOS (Silicon), with GPU acceleration. Linux available, in beta as of 27/11/2023.\n* LoLLMS Web UI, a great web UI with many interesting and unique features, including a full model library for easy model selection.\n* URL, an attractive and easy to use character-based chat GUI for Windows and macOS (both Silicon and Intel), with GPU acceleration.\n* llama-cpp-python, a Python library with GPU accel, LangChain support, and OpenAI-compatible API server.\n* candle, a Rust ML framework with a focus on performance, including GPU support, and ease of use.\n* ctransformers, a Python library with GPU accel, LangChain support, and OpenAI-compatible AI server. Note, as of time of writing (November 27th 2023), ctransformers has not been updated in a long time and does not support many recent models.### Explanation of quantisation methods\n\n<details>\n <summary>Click to see details</summary>\n\nThe new methods available are:\n\n* GGML_TYPE_Q2_K - \"type-1\" 2-bit quantization in super-blocks containing 16 blocks, each block having 16 weight. Block scales and mins are quantized with 4 bits. This ends up effectively using 2.5625 bits per weight (bpw)\n* GGML_TYPE_Q3_K - \"type-0\" 3-bit quantization in super-blocks containing 16 blocks, each block having 16 weights. Scales are quantized with 6 bits. This end up using 3.4375 bpw.\n* GGML_TYPE_Q4_K - \"type-1\" 4-bit quantization in super-blocks containing 8 blocks, each block having 32 weights. Scales and mins are quantized with 6 bits. This ends up using 4.5 bpw.\n* GGML_TYPE_Q5_K - \"type-1\" 5-bit quantization. Same super-block structure as GGML_TYPE_Q4_K resulting in 5.5 bpw\n* GGML_TYPE_Q6_K - \"type-0\" 6-bit quantization. Super-blocks with 16 blocks, each block having 16 weights. Scales are quantized with 8 bits. This ends up using 6.5625 bpw## How to download GGUF files\n\nNote for manual downloaders: You almost never want to clone the entire repo! Multiple different quantisation formats are provided, and most users only want to pick and download a single file.\n\nThe following clients/libraries will automatically download models for you, providing a list of available models to choose from:\n\n* LM Studio\n* LoLLMS Web UI\n* URL### In 'text-generation-webui'\n\nUnder Download Model, you can enter the model repo: MaziyarPanahi/natural-sql-7b-GGUF and below it, a specific filename to download, such as: natural-sql-7b-GGUF.Q4_K_M.gguf.\n\nThen click Download."
] | [
-0.07280448824167252,
0.12509384751319885,
-0.0027422350831329823,
0.06568942964076996,
0.06087282672524452,
0.04147268086671829,
0.01375606469810009,
0.10628305375576019,
0.10456809401512146,
0.05691404640674591,
0.053854309022426605,
0.04792330041527748,
0.04111950471997261,
0.17708967626094818,
0.09023918956518173,
-0.2021491527557373,
0.03209107741713524,
-0.010824553668498993,
0.007246782537549734,
0.03678085282444954,
0.04678039625287056,
-0.034473881125450134,
0.0683424323797226,
-0.019662082195281982,
-0.028230274096131325,
-0.05703873559832573,
-0.043959490954875946,
0.0014717206358909607,
0.0662771612405777,
0.04982224851846695,
-0.08007271587848663,
-0.04451649636030197,
-0.023386407643556595,
-0.1347077488899231,
0.030125532299280167,
0.03644983097910881,
-0.02071112021803856,
0.042522016912698746,
-0.020186757668852806,
0.02490220218896866,
0.13037395477294922,
-0.07441508769989014,
-0.013206489384174347,
0.054496802389621735,
-0.07257343083620071,
-0.12456387281417847,
-0.10172129422426224,
0.003223162144422531,
0.018412642180919647,
0.05101485550403595,
0.0038165953010320663,
0.02325696311891079,
0.004182368982583284,
0.0394892618060112,
0.2012506127357483,
-0.2321215271949768,
-0.06128593906760216,
0.1250353902578354,
0.05717775970697403,
0.08262316882610321,
-0.09768804162740707,
0.05775050073862076,
0.01009081955999136,
0.004614314064383507,
0.049030058085918427,
-0.03638126328587532,
0.11899617314338684,
-0.019238730892539024,
-0.10869662463665009,
0.0018723942339420319,
0.07538378238677979,
0.00529385544359684,
-0.04820305481553078,
-0.06943604350090027,
-0.0501050166785717,
-0.02272064983844757,
-0.055430665612220764,
0.020292038097977638,
0.00719406409189105,
0.027874138206243515,
0.03698975592851639,
-0.14085379242897034,
-0.020292319357395172,
-0.06360368430614471,
-0.007351721171289682,
0.2374509871006012,
0.01313050091266632,
0.043985992670059204,
0.026895646005868912,
0.1064852774143219,
-0.14117097854614258,
-0.038359906524419785,
-0.10891902446746826,
-0.0006366907618939877,
-0.06218189746141434,
0.030934162437915802,
0.01902749016880989,
0.07564380019903183,
0.05723356455564499,
0.10717421025037766,
-0.07672929763793945,
0.09825490415096283,
0.07125702500343323,
-0.0037144641391932964,
-0.028328213840723038,
0.09697720408439636,
-0.031096789985895157,
-0.1230025589466095,
0.0723024308681488,
0.03375353291630745,
0.08213558048009872,
-0.0363524928689003,
-0.05095047876238823,
-0.013084571808576584,
-0.040889672935009,
0.029384002089500427,
0.03330724686384201,
0.033529046922922134,
-0.02967178449034691,
-0.04579905793070793,
0.18948853015899658,
-0.07563263177871704,
0.03619825467467308,
0.010432757437229156,
-0.045673027634620667,
-0.050871919840574265,
0.026091523468494415,
-0.028757546097040176,
-0.029525360092520714,
-0.049171995371580124,
-0.08767638355493546,
-0.019254367798566818,
-0.07688844203948975,
-0.020237958058714867,
0.033608078956604004,
-0.04061034321784973,
-0.03410295769572258,
-0.06706278026103973,
-0.21646106243133545,
0.035603806376457214,
0.05551883578300476,
-0.029435770586133003,
-0.03438680246472359,
-0.005299463868141174,
-0.012386651709675789,
0.003486345987766981,
0.01778826117515564,
0.05886700749397278,
-0.04268980771303177,
0.024286584928631783,
0.031670644879341125,
0.04641082510352135,
-0.1387653648853302,
-0.003264168044552207,
-0.034925200045108795,
0.0686425268650055,
-0.08618172258138657,
0.12969878315925598,
-0.10744023323059082,
0.036833006888628006,
-0.06420139968395233,
-0.0035264892503619194,
-0.04343413561582565,
-0.034146394580602646,
0.04716607183218002,
0.07104192674160004,
-0.1004573404788971,
-0.05575205758213997,
0.11339707672595978,
-0.11534184217453003,
-0.06262870132923126,
0.12605717778205872,
-0.001273891655728221,
0.012608900666236877,
0.09642641991376877,
0.1081412062048912,
0.19683517515659332,
-0.02237013913691044,
-0.07312922179698944,
0.009996986947953701,
0.04020168259739876,
0.02381070703268051,
0.04766971617937088,
0.015610558912158012,
-0.0711694210767746,
0.06227676197886467,
-0.08092751353979111,
0.057048749178647995,
0.015195013955235481,
-0.06218812242150307,
-0.03894616290926933,
-0.08658407628536224,
0.0732148215174675,
-0.03112293779850006,
-0.01936398819088936,
-0.01082109659910202,
-0.060776881873607635,
-0.06832413375377655,
0.14656925201416016,
-0.023598283529281616,
0.014012714847922325,
-0.0841987133026123,
0.1326652467250824,
-0.05736267566680908,
0.054649293422698975,
-0.038875095546245575,
-0.1020512729883194,
0.07113395631313324,
-0.09530822932720184,
0.07750784605741501,
-0.07668463885784149,
0.065251424908638,
0.06393815577030182,
-0.04320221394300461,
0.015389680862426758,
-0.022839337587356567,
-0.029674066230654716,
-0.056701984256505966,
-0.058872103691101074,
-0.0009022427257150412,
-0.03646291792392731,
0.13480119407176971,
-0.0949038416147232,
0.017042292281985283,
0.06405367702245712,
0.025817451998591423,
0.020735938102006912,
-0.10651887953281403,
0.046195562928915024,
-0.029082097113132477,
0.027170587331056595,
-0.041268810629844666,
0.018312880769371986,
0.02930763177573681,
-0.10340913385152817,
0.03435950726270676,
-0.10785108804702759,
0.00222860649228096,
0.07157250493764877,
0.1263016313314438,
0.028673727065324783,
-0.057290904223918915,
-0.003723042318597436,
-0.02960764616727829,
0.005987547338008881,
-0.031034402549266815,
0.17858175933361053,
-0.013112147338688374,
0.03688113018870354,
-0.05355262756347656,
-0.0035579390823841095,
0.004707402549684048,
0.017313441261649132,
-0.0012402243446558714,
0.05951140075922012,
0.034198544919490814,
-0.07659173011779785,
0.030755335465073586,
0.013933967798948288,
-0.029843226075172424,
0.16695643961429596,
0.03795771300792694,
-0.021287251263856888,
-0.047170352190732956,
0.02235574834048748,
0.005410471465438604,
0.14915010333061218,
-0.1423281878232956,
0.002119404263794422,
0.00403513852506876,
0.02313770353794098,
0.07264608889818192,
-0.10144606977701187,
0.021414678543806076,
-0.041182272136211395,
-0.09933477640151978,
0.05874864012002945,
0.02177407592535019,
-0.09443065524101257,
0.042473968118429184,
0.0549909844994545,
0.06502605229616165,
0.013957136310636997,
0.0016507795080542564,
-0.0780426636338234,
0.13433146476745605,
-0.10661439597606659,
-0.16744953393936157,
-0.0991586223244667,
-0.043194934725761414,
-0.07263612002134323,
-0.004644505213946104,
0.010984916239976883,
-0.0674145370721817,
-0.03639773651957512,
-0.06456869095563889,
-0.00023355367011390626,
0.010196041315793991,
0.0006680861115455627,
0.05866885930299759,
-0.07139293104410172,
0.006654851138591766,
-0.10542640089988708,
0.009536159224808216,
0.008472318761050701,
-0.08703628927469254,
0.02951243333518505,
-0.0008194409310817719,
0.07536822557449341,
0.07944060117006302,
0.0268220454454422,
0.015365324914455414,
-0.0035716297570616007,
0.22992566227912903,
-0.08194649964570999,
0.0909736305475235,
0.1216617003083229,
0.0829661637544632,
0.05863537639379501,
0.007544472813606262,
0.014163065701723099,
-0.07578635215759277,
0.004626011475920677,
0.02885570004582405,
-0.10283180326223373,
-0.1191149428486824,
-0.07040289044380188,
-0.07964903116226196,
0.061100348830223083,
0.010506830178201199,
0.08515360206365585,
-0.04253213480114937,
0.09242689609527588,
-0.014049537479877472,
0.032562438398599625,
0.04056398570537567,
0.05283733457326889,
0.07451067119836807,
0.0047066351398825645,
0.04229342192411423,
-0.05755387246608734,
0.04789331555366516,
0.09238813817501068,
0.12219029664993286,
0.121678426861763,
-0.08580076694488525,
0.19560855627059937,
0.003871308406814933,
0.07704541087150574,
0.008505197241902351,
0.0017374646849930286,
-0.058842435479164124,
0.0033399099484086037,
-0.02758970856666565,
-0.05793609470129013,
-0.07882729917764664,
0.04395154491066933,
0.010020053945481777,
-0.008508753031492233,
0.022983118891716003,
0.0666387751698494,
0.07114504277706146,
0.08967865258455276,
0.014834077097475529,
-0.1479322612285614,
-0.11975820362567902,
0.03528964892029762,
-0.014801612123847008,
-0.05139411240816116,
0.014206633903086185,
0.07400359958410263,
-0.05589870363473892,
0.05229329317808151,
-0.0367928221821785,
0.04592471197247505,
-0.09520594030618668,
-0.014430143870413303,
0.0114523284137249,
0.1644352674484253,
0.00917296577244997,
0.07127565890550613,
-0.15980824828147888,
0.008984221145510674,
0.02675989642739296,
0.06860438734292984,
-0.04597637802362442,
0.008502388373017311,
0.0931413322687149,
-0.019420500844717026,
0.04573987424373627,
0.03542010486125946,
0.03368547558784485,
-0.005961606279015541,
-0.12178798764944077,
0.07079055905342102,
0.024251800030469894,
-0.05372586101293564,
0.06438679993152618,
-0.01855141669511795,
-0.0029461923986673355,
-0.030779123306274414,
-0.027760064229369164,
-0.08228039741516113,
-0.1583249419927597,
0.09920431673526764,
0.044767096638679504,
-0.018276963382959366,
-0.08488104492425919,
-0.023821178823709488,
-0.02668249048292637,
0.1783156394958496,
-0.010517144575715065,
-0.06833184510469437,
-0.08806446194648743,
-0.011542091146111488,
0.11571168154478073,
-0.09056110680103302,
0.02198789268732071,
-0.023978613317012787,
0.09257708489894867,
-0.0385570265352726,
-0.09228251874446869,
0.036208052188158035,
-0.08393999189138412,
-0.10878422111272812,
-0.00843315664678812,
0.11097633838653564,
0.07654940336942673,
0.0360802561044693,
0.0016133924946188927,
-0.0032486533746123314,
-0.016545113176107407,
-0.14487183094024658,
0.04616549238562584,
0.14647448062896729,
-0.14888426661491394,
0.09412871301174164,
-0.018321236595511436,
0.0012810081243515015,
-0.030712146311998367,
-0.020878160372376442,
0.0737857073545456,
0.16958685219287872,
-0.040511615574359894,
0.11497659236192703,
0.14439748227596283,
-0.07064257562160492,
-0.2041420042514801,
-0.0076323822140693665,
-0.002192903310060501,
0.004454305395483971,
-0.08237253129482269,
-0.22792170941829681,
0.09770339727401733,
0.050192154943943024,
-0.039978574961423874,
0.23320338129997253,
-0.22959750890731812,
-0.06908722966909409,
-0.03962863236665726,
0.07343391329050064,
0.20621411502361298,
-0.13356122374534607,
-0.06393901258707047,
-0.0075712651014328,
-0.11444394290447235,
0.07388164848089218,
-0.09085901081562042,
0.12833425402641296,
-0.023933246731758118,
0.07821071147918701,
-0.012063917703926563,
-0.03549833223223686,
0.14713045954704285,
-0.06172909587621689,
-0.00545086432248354,
-0.04395316168665886,
0.044264473021030426,
0.009782008826732635,
-0.06489481776952744,
0.09363889694213867,
-0.09874234348535538,
0.03146031126379967,
-0.03945619612932205,
-0.024272505193948746,
-0.0759110301733017,
0.039520055055618286,
-0.00020193401724100113,
-0.041145507246255875,
-0.09362602978944778,
0.061570778489112854,
0.020034873858094215,
0.051301900297403336,
-0.03663991764187813,
-0.002928616479039192,
0.01827559620141983,
0.0703011155128479,
0.08178496360778809,
-0.15072007477283478,
-0.07817761600017548,
-0.005428733304142952,
-0.015478704124689102,
0.05754941701889038,
-0.09407904744148254,
0.02437547594308853,
0.0794786810874939,
0.0191781185567379,
0.06586968153715134,
0.022725187242031097,
-0.1321527659893036,
0.05283418670296669,
0.046290073543787,
-0.10854342579841614,
-0.1905655860900879,
-0.022318463772535324,
0.03355155140161514,
-0.05772586166858673,
0.05992819741368294,
0.15094730257987976,
-0.004908289294689894,
-0.01674756594002247,
-0.009759925305843353,
0.05739917978644371,
-0.02932865172624588,
0.110979825258255,
0.013229000382125378,
0.009590605273842812,
-0.0998607724905014,
0.048306576907634735,
0.013361021876335144,
-0.00994553416967392,
0.01864892989397049,
0.16307301819324493,
-0.0957413911819458,
-0.08236147463321686,
-0.15572920441627502,
-0.01947280578315258,
-0.036395762115716934,
-0.010957406833767891,
-0.04169788211584091,
-0.05865434929728508,
0.02039509266614914,
0.050137706100940704,
0.024910762906074524,
0.02344929426908493,
-0.012467628344893456,
0.07629841566085815,
-0.02804301679134369,
0.05814274400472641,
-0.03238730505108833,
0.059584878385066986,
-0.11188602447509766,
-0.01211437489837408,
0.008722711354494095,
0.05777899548411369,
-0.02510412596166134,
-0.03286713361740112,
-0.07229437679052353,
-0.02190246433019638,
-0.11698402464389801,
0.029305022209882736,
-0.1252020001411438,
0.025336340069770813,
-0.019658302888274193,
0.013032155111432076,
-0.014799946919083595,
0.048088476061820984,
-0.052252210676670074,
-0.04115757346153259,
-0.040962815284729004,
-0.0015975451096892357,
-0.05833683907985687,
0.012794389389455318,
0.05625097081065178,
-0.042422641068696976,
0.13465538620948792,
0.010204680263996124,
0.01092192716896534,
0.035378795117139816,
-0.12251344323158264,
0.02732960134744644,
0.01150834746658802,
-0.009262054227292538,
-0.010777668096125126,
-0.10799767076969147,
0.031167112290859222,
-0.023132042959332466,
0.022713148966431618,
0.006028778851032257,
0.13856332004070282,
-0.07143107801675797,
-0.013410115614533424,
-0.0523083433508873,
-0.009824506938457489,
-0.01953035593032837,
0.04521095007658005,
0.06566662341356277,
0.025127727538347244,
0.03876543045043945,
-0.04363312944769859,
-0.02860056422650814,
-0.1157439798116684,
-0.0007900448981672525,
0.0005117272958159447,
-0.046450335532426834,
-0.032799359411001205,
-0.014387975446879864,
0.0457640215754509,
-0.00103754922747612,
0.15519419312477112,
-0.04492209106683731,
-0.09297160804271698,
-0.02822546847164631,
-0.07826781272888184,
0.06787759065628052,
-0.004917760379612446,
0.09824208170175552,
0.03821764886379242,
-0.015902068465948105,
0.025974160060286522,
0.04910746589303017,
0.039889439940452576,
0.01706385612487793,
0.05525385960936546,
-0.016292519867420197,
0.07906894385814667,
0.08879061043262482,
0.0010480235796421766,
-0.08138766884803772,
-0.0680391788482666,
0.06658835709095001,
-0.098076730966568,
0.053302522748708725,
-0.051990944892168045,
0.08945406228303909,
0.12932974100112915,
-0.10460200905799866,
0.06348343938589096,
0.0353923961520195,
-0.06709789484739304,
-0.04774806648492813,
-0.12683376669883728,
-0.04714946448802948,
-0.0986827164888382,
-0.005104719195514917,
-0.08207419514656067,
0.00023832544684410095,
0.012621156871318817,
0.012715792283415794,
0.00796060636639595,
0.16004955768585205,
0.009560702368617058,
-0.023147206753492355,
0.04149826988577843,
0.007043525576591492,
-0.04665456712245941,
0.11825603246688843,
-0.05500135198235512,
0.015281259082257748,
-0.015792865306138992,
0.07785893231630325,
0.018528152257204056,
-0.0189496036618948,
0.0782240480184555,
0.015894345939159393,
-0.010010777041316032,
-0.027781477198004723,
0.02658657357096672,
0.023203954100608826,
0.13918635249137878,
0.01993679627776146,
-0.08576725423336029,
0.006419570650905371,
0.13300010561943054,
-0.03202886879444122,
0.01026156172156334,
-0.09674578905105591,
0.11435715109109879,
-0.0638904869556427,
-0.007636759430170059,
-0.04354960471391678,
-0.04740846902132034,
0.029241526499390602,
0.14722777903079987,
0.17634113132953644,
-0.0867030918598175,
-0.004393375478684902,
0.005657580681145191,
-0.006979061756283045,
-0.00944078341126442,
0.10988214612007141,
0.05902411416172981,
0.22776296734809875,
-0.016029449179768562,
0.007363038137555122,
-0.016580447554588318,
-0.02411574497818947,
-0.09164853394031525,
0.008447973057627678,
-0.06074884161353111,
0.06725254654884338,
-0.06069422885775566,
0.013231594115495682,
-0.08376552909612656,
-0.12078840285539627,
-0.03941350057721138,
-0.10927462577819824,
-0.08258506655693054,
-0.024121303111314774,
-0.08016730844974518,
0.025452882051467896,
0.05556847155094147,
0.012420290149748325,
0.02254725620150566,
0.038418445736169815,
0.004938175901770592,
-0.13098518550395966,
-0.05109583958983421,
0.07153724879026413,
0.021222682669758797,
0.23682785034179688,
-0.03021692857146263,
0.005399149842560291,
0.0829932764172554,
-0.03797851875424385,
-0.1218835711479187,
0.08610501885414124,
0.013567084446549416,
-0.09532134979963303,
-0.015165319666266441,
0.10000492632389069,
-0.018600253388285637,
0.07872051000595093,
0.07355558127164841,
0.11411596834659576,
-0.0026747146621346474,
0.03548193350434303,
0.02174813114106655,
-0.06724649667739868,
-0.011005207896232605,
-0.13439041376113892,
0.1574806272983551,
0.12335754930973053,
-0.02111666463315487,
-0.006804045755416155,
-0.051750607788562775,
0.04501309245824814,
-0.017275502905249596,
0.058599747717380524,
-0.05487385019659996,
-0.13854363560676575,
-0.000004720408469438553,
0.0007804166525602341,
0.021795276552438736,
-0.21477779746055603,
-0.07227470725774765,
-0.0356181375682354,
0.008861171081662178,
0.020764091983437538,
0.10484258085489273,
0.09958378970623016,
0.002166854217648506,
-0.034861039370298386,
-0.13657480478286743,
-0.03896918520331383,
0.06081409007310867,
-0.11402127146720886,
-0.08007977157831192
] |
null | null | peft |
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
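Pending the authors' own snippet, here is a minimal, hedged sketch of loading this adapter, based only on the metadata in this card (PEFT adapter repository `hyun5oo/hansoldeco` on base model `NousResearch/Llama-2-7b-chat-hf`). The prompt and generation settings are illustrative assumptions, not documented usage.

```python
# Minimal sketch (not the authors' documented usage): load the base
# Llama-2 chat model, then attach this repository's PEFT adapter on top.
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base_id = "NousResearch/Llama-2-7b-chat-hf"  # from the card metadata
adapter_id = "hyun5oo/hansoldeco"            # this repository

tokenizer = AutoTokenizer.from_pretrained(base_id)
base_model = AutoModelForCausalLM.from_pretrained(base_id)

# PeftModel.from_pretrained wraps the frozen base with the adapter weights.
model = PeftModel.from_pretrained(base_model, adapter_id)

prompt = "Hello!"  # illustrative prompt, not taken from the card
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```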
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed]
### Framework versions
- PEFT 0.8.2 | {"library_name": "peft", "base_model": "NousResearch/Llama-2-7b-chat-hf"} | null | hyun5oo/hansoldeco | [
"peft",
"safetensors",
"arxiv:1910.09700",
"base_model:NousResearch/Llama-2-7b-chat-hf",
"region:us"
] | 2024-02-13T09:50:00+00:00 | [
"1910.09700"
] | [] | TAGS
#peft #safetensors #arxiv-1910.09700 #base_model-NousResearch/Llama-2-7b-chat-hf #region-us
|
# Model Card for Model ID
## Model Details
### Model Description
- Developed by:
- Funded by [optional]:
- Shared by [optional]:
- Model type:
- Language(s) (NLP):
- License:
- Finetuned from model [optional]:
### Model Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Downstream Use [optional]
### Out-of-Scope Use
## Bias, Risks, and Limitations
### Recommendations
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
## Training Details
### Training Data
### Training Procedure
#### Preprocessing [optional]
#### Training Hyperparameters
- Training regime:
#### Speeds, Sizes, Times [optional]
## Evaluation
### Testing Data, Factors & Metrics
#### Testing Data
#### Factors
#### Metrics
### Results
#### Summary
## Model Examination [optional]
## Environmental Impact
Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).
- Hardware Type:
- Hours used:
- Cloud Provider:
- Compute Region:
- Carbon Emitted:
## Technical Specifications [optional]
### Model Architecture and Objective
### Compute Infrastructure
#### Hardware
#### Software
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Model Card Authors [optional]
## Model Card Contact
### Framework versions
- PEFT 0.8.2 | [
"# Model Card for Model ID",
"## Model Details",
"### Model Description\n\n\n\n\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:",
"### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Downstream Use [optional]",
"### Out-of-Scope Use",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.",
"## How to Get Started with the Model\n\nUse the code below to get started with the model.",
"## Training Details",
"### Training Data",
"### Training Procedure",
"#### Preprocessing [optional]",
"#### Training Hyperparameters\n\n- Training regime:",
"#### Speeds, Sizes, Times [optional]",
"## Evaluation",
"### Testing Data, Factors & Metrics",
"#### Testing Data",
"#### Factors",
"#### Metrics",
"### Results",
"#### Summary",
"## Model Examination [optional]",
"## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:",
"## Technical Specifications [optional]",
"### Model Architecture and Objective",
"### Compute Infrastructure",
"#### Hardware",
"#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Model Card Authors [optional]",
"## Model Card Contact",
"### Framework versions\n\n- PEFT 0.8.2"
] | [
"TAGS\n#peft #safetensors #arxiv-1910.09700 #base_model-NousResearch/Llama-2-7b-chat-hf #region-us \n",
"# Model Card for Model ID",
"## Model Details",
"### Model Description\n\n\n\n\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:",
"### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Downstream Use [optional]",
"### Out-of-Scope Use",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.",
"## How to Get Started with the Model\n\nUse the code below to get started with the model.",
"## Training Details",
"### Training Data",
"### Training Procedure",
"#### Preprocessing [optional]",
"#### Training Hyperparameters\n\n- Training regime:",
"#### Speeds, Sizes, Times [optional]",
"## Evaluation",
"### Testing Data, Factors & Metrics",
"#### Testing Data",
"#### Factors",
"#### Metrics",
"### Results",
"#### Summary",
"## Model Examination [optional]",
"## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:",
"## Technical Specifications [optional]",
"### Model Architecture and Objective",
"### Compute Infrastructure",
"#### Hardware",
"#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Model Card Authors [optional]",
"## Model Card Contact",
"### Framework versions\n\n- PEFT 0.8.2"
] | [
43,
6,
3,
54,
28,
3,
4,
9,
9,
10,
42,
20,
3,
4,
5,
9,
11,
13,
3,
12,
5,
4,
5,
3,
4,
9,
53,
9,
8,
6,
3,
14,
8,
7,
9,
4,
11
] | [
"passage: TAGS\n#peft #safetensors #arxiv-1910.09700 #base_model-NousResearch/Llama-2-7b-chat-hf #region-us \n# Model Card for Model ID## Model Details### Model Description\n\n\n\n\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Downstream Use [optional]### Out-of-Scope Use## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.## How to Get Started with the Model\n\nUse the code below to get started with the model.## Training Details### Training Data### Training Procedure#### Preprocessing [optional]#### Training Hyperparameters\n\n- Training regime:#### Speeds, Sizes, Times [optional]## Evaluation### Testing Data, Factors & Metrics#### Testing Data#### Factors#### Metrics### Results#### Summary## Model Examination [optional]## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:## Technical Specifications [optional]### Model Architecture and Objective### Compute Infrastructure#### Hardware#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Model Card Authors [optional]## Model Card Contact### Framework versions\n\n- PEFT 0.8.2"
] | [
-0.11769948899745941,
0.20666998624801636,
-0.002912783296778798,
0.02549395151436329,
0.07785112410783768,
0.015407757833600044,
0.05577832832932472,
0.13303913176059723,
0.03283666446805,
0.11651013046503067,
0.06938543915748596,
0.11774429678916931,
0.1151692196726799,
0.21962304413318634,
0.003263794118538499,
-0.1657102108001709,
0.01971868798136711,
-0.07241468876600266,
0.01743026077747345,
0.11806745082139969,
0.14102990925312042,
-0.09932662546634674,
0.07670142501592636,
-0.020442117005586624,
0.0024542235769331455,
-0.027936438098549843,
-0.06807847321033478,
-0.011055584996938705,
0.05399150028824806,
0.03122783452272415,
0.056819941848516464,
-0.010763264261186123,
0.08520374447107315,
-0.2704300880432129,
0.01883009262382984,
0.04265301674604416,
-0.00045290516573004425,
0.08344653248786926,
0.09688374400138855,
-0.04538474604487419,
0.12346991151571274,
-0.021854383870959282,
0.13367369771003723,
0.09051225334405899,
-0.09567297995090485,
-0.2351798564195633,
-0.06292394548654556,
0.07993721961975098,
0.18764273822307587,
0.08551130443811417,
-0.04316225275397301,
0.12375939637422562,
-0.0640316754579544,
0.022428808733820915,
0.06704075634479523,
-0.10372592508792877,
-0.06345343589782715,
0.06291820853948593,
0.1294797956943512,
0.0773601308465004,
-0.12618185579776764,
-0.037074875086545944,
0.035886481404304504,
0.04580415412783623,
0.0580124706029892,
0.006647665984928608,
0.1484030783176422,
0.028769001364707947,
-0.1454513818025589,
-0.049566421657800674,
0.13674598932266235,
0.010416027158498764,
-0.03749023377895355,
-0.21604330837726593,
-0.00459075253456831,
-0.09522778540849686,
-0.03878160938620567,
-0.04798002541065216,
0.03698987141251564,
0.010453630238771439,
0.13307736814022064,
-0.049591004848480225,
-0.09215915948152542,
-0.014346052892506123,
0.11040274053812027,
0.0616430938243866,
0.02060583047568798,
-0.01945985108613968,
0.008026303723454475,
0.12192189693450928,
0.0676833912730217,
-0.13428759574890137,
-0.06298412382602692,
-0.06815947592258453,
-0.03369535133242607,
-0.024816837161779404,
0.040182024240493774,
0.017229147255420685,
0.0635613352060318,
0.27198895812034607,
-0.04016723483800888,
0.06374870985746384,
0.04097883775830269,
0.022351374849677086,
0.03009030781686306,
0.10533419251441956,
-0.03212955966591835,
-0.16400747001171112,
-0.007433966733515263,
0.10063730925321579,
0.002702203579246998,
-0.03417186439037323,
-0.05627066642045975,
0.03344479948282242,
0.03579871356487274,
0.11764659732580185,
0.10942773520946503,
-0.028066188097000122,
-0.0745202898979187,
-0.05581606552004814,
0.19079482555389404,
-0.15589196979999542,
0.043175265192985535,
0.031009791418910027,
0.0013891590060666203,
-0.06065008044242859,
0.008123516105115414,
0.018420519307255745,
-0.03341829776763916,
0.0739302784204483,
-0.06741747260093689,
-0.0401163212954998,
-0.12049110978841782,
-0.029961997643113136,
0.03624962642788887,
0.009220915846526623,
-0.04452921822667122,
-0.042916469275951385,
-0.07037478685379028,
-0.10976991057395935,
0.1085909754037857,
-0.054557181894779205,
-0.05871255323290825,
-0.028399605304002762,
-0.08273676037788391,
0.018992358818650246,
0.03493666648864746,
0.06826084107160568,
-0.026227839291095734,
0.046194083988666534,
-0.010782663710415363,
0.06776405870914459,
0.06998622417449951,
0.030902881175279617,
-0.0827704519033432,
0.06522461771965027,
-0.19576740264892578,
0.07253402471542358,
-0.08013460040092468,
0.044235534965991974,
-0.1595429927110672,
-0.004312295466661453,
-0.0022420838940888643,
0.029259683564305305,
0.041751157492399216,
0.16127003729343414,
-0.21196487545967102,
-0.03095497004687786,
0.1684923619031906,
-0.10783151537179947,
-0.13275355100631714,
0.040584247559309006,
-0.03692902997136116,
0.18247874081134796,
0.02804495394229889,
0.029673883691430092,
0.08894111216068268,
-0.16022709012031555,
-0.02174060046672821,
-0.018446754664182663,
0.010418129153549671,
0.06808888167142868,
0.08132006227970123,
-0.09663040190935135,
-0.001616360037587583,
0.010858171619474888,
-0.061541199684143066,
-0.01785045862197876,
-0.04080429673194885,
-0.1045517548918724,
0.004818684887140989,
-0.08689999580383301,
0.010899664834141731,
0.005562866572290659,
-0.09412923455238342,
-0.00767026050016284,
-0.15247979760169983,
-0.05846627429127693,
0.08434145152568817,
0.00026128877652809024,
-0.01405352633446455,
-0.09419026970863342,
0.06373747438192368,
-0.03559573367238045,
-0.020782528445124626,
-0.14397205412387848,
-0.015432771295309067,
0.017898816615343094,
-0.13868916034698486,
0.0012420830316841602,
-0.11995251476764679,
0.06763311475515366,
0.004810863174498081,
-0.05048419162631035,
-0.04406342655420303,
-0.002766441088169813,
-0.004278186243027449,
-0.06090925633907318,
-0.23663276433944702,
-0.02428145334124565,
-0.052476897835731506,
0.1713789999485016,
-0.23148222267627716,
0.04160921275615692,
0.0034466448705643415,
0.11964506655931473,
0.0047644018195569515,
-0.058687981218099594,
0.022583601996302605,
-0.06231268495321274,
-0.024701951071619987,
-0.06840142607688904,
-0.0037527058739215136,
0.003462479216977954,
-0.02865241840481758,
0.014165260829031467,
-0.12116673588752747,
-0.06389053910970688,
0.09515070170164108,
0.058769457042217255,
-0.1450631022453308,
0.00842469185590744,
-0.040074050426483154,
-0.056336693465709686,
-0.06754444539546967,
-0.07108866423368454,
0.08409534394741058,
0.05292753130197525,
0.047818623483181,
-0.08274413645267487,
-0.06752345710992813,
0.003514396958053112,
-0.02452346496284008,
-0.013681194745004177,
0.12610596418380737,
0.09137961268424988,
-0.09851912409067154,
0.09228390455245972,
0.07080904394388199,
0.021283060312271118,
0.08558592200279236,
-0.02348261885344982,
-0.10639158636331558,
-0.02593001164495945,
0.05667613446712494,
0.01070303376764059,
0.1701316386461258,
-0.07188218832015991,
0.055811841040849686,
0.047385260462760925,
-0.05746626481413841,
0.04811330884695053,
-0.09233375638723373,
0.006447041407227516,
-0.0029063266701996326,
-0.015782566741108894,
0.036864910274744034,
-0.016450000926852226,
0.004836694337427616,
0.09010760486125946,
0.062471237033605576,
0.021535998210310936,
0.012572001665830612,
-0.0362418070435524,
-0.14193294942378998,
0.1797328144311905,
-0.09205848723649979,
-0.23891016840934753,
-0.15006007254123688,
0.054771315306425095,
0.05779189616441727,
-0.013948877342045307,
0.03144465386867523,
-0.05449340119957924,
-0.09502875059843063,
-0.08760391175746918,
0.004416328854858875,
0.03345770016312599,
-0.06084810197353363,
-0.06309141218662262,
0.03578837960958481,
0.03894244134426117,
-0.12027259171009064,
0.023747729137539864,
0.05629263445734978,
-0.0018340221140533686,
-0.003648567944765091,
0.045919474214315414,
0.09278853237628937,
0.20445209741592407,
-0.002732523949816823,
0.0053982362151145935,
0.05899197608232498,
0.2761322557926178,
-0.15901462733745575,
0.11325082182884216,
0.13837623596191406,
-0.06625627726316452,
0.07702389359474182,
0.1908654421567917,
0.030556995421648026,
-0.09384198486804962,
0.018727079033851624,
0.031007766723632812,
-0.023953305557370186,
-0.27104878425598145,
-0.05058536306023598,
-0.023827584460377693,
-0.07544421404600143,
0.08135921508073807,
0.08835428208112717,
0.09257134795188904,
0.028403934091329575,
-0.06399580091238022,
-0.09893711656332016,
0.02674330212175846,
0.11227049678564072,
-0.017586790025234222,
0.0025482589844614267,
0.07991060614585876,
-0.04866483062505722,
0.004952625837177038,
0.08520778268575668,
-0.02139362134039402,
0.12702924013137817,
0.056118953973054886,
0.1073608547449112,
0.08325479924678802,
0.08240807801485062,
-0.009224953129887581,
0.03056410513818264,
0.0027502768207341433,
0.020547926425933838,
0.020710214972496033,
-0.09094986319541931,
0.01736580580472946,
0.11510791629552841,
0.014805049635469913,
0.020639518275856972,
0.014339569956064224,
-0.059905439615249634,
0.037447262555360794,
0.1929825097322464,
0.03151291236281395,
-0.2053559273481369,
-0.0801534503698349,
0.05455378443002701,
-0.0739559680223465,
-0.15504314005374908,
-0.00788013357669115,
0.014482896775007248,
-0.1574634462594986,
0.018814608454704285,
-0.03978566825389862,
0.10737770050764084,
-0.06571333855390549,
-0.03766518458724022,
0.10156018286943436,
0.047414667904376984,
-0.028234774246811867,
0.04994218423962593,
-0.19223366677761078,
0.10771425813436508,
0.028445864096283913,
0.06718984991312027,
-0.08868084102869034,
0.08744743466377258,
-0.001796784228645265,
-0.011346758343279362,
0.1650870144367218,
-0.0022033178247511387,
-0.06180639937520027,
-0.07702392339706421,
-0.07925916463136673,
-0.005427278578281403,
0.07996804267168045,
-0.13732460141181946,
0.07520841062068939,
-0.0333210825920105,
-0.031404491513967514,
-0.007430676370859146,
-0.086235411465168,
-0.11866632848978043,
-0.16253423690795898,
0.061424531042575836,
-0.08553852140903473,
0.025479501113295555,
-0.08024374395608902,
-0.052194323390722275,
0.03343738615512848,
0.17655520141124725,
-0.2028171271085739,
-0.10914232581853867,
-0.14351201057434082,
-0.10141443461179733,
0.15255947411060333,
-0.04746145382523537,
0.08725551515817642,
-0.007392728701233864,
0.16233710944652557,
0.000411053973948583,
-0.01836213283240795,
0.08401200920343399,
-0.09487809985876083,
-0.18540970981121063,
-0.04660943150520325,
0.18383155763149261,
0.1311776340007782,
0.028439510613679886,
-0.011346815153956413,
0.026449725031852722,
-0.06680743396282196,
-0.10957765579223633,
0.030112503096461296,
0.1476605385541916,
0.06770458072423935,
-0.020437177270650864,
-0.042344409972429276,
-0.09610117226839066,
-0.06520573794841766,
-0.04310684651136398,
-0.002870124764740467,
0.20515766739845276,
-0.07029063999652863,
0.15548402070999146,
0.11205708235502243,
-0.060042425990104675,
-0.21054470539093018,
0.032464709132909775,
0.03981616720557213,
0.016663486137986183,
0.03228053078055382,
-0.1917620599269867,
0.08767081797122955,
-0.02572266198694706,
-0.08159942924976349,
0.1786719262599945,
-0.19226399064064026,
-0.129422128200531,
0.10824183374643326,
0.02104264684021473,
-0.201046884059906,
-0.150085911154747,
-0.10347102582454681,
-0.01812194101512432,
-0.12009748816490173,
0.04840534180402756,
0.008618081919848919,
0.010992096737027168,
0.011450343765318394,
0.020118551328778267,
0.041532836854457855,
-0.04830056428909302,
0.20299124717712402,
-0.04482565075159073,
-0.005569585133343935,
-0.0527876652777195,
-0.07773393392562866,
0.013384186662733555,
-0.054856233298778534,
0.12370224297046661,
-0.015441779978573322,
0.033861491829156876,
-0.16196617484092712,
-0.04311643913388252,
-0.06270512193441391,
0.035143591463565826,
-0.09606029093265533,
-0.0794484093785286,
-0.04419834166765213,
0.08294829726219177,
0.09136927872896194,
-0.012586906552314758,
0.01242639496922493,
-0.09655292332172394,
0.09700454771518707,
0.1995052993297577,
0.19330982863903046,
0.06315502524375916,
-0.053107570856809616,
0.02997264452278614,
-0.038537558168172836,
0.04430471360683441,
-0.21931912004947662,
0.04287564381957054,
0.06498876214027405,
0.026542434468865395,
0.06985615193843842,
-0.005677002016454935,
-0.1625482589006424,
-0.09128525853157043,
0.08836907148361206,
-0.06292731314897537,
-0.17292796075344086,
-0.033785052597522736,
0.041705161333084106,
-0.20931172370910645,
-0.04640975967049599,
0.03935948386788368,
-0.0181092731654644,
-0.041782595217227936,
0.02617095597088337,
0.08081985265016556,
-0.021255910396575928,
0.08439317345619202,
0.09534917026758194,
0.08989959210157394,
-0.09506035596132278,
0.05267556756734848,
0.07946302741765976,
-0.019431734457612038,
0.029825052246451378,
0.13751423358917236,
-0.0364147424697876,
-0.04645836725831032,
0.0798555314540863,
0.12185007333755493,
-0.002486835466697812,
-0.05506465584039688,
0.004287934862077236,
-0.049309078603982925,
0.061294808983802795,
0.12155837565660477,
0.021408192813396454,
-0.01193462684750557,
0.07872650027275085,
0.025506949052214622,
-0.09194063395261765,
0.12346944957971573,
0.04140791669487953,
0.02029072493314743,
-0.03513696417212486,
-0.028924908488988876,
-0.013744531199336052,
-0.0018778513185679913,
-0.014825914986431599,
0.00004693585287895985,
-0.0909915491938591,
0.0014284261269494891,
-0.11594712734222412,
0.01780756004154682,
-0.06718336790800095,
-0.0002576978877186775,
0.028643004596233368,
-0.0489656962454319,
-0.003824668936431408,
-0.005410241428762674,
-0.07838259637355804,
-0.05261590704321861,
-0.021815035492181778,
0.07858611643314362,
-0.13979020714759827,
0.03456014022231102,
0.07484147697687149,
-0.10328766703605652,
0.06876613199710846,
-0.008326759561896324,
0.013081645593047142,
0.008228299207985401,
-0.1439802497625351,
0.056155234575271606,
-0.029309317469596863,
-0.006359034683555365,
0.0010422393679618835,
-0.17944684624671936,
-0.011577526107430458,
-0.042701829224824905,
-0.07143910974264145,
0.013309884816408157,
-0.013215545564889908,
-0.1226518526673317,
0.11009237170219421,
0.008095293305814266,
-0.06616021692752838,
-0.015245208516716957,
0.044449418783187866,
0.07164029777050018,
-0.012409849092364311,
0.10877691954374313,
-0.02684897929430008,
0.083103708922863,
-0.1807156205177307,
-0.00621566828340292,
-0.016833368688821793,
0.05384806543588638,
-0.018549276515841484,
-0.04573789983987808,
0.05623883008956909,
-0.020538190379738808,
0.16466617584228516,
-0.0018338061636313796,
0.0742441937327385,
0.051905106753110886,
0.010930253192782402,
0.04378392919898033,
0.0728876143693924,
0.06468360126018524,
-0.016203518956899643,
-0.004701197147369385,
0.03255317360162735,
-0.0020409130956977606,
-0.045227568596601486,
-0.14094270765781403,
0.07253962010145187,
0.17666760087013245,
0.07048549503087997,
0.02179078198969364,
0.008067925460636616,
-0.1332378387451172,
-0.07408107072114944,
0.10511837154626846,
-0.017402758821845055,
-0.031061973422765732,
-0.06629138439893723,
0.22787198424339294,
0.14990010857582092,
-0.18986721336841583,
0.07560385763645172,
-0.05423163250088692,
-0.03786854073405266,
-0.14348988234996796,
-0.16802245378494263,
-0.05776524171233177,
-0.04911024123430252,
-0.0318753756582737,
-0.05938649922609329,
0.050970252603292465,
0.03954758495092392,
-0.004729952663183212,
-0.02203095331788063,
0.10803087800741196,
0.031586550176143646,
-0.04009048268198967,
0.045863546431064606,
0.060998860746622086,
0.04236721992492676,
-0.09942521899938583,
0.011735196225345135,
0.001886715879663825,
0.008814944885671139,
0.062213458120822906,
0.023173239082098007,
-0.06990323960781097,
0.02930132858455181,
-0.01787971705198288,
-0.12080670148134232,
0.0495670922100544,
-0.007516996935009956,
-0.021949628368020058,
0.14967697858810425,
0.03512033075094223,
0.008099704049527645,
-0.010065858252346516,
0.23994873464107513,
-0.07199644297361374,
-0.0820726528763771,
-0.13058407604694366,
0.08454304188489914,
-0.0638623833656311,
0.023955434560775757,
0.015532204881310463,
-0.12446270138025284,
0.012716526165604591,
0.17904044687747955,
0.11603523045778275,
-0.019778354093432426,
0.013520904816687107,
0.04626742750406265,
0.009430119767785072,
-0.03490632027387619,
0.011960557661950588,
0.055921632796525955,
0.20638400316238403,
-0.07805577665567398,
0.06097545102238655,
-0.017648804932832718,
-0.0689961239695549,
-0.031498104333877563,
0.10827583074569702,
-0.011656714603304863,
-0.01122299861162901,
-0.05968675762414932,
0.14143596589565277,
-0.07639602571725845,
-0.21431203186511993,
0.05089925602078438,
-0.08246009796857834,
-0.13886047899723053,
-0.04927203059196472,
0.027118146419525146,
-0.02602965012192726,
0.005761643406003714,
0.06048549711704254,
-0.05353428050875664,
0.18044669926166534,
0.029145246371626854,
-0.042828578501939774,
-0.09458549320697784,
0.056870587170124054,
-0.16182497143745422,
0.2819679081439972,
0.021850652992725372,
0.0487053208053112,
0.1097458079457283,
-0.021935712546110153,
-0.1319884955883026,
0.015168975107371807,
0.1129152700304985,
-0.0632040724158287,
0.06390555948019028,
0.1606759876012802,
0.0027896345127373934,
0.12182102352380753,
0.06664198637008667,
-0.0592242032289505,
0.035914625972509384,
-0.06755085289478302,
-0.05441083759069443,
-0.11569532752037048,
0.07832225412130356,
-0.0966244786977768,
0.1526871919631958,
0.12093057483434677,
-0.07346441596746445,
-0.0029697499703615904,
-0.020845314487814903,
0.08185786008834839,
0.018558043986558914,
0.10965380072593689,
0.008656207472085953,
-0.1857033669948578,
0.046339020133018494,
0.00887568574398756,
0.09886037558317184,
-0.21062983572483063,
-0.04863942787051201,
0.041914358735084534,
-0.017102444544434547,
-0.08565417677164078,
0.11376497149467468,
0.03838564455509186,
0.01722962036728859,
-0.035149652510881424,
-0.04790586978197098,
0.01729344017803669,
0.15234188735485077,
-0.1053488478064537,
-0.014266646467149258
] |
null | null | peft |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# Influencer_ChatBot
This model is a fine-tuned version of [microsoft/phi-2](https://huggingface.co/microsoft/phi-2) on the generator dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0002
- train_batch_size: 1
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 4
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: constant
- lr_scheduler_warmup_ratio: 0.03
- num_epochs: 3
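As a hedged illustration only, the hyperparameters listed above map onto a TRL `SFTTrainer` setup roughly as below. The dataset preparation, text column name, and output directory are assumptions, since the card only states that the run used the generator dataset; the Adam betas and epsilon match the `TrainingArguments` defaults.

```python
# Reconstruction sketch of the reported configuration (assumptions noted
# inline); this is not the authors' actual training script.
from datasets import Dataset
from transformers import AutoModelForCausalLM, AutoTokenizer, TrainingArguments
from trl import SFTTrainer

model = AutoModelForCausalLM.from_pretrained("microsoft/phi-2")
tokenizer = AutoTokenizer.from_pretrained("microsoft/phi-2")

# Placeholder stand-in for the "generator" dataset referenced in the card.
train_dataset = Dataset.from_dict({"text": ["example training text"]})

args = TrainingArguments(
    output_dir="Influencer_ChatBot",   # assumed
    learning_rate=2e-4,                # learning_rate: 0.0002
    per_device_train_batch_size=1,     # train_batch_size: 1
    per_device_eval_batch_size=8,      # eval_batch_size: 8
    gradient_accumulation_steps=4,     # total train batch size 1 x 4 = 4
    lr_scheduler_type="constant",
    warmup_ratio=0.03,
    num_train_epochs=3,
    seed=42,
)

trainer = SFTTrainer(
    model=model,
    args=args,
    train_dataset=train_dataset,
    dataset_text_field="text",         # assumed column name
    tokenizer=tokenizer,
)
trainer.train()
```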
### Training results
### Framework versions
- PEFT 0.7.2.dev0
- Transformers 4.38.0.dev0
- Pytorch 2.1.2
- Datasets 2.16.1
- Tokenizers 0.15.1 | {"license": "mit", "library_name": "peft", "tags": ["trl", "sft", "generated_from_trainer"], "datasets": ["generator"], "base_model": "microsoft/phi-2", "model-index": [{"name": "Influencer_ChatBot", "results": []}]} | null | Konrad57/Influencer_ChatBot | [
"peft",
"tensorboard",
"safetensors",
"trl",
"sft",
"generated_from_trainer",
"dataset:generator",
"base_model:microsoft/phi-2",
"license:mit",
"region:us"
] | 2024-02-13T09:50:01+00:00 | [] | [] | TAGS
#peft #tensorboard #safetensors #trl #sft #generated_from_trainer #dataset-generator #base_model-microsoft/phi-2 #license-mit #region-us
|
# Influencer_ChatBot
This model is a fine-tuned version of microsoft/phi-2 on the generator dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0002
- train_batch_size: 1
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 4
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: constant
- lr_scheduler_warmup_ratio: 0.03
- num_epochs: 3
### Training results
### Framework versions
- PEFT 0.7.2.dev0
- Transformers 4.38.0.dev0
- Pytorch 2.1.2
- Datasets 2.16.1
- Tokenizers 0.15.1 | [
"# Influencer_ChatBot\n\nThis model is a fine-tuned version of microsoft/phi-2 on the generator dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 0.0002\n- train_batch_size: 1\n- eval_batch_size: 8\n- seed: 42\n- gradient_accumulation_steps: 4\n- total_train_batch_size: 4\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: constant\n- lr_scheduler_warmup_ratio: 0.03\n- num_epochs: 3",
"### Training results",
"### Framework versions\n\n- PEFT 0.7.2.dev0\n- Transformers 4.38.0.dev0\n- Pytorch 2.1.2\n- Datasets 2.16.1\n- Tokenizers 0.15.1"
] | [
"TAGS\n#peft #tensorboard #safetensors #trl #sft #generated_from_trainer #dataset-generator #base_model-microsoft/phi-2 #license-mit #region-us \n",
"# Influencer_ChatBot\n\nThis model is a fine-tuned version of microsoft/phi-2 on the generator dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 0.0002\n- train_batch_size: 1\n- eval_batch_size: 8\n- seed: 42\n- gradient_accumulation_steps: 4\n- total_train_batch_size: 4\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: constant\n- lr_scheduler_warmup_ratio: 0.03\n- num_epochs: 3",
"### Training results",
"### Framework versions\n\n- PEFT 0.7.2.dev0\n- Transformers 4.38.0.dev0\n- Pytorch 2.1.2\n- Datasets 2.16.1\n- Tokenizers 0.15.1"
] | [
51,
29,
6,
12,
8,
3,
128,
4,
44
] | [
"passage: TAGS\n#peft #tensorboard #safetensors #trl #sft #generated_from_trainer #dataset-generator #base_model-microsoft/phi-2 #license-mit #region-us \n# Influencer_ChatBot\n\nThis model is a fine-tuned version of microsoft/phi-2 on the generator dataset.## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 0.0002\n- train_batch_size: 1\n- eval_batch_size: 8\n- seed: 42\n- gradient_accumulation_steps: 4\n- total_train_batch_size: 4\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: constant\n- lr_scheduler_warmup_ratio: 0.03\n- num_epochs: 3### Training results### Framework versions\n\n- PEFT 0.7.2.dev0\n- Transformers 4.38.0.dev0\n- Pytorch 2.1.2\n- Datasets 2.16.1\n- Tokenizers 0.15.1"
] | [
-0.10643191635608673,
0.13555178046226501,
-0.003831558395177126,
0.0785357877612114,
0.10838577151298523,
0.018393341451883316,
0.0632176324725151,
0.1353457272052765,
-0.06118524447083473,
0.10631508380174637,
0.08557117730379105,
0.05670413374900818,
0.06539365649223328,
0.16213105618953705,
-0.011700937524437904,
-0.24129974842071533,
0.026638701558113098,
-0.04520308971405029,
-0.030610844492912292,
0.0994110107421875,
0.10164210945367813,
-0.0755365714430809,
0.06543664634227753,
0.008968083187937737,
-0.11801052838563919,
-0.02232963591814041,
-0.036752283573150635,
-0.043037865310907364,
0.09096734970808029,
0.002652517519891262,
0.04706919938325882,
0.03310569003224373,
0.13165950775146484,
-0.20802289247512817,
0.0035222156438976526,
0.09460950642824173,
0.0404309406876564,
0.09814021736383438,
0.07916388660669327,
0.0034816379193216562,
0.1431274116039276,
-0.09720311313867569,
0.08190690726041794,
0.04463392123579979,
-0.08175001293420792,
-0.1534699648618698,
-0.09349916130304337,
0.06023300439119339,
0.061744172126054764,
0.09262816607952118,
0.019717209041118622,
0.16234321892261505,
-0.06630021333694458,
0.0665464699268341,
0.20679201185703278,
-0.27236512303352356,
-0.042602889239788055,
0.07718265056610107,
0.04808536544442177,
0.06089591234922409,
-0.11411481350660324,
-0.031183337792754173,
0.024152830243110657,
0.03164166584610939,
0.09709745645523071,
0.01733569987118244,
-0.012955584563314915,
-0.020730920135974884,
-0.11107305437326431,
-0.04631560295820236,
0.11212881654500961,
0.05133025720715523,
-0.038216810673475266,
-0.11381549388170242,
-0.07341931015253067,
-0.13887614011764526,
-0.011262757703661919,
-0.00033042277209460735,
0.01437441073358059,
-0.03950713947415352,
-0.03942325711250305,
-0.03305475041270256,
-0.05475515127182007,
-0.06721224635839462,
0.012172379530966282,
0.0848233625292778,
0.05969269201159477,
0.04092591255903244,
0.006294586695730686,
0.1168999895453453,
0.0013582840329036117,
-0.12237658351659775,
-0.03382867947220802,
-0.020944299176335335,
-0.11096812039613724,
-0.016832556575536728,
-0.020478764548897743,
0.03120252676308155,
0.021319344639778137,
0.12715232372283936,
-0.08098353445529938,
0.07751619070768356,
0.05009114742279053,
0.0005045648431405425,
-0.0067314086481928825,
0.1440654844045639,
-0.04799458384513855,
-0.030787214636802673,
0.011583717539906502,
0.10079176723957062,
0.034532785415649414,
-0.01100178062915802,
-0.07797162234783173,
-0.0083362627774477,
0.09384988993406296,
0.05424756929278374,
-0.01653565652668476,
0.03095332160592079,
-0.05172843113541603,
-0.03140750527381897,
0.06583152711391449,
-0.1202184334397316,
0.04561038315296173,
0.02666557766497135,
-0.07575978338718414,
-0.035765647888183594,
0.03617008775472641,
-0.028913695365190506,
-0.056328464299440384,
0.05150185897946358,
-0.0687805563211441,
-0.018348848447203636,
-0.0764533132314682,
-0.03820769116282463,
0.005382477771490812,
-0.10210034251213074,
-0.023348025977611542,
-0.06485386192798615,
-0.18095874786376953,
-0.0375526137650013,
0.04335637763142586,
-0.09907478094100952,
-0.07342017441987991,
-0.00843875203281641,
-0.07425812631845474,
0.011747400276362896,
-0.021452391520142555,
0.10519517213106155,
-0.0652032122015953,
0.0849594995379448,
-0.007221195846796036,
0.029937779530882835,
0.03551552817225456,
0.04260807856917381,
-0.07177434116601944,
0.03378617763519287,
-0.13505065441131592,
0.0936087816953659,
-0.0967085212469101,
-0.001310312538407743,
-0.11991002410650253,
-0.0963648334145546,
-0.011394117027521133,
-0.036729682236909866,
0.061128005385398865,
0.13127659261226654,
-0.16112422943115234,
-0.003630684921517968,
0.1430233120918274,
-0.08129885792732239,
-0.06038773059844971,
0.0738978236913681,
-0.015093070454895496,
-0.020799078047275543,
0.033383507281541824,
0.13877281546592712,
0.08923829346895218,
-0.1467529833316803,
0.0020058010704815388,
0.031073274090886116,
0.033314354717731476,
-0.00431103864684701,
0.05692768841981888,
-0.020740602165460587,
0.05304069444537163,
0.03901047632098198,
-0.03133177384734154,
-0.007863946259021759,
-0.07698184996843338,
-0.05183485522866249,
-0.06684809923171997,
-0.07467693090438843,
-0.011017466895282269,
0.02931729331612587,
-0.0013954362366348505,
-0.05805746838450432,
-0.1278902143239975,
0.05134153366088867,
0.14617836475372314,
-0.030279017984867096,
0.0023835592437535524,
-0.09257852286100388,
0.0472729317843914,
-0.006034984719008207,
-0.017422420904040337,
-0.1899672895669937,
-0.11401496082544327,
0.05440320074558258,
-0.06722577661275864,
0.02346346713602543,
0.03823498263955116,
0.06314082443714142,
0.08115134388208389,
-0.0296931229531765,
-0.04151354730129242,
-0.07674776762723923,
-0.0005247303633950651,
-0.08155787736177444,
-0.17718283832073212,
-0.05582199618220329,
-0.024583352729678154,
0.172835573554039,
-0.19115303456783295,
0.007718778681010008,
-0.008609781973063946,
0.15722692012786865,
0.02055640146136284,
-0.08302825689315796,
0.02313970774412155,
-0.0016110518481582403,
-0.008492575958371162,
-0.10122184455394745,
0.03102531097829342,
0.020811403170228004,
-0.06495871394872665,
-0.01300390250980854,
-0.12627269327640533,
0.03898944333195686,
0.07222945988178253,
0.16496625542640686,
-0.07349450886249542,
-0.031018033623695374,
-0.08866211771965027,
-0.03760978952050209,
-0.06976456940174103,
0.0034550982527434826,
0.15452885627746582,
0.04135784134268761,
0.10919494926929474,
-0.06928713619709015,
-0.06817814707756042,
0.013963610865175724,
0.005680576432496309,
-0.006155149545520544,
0.09860660880804062,
0.062455881386995316,
-0.06585165858268738,
0.07678312808275223,
0.030312787741422653,
0.0010753126116469502,
0.09261952340602875,
-0.05869634076952934,
-0.1014145091176033,
-0.016216255724430084,
0.02753119356930256,
0.0021027529146522284,
0.14015264809131622,
-0.038080744445323944,
0.048542365431785583,
0.05427401885390282,
0.026806630194187164,
0.026825718581676483,
-0.1548989862203598,
0.004417109303176403,
0.007431322243064642,
-0.055692173540592194,
-0.015577406622469425,
-0.007372591644525528,
0.04362364858388901,
0.07950223982334137,
0.004193476866930723,
-0.04928469657897949,
-0.004361005499958992,
-0.01588895171880722,
-0.06475891917943954,
0.1652381420135498,
-0.09766030311584473,
-0.14355634152889252,
-0.11315519362688065,
0.05590484291315079,
-0.061532218009233475,
-0.029627254232764244,
0.009682678617537022,
-0.04643839970231056,
-0.06360618025064468,
-0.11479510366916656,
-0.006500014569610357,
-0.009104321710765362,
0.00017916786600835621,
0.03372827544808388,
0.031952422112226486,
0.10717052966356277,
-0.12883295118808746,
0.0034819277934730053,
-0.026359429582953453,
-0.08310697227716446,
0.033376678824424744,
0.05779262259602547,
0.0679069384932518,
0.11030815541744232,
0.004832087084650993,
0.029982415959239006,
-0.003129767719656229,
0.22322233021259308,
-0.06835444271564484,
0.02506144531071186,
0.13607804477214813,
0.034653615206480026,
0.06312157213687897,
0.11392538249492645,
0.015290061011910439,
-0.08718925714492798,
0.021926196292042732,
0.05358697474002838,
-0.030867302790284157,
-0.2385534644126892,
-0.06707549095153809,
-0.024528048932552338,
-0.03290329873561859,
0.08161371946334839,
0.07487452030181885,
-0.054991062730550766,
0.025251159444451332,
-0.01888367533683777,
-0.03652047738432884,
0.013868259266018867,
0.06386133283376694,
0.0217174980789423,
0.03164253383874893,
0.0828152522444725,
-0.024522870779037476,
0.011766280978918076,
0.07610093802213669,
0.009818649850785732,
0.22992001473903656,
-0.06940814107656479,
0.11756382882595062,
0.01802751235663891,
0.16569823026657104,
-0.02139810100197792,
0.04305104911327362,
0.010266335681080818,
0.011637268587946892,
0.0038495229091495275,
-0.054411157965660095,
-0.04867726191878319,
0.022515948861837387,
0.008784817531704903,
0.026510749012231827,
-0.06418158859014511,
0.06300585716962814,
0.021884938701987267,
0.269947350025177,
0.06391812115907669,
-0.3072666823863983,
-0.0754900574684143,
-0.00551684619858861,
-0.021930236369371414,
-0.07106968760490417,
0.011743885464966297,
0.11665622144937515,
-0.1379430592060089,
0.050209302455186844,
-0.05298773571848869,
0.08234160393476486,
-0.06586223095655441,
0.0002657401200849563,
0.04816606640815735,
0.15549463033676147,
0.013414145447313786,
0.08294469863176346,
-0.15698933601379395,
0.1749294549226761,
0.0026769167743623257,
0.09186315536499023,
-0.05643318593502045,
0.03125617653131485,
-0.009534618817269802,
0.04896731674671173,
0.09696955233812332,
0.0045752874575555325,
-0.036648765206336975,
-0.13414745032787323,
-0.15624605119228363,
0.01791890151798725,
0.09326157718896866,
-0.023274661973118782,
0.07227063924074173,
-0.04825560376048088,
-0.0028112444560974836,
0.018350092694163322,
-0.09475452452898026,
-0.1185518205165863,
-0.13263854384422302,
0.023485902696847916,
-0.026414604857563972,
-0.038781967014074326,
-0.08730490505695343,
-0.08508733659982681,
-0.010432496666908264,
0.19655880331993103,
0.01632213406264782,
-0.06588271260261536,
-0.1538812816143036,
0.058758754283189774,
0.14882463216781616,
-0.0662614107131958,
0.011888577602803707,
-0.017216451466083527,
0.13943101465702057,
0.041223250329494476,
-0.08245591819286346,
0.06156286597251892,
-0.06831075996160507,
-0.1844959706068039,
-0.04908592253923416,
0.1564091295003891,
0.0454123318195343,
0.048953551799058914,
-0.008317381143569946,
0.0189322829246521,
-0.004149254411458969,
-0.09244489669799805,
0.034862712025642395,
0.09472212195396423,
0.053285256028175354,
0.038373399525880814,
-0.0664386972784996,
0.047853920608758926,
-0.010356183163821697,
-0.023040147498250008,
0.09905604273080826,
0.22466373443603516,
-0.09780159592628479,
0.1142660453915596,
0.04888872429728508,
-0.029467349871993065,
-0.19118168950080872,
0.0009940206073224545,
0.11860424280166626,
0.014984525740146637,
0.05599598214030266,
-0.15872269868850708,
0.07579183578491211,
0.11112792044878006,
-0.041255928575992584,
0.06998273730278015,
-0.33111172914505005,
-0.1202995702624321,
0.0819772258400917,
0.08492447435855865,
-0.008069139905273914,
-0.12779302895069122,
-0.057186223566532135,
-0.02487805299460888,
-0.13992293179035187,
0.07605372369289398,
-0.08059427887201309,
0.11106759309768677,
-0.02494189888238907,
0.06485936790704727,
0.031960003077983856,
-0.049057889729738235,
0.1323259174823761,
0.01567019522190094,
0.045512404292821884,
-0.057420045137405396,
0.08605767041444778,
0.02343466505408287,
-0.06698553264141083,
0.07721270620822906,
-0.05729427933692932,
0.09404050558805466,
-0.1307395100593567,
-0.010205412283539772,
-0.08180993050336838,
0.06762956827878952,
-0.04360711947083473,
-0.053570326417684555,
-0.03570839390158653,
0.047945015132427216,
0.05989156290888786,
-0.0448700375854969,
0.044507309794425964,
0.023661835119128227,
0.05712128430604935,
0.19071975350379944,
0.034415971487760544,
-0.018257422372698784,
-0.15018850564956665,
-0.014455782249569893,
0.0013762470334768295,
0.024981755763292313,
-0.07616399228572845,
0.012792926281690598,
0.10896191745996475,
0.027779171243309975,
0.10449386388063431,
0.013990662060678005,
-0.09025388211011887,
-0.031045222654938698,
0.044909410178661346,
-0.11073552817106247,
-0.1992521435022354,
0.015264648012816906,
0.03785423934459686,
-0.10946332663297653,
0.027225201949477196,
0.09212658554315567,
-0.03864019364118576,
-0.017705827951431274,
-0.008296182379126549,
0.05509030073881149,
0.006314529571682215,
0.17949822545051575,
0.022450124844908714,
0.07798595726490021,
-0.10448622703552246,
0.15722069144248962,
0.06408603489398956,
-0.058978449553251266,
0.03460745885968208,
0.08416373282670975,
-0.11313189566135406,
-0.004256204701960087,
0.1013554260134697,
0.07049008458852768,
-0.012660881504416466,
-0.02797025442123413,
-0.07367292046546936,
-0.11238057166337967,
0.057597555220127106,
0.017582453787326813,
0.03859350085258484,
0.01045627798885107,
-0.008526943624019623,
0.0035438125487416983,
-0.11931309849023819,
0.09058135747909546,
0.062714122235775,
0.060594405978918076,
-0.13716194033622742,
0.05059453845024109,
0.012550227344036102,
0.046032920479774475,
-0.01678503304719925,
0.013585483655333519,
-0.08655531704425812,
-0.025157541036605835,
-0.09958989918231964,
0.0034389575012028217,
-0.03632018715143204,
-0.0009956912836059928,
-0.01740036904811859,
-0.05929328501224518,
-0.01332983747124672,
0.056120432913303375,
-0.07439888268709183,
-0.08152125030755997,
-0.029948487877845764,
0.051686644554138184,
-0.12638050317764282,
0.0019970994908362627,
0.026570003479719162,
-0.11129169166088104,
0.11633799225091934,
0.072502501308918,
0.01459553837776184,
0.022636694833636284,
-0.06463582068681717,
-0.024679910391569138,
0.020511353388428688,
0.03291107341647148,
0.08057411015033722,
-0.10957836359739304,
-0.00934918038547039,
-0.014812033623456955,
-0.007903028279542923,
-0.010681734420359135,
0.061798129230737686,
-0.14653152227401733,
-0.02485167421400547,
-0.07051806151866913,
-0.042404141277074814,
-0.06185189634561539,
0.022412842139601707,
0.08047080785036087,
0.03849304839968681,
0.17255878448486328,
-0.07495275884866714,
0.03415945917367935,
-0.2103966325521469,
-0.022649167105555534,
0.006316498853266239,
-0.01182405836880207,
-0.06828685104846954,
0.006438469048589468,
0.09227654337882996,
-0.029209084808826447,
0.10100261867046356,
-0.019223755225539207,
0.03176295757293701,
0.035774629563093185,
-0.0313141904771328,
-0.014591029845178127,
-0.0018085155170410872,
0.10833557695150375,
0.04603312909603119,
0.008084062486886978,
0.11316405981779099,
-0.05001257359981537,
0.046553611755371094,
0.029884304851293564,
0.18029369413852692,
0.14601142704486847,
0.02956634759902954,
0.058607108891010284,
0.08659929037094116,
-0.11894736438989639,
-0.1382787823677063,
0.12937411665916443,
-0.04981667175889015,
0.11699295043945312,
-0.05860894173383713,
0.17327602207660675,
0.10232333838939667,
-0.16067227721214294,
0.04429446533322334,
-0.05563284829258919,
-0.10183786600828171,
-0.1104772537946701,
-0.06216417998075485,
-0.08163335174322128,
-0.13271531462669373,
0.027732999995350838,
-0.09003698080778122,
0.0331859216094017,
0.08594727516174316,
0.01674000732600689,
0.01299404725432396,
0.11590622365474701,
-0.017498988658189774,
-0.0029755267314612865,
0.011537670157849789,
0.040741439908742905,
-0.013233061879873276,
-0.0020721221808344126,
-0.05710717663168907,
0.03509337827563286,
0.029244160279631615,
0.09589237719774246,
-0.007271275855600834,
0.006215906701982021,
0.04839864745736122,
0.003212637733668089,
-0.06613175570964813,
0.022452641278505325,
-0.0017791838617995381,
0.011001050472259521,
0.08423636853694916,
0.061738960444927216,
0.006380851846188307,
-0.05477525293827057,
0.2821008563041687,
-0.07339268177747726,
-0.07840939611196518,
-0.12502703070640564,
0.22421564161777496,
0.03259305655956268,
-0.0140460804104805,
0.07383490353822708,
-0.12491869181394577,
-0.05370469391345978,
0.140993133187294,
0.16170236468315125,
-0.07394204288721085,
-0.03181831166148186,
-0.02731143683195114,
-0.01907750405371189,
-0.0506296344101429,
0.1120012104511261,
0.11224368959665298,
0.08011145889759064,
-0.053660064935684204,
0.01748514175415039,
-0.014698791317641735,
-0.029705705121159554,
-0.09360334277153015,
0.08003909885883331,
-0.013457566499710083,
0.0160800963640213,
-0.04471958801150322,
0.04906611144542694,
-0.027079906314611435,
-0.2260291576385498,
0.015729157254099846,
-0.13152487576007843,
-0.1848626732826233,
-0.024100763723254204,
0.08167659491300583,
-0.01506178267300129,
0.07737137377262115,
-0.009428191930055618,
0.01897048018872738,
0.11332661658525467,
-0.026751955971121788,
-0.07090495526790619,
-0.07716018706560135,
0.06708426028490067,
-0.05353992059826851,
0.23253268003463745,
0.002126684645190835,
0.05238521099090576,
0.09593286365270615,
0.00858357734978199,
-0.18452773988246918,
0.05801152065396309,
0.055128708481788635,
-0.09598454087972641,
0.02138601988554001,
0.14921890199184418,
-0.03307857736945152,
0.049328453838825226,
0.04284728690981865,
-0.09853928536176682,
-0.03872387111186981,
-0.033171992748975754,
-0.006244914140552282,
-0.062277406454086304,
0.00294860964640975,
-0.04357385262846947,
0.15268877148628235,
0.19568589329719543,
-0.04075337201356888,
0.006478544790297747,
-0.09022369235754013,
0.03179357573390007,
0.02204301208257675,
0.04914802685379982,
0.00513506168499589,
-0.203229621052742,
0.03489521145820618,
0.0185459665954113,
0.030055729672312737,
-0.18938077986240387,
-0.08249988406896591,
0.023500878363847733,
-0.053576622158288956,
-0.061709221452474594,
0.1347023993730545,
0.030686357989907265,
0.03585171699523926,
-0.03539453446865082,
-0.03357121720910072,
-0.040243301540613174,
0.12784413993358612,
-0.16177091002464294,
-0.05354189872741699
] |
null | null | transformers |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# DH_DOOR_BOT
This model is a fine-tuned version of [ntu-spml/distilhubert](https://huggingface.co/ntu-spml/distilhubert) on the audiofolder dataset.
It achieves the following results on the evaluation set:
- Loss: 0.1345
- Accuracy: 0.9565
## Model description
More information needed
## Intended uses & limitations
More information needed
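A minimal inference sketch (the repository id matches this card; the audio path is a placeholder, and the returned labels depend on the class folders used at training time):

```python
from transformers import pipeline

# Load the fine-tuned checkpoint for audio classification.
classifier = pipeline("audio-classification", model="iamhack/DH_DOOR_BOT")

# Score a local audio clip (the file name here is illustrative).
predictions = classifier("example_door_sound.wav")
print(predictions)  # [{'label': ..., 'score': ...}, ...]
```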
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (an equivalent `TrainingArguments` sketch follows the list):
- learning_rate: 3e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- distributed_type: tpu
- gradient_accumulation_steps: 4
- total_train_batch_size: 128
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 5
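For reference, a minimal sketch of equivalent `TrainingArguments` (standard `transformers` Trainer API; model and dataset wiring are omitted, and the TPU launch implied by `distributed_type: tpu` is handled by the launcher rather than by these arguments):

```python
from transformers import TrainingArguments

# Mirrors the hyperparameters listed above.
training_args = TrainingArguments(
    output_dir="DH_DOOR_BOT",
    learning_rate=3e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    gradient_accumulation_steps=4,  # 32 x 4 = 128 effective train batch size
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=5,
)
```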
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.2536 | 1.0 | 423 | 0.2130 | 0.9297 |
| 0.1807 | 2.0 | 847 | 0.1698 | 0.9438 |
| 0.1613 | 3.0 | 1270 | 0.1642 | 0.9457 |
| 0.1447 | 4.0 | 1694 | 0.1372 | 0.9561 |
| 0.1348 | 4.99 | 2115 | 0.1345 | 0.9565 |
### Framework versions
- Transformers 4.37.2
- Pytorch 2.0.0+cu118
- Datasets 2.17.0
- Tokenizers 0.15.1
| {"license": "apache-2.0", "tags": ["generated_from_trainer"], "datasets": ["audiofolder"], "metrics": ["accuracy"], "base_model": "ntu-spml/distilhubert", "model-index": [{"name": "DH_DOOR_BOT", "results": [{"task": {"type": "audio-classification", "name": "Audio Classification"}, "dataset": {"name": "audiofolder", "type": "audiofolder", "config": "default", "split": "train", "args": "default"}, "metrics": [{"type": "accuracy", "value": 0.956539391366933, "name": "Accuracy"}]}]}]} | audio-classification | iamhack/DH_DOOR_BOT | [
"transformers",
"tensorboard",
"safetensors",
"hubert",
"audio-classification",
"generated_from_trainer",
"dataset:audiofolder",
"base_model:ntu-spml/distilhubert",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"region:us"
] | 2024-02-13T09:51:26+00:00 | [] | [] | TAGS
#transformers #tensorboard #safetensors #hubert #audio-classification #generated_from_trainer #dataset-audiofolder #base_model-ntu-spml/distilhubert #license-apache-2.0 #model-index #endpoints_compatible #region-us
| DH\_DOOR\_BOT
=============
This model is a fine-tuned version of ntu-spml/distilhubert on the audiofolder dataset.
It achieves the following results on the evaluation set:
* Loss: 0.1345
* Accuracy: 0.9565
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 3e-05
* train\_batch\_size: 32
* eval\_batch\_size: 32
* seed: 42
* distributed\_type: tpu
* gradient\_accumulation\_steps: 4
* total\_train\_batch\_size: 128
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* lr\_scheduler\_warmup\_ratio: 0.1
* num\_epochs: 5
### Training results
### Framework versions
* Transformers 4.37.2
* Pytorch 2.0.0+cu118
* Datasets 2.17.0
* Tokenizers 0.15.1
| [
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 3e-05\n* train\\_batch\\_size: 32\n* eval\\_batch\\_size: 32\n* seed: 42\n* distributed\\_type: tpu\n* gradient\\_accumulation\\_steps: 4\n* total\\_train\\_batch\\_size: 128\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_ratio: 0.1\n* num\\_epochs: 5",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.37.2\n* Pytorch 2.0.0+cu118\n* Datasets 2.17.0\n* Tokenizers 0.15.1"
] | [
"TAGS\n#transformers #tensorboard #safetensors #hubert #audio-classification #generated_from_trainer #dataset-audiofolder #base_model-ntu-spml/distilhubert #license-apache-2.0 #model-index #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 3e-05\n* train\\_batch\\_size: 32\n* eval\\_batch\\_size: 32\n* seed: 42\n* distributed\\_type: tpu\n* gradient\\_accumulation\\_steps: 4\n* total\\_train\\_batch\\_size: 128\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_ratio: 0.1\n* num\\_epochs: 5",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.37.2\n* Pytorch 2.0.0+cu118\n* Datasets 2.17.0\n* Tokenizers 0.15.1"
] | [
76,
153,
4,
33
] | [
"passage: TAGS\n#transformers #tensorboard #safetensors #hubert #audio-classification #generated_from_trainer #dataset-audiofolder #base_model-ntu-spml/distilhubert #license-apache-2.0 #model-index #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 3e-05\n* train\\_batch\\_size: 32\n* eval\\_batch\\_size: 32\n* seed: 42\n* distributed\\_type: tpu\n* gradient\\_accumulation\\_steps: 4\n* total\\_train\\_batch\\_size: 128\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_ratio: 0.1\n* num\\_epochs: 5### Training results### Framework versions\n\n\n* Transformers 4.37.2\n* Pytorch 2.0.0+cu118\n* Datasets 2.17.0\n* Tokenizers 0.15.1"
] | [
-0.11516498774290085,
0.14189639687538147,
-0.002718420000746846,
0.08152533322572708,
0.12329348921775818,
0.02373158000409603,
0.1048499271273613,
0.0997847244143486,
-0.0922706127166748,
0.10584039241075516,
0.07079014927148819,
0.11486360430717468,
0.05240238830447197,
0.12188310921192169,
-0.02485818602144718,
-0.2859746813774109,
0.0026512648910284042,
-0.010320671834051609,
-0.1430845558643341,
0.11430249363183975,
0.06811748445034027,
-0.09429628401994705,
0.06047334149479866,
-0.01621934585273266,
-0.1225433200597763,
-0.008571243844926357,
-0.018019801005721092,
-0.06126777455210686,
0.08037998527288437,
0.029233964160084724,
0.07866379618644714,
0.04695061594247818,
0.1061750203371048,
-0.21185073256492615,
0.015789467841386795,
0.08372584730386734,
0.0004889114643447101,
0.07755536586046219,
0.1346844881772995,
-0.016403259709477425,
0.14877010881900787,
-0.08611240237951279,
0.054699186235666275,
0.045831095427274704,
-0.08638525754213333,
-0.22301259636878967,
-0.07573714852333069,
0.09930378198623657,
0.09752090275287628,
0.07072434574365616,
-0.03197391331195831,
0.08972112089395523,
-0.08570515364408493,
0.08094918727874756,
0.24399204552173615,
-0.2646205723285675,
-0.062593474984169,
0.028877966105937958,
0.04207489266991615,
0.07281443476676941,
-0.10918711870908737,
-0.022923672571778297,
0.05024318769574165,
0.020872928202152252,
0.11740456521511078,
0.006322247441858053,
0.03708425909280777,
0.002611661097034812,
-0.1717219352722168,
-0.021821435540914536,
0.12169026583433151,
0.08731945604085922,
-0.02824249118566513,
-0.05876884609460831,
-0.04234219342470169,
-0.2175440788269043,
-0.024935219436883926,
0.004806957207620144,
0.033362362533807755,
-0.05187629908323288,
-0.086299367249012,
0.04839117452502251,
-0.04280158132314682,
-0.08322016894817352,
0.022954396903514862,
0.14509135484695435,
0.06995192915201187,
-0.03511400893330574,
0.042034439742565155,
0.11542902886867523,
0.06978017836809158,
-0.14421634376049042,
0.02671806700527668,
0.022644976153969765,
-0.09976272284984589,
-0.03287205472588539,
-0.006217179819941521,
-0.018679365515708923,
0.019144147634506226,
0.14479538798332214,
-0.041960231959819794,
0.04631764441728592,
0.05407760664820671,
0.03235793486237526,
-0.05398797243833542,
0.10418777167797089,
-0.1097501888871193,
-0.09974538534879684,
-0.020661210641264915,
0.10877934098243713,
0.029164057224988937,
-0.006161347962915897,
-0.0802302435040474,
0.0382620245218277,
0.12145145237445831,
0.024208448827266693,
-0.01861736737191677,
0.030648883432149887,
-0.08164861798286438,
-0.053803738206624985,
0.031346626579761505,
-0.08908607810735703,
0.028234949335455894,
0.027417894452810287,
-0.038523729890584946,
0.01979137398302555,
0.012391719967126846,
0.007250412832945585,
-0.019974282011389732,
0.15725009143352509,
-0.09351300448179245,
-0.03260338306427002,
-0.06476187705993652,
-0.08721262216567993,
0.04792394861578941,
-0.0937253087759018,
0.015751462429761887,
-0.08388655632734299,
-0.0696859210729599,
-0.03252862021327019,
0.07288803905248642,
-0.04292076453566551,
-0.09855318069458008,
-0.04521072655916214,
-0.09922460466623306,
0.037119120359420776,
-0.018674267455935478,
0.10749940574169159,
-0.07166540622711182,
0.113226018846035,
-0.02216440439224243,
0.0626557469367981,
0.06165645271539688,
0.0713142678141594,
-0.04482756555080414,
0.04734288901090622,
-0.21318767964839935,
0.038258910179138184,
-0.09754671901464462,
0.03692653775215149,
-0.11386461555957794,
-0.12198726087808609,
-0.019882312044501305,
0.0013367572100833058,
0.062027812004089355,
0.09346982091665268,
-0.16017864644527435,
-0.11362893879413605,
0.1379631608724594,
-0.09003186970949173,
-0.1057075634598732,
0.1592637598514557,
0.004947592504322529,
-0.07590346038341522,
0.031206799671053886,
0.1722061038017273,
0.1066240444779396,
-0.125553697347641,
-0.03334531560540199,
-0.028949279338121414,
0.09493453800678253,
0.014655829407274723,
0.11215807497501373,
-0.013343511149287224,
-0.015235669910907745,
-0.024873720481991768,
-0.029522346332669258,
0.07207018882036209,
-0.09240560978651047,
-0.07254703342914581,
-0.017466548830270767,
-0.09945409744977951,
0.035578712821006775,
0.06212537735700607,
0.004039003048092127,
-0.06967978924512863,
-0.1364242285490036,
0.054897408932447433,
0.10446333140134811,
-0.07621348649263382,
0.011935497634112835,
-0.05465106666088104,
0.10571280121803284,
-0.04431331157684326,
-0.03189250826835632,
-0.15017876029014587,
-0.016980387270450592,
0.02157268486917019,
-0.06729544699192047,
0.04411621764302254,
0.0027463205624371767,
0.05036717280745506,
0.08992773294448853,
-0.0803871601819992,
-0.08722097426652908,
-0.06583113968372345,
0.016910675913095474,
-0.04253355786204338,
-0.2697262167930603,
-0.05256282165646553,
-0.01587248407304287,
0.13267378509044647,
-0.2126728594303131,
-0.01007911004126072,
-0.020689161494374275,
0.13193631172180176,
0.060963209718465805,
-0.05750347673892975,
0.009427353739738464,
0.07829535752534866,
-0.03169872611761093,
-0.07920718193054199,
0.019982632249593735,
0.012172728776931763,
-0.09229462593793869,
-0.04195597395300865,
-0.12384094297885895,
0.15564033389091492,
0.11011698842048645,
-0.0040825908072292805,
-0.11450039595365524,
-0.005305934231728315,
-0.07662314176559448,
-0.06068439036607742,
-0.028963584452867508,
0.014290587045252323,
0.1280667632818222,
-0.006656779907643795,
0.09618742018938065,
-0.09426171332597733,
-0.039975155144929886,
0.044396575540304184,
-0.0005747273098677397,
-0.006934676319360733,
0.14383263885974884,
0.11043812334537506,
-0.08476553857326508,
0.13593336939811707,
0.12433447688817978,
-0.08787806332111359,
0.1683366596698761,
-0.07223448902368546,
-0.12865270674228668,
-0.022073468193411827,
0.021682679653167725,
0.029436586424708366,
0.13451775908470154,
-0.11073064804077148,
0.025755392387509346,
0.01767313852906227,
0.021912354975938797,
0.0035932317841798067,
-0.17569591104984283,
-0.019922811537981033,
0.026599638164043427,
-0.054853349924087524,
-0.04223508760333061,
-0.014279085211455822,
-0.00785274151712656,
0.10754062980413437,
-0.002316107042133808,
-0.07641773670911789,
-0.006525264587253332,
0.00846954993903637,
-0.07843022048473358,
0.16575133800506592,
-0.09898757189512253,
-0.11702676117420197,
-0.12212737649679184,
0.01809351146221161,
-0.03513338044285774,
-0.0011936993105337024,
0.045974958688020706,
-0.09061489254236221,
-0.04827225208282471,
-0.060255590826272964,
0.06440835446119308,
-0.017261547967791557,
0.04892808571457863,
-0.0003716665378306061,
0.03456763178110123,
0.07545588165521622,
-0.07586808502674103,
0.028012951835989952,
-0.016613095998764038,
-0.025058159604668617,
0.015599850565195084,
0.04456612095236778,
0.10285444557666779,
0.18895722925662994,
0.04025711491703987,
0.007005829364061356,
-0.018863964825868607,
0.1673583686351776,
-0.10584116727113724,
0.029838765040040016,
0.08693692833185196,
-0.018814142793416977,
0.032015517354011536,
0.1554395854473114,
0.053658291697502136,
-0.060375235974788666,
0.0251126978546381,
0.06946901232004166,
-0.02382284589111805,
-0.2463524341583252,
-0.04578119143843651,
-0.05926259979605675,
0.0011982396245002747,
0.09621355682611465,
0.03291646018624306,
-0.002907331334426999,
0.0478740818798542,
-0.016163161024451256,
0.027922863140702248,
-0.006583597045391798,
0.04822632297873497,
0.0045381528325378895,
0.04657012224197388,
0.09746050834655762,
-0.03456062823534012,
-0.034739747643470764,
0.056589506566524506,
0.0008790441788733006,
0.22854742407798767,
-0.01930452138185501,
0.11478205770254135,
0.06671058386564255,
0.1481442004442215,
0.012563304975628853,
0.06401501595973969,
0.013944614678621292,
-0.019680114462971687,
0.008673462085425854,
-0.06440232694149017,
-0.005517530255019665,
0.03309425339102745,
0.06020371615886688,
0.052189212292432785,
-0.11958985030651093,
0.030419830232858658,
0.022348489612340927,
0.27560171484947205,
0.08989578485488892,
-0.29689347743988037,
-0.08749334514141083,
0.00477889459580183,
-0.04360057786107063,
-0.023595834150910378,
0.056388791650533676,
0.13715557754039764,
-0.04385356977581978,
0.08430003374814987,
-0.05740705877542496,
0.07286936044692993,
-0.05843682587146759,
-0.01906893588602543,
0.11612534523010254,
0.13143359124660492,
-0.013842394575476646,
0.04485524445772171,
-0.19729861617088318,
0.27925431728363037,
-0.001093029510229826,
0.060311511158943176,
-0.023205939680337906,
0.023375416174530983,
0.033141616731882095,
0.026647796854376793,
0.10959295183420181,
0.011304307729005814,
-0.06845393031835556,
-0.16908375918865204,
-0.11039770394563675,
-0.00206911563873291,
0.11997612565755844,
-0.0551256388425827,
0.09147978574037552,
-0.036968640983104706,
-0.042898017913103104,
0.05352850630879402,
-0.0785936489701271,
-0.12063225358724594,
-0.08556945621967316,
0.01656963676214218,
-0.012962931767106056,
0.016000375151634216,
-0.08088511228561401,
-0.12666229903697968,
-0.13406506180763245,
0.15022876858711243,
-0.09002532064914703,
-0.0168782789260149,
-0.12097451090812683,
0.04935087636113167,
0.14482295513153076,
-0.06383080035448074,
0.06450953334569931,
0.02093912661075592,
0.0819438099861145,
0.04545263946056366,
-0.05524694547057152,
0.12589499354362488,
-0.09422245621681213,
-0.20306190848350525,
-0.06293787807226181,
0.1634458601474762,
0.04900006204843521,
0.07620920240879059,
-0.05222219228744507,
0.018853675574064255,
0.02349439263343811,
-0.07927633821964264,
0.04386480897665024,
0.012862509116530418,
0.05972253158688545,
0.06707178056240082,
-0.06551907956600189,
-0.03313373774290085,
-0.030266953632235527,
-0.04628710076212883,
0.0903688296675682,
0.2737356126308441,
-0.08006677776575089,
0.04397952929139137,
0.05838805064558983,
-0.035539984703063965,
-0.18204107880592346,
0.016272978857159615,
0.11697806417942047,
0.013480243273079395,
0.033356696367263794,
-0.19542613625526428,
0.10530603677034378,
0.07509613782167435,
-0.007692616432905197,
0.08785680681467056,
-0.30889567732810974,
-0.11786027252674103,
0.11521650850772858,
0.1151786670088768,
-0.03579592704772949,
-0.16479821503162384,
-0.04205195605754852,
0.008016644977033138,
-0.09539701044559479,
0.0989312008023262,
-0.0633901059627533,
0.12343237549066544,
0.0036676907911896706,
-0.0023897753562778234,
0.024469632655382156,
-0.05287947878241539,
0.10155673325061798,
0.02654966711997986,
0.055017758160829544,
-0.02854331210255623,
0.035934437066316605,
0.001381886308081448,
-0.05718793720006943,
0.0032655682880431414,
-0.08731355518102646,
0.037881165742874146,
-0.07010854780673981,
-0.030254673212766647,
-0.07100795954465866,
0.01783883385360241,
-0.04259432479739189,
-0.058730095624923706,
-0.033977851271629333,
0.06833568215370178,
0.08561244606971741,
-0.03125695511698723,
0.13568894565105438,
-0.006272460334002972,
0.1419839859008789,
0.07672064006328583,
0.08875363320112228,
-0.018026400357484818,
-0.1224091500043869,
-0.021288758143782616,
-0.032104626297950745,
0.05776900053024292,
-0.10106851905584335,
0.0259323101490736,
0.13135604560375214,
0.053538281470537186,
0.12381553649902344,
0.06632760912179947,
-0.06852523237466812,
-0.021871652454137802,
0.08759020268917084,
-0.1223793476819992,
-0.11647169291973114,
-0.03697987273335457,
-0.05084376037120819,
-0.12036798894405365,
0.017185281962156296,
0.07894735783338547,
-0.061233215034008026,
0.0031168365385383368,
0.025882668793201447,
0.03245953842997551,
-0.050473105162382126,
0.2063663750886917,
0.05922876298427582,
0.06344152241945267,
-0.0980527400970459,
0.11504260450601578,
0.04410172626376152,
-0.15928912162780762,
0.01490237470716238,
0.07063783705234528,
-0.07118339836597443,
-0.012482371181249619,
0.08167428523302078,
0.0852520540356636,
-0.000253896665526554,
-0.03653128072619438,
-0.10448562353849411,
-0.12869636714458466,
0.07989221066236496,
0.11219299584627151,
0.04801015183329582,
0.0257888101041317,
-0.03342156857252121,
0.05035460740327835,
-0.10952519625425339,
0.12973584234714508,
0.10406861454248428,
0.09719080477952957,
-0.2068677544593811,
0.10820271819829941,
0.006483267992734909,
0.004401307087391615,
-0.027119185775518417,
0.04971112683415413,
-0.1344827264547348,
-0.009848198853433132,
-0.07445565611124039,
-0.016693567857146263,
-0.0614057220518589,
0.008504744619131088,
-0.004934186581522226,
-0.0463360995054245,
-0.06002500280737877,
0.024531230330467224,
-0.09379922598600388,
-0.03753288462758064,
0.014041787013411522,
0.08120841532945633,
-0.09485062956809998,
-0.013048265129327774,
0.032864391803741455,
-0.1149938777089119,
0.09515905380249023,
0.03130362927913666,
0.025212062522768974,
0.02427385374903679,
-0.13864067196846008,
-0.0060125431045889854,
0.06393469870090485,
0.005556733813136816,
0.044138457626104355,
-0.17831993103027344,
0.00044407194945961237,
-0.047581933438777924,
0.014338365755975246,
-0.01298020500689745,
0.02981807105243206,
-0.10827428102493286,
-0.008848104625940323,
-0.062001511454582214,
-0.058962270617485046,
-0.055095259100198746,
0.029090063646435738,
0.10039322823286057,
-0.0060537732206285,
0.16592757403850555,
-0.08029705286026001,
0.02388932928442955,
-0.2168607860803604,
0.0035334669519215822,
-0.011636908166110516,
-0.06432851403951645,
-0.09285934269428253,
-0.003834014991298318,
0.07277602702379227,
-0.04618401825428009,
0.07398194819688797,
-0.03443336859345436,
0.0008752606227062643,
0.024381807073950768,
-0.04849061742424965,
0.04416665434837341,
0.05202249437570572,
0.19186654686927795,
0.014665279537439346,
-0.034730613231658936,
0.004376156721264124,
-0.010670999065041542,
0.07712434232234955,
0.08021833002567291,
0.18625396490097046,
0.1526789367198944,
-0.04414427652955055,
0.04953772574663162,
0.053374845534563065,
-0.13911522924900055,
-0.17472130060195923,
0.13803240656852722,
-0.06263550370931625,
0.14326998591423035,
0.01904403232038021,
0.21346423029899597,
0.08428029716014862,
-0.1864028424024582,
0.03912469372153282,
-0.02856971137225628,
-0.08088494092226028,
-0.10544059425592422,
-0.06617169082164764,
-0.08130389451980591,
-0.18424153327941895,
0.019388902932405472,
-0.108766108751297,
0.03468720242381096,
0.04475538060069084,
0.03457660600543022,
0.028784170746803284,
0.16071461141109467,
0.024670805782079697,
0.0110816964879632,
0.07468748092651367,
0.05008925125002861,
-0.06242992728948593,
-0.03249955177307129,
-0.09353480488061905,
0.042295798659324646,
-0.03761517256498337,
0.02951711416244507,
-0.07661032676696777,
-0.09368828684091568,
0.09987090528011322,
0.035009849816560745,
-0.08631277829408646,
0.03756175562739372,
-0.02487659454345703,
0.04875776171684265,
0.09676273167133331,
0.01309962011873722,
-0.002935242373496294,
-0.016236700117588043,
0.2048051804304123,
-0.08750434964895248,
-0.03915438801050186,
-0.1284858137369156,
0.23763853311538696,
-0.020648952573537827,
-0.0055930474773049355,
0.052120406180620193,
-0.0841151550412178,
-0.013977485708892345,
0.14196676015853882,
0.16977322101593018,
-0.04088582843542099,
-0.02649986557662487,
0.01438638474792242,
-0.008900686167180538,
-0.05915546417236328,
0.09277667850255966,
0.12807191908359528,
0.10021053999662399,
-0.03770820423960686,
-0.0586266815662384,
-0.06422808766365051,
-0.025659814476966858,
-0.016961442306637764,
0.058515045791864395,
0.05731829255819321,
-0.0347679927945137,
-0.017084993422031403,
0.11131203919649124,
-0.10820852965116501,
-0.13775105774402618,
0.04522229731082916,
-0.16411037743091583,
-0.18788929283618927,
-0.06760040670633316,
0.09350069612264633,
0.012410011142492294,
0.035134412348270416,
-0.007649764884263277,
-0.04330899938941002,
0.08574063330888748,
0.00909652840346098,
-0.06920196861028671,
-0.07519716769456863,
0.04232294112443924,
-0.03946699947118759,
0.20338888466358185,
-0.03762122243642807,
0.027782151475548744,
0.1053825169801712,
0.07442469149827957,
-0.08822743594646454,
0.04166300967335701,
0.07355286926031113,
-0.1613437980413437,
0.03730228915810585,
0.18604013323783875,
-0.05780963972210884,
0.1252143234014511,
0.04076186195015907,
-0.08712733536958694,
-0.010148512199521065,
-0.09050177037715912,
-0.06062899902462959,
-0.061288584023714066,
0.007417506072670221,
-0.06853604316711426,
0.14679785072803497,
0.18524426221847534,
-0.05302689969539642,
-0.011110133491456509,
-0.06352740526199341,
0.031889576464891434,
0.07581993192434311,
0.11937060207128525,
0.022190876305103302,
-0.24478505551815033,
0.031667567789554596,
-0.05505664274096489,
0.023692747578024864,
-0.23704256117343903,
-0.06901630014181137,
0.006537817418575287,
-0.038358207792043686,
-0.13611546158790588,
0.10107947885990143,
0.05428667739033699,
0.048662230372428894,
-0.05604499205946922,
-0.048515863716602325,
-0.05427572876214981,
0.15420179069042206,
-0.18293839693069458,
-0.07330141216516495
] |
null | null | pruna-engine | <!-- header start -->
<!-- 200823 -->
<div style="width: auto; margin-left: auto; margin-right: auto">
<a href="https://www.pruna.ai/" target="_blank" rel="noopener noreferrer">
<img src="https://i.imgur.com/eDAlcgk.png" alt="PrunaAI" style="width: 100%; min-width: 400px; display: block; margin: auto;">
</a>
</div>
<!-- header end -->
[](https://twitter.com/PrunaAI)
[](https://github.com/PrunaAI)
[](https://www.linkedin.com/company/93832878/admin/feed/posts/?feedType=following)
[](https://discord.gg/CP4VSgck)
# Simply make AI models cheaper, smaller, faster, and greener!
- Give a thumbs up if you like this model!
- Contact us and tell us which model to compress next [here](https://www.pruna.ai/contact).
- Request access to easily compress your *own* AI models [here](https://z0halsaff74.typeform.com/pruna-access?typeform-source=www.pruna.ai).
- Read the documentation to learn more [here](https://pruna-ai-pruna.readthedocs-hosted.com/en/latest/)
- Join Pruna AI community on Discord [here](https://discord.gg/CP4VSgck) to share feedback/suggestions or get help.
## Results

**Frequently Asked Questions**
- ***How does the compression work?*** The model is compressed by combining xformers, triton, jit, cuda graphs, tiling, and step caching.
- ***How does the model quality change?*** The quality of the model output might slightly vary compared to the base model.
- ***How is the model efficiency evaluated?*** These results were obtained on NVIDIA A100-PCIE-40GB with the configuration described in `model/smash_config.json`, after a hardware warmup. The smashed model is directly compared to the original base model. Efficiency results may vary in other settings (e.g. other hardware, image size, batch size, ...). We recommend running them directly in your use-case conditions to know whether the smashed model can benefit you.
- ***What is the model format?*** We used a custom Pruna model format based on pickle to make models compatible with the compression methods. We provide a tutorial on running models in Docker containers in the documentation [here](https://pruna-ai-pruna.readthedocs-hosted.com/en/latest/) if needed.
- ***What is the naming convention for Pruna Huggingface models?*** We take the original model name and append "turbo", "tiny", or "green" if the smashed model has a measured inference speed, inference memory, or inference energy consumption which is less than 90% of the original base model.
- ***How to compress my own models?*** You can request premium access to more compression methods and tech support for your specific use-cases [here](https://z0halsaff74.typeform.com/pruna-access?typeform-source=www.pruna.ai).
- ***What are "first" metrics?*** Results mentioning "first" are obtained after the first run of the model. The first run might take more memory or be slower than the subsequent runs due to cuda overheads.
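To observe this yourself, a rough timing loop over repeated calls (reusing the `smashed_model` handle created in the Setup section below) could look like:

```python
import time

# The first iteration typically pays one-off cuda/compilation overheads.
for i in range(3):
    start = time.time()
    smashed_model(prompt='Beautiful fruits in trees', height=1024, width=1024)
    print(f"run {i}: {time.time() - start:.2f}s")
```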
## Setup
You can run the smashed model with these steps:
0. Check that you have linux, python 3.10, and cuda 12.1.0 requirements installed. For cuda, check with `nvcc --version` and install with `conda install nvidia/label/cuda-12.1.0::cuda`.
1. Install the `pruna-engine` available [here](https://pypi.org/project/pruna-engine/) on PyPI. It might take up to 15 minutes to install.
```bash
pip install pruna-engine[gpu]==0.6.0 --extra-index-url https://pypi.nvidia.com --extra-index-url https://pypi.ngc.nvidia.com --extra-index-url https://prunaai.pythonanywhere.com/
```
2. Download the model files using one of these three options.
- Option 1 - Use command line interface (CLI):
```bash
mkdir SG161222-RealVisXL_V2.0-turbo-tiny-green-smashed
huggingface-cli download PrunaAI/SG161222-RealVisXL_V2.0-turbo-tiny-green-smashed --local-dir SG161222-RealVisXL_V2.0-turbo-tiny-green-smashed --local-dir-use-symlinks False
```
- Option 2 - Use Python:
```python
import subprocess
repo_name = "SG161222-RealVisXL_V2.0-turbo-tiny-green-smashed"
subprocess.run(["mkdir", repo_name])
subprocess.run(["huggingface-cli", "download", 'PrunaAI/'+ repo_name, "--local-dir", repo_name, "--local-dir-use-symlinks", "False"])
```
- Option 3 - Download them manually on the HuggingFace model page.
3. Load & run the model.
```python
from pruna_engine.PrunaModel import PrunaModel
model_path = "SG161222-RealVisXL_V2.0-turbo-tiny-green-smashed/model" # Specify the downloaded model path.
smashed_model = PrunaModel.load_model(model_path) # Load the model.
smashed_model(prompt='Beautiful fruits in trees', height=1024, width=1024)[0][0] # Run the model on the given prompt; the call returns the generated image(s).
```
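If the call returns PIL-style images (an assumption consistent with the `[0][0]` indexing above, not something this card confirms), the output can be saved as follows:

```python
result = smashed_model(prompt='Beautiful fruits in trees', height=1024, width=1024)
image = result[0][0]        # first image of the first output list, per the indexing above
image.save("fruits.png")    # assumes a PIL.Image-compatible object
```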
## Configurations
The configuration info is in `config.json`.
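To inspect it programmatically, a minimal sketch (the exact location of the file inside the downloaded directory is an assumption; the FAQ above also mentions `model/smash_config.json`):

```python
import json

# Path assumes the download directory created in the Setup section.
with open("SG161222-RealVisXL_V2.0-turbo-tiny-green-smashed/config.json") as f:
    config = json.load(f)
print(config)
```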
## Credits & License
We follow the same license as the original model. Please check the license of the original model, SG161222/RealVisXL_V2.0, which provided the base model, before using this one.
## Want to compress other models?
- Contact us and tell us which model to compress next [here](https://www.pruna.ai/contact).
- Request access to easily compress your own AI models [here](https://z0halsaff74.typeform.com/pruna-access?typeform-source=www.pruna.ai). | {"license": "apache-2.0", "library_name": "pruna-engine", "metrics": ["memory_disk", "memory_inference", "inference_latency", "inference_throughput", "inference_CO2_emissions", "inference_energy_consumption"], "thumbnail": "https://assets-global.website-files.com/646b351987a8d8ce158d1940/64ec9e96b4334c0e1ac41504_Logo%20with%20white%20text.svg"} | null | PrunaAI/SG161222-RealVisXL_V2.0-turbo-tiny-green-smashed | [
"pruna-engine",
"license:apache-2.0",
"region:us"
] | 2024-02-13T09:52:02+00:00 | [] | [] | TAGS
#pruna-engine #license-apache-2.0 #region-us
|
<div style="width: auto; margin-left: auto; margin-right: auto">
<a href="URL target="_blank" rel="noopener noreferrer">
<img src="https://i.URL alt="PrunaAI" style="width: 100%; min-width: 400px; display: block; margin: auto;">
</a>
</div>
We recommend running them directly in your use-case conditions to know whether the smashed model can benefit you.
- *What is the model format?* We used a custom Pruna model format based on pickle to make models compatible with the compression methods. We provide a tutorial on running models in Docker containers in the documentation here if needed.
- *What is the naming convention for Pruna Huggingface models?* We take the original model name and append "turbo", "tiny", or "green" if the smashed model has a measured inference speed, inference memory, or inference energy consumption which is less than 90% of the original base model.
- *How to compress my own models?* You can request premium access to more compression methods and tech support for your specific use-cases here.
- *What are "first" metrics?* Results mentioning "first" are obtained after the first run of the model. The first run might take more memory or be slower than the subsequent runs due to cuda overheads.
## Setup
You can run the smashed model with these steps:
0. Check that you have linux, python 3.10, and cuda 12.1.0 requirements installed. For cuda, check with 'nvcc --version' and install with 'conda install nvidia/label/cuda-12.1.0::cuda'.
1. Install the 'pruna-engine' available here on Pypi. It might take up to 15 minutes to install.
2. Download the model files using one of these three options.
- Option 1 - Use command line interface (CLI):
- Option 2 - Use Python:
- Option 3 - Download them manually on the HuggingFace model page.
3. Load & run the model.
## Configurations
The configuration info is in 'URL'.
## Credits & License
We follow the same license as the original model. Please check the license of the original model, SG161222/RealVisXL_V2.0, which provided the base model, before using this one.
## Want to compress other models?
- Contact us and tell us which model to compress next here.
- Request access to easily compress your own AI models here. | [
"# Simply make AI models cheaper, smaller, faster, and greener!\n\n- Give a thumbs up if you like this model!\n- Contact us and tell us which model to compress next here.\n- Request access to easily compress your *own* AI models here.\n- Read the documentations to know more here\n- Join Pruna AI community on Discord here to share feedback/suggestions or get help.",
"## Results\n\n!image info\n\nFrequently Asked Questions\n- *How does the compression work?* The model is compressed by combining xformers, triton, jit, cuda graphs, tiling, and step caching.\n- *How does the model quality change?* The quality of the model output might slightly vary compared to the base model.\n- *How is the model efficiency evaluated?* These results were obtained on NVIDIA A100-PCIE-40GB with configuration described in 'model/smash_config.json' and are obtained after a hardware warmup. The smashed model is directly compared to the original base model. Efficiency results may vary in other settings (e.g. other hardware, image size, batch size, ...). We recommend to directly run them in the use-case conditions to know if the smashed model can benefit you.\n- *What is the model format?* We used a custom Pruna model format based on pickle to make models compatible with the compression methods. We provide a tutorial to run models in dockers in the documentation here if needed.\n- *What is the naming convention for Pruna Huggingface models?* We take the original model name and append \"turbo\", \"tiny\", or \"green\" if the smashed model has a measured inference speed, inference memory, or inference energy consumption which is less than 90% of the original base model.\n- *How to compress my own models?* You can request premium access to more compression methods and tech support for your specific use-cases here.\n- *What are \"first\" metrics?* Results mentioning \"first\" are obtained after the first run of the model. The first run might take more memory or be slower than the subsequent runs due cuda overheads.",
"## Setup\n\nYou can run the smashed model with these steps:\n\n0. Check that you have linux, python 3.10, and cuda 12.1.0 requirements installed. For cuda, check with 'nvcc --version' and install with 'conda install nvidia/label/cuda-12.1.0::cuda'.\n1. Install the 'pruna-engine' available here on Pypi. It might take up to 15 minutes to install.\n \n3. Download the model files using one of these three options. \n - Option 1 - Use command line interface (CLI):\n \n - Option 2 - Use Python:\n \n - Option 3 - Download them manually on the HuggingFace model page.\n3. Load & run the model.",
"## Configurations\n\nThe configuration info are in 'URL'.",
"## Credits & License\n\nWe follow the same license as the original model. Please check the license of the original model SG161222/RealVisXL_V2.0 before using this model which provided the base model.",
"## Want to compress other models?\n\n- Contact us and tell us which model to compress next here.\n- Request access to easily compress your own AI models here."
] | [
"TAGS\n#pruna-engine #license-apache-2.0 #region-us \n",
"# Simply make AI models cheaper, smaller, faster, and greener!\n\n- Give a thumbs up if you like this model!\n- Contact us and tell us which model to compress next here.\n- Request access to easily compress your *own* AI models here.\n- Read the documentations to know more here\n- Join Pruna AI community on Discord here to share feedback/suggestions or get help.",
"## Results\n\n!image info\n\nFrequently Asked Questions\n- *How does the compression work?* The model is compressed by combining xformers, triton, jit, cuda graphs, tiling, and step caching.\n- *How does the model quality change?* The quality of the model output might slightly vary compared to the base model.\n- *How is the model efficiency evaluated?* These results were obtained on NVIDIA A100-PCIE-40GB with configuration described in 'model/smash_config.json' and are obtained after a hardware warmup. The smashed model is directly compared to the original base model. Efficiency results may vary in other settings (e.g. other hardware, image size, batch size, ...). We recommend to directly run them in the use-case conditions to know if the smashed model can benefit you.\n- *What is the model format?* We used a custom Pruna model format based on pickle to make models compatible with the compression methods. We provide a tutorial to run models in dockers in the documentation here if needed.\n- *What is the naming convention for Pruna Huggingface models?* We take the original model name and append \"turbo\", \"tiny\", or \"green\" if the smashed model has a measured inference speed, inference memory, or inference energy consumption which is less than 90% of the original base model.\n- *How to compress my own models?* You can request premium access to more compression methods and tech support for your specific use-cases here.\n- *What are \"first\" metrics?* Results mentioning \"first\" are obtained after the first run of the model. The first run might take more memory or be slower than the subsequent runs due cuda overheads.",
"## Setup\n\nYou can run the smashed model with these steps:\n\n0. Check that you have linux, python 3.10, and cuda 12.1.0 requirements installed. For cuda, check with 'nvcc --version' and install with 'conda install nvidia/label/cuda-12.1.0::cuda'.\n1. Install the 'pruna-engine' available here on Pypi. It might take up to 15 minutes to install.\n \n3. Download the model files using one of these three options. \n - Option 1 - Use command line interface (CLI):\n \n - Option 2 - Use Python:\n \n - Option 3 - Download them manually on the HuggingFace model page.\n3. Load & run the model.",
"## Configurations\n\nThe configuration info are in 'URL'.",
"## Credits & License\n\nWe follow the same license as the original model. Please check the license of the original model SG161222/RealVisXL_V2.0 before using this model which provided the base model.",
"## Want to compress other models?\n\n- Contact us and tell us which model to compress next here.\n- Request access to easily compress your own AI models here."
] | [
19,
92,
402,
155,
13,
43,
36
] | [
"passage: TAGS\n#pruna-engine #license-apache-2.0 #region-us \n# Simply make AI models cheaper, smaller, faster, and greener!\n\n- Give a thumbs up if you like this model!\n- Contact us and tell us which model to compress next here.\n- Request access to easily compress your *own* AI models here.\n- Read the documentations to know more here\n- Join Pruna AI community on Discord here to share feedback/suggestions or get help."
] | [
-0.03573288768529892,
0.13100412487983704,
-0.001540288794785738,
0.022444402799010277,
0.10067341476678848,
0.055228229612112045,
0.07351797074079514,
0.10892018675804138,
0.048860158771276474,
0.008610251359641552,
0.1138949766755104,
0.11960647255182266,
0.08499637991189957,
0.1467972695827484,
0.04886651411652565,
-0.36119624972343445,
0.12328322976827621,
0.030059607699513435,
0.11286655068397522,
0.06615262478590012,
0.1305178701877594,
-0.08176743984222412,
0.12242286652326584,
0.11668917536735535,
-0.07146744430065155,
-0.058715738356113434,
0.01354521606117487,
-0.04568718746304512,
0.047688763588666916,
0.006920846179127693,
0.11834761500358582,
0.016820384189486504,
0.08045748621225357,
-0.13821633160114288,
0.04106122627854347,
-0.030310893431305885,
-0.009298821911215782,
0.1184764951467514,
0.02953704260289669,
0.10704020410776138,
0.2749005854129791,
0.11839497834444046,
-0.1026306003332138,
0.08103878796100616,
-0.07297040522098541,
-0.10587455332279205,
-0.0623883455991745,
-0.010004812851548195,
0.06654322892427444,
0.04772229865193367,
-0.06958308070898056,
0.18299241364002228,
-0.12177606672048569,
-0.07967323064804077,
0.07131063938140869,
-0.2638101875782013,
-0.05668892711400986,
0.031334877014160156,
0.10106468200683594,
-0.061002105474472046,
0.011471700854599476,
0.0484081394970417,
0.07423757761716843,
0.003831893904134631,
0.02755037508904934,
0.0231326911598444,
0.1263711303472519,
-0.026059621945023537,
-0.08856093138456345,
-0.05992208048701286,
0.12840677797794342,
0.01695944555103779,
-0.04769099876284599,
-0.09044229239225388,
-0.05195551738142967,
-0.14491762220859528,
-0.05469183251261711,
-0.07260868698358536,
0.00944194383919239,
0.12943263351917267,
0.10376454889774323,
-0.12659528851509094,
-0.12641963362693787,
-0.08089456707239151,
-0.0874733030796051,
0.12121638655662537,
0.08226454257965088,
0.06846921145915985,
-0.020507263019680977,
0.05536094680428505,
0.07567624747753143,
-0.051027387380599976,
0.023186611011624336,
-0.1948918253183365,
-0.016140220686793327,
0.004553433042019606,
-0.12328886240720749,
0.074356809258461,
0.05938659980893135,
0.1526218056678772,
0.10516335070133209,
-0.07109570503234863,
0.14941607415676117,
0.03119758330285549,
0.05183535814285278,
0.0016945579554885626,
-0.19732382893562317,
0.044110603630542755,
0.03397487848997116,
0.029030555859208107,
0.07634714245796204,
0.01838402822613716,
-0.09485938400030136,
0.03653564304113388,
-0.10627062618732452,
0.05883750319480896,
-0.056691549718379974,
-0.02411743625998497,
-0.1321919858455658,
-0.06697039306163788,
0.20706887543201447,
-0.01862199977040291,
-0.020022790879011154,
-0.02232302725315094,
-0.02678769826889038,
0.16259340941905975,
-0.08499446511268616,
0.07367748767137527,
-0.06866656243801117,
-0.07434844970703125,
-0.11305806785821915,
0.008160690777003765,
-0.13459642231464386,
0.04250127822160721,
0.007071048486977816,
-0.09537149220705032,
0.050375018268823624,
-0.11538799852132797,
-0.05920444801449776,
0.10739794373512268,
0.11890298873186111,
-0.038714323192834854,
-0.07378857582807541,
0.07845373451709747,
0.021176116541028023,
-0.1269662231206894,
-0.04373960942029953,
-0.06861985474824905,
-0.0158606618642807,
-0.0012650806456804276,
0.022747088223695755,
0.05848544463515282,
-0.17233939468860626,
0.06719639152288437,
-0.11638288199901581,
0.016827668994665146,
-0.04659169912338257,
0.021059416234493256,
-0.030015811324119568,
0.12740249931812286,
-0.07529475539922714,
0.00627522449940443,
-0.0467182919383049,
0.03761160373687744,
0.04215855151414871,
0.05199176073074341,
-0.2508608102798462,
0.030298501253128052,
0.1098698228597641,
-0.12760701775550842,
-0.08380157500505447,
0.17382264137268066,
0.014979318715631962,
-0.03036264143884182,
0.10716407746076584,
0.08396483957767487,
0.023649759590625763,
-0.05164189636707306,
0.03376004472374916,
-0.05715443566441536,
-0.11622927337884903,
-0.13548094034194946,
0.16667284071445465,
0.08274152129888535,
-0.18323549628257751,
0.05830454081296921,
-0.03704249486327171,
0.07311563938856125,
-0.0804717093706131,
-0.11762288212776184,
-0.022608499974012375,
-0.15161584317684174,
-0.008770188316702843,
0.06604208797216415,
0.028975360095500946,
0.006918381433933973,
-0.06216441094875336,
-0.08010344207286835,
0.17335863411426544,
0.04135739430785179,
-0.07565157860517502,
-0.20717956125736237,
0.1366199105978012,
-0.02334769256412983,
0.017625825479626656,
-0.11182565987110138,
0.0436074398458004,
0.04173315688967705,
-0.0868011936545372,
0.0665515661239624,
0.13982601463794708,
0.013523505069315434,
-0.018048446625471115,
0.05104345083236694,
0.10467413812875748,
0.02380356192588806,
0.014978175982832909,
-0.017454128712415695,
0.035858120769262314,
-0.03912922739982605,
-0.020320983603596687,
0.16504476964473724,
-0.03555602207779884,
0.00458597531542182,
-0.10513406246900558,
0.11301354318857193,
-0.021027158945798874,
0.009408654645085335,
0.0349666066467762,
-0.0038192281499505043,
-0.04639878869056702,
0.039253491908311844,
0.09851683676242828,
-0.07275566458702087,
-0.017454536631703377,
0.11367116123437881,
0.06871969252824783,
0.019681844860315323,
0.16850601136684418,
0.03191102296113968,
0.06726911664009094,
0.012733091600239277,
-0.03376281261444092,
0.060752686113119125,
-0.06495172530412674,
0.011898837052285671,
0.030007708817720413,
-0.00917510874569416,
0.02521984465420246,
-0.053047437220811844,
0.0434323213994503,
0.0030628403183072805,
-0.021423103287816048,
-0.02638157084584236,
0.022684084251523018,
0.3416460454463959,
-0.13255257904529572,
0.07944878935813904,
0.15327060222625732,
-0.048273082822561264,
0.03909280523657799,
-0.01437810342758894,
-0.15436100959777832,
-0.0026699609588831663,
-0.042268503457307816,
-0.03032074123620987,
0.15610229969024658,
0.01616038754582405,
0.014675425365567207,
0.07114095240831375,
-0.10695059597492218,
0.03881934657692909,
-0.1104544848203659,
-0.06612041592597961,
-0.02173178642988205,
-0.06350896507501602,
-0.028971035033464432,
0.03697436302900314,
-0.08885105699300766,
0.04942630976438522,
-0.06041031703352928,
-0.0835113525390625,
-0.00279027596116066,
0.05765359848737717,
0.05292991176247597,
-0.007865676656365395,
0.0266135074198246,
-0.12370338290929794,
-0.1333969384431839,
0.04225758835673332,
0.059311866760253906,
0.07703723758459091,
0.03257518634200096,
-0.04629363492131233,
-0.08232147246599197,
-0.023416923359036446,
-0.11862172931432724,
0.08274577558040619,
-0.05005235597491264,
0.02116742543876171,
0.036951590329408646,
0.05057327821850777,
-0.045071668922901154,
-0.028049485757946968,
-0.031239887699484825,
0.03177438676357269,
0.027354221791028976,
-0.06876137107610703,
0.10529255121946335,
0.06632733345031738,
-0.0037226425483822823,
-0.032648492604494095,
0.00643171789124608,
0.17272710800170898,
-0.055237580090761185,
0.023839082568883896,
0.2087412327528,
-0.011401763185858727,
0.011130680330097675,
0.16558146476745605,
0.019315127283334732,
-0.10980436205863953,
0.04490264132618904,
-0.011618091724812984,
-0.024433093145489693,
-0.24778036773204803,
-0.0949326828122139,
-0.04385732486844063,
-0.0007706784526817501,
0.06361816078424454,
0.011042679660022259,
-0.03859875723719597,
0.23577840626239777,
-0.026849139481782913,
0.018829621374607086,
-0.12459840625524521,
0.0004486647667363286,
-0.0048059262335300446,
-0.023355385288596153,
0.0931726023554802,
-0.08299873769283295,
-0.13033398985862732,
0.16407406330108643,
-0.027186540886759758,
0.16622412204742432,
0.1495974212884903,
0.19688820838928223,
0.05116363242268562,
0.15335489809513092,
0.10609954595565796,
0.07970421761274338,
0.03676251694560051,
-0.03067491576075554,
-0.09282566606998444,
0.021405506879091263,
-0.164025217294693,
0.04669925570487976,
0.13864991068840027,
-0.12398896366357803,
0.02497870661318302,
0.05730264261364937,
0.05873735621571541,
0.18854323029518127,
0.07822155207395554,
-0.2538689970970154,
0.01973096653819084,
0.055058181285858154,
-0.07391679286956787,
0.022711774334311485,
0.06807753443717957,
0.0036701757926493883,
-0.0067734792828559875,
-0.010745672509074211,
-0.06353491544723511,
0.0965924933552742,
-0.04308589547872543,
0.04941694438457489,
0.01959800533950329,
0.16887013614177704,
0.07060588151216507,
0.08728066831827164,
-0.09544848650693893,
0.18648295104503632,
-0.03208779916167259,
-0.011359825730323792,
-0.08284597098827362,
0.004318110644817352,
0.13521210849285126,
0.019071491435170174,
0.09255356341600418,
-0.044269099831581116,
-0.121957927942276,
0.006323229055851698,
-0.23562082648277283,
0.08331291377544403,
-0.05058889836072922,
-0.09113127738237381,
-0.011328799650073051,
-0.05844489112496376,
0.009289783425629139,
-0.08998627215623856,
0.09676292538642883,
-0.17567263543605804,
-0.11333465576171875,
0.04538518562912941,
0.12856784462928772,
0.11294994503259659,
-0.0013648143503814936,
-0.030656781047582626,
-0.0699990838766098,
-0.011223812587559223,
0.20029914379119873,
0.00366450403816998,
-0.07581856101751328,
-0.0320870466530323,
0.15712374448776245,
-0.039278823882341385,
0.04287504032254219,
-0.03278914466500282,
0.0889153853058815,
0.014466522261500359,
-0.054190460592508316,
0.049298208206892014,
-0.06592636555433273,
-0.02200072817504406,
-0.010078194551169872,
0.043240293860435486,
-0.006756619084626436,
0.08686710149049759,
0.08034902811050415,
-0.02028435468673706,
-0.08227753639221191,
-0.12333385646343231,
-0.08462619036436081,
0.02255922183394432,
0.014176737517118454,
-0.03690947964787483,
-0.1973867118358612,
-0.16155937314033508,
-0.09742958098649979,
-0.0629657432436943,
0.15201593935489655,
0.14297084510326385,
-0.08191602677106857,
0.00922099594026804,
0.21116870641708374,
0.06292436271905899,
-0.19491003453731537,
-0.24432238936424255,
-0.03983849659562111,
0.01585426554083824,
0.09687988460063934,
-0.1933489441871643,
0.08233629912137985,
0.182514950633049,
-0.06398133188486099,
-0.07713980972766876,
-0.18120619654655457,
-0.028968514874577522,
0.15624676644802094,
0.08880409598350525,
0.034364253282547,
-0.12160810083150864,
-0.019484393298625946,
-0.08997685462236404,
-0.06645027548074722,
0.1836807280778885,
-0.1339590847492218,
0.10656681656837463,
0.047402817755937576,
-0.03403366357088089,
-0.003687099553644657,
0.005388102028518915,
0.18110449612140656,
-0.06074931100010872,
-0.03121829405426979,
-0.07036804407835007,
-0.040840599685907364,
-0.03660854324698448,
0.002168794395402074,
0.18515387177467346,
-0.12171797454357147,
-0.038098592311143875,
-0.10541598498821259,
-0.06389426440000534,
0.03395857289433479,
-0.07763153314590454,
0.060194969177246094,
-0.03642268478870392,
-0.10219133645296097,
0.08158847689628601,
-0.08674930036067963,
0.04596758633852005,
0.08569365739822388,
0.025330739095807076,
-0.13958010077476501,
-0.041966404765844345,
0.13235506415367126,
-0.05054902657866478,
0.16125746071338654,
-0.10117390006780624,
-0.035509366542100906,
0.06462109833955765,
-0.08589782565832138,
-0.020662501454353333,
0.05795184150338173,
-0.13294143974781036,
0.05381989851593971,
-0.034517206251621246,
-0.054303109645843506,
0.06638223677873611,
0.13539311289787292,
-0.08174611628055573,
-0.26379698514938354,
-0.028586648404598236,
0.18116533756256104,
-0.011824109591543674,
0.03490758687257767,
0.007525808177888393,
-0.06581656634807587,
-0.09945452958345413,
0.03421634063124657,
0.015848571434617043,
-0.058819036930799484,
0.011253371834754944,
0.03603195771574974,
-0.025691984221339226,
-0.12638072669506073,
0.05936821177601814,
0.06823792308568954,
-0.0941934734582901,
-0.032972872257232666,
0.000624003354460001,
-0.0942000150680542,
-0.1938478946685791,
-0.17557795345783234,
-0.05545451119542122,
-0.04006402567028999,
-0.05628135800361633,
-0.021884309127926826,
-0.08124807476997375,
0.018083522096276283,
-0.1502816379070282,
0.14475713670253754,
-0.038145869970321655,
0.006252584047615528,
-0.0209975466132164,
-0.031452372670173645,
0.006174853537231684,
0.025056565180420876,
0.009395711123943329,
-0.1390637904405594,
-0.13066346943378448,
0.024542182683944702,
0.014182592742145061,
-0.07936779409646988,
0.028898935765028,
-0.07041247189044952,
-0.013368768617510796,
-0.19853784143924713,
-0.016248196363449097,
-0.22301587462425232,
-0.04428691789507866,
0.07373011857271194,
-0.06579138338565826,
-0.0643509179353714,
-0.008556312881410122,
-0.10626900941133499,
0.011290484108030796,
-0.001610172912478447,
0.03655340522527695,
-0.014065271243453026,
0.1284508854150772,
0.09549132734537125,
0.022866301238536835,
0.0911332368850708,
-0.012985085137188435,
0.03078892081975937,
-0.002364721614867449,
0.005114673636853695,
-0.008743821643292904,
0.026989391073584557,
0.041291479021310806,
0.036686498671770096,
-0.09319939464330673,
0.051485151052474976,
0.07455861568450928,
0.012060899287462234,
-0.016920236870646477,
0.05300012603402138,
-0.06268929690122604,
-0.04640784114599228,
0.1739540696144104,
-0.15719911456108093,
0.021751491352915764,
-0.14736860990524292,
0.1444556564092636,
-0.010513187386095524,
0.2193831503391266,
0.06425803899765015,
0.006590514909476042,
-0.06233430281281471,
0.09131742268800735,
-0.11381170153617859,
-0.04915030673146248,
-0.09661798924207687,
-0.12367360293865204,
-0.06325404345989227,
-0.007560494355857372,
0.21556709706783295,
0.05387962982058525,
-0.019160475581884384,
0.01564701832830906,
0.10275020450353622,
0.018284136429429054,
0.0486762709915638,
0.19745464622974396,
0.15271995961666107,
0.02341005951166153,
-0.0889909639954567,
-0.01627037115395069,
0.01350808423012495,
-0.0853503867983818,
0.0018235126044601202,
0.06810587644577026,
0.009528014808893204,
0.10209719091653824,
0.04158112779259682,
0.07730802148580551,
-0.08426982909440994,
-0.036100927740335464,
-0.054412636905908585,
-0.039002493023872375,
0.025442468002438545,
0.14708547294139862,
0.1561807543039322,
-0.045414891093969345,
0.00960021186619997,
-0.044451650232076645,
-0.014602012000977993,
-0.11235427111387253,
-0.14517642557621002,
-0.05312458053231239,
-0.11602789163589478,
-0.012407226487994194,
-0.012481655925512314,
-0.10946076363325119,
0.19402679800987244,
0.0020146952010691166,
-0.10891479253768921,
0.15836599469184875,
-0.0684225931763649,
-0.05141875520348549,
-0.01276442687958479,
0.01876121386885643,
-0.07850293815135956,
-0.0022818923462182283,
-0.10077759623527527,
-0.10132168978452682,
0.02481774054467678,
0.01985805295407772,
0.004291232209652662,
-0.08486028015613556,
0.017950383946299553,
-0.0426529161632061,
-0.025940028950572014,
-0.03003011643886566,
-0.0709792897105217,
-0.036955878138542175,
0.031351156532764435,
0.0058463444001972675,
0.01388684380799532,
0.07808863371610641,
0.07333865016698837,
0.018339574337005615,
-0.01919405534863472,
-0.2616608738899231,
0.24205926060676575,
-0.004039466846734285,
0.020393166691064835,
-0.010021558031439781,
-0.02907947264611721,
-0.009054461494088173,
0.3215360641479492,
0.23173850774765015,
-0.19331027567386627,
-0.05439596250653267,
0.012476698495447636,
0.0026319981552660465,
-0.07542501389980316,
0.1094593033194542,
0.018875539302825928,
0.00423765042796731,
-0.05596840754151344,
0.04422816261649132,
-0.05862133949995041,
-0.050088413059711456,
-0.1257522851228714,
-0.04494333639740944,
0.05905360355973244,
-0.022934434935450554,
-0.03221585974097252,
0.10358119010925293,
-0.13487578928470612,
0.16533994674682617,
-0.10657414048910141,
0.05017029866576195,
-0.07994366437196732,
0.028935737907886505,
0.0966583639383316,
0.07425305992364883,
0.07716985791921616,
-0.06533738225698471,
0.0058298250660300255,
0.1655988246202469,
-0.0008596695261076093,
-0.18927240371704102,
0.013590690679848194,
0.1733534038066864,
0.06023263931274414,
0.19089101254940033,
0.01796259731054306,
-0.04964626953005791,
0.06430752575397491,
-0.045803219079971313,
-0.17122182250022888,
0.1270221322774887,
0.01922406442463398,
-0.06127423793077469,
0.04282431676983833,
0.008042684756219387,
-0.024938564747571945,
-0.09220714867115021,
0.030806705355644226,
0.015598767437040806,
0.012003421783447266,
-0.06450504809617996,
0.10216627269983292,
0.005314295180141926,
0.15338653326034546,
-0.1475324183702469,
0.07550209760665894,
0.0819275975227356,
-0.06108740344643593,
-0.012732340954244137,
-0.043435707688331604,
0.12170585989952087,
0.027922818437218666,
-0.05581796541810036,
-0.08375987410545349,
-0.0668790340423584,
-0.08820337802171707,
0.061364442110061646,
0.004165459889918566,
-0.106651172041893,
0.01591005176305771,
-0.057940948754549026,
-0.012328415177762508,
-0.09319394081830978,
-0.02908286266028881,
0.21767060458660126,
0.016501590609550476,
0.006692094262689352,
-0.0009450694778934121,
-0.010726286098361015,
-0.06910049915313721,
-0.11001616716384888,
-0.08330272138118744
] |
null | null | pruna-engine | <!-- header start -->
<!-- 200823 -->
<div style="width: auto; margin-left: auto; margin-right: auto">
<a href="https://www.pruna.ai/" target="_blank" rel="noopener noreferrer">
<img src="https://i.imgur.com/eDAlcgk.png" alt="PrunaAI" style="width: 100%; min-width: 400px; display: block; margin: auto;">
</a>
</div>
<!-- header end -->
[](https://twitter.com/PrunaAI)
[](https://github.com/PrunaAI)
[](https://www.linkedin.com/company/93832878/admin/feed/posts/?feedType=following)
[](https://discord.gg/CP4VSgck)
# Simply make AI models cheaper, smaller, faster, and greener!
- Give a thumbs up if you like this model!
- Contact us and tell us which model to compress next [here](https://www.pruna.ai/contact).
- Request access to easily compress your *own* AI models [here](https://z0halsaff74.typeform.com/pruna-access?typeform-source=www.pruna.ai).
- Read the documentation to learn more [here](https://pruna-ai-pruna.readthedocs-hosted.com/en/latest/)
- Join Pruna AI community on Discord [here](https://discord.gg/CP4VSgck) to share feedback/suggestions or get help.
## Results

**Frequently Asked Questions**
- ***How does the compression work?*** The model is compressed by combining xformers, Triton, JIT, CUDA graphs, tiling, and step caching.
- ***How does the model quality change?*** The quality of the model output might slightly vary compared to the base model.
- ***How is the model efficiency evaluated?*** These results were obtained on an NVIDIA A100-PCIE-40GB with the configuration described in `model/smash_config.json`, after a hardware warmup. The smashed model is compared directly to the original base model. Efficiency results may vary in other settings (e.g. other hardware, image size, batch size, ...). We recommend running the models directly in your use-case conditions to see whether the smashed model benefits you.
- ***What is the model format?*** We used a custom Pruna model format based on pickle to make models compatible with the compression methods. We provide a tutorial for running models in Docker containers in the documentation [here](https://pruna-ai-pruna.readthedocs-hosted.com/en/latest/) if needed.
- ***What is the naming convention for Pruna Huggingface models?*** We take the original model name and append "turbo", "tiny", or "green" if the smashed model has a measured inference speed, inference memory, or inference energy consumption which is less than 90% of the original base model.
- ***How to compress my own models?*** You can request premium access to more compression methods and tech support for your specific use-cases [here](https://z0halsaff74.typeform.com/pruna-access?typeform-source=www.pruna.ai).
- ***What are "first" metrics?*** Results mentioning "first" are obtained after the first run of the model. The first run might take more memory or be slower than the subsequent runs due cuda overheads.
## Setup
You can run the smashed model with these steps:
0. Check that you have Linux, Python 3.10, and CUDA 12.1.0 installed. For CUDA, check with `nvcc --version` and install with `conda install nvidia/label/cuda-12.1.0::cuda`.
1. Install the `pruna-engine` package, available [here](https://pypi.org/project/pruna-engine/) on PyPI. It might take up to 15 minutes to install.
```bash
pip install pruna-engine[gpu]==0.6.0 --extra-index-url https://pypi.nvidia.com --extra-index-url https://pypi.ngc.nvidia.com --extra-index-url https://prunaai.pythonanywhere.com/
```
2. Download the model files using one of these three options.
- Option 1 - Use command line interface (CLI):
```bash
mkdir SG161222-RealVisXL_V1.0-turbo-tiny-green-smashed
huggingface-cli download PrunaAI/SG161222-RealVisXL_V1.0-turbo-tiny-green-smashed --local-dir SG161222-RealVisXL_V1.0-turbo-tiny-green-smashed --local-dir-use-symlinks False
```
- Option 2 - Use Python:
```python
import subprocess
repo_name = "SG161222-RealVisXL_V1.0-turbo-tiny-green-smashed"
subprocess.run(["mkdir", repo_name])
subprocess.run(["huggingface-cli", "download", 'PrunaAI/'+ repo_name, "--local-dir", repo_name, "--local-dir-use-symlinks", "False"])
```
- Option 3 - Download them manually on the HuggingFace model page.
3. Load & run the model.
```python
from pruna_engine.PrunaModel import PrunaModel
model_path = "SG161222-RealVisXL_V1.0-turbo-tiny-green-smashed/model" # Specify the downloaded model path.
smashed_model = PrunaModel.load_model(model_path) # Load the model.
smashed_model(prompt='Beautiful fruits in trees', height=1024, width=1024)[0][0] # Run the model with your prompt and the expected image size.
```
## Configurations
The configuration info is in `config.json`.
## Credits & License
We follow the same license as the original model. Please check the license of the original model, SG161222/RealVisXL_V1.0, which provided the base model, before using this one.
## Want to compress other models?
- Contact us and tell us which model to compress next [here](https://www.pruna.ai/contact).
- Request access to easily compress your own AI models [here](https://z0halsaff74.typeform.com/pruna-access?typeform-source=www.pruna.ai). | {"license": "apache-2.0", "library_name": "pruna-engine", "metrics": ["memory_disk", "memory_inference", "inference_latency", "inference_throughput", "inference_CO2_emissions", "inference_energy_consumption"], "thumbnail": "https://assets-global.website-files.com/646b351987a8d8ce158d1940/64ec9e96b4334c0e1ac41504_Logo%20with%20white%20text.svg"} | null | PrunaAI/SG161222-RealVisXL_V1.0-turbo-tiny-green-smashed | [
"pruna-engine",
"license:apache-2.0",
"region:us"
] | 2024-02-13T09:53:20+00:00 | [] | [] | TAGS
#pruna-engine #license-apache-2.0 #region-us
|
<div style="width: auto; margin-left: auto; margin-right: auto">
<a href="URL target="_blank" rel="noopener noreferrer">
<img src="https://i.URL alt="PrunaAI" style="width: 100%; min-width: 400px; display: block; margin: auto;">
</a>
</div>
. We recommend to directly run them in the use-case conditions to know if the smashed model can benefit you.
- *What is the model format?* We used a custom Pruna model format based on pickle to make models compatible with the compression methods. We provide a tutorial to run models in dockers in the documentation here if needed.
- *What is the naming convention for Pruna Huggingface models?* We take the original model name and append "turbo", "tiny", or "green" if the smashed model has a measured inference speed, inference memory, or inference energy consumption which is less than 90% of the original base model.
- *How to compress my own models?* You can request premium access to more compression methods and tech support for your specific use-cases here.
- *What are "first" metrics?* Results mentioning "first" are obtained after the first run of the model. The first run might take more memory or be slower than the subsequent runs due cuda overheads.
## Setup
You can run the smashed model with these steps:
0. Check that you have linux, python 3.10, and cuda 12.1.0 requirements installed. For cuda, check with 'nvcc --version' and install with 'conda install nvidia/label/cuda-12.1.0::cuda'.
1. Install the 'pruna-engine' available here on Pypi. It might take up to 15 minutes to install.
3. Download the model files using one of these three options.
- Option 1 - Use command line interface (CLI):
- Option 2 - Use Python:
- Option 3 - Download them manually on the HuggingFace model page.
3. Load & run the model.
## Configurations
The configuration info are in 'URL'.
## Credits & License
We follow the same license as the original model. Please check the license of the original model SG161222/RealVisXL_V1.0 before using this model which provided the base model.
## Want to compress other models?
- Contact us and tell us which model to compress next here.
- Request access to easily compress your own AI models here. | [
"# Simply make AI models cheaper, smaller, faster, and greener!\n\n- Give a thumbs up if you like this model!\n- Contact us and tell us which model to compress next here.\n- Request access to easily compress your *own* AI models here.\n- Read the documentations to know more here\n- Join Pruna AI community on Discord here to share feedback/suggestions or get help.",
"## Results\n\n!image info\n\nFrequently Asked Questions\n- *How does the compression work?* The model is compressed by combining xformers, triton, jit, cuda graphs, tiling, and step caching.\n- *How does the model quality change?* The quality of the model output might slightly vary compared to the base model.\n- *How is the model efficiency evaluated?* These results were obtained on NVIDIA A100-PCIE-40GB with configuration described in 'model/smash_config.json' and are obtained after a hardware warmup. The smashed model is directly compared to the original base model. Efficiency results may vary in other settings (e.g. other hardware, image size, batch size, ...). We recommend to directly run them in the use-case conditions to know if the smashed model can benefit you.\n- *What is the model format?* We used a custom Pruna model format based on pickle to make models compatible with the compression methods. We provide a tutorial to run models in dockers in the documentation here if needed.\n- *What is the naming convention for Pruna Huggingface models?* We take the original model name and append \"turbo\", \"tiny\", or \"green\" if the smashed model has a measured inference speed, inference memory, or inference energy consumption which is less than 90% of the original base model.\n- *How to compress my own models?* You can request premium access to more compression methods and tech support for your specific use-cases here.\n- *What are \"first\" metrics?* Results mentioning \"first\" are obtained after the first run of the model. The first run might take more memory or be slower than the subsequent runs due cuda overheads.",
"## Setup\n\nYou can run the smashed model with these steps:\n\n0. Check that you have linux, python 3.10, and cuda 12.1.0 requirements installed. For cuda, check with 'nvcc --version' and install with 'conda install nvidia/label/cuda-12.1.0::cuda'.\n1. Install the 'pruna-engine' available here on Pypi. It might take up to 15 minutes to install.\n \n3. Download the model files using one of these three options. \n - Option 1 - Use command line interface (CLI):\n \n - Option 2 - Use Python:\n \n - Option 3 - Download them manually on the HuggingFace model page.\n3. Load & run the model.",
"## Configurations\n\nThe configuration info are in 'URL'.",
"## Credits & License\n\nWe follow the same license as the original model. Please check the license of the original model SG161222/RealVisXL_V1.0 before using this model which provided the base model.",
"## Want to compress other models?\n\n- Contact us and tell us which model to compress next here.\n- Request access to easily compress your own AI models here."
] | [
"TAGS\n#pruna-engine #license-apache-2.0 #region-us \n",
"# Simply make AI models cheaper, smaller, faster, and greener!\n\n- Give a thumbs up if you like this model!\n- Contact us and tell us which model to compress next here.\n- Request access to easily compress your *own* AI models here.\n- Read the documentations to know more here\n- Join Pruna AI community on Discord here to share feedback/suggestions or get help.",
"## Results\n\n!image info\n\nFrequently Asked Questions\n- *How does the compression work?* The model is compressed by combining xformers, triton, jit, cuda graphs, tiling, and step caching.\n- *How does the model quality change?* The quality of the model output might slightly vary compared to the base model.\n- *How is the model efficiency evaluated?* These results were obtained on NVIDIA A100-PCIE-40GB with configuration described in 'model/smash_config.json' and are obtained after a hardware warmup. The smashed model is directly compared to the original base model. Efficiency results may vary in other settings (e.g. other hardware, image size, batch size, ...). We recommend to directly run them in the use-case conditions to know if the smashed model can benefit you.\n- *What is the model format?* We used a custom Pruna model format based on pickle to make models compatible with the compression methods. We provide a tutorial to run models in dockers in the documentation here if needed.\n- *What is the naming convention for Pruna Huggingface models?* We take the original model name and append \"turbo\", \"tiny\", or \"green\" if the smashed model has a measured inference speed, inference memory, or inference energy consumption which is less than 90% of the original base model.\n- *How to compress my own models?* You can request premium access to more compression methods and tech support for your specific use-cases here.\n- *What are \"first\" metrics?* Results mentioning \"first\" are obtained after the first run of the model. The first run might take more memory or be slower than the subsequent runs due cuda overheads.",
"## Setup\n\nYou can run the smashed model with these steps:\n\n0. Check that you have linux, python 3.10, and cuda 12.1.0 requirements installed. For cuda, check with 'nvcc --version' and install with 'conda install nvidia/label/cuda-12.1.0::cuda'.\n1. Install the 'pruna-engine' available here on Pypi. It might take up to 15 minutes to install.\n \n3. Download the model files using one of these three options. \n - Option 1 - Use command line interface (CLI):\n \n - Option 2 - Use Python:\n \n - Option 3 - Download them manually on the HuggingFace model page.\n3. Load & run the model.",
"## Configurations\n\nThe configuration info are in 'URL'.",
"## Credits & License\n\nWe follow the same license as the original model. Please check the license of the original model SG161222/RealVisXL_V1.0 before using this model which provided the base model.",
"## Want to compress other models?\n\n- Contact us and tell us which model to compress next here.\n- Request access to easily compress your own AI models here."
] | [
19,
92,
402,
155,
13,
43,
36
] | [
"passage: TAGS\n#pruna-engine #license-apache-2.0 #region-us \n# Simply make AI models cheaper, smaller, faster, and greener!\n\n- Give a thumbs up if you like this model!\n- Contact us and tell us which model to compress next here.\n- Request access to easily compress your *own* AI models here.\n- Read the documentations to know more here\n- Join Pruna AI community on Discord here to share feedback/suggestions or get help."
] | [
-0.03573288768529892,
0.13100412487983704,
-0.001540288794785738,
0.022444402799010277,
0.10067341476678848,
0.055228229612112045,
0.07351797074079514,
0.10892018675804138,
0.048860158771276474,
0.008610251359641552,
0.1138949766755104,
0.11960647255182266,
0.08499637991189957,
0.1467972695827484,
0.04886651411652565,
-0.36119624972343445,
0.12328322976827621,
0.030059607699513435,
0.11286655068397522,
0.06615262478590012,
0.1305178701877594,
-0.08176743984222412,
0.12242286652326584,
0.11668917536735535,
-0.07146744430065155,
-0.058715738356113434,
0.01354521606117487,
-0.04568718746304512,
0.047688763588666916,
0.006920846179127693,
0.11834761500358582,
0.016820384189486504,
0.08045748621225357,
-0.13821633160114288,
0.04106122627854347,
-0.030310893431305885,
-0.009298821911215782,
0.1184764951467514,
0.02953704260289669,
0.10704020410776138,
0.2749005854129791,
0.11839497834444046,
-0.1026306003332138,
0.08103878796100616,
-0.07297040522098541,
-0.10587455332279205,
-0.0623883455991745,
-0.010004812851548195,
0.06654322892427444,
0.04772229865193367,
-0.06958308070898056,
0.18299241364002228,
-0.12177606672048569,
-0.07967323064804077,
0.07131063938140869,
-0.2638101875782013,
-0.05668892711400986,
0.031334877014160156,
0.10106468200683594,
-0.061002105474472046,
0.011471700854599476,
0.0484081394970417,
0.07423757761716843,
0.003831893904134631,
0.02755037508904934,
0.0231326911598444,
0.1263711303472519,
-0.026059621945023537,
-0.08856093138456345,
-0.05992208048701286,
0.12840677797794342,
0.01695944555103779,
-0.04769099876284599,
-0.09044229239225388,
-0.05195551738142967,
-0.14491762220859528,
-0.05469183251261711,
-0.07260868698358536,
0.00944194383919239,
0.12943263351917267,
0.10376454889774323,
-0.12659528851509094,
-0.12641963362693787,
-0.08089456707239151,
-0.0874733030796051,
0.12121638655662537,
0.08226454257965088,
0.06846921145915985,
-0.020507263019680977,
0.05536094680428505,
0.07567624747753143,
-0.051027387380599976,
0.023186611011624336,
-0.1948918253183365,
-0.016140220686793327,
0.004553433042019606,
-0.12328886240720749,
0.074356809258461,
0.05938659980893135,
0.1526218056678772,
0.10516335070133209,
-0.07109570503234863,
0.14941607415676117,
0.03119758330285549,
0.05183535814285278,
0.0016945579554885626,
-0.19732382893562317,
0.044110603630542755,
0.03397487848997116,
0.029030555859208107,
0.07634714245796204,
0.01838402822613716,
-0.09485938400030136,
0.03653564304113388,
-0.10627062618732452,
0.05883750319480896,
-0.056691549718379974,
-0.02411743625998497,
-0.1321919858455658,
-0.06697039306163788,
0.20706887543201447,
-0.01862199977040291,
-0.020022790879011154,
-0.02232302725315094,
-0.02678769826889038,
0.16259340941905975,
-0.08499446511268616,
0.07367748767137527,
-0.06866656243801117,
-0.07434844970703125,
-0.11305806785821915,
0.008160690777003765,
-0.13459642231464386,
0.04250127822160721,
0.007071048486977816,
-0.09537149220705032,
0.050375018268823624,
-0.11538799852132797,
-0.05920444801449776,
0.10739794373512268,
0.11890298873186111,
-0.038714323192834854,
-0.07378857582807541,
0.07845373451709747,
0.021176116541028023,
-0.1269662231206894,
-0.04373960942029953,
-0.06861985474824905,
-0.0158606618642807,
-0.0012650806456804276,
0.022747088223695755,
0.05848544463515282,
-0.17233939468860626,
0.06719639152288437,
-0.11638288199901581,
0.016827668994665146,
-0.04659169912338257,
0.021059416234493256,
-0.030015811324119568,
0.12740249931812286,
-0.07529475539922714,
0.00627522449940443,
-0.0467182919383049,
0.03761160373687744,
0.04215855151414871,
0.05199176073074341,
-0.2508608102798462,
0.030298501253128052,
0.1098698228597641,
-0.12760701775550842,
-0.08380157500505447,
0.17382264137268066,
0.014979318715631962,
-0.03036264143884182,
0.10716407746076584,
0.08396483957767487,
0.023649759590625763,
-0.05164189636707306,
0.03376004472374916,
-0.05715443566441536,
-0.11622927337884903,
-0.13548094034194946,
0.16667284071445465,
0.08274152129888535,
-0.18323549628257751,
0.05830454081296921,
-0.03704249486327171,
0.07311563938856125,
-0.0804717093706131,
-0.11762288212776184,
-0.022608499974012375,
-0.15161584317684174,
-0.008770188316702843,
0.06604208797216415,
0.028975360095500946,
0.006918381433933973,
-0.06216441094875336,
-0.08010344207286835,
0.17335863411426544,
0.04135739430785179,
-0.07565157860517502,
-0.20717956125736237,
0.1366199105978012,
-0.02334769256412983,
0.017625825479626656,
-0.11182565987110138,
0.0436074398458004,
0.04173315688967705,
-0.0868011936545372,
0.0665515661239624,
0.13982601463794708,
0.013523505069315434,
-0.018048446625471115,
0.05104345083236694,
0.10467413812875748,
0.02380356192588806,
0.014978175982832909,
-0.017454128712415695,
0.035858120769262314,
-0.03912922739982605,
-0.020320983603596687,
0.16504476964473724,
-0.03555602207779884,
0.00458597531542182,
-0.10513406246900558,
0.11301354318857193,
-0.021027158945798874,
0.009408654645085335,
0.0349666066467762,
-0.0038192281499505043,
-0.04639878869056702,
0.039253491908311844,
0.09851683676242828,
-0.07275566458702087,
-0.017454536631703377,
0.11367116123437881,
0.06871969252824783,
0.019681844860315323,
0.16850601136684418,
0.03191102296113968,
0.06726911664009094,
0.012733091600239277,
-0.03376281261444092,
0.060752686113119125,
-0.06495172530412674,
0.011898837052285671,
0.030007708817720413,
-0.00917510874569416,
0.02521984465420246,
-0.053047437220811844,
0.0434323213994503,
0.0030628403183072805,
-0.021423103287816048,
-0.02638157084584236,
0.022684084251523018,
0.3416460454463959,
-0.13255257904529572,
0.07944878935813904,
0.15327060222625732,
-0.048273082822561264,
0.03909280523657799,
-0.01437810342758894,
-0.15436100959777832,
-0.0026699609588831663,
-0.042268503457307816,
-0.03032074123620987,
0.15610229969024658,
0.01616038754582405,
0.014675425365567207,
0.07114095240831375,
-0.10695059597492218,
0.03881934657692909,
-0.1104544848203659,
-0.06612041592597961,
-0.02173178642988205,
-0.06350896507501602,
-0.028971035033464432,
0.03697436302900314,
-0.08885105699300766,
0.04942630976438522,
-0.06041031703352928,
-0.0835113525390625,
-0.00279027596116066,
0.05765359848737717,
0.05292991176247597,
-0.007865676656365395,
0.0266135074198246,
-0.12370338290929794,
-0.1333969384431839,
0.04225758835673332,
0.059311866760253906,
0.07703723758459091,
0.03257518634200096,
-0.04629363492131233,
-0.08232147246599197,
-0.023416923359036446,
-0.11862172931432724,
0.08274577558040619,
-0.05005235597491264,
0.02116742543876171,
0.036951590329408646,
0.05057327821850777,
-0.045071668922901154,
-0.028049485757946968,
-0.031239887699484825,
0.03177438676357269,
0.027354221791028976,
-0.06876137107610703,
0.10529255121946335,
0.06632733345031738,
-0.0037226425483822823,
-0.032648492604494095,
0.00643171789124608,
0.17272710800170898,
-0.055237580090761185,
0.023839082568883896,
0.2087412327528,
-0.011401763185858727,
0.011130680330097675,
0.16558146476745605,
0.019315127283334732,
-0.10980436205863953,
0.04490264132618904,
-0.011618091724812984,
-0.024433093145489693,
-0.24778036773204803,
-0.0949326828122139,
-0.04385732486844063,
-0.0007706784526817501,
0.06361816078424454,
0.011042679660022259,
-0.03859875723719597,
0.23577840626239777,
-0.026849139481782913,
0.018829621374607086,
-0.12459840625524521,
0.0004486647667363286,
-0.0048059262335300446,
-0.023355385288596153,
0.0931726023554802,
-0.08299873769283295,
-0.13033398985862732,
0.16407406330108643,
-0.027186540886759758,
0.16622412204742432,
0.1495974212884903,
0.19688820838928223,
0.05116363242268562,
0.15335489809513092,
0.10609954595565796,
0.07970421761274338,
0.03676251694560051,
-0.03067491576075554,
-0.09282566606998444,
0.021405506879091263,
-0.164025217294693,
0.04669925570487976,
0.13864991068840027,
-0.12398896366357803,
0.02497870661318302,
0.05730264261364937,
0.05873735621571541,
0.18854323029518127,
0.07822155207395554,
-0.2538689970970154,
0.01973096653819084,
0.055058181285858154,
-0.07391679286956787,
0.022711774334311485,
0.06807753443717957,
0.0036701757926493883,
-0.0067734792828559875,
-0.010745672509074211,
-0.06353491544723511,
0.0965924933552742,
-0.04308589547872543,
0.04941694438457489,
0.01959800533950329,
0.16887013614177704,
0.07060588151216507,
0.08728066831827164,
-0.09544848650693893,
0.18648295104503632,
-0.03208779916167259,
-0.011359825730323792,
-0.08284597098827362,
0.004318110644817352,
0.13521210849285126,
0.019071491435170174,
0.09255356341600418,
-0.044269099831581116,
-0.121957927942276,
0.006323229055851698,
-0.23562082648277283,
0.08331291377544403,
-0.05058889836072922,
-0.09113127738237381,
-0.011328799650073051,
-0.05844489112496376,
0.009289783425629139,
-0.08998627215623856,
0.09676292538642883,
-0.17567263543605804,
-0.11333465576171875,
0.04538518562912941,
0.12856784462928772,
0.11294994503259659,
-0.0013648143503814936,
-0.030656781047582626,
-0.0699990838766098,
-0.011223812587559223,
0.20029914379119873,
0.00366450403816998,
-0.07581856101751328,
-0.0320870466530323,
0.15712374448776245,
-0.039278823882341385,
0.04287504032254219,
-0.03278914466500282,
0.0889153853058815,
0.014466522261500359,
-0.054190460592508316,
0.049298208206892014,
-0.06592636555433273,
-0.02200072817504406,
-0.010078194551169872,
0.043240293860435486,
-0.006756619084626436,
0.08686710149049759,
0.08034902811050415,
-0.02028435468673706,
-0.08227753639221191,
-0.12333385646343231,
-0.08462619036436081,
0.02255922183394432,
0.014176737517118454,
-0.03690947964787483,
-0.1973867118358612,
-0.16155937314033508,
-0.09742958098649979,
-0.0629657432436943,
0.15201593935489655,
0.14297084510326385,
-0.08191602677106857,
0.00922099594026804,
0.21116870641708374,
0.06292436271905899,
-0.19491003453731537,
-0.24432238936424255,
-0.03983849659562111,
0.01585426554083824,
0.09687988460063934,
-0.1933489441871643,
0.08233629912137985,
0.182514950633049,
-0.06398133188486099,
-0.07713980972766876,
-0.18120619654655457,
-0.028968514874577522,
0.15624676644802094,
0.08880409598350525,
0.034364253282547,
-0.12160810083150864,
-0.019484393298625946,
-0.08997685462236404,
-0.06645027548074722,
0.1836807280778885,
-0.1339590847492218,
0.10656681656837463,
0.047402817755937576,
-0.03403366357088089,
-0.003687099553644657,
0.005388102028518915,
0.18110449612140656,
-0.06074931100010872,
-0.03121829405426979,
-0.07036804407835007,
-0.040840599685907364,
-0.03660854324698448,
0.002168794395402074,
0.18515387177467346,
-0.12171797454357147,
-0.038098592311143875,
-0.10541598498821259,
-0.06389426440000534,
0.03395857289433479,
-0.07763153314590454,
0.060194969177246094,
-0.03642268478870392,
-0.10219133645296097,
0.08158847689628601,
-0.08674930036067963,
0.04596758633852005,
0.08569365739822388,
0.025330739095807076,
-0.13958010077476501,
-0.041966404765844345,
0.13235506415367126,
-0.05054902657866478,
0.16125746071338654,
-0.10117390006780624,
-0.035509366542100906,
0.06462109833955765,
-0.08589782565832138,
-0.020662501454353333,
0.05795184150338173,
-0.13294143974781036,
0.05381989851593971,
-0.034517206251621246,
-0.054303109645843506,
0.06638223677873611,
0.13539311289787292,
-0.08174611628055573,
-0.26379698514938354,
-0.028586648404598236,
0.18116533756256104,
-0.011824109591543674,
0.03490758687257767,
0.007525808177888393,
-0.06581656634807587,
-0.09945452958345413,
0.03421634063124657,
0.015848571434617043,
-0.058819036930799484,
0.011253371834754944,
0.03603195771574974,
-0.025691984221339226,
-0.12638072669506073,
0.05936821177601814,
0.06823792308568954,
-0.0941934734582901,
-0.032972872257232666,
0.000624003354460001,
-0.0942000150680542,
-0.1938478946685791,
-0.17557795345783234,
-0.05545451119542122,
-0.04006402567028999,
-0.05628135800361633,
-0.021884309127926826,
-0.08124807476997375,
0.018083522096276283,
-0.1502816379070282,
0.14475713670253754,
-0.038145869970321655,
0.006252584047615528,
-0.0209975466132164,
-0.031452372670173645,
0.006174853537231684,
0.025056565180420876,
0.009395711123943329,
-0.1390637904405594,
-0.13066346943378448,
0.024542182683944702,
0.014182592742145061,
-0.07936779409646988,
0.028898935765028,
-0.07041247189044952,
-0.013368768617510796,
-0.19853784143924713,
-0.016248196363449097,
-0.22301587462425232,
-0.04428691789507866,
0.07373011857271194,
-0.06579138338565826,
-0.0643509179353714,
-0.008556312881410122,
-0.10626900941133499,
0.011290484108030796,
-0.001610172912478447,
0.03655340522527695,
-0.014065271243453026,
0.1284508854150772,
0.09549132734537125,
0.022866301238536835,
0.0911332368850708,
-0.012985085137188435,
0.03078892081975937,
-0.002364721614867449,
0.005114673636853695,
-0.008743821643292904,
0.026989391073584557,
0.041291479021310806,
0.036686498671770096,
-0.09319939464330673,
0.051485151052474976,
0.07455861568450928,
0.012060899287462234,
-0.016920236870646477,
0.05300012603402138,
-0.06268929690122604,
-0.04640784114599228,
0.1739540696144104,
-0.15719911456108093,
0.021751491352915764,
-0.14736860990524292,
0.1444556564092636,
-0.010513187386095524,
0.2193831503391266,
0.06425803899765015,
0.006590514909476042,
-0.06233430281281471,
0.09131742268800735,
-0.11381170153617859,
-0.04915030673146248,
-0.09661798924207687,
-0.12367360293865204,
-0.06325404345989227,
-0.007560494355857372,
0.21556709706783295,
0.05387962982058525,
-0.019160475581884384,
0.01564701832830906,
0.10275020450353622,
0.018284136429429054,
0.0486762709915638,
0.19745464622974396,
0.15271995961666107,
0.02341005951166153,
-0.0889909639954567,
-0.01627037115395069,
0.01350808423012495,
-0.0853503867983818,
0.0018235126044601202,
0.06810587644577026,
0.009528014808893204,
0.10209719091653824,
0.04158112779259682,
0.07730802148580551,
-0.08426982909440994,
-0.036100927740335464,
-0.054412636905908585,
-0.039002493023872375,
0.025442468002438545,
0.14708547294139862,
0.1561807543039322,
-0.045414891093969345,
0.00960021186619997,
-0.044451650232076645,
-0.014602012000977993,
-0.11235427111387253,
-0.14517642557621002,
-0.05312458053231239,
-0.11602789163589478,
-0.012407226487994194,
-0.012481655925512314,
-0.10946076363325119,
0.19402679800987244,
0.0020146952010691166,
-0.10891479253768921,
0.15836599469184875,
-0.0684225931763649,
-0.05141875520348549,
-0.01276442687958479,
0.01876121386885643,
-0.07850293815135956,
-0.0022818923462182283,
-0.10077759623527527,
-0.10132168978452682,
0.02481774054467678,
0.01985805295407772,
0.004291232209652662,
-0.08486028015613556,
0.017950383946299553,
-0.0426529161632061,
-0.025940028950572014,
-0.03003011643886566,
-0.0709792897105217,
-0.036955878138542175,
0.031351156532764435,
0.0058463444001972675,
0.01388684380799532,
0.07808863371610641,
0.07333865016698837,
0.018339574337005615,
-0.01919405534863472,
-0.2616608738899231,
0.24205926060676575,
-0.004039466846734285,
0.020393166691064835,
-0.010021558031439781,
-0.02907947264611721,
-0.009054461494088173,
0.3215360641479492,
0.23173850774765015,
-0.19331027567386627,
-0.05439596250653267,
0.012476698495447636,
0.0026319981552660465,
-0.07542501389980316,
0.1094593033194542,
0.018875539302825928,
0.00423765042796731,
-0.05596840754151344,
0.04422816261649132,
-0.05862133949995041,
-0.050088413059711456,
-0.1257522851228714,
-0.04494333639740944,
0.05905360355973244,
-0.022934434935450554,
-0.03221585974097252,
0.10358119010925293,
-0.13487578928470612,
0.16533994674682617,
-0.10657414048910141,
0.05017029866576195,
-0.07994366437196732,
0.028935737907886505,
0.0966583639383316,
0.07425305992364883,
0.07716985791921616,
-0.06533738225698471,
0.0058298250660300255,
0.1655988246202469,
-0.0008596695261076093,
-0.18927240371704102,
0.013590690679848194,
0.1733534038066864,
0.06023263931274414,
0.19089101254940033,
0.01796259731054306,
-0.04964626953005791,
0.06430752575397491,
-0.045803219079971313,
-0.17122182250022888,
0.1270221322774887,
0.01922406442463398,
-0.06127423793077469,
0.04282431676983833,
0.008042684756219387,
-0.024938564747571945,
-0.09220714867115021,
0.030806705355644226,
0.015598767437040806,
0.012003421783447266,
-0.06450504809617996,
0.10216627269983292,
0.005314295180141926,
0.15338653326034546,
-0.1475324183702469,
0.07550209760665894,
0.0819275975227356,
-0.06108740344643593,
-0.012732340954244137,
-0.043435707688331604,
0.12170585989952087,
0.027922818437218666,
-0.05581796541810036,
-0.08375987410545349,
-0.0668790340423584,
-0.08820337802171707,
0.061364442110061646,
0.004165459889918566,
-0.106651172041893,
0.01591005176305771,
-0.057940948754549026,
-0.012328415177762508,
-0.09319394081830978,
-0.02908286266028881,
0.21767060458660126,
0.016501590609550476,
0.006692094262689352,
-0.0009450694778934121,
-0.010726286098361015,
-0.06910049915313721,
-0.11001616716384888,
-0.08330272138118744
] |
null | null | peft |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# results
This model is a fine-tuned version of [ybelkada/falcon-7b-sharded-bf16](https://huggingface.co/ybelkada/falcon-7b-sharded-bf16) on an unknown dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0002
- train_batch_size: 2
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 8
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: constant
- lr_scheduler_warmup_ratio: 0.03
- training_steps: 50
- mixed_precision_training: Native AMP
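
For reference, the sketch below shows how these settings map onto a `trl` `SFTTrainer` run on the guanaco dataset. The LoRA hyperparameters (`r`, `lora_alpha`, `target_modules`) are illustrative assumptions; they are not recorded in this card.

```python
# Hedged sketch reproducing the hyperparameters above; LoRA values are assumed.
from datasets import load_dataset
from peft import LoraConfig
from transformers import AutoModelForCausalLM, AutoTokenizer, TrainingArguments
from trl import SFTTrainer

base = "ybelkada/falcon-7b-sharded-bf16"
tokenizer = AutoTokenizer.from_pretrained(base)
model = AutoModelForCausalLM.from_pretrained(base, trust_remote_code=True)

dataset = load_dataset("timdettmers/openassistant-guanaco", split="train")

peft_config = LoraConfig(  # assumed adapter shape, not recorded in this card
    r=16, lora_alpha=32, lora_dropout=0.05,
    target_modules=["query_key_value"], task_type="CAUSAL_LM",
)

args = TrainingArguments(
    output_dir="results",
    learning_rate=2e-4,
    per_device_train_batch_size=2,
    per_device_eval_batch_size=8,
    gradient_accumulation_steps=4,  # effective train batch size 8
    lr_scheduler_type="constant",
    warmup_ratio=0.03,
    max_steps=50,
    seed=42,
    fp16=True,                      # native AMP mixed precision
)

trainer = SFTTrainer(
    model=model,
    args=args,
    train_dataset=dataset,
    peft_config=peft_config,
    dataset_text_field="text",
    tokenizer=tokenizer,
)
trainer.train()
```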
### Training results
### Framework versions
- PEFT 0.8.2
- Transformers 4.37.2
- Pytorch 2.2.0+cu121
- Datasets 2.17.0
- Tokenizers 0.15.1 | {"library_name": "peft", "tags": ["trl", "sft", "generated_from_trainer"], "datasets": ["timdettmers/openassistant-guanaco"], "base_model": "ybelkada/falcon-7b-sharded-bf16", "model-index": [{"name": "results", "results": []}]} | null | omarfarooq908/falcon-7b-finetuned01 | [
"peft",
"safetensors",
"trl",
"sft",
"generated_from_trainer",
"dataset:timdettmers/openassistant-guanaco",
"base_model:ybelkada/falcon-7b-sharded-bf16",
"region:us"
] | 2024-02-13T09:57:52+00:00 | [] | [] | TAGS
#peft #safetensors #trl #sft #generated_from_trainer #dataset-timdettmers/openassistant-guanaco #base_model-ybelkada/falcon-7b-sharded-bf16 #region-us
|
# results
This model is a fine-tuned version of ybelkada/falcon-7b-sharded-bf16 on an unknown dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0002
- train_batch_size: 2
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 8
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: constant
- lr_scheduler_warmup_ratio: 0.03
- training_steps: 50
- mixed_precision_training: Native AMP
### Training results
### Framework versions
- PEFT 0.8.2
- Transformers 4.37.2
- Pytorch 2.2.0+cu121
- Datasets 2.17.0
- Tokenizers 0.15.1 | [
"# results\n\nThis model is a fine-tuned version of ybelkada/falcon-7b-sharded-bf16 on an unknown dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 0.0002\n- train_batch_size: 2\n- eval_batch_size: 8\n- seed: 42\n- gradient_accumulation_steps: 4\n- total_train_batch_size: 8\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: constant\n- lr_scheduler_warmup_ratio: 0.03\n- training_steps: 50\n- mixed_precision_training: Native AMP",
"### Training results",
"### Framework versions\n\n- PEFT 0.8.2\n- Transformers 4.37.2\n- Pytorch 2.2.0+cu121\n- Datasets 2.17.0\n- Tokenizers 0.15.1"
] | [
"TAGS\n#peft #safetensors #trl #sft #generated_from_trainer #dataset-timdettmers/openassistant-guanaco #base_model-ybelkada/falcon-7b-sharded-bf16 #region-us \n",
"# results\n\nThis model is a fine-tuned version of ybelkada/falcon-7b-sharded-bf16 on an unknown dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 0.0002\n- train_batch_size: 2\n- eval_batch_size: 8\n- seed: 42\n- gradient_accumulation_steps: 4\n- total_train_batch_size: 8\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: constant\n- lr_scheduler_warmup_ratio: 0.03\n- training_steps: 50\n- mixed_precision_training: Native AMP",
"### Training results",
"### Framework versions\n\n- PEFT 0.8.2\n- Transformers 4.37.2\n- Pytorch 2.2.0+cu121\n- Datasets 2.17.0\n- Tokenizers 0.15.1"
] | [
63,
35,
6,
12,
8,
3,
140,
4,
39
] | [
"passage: TAGS\n#peft #safetensors #trl #sft #generated_from_trainer #dataset-timdettmers/openassistant-guanaco #base_model-ybelkada/falcon-7b-sharded-bf16 #region-us \n# results\n\nThis model is a fine-tuned version of ybelkada/falcon-7b-sharded-bf16 on an unknown dataset.## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 0.0002\n- train_batch_size: 2\n- eval_batch_size: 8\n- seed: 42\n- gradient_accumulation_steps: 4\n- total_train_batch_size: 8\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: constant\n- lr_scheduler_warmup_ratio: 0.03\n- training_steps: 50\n- mixed_precision_training: Native AMP### Training results### Framework versions\n\n- PEFT 0.8.2\n- Transformers 4.37.2\n- Pytorch 2.2.0+cu121\n- Datasets 2.17.0\n- Tokenizers 0.15.1"
] | [
-0.10961569845676422,
0.09110279381275177,
-0.0035167569294571877,
0.0876123458147049,
0.1337803602218628,
0.0230542179197073,
0.11325078457593918,
0.12332414835691452,
-0.07270585745573044,
0.08026962727308273,
0.03979307413101196,
0.037561457604169846,
0.0642426535487175,
0.13313578069210052,
-0.04795641824603081,
-0.22314992547035217,
0.012476316653192043,
-0.021143916994333267,
-0.07782386243343353,
0.10720838606357574,
0.11353952437639236,
-0.10075584053993225,
0.02812565118074417,
0.022877294570207596,
-0.1289321780204773,
0.026499100029468536,
-0.018951041623950005,
-0.039552368223667145,
0.10038825124502182,
0.013971762731671333,
0.13050074875354767,
0.024934465065598488,
0.11720086634159088,
-0.24321308732032776,
-0.0014119873521849513,
0.08757874369621277,
0.039064910262823105,
0.07600906491279602,
0.07889605313539505,
-0.008644205518066883,
0.07868651300668716,
-0.09596006572246552,
0.08932894468307495,
0.010301552712917328,
-0.11106681078672409,
-0.23333875834941864,
-0.11759389191865921,
0.07216162979602814,
0.0948609858751297,
0.07365071028470993,
0.002043316373601556,
0.11223604530096054,
-0.10567285120487213,
0.057428572326898575,
0.18767718970775604,
-0.24771885573863983,
-0.07725891470909119,
0.028542248532176018,
0.022843727841973305,
0.0705256536602974,
-0.09459622204303741,
-0.03990035131573677,
0.047707799822092056,
0.023469310253858566,
0.07773271948099136,
0.027856353670358658,
-0.08562786877155304,
-0.008083975873887539,
-0.1182544007897377,
-0.03155435994267464,
0.08433409035205841,
0.03702117130160332,
-0.04060366377234459,
-0.10840915143489838,
-0.03178854286670685,
-0.10714462399482727,
-0.019270626828074455,
-0.028352433815598488,
0.027912896126508713,
-0.04478504881262779,
-0.00813403818756342,
-0.01659807190299034,
-0.0713891014456749,
-0.06669970601797104,
0.02009119838476181,
0.1471208930015564,
0.04707075282931328,
0.019347308203577995,
-0.033456288278102875,
0.13603945076465607,
0.0005556275136768818,
-0.14283621311187744,
-0.007007868494838476,
-0.0033287755213677883,
-0.07738500088453293,
-0.05400943011045456,
-0.04756055399775505,
0.010981881991028786,
-0.004419412929564714,
0.15362033247947693,
-0.12335189431905746,
0.06990926712751389,
0.0009976860601454973,
0.002671457128599286,
-0.05857151746749878,
0.12944504618644714,
-0.05458475276827812,
-0.008452665992081165,
-0.004078148398548365,
0.1285526305437088,
-0.007418696768581867,
0.0023094466887414455,
-0.06014212220907211,
-0.030266011133790016,
0.06710110604763031,
0.05057915672659874,
-0.06087900698184967,
0.007436845451593399,
-0.06970778852701187,
-0.027558138594031334,
0.04145892709493637,
-0.12134456634521484,
0.04015936329960823,
0.024436239153146744,
-0.0914035439491272,
-0.05858876556158066,
0.01797393336892128,
0.03452387824654579,
-0.0021841113921254873,
0.11788386106491089,
-0.0617561936378479,
-0.00517273461446166,
-0.0987810492515564,
-0.0573558546602726,
0.004897539038211107,
-0.05432416498661041,
-0.010258067399263382,
-0.04190725460648537,
-0.2016565501689911,
-0.047947779297828674,
0.048491086810827255,
-0.07633910328149796,
-0.02994069829583168,
-0.02573590725660324,
-0.05986671894788742,
0.024124320596456528,
-0.01801280491054058,
0.1614573895931244,
-0.04942053556442261,
0.09053203463554382,
0.005672423634678125,
0.024637313559651375,
0.04882551357150078,
0.02685147151350975,
-0.06762008368968964,
0.03675728291273117,
-0.13058874011039734,
0.08017009496688843,
-0.08992404490709305,
0.008310269564390182,
-0.09494125097990036,
-0.09143372625112534,
-0.03673374652862549,
-0.01628001779317856,
0.09634071588516235,
0.12487302720546722,
-0.17941536009311676,
-0.02754979208111763,
0.19128401577472687,
-0.09134748578071594,
-0.09324518591165543,
0.07940074801445007,
-0.059554848819971085,
0.035840462893247604,
0.0514586977660656,
0.1550581157207489,
0.10264156013727188,
-0.12563996016979218,
0.021447448059916496,
-0.03205886483192444,
0.12298300862312317,
0.06805584579706192,
0.050762493163347244,
-0.020295921713113785,
0.021393587812781334,
0.005681018810719252,
-0.06561219692230225,
0.009373326785862446,
-0.0907771959900856,
-0.07846322655677795,
-0.041995417326688766,
-0.0877632275223732,
0.04447675123810768,
0.033851634711027145,
0.03505497798323631,
-0.08052678406238556,
-0.11498820036649704,
0.09380540996789932,
0.1286754161119461,
-0.04909466207027435,
0.0057093300856649876,
-0.07240758091211319,
0.04563622921705246,
0.0016224407590925694,
-0.05187836289405823,
-0.17956285178661346,
-0.10644961148500443,
0.0308371651917696,
-0.0634973868727684,
-0.02790248766541481,
-0.010891532525420189,
0.07958956807851791,
0.064501091837883,
-0.06812263280153275,
-0.02652451954782009,
-0.10940485447645187,
-0.009008288383483887,
-0.08813293278217316,
-0.18982002139091492,
-0.04318554699420929,
-0.024732723832130432,
0.22783394157886505,
-0.2384667843580246,
0.011177501641213894,
-0.00845300778746605,
0.1586318165063858,
0.04203154891729355,
-0.06519965827465057,
-0.021375559270381927,
0.04442233592271805,
0.011758465319871902,
-0.08271446079015732,
0.03136221691966057,
0.0015634470619261265,
-0.08428773283958435,
-0.03575807809829712,
-0.1193329319357872,
0.05577089264988899,
0.06476754695177078,
0.0936187356710434,
-0.09675554186105728,
-0.06114344671368599,
-0.08003469556570053,
-0.035796526819467545,
-0.0669674500823021,
0.00686983997002244,
0.18336579203605652,
0.01752781681716442,
0.1287093311548233,
-0.09705603122711182,
-0.083653524518013,
-0.004679759498685598,
-0.007197325583547354,
0.020555544644594193,
0.09712646901607513,
0.03590637817978859,
-0.07395215332508087,
0.09184391796588898,
0.06551878154277802,
-0.03659777715802193,
0.149469256401062,
-0.06282450258731842,
-0.09282362461090088,
-0.011818503960967064,
0.03930492699146271,
0.0057513779029250145,
0.12998968362808228,
-0.04063481092453003,
0.05348590761423111,
0.02139700949192047,
0.02385304681956768,
0.04553135856986046,
-0.2070884108543396,
-0.02854745276272297,
0.031804125756025314,
-0.04829920828342438,
-0.024158738553524017,
-0.022646218538284302,
0.04612814635038376,
0.1031455397605896,
0.013066248036921024,
-0.012916998006403446,
0.0006487224018201232,
-0.02657299116253853,
-0.08696480095386505,
0.18862572312355042,
-0.11320069432258606,
-0.11210036277770996,
-0.08599817007780075,
0.042552173137664795,
-0.013729141093790531,
-0.03690190240740776,
0.005328271072357893,
-0.08697252720594406,
-0.034830063581466675,
-0.08429361879825592,
-0.04727361723780632,
0.02269815467298031,
-0.030669497326016426,
0.0653386041522026,
0.008150363340973854,
0.11339130997657776,
-0.11911458522081375,
0.011365411803126335,
-0.03664412349462509,
-0.07324998825788498,
0.01838698983192444,
0.04257243499159813,
0.07279501855373383,
0.13243749737739563,
-0.0012886899057775736,
0.02007824368774891,
-0.024003412574529648,
0.239364892244339,
-0.09696406126022339,
-0.02367653325200081,
0.1212243065237999,
-0.011418560519814491,
0.05299661308526993,
0.11826426535844803,
0.04572169482707977,
-0.10950484871864319,
0.02498253807425499,
0.08586235344409943,
-0.018518129363656044,
-0.25088122487068176,
-0.042274702340364456,
-0.02858944796025753,
-0.11376266181468964,
0.08212137222290039,
0.03354942798614502,
-0.00496323686093092,
0.03526009991765022,
-0.032331619411706924,
-0.003774862503632903,
0.02026441879570484,
0.05562283843755722,
0.04354091361165047,
0.05716099962592125,
0.10546769946813583,
-0.01267991866916418,
-0.026891592890024185,
0.05465511605143547,
0.01697375811636448,
0.23855772614479065,
-0.03808662295341492,
0.046203188598155975,
0.05850642919540405,
0.14777140319347382,
-0.0334181971848011,
0.03686607629060745,
-0.018567413091659546,
-0.029488487169146538,
0.0009992580162361264,
-0.06770826131105423,
-0.012123980559408665,
0.025440052151679993,
-0.0672927275300026,
0.06271877884864807,
-0.09220573306083679,
-0.021759187802672386,
0.03318922966718674,
0.2572377026081085,
0.055248066782951355,
-0.2658749520778656,
-0.07172582298517227,
0.02172696776688099,
-0.027541832998394966,
-0.07442940771579742,
-0.0076450807973742485,
0.109627366065979,
-0.1281692534685135,
0.06864690780639648,
-0.06316263973712921,
0.08954128623008728,
-0.0013875628355890512,
-0.0030475377570837736,
0.07156884670257568,
0.09306594729423523,
-0.01333592925220728,
0.06697327643632889,
-0.1976689100265503,
0.2350102812051773,
0.0117581682279706,
0.1065022423863411,
-0.054621849209070206,
0.039001960307359695,
0.012129777111113071,
0.03903835266828537,
0.07371283322572708,
-0.005137981381267309,
-0.0013345219194889069,
-0.1712556630373001,
-0.06521078199148178,
0.03800588846206665,
0.11358286440372467,
-0.0577687993645668,
0.0929984524846077,
-0.026352358981966972,
0.02450278215110302,
0.04504573345184326,
-0.05114984139800072,
-0.15547434985637665,
-0.11941322684288025,
0.03030581958591938,
-0.00837137270718813,
-0.037000950425863266,
-0.10619710385799408,
-0.10369464010000229,
-0.05159040167927742,
0.12377259135246277,
-0.009364373981952667,
-0.06640733033418655,
-0.14078015089035034,
0.06868495047092438,
0.15331022441387177,
-0.04405047744512558,
0.027521830052137375,
0.03442689776420593,
0.12113305926322937,
0.021505799144506454,
-0.04245072603225708,
0.061381492763757706,
-0.07239039242267609,
-0.20771639049053192,
-0.06881976872682571,
0.1404561549425125,
0.08091053366661072,
0.03888271749019623,
-0.00007391520193777978,
0.03131159394979477,
0.0188535675406456,
-0.08758464455604553,
0.015452063642442226,
0.06615082174539566,
0.07339838147163391,
0.05003334954380989,
-0.08138465136289597,
0.05990337207913399,
-0.04578489437699318,
-0.03203355520963669,
0.06119382381439209,
0.2589911222457886,
-0.08961927890777588,
0.07665276527404785,
0.01876133866608143,
-0.07450314611196518,
-0.14122825860977173,
0.06432882696390152,
0.1740407645702362,
0.030445093289017677,
0.0668591558933258,
-0.17494837939739227,
0.08310670405626297,
0.1323206126689911,
-0.043249715119600296,
0.07879142463207245,
-0.31635627150535583,
-0.1349133402109146,
0.04945389926433563,
0.10467628389596939,
-0.007479338441044092,
-0.13817332684993744,
-0.045460112392902374,
-0.016011014580726624,
-0.12880173325538635,
0.09840189665555954,
-0.09874086081981659,
0.08136598765850067,
-0.005046093836426735,
0.0757141187787056,
0.04352480545639992,
-0.034621890634298325,
0.17374804615974426,
0.0014710393734276295,
0.09290357679128647,
-0.03872304782271385,
0.05179661512374878,
0.021474622189998627,
-0.061118729412555695,
-0.0008733102004043758,
-0.020766854286193848,
0.06877142190933228,
-0.16174034774303436,
-0.01616600900888443,
-0.0996415987610817,
0.038771916180849075,
-0.07035616785287857,
-0.05417675897479057,
-0.03137403726577759,
0.07191126048564911,
0.0707329735159874,
-0.023367371410131454,
0.040439024567604065,
-0.012047826312482357,
0.18047982454299927,
0.10101500153541565,
0.08078206330537796,
-0.0011306335218250751,
-0.03956681489944458,
0.0020930832251906395,
0.0018338286317884922,
0.0446905754506588,
-0.12716154754161835,
0.0467858649790287,
0.1291719377040863,
0.03822265565395355,
0.13747791945934296,
0.03335205093026161,
-0.08006379008293152,
-0.011973707005381584,
0.06766767799854279,
-0.1033790335059166,
-0.08800846338272095,
-0.00046989161637611687,
0.018251873552799225,
-0.11496532708406448,
-0.013061774894595146,
0.1018485426902771,
-0.033693548291921616,
-0.025838179513812065,
-0.014836727641522884,
0.017893636599183083,
-0.02357359416782856,
0.20175515115261078,
0.04324589669704437,
0.07144024223089218,
-0.07298361510038376,
0.09453659504652023,
0.06033732742071152,
-0.04087742045521736,
0.021986041218042374,
0.049387674778699875,
-0.07795742154121399,
0.0048648035153746605,
0.044676922261714935,
0.12399286776781082,
-0.011138100177049637,
-0.049674540758132935,
-0.10054108500480652,
-0.10504429042339325,
0.04266783595085144,
0.11935823410749435,
0.03714561089873314,
-0.005448494106531143,
0.0006110554677434266,
0.028809253126382828,
-0.12397164106369019,
0.09589912742376328,
0.06375189870595932,
0.06485210359096527,
-0.1119081899523735,
0.11784590780735016,
0.012472543865442276,
-0.0005822777166031301,
-0.0014768739929422736,
0.043299600481987,
-0.09598954021930695,
-0.004509010352194309,
-0.15470609068870544,
-0.011099094524979591,
0.015519526787102222,
0.0058834366500377655,
-0.015909934416413307,
-0.060399871319532394,
-0.038451820611953735,
0.03478734940290451,
-0.09581657499074936,
-0.03849288821220398,
0.005515212193131447,
0.0461929515004158,
-0.1322723925113678,
-0.02744896523654461,
0.04634563624858856,
-0.10319861769676208,
0.08525032550096512,
0.05551166087388992,
0.053224824368953705,
0.043343592435121536,
-0.15350759029388428,
0.006018739193677902,
0.013295810669660568,
0.017830852419137955,
0.04544370621442795,
-0.1259806603193283,
0.008279120549559593,
-0.034207943826913834,
0.018718887120485306,
0.015115777030587196,
0.027318455278873444,
-0.124967560172081,
-0.025599513202905655,
-0.02571851573884487,
-0.05390699952840805,
-0.03713512793183327,
0.04048876836895943,
0.0743723064661026,
0.05001584067940712,
0.12850572168827057,
-0.08302558213472366,
0.045264557003974915,
-0.22816714644432068,
-0.0491473451256752,
-0.0058554657734930515,
-0.002381295897066593,
-0.08099493384361267,
-0.008814494125545025,
0.08123452216386795,
-0.04772988706827164,
0.12473902851343155,
-0.01648700051009655,
0.08735540509223938,
0.02765216864645481,
-0.10013855248689651,
0.009001029655337334,
0.01280999556183815,
0.1808868646621704,
0.05178183317184448,
-0.00027669346309266984,
0.08601529896259308,
-0.02752099372446537,
0.02378096431493759,
0.04389223828911781,
0.17294375598430634,
0.17030344903469086,
-0.02556828036904335,
0.024979563429951668,
0.07105521857738495,
-0.12624111771583557,
-0.07731383293867111,
0.08556410670280457,
-0.03757096081972122,
0.06351310759782791,
-0.051891956478357315,
0.16831345856189728,
0.09958061575889587,
-0.20702604949474335,
0.05360415577888489,
-0.055521391332149506,
-0.09166499227285385,
-0.13343045115470886,
-0.0074104261584579945,
-0.07586830109357834,
-0.1221250519156456,
0.031432799994945526,
-0.13585466146469116,
0.045116033405065536,
0.10688488930463791,
0.014378729276359081,
0.01619100756943226,
0.14436200261116028,
0.01257791556417942,
0.005870061460882425,
0.07172996550798416,
0.03823227807879448,
0.030090855434536934,
-0.06644561141729355,
-0.0807584598660469,
0.04371369630098343,
0.0022871727123856544,
0.05576369911432266,
-0.05889271944761276,
-0.02372516691684723,
0.006880988832563162,
0.023584412410855293,
-0.06319602578878403,
0.03437065705657005,
-0.002112528309226036,
0.06597444415092468,
0.04751213639974594,
0.04611055925488472,
0.03606449440121651,
-0.05616140738129616,
0.30163195729255676,
-0.06254276633262634,
-0.06879530102014542,
-0.13929063081741333,
0.2238413542509079,
0.024176442995667458,
0.00966909620910883,
0.06488805264234543,
-0.10992489755153656,
-0.0051823994144797325,
0.1346847116947174,
0.11236470192670822,
-0.10055132955312729,
-0.00852272380143404,
-0.014355412684381008,
-0.012715272605419159,
-0.04334995150566101,
0.11841924488544464,
0.09903167933225632,
0.04574328660964966,
-0.04920908436179161,
-0.0028191052842885256,
-0.01054058875888586,
-0.0014419665094465017,
-0.04981203377246857,
0.1078694686293602,
0.0016943307127803564,
0.004955577198415995,
-0.052363403141498566,
0.08377458900213242,
0.02886279486119747,
-0.1782696396112442,
0.07001890242099762,
-0.18950219452381134,
-0.20911097526550293,
-0.029615851119160652,
0.03956591337919235,
-0.027871176600456238,
0.05622471496462822,
-0.016590090468525887,
-0.029733415693044662,
0.10934779047966003,
-0.012105252593755722,
0.011247543618083,
-0.1279134303331375,
0.07101526111364365,
-0.0709419697523117,
0.25345540046691895,
-0.004024009685963392,
0.06571611762046814,
0.09594258666038513,
0.013862370513379574,
-0.10978425294160843,
0.04041757062077522,
0.08479512482881546,
-0.09304389357566833,
-0.003334348089993,
0.13422758877277374,
-0.04762246459722519,
0.05930766090750694,
0.06548003107309341,
-0.15126928687095642,
0.016895322129130363,
-0.013809804804623127,
-0.04785357788205147,
-0.08834222704172134,
0.0065979743376374245,
-0.05827190354466438,
0.13987255096435547,
0.22898079454898834,
-0.04482682794332504,
0.027450313791632652,
-0.060947053134441376,
0.04939534142613411,
0.04365352913737297,
0.13234393298625946,
-0.02274542674422264,
-0.22459764778614044,
0.04293078929185867,
0.05855265259742737,
0.003077256493270397,
-0.22930897772312164,
-0.0678446814417839,
0.04196300357580185,
-0.08055856823921204,
-0.04700806736946106,
0.119124636054039,
0.04549054428935051,
0.04456469044089317,
-0.02811264432966709,
-0.13445347547531128,
-0.05092567577958107,
0.15645676851272583,
-0.12511654198169708,
-0.030408458784222603
] |
null | null | transformers |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# roberta-base-detect-cheapfake-combined-train-test-contradict-5-5
This model is a fine-tuned version of [roberta-base](https://huggingface.co/roberta-base) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 0.4564
- Accuracy: 0.87
- F1: 0.8550
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-06
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5
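
As a rough guide, these settings correspond to a `Trainer` configuration like the sketch below; the dataset, preprocessing, and label count (assumed binary here) are not recorded in this card.

```python
# Hedged sketch of the training setup; dataset/preprocessing are placeholders.
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

tokenizer = AutoTokenizer.from_pretrained("roberta-base")
model = AutoModelForSequenceClassification.from_pretrained(
    "roberta-base", num_labels=2  # assumed binary cheapfake/genuine labels
)

args = TrainingArguments(
    output_dir="out",
    learning_rate=5e-6,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=5,
    evaluation_strategy="epoch",  # matches the per-epoch results below
)

# trainer = Trainer(model=model, args=args,
#                   train_dataset=..., eval_dataset=...)  # dataset not recorded
# trainer.train()
```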
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:------:|
| No log | 1.0 | 172 | 0.4396 | 0.8133 | 0.7565 |
| No log | 2.0 | 344 | 0.3395 | 0.8567 | 0.8401 |
| 0.1806 | 3.0 | 516 | 0.4137 | 0.8633 | 0.8417 |
| 0.1806 | 4.0 | 688 | 0.4293 | 0.8633 | 0.8441 |
| 0.1806 | 5.0 | 860 | 0.4564 | 0.87 | 0.8550 |
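
A minimal inference sketch, assuming the fine-tuned checkpoint is published under the repository id shown for this card; the returned labels depend on the model's (unrecorded) label map.

```python
# Hedged usage example for the published checkpoint.
from transformers import pipeline

clf = pipeline(
    "text-classification",
    model="hoanghoavienvo/roberta-base-detect-cheapfake-combined-train-test-contradict-5-5",
)
print(clf("A caption paired with an unrelated news image."))
```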
### Framework versions
- Transformers 4.37.0
- Pytorch 2.1.2
- Datasets 2.1.0
- Tokenizers 0.15.1
| {"license": "mit", "tags": ["generated_from_trainer"], "metrics": ["accuracy", "f1"], "base_model": "roberta-base", "model-index": [{"name": "roberta-base-detect-cheapfake-combined-train-test-contradict-5-5", "results": []}]} | text-classification | hoanghoavienvo/roberta-base-detect-cheapfake-combined-train-test-contradict-5-5 | [
"transformers",
"tensorboard",
"safetensors",
"roberta",
"text-classification",
"generated_from_trainer",
"base_model:roberta-base",
"license:mit",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | 2024-02-13T09:58:11+00:00 | [] | [] | TAGS
#transformers #tensorboard #safetensors #roberta #text-classification #generated_from_trainer #base_model-roberta-base #license-mit #autotrain_compatible #endpoints_compatible #region-us
| roberta-base-detect-cheapfake-combined-train-test-contradict-5-5
================================================================
This model is a fine-tuned version of roberta-base on the None dataset.
It achieves the following results on the evaluation set:
* Loss: 0.4564
* Accuracy: 0.87
* F1: 0.8550
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 5e-06
* train\_batch\_size: 32
* eval\_batch\_size: 32
* seed: 42
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* num\_epochs: 5
### Training results
### Framework versions
* Transformers 4.37.0
* Pytorch 2.1.2
* Datasets 2.1.0
* Tokenizers 0.15.1
| [
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 5e-06\n* train\\_batch\\_size: 32\n* eval\\_batch\\_size: 32\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 5",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.37.0\n* Pytorch 2.1.2\n* Datasets 2.1.0\n* Tokenizers 0.15.1"
] | [
"TAGS\n#transformers #tensorboard #safetensors #roberta #text-classification #generated_from_trainer #base_model-roberta-base #license-mit #autotrain_compatible #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 5e-06\n* train\\_batch\\_size: 32\n* eval\\_batch\\_size: 32\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 5",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.37.0\n* Pytorch 2.1.2\n* Datasets 2.1.0\n* Tokenizers 0.15.1"
] | [
63,
98,
4,
30
] | [
"passage: TAGS\n#transformers #tensorboard #safetensors #roberta #text-classification #generated_from_trainer #base_model-roberta-base #license-mit #autotrain_compatible #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 5e-06\n* train\\_batch\\_size: 32\n* eval\\_batch\\_size: 32\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 5### Training results### Framework versions\n\n\n* Transformers 4.37.0\n* Pytorch 2.1.2\n* Datasets 2.1.0\n* Tokenizers 0.15.1"
] | [
-0.08621460944414139,
0.07034635543823242,
-0.0019199398811906576,
0.10115745663642883,
0.16650773584842682,
0.014715912751853466,
0.16630806028842926,
0.10816951096057892,
-0.10288658738136292,
0.0370427630841732,
0.12479393929243088,
0.15665750205516815,
-0.00386789720505476,
0.13193999230861664,
-0.07780896127223969,
-0.2460506111383438,
0.0017037739744409919,
0.03195207566022873,
-0.08352852612733841,
0.11412577331066132,
0.10524716228246689,
-0.13441744446754456,
0.08543899655342102,
-0.010788877494633198,
-0.20579494535923004,
0.03757031634449959,
0.04433418810367584,
-0.06682558357715607,
0.13942250609397888,
0.04353126883506775,
0.138076514005661,
0.03182223439216614,
0.08476164191961288,
-0.19053460657596588,
0.019515322521328926,
0.06025005504488945,
-0.01696426421403885,
0.08433520048856735,
0.04447320103645325,
-0.02846045419573784,
0.11073671281337738,
-0.09475521743297577,
0.06298670917749405,
0.02233942784368992,
-0.13000643253326416,
-0.20548291504383087,
-0.07241541147232056,
0.03683169558644295,
0.09002912789583206,
0.08057185262441635,
-0.018453553318977356,
0.15328148007392883,
-0.060680683702230453,
0.10176348686218262,
0.2039148211479187,
-0.3121492266654968,
-0.06807814538478851,
0.0557270273566246,
0.026285406202077866,
0.08735282719135284,
-0.115474633872509,
0.0026458969805389643,
0.07811914384365082,
0.020640091970562935,
0.12106876820325851,
-0.037907227873802185,
-0.05945747345685959,
0.01250428706407547,
-0.1404155045747757,
0.004329723306000233,
0.14547623693943024,
0.04535907506942749,
-0.04377753287553787,
-0.04129227623343468,
-0.05750489607453346,
-0.12297330051660538,
-0.041320838034152985,
-0.03006129153072834,
0.04189131036400795,
-0.028421731665730476,
-0.11054307967424393,
-0.017965661361813545,
-0.1175297349691391,
-0.07437912374734879,
-0.06167883425951004,
0.1773151457309723,
0.03495987877249718,
0.0025804294273257256,
-0.03309919312596321,
0.09120360016822815,
-0.038548119366168976,
-0.12727007269859314,
0.01885819435119629,
0.023834383115172386,
0.004928858485072851,
-0.0724116712808609,
-0.06592026352882385,
-0.09593129903078079,
0.03041834942996502,
0.15105116367340088,
-0.06584733724594116,
0.050920501351356506,
0.00844074971973896,
0.043425071984529495,
-0.09286519140005112,
0.156424880027771,
-0.04224889352917671,
-0.0200910996645689,
0.02856389805674553,
0.05661073699593544,
0.039455950260162354,
-0.002625084714964032,
-0.12428244203329086,
0.018618585541844368,
0.11993711441755295,
0.014321994967758656,
-0.0776013731956482,
0.08416330814361572,
-0.03710022568702698,
0.004922328982502222,
0.012066755443811417,
-0.09140391647815704,
0.03385770693421364,
0.003724277252331376,
-0.060643598437309265,
-0.06862009316682816,
0.025731030851602554,
0.01758686453104019,
0.010992304421961308,
0.10249429941177368,
-0.09221751987934113,
0.02013399265706539,
-0.09168679267168045,
-0.13208584487438202,
0.009481038898229599,
-0.06193193048238754,
0.03720010817050934,
-0.11700829863548279,
-0.15841850638389587,
-0.01614351198077202,
0.04650472104549408,
-0.026190951466560364,
-0.01897256076335907,
-0.054438266903162,
-0.08079638332128525,
0.012490568682551384,
-0.015759531408548355,
0.09302414953708649,
-0.055287234485149384,
0.09891168028116226,
0.06301828473806381,
0.0675201267004013,
-0.05937451496720314,
0.031105026602745056,
-0.10197152197360992,
0.014138037338852882,
-0.20764753222465515,
0.02310292422771454,
-0.06100793182849884,
0.07491709291934967,
-0.07645316421985626,
-0.08150498569011688,
-0.005314336623996496,
0.024412302300333977,
0.07218009233474731,
0.08865291625261307,
-0.14711210131645203,
-0.07527957856655121,
0.16828382015228271,
-0.10392335057258606,
-0.1358744204044342,
0.11798913776874542,
-0.0721166729927063,
0.07070565968751907,
0.0756799504160881,
0.19184152781963348,
0.0618230402469635,
-0.0861574038863182,
0.005990234669297934,
-0.012431567534804344,
0.0401887446641922,
-0.05041053146123886,
0.05824347212910652,
0.008431577123701572,
-0.0034512151032686234,
0.011282360181212425,
-0.02325744368135929,
0.048965733498334885,
-0.09330045431852341,
-0.07874763011932373,
-0.03844728320837021,
-0.10583189874887466,
0.05032626539468765,
0.06182317063212395,
0.08372177183628082,
-0.13273939490318298,
-0.08757881075143814,
0.10252247005701065,
0.0725402757525444,
-0.07010117918252945,
0.018913112580776215,
-0.07962192595005035,
0.06808958947658539,
-0.05985528603196144,
-0.029801342636346817,
-0.16415730118751526,
-0.05009562149643898,
-0.002676796168088913,
0.031824056059122086,
0.0347660593688488,
0.014260555617511272,
0.08079540729522705,
0.07340314984321594,
-0.0760706290602684,
-0.02627664804458618,
0.001719681778922677,
0.015024897642433643,
-0.1262606680393219,
-0.20914557576179504,
-0.0023294438142329454,
-0.04222992807626724,
0.1321078985929489,
-0.25167644023895264,
0.05542728677392006,
0.0058716339990496635,
0.08717125654220581,
0.04098164662718773,
0.002394840121269226,
-0.04215797781944275,
0.07370348274707794,
-0.05050470307469368,
-0.05236615985631943,
0.05169191583991051,
0.005766124464571476,
-0.09053945541381836,
-0.04152698442339897,
-0.15880395472049713,
0.19915197789669037,
0.14128848910331726,
-0.11368162930011749,
-0.10693640261888504,
-0.0018688521813601255,
-0.03975243493914604,
-0.021979743614792824,
-0.048833996057510376,
0.015094748698174953,
0.12209979444742203,
-0.0269149336963892,
0.15366685390472412,
-0.07238663733005524,
-0.03532639890909195,
0.020037991926074028,
-0.06456677615642548,
0.009610732086002827,
0.10727270692586899,
0.08742791414260864,
-0.13918396830558777,
0.15090174973011017,
0.14885742962360382,
-0.10853871703147888,
0.1527588665485382,
-0.0332222580909729,
-0.05289981886744499,
-0.02175251953303814,
0.0012371899792924523,
0.015059403143823147,
0.109502412378788,
-0.11255775392055511,
-0.01091880351305008,
-0.0013665318256244063,
0.007489899173378944,
0.021909918636083603,
-0.2227829545736313,
-0.03621427342295647,
0.029583213850855827,
-0.03281821683049202,
0.013786832801997662,
-0.019375290721654892,
-0.00892646238207817,
0.10360633581876755,
-0.004862393252551556,
-0.08221158385276794,
0.04331734776496887,
0.0006397544057108462,
-0.09185978025197983,
0.22285795211791992,
-0.07136739790439606,
-0.11943015456199646,
-0.13255707919597626,
-0.06425787508487701,
-0.04347159340977669,
0.033622320741415024,
0.06763775646686554,
-0.08547840267419815,
-0.03778986632823944,
-0.09566111862659454,
0.007720544934272766,
0.02329927682876587,
0.0344257727265358,
-0.0016292420914396644,
0.015377731062471867,
0.07979975640773773,
-0.10849437862634659,
-0.008027816191315651,
-0.05612938851118088,
-0.07000115513801575,
0.045762114226818085,
0.030557934194803238,
0.11570652574300766,
0.1440325677394867,
-0.04411771893501282,
-0.0012083378387615085,
-0.043645355850458145,
0.22117897868156433,
-0.06906338036060333,
-0.017804423347115517,
0.12098298966884613,
-0.018330426886677742,
0.040507640689611435,
0.1338038295507431,
0.06436765938997269,
-0.09637270122766495,
0.03533366322517395,
0.04467233270406723,
-0.032245371490716934,
-0.2173183113336563,
-0.029396653175354004,
-0.028981659561395645,
-0.012439590878784657,
0.0846438854932785,
0.041100725531578064,
0.048482928425073624,
0.0778949186205864,
0.03482109308242798,
0.06626007705926895,
-0.006496085785329342,
0.07564367353916168,
0.10486944764852524,
0.04796665534377098,
0.13633906841278076,
-0.05979489907622337,
-0.07824981957674026,
0.027423327788710594,
-0.016282638534903526,
0.19207338988780975,
0.02345864474773407,
0.10902959853410721,
0.05600495636463165,
0.14305923879146576,
0.012536057271063328,
0.06671948730945587,
-0.0012640421045944095,
-0.06476438790559769,
-0.0017678000731393695,
-0.04603888466954231,
-0.011062346398830414,
0.0424518920481205,
-0.09809726476669312,
0.05192866176366806,
-0.11623319983482361,
0.014843374490737915,
0.06703438609838486,
0.20600728690624237,
0.057366516441106796,
-0.32683876156806946,
-0.0957888662815094,
0.027665939182043076,
-0.02500305138528347,
-0.016003021970391273,
0.02404729276895523,
0.11640861630439758,
-0.04653707519173622,
0.04369260370731354,
-0.06907264143228531,
0.0758499950170517,
-0.025330089032649994,
0.04277022182941437,
0.03960750624537468,
0.09704454243183136,
-0.02770659327507019,
0.06778678297996521,
-0.2931763529777527,
0.2799687683582306,
0.02005266211926937,
0.08785542100667953,
-0.0463939793407917,
-0.00894091371446848,
0.030869992449879646,
0.10063181072473526,
0.06270267069339752,
-0.030952533707022667,
-0.10102472454309464,
-0.19663341343402863,
-0.030363373458385468,
0.03185392543673515,
0.1124514639377594,
-0.017430542036890984,
0.10546272993087769,
-0.03484777361154556,
0.0047864909283816814,
0.09675119072198868,
-0.03662927448749542,
-0.09195798635482788,
-0.08765549212694168,
-0.03306208923459053,
0.027491334825754166,
-0.029358653351664543,
-0.08068525046110153,
-0.10269691795110703,
-0.1264357715845108,
0.17448873817920685,
-0.028650738298892975,
-0.015903662890195847,
-0.1050434410572052,
0.08985832333564758,
0.03873691335320473,
-0.08446221053600311,
0.041750937700271606,
0.015750380232930183,
0.08344025909900665,
0.021724918857216835,
-0.06324604898691177,
0.1342329978942871,
-0.06491038203239441,
-0.16354233026504517,
-0.062297314405441284,
0.09393078833818436,
0.0215628519654274,
0.041513506323099136,
0.006853341590613127,
0.012888304889202118,
-0.009470623917877674,
-0.07694359123706818,
0.030128495767712593,
-0.036660972982645035,
0.05633040890097618,
0.016162028536200523,
-0.06458757817745209,
-0.02503466233611107,
-0.06137208268046379,
-0.031021272763609886,
0.16852742433547974,
0.2912300229072571,
-0.08650317043066025,
-0.014204743318259716,
0.06477013975381851,
-0.0666562020778656,
-0.21708612143993378,
0.0695045068860054,
0.015195329673588276,
0.00017304415814578533,
0.05297519639134407,
-0.14020195603370667,
0.12152352929115295,
0.09696492552757263,
-0.02311311848461628,
0.09208936989307404,
-0.2660888433456421,
-0.14487594366073608,
0.13833503425121307,
0.16948775947093964,
0.13819974660873413,
-0.16753290593624115,
-0.028249239549040794,
-0.041832804679870605,
-0.07565026730298996,
0.10307412594556808,
-0.15185464918613434,
0.11157113313674927,
0.0009330498287454247,
0.05784858763217926,
0.005438691005110741,
-0.05826614424586296,
0.11652719229459763,
-0.017209693789482117,
0.12986524403095245,
-0.06930266320705414,
-0.03487389534711838,
0.06814896315336227,
-0.047473613172769547,
0.018708450719714165,
-0.09876470267772675,
0.03085518628358841,
-0.04120754078030586,
-0.03367144614458084,
-0.054560791701078415,
0.05033528432250023,
-0.036152809858322144,
-0.07031293213367462,
-0.04981227219104767,
0.03086826577782631,
0.019111763685941696,
-0.01952076144516468,
0.17663782835006714,
-0.0020540719851851463,
0.1839490383863449,
0.14268068969249725,
0.09166259318590164,
-0.057264018803834915,
0.011308937333524227,
0.00958600640296936,
-0.042966630309820175,
0.05829288437962532,
-0.15859676897525787,
0.03947537764906883,
0.11097300052642822,
0.011818347498774529,
0.15359733998775482,
0.08246209472417831,
-0.03407113626599312,
0.029384920373558998,
0.08883360028266907,
-0.16351072490215302,
-0.11217951029539108,
-0.00722486712038517,
-0.0611744225025177,
-0.10050814598798752,
0.08688836544752121,
0.12312385439872742,
-0.07501193135976791,
0.0005258176242932677,
-0.01341442484408617,
0.00048384827096015215,
-0.0507504940032959,
0.18453915417194366,
0.09220081567764282,
0.045537300407886505,
-0.0735415443778038,
0.07366827875375748,
0.041846778243780136,
-0.06537298858165741,
0.01792643405497074,
0.05824901536107063,
-0.07235166430473328,
-0.0520692877471447,
0.07197844237089157,
0.21372637152671814,
-0.06219947338104248,
-0.05331191048026085,
-0.1639651507139206,
-0.11016421020030975,
0.0420074425637722,
0.21725662052631378,
0.10022244602441788,
0.0015682053053751588,
-0.02477157488465309,
0.02682790905237198,
-0.13826236128807068,
0.1056167259812355,
0.035438958555459976,
0.08831042796373367,
-0.15440605580806732,
0.16504178941249847,
-0.004934485536068678,
0.00738111324608326,
-0.03403133153915405,
0.03900950029492378,
-0.13402023911476135,
-0.001358143868856132,
-0.12975062429904938,
-0.017948977649211884,
-0.036150187253952026,
0.00705350935459137,
0.011725328862667084,
-0.05977802351117134,
-0.073634073138237,
0.011922621168196201,
-0.10703765600919724,
-0.006815774366259575,
0.0448417030274868,
0.04978194087743759,
-0.13038961589336395,
-0.035286642611026764,
0.013649185188114643,
-0.05796428769826889,
0.0619937889277935,
0.01848706416785717,
0.021011264994740486,
0.06734245270490646,
-0.19985172152519226,
0.03395448997616768,
0.07704784721136093,
-0.012239503674209118,
0.050021715462207794,
-0.07495354861021042,
-0.005170348100364208,
-0.000559842272195965,
0.0739847794175148,
0.024887854233384132,
0.0753970667719841,
-0.12177865952253342,
0.021222397685050964,
-0.04275210574269295,
-0.07630109041929245,
-0.061168380081653595,
0.035573262721300125,
0.08181490004062653,
-0.009855812415480614,
0.2004423588514328,
-0.11339035630226135,
0.005301987286657095,
-0.20558036863803864,
0.01143474131822586,
-0.0068168132565915585,
-0.11917541176080704,
-0.11895749717950821,
-0.05339057371020317,
0.0550699420273304,
-0.06255640834569931,
0.14445756375789642,
0.027589716017246246,
0.022842060774564743,
0.04499496892094612,
-0.04698629304766655,
0.04499046131968498,
0.03900059312582016,
0.23826658725738525,
0.022651933133602142,
-0.040031518787145615,
0.011317627504467964,
0.05503624305129051,
0.1177772507071495,
0.07435490936040878,
0.18096236884593964,
0.16575883328914642,
-0.07952086627483368,
0.10649328678846359,
0.04639053717255592,
-0.04334588348865509,
-0.11245573312044144,
0.05445774644613266,
-0.039720553904771805,
0.06882435083389282,
-0.028617139905691147,
0.16729699075222015,
0.13477405905723572,
-0.15014676749706268,
0.008578702807426453,
-0.06372388452291489,
-0.08196338266134262,
-0.1247776448726654,
-0.045366864651441574,
-0.11185631901025772,
-0.15185493230819702,
0.00907887052744627,
-0.11548378318548203,
0.0030371355824172497,
0.08570306748151779,
0.009196741506457329,
-0.022737177088856697,
0.1770169734954834,
0.017320849001407623,
0.03781522065401077,
0.04434143751859665,
-0.005049495026469231,
-0.034411150962114334,
-0.08074658364057541,
-0.07970217615365982,
0.0009836758254095912,
-0.03258924558758736,
0.011286775581538677,
-0.05255812406539917,
-0.03520456701517105,
0.04501992464065552,
-0.02387145720422268,
-0.10394816100597382,
0.016458895057439804,
0.0423688068985939,
0.05770774930715561,
0.03840900957584381,
0.012109171599149704,
-0.003801998682320118,
-0.0024740351364016533,
0.23563319444656372,
-0.07412221282720566,
-0.07752007246017456,
-0.09853071719408035,
0.2599938213825226,
0.0445479080080986,
0.031177973374724388,
0.0011631363304331899,
-0.1040104404091835,
0.038260821253061295,
0.21774843335151672,
0.20907457172870636,
-0.07038480788469315,
0.01571628637611866,
-0.020128199830651283,
-0.010362588800489902,
-0.03143841028213501,
0.0996226817369461,
0.10291974991559982,
0.018538620322942734,
-0.08051638305187225,
-0.051019784063100815,
-0.03268466889858246,
-0.002526161726564169,
-0.04672601819038391,
0.06364256143569946,
0.046502791345119476,
0.019397519528865814,
-0.048147737979888916,
0.06624621897935867,
-0.03716207295656204,
-0.11400065571069717,
0.056192513555288315,
-0.193911612033844,
-0.14073801040649414,
-0.011184628121554852,
0.11281964927911758,
-0.01577158086001873,
0.06394682079553604,
-0.03357775881886482,
-0.004784015938639641,
0.02422148361802101,
-0.016762642189860344,
-0.07807765901088715,
-0.06770554929971695,
0.06420914828777313,
-0.10638224333524704,
0.2148200273513794,
-0.04502749443054199,
0.042727094143629074,
0.1441943198442459,
0.034243207424879074,
-0.06757210195064545,
0.10389722883701324,
0.03781982138752937,
-0.07656672596931458,
0.042372412979602814,
0.09151187539100647,
-0.04608732461929321,
0.10941384732723236,
0.05434328690171242,
-0.14807142317295074,
0.029908789321780205,
-0.09365039318799973,
-0.08575345575809479,
-0.0612005740404129,
-0.03713468089699745,
-0.06510815024375916,
0.12557339668273926,
0.18704690039157867,
-0.029371334239840508,
0.0244159996509552,
-0.05118292570114136,
0.02412959188222885,
0.07948243618011475,
0.027997558936476707,
-0.044953200966119766,
-0.25250717997550964,
0.02982354536652565,
0.09052589535713196,
-0.015610013157129288,
-0.3069535791873932,
-0.07393258064985275,
-0.011071725748479366,
-0.04628496244549751,
-0.09845677763223648,
0.08551959693431854,
0.1397314816713333,
0.05784335732460022,
-0.06320219486951828,
-0.12059269845485687,
-0.07806798070669174,
0.16363604366779327,
-0.1351093053817749,
-0.10535386204719543
] |
null | null | transformers |
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed to the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
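Pending author documentation, the snippet below is only a minimal, hedged sketch inferred from this repository's id and tags (`falcon`, `text-generation`, `custom_code`, `4-bit`); the prompt and generation settings are illustrative.

```python
# Minimal sketch based on this repo's metadata only; not author-provided usage.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "yeniceriSGK/falcon-1b-pibrain-v2-1"  # repo id from the card metadata

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    trust_remote_code=True,  # the repo carries custom modeling code (custom_code tag)
    device_map="auto",       # requires `accelerate`; drop for plain CPU loading
)

inputs = tokenizer("Hello, my name is", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```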
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed]
| {"library_name": "transformers", "tags": []} | text-generation | yeniceriSGK/falcon-1b-pibrain-v2-1 | [
"transformers",
"safetensors",
"falcon",
"text-generation",
"custom_code",
"arxiv:1910.09700",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"4-bit",
"region:us"
] | 2024-02-13T09:58:51+00:00 | [
"1910.09700"
] | [] | TAGS
#transformers #safetensors #falcon #text-generation #custom_code #arxiv-1910.09700 #autotrain_compatible #endpoints_compatible #text-generation-inference #4-bit #region-us
|
# Model Card for Model ID
## Model Details
### Model Description
This is the model card of a transformers model that has been pushed to the Hub. This model card has been automatically generated.
- Developed by:
- Funded by [optional]:
- Shared by [optional]:
- Model type:
- Language(s) (NLP):
- License:
- Finetuned from model [optional]:
### Model Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Downstream Use [optional]
### Out-of-Scope Use
## Bias, Risks, and Limitations
### Recommendations
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
## Training Details
### Training Data
### Training Procedure
#### Preprocessing [optional]
#### Training Hyperparameters
- Training regime:
#### Speeds, Sizes, Times [optional]
## Evaluation
### Testing Data, Factors & Metrics
#### Testing Data
#### Factors
#### Metrics
### Results
#### Summary
## Model Examination [optional]
## Environmental Impact
Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).
- Hardware Type:
- Hours used:
- Cloud Provider:
- Compute Region:
- Carbon Emitted:
## Technical Specifications [optional]
### Model Architecture and Objective
### Compute Infrastructure
#### Hardware
#### Software
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Model Card Authors [optional]
## Model Card Contact
| [
"# Model Card for Model ID",
"## Model Details",
"### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:",
"### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Downstream Use [optional]",
"### Out-of-Scope Use",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.",
"## How to Get Started with the Model\n\nUse the code below to get started with the model.",
"## Training Details",
"### Training Data",
"### Training Procedure",
"#### Preprocessing [optional]",
"#### Training Hyperparameters\n\n- Training regime:",
"#### Speeds, Sizes, Times [optional]",
"## Evaluation",
"### Testing Data, Factors & Metrics",
"#### Testing Data",
"#### Factors",
"#### Metrics",
"### Results",
"#### Summary",
"## Model Examination [optional]",
"## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:",
"## Technical Specifications [optional]",
"### Model Architecture and Objective",
"### Compute Infrastructure",
"#### Hardware",
"#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Model Card Authors [optional]",
"## Model Card Contact"
] | [
"TAGS\n#transformers #safetensors #falcon #text-generation #custom_code #arxiv-1910.09700 #autotrain_compatible #endpoints_compatible #text-generation-inference #4-bit #region-us \n",
"# Model Card for Model ID",
"## Model Details",
"### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:",
"### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Downstream Use [optional]",
"### Out-of-Scope Use",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.",
"## How to Get Started with the Model\n\nUse the code below to get started with the model.",
"## Training Details",
"### Training Data",
"### Training Procedure",
"#### Preprocessing [optional]",
"#### Training Hyperparameters\n\n- Training regime:",
"#### Speeds, Sizes, Times [optional]",
"## Evaluation",
"### Testing Data, Factors & Metrics",
"#### Testing Data",
"#### Factors",
"#### Metrics",
"### Results",
"#### Summary",
"## Model Examination [optional]",
"## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:",
"## Technical Specifications [optional]",
"### Model Architecture and Objective",
"### Compute Infrastructure",
"#### Hardware",
"#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Model Card Authors [optional]",
"## Model Card Contact"
] | [
64,
6,
3,
82,
28,
3,
4,
9,
9,
10,
42,
20,
3,
4,
5,
9,
11,
13,
3,
12,
5,
4,
5,
3,
4,
9,
53,
9,
8,
6,
3,
14,
8,
7,
9,
4
] | [
"passage: TAGS\n#transformers #safetensors #falcon #text-generation #custom_code #arxiv-1910.09700 #autotrain_compatible #endpoints_compatible #text-generation-inference #4-bit #region-us \n# Model Card for Model ID## Model Details### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Downstream Use [optional]### Out-of-Scope Use## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.## How to Get Started with the Model\n\nUse the code below to get started with the model.## Training Details### Training Data### Training Procedure#### Preprocessing [optional]#### Training Hyperparameters\n\n- Training regime:#### Speeds, Sizes, Times [optional]## Evaluation### Testing Data, Factors & Metrics#### Testing Data#### Factors#### Metrics### Results#### Summary## Model Examination [optional]## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:## Technical Specifications [optional]### Model Architecture and Objective### Compute Infrastructure#### Hardware#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Model Card Authors [optional]## Model Card Contact"
] | [
-0.04453665390610695,
0.1909387707710266,
-0.005555706564337015,
0.02009698562324047,
0.09928517788648605,
0.002202220493927598,
0.05591393634676933,
0.11277725547552109,
-0.057992883026599884,
0.12979470193386078,
0.042654167860746384,
0.10920123755931854,
0.12005752325057983,
0.14929279685020447,
-0.006252828054130077,
-0.2134041041135788,
0.04953882843255997,
-0.10335224866867065,
-0.01086056511849165,
0.12334966659545898,
0.1476563662290573,
-0.09718861430883408,
0.07114724069833755,
-0.036244992166757584,
-0.024860747158527374,
-0.03785305470228195,
-0.0595487505197525,
-0.039634235203266144,
0.039995789527893066,
0.06042226403951645,
0.06769220530986786,
-0.0009836091194301844,
0.07885425537824631,
-0.2796455919742584,
0.019178325310349464,
0.07151005417108536,
-0.006919655948877335,
0.06624610722064972,
0.07163652777671814,
-0.06252288073301315,
0.10966669768095016,
-0.049444738775491714,
0.13246740400791168,
0.084211565554142,
-0.0938078835606575,
-0.1822056919336319,
-0.09178999811410904,
0.10561470687389374,
0.17416922748088837,
0.0484575480222702,
-0.025951452553272247,
0.09858915209770203,
-0.07840164005756378,
0.022246353328227997,
0.04883703216910362,
-0.09256608784198761,
-0.05855829641222954,
0.06611574441194534,
0.09133755415678024,
0.05204923823475838,
-0.12691274285316467,
-0.03522179275751114,
0.009058474563062191,
0.01721723936498165,
0.07700812071561813,
0.019413044676184654,
0.1418759971857071,
0.032454222440719604,
-0.13212166726589203,
-0.059588126838207245,
0.10731249302625656,
0.039681512862443924,
-0.046043410897254944,
-0.23570485413074493,
-0.03336421772837639,
-0.02272348292171955,
-0.0341198705136776,
-0.04095032811164856,
0.04227769747376442,
-0.0006343789864331484,
0.09008241444826126,
-0.009444409981369972,
-0.07376758754253387,
-0.037807099521160126,
0.06939547508955002,
0.06980503350496292,
0.03010455332696438,
-0.01593116484582424,
0.019894573837518692,
0.10873140394687653,
0.08636973053216934,
-0.1162659227848053,
-0.05815749242901802,
-0.06317965686321259,
-0.06965572386980057,
-0.03770272806286812,
0.033693619072437286,
0.012340168468654156,
0.07487021386623383,
0.26779240369796753,
0.019667603075504303,
0.05574297159910202,
0.023485545068979263,
0.007517560850828886,
0.04799899831414223,
0.10866300761699677,
-0.061261147260665894,
-0.11703209578990936,
-0.014219003729522228,
0.08728978037834167,
0.023079033941030502,
-0.038951773196458817,
-0.04325496777892113,
0.06728525459766388,
0.04300529137253761,
0.11015605926513672,
0.10105620324611664,
0.021074479445815086,
-0.0715402141213417,
-0.0557621605694294,
0.21137705445289612,
-0.15680187940597534,
0.03674543276429176,
0.044267017394304276,
-0.0316292904317379,
-0.02875566855072975,
0.010211686603724957,
0.02587735652923584,
-0.036022067070007324,
0.08833824843168259,
-0.0525839701294899,
-0.049408841878175735,
-0.10717582702636719,
-0.030840111896395683,
0.04087112098932266,
0.008656567893922329,
-0.030996592715382576,
-0.03811638429760933,
-0.07353513687849045,
-0.08327128738164902,
0.08451396971940994,
-0.06826525181531906,
-0.056710876524448395,
-0.020656531676650047,
-0.0837526023387909,
0.021076807752251625,
0.022399982437491417,
0.07524201273918152,
-0.02734057791531086,
0.05734923854470253,
-0.046132802963256836,
0.049341313540935516,
0.09273886680603027,
0.03224007412791252,
-0.06011774390935898,
0.06155902519822121,
-0.22752223908901215,
0.0827341079711914,
-0.0701761245727539,
0.05535000190138817,
-0.1552843600511551,
-0.021174373105168343,
0.03665536269545555,
0.0028983328957110643,
-0.004714240785688162,
0.13058264553546906,
-0.20505709946155548,
-0.01968754455447197,
0.16734007000923157,
-0.09772137552499771,
-0.06466186791658401,
0.05246074125170708,
-0.04568861424922943,
0.09151032567024231,
0.03263774514198303,
0.003646659664809704,
0.06071169301867485,
-0.10888087004423141,
-0.012166064232587814,
-0.05273926630616188,
-0.025421906262636185,
0.1370338350534439,
0.08078528195619583,
-0.08123568445444107,
0.055560965090990067,
0.02434842474758625,
-0.030269447714090347,
-0.06682562083005905,
-0.016069523990154266,
-0.09870079904794693,
0.013424086384475231,
-0.07048244774341583,
0.00775548443198204,
-0.020568309351801872,
-0.09736152738332748,
-0.027479711920022964,
-0.16466915607452393,
-0.03865848854184151,
0.08033930510282516,
-0.004795180168002844,
-0.013016881421208382,
-0.10705817490816116,
0.03258499503135681,
0.024063624441623688,
0.0026895657647401094,
-0.13068236410617828,
-0.03602854534983635,
0.037118151783943176,
-0.15183864533901215,
0.03041362762451172,
-0.07635925710201263,
0.051014017313718796,
0.01741175539791584,
-0.026979008689522743,
-0.021592440083622932,
0.016442833468317986,
0.007207158952951431,
-0.021860433742403984,
-0.22696970403194427,
-0.029399745166301727,
-0.029851561412215233,
0.15872712433338165,
-0.20569345355033875,
0.03521941974759102,
0.08486489951610565,
0.15441760420799255,
0.0037020102608948946,
-0.05495212972164154,
0.021063216030597687,
-0.06899091601371765,
-0.025963960215449333,
-0.0567413829267025,
-0.001321499585174024,
-0.01364624872803688,
-0.039724286645650864,
0.02455776184797287,
-0.17795418202877045,
-0.04123859107494354,
0.09635645896196365,
0.05286227911710739,
-0.12172684818506241,
-0.017793001607060432,
-0.03796517476439476,
-0.05116693675518036,
-0.04316309839487076,
-0.0652548149228096,
0.10152675211429596,
0.06190277636051178,
0.04221278801560402,
-0.06449716538190842,
-0.07975452393293381,
-0.002510865917429328,
-0.015534510836005211,
-0.0208453256636858,
0.09268542379140854,
0.07696153223514557,
-0.12062439322471619,
0.09454844146966934,
0.08124031871557236,
0.07064208388328552,
0.08388709276914597,
-0.019784949719905853,
-0.07501683384180069,
-0.03474224731326103,
0.04096398875117302,
0.02070939913392067,
0.127272829413414,
-0.04324626177549362,
0.044735442847013474,
0.042813338339328766,
-0.03281288221478462,
0.019368071109056473,
-0.0789046362042427,
0.03477751463651657,
0.02438444457948208,
-0.023067085072398186,
0.04966627433896065,
-0.03796491399407387,
0.017777003347873688,
0.086410291492939,
0.055618125945329666,
0.034197285771369934,
0.015752006322145462,
-0.05180222541093826,
-0.1116505116224289,
0.1603255420923233,
-0.11804656684398651,
-0.21760660409927368,
-0.133219912648201,
0.02440512925386429,
0.02340768836438656,
-0.013408999890089035,
0.0042162188328802586,
-0.05273256078362465,
-0.10836806893348694,
-0.09519179910421371,
0.0020809932611882687,
0.05688413232564926,
-0.08434031158685684,
-0.060320284217596054,
0.03948112577199936,
0.04618431627750397,
-0.1438177227973938,
0.02078346349298954,
0.04113969951868057,
-0.09475471079349518,
-0.014071599580347538,
0.08162059634923935,
0.08266530930995941,
0.1828899383544922,
0.018958356231451035,
-0.018643934279680252,
0.03172494098544121,
0.23090940713882446,
-0.13629217445850372,
0.11638624221086502,
0.1350768655538559,
-0.08650082349777222,
0.08394848555326462,
0.2100806087255478,
0.043485693633556366,
-0.09806657582521439,
0.026381801813840866,
0.033306267112493515,
-0.026859089732170105,
-0.23307451605796814,
-0.06782349944114685,
-0.0019858244340866804,
-0.06354628503322601,
0.07740631699562073,
0.09866151213645935,
0.0780390128493309,
0.021794214844703674,
-0.09731794893741608,
-0.08760849386453629,
0.05686582997441292,
0.10732293128967285,
0.0023640308063477278,
-0.002918716287240386,
0.08658154308795929,
-0.036342114210128784,
0.01677820459008217,
0.0912386104464531,
0.013294782489538193,
0.14498746395111084,
0.04715769737958908,
0.17620420455932617,
0.08277589827775955,
0.07846562564373016,
0.003410395933315158,
0.011320756748318672,
0.008235873654484749,
0.044780924916267395,
-0.006131312344223261,
-0.0848819762468338,
-0.02776206284761429,
0.10752521455287933,
0.06370342522859573,
0.016796445474028587,
0.017344636842608452,
-0.05728970840573311,
0.08524413406848907,
0.18406425416469574,
-0.002633927622810006,
-0.18432281911373138,
-0.05770765617489815,
0.07218682020902634,
-0.09839048981666565,
-0.10539096593856812,
-0.00723700225353241,
0.02178032509982586,
-0.16946057975292206,
0.03620018810033798,
-0.02362026832997799,
0.10919876396656036,
-0.13090506196022034,
-0.01869846321642399,
0.08002742379903793,
0.07152026891708374,
-0.0031599143985658884,
0.056512217968702316,
-0.1868113875389099,
0.09503406286239624,
0.013970657251775265,
0.0697934478521347,
-0.09368602931499481,
0.09287985414266586,
-0.008849430829286575,
-0.026499394327402115,
0.14446701109409332,
-0.004401217680424452,
-0.06620814651250839,
-0.07032917439937592,
-0.09607784450054169,
-0.007364852819591761,
0.12227921187877655,
-0.13433295488357544,
0.08828657120466232,
-0.02958436869084835,
-0.034628983587026596,
-0.014183561317622662,
-0.0855000838637352,
-0.10762646794319153,
-0.17592936754226685,
0.05824372172355652,
-0.1235247254371643,
0.036857567727565765,
-0.10702664405107498,
-0.021083727478981018,
-0.029708638787269592,
0.1712174266576767,
-0.24061107635498047,
-0.07881004363298416,
-0.14433011412620544,
-0.09914591908454895,
0.12869399785995483,
-0.04734301567077637,
0.09356654435396194,
-0.02294059656560421,
0.1584029644727707,
0.016575131565332413,
-0.021096564829349518,
0.08110643178224564,
-0.08579336851835251,
-0.2013741284608841,
-0.06729526072740555,
0.1667507141828537,
0.11356096714735031,
0.028554249554872513,
0.00021348004520405084,
0.03888364136219025,
-0.02329942025244236,
-0.12095320969820023,
0.01913701742887497,
0.1473778784275055,
0.06899469345808029,
0.008328313007950783,
-0.015725411474704742,
-0.10770174860954285,
-0.07449213415384293,
-0.02470727078616619,
0.02393377758562565,
0.1620502769947052,
-0.07433250546455383,
0.1653832495212555,
0.14307019114494324,
-0.05678045377135277,
-0.20564031600952148,
-0.004954501520842314,
0.026278968900442123,
-0.011154405772686005,
0.008703439496457577,
-0.1850665956735611,
0.08444999903440475,
0.0019649346359074116,
-0.05588759109377861,
0.10494216531515121,
-0.16007831692695618,
-0.13503114879131317,
0.081987164914608,
0.05090386047959328,
-0.190145805478096,
-0.1397211104631424,
-0.0986657589673996,
-0.0438401997089386,
-0.1640320122241974,
0.09016454964876175,
0.011982852593064308,
0.013060753233730793,
0.03215835615992546,
0.00997347291558981,
0.024044077843427658,
-0.050926510244607925,
0.17516721785068512,
-0.012887895107269287,
0.026755787432193756,
-0.09369775652885437,
-0.08567874133586884,
0.014542883262038231,
-0.04831458628177643,
0.07301216572523117,
-0.027258891612291336,
0.01678038388490677,
-0.10315357893705368,
-0.03471795842051506,
-0.04467841982841492,
0.01759176142513752,
-0.09924584627151489,
-0.08678551763296127,
-0.05264589935541153,
0.08543965965509415,
0.09601205587387085,
-0.019729604944586754,
-0.03255130350589752,
-0.07696829736232758,
0.056086186319589615,
0.22337792813777924,
0.17939896881580353,
0.04744863137602806,
-0.07171594351530075,
-0.002723131561651826,
-0.015915779396891594,
0.04550015926361084,
-0.19737431406974792,
0.05970478802919388,
0.05623520165681839,
0.02068583481013775,
0.1036977469921112,
-0.02265029028058052,
-0.15529058873653412,
-0.07423704117536545,
0.06274627894163132,
-0.06600723415613174,
-0.20556195080280304,
0.009404796175658703,
0.04761991277337074,
-0.1723693460226059,
-0.0333247110247612,
0.0480525940656662,
-0.007163160480558872,
-0.03616609424352646,
0.01816665008664131,
0.09348054975271225,
0.0033172310795634985,
0.08192651718854904,
0.07648915797472,
0.08316144347190857,
-0.10130025446414948,
0.08446116000413895,
0.09844685345888138,
-0.060192350298166275,
0.02733948454260826,
0.09193270653486252,
-0.05831551551818848,
-0.036187272518873215,
0.03698268160223961,
0.08099217712879181,
0.02633502706885338,
-0.04658699780702591,
0.007984031923115253,
-0.09199433028697968,
0.06505311280488968,
0.11281085014343262,
0.029439743608236313,
0.022750042378902435,
0.04665811359882355,
0.04562155902385712,
-0.073212169110775,
0.12069565057754517,
0.028477732092142105,
0.015673451125621796,
-0.04220086708664894,
-0.03826799616217613,
0.009484526701271534,
-0.028716184198856354,
-0.005086944438517094,
-0.01901283487677574,
-0.08357292413711548,
-0.01591898314654827,
-0.12416110187768936,
0.0014331045094877481,
-0.060666002333164215,
0.014897594228386879,
0.02503013238310814,
-0.03190592676401138,
0.008168932050466537,
0.008999391458928585,
-0.06911855190992355,
-0.0667494684457779,
-0.01107252947986126,
0.09214410930871964,
-0.17018061876296997,
0.02913808822631836,
0.08447875827550888,
-0.11064672470092773,
0.10246492922306061,
0.006543856579810381,
-0.009631764143705368,
0.01821054145693779,
-0.1580088883638382,
0.041540052741765976,
-0.03687871992588043,
0.008244183845818043,
0.018192864954471588,
-0.19031718373298645,
0.0022233319468796253,
-0.032371241599321365,
-0.0713675394654274,
-0.010592167265713215,
-0.012611191719770432,
-0.1175762265920639,
0.10545824468135834,
0.003398436587303877,
-0.07779555022716522,
-0.029375309124588966,
0.031890060752630234,
0.0767318531870842,
-0.029489094391465187,
0.14903375506401062,
-0.010945397429168224,
0.06344805657863617,
-0.16184058785438538,
-0.012112514115869999,
-0.008795775473117828,
0.009387471713125706,
-0.05059373006224632,
-0.0023798528127372265,
0.051249098032712936,
-0.014759009703993797,
0.18000589311122894,
-0.03264591097831726,
0.014091522432863712,
0.06283827871084213,
0.04961233586072922,
-0.02504689060151577,
0.09745560586452484,
0.04935402050614357,
0.016042273491621017,
0.010107748210430145,
0.010634302161633968,
-0.04519105702638626,
-0.03718317672610283,
-0.1896188110113144,
0.06667887419462204,
0.19332972168922424,
0.09775267541408539,
-0.021238725632429123,
0.07185591757297516,
-0.10060291737318039,
-0.09193979948759079,
0.1432638168334961,
-0.039614953100681305,
-0.003152070799842477,
-0.07366806268692017,
0.12533800303936005,
0.14702938497066498,
-0.18035173416137695,
0.0680631548166275,
-0.06366228312253952,
-0.04062335565686226,
-0.10973858088254929,
-0.19396264851093292,
-0.06291978806257248,
-0.043153245002031326,
-0.01706063002347946,
-0.046428751200437546,
0.06866542994976044,
0.07235037535429001,
-0.005334464833140373,
-0.008923451416194439,
0.07358722388744354,
-0.03242181986570358,
-0.0023431198205798864,
0.027715085074305534,
0.060844551771879196,
0.009789250791072845,
-0.03430457413196564,
0.013830730691552162,
-0.009553893469274044,
0.055388353765010834,
0.08014345169067383,
0.05010252445936203,
-0.018158378079533577,
0.021721720695495605,
-0.03957204148173332,
-0.10294679552316666,
0.05101633816957474,
-0.029352717101573944,
-0.07048176229000092,
0.1506207436323166,
0.019765805453062057,
0.008169750683009624,
-0.00927541870623827,
0.23298712074756622,
-0.061777859926223755,
-0.09948190301656723,
-0.14721636474132538,
0.06843370944261551,
-0.044959235936403275,
0.050324033945798874,
0.03928495943546295,
-0.112355075776577,
0.026658296585083008,
0.15314003825187683,
0.1527816653251648,
-0.03738558664917946,
0.02435494400560856,
0.032263368368148804,
0.007983527146279812,
-0.01703016646206379,
0.03988349065184593,
0.05664530023932457,
0.1520930528640747,
-0.046840306371450424,
0.07457534968852997,
0.004483088385313749,
-0.08482256531715393,
-0.03528733178973198,
0.12070002406835556,
-0.020297368988394737,
0.010244566015899181,
-0.059731606394052505,
0.11496695131063461,
-0.0638689175248146,
-0.2224048227071762,
0.04006026312708855,
-0.06709709763526917,
-0.12944550812244415,
-0.021726736798882484,
0.06823421269655228,
-0.009106227196753025,
0.018608465790748596,
0.07853177189826965,
-0.0701594352722168,
0.1892324835062027,
0.03615923225879669,
-0.05911361053586006,
-0.04938262328505516,
0.06969781219959259,
-0.0822150930762291,
0.29828280210494995,
0.017663899809122086,
0.03969915956258774,
0.10929078608751297,
-0.014072122052311897,
-0.13797621428966522,
0.034022893756628036,
0.09724123775959015,
-0.094990573823452,
0.05288228765130043,
0.17340928316116333,
0.0039305174723267555,
0.1284504532814026,
0.07164544612169266,
-0.08326496928930283,
0.047714803367853165,
-0.068685382604599,
-0.0693238228559494,
-0.10318871587514877,
0.09887052327394485,
-0.0937386229634285,
0.142961323261261,
0.1215214654803276,
-0.055416040122509,
0.01026212889701128,
-0.03587771952152252,
0.048618871718645096,
-0.005160714965313673,
0.12107990682125092,
0.014536378905177116,
-0.18938297033309937,
0.030264662578701973,
-0.03206425532698631,
0.09826018661260605,
-0.1596945971250534,
-0.08264226466417313,
0.04596994072198868,
0.010617619380354881,
-0.06794066727161407,
0.12218311429023743,
0.060725919902324677,
0.02809346467256546,
-0.05020172521471977,
-0.025824300944805145,
-0.012600240297615528,
0.14013776183128357,
-0.10047949850559235,
-0.005263396073132753
] |
null | null | ml-agents |
# **ppo** Agent playing **Pyramids**
This is a trained model of a **ppo** agent playing **Pyramids**
using the [Unity ML-Agents Library](https://github.com/Unity-Technologies/ml-agents).
## Usage (with ML-Agents)
The Documentation: https://unity-technologies.github.io/ml-agents/ML-Agents-Toolkit-Documentation/
We wrote a complete tutorial to learn to train your first agent using ML-Agents and publish it to the Hub:
- A *short tutorial* where you teach Huggy the Dog 🐶 to fetch the stick and then play with him directly in your
browser: https://huggingface.co/learn/deep-rl-course/unitbonus1/introduction
- A *longer tutorial* to understand how ML-Agents works:
https://huggingface.co/learn/deep-rl-course/unit5/introduction
### Resume the training
```bash
mlagents-learn <your_configuration_file_path.yaml> --run-id=<run_id> --resume
```
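For example, assuming the default Pyramids config shipped with ML-Agents and this run's id (both are assumptions; substitute your own paths):

```bash
# Illustrative values only: the config path and run id are assumptions.
mlagents-learn ./config/ppo/PyramidsRND.yaml --run-id=Pyramids --resume
```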
### Watch your Agent play
You can watch your agent **playing directly in your browser**
1. If the environment is part of ML-Agents official environments, go to https://huggingface.co/unity
2. Find your model_id: haihuynh/ppo-Pyramids
3. Select your *.nn /*.onnx file
4. Click on Watch the agent play 👀
| {"library_name": "ml-agents", "tags": ["Pyramids", "deep-reinforcement-learning", "reinforcement-learning", "ML-Agents-Pyramids"]} | reinforcement-learning | haihuynh/ppo-Pyramids | [
"ml-agents",
"tensorboard",
"onnx",
"Pyramids",
"deep-reinforcement-learning",
"reinforcement-learning",
"ML-Agents-Pyramids",
"region:us"
] | 2024-02-13T09:59:16+00:00 | [] | [] | TAGS
#ml-agents #tensorboard #onnx #Pyramids #deep-reinforcement-learning #reinforcement-learning #ML-Agents-Pyramids #region-us
|
# ppo Agent playing Pyramids
This is a trained model of a ppo agent playing Pyramids
using the Unity ML-Agents Library.
## Usage (with ML-Agents)
The Documentation: URL
We wrote a complete tutorial to learn to train your first agent using ML-Agents and publish it to the Hub:
- A *short tutorial* where you teach Huggy the Dog to fetch the stick and then play with him directly in your
browser: URL
- A *longer tutorial* to understand how ML-Agents works:
URL
### Resume the training
### Watch your Agent play
You can watch your agent playing directly in your browser
1. If the environment is part of ML-Agents official environments, go to URL
2. Find your model_id: haihuynh/ppo-Pyramids
3. Select your *.nn /*.onnx file
4. Click on Watch the agent play
| [
"# ppo Agent playing Pyramids\n This is a trained model of a ppo agent playing Pyramids\n using the Unity ML-Agents Library.\n\n ## Usage (with ML-Agents)\n The Documentation: URL\n\n We wrote a complete tutorial to learn to train your first agent using ML-Agents and publish it to the Hub:\n - A *short tutorial* where you teach Huggy the Dog to fetch the stick and then play with him directly in your\n browser: URL\n - A *longer tutorial* to understand how works ML-Agents:\n URL\n\n ### Resume the training\n \n\n ### Watch your Agent play\n You can watch your agent playing directly in your browser\n\n 1. If the environment is part of ML-Agents official environments, go to URL\n 2. Step 1: Find your model_id: haihuynh/ppo-Pyramids\n 3. Step 2: Select your *.nn /*.onnx file\n 4. Click on Watch the agent play"
] | [
"TAGS\n#ml-agents #tensorboard #onnx #Pyramids #deep-reinforcement-learning #reinforcement-learning #ML-Agents-Pyramids #region-us \n",
"# ppo Agent playing Pyramids\n This is a trained model of a ppo agent playing Pyramids\n using the Unity ML-Agents Library.\n\n ## Usage (with ML-Agents)\n The Documentation: URL\n\n We wrote a complete tutorial to learn to train your first agent using ML-Agents and publish it to the Hub:\n - A *short tutorial* where you teach Huggy the Dog to fetch the stick and then play with him directly in your\n browser: URL\n - A *longer tutorial* to understand how works ML-Agents:\n URL\n\n ### Resume the training\n \n\n ### Watch your Agent play\n You can watch your agent playing directly in your browser\n\n 1. If the environment is part of ML-Agents official environments, go to URL\n 2. Step 1: Find your model_id: haihuynh/ppo-Pyramids\n 3. Step 2: Select your *.nn /*.onnx file\n 4. Click on Watch the agent play"
] | [
48,
204
] | [
"passage: TAGS\n#ml-agents #tensorboard #onnx #Pyramids #deep-reinforcement-learning #reinforcement-learning #ML-Agents-Pyramids #region-us \n# ppo Agent playing Pyramids\n This is a trained model of a ppo agent playing Pyramids\n using the Unity ML-Agents Library.\n\n ## Usage (with ML-Agents)\n The Documentation: URL\n\n We wrote a complete tutorial to learn to train your first agent using ML-Agents and publish it to the Hub:\n - A *short tutorial* where you teach Huggy the Dog to fetch the stick and then play with him directly in your\n browser: URL\n - A *longer tutorial* to understand how works ML-Agents:\n URL\n\n ### Resume the training\n \n\n ### Watch your Agent play\n You can watch your agent playing directly in your browser\n\n 1. If the environment is part of ML-Agents official environments, go to URL\n 2. Step 1: Find your model_id: haihuynh/ppo-Pyramids\n 3. Step 2: Select your *.nn /*.onnx file\n 4. Click on Watch the agent play"
] | [
-0.0026798085309565067,
0.04752451553940773,
-0.003513073083013296,
0.05116667598485947,
0.13749760389328003,
-0.022683659568428993,
0.16817708313465118,
0.1297869235277176,
0.2123185396194458,
0.09020970016717911,
0.02937183529138565,
0.08151381462812424,
0.07840315252542496,
0.1198790967464447,
0.08310234546661377,
-0.18933884799480438,
-0.024987930431962013,
-0.04767997935414314,
0.06231126934289932,
0.0897325724363327,
0.051479317247867584,
-0.07278711348772049,
0.07421741634607315,
0.05092174559831619,
-0.0006967378430999815,
0.0007246213499456644,
-0.1052217110991478,
-0.033558864146471024,
0.06060267239809036,
-0.022788746282458305,
-0.0033220541663467884,
-0.04251229763031006,
0.09094464778900146,
-0.1485559344291687,
0.03001270443201065,
0.08822973072528839,
-0.005724860820919275,
0.004117142409086227,
0.11529108881950378,
0.028299206867814064,
0.10640889406204224,
-0.08958273380994797,
0.047598183155059814,
0.045168835669755936,
-0.05784950777888298,
0.020606499165296555,
-0.11715733259916306,
0.059294961392879486,
0.2205743044614792,
0.13326171040534973,
0.0032612918876111507,
0.13069695234298706,
-0.027804313227534294,
0.03818661347031593,
0.175567626953125,
-0.27996546030044556,
-0.05987397953867912,
0.11090116947889328,
-0.03074038028717041,
0.016410719603300095,
-0.009416332468390465,
0.04746788367629051,
-0.04291904345154762,
0.02792041376233101,
0.013048428110778332,
-0.01859673298895359,
0.17441949248313904,
-0.014939257875084877,
-0.0874919667840004,
-0.07358492910861969,
0.07917510718107224,
0.05036163330078125,
-0.011798316612839699,
-0.18524086475372314,
-0.012991877272725105,
0.12633559107780457,
-0.018833482638001442,
0.030531007796525955,
0.053613439202308655,
-0.002868782728910446,
0.01296838466078043,
-0.12159261852502823,
-0.04759516939520836,
-0.05706661194562912,
0.04282103851437569,
0.08597579598426819,
0.032900575548410416,
-0.029535438865423203,
0.0624844916164875,
0.0551019161939621,
0.0769287496805191,
-0.0564480796456337,
-0.025094371289014816,
-0.00081136345397681,
-0.12895755469799042,
-0.039120033383369446,
0.02831174246966839,
-0.05577687919139862,
0.042523130774497986,
0.0250716470181942,
0.04174003750085831,
0.03881998360157013,
0.011898599565029144,
0.052466049790382385,
0.017263658344745636,
0.11787186563014984,
-0.014977890998125076,
0.06271697580814362,
0.037524379789829254,
0.06357093155384064,
0.03965633362531662,
-0.05898148939013481,
-0.08616362512111664,
0.09478990733623505,
-0.06321505457162857,
0.09491197019815445,
0.10600374639034271,
0.002872123848646879,
-0.02122838795185089,
-0.0638442412018776,
0.0005844124825671315,
-0.144287109375,
0.055146824568510056,
0.04996945708990097,
-0.019669393077492714,
-0.05785634368658066,
-0.03485124558210373,
0.012487920932471752,
-0.09857187420129776,
-0.0014598832931369543,
-0.022162163630127907,
0.06945719569921494,
-0.01103933434933424,
-0.00400807149708271,
0.045908112078905106,
-0.03517303615808487,
-0.04453153535723686,
-0.17109082639217377,
-0.19429855048656464,
-0.07064328342676163,
0.03590710088610649,
-0.06966882199048996,
-0.08293953537940979,
-0.0309434924274683,
0.048483554273843765,
-0.11106440424919128,
0.01628830097615719,
-0.057463210076093674,
-0.05854526162147522,
-0.013089507818222046,
-0.047045521438121796,
0.04922608658671379,
0.17956840991973877,
0.03318138048052788,
-0.033833928406238556,
0.0665244460105896,
-0.19988380372524261,
0.1376478523015976,
-0.10157348215579987,
0.20094454288482666,
-0.0904494896531105,
0.03232409432530403,
0.08559388667345047,
-0.0032347380183637142,
0.01795581355690956,
0.1616467982530594,
-0.09619814157485962,
-0.07819627225399017,
0.10337088257074356,
-0.03923622518777847,
-0.1637047529220581,
0.0575607605278492,
0.01817069761455059,
0.08775198459625244,
0.06728322058916092,
0.21984463930130005,
0.09819791465997696,
-0.2174210250377655,
0.05352983623743057,
-0.0005791047005914152,
-0.08384296298027039,
0.00551306176930666,
0.10754452645778656,
-0.0959664136171341,
-0.026211977005004883,
-0.02585889771580696,
-0.15133988857269287,
0.07019991427659988,
-0.018491355702280998,
-0.0685085877776146,
0.04764482378959656,
-0.057758625596761703,
-0.06276898831129074,
0.027917159721255302,
0.060734208673238754,
0.003954945597797632,
-0.041822586208581924,
-0.058704495429992676,
0.08858759701251984,
-0.03592269867658615,
0.04215092211961746,
-0.05761447176337242,
0.15770003199577332,
-0.002292951103299856,
0.041363779455423355,
-0.11699450761079788,
-0.09069488942623138,
0.02217544987797737,
0.010057262144982815,
0.08562176674604416,
-0.13794675469398499,
0.07103336602449417,
0.08480764180421829,
0.025864839553833008,
-0.06228409707546234,
-0.06537004560232162,
0.015533152967691422,
-0.08292359858751297,
-0.09197549521923065,
-0.05784248933196068,
-0.03612124174833298,
0.05942126363515854,
-0.06616488099098206,
0.051510266959667206,
-0.13318412005901337,
0.0990091860294342,
-0.013472193852066994,
-0.042936258018016815,
0.05040080100297928,
0.024687299504876137,
0.026040270924568176,
-0.07258118689060211,
0.09346811473369598,
0.010893327184021473,
-0.06539218127727509,
0.007064377889037132,
-0.001527420012280345,
-0.0818934366106987,
0.09017142653465271,
-0.025371361523866653,
-0.001079596346244216,
-0.01017187163233757,
-0.04716259986162186,
0.007693542167544365,
-0.08341380953788757,
-0.0019171996973454952,
0.2002583146095276,
0.10034259408712387,
0.11048135161399841,
-0.07246042788028717,
-0.04991552233695984,
-0.027425870299339294,
-0.046062082052230835,
-0.04385552927851677,
0.1411430537700653,
0.05246664583683014,
-0.041778564453125,
0.056676384061574936,
0.06572476774454117,
0.06165703013539314,
0.07296308875083923,
-0.021682897582650185,
-0.12175697088241577,
0.0036670267581939697,
0.0785418227314949,
0.0496041476726532,
0.02638302929699421,
-0.008234461769461632,
-0.031398918479681015,
0.009607081301510334,
-0.04357731714844704,
-0.009861580096185207,
-0.11804559081792831,
-0.057130057364702225,
0.02320987358689308,
-0.017290446907281876,
0.04388905689120293,
-0.0485478900372982,
-0.022216375917196274,
0.05556029453873634,
0.07591874897480011,
0.013305353000760078,
-0.014647376723587513,
-0.06011437624692917,
-0.10866334289312363,
0.08473675698041916,
-0.08217355608940125,
-0.29551565647125244,
-0.07073833048343658,
-0.06824914366006851,
-0.07467576116323471,
0.024288833141326904,
0.0377681739628315,
-0.16693998873233795,
-0.00563139608129859,
-0.09786314517259598,
-0.030039191246032715,
0.022760147228837013,
-0.05275937169790268,
0.19210587441921234,
0.09305287152528763,
0.002741052769124508,
-0.06268157064914703,
-0.020536063238978386,
-0.008917471393942833,
-0.05170179903507233,
-0.00016523757949471474,
0.03296949714422226,
0.06955595314502716,
0.09290154278278351,
0.08244391530752182,
0.07529480010271072,
-0.015903998166322708,
0.10771165788173676,
-0.06827035546302795,
-0.015312771312892437,
0.1306019425392151,
0.01815630868077278,
0.058535136282444,
0.038407765328884125,
0.04536162689328194,
-0.010122016072273254,
0.009533579461276531,
0.012213364243507385,
-0.05250595137476921,
-0.1806076467037201,
-0.11025665700435638,
-0.04552186653017998,
0.11271895468235016,
0.11555412411689758,
0.10541357100009918,
-0.05952782183885574,
-0.011590288951992989,
0.0006990450201556087,
-0.03025120124220848,
0.09555684030056,
0.10437046736478806,
-0.051949724555015564,
-0.026703670620918274,
-0.011214064434170723,
-0.0442756786942482,
0.007106151431798935,
0.04986419528722763,
0.02384888008236885,
0.11930152773857117,
0.03650831803679466,
0.057587090879678726,
0.03358922153711319,
-0.03606448695063591,
-0.046394433826208115,
0.05092768743634224,
0.023018505424261093,
0.0010858732275664806,
0.0017470314633101225,
-0.08902805298566818,
-0.040129560977220535,
0.09187203645706177,
0.12535467743873596,
-0.016129229217767715,
-0.07185389846563339,
0.08214452862739563,
0.10567867010831833,
0.1588621437549591,
0.018712444230914116,
-0.15859295427799225,
-0.04636833444237709,
0.0005955829401500523,
-0.09008465707302094,
0.0310517568141222,
0.019436074420809746,
-0.035792939364910126,
-0.18256138265132904,
0.05080913379788399,
0.011546102352440357,
0.13124175369739532,
-0.07895056158304214,
-0.007333511486649513,
0.05082986503839493,
0.0366516038775444,
-0.0021984009072184563,
0.06373760104179382,
-0.161088228225708,
0.12362641096115112,
0.00797204114496708,
0.0883723795413971,
-0.06736499071121216,
0.01337511371821165,
0.09774857014417648,
-0.041155096143484116,
0.19271710515022278,
0.03613153472542763,
0.05458740144968033,
-0.1135827973484993,
-0.18199801445007324,
-0.05735689029097557,
-0.05501469969749451,
-0.0950755700469017,
0.06858772039413452,
0.0374365858733654,
-0.04170685261487961,
-0.10670489817857742,
0.08081512898206711,
-0.055408187210559845,
-0.08328959345817566,
0.0019241804257035255,
-0.05054108798503876,
-0.018593164160847664,
-0.04807243123650551,
-0.03380483761429787,
-0.13506688177585602,
0.17290034890174866,
0.09483275562524796,
-0.07727351039648056,
-0.08685647696256638,
-0.04213530197739601,
-0.04870781674981117,
-0.04195793345570564,
-0.013624520972371101,
-0.003156906459480524,
0.07991138100624084,
-0.06335413455963135,
-0.07595110684633255,
-0.013517511077225208,
-0.11250865459442139,
-0.07978213578462601,
-0.03631607070565224,
0.22150017321109772,
0.035456448793411255,
0.06803631782531738,
0.005347684025764465,
0.032441385090351105,
-0.021556556224822998,
-0.0753130093216896,
0.15387824177742004,
0.16261430084705353,
-0.0030623418278992176,
0.09416791051626205,
-0.036249611526727676,
0.07327666878700256,
-0.12897507846355438,
-0.0053743962198495865,
0.19531993567943573,
0.25939640402793884,
-0.031526267528533936,
0.18435654044151306,
0.05571465566754341,
-0.05840984359383583,
-0.15179580450057983,
-0.07169760018587112,
0.016447698697447777,
-0.030152682214975357,
0.09518858790397644,
-0.19316178560256958,
0.05406731739640236,
0.003761130152270198,
-0.027973396703600883,
-0.03154177591204643,
-0.26747605204582214,
-0.07652780413627625,
0.06102820113301277,
0.08270300179719925,
-0.05086309090256691,
-0.10400191694498062,
-0.07357358187437057,
0.013496533036231995,
-0.13062013685703278,
0.04581877589225769,
-0.18177111446857452,
0.05382313206791878,
-0.010677430778741837,
0.04513624683022499,
0.03558477759361267,
-0.037738580256700516,
0.13052105903625488,
-0.03075185976922512,
-0.033219113945961,
-0.04614223539829254,
0.043669622391462326,
0.028920555487275124,
-0.08587051182985306,
0.05476541072130203,
0.01177974697202444,
-0.02975289523601532,
-0.20957447588443756,
-0.029169371351599693,
-0.018051650375127792,
0.04476894810795784,
-0.0070089735090732574,
-0.031170552596449852,
-0.0053877760656178,
0.08260176330804825,
0.09104318916797638,
0.046468738466501236,
0.13936395943164825,
0.01652386039495468,
0.03187886252999306,
0.060918424278497696,
0.04480908066034317,
-0.0019362298771739006,
-0.13933752477169037,
-0.06378990411758423,
-0.03715116158127785,
0.0052825529128313065,
-0.03650718182325363,
0.001100987195968628,
0.06264247745275497,
0.026952246204018593,
0.042505014687776566,
0.060407981276512146,
-0.11776180565357208,
0.008633852936327457,
0.04915237799286842,
-0.11598990112543106,
-0.17844782769680023,
-0.07160040736198425,
-0.07576884329319,
0.008861579932272434,
-0.03706909716129303,
0.03275744244456291,
-0.0321076437830925,
-0.01319168508052826,
0.051023028790950775,
0.03442930057644844,
-0.04198422655463219,
0.0365920215845108,
-0.026717498898506165,
0.026757434010505676,
-0.0751815214753151,
0.14583338797092438,
0.07794764637947083,
-0.0024368949234485626,
0.02358354814350605,
0.2247416377067566,
-0.08878642320632935,
-0.0800517201423645,
-0.07064434885978699,
0.07573605328798294,
0.10654555261135101,
-0.02080351859331131,
-0.03541311249136925,
-0.08760227262973785,
0.07804691791534424,
-0.15620900690555573,
0.022789837792515755,
-0.13112995028495789,
0.013711556792259216,
0.04622930288314819,
-0.061079662293195724,
0.09526025503873825,
-0.03362148255109787,
-0.04230785369873047,
-0.13100124895572662,
0.027006983757019043,
0.037410032004117966,
0.16384842991828918,
-0.021703151986002922,
-0.0717451199889183,
-0.14245375990867615,
0.054610855877399445,
-0.04928719624876976,
-0.01423154678195715,
-0.21433280408382416,
-0.033718183636665344,
-0.0068075698800385,
0.03872775286436081,
0.005666715558618307,
0.05415432155132294,
-0.03813658654689789,
-0.0890047550201416,
-0.027909843251109123,
0.11310778558254242,
-0.06514366716146469,
-0.021692678332328796,
0.015019487589597702,
-0.07847736775875092,
0.07734356820583344,
0.05203091353178024,
-0.017429694533348083,
-0.025917943567037582,
-0.04296402633190155,
-0.05787191912531853,
-0.022192006930708885,
0.0015703249955549836,
0.061314795166254044,
-0.17415204644203186,
0.01979769580066204,
-0.03951754793524742,
-0.11530977487564087,
0.004031541757285595,
0.11226921528577805,
-0.0572906956076622,
0.0222440455108881,
0.02110466919839382,
-0.05047304183244705,
-0.06731332838535309,
0.02953629195690155,
0.030556662008166313,
0.044317957013845444,
0.03981318697333336,
-0.07445381581783295,
0.1743098497390747,
-0.12041565030813217,
-0.031457360833883286,
0.009211870841681957,
0.040680501610040665,
0.02260168269276619,
-0.09662195295095444,
0.051841430366039276,
-0.039230797439813614,
0.11784867197275162,
0.07925394177436829,
0.009006674401462078,
0.034492507576942444,
0.028216682374477386,
0.09505807608366013,
0.004643307067453861,
0.053560931235551834,
-0.008101329207420349,
0.0017249921802431345,
0.10295145213603973,
-0.00874832458794117,
0.056637488305568695,
-0.019512873142957687,
0.12525233626365662,
0.09380434453487396,
0.12337838858366013,
0.04221704602241516,
0.07352352887392044,
-0.09054054319858551,
-0.20448298752307892,
-0.07638846337795258,
0.013694515451788902,
0.047833316028118134,
-0.05298306792974472,
0.12341302633285522,
0.1156233549118042,
-0.18392214179039001,
0.0592624694108963,
-0.013962083496153355,
0.010346253402531147,
-0.06935561448335648,
-0.13515548408031464,
0.000877954182215035,
-0.15704792737960815,
0.06302141398191452,
-0.029404373839497566,
-0.01972498744726181,
-0.022236507385969162,
-0.012695956975221634,
-0.013346206396818161,
0.08859320729970932,
-0.030634382739663124,
-0.029045552015304565,
0.06771392375230789,
-0.019908733665943146,
0.02894064225256443,
-0.04992613568902016,
-0.023390037938952446,
-0.04350554943084717,
-0.0883154571056366,
0.009503316134214401,
0.03870266675949097,
-0.021697912365198135,
0.07726450264453888,
-0.02374078333377838,
-0.080594003200531,
0.047707829624414444,
-0.025243349373340607,
-0.017255134880542755,
0.13433235883712769,
0.07744251191616058,
-0.0753524973988533,
-0.025356018915772438,
0.18091073632240295,
-0.032569337636232376,
0.007912281900644302,
-0.0778689906001091,
0.21559859812259674,
-0.02706126868724823,
-0.08397466689348221,
-0.008238965645432472,
-0.12671852111816406,
-0.07265506684780121,
0.2519664466381073,
0.14710456132888794,
-0.0810018926858902,
0.020907647907733917,
-0.044711124151945114,
0.006173606961965561,
-0.016253262758255005,
0.0961860790848732,
0.08117428421974182,
0.15077227354049683,
-0.08386573940515518,
0.009070220403373241,
-0.03386605530977249,
-0.09008907526731491,
-0.22303664684295654,
-0.024345040321350098,
0.06005365028977394,
-0.01646810583770275,
-0.02193245105445385,
0.09679640829563141,
-0.1379760205745697,
-0.08963668346405029,
0.08547624945640564,
-0.10725609213113785,
-0.09396637231111526,
-0.02700330875813961,
-0.004553067032247782,
0.03587677702307701,
0.06921213865280151,
0.02777284011244774,
0.0434592142701149,
0.08795562386512756,
-0.00946748536080122,
-0.04489907994866371,
-0.0016073354054242373,
0.09771288931369781,
-0.10480605065822601,
0.2293972671031952,
-0.043693576008081436,
0.03512971103191376,
0.06720997393131256,
0.03225964680314064,
-0.1569891721010208,
0.023388205096125603,
0.06019151955842972,
-0.14712190628051758,
0.009846127592027187,
0.08346246927976608,
-0.02964239940047264,
-0.011570705100893974,
0.08483631163835526,
-0.002933440264314413,
0.004589905496686697,
0.09645398706197739,
0.046696893870830536,
-0.04884779080748558,
0.06045827642083168,
-0.13887695968151093,
0.11138057708740234,
0.11274326592683792,
-0.06485963612794876,
0.011804950423538685,
-0.019306328147649765,
0.024921871721744537,
0.026822710409760475,
0.03843468800187111,
-0.054606884717941284,
-0.1300945281982422,
-0.0008440923411399126,
-0.03223482519388199,
0.052438877522945404,
-0.22344976663589478,
-0.11775736510753632,
-0.03858104348182678,
-0.0741918608546257,
-0.035871949046850204,
0.09957131743431091,
0.14349643886089325,
-0.009369484148919582,
-0.01710742525756359,
-0.12182319164276123,
0.015059580095112324,
0.13960771262645721,
-0.093650221824646,
-0.01831468939781189
] |
null | null | transformers |
<p align="center">
<img src="https://doctr-static.mindee.com/models?id=v0.3.1/Logo_doctr.gif&src=0" width="60%">
</p>
**Optical Character Recognition made seamless & accessible to anyone, powered by TensorFlow 2 & PyTorch**
## Task: recognition
https://github.com/mindee/doctr
### Example usage:
```python
>>> from doctr.io import DocumentFile
>>> from doctr.models import ocr_predictor, from_hub
>>> img = DocumentFile.from_images(['<image_path>'])
>>> # Load your model from the hub
>>> model = from_hub('mindee/my-model')
>>> # Pass it to the predictor
>>> # If your model is a recognition model:
>>> predictor = ocr_predictor(det_arch='db_mobilenet_v3_large',
>>> reco_arch=model,
>>> pretrained=True)
>>> # If your model is a detection model:
>>> predictor = ocr_predictor(det_arch=model,
>>> reco_arch='crnn_mobilenet_v3_small',
>>> pretrained=True)
>>> # Get your predictions
>>> res = predictor(img)
```
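For this repository specifically — a recognition model, per the task above — only the first variant applies. The following is a minimal sketch using this repo's id; the detection architecture is just an illustrative default, and any doctr detection backbone could be substituted:

```python
>>> from doctr.models import ocr_predictor, from_hub
>>> # Recognition weights from this repository
>>> reco_model = from_hub('Noxilus/doctr-torch-parseq-german')
>>> predictor = ocr_predictor(det_arch='db_mobilenet_v3_large',
>>>                           reco_arch=reco_model,
>>>                           pretrained=True)
```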
| {"language": "en"} | null | Noxilus/doctr-torch-parseq-german | [
"transformers",
"pytorch",
"en",
"endpoints_compatible",
"region:us"
] | 2024-02-13T09:59:52+00:00 | [] | [
"en"
] | TAGS
#transformers #pytorch #en #endpoints_compatible #region-us
|
<p align="center">
<img src="URL width="60%">
</p>
Optical Character Recognition made seamless & accessible to anyone, powered by TensorFlow 2 & PyTorch
## Task: recognition
URL
### Example usage:
| [
"## Task: recognition\n\nURL",
"### Example usage:"
] | [
"TAGS\n#transformers #pytorch #en #endpoints_compatible #region-us \n",
"## Task: recognition\n\nURL",
"### Example usage:"
] | [
23,
6,
6
] | [
"passage: TAGS\n#transformers #pytorch #en #endpoints_compatible #region-us \n## Task: recognition\n\nURL### Example usage:"
] | [
0.03270414099097252,
-0.027599597349762917,
-0.00910111702978611,
-0.04490388557314873,
0.13444852828979492,
0.03268662095069885,
0.03940471261739731,
0.060651861131191254,
0.15886354446411133,
-0.017243899405002594,
0.13023442029953003,
0.21057485044002533,
-0.05032191425561905,
-0.04390718415379524,
-0.058335963636636734,
-0.1756501942873001,
0.08560232073068619,
0.06485321372747421,
-0.033668145537376404,
0.13000620901584625,
0.047748763114213943,
-0.10946774482727051,
0.05538155511021614,
-0.02709086239337921,
-0.12249356508255005,
0.05483949929475784,
0.006534461863338947,
-0.07038439065217972,
0.11352653056383133,
0.0017147330800071359,
0.18504805862903595,
0.035126909613609314,
-0.05627407506108284,
-0.241455078125,
0.0011365527752786875,
-0.011875789612531662,
-0.032622985541820526,
-0.0010987300192937255,
0.019709190353751183,
-0.10091208666563034,
-0.005443592555820942,
0.015482322312891483,
0.010725373402237892,
0.0008299554465338588,
-0.10201457142829895,
-0.07743388414382935,
-0.004035712219774723,
0.11332761496305466,
0.006187989376485348,
0.10157135874032974,
0.030629001557826996,
0.1670650839805603,
-0.2214292734861374,
0.10569499433040619,
0.12899193167686462,
-0.21245750784873962,
0.0038889257702976465,
0.11137113720178604,
0.06917312741279602,
0.010815666057169437,
-0.02199188433587551,
0.061704110354185104,
-0.024276917800307274,
0.03862236812710762,
-0.030152009800076485,
-0.07930940389633179,
-0.10542470216751099,
0.11373971402645111,
-0.09010087698698044,
-0.13203908503055573,
0.17624518275260925,
0.01407382357865572,
0.08935052156448364,
-0.017078017815947533,
-0.12972179055213928,
0.01738903485238552,
-0.04660618305206299,
-0.04633061960339546,
0.014865762554109097,
0.06560975313186646,
-0.0003750571340788156,
-0.05458732694387436,
-0.06375661492347717,
-0.027558421716094017,
-0.17438645660877228,
0.18428659439086914,
0.029740525409579277,
0.09290347993373871,
-0.1599360704421997,
0.05288536474108696,
0.06484805792570114,
-0.05138230323791504,
0.028877895325422287,
-0.07615333050489426,
0.03636744245886803,
0.03156355023384094,
-0.02537020854651928,
-0.007240276783704758,
0.12985670566558838,
0.10408302396535873,
0.08240829408168793,
0.01726851984858513,
-0.03806469589471817,
0.14362363517284393,
-0.014545547775924206,
0.20236901938915253,
-0.059392716735601425,
0.008196336217224598,
0.043446335941553116,
-0.09009569883346558,
-0.026490340009331703,
-0.007151836529374123,
-0.08781874924898148,
-0.08367810398340225,
0.08518670499324799,
0.13892975449562073,
0.07235444337129593,
0.0753839984536171,
-0.07395365089178085,
-0.009867903776466846,
-0.007455177139490843,
-0.07374777644872665,
-0.01349574327468872,
0.04656638577580452,
-0.001190511859022081,
0.1420859843492508,
0.0196010023355484,
0.0044150399044156075,
-0.0961945578455925,
0.05748841166496277,
-0.05629947781562805,
0.012191885150969028,
-0.033544886857271194,
-0.06190599128603935,
0.04160205274820328,
-0.09710223972797394,
0.04872336611151695,
-0.17998066544532776,
-0.11646421253681183,
-0.013473187573254108,
0.05942726135253906,
0.01823052391409874,
0.03652334585785866,
-0.06366150081157684,
-0.04032636061310768,
-0.026460612192749977,
-0.051225047558546066,
-0.05599082261323929,
-0.08508323132991791,
0.12972615659236908,
-0.07617867738008499,
0.05417032167315483,
-0.06379246711730957,
0.050253164023160934,
-0.12816166877746582,
0.010959719307720661,
-0.09853862971067429,
0.058299798518419266,
-0.05995984375476837,
0.22708682715892792,
-0.06423510611057281,
-0.03256266191601753,
-0.12784571945667267,
0.04300375282764435,
-0.06929122656583786,
0.1655939817428589,
0.04642341658473015,
-0.09404759854078293,
0.20688249170780182,
-0.03569023311138153,
-0.13941949605941772,
0.07896725833415985,
-0.010751238092780113,
0.10548803955316544,
0.06781233847141266,
0.25453147292137146,
0.0465003103017807,
-0.07811084389686584,
0.0977809950709343,
0.15951570868492126,
-0.19903703033924103,
-0.10026513785123825,
-0.01257302612066269,
-0.05652352422475815,
-0.05301569774746895,
0.022698938846588135,
-0.01229810994118452,
0.08302761614322662,
-0.07076416164636612,
-0.049141738563776016,
-0.037490516901016235,
-0.002946042688563466,
0.03632953763008118,
0.0415823757648468,
0.0788005068898201,
-0.009825772605836391,
-0.025083929300308228,
-0.03788186237215996,
-0.04164562001824379,
-0.01579562947154045,
0.07286885380744934,
-0.0944332554936409,
0.09873063862323761,
-0.03605172783136368,
0.03993780538439751,
-0.20998801290988922,
-0.06222657859325409,
0.003277037525549531,
0.09596208482980728,
-0.059057921171188354,
0.08447207510471344,
0.06804995983839035,
-0.14238440990447998,
0.048279769718647,
-0.08464951068162918,
0.09268169850111008,
-0.010777468793094158,
-0.04991583153605461,
-0.023197684437036514,
0.01093652006238699,
-0.02029726654291153,
-0.14775845408439636,
-0.017736677080392838,
-0.004657731391489506,
0.0813659280538559,
0.09122779220342636,
-0.01554095558822155,
0.07559364289045334,
-0.020398804917931557,
0.06419193744659424,
-0.007216616068035364,
0.012028065510094166,
0.11436575651168823,
-0.03325403481721878,
-0.07464928179979324,
0.14007587730884552,
-0.02445136196911335,
0.302318811416626,
0.18739275634288788,
-0.3378460705280304,
0.029017938300967216,
0.08908210694789886,
-0.031188415363430977,
0.017078900709748268,
0.10360831022262573,
-0.023263847455382347,
0.13490816950798035,
0.04879279062151909,
0.10845015943050385,
-0.04616791382431984,
-0.018289262428879738,
-0.027598362416028976,
-0.028337061405181885,
-0.009101619943976402,
0.0748957097530365,
0.0594186894595623,
-0.17086726427078247,
0.13986793160438538,
0.15792858600616455,
-0.011109116487205029,
0.08833690732717514,
-0.056862279772758484,
-0.011021159589290619,
0.07625383138656616,
0.07908215373754501,
-0.08748628944158554,
-0.013018690049648285,
-0.2802930772304535,
-0.030648034065961838,
0.07821330428123474,
0.01856822334229946,
0.10006235539913177,
-0.15477953851222992,
-0.03778154030442238,
0.0009117769659496844,
-0.031879618763923645,
-0.09307155013084412,
0.044706791639328,
0.07728298753499985,
0.05857471376657486,
0.0573776513338089,
-0.07430531829595566,
0.11049347370862961,
-0.031820427626371384,
-0.07904384285211563,
0.18437373638153076,
-0.06573330610990524,
-0.3133313059806824,
-0.13381166756153107,
-0.06377173960208893,
0.01457512192428112,
-0.018000289797782898,
0.08481769263744354,
-0.07452510297298431,
-0.025052661076188087,
0.06830229610204697,
0.07729893922805786,
-0.10730451345443726,
-0.03530649468302727,
-0.03193192183971405,
0.0973428413271904,
-0.09938590228557587,
-0.10983163118362427,
-0.05269240215420723,
-0.0673312246799469,
-0.045541927218437195,
0.09140918403863907,
-0.1409199982881546,
0.14207321405410767,
0.11696391552686691,
0.005693276412785053,
0.077423095703125,
0.011984570883214474,
0.12109598517417908,
-0.09710078686475754,
-0.06036459282040596,
0.13035635650157928,
-0.03955647349357605,
0.08492255210876465,
0.16808930039405823,
0.018064452335238457,
-0.06940054148435593,
-0.04683447629213333,
-0.041471198201179504,
-0.08089010417461395,
-0.2007138580083847,
-0.11129135638475418,
-0.09587275981903076,
-0.002966204658150673,
-0.013849702663719654,
0.07354029268026352,
0.07919809967279434,
0.04873337224125862,
0.04441668465733528,
-0.03885959833860397,
-0.0014501825207844377,
0.04464361071586609,
0.23716065287590027,
-0.018820906057953835,
0.07082534581422806,
-0.07207667082548141,
-0.12949110567569733,
0.00789872370660305,
0.06736656278371811,
0.2366783171892166,
0.12244200706481934,
0.03393195569515228,
0.09525147825479507,
0.14615340530872345,
0.09218373149633408,
0.1494268923997879,
0.024641012772917747,
-0.003569502616301179,
0.003192717209458351,
-0.018206577748060226,
-0.08973030000925064,
0.055161964148283005,
0.20120513439178467,
-0.09289968013763428,
0.003166440175846219,
-0.20274168252944946,
0.10522811859846115,
0.21796956658363342,
-0.028373554348945618,
-0.19342166185379028,
0.024198099970817566,
0.05546240881085396,
-0.05635995790362358,
-0.01778959110379219,
0.12850873172283173,
0.0033053536899387836,
-0.12062197178602219,
-0.012796743772923946,
-0.019509142264723778,
0.14717650413513184,
-0.05498476326465607,
0.08438613265752792,
-0.04543029144406319,
-0.19146956503391266,
0.06473685801029205,
0.09094797819852829,
-0.23444108664989471,
0.31167417764663696,
0.003306447993963957,
-0.038561154156923294,
-0.07112199813127518,
-0.04483785107731819,
0.009928601793944836,
0.1808573603630066,
0.1750727891921997,
0.015233807265758514,
0.07980579137802124,
-0.1486387699842453,
-0.009045353159308434,
0.0782691240310669,
0.1545184701681137,
-0.03327476978302002,
0.010215473361313343,
-0.02331915684044361,
-0.029989201575517654,
-0.012267888523638248,
0.03715818002820015,
-0.008524499833583832,
-0.10536221414804459,
-0.06154453009366989,
0.020288394764065742,
0.0904325395822525,
0.0006644376553595066,
0.04600219056010246,
0.001070830738171935,
0.14299315214157104,
-0.07304336130619049,
-0.06837349385023117,
-0.0797109305858612,
-0.1477975696325302,
0.08623110502958298,
-0.0668104737997055,
0.04108239337801933,
-0.10201054066419601,
-0.12007119506597519,
-0.05189809575676918,
-0.18513686954975128,
0.12989051640033722,
-0.07441450655460358,
0.027602603659033775,
-0.028094951063394547,
0.13307133316993713,
-0.08617876470088959,
-0.011420823633670807,
0.0078117926605045795,
0.0257965549826622,
-0.10764393210411072,
-0.10013315826654434,
0.02437674067914486,
0.06106824800372124,
0.025204842910170555,
0.11564603447914124,
-0.08772896230220795,
-0.0031381070148199797,
-0.030762873589992523,
0.052779462188482285,
0.2662453353404999,
0.07170353829860687,
-0.026249771937727928,
0.09803158044815063,
0.1772981435060501,
-0.05846834182739258,
-0.3723665475845337,
-0.09070718288421631,
-0.1412006914615631,
-0.03534398227930069,
-0.10075396299362183,
-0.10822948813438416,
0.0723094642162323,
-0.013957844115793705,
-0.012158345431089401,
0.12747971713542938,
-0.31170567870140076,
-0.03506371006369591,
0.1213264986872673,
0.022563209757208824,
0.29785263538360596,
-0.14823122322559357,
-0.07885516434907913,
-0.012521319091320038,
-0.2965478301048279,
0.06017686799168587,
0.0922515019774437,
0.0581921711564064,
-0.021584883332252502,
0.09089738130569458,
0.030341142788529396,
-0.08422140777111053,
0.08213987201452255,
0.013461165130138397,
0.06969908624887466,
-0.05176528915762901,
-0.11622791737318039,
0.10610198974609375,
0.01924169436097145,
0.031485553830862045,
0.10906878113746643,
0.04333681985735893,
-0.10707519203424454,
-0.04248547554016113,
-0.16172702610492706,
0.08770730346441269,
0.04357675835490227,
-0.041813675314188004,
-0.04659615457057953,
-0.07273367047309875,
0.05970463156700134,
0.06668668985366821,
0.26733770966529846,
-0.05488459765911102,
0.07639911025762558,
0.05199141055345535,
0.07850302010774612,
-0.17005112767219543,
-0.14732131361961365,
-0.06400860846042633,
-0.05119277909398079,
0.1369655579328537,
-0.1216486245393753,
0.07406838238239288,
0.12623417377471924,
0.009859554469585419,
0.007141930516809225,
0.12651310861110687,
0.02521677128970623,
-0.032346274703741074,
0.09566818177700043,
-0.16859330236911774,
-0.131964772939682,
-0.0078474385663867,
-0.1586616039276123,
0.1390065997838974,
0.05800560116767883,
0.07064823806285858,
0.02168467827141285,
-0.026202542707324028,
-0.014326730743050575,
-0.014201663434505463,
-0.08670194447040558,
-0.018705114722251892,
0.0529981404542923,
0.0413152277469635,
-0.08931098133325577,
0.05055082216858864,
-0.0003891869564540684,
-0.2598893940448761,
-0.04113886505365372,
0.04994266480207443,
-0.14934520423412323,
-0.1047123447060585,
-0.05020565912127495,
0.14038589596748352,
-0.07231295853853226,
-0.05653749033808708,
-0.061507437378168106,
-0.09672261774539948,
0.07273713499307632,
0.21024306118488312,
0.0810416117310524,
0.108673594892025,
-0.03277362510561943,
-0.005587117280811071,
0.020516857504844666,
-0.0312730148434639,
-0.06744340807199478,
-0.05141725763678551,
-0.10874760895967484,
-0.04648858681321144,
0.025783248245716095,
0.14447630941867828,
-0.08243455737829208,
-0.06434068828821182,
-0.1765460968017578,
0.12586501240730286,
-0.15264290571212769,
-0.04493829607963562,
-0.05616871267557144,
-0.041107166558504105,
0.005686645396053791,
-0.05922761559486389,
-0.07126617431640625,
-0.011580842547118664,
-0.154485821723938,
0.0030423952266573906,
0.03663560748100281,
0.033557210117578506,
-0.07489218562841415,
-0.039779383689165115,
0.11740900576114655,
-0.04257723689079285,
0.08555926382541656,
0.176638662815094,
-0.1462661325931549,
0.09246957302093506,
-0.14960269629955292,
-0.1375490427017212,
0.1395706832408905,
0.009273394010961056,
0.03425491228699684,
0.13292677700519562,
0.048374056816101074,
0.06482790410518646,
0.03134450316429138,
0.05979492887854576,
0.032448362559080124,
-0.14709095656871796,
0.05963708087801933,
-0.06769146770238876,
-0.15388783812522888,
-0.02925100363790989,
-0.06518721580505371,
0.09521565586328506,
0.08544307947158813,
0.12423396110534668,
-0.01844089850783348,
0.07397501170635223,
-0.020496224984526634,
-0.014061595313251019,
0.013403342105448246,
-0.10654618591070175,
-0.03531536087393761,
-0.09954486787319183,
0.013292074203491211,
-0.04841762036085129,
0.2828221619129181,
0.0031543804798275232,
0.011320449411869049,
0.038212958723306656,
0.06949949264526367,
-0.02927611954510212,
0.02666269801557064,
0.20465631783008575,
0.07100603729486465,
-0.018795952200889587,
-0.0033977965358644724,
0.02616511657834053,
-0.0008159334538504481,
-0.11789605766534805,
0.06748761236667633,
0.24040332436561584,
0.09567099809646606,
0.12290588021278381,
0.03372737392783165,
0.03162875771522522,
-0.11338841170072556,
-0.1877398043870926,
0.02795301005244255,
0.09022104740142822,
-0.025515304878354073,
0.13254426419734955,
0.13935786485671997,
0.02819826453924179,
0.09376376867294312,
0.003742426633834839,
-0.014908524230122566,
-0.12102306634187698,
-0.0443989560008049,
-0.07251373678445816,
-0.13524295389652252,
0.04322607070207596,
-0.04698150232434273,
-0.02606256492435932,
0.2030750811100006,
0.0179939903318882,
-0.03420597314834595,
0.09785472601652145,
0.0421818308532238,
-0.06997424364089966,
0.08179320394992828,
-0.04289522394537926,
-0.026412561535835266,
0.00055879628052935,
-0.0346599705517292,
-0.0751967579126358,
-0.06483342498540878,
-0.03410165756940842,
0.07224040478467941,
-0.10611730068922043,
-0.02457038126885891,
-0.1662091612815857,
-0.07463259249925613,
-0.06463247537612915,
0.0925065129995346,
-0.08244489878416061,
0.04450501129031181,
0.015051777474582195,
-0.041947659105062485,
0.022479530423879623,
0.21140354871749878,
-0.08741587400436401,
-0.14311516284942627,
-0.005460052285343409,
0.28742894530296326,
0.033626604825258255,
0.09928186237812042,
-0.009964990429580212,
-0.006446478422731161,
-0.09364791214466095,
0.34549522399902344,
0.23661594092845917,
-0.056418851017951965,
0.03570612892508507,
0.04023140296339989,
0.03622014820575714,
0.092843197286129,
0.09890351444482803,
0.10934626311063766,
0.2994347810745239,
-0.09157008677721024,
-0.07316969335079193,
-0.04277901351451874,
-0.022340746596455574,
-0.1365603804588318,
0.05781230330467224,
0.03555132821202278,
-0.06268171221017838,
-0.12673425674438477,
0.1244690790772438,
-0.24090325832366943,
0.09227778762578964,
0.14072950184345245,
-0.19596484303474426,
-0.043907761573791504,
-0.06260692328214645,
0.14123304188251495,
-0.046254128217697144,
0.09966244548559189,
-0.035839617252349854,
-0.14508672058582306,
0.030125906690955162,
0.029977014288306236,
-0.2787565290927887,
-0.0032371000852435827,
0.04502212628722191,
0.02168104238808155,
0.052524276077747345,
0.009334537200629711,
0.07107052952051163,
0.08238043636083603,
0.062403496354818344,
0.010394016280770302,
0.07077473402023315,
0.049105241894721985,
-0.06968847662210464,
-0.07490189373493195,
-0.06616254895925522,
0.018164828419685364,
-0.08264587074518204,
0.023209089413285255,
-0.17781150341033936,
0.05683336779475212,
-0.01963288150727749,
0.00445194635540247,
-0.09381145238876343,
-0.05481733754277229,
-0.06427531689405441,
-0.008732501417398453,
0.05319303646683693,
0.03609471023082733,
0.004405466374009848,
-0.05553659796714783,
0.011063934303820133,
0.05035819113254547,
-0.07433667778968811,
-0.15044541656970978,
-0.06796272844076157,
-0.06511654704809189,
0.025130584836006165,
-0.024509107694029808,
-0.08889204263687134,
-0.01918594166636467,
0.010601107962429523,
0.05910160765051842,
-0.04882096126675606,
0.055259428918361664,
0.050999026745557785,
0.03679294511675835,
0.03482266142964363,
-0.2051863670349121,
0.08412422984838486,
0.09744313359260559,
-0.08082951605319977,
-0.0753294974565506
] |
null | null | peft |
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
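Since the card leaves this section blank, the following is a minimal sketch rather than an official recipe. It assumes the usual PEFT workflow: load the GPTQ base model named in this card's metadata, then attach this adapter on top. Loading a GPTQ checkpoint additionally requires `optimum` and `auto-gptq` to be installed; the prompt string is a placeholder.

```python
# Minimal sketch (not an official recipe): the repo ids below come from this
# card's metadata; everything else is a standard PEFT loading pattern.
from peft import PeftModel
from transformers import AutoModelForCausalLM, AutoTokenizer

base_id = "TheBloke/Mixtral-8x7B-Instruct-v0.1-GPTQ"
adapter_id = "man4j/schedule_adapter_w123"

tokenizer = AutoTokenizer.from_pretrained(base_id)
# GPTQ weights need optimum + auto-gptq; device_map="auto" places layers on GPU
base_model = AutoModelForCausalLM.from_pretrained(base_id, device_map="auto")
# Attach this adapter to the quantized base model
model = PeftModel.from_pretrained(base_model, adapter_id)

inputs = tokenizer("[INST] Hello [/INST]", return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```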
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed]
### Framework versions
- PEFT 0.8.2 | {"library_name": "peft", "base_model": "TheBloke/Mixtral-8x7B-Instruct-v0.1-GPTQ"} | null | man4j/schedule_adapter_w123 | [
"peft",
"safetensors",
"arxiv:1910.09700",
"base_model:TheBloke/Mixtral-8x7B-Instruct-v0.1-GPTQ",
"region:us"
] | 2024-02-13T10:01:27+00:00 | [
"1910.09700"
] | [] | TAGS
#peft #safetensors #arxiv-1910.09700 #base_model-TheBloke/Mixtral-8x7B-Instruct-v0.1-GPTQ #region-us
|
# Model Card for Model ID
## Model Details
### Model Description
- Developed by:
- Funded by [optional]:
- Shared by [optional]:
- Model type:
- Language(s) (NLP):
- License:
- Finetuned from model [optional]:
### Model Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Downstream Use [optional]
### Out-of-Scope Use
## Bias, Risks, and Limitations
### Recommendations
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
## Training Details
### Training Data
### Training Procedure
#### Preprocessing [optional]
#### Training Hyperparameters
- Training regime:
#### Speeds, Sizes, Times [optional]
## Evaluation
### Testing Data, Factors & Metrics
#### Testing Data
#### Factors
#### Metrics
### Results
#### Summary
## Model Examination [optional]
## Environmental Impact
Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).
- Hardware Type:
- Hours used:
- Cloud Provider:
- Compute Region:
- Carbon Emitted:
## Technical Specifications [optional]
### Model Architecture and Objective
### Compute Infrastructure
#### Hardware
#### Software
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Model Card Authors [optional]
## Model Card Contact
### Framework versions
- PEFT 0.8.2 | [
"# Model Card for Model ID",
"## Model Details",
"### Model Description\n\n\n\n\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:",
"### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Downstream Use [optional]",
"### Out-of-Scope Use",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.",
"## How to Get Started with the Model\n\nUse the code below to get started with the model.",
"## Training Details",
"### Training Data",
"### Training Procedure",
"#### Preprocessing [optional]",
"#### Training Hyperparameters\n\n- Training regime:",
"#### Speeds, Sizes, Times [optional]",
"## Evaluation",
"### Testing Data, Factors & Metrics",
"#### Testing Data",
"#### Factors",
"#### Metrics",
"### Results",
"#### Summary",
"## Model Examination [optional]",
"## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:",
"## Technical Specifications [optional]",
"### Model Architecture and Objective",
"### Compute Infrastructure",
"#### Hardware",
"#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Model Card Authors [optional]",
"## Model Card Contact",
"### Framework versions\n\n- PEFT 0.8.2"
] | [
"TAGS\n#peft #safetensors #arxiv-1910.09700 #base_model-TheBloke/Mixtral-8x7B-Instruct-v0.1-GPTQ #region-us \n",
"# Model Card for Model ID",
"## Model Details",
"### Model Description\n\n\n\n\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:",
"### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Downstream Use [optional]",
"### Out-of-Scope Use",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.",
"## How to Get Started with the Model\n\nUse the code below to get started with the model.",
"## Training Details",
"### Training Data",
"### Training Procedure",
"#### Preprocessing [optional]",
"#### Training Hyperparameters\n\n- Training regime:",
"#### Speeds, Sizes, Times [optional]",
"## Evaluation",
"### Testing Data, Factors & Metrics",
"#### Testing Data",
"#### Factors",
"#### Metrics",
"### Results",
"#### Summary",
"## Model Examination [optional]",
"## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:",
"## Technical Specifications [optional]",
"### Model Architecture and Objective",
"### Compute Infrastructure",
"#### Hardware",
"#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Model Card Authors [optional]",
"## Model Card Contact",
"### Framework versions\n\n- PEFT 0.8.2"
] | [
49,
6,
3,
54,
28,
3,
4,
9,
9,
10,
42,
20,
3,
4,
5,
9,
11,
13,
3,
12,
5,
4,
5,
3,
4,
9,
53,
9,
8,
6,
3,
14,
8,
7,
9,
4,
11
] | [
"passage: TAGS\n#peft #safetensors #arxiv-1910.09700 #base_model-TheBloke/Mixtral-8x7B-Instruct-v0.1-GPTQ #region-us \n# Model Card for Model ID## Model Details### Model Description\n\n\n\n\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Downstream Use [optional]### Out-of-Scope Use## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.## How to Get Started with the Model\n\nUse the code below to get started with the model.## Training Details### Training Data### Training Procedure#### Preprocessing [optional]#### Training Hyperparameters\n\n- Training regime:#### Speeds, Sizes, Times [optional]## Evaluation### Testing Data, Factors & Metrics#### Testing Data#### Factors#### Metrics### Results#### Summary## Model Examination [optional]## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:## Technical Specifications [optional]### Model Architecture and Objective### Compute Infrastructure#### Hardware#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Model Card Authors [optional]## Model Card Contact### Framework versions\n\n- PEFT 0.8.2"
] | [
-0.10241162031888962,
0.20291121304035187,
-0.0033211822155863047,
0.027554785832762718,
0.08416222035884857,
0.022646872326731682,
0.0688340961933136,
0.12285299599170685,
0.014241965487599373,
0.1298184096813202,
0.05316650867462158,
0.10087748616933823,
0.12388463318347931,
0.2219739854335785,
-0.008415017277002335,
-0.19339223206043243,
0.02651054412126541,
-0.07841257005929947,
0.00622309185564518,
0.1220838651061058,
0.13814494013786316,
-0.09734494984149933,
0.07359287887811661,
-0.02212522365152836,
-0.002798334462568164,
-0.03409436345100403,
-0.0674295723438263,
-0.026013536378741264,
0.05184267461299896,
0.051205407828092575,
0.04352471977472305,
-0.005666714161634445,
0.09222743660211563,
-0.2709183394908905,
0.00988082867115736,
0.05450855940580368,
0.0004463361983653158,
0.08654193580150604,
0.1037665382027626,
-0.02922348491847515,
0.11032471805810928,
-0.03863106667995453,
0.13349297642707825,
0.07953806966543198,
-0.0945005789399147,
-0.22421208024024963,
-0.0734376534819603,
0.08812413364648819,
0.18274888396263123,
0.06756641715765,
-0.036271050572395325,
0.12798361480236053,
-0.07224558293819427,
0.014562600292265415,
0.07725424319505692,
-0.10370124876499176,
-0.07292476296424866,
0.0700574442744255,
0.11812886595726013,
0.08786118030548096,
-0.11989298462867737,
-0.03652530536055565,
0.03259773179888725,
0.0405353382229805,
0.08462199568748474,
0.012338562868535519,
0.17083656787872314,
0.032764509320259094,
-0.14368265867233276,
-0.05431520938873291,
0.13684897124767303,
0.02277020737528801,
-0.03927306458353996,
-0.2312253713607788,
-0.0147244306281209,
-0.07286060601472855,
-0.03476874157786369,
-0.05808602273464203,
0.03883173316717148,
0.007416407577693462,
0.1119336187839508,
-0.035968296229839325,
-0.08093801885843277,
-0.016997680068016052,
0.11218547075986862,
0.07410099357366562,
0.014264436438679695,
-0.016047395765781403,
0.012463731691241264,
0.12808389961719513,
0.0529506616294384,
-0.12075208872556686,
-0.04642258211970329,
-0.06770551204681396,
-0.03977956250309944,
-0.029056036844849586,
0.05456842854619026,
0.036427490413188934,
0.04068655148148537,
0.2539169490337372,
-0.022554973140358925,
0.05624387040734291,
0.05343281850218773,
0.016929902136325836,
0.033251192420721054,
0.09979264438152313,
-0.05554435774683952,
-0.18952210247516632,
-0.019396746531128883,
0.10349728167057037,
0.007398975547403097,
-0.024227553978562355,
-0.04216491058468819,
0.03047754056751728,
0.030641576275229454,
0.11486386507749557,
0.10578913241624832,
-0.02181314304471016,
-0.06947889924049377,
-0.058524344116449356,
0.21724362671375275,
-0.1507103443145752,
0.05149835720658302,
0.01805715076625347,
-0.016935838386416435,
-0.04699482023715973,
0.013885955326259136,
0.013949448242783546,
-0.038323141634464264,
0.10757450759410858,
-0.06294876337051392,
-0.04819023981690407,
-0.11351808905601501,
-0.045790355652570724,
0.032998405396938324,
0.006595016457140446,
-0.043710365891456604,
-0.030802011489868164,
-0.08483964204788208,
-0.09372523427009583,
0.09612125903367996,
-0.05832058563828468,
-0.07376881688833237,
-0.020545417442917824,
-0.06751634925603867,
0.023718176409602165,
0.019348789006471634,
0.0770672857761383,
-0.02645258605480194,
0.04099829122424126,
-0.03301355242729187,
0.06740741431713104,
0.09270182251930237,
0.036715760827064514,
-0.07072219997644424,
0.06749111413955688,
-0.19039791822433472,
0.07915014028549194,
-0.07730600982904434,
0.0259315837174654,
-0.16072310507297516,
-0.007193777710199356,
0.0065791672095656395,
0.023450937122106552,
0.03642042353749275,
0.15155065059661865,
-0.19257301092147827,
-0.030999841168522835,
0.16964593529701233,
-0.1034105122089386,
-0.11007869243621826,
0.04466579109430313,
-0.0359031967818737,
0.15868283808231354,
0.03239075094461441,
-0.004666374996304512,
0.09189342707395554,
-0.14812920987606049,
-0.01602858118712902,
-0.02270488440990448,
0.015184148214757442,
0.07695864886045456,
0.07180127501487732,
-0.08512724936008453,
0.013627678155899048,
0.016883332282304764,
-0.059581007808446884,
-0.009483852423727512,
-0.03997640311717987,
-0.09793119877576828,
0.006952229421585798,
-0.0850548967719078,
0.017064262181520462,
0.0061867497861385345,
-0.0844012051820755,
-0.016043178737163544,
-0.14383696019649506,
-0.03986300528049469,
0.07936456799507141,
0.01142581831663847,
-0.016689026728272438,
-0.070533886551857,
0.030959682539105415,
-0.03974509984254837,
-0.01608947105705738,
-0.14325198531150818,
-0.019686013460159302,
0.034839656203985214,
-0.15373988449573517,
-0.0043590012937784195,
-0.11967501044273376,
0.06718328595161438,
0.01278102956712246,
-0.06357010453939438,
-0.0395624078810215,
0.02111448161303997,
-0.0039273761212825775,
-0.060736481100320816,
-0.21663007140159607,
-0.03398824483156204,
-0.04213433340191841,
0.1359318047761917,
-0.22130750119686127,
0.045259810984134674,
0.011464261449873447,
0.1195969358086586,
0.01012354250997305,
-0.06636261940002441,
0.025195196270942688,
-0.05916885659098625,
-0.02360491454601288,
-0.07411377876996994,
-0.009803853929042816,
-0.0015868277987465262,
-0.0307029839605093,
0.028626328334212303,
-0.14913149178028107,
-0.04881889000535011,
0.09034619480371475,
0.08486010879278183,
-0.15073388814926147,
0.005739310756325722,
-0.0449642613530159,
-0.06598568707704544,
-0.0883575826883316,
-0.07506520301103592,
0.07097290456295013,
0.05091328173875809,
0.05194438621401787,
-0.07808561623096466,
-0.06680195778608322,
0.007582232356071472,
-0.014358406886458397,
-0.0263998880982399,
0.11779258400201797,
0.0843748226761818,
-0.08056825399398804,
0.09426524490118027,
0.07968970388174057,
0.045986682176589966,
0.08395859599113464,
-0.00979606807231903,
-0.10105940699577332,
-0.03201433643698692,
0.056587737053632736,
0.014709155075252056,
0.1461801528930664,
-0.05392427742481232,
0.04700694978237152,
0.05154579505324364,
-0.04241063445806503,
0.03874242305755615,
-0.10193897783756256,
0.016217412427067757,
0.011710701510310173,
-0.01351161953061819,
0.04146059975028038,
-0.022880302742123604,
0.010416623204946518,
0.08702710270881653,
0.06402462720870972,
0.029095549136400223,
0.022188544273376465,
-0.0358247235417366,
-0.1377100795507431,
0.16925844550132751,
-0.0919458344578743,
-0.2317943423986435,
-0.15182356536388397,
0.030899234116077423,
0.05365525931119919,
-0.020363418385386467,
0.02656489424407482,
-0.03483406826853752,
-0.10815080255270004,
-0.0833033099770546,
0.018068116158246994,
0.04147995635867119,
-0.06885619461536407,
-0.056137703359127045,
0.03660798445343971,
0.03671291843056679,
-0.12685708701610565,
0.029957983642816544,
0.05683210492134094,
-0.0014920257963240147,
-0.0018424380104988813,
0.04514772444963455,
0.0837569609284401,
0.19294017553329468,
0.00678630918264389,
-0.00033998058643192053,
0.05113794654607773,
0.2802429497241974,
-0.1532752960920334,
0.1288866102695465,
0.12712986767292023,
-0.048333749175071716,
0.09257275611162186,
0.1977733075618744,
0.04262400045990944,
-0.0806935653090477,
0.023329798132181168,
0.03125106543302536,
-0.03424730524420738,
-0.258588582277298,
-0.06727924942970276,
-0.022342883050441742,
-0.06876970827579498,
0.08892931789159775,
0.08419667929410934,
0.09419924765825272,
0.03849819675087929,
-0.07990297675132751,
-0.05752116069197655,
0.04981521889567375,
0.11148285865783691,
-0.03475463390350342,
0.013263084925711155,
0.08193547278642654,
-0.03910587728023529,
0.003016039030626416,
0.09705069661140442,
-0.011384314857423306,
0.1503230333328247,
0.04875265434384346,
0.10934828966856003,
0.06258805841207504,
0.07787226885557175,
-0.004907501861453056,
0.05358579382300377,
0.012701563537120819,
0.026646392419934273,
0.009867367334663868,
-0.09110385179519653,
0.027962125837802887,
0.12554438412189484,
0.021915199235081673,
0.03460192307829857,
0.02514786832034588,
-0.05864737182855606,
0.0371248833835125,
0.21132348477840424,
0.010867614299058914,
-0.19658063352108002,
-0.07221776992082596,
0.06515976041555405,
-0.07966580986976624,
-0.1482042521238327,
-0.004233686253428459,
0.02674056589603424,
-0.17231301963329315,
0.020228011533617973,
-0.04328742250800133,
0.10723823308944702,
-0.07417872548103333,
-0.04155757650732994,
0.09809311479330063,
0.06075751408934593,
-0.032227762043476105,
0.047201432287693024,
-0.16842719912528992,
0.1130785346031189,
0.029493173584342003,
0.06834647804498672,
-0.09470956027507782,
0.10082831233739853,
0.00573690515011549,
-0.01805390603840351,
0.168406143784523,
0.005364075768738985,
-0.05327284708619118,
-0.08046768605709076,
-0.06854433566331863,
-0.025691505521535873,
0.09039940685033798,
-0.13580916821956635,
0.06675631552934647,
-0.021497096866369247,
-0.039563074707984924,
0.0031384171452373266,
-0.10701049864292145,
-0.11377666890621185,
-0.1714104413986206,
0.0680714026093483,
-0.07135035842657089,
0.0074243079870939255,
-0.09528306871652603,
-0.05566806346178055,
-0.013526429422199726,
0.17676512897014618,
-0.18544819951057434,
-0.11301789432764053,
-0.14834918081760406,
-0.10644631087779999,
0.1688571274280548,
-0.04588889703154564,
0.08620408177375793,
-0.003780101891607046,
0.16818776726722717,
-0.013117827475070953,
-0.013871356844902039,
0.09139063954353333,
-0.09512437134981155,
-0.19607728719711304,
-0.0536918081343174,
0.17631983757019043,
0.12736283242702484,
0.034028176218271255,
-0.02477572113275528,
0.026034241542220116,
-0.048813607543706894,
-0.11486516147851944,
0.01693202368915081,
0.14413820207118988,
0.04346391558647156,
-0.0013214257778599858,
-0.02247532270848751,
-0.12267143279314041,
-0.053297027945518494,
-0.05065767094492912,
-0.007045168895274401,
0.2028045952320099,
-0.08254018425941467,
0.16004601120948792,
0.1230425015091896,
-0.05118198320269585,
-0.2072300761938095,
0.032102689146995544,
0.03805069625377655,
0.011795414611697197,
0.03697442635893822,
-0.18376684188842773,
0.08461438864469528,
-0.010470968671143055,
-0.07971590757369995,
0.1731080710887909,
-0.18928386270999908,
-0.13251115381717682,
0.08961473405361176,
0.01953088864684105,
-0.22736068069934845,
-0.13408683240413666,
-0.1121285930275917,
-0.017403503879904747,
-0.12868450582027435,
0.04539474844932556,
0.033211033791303635,
0.0047318232245743275,
0.017090417444705963,
0.0167858749628067,
0.03868073225021362,
-0.05380486324429512,
0.19802410900592804,
-0.02534671314060688,
0.0064383139833807945,
-0.05045955628156662,
-0.0972965806722641,
0.021636666730046272,
-0.0593206025660038,
0.10891751945018768,
-0.008704948239028454,
0.022663893178105354,
-0.15567873418331146,
-0.044015731662511826,
-0.06518451869487762,
0.018279114738106728,
-0.09403128176927567,
-0.0914546400308609,
-0.0517440102994442,
0.08104117214679718,
0.10678773373365402,
-0.02520112879574299,
0.0030321080703288317,
-0.07314292341470718,
0.07954472303390503,
0.2165769785642624,
0.16140280663967133,
0.050408151000738144,
-0.062376733869314194,
0.01095870416611433,
-0.032874345779418945,
0.04081052914261818,
-0.21826592087745667,
0.04178958758711815,
0.05794466659426689,
0.035158585757017136,
0.08424677699804306,
-0.011153952218592167,
-0.1608465164899826,
-0.07657818496227264,
0.07186894863843918,
-0.0630071759223938,
-0.16725952923297882,
-0.03666141629219055,
0.043596040457487106,
-0.19635580480098724,
-0.050604816526174545,
0.029116792604327202,
-0.023043865337967873,
-0.034027356654405594,
0.015835268422961235,
0.0855732336640358,
-0.006201168522238731,
0.1006394773721695,
0.07577869296073914,
0.09691638499498367,
-0.10427147895097733,
0.06933150440454483,
0.08673715591430664,
-0.03649352118372917,
0.008871869184076786,
0.1356717199087143,
-0.04907501861453056,
-0.02333948388695717,
0.06828055530786514,
0.07583365589380264,
0.011994399130344391,
-0.05187755823135376,
0.01617954857647419,
-0.06943706423044205,
0.06270506232976913,
0.09608164429664612,
0.018787182867527008,
-0.015092877671122551,
0.06536426395177841,
0.02200726419687271,
-0.09196051955223083,
0.1232004314661026,
0.06719259917736053,
0.023839196190238,
-0.040039580315351486,
-0.029383352026343346,
-0.013052203692495823,
-0.012861467897891998,
-0.013706198893487453,
-0.004777237307280302,
-0.07749321311712265,
-0.004901526030153036,
-0.10677313804626465,
0.017987443134188652,
-0.08586433529853821,
0.0062556639313697815,
0.01460693497210741,
-0.04169868677854538,
-0.0010356156853958964,
-0.0019442873308435082,
-0.08135992288589478,
-0.06072331219911575,
-0.023410381749272346,
0.08421844989061356,
-0.12814922630786896,
0.02233981341123581,
0.07157502323389053,
-0.11225058883428574,
0.06513433158397675,
-0.013766749761998653,
0.011903202161192894,
-0.0011828290298581123,
-0.14105287194252014,
0.054434750229120255,
-0.02232430875301361,
-0.004397719632834196,
0.021794527769088745,
-0.16848334670066833,
-0.003927278332412243,
-0.05384964495897293,
-0.07464244961738586,
0.005784913431853056,
-0.04033106565475464,
-0.13273654878139496,
0.11301997303962708,
-0.01205352135002613,
-0.07007218897342682,
-0.02108526974916458,
0.04965181276202202,
0.08582427352666855,
-0.029263244941830635,
0.09391331672668457,
-0.026220787316560745,
0.08132342994213104,
-0.17990025877952576,
-0.007992210797965527,
-0.01314880046993494,
0.03594740480184555,
-0.021472977474331856,
-0.01701587252318859,
0.05286865308880806,
-0.011645601131021976,
0.17302604019641876,
-0.016252128407359123,
0.0715932697057724,
0.04926299303770065,
-0.002273160731419921,
0.02612932026386261,
0.06922506541013718,
0.06263475865125656,
-0.013777012936770916,
-0.006703877355903387,
0.025518175214529037,
-0.01203182339668274,
-0.043883875012397766,
-0.15184491872787476,
0.04190473631024361,
0.16416770219802856,
0.0669953003525734,
0.03149545565247536,
0.016359500586986542,
-0.1381765753030777,
-0.09079206734895706,
0.1024278923869133,
-0.01965966820716858,
-0.015629665926098824,
-0.0710109993815422,
0.19229452311992645,
0.1239180862903595,
-0.1959129273891449,
0.06958167999982834,
-0.05019497871398926,
-0.033301521092653275,
-0.12825556099414825,
-0.15621665120124817,
-0.05774960294365883,
-0.04301399737596512,
-0.024062801152467728,
-0.06046803668141365,
0.06114653870463371,
0.04824526235461235,
0.0005861376994289458,
-0.0034641874954104424,
0.09882006794214249,
-0.0030698522459715605,
-0.0261316429823637,
0.062375977635383606,
0.070594921708107,
0.04351262003183365,
-0.0851086676120758,
0.0034336168318986893,
-0.0036912199575453997,
0.006291464436799288,
0.06221063435077667,
0.02046637050807476,
-0.05995354428887367,
0.023651355877518654,
-0.0031497443560510874,
-0.11754647642374039,
0.03964175283908844,
-0.013863922096788883,
-0.03659239411354065,
0.1409895122051239,
0.02411041036248207,
0.01002442929893732,
-0.025545427575707436,
0.22200126945972443,
-0.09043747186660767,
-0.07335679233074188,
-0.1350328028202057,
0.07634291052818298,
-0.05163929983973503,
0.034580618143081665,
0.03461921960115433,
-0.12328187376260757,
0.007982177659869194,
0.16491727530956268,
0.13233119249343872,
0.005296965595334768,
0.008370660245418549,
0.05376332998275757,
0.0050503346137702465,
-0.033722180873155594,
0.02334059216082096,
0.04789893701672554,
0.18667389452457428,
-0.07347661256790161,
0.08116268366575241,
-0.011531081981956959,
-0.07647887617349625,
-0.02597833052277565,
0.13372492790222168,
-0.0061443643644452095,
0.0012886060867458582,
-0.060973331332206726,
0.13901866972446442,
-0.05249802768230438,
-0.21539801359176636,
0.06114204227924347,
-0.08400261402130127,
-0.13779664039611816,
-0.03524930402636528,
0.004324160981923342,
-0.01877412386238575,
0.015222083777189255,
0.07058162242174149,
-0.05617117881774902,
0.1822892725467682,
0.03271899372339249,
-0.06301932781934738,
-0.08896319568157196,
0.050009239464998245,
-0.13676290214061737,
0.28795701265335083,
0.026645557954907417,
0.034253478050231934,
0.10099101811647415,
-0.02537885122001171,
-0.1473681926727295,
0.018547220155596733,
0.11292359232902527,
-0.07360169291496277,
0.052456751465797424,
0.16655004024505615,
-0.008003112860023975,
0.13382694125175476,
0.053850140422582626,
-0.0640820637345314,
0.03422285616397858,
-0.058017946779727936,
-0.05961751565337181,
-0.12273355573415756,
0.06998232752084732,
-0.06957422941923141,
0.15258458256721497,
0.1265372633934021,
-0.06430234760046005,
-0.01140555553138256,
-0.017742551863193512,
0.07563493400812149,
0.013451684266328812,
0.1336747705936432,
0.021661264821887016,
-0.1818597912788391,
0.04453620687127113,
-0.0050002168864011765,
0.114112988114357,
-0.19861380755901337,
-0.06124785542488098,
0.03626563400030136,
-0.027486030012369156,
-0.08385716378688812,
0.11399971693754196,
0.04718085378408432,
0.0199828390032053,
-0.03075578063726425,
-0.07687613368034363,
0.0018916029948741198,
0.15286992490291595,
-0.10229536145925522,
-0.00791933760046959
] |
null | null | transformers |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# distilroberta-base-finetuned-wikitext2
This model is a fine-tuned version of [distilroberta-base](https://huggingface.co/distilroberta-base) on an unknown dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5
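
These values map directly onto `transformers.TrainingArguments`; the following is a minimal sketch of an equivalent setup, not the exact training script. `model`, `train_ds`, and `eval_ds` are illustrative placeholders, and the Adam betas/epsilon listed above are already the `TrainingArguments` defaults.

```python
from transformers import Trainer, TrainingArguments

# Mirrors the hyperparameters listed above; adam_beta1/adam_beta2 and
# adam_epsilon default to (0.9, 0.999) and 1e-8, matching the card.
args = TrainingArguments(
    output_dir="distilroberta-base-finetuned-wikitext2",
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=5,
)

trainer = Trainer(
    model=model,             # placeholder: the distilroberta-base model being fine-tuned
    args=args,
    train_dataset=train_ds,  # placeholder: tokenized training split
    eval_dataset=eval_ds,    # placeholder: tokenized evaluation split
)
trainer.train()
```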
### Framework versions
- Transformers 4.35.2
- Pytorch 2.1.0+cu121
- Datasets 2.17.0
- Tokenizers 0.15.1
| {"license": "apache-2.0", "tags": ["generated_from_trainer"], "base_model": "distilroberta-base", "model-index": [{"name": "distilroberta-base-finetuned-wikitext2", "results": []}]} | text-generation | Doniaa/tryMModel | [
"transformers",
"safetensors",
"roberta",
"text-generation",
"generated_from_trainer",
"base_model:distilroberta-base",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | 2024-02-13T10:01:45+00:00 | [] | [] | TAGS
#transformers #safetensors #roberta #text-generation #generated_from_trainer #base_model-distilroberta-base #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us
|
# distilroberta-base-finetuned-wikitext2
This model is a fine-tuned version of distilroberta-base on an unknown dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5
### Framework versions
- Transformers 4.35.2
- Pytorch 2.1.0+cu121
- Datasets 2.17.0
- Tokenizers 0.15.1
| [
"# distilroberta-base-finetuned-wikitext2\n\nThis model is a fine-tuned version of distilroberta-base on an unknown dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 2e-05\n- train_batch_size: 8\n- eval_batch_size: 8\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- num_epochs: 5",
"### Framework versions\n\n- Transformers 4.35.2\n- Pytorch 2.1.0+cu121\n- Datasets 2.17.0\n- Tokenizers 0.15.1"
] | [
"TAGS\n#transformers #safetensors #roberta #text-generation #generated_from_trainer #base_model-distilroberta-base #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n",
"# distilroberta-base-finetuned-wikitext2\n\nThis model is a fine-tuned version of distilroberta-base on an unknown dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 2e-05\n- train_batch_size: 8\n- eval_batch_size: 8\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- num_epochs: 5",
"### Framework versions\n\n- Transformers 4.35.2\n- Pytorch 2.1.0+cu121\n- Datasets 2.17.0\n- Tokenizers 0.15.1"
] | [
64,
39,
6,
12,
8,
3,
90,
33
] | [
"passage: TAGS\n#transformers #safetensors #roberta #text-generation #generated_from_trainer #base_model-distilroberta-base #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n# distilroberta-base-finetuned-wikitext2\n\nThis model is a fine-tuned version of distilroberta-base on an unknown dataset.## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 2e-05\n- train_batch_size: 8\n- eval_batch_size: 8\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- num_epochs: 5### Framework versions\n\n- Transformers 4.35.2\n- Pytorch 2.1.0+cu121\n- Datasets 2.17.0\n- Tokenizers 0.15.1"
] | [
-0.05755733698606491,
0.06291667371988297,
-0.0017274122219532728,
0.07049737125635147,
0.17588646709918976,
0.017893649637699127,
0.15770304203033447,
0.06305845081806183,
-0.1311996579170227,
0.04601528123021126,
0.06021011993288994,
0.08616229146718979,
0.020650194957852364,
0.14250946044921875,
-0.05562435835599899,
-0.2343900501728058,
0.02206701971590519,
-0.013178869150578976,
-0.08562831580638885,
0.09726450592279434,
0.10053586959838867,
-0.11248474568128586,
0.06790635734796524,
-0.009303913451731205,
-0.1995515376329422,
0.0321517139673233,
0.01176750659942627,
-0.04899389669299126,
0.11386629939079285,
0.0076617104932665825,
0.1323269009590149,
0.00923811923712492,
0.15090429782867432,
-0.20975126326084137,
0.007108967285603285,
0.09811246395111084,
0.03344572335481644,
0.06188860163092613,
0.036014020442962646,
0.015893109142780304,
0.1166745126247406,
-0.12963198125362396,
0.09050677716732025,
0.03018166311085224,
-0.07828782498836517,
-0.13349749147891998,
-0.07796435058116913,
0.048372332006692886,
0.10969548672437668,
0.09628979116678238,
0.003080459777265787,
0.1239042580127716,
-0.08962015062570572,
0.08376730233430862,
0.16586829721927643,
-0.276356041431427,
-0.08890634775161743,
0.07267820090055466,
0.0443861186504364,
0.10168208926916122,
-0.0855945274233818,
-0.0011810697615146637,
0.06894716620445251,
0.037918269634246826,
0.12973342835903168,
-0.030064139515161514,
-0.09861799329519272,
-0.026462046429514885,
-0.14540430903434753,
0.022678889334201813,
0.17076027393341064,
0.03976108506321907,
-0.053012002259492874,
-0.042600423097610474,
-0.07997609674930573,
-0.01787058264017105,
-0.033645644783973694,
-0.06675242632627487,
0.0466390959918499,
-0.02570193260908127,
-0.05229807645082474,
-0.058084215968847275,
-0.07271403819322586,
-0.057636965066194534,
-0.017662832513451576,
0.13112589716911316,
0.0345221646130085,
0.007027133833616972,
-0.04101867601275444,
0.07853330671787262,
-0.034703705459833145,
-0.10771674662828445,
0.019350862130522728,
-0.007536028511822224,
-0.03244051709771156,
-0.0748826190829277,
-0.05429389327764511,
-0.027403777465224266,
0.02660755254328251,
0.155097097158432,
-0.06961076706647873,
0.044911883771419525,
0.001901281182654202,
0.0022432708647102118,
-0.04842793568968773,
0.11312578618526459,
-0.03558267280459404,
-0.08277863264083862,
0.04051292687654495,
0.09066300839185715,
0.03521018102765083,
-0.009039608761668205,
-0.08968950062990189,
-0.007906854152679443,
0.0920201987028122,
0.04149626940488815,
-0.04791025072336197,
0.039805371314287186,
-0.015868818387389183,
-0.017991503700613976,
-0.015941863879561424,
-0.12337157875299454,
0.03508300334215164,
-0.017292426899075508,
-0.0538800023496151,
-0.0070933070965111256,
0.025629397481679916,
0.027303092181682587,
-0.01227851677685976,
0.11700360476970673,
-0.09153284877538681,
0.0068517387844622135,
-0.11231252551078796,
-0.09735357016324997,
-0.003814252093434334,
-0.06773018836975098,
-0.004309377633035183,
-0.09641087055206299,
-0.21046751737594604,
-0.03212543576955795,
0.04736797884106636,
-0.021837303414940834,
-0.04198337346315384,
-0.049809444695711136,
-0.07941306382417679,
0.0038912775926291943,
-0.0067772879265248775,
0.09555011242628098,
-0.04937077313661575,
0.06066739559173584,
0.017495742067694664,
0.032238852232694626,
-0.03367629647254944,
0.025747651234269142,
-0.09737686812877655,
0.01812298595905304,
-0.15285776555538177,
0.047816257923841476,
-0.07604620605707169,
0.06685464084148407,
-0.08994189649820328,
-0.08780612796545029,
-0.005089232232421637,
-0.001960460329428315,
0.055244479328393936,
0.10010039806365967,
-0.18193089962005615,
-0.03634022921323776,
0.1584029495716095,
-0.09763024002313614,
-0.08735259622335434,
0.09782031178474426,
-0.04522634670138359,
0.06574077159166336,
0.08231229335069656,
0.14845706522464752,
0.06869174540042877,
-0.12119739502668381,
0.006618368439376354,
-0.01078302413225174,
0.06530866771936417,
0.0033119090367108583,
0.03410082682967186,
0.004249215591698885,
-0.004925489891320467,
0.011678067035973072,
-0.06770963221788406,
0.005667591001838446,
-0.08976933360099792,
-0.08312340825796127,
-0.06567345559597015,
-0.10508738458156586,
0.01879037171602249,
0.029924945905804634,
0.04689619317650795,
-0.0839938372373581,
-0.08302026987075806,
0.14861877262592316,
0.11059780418872833,
-0.062435347586870193,
0.009521136991679668,
-0.05776030942797661,
0.034013208001852036,
-0.015377103351056576,
-0.01166564505547285,
-0.18683013319969177,
-0.10936374217271805,
0.016776828095316887,
-0.013223395682871342,
0.05147596448659897,
0.021378695964813232,
0.05037243291735649,
0.08949432522058487,
-0.05212562531232834,
-0.00788905005902052,
-0.06391847133636475,
0.018894068896770477,
-0.1019643098115921,
-0.2146281749010086,
-0.02087407559156418,
-0.025728711858391762,
0.16999876499176025,
-0.24382437765598297,
0.04022786766290665,
-0.08195208013057709,
0.11343450099229813,
0.01658625714480877,
-0.022880326956510544,
-0.05609266832470894,
0.07931157946586609,
-0.025114282965660095,
-0.0819220319390297,
0.044275298714637756,
0.0027390175964683294,
-0.05483565852046013,
-0.09526093304157257,
-0.16684792935848236,
0.1058451384305954,
0.0990031436085701,
-0.020677300170063972,
-0.07996837049722672,
0.014175104908645153,
-0.04700546711683273,
-0.036097098141908646,
-0.07647266238927841,
0.017987264320254326,
0.11407031863927841,
-0.02586905099451542,
0.13548514246940613,
-0.0523463636636734,
-0.022121861577033997,
-0.006583831273019314,
-0.03253498300909996,
0.015007374808192253,
0.05309741571545601,
0.09681933373212814,
-0.11811215430498123,
0.10585112124681473,
0.1436239331960678,
-0.10683158040046692,
0.12721367180347443,
-0.03610869497060776,
-0.05576346814632416,
-0.006992429494857788,
-0.020164599642157555,
-0.007830386981368065,
0.095238097012043,
-0.07874729484319687,
0.009227531962096691,
0.0041729179210960865,
0.010969879105687141,
0.027581630274653435,
-0.1771201193332672,
-0.006731441244482994,
0.01189370360225439,
-0.025154804810881615,
0.0005699012544937432,
-0.03807080537080765,
0.01569061167538166,
0.09461374580860138,
0.010923356749117374,
-0.036334145814180374,
0.013498139567673206,
0.0017841997323557734,
-0.08226901292800903,
0.19483377039432526,
-0.12428903579711914,
-0.09531360119581223,
-0.10260522365570068,
0.006003525108098984,
-0.06130356714129448,
-0.00405117915943265,
0.03521259129047394,
-0.09686180204153061,
-0.06511837244033813,
-0.08735103905200958,
0.004549181554466486,
0.0010633376659825444,
-0.010183141566812992,
0.07066556811332703,
0.01884155347943306,
0.08329924941062927,
-0.12557153403759003,
-0.004351102747023106,
-0.02425972744822502,
-0.12200324982404709,
0.006109918933361769,
0.06075552478432655,
0.10929308831691742,
0.14308874309062958,
-0.023753074929118156,
-0.007634779904037714,
-0.019147193059325218,
0.2445216029882431,
-0.046365439891815186,
0.007942038588225842,
0.14713548123836517,
0.008424704894423485,
0.04607225954532623,
0.14604508876800537,
0.044816017150878906,
-0.10525385290384293,
0.05178249999880791,
0.07174251973628998,
-0.012145644053816795,
-0.21952764689922333,
-0.05840907618403435,
-0.0263513270765543,
-0.07379718869924545,
0.07245373725891113,
0.030088569968938828,
0.0441165566444397,
0.05665818974375725,
-0.008046697825193405,
0.08128190040588379,
0.009573920629918575,
0.08654196560382843,
0.12505283951759338,
0.042727671563625336,
0.12888145446777344,
-0.03881122171878815,
-0.04019175469875336,
0.0489717498421669,
-0.037065062671899796,
0.25897011160850525,
0.010108153335750103,
0.03189484030008316,
0.05694981664419174,
0.12807029485702515,
-0.02008626237511635,
0.042766373604536057,
0.027106283232569695,
-0.017726469784975052,
-0.0003935463901143521,
-0.06784369051456451,
-0.02864917926490307,
0.0310688316822052,
-0.10766468197107315,
0.061539486050605774,
-0.07750843465328217,
0.07162906229496002,
0.04672112688422203,
0.252516508102417,
0.01492906454950571,
-0.29436007142066956,
-0.09250055998563766,
0.02866220660507679,
-0.025350326672196388,
-0.0356704518198967,
0.013113364577293396,
0.10836216062307358,
-0.10194747149944305,
0.047881320118904114,
-0.05029673874378204,
0.0990133285522461,
0.016646603122353554,
0.03470410779118538,
0.04204945266246796,
0.17447057366371155,
-0.00662460969761014,
0.06341741979122162,
-0.26047781109809875,
0.20458589494228363,
0.018996739760041237,
0.14936892688274384,
-0.03810853883624077,
0.02214890904724598,
0.02936849370598793,
0.14335249364376068,
0.05707023665308952,
-0.006658126600086689,
-0.04937244951725006,
-0.15393808484077454,
-0.03242035582661629,
0.05012304335832596,
0.12430145591497421,
0.0004458236799109727,
0.09254183620214462,
-0.06814222782850266,
0.010746191255748272,
0.07501523941755295,
-0.07137461006641388,
-0.22426755726337433,
-0.10629823058843613,
-0.013145084492862225,
0.02181399241089821,
-0.03158506006002426,
-0.08849131315946579,
-0.08953478187322617,
-0.050274576991796494,
0.17968839406967163,
0.0002807830460369587,
-0.01534348912537098,
-0.12434578686952591,
0.07955662161111832,
0.0806211456656456,
-0.060834892094135284,
0.04063628986477852,
0.007046940270811319,
0.09776278585195541,
0.021188823506236076,
-0.1010490208864212,
0.06373651325702667,
-0.09927348792552948,
-0.14811450242996216,
-0.05125432088971138,
0.0712805911898613,
0.05088639259338379,
0.035502899438142776,
0.01066090352833271,
0.023150965571403503,
0.010231762193143368,
-0.09032974392175674,
-0.02867472730576992,
0.0872214064002037,
0.06963890045881271,
0.05585014820098877,
-0.10748866945505142,
-0.06453291326761246,
-0.04911554604768753,
-0.025113530457019806,
0.09854435920715332,
0.22008126974105835,
-0.0769650936126709,
0.023251241073012352,
0.09414742887020111,
-0.09426763653755188,
-0.19937649369239807,
0.08329832553863525,
0.0757526308298111,
-0.0015872296644374728,
0.05095777288079262,
-0.16712048649787903,
0.17781543731689453,
0.12760646641254425,
-0.02447427064180374,
0.06622692197561264,
-0.3027687668800354,
-0.1394229680299759,
0.0812944620847702,
0.14086413383483887,
0.08565952628850937,
-0.15520721673965454,
-0.01871110312640667,
-0.06692991405725479,
-0.12301739305257797,
0.11997230350971222,
-0.16741779446601868,
0.08981291204690933,
0.009313153102993965,
0.06386071443557739,
0.004476733505725861,
-0.033962395042181015,
0.1310993880033493,
0.0028808016795665026,
0.10912195593118668,
-0.07082454860210419,
0.03898165002465248,
0.09602779895067215,
-0.05429964140057564,
0.015988942235708237,
-0.04428701475262642,
0.06175670400261879,
-0.043190017342567444,
-0.022798338904976845,
-0.06190616264939308,
0.0761895552277565,
-0.05575846508145332,
-0.07790474593639374,
-0.04009387269616127,
0.040732357650995255,
0.04133410379290581,
-0.03160759061574936,
0.07177186757326126,
0.015539189800620079,
0.16388310492038727,
0.07458606362342834,
0.1031251922249794,
-0.04789860546588898,
0.006719580385833979,
0.008945446461439133,
-0.028784973546862602,
0.06780438870191574,
-0.12966391444206238,
0.023125246167182922,
0.1087605208158493,
0.03222472593188286,
0.1382971554994583,
0.05556759238243103,
-0.02444327436387539,
0.0022756592370569706,
0.05605604127049446,
-0.13432589173316956,
-0.15178845822811127,
0.008920708671212196,
-0.0562080480158329,
-0.10421130061149597,
0.05259818211197853,
0.1190354973077774,
-0.08321387320756912,
0.0006015683757141232,
-0.028131216764450073,
0.018853897228837013,
-0.04727607220411301,
0.1783989816904068,
0.04380670189857483,
0.044813159853219986,
-0.07340896129608154,
0.11284451931715012,
0.06600325554609299,
-0.056220781058073044,
0.0320388525724411,
0.07672558724880219,
-0.09842527657747269,
-0.03038918599486351,
0.09361395239830017,
0.16937832534313202,
-0.0636344850063324,
-0.04549136757850647,
-0.09698486328125,
-0.11262429505586624,
0.019146421924233437,
0.16975508630275726,
0.05670344829559326,
-0.0323982834815979,
-0.03589467704296112,
0.05337606370449066,
-0.1510636806488037,
0.0860254317522049,
0.028198324143886566,
0.0865233764052391,
-0.16060376167297363,
0.14144235849380493,
0.015156839042901993,
0.008989516645669937,
-0.026981711387634277,
0.03809366375207901,
-0.11152853816747665,
-0.025715762749314308,
-0.152996227145195,
-0.035206541419029236,
-0.04475736618041992,
0.005412244703620672,
-0.004376590717583895,
-0.04386977106332779,
-0.06207802891731262,
0.04814683645963669,
-0.055251844227313995,
-0.0464831180870533,
0.017660336568951607,
0.046755410730838776,
-0.14350485801696777,
0.005238603800535202,
0.014592645689845085,
-0.0780719444155693,
0.049520593136548996,
0.047564294189214706,
0.018422316759824753,
0.06093904376029968,
-0.18906256556510925,
-0.002881348365917802,
0.0525023527443409,
0.024291494861245155,
0.06727628409862518,
-0.05032540112733841,
-0.02021525613963604,
-0.001419366104528308,
0.08682219684123993,
0.015286603011190891,
0.08534667640924454,
-0.10491768270730972,
0.0072796703316271305,
-0.05830592289566994,
-0.07007022202014923,
-0.05821133032441139,
0.023806007578969002,
0.11765957623720169,
0.0078041162341833115,
0.1953868418931961,
-0.10304515808820724,
0.021818792447447777,
-0.18651770055294037,
-0.026471227407455444,
-0.009497396647930145,
-0.05590355023741722,
-0.09772230684757233,
-0.04868115484714508,
0.061365265399217606,
-0.05720284581184387,
0.10690716654062271,
-0.002281845547258854,
0.09904689341783524,
0.044164080172777176,
-0.05484599992632866,
-0.024769969284534454,
0.021933747455477715,
0.214177668094635,
0.06973883509635925,
-0.013238406740128994,
0.038851525634527206,
0.011278376914560795,
0.07927580177783966,
0.028636161237955093,
0.1962329000234604,
0.16054768860340118,
-0.08299556374549866,
0.0671420693397522,
0.0556977204978466,
-0.08789630979299545,
-0.13533177971839905,
0.09057198464870453,
-0.040160294622182846,
0.09802202880382538,
-0.05182497948408127,
0.15335768461227417,
0.13336311280727386,
-0.1510782688856125,
0.01890360750257969,
-0.07246976345777512,
-0.09496369957923889,
-0.13777042925357819,
-0.01620313711464405,
-0.0945674255490303,
-0.1323234885931015,
0.01527775265276432,
-0.13436263799667358,
0.03257543966174126,
0.09755095839500427,
0.00008274372521555051,
0.0011475292267277837,
0.1609438955783844,
-0.012262089177966118,
0.017079081386327744,
0.03792533278465271,
-0.004022078588604927,
-0.027385439723730087,
-0.05375038832426071,
-0.07829372584819794,
0.020926810801029205,
0.005501332227140665,
0.05212552472949028,
-0.0433037243783474,
-0.040297798812389374,
0.046768978238105774,
-0.03357331454753876,
-0.06103884056210518,
0.04196295887231827,
0.03179096430540085,
0.015622376464307308,
0.04056374728679657,
0.03439268469810486,
-0.03096466138958931,
-0.01293323002755642,
0.2899188697338104,
-0.08321334421634674,
-0.14398755133152008,
-0.11842457205057144,
0.2426874041557312,
0.027581777423620224,
0.008278602734208107,
0.0341317355632782,
-0.09573651850223541,
0.00474507175385952,
0.1755436658859253,
0.1679319441318512,
-0.08661670982837677,
-0.013377012684941292,
-0.03234153985977173,
-0.023827264085412025,
-0.08114376664161682,
0.14568953216075897,
0.11716480553150177,
0.037270091474056244,
-0.04671796038746834,
-0.008613028563559055,
-0.020456768572330475,
-0.02262476459145546,
-0.11919427663087845,
0.020571541041135788,
0.016243869438767433,
0.004614737816154957,
-0.009771658107638359,
0.07591915130615234,
-0.00011705484212143347,
-0.13491381704807281,
0.0325557142496109,
-0.1254749298095703,
-0.15573860704898834,
-0.028613995760679245,
0.10587508231401443,
-0.0374743789434433,
0.038081396371126175,
-0.027124766260385513,
-0.011790321208536625,
0.10953544080257416,
-0.03230956196784973,
-0.015681730583310127,
-0.09507926553487778,
0.07278931140899658,
-0.08798885345458984,
0.2105405628681183,
-0.012749259360134602,
0.05465775728225708,
0.111289381980896,
0.024851273745298386,
-0.1016237735748291,
0.08138039708137512,
0.04655236750841141,
-0.07383116334676743,
0.0330040417611599,
0.11251016706228256,
-0.05990096926689148,
0.09690011292695999,
0.05117587372660637,
-0.1479054093360901,
0.004712938331067562,
-0.0035077508073300123,
-0.0601501539349556,
-0.05650131776928902,
-0.008789335377514362,
-0.08381925523281097,
0.14583365619182587,
0.19139191508293152,
-0.025921864435076714,
0.037487804889678955,
-0.08374205231666565,
0.03380664810538292,
0.05698056146502495,
0.11680392175912857,
-0.04673710837960243,
-0.23427262902259827,
0.020597372204065323,
0.028952043503522873,
0.009084634482860565,
-0.2472868263721466,
-0.07678589224815369,
0.0031779243145138025,
-0.034610241651535034,
-0.0882192999124527,
0.10116081684827805,
0.11690640449523926,
0.02858014404773712,
-0.04457087442278862,
-0.18711474537849426,
-0.0640290379524231,
0.1755242645740509,
-0.14162282645702362,
-0.06685449928045273
] |
null | null | peft |
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
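
The card leaves this section blank; as a minimal sketch, assuming this repository (`man4j/schedule_adapter_full`) holds a PEFT adapter for the base model named in the metadata:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base_id = "mistralai/Mixtral-8x7B-Instruct-v0.1"  # base model from this card's metadata
adapter_id = "man4j/schedule_adapter_full"        # this repo; assumed to contain the adapter

tokenizer = AutoTokenizer.from_pretrained(base_id)
base = AutoModelForCausalLM.from_pretrained(base_id, device_map="auto", torch_dtype="auto")
model = PeftModel.from_pretrained(base, adapter_id)  # attach the adapter weights
model.eval()
```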
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
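
Schematically, the calculator combines these inputs as: estimated emissions ≈ hardware power draw (kW) × hours used × carbon intensity of the compute region (kg CO2eq/kWh). For example, a 0.3 kW accelerator running for 100 hours in a region at 0.4 kg CO2eq/kWh works out to roughly 12 kg CO2eq; these figures are illustrative, not measurements for this model.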
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed]
### Framework versions
- PEFT 0.8.2 | {"library_name": "peft", "base_model": "mistralai/Mixtral-8x7B-Instruct-v0.1"} | null | man4j/schedule_adapter_full | [
"peft",
"safetensors",
"arxiv:1910.09700",
"base_model:mistralai/Mixtral-8x7B-Instruct-v0.1",
"region:us"
] | 2024-02-13T10:02:34+00:00 | [
"1910.09700"
] | [] | TAGS
#peft #safetensors #arxiv-1910.09700 #base_model-mistralai/Mixtral-8x7B-Instruct-v0.1 #region-us
|
# Model Card for Model ID
## Model Details
### Model Description
- Developed by:
- Funded by [optional]:
- Shared by [optional]:
- Model type:
- Language(s) (NLP):
- License:
- Finetuned from model [optional]:
### Model Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Downstream Use [optional]
### Out-of-Scope Use
## Bias, Risks, and Limitations
### Recommendations
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
## Training Details
### Training Data
### Training Procedure
#### Preprocessing [optional]
#### Training Hyperparameters
- Training regime:
#### Speeds, Sizes, Times [optional]
## Evaluation
### Testing Data, Factors & Metrics
#### Testing Data
#### Factors
#### Metrics
### Results
#### Summary
## Model Examination [optional]
## Environmental Impact
Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).
- Hardware Type:
- Hours used:
- Cloud Provider:
- Compute Region:
- Carbon Emitted:
## Technical Specifications [optional]
### Model Architecture and Objective
### Compute Infrastructure
#### Hardware
#### Software
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Model Card Authors [optional]
## Model Card Contact
### Framework versions
- PEFT 0.8.2 | [
"# Model Card for Model ID",
"## Model Details",
"### Model Description\n\n\n\n\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:",
"### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Downstream Use [optional]",
"### Out-of-Scope Use",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.",
"## How to Get Started with the Model\n\nUse the code below to get started with the model.",
"## Training Details",
"### Training Data",
"### Training Procedure",
"#### Preprocessing [optional]",
"#### Training Hyperparameters\n\n- Training regime:",
"#### Speeds, Sizes, Times [optional]",
"## Evaluation",
"### Testing Data, Factors & Metrics",
"#### Testing Data",
"#### Factors",
"#### Metrics",
"### Results",
"#### Summary",
"## Model Examination [optional]",
"## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:",
"## Technical Specifications [optional]",
"### Model Architecture and Objective",
"### Compute Infrastructure",
"#### Hardware",
"#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Model Card Authors [optional]",
"## Model Card Contact",
"### Framework versions\n\n- PEFT 0.8.2"
] | [
"TAGS\n#peft #safetensors #arxiv-1910.09700 #base_model-mistralai/Mixtral-8x7B-Instruct-v0.1 #region-us \n",
"# Model Card for Model ID",
"## Model Details",
"### Model Description\n\n\n\n\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:",
"### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Downstream Use [optional]",
"### Out-of-Scope Use",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.",
"## How to Get Started with the Model\n\nUse the code below to get started with the model.",
"## Training Details",
"### Training Data",
"### Training Procedure",
"#### Preprocessing [optional]",
"#### Training Hyperparameters\n\n- Training regime:",
"#### Speeds, Sizes, Times [optional]",
"## Evaluation",
"### Testing Data, Factors & Metrics",
"#### Testing Data",
"#### Factors",
"#### Metrics",
"### Results",
"#### Summary",
"## Model Examination [optional]",
"## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:",
"## Technical Specifications [optional]",
"### Model Architecture and Objective",
"### Compute Infrastructure",
"#### Hardware",
"#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Model Card Authors [optional]",
"## Model Card Contact",
"### Framework versions\n\n- PEFT 0.8.2"
] | [
45,
6,
3,
54,
28,
3,
4,
9,
9,
10,
42,
20,
3,
4,
5,
9,
11,
13,
3,
12,
5,
4,
5,
3,
4,
9,
53,
9,
8,
6,
3,
14,
8,
7,
9,
4,
11
] | [
"passage: TAGS\n#peft #safetensors #arxiv-1910.09700 #base_model-mistralai/Mixtral-8x7B-Instruct-v0.1 #region-us \n# Model Card for Model ID## Model Details### Model Description\n\n\n\n\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Downstream Use [optional]### Out-of-Scope Use## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.## How to Get Started with the Model\n\nUse the code below to get started with the model.## Training Details### Training Data### Training Procedure#### Preprocessing [optional]#### Training Hyperparameters\n\n- Training regime:#### Speeds, Sizes, Times [optional]## Evaluation### Testing Data, Factors & Metrics#### Testing Data#### Factors#### Metrics### Results#### Summary## Model Examination [optional]## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:## Technical Specifications [optional]### Model Architecture and Objective### Compute Infrastructure#### Hardware#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Model Card Authors [optional]## Model Card Contact### Framework versions\n\n- PEFT 0.8.2"
] | [
-0.11166330426931381,
0.20509707927703857,
-0.0033231477718800306,
0.02670646272599697,
0.0721336156129837,
0.01440113503485918,
0.06981661915779114,
0.1324472576379776,
0.028830822557210922,
0.1290389448404312,
0.0648079365491867,
0.11770330369472504,
0.11638239026069641,
0.21900972723960876,
-0.0036957194097340107,
-0.16946078836917877,
0.02037356235086918,
-0.05711452662944794,
0.037952274084091187,
0.12752673029899597,
0.1354571282863617,
-0.09072265774011612,
0.07362384349107742,
-0.02466413751244545,
-0.007538862992078066,
-0.027965065091848373,
-0.06436469405889511,
-0.011902961879968643,
0.05073476582765579,
0.037214867770671844,
0.05864691361784935,
-0.0088164322078228,
0.0801631286740303,
-0.26900163292884827,
0.015581502579152584,
0.0500354990363121,
-0.0132666090503335,
0.08303997665643692,
0.1026032343506813,
-0.04947930946946144,
0.1267668604850769,
-0.028907323256134987,
0.13223110139369965,
0.08255668729543686,
-0.11067952960729599,
-0.22358666360378265,
-0.06518556922674179,
0.08340518921613693,
0.18160270154476166,
0.06752914935350418,
-0.04131465032696724,
0.11849096417427063,
-0.05605986714363098,
0.0285615436732769,
0.082554392516613,
-0.11574912071228027,
-0.06541609019041061,
0.06995807588100433,
0.1368875801563263,
0.0859280526638031,
-0.1182481050491333,
-0.03850546106696129,
0.03349372372031212,
0.04787788167595863,
0.0693620815873146,
0.0069637605920434,
0.15715058147907257,
0.02904781699180603,
-0.14012712240219116,
-0.05291072279214859,
0.10426473617553711,
0.004123837687075138,
-0.0424688495695591,
-0.21668238937854767,
-0.0120907137170434,
-0.09746074676513672,
-0.03937709331512451,
-0.04808767884969711,
0.034633416682481766,
0.013793008401989937,
0.12047848850488663,
-0.051246386021375656,
-0.08069919794797897,
-0.014758959412574768,
0.11872085928916931,
0.06343500316143036,
0.01053662970662117,
-0.021266069263219833,
0.0022145153488963842,
0.12097897380590439,
0.06366746872663498,
-0.12812542915344238,
-0.06373108923435211,
-0.05845389887690544,
-0.0262139979749918,
-0.020789992064237595,
0.04954610392451286,
0.02505623921751976,
0.041352368891239166,
0.27777883410453796,
-0.025617342442274094,
0.06330657750368118,
0.03365691751241684,
0.018453136086463928,
0.02072696015238762,
0.10739724338054657,
-0.028229515999555588,
-0.19324511289596558,
-0.009276064112782478,
0.10443978756666183,
0.010422861203551292,
-0.031572725623846054,
-0.055636804550886154,
0.02226903848350048,
0.03740885853767395,
0.1252785623073578,
0.10548059642314911,
-0.029080217704176903,
-0.06828282028436661,
-0.055659107863903046,
0.20272165536880493,
-0.15297724306583405,
0.058102600276470184,
0.0312640443444252,
0.0012538626324385405,
-0.06887955218553543,
0.01576593518257141,
0.006739018019288778,
-0.0408230684697628,
0.09014762938022614,
-0.06375429779291153,
-0.0401591956615448,
-0.12028152495622635,
-0.046620503067970276,
0.03773490712046623,
-0.0212379340082407,
-0.05187296122312546,
-0.03473834693431854,
-0.08006973564624786,
-0.10711801052093506,
0.10159563273191452,
-0.054858479648828506,
-0.04623228684067726,
-0.029073171317577362,
-0.07103325426578522,
0.02448420040309429,
0.02929825708270073,
0.060687314718961716,
-0.028546743094921112,
0.04735744744539261,
-0.01890261471271515,
0.07320936024188995,
0.07898474484682083,
0.03669197857379913,
-0.0795140415430069,
0.0666833147406578,
-0.17665353417396545,
0.07909466326236725,
-0.05987977981567383,
0.031177056953310966,
-0.16438622772693634,
0.004525814671069384,
0.0013136515626683831,
0.03303147479891777,
0.05271271616220474,
0.15712793171405792,
-0.20410780608654022,
-0.034817468374967575,
0.18653540313243866,
-0.10162936896085739,
-0.12230569124221802,
0.03465908765792847,
-0.0399034358561039,
0.17828771471977234,
0.04166671261191368,
0.020315758883953094,
0.08869317173957825,
-0.153215691447258,
-0.020568665117025375,
-0.02985585667192936,
0.016156645491719246,
0.05328887701034546,
0.0749252587556839,
-0.08246476203203201,
0.0037532304413616657,
0.005757969804108143,
-0.051963333040475845,
-0.020279431715607643,
-0.037390682846307755,
-0.0952802374958992,
0.00902447011321783,
-0.0785297378897667,
0.003007701598107815,
0.004587811417877674,
-0.09589298069477081,
-0.009361395612359047,
-0.14459697902202606,
-0.024548599496483803,
0.07398907095193863,
0.0033467477187514305,
-0.0062054237350821495,
-0.0840120017528534,
0.04842614382505417,
-0.0549992173910141,
-0.011467142961919308,
-0.1527877300977707,
0.003920058254152536,
0.021322594955563545,
-0.13757358491420746,
0.010791716165840626,
-0.14597731828689575,
0.07221092283725739,
0.012608293443918228,
-0.05457204580307007,
-0.04034373164176941,
0.014677162282168865,
-0.015796886757016182,
-0.07429452240467072,
-0.22484152019023895,
-0.03387301042675972,
-0.05676783621311188,
0.12951238453388214,
-0.22788982093334198,
0.047676488757133484,
0.0008462080149911344,
0.10812564194202423,
0.01358463428914547,
-0.06432308256626129,
0.024316376075148582,
-0.05512988939881325,
-0.028903810307383537,
-0.0712110698223114,
-0.002782947849482298,
0.0032723350450396538,
-0.027405301108956337,
0.02432456612586975,
-0.1412428617477417,
-0.06550132483243942,
0.08897653222084045,
0.07976904511451721,
-0.14601227641105652,
0.006544655654579401,
-0.035382404923439026,
-0.05774768069386482,
-0.06926282495260239,
-0.07145460695028305,
0.06820356845855713,
0.0507492795586586,
0.05507652461528778,
-0.0921960398554802,
-0.07265115529298782,
-0.00022185911075212061,
-0.017731428146362305,
-0.021630696952342987,
0.12863275408744812,
0.07938657701015472,
-0.0967739000916481,
0.09643683582544327,
0.07334139198064804,
0.029162142425775528,
0.09942179173231125,
-0.006189142819494009,
-0.10333321988582611,
-0.0323331393301487,
0.061176080256700516,
0.02202449180185795,
0.149826318025589,
-0.06507730484008789,
0.04255092516541481,
0.044255781918764114,
-0.04944013059139252,
0.04018179327249527,
-0.09618253260850906,
0.012253649532794952,
0.008919699117541313,
-0.01841765083372593,
0.028289828449487686,
-0.02604750543832779,
0.004083717241883278,
0.09261870384216309,
0.06582789868116379,
0.026572220027446747,
0.011559187434613705,
-0.038181811571121216,
-0.14245527982711792,
0.18022096157073975,
-0.08636847138404846,
-0.2264573872089386,
-0.1570708006620407,
0.02772059664130211,
0.06175167113542557,
-0.010242952965199947,
0.03813089430332184,
-0.044610027223825455,
-0.08540985733270645,
-0.09058350324630737,
0.021640483289957047,
0.04559521749615669,
-0.06051452085375786,
-0.07110442221164703,
0.03483714163303375,
0.025602947920560837,
-0.1318269819021225,
0.024511557072401047,
0.050059232860803604,
-0.0008806852274574339,
-0.007982462644577026,
0.030581658706068993,
0.08573268353939056,
0.2095610797405243,
-0.0011059334501624107,
0.0016475646989420056,
0.05603134632110596,
0.2789561152458191,
-0.1537395417690277,
0.12177807092666626,
0.1266861855983734,
-0.06267382204532623,
0.08250944316387177,
0.19280177354812622,
0.033107466995716095,
-0.08670145273208618,
0.013368132524192333,
0.034535206854343414,
-0.03631370887160301,
-0.26806116104125977,
-0.03918072208762169,
-0.025767333805561066,
-0.06695462018251419,
0.09110713005065918,
0.0822334736585617,
0.0973796620965004,
0.031145665794610977,
-0.074386365711689,
-0.075241819024086,
0.04526735097169876,
0.12028519809246063,
-0.054307494312524796,
0.014810320921242237,
0.08480410277843475,
-0.05065874755382538,
0.008095936849713326,
0.08659816533327103,
-0.007655641529709101,
0.1305343061685562,
0.05927086994051933,
0.12087471038103104,
0.07645414024591446,
0.061710651963949203,
0.0024476933758705854,
0.04993705451488495,
-0.015308795496821404,
0.02851467952132225,
0.015338247641921043,
-0.09492670744657516,
0.017916060984134674,
0.11240890622138977,
-0.0006568920216523111,
0.029645591974258423,
0.017176909372210503,
-0.08312083780765533,
0.03544115647673607,
0.203003391623497,
0.03396875411272049,
-0.21467509865760803,
-0.08064685761928558,
0.05756235122680664,
-0.07199268043041229,
-0.1566236913204193,
-0.013925652019679546,
0.01366207655519247,
-0.1509673148393631,
0.014531517401337624,
-0.044804707169532776,
0.11285463720560074,
-0.06882506608963013,
-0.04451088607311249,
0.09908490628004074,
0.04649810865521431,
-0.04520895704627037,
0.03668010234832764,
-0.18456342816352844,
0.10883324593305588,
0.03265436366200447,
0.07441603392362595,
-0.08295898139476776,
0.0845610648393631,
-0.003081863047555089,
-0.01308427844196558,
0.15556883811950684,
-0.004696136340498924,
-0.06872716546058655,
-0.09006325900554657,
-0.07025061547756195,
-0.017569346353411674,
0.08264625817537308,
-0.13687968254089355,
0.07910913974046707,
-0.023657599464058876,
-0.03461776301264763,
-0.006063689943403006,
-0.10183300077915192,
-0.10068420320749283,
-0.16540798544883728,
0.05504715442657471,
-0.07707192748785019,
0.01805059239268303,
-0.07433118671178818,
-0.048582080751657486,
0.05241271108388901,
0.16825318336486816,
-0.20981496572494507,
-0.11376700550317764,
-0.14254477620124817,
-0.1046508327126503,
0.15093111991882324,
-0.051787152886390686,
0.08818354457616806,
-0.014932326972484589,
0.15637946128845215,
-0.014913041144609451,
-0.023366790264844894,
0.08306203037500381,
-0.09069420397281647,
-0.18480487167835236,
-0.05259774252772331,
0.19044405221939087,
0.1301640123128891,
0.02691480703651905,
-0.012986465357244015,
0.027836337685585022,
-0.05487077683210373,
-0.10264495760202408,
0.02454826608300209,
0.13148340582847595,
0.06640268117189407,
-0.01259929221123457,
-0.04068192467093468,
-0.11214029788970947,
-0.060452722012996674,
-0.035928238183259964,
-0.014029801823198795,
0.20606954395771027,
-0.06927391141653061,
0.15720155835151672,
0.13590209186077118,
-0.06756436824798584,
-0.2036600112915039,
0.03499874100089073,
0.03182447329163551,
0.017930839210748672,
0.024082046002149582,
-0.19639916718006134,
0.07405146956443787,
-0.02150672860443592,
-0.07467077672481537,
0.1789371222257614,
-0.20603999495506287,
-0.12886619567871094,
0.09587264806032181,
0.0192502923309803,
-0.19783897697925568,
-0.15329484641551971,
-0.11278848350048065,
-0.017088163644075394,
-0.1209569126367569,
0.059731777757406235,
0.017304545268416405,
0.017043709754943848,
0.008588355965912342,
0.011434673331677914,
0.04348500818014145,
-0.04466533288359642,
0.19158902764320374,
-0.037938203662633896,
0.004061787389218807,
-0.05528546869754791,
-0.10194719582796097,
0.004680037032812834,
-0.06555310636758804,
0.11734730750322342,
-0.02901756390929222,
0.02578197419643402,
-0.15906190872192383,
-0.046070992946624756,
-0.06399116665124893,
0.022924257442355156,
-0.09457999467849731,
-0.08070390671491623,
-0.045959945768117905,
0.07646775990724564,
0.09082925319671631,
-0.016879785805940628,
0.03063819743692875,
-0.09313318133354187,
0.10220625251531601,
0.2036600261926651,
0.17042630910873413,
0.0523909293115139,
-0.048259295523166656,
0.026148542761802673,
-0.036430373787879944,
0.04696285352110863,
-0.22457227110862732,
0.036623843014240265,
0.05990183725953102,
0.034734610468149185,
0.08271022140979767,
-0.000820218469016254,
-0.16162452101707458,
-0.08172181993722916,
0.08636876940727234,
-0.0569140650331974,
-0.15749938786029816,
-0.02341824397444725,
0.03656458482146263,
-0.207781121134758,
-0.04331054538488388,
0.043484050780534744,
-0.017495112493634224,
-0.04186376929283142,
0.024120425805449486,
0.08257562667131424,
-0.019885050132870674,
0.09454873949289322,
0.08260365575551987,
0.08945350348949432,
-0.09389611333608627,
0.05459190532565117,
0.08485744148492813,
-0.018617792055010796,
0.01914980076253414,
0.14742110669612885,
-0.03676264360547066,
-0.036764420568943024,
0.08064738661050797,
0.11689505726099014,
-0.004289600998163223,
-0.04370661824941635,
0.015796435996890068,
-0.05171751230955124,
0.07595200091600418,
0.13992278277873993,
0.017587020993232727,
-0.009553513489663601,
0.07028737664222717,
0.02903277613222599,
-0.09225096553564072,
0.12392615526914597,
0.05293682590126991,
0.02374996431171894,
-0.016641244292259216,
-0.021211594343185425,
-0.013807263225317001,
-0.010407522320747375,
-0.011634978465735912,
-0.003480554325506091,
-0.09855695068836212,
0.0010899297194555402,
-0.11433646082878113,
0.024433929473161697,
-0.07757493853569031,
0.0008218908915296197,
0.013917168602347374,
-0.040228184312582016,
-0.004629031755030155,
-0.009808741509914398,
-0.07527894526720047,
-0.055786557495594025,
-0.03329669311642647,
0.07748576253652573,
-0.14448437094688416,
0.031110361218452454,
0.07093042880296707,
-0.10996546596288681,
0.06290020048618317,
-0.009586277417838573,
0.016450246796011925,
0.0007712004007771611,
-0.1522747427225113,
0.057865798473358154,
-0.025229573249816895,
-0.01656339131295681,
0.006286178715527058,
-0.16610750555992126,
-0.0015158246969804168,
-0.05036249756813049,
-0.07723090052604675,
0.010180831886827946,
-0.011144115589559078,
-0.12630094587802887,
0.12195052951574326,
-0.0013362449826672673,
-0.06491982191801071,
-0.01423726137727499,
0.06232818216085434,
0.07105667144060135,
-0.01915879361331463,
0.09330692887306213,
-0.02619985118508339,
0.08270486444234848,
-0.18474635481834412,
-0.006736153736710548,
-0.015146181918680668,
0.03437120094895363,
-0.02818780019879341,
-0.040035512298345566,
0.05121663212776184,
-0.012617827393114567,
0.1497207134962082,
-0.0013910223497077823,
0.07047301530838013,
0.0445491224527359,
0.00551946833729744,
0.03574914485216141,
0.0691220685839653,
0.05975884571671486,
-0.021295152604579926,
-0.013102144934237003,
0.031134944409132004,
0.002761667827144265,
-0.04361557215452194,
-0.12302900850772858,
0.06342794746160507,
0.18746478855609894,
0.0793762356042862,
0.034884434193372726,
-0.0007829178939573467,
-0.12844935059547424,
-0.08534754812717438,
0.08495340496301651,
-0.011109262704849243,
-0.030032392591238022,
-0.06514054536819458,
0.23229186236858368,
0.14553618431091309,
-0.19135145843029022,
0.07998742163181305,
-0.03848966211080551,
-0.030879085883498192,
-0.1389072984457016,
-0.16332456469535828,
-0.055162135511636734,
-0.02839273028075695,
-0.03636994585394859,
-0.0647556260228157,
0.061717331409454346,
0.03505990654230118,
-0.0008671171963214874,
-0.007139184512197971,
0.1022312119603157,
0.020206036046147346,
-0.03845922276377678,
0.05089803412556648,
0.06703292578458786,
0.0503745973110199,
-0.09356891363859177,
0.011249984614551067,
0.004637192469090223,
0.003823795123025775,
0.0653163492679596,
0.03135764226317406,
-0.05729525536298752,
0.028653517365455627,
-0.01634574867784977,
-0.12136867642402649,
0.047787535935640335,
-0.007056705188006163,
-0.013424142263829708,
0.15174071490764618,
0.033216577023267746,
0.0022304432932287455,
-0.008063813671469688,
0.23331664502620697,
-0.06710677593946457,
-0.07700929045677185,
-0.12463169544935226,
0.08008050918579102,
-0.05321254953742027,
0.028127998113632202,
0.009365432895720005,
-0.12377535551786423,
0.012522047385573387,
0.1727612465620041,
0.11888722330331802,
-0.0036224955692887306,
0.009012805297970772,
0.04554319754242897,
0.01104091014713049,
-0.016152532771229744,
0.017792684957385063,
0.04387819394469261,
0.22020068764686584,
-0.0733967199921608,
0.07472807168960571,
-0.010444166138768196,
-0.06862218677997589,
-0.02134905382990837,
0.12480595707893372,
-0.014355064369738102,
-0.009997096844017506,
-0.05856077000498772,
0.13643240928649902,
-0.06472945213317871,
-0.21727880835533142,
0.06010531634092331,
-0.09216073155403137,
-0.13260914385318756,
-0.04330691322684288,
0.011016927659511566,
-0.02855832315981388,
0.013806391507387161,
0.06769496202468872,
-0.058736540377140045,
0.1615155190229416,
0.027249159291386604,
-0.05561522766947746,
-0.10632453113794327,
0.050351668149232864,
-0.14485137164592743,
0.2886793613433838,
0.020861772820353508,
0.02977316826581955,
0.10983941704034805,
-0.01932557299733162,
-0.1387106329202652,
0.012825645506381989,
0.105259470641613,
-0.05405554920434952,
0.057556916028261185,
0.15566983819007874,
-0.006471342407166958,
0.12092328071594238,
0.05748960003256798,
-0.06073775514960289,
0.03436625376343727,
-0.05974448099732399,
-0.06053941324353218,
-0.12207556515932083,
0.06671395152807236,
-0.0813545286655426,
0.1478293538093567,
0.12660232186317444,
-0.07281895726919174,
-0.008089852519333363,
-0.017476042732596397,
0.07881394773721695,
0.022320006042718887,
0.12238386273384094,
0.01385805569589138,
-0.18090462684631348,
0.04726128280162811,
0.00904572568833828,
0.11121045798063278,
-0.22690168023109436,
-0.05516495183110237,
0.0460582971572876,
-0.020485080778598785,
-0.09627743810415268,
0.1187213882803917,
0.044769398868083954,
0.01799786649644375,
-0.0303325317800045,
-0.0926905944943428,
0.022043555974960327,
0.15577276051044464,
-0.09885565936565399,
-0.018312275409698486
] |
null | null | transformers |
# Model Trained Using AutoTrain
This model was trained using AutoTrain. For more information, please visit [AutoTrain](https://hf.co/docs/autotrain).
# Usage
```python
from transformers import AutoModelForCausalLM, AutoTokenizer
model_path = "PATH_TO_THIS_REPO"
tokenizer = AutoTokenizer.from_pretrained(model_path)
model = AutoModelForCausalLM.from_pretrained(
model_path,
device_map="auto",
torch_dtype='auto'
).eval()
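# device_map="auto" lets accelerate place the weights on the available GPUs/CPU,
# and torch_dtype='auto' reuses the dtype saved in the checkpoint.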
# Prompt content: "hi"
messages = [
{"role": "user", "content": "hi"}
]
input_ids = tokenizer.apply_chat_template(conversation=messages, tokenize=True, add_generation_prompt=True, return_tensors='pt')
output_ids = model.generate(input_ids.to(model.device))  # use the model's device rather than assuming CUDA
response = tokenizer.decode(output_ids[0][input_ids.shape[1]:], skip_special_tokens=True)
# Model response: "Hello! How can I assist you today?"
print(response)
``` | {"license": "other", "tags": ["autotrain", "text-generation"], "widget": [{"text": "I love AutoTrain because "}]} | text-generation | maramzarkaoui/llama2 | [
"transformers",
"safetensors",
"autotrain",
"text-generation",
"license:other",
"endpoints_compatible",
"region:us"
] | 2024-02-13T10:03:24+00:00 | [] | [] | TAGS
#transformers #safetensors #autotrain #text-generation #license-other #endpoints_compatible #region-us
|
# Model Trained Using AutoTrain
This model was trained using AutoTrain. For more information, please visit AutoTrain.
# Usage
| [
"# Model Trained Using AutoTrain\n\nThis model was trained using AutoTrain. For more information, please visit AutoTrain.",
"# Usage"
] | [
"TAGS\n#transformers #safetensors #autotrain #text-generation #license-other #endpoints_compatible #region-us \n",
"# Model Trained Using AutoTrain\n\nThis model was trained using AutoTrain. For more information, please visit AutoTrain.",
"# Usage"
] | [
36,
29,
3
] | [
"passage: TAGS\n#transformers #safetensors #autotrain #text-generation #license-other #endpoints_compatible #region-us \n# Model Trained Using AutoTrain\n\nThis model was trained using AutoTrain. For more information, please visit AutoTrain.# Usage"
] | [
-0.023006148636341095,
0.05046960711479187,
-0.0010037448955699801,
0.03580224886536598,
0.1440940499305725,
-0.021481232717633247,
0.24606527388095856,
0.041373081505298615,
-0.07430633157491684,
-0.10234048962593079,
0.18846328556537628,
0.17833441495895386,
-0.04865241423249245,
0.20599697530269623,
-0.022239185869693756,
-0.2639702558517456,
0.036890044808387756,
-0.014957736246287823,
0.07165995985269547,
0.12281326204538345,
0.14157381653785706,
-0.08094550669193268,
0.06314131617546082,
0.0530468188226223,
-0.21894866228103638,
0.030067868530750275,
0.08150321990251541,
-0.12674033641815186,
0.18498511612415314,
0.05292187258601189,
0.1383083462715149,
0.0383191853761673,
0.14889580011367798,
-0.11574234068393707,
0.013409752398729324,
0.015240980312228203,
-0.01779787428677082,
0.07112516462802887,
0.06934591382741928,
-0.0304710790514946,
0.08784668892621994,
0.17253881692886353,
0.1106535941362381,
0.043535955250263214,
-0.10264319181442261,
-0.03034444898366928,
-0.028859339654445648,
0.0274159274995327,
0.10759268701076508,
0.12047161161899567,
-0.014398117549717426,
0.1926349699497223,
-0.1453714221715927,
0.07423928380012512,
-0.09448342770338058,
-0.2600191533565521,
-0.009287437424063683,
0.18603195250034332,
0.06799202412366867,
-0.01955469511449337,
-0.12317368388175964,
0.07887494564056396,
0.11429125815629959,
-0.00463357986882329,
0.08441097289323807,
-0.019625037908554077,
-0.0407516211271286,
-0.005363111849874258,
-0.07860609143972397,
-0.003415796672925353,
0.1773252785205841,
-0.07490051537752151,
-0.03383995220065117,
-0.11752590537071228,
-0.030355878174304962,
0.040952179580926895,
0.009735563769936562,
-0.11376053839921951,
-0.01594076305627823,
0.10556721687316895,
-0.04791182279586792,
-0.036078523844480515,
-0.13472582399845123,
-0.06205762177705765,
-0.10021799057722092,
0.05704645812511444,
-0.00009224791574524716,
0.0034954091534018517,
-0.08960220962762833,
0.11469295620918274,
-0.006156537216156721,
-0.09380223602056503,
0.05474410951137543,
-0.10856923460960388,
0.028691910207271576,
-0.11759942024946213,
-0.037882935255765915,
-0.10411670804023743,
0.004172018729150295,
0.23073101043701172,
0.1783810406923294,
-0.02269141934812069,
-0.07773572951555252,
0.03042212501168251,
0.010559555143117905,
0.12997497618198395,
0.04471700266003609,
-0.028003722429275513,
0.0635356605052948,
-0.045596327632665634,
-0.03023592010140419,
-0.03175151348114014,
-0.17976830899715424,
0.029883908107876778,
0.029726488515734673,
0.06127764657139778,
-0.07074327766895294,
0.0861908346414566,
-0.01250474527478218,
0.03990834206342697,
0.05091428756713867,
-0.03020680882036686,
0.03630146011710167,
-0.05661218613386154,
0.005226188339293003,
-0.061964910477399826,
0.0429050587117672,
0.10641790926456451,
0.0322391651570797,
0.1186658963561058,
-0.09188699722290039,
-0.03202752396464348,
-0.11321412026882172,
-0.06591932475566864,
0.003149614203721285,
0.004794694483280182,
0.0618300586938858,
-0.19819198548793793,
-0.2962443232536316,
-0.01906852424144745,
0.06040035933256149,
-0.008637036196887493,
-0.0671197697520256,
-0.07381688058376312,
0.011710653081536293,
0.058137163519859314,
-0.03132254630327225,
0.039640430361032486,
-0.01131268497556448,
0.03946414962410927,
-0.06542262434959412,
-0.0069712260738015175,
-0.051937978714704514,
0.019929492846131325,
-0.15022505819797516,
-0.032731760293245316,
-0.045004718005657196,
0.020713184028863907,
-0.042506176978349686,
0.15997955203056335,
-0.02587318792939186,
0.032208655029535294,
-0.02783725969493389,
0.054800912737846375,
0.0011260026367381215,
0.14031249284744263,
-0.12461306154727936,
-0.021506303921341896,
0.1475609391927719,
-0.12052030116319656,
-0.10906532406806946,
0.09393379092216492,
-0.10835801064968109,
0.24426677823066711,
0.11271938681602478,
0.10602916032075882,
0.07814669609069824,
-0.09744085371494293,
0.11606165766716003,
0.03259074687957764,
-0.09054090082645416,
-0.05469582974910736,
0.001406642608344555,
-0.021298319101333618,
-0.21178807318210602,
0.02712981589138508,
0.08436551690101624,
0.08671091496944427,
-0.03758726641535759,
-0.08693122118711472,
-0.018888602033257484,
-0.059668201953172684,
0.07404693216085434,
0.0101083405315876,
0.13378392159938812,
-0.05093035846948624,
-0.039963699877262115,
0.08241565525531769,
0.04268242046236992,
0.05726049840450287,
-0.05335669219493866,
-0.07650406658649445,
-0.04842369258403778,
-0.03038499690592289,
0.008199294097721577,
-0.09431637078523636,
-0.06148980185389519,
-0.01717430353164673,
0.08582023531198502,
0.04303932562470436,
0.1022045686841011,
0.040709082037210464,
0.04473729804158211,
-0.02594601735472679,
0.0013766733463853598,
0.14481137692928314,
0.04908852279186249,
-0.11317011713981628,
-0.09226454794406891,
0.0987686812877655,
-0.07432932406663895,
0.1174137070775032,
-0.25251907110214233,
0.032014086842536926,
-0.11311955004930496,
0.08321117609739304,
-0.0067457230761647224,
0.07558929920196533,
-0.0907612144947052,
0.031444404274225235,
-0.09166737645864487,
-0.011222150176763535,
0.05582568421959877,
0.04270628094673157,
-0.03993121162056923,
0.13056305050849915,
-0.1428522765636444,
0.252463698387146,
0.12270622700452805,
-0.11607427150011063,
-0.10004787147045135,
-0.0613517239689827,
0.01742742396891117,
-0.008180458098649979,
-0.10434523224830627,
0.001996284117922187,
0.08226445317268372,
-0.03686952590942383,
0.18906699120998383,
-0.017291929572820663,
-0.015924064442515373,
-0.016253501176834106,
-0.08460725843906403,
-0.0026853010058403015,
-0.027748405933380127,
0.09072670340538025,
-0.21708054840564728,
0.1348140835762024,
0.1643182635307312,
-0.034257080405950546,
0.17737609148025513,
0.01792263239622116,
0.018321236595511436,
0.002668620552867651,
-0.06682255864143372,
0.005511118099093437,
0.0020026813726872206,
-0.000492085178848356,
-0.014392944052815437,
0.012255203910171986,
0.01760677807033062,
0.02507147379219532,
-0.13759054243564606,
-0.041822779923677444,
0.032047152519226074,
0.04801781475543976,
0.004819761496037245,
0.04864083230495453,
-0.08737404644489288,
0.049372851848602295,
-0.027863839641213417,
-0.12646131217479706,
0.1332874298095703,
0.022478442639112473,
-0.11931633949279785,
0.18422749638557434,
-0.09882816672325134,
-0.19498006999492645,
-0.21047110855579376,
-0.13546288013458252,
0.02439134567975998,
0.08529271930456161,
0.06211759150028229,
-0.07229309529066086,
-0.0692247524857521,
0.0033805493731051683,
-0.08390248566865921,
0.01633763685822487,
-0.027947988361120224,
-0.0853905975818634,
0.04800122231245041,
0.0034779177512973547,
-0.11472620069980621,
-0.040257714688777924,
0.01763991080224514,
-0.07419609278440475,
0.06185048818588257,
-0.041400160640478134,
0.06739397346973419,
0.14478057622909546,
-0.023932816460728645,
0.018098579719662666,
-0.034783586859703064,
0.16564594209194183,
-0.08481018990278244,
0.0012745795538648963,
0.11987090110778809,
-0.04218839108943939,
0.03269821032881737,
0.20156428217887878,
0.032490238547325134,
-0.060903556644916534,
0.07015017420053482,
-0.03521709516644478,
-0.060141000896692276,
-0.19807757437229156,
-0.09679161012172699,
-0.0018222294747829437,
-0.025505205616354942,
0.09128635376691818,
0.04790486395359039,
0.2758360803127289,
0.13866449892520905,
0.07286936044692993,
0.060641203075647354,
0.02662266604602337,
0.08723177760839462,
0.1502876728773117,
-0.020674970000982285,
0.1753813773393631,
-0.08138120174407959,
-0.1593133509159088,
0.04092516005039215,
-0.0375240221619606,
0.09196453541517258,
0.17303624749183655,
-0.015553678385913372,
0.04561338201165199,
0.09282860159873962,
0.14021608233451843,
0.1343209147453308,
0.08669082820415497,
-0.061393339186906815,
-0.007637881673872471,
-0.0006249327561818063,
-0.05215142294764519,
0.1316773146390915,
-0.042238544672727585,
-0.05891881883144379,
-0.041188400238752365,
0.01725822687149048,
0.04303298890590668,
0.06353260576725006,
0.002761262934654951,
-0.30222421884536743,
0.03000793233513832,
0.03449222072958946,
-0.061126966029405594,
-0.09679623693227768,
0.07701791077852249,
-0.04201863706111908,
-0.1934855729341507,
0.029195407405495644,
-0.04704124480485916,
0.0934133380651474,
0.03774815797805786,
0.05465435981750488,
-0.05683758482336998,
-0.03798997029662132,
-0.03781534358859062,
0.14723844826221466,
-0.36124858260154724,
0.21674223244190216,
-0.01860789954662323,
0.07806213945150375,
-0.1010344848036766,
0.01735423319041729,
0.09445473551750183,
0.21099194884300232,
0.10297887772321701,
-0.05539694428443909,
-0.17936211824417114,
-0.12916940450668335,
-0.06683283299207687,
-0.010509343817830086,
0.01618899218738079,
-0.017661429941654205,
0.008837479166686535,
-0.12809018790721893,
-0.0026399956550449133,
0.0517050065100193,
0.009733729995787144,
-0.16842865943908691,
-0.16192136704921722,
-0.019920513033866882,
0.0048285964876413345,
0.13214963674545288,
-0.04712256044149399,
-0.07380867004394531,
-0.07432052493095398,
0.15452365577220917,
0.04846067726612091,
0.0073686446994543076,
-0.13324832916259766,
-0.04316146671772003,
-0.0353224091231823,
-0.03438885882496834,
0.08668427914381027,
0.00816283468157053,
0.11935707926750183,
-0.0860477164387703,
-0.074086993932724,
0.0963996946811676,
-0.1223563477396965,
-0.056629639118909836,
-0.11120148748159409,
0.006045287940651178,
-0.04085000231862068,
-0.0009410463972017169,
0.11416077613830566,
0.04620915651321411,
-0.06116208806633949,
-0.0688200294971466,
-0.024785928428173065,
-0.009172101505100727,
-0.006290373858064413,
-0.10778804123401642,
-0.1051861047744751,
-0.10487903654575348,
-0.028152436017990112,
-0.10405804216861725,
0.19645440578460693,
0.1438249945640564,
-0.09755223244428635,
0.13965505361557007,
0.2229921519756317,
-0.10746392607688904,
-0.2963859736919403,
-0.05274598300457001,
-0.06568930298089981,
0.0059789628721773624,
0.05390051379799843,
-0.13015693426132202,
0.10193809866905212,
0.014996213838458061,
-0.08038043230772018,
-0.048520829528570175,
-0.11711455881595612,
-0.1583443284034729,
0.2512666583061218,
0.010452411137521267,
0.18587473034858704,
-0.07831650227308273,
-0.049541812390089035,
-0.13045138120651245,
0.038647644221782684,
0.0488838255405426,
-0.06278447806835175,
0.051292285323143005,
0.04884246736764908,
0.05297474190592766,
0.021952124312520027,
-0.04413874074816704,
0.05185624212026596,
-0.06887590885162354,
0.09076615422964096,
-0.17091482877731323,
-0.025716489180922508,
0.07694622129201889,
-0.02692396752536297,
0.10601745545864105,
-0.04011686518788338,
0.04284490644931793,
-0.032815124839544296,
-0.0755019336938858,
0.016431376338005066,
0.07304095476865768,
-0.0022275010123848915,
-0.12076404690742493,
0.010210824199020863,
0.00709371455013752,
-0.0008029851014725864,
-0.07250814884901047,
0.021519308909773827,
-0.00432422012090683,
0.13222207129001617,
0.14775076508522034,
0.2065606713294983,
-0.04621176794171333,
0.05270147696137428,
-0.03532959148287773,
-0.11586715281009674,
0.09111319482326508,
-0.06482606381177902,
0.016120556741952896,
0.06283482164144516,
-0.04885070398449898,
0.1550329327583313,
0.05711381882429123,
0.005190057214349508,
-0.017632992938160896,
0.15170270204544067,
-0.15404976904392242,
0.025018449872732162,
-0.08085252344608307,
0.0947846844792366,
0.031890109181404114,
-0.015246044844388962,
0.10716357827186584,
-0.09005149453878403,
-0.02597333863377571,
0.009867984801530838,
-0.00793418101966381,
-0.03569035232067108,
0.09453240036964417,
0.0442148894071579,
0.02859344147145748,
-0.06889442354440689,
0.029526684433221817,
0.07372083514928818,
0.00619673915207386,
0.03790383040904999,
-0.00029643086600117385,
-0.10682865977287292,
-0.09965331107378006,
0.03642730042338371,
0.254408061504364,
-0.20212560892105103,
-0.08650431782007217,
-0.008580835536122322,
-0.1059601753950119,
0.008985747583210468,
0.0857999324798584,
0.08850012719631195,
0.05151601508259773,
-0.0601990669965744,
-0.021751662716269493,
-0.1236807107925415,
0.08181203901767731,
-0.005813955795019865,
0.05252043530344963,
-0.1731976866722107,
0.09326323866844177,
-0.023697828873991966,
-0.002538486383855343,
-0.09208793938159943,
-0.019446147605776787,
-0.11593508720397949,
0.020832397043704987,
-0.10869579017162323,
-0.05005227029323578,
-0.045802175998687744,
-0.007226692512631416,
0.05889459326863289,
-0.015567611902952194,
-0.01892920583486557,
-0.020367030054330826,
-0.09396491199731827,
0.02818489447236061,
0.011118375696241856,
0.03985261544585228,
-0.05630745366215706,
-0.02137371152639389,
0.03317386284470558,
0.0006411992362700403,
0.04636351391673088,
0.005511044524610043,
-0.002032718388363719,
0.059319332242012024,
-0.13885454833507538,
0.008496670983731747,
0.0650734007358551,
-0.0030566283967345953,
0.020524753257632256,
-0.04371722787618637,
0.00853944756090641,
0.09254854917526245,
0.027601486071944237,
0.041354384273290634,
-0.01164696179330349,
-0.1028488501906395,
0.02337401732802391,
0.0833592563867569,
-0.12200818955898285,
-0.035818714648485184,
-0.02528981864452362,
0.012294970452785492,
-0.052567265927791595,
0.2529691755771637,
-0.11891333013772964,
0.049488313496112823,
-0.03267465531826019,
0.026030708104372025,
-0.0586434006690979,
-0.11815062910318375,
-0.10792829841375351,
-0.10968075692653656,
-0.04541591927409172,
0.004508506041020155,
0.2689996659755707,
0.13644033670425415,
0.0038688546046614647,
0.03944488987326622,
0.065635085105896,
0.07215114682912827,
0.005168523173779249,
0.23808495700359344,
0.1087392121553421,
-0.0010016158921644092,
-0.11976668238639832,
0.0833549052476883,
0.02428320422768593,
-0.09951290488243103,
0.0009715823107399046,
0.024275295436382294,
-0.09482088685035706,
0.07196833193302155,
0.0557919405400753,
-0.03798023238778114,
-0.12250353395938873,
-0.176181361079216,
-0.10479963570833206,
0.036747366189956665,
-0.08988049626350403,
-0.003276057541370392,
0.16489656269550323,
-0.05511564761400223,
-0.01080505270510912,
-0.05300484597682953,
-0.04731086269021034,
-0.2172880470752716,
-0.15815839171409607,
-0.1112399622797966,
-0.09179763495922089,
0.04345835745334625,
-0.02084386721253395,
0.04623923450708389,
0.04555289074778557,
0.0377981998026371,
-0.06606848537921906,
0.0996394008398056,
-0.10094716399908066,
0.0021724712569266558,
-0.0019105275860056281,
-0.051692262291908264,
-0.010041615925729275,
-0.1787000149488449,
-0.018906332552433014,
-0.13605888187885284,
-0.011757775209844112,
-0.030424442142248154,
-0.04292851313948631,
-0.008123358711600304,
0.0020226617343723774,
-0.034472040832042694,
-0.023563841357827187,
-0.01004047691822052,
0.04256843030452728,
0.016979418694972992,
0.034373629838228226,
0.010642160661518574,
0.001043044845573604,
0.039898402988910675,
0.20765984058380127,
-0.04083061218261719,
-0.19630497694015503,
-0.14036925137043,
0.2603556513786316,
0.03892805427312851,
0.12410715967416763,
-0.06265673041343689,
-0.0052998787723481655,
0.05897653102874756,
0.3017213046550751,
0.2862779200077057,
-0.03375839814543724,
0.010591065511107445,
-0.018728913739323616,
-0.017383165657520294,
-0.008246404118835926,
0.18533962965011597,
0.0034209436271339655,
0.13756972551345825,
-0.04916476830840111,
0.055966828018426895,
-0.02057797834277153,
-0.09353124350309372,
-0.041706569492816925,
0.1361774206161499,
-0.02159058302640915,
-0.01321462169289589,
-0.01874959096312523,
0.08257261663675308,
-0.061929039657115936,
0.15772533416748047,
-0.10085609555244446,
-0.009556636214256287,
-0.05062995105981827,
0.03981149196624756,
0.11429575085639954,
-0.006433768197894096,
0.03454282879829407,
-0.02640240639448166,
-0.01806567795574665,
0.0357576422393322,
-0.033421069383621216,
-0.08391266316175461,
-0.03788287937641144,
0.08422775566577911,
0.014526239596307278,
0.23392081260681152,
-0.01633317582309246,
0.04056178033351898,
0.08718748390674591,
0.003936471417546272,
-0.09521330893039703,
0.10426279902458191,
-0.0012348884483799338,
-0.04771503433585167,
0.12493635714054108,
-0.020244482904672623,
0.008243391290307045,
-0.0053984615951776505,
-0.011651858687400818,
-0.18458795547485352,
0.14560383558273315,
-0.13425081968307495,
-0.09588589519262314,
-0.051299285143613815,
0.10570967942476273,
-0.02462267316877842,
0.15630926191806793,
0.08158250898122787,
-0.019184449687600136,
0.012958957813680172,
-0.03348344936966896,
0.0787903442978859,
0.008676338940858841,
-0.09003611654043198,
-0.037315260618925095,
-0.17064739763736725,
-0.03283747658133507,
0.09531781077384949,
-0.02645190805196762,
-0.23745925724506378,
-0.08856049180030823,
-0.07534709572792053,
-0.04410878196358681,
-0.12295244634151459,
0.0834372490644455,
0.20323392748832703,
0.03257528692483902,
-0.012365467846393585,
-0.12114307284355164,
-0.02698216401040554,
0.027812235057353973,
-0.05582612007856369,
-0.10069433599710464
] |
null | null | transformers | # Description
[MaziyarPanahi/samantha-1.1-westlake-7b-GPTQ](https://huggingface.co/MaziyarPanahi/samantha-1.1-westlake-7b-GPTQ) is a quantized (GPTQ) version of [cognitivecomputations/samantha-1.1-westlake-7b](https://huggingface.co/cognitivecomputations/samantha-1.1-westlake-7b)
## How to use
### Install the necessary packages
```
pip install --upgrade accelerate auto-gptq transformers
```
### Example Python code
```python
from transformers import AutoTokenizer, pipeline
from auto_gptq import AutoGPTQForCausalLM, BaseQuantizeConfig
import torch
model_id = "MaziyarPanahi/samantha-1.1-westlake-7b-GPTQ"
quantize_config = BaseQuantizeConfig(
    bits=4,          # 4-bit weight quantization
    group_size=128,  # quantize weights in groups of 128 (accuracy/size trade-off)
    desc_act=False   # skip activation-order quantization for faster inference
)
model = AutoGPTQForCausalLM.from_quantized(
model_id,
use_safetensors=True,
device="cuda:0",
quantize_config=quantize_config)
tokenizer = AutoTokenizer.from_pretrained(model_id)
pipe = pipeline(
"text-generation",
model=model,
tokenizer=tokenizer,
max_new_tokens=512,
temperature=0.7,
top_p=0.95,
repetition_penalty=1.1
)
outputs = pipe("What is a large language model?")
print(outputs[0]["generated_text"])
``` | {"license": "apache-2.0", "tags": ["finetuned", "quantized", "4-bit", "gptq", "transformers", "pytorch", "mistral", "text-generation", "conversational", "dataset:cognitivecomputations/samantha-data", "license:apache-2.0", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us"], "model_name": "samantha-1.1-westlake-7b-GPTQ", "base_model": "cognitivecomputations/samantha-1.1-westlake-7b", "inference": false, "model_creator": "cognitivecomputations", "pipeline_tag": "text-generation", "quantized_by": "MaziyarPanahi"} | text-generation | MaziyarPanahi/samantha-1.1-westlake-7b-GPTQ | [
"transformers",
"safetensors",
"mistral",
"text-generation",
"finetuned",
"quantized",
"4-bit",
"gptq",
"pytorch",
"conversational",
"dataset:cognitivecomputations/samantha-data",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us",
"base_model:cognitivecomputations/samantha-1.1-westlake-7b"
] | 2024-02-13T10:03:37+00:00 | [] | [] | TAGS
#transformers #safetensors #mistral #text-generation #finetuned #quantized #4-bit #gptq #pytorch #conversational #dataset-cognitivecomputations/samantha-data #license-apache-2.0 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us #base_model-cognitivecomputations/samantha-1.1-westlake-7b
| # Description
MaziyarPanahi/samantha-1.1-westlake-7b-GPTQ is a quantized (GPTQ) version of cognitivecomputations/samantha-1.1-westlake-7b
## How to use
### Install the necessary packages
### Example Python code
| [
"# Description\nMaziyarPanahi/samantha-1.1-westlake-7b-GPTQ is a quantized (GPTQ) version of cognitivecomputations/samantha-1.1-westlake-7b",
"## How to use",
"### Install the necessary packages",
"### Example Python code"
] | [
"TAGS\n#transformers #safetensors #mistral #text-generation #finetuned #quantized #4-bit #gptq #pytorch #conversational #dataset-cognitivecomputations/samantha-data #license-apache-2.0 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us #base_model-cognitivecomputations/samantha-1.1-westlake-7b \n",
"# Description\nMaziyarPanahi/samantha-1.1-westlake-7b-GPTQ is a quantized (GPTQ) version of cognitivecomputations/samantha-1.1-westlake-7b",
"## How to use",
"### Install the necessary packages",
"### Example Python code"
] | [
112,
45,
4,
7,
6
] | [
"passage: TAGS\n#transformers #safetensors #mistral #text-generation #finetuned #quantized #4-bit #gptq #pytorch #conversational #dataset-cognitivecomputations/samantha-data #license-apache-2.0 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us #base_model-cognitivecomputations/samantha-1.1-westlake-7b \n# Description\nMaziyarPanahi/samantha-1.1-westlake-7b-GPTQ is a quantized (GPTQ) version of cognitivecomputations/samantha-1.1-westlake-7b## How to use### Install the necessary packages### Example Python code"
] | [
-0.10807432979345322,
0.26302799582481384,
-0.001226385124027729,
0.08242815732955933,
0.09770385921001434,
0.015717897564172745,
0.05213290452957153,
0.12735667824745178,
0.058479249477386475,
0.0465192049741745,
0.13153617084026337,
0.16708329319953918,
0.06470467150211334,
0.09914496541023254,
-0.01299307867884636,
-0.18808254599571228,
0.017591696232557297,
0.01952148787677288,
0.039600107818841934,
0.11548998951911926,
0.0565774142742157,
-0.028628947213292122,
0.0871221199631691,
-0.05070198327302933,
-0.040212180465459824,
-0.08268080651760101,
-0.0061826384626328945,
-0.0749393031001091,
0.07457542419433594,
-0.019739419221878052,
-0.03043964132666588,
0.04341797158122063,
0.005282898433506489,
-0.14318042993545532,
0.01096750982105732,
-0.017598548904061317,
-0.006432711612433195,
0.05830613896250725,
-0.01613781228661537,
-0.005601242650300264,
0.020588628947734833,
-0.09807408601045609,
0.027882982045412064,
0.06345351785421371,
-0.09233750402927399,
-0.02526465617120266,
-0.09652706980705261,
0.045401573181152344,
0.07821748405694962,
0.05221078544855118,
-0.027735305950045586,
0.12646988034248352,
0.03011152520775795,
0.10444462299346924,
0.12432171404361725,
-0.3017689883708954,
-0.03152085468173027,
0.12484417855739594,
-0.0292263925075531,
0.06707537174224854,
-0.0014588609337806702,
0.01120726391673088,
0.024885527789592743,
0.03912481665611267,
0.0003670969163067639,
-0.09383650869131088,
-0.09419460594654083,
-0.007621113210916519,
-0.15877315402030945,
-0.015382753685116768,
0.28044593334198,
-0.015430927276611328,
-0.056724682450294495,
-0.03879503533244133,
-0.08747727423906326,
-0.0640123039484024,
-0.03284419700503349,
0.04363137483596802,
-0.04161425679922104,
-0.014727900736033916,
-0.03174898773431778,
-0.05171861872076988,
-0.06368597596883774,
-0.02085409127175808,
-0.06880932301282883,
0.11880519986152649,
0.030132075771689415,
0.014027012512087822,
-0.006039600819349289,
0.08292452245950699,
-0.18555566668510437,
-0.05336964502930641,
-0.05640153959393501,
-0.02055879682302475,
0.03486848250031471,
0.015178616158664227,
-0.007935707457363605,
0.027076013386249542,
0.06800947338342667,
0.1672920286655426,
0.03835882619023323,
0.09288257360458374,
0.08837512880563736,
-0.0057959891855716705,
-0.015828771516680717,
0.15802139043807983,
-0.08352042734622955,
-0.05589514225721359,
0.11775064468383789,
0.07162289321422577,
0.1293720155954361,
-0.02181841805577278,
-0.09293821454048157,
0.03163284435868263,
0.08638620376586914,
0.09303741902112961,
0.015759393572807312,
0.09824799001216888,
-0.08618289232254028,
-0.030081771314144135,
0.1399991810321808,
-0.09280861169099808,
-0.038434840738773346,
-0.0142372976988554,
0.009916688315570354,
-0.05287380516529083,
0.06270032376050949,
-0.0016831510001793504,
-0.056230947375297546,
-0.06333132088184357,
-0.06986550986766815,
-0.03539491072297096,
-0.027978690341114998,
-0.035417355597019196,
0.04026345536112785,
-0.09219089895486832,
0.029313696548342705,
-0.14387580752372742,
-0.3104051351547241,
0.056437864899635315,
0.013526376336812973,
-0.05464604124426842,
-0.03904591128230095,
-0.021726887673139572,
-0.049072008579969406,
-0.03993361443281174,
-0.04466699808835983,
0.006089580245316029,
-0.05810517817735672,
0.08584293723106384,
0.08489082008600235,
0.09186756610870361,
-0.10433667153120041,
-0.0059625133872032166,
-0.09073488414287567,
0.05033814162015915,
-0.03433312103152275,
0.11274395883083344,
-0.06946774572134018,
0.046579938381910324,
-0.08832930028438568,
-0.05352195352315903,
0.04038839414715767,
-0.05699558183550835,
0.03693780675530434,
0.18774448335170746,
-0.23675131797790527,
0.0019719135016202927,
0.1844916194677353,
-0.08427872508764267,
-0.20163384079933167,
0.1604190319776535,
0.04139983654022217,
0.14049072563648224,
0.09797733277082443,
0.20015254616737366,
0.05362585932016373,
-0.09742911159992218,
-0.048393722623586655,
0.037896305322647095,
0.06926232576370239,
-0.025532100349664688,
0.0752435177564621,
0.06102411448955536,
-0.11094558984041214,
0.08382570743560791,
-0.07142746448516846,
0.05422116816043854,
-0.022365136072039604,
-0.07605660706758499,
-0.04440591484308243,
-0.12254877388477325,
0.07162903249263763,
-0.00019734221859835088,
0.0029244092293083668,
-0.09248193353414536,
-0.05852292850613594,
-0.025029122829437256,
0.09728625416755676,
-0.07157046347856522,
-0.015996044501662254,
-0.12003690749406815,
0.09237506985664368,
-0.0574973039329052,
0.04840024933218956,
-0.12176179885864258,
-0.003045162884518504,
0.0558653362095356,
0.006402520928531885,
0.08156343549489975,
-0.1239607110619545,
0.06402340531349182,
0.045453086495399475,
-0.044975005090236664,
-0.04883372783660889,
0.07403045892715454,
0.03134356811642647,
-0.058077115565538406,
-0.04380379989743233,
0.05140608921647072,
-0.015188218094408512,
0.2934378385543823,
-0.04830862209200859,
0.07027310132980347,
0.08821447193622589,
0.037092797458171844,
-0.015320874750614166,
-0.03459332138299942,
0.0820118635892868,
0.03203083947300911,
0.012162109836935997,
-0.06436241418123245,
0.07482770085334778,
0.030438318848609924,
-0.12265729904174805,
0.001230563153512776,
-0.1361236721277237,
-0.000015793715647305362,
0.12418913096189499,
0.07100016623735428,
0.022379891946911812,
0.07460508495569229,
-0.011022260412573814,
-0.02228415571153164,
0.03953614830970764,
-0.05640475079417229,
0.09123238921165466,
-0.01779089868068695,
0.07896368950605392,
-0.07484150677919388,
-0.01277978252619505,
0.017390176653862,
-0.06326501071453094,
-0.01855168119072914,
0.10065595805644989,
0.042882367968559265,
-0.1532280147075653,
0.10773434489965439,
0.1652320772409439,
-0.06734272837638855,
0.08106468617916107,
-0.022883743047714233,
-0.01151985116302967,
-0.06642202287912369,
0.02828347496688366,
0.04457169771194458,
0.07690320909023285,
-0.16327008605003357,
-0.006185908801853657,
0.05918828770518303,
-0.019875651225447655,
0.03386898338794708,
-0.1646113246679306,
-0.0015691111329942942,
-0.03387945145368576,
-0.024716926738619804,
-0.004740741103887558,
0.02536204643547535,
-0.0744219496846199,
0.03783347085118294,
-0.010537588968873024,
0.03841324523091316,
0.06477926671504974,
0.03741664066910744,
-0.09938707947731018,
0.17257599532604218,
-0.17255380749702454,
-0.3398911952972412,
-0.13568349182605743,
-0.07672528922557831,
-0.06511888653039932,
0.0002800517831929028,
0.054492514580488205,
-0.05880952998995781,
-0.06990950554609299,
-0.08901713043451309,
0.021377839148044586,
0.03549869358539581,
0.004176135174930096,
-0.025985637679696083,
-0.02996385097503662,
0.016403287649154663,
-0.0896754339337349,
-0.0014378639170899987,
0.03522307053208351,
-0.12608325481414795,
0.1356167048215866,
-0.11063384264707565,
0.08637413382530212,
0.13372007012367249,
0.01824452541768551,
0.003635617671534419,
-0.050189100205898285,
0.29376986622810364,
-0.0524398609995842,
0.0637747198343277,
0.15906941890716553,
0.00958506390452385,
0.0047973450273275375,
0.08825340867042542,
-0.006847687531262636,
-0.11578166484832764,
0.02752198651432991,
-0.07768209278583527,
-0.0714975893497467,
-0.22687752544879913,
-0.09646543115377426,
-0.055415622889995575,
0.18184761703014374,
0.07701413333415985,
0.04488411918282509,
-0.018847884610295296,
0.1112508624792099,
-0.014038637280464172,
0.04992000758647919,
0.018131915479898453,
0.097423255443573,
0.18500667810440063,
-0.043610263615846634,
0.07573829591274261,
-0.06524476408958435,
0.004555098246783018,
0.09136982262134552,
0.18768881261348724,
0.10083834081888199,
0.002073718933388591,
0.20982079207897186,
0.0302262082695961,
0.18098576366901398,
0.02149193175137043,
0.04845285043120384,
0.006614138837903738,
0.009526689536869526,
-0.04417260363698006,
-0.05641718953847885,
-0.16292603313922882,
0.041565462946891785,
-0.03519528731703758,
-0.0171772763133049,
0.047874316573143005,
0.046367477625608444,
0.09398798644542694,
0.10276301950216293,
0.07202273607254028,
-0.1895611584186554,
-0.13832804560661316,
0.08025109767913818,
0.017160141840577126,
-0.03290124237537384,
0.06759442389011383,
0.03344781696796417,
-0.04802021384239197,
0.0723675787448883,
-0.05805773288011551,
0.06719320267438889,
-0.11098963022232056,
-0.03476493060588837,
-0.061778806149959564,
-0.0003958269953727722,
0.05318450927734375,
0.09381052106618881,
-0.2232609987258911,
0.11906281113624573,
0.06767860800027847,
0.06988347321748734,
-0.07458721101284027,
0.0003167669929098338,
0.0032930546440184116,
0.073920838534832,
0.10736822336912155,
0.001559508265927434,
0.08781394362449646,
-0.07954484224319458,
-0.0851350724697113,
0.054786473512649536,
0.0485144779086113,
0.03952119126915932,
0.03250214830040932,
-0.0039684828370809555,
-0.008548177778720856,
-0.05717581510543823,
-0.08084605634212494,
-0.06473337858915329,
-0.14435380697250366,
0.07387600094079971,
0.05895991623401642,
-0.014155831187963486,
-0.08768633753061295,
-0.05101925879716873,
-0.15584604442119598,
0.09897972643375397,
-0.13126236200332642,
-0.0855940580368042,
-0.06152680143713951,
-0.021599814295768738,
0.10562555491924286,
-0.09323282539844513,
0.07461287826299667,
-0.045932888984680176,
0.021862579509615898,
-0.012218308635056019,
-0.08292301744222641,
0.05945850536227226,
-0.12118025869131088,
-0.11263767629861832,
-0.01670723222196102,
0.16474291682243347,
-0.026232531294226646,
0.049898821860551834,
-0.02969052456319332,
0.04961151257157326,
-0.06264545768499374,
-0.11296985298395157,
0.002025877358391881,
0.08385562151670456,
-0.01275093574076891,
0.06311657279729843,
-0.03025110438466072,
-0.0435505174100399,
-0.07865285873413086,
-0.06219518557190895,
0.1730199009180069,
0.17482256889343262,
-0.03865817189216614,
0.04123567417263985,
0.15310437977313995,
-0.016021857038140297,
-0.3286411762237549,
-0.06780834496021271,
-0.028355784714221954,
-0.03405251353979111,
0.0073580555617809296,
-0.1334851086139679,
0.07653842121362686,
0.10645020753145218,
-0.040643613785505295,
0.052944887429475784,
-0.2553178369998932,
-0.08888565748929977,
0.06345152854919434,
0.16066330671310425,
0.04354426637291908,
-0.202983558177948,
-0.05681046098470688,
-0.000869660871103406,
-0.16879932582378387,
0.1458972841501236,
-0.1228419691324234,
0.12409630417823792,
0.005606509745121002,
0.1695217341184616,
-0.015013236552476883,
-0.04867912083864212,
0.1281982660293579,
-0.04557047039270401,
-0.041166845709085464,
-0.01199440099298954,
-0.011593646369874477,
0.1189400851726532,
-0.02475423365831375,
0.09296122938394547,
-0.06374651938676834,
0.08831176906824112,
0.050055406987667084,
-0.01352599635720253,
-0.05498706176877022,
0.017308196052908897,
-0.012211503461003304,
-0.0832769051194191,
-0.06001714989542961,
0.044620294123888016,
-0.0061150505207479,
0.0021939671132713556,
0.06478672474622726,
-0.0035685051698237658,
-0.028207415714859962,
0.15884284675121307,
0.08500917255878448,
-0.14312180876731873,
0.023427877575159073,
-0.022232666611671448,
-0.07146243751049042,
-0.003892218694090843,
-0.1976868361234665,
0.002206013537943363,
0.07541774958372116,
0.05320413410663605,
0.14654669165611267,
-0.009480430744588375,
-0.05562863498926163,
-0.001212955336086452,
0.03436754643917084,
-0.14633886516094208,
-0.2485601156949997,
-0.003924551419913769,
0.14122475683689117,
-0.04591170698404312,
0.15064381062984467,
0.1255357414484024,
-0.02606256864964962,
-0.06252492219209671,
-0.016908705234527588,
0.0511290542781353,
-0.01713700406253338,
0.13281653821468353,
0.04198175296187401,
0.05784453824162483,
-0.12057460844516754,
0.10393213480710983,
0.06003370136022568,
-0.15962691605091095,
0.02771199308335781,
0.0722905844449997,
-0.1593722105026245,
-0.10590460896492004,
-0.13676568865776062,
-0.04138721525669098,
0.0248570516705513,
-0.03703771531581879,
-0.06300458312034607,
-0.031273253262043,
-0.026953184977173805,
0.07854773849248886,
0.06809782981872559,
0.013337971642613411,
-0.02798328548669815,
0.0018125026253983378,
-0.07320629060268402,
0.12167918682098389,
-0.0016926011303439736,
0.06712828576564789,
-0.12900033593177795,
-0.05844835937023163,
0.01666007749736309,
0.07126772403717041,
-0.02071310766041279,
-0.024678539484739304,
-0.07587782293558121,
0.0028692528139799833,
-0.03645513206720352,
0.050411853939294815,
-0.14649446308612823,
0.02035597153007984,
0.001701087225228548,
-0.03933008015155792,
-0.04487575590610504,
0.050377123057842255,
-0.036241743713617325,
0.0033353520557284355,
-0.01795736700296402,
0.03703903406858444,
-0.02677001804113388,
-0.04260043427348137,
0.06364346295595169,
-0.07276055216789246,
0.10755643248558044,
0.03568245470523834,
-0.059840839356184006,
0.11140422523021698,
-0.0594681091606617,
-0.004711405839771032,
0.06539162248373032,
0.06637958437204361,
0.02015724778175354,
-0.11932998895645142,
-0.015512421727180481,
0.04739365354180336,
0.020778071135282516,
0.021305548027157784,
0.1564595103263855,
-0.060484834015369415,
0.0158928781747818,
-0.03064318560063839,
-0.03531509265303612,
-0.08382046967744827,
0.015807922929525375,
0.03445864096283913,
0.009074972942471504,
0.10638277232646942,
-0.09161033481359482,
-0.010326020419597626,
-0.06972721219062805,
-0.008375070989131927,
-0.0350010022521019,
-0.11652495712041855,
-0.18506650626659393,
-0.013713815249502659,
0.04651820659637451,
-0.0166286863386631,
0.21255724132061005,
-0.09272381663322449,
-0.029756532981991768,
0.001596704008989036,
0.03419971093535423,
0.059714458882808685,
-0.026515599340200424,
0.2458605170249939,
0.031562164425849915,
-0.015503908507525921,
-0.07147634774446487,
0.06887082755565643,
0.014572104439139366,
0.10018044710159302,
0.01206892915070057,
0.018379028886556625,
-0.033463701605796814,
0.07968422770500183,
-0.057053882628679276,
0.034442782402038574,
-0.05364958569407463,
0.06396164000034332,
-0.06746984273195267,
0.057555727660655975,
-0.051324665546417236,
0.2151208370923996,
0.1151958703994751,
-0.07939224690198898,
-0.009951415471732616,
-0.05752827972173691,
-0.11245256662368774,
-0.07769203931093216,
-0.1099909096956253,
-0.12935195863246918,
-0.08691451698541641,
-0.004309787880629301,
-0.12640954554080963,
-0.06761059165000916,
0.04678316041827202,
0.031878288835287094,
-0.03937426954507828,
0.19239629805088043,
-0.01293199509382248,
-0.01499153021723032,
0.025912798941135406,
0.029070695862174034,
-0.04833781346678734,
-0.011879552155733109,
-0.04301810264587402,
-0.020882505923509598,
0.02516699582338333,
0.06740901619195938,
0.02575800009071827,
-0.008235424757003784,
0.030389048159122467,
-0.05005970224738121,
-0.06376668810844421,
-0.03442778065800667,
0.09036436676979065,
0.004039490129798651,
0.1293090283870697,
0.007259831763803959,
-0.056476205587387085,
0.06732230633497238,
0.1330244541168213,
-0.04255428537726402,
-0.12708677351474762,
-0.1158810406923294,
0.26821526885032654,
-0.07162188738584518,
0.004653493873775005,
0.07943682372570038,
-0.0785602331161499,
0.026293426752090454,
0.24564076960086823,
0.19335991144180298,
-0.03528798371553421,
-0.018376734107732773,
-0.0035872708540409803,
0.0069016036577522755,
-0.0693928599357605,
0.0845356434583664,
0.10330743342638016,
0.14432932436466217,
-0.03407011181116104,
-0.051439106464385986,
-0.014059214852750301,
-0.057212792336940765,
-0.12631583213806152,
-0.017319459468126297,
-0.024994252249598503,
0.0018426957540214062,
-0.049816589802503586,
0.07411644607782364,
-0.07435429096221924,
-0.07428894191980362,
-0.03975393995642662,
-0.1877179592847824,
-0.11625886708498001,
-0.012135865166783333,
0.03846130892634392,
-0.026511341333389282,
0.03705894201993942,
-0.018175609409809113,
0.03289772942662239,
0.025950565934181213,
0.014962833374738693,
-0.11329112946987152,
0.054734714329242706,
0.06251641362905502,
-0.002264543669298291,
0.157162144780159,
-0.019186031073331833,
0.08546734601259232,
0.1363016963005066,
-0.004492480773478746,
-0.1471814215183258,
0.16654816269874573,
0.06158515810966492,
-0.02090810425579548,
0.008863404393196106,
0.08176478743553162,
-0.021754013374447823,
0.08429950475692749,
0.06044324114918709,
-0.07389390468597412,
-0.019891826435923576,
0.10226594656705856,
-0.0032321717590093613,
-0.11794175207614899,
0.0483216717839241,
-0.11354048550128937,
0.13041597604751587,
0.0487307570874691,
-0.06739882379770279,
-0.00162154296413064,
-0.08871287852525711,
0.029722604900598526,
0.039796944707632065,
0.01242890302091837,
-0.010047086514532566,
-0.17316125333309174,
-0.03812913969159126,
-0.05259586125612259,
0.043731335550546646,
-0.2679707109928131,
-0.0032048721332103014,
-0.09481285512447357,
0.011760890483856201,
-0.1096770167350769,
0.1366339772939682,
0.04772547632455826,
-0.0300605446100235,
-0.009882135316729546,
-0.07844112813472748,
-0.04420981556177139,
0.08246277272701263,
-0.10216128081083298,
-0.09696054458618164
] |
null | null | transformers |
# NB-Whisper Large
Introducing the **_Norwegian NB-Whisper Large model_**, proudly developed by the National Library of Norway. NB-Whisper is a cutting-edge series of models designed for automatic speech recognition (ASR) and speech translation. These models are based on the work of [OpenAI's Whisper](https://arxiv.org/abs/2212.04356). Each model in the series has been trained for 250,000 steps, utilizing a diverse dataset of 8 million samples. These samples consist of aligned audio clips, each 30 seconds long, culminating in a staggering 66,000 hours of speech. For an in-depth understanding of our training methodology and dataset composition, keep an eye out for our upcoming article.
| Model Size | Parameters | Model |
|------------|------------|------------|
| Tiny | 39M | [NB-Whisper Tiny](https://huggingface.co/NbAiLab/nb-whisper-tiny) |
| Base | 74M | [NB-Whisper Base](https://huggingface.co/NbAiLab/nb-whisper-base) |
| Small | 244M | [NB-Whisper Small](https://huggingface.co/NbAiLab/nb-whisper-small) |
| Medium | 769M | [NB-Whisper Medium](https://huggingface.co/NbAiLab/nb-whisper-medium) |
| Large | 1550M | [NB-Whisper Large](https://huggingface.co/NbAiLab/nb-whisper-large) |
### Verbatim Model
While the main models are suitable for most transcription tasks, we demonstrate how easy it is to change the output of the main model. The following models are trained for 250 additional steps from the main models above, and might be suitable for more targeted use cases:
- **Verbatim version**: This lower-cased variant is more literal and suitable for tasks requiring detailed transcription, such as linguistic analysis.
| Model Size | Parameters | Semantic version |
|------------|------------|------------------|
| Tiny | 39M | [Tiny - semantic](https://huggingface.co/NbAiLab/nb-whisper-tiny-semantic) |
| Base | 74M | [Base - semantic](https://huggingface.co/NbAiLab/nb-whisper-base-semantic) |
| Small | 244M | [Small - semantic](https://huggingface.co/NbAiLab/nb-whisper-small-semantic) |
| Medium | 769M | [Medium - semantic](https://huggingface.co/NbAiLab/nb-whisper-medium-semantic) |
| Large | 1550M | [Large - semantic](https://huggingface.co/NbAiLab/nb-whisper-large-semantic) |
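Loading one of these variants works exactly like the main models shown in the usage section below; only the model id changes. A minimal sketch, using the small semantic variant as an example:

```python
from transformers import pipeline

# Same pipeline call as for the main models, just a different model id
asr = pipeline("automatic-speech-recognition", "NbAiLab/nb-whisper-small-semantic")
asr("king.mp3", generate_kwargs={'task': 'transcribe', 'language': 'no'})
```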
### Model Description
- **Developed by:** [NB AI-Lab](https://ai.nb.no/)
- **Shared by:** [NB AI-Lab](https://ai.nb.no/)
- **Model type:** `whisper`
- **Language(s) (NLP):** Norwegian, Norwegian Bokmål, Norwegian Nynorsk, English
- **License:** [Apache 2.0](https://www.apache.org/licenses/LICENSE-2.0)
- **Trained from model:** [openai/whisper-large](https://huggingface.co/openai/whisper-large)
- **Code Repository:** https://github.com/NbAiLab/nb-whisper/
- **Paper:** _Coming soon_
- **Demo:** _See Spaces on this page_
## How to Use the Models
### Online Demos
You can try the models directly through the HuggingFace Inference API, accessible on the right side of this page. Be aware that initially, the model needs to load and will run on limited CPU capacity, which might be slow. To enhance your experience, we are temporarily hosting some models on TPUs for a few days, significantly boosting their performance. Explore these under the **Spaces** section on the [Main Page](https://huggingface.co/NbAiLab/).
### Local Setup with HuggingFace
Alternatively, you can run the models locally. The Tiny, Base, and Small models are optimized for CPU execution. For the Medium and Large models, we recommend a system equipped with a GPU to ensure efficient processing. Setting up and using these models with HuggingFace's Transformers is straightforward, provided you have [Python](https://www.python.org/downloads/) installed on your machine. For practical demonstrations, refer to examples using this [sample mp3 file](https://github.com/NbAiLab/nb-whisper/raw/main/audio/king.mp3).
```bash
# Download the sample file
$ wget -N https://github.com/NbAiLab/nb-whisper/raw/main/audio/king.mp3
# Install necessary libraries.
$ pip install "transformers>=4.35.2"
```
After this is done, you should be able to run this in Python:
```python
from transformers import pipeline
# Load the model
asr = pipeline("automatic-speech-recognition", "NbAiLab/nb-whisper-large")
# Transcribe
asr("king.mp3", generate_kwargs={'task': 'transcribe', 'language': 'no'})
```
<details>
<summary>Expected output</summary>
```json
{'text': ' Nordmenn er nordlendinger, trøndere, sørlendinger og folk fra alle andre regioner. Nordmenn er også innvandret fra Afghanistan, Pakistan, Polen, Sverige, Somalia og Syria. Det er ikke alltid så lett å si hvor vi er fra, hvilken nasjonalitet vi er fra. Hvilken nasjonalitet vi er fra. Hvilken nasjonalitet vi er fra. Hvilken nasjonalitet vi er fra. Hvilken nasjonalitet vi er fra. Hvilken nasjonalitet vi er fra. Hvilken nasjonalitet vi er fra.'}
```
</details>
#### Extended HuggingFace
Examining the output above, we see that there are multiple repetitions at the end. This is because the audio clip is longer than 30 seconds. By passing the ```chunk_length_s``` argument, we can transcribe longer files. Our experience is that we get slightly better results by setting it to 28 seconds instead of the default 30 seconds. We also recommend setting the beam size to 5 if possible: this greatly increases the accuracy, but takes a bit longer and requires slightly more memory. The examples below also illustrate how to transcribe to English or Nynorsk, and how to get timestamps for sentences and words.
```python
# Long Transcripts
asr("king.mp3", chunk_length_s=28, generate_kwargs={'task': 'transcribe', 'language': 'no'})
# Increase accuracy by setting beam size to 5
asr("king.mp3", chunk_length_s=28, return_timestamps=True, generate_kwargs={'num_beams': 5, 'task': 'transcribe', 'language': 'no'})
# Return Timestamps
asr("king.mp3", chunk_length_s=28, return_timestamps=True, generate_kwargs={'task': 'transcribe', 'language': 'no'})
# Return Word Level Timestamps
asr("king.mp3", chunk_length_s=28, return_timestamps="word", generate_kwargs={'task': 'transcribe', 'language': 'no'})
# Transcribe to Nynorsk
asr("king.mp3", chunk_length_s=28, generate_kwargs={'task': 'transcribe', 'language': 'nn'})
# Transcribe to English
asr("king.mp3", chunk_length_s=28, generate_kwargs={'task': 'transcribe', 'language': 'en'})
```
<details>
<summary>Expected output</summary>
Long transcripts:
```json
{'text': ' Nordmenn er nordlendinger, trøndere, sørlendinger og folk fra alle andre regioner. Nordmenn er også innvandret fra Afghanistan, Pakistan, Polen, Sverige, Somalia og Syria. Det er ikke alltid så lett å si hvor vi er fra, hvilken nasjonalitet vi er fra. Hvilken nasjonalitet vi er fra. Hvilken nasjonalitet vi er fra. Hvilken nasjonalitet vi er fra. Hvilken nasjonalitet vi er fra. Hvilken nasjonalitet vi er fra, hvilken nasjonalitet vi tilhører. Det vi kaller hjem, er der hjertet vårt er, og det kan ikke alltid plasseres innenfor landegrenser. Nordmenn er jenter som er glad i jenter, gutter som er glad i gutter, og jenter og gutter som er glad i hverandre. Nordmenn trommer på Gud, Allah, Altet og ingenting. Nordmenn liker Grieg, Kygo, Helbilis og Kari Bremnes. Med andre ord, Norge er dere. Norge er oss. Mitt største håp for Norge er at vi skal klare å ta vare på hverandre, at vi skal bygge dette landet videre på tillit, fellesskap og raushet.'}
```
Timestamps:
```json
{'text': ' Nordmenn er nordlendinger, trøndere, sørlendinger og folk fra alle andre regioner. Nordmenn er også innvandret fra Afghanistan, Pakistan, Polen, Sverige, Somalia og Syria. Det er ikke alltid så lett å si hvor vi er fra, hvilken nasjonalitet vi er fra. Hvilken nasjonalitet vi er fra. hvilken nasjonalitet vi tilhører. Det vi kaller hjem, er der hjertet vårt er, og det kan ikke alltid plasseres innenfor landegrenser. Nordmenn er jenter som er glad i jenter, gutter som er glad i gutter, og jenter og gutter som er glad i hverandre. Nordmenn trommer på Gud, Allah, Altet og ingenting. Nordmenn liker Grieg, Kygo, Helbiles og Kari Bremnes. Med andre ord, Norge er dere. Norge er oss. Mitt største håp for Norge er at vi skal klare å ta vare på hverandre, at vi skal bygge dette landet videre på tillit, fellesskap og raushet.',
'chunks': [{'timestamp': (0.0, 5.46),
'text': ' Nordmenn er nordlendinger, trøndere, sørlendinger'},
{'timestamp': (5.52, 8.68), 'text': ' og folk fra alle andre regioner.'},
{'timestamp': (8.68, 16.64),
'text': ' Nordmenn er også innvandret fra Afghanistan, Pakistan, Polen, Sverige, Somalia og Syria.'},
{'timestamp': (16.64, 13.3),
'text': ' Det er ikke alltid så lett å si hvor vi er fra, hvilken nasjonalitet vi er fra.'},
{'timestamp': (13.32, 30.28),
'text': ' Hvilken nasjonalitet vi er fra. hvilken nasjonalitet vi tilhører.'},
{'timestamp': (32.52, 39.16),
'text': ' Det vi kaller hjem, er der hjertet vårt er, og det kan ikke alltid plasseres'},
{'timestamp': (39.16, 42.0), 'text': ' innenfor landegrenser.'},
{'timestamp': (42.0, 46.74),
'text': ' Nordmenn er jenter som er glad i jenter, gutter som er glad i gutter,'},
{'timestamp': (46.74, 51.12),
'text': ' og jenter og gutter som er glad i hverandre.'},
{'timestamp': (51.16, 57.42),
'text': ' Nordmenn trommer på Gud, Allah, Altet og ingenting.'},
{'timestamp': (57.42, 64.3),
'text': ' Nordmenn liker Grieg, Kygo, Helbiles og Kari Bremnes.'},
{'timestamp': (64.34, 71.24),
'text': ' Med andre ord, Norge er dere. Norge er oss.'},
{'timestamp': (71.24, 78.04),
'text': ' Mitt største håp for Norge er at vi skal klare å ta vare på hverandre,'},
{'timestamp': (78.12, 84.68),
'text': ' at vi skal bygge dette landet videre på tillit, fellesskap og raushet.'}]}
```
Word Level Timestamps:
```json
{"text": "Nordmenn er nordlendinger, trøndere, sørlendinger og folk fra alle andre regioner. Nordmenn er også innvandret fra Afghanistan, Pakistan, Polen, Sverige, Somalia og Syria. Det er ikke alltid så lett å si hvor vi er fra, hvilken nasjonalitet vi tilhører. Det vi kaller hjem, er der hjertet vårt er, og det kan ikke alltid plasseres innenfor landegrenser. Nordmenn er jenter som er glad i jenter, gutter som er glad i gutter, og jenter og gutter som er glad i hverandre. Nordmenn trommer på Gud, Allah, Altet og ingenting. Nordmenn liker Grieg, Kygo, Helbilis og Kari Bremnes. Med andre ord, Norge er dere. Norge er oss. Mitt største håp for Norge er at vi skal klare å ta vare på hverandre, at vi skal bygge dette landet videre på tillit, fellesskap og raushet.",
"chunks": [
{"text": "Nordmenn", "timestamp": [0.72, 1.42]},
{"text": "er", "timestamp": [1.42, 1.74]},
// ... more chunks ...
{"text": "raushet.", "timestamp": [83.1, 84.88]}
]
}
```
Nynorsk:
```json
{"text": "Nordmenn er nordlendingar, trøndarar, sørlendingar og folk frå alle andre regionar. Nordmenn er også innvandra frå Afghanistan, Pakistan, Polen, Sverige, Somalia og Syria. Det er ikkje alltid så lett å seie kvar vi er frå, kva nasjonalitet vi tilhøyrer. Det vi kallar heim, er der hjartet vårt er, og det kan ikkje alltid plasserast innanfor landegrenser. Nordmenn er jenter som er glad i jenter, gutar som er glade i gutar, og jenter og gutar som er glade i kvarandre. Nordmenn trommar på Gud, Allah, Altet og ingenting. Nordmenn liker Grieg, Kygo, Helbiles og Kari Bremnes. Med andre ord, Noreg er dere! Noreg er oss. Mitt største håp for Noreg er at vi skal klare å ta vare på kvarandre, at vi skal byggje dette landet vidare på tillit, fellesskap og raushet."}
```
English:
```json
{"text": "Norwegians are Norwegians, trønders, southerners and people from all other regions. Norwegians are also invaded from Afghanistan, Pakistan, Poland, Sweden, Somalia and Suria. It is not always so easy to say where we are from, what nationality we belong to. What we call home is where our heart is, and it cannot always be placed within national borders. Norwegians are girls who like girls, boys who like boys, and girls and boys who like each other. Norwegians thrump on God, Allah, Altet and nothing. Norwegians like Grieg, Kygo, Helbilis and Kari Bremnes. In other words, Norway is you. Norway is us. My biggest hope for Norway is that we should be able to take care of each other, that we should build this country on trust, community and generosity."}
```
</details>
### Whisper CPP
Whisper CPP is a C++ implementation of the Whisper model, offering the same functionalities with the added benefits of C++ efficiency and performance optimizations. This allows embedding any Whisper model into a binary file, facilitating the development of real applications. However, it requires some familiarity with compiling C++ programs. Their [homepage](https://github.com/ggerganov/whisper.cpp) provides examples of how to build applications, including real-time transcription.
We have converted this model to the ggml-format model used by Whisper CPP binaries. The file can be downloaded [here](blob/main/ggml-model.bin), and a `q5_0` quantized version is also available [here](blob/main/ggml-model-q5_0.bin).
```bash
# We can download and compile whisper.cpp
$ git clone --depth 1 https://github.com/ggerganov/whisper.cpp --branch v1.5.1
$ cd whisper.cpp/
$ make
# We also need to convert the audio to WAV as that is the only format supported by whisper.cpp
$ wget -N https://github.com/NbAiLab/nb-whisper/raw/main/audio/king.mp3
$ ffmpeg -i king.mp3 -ar 16000 -ac 1 -c:a pcm_s16le king.wav
# Let's download the two ggml files from this page
$ wget -N https://huggingface.co/NbAiLab/nb-whisper-large/resolve/main/ggml-model.bin -O models/nb-large-ggml-model.bin
$ wget -N https://huggingface.co/NbAiLab/nb-whisper-large/resolve/main/ggml-model-q5_0.bin -O models/nb-large-ggml-model-q5_0.bin
# And run it with the f16 default model
$ ./main -l no -m models/nb-large-ggml-model.bin king.wav
# Or the quantized version
$ ./main -l no -m models/nb-large-ggml-model-q5_0.bin king.wav
```
### WhisperX and Speaker Diarization
Speaker diarization is a technique in natural language processing and automatic speech recognition that identifies and separates different speakers in an audio recording. It segments the audio into parts based on who is speaking, enhancing the quality of transcribing meetings or phone calls. We find that [WhisperX](https://github.com/m-bain/whisperX) is the easiest way to use our models for diarizing speech. In addition, WhisperX uses phoneme-based Wav2Vec models for improving the alignment of the timestamps. As of December 2023 it also has native support for using the nb-wav2vec models. It currently uses [PyAnnote-audio](https://github.com/pyannote/pyannote-audio) for doing the actual diarization. This package has a fairly strict license that requires you to agree to its user terms. Follow the instructions below.
```bash
# Follow the install instructions on https://github.com/m-bain/whisperX
# Make sure you have a HuggingFace account and have agreed to the pyannote terms
# Log in (or supply HF Token in command line)
huggingface-cli login
# Download a test file
wget -N https://github.com/NbAiLab/nb-whisper/raw/main/audio/knuthamsun.mp3
# Optional. If you get complaints about missing support for Norwegian, do:
pip uninstall whisperx && pip install git+https://github.com/m-bain/whisperx.git@8540ff5985fceee764acbed94f656063d7f56540
# Transcribe the test file. All transcripts will end up in the directory of the mp3-file
whisperx knuthamsun.mp3 --model NbAiLab/nb-whisper-large --language no --diarize
```
You can also run WhisperX from Python. Please take a look at the instructions on [WhisperX homepage](https://github.com/m-bain/whisperX).
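For reference, a typical WhisperX Python workflow looks like the sketch below. This is a minimal, hedged example based on the WhisperX README at the time of writing: module layout and function signatures vary between WhisperX versions, `YOUR_HF_TOKEN` is a placeholder, and loading the NB-Whisper model id directly depends on the backend (the faster-whisper backend expects CTranslate2-converted weights).

```python
import whisperx

device = "cuda"
audio_file = "knuthamsun.mp3"

# 1. Transcribe (assumes the model id is loadable by the whisperx backend)
model = whisperx.load_model("NbAiLab/nb-whisper-large", device, language="no")
audio = whisperx.load_audio(audio_file)
result = model.transcribe(audio, batch_size=16)

# 2. Align for more accurate word-level timestamps (uses a wav2vec model)
align_model, metadata = whisperx.load_align_model(language_code="no", device=device)
result = whisperx.align(result["segments"], align_model, metadata, audio, device)

# 3. Diarize and assign speaker labels (requires agreeing to the pyannote terms)
diarize_model = whisperx.DiarizationPipeline(use_auth_token="YOUR_HF_TOKEN", device=device)
diarize_segments = diarize_model(audio)
result = whisperx.assign_word_speakers(diarize_segments, result)

print(result["segments"])
```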
### API
Instructions for accessing the models via a simple API are included in the demos under Spaces. Note that these demos are temporary and will only be available for a few weeks.
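As an illustration of the general pattern, audio can also be posted directly to the standard HuggingFace hosted Inference API. This is a sketch, not the temporary demo API mentioned above: `YOUR_HF_TOKEN` is a placeholder, and whether a model of this size is served by the hosted API at any given time may vary.

```python
import requests

API_URL = "https://api-inference.huggingface.co/models/NbAiLab/nb-whisper-large"
headers = {"Authorization": "Bearer YOUR_HF_TOKEN"}  # placeholder token

# Send the raw audio bytes; the API returns a JSON object with the transcript
with open("king.mp3", "rb") as f:
    response = requests.post(API_URL, headers=headers, data=f.read())

print(response.json())  # e.g. {"text": "..."}
```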
## Training Data
The training data originates from Språkbanken and the National Library of Norway's digital collection, including:
- NST Norwegian ASR Database (16 kHz) and its corresponding dataset
- Transcribed speeches from the Norwegian Parliament by Språkbanken
- TV broadcast (NRK) subtitles (NLN digital collection)
- Audiobooks (NLN digital collection)
## Downstream Use
The models, especially the smaller ones, may exhibit occasional hallucinations and may drop parts of the transcript. They are designed to convert spoken language into grammatically correct written sentences, which might not always be word-for-word translations. We have made two extra model variants for users who want a different transcription style. We encourage users to try the models themselves to get a better understanding.
## Bias, Risks, and Limitations
Using these models without adequate risk assessment and mitigation could be considered irresponsible. They may contain biases or other undesirable distortions. Users who deploy these models or integrate them into systems or services are responsible for mitigating risks and complying with applicable AI regulations. The National Library of Norway, as the model owner, disclaims liability for any outcomes resulting from third-party use of these models.
### Software
The model was trained using Jax/Flax and converted to PyTorch, Tensorflow, whisper.cpp, and ONNX formats. These are available under `Files and versions`. We welcome requests for conversion to other formats. All training code and scripts are released under the Apache License 2.0 in the GitHub repository [nb-whisper](https://github.com/NbAiLab/nb-whisper/).
## Citation & Contributors
The NB-Whisper Large model is a product of the NoSTram project led by Per Egil Kummervold ([@pere](https://huggingface.co/pere)) at the National Library of Norway. Key contributors include Javier de la Rosa ([@versae](https://huggingface.co/versae)), Freddy Wetjen ([@freddyw](https://huggingface.co/freddyw)), and Rolv-Arild Braaten ([@Rolv-Arild](https://huggingface.co/Rolv-Arild)). NB AI-Lab, under the direction of Svein Arne Brygfjeld ([@Brygfjeld](https://huggingface.co/Brygfjeld)), supported the project's successful completion. A detailed paper on our process and findings is forthcoming.
## Disclaimer
The models published in this repository are intended for a generalist purpose and are available to third parties. These models may have bias and/or any other undesirable distortions. When third parties deploy or provide systems and/or services to other parties using any of these models (or using systems based on these models), or become users of the models, they should note that it is their responsibility to mitigate the risks arising from their use and, in any event, to comply with applicable regulations, including regulations regarding the use of artificial intelligence. In no event shall the owner of the models (the National Library of Norway) be liable for any results arising from the use made by third parties of these models.
## Acknowledgements
Our gratitude extends to [Google TPU Research Cloud](https://sites.research.google/trc/about/) for training resources, Google Cloud for translation credits, and HuggingFace's Sanchit Gandhi for technical support. A special thank you to Per Erik Solberg at Språkbanken for the collaboration on the Stortinget corpus.
## Contact
For feedback, technical concerns, or collaboration inquiries, please contact <a rel="noopener nofollow" href="mailto:[email protected]">[email protected]</a>. If you plan to include this model in your research, contact us for the latest information on our upcoming paper for citation purposes.
| {"language": ["no", "nb", "nn", "en"], "license": "apache-2.0", "library_name": "transformers", "tags": ["audio", "asr", "automatic-speech-recognition", "hf-asr-leaderboard"], "datasets": ["NbAiLab/ncc_speech", "NbAiLab/NST", "NbAiLab/NPSC"], "metrics": ["wer", "cer"], "base_model": "openai/whisper-large", "pipeline_tag": "automatic-speech-recognition", "widget": [{"src": "https://datasets-server.huggingface.co/assets/google/fleurs/--/nb_no/train/1/audio/audio.mp3", "example_title": "FLEURS sample 1"}, {"src": "https://datasets-server.huggingface.co/assets/google/fleurs/--/nb_no/train/4/audio/audio.mp3", "example_title": "FLEURS sample 2"}]} | automatic-speech-recognition | NbAiLab/nb-whisper-large | [
"transformers",
"pytorch",
"jax",
"tensorboard",
"onnx",
"safetensors",
"whisper",
"automatic-speech-recognition",
"audio",
"asr",
"hf-asr-leaderboard",
"no",
"nb",
"nn",
"en",
"dataset:NbAiLab/ncc_speech",
"dataset:NbAiLab/NST",
"dataset:NbAiLab/NPSC",
"arxiv:2212.04356",
"base_model:openai/whisper-large",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] | 2024-02-13T10:07:22+00:00 | [
"2212.04356"
] | [
"no",
"nb",
"nn",
"en"
] | TAGS
#transformers #pytorch #jax #tensorboard #onnx #safetensors #whisper #automatic-speech-recognition #audio #asr #hf-asr-leaderboard #no #nb #nn #en #dataset-NbAiLab/ncc_speech #dataset-NbAiLab/NST #dataset-NbAiLab/NPSC #arxiv-2212.04356 #base_model-openai/whisper-large #license-apache-2.0 #endpoints_compatible #region-us
| NB-Whisper Large
================
Introducing the *Norwegian NB-Whisper Large model*, proudly developed by the National Library of Norway. NB-Whisper is a cutting-edge series of models designed for automatic speech recognition (ASR) and speech translation. These models are based on the work of OpenAI's Whisper. Each model in the series has been trained for 250,000 steps, utilizing a diverse dataset of 8 million samples. These samples consist of aligned audio clips, each 30 seconds long, culminating in a staggering 66,000 hours of speech. For an in-depth understanding of our training methodology and dataset composition, keep an eye out for our upcoming article.
| Model Size | Parameters | Model |
|------------|------------|--------------------|
| Tiny | 39M | NB-Whisper Tiny |
| Base | 74M | NB-Whisper Base |
| Small | 244M | NB-Whisper Small |
| Medium | 769M | NB-Whisper Medium |
| Large | 1550M | NB-Whisper Large |
### Verbatim Model
While the main models are suitable for most transcription tasks, we demonstrate how easy it is to change the output of the main model. The following models are trained 250 additional steps from the main models above, and might be suitable for more targeted use cases:
* Verbatim version: This lower-cased variant is more literal and suitable for tasks requiring detailed transcription, such as linguistic analysis.
| Model Size | Parameters | Semantic version |
|------------|------------|-------------------|
| Tiny | 39M | Tiny - semantic |
| Base | 74M | Base - semantic |
| Small | 244M | Small - semantic |
| Medium | 769M | Medium - semantic |
| Large | 1550M | Large - semantic |
### Model Description
* Developed by: NB AI-Lab
* Shared by: NB AI-Lab
* Model type: 'whisper'
* Language(s) (NLP): Norwegian, Norwegian Bokmål, Norwegian Nynorsk, English
* License: Apache 2.0
* Trained from model: openai/whisper-large
* Code Repository: URL
* Paper: *Coming soon*
* Demo: *See Spaces on this page*
How to Use the Models
---------------------
### Online Demos
You can try the models directly through the HuggingFace Inference API, accessible on the right side of this page. Be aware that initially, the model needs to load and will run on limited CPU capacity, which might be slow. To enhance your experience, we are temporarily hosting some models on TPUs for a few days, significantly boosting their performance. Explore these under the Spaces section on the Main Page.
### Local Setup with HuggingFace
Alternatively, you can run the models locally. The Tiny, Base, and Small models are optimized for CPU execution. For the Medium and Large models, we recommend a system equipped with a GPU to ensure efficient processing. Setting up and using these models with HuggingFace's Transformers is straightforward, provided you have Python installed on your machine. For practical demonstrations, refer to examples using this sample mp3 file.
After this is done, you should be able to run this in Python:
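The code block itself did not survive this rendering; the following is a minimal sketch mirroring the example given for the other models in this series (the pipeline task and keyword arguments are taken from those examples):

```python
from transformers import pipeline

# Load the model
asr = pipeline("automatic-speech-recognition", "NbAiLab/nb-whisper-large")

# Transcribe the sample file to Norwegian Bokmål
asr("king.mp3", generate_kwargs={'task': 'transcribe', 'language': 'no'})
```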
Expected output
#### Extended HuggingFace
Examining the output above, we see that there are multiple repetitions at the end. This is because the video is longer than 30 seconds. By passing the chunk_length_s argument, we can transcribe longer files. Our experience is that we get slightly better results by setting that to 28 seconds instead of the default 30 seconds. We also recommend setting the beam size to 5 if possible. This greatly increases the accuracy but takes a bit longer and requires slightly more memory. The examples below also illustrate how to transcribe to English or Nynorsk, and how to get timestamps for sentences and words.
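A sketch of the invocations described above, following the pattern used for the other models in this series:

```python
# Long transcripts: chunk the audio into 28-second windows
asr("king.mp3", chunk_length_s=28, generate_kwargs={'task': 'transcribe', 'language': 'no'})

# Higher accuracy with beam search (slower, needs more memory)
asr("king.mp3", chunk_length_s=28, return_timestamps=True,
    generate_kwargs={'num_beams': 5, 'task': 'transcribe', 'language': 'no'})

# Sentence-level and word-level timestamps
asr("king.mp3", chunk_length_s=28, return_timestamps=True, generate_kwargs={'task': 'transcribe', 'language': 'no'})
asr("king.mp3", chunk_length_s=28, return_timestamps="word", generate_kwargs={'task': 'transcribe', 'language': 'no'})

# Nynorsk and English output
asr("king.mp3", chunk_length_s=28, generate_kwargs={'task': 'transcribe', 'language': 'nn'})
asr("king.mp3", chunk_length_s=28, generate_kwargs={'task': 'transcribe', 'language': 'en'})
```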
Expected output
Long transcripts:
Timestamps:
Word Level Timestamps:
Nynorsk:
English:
### Whisper CPP
Whisper CPP is a C++ implementation of the Whisper model, offering the same functionalities with the added benefits of C++ efficiency and performance optimizations. This allows embedding any Whisper model into a binary file, facilitating the development of real applications. However, it requires some familiarity with compiling C++ programs. Their homepage provides examples of how to build applications, including real-time transcription.
We have converted this model to the ggml format used by Whisper CPP binaries. The file can be downloaded here, and a 'q5_0' quantized version is also available here.
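As a sketch only, one way to fetch the converted file and run it through a locally compiled whisper.cpp binary (the file name "ggml-model.bin" and the binary path "./main" are assumptions; adjust both to the actual repository contents and your build):

```python
import subprocess
from huggingface_hub import hf_hub_download

# File name is an assumption; check the repository's "Files and versions" tab.
ggml_path = hf_hub_download("NbAiLab/nb-whisper-large", "ggml-model.bin")

# whisper.cpp expects 16 kHz mono WAV input; convert mp3 files first (e.g. with ffmpeg).
subprocess.run(["./main", "-m", ggml_path, "-l", "no", "-f", "king.wav"], check=True)
```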
### WhisperX and Speaker Diarization
Speaker diarization is a technique in natural language processing and automatic speech recognition that identifies and separates different speakers in an audio recording. It segments the audio into parts based on who is speaking, enhancing the quality of transcribing meetings or phone calls. We find that WhisperX is the easiest way to use our models for diarizing speech. In addition, WhisperX uses phoneme-based Wav2Vec models to improve the alignment of the timestamps. As of December 2023 it also has native support for the nb-wav2vec models. It currently uses PyAnnote-audio for the actual diarization. This package has a fairly strict licence that requires you to agree to its user terms. Follow the instructions below.
You can also run WhisperX from Python. Please take a look at the instructions on WhisperX homepage.
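A sketch based on the WhisperX README at the time of writing (function names can change between versions, diarization requires accepting the pyannote user terms and a Hugging Face access token, and WhisperX loads faster-whisper/CTranslate2 checkpoints, so a converted variant of this model may be needed):

```python
import whisperx

device = "cuda"
audio = whisperx.load_audio("king.mp3")

# Model identifier is an assumption; WhisperX expects a CTranslate2-format checkpoint.
model = whisperx.load_model("NbAiLab/nb-whisper-large", device)
result = model.transcribe(audio)

# Diarize and attach speaker labels to the transcribed words.
diarize_model = whisperx.DiarizationPipeline(use_auth_token="YOUR_HF_TOKEN", device=device)
diarize_segments = diarize_model(audio)
result = whisperx.assign_word_speakers(diarize_segments, result)
```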
### API
Instructions for accessing the models via a simple API are included in the demos under Spaces. Note that these demos are temporary and will only be available for a few weeks.
Training Data
-------------
The training data originates from Språkbanken and the National Library of Norway's digital collection, including:
* NST Norwegian ASR Database (16 kHz) and its corresponding dataset
* Transcribed speeches from the Norwegian Parliament by Språkbanken
* TV broadcast (NRK) subtitles (NLN digital collection)
* Audiobooks (NLN digital collection)
Downstream Use
--------------
The models, especially the smaller ones, may exhibit occasional hallucinations and may drop parts of the transcript. They are designed to convert spoken language into grammatically correct written sentences, which might not always be word-for-word translations. We have made two extra model variants for users who want a different transcription style. We encourage users to try the models themselves to get a better understanding.
Bias, Risks, and Limitations
----------------------------
Using these models without adequate risk assessment and mitigation could be considered irresponsible. They may contain biases or other undesirable distortions. Users who deploy these models or integrate them into systems or services are responsible for mitigating risks and complying with applicable AI regulations. The National Library of Norway, as the model owner, disclaims liability for any outcomes resulting from third-party use of these models.
### Software
The model was trained using Jax/Flax and converted to PyTorch, Tensorflow, URL, and ONNX formats. These are available under 'Files and versions'. We welcome requests for conversion to other formats. All training code and scripts are released under the Apache License 2.0 in the GitHub repository nb-whisper.
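For instance, assuming the repository ships Flax weights alongside the PyTorch checkpoint (as the format list above suggests), loading the model natively in Flax might look like this:

```python
from transformers import FlaxWhisperForConditionalGeneration, WhisperProcessor

processor = WhisperProcessor.from_pretrained("NbAiLab/nb-whisper-large")
model = FlaxWhisperForConditionalGeneration.from_pretrained("NbAiLab/nb-whisper-large")
```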
Contributors
------------

The NB-Whisper Large model is a product of the NoSTram project led by Per Egil Kummervold (@pere) at the National Library of Norway. Key contributors include Javier de la Rosa (@versae), Freddy Wetjen (@freddyw), and Rolv-Arild Braaten (@Rolv-Arild). NB AI-Lab, under the direction of Svein Arne Brygfjeld (@Brygfjeld), supported the project's successful completion. A detailed paper on our process and findings is forthcoming.
Disclaimer
----------
The models published in this repository are intended for a generalist purpose and are available to third parties. These models may have bias and/or other undesirable distortions. When third parties deploy or provide systems and/or services to other parties using any of these models (or using systems based on these models), or become users of the models, they should note that it is their responsibility to mitigate the risks arising from their use and, in any event, to comply with applicable regulations, including regulations regarding the use of artificial intelligence. In no event shall the owner of the models (The National Library of Norway) be liable for any results arising from the use made by third parties of these models.
Acknowledgements
----------------
Our gratitude extends to Google TPU Research Cloud for training resources, Google Cloud for translation credits, and HuggingFace's Sanchit Gandhi for technical support. A special thank you to Per Erik Solberg at Språkbanken for the collaboration on the Stortinget corpus.
Contact
-------
For feedback, technical concerns, or collaboration inquiries, please contact [ailab@URL](mailto:ailab@URL). If you plan to include this model in your research, contact us for the latest information on our upcoming paper for citation purposes.
| [
"### Verbatim Model\n\n\nWhile the main models are suitable for most transcription task, we demonstrate how easy it is to change the output of the main model. The following models are trained 250 additional steps from the main models above, and might be suitable for more targetted use cases:\n\n\n* Verbatim version: This lower-cased variant is more literal and suitable for tasks requiring detailed transcription, such as linguistic analysis.\n\n\nModel Size: Tiny, Parameters: 39M, Semantic version: Tiny - semantic\nModel Size: Base, Parameters: 74M, Semantic version: Base - semantic\nModel Size: Small, Parameters: 244M, Semantic version: Small - semantic\nModel Size: Medium, Parameters: 769M, Semantic version: Medium - semantic\nModel Size: Large, Parameters: 1550M, Semantic version: Large - semantic",
"### Model Description\n\n\n* Developed by: NB AI-Lab\n* Shared by: NB AI-Lab\n* Model type: 'whisper'\n* Language(s) (NLP): Norwegian, Norwegian Bokmål, Norwegian Nynorsk, English\n* License: Apache 2.0\n* Trained from model: openai/whisper-large\n* Code Repository: URL\n* Paper: *Coming soon*\n* Demo: *See Spaces on this page*\n\n\nHow to Use the Models\n---------------------",
"### Online Demos\n\n\nYou can try the models directly through the HuggingFace Inference API, accessible on the right side of this page. Be aware that initially, the model needs to load and will run on limited CPU capacity, which might be slow. To enhance your experience, we are temporarily hosting some models on TPUs for a few days, significantly boosting their performance. Explore these under the Spaces section on the Main Page.",
"### Local Setup with HuggingFace\n\n\nAlternatively, you can run the models locally. The Tiny, Base, and Small models are optimized for CPU execution. For the Medium and Large models, we recommend a system equipped with a GPU to ensure efficient processing. Setting up and using these models with HuggingFace's Transformers is straightforward, provided you have Python installed on your machine. For practical demonstrations, refer to examples using this sample mp3 file.\n\n\nAfter this is done, you should be able to run this in Python:\n\n\n\nExpected output",
"#### Extended HuggingFace\n\n\nExamining the output above, we see that there are multiple repetitions at the end. This is because the video is longer than 30 seconds. By passing the argument, we can transcribe longer file. Our experience is that we get slightly better result by setting that to 28 seconds instead of the default 30 seconds. We also recommend setting the beam size to 5 if possible. This greatly increases the accuracy but takes a bit longer and requires slightly more memory. The examples below also illustrates how to transcribe to English or Nynorsk, and how to get timestamps for sentences and words.\n\n\n\nExpected output\nLong transcripts:\n\n\nTimestamps:\n\n\nWord Level Timestamps:\n\n\nNynorsk:\n\n\nEnglish:",
"### Whisper CPP\n\n\nWhisper CPP is a C++ implementation of the Whisper model, offering the same functionalities with the added benefits of C++ efficiency and performance optimizations. This allows embedding any Whisper model into a binary file, facilitating the development of real applications. However, it requires some familiarity with compiling C++ programs. Their homepage provides examples of how to build applications, including real-time transcription.\n\n\nWe have converted this model to the ggml-format model used by Whisper CPP binaries. The file can be downloaded here, and a 'q5\\_0' quantized version is also available here.",
"### WhisperX and Speaker Diarization\n\n\nSpeaker diarization is a technique in natural language processing and automatic speech recognition that identifies and separates different speakers in an audio recording. It segments the audio into parts based on who is speaking, enhancing the quality of transcribing meetings or phone calls. We find that WhisperX is the easiest way to use our models for diarizing speech. In addition, WhisperX is using phoneme-based Wav2Vec-models for improving the alignment of the timestamps. As of December 2023 it also has native support for using the nb-wav2vec-models. It currently uses PyAnnote-audio for doing the actual diarization. This package has a fairly strict licence where you have to agree to user terms. Follow the instructions below.\n\n\nYou can also run WhisperX from Python. Please take a look at the instructions on WhisperX homepage.",
"### API\n\n\nInstructions for accessing the models via a simple API are included in the demos under Spaces. Note that these demos are temporary and will only be available for a few weeks.\n\n\nTraining Data\n-------------\n\n\nThe training data originates from Språkbanken and the National Library of Norway's digital collection, including:\n\n\n* NST Norwegian ASR Database (16 kHz) and its corresponding dataset\n* Transcribed speeches from the Norwegian Parliament by Språkbanken\n* TV broadcast (NRK) subtitles (NLN digital collection)\n* Audiobooks (NLN digital collection)\n\n\nDownstream Use\n--------------\n\n\nThe models, especially the smaller ones, may exhibit occasional hallucinations and may drop parts of the transcript. They are designed to convert spoken language into grammatically correct written sentences, which might not always be word-for-word translations. We have made two extra model variant for users that want a different transcription style. We encourage users to try the models themselves to get a better understanding.\n\n\nBias, Risks, and Limitations\n----------------------------\n\n\nUsing these models without adequate risk assessment and mitigation could be considered irresponsible. They may contain biases or other undesirable distortions. Users who deploy these models or integrate them into systems or services are responsible for mitigating risks and complying with applicable AI regulations. The National Library of Norway, as the model owner, disclaims liability for any outcomes resulting from third-party use of these models.",
"### Software\n\n\nThe model was trained using Jax/Flax and converted to PyTorch, Tensorflow, URL, and ONXX formats. These are available under 'Files and versions'. We welcome requests for conversion to other formats. All training code and scripts are released under the Apache License 2.0 in the GitHub repository nb-whisper.\n\n\n& Contributors\nThe NB-Whisper Large model is a product of the NoSTram project led by Per Egil Kummervold (@pere) at the National Library of Norway. Key contributors include Javier de la Rosa (@versae), Freddy Wetjen (@freddyw), and Rolv-Arild Braaten (@Rolv-Arild). NB AI-Lab, under the direction of Svein Arne Brygfjeld (@Brygfjeld), supported the project's successful completion. A detailed paper on our process and findings is forthcoming.\n\n\nDisclaimer\n----------\n\n\nThe models published in this repository are intended for a generalist purpose and are available to third parties. These models may have bias and/or any other undesirable distortions. When third parties, deploy or provide systems and/or services to other parties using any of these models (or using systems based on these models) or become users of the models, they should note that it is their responsibility to mitigate the risks arising from their use and, in any event, to comply with applicable regulations, including regulations regarding the use of artificial intelligence. In no event shall the owner of the models (The National Library of Norway) be liable for any results arising from the use made by third parties of these models.\n\n\nAcknowledgements\n----------------\n\n\nOur gratitude extends to Google TPU Research Cloud for training resources, Google Cloud for translation credits, and HuggingFace's Sanchit Ghandi for technical support. A special thank you to Per Erik Solberg at Språkbanken for the collaboration on the Stortinget corpus.\n\n\nContact\n-------\n\n\nFor feedback, technical concerns, or collaboration inquiries, please contact [ailab@URL](mailto:ailab@URL). If you plan to include this model in your research, contact us for the latest information on our upcoming paper for citation purposes."
] | [
"TAGS\n#transformers #pytorch #jax #tensorboard #onnx #safetensors #whisper #automatic-speech-recognition #audio #asr #hf-asr-leaderboard #no #nb #nn #en #dataset-NbAiLab/ncc_speech #dataset-NbAiLab/NST #dataset-NbAiLab/NPSC #arxiv-2212.04356 #base_model-openai/whisper-large #license-apache-2.0 #endpoints_compatible #region-us \n",
"### Verbatim Model\n\n\nWhile the main models are suitable for most transcription task, we demonstrate how easy it is to change the output of the main model. The following models are trained 250 additional steps from the main models above, and might be suitable for more targetted use cases:\n\n\n* Verbatim version: This lower-cased variant is more literal and suitable for tasks requiring detailed transcription, such as linguistic analysis.\n\n\nModel Size: Tiny, Parameters: 39M, Semantic version: Tiny - semantic\nModel Size: Base, Parameters: 74M, Semantic version: Base - semantic\nModel Size: Small, Parameters: 244M, Semantic version: Small - semantic\nModel Size: Medium, Parameters: 769M, Semantic version: Medium - semantic\nModel Size: Large, Parameters: 1550M, Semantic version: Large - semantic",
"### Model Description\n\n\n* Developed by: NB AI-Lab\n* Shared by: NB AI-Lab\n* Model type: 'whisper'\n* Language(s) (NLP): Norwegian, Norwegian Bokmål, Norwegian Nynorsk, English\n* License: Apache 2.0\n* Trained from model: openai/whisper-large\n* Code Repository: URL\n* Paper: *Coming soon*\n* Demo: *See Spaces on this page*\n\n\nHow to Use the Models\n---------------------",
"### Online Demos\n\n\nYou can try the models directly through the HuggingFace Inference API, accessible on the right side of this page. Be aware that initially, the model needs to load and will run on limited CPU capacity, which might be slow. To enhance your experience, we are temporarily hosting some models on TPUs for a few days, significantly boosting their performance. Explore these under the Spaces section on the Main Page.",
"### Local Setup with HuggingFace\n\n\nAlternatively, you can run the models locally. The Tiny, Base, and Small models are optimized for CPU execution. For the Medium and Large models, we recommend a system equipped with a GPU to ensure efficient processing. Setting up and using these models with HuggingFace's Transformers is straightforward, provided you have Python installed on your machine. For practical demonstrations, refer to examples using this sample mp3 file.\n\n\nAfter this is done, you should be able to run this in Python:\n\n\n\nExpected output",
"#### Extended HuggingFace\n\n\nExamining the output above, we see that there are multiple repetitions at the end. This is because the video is longer than 30 seconds. By passing the argument, we can transcribe longer file. Our experience is that we get slightly better result by setting that to 28 seconds instead of the default 30 seconds. We also recommend setting the beam size to 5 if possible. This greatly increases the accuracy but takes a bit longer and requires slightly more memory. The examples below also illustrates how to transcribe to English or Nynorsk, and how to get timestamps for sentences and words.\n\n\n\nExpected output\nLong transcripts:\n\n\nTimestamps:\n\n\nWord Level Timestamps:\n\n\nNynorsk:\n\n\nEnglish:",
"### Whisper CPP\n\n\nWhisper CPP is a C++ implementation of the Whisper model, offering the same functionalities with the added benefits of C++ efficiency and performance optimizations. This allows embedding any Whisper model into a binary file, facilitating the development of real applications. However, it requires some familiarity with compiling C++ programs. Their homepage provides examples of how to build applications, including real-time transcription.\n\n\nWe have converted this model to the ggml-format model used by Whisper CPP binaries. The file can be downloaded here, and a 'q5\\_0' quantized version is also available here.",
"### WhisperX and Speaker Diarization\n\n\nSpeaker diarization is a technique in natural language processing and automatic speech recognition that identifies and separates different speakers in an audio recording. It segments the audio into parts based on who is speaking, enhancing the quality of transcribing meetings or phone calls. We find that WhisperX is the easiest way to use our models for diarizing speech. In addition, WhisperX is using phoneme-based Wav2Vec-models for improving the alignment of the timestamps. As of December 2023 it also has native support for using the nb-wav2vec-models. It currently uses PyAnnote-audio for doing the actual diarization. This package has a fairly strict licence where you have to agree to user terms. Follow the instructions below.\n\n\nYou can also run WhisperX from Python. Please take a look at the instructions on WhisperX homepage.",
"### API\n\n\nInstructions for accessing the models via a simple API are included in the demos under Spaces. Note that these demos are temporary and will only be available for a few weeks.\n\n\nTraining Data\n-------------\n\n\nThe training data originates from Språkbanken and the National Library of Norway's digital collection, including:\n\n\n* NST Norwegian ASR Database (16 kHz) and its corresponding dataset\n* Transcribed speeches from the Norwegian Parliament by Språkbanken\n* TV broadcast (NRK) subtitles (NLN digital collection)\n* Audiobooks (NLN digital collection)\n\n\nDownstream Use\n--------------\n\n\nThe models, especially the smaller ones, may exhibit occasional hallucinations and may drop parts of the transcript. They are designed to convert spoken language into grammatically correct written sentences, which might not always be word-for-word translations. We have made two extra model variant for users that want a different transcription style. We encourage users to try the models themselves to get a better understanding.\n\n\nBias, Risks, and Limitations\n----------------------------\n\n\nUsing these models without adequate risk assessment and mitigation could be considered irresponsible. They may contain biases or other undesirable distortions. Users who deploy these models or integrate them into systems or services are responsible for mitigating risks and complying with applicable AI regulations. The National Library of Norway, as the model owner, disclaims liability for any outcomes resulting from third-party use of these models.",
"### Software\n\n\nThe model was trained using Jax/Flax and converted to PyTorch, Tensorflow, URL, and ONXX formats. These are available under 'Files and versions'. We welcome requests for conversion to other formats. All training code and scripts are released under the Apache License 2.0 in the GitHub repository nb-whisper.\n\n\n& Contributors\nThe NB-Whisper Large model is a product of the NoSTram project led by Per Egil Kummervold (@pere) at the National Library of Norway. Key contributors include Javier de la Rosa (@versae), Freddy Wetjen (@freddyw), and Rolv-Arild Braaten (@Rolv-Arild). NB AI-Lab, under the direction of Svein Arne Brygfjeld (@Brygfjeld), supported the project's successful completion. A detailed paper on our process and findings is forthcoming.\n\n\nDisclaimer\n----------\n\n\nThe models published in this repository are intended for a generalist purpose and are available to third parties. These models may have bias and/or any other undesirable distortions. When third parties, deploy or provide systems and/or services to other parties using any of these models (or using systems based on these models) or become users of the models, they should note that it is their responsibility to mitigate the risks arising from their use and, in any event, to comply with applicable regulations, including regulations regarding the use of artificial intelligence. In no event shall the owner of the models (The National Library of Norway) be liable for any results arising from the use made by third parties of these models.\n\n\nAcknowledgements\n----------------\n\n\nOur gratitude extends to Google TPU Research Cloud for training resources, Google Cloud for translation credits, and HuggingFace's Sanchit Ghandi for technical support. A special thank you to Per Erik Solberg at Språkbanken for the collaboration on the Stortinget corpus.\n\n\nContact\n-------\n\n\nFor feedback, technical concerns, or collaboration inquiries, please contact [ailab@URL](mailto:ailab@URL). If you plan to include this model in your research, contact us for the latest information on our upcoming paper for citation purposes."
] | [
143,
198,
107,
95,
127,
160,
149,
215,
325,
497
] | [
"passage: TAGS\n#transformers #pytorch #jax #tensorboard #onnx #safetensors #whisper #automatic-speech-recognition #audio #asr #hf-asr-leaderboard #no #nb #nn #en #dataset-NbAiLab/ncc_speech #dataset-NbAiLab/NST #dataset-NbAiLab/NPSC #arxiv-2212.04356 #base_model-openai/whisper-large #license-apache-2.0 #endpoints_compatible #region-us \n### Verbatim Model\n\n\nWhile the main models are suitable for most transcription task, we demonstrate how easy it is to change the output of the main model. The following models are trained 250 additional steps from the main models above, and might be suitable for more targetted use cases:\n\n\n* Verbatim version: This lower-cased variant is more literal and suitable for tasks requiring detailed transcription, such as linguistic analysis.\n\n\nModel Size: Tiny, Parameters: 39M, Semantic version: Tiny - semantic\nModel Size: Base, Parameters: 74M, Semantic version: Base - semantic\nModel Size: Small, Parameters: 244M, Semantic version: Small - semantic\nModel Size: Medium, Parameters: 769M, Semantic version: Medium - semantic\nModel Size: Large, Parameters: 1550M, Semantic version: Large - semantic### Model Description\n\n\n* Developed by: NB AI-Lab\n* Shared by: NB AI-Lab\n* Model type: 'whisper'\n* Language(s) (NLP): Norwegian, Norwegian Bokmål, Norwegian Nynorsk, English\n* License: Apache 2.0\n* Trained from model: openai/whisper-large\n* Code Repository: URL\n* Paper: *Coming soon*\n* Demo: *See Spaces on this page*\n\n\nHow to Use the Models\n---------------------",
"passage: ### Online Demos\n\n\nYou can try the models directly through the HuggingFace Inference API, accessible on the right side of this page. Be aware that initially, the model needs to load and will run on limited CPU capacity, which might be slow. To enhance your experience, we are temporarily hosting some models on TPUs for a few days, significantly boosting their performance. Explore these under the Spaces section on the Main Page.### Local Setup with HuggingFace\n\n\nAlternatively, you can run the models locally. The Tiny, Base, and Small models are optimized for CPU execution. For the Medium and Large models, we recommend a system equipped with a GPU to ensure efficient processing. Setting up and using these models with HuggingFace's Transformers is straightforward, provided you have Python installed on your machine. For practical demonstrations, refer to examples using this sample mp3 file.\n\n\nAfter this is done, you should be able to run this in Python:\n\n\n\nExpected output#### Extended HuggingFace\n\n\nExamining the output above, we see that there are multiple repetitions at the end. This is because the video is longer than 30 seconds. By passing the argument, we can transcribe longer file. Our experience is that we get slightly better result by setting that to 28 seconds instead of the default 30 seconds. We also recommend setting the beam size to 5 if possible. This greatly increases the accuracy but takes a bit longer and requires slightly more memory. The examples below also illustrates how to transcribe to English or Nynorsk, and how to get timestamps for sentences and words.\n\n\n\nExpected output\nLong transcripts:\n\n\nTimestamps:\n\n\nWord Level Timestamps:\n\n\nNynorsk:\n\n\nEnglish:### Whisper CPP\n\n\nWhisper CPP is a C++ implementation of the Whisper model, offering the same functionalities with the added benefits of C++ efficiency and performance optimizations. This allows embedding any Whisper model into a binary file, facilitating the development of real applications. However, it requires some familiarity with compiling C++ programs. Their homepage provides examples of how to build applications, including real-time transcription.\n\n\nWe have converted this model to the ggml-format model used by Whisper CPP binaries. The file can be downloaded here, and a 'q5\\_0' quantized version is also available here.",
"passage: ### WhisperX and Speaker Diarization\n\n\nSpeaker diarization is a technique in natural language processing and automatic speech recognition that identifies and separates different speakers in an audio recording. It segments the audio into parts based on who is speaking, enhancing the quality of transcribing meetings or phone calls. We find that WhisperX is the easiest way to use our models for diarizing speech. In addition, WhisperX is using phoneme-based Wav2Vec-models for improving the alignment of the timestamps. As of December 2023 it also has native support for using the nb-wav2vec-models. It currently uses PyAnnote-audio for doing the actual diarization. This package has a fairly strict licence where you have to agree to user terms. Follow the instructions below.\n\n\nYou can also run WhisperX from Python. Please take a look at the instructions on WhisperX homepage.### API\n\n\nInstructions for accessing the models via a simple API are included in the demos under Spaces. Note that these demos are temporary and will only be available for a few weeks.\n\n\nTraining Data\n-------------\n\n\nThe training data originates from Språkbanken and the National Library of Norway's digital collection, including:\n\n\n* NST Norwegian ASR Database (16 kHz) and its corresponding dataset\n* Transcribed speeches from the Norwegian Parliament by Språkbanken\n* TV broadcast (NRK) subtitles (NLN digital collection)\n* Audiobooks (NLN digital collection)\n\n\nDownstream Use\n--------------\n\n\nThe models, especially the smaller ones, may exhibit occasional hallucinations and may drop parts of the transcript. They are designed to convert spoken language into grammatically correct written sentences, which might not always be word-for-word translations. We have made two extra model variant for users that want a different transcription style. We encourage users to try the models themselves to get a better understanding.\n\n\nBias, Risks, and Limitations\n----------------------------\n\n\nUsing these models without adequate risk assessment and mitigation could be considered irresponsible. They may contain biases or other undesirable distortions. Users who deploy these models or integrate them into systems or services are responsible for mitigating risks and complying with applicable AI regulations. The National Library of Norway, as the model owner, disclaims liability for any outcomes resulting from third-party use of these models."
] | [
-0.05296173319220543,
0.09735371917486191,
-0.0038811834529042244,
0.0014292100677266717,
0.046840399503707886,
-0.02868485637009144,
0.025657236576080322,
0.0586586594581604,
0.00044433525181375444,
0.06980598717927933,
-0.0025090251583606005,
-0.07681825757026672,
0.08619535714387894,
0.02034042589366436,
0.09691891074180603,
-0.29794561862945557,
0.05506785586476326,
-0.07901722937822342,
0.010908905416727066,
0.04024575278162956,
0.09332382678985596,
-0.05598115921020508,
0.04575476050376892,
0.013667314313352108,
-0.005830614361912012,
0.01446677464991808,
-0.05406441166996956,
-0.035535600036382675,
0.07691726088523865,
0.09523952007293701,
0.03947269544005394,
-0.001200802973471582,
0.08161482214927673,
-0.14516384899616241,
0.010021316818892956,
0.052999284118413925,
0.035766568034887314,
0.01152550708502531,
0.034543149173259735,
0.110737644135952,
0.20021562278270721,
-0.08375980705022812,
0.011800634674727917,
0.05020305514335632,
-0.027194129303097725,
-0.13769549131393433,
-0.04244932904839516,
-0.025640452280640602,
0.05091197416186333,
0.04310137405991554,
-0.03038913570344448,
0.08412078022956848,
-0.07798495143651962,
0.03629342466592789,
0.0793427899479866,
-0.11635222285985947,
-0.01390546839684248,
0.013795830309391022,
0.05027379095554352,
0.048764750361442566,
-0.02689228765666485,
0.009632653556764126,
0.01609691046178341,
0.04036324843764305,
0.016208069398999214,
-0.010413284413516521,
0.07168019562959671,
-0.061756353825330734,
-0.10830625146627426,
-0.04798975586891174,
0.1256929486989975,
0.01678694598376751,
-0.0604981891810894,
-0.17052853107452393,
-0.04102388396859169,
0.057824838906526566,
-0.02250353805720806,
-0.025910697877407074,
0.029069101437926292,
-0.0028914485592395067,
0.09736276417970657,
-0.07309924066066742,
-0.09759596735239029,
0.01554356049746275,
-0.006983798462897539,
0.10894838720560074,
0.044244226068258286,
0.0029826241079717875,
0.017428340390324593,
0.04950243607163429,
-0.0686146691441536,
-0.05511527135968208,
-0.06594141572713852,
-0.06519396603107452,
-0.06071474775671959,
0.014010556042194366,
-0.026062652468681335,
-0.09530984610319138,
-0.00044711012742482126,
0.07696559280157089,
-0.019289720803499222,
0.022466830909252167,
0.02606024779379368,
-0.002254621358588338,
0.05276660993695259,
0.13157035410404205,
-0.024742385372519493,
-0.07805513590574265,
0.0007598797674290836,
-0.006350067909806967,
0.06558695435523987,
0.011003070510923862,
-0.04939951375126839,
-0.03351832553744316,
-0.0034243345726281404,
0.0255057904869318,
0.016754867509007454,
0.011694788932800293,
0.021010801196098328,
-0.028433188796043396,
0.24201010167598724,
-0.11676207929849625,
0.0006706168060190976,
0.014296325854957104,
-0.047493770718574524,
0.08031879365444183,
0.03006855584681034,
-0.02528420276939869,
-0.12931175529956818,
0.014655306935310364,
-0.0030807729344815016,
-0.02174314856529236,
-0.06593751162290573,
-0.10676616430282593,
0.04901987686753273,
0.018551694229245186,
-0.013206958770751953,
-0.1141371950507164,
-0.09358695894479752,
-0.046055570244789124,
0.018629496917128563,
-0.011313331313431263,
-0.03832380101084709,
0.004021466244012117,
-0.04007009416818619,
-0.032910000532865524,
-0.030484966933727264,
0.002345296321436763,
-0.028500782325863838,
-0.01930953934788704,
-0.025070542469620705,
0.03633755445480347,
-0.0215923935174942,
0.011501959525048733,
-0.06324150413274765,
-0.006226410623639822,
-0.18761898577213287,
0.11527365446090698,
-0.07509057968854904,
-0.006859801709651947,
-0.027452178299427032,
-0.07189810276031494,
-0.0483868233859539,
0.05083911493420601,
0.0014520175755023956,
0.07380668073892593,
-0.1732584834098816,
-0.028558097779750824,
0.1421183943748474,
-0.14270813763141632,
0.040253784507513046,
0.14096151292324066,
0.014791722409427166,
0.003908494487404823,
0.13270074129104614,
0.1131194606423378,
0.17026090621948242,
-0.12379786372184753,
-0.0643012747168541,
0.0007211365737020969,
-0.03232498839497566,
0.0596870593726635,
0.04074323922395706,
-0.007581554353237152,
0.0911276564002037,
0.0451594702899456,
0.00928732380270958,
0.022662363946437836,
0.03906916454434395,
-0.021798105910420418,
-0.006612842436879873,
-0.02501421608030796,
-0.004234470427036285,
0.036115679889917374,
-0.048227641731500626,
-0.03897625952959061,
-0.0912146270275116,
0.07760781794786453,
0.10939109325408936,
-0.0421467125415802,
0.033686891198158264,
-0.06955401599407196,
-0.02231217920780182,
0.010236898437142372,
0.0026011858135461807,
-0.11216232925653458,
-0.04681015387177467,
0.0459500215947628,
-0.12978380918502808,
0.07347387075424194,
0.053935933858156204,
0.036231059581041336,
0.07219239324331284,
-0.01684352569282055,
0.015265405178070068,
-0.03083498775959015,
0.0011618459830060601,
-0.02125057764351368,
-0.03709186613559723,
-0.027334697544574738,
-0.036927852779626846,
0.027512812986969948,
-0.10794975608587265,
0.001636496395803988,
0.00835143681615591,
0.08859696239233017,
0.014188110828399658,
-0.024627333506941795,
0.016180718317627907,
0.014519180171191692,
0.010088969953358173,
-0.04137594625353813,
-0.00701132183894515,
-0.0027623921632766724,
-0.001423430978320539,
0.10769343376159668,
-0.14947479963302612,
-0.10857168585062027,
0.05461400747299194,
0.10630147904157639,
-0.013484450988471508,
-0.002914704382419586,
-0.03769854083657265,
-0.032972682267427444,
-0.05419835075736046,
-0.11773157119750977,
0.20508414506912231,
0.016314396634697914,
0.060321975499391556,
-0.08903151750564575,
-0.024631552398204803,
0.0073272367008030415,
-0.0005635258858092129,
-0.0066109634935855865,
0.08216977119445801,
0.010710707865655422,
-0.0626220777630806,
-0.01409828383475542,
-0.04705403372645378,
0.02752281166613102,
0.1699429750442505,
-0.021083012223243713,
-0.1075795516371727,
0.0039990185759961605,
-0.013599189929664135,
-0.013461505062878132,
0.09256710857152939,
0.008553463965654373,
-0.005116764921694994,
0.030824089422822,
0.025995343923568726,
0.05442393943667412,
-0.054551735520362854,
0.0838296189904213,
0.025514965876936913,
-0.04724667966365814,
0.03818230330944061,
-0.024837642908096313,
-0.007064869161695242,
0.04058477655053139,
0.005656140390783548,
0.022196030244231224,
-0.04450571909546852,
-0.04310232028365135,
-0.08649798482656479,
0.10298939794301987,
-0.10416754335165024,
-0.2189396172761917,
-0.16612835228443146,
0.07336845993995667,
-0.024220863357186317,
-0.014543112367391586,
0.03515360876917839,
-0.05421018972992897,
-0.10324738174676895,
-0.1336701363325119,
0.04294244572520256,
0.020161591470241547,
-0.07268714159727097,
-0.051295626908540726,
0.027628792449831963,
0.009093421511352062,
-0.12460020929574966,
0.0028417417779564857,
0.005951263010501862,
0.012100577354431152,
-0.01768934540450573,
0.015359426848590374,
0.032774507999420166,
0.07383307814598083,
0.009565385989844799,
-0.05632412061095238,
0.00956219993531704,
0.1628551483154297,
-0.07192520052194595,
0.15017475187778473,
0.15586446225643158,
0.0054406276904046535,
0.05770708993077278,
0.09382065385580063,
0.014712400734424591,
-0.024534448981285095,
0.02474200166761875,
0.01706424541771412,
-0.0622892789542675,
-0.1461806744337082,
-0.12692248821258545,
-0.04245195910334587,
-0.005139066372066736,
0.06547801941633224,
0.03236158564686775,
-0.0489032007753849,
0.017026199027895927,
-0.07868372648954391,
-0.011005006730556488,
0.054854441434144974,
0.05161076784133911,
0.1378561407327652,
0.0016075456514954567,
0.02648150362074375,
-0.0778646171092987,
-0.017339365556836128,
0.10061412304639816,
-0.01005042064934969,
0.15805131196975708,
-0.059578731656074524,
0.11629685014486313,
0.02861650101840496,
-0.011224727146327496,
0.05680428817868233,
0.05552901700139046,
0.00019034867000300437,
0.02077609673142433,
-0.023199856281280518,
-0.07852808386087418,
-0.04387184977531433,
0.08404038101434708,
0.05242825672030449,
-0.07903861999511719,
-0.0008927416056394577,
0.001876294962130487,
0.002558555454015732,
0.13630978763103485,
0.03063478134572506,
-0.11342356353998184,
-0.13053272664546967,
0.017538586631417274,
-0.10485208034515381,
-0.07290510088205338,
0.022740306332707405,
0.15048323571681976,
-0.08786367624998093,
0.030035870149731636,
0.0009829780319705606,
0.07475189119577408,
-0.07369504123926163,
0.009010996669530869,
-0.034506428986787796,
0.13453082740306854,
0.011850911192595959,
0.05447641387581825,
-0.037888169288635254,
0.040536027401685715,
0.009022600017488003,
0.11673209816217422,
-0.06458648294210434,
0.04415391758084297,
0.028068995103240013,
0.01322715450078249,
0.059337396174669266,
0.04093489795923233,
-0.15136121213436127,
0.021390115842223167,
-0.11134638637304306,
0.0545467734336853,
0.041399676352739334,
0.0487743616104126,
0.08103000372648239,
-0.008717638440430164,
-0.0014168756315484643,
-0.02616856060922146,
-0.10543747991323471,
-0.119498610496521,
-0.16386330127716064,
0.023354746401309967,
0.00039676824235357344,
-0.003779347287490964,
-0.05475013330578804,
-0.01618996448814869,
-0.08386760205030441,
0.1165747344493866,
-0.08008600026369095,
-0.12010297924280167,
-0.07303469628095627,
-0.056210070848464966,
0.15171030163764954,
-0.044452618807554245,
0.009267677552998066,
0.03489411249756813,
0.15693654119968414,
-0.05205686017870903,
-0.060824915766716,
-0.015623800456523895,
-0.08291152119636536,
-0.10521890968084335,
0.00824075285345316,
0.11681818217039108,
0.10207229852676392,
0.05529077351093292,
0.011769297532737255,
0.002116154180839658,
-0.0018033211817964911,
-0.09992746263742447,
-0.054452985525131226,
0.17412935197353363,
-0.01004746649414301,
0.033443618565797806,
-0.050800930708646774,
-0.06694277375936508,
-0.052575718611478806,
-0.011065750382840633,
0.049001652747392654,
0.14527146518230438,
-0.05122670158743858,
0.13461782038211823,
0.19320888817310333,
-0.06530234217643738,
-0.21305684745311737,
-0.05243651196360588,
0.050460055470466614,
0.05392737314105034,
0.01418014895170927,
-0.17135661840438843,
0.10715041309595108,
0.03822924569249153,
0.0002464201534166932,
0.03807703033089638,
-0.21804499626159668,
-0.10214245319366455,
0.044813673943281174,
-0.009448856115341187,
-0.02056264691054821,
-0.020527226850390434,
-0.02819511853158474,
-0.03827245533466339,
-0.030696621164679527,
0.06748364865779877,
-0.04899786040186882,
0.054357632994651794,
0.03569592535495758,
0.0776619091629982,
0.047273170202970505,
-0.023765087127685547,
0.11385207623243332,
-0.03243732824921608,
0.005109324585646391,
-0.06166874244809151,
0.09525668621063232,
0.015406418591737747,
-0.06549522280693054,
0.14650605618953705,
-0.024275988340377808,
0.012038539163768291,
-0.10497775673866272,
-0.06322721391916275,
-0.07562566548585892,
0.07462479919195175,
-0.01811317913234234,
-0.0402008593082428,
-0.08485238999128342,
0.07786790281534195,
0.0943509116768837,
0.005746990907937288,
-0.07748321443796158,
-0.07052754610776901,
-0.07378210872411728,
0.12262004613876343,
0.17353589832782745,
-0.04997195303440094,
-0.05173530802130699,
0.012837546877563,
-0.000521339476108551,
0.05559206381440163,
-0.05002859607338905,
0.016424797475337982,
0.09397712349891663,
-0.019245365634560585,
0.03369341045618057,
-0.03236564248800278,
-0.1331532746553421,
-0.017314599826931953,
0.02276366390287876,
-0.03971531614661217,
-0.1674547791481018,
-0.03144443780183792,
0.03730102628469467,
-0.05292665958404541,
-0.0442548505961895,
0.1313992142677307,
-0.08168986439704895,
-0.0013182867551222444,
0.013311084359884262,
0.056429069489240646,
0.025051511824131012,
0.09355419129133224,
0.026740601286292076,
0.023231109604239464,
-0.07062175869941711,
0.0975257158279419,
0.0351807065308094,
-0.128244087100029,
0.047866757959127426,
0.14979134500026703,
-0.09087872505187988,
-0.0500328354537487,
-0.12078886479139328,
-0.032211776822805405,
-0.013274802826344967,
-0.10345709323883057,
0.008031888864934444,
-0.07307847589254379,
0.01426768396049738,
0.009669262915849686,
0.008012375794351101,
-0.021835090592503548,
-0.019820181652903557,
0.039022840559482574,
-0.10119923204183578,
0.08379780501127243,
0.007940425537526608,
0.02602614462375641,
-0.04065416008234024,
0.0905618667602539,
-0.0013774080434814095,
0.012019799090921879,
-0.024590807035565376,
-0.01851358264684677,
-0.010060264728963375,
-0.031500235199928284,
-0.1394885778427124,
0.0135020287707448,
-0.08772540092468262,
0.008053626865148544,
-0.003273502690717578,
0.03379353880882263,
-0.017731189727783203,
0.052241235971450806,
-0.03545324131846428,
-0.020981157198548317,
-0.055661678314208984,
0.03188474476337433,
-0.052059244364500046,
0.008883794769644737,
0.04673681780695915,
-0.07409559935331345,
0.049059271812438965,
0.0036629755049943924,
-0.0508648045361042,
0.048924077302217484,
-0.012695367448031902,
0.006648706737905741,
0.01813393644988537,
0.06413381546735764,
0.004572323989123106,
-0.05151090398430824,
-0.004642277956008911,
0.02946055494248867,
-0.00006704963743686676,
-0.03902660310268402,
0.03554965928196907,
-0.05527603253722191,
0.07775525003671646,
0.027323707938194275,
-0.015320166014134884,
-0.05732179805636406,
0.030017131939530373,
0.02757575921714306,
0.01918026991188526,
0.08741313964128494,
-0.06096639856696129,
0.015535983256995678,
-0.09593475610017776,
-0.00422298489138484,
0.0063530136831104755,
0.0065930611453950405,
0.08321885019540787,
-0.017077991738915443,
0.02884180098772049,
0.0027738052885979414,
0.18191327154636383,
-0.018986212089657784,
0.02444194070994854,
0.05324159935116768,
-0.0924469605088234,
-0.10539183020591736,
0.01778334379196167,
0.060584381222724915,
0.013411623425781727,
-0.008521289564669132,
-0.04034918174147606,
-0.027802368625998497,
-0.016025317832827568,
-0.000584036111831665,
0.08599909394979477,
0.1060648187994957,
0.09845110774040222,
0.08825982362031937,
0.014499169774353504,
-0.041329436004161835,
-0.10930155962705612,
0.077110655605793,
-0.031137846410274506,
0.07765442132949829,
-0.0422254353761673,
0.0850609838962555,
0.12324772030115128,
-0.06427275389432907,
0.09467244893312454,
-0.0033962279558181763,
-0.045910339802503586,
-0.07054578512907028,
-0.13325297832489014,
-0.04449774697422981,
-0.03876001015305519,
-0.026226336136460304,
-0.0905275046825409,
0.03519812598824501,
0.012006416916847229,
0.01917162723839283,
-0.026100099086761475,
0.11170578747987747,
-0.06656112521886826,
-0.09367245435714722,
0.056832801550626755,
-0.02092493325471878,
0.03052701987326145,
0.11388850212097168,
-0.001505266292952001,
0.05415790155529976,
0.06214847043156624,
0.060997020453214645,
0.060106221586465836,
0.016665415838360786,
-0.00595567561686039,
0.0008331987191922963,
-0.011151152662932873,
-0.00961034931242466,
-0.0018747070571407676,
-0.005674006883054972,
0.10459678620100021,
0.07635261863470078,
-0.08236616849899292,
-0.006330238189548254,
0.1009625792503357,
-0.04659382998943329,
-0.14912475645542145,
-0.1302499771118164,
0.1280684918165207,
-0.0008649304509162903,
0.020751524716615677,
-0.005069941282272339,
-0.07699073106050491,
-0.019001776352524757,
0.12956927716732025,
0.12917053699493408,
0.03177894279360771,
-0.00719151645898819,
-0.03574550151824951,
-0.010987960733473301,
-0.06502034515142441,
0.11047756671905518,
0.005229934584349394,
0.27657556533813477,
-0.005531327333301306,
0.053835466504096985,
-0.02236650139093399,
-0.023051680997014046,
-0.11584397405385971,
0.07836860418319702,
-0.051013749092817307,
-0.0024139557499438524,
-0.027742832899093628,
0.07749023288488388,
-0.0417633019387722,
-0.2472580075263977,
-0.022812368348240852,
0.0033747144043445587,
-0.06395704299211502,
0.028330467641353607,
-0.011594031937420368,
0.021457383409142494,
0.06269682198762894,
-0.004426983650773764,
0.003754106117412448,
0.15013843774795532,
-0.014930774457752705,
-0.07101913541555405,
0.035930853337049484,
0.04340527951717377,
-0.04247202351689339,
0.16361312568187714,
0.034531351178884506,
0.05780244991183281,
0.05674431100487709,
-0.01095303613692522,
-0.11473534256219864,
0.060690075159072876,
0.02832530252635479,
-0.11735076457262039,
0.02635217271745205,
0.14663197100162506,
-0.011272517032921314,
0.07317855209112167,
0.047359075397253036,
-0.03782832995057106,
0.0024816456716507673,
0.08594688028097153,
0.006529544945806265,
-0.051630739122629166,
0.0534086674451828,
-0.09295033663511276,
0.12609757483005524,
0.10194072127342224,
0.012503470294177532,
-0.026533514261245728,
-0.047788992524147034,
0.006107345223426819,
-0.011060155928134918,
0.09403341263532639,
-0.0035379703622311354,
-0.1149706020951271,
-0.027516065165400505,
-0.005723027978092432,
0.05655448138713837,
-0.1625673770904541,
-0.041079841554164886,
0.005061662290245295,
-0.021265484392642975,
0.02887839637696743,
0.09843403100967407,
0.02709369547665119,
0.007294067647308111,
-0.02258148603141308,
-0.02608385682106018,
0.005602516699582338,
0.07362786680459976,
-0.11932618170976639,
-0.05967900529503822
] |
null | null | transformers |
# NB-Whisper Medium
Introducing the **_Norwegian NB-Whisper Medium model_**, proudly developed by the National Library of Norway. NB-Whisper is a cutting-edge series of models designed for automatic speech recognition (ASR) and speech translation. These models are based on the work of [OpenAI's Whisper](https://arxiv.org/abs/2212.04356). Each model in the series has been trained for 250,000 steps, utilizing a diverse dataset of 8 million samples. These samples consist of aligned audio clips, each 30 seconds long, culminating in a staggering 66,000 hours of speech. For an in-depth understanding of our training methodology and dataset composition, keep an eye out for our upcoming article.
| Model Size | Parameters | Model |
|------------|------------|------------|
| Tiny | 39M | [NB-Whisper Tiny](https://huggingface.co/NbAiLab/nb-whisper-tiny) |
| Base | 74M | [NB-Whisper Base](https://huggingface.co/NbAiLab/nb-whisper-base) |
| Small | 244M | [NB-Whisper Small](https://huggingface.co/NbAiLab/nb-whisper-small) |
| Medium | 769M | [NB-Whisper Medium](https://huggingface.co/NbAiLab/nb-whisper-medium) |
| Large | 1550M | [NB-Whisper Large](https://huggingface.co/NbAiLab/nb-whisper-large) |
### Verbatim Model
While the main models are suitable for most transcription tasks, we demonstrate how easy it is to change the output of the main model. The following models are trained 250 additional steps from the main models above, and might be suitable for more targeted use cases:
- **Verbatim version**: This lower-cased variant is more literal and suitable for tasks requiring detailed transcription, such as linguistic analysis.
| Model Size | Parameters | Semantic version |
|------------|------------|------------------|
| Tiny | 39M | [Tiny - semantic](https://huggingface.co/NbAiLab/nb-whisper-tiny-semantic) |
| Base | 74M | [Base - semantic](https://huggingface.co/NbAiLab/nb-whisper-base-semantic) |
| Small | 244M | [Small - semantic](https://huggingface.co/NbAiLab/nb-whisper-small-semantic) |
| Medium | 769M | [Medium - semantic](https://huggingface.co/NbAiLab/nb-whisper-medium-semantic) |
| Large | 1550M | [Large - semantic](https://huggingface.co/NbAiLab/nb-whisper-large-semantic) |
### Model Description
- **Developed by:** [NB AI-Lab](https://ai.nb.no/)
- **Shared by:** [NB AI-Lab](https://ai.nb.no/)
- **Model type:** `whisper`
- **Language(s) (NLP):** Norwegian, Norwegian Bokmål, Norwegian Nynorsk, English
- **License:** [Apache 2.0](https://www.apache.org/licenses/LICENSE-2.0)
- **Trained from model:** [openai/whisper-medium](https://huggingface.co/openai/whisper-medium)
- **Code Repository:** https://github.com/NbAiLab/nb-whisper/
- **Paper:** _Coming soon_
- **Demo:** _See Spaces on this page_
## How to Use the Models
### Online Demos
You can try the models directly through the HuggingFace Inference API, accessible on the right side of this page. Be aware that initially, the model needs to load and will run on limited CPU capacity, which might be slow. To enhance your experience, we are temporarily hosting some models on TPUs for a few days, significantly boosting their performance. Explore these under the **Spaces** section on the [Main Page](https://huggingface.co/NbAiLab/).
### Local Setup with HuggingFace
Alternatively, you can run the models locally. The Tiny, Base, and Small models are optimized for CPU execution. For the Medium and Large models, we recommend a system equipped with a GPU to ensure efficient processing. Setting up and using these models with HuggingFace's Transformers is straightforward, provided you have [Python](https://www.python.org/downloads/) installed on your machine. For practical demonstrations, refer to examples using this [sample mp3 file](https://github.com/NbAiLab/nb-whisper/raw/main/audio/king.mp3).
```bash
# Download the sample file
$ wget -N https://github.com/NbAiLab/nb-whisper/raw/main/audio/king.mp3
# Install necessary libraries.
$ pip install 'transformers>=4.35.2'
```
After this is done, you should be able to run this in Python:
```python
from transformers import pipeline
# Load the model
asr = pipeline("automatic-speech-recognition", "NbAiLab/nb-whisper-medium")

# Transcribe the sample file
asr("king.mp3", generate_kwargs={'task': 'transcribe', 'language': 'no'})
```
<details>
<summary>Expected output</summary>
```json
{'text': ' Nordmenn er nordlendinger, trøndere, sørlendinger og folk fra alle andre regioner. Nordmenn er også innvandret fra Afghanistan, Pakistan, Polen, Sverige, Somalia og Syria. Det er ikke alltid så lett å si hvor vi er fra, hvilken nasjonalitet vi er fra. Hvilken nasjonalitet vi er fra. Hvilken nasjonalitet vi er fra. Hvilken nasjonalitet vi er fra. Hvilken nasjonalitet vi er fra. Hvilken nasjonalitet vi er fra. Hvilken nasjonalitet vi er fra.'}
```
</details>
#### Extended HuggingFace
Examining the output above, we see that there are multiple repetitions at the end. This is because the video is longer than 30 seconds. By passing the ```chunk_length_s``` argument, we can transcribe longer files. Our experience is that we get slightly better results by setting that to 28 seconds instead of the default 30 seconds. We also recommend setting the beam size to 5 if possible. This greatly increases the accuracy but takes a bit longer and requires slightly more memory. The examples below also illustrate how to transcribe to English or Nynorsk, and how to get timestamps for sentences and words.
```python
# Long Transcripts
asr("king.mp3", chunk_length_s=28, generate_kwargs={'task': 'transcribe', 'language': 'no'})
# Increase accuracy by setting beam size to 5
asr("king.mp3", chunk_length_s=28, return_timestamps=True, generate_kwargs={'num_beams': 5, 'task': 'transcribe', 'language': 'no'})
# Return Timestamps
asr("king.mp3", chunk_length_s=28, return_timestamps=True, generate_kwargs={'task': 'transcribe', 'language': 'no'})
# Return Word Level Timestamps
asr("king.mp3", chunk_length_s=28, return_timestamps="word", generate_kwargs={'task': 'transcribe', 'language': 'no'})
# Transcribe to Nynorsk
asr("king.mp3", chunk_length_s=28, generate_kwargs={'task': 'transcribe', 'language': 'nn'})
# Transcribe to English
asr("king.mp3", chunk_length_s=28, generate_kwargs={'task': 'transcribe', 'language': 'en'})
```
<details>
<summary>Expected output</summary>
Long transcripts:
```json
{'text': ' Nordmenn er nordlendinger, trøndere, sørlendinger og folk fra alle andre regioner. Nordmenn er også innvandret fra Afghanistan, Pakistan, Polen, Sverige, Somalia og Syria. Det er ikke alltid så lett å si hvor vi er fra, hvilken nasjonalitet vi er fra. Hvilken nasjonalitet vi er fra. Hvilken nasjonalitet vi er fra. Hvilken nasjonalitet vi er fra. Hvilken nasjonalitet vi er fra. Hvilken nasjonalitet vi er fra, hvilken nasjonalitet vi tilhører. Det vi kaller hjem, er der hjertet vårt er, og det kan ikke alltid plasseres innenfor landegrenser. Nordmenn er jenter som er glad i jenter, gutter som er glad i gutter, og jenter og gutter som er glad i hverandre. Nordmenn trommer på Gud, Allah, Altet og ingenting. Nordmenn liker Grieg, Kygo, Helbilis og Kari Bremnes. Med andre ord, Norge er dere. Norge er oss. Mitt største håp for Norge er at vi skal klare å ta vare på hverandre, at vi skal bygge dette landet videre på tillit, fellesskap og raushet.'}
```
Timestamps:
```json
{'text': ' Nordmenn er nordlendinger, trøndere, sørlendinger og folk fra alle andre regioner. Nordmenn er også innvandret fra Afghanistan, Pakistan, Polen, Sverige, Somalia og Syria. Det er ikke alltid så lett å si hvor vi er fra, hvilken nasjonalitet vi er fra. Hvilken nasjonalitet vi er fra. hvilken nasjonalitet vi tilhører. Det vi kaller hjem, er der hjertet vårt er, og det kan ikke alltid plasseres innenfor landegrenser. Nordmenn er jenter som er glad i jenter, gutter som er glad i gutter, og jenter og gutter som er glad i hverandre. Nordmenn trommer på Gud, Allah, Altet og ingenting. Nordmenn liker Grieg, Kygo, Helbiles og Kari Bremnes. Med andre ord, Norge er dere. Norge er oss. Mitt største håp for Norge er at vi skal klare å ta vare på hverandre, at vi skal bygge dette landet videre på tillit, fellesskap og raushet.',
'chunks': [{'timestamp': (0.0, 5.46),
'text': ' Nordmenn er nordlendinger, trøndere, sørlendinger'},
{'timestamp': (5.52, 8.68), 'text': ' og folk fra alle andre regioner.'},
{'timestamp': (8.68, 16.64),
'text': ' Nordmenn er også innvandret fra Afghanistan, Pakistan, Polen, Sverige, Somalia og Syria.'},
{'timestamp': (16.64, 13.3),
'text': ' Det er ikke alltid så lett å si hvor vi er fra, hvilken nasjonalitet vi er fra.'},
{'timestamp': (13.32, 30.28),
'text': ' Hvilken nasjonalitet vi er fra. hvilken nasjonalitet vi tilhører.'},
{'timestamp': (32.52, 39.16),
'text': ' Det vi kaller hjem, er der hjertet vårt er, og det kan ikke alltid plasseres'},
{'timestamp': (39.16, 42.0), 'text': ' innenfor landegrenser.'},
{'timestamp': (42.0, 46.74),
'text': ' Nordmenn er jenter som er glad i jenter, gutter som er glad i gutter,'},
{'timestamp': (46.74, 51.12),
'text': ' og jenter og gutter som er glad i hverandre.'},
{'timestamp': (51.16, 57.42),
'text': ' Nordmenn trommer på Gud, Allah, Altet og ingenting.'},
{'timestamp': (57.42, 64.3),
'text': ' Nordmenn liker Grieg, Kygo, Helbiles og Kari Bremnes.'},
{'timestamp': (64.34, 71.24),
'text': ' Med andre ord, Norge er dere. Norge er oss.'},
{'timestamp': (71.24, 78.04),
'text': ' Mitt største håp for Norge er at vi skal klare å ta vare på hverandre,'},
{'timestamp': (78.12, 84.68),
'text': ' at vi skal bygge dette landet videre på tillit, fellesskap og raushet.'}]}
}
```
Word Level Timestamps:
```json
{
{"text": "Nordmenn er nordlendinger, trøndere, sørlendinger og folk fra alle andre regioner. Nordmenn er også innvandret fra Afghanistan, Pakistan, Polen, Sverige, Somalia og Syria. Det er ikke alltid så lett å si hvor vi er fra, hvilken nasjonalitet vi tilhører. Det vi kaller hjem, er der hjertet vårt er, og det kan ikke alltid plasseres innenfor landegrenser. Nordmenn er jenter som er glad i jenter, gutter som er glad i gutter, og jenter og gutter som er glad i hverandre. Nordmenn trommer på Gud, Allah, Altet og ingenting. Nordmenn liker Grieg, Kygo, Helbilis og Kari Bremnes. Med andre ord, Norge er dere. Norge er oss. Mitt største håp for Norge er at vi skal klare å ta vare på hverandre, at vi skal bygge dette landet videre på tillit, fellesskap og raushet.",
"chunks": [
{"text": "Nordmenn", "timestamp": [0.72, 1.42]},
{"text": "er", "timestamp": [1.42, 1.74]},
// ... more chunks ...
{"text": "raushet.", "timestamp": [83.1, 84.88]}
]
}
}
```
Nynorsk:
```json
{
{"text": "Nordmenn er nordlendingar, trøndarar, sørlendingar og folk frå alle andre regionar. Nordmenn er også innvandra frå Afghanistan, Pakistan, Polen, Sverige, Somalia og Syria. Det er ikkje alltid så lett å seie kvar vi er frå, kva nasjonalitet vi tilhøyrer. Det vi kallar heim, er der hjartet vårt er, og det kan ikkje alltid plasserast innanfor landegrenser. Nordmenn er jenter som er glad i jenter, gutar som erade i gutar, og jenter og gutar som er glade i kvarandre. Nordmenn trommar på Gud, Allah, Altet og ingenting. Nordmenn liker Grieg, Kygo, Helbiles og Kari Bremnes. Med andre ord, Noreg er dere! Noreg er oss. Mitt største håp for Noreg er at vi skal klare å ta vare på kvarandre, at vi skal byggje dette landet vidare på tillit, fellesskap og raushet."}
}
```
English:
```json
{
{"text": "Norwegians are Norwegians, trønders, southerners and people from all other regions. Norwegians are also invaded from Afghanistan, Pakistan, Poland, Sweden, Somalia and Suria. It is not always so easy to say where we are from, what nationality we belong to. What we call home is where our heart is, and it cannot always be placed within national borders. Norwegians are girls who like girls, boys who like boys, and girls and boys who like each other. Norwegians thrump on God, Allah, Altet and nothing. Norwegians like Grieg, Kygo, Helbilis and Kari Bremnes. In other words, Norway is you. Norway is us. My biggest hope for Norway is that we should be able to take care of each other, that we should build this country on trust, community and generosity."}
}
```
</details>
### Whisper CPP
Whisper CPP is a C++ implementation of the Whisper model, offering the same functionalities with the added benefits of C++ efficiency and performance optimizations. This allows embedding any Whisper model into a binary file, facilitating the development of real applications. However, it requires some familiarity with compiling C++ programs. Their [homepage](https://github.com/ggerganov/whisper.cpp) provides examples of how to build applications, including real-time transcription.
We have converted this model to the ggml format used by Whisper CPP binaries. The file can be downloaded [here](blob/main/ggml-model.bin), and a `q5_0` quantized version is also available [here](blob/main/ggml-model-q5_0.bin).
```bash
# We can download and compile whisper.cpp
$ git clone --depth 1 https://github.com/ggerganov/whisper.cpp --branch v1.5.1
$ cd whisper.cpp/
$ make
# We also need to convert the audio to WAV as that is the only format supported by whisper.cpp
$ wget -N https://github.com/NbAiLab/nb-whisper/raw/main/audio/king.mp3
$ ffmpeg -i king.mp3 -ar 16000 -ac 1 -c:a pcm_s16le king.wav
# Let's download the two ggml files from this site
$ wget -N https://huggingface.co/NbAiLab/nb-whisper-medium/resolve/main/ggml-model.bin -O models/nb-medium-ggml-model.bin
$ wget -N https://huggingface.co/NbAiLab/nb-whisper-medium/resolve/main/ggml-model-q5_0.bin -O models/nb-medium-ggml-model-q5_0.bin
# And run it with the f16 default model
$ ./main -l no -m models/nb-medium-ggml-model.bin king.wav
# Or the quantized version
$ ./main -l no -m models/nb-medium-ggml-model-q5_0.bin king.wav
```
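If you want the transcript written to disk rather than printed, the `main` binary also accepts output flags. The flags below are taken from `./main --help` in whisper.cpp v1.5.1; verify them on your build, as they may change between releases.
```bash
# Write the transcript to king.wav.txt (plain text) and king.wav.srt (subtitles)
$ ./main -l no -m models/nb-medium-ggml-model.bin -otxt -osrt king.wav
```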
### WhisperX and Speaker Diarization
Speaker diarization is a technique in natural language processing and automatic speech recognition that identifies and separates different speakers in an audio recording. It segments the audio into parts based on who is speaking, enhancing the quality of transcribing meetings or phone calls. We find that [WhisperX](https://github.com/m-bain/whisperX) is the easiest way to use our models for diarizing speech. In addition, WhisperX uses phoneme-based Wav2Vec models to improve the alignment of the timestamps. As of December 2023 it also has native support for the nb-wav2vec models. It currently uses [PyAnnote-audio](https://github.com/pyannote/pyannote-audio) for the actual diarization. This package has a fairly strict licence where you have to agree to user terms. Follow the instructions below.
```bash
# Follow the install instructions on https://github.com/m-bain/whisperX
# Make sure you have a HuggingFace account and have agreed to the pyannote terms
# Log in (or supply HF Token in command line)
huggingface-cli login
# Download a test file
wget -N https://github.com/NbAiLab/nb-whisper/raw/main/audio/knuthamsun.mp3
# Optional. If you get complaints about missing support for Norwegian, do:
pip uninstall whisperx && pip install git+https://github.com/m-bain/whisperx.git@8540ff5985fceee764acbed94f656063d7f56540
# Transcribe the test file. All transcripts will end up in the directory of the mp3-file
whisperx knuthamsun.mp3 --model NbAiLabBeta/nb-whisper-medium --language no --diarize
```
You can also run WhisperX from Python. Please take a look at the instructions on [WhisperX homepage](https://github.com/m-bain/whisperX).
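For a programmatic workflow, a minimal sketch following the WhisperX README might look like the snippet below. Treat the function names and signatures as assumptions: the WhisperX API changes between releases, so check the documentation for the version you install.
```python
# Minimal WhisperX sketch: transcribe, align, then diarize.
# Function names follow the WhisperX README; verify against your installed version.
import whisperx

device = "cuda"  # or "cpu"
audio = whisperx.load_audio("knuthamsun.mp3")

# 1. Transcribe with the NB-Whisper model
model = whisperx.load_model("NbAiLabBeta/nb-whisper-medium", device, language="no")
result = model.transcribe(audio, batch_size=16)

# 2. Align timestamps with a phoneme-based wav2vec model
align_model, metadata = whisperx.load_align_model(language_code="no", device=device)
result = whisperx.align(result["segments"], align_model, metadata, audio, device)

# 3. Assign speaker labels (requires an HF token and accepting the pyannote terms)
diarize_model = whisperx.DiarizationPipeline(use_auth_token="hf_...", device=device)
diarize_segments = diarize_model(audio)
result = whisperx.assign_word_speakers(diarize_segments, result)

print(result["segments"])
```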
### API
Instructions for accessing the models via a simple API are included in the demos under Spaces. Note that these demos are temporary and will only be available for a few weeks.
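As an illustration, a call against the general HuggingFace Inference API could look like the sketch below. The URL pattern is the standard one for hosted models; whether this particular model is served there at any given time is not guaranteed, so treat the endpoint as an assumption rather than a documented service.
```python
# Hedged sketch: query the model through the hosted HuggingFace Inference API
import requests

API_URL = "https://api-inference.huggingface.co/models/NbAiLab/nb-whisper-medium"
headers = {"Authorization": "Bearer hf_..."}  # your personal HuggingFace token

with open("king.mp3", "rb") as f:
    audio_bytes = f.read()

response = requests.post(API_URL, headers=headers, data=audio_bytes)
print(response.json())  # e.g. {"text": " Nordmenn er nordlendinger, ..."}
```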
## Training Data
The training data originates from Språkbanken and the National Library of Norway's digital collection, including:
- NST Norwegian ASR Database (16 kHz) and its corresponding dataset
- Transcribed speeches from the Norwegian Parliament by Språkbanken
- TV broadcast (NRK) subtitles (NLN digital collection)
- Audiobooks (NLN digital collection)
## Downstream Use
The models, especially the smaller ones, may exhibit occasional hallucinations and may drop parts of the transcript. They are designed to convert spoken language into grammatically correct written sentences, which might not always be word-for-word translations. We have made two extra model variants for users who want a different transcription style. We encourage users to try the models themselves to get a better understanding.
## Bias, Risks, and Limitations
Using these models without adequate risk assessment and mitigation could be considered irresponsible. They may contain biases or other undesirable distortions. Users who deploy these models or integrate them into systems or services are responsible for mitigating risks and complying with applicable AI regulations. The National Library of Norway, as the model owner, disclaims liability for any outcomes resulting from third-party use of these models.
### Software
The model was trained using Jax/Flax and converted to PyTorch, Tensorflow, whisper.cpp, and ONNX formats. These are available under `Files and versions`. We welcome requests for conversion to other formats. All training code and scripts are released under the Apache License 2.0 in the GitHub repository [nb-whisper](https://github.com/NbAiLab/nb-whisper/).
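As a small illustration of the multi-framework checkpoints, the standard `transformers` classes can load them directly. The snippet below assumes the TensorFlow and Flax weights in this repository follow the usual `transformers` layout; if a particular format is missing, `from_pretrained` will report it.
```python
# Sketch: loading the same checkpoint in different frameworks via transformers
from transformers import TFWhisperForConditionalGeneration, FlaxWhisperForConditionalGeneration

tf_model = TFWhisperForConditionalGeneration.from_pretrained("NbAiLab/nb-whisper-medium")
flax_model = FlaxWhisperForConditionalGeneration.from_pretrained("NbAiLab/nb-whisper-medium")
```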
## Citation & Contributors
The NB-Whisper Medium model is a product of the NoSTram project led by Per Egil Kummervold ([@pere](https://huggingface.co/pere)) at the National Library of Norway. Key contributors include Javier de la Rosa ([@versae](https://huggingface.co/versae)), Freddy Wetjen ([@freddyw](https://huggingface.co/freddyw)), and Rolv-Arild Braaten ([@Rolv-Arild](https://huggingface.co/Rolv-Arild)). NB AI-Lab, under the direction of Svein Arne Brygfjeld ([@Brygfjeld](https://huggingface.co/Brygfjeld)), supported the project's successful completion. A detailed paper on our process and findings is forthcoming.
## Disclaimer
The models published in this repository are intended for a generalist purpose and are available to third parties. These models may have bias and/or any other undesirable distortions. When third parties deploy or provide systems and/or services to other parties using any of these models (or using systems based on these models), or become users of the models themselves, they should note that it is their responsibility to mitigate the risks arising from their use and, in any event, to comply with applicable regulations, including regulations regarding the use of artificial intelligence. In no event shall the owner of the models (The National Library of Norway) be liable for any results arising from the use made by third parties of these models.
## Acknowledgements
Our gratitude extends to [Google TPU Research Cloud](https://sites.research.google/trc/about/) for training resources, Google Cloud for translation credits, and HuggingFace's Sanchit Gandhi for technical support. A special thank you to Per Erik Solberg at Språkbanken for the collaboration on the Stortinget corpus.
## Contact
For feedback, technical concerns, or collaboration inquiries, please contact <a rel="noopener nofollow" href="mailto:[email protected]">[email protected]</a>. If you plan to include this model in your research, contact us for the latest information on our upcoming paper for citation purposes.
| {"language": ["no", "nb", "nn", "en"], "license": "apache-2.0", "library_name": "transformers", "tags": ["audio", "asr", "automatic-speech-recognition", "hf-asr-leaderboard"], "datasets": ["NbAiLab/ncc_speech", "NbAiLab/NST", "NbAiLab/NPSC"], "metrics": ["wer", "cer"], "base_model": "openai/whisper-medium", "pipeline_tag": "automatic-speech-recognition", "widget": [{"src": "https://datasets-server.huggingface.co/assets/google/fleurs/--/nb_no/train/1/audio/audio.mp3", "example_title": "FLEURS sample 1"}, {"src": "https://datasets-server.huggingface.co/assets/google/fleurs/--/nb_no/train/4/audio/audio.mp3", "example_title": "FLEURS sample 2"}]} | automatic-speech-recognition | NbAiLab/nb-whisper-medium | [
"transformers",
"pytorch",
"jax",
"tensorboard",
"onnx",
"safetensors",
"whisper",
"automatic-speech-recognition",
"audio",
"asr",
"hf-asr-leaderboard",
"no",
"nb",
"nn",
"en",
"dataset:NbAiLab/ncc_speech",
"dataset:NbAiLab/NST",
"dataset:NbAiLab/NPSC",
"arxiv:2212.04356",
"base_model:openai/whisper-medium",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] | 2024-02-13T10:07:32+00:00 | [
"2212.04356"
] | [
"no",
"nb",
"nn",
"en"
]
-0.03802666440606117,
0.0027837723027914762,
0.08550050109624863,
0.006520237773656845,
-0.051601674407720566,
0.054037224501371384,
-0.09269941598176956,
0.1265147477388382,
0.10168280452489853,
0.012311205267906189,
-0.026300907135009766,
-0.04780186340212822,
0.006401794496923685,
-0.011390899308025837,
0.09497781842947006,
-0.003815124509856105,
-0.11586036533117294,
-0.027454063296318054,
-0.0054977149702608585,
0.05677550658583641,
-0.161662295460701,
-0.04133867472410202,
0.00468446733430028,
-0.021599285304546356,
0.02855628728866577,
0.09876874089241028,
0.02730019949376583,
0.007248224690556526,
-0.0229452196508646,
-0.02427842654287815,
0.005362322088330984,
0.07350432127714157,
-0.11940198391675949,
-0.05905388295650482
] |
null | null | transformers |
# NB-Whisper Small
Introducing the **_Norwegian NB-Whisper Small model_**, proudly developed by the National Library of Norway. NB-Whisper is a cutting-edge series of models designed for automatic speech recognition (ASR) and speech translation. These models are based on the work of [OpenAI's Whisper](https://arxiv.org/abs/2212.04356). Each model in the series has been trained for 250,000 steps, utilizing a diverse dataset of 8 million samples. These samples consist of aligned audio clips, each 30 seconds long, culminating in a staggering 66,000 hours of speech. For an in-depth understanding of our training methodology and dataset composition, keep an eye out for our upcoming article.
| Model Size | Parameters | Model |
|------------|------------|------------|
| Tiny | 39M | [NB-Whisper Tiny](https://huggingface.co/NbAiLab/nb-whisper-tiny) |
| Base | 74M | [NB-Whisper Base](https://huggingface.co/NbAiLab/nb-whisper-base) |
| Small | 244M | [NB-Whisper Small](https://huggingface.co/NbAiLab/nb-whisper-small) |
| Medium | 769M | [NB-Whisper Medium](https://huggingface.co/NbAiLab/nb-whisper-medium) |
| Large | 1550M | [NB-Whisper Large](https://huggingface.co/NbAiLab/nb-whisper-large) |
### Verbatim Model
While the main models are suitable for most transcription tasks, we demonstrate how easy it is to change the output of the main model. The following models are trained for 250 additional steps from the main models above, and might be suitable for more targeted use cases:
- **Verbatim version**: This lower-cased variant is more literal and suitable for tasks requiring detailed transcription, such as linguistic analysis.
| Model Size | Parameters | Semantic version |
|------------|------------|------------------|
| Tiny | 39M | [Tiny - semantic](https://huggingface.co/NbAiLab/nb-whisper-tiny-semantic) |
| Base | 74M | [Base - semantic](https://huggingface.co/NbAiLab/nb-whisper-base-semantic) |
| Small | 244M | [Small - semantic](https://huggingface.co/NbAiLab/nb-whisper-small-semantic) |
| Medium | 769M | [Medium - semantic](https://huggingface.co/NbAiLab/nb-whisper-medium-semantic) |
| Large | 1550M | [Large - semantic](https://huggingface.co/NbAiLab/nb-whisper-large-semantic) |
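These variants are drop-in replacements for the main checkpoints: only the model identifier changes. As a minimal sketch (assuming the same `pipeline` usage shown in the local-setup section below, and the sample file introduced there), loading the small semantic variant looks like this:

```python
from transformers import pipeline

# Same usage as the main model; only the checkpoint name differs
asr = pipeline("automatic-speech-recognition", "NbAiLab/nb-whisper-small-semantic")
asr("king.mp3", generate_kwargs={'task': 'transcribe', 'language': 'no'})
```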
### Model Description
- **Developed by:** [NB AI-Lab](https://ai.nb.no/)
- **Shared by:** [NB AI-Lab](https://ai.nb.no/)
- **Model type:** `whisper`
- **Language(s) (NLP):** Norwegian, Norwegian Bokmål, Norwegian Nynorsk, English
- **License:** [Apache 2.0](https://www.apache.org/licenses/LICENSE-2.0)
- **Trained from model:** [openai/whisper-small](https://huggingface.co/openai/whisper-small)
- **Code Repository:** https://github.com/NbAiLab/nb-whisper/
- **Paper:** _Coming soon_
- **Demo:** _See Spaces on this page_
## How to Use the Models
### Online Demos
You can try the models directly through the HuggingFace Inference API, accessible on the right side of this page. Be aware that initially, the model needs to load and will run on limited CPU capacity, which might be slow. To enhance your experience, we are temporarily hosting some models on TPUs for a few days, significantly boosting their performance. Explore these under the **Spaces** section on the [Main Page](https://huggingface.co/NbAiLab/).
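If you would rather script against the hosted endpoint than use the web widget, the request shape follows the standard Inference API pattern for ASR. The sketch below is illustrative only: the endpoint URL assumes the usual `api-inference.huggingface.co` convention, and the token is a hypothetical placeholder, not something provided by this card.

```python
import requests

# Assumed endpoint following the standard Inference API convention
API_URL = "https://api-inference.huggingface.co/models/NbAiLab/nb-whisper-small"
HEADERS = {"Authorization": "Bearer <your-hf-token>"}  # hypothetical placeholder token

def transcribe(filename: str) -> dict:
    """Post raw audio bytes and return the JSON transcription."""
    with open(filename, "rb") as f:
        audio_bytes = f.read()
    response = requests.post(API_URL, headers=HEADERS, data=audio_bytes)
    response.raise_for_status()
    return response.json()

print(transcribe("king.mp3"))
```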
### Local Setup with HuggingFace
Alternatively, you can run the models locally. The Tiny, Base, and Small models are optimized for CPU execution. For the Medium and Large models, we recommend a system equipped with a GPU to ensure efficient processing. Setting up and using these models with HuggingFace's Transformers is straightforward, provided you have [Python](https://www.python.org/downloads/) installed on your machine. For practical demonstrations, refer to examples using this [sample mp3 file](https://github.com/NbAiLab/nb-whisper/raw/main/audio/king.mp3).
```bash
# Download the sample file
$ wget -N https://github.com/NbAiLab/nb-whisper/raw/main/audio/king.mp3
# Install necessary libraries.
$ pip install 'transformers>=4.35.2'
```
After this is done, you should be able to run this in Python:
```python
from transformers import pipeline
# Load the model
asr = pipeline("automatic-speech-recognition", "NbAiLabBeta/nb-whisper-small")
# Transcribe
asr("king.mp3", generate_kwargs={'task': 'transcribe', 'language': 'no'})
```
<details>
<summary>Expected output</summary>
```json
{
{'text': ' Nordmenn er nordlendinger, trøndere, sørlendinger og folk fra alle andre regioner. Nordmenn er også innvandret fra Afghanistan, Pakistan, Polen, Sverige, Somalia og Syria. Det er ikke alltid så lett å si hvor vi er fra, hvilken nasjonalitet vi er fra. Hvilken nasjonalitet vi er fra. Hvilken nasjonalitet vi er fra. Hvilken nasjonalitet vi er fra. Hvilken nasjonalitet vi er fra. Hvilken nasjonalitet vi er fra. Hvilken nasjonalitet vi er fra.'}
}
```
</details>
#### Extended HuggingFace
Examining the output above, we see that there are multiple repetitions at the end. This is because the audio is longer than 30 seconds. By passing the `chunk_length_s` argument, we can transcribe longer files. Our experience is that we get slightly better results by setting it to 28 seconds instead of the default 30 seconds. We also recommend setting the beam size to 5 if possible. This greatly increases the accuracy but takes a bit longer and requires slightly more memory. The examples below also illustrate how to transcribe to English or Nynorsk, and how to get timestamps for sentences and words.
```python
# Long Transcripts
asr("king.mp3", chunk_length_s=28, generate_kwargs={'task': 'transcribe', 'language': 'no'})
# Increase accuracy by setting beam size to 5
asr("king.mp3", chunk_length_s=28, return_timestamps=True, generate_kwargs={'num_beams': 5, 'task': 'transcribe', 'language': 'no'})
# Return Timestamps
asr("king.mp3", chunk_length_s=28, return_timestamps=True, generate_kwargs={'task': 'transcribe', 'language': 'no'})
# Return Word Level Timestamps
asr("king.mp3", chunk_length_s=28, return_timestamps="word", generate_kwargs={'task': 'transcribe', 'language': 'no'})
# Transcribe to Nynorsk
asr("king.mp3", chunk_length_s=28, generate_kwargs={'task': 'transcribe', 'language': 'nn'})
# Transcribe to English
asr("king.mp3", chunk_length_s=28, generate_kwargs={'task': 'transcribe', 'language': 'en'})
```
<details>
<summary>Expected output</summary>
Long transcripts:
```json
{
{'text': ' Nordmenn er nordlendinger, trøndere, sørlendinger og folk fra alle andre regioner. Nordmenn er også innvandret fra Afghanistan, Pakistan, Polen, Sverige, Somalia og Syria. Det er ikke alltid så lett å si hvor vi er fra, hvilken nasjonalitet vi er fra. Hvilken nasjonalitet vi er fra. Hvilken nasjonalitet vi er fra. Hvilken nasjonalitet vi er fra. Hvilken nasjonalitet vi er fra. Hvilken nasjonalitet vi er fra, hvilken nasjonalitet vi tilhører. Det vi kaller hjem, er der hjertet vårt er, og det kan ikke alltid plasseres innenfor landegrenser. Nordmenn er jenter som er glad i jenter, gutter som er glad i gutter, og jenter og gutter som er glad i hverandre. Nordmenn trommer på Gud, Allah, Altet og ingenting. Nordmenn liker Grieg, Kygo, Helbilis og Kari Bremnes. Med andre ord, Norge er dere. Norge er oss. Mitt største håp for Norge er at vi skal klare å ta vare på hverandre, at vi skal bygge dette landet videre på tillit, fellesskap og raushet.'}
}
```
Timestamps:
```json
{
{'text': ' Nordmenn er nordlendinger, trøndere, sørlendinger og folk fra alle andre regioner. Nordmenn er også innvandret fra Afghanistan, Pakistan, Polen, Sverige, Somalia og Syria. Det er ikke alltid så lett å si hvor vi er fra, hvilken nasjonalitet vi er fra. Hvilken nasjonalitet vi er fra. hvilken nasjonalitet vi tilhører. Det vi kaller hjem, er der hjertet vårt er, og det kan ikke alltid plasseres innenfor landegrenser. Nordmenn er jenter som er glad i jenter, gutter som er glad i gutter, og jenter og gutter som er glad i hverandre. Nordmenn trommer på Gud, Allah, Altet og ingenting. Nordmenn liker Grieg, Kygo, Helbiles og Kari Bremnes. Med andre ord, Norge er dere. Norge er oss. Mitt største håp for Norge er at vi skal klare å ta vare på hverandre, at vi skal bygge dette landet videre på tillit, fellesskap og raushet.',
'chunks': [{'timestamp': (0.0, 5.46),
'text': ' Nordmenn er nordlendinger, trøndere, sørlendinger'},
{'timestamp': (5.52, 8.68), 'text': ' og folk fra alle andre regioner.'},
{'timestamp': (8.68, 16.64),
'text': ' Nordmenn er også innvandret fra Afghanistan, Pakistan, Polen, Sverige, Somalia og Syria.'},
{'timestamp': (16.64, 13.3),
'text': ' Det er ikke alltid så lett å si hvor vi er fra, hvilken nasjonalitet vi er fra.'},
{'timestamp': (13.32, 30.28),
'text': ' Hvilken nasjonalitet vi er fra. hvilken nasjonalitet vi tilhører.'},
{'timestamp': (32.52, 39.16),
'text': ' Det vi kaller hjem, er der hjertet vårt er, og det kan ikke alltid plasseres'},
{'timestamp': (39.16, 42.0), 'text': ' innenfor landegrenser.'},
{'timestamp': (42.0, 46.74),
'text': ' Nordmenn er jenter som er glad i jenter, gutter som er glad i gutter,'},
{'timestamp': (46.74, 51.12),
'text': ' og jenter og gutter som er glad i hverandre.'},
{'timestamp': (51.16, 57.42),
'text': ' Nordmenn trommer på Gud, Allah, Altet og ingenting.'},
{'timestamp': (57.42, 64.3),
'text': ' Nordmenn liker Grieg, Kygo, Helbiles og Kari Bremnes.'},
{'timestamp': (64.34, 71.24),
'text': ' Med andre ord, Norge er dere. Norge er oss.'},
{'timestamp': (71.24, 78.04),
'text': ' Mitt største håp for Norge er at vi skal klare å ta vare på hverandre,'},
{'timestamp': (78.12, 84.68),
'text': ' at vi skal bygge dette landet videre på tillit, fellesskap og raushet.'}]}
}
```
Word Level Timestamps:
```json
{
{"text": "Nordmenn er nordlendinger, trøndere, sørlendinger og folk fra alle andre regioner. Nordmenn er også innvandret fra Afghanistan, Pakistan, Polen, Sverige, Somalia og Syria. Det er ikke alltid så lett å si hvor vi er fra, hvilken nasjonalitet vi tilhører. Det vi kaller hjem, er der hjertet vårt er, og det kan ikke alltid plasseres innenfor landegrenser. Nordmenn er jenter som er glad i jenter, gutter som er glad i gutter, og jenter og gutter som er glad i hverandre. Nordmenn trommer på Gud, Allah, Altet og ingenting. Nordmenn liker Grieg, Kygo, Helbilis og Kari Bremnes. Med andre ord, Norge er dere. Norge er oss. Mitt største håp for Norge er at vi skal klare å ta vare på hverandre, at vi skal bygge dette landet videre på tillit, fellesskap og raushet.",
"chunks": [
{"text": "Nordmenn", "timestamp": [0.72, 1.42]},
{"text": "er", "timestamp": [1.42, 1.74]},
// ... more chunks ...
{"text": "raushet.", "timestamp": [83.1, 84.88]}
]
}
}
```
Nynorsk:
```json
{
{"text": "Nordmenn er nordlendingar, trøndarar, sørlendingar og folk frå alle andre regionar. Nordmenn er også innvandra frå Afghanistan, Pakistan, Polen, Sverige, Somalia og Syria. Det er ikkje alltid så lett å seie kvar vi er frå, kva nasjonalitet vi tilhøyrer. Det vi kallar heim, er der hjartet vårt er, og det kan ikkje alltid plasserast innanfor landegrenser. Nordmenn er jenter som er glad i jenter, gutar som erade i gutar, og jenter og gutar som er glade i kvarandre. Nordmenn trommar på Gud, Allah, Altet og ingenting. Nordmenn liker Grieg, Kygo, Helbiles og Kari Bremnes. Med andre ord, Noreg er dere! Noreg er oss. Mitt største håp for Noreg er at vi skal klare å ta vare på kvarandre, at vi skal byggje dette landet vidare på tillit, fellesskap og raushet."}
}
```
English:
```json
{
{"text": "Norwegians are Norwegians, trønders, southerners and people from all other regions. Norwegians are also invaded from Afghanistan, Pakistan, Poland, Sweden, Somalia and Suria. It is not always so easy to say where we are from, what nationality we belong to. What we call home is where our heart is, and it cannot always be placed within national borders. Norwegians are girls who like girls, boys who like boys, and girls and boys who like each other. Norwegians thrump on God, Allah, Altet and nothing. Norwegians like Grieg, Kygo, Helbilis and Kari Bremnes. In other words, Norway is you. Norway is us. My biggest hope for Norway is that we should be able to take care of each other, that we should build this country on trust, community and generosity."}
}
```
</details>
### Whisper CPP
Whisper CPP is a C++ implementation of the Whisper model, offering the same functionalities with the added benefits of C++ efficiency and performance optimizations. This allows embedding any Whisper model into a binary file, facilitating the development of real applications. However, it requires some familiarity with compiling C++ programs. Their [homepage](https://github.com/ggerganov/whisper.cpp) provides examples of how to build applications, including real-time transcription.
We have converted this model to the ggml-format model used by Whisper CPP binaries. The file can be downloaded [here](blob/main/ggml-model.bin), and a `q5_0` quantized version is also available [here](blob/main/ggml-model-q5_0.bin).
```bash
# We can download and compile whisper.cpp
$ git clone --depth 1 https://github.com/ggerganov/whisper.cpp --branch v1.5.1
$ cd whisper.cpp/
$ make
# We also need to convert the audio to WAV as that is the only format supported by whisper.cpp
$ wget -N https://github.com/NbAiLab/nb-whisper/raw/main/audio/king.mp3
$ ffmpeg -i king.mp3 -ar 16000 -ac 1 -c:a pcm_s16le king.wav
# Let's download the two ggml-files from this site
$ wget -N https://huggingface.co/NbAiLab/nb-whisper-small/resolve/main/ggml-model.bin -O models/nb-small-ggml-model.bin
$ wget -N https://huggingface.co/NbAiLab/nb-whisper-small/resolve/main/ggml-model-q5_0.bin -O models/nb-small-ggml-model-q5_0.bin
# And run it with the f16 default model
$ ./main -l no -m models/nb-small-ggml-model.bin king.wav
# Or the quantized version
$ ./main -l no -m models/nb-small-ggml-model-q5_0.bin king.wav
```
### WhisperX and Speaker Diarization
Speaker diarization is a technique in natural language processing and automatic speech recognition that identifies and separates different speakers in an audio recording. It segments the audio into parts based on who is speaking, enhancing the quality of transcribing meetings or phone calls. We find that [WhisperX](https://github.com/m-bain/whisperX) is the easiest way to use our models for diarizing speech. In addition, WhisperX uses phoneme-based Wav2Vec models to improve the alignment of the timestamps. As of December 2023 it also has native support for the nb-wav2vec models. It currently uses [PyAnnote-audio](https://github.com/pyannote/pyannote-audio) for the actual diarization. This package has a fairly strict license that requires you to agree to user terms. Follow the instructions below.
```bash
# Follow the install instructions on https://github.com/m-bain/whisperX
# Make sure you have a HuggingFace account and have agreed to the pyannote terms
# Log in (or supply HF Token in command line)
huggingface-cli login
# Download a test file
wget -N https://github.com/NbAiLab/nb-whisper/raw/main/audio/knuthamsun.mp3
# Optional. If you get complaints about missing support for Norwegian, do:
pip uninstall whisperx && pip install git+https://github.com/m-bain/whisperx.git@8540ff5985fceee764acbed94f656063d7f56540
# Transcribe the test file. All transcripts will end up in the directory of the mp3-file
whisperx knuthamsun.mp3 --model NbAiLabBeta/nb-whisper-small --language no --diarize
```
You can also run WhisperX from Python. Please take a look at the instructions on [WhisperX homepage](https://github.com/m-bain/whisperX).
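As a rough sketch of that workflow (following the WhisperX README; the exact function signatures may differ between WhisperX versions, and the token is a hypothetical placeholder):

```python
import whisperx

device = "cuda"  # "cpu" also works, but diarization will be slow
audio = whisperx.load_audio("knuthamsun.mp3")

# 1. Transcribe with the NB-Whisper checkpoint
model = whisperx.load_model("NbAiLabBeta/nb-whisper-small", device, language="no")
result = model.transcribe(audio, batch_size=16)

# 2. Refine the timestamps with a phoneme-based wav2vec alignment model
align_model, metadata = whisperx.load_align_model(language_code="no", device=device)
result = whisperx.align(result["segments"], align_model, metadata, audio, device)

# 3. Attach speaker labels (requires accepting the pyannote user terms)
diarize_model = whisperx.DiarizationPipeline(use_auth_token="<your-hf-token>", device=device)
diarize_segments = diarize_model(audio)
result = whisperx.assign_word_speakers(diarize_segments, result)

for segment in result["segments"]:
    print(segment.get("speaker"), segment["text"])
```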
### API
Instructions for accessing the models via a simple API are included in the demos under Spaces. Note that these demos are temporary and will only be available for a few weeks.
## Training Data
The training data originates from Språkbanken and the National Library of Norway's digital collection, including:
- NST Norwegian ASR Database (16 kHz) and its corresponding dataset
- Transcribed speeches from the Norwegian Parliament by Språkbanken
- TV broadcast (NRK) subtitles (NLN digital collection)
- Audiobooks (NLN digital collection)
## Downstream Use
The models, especially the smaller ones, may exhibit occasional hallucinations and may drop parts of the transcript. They are designed to convert spoken language into grammatically correct written sentences, which might not always be word-for-word transcriptions. We have made two extra model variants for users who want a different transcription style. We encourage users to try the models themselves to get a better understanding.
## Bias, Risks, and Limitations
Using these models without adequate risk assessment and mitigation could be considered irresponsible. They may contain biases or other undesirable distortions. Users who deploy these models or integrate them into systems or services are responsible for mitigating risks and complying with applicable AI regulations. The National Library of Norway, as the model owner, disclaims liability for any outcomes resulting from third-party use of these models.
### Software
The model was trained using Jax/Flax and converted to PyTorch, TensorFlow, whisper.cpp, and ONNX formats. These are available under `Files and versions`. We welcome requests for conversion to other formats. All training code and scripts are released under the Apache License 2.0 in the GitHub repository [nb-whisper](https://github.com/NbAiLab/nb-whisper/).
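For instance, the ONNX export can be run through ONNX Runtime via the Optimum library. Treat the following as a hedged sketch rather than an officially supported path: `ORTModelForSpeechSeq2Seq` and its loading arguments are assumptions based on the Optimum API and may need adjusting to point at the files under `Files and versions`.

```python
from optimum.onnxruntime import ORTModelForSpeechSeq2Seq
from transformers import AutoProcessor, pipeline

model_id = "NbAiLab/nb-whisper-small"
# export=True converts the checkpoint to ONNX on the fly if needed
model = ORTModelForSpeechSeq2Seq.from_pretrained(model_id, export=True)
processor = AutoProcessor.from_pretrained(model_id)

asr = pipeline(
    "automatic-speech-recognition",
    model=model,
    tokenizer=processor.tokenizer,
    feature_extractor=processor.feature_extractor,
)
print(asr("king.mp3", generate_kwargs={"task": "transcribe", "language": "no"}))
```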
## Citation & Contributors
The NB-Whisper Small model is a product of the NoSTram project led by Per Egil Kummervold ([@pere](https://huggingface.co/pere)) at the National Library of Norway. Key contributors include Javier de la Rosa ([@versae](https://huggingface.co/versae)), Freddy Wetjen ([@freddyw](https://huggingface.co/freddyw)), and Rolv-Arild Braaten ([@Rolv-Arild](https://huggingface.co/Rolv-Arild)). NB AI-Lab, under the direction of Svein Arne Brygfjeld ([@Brygfjeld](https://huggingface.co/Brygfjeld)), supported the project's successful completion. A detailed paper on our process and findings is forthcoming.
## Disclaimer
The models published in this repository are intended for a generalist purpose and are available to third parties. These models may have bias and/or other undesirable distortions. When third parties deploy or provide systems and/or services to other parties using any of these models (or using systems based on these models), or become users of the models, they should note that it is their responsibility to mitigate the risks arising from their use and, in any event, to comply with applicable regulations, including regulations regarding the use of artificial intelligence. In no event shall the owner of the models (The National Library of Norway) be liable for any results arising from the use made by third parties of these models.
## Acknowledgements
Our gratitude extends to [Google TPU Research Cloud](https://sites.research.google/trc/about/) for training resources, Google Cloud for translation credits, and HuggingFace's Sanchit Gandhi for technical support. A special thank you to Per Erik Solberg at Språkbanken for the collaboration on the Stortinget corpus.
## Contact
For feedback, technical concerns, or collaboration inquiries, please contact <a rel="noopener nofollow" href="mailto:[email protected]">[email protected]</a>. If you plan to include this model in your research, contact us for the latest information on our upcoming paper for citation purposes.
| {"language": ["no", "nb", "nn", "en"], "license": "apache-2.0", "library_name": "transformers", "tags": ["audio", "asr", "automatic-speech-recognition", "hf-asr-leaderboard"], "datasets": ["NbAiLab/ncc_speech", "NbAiLab/NST", "NbAiLab/NPSC"], "metrics": ["wer", "cer"], "base_model": "openai/whisper-small", "pipeline_tag": "automatic-speech-recognition", "widget": [{"src": "https://datasets-server.huggingface.co/assets/google/fleurs/--/nb_no/train/1/audio/audio.mp3", "example_title": "FLEURS sample 1"}, {"src": "https://datasets-server.huggingface.co/assets/google/fleurs/--/nb_no/train/4/audio/audio.mp3", "example_title": "FLEURS sample 2"}]} | automatic-speech-recognition | NbAiLab/nb-whisper-small | [
"transformers",
"pytorch",
"jax",
"tensorboard",
"onnx",
"safetensors",
"whisper",
"automatic-speech-recognition",
"audio",
"asr",
"hf-asr-leaderboard",
"no",
"nb",
"nn",
"en",
"dataset:NbAiLab/ncc_speech",
"dataset:NbAiLab/NST",
"dataset:NbAiLab/NPSC",
"arxiv:2212.04356",
"base_model:openai/whisper-small",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] | 2024-02-13T10:07:40+00:00 | [
"2212.04356"
] | [
"no",
"nb",
"nn",
"en"
] | TAGS
#transformers #pytorch #jax #tensorboard #onnx #safetensors #whisper #automatic-speech-recognition #audio #asr #hf-asr-leaderboard #no #nb #nn #en #dataset-NbAiLab/ncc_speech #dataset-NbAiLab/NST #dataset-NbAiLab/NPSC #arxiv-2212.04356 #base_model-openai/whisper-small #license-apache-2.0 #endpoints_compatible #region-us
| NB-Whisper Small
================
Introducing the *Norwegian NB-Whisper Small model*, proudly developed by the National Library of Norway. NB-Whisper is a cutting-edge series of models designed for automatic speech recognition (ASR) and speech translation. These models are based on the work of OpenAI's Whisper. Each model in the series has been trained for 250,000 steps, utilizing a diverse dataset of 8 million samples. These samples consist of aligned audio clips, each 30 seconds long, culminating in a staggering 66,000 hours of speech. For an in-depth understanding of our training methodology and dataset composition, keep an eye out for our upcoming article.
Model Size: Tiny, Parameters: 39M, Model: NB-Whisper Tiny
Model Size: Base, Parameters: 74M, Model: NB-Whisper Base
Model Size: Small, Parameters: 244M, Model: NB-Whisper Small
Model Size: Medium, Parameters: 769M, Model: NB-Whisper Medium
Model Size: Large, Parameters: 1550M, Model: NB-Whisper Large
### Verbatim Model
While the main models are suitable for most transcription tasks, we demonstrate how easy it is to change the output of the main model. The following models are trained for 250 additional steps from the main models above, and might be suitable for more targeted use cases:
* Verbatim version: This lower-cased variant is more literal and suitable for tasks requiring detailed transcription, such as linguistic analysis.
Model Size: Tiny, Parameters: 39M, Semantic version: Tiny - semantic
Model Size: Base, Parameters: 74M, Semantic version: Base - semantic
Model Size: Small, Parameters: 244M, Semantic version: Small - semantic
Model Size: Medium, Parameters: 769M, Semantic version: Medium - semantic
Model Size: Large, Parameters: 1550M, Semantic version: Large - semantic
### Model Description
* Developed by: NB AI-Lab
* Shared by: NB AI-Lab
* Model type: 'whisper'
* Language(s) (NLP): Norwegian, Norwegian Bokmål, Norwegian Nynorsk, English
* License: Apache 2.0
* Trained from model: openai/whisper-small
* Code Repository: URL
* Paper: *Coming soon*
* Demo: *See Spaces on this page*
How to Use the Models
---------------------
### Online Demos
You can try the models directly through the HuggingFace Inference API, accessible on the right side of this page. Be aware that initially, the model needs to load and will run on limited CPU capacity, which might be slow. To enhance your experience, we are temporarily hosting some models on TPUs for a few days, significantly boosting their performance. Explore these under the Spaces section on the Main Page.
### Local Setup with HuggingFace
Alternatively, you can run the models locally. The Tiny, Base, and Small models are optimized for CPU execution. For the Medium and Large models, we recommend a system equipped with a GPU to ensure efficient processing. Setting up and using these models with HuggingFace's Transformers is straightforward, provided you have Python installed on your machine. For practical demonstrations, refer to examples using this sample mp3 file.
After this is done, you should be able to run this in Python:
Expected output
#### Extended HuggingFace
Examining the output above, we see that there are multiple repetitions at the end. This is because the audio is longer than 30 seconds. By passing the chunk_length_s argument, we can transcribe longer files. Our experience is that we get slightly better results by setting it to 28 seconds instead of the default 30 seconds. We also recommend setting the beam size to 5 if possible. This greatly increases the accuracy but takes a bit longer and requires slightly more memory. The examples below also illustrate how to transcribe to English or Nynorsk, and how to get timestamps for sentences and words.
Expected output
Long transcripts:
Timestamps:
Word Level Timestamps:
Nynorsk:
English:
### Whisper CPP
Whisper CPP is a C++ implementation of the Whisper model, offering the same functionalities with the added benefits of C++ efficiency and performance optimizations. This allows embedding any Whisper model into a binary file, facilitating the development of real applications. However, it requires some familiarity with compiling C++ programs. Their homepage provides examples of how to build applications, including real-time transcription.
We have converted this model to the ggml-format model used by Whisper CPP binaries. The file can be downloaded here, and a 'q5\_0' quantized version is also available here.
### WhisperX and Speaker Diarization
Speaker diarization is a technique in natural language processing and automatic speech recognition that identifies and separates different speakers in an audio recording. It segments the audio into parts based on who is speaking, enhancing the quality of transcribing meetings or phone calls. We find that WhisperX is the easiest way to use our models for diarizing speech. In addition, WhisperX uses phoneme-based Wav2Vec models to improve the alignment of the timestamps. As of December 2023 it also has native support for the nb-wav2vec models. It currently uses PyAnnote-audio for the actual diarization. This package has a fairly strict license that requires you to agree to user terms. Follow the instructions below.
You can also run WhisperX from Python. Please take a look at the instructions on WhisperX homepage.
### API
Instructions for accessing the models via a simple API are included in the demos under Spaces. Note that these demos are temporary and will only be available for a few weeks.
Training Data
-------------
The training data originates from Språkbanken and the National Library of Norway's digital collection, including:
* NST Norwegian ASR Database (16 kHz) and its corresponding dataset
* Transcribed speeches from the Norwegian Parliament by Språkbanken
* TV broadcast (NRK) subtitles (NLN digital collection)
* Audiobooks (NLN digital collection)
Downstream Use
--------------
The models, especially the smaller ones, may exhibit occasional hallucinations and may drop parts of the transcript. They are designed to convert spoken language into grammatically correct written sentences, which might not always be word-for-word transcriptions. We have made two extra model variants for users who want a different transcription style. We encourage users to try the models themselves to get a better understanding.
Bias, Risks, and Limitations
----------------------------
Using these models without adequate risk assessment and mitigation could be considered irresponsible. They may contain biases or other undesirable distortions. Users who deploy these models or integrate them into systems or services are responsible for mitigating risks and complying with applicable AI regulations. The National Library of Norway, as the model owner, disclaims liability for any outcomes resulting from third-party use of these models.
### Software
The model was trained using Jax/Flax and converted to PyTorch, TensorFlow, URL, and ONNX formats. These are available under 'Files and versions'. We welcome requests for conversion to other formats. All training code and scripts are released under the Apache License 2.0 in the GitHub repository nb-whisper.
Citation & Contributors
-----------------------
The NB-Whisper Small model is a product of the NoSTram project led by Per Egil Kummervold (@pere) at the National Library of Norway. Key contributors include Javier de la Rosa (@versae), Freddy Wetjen (@freddyw), and Rolv-Arild Braaten (@Rolv-Arild). NB AI-Lab, under the direction of Svein Arne Brygfjeld (@Brygfjeld), supported the project's successful completion. A detailed paper on our process and findings is forthcoming.
Disclaimer
----------
The models published in this repository are intended for a generalist purpose and are available to third parties. These models may have bias and/or other undesirable distortions. When third parties deploy or provide systems and/or services to other parties using any of these models (or using systems based on these models), or become users of the models, they should note that it is their responsibility to mitigate the risks arising from their use and, in any event, to comply with applicable regulations, including regulations regarding the use of artificial intelligence. In no event shall the owner of the models (The National Library of Norway) be liable for any results arising from the use made by third parties of these models.
Acknowledgements
----------------
Our gratitude extends to Google TPU Research Cloud for training resources, Google Cloud for translation credits, and HuggingFace's Sanchit Gandhi for technical support. A special thank you to Per Erik Solberg at Språkbanken for the collaboration on the Stortinget corpus.
Contact
-------
For feedback, technical concerns, or collaboration inquiries, please contact [ailab@URL](mailto:ailab@URL). If you plan to include this model in your research, contact us for the latest information on our upcoming paper for citation purposes.
| [
"### Verbatim Model\n\n\nWhile the main models are suitable for most transcription task, we demonstrate how easy it is to change the output of the main model. The following models are trained 250 additional steps from the main models above, and might be suitable for more targetted use cases:\n\n\n* Verbatim version: This lower-cased variant is more literal and suitable for tasks requiring detailed transcription, such as linguistic analysis.\n\n\nModel Size: Tiny, Parameters: 39M, Semantic version: Tiny - semantic\nModel Size: Base, Parameters: 74M, Semantic version: Base - semantic\nModel Size: Small, Parameters: 244M, Semantic version: Small - semantic\nModel Size: Medium, Parameters: 769M, Semantic version: Medium - semantic\nModel Size: Large, Parameters: 1550M, Semantic version: Large - semantic",
"### Model Description\n\n\n* Developed by: NB AI-Lab\n* Shared by: NB AI-Lab\n* Model type: 'whisper'\n* Language(s) (NLP): Norwegian, Norwegian Bokmål, Norwegian Nynorsk, English\n* License: Apache 2.0\n* Trained from model: openai/whisper-small\n* Code Repository: URL\n* Paper: *Coming soon*\n* Demo: *See Spaces on this page*\n\n\nHow to Use the Models\n---------------------",
"### Online Demos\n\n\nYou can try the models directly through the HuggingFace Inference API, accessible on the right side of this page. Be aware that initially, the model needs to load and will run on limited CPU capacity, which might be slow. To enhance your experience, we are temporarily hosting some models on TPUs for a few days, significantly boosting their performance. Explore these under the Spaces section on the Main Page.",
"### Local Setup with HuggingFace\n\n\nAlternatively, you can run the models locally. The Tiny, Base, and Small models are optimized for CPU execution. For the Medium and Large models, we recommend a system equipped with a GPU to ensure efficient processing. Setting up and using these models with HuggingFace's Transformers is straightforward, provided you have Python installed on your machine. For practical demonstrations, refer to examples using this sample mp3 file.\n\n\nAfter this is done, you should be able to run this in Python:\n\n\n\nExpected output",
"#### Extended HuggingFace\n\n\nExamining the output above, we see that there are multiple repetitions at the end. This is because the video is longer than 30 seconds. By passing the argument, we can transcribe longer file. Our experience is that we get slightly better result by setting that to 28 seconds instead of the default 30 seconds. We also recommend setting the beam size to 5 if possible. This greatly increases the accuracy but takes a bit longer and requires slightly more memory. The examples below also illustrates how to transcribe to English or Nynorsk, and how to get timestamps for sentences and words.\n\n\n\nExpected output\nLong transcripts:\n\n\nTimestamps:\n\n\nWord Level Timestamps:\n\n\nNynorsk:\n\n\nEnglish:",
"### Whisper CPP\n\n\nWhisper CPP is a C++ implementation of the Whisper model, offering the same functionalities with the added benefits of C++ efficiency and performance optimizations. This allows embedding any Whisper model into a binary file, facilitating the development of real applications. However, it requires some familiarity with compiling C++ programs. Their homepage provides examples of how to build applications, including real-time transcription.\n\n\nWe have converted this model to the ggml-format model used by Whisper CPP binaries. The file can be downloaded here, and a 'q5\\_0' quantized version is also available here.",
"### WhisperX and Speaker Diarization\n\n\nSpeaker diarization is a technique in natural language processing and automatic speech recognition that identifies and separates different speakers in an audio recording. It segments the audio into parts based on who is speaking, enhancing the quality of transcribing meetings or phone calls. We find that WhisperX is the easiest way to use our models for diarizing speech. In addition, WhisperX is using phoneme-based Wav2Vec-models for improving the alignment of the timestamps. As of December 2023 it also has native support for using the nb-wav2vec-models. It currently uses PyAnnote-audio for doing the actual diarization. This package has a fairly strict licence where you have to agree to user terms. Follow the instructions below.\n\n\nYou can also run WhisperX from Python. Please take a look at the instructions on WhisperX homepage.",
"### API\n\n\nInstructions for accessing the models via a simple API are included in the demos under Spaces. Note that these demos are temporary and will only be available for a few weeks.\n\n\nTraining Data\n-------------\n\n\nThe training data originates from Språkbanken and the National Library of Norway's digital collection, including:\n\n\n* NST Norwegian ASR Database (16 kHz) and its corresponding dataset\n* Transcribed speeches from the Norwegian Parliament by Språkbanken\n* TV broadcast (NRK) subtitles (NLN digital collection)\n* Audiobooks (NLN digital collection)\n\n\nDownstream Use\n--------------\n\n\nThe models, especially the smaller ones, may exhibit occasional hallucinations and may drop parts of the transcript. They are designed to convert spoken language into grammatically correct written sentences, which might not always be word-for-word translations. We have made two extra model variant for users that want a different transcription style. We encourage users to try the models themselves to get a better understanding.\n\n\nBias, Risks, and Limitations\n----------------------------\n\n\nUsing these models without adequate risk assessment and mitigation could be considered irresponsible. They may contain biases or other undesirable distortions. Users who deploy these models or integrate them into systems or services are responsible for mitigating risks and complying with applicable AI regulations. The National Library of Norway, as the model owner, disclaims liability for any outcomes resulting from third-party use of these models.",
"### Software\n\n\nThe model was trained using Jax/Flax and converted to PyTorch, Tensorflow, URL, and ONXX formats. These are available under 'Files and versions'. We welcome requests for conversion to other formats. All training code and scripts are released under the Apache License 2.0 in the GitHub repository nb-whisper.\n\n\n& Contributors\nThe NB-Whisper Small model is a product of the NoSTram project led by Per Egil Kummervold (@pere) at the National Library of Norway. Key contributors include Javier de la Rosa (@versae), Freddy Wetjen (@freddyw), and Rolv-Arild Braaten (@Rolv-Arild). NB AI-Lab, under the direction of Svein Arne Brygfjeld (@Brygfjeld), supported the project's successful completion. A detailed paper on our process and findings is forthcoming.\n\n\nDisclaimer\n----------\n\n\nThe models published in this repository are intended for a generalist purpose and are available to third parties. These models may have bias and/or any other undesirable distortions. When third parties, deploy or provide systems and/or services to other parties using any of these models (or using systems based on these models) or become users of the models, they should note that it is their responsibility to mitigate the risks arising from their use and, in any event, to comply with applicable regulations, including regulations regarding the use of artificial intelligence. In no event shall the owner of the models (The National Library of Norway) be liable for any results arising from the use made by third parties of these models.\n\n\nAcknowledgements\n----------------\n\n\nOur gratitude extends to Google TPU Research Cloud for training resources, Google Cloud for translation credits, and HuggingFace's Sanchit Ghandi for technical support. A special thank you to Per Erik Solberg at Språkbanken for the collaboration on the Stortinget corpus.\n\n\nContact\n-------\n\n\nFor feedback, technical concerns, or collaboration inquiries, please contact [ailab@URL](mailto:ailab@URL). If you plan to include this model in your research, contact us for the latest information on our upcoming paper for citation purposes."
] | [
"TAGS\n#transformers #pytorch #jax #tensorboard #onnx #safetensors #whisper #automatic-speech-recognition #audio #asr #hf-asr-leaderboard #no #nb #nn #en #dataset-NbAiLab/ncc_speech #dataset-NbAiLab/NST #dataset-NbAiLab/NPSC #arxiv-2212.04356 #base_model-openai/whisper-small #license-apache-2.0 #endpoints_compatible #region-us \n",
"### Verbatim Model\n\n\nWhile the main models are suitable for most transcription task, we demonstrate how easy it is to change the output of the main model. The following models are trained 250 additional steps from the main models above, and might be suitable for more targetted use cases:\n\n\n* Verbatim version: This lower-cased variant is more literal and suitable for tasks requiring detailed transcription, such as linguistic analysis.\n\n\nModel Size: Tiny, Parameters: 39M, Semantic version: Tiny - semantic\nModel Size: Base, Parameters: 74M, Semantic version: Base - semantic\nModel Size: Small, Parameters: 244M, Semantic version: Small - semantic\nModel Size: Medium, Parameters: 769M, Semantic version: Medium - semantic\nModel Size: Large, Parameters: 1550M, Semantic version: Large - semantic",
"### Model Description\n\n\n* Developed by: NB AI-Lab\n* Shared by: NB AI-Lab\n* Model type: 'whisper'\n* Language(s) (NLP): Norwegian, Norwegian Bokmål, Norwegian Nynorsk, English\n* License: Apache 2.0\n* Trained from model: openai/whisper-small\n* Code Repository: URL\n* Paper: *Coming soon*\n* Demo: *See Spaces on this page*\n\n\nHow to Use the Models\n---------------------",
"### Online Demos\n\n\nYou can try the models directly through the HuggingFace Inference API, accessible on the right side of this page. Be aware that initially, the model needs to load and will run on limited CPU capacity, which might be slow. To enhance your experience, we are temporarily hosting some models on TPUs for a few days, significantly boosting their performance. Explore these under the Spaces section on the Main Page.",
"### Local Setup with HuggingFace\n\n\nAlternatively, you can run the models locally. The Tiny, Base, and Small models are optimized for CPU execution. For the Medium and Large models, we recommend a system equipped with a GPU to ensure efficient processing. Setting up and using these models with HuggingFace's Transformers is straightforward, provided you have Python installed on your machine. For practical demonstrations, refer to examples using this sample mp3 file.\n\n\nAfter this is done, you should be able to run this in Python:\n\n\n\nExpected output",
"#### Extended HuggingFace\n\n\nExamining the output above, we see that there are multiple repetitions at the end. This is because the video is longer than 30 seconds. By passing the argument, we can transcribe longer file. Our experience is that we get slightly better result by setting that to 28 seconds instead of the default 30 seconds. We also recommend setting the beam size to 5 if possible. This greatly increases the accuracy but takes a bit longer and requires slightly more memory. The examples below also illustrates how to transcribe to English or Nynorsk, and how to get timestamps for sentences and words.\n\n\n\nExpected output\nLong transcripts:\n\n\nTimestamps:\n\n\nWord Level Timestamps:\n\n\nNynorsk:\n\n\nEnglish:",
"### Whisper CPP\n\n\nWhisper CPP is a C++ implementation of the Whisper model, offering the same functionalities with the added benefits of C++ efficiency and performance optimizations. This allows embedding any Whisper model into a binary file, facilitating the development of real applications. However, it requires some familiarity with compiling C++ programs. Their homepage provides examples of how to build applications, including real-time transcription.\n\n\nWe have converted this model to the ggml-format model used by Whisper CPP binaries. The file can be downloaded here, and a 'q5\\_0' quantized version is also available here.",
"### WhisperX and Speaker Diarization\n\n\nSpeaker diarization is a technique in natural language processing and automatic speech recognition that identifies and separates different speakers in an audio recording. It segments the audio into parts based on who is speaking, enhancing the quality of transcribing meetings or phone calls. We find that WhisperX is the easiest way to use our models for diarizing speech. In addition, WhisperX is using phoneme-based Wav2Vec-models for improving the alignment of the timestamps. As of December 2023 it also has native support for using the nb-wav2vec-models. It currently uses PyAnnote-audio for doing the actual diarization. This package has a fairly strict licence where you have to agree to user terms. Follow the instructions below.\n\n\nYou can also run WhisperX from Python. Please take a look at the instructions on WhisperX homepage.",
"### API\n\n\nInstructions for accessing the models via a simple API are included in the demos under Spaces. Note that these demos are temporary and will only be available for a few weeks.\n\n\nTraining Data\n-------------\n\n\nThe training data originates from Språkbanken and the National Library of Norway's digital collection, including:\n\n\n* NST Norwegian ASR Database (16 kHz) and its corresponding dataset\n* Transcribed speeches from the Norwegian Parliament by Språkbanken\n* TV broadcast (NRK) subtitles (NLN digital collection)\n* Audiobooks (NLN digital collection)\n\n\nDownstream Use\n--------------\n\n\nThe models, especially the smaller ones, may exhibit occasional hallucinations and may drop parts of the transcript. They are designed to convert spoken language into grammatically correct written sentences, which might not always be word-for-word translations. We have made two extra model variant for users that want a different transcription style. We encourage users to try the models themselves to get a better understanding.\n\n\nBias, Risks, and Limitations\n----------------------------\n\n\nUsing these models without adequate risk assessment and mitigation could be considered irresponsible. They may contain biases or other undesirable distortions. Users who deploy these models or integrate them into systems or services are responsible for mitigating risks and complying with applicable AI regulations. The National Library of Norway, as the model owner, disclaims liability for any outcomes resulting from third-party use of these models.",
"### Software\n\n\nThe model was trained using Jax/Flax and converted to PyTorch, Tensorflow, URL, and ONXX formats. These are available under 'Files and versions'. We welcome requests for conversion to other formats. All training code and scripts are released under the Apache License 2.0 in the GitHub repository nb-whisper.\n\n\n& Contributors\nThe NB-Whisper Small model is a product of the NoSTram project led by Per Egil Kummervold (@pere) at the National Library of Norway. Key contributors include Javier de la Rosa (@versae), Freddy Wetjen (@freddyw), and Rolv-Arild Braaten (@Rolv-Arild). NB AI-Lab, under the direction of Svein Arne Brygfjeld (@Brygfjeld), supported the project's successful completion. A detailed paper on our process and findings is forthcoming.\n\n\nDisclaimer\n----------\n\n\nThe models published in this repository are intended for a generalist purpose and are available to third parties. These models may have bias and/or any other undesirable distortions. When third parties, deploy or provide systems and/or services to other parties using any of these models (or using systems based on these models) or become users of the models, they should note that it is their responsibility to mitigate the risks arising from their use and, in any event, to comply with applicable regulations, including regulations regarding the use of artificial intelligence. In no event shall the owner of the models (The National Library of Norway) be liable for any results arising from the use made by third parties of these models.\n\n\nAcknowledgements\n----------------\n\n\nOur gratitude extends to Google TPU Research Cloud for training resources, Google Cloud for translation credits, and HuggingFace's Sanchit Ghandi for technical support. A special thank you to Per Erik Solberg at Språkbanken for the collaboration on the Stortinget corpus.\n\n\nContact\n-------\n\n\nFor feedback, technical concerns, or collaboration inquiries, please contact [ailab@URL](mailto:ailab@URL). If you plan to include this model in your research, contact us for the latest information on our upcoming paper for citation purposes."
] | [
143,
198,
107,
95,
127,
160,
149,
215,
325,
497
] | [
"passage: TAGS\n#transformers #pytorch #jax #tensorboard #onnx #safetensors #whisper #automatic-speech-recognition #audio #asr #hf-asr-leaderboard #no #nb #nn #en #dataset-NbAiLab/ncc_speech #dataset-NbAiLab/NST #dataset-NbAiLab/NPSC #arxiv-2212.04356 #base_model-openai/whisper-small #license-apache-2.0 #endpoints_compatible #region-us \n### Verbatim Model\n\n\nWhile the main models are suitable for most transcription task, we demonstrate how easy it is to change the output of the main model. The following models are trained 250 additional steps from the main models above, and might be suitable for more targetted use cases:\n\n\n* Verbatim version: This lower-cased variant is more literal and suitable for tasks requiring detailed transcription, such as linguistic analysis.\n\n\nModel Size: Tiny, Parameters: 39M, Semantic version: Tiny - semantic\nModel Size: Base, Parameters: 74M, Semantic version: Base - semantic\nModel Size: Small, Parameters: 244M, Semantic version: Small - semantic\nModel Size: Medium, Parameters: 769M, Semantic version: Medium - semantic\nModel Size: Large, Parameters: 1550M, Semantic version: Large - semantic### Model Description\n\n\n* Developed by: NB AI-Lab\n* Shared by: NB AI-Lab\n* Model type: 'whisper'\n* Language(s) (NLP): Norwegian, Norwegian Bokmål, Norwegian Nynorsk, English\n* License: Apache 2.0\n* Trained from model: openai/whisper-small\n* Code Repository: URL\n* Paper: *Coming soon*\n* Demo: *See Spaces on this page*\n\n\nHow to Use the Models\n---------------------",
"passage: ### Online Demos\n\n\nYou can try the models directly through the HuggingFace Inference API, accessible on the right side of this page. Be aware that initially, the model needs to load and will run on limited CPU capacity, which might be slow. To enhance your experience, we are temporarily hosting some models on TPUs for a few days, significantly boosting their performance. Explore these under the Spaces section on the Main Page.### Local Setup with HuggingFace\n\n\nAlternatively, you can run the models locally. The Tiny, Base, and Small models are optimized for CPU execution. For the Medium and Large models, we recommend a system equipped with a GPU to ensure efficient processing. Setting up and using these models with HuggingFace's Transformers is straightforward, provided you have Python installed on your machine. For practical demonstrations, refer to examples using this sample mp3 file.\n\n\nAfter this is done, you should be able to run this in Python:\n\n\n\nExpected output#### Extended HuggingFace\n\n\nExamining the output above, we see that there are multiple repetitions at the end. This is because the video is longer than 30 seconds. By passing the argument, we can transcribe longer file. Our experience is that we get slightly better result by setting that to 28 seconds instead of the default 30 seconds. We also recommend setting the beam size to 5 if possible. This greatly increases the accuracy but takes a bit longer and requires slightly more memory. The examples below also illustrates how to transcribe to English or Nynorsk, and how to get timestamps for sentences and words.\n\n\n\nExpected output\nLong transcripts:\n\n\nTimestamps:\n\n\nWord Level Timestamps:\n\n\nNynorsk:\n\n\nEnglish:### Whisper CPP\n\n\nWhisper CPP is a C++ implementation of the Whisper model, offering the same functionalities with the added benefits of C++ efficiency and performance optimizations. This allows embedding any Whisper model into a binary file, facilitating the development of real applications. However, it requires some familiarity with compiling C++ programs. Their homepage provides examples of how to build applications, including real-time transcription.\n\n\nWe have converted this model to the ggml-format model used by Whisper CPP binaries. The file can be downloaded here, and a 'q5\\_0' quantized version is also available here.",
"passage: ### WhisperX and Speaker Diarization\n\n\nSpeaker diarization is a technique in natural language processing and automatic speech recognition that identifies and separates different speakers in an audio recording. It segments the audio into parts based on who is speaking, enhancing the quality of transcribing meetings or phone calls. We find that WhisperX is the easiest way to use our models for diarizing speech. In addition, WhisperX is using phoneme-based Wav2Vec-models for improving the alignment of the timestamps. As of December 2023 it also has native support for using the nb-wav2vec-models. It currently uses PyAnnote-audio for doing the actual diarization. This package has a fairly strict licence where you have to agree to user terms. Follow the instructions below.\n\n\nYou can also run WhisperX from Python. Please take a look at the instructions on WhisperX homepage.### API\n\n\nInstructions for accessing the models via a simple API are included in the demos under Spaces. Note that these demos are temporary and will only be available for a few weeks.\n\n\nTraining Data\n-------------\n\n\nThe training data originates from Språkbanken and the National Library of Norway's digital collection, including:\n\n\n* NST Norwegian ASR Database (16 kHz) and its corresponding dataset\n* Transcribed speeches from the Norwegian Parliament by Språkbanken\n* TV broadcast (NRK) subtitles (NLN digital collection)\n* Audiobooks (NLN digital collection)\n\n\nDownstream Use\n--------------\n\n\nThe models, especially the smaller ones, may exhibit occasional hallucinations and may drop parts of the transcript. They are designed to convert spoken language into grammatically correct written sentences, which might not always be word-for-word translations. We have made two extra model variant for users that want a different transcription style. We encourage users to try the models themselves to get a better understanding.\n\n\nBias, Risks, and Limitations\n----------------------------\n\n\nUsing these models without adequate risk assessment and mitigation could be considered irresponsible. They may contain biases or other undesirable distortions. Users who deploy these models or integrate them into systems or services are responsible for mitigating risks and complying with applicable AI regulations. The National Library of Norway, as the model owner, disclaims liability for any outcomes resulting from third-party use of these models."
] | [
-0.05313670262694359,
0.09707695245742798,
-0.003880731062963605,
0.0012831786880269647,
0.046301472932100296,
-0.029270900413393974,
0.025701439008116722,
0.05904564633965492,
-0.00023794670414645225,
0.06951453536748886,
-0.002706193597987294,
-0.07667413353919983,
0.08631157875061035,
0.021189367398619652,
0.09632975608110428,
-0.2983481287956238,
0.05454470217227936,
-0.07895288616418839,
0.010602572001516819,
0.03997239097952843,
0.0936204195022583,
-0.05643267557024956,
0.045930664986371994,
0.014130756258964539,
-0.005923555698245764,
0.014167976565659046,
-0.05366680026054382,
-0.03547634556889534,
0.07598380744457245,
0.09494748711585999,
0.0394386388361454,
-0.0011252068215981126,
0.08186641335487366,
-0.14478033781051636,
0.009939960204064846,
0.05257062986493111,
0.036123644560575485,
0.011760481633245945,
0.035008665174245834,
0.11113276332616806,
0.20000874996185303,
-0.08396207541227341,
0.010943327099084854,
0.05032143369317055,
-0.026787815615534782,
-0.138142392039299,
-0.0423724390566349,
-0.025564784184098244,
0.05061521753668785,
0.04267450049519539,
-0.03012804128229618,
0.08522967249155045,
-0.07865230739116669,
0.03649737313389778,
0.07793093472719193,
-0.11590909957885742,
-0.014122061431407928,
0.014020264148712158,
0.04893474280834198,
0.04814661666750908,
-0.026305215433239937,
0.009253446944057941,
0.01583971455693245,
0.04073401167988777,
0.015422825701534748,
-0.009919791482388973,
0.07131191343069077,
-0.06190745159983635,
-0.10806053876876831,
-0.04816674813628197,
0.12705682218074799,
0.016789792105555534,
-0.06055287644267082,
-0.17028534412384033,
-0.041089627891778946,
0.056730836629867554,
-0.022781148552894592,
-0.026135198771953583,
0.02914804220199585,
-0.0025629810988903046,
0.0968579649925232,
-0.07163555175065994,
-0.0979076698422432,
0.015982838347554207,
-0.007248008158057928,
0.1084735170006752,
0.04391639307141304,
0.0032915985211730003,
0.017141973599791527,
0.04907306656241417,
-0.06850013881921768,
-0.05525605008006096,
-0.0659816637635231,
-0.06542304903268814,
-0.05962321162223816,
0.013917632400989532,
-0.026013746857643127,
-0.09549623727798462,
-0.00031829081126488745,
0.0769890770316124,
-0.019275715574622154,
0.02246892638504505,
0.026539525017142296,
-0.002181742340326309,
0.052935678511857986,
0.13106216490268707,
-0.0235371645539999,
-0.07798952609300613,
0.0012427183100953698,
-0.006733561400324106,
0.06595762819051743,
0.01150660589337349,
-0.04909980297088623,
-0.03309032320976257,
-0.003275076625868678,
0.024991512298583984,
0.01667100004851818,
0.012090765871107578,
0.021332645788788795,
-0.02876955084502697,
0.24200840294361115,
-0.11678973585367203,
0.00032916790223680437,
0.014547653496265411,
-0.04745154082775116,
0.0810275599360466,
0.029732907190918922,
-0.02524886094033718,
-0.12939578294754028,
0.014258217997848988,
-0.0023223310708999634,
-0.02188037894666195,
-0.06600665301084518,
-0.10683397203683853,
0.04931458830833435,
0.019808337092399597,
-0.013135972432792187,
-0.114737868309021,
-0.09354698657989502,
-0.04552862048149109,
0.01894090510904789,
-0.011279534548521042,
-0.03845116123557091,
0.004129962995648384,
-0.04065261036157608,
-0.03251716122031212,
-0.030552411451935768,
0.0021749038714915514,
-0.028590066358447075,
-0.01985478773713112,
-0.024314217269420624,
0.035928357392549515,
-0.02141048200428486,
0.011384367011487484,
-0.06342495232820511,
-0.006687468383461237,
-0.1882234662771225,
0.1151350736618042,
-0.07525362819433212,
-0.006661958992481232,
-0.02711421065032482,
-0.07187265157699585,
-0.04935802146792412,
0.05072873458266258,
0.0013195810606703162,
0.07402697205543518,
-0.17393003404140472,
-0.028222842141985893,
0.1417880803346634,
-0.14306537806987762,
0.04109632596373558,
0.14123377203941345,
0.015080549754202366,
0.004092665389180183,
0.13263969123363495,
0.11340143531560898,
0.17020194232463837,
-0.12335120886564255,
-0.06412967294454575,
0.0018372070044279099,
-0.032944682985544205,
0.05900487303733826,
0.0407794751226902,
-0.0073601058684289455,
0.09095627069473267,
0.04521762207150459,
0.009426462464034557,
0.022795652970671654,
0.0386730395257473,
-0.021873140707612038,
-0.006592024117708206,
-0.024910712614655495,
-0.004523947834968567,
0.036533284932374954,
-0.04791133478283882,
-0.03944604843854904,
-0.09118659049272537,
0.07751038670539856,
0.10942360013723373,
-0.0418989397585392,
0.03315546736121178,
-0.07041851431131363,
-0.02286585606634617,
0.009751184843480587,
0.0019668142776936293,
-0.11190291494131088,
-0.04579504206776619,
0.04627826809883118,
-0.12929825484752655,
0.07301092892885208,
0.054888591170310974,
0.03660829737782478,
0.0719652846455574,
-0.016242740675807,
0.01578412391245365,
-0.030603118240833282,
0.0015486478805541992,
-0.021183989942073822,
-0.03719048574566841,
-0.027448566630482674,
-0.037148769944906235,
0.028130074962973595,
-0.10822416096925735,
0.0015185835072770715,
0.008852694183588028,
0.08869853615760803,
0.01412180531769991,
-0.024355031549930573,
0.016014395281672478,
0.014259892515838146,
0.0100739486515522,
-0.04136030003428459,
-0.00638921745121479,
-0.001843442558310926,
-0.0006580464541912079,
0.10725913196802139,
-0.14937596023082733,
-0.10943901538848877,
0.054853785783052444,
0.10564041137695312,
-0.013185436837375164,
-0.0027868885081261396,
-0.038119301199913025,
-0.03295012190937996,
-0.05370565131306648,
-0.1174565926194191,
0.2050842046737671,
0.015831485390663147,
0.060475338250398636,
-0.0889221653342247,
-0.02486799657344818,
0.007388660684227943,
-0.0008354944293387234,
-0.0063574109226465225,
0.08234357833862305,
0.010048775933682919,
-0.06351929903030396,
-0.014198740012943745,
-0.04787725210189819,
0.02790415845811367,
0.16964773833751678,
-0.021508963778614998,
-0.10722240805625916,
0.003961676266044378,
-0.013383145444095135,
-0.012975757010281086,
0.09238606691360474,
0.008721154183149338,
-0.00505990581586957,
0.031140172854065895,
0.026048585772514343,
0.054733723402023315,
-0.054619211703538895,
0.0842786431312561,
0.02590578980743885,
-0.04710424318909645,
0.03870037570595741,
-0.02515474520623684,
-0.006976984441280365,
0.0405050627887249,
0.005498888436704874,
0.022333582863211632,
-0.044265683740377426,
-0.043198417872190475,
-0.08687541633844376,
0.10283917188644409,
-0.10409072786569595,
-0.21965236961841583,
-0.1659354865550995,
0.07346221804618835,
-0.023266231641173363,
-0.014174141921103,
0.035258736461400986,
-0.05420442298054695,
-0.10379225760698318,
-0.1337755173444748,
0.0421854667365551,
0.01956932246685028,
-0.07287488132715225,
-0.05041934922337532,
0.028165118768811226,
0.009453055448830128,
-0.12478236109018326,
0.0027940552681684494,
0.005860915873199701,
0.012749065645039082,
-0.01719336397945881,
0.015236203558743,
0.031876951456069946,
0.07406831532716751,
0.009132600389420986,
-0.0558936782181263,
0.0101427948102355,
0.16270941495895386,
-0.07086695730686188,
0.15033970773220062,
0.1562698483467102,
0.005631578620523214,
0.05730154737830162,
0.09364261478185654,
0.014711350202560425,
-0.024254417046904564,
0.024636538699269295,
0.01629345864057541,
-0.06223832443356514,
-0.14600996673107147,
-0.12706467509269714,
-0.0425279401242733,
-0.004692379385232925,
0.06527990102767944,
0.032293666154146194,
-0.049374427646398544,
0.017062177881598473,
-0.07859542220830917,
-0.010797430761158466,
0.054243315011262894,
0.051642727106809616,
0.13773994147777557,
0.0020953600760549307,
0.026424771174788475,
-0.07799598574638367,
-0.017226500436663628,
0.10078561305999756,
-0.010684768669307232,
0.15625354647636414,
-0.05970749631524086,
0.11622975021600723,
0.028910383582115173,
-0.010988420806825161,
0.058008670806884766,
0.05566098168492317,
0.0005455004866234958,
0.020927725359797478,
-0.0232278760522604,
-0.07865266501903534,
-0.04497179388999939,
0.08335456997156143,
0.051190122961997986,
-0.07843119651079178,
-0.0012857100227847695,
0.0037725695874542,
0.002151771215721965,
0.13667893409729004,
0.030793333426117897,
-0.11376843601465225,
-0.1319599747657776,
0.017322754487395287,
-0.10490995645523071,
-0.07242858409881592,
0.022758839651942253,
0.15094606578350067,
-0.08784168213605881,
0.02962145395576954,
0.0008819873328320682,
0.07446951419115067,
-0.07377202808856964,
0.009348523803055286,
-0.03418629243969917,
0.13505835831165314,
0.012755165807902813,
0.05442996695637703,
-0.03722630441188812,
0.04009241238236427,
0.009042722173035145,
0.11728579550981522,
-0.06465107202529907,
0.044438671320676804,
0.028123924508690834,
0.012225285172462463,
0.05873239040374756,
0.04132826253771782,
-0.15392108261585236,
0.02159232646226883,
-0.11197928339242935,
0.05467735230922699,
0.04135614261031151,
0.05001397803425789,
0.0809231624007225,
-0.008953464217483997,
-0.0015275055775418878,
-0.02577725611627102,
-0.10593116283416748,
-0.12030007690191269,
-0.16340433061122894,
0.02344527281820774,
0.0004366636276245117,
-0.00523392716422677,
-0.05475655198097229,
-0.015985265374183655,
-0.08414293080568314,
0.11709720641374588,
-0.08076038211584091,
-0.11992288380861282,
-0.07288709282875061,
-0.05552985891699791,
0.1508675068616867,
-0.044193852692842484,
0.009508131071925163,
0.03500058129429817,
0.15747155249118805,
-0.051235269755125046,
-0.061332523822784424,
-0.015845758840441704,
-0.08370790630578995,
-0.10546223074197769,
0.00861471425741911,
0.11654555797576904,
0.10216354578733444,
0.0555858351290226,
0.012048144824802876,
0.0016352894017472863,
-0.001895089983008802,
-0.09959890693426132,
-0.054681312292814255,
0.17476268112659454,
-0.010313902050256729,
0.03309483081102371,
-0.0507279634475708,
-0.0676894262433052,
-0.052058830857276917,
-0.010619651526212692,
0.04928664490580559,
0.14504148066043854,
-0.0515596829354763,
0.13446246087551117,
0.192693829536438,
-0.0651121512055397,
-0.21338246762752533,
-0.052382100373506546,
0.05056386813521385,
0.05381326749920845,
0.01359261479228735,
-0.17020352184772491,
0.10854663699865341,
0.03819425776600838,
0.00042788463179022074,
0.03690018132328987,
-0.21739418804645538,
-0.10163155943155289,
0.04473710060119629,
-0.0096145523712039,
-0.02072981745004654,
-0.019876651465892792,
-0.02849278412759304,
-0.03807775676250458,
-0.03133409097790718,
0.06722855567932129,
-0.04959271475672722,
0.054221466183662415,
0.03577839955687523,
0.07713714241981506,
0.0474044494330883,
-0.023890845477581024,
0.11339205503463745,
-0.032442424446344376,
0.004955462645739317,
-0.06187007948756218,
0.09500572830438614,
0.01632487215101719,
-0.06547483056783676,
0.14617858827114105,
-0.023286888375878334,
0.012256506830453873,
-0.10549134016036987,
-0.06298240274190903,
-0.07537183910608292,
0.0745459794998169,
-0.017882883548736572,
-0.040177371352910995,
-0.08476237207651138,
0.07771856337785721,
0.09474539756774902,
0.005961759015917778,
-0.07780201733112335,
-0.07052728533744812,
-0.07399367541074753,
0.12443294376134872,
0.17438654601573944,
-0.050500378012657166,
-0.05015835538506508,
0.012608158402144909,
-0.00040455162525177,
0.05465744435787201,
-0.04949146509170532,
0.016545670107007027,
0.09414318948984146,
-0.019306370988488197,
0.03281625360250473,
-0.03229619190096855,
-0.1326659470796585,
-0.01768515817821026,
0.022716455161571503,
-0.039342429488897324,
-0.16728843748569489,
-0.031687237322330475,
0.03801671043038368,
-0.05294129252433777,
-0.044117242097854614,
0.13111990690231323,
-0.08157230168581009,
-0.0012005154276266694,
0.013236693106591702,
0.05600733682513237,
0.02499464713037014,
0.09346991032361984,
0.026659101247787476,
0.023409834131598473,
-0.07067522406578064,
0.09815231710672379,
0.03496327996253967,
-0.12817589938640594,
0.047648947685956955,
0.1502082347869873,
-0.09159968048334122,
-0.05008998513221741,
-0.11932400614023209,
-0.032365161925554276,
-0.01244290079921484,
-0.10346478223800659,
0.007711347192525864,
-0.07365702092647552,
0.014432557858526707,
0.0091245761141181,
0.008175571449100971,
-0.02214003913104534,
-0.01989944465458393,
0.03948504105210304,
-0.10130547732114792,
0.08423126488924026,
0.008853754960000515,
0.025744909420609474,
-0.04087499529123306,
0.09207773208618164,
-0.002216203138232231,
0.012180165387690067,
-0.024727383628487587,
-0.01848761737346649,
-0.009370808489620686,
-0.03160339593887329,
-0.1387440711259842,
0.014791886322200298,
-0.08791568130254745,
0.007802583277225494,
-0.00317222042940557,
0.03427378088235855,
-0.01833273470401764,
0.0523056797683239,
-0.0354597233235836,
-0.020654622465372086,
-0.05594886466860771,
0.03145807236433029,
-0.05038841441273689,
0.00910224299877882,
0.046396151185035706,
-0.07415524870157242,
0.0487644225358963,
0.004426298197358847,
-0.05110938847064972,
0.048794638365507126,
-0.013280510902404785,
0.005640966352075338,
0.018198734149336815,
0.06447144597768784,
0.004770229104906321,
-0.05151667073369026,
-0.004744712263345718,
0.029938340187072754,
-0.00018579575407784432,
-0.03939804434776306,
0.03659697622060776,
-0.0551922507584095,
0.07752563059329987,
0.02746741659939289,
-0.015197635628283024,
-0.05802598595619202,
0.02963772974908352,
0.02814788557589054,
0.01871461607515812,
0.08735248446464539,
-0.061210278421640396,
0.015165169723331928,
-0.09669142961502075,
-0.004097163677215576,
0.006682078819721937,
0.0063231936655938625,
0.08226530998945236,
-0.016139281913638115,
0.02895633690059185,
0.002758507849648595,
0.1819487363100052,
-0.018903326243162155,
0.023826882243156433,
0.053360819816589355,
-0.09414822608232498,
-0.10512331128120422,
0.01768769510090351,
0.06145940348505974,
0.013478961773216724,
-0.008507783524692059,
-0.04033714160323143,
-0.02806883491575718,
-0.01580694690346718,
-0.0008854120969772339,
0.08684146404266357,
0.10731986910104752,
0.09825684875249863,
0.0889103040099144,
0.01407468318939209,
-0.04119017347693443,
-0.10962633043527603,
0.07789019495248795,
-0.031503383070230484,
0.07799782603979111,
-0.04238221049308777,
0.08439106494188309,
0.12369954586029053,
-0.06412716209888458,
0.0944921001791954,
-0.00461883470416069,
-0.04585379362106323,
-0.0711073949933052,
-0.13473056256771088,
-0.04483099654316902,
-0.038716237992048264,
-0.026582792401313782,
-0.09069667011499405,
0.0348714143037796,
0.01204677950590849,
0.019142286852002144,
-0.026215462014079094,
0.11082395911216736,
-0.06747175008058548,
-0.09304589778184891,
0.05668144300580025,
-0.02135493792593479,
0.030029000714421272,
0.11332351714372635,
-0.001903074444271624,
0.0538879819214344,
0.0626368597149849,
0.061253368854522705,
0.05991177633404732,
0.017391294240951538,
-0.006223815958946943,
0.0012516319984570146,
-0.011559360660612583,
-0.009327809326350689,
-0.002457044320181012,
-0.005853129085153341,
0.10366081446409225,
0.07637200504541397,
-0.08245780318975449,
-0.005937032867223024,
0.10046041011810303,
-0.04660309478640556,
-0.15016955137252808,
-0.13030411303043365,
0.12804454565048218,
-0.0013114610919728875,
0.02104349248111248,
-0.00456608459353447,
-0.07677078992128372,
-0.018956074491143227,
0.12963031232357025,
0.1300942301750183,
0.03230154141783714,
-0.007181910332292318,
-0.03572767227888107,
-0.011372965760529041,
-0.06597274541854858,
0.11057919263839722,
0.005265643820166588,
0.2758779525756836,
-0.005390087608247995,
0.05360968038439751,
-0.02204703725874424,
-0.0230290275067091,
-0.11652066558599472,
0.07801657915115356,
-0.05089867115020752,
-0.0024473480880260468,
-0.02735804207623005,
0.07720375806093216,
-0.04201756790280342,
-0.24628931283950806,
-0.023046985268592834,
0.0037102538626641035,
-0.0638623759150505,
0.028440719470381737,
-0.012544691562652588,
0.021744966506958008,
0.06285843998193741,
-0.005074951332062483,
0.004013536497950554,
0.14971476793289185,
-0.015112691558897495,
-0.07112941890954971,
0.03717851638793945,
0.04316838085651398,
-0.04310224950313568,
0.16356422007083893,
0.034751567989587784,
0.05759380757808685,
0.057338353246450424,
-0.01089188177138567,
-0.1154375895857811,
0.060483336448669434,
0.02838708646595478,
-0.11712319403886795,
0.026896247640252113,
0.14681857824325562,
-0.011345725506544113,
0.07475735992193222,
0.047464653849601746,
-0.038985710591077805,
0.0022469956893473864,
0.08736366778612137,
0.006847434211522341,
-0.051591675728559494,
0.05323134362697601,
-0.09291020035743713,
0.1258869767189026,
0.10196837037801743,
0.012543381191790104,
-0.02674541063606739,
-0.04791439697146416,
0.006477579474449158,
-0.01092036347836256,
0.09372072666883469,
-0.003220735350623727,
-0.11508854478597641,
-0.027409130707383156,
-0.005490185227245092,
0.0566849410533905,
-0.1619563102722168,
-0.041261639446020126,
0.004434375092387199,
-0.021752452477812767,
0.028687670826911926,
0.09841183573007584,
0.027271397411823273,
0.007381968665868044,
-0.022927766665816307,
-0.024275457486510277,
0.005169641226530075,
0.07391040027141571,
-0.1190115287899971,
-0.05995332822203636
] |
null | null | transformers |
# NB-Whisper Base
Introducing the **_Norwegian NB-Whisper Base model_**, proudly developed by the National Library of Norway. NB-Whisper is a cutting-edge series of models designed for automatic speech recognition (ASR) and speech translation. These models are based on the work of [OpenAI's Whisper](https://arxiv.org/abs/2212.04356). Each model in the series has been trained for 250,000 steps, utilizing a diverse dataset of 8 million samples. These samples consist of aligned audio clips, each 30 seconds long, culminating in a staggering 66,000 hours of speech. For an in-depth understanding of our training methodology and dataset composition, keep an eye out for our upcoming article.
| Model Size | Parameters | Model |
|------------|------------|------------|
| Tiny | 39M | [NB-Whisper Tiny](https://huggingface.co/NbAiLab/nb-whisper-tiny) |
| Base | 74M | [NB-Whisper Base](https://huggingface.co/NbAiLab/nb-whisper-base) |
| Small | 244M | [NB-Whisper Small](https://huggingface.co/NbAiLab/nb-whisper-small) |
| Medium | 769M | [NB-Whisper Medium](https://huggingface.co/NbAiLab/nb-whisper-medium) |
| Large | 1550M | [NB-Whisper Large](https://huggingface.co/NbAiLab/nb-whisper-large) |
### Verbatim Model
While the main models are suitable for most transcription tasks, we demonstrate how easy it is to change the output of the main model. The following models are trained for 250 additional steps from the main models above and might be suitable for more targeted use cases:
- **Verbatim version**: This lower-cased variant is more literal and suitable for tasks requiring detailed transcription, such as linguistic analysis.
| Model Size | Parameters | Semantic version |
|------------|------------|------------------|
| Tiny | 39M | [Tiny - semantic](https://huggingface.co/NbAiLab/nb-whisper-tiny-semantic) |
| Base | 74M | [Base - semantic](https://huggingface.co/NbAiLab/nb-whisper-base-semantic) |
| Small | 244M | [Small - semantic](https://huggingface.co/NbAiLab/nb-whisper-small-semantic) |
| Medium | 769M | [Medium - semantic](https://huggingface.co/NbAiLab/nb-whisper-medium-semantic) |
| Large | 1550M | [Large - semantic](https://huggingface.co/NbAiLab/nb-whisper-large-semantic) |
### Model Description
- **Developed by:** [NB AI-Lab](https://ai.nb.no/)
- **Shared by:** [NB AI-Lab](https://ai.nb.no/)
- **Model type:** `whisper`
- **Language(s) (NLP):** Norwegian, Norwegian Bokmål, Norwegian Nynorsk, English
- **License:** [Apache 2.0](https://www.apache.org/licenses/LICENSE-2.0)
- **Trained from model:** [openai/whisper-base](https://huggingface.co/openai/whisper-base)
- **Code Repository:** https://github.com/NbAiLab/nb-whisper/
- **Paper:** _Coming soon_
- **Demo:** _See Spaces on this page_
## How to Use the Models
### Online Demos
You can try the models directly through the HuggingFace Inference API, accessible on the right side of this page. Be aware that initially, the model needs to load and will run on limited CPU capacity, which might be slow. To enhance your experience, we are temporarily hosting some models on TPUs for a few days, significantly boosting their performance. Explore these under the **Spaces** section on the [Main Page](https://huggingface.co/NbAiLab/).
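If you would rather script against the hosted endpoint than use the web widget, the snippet below is a minimal sketch of calling the Inference API with `requests`. It assumes the standard Hub endpoint pattern for this model id, and `hf_xxx` is a placeholder you must replace with your own access token.

```python
# Minimal sketch: send an audio file to the hosted Inference API.
# Assumes the standard endpoint pattern; "hf_xxx" is a placeholder token.
import requests

API_URL = "https://api-inference.huggingface.co/models/NbAiLab/nb-whisper-base"
headers = {"Authorization": "Bearer hf_xxx"}

with open("king.mp3", "rb") as f:
    audio_bytes = f.read()

response = requests.post(API_URL, headers=headers, data=audio_bytes)
print(response.json())  # expected shape: {"text": "..."}
```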
### Local Setup with HuggingFace
Alternatively, you can run the models locally. The Tiny, Base, and Small models are optimized for CPU execution. For the Medium and Large models, we recommend a system equipped with a GPU to ensure efficient processing. Setting up and using these models with HuggingFace's Transformers is straightforward, provided you have [Python](https://www.python.org/downloads/) installed on your machine. For practical demonstrations, refer to examples using this [sample mp3 file](https://github.com/NbAiLab/nb-whisper/raw/main/audio/king.mp3).
```bash
# Download the sample file
$ wget -N https://github.com/NbAiLab/nb-whisper/raw/main/audio/king.mp3
# Install necessary libraries.
$ pip install "transformers>=4.35.2"
```
After this is done, you should be able to run this in Python:
```python
from transformers import pipeline
# Load the model
asr = pipeline("automatic-speech-recognition", "NbAiLab/nb-whisper-base")
# Transcribe
asr("king.mp3", generate_kwargs={'task': 'transcribe', 'language': 'no'})
```
<details>
<summary>Expected output</summary>
```json
{
{'text': ' Nordmenn er nordlendinger, trøndere, sørlendinger og folk fra alle andre regioner. Nordmenn er også innvandret fra Afghanistan, Pakistan, Polen, Sverige, Somalia og Syria. Det er ikke alltid så lett å si hvor vi er fra, hvilken nasjonalitet vi er fra. Hvilken nasjonalitet vi er fra. Hvilken nasjonalitet vi er fra. Hvilken nasjonalitet vi er fra. Hvilken nasjonalitet vi er fra. Hvilken nasjonalitet vi er fra. Hvilken nasjonalitet vi er fra.'}
}
```
</details>
#### Extended HuggingFace
Examining the output above, we see that there are multiple repetitions at the end. This is because the audio is longer than 30 seconds. By passing the ```chunk_length_s``` argument, we can transcribe longer files. Our experience is that we get slightly better results by setting that to 28 seconds instead of the default 30 seconds. We also recommend setting the beam size to 5 if possible. This greatly increases the accuracy but takes a bit longer and requires slightly more memory. The examples below also illustrate how to transcribe to English or Nynorsk, and how to get timestamps for sentences and words.
```python
# Long Transcripts
asr("king.mp3", chunk_length_s=28, generate_kwargs={'task': 'transcribe', 'language': 'no'})
# Increase accuracy by setting beam size to 5
asr("king.mp3", chunk_length_s=28, return_timestamps=True, generate_kwargs={'num_beams': 5, 'task': 'transcribe', 'language': 'no'})
# Return Timestamps
asr("king.mp3", chunk_length_s=28, return_timestamps=True, generate_kwargs={'task': 'transcribe', 'language': 'no'})
# Return Word Level Timestamps
asr("king.mp3", chunk_length_s=28, return_timestamps="word", generate_kwargs={'task': 'transcribe', 'language': 'no'})
# Transcribe to Nynorsk
asr("king.mp3", chunk_length_s=28, generate_kwargs={'task': 'transcribe', 'language': 'nn'})
# Transcribe to English
asr("king.mp3", chunk_length_s=28, generate_kwargs={'task': 'transcribe', 'language': 'en'})
```
<details>
<summary>Expected output</summary>
Long transcripts:
```json
{
{'text': ' Nordmenn er nordlendinger, trøndere, sørlendinger og folk fra alle andre regioner. Nordmenn er også innvandret fra Afghanistan, Pakistan, Polen, Sverige, Somalia og Syria. Det er ikke alltid så lett å si hvor vi er fra, hvilken nasjonalitet vi er fra. Hvilken nasjonalitet vi er fra. Hvilken nasjonalitet vi er fra. Hvilken nasjonalitet vi er fra. Hvilken nasjonalitet vi er fra. Hvilken nasjonalitet vi er fra, hvilken nasjonalitet vi tilhører. Det vi kaller hjem, er der hjertet vårt er, og det kan ikke alltid plasseres innenfor landegrenser. Nordmenn er jenter som er glad i jenter, gutter som er glad i gutter, og jenter og gutter som er glad i hverandre. Nordmenn trommer på Gud, Allah, Altet og ingenting. Nordmenn liker Grieg, Kygo, Helbilis og Kari Bremnes. Med andre ord, Norge er dere. Norge er oss. Mitt største håp for Norge er at vi skal klare å ta vare på hverandre, at vi skal bygge dette landet videre på tillit, fellesskap og raushet.'}
}
```
Timestamps:
```json
{
{'text': ' Nordmenn er nordlendinger, trøndere, sørlendinger og folk fra alle andre regioner. Nordmenn er også innvandret fra Afghanistan, Pakistan, Polen, Sverige, Somalia og Syria. Det er ikke alltid så lett å si hvor vi er fra, hvilken nasjonalitet vi er fra. Hvilken nasjonalitet vi er fra. hvilken nasjonalitet vi tilhører. Det vi kaller hjem, er der hjertet vårt er, og det kan ikke alltid plasseres innenfor landegrenser. Nordmenn er jenter som er glad i jenter, gutter som er glad i gutter, og jenter og gutter som er glad i hverandre. Nordmenn trommer på Gud, Allah, Altet og ingenting. Nordmenn liker Grieg, Kygo, Helbiles og Kari Bremnes. Med andre ord, Norge er dere. Norge er oss. Mitt største håp for Norge er at vi skal klare å ta vare på hverandre, at vi skal bygge dette landet videre på tillit, fellesskap og raushet.',
'chunks': [{'timestamp': (0.0, 5.46),
'text': ' Nordmenn er nordlendinger, trøndere, sørlendinger'},
{'timestamp': (5.52, 8.68), 'text': ' og folk fra alle andre regioner.'},
{'timestamp': (8.68, 16.64),
'text': ' Nordmenn er også innvandret fra Afghanistan, Pakistan, Polen, Sverige, Somalia og Syria.'},
{'timestamp': (16.64, 13.3),
'text': ' Det er ikke alltid så lett å si hvor vi er fra, hvilken nasjonalitet vi er fra.'},
{'timestamp': (13.32, 30.28),
'text': ' Hvilken nasjonalitet vi er fra. hvilken nasjonalitet vi tilhører.'},
{'timestamp': (32.52, 39.16),
'text': ' Det vi kaller hjem, er der hjertet vårt er, og det kan ikke alltid plasseres'},
{'timestamp': (39.16, 42.0), 'text': ' innenfor landegrenser.'},
{'timestamp': (42.0, 46.74),
'text': ' Nordmenn er jenter som er glad i jenter, gutter som er glad i gutter,'},
{'timestamp': (46.74, 51.12),
'text': ' og jenter og gutter som er glad i hverandre.'},
{'timestamp': (51.16, 57.42),
'text': ' Nordmenn trommer på Gud, Allah, Altet og ingenting.'},
{'timestamp': (57.42, 64.3),
'text': ' Nordmenn liker Grieg, Kygo, Helbiles og Kari Bremnes.'},
{'timestamp': (64.34, 71.24),
'text': ' Med andre ord, Norge er dere. Norge er oss.'},
{'timestamp': (71.24, 78.04),
'text': ' Mitt største håp for Norge er at vi skal klare å ta vare på hverandre,'},
{'timestamp': (78.12, 84.68),
'text': ' at vi skal bygge dette landet videre på tillit, fellesskap og raushet.'}]}
}
```
Word Level Timestamps:
```json
{
{"text": "Nordmenn er nordlendinger, trøndere, sørlendinger og folk fra alle andre regioner. Nordmenn er også innvandret fra Afghanistan, Pakistan, Polen, Sverige, Somalia og Syria. Det er ikke alltid så lett å si hvor vi er fra, hvilken nasjonalitet vi tilhører. Det vi kaller hjem, er der hjertet vårt er, og det kan ikke alltid plasseres innenfor landegrenser. Nordmenn er jenter som er glad i jenter, gutter som er glad i gutter, og jenter og gutter som er glad i hverandre. Nordmenn trommer på Gud, Allah, Altet og ingenting. Nordmenn liker Grieg, Kygo, Helbilis og Kari Bremnes. Med andre ord, Norge er dere. Norge er oss. Mitt største håp for Norge er at vi skal klare å ta vare på hverandre, at vi skal bygge dette landet videre på tillit, fellesskap og raushet.",
"chunks": [
{"text": "Nordmenn", "timestamp": [0.72, 1.42]},
{"text": "er", "timestamp": [1.42, 1.74]},
// ... more chunks ...
{"text": "raushet.", "timestamp": [83.1, 84.88]}
]
}
}
```
Nynorsk:
```json
{
{"text": "Nordmenn er nordlendingar, trøndarar, sørlendingar og folk frå alle andre regionar. Nordmenn er også innvandra frå Afghanistan, Pakistan, Polen, Sverige, Somalia og Syria. Det er ikkje alltid så lett å seie kvar vi er frå, kva nasjonalitet vi tilhøyrer. Det vi kallar heim, er der hjartet vårt er, og det kan ikkje alltid plasserast innanfor landegrenser. Nordmenn er jenter som er glad i jenter, gutar som erade i gutar, og jenter og gutar som er glade i kvarandre. Nordmenn trommar på Gud, Allah, Altet og ingenting. Nordmenn liker Grieg, Kygo, Helbiles og Kari Bremnes. Med andre ord, Noreg er dere! Noreg er oss. Mitt største håp for Noreg er at vi skal klare å ta vare på kvarandre, at vi skal byggje dette landet vidare på tillit, fellesskap og raushet."}
}
```
English:
```json
{
{"text": "Norwegians are Norwegians, trønders, southerners and people from all other regions. Norwegians are also invaded from Afghanistan, Pakistan, Poland, Sweden, Somalia and Suria. It is not always so easy to say where we are from, what nationality we belong to. What we call home is where our heart is, and it cannot always be placed within national borders. Norwegians are girls who like girls, boys who like boys, and girls and boys who like each other. Norwegians thrump on God, Allah, Altet and nothing. Norwegians like Grieg, Kygo, Helbilis and Kari Bremnes. In other words, Norway is you. Norway is us. My biggest hope for Norway is that we should be able to take care of each other, that we should build this country on trust, community and generosity."}
}
```
</details>
### Whisper CPP
Whisper CPP is a C++ implementation of the Whisper model, offering the same functionalities with the added benefits of C++ efficiency and performance optimizations. This allows embedding any Whisper model into a binary file, facilitating the development of real applications. However, it requires some familiarity with compiling C++ programs. Their [homepage](https://github.com/ggerganov/whisper.cpp) provides examples of how to build applications, including real-time transcription.
We have converted this model to the ggml-format model used by Whisper CPP binaries. The file can be downloaded [here](blob/main/ggml-model.bin), and a `q5_0` quantized version is also available [here](blob/main/ggml-model-q5_0.bin).
```bash
# We can download and compile whisper.cpp
$ git clone --depth 1 https://github.com/ggerganov/whisper.cpp --branch v1.5.1
$ cd whisper.cpp/
$ make
# We also need to convert the audio to WAV as that is the only format supported by whisper.cpp
$ wget -N https://github.com/NbAiLab/nb-whisper/raw/main/audio/king.mp3
$ ffmpeg -i king.mp3 -ar 16000 -ac 1 -c:a pcm_s16le king.wav
# Let's download the two ggml files from this site
$ wget -N https://huggingface.co/NbAiLab/nb-whisper-base/resolve/main/ggml-model.bin -O models/nb-base-ggml-model.bin
$ wget -N https://huggingface.co/NbAiLab/nb-whisper-base/resolve/main/ggml-model-q5_0.bin -O models/nb-base-ggml-model-q5_0.bin
# And run it with the f16 default model
$ ./main -l no -m models/nb-base-ggml-model.bin king.wav
# Or the quantized version
$ ./main -l no -m models/nb-base-ggml-model-q5_0.bin king.wav
```
### WhisperX and Speaker Diarization
Speaker diarization is a technique in natural language processing and automatic speech recognition that identifies and separates different speakers in an audio recording. It segments the audio into parts based on who is speaking, enhancing the quality of transcribing meetings or phone calls. We find that [WhisperX](https://github.com/m-bain/whisperX) is the easiest way to use our models for diarizing speech. In addition, WhisperX uses phoneme-based Wav2Vec models to improve the alignment of the timestamps. As of December 2023, it also has native support for the nb-wav2vec models. It currently uses [PyAnnote-audio](https://github.com/pyannote/pyannote-audio) for the actual diarization. This package has a fairly strict license that requires you to agree to its user terms. Follow the instructions below.
```bash
# Follow the install instructions on https://github.com/m-bain/whisperX
# Make sure you have a HuggingFace account and have agreed to the pyannote terms
# Log in (or supply HF Token in command line)
huggingface-cli login
# Download a test file
wget -N https://github.com/NbAiLab/nb-whisper/raw/main/audio/knuthamsun.mp3
# Optional. If you get complaints about missing support for Norwegian, do:
pip uninstall whisperx && pip install git+https://github.com/m-bain/whisperx.git@8540ff5985fceee764acbed94f656063d7f56540
# Transcribe the test file. All transcripts will end up in the directory of the mp3-file
whisperx knuthamsun.mp3 --model NbAiLab/nb-whisper-base --language no --diarize
```
You can also run WhisperX from Python. Please take a look at the instructions on [WhisperX homepage](https://github.com/m-bain/whisperX).
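As a rough illustration, the flow below mirrors the example in the WhisperX README: transcribe, align with a Wav2Vec model, then diarize. Function names and arguments may differ between WhisperX versions, and `hf_xxx` is a placeholder token, so treat this as a sketch rather than a tested recipe.

```python
# Sketch of the WhisperX Python flow, following the upstream README.
import whisperx

device = "cuda"  # WhisperX is considerably faster on a GPU
audio = whisperx.load_audio("knuthamsun.mp3")

# 1. Transcribe with the NB-Whisper model
model = whisperx.load_model("NbAiLab/nb-whisper-base", device)
result = model.transcribe(audio, batch_size=16)

# 2. Align timestamps with a phoneme-based Wav2Vec model
align_model, metadata = whisperx.load_align_model(language_code="no", device=device)
result = whisperx.align(result["segments"], align_model, metadata, audio, device)

# 3. Diarize (requires accepting the pyannote terms; "hf_xxx" is a placeholder)
diarize_model = whisperx.DiarizationPipeline(use_auth_token="hf_xxx", device=device)
diarize_segments = diarize_model(audio)
result = whisperx.assign_word_speakers(diarize_segments, result)
print(result["segments"])
```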
### API
Instructions for accessing the models via a simple API are included in the demos under Spaces. Note that these demos are temporary and will only be available for a few weeks.
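For Spaces backed by Gradio, the [gradio_client](https://pypi.org/project/gradio-client/) package offers a simple programmatic interface. The sketch below uses a hypothetical Space name and endpoint, since the demos are temporary; check the actual Space page for the real identifiers.

```python
# Hypothetical sketch: call a Gradio-based demo Space programmatically.
from gradio_client import Client

client = Client("NbAiLab/nb-whisper-demo")  # hypothetical Space name
result = client.predict("king.mp3", api_name="/predict")  # endpoint name is an assumption
print(result)
```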
## Training Data
The training data originates from Språkbanken and the National Library of Norway's digital collection, including:
- NST Norwegian ASR Database (16 kHz) and its corresponding dataset
- Transcribed speeches from the Norwegian Parliament by Språkbanken
- TV broadcast (NRK) subtitles (NLN digital collection)
- Audiobooks (NLN digital collection)
## Downstream Use
The models, especially the smaller ones, may exhibit occasional hallucinations and may drop parts of the transcript. They are designed to convert spoken language into grammatically correct written sentences, which might not always be word-for-word translations. We have made two extra model variants for users who want a different transcription style. We encourage users to try the models themselves to get a better understanding.
## Bias, Risks, and Limitations
Using these models without adequate risk assessment and mitigation could be considered irresponsible. They may contain biases or other undesirable distortions. Users who deploy these models or integrate them into systems or services are responsible for mitigating risks and complying with applicable AI regulations. The National Library of Norway, as the model owner, disclaims liability for any outcomes resulting from third-party use of these models.
### Software
The model was trained using Jax/Flax and converted to PyTorch, TensorFlow, whisper.cpp, and ONNX formats. These are available under `Files and versions`. We welcome requests for conversion to other formats. All training code and scripts are released under the Apache License 2.0 in the GitHub repository [nb-whisper](https://github.com/NbAiLab/nb-whisper/).
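As an illustration, the PyTorch and TensorFlow checkpoints can be loaded with the regular Transformers classes; this is a sketch of the standard loading pattern, not an NB-Whisper-specific API.

```python
# Load the converted checkpoints with the framework of your choice.
from transformers import (
    WhisperProcessor,
    WhisperForConditionalGeneration,     # PyTorch
    TFWhisperForConditionalGeneration,   # TensorFlow
)

model_id = "NbAiLab/nb-whisper-base"
processor = WhisperProcessor.from_pretrained(model_id)
pt_model = WhisperForConditionalGeneration.from_pretrained(model_id)
tf_model = TFWhisperForConditionalGeneration.from_pretrained(model_id)
```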
## Citation & Contributors
The NB-Whisper Base model is a product of the NoSTram project led by Per Egil Kummervold ([@pere](https://huggingface.co/pere)) at the National Library of Norway. Key contributors include Javier de la Rosa ([@versae](https://huggingface.co/versae)), Freddy Wetjen ([@freddyw](https://huggingface.co/freddyw)), and Rolv-Arild Braaten ([@Rolv-Arild](https://huggingface.co/Rolv-Arild)). NB AI-Lab, under the direction of Svein Arne Brygfjeld ([@Brygfjeld](https://huggingface.co/Brygfjeld)), supported the project's successful completion. A detailed paper on our process and findings is forthcoming.
## Disclaimer
The models published in this repository are intended for a generalist purpose and are available to third parties. These models may have bias and/or any other undesirable distortions. When third parties deploy or provide systems and/or services to other parties using any of these models (or using systems based on these models), or become users of the models, they should note that it is their responsibility to mitigate the risks arising from their use and, in any event, to comply with applicable regulations, including regulations regarding the use of artificial intelligence. In no event shall the owner of the models (the National Library of Norway) be liable for any results arising from the use made by third parties of these models.
## Acknowledgements
Our gratitude extends to [Google TPU Research Cloud](https://sites.research.google/trc/about/) for training resources, Google Cloud for translation credits, and HuggingFace's Sanchit Gandhi for technical support. A special thank you to Per Erik Solberg at Språkbanken for the collaboration on the Stortinget corpus.
## Contact
For feedback, technical concerns, or collaboration inquiries, please contact [email protected]. If you plan to include this model in your research, contact us for the latest information on our upcoming paper for citation purposes.
| {"language": ["no", "nb", "nn", "en"], "license": "apache-2.0", "library_name": "transformers", "tags": ["audio", "asr", "automatic-speech-recognition", "hf-asr-leaderboard"], "datasets": ["NbAiLab/ncc_speech", "NbAiLab/NST", "NbAiLab/NPSC"], "metrics": ["wer", "cer"], "base_model": "openai/whisper-base", "pipeline_tag": "automatic-speech-recognition", "widget": [{"src": "https://datasets-server.huggingface.co/assets/google/fleurs/--/nb_no/train/1/audio/audio.mp3", "example_title": "FLEURS sample 1"}, {"src": "https://datasets-server.huggingface.co/assets/google/fleurs/--/nb_no/train/4/audio/audio.mp3", "example_title": "FLEURS sample 2"}]} | automatic-speech-recognition | NbAiLab/nb-whisper-base | [
"transformers",
"pytorch",
"tf",
"jax",
"tensorboard",
"onnx",
"safetensors",
"whisper",
"automatic-speech-recognition",
"audio",
"asr",
"hf-asr-leaderboard",
"no",
"nb",
"nn",
"en",
"dataset:NbAiLab/ncc_speech",
"dataset:NbAiLab/NST",
"dataset:NbAiLab/NPSC",
"arxiv:2212.04356",
"base_model:openai/whisper-base",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] | 2024-02-13T10:07:48+00:00 | [
"2212.04356"
] | [
"no",
"nb",
"nn",
"en"
] | TAGS
#transformers #pytorch #tf #jax #tensorboard #onnx #safetensors #whisper #automatic-speech-recognition #audio #asr #hf-asr-leaderboard #no #nb #nn #en #dataset-NbAiLab/ncc_speech #dataset-NbAiLab/NST #dataset-NbAiLab/NPSC #arxiv-2212.04356 #base_model-openai/whisper-base #license-apache-2.0 #endpoints_compatible #region-us
| NB-Whisper Base
===============
Introducing the *Norwegian NB-Whisper Base model*, proudly developed by the National Library of Norway. NB-Whisper is a cutting-edge series of models designed for automatic speech recognition (ASR) and speech translation. These models are based on the work of OpenAI's Whisper. Each model in the series has been trained for 250,000 steps, utilizing a diverse dataset of 8 million samples. These samples consist of aligned audio clips, each 30 seconds long, culminating in a staggering 66,000 hours of speech. For an in-depth understanding of our training methodology and dataset composition, keep an eye out for our upcoming article.
Model Size: Tiny, Parameters: 39M, Model: NB-Whisper Tiny
Model Size: Base, Parameters: 74M, Model: NB-Whisper Base
Model Size: Small, Parameters: 244M, Model: NB-Whisper Small
Model Size: Medium, Parameters: 769M, Model: NB-Whisper Medium
Model Size: Large, Parameters: 1550M, Model: NB-Whisper Large
### Verbatim Model
While the main models are suitable for most transcription task, we demonstrate how easy it is to change the output of the main model. The following models are trained 250 additional steps from the main models above, and might be suitable for more targetted use cases:
* Verbatim version: This lower-cased variant is more literal and suitable for tasks requiring detailed transcription, such as linguistic analysis.
Model Size: Tiny, Parameters: 39M, Semantic version: Tiny - semantic
Model Size: Base, Parameters: 74M, Semantic version: Base - semantic
Model Size: Small, Parameters: 244M, Semantic version: Small - semantic
Model Size: Medium, Parameters: 769M, Semantic version: Medium - semantic
Model Size: Large, Parameters: 1550M, Semantic version: Large - semantic
### Model Description
* Developed by: NB AI-Lab
* Shared by: NB AI-Lab
* Model type: 'whisper'
* Language(s) (NLP): Norwegian, Norwegian Bokmål, Norwegian Nynorsk, English
* License: Apache 2.0
* Trained from model: openai/whisper-base
* Code Repository: URL
* Paper: *Coming soon*
* Demo: *See Spaces on this page*
How to Use the Models
---------------------
### Online Demos
You can try the models directly through the HuggingFace Inference API, accessible on the right side of this page. Be aware that initially, the model needs to load and will run on limited CPU capacity, which might be slow. To enhance your experience, we are temporarily hosting some models on TPUs for a few days, significantly boosting their performance. Explore these under the Spaces section on the Main Page.
### Local Setup with HuggingFace
Alternatively, you can run the models locally. The Tiny, Base, and Small models are optimized for CPU execution. For the Medium and Large models, we recommend a system equipped with a GPU to ensure efficient processing. Setting up and using these models with HuggingFace's Transformers is straightforward, provided you have Python installed on your machine. For practical demonstrations, refer to examples using this sample mp3 file.
After this is done, you should be able to run this in Python:
Expected output
#### Extended HuggingFace
Examining the output above, we see that there are multiple repetitions at the end. This is because the video is longer than 30 seconds. By passing the argument, we can transcribe longer file. Our experience is that we get slightly better result by setting that to 28 seconds instead of the default 30 seconds. We also recommend setting the beam size to 5 if possible. This greatly increases the accuracy but takes a bit longer and requires slightly more memory. The examples below also illustrates how to transcribe to English or Nynorsk, and how to get timestamps for sentences and words.
Expected output
Long transcripts:
Timestamps:
Word Level Timestamps:
Nynorsk:
English:
### Whisper CPP
Whisper CPP is a C++ implementation of the Whisper model, offering the same functionalities with the added benefits of C++ efficiency and performance optimizations. This allows embedding any Whisper model into a binary file, facilitating the development of real applications. However, it requires some familiarity with compiling C++ programs. Their homepage provides examples of how to build applications, including real-time transcription.
We have converted this model to the ggml-format model used by Whisper CPP binaries. The file can be downloaded here, and a 'q5\_0' quantized version is also available here.
### WhisperX and Speaker Diarization
Speaker diarization is a technique in natural language processing and automatic speech recognition that identifies and separates different speakers in an audio recording. It segments the audio into parts based on who is speaking, enhancing the quality of transcribing meetings or phone calls. We find that WhisperX is the easiest way to use our models for diarizing speech. In addition, WhisperX is using phoneme-based Wav2Vec-models for improving the alignment of the timestamps. As of December 2023 it also has native support for using the nb-wav2vec-models. It currently uses PyAnnote-audio for doing the actual diarization. This package has a fairly strict licence where you have to agree to user terms. Follow the instructions below.
You can also run WhisperX from Python. Please take a look at the instructions on WhisperX homepage.
### API
Instructions for accessing the models via a simple API are included in the demos under Spaces. Note that these demos are temporary and will only be available for a few weeks.
Training Data
-------------
The training data originates from Språkbanken and the National Library of Norway's digital collection, including:
* NST Norwegian ASR Database (16 kHz) and its corresponding dataset
* Transcribed speeches from the Norwegian Parliament by Språkbanken
* TV broadcast (NRK) subtitles (NLN digital collection)
* Audiobooks (NLN digital collection)
Downstream Use
--------------
The models, especially the smaller ones, may exhibit occasional hallucinations and may drop parts of the transcript. They are designed to convert spoken language into grammatically correct written sentences, which might not always be word-for-word translations. We have made two extra model variant for users that want a different transcription style. We encourage users to try the models themselves to get a better understanding.
Bias, Risks, and Limitations
----------------------------
Using these models without adequate risk assessment and mitigation could be considered irresponsible. They may contain biases or other undesirable distortions. Users who deploy these models or integrate them into systems or services are responsible for mitigating risks and complying with applicable AI regulations. The National Library of Norway, as the model owner, disclaims liability for any outcomes resulting from third-party use of these models.
### Software
The model was trained using Jax/Flax and converted to PyTorch, Tensorflow, URL, and ONXX formats. These are available under 'Files and versions'. We welcome requests for conversion to other formats. All training code and scripts are released under the Apache License 2.0 in the GitHub repository nb-whisper.
& Contributors
The NB-Whisper Base model is a product of the NoSTram project led by Per Egil Kummervold (@pere) at the National Library of Norway. Key contributors include Javier de la Rosa (@versae), Freddy Wetjen (@freddyw), and Rolv-Arild Braaten (@Rolv-Arild). NB AI-Lab, under the direction of Svein Arne Brygfjeld (@Brygfjeld), supported the project's successful completion. A detailed paper on our process and findings is forthcoming.
Disclaimer
----------
The models published in this repository are intended for a generalist purpose and are available to third parties. These models may have bias and/or any other undesirable distortions. When third parties, deploy or provide systems and/or services to other parties using any of these models (or using systems based on these models) or become users of the models, they should note that it is their responsibility to mitigate the risks arising from their use and, in any event, to comply with applicable regulations, including regulations regarding the use of artificial intelligence. In no event shall the owner of the models (The National Library of Norway) be liable for any results arising from the use made by third parties of these models.
Acknowledgements
----------------
Our gratitude extends to Google TPU Research Cloud for training resources, Google Cloud for translation credits, and HuggingFace's Sanchit Ghandi for technical support. A special thank you to Per Erik Solberg at Språkbanken for the collaboration on the Stortinget corpus.
Contact
-------
For feedback, technical concerns, or collaboration inquiries, please contact [ailab@URL](mailto:ailab@URL). If you plan to include this model in your research, contact us for the latest information on our upcoming paper for citation purposes.
| [
"### Verbatim Model\n\n\nWhile the main models are suitable for most transcription task, we demonstrate how easy it is to change the output of the main model. The following models are trained 250 additional steps from the main models above, and might be suitable for more targetted use cases:\n\n\n* Verbatim version: This lower-cased variant is more literal and suitable for tasks requiring detailed transcription, such as linguistic analysis.\n\n\nModel Size: Tiny, Parameters: 39M, Semantic version: Tiny - semantic\nModel Size: Base, Parameters: 74M, Semantic version: Base - semantic\nModel Size: Small, Parameters: 244M, Semantic version: Small - semantic\nModel Size: Medium, Parameters: 769M, Semantic version: Medium - semantic\nModel Size: Large, Parameters: 1550M, Semantic version: Large - semantic",
"### Model Description\n\n\n* Developed by: NB AI-Lab\n* Shared by: NB AI-Lab\n* Model type: 'whisper'\n* Language(s) (NLP): Norwegian, Norwegian Bokmål, Norwegian Nynorsk, English\n* License: Apache 2.0\n* Trained from model: openai/whisper-base\n* Code Repository: URL\n* Paper: *Coming soon*\n* Demo: *See Spaces on this page*\n\n\nHow to Use the Models\n---------------------",
"### Online Demos\n\n\nYou can try the models directly through the HuggingFace Inference API, accessible on the right side of this page. Be aware that initially, the model needs to load and will run on limited CPU capacity, which might be slow. To enhance your experience, we are temporarily hosting some models on TPUs for a few days, significantly boosting their performance. Explore these under the Spaces section on the Main Page.",
"### Local Setup with HuggingFace\n\n\nAlternatively, you can run the models locally. The Tiny, Base, and Small models are optimized for CPU execution. For the Medium and Large models, we recommend a system equipped with a GPU to ensure efficient processing. Setting up and using these models with HuggingFace's Transformers is straightforward, provided you have Python installed on your machine. For practical demonstrations, refer to examples using this sample mp3 file.\n\n\nAfter this is done, you should be able to run this in Python:\n\n\n\nExpected output",
"#### Extended HuggingFace\n\n\nExamining the output above, we see that there are multiple repetitions at the end. This is because the video is longer than 30 seconds. By passing the argument, we can transcribe longer file. Our experience is that we get slightly better result by setting that to 28 seconds instead of the default 30 seconds. We also recommend setting the beam size to 5 if possible. This greatly increases the accuracy but takes a bit longer and requires slightly more memory. The examples below also illustrates how to transcribe to English or Nynorsk, and how to get timestamps for sentences and words.\n\n\n\nExpected output\nLong transcripts:\n\n\nTimestamps:\n\n\nWord Level Timestamps:\n\n\nNynorsk:\n\n\nEnglish:",
"### Whisper CPP\n\n\nWhisper CPP is a C++ implementation of the Whisper model, offering the same functionalities with the added benefits of C++ efficiency and performance optimizations. This allows embedding any Whisper model into a binary file, facilitating the development of real applications. However, it requires some familiarity with compiling C++ programs. Their homepage provides examples of how to build applications, including real-time transcription.\n\n\nWe have converted this model to the ggml-format model used by Whisper CPP binaries. The file can be downloaded here, and a 'q5\\_0' quantized version is also available here.",
"### WhisperX and Speaker Diarization\n\n\nSpeaker diarization is a technique in natural language processing and automatic speech recognition that identifies and separates different speakers in an audio recording. It segments the audio into parts based on who is speaking, enhancing the quality of transcribing meetings or phone calls. We find that WhisperX is the easiest way to use our models for diarizing speech. In addition, WhisperX is using phoneme-based Wav2Vec-models for improving the alignment of the timestamps. As of December 2023 it also has native support for using the nb-wav2vec-models. It currently uses PyAnnote-audio for doing the actual diarization. This package has a fairly strict licence where you have to agree to user terms. Follow the instructions below.\n\n\nYou can also run WhisperX from Python. Please take a look at the instructions on WhisperX homepage.",
"### API\n\n\nInstructions for accessing the models via a simple API are included in the demos under Spaces. Note that these demos are temporary and will only be available for a few weeks.\n\n\nTraining Data\n-------------\n\n\nThe training data originates from Språkbanken and the National Library of Norway's digital collection, including:\n\n\n* NST Norwegian ASR Database (16 kHz) and its corresponding dataset\n* Transcribed speeches from the Norwegian Parliament by Språkbanken\n* TV broadcast (NRK) subtitles (NLN digital collection)\n* Audiobooks (NLN digital collection)\n\n\nDownstream Use\n--------------\n\n\nThe models, especially the smaller ones, may exhibit occasional hallucinations and may drop parts of the transcript. They are designed to convert spoken language into grammatically correct written sentences, which might not always be word-for-word translations. We have made two extra model variant for users that want a different transcription style. We encourage users to try the models themselves to get a better understanding.\n\n\nBias, Risks, and Limitations\n----------------------------\n\n\nUsing these models without adequate risk assessment and mitigation could be considered irresponsible. They may contain biases or other undesirable distortions. Users who deploy these models or integrate them into systems or services are responsible for mitigating risks and complying with applicable AI regulations. The National Library of Norway, as the model owner, disclaims liability for any outcomes resulting from third-party use of these models.",
"### Software\n\n\nThe model was trained using Jax/Flax and converted to PyTorch, Tensorflow, URL, and ONXX formats. These are available under 'Files and versions'. We welcome requests for conversion to other formats. All training code and scripts are released under the Apache License 2.0 in the GitHub repository nb-whisper.\n\n\n& Contributors\nThe NB-Whisper Base model is a product of the NoSTram project led by Per Egil Kummervold (@pere) at the National Library of Norway. Key contributors include Javier de la Rosa (@versae), Freddy Wetjen (@freddyw), and Rolv-Arild Braaten (@Rolv-Arild). NB AI-Lab, under the direction of Svein Arne Brygfjeld (@Brygfjeld), supported the project's successful completion. A detailed paper on our process and findings is forthcoming.\n\n\nDisclaimer\n----------\n\n\nThe models published in this repository are intended for a generalist purpose and are available to third parties. These models may have bias and/or any other undesirable distortions. When third parties, deploy or provide systems and/or services to other parties using any of these models (or using systems based on these models) or become users of the models, they should note that it is their responsibility to mitigate the risks arising from their use and, in any event, to comply with applicable regulations, including regulations regarding the use of artificial intelligence. In no event shall the owner of the models (The National Library of Norway) be liable for any results arising from the use made by third parties of these models.\n\n\nAcknowledgements\n----------------\n\n\nOur gratitude extends to Google TPU Research Cloud for training resources, Google Cloud for translation credits, and HuggingFace's Sanchit Ghandi for technical support. A special thank you to Per Erik Solberg at Språkbanken for the collaboration on the Stortinget corpus.\n\n\nContact\n-------\n\n\nFor feedback, technical concerns, or collaboration inquiries, please contact [ailab@URL](mailto:ailab@URL). If you plan to include this model in your research, contact us for the latest information on our upcoming paper for citation purposes."
] | [
"TAGS\n#transformers #pytorch #tf #jax #tensorboard #onnx #safetensors #whisper #automatic-speech-recognition #audio #asr #hf-asr-leaderboard #no #nb #nn #en #dataset-NbAiLab/ncc_speech #dataset-NbAiLab/NST #dataset-NbAiLab/NPSC #arxiv-2212.04356 #base_model-openai/whisper-base #license-apache-2.0 #endpoints_compatible #region-us \n",
"### Verbatim Model\n\n\nWhile the main models are suitable for most transcription task, we demonstrate how easy it is to change the output of the main model. The following models are trained 250 additional steps from the main models above, and might be suitable for more targetted use cases:\n\n\n* Verbatim version: This lower-cased variant is more literal and suitable for tasks requiring detailed transcription, such as linguistic analysis.\n\n\nModel Size: Tiny, Parameters: 39M, Semantic version: Tiny - semantic\nModel Size: Base, Parameters: 74M, Semantic version: Base - semantic\nModel Size: Small, Parameters: 244M, Semantic version: Small - semantic\nModel Size: Medium, Parameters: 769M, Semantic version: Medium - semantic\nModel Size: Large, Parameters: 1550M, Semantic version: Large - semantic",
"### Model Description\n\n\n* Developed by: NB AI-Lab\n* Shared by: NB AI-Lab\n* Model type: 'whisper'\n* Language(s) (NLP): Norwegian, Norwegian Bokmål, Norwegian Nynorsk, English\n* License: Apache 2.0\n* Trained from model: openai/whisper-base\n* Code Repository: URL\n* Paper: *Coming soon*\n* Demo: *See Spaces on this page*\n\n\nHow to Use the Models\n---------------------",
"### Online Demos\n\n\nYou can try the models directly through the HuggingFace Inference API, accessible on the right side of this page. Be aware that initially, the model needs to load and will run on limited CPU capacity, which might be slow. To enhance your experience, we are temporarily hosting some models on TPUs for a few days, significantly boosting their performance. Explore these under the Spaces section on the Main Page.",
"### Local Setup with HuggingFace\n\n\nAlternatively, you can run the models locally. The Tiny, Base, and Small models are optimized for CPU execution. For the Medium and Large models, we recommend a system equipped with a GPU to ensure efficient processing. Setting up and using these models with HuggingFace's Transformers is straightforward, provided you have Python installed on your machine. For practical demonstrations, refer to examples using this sample mp3 file.\n\n\nAfter this is done, you should be able to run this in Python:\n\n\n\nExpected output",
"#### Extended HuggingFace\n\n\nExamining the output above, we see that there are multiple repetitions at the end. This is because the video is longer than 30 seconds. By passing the argument, we can transcribe longer file. Our experience is that we get slightly better result by setting that to 28 seconds instead of the default 30 seconds. We also recommend setting the beam size to 5 if possible. This greatly increases the accuracy but takes a bit longer and requires slightly more memory. The examples below also illustrates how to transcribe to English or Nynorsk, and how to get timestamps for sentences and words.\n\n\n\nExpected output\nLong transcripts:\n\n\nTimestamps:\n\n\nWord Level Timestamps:\n\n\nNynorsk:\n\n\nEnglish:",
"### Whisper CPP\n\n\nWhisper CPP is a C++ implementation of the Whisper model, offering the same functionalities with the added benefits of C++ efficiency and performance optimizations. This allows embedding any Whisper model into a binary file, facilitating the development of real applications. However, it requires some familiarity with compiling C++ programs. Their homepage provides examples of how to build applications, including real-time transcription.\n\n\nWe have converted this model to the ggml-format model used by Whisper CPP binaries. The file can be downloaded here, and a 'q5\\_0' quantized version is also available here.",
"### WhisperX and Speaker Diarization\n\n\nSpeaker diarization is a technique in natural language processing and automatic speech recognition that identifies and separates different speakers in an audio recording. It segments the audio into parts based on who is speaking, enhancing the quality of transcribing meetings or phone calls. We find that WhisperX is the easiest way to use our models for diarizing speech. In addition, WhisperX is using phoneme-based Wav2Vec-models for improving the alignment of the timestamps. As of December 2023 it also has native support for using the nb-wav2vec-models. It currently uses PyAnnote-audio for doing the actual diarization. This package has a fairly strict licence where you have to agree to user terms. Follow the instructions below.\n\n\nYou can also run WhisperX from Python. Please take a look at the instructions on WhisperX homepage.",
"### API\n\n\nInstructions for accessing the models via a simple API are included in the demos under Spaces. Note that these demos are temporary and will only be available for a few weeks.\n\n\nTraining Data\n-------------\n\n\nThe training data originates from Språkbanken and the National Library of Norway's digital collection, including:\n\n\n* NST Norwegian ASR Database (16 kHz) and its corresponding dataset\n* Transcribed speeches from the Norwegian Parliament by Språkbanken\n* TV broadcast (NRK) subtitles (NLN digital collection)\n* Audiobooks (NLN digital collection)\n\n\nDownstream Use\n--------------\n\n\nThe models, especially the smaller ones, may exhibit occasional hallucinations and may drop parts of the transcript. They are designed to convert spoken language into grammatically correct written sentences, which might not always be word-for-word translations. We have made two extra model variant for users that want a different transcription style. We encourage users to try the models themselves to get a better understanding.\n\n\nBias, Risks, and Limitations\n----------------------------\n\n\nUsing these models without adequate risk assessment and mitigation could be considered irresponsible. They may contain biases or other undesirable distortions. Users who deploy these models or integrate them into systems or services are responsible for mitigating risks and complying with applicable AI regulations. The National Library of Norway, as the model owner, disclaims liability for any outcomes resulting from third-party use of these models.",
"### Software\n\n\nThe model was trained using Jax/Flax and converted to PyTorch, Tensorflow, URL, and ONXX formats. These are available under 'Files and versions'. We welcome requests for conversion to other formats. All training code and scripts are released under the Apache License 2.0 in the GitHub repository nb-whisper.\n\n\n& Contributors\nThe NB-Whisper Base model is a product of the NoSTram project led by Per Egil Kummervold (@pere) at the National Library of Norway. Key contributors include Javier de la Rosa (@versae), Freddy Wetjen (@freddyw), and Rolv-Arild Braaten (@Rolv-Arild). NB AI-Lab, under the direction of Svein Arne Brygfjeld (@Brygfjeld), supported the project's successful completion. A detailed paper on our process and findings is forthcoming.\n\n\nDisclaimer\n----------\n\n\nThe models published in this repository are intended for a generalist purpose and are available to third parties. These models may have bias and/or any other undesirable distortions. When third parties, deploy or provide systems and/or services to other parties using any of these models (or using systems based on these models) or become users of the models, they should note that it is their responsibility to mitigate the risks arising from their use and, in any event, to comply with applicable regulations, including regulations regarding the use of artificial intelligence. In no event shall the owner of the models (The National Library of Norway) be liable for any results arising from the use made by third parties of these models.\n\n\nAcknowledgements\n----------------\n\n\nOur gratitude extends to Google TPU Research Cloud for training resources, Google Cloud for translation credits, and HuggingFace's Sanchit Ghandi for technical support. A special thank you to Per Erik Solberg at Språkbanken for the collaboration on the Stortinget corpus.\n\n\nContact\n-------\n\n\nFor feedback, technical concerns, or collaboration inquiries, please contact [ailab@URL](mailto:ailab@URL). If you plan to include this model in your research, contact us for the latest information on our upcoming paper for citation purposes."
] | [
145,
198,
106,
95,
127,
160,
149,
215,
325,
497
] | [
"passage: TAGS\n#transformers #pytorch #tf #jax #tensorboard #onnx #safetensors #whisper #automatic-speech-recognition #audio #asr #hf-asr-leaderboard #no #nb #nn #en #dataset-NbAiLab/ncc_speech #dataset-NbAiLab/NST #dataset-NbAiLab/NPSC #arxiv-2212.04356 #base_model-openai/whisper-base #license-apache-2.0 #endpoints_compatible #region-us \n### Verbatim Model\n\n\nWhile the main models are suitable for most transcription task, we demonstrate how easy it is to change the output of the main model. The following models are trained 250 additional steps from the main models above, and might be suitable for more targetted use cases:\n\n\n* Verbatim version: This lower-cased variant is more literal and suitable for tasks requiring detailed transcription, such as linguistic analysis.\n\n\nModel Size: Tiny, Parameters: 39M, Semantic version: Tiny - semantic\nModel Size: Base, Parameters: 74M, Semantic version: Base - semantic\nModel Size: Small, Parameters: 244M, Semantic version: Small - semantic\nModel Size: Medium, Parameters: 769M, Semantic version: Medium - semantic\nModel Size: Large, Parameters: 1550M, Semantic version: Large - semantic### Model Description\n\n\n* Developed by: NB AI-Lab\n* Shared by: NB AI-Lab\n* Model type: 'whisper'\n* Language(s) (NLP): Norwegian, Norwegian Bokmål, Norwegian Nynorsk, English\n* License: Apache 2.0\n* Trained from model: openai/whisper-base\n* Code Repository: URL\n* Paper: *Coming soon*\n* Demo: *See Spaces on this page*\n\n\nHow to Use the Models\n---------------------",
"passage: ### Online Demos\n\n\nYou can try the models directly through the HuggingFace Inference API, accessible on the right side of this page. Be aware that initially, the model needs to load and will run on limited CPU capacity, which might be slow. To enhance your experience, we are temporarily hosting some models on TPUs for a few days, significantly boosting their performance. Explore these under the Spaces section on the Main Page.### Local Setup with HuggingFace\n\n\nAlternatively, you can run the models locally. The Tiny, Base, and Small models are optimized for CPU execution. For the Medium and Large models, we recommend a system equipped with a GPU to ensure efficient processing. Setting up and using these models with HuggingFace's Transformers is straightforward, provided you have Python installed on your machine. For practical demonstrations, refer to examples using this sample mp3 file.\n\n\nAfter this is done, you should be able to run this in Python:\n\n\n\nExpected output#### Extended HuggingFace\n\n\nExamining the output above, we see that there are multiple repetitions at the end. This is because the video is longer than 30 seconds. By passing the argument, we can transcribe longer file. Our experience is that we get slightly better result by setting that to 28 seconds instead of the default 30 seconds. We also recommend setting the beam size to 5 if possible. This greatly increases the accuracy but takes a bit longer and requires slightly more memory. The examples below also illustrates how to transcribe to English or Nynorsk, and how to get timestamps for sentences and words.\n\n\n\nExpected output\nLong transcripts:\n\n\nTimestamps:\n\n\nWord Level Timestamps:\n\n\nNynorsk:\n\n\nEnglish:### Whisper CPP\n\n\nWhisper CPP is a C++ implementation of the Whisper model, offering the same functionalities with the added benefits of C++ efficiency and performance optimizations. This allows embedding any Whisper model into a binary file, facilitating the development of real applications. However, it requires some familiarity with compiling C++ programs. Their homepage provides examples of how to build applications, including real-time transcription.\n\n\nWe have converted this model to the ggml-format model used by Whisper CPP binaries. The file can be downloaded here, and a 'q5\\_0' quantized version is also available here.",
"passage: ### WhisperX and Speaker Diarization\n\n\nSpeaker diarization is a technique in natural language processing and automatic speech recognition that identifies and separates different speakers in an audio recording. It segments the audio into parts based on who is speaking, enhancing the quality of transcribing meetings or phone calls. We find that WhisperX is the easiest way to use our models for diarizing speech. In addition, WhisperX is using phoneme-based Wav2Vec-models for improving the alignment of the timestamps. As of December 2023 it also has native support for using the nb-wav2vec-models. It currently uses PyAnnote-audio for doing the actual diarization. This package has a fairly strict licence where you have to agree to user terms. Follow the instructions below.\n\n\nYou can also run WhisperX from Python. Please take a look at the instructions on WhisperX homepage.### API\n\n\nInstructions for accessing the models via a simple API are included in the demos under Spaces. Note that these demos are temporary and will only be available for a few weeks.\n\n\nTraining Data\n-------------\n\n\nThe training data originates from Språkbanken and the National Library of Norway's digital collection, including:\n\n\n* NST Norwegian ASR Database (16 kHz) and its corresponding dataset\n* Transcribed speeches from the Norwegian Parliament by Språkbanken\n* TV broadcast (NRK) subtitles (NLN digital collection)\n* Audiobooks (NLN digital collection)\n\n\nDownstream Use\n--------------\n\n\nThe models, especially the smaller ones, may exhibit occasional hallucinations and may drop parts of the transcript. They are designed to convert spoken language into grammatically correct written sentences, which might not always be word-for-word translations. We have made two extra model variant for users that want a different transcription style. We encourage users to try the models themselves to get a better understanding.\n\n\nBias, Risks, and Limitations\n----------------------------\n\n\nUsing these models without adequate risk assessment and mitigation could be considered irresponsible. They may contain biases or other undesirable distortions. Users who deploy these models or integrate them into systems or services are responsible for mitigating risks and complying with applicable AI regulations. The National Library of Norway, as the model owner, disclaims liability for any outcomes resulting from third-party use of these models."
] | [
-0.054324280470609665,
0.10830754041671753,
-0.004045714158564806,
0.0034350857604295015,
0.04753349721431732,
-0.026376157999038696,
0.027371948584914207,
0.059434566646814346,
-0.002522749127820134,
0.0697074830532074,
-0.004164496902376413,
-0.07856494933366776,
0.08875740319490433,
0.021036086603999138,
0.09572138637304306,
-0.29835382103919983,
0.05780331417918205,
-0.07918870449066162,
0.01793087087571621,
0.04202157258987427,
0.0910264253616333,
-0.054098647087812424,
0.044246699661016464,
0.013836067169904709,
-0.007548196706920862,
0.017894916236400604,
-0.05565613508224487,
-0.03392494469881058,
0.07785508781671524,
0.09398309141397476,
0.04395340010523796,
-0.0012346789008006454,
0.08269514888525009,
-0.1458626389503479,
0.010314143262803555,
0.053992968052625656,
0.03297135978937149,
0.012021365575492382,
0.032188624143600464,
0.10854282230138779,
0.20033179223537445,
-0.07605312764644623,
0.012440677732229233,
0.04978746175765991,
-0.029518229886889458,
-0.14181816577911377,
-0.04306786134839058,
-0.024353528395295143,
0.04785383120179176,
0.0428207702934742,
-0.03155800327658653,
0.08809412270784378,
-0.07334675639867783,
0.03613302484154701,
0.08263062685728073,
-0.12248154729604721,
-0.015069839544594288,
0.010059772059321404,
0.04815388098359108,
0.05126175284385681,
-0.03036113642156124,
0.0052971746772527695,
0.014518913812935352,
0.04290309548377991,
0.016057245433330536,
-0.007989183999598026,
0.07444687932729721,
-0.06166571006178856,
-0.10648137331008911,
-0.04682639241218567,
0.12245076894760132,
0.018604187294840813,
-0.06292039901018143,
-0.16913676261901855,
-0.04439597204327583,
0.056120723485946655,
-0.02029743790626526,
-0.02439282275736332,
0.027525432407855988,
0.00009383881842950359,
0.09575428813695908,
-0.07163209468126297,
-0.09744634479284286,
0.018736910074949265,
-0.007273256313055754,
0.11417708545923233,
0.04634099081158638,
0.0043635135516524315,
0.027236366644501686,
0.05130161717534065,
-0.060434263199567795,
-0.05776330456137657,
-0.06439557671546936,
-0.06605701893568039,
-0.06125568225979805,
0.01806466095149517,
-0.026352522894740105,
-0.09152209758758545,
-0.0028310262132436037,
0.07780063897371292,
-0.02006847970187664,
0.023647554218769073,
0.024241620674729347,
-0.0035324941854923964,
0.05045899748802185,
0.13043932616710663,
-0.02579289674758911,
-0.0746370479464531,
-0.005357114598155022,
-0.003832292975857854,
0.06398472189903259,
0.008664495311677456,
-0.048005688935518265,
-0.03312993422150612,
0.0008877243963070214,
0.022888386622071266,
0.014530223794281483,
0.009499483741819859,
0.016073046252131462,
-0.02908833511173725,
0.23790937662124634,
-0.11554362624883652,
0.0012951554963365197,
0.01363594550639391,
-0.04640624299645424,
0.07846233248710632,
0.033371586352586746,
-0.024122819304466248,
-0.12948910892009735,
0.015625640749931335,
-0.0040100268088281155,
-0.02402147464454174,
-0.0660073459148407,
-0.1066637858748436,
0.04761136695742607,
0.009874820709228516,
-0.014689295552670956,
-0.11329958587884903,
-0.08462363481521606,
-0.04312802478671074,
0.022822627797722816,
-0.012132578529417515,
-0.03654857352375984,
0.003924489486962557,
-0.03998676314949989,
-0.031829964369535446,
-0.030326513573527336,
-0.0029354009311646223,
-0.027243943884968758,
-0.018481025472283363,
-0.024432791396975517,
0.03737574815750122,
-0.02394411526620388,
0.00968193355947733,
-0.06156754493713379,
-0.00538244703784585,
-0.1832929253578186,
0.11136110872030258,
-0.07723142206668854,
-0.00916315894573927,
-0.025818146765232086,
-0.0716954693198204,
-0.04578505828976631,
0.047708507627248764,
0.005577514413744211,
0.07230084389448166,
-0.16909979283809662,
-0.02809152565896511,
0.14345593750476837,
-0.14401482045650482,
0.04383571818470955,
0.1390991359949112,
0.01384268794208765,
-0.0032446961849927902,
0.13155874609947205,
0.1146804690361023,
0.16768257319927216,
-0.1265096515417099,
-0.0641268864274025,
-0.0014777234755456448,
-0.030719565227627754,
0.0636446475982666,
0.037771906703710556,
-0.0057786800898611546,
0.09200017899274826,
0.045282214879989624,
0.012701070867478848,
0.022962169721722603,
0.0367811918258667,
-0.022380461916327477,
-0.0068133980967104435,
-0.024449439719319344,
-0.0015369480242952704,
0.038641367107629776,
-0.053290825337171555,
-0.04143724590539932,
-0.09481915831565857,
0.07186105102300644,
0.10618409514427185,
-0.0432138554751873,
0.03234725072979927,
-0.07034046947956085,
-0.022714795544743538,
0.010322428308427334,
-0.00012579064059536904,
-0.11039918661117554,
-0.053553592413663864,
0.047111522406339645,
-0.12960615754127502,
0.07418935745954514,
0.05566840246319771,
0.03647516295313835,
0.07598616182804108,
-0.01880766451358795,
0.012429955415427685,
-0.03361785039305687,
0.002449348568916321,
-0.01976754702627659,
-0.04126621410250664,
-0.0287074763327837,
-0.035296082496643066,
0.023791750892996788,
-0.10503849387168884,
0.0006582413916476071,
0.011085540987551212,
0.09367462992668152,
0.014758285135030746,
-0.0288019310683012,
0.01486497838050127,
0.013292656280100346,
0.01069293636828661,
-0.04171261191368103,
-0.008044381625950336,
0.0011926094302907586,
-0.0015697814524173737,
0.10993955284357071,
-0.14894695580005646,
-0.10034968703985214,
0.055758286267519,
0.10848663002252579,
-0.015345893800258636,
0.0006558274035342038,
-0.04077426716685295,
-0.032614704221487045,
-0.05092265084385872,
-0.11549931764602661,
0.20609863102436066,
0.017337964847683907,
0.061135053634643555,
-0.08898011595010757,
-0.022184953093528748,
0.009582833386957645,
-0.0009652103180997074,
-0.00672365166246891,
0.08142278343439102,
0.009568776935338974,
-0.06262443214654922,
-0.013353701680898666,
-0.04822496697306633,
0.03179549798369408,
0.17106731235980988,
-0.02015509456396103,
-0.10886839032173157,
0.0015016967663541436,
-0.013639909215271473,
-0.013332786969840527,
0.09747162461280823,
0.007494351360946894,
-0.004611985292285681,
0.02909460850059986,
0.025928519666194916,
0.053431108593940735,
-0.05683601275086403,
0.08257267624139786,
0.026711896061897278,
-0.04647250100970268,
0.03976719453930855,
-0.02612733282148838,
-0.006721531506627798,
0.03941192105412483,
0.007628448307514191,
0.019280433654785156,
-0.04615538939833641,
-0.04244108125567436,
-0.08617760986089706,
0.10042119026184082,
-0.10677190870046616,
-0.22058796882629395,
-0.16691075265407562,
0.06747245043516159,
-0.024031341075897217,
-0.015504482202231884,
0.03421500697731972,
-0.0558585487306118,
-0.10428494960069656,
-0.13152027130126953,
0.04239832982420921,
0.019963974133133888,
-0.07059792429208755,
-0.05030554533004761,
0.026786664500832558,
0.013894450850784779,
-0.12521958351135254,
0.0027520405128598213,
0.006357754115015268,
0.01186825055629015,
-0.018663255497813225,
0.013271872885525227,
0.035359811037778854,
0.06879481673240662,
0.009923937730491161,
-0.05409073829650879,
0.012278325855731964,
0.16981659829616547,
-0.07227780669927597,
0.149541437625885,
0.14741434156894684,
0.0018276125192642212,
0.06211668625473976,
0.09444856643676758,
0.017669232562184334,
-0.025258302688598633,
0.025439394637942314,
0.018627233803272247,
-0.06057259812951088,
-0.1443335860967636,
-0.12366444617509842,
-0.041716206818819046,
-0.011064601130783558,
0.06761788576841354,
0.032287415117025375,
-0.04721187427639961,
0.01804843544960022,
-0.07906857877969742,
-0.011766470968723297,
0.05098145082592964,
0.052094995975494385,
0.1343822032213211,
0.0024680669885128736,
0.027809614315629005,
-0.07743790745735168,
-0.018845660611987114,
0.10279885679483414,
-0.006288114935159683,
0.1664942055940628,
-0.05679665505886078,
0.11645752191543579,
0.032243724912405014,
-0.0015800570836290717,
0.05520087853074074,
0.05499732494354248,
0.0011639731237664819,
0.02252054214477539,
-0.023389553651213646,
-0.07777050882577896,
-0.04171973466873169,
0.08322887122631073,
0.0512181855738163,
-0.07642656564712524,
0.00026274286210536957,
-0.005998899694532156,
0.0022145770490169525,
0.14253218472003937,
0.03209179639816284,
-0.11645134538412094,
-0.131564661860466,
0.015154607594013214,
-0.10805520415306091,
-0.07535029202699661,
0.02022547274827957,
0.1464964896440506,
-0.08644171804189682,
0.030763087794184685,
-0.00040091699338518083,
0.07416968792676926,
-0.07872221618890762,
0.007804406341165304,
-0.029465490952134132,
0.1323324590921402,
0.01037482637912035,
0.05266174301505089,
-0.03197858855128288,
0.04691866412758827,
0.00907252635806799,
0.11466419696807861,
-0.06196719408035278,
0.04321189597249031,
0.027618998661637306,
0.012235909700393677,
0.05906466767191887,
0.04168371856212616,
-0.13959939777851105,
0.021120935678482056,
-0.10980629920959473,
0.05590997636318207,
0.046653423458337784,
0.04611940309405327,
0.0842750295996666,
-0.007960955612361431,
-0.00011294955766061321,
-0.025747155770659447,
-0.10188156366348267,
-0.12058132141828537,
-0.16559584438800812,
0.02513674832880497,
0.0029479078948497772,
-0.006567599717527628,
-0.05595548823475838,
-0.018877586349844933,
-0.0792221799492836,
0.11979370564222336,
-0.08167776465415955,
-0.12157943844795227,
-0.07385270297527313,
-0.05667257308959961,
0.1526588499546051,
-0.04134473577141762,
0.009318864904344082,
0.03160112351179123,
0.15879304707050323,
-0.05294026806950569,
-0.06320685148239136,
-0.015444204211235046,
-0.080427385866642,
-0.10684928297996521,
0.006127122789621353,
0.1182025596499443,
0.10767073184251785,
0.05240168794989586,
0.010376069694757462,
0.004002703819423914,
-0.002684297738596797,
-0.10014164447784424,
-0.050296392291784286,
0.16432584822177887,
-0.00908194575458765,
0.03025503270328045,
-0.04728090763092041,
-0.06481116265058517,
-0.05558992549777031,
-0.00858185812830925,
0.04159414395689964,
0.14245685935020447,
-0.05233299359679222,
0.13565464317798615,
0.19548048079013824,
-0.06504156440496445,
-0.2072339504957199,
-0.05446870997548103,
0.05187297239899635,
0.05642858147621155,
0.009941024705767632,
-0.17725235223770142,
0.10550588369369507,
0.032702963799238205,
-0.0004927674890495837,
0.040597256273031235,
-0.2129274606704712,
-0.10007578134536743,
0.04559077322483063,
-0.004748706240206957,
-0.024529429152607918,
-0.018393119797110558,
-0.0304936021566391,
-0.0364324152469635,
-0.03201624006032944,
0.06691210716962814,
-0.04748617485165596,
0.056064728647470474,
0.033630743622779846,
0.0825229063630104,
0.046810001134872437,
-0.0224076509475708,
0.11534202843904495,
-0.03247370943427086,
0.005242114420980215,
-0.06338273733854294,
0.09762442111968994,
0.01917804591357708,
-0.06322649866342545,
0.14558982849121094,
-0.02419518493115902,
0.012799333781003952,
-0.10655134916305542,
-0.06406288594007492,
-0.0764843225479126,
0.0705169066786766,
-0.019261490553617477,
-0.042133960872888565,
-0.0844804123044014,
0.07916447520256042,
0.09475564956665039,
0.005253948736935854,
-0.08527082204818726,
-0.06702236086130142,
-0.06822579354047775,
0.12730629742145538,
0.16845102608203888,
-0.05185210332274437,
-0.0521286316215992,
0.012923016212880611,
-0.0012021759757772088,
0.05611901357769966,
-0.04832043871283531,
0.013803740032017231,
0.09701348096132278,
-0.022072963416576385,
0.03381049260497093,
-0.03085445798933506,
-0.138718381524086,
-0.01958462782204151,
0.026008889079093933,
-0.04621149227023125,
-0.16711623966693878,
-0.032231997698545456,
0.03536525368690491,
-0.052636485546827316,
-0.04290471598505974,
0.1325513869524002,
-0.08232510834932327,
-0.0010942131048068404,
0.013880841434001923,
0.05362561345100403,
0.02507609874010086,
0.08733579516410828,
0.03205835446715355,
0.02244492806494236,
-0.07048668712377548,
0.09519656747579575,
0.03247803822159767,
-0.12482795119285583,
0.04691629484295845,
0.1497410535812378,
-0.09295890480279922,
-0.04762725904583931,
-0.12329784035682678,
-0.0328177772462368,
-0.011217079125344753,
-0.1018255352973938,
0.008415131829679012,
-0.07155431807041168,
0.0160874892026186,
0.00922887772321701,
0.008568399585783482,
-0.019230196252465248,
-0.020810751244425774,
0.039183054119348526,
-0.09704756736755371,
0.08313807845115662,
0.007902164943516254,
0.02552991546690464,
-0.04193558171391487,
0.08521004766225815,
-0.0011760328197851777,
0.008752875030040741,
-0.02434580773115158,
-0.020049648359417915,
-0.009026806801557541,
-0.0315326564013958,
-0.14119991660118103,
0.011121377348899841,
-0.08731406927108765,
0.007406965363770723,
-0.0030495747923851013,
0.028562961146235466,
-0.017951058223843575,
0.053039539605379105,
-0.034857600927352905,
-0.02238789200782776,
-0.053211864084005356,
0.028092483058571815,
-0.05503957346081734,
0.009656405076384544,
0.04938392713665962,
-0.0774889811873436,
0.049056973308324814,
0.005910300184041262,
-0.05279238894581795,
0.0509844534099102,
-0.006647632922977209,
0.007073561195284128,
0.018018372356891632,
0.06393290311098099,
0.005091497208923101,
-0.052510395646095276,
-0.0056539252400398254,
0.02892102114856243,
0.00024303731333930045,
-0.037853095680475235,
0.04536067321896553,
-0.05672747269272804,
0.07126151770353317,
0.030015336349606514,
-0.01794077828526497,
-0.05631856992840767,
0.033446475863456726,
0.034857433289289474,
0.018574709072709084,
0.08746140450239182,
-0.059699613600969315,
0.014896285720169544,
-0.1002684235572815,
-0.004550850484520197,
0.008846465498209,
0.007239101454615593,
0.08700460195541382,
-0.01889476366341114,
0.027315817773342133,
0.0019266282906755805,
0.18359576165676117,
-0.022976437583565712,
0.026835693046450615,
0.051752518862485886,
-0.09385515004396439,
-0.11077612638473511,
0.018566757440567017,
0.055949389934539795,
0.014001868665218353,
-0.011677120812237263,
-0.04335619881749153,
-0.03350071981549263,
-0.019536206498742104,
-0.009578664787113667,
0.07817275077104568,
0.1042119637131691,
0.10440146178007126,
0.08561196178197861,
0.017940523102879524,
-0.04057428985834122,
-0.11208036541938782,
0.07852932065725327,
-0.03489282354712486,
0.08118394762277603,
-0.0393342450261116,
0.08641216158866882,
0.12366176396608353,
-0.0631616860628128,
0.09672260284423828,
-0.0018848218023777008,
-0.04603148624300957,
-0.06959795206785202,
-0.1335711032152176,
-0.0442143976688385,
-0.0401642769575119,
-0.023463783785700798,
-0.08970252424478531,
0.03627242520451546,
0.01135194767266512,
0.01983405090868473,
-0.02512500248849392,
0.11389023065567017,
-0.05841277912259102,
-0.09273316711187363,
0.05873673036694527,
-0.018256040289998055,
0.027910061180591583,
0.11579176783561707,
0.00022993174206931144,
0.05304748937487602,
0.06741414964199066,
0.06068974733352661,
0.05914367735385895,
0.014652062207460403,
-0.006422838196158409,
0.001844293437898159,
-0.013253358192741871,
-0.007081287447363138,
-0.006043273489922285,
-0.009566125459969044,
0.1029699444770813,
0.07321625202894211,
-0.07772093266248703,
-0.007440707180649042,
0.10031818598508835,
-0.04728600010275841,
-0.1465270072221756,
-0.13382042944431305,
0.12223666906356812,
-0.002933631418272853,
0.02054661512374878,
-0.0045937360264360905,
-0.07631856948137283,
-0.01799355261027813,
0.12590013444423676,
0.128449946641922,
0.030269533395767212,
-0.0068426053039729595,
-0.031958628445863724,
-0.010876905173063278,
-0.061339955776929855,
0.11014404892921448,
0.004727677907794714,
0.2768014967441559,
-0.00688566779717803,
0.050629228353500366,
-0.02738310396671295,
-0.02387046627700329,
-0.11064211279153824,
0.08171636611223221,
-0.04914199188351631,
-0.0015299879014492035,
-0.03574967011809349,
0.07995576411485672,
-0.03577098250389099,
-0.24685125052928925,
-0.01846735179424286,
0.000054989010095596313,
-0.06562232226133347,
0.025667523965239525,
-0.011751718819141388,
0.017618807032704353,
0.06215329095721245,
-0.0032711829990148544,
0.0044484976679086685,
0.1570773869752884,
-0.013089845888316631,
-0.07300366461277008,
0.03533497452735901,
0.04293191060423851,
-0.048538465052843094,
0.16182249784469604,
0.036189090460538864,
0.055061426013708115,
0.056045740842819214,
-0.009053359739482403,
-0.11359014362096786,
0.060032546520233154,
0.029019981622695923,
-0.1158197745680809,
0.01917576976120472,
0.14772175252437592,
-0.009480555541813374,
0.06867680698633194,
0.045466747134923935,
-0.038711946457624435,
0.003310027765110135,
0.08529756218194962,
0.002275640843436122,
-0.0502193309366703,
0.05290193483233452,
-0.0931798592209816,
0.12399683147668839,
0.10426633805036545,
0.01332272868603468,
-0.025984304025769234,
-0.04679189249873161,
0.0034847308415919542,
-0.00799794401973486,
0.09006775170564651,
-0.0002741670759860426,
-0.11376865953207016,
-0.030826019123196602,
-0.0017221929738298059,
0.05683353170752525,
-0.1649216264486313,
-0.04263174906373024,
0.007136654108762741,
-0.02034718357026577,
0.027984268963336945,
0.10022950917482376,
0.02092256397008896,
0.009136552922427654,
-0.023382434621453285,
-0.022533655166625977,
0.006510287057608366,
0.07256027311086655,
-0.11877094954252243,
-0.05845535919070244
] |
null | null | transformers |
# NB-Whisper Small
Introducing the **_Norwegian NB-Whisper Small model_**, proudly developed by the National Library of Norway. NB-Whisper is a cutting-edge series of models designed for automatic speech recognition (ASR) and speech translation. These models are based on the work of [OpenAI's Whisper](https://arxiv.org/abs/2212.04356). Each model in the series has been trained for 250,000 steps, utilizing a diverse dataset of 8 million samples. These samples consist of aligned audio clips, each 30 seconds long, culminating in a staggering 66,000 hours of speech. For an in-depth understanding of our training methodology and dataset composition, keep an eye out for our upcoming article.
| Model Size | Parameters | Model |
|------------|------------|------------|
| Tiny | 39M | [NB-Whisper Tiny](https://huggingface.co/NbAiLab/nb-whisper-tiny) |
| Base | 74M | [NB-Whisper Base](https://huggingface.co/NbAiLab/nb-whisper-base) |
| Small | 244M | [NB-Whisper Small](https://huggingface.co/NbAiLab/nb-whisper-small) |
| Medium | 769M | [NB-Whisper Medium](https://huggingface.co/NbAiLab/nb-whisper-medium) |
| Large | 1550M | [NB-Whisper Large](https://huggingface.co/NbAiLab/nb-whisper-large) |
### Verbatim Model
While the main models are suitable for most transcription tasks, we demonstrate how easy it is to change the output of the main model. The following models are trained for 250 additional steps from the main models above and might be suitable for more targeted use cases (a brief usage sketch follows the table below):
- **Verbatim version**: This lower-cased variant is more literal and suitable for tasks requiring detailed transcription, such as linguistic analysis.
| Model Size | Parameters | Semantic version |
|------------|------------|------------------|
| Tiny | 39M | [Tiny - semantic](https://huggingface.co/NbAiLab/nb-whisper-tiny-semantic) |
| Base | 74M | [Base - semantic](https://huggingface.co/NbAiLab/nb-whisper-base-semantic) |
| Small | 244M | [Small - semantic](https://huggingface.co/NbAiLab/nb-whisper-small-semantic) |
| Medium | 769M | [Medium - semantic](https://huggingface.co/NbAiLab/nb-whisper-medium-semantic) |
| Large | 1550M | [Large - semantic](https://huggingface.co/NbAiLab/nb-whisper-large-semantic) |
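As a minimal sketch, a variant is used exactly like the main models; only the model id changes. The id below follows the "Small - semantic" link above, and the pipeline call mirrors the examples further down this page:

```python
from transformers import pipeline

# Same interface as the main model; swap in the semantic variant's id.
# "king.mp3" is the sample file introduced in the setup section below.
asr = pipeline("automatic-speech-recognition", "NbAiLab/nb-whisper-small-semantic")
asr("king.mp3", generate_kwargs={"task": "transcribe", "language": "no"})
```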
### Model Description
- **Developed by:** [NB AI-Lab](https://ai.nb.no/)
- **Shared by:** [NB AI-Lab](https://ai.nb.no/)
- **Model type:** `whisper`
- **Language(s) (NLP):** Norwegian, Norwegian Bokmål, Norwegian Nynorsk, English
- **License:** [Apache 2.0](https://www.apache.org/licenses/LICENSE-2.0)
- **Trained from model:** [openai/whisper-small](https://huggingface.co/openai/whisper-small)
- **Code Repository:** https://github.com/NbAiLab/nb-whisper/
- **Paper:** _Coming soon_
- **Demo:** _See Spaces on this page_
## How to Use the Models
### Online Demos
You can try the models directly through the HuggingFace Inference API, accessible on the right side of this page. Be aware that initially, the model needs to load and will run on limited CPU capacity, which might be slow. To enhance your experience, we are temporarily hosting some models on TPUs for a few days, significantly boosting their performance. Explore these under the **Spaces** section on the [Main Page](https://huggingface.co/NbAiLab/).
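If you would rather script against the hosted endpoint than use the widget, a minimal sketch looks like the following. Note the assumptions: hosting of any given model is not guaranteed, the `hf_...` token placeholder must be replaced with your own HuggingFace token, and the model id simply mirrors the Python examples below.

```python
import requests

API_URL = "https://api-inference.huggingface.co/models/NbAiLabBeta/nb-whisper-small"
headers = {"Authorization": "Bearer hf_..."}  # your own HuggingFace token

# Send raw audio bytes; the API decodes and transcribes them.
with open("king.mp3", "rb") as f:
    response = requests.post(API_URL, headers=headers, data=f.read())

print(response.json())  # e.g. {"text": "..."} once the model has loaded
```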
### Local Setup with HuggingFace
Alternatively, you can run the models locally. The Tiny, Base, and Small models are optimized for CPU execution. For the Medium and Large models, we recommend a system equipped with a GPU to ensure efficient processing. Setting up and using these models with HuggingFace's Transformers is straightforward, provided you have [Python](https://www.python.org/downloads/) installed on your machine. For practical demonstrations, refer to examples using this [sample mp3 file](https://github.com/NbAiLab/nb-whisper/raw/main/audio/king.mp3).
```bash
# Download the sample file
$ wget -N https://github.com/NbAiLab/nb-whisper/raw/main/audio/king.mp3
# Install necessary libraries.
$ pip install 'transformers>=4.35.2'
```
After this is done, you should be able to run this in Python:
```python
from transformers import pipeline
# Load the model
asr = pipeline("automatic-speech-recognition", "NbAiLabBeta/nb-whisper-small")
# Transcribe
asr("king.mp3", generate_kwargs={'task': 'transcribe', 'language': 'no'})
```
<details>
<summary>Expected output</summary>
```json
{
{'text': ' Nordmenn er nordlendinger, trøndere, sørlendinger og folk fra alle andre regioner. Nordmenn er også innvandret fra Afghanistan, Pakistan, Polen, Sverige, Somalia og Syria. Det er ikke alltid så lett å si hvor vi er fra, hvilken nasjonalitet vi er fra. Hvilken nasjonalitet vi er fra. Hvilken nasjonalitet vi er fra. Hvilken nasjonalitet vi er fra. Hvilken nasjonalitet vi er fra. Hvilken nasjonalitet vi er fra. Hvilken nasjonalitet vi er fra.'}
}
```
</details>
#### Extended HuggingFace
Examining the output above, we see that there are multiple repetitions at the end. This is because the audio is longer than 30 seconds. By passing the `chunk_length_s` argument, we can transcribe longer files. Our experience is that we get slightly better results by setting it to 28 seconds instead of the default 30 seconds. We also recommend setting the beam size to 5 if possible. This greatly increases the accuracy but takes a bit longer and requires slightly more memory. The examples below also illustrate how to transcribe to English or Nynorsk, and how to get timestamps for sentences and words.
```python
# Long Transcripts
asr("king.mp3", chunk_length_s=28, generate_kwargs={'task': 'transcribe', 'language': 'no'})
# Increase accuracy by setting beam size to 5
asr("king.mp3", chunk_length_s=28, return_timestamps=True, generate_kwargs={'num_beams': 5, 'task': 'transcribe', 'language': 'no'})
# Return Timestamps
asr("king.mp3", chunk_length_s=28, return_timestamps=True, generate_kwargs={'task': 'transcribe', 'language': 'no'})
# Return Word Level Timestamps
asr("king.mp3", chunk_length_s=28, return_timestamps="word", generate_kwargs={'task': 'transcribe', 'language': 'no'})
# Transcribe to Nynorsk
asr("king.mp3", chunk_length_s=28, generate_kwargs={'task': 'transcribe', 'language': 'nn'})
# Transcribe to English
asr("king.mp3", chunk_length_s=28, generate_kwargs={'task': 'transcribe', 'language': 'en'})
```
<details>
<summary>Expected output</summary>
Long transcripts:
```json
{
{'text': ' Nordmenn er nordlendinger, trøndere, sørlendinger og folk fra alle andre regioner. Nordmenn er også innvandret fra Afghanistan, Pakistan, Polen, Sverige, Somalia og Syria. Det er ikke alltid så lett å si hvor vi er fra, hvilken nasjonalitet vi er fra. Hvilken nasjonalitet vi er fra. Hvilken nasjonalitet vi er fra. Hvilken nasjonalitet vi er fra. Hvilken nasjonalitet vi er fra. Hvilken nasjonalitet vi er fra, hvilken nasjonalitet vi tilhører. Det vi kaller hjem, er der hjertet vårt er, og det kan ikke alltid plasseres innenfor landegrenser. Nordmenn er jenter som er glad i jenter, gutter som er glad i gutter, og jenter og gutter som er glad i hverandre. Nordmenn trommer på Gud, Allah, Altet og ingenting. Nordmenn liker Grieg, Kygo, Helbilis og Kari Bremnes. Med andre ord, Norge er dere. Norge er oss. Mitt største håp for Norge er at vi skal klare å ta vare på hverandre, at vi skal bygge dette landet videre på tillit, fellesskap og raushet.'}
}
```
Timestamps:
```json
{
{'text': ' Nordmenn er nordlendinger, trøndere, sørlendinger og folk fra alle andre regioner. Nordmenn er også innvandret fra Afghanistan, Pakistan, Polen, Sverige, Somalia og Syria. Det er ikke alltid så lett å si hvor vi er fra, hvilken nasjonalitet vi er fra. Hvilken nasjonalitet vi er fra. hvilken nasjonalitet vi tilhører. Det vi kaller hjem, er der hjertet vårt er, og det kan ikke alltid plasseres innenfor landegrenser. Nordmenn er jenter som er glad i jenter, gutter som er glad i gutter, og jenter og gutter som er glad i hverandre. Nordmenn trommer på Gud, Allah, Altet og ingenting. Nordmenn liker Grieg, Kygo, Helbiles og Kari Bremnes. Med andre ord, Norge er dere. Norge er oss. Mitt største håp for Norge er at vi skal klare å ta vare på hverandre, at vi skal bygge dette landet videre på tillit, fellesskap og raushet.',
'chunks': [{'timestamp': (0.0, 5.46),
'text': ' Nordmenn er nordlendinger, trøndere, sørlendinger'},
{'timestamp': (5.52, 8.68), 'text': ' og folk fra alle andre regioner.'},
{'timestamp': (8.68, 16.64),
'text': ' Nordmenn er også innvandret fra Afghanistan, Pakistan, Polen, Sverige, Somalia og Syria.'},
{'timestamp': (16.64, 13.3),
'text': ' Det er ikke alltid så lett å si hvor vi er fra, hvilken nasjonalitet vi er fra.'},
{'timestamp': (13.32, 30.28),
'text': ' Hvilken nasjonalitet vi er fra. hvilken nasjonalitet vi tilhører.'},
{'timestamp': (32.52, 39.16),
'text': ' Det vi kaller hjem, er der hjertet vårt er, og det kan ikke alltid plasseres'},
{'timestamp': (39.16, 42.0), 'text': ' innenfor landegrenser.'},
{'timestamp': (42.0, 46.74),
'text': ' Nordmenn er jenter som er glad i jenter, gutter som er glad i gutter,'},
{'timestamp': (46.74, 51.12),
'text': ' og jenter og gutter som er glad i hverandre.'},
{'timestamp': (51.16, 57.42),
'text': ' Nordmenn trommer på Gud, Allah, Altet og ingenting.'},
{'timestamp': (57.42, 64.3),
'text': ' Nordmenn liker Grieg, Kygo, Helbiles og Kari Bremnes.'},
{'timestamp': (64.34, 71.24),
'text': ' Med andre ord, Norge er dere. Norge er oss.'},
{'timestamp': (71.24, 78.04),
'text': ' Mitt største håp for Norge er at vi skal klare å ta vare på hverandre,'},
{'timestamp': (78.12, 84.68),
'text': ' at vi skal bygge dette landet videre på tillit, fellesskap og raushet.'}]}
}
```
Word Level Timestamps:
```json
{
{"text": "Nordmenn er nordlendinger, trøndere, sørlendinger og folk fra alle andre regioner. Nordmenn er også innvandret fra Afghanistan, Pakistan, Polen, Sverige, Somalia og Syria. Det er ikke alltid så lett å si hvor vi er fra, hvilken nasjonalitet vi tilhører. Det vi kaller hjem, er der hjertet vårt er, og det kan ikke alltid plasseres innenfor landegrenser. Nordmenn er jenter som er glad i jenter, gutter som er glad i gutter, og jenter og gutter som er glad i hverandre. Nordmenn trommer på Gud, Allah, Altet og ingenting. Nordmenn liker Grieg, Kygo, Helbilis og Kari Bremnes. Med andre ord, Norge er dere. Norge er oss. Mitt største håp for Norge er at vi skal klare å ta vare på hverandre, at vi skal bygge dette landet videre på tillit, fellesskap og raushet.",
"chunks": [
{"text": "Nordmenn", "timestamp": [0.72, 1.42]},
{"text": "er", "timestamp": [1.42, 1.74]},
// ... more chunks ...
{"text": "raushet.", "timestamp": [83.1, 84.88]}
]
}
}
```
Nynorsk:
```json
{
{"text": "Nordmenn er nordlendingar, trøndarar, sørlendingar og folk frå alle andre regionar. Nordmenn er også innvandra frå Afghanistan, Pakistan, Polen, Sverige, Somalia og Syria. Det er ikkje alltid så lett å seie kvar vi er frå, kva nasjonalitet vi tilhøyrer. Det vi kallar heim, er der hjartet vårt er, og det kan ikkje alltid plasserast innanfor landegrenser. Nordmenn er jenter som er glad i jenter, gutar som erade i gutar, og jenter og gutar som er glade i kvarandre. Nordmenn trommar på Gud, Allah, Altet og ingenting. Nordmenn liker Grieg, Kygo, Helbiles og Kari Bremnes. Med andre ord, Noreg er dere! Noreg er oss. Mitt største håp for Noreg er at vi skal klare å ta vare på kvarandre, at vi skal byggje dette landet vidare på tillit, fellesskap og raushet."}
}
```
English:
```json
{
{"text": "Norwegians are Norwegians, trønders, southerners and people from all other regions. Norwegians are also invaded from Afghanistan, Pakistan, Poland, Sweden, Somalia and Suria. It is not always so easy to say where we are from, what nationality we belong to. What we call home is where our heart is, and it cannot always be placed within national borders. Norwegians are girls who like girls, boys who like boys, and girls and boys who like each other. Norwegians thrump on God, Allah, Altet and nothing. Norwegians like Grieg, Kygo, Helbilis and Kari Bremnes. In other words, Norway is you. Norway is us. My biggest hope for Norway is that we should be able to take care of each other, that we should build this country on trust, community and generosity."}
}
```
</details>
### Whisper CPP
Whisper CPP is a C++ implementation of the Whisper model, offering the same functionalities with the added benefits of C++ efficiency and performance optimizations. This allows embedding any Whisper model into a binary file, facilitating the development of real applications. However, it requires some familiarity with compiling C++ programs. Their [homepage](https://github.com/ggerganov/whisper.cpp) provides examples of how to build applications, including real-time transcription.
We have converted this model to the ggml-format model used by Whisper CPP binaries. The file can be downloaded [here](blob/main/ggml-model.bin), and a `q5_0` quantized version is also available [here](blob/main/ggml-model-q5_0.bin).
```bash
# We can download and compile whisper.cpp
$ git clone --depth 1 https://github.com/ggerganov/whisper.cpp --branch v1.5.1
$ cd whisper.cpp/
$ make
# We also need to convert the audio to WAV as that is the only format supported by whisper.cpp
$ wget -N https://github.com/NbAiLab/nb-whisper/raw/main/audio/king.mp3
$ ffmpeg -i king.mp3 -ar 16000 -ac 1 -c:a pcm_s16le king.wav
# Let's download the two ggml files from this site
$ wget -N https://huggingface.co/NbAiLab/nb-whisper-small/resolve/main/ggml-model.bin -O models/nb-small-ggml-model.bin
$ wget -N https://huggingface.co/NbAiLab/nb-whisper-small/resolve/main/ggml-model-q5_0.bin -O models/nb-small-ggml-model-q5_0.bin
# And run it with the f16 default model
$ ./main -l no -m models/nb-small-ggml-model.bin king.wav
# Or the quantized version
$ ./main -l no -m models/nb-small-ggml-model-q5_0.bin king.wav
```
### WhisperX and Speaker Diarization
Speaker diarization is a technique in natural language processing and automatic speech recognition that identifies and separates different speakers in an audio recording. It segments the audio into parts based on who is speaking, enhancing the quality of transcribing meetings or phone calls. We find that [WhisperX](https://github.com/m-bain/whisperX) is the easiest way to use our models for diarizing speech. In addition, WhisperX uses phoneme-based Wav2Vec models to improve the alignment of the timestamps. As of December 2023 it also has native support for using the nb-wav2vec models. It currently uses [PyAnnote-audio](https://github.com/pyannote/pyannote-audio) for doing the actual diarization. This package has a fairly strict license that requires you to agree to its user terms. Follow the instructions below.
```bash
# Follow the install instructions on https://github.com/m-bain/whisperX
# Make sure you have a HuggingFace account and have agreed to the pyannote terms
# Log in (or supply HF Token in command line)
huggingface-cli login
# Download a test file
wget -N https://github.com/NbAiLab/nb-whisper/raw/main/audio/knuthamsun.mp3
# Optional. If you get complaints about missing support for Norwegian, do:
pip uninstall whisperx && pip install git+https://github.com/m-bain/whisperx.git@8540ff5985fceee764acbed94f656063d7f56540
# Transcribe the test file. All transcripts will end up in the directory of the mp3-file
whisperx knuthamsun.mp3 --model NbAiLabBeta/nb-whisper-small --language no --diarize
```
You can also run WhisperX from Python. Please take a look at the instructions on [WhisperX homepage](https://github.com/m-bain/whisperX).
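For reference, here is a rough Python sketch of the same flow. The function names follow the WhisperX README as of late 2023 and may change between releases, and the `hf_...` token placeholder is yours to fill in:

```python
import whisperx

device = "cuda"  # use "cpu" with compute_type="int8" if no GPU is available
audio = whisperx.load_audio("knuthamsun.mp3")

# 1. Transcribe with the NB-Whisper model
model = whisperx.load_model("NbAiLabBeta/nb-whisper-small", device, compute_type="float16")
result = model.transcribe(audio, language="no")

# 2. Align timestamps with a phoneme-based Wav2Vec model
model_a, metadata = whisperx.load_align_model(language_code="no", device=device)
result = whisperx.align(result["segments"], model_a, metadata, audio, device)

# 3. Diarize and assign speakers (requires accepting the pyannote user terms)
diarize_model = whisperx.DiarizationPipeline(use_auth_token="hf_...", device=device)
diarize_segments = diarize_model(audio)
result = whisperx.assign_word_speakers(diarize_segments, result)
print(result["segments"])
```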
### API
Instructions for accessing the models via a simple API are included in the demos under Spaces. Note that these demos are temporary and will only be available for a few weeks.
## Training Data
The training data originates from Språkbanken and the National Library of Norway's digital collection, including:
- NST Norwegian ASR Database (16 kHz) and its corresponding dataset
- Transcribed speeches from the Norwegian Parliament by Språkbanken
- TV broadcast (NRK) subtitles (NLN digital collection)
- Audiobooks (NLN digital collection)
## Downstream Use
The models, especially the smaller ones, may exhibit occasional hallucinations and may drop parts of the transcript. They are designed to convert spoken language into grammatically correct written sentences, which might not always be word-for-word translations. We have made two extra model variants for users who want a different transcription style. We encourage users to try the models themselves to get a better understanding.
## Bias, Risks, and Limitations
Using these models without adequate risk assessment and mitigation could be considered irresponsible. They may contain biases or other undesirable distortions. Users who deploy these models or integrate them into systems or services are responsible for mitigating risks and complying with applicable AI regulations. The National Library of Norway, as the model owner, disclaims liability for any outcomes resulting from third-party use of these models.
### Software
The model was trained using Jax/Flax and converted to PyTorch, TensorFlow, whisper.cpp, and ONNX formats. These are available under `Files and versions`. We welcome requests for conversion to other formats. All training code and scripts are released under the Apache License 2.0 in the GitHub repository [nb-whisper](https://github.com/NbAiLab/nb-whisper/).
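As one illustration, the TensorFlow weights can be loaded through Transformers. This is a minimal sketch under the assumption that `soundfile` is installed; `king.wav` is the 16 kHz mono file produced by the ffmpeg step in the Whisper CPP section above:

```python
import soundfile as sf
from transformers import WhisperProcessor, TFWhisperForConditionalGeneration

model_id = "NbAiLabBeta/nb-whisper-small"
processor = WhisperProcessor.from_pretrained(model_id)
model = TFWhisperForConditionalGeneration.from_pretrained(model_id)

audio, sr = sf.read("king.wav")  # 16 kHz mono WAV
inputs = processor(audio, sampling_rate=sr, return_tensors="tf")

# Force Norwegian transcription, matching the pipeline examples above
forced_ids = processor.get_decoder_prompt_ids(language="no", task="transcribe")
generated = model.generate(inputs.input_features, forced_decoder_ids=forced_ids)
print(processor.batch_decode(generated, skip_special_tokens=True))
```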
## Citation & Contributors
The NB-Whisper Small model is a product of the NoSTram project led by Per Egil Kummervold ([@pere](https://huggingface.co/pere)) at the National Library of Norway. Key contributors include Javier de la Rosa ([@versae](https://huggingface.co/versae)), Freddy Wetjen ([@freddyw](https://huggingface.co/freddyw)), and Rolv-Arild Braaten ([@Rolv-Arild](https://huggingface.co/Rolv-Arild)). NB AI-Lab, under the direction of Svein Arne Brygfjeld ([@Brygfjeld](https://huggingface.co/Brygfjeld)), supported the project's successful completion. A detailed paper on our process and findings is forthcoming.
## Disclaimer
The models published in this repository are intended for a generalist purpose and are available to third parties. These models may have bias and/or any other undesirable distortions. When third parties, deploy or provide systems and/or services to other parties using any of these models (or using systems based on these models) or become users of the models, they should note that it is their responsibility to mitigate the risks arising from their use and, in any event, to comply with applicable regulations, including regulations regarding the use of artificial intelligence. In no event shall the owner of the models (The National Library of Norway) be liable for any results arising from the use made by third parties of these models.
## Acknowledgements
Our gratitude extends to [Google TPU Research Cloud](https://sites.research.google/trc/about/) for training resources, Google Cloud for translation credits, and HuggingFace's Sanchit Gandhi for technical support. A special thank you to Per Erik Solberg at Språkbanken for the collaboration on the Stortinget corpus.
## Contact
For feedback, technical concerns, or collaboration inquiries, please contact <a rel="noopener nofollow" href="mailto:[email protected]">[email protected]</a>. If you plan to include this model in your research, contact us for the latest information on our upcoming paper for citation purposes.
| {"language": ["no", "nb", "nn", "en"], "license": "apache-2.0", "library_name": "transformers", "tags": ["audio", "asr", "automatic-speech-recognition", "hf-asr-leaderboard"], "datasets": ["NbAiLab/ncc_speech", "NbAiLab/NST", "NbAiLab/NPSC"], "metrics": ["wer", "cer"], "base_model": "openai/whisper-small", "pipeline_tag": "automatic-speech-recognition", "widget": [{"src": "https://datasets-server.huggingface.co/assets/google/fleurs/--/nb_no/train/1/audio/audio.mp3", "example_title": "FLEURS sample 1"}, {"src": "https://datasets-server.huggingface.co/assets/google/fleurs/--/nb_no/train/4/audio/audio.mp3", "example_title": "FLEURS sample 2"}]} | automatic-speech-recognition | NbAiLab/nb-whisper-tiny | [
"transformers",
"pytorch",
"tf",
"jax",
"tensorboard",
"onnx",
"safetensors",
"whisper",
"automatic-speech-recognition",
"audio",
"asr",
"hf-asr-leaderboard",
"no",
"nb",
"nn",
"en",
"dataset:NbAiLab/ncc_speech",
"dataset:NbAiLab/NST",
"dataset:NbAiLab/NPSC",
"arxiv:2212.04356",
"base_model:openai/whisper-small",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] | 2024-02-13T10:07:55+00:00 | [
"2212.04356"
] | [
"no",
"nb",
"nn",
"en"
] | TAGS
#transformers #pytorch #tf #jax #tensorboard #onnx #safetensors #whisper #automatic-speech-recognition #audio #asr #hf-asr-leaderboard #no #nb #nn #en #dataset-NbAiLab/ncc_speech #dataset-NbAiLab/NST #dataset-NbAiLab/NPSC #arxiv-2212.04356 #base_model-openai/whisper-small #license-apache-2.0 #endpoints_compatible #region-us
| NB-Whisper Small
================
Introducing the *Norwegian NB-Whisper Small model*, proudly developed by the National Library of Norway. NB-Whisper is a cutting-edge series of models designed for automatic speech recognition (ASR) and speech translation. These models are based on the work of OpenAI's Whisper. Each model in the series has been trained for 250,000 steps, utilizing a diverse dataset of 8 million samples. These samples consist of aligned audio clips, each 30 seconds long, culminating in a staggering 66,000 hours of speech. For an in-depth understanding of our training methodology and dataset composition, keep an eye out for our upcoming article.
Model Size: Tiny, Parameters: 39M, Model: NB-Whisper Tiny
Model Size: Base, Parameters: 74M, Model: NB-Whisper Base
Model Size: Small, Parameters: 244M, Model: NB-Whisper Small
Model Size: Medium, Parameters: 769M, Model: NB-Whisper Medium
Model Size: Large, Parameters: 1550M, Model: NB-Whisper Large
### Verbatim Model
While the main models are suitable for most transcription tasks, we demonstrate how easy it is to change the output of the main model. The following models are trained for 250 additional steps from the main models above and might be suitable for more targeted use cases:
* Verbatim version: This lower-cased variant is more literal and suitable for tasks requiring detailed transcription, such as linguistic analysis.
Model Size: Tiny, Parameters: 39M, Semantic version: Tiny - semantic
Model Size: Base, Parameters: 74M, Semantic version: Base - semantic
Model Size: Small, Parameters: 244M, Semantic version: Small - semantic
Model Size: Medium, Parameters: 769M, Semantic version: Medium - semantic
Model Size: Large, Parameters: 1550M, Semantic version: Large - semantic
### Model Description
* Developed by: NB AI-Lab
* Shared by: NB AI-Lab
* Model type: 'whisper'
* Language(s) (NLP): Norwegian, Norwegian Bokmål, Norwegian Nynorsk, English
* License: Apache 2.0
* Trained from model: openai/whisper-small
* Code Repository: URL
* Paper: *Coming soon*
* Demo: *See Spaces on this page*
How to Use the Models
---------------------
### Online Demos
You can try the models directly through the HuggingFace Inference API, accessible on the right side of this page. Be aware that initially, the model needs to load and will run on limited CPU capacity, which might be slow. To enhance your experience, we are temporarily hosting some models on TPUs for a few days, significantly boosting their performance. Explore these under the Spaces section on the Main Page.
### Local Setup with HuggingFace
Alternatively, you can run the models locally. The Tiny, Base, and Small models are optimized for CPU execution. For the Medium and Large models, we recommend a system equipped with a GPU to ensure efficient processing. Setting up and using these models with HuggingFace's Transformers is straightforward, provided you have Python installed on your machine. For practical demonstrations, refer to examples using this sample mp3 file.
After this is done, you should be able to run this in Python:
Expected output
#### Extended HuggingFace
Examining the output above, we see that there are multiple repetitions at the end. This is because the audio is longer than 30 seconds. By passing the 'chunk\_length\_s' argument, we can transcribe longer files. Our experience is that we get slightly better results by setting it to 28 seconds instead of the default 30 seconds. We also recommend setting the beam size to 5 if possible. This greatly increases the accuracy but takes a bit longer and requires slightly more memory. The examples below also illustrate how to transcribe to English or Nynorsk, and how to get timestamps for sentences and words.
Expected output
Long transcripts:
Timestamps:
Word Level Timestamps:
Nynorsk:
English:
### Whisper CPP
Whisper CPP is a C++ implementation of the Whisper model, offering the same functionalities with the added benefits of C++ efficiency and performance optimizations. This allows embedding any Whisper model into a binary file, facilitating the development of real applications. However, it requires some familiarity with compiling C++ programs. Their homepage provides examples of how to build applications, including real-time transcription.
We have converted this model to the ggml-format model used by Whisper CPP binaries. The file can be downloaded here, and a 'q5\_0' quantized version is also available here.
### WhisperX and Speaker Diarization
Speaker diarization is a technique in natural language processing and automatic speech recognition that identifies and separates different speakers in an audio recording. It segments the audio into parts based on who is speaking, enhancing the quality of transcribing meetings or phone calls. We find that WhisperX is the easiest way to use our models for diarizing speech. In addition, WhisperX uses phoneme-based Wav2Vec models to improve the alignment of the timestamps. As of December 2023 it also has native support for using the nb-wav2vec models. It currently uses PyAnnote-audio for doing the actual diarization. This package has a fairly strict license that requires you to agree to its user terms. Follow the instructions below.
You can also run WhisperX from Python. Please take a look at the instructions on WhisperX homepage.
### API
Instructions for accessing the models via a simple API are included in the demos under Spaces. Note that these demos are temporary and will only be available for a few weeks.
Training Data
-------------
The training data originates from Språkbanken and the National Library of Norway's digital collection, including:
* NST Norwegian ASR Database (16 kHz) and its corresponding dataset
* Transcribed speeches from the Norwegian Parliament by Språkbanken
* TV broadcast (NRK) subtitles (NLN digital collection)
* Audiobooks (NLN digital collection)
Downstream Use
--------------
The models, especially the smaller ones, may exhibit occasional hallucinations and may drop parts of the transcript. They are designed to convert spoken language into grammatically correct written sentences, which might not always be word-for-word translations. We have made two extra model variants for users who want a different transcription style. We encourage users to try the models themselves to get a better understanding.
## Bias, Risks, and Limitations
Using these models without adequate risk assessment and mitigation could be considered irresponsible. They may contain biases or other undesirable distortions. Users who deploy these models or integrate them into systems or services are responsible for mitigating risks and complying with applicable AI regulations. The National Library of Norway, as the model owner, disclaims liability for any outcomes resulting from third-party use of these models.
### Software
The model was trained using Jax/Flax and converted to PyTorch, Tensorflow, whisper.cpp, and ONNX formats. These are available under `Files and versions`. We welcome requests for conversion to other formats. All training code and scripts are released under the Apache License 2.0 in the GitHub repository nb-whisper.

## Citation & Contributors
The NB-Whisper Small model is a product of the NoSTram project led by Per Egil Kummervold (@pere) at the National Library of Norway. Key contributors include Javier de la Rosa (@versae), Freddy Wetjen (@freddyw), and Rolv-Arild Braaten (@Rolv-Arild). NB AI-Lab, under the direction of Svein Arne Brygfjeld (@Brygfjeld), supported the project's successful completion. A detailed paper on our process and findings is forthcoming.
## Disclaimer
The models published in this repository are intended for a generalist purpose and are available to third parties. These models may have bias and/or any other undesirable distortions. When third parties deploy or provide systems and/or services to other parties using any of these models (or using systems based on these models), or become users of the models, they should note that it is their responsibility to mitigate the risks arising from their use and, in any event, to comply with applicable regulations, including regulations regarding the use of artificial intelligence. In no event shall the owner of the models (The National Library of Norway) be liable for any results arising from the use made by third parties of these models.
## Acknowledgements
Our gratitude extends to Google TPU Research Cloud for training resources, Google Cloud for translation credits, and HuggingFace's Sanchit Gandhi for technical support. A special thank you to Per Erik Solberg at Språkbanken for the collaboration on the Stortinget corpus.
## Contact
For feedback, technical concerns, or collaboration inquiries, please contact [ailab@URL](mailto:ailab@URL). If you plan to include this model in your research, contact us for the latest information on our upcoming paper for citation purposes.
| [
"### Verbatim Model\n\n\nWhile the main models are suitable for most transcription task, we demonstrate how easy it is to change the output of the main model. The following models are trained 250 additional steps from the main models above, and might be suitable for more targetted use cases:\n\n\n* Verbatim version: This lower-cased variant is more literal and suitable for tasks requiring detailed transcription, such as linguistic analysis.\n\n\nModel Size: Tiny, Parameters: 39M, Semantic version: Tiny - semantic\nModel Size: Base, Parameters: 74M, Semantic version: Base - semantic\nModel Size: Small, Parameters: 244M, Semantic version: Small - semantic\nModel Size: Medium, Parameters: 769M, Semantic version: Medium - semantic\nModel Size: Large, Parameters: 1550M, Semantic version: Large - semantic",
"### Model Description\n\n\n* Developed by: NB AI-Lab\n* Shared by: NB AI-Lab\n* Model type: 'whisper'\n* Language(s) (NLP): Norwegian, Norwegian Bokmål, Norwegian Nynorsk, English\n* License: Apache 2.0\n* Trained from model: openai/whisper-small\n* Code Repository: URL\n* Paper: *Coming soon*\n* Demo: *See Spaces on this page*\n\n\nHow to Use the Models\n---------------------",
"### Online Demos\n\n\nYou can try the models directly through the HuggingFace Inference API, accessible on the right side of this page. Be aware that initially, the model needs to load and will run on limited CPU capacity, which might be slow. To enhance your experience, we are temporarily hosting some models on TPUs for a few days, significantly boosting their performance. Explore these under the Spaces section on the Main Page.",
"### Local Setup with HuggingFace\n\n\nAlternatively, you can run the models locally. The Tiny, Base, and Small models are optimized for CPU execution. For the Medium and Large models, we recommend a system equipped with a GPU to ensure efficient processing. Setting up and using these models with HuggingFace's Transformers is straightforward, provided you have Python installed on your machine. For practical demonstrations, refer to examples using this sample mp3 file.\n\n\nAfter this is done, you should be able to run this in Python:\n\n\n\nExpected output",
"#### Extended HuggingFace\n\n\nExamining the output above, we see that there are multiple repetitions at the end. This is because the video is longer than 30 seconds. By passing the argument, we can transcribe longer file. Our experience is that we get slightly better result by setting that to 28 seconds instead of the default 30 seconds. We also recommend setting the beam size to 5 if possible. This greatly increases the accuracy but takes a bit longer and requires slightly more memory. The examples below also illustrates how to transcribe to English or Nynorsk, and how to get timestamps for sentences and words.\n\n\n\nExpected output\nLong transcripts:\n\n\nTimestamps:\n\n\nWord Level Timestamps:\n\n\nNynorsk:\n\n\nEnglish:",
"### Whisper CPP\n\n\nWhisper CPP is a C++ implementation of the Whisper model, offering the same functionalities with the added benefits of C++ efficiency and performance optimizations. This allows embedding any Whisper model into a binary file, facilitating the development of real applications. However, it requires some familiarity with compiling C++ programs. Their homepage provides examples of how to build applications, including real-time transcription.\n\n\nWe have converted this model to the ggml-format model used by Whisper CPP binaries. The file can be downloaded here, and a 'q5\\_0' quantized version is also available here.",
"### WhisperX and Speaker Diarization\n\n\nSpeaker diarization is a technique in natural language processing and automatic speech recognition that identifies and separates different speakers in an audio recording. It segments the audio into parts based on who is speaking, enhancing the quality of transcribing meetings or phone calls. We find that WhisperX is the easiest way to use our models for diarizing speech. In addition, WhisperX is using phoneme-based Wav2Vec-models for improving the alignment of the timestamps. As of December 2023 it also has native support for using the nb-wav2vec-models. It currently uses PyAnnote-audio for doing the actual diarization. This package has a fairly strict licence where you have to agree to user terms. Follow the instructions below.\n\n\nYou can also run WhisperX from Python. Please take a look at the instructions on WhisperX homepage.",
"### API\n\n\nInstructions for accessing the models via a simple API are included in the demos under Spaces. Note that these demos are temporary and will only be available for a few weeks.\n\n\nTraining Data\n-------------\n\n\nThe training data originates from Språkbanken and the National Library of Norway's digital collection, including:\n\n\n* NST Norwegian ASR Database (16 kHz) and its corresponding dataset\n* Transcribed speeches from the Norwegian Parliament by Språkbanken\n* TV broadcast (NRK) subtitles (NLN digital collection)\n* Audiobooks (NLN digital collection)\n\n\nDownstream Use\n--------------\n\n\nThe models, especially the smaller ones, may exhibit occasional hallucinations and may drop parts of the transcript. They are designed to convert spoken language into grammatically correct written sentences, which might not always be word-for-word translations. We have made two extra model variant for users that want a different transcription style. We encourage users to try the models themselves to get a better understanding.\n\n\nBias, Risks, and Limitations\n----------------------------\n\n\nUsing these models without adequate risk assessment and mitigation could be considered irresponsible. They may contain biases or other undesirable distortions. Users who deploy these models or integrate them into systems or services are responsible for mitigating risks and complying with applicable AI regulations. The National Library of Norway, as the model owner, disclaims liability for any outcomes resulting from third-party use of these models.",
"### Software\n\n\nThe model was trained using Jax/Flax and converted to PyTorch, Tensorflow, URL, and ONXX formats. These are available under 'Files and versions'. We welcome requests for conversion to other formats. All training code and scripts are released under the Apache License 2.0 in the GitHub repository nb-whisper.\n\n\n& Contributors\nThe NB-Whisper Small model is a product of the NoSTram project led by Per Egil Kummervold (@pere) at the National Library of Norway. Key contributors include Javier de la Rosa (@versae), Freddy Wetjen (@freddyw), and Rolv-Arild Braaten (@Rolv-Arild). NB AI-Lab, under the direction of Svein Arne Brygfjeld (@Brygfjeld), supported the project's successful completion. A detailed paper on our process and findings is forthcoming.\n\n\nDisclaimer\n----------\n\n\nThe models published in this repository are intended for a generalist purpose and are available to third parties. These models may have bias and/or any other undesirable distortions. When third parties, deploy or provide systems and/or services to other parties using any of these models (or using systems based on these models) or become users of the models, they should note that it is their responsibility to mitigate the risks arising from their use and, in any event, to comply with applicable regulations, including regulations regarding the use of artificial intelligence. In no event shall the owner of the models (The National Library of Norway) be liable for any results arising from the use made by third parties of these models.\n\n\nAcknowledgements\n----------------\n\n\nOur gratitude extends to Google TPU Research Cloud for training resources, Google Cloud for translation credits, and HuggingFace's Sanchit Ghandi for technical support. A special thank you to Per Erik Solberg at Språkbanken for the collaboration on the Stortinget corpus.\n\n\nContact\n-------\n\n\nFor feedback, technical concerns, or collaboration inquiries, please contact [ailab@URL](mailto:ailab@URL). If you plan to include this model in your research, contact us for the latest information on our upcoming paper for citation purposes."
] | [
"TAGS\n#transformers #pytorch #tf #jax #tensorboard #onnx #safetensors #whisper #automatic-speech-recognition #audio #asr #hf-asr-leaderboard #no #nb #nn #en #dataset-NbAiLab/ncc_speech #dataset-NbAiLab/NST #dataset-NbAiLab/NPSC #arxiv-2212.04356 #base_model-openai/whisper-small #license-apache-2.0 #endpoints_compatible #region-us \n",
"### Verbatim Model\n\n\nWhile the main models are suitable for most transcription task, we demonstrate how easy it is to change the output of the main model. The following models are trained 250 additional steps from the main models above, and might be suitable for more targetted use cases:\n\n\n* Verbatim version: This lower-cased variant is more literal and suitable for tasks requiring detailed transcription, such as linguistic analysis.\n\n\nModel Size: Tiny, Parameters: 39M, Semantic version: Tiny - semantic\nModel Size: Base, Parameters: 74M, Semantic version: Base - semantic\nModel Size: Small, Parameters: 244M, Semantic version: Small - semantic\nModel Size: Medium, Parameters: 769M, Semantic version: Medium - semantic\nModel Size: Large, Parameters: 1550M, Semantic version: Large - semantic",
"### Model Description\n\n\n* Developed by: NB AI-Lab\n* Shared by: NB AI-Lab\n* Model type: 'whisper'\n* Language(s) (NLP): Norwegian, Norwegian Bokmål, Norwegian Nynorsk, English\n* License: Apache 2.0\n* Trained from model: openai/whisper-small\n* Code Repository: URL\n* Paper: *Coming soon*\n* Demo: *See Spaces on this page*\n\n\nHow to Use the Models\n---------------------",
"### Online Demos\n\n\nYou can try the models directly through the HuggingFace Inference API, accessible on the right side of this page. Be aware that initially, the model needs to load and will run on limited CPU capacity, which might be slow. To enhance your experience, we are temporarily hosting some models on TPUs for a few days, significantly boosting their performance. Explore these under the Spaces section on the Main Page.",
"### Local Setup with HuggingFace\n\n\nAlternatively, you can run the models locally. The Tiny, Base, and Small models are optimized for CPU execution. For the Medium and Large models, we recommend a system equipped with a GPU to ensure efficient processing. Setting up and using these models with HuggingFace's Transformers is straightforward, provided you have Python installed on your machine. For practical demonstrations, refer to examples using this sample mp3 file.\n\n\nAfter this is done, you should be able to run this in Python:\n\n\n\nExpected output",
"#### Extended HuggingFace\n\n\nExamining the output above, we see that there are multiple repetitions at the end. This is because the video is longer than 30 seconds. By passing the argument, we can transcribe longer file. Our experience is that we get slightly better result by setting that to 28 seconds instead of the default 30 seconds. We also recommend setting the beam size to 5 if possible. This greatly increases the accuracy but takes a bit longer and requires slightly more memory. The examples below also illustrates how to transcribe to English or Nynorsk, and how to get timestamps for sentences and words.\n\n\n\nExpected output\nLong transcripts:\n\n\nTimestamps:\n\n\nWord Level Timestamps:\n\n\nNynorsk:\n\n\nEnglish:",
"### Whisper CPP\n\n\nWhisper CPP is a C++ implementation of the Whisper model, offering the same functionalities with the added benefits of C++ efficiency and performance optimizations. This allows embedding any Whisper model into a binary file, facilitating the development of real applications. However, it requires some familiarity with compiling C++ programs. Their homepage provides examples of how to build applications, including real-time transcription.\n\n\nWe have converted this model to the ggml-format model used by Whisper CPP binaries. The file can be downloaded here, and a 'q5\\_0' quantized version is also available here.",
"### WhisperX and Speaker Diarization\n\n\nSpeaker diarization is a technique in natural language processing and automatic speech recognition that identifies and separates different speakers in an audio recording. It segments the audio into parts based on who is speaking, enhancing the quality of transcribing meetings or phone calls. We find that WhisperX is the easiest way to use our models for diarizing speech. In addition, WhisperX is using phoneme-based Wav2Vec-models for improving the alignment of the timestamps. As of December 2023 it also has native support for using the nb-wav2vec-models. It currently uses PyAnnote-audio for doing the actual diarization. This package has a fairly strict licence where you have to agree to user terms. Follow the instructions below.\n\n\nYou can also run WhisperX from Python. Please take a look at the instructions on WhisperX homepage.",
"### API\n\n\nInstructions for accessing the models via a simple API are included in the demos under Spaces. Note that these demos are temporary and will only be available for a few weeks.\n\n\nTraining Data\n-------------\n\n\nThe training data originates from Språkbanken and the National Library of Norway's digital collection, including:\n\n\n* NST Norwegian ASR Database (16 kHz) and its corresponding dataset\n* Transcribed speeches from the Norwegian Parliament by Språkbanken\n* TV broadcast (NRK) subtitles (NLN digital collection)\n* Audiobooks (NLN digital collection)\n\n\nDownstream Use\n--------------\n\n\nThe models, especially the smaller ones, may exhibit occasional hallucinations and may drop parts of the transcript. They are designed to convert spoken language into grammatically correct written sentences, which might not always be word-for-word translations. We have made two extra model variant for users that want a different transcription style. We encourage users to try the models themselves to get a better understanding.\n\n\nBias, Risks, and Limitations\n----------------------------\n\n\nUsing these models without adequate risk assessment and mitigation could be considered irresponsible. They may contain biases or other undesirable distortions. Users who deploy these models or integrate them into systems or services are responsible for mitigating risks and complying with applicable AI regulations. The National Library of Norway, as the model owner, disclaims liability for any outcomes resulting from third-party use of these models.",
"### Software\n\n\nThe model was trained using Jax/Flax and converted to PyTorch, Tensorflow, URL, and ONXX formats. These are available under 'Files and versions'. We welcome requests for conversion to other formats. All training code and scripts are released under the Apache License 2.0 in the GitHub repository nb-whisper.\n\n\n& Contributors\nThe NB-Whisper Small model is a product of the NoSTram project led by Per Egil Kummervold (@pere) at the National Library of Norway. Key contributors include Javier de la Rosa (@versae), Freddy Wetjen (@freddyw), and Rolv-Arild Braaten (@Rolv-Arild). NB AI-Lab, under the direction of Svein Arne Brygfjeld (@Brygfjeld), supported the project's successful completion. A detailed paper on our process and findings is forthcoming.\n\n\nDisclaimer\n----------\n\n\nThe models published in this repository are intended for a generalist purpose and are available to third parties. These models may have bias and/or any other undesirable distortions. When third parties, deploy or provide systems and/or services to other parties using any of these models (or using systems based on these models) or become users of the models, they should note that it is their responsibility to mitigate the risks arising from their use and, in any event, to comply with applicable regulations, including regulations regarding the use of artificial intelligence. In no event shall the owner of the models (The National Library of Norway) be liable for any results arising from the use made by third parties of these models.\n\n\nAcknowledgements\n----------------\n\n\nOur gratitude extends to Google TPU Research Cloud for training resources, Google Cloud for translation credits, and HuggingFace's Sanchit Ghandi for technical support. A special thank you to Per Erik Solberg at Språkbanken for the collaboration on the Stortinget corpus.\n\n\nContact\n-------\n\n\nFor feedback, technical concerns, or collaboration inquiries, please contact [ailab@URL](mailto:ailab@URL). If you plan to include this model in your research, contact us for the latest information on our upcoming paper for citation purposes."
] | [
146,
198,
107,
95,
127,
160,
149,
215,
325,
497
] | [
"passage: TAGS\n#transformers #pytorch #tf #jax #tensorboard #onnx #safetensors #whisper #automatic-speech-recognition #audio #asr #hf-asr-leaderboard #no #nb #nn #en #dataset-NbAiLab/ncc_speech #dataset-NbAiLab/NST #dataset-NbAiLab/NPSC #arxiv-2212.04356 #base_model-openai/whisper-small #license-apache-2.0 #endpoints_compatible #region-us \n### Verbatim Model\n\n\nWhile the main models are suitable for most transcription task, we demonstrate how easy it is to change the output of the main model. The following models are trained 250 additional steps from the main models above, and might be suitable for more targetted use cases:\n\n\n* Verbatim version: This lower-cased variant is more literal and suitable for tasks requiring detailed transcription, such as linguistic analysis.\n\n\nModel Size: Tiny, Parameters: 39M, Semantic version: Tiny - semantic\nModel Size: Base, Parameters: 74M, Semantic version: Base - semantic\nModel Size: Small, Parameters: 244M, Semantic version: Small - semantic\nModel Size: Medium, Parameters: 769M, Semantic version: Medium - semantic\nModel Size: Large, Parameters: 1550M, Semantic version: Large - semantic### Model Description\n\n\n* Developed by: NB AI-Lab\n* Shared by: NB AI-Lab\n* Model type: 'whisper'\n* Language(s) (NLP): Norwegian, Norwegian Bokmål, Norwegian Nynorsk, English\n* License: Apache 2.0\n* Trained from model: openai/whisper-small\n* Code Repository: URL\n* Paper: *Coming soon*\n* Demo: *See Spaces on this page*\n\n\nHow to Use the Models\n---------------------",
"passage: ### Online Demos\n\n\nYou can try the models directly through the HuggingFace Inference API, accessible on the right side of this page. Be aware that initially, the model needs to load and will run on limited CPU capacity, which might be slow. To enhance your experience, we are temporarily hosting some models on TPUs for a few days, significantly boosting their performance. Explore these under the Spaces section on the Main Page.### Local Setup with HuggingFace\n\n\nAlternatively, you can run the models locally. The Tiny, Base, and Small models are optimized for CPU execution. For the Medium and Large models, we recommend a system equipped with a GPU to ensure efficient processing. Setting up and using these models with HuggingFace's Transformers is straightforward, provided you have Python installed on your machine. For practical demonstrations, refer to examples using this sample mp3 file.\n\n\nAfter this is done, you should be able to run this in Python:\n\n\n\nExpected output#### Extended HuggingFace\n\n\nExamining the output above, we see that there are multiple repetitions at the end. This is because the video is longer than 30 seconds. By passing the argument, we can transcribe longer file. Our experience is that we get slightly better result by setting that to 28 seconds instead of the default 30 seconds. We also recommend setting the beam size to 5 if possible. This greatly increases the accuracy but takes a bit longer and requires slightly more memory. The examples below also illustrates how to transcribe to English or Nynorsk, and how to get timestamps for sentences and words.\n\n\n\nExpected output\nLong transcripts:\n\n\nTimestamps:\n\n\nWord Level Timestamps:\n\n\nNynorsk:\n\n\nEnglish:### Whisper CPP\n\n\nWhisper CPP is a C++ implementation of the Whisper model, offering the same functionalities with the added benefits of C++ efficiency and performance optimizations. This allows embedding any Whisper model into a binary file, facilitating the development of real applications. However, it requires some familiarity with compiling C++ programs. Their homepage provides examples of how to build applications, including real-time transcription.\n\n\nWe have converted this model to the ggml-format model used by Whisper CPP binaries. The file can be downloaded here, and a 'q5\\_0' quantized version is also available here.",
"passage: ### WhisperX and Speaker Diarization\n\n\nSpeaker diarization is a technique in natural language processing and automatic speech recognition that identifies and separates different speakers in an audio recording. It segments the audio into parts based on who is speaking, enhancing the quality of transcribing meetings or phone calls. We find that WhisperX is the easiest way to use our models for diarizing speech. In addition, WhisperX is using phoneme-based Wav2Vec-models for improving the alignment of the timestamps. As of December 2023 it also has native support for using the nb-wav2vec-models. It currently uses PyAnnote-audio for doing the actual diarization. This package has a fairly strict licence where you have to agree to user terms. Follow the instructions below.\n\n\nYou can also run WhisperX from Python. Please take a look at the instructions on WhisperX homepage.### API\n\n\nInstructions for accessing the models via a simple API are included in the demos under Spaces. Note that these demos are temporary and will only be available for a few weeks.\n\n\nTraining Data\n-------------\n\n\nThe training data originates from Språkbanken and the National Library of Norway's digital collection, including:\n\n\n* NST Norwegian ASR Database (16 kHz) and its corresponding dataset\n* Transcribed speeches from the Norwegian Parliament by Språkbanken\n* TV broadcast (NRK) subtitles (NLN digital collection)\n* Audiobooks (NLN digital collection)\n\n\nDownstream Use\n--------------\n\n\nThe models, especially the smaller ones, may exhibit occasional hallucinations and may drop parts of the transcript. They are designed to convert spoken language into grammatically correct written sentences, which might not always be word-for-word translations. We have made two extra model variant for users that want a different transcription style. We encourage users to try the models themselves to get a better understanding.\n\n\nBias, Risks, and Limitations\n----------------------------\n\n\nUsing these models without adequate risk assessment and mitigation could be considered irresponsible. They may contain biases or other undesirable distortions. Users who deploy these models or integrate them into systems or services are responsible for mitigating risks and complying with applicable AI regulations. The National Library of Norway, as the model owner, disclaims liability for any outcomes resulting from third-party use of these models."
] | [
-0.05595729127526283,
0.10957113653421402,
-0.004185075871646404,
0.003683684393763542,
0.04644007608294487,
-0.027359848842024803,
0.028367673978209496,
0.05932166799902916,
-0.0021105806808918715,
0.07070865482091904,
-0.0029929790180176497,
-0.07421991974115372,
0.09184768050909042,
0.021621257066726685,
0.09246473759412766,
-0.2979411780834198,
0.05877043679356575,
-0.07867005467414856,
0.02261350303888321,
0.04359615221619606,
0.09201803058385849,
-0.05530157685279846,
0.042665284126996994,
0.015337258577346802,
-0.004546137060970068,
0.01574155129492283,
-0.05597935989499092,
-0.03430740907788277,
0.07702098041772842,
0.09046751260757446,
0.046444255858659744,
-0.0017388699343428016,
0.08297839015722275,
-0.147238627076149,
0.010472044348716736,
0.053692031651735306,
0.03419133648276329,
0.013193007558584213,
0.03298887610435486,
0.10888194292783737,
0.1993558406829834,
-0.07860422134399414,
0.014330617152154446,
0.047765087336301804,
-0.02910742722451687,
-0.14211152493953705,
-0.04390902444720268,
-0.026039371266961098,
0.04809437319636345,
0.04131939634680748,
-0.03132976219058037,
0.0884852483868599,
-0.0743095874786377,
0.0358167327940464,
0.08269158750772476,
-0.12046617269515991,
-0.015920156612992287,
0.007928883656859398,
0.04445699229836464,
0.04908204451203346,
-0.030451707541942596,
0.0024281032383441925,
0.013299413025379181,
0.04323276877403259,
0.016924722120165825,
-0.006636261940002441,
0.0700678750872612,
-0.06200633943080902,
-0.10610425472259521,
-0.04887941852211952,
0.12418750673532486,
0.014583869837224483,
-0.0629294216632843,
-0.17139649391174316,
-0.04367728531360626,
0.05314905568957329,
-0.020227400586009026,
-0.023216301575303078,
0.026258794590830803,
-0.0007446606759913266,
0.09863749891519547,
-0.06783456355333328,
-0.09906933456659317,
0.01910584233701229,
-0.007240698207169771,
0.11122428625822067,
0.04742242023348808,
0.006139127071946859,
0.025510774925351143,
0.05406947806477547,
-0.05612662434577942,
-0.06039923429489136,
-0.0613519586622715,
-0.06690678000450134,
-0.05939147248864174,
0.01840221881866455,
-0.02416708879172802,
-0.09450099617242813,
-0.0019876654259860516,
0.07519110292196274,
-0.020042700693011284,
0.021614111959934235,
0.024053022265434265,
-0.0024196419399231672,
0.053898364305496216,
0.13287436962127686,
-0.021660422906279564,
-0.07134010642766953,
-0.005280818324536085,
-0.0021383638959378004,
0.06522231549024582,
0.006817637477070093,
-0.04564196988940239,
-0.03106912039220333,
0.002788423327729106,
0.023616740480065346,
0.015601211227476597,
0.006800869945436716,
0.013174213469028473,
-0.028698379173874855,
0.23656010627746582,
-0.11634872108697891,
0.0008783203666098416,
0.013940642587840557,
-0.04578356817364693,
0.07825009524822235,
0.030111951753497124,
-0.023773320019245148,
-0.13129140436649323,
0.01893772929906845,
-0.0022884185891598463,
-0.023230237886309624,
-0.06604725867509842,
-0.10386154800653458,
0.0475231409072876,
0.009065764956176281,
-0.015343434177339077,
-0.1112871766090393,
-0.08784902095794678,
-0.045468587428331375,
0.022602597251534462,
-0.012800972908735275,
-0.03781973943114281,
0.0025298960972577333,
-0.03922031447291374,
-0.03120609186589718,
-0.028628995642066002,
-0.0034175601322203875,
-0.027599556371569633,
-0.016096828505396843,
-0.027078352868556976,
0.03563833609223366,
-0.020449155941605568,
0.010089471004903316,
-0.06549054384231567,
-0.0059835296124219894,
-0.18053443729877472,
0.11111023277044296,
-0.0764886736869812,
-0.010035504586994648,
-0.02464425563812256,
-0.07214254140853882,
-0.047314196825027466,
0.04643293097615242,
0.006902298424392939,
0.07286535948514938,
-0.1779240220785141,
-0.026247814297676086,
0.14369988441467285,
-0.14535409212112427,
0.045266781002283096,
0.13992084562778473,
0.011835464276373386,
-0.0015470044454559684,
0.12977059185504913,
0.1165449321269989,
0.16446606814861298,
-0.12556566298007965,
-0.06230111047625542,
-0.0013974206522107124,
-0.02867063879966736,
0.06462816894054413,
0.037748660892248154,
-0.007191019598394632,
0.09154041856527328,
0.046227529644966125,
0.011196929030120373,
0.01941496692597866,
0.03542516008019447,
-0.02403213083744049,
-0.00666817044839263,
-0.024569297209382057,
-0.003787035821005702,
0.04108531400561333,
-0.05418844893574715,
-0.041770342737436295,
-0.09302351623773575,
0.06754366308450699,
0.10623035579919815,
-0.04342184588313103,
0.03226578235626221,
-0.0674060732126236,
-0.020842716097831726,
0.012336906976997852,
-0.0014363210648298264,
-0.1102164015173912,
-0.05247797444462776,
0.04908955469727516,
-0.13097761571407318,
0.07252240180969238,
0.04974469915032387,
0.034087203443050385,
0.07507915049791336,
-0.015585407614707947,
0.011116317473351955,
-0.033859673887491226,
0.0017018094658851624,
-0.021508261561393738,
-0.04174104705452919,
-0.02784629724919796,
-0.034292396157979965,
0.024054119363427162,
-0.10515110939741135,
-0.00032526502036489546,
0.008346598595380783,
0.09550124406814575,
0.014215294271707535,
-0.02976670302450657,
0.014644703827798367,
0.013933378271758556,
0.00963566917926073,
-0.040206074714660645,
-0.007994033396244049,
0.002057991921901703,
-0.000735873996745795,
0.10984241962432861,
-0.14647239446640015,
-0.10315237194299698,
0.05356321856379509,
0.10620417445898056,
-0.01499879639595747,
-0.0010503133526071906,
-0.04199841246008873,
-0.031180383637547493,
-0.053947243839502335,
-0.11295493692159653,
0.20289425551891327,
0.017624415457248688,
0.059307802468538284,
-0.09082835167646408,
-0.02374119870364666,
0.009293765760958195,
-0.002121416851878166,
-0.007208898663520813,
0.08636841177940369,
0.013025276362895966,
-0.06393793970346451,
-0.012967932038009167,
-0.0464145801961422,
0.032192278653383255,
0.17173080146312714,
-0.02289699576795101,
-0.10995841026306152,
0.003097960026934743,
-0.013323565013706684,
-0.013730672188103199,
0.09882237762212753,
0.01115154754370451,
-0.003299964591860771,
0.030643543228507042,
0.02449873648583889,
0.052172500640153885,
-0.0595327652990818,
0.08089730888605118,
0.026057926937937737,
-0.044838979840278625,
0.0391053669154644,
-0.025845160707831383,
-0.005819948855787516,
0.04218243062496185,
0.005414710845798254,
0.01793920248746872,
-0.04585083946585655,
-0.043560564517974854,
-0.0848943293094635,
0.10100853443145752,
-0.10664377361536026,
-0.22029893100261688,
-0.16480720043182373,
0.07296884804964066,
-0.022554174065589905,
-0.01583801954984665,
0.033819954842329025,
-0.05478956922888756,
-0.10398636013269424,
-0.1311662793159485,
0.038996029645204544,
0.023107198998332024,
-0.07086223363876343,
-0.05030864104628563,
0.025815987959504128,
0.010747559368610382,
-0.12482839822769165,
0.002412603236734867,
0.004192490130662918,
0.012058444321155548,
-0.016825713217258453,
0.014214272610843182,
0.03570197522640228,
0.07022746652364731,
0.011014147661626339,
-0.05349000170826912,
0.012522180564701557,
0.16930294036865234,
-0.07350112497806549,
0.14810144901275635,
0.1459464430809021,
0.00029460093355737627,
0.06337632983922958,
0.09337707608938217,
0.019193066284060478,
-0.024259096011519432,
0.02461063861846924,
0.017431989312171936,
-0.061509475111961365,
-0.14521028101444244,
-0.12446743994951248,
-0.040779829025268555,
-0.009744972921907902,
0.0672706663608551,
0.03279934450984001,
-0.04846729710698128,
0.01714111864566803,
-0.07837653160095215,
-0.014100994914770126,
0.05145840719342232,
0.05157002806663513,
0.1295468509197235,
0.003373523475602269,
0.02718426287174225,
-0.0760110542178154,
-0.019847853109240532,
0.10320020467042923,
-0.006300040055066347,
0.16443373262882233,
-0.060237836092710495,
0.11318684369325638,
0.03305559977889061,
0.001314586610533297,
0.05383548513054848,
0.054846446961164474,
0.00033951239311136305,
0.02487320452928543,
-0.02321501262485981,
-0.0785490944981575,
-0.043187517672777176,
0.08129732310771942,
0.0474298894405365,
-0.07419047504663467,
-0.0005583185702562332,
-0.006737411022186279,
0.004980066325515509,
0.14273609220981598,
0.03287049010396004,
-0.11356239765882492,
-0.13092058897018433,
0.015078199096024036,
-0.10726716369390488,
-0.07606560736894608,
0.021654533222317696,
0.14259813725948334,
-0.08724889159202576,
0.029956795275211334,
0.0002176469861296937,
0.07470373064279556,
-0.07926418632268906,
0.00834654737263918,
-0.028274929150938988,
0.13140946626663208,
0.010503987781703472,
0.05146188661456108,
-0.03241728991270065,
0.046712473034858704,
0.008826847188174725,
0.11437768489122391,
-0.0642932876944542,
0.041641347110271454,
0.02685617469251156,
0.0076730153523385525,
0.05927778407931328,
0.04160362854599953,
-0.1387632042169571,
0.020998219028115273,
-0.10976102203130722,
0.05413484945893288,
0.04540444537997246,
0.04524761438369751,
0.08296412229537964,
-0.007798222359269857,
0.0024489189963787794,
-0.027455037459731102,
-0.10225915908813477,
-0.11748135089874268,
-0.16404031217098236,
0.026586376130580902,
0.0037087302189320326,
-0.011185691691935062,
-0.05635637044906616,
-0.01962965913116932,
-0.07923835515975952,
0.11568132042884827,
-0.08535972982645035,
-0.12163346260786057,
-0.07396308332681656,
-0.058842141181230545,
0.15302498638629913,
-0.03972542658448219,
0.009019928053021431,
0.03084593079984188,
0.1577683389186859,
-0.05379590764641762,
-0.06310968846082687,
-0.017946097999811172,
-0.08048603683710098,
-0.11165499687194824,
0.005017837975174189,
0.12164685130119324,
0.11024624854326248,
0.051234181970357895,
0.010151674039661884,
0.004340230021625757,
-0.005175670143216848,
-0.0996767058968544,
-0.04818522930145264,
0.16609667241573334,
-0.006850122008472681,
0.032458800822496414,
-0.046352047473192215,
-0.0682096853852272,
-0.057409364730119705,
-0.009815678931772709,
0.03860221430659294,
0.14065630733966827,
-0.05121446028351784,
0.1380784511566162,
0.1938403844833374,
-0.06482783704996109,
-0.20766283571720123,
-0.05420462414622307,
0.05264726281166077,
0.05568508431315422,
0.012649960815906525,
-0.17498742043972015,
0.10659170150756836,
0.03376619890332222,
-0.002400800818577409,
0.040403008460998535,
-0.21544025838375092,
-0.09997042268514633,
0.048024386167526245,
-0.005311001092195511,
-0.025139180943369865,
-0.01871819607913494,
-0.029652709141373634,
-0.035530880093574524,
-0.031430743634700775,
0.07166802883148193,
-0.04671624302864075,
0.053734418004751205,
0.03325603902339935,
0.08358029276132584,
0.049246761947870255,
-0.02198277972638607,
0.11670511960983276,
-0.033978115767240524,
0.005432372912764549,
-0.06347247958183289,
0.09622194617986679,
0.015267182141542435,
-0.0636666938662529,
0.1454901248216629,
-0.02422676980495453,
0.013882189057767391,
-0.10799247771501541,
-0.06252994388341904,
-0.07636678218841553,
0.07138855010271072,
-0.020504029467701912,
-0.042027488350868225,
-0.08377527445554733,
0.07990442961454391,
0.09619655460119247,
0.005831145215779543,
-0.08112034946680069,
-0.06526985764503479,
-0.06881637871265411,
0.12619495391845703,
0.16739922761917114,
-0.04677611216902733,
-0.04770747199654579,
0.011042426340281963,
-0.002179401693865657,
0.05574340745806694,
-0.04681950435042381,
0.013790321536362171,
0.09844034910202026,
-0.021822012960910797,
0.03229723125696182,
-0.03096933476626873,
-0.13861042261123657,
-0.0214974507689476,
0.025918392464518547,
-0.04738447070121765,
-0.1639711856842041,
-0.0320153646171093,
0.04091313108801842,
-0.05199425294995308,
-0.040931981056928635,
0.12912790477275848,
-0.08091352134943008,
-0.002749068895354867,
0.013328012079000473,
0.0580652616918087,
0.02725859172642231,
0.08686313778162003,
0.030247166752815247,
0.021640418097376823,
-0.07089834660291672,
0.09613245725631714,
0.03390197828412056,
-0.12411677837371826,
0.04670397564768791,
0.1497066766023636,
-0.09235399961471558,
-0.04663835093379021,
-0.11950208991765976,
-0.030531400814652443,
-0.010226141661405563,
-0.10320255905389786,
0.01001262292265892,
-0.07225053012371063,
0.016741445288062096,
0.010344498790800571,
0.009299051947891712,
-0.017522506415843964,
-0.02004552073776722,
0.038925789296627045,
-0.09722187370061874,
0.08512803912162781,
0.011804881505668163,
0.024550700560212135,
-0.03982305899262428,
0.08355293422937393,
-0.0027633486315608025,
0.008769924752414227,
-0.024022182449698448,
-0.019576197490096092,
-0.008925062604248524,
-0.02965480089187622,
-0.13559778034687042,
0.01331929862499237,
-0.085511215031147,
0.007561111822724342,
-0.002520172158256173,
0.027772001922130585,
-0.016737811267375946,
0.052618443965911865,
-0.03324433043599129,
-0.020597999915480614,
-0.05299239233136177,
0.02815154753625393,
-0.05556164309382439,
0.008255436085164547,
0.05087519809603691,
-0.07826238125562668,
0.04873509332537651,
0.0068429396487772465,
-0.053288910537958145,
0.05250184237957001,
-0.01275506243109703,
0.0093611478805542,
0.01825561188161373,
0.06349324434995651,
0.004913626238703728,
-0.05350684002041817,
-0.00731252646073699,
0.02980848215520382,
-0.004954386968165636,
-0.037591561675071716,
0.046910177916288376,
-0.05719102546572685,
0.07432886213064194,
0.033216726034879684,
-0.02020438387989998,
-0.05652947351336479,
0.03468960151076317,
0.03425610065460205,
0.019782373681664467,
0.0890602171421051,
-0.05949057266116142,
0.01700548268854618,
-0.09998799115419388,
-0.005026417318731546,
0.010487683117389679,
0.006585482507944107,
0.08428909629583359,
-0.018348904326558113,
0.027423283085227013,
0.0002667581138666719,
0.18358738720417023,
-0.019492143765091896,
0.02468046359717846,
0.04958933964371681,
-0.08965274691581726,
-0.110981285572052,
0.01871076039969921,
0.055938079953193665,
0.012034137733280659,
-0.012238036841154099,
-0.04548953101038933,
-0.03565378114581108,
-0.02116534113883972,
-0.006347181741148233,
0.07800780981779099,
0.10377788543701172,
0.10627353936433792,
0.08266595005989075,
0.015515054576098919,
-0.03970794752240181,
-0.11047127097845078,
0.0767340287566185,
-0.03349859640002251,
0.07943911105394363,
-0.04004514962434769,
0.08929967880249023,
0.12169188261032104,
-0.06438853591680527,
0.09765523672103882,
-0.0027581590693444014,
-0.04612380266189575,
-0.07092731446027756,
-0.14147064089775085,
-0.0430755577981472,
-0.03504415228962898,
-0.023744910955429077,
-0.08937117457389832,
0.037219684571027756,
0.008931952528655529,
0.018611881881952286,
-0.02545355260372162,
0.11215310543775558,
-0.056923966854810715,
-0.08975019305944443,
0.060770392417907715,
-0.016069309785962105,
0.02726742811501026,
0.11194989830255508,
-0.0007088941638357937,
0.05115021392703056,
0.0675782784819603,
0.06193334981799126,
0.057887423783540726,
0.01436611358076334,
-0.007265977095812559,
0.0025002947077155113,
-0.01355015393346548,
-0.004910522140562534,
-0.01024537067860365,
-0.009198498912155628,
0.10231820493936539,
0.0751994177699089,
-0.07499998062849045,
-0.006722572725266218,
0.10197614878416061,
-0.04535790905356407,
-0.14631597697734833,
-0.1327906847000122,
0.12062901258468628,
-0.003703247755765915,
0.018282290548086166,
-0.0013508772244676948,
-0.07793276757001877,
-0.017445823177695274,
0.1220097541809082,
0.12955015897750854,
0.03109929896891117,
-0.006316162180155516,
-0.03387553244829178,
-0.010592441074550152,
-0.061585862189531326,
0.10763280838727951,
0.005703908856958151,
0.28210416436195374,
-0.008191837929189205,
0.05371509864926338,
-0.028334209695458412,
-0.022847602143883705,
-0.11190036684274673,
0.08003786951303482,
-0.048094406723976135,
-0.003488939953967929,
-0.03777254745364189,
0.07981053739786148,
-0.03920700028538704,
-0.24689443409442902,
-0.018260860815644264,
-0.0016485886881127954,
-0.065154530107975,
0.025291571393609047,
-0.010455004870891571,
0.016010235995054245,
0.0607060082256794,
-0.004427028354257345,
0.004302747081965208,
0.15386377274990082,
-0.013039279729127884,
-0.06850764900445938,
0.037113338708877563,
0.04283713176846504,
-0.050310853868722916,
0.163479283452034,
0.03728402778506279,
0.058256376534700394,
0.05514112487435341,
-0.008695547468960285,
-0.11130207777023315,
0.061039794236421585,
0.02947930432856083,
-0.11218380928039551,
0.017011480405926704,
0.1472930759191513,
-0.00914732739329338,
0.06655165553092957,
0.04755810275673866,
-0.03727398440241814,
0.003313825698569417,
0.08551361411809921,
0.0022471558768302202,
-0.050676241517066956,
0.0553617961704731,
-0.09310587495565414,
0.1234501376748085,
0.1026386097073555,
0.012981951236724854,
-0.02418057806789875,
-0.048571471124887466,
0.004153946880251169,
-0.010093835182487965,
0.09029712527990341,
0.0015744982520118356,
-0.1121993288397789,
-0.029066136106848717,
-0.0010003186762332916,
0.0570002943277359,
-0.1614142805337906,
-0.041319601237773895,
0.006437119096517563,
-0.022997358813881874,
0.02609461545944214,
0.09901314228773117,
0.02080121636390686,
0.00771885784342885,
-0.023649511858820915,
-0.016887307167053223,
0.006860064808279276,
0.0718434527516365,
-0.11303075402975082,
-0.056768808513879776
] |
# Finetuned Verbatim model
This model is trained 200 additional steps on top of the model below. As a result, it outputs only lowercase text without punctuation. It is also considerably more verbatim and makes no attempt to correct grammatical errors in the text.
# NB-Whisper Large
Introducing the **_Norwegian NB-Whisper Large model_**, proudly developed by the National Library of Norway. NB-Whisper is a cutting-edge series of models designed for automatic speech recognition (ASR) and speech translation. These models are based on the work of [OpenAI's Whisper](https://arxiv.org/abs/2212.04356). Each model in the series has been trained for 250,000 steps, utilizing a diverse dataset of 8 million samples. These samples consist of aligned audio clips, each 30 seconds long, culminating in a staggering 66,000 hours of speech. For an in-depth understanding of our training methodology and dataset composition, keep an eye out for our upcoming article.
| Model Size | Parameters | Model |
|------------|------------|------------|
| Tiny | 39M | [NB-Whisper Tiny](https://huggingface.co/NbAiLab/nb-whisper-tiny) |
| Base | 74M | [NB-Whisper Base](https://huggingface.co/NbAiLab/nb-whisper-base) |
| Small | 244M | [NB-Whisper Small](https://huggingface.co/NbAiLab/nb-whisper-small) |
| Medium | 769M | [NB-Whisper Medium](https://huggingface.co/NbAiLab/nb-whisper-medium) |
| Large | 1550M | [NB-Whisper Large](https://huggingface.co/NbAiLab/nb-whisper-large) |
### Verbatim Model
While the main models are suitable for most transcription tasks, we demonstrate how easy it is to change the output of the main model. The following models are trained 250 additional steps from the main models above, and might be suitable for more targeted use cases:
- **Verbatim version**: This lower-cased variant is more literal and suitable for tasks requiring detailed transcription, such as linguistic analysis.
| Model Size | Parameters | Semantic version |
|------------|------------|------------------|
| Tiny | 39M | [Tiny - semantic](https://huggingface.co/NbAiLab/nb-whisper-tiny-semantic) |
| Base | 74M | [Base - semantic](https://huggingface.co/NbAiLab/nb-whisper-base-semantic) |
| Small | 244M | [Small - semantic](https://huggingface.co/NbAiLab/nb-whisper-small-semantic) |
| Medium | 769M | [Medium - semantic](https://huggingface.co/NbAiLab/nb-whisper-medium-semantic) |
| Large | 1550M | [Large - semantic](https://huggingface.co/NbAiLab/nb-whisper-large-semantic) |
### Model Description
- **Developed by:** [NB AI-Lab](https://ai.nb.no/)
- **Shared by:** [NB AI-Lab](https://ai.nb.no/)
- **Model type:** `whisper`
- **Language(s) (NLP):** Norwegian, Norwegian Bokmål, Norwegian Nynorsk, English
- **License:** [Apache 2.0](https://www.apache.org/licenses/LICENSE-2.0)
- **Trained from model:** [openai/whisper-large](https://huggingface.co/openai/whisper-large)
- **Code Repository:** https://github.com/NbAiLab/nb-whisper/
- **Paper:** _Coming soon_
- **Demo:** _See Spaces on this page_
## How to Use the Models
### Online Demos
You can try the models directly through the HuggingFace Inference API, accessible on the right side of this page. Be aware that initially, the model needs to load and will run on limited CPU capacity, which might be slow. To enhance your experience, we are temporarily hosting some models on TPUs for a few days, significantly boosting their performance. Explore these under the **Spaces** section on the [Main Page](https://huggingface.co/NbAiLab/).
### Local Setup with HuggingFace
Alternatively, you can run the models locally. The Tiny, Base, and Small models are optimized for CPU execution. For the Medium and Large models, we recommend a system equipped with a GPU to ensure efficient processing. Setting up and using these models with HuggingFace's Transformers is straightforward, provided you have [Python](https://www.python.org/downloads/) installed on your machine. For practical demonstrations, refer to examples using this [sample mp3 file](https://github.com/NbAiLab/nb-whisper/raw/main/audio/king.mp3).
```bash
# Download the sample file
$ wget -N https://github.com/NbAiLab/nb-whisper/raw/main/audio/king.mp3
# Install necessary libraries.
$ pip install 'transformers>=4.35.2'
```
After this is done, you should be able to run this in Python:
```python
from transformers import pipeline
# Load the model
asr = pipeline("automatic-speech-recognition", "NbAiLabBeta/nb-whisper-large-verbatim")
# Transcribe
asr("king.mp3", generate_kwargs={'task': 'transcribe', 'language': 'no'})
```
<details>
<summary>Expected output</summary>
```json
{
{'text': ' Nordmenn er nordlendinger, trøndere, sørlendinger og folk fra alle andre regioner. Nordmenn er også innvandret fra Afghanistan, Pakistan, Polen, Sverige, Somalia og Syria. Det er ikke alltid så lett å si hvor vi er fra, hvilken nasjonalitet vi er fra. Hvilken nasjonalitet vi er fra. Hvilken nasjonalitet vi er fra. Hvilken nasjonalitet vi er fra. Hvilken nasjonalitet vi er fra. Hvilken nasjonalitet vi er fra. Hvilken nasjonalitet vi er fra.'}
}
```
</details>
#### Extended HuggingFace
Examining the output above, we see that there are multiple repetitions at the end. This is because the video is longer than 30 seconds. By passing the `chunk_length_s` argument, we can transcribe longer files. Our experience is that we get slightly better results by setting that to 28 seconds instead of the default 30 seconds. We also recommend setting the beam size to 5 if possible. This greatly increases the accuracy but takes a bit longer and requires slightly more memory. The examples below also illustrate how to transcribe to English or Nynorsk, and how to get timestamps for sentences and words.
```python
# Long Transcripts
asr("king.mp3", chunk_length_s=28, generate_kwargs={'task': 'transcribe', 'language': 'no'})
# Increase accuracy by setting beam size to 5
asr("king.mp3", chunk_length_s=28, return_timestamps=True, generate_kwargs={'num_beams': 5, 'task': 'transcribe', 'language': 'no'})
# Return Timestamps
asr("king.mp3", chunk_length_s=28, return_timestamps=True, generate_kwargs={'task': 'transcribe', 'language': 'no'})
# Return Word Level Timestamps
asr("king.mp3", chunk_length_s=28, return_timestamps="word", generate_kwargs={'task': 'transcribe', 'language': 'no'})
# Transcribe to Nynorsk
asr("king.mp3", chunk_length_s=28, generate_kwargs={'task': 'transcribe', 'language': 'nn'})
# Transcribe to English
asr("king.mp3", chunk_length_s=28, generate_kwargs={'task': 'transcribe', 'language': 'en'})
```
<details>
<summary>Expected output</summary>
Long transcripts:
```json
{
{'text': ' Nordmenn er nordlendinger, trøndere, sørlendinger og folk fra alle andre regioner. Nordmenn er også innvandret fra Afghanistan, Pakistan, Polen, Sverige, Somalia og Syria. Det er ikke alltid så lett å si hvor vi er fra, hvilken nasjonalitet vi er fra. Hvilken nasjonalitet vi er fra. Hvilken nasjonalitet vi er fra. Hvilken nasjonalitet vi er fra. Hvilken nasjonalitet vi er fra. Hvilken nasjonalitet vi er fra, hvilken nasjonalitet vi tilhører. Det vi kaller hjem, er der hjertet vårt er, og det kan ikke alltid plasseres innenfor landegrenser. Nordmenn er jenter som er glad i jenter, gutter som er glad i gutter, og jenter og gutter som er glad i hverandre. Nordmenn trommer på Gud, Allah, Altet og ingenting. Nordmenn liker Grieg, Kygo, Helbilis og Kari Bremnes. Med andre ord, Norge er dere. Norge er oss. Mitt største håp for Norge er at vi skal klare å ta vare på hverandre, at vi skal bygge dette landet videre på tillit, fellesskap og raushet.'}
}
```
Timestamps:
```json
{
{'text': ' Nordmenn er nordlendinger, trøndere, sørlendinger og folk fra alle andre regioner. Nordmenn er også innvandret fra Afghanistan, Pakistan, Polen, Sverige, Somalia og Syria. Det er ikke alltid så lett å si hvor vi er fra, hvilken nasjonalitet vi er fra. Hvilken nasjonalitet vi er fra. hvilken nasjonalitet vi tilhører. Det vi kaller hjem, er der hjertet vårt er, og det kan ikke alltid plasseres innenfor landegrenser. Nordmenn er jenter som er glad i jenter, gutter som er glad i gutter, og jenter og gutter som er glad i hverandre. Nordmenn trommer på Gud, Allah, Altet og ingenting. Nordmenn liker Grieg, Kygo, Helbiles og Kari Bremnes. Med andre ord, Norge er dere. Norge er oss. Mitt største håp for Norge er at vi skal klare å ta vare på hverandre, at vi skal bygge dette landet videre på tillit, fellesskap og raushet.',
'chunks': [{'timestamp': (0.0, 5.46),
'text': ' Nordmenn er nordlendinger, trøndere, sørlendinger'},
{'timestamp': (5.52, 8.68), 'text': ' og folk fra alle andre regioner.'},
{'timestamp': (8.68, 16.64),
'text': ' Nordmenn er også innvandret fra Afghanistan, Pakistan, Polen, Sverige, Somalia og Syria.'},
{'timestamp': (16.64, 13.3),
'text': ' Det er ikke alltid så lett å si hvor vi er fra, hvilken nasjonalitet vi er fra.'},
{'timestamp': (13.32, 30.28),
'text': ' Hvilken nasjonalitet vi er fra. hvilken nasjonalitet vi tilhører.'},
{'timestamp': (32.52, 39.16),
'text': ' Det vi kaller hjem, er der hjertet vårt er, og det kan ikke alltid plasseres'},
{'timestamp': (39.16, 42.0), 'text': ' innenfor landegrenser.'},
{'timestamp': (42.0, 46.74),
'text': ' Nordmenn er jenter som er glad i jenter, gutter som er glad i gutter,'},
{'timestamp': (46.74, 51.12),
'text': ' og jenter og gutter som er glad i hverandre.'},
{'timestamp': (51.16, 57.42),
'text': ' Nordmenn trommer på Gud, Allah, Altet og ingenting.'},
{'timestamp': (57.42, 64.3),
'text': ' Nordmenn liker Grieg, Kygo, Helbiles og Kari Bremnes.'},
{'timestamp': (64.34, 71.24),
'text': ' Med andre ord, Norge er dere. Norge er oss.'},
{'timestamp': (71.24, 78.04),
'text': ' Mitt største håp for Norge er at vi skal klare å ta vare på hverandre,'},
{'timestamp': (78.12, 84.68),
'text': ' at vi skal bygge dette landet videre på tillit, fellesskap og raushet.'}]}
}
```
Word Level Timestamps:
```json
{
{"text": "Nordmenn er nordlendinger, trøndere, sørlendinger og folk fra alle andre regioner. Nordmenn er også innvandret fra Afghanistan, Pakistan, Polen, Sverige, Somalia og Syria. Det er ikke alltid så lett å si hvor vi er fra, hvilken nasjonalitet vi tilhører. Det vi kaller hjem, er der hjertet vårt er, og det kan ikke alltid plasseres innenfor landegrenser. Nordmenn er jenter som er glad i jenter, gutter som er glad i gutter, og jenter og gutter som er glad i hverandre. Nordmenn trommer på Gud, Allah, Altet og ingenting. Nordmenn liker Grieg, Kygo, Helbilis og Kari Bremnes. Med andre ord, Norge er dere. Norge er oss. Mitt største håp for Norge er at vi skal klare å ta vare på hverandre, at vi skal bygge dette landet videre på tillit, fellesskap og raushet.",
"chunks": [
{"text": "Nordmenn", "timestamp": [0.72, 1.42]},
{"text": "er", "timestamp": [1.42, 1.74]},
// ... more chunks ...
{"text": "raushet.", "timestamp": [83.1, 84.88]}
]
}
}
```
Nynorsk:
```json
{
{"text": "Nordmenn er nordlendingar, trøndarar, sørlendingar og folk frå alle andre regionar. Nordmenn er også innvandra frå Afghanistan, Pakistan, Polen, Sverige, Somalia og Syria. Det er ikkje alltid så lett å seie kvar vi er frå, kva nasjonalitet vi tilhøyrer. Det vi kallar heim, er der hjartet vårt er, og det kan ikkje alltid plasserast innanfor landegrenser. Nordmenn er jenter som er glad i jenter, gutar som erade i gutar, og jenter og gutar som er glade i kvarandre. Nordmenn trommar på Gud, Allah, Altet og ingenting. Nordmenn liker Grieg, Kygo, Helbiles og Kari Bremnes. Med andre ord, Noreg er dere! Noreg er oss. Mitt største håp for Noreg er at vi skal klare å ta vare på kvarandre, at vi skal byggje dette landet vidare på tillit, fellesskap og raushet."}
}
```
English:
```json
{
{"text": "Norwegians are Norwegians, trønders, southerners and people from all other regions. Norwegians are also invaded from Afghanistan, Pakistan, Poland, Sweden, Somalia and Suria. It is not always so easy to say where we are from, what nationality we belong to. What we call home is where our heart is, and it cannot always be placed within national borders. Norwegians are girls who like girls, boys who like boys, and girls and boys who like each other. Norwegians thrump on God, Allah, Altet and nothing. Norwegians like Grieg, Kygo, Helbilis and Kari Bremnes. In other words, Norway is you. Norway is us. My biggest hope for Norway is that we should be able to take care of each other, that we should build this country on trust, community and generosity."}
}
```
</details>
### Whisper CPP
Whisper CPP is a C++ implementation of the Whisper model, offering the same functionalities with the added benefits of C++ efficiency and performance optimizations. This allows embedding any Whisper model into a binary file, facilitating the development of real applications. However, it requires some familiarity with compiling C++ programs. Their [homepage](https://github.com/ggerganov/whisper.cpp) provides examples of how to build applications, including real-time transcription.
We have converted this model to the ggml-format model used by Whisper CPP binaries. The file can be downloaded [here](blob/main/ggml-model.bin), and a `q5_0` quantized version is also available [here](blob/main/ggml-model-q5_0.bin).
```bash
# We can download and compile whisper.cpp
$ git clone --depth 1 https://github.com/ggerganov/whisper.cpp --branch v1.5.1
$ cd whisper.cpp/
$ make
# We also need to convert the audio to WAV as that is the only format supported by whisper.cpp
$ wget -N https://github.com/NbAiLab/nb-whisper/raw/main/audio/king.mp3
$ ffmpeg -i king.mp3 -ar 16000 -ac 1 -c:a pcm_s16le king.wav
# Let's download the two ggml files from this site
wget -N https://huggingface.co/NbAiLab/nb-whisper-large/resolve/main/ggml-model.bin -O models/nb-large-ggml-model.bin
wget -N https://huggingface.co/NbAiLab/nb-whisper-large/resolve/main/ggml-model-q5_0.bin -O models/nb-large-ggml-model-q5_0.bin
# And run it with the f16 default model
$ ./main -l no -m models/nb-large-ggml-model.bin king.wav
# Or the quantized version
$ ./main -l no -m models/nb-large-ggml-model-q5_0.bin king.wav
```
### WhisperX and Speaker Diarization
Speaker diarization is a technique in natural language processing and automatic speech recognition that identifies and separates different speakers in an audio recording. It segments the audio into parts based on who is speaking, enhancing the quality of transcribing meetings or phone calls. We find that [WhisperX](https://github.com/m-bain/whisperX) is the easiest way to use our models for diarizing speech. In addition, WhisperX uses phoneme-based Wav2Vec models to improve the alignment of the timestamps. As of December 2023 it also has native support for using the nb-wav2vec models. It currently uses [PyAnnote-audio](https://github.com/pyannote/pyannote-audio) for doing the actual diarization. This package has a fairly strict licence where you have to agree to user terms. Follow the instructions below.
```bash
# Follow the install instructions on https://github.com/m-bain/whisperX
# Make sure you have a HuggingFace account and have agreed to the pyannote terms
# Log in (or supply HF Token in command line)
huggingface-cli login
# Download a test file
wget -N https://github.com/NbAiLab/nb-whisper/raw/main/audio/knuthamsun.mp3
# Optional. If you get complaints about missing support for Norwegian, do:
pip uninstall whisperx && pip install git+https://github.com/m-bain/whisperx.git@8540ff5985fceee764acbed94f656063d7f56540
# Transcribe the test file. All transcripts will end up in the directory of the mp3-file
whisperx knuthamsun.mp3 --model NbAiLabBeta/nb-whisper-large-verbatim --language no --diarize
```
You can also run WhisperX from Python. Please take a look at the instructions on [WhisperX homepage](https://github.com/m-bain/whisperX).
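For orientation, here is a minimal Python sketch of the same diarization workflow. It assumes the WhisperX Python API as of late 2023 (`load_model`, `load_audio`, `DiarizationPipeline`, `assign_word_speakers`) and that the HuggingFace model id can be passed to `load_model` just like the CLI `--model` flag above; verify against the current WhisperX documentation before relying on it.
```python
# Sketch of diarization with WhisperX from Python (API as of late 2023;
# signatures may have changed -- check the WhisperX homepage).
import whisperx

device = "cuda"  # "cpu" works too, but is much slower
audio = whisperx.load_audio("knuthamsun.mp3")

# Assumption: the HF model id is accepted here, mirroring the CLI --model flag
model = whisperx.load_model("NbAiLabBeta/nb-whisper-large-verbatim", device, language="no")
result = model.transcribe(audio)

# Diarization requires a HF token and prior acceptance of the pyannote user terms
diarize_model = whisperx.DiarizationPipeline(use_auth_token="hf_...", device=device)
diarize_segments = diarize_model(audio)

# Attach speaker labels to the transcribed segments
result = whisperx.assign_word_speakers(diarize_segments, result)
for segment in result["segments"]:
    print(segment.get("speaker", "UNKNOWN"), segment["text"])
```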
### API
Instructions for accessing the models via a simple API are included in the demos under Spaces. Note that these demos are temporary and will only be available for a few weeks.
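As an illustration, a call against the generic HuggingFace Inference API would look roughly like the sketch below. The endpoint URL and response shape follow HuggingFace's standard ASR API and are assumptions here, not documented behaviour of the temporary demos.
```python
# Sketch: querying the generic HuggingFace Inference API for this model.
# Endpoint and response format are assumptions based on HF's standard ASR API.
import requests

API_URL = "https://api-inference.huggingface.co/models/NbAiLabBeta/nb-whisper-large-verbatim"
headers = {"Authorization": "Bearer hf_..."}  # your HuggingFace access token

with open("king.mp3", "rb") as f:
    response = requests.post(API_URL, headers=headers, data=f.read())

print(response.json())  # typically a JSON object like {"text": "..."}
```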
## Training Data
The training data originates from Språkbanken and the National Library of Norway's digital collection, including:
- NST Norwegian ASR Database (16 kHz) and its corresponding dataset
- Transcribed speeches from the Norwegian Parliament by Språkbanken
- TV broadcast (NRK) subtitles (NLN digital collection)
- Audiobooks (NLN digital collection)
## Downstream Use
The models, especially the smaller ones, may exhibit occasional hallucinations and may drop parts of the transcript. They are designed to convert spoken language into grammatically correct written sentences, which might not always be word-for-word translations. We have made two extra model variants for users who want a different transcription style, as shown in the sketch below. We encourage users to try the models themselves to get a better understanding.
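Switching between styles is just a matter of loading a different checkpoint; here is a minimal sketch, assuming the model ids used elsewhere on this card:
```python
from transformers import pipeline

# Main model: grammatically corrected, orthographic output
asr_main = pipeline("automatic-speech-recognition", "NbAiLab/nb-whisper-large")

# This verbatim finetune: lower-cased, literal output without punctuation
asr_verbatim = pipeline("automatic-speech-recognition", "NbAiLabBeta/nb-whisper-large-verbatim")

# Same audio, two transcription styles
print(asr_main("king.mp3", generate_kwargs={"task": "transcribe", "language": "no"}))
print(asr_verbatim("king.mp3", generate_kwargs={"task": "transcribe", "language": "no"}))
```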
## Bias, Risks, and Limitations
Using these models without adequate risk assessment and mitigation could be considered irresponsible. They may contain biases or other undesirable distortions. Users who deploy these models or integrate them into systems or services are responsible for mitigating risks and complying with applicable AI regulations. The National Library of Norway, as the model owner, disclaims liability for any outcomes resulting from third-party use of these models.
### Software
The model was trained using Jax/Flax and converted to PyTorch, TensorFlow, whisper.cpp, and ONNX formats. These are available under `Files and versions`. We welcome requests for conversion to other formats. All training code and scripts are released under the Apache License 2.0 in the GitHub repository [nb-whisper](https://github.com/NbAiLab/nb-whisper/).
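For instance, if the ONNX export follows the standard Optimum layout (an assumption; inspect `Files and versions` to confirm), it could be loaded through HuggingFace Optimum roughly as follows:
```python
# Sketch: running the ONNX export via HuggingFace Optimum.
# Assumes the ONNX weights in this repo follow the standard Optimum layout.
from optimum.onnxruntime import ORTModelForSpeechSeq2Seq
from transformers import AutoProcessor, pipeline

model_id = "NbAiLabBeta/nb-whisper-large-verbatim"  # id as used elsewhere on this card
model = ORTModelForSpeechSeq2Seq.from_pretrained(model_id)
processor = AutoProcessor.from_pretrained(model_id)

asr = pipeline(
    "automatic-speech-recognition",
    model=model,
    tokenizer=processor.tokenizer,
    feature_extractor=processor.feature_extractor,
)
print(asr("king.mp3", generate_kwargs={"task": "transcribe", "language": "no"}))
```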
## Citation & Contributors
The NB-Whisper Large model is a product of the NoSTram project led by Per Egil Kummervold ([@pere](https://huggingface.co/pere)) at the National Library of Norway. Key contributors include Javier de la Rosa ([@versae](https://huggingface.co/versae)), Freddy Wetjen ([@freddyw](https://huggingface.co/freddyw)), and Rolv-Arild Braaten ([@Rolv-Arild](https://huggingface.co/Rolv-Arild)). NB AI-Lab, under the direction of Svein Arne Brygfjeld ([@Brygfjeld](https://huggingface.co/Brygfjeld)), supported the project's successful completion. A detailed paper on our process and findings is forthcoming.
## Disclaimer
The models published in this repository are intended for a generalist purpose and are available to third parties. These models may have bias and/or any other undesirable distortions. When third parties deploy or provide systems and/or services to other parties using any of these models (or using systems based on these models) or become users of the models, they should note that it is their responsibility to mitigate the risks arising from their use and, in any event, to comply with applicable regulations, including regulations regarding the use of artificial intelligence. In no event shall the owner of the models (The National Library of Norway) be liable for any results arising from the use made by third parties of these models.
## Acknowledgements
Our gratitude extends to [Google TPU Research Cloud](https://sites.research.google/trc/about/) for training resources, Google Cloud for translation credits, and HuggingFace's Sanchit Gandhi for technical support. A special thank you to Per Erik Solberg at Språkbanken for the collaboration on the Stortinget corpus.
## Contact
For feedback, technical concerns, or collaboration inquiries, please contact <a rel="noopener nofollow" href="mailto:[email protected]">[email protected]</a>. If you plan to include this model in your research, contact us for the latest information on our upcoming paper for citation purposes.
| {"language": ["no", "nb", "nn", "en"], "license": "apache-2.0", "library_name": "transformers", "tags": ["audio", "asr", "automatic-speech-recognition", "hf-asr-leaderboard"], "datasets": ["NbAiLab/ncc_speech", "NbAiLab/NST", "NbAiLab/NPSC"], "metrics": ["wer", "cer"], "base_model": "openai/whisper-large", "pipeline_tag": "automatic-speech-recognition", "widget": [{"src": "https://datasets-server.huggingface.co/assets/google/fleurs/--/nb_no/train/1/audio/audio.mp3", "example_title": "FLEURS sample 1"}, {"src": "https://datasets-server.huggingface.co/assets/google/fleurs/--/nb_no/train/4/audio/audio.mp3", "example_title": "FLEURS sample 2"}]} | automatic-speech-recognition | NbAiLab/nb-whisper-large-verbatim | [
"transformers",
"pytorch",
"jax",
"tensorboard",
"onnx",
"safetensors",
"whisper",
"automatic-speech-recognition",
"audio",
"asr",
"hf-asr-leaderboard",
"no",
"nb",
"nn",
"en",
"dataset:NbAiLab/ncc_speech",
"dataset:NbAiLab/NST",
"dataset:NbAiLab/NPSC",
"arxiv:2212.04356",
"base_model:openai/whisper-large",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] | 2024-02-13T10:08:03+00:00 | [
"2212.04356"
] | [
"no",
"nb",
"nn",
"en"
] | TAGS
#transformers #pytorch #jax #tensorboard #onnx #safetensors #whisper #automatic-speech-recognition #audio #asr #hf-asr-leaderboard #no #nb #nn #en #dataset-NbAiLab/ncc_speech #dataset-NbAiLab/NST #dataset-NbAiLab/NPSC #arxiv-2212.04356 #base_model-openai/whisper-large #license-apache-2.0 #endpoints_compatible #region-us
null | null | transformers | # Finetuned Verbatim model.
This model is trained for 200 additional steps on top of the model below. As a result, it outputs only lowercase text without punctuation. It is also considerably more verbatim and will not attempt to correct grammatical errors in the text.
# NB-Whisper Medium Verbatim
Introducing the **_Norwegian NB-Whisper Medium Verbatim model_**, proudly developed by the National Library of Norway. NB-Whisper is a cutting-edge series of models designed for automatic speech recognition (ASR) and speech translation. These models are based on the work of [OpenAI's Whisper](https://arxiv.org/abs/2212.04356). Each model in the series has been trained for 250,000 steps, utilizing a diverse dataset of 8 million samples. These samples consist of aligned audio clips, each 30 seconds long, culminating in a staggering 66,000 hours of speech. For an in-depth understanding of our training methodology and dataset composition, keep an eye out for our upcoming article.
| Model Size | Parameters | Model |
|------------|------------|------------|
| Tiny | 39M | [NB-Whisper Tiny](https://huggingface.co/NbAiLab/nb-whisper-tiny) |
| Base | 74M | [NB-Whisper Base](https://huggingface.co/NbAiLab/nb-whisper-base) |
| Small | 244M | [NB-Whisper Small](https://huggingface.co/NbAiLab/nb-whisper-small) |
| Medium | 769M | [NB-Whisper Medium](https://huggingface.co/NbAiLab/nb-whisper-medium) |
| Large | 1550M | [NB-Whisper Large](https://huggingface.co/NbAiLab/nb-whisper-large) |
### Verbatim Model
While the main models are suitable for most transcription tasks, we demonstrate how easy it is to change the output of the main model. The following models are trained 250 additional steps from the main models above, and might be suitable for more targeted use cases:
- **Verbatim version**: This lower-cased variant is more literal and suitable for tasks requiring detailed transcription, such as linguistic analysis.
| Model Size | Parameters | Semantic version |
|------------|------------|------------------|
| Tiny | 39M | [Tiny - semantic](https://huggingface.co/NbAiLab/nb-whisper-tiny-semantic) |
| Base | 74M | [Base - semantic](https://huggingface.co/NbAiLab/nb-whisper-base-semantic) |
| Small | 244M | [Small - semantic](https://huggingface.co/NbAiLab/nb-whisper-small-semantic) |
| Medium | 769M | [Medium - semantic](https://huggingface.co/NbAiLab/nb-whisper-medium-semantic) |
| Large | 1550M | [Large - semantic](https://huggingface.co/NbAiLab/nb-whisper-large-semantic) |
### Model Description
- **Developed by:** [NB AI-Lab](https://ai.nb.no/)
- **Shared by:** [NB AI-Lab](https://ai.nb.no/)
- **Model type:** `whisper`
- **Language(s) (NLP):** Norwegian, Norwegian Bokmål, Norwegian Nynorsk, English
- **License:** [Apache 2.0](https://www.apache.org/licenses/LICENSE-2.0)
- **Trained from model:** [openai/whisper-medium](https://huggingface.co/openai/whisper-medium)
- **Code Repository:** https://github.com/NbAiLab/nb-whisper/
- **Paper:** _Coming soon_
- **Demo:** _See Spaces on this page_
## How to Use the Models
### Online Demos
You can try the models directly through the HuggingFace Inference API, accessible on the right side of this page. Be aware that initially, the model needs to load and will run on limited CPU capacity, which might be slow. To enhance your experience, we are temporarily hosting some models on TPUs for a few days, significantly boosting their performance. Explore these under the **Spaces** section on the [Main Page](https://huggingface.co/NbAiLab/).
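If you prefer to call the hosted model programmatically, the sketch below shows the usual Inference API request pattern. The token placeholder is an assumption (substitute your own HuggingFace token), and the model id follows the examples later in this card:

```python
import requests

# Hypothetical token placeholder; replace with your own HuggingFace access token
API_URL = "https://api-inference.huggingface.co/models/NbAiLabBeta/nb-whisper-medium-verbatim"
headers = {"Authorization": "Bearer hf_your_token_here"}

# Send the raw audio bytes; the API handles decoding and transcription
with open("king.mp3", "rb") as f:
    response = requests.post(API_URL, headers=headers, data=f.read())

print(response.json())  # e.g. {"text": "..."}
```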
### Local Setup with HuggingFace
Alternatively, you can run the models locally. The Tiny, Base, and Small models are optimized for CPU execution. For the Medium and Large models, we recommend a system equipped with a GPU to ensure efficient processing. Setting up and using these models with HuggingFace's Transformers is straightforward, provided you have [Python](https://www.python.org/downloads/) installed on your machine. For practical demonstrations, refer to examples using this [sample mp3 file](https://github.com/NbAiLab/nb-whisper/raw/main/audio/king.mp3).
```bash
# Download the sample file
$ wget -N https://github.com/NbAiLab/nb-whisper/raw/main/audio/king.mp3
# Install necessary libraries.
$ pip install "transformers>=4.35.2"
```
After this is done, you should be able to run this in Python:
```python
from transformers import pipeline
# Load the model
asr = pipeline("automatic-speech-recognition", "NbAiLabBeta/nb-whisper-medium-verbatim")
# Transcribe
asr("king.mp3", generate_kwargs={'task': 'transcribe', 'language': 'no'})
```
<details>
<summary>Expected output</summary>
```json
{
{'text': ' Nordmenn er nordlendinger, trøndere, sørlendinger og folk fra alle andre regioner. Nordmenn er også innvandret fra Afghanistan, Pakistan, Polen, Sverige, Somalia og Syria. Det er ikke alltid så lett å si hvor vi er fra, hvilken nasjonalitet vi er fra. Hvilken nasjonalitet vi er fra. Hvilken nasjonalitet vi er fra. Hvilken nasjonalitet vi er fra. Hvilken nasjonalitet vi er fra. Hvilken nasjonalitet vi er fra. Hvilken nasjonalitet vi er fra.'}
}
```
</details>
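If a GPU is available, the pipeline can be placed on it explicitly. This is a minimal sketch assuming an installed PyTorch and a CUDA device; adjust `device` to your setup:

```python
import torch
from transformers import pipeline

# Run the model on the first CUDA device in half precision to save memory
asr = pipeline(
    "automatic-speech-recognition",
    "NbAiLabBeta/nb-whisper-medium-verbatim",
    torch_dtype=torch.float16,
    device=0,  # use device=-1 (or omit) for CPU
)

asr("king.mp3", generate_kwargs={"task": "transcribe", "language": "no"})
```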
#### Extended HuggingFace
Examining the output above, we see that there are multiple repetitions at the end. This is because the audio is longer than 30 seconds. By passing the ```chunk_length_s``` argument, we can transcribe longer files. In our experience, we get slightly better results by setting it to 28 seconds instead of the default 30 seconds. We also recommend setting the beam size to 5 if possible. This greatly increases the accuracy but takes a bit longer and requires slightly more memory. The examples below also illustrate how to transcribe to English or Nynorsk, and how to get timestamps for sentences and words.
```python
# Long Transcripts
asr("king.mp3", chunk_length_s=28, generate_kwargs={'task': 'transcribe', 'language': 'no'})
# Increase accuracy by setting beam size to 5
asr("king.mp3", chunk_length_s=28, return_timestamps=True, generate_kwargs={'num_beams': 5, 'task': 'transcribe', 'language': 'no'})
# Return Timestamps
asr("king.mp3", chunk_length_s=28, return_timestamps=True, generate_kwargs={'task': 'transcribe', 'language': 'no'})
# Return Word Level Timestamps
asr("king.mp3", chunk_length_s=28, return_timestamps="word", generate_kwargs={'task': 'transcribe', 'language': 'no'})
# Transcribe to Nynorsk
asr("king.mp3", chunk_length_s=28, generate_kwargs={'task': 'transcribe', 'language': 'nn'})
# Transcribe to English
asr("king.mp3", chunk_length_s=28, generate_kwargs={'task': 'transcribe', 'language': 'en'})
```
<details>
<summary>Expected output</summary>
Long transcripts:
```json
{
{'text': ' Nordmenn er nordlendinger, trøndere, sørlendinger og folk fra alle andre regioner. Nordmenn er også innvandret fra Afghanistan, Pakistan, Polen, Sverige, Somalia og Syria. Det er ikke alltid så lett å si hvor vi er fra, hvilken nasjonalitet vi er fra. Hvilken nasjonalitet vi er fra. Hvilken nasjonalitet vi er fra. Hvilken nasjonalitet vi er fra. Hvilken nasjonalitet vi er fra. Hvilken nasjonalitet vi er fra, hvilken nasjonalitet vi tilhører. Det vi kaller hjem, er der hjertet vårt er, og det kan ikke alltid plasseres innenfor landegrenser. Nordmenn er jenter som er glad i jenter, gutter som er glad i gutter, og jenter og gutter som er glad i hverandre. Nordmenn trommer på Gud, Allah, Altet og ingenting. Nordmenn liker Grieg, Kygo, Helbilis og Kari Bremnes. Med andre ord, Norge er dere. Norge er oss. Mitt største håp for Norge er at vi skal klare å ta vare på hverandre, at vi skal bygge dette landet videre på tillit, fellesskap og raushet.'}
}
```
Timestamps:
```json
{
{'text': ' Nordmenn er nordlendinger, trøndere, sørlendinger og folk fra alle andre regioner. Nordmenn er også innvandret fra Afghanistan, Pakistan, Polen, Sverige, Somalia og Syria. Det er ikke alltid så lett å si hvor vi er fra, hvilken nasjonalitet vi er fra. Hvilken nasjonalitet vi er fra. hvilken nasjonalitet vi tilhører. Det vi kaller hjem, er der hjertet vårt er, og det kan ikke alltid plasseres innenfor landegrenser. Nordmenn er jenter som er glad i jenter, gutter som er glad i gutter, og jenter og gutter som er glad i hverandre. Nordmenn trommer på Gud, Allah, Altet og ingenting. Nordmenn liker Grieg, Kygo, Helbiles og Kari Bremnes. Med andre ord, Norge er dere. Norge er oss. Mitt største håp for Norge er at vi skal klare å ta vare på hverandre, at vi skal bygge dette landet videre på tillit, fellesskap og raushet.',
'chunks': [{'timestamp': (0.0, 5.46),
'text': ' Nordmenn er nordlendinger, trøndere, sørlendinger'},
{'timestamp': (5.52, 8.68), 'text': ' og folk fra alle andre regioner.'},
{'timestamp': (8.68, 16.64),
'text': ' Nordmenn er også innvandret fra Afghanistan, Pakistan, Polen, Sverige, Somalia og Syria.'},
{'timestamp': (16.64, 13.3),
'text': ' Det er ikke alltid så lett å si hvor vi er fra, hvilken nasjonalitet vi er fra.'},
{'timestamp': (13.32, 30.28),
'text': ' Hvilken nasjonalitet vi er fra. hvilken nasjonalitet vi tilhører.'},
{'timestamp': (32.52, 39.16),
'text': ' Det vi kaller hjem, er der hjertet vårt er, og det kan ikke alltid plasseres'},
{'timestamp': (39.16, 42.0), 'text': ' innenfor landegrenser.'},
{'timestamp': (42.0, 46.74),
'text': ' Nordmenn er jenter som er glad i jenter, gutter som er glad i gutter,'},
{'timestamp': (46.74, 51.12),
'text': ' og jenter og gutter som er glad i hverandre.'},
{'timestamp': (51.16, 57.42),
'text': ' Nordmenn trommer på Gud, Allah, Altet og ingenting.'},
{'timestamp': (57.42, 64.3),
'text': ' Nordmenn liker Grieg, Kygo, Helbiles og Kari Bremnes.'},
{'timestamp': (64.34, 71.24),
'text': ' Med andre ord, Norge er dere. Norge er oss.'},
{'timestamp': (71.24, 78.04),
'text': ' Mitt største håp for Norge er at vi skal klare å ta vare på hverandre,'},
{'timestamp': (78.12, 84.68),
'text': ' at vi skal bygge dette landet videre på tillit, fellesskap og raushet.'}]}
}
```
Word Level Timestamps:
```json
{
{"text": "Nordmenn er nordlendinger, trøndere, sørlendinger og folk fra alle andre regioner. Nordmenn er også innvandret fra Afghanistan, Pakistan, Polen, Sverige, Somalia og Syria. Det er ikke alltid så lett å si hvor vi er fra, hvilken nasjonalitet vi tilhører. Det vi kaller hjem, er der hjertet vårt er, og det kan ikke alltid plasseres innenfor landegrenser. Nordmenn er jenter som er glad i jenter, gutter som er glad i gutter, og jenter og gutter som er glad i hverandre. Nordmenn trommer på Gud, Allah, Altet og ingenting. Nordmenn liker Grieg, Kygo, Helbilis og Kari Bremnes. Med andre ord, Norge er dere. Norge er oss. Mitt største håp for Norge er at vi skal klare å ta vare på hverandre, at vi skal bygge dette landet videre på tillit, fellesskap og raushet.",
"chunks": [
{"text": "Nordmenn", "timestamp": [0.72, 1.42]},
{"text": "er", "timestamp": [1.42, 1.74]},
// ... more chunks ...
{"text": "raushet.", "timestamp": [83.1, 84.88]}
]
}
}
```
Nynorsk:
```json
{
{"text": "Nordmenn er nordlendingar, trøndarar, sørlendingar og folk frå alle andre regionar. Nordmenn er også innvandra frå Afghanistan, Pakistan, Polen, Sverige, Somalia og Syria. Det er ikkje alltid så lett å seie kvar vi er frå, kva nasjonalitet vi tilhøyrer. Det vi kallar heim, er der hjartet vårt er, og det kan ikkje alltid plasserast innanfor landegrenser. Nordmenn er jenter som er glad i jenter, gutar som erade i gutar, og jenter og gutar som er glade i kvarandre. Nordmenn trommar på Gud, Allah, Altet og ingenting. Nordmenn liker Grieg, Kygo, Helbiles og Kari Bremnes. Med andre ord, Noreg er dere! Noreg er oss. Mitt største håp for Noreg er at vi skal klare å ta vare på kvarandre, at vi skal byggje dette landet vidare på tillit, fellesskap og raushet."}
}
```
English:
```json
{
{"text": "Norwegians are Norwegians, trønders, southerners and people from all other regions. Norwegians are also invaded from Afghanistan, Pakistan, Poland, Sweden, Somalia and Suria. It is not always so easy to say where we are from, what nationality we belong to. What we call home is where our heart is, and it cannot always be placed within national borders. Norwegians are girls who like girls, boys who like boys, and girls and boys who like each other. Norwegians thrump on God, Allah, Altet and nothing. Norwegians like Grieg, Kygo, Helbilis and Kari Bremnes. In other words, Norway is you. Norway is us. My biggest hope for Norway is that we should be able to take care of each other, that we should build this country on trust, community and generosity."}
}
```
</details>
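The sentence-level chunks shown above can be turned into standard subtitle files. The following is a minimal sketch, assuming the chunk structure returned by the pipeline (each chunk carries a `(start, end)` timestamp tuple and a `text` field) and the `asr` pipeline defined earlier:

```python
def to_srt(chunks):
    """Convert pipeline chunks to an SRT-formatted string."""
    def fmt(t):
        # Seconds -> HH:MM:SS,mmm as required by the SRT format
        h, rem = divmod(int(t), 3600)
        m, s = divmod(rem, 60)
        ms = min(999, int(round((t - int(t)) * 1000)))
        return f"{h:02d}:{m:02d}:{s:02d},{ms:03d}"

    lines = []
    for i, chunk in enumerate(chunks, start=1):
        start, end = chunk["timestamp"]
        lines.append(f"{i}\n{fmt(start)} --> {fmt(end)}\n{chunk['text'].strip()}\n")
    return "\n".join(lines)

result = asr("king.mp3", chunk_length_s=28, return_timestamps=True,
             generate_kwargs={"task": "transcribe", "language": "no"})
with open("king.srt", "w", encoding="utf-8") as f:
    f.write(to_srt(result["chunks"]))
```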
### Whisper CPP
Whisper CPP is a C++ implementation of the Whisper model, offering the same functionalities with the added benefits of C++ efficiency and performance optimizations. This allows embedding any Whisper model into a binary file, facilitating the development of real applications. However, it requires some familiarity with compiling C++ programs. Their [homepage](https://github.com/ggerganov/whisper.cpp) provides examples of how to build applications, including real-time transcription.
We have converted this model to the ggml-format model used by Whisper CPP binaries. The file can be downloaded [here](blob/main/ggml-model.bin), and a `q5_0` quantized version is also available [here](blob/main/ggml-model-q5_0.bin).
```bash
# We can download and compile whisper.cpp
$ git clone --depth 1 https://github.com/ggerganov/whisper.cpp --branch v1.5.1
$ cd whisper.cpp/
$ make
# We also need to convert the audio to WAV as that is the only format supported by whisper.cpp
$ wget -N https://github.com/NbAiLab/nb-whisper/raw/main/audio/king.mp3
$ ffmpeg -i king.mp3 -ar 16000 -ac 1 -c:a pcm_s16le king.wav
# Let's download the two ggml files from this site
wget -N https://huggingface.co/NbAiLab/nb-whisper-medium/resolve/main/ggml-model.bin -O models/nb-medium-ggml-model.bin
wget -N https://huggingface.co/NbAiLab/nb-whisper-medium/resolve/main/ggml-model-q5_0.bin -O models/nb-medium-ggml-model-q5_0.bin
# And run it with the f16 default model
$ ./main -l no -m models/nb-medium-ggml-model.bin king.wav
# Or the quantized version
$ ./main -l no -m models/nb-medium-ggml-model-q5_0.bin king.wav
```
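whisper.cpp can also write transcripts directly to files. The flags below exist in recent builds at the time of writing; check `./main --help` for your version:

```bash
# Write plain-text and SRT subtitle files next to the input (king.wav.txt / king.wav.srt)
$ ./main -l no -m models/nb-medium-ggml-model-q5_0.bin -otxt -osrt king.wav
```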
### WhisperX and Speaker Diarization
Speaker diarization is a technique in natural language processing and automatic speech recognition that identifies and separates different speakers in an audio recording. It segments the audio into parts based on who is speaking, enhancing the quality of transcribing meetings or phone calls. We find that [WhisperX](https://github.com/m-bain/whisperX) is the easiest way to use our models for diarizing speech. In addition, WhisperX uses phoneme-based Wav2Vec models to improve the alignment of the timestamps. As of December 2023 it also has native support for the nb-wav2vec models. It currently uses [PyAnnote-audio](https://github.com/pyannote/pyannote-audio) for the actual diarization. This package has a fairly strict license that requires you to agree to its user terms. Follow the instructions below.
```bash
# Follow the install instructions on https://github.com/m-bain/whisperX
# Make sure you have a HuggingFace account and have agreed to the pyannote terms
# Log in (or supply HF Token in command line)
huggingface-cli login
# Download a test file
wget -N https://github.com/NbAiLab/nb-whisper/raw/main/audio/knuthamsun.mp3
# Optional. If you get complaints about missing support for Norwegian, do:
pip uninstall whisperx && pip install git+https://github.com/m-bain/whisperx.git@8540ff5985fceee764acbed94f656063d7f56540
# Transcribe the test file. All transcripts will end up in the directory of the mp3-file
whisperx knuthamsun.mp3 --model NbAiLabBeta/nb-whisper-medium-verbatim --language no --diarize
```
You can also run WhisperX from Python. Please take a look at the instructions on [WhisperX homepage](https://github.com/m-bain/whisperX).
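For reference, a minimal Python sketch is shown below. The WhisperX API changes between releases, so treat the exact function signatures and model loading as assumptions and consult their documentation:

```python
import whisperx

device = "cuda"  # or "cpu"
audio = whisperx.load_audio("knuthamsun.mp3")

# Transcribe with the NB-Whisper model (mirrors the CLI invocation above)
model = whisperx.load_model("NbAiLabBeta/nb-whisper-medium-verbatim", device)
result = model.transcribe(audio, language="no")

# Align timestamps with a phoneme-based Wav2Vec model
align_model, metadata = whisperx.load_align_model(language_code="no", device=device)
result = whisperx.align(result["segments"], align_model, metadata, audio, device)
print(result["segments"])
```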
### API
Instructions for accessing the models via a simple API are included in the demos under Spaces. Note that these demos are temporary and will only be available for a few weeks.
## Training Data
The training data originates from Språkbanken and the National Library of Norway's digital collection, including:
- NST Norwegian ASR Database (16 kHz) and its corresponding dataset
- Transcribed speeches from the Norwegian Parliament by Språkbanken
- TV broadcast (NRK) subtitles (NLN digital collection)
- Audiobooks (NLN digital collection)
## Downstream Use
The models, especially the smaller ones, may exhibit occasional hallucinations and may drop parts of the transcript. They are designed to convert spoken language into grammatically correct written sentences, which might not always be word-for-word transcriptions. We have made two extra model variants for users who want a different transcription style. We encourage users to try the models themselves to get a better understanding.
## Bias, Risks, and Limitations
Using these models without adequate risk assessment and mitigation could be considered irresponsible. They may contain biases or other undesirable distortions. Users who deploy these models or integrate them into systems or services are responsible for mitigating risks and complying with applicable AI regulations. The National Library of Norway, as the model owner, disclaims liability for any outcomes resulting from third-party use of these models.
### Software
The model was trained using Jax/Flax and converted to PyTorch, TensorFlow, whisper.cpp, and ONNX formats. These are available under `Files and versions`. We welcome requests for conversion to other formats. All training code and scripts are released under the Apache License 2.0 in the GitHub repository [nb-whisper](https://github.com/NbAiLab/nb-whisper/).
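For the ONNX files, one option is HuggingFace's Optimum runtime. A minimal sketch, assuming `optimum[onnxruntime]` is installed and that the exported files in this repository are compatible with your Optimum version:

```python
from optimum.onnxruntime import ORTModelForSpeechSeq2Seq
from transformers import AutoProcessor, pipeline

# Load the ONNX export and the matching processor
model = ORTModelForSpeechSeq2Seq.from_pretrained("NbAiLabBeta/nb-whisper-medium-verbatim")
processor = AutoProcessor.from_pretrained("NbAiLabBeta/nb-whisper-medium-verbatim")

asr = pipeline(
    "automatic-speech-recognition",
    model=model,
    tokenizer=processor.tokenizer,
    feature_extractor=processor.feature_extractor,
)
asr("king.mp3", generate_kwargs={"task": "transcribe", "language": "no"})
```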
## Citation & Contributors
The NB-Whisper Medium Verbatim model is a product of the NoSTram project led by Per Egil Kummervold ([@pere](https://huggingface.co/pere)) at the National Library of Norway. Key contributors include Javier de la Rosa ([@versae](https://huggingface.co/versae)), Freddy Wetjen ([@freddyw](https://huggingface.co/freddyw)), and Rolv-Arild Braaten ([@Rolv-Arild](https://huggingface.co/Rolv-Arild)). NB AI-Lab, under the direction of Svein Arne Brygfjeld ([@Brygfjeld](https://huggingface.co/Brygfjeld)), supported the project's successful completion. A detailed paper on our process and findings is forthcoming.
## Disclaimer
The models published in this repository are intended for a generalist purpose and are available to third parties. These models may have bias and/or any other undesirable distortions. When third parties deploy or provide systems and/or services to other parties using any of these models (or using systems based on these models), or become users of the models, they should note that it is their responsibility to mitigate the risks arising from their use and, in any event, to comply with applicable regulations, including regulations regarding the use of artificial intelligence. In no event shall the owner of the models (The National Library of Norway) be liable for any results arising from the use made by third parties of these models.
## Acknowledgements
Our gratitude extends to [Google TPU Research Cloud](https://sites.research.google/trc/about/) for training resources, Google Cloud for translation credits, and HuggingFace's Sanchit Gandhi for technical support. A special thank you to Per Erik Solberg at Språkbanken for the collaboration on the Stortinget corpus.
## Contact
For feedback, technical concerns, or collaboration inquiries, please contact <a rel="noopener nofollow" href="mailto:[email protected]">[email protected]</a>. If you plan to include this model in your research, contact us for the latest information on our upcoming paper for citation purposes.
| {"language": ["no", "nb", "nn", "en"], "license": "apache-2.0", "library_name": "transformers", "tags": ["audio", "asr", "automatic-speech-recognition", "hf-asr-leaderboard"], "datasets": ["NbAiLab/ncc_speech", "NbAiLab/NST", "NbAiLab/NPSC"], "metrics": ["wer", "cer"], "base_model": "openai/whisper-medium", "pipeline_tag": "automatic-speech-recognition", "widget": [{"src": "https://datasets-server.huggingface.co/assets/google/fleurs/--/nb_no/train/1/audio/audio.mp3", "example_title": "FLEURS sample 1"}, {"src": "https://datasets-server.huggingface.co/assets/google/fleurs/--/nb_no/train/4/audio/audio.mp3", "example_title": "FLEURS sample 2"}]} | automatic-speech-recognition | NbAiLab/nb-whisper-medium-verbatim | [
"transformers",
"pytorch",
"jax",
"tensorboard",
"onnx",
"safetensors",
"whisper",
"automatic-speech-recognition",
"audio",
"asr",
"hf-asr-leaderboard",
"no",
"nb",
"nn",
"en",
"dataset:NbAiLab/ncc_speech",
"dataset:NbAiLab/NST",
"dataset:NbAiLab/NPSC",
"arxiv:2212.04356",
"base_model:openai/whisper-medium",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] | 2024-02-13T10:08:10+00:00 | [
"2212.04356"
] | [
"no",
"nb",
"nn",
"en"
] | TAGS
#transformers #pytorch #jax #tensorboard #onnx #safetensors #whisper #automatic-speech-recognition #audio #asr #hf-asr-leaderboard #no #nb #nn #en #dataset-NbAiLab/ncc_speech #dataset-NbAiLab/NST #dataset-NbAiLab/NPSC #arxiv-2212.04356 #base_model-openai/whisper-medium #license-apache-2.0 #endpoints_compatible #region-us
0.02218998782336712,
0.1497228890657425,
-0.08801192045211792,
0.030105141922831535,
0.0006926196510903537,
0.0746789202094078,
-0.07381237298250198,
0.008762650191783905,
-0.03366309776902199,
0.13450491428375244,
0.01223599910736084,
0.05387371405959129,
-0.037838660180568695,
0.04143340140581131,
0.009239842183887959,
0.11705637723207474,
-0.06436719745397568,
0.044642385095357895,
0.02766738273203373,
0.01370085496455431,
0.05866938829421997,
0.041256941854953766,
-0.15221142768859863,
0.021549047902226448,
-0.11149812489748001,
0.05397466942667961,
0.041763629764318466,
0.04991475120186806,
0.08116678148508072,
-0.00875602662563324,
-0.0014098869869485497,
-0.025948092341423035,
-0.1049409881234169,
-0.12142070382833481,
-0.16426512598991394,
0.0234389528632164,
0.0001188392416224815,
-0.004753742832690477,
-0.05502857640385628,
-0.015834441408514977,
-0.08359181880950928,
0.1164366826415062,
-0.0812012329697609,
-0.1199306845664978,
-0.07339360564947128,
-0.05578172206878662,
0.15130998194217682,
-0.04440511390566826,
0.009434311650693417,
0.03456052020192146,
0.15712666511535645,
-0.051698993891477585,
-0.0608486533164978,
-0.01606670394539833,
-0.08380403369665146,
-0.10560864210128784,
0.007861779071390629,
0.11657353490591049,
0.1033165454864502,
0.055156558752059937,
0.011573378928005695,
0.001775673241354525,
-0.0020898033399134874,
-0.09945199638605118,
-0.05493663251399994,
0.17473940551280975,
-0.009328407235443592,
0.03205331787467003,
-0.05047791078686714,
-0.06687429547309875,
-0.052116263657808304,
-0.011229577474296093,
0.04908677935600281,
0.1445901244878769,
-0.05135548114776611,
0.13494910299777985,
0.1937495321035385,
-0.0653407946228981,
-0.2124556452035904,
-0.0528397262096405,
0.051637131720781326,
0.05386717990040779,
0.013575288467109203,
-0.17106205224990845,
0.10755989700555801,
0.03849410265684128,
0.000554183148778975,
0.03661763668060303,
-0.2173703908920288,
-0.10174086689949036,
0.04431682825088501,
-0.009372142143547535,
-0.022019200026988983,
-0.0206910353153944,
-0.02851649560034275,
-0.0384073443710804,
-0.03203586861491203,
0.06708086282014847,
-0.04820941016077995,
0.053981486707925797,
0.03547094389796257,
0.07657995074987411,
0.04691901430487633,
-0.024240486323833466,
0.11358589679002762,
-0.03205002844333649,
0.0047635179944336414,
-0.061694901436567307,
0.0951966941356659,
0.016085924580693245,
-0.06556948274374008,
0.14569371938705444,
-0.02394179441034794,
0.011835903860628605,
-0.10583123564720154,
-0.06310834735631943,
-0.07532072812318802,
0.07395213097333908,
-0.01820492558181286,
-0.04013926163315773,
-0.08435025066137314,
0.07823792845010757,
0.09449157863855362,
0.0055583142675459385,
-0.07751237601041794,
-0.07029364258050919,
-0.0732273980975151,
0.12457800656557083,
0.17442218959331512,
-0.05191715061664581,
-0.05090334638953209,
0.012585081160068512,
-0.00044745454215444624,
0.05524785444140434,
-0.0493268184363842,
0.016355860978364944,
0.09376704692840576,
-0.019356898963451385,
0.03371105715632439,
-0.0322917178273201,
-0.13340704143047333,
-0.017315110191702843,
0.023090576753020287,
-0.0392928346991539,
-0.1670091301202774,
-0.03167755529284477,
0.039266422390937805,
-0.05307302996516228,
-0.043962299823760986,
0.13113509118556976,
-0.08118858188390732,
-0.0013896049931645393,
0.013084187172353268,
0.05620915815234184,
0.024603545665740967,
0.09426602721214294,
0.02685466967523098,
0.0237167626619339,
-0.0702846422791481,
0.09802669286727905,
0.035271111875772476,
-0.1270463764667511,
0.04709966108202934,
0.14992715418338776,
-0.09153342247009277,
-0.04981948062777519,
-0.11973070353269577,
-0.03296862915158272,
-0.012486893683671951,
-0.10357055813074112,
0.008045473136007786,
-0.07289750128984451,
0.014427908696234226,
0.009438048116862774,
0.00788271427154541,
-0.02218623459339142,
-0.01982065476477146,
0.03920961543917656,
-0.10088992863893509,
0.08409100025892258,
0.008028184063732624,
0.026034871116280556,
-0.04035166651010513,
0.0911010131239891,
-0.0016192058101296425,
0.011630511842668056,
-0.02449945919215679,
-0.01880870945751667,
-0.009824731387197971,
-0.031434234231710434,
-0.1394326537847519,
0.013447797857224941,
-0.08816266059875488,
0.007652454078197479,
-0.003321102587506175,
0.034024305641651154,
-0.017663342878222466,
0.052337050437927246,
-0.035217929631471634,
-0.021065140143036842,
-0.05593349412083626,
0.031378962099552155,
-0.05096543952822685,
0.009215501137077808,
0.046408820897340775,
-0.0742160752415657,
0.048864785581827164,
0.0037604505196213722,
-0.050612691789865494,
0.04873538389801979,
-0.012939934618771076,
0.006064849440008402,
0.01727568544447422,
0.06445728242397308,
0.005071787163615227,
-0.05167876556515694,
-0.004715636372566223,
0.029887521639466286,
-0.0007479532505385578,
-0.03904491662979126,
0.03568945452570915,
-0.05497480928897858,
0.0777621790766716,
0.027322612702846527,
-0.015041903592646122,
-0.057955507189035416,
0.029973505064845085,
0.028617536649107933,
0.018963776528835297,
0.08754566311836243,
-0.06080442667007446,
0.01492118090391159,
-0.09637986868619919,
-0.004170130472630262,
0.006634741555899382,
0.007241316605359316,
0.08213584870100021,
-0.016846338286995888,
0.028740113601088524,
0.002889492781832814,
0.18214677274227142,
-0.019202591851353645,
0.024632876738905907,
0.05330730602145195,
-0.09245314449071884,
-0.10558085888624191,
0.017723413184285164,
0.0604800246655941,
0.013455321080982685,
-0.007930897176265717,
-0.04014964774250984,
-0.02814667671918869,
-0.016257578507065773,
-0.001036370755173266,
0.08714871853590012,
0.10617371648550034,
0.0976230725646019,
0.08826819807291031,
0.014028012752532959,
-0.04098311439156532,
-0.10997330397367477,
0.0763208195567131,
-0.03176049143075943,
0.07777931541204453,
-0.04222716763615608,
0.08517944067716599,
0.12359615415334702,
-0.06490137428045273,
0.09454122930765152,
-0.0036572974640876055,
-0.04615459963679314,
-0.07075981795787811,
-0.13402414321899414,
-0.04433242976665497,
-0.03901807591319084,
-0.026252364739775658,
-0.09059158712625504,
0.03540237620472908,
0.012055334635078907,
0.01949060708284378,
-0.02560693584382534,
0.1101616695523262,
-0.06621414422988892,
-0.0931764543056488,
0.05674548074603081,
-0.020955616608262062,
0.030241765081882477,
0.11319563537836075,
-0.0012463784078136086,
0.05373707786202431,
0.062121134251356125,
0.06094951555132866,
0.05987996980547905,
0.016542397439479828,
-0.006221379619091749,
0.0016856197034940124,
-0.011593181639909744,
-0.009227938018739223,
-0.002609333023428917,
-0.005829900503158569,
0.10509815812110901,
0.07614535093307495,
-0.0819401815533638,
-0.006236647721379995,
0.10055315494537354,
-0.046291034668684006,
-0.14998243749141693,
-0.1307687759399414,
0.12786686420440674,
-0.0009357755188830197,
0.020748091861605644,
-0.004485463257879019,
-0.07711076736450195,
-0.019075853750109673,
0.12942977249622345,
0.12907133996486664,
0.031856417655944824,
-0.007237451151013374,
-0.035651423037052155,
-0.011334405280649662,
-0.06539835035800934,
0.11127518862485886,
0.005234651267528534,
0.2760324478149414,
-0.005274938885122538,
0.05352477356791496,
-0.022896533831954002,
-0.02340439148247242,
-0.11547502130270004,
0.07878604531288147,
-0.05087262764573097,
-0.0026227708440274,
-0.027426285669207573,
0.07760471850633621,
-0.0417303629219532,
-0.24655364453792572,
-0.0229586660861969,
0.003556982846930623,
-0.06418276578187943,
0.028180137276649475,
-0.01179917436093092,
0.021242111921310425,
0.06225660815834999,
-0.004294329788535833,
0.003906911239027977,
0.15048623085021973,
-0.014604535885155201,
-0.07048328220844269,
0.03564010560512543,
0.04415198042988777,
-0.041966501623392105,
0.16365796327590942,
0.03469070792198181,
0.057523827999830246,
0.05693890526890755,
-0.010615795850753784,
-0.11535201221704483,
0.06075581535696983,
0.02837255597114563,
-0.11714828759431839,
0.02646351419389248,
0.1469414085149765,
-0.010864478535950184,
0.073973648250103,
0.04690287634730339,
-0.03802666440606117,
0.0027837723027914762,
0.08550050109624863,
0.006520237773656845,
-0.051601674407720566,
0.054037224501371384,
-0.09269941598176956,
0.1265147477388382,
0.10168280452489853,
0.012311205267906189,
-0.026300907135009766,
-0.04780186340212822,
0.006401794496923685,
-0.011390899308025837,
0.09497781842947006,
-0.003815124509856105,
-0.11586036533117294,
-0.027454063296318054,
-0.0054977149702608585,
0.05677550658583641,
-0.161662295460701,
-0.04133867472410202,
0.00468446733430028,
-0.021599285304546356,
0.02855628728866577,
0.09876874089241028,
0.02730019949376583,
0.007248224690556526,
-0.0229452196508646,
-0.02427842654287815,
0.005362322088330984,
0.07350432127714157,
-0.11940198391675949,
-0.05905388295650482
] |
null | null | transformers | # Finetuned Verbatim model.
This model is trained 200 additional steps on top of the model below. This makes it output only lowercase text without punctuation. It is also considerably more verbatim and will not attempt to correct grammatical errors in the text.
# NB-Whisper Small Verbatim
Introducing the **_Norwegian NB-Whisper Small Verbatim model_**, proudly developed by the National Library of Norway. NB-Whisper is a cutting-edge series of models designed for automatic speech recognition (ASR) and speech translation. These models are based on the work of [OpenAI's Whisper](https://arxiv.org/abs/2212.04356). Each model in the series has been trained for 250,000 steps, utilizing a diverse dataset of 8 million samples. These samples consist of aligned audio clips, each 30 seconds long, culminating in a staggering 66,000 hours of speech. For an in-depth understanding of our training methodology and dataset composition, keep an eye out for our upcoming article.
| Model Size | Parameters | Model |
|------------|------------|------------|
| Tiny | 39M | [NB-Whisper Tiny](https://huggingface.co/NbAiLab/nb-whisper-tiny) |
| Base | 74M | [NB-Whisper Base](https://huggingface.co/NbAiLab/nb-whisper-base) |
| Small | 244M | [NB-Whisper Small](https://huggingface.co/NbAiLab/nb-whisper-small) |
| Medium | 769M | [NB-Whisper Medium](https://huggingface.co/NbAiLab/nb-whisper-medium) |
| Large | 1550M | [NB-Whisper Large](https://huggingface.co/NbAiLab/nb-whisper-large) |
### Verbatim Model
While the main models are suitable for most transcription tasks, we demonstrate how easy it is to change the output of the main model. The following models are trained 250 additional steps from the main models above, and might be suitable for more targeted use cases:
- **Verbatim version**: This lower-cased variant is more literal and suitable for tasks requiring detailed transcription, such as linguistic analysis.
| Model Size | Parameters | Semantic version |
|------------|------------|------------------|
| Tiny | 39M | [Tiny - semantic](https://huggingface.co/NbAiLab/nb-whisper-tiny-semantic) |
| Base | 74M | [Base - semantic](https://huggingface.co/NbAiLab/nb-whisper-base-semantic) |
| Small | 244M | [Small - semantic](https://huggingface.co/NbAiLab/nb-whisper-small-semantic) |
| Medium | 769M | [Medium - semantic](https://huggingface.co/NbAiLab/nb-whisper-medium-semantic) |
| Large | 1550M | [Large - semantic](https://huggingface.co/NbAiLab/nb-whisper-large-semantic) |
### Model Description
- **Developed by:** [NB AI-Lab](https://ai.nb.no/)
- **Shared by:** [NB AI-Lab](https://ai.nb.no/)
- **Model type:** `whisper`
- **Language(s) (NLP):** Norwegian, Norwegian Bokmål, Norwegian Nynorsk, English
- **License:** [Apache 2.0](https://www.apache.org/licenses/LICENSE-2.0)
- **Trained from model:** [openai/whisper-small](https://huggingface.co/openai/whisper-small)
- **Code Repository:** https://github.com/NbAiLab/nb-whisper/
- **Paper:** _Coming soon_
- **Demo:** _See Spaces on this page_
## How to Use the Models
### Online Demos
You can try the models directly through the HuggingFace Inference API, accessible on the right side of this page. Be aware that initially, the model needs to load and will run on limited CPU capacity, which might be slow. To enhance your experience, we are temporarily hosting some models on TPUs for a few days, significantly boosting their performance. Explore these under the **Spaces** section on the [Main Page](https://huggingface.co/NbAiLab/).
### Local Setup with HuggingFace
Alternatively, you can run the models locally. The Tiny, Base, and Small models are optimized for CPU execution. For the Medium and Large models, we recommend a system equipped with a GPU to ensure efficient processing. Setting up and using these models with HuggingFace's Transformers is straightforward, provided you have [Python](https://www.python.org/downloads/) installed on your machine. For practical demonstrations, refer to examples using this [sample mp3 file](https://github.com/NbAiLab/nb-whisper/raw/main/audio/king.mp3).
```bash
# Download the sample file
$ wget -N https://github.com/NbAiLab/nb-whisper/raw/main/audio/king.mp3
# Install necessary libraries.
$ pip install transformers>=4.35.2
```
After this is done, you should be able to run this in Python:
```python
from transformers import pipeline
# Load the model
asr = pipeline("automatic-speech-recognition", "NbAiLabBeta/nb-whisper-medium-verbatim")
#transcribe
asr("king.mp3", generate_kwargs={'task': 'transcribe', 'language': 'no'})
```
<details>
<summary>Expected output</summary>
```json
{
{'text': ' Nordmenn er nordlendinger, trøndere, sørlendinger og folk fra alle andre regioner. Nordmenn er også innvandret fra Afghanistan, Pakistan, Polen, Sverige, Somalia og Syria. Det er ikke alltid så lett å si hvor vi er fra, hvilken nasjonalitet vi er fra. Hvilken nasjonalitet vi er fra. Hvilken nasjonalitet vi er fra. Hvilken nasjonalitet vi er fra. Hvilken nasjonalitet vi er fra. Hvilken nasjonalitet vi er fra. Hvilken nasjonalitet vi er fra.'}
}
```
</details>
#### Extended HuggingFace
Examining the output above, we see that there are multiple repetitions at the end. This is because the audio is longer than 30 seconds. By passing the ```chunk_length_s``` argument, we can transcribe longer files. Our experience is that we get slightly better results by setting that to 28 seconds instead of the default 30 seconds. We also recommend setting the beam size to 5 if possible. This greatly increases the accuracy but takes a bit longer and requires slightly more memory. The examples below also illustrate how to transcribe to English or Nynorsk, and how to get timestamps for sentences and words.
```python
# Long Transcripts
asr("king.mp3", chunk_length_s=28, generate_kwargs={'task': 'transcribe', 'language': 'no'})
# Increase accuracy by setting beam size to 5
asr("king.mp3", chunk_length_s=28, return_timestamps=True, generate_kwargs={'num_beams': 5, 'task': 'transcribe', 'language': 'no'})
# Return Timestamps
asr("king.mp3", chunk_length_s=28, return_timestamps=True, generate_kwargs={'task': 'transcribe', 'language': 'no'})
# Return Word Level Timestamps
asr("king.mp3", chunk_length_s=28, return_timestamps="word", generate_kwargs={'task': 'transcribe', 'language': 'no'})
# Transcribe to Nynorsk
asr("king.mp3", chunk_length_s=28, generate_kwargs={'task': 'transcribe', 'language': 'nn'})
# Transcribe to English
asr("king.mp3", chunk_length_s=28, generate_kwargs={'task': 'transcribe', 'language': 'en'})
```
<details>
<summary>Expected output</summary>
Long transcripts:
```json
{
{'text': ' Nordmenn er nordlendinger, trøndere, sørlendinger og folk fra alle andre regioner. Nordmenn er også innvandret fra Afghanistan, Pakistan, Polen, Sverige, Somalia og Syria. Det er ikke alltid så lett å si hvor vi er fra, hvilken nasjonalitet vi er fra. Hvilken nasjonalitet vi er fra. Hvilken nasjonalitet vi er fra. Hvilken nasjonalitet vi er fra. Hvilken nasjonalitet vi er fra. Hvilken nasjonalitet vi er fra, hvilken nasjonalitet vi tilhører. Det vi kaller hjem, er der hjertet vårt er, og det kan ikke alltid plasseres innenfor landegrenser. Nordmenn er jenter som er glad i jenter, gutter som er glad i gutter, og jenter og gutter som er glad i hverandre. Nordmenn trommer på Gud, Allah, Altet og ingenting. Nordmenn liker Grieg, Kygo, Helbilis og Kari Bremnes. Med andre ord, Norge er dere. Norge er oss. Mitt største håp for Norge er at vi skal klare å ta vare på hverandre, at vi skal bygge dette landet videre på tillit, fellesskap og raushet.'}
}
```
Timestamps:
```json
{
{'text': ' Nordmenn er nordlendinger, trøndere, sørlendinger og folk fra alle andre regioner. Nordmenn er også innvandret fra Afghanistan, Pakistan, Polen, Sverige, Somalia og Syria. Det er ikke alltid så lett å si hvor vi er fra, hvilken nasjonalitet vi er fra. Hvilken nasjonalitet vi er fra. hvilken nasjonalitet vi tilhører. Det vi kaller hjem, er der hjertet vårt er, og det kan ikke alltid plasseres innenfor landegrenser. Nordmenn er jenter som er glad i jenter, gutter som er glad i gutter, og jenter og gutter som er glad i hverandre. Nordmenn trommer på Gud, Allah, Altet og ingenting. Nordmenn liker Grieg, Kygo, Helbiles og Kari Bremnes. Med andre ord, Norge er dere. Norge er oss. Mitt største håp for Norge er at vi skal klare å ta vare på hverandre, at vi skal bygge dette landet videre på tillit, fellesskap og raushet.',
'chunks': [{'timestamp': (0.0, 5.46),
'text': ' Nordmenn er nordlendinger, trøndere, sørlendinger'},
{'timestamp': (5.52, 8.68), 'text': ' og folk fra alle andre regioner.'},
{'timestamp': (8.68, 16.64),
'text': ' Nordmenn er også innvandret fra Afghanistan, Pakistan, Polen, Sverige, Somalia og Syria.'},
{'timestamp': (16.64, 13.3),
'text': ' Det er ikke alltid så lett å si hvor vi er fra, hvilken nasjonalitet vi er fra.'},
{'timestamp': (13.32, 30.28),
'text': ' Hvilken nasjonalitet vi er fra. hvilken nasjonalitet vi tilhører.'},
{'timestamp': (32.52, 39.16),
'text': ' Det vi kaller hjem, er der hjertet vårt er, og det kan ikke alltid plasseres'},
{'timestamp': (39.16, 42.0), 'text': ' innenfor landegrenser.'},
{'timestamp': (42.0, 46.74),
'text': ' Nordmenn er jenter som er glad i jenter, gutter som er glad i gutter,'},
{'timestamp': (46.74, 51.12),
'text': ' og jenter og gutter som er glad i hverandre.'},
{'timestamp': (51.16, 57.42),
'text': ' Nordmenn trommer på Gud, Allah, Altet og ingenting.'},
{'timestamp': (57.42, 64.3),
'text': ' Nordmenn liker Grieg, Kygo, Helbiles og Kari Bremnes.'},
{'timestamp': (64.34, 71.24),
'text': ' Med andre ord, Norge er dere. Norge er oss.'},
{'timestamp': (71.24, 78.04),
'text': ' Mitt største håp for Norge er at vi skal klare å ta vare på hverandre,'},
{'timestamp': (78.12, 84.68),
'text': ' at vi skal bygge dette landet videre på tillit, fellesskap og raushet.'}]}
}
```
Word Level Timestamps:
```json
{
{"text": "Nordmenn er nordlendinger, trøndere, sørlendinger og folk fra alle andre regioner. Nordmenn er også innvandret fra Afghanistan, Pakistan, Polen, Sverige, Somalia og Syria. Det er ikke alltid så lett å si hvor vi er fra, hvilken nasjonalitet vi tilhører. Det vi kaller hjem, er der hjertet vårt er, og det kan ikke alltid plasseres innenfor landegrenser. Nordmenn er jenter som er glad i jenter, gutter som er glad i gutter, og jenter og gutter som er glad i hverandre. Nordmenn trommer på Gud, Allah, Altet og ingenting. Nordmenn liker Grieg, Kygo, Helbilis og Kari Bremnes. Med andre ord, Norge er dere. Norge er oss. Mitt største håp for Norge er at vi skal klare å ta vare på hverandre, at vi skal bygge dette landet videre på tillit, fellesskap og raushet.",
"chunks": [
{"text": "Nordmenn", "timestamp": [0.72, 1.42]},
{"text": "er", "timestamp": [1.42, 1.74]},
// ... more chunks ...
{"text": "raushet.", "timestamp": [83.1, 84.88]}
]
}
}
```
Nynorsk:
```json
{
{"text": "Nordmenn er nordlendingar, trøndarar, sørlendingar og folk frå alle andre regionar. Nordmenn er også innvandra frå Afghanistan, Pakistan, Polen, Sverige, Somalia og Syria. Det er ikkje alltid så lett å seie kvar vi er frå, kva nasjonalitet vi tilhøyrer. Det vi kallar heim, er der hjartet vårt er, og det kan ikkje alltid plasserast innanfor landegrenser. Nordmenn er jenter som er glad i jenter, gutar som erade i gutar, og jenter og gutar som er glade i kvarandre. Nordmenn trommar på Gud, Allah, Altet og ingenting. Nordmenn liker Grieg, Kygo, Helbiles og Kari Bremnes. Med andre ord, Noreg er dere! Noreg er oss. Mitt største håp for Noreg er at vi skal klare å ta vare på kvarandre, at vi skal byggje dette landet vidare på tillit, fellesskap og raushet."}
}
```
English:
```json
{
{"text": "Norwegians are Norwegians, trønders, southerners and people from all other regions. Norwegians are also invaded from Afghanistan, Pakistan, Poland, Sweden, Somalia and Suria. It is not always so easy to say where we are from, what nationality we belong to. What we call home is where our heart is, and it cannot always be placed within national borders. Norwegians are girls who like girls, boys who like boys, and girls and boys who like each other. Norwegians thrump on God, Allah, Altet and nothing. Norwegians like Grieg, Kygo, Helbilis and Kari Bremnes. In other words, Norway is you. Norway is us. My biggest hope for Norway is that we should be able to take care of each other, that we should build this country on trust, community and generosity."}
}
```
</details>
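If you prefer to work below the `pipeline` abstraction, the same checkpoint can be driven through the standard processor/model API. The snippet below is a minimal sketch rather than official documentation: it assumes `librosa` is installed for audio loading and that the `task`/`language` generation arguments behave as in recent Transformers releases.

```python
import librosa
from transformers import AutoProcessor, AutoModelForSpeechSeq2Seq

model_id = "NbAiLabBeta/nb-whisper-small-verbatim"
processor = AutoProcessor.from_pretrained(model_id)
model = AutoModelForSpeechSeq2Seq.from_pretrained(model_id)

# Whisper expects 16 kHz mono audio
audio, _ = librosa.load("king.mp3", sr=16000, mono=True)

# Convert the waveform to log-mel input features
inputs = processor(audio, sampling_rate=16000, return_tensors="pt")

# Generate a transcription; this covers only the first 30 seconds,
# so use the pipeline with chunk_length_s for longer files
predicted_ids = model.generate(
    input_features=inputs.input_features,
    task="transcribe",
    language="no",
    num_beams=5,
)
print(processor.batch_decode(predicted_ids, skip_special_tokens=True)[0])
```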
### Whisper CPP
Whisper CPP is a C++ implementation of the Whisper model, offering the same functionalities with the added benefits of C++ efficiency and performance optimizations. This allows embedding any Whisper model into a binary file, facilitating the development of real applications. However, it requires some familiarity with compiling C++ programs. Their [homepage](https://github.com/ggerganov/whisper.cpp) provides examples of how to build applications, including real-time transcription.
We have converted this model to the ggml-format model used by Whisper CPP binaries. The file can be downloaded [here](blob/main/ggml-model.bin), and a `q5_0` quantized version is also available [here](blob/main/ggml-model-q5_0.bin).
```bash
# We can download and compile whisper.cpp
$ git clone --depth 1 https://github.com/ggerganov/whisper.cpp --branch v1.5.1
$ cd whisper.cpp/
$ make
# We also need to convert the audio to WAV as that is the only format supported by whisper.cpp
$ wget -N https://github.com/NbAiLab/nb-whisper/raw/main/audio/king.mp3
$ ffmpeg -i king.mp3 -ar 16000 -ac 1 -c:a pcm_s16le king.wav
# Let's download the two ggml files from this site
$ wget -N https://huggingface.co/NbAiLab/nb-whisper-small/resolve/main/ggml-model.bin -O models/nb-small-ggml-model.bin
$ wget -N https://huggingface.co/NbAiLab/nb-whisper-small/resolve/main/ggml-model-q5_0.bin -O models/nb-small-ggml-model-q5_0.bin
# And run it with the f16 default model
$ ./main -l no -m models/nb-small-ggml-model.bin king.wav
# Or the quantized version
$ ./main -l no -m models/nb-small-ggml-model-q5_0.bin king.wav
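# Optionally, build your own quantized file instead of downloading ours.
# This is a sketch: the quantize tool ships with whisper.cpp, but check the
# repository README if the build target or usage has changed.
$ make quantize
$ ./quantize models/nb-small-ggml-model.bin models/nb-small-ggml-model-q5_0.bin q5_0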
```
### WhisperX and Speaker Diarization
Speaker diarization is a technique in natural language processing and automatic speech recognition that identifies and separates different speakers in an audio recording. It segments the audio into parts based on who is speaking, enhancing the quality of transcribing meetings or phone calls. We find that [WhisperX](https://github.com/m-bain/whisperX) is the easiest way to use our models for diarizing speech. In addition, WhisperX uses phoneme-based Wav2Vec models to improve the alignment of the timestamps. As of December 2023 it also has native support for using the nb-wav2vec models. It currently uses [PyAnnote-audio](https://github.com/pyannote/pyannote-audio) for doing the actual diarization. This package has a fairly strict license where you have to agree to its user terms. Follow the instructions below.
```bash
# Follow the install instructions on https://github.com/m-bain/whisperX
# Make sure you have a HuggingFace account and have agreed to the pyannote terms
# Log in (or supply HF Token in command line)
huggingface-cli login
# Download a test file
wget -N https://github.com/NbAiLab/nb-whisper/raw/main/audio/knuthamsun.mp3
# Optional. If you get complaints about missing support for Norwegian, do:
pip uninstall whisperx && pip install git+https://github.com/m-bain/whisperx.git@8540ff5985fceee764acbed94f656063d7f56540
# Transcribe the test file. All transcripts will end up in the directory of the mp3-file
whisperx knuthamsun.mp3 --model NbAiLabBeta/nb-whisper-small-verbatim --language no --diarize
```
You can also run WhisperX from Python. Please take a look at the instructions on [WhisperX homepage](https://github.com/m-bain/whisperX).
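As a starting point, a minimal Python script might look like the sketch below. It follows the WhisperX README around the version referenced above, but function names and arguments can differ between WhisperX releases, so treat everything here as illustrative rather than guaranteed.

```python
import whisperx

device = "cuda"  # or "cpu"
audio = whisperx.load_audio("knuthamsun.mp3")

# 1. Transcribe with the NB-Whisper model
model = whisperx.load_model("NbAiLabBeta/nb-whisper-small-verbatim", device, language="no")
result = model.transcribe(audio, batch_size=16)

# 2. Align the output to get accurate word-level timestamps
align_model, metadata = whisperx.load_align_model(language_code="no", device=device)
result = whisperx.align(result["segments"], align_model, metadata, audio, device)

# 3. Assign speaker labels (requires a HF token with the pyannote terms accepted)
diarize_model = whisperx.DiarizationPipeline(use_auth_token="YOUR_HF_TOKEN", device=device)
diarize_segments = diarize_model(audio)
result = whisperx.assign_word_speakers(diarize_segments, result)

for segment in result["segments"]:
    print(segment.get("speaker"), segment["text"])
```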
### API
Instructions for accessing the models via a simple API are included in the demos under Spaces. Note that these demos are temporary and will only be available for a few weeks.
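While the demos are live, querying a hosted model typically boils down to a single HTTP POST with raw audio bytes. The snippet below shows the general shape of such a call against the regular HuggingFace Inference API; the temporary Spaces endpoints may use different URLs and parameters, so treat this as a sketch.

```python
import requests

API_URL = "https://api-inference.huggingface.co/models/NbAiLab/nb-whisper-small-verbatim"
headers = {"Authorization": "Bearer YOUR_HF_TOKEN"}  # placeholder token

with open("king.mp3", "rb") as f:
    data = f.read()

# ASR models on the Inference API accept the raw audio file as the request body
response = requests.post(API_URL, headers=headers, data=data)
print(response.json())  # e.g. {"text": "..."}
```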
## Training Data
The training data originates from Språkbanken and the National Library of Norway's digital collection, including:
- NST Norwegian ASR Database (16 kHz) and its corresponding dataset
- Transcribed speeches from the Norwegian Parliament by Språkbanken
- TV broadcast (NRK) subtitles (NLN digital collection)
- Audiobooks (NLN digital collection)
## Downstream Use
The models, especially the smaller ones, may exhibit occasional hallucinations and may drop parts of the transcript. They are designed to convert spoken language into grammatically correct written sentences, which might not always be word-for-word translations. We have made two extra model variants for users who want a different transcription style. We encourage users to try the models themselves to get a better understanding.
## Bias, Risks, and Limitations
Using these models without adequate risk assessment and mitigation could be considered irresponsible. They may contain biases or other undesirable distortions. Users who deploy these models or integrate them into systems or services are responsible for mitigating risks and complying with applicable AI regulations. The National Library of Norway, as the model owner, disclaims liability for any outcomes resulting from third-party use of these models.
### Software
The model was trained using Jax/Flax and converted to PyTorch, TensorFlow, whisper.cpp, and ONNX formats. These are available under `Files and versions`. We welcome requests for conversion to other formats. All training code and scripts are released under the Apache License 2.0 in the GitHub repository [nb-whisper](https://github.com/NbAiLab/nb-whisper/).
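As one example of using the converted weights, a Whisper checkpoint can usually be run through ONNX Runtime via the `optimum` library. The sketch below passes `export=True`, which converts the PyTorch weights on the fly; whether the ONNX files already in this repository load directly depends on their layout, so treat this as illustrative.

```python
from optimum.onnxruntime import ORTModelForSpeechSeq2Seq
from transformers import AutoProcessor, pipeline

model_id = "NbAiLab/nb-whisper-small-verbatim"
# export=True converts the PyTorch checkpoint to ONNX on the fly
model = ORTModelForSpeechSeq2Seq.from_pretrained(model_id, export=True)
processor = AutoProcessor.from_pretrained(model_id)

asr = pipeline(
    "automatic-speech-recognition",
    model=model,
    tokenizer=processor.tokenizer,
    feature_extractor=processor.feature_extractor,
)
print(asr("king.mp3", generate_kwargs={"task": "transcribe", "language": "no"}))
```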
## Citation & Contributors
The NB-Whisper Small Verbatim model is a product of the NoSTram project led by Per Egil Kummervold ([@pere](https://huggingface.co/pere)) at the National Library of Norway. Key contributors include Javier de la Rosa ([@versae](https://huggingface.co/versae)), Freddy Wetjen ([@freddyw](https://huggingface.co/freddyw)), and Rolv-Arild Braaten ([@Rolv-Arild](https://huggingface.co/Rolv-Arild)). NB AI-Lab, under the direction of Svein Arne Brygfjeld ([@Brygfjeld](https://huggingface.co/Brygfjeld)), supported the project's successful completion. A detailed paper on our process and findings is forthcoming.
## Disclaimer
The models published in this repository are intended for a generalist purpose and are available to third parties. These models may have bias and/or any other undesirable distortions. When third parties, deploy or provide systems and/or services to other parties using any of these models (or using systems based on these models) or become users of the models, they should note that it is their responsibility to mitigate the risks arising from their use and, in any event, to comply with applicable regulations, including regulations regarding the use of artificial intelligence. In no event shall the owner of the models (The National Library of Norway) be liable for any results arising from the use made by third parties of these models.
## Acknowledgements
Our gratitude extends to [Google TPU Research Cloud](https://sites.research.google/trc/about/) for training resources, Google Cloud for translation credits, and HuggingFace's Sanchit Gandhi for technical support. A special thank you to Per Erik Solberg at Språkbanken for the collaboration on the Stortinget corpus.
## Contact
For feedback, technical concerns, or collaboration inquiries, please contact <a rel="noopener nofollow" href="mailto:[email protected]">[email protected]</a>. If you plan to include this model in your research, contact us for the latest information on our upcoming paper for citation purposes.
| {"language": ["no", "nb", "nn", "en"], "license": "apache-2.0", "library_name": "transformers", "tags": ["audio", "asr", "automatic-speech-recognition", "hf-asr-leaderboard"], "datasets": ["NbAiLab/ncc_speech", "NbAiLab/NST", "NbAiLab/NPSC"], "metrics": ["wer", "cer"], "base_model": "openai/whisper-small", "pipeline_tag": "automatic-speech-recognition", "widget": [{"src": "https://datasets-server.huggingface.co/assets/google/fleurs/--/nb_no/train/1/audio/audio.mp3", "example_title": "FLEURS sample 1"}, {"src": "https://datasets-server.huggingface.co/assets/google/fleurs/--/nb_no/train/4/audio/audio.mp3", "example_title": "FLEURS sample 2"}]} | automatic-speech-recognition | NbAiLab/nb-whisper-small-verbatim | [
"transformers",
"pytorch",
"jax",
"tensorboard",
"onnx",
"safetensors",
"whisper",
"automatic-speech-recognition",
"audio",
"asr",
"hf-asr-leaderboard",
"no",
"nb",
"nn",
"en",
"dataset:NbAiLab/ncc_speech",
"dataset:NbAiLab/NST",
"dataset:NbAiLab/NPSC",
"arxiv:2212.04356",
"base_model:openai/whisper-small",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] | 2024-02-13T10:08:16+00:00 | [
"2212.04356"
] | [
"no",
"nb",
"nn",
"en"
] | TAGS
#transformers #pytorch #jax #tensorboard #onnx #safetensors #whisper #automatic-speech-recognition #audio #asr #hf-asr-leaderboard #no #nb #nn #en #dataset-NbAiLab/ncc_speech #dataset-NbAiLab/NST #dataset-NbAiLab/NPSC #arxiv-2212.04356 #base_model-openai/whisper-small #license-apache-2.0 #endpoints_compatible #region-us
| Finetuned Verbatim model.
=========================
This model is trained 200 additional steps on top of the model below. This makes it output only lowercase text without punctuation. It is also considerably more verbatim and will not attempt to correct grammatical errors in the text.
NB-Whisper Small Verbatim
=========================
Introducing the *Norwegian NB-Whisper Small Verbatim model*, proudly developed by the National Library of Norway. NB-Whisper is a cutting-edge series of models designed for automatic speech recognition (ASR) and speech translation. These models are based on the work of OpenAI's Whisper. Each model in the series has been trained for 250,000 steps, utilizing a diverse dataset of 8 million samples. These samples consist of aligned audio clips, each 30 seconds long, culminating in a staggering 66,000 hours of speech. For an in-depth understanding of our training methodology and dataset composition, keep an eye out for our upcoming article.
Model Size: Tiny, Parameters: 39M, Model: NB-Whisper Tiny
Model Size: Base, Parameters: 74M, Model: NB-Whisper Base
Model Size: Small, Parameters: 244M, Model: NB-Whisper Small
Model Size: Medium, Parameters: 769M, Model: NB-Whisper Medium
Model Size: Large, Parameters: 1550M, Model: NB-Whisper Large
### Verbatim Model
While the main models are suitable for most transcription tasks, we demonstrate how easy it is to change the output of the main model. The following models are trained 250 additional steps from the main models above, and might be suitable for more targeted use cases:
* Verbatim version: This lower-cased variant is more literal and suitable for tasks requiring detailed transcription, such as linguistic analysis.
Model Size: Tiny, Parameters: 39M, Semantic version: Tiny - semantic
Model Size: Base, Parameters: 74M, Semantic version: Base - semantic
Model Size: Small, Parameters: 244M, Semantic version: Small - semantic
Model Size: Medium, Parameters: 769M, Semantic version: Medium - semantic
Model Size: Large, Parameters: 1550M, Semantic version: Large - semantic
### Model Description
* Developed by: NB AI-Lab
* Shared by: NB AI-Lab
* Model type: 'whisper'
* Language(s) (NLP): Norwegian, Norwegian Bokmål, Norwegian Nynorsk, English
* License: Apache 2.0
* Trained from model: openai/whisper-small
* Code Repository: URL
* Paper: *Coming soon*
* Demo: *See Spaces on this page*
How to Use the Models
---------------------
### Online Demos
You can try the models directly through the HuggingFace Inference API, accessible on the right side of this page. Be aware that initially, the model needs to load and will run on limited CPU capacity, which might be slow. To enhance your experience, we are temporarily hosting some models on TPUs for a few days, significantly boosting their performance. Explore these under the Spaces section on the Main Page.
### Local Setup with HuggingFace
Alternatively, you can run the models locally. The Tiny, Base, and Small models are optimized for CPU execution. For the Medium and Large models, we recommend a system equipped with a GPU to ensure efficient processing. Setting up and using these models with HuggingFace's Transformers is straightforward, provided you have Python installed on your machine. For practical demonstrations, refer to examples using this sample mp3 file.
After this is done, you should be able to run this in Python:
Expected output
#### Extended HuggingFace
Examining the output above, we see that there are multiple repetitions at the end. This is because the audio is longer than 30 seconds. By passing the chunk_length_s argument, we can transcribe longer files. Our experience is that we get slightly better results by setting that to 28 seconds instead of the default 30 seconds. We also recommend setting the beam size to 5 if possible. This greatly increases the accuracy but takes a bit longer and requires slightly more memory. The examples below also illustrate how to transcribe to English or Nynorsk, and how to get timestamps for sentences and words.
Expected output
Long transcripts:
Timestamps:
Word Level Timestamps:
Nynorsk:
English:
### Whisper CPP
Whisper CPP is a C++ implementation of the Whisper model, offering the same functionalities with the added benefits of C++ efficiency and performance optimizations. This allows embedding any Whisper model into a binary file, facilitating the development of real applications. However, it requires some familiarity with compiling C++ programs. Their homepage provides examples of how to build applications, including real-time transcription.
We have converted this model to the ggml-format model used by Whisper CPP binaries. The file can be downloaded here, and a 'q5\_0' quantized version is also available here.
### WhisperX and Speaker Diarization
Speaker diarization is a technique in natural language processing and automatic speech recognition that identifies and separates different speakers in an audio recording. It segments the audio into parts based on who is speaking, enhancing the quality of transcribing meetings or phone calls. We find that WhisperX is the easiest way to use our models for diarizing speech. In addition, WhisperX is using phoneme-based Wav2Vec-models for improving the alignment of the timestamps. As of December 2023 it also has native support for using the nb-wav2vec-models. It currently uses PyAnnote-audio for doing the actual diarization. This package has a fairly strict licence where you have to agree to user terms. Follow the instructions below.
You can also run WhisperX from Python. Please take a look at the instructions on WhisperX homepage.
### API
Instructions for accessing the models via a simple API are included in the demos under Spaces. Note that these demos are temporary and will only be available for a few weeks.
Training Data
-------------
The training data originates from Språkbanken and the National Library of Norway's digital collection, including:
* NST Norwegian ASR Database (16 kHz) and its corresponding dataset
* Transcribed speeches from the Norwegian Parliament by Språkbanken
* TV broadcast (NRK) subtitles (NLN digital collection)
* Audiobooks (NLN digital collection)
Downstream Use
--------------
The models, especially the smaller ones, may exhibit occasional hallucinations and may drop parts of the transcript. They are designed to convert spoken language into grammatically correct written sentences, which might not always be word-for-word translations. We have made two extra model variants for users who want a different transcription style. We encourage users to try the models themselves to get a better understanding.
Bias, Risks, and Limitations
----------------------------
Using these models without adequate risk assessment and mitigation could be considered irresponsible. They may contain biases or other undesirable distortions. Users who deploy these models or integrate them into systems or services are responsible for mitigating risks and complying with applicable AI regulations. The National Library of Norway, as the model owner, disclaims liability for any outcomes resulting from third-party use of these models.
### Software
The model was trained using Jax/Flax and converted to PyTorch, TensorFlow, URL, and ONNX formats. These are available under 'Files and versions'. We welcome requests for conversion to other formats. All training code and scripts are released under the Apache License 2.0 in the GitHub repository nb-whisper.
& Contributors
The NB-Whisper Small Verbatim model is a product of the NoSTram project led by Per Egil Kummervold (@pere) at the National Library of Norway. Key contributors include Javier de la Rosa (@versae), Freddy Wetjen (@freddyw), and Rolv-Arild Braaten (@Rolv-Arild). NB AI-Lab, under the direction of Svein Arne Brygfjeld (@Brygfjeld), supported the project's successful completion. A detailed paper on our process and findings is forthcoming.
Disclaimer
----------
The models published in this repository are intended for a generalist purpose and are available to third parties. These models may have bias and/or any other undesirable distortions. When third parties, deploy or provide systems and/or services to other parties using any of these models (or using systems based on these models) or become users of the models, they should note that it is their responsibility to mitigate the risks arising from their use and, in any event, to comply with applicable regulations, including regulations regarding the use of artificial intelligence. In no event shall the owner of the models (The National Library of Norway) be liable for any results arising from the use made by third parties of these models.
Acknowledgements
----------------
Our gratitude extends to Google TPU Research Cloud for training resources, Google Cloud for translation credits, and HuggingFace's Sanchit Gandhi for technical support. A special thank you to Per Erik Solberg at Språkbanken for the collaboration on the Stortinget corpus.
Contact
-------
For feedback, technical concerns, or collaboration inquiries, please contact [ailab@URL](mailto:ailab@URL). If you plan to include this model in your research, contact us for the latest information on our upcoming paper for citation purposes.
| [
"### Verbatim Model\n\n\nWhile the main models are suitable for most transcription task, we demonstrate how easy it is to change the output of the main model. The following models are trained 250 additional steps from the main models above, and might be suitable for more targetted use cases:\n\n\n* Verbatim version: This lower-cased variant is more literal and suitable for tasks requiring detailed transcription, such as linguistic analysis.\n\n\nModel Size: Tiny, Parameters: 39M, Semantic version: Tiny - semantic\nModel Size: Base, Parameters: 74M, Semantic version: Base - semantic\nModel Size: Small, Parameters: 244M, Semantic version: Small - semantic\nModel Size: Medium, Parameters: 769M, Semantic version: Medium - semantic\nModel Size: Large, Parameters: 1550M, Semantic version: Large - semantic",
"### Model Description\n\n\n* Developed by: NB AI-Lab\n* Shared by: NB AI-Lab\n* Model type: 'whisper'\n* Language(s) (NLP): Norwegian, Norwegian Bokmål, Norwegian Nynorsk, English\n* License: Apache 2.0\n* Trained from model: openai/whisper-small\n* Code Repository: URL\n* Paper: *Coming soon*\n* Demo: *See Spaces on this page*\n\n\nHow to Use the Models\n---------------------",
"### Online Demos\n\n\nYou can try the models directly through the HuggingFace Inference API, accessible on the right side of this page. Be aware that initially, the model needs to load and will run on limited CPU capacity, which might be slow. To enhance your experience, we are temporarily hosting some models on TPUs for a few days, significantly boosting their performance. Explore these under the Spaces section on the Main Page.",
"### Local Setup with HuggingFace\n\n\nAlternatively, you can run the models locally. The Tiny, Base, and Small models are optimized for CPU execution. For the Medium and Large models, we recommend a system equipped with a GPU to ensure efficient processing. Setting up and using these models with HuggingFace's Transformers is straightforward, provided you have Python installed on your machine. For practical demonstrations, refer to examples using this sample mp3 file.\n\n\nAfter this is done, you should be able to run this in Python:\n\n\n\nExpected output",
"#### Extended HuggingFace\n\n\nExamining the output above, we see that there are multiple repetitions at the end. This is because the video is longer than 30 seconds. By passing the argument, we can transcribe longer file. Our experience is that we get slightly better result by setting that to 28 seconds instead of the default 30 seconds. We also recommend setting the beam size to 5 if possible. This greatly increases the accuracy but takes a bit longer and requires slightly more memory. The examples below also illustrates how to transcribe to English or Nynorsk, and how to get timestamps for sentences and words.\n\n\n\nExpected output\nLong transcripts:\n\n\nTimestamps:\n\n\nWord Level Timestamps:\n\n\nNynorsk:\n\n\nEnglish:",
"### Whisper CPP\n\n\nWhisper CPP is a C++ implementation of the Whisper model, offering the same functionalities with the added benefits of C++ efficiency and performance optimizations. This allows embedding any Whisper model into a binary file, facilitating the development of real applications. However, it requires some familiarity with compiling C++ programs. Their homepage provides examples of how to build applications, including real-time transcription.\n\n\nWe have converted this model to the ggml-format model used by Whisper CPP binaries. The file can be downloaded here, and a 'q5\\_0' quantized version is also available here.",
"### WhisperX and Speaker Diarization\n\n\nSpeaker diarization is a technique in natural language processing and automatic speech recognition that identifies and separates different speakers in an audio recording. It segments the audio into parts based on who is speaking, enhancing the quality of transcribing meetings or phone calls. We find that WhisperX is the easiest way to use our models for diarizing speech. In addition, WhisperX is using phoneme-based Wav2Vec-models for improving the alignment of the timestamps. As of December 2023 it also has native support for using the nb-wav2vec-models. It currently uses PyAnnote-audio for doing the actual diarization. This package has a fairly strict licence where you have to agree to user terms. Follow the instructions below.\n\n\nYou can also run WhisperX from Python. Please take a look at the instructions on WhisperX homepage.",
"### API\n\n\nInstructions for accessing the models via a simple API are included in the demos under Spaces. Note that these demos are temporary and will only be available for a few weeks.\n\n\nTraining Data\n-------------\n\n\nThe training data originates from Språkbanken and the National Library of Norway's digital collection, including:\n\n\n* NST Norwegian ASR Database (16 kHz) and its corresponding dataset\n* Transcribed speeches from the Norwegian Parliament by Språkbanken\n* TV broadcast (NRK) subtitles (NLN digital collection)\n* Audiobooks (NLN digital collection)\n\n\nDownstream Use\n--------------\n\n\nThe models, especially the smaller ones, may exhibit occasional hallucinations and may drop parts of the transcript. They are designed to convert spoken language into grammatically correct written sentences, which might not always be word-for-word translations. We have made two extra model variant for users that want a different transcription style. We encourage users to try the models themselves to get a better understanding.\n\n\nBias, Risks, and Limitations\n----------------------------\n\n\nUsing these models without adequate risk assessment and mitigation could be considered irresponsible. They may contain biases or other undesirable distortions. Users who deploy these models or integrate them into systems or services are responsible for mitigating risks and complying with applicable AI regulations. The National Library of Norway, as the model owner, disclaims liability for any outcomes resulting from third-party use of these models.",
"### Software\n\n\nThe model was trained using Jax/Flax and converted to PyTorch, Tensorflow, URL, and ONXX formats. These are available under 'Files and versions'. We welcome requests for conversion to other formats. All training code and scripts are released under the Apache License 2.0 in the GitHub repository nb-whisper.\n\n\n& Contributors\nThe NB-Whisper Small Verbatim model is a product of the NoSTram project led by Per Egil Kummervold (@pere) at the National Library of Norway. Key contributors include Javier de la Rosa (@versae), Freddy Wetjen (@freddyw), and Rolv-Arild Braaten (@Rolv-Arild). NB AI-Lab, under the direction of Svein Arne Brygfjeld (@Brygfjeld), supported the project's successful completion. A detailed paper on our process and findings is forthcoming.\n\n\nDisclaimer\n----------\n\n\nThe models published in this repository are intended for a generalist purpose and are available to third parties. These models may have bias and/or any other undesirable distortions. When third parties, deploy or provide systems and/or services to other parties using any of these models (or using systems based on these models) or become users of the models, they should note that it is their responsibility to mitigate the risks arising from their use and, in any event, to comply with applicable regulations, including regulations regarding the use of artificial intelligence. In no event shall the owner of the models (The National Library of Norway) be liable for any results arising from the use made by third parties of these models.\n\n\nAcknowledgements\n----------------\n\n\nOur gratitude extends to Google TPU Research Cloud for training resources, Google Cloud for translation credits, and HuggingFace's Sanchit Ghandi for technical support. A special thank you to Per Erik Solberg at Språkbanken for the collaboration on the Stortinget corpus.\n\n\nContact\n-------\n\n\nFor feedback, technical concerns, or collaboration inquiries, please contact [ailab@URL](mailto:ailab@URL). If you plan to include this model in your research, contact us for the latest information on our upcoming paper for citation purposes."
] | [
"TAGS\n#transformers #pytorch #jax #tensorboard #onnx #safetensors #whisper #automatic-speech-recognition #audio #asr #hf-asr-leaderboard #no #nb #nn #en #dataset-NbAiLab/ncc_speech #dataset-NbAiLab/NST #dataset-NbAiLab/NPSC #arxiv-2212.04356 #base_model-openai/whisper-small #license-apache-2.0 #endpoints_compatible #region-us \n",
"### Verbatim Model\n\n\nWhile the main models are suitable for most transcription task, we demonstrate how easy it is to change the output of the main model. The following models are trained 250 additional steps from the main models above, and might be suitable for more targetted use cases:\n\n\n* Verbatim version: This lower-cased variant is more literal and suitable for tasks requiring detailed transcription, such as linguistic analysis.\n\n\nModel Size: Tiny, Parameters: 39M, Semantic version: Tiny - semantic\nModel Size: Base, Parameters: 74M, Semantic version: Base - semantic\nModel Size: Small, Parameters: 244M, Semantic version: Small - semantic\nModel Size: Medium, Parameters: 769M, Semantic version: Medium - semantic\nModel Size: Large, Parameters: 1550M, Semantic version: Large - semantic",
"### Model Description\n\n\n* Developed by: NB AI-Lab\n* Shared by: NB AI-Lab\n* Model type: 'whisper'\n* Language(s) (NLP): Norwegian, Norwegian Bokmål, Norwegian Nynorsk, English\n* License: Apache 2.0\n* Trained from model: openai/whisper-small\n* Code Repository: URL\n* Paper: *Coming soon*\n* Demo: *See Spaces on this page*\n\n\nHow to Use the Models\n---------------------",
"### Online Demos\n\n\nYou can try the models directly through the HuggingFace Inference API, accessible on the right side of this page. Be aware that initially, the model needs to load and will run on limited CPU capacity, which might be slow. To enhance your experience, we are temporarily hosting some models on TPUs for a few days, significantly boosting their performance. Explore these under the Spaces section on the Main Page.",
"### Local Setup with HuggingFace\n\n\nAlternatively, you can run the models locally. The Tiny, Base, and Small models are optimized for CPU execution. For the Medium and Large models, we recommend a system equipped with a GPU to ensure efficient processing. Setting up and using these models with HuggingFace's Transformers is straightforward, provided you have Python installed on your machine. For practical demonstrations, refer to examples using this sample mp3 file.\n\n\nAfter this is done, you should be able to run this in Python:\n\n\n\nExpected output",
"#### Extended HuggingFace\n\n\nExamining the output above, we see that there are multiple repetitions at the end. This is because the video is longer than 30 seconds. By passing the argument, we can transcribe longer file. Our experience is that we get slightly better result by setting that to 28 seconds instead of the default 30 seconds. We also recommend setting the beam size to 5 if possible. This greatly increases the accuracy but takes a bit longer and requires slightly more memory. The examples below also illustrates how to transcribe to English or Nynorsk, and how to get timestamps for sentences and words.\n\n\n\nExpected output\nLong transcripts:\n\n\nTimestamps:\n\n\nWord Level Timestamps:\n\n\nNynorsk:\n\n\nEnglish:",
"### Whisper CPP\n\n\nWhisper CPP is a C++ implementation of the Whisper model, offering the same functionalities with the added benefits of C++ efficiency and performance optimizations. This allows embedding any Whisper model into a binary file, facilitating the development of real applications. However, it requires some familiarity with compiling C++ programs. Their homepage provides examples of how to build applications, including real-time transcription.\n\n\nWe have converted this model to the ggml-format model used by Whisper CPP binaries. The file can be downloaded here, and a 'q5\\_0' quantized version is also available here.",
"### WhisperX and Speaker Diarization\n\n\nSpeaker diarization is a technique in natural language processing and automatic speech recognition that identifies and separates different speakers in an audio recording. It segments the audio into parts based on who is speaking, enhancing the quality of transcribing meetings or phone calls. We find that WhisperX is the easiest way to use our models for diarizing speech. In addition, WhisperX is using phoneme-based Wav2Vec-models for improving the alignment of the timestamps. As of December 2023 it also has native support for using the nb-wav2vec-models. It currently uses PyAnnote-audio for doing the actual diarization. This package has a fairly strict licence where you have to agree to user terms. Follow the instructions below.\n\n\nYou can also run WhisperX from Python. Please take a look at the instructions on WhisperX homepage.",
"### API\n\n\nInstructions for accessing the models via a simple API are included in the demos under Spaces. Note that these demos are temporary and will only be available for a few weeks.\n\n\nTraining Data\n-------------\n\n\nThe training data originates from Språkbanken and the National Library of Norway's digital collection, including:\n\n\n* NST Norwegian ASR Database (16 kHz) and its corresponding dataset\n* Transcribed speeches from the Norwegian Parliament by Språkbanken\n* TV broadcast (NRK) subtitles (NLN digital collection)\n* Audiobooks (NLN digital collection)\n\n\nDownstream Use\n--------------\n\n\nThe models, especially the smaller ones, may exhibit occasional hallucinations and may drop parts of the transcript. They are designed to convert spoken language into grammatically correct written sentences, which might not always be word-for-word translations. We have made two extra model variant for users that want a different transcription style. We encourage users to try the models themselves to get a better understanding.\n\n\nBias, Risks, and Limitations\n----------------------------\n\n\nUsing these models without adequate risk assessment and mitigation could be considered irresponsible. They may contain biases or other undesirable distortions. Users who deploy these models or integrate them into systems or services are responsible for mitigating risks and complying with applicable AI regulations. The National Library of Norway, as the model owner, disclaims liability for any outcomes resulting from third-party use of these models.",
"### Software\n\n\nThe model was trained using Jax/Flax and converted to PyTorch, Tensorflow, URL, and ONXX formats. These are available under 'Files and versions'. We welcome requests for conversion to other formats. All training code and scripts are released under the Apache License 2.0 in the GitHub repository nb-whisper.\n\n\n& Contributors\nThe NB-Whisper Small Verbatim model is a product of the NoSTram project led by Per Egil Kummervold (@pere) at the National Library of Norway. Key contributors include Javier de la Rosa (@versae), Freddy Wetjen (@freddyw), and Rolv-Arild Braaten (@Rolv-Arild). NB AI-Lab, under the direction of Svein Arne Brygfjeld (@Brygfjeld), supported the project's successful completion. A detailed paper on our process and findings is forthcoming.\n\n\nDisclaimer\n----------\n\n\nThe models published in this repository are intended for a generalist purpose and are available to third parties. These models may have bias and/or any other undesirable distortions. When third parties, deploy or provide systems and/or services to other parties using any of these models (or using systems based on these models) or become users of the models, they should note that it is their responsibility to mitigate the risks arising from their use and, in any event, to comply with applicable regulations, including regulations regarding the use of artificial intelligence. In no event shall the owner of the models (The National Library of Norway) be liable for any results arising from the use made by third parties of these models.\n\n\nAcknowledgements\n----------------\n\n\nOur gratitude extends to Google TPU Research Cloud for training resources, Google Cloud for translation credits, and HuggingFace's Sanchit Ghandi for technical support. A special thank you to Per Erik Solberg at Språkbanken for the collaboration on the Stortinget corpus.\n\n\nContact\n-------\n\n\nFor feedback, technical concerns, or collaboration inquiries, please contact [ailab@URL](mailto:ailab@URL). If you plan to include this model in your research, contact us for the latest information on our upcoming paper for citation purposes."
] | [
143,
198,
107,
95,
127,
160,
149,
215,
325,
500
] | [
"passage: TAGS\n#transformers #pytorch #jax #tensorboard #onnx #safetensors #whisper #automatic-speech-recognition #audio #asr #hf-asr-leaderboard #no #nb #nn #en #dataset-NbAiLab/ncc_speech #dataset-NbAiLab/NST #dataset-NbAiLab/NPSC #arxiv-2212.04356 #base_model-openai/whisper-small #license-apache-2.0 #endpoints_compatible #region-us \n### Verbatim Model\n\n\nWhile the main models are suitable for most transcription task, we demonstrate how easy it is to change the output of the main model. The following models are trained 250 additional steps from the main models above, and might be suitable for more targetted use cases:\n\n\n* Verbatim version: This lower-cased variant is more literal and suitable for tasks requiring detailed transcription, such as linguistic analysis.\n\n\nModel Size: Tiny, Parameters: 39M, Semantic version: Tiny - semantic\nModel Size: Base, Parameters: 74M, Semantic version: Base - semantic\nModel Size: Small, Parameters: 244M, Semantic version: Small - semantic\nModel Size: Medium, Parameters: 769M, Semantic version: Medium - semantic\nModel Size: Large, Parameters: 1550M, Semantic version: Large - semantic### Model Description\n\n\n* Developed by: NB AI-Lab\n* Shared by: NB AI-Lab\n* Model type: 'whisper'\n* Language(s) (NLP): Norwegian, Norwegian Bokmål, Norwegian Nynorsk, English\n* License: Apache 2.0\n* Trained from model: openai/whisper-small\n* Code Repository: URL\n* Paper: *Coming soon*\n* Demo: *See Spaces on this page*\n\n\nHow to Use the Models\n---------------------",
"passage: ### Online Demos\n\n\nYou can try the models directly through the HuggingFace Inference API, accessible on the right side of this page. Be aware that initially, the model needs to load and will run on limited CPU capacity, which might be slow. To enhance your experience, we are temporarily hosting some models on TPUs for a few days, significantly boosting their performance. Explore these under the Spaces section on the Main Page.### Local Setup with HuggingFace\n\n\nAlternatively, you can run the models locally. The Tiny, Base, and Small models are optimized for CPU execution. For the Medium and Large models, we recommend a system equipped with a GPU to ensure efficient processing. Setting up and using these models with HuggingFace's Transformers is straightforward, provided you have Python installed on your machine. For practical demonstrations, refer to examples using this sample mp3 file.\n\n\nAfter this is done, you should be able to run this in Python:\n\n\n\nExpected output#### Extended HuggingFace\n\n\nExamining the output above, we see that there are multiple repetitions at the end. This is because the video is longer than 30 seconds. By passing the argument, we can transcribe longer file. Our experience is that we get slightly better result by setting that to 28 seconds instead of the default 30 seconds. We also recommend setting the beam size to 5 if possible. This greatly increases the accuracy but takes a bit longer and requires slightly more memory. The examples below also illustrates how to transcribe to English or Nynorsk, and how to get timestamps for sentences and words.\n\n\n\nExpected output\nLong transcripts:\n\n\nTimestamps:\n\n\nWord Level Timestamps:\n\n\nNynorsk:\n\n\nEnglish:### Whisper CPP\n\n\nWhisper CPP is a C++ implementation of the Whisper model, offering the same functionalities with the added benefits of C++ efficiency and performance optimizations. This allows embedding any Whisper model into a binary file, facilitating the development of real applications. However, it requires some familiarity with compiling C++ programs. Their homepage provides examples of how to build applications, including real-time transcription.\n\n\nWe have converted this model to the ggml-format model used by Whisper CPP binaries. The file can be downloaded here, and a 'q5\\_0' quantized version is also available here.",
"passage: ### WhisperX and Speaker Diarization\n\n\nSpeaker diarization is a technique in natural language processing and automatic speech recognition that identifies and separates different speakers in an audio recording. It segments the audio into parts based on who is speaking, enhancing the quality of transcribing meetings or phone calls. We find that WhisperX is the easiest way to use our models for diarizing speech. In addition, WhisperX is using phoneme-based Wav2Vec-models for improving the alignment of the timestamps. As of December 2023 it also has native support for using the nb-wav2vec-models. It currently uses PyAnnote-audio for doing the actual diarization. This package has a fairly strict licence where you have to agree to user terms. Follow the instructions below.\n\n\nYou can also run WhisperX from Python. Please take a look at the instructions on WhisperX homepage.### API\n\n\nInstructions for accessing the models via a simple API are included in the demos under Spaces. Note that these demos are temporary and will only be available for a few weeks.\n\n\nTraining Data\n-------------\n\n\nThe training data originates from Språkbanken and the National Library of Norway's digital collection, including:\n\n\n* NST Norwegian ASR Database (16 kHz) and its corresponding dataset\n* Transcribed speeches from the Norwegian Parliament by Språkbanken\n* TV broadcast (NRK) subtitles (NLN digital collection)\n* Audiobooks (NLN digital collection)\n\n\nDownstream Use\n--------------\n\n\nThe models, especially the smaller ones, may exhibit occasional hallucinations and may drop parts of the transcript. They are designed to convert spoken language into grammatically correct written sentences, which might not always be word-for-word translations. We have made two extra model variant for users that want a different transcription style. We encourage users to try the models themselves to get a better understanding.\n\n\nBias, Risks, and Limitations\n----------------------------\n\n\nUsing these models without adequate risk assessment and mitigation could be considered irresponsible. They may contain biases or other undesirable distortions. Users who deploy these models or integrate them into systems or services are responsible for mitigating risks and complying with applicable AI regulations. The National Library of Norway, as the model owner, disclaims liability for any outcomes resulting from third-party use of these models."
] | [
-0.05313670262694359,
0.09707695245742798,
-0.003880731062963605,
0.0012831786880269647,
0.046301472932100296,
-0.029270900413393974,
0.025701439008116722,
0.05904564633965492,
-0.00023794670414645225,
0.06951453536748886,
-0.002706193597987294,
-0.07667413353919983,
0.08631157875061035,
0.021189367398619652,
0.09632975608110428,
-0.2983481287956238,
0.05454470217227936,
-0.07895288616418839,
0.010602572001516819,
0.03997239097952843,
0.0936204195022583,
-0.05643267557024956,
0.045930664986371994,
0.014130756258964539,
-0.005923555698245764,
0.014167976565659046,
-0.05366680026054382,
-0.03547634556889534,
0.07598380744457245,
0.09494748711585999,
0.0394386388361454,
-0.0011252068215981126,
0.08186641335487366,
-0.14478033781051636,
0.009939960204064846,
0.05257062986493111,
0.036123644560575485,
0.011760481633245945,
0.035008665174245834,
0.11113276332616806,
0.20000874996185303,
-0.08396207541227341,
0.010943327099084854,
0.05032143369317055,
-0.026787815615534782,
-0.138142392039299,
-0.0423724390566349,
-0.025564784184098244,
0.05061521753668785,
0.04267450049519539,
-0.03012804128229618,
0.08522967249155045,
-0.07865230739116669,
0.03649737313389778,
0.07793093472719193,
-0.11590909957885742,
-0.014122061431407928,
0.014020264148712158,
0.04893474280834198,
0.04814661666750908,
-0.026305215433239937,
0.009253446944057941,
0.01583971455693245,
0.04073401167988777,
0.015422825701534748,
-0.009919791482388973,
0.07131191343069077,
-0.06190745159983635,
-0.10806053876876831,
-0.04816674813628197,
0.12705682218074799,
0.016789792105555534,
-0.06055287644267082,
-0.17028534412384033,
-0.041089627891778946,
0.056730836629867554,
-0.022781148552894592,
-0.026135198771953583,
0.02914804220199585,
-0.0025629810988903046,
0.0968579649925232,
-0.07163555175065994,
-0.0979076698422432,
0.015982838347554207,
-0.007248008158057928,
0.1084735170006752,
0.04391639307141304,
0.0032915985211730003,
0.017141973599791527,
0.04907306656241417,
-0.06850013881921768,
-0.05525605008006096,
-0.0659816637635231,
-0.06542304903268814,
-0.05962321162223816,
0.013917632400989532,
-0.026013746857643127,
-0.09549623727798462,
-0.00031829081126488745,
0.0769890770316124,
-0.019275715574622154,
0.02246892638504505,
0.026539525017142296,
-0.002181742340326309,
0.052935678511857986,
0.13106216490268707,
-0.0235371645539999,
-0.07798952609300613,
0.0012427183100953698,
-0.006733561400324106,
0.06595762819051743,
0.01150660589337349,
-0.04909980297088623,
-0.03309032320976257,
-0.003275076625868678,
0.024991512298583984,
0.01667100004851818,
0.012090765871107578,
0.021332645788788795,
-0.02876955084502697,
0.24200840294361115,
-0.11678973585367203,
0.00032916790223680437,
0.014547653496265411,
-0.04745154082775116,
0.0810275599360466,
0.029732907190918922,
-0.02524886094033718,
-0.12939578294754028,
0.014258217997848988,
-0.0023223310708999634,
-0.02188037894666195,
-0.06600665301084518,
-0.10683397203683853,
0.04931458830833435,
0.019808337092399597,
-0.013135972432792187,
-0.114737868309021,
-0.09354698657989502,
-0.04552862048149109,
0.01894090510904789,
-0.011279534548521042,
-0.03845116123557091,
0.004129962995648384,
-0.04065261036157608,
-0.03251716122031212,
-0.030552411451935768,
0.0021749038714915514,
-0.028590066358447075,
-0.01985478773713112,
-0.024314217269420624,
0.035928357392549515,
-0.02141048200428486,
0.011384367011487484,
-0.06342495232820511,
-0.006687468383461237,
-0.1882234662771225,
0.1151350736618042,
-0.07525362819433212,
-0.006661958992481232,
-0.02711421065032482,
-0.07187265157699585,
-0.04935802146792412,
0.05072873458266258,
0.0013195810606703162,
0.07402697205543518,
-0.17393003404140472,
-0.028222842141985893,
0.1417880803346634,
-0.14306537806987762,
0.04109632596373558,
0.14123377203941345,
0.015080549754202366,
0.004092665389180183,
0.13263969123363495,
0.11340143531560898,
0.17020194232463837,
-0.12335120886564255,
-0.06412967294454575,
0.0018372070044279099,
-0.032944682985544205,
0.05900487303733826,
0.0407794751226902,
-0.0073601058684289455,
0.09095627069473267,
0.04521762207150459,
0.009426462464034557,
0.022795652970671654,
0.0386730395257473,
-0.021873140707612038,
-0.006592024117708206,
-0.024910712614655495,
-0.004523947834968567,
0.036533284932374954,
-0.04791133478283882,
-0.03944604843854904,
-0.09118659049272537,
0.07751038670539856,
0.10942360013723373,
-0.0418989397585392,
0.03315546736121178,
-0.07041851431131363,
-0.02286585606634617,
0.009751184843480587,
0.0019668142776936293,
-0.11190291494131088,
-0.04579504206776619,
0.04627826809883118,
-0.12929825484752655,
0.07301092892885208,
0.054888591170310974,
0.03660829737782478,
0.0719652846455574,
-0.016242740675807,
0.01578412391245365,
-0.030603118240833282,
0.0015486478805541992,
-0.021183989942073822,
-0.03719048574566841,
-0.027448566630482674,
-0.037148769944906235,
0.028130074962973595,
-0.10822416096925735,
0.0015185835072770715,
0.008852694183588028,
0.08869853615760803,
0.01412180531769991,
-0.024355031549930573,
0.016014395281672478,
0.014259892515838146,
0.0100739486515522,
-0.04136030003428459,
-0.00638921745121479,
-0.001843442558310926,
-0.0006580464541912079,
0.10725913196802139,
-0.14937596023082733,
-0.10943901538848877,
0.054853785783052444,
0.10564041137695312,
-0.013185436837375164,
-0.0027868885081261396,
-0.038119301199913025,
-0.03295012190937996,
-0.05370565131306648,
-0.1174565926194191,
0.2050842046737671,
0.015831485390663147,
0.060475338250398636,
-0.0889221653342247,
-0.02486799657344818,
0.007388660684227943,
-0.0008354944293387234,
-0.0063574109226465225,
0.08234357833862305,
0.010048775933682919,
-0.06351929903030396,
-0.014198740012943745,
-0.04787725210189819,
0.02790415845811367,
0.16964773833751678,
-0.021508963778614998,
-0.10722240805625916,
0.003961676266044378,
-0.013383145444095135,
-0.012975757010281086,
0.09238606691360474,
0.008721154183149338,
-0.00505990581586957,
0.031140172854065895,
0.026048585772514343,
0.054733723402023315,
-0.054619211703538895,
0.0842786431312561,
0.02590578980743885,
-0.04710424318909645,
0.03870037570595741,
-0.02515474520623684,
-0.006976984441280365,
0.0405050627887249,
0.005498888436704874,
0.022333582863211632,
-0.044265683740377426,
-0.043198417872190475,
-0.08687541633844376,
0.10283917188644409,
-0.10409072786569595,
-0.21965236961841583,
-0.1659354865550995,
0.07346221804618835,
-0.023266231641173363,
-0.014174141921103,
0.035258736461400986,
-0.05420442298054695,
-0.10379225760698318,
-0.1337755173444748,
0.0421854667365551,
0.01956932246685028,
-0.07287488132715225,
-0.05041934922337532,
0.028165118768811226,
0.009453055448830128,
-0.12478236109018326,
0.0027940552681684494,
0.005860915873199701,
0.012749065645039082,
-0.01719336397945881,
0.015236203558743,
0.031876951456069946,
0.07406831532716751,
0.009132600389420986,
-0.0558936782181263,
0.0101427948102355,
0.16270941495895386,
-0.07086695730686188,
0.15033970773220062,
0.1562698483467102,
0.005631578620523214,
0.05730154737830162,
0.09364261478185654,
0.014711350202560425,
-0.024254417046904564,
0.024636538699269295,
0.01629345864057541,
-0.06223832443356514,
-0.14600996673107147,
-0.12706467509269714,
-0.0425279401242733,
-0.004692379385232925,
0.06527990102767944,
0.032293666154146194,
-0.049374427646398544,
0.017062177881598473,
-0.07859542220830917,
-0.010797430761158466,
0.054243315011262894,
0.051642727106809616,
0.13773994147777557,
0.0020953600760549307,
0.026424771174788475,
-0.07799598574638367,
-0.017226500436663628,
0.10078561305999756,
-0.010684768669307232,
0.15625354647636414,
-0.05970749631524086,
0.11622975021600723,
0.028910383582115173,
-0.010988420806825161,
0.058008670806884766,
0.05566098168492317,
0.0005455004866234958,
0.020927725359797478,
-0.0232278760522604,
-0.07865266501903534,
-0.04497179388999939,
0.08335456997156143,
0.051190122961997986,
-0.07843119651079178,
-0.0012857100227847695,
0.0037725695874542,
0.002151771215721965,
0.13667893409729004,
0.030793333426117897,
-0.11376843601465225,
-0.1319599747657776,
0.017322754487395287,
-0.10490995645523071,
-0.07242858409881592,
0.022758839651942253,
0.15094606578350067,
-0.08784168213605881,
0.02962145395576954,
0.0008819873328320682,
0.07446951419115067,
-0.07377202808856964,
0.009348523803055286,
-0.03418629243969917,
0.13505835831165314,
0.012755165807902813,
0.05442996695637703,
-0.03722630441188812,
0.04009241238236427,
0.009042722173035145,
0.11728579550981522,
-0.06465107202529907,
0.044438671320676804,
0.028123924508690834,
0.012225285172462463,
0.05873239040374756,
0.04132826253771782,
-0.15392108261585236,
0.02159232646226883,
-0.11197928339242935,
0.05467735230922699,
0.04135614261031151,
0.05001397803425789,
0.0809231624007225,
-0.008953464217483997,
-0.0015275055775418878,
-0.02577725611627102,
-0.10593116283416748,
-0.12030007690191269,
-0.16340433061122894,
0.02344527281820774,
0.0004366636276245117,
-0.00523392716422677,
-0.05475655198097229,
-0.015985265374183655,
-0.08414293080568314,
0.11709720641374588,
-0.08076038211584091,
-0.11992288380861282,
-0.07288709282875061,
-0.05552985891699791,
0.1508675068616867,
-0.044193852692842484,
0.009508131071925163,
0.03500058129429817,
0.15747155249118805,
-0.051235269755125046,
-0.061332523822784424,
-0.015845758840441704,
-0.08370790630578995,
-0.10546223074197769,
0.00861471425741911,
0.11654555797576904,
0.10216354578733444,
0.0555858351290226,
0.012048144824802876,
0.0016352894017472863,
-0.001895089983008802,
-0.09959890693426132,
-0.054681312292814255,
0.17476268112659454,
-0.010313902050256729,
0.03309483081102371,
-0.0507279634475708,
-0.0676894262433052,
-0.052058830857276917,
-0.010619651526212692,
0.04928664490580559,
0.14504148066043854,
-0.0515596829354763,
0.13446246087551117,
0.192693829536438,
-0.0651121512055397,
-0.21338246762752533,
-0.052382100373506546,
0.05056386813521385,
0.05381326749920845,
0.01359261479228735,
-0.17020352184772491,
0.10854663699865341,
0.03819425776600838,
0.00042788463179022074,
0.03690018132328987,
-0.21739418804645538,
-0.10163155943155289,
0.04473710060119629,
-0.0096145523712039,
-0.02072981745004654,
-0.019876651465892792,
-0.02849278412759304,
-0.03807775676250458,
-0.03133409097790718,
0.06722855567932129,
-0.04959271475672722,
0.054221466183662415,
0.03577839955687523,
0.07713714241981506,
0.0474044494330883,
-0.023890845477581024,
0.11339205503463745,
-0.032442424446344376,
0.004955462645739317,
-0.06187007948756218,
0.09500572830438614,
0.01632487215101719,
-0.06547483056783676,
0.14617858827114105,
-0.023286888375878334,
0.012256506830453873,
-0.10549134016036987,
-0.06298240274190903,
-0.07537183910608292,
0.0745459794998169,
-0.017882883548736572,
-0.040177371352910995,
-0.08476237207651138,
0.07771856337785721,
0.09474539756774902,
0.005961759015917778,
-0.07780201733112335,
-0.07052728533744812,
-0.07399367541074753,
0.12443294376134872,
0.17438654601573944,
-0.050500378012657166,
-0.05015835538506508,
0.012608158402144909,
-0.00040455162525177,
0.05465744435787201,
-0.04949146509170532,
0.016545670107007027,
0.09414318948984146,
-0.019306370988488197,
0.03281625360250473,
-0.03229619190096855,
-0.1326659470796585,
-0.01768515817821026,
0.022716455161571503,
-0.039342429488897324,
-0.16728843748569489,
-0.031687237322330475,
0.03801671043038368,
-0.05294129252433777,
-0.044117242097854614,
0.13111990690231323,
-0.08157230168581009,
-0.0012005154276266694,
0.013236693106591702,
0.05600733682513237,
0.02499464713037014,
0.09346991032361984,
0.026659101247787476,
0.023409834131598473,
-0.07067522406578064,
0.09815231710672379,
0.03496327996253967,
-0.12817589938640594,
0.047648947685956955,
0.1502082347869873,
-0.09159968048334122,
-0.05008998513221741,
-0.11932400614023209,
-0.032365161925554276,
-0.01244290079921484,
-0.10346478223800659,
0.007711347192525864,
-0.07365702092647552,
0.014432557858526707,
0.0091245761141181,
0.008175571449100971,
-0.02214003913104534,
-0.01989944465458393,
0.03948504105210304,
-0.10130547732114792,
0.08423126488924026,
0.008853754960000515,
0.025744909420609474,
-0.04087499529123306,
0.09207773208618164,
-0.002216203138232231,
0.012180165387690067,
-0.024727383628487587,
-0.01848761737346649,
-0.009370808489620686,
-0.03160339593887329,
-0.1387440711259842,
0.014791886322200298,
-0.08791568130254745,
0.007802583277225494,
-0.00317222042940557,
0.03427378088235855,
-0.01833273470401764,
0.0523056797683239,
-0.0354597233235836,
-0.020654622465372086,
-0.05594886466860771,
0.03145807236433029,
-0.05038841441273689,
0.00910224299877882,
0.046396151185035706,
-0.07415524870157242,
0.0487644225358963,
0.004426298197358847,
-0.05110938847064972,
0.048794638365507126,
-0.013280510902404785,
0.005640966352075338,
0.018198734149336815,
0.06447144597768784,
0.004770229104906321,
-0.05151667073369026,
-0.004744712263345718,
0.029938340187072754,
-0.00018579575407784432,
-0.03939804434776306,
0.03659697622060776,
-0.0551922507584095,
0.07752563059329987,
0.02746741659939289,
-0.015197635628283024,
-0.05802598595619202,
0.02963772974908352,
0.02814788557589054,
0.01871461607515812,
0.08735248446464539,
-0.061210278421640396,
0.015165169723331928,
-0.09669142961502075,
-0.004097163677215576,
0.006682078819721937,
0.0063231936655938625,
0.08226530998945236,
-0.016139281913638115,
0.02895633690059185,
0.002758507849648595,
0.1819487363100052,
-0.018903326243162155,
0.023826882243156433,
0.053360819816589355,
-0.09414822608232498,
-0.10512331128120422,
0.01768769510090351,
0.06145940348505974,
0.013478961773216724,
-0.008507783524692059,
-0.04033714160323143,
-0.02806883491575718,
-0.01580694690346718,
-0.0008854120969772339,
0.08684146404266357,
0.10731986910104752,
0.09825684875249863,
0.0889103040099144,
0.01407468318939209,
-0.04119017347693443,
-0.10962633043527603,
0.07789019495248795,
-0.031503383070230484,
0.07799782603979111,
-0.04238221049308777,
0.08439106494188309,
0.12369954586029053,
-0.06412716209888458,
0.0944921001791954,
-0.00461883470416069,
-0.04585379362106323,
-0.0711073949933052,
-0.13473056256771088,
-0.04483099654316902,
-0.038716237992048264,
-0.026582792401313782,
-0.09069667011499405,
0.0348714143037796,
0.01204677950590849,
0.019142286852002144,
-0.026215462014079094,
0.11082395911216736,
-0.06747175008058548,
-0.09304589778184891,
0.05668144300580025,
-0.02135493792593479,
0.030029000714421272,
0.11332351714372635,
-0.001903074444271624,
0.0538879819214344,
0.0626368597149849,
0.061253368854522705,
0.05991177633404732,
0.017391294240951538,
-0.006223815958946943,
0.0012516319984570146,
-0.011559360660612583,
-0.009327809326350689,
-0.002457044320181012,
-0.005853129085153341,
0.10366081446409225,
0.07637200504541397,
-0.08245780318975449,
-0.005937032867223024,
0.10046041011810303,
-0.04660309478640556,
-0.15016955137252808,
-0.13030411303043365,
0.12804454565048218,
-0.0013114610919728875,
0.02104349248111248,
-0.00456608459353447,
-0.07677078992128372,
-0.018956074491143227,
0.12963031232357025,
0.1300942301750183,
0.03230154141783714,
-0.007181910332292318,
-0.03572767227888107,
-0.011372965760529041,
-0.06597274541854858,
0.11057919263839722,
0.005265643820166588,
0.2758779525756836,
-0.005390087608247995,
0.05360968038439751,
-0.02204703725874424,
-0.0230290275067091,
-0.11652066558599472,
0.07801657915115356,
-0.05089867115020752,
-0.0024473480880260468,
-0.02735804207623005,
0.07720375806093216,
-0.04201756790280342,
-0.24628931283950806,
-0.023046985268592834,
0.0037102538626641035,
-0.0638623759150505,
0.028440719470381737,
-0.012544691562652588,
0.021744966506958008,
0.06285843998193741,
-0.005074951332062483,
0.004013536497950554,
0.14971476793289185,
-0.015112691558897495,
-0.07112941890954971,
0.03717851638793945,
0.04316838085651398,
-0.04310224950313568,
0.16356422007083893,
0.034751567989587784,
0.05759380757808685,
0.057338353246450424,
-0.01089188177138567,
-0.1154375895857811,
0.060483336448669434,
0.02838708646595478,
-0.11712319403886795,
0.026896247640252113,
0.14681857824325562,
-0.011345725506544113,
0.07475735992193222,
0.047464653849601746,
-0.038985710591077805,
0.0022469956893473864,
0.08736366778612137,
0.006847434211522341,
-0.051591675728559494,
0.05323134362697601,
-0.09291020035743713,
0.1258869767189026,
0.10196837037801743,
0.012543381191790104,
-0.02674541063606739,
-0.04791439697146416,
0.006477579474449158,
-0.01092036347836256,
0.09372072666883469,
-0.003220735350623727,
-0.11508854478597641,
-0.027409130707383156,
-0.005490185227245092,
0.0566849410533905,
-0.1619563102722168,
-0.041261639446020126,
0.004434375092387199,
-0.021752452477812767,
0.028687670826911926,
0.09841183573007584,
0.027271397411823273,
0.007381968665868044,
-0.022927766665816307,
-0.024275457486510277,
0.005169641226530075,
0.07391040027141571,
-0.1190115287899971,
-0.05995332822203636
] |
null | null | transformers | # Finetuned Verbatim model.
This model is trained for 200 additional steps on top of the model below. As a result, it outputs only lowercase text without punctuation. It is also considerably more verbatim and will not make any attempt at correcting grammatical errors in the text.
# NB-Whisper Base Verbatim
Introducing the **_Norwegian NB-Whisper Base Verbatim model_**, proudly developed by the National Library of Norway. NB-Whisper is a cutting-edge series of models designed for automatic speech recognition (ASR) and speech translation. These models are based on the work of [OpenAI's Whisper](https://arxiv.org/abs/2212.04356). Each model in the series has been trained for 250,000 steps, utilizing a diverse dataset of 8 million samples. These samples consist of aligned audio clips, each 30 seconds long, culminating in a staggering 66,000 hours of speech. For an in-depth understanding of our training methodology and dataset composition, keep an eye out for our upcoming article.
| Model Size | Parameters | Model |
|------------|------------|------------|
| Tiny | 39M | [NB-Whisper Tiny](https://huggingface.co/NbAiLab/nb-whisper-tiny) |
| Base | 74M | [NB-Whisper Base](https://huggingface.co/NbAiLab/nb-whisper-base) |
| Small | 244M | [NB-Whisper Small](https://huggingface.co/NbAiLab/nb-whisper-small) |
| Medium | 769M | [NB-Whisper Medium](https://huggingface.co/NbAiLab/nb-whisper-medium) |
| Large | 1550M | [NB-Whisper Large](https://huggingface.co/NbAiLab/nb-whisper-large) |
### Verbatim Model
While the main models are suitable for most transcription tasks, we demonstrate how easy it is to change the output of the main model. The following models are trained for 250 additional steps from the main models above and might be suitable for more targeted use cases:
- **Verbatim version**: This lower-cased variant is more literal and suitable for tasks requiring detailed transcription, such as linguistic analysis.
| Model Size | Parameters | Semantic version |
|------------|------------|------------------|
| Tiny | 39M | [Tiny - semantic](https://huggingface.co/NbAiLab/nb-whisper-tiny-semantic) |
| Base | 74M | [Base - semantic](https://huggingface.co/NbAiLab/nb-whisper-base-semantic) |
| Small | 244M | [Small - semantic](https://huggingface.co/NbAiLab/nb-whisper-small-semantic) |
| Medium | 769M | [Medium - semantic](https://huggingface.co/NbAiLab/nb-whisper-medium-semantic) |
| Large | 1550M | [Large - semantic](https://huggingface.co/NbAiLab/nb-whisper-large-semantic) |
### Model Description
- **Developed by:** [NB AI-Lab](https://ai.nb.no/)
- **Shared by:** [NB AI-Lab](https://ai.nb.no/)
- **Model type:** `whisper`
- **Language(s) (NLP):** Norwegian, Norwegian Bokmål, Norwegian Nynorsk, English
- **License:** [Apache 2.0](https://www.apache.org/licenses/LICENSE-2.0)
- **Trained from model:** [openai/whisper-base](https://huggingface.co/openai/whisper-base)
- **Code Repository:** https://github.com/NbAiLab/nb-whisper/
- **Paper:** _Coming soon_
- **Demo:** _See Spaces on this page_
## How to Use the Models
### Online Demos
You can try the models directly through the HuggingFace Inference API, accessible on the right side of this page. Be aware that initially, the model needs to load and will run on limited CPU capacity, which might be slow. To enhance your experience, we are temporarily hosting some models on TPUs for a few days, significantly boosting their performance. Explore these under the **Spaces** section on the [Main Page](https://huggingface.co/NbAiLab/).
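
If you would rather call the hosted Inference API programmatically, the sketch below shows one way to do it with Python's `requests` library. This is a minimal example following general HuggingFace Inference API conventions rather than anything specific to this model card; the endpoint URL pattern and the `<HF_TOKEN>` placeholder are assumptions you should adapt, and the first request may be slow while the model loads.

```python
import requests

# Standard HuggingFace Inference API endpoint pattern (assumption; adapt as needed).
API_URL = "https://api-inference.huggingface.co/models/NbAiLabBeta/nb-whisper-base-verbatim"
headers = {"Authorization": "Bearer <HF_TOKEN>"}  # replace with your own access token

# Send the raw audio bytes; the API handles decoding.
with open("king.mp3", "rb") as f:
    audio_bytes = f.read()

response = requests.post(API_URL, headers=headers, data=audio_bytes)
print(response.json())  # typically something like {"text": "..."}
```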
### Local Setup with HuggingFace
Alternatively, you can run the models locally. The Tiny, Base, and Small models are optimized for CPU execution. For the Medium and Large models, we recommend a system equipped with a GPU to ensure efficient processing. Setting up and using these models with HuggingFace's Transformers is straightforward, provided you have [Python](https://www.python.org/downloads/) installed on your machine. For practical demonstrations, refer to examples using this [sample mp3 file](https://github.com/NbAiLab/nb-whisper/raw/main/audio/king.mp3).
```bash
# Download the sample file
$ wget -N https://github.com/NbAiLab/nb-whisper/raw/main/audio/king.mp3
# Install necessary libraries.
$ pip install 'transformers>=4.35.2'
```
After this is done, you should be able to run this in Python:
```python
from transformers import pipeline
# Load the model
asr = pipeline("automatic-speech-recognition", "NbAiLabBeta/nb-whisper-base-verbatim")
# Transcribe the sample file
asr("king.mp3", generate_kwargs={'task': 'transcribe', 'language': 'no'})
```
<details>
<summary>Expected output</summary>
```json
{
{'text': ' Nordmenn er nordlendinger, trøndere, sørlendinger og folk fra alle andre regioner. Nordmenn er også innvandret fra Afghanistan, Pakistan, Polen, Sverige, Somalia og Syria. Det er ikke alltid så lett å si hvor vi er fra, hvilken nasjonalitet vi er fra. Hvilken nasjonalitet vi er fra. Hvilken nasjonalitet vi er fra. Hvilken nasjonalitet vi er fra. Hvilken nasjonalitet vi er fra. Hvilken nasjonalitet vi er fra. Hvilken nasjonalitet vi er fra.'}
}
```
</details>
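
The pipeline above runs on CPU by default, which is fine for the Tiny, Base, and Small models. For the Medium and Large models you will likely want GPU inference. The snippet below is a sketch assuming a CUDA-capable GPU and a PyTorch installation; `device` and `torch_dtype` are standard `transformers` pipeline options.

```python
import torch
from transformers import pipeline

# Sketch: run the pipeline on the first GPU in half precision.
# device=-1 would fall back to CPU; float16 roughly halves memory use.
asr = pipeline(
    "automatic-speech-recognition",
    "NbAiLabBeta/nb-whisper-base-verbatim",
    device=0,
    torch_dtype=torch.float16,
)

asr("king.mp3", generate_kwargs={"task": "transcribe", "language": "no"})
```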
#### Extended HuggingFace
Examining the output above, we see that there are multiple repetitions at the end. This is because the video is longer than 30 seconds. By passing the ```chunk_length_s``` argument, we can transcribe longer files. Our experience is that we get slightly better results by setting it to 28 seconds instead of the default 30 seconds. We also recommend setting the beam size to 5 if possible. This greatly increases the accuracy but takes a bit longer and requires slightly more memory. The examples below also illustrate how to transcribe to English or Nynorsk, and how to get timestamps for sentences and words.
```python
# Long Transcripts
asr("king.mp3", chunk_length_s=28, generate_kwargs={'task': 'transcribe', 'language': 'no'})
# Increase accuracy by setting beam size to 5
asr("king.mp3", chunk_length_s=28, return_timestamps=True, generate_kwargs={'num_beams': 5, 'task': 'transcribe', 'language': 'no'})
# Return Timestamps
asr("king.mp3", chunk_length_s=28, return_timestamps=True, generate_kwargs={'task': 'transcribe', 'language': 'no'})
# Return Word Level Timestamps
asr("king.mp3", chunk_length_s=28, return_timestamps="word", generate_kwargs={'task': 'transcribe', 'language': 'no'})
# Transcribe to Nynorsk
asr("king.mp3", chunk_length_s=28, generate_kwargs={'task': 'transcribe', 'language': 'nn'})
# Transcribe to English
asr("king.mp3", chunk_length_s=28, generate_kwargs={'task': 'transcribe', 'language': 'en'})
```
<details>
<summary>Expected output</summary>
Long transcripts:
```json
{
{'text': ' Nordmenn er nordlendinger, trøndere, sørlendinger og folk fra alle andre regioner. Nordmenn er også innvandret fra Afghanistan, Pakistan, Polen, Sverige, Somalia og Syria. Det er ikke alltid så lett å si hvor vi er fra, hvilken nasjonalitet vi er fra. Hvilken nasjonalitet vi er fra. Hvilken nasjonalitet vi er fra. Hvilken nasjonalitet vi er fra. Hvilken nasjonalitet vi er fra. Hvilken nasjonalitet vi er fra, hvilken nasjonalitet vi tilhører. Det vi kaller hjem, er der hjertet vårt er, og det kan ikke alltid plasseres innenfor landegrenser. Nordmenn er jenter som er glad i jenter, gutter som er glad i gutter, og jenter og gutter som er glad i hverandre. Nordmenn trommer på Gud, Allah, Altet og ingenting. Nordmenn liker Grieg, Kygo, Helbilis og Kari Bremnes. Med andre ord, Norge er dere. Norge er oss. Mitt største håp for Norge er at vi skal klare å ta vare på hverandre, at vi skal bygge dette landet videre på tillit, fellesskap og raushet.'}
}
```
Timestamps:
```json
{
{'text': ' Nordmenn er nordlendinger, trøndere, sørlendinger og folk fra alle andre regioner. Nordmenn er også innvandret fra Afghanistan, Pakistan, Polen, Sverige, Somalia og Syria. Det er ikke alltid så lett å si hvor vi er fra, hvilken nasjonalitet vi er fra. Hvilken nasjonalitet vi er fra. hvilken nasjonalitet vi tilhører. Det vi kaller hjem, er der hjertet vårt er, og det kan ikke alltid plasseres innenfor landegrenser. Nordmenn er jenter som er glad i jenter, gutter som er glad i gutter, og jenter og gutter som er glad i hverandre. Nordmenn trommer på Gud, Allah, Altet og ingenting. Nordmenn liker Grieg, Kygo, Helbiles og Kari Bremnes. Med andre ord, Norge er dere. Norge er oss. Mitt største håp for Norge er at vi skal klare å ta vare på hverandre, at vi skal bygge dette landet videre på tillit, fellesskap og raushet.',
'chunks': [{'timestamp': (0.0, 5.46),
'text': ' Nordmenn er nordlendinger, trøndere, sørlendinger'},
{'timestamp': (5.52, 8.68), 'text': ' og folk fra alle andre regioner.'},
{'timestamp': (8.68, 16.64),
'text': ' Nordmenn er også innvandret fra Afghanistan, Pakistan, Polen, Sverige, Somalia og Syria.'},
{'timestamp': (16.64, 13.3),
'text': ' Det er ikke alltid så lett å si hvor vi er fra, hvilken nasjonalitet vi er fra.'},
{'timestamp': (13.32, 30.28),
'text': ' Hvilken nasjonalitet vi er fra. hvilken nasjonalitet vi tilhører.'},
{'timestamp': (32.52, 39.16),
'text': ' Det vi kaller hjem, er der hjertet vårt er, og det kan ikke alltid plasseres'},
{'timestamp': (39.16, 42.0), 'text': ' innenfor landegrenser.'},
{'timestamp': (42.0, 46.74),
'text': ' Nordmenn er jenter som er glad i jenter, gutter som er glad i gutter,'},
{'timestamp': (46.74, 51.12),
'text': ' og jenter og gutter som er glad i hverandre.'},
{'timestamp': (51.16, 57.42),
'text': ' Nordmenn trommer på Gud, Allah, Altet og ingenting.'},
{'timestamp': (57.42, 64.3),
'text': ' Nordmenn liker Grieg, Kygo, Helbiles og Kari Bremnes.'},
{'timestamp': (64.34, 71.24),
'text': ' Med andre ord, Norge er dere. Norge er oss.'},
{'timestamp': (71.24, 78.04),
'text': ' Mitt største håp for Norge er at vi skal klare å ta vare på hverandre,'},
{'timestamp': (78.12, 84.68),
'text': ' at vi skal bygge dette landet videre på tillit, fellesskap og raushet.'}]}
}
```
Word Level Timestamps:
```json
{
{"text": "Nordmenn er nordlendinger, trøndere, sørlendinger og folk fra alle andre regioner. Nordmenn er også innvandret fra Afghanistan, Pakistan, Polen, Sverige, Somalia og Syria. Det er ikke alltid så lett å si hvor vi er fra, hvilken nasjonalitet vi tilhører. Det vi kaller hjem, er der hjertet vårt er, og det kan ikke alltid plasseres innenfor landegrenser. Nordmenn er jenter som er glad i jenter, gutter som er glad i gutter, og jenter og gutter som er glad i hverandre. Nordmenn trommer på Gud, Allah, Altet og ingenting. Nordmenn liker Grieg, Kygo, Helbilis og Kari Bremnes. Med andre ord, Norge er dere. Norge er oss. Mitt største håp for Norge er at vi skal klare å ta vare på hverandre, at vi skal bygge dette landet videre på tillit, fellesskap og raushet.",
"chunks": [
{"text": "Nordmenn", "timestamp": [0.72, 1.42]},
{"text": "er", "timestamp": [1.42, 1.74]},
// ... more chunks ...
{"text": "raushet.", "timestamp": [83.1, 84.88]}
]
}
}
```
Nynorsk:
```json
{
{"text": "Nordmenn er nordlendingar, trøndarar, sørlendingar og folk frå alle andre regionar. Nordmenn er også innvandra frå Afghanistan, Pakistan, Polen, Sverige, Somalia og Syria. Det er ikkje alltid så lett å seie kvar vi er frå, kva nasjonalitet vi tilhøyrer. Det vi kallar heim, er der hjartet vårt er, og det kan ikkje alltid plasserast innanfor landegrenser. Nordmenn er jenter som er glad i jenter, gutar som erade i gutar, og jenter og gutar som er glade i kvarandre. Nordmenn trommar på Gud, Allah, Altet og ingenting. Nordmenn liker Grieg, Kygo, Helbiles og Kari Bremnes. Med andre ord, Noreg er dere! Noreg er oss. Mitt største håp for Noreg er at vi skal klare å ta vare på kvarandre, at vi skal byggje dette landet vidare på tillit, fellesskap og raushet."}
}
```
English:
```json
{
{"text": "Norwegians are Norwegians, trønders, southerners and people from all other regions. Norwegians are also invaded from Afghanistan, Pakistan, Poland, Sweden, Somalia and Suria. It is not always so easy to say where we are from, what nationality we belong to. What we call home is where our heart is, and it cannot always be placed within national borders. Norwegians are girls who like girls, boys who like boys, and girls and boys who like each other. Norwegians thrump on God, Allah, Altet and nothing. Norwegians like Grieg, Kygo, Helbilis and Kari Bremnes. In other words, Norway is you. Norway is us. My biggest hope for Norway is that we should be able to take care of each other, that we should build this country on trust, community and generosity."}
}
```
</details>
### Whisper CPP
Whisper CPP is a C++ implementation of the Whisper model, offering the same functionalities with the added benefits of C++ efficiency and performance optimizations. This allows embedding any Whisper model into a binary file, facilitating the development of real applications. However, it requires some familiarity with compiling C++ programs. Their [homepage](https://github.com/ggerganov/whisper.cpp) provides examples of how to build applications, including real-time transcription.
We have converted this model to the ggml-format model used by Whisper CPP binaries. The file can be downloaded [here](blob/main/ggml-model.bin), and a `q5_0` quantized version is also available [here](blob/main/ggml-model-q5_0.bin).
```bash
# We can download and compile whisper.cpp
$ git clone --depth 1 https://github.com/ggerganov/whisper.cpp --branch v1.5.1
$ cd whisper.cpp/
$ make
# We also need to convert the audio to WAV as that is the only format supported by whisper.cpp
$ wget -N https://github.com/NbAiLab/nb-whisper/raw/main/audio/king.mp3
$ ffmpeg -i king.mp3 -ar 16000 -ac 1 -c:a pcm_s16le king.wav
# Let's download the two ggml files from this site
$ wget -N https://huggingface.co/NbAiLab/nb-whisper-base/resolve/main/ggml-model.bin -O models/nb-base-ggml-model.bin
$ wget -N https://huggingface.co/NbAiLab/nb-whisper-base/resolve/main/ggml-model-q5_0.bin -O models/nb-base-ggml-model-q5_0.bin
# And run it with the f16 default model
$ ./main -l no -m models/nb-base-ggml-model.bin king.wav
# Or the quantized version
$ ./main -l no -m models/nb-base-ggml-model-q5_0.bin king.wav
```
### WhisperX and Speaker Diarization
Speaker diarization is a technique in natural language processing and automatic speech recognition that identifies and separates different speakers in an audio recording. It segments the audio into parts based on who is speaking, enhancing the quality of transcribing meetings or phone calls. We find that [WhisperX](https://github.com/m-bain/whisperX) is the easiest way to use our models for diarizing speech. In addition, WhisperX uses phoneme-based Wav2Vec models to improve the alignment of the timestamps. As of December 2023 it also has native support for using the nb-wav2vec models. It currently uses [PyAnnote-audio](https://github.com/pyannote/pyannote-audio) for doing the actual diarization. This package has a fairly strict licence where you have to agree to user terms. Follow the instructions below.
```bash
# Follow the install instructions on https://github.com/m-bain/whisperX
# Make sure you have a HuggingFace account and have agreed to the pyannote terms
# Log in (or supply HF Token in command line)
huggingface-cli login
# Download a test file
wget -N https://github.com/NbAiLab/nb-whisper/raw/main/audio/knuthamsun.mp3
# Optional. If you get complaints about missing support for Norwegian, do:
pip uninstall whisperx && pip install git+https://github.com/m-bain/whisperx.git@8540ff5985fceee764acbed94f656063d7f56540
# Transcribe the test file. All transcripts will end up in the directory of the mp3-file
whisperx knuthamsun.mp3 --model NbAiLabBeta/nb-whisper-base-verbatim --language no --diarize
```
You can also run WhisperX from Python. Please take a look at the instructions on [WhisperX homepage](https://github.com/m-bain/whisperX).
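
For reference, a minimal Python sketch along the lines of the WhisperX README is shown below. Treat it as a starting point: function names and signatures can change between WhisperX versions, the `<HF_TOKEN>` placeholder is an assumption, and the diarization step requires having accepted the PyAnnote user terms.

```python
import whisperx

device = "cuda"  # assumes a CUDA-capable GPU
audio = whisperx.load_audio("knuthamsun.mp3")

# 1. Transcribe with the NB-Whisper model
model = whisperx.load_model("NbAiLabBeta/nb-whisper-base-verbatim", device, language="no")
result = model.transcribe(audio, batch_size=16)

# 2. Align timestamps with a phoneme-based Wav2Vec model
model_a, metadata = whisperx.load_align_model(language_code="no", device=device)
result = whisperx.align(result["segments"], model_a, metadata, audio, device)

# 3. Diarize and assign speakers to words (requires an HF token and PyAnnote terms)
diarize_model = whisperx.DiarizationPipeline(use_auth_token="<HF_TOKEN>", device=device)
diarize_segments = diarize_model(audio)
result = whisperx.assign_word_speakers(diarize_segments, result)

print(result["segments"])  # segments now carry speaker labels
```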
### API
Instructions for accessing the models via a simple API are included in the demos under Spaces. Note that these demos are temporary and will only be available for a few weeks.
## Training Data
The training data originates from Språkbanken and the National Library of Norway's digital collection, including:
- NST Norwegian ASR Database (16 kHz) and its corresponding dataset
- Transcribed speeches from the Norwegian Parliament by Språkbanken
- TV broadcast (NRK) subtitles (NLN digital collection)
- Audiobooks (NLN digital collection)
## Downstream Use
The models, especially the smaller ones, may exhibit occasional hallucinations and may drop parts of the transcript. They are designed to convert spoken language into grammatically correct written sentences, which might not always be word-for-word translations. We have made two extra model variants for users who want a different transcription style. We encourage users to try the models themselves to get a better understanding.
## Bias, Risks, and Limitations
Using these models without adequate risk assessment and mitigation could be considered irresponsible. They may contain biases or other undesirable distortions. Users who deploy these models or integrate them into systems or services are responsible for mitigating risks and complying with applicable AI regulations. The National Library of Norway, as the model owner, disclaims liability for any outcomes resulting from third-party use of these models.
### Software
The model was trained using Jax/Flax and converted to PyTorch, TensorFlow, whisper.cpp, and ONNX formats. These are available under `Files and versions`. We welcome requests for conversion to other formats. All training code and scripts are released under the Apache License 2.0 in the GitHub repository [nb-whisper](https://github.com/NbAiLab/nb-whisper/).
## Citation & Contributors
The NB-Whisper Base Verbatim model is a product of the NoSTram project led by Per Egil Kummervold ([@pere](https://huggingface.co/pere)) at the National Library of Norway. Key contributors include Javier de la Rosa ([@versae](https://huggingface.co/versae)), Freddy Wetjen ([@freddyw](https://huggingface.co/freddyw)), and Rolv-Arild Braaten ([@Rolv-Arild](https://huggingface.co/Rolv-Arild)). NB AI-Lab, under the direction of Svein Arne Brygfjeld ([@Brygfjeld](https://huggingface.co/Brygfjeld)), supported the project's successful completion. A detailed paper on our process and findings is forthcoming.
## Disclaimer
The models published in this repository are intended for a generalist purpose and are available to third parties. These models may have bias and/or any other undesirable distortions. When third parties, deploy or provide systems and/or services to other parties using any of these models (or using systems based on these models) or become users of the models, they should note that it is their responsibility to mitigate the risks arising from their use and, in any event, to comply with applicable regulations, including regulations regarding the use of artificial intelligence. In no event shall the owner of the models (The National Library of Norway) be liable for any results arising from the use made by third parties of these models.
## Acknowledgements
Our gratitude extends to [Google TPU Research Cloud](https://sites.research.google/trc/about/) for training resources, Google Cloud for translation credits, and HuggingFace's Sanchit Gandhi for technical support. A special thank you to Per Erik Solberg at Språkbanken for the collaboration on the Stortinget corpus.
## Contact
For feedback, technical concerns, or collaboration inquiries, please contact <a rel="noopener nofollow" href="mailto:[email protected]">[email protected]</a>. If you plan to include this model in your research, contact us for the latest information on our upcoming paper for citation purposes.
| {"language": ["no", "nb", "nn", "en"], "license": "apache-2.0", "library_name": "transformers", "tags": ["audio", "asr", "automatic-speech-recognition", "hf-asr-leaderboard"], "datasets": ["NbAiLab/ncc_speech", "NbAiLab/NST", "NbAiLab/NPSC"], "metrics": ["wer", "cer"], "base_model": "openai/whisper-base", "pipeline_tag": "automatic-speech-recognition", "widget": [{"src": "https://datasets-server.huggingface.co/assets/google/fleurs/--/nb_no/train/1/audio/audio.mp3", "example_title": "FLEURS sample 1"}, {"src": "https://datasets-server.huggingface.co/assets/google/fleurs/--/nb_no/train/4/audio/audio.mp3", "example_title": "FLEURS sample 2"}]} | automatic-speech-recognition | NbAiLab/nb-whisper-base-verbatim | [
"transformers",
"pytorch",
"jax",
"tensorboard",
"onnx",
"safetensors",
"whisper",
"automatic-speech-recognition",
"audio",
"asr",
"hf-asr-leaderboard",
"no",
"nb",
"nn",
"en",
"dataset:NbAiLab/ncc_speech",
"dataset:NbAiLab/NST",
"dataset:NbAiLab/NPSC",
"arxiv:2212.04356",
"base_model:openai/whisper-base",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] | 2024-02-13T10:08:23+00:00 | [
"2212.04356"
] | [
"no",
"nb",
"nn",
"en"
] | TAGS
#transformers #pytorch #jax #tensorboard #onnx #safetensors #whisper #automatic-speech-recognition #audio #asr #hf-asr-leaderboard #no #nb #nn #en #dataset-NbAiLab/ncc_speech #dataset-NbAiLab/NST #dataset-NbAiLab/NPSC #arxiv-2212.04356 #base_model-openai/whisper-base #license-apache-2.0 #endpoints_compatible #region-us
| Finetuned Verbatim model.
=========================
This model is trained for 200 additional steps on top of the model below. As a result, it outputs only lowercase text without punctuation. It is also considerably more verbatim and will not make any attempt at correcting grammatical errors in the text.
NB-Whisper Base Verbatim
========================
Introducing the *Norwegian NB-Whisper Base Verbatim model*, proudly developed by the National Library of Norway. NB-Whisper is a cutting-edge series of models designed for automatic speech recognition (ASR) and speech translation. These models are based on the work of OpenAI's Whisper. Each model in the series has been trained for 250,000 steps, utilizing a diverse dataset of 8 million samples. These samples consist of aligned audio clips, each 30 seconds long, culminating in a staggering 66,000 hours of speech. For an in-depth understanding of our training methodology and dataset composition, keep an eye out for our upcoming article.
Model Size: Tiny, Parameters: 39M, Model: NB-Whisper Tiny
Model Size: Base, Parameters: 74M, Model: NB-Whisper Base
Model Size: Small, Parameters: 244M, Model: NB-Whisper Small
Model Size: Medium, Parameters: 769M, Model: NB-Whisper Medium
Model Size: Large, Parameters: 1550M, Model: NB-Whisper Large
### Verbatim Model
While the main models are suitable for most transcription tasks, we demonstrate how easy it is to change the output of the main model. The following models are trained for 250 additional steps from the main models above and might be suitable for more targeted use cases:
* Verbatim version: This lower-cased variant is more literal and suitable for tasks requiring detailed transcription, such as linguistic analysis.
Model Size: Tiny, Parameters: 39M, Semantic version: Tiny - semantic
Model Size: Base, Parameters: 74M, Semantic version: Base - semantic
Model Size: Small, Parameters: 244M, Semantic version: Small - semantic
Model Size: Medium, Parameters: 769M, Semantic version: Medium - semantic
Model Size: Large, Parameters: 1550M, Semantic version: Large - semantic
### Model Description
* Developed by: NB AI-Lab
* Shared by: NB AI-Lab
* Model type: 'whisper'
* Language(s) (NLP): Norwegian, Norwegian Bokmål, Norwegian Nynorsk, English
* License: Apache 2.0
* Trained from model: openai/whisper-base
* Code Repository: URL
* Paper: *Coming soon*
* Demo: *See Spaces on this page*
How to Use the Models
---------------------
### Online Demos
You can try the models directly through the HuggingFace Inference API, accessible on the right side of this page. Be aware that initially, the model needs to load and will run on limited CPU capacity, which might be slow. To enhance your experience, we are temporarily hosting some models on TPUs for a few days, significantly boosting their performance. Explore these under the Spaces section on the Main Page.
### Local Setup with HuggingFace
Alternatively, you can run the models locally. The Tiny, Base, and Small models are optimized for CPU execution. For the Medium and Large models, we recommend a system equipped with a GPU to ensure efficient processing. Setting up and using these models with HuggingFace's Transformers is straightforward, provided you have Python installed on your machine. For practical demonstrations, refer to examples using this sample mp3 file.
After this is done, you should be able to run this in Python:
Expected output
#### Extended HuggingFace
Examining the output above, we see that there are multiple repetitions at the end. This is because the video is longer than 30 seconds. By passing the 'chunk_length_s' argument, we can transcribe longer files. Our experience is that we get slightly better results by setting it to 28 seconds instead of the default 30 seconds. We also recommend setting the beam size to 5 if possible. This greatly increases the accuracy but takes a bit longer and requires slightly more memory. The examples below also illustrate how to transcribe to English or Nynorsk, and how to get timestamps for sentences and words.
Expected output
Long transcripts:
Timestamps:
Word Level Timestamps:
Nynorsk:
English:
### Whisper CPP
Whisper CPP is a C++ implementation of the Whisper model, offering the same functionalities with the added benefits of C++ efficiency and performance optimizations. This allows embedding any Whisper model into a binary file, facilitating the development of real applications. However, it requires some familiarity with compiling C++ programs. Their homepage provides examples of how to build applications, including real-time transcription.
We have converted this model to the ggml-format model used by Whisper CPP binaries. The file can be downloaded here, and a 'q5\_0' quantized version is also available here.
### WhisperX and Speaker Diarization
Speaker diarization is a technique in natural language processing and automatic speech recognition that identifies and separates different speakers in an audio recording. It segments the audio into parts based on who is speaking, enhancing the quality of transcribing meetings or phone calls. We find that WhisperX is the easiest way to use our models for diarizing speech. In addition, WhisperX uses phoneme-based Wav2Vec models to improve the alignment of the timestamps. As of December 2023 it also has native support for using the nb-wav2vec models. It currently uses PyAnnote-audio for doing the actual diarization. This package has a fairly strict licence where you have to agree to user terms. Follow the instructions below.
You can also run WhisperX from Python. Please take a look at the instructions on WhisperX homepage.
### API
Instructions for accessing the models via a simple API are included in the demos under Spaces. Note that these demos are temporary and will only be available for a few weeks.
Training Data
-------------
The training data originates from Språkbanken and the National Library of Norway's digital collection, including:
* NST Norwegian ASR Database (16 kHz) and its corresponding dataset
* Transcribed speeches from the Norwegian Parliament by Språkbanken
* TV broadcast (NRK) subtitles (NLN digital collection)
* Audiobooks (NLN digital collection)
Downstream Use
--------------
The models, especially the smaller ones, may exhibit occasional hallucinations and may drop parts of the transcript. They are designed to convert spoken language into grammatically correct written sentences, which might not always be word-for-word translations. We have made two extra model variants for users who want a different transcription style. We encourage users to try the models themselves to get a better understanding.
Bias, Risks, and Limitations
----------------------------
Using these models without adequate risk assessment and mitigation could be considered irresponsible. They may contain biases or other undesirable distortions. Users who deploy these models or integrate them into systems or services are responsible for mitigating risks and complying with applicable AI regulations. The National Library of Norway, as the model owner, disclaims liability for any outcomes resulting from third-party use of these models.
### Software
The model was trained using Jax/Flax and converted to PyTorch, TensorFlow, URL, and ONNX formats. These are available under 'Files and versions'. We welcome requests for conversion to other formats. All training code and scripts are released under the Apache License 2.0 in the GitHub repository nb-whisper.
Citation & Contributors
The NB-Whisper Base Verbatim model is a product of the NoSTram project led by Per Egil Kummervold (@pere) at the National Library of Norway. Key contributors include Javier de la Rosa (@versae), Freddy Wetjen (@freddyw), and Rolv-Arild Braaten (@Rolv-Arild). NB AI-Lab, under the direction of Svein Arne Brygfjeld (@Brygfjeld), supported the project's successful completion. A detailed paper on our process and findings is forthcoming.
Disclaimer
----------
The models published in this repository are intended for a generalist purpose and are available to third parties. These models may have bias and/or any other undesirable distortions. When third parties, deploy or provide systems and/or services to other parties using any of these models (or using systems based on these models) or become users of the models, they should note that it is their responsibility to mitigate the risks arising from their use and, in any event, to comply with applicable regulations, including regulations regarding the use of artificial intelligence. In no event shall the owner of the models (The National Library of Norway) be liable for any results arising from the use made by third parties of these models.
Acknowledgements
----------------
Our gratitude extends to Google TPU Research Cloud for training resources, Google Cloud for translation credits, and HuggingFace's Sanchit Gandhi for technical support. A special thank you to Per Erik Solberg at Språkbanken for the collaboration on the Stortinget corpus.
Contact
-------
For feedback, technical concerns, or collaboration inquiries, please contact [ailab@URL](mailto:ailab@URL). If you plan to include this model in your research, contact us for the latest information on our upcoming paper for citation purposes.
null | null | transformers | # Finetuned Verbatim Model
This model is trained 200 additional steps on top of the model below. This makes it output only lowercase text without punctuation. It is also considerably more verbatim, and it will not make any attempt at correcting grammatical errors in the text.
# NB-Whisper Tiny Verbatim
Introducing the **_Norwegian NB-Whisper Tiny Verbatim model_**, proudly developed by the National Library of Norway. NB-Whisper is a cutting-edge series of models designed for automatic speech recognition (ASR) and speech translation. These models are based on the work of [OpenAI's Whisper](https://arxiv.org/abs/2212.04356). Each model in the series has been trained for 250,000 steps, utilizing a diverse dataset of 8 million samples. These samples consist of aligned audio clips, each 30 seconds long, culminating in a staggering 66,000 hours of speech. For an in-depth understanding of our training methodology and dataset composition, keep an eye out for our upcoming article.
| Model Size | Parameters | Model |
|------------|------------|------------|
| Tiny | 39M | [NB-Whisper Tiny](https://huggingface.co/NbAiLab/nb-whisper-tiny) |
| Base | 74M | [NB-Whisper Base](https://huggingface.co/NbAiLab/nb-whisper-base) |
| Small | 244M | [NB-Whisper Small](https://huggingface.co/NbAiLab/nb-whisper-small) |
| Medium | 769M | [NB-Whisper Medium](https://huggingface.co/NbAiLab/nb-whisper-medium) |
| Large | 1550M | [NB-Whisper Large](https://huggingface.co/NbAiLab/nb-whisper-large) |
### Verbatim Model
While the main models are suitable for most transcription tasks, we demonstrate how easy it is to change the output of the main model. The following models are trained 250 additional steps from the main models above, and might be suitable for more targeted use cases:
- **Verbatim version**: This lower-cased variant is more literal and suitable for tasks requiring detailed transcription, such as linguistic analysis.
| Model Size | Parameters | Semantic version |
|------------|------------|------------------|
| Tiny | 39M | [Tiny - semantic](https://huggingface.co/NbAiLab/nb-whisper-tiny-semantic) |
| Base | 74M | [Base - semantic](https://huggingface.co/NbAiLab/nb-whisper-base-semantic) |
| Small | 244M | [Small - semantic](https://huggingface.co/NbAiLab/nb-whisper-small-semantic) |
| Medium | 769M | [Medium - semantic](https://huggingface.co/NbAiLab/nb-whisper-medium-semantic) |
| Large | 1550M | [Large - semantic](https://huggingface.co/NbAiLab/nb-whisper-large-semantic) |
### Model Description
- **Developed by:** [NB AI-Lab](https://ai.nb.no/)
- **Shared by:** [NB AI-Lab](https://ai.nb.no/)
- **Model type:** `whisper`
- **Language(s) (NLP):** Norwegian, Norwegian Bokmål, Norwegian Nynorsk, English
- **License:** [Apache 2.0](https://www.apache.org/licenses/LICENSE-2.0)
- **Trained from model:** [openai/whisper-tiny](https://huggingface.co/openai/whisper-tiny)
- **Code Repository:** https://github.com/NbAiLab/nb-whisper/
- **Paper:** _Coming soon_
- **Demo:** _See Spaces on this page_
## How to Use the Models
### Online Demos
You can try the models directly through the HuggingFace Inference API, accessible on the right side of this page. Be aware that initially, the model needs to load and will run on limited CPU capacity, which might be slow. To enhance your experience, we are temporarily hosting some models on TPUs for a few days, significantly boosting their performance. Explore these under the **Spaces** section on the [Main Page](https://huggingface.co/NbAiLab/).
### Local Setup with HuggingFace
Alternatively, you can run the models locally. The Tiny, Base, and Small models are optimized for CPU execution. For the Medium and Large models, we recommend a system equipped with a GPU to ensure efficient processing. Setting up and using these models with HuggingFace's Transformers is straightforward, provided you have [Python](https://www.python.org/downloads/) installed on your machine. For practical demonstrations, refer to examples using this [sample mp3 file](https://github.com/NbAiLab/nb-whisper/raw/main/audio/king.mp3).
```bash
# Download the sample file
$ wget -N https://github.com/NbAiLab/nb-whisper/raw/main/audio/king.mp3
# Install necessary libraries. Quote the spec so the shell does not treat '>' as a redirect.
$ pip install "transformers>=4.35.2"
```
After this is done, you should be able to run this in Python:
```python
from transformers import pipeline
# Load the model
asr = pipeline("automatic-speech-recognition", "NbAiLabBeta/nb-whisper-tiny-verbatim")
# Transcribe the file
asr("king.mp3", generate_kwargs={'task': 'transcribe', 'language': 'no'})
```
<details>
<summary>Expected output</summary>
```json
{
{'text': ' Nordmenn er nordlendinger, trøndere, sørlendinger og folk fra alle andre regioner. Nordmenn er også innvandret fra Afghanistan, Pakistan, Polen, Sverige, Somalia og Syria. Det er ikke alltid så lett å si hvor vi er fra, hvilken nasjonalitet vi er fra. Hvilken nasjonalitet vi er fra. Hvilken nasjonalitet vi er fra. Hvilken nasjonalitet vi er fra. Hvilken nasjonalitet vi er fra. Hvilken nasjonalitet vi er fra. Hvilken nasjonalitet vi er fra.'}
}
```
</details>
#### Extended HuggingFace
Examining the output above, we see that there are multiple repetitions at the end. This is because the audio is longer than 30 seconds. By passing the ```chunk_length_s``` argument, we can transcribe longer files. Our experience is that we get slightly better results by setting it to 28 seconds instead of the default 30 seconds. We also recommend setting the beam size to 5 if possible. This greatly increases the accuracy but takes a bit longer and requires slightly more memory. The examples below also illustrate how to transcribe to English or Nynorsk, and how to get timestamps for sentences and words.
```python
# Long Transcripts
asr("king.mp3", chunk_length_s=28, generate_kwargs={'task': 'transcribe', 'language': 'no'})
# Increase accuracy by setting beam size to 5
asr("king.mp3", chunk_length_s=28, return_timestamps=True, generate_kwargs={'num_beams': 5, 'task': 'transcribe', 'language': 'no'})
# Return Timestamps
asr("king.mp3", chunk_length_s=28, return_timestamps=True, generate_kwargs={'task': 'transcribe', 'language': 'no'})
# Return Word Level Timestamps
asr("king.mp3", chunk_length_s=28, return_timestamps="word", generate_kwargs={'task': 'transcribe', 'language': 'no'})
# Transcribe to Nynorsk
asr("king.mp3", chunk_length_s=28, generate_kwargs={'task': 'transcribe', 'language': 'nn'})
# Transcribe to English
asr("king.mp3", chunk_length_s=28, generate_kwargs={'task': 'transcribe', 'language': 'en'})
```
<details>
<summary>Expected output</summary>
Long transcripts:
```json
{
{'text': ' Nordmenn er nordlendinger, trøndere, sørlendinger og folk fra alle andre regioner. Nordmenn er også innvandret fra Afghanistan, Pakistan, Polen, Sverige, Somalia og Syria. Det er ikke alltid så lett å si hvor vi er fra, hvilken nasjonalitet vi er fra. Hvilken nasjonalitet vi er fra. Hvilken nasjonalitet vi er fra. Hvilken nasjonalitet vi er fra. Hvilken nasjonalitet vi er fra. Hvilken nasjonalitet vi er fra, hvilken nasjonalitet vi tilhører. Det vi kaller hjem, er der hjertet vårt er, og det kan ikke alltid plasseres innenfor landegrenser. Nordmenn er jenter som er glad i jenter, gutter som er glad i gutter, og jenter og gutter som er glad i hverandre. Nordmenn trommer på Gud, Allah, Altet og ingenting. Nordmenn liker Grieg, Kygo, Helbilis og Kari Bremnes. Med andre ord, Norge er dere. Norge er oss. Mitt største håp for Norge er at vi skal klare å ta vare på hverandre, at vi skal bygge dette landet videre på tillit, fellesskap og raushet.'}
}
```
Timestamps:
```json
{
{'text': ' Nordmenn er nordlendinger, trøndere, sørlendinger og folk fra alle andre regioner. Nordmenn er også innvandret fra Afghanistan, Pakistan, Polen, Sverige, Somalia og Syria. Det er ikke alltid så lett å si hvor vi er fra, hvilken nasjonalitet vi er fra. Hvilken nasjonalitet vi er fra. hvilken nasjonalitet vi tilhører. Det vi kaller hjem, er der hjertet vårt er, og det kan ikke alltid plasseres innenfor landegrenser. Nordmenn er jenter som er glad i jenter, gutter som er glad i gutter, og jenter og gutter som er glad i hverandre. Nordmenn trommer på Gud, Allah, Altet og ingenting. Nordmenn liker Grieg, Kygo, Helbiles og Kari Bremnes. Med andre ord, Norge er dere. Norge er oss. Mitt største håp for Norge er at vi skal klare å ta vare på hverandre, at vi skal bygge dette landet videre på tillit, fellesskap og raushet.',
'chunks': [{'timestamp': (0.0, 5.46),
'text': ' Nordmenn er nordlendinger, trøndere, sørlendinger'},
{'timestamp': (5.52, 8.68), 'text': ' og folk fra alle andre regioner.'},
{'timestamp': (8.68, 16.64),
'text': ' Nordmenn er også innvandret fra Afghanistan, Pakistan, Polen, Sverige, Somalia og Syria.'},
{'timestamp': (16.64, 13.3),
'text': ' Det er ikke alltid så lett å si hvor vi er fra, hvilken nasjonalitet vi er fra.'},
{'timestamp': (13.32, 30.28),
'text': ' Hvilken nasjonalitet vi er fra. hvilken nasjonalitet vi tilhører.'},
{'timestamp': (32.52, 39.16),
'text': ' Det vi kaller hjem, er der hjertet vårt er, og det kan ikke alltid plasseres'},
{'timestamp': (39.16, 42.0), 'text': ' innenfor landegrenser.'},
{'timestamp': (42.0, 46.74),
'text': ' Nordmenn er jenter som er glad i jenter, gutter som er glad i gutter,'},
{'timestamp': (46.74, 51.12),
'text': ' og jenter og gutter som er glad i hverandre.'},
{'timestamp': (51.16, 57.42),
'text': ' Nordmenn trommer på Gud, Allah, Altet og ingenting.'},
{'timestamp': (57.42, 64.3),
'text': ' Nordmenn liker Grieg, Kygo, Helbiles og Kari Bremnes.'},
{'timestamp': (64.34, 71.24),
'text': ' Med andre ord, Norge er dere. Norge er oss.'},
{'timestamp': (71.24, 78.04),
'text': ' Mitt største håp for Norge er at vi skal klare å ta vare på hverandre,'},
{'timestamp': (78.12, 84.68),
'text': ' at vi skal bygge dette landet videre på tillit, fellesskap og raushet.'}]}
}
```
Word Level Timestamps:
```json
{
{"text": "Nordmenn er nordlendinger, trøndere, sørlendinger og folk fra alle andre regioner. Nordmenn er også innvandret fra Afghanistan, Pakistan, Polen, Sverige, Somalia og Syria. Det er ikke alltid så lett å si hvor vi er fra, hvilken nasjonalitet vi tilhører. Det vi kaller hjem, er der hjertet vårt er, og det kan ikke alltid plasseres innenfor landegrenser. Nordmenn er jenter som er glad i jenter, gutter som er glad i gutter, og jenter og gutter som er glad i hverandre. Nordmenn trommer på Gud, Allah, Altet og ingenting. Nordmenn liker Grieg, Kygo, Helbilis og Kari Bremnes. Med andre ord, Norge er dere. Norge er oss. Mitt største håp for Norge er at vi skal klare å ta vare på hverandre, at vi skal bygge dette landet videre på tillit, fellesskap og raushet.",
"chunks": [
{"text": "Nordmenn", "timestamp": [0.72, 1.42]},
{"text": "er", "timestamp": [1.42, 1.74]},
// ... more chunks ...
{"text": "raushet.", "timestamp": [83.1, 84.88]}
]
}
}
```
Nynorsk:
```json
{
{"text": "Nordmenn er nordlendingar, trøndarar, sørlendingar og folk frå alle andre regionar. Nordmenn er også innvandra frå Afghanistan, Pakistan, Polen, Sverige, Somalia og Syria. Det er ikkje alltid så lett å seie kvar vi er frå, kva nasjonalitet vi tilhøyrer. Det vi kallar heim, er der hjartet vårt er, og det kan ikkje alltid plasserast innanfor landegrenser. Nordmenn er jenter som er glad i jenter, gutar som erade i gutar, og jenter og gutar som er glade i kvarandre. Nordmenn trommar på Gud, Allah, Altet og ingenting. Nordmenn liker Grieg, Kygo, Helbiles og Kari Bremnes. Med andre ord, Noreg er dere! Noreg er oss. Mitt største håp for Noreg er at vi skal klare å ta vare på kvarandre, at vi skal byggje dette landet vidare på tillit, fellesskap og raushet."}
}
```
English:
```json
{
{"text": "Norwegians are Norwegians, trønders, southerners and people from all other regions. Norwegians are also invaded from Afghanistan, Pakistan, Poland, Sweden, Somalia and Suria. It is not always so easy to say where we are from, what nationality we belong to. What we call home is where our heart is, and it cannot always be placed within national borders. Norwegians are girls who like girls, boys who like boys, and girls and boys who like each other. Norwegians thrump on God, Allah, Altet and nothing. Norwegians like Grieg, Kygo, Helbilis and Kari Bremnes. In other words, Norway is you. Norway is us. My biggest hope for Norway is that we should be able to take care of each other, that we should build this country on trust, community and generosity."}
}
```
</details>
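
If you need subtitle files rather than raw timestamps, the `chunks` list returned with `return_timestamps=True` is easy to post-process. The following is a small sketch of our own, not part of the pipeline API, that writes the chunks as an SRT file; the last chunk can occasionally have an open-ended timestamp, which the sketch guards against:

```python
def to_srt(chunks, path="king.srt"):
    """Write pipeline chunks ({'text': ..., 'timestamp': (start, end)}) as SRT."""
    def fmt(t):
        h, rem = divmod(t, 3600)
        m, s = divmod(rem, 60)
        ms = int((s % 1) * 1000)
        return f"{int(h):02d}:{int(m):02d}:{int(s):02d},{ms:03d}"

    with open(path, "w", encoding="utf-8") as f:
        for i, chunk in enumerate(chunks, start=1):
            start, end = chunk["timestamp"]
            end = end if end is not None else start  # guard against open-ended chunks
            f.write(f"{i}\n{fmt(start)} --> {fmt(end)}\n{chunk['text'].strip()}\n\n")

result = asr("king.mp3", chunk_length_s=28, return_timestamps=True,
             generate_kwargs={'task': 'transcribe', 'language': 'no'})
to_srt(result["chunks"])
```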
### Whisper CPP
Whisper CPP is a C++ implementation of the Whisper model, offering the same functionalities with the added benefits of C++ efficiency and performance optimizations. This allows embedding any Whisper model into a binary file, facilitating the development of real applications. However, it requires some familiarity with compiling C++ programs. Their [homepage](https://github.com/ggerganov/whisper.cpp) provides examples of how to build applications, including real-time transcription.
We have converted this model to the ggml-format model used by Whisper CPP binaries. The file can be downloaded [here](blob/main/ggml-model.bin), and a `q5_0` quantized version is also available [here](blob/main/ggml-model-q5_0.bin).
```bash
# We can download and compile whisper.cpp
$ git clone --depth 1 https://github.com/ggerganov/whisper.cpp --branch v1.5.1
$ cd whisper.cpp/
$ make
# We also need to convert the audio to WAV as that is the only format supported by whisper.cpp
$ wget -N https://github.com/NbAiLab/nb-whisper/raw/main/audio/king.mp3
$ ffmpeg -i king.mp3 -ar 16000 -ac 1 -c:a pcm_s16le king.wav
# Let's download the two ggml files from this site
$ wget -N https://huggingface.co/NbAiLab/nb-whisper-tiny/resolve/main/ggml-model.bin -O models/nb-tiny-ggml-model.bin
$ wget -N https://huggingface.co/NbAiLab/nb-whisper-tiny/resolve/main/ggml-model-q5_0.bin -O models/nb-tiny-ggml-model-q5_0.bin
# And run it with the f16 default model
$ ./main -l no -m models/nb-tiny-ggml-model.bin king.wav
# Or the quantized version
$ ./main -l no -m models/nb-tiny-ggml-model-q5_0.bin king.wav
```
### WhisperX and Speaker Diarization
Speaker diarization is a technique in natural language processing and automatic speech recognition that identifies and separates different speakers in an audio recording. It segments the audio into parts based on who is speaking, enhancing the quality of transcribing meetings or phone calls. We find that [WhisperX](https://github.com/m-bain/whisperX) is the easiest way to use our models for diarizing speech. In addition, WhisperX uses phoneme-based Wav2Vec models to improve the alignment of the timestamps. As of December 2023 it also has native support for using the nb-wav2vec-models. It currently uses [PyAnnote-audio](https://github.com/pyannote/pyannote-audio) for doing the actual diarization. This package has a fairly strict licence where you have to agree to user terms. Follow the instructions below.
```bash
# Follow the install instructions on https://github.com/m-bain/whisperX
# Make sure you have a HuggingFace account and have agreed to the pyannote terms
# Log in (or supply HF Token in command line)
huggingface-cli login
# Download a test file
wget -N https://github.com/NbAiLab/nb-whisper/raw/main/audio/knuthamsun.mp3
# Optional. If you get complaints about missing support for Norwegian, do:
pip uninstall whisperx && pip install git+https://github.com/m-bain/whisperx.git@8540ff5985fceee764acbed94f656063d7f56540
# Transcribe the test file. All transcripts will end up in the directory of the mp3-file
whisperx knuthamsun.mp3 --model NbAiLabBeta/nb-whisper-tiny-verbatim --language no --diarize
```
You can also run WhisperX from Python. Please take a look at the instructions on [WhisperX homepage](https://github.com/m-bain/whisperX).
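
As a rough outline of our own (not taken from the WhisperX documentation — function names and signatures vary between releases, and a HuggingFace checkpoint may first need conversion to the CTranslate2 format that faster-whisper expects), a Python session could look like this:

```python
import whisperx

device = "cuda"  # or "cpu"
audio = whisperx.load_audio("knuthamsun.mp3")

# 1. Transcribe. We assume the HuggingFace model id is accepted directly;
#    some WhisperX versions require a CTranslate2-converted model instead.
model = whisperx.load_model("NbAiLabBeta/nb-whisper-tiny-verbatim", device)
result = model.transcribe(audio, language="no")

# 2. Refine timestamps with a phoneme-based wav2vec alignment model.
align_model, metadata = whisperx.load_align_model(language_code="no", device=device)
result = whisperx.align(result["segments"], align_model, metadata, audio, device)

# 3. Diarize. Requires accepting the pyannote user terms and a HF token.
diarize_model = whisperx.DiarizationPipeline(use_auth_token="hf_...", device=device)
diarize_segments = diarize_model(audio)
result = whisperx.assign_word_speakers(diarize_segments, result)
print(result["segments"])
```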
### API
Instructions for accessing the models via a simple API are included in the demos under Spaces. Note that these demos are temporary and will only be available for a few weeks.
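
For reference, the standard HuggingFace Inference API pattern for ASR models is to POST the raw audio bytes to the model endpoint. A minimal sketch follows; substitute your own token, and note that the temporary demo APIs under Spaces may expose a different URL:

```python
import requests

API_URL = "https://api-inference.huggingface.co/models/NbAiLabBeta/nb-whisper-tiny-verbatim"
headers = {"Authorization": "Bearer hf_..."}  # your HuggingFace token

with open("king.mp3", "rb") as f:
    data = f.read()

response = requests.post(API_URL, headers=headers, data=data)
print(response.json())  # e.g. {'text': ' Nordmenn er nordlendinger, ...'}
```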
## Training Data
The training data originates from Språkbanken and the National Library of Norway's digital collection, including:
- NST Norwegian ASR Database (16 kHz) and its corresponding dataset
- Transcribed speeches from the Norwegian Parliament by Språkbanken
- TV broadcast (NRK) subtitles (NLN digital collection)
- Audiobooks (NLN digital collection)
## Downstream Use
The models, especially the smaller ones, may exhibit occasional hallucinations and may drop parts of the transcript. They are designed to convert spoken language into grammatically correct written sentences, which might not always be word-for-word translations. We have made two extra model variants for users who want a different transcription style. We encourage users to try the models themselves to get a better understanding.
## Bias, Risks, and Limitations
Using these models without adequate risk assessment and mitigation could be considered irresponsible. They may contain biases or other undesirable distortions. Users who deploy these models or integrate them into systems or services are responsible for mitigating risks and complying with applicable AI regulations. The National Library of Norway, as the model owner, disclaims liability for any outcomes resulting from third-party use of these models.
### Software
The model was trained using Jax/Flax and converted to PyTorch, TensorFlow, whisper.cpp, and ONNX formats. These are available under `Files and versions`. We welcome requests for conversion to other formats. All training code and scripts are released under the Apache License 2.0 in the GitHub repository [nb-whisper](https://github.com/NbAiLab/nb-whisper/).
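
As a brief illustration of using the different formats, the same checkpoint can be loaded with the framework-specific Whisper classes in Transformers, assuming the corresponding framework extras are installed; whether `from_pt=True` is needed depends on which weight files are present in the repository:

```python
from transformers import (
    WhisperForConditionalGeneration,      # PyTorch
    FlaxWhisperForConditionalGeneration,  # Jax/Flax
    TFWhisperForConditionalGeneration,    # TensorFlow
)

model_id = "NbAiLabBeta/nb-whisper-tiny-verbatim"

pt_model = WhisperForConditionalGeneration.from_pretrained(model_id)
flax_model = FlaxWhisperForConditionalGeneration.from_pretrained(model_id)
# If no native TF weights are present, convert from the PyTorch checkpoint:
tf_model = TFWhisperForConditionalGeneration.from_pretrained(model_id, from_pt=True)
```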
## Citation & Contributors
The NB-Whisper Tiny Verbatim model is a product of the NoSTram project led by Per Egil Kummervold ([@pere](https://huggingface.co/pere)) at the National Library of Norway. Key contributors include Javier de la Rosa ([@versae](https://huggingface.co/versae)), Freddy Wetjen ([@freddyw](https://huggingface.co/freddyw)), and Rolv-Arild Braaten ([@Rolv-Arild](https://huggingface.co/Rolv-Arild)). NB AI-Lab, under the direction of Svein Arne Brygfjeld ([@Brygfjeld](https://huggingface.co/Brygfjeld)), supported the project's successful completion. A detailed paper on our process and findings is forthcoming.
## Disclaimer
The models published in this repository are intended for a generalist purpose and are available to third parties. These models may have bias and/or any other undesirable distortions. When third parties deploy or provide systems and/or services to other parties using any of these models (or using systems based on these models), or become users of the models, they should note that it is their responsibility to mitigate the risks arising from their use and, in any event, to comply with applicable regulations, including regulations regarding the use of artificial intelligence. In no event shall the owner of the models (The National Library of Norway) be liable for any results arising from the use made by third parties of these models.
## Acknowledgements
Our gratitude extends to [Google TPU Research Cloud](https://sites.research.google/trc/about/) for training resources, Google Cloud for translation credits, and HuggingFace's Sanchit Gandhi for technical support. A special thank you to Per Erik Solberg at Språkbanken for the collaboration on the Stortinget corpus.
## Contact
For feedback, technical concerns, or collaboration inquiries, please contact <a rel="noopener nofollow" href="mailto:[email protected]">[email protected]</a>. If you plan to include this model in your research, contact us for the latest information on our upcoming paper for citation purposes.
| {"language": ["no", "nb", "nn", "en"], "license": "apache-2.0", "library_name": "transformers", "tags": ["audio", "asr", "automatic-speech-recognition", "hf-asr-leaderboard"], "datasets": ["NbAiLab/ncc_speech", "NbAiLab/NST", "NbAiLab/NPSC"], "metrics": ["wer", "cer"], "base_model": "openai/whisper-tiny", "pipeline_tag": "automatic-speech-recognition", "widget": [{"src": "https://datasets-server.huggingface.co/assets/google/fleurs/--/nb_no/train/1/audio/audio.mp3", "example_title": "FLEURS sample 1"}, {"src": "https://datasets-server.huggingface.co/assets/google/fleurs/--/nb_no/train/4/audio/audio.mp3", "example_title": "FLEURS sample 2"}]} | automatic-speech-recognition | NbAiLab/nb-whisper-tiny-verbatim | [
"transformers",
"pytorch",
"jax",
"tensorboard",
"onnx",
"safetensors",
"whisper",
"automatic-speech-recognition",
"audio",
"asr",
"hf-asr-leaderboard",
"no",
"nb",
"nn",
"en",
"dataset:NbAiLab/ncc_speech",
"dataset:NbAiLab/NST",
"dataset:NbAiLab/NPSC",
"arxiv:2212.04356",
"base_model:openai/whisper-tiny",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] | 2024-02-13T10:08:33+00:00 | [
"2212.04356"
] | [
"no",
"nb",
"nn",
"en"
] | TAGS
#transformers #pytorch #jax #tensorboard #onnx #safetensors #whisper #automatic-speech-recognition #audio #asr #hf-asr-leaderboard #no #nb #nn #en #dataset-NbAiLab/ncc_speech #dataset-NbAiLab/NST #dataset-NbAiLab/NPSC #arxiv-2212.04356 #base_model-openai/whisper-tiny #license-apache-2.0 #endpoints_compatible #region-us
| Finetuned Verbatim model.
=========================
This model is trained for 200 additional steps on top of the model below. As a result, it outputs only lowercase text without punctuation. It is also considerably more verbatim, and it makes no attempt at correcting grammatical errors in the text.
NB-Whisper Tiny Verbatim
========================
Introducing the *Norwegian NB-Whisper Tiny Verbatim model*, proudly developed by the National Library of Norway. NB-Whisper is a cutting-edge series of models designed for automatic speech recognition (ASR) and speech translation. These models are based on the work of OpenAI's Whisper. Each model in the series has been trained for 250,000 steps, utilizing a diverse dataset of 8 million samples. These samples consist of aligned audio clips, each 30 seconds long, culminating in a staggering 66,000 hours of speech. For an in-depth understanding of our training methodology and dataset composition, keep an eye out for our upcoming article.
Model Size: Tiny, Parameters: 39M, Model: NB-Whisper Tiny
Model Size: Base, Parameters: 74M, Model: NB-Whisper Base
Model Size: Small, Parameters: 244M, Model: NB-Whisper Small
Model Size: Medium, Parameters: 769M, Model: NB-Whisper Medium
Model Size: Large, Parameters: 1550M, Model: NB-Whisper Large
### Verbatim Model
While the main models are suitable for most transcription tasks, we demonstrate how easy it is to change the output of the main model. The following models are trained 250 additional steps from the main models above, and might be suitable for more targeted use cases:
* Verbatim version: This lower-cased variant is more literal and suitable for tasks requiring detailed transcription, such as linguistic analysis.
Model Size: Tiny, Parameters: 39M, Semantic version: Tiny - semantic
Model Size: Base, Parameters: 74M, Semantic version: Base - semantic
Model Size: Small, Parameters: 244M, Semantic version: Small - semantic
Model Size: Medium, Parameters: 769M, Semantic version: Medium - semantic
Model Size: Large, Parameters: 1550M, Semantic version: Large - semantic
### Model Description
* Developed by: NB AI-Lab
* Shared by: NB AI-Lab
* Model type: 'whisper'
* Language(s) (NLP): Norwegian, Norwegian Bokmål, Norwegian Nynorsk, English
* License: Apache 2.0
* Trained from model: openai/whisper-tiny
* Code Repository: URL
* Paper: *Coming soon*
* Demo: *See Spaces on this page*
How to Use the Models
---------------------
### Online Demos
You can try the models directly through the HuggingFace Inference API, accessible on the right side of this page. Be aware that initially, the model needs to load and will run on limited CPU capacity, which might be slow. To enhance your experience, we are temporarily hosting some models on TPUs for a few days, significantly boosting their performance. Explore these under the Spaces section on the Main Page.
### Local Setup with HuggingFace
Alternatively, you can run the models locally. The Tiny, Base, and Small models are optimized for CPU execution. For the Medium and Large models, we recommend a system equipped with a GPU to ensure efficient processing. Setting up and using these models with HuggingFace's Transformers is straightforward, provided you have Python installed on your machine. For practical demonstrations, refer to examples using this sample mp3 file.
After this is done, you should be able to run this in Python:
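The Python example itself was stripped during processing; a minimal sketch of what it likely looked like, using the standard HuggingFace ASR pipeline — the model id is taken from the WhisperX example earlier in this card, and the file name `king.mp3` is an assumption for the sample file referenced above:

```python
# Hedged sketch: transcribe the sample mp3 with the Transformers pipeline.
from transformers import pipeline

asr = pipeline(
    "automatic-speech-recognition",
    model="NbAiLabBeta/nb-whisper-tiny-verbatim",
)
print(asr("king.mp3", generate_kwargs={"task": "transcribe", "language": "no"}))
```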
Expected output
#### Extended HuggingFace
Examining the output above, we see that there are multiple repetitions at the end. This is because the video is longer than 30 seconds. By passing the argument, we can transcribe longer files. Our experience is that we get slightly better results by setting that to 28 seconds instead of the default 30 seconds. We also recommend setting the beam size to 5 if possible. This greatly increases the accuracy but takes a bit longer and requires slightly more memory. The examples below also illustrate how to transcribe to English or Nynorsk, and how to get timestamps for sentences and words.
Expected output
Long transcripts:
Timestamps:
Word Level Timestamps:
Nynorsk:
English:
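The code behind the stubs above was also stripped; a hedged sketch combining the same options (28-second chunks, beam size 5, word-level timestamps, and switching language) with the standard Transformers pipeline arguments — file name and exact argument values mirror the text but are otherwise assumptions:

```python
# Sketch of the extended options described above; argument names follow the
# HuggingFace ASR pipeline, values mirror the recommendations in the text.
from transformers import pipeline

asr = pipeline(
    "automatic-speech-recognition",
    model="NbAiLabBeta/nb-whisper-tiny-verbatim",
    chunk_length_s=28,  # transcribe long files in 28-second chunks
)
result = asr(
    "king.mp3",
    return_timestamps="word",  # use True for sentence-level timestamps
    generate_kwargs={
        "num_beams": 5,        # beam size 5 for better accuracy
        "task": "transcribe",
        "language": "no",      # "nn" for Nynorsk, "en" for English
    },
)
print(result)
```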
### Whisper CPP
Whisper CPP is a C++ implementation of the Whisper model, offering the same functionalities with the added benefits of C++ efficiency and performance optimizations. This allows embedding any Whisper model into a binary file, facilitating the development of real applications. However, it requires some familiarity with compiling C++ programs. Their homepage provides examples of how to build applications, including real-time transcription.
We have converted this model to the ggml-format model used by Whisper CPP binaries. The file can be downloaded here, and a 'q5\_0' quantized version is also available here.
### WhisperX and Speaker Diarization
Speaker diarization is a technique in natural language processing and automatic speech recognition that identifies and separates different speakers in an audio recording. It segments the audio into parts based on who is speaking, enhancing the quality of transcribing meetings or phone calls. We find that WhisperX is the easiest way to use our models for diarizing speech. In addition, WhisperX uses phoneme-based Wav2Vec models to improve the alignment of the timestamps. As of December 2023 it also has native support for the nb-wav2vec models. It currently uses PyAnnote-audio for the actual diarization. This package has a fairly strict license that requires you to agree to its user terms. Follow the instructions below.
You can also run WhisperX from Python. Please take a look at the instructions on WhisperX homepage.
### API
Instructions for accessing the models via a simple API are included in the demos under Spaces. Note that these demos are temporary and will only be available for a few weeks.
Training Data
-------------
The training data originates from Språkbanken and the National Library of Norway's digital collection, including:
* NST Norwegian ASR Database (16 kHz) and its corresponding dataset
* Transcribed speeches from the Norwegian Parliament by Språkbanken
* TV broadcast (NRK) subtitles (NLN digital collection)
* Audiobooks (NLN digital collection)
Downstream Use
--------------
The models, especially the smaller ones, may exhibit occasional hallucinations and may drop parts of the transcript. They are designed to convert spoken language into grammatically correct written sentences, which might not always be word-for-word translations. We have made two extra model variants for users who want a different transcription style. We encourage users to try the models themselves to get a better understanding.
Bias, Risks, and Limitations
----------------------------
Using these models without adequate risk assessment and mitigation could be considered irresponsible. They may contain biases or other undesirable distortions. Users who deploy these models or integrate them into systems or services are responsible for mitigating risks and complying with applicable AI regulations. The National Library of Norway, as the model owner, disclaims liability for any outcomes resulting from third-party use of these models.
### Software
The model was trained using Jax/Flax and converted to PyTorch, Tensorflow, URL, and ONNX formats. These are available under 'Files and versions'. We welcome requests for conversion to other formats. All training code and scripts are released under the Apache License 2.0 in the GitHub repository nb-whisper.
Citation & Contributors
The NB-Whisper Tiny Verbatim model is a product of the NoSTram project led by Per Egil Kummervold (@pere) at the National Library of Norway. Key contributors include Javier de la Rosa (@versae), Freddy Wetjen (@freddyw), and Rolv-Arild Braaten (@Rolv-Arild). NB AI-Lab, under the direction of Svein Arne Brygfjeld (@Brygfjeld), supported the project's successful completion. A detailed paper on our process and findings is forthcoming.
Disclaimer
----------
The models published in this repository are intended for a generalist purpose and are available to third parties. These models may have bias and/or any other undesirable distortions. When third parties deploy or provide systems and/or services to other parties using any of these models (or using systems based on these models), or become users of the models, they should note that it is their responsibility to mitigate the risks arising from their use and, in any event, to comply with applicable regulations, including regulations regarding the use of artificial intelligence. In no event shall the owner of the models (The National Library of Norway) be liable for any results arising from the use made by third parties of these models.
Acknowledgements
----------------
Our gratitude extends to Google TPU Research Cloud for training resources, Google Cloud for translation credits, and HuggingFace's Sanchit Gandhi for technical support. A special thank you to Per Erik Solberg at Språkbanken for the collaboration on the Stortinget corpus.
Contact
-------
For feedback, technical concerns, or collaboration inquiries, please contact [ailab@URL](mailto:ailab@URL). If you plan to include this model in your research, contact us for the latest information on our upcoming paper for citation purposes.
| [
"### Verbatim Model\n\n\nWhile the main models are suitable for most transcription task, we demonstrate how easy it is to change the output of the main model. The following models are trained 250 additional steps from the main models above, and might be suitable for more targetted use cases:\n\n\n* Verbatim version: This lower-cased variant is more literal and suitable for tasks requiring detailed transcription, such as linguistic analysis.\n\n\nModel Size: Tiny, Parameters: 39M, Semantic version: Tiny - semantic\nModel Size: Base, Parameters: 74M, Semantic version: Base - semantic\nModel Size: Small, Parameters: 244M, Semantic version: Small - semantic\nModel Size: Medium, Parameters: 769M, Semantic version: Medium - semantic\nModel Size: Large, Parameters: 1550M, Semantic version: Large - semantic",
"### Model Description\n\n\n* Developed by: NB AI-Lab\n* Shared by: NB AI-Lab\n* Model type: 'whisper'\n* Language(s) (NLP): Norwegian, Norwegian Bokmål, Norwegian Nynorsk, English\n* License: Apache 2.0\n* Trained from model: openai/whisper-tiny\n* Code Repository: URL\n* Paper: *Coming soon*\n* Demo: *See Spaces on this page*\n\n\nHow to Use the Models\n---------------------",
"### Online Demos\n\n\nYou can try the models directly through the HuggingFace Inference API, accessible on the right side of this page. Be aware that initially, the model needs to load and will run on limited CPU capacity, which might be slow. To enhance your experience, we are temporarily hosting some models on TPUs for a few days, significantly boosting their performance. Explore these under the Spaces section on the Main Page.",
"### Local Setup with HuggingFace\n\n\nAlternatively, you can run the models locally. The Tiny, Base, and Small models are optimized for CPU execution. For the Medium and Large models, we recommend a system equipped with a GPU to ensure efficient processing. Setting up and using these models with HuggingFace's Transformers is straightforward, provided you have Python installed on your machine. For practical demonstrations, refer to examples using this sample mp3 file.\n\n\nAfter this is done, you should be able to run this in Python:\n\n\n\nExpected output",
"#### Extended HuggingFace\n\n\nExamining the output above, we see that there are multiple repetitions at the end. This is because the video is longer than 30 seconds. By passing the argument, we can transcribe longer file. Our experience is that we get slightly better result by setting that to 28 seconds instead of the default 30 seconds. We also recommend setting the beam size to 5 if possible. This greatly increases the accuracy but takes a bit longer and requires slightly more memory. The examples below also illustrates how to transcribe to English or Nynorsk, and how to get timestamps for sentences and words.\n\n\n\nExpected output\nLong transcripts:\n\n\nTimestamps:\n\n\nWord Level Timestamps:\n\n\nNynorsk:\n\n\nEnglish:",
"### Whisper CPP\n\n\nWhisper CPP is a C++ implementation of the Whisper model, offering the same functionalities with the added benefits of C++ efficiency and performance optimizations. This allows embedding any Whisper model into a binary file, facilitating the development of real applications. However, it requires some familiarity with compiling C++ programs. Their homepage provides examples of how to build applications, including real-time transcription.\n\n\nWe have converted this model to the ggml-format model used by Whisper CPP binaries. The file can be downloaded here, and a 'q5\\_0' quantized version is also available here.",
"### WhisperX and Speaker Diarization\n\n\nSpeaker diarization is a technique in natural language processing and automatic speech recognition that identifies and separates different speakers in an audio recording. It segments the audio into parts based on who is speaking, enhancing the quality of transcribing meetings or phone calls. We find that WhisperX is the easiest way to use our models for diarizing speech. In addition, WhisperX is using phoneme-based Wav2Vec-models for improving the alignment of the timestamps. As of December 2023 it also has native support for using the nb-wav2vec-models. It currently uses PyAnnote-audio for doing the actual diarization. This package has a fairly strict licence where you have to agree to user terms. Follow the instructions below.\n\n\nYou can also run WhisperX from Python. Please take a look at the instructions on WhisperX homepage.",
"### API\n\n\nInstructions for accessing the models via a simple API are included in the demos under Spaces. Note that these demos are temporary and will only be available for a few weeks.\n\n\nTraining Data\n-------------\n\n\nThe training data originates from Språkbanken and the National Library of Norway's digital collection, including:\n\n\n* NST Norwegian ASR Database (16 kHz) and its corresponding dataset\n* Transcribed speeches from the Norwegian Parliament by Språkbanken\n* TV broadcast (NRK) subtitles (NLN digital collection)\n* Audiobooks (NLN digital collection)\n\n\nDownstream Use\n--------------\n\n\nThe models, especially the smaller ones, may exhibit occasional hallucinations and may drop parts of the transcript. They are designed to convert spoken language into grammatically correct written sentences, which might not always be word-for-word translations. We have made two extra model variant for users that want a different transcription style. We encourage users to try the models themselves to get a better understanding.\n\n\nBias, Risks, and Limitations\n----------------------------\n\n\nUsing these models without adequate risk assessment and mitigation could be considered irresponsible. They may contain biases or other undesirable distortions. Users who deploy these models or integrate them into systems or services are responsible for mitigating risks and complying with applicable AI regulations. The National Library of Norway, as the model owner, disclaims liability for any outcomes resulting from third-party use of these models.",
"### Software\n\n\nThe model was trained using Jax/Flax and converted to PyTorch, Tensorflow, URL, and ONXX formats. These are available under 'Files and versions'. We welcome requests for conversion to other formats. All training code and scripts are released under the Apache License 2.0 in the GitHub repository nb-whisper.\n\n\n& Contributors\nThe NB-Whisper Tiny Verbatim model is a product of the NoSTram project led by Per Egil Kummervold (@pere) at the National Library of Norway. Key contributors include Javier de la Rosa (@versae), Freddy Wetjen (@freddyw), and Rolv-Arild Braaten (@Rolv-Arild). NB AI-Lab, under the direction of Svein Arne Brygfjeld (@Brygfjeld), supported the project's successful completion. A detailed paper on our process and findings is forthcoming.\n\n\nDisclaimer\n----------\n\n\nThe models published in this repository are intended for a generalist purpose and are available to third parties. These models may have bias and/or any other undesirable distortions. When third parties, deploy or provide systems and/or services to other parties using any of these models (or using systems based on these models) or become users of the models, they should note that it is their responsibility to mitigate the risks arising from their use and, in any event, to comply with applicable regulations, including regulations regarding the use of artificial intelligence. In no event shall the owner of the models (The National Library of Norway) be liable for any results arising from the use made by third parties of these models.\n\n\nAcknowledgements\n----------------\n\n\nOur gratitude extends to Google TPU Research Cloud for training resources, Google Cloud for translation credits, and HuggingFace's Sanchit Ghandi for technical support. A special thank you to Per Erik Solberg at Språkbanken for the collaboration on the Stortinget corpus.\n\n\nContact\n-------\n\n\nFor feedback, technical concerns, or collaboration inquiries, please contact [ailab@URL](mailto:ailab@URL). If you plan to include this model in your research, contact us for the latest information on our upcoming paper for citation purposes."
] | [
"TAGS\n#transformers #pytorch #jax #tensorboard #onnx #safetensors #whisper #automatic-speech-recognition #audio #asr #hf-asr-leaderboard #no #nb #nn #en #dataset-NbAiLab/ncc_speech #dataset-NbAiLab/NST #dataset-NbAiLab/NPSC #arxiv-2212.04356 #base_model-openai/whisper-tiny #license-apache-2.0 #endpoints_compatible #region-us \n",
"### Verbatim Model\n\n\nWhile the main models are suitable for most transcription task, we demonstrate how easy it is to change the output of the main model. The following models are trained 250 additional steps from the main models above, and might be suitable for more targetted use cases:\n\n\n* Verbatim version: This lower-cased variant is more literal and suitable for tasks requiring detailed transcription, such as linguistic analysis.\n\n\nModel Size: Tiny, Parameters: 39M, Semantic version: Tiny - semantic\nModel Size: Base, Parameters: 74M, Semantic version: Base - semantic\nModel Size: Small, Parameters: 244M, Semantic version: Small - semantic\nModel Size: Medium, Parameters: 769M, Semantic version: Medium - semantic\nModel Size: Large, Parameters: 1550M, Semantic version: Large - semantic",
"### Model Description\n\n\n* Developed by: NB AI-Lab\n* Shared by: NB AI-Lab\n* Model type: 'whisper'\n* Language(s) (NLP): Norwegian, Norwegian Bokmål, Norwegian Nynorsk, English\n* License: Apache 2.0\n* Trained from model: openai/whisper-tiny\n* Code Repository: URL\n* Paper: *Coming soon*\n* Demo: *See Spaces on this page*\n\n\nHow to Use the Models\n---------------------",
"### Online Demos\n\n\nYou can try the models directly through the HuggingFace Inference API, accessible on the right side of this page. Be aware that initially, the model needs to load and will run on limited CPU capacity, which might be slow. To enhance your experience, we are temporarily hosting some models on TPUs for a few days, significantly boosting their performance. Explore these under the Spaces section on the Main Page.",
"### Local Setup with HuggingFace\n\n\nAlternatively, you can run the models locally. The Tiny, Base, and Small models are optimized for CPU execution. For the Medium and Large models, we recommend a system equipped with a GPU to ensure efficient processing. Setting up and using these models with HuggingFace's Transformers is straightforward, provided you have Python installed on your machine. For practical demonstrations, refer to examples using this sample mp3 file.\n\n\nAfter this is done, you should be able to run this in Python:\n\n\n\nExpected output",
"#### Extended HuggingFace\n\n\nExamining the output above, we see that there are multiple repetitions at the end. This is because the video is longer than 30 seconds. By passing the argument, we can transcribe longer file. Our experience is that we get slightly better result by setting that to 28 seconds instead of the default 30 seconds. We also recommend setting the beam size to 5 if possible. This greatly increases the accuracy but takes a bit longer and requires slightly more memory. The examples below also illustrates how to transcribe to English or Nynorsk, and how to get timestamps for sentences and words.\n\n\n\nExpected output\nLong transcripts:\n\n\nTimestamps:\n\n\nWord Level Timestamps:\n\n\nNynorsk:\n\n\nEnglish:",
"### Whisper CPP\n\n\nWhisper CPP is a C++ implementation of the Whisper model, offering the same functionalities with the added benefits of C++ efficiency and performance optimizations. This allows embedding any Whisper model into a binary file, facilitating the development of real applications. However, it requires some familiarity with compiling C++ programs. Their homepage provides examples of how to build applications, including real-time transcription.\n\n\nWe have converted this model to the ggml-format model used by Whisper CPP binaries. The file can be downloaded here, and a 'q5\\_0' quantized version is also available here.",
"### WhisperX and Speaker Diarization\n\n\nSpeaker diarization is a technique in natural language processing and automatic speech recognition that identifies and separates different speakers in an audio recording. It segments the audio into parts based on who is speaking, enhancing the quality of transcribing meetings or phone calls. We find that WhisperX is the easiest way to use our models for diarizing speech. In addition, WhisperX is using phoneme-based Wav2Vec-models for improving the alignment of the timestamps. As of December 2023 it also has native support for using the nb-wav2vec-models. It currently uses PyAnnote-audio for doing the actual diarization. This package has a fairly strict licence where you have to agree to user terms. Follow the instructions below.\n\n\nYou can also run WhisperX from Python. Please take a look at the instructions on WhisperX homepage.",
"### API\n\n\nInstructions for accessing the models via a simple API are included in the demos under Spaces. Note that these demos are temporary and will only be available for a few weeks.\n\n\nTraining Data\n-------------\n\n\nThe training data originates from Språkbanken and the National Library of Norway's digital collection, including:\n\n\n* NST Norwegian ASR Database (16 kHz) and its corresponding dataset\n* Transcribed speeches from the Norwegian Parliament by Språkbanken\n* TV broadcast (NRK) subtitles (NLN digital collection)\n* Audiobooks (NLN digital collection)\n\n\nDownstream Use\n--------------\n\n\nThe models, especially the smaller ones, may exhibit occasional hallucinations and may drop parts of the transcript. They are designed to convert spoken language into grammatically correct written sentences, which might not always be word-for-word translations. We have made two extra model variant for users that want a different transcription style. We encourage users to try the models themselves to get a better understanding.\n\n\nBias, Risks, and Limitations\n----------------------------\n\n\nUsing these models without adequate risk assessment and mitigation could be considered irresponsible. They may contain biases or other undesirable distortions. Users who deploy these models or integrate them into systems or services are responsible for mitigating risks and complying with applicable AI regulations. The National Library of Norway, as the model owner, disclaims liability for any outcomes resulting from third-party use of these models.",
"### Software\n\n\nThe model was trained using Jax/Flax and converted to PyTorch, Tensorflow, URL, and ONXX formats. These are available under 'Files and versions'. We welcome requests for conversion to other formats. All training code and scripts are released under the Apache License 2.0 in the GitHub repository nb-whisper.\n\n\n& Contributors\nThe NB-Whisper Tiny Verbatim model is a product of the NoSTram project led by Per Egil Kummervold (@pere) at the National Library of Norway. Key contributors include Javier de la Rosa (@versae), Freddy Wetjen (@freddyw), and Rolv-Arild Braaten (@Rolv-Arild). NB AI-Lab, under the direction of Svein Arne Brygfjeld (@Brygfjeld), supported the project's successful completion. A detailed paper on our process and findings is forthcoming.\n\n\nDisclaimer\n----------\n\n\nThe models published in this repository are intended for a generalist purpose and are available to third parties. These models may have bias and/or any other undesirable distortions. When third parties, deploy or provide systems and/or services to other parties using any of these models (or using systems based on these models) or become users of the models, they should note that it is their responsibility to mitigate the risks arising from their use and, in any event, to comply with applicable regulations, including regulations regarding the use of artificial intelligence. In no event shall the owner of the models (The National Library of Norway) be liable for any results arising from the use made by third parties of these models.\n\n\nAcknowledgements\n----------------\n\n\nOur gratitude extends to Google TPU Research Cloud for training resources, Google Cloud for translation credits, and HuggingFace's Sanchit Ghandi for technical support. A special thank you to Per Erik Solberg at Språkbanken for the collaboration on the Stortinget corpus.\n\n\nContact\n-------\n\n\nFor feedback, technical concerns, or collaboration inquiries, please contact [ailab@URL](mailto:ailab@URL). If you plan to include this model in your research, contact us for the latest information on our upcoming paper for citation purposes."
] | [
142,
198,
106,
95,
127,
160,
149,
215,
325,
501
] | [
"passage: TAGS\n#transformers #pytorch #jax #tensorboard #onnx #safetensors #whisper #automatic-speech-recognition #audio #asr #hf-asr-leaderboard #no #nb #nn #en #dataset-NbAiLab/ncc_speech #dataset-NbAiLab/NST #dataset-NbAiLab/NPSC #arxiv-2212.04356 #base_model-openai/whisper-tiny #license-apache-2.0 #endpoints_compatible #region-us \n### Verbatim Model\n\n\nWhile the main models are suitable for most transcription task, we demonstrate how easy it is to change the output of the main model. The following models are trained 250 additional steps from the main models above, and might be suitable for more targetted use cases:\n\n\n* Verbatim version: This lower-cased variant is more literal and suitable for tasks requiring detailed transcription, such as linguistic analysis.\n\n\nModel Size: Tiny, Parameters: 39M, Semantic version: Tiny - semantic\nModel Size: Base, Parameters: 74M, Semantic version: Base - semantic\nModel Size: Small, Parameters: 244M, Semantic version: Small - semantic\nModel Size: Medium, Parameters: 769M, Semantic version: Medium - semantic\nModel Size: Large, Parameters: 1550M, Semantic version: Large - semantic### Model Description\n\n\n* Developed by: NB AI-Lab\n* Shared by: NB AI-Lab\n* Model type: 'whisper'\n* Language(s) (NLP): Norwegian, Norwegian Bokmål, Norwegian Nynorsk, English\n* License: Apache 2.0\n* Trained from model: openai/whisper-tiny\n* Code Repository: URL\n* Paper: *Coming soon*\n* Demo: *See Spaces on this page*\n\n\nHow to Use the Models\n---------------------",
"passage: ### Online Demos\n\n\nYou can try the models directly through the HuggingFace Inference API, accessible on the right side of this page. Be aware that initially, the model needs to load and will run on limited CPU capacity, which might be slow. To enhance your experience, we are temporarily hosting some models on TPUs for a few days, significantly boosting their performance. Explore these under the Spaces section on the Main Page.### Local Setup with HuggingFace\n\n\nAlternatively, you can run the models locally. The Tiny, Base, and Small models are optimized for CPU execution. For the Medium and Large models, we recommend a system equipped with a GPU to ensure efficient processing. Setting up and using these models with HuggingFace's Transformers is straightforward, provided you have Python installed on your machine. For practical demonstrations, refer to examples using this sample mp3 file.\n\n\nAfter this is done, you should be able to run this in Python:\n\n\n\nExpected output#### Extended HuggingFace\n\n\nExamining the output above, we see that there are multiple repetitions at the end. This is because the video is longer than 30 seconds. By passing the argument, we can transcribe longer file. Our experience is that we get slightly better result by setting that to 28 seconds instead of the default 30 seconds. We also recommend setting the beam size to 5 if possible. This greatly increases the accuracy but takes a bit longer and requires slightly more memory. The examples below also illustrates how to transcribe to English or Nynorsk, and how to get timestamps for sentences and words.\n\n\n\nExpected output\nLong transcripts:\n\n\nTimestamps:\n\n\nWord Level Timestamps:\n\n\nNynorsk:\n\n\nEnglish:### Whisper CPP\n\n\nWhisper CPP is a C++ implementation of the Whisper model, offering the same functionalities with the added benefits of C++ efficiency and performance optimizations. This allows embedding any Whisper model into a binary file, facilitating the development of real applications. However, it requires some familiarity with compiling C++ programs. Their homepage provides examples of how to build applications, including real-time transcription.\n\n\nWe have converted this model to the ggml-format model used by Whisper CPP binaries. The file can be downloaded here, and a 'q5\\_0' quantized version is also available here.",
"passage: ### WhisperX and Speaker Diarization\n\n\nSpeaker diarization is a technique in natural language processing and automatic speech recognition that identifies and separates different speakers in an audio recording. It segments the audio into parts based on who is speaking, enhancing the quality of transcribing meetings or phone calls. We find that WhisperX is the easiest way to use our models for diarizing speech. In addition, WhisperX is using phoneme-based Wav2Vec-models for improving the alignment of the timestamps. As of December 2023 it also has native support for using the nb-wav2vec-models. It currently uses PyAnnote-audio for doing the actual diarization. This package has a fairly strict licence where you have to agree to user terms. Follow the instructions below.\n\n\nYou can also run WhisperX from Python. Please take a look at the instructions on WhisperX homepage.### API\n\n\nInstructions for accessing the models via a simple API are included in the demos under Spaces. Note that these demos are temporary and will only be available for a few weeks.\n\n\nTraining Data\n-------------\n\n\nThe training data originates from Språkbanken and the National Library of Norway's digital collection, including:\n\n\n* NST Norwegian ASR Database (16 kHz) and its corresponding dataset\n* Transcribed speeches from the Norwegian Parliament by Språkbanken\n* TV broadcast (NRK) subtitles (NLN digital collection)\n* Audiobooks (NLN digital collection)\n\n\nDownstream Use\n--------------\n\n\nThe models, especially the smaller ones, may exhibit occasional hallucinations and may drop parts of the transcript. They are designed to convert spoken language into grammatically correct written sentences, which might not always be word-for-word translations. We have made two extra model variant for users that want a different transcription style. We encourage users to try the models themselves to get a better understanding.\n\n\nBias, Risks, and Limitations\n----------------------------\n\n\nUsing these models without adequate risk assessment and mitigation could be considered irresponsible. They may contain biases or other undesirable distortions. Users who deploy these models or integrate them into systems or services are responsible for mitigating risks and complying with applicable AI regulations. The National Library of Norway, as the model owner, disclaims liability for any outcomes resulting from third-party use of these models."
] | [
-0.052225906401872635,
0.08852273225784302,
-0.003895467845723033,
-0.001449303119443357,
0.047147076576948166,
-0.0315697155892849,
0.025692766532301903,
0.05916014686226845,
0.001698717474937439,
0.07178816199302673,
-0.005218917969614267,
-0.07260925322771072,
0.08957157284021378,
0.02634858340024948,
0.09490147233009338,
-0.2973610460758209,
0.05276748910546303,
-0.07571085542440414,
0.011015770025551319,
0.03917618468403816,
0.09780293703079224,
-0.05803140997886658,
0.043932367116212845,
0.016411589458584785,
-0.004291608929634094,
0.0097085265442729,
-0.05423247441649437,
-0.03591864928603172,
0.07305631041526794,
0.09334317594766617,
0.03796182945370674,
-0.0028611214365810156,
0.07959828525781631,
-0.1442774087190628,
0.009281400591135025,
0.052888303995132446,
0.04221884533762932,
0.012296347878873348,
0.03537113592028618,
0.11246511340141296,
0.1984262466430664,
-0.08718260377645493,
0.010056406259536743,
0.05152798816561699,
-0.024116916581988335,
-0.1314290165901184,
-0.04314407333731651,
-0.02156982757151127,
0.05449479818344116,
0.04127844050526619,
-0.028288787230849266,
0.09217771887779236,
-0.08047741651535034,
0.03606955334544182,
0.0823182538151741,
-0.10875251144170761,
-0.015062816441059113,
0.015682803466916084,
0.05189214646816254,
0.04337715730071068,
-0.025615306571125984,
0.009404554963111877,
0.015427603386342525,
0.03894093632698059,
0.0156259685754776,
-0.009114402346313,
0.068882055580616,
-0.06560022383928299,
-0.1068887710571289,
-0.04781949520111084,
0.1303248405456543,
0.01466977596282959,
-0.06016857549548149,
-0.17470335960388184,
-0.03839777782559395,
0.05656785890460014,
-0.026666810736060143,
-0.02887471206486225,
0.030949926003813744,
-0.00289578246884048,
0.10373929888010025,
-0.07647919654846191,
-0.10015001893043518,
0.01541897002607584,
-0.0037047427613288164,
0.1038130447268486,
0.04344170168042183,
-0.00008429990703007206,
0.009824433363974094,
0.05023450776934624,
-0.06897114962339401,
-0.055488333106040955,
-0.0671657994389534,
-0.06762628257274628,
-0.05735538527369499,
0.011250006966292858,
-0.021838733926415443,
-0.10344312340021133,
-0.000581158499699086,
0.07518082857131958,
-0.01670704409480095,
0.017618374899029732,
0.02295663021504879,
0.0006919124280102551,
0.05463071167469025,
0.13421685993671417,
-0.02209932543337345,
-0.07718177884817123,
0.003401976777240634,
-0.007302285637706518,
0.06709380447864532,
0.012656754814088345,
-0.04873855784535408,
-0.03072303719818592,
-0.002092138398438692,
0.02586854249238968,
0.02184317074716091,
0.009723346680402756,
0.01972309686243534,
-0.028814909979701042,
0.24236838519573212,
-0.1133742704987526,
0.00035644634044729173,
0.01350055355578661,
-0.04798631742596626,
0.08590974658727646,
0.026040395721793175,
-0.023624412715435028,
-0.13131694495677948,
0.017776265740394592,
-0.003250340698286891,
-0.019727351143956184,
-0.06523322314023972,
-0.1043441891670227,
0.04832173511385918,
0.01556383352726698,
-0.012989559210836887,
-0.11469698697328568,
-0.1000029444694519,
-0.04756873473525047,
0.015920156612992287,
-0.01311273779720068,
-0.03999199718236923,
0.0031369831413030624,
-0.04285532236099243,
-0.03128480166196823,
-0.029861053451895714,
0.0004953791503794491,
-0.026907140389084816,
-0.018177127465605736,
-0.028251685202121735,
0.03195595741271973,
-0.021645477041602135,
0.013025384396314621,
-0.06956323981285095,
-0.008394765667617321,
-0.18725061416625977,
0.11754278093576431,
-0.07433769851922989,
-0.0025425527710467577,
-0.027438826858997345,
-0.06842014938592911,
-0.05730564519762993,
0.05102263018488884,
0.0016294916858896613,
0.07590734213590622,
-0.1796562820672989,
-0.028364911675453186,
0.1441795378923416,
-0.14114247262477875,
0.03879125043749809,
0.14181755483150482,
0.01631464995443821,
0.010170950554311275,
0.13288132846355438,
0.11724665760993958,
0.16870629787445068,
-0.12065622210502625,
-0.06449628621339798,
0.004633029457181692,
-0.03280365467071533,
0.054361049085855484,
0.0404452420771122,
-0.010512937791645527,
0.0978797897696495,
0.04435842111706734,
0.00631623575463891,
0.0240571741014719,
0.036766890436410904,
-0.021994126960635185,
-0.007460128515958786,
-0.028092987835407257,
-0.007962706498801708,
0.03885675594210625,
-0.04466962441802025,
-0.03864722698926926,
-0.09016133099794388,
0.07592028379440308,
0.11120650917291641,
-0.03947211429476738,
0.03295416757464409,
-0.0676860436797142,
-0.022284945473074913,
0.013856165111064911,
0.0011349128326401114,
-0.11247014999389648,
-0.04307776317000389,
0.05008325353264809,
-0.1313273161649704,
0.07344061881303787,
0.054308634251356125,
0.0337761715054512,
0.06499332189559937,
-0.0098849693313241,
0.017766952514648438,
-0.02715049870312214,
-0.0010002367198467255,
-0.024409061297774315,
-0.034023184329271317,
-0.023691533133387566,
-0.03686695173382759,
0.026984110474586487,
-0.10741329193115234,
0.0005126644973643124,
0.0016298157861456275,
0.08894561976194382,
0.01571890152990818,
-0.024155139923095703,
0.01836409978568554,
0.014099687337875366,
0.006955515593290329,
-0.041352901607751846,
-0.006241718772798777,
-0.002074833959341049,
0.0005560678546316922,
0.10581705719232559,
-0.14442090690135956,
-0.12108644098043442,
0.05202390253543854,
0.10709765553474426,
-0.013376806862652302,
-0.0026417833287268877,
-0.03725690022110939,
-0.03306608274579048,
-0.05283467471599579,
-0.12019258737564087,
0.19981379806995392,
0.017910579219460487,
0.05941613018512726,
-0.08878075331449509,
-0.026212312281131744,
0.006121269892901182,
-0.001349806785583496,
-0.008082512766122818,
0.08378139138221741,
0.01607358455657959,
-0.06908705085515976,
-0.011388938874006271,
-0.05114361271262169,
0.02801305055618286,
0.1727195531129837,
-0.021740218624472618,
-0.10745269060134888,
0.010682486928999424,
-0.010667898692190647,
-0.012420963495969772,
0.09152982383966446,
0.008651901967823505,
-0.006311625707894564,
0.032054636627435684,
0.025929123163223267,
0.05522071197628975,
-0.052826594561338425,
0.08396222442388535,
0.02549523115158081,
-0.04609178379178047,
0.04280078038573265,
-0.022747883573174477,
-0.004828274250030518,
0.04238114878535271,
0.005506219808012247,
0.023525051772594452,
-0.044619008898735046,
-0.043317895382642746,
-0.08644869923591614,
0.10225197672843933,
-0.09983924776315689,
-0.2230597287416458,
-0.16312386095523834,
0.08276357501745224,
-0.020482318475842476,
-0.0135834701359272,
0.03606247901916504,
-0.04808738827705383,
-0.10255175083875656,
-0.13341183960437775,
0.03934044763445854,
0.024850621819496155,
-0.07681998610496521,
-0.04863737151026726,
0.02693675458431244,
0.005223082844167948,
-0.12587480247020721,
0.0035038066562265158,
0.0034617397468537092,
0.017010381445288658,
-0.01938239112496376,
0.01475540455430746,
0.03373110666871071,
0.07818625867366791,
0.0071961707435548306,
-0.056919921189546585,
0.008759372867643833,
0.15968751907348633,
-0.07362956553697586,
0.14982958137989044,
0.1591135859489441,
0.010566595010459423,
0.05727095529437065,
0.0907081589102745,
0.012823100201785564,
-0.021463578566908836,
0.024610979482531548,
0.014908000826835632,
-0.06317827105522156,
-0.14568156003952026,
-0.1296371966600418,
-0.04354536533355713,
-0.002862870693206787,
0.06264252215623856,
0.03364190831780434,
-0.050751861184835434,
0.016862967982888222,
-0.07995019853115082,
-0.01189426239579916,
0.050388678908348083,
0.05185304209589958,
0.1305679827928543,
0.004554635379463434,
0.02522990107536316,
-0.07585408538579941,
-0.018485860899090767,
0.10077250003814697,
-0.014172014780342579,
0.15140685439109802,
-0.06358424574136734,
0.11729124933481216,
0.025626346468925476,
-0.018516208976507187,
0.05469629168510437,
0.05597400292754173,
-0.002645803615450859,
0.021008489653468132,
-0.023433880880475044,
-0.07610709220170975,
-0.04634670913219452,
0.08225713670253754,
0.05410097539424896,
-0.07721629738807678,
-0.000055336084187729284,
0.007994402199983597,
0.004226780030876398,
0.141394704580307,
0.03088860772550106,
-0.10785882920026779,
-0.13042192161083221,
0.018406806513667107,
-0.10283639281988144,
-0.07447937875986099,
0.02225286327302456,
0.14712245762348175,
-0.08862995356321335,
0.0247601717710495,
0.0015470919897779822,
0.07675943523645401,
-0.06931576877832413,
0.008923756889998913,
-0.037522729486227036,
0.13547645509243011,
0.0148098049685359,
0.05460122227668762,
-0.040691178292036057,
0.03594399243593216,
0.007524114102125168,
0.11986365914344788,
-0.06701100617647171,
0.04461449384689331,
0.02805890142917633,
0.0038270826917141676,
0.061878424137830734,
0.039295319467782974,
-0.1471647173166275,
0.02185921184718609,
-0.11554110795259476,
0.04900367185473442,
0.03978533670306206,
0.04878067597746849,
0.07762307673692703,
-0.011232279241085052,
0.00010367358481744304,
-0.02589580975472927,
-0.11007753759622574,
-0.12047339230775833,
-0.16099536418914795,
0.026253074407577515,
0.0016102790832519531,
-0.012039287947118282,
-0.05209111049771309,
-0.01393717061728239,
-0.08537588268518448,
0.11512008309364319,
-0.086333267390728,
-0.12055904418230057,
-0.07162051647901535,
-0.056014955043792725,
0.14978408813476562,
-0.045222971588373184,
0.011422659270465374,
0.03458608314394951,
0.15674103796482086,
-0.05130079388618469,
-0.05984078347682953,
-0.01646554283797741,
-0.0871242880821228,
-0.10555434226989746,
0.009179354645311832,
0.11674296855926514,
0.10215631872415543,
0.05596138536930084,
0.01426722202450037,
-0.0035110872704535723,
-0.0022220786195248365,
-0.0982024297118187,
-0.05760173499584198,
0.17588752508163452,
-0.015212134458124638,
0.03668995946645737,
-0.05086560919880867,
-0.07438778132200241,
-0.05467268452048302,
-0.012131031602621078,
0.05289961025118828,
0.147649347782135,
-0.05165044963359833,
0.1354415863752365,
0.1905476599931717,
-0.06485673785209656,
-0.21484120190143585,
-0.048808932304382324,
0.050561461597681046,
0.05266590043902397,
0.016840344294905663,
-0.1655668020248413,
0.10857931524515152,
0.038562871515750885,
-0.0030878542456775904,
0.041141606867313385,
-0.22227518260478973,
-0.10099960118532181,
0.049818526953458786,
-0.014821608550846577,
-0.020186029374599457,
-0.02436850219964981,
-0.02766892872750759,
-0.03614833578467369,
-0.03089556284248829,
0.07159119099378586,
-0.041483957320451736,
0.05427553877234459,
0.03569434955716133,
0.07217209786176682,
0.04920588806271553,
-0.025053592398762703,
0.10781430453062057,
-0.032340966165065765,
0.005375148728489876,
-0.06170859932899475,
0.09190771728754044,
0.015412471257150173,
-0.06622236967086792,
0.1507914811372757,
-0.020450951531529427,
0.013892539776861668,
-0.10539975762367249,
-0.060903001576662064,
-0.07634317129850388,
0.07967976480722427,
-0.016665490344166756,
-0.041370589286088943,
-0.08772089332342148,
0.08018451184034348,
0.09333449602127075,
0.009074760600924492,
-0.07185950130224228,
-0.07093992829322815,
-0.07474950700998306,
0.12077293545007706,
0.17676575481891632,
-0.05060909688472748,
-0.04941016808152199,
0.010487984865903854,
-0.0000072158873081207275,
0.053859684616327286,
-0.04685445502400398,
0.017943071201443672,
0.09268790483474731,
-0.01578686572611332,
0.030510475859045982,
-0.03213605657219887,
-0.13292117416858673,
-0.015581357292830944,
0.020425312221050262,
-0.034070760011672974,
-0.16803307831287384,
-0.03274836763739586,
0.044969603419303894,
-0.05326880142092705,
-0.04127122834324837,
0.1281358301639557,
-0.08154872059822083,
-0.002435939619317651,
0.013432449661195278,
0.056119125336408615,
0.028950048610568047,
0.09494984149932861,
0.023578708991408348,
0.0228892769664526,
-0.06803906708955765,
0.09775378555059433,
0.03442065417766571,
-0.12268374115228653,
0.049312248826026917,
0.1513606160879135,
-0.08833330869674683,
-0.04944658279418945,
-0.11741238832473755,
-0.03644706308841705,
-0.013946798630058765,
-0.10032463818788528,
0.009968179278075695,
-0.07222934067249298,
0.01247421745210886,
0.00764068216085434,
0.007883475162088871,
-0.022853434085845947,
-0.01947901025414467,
0.04050984978675842,
-0.10356128960847855,
0.08392780274152756,
0.011687375605106354,
0.027217308059334755,
-0.040857139974832535,
0.09045123308897018,
-0.004106637090444565,
0.013881065882742405,
-0.024869397282600403,
-0.01667608879506588,
-0.007341073360294104,
-0.02906951494514942,
-0.132608100771904,
0.016597947105765343,
-0.08786028623580933,
0.006111952010542154,
-0.0031462085898965597,
0.03427022695541382,
-0.016645194962620735,
0.050871092826128006,
-0.03443542495369911,
-0.020971929654479027,
-0.05678952857851982,
0.03435012698173523,
-0.04970226809382439,
0.01060494501143694,
0.04573814943432808,
-0.07156609743833542,
0.04943181574344635,
0.0022769116330891848,
-0.05154310539364815,
0.046509552747011185,
-0.023453624919056892,
0.004097518976777792,
0.019239922985434532,
0.06405843794345856,
0.005800385028123856,
-0.04881251975893974,
-0.005513886455446482,
0.030480094254016876,
0.001039832248352468,
-0.040952011942863464,
0.03084316849708557,
-0.05733249709010124,
0.08058638125658035,
0.025077002122998238,
-0.016648350283503532,
-0.055366143584251404,
0.029299676418304443,
0.028620583936572075,
0.01957288570702076,
0.088797926902771,
-0.05874796584248543,
0.016061926260590553,
-0.09258904308080673,
-0.0017860470106825233,
0.006758896633982658,
0.007367485668510199,
0.08113706111907959,
-0.01423239428550005,
0.027559829875826836,
0.0032798137981444597,
0.17642265558242798,
-0.013503152877092361,
0.023280302062630653,
0.05413125827908516,
-0.09503402560949326,
-0.10219889879226685,
0.016268249601125717,
0.06149940565228462,
0.01527510304003954,
-0.007534107193350792,
-0.03953751549124718,
-0.02916337549686432,
-0.014883622527122498,
0.000896188139449805,
0.09694097191095352,
0.11095067113637924,
0.09064046293497086,
0.09217807650566101,
0.011939878575503826,
-0.044474560767412186,
-0.11705377697944641,
0.07488963007926941,
-0.02921421267092228,
0.07666610926389694,
-0.043572697788476944,
0.08798312395811081,
0.1260853260755539,
-0.06540343165397644,
0.09240283817052841,
-0.007691781967878342,
-0.045044343918561935,
-0.0720357820391655,
-0.13759805262088776,
-0.04340393468737602,
-0.03260834515094757,
-0.027543092146515846,
-0.08869364857673645,
0.03453271463513374,
0.014034916646778584,
0.017245149239897728,
-0.02818775177001953,
0.1076536551117897,
-0.06853114813566208,
-0.09216915816068649,
0.0596550814807415,
-0.021187076345086098,
0.02972322888672352,
0.10345810651779175,
-0.00539429672062397,
0.05171045660972595,
0.06470414996147156,
0.0647355243563652,
0.059751007705926895,
0.01655891351401806,
-0.008036908693611622,
-0.0010981978848576546,
-0.01257641613483429,
-0.009200030006468296,
-0.0012059304863214493,
-0.005321629345417023,
0.10171066969633102,
0.07835332304239273,
-0.08204271644353867,
-0.0062174987979233265,
0.10131800174713135,
-0.046293165534734726,
-0.15047912299633026,
-0.1273408681154251,
0.12395629286766052,
-0.0007258318364620209,
0.018471378833055496,
-0.006247213575989008,
-0.0816165879368782,
-0.019903363659977913,
0.13459067046642303,
0.13071709871292114,
0.03213238716125488,
-0.007894089445471764,
-0.039852872490882874,
-0.01120071206241846,
-0.07177309691905975,
0.11053664237260818,
0.00687100924551487,
0.27977511286735535,
-0.005616945680230856,
0.060298994183540344,
-0.019655324518680573,
-0.02370639704167843,
-0.12060276418924332,
0.07631135731935501,
-0.04913611710071564,
-0.001253551454283297,
-0.027348458766937256,
0.07637572288513184,
-0.0502299964427948,
-0.24100856482982635,
-0.02803356945514679,
0.00229514017701149,
-0.06411519646644592,
0.028340429067611694,
-0.01675615832209587,
0.02103683352470398,
0.06279034167528152,
-0.0068031735718250275,
0.003647860838100314,
0.14050765335559845,
-0.016986941918730736,
-0.06681404262781143,
0.03775051236152649,
0.04071744158864021,
-0.039992693811655045,
0.16937245428562164,
0.03391566500067711,
0.05281646177172661,
0.05689360201358795,
-0.011062577366828918,
-0.11674658209085464,
0.05533183738589287,
0.026785263791680336,
-0.11179912090301514,
0.030567238107323647,
0.14684522151947021,
-0.013011924922466278,
0.08045897632837296,
0.049192070960998535,
-0.043427322059869766,
0.0014846710255369544,
0.08644678443670273,
0.010129737667739391,
-0.054996322840452194,
0.05480967462062836,
-0.09295687079429626,
0.12768959999084473,
0.09785804152488708,
0.012446794658899307,
-0.02764817886054516,
-0.04701811447739601,
0.00810723751783371,
-0.013159717433154583,
0.10067502409219742,
-0.005202915519475937,
-0.11670020967721939,
-0.025086358189582825,
-0.004475296940654516,
0.0564287006855011,
-0.1614377498626709,
-0.03812123090028763,
0.0023183308076113462,
-0.022022679448127747,
0.028381118550896645,
0.09560016542673111,
0.030049918219447136,
0.005535001400858164,
-0.02377123199403286,
-0.024639392271637917,
0.002642090665176511,
0.07445094734430313,
-0.1155429556965828,
-0.05776357278227806
] |
null | null | transformers |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# distilroberta-base-finetuned-wikitext2
This model is a fine-tuned version of [distilroberta-base](https://huggingface.co/distilroberta-base) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 3.2548
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5
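For reference, a minimal sketch of how the hyperparameters above map onto `TrainingArguments`; dataset loading is omitted, and the masked-LM head is an assumption based on the distilroberta base model:

```python
# Hedged reconstruction of the training setup from the hyperparameters above.
from transformers import (
    AutoModelForMaskedLM,  # assumed head; distilroberta-base is a masked LM
    Trainer,
    TrainingArguments,
)

model = AutoModelForMaskedLM.from_pretrained("distilroberta-base")

args = TrainingArguments(
    output_dir="distilroberta-base-finetuned-wikitext2",
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=5,
)
# trainer = Trainer(model=model, args=args,
#                   train_dataset=train_ds, eval_dataset=eval_ds)  # datasets assumed
```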
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:----:|:---------------:|
| No log | 1.0 | 425 | 3.6676 |
| 4.0717 | 2.0 | 850 | 3.4692 |
| 3.5577 | 3.0 | 1275 | 3.3515 |
| 3.4352 | 4.0 | 1700 | 3.2840 |
| 3.3733 | 5.0 | 2125 | 3.2548 |
### Framework versions
- Transformers 4.35.2
- Pytorch 2.1.0+cu121
- Datasets 2.17.0
- Tokenizers 0.15.1
| {"license": "apache-2.0", "tags": ["generated_from_trainer"], "base_model": "distilroberta-base", "model-index": [{"name": "distilroberta-base-finetuned-wikitext2", "results": []}]} | text-generation | Doniaa/Trial | [
"transformers",
"tensorboard",
"safetensors",
"roberta",
"text-generation",
"generated_from_trainer",
"base_model:distilroberta-base",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | 2024-02-13T10:10:29+00:00 | [] | [] | TAGS
#transformers #tensorboard #safetensors #roberta #text-generation #generated_from_trainer #base_model-distilroberta-base #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us
| distilroberta-base-finetuned-wikitext2
======================================
This model is a fine-tuned version of distilroberta-base on an unknown dataset.
It achieves the following results on the evaluation set:
* Loss: 3.2548
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 2e-05
* train\_batch\_size: 8
* eval\_batch\_size: 8
* seed: 42
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* num\_epochs: 5
### Training results
### Framework versions
* Transformers 4.35.2
* Pytorch 2.1.0+cu121
* Datasets 2.17.0
* Tokenizers 0.15.1
| [
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 8\n* eval\\_batch\\_size: 8\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 5",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.35.2\n* Pytorch 2.1.0+cu121\n* Datasets 2.17.0\n* Tokenizers 0.15.1"
] | [
"TAGS\n#transformers #tensorboard #safetensors #roberta #text-generation #generated_from_trainer #base_model-distilroberta-base #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 8\n* eval\\_batch\\_size: 8\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 5",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.35.2\n* Pytorch 2.1.0+cu121\n* Datasets 2.17.0\n* Tokenizers 0.15.1"
] | [
68,
98,
4,
33
] | [
"passage: TAGS\n#transformers #tensorboard #safetensors #roberta #text-generation #generated_from_trainer #base_model-distilroberta-base #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 8\n* eval\\_batch\\_size: 8\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 5### Training results### Framework versions\n\n\n* Transformers 4.35.2\n* Pytorch 2.1.0+cu121\n* Datasets 2.17.0\n* Tokenizers 0.15.1"
] | [
-0.08345858752727509,
0.07611770927906036,
-0.002433516550809145,
0.10026758909225464,
0.14040349423885345,
0.016541926190257072,
0.16321203112602234,
0.11313231289386749,
-0.08579763025045395,
0.04848911985754967,
0.13434210419654846,
0.13548874855041504,
0.009403061121702194,
0.12729816138744354,
-0.061710454523563385,
-0.2258564829826355,
0.008874216116964817,
0.03387179225683212,
-0.06310802698135376,
0.11269839107990265,
0.09860564768314362,
-0.12219689786434174,
0.09286748617887497,
-0.005950632505118847,
-0.1934249848127365,
0.016429411247372627,
0.015600067563354969,
-0.04883521795272827,
0.13486477732658386,
0.036466438323259354,
0.12966951727867126,
0.02112698182463646,
0.09002850949764252,
-0.2029232531785965,
0.011774693615734577,
0.0591641366481781,
-0.0048940484412014484,
0.07989782094955444,
0.02948121540248394,
0.007287887390702963,
0.1123942956328392,
-0.07502848654985428,
0.057950835675001144,
0.019036607816815376,
-0.11966536939144135,
-0.19359232485294342,
-0.08458975702524185,
0.03567004203796387,
0.09324698150157928,
0.09039580076932907,
-0.010054652579128742,
0.12716422975063324,
-0.045349061489105225,
0.0964915007352829,
0.22229214012622833,
-0.30710142850875854,
-0.0684443935751915,
0.060754768550395966,
0.05553913488984108,
0.07887103408575058,
-0.0988885834813118,
-0.015860429033637047,
0.07146281749010086,
0.02763674408197403,
0.12884080410003662,
-0.03426671773195267,
-0.07100808620452881,
0.008088395930826664,
-0.14462308585643768,
-0.020389923825860023,
0.16591699421405792,
0.04625248908996582,
-0.04126308113336563,
-0.0476016104221344,
-0.06613238155841827,
-0.14694832265377045,
-0.03588340803980827,
-0.028870560228824615,
0.04642540216445923,
-0.020283924415707588,
-0.06684088706970215,
-0.029281338676810265,
-0.1093803122639656,
-0.08126113563776016,
-0.059704892337322235,
0.1405573934316635,
0.03571874275803566,
-0.00005906631486141123,
-0.02255399525165558,
0.09617482870817184,
-0.04486753046512604,
-0.1258218139410019,
0.011922640725970268,
0.025001654401421547,
0.015628226101398468,
-0.05320858210325241,
-0.05988609790802002,
-0.09049784392118454,
0.027131395414471626,
0.16122794151306152,
-0.04417482390999794,
0.04279394447803497,
0.021548401564359665,
0.04909322038292885,
-0.09831281751394272,
0.16187317669391632,
-0.039321526885032654,
-0.037638306617736816,
0.01773259975016117,
0.06694766879081726,
0.0448034442961216,
-0.005919225048273802,
-0.1239905133843422,
0.021992729976773262,
0.09376055002212524,
0.010180395096540451,
-0.061903733760118484,
0.07327021658420563,
-0.042738284915685654,
0.00782833993434906,
-0.0036949722561985254,
-0.08534658700227737,
0.023775476962327957,
-0.012683883309364319,
-0.04589055851101875,
-0.04524901136755943,
0.036180928349494934,
0.02366664633154869,
0.008837971836328506,
0.0996837168931961,
-0.09009366482496262,
0.004843475762754679,
-0.09589379280805588,
-0.11714127659797668,
0.02052288129925728,
-0.08901740610599518,
0.03214254975318909,
-0.10961657017469406,
-0.1864001750946045,
-0.011210326105356216,
0.06161332502961159,
-0.03408314660191536,
-0.03278704732656479,
-0.05564650148153305,
-0.07977306097745895,
0.015923459082841873,
-0.015178723260760307,
0.09072060137987137,
-0.05913102999329567,
0.09334620833396912,
0.04930799826979637,
0.07181992381811142,
-0.05283811688423157,
0.03651142120361328,
-0.09308937937021255,
0.027928370982408524,
-0.17127878963947296,
0.013703836128115654,
-0.060428451746702194,
0.06921642273664474,
-0.07973038405179977,
-0.07087995111942291,
-0.020217590034008026,
0.018148677423596382,
0.0772799551486969,
0.08552606403827667,
-0.1603953093290329,
-0.0660477876663208,
0.18033280968666077,
-0.09395282715559006,
-0.1458417922258377,
0.12619957327842712,
-0.05374721437692642,
0.07603384554386139,
0.05821295827627182,
0.17225147783756256,
0.05933314934372902,
-0.10653477907180786,
-0.0004718810960184783,
-0.0037895780988037586,
0.04829942807555199,
-0.04880119115114212,
0.058841001242399216,
0.0029627832118421793,
0.021031668409705162,
0.016635550186038017,
-0.03576546534895897,
0.051415178924798965,
-0.08177065849304199,
-0.08285244554281235,
-0.05028768256306648,
-0.10008998960256577,
0.024769598618149757,
0.05091293156147003,
0.06543039530515671,
-0.11185801774263382,
-0.09056957811117172,
0.06765124201774597,
0.07499837875366211,
-0.07672091573476791,
0.02182682603597641,
-0.06619622558355331,
0.08079993724822998,
-0.06297652423381805,
-0.01597532257437706,
-0.15200822055339813,
-0.041412629187107086,
0.006823655683547258,
-0.00045223228516988456,
0.018231147900223732,
0.01744302734732628,
0.07589468359947205,
0.07521763443946838,
-0.06249614059925079,
-0.01845681294798851,
-0.02138562500476837,
0.014783560298383236,
-0.12186668068170547,
-0.19971616566181183,
-0.009192277677357197,
-0.036514222621917725,
0.13191430270671844,
-0.22735154628753662,
0.05163495987653732,
-0.009606555104255676,
0.08836563676595688,
0.034292615950107574,
-0.007353479508310556,
-0.05008181184530258,
0.07274783402681351,
-0.05127646401524544,
-0.05854925885796547,
0.05244901403784752,
0.012623165734112263,
-0.08322697132825851,
-0.038912516087293625,
-0.14850188791751862,
0.17832273244857788,
0.13570083677768707,
-0.09283223748207092,
-0.0821261778473854,
-0.002590759191662073,
-0.04496310278773308,
-0.03319411352276802,
-0.04192904010415077,
-0.00827493891119957,
0.12077319622039795,
-0.018653610721230507,
0.14611299335956573,
-0.07950049638748169,
-0.03745543211698532,
0.028298107907176018,
-0.054719265550374985,
0.009643903002142906,
0.10000594705343246,
0.11734707653522491,
-0.09046775847673416,
0.1512134075164795,
0.168884739279747,
-0.1082402840256691,
0.14262224733829498,
-0.04128725081682205,
-0.06672731041908264,
-0.024797096848487854,
0.00539715401828289,
0.008624987676739693,
0.11511754989624023,
-0.1297835409641266,
0.0002540361601859331,
0.007341122254729271,
0.01107130665332079,
0.022202569991350174,
-0.21794891357421875,
-0.03076460026204586,
0.038898732513189316,
-0.0461810939013958,
0.01725049689412117,
-0.012684937566518784,
-0.017292186617851257,
0.0936676561832428,
-0.005544451996684074,
-0.08174587041139603,
0.038275282829999924,
-0.0006300880922935903,
-0.08117826282978058,
0.20805495977401733,
-0.07443928718566895,
-0.12387751042842865,
-0.14315088093280792,
-0.0769542008638382,
-0.03348730504512787,
0.023958804085850716,
0.07067432254552841,
-0.07037890702486038,
-0.0469658300280571,
-0.10458541661500931,
0.01748441904783249,
0.03389362618327141,
0.025871438905596733,
0.029014375060796738,
0.009377586655318737,
0.0707268938422203,
-0.10662169754505157,
-0.009930874221026897,
-0.04694436118006706,
-0.05708371475338936,
0.032104555517435074,
0.026447327807545662,
0.12296346575021744,
0.13673481345176697,
-0.018275385722517967,
-0.001434018020518124,
-0.030628319829702377,
0.22372569143772125,
-0.06419318914413452,
-0.014556650072336197,
0.13631239533424377,
-0.018037939444184303,
0.046140242367982864,
0.13673634827136993,
0.05915343016386032,
-0.09420359879732132,
0.02098148688673973,
0.038043711334466934,
-0.03323415294289589,
-0.21289733052253723,
-0.02935653366148472,
-0.04462822154164314,
0.008015529252588749,
0.08776531368494034,
0.03300902992486954,
0.04986097663640976,
0.07330197095870972,
0.03327565640211105,
0.08631352335214615,
-0.013127275742590427,
0.07818129658699036,
0.12199466675519943,
0.0378226637840271,
0.12818357348442078,
-0.054844748228788376,
-0.05975743755698204,
0.033560752868652344,
0.007951516658067703,
0.221388578414917,
0.02823413535952568,
0.13658714294433594,
0.06290949881076813,
0.14743682742118835,
-0.005909011233597994,
0.07077706605195999,
-0.012534277513623238,
-0.05111127346754074,
-0.012025067582726479,
-0.049326688051223755,
-0.0174450371414423,
0.046920519322156906,
-0.10474453121423721,
0.05350726097822189,
-0.09470934420824051,
0.030922628939151764,
0.054801490157842636,
0.23539750277996063,
0.04575067386031151,
-0.32484835386276245,
-0.0888262614607811,
0.023744706064462662,
-0.02915298379957676,
-0.02199678122997284,
0.031986501067876816,
0.12579408288002014,
-0.05267147719860077,
0.02653055638074875,
-0.0740315243601799,
0.08211415261030197,
-0.02504258044064045,
0.04319947585463524,
0.05805953964591026,
0.09735238552093506,
-0.00587481539696455,
0.06699026376008987,
-0.2859300673007965,
0.2785884439945221,
0.008028744719922543,
0.08350958675146103,
-0.04912848025560379,
0.010766604915261269,
0.02467198297381401,
0.07372972369194031,
0.08039481192827225,
-0.02848856896162033,
-0.08243478089570999,
-0.1774367392063141,
-0.046566516160964966,
0.027379285544157028,
0.09862792491912842,
-0.02271958254277706,
0.10847839713096619,
-0.04090363532304764,
0.004640320315957069,
0.08833059668540955,
-0.012978432700037956,
-0.07977700233459473,
-0.10377073287963867,
-0.012724360451102257,
0.036669597029685974,
-0.03253453969955444,
-0.08137821406126022,
-0.09428581595420837,
-0.12454306334257126,
0.15277214348316193,
-0.05551169440150261,
-0.019705116748809814,
-0.09813292324542999,
0.06580430269241333,
0.05740252882242203,
-0.07890326529741287,
0.06021590903401375,
0.010743332095444202,
0.08323987573385239,
0.01803473010659218,
-0.058475740253925323,
0.11892760545015335,
-0.08104778081178665,
-0.16276144981384277,
-0.07159648835659027,
0.09413120150566101,
0.01315623801201582,
0.04164328798651695,
-0.0009685414843261242,
0.013874531723558903,
-0.029172267764806747,
-0.07896314561367035,
0.020346881821751595,
-0.007831981405615807,
0.05828864872455597,
0.0007867304375395179,
-0.06962043792009354,
-0.0033066789619624615,
-0.04659648239612579,
-0.04857555404305458,
0.16074085235595703,
0.2773028314113617,
-0.08759690076112747,
-0.0004130755551159382,
0.05655180662870407,
-0.07216419279575348,
-0.20489150285720825,
0.032150186598300934,
0.029928503558039665,
0.006659740582108498,
0.047062937170267105,
-0.14231976866722107,
0.11299409717321396,
0.10703272372484207,
-0.021097535267472267,
0.1046915203332901,
-0.30106320977211,
-0.13543298840522766,
0.12721015512943268,
0.15318165719509125,
0.11586689949035645,
-0.15841546654701233,
-0.031811390072107315,
-0.03619222715497017,
-0.13284559547901154,
0.09589894860982895,
-0.1275532841682434,
0.11473473906517029,
-0.010892868973314762,
0.06163668632507324,
-0.00026632685330696404,
-0.062288202345371246,
0.12038755416870117,
-0.014071052893996239,
0.10927790403366089,
-0.06727563589811325,
-0.009558402933180332,
0.056581050157547,
-0.04987519606947899,
0.02648303657770157,
-0.11702403426170349,
0.03173621743917465,
-0.04290533438324928,
-0.03439858928322792,
-0.044029105454683304,
0.03826732560992241,
-0.03802983835339546,
-0.06777091324329376,
-0.0434492751955986,
0.015711847692728043,
0.03064478002488613,
-0.01693534106016159,
0.14289169013500214,
0.010199540294706821,
0.15378834307193756,
0.13533097505569458,
0.07485658675432205,
-0.06684575229883194,
-0.023365410044789314,
0.0006610240670852363,
-0.03576252982020378,
0.0565255731344223,
-0.15482543408870697,
0.027839289978146553,
0.11756378412246704,
0.009007042273879051,
0.1496729999780655,
0.0742945671081543,
-0.03328637406229973,
0.020004961639642715,
0.07650832831859589,
-0.1605442315340042,
-0.11073404550552368,
-0.003207610687240958,
-0.043953701853752136,
-0.11269360780715942,
0.06742126494646072,
0.1210051029920578,
-0.07669895887374878,
0.011069906875491142,
-0.011747968383133411,
0.011931381188333035,
-0.049140747636556625,
0.1784394383430481,
0.05925635248422623,
0.04403204470872879,
-0.0689391940832138,
0.0737728476524353,
0.03187359496951103,
-0.06955727189779282,
0.01946997456252575,
0.04853029549121857,
-0.0759899690747261,
-0.043811991810798645,
0.0533050075173378,
0.18722184002399445,
-0.054337028414011,
-0.0530276745557785,
-0.14908413589000702,
-0.11641382426023483,
0.05360036715865135,
0.19525237381458282,
0.09528402984142303,
0.012023882009088993,
-0.034293290227651596,
0.025555837899446487,
-0.12443611025810242,
0.11276324093341827,
0.03815286234021187,
0.08702172338962555,
-0.15020304918289185,
0.11765957623720169,
-0.0018453572411090136,
0.003036820562556386,
-0.027569903060793877,
0.050871655344963074,
-0.1163724958896637,
-0.006314377766102552,
-0.12708207964897156,
-0.02104642428457737,
-0.03221196308732033,
-0.003813599469140172,
0.007467073854058981,
-0.05613020807504654,
-0.07043779641389847,
0.01618894375860691,
-0.09534992277622223,
-0.018968896940350533,
0.03602933883666992,
0.05586924031376839,
-0.12138056010007858,
-0.030770383775234222,
0.027738777920603752,
-0.06539234519004822,
0.061149708926677704,
0.023769769817590714,
0.026488741859793663,
0.053908176720142365,
-0.17835412919521332,
0.04066471382975578,
0.07094567269086838,
0.010199316777288914,
0.04230727627873421,
-0.08495165407657623,
-0.016972094774246216,
-0.0007132422761060297,
0.0549282506108284,
0.015788350254297256,
0.06422004103660583,
-0.12574338912963867,
0.009651938453316689,
-0.03715367987751961,
-0.06896113604307175,
-0.05794219672679901,
0.019856754690408707,
0.09021977335214615,
-0.0009188277181237936,
0.1990557760000229,
-0.09552884101867676,
0.016657110303640366,
-0.20056360960006714,
0.014825943857431412,
0.0056166816502809525,
-0.10946466028690338,
-0.11474528163671494,
-0.06330868601799011,
0.04270013049244881,
-0.05984985828399658,
0.15087653696537018,
0.004334236029535532,
0.011263079941272736,
0.03663322329521179,
-0.04383581504225731,
0.037807680666446686,
0.025239186361432076,
0.2293316125869751,
0.032711632549762726,
-0.037308987230062485,
-0.0005727113457396626,
0.04114822298288345,
0.11300858110189438,
0.06240274757146835,
0.16761913895606995,
0.16113992035388947,
-0.04598230868577957,
0.11600474268198013,
0.05355855077505112,
-0.05437672138214111,
-0.12965047359466553,
0.06720975786447525,
-0.043771449476480484,
0.09129083901643753,
-0.02724011056125164,
0.20505854487419128,
0.11820191890001297,
-0.15364621579647064,
0.00959702767431736,
-0.05041781812906265,
-0.07837077975273132,
-0.11230289936065674,
-0.0524095818400383,
-0.09827960282564163,
-0.15334045886993408,
0.007177416235208511,
-0.11597257107496262,
0.013943923637270927,
0.0932130292057991,
0.007231622003018856,
-0.016784772276878357,
0.17887656390666962,
0.014476751908659935,
0.033473532646894455,
0.042497504502534866,
0.001117998268455267,
-0.032315466552972794,
-0.10046716779470444,
-0.08103344589471817,
-0.006744718644768,
-0.018868355080485344,
0.021298982203006744,
-0.04539373889565468,
-0.027252424508333206,
0.04216427356004715,
-0.012241915799677372,
-0.09483315050601959,
0.007699227426201105,
0.03183714672923088,
0.047982558608055115,
0.03215897083282471,
0.0020493410993367434,
0.0050349729135632515,
0.00010157067299587652,
0.21455182135105133,
-0.07870321720838547,
-0.07717686146497726,
-0.10241597145795822,
0.19785474240779877,
0.03354838117957115,
0.019672775641083717,
0.001258699456229806,
-0.0826977789402008,
0.025647524744272232,
0.23854508996009827,
0.18099446594715118,
-0.06907124072313309,
-0.0013108125422149897,
0.003350545186549425,
-0.009402837604284286,
-0.04537735506892204,
0.08754539489746094,
0.12185650318861008,
0.043123021721839905,
-0.0736887976527214,
-0.05140001326799393,
-0.03527302294969559,
-0.004794827196747065,
-0.043937068432569504,
0.048259228467941284,
0.03565199673175812,
0.008959793485701084,
-0.03726023808121681,
0.054067086428403854,
-0.0353313684463501,
-0.09458194673061371,
0.05100494623184204,
-0.19509491324424744,
-0.1417035311460495,
-0.0061378260143101215,
0.12781235575675964,
-0.02153659239411354,
0.05180525779724121,
-0.034643176943063736,
-0.007987434975802898,
0.07663948088884354,
-0.023803843185305595,
-0.07278525084257126,
-0.06497693061828613,
0.055241771042346954,
-0.09486434608697891,
0.22449913620948792,
-0.04459742084145546,
0.0460035540163517,
0.13464656472206116,
0.040778569877147675,
-0.07551173865795135,
0.08942783623933792,
0.04546962305903435,
-0.07376838475465775,
0.031001698225736618,
0.07904908806085587,
-0.041968364268541336,
0.11129932850599289,
0.05352366715669632,
-0.13573355972766876,
0.023312155157327652,
-0.051951389759778976,
-0.0772101879119873,
-0.0469391793012619,
-0.03525449335575104,
-0.06733235716819763,
0.13913500308990479,
0.1846131533384323,
-0.034203916788101196,
0.008715596981346607,
-0.048852335661649704,
0.03527196869254112,
0.07242638617753983,
0.04902678728103638,
-0.031726203858852386,
-0.23743025958538055,
0.02584819868206978,
0.07450920343399048,
-0.01641952060163021,
-0.28159913420677185,
-0.09509453177452087,
-0.0050046974793076515,
-0.050304513424634933,
-0.09798592329025269,
0.07585301250219345,
0.14445258677005768,
0.055531371384859085,
-0.06094427406787872,
-0.0994575098156929,
-0.07693617790937424,
0.15705393254756927,
-0.13242176175117493,
-0.10346879810094833
] |
null | null | transformers |
# Model Trained Using AutoTrain
This model was trained using AutoTrain. For more information, please visit [AutoTrain](https://hf.co/docs/autotrain).
# Usage
```python
from transformers import AutoModelForCausalLM, AutoTokenizer
model_path = "PATH_TO_THIS_REPO"
tokenizer = AutoTokenizer.from_pretrained(model_path)
model = AutoModelForCausalLM.from_pretrained(
model_path,
device_map="auto",
torch_dtype='auto'
).eval()
# Prompt content: "hi"
messages = [
{"role": "user", "content": "hi"}
]
input_ids = tokenizer.apply_chat_template(conversation=messages, tokenize=True, add_generation_prompt=True, return_tensors='pt')
output_ids = model.generate(input_ids.to(model.device))  # send inputs to the device the model was loaded on, rather than hard-coding 'cuda'
response = tokenizer.decode(output_ids[0][input_ids.shape[1]:], skip_special_tokens=True)
# Model response: "Hello! How can I assist you today?"
print(response)
``` | {"license": "other", "tags": ["autotrain", "text-generation"], "widget": [{"text": "I love AutoTrain because "}]} | text-generation | Jimmyhd/llama213b1000RowsCsvData | [
"transformers",
"safetensors",
"llama",
"text-generation",
"autotrain",
"conversational",
"license:other",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | 2024-02-13T10:11:24+00:00 | [] | [] | TAGS
#transformers #safetensors #llama #text-generation #autotrain #conversational #license-other #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
# Model Trained Using AutoTrain
This model was trained using AutoTrain. For more information, please visit AutoTrain.
# Usage
| [
"# Model Trained Using AutoTrain\n\nThis model was trained using AutoTrain. For more information, please visit AutoTrain.",
"# Usage"
] | [
"TAGS\n#transformers #safetensors #llama #text-generation #autotrain #conversational #license-other #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n",
"# Model Trained Using AutoTrain\n\nThis model was trained using AutoTrain. For more information, please visit AutoTrain.",
"# Usage"
] | [
60,
29,
3
] | [
"passage: TAGS\n#transformers #safetensors #llama #text-generation #autotrain #conversational #license-other #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n# Model Trained Using AutoTrain\n\nThis model was trained using AutoTrain. For more information, please visit AutoTrain.# Usage"
] | [
-0.019985521212220192,
0.041816771030426025,
-0.0015093530528247356,
0.04983396455645561,
0.12702052295207977,
-0.03985721990466118,
0.2503778636455536,
0.05486364662647247,
-0.07805269211530685,
-0.09162702411413193,
0.17846965789794922,
0.1966569423675537,
-0.04739704728126526,
0.1650119423866272,
-0.049305956810712814,
-0.24425874650478363,
0.0308239683508873,
-0.027378594502806664,
0.07924526929855347,
0.1134583130478859,
0.1459313929080963,
-0.069073885679245,
0.07192834466695786,
0.03350326791405678,
-0.19196271896362305,
0.03607437387108803,
0.06325650960206985,
-0.13595952093601227,
0.1781395524740219,
0.0730607882142067,
0.09795299917459488,
0.05176535248756409,
0.11600667238235474,
-0.13303197920322418,
0.02148369513452053,
0.0012429648777469993,
-0.01376256626099348,
0.07068803161382675,
0.059910088777542114,
-0.05214822664856911,
0.0754503607749939,
0.16066469252109528,
0.10474828630685806,
0.04625413194298744,
-0.10307525843381882,
0.03606079891324043,
-0.007664162199944258,
0.01607259176671505,
0.11656992882490158,
0.11167781800031662,
-0.011104828678071499,
0.1692391037940979,
-0.12352225184440613,
0.0927673876285553,
-0.06312526017427444,
-0.27212998270988464,
-0.01953640766441822,
0.1782287359237671,
0.05536120757460594,
0.003713849000632763,
-0.11036768555641174,
0.08918768912553787,
0.10817500948905945,
-0.013457630760967731,
0.07449526339769363,
-0.02930355817079544,
-0.0747155100107193,
-0.010549373924732208,
-0.07913186401128769,
0.021456321701407433,
0.19510410726070404,
-0.07658466696739197,
-0.027360891923308372,
-0.12176856398582458,
-0.028924891725182533,
0.019943704828619957,
-0.0005201206658966839,
-0.09924954175949097,
-0.020720934495329857,
0.08925458788871765,
-0.03742996230721474,
-0.03304271399974823,
-0.12941165268421173,
-0.05689829960465431,
-0.10432478040456772,
0.080483578145504,
0.004176578018814325,
-0.010131161659955978,
-0.10812576860189438,
0.10831553488969803,
0.036837704479694366,
-0.10601683706045151,
0.04897525534033775,
-0.08127395063638687,
0.033307481557130814,
-0.08884010463953018,
-0.020026516169309616,
-0.12393411993980408,
0.03442299738526344,
0.1911761611700058,
0.17161144316196442,
-0.007555702235549688,
-0.09846625477075577,
0.037637125700712204,
0.009408433921635151,
0.11369463801383972,
0.046487849205732346,
-0.045776110142469406,
0.06709876656532288,
-0.04264026880264282,
-0.007700521033257246,
-0.0333063006401062,
-0.19418306648731232,
0.0378468781709671,
0.021328622475266457,
0.07297981530427933,
-0.06390932202339172,
0.10803848505020142,
-0.02806558459997177,
0.036218613386154175,
0.013528588227927685,
-0.050648901611566544,
0.027312688529491425,
-0.06237490847706795,
-0.0030069053173065186,
-0.05522473528981209,
0.05534237623214722,
0.11048513650894165,
0.02016965113580227,
0.09672053903341293,
-0.07597221434116364,
-0.042308881878852844,
-0.10714275389909744,
-0.0614769272506237,
0.004838209133595228,
0.041153065860271454,
0.04866805672645569,
-0.20337146520614624,
-0.28446313738822937,
-0.01588602364063263,
0.0521208830177784,
-0.021559109911322594,
-0.08046071231365204,
-0.10747288912534714,
0.00881670881062746,
0.04961434006690979,
-0.041250426322221756,
0.050605256110429764,
-0.0236679557710886,
0.03517379239201546,
-0.05794162675738335,
0.028720907866954803,
-0.06626300513744354,
0.022746717557311058,
-0.13074947893619537,
-0.012705030851066113,
-0.005872172769159079,
0.0556497685611248,
-0.029873713850975037,
0.14915479719638824,
-0.01976943202316761,
0.04168194904923439,
-0.032731082290410995,
0.06638596206903458,
0.011397413909435272,
0.15701933205127716,
-0.14462588727474213,
-0.026569174602627754,
0.15484152734279633,
-0.10750064253807068,
-0.12428124994039536,
0.10975891351699829,
-0.10399746894836426,
0.2686645984649658,
0.12837029993534088,
0.11094734072685242,
0.052566420286893845,
-0.09157175570726395,
0.10807830095291138,
0.01006117183715105,
-0.07926932722330093,
-0.033996015787124634,
-0.0030480532441288233,
0.021757712587714195,
-0.2016487568616867,
0.042763303965330124,
0.1273248940706253,
0.07622935622930527,
-0.042705241590738297,
-0.08762288838624954,
-0.013048309832811356,
-0.06594497710466385,
0.05332009866833687,
-0.0165407694876194,
0.12763361632823944,
-0.06454508751630783,
-0.03124934248626232,
0.06888130307197571,
0.05346184968948364,
0.03705662116408348,
-0.04548288881778717,
-0.09575856477022171,
-0.036537256091833115,
-0.02627917379140854,
0.02103262208402157,
-0.08334317058324814,
-0.058662403374910355,
-0.029133448377251625,
0.1217503622174263,
0.06030846759676933,
0.08229882270097733,
0.02534819208085537,
0.042578503489494324,
-0.018014518544077873,
0.01894560270011425,
0.18454627692699432,
0.032461415976285934,
-0.12112367898225784,
-0.11247295141220093,
0.11623511463403702,
-0.07412275671958923,
0.1516924351453781,
-0.23376472294330597,
0.03449954092502594,
-0.10606535524129868,
0.08345958590507507,
0.0017077438533306122,
0.08513399958610535,
-0.07354991883039474,
0.028825750574469566,
-0.10780332237482071,
0.006710950750857592,
0.06633526086807251,
0.035992223769426346,
-0.053377557545900345,
0.15349788963794708,
-0.16296741366386414,
0.2517978847026825,
0.12181705981492996,
-0.13286034762859344,
-0.08410528302192688,
-0.10299656540155411,
0.008393438532948494,
-0.014958925545215607,
-0.09269360452890396,
-0.0077692181803286076,
0.11051994562149048,
-0.0386928990483284,
0.19442617893218994,
-0.022682785987854004,
-0.023846805095672607,
-0.018296057358384132,
-0.0968841016292572,
-0.0053712609224021435,
0.01993531733751297,
0.09277937561273575,
-0.19608037173748016,
0.13585902750492096,
0.13785377144813538,
-0.03008674643933773,
0.19921445846557617,
0.03519749641418457,
0.027803806588053703,
0.007066489662975073,
-0.05108575150370598,
0.004615103360265493,
-0.016708610579371452,
-0.033214226365089417,
-0.03719793260097504,
0.015435577370226383,
-0.0017268992960453033,
0.032471369951963425,
-0.1292077600955963,
-0.043498411774635315,
0.020972639322280884,
0.04506610706448555,
0.04959072545170784,
0.06424178183078766,
-0.08114660531282425,
0.08710927516222,
-0.04135018214583397,
-0.15011906623840332,
0.12193439155817032,
0.009275504387915134,
-0.11306660622358322,
0.16927401721477509,
-0.08475425839424133,
-0.23736794292926788,
-0.20089298486709595,
-0.1772872656583786,
-0.019119156524538994,
0.07307472079992294,
0.06685910373926163,
-0.06804469972848892,
-0.07033377140760422,
-0.015055990777909756,
-0.06428932398557663,
0.01824975572526455,
-0.017032640054821968,
-0.08446785062551498,
0.0451301671564579,
-0.013684476725757122,
-0.11405912786722183,
-0.04632818326354027,
0.014144818298518658,
-0.06867410987615585,
0.0687858834862709,
-0.058338385075330734,
0.06045600399374962,
0.15961861610412598,
-0.019494086503982544,
0.0206222515553236,
-0.03382864594459534,
0.14078357815742493,
-0.07376300543546677,
-0.0012525641359388828,
0.11309554427862167,
-0.062387753278017044,
0.028928736224770546,
0.20800267159938812,
0.022075070068240166,
-0.08314149081707001,
0.08971244096755981,
-0.030545884743332863,
-0.06756961345672607,
-0.20312197506427765,
-0.11030048131942749,
-0.008868525736033916,
0.018859686329960823,
0.07774554193019867,
0.054550036787986755,
0.27576303482055664,
0.1262982189655304,
0.06817155331373215,
0.06489472836256027,
0.02961319498717785,
0.0905478373169899,
0.17955242097377777,
-0.04619118198752403,
0.18362373113632202,
-0.06867803633213043,
-0.18565845489501953,
0.038355518132448196,
-0.013611708767712116,
0.05560639500617981,
0.1625892072916031,
-0.010438076220452785,
0.03667459264397621,
0.07544669508934021,
0.1377580612897873,
0.11926194280385971,
0.07736333459615707,
-0.053890686482191086,
-0.0123245595023036,
-0.01958155632019043,
-0.06044450402259827,
0.12492989748716354,
-0.05211121216416359,
-0.054029930382966995,
-0.024178169667720795,
0.05096742510795593,
0.04076123610138893,
0.0885038748383522,
0.0038742104079574347,
-0.2888331115245819,
0.029169974848628044,
0.046290311962366104,
-0.07681751251220703,
-0.0919661894440651,
0.09150734543800354,
-0.013760070316493511,
-0.16257907450199127,
0.015721965581178665,
-0.028128916397690773,
0.09203445166349411,
-0.0021014101803302765,
0.07018017023801804,
-0.09545313566923141,
-0.03008081018924713,
-0.042241763323545456,
0.14214786887168884,
-0.3904902935028076,
0.20279116928577423,
-0.013757672160863876,
0.042110588401556015,
-0.11073333024978638,
0.007113968953490257,
0.08234450221061707,
0.16876806318759918,
0.10588139295578003,
-0.06181146577000618,
-0.12684638798236847,
-0.10665581375360489,
-0.10000576823949814,
-0.002122523495927453,
0.02025875821709633,
-0.01400430966168642,
0.02576049603521824,
-0.11316744238138199,
-0.006981417536735535,
0.045942071825265884,
0.00005032867193222046,
-0.1327565312385559,
-0.1645476371049881,
0.0024751632008701563,
0.05992445722222328,
0.12014522403478622,
-0.031115030869841576,
-0.08387571573257446,
-0.09976387768983841,
0.17837554216384888,
0.05166070535778999,
-0.0018781357211992145,
-0.12678098678588867,
-0.04034697636961937,
-0.05093073844909668,
-0.028619080781936646,
0.07061373442411423,
0.0115804523229599,
0.11983727663755417,
-0.0776883065700531,
-0.08326467871665955,
0.10231561213731766,
-0.11192917823791504,
-0.05676306411623955,
-0.10391143709421158,
0.037218786776065826,
-0.03583132475614548,
-0.00300496444106102,
0.10846041887998581,
0.03224887326359749,
-0.06038367375731468,
-0.06040918827056885,
-0.030626386404037476,
0.002953569171950221,
-0.024285854771733284,
-0.10091901570558548,
-0.12347328662872314,
-0.11263954639434814,
-0.02959991805255413,
-0.12120452523231506,
0.22577744722366333,
0.14880536496639252,
-0.08410761505365372,
0.14291362464427948,
0.19940108060836792,
-0.11654946208000183,
-0.31744757294654846,
-0.06689527630805969,
-0.0593874417245388,
0.013890923000872135,
0.03878229483962059,
-0.12954181432724,
0.1002892553806305,
0.01007110346108675,
-0.07845258712768555,
-0.02825743705034256,
-0.14488260447978973,
-0.1637139469385147,
0.2514044940471649,
0.017858991399407387,
0.2469845563173294,
-0.09900328516960144,
-0.0562664233148098,
-0.15211105346679688,
0.03663777932524681,
0.07304413616657257,
-0.07589749246835709,
0.07363534718751907,
0.04355068877339363,
0.08316291868686676,
0.03649081289768219,
-0.01850447990000248,
0.052836280316114426,
-0.054001953452825546,
0.0728193074464798,
-0.16864342987537384,
-0.03414404019713402,
0.027340056374669075,
-0.022704705595970154,
0.10976030677556992,
-0.0730476900935173,
0.02634204924106598,
-0.030788259580731392,
-0.07285180687904358,
0.03122970648109913,
0.06520438939332962,
-0.0009521509637124836,
-0.1139688566327095,
0.0039772349409759045,
-0.012743611820042133,
0.010530095547437668,
-0.05437423661351204,
0.0461973138153553,
-0.04407922551035881,
0.13509979844093323,
0.1618819236755371,
0.23810654878616333,
-0.05280663073062897,
0.09310457110404968,
-0.033480700105428696,
-0.11406292766332626,
0.08189824223518372,
-0.08512503653764725,
0.032127875834703445,
0.07843150943517685,
-0.044101208448410034,
0.16950587928295135,
0.05329832062125206,
0.014960072003304958,
-0.01104823499917984,
0.15519900619983673,
-0.16232933104038239,
0.024055518209934235,
-0.08237790316343307,
0.12376146763563156,
0.05088603496551514,
0.0014241976896300912,
0.1342407464981079,
-0.09748697280883789,
-0.014323112554848194,
0.022257447242736816,
0.007675370201468468,
-0.033624421805143356,
0.10129436105489731,
0.04010987654328346,
0.014930672012269497,
-0.07691510766744614,
0.03148246556520462,
0.0725991502404213,
0.015066240914165974,
0.04953509941697121,
0.0274610985070467,
-0.0900094285607338,
-0.1050470843911171,
0.005259819328784943,
0.2579692006111145,
-0.19413785636425018,
-0.08945966511964798,
-0.025965580716729164,
-0.12841492891311646,
0.014765934087336063,
0.11006096005439758,
0.07310435175895691,
0.044437263160943985,
-0.06245476007461548,
-0.02698042243719101,
-0.11322736740112305,
0.10147830843925476,
0.009620734490454197,
0.04834596440196037,
-0.1538265496492386,
0.08177132904529572,
-0.029066724702715874,
0.00937450211495161,
-0.09466302394866943,
-0.042740534991025925,
-0.1231655478477478,
0.03438199684023857,
-0.1424497365951538,
-0.03872010484337807,
-0.03764064237475395,
-0.0057784151285886765,
0.060039203613996506,
-0.010813248343765736,
-0.026481660082936287,
-0.03197747841477394,
-0.09286012500524521,
0.03200497850775719,
-0.00152523850556463,
0.05066218599677086,
-0.047269031405448914,
-0.034692924469709396,
0.0329509861767292,
-0.007188139017671347,
0.0626831129193306,
0.005206009838730097,
-0.02624976634979248,
0.05140461027622223,
-0.1694229692220688,
0.02440795861184597,
0.082680344581604,
0.006990185473114252,
0.028766458854079247,
-0.03343239799141884,
-0.007944096811115742,
0.10507944226264954,
0.045716822147369385,
0.04421989992260933,
-0.0016878187889233232,
-0.09214121103286743,
0.04705626890063286,
0.06632386893033981,
-0.12569046020507812,
-0.02528555504977703,
-0.032815009355545044,
-0.0014488169690594077,
-0.03754571080207825,
0.21374857425689697,
-0.12118947505950928,
0.04649411514401436,
-0.040373969823122025,
0.03989865258336067,
-0.03320062533020973,
-0.133179172873497,
-0.09410917013883591,
-0.12265769392251968,
-0.029428137466311455,
-0.006662990897893906,
0.26298657059669495,
0.1437089890241623,
-0.026487676426768303,
0.03210904821753502,
0.06094885990023613,
0.06782585382461548,
0.017442740499973297,
0.1971471905708313,
0.11334112286567688,
0.022985318675637245,
-0.12867672741413116,
0.071586012840271,
0.04378187656402588,
-0.05412180349230766,
-0.0045047118328511715,
-0.009332980960607529,
-0.0896068587899208,
0.06856201589107513,
0.07274024933576584,
-0.017855705693364143,
-0.07965993136167526,
-0.14463277161121368,
-0.1213601604104042,
0.037436988204717636,
-0.09540953487157822,
0.01893600821495056,
0.1667509526014328,
-0.04064670577645302,
-0.010678790509700775,
-0.05300956964492798,
-0.04512229934334755,
-0.2152797430753708,
-0.15672186017036438,
-0.11946940422058105,
-0.0855121985077858,
0.02102506160736084,
-0.043301086872816086,
0.052764277905225754,
0.029187506064772606,
0.06313791871070862,
-0.05562494322657585,
0.09108006209135056,
-0.09731566905975342,
-0.005315596237778664,
0.015475943684577942,
-0.061675529927015305,
0.010126902721822262,
-0.21131323277950287,
-0.008043201640248299,
-0.1311621069908142,
0.03128067031502724,
-0.03407350182533264,
-0.01647294871509075,
0.011170338839292526,
0.0016343420138582587,
-0.04606010392308235,
-0.022542638704180717,
-0.018754055723547935,
0.030862459912896156,
0.018654366955161095,
0.04895023629069328,
0.011602147482335567,
-0.007601312827318907,
0.04120941832661629,
0.2065572589635849,
-0.04314391687512398,
-0.18267865478992462,
-0.13445086777210236,
0.23455701768398285,
0.008398239500820637,
0.12134525924921036,
-0.06294604390859604,
0.00014360684144776314,
0.037056922912597656,
0.3312853276729584,
0.2920037806034088,
-0.06591223180294037,
0.016222776845097542,
-0.03496059402823448,
-0.0027518460992723703,
-0.010928671807050705,
0.15421736240386963,
0.028664609417319298,
0.17177550494670868,
-0.04381604865193367,
0.043717969208955765,
-0.016995223239064217,
-0.08461270481348038,
-0.04839999973773956,
0.1234268918633461,
-0.023147767409682274,
-0.008460386656224728,
-0.025535622611641884,
0.07288745045661926,
-0.10863745957612991,
0.12828007340431213,
-0.14435164630413055,
-0.04277050495147705,
-0.06639285385608673,
0.03365843743085861,
0.09799379110336304,
-0.002150370506569743,
0.03132012113928795,
-0.028657719492912292,
-0.01699075475335121,
0.02508445642888546,
-0.035447027534246445,
-0.09260720759630203,
-0.014365178532898426,
0.0811554342508316,
-0.0027090583462268114,
0.19214791059494019,
-0.017216810956597328,
0.056720808148384094,
0.07222237437963486,
-0.00482771685346961,
-0.08674640208482742,
0.1075904443860054,
-0.0048650032840669155,
-0.05891861394047737,
0.12367065995931625,
-0.010637915693223476,
0.01167653501033783,
0.01576627418398857,
0.004334752913564444,
-0.1429060995578766,
0.12325593084096909,
-0.09570106863975525,
-0.09618059545755386,
-0.048408687114715576,
0.0977308452129364,
-0.025742121040821075,
0.14952276647090912,
0.07897619158029556,
-0.021133622154593468,
0.017186688259243965,
-0.026781605556607246,
0.05889764428138733,
-0.02527386136353016,
-0.13093510270118713,
-0.030025644227862358,
-0.19915463030338287,
-0.0342443622648716,
0.06796575337648392,
-0.01800817996263504,
-0.27645865082740784,
-0.07106360793113708,
-0.08814076334238052,
-0.034605976194143295,
-0.13013334572315216,
0.0708061084151268,
0.23948776721954346,
0.02763688750565052,
-0.012005307711660862,
-0.1307791769504547,
-0.004188409075140953,
0.03875477984547615,
-0.04872049018740654,
-0.09937731176614761
] |
null | null | transformers | # [MaziyarPanahi/sqlcoder-7b-GGUF](https://huggingface.co/MaziyarPanahi/sqlcoder-7b-GGUF)
- Model creator: [defog](https://huggingface.co/defog)
- Original model: [defog/sqlcoder-7b](https://huggingface.co/defog/sqlcoder-7b)
## Description
[MaziyarPanahi/sqlcoder-7b-GGUF](https://huggingface.co/MaziyarPanahi/sqlcoder-7b-GGUF) contains GGUF format model files for [defog/sqlcoder-7b](https://huggingface.co/defog/sqlcoder-7b).
## How to use
Thanks to [TheBloke](https://huggingface.co/TheBloke) for preparing an amazing README on how to use GGUF models:
### About GGUF
GGUF is a new format introduced by the llama.cpp team on August 21st 2023. It is a replacement for GGML, which is no longer supported by llama.cpp.
Here is an incomplete list of clients and libraries that are known to support GGUF:
* [llama.cpp](https://github.com/ggerganov/llama.cpp). The source project for GGUF. Offers a CLI and a server option.
* [text-generation-webui](https://github.com/oobabooga/text-generation-webui), the most widely used web UI, with many features and powerful extensions. Supports GPU acceleration.
* [KoboldCpp](https://github.com/LostRuins/koboldcpp), a fully featured web UI, with GPU accel across all platforms and GPU architectures. Especially good for story telling.
* [GPT4All](https://gpt4all.io/index.html), a free and open source local running GUI, supporting Windows, Linux and macOS with full GPU accel.
* [LM Studio](https://lmstudio.ai/), an easy-to-use and powerful local GUI for Windows and macOS (Silicon), with GPU acceleration. Linux available, in beta as of 27/11/2023.
* [LoLLMS Web UI](https://github.com/ParisNeo/lollms-webui), a great web UI with many interesting and unique features, including a full model library for easy model selection.
* [Faraday.dev](https://faraday.dev/), an attractive and easy to use character-based chat GUI for Windows and macOS (both Silicon and Intel), with GPU acceleration.
* [llama-cpp-python](https://github.com/abetlen/llama-cpp-python), a Python library with GPU accel, LangChain support, and OpenAI-compatible API server.
* [candle](https://github.com/huggingface/candle), a Rust ML framework with a focus on performance, including GPU support, and ease of use.
* [ctransformers](https://github.com/marella/ctransformers), a Python library with GPU accel, LangChain support, and OpenAI-compatible AI server. Note, as of time of writing (November 27th 2023), ctransformers has not been updated in a long time and does not support many recent models.
### Explanation of quantisation methods
<details>
<summary>Click to see details</summary>
The new methods available are:
* GGML_TYPE_Q2_K - "type-1" 2-bit quantization in super-blocks containing 16 blocks, each block having 16 weights. Block scales and mins are quantized with 4 bits. This ends up effectively using 2.5625 bits per weight (bpw)
* GGML_TYPE_Q3_K - "type-0" 3-bit quantization in super-blocks containing 16 blocks, each block having 16 weights. Scales are quantized with 6 bits. This ends up using 3.4375 bpw.
* GGML_TYPE_Q4_K - "type-1" 4-bit quantization in super-blocks containing 8 blocks, each block having 32 weights. Scales and mins are quantized with 6 bits. This ends up using 4.5 bpw.
* GGML_TYPE_Q5_K - "type-1" 5-bit quantization. Same super-block structure as GGML_TYPE_Q4_K resulting in 5.5 bpw
* GGML_TYPE_Q6_K - "type-0" 6-bit quantization. Super-blocks with 16 blocks, each block having 16 weights. Scales are quantized with 8 bits. This ends up using 6.5625 bpw
</details>
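
As a sanity check, the Q4_K figure can be recomputed from the description above alone. A minimal sketch, assuming one fp16 scale and one fp16 min per super-block (the description does not state this explicitly):

```python
# Recompute the Q4_K bits-per-weight (bpw) figure from the list above.
weights = 8 * 32                    # 8 blocks x 32 weights per super-block
weight_bits = weights * 4           # 4-bit quantized weights
scale_bits = 8 * (6 + 6)            # 6-bit scale + 6-bit min per block
super_bits = 2 * 16                 # assumed fp16 scale + fp16 min per super-block
print((weight_bits + scale_bits + super_bits) / weights)  # -> 4.5 bpw
```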
## How to download GGUF files
**Note for manual downloaders:** You almost never want to clone the entire repo! Multiple different quantisation formats are provided, and most users only want to pick and download a single file.
The following clients/libraries will automatically download models for you, providing a list of available models to choose from:
* LM Studio
* LoLLMS Web UI
* Faraday.dev
### In `text-generation-webui`
Under Download Model, you can enter the model repo: [MaziyarPanahi/sqlcoder-7b-GGUF](https://huggingface.co/MaziyarPanahi/sqlcoder-7b-GGUF) and below it, a specific filename to download, such as: sqlcoder-7b-GGUF.Q4_K_M.gguf.
Then click Download.
### On the command line, including multiple files at once
I recommend using the `huggingface-hub` Python library:
```shell
pip3 install huggingface-hub
```
Then you can download any individual model file to the current directory, at high speed, with a command like this:
```shell
huggingface-cli download MaziyarPanahi/sqlcoder-7b-GGUF sqlcoder-7b-GGUF.Q4_K_M.gguf --local-dir . --local-dir-use-symlinks False
```
<details>
<summary>More advanced huggingface-cli download usage (click to read)</summary>
You can also download multiple files at once with a pattern:
```shell
huggingface-cli download [MaziyarPanahi/sqlcoder-7b-GGUF](https://huggingface.co/MaziyarPanahi/sqlcoder-7b-GGUF) --local-dir . --local-dir-use-symlinks False --include='*Q4_K*gguf'
```
For more documentation on downloading with `huggingface-cli`, please see: [HF -> Hub Python Library -> Download files -> Download from the CLI](https://huggingface.co/docs/huggingface_hub/guides/download#download-from-the-cli).
To accelerate downloads on fast connections (1Gbit/s or higher), install `hf_transfer`:
```shell
pip3 install hf_transfer
```
And set environment variable `HF_HUB_ENABLE_HF_TRANSFER` to `1`:
```shell
HF_HUB_ENABLE_HF_TRANSFER=1 huggingface-cli download MaziyarPanahi/sqlcoder-7b-GGUF sqlcoder-7b-GGUF.Q4_K_M.gguf --local-dir . --local-dir-use-symlinks False
```
Windows Command Line users: You can set the environment variable by running `set HF_HUB_ENABLE_HF_TRANSFER=1` before the download command.
</details>
## Example `llama.cpp` command
Make sure you are using `llama.cpp` from commit [d0cee0d](https://github.com/ggerganov/llama.cpp/commit/d0cee0d36d5be95a0d9088b674dbb27354107221) or later.
```shell
./main -ngl 35 -m sqlcoder-7b-GGUF.Q4_K_M.gguf --color -c 32768 --temp 0.7 --repeat_penalty 1.1 -n -1 -p "<|im_start|>system
{system_message}<|im_end|>
<|im_start|>user
{prompt}<|im_end|>
<|im_start|>assistant"
```
Change `-ngl 35` to the number of layers to offload to GPU (matching the example command above). Remove it if you don't have GPU acceleration.
Change `-c 32768` to the desired sequence length. For extended sequence models - eg 8K, 16K, 32K - the necessary RoPE scaling parameters are read from the GGUF file and set by llama.cpp automatically. Note that longer sequence lengths require much more resources, so you may need to reduce this value.
If you want to have a chat-style conversation, replace the `-p <PROMPT>` argument with `-i -ins`
For other parameters and how to use them, please refer to [the llama.cpp documentation](https://github.com/ggerganov/llama.cpp/blob/master/examples/main/README.md)
## How to run in `text-generation-webui`
Further instructions can be found in the text-generation-webui documentation, here: [text-generation-webui/docs/04 ‐ Model Tab.md](https://github.com/oobabooga/text-generation-webui/blob/main/docs/04%20%E2%80%90%20Model%20Tab.md#llamacpp).
## How to run from Python code
You can use GGUF models from Python using the [llama-cpp-python](https://github.com/abetlen/llama-cpp-python) or [ctransformers](https://github.com/marella/ctransformers) libraries. Note that at the time of writing (Nov 27th 2023), ctransformers has not been updated for some time and is not compatible with some recent models. Therefore I recommend you use llama-cpp-python.
### How to load this model in Python code, using llama-cpp-python
For full documentation, please see: [llama-cpp-python docs](https://abetlen.github.io/llama-cpp-python/).
#### First install the package
Run one of the following commands, according to your system:
```shell
# Base llama-cpp-python with no GPU acceleration
pip install llama-cpp-python
# With NVidia CUDA acceleration
CMAKE_ARGS="-DLLAMA_CUBLAS=on" pip install llama-cpp-python
# Or with OpenBLAS acceleration
CMAKE_ARGS="-DLLAMA_BLAS=ON -DLLAMA_BLAS_VENDOR=OpenBLAS" pip install llama-cpp-python
# Or with CLBLast acceleration
CMAKE_ARGS="-DLLAMA_CLBLAST=on" pip install llama-cpp-python
# Or with AMD ROCm GPU acceleration (Linux only)
CMAKE_ARGS="-DLLAMA_HIPBLAS=on" pip install llama-cpp-python
# Or with Metal GPU acceleration for macOS systems only
CMAKE_ARGS="-DLLAMA_METAL=on" pip install llama-cpp-python
# On Windows, to set the CMAKE_ARGS variable in PowerShell, follow this format; eg for NVidia CUDA:
$env:CMAKE_ARGS = "-DLLAMA_CUBLAS=on"
pip install llama-cpp-python
```
#### Simple llama-cpp-python example code
```python
from llama_cpp import Llama
# Set gpu_layers to the number of layers to offload to GPU. Set to 0 if no GPU acceleration is available on your system.
llm = Llama(
model_path="./sqlcoder-7b-GGUF.Q4_K_M.gguf", # Download the model file first
n_ctx=32768, # The max sequence length to use - note that longer sequence lengths require much more resources
n_threads=8, # The number of CPU threads to use, tailor to your system and the resulting performance
n_gpu_layers=35 # The number of layers to offload to GPU, if you have GPU acceleration available
)
# Simple inference example
output = llm(
"<|im_start|>system
{system_message}<|im_end|>
<|im_start|>user
{prompt}<|im_end|>
<|im_start|>assistant", # Prompt
max_tokens=512, # Generate up to 512 tokens
stop=["</s>"], # Example stop token - not necessarily correct for this specific model! Please check before using.
echo=True # Whether to echo the prompt
)
# Chat Completion API
llm = Llama(model_path="./sqlcoder-7b-GGUF.Q4_K_M.gguf", chat_format="llama-2") # Set chat_format according to the model you are using
llm.create_chat_completion(
messages = [
{"role": "system", "content": "You are a story writing assistant."},
{
"role": "user",
"content": "Write a story about llamas."
}
]
)
```
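
Note that the ChatML-style prompt shown above is generic template boilerplate. sqlcoder is a text-to-SQL model, so in practice you would send it a schema-plus-question prompt as a plain completion; the exact prompt format is documented on the original defog/sqlcoder-7b card. A hedged sketch reusing the `llm` object created above (the prompt layout is illustrative, not the verified sqlcoder format):

```python
# Illustrative only: check the original defog/sqlcoder-7b card for the
# verified prompt format before relying on this layout.
prompt = """### Task
Generate a SQL query to answer the following question: Which customers placed more than five orders?

### Database Schema
CREATE TABLE customers (id INT, name TEXT);
CREATE TABLE orders (id INT, customer_id INT);

### SQL
"""
output = llm(prompt, max_tokens=256, temperature=0.0)  # greedy decoding suits SQL generation
print(output["choices"][0]["text"])
```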
## How to use with LangChain
Here are guides on using llama-cpp-python and ctransformers with LangChain:
* [LangChain + llama-cpp-python](https://python.langchain.com/docs/integrations/llms/llamacpp)
* [LangChain + ctransformers](https://python.langchain.com/docs/integrations/providers/ctransformers) | {"tags": ["quantized", "2-bit", "3-bit", "4-bit", "5-bit", "6-bit", "8-bit", "GGUF", "transformers", "pytorch", "mistral", "text-generation", "code", "en", "license:cc-by-sa-4.0", "autotrain_compatible", "endpoints_compatible", "has_space", "text-generation-inference", "region:us"], "model_name": "sqlcoder-7b-GGUF", "base_model": "defog/sqlcoder-7b", "inference": false, "model_creator": "defog", "pipeline_tag": "text-generation", "quantized_by": "MaziyarPanahi"} | text-generation | MaziyarPanahi/sqlcoder-7b-GGUF | [
"transformers",
"gguf",
"mistral",
"quantized",
"2-bit",
"3-bit",
"4-bit",
"5-bit",
"6-bit",
"8-bit",
"GGUF",
"pytorch",
"text-generation",
"code",
"en",
"license:cc-by-sa-4.0",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us",
"base_model:defog/sqlcoder-7b"
] | 2024-02-13T10:13:01+00:00 | [] | [] | TAGS
#transformers #gguf #mistral #quantized #2-bit #3-bit #4-bit #5-bit #6-bit #8-bit #GGUF #pytorch #text-generation #code #en #license-cc-by-sa-4.0 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us #base_model-defog/sqlcoder-7b
| # MaziyarPanahi/sqlcoder-7b-GGUF
- Model creator: defog
- Original model: defog/sqlcoder-7b
## Description
MaziyarPanahi/sqlcoder-7b-GGUF contains GGUF format model files for defog/sqlcoder-7b.
## How to use
Thanks to TheBloke for preparing an amazing README on how to use GGUF models:
### About GGUF
GGUF is a new format introduced by the URL team on August 21st 2023. It is a replacement for GGML, which is no longer supported by URL.
Here is an incomplete list of clients and libraries that are known to support GGUF:
* URL. The source project for GGUF. Offers a CLI and a server option.
* text-generation-webui, the most widely used web UI, with many features and powerful extensions. Supports GPU acceleration.
* KoboldCpp, a fully featured web UI, with GPU accel across all platforms and GPU architectures. Especially good for story telling.
* GPT4All, a free and open source local running GUI, supporting Windows, Linux and macOS with full GPU accel.
* LM Studio, an easy-to-use and powerful local GUI for Windows and macOS (Silicon), with GPU acceleration. Linux available, in beta as of 27/11/2023.
* LoLLMS Web UI, a great web UI with many interesting and unique features, including a full model library for easy model selection.
* URL, an attractive and easy to use character-based chat GUI for Windows and macOS (both Silicon and Intel), with GPU acceleration.
* llama-cpp-python, a Python library with GPU accel, LangChain support, and OpenAI-compatible API server.
* candle, a Rust ML framework with a focus on performance, including GPU support, and ease of use.
* ctransformers, a Python library with GPU accel, LangChain support, and OpenAI-compatible AI server. Note, as of time of writing (November 27th 2023), ctransformers has not been updated in a long time and does not support many recent models.
### Explanation of quantisation methods
<details>
<summary>Click to see details</summary>
The new methods available are:
* GGML_TYPE_Q2_K - "type-1" 2-bit quantization in super-blocks containing 16 blocks, each block having 16 weights. Block scales and mins are quantized with 4 bits. This ends up effectively using 2.5625 bits per weight (bpw)
* GGML_TYPE_Q3_K - "type-0" 3-bit quantization in super-blocks containing 16 blocks, each block having 16 weights. Scales are quantized with 6 bits. This ends up using 3.4375 bpw.
* GGML_TYPE_Q4_K - "type-1" 4-bit quantization in super-blocks containing 8 blocks, each block having 32 weights. Scales and mins are quantized with 6 bits. This ends up using 4.5 bpw.
* GGML_TYPE_Q5_K - "type-1" 5-bit quantization. Same super-block structure as GGML_TYPE_Q4_K resulting in 5.5 bpw
* GGML_TYPE_Q6_K - "type-0" 6-bit quantization. Super-blocks with 16 blocks, each block having 16 weights. Scales are quantized with 8 bits. This ends up using 6.5625 bpw
## How to download GGUF files
Note for manual downloaders: You almost never want to clone the entire repo! Multiple different quantisation formats are provided, and most users only want to pick and download a single file.
The following clients/libraries will automatically download models for you, providing a list of available models to choose from:
* LM Studio
* LoLLMS Web UI
* URL
### In 'text-generation-webui'
Under Download Model, you can enter the model repo: MaziyarPanahi/sqlcoder-7b-GGUF and below it, a specific filename to download, such as: sqlcoder-7b-GGUF.Q4_K_M.gguf.
Then click Download.
### On the command line, including multiple files at once
I recommend using the 'huggingface-hub' Python library:
Then you can download any individual model file to the current directory, at high speed, with a command like this:
</details>
<details>
<summary>More advanced huggingface-cli download usage (click to read)</summary>
You can also download multiple files at once with a pattern:
For more documentation on downloading with 'huggingface-cli', please see: HF -> Hub Python Library -> Download files -> Download from the CLI.
To accelerate downloads on fast connections (1Gbit/s or higher), install 'hf_transfer':
And set environment variable 'HF_HUB_ENABLE_HF_TRANSFER' to '1':
Windows Command Line users: You can set the environment variable by running 'set HF_HUB_ENABLE_HF_TRANSFER=1' before the download command.
</details>
## Example 'URL' command
Make sure you are using 'URL' from commit d0cee0d or later.
Change '-ngl 35' to the number of layers to offload to GPU. Remove it if you don't have GPU acceleration.
Change '-c 32768' to the desired sequence length. For extended sequence models - eg 8K, 16K, 32K - the necessary RoPE scaling parameters are read from the GGUF file and set by URL automatically. Note that longer sequence lengths require much more resources, so you may need to reduce this value.
If you want to have a chat-style conversation, replace the '-p <PROMPT>' argument with '-i -ins'
For other parameters and how to use them, please refer to the URL documentation
## How to run in 'text-generation-webui'
Further instructions can be found in the text-generation-webui documentation, here: text-generation-webui/docs/04 ‐ Model URL.
## How to run from Python code
You can use GGUF models from Python using the llama-cpp-python or ctransformers libraries. Note that at the time of writing (Nov 27th 2023), ctransformers has not been updated for some time and is not compatible with some recent models. Therefore I recommend you use llama-cpp-python.
### How to load this model in Python code, using llama-cpp-python
For full documentation, please see: llama-cpp-python docs.
#### First install the package
Run one of the following commands, according to your system:
#### Simple llama-cpp-python example code
## How to use with LangChain
Here are guides on using llama-cpp-python and ctransformers with LangChain:
* LangChain + llama-cpp-python
* LangChain + ctransformers | [
"# MaziyarPanahi/sqlcoder-7b-GGUF\n- Model creator: defog\n- Original model: defog/sqlcoder-7b",
"## Description\nMaziyarPanahi/sqlcoder-7b-GGUF contains GGUF format model files for defog/sqlcoder-7b.",
"## How to use\nThanks to TheBloke for preparing an amazing README on how to use GGUF models:",
"### About GGUF\n\nGGUF is a new format introduced by the URL team on August 21st 2023. It is a replacement for GGML, which is no longer supported by URL.\n\nHere is an incomplete list of clients and libraries that are known to support GGUF:\n\n* URL. The source project for GGUF. Offers a CLI and a server option.\n* text-generation-webui, the most widely used web UI, with many features and powerful extensions. Supports GPU acceleration.\n* KoboldCpp, a fully featured web UI, with GPU accel across all platforms and GPU architectures. Especially good for story telling.\n* GPT4All, a free and open source local running GUI, supporting Windows, Linux and macOS with full GPU accel.\n* LM Studio, an easy-to-use and powerful local GUI for Windows and macOS (Silicon), with GPU acceleration. Linux available, in beta as of 27/11/2023.\n* LoLLMS Web UI, a great web UI with many interesting and unique features, including a full model library for easy model selection.\n* URL, an attractive and easy to use character-based chat GUI for Windows and macOS (both Silicon and Intel), with GPU acceleration.\n* llama-cpp-python, a Python library with GPU accel, LangChain support, and OpenAI-compatible API server.\n* candle, a Rust ML framework with a focus on performance, including GPU support, and ease of use.\n* ctransformers, a Python library with GPU accel, LangChain support, and OpenAI-compatible AI server. Note, as of time of writing (November 27th 2023), ctransformers has not been updated in a long time and does not support many recent models.",
"### Explanation of quantisation methods\n\n<details>\n <summary>Click to see details</summary>\n\nThe new methods available are:\n\n* GGML_TYPE_Q2_K - \"type-1\" 2-bit quantization in super-blocks containing 16 blocks, each block having 16 weight. Block scales and mins are quantized with 4 bits. This ends up effectively using 2.5625 bits per weight (bpw)\n* GGML_TYPE_Q3_K - \"type-0\" 3-bit quantization in super-blocks containing 16 blocks, each block having 16 weights. Scales are quantized with 6 bits. This end up using 3.4375 bpw.\n* GGML_TYPE_Q4_K - \"type-1\" 4-bit quantization in super-blocks containing 8 blocks, each block having 32 weights. Scales and mins are quantized with 6 bits. This ends up using 4.5 bpw.\n* GGML_TYPE_Q5_K - \"type-1\" 5-bit quantization. Same super-block structure as GGML_TYPE_Q4_K resulting in 5.5 bpw\n* GGML_TYPE_Q6_K - \"type-0\" 6-bit quantization. Super-blocks with 16 blocks, each block having 16 weights. Scales are quantized with 8 bits. This ends up using 6.5625 bpw",
"## How to download GGUF files\n\nNote for manual downloaders: You almost never want to clone the entire repo! Multiple different quantisation formats are provided, and most users only want to pick and download a single file.\n\nThe following clients/libraries will automatically download models for you, providing a list of available models to choose from:\n\n* LM Studio\n* LoLLMS Web UI\n* URL",
"### In 'text-generation-webui'\n\nUnder Download Model, you can enter the model repo: MaziyarPanahi/sqlcoder-7b-GGUF and below it, a specific filename to download, such as: sqlcoder-7b-GGUF.Q4_K_M.gguf.\n\nThen click Download.",
"### On the command line, including multiple files at once\n\nI recommend using the 'huggingface-hub' Python library:\n\n\n\nThen you can download any individual model file to the current directory, at high speed, with a command like this:\n\n\n</details>\n<details>\n <summary>More advanced huggingface-cli download usage (click to read)</summary>\n\nYou can also download multiple files at once with a pattern:\n\n\n\nFor more documentation on downloading with 'huggingface-cli', please see: HF -> Hub Python Library -> Download files -> Download from the CLI.\n\nTo accelerate downloads on fast connections (1Gbit/s or higher), install 'hf_transfer':\n\n\n\nAnd set environment variable 'HF_HUB_ENABLE_HF_TRANSFER' to '1':\n\n\n\nWindows Command Line users: You can set the environment variable by running 'set HF_HUB_ENABLE_HF_TRANSFER=1' before the download command.\n</details>",
"## Example 'URL' command\n\nMake sure you are using 'URL' from commit d0cee0d or later.\n\n\n\nChange '-ngl 32' to the number of layers to offload to GPU. Remove it if you don't have GPU acceleration.\n\nChange '-c 32768' to the desired sequence length. For extended sequence models - eg 8K, 16K, 32K - the necessary RoPE scaling parameters are read from the GGUF file and set by URL automatically. Note that longer sequence lengths require much more resources, so you may need to reduce this value.\n\nIf you want to have a chat-style conversation, replace the '-p <PROMPT>' argument with '-i -ins'\n\nFor other parameters and how to use them, please refer to the URL documentation",
"## How to run in 'text-generation-webui'\n\nFurther instructions can be found in the text-generation-webui documentation, here: text-generation-webui/docs/04 ‐ Model URL.",
"## How to run from Python code\n\nYou can use GGUF models from Python using the llama-cpp-python or ctransformers libraries. Note that at the time of writing (Nov 27th 2023), ctransformers has not been updated for some time and is not compatible with some recent models. Therefore I recommend you use llama-cpp-python.",
"### How to load this model in Python code, using llama-cpp-python\n\nFor full documentation, please see: llama-cpp-python docs.",
"#### First install the package\n\nRun one of the following commands, according to your system:",
"#### Simple llama-cpp-python example code",
"## How to use with LangChain\n\nHere are guides on using llama-cpp-python and ctransformers with LangChain:\n\n* LangChain + llama-cpp-python\n* LangChain + ctransformers"
] | [
"TAGS\n#transformers #gguf #mistral #quantized #2-bit #3-bit #4-bit #5-bit #6-bit #8-bit #GGUF #pytorch #text-generation #code #en #license-cc-by-sa-4.0 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us #base_model-defog/sqlcoder-7b \n",
"# MaziyarPanahi/sqlcoder-7b-GGUF\n- Model creator: defog\n- Original model: defog/sqlcoder-7b",
"## Description\nMaziyarPanahi/sqlcoder-7b-GGUF contains GGUF format model files for defog/sqlcoder-7b.",
"## How to use\nThanks to TheBloke for preparing an amazing README on how to use GGUF models:",
"### About GGUF\n\nGGUF is a new format introduced by the URL team on August 21st 2023. It is a replacement for GGML, which is no longer supported by URL.\n\nHere is an incomplete list of clients and libraries that are known to support GGUF:\n\n* URL. The source project for GGUF. Offers a CLI and a server option.\n* text-generation-webui, the most widely used web UI, with many features and powerful extensions. Supports GPU acceleration.\n* KoboldCpp, a fully featured web UI, with GPU accel across all platforms and GPU architectures. Especially good for story telling.\n* GPT4All, a free and open source local running GUI, supporting Windows, Linux and macOS with full GPU accel.\n* LM Studio, an easy-to-use and powerful local GUI for Windows and macOS (Silicon), with GPU acceleration. Linux available, in beta as of 27/11/2023.\n* LoLLMS Web UI, a great web UI with many interesting and unique features, including a full model library for easy model selection.\n* URL, an attractive and easy to use character-based chat GUI for Windows and macOS (both Silicon and Intel), with GPU acceleration.\n* llama-cpp-python, a Python library with GPU accel, LangChain support, and OpenAI-compatible API server.\n* candle, a Rust ML framework with a focus on performance, including GPU support, and ease of use.\n* ctransformers, a Python library with GPU accel, LangChain support, and OpenAI-compatible AI server. Note, as of time of writing (November 27th 2023), ctransformers has not been updated in a long time and does not support many recent models.",
"### Explanation of quantisation methods\n\n<details>\n <summary>Click to see details</summary>\n\nThe new methods available are:\n\n* GGML_TYPE_Q2_K - \"type-1\" 2-bit quantization in super-blocks containing 16 blocks, each block having 16 weight. Block scales and mins are quantized with 4 bits. This ends up effectively using 2.5625 bits per weight (bpw)\n* GGML_TYPE_Q3_K - \"type-0\" 3-bit quantization in super-blocks containing 16 blocks, each block having 16 weights. Scales are quantized with 6 bits. This end up using 3.4375 bpw.\n* GGML_TYPE_Q4_K - \"type-1\" 4-bit quantization in super-blocks containing 8 blocks, each block having 32 weights. Scales and mins are quantized with 6 bits. This ends up using 4.5 bpw.\n* GGML_TYPE_Q5_K - \"type-1\" 5-bit quantization. Same super-block structure as GGML_TYPE_Q4_K resulting in 5.5 bpw\n* GGML_TYPE_Q6_K - \"type-0\" 6-bit quantization. Super-blocks with 16 blocks, each block having 16 weights. Scales are quantized with 8 bits. This ends up using 6.5625 bpw",
"## How to download GGUF files\n\nNote for manual downloaders: You almost never want to clone the entire repo! Multiple different quantisation formats are provided, and most users only want to pick and download a single file.\n\nThe following clients/libraries will automatically download models for you, providing a list of available models to choose from:\n\n* LM Studio\n* LoLLMS Web UI\n* URL",
"### In 'text-generation-webui'\n\nUnder Download Model, you can enter the model repo: MaziyarPanahi/sqlcoder-7b-GGUF and below it, a specific filename to download, such as: sqlcoder-7b-GGUF.Q4_K_M.gguf.\n\nThen click Download.",
"### On the command line, including multiple files at once\n\nI recommend using the 'huggingface-hub' Python library:\n\n\n\nThen you can download any individual model file to the current directory, at high speed, with a command like this:\n\n\n</details>\n<details>\n <summary>More advanced huggingface-cli download usage (click to read)</summary>\n\nYou can also download multiple files at once with a pattern:\n\n\n\nFor more documentation on downloading with 'huggingface-cli', please see: HF -> Hub Python Library -> Download files -> Download from the CLI.\n\nTo accelerate downloads on fast connections (1Gbit/s or higher), install 'hf_transfer':\n\n\n\nAnd set environment variable 'HF_HUB_ENABLE_HF_TRANSFER' to '1':\n\n\n\nWindows Command Line users: You can set the environment variable by running 'set HF_HUB_ENABLE_HF_TRANSFER=1' before the download command.\n</details>",
"## Example 'URL' command\n\nMake sure you are using 'URL' from commit d0cee0d or later.\n\n\n\nChange '-ngl 32' to the number of layers to offload to GPU. Remove it if you don't have GPU acceleration.\n\nChange '-c 32768' to the desired sequence length. For extended sequence models - eg 8K, 16K, 32K - the necessary RoPE scaling parameters are read from the GGUF file and set by URL automatically. Note that longer sequence lengths require much more resources, so you may need to reduce this value.\n\nIf you want to have a chat-style conversation, replace the '-p <PROMPT>' argument with '-i -ins'\n\nFor other parameters and how to use them, please refer to the URL documentation",
"## How to run in 'text-generation-webui'\n\nFurther instructions can be found in the text-generation-webui documentation, here: text-generation-webui/docs/04 ‐ Model URL.",
"## How to run from Python code\n\nYou can use GGUF models from Python using the llama-cpp-python or ctransformers libraries. Note that at the time of writing (Nov 27th 2023), ctransformers has not been updated for some time and is not compatible with some recent models. Therefore I recommend you use llama-cpp-python.",
"### How to load this model in Python code, using llama-cpp-python\n\nFor full documentation, please see: llama-cpp-python docs.",
"#### First install the package\n\nRun one of the following commands, according to your system:",
"#### Simple llama-cpp-python example code",
"## How to use with LangChain\n\nHere are guides on using llama-cpp-python and ctransformers with LangChain:\n\n* LangChain + llama-cpp-python\n* LangChain + ctransformers"
] | [
104,
34,
35,
26,
401,
323,
84,
75,
218,
182,
49,
77,
36,
19,
12,
50
] | [
"passage: TAGS\n#transformers #gguf #mistral #quantized #2-bit #3-bit #4-bit #5-bit #6-bit #8-bit #GGUF #pytorch #text-generation #code #en #license-cc-by-sa-4.0 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us #base_model-defog/sqlcoder-7b \n# MaziyarPanahi/sqlcoder-7b-GGUF\n- Model creator: defog\n- Original model: defog/sqlcoder-7b## Description\nMaziyarPanahi/sqlcoder-7b-GGUF contains GGUF format model files for defog/sqlcoder-7b.## How to use\nThanks to TheBloke for preparing an amazing README on how to use GGUF models:",
"passage: ### About GGUF\n\nGGUF is a new format introduced by the URL team on August 21st 2023. It is a replacement for GGML, which is no longer supported by URL.\n\nHere is an incomplete list of clients and libraries that are known to support GGUF:\n\n* URL. The source project for GGUF. Offers a CLI and a server option.\n* text-generation-webui, the most widely used web UI, with many features and powerful extensions. Supports GPU acceleration.\n* KoboldCpp, a fully featured web UI, with GPU accel across all platforms and GPU architectures. Especially good for story telling.\n* GPT4All, a free and open source local running GUI, supporting Windows, Linux and macOS with full GPU accel.\n* LM Studio, an easy-to-use and powerful local GUI for Windows and macOS (Silicon), with GPU acceleration. Linux available, in beta as of 27/11/2023.\n* LoLLMS Web UI, a great web UI with many interesting and unique features, including a full model library for easy model selection.\n* URL, an attractive and easy to use character-based chat GUI for Windows and macOS (both Silicon and Intel), with GPU acceleration.\n* llama-cpp-python, a Python library with GPU accel, LangChain support, and OpenAI-compatible API server.\n* candle, a Rust ML framework with a focus on performance, including GPU support, and ease of use.\n* ctransformers, a Python library with GPU accel, LangChain support, and OpenAI-compatible AI server. Note, as of time of writing (November 27th 2023), ctransformers has not been updated in a long time and does not support many recent models.### Explanation of quantisation methods\n\n<details>\n <summary>Click to see details</summary>\n\nThe new methods available are:\n\n* GGML_TYPE_Q2_K - \"type-1\" 2-bit quantization in super-blocks containing 16 blocks, each block having 16 weight. Block scales and mins are quantized with 4 bits. This ends up effectively using 2.5625 bits per weight (bpw)\n* GGML_TYPE_Q3_K - \"type-0\" 3-bit quantization in super-blocks containing 16 blocks, each block having 16 weights. Scales are quantized with 6 bits. This end up using 3.4375 bpw.\n* GGML_TYPE_Q4_K - \"type-1\" 4-bit quantization in super-blocks containing 8 blocks, each block having 32 weights. Scales and mins are quantized with 6 bits. This ends up using 4.5 bpw.\n* GGML_TYPE_Q5_K - \"type-1\" 5-bit quantization. Same super-block structure as GGML_TYPE_Q4_K resulting in 5.5 bpw\n* GGML_TYPE_Q6_K - \"type-0\" 6-bit quantization. Super-blocks with 16 blocks, each block having 16 weights. Scales are quantized with 8 bits. This ends up using 6.5625 bpw## How to download GGUF files\n\nNote for manual downloaders: You almost never want to clone the entire repo! Multiple different quantisation formats are provided, and most users only want to pick and download a single file.\n\nThe following clients/libraries will automatically download models for you, providing a list of available models to choose from:\n\n* LM Studio\n* LoLLMS Web UI\n* URL### In 'text-generation-webui'\n\nUnder Download Model, you can enter the model repo: MaziyarPanahi/sqlcoder-7b-GGUF and below it, a specific filename to download, such as: sqlcoder-7b-GGUF.Q4_K_M.gguf.\n\nThen click Download."
] | [
-0.08280113339424133,
0.15520961582660675,
-0.0028725960291922092,
0.06999900937080383,
0.10086846351623535,
0.05033065378665924,
0.015932010486721992,
0.09922252595424652,
0.05204183608293533,
0.0547698475420475,
0.0845315158367157,
0.04415379837155342,
0.057042136788368225,
0.1704007387161255,
0.09437622129917145,
-0.20744894444942474,
0.033255621790885925,
-0.0071572307497262955,
-0.01573844440281391,
0.038280535489320755,
0.055041439831256866,
-0.02439494989812374,
0.08348642289638519,
-0.012457894161343575,
-0.067390076816082,
-0.05391896516084671,
-0.04298120737075806,
-0.002481263130903244,
0.051185257732868195,
0.06534560769796371,
-0.07333163917064667,
-0.03599422425031662,
-0.004599135369062424,
-0.12284886091947556,
0.02443622797727585,
0.04302065074443817,
-0.032672710716724396,
0.032930128276348114,
-0.012205366976559162,
0.04709886759519577,
0.16349858045578003,
-0.07886078208684921,
-0.031241629272699356,
0.046620797365903854,
-0.049928054213523865,
-0.14956039190292358,
-0.11453741043806076,
0.013135358691215515,
0.010595576837658882,
0.05135045200586319,
0.01756400614976883,
-0.01186177134513855,
-0.012359770946204662,
0.03361160308122635,
0.17223574221134186,
-0.2346126288175583,
-0.049708545207977295,
0.1321033090353012,
0.05650966614484787,
0.06051218509674072,
-0.08770698308944702,
0.04971037805080414,
0.008766223676502705,
0.02537713572382927,
0.03641428425908089,
-0.04115799441933632,
0.12279711663722992,
0.006731437519192696,
-0.10816926509141922,
-0.02804299257695675,
0.10023177415132523,
-0.0253013726323843,
-0.05077395588159561,
-0.0694786086678505,
-0.03462517261505127,
-0.04048553481698036,
-0.04242425784468651,
0.04137026518583298,
0.022109363228082657,
0.042454712092876434,
0.05089884623885155,
-0.11922498047351837,
-0.03970879316329956,
-0.06068328022956848,
-0.04400191456079483,
0.19974368810653687,
0.013258753344416618,
0.05858413502573967,
0.01990114524960518,
0.11058638989925385,
-0.1570630669593811,
-0.04244750738143921,
-0.10677685588598251,
-0.0010580569505691528,
-0.022630203515291214,
0.05347087234258652,
-0.011176642030477524,
0.06381531804800034,
0.07502281665802002,
0.11452366411685944,
-0.09404776990413666,
0.07875420153141022,
0.08106176555156708,
-0.0022834702394902706,
-0.034102875739336014,
0.12418697774410248,
-0.0709967240691185,
-0.09595285356044769,
0.07797896862030029,
0.009713994339108467,
0.0804469883441925,
-0.03730849176645279,
-0.08408689498901367,
-0.006455376744270325,
-0.03130004554986954,
0.03048839047551155,
0.017381738871335983,
0.04646056517958641,
-0.018580542877316475,
-0.03856221213936806,
0.21265491843223572,
-0.07477501034736633,
0.04469728469848633,
0.00791005976498127,
-0.023891249671578407,
-0.018412794917821884,
0.009667657315731049,
-0.026069318875670433,
-0.04551360756158829,
-0.023370493203401566,
-0.09703505784273148,
-0.027750428766012192,
-0.06918604671955109,
-0.029317811131477356,
0.04181854426860809,
-0.07084667682647705,
-0.018567238003015518,
-0.07199457287788391,
-0.20257985591888428,
0.02301155962049961,
0.0378943495452404,
-0.03540801256895065,
-0.016626743599772453,
-0.003514351323246956,
-0.03824241831898689,
0.03231716901063919,
0.017666222527623177,
0.08163370192050934,
-0.045701250433921814,
0.04718519747257233,
0.030603326857089996,
0.04593855142593384,
-0.16668570041656494,
0.009661389514803886,
-0.03421428054571152,
0.05915790796279907,
-0.06368168443441391,
0.11502404510974884,
-0.09926120936870575,
0.02336706593632698,
-0.04842238873243332,
-0.014498072676360607,
-0.02041342295706272,
-0.03524623066186905,
0.045697156339883804,
0.07045403867959976,
-0.1008150577545166,
-0.056650277227163315,
0.11603888869285583,
-0.12088585644960403,
-0.03693599998950958,
0.12397678941488266,
0.01434637513011694,
-0.029041040688753128,
0.07715969532728195,
0.08316320925951004,
0.18358393013477325,
-0.04239572957158089,
-0.0792384147644043,
0.04895716905593872,
0.015358490869402885,
-0.016485482454299927,
0.07923717796802521,
0.0019429493695497513,
-0.04863860458135605,
0.07385241985321045,
-0.10769668221473694,
0.06565891206264496,
0.01500073540955782,
-0.05031563714146614,
-0.032944973558187485,
-0.07534757256507874,
0.05593249201774597,
-0.00851275771856308,
-0.019894856959581375,
-0.0038203708827495575,
-0.07895101606845856,
-0.07721076905727386,
0.13533711433410645,
-0.03719719499349594,
0.015546940267086029,
-0.07630167156457901,
0.15317869186401367,
-0.07119311392307281,
0.05138334631919861,
-0.042171649634838104,
-0.07139726728200912,
0.05036613345146179,
-0.07700890302658081,
0.026509784162044525,
-0.08425962924957275,
0.050839297473430634,
0.06243910640478134,
-0.032890431582927704,
0.03869562968611717,
-0.012736189179122448,
-0.024982241913676262,
-0.06142040714621544,
-0.03413558751344681,
-0.008110443130135536,
-0.032482586801052094,
0.1379653811454773,
-0.0716007798910141,
0.01466575637459755,
0.12160694599151611,
0.0237123966217041,
-0.0005319789052009583,
-0.09510952979326248,
0.03803981468081474,
-0.009135784581303596,
0.02474219910800457,
-0.050754643976688385,
0.027813665568828583,
0.04220735281705856,
-0.07880425453186035,
0.0526173859834671,
-0.102195605635643,
0.01857220008969307,
0.09441075474023819,
0.15649938583374023,
0.03230803832411766,
-0.05575525015592575,
0.008270745165646076,
-0.017263950780034065,
0.028665464371442795,
-0.03670080378651619,
0.15643879771232605,
-0.025431960821151733,
0.06373439729213715,
-0.055049821734428406,
-0.0056017301976680756,
0.02836962789297104,
0.019975118339061737,
-0.02581942453980446,
0.05408535152673721,
0.08015993237495422,
-0.05611059442162514,
0.06000002101063728,
0.028855565935373306,
-0.026800082996487617,
0.16752156615257263,
0.013629079796373844,
-0.03981754928827286,
-0.042562320828437805,
-0.0034630869049578905,
0.010907399468123913,
0.13331720232963562,
-0.14281603693962097,
-0.010445314459502697,
0.012773021124303341,
0.012769658118486404,
0.08370514214038849,
-0.12844541668891907,
0.018161339685320854,
-0.040257617831230164,
-0.09253214299678802,
0.029252290725708008,
0.02202896773815155,
-0.09484076499938965,
0.034182097762823105,
0.06596167385578156,
0.05634211748838425,
0.035480741411447525,
0.012702731415629387,
-0.08411428332328796,
0.15043655037879944,
-0.1328330934047699,
-0.20365998148918152,
-0.1348373293876648,
-0.023550864309072495,
-0.0614897683262825,
0.002005810383707285,
0.020422033965587616,
-0.05929876118898392,
-0.03571855649352074,
-0.05625131353735924,
-0.01603136956691742,
-0.025278108194470406,
0.008161434903740883,
0.0746627002954483,
-0.07462511956691742,
-0.0140679981559515,
-0.11023324728012085,
-0.002599285915493965,
0.008985047228634357,
-0.08298714458942413,
0.03495984524488449,
-0.0063141100108623505,
0.09545718878507614,
0.08169996738433838,
0.028102509677410126,
0.01600949838757515,
-0.012125193141400814,
0.2127770483493805,
-0.07420143485069275,
0.07450887560844421,
0.13495314121246338,
0.07104288786649704,
0.0637398287653923,
-0.01337740384042263,
0.012097694911062717,
-0.06742140650749207,
-0.01928148791193962,
0.014852717518806458,
-0.11109693348407745,
-0.0888793021440506,
-0.06369292736053467,
-0.0830676332116127,
0.05193178355693817,
-0.0005633607506752014,
0.08576817810535431,
-0.025770755484700203,
0.08704708516597748,
-0.0064699891954660416,
0.04012270271778107,
0.004724334925413132,
0.04882921278476715,
0.11134069412946701,
-0.0012099826708436012,
0.03440623730421066,
-0.07342573255300522,
0.06355162709951401,
0.11862964928150177,
0.1207817941904068,
0.1217270940542221,
-0.08455201238393784,
0.15249818563461304,
0.01676950976252556,
0.061004653573036194,
0.017276430502533913,
0.006421853322535753,
-0.07554234564304352,
0.006304004229605198,
-0.03204915672540665,
-0.05531451851129532,
-0.07373106479644775,
0.04666409641504288,
0.0009422395378351212,
-0.049093738198280334,
-0.004095360636711121,
0.04832093045115471,
0.0560833178460598,
0.08704154938459396,
0.015165033750236034,
-0.1628398299217224,
-0.11593359708786011,
0.030698131769895554,
-0.0056859590113162994,
-0.050272680819034576,
0.01571386307477951,
0.0838727056980133,
-0.05814023315906525,
0.05224809795618057,
-0.03368200361728668,
0.03808395937085152,
-0.08112843334674835,
-0.01241255085915327,
0.07035227864980698,
0.1612061858177185,
0.021768881008028984,
0.07076560705900192,
-0.18859833478927612,
0.001105019822716713,
0.029567021876573563,
0.055397115647792816,
-0.06632420420646667,
0.03936052322387695,
0.07788775861263275,
0.007424760609865189,
0.048179954290390015,
0.03460579365491867,
0.0372963547706604,
-0.03140591084957123,
-0.09876731038093567,
0.07213737070560455,
0.044274479150772095,
-0.04467098414897919,
0.060853056609630585,
-0.021447962149977684,
0.014462183229625225,
-0.03397070989012718,
-0.009846221655607224,
-0.030213791877031326,
-0.18776170909404755,
0.11826521158218384,
0.04772402346134186,
-0.012459490448236465,
-0.08743101358413696,
-0.02997138909995556,
-0.05729781836271286,
0.15290123224258423,
-0.08718347549438477,
-0.09928658604621887,
-0.09107774496078491,
-0.017048310488462448,
0.10994323343038559,
-0.08965612947940826,
0.04719928279519081,
-0.041375234723091125,
0.04773066192865372,
-0.035687584429979324,
-0.1035173162817955,
0.03412396460771561,
-0.06796452403068542,
-0.11586589366197586,
-0.003771737217903137,
0.09479114413261414,
0.031175632029771805,
0.024960320442914963,
-0.009932518936693668,
0.012820949777960777,
-0.02045055851340294,
-0.1548328697681427,
0.03776930272579193,
0.14305628836154938,
-0.08604160696268082,
0.05406366288661957,
0.00041123852133750916,
0.03783619403839111,
0.00515018729493022,
-0.028984149917960167,
0.07219648361206055,
0.1353810578584671,
-0.06179403141140938,
0.12384182214736938,
0.12632903456687927,
-0.0753018856048584,
-0.2255742847919464,
-0.049074940383434296,
-0.0019664261490106583,
0.011333389207720757,
-0.11390349268913269,
-0.23344555497169495,
0.07020124793052673,
0.07736990600824356,
-0.033708229660987854,
0.25982290506362915,
-0.2527916729450226,
-0.058295413851737976,
-0.04306734353303909,
0.051313210278749466,
0.18636558949947357,
-0.16040858626365662,
-0.06434564292430878,
0.006149805150926113,
-0.1720902919769287,
0.07367746531963348,
-0.04900912195444107,
0.13412044942378998,
-0.035883814096450806,
0.0670040100812912,
-0.020400600507855415,
-0.05427132546901703,
0.1555148959159851,
-0.05000811815261841,
-0.00572447944432497,
-0.057739030569791794,
0.009112442843616009,
0.03794300928711891,
-0.04307730123400688,
0.09364429116249084,
-0.11861155182123184,
0.033486299216747284,
-0.0789235532283783,
-0.03398386761546135,
-0.07862585037946701,
0.015661824494600296,
0.0015863198786973953,
-0.04345593601465225,
-0.09893380105495453,
0.05637266859412193,
-0.0037608472630381584,
0.019179126247763634,
-0.044217146933078766,
0.0036414694041013718,
-0.006961464881896973,
0.08667206019163132,
0.04937121272087097,
-0.15905000269412994,
-0.0658300593495369,
-0.03339249640703201,
-0.02406695857644081,
0.06559376418590546,
-0.11498153209686279,
0.02084587886929512,
0.07384410500526428,
0.023788873106241226,
0.05193270370364189,
0.0283169187605381,
-0.10874149203300476,
0.07336950302124023,
0.06853324919939041,
-0.10957030951976776,
-0.17311881482601166,
-0.028954749926924706,
-0.046782441437244415,
-0.048409946262836456,
0.06859225034713745,
0.14176452159881592,
0.010870548896491528,
-0.02218061313033104,
-0.019019819796085358,
0.06824858486652374,
-0.03126773238182068,
0.11926337331533432,
0.034239478409290314,
0.0010767430067062378,
-0.11186917871236801,
0.057032786309719086,
0.01765374280512333,
-0.014473918825387955,
0.027018219232559204,
0.1503625512123108,
-0.10607096552848816,
-0.0701192319393158,
-0.17354178428649902,
-0.047734059393405914,
-0.06823211908340454,
-0.037626106292009354,
-0.04202431067824364,
-0.04105404391884804,
0.0429549366235733,
0.04594019055366516,
0.03304654359817505,
0.056210536509752274,
-0.02407868579030037,
0.0655793845653534,
-0.04340492561459541,
0.04627923667430878,
-0.058549027889966965,
0.04566812142729759,
-0.11644680798053741,
0.017236515879631042,
0.0011824313551187515,
0.07613051682710648,
-0.02520662546157837,
-0.0218246728181839,
-0.07624770700931549,
-0.036946460604667664,
-0.1238364726305008,
0.025418249890208244,
-0.1159919872879982,
0.022348366677761078,
-0.011502888053655624,
0.0025289461482316256,
-0.018567364662885666,
0.06931851804256439,
-0.03385872393846512,
-0.04409734904766083,
-0.055908530950546265,
-0.0012400280684232712,
-0.05681510642170906,
0.01449868269264698,
0.051187798380851746,
-0.023287564516067505,
0.13287390768527985,
0.0019506178796291351,
0.015383623540401459,
0.030762072652578354,
-0.12886834144592285,
0.02005138248205185,
0.014920671470463276,
-0.013055187650024891,
-0.020960746333003044,
-0.07730869948863983,
0.046123962849378586,
-0.0028282981365919113,
0.003821682184934616,
0.015837915241718292,
0.15303993225097656,
-0.07660424709320068,
-0.00494920089840889,
-0.06726954877376556,
0.004791134037077427,
-0.02249564230442047,
0.046324267983436584,
0.10467928647994995,
0.02654878981411457,
0.05027519911527634,
-0.04124858230352402,
-0.022975342348217964,
-0.09294729679822922,
-0.010105744004249573,
-0.004411583766341209,
-0.040366560220718384,
-0.03494611009955406,
-0.014063325710594654,
0.03971225023269653,
0.01654670014977455,
0.18346334993839264,
-0.020296771079301834,
-0.06730763614177704,
-0.024883801117539406,
-0.025995230302214622,
0.08885523676872253,
-0.0035264338366687298,
0.11921298503875732,
0.04887659102678299,
-0.01905018463730812,
-0.003482216037809849,
0.06305360049009323,
0.02904864028096199,
0.00755017064511776,
0.04884206876158714,
-0.025505226105451584,
0.07391580939292908,
0.09999310970306396,
-0.002950505120679736,
-0.08061940968036652,
-0.10927044600248337,
0.05307722091674805,
-0.09789924323558807,
0.05832434445619583,
-0.06913870573043823,
0.0672433078289032,
0.12922582030296326,
-0.10352972149848938,
0.05679390951991081,
0.04286551475524902,
-0.06306274980306625,
-0.05075666680932045,
-0.14221623539924622,
-0.04314758628606796,
-0.1015593409538269,
-0.0032206415198743343,
-0.07996521890163422,
0.01792176626622677,
0.058752819895744324,
0.0038587814196944237,
-0.0030458744149655104,
0.12437696009874344,
0.004046998918056488,
-0.040821902453899384,
0.04461543262004852,
0.010584634728729725,
-0.02890261635184288,
0.10870902240276337,
-0.042405836284160614,
0.02860265225172043,
-0.030699893832206726,
0.09693443775177002,
0.026361877098679543,
0.019877824932336807,
0.06228812038898468,
-0.009971734136343002,
0.0013741664588451385,
-0.025202684104442596,
0.04972043260931969,
0.02011437714099884,
0.15581448376178741,
0.015039866790175438,
-0.0702289417386055,
0.02161114104092121,
0.0878114104270935,
-0.027106083929538727,
-0.01000828854739666,
-0.0993477925658226,
0.10852982848882675,
-0.054231464862823486,
0.00881250575184822,
-0.02142918109893799,
-0.06753261387348175,
0.01858387514948845,
0.18206819891929626,
0.16294187307357788,
-0.07838563621044159,
-0.012079929932951927,
0.028844617307186127,
-0.006946387700736523,
0.012163389474153519,
0.13505928218364716,
0.07094122469425201,
0.27675825357437134,
-0.013182219117879868,
-0.019916091114282608,
-0.02687494456768036,
-0.015023976564407349,
-0.10253850370645523,
0.017889944836497307,
-0.06465381383895874,
0.04252142459154129,
-0.05874260514974594,
0.00030753016471862793,
-0.04221228510141373,
-0.10146787762641907,
0.004901244305074215,
-0.0774879902601242,
-0.06253936886787415,
0.007562918122857809,
-0.06927654147148132,
0.025520019233226776,
0.039014510810375214,
0.02522283047437668,
0.03719539940357208,
0.07104243338108063,
0.008890902623534203,
-0.1539585441350937,
-0.02419726364314556,
0.056249916553497314,
0.0007012840360403061,
0.2100003957748413,
-0.04177876561880112,
0.025234371423721313,
0.09569119662046432,
-0.0021129450760781765,
-0.1334923505783081,
0.08607164770364761,
0.012036800384521484,
-0.0908496081829071,
-0.00524336751550436,
0.05687497556209564,
-0.010159652680158615,
0.04146892949938774,
0.05608949810266495,
0.1465999335050583,
0.017542090266942978,
0.03889729827642441,
0.018006332218647003,
-0.06616052985191345,
-0.010658478364348412,
-0.14105552434921265,
0.1510114073753357,
0.11114911735057831,
-0.007464174181222916,
-0.008126433938741684,
-0.05962248146533966,
0.037328287959098816,
-0.0295550599694252,
0.04206942021846771,
-0.04170020669698715,
-0.09700866043567657,
0.007195712998509407,
-0.01616956666111946,
0.024802248924970627,
-0.18579888343811035,
-0.04415787383913994,
-0.04564391076564789,
0.01610749401152134,
0.012003748677670956,
0.0924360603094101,
0.08871884644031525,
-0.002726062200963497,
-0.0472642257809639,
-0.11930151283740997,
-0.03838428109884262,
0.04761269688606262,
-0.11188602447509766,
-0.08239954710006714
] |
null | null | transformers |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# cards-blt-swin-tiny-patch4-window7-224-finetuned-v1
This model is a fine-tuned version of [microsoft/swin-tiny-patch4-window7-224](https://huggingface.co/microsoft/swin-tiny-patch4-window7-224) on the imagefolder dataset.
It achieves the following results on the evaluation set:
- Loss: 1.3476
- Accuracy: 0.4217
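
For a quick smoke test of the checkpoint, the sketch below runs it through the `image-classification` pipeline. The repo id is the one this card is published under; the input path is a placeholder, not a file shipped with the model.

```python
# Minimal inference sketch. "card_photo.jpg" is a placeholder input path,
# not a file that ships with this repository.
from transformers import pipeline

classifier = pipeline(
    "image-classification",
    model="ansilmbabl/cards-blt-swin-tiny-patch4-window7-224-finetuned-v1",
)

predictions = classifier("card_photo.jpg")
print(predictions)  # list of {"label": ..., "score": ...} dicts
```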
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (a sketch of the equivalent `TrainingArguments` follows the list):
- learning_rate: 5e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 128
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 3
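
A sketch of the equivalent `TrainingArguments` is shown below. It mirrors the values listed above; the output directory name is illustrative, not a value recorded in this card.

```python
# TrainingArguments sketch matching the hyperparameters above.
# "swin-cards-finetuned" is an illustrative output directory name.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="swin-cards-finetuned",
    learning_rate=5e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    gradient_accumulation_steps=4,  # 32 x 4 = 128 effective train batch size
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=3,
)
```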
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 1.7046 | 1.0 | 56 | 1.4651 | 0.3422 |
| 1.6543 | 1.99 | 112 | 1.4050 | 0.3917 |
| 1.6565 | 2.99 | 168 | 1.3476 | 0.4217 |
### Framework versions
- Transformers 4.37.2
- Pytorch 2.0.1+cu117
- Datasets 2.17.0
- Tokenizers 0.15.2
| {"license": "apache-2.0", "tags": ["generated_from_trainer"], "datasets": ["imagefolder"], "metrics": ["accuracy"], "base_model": "microsoft/swin-tiny-patch4-window7-224", "model-index": [{"name": "cards-blt-swin-tiny-patch4-window7-224-finetuned-v1", "results": [{"task": {"type": "image-classification", "name": "Image Classification"}, "dataset": {"name": "imagefolder", "type": "imagefolder", "config": "default", "split": "test", "args": "default"}, "metrics": [{"type": "accuracy", "value": 0.4216666666666667, "name": "Accuracy"}]}]}]} | image-classification | ansilmbabl/cards-blt-swin-tiny-patch4-window7-224-finetuned-v1 | [
"transformers",
"tensorboard",
"safetensors",
"swin",
"image-classification",
"generated_from_trainer",
"dataset:imagefolder",
"base_model:microsoft/swin-tiny-patch4-window7-224",
"license:apache-2.0",
"model-index",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | 2024-02-13T10:14:28+00:00 | [] | [] | TAGS
#transformers #tensorboard #safetensors #swin #image-classification #generated_from_trainer #dataset-imagefolder #base_model-microsoft/swin-tiny-patch4-window7-224 #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #region-us
| cards-blt-swin-tiny-patch4-window7-224-finetuned-v1
===================================================
This model is a fine-tuned version of microsoft/swin-tiny-patch4-window7-224 on the imagefolder dataset.
It achieves the following results on the evaluation set:
* Loss: 1.3476
* Accuracy: 0.4217
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 5e-05
* train\_batch\_size: 32
* eval\_batch\_size: 32
* seed: 42
* gradient\_accumulation\_steps: 4
* total\_train\_batch\_size: 128
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* lr\_scheduler\_warmup\_ratio: 0.1
* num\_epochs: 3
### Training results
### Framework versions
* Transformers 4.37.2
* Pytorch 2.0.1+cu117
* Datasets 2.17.0
* Tokenizers 0.15.2
| [
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 5e-05\n* train\\_batch\\_size: 32\n* eval\\_batch\\_size: 32\n* seed: 42\n* gradient\\_accumulation\\_steps: 4\n* total\\_train\\_batch\\_size: 128\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_ratio: 0.1\n* num\\_epochs: 3",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.37.2\n* Pytorch 2.0.1+cu117\n* Datasets 2.17.0\n* Tokenizers 0.15.2"
] | [
"TAGS\n#transformers #tensorboard #safetensors #swin #image-classification #generated_from_trainer #dataset-imagefolder #base_model-microsoft/swin-tiny-patch4-window7-224 #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 5e-05\n* train\\_batch\\_size: 32\n* eval\\_batch\\_size: 32\n* seed: 42\n* gradient\\_accumulation\\_steps: 4\n* total\\_train\\_batch\\_size: 128\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_ratio: 0.1\n* num\\_epochs: 3",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.37.2\n* Pytorch 2.0.1+cu117\n* Datasets 2.17.0\n* Tokenizers 0.15.2"
] | [
88,
144,
4,
33
] | [
"passage: TAGS\n#transformers #tensorboard #safetensors #swin #image-classification #generated_from_trainer #dataset-imagefolder #base_model-microsoft/swin-tiny-patch4-window7-224 #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 5e-05\n* train\\_batch\\_size: 32\n* eval\\_batch\\_size: 32\n* seed: 42\n* gradient\\_accumulation\\_steps: 4\n* total\\_train\\_batch\\_size: 128\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_ratio: 0.1\n* num\\_epochs: 3### Training results### Framework versions\n\n\n* Transformers 4.37.2\n* Pytorch 2.0.1+cu117\n* Datasets 2.17.0\n* Tokenizers 0.15.2"
] | [
-0.11765915900468826,
0.16754376888275146,
-0.002332442905753851,
0.09133251011371613,
0.10915283858776093,
0.02940904162824154,
0.10591065883636475,
0.13941383361816406,
-0.06588911265134811,
0.11517035961151123,
0.13753651082515717,
0.08762428909540176,
0.07153991609811783,
0.15178488194942474,
-0.00499581778421998,
-0.2968966066837311,
0.018001360818743706,
-0.01170832384377718,
-0.14045464992523193,
0.10913319140672684,
0.06868519634008408,
-0.125401571393013,
0.09118035435676575,
0.0037092796992510557,
-0.14063076674938202,
-0.027653243392705917,
-0.03979460895061493,
-0.046449121087789536,
0.09920447319746017,
0.03686288371682167,
0.08162684738636017,
0.031193384900689125,
0.11375588178634644,
-0.2330435961484909,
0.007130134850740433,
0.07351066172122955,
0.012036585249006748,
0.09648068994283676,
0.11815784126520157,
0.01702595315873623,
0.14384452998638153,
-0.10978855937719345,
0.06574146449565887,
0.04111422970890999,
-0.08309638500213623,
-0.23894765973091125,
-0.0635773316025734,
0.09020445495843887,
0.12273582816123962,
0.053626567125320435,
-0.00808032602071762,
0.08387497812509537,
-0.06683608144521713,
0.08410459756851196,
0.22697961330413818,
-0.24168333411216736,
-0.07380914688110352,
0.04401656240224838,
0.03167056664824486,
0.036000482738018036,
-0.1333334594964981,
-0.0063564833253622055,
0.037771668285131454,
0.0030793994665145874,
0.10977614670991898,
0.0263083353638649,
0.061307329684495926,
0.009123362600803375,
-0.14058749377727509,
-0.04456121847033501,
0.09490650147199631,
0.10788409411907196,
-0.01751788519322872,
-0.12114740908145905,
-0.05410588160157204,
-0.19297905266284943,
-0.0459405779838562,
0.013369264081120491,
0.04008757695555687,
-0.05635332688689232,
-0.07813552021980286,
0.030272776260972023,
-0.07100824266672134,
-0.07953225821256638,
0.04502252861857414,
0.12736277282238007,
0.060521818697452545,
-0.0045955306850373745,
0.022281132638454437,
0.11716210842132568,
0.09475012123584747,
-0.16307766735553741,
-0.0007599478121846914,
0.006901710759848356,
-0.07224006950855255,
-0.000039319849747698754,
-0.00695404689759016,
0.024959657341241837,
0.042200032621622086,
0.14083793759346008,
-0.029838066548109055,
0.0800810232758522,
0.08421552181243896,
0.022104909643530846,
-0.0812143087387085,
0.14753203094005585,
-0.08025862276554108,
-0.08872431516647339,
-0.029431860893964767,
0.11827979981899261,
0.03366805985569954,
-0.009361648932099342,
-0.0867839902639389,
0.02186848595738411,
0.10502532124519348,
0.02884267084300518,
0.0005844584084115922,
0.04297792166471481,
-0.05719715729355812,
-0.03245789185166359,
0.08858150243759155,
-0.08463194221258163,
0.04490572586655617,
0.03402736037969589,
-0.06643088907003403,
-0.009982298128306866,
0.02637125551700592,
-0.012438962236046791,
0.008760823868215084,
0.10687355697154999,
-0.0967978909611702,
-0.0289057157933712,
-0.08168037980794907,
-0.07624975591897964,
0.0314454585313797,
-0.08725813776254654,
0.016400212422013283,
-0.08427345007658005,
-0.11392028629779816,
-0.03780375048518181,
0.06510571390390396,
-0.06268245726823807,
-0.0711817592382431,
-0.04824215546250343,
-0.10063835978507996,
0.0611896850168705,
0.006773785687983036,
0.12806789577007294,
-0.05223703384399414,
0.09449782967567444,
0.003740663407370448,
0.07972024381160736,
0.06422502547502518,
0.03617699816823006,
-0.06354711204767227,
0.06748682260513306,
-0.16542872786521912,
0.05099838599562645,
-0.08781091123819351,
0.06942710280418396,
-0.11834156513214111,
-0.10437877476215363,
-0.008676953613758087,
-0.012299522757530212,
0.06506089866161346,
0.14185702800750732,
-0.15555356442928314,
-0.06797108799219131,
0.14526063203811646,
-0.08890704810619354,
-0.12081436067819595,
0.10566697269678116,
-0.01255430281162262,
-0.06108224764466286,
0.010860845446586609,
0.16729365289211273,
0.08459290862083435,
-0.08424384891986847,
-0.03433625027537346,
0.006005214061588049,
0.0968417227268219,
-0.0007902128854766488,
0.103339284658432,
-0.001891355263069272,
0.012257043272256851,
0.017475103959441185,
-0.07454390823841095,
0.0761919692158699,
-0.08945557475090027,
-0.07930418848991394,
-0.038342010229825974,
-0.08360843360424042,
0.02758554369211197,
0.06448548287153244,
0.02280540019273758,
-0.0794619619846344,
-0.13597674667835236,
0.016156552359461784,
0.12124969065189362,
-0.09589441120624542,
-0.005005652084946632,
-0.05528402328491211,
0.07092840224504471,
-0.05018484592437744,
-0.011231796815991402,
-0.12547600269317627,
-0.07431714981794357,
0.0345577746629715,
-0.08429037779569626,
-0.015576883219182491,
-0.009276621975004673,
0.07471670210361481,
0.08819718658924103,
-0.05812571570277214,
-0.08974308520555496,
-0.05569591000676155,
0.010978843085467815,
-0.07734661549329758,
-0.2588106691837311,
-0.08065007627010345,
-0.0267537459731102,
0.1426846981048584,
-0.2551719844341278,
0.015487599186599255,
0.013709750957787037,
0.1476009041070938,
0.04463601112365723,
-0.05752711370587349,
0.0028919437900185585,
0.013227124698460102,
-0.045008111745119095,
-0.10059117525815964,
0.03368052840232849,
0.002146498765796423,
-0.10949800163507462,
-0.02254980430006981,
-0.11953935027122498,
0.12222383171319962,
0.104343481361866,
0.011190311051905155,
-0.09359344840049744,
-0.043751973658800125,
-0.07634665071964264,
-0.05544142425060272,
-0.019447404891252518,
0.015762142837047577,
0.08351815491914749,
0.010114968754351139,
0.10583659261465073,
-0.08187925815582275,
-0.05762708559632301,
0.04081660509109497,
-0.0029322037007659674,
-0.02943330630660057,
0.13994333148002625,
0.10684020072221756,
-0.07842238247394562,
0.1334345042705536,
0.1302734613418579,
-0.05239474028348923,
0.13124732673168182,
-0.05807117000222206,
-0.09641838073730469,
-0.03255247324705124,
0.02186783216893673,
0.020187770947813988,
0.15437449514865875,
-0.09198525547981262,
0.012078605592250824,
0.026315320283174515,
0.011550411581993103,
0.01344616711139679,
-0.17262187600135803,
-0.016989683732390404,
0.04578150808811188,
-0.04995851591229439,
0.02068524807691574,
-0.0304640494287014,
-0.02509402111172676,
0.09259186685085297,
0.0044878325425088406,
-0.04971901699900627,
-0.004058452323079109,
-0.006232511252164841,
-0.08122281730175018,
0.21001681685447693,
-0.07837370038032532,
-0.14975300431251526,
-0.12711450457572937,
0.03793148323893547,
-0.0412096343934536,
-0.002861904678866267,
0.017427677288651466,
-0.10598774254322052,
-0.052681148052215576,
-0.08712753653526306,
0.001756535959430039,
-0.015517032705247402,
0.05112868547439575,
0.010858828201889992,
0.016460923478007317,
0.08136335015296936,
-0.08419495075941086,
0.021267786622047424,
-0.007105300202965736,
-0.010764733888208866,
0.030794942751526833,
0.04259965568780899,
0.12018527835607529,
0.130454882979393,
0.01678231917321682,
0.017083048820495605,
-0.008356506004929543,
0.18868491053581238,
-0.0937875434756279,
0.03220866620540619,
0.1022246703505516,
-0.002280519576743245,
0.0516522154211998,
0.13424144685268402,
0.04549407586455345,
-0.0743604227900505,
0.015226229093968868,
0.03370179608464241,
-0.01755775511264801,
-0.19187405705451965,
-0.0314071886241436,
-0.027853915467858315,
0.007445486728101969,
0.13086847960948944,
0.04723621532320976,
-0.03068353421986103,
0.0684063583612442,
-0.01992211677134037,
0.008621105924248695,
-0.01818081922829151,
0.07188230007886887,
0.023659151047468185,
0.048338647931814194,
0.10719096660614014,
-0.03794257342815399,
-0.022650068625807762,
0.0402592308819294,
-0.005390160251408815,
0.20968137681484222,
-0.030143337324261665,
0.14354616403579712,
0.024695534259080887,
0.16601219773292542,
0.004566024988889694,
0.061988238245248795,
0.014600413851439953,
-0.03425098583102226,
0.005677468609064817,
-0.05432700365781784,
-0.025352302938699722,
0.05348490923643112,
0.020321615040302277,
0.05995103716850281,
-0.1077842190861702,
0.06711214780807495,
0.04653113707900047,
0.26234328746795654,
0.07365486025810242,
-0.3369845151901245,
-0.09089618176221848,
0.016972418874502182,
-0.033847060054540634,
-0.046926550567150116,
0.023354025557637215,
0.15566037595272064,
-0.08536054193973541,
0.0764487236738205,
-0.08531530946493149,
0.0676410123705864,
-0.07122406363487244,
-0.004181514959782362,
0.08911734819412231,
0.10828197747468948,
0.004614140372723341,
0.07681705057621002,
-0.19348685443401337,
0.25423383712768555,
-0.007474001031368971,
0.04424755647778511,
-0.05861619487404823,
0.03237165883183479,
0.02785586379468441,
0.023865723982453346,
0.11019990593194962,
-0.0031531229615211487,
-0.10478446632623672,
-0.18756313621997833,
-0.12044835090637207,
0.020075786858797073,
0.11717861145734787,
-0.08104095607995987,
0.11284030228853226,
-0.03374394401907921,
-0.03869448974728584,
0.04863887280225754,
-0.061892956495285034,
-0.08061802387237549,
-0.1257026195526123,
0.0011736677261069417,
-0.03629905357956886,
0.009203562512993813,
-0.0960238128900528,
-0.10382671654224396,
-0.09686540812253952,
0.1513671725988388,
-0.1097187027335167,
-0.03905751183629036,
-0.15598542988300323,
0.10080127418041229,
0.14356352388858795,
-0.08274886012077332,
0.06050371006131172,
-0.008562751114368439,
0.1286071389913559,
0.03891170769929886,
-0.04769802838563919,
0.11410756409168243,
-0.09723494201898575,
-0.23011907935142517,
-0.056654322892427444,
0.11289055645465851,
0.03920014575123787,
0.06084984168410301,
-0.02456614561378956,
0.022357529029250145,
-0.013520187698304653,
-0.09739882498979568,
0.05857406184077263,
0.04218752682209015,
0.03884945437312126,
0.01868574507534504,
-0.03727824613451958,
0.036556825041770935,
-0.027723057195544243,
-0.03382936865091324,
0.10511323064565659,
0.27984827756881714,
-0.11793424934148788,
0.02472478523850441,
0.027829673141241074,
-0.0476553738117218,
-0.1822100728750229,
0.015143350698053837,
0.10217855870723724,
0.022075414657592773,
0.0365886352956295,
-0.17293256521224976,
0.10572438687086105,
0.08650773763656616,
-0.0243484228849411,
0.09790496528148651,
-0.2888767719268799,
-0.12184233218431473,
0.09059683233499527,
0.13360431790351868,
-0.04244033992290497,
-0.16834291815757751,
-0.05420541763305664,
-0.006668890360742807,
-0.07138010859489441,
0.08698517829179764,
0.0038003420922905207,
0.0984795019030571,
-0.03212409466505051,
-0.017954055219888687,
0.023302139714360237,
-0.07280249893665314,
0.16252560913562775,
-0.014910126104950905,
0.08688957244157791,
-0.034461673349142075,
0.017920315265655518,
-0.003072576830163598,
-0.07789447158575058,
0.03512747958302498,
-0.11582940071821213,
0.05595426633954048,
-0.10548631101846695,
-0.015252421610057354,
-0.07824725657701492,
0.029060110449790955,
-0.05192287638783455,
-0.042640846222639084,
-0.04031151905655861,
0.04891260713338852,
0.07617685943841934,
-0.00217669946141541,
0.14038215577602386,
0.013078346848487854,
0.10190054774284363,
0.11482054740190506,
0.05900100991129875,
0.002717725932598114,
-0.10083061456680298,
-0.038084741681814194,
-0.008698533289134502,
0.04824669286608696,
-0.14922083914279938,
0.013089473359286785,
0.12981580197811127,
0.03928258270025253,
0.11548526585102081,
0.05111001804471016,
-0.054315101355314255,
-0.01789676398038864,
0.0841798484325409,
-0.1091301292181015,
-0.1320478320121765,
-0.02631361410021782,
0.00773661071434617,
-0.16186073422431946,
0.017698200419545174,
0.07245944440364838,
-0.06629248708486557,
0.006713004782795906,
0.002649394329637289,
0.04993007332086563,
0.003737477818503976,
0.19004949927330017,
0.08354374021291733,
0.07991153001785278,
-0.08624987304210663,
0.10702330619096756,
0.029471762478351593,
-0.1394757628440857,
0.02424287050962448,
0.06968863308429718,
-0.07944589108228683,
-0.01132661011070013,
0.09079309552907944,
0.09419460594654083,
-0.023013953119516373,
-0.04604016989469528,
-0.12561380863189697,
-0.11931733787059784,
0.06838501989841461,
0.06386713683605194,
0.06747365742921829,
0.020375745370984077,
-0.0043162135407328606,
0.029817134141921997,
-0.1076386421918869,
0.13847704231739044,
0.0727555975317955,
0.09924577176570892,
-0.19350652396678925,
0.08719491958618164,
0.01169799268245697,
0.008220277726650238,
-0.014865806326270103,
0.05191589519381523,
-0.12394312024116516,
-0.02974599599838257,
-0.06934866309165955,
0.008496379479765892,
-0.07055816054344177,
0.008839210495352745,
0.0013880839105695486,
-0.05062389001250267,
-0.03972139209508896,
0.005841545760631561,
-0.09341643005609512,
-0.06133613735437393,
0.00033977694693021476,
0.06255333870649338,
-0.09890934824943542,
-0.016745084896683693,
0.038395266979932785,
-0.1200156956911087,
0.09057571738958359,
0.012617985717952251,
0.04598386585712433,
0.01351027749478817,
-0.08702675253152847,
0.03024478442966938,
0.047812558710575104,
-0.0016031148843467236,
0.025506258010864258,
-0.13247133791446686,
-0.005399250891059637,
-0.04852224513888359,
-0.008806341327726841,
-0.019481530413031578,
0.04546886309981346,
-0.1364341825246811,
0.003004145808517933,
-0.06009151041507721,
-0.05304520204663277,
-0.06006377562880516,
0.05000130459666252,
0.06901537626981735,
-0.017232565209269524,
0.16814684867858887,
-0.07581187039613724,
0.040154460817575455,
-0.23868009448051453,
-0.0016488373512402177,
-0.013726300559937954,
-0.06311896443367004,
-0.08477350324392319,
-0.008371884003281593,
0.0769384354352951,
-0.05026012659072876,
0.09595084935426712,
-0.03446922078728676,
0.019558290019631386,
0.02723146602511406,
-0.03540471941232681,
0.04992973804473877,
0.04972423240542412,
0.19736462831497192,
0.01716417446732521,
-0.013882202096283436,
0.06814666837453842,
0.01584491692483425,
0.0838017389178276,
0.058178942650556564,
0.15544551610946655,
0.15209923684597015,
-0.05183647945523262,
0.10504091531038284,
0.049059126526117325,
-0.12150325626134872,
-0.16009992361068726,
0.1451549232006073,
-0.06984632462263107,
0.13142333924770355,
-0.021816730499267578,
0.17254935204982758,
0.11832132190465927,
-0.20456916093826294,
0.004667798988521099,
-0.016011768952012062,
-0.08158690482378006,
-0.09400244057178497,
-0.09854137152433395,
-0.09025079756975174,
-0.1727149933576584,
0.015422775410115719,
-0.10447925329208374,
0.006767000071704388,
0.07415241003036499,
0.02412485145032406,
0.023330582305788994,
0.16197170317173004,
0.059195924550294876,
0.02629590407013893,
0.0596691370010376,
0.049987513571977615,
-0.04272458329796791,
-0.028314566239714622,
-0.08582209050655365,
0.0181849654763937,
-0.021958863362669945,
0.038143765181303024,
-0.06471582502126694,
-0.06629931181669235,
0.08654410392045975,
0.04458014294505119,
-0.10066959261894226,
0.023125886917114258,
-0.021000437438488007,
0.0416138730943203,
0.06134175509214401,
0.011143067851662636,
0.00839002151042223,
-0.04647599533200264,
0.20154057443141937,
-0.0934651792049408,
-0.009475420229136944,
-0.11276676505804062,
0.17374737560749054,
-0.015044840052723885,
-0.007557152770459652,
0.0330665148794651,
-0.08890704810619354,
-0.007291472051292658,
0.15302665531635284,
0.15686677396297455,
-0.04413728043437004,
-0.022433971986174583,
0.018802214413881302,
-0.015187270939350128,
-0.04206923022866249,
0.09034943580627441,
0.09423679858446121,
0.05252635106444359,
-0.06939922273159027,
-0.04846150428056717,
-0.0424131378531456,
-0.05792897939682007,
-0.03177092969417572,
0.05636906996369362,
0.03760228678584099,
-0.007395045831799507,
-0.043773408979177475,
0.076807402074337,
-0.04245525598526001,
-0.11749028414487839,
0.08981110900640488,
-0.1794610619544983,
-0.1732812225818634,
-0.03291282057762146,
0.08835410326719284,
0.014134406112134457,
0.04608627036213875,
-0.0024435422383248806,
-0.021372079849243164,
0.09447401016950607,
-0.00310629908926785,
-0.08481718599796295,
-0.08432804048061371,
0.04468643665313721,
-0.04805774614214897,
0.23803840577602386,
-0.029600104317069054,
0.009161766618490219,
0.1275947242975235,
0.035501979291439056,
-0.1378655880689621,
0.010674306191504002,
0.07660042494535446,
-0.09973540157079697,
0.04684274271130562,
0.15216849744319916,
-0.026179084554314613,
0.12002134323120117,
0.042633213102817535,
-0.09844363480806351,
-0.008621600456535816,
-0.07883019745349884,
-0.056011322885751724,
-0.0556047298014164,
0.004597065504640341,
-0.03835177794098854,
0.15771318972110748,
0.2004527747631073,
-0.06060699373483658,
-0.03660879284143448,
-0.045232370495796204,
0.03831230103969574,
0.04749315604567528,
0.09174647927284241,
0.005916736554354429,
-0.23259237408638,
0.028865976259112358,
-0.03149829059839249,
0.020662818104028702,
-0.19407245516777039,
-0.09256979823112488,
0.018790891394019127,
-0.05064710974693298,
-0.09936390072107315,
0.10424387454986572,
0.07491288334131241,
0.04820488765835762,
-0.06026504561305046,
-0.039347004145383835,
-0.05443344637751579,
0.15698020160198212,
-0.16645683348178864,
-0.07735693454742432
] |
null | null | null |
# gpt2-SlimOrca
gpt2 base model trained on the SlimOrca dataset | {"language": ["en"], "license": "mit", "datasets": ["Open-Orca/SlimOrca"]} | null | luke-ethan-gibson/gpt2-orcaSlim | [
"en",
"dataset:Open-Orca/SlimOrca",
"license:mit",
"region:us"
] | 2024-02-13T10:15:40+00:00 | [] | [
"en"
] | TAGS
#en #dataset-Open-Orca/SlimOrca #license-mit #region-us
|
# gpt2-SlimOrca
gpt2 base model trained on the SlimOrca dataset | [
"# gtp2-SlimOrca\ngpt2 base model trained off SlimOrca dataset"
] | [
"TAGS\n#en #dataset-Open-Orca/SlimOrca #license-mit #region-us \n",
"# gtp2-SlimOrca\ngpt2 base model trained off SlimOrca dataset"
] | [
26,
22
] | [
"passage: TAGS\n#en #dataset-Open-Orca/SlimOrca #license-mit #region-us \n# gtp2-SlimOrca\ngpt2 base model trained off SlimOrca dataset"
] | [
-0.13152526319026947,
0.19666698575019836,
-0.0038007667753845453,
0.0978793054819107,
0.0798925831913948,
0.03636230528354645,
0.09794684499502182,
0.13640162348747253,
0.08702106028795242,
0.013886981643736362,
0.14783300459384918,
0.1937176138162613,
0.009285109117627144,
0.19111646711826324,
-0.08296244591474533,
-0.23055951297283173,
0.015901435166597366,
-0.023211928084492683,
-0.11996821314096451,
0.06662008166313171,
0.07288137078285217,
0.009422612376511097,
0.057653483003377914,
-0.03864714875817299,
-0.11181513965129852,
-0.021077241748571396,
0.0072580804117023945,
-0.021884620189666748,
-0.001906288554891944,
0.08486931771039963,
0.05463821068406105,
0.039920344948768616,
0.05330846086144447,
-0.0450102835893631,
0.0270024873316288,
-0.05236469954252243,
-0.03329194337129593,
0.08506672084331512,
0.01664889231324196,
0.03587956726551056,
0.10330817103385925,
-0.012262261472642422,
-0.005064693745225668,
0.011447970755398273,
-0.18034489452838898,
-0.22219528257846832,
-0.038261860609054565,
-0.15757443010807037,
0.03562333062291145,
-0.03092358447611332,
0.036743566393852234,
0.0791056677699089,
-0.17254675924777985,
-0.03026488609611988,
-0.04899417608976364,
-0.20633172988891602,
-0.06355546414852142,
0.12720438838005066,
-0.008144194260239601,
0.0497807078063488,
0.004707242362201214,
0.07501213252544403,
0.07522612065076828,
-0.025818079710006714,
-0.054350800812244415,
-0.0007538940408267081,
-0.046999793499708176,
0.05082991346716881,
-0.037300679832696915,
-0.04024806246161461,
0.23444581031799316,
0.03609872981905937,
0.02459588460624218,
-0.06971650570631027,
-0.029817167669534683,
0.11389477550983429,
0.027833424508571625,
0.022262148559093475,
0.02938474342226982,
0.1046450212597847,
0.003455295693129301,
-0.05374240130186081,
-0.07861749082803726,
-0.11226867884397507,
-0.19705639779567719,
0.12842094898223877,
-0.0703134834766388,
0.09679946303367615,
-0.07213514298200607,
0.03135567158460617,
-0.05731289088726044,
-0.09656556695699692,
-0.10608799755573273,
-0.0982842743396759,
0.05365091934800148,
-0.004569333512336016,
-0.0017386761028319597,
0.1657727062702179,
0.16322913765907288,
0.16587641835212708,
0.09899824112653732,
-0.055023256689310074,
0.06387106329202652,
0.08253904432058334,
0.04930470883846283,
-0.06601828336715698,
-0.07412543892860413,
0.0761919766664505,
0.08002492785453796,
-0.14777597784996033,
0.03415927663445473,
-0.0778561383485794,
-0.10691315680742264,
-0.04388653486967087,
-0.17735855281352997,
0.134440079331398,
0.09156768769025803,
-0.09384296834468842,
-0.06992693245410919,
-0.07514113187789917,
0.1715974062681198,
-0.009267682209610939,
0.028760233893990517,
-0.025496698915958405,
-0.05663900077342987,
0.10089762508869171,
-0.06110205128788948,
0.06755967438220978,
0.07708142697811127,
-0.04794817417860031,
-0.09791899472475052,
-0.0135623998939991,
0.003484653541818261,
0.04175376519560814,
0.046130530536174774,
0.002663637511432171,
0.07860589027404785,
-0.14114078879356384,
-0.22834235429763794,
-0.0033342193346470594,
0.07667668908834457,
-0.14282460510730743,
-0.06132759898900986,
-0.020249906927347183,
-0.015619934536516666,
-0.038180530071258545,
-0.054730579257011414,
0.01085946150124073,
-0.04177401587367058,
0.040524210780858994,
-0.049965836107730865,
0.006896250881254673,
-0.25487038493156433,
0.005865118466317654,
-0.1278896927833557,
0.03551240265369415,
0.08864422887563705,
0.11997974663972855,
-0.06865102797746658,
0.06764402985572815,
-0.07809106260538101,
0.017650920897722244,
-0.13127253949642181,
0.00758452620357275,
0.02010158821940422,
0.21519091725349426,
-0.1985527127981186,
-0.006679977290332317,
0.1511877179145813,
-0.0861382856965065,
-0.11095733940601349,
-0.00016886625962797552,
0.006133421789854765,
0.14665640890598297,
0.023771634325385094,
0.18179894983768463,
0.08303730189800262,
-0.011204547248780727,
0.07337798178195953,
0.04170949012041092,
-0.012019804678857327,
-0.152724027633667,
0.19067049026489258,
-0.06703079491853714,
-0.01031424105167389,
0.06007339432835579,
-0.07769865542650223,
0.040126167237758636,
-0.09945324808359146,
-0.11347310990095139,
-0.06973389536142349,
-0.07061295956373215,
-0.05712269991636276,
-0.028019128367304802,
0.05885232239961624,
0.0585629865527153,
0.006852203048765659,
0.014026031829416752,
0.07101117074489594,
0.013248526491224766,
-0.0684574544429779,
-0.09633946418762207,
0.16152507066726685,
-0.21226866543293,
-0.057833246886730194,
-0.10521262139081955,
-0.04511231556534767,
-0.03497665748000145,
-0.0215185284614563,
-0.016639411449432373,
0.09760395437479019,
0.11338841170072556,
-0.010341907851397991,
0.04500708729028702,
0.09360290318727493,
0.052518006414175034,
0.07904717326164246,
0.03233389928936958,
-0.097264364361763,
-0.005289914086461067,
-0.07242359220981598,
0.18220791220664978,
-0.12074951082468033,
-0.02964632399380207,
0.12933996319770813,
0.10896722227334976,
0.04042595624923706,
-0.0004742765158880502,
0.026948850601911545,
0.0021740321535617113,
0.02975142002105713,
-0.09536783397197723,
0.08140551298856735,
0.02797127701342106,
-0.09802983701229095,
0.040474798530340195,
0.06401973217725754,
0.1938788890838623,
0.10745903849601746,
0.10183734446763992,
0.042678456753492355,
-0.20173397660255432,
-0.030999790877103806,
0.04561449587345123,
-0.02949005365371704,
0.04131472483277321,
0.0284942165017128,
-0.06490229815244675,
-0.001089076860807836,
-0.08003358542919159,
0.05069351941347122,
-0.004907377064228058,
-0.04903290793299675,
0.0031143350061029196,
0.055370114743709564,
0.24523067474365234,
-0.12940417230129242,
0.11507406830787659,
0.16156676411628723,
0.10750731825828552,
0.10200799256563187,
-0.04819326847791672,
-0.039528701454401016,
-0.10875888913869858,
0.028637316077947617,
0.01743955723941326,
0.0692911297082901,
0.008172374218702316,
0.12427707016468048,
0.047787465155124664,
0.029580168426036835,
0.08628152310848236,
-0.05837509408593178,
-0.10131462663412094,
0.025108013302087784,
-0.05424680933356285,
-0.06522727757692337,
0.09407419711351395,
-0.10060704499483109,
0.05062457174062729,
-0.0035874878522008657,
0.027518821880221367,
0.15810070931911469,
0.0399530865252018,
-0.017295198515057564,
0.11279885470867157,
-0.06708008795976639,
-0.11701345443725586,
-0.09216935187578201,
0.10971800237894058,
0.02277233824133873,
0.010110610164701939,
0.038217272609472275,
-0.047296084463596344,
-0.022790683433413506,
0.028527481481432915,
-0.026763586327433586,
0.06358428299427032,
0.0013826185604557395,
-0.048637330532073975,
0.10745866596698761,
-0.04107105731964111,
-0.08446574211120605,
-0.019780358299613,
-0.013843242079019547,
-0.05190890654921532,
0.04824277386069298,
-0.12542225420475006,
0.0774013102054596,
0.060986604541540146,
0.08293242752552032,
0.038485586643218994,
0.05733419209718704,
0.20668141543865204,
-0.015297824516892433,
-0.03778326138854027,
0.2243109494447708,
0.03983048349618912,
-0.02062194235622883,
0.15056580305099487,
0.11605329811573029,
-0.14586447179317474,
-0.09943446516990662,
-0.04315539821982384,
-0.11410355567932129,
-0.17958058416843414,
-0.12601761519908905,
-0.09696995466947556,
-0.04971568286418915,
0.038516029715538025,
0.048696015030145645,
0.023017792031168938,
0.13590098917484283,
0.06367570906877518,
-0.12531758844852448,
-0.07116680592298508,
-0.017280932515859604,
-0.012324835173785686,
-0.010244273580610752,
0.07381002604961395,
-0.04342832788825035,
0.0042725251987576485,
0.16549953818321228,
0.04125381261110306,
0.20284157991409302,
0.11171754449605942,
0.03656547889113426,
0.07268624752759933,
0.05987558886408806,
0.08790870010852814,
0.027740241959691048,
0.07825536280870438,
-0.03531435504555702,
-0.010333721525967121,
-0.05800356715917587,
-0.09366558492183685,
0.05588524788618088,
0.0398898683488369,
-0.09030155092477798,
0.025363899767398834,
-0.013048850931227207,
0.061648089438676834,
-0.09860750287771225,
0.07619736343622208,
-0.16025902330875397,
0.03731546178460121,
0.06649105995893478,
0.10475173592567444,
-0.023152891546487808,
0.04256311431527138,
0.10215473175048828,
-0.050283510237932205,
-0.038150012493133545,
0.03210681304335594,
0.10127715766429901,
-0.08535277098417282,
-0.08198476582765579,
-0.04379858076572418,
0.0516819953918457,
0.038845136761665344,
0.10382116585969925,
-0.14916659891605377,
0.18480156362056732,
-0.019933929666876793,
0.060942038893699646,
-0.10289560258388519,
-0.010952170006930828,
0.023222140967845917,
0.040574390441179276,
0.2341407984495163,
0.04140613600611687,
-0.24614396691322327,
0.07271630316972733,
-0.16095487773418427,
0.09685484319925308,
-0.10795075446367264,
-0.07832171022891998,
-0.0855964869260788,
-0.031546689569950104,
-0.002613726770505309,
-0.009100549854338169,
-0.030126415193080902,
-0.05370042473077774,
-0.1075957715511322,
-0.02092503197491169,
0.025728024542331696,
-0.05431165173649788,
-0.05376053228974342,
0.02421700768172741,
0.07616867125034332,
0.13487553596496582,
-0.024731438606977463,
-0.0686187595129013,
-0.11671386659145355,
0.07334886491298676,
0.13515397906303406,
-0.0391572080552578,
0.027981698513031006,
0.01069939136505127,
0.025093840435147285,
0.09322138130664825,
-0.11047995090484619,
0.08048191666603088,
-0.038741230964660645,
0.052808500826358795,
-0.04557909071445465,
-0.020050181075930595,
-0.09550182521343231,
0.0014737884048372507,
0.06526920199394226,
0.03869318589568138,
-0.0209724772721529,
-0.11404111981391907,
0.013915269635617733,
0.07505609095096588,
-0.1562425196170807,
0.031778089702129364,
-0.03170156851410866,
0.023000624030828476,
0.1360645890235901,
-0.05184955149888992,
0.09593786299228668,
0.22048385441303253,
-0.07946645468473434,
0.1179567202925682,
0.10578345507383347,
-0.07130608707666397,
-0.183466836810112,
-0.09700476378202438,
-0.08699813485145569,
0.04205358400940895,
0.031051117926836014,
-0.20855191349983215,
0.16962645947933197,
0.19507095217704773,
-0.04914388433098793,
0.09094127267599106,
-0.29492226243019104,
-0.06322215497493744,
0.04745694622397423,
0.05325927957892418,
0.3206270635128021,
-0.08148778229951859,
-0.1603010594844818,
-0.052741888910532,
-0.12610183656215668,
0.22154644131660461,
-0.1254725605249405,
0.08158561587333679,
-0.00589272053912282,
-0.07475455105304718,
0.016952717676758766,
-0.009456442669034004,
0.30922695994377136,
-0.09697262942790985,
0.04044176638126373,
-0.04790158569812775,
-0.08163189888000488,
0.020589511841535568,
0.03421588987112045,
-0.021230267360806465,
-0.006908625364303589,
0.06982402503490448,
-0.16625642776489258,
-0.039936091750860214,
0.015209686011075974,
0.07344450056552887,
0.023361284285783768,
-0.0761510506272316,
0.05135182663798332,
0.08121780306100845,
-0.03481683507561684,
0.034153345972299576,
-0.027495503425598145,
-0.10923770815134048,
-0.030914297327399254,
-0.06095827370882034,
-0.003261849284172058,
-0.11455634236335754,
0.030126769095659256,
-0.10394541919231415,
-0.04269767180085182,
0.020869947969913483,
-0.14467857778072357,
-0.022028837352991104,
0.10287677496671677,
0.058022115379571915,
-0.024983929470181465,
0.042473115026950836,
0.06341184675693512,
0.0715617686510086,
0.11483867466449738,
-0.06373316794633865,
0.030617740005254745,
-0.0048895892687141895,
0.08108920603990555,
0.04294202849268913,
-0.07947563380002975,
0.1196928545832634,
0.06482446938753128,
-0.022512884810566902,
0.0073727816343307495,
0.012098305858671665,
-0.1307089775800705,
0.06279484927654266,
0.10253427922725677,
-0.05801449716091156,
-0.1332395076751709,
0.01938176155090332,
0.048402026295661926,
0.14747747778892517,
-0.06892450153827667,
-0.056671638041734695,
-0.03780020400881767,
-0.061876699328422546,
-0.039369259029626846,
0.14823439717292786,
-0.1104137971997261,
-0.1080741360783577,
-0.027031175792217255,
-0.0653536394238472,
-0.014924393966794014,
-0.12688681483268738,
0.09617742896080017,
0.10220655053853989,
0.0026367066893726587,
-0.08016415685415268,
-0.02973073720932007,
-0.034059617668390274,
-0.006169062573462725,
-0.008559206500649452,
-0.10142327845096588,
-0.11215675622224808,
-0.09419813752174377,
0.0036382446996867657,
-0.07312965393066406,
-0.006973522249609232,
-0.13078396022319794,
-0.014870872721076012,
-0.0415206104516983,
0.03277416527271271,
-0.03777794539928436,
0.03535943478345871,
0.05958866328001022,
-0.0629095733165741,
-0.07736574858427048,
0.0032397890463471413,
-0.07262785732746124,
0.042574092745780945,
-0.026024281978607178,
0.003781678853556514,
-0.06304590404033661,
-0.08130119740962982,
0.026462972164154053,
-0.021794408559799194,
0.055160947144031525,
0.08451057225465775,
-0.008529257029294968,
-0.054289381951093674,
-0.22913116216659546,
-0.04533448442816734,
0.043899595737457275,
0.05604294687509537,
0.04627881944179535,
-0.0759648010134697,
0.03974497690796852,
0.10777555406093597,
-0.04293312504887581,
0.022918106988072395,
-0.007890746928751469,
-0.09547240287065506,
-0.10242891311645508,
-0.0379096120595932,
-0.12006959319114685,
0.045051854103803635,
0.011751563288271427,
0.19387832283973694,
0.0561758317053318,
0.042523082345724106,
0.055674076080322266,
0.06223181635141373,
-0.08397992700338364,
-0.011664696969091892,
-0.03224789351224899,
-0.0383622981607914,
0.01601572148501873,
0.007211855612695217,
0.0017544570146128535,
0.06443602591753006,
0.2776503264904022,
0.07620510458946228,
-0.08650290220975876,
-0.0034735603258013725,
0.05527767166495323,
0.21243718266487122,
0.012123245745897293,
0.2808995246887207,
0.10269589722156525,
0.03853527456521988,
-0.03320314735174179,
0.09744887053966522,
-0.05011007562279701,
-0.04812651500105858,
-0.0009740757523104548,
0.01870022900402546,
0.04744509235024452,
-0.023250211030244827,
0.07994094491004944,
-0.006473544053733349,
0.027801446616649628,
0.07779650390148163,
0.0700313001871109,
-0.03822365030646324,
0.00009332654735771939,
-0.1390479952096939,
0.01902792789041996,
-0.10261113196611404,
-0.033374108374118805,
0.044689975678920746,
-0.03110313042998314,
-0.15875457227230072,
-0.17294827103614807,
-0.10179473459720612,
-0.10792841017246246,
0.0028765276074409485,
-0.06576836854219437,
-0.06053054332733154,
0.1485436111688614,
0.019949061796069145,
0.002784856129437685,
0.045771002769470215,
-0.03510696440935135,
0.01894490048289299,
-0.019039245322346687,
-0.07361019402742386,
0.039497870951890945,
-0.11023876070976257,
-0.07704432308673859,
0.011476912535727024,
0.09245186299085617,
-0.08059854060411453,
0.01916489750146866,
0.025617938488721848,
-0.039525073021650314,
-0.04690936580300331,
-0.02933812327682972,
-0.12551262974739075,
0.002969164866954088,
-0.0695829838514328,
0.1858605593442917,
0.046845950186252594,
0.02010304480791092,
0.07120377570390701,
0.14254286885261536,
0.03592831268906593,
-0.0887761265039444,
-0.1749958097934723,
-0.04992081597447395,
-0.09749949723482132,
0.06794948130846024,
-0.013486761599779129,
0.014663760550320148,
0.06429458409547806,
0.11180627346038818,
0.09118671715259552,
-0.11644843220710754,
-0.00031395256519317627,
0.10453040152788162,
-0.013477522879838943,
-0.005984649993479252,
0.03606085106730461,
-0.05842073634266853,
0.10605364292860031,
-0.08548413217067719,
-0.08727633208036423,
-0.00997395720332861,
-0.015775542706251144,
-0.059789709746837616,
-0.05353323370218277,
-0.009852515533566475,
-0.06109524518251419,
-0.021576089784502983,
0.14902110397815704,
-0.16575632989406586,
0.14197741448879242,
0.20182622969150543,
-0.09079138934612274,
-0.13389228284358978,
-0.0645642802119255,
0.01571052148938179,
0.07266353815793991,
0.07667820155620575,
-0.15625406801700592,
-0.04491109028458595,
0.043958526104688644,
0.050751328468322754,
-0.13803890347480774,
-0.0903058871626854,
0.04976794868707657,
-0.019456565380096436,
0.18815186619758606,
-0.024368805810809135,
0.13694898784160614,
0.09624622762203217,
-0.001062276540324092,
-0.16536906361579895,
-0.0504583902657032,
0.044965386390686035,
-0.031972818076610565,
0.09609414637088776,
-0.12983368337154388,
-0.05500328913331032,
0.0009311572066508234,
0.08067761361598969,
0.017550377175211906,
-0.02027209848165512,
0.31154829263687134,
0.00025291970814578235,
-0.044969115406274796,
0.07786523550748825,
-0.10854926705360413,
0.11440631747245789,
-0.004970505367964506,
-0.06440816819667816,
-0.06262875348329544,
-0.017016438767313957,
0.18579424917697906,
0.09791180491447449,
-0.09566518664360046,
-0.030984215438365936,
-0.06930262595415115,
-0.06505559384822845,
0.15661749243736267,
0.051699958741664886,
-0.03481883183121681,
-0.023179296404123306,
-0.15673600137233734,
-0.09067966043949127,
-0.00737566314637661,
0.03437253087759018,
0.12670142948627472,
0.027102645486593246,
-0.008424287661910057,
0.05798326060175896,
-0.03303004056215286,
-0.017995845526456833,
-0.04865404590964317,
-0.04619068279862404
] |
null | null | transformers |
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
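Since this section is still a placeholder, here is a minimal hedged sketch, assuming the repository exposes a standard FLAN-T5 seq2seq checkpoint. The repo id is taken from this record's metadata; the prompt shown is an assumption, not documented behavior.

```python
# Hypothetical quick-start sketch. Assumes the standard FLAN-T5 seq2seq
# interface; the repo id comes from this record's metadata.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_id = "bhuvanmdev/flan-t5-google-resume-parser"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

# Example prompt; the exact input format this fine-tune expects is undocumented.
inputs = tokenizer("Extract the candidate's skills: ...", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```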
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed]
| {"library_name": "transformers", "tags": []} | null | bhuvanmdev/flan-t5-google-resume-parser | [
"transformers",
"tensorboard",
"safetensors",
"arxiv:1910.09700",
"endpoints_compatible",
"region:us"
] | 2024-02-13T10:18:46+00:00 | [
"1910.09700"
] | [] | TAGS
#transformers #tensorboard #safetensors #arxiv-1910.09700 #endpoints_compatible #region-us
|
# Model Card for Model ID
## Model Details
### Model Description
This is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.
- Developed by:
- Funded by [optional]:
- Shared by [optional]:
- Model type:
- Language(s) (NLP):
- License:
- Finetuned from model [optional]:
### Model Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Downstream Use [optional]
### Out-of-Scope Use
## Bias, Risks, and Limitations
### Recommendations
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
## Training Details
### Training Data
### Training Procedure
#### Preprocessing [optional]
#### Training Hyperparameters
- Training regime:
#### Speeds, Sizes, Times [optional]
## Evaluation
### Testing Data, Factors & Metrics
#### Testing Data
#### Factors
#### Metrics
### Results
#### Summary
## Model Examination [optional]
## Environmental Impact
Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).
- Hardware Type:
- Hours used:
- Cloud Provider:
- Compute Region:
- Carbon Emitted:
## Technical Specifications [optional]
### Model Architecture and Objective
### Compute Infrastructure
#### Hardware
#### Software
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Model Card Authors [optional]
## Model Card Contact
| [
"# Model Card for Model ID",
"## Model Details",
"### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:",
"### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Downstream Use [optional]",
"### Out-of-Scope Use",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.",
"## How to Get Started with the Model\n\nUse the code below to get started with the model.",
"## Training Details",
"### Training Data",
"### Training Procedure",
"#### Preprocessing [optional]",
"#### Training Hyperparameters\n\n- Training regime:",
"#### Speeds, Sizes, Times [optional]",
"## Evaluation",
"### Testing Data, Factors & Metrics",
"#### Testing Data",
"#### Factors",
"#### Metrics",
"### Results",
"#### Summary",
"## Model Examination [optional]",
"## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:",
"## Technical Specifications [optional]",
"### Model Architecture and Objective",
"### Compute Infrastructure",
"#### Hardware",
"#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Model Card Authors [optional]",
"## Model Card Contact"
] | [
"TAGS\n#transformers #tensorboard #safetensors #arxiv-1910.09700 #endpoints_compatible #region-us \n",
"# Model Card for Model ID",
"## Model Details",
"### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:",
"### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Downstream Use [optional]",
"### Out-of-Scope Use",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.",
"## How to Get Started with the Model\n\nUse the code below to get started with the model.",
"## Training Details",
"### Training Data",
"### Training Procedure",
"#### Preprocessing [optional]",
"#### Training Hyperparameters\n\n- Training regime:",
"#### Speeds, Sizes, Times [optional]",
"## Evaluation",
"### Testing Data, Factors & Metrics",
"#### Testing Data",
"#### Factors",
"#### Metrics",
"### Results",
"#### Summary",
"## Model Examination [optional]",
"## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:",
"## Technical Specifications [optional]",
"### Model Architecture and Objective",
"### Compute Infrastructure",
"#### Hardware",
"#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Model Card Authors [optional]",
"## Model Card Contact"
] | [
35,
6,
3,
82,
28,
3,
4,
9,
9,
10,
42,
20,
3,
4,
5,
9,
11,
13,
3,
12,
5,
4,
5,
3,
4,
9,
53,
9,
8,
6,
3,
14,
8,
7,
9,
4
] | [
"passage: TAGS\n#transformers #tensorboard #safetensors #arxiv-1910.09700 #endpoints_compatible #region-us \n# Model Card for Model ID## Model Details### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Downstream Use [optional]### Out-of-Scope Use## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.## How to Get Started with the Model\n\nUse the code below to get started with the model.## Training Details### Training Data### Training Procedure#### Preprocessing [optional]#### Training Hyperparameters\n\n- Training regime:#### Speeds, Sizes, Times [optional]## Evaluation### Testing Data, Factors & Metrics#### Testing Data#### Factors#### Metrics### Results#### Summary## Model Examination [optional]## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:## Technical Specifications [optional]### Model Architecture and Objective### Compute Infrastructure#### Hardware#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Model Card Authors [optional]## Model Card Contact"
] | [
-0.05129265785217285,
0.22092880308628082,
-0.0035179967526346445,
0.02501223422586918,
0.11708144098520279,
-0.0005181307788006961,
0.04833335056900978,
0.126449316740036,
-0.019315702840685844,
0.11368890851736069,
0.02830384485423565,
0.087161585688591,
0.10932693630456924,
0.14976787567138672,
0.027767986059188843,
-0.22400330007076263,
0.013545092195272446,
-0.08547151833772659,
0.007360507268458605,
0.10887812823057175,
0.13407185673713684,
-0.10422329604625702,
0.0799589455127716,
-0.022218724712729454,
-0.01576193980872631,
-0.0024709447752684355,
-0.09191098809242249,
-0.07704412192106247,
0.06658186763525009,
0.07635242491960526,
0.06079627200961113,
0.015532677061855793,
0.09731730818748474,
-0.28775396943092346,
0.015315494500100613,
0.08426344394683838,
-0.003832674352452159,
0.06570416688919067,
0.06679647415876389,
-0.07225077599287033,
0.11294640600681305,
-0.08326761424541473,
0.14265748858451843,
0.07819409668445587,
-0.0751931369304657,
-0.20058366656303406,
-0.0678529441356659,
0.08777504414319992,
0.12172447144985199,
0.06566067785024643,
-0.0240190327167511,
0.15403629839420319,
-0.07573149353265762,
0.009909119457006454,
0.1344568133354187,
-0.09381069988012314,
-0.05284041166305542,
0.04851995036005974,
0.11218059062957764,
0.09264162927865982,
-0.13201914727687836,
0.007263765670359135,
0.03538952395319939,
0.01823991909623146,
0.08935598284006119,
0.021646643057465553,
0.10804788768291473,
0.0471319817006588,
-0.13886919617652893,
-0.04078420251607895,
0.10519677400588989,
0.03467073664069176,
-0.0511174350976944,
-0.21634697914123535,
-0.0042060064151883125,
-0.018993662670254707,
-0.02438262850046158,
-0.057221945375204086,
0.047075625509023666,
-0.03517723083496094,
0.05707012489438057,
-0.03823539987206459,
-0.09978390485048294,
-0.030010027810931206,
0.07668479532003403,
0.05997013673186302,
0.013257342390716076,
-0.018452148884534836,
0.029318852350115776,
0.11608750373125076,
0.04971613734960556,
-0.1120513305068016,
-0.06576516479253769,
-0.06617631763219833,
-0.1015271320939064,
-0.04605502262711525,
0.04043229669332504,
0.021388711407780647,
0.02790878526866436,
0.2005361020565033,
0.0013144337572157383,
0.04578043892979622,
0.027719371020793915,
0.006634909193962812,
0.06140429154038429,
0.09704186022281647,
-0.06184549257159233,
-0.14441433548927307,
-0.052150655537843704,
0.0838291123509407,
-0.005048004910349846,
-0.03949505463242531,
-0.05814170464873314,
0.045859213918447495,
0.050563037395477295,
0.11559558659791946,
0.0925697535276413,
-0.00952589139342308,
-0.04793645069003105,
-0.03162948787212372,
0.22493304312229156,
-0.14139817655086517,
0.04953770712018013,
-0.012962840497493744,
-0.03824609890580177,
-0.04221971333026886,
0.031711336225271225,
0.03209373727440834,
-0.016264338046312332,
0.09782443940639496,
-0.05990086868405342,
-0.0385088287293911,
-0.10335034132003784,
-0.04756024107336998,
0.03825374320149422,
-0.010136007331311703,
-0.014318512752652168,
-0.06374996900558472,
-0.0976734608411789,
-0.0440363809466362,
0.06589976698160172,
-0.06674116849899292,
-0.04861561208963394,
0.014166015200316906,
-0.05206536501646042,
-0.0034754190128296614,
-0.0007777506252750754,
0.11236779391765594,
-0.02925705723464489,
0.03520512953400612,
-0.03593742102384567,
0.06071564927697182,
0.10378561168909073,
0.04010351747274399,
-0.06594906002283096,
0.052586451172828674,
-0.21832634508609772,
0.09042318910360336,
-0.10633471608161926,
0.030845342203974724,
-0.15926258265972137,
-0.041567448526620865,
0.019635990262031555,
0.014180500991642475,
0.008806075900793076,
0.11377737671136856,
-0.19005726277828217,
-0.025171801447868347,
0.1300044059753418,
-0.0945831909775734,
-0.10084225982427597,
0.07147544622421265,
-0.04920754209160805,
0.13694678246974945,
0.04123294726014137,
-0.02705676108598709,
0.06769294291734695,
-0.15995272994041443,
-0.05838307738304138,
-0.016111139208078384,
-0.009447884745895863,
0.13041087985038757,
0.05866660177707672,
-0.059992723166942596,
0.08056262880563736,
0.02419431507587433,
-0.019690977409482002,
-0.043121110647916794,
-0.043974619358778,
-0.10198459774255753,
-0.0013167005963623524,
-0.07957979291677475,
0.04423648864030838,
-0.008130980655550957,
-0.07807544618844986,
-0.029026441276073456,
-0.1834263950586319,
0.052552398294210434,
0.08257346600294113,
0.016371481120586395,
-0.010266339406371117,
-0.07969299703836441,
0.008663319051265717,
-0.026843959465622902,
-0.016039082780480385,
-0.1699541211128235,
-0.04567434638738632,
0.04240812361240387,
-0.1652011275291443,
0.04373474419116974,
-0.03978709131479263,
0.05944959074258804,
0.0320458859205246,
-0.05190446972846985,
-0.0011512517230585217,
-0.014688361436128616,
0.01982065849006176,
-0.036412887275218964,
-0.19573865830898285,
-0.043008726090192795,
-0.02956235036253929,
0.160308837890625,
-0.24183565378189087,
0.03329772129654884,
0.057998064905405045,
0.1498682051897049,
0.0007596806390210986,
-0.04587903618812561,
0.02215476520359516,
-0.05358593538403511,
-0.05101087689399719,
-0.06615893542766571,
-0.0030418226961046457,
-0.028946032747626305,
-0.04868051037192345,
0.029185006394982338,
-0.1843569129705429,
-0.04010150954127312,
0.10059087723493576,
0.09962263703346252,
-0.1534094512462616,
-0.014867037534713745,
-0.04807868227362633,
-0.06743370741605759,
-0.08563356846570969,
-0.05508554354310036,
0.14246419072151184,
0.05223851278424263,
0.04553950950503349,
-0.07807126641273499,
-0.0671616643667221,
0.01849428191781044,
0.0007367139332927763,
-0.040424644947052,
0.07459479570388794,
0.08395206183195114,
-0.09447163343429565,
0.0763159990310669,
0.07563351839780807,
0.0745408833026886,
0.10207726806402206,
0.01656539924442768,
-0.10977036505937576,
-0.025348074734210968,
0.015244545415043831,
0.024517716839909554,
0.14840243756771088,
-0.05424882099032402,
0.03563909977674484,
0.050164610147476196,
-0.04687408357858658,
0.02342480979859829,
-0.10203663259744644,
0.031017832458019257,
0.039212487637996674,
-0.0007507543195970356,
0.04622079059481621,
-0.04102898761630058,
0.00785372406244278,
0.0762152150273323,
0.049299295991659164,
0.04492497444152832,
0.00451761344447732,
-0.01614830642938614,
-0.09929052740335464,
0.16183419525623322,
-0.09384983777999878,
-0.3009108603000641,
-0.15124821662902832,
0.016429319977760315,
0.03624126687645912,
-0.02517835609614849,
0.026567041873931885,
-0.05863599851727486,
-0.10570778697729111,
-0.10092674195766449,
-0.005781888496130705,
0.020709572359919548,
-0.07780015468597412,
-0.06983298063278198,
0.07268352061510086,
0.03807120397686958,
-0.14607851207256317,
0.03948947787284851,
0.04788151755928993,
-0.050529371947050095,
-0.02089620567858219,
0.08623623847961426,
0.11477750539779663,
0.15922853350639343,
-0.016451021656394005,
-0.02623858116567135,
0.018677379935979843,
0.2000662386417389,
-0.13754040002822876,
0.11520840972661972,
0.13333985209465027,
-0.047717101871967316,
0.09115317463874817,
0.17400066554546356,
0.026365477591753006,
-0.07634809613227844,
0.037423089146614075,
0.050920601934194565,
-0.04808442294597626,
-0.25332513451576233,
-0.05663376301527023,
0.009860052727162838,
-0.08193530142307281,
0.093865767121315,
0.09549196064472198,
0.14020170271396637,
0.03696327283978462,
-0.07213930040597916,
-0.042459435760974884,
0.0036775453481823206,
0.11384411156177521,
-0.03568359836935997,
-0.005343656521290541,
0.08481227606534958,
-0.04323641583323479,
-0.0007266122847795486,
0.10209894180297852,
0.025129051879048347,
0.18802158534526825,
0.023904064670205116,
0.1413942575454712,
0.06004384160041809,
0.06251177191734314,
0.000551602803170681,
0.014730905182659626,
0.0419386588037014,
0.014600582420825958,
-0.007120285648852587,
-0.09679079800844193,
0.014006711542606354,
0.13750481605529785,
0.05959208309650421,
0.020287446677684784,
0.011764034628868103,
-0.022532951086759567,
0.05339716002345085,
0.17398898303508759,
-0.014597101137042046,
-0.19965122640132904,
-0.06684007495641708,
0.07284105569124222,
-0.05776670575141907,
-0.12176336348056793,
-0.03483381122350693,
0.030525391921401024,
-0.17502301931381226,
0.03214195370674133,
-0.02044833078980446,
0.10361938923597336,
-0.10804998129606247,
-0.029580531641840935,
0.02396276406943798,
0.08289678394794464,
-0.021593092009425163,
0.09409880638122559,
-0.15318867564201355,
0.12421330064535141,
0.027643848210573196,
0.08287007361650467,
-0.11865443736314774,
0.0884069874882698,
-0.00914052128791809,
0.004794934764504433,
0.17335905134677887,
-0.00887978170067072,
-0.06141883134841919,
-0.06713027507066727,
-0.0922374352812767,
-0.02362675592303276,
0.11753249168395996,
-0.10957267880439758,
0.08096589893102646,
-0.00771739799529314,
-0.049180056899785995,
0.013477327302098274,
-0.11567004770040512,
-0.16323277354240417,
-0.20408131182193756,
0.07107069343328476,
-0.0975819006562233,
-0.0031133198644965887,
-0.10442767292261124,
-0.06875836104154587,
-0.031984321773052216,
0.23626349866390228,
-0.13814060389995575,
-0.079515740275383,
-0.15315008163452148,
-0.05907658115029335,
0.17421935498714447,
-0.038646843284368515,
0.07752027362585068,
-0.009368468075990677,
0.22135040163993835,
-0.0005769561394117773,
-0.006669431459158659,
0.06871405988931656,
-0.08641299605369568,
-0.16795587539672852,
-0.0766071230173111,
0.13526922464370728,
0.11764655262231827,
0.052490342408418655,
-0.005320967175066471,
0.010855535045266151,
-0.025631245225667953,
-0.1058010458946228,
0.001448161550797522,
0.1267717480659485,
0.057947464287281036,
0.022029021754860878,
-0.03157263621687889,
-0.10811471194028854,
-0.06623940169811249,
-0.052098218351602554,
0.04786430671811104,
0.181030735373497,
-0.09767430275678635,
0.17944495379924774,
0.14716151356697083,
-0.06651145964860916,
-0.20653264224529266,
0.04529459774494171,
0.048565998673439026,
-0.011420371010899544,
0.04028045013546944,
-0.18920361995697021,
0.08407842367887497,
0.012931931763887405,
-0.054178349673748016,
0.13587801158428192,
-0.1596759408712387,
-0.1581452637910843,
0.06423576176166534,
0.045448124408721924,
-0.23230639100074768,
-0.1363905370235443,
-0.08674333989620209,
-0.062469057738780975,
-0.1569693386554718,
0.08063073456287384,
-0.00359702599234879,
0.007655099034309387,
0.04580266401171684,
0.029607584699988365,
0.021776875481009483,
-0.05363304913043976,
0.18664778769016266,
-0.0077520408667624,
0.013649445958435535,
-0.07357750833034515,
-0.07677493244409561,
0.09593275189399719,
-0.05751553177833557,
0.10879078507423401,
-0.0018659495981410146,
0.008710069581866264,
-0.09187035262584686,
-0.056738369166851044,
-0.04518502950668335,
0.05294037610292435,
-0.08283393830060959,
-0.11107265949249268,
-0.04874441400170326,
0.08882616460323334,
0.08018611371517181,
-0.03557325154542923,
-0.012883543968200684,
-0.07679562270641327,
0.08622492104768753,
0.19816941022872925,
0.16589786112308502,
0.02310795709490776,
-0.08369841426610947,
0.011619039811193943,
-0.03282969444990158,
0.03419959545135498,
-0.23653645813465118,
0.037609003484249115,
0.05255814641714096,
0.035989757627248764,
0.10626954585313797,
-0.025342373177409172,
-0.1775771826505661,
-0.04355216026306152,
0.060009121894836426,
-0.043581295758485794,
-0.21388691663742065,
-0.013861402869224548,
0.0937437117099762,
-0.19061240553855896,
-0.02898477576673031,
0.026441190391778946,
-0.0349886454641819,
-0.03127404302358627,
0.005819133948534727,
0.060078177601099014,
0.027275938540697098,
0.08815191686153412,
0.06973249465227127,
0.09625289589166641,
-0.0921335518360138,
0.1012224555015564,
0.10827095806598663,
-0.09258020669221878,
0.03437352553009987,
0.06997713446617126,
-0.0464654341340065,
-0.03779982030391693,
0.04445716738700867,
0.05210983380675316,
-0.0035824330989271402,
-0.05700870603322983,
-0.0012550450628623366,
-0.05801399052143097,
0.053437739610672,
0.11578024923801422,
0.024411514401435852,
-0.031481921672821045,
0.06297732889652252,
0.031466688960790634,
-0.11282244324684143,
0.09800126403570175,
0.016209103167057037,
0.0332130491733551,
-0.057540472596883774,
-0.014383697882294655,
0.04758455976843834,
0.01908467337489128,
-0.018431948497891426,
-0.03139050304889679,
-0.03864758461713791,
-0.01871557906270027,
-0.15688249468803406,
-0.00880077388137579,
-0.06904306262731552,
0.008402650244534016,
0.00812726654112339,
-0.043084751814603806,
-0.007121704053133726,
0.027271781116724014,
-0.07522907853126526,
-0.06459254771471024,
-0.004998138640075922,
0.09239296615123749,
-0.15713782608509064,
0.0025221933610737324,
0.0771171823143959,
-0.1061064675450325,
0.06508705765008926,
-0.008114354684948921,
0.0024605898652225733,
0.013164390809834003,
-0.14945632219314575,
0.05327258259057999,
-0.013698753900825977,
0.019031856209039688,
0.04311361908912659,
-0.16456228494644165,
0.004066106863319874,
-0.04824318736791611,
-0.02670198306441307,
-0.010943911038339138,
-0.06265503913164139,
-0.11827512085437775,
0.0809461921453476,
-0.014609033241868019,
-0.06019756942987442,
-0.01396041177213192,
0.05710810795426369,
0.09468782693147659,
-0.03689182177186012,
0.09434254467487335,
-0.001471250900067389,
0.06198366731405258,
-0.17185549437999725,
-0.025822067633271217,
-0.035810794681310654,
0.014255421236157417,
0.023194478824734688,
-0.010762988589704037,
0.04005304351449013,
-0.005854793358594179,
0.23268097639083862,
-0.031446292996406555,
0.15102393925189972,
0.05785702168941498,
0.00336667918600142,
0.0021511006634682417,
0.06947512179613113,
0.05804704874753952,
0.026832543313503265,
0.007880797609686852,
0.029525361955165863,
-0.023783985525369644,
-0.011394080705940723,
-0.16318662464618683,
0.03503057733178139,
0.13904942572116852,
0.0770760327577591,
0.010082700289785862,
0.07332480698823929,
-0.12486710399389267,
-0.11119920760393143,
0.10134211927652359,
-0.03071645461022854,
0.009480697102844715,
-0.07685033231973648,
0.13950638473033905,
0.14809831976890564,
-0.15144896507263184,
0.069137804210186,
-0.05071515217423439,
-0.05081162974238396,
-0.09150747954845428,
-0.11238798499107361,
-0.06143717095255852,
-0.03477819636464119,
0.002448677783831954,
-0.04411304369568825,
0.05612285062670708,
0.050102539360523224,
-0.009517417289316654,
0.0061646318063139915,
0.11181281507015228,
-0.010644787922501564,
-0.0011640603188425303,
0.0354636125266552,
0.040780141949653625,
0.02505035698413849,
-0.0645608976483345,
0.03148636221885681,
0.017563825473189354,
0.036894720047712326,
0.059959281235933304,
0.02824280969798565,
-0.036668822169303894,
0.028542157262563705,
0.007808319292962551,
-0.10596310347318649,
0.023117657750844955,
-0.01790459267795086,
-0.06492031365633011,
0.12684021890163422,
0.03216429799795151,
0.014280314557254314,
-0.04237029701471329,
0.23208919167518616,
-0.06517360359430313,
-0.07358380407094955,
-0.13262179493904114,
0.09425038844347,
-0.020774539560079575,
0.05422547087073326,
0.03925251588225365,
-0.1250264197587967,
0.0033500357531011105,
0.13577201962471008,
0.11836834996938705,
-0.002868342911824584,
0.008071486838161945,
0.039927463978528976,
0.005613021552562714,
-0.05908596143126488,
0.036810338497161865,
0.05790701508522034,
0.13603076338768005,
-0.07816687971353531,
0.06294257938861847,
0.0068999994546175,
-0.07861924916505814,
-0.0430021770298481,
0.12258011102676392,
-0.01771291345357895,
0.033160895109176636,
-0.04474131762981415,
0.10494659841060638,
-0.06386210769414902,
-0.3023519814014435,
0.03618663549423218,
-0.10097576677799225,
-0.1573091298341751,
-0.01591421850025654,
0.05504710227251053,
-0.024499690160155296,
0.024731650948524475,
0.07404778897762299,
-0.05839020013809204,
0.18247833847999573,
0.03708400949835777,
-0.08908724039793015,
-0.05214424058794975,
0.05819600820541382,
-0.07667914777994156,
0.29527547955513,
0.0018856418319046497,
0.029907740652561188,
0.10656348615884781,
-0.02130613848567009,
-0.15849106013774872,
0.017152920365333557,
0.10987870395183563,
-0.09249646216630936,
0.08097189664840698,
0.1969153881072998,
-0.017194712534546852,
0.11304406821727753,
0.061918240040540695,
-0.060775525867938995,
0.0563589483499527,
-0.06079897657036781,
-0.04729166626930237,
-0.09214385598897934,
0.06507396697998047,
-0.05960074067115784,
0.15478351712226868,
0.0983923152089119,
-0.04626932740211487,
-0.011300264857709408,
-0.05205517262220383,
0.037209995090961456,
0.013506266288459301,
0.12517105042934418,
0.005834320094436407,
-0.16502192616462708,
0.03574982285499573,
-0.008094683289527893,
0.1093440055847168,
-0.24298563599586487,
-0.0835915356874466,
0.08754356950521469,
-0.028303591534495354,
-0.04819139838218689,
0.0978732705116272,
0.07554362714290619,
0.044536542147397995,
-0.046234432607889175,
-0.08701527863740921,
-0.015517168678343296,
0.14954860508441925,
-0.14008408784866333,
-0.008103124797344208
] |
null | null | peft | ## Training procedure
The following `bitsandbytes` quantization config was used during training (an equivalent `BitsAndBytesConfig` sketch follows the list):
- quant_method: bitsandbytes
- load_in_8bit: True
- load_in_4bit: False
- llm_int8_threshold: 6.0
- llm_int8_skip_modules: None
- llm_int8_enable_fp32_cpu_offload: False
- llm_int8_has_fp16_weight: False
- bnb_4bit_quant_type: fp4
- bnb_4bit_use_double_quant: False
- bnb_4bit_compute_dtype: float32
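For reference, a minimal sketch of the same configuration expressed as a `transformers` `BitsAndBytesConfig` object (the values mirror the list above; this is illustrative, not the code the training run actually used):

```python
# Equivalent quantization config expressed in code (illustrative sketch).
import torch
from transformers import BitsAndBytesConfig

bnb_config = BitsAndBytesConfig(
    load_in_8bit=True,                      # 8-bit bitsandbytes quantization
    load_in_4bit=False,
    llm_int8_threshold=6.0,                 # outlier threshold for int8 matmul
    llm_int8_skip_modules=None,
    llm_int8_enable_fp32_cpu_offload=False,
    llm_int8_has_fp16_weight=False,
    bnb_4bit_quant_type="fp4",
    bnb_4bit_use_double_quant=False,
    bnb_4bit_compute_dtype=torch.float32,
)
```

Since `load_in_4bit` is False, the `bnb_4bit_*` fields are inert here; they are kept only to match the serialized config.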
### Framework versions
- PEFT 0.5.0
| {"library_name": "peft"} | null | maridze/Saiga_2_7b_fine_tune_custom_data | [
"peft",
"region:us"
] | 2024-02-13T10:24:36+00:00 | [] | [] | TAGS
#peft #region-us
| ## Training procedure
The following 'bitsandbytes' quantization config was used during training:
- quant_method: bitsandbytes
- load_in_8bit: True
- load_in_4bit: False
- llm_int8_threshold: 6.0
- llm_int8_skip_modules: None
- llm_int8_enable_fp32_cpu_offload: False
- llm_int8_has_fp16_weight: False
- bnb_4bit_quant_type: fp4
- bnb_4bit_use_double_quant: False
- bnb_4bit_compute_dtype: float32
### Framework versions
- PEFT 0.5.0
| [
"## Training procedure\n\n\nThe following 'bitsandbytes' quantization config was used during training:\n- quant_method: bitsandbytes\n- load_in_8bit: True\n- load_in_4bit: False\n- llm_int8_threshold: 6.0\n- llm_int8_skip_modules: None\n- llm_int8_enable_fp32_cpu_offload: False\n- llm_int8_has_fp16_weight: False\n- bnb_4bit_quant_type: fp4\n- bnb_4bit_use_double_quant: False\n- bnb_4bit_compute_dtype: float32",
"### Framework versions\n\n\n- PEFT 0.5.0"
] | [
"TAGS\n#peft #region-us \n",
"## Training procedure\n\n\nThe following 'bitsandbytes' quantization config was used during training:\n- quant_method: bitsandbytes\n- load_in_8bit: True\n- load_in_4bit: False\n- llm_int8_threshold: 6.0\n- llm_int8_skip_modules: None\n- llm_int8_enable_fp32_cpu_offload: False\n- llm_int8_has_fp16_weight: False\n- bnb_4bit_quant_type: fp4\n- bnb_4bit_use_double_quant: False\n- bnb_4bit_compute_dtype: float32",
"### Framework versions\n\n\n- PEFT 0.5.0"
] | [
9,
164,
11
] | [
"passage: TAGS\n#peft #region-us \n## Training procedure\n\n\nThe following 'bitsandbytes' quantization config was used during training:\n- quant_method: bitsandbytes\n- load_in_8bit: True\n- load_in_4bit: False\n- llm_int8_threshold: 6.0\n- llm_int8_skip_modules: None\n- llm_int8_enable_fp32_cpu_offload: False\n- llm_int8_has_fp16_weight: False\n- bnb_4bit_quant_type: fp4\n- bnb_4bit_use_double_quant: False\n- bnb_4bit_compute_dtype: float32### Framework versions\n\n\n- PEFT 0.5.0"
] | [
-0.053269870579242706,
0.037288859486579895,
-0.0023498369846493006,
0.12860141694545746,
0.09711828827857971,
0.06938523054122925,
0.10579416155815125,
0.1292499452829361,
0.022261790931224823,
0.08017075061798096,
0.11625925451517105,
0.04396219924092293,
0.06695731729269028,
0.15088412165641785,
-0.028099212795495987,
-0.008259058929979801,
0.050381533801555634,
-0.0005584736936725676,
-0.011545145884156227,
0.08704539388418198,
0.05457798019051552,
-0.051225606352090836,
0.032125186175107956,
-0.09107348322868347,
-0.15622684359550476,
-0.003146089380607009,
0.004685089457780123,
0.026495451107621193,
0.041620079427957535,
0.036286361515522,
0.0672917291522026,
0.000948688539210707,
-0.02340070717036724,
-0.20800472795963287,
-0.011052807793021202,
0.10847669839859009,
-0.023697681725025177,
0.07120364159345627,
-0.10105514526367188,
0.14101602137088776,
-0.07431574165821075,
-0.044521018862724304,
0.015505745075643063,
0.016141142696142197,
-0.08908672630786896,
-0.10921372473239899,
-0.056611500680446625,
0.05434385687112808,
0.025896143168210983,
0.06898002326488495,
-0.011024434119462967,
0.18677890300750732,
-0.1572560966014862,
0.08946336060762405,
0.07651227712631226,
-0.2272987961769104,
-0.02457762509584427,
0.12256019562482834,
-0.011468162760138512,
0.16642406582832336,
-0.08616470545530319,
-0.10246296972036362,
0.06721126288175583,
0.04392894357442856,
-0.03561519458889961,
-0.008210073225200176,
-0.0949617549777031,
0.018221145495772362,
-0.13337399065494537,
-0.05047217383980751,
0.1545511782169342,
0.029353223741054535,
-0.04286954179406166,
-0.05023425444960594,
-0.09016099572181702,
-0.35934361815452576,
0.029608795419335365,
-0.009606908075511456,
-0.07419688999652863,
0.047605302184820175,
-0.009263900108635426,
-0.015186766162514687,
-0.01163696963340044,
-0.08817271143198013,
-0.0290176123380661,
0.06573924422264099,
0.04293062910437584,
0.032810889184474945,
0.012150761671364307,
0.11014214158058167,
-0.11336442083120346,
-0.02742960676550865,
-0.04243305325508118,
-0.028446335345506668,
-0.04927803948521614,
-0.015285715460777283,
-0.07611539959907532,
0.1796576827764511,
0.07765424996614456,
0.12315205484628677,
-0.14354397356510162,
0.12791253626346588,
-0.02828214503824711,
0.05214376002550125,
-0.028637759387493134,
0.03207641839981079,
-0.11796392500400543,
0.11347455531358719,
0.007657987996935844,
0.16286496818065643,
0.012001721188426018,
-0.045438047498464584,
-0.0658487007021904,
-0.0003362045099493116,
0.14264227449893951,
0.0027607788797467947,
-0.10811484605073929,
0.010071263648569584,
-0.1485380381345749,
-0.032002370804548264,
0.08129970729351044,
-0.07263465970754623,
0.016057845205068588,
0.03499146178364754,
-0.04885988309979439,
-0.01572159305214882,
0.10545022040605545,
-0.04347445070743561,
-0.02752680517733097,
-0.02662649005651474,
-0.1016698032617569,
-0.017475098371505737,
-0.09856010973453522,
-0.13301338255405426,
0.04543880745768547,
-0.17170141637325287,
0.0035178987309336662,
-0.03677147254347801,
-0.058993980288505554,
0.022542359307408333,
0.014296506531536579,
-0.08111370354890823,
0.06173989176750183,
-0.09122895449399948,
-0.15531693398952484,
-0.024016814306378365,
0.014668106101453304,
0.02006439119577408,
-0.018803609535098076,
0.10746068507432938,
0.03717300668358803,
0.10858677327632904,
-0.1785307079553604,
-0.0017424002289772034,
0.005638845264911652,
0.06792867928743362,
0.030236108228564262,
0.13407836854457855,
-0.10484272241592407,
-0.04035882279276848,
-0.06063907966017723,
-0.06223446503281593,
-0.10421726107597351,
-0.018008070066571236,
0.1334538757801056,
0.08467729389667511,
-0.15831358730793,
-0.007896519266068935,
0.07586853206157684,
-0.016155976802110672,
-0.06550277024507523,
0.15078632533550262,
-0.06013118848204613,
0.10789518058300018,
-0.03523194417357445,
0.07353883981704712,
0.22796933352947235,
-0.11163830012083054,
-0.007653160020709038,
0.11777401715517044,
0.06732518970966339,
0.009476522915065289,
0.011584899388253689,
0.07497644424438477,
-0.11210387200117111,
0.02998311258852482,
0.06886386126279831,
0.03176359832286835,
-0.06186481565237045,
-0.06746450811624527,
-0.03173987567424774,
-0.056847721338272095,
0.11095795780420303,
0.02684447541832924,
0.009942829608917236,
-0.0729876384139061,
-0.08242283761501312,
0.14047347009181976,
0.12489724904298782,
-0.02373495139181614,
-0.00369115243665874,
-0.13260477781295776,
-0.011203601025044918,
-0.03366612270474434,
0.021201135590672493,
-0.12538520991802216,
0.0323343388736248,
0.08256782591342926,
0.014403634704649448,
0.010397580452263355,
0.0396578386425972,
0.058033477514982224,
0.028194895014166832,
-0.06090497598052025,
0.011176415719091892,
-0.053236544132232666,
0.004079440608620644,
-0.09993678331375122,
-0.08476646989583969,
0.0006691172602586448,
-0.010852687060832977,
0.21550384163856506,
-0.1337486356496811,
0.03394760191440582,
0.11621101200580597,
-0.004439641255885363,
-0.007688772398978472,
-0.04028208553791046,
-0.07547637820243835,
0.10565713047981262,
-0.01266601774841547,
-0.036212850362062454,
0.033809415996074677,
0.028592392802238464,
-0.06285451352596283,
-0.16731229424476624,
-0.08161847293376923,
0.03963222727179527,
0.13266299664974213,
0.08500313758850098,
-0.07733339816331863,
-0.04591209068894386,
-0.01541791670024395,
-0.03867197409272194,
0.06297565251588821,
-0.06336305290460587,
0.031645916402339935,
0.005129612050950527,
0.05986512452363968,
-0.09973008185625076,
-0.03743978962302208,
0.06189531460404396,
-0.01586044207215309,
-0.041004765778779984,
0.11039824783802032,
0.02631252445280552,
-0.11783633381128311,
0.06864100694656372,
0.06029544770717621,
-0.1404678374528885,
0.09327255189418793,
-0.012676672078669071,
-0.019596412777900696,
-0.09960536658763885,
0.16997361183166504,
0.02375946193933487,
0.11756766587495804,
-0.1501924842596054,
0.10538405925035477,
-0.009771161712706089,
0.006204601377248764,
0.0612400583922863,
-0.1997312605381012,
-0.005127645563334227,
-0.039008110761642456,
-0.08297083526849747,
-0.07024680823087692,
-0.02411830425262451,
0.005311124958097935,
0.03330911695957184,
0.002952837385237217,
0.06394848227500916,
0.14376528561115265,
-0.0185391865670681,
-0.07937957346439362,
0.18065880239009857,
-0.22822222113609314,
-0.22458399832248688,
-0.22842389345169067,
0.006693730596452951,
-0.09362730383872986,
-0.036704301834106445,
-0.054009731858968735,
-0.08067703247070312,
0.03441975265741348,
-0.08248597383499146,
-0.055353812873363495,
-0.011982304975390434,
0.008380476385354996,
0.05353120341897011,
0.016293516382575035,
0.16450470685958862,
-0.07970929890871048,
0.0257254745811224,
0.053753700107336044,
-0.024769404903054237,
0.12344299256801605,
-0.08420433104038239,
-0.03462434560060501,
0.11387899518013,
-0.01188295055180788,
0.0145328463986516,
0.01183662936091423,
0.33029767870903015,
0.004910196643322706,
0.03649308532476425,
0.08162890374660492,
-0.003756461199373007,
0.05702434852719307,
0.08918663114309311,
0.01854359731078148,
-0.10683408379554749,
0.07154038548469543,
0.052385956048965454,
-0.0819675549864769,
-0.13089759647846222,
-0.03209419921040535,
-0.06363310664892197,
0.01682649366557598,
0.07607807219028473,
0.06257891654968262,
0.08387400209903717,
0.07003678381443024,
0.031037570908665657,
0.10862348228693008,
0.006719090044498444,
-0.009780955500900745,
0.1154506579041481,
-0.02919038012623787,
0.0664224848151207,
-0.01642061397433281,
0.026679545640945435,
0.061177391558885574,
0.1297265589237213,
0.069287970662117,
-0.0736280009150505,
0.012541249394416809,
0.05162445083260536,
0.29216650128364563,
-0.011502427980303764,
0.08846558630466461,
-0.07443831115961075,
-0.019035225734114647,
-0.013720297254621983,
-0.03154807910323143,
-0.07571276277303696,
0.04215119034051895,
0.003495335578918457,
0.06473421305418015,
-0.0022922337520867586,
-0.017213091254234314,
0.07760420441627502,
0.09821566194295883,
0.17678722739219666,
-0.2779153883457184,
-0.11255498975515366,
-0.006441204342991114,
0.10050622373819351,
-0.10428133606910706,
0.01565384306013584,
0.22264739871025085,
0.006862876936793327,
-0.1009795218706131,
-0.03333192318677902,
0.031150078400969505,
-0.015747448429465294,
0.01241636648774147,
0.12056416273117065,
0.10708002001047134,
-0.002608862705528736,
0.07637561857700348,
-0.31736811995506287,
0.02226284332573414,
0.05985979735851288,
0.03599311038851738,
-0.04527868330478668,
0.007991808466613293,
-0.061200015246868134,
-0.06832600384950638,
0.03509039431810379,
0.0019549522548913956,
0.1889735907316208,
-0.2897758185863495,
-0.07803528755903244,
-0.008666069246828556,
0.12686322629451752,
0.06344277411699295,
0.04962414875626564,
0.01623174548149109,
0.054691381752491,
0.0789775401353836,
0.06340458244085312,
-0.03233237564563751,
-0.11194033175706863,
0.0005942063289694488,
0.16109155118465424,
-0.12496649473905563,
-0.06108592450618744,
-0.05368443951010704,
-0.007636595517396927,
0.04602250084280968,
-0.16420245170593262,
-0.05097196623682976,
-0.05793680250644684,
0.03439047187566757,
0.1503523737192154,
-0.025070440024137497,
-0.002292958088219166,
-0.013629160821437836,
0.00947998184710741,
-0.03680781275033951,
-0.07645224779844284,
0.1100383996963501,
-0.04053280130028725,
-0.14371339976787567,
-0.043084755539894104,
0.13991355895996094,
0.08521818369626999,
0.00122605892829597,
-0.08492032438516617,
-0.04409416764974594,
0.026184678077697754,
-0.14106614887714386,
0.012389937415719032,
0.08117508143186569,
-0.0536799319088459,
0.08704525977373123,
-0.1025477796792984,
0.21894492208957672,
-0.05376456305384636,
0.08074451237916946,
0.07477474957704544,
0.3051668405532837,
-0.07810559123754501,
0.025269396603107452,
0.07390918582677841,
-0.017363475635647774,
-0.25344404578208923,
0.03934885561466217,
0.06799458712339401,
0.050301443785429,
-0.03162827342748642,
-0.1787947118282318,
0.029355987906455994,
0.07851903140544891,
0.01152612455189228,
0.16447895765304565,
-0.3246810734272003,
-0.06552402675151825,
0.03930334374308586,
0.05016697570681572,
0.12071209400892258,
-0.05151672661304474,
0.011802415363490582,
0.0005798686761409044,
-0.012056782841682434,
0.16171705722808838,
-0.08953577280044556,
0.11604998260736465,
-0.009891933761537075,
0.023780513554811478,
0.007661169860512018,
-0.03744346275925636,
0.1529698222875595,
0.0016671200282871723,
0.08786030858755112,
0.0227588452398777,
-0.0784774124622345,
0.06254196912050247,
-0.06566163897514343,
0.023280994966626167,
-0.058587972074747086,
0.08849100023508072,
-0.053309399634599686,
0.008895163424313068,
-0.06468260288238525,
-0.0285837110131979,
-0.06997548788785934,
-0.05294686183333397,
-0.1114194467663765,
0.09259607642889023,
-0.014859088696539402,
-0.028233857825398445,
-0.03854804486036301,
0.05306123569607735,
0.038334574550390244,
0.4490160346031189,
-0.05683157220482826,
-0.03747360035777092,
0.08445943146944046,
0.09416796267032623,
-0.022764256224036217,
0.09828681498765945,
-0.12651817500591278,
0.04326749965548515,
0.12407610565423965,
0.0013500433415174484,
0.14305323362350464,
0.08107176423072815,
-0.10908607393503189,
0.0026568786706775427,
0.03864200413227081,
-0.13661205768585205,
-0.0637107715010643,
-0.021698210388422012,
-0.008182504214346409,
-0.11601445823907852,
-0.003855476388707757,
0.10335319489240646,
-0.026012953370809555,
0.04859347641468048,
0.02959892898797989,
0.04628570005297661,
-0.14050062000751495,
0.15718260407447815,
0.03785894438624382,
0.0773269459605217,
-0.08905962854623795,
0.08538462221622467,
0.0341004803776741,
0.004354927688837051,
0.054473016411066055,
-0.02465607225894928,
-0.10066041350364685,
0.018329044803977013,
-0.03814751282334328,
-0.10103803873062134,
0.11562931537628174,
-0.031049763783812523,
-0.038677770644426346,
-0.0993829295039177,
0.014433843083679676,
0.0846017450094223,
0.052262432873249054,
0.1067095547914505,
-0.03113221377134323,
0.02033975161612034,
-0.1376478374004364,
0.07439704984426498,
-0.0334647037088871,
0.015372445806860924,
-0.13893410563468933,
0.07189736515283585,
-0.021610558032989502,
0.05621515214443207,
-0.018327541649341583,
-0.014693169854581356,
-0.22836874425411224,
0.02324662171304226,
-0.03602374345064163,
0.004337872378528118,
0.045130908489227295,
0.02817535400390625,
0.02978314459323883,
0.05020326375961304,
-0.023141341283917427,
0.02906402014195919,
-0.03197551518678665,
-0.05226500332355499,
0.0499696210026741,
-0.004835754632949829,
-0.03992021083831787,
-0.05186335742473602,
0.06049661710858345,
-0.10903627425432205,
0.03873228281736374,
0.03313494473695755,
-0.054615914821624756,
0.06981058418750763,
0.06330772489309311,
0.0268961600959301,
0.09631124138832092,
0.056871797889471054,
0.046097688376903534,
-0.06351964920759201,
0.03455657139420509,
-0.022250670939683914,
-0.00624211085960269,
0.05636441707611084,
0.14740675687789917,
-0.047108784317970276,
-0.0636642798781395,
-0.14202140271663666,
-0.019476987421512604,
-0.04566153883934021,
0.04513302817940712,
0.16159655153751373,
0.09404542297124863,
0.09236114472150803,
-0.0821361318230629,
-0.023633049800992012,
-0.14481386542320251,
-0.07746478915214539,
0.05283274129033089,
-0.052044827491045,
-0.047382134944200516,
-0.04354611411690712,
0.0739387795329094,
-0.009924774058163166,
0.1355780065059662,
-0.09322313219308853,
-0.10284098237752914,
-0.053579315543174744,
-0.19133111834526062,
-0.12044229358434677,
0.00256889290176332,
0.26070472598075867,
0.03901056945323944,
-0.04509548842906952,
-0.08076508343219757,
0.002155857626348734,
0.07334074378013611,
0.13909012079238892,
0.029680786654353142,
0.08756469190120697,
-0.12226739525794983,
0.10201890766620636,
0.04023836925625801,
-0.055521633476018906,
0.1132090762257576,
0.31879276037216187,
-0.07730785757303238,
0.013387022539973259,
-0.10073232650756836,
0.11021994799375534,
0.0007453791913576424,
-0.1430017352104187,
0.006316468119621277,
-0.037437427788972855,
-0.16519887745380402,
-0.10536279529333115,
0.02628246694803238,
-0.0706881508231163,
-0.18453171849250793,
-0.023839564993977547,
-0.11543169617652893,
-0.06360355019569397,
0.1126338317990303,
0.0399429053068161,
-0.028909916058182716,
0.201675683259964,
-0.08599916845560074,
0.041288454085588455,
-0.0003141422930639237,
-0.010095082223415375,
-0.016095805913209915,
-0.030886787921190262,
-0.09983845800161362,
0.14582909643650055,
0.019685756415128708,
0.10730867087841034,
0.0036301896907389164,
0.08048225194215775,
0.04194061830639839,
-0.026748450472950935,
-0.04846813157200813,
-0.011329391971230507,
0.012692849151790142,
-0.053367115557193756,
0.11627253144979477,
0.05312275141477585,
-0.07700059562921524,
-0.08015355467796326,
-0.010329783894121647,
-0.08480866253376007,
-0.03079259768128395,
-0.15980394184589386,
0.2568057179450989,
-0.035085529088974,
0.11443251371383667,
0.0006201770738698542,
-0.06570316851139069,
-0.0947524830698967,
0.15000610053539276,
0.11755913496017456,
-0.14171954989433289,
-0.011081993579864502,
0.09272223711013794,
-0.007588645908981562,
-0.08871670067310333,
0.15103355050086975,
0.0864647775888443,
-0.019251089543104172,
0.0231208186596632,
-0.023887814953923225,
-0.02682357467710972,
-0.011532667092978954,
0.013189748860895634,
-0.033909741789102554,
0.02560831978917122,
0.03932470083236694,
-0.13912217319011688,
-0.029104189947247505,
-0.06448303163051605,
-0.08723004907369614,
0.18666662275791168,
-0.13714911043643951,
-0.081991046667099,
-0.03343013674020767,
-0.08826499432325363,
-0.10603920370340347,
0.020193934440612793,
-0.10627500712871552,
0.07077591866254807,
0.07121050357818604,
-0.05374689772725105,
-0.0019511525752022862,
-0.04885771498084068,
0.011899150907993317,
0.02466990239918232,
0.058414071798324585,
-0.011807786300778389,
0.07866984605789185,
0.11816007643938065,
-0.02016681805253029,
-0.048029348254203796,
0.10774647444486618,
0.01690971665084362,
-0.04457734525203705,
-0.1452929526567459,
0.03396405652165413,
-0.021062646061182022,
0.13437509536743164,
0.03301657736301422,
-0.06416842341423035,
-0.014076373539865017,
-0.20952264964580536,
-0.012867928482592106,
-0.14440461993217468,
-0.07743453234434128,
-0.07547476142644882,
0.10911253839731216,
0.19156797230243683,
-0.058017831295728683,
0.022697798907756805,
-0.03117358312010765,
0.03372464329004288,
-0.04286767169833183,
0.06252838671207428,
-0.003493012860417366,
-0.15049496293067932,
0.05715633183717728,
-0.04946384206414223,
0.01261360477656126,
-0.29460665583610535,
-0.007316927425563335,
0.012749905698001385,
-0.02980758063495159,
-0.0437527671456337,
0.14593428373336792,
0.007143276743590832,
0.06895528733730316,
-0.05879765376448631,
-0.251142680644989,
-0.06769166886806488,
0.1345207542181015,
0.0035605370067059994,
-0.06681157648563385
] |
null | null | transformers |
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed to the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
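Until the authors fill this in, the snippet below is a minimal sketch based only on this card's tags (`llama`, `text-generation`, `4-bit`): it assumes the repository loads as a standard causal LM through 🤗 Transformers, with `bitsandbytes` available for the 4-bit weights.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "mahiatlinux/tinyllama_custom"  # repo id taken from this card's metadata

tokenizer = AutoTokenizer.from_pretrained(model_id)
# device_map="auto" lets accelerate place the (assumed bitsandbytes 4-bit) weights.
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

inputs = tokenizer("Hello, my name is", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```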
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed]
| {"library_name": "transformers", "tags": ["unsloth"]} | text-generation | mahiatlinux/tinyllama_custom | [
"transformers",
"safetensors",
"llama",
"text-generation",
"unsloth",
"arxiv:1910.09700",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"4-bit",
"region:us"
] | 2024-02-13T10:27:49+00:00 | [
"1910.09700"
] | [] | TAGS
#transformers #safetensors #llama #text-generation #unsloth #arxiv-1910.09700 #autotrain_compatible #endpoints_compatible #text-generation-inference #4-bit #region-us
|
# Model Card for Model ID
## Model Details
### Model Description
This is the model card of a transformers model that has been pushed to the Hub. This model card has been automatically generated.
- Developed by:
- Funded by [optional]:
- Shared by [optional]:
- Model type:
- Language(s) (NLP):
- License:
- Finetuned from model [optional]:
### Model Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Downstream Use [optional]
### Out-of-Scope Use
## Bias, Risks, and Limitations
### Recommendations
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
## Training Details
### Training Data
### Training Procedure
#### Preprocessing [optional]
#### Training Hyperparameters
- Training regime:
#### Speeds, Sizes, Times [optional]
## Evaluation
### Testing Data, Factors & Metrics
#### Testing Data
#### Factors
#### Metrics
### Results
#### Summary
## Model Examination [optional]
## Environmental Impact
Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).
- Hardware Type:
- Hours used:
- Cloud Provider:
- Compute Region:
- Carbon Emitted:
## Technical Specifications [optional]
### Model Architecture and Objective
### Compute Infrastructure
#### Hardware
#### Software
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Model Card Authors [optional]
## Model Card Contact
| [
"# Model Card for Model ID",
"## Model Details",
"### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:",
"### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Downstream Use [optional]",
"### Out-of-Scope Use",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.",
"## How to Get Started with the Model\n\nUse the code below to get started with the model.",
"## Training Details",
"### Training Data",
"### Training Procedure",
"#### Preprocessing [optional]",
"#### Training Hyperparameters\n\n- Training regime:",
"#### Speeds, Sizes, Times [optional]",
"## Evaluation",
"### Testing Data, Factors & Metrics",
"#### Testing Data",
"#### Factors",
"#### Metrics",
"### Results",
"#### Summary",
"## Model Examination [optional]",
"## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:",
"## Technical Specifications [optional]",
"### Model Architecture and Objective",
"### Compute Infrastructure",
"#### Hardware",
"#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Model Card Authors [optional]",
"## Model Card Contact"
] | [
"TAGS\n#transformers #safetensors #llama #text-generation #unsloth #arxiv-1910.09700 #autotrain_compatible #endpoints_compatible #text-generation-inference #4-bit #region-us \n",
"# Model Card for Model ID",
"## Model Details",
"### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:",
"### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Downstream Use [optional]",
"### Out-of-Scope Use",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.",
"## How to Get Started with the Model\n\nUse the code below to get started with the model.",
"## Training Details",
"### Training Data",
"### Training Procedure",
"#### Preprocessing [optional]",
"#### Training Hyperparameters\n\n- Training regime:",
"#### Speeds, Sizes, Times [optional]",
"## Evaluation",
"### Testing Data, Factors & Metrics",
"#### Testing Data",
"#### Factors",
"#### Metrics",
"### Results",
"#### Summary",
"## Model Examination [optional]",
"## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:",
"## Technical Specifications [optional]",
"### Model Architecture and Objective",
"### Compute Infrastructure",
"#### Hardware",
"#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Model Card Authors [optional]",
"## Model Card Contact"
] | [
63,
6,
3,
82,
28,
3,
4,
9,
9,
10,
42,
20,
3,
4,
5,
9,
11,
13,
3,
12,
5,
4,
5,
3,
4,
9,
53,
9,
8,
6,
3,
14,
8,
7,
9,
4
] | [
"passage: TAGS\n#transformers #safetensors #llama #text-generation #unsloth #arxiv-1910.09700 #autotrain_compatible #endpoints_compatible #text-generation-inference #4-bit #region-us \n# Model Card for Model ID## Model Details### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Downstream Use [optional]### Out-of-Scope Use## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.## How to Get Started with the Model\n\nUse the code below to get started with the model.## Training Details### Training Data### Training Procedure#### Preprocessing [optional]#### Training Hyperparameters\n\n- Training regime:#### Speeds, Sizes, Times [optional]## Evaluation### Testing Data, Factors & Metrics#### Testing Data#### Factors#### Metrics### Results#### Summary## Model Examination [optional]## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:## Technical Specifications [optional]### Model Architecture and Objective### Compute Infrastructure#### Hardware#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Model Card Authors [optional]## Model Card Contact"
] | [
-0.04292343929409981,
0.18826867640018463,
-0.005377055611461401,
0.01779043860733509,
0.10202369838953018,
0.006763845216482878,
0.05548536777496338,
0.11715483665466309,
-0.05078177899122238,
0.12557542324066162,
0.0411979965865612,
0.11146166175603867,
0.11804298311471939,
0.14329783618450165,
0.00002410791057627648,
-0.21835729479789734,
0.049665696918964386,
-0.10991441458463669,
-0.0106809725984931,
0.12213768064975739,
0.14416629076004028,
-0.10152151435613632,
0.06854484230279922,
-0.03602901101112366,
-0.017661886289715767,
-0.03734491392970085,
-0.06368496268987656,
-0.041672609746456146,
0.03853093832731247,
0.05424622446298599,
0.0640019103884697,
0.0008005748968571424,
0.0846111923456192,
-0.27982035279273987,
0.018390439450740814,
0.06937906891107559,
-0.0023142043501138687,
0.06504538655281067,
0.06703895330429077,
-0.0643777847290039,
0.10560540109872818,
-0.05572549253702164,
0.1365208476781845,
0.08376910537481308,
-0.09230275452136993,
-0.18153217434883118,
-0.0921248346567154,
0.1093878373503685,
0.17717865109443665,
0.05059289559721947,
-0.02799699455499649,
0.10320482403039932,
-0.08263788372278214,
0.016451768577098846,
0.05110875144600868,
-0.08650156855583191,
-0.05617035552859306,
0.06494492292404175,
0.09247919917106628,
0.052732132375240326,
-0.1271737962961197,
-0.03365660458803177,
0.004899790044873953,
0.016557635739445686,
0.07190205156803131,
0.021786099299788475,
0.1472795307636261,
0.034876905381679535,
-0.1331535279750824,
-0.051329996436834335,
0.1048947125673294,
0.04056032747030258,
-0.040847018361091614,
-0.2426268309354782,
-0.027707915753126144,
-0.025109421461820602,
-0.03402986750006676,
-0.0437135249376297,
0.04242107644677162,
-0.0035838026087731123,
0.08458299934864044,
-0.007490561809390783,
-0.07259652763605118,
-0.03244706988334656,
0.0688263401389122,
0.064784474670887,
0.030094482004642487,
-0.020735738798975945,
0.02483791671693325,
0.10849814862012863,
0.0949249416589737,
-0.11696284264326096,
-0.058448728173971176,
-0.06557316333055496,
-0.07224222272634506,
-0.04181387275457382,
0.03500311076641083,
0.015896132215857506,
0.071471206843853,
0.2602509558200836,
0.01899585872888565,
0.05600179731845856,
0.029530007392168045,
0.0073724533431231976,
0.05294303596019745,
0.10623796284198761,
-0.06396427750587463,
-0.11310389637947083,
-0.02034088596701622,
0.09129863977432251,
0.02021746337413788,
-0.036910805851221085,
-0.04569694399833679,
0.06657931953668594,
0.04505877196788788,
0.11183920502662659,
0.1006806343793869,
0.019416874274611473,
-0.07564955204725266,
-0.06204071268439293,
0.19905975461006165,
-0.15675219893455505,
0.039149727672338486,
0.043250180780887604,
-0.03578011691570282,
-0.02036311849951744,
0.009607705287635326,
0.026054807007312775,
-0.03432466462254524,
0.08814160525798798,
-0.05473000556230545,
-0.04614151641726494,
-0.11089799553155899,
-0.030556926503777504,
0.0442013218998909,
0.010448796674609184,
-0.03395242616534233,
-0.03711342439055443,
-0.07666761428117752,
-0.08680769801139832,
0.08644384890794754,
-0.07048791646957397,
-0.05712197721004486,
-0.027583591639995575,
-0.08218405395746231,
0.023172836750745773,
0.019756721332669258,
0.0729556605219841,
-0.025309210643172264,
0.05618005245923996,
-0.053604111075401306,
0.05488147959113121,
0.10356975346803665,
0.03610638156533241,
-0.06198744475841522,
0.05823414400219917,
-0.22975605726242065,
0.08514560014009476,
-0.06982189416885376,
0.06414492428302765,
-0.15622174739837646,
-0.02446330152451992,
0.03592666983604431,
0.004316574428230524,
-0.004745627753436565,
0.1365443617105484,
-0.20896528661251068,
-0.02342131733894348,
0.1669033318758011,
-0.09443739056587219,
-0.0715414509177208,
0.050615932792425156,
-0.04599856585264206,
0.10239199548959732,
0.03290434554219246,
0.002775689819827676,
0.0626225471496582,
-0.10854848474264145,
-0.013959519565105438,
-0.055099017918109894,
-0.025059405714273453,
0.13819867372512817,
0.07562530040740967,
-0.07902952283620834,
0.06477715075016022,
0.022813700139522552,
-0.020409759134054184,
-0.06547383219003677,
-0.01951533555984497,
-0.10126182436943054,
0.01546517014503479,
-0.06879144161939621,
0.011947265826165676,
-0.016198530793190002,
-0.09417803585529327,
-0.02866245061159134,
-0.16998343169689178,
-0.029649244621396065,
0.08252308517694473,
-0.004467182792723179,
-0.014274250715970993,
-0.10941161960363388,
0.02704133465886116,
0.03371654078364372,
0.0044359853491187096,
-0.13195890188217163,
-0.04033959284424782,
0.034737132489681244,
-0.15329891443252563,
0.03500048816204071,
-0.07348592579364777,
0.051377225667238235,
0.015687594190239906,
-0.029155118390917778,
-0.026816358789801598,
0.022950554266572,
0.00845371000468731,
-0.015416091307997704,
-0.23671619594097137,
-0.02562868967652321,
-0.02933376282453537,
0.1596956104040146,
-0.2073163241147995,
0.03471647948026657,
0.07962728291749954,
0.15802165865898132,
0.002939125057309866,
-0.05216621235013008,
0.019714290276169777,
-0.07012677937746048,
-0.024032827466726303,
-0.057665277272462845,
0.0032831889111548662,
-0.018774664029479027,
-0.04345292970538139,
0.025579115375876427,
-0.17645572125911713,
-0.046705566346645355,
0.09828836470842361,
0.04776141792535782,
-0.126988023519516,
-0.020474793389439583,
-0.03759893402457237,
-0.052560579031705856,
-0.04305299371480942,
-0.06037580594420433,
0.096319779753685,
0.06287256628274918,
0.03674183785915375,
-0.05966869741678238,
-0.07946930080652237,
-0.004839655943214893,
-0.0153745636343956,
-0.02353493496775627,
0.09512404352426529,
0.07751291990280151,
-0.13043329119682312,
0.09360626339912415,
0.08417011797428131,
0.07712284475564957,
0.08935420215129852,
-0.02144179306924343,
-0.075404092669487,
-0.0370500348508358,
0.03699750825762749,
0.019515717402100563,
0.12300322204828262,
-0.03906695544719696,
0.04399838298559189,
0.04019251465797424,
-0.027942534536123276,
0.01802278310060501,
-0.07876281440258026,
0.03273193538188934,
0.020718198269605637,
-0.014756618067622185,
0.050279535353183746,
-0.036855414509773254,
0.018754903227090836,
0.08659088611602783,
0.05881096050143242,
0.044072698801755905,
0.016800886020064354,
-0.052010320127010345,
-0.1115691065788269,
0.15924349427223206,
-0.12276260554790497,
-0.21748638153076172,
-0.1328210085630417,
0.014497416093945503,
0.026223482564091682,
-0.01506797969341278,
0.00576651468873024,
-0.05971094220876694,
-0.10903192311525345,
-0.09009502828121185,
0.005795407574623823,
0.057016320526599884,
-0.08362492173910141,
-0.06093323975801468,
0.04702426865696907,
0.043084245175123215,
-0.14079512655735016,
0.02152630127966404,
0.04325500503182411,
-0.09207198023796082,
-0.01115967147052288,
0.08168168365955353,
0.07780283689498901,
0.18435634672641754,
0.020768070593476295,
-0.021172961220145226,
0.032526228576898575,
0.22063808143138885,
-0.13656745851039886,
0.11198467016220093,
0.13181617856025696,
-0.08575399219989777,
0.08299451321363449,
0.20879633724689484,
0.04222285374999046,
-0.09664907306432724,
0.0304343793541193,
0.030948897823691368,
-0.022790847346186638,
-0.23554196953773499,
-0.06883510947227478,
0.0009722800459712744,
-0.0659257248044014,
0.07941686362028122,
0.09556294232606888,
0.07574138045310974,
0.01872626319527626,
-0.09533156454563141,
-0.09108595550060272,
0.05632733553647995,
0.10894450545310974,
0.014029460027813911,
-0.006910799536854029,
0.08843157440423965,
-0.03546027094125748,
0.01583779789507389,
0.08756837993860245,
0.0024584103375673294,
0.1598428189754486,
0.04778169468045235,
0.17317496240139008,
0.08422500640153885,
0.07279623299837112,
0.000603687425609678,
0.008512238040566444,
0.014100469648838043,
0.041183922439813614,
-0.005525778979063034,
-0.08351922780275345,
-0.02603893168270588,
0.11030695587396622,
0.06851852685213089,
0.01754682883620262,
0.01248775515705347,
-0.04759526997804642,
0.0871804729104042,
0.17926573753356934,
0.004300123080611229,
-0.18129703402519226,
-0.05778562277555466,
0.074430450797081,
-0.09818711876869202,
-0.10275593400001526,
-0.00784407276660204,
0.013536486774682999,
-0.16616149246692657,
0.03770674765110016,
-0.020995361730456352,
0.10762384533882141,
-0.13448180258274078,
-0.016813986003398895,
0.07832744717597961,
0.0712333619594574,
-0.0029534129425883293,
0.05958399921655655,
-0.18045948445796967,
0.09713280200958252,
0.011677161790430546,
0.07155559211969376,
-0.09598501026630402,
0.09018103778362274,
-0.007205406203866005,
-0.029768822714686394,
0.1431937962770462,
-0.003142349421977997,
-0.07371359318494797,
-0.06319711357355118,
-0.09517261385917664,
-0.010899929329752922,
0.1268581748008728,
-0.13050828874111176,
0.09160564839839935,
-0.03312861919403076,
-0.03596508502960205,
-0.011182941496372223,
-0.0880916640162468,
-0.1122843474149704,
-0.17711912095546722,
0.06018149107694626,
-0.12945356965065002,
0.03882477059960365,
-0.10549931228160858,
-0.026717960834503174,
-0.026683486998081207,
0.178618922829628,
-0.23788060247898102,
-0.07397715002298355,
-0.14370791614055634,
-0.09402398020029068,
0.1324852705001831,
-0.04830838367342949,
0.08988931030035019,
-0.015055065974593163,
0.15817634761333466,
0.021827908232808113,
-0.020078346133232117,
0.08633492141962051,
-0.0848253071308136,
-0.19780774414539337,
-0.06955822557210922,
0.16248609125614166,
0.12084493786096573,
0.03335541859269142,
-0.0013899571495130658,
0.037777941673994064,
-0.02108919247984886,
-0.11882339417934418,
0.022981705144047737,
0.15345308184623718,
0.06721732765436172,
0.011430226266384125,
-0.024014320224523544,
-0.10809025913476944,
-0.07674530893564224,
-0.029322028160095215,
0.02987777628004551,
0.1724686175584793,
-0.07139913737773895,
0.17051774263381958,
0.14364555478096008,
-0.0588962584733963,
-0.20953300595283508,
-0.0034934862051159143,
0.025866392999887466,
-0.00958812702447176,
0.01262340322136879,
-0.19036969542503357,
0.08587159216403961,
-0.002174045192077756,
-0.05474715679883957,
0.1073274239897728,
-0.1594780534505844,
-0.13815680146217346,
0.08379834145307541,
0.04926612973213196,
-0.18574541807174683,
-0.13679726421833038,
-0.09653197973966599,
-0.04133835434913635,
-0.15640324354171753,
0.09362965822219849,
0.022156093269586563,
0.012721545062959194,
0.02991352044045925,
0.015599608421325684,
0.024388102814555168,
-0.048591915518045425,
0.17602954804897308,
-0.016234640032052994,
0.02279195562005043,
-0.09626507014036179,
-0.07968656718730927,
0.018025947734713554,
-0.051882676780223846,
0.07170172780752182,
-0.017266500741243362,
0.012935908511281013,
-0.10264335572719574,
-0.03540561720728874,
-0.04268501326441765,
0.01661405898630619,
-0.099188432097435,
-0.085202656686306,
-0.04659072309732437,
0.09497132152318954,
0.09788262099027634,
-0.02253568172454834,
-0.026998797431588173,
-0.0772315189242363,
0.05453493818640709,
0.20757755637168884,
0.18566472828388214,
0.04546947404742241,
-0.06112571433186531,
-0.003745796624571085,
-0.016034787520766258,
0.04216223955154419,
-0.19591552019119263,
0.058634500950574875,
0.056738704442977905,
0.022449525073170662,
0.10273087024688721,
-0.019446061924099922,
-0.1581314206123352,
-0.0764765590429306,
0.06901811063289642,
-0.06423882395029068,
-0.20066112279891968,
0.008975905366241932,
0.05530434846878052,
-0.1775708645582199,
-0.03917059674859047,
0.04495568946003914,
-0.002727912273257971,
-0.03902778401970863,
0.023089852184057236,
0.09468687325716019,
0.003924074117094278,
0.0782470852136612,
0.07192396372556686,
0.0822129026055336,
-0.09933636337518692,
0.08582625538110733,
0.09758522361516953,
-0.0720822811126709,
0.028736449778079987,
0.10113639384508133,
-0.05638568103313446,
-0.03886125236749649,
0.034486688673496246,
0.08152203261852264,
0.025647565722465515,
-0.044451456516981125,
0.010355938225984573,
-0.09268626570701599,
0.06836600601673126,
0.09989523887634277,
0.029670365154743195,
0.017934802919626236,
0.04530128091573715,
0.046947747468948364,
-0.07622124254703522,
0.12428463250398636,
0.034861911088228226,
0.015623697079718113,
-0.04374397173523903,
-0.045552048832178116,
0.009736912325024605,
-0.03028755635023117,
-0.004958131350576878,
-0.02133471518754959,
-0.08800674229860306,
-0.015880294144153595,
-0.13308630883693695,
-0.009298216551542282,
-0.06151175871491432,
0.013635284267365932,
0.027435721829533577,
-0.032538071274757385,
0.005575480870902538,
0.005044261459261179,
-0.07019834965467453,
-0.0691831111907959,
-0.012711147777736187,
0.095164455473423,
-0.1670372188091278,
0.02610124461352825,
0.08352641016244888,
-0.11222398281097412,
0.09912886470556259,
0.012217588722705841,
-0.005713800899684429,
0.022394387051463127,
-0.14695370197296143,
0.03546147048473358,
-0.03938230127096176,
0.008286556228995323,
0.022746257483959198,
-0.19988538324832916,
-0.000013934964044892695,
-0.03536801412701607,
-0.06939463317394257,
-0.008364387787878513,
-0.027196144685149193,
-0.11335262656211853,
0.1067931056022644,
0.0012342236004769802,
-0.08100111037492752,
-0.03039664961397648,
0.031982820481061935,
0.07636885344982147,
-0.028471706435084343,
0.15208598971366882,
-0.013617309741675854,
0.06672896444797516,
-0.1587895303964615,
-0.011663785204291344,
-0.010776839219033718,
0.01560376025736332,
-0.03581136837601662,
-0.007234253454953432,
0.05085217207670212,
-0.013196735642850399,
0.17413955926895142,
-0.03515728563070297,
0.016792291775345802,
0.06547330319881439,
0.045002423226833344,
-0.034976135939359665,
0.0982886403799057,
0.051650237292051315,
0.017417224124073982,
0.008780776523053646,
0.011892673559486866,
-0.042148590087890625,
-0.03601905331015587,
-0.19182322919368744,
0.07043200731277466,
0.18725112080574036,
0.09618266671895981,
-0.020147977396845818,
0.07248927652835846,
-0.1016421988606453,
-0.09610612690448761,
0.15260624885559082,
-0.03709828853607178,
-0.005899024195969105,
-0.07303978502750397,
0.1281641572713852,
0.14302216470241547,
-0.18107613921165466,
0.06588808447122574,
-0.07227024435997009,
-0.04245559871196747,
-0.11048944294452667,
-0.19562652707099915,
-0.06181143969297409,
-0.04950173199176788,
-0.017301756888628006,
-0.046650297939777374,
0.06790224462747574,
0.06302686780691147,
-0.0009760049870237708,
-0.00815952941775322,
0.07094833999872208,
-0.03396129235625267,
-0.0016717935213819146,
0.029484858736395836,
0.059591714292764664,
0.0060126567259430885,
-0.036931898444890976,
0.016109824180603027,
-0.01176763791590929,
0.05502614006400108,
0.07581013441085815,
0.04883652552962303,
-0.027538318186998367,
0.020281067118048668,
-0.0400039367377758,
-0.10745270550251007,
0.04913952574133873,
-0.027562124654650688,
-0.07319166511297226,
0.15083648264408112,
0.020662441849708557,
0.005448428448289633,
-0.012079499661922455,
0.24070440232753754,
-0.064145527780056,
-0.10332037508487701,
-0.14430701732635498,
0.07491715997457504,
-0.041465189307928085,
0.049052558839321136,
0.038524940609931946,
-0.11372357606887817,
0.026006104424595833,
0.14732374250888824,
0.15270641446113586,
-0.0402783565223217,
0.022211361676454544,
0.03488355875015259,
0.00868822168558836,
-0.024850280955433846,
0.03838307037949562,
0.06393134593963623,
0.14742381870746613,
-0.0477873831987381,
0.07836286723613739,
0.0009453904931433499,
-0.08714000135660172,
-0.03805996850132942,
0.11301322281360626,
-0.010510249063372612,
0.015199603512883186,
-0.0608166866004467,
0.11857125908136368,
-0.07339458167552948,
-0.21728423237800598,
0.039847232401371,
-0.06880895793437958,
-0.1338454782962799,
-0.025233695283532143,
0.07665672153234482,
-0.010742755606770515,
0.021982967853546143,
0.07774899899959564,
-0.07041120529174805,
0.1911073625087738,
0.0385311022400856,
-0.058266088366508484,
-0.04942469671368599,
0.07217175513505936,
-0.07537322491407394,
0.29618796706199646,
0.015248388051986694,
0.039003998041152954,
0.11050893366336823,
-0.016553720459342003,
-0.14173917472362518,
0.022491389885544777,
0.0963440090417862,
-0.09687109291553497,
0.053855426609516144,
0.18176445364952087,
0.001875719171948731,
0.12723487615585327,
0.07698098570108414,
-0.08834564685821533,
0.045245129615068436,
-0.07325270026922226,
-0.07016976177692413,
-0.0993812158703804,
0.10319076478481293,
-0.08812170475721359,
0.14497129619121552,
0.12399062514305115,
-0.055769409984350204,
0.010903195478022099,
-0.034286584705114365,
0.04751160740852356,
-0.0033840627875179052,
0.12132634967565536,
0.012232412584125996,
-0.18882830440998077,
0.02651413343846798,
-0.026405373588204384,
0.10246428102254868,
-0.1690295934677124,
-0.08598951250314713,
0.04612896963953972,
0.009267891757190228,
-0.07218426465988159,
0.1265496462583542,
0.05560595914721489,
0.029229512438178062,
-0.049114905297756195,
-0.02625749073922634,
-0.01127582136541605,
0.14144857227802277,
-0.10424615442752838,
-0.004776977468281984
] |
null | null | null |
# As with most of the new merges, the quantized version is not working properly.
# OGNO-7B
OGNO-7B is a merge of the following models using [LazyMergekit](https://colab.research.google.com/drive/1obulZ1ROXHjYLn6PPZJwRR6GzgQogxxb?usp=sharing):
* [liminerity/Omningotex-7b-slerp](https://huggingface.co/liminerity/Omningotex-7b-slerp)
* [eren23/dpo-binarized-NeutrixOmnibe-7B](https://huggingface.co/eren23/dpo-binarized-NeutrixOmnibe-7B)
## 🧩 Configuration
```yaml
slices:
- sources:
- model: liminerity/Omningotex-7b-slerp
layer_range: [0, 32]
- model: eren23/dpo-binarized-NeutrixOmnibe-7B
layer_range: [0, 32]
merge_method: slerp
base_model: liminerity/Omningotex-7b-slerp
parameters:
t:
- filter: self_attn
value: [0, 0.5, 0.3, 0.7, 1]
- filter: mlp
value: [1, 0.5, 0.7, 0.3, 0]
- value: 0.5
dtype: bfloat16
```
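In this configuration, `t` controls the spherical interpolation between the two parents for each layer group: roughly, `t = 0` keeps the base model's weights and `t = 1` takes the other model's, with the self-attention and MLP blocks ramped differently across layers. As a rough illustration of what slerp does to a pair of weight tensors, here is a minimal NumPy sketch; it is a simplified stand-in for mergekit's internal implementation, not the exact code it runs:

```python
import numpy as np

def slerp(t: float, v0: np.ndarray, v1: np.ndarray, eps: float = 1e-8) -> np.ndarray:
    """Spherically interpolate between two weight tensors (flattened to 1-D)."""
    v0n = v0 / (np.linalg.norm(v0) + eps)   # unit direction of tensor 0
    v1n = v1 / (np.linalg.norm(v1) + eps)   # unit direction of tensor 1
    dot = np.clip(np.dot(v0n, v1n), -1.0, 1.0)
    theta = np.arccos(dot)                  # angle between the two directions
    if theta < eps:                         # near-parallel: fall back to plain lerp
        return (1.0 - t) * v0 + t * v1
    s0 = np.sin((1.0 - t) * theta) / np.sin(theta)
    s1 = np.sin(t * theta) / np.sin(theta)
    return s0 * v0 + s1 * v1

# t = 0 returns the first tensor, t = 1 the second, t = 0.5 a spherical midpoint.
a, b = np.random.randn(8), np.random.randn(8)
print(np.allclose(slerp(0.0, a, b), a), np.allclose(slerp(1.0, a, b), b))
```

To reproduce the merge itself, the YAML above can be saved as `config.yaml` and passed to mergekit's `mergekit-yaml` CLI, e.g. `mergekit-yaml config.yaml ./ogno-7b` (the output path here is illustrative).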
## 💻 Usage
```python
# Install dependencies first: pip install -qU transformers accelerate
from transformers import AutoTokenizer
import transformers
import torch

model = "paulml/OGNO-7B"
messages = [{"role": "user", "content": "What is a large language model?"}]

# Format the chat messages with the model's own chat template.
tokenizer = AutoTokenizer.from_pretrained(model)
prompt = tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)

# Run generation through a standard text-generation pipeline.
pipeline = transformers.pipeline(
    "text-generation",
    model=model,
    torch_dtype=torch.float16,
    device_map="auto",
)
outputs = pipeline(prompt, max_new_tokens=256, do_sample=True, temperature=0.7, top_k=50, top_p=0.95)
print(outputs[0]["generated_text"])
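
# --- Optional: trying the GGUF quantization from this repo ---
# A minimal sketch, assuming llama-cpp-python and a locally downloaded GGUF file;
# the filename below is illustrative, and per the note at the top of this card
# the quantized version is not working properly, so treat this as experimental:
# from llama_cpp import Llama
# llm = Llama(model_path="ogno-7b.Q4_K_M.gguf", n_ctx=2048)
# print(llm(prompt, max_tokens=128)["choices"][0]["text"])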
``` | {"license": "cc-by-nc-4.0", "tags": ["merge", "mergekit", "lazymergekit", "liminerity/Omningotex-7b-slerp", "eren23/dpo-binarized-NeutrixOmnibe-7B"], "base_model": ["liminerity/Omningotex-7b-slerp", "eren23/dpo-binarized-NeutrixOmnibe-7B"]} | null | paulml/OGNO-7B-GGUF | [
"gguf",
"merge",
"mergekit",
"lazymergekit",
"liminerity/Omningotex-7b-slerp",
"eren23/dpo-binarized-NeutrixOmnibe-7B",
"base_model:liminerity/Omningotex-7b-slerp",
"base_model:eren23/dpo-binarized-NeutrixOmnibe-7B",
"license:cc-by-nc-4.0",
"region:us"
] | 2024-02-13T10:28:49+00:00 | [] | [] | TAGS
#gguf #merge #mergekit #lazymergekit #liminerity/Omningotex-7b-slerp #eren23/dpo-binarized-NeutrixOmnibe-7B #base_model-liminerity/Omningotex-7b-slerp #base_model-eren23/dpo-binarized-NeutrixOmnibe-7B #license-cc-by-nc-4.0 #region-us
|
# As with most of the new merges, the quantized version is not working properly.
# OGNO-7B
OGNO-7B is a merge of the following models using LazyMergekit:
* liminerity/Omningotex-7b-slerp
* eren23/dpo-binarized-NeutrixOmnibe-7B
## Configuration
## Usage
| [
"# As most of the new merges, the quantized version is not working properly.",
"# OGNO-7B\n\nOGNO-7B is a merge of the following models using LazyMergekit:\n* liminerity/Omningotex-7b-slerp\n* eren23/dpo-binarized-NeutrixOmnibe-7B",
"## Configuration",
"## Usage"
] | [
"TAGS\n#gguf #merge #mergekit #lazymergekit #liminerity/Omningotex-7b-slerp #eren23/dpo-binarized-NeutrixOmnibe-7B #base_model-liminerity/Omningotex-7b-slerp #base_model-eren23/dpo-binarized-NeutrixOmnibe-7B #license-cc-by-nc-4.0 #region-us \n",
"# As most of the new merges, the quantized version is not working properly.",
"# OGNO-7B\n\nOGNO-7B is a merge of the following models using LazyMergekit:\n* liminerity/Omningotex-7b-slerp\n* eren23/dpo-binarized-NeutrixOmnibe-7B",
"## Configuration",
"## Usage"
] | [
107,
18,
57,
4,
3
] | [
"passage: TAGS\n#gguf #merge #mergekit #lazymergekit #liminerity/Omningotex-7b-slerp #eren23/dpo-binarized-NeutrixOmnibe-7B #base_model-liminerity/Omningotex-7b-slerp #base_model-eren23/dpo-binarized-NeutrixOmnibe-7B #license-cc-by-nc-4.0 #region-us \n# As most of the new merges, the quantized version is not working properly.# OGNO-7B\n\nOGNO-7B is a merge of the following models using LazyMergekit:\n* liminerity/Omningotex-7b-slerp\n* eren23/dpo-binarized-NeutrixOmnibe-7B## Configuration## Usage"
] | [
-0.023654676973819733,
0.0624711699783802,
-0.0037444126792252064,
0.025254445150494576,
0.021330909803509712,
0.1030736193060875,
0.09319525212049484,
0.10387890785932541,
0.04393823444843292,
0.04541046544909477,
0.05899583548307419,
0.1022668108344078,
0.03017556108534336,
0.13708773255348206,
-0.06636053323745728,
-0.25989264249801636,
0.052645523101091385,
0.07099343091249466,
-0.06787096709012985,
0.02298475243151188,
0.10404086112976074,
0.013337429612874985,
0.07035206258296967,
0.06313024461269379,
-0.09566985070705414,
0.022597085684537888,
-0.027275288477540016,
0.014278617687523365,
0.041031550616025925,
0.11335870623588562,
0.02359980158507824,
0.017256643623113632,
-0.026742130517959595,
-0.11116176843643188,
0.026610571891069412,
-0.021295664831995964,
-0.03372429311275482,
0.06466218084096909,
0.016338802874088287,
0.01341144647449255,
0.15423795580863953,
-0.08785047382116318,
0.028509093448519707,
0.029015565291047096,
-0.0911230593919754,
-0.064776711165905,
-0.10900778323411942,
0.09057414531707764,
-0.0158652625977993,
0.0138300945982337,
-0.019899310544133186,
0.19859063625335693,
-0.008390816859900951,
0.04970170557498932,
0.09772360324859619,
-0.2872905135154724,
-0.03759098798036575,
0.24180695414543152,
0.06068003177642822,
-0.024000488221645355,
0.031023094430565834,
0.022530196234583855,
0.021336087957024574,
0.01333801168948412,
-0.05992076173424721,
-0.07954026013612747,
0.1913861781358719,
-0.049082204699516296,
-0.15600575506687164,
0.03428157791495323,
0.10522223263978958,
0.10164298862218857,
-0.0059653036296367645,
-0.07872321456670761,
-0.10737134516239166,
-0.011223459616303444,
-0.050201840698719025,
-0.07982426881790161,
0.03153073415160179,
-0.028936615213751793,
0.06707121431827545,
-0.12600760161876678,
0.010994309559464455,
0.02058412879705429,
-0.12482008337974548,
0.23334190249443054,
0.01446016225963831,
0.028414005413651466,
0.003907219506800175,
0.0465703085064888,
-0.36878466606140137,
-0.10268973559141159,
-0.030156543478369713,
-0.0708276778459549,
-0.007332189474254847,
-0.03174298629164696,
-0.09898553788661957,
-0.06037693843245506,
0.11168911308050156,
0.31805068254470825,
-0.011327610351145267,
0.0879996195435524,
0.018722115084528923,
0.047673601657152176,
0.048247020691633224,
0.04512866958975792,
-0.10708195716142654,
-0.19314293563365936,
0.09571272134780884,
0.03136376664042473,
0.08992009609937668,
-0.012146192602813244,
-0.1108369454741478,
-0.06300195306539536,
-0.052724868059158325,
-0.0623152069747448,
0.0182021651417017,
0.08059582859277725,
-0.058942705392837524,
-0.10830073058605194,
0.2117357701063156,
-0.03148707002401352,
0.03552557900547981,
-0.088297538459301,
-0.057841118425130844,
0.01592549867928028,
0.04330100119113922,
0.07814430445432663,
0.027274200692772865,
0.0445646271109581,
-0.08243674039840698,
-0.0646662786602974,
-0.033297184854745865,
-0.03348343074321747,
-0.0007858296157792211,
-0.043123725801706314,
0.04357859864830971,
-0.0796353816986084,
-0.099017433822155,
-0.008380335755646229,
0.04211512953042984,
-0.08662430942058563,
-0.06452454626560211,
-0.041290536522865295,
0.0370110422372818,
-0.01209110114723444,
0.020584795624017715,
0.011162969283759594,
0.013961427845060825,
-0.05147344991564751,
0.07178277522325516,
0.002394637791439891,
-0.2845069468021393,
-0.016134513542056084,
-0.0744299367070198,
0.11321061849594116,
-0.20233717560768127,
0.05470709130167961,
-0.07305572926998138,
-0.05388428270816803,
-0.06008460745215416,
-0.017540719360113144,
-0.09518782794475555,
-0.028358692303299904,
0.09253926575183868,
0.07131288945674896,
-0.010365809313952923,
-0.08133316785097122,
-0.00792032852768898,
-0.07562209665775299,
-0.02355463057756424,
0.09990034997463226,
-0.0012288594152778387,
-0.0037161626387387514,
-0.006944739725440741,
0.24372227489948273,
0.10060829669237137,
0.012114203535020351,
-0.04097592830657959,
0.00828929990530014,
-0.01874423958361149,
0.044329553842544556,
0.07977581024169922,
-0.006473879795521498,
-0.08007015287876129,
0.043855197727680206,
0.015404353849589825,
0.05979916825890541,
-0.03497310355305672,
-0.04468323662877083,
-0.06051511690020561,
-0.038997992873191833,
0.05072348192334175,
-0.025583254173398018,
0.03936557099223137,
-0.07421855628490448,
-0.05081695690751076,
0.06358811259269714,
0.15208831429481506,
-0.01216384582221508,
0.006595312152057886,
-0.09192104637622833,
0.11451781541109085,
-0.10754342377185822,
0.021385155618190765,
-0.07783854007720947,
-0.11005443334579468,
0.03059050254523754,
-0.07156100869178772,
0.07126255333423615,
0.012498721480369568,
0.07644645124673843,
0.01515574473887682,
-0.11579296737909317,
0.018007244914770126,
0.029978249222040176,
0.050359196960926056,
-0.01797863468527794,
-0.1079811379313469,
-0.06607399135828018,
-0.026455629616975784,
0.21587438881397247,
-0.07727832347154617,
0.02948853187263012,
-0.046204548329114914,
0.23549284040927887,
-0.04247000440955162,
-0.026210065931081772,
-0.009938156232237816,
0.02272414229810238,
-0.005752981640398502,
-0.015488036908209324,
0.045536186546087265,
0.022858398035168648,
-0.16837923228740692,
0.0016058512264862657,
-0.049008600413799286,
0.03688693046569824,
0.10503214597702026,
0.09508731216192245,
-0.013101594522595406,
-0.04730873554944992,
0.001239595701918006,
-0.06595192849636078,
0.12279057502746582,
-0.008577635511755943,
0.0766206681728363,
-0.02876371145248413,
0.0744275227189064,
-0.041836097836494446,
-0.02235575020313263,
0.04604632407426834,
-0.041919317096471786,
-0.10179927200078964,
0.055000774562358856,
0.06758619844913483,
-0.33684226870536804,
0.08081788569688797,
0.05845828726887703,
-0.028007304295897484,
0.10282941907644272,
0.030504578724503517,
0.006555738393217325,
-0.12314809858798981,
-0.02583843097090721,
0.013317308388650417,
0.04354218766093254,
-0.14643627405166626,
0.09342192858457565,
0.07654447108507156,
0.023975003510713577,
0.08670501410961151,
-0.021297892555594444,
0.02316068485379219,
-0.006793923210352659,
0.02974053844809532,
0.09109697490930557,
0.1143132820725441,
-0.039909739047288895,
0.05157197639346123,
0.05609622970223427,
-0.0578838549554348,
0.07513346523046494,
0.02095067873597145,
-0.018745722249150276,
0.0742078572511673,
-0.1396656185388565,
-0.11711942404508591,
-0.14758199453353882,
-0.08500969409942627,
-0.18428264558315277,
-0.019601643085479736,
-0.001674534403719008,
0.06888885796070099,
-0.019933395087718964,
-0.04899706691503525,
0.019026881083846092,
-0.0666850283741951,
-0.02805336005985737,
0.025628238916397095,
-0.052955616265535355,
0.029646743088960648,
-0.11307550221681595,
-0.04512045532464981,
-0.038352444767951965,
0.009109631180763245,
0.04420185461640358,
-0.10273397713899612,
0.06657510250806808,
0.09731077402830124,
-0.009438279084861279,
0.004198694135993719,
0.00656008580699563,
0.24613933265209198,
-0.07037892192602158,
0.02303813025355339,
0.11875931918621063,
-0.018569957464933395,
0.058631617575883865,
0.11321365833282471,
0.025927245616912842,
-0.04591962322592735,
-0.023517094552516937,
0.0305612925440073,
-0.040463533252477646,
-0.11637762933969498,
-0.11035500466823578,
-0.09781409800052643,
-0.008880604058504105,
-0.0011407575802877545,
0.058093540370464325,
0.11500433832406998,
0.07118597626686096,
-0.04718388617038727,
0.03095179982483387,
-0.014457393437623978,
0.07240580767393112,
0.2851247489452362,
0.019257115200161934,
0.06690007448196411,
0.021553201600909233,
-0.06311490386724472,
0.08208725601434708,
0.06466250866651535,
0.02075892873108387,
0.08042456209659576,
0.11819657683372498,
0.005488421767950058,
0.061820246279239655,
0.06613334268331528,
-0.02530207671225071,
-0.03989985212683678,
-0.02717951498925686,
-0.04730711877346039,
-0.06242798641324043,
0.012968193739652634,
0.030064811930060387,
0.06870710104703903,
0.05134580656886101,
0.00606901990249753,
-0.04597415775060654,
0.06961283087730408,
0.035925429314374924,
0.07705017924308777,
-0.22762106359004974,
-0.03427213802933693,
0.013151257298886776,
0.017179587855935097,
-0.07893345504999161,
-0.06157634034752846,
-0.014464120380580425,
-0.06554074585437775,
0.17618578672409058,
-0.015721764415502548,
0.057181503623723984,
-0.011210604570806026,
0.0051795728504657745,
0.012889781035482883,
0.14577189087867737,
-0.011032234877347946,
0.050942275673151016,
-0.1457393914461136,
0.03907277435064316,
0.03415914997458458,
-0.0186176560819149,
0.000986642437055707,
0.024771951138973236,
0.027694176882505417,
0.09778548032045364,
0.04383474588394165,
0.02685796096920967,
0.12697172164916992,
0.02935831807553768,
-0.10225985944271088,
0.01490266527980566,
0.05500096455216408,
-0.02702436037361622,
0.04640437290072441,
-0.00956645142287016,
-0.033272262662649155,
0.046729739755392075,
0.10682208091020584,
-0.1491691768169403,
-0.13375231623649597,
0.12329280376434326,
0.06514771282672882,
-0.03959355875849724,
-0.04557149484753609,
-0.029623117297887802,
-0.12978845834732056,
0.35825133323669434,
0.09164625406265259,
-0.02154255099594593,
-0.09859207272529602,
0.0659906417131424,
0.19026236236095428,
-0.04349261894822121,
0.066294364631176,
-0.04007500782608986,
0.01992720365524292,
-0.05465678125619888,
-0.16634713113307953,
0.07680968195199966,
-0.03692544251680374,
-0.08612355589866638,
-0.020893067121505737,
0.13072387874126434,
-0.032849717885255814,
0.027808891609311104,
-0.024746326729655266,
0.048075202852487564,
-0.0006817966932430863,
-0.08380329608917236,
0.04443484544754028,
0.064933642745018,
0.04011385887861252,
0.10284915566444397,
0.012884189374744892,
0.023821398615837097,
0.0029252644162625074,
-0.0002714846341405064,
0.11278592795133591,
0.32268333435058594,
-0.0012587185483425856,
-0.014275352470576763,
0.056254468858242035,
-0.03367204964160919,
-0.06609735637903214,
-0.036087390035390854,
0.07437752187252045,
0.04933210462331772,
0.012611880898475647,
-0.07425065338611603,
0.013809301890432835,
0.21847976744174957,
0.004274011589586735,
0.12761761248111725,
-0.2660089135169983,
-0.09574180096387863,
0.058694202452898026,
-0.00419121328741312,
0.23419497907161713,
-0.06332536786794662,
-0.07076936215162277,
-0.027359219267964363,
-0.23755063116550446,
0.12404588609933853,
0.021015208214521408,
0.1228807270526886,
-0.04682648926973343,
0.003163372864946723,
0.03006277047097683,
-0.015780534595251083,
0.1916205734014511,
-0.03812141716480255,
0.00349310296587646,
-0.05795454978942871,
-0.1386955976486206,
0.06619690358638763,
-0.042924460023641586,
0.027603378519415855,
0.01657087914645672,
0.011000508442521095,
0.0025823593605309725,
-0.056641485542058945,
-0.04670979827642441,
0.09286695718765259,
-0.012845630757510662,
-0.08182765543460846,
-0.07632620632648468,
0.09081634879112244,
-0.04074199125170708,
0.024376176297664642,
0.21547415852546692,
-0.024573279544711113,
-0.019748754799365997,
0.1399528831243515,
0.012786678969860077,
-0.11458572000265121,
0.013829601928591728,
-0.0002395811752649024,
-0.06272951513528824,
0.024834616109728813,
0.07465605437755585,
-0.020898563787341118,
0.10484782606363297,
-0.006141605321317911,
0.08025945723056793,
0.04304755851626396,
-0.09103814512491226,
-0.0273574348539114,
0.08208868652582169,
-0.12586481869220734,
-0.10982222110033035,
-0.056227389723062515,
-0.13448044657707214,
0.0014820998767390847,
0.06190815567970276,
0.18977570533752441,
-0.004699079319834709,
-0.03837544098496437,
0.048741042613983154,
-0.009586605243384838,
-0.14955496788024902,
0.07327527552843094,
0.030233435332775116,
-0.018671222031116486,
-0.07835443317890167,
0.03747915104031563,
0.0700845718383789,
0.028677642345428467,
-0.025755757465958595,
0.0647079348564148,
-0.0611349456012249,
-0.04668928310275078,
-0.15497562289237976,
0.15636833012104034,
-0.07625631988048553,
-0.020535530522465706,
-0.09237685799598694,
-0.037750668823719025,
-0.004962008912116289,
0.033100675791502,
0.0894075557589531,
0.022202489897608757,
-0.0195834469050169,
-0.0072843595407903194,
-0.07710728049278259,
-0.002377081196755171,
-0.011722306720912457,
0.12553095817565918,
-0.10842110961675644,
-0.06792101263999939,
-0.044441770762205124,
0.041552137583494186,
-0.049176186323165894,
-0.013029773719608784,
-0.18039195239543915,
-0.07565757632255554,
-0.09409093856811523,
-0.08168629556894302,
-0.1126260980963707,
0.0031597476918250322,
0.016260260716080666,
-0.014504346065223217,
-0.03841797262430191,
0.00712596857920289,
-0.01637992635369301,
-0.07549916207790375,
0.007623107172548771,
0.012112261727452278,
0.020825432613492012,
-0.010288084857165813,
0.00513034500181675,
-0.031581442803144455,
0.0537417009472847,
0.039183683693408966,
0.05330118536949158,
-0.04211553558707237,
-0.05163748189806938,
-0.05534864962100983,
0.04619546979665756,
0.03190559893846512,
0.046774737536907196,
-0.10613948106765747,
-0.0011845147237181664,
0.01765076443552971,
-0.034528698772192,
-0.022585928440093994,
0.11265644431114197,
-0.11752259731292725,
-0.0720568597316742,
-0.09375157207250595,
-0.06561838090419769,
-0.055010586977005005,
-0.035673338919878006,
0.013532399199903011,
0.09990788251161575,
0.07655216753482819,
0.008806303143501282,
-0.007171328645199537,
-0.14974196255207062,
-0.020978717133402824,
-0.002653467236086726,
-0.1077791377902031,
0.11295212805271149,
-0.00582055002450943,
0.020077290013432503,
0.01983010396361351,
0.1941339671611786,
-0.05068435147404671,
-0.1289229691028595,
-0.007887794636189938,
-0.09732818603515625,
0.05626727268099785,
-0.02110539935529232,
0.18948052823543549,
0.10700737684965134,
-0.011050940491259098,
-0.02411363087594509,
0.091082863509655,
0.048400044441223145,
-0.021906744688749313,
0.03909049928188324,
-0.01488726120442152,
0.004516135901212692,
0.07825465500354767,
0.10535556823015213,
-0.05107630044221878,
-0.06381203979253769,
0.06507761031389236,
0.046300046145915985,
0.04083361104130745,
0.0017250821692869067,
0.05197330191731453,
0.08008324354887009,
-0.1354457288980484,
0.06636136770248413,
0.031544383615255356,
-0.038189847022295,
-0.05723673477768898,
-0.20152753591537476,
-0.12744911015033722,
-0.16701184213161469,
-0.009737314656376839,
-0.11539758741855621,
-0.02711603417992592,
0.037129271775484085,
0.0052866325713694096,
-0.0393037386238575,
0.040039751678705215,
-0.13634523749351501,
-0.02759377472102642,
0.0017205497715622187,
-0.010406858287751675,
-0.0370187908411026,
-0.033683277666568756,
-0.0830937996506691,
-0.01722908951342106,
0.06585504114627838,
0.030745262280106544,
-0.03199867904186249,
0.06117476895451546,
0.010557935573160648,
0.005094247404485941,
-0.07209722697734833,
-0.04150618612766266,
0.05771553888916969,
0.06213271617889404,
0.023134248331189156,
0.007875337265431881,
-0.04092098027467728,
-0.02423417754471302,
0.06436410546302795,
-0.00729147857055068,
-0.11716154962778091,
-0.06996366381645203,
0.19057226181030273,
0.016891470178961754,
0.047978274524211884,
0.0175011046230793,
-0.1244611069560051,
0.010350952856242657,
0.07428672164678574,
0.25412389636039734,
-0.05712803080677986,
-0.02614785172045231,
0.13036666810512543,
-0.005716809071600437,
0.04005762189626694,
0.0012349042808637023,
0.027591191232204437,
0.15206779539585114,
-0.03065338544547558,
0.0008266579825431108,
-0.02034955658018589,
-0.01509897131472826,
-0.045990534126758575,
0.010407562367618084,
0.0902567058801651,
-0.006550552323460579,
0.025568611919879913,
0.03170889616012573,
-0.06890971213579178,
-0.01661785878241062,
0.06021289899945259,
-0.12566906213760376,
-0.15054254233837128,
-0.04656226187944412,
-0.08392927050590515,
-0.043947748839855194,
0.09733542054891586,
-0.06021266430616379,
-0.04328557848930359,
0.002062457147985697,
-0.016420559957623482,
-0.06460856646299362,
-0.10041207820177078,
0.05271320790052414,
0.04468485340476036,
0.06674575805664062,
-0.008774137124419212,
0.1001463308930397,
0.11252930760383606,
0.036284904927015305,
-0.0468113012611866,
0.06392323225736618,
-0.0076974667608737946,
-0.046792082488536835,
0.01800895668566227,
0.025285180658102036,
-0.017423680052161217,
0.07583744078874588,
0.015101736411452293,
-0.11507260054349899,
0.022592417895793915,
0.04392814636230469,
0.0001828901149565354,
-0.05735708773136139,
0.06106724217534065,
-0.11950238794088364,
0.14901810884475708,
0.17798571288585663,
-0.022332286462187767,
-0.02955426461994648,
-0.043940749019384384,
0.07374079525470734,
0.1201850101351738,
0.12808743119239807,
-0.061592116951942444,
-0.15826305747032166,
0.00881750788539648,
0.04733901470899582,
-0.0271743331104517,
-0.12032750248908997,
-0.10155917704105377,
-0.11678636074066162,
0.004025156144052744,
-0.00540529377758503,
0.01551742572337389,
0.09522663056850433,
-0.005503157619386911,
-0.029216477647423744,
-0.09762389212846756,
-0.04527048021554947,
0.01759377121925354,
-0.11819294840097427,
-0.07082735002040863
] |
null | null | transformers |
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed to the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
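The card itself provides no snippet yet. As a placeholder, here is a minimal sketch that assumes, based only on the repository name, that this repo holds a PEFT adapter for `mistralai/Mistral-7B-Instruct-v0.2`; the base-model id is an unconfirmed assumption.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base_id = "mistralai/Mistral-7B-Instruct-v0.2"  # assumed base model (not confirmed by this card)
adapter_id = "Augustya07/Mistral-7B-Instruct-v0.2-function-calling-hotel-adapter"

tokenizer = AutoTokenizer.from_pretrained(base_id)
base = AutoModelForCausalLM.from_pretrained(base_id, device_map="auto")
model = PeftModel.from_pretrained(base, adapter_id)  # attach the fine-tuned adapter
```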
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed]
| {"library_name": "transformers", "tags": []} | null | Augustya07/Mistral-7B-Instruct-v0.2-function-calling-hotel-adapter | [
"transformers",
"safetensors",
"arxiv:1910.09700",
"endpoints_compatible",
"region:us"
] | 2024-02-13T10:30:45+00:00 | [
"1910.09700"
] | [] | TAGS
#transformers #safetensors #arxiv-1910.09700 #endpoints_compatible #region-us
|
# Model Card for Model ID
## Model Details
### Model Description
This is the model card of a transformers model that has been pushed to the Hub. This model card has been automatically generated.
- Developed by:
- Funded by [optional]:
- Shared by [optional]:
- Model type:
- Language(s) (NLP):
- License:
- Finetuned from model [optional]:
### Model Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Downstream Use [optional]
### Out-of-Scope Use
## Bias, Risks, and Limitations
### Recommendations
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
## Training Details
### Training Data
### Training Procedure
#### Preprocessing [optional]
#### Training Hyperparameters
- Training regime:
#### Speeds, Sizes, Times [optional]
## Evaluation
### Testing Data, Factors & Metrics
#### Testing Data
#### Factors
#### Metrics
### Results
#### Summary
## Model Examination [optional]
## Environmental Impact
Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).
- Hardware Type:
- Hours used:
- Cloud Provider:
- Compute Region:
- Carbon Emitted:
## Technical Specifications [optional]
### Model Architecture and Objective
### Compute Infrastructure
#### Hardware
#### Software
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Model Card Authors [optional]
## Model Card Contact
| [
"# Model Card for Model ID",
"## Model Details",
"### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:",
"### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Downstream Use [optional]",
"### Out-of-Scope Use",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.",
"## How to Get Started with the Model\n\nUse the code below to get started with the model.",
"## Training Details",
"### Training Data",
"### Training Procedure",
"#### Preprocessing [optional]",
"#### Training Hyperparameters\n\n- Training regime:",
"#### Speeds, Sizes, Times [optional]",
"## Evaluation",
"### Testing Data, Factors & Metrics",
"#### Testing Data",
"#### Factors",
"#### Metrics",
"### Results",
"#### Summary",
"## Model Examination [optional]",
"## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:",
"## Technical Specifications [optional]",
"### Model Architecture and Objective",
"### Compute Infrastructure",
"#### Hardware",
"#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Model Card Authors [optional]",
"## Model Card Contact"
] | [
"TAGS\n#transformers #safetensors #arxiv-1910.09700 #endpoints_compatible #region-us \n",
"# Model Card for Model ID",
"## Model Details",
"### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:",
"### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Downstream Use [optional]",
"### Out-of-Scope Use",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.",
"## How to Get Started with the Model\n\nUse the code below to get started with the model.",
"## Training Details",
"### Training Data",
"### Training Procedure",
"#### Preprocessing [optional]",
"#### Training Hyperparameters\n\n- Training regime:",
"#### Speeds, Sizes, Times [optional]",
"## Evaluation",
"### Testing Data, Factors & Metrics",
"#### Testing Data",
"#### Factors",
"#### Metrics",
"### Results",
"#### Summary",
"## Model Examination [optional]",
"## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:",
"## Technical Specifications [optional]",
"### Model Architecture and Objective",
"### Compute Infrastructure",
"#### Hardware",
"#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Model Card Authors [optional]",
"## Model Card Contact"
] | [
31,
6,
3,
82,
28,
3,
4,
9,
9,
10,
42,
20,
3,
4,
5,
9,
11,
13,
3,
12,
5,
4,
5,
3,
4,
9,
53,
9,
8,
6,
3,
14,
8,
7,
9,
4
] | [
"passage: TAGS\n#transformers #safetensors #arxiv-1910.09700 #endpoints_compatible #region-us \n# Model Card for Model ID## Model Details### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Downstream Use [optional]### Out-of-Scope Use## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.## How to Get Started with the Model\n\nUse the code below to get started with the model.## Training Details### Training Data### Training Procedure#### Preprocessing [optional]#### Training Hyperparameters\n\n- Training regime:#### Speeds, Sizes, Times [optional]## Evaluation### Testing Data, Factors & Metrics#### Testing Data#### Factors#### Metrics### Results#### Summary## Model Examination [optional]## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:## Technical Specifications [optional]### Model Architecture and Objective### Compute Infrastructure#### Hardware#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Model Card Authors [optional]## Model Card Contact"
] | [
-0.06646376848220825,
0.2168014943599701,
-0.00225935154594481,
0.023818302899599075,
0.1271018385887146,
-0.001635765191167593,
0.04218708351254463,
0.13324736058712006,
-0.020175931975245476,
0.11144465953111649,
0.046588581055402756,
0.09377603232860565,
0.09928803145885468,
0.18404334783554077,
0.04859916493296623,
-0.2059975117444992,
0.007056170143187046,
-0.09090408682823181,
0.014076028019189835,
0.1116579994559288,
0.13719257712364197,
-0.10291384905576706,
0.08272874355316162,
-0.04045208916068077,
-0.02019004337489605,
0.00012576708104461432,
-0.09259183704853058,
-0.07032395154237747,
0.06885425746440887,
0.06264153122901917,
0.051234472543001175,
0.001456156256608665,
0.09140396863222122,
-0.2864592671394348,
0.017265573143959045,
0.08406311273574829,
0.0027674848679453135,
0.06290827691555023,
0.07236549258232117,
-0.07389893382787704,
0.11328595131635666,
-0.08021481335163116,
0.13019037246704102,
0.08625296503305435,
-0.062064990401268005,
-0.23071379959583282,
-0.07525765895843506,
0.0963398814201355,
0.12251301854848862,
0.06215599179267883,
-0.022921854630112648,
0.15455181896686554,
-0.06248689442873001,
0.012971068732440472,
0.1294165402650833,
-0.11526761949062347,
-0.05572471022605896,
0.061741601675748825,
0.11775490641593933,
0.10740239918231964,
-0.14110268652439117,
-0.0017287094378843904,
0.04900608956813812,
0.029121357947587967,
0.08589313924312592,
0.022661056369543076,
0.12003941088914871,
0.04652795568108559,
-0.13695219159126282,
-0.04037507623434067,
0.12011898308992386,
0.038862764835357666,
-0.06446044892072678,
-0.2168138176202774,
-0.006778308190405369,
-0.0601806715130806,
-0.014732478186488152,
-0.07019448280334473,
0.039128515869379044,
-0.02470310963690281,
0.07317749410867691,
-0.04465159401297569,
-0.1063927412033081,
-0.0421026237308979,
0.0892222449183464,
0.07748593389987946,
0.011527054943144321,
-0.02519804798066616,
0.04627908393740654,
0.13455867767333984,
0.05402068421244621,
-0.10399353504180908,
-0.07017925381660461,
-0.06942764669656754,
-0.09420394152402878,
-0.04035796597599983,
0.056760527193546295,
0.031942449510097504,
0.02665667235851288,
0.22703726589679718,
0.016653569415211678,
0.04155244305729866,
0.0224777739495039,
0.01032855175435543,
0.043662428855895996,
0.0955500528216362,
-0.05303520709276199,
-0.15660029649734497,
-0.04072032496333122,
0.09077946096658707,
-0.0027527001220732927,
-0.036689214408397675,
-0.03966725245118141,
0.03849169611930847,
0.06843466311693192,
0.13122352957725525,
0.07552056759595871,
-0.017929591238498688,
-0.04813180863857269,
-0.030096933245658875,
0.23523783683776855,
-0.1493375599384308,
0.04426715523004532,
-0.02271856553852558,
-0.01804111897945404,
-0.03908449783921242,
0.03597262129187584,
0.022118929773569107,
-0.000004518366949923802,
0.09706240892410278,
-0.058981191366910934,
-0.05378659814596176,
-0.10168042778968811,
-0.03272576630115509,
0.04088849574327469,
-0.013975566253066063,
-0.010589460842311382,
-0.09025166928768158,
-0.09490354359149933,
-0.04766594246029854,
0.05537205561995506,
-0.05123869329690933,
-0.03770573064684868,
0.009465423412621021,
-0.08151785284280777,
-0.005444355774670839,
-0.005417742300778627,
0.10699385404586792,
-0.03222226724028587,
0.04445803165435791,
-0.027600755915045738,
0.05225523188710213,
0.09919606149196625,
0.031576547771692276,
-0.0773419588804245,
0.0561848059296608,
-0.22559374570846558,
0.07503069192171097,
-0.11481974273920059,
0.04335082694888115,
-0.1704932004213333,
-0.042439818382263184,
0.005444696638733149,
0.0139949731528759,
0.013206101022660732,
0.12720820307731628,
-0.19255615770816803,
-0.01654396951198578,
0.13260798156261444,
-0.09212633967399597,
-0.118110790848732,
0.07884611934423447,
-0.029701577499508858,
0.1624738723039627,
0.04682036489248276,
-0.027025915682315826,
0.09224298596382141,
-0.16434773802757263,
-0.07092688232660294,
-0.00949116237461567,
-0.01727987825870514,
0.12109188735485077,
0.07512219995260239,
-0.05991523340344429,
0.046571120619773865,
0.02832140028476715,
-0.038078423589468,
-0.04424772411584854,
-0.050857074558734894,
-0.10884185880422592,
-0.01070026308298111,
-0.08987759798765182,
0.04065500199794769,
-0.01250192429870367,
-0.07916021347045898,
-0.029885273426771164,
-0.18612512946128845,
-0.0030564051121473312,
0.10038342326879501,
0.0035033065360039473,
-0.005652366206049919,
-0.08666291832923889,
0.026358824223279953,
-0.03112892620265484,
-0.008404186926782131,
-0.16764774918556213,
-0.04399421438574791,
0.046902090311050415,
-0.16094985604286194,
0.020117372274398804,
-0.06413903087377548,
0.06334125250577927,
0.03641495108604431,
-0.05590536445379257,
-0.0248766727745533,
-0.01730942726135254,
0.011945613659918308,
-0.05083848536014557,
-0.18994836509227753,
-0.056277405470609665,
-0.037882111966609955,
0.149809330701828,
-0.25956398248672485,
0.032966937869787216,
0.051140617579221725,
0.14649195969104767,
0.00406361510977149,
-0.05115427449345589,
0.01429014839231968,
-0.05360214412212372,
-0.054652128368616104,
-0.06746816635131836,
-0.006135428790003061,
-0.027576493099331856,
-0.05147203803062439,
0.019243421033024788,
-0.1755700707435608,
-0.021410830318927765,
0.09424154460430145,
0.12876708805561066,
-0.1486445665359497,
-0.018640631809830666,
-0.048725154250860214,
-0.06339836865663528,
-0.0715010017156601,
-0.07038594037294388,
0.10712739825248718,
0.0513901449739933,
0.04796046018600464,
-0.07435787469148636,
-0.07092321664094925,
0.02726263552904129,
0.006906150374561548,
-0.03382374346256256,
0.08727246522903442,
0.05199531093239784,
-0.09209315478801727,
0.0756213590502739,
0.1092359870672226,
0.07177663594484329,
0.09363535046577454,
0.01574566215276718,
-0.11756632477045059,
-0.028492970392107964,
0.036266472190618515,
0.02740776725113392,
0.1465986967086792,
-0.05952361226081848,
0.04016614332795143,
0.04494241625070572,
-0.04170418903231621,
0.022319864481687546,
-0.08787637203931808,
0.024075502529740334,
0.025203049182891846,
-0.0034381982404738665,
0.06284574419260025,
-0.02525499276816845,
-0.0050758360885083675,
0.07016654312610626,
0.047779910266399384,
0.04621000960469246,
0.009655474685132504,
-0.01720241829752922,
-0.1047825813293457,
0.16950392723083496,
-0.0951867327094078,
-0.269941508769989,
-0.17632324993610382,
0.026197833940386772,
0.04035249724984169,
-0.022378476336598396,
0.031619444489479065,
-0.07056326419115067,
-0.10630585998296738,
-0.1060405746102333,
-0.002429972169920802,
0.01714223250746727,
-0.06364088505506516,
-0.0741225928068161,
0.07348573952913284,
0.04382912442088127,
-0.14902326464653015,
0.038552410900592804,
0.055694397538900375,
-0.057955220341682434,
-0.0233661737293005,
0.09118817001581192,
0.12397737801074982,
0.14583967626094818,
-0.021366750821471214,
-0.028626007959246635,
0.029004426673054695,
0.19620531797409058,
-0.13469526171684265,
0.10371150821447372,
0.13814030587673187,
-0.04545360431075096,
0.08360563963651657,
0.1560150384902954,
0.029186224564909935,
-0.08317049592733383,
0.05044832453131676,
0.04082648828625679,
-0.043159641325473785,
-0.2666129767894745,
-0.0534592866897583,
0.012832709588110447,
-0.06255637854337692,
0.09786593168973923,
0.10183793306350708,
0.11542957276105881,
0.034910861402750015,
-0.07166364789009094,
-0.043925940990448,
-0.0058974819257855415,
0.11737963557243347,
-0.05490213260054588,
-0.012639665976166725,
0.07686592638492584,
-0.05086168646812439,
0.005355054512619972,
0.10266812145709991,
0.02973790094256401,
0.17442677915096283,
0.020399179309606552,
0.11231429129838943,
0.06195578724145889,
0.08633565157651901,
0.0007386076031252742,
0.02951662428677082,
0.05147615820169449,
0.017203815281391144,
-0.002300140680745244,
-0.10421168059110641,
-0.006156572140753269,
0.1449710875749588,
0.028103826567530632,
0.029669636860489845,
-0.0018948549404740334,
-0.005003341939300299,
0.05121048167347908,
0.1746254414319992,
-0.011592294089496136,
-0.22072425484657288,
-0.0845772922039032,
0.06936841458082199,
-0.06218599155545235,
-0.12968985736370087,
-0.026130788028240204,
0.045467354357242584,
-0.17519839107990265,
0.026703642681241035,
-0.027433741837739944,
0.0919293761253357,
-0.09345759451389313,
-0.02221956104040146,
0.03687324374914169,
0.084866963326931,
-0.014529162086546421,
0.08703910559415817,
-0.14498743414878845,
0.11886418610811234,
0.02978132851421833,
0.09024628251791,
-0.11081171780824661,
0.07909037172794342,
-0.007550720125436783,
0.009180475026369095,
0.19379350543022156,
-0.011335089802742004,
-0.03514958545565605,
-0.08774717897176743,
-0.11210042238235474,
-0.013537433929741383,
0.12687496840953827,
-0.1243172138929367,
0.08773399889469147,
-0.015198243781924248,
-0.044079482555389404,
0.00937260314822197,
-0.12100647389888763,
-0.17273177206516266,
-0.19628387689590454,
0.05585884302854538,
-0.09575839340686798,
0.025643249973654747,
-0.11914430558681488,
-0.07089093327522278,
-0.02952558360993862,
0.241120383143425,
-0.1745356321334839,
-0.06510113179683685,
-0.1468164622783661,
-0.046294767409563065,
0.1662203073501587,
-0.04437198117375374,
0.0718095526099205,
-0.0208172257989645,
0.20345525443553925,
0.005988610442727804,
-0.004939318168908358,
0.06724198162555695,
-0.08892562240362167,
-0.16873881220817566,
-0.06771010160446167,
0.1510489284992218,
0.11680185794830322,
0.04907919466495514,
-0.002248800592496991,
0.0011772146681323647,
-0.016943959519267082,
-0.1137804463505745,
-0.0033210667315870523,
0.16037839651107788,
0.03878779336810112,
0.025986969470977783,
-0.05243593826889992,
-0.08797456324100494,
-0.06899320334196091,
-0.06853509694337845,
0.06221301481127739,
0.19590823352336884,
-0.10376439243555069,
0.1700313836336136,
0.147536963224411,
-0.07305635511875153,
-0.23175598680973053,
0.035342130810022354,
0.04983805492520332,
0.0014306638622656465,
0.04886869341135025,
-0.18252557516098022,
0.10521943867206573,
0.019543392583727837,
-0.05505957826972008,
0.13485197722911835,
-0.1557481735944748,
-0.1552847921848297,
0.0722852572798729,
0.03904085233807564,
-0.22423844039440155,
-0.1354004591703415,
-0.09622503817081451,
-0.05825018882751465,
-0.14065024256706238,
0.06054598465561867,
-0.002136280992999673,
0.015948504209518433,
0.03500790148973465,
-0.0015643214574083686,
0.027123261243104935,
-0.058935679495334625,
0.18609118461608887,
-0.004065449349582195,
0.020676052197813988,
-0.060264769941568375,
-0.0478842556476593,
0.09839435666799545,
-0.06130504235625267,
0.12208222597837448,
0.004057085141539574,
0.01594383642077446,
-0.10362856835126877,
-0.048314861953258514,
-0.04328322783112526,
0.05154227837920189,
-0.07548051327466965,
-0.10070807486772537,
-0.043625857681035995,
0.08841723203659058,
0.07005169242620468,
-0.03383097052574158,
0.00549331633374095,
-0.07189501076936722,
0.10019614547491074,
0.17795267701148987,
0.17573626339435577,
0.009926567785441875,
-0.07241068035364151,
0.01677953451871872,
-0.04142116755247116,
0.044231921434402466,
-0.2513144314289093,
0.03756171092391014,
0.06098250672221184,
0.029438555240631104,
0.09217222779989243,
-0.020435843616724014,
-0.1820858269929886,
-0.04050002992153168,
0.08094815909862518,
-0.05452597141265869,
-0.22617179155349731,
-0.019085140898823738,
0.0954197570681572,
-0.2020406424999237,
-0.007372708059847355,
0.03995226323604584,
-0.048725228756666183,
-0.023169852793216705,
0.00010950004070764408,
0.06317184865474701,
0.002471912419423461,
0.09773622453212738,
0.0735151618719101,
0.09715340286493301,
-0.08337292820215225,
0.10562895983457565,
0.10150538384914398,
-0.09572599828243256,
0.03605884686112404,
0.06754924356937408,
-0.05300498008728027,
-0.043293699622154236,
0.03665391728281975,
0.033023297786712646,
0.005234600510448217,
-0.060321882367134094,
0.013913018628954887,
-0.036497246474027634,
0.044923391193151474,
0.08326134830713272,
0.03754979372024536,
-0.013354414142668247,
0.06462216377258301,
0.03401726484298706,
-0.10898099094629288,
0.10366570204496384,
0.01731540448963642,
0.04105307161808014,
-0.08384523540735245,
-0.019968897104263306,
0.035425446927547455,
0.030576206743717194,
-0.01765924133360386,
-0.02306121215224266,
-0.02860277332365513,
-0.01614218018949032,
-0.14299540221691132,
-0.023106401786208153,
-0.07243485748767853,
0.006181265693157911,
0.014656842686235905,
-0.031884219497442245,
-0.011233693920075893,
0.02475680410861969,
-0.06979699432849884,
-0.07426341623067856,
-0.006949664559215307,
0.09833318740129471,
-0.15115703642368317,
0.008848577737808228,
0.06907843053340912,
-0.11088496446609497,
0.08190931379795074,
-0.008411259390413761,
0.016245156526565552,
0.022527478635311127,
-0.15448406338691711,
0.05601610988378525,
0.0008648968650959432,
0.01916889287531376,
0.025886621326208115,
-0.16471809148788452,
0.004104440100491047,
-0.04661374166607857,
-0.02149827405810356,
-0.00004464812809601426,
-0.02647159807384014,
-0.12325995415449142,
0.06858719140291214,
-0.015622655861079693,
-0.035931166261434555,
-0.02701525390148163,
0.0539589487016201,
0.07888586074113846,
-0.027474910020828247,
0.10445091128349304,
-0.008690856397151947,
0.04941811040043831,
-0.16801609098911285,
-0.02470702864229679,
-0.04982255399227142,
0.019377702847123146,
0.009884213097393513,
-0.007693959400057793,
0.04183054715394974,
-0.00976533442735672,
0.21883612871170044,
-0.05075952783226967,
0.1607085019350052,
0.05847611650824547,
-0.017352959141135216,
-0.0007513365126214921,
0.06180921941995621,
0.05997028574347496,
0.04658793285489082,
0.009480604901909828,
0.023740366101264954,
-0.022450892254710197,
-0.006695089396089315,
-0.15932634472846985,
0.01890849508345127,
0.14999441802501678,
0.06301083415746689,
0.024745315313339233,
0.05866100639104843,
-0.12775006890296936,
-0.12135478109121323,
0.09311001747846603,
-0.026755332946777344,
0.00928465835750103,
-0.08245618641376495,
0.1358020007610321,
0.14980104565620422,
-0.14000412821769714,
0.05256148427724838,
-0.06134212389588356,
-0.05217423290014267,
-0.10388828068971634,
-0.12032219022512436,
-0.05887215584516525,
-0.053666237741708755,
0.002330566756427288,
-0.03760887682437897,
0.054546963423490524,
0.03344334661960602,
-0.009351172484457493,
-0.00022941511997487396,
0.13597318530082703,
-0.019751882180571556,
-0.0028988157864660025,
0.048313532024621964,
0.03693558648228645,
0.02373051457107067,
-0.05275435373187065,
0.02940409444272518,
0.02539868652820587,
0.032232340425252914,
0.06546790152788162,
0.033412106335163116,
-0.047448933124542236,
0.03804153576493263,
-0.0025254099164158106,
-0.11207924783229828,
0.019641218706965446,
-0.00460948096588254,
-0.0742158442735672,
0.1268945336341858,
0.0407399944961071,
0.010224059224128723,
-0.03741471841931343,
0.24361543357372284,
-0.06653323769569397,
-0.06378097087144852,
-0.13251738250255585,
0.10491154342889786,
-0.0027236645109951496,
0.06476365029811859,
0.023412218317389488,
-0.1284150779247284,
0.005243356805294752,
0.13858191668987274,
0.12181595712900162,
0.0045748427510261536,
0.009228081442415714,
0.0518609918653965,
0.0025186820421367884,
-0.06998204439878464,
0.054019294679164886,
0.06992026418447495,
0.12919506430625916,
-0.07847554981708527,
0.07680778950452805,
0.0006860480643808842,
-0.08370215445756912,
-0.02947772853076458,
0.11312682181596756,
-0.0409729965031147,
0.03491825982928276,
-0.047444481402635574,
0.10916327685117722,
-0.05787910893559456,
-0.29412412643432617,
0.02350960113108158,
-0.09588567912578583,
-0.15202060341835022,
-0.018367812037467957,
0.05944539234042168,
-0.02624768204987049,
0.018029648810625076,
0.06971040368080139,
-0.06011629104614258,
0.20098382234573364,
0.0335683599114418,
-0.07864278554916382,
-0.0664360448718071,
0.04837050288915634,
-0.06564252078533173,
0.2949807047843933,
0.008418165147304535,
0.02863333560526371,
0.10770907253026962,
-0.03253700211644173,
-0.18271861970424652,
0.010723991319537163,
0.1133992001414299,
-0.08056149631738663,
0.08200647681951523,
0.19000613689422607,
-0.012578671798110008,
0.1209007054567337,
0.05294662341475487,
-0.047376248985528946,
0.04217283055186272,
-0.03389401361346245,
-0.051268599927425385,
-0.10752558708190918,
0.058453381061553955,
-0.05909625440835953,
0.15447644889354706,
0.10152646154165268,
-0.05671518296003342,
-0.004550917539745569,
-0.05555408447980881,
0.04875178262591362,
0.01804669201374054,
0.12263146042823792,
0.02951994352042675,
-0.1865430772304535,
0.032826557755470276,
-0.01144319772720337,
0.10186848044395447,
-0.25588861107826233,
-0.08421015739440918,
0.08833149075508118,
-0.011924264021217823,
-0.05105875805020332,
0.10560628771781921,
0.057650718837976456,
0.04243382066488266,
-0.043439045548439026,
-0.10480839014053345,
-0.02186836116015911,
0.14663739502429962,
-0.1469624787569046,
-0.025013303384184837
] |
null | null | transformers |
# Uploaded model
- **Developed by:** oliverbob
- **License:** apache-2.0
- **Finetuned from model:** tinyKJV
This llama model was trained 2x faster with [Unsloth](https://github.com/unslothai/unsloth) and Hugging Face's TRL library.
[<img src="https://raw.githubusercontent.com/unslothai/unsloth/main/images/unsloth%20made%20with%20love.png" width="200"/>](https://github.com/unslothai/unsloth)
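Since the repo is tagged `gguf`, a quantized-inference sketch with `llama-cpp-python` may be the most direct way to try it; the GGUF filename is not listed in this card, so the glob pattern below is an assumption.

```python
# Hedged sketch: downloads whichever GGUF file in the repo matches the
# pattern (the exact filename is not documented in this card).
from llama_cpp import Llama

llm = Llama.from_pretrained(
    repo_id="oliverbob/tinykjv",
    filename="*.gguf",  # assumption: a single GGUF file in the repo
)
out = llm("In the beginning", max_tokens=64)
print(out["choices"][0]["text"])
```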
| {"language": ["en"], "license": "apache-2.0", "tags": ["text-generation-inference", "transformers", "unsloth", "llama", "gguf"], "base_model": "tinyKJV"} | null | oliverbob/tinykjv | [
"transformers",
"gguf",
"llama",
"text-generation-inference",
"unsloth",
"en",
"base_model:tinyKJV",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] | 2024-02-13T10:32:41+00:00 | [] | [
"en"
] | TAGS
#transformers #gguf #llama #text-generation-inference #unsloth #en #base_model-tinyKJV #license-apache-2.0 #endpoints_compatible #region-us
|
# Uploaded model
- Developed by: oliverbob
- License: apache-2.0
- Finetuned from model: tinyKJV
This llama model was trained 2x faster with Unsloth and Hugging Face's TRL library.
<img src="URL width="200"/>
| [
"# Uploaded model\n\n- Developed by: oliverbob\n- License: apache-2.0\n- Finetuned from model : tinyKJV\n\nThis llama model was trained 2x faster with Unsloth and Huggingface's TRL library.\n\n<img src=\"URL width=\"200\"/>"
] | [
"TAGS\n#transformers #gguf #llama #text-generation-inference #unsloth #en #base_model-tinyKJV #license-apache-2.0 #endpoints_compatible #region-us \n",
"# Uploaded model\n\n- Developed by: oliverbob\n- License: apache-2.0\n- Finetuned from model : tinyKJV\n\nThis llama model was trained 2x faster with Unsloth and Huggingface's TRL library.\n\n<img src=\"URL width=\"200\"/>"
] | [
55,
70
] | [
"passage: TAGS\n#transformers #gguf #llama #text-generation-inference #unsloth #en #base_model-tinyKJV #license-apache-2.0 #endpoints_compatible #region-us \n# Uploaded model\n\n- Developed by: oliverbob\n- License: apache-2.0\n- Finetuned from model : tinyKJV\n\nThis llama model was trained 2x faster with Unsloth and Huggingface's TRL library.\n\n<img src=\"URL width=\"200\"/>"
] | [
-0.02731204219162464,
0.0681709423661232,
-0.0019299463601782918,
0.08918213844299316,
0.07340764999389648,
0.026659920811653137,
0.1044430211186409,
0.15266482532024384,
-0.01405428908765316,
-0.04082530736923218,
0.1363215446472168,
0.1345222145318985,
0.019740158692002296,
-0.04054921865463257,
0.005339999217540026,
-0.20532001554965973,
0.08694403618574142,
-0.004282066132873297,
-0.14360861480236053,
0.03011403977870941,
0.07556945085525513,
-0.0012169885449111462,
0.09625071287155151,
-0.018130220472812653,
-0.07994120568037033,
0.012422827072441578,
-0.04314662516117096,
-0.03733992204070091,
0.001289226463995874,
0.08703888207674026,
-0.03786769509315491,
0.024783551692962646,
0.03830438107252121,
-0.12577201426029205,
0.039333276450634,
0.027517404407262802,
-0.006836124695837498,
0.05658897012472153,
-0.002813327591866255,
0.06674309074878693,
0.21964246034622192,
-0.06044633314013481,
-0.096295565366745,
0.029014186933636665,
-0.04419377073645592,
-0.08562130481004715,
-0.04068022966384888,
0.12862662971019745,
0.045446932315826416,
0.03766439110040665,
0.042492106556892395,
0.08937887847423553,
-0.09371152520179749,
0.05047732964158058,
0.1800791323184967,
-0.252005010843277,
-0.08231350779533386,
0.16511964797973633,
0.018334735184907913,
-0.019359955564141273,
-0.01374799944460392,
0.06484995037317276,
0.059621814638376236,
-0.020943084731698036,
0.010988469235599041,
-0.07386592775583267,
-0.18153776228427887,
0.063170425593853,
-0.07971017062664032,
-0.0010773369576781988,
0.22147202491760254,
0.07577894628047943,
-0.009031101129949093,
0.014199461787939072,
-0.09033752232789993,
0.06455402076244354,
-0.08472632616758347,
0.04730667173862457,
0.08036018908023834,
0.0864400565624237,
0.011865103617310524,
-0.10791189968585968,
-0.07994741946458817,
-0.04031163454055786,
-0.112324558198452,
0.11612255126237869,
0.050452638417482376,
0.10597961395978928,
-0.11628903448581696,
0.053395915776491165,
-0.05384831503033638,
-0.12833382189273834,
-0.05688389018177986,
-0.0601380355656147,
0.1698329746723175,
0.06647045165300369,
-0.04410570114850998,
0.014526613056659698,
0.19150610268115997,
0.11786128580570221,
0.1653289943933487,
0.01238830666989088,
0.01712452434003353,
0.0657239481806755,
-0.05234319344162941,
0.0886034294962883,
-0.12943588197231293,
-0.022728562355041504,
0.14501972496509552,
0.011359705589711666,
0.06078256294131279,
0.0067203533835709095,
-0.1094445139169693,
-0.05870560184121132,
-0.0656239315867424,
0.059411972761154175,
0.08665285259485245,
0.11288152635097504,
0.035411760210990906,
-0.0471884123980999,
-0.008930647745728493,
-0.0530957356095314,
-0.032579030841588974,
-0.04111583158373833,
-0.057423029094934464,
0.17791514098644257,
0.10101050138473511,
0.005633016582578421,
-0.05547412484884262,
-0.10343114286661148,
-0.058907996863126755,
-0.0066121360287070274,
-0.023209983482956886,
0.0044165607541799545,
0.06081679090857506,
-0.0434756837785244,
0.05322548747062683,
-0.14457589387893677,
-0.22295913100242615,
0.030554894357919693,
0.14074136316776276,
-0.014548467472195625,
-0.08552434295415878,
-0.033935628831386566,
-0.044718705117702484,
0.030367717146873474,
-0.03416222706437111,
0.0010523548116907477,
-0.08794184029102325,
0.023303939029574394,
-0.08296414464712143,
0.07866523414850235,
-0.1465112417936325,
0.041874248534440994,
-0.11723915487527847,
0.03826114907860756,
-0.037769462913274765,
0.05030257627367973,
-0.03496706113219261,
0.14273765683174133,
-0.08803719282150269,
0.02258462831377983,
-0.08528848737478256,
0.0338781364262104,
0.00998828187584877,
0.16928836703300476,
-0.1193525567650795,
0.007213106844574213,
0.14599834382534027,
-0.01536798570305109,
-0.1340561807155609,
0.09063468128442764,
0.010118551552295685,
0.10578177124261856,
0.08196927607059479,
0.172842338681221,
0.1414443850517273,
-0.07191483676433563,
0.06420955061912537,
0.1552814245223999,
-0.03787364810705185,
-0.11848647892475128,
0.06937824189662933,
0.013029894791543484,
-0.14651140570640564,
0.09073144197463989,
-0.11804187297821045,
0.13746142387390137,
0.017666712403297424,
-0.051283497363328934,
-0.130022794008255,
-0.11396859586238861,
-0.07663041353225708,
-0.018863439559936523,
0.04910945147275925,
0.011272500269114971,
-0.013867764733731747,
-0.023736469447612762,
0.1684211790561676,
-0.05389493703842163,
0.032516222447156906,
-0.03365393728017807,
0.05359325930476189,
-0.11563409864902496,
0.09004583954811096,
-0.04379928484559059,
0.006747591309249401,
-0.030868131667375565,
-0.05136380344629288,
0.08293106406927109,
0.085548996925354,
0.041281361132860184,
-0.08140204846858978,
0.00029386990354396403,
0.03556390479207039,
0.07678036391735077,
-0.0018510520458221436,
-0.046011507511138916,
-0.12633109092712402,
0.04414167255163193,
0.00909576565027237,
0.061969075351953506,
-0.040710270404815674,
0.029016952961683273,
-0.10338258743286133,
0.06389079988002777,
-0.04874595254659653,
0.09419621527194977,
0.052201464772224426,
-0.10532160103321075,
-0.03339260071516037,
-0.09060990810394287,
0.09138170629739761,
0.05139831826090813,
-0.04975319653749466,
0.11369440704584122,
0.015388710424304008,
0.09414669126272202,
0.17617449164390564,
0.0015963565092533827,
0.08507959544658661,
0.050890058279037476,
-0.035966191440820694,
0.009178239852190018,
0.04180285334587097,
0.05813347548246384,
-0.023538164794445038,
-0.009514140896499157,
0.12492678314447403,
-0.09757804125547409,
-0.010598476976156235,
0.016053611412644386,
-0.11206924170255661,
0.013843766413629055,
0.05918135493993759,
0.1929953247308731,
-0.11896663904190063,
0.07896073162555695,
0.2614811360836029,
-0.0543133020401001,
0.12258010357618332,
-0.050516653805971146,
-0.06914851069450378,
0.008946149609982967,
0.010315853171050549,
-0.01092513743788004,
0.02044656313955784,
-0.04685583710670471,
0.017027607187628746,
0.06698636710643768,
0.0006311375182121992,
0.0678902119398117,
-0.12007886171340942,
-0.04362323135137558,
0.003034918801859021,
-0.06773106753826141,
0.022162361070513725,
0.09609085321426392,
-0.08238329738378525,
0.08178546279668808,
-0.009819707833230495,
-0.03370358422398567,
0.042101580649614334,
0.03413805365562439,
-0.022643795236945152,
0.13008415699005127,
-0.06629135459661484,
-0.16544976830482483,
-0.144446462392807,
-0.044916898012161255,
-0.148983895778656,
0.0053984178230166435,
0.083112932741642,
-0.08572791516780853,
-0.06722782552242279,
-0.07240135222673416,
0.005524736829102039,
0.014148552902042866,
0.026054657995700836,
0.0480896420776844,
0.04758051782846451,
0.04091566428542137,
-0.12932907044887543,
-0.032983433455228806,
0.021518725901842117,
-0.03662583976984024,
-0.007607446983456612,
-0.09812185913324356,
0.10727524012327194,
0.10516650229692459,
0.025575637817382812,
0.0012541617034003139,
0.07107031345367432,
0.15010981261730194,
0.015095352195203304,
0.0440133698284626,
0.23628012835979462,
0.06314258277416229,
0.05647728964686394,
0.11293879896402359,
0.011708950623869896,
-0.0702982097864151,
-0.00043705443385988474,
0.010926649905741215,
-0.07996921241283417,
-0.16480368375778198,
-0.018665896728634834,
-0.11631453782320023,
0.06102304905653,
0.08873937278985977,
0.07242050766944885,
0.02055886946618557,
0.178638756275177,
-0.06001463159918785,
0.15674643218517303,
-0.016050633043050766,
0.08023907244205475,
0.15045811235904694,
0.025022009387612343,
0.04304582625627518,
-0.14382480084896088,
-0.04252858832478523,
0.1389012336730957,
0.05226315185427666,
0.10548023879528046,
0.019463133066892624,
0.008888917043805122,
0.07772888988256454,
0.16299311816692352,
0.014400182291865349,
0.10556044429540634,
-0.016837000846862793,
-0.016470544040203094,
-0.06872827559709549,
-0.06292062997817993,
-0.0554400198161602,
0.05724654719233513,
-0.11370961368083954,
-0.051540229469537735,
-0.005335876252502203,
0.06905355304479599,
0.06566033512353897,
0.23250943422317505,
0.03635324910283089,
-0.2022482454776764,
-0.027496997267007828,
0.08232814073562622,
0.026817385107278824,
-0.03725280612707138,
0.07159000635147095,
-0.04726593568921089,
-0.03979724273085594,
0.06097860634326935,
-0.0019911930430680513,
0.13907691836357117,
0.04743914678692818,
0.04219399020075798,
-0.013972347602248192,
0.08692286163568497,
0.06030794605612755,
0.13216359913349152,
-0.2155124992132187,
0.06974568963050842,
0.011113323271274567,
0.024118173867464066,
-0.07403972744941711,
-0.00947639811784029,
0.12578865885734558,
0.126658633351326,
0.09548574686050415,
0.017710896208882332,
0.007883303798735142,
0.02713896706700325,
-0.11956281960010529,
0.10312395542860031,
-0.02744666486978531,
-0.0277019701898098,
0.06441593915224075,
-0.10618900507688522,
-0.011121119372546673,
0.0116186598315835,
0.1045357957482338,
-0.07433164864778519,
-0.14036045968532562,
0.002226253505796194,
0.1271163523197174,
-0.043951332569122314,
-0.033943213522434235,
0.028834143653512,
-0.08215045928955078,
0.16535449028015137,
0.07320375740528107,
-0.07828971743583679,
-0.07920217514038086,
-0.023299837484955788,
0.13297586143016815,
-0.08329752832651138,
0.00630695978179574,
-0.1167217493057251,
0.007412098813802004,
0.026248527690768242,
-0.25275030732154846,
0.02513054572045803,
-0.11627991497516632,
-0.009859842248260975,
0.02396240085363388,
0.03507332131266594,
-0.12267056107521057,
-0.03085305728018284,
0.033393744379282,
-0.05263901501893997,
-0.1035546064376831,
-0.13170744478702545,
-0.09133031219244003,
0.16269689798355103,
-0.09101293981075287,
-0.030969135463237762,
-0.11298879981040955,
0.05964142829179764,
-0.002725844504311681,
-0.015553470700979233,
0.06968466192483902,
0.18966303765773773,
-0.03232213482260704,
0.08212529122829437,
0.16571184992790222,
-0.08631868660449982,
-0.2807105779647827,
-0.13657449185848236,
-0.1166173443198204,
-0.044888392090797424,
-0.03530352935194969,
-0.08420112729072571,
0.1864248663187027,
0.04627862200140953,
-0.0324094183743,
0.1133420541882515,
-0.2637578845024109,
-0.09547971934080124,
0.13758212327957153,
0.011608807370066643,
0.3852095901966095,
-0.16626015305519104,
-0.03957197815179825,
-0.13846756517887115,
-0.22426113486289978,
0.11411848664283752,
-0.2423543930053711,
0.1140967532992363,
-0.05665820837020874,
0.04911613464355469,
0.0155326621606946,
-0.001305568148382008,
0.1414719969034195,
-0.012310202233493328,
0.06367261707782745,
-0.1329333782196045,
0.06208883970975876,
0.06572108715772629,
-0.09382104873657227,
0.17320556938648224,
-0.18459224700927734,
0.056166741997003555,
-0.1099638044834137,
-0.019110415130853653,
-0.040918804705142975,
0.03341556340456009,
0.01769811473786831,
-0.043246570974588394,
-0.10136748850345612,
-0.02602279745042324,
0.09272490441799164,
0.031505752354860306,
0.16182470321655273,
0.05352037027478218,
-0.06649680435657501,
0.07220936566591263,
-0.010511016473174095,
-0.12077299505472183,
0.009355268441140652,
-0.07275556772947311,
-0.03330415487289429,
0.06612494587898254,
-0.2886815369129181,
0.035252973437309265,
0.05473724380135536,
-0.03913373500108719,
0.013614718802273273,
0.02171037532389164,
0.036769501864910126,
-0.024670585989952087,
0.06657419353723526,
-0.11537779122591019,
-0.015361574478447437,
-0.0380244143307209,
0.030116809532046318,
-0.023236244916915894,
0.0433175191283226,
0.15249237418174744,
-0.08178935199975967,
-0.00404066126793623,
-0.004162941128015518,
0.02938929758965969,
-0.07153653353452682,
0.024621445685625076,
0.09381719678640366,
-0.0386839359998703,
-0.1205606535077095,
0.14884839951992035,
-0.013504866510629654,
-0.005197187885642052,
-0.007685041520744562,
0.03813750296831131,
-0.11025910824537277,
-0.10088741779327393,
0.11162110418081284,
0.10080631077289581,
-0.1918773651123047,
-0.052846428006887436,
-0.070075124502182,
-0.06067702919244766,
0.032042067497968674,
-0.012986890971660614,
0.07356083393096924,
0.012936200015246868,
-0.017127452418208122,
-0.03041389212012291,
-0.05973239615559578,
0.01722346432507038,
0.05954626575112343,
0.030173007398843765,
-0.18337713181972504,
-0.060033876448869705,
-0.03163774311542511,
0.0959000438451767,
-0.03922806307673454,
0.033905114978551865,
-0.09788063168525696,
-0.01690014638006687,
-0.24946759641170502,
0.042713675647974014,
-0.08627820760011673,
0.03371429443359375,
0.020557427778840065,
-0.044366855174303055,
-0.08633460849523544,
0.04370934143662453,
-0.10540201514959335,
-0.041207727044820786,
-0.052886176854372025,
0.05432154983282089,
-0.05155292525887489,
-0.05606061965227127,
0.03338630869984627,
-0.04489986598491669,
0.006964104250073433,
0.045664090663194656,
-0.07737810164690018,
0.050433069467544556,
-0.1776731312274933,
-0.06819284707307816,
0.023161977529525757,
0.04655323550105095,
-0.0056908694095909595,
0.07260168343782425,
0.020985806360840797,
0.052489764988422394,
0.05188708007335663,
-0.037376172840595245,
-0.009300217032432556,
-0.08883940428495407,
-0.08826791495084763,
-0.09489642828702927,
-0.0023439715150743723,
-0.02819092757999897,
-0.042952317744493484,
0.10936854034662247,
0.1091170385479927,
0.1475471407175064,
-0.01601126417517662,
-0.012488547712564468,
-0.11755484342575073,
0.01261834055185318,
-0.005695186089724302,
-0.103845976293087,
-0.05580080300569534,
-0.1147274598479271,
-0.01403429452329874,
-0.018169719725847244,
0.13410921394824982,
0.05050040781497955,
-0.029125787317752838,
-0.021357189863920212,
0.0486774817109108,
0.059032488614320755,
-0.03373446315526962,
0.2944798171520233,
0.06115159019827843,
0.057804521173238754,
-0.09126466512680054,
0.02353319339454174,
0.11890296638011932,
-0.022128483280539513,
0.016182895749807358,
0.08523865044116974,
-0.018715733662247658,
0.16093742847442627,
0.021851448342204094,
0.03935640677809715,
0.04729987308382988,
0.037536654621362686,
0.002984924940392375,
0.10312757641077042,
-0.03501693531870842,
0.13855786621570587,
0.1387198120355606,
-0.06543710827827454,
-0.011476235464215279,
-0.027979431673884392,
0.0003270624147262424,
-0.1352522224187851,
-0.20956970751285553,
-0.11856232583522797,
-0.19505354762077332,
0.00876085739582777,
-0.06635048240423203,
0.05579289048910141,
0.06920060515403748,
0.01541447639465332,
-0.001996866427361965,
0.011574097909033298,
-0.071006178855896,
-0.10046039521694183,
0.06889678537845612,
-0.03579825907945633,
-0.1008966863155365,
0.04806346073746681,
-0.010459239594638348,
0.010545837692916393,
-0.0381493978202343,
-0.01649155467748642,
0.035420972853899,
0.09362765401601791,
0.052099671214818954,
-0.09079761803150177,
-0.046076130121946335,
-0.05657218024134636,
0.027669092640280724,
0.046184733510017395,
0.04758928716182709,
0.04096153751015663,
-0.059335820376873016,
0.04364310950040817,
0.1755574345588684,
-0.05151301249861717,
-0.12886182963848114,
-0.09369144588708878,
-0.013757630251348019,
-0.0594463050365448,
0.004317616578191519,
-0.05366048589348793,
-0.016792401671409607,
-0.014004170894622803,
0.3887156546115875,
0.13607683777809143,
-0.10398776829242706,
-0.022368386387825012,
-0.054774366319179535,
0.0181873831897974,
-0.016692068427801132,
0.13717974722385406,
0.12001432478427887,
0.03395318239927292,
-0.06242438778281212,
-0.052586279809474945,
-0.03745099529623985,
-0.024547578766942024,
-0.1628098040819168,
0.03187359496951103,
-0.06193769723176956,
-0.07036192715167999,
0.0020498440135270357,
0.02067972533404827,
-0.12564580142498016,
0.006826639641076326,
0.0003891655069310218,
-0.00492359884083271,
-0.024200838059186935,
-0.10041162371635437,
0.024980375543236732,
0.07088685035705566,
0.03574902564287186,
-0.11278268694877625,
0.029915345832705498,
0.10501901805400848,
-0.03813346475362778,
-0.2023334503173828,
-0.04933694377541542,
0.08102178573608398,
0.0897657722234726,
0.11569032073020935,
0.04224676266312599,
-0.014628976583480835,
0.054559480398893356,
-0.03947310522198677,
-0.15429791808128357,
0.04364027827978134,
-0.02388220652937889,
-0.03175673261284828,
0.038264937698841095,
-0.08248069137334824,
-0.0626293271780014,
-0.09100011736154556,
0.054748497903347015,
0.04829966276884079,
-0.030601972714066505,
0.11638902872800827,
-0.0000036934086438122904,
-0.08340339362621307,
-0.00169432966504246,
-0.11030074954032898,
0.1010465994477272,
0.07712224870920181,
-0.06341137737035751,
-0.06291618198156357,
-0.09976807236671448,
0.08355272561311722,
0.023979008197784424,
-0.12376244366168976,
-0.004764157813042402,
0.038602352142333984,
-0.04835794121026993,
-0.008437832817435265,
0.04270331189036369,
-0.0964532271027565,
-0.03567324951291084,
-0.08608482033014297,
-0.025722388178110123,
-0.06450986862182617,
0.1035710871219635,
0.16431021690368652,
0.020791973918676376,
-0.019272802397608757,
-0.13931281864643097,
-0.011602581478655338,
0.031952038407325745,
-0.026270780712366104,
-0.11949161440134048
] |
null | null | transformers |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# cards-blt-swin-tiny-patch4-window7-224-finetuned-v2
This model is a fine-tuned version of [microsoft/swin-tiny-patch4-window7-224](https://huggingface.co/microsoft/swin-tiny-patch4-window7-224) on the imagefolder dataset.
It achieves the following results on the evaluation set:
- Loss: 1.2162
- Accuracy: 0.5022
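For trying the checkpoint, a minimal inference sketch with the `image-classification` pipeline is shown below; the repo id is taken from this card's name, and `card.jpg` is a placeholder input path.

```python
# Hedged sketch: the model id mirrors this card's name; replace card.jpg
# with any image the classifier should label.
from transformers import pipeline

classifier = pipeline(
    "image-classification",
    model="ansilmbabl/cards-blt-swin-tiny-patch4-window7-224-finetuned-v2",
)
print(classifier("card.jpg"))  # list of {label, score} dicts, best score first
```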
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 128
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 100
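For reference, these settings map onto the standard `TrainingArguments` roughly as follows — a sketch only, since the actual training script is not included in this card, and a single training device is assumed so that 32 × 4 = 128:

```python
# Hedged sketch: reproduces only the hyperparameters listed above.
# Adam betas (0.9, 0.999) and epsilon 1e-08 match the TrainingArguments defaults.
from transformers import TrainingArguments

args = TrainingArguments(
    output_dir="cards-blt-swin-tiny-patch4-window7-224-finetuned-v2",
    learning_rate=5e-5,
    per_device_train_batch_size=32,   # with 1 device and 4 accumulation steps -> 128
    per_device_eval_batch_size=32,
    seed=42,
    gradient_accumulation_steps=4,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=100,
)
```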
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 1.4297 | 1.0 | 56 | 1.1976 | 0.4933 |
| 1.4078 | 1.99 | 112 | 1.1964 | 0.5011 |
| 1.417 | 2.99 | 168 | 1.2025 | 0.4961 |
| 1.4163 | 4.0 | 225 | 1.2295 | 0.4883 |
| 1.4318 | 5.0 | 281 | 1.2330 | 0.495 |
| 1.4383 | 5.99 | 337 | 1.2162 | 0.5022 |
| 1.4212 | 6.99 | 393 | 1.2634 | 0.4717 |
| 1.4346 | 8.0 | 450 | 1.3083 | 0.4689 |
| 1.419 | 9.0 | 506 | 1.2719 | 0.4806 |
| 1.4252 | 9.99 | 562 | 1.3048 | 0.4911 |
| 1.4522 | 10.99 | 618 | 1.2708 | 0.4794 |
| 1.3748 | 12.0 | 675 | 1.3720 | 0.4383 |
| 1.3966 | 13.0 | 731 | 1.3095 | 0.4594 |
| 1.4507 | 13.99 | 787 | 1.2430 | 0.485 |
| 1.4033 | 14.99 | 843 | 1.2728 | 0.4794 |
| 1.3972 | 16.0 | 900 | 1.2611 | 0.4883 |
| 1.4136 | 17.0 | 956 | 1.3166 | 0.45 |
| 1.3992 | 17.99 | 1012 | 1.3103 | 0.4856 |
| 1.3614 | 18.99 | 1068 | 1.3302 | 0.4422 |
| 1.3747 | 20.0 | 1125 | 1.2919 | 0.4856 |
| 1.3868 | 21.0 | 1181 | 1.3166 | 0.4728 |
| 1.3399 | 21.99 | 1237 | 1.3200 | 0.4672 |
| 1.3943 | 22.99 | 1293 | 1.2920 | 0.4811 |
| 1.3635 | 24.0 | 1350 | 1.3109 | 0.4833 |
| 1.3724 | 25.0 | 1406 | 1.3100 | 0.4644 |
| 1.3141 | 25.99 | 1462 | 1.3263 | 0.4978 |
| 1.3576 | 26.99 | 1518 | 1.3307 | 0.4772 |
| 1.3022 | 28.0 | 1575 | 1.3409 | 0.4978 |
| 1.2982 | 29.0 | 1631 | 1.3962 | 0.4583 |
| 1.2657 | 29.99 | 1687 | 1.3329 | 0.4817 |
| 1.3152 | 30.99 | 1743 | 1.2973 | 0.49 |
| 1.2924 | 32.0 | 1800 | 1.3159 | 0.4833 |
| 1.214 | 33.0 | 1856 | 1.3955 | 0.4833 |
| 1.2717 | 33.99 | 1912 | 1.4583 | 0.46 |
| 1.2692 | 34.99 | 1968 | 1.3504 | 0.4939 |
| 1.2127 | 36.0 | 2025 | 1.3784 | 0.4833 |
| 1.1956 | 37.0 | 2081 | 1.4184 | 0.4817 |
| 1.2408 | 37.99 | 2137 | 1.3849 | 0.4944 |
| 1.1699 | 38.99 | 2193 | 1.4298 | 0.4844 |
| 1.1727 | 40.0 | 2250 | 1.4331 | 0.4772 |
| 1.1485 | 41.0 | 2306 | 1.4597 | 0.4672 |
| 1.1668 | 41.99 | 2362 | 1.4429 | 0.4783 |
| 1.1881 | 42.99 | 2418 | 1.4555 | 0.4839 |
| 1.1204 | 44.0 | 2475 | 1.4648 | 0.4783 |
| 1.1523 | 45.0 | 2531 | 1.4744 | 0.4733 |
| 1.1206 | 45.99 | 2587 | 1.4792 | 0.4906 |
| 1.1135 | 46.99 | 2643 | 1.5009 | 0.4678 |
| 1.1227 | 48.0 | 2700 | 1.5480 | 0.4733 |
| 1.1017 | 49.0 | 2756 | 1.5907 | 0.4644 |
| 1.1601 | 49.99 | 2812 | 1.5136 | 0.47 |
| 1.1239 | 50.99 | 2868 | 1.5384 | 0.4789 |
| 1.09 | 52.0 | 2925 | 1.5716 | 0.4711 |
| 1.1023 | 53.0 | 2981 | 1.5736 | 0.4728 |
| 1.1038 | 53.99 | 3037 | 1.5919 | 0.4556 |
| 1.058 | 54.99 | 3093 | 1.5534 | 0.4772 |
| 1.0405 | 56.0 | 3150 | 1.5788 | 0.4717 |
| 1.0172 | 57.0 | 3206 | 1.5855 | 0.4767 |
| 1.0036 | 57.99 | 3262 | 1.6425 | 0.455 |
| 1.0124 | 58.99 | 3318 | 1.6039 | 0.4678 |
| 1.0647 | 60.0 | 3375 | 1.5891 | 0.4572 |
| 1.0143 | 61.0 | 3431 | 1.6265 | 0.4483 |
| 1.0051 | 61.99 | 3487 | 1.6208 | 0.4633 |
| 0.9571 | 62.99 | 3543 | 1.6874 | 0.4483 |
| 0.9838 | 64.0 | 3600 | 1.6778 | 0.4517 |
| 0.9995 | 65.0 | 3656 | 1.6248 | 0.4722 |
| 1.0374 | 65.99 | 3712 | 1.6645 | 0.4667 |
| 0.9483 | 66.99 | 3768 | 1.6307 | 0.4611 |
| 0.9825 | 68.0 | 3825 | 1.6662 | 0.4661 |
| 1.0023 | 69.0 | 3881 | 1.6650 | 0.46 |
| 0.9642 | 69.99 | 3937 | 1.6953 | 0.4494 |
| 0.9687 | 70.99 | 3993 | 1.7076 | 0.4661 |
| 0.9542 | 72.0 | 4050 | 1.7012 | 0.4656 |
| 0.9378 | 73.0 | 4106 | 1.7056 | 0.4533 |
| 0.9542 | 73.99 | 4162 | 1.7331 | 0.4572 |
| 0.9035 | 74.99 | 4218 | 1.7459 | 0.4417 |
| 0.9631 | 76.0 | 4275 | 1.7236 | 0.465 |
| 0.8759 | 77.0 | 4331 | 1.7294 | 0.455 |
| 0.9218 | 77.99 | 4387 | 1.7654 | 0.4578 |
| 0.9077 | 78.99 | 4443 | 1.7234 | 0.4594 |
| 0.8924 | 80.0 | 4500 | 1.7256 | 0.4683 |
| 0.9156 | 81.0 | 4556 | 1.7320 | 0.4678 |
| 0.806 | 81.99 | 4612 | 1.7348 | 0.4661 |
| 0.8863 | 82.99 | 4668 | 1.7514 | 0.4606 |
| 0.8698 | 84.0 | 4725 | 1.7484 | 0.4661 |
| 0.8623 | 85.0 | 4781 | 1.7420 | 0.4778 |
| 0.8643 | 85.99 | 4837 | 1.7636 | 0.4617 |
| 0.8914 | 86.99 | 4893 | 1.7552 | 0.465 |
| 0.837 | 88.0 | 4950 | 1.7552 | 0.4644 |
| 0.8217 | 89.0 | 5006 | 1.7532 | 0.4639 |
| 0.8601 | 89.99 | 5062 | 1.7447 | 0.4683 |
| 0.8293 | 90.99 | 5118 | 1.7622 | 0.4611 |
| 0.8301 | 92.0 | 5175 | 1.7616 | 0.4633 |
| 0.7752 | 93.0 | 5231 | 1.7585 | 0.4722 |
| 0.8533 | 93.99 | 5287 | 1.7842 | 0.4617 |
| 0.8156 | 94.99 | 5343 | 1.7837 | 0.4622 |
| 0.8094 | 96.0 | 5400 | 1.7896 | 0.4583 |
| 0.839 | 97.0 | 5456 | 1.7835 | 0.465 |
| 0.839 | 97.99 | 5512 | 1.7883 | 0.46 |
| 0.7763 | 98.99 | 5568 | 1.7838 | 0.4594 |
| 0.8186 | 99.56 | 5600 | 1.7837 | 0.4606 |
### Framework versions
- Transformers 4.37.2
- Pytorch 2.0.1+cu117
- Datasets 2.17.0
- Tokenizers 0.15.2
| {"license": "apache-2.0", "tags": ["generated_from_trainer"], "datasets": ["imagefolder"], "metrics": ["accuracy"], "base_model": "microsoft/swin-tiny-patch4-window7-224", "model-index": [{"name": "cards-blt-swin-tiny-patch4-window7-224-finetuned-v2", "results": [{"task": {"type": "image-classification", "name": "Image Classification"}, "dataset": {"name": "imagefolder", "type": "imagefolder", "config": "default", "split": "test", "args": "default"}, "metrics": [{"type": "accuracy", "value": 0.5022222222222222, "name": "Accuracy"}]}]}]} | image-classification | ansilmbabl/cards-blt-swin-tiny-patch4-window7-224-finetuned-v2 | [
"transformers",
"tensorboard",
"safetensors",
"swin",
"image-classification",
"generated_from_trainer",
"dataset:imagefolder",
"base_model:microsoft/swin-tiny-patch4-window7-224",
"license:apache-2.0",
"model-index",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | 2024-02-13T10:32:42+00:00 | [] | [] | TAGS
#transformers #tensorboard #safetensors #swin #image-classification #generated_from_trainer #dataset-imagefolder #base_model-microsoft/swin-tiny-patch4-window7-224 #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #region-us
| cards-blt-swin-tiny-patch4-window7-224-finetuned-v2
===================================================
This model is a fine-tuned version of microsoft/swin-tiny-patch4-window7-224 on the imagefolder dataset.
It achieves the following results on the evaluation set:
* Loss: 1.2162
* Accuracy: 0.5022
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 5e-05
* train\_batch\_size: 32
* eval\_batch\_size: 32
* seed: 42
* gradient\_accumulation\_steps: 4
* total\_train\_batch\_size: 128
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* lr\_scheduler\_warmup\_ratio: 0.1
* num\_epochs: 100
### Training results
### Framework versions
* Transformers 4.37.2
* Pytorch 2.0.1+cu117
* Datasets 2.17.0
* Tokenizers 0.15.2
| [
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 5e-05\n* train\\_batch\\_size: 32\n* eval\\_batch\\_size: 32\n* seed: 42\n* gradient\\_accumulation\\_steps: 4\n* total\\_train\\_batch\\_size: 128\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_ratio: 0.1\n* num\\_epochs: 100",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.37.2\n* Pytorch 2.0.1+cu117\n* Datasets 2.17.0\n* Tokenizers 0.15.2"
] | [
"TAGS\n#transformers #tensorboard #safetensors #swin #image-classification #generated_from_trainer #dataset-imagefolder #base_model-microsoft/swin-tiny-patch4-window7-224 #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 5e-05\n* train\\_batch\\_size: 32\n* eval\\_batch\\_size: 32\n* seed: 42\n* gradient\\_accumulation\\_steps: 4\n* total\\_train\\_batch\\_size: 128\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_ratio: 0.1\n* num\\_epochs: 100",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.37.2\n* Pytorch 2.0.1+cu117\n* Datasets 2.17.0\n* Tokenizers 0.15.2"
] | [
88,
144,
4,
33
] | [
"passage: TAGS\n#transformers #tensorboard #safetensors #swin #image-classification #generated_from_trainer #dataset-imagefolder #base_model-microsoft/swin-tiny-patch4-window7-224 #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 5e-05\n* train\\_batch\\_size: 32\n* eval\\_batch\\_size: 32\n* seed: 42\n* gradient\\_accumulation\\_steps: 4\n* total\\_train\\_batch\\_size: 128\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_ratio: 0.1\n* num\\_epochs: 100### Training results### Framework versions\n\n\n* Transformers 4.37.2\n* Pytorch 2.0.1+cu117\n* Datasets 2.17.0\n* Tokenizers 0.15.2"
] | [
-0.12197081744670868,
0.16778463125228882,
-0.002241230569779873,
0.09024723619222641,
0.10894396901130676,
0.027739636600017548,
0.10612418502569199,
0.13863332569599152,
-0.06307186931371689,
0.11581125110387802,
0.13803674280643463,
0.08974488824605942,
0.072993203997612,
0.15467678010463715,
-0.0044370596297085285,
-0.29600289463996887,
0.017680872231721878,
-0.008941015228629112,
-0.14032605290412903,
0.11068560183048248,
0.06715383380651474,
-0.12732191383838654,
0.09052493423223495,
0.002919551217928529,
-0.1423577070236206,
-0.028385527431964874,
-0.03812786936759949,
-0.04820679500699043,
0.10070312023162842,
0.034423843026161194,
0.08278004825115204,
0.028820611536502838,
0.11601699888706207,
-0.23169413208961487,
0.007078269496560097,
0.07431315630674362,
0.013421805575489998,
0.09829675406217575,
0.11385973542928696,
0.017809392884373665,
0.1423722505569458,
-0.11045458167791367,
0.06527907401323318,
0.040604494512081146,
-0.0836503803730011,
-0.23937545716762543,
-0.06524749845266342,
0.09032010287046432,
0.12473452091217041,
0.05376007780432701,
-0.009274756535887718,
0.08508836477994919,
-0.0644797533750534,
0.0844368040561676,
0.2287716567516327,
-0.24264070391654968,
-0.07315286248922348,
0.045961663126945496,
0.0336332805454731,
0.03454527631402016,
-0.13502615690231323,
-0.005948153790086508,
0.038717541843652725,
0.000785020471084863,
0.11111516505479813,
0.026071859523653984,
0.0589665062725544,
0.008098294027149677,
-0.13931457698345184,
-0.0442984402179718,
0.09446113556623459,
0.11179595440626144,
-0.018874479457736015,
-0.12070099264383316,
-0.055661458522081375,
-0.19925019145011902,
-0.04809974133968353,
0.012125787325203419,
0.04130754619836807,
-0.05666695907711983,
-0.08140364289283752,
0.031903136521577835,
-0.07018715888261795,
-0.0815420150756836,
0.04439853876829147,
0.12772688269615173,
0.05981215834617615,
-0.005291998386383057,
0.022647010162472725,
0.11745388060808182,
0.09194192290306091,
-0.16325566172599792,
-0.0008715098374523222,
0.008386597968637943,
-0.07422223687171936,
-0.0007253802614286542,
-0.007126060780137777,
0.024172812700271606,
0.0421549454331398,
0.1427929848432541,
-0.030113769695162773,
0.07957372069358826,
0.08745452016592026,
0.02263854816555977,
-0.08255328238010406,
0.14926688373088837,
-0.08104278892278671,
-0.09357237070798874,
-0.02895929664373398,
0.11769639700651169,
0.034360889345407486,
-0.009186741895973682,
-0.08594934642314911,
0.019303781911730766,
0.10698030143976212,
0.027295855805277824,
-0.002915973076596856,
0.04394292086362839,
-0.056374888867139816,
-0.02971779741346836,
0.08426685631275177,
-0.08451053500175476,
0.045033011585474014,
0.03186368942260742,
-0.06657171249389648,
-0.010876175947487354,
0.024802403524518013,
-0.012761885300278664,
0.007512000389397144,
0.10719368606805801,
-0.09762382507324219,
-0.02933877892792225,
-0.08452755212783813,
-0.07777481526136398,
0.030467860400676727,
-0.09048594534397125,
0.01657946966588497,
-0.08281129598617554,
-0.11364054679870605,
-0.039871424436569214,
0.06450004130601883,
-0.06062080338597298,
-0.0703015923500061,
-0.05010989308357239,
-0.10224007070064545,
0.060127921402454376,
0.006665038410574198,
0.128762349486351,
-0.05101712793111801,
0.09616884589195251,
0.00291986926458776,
0.08200828731060028,
0.0644688531756401,
0.03549591451883316,
-0.06391667574644089,
0.06703869253396988,
-0.16491763293743134,
0.05083831027150154,
-0.08631675690412521,
0.07128114253282547,
-0.11775048077106476,
-0.10321716964244843,
-0.013145104981958866,
-0.01133703626692295,
0.06698551028966904,
0.1435525268316269,
-0.15925000607967377,
-0.06838212162256241,
0.15040865540504456,
-0.08637046813964844,
-0.12266295403242111,
0.10439243167638779,
-0.01354361791163683,
-0.058964867144823074,
0.013840096071362495,
0.16568870842456818,
0.08135281503200531,
-0.0842730775475502,
-0.03580331802368164,
0.006064062938094139,
0.0957372859120369,
-0.002940719947218895,
0.1002473384141922,
-0.002241311129182577,
0.014782523736357689,
0.016384869813919067,
-0.0725528821349144,
0.0758015364408493,
-0.08964798599481583,
-0.07908418774604797,
-0.03882293775677681,
-0.08464035391807556,
0.026834135875105858,
0.06508512049913406,
0.023225782439112663,
-0.07940685749053955,
-0.13606154918670654,
0.014811579138040543,
0.1204112321138382,
-0.09607531130313873,
-0.005412750411778688,
-0.05541950464248657,
0.07098424434661865,
-0.04893878474831581,
-0.010405845940113068,
-0.12686295807361603,
-0.07086286693811417,
0.03454255685210228,
-0.08371388912200928,
-0.017243418842554092,
-0.009270663373172283,
0.07600285112857819,
0.0869927927851677,
-0.05663977935910225,
-0.08981665968894958,
-0.057419173419475555,
0.011229296214878559,
-0.08066105842590332,
-0.2561860978603363,
-0.08026544004678726,
-0.026383237913250923,
0.14275267720222473,
-0.25495266914367676,
0.01692008040845394,
0.012403174303472042,
0.1466103196144104,
0.04449433460831642,
-0.05888935551047325,
0.002529993187636137,
0.012456423602998257,
-0.04403352737426758,
-0.10192549228668213,
0.03255503997206688,
0.0022691795602440834,
-0.11009715497493744,
-0.02617514505982399,
-0.11938951164484024,
0.12109164148569107,
0.10598591715097427,
0.010043161921203136,
-0.09652858227491379,
-0.04426586255431175,
-0.07529446482658386,
-0.05371600389480591,
-0.02137628197669983,
0.01452871784567833,
0.08154476433992386,
0.013158627785742283,
0.10640297830104828,
-0.08174393326044083,
-0.05882112681865692,
0.04136933386325836,
-0.002133174566552043,
-0.024983493611216545,
0.13981613516807556,
0.10600269585847855,
-0.07691016048192978,
0.1337069869041443,
0.1291142702102661,
-0.053959328681230545,
0.12951645255088806,
-0.05911657586693764,
-0.09739364683628082,
-0.032560836523771286,
0.02097730152308941,
0.01866086758673191,
0.15525922179222107,
-0.0944737046957016,
0.00974917784333229,
0.026286108419299126,
0.009198736399412155,
0.010538004338741302,
-0.17359569668769836,
-0.017074191942811012,
0.045315105468034744,
-0.04929956793785095,
0.020218903198838234,
-0.030374420806765556,
-0.026462629437446594,
0.0921553298830986,
0.0038208116311579943,
-0.04913785308599472,
-0.005610935855656862,
-0.006156917195767164,
-0.07983293384313583,
0.21188627183437347,
-0.07587197422981262,
-0.14679156243801117,
-0.12588828802108765,
0.04107876494526863,
-0.03771701827645302,
-0.0023191131185740232,
0.015725785866379738,
-0.10662204772233963,
-0.051581356674432755,
-0.0862559899687767,
0.0012754809577018023,
-0.014478192664682865,
0.05164944753050804,
0.012768912129104137,
0.016519291326403618,
0.07897555083036423,
-0.08416544646024704,
0.020944183692336082,
-0.00895693339407444,
-0.01114041730761528,
0.030229628086090088,
0.04253549501299858,
0.12048613280057907,
0.1305713653564453,
0.014367513358592987,
0.01849762722849846,
-0.008162100799381733,
0.19023333489894867,
-0.09695786237716675,
0.03424535319209099,
0.10195370018482208,
0.0004145709681324661,
0.051231611520051956,
0.13308091461658478,
0.04676535353064537,
-0.07548903673887253,
0.01517964992672205,
0.03477976471185684,
-0.017015092074871063,
-0.1900445520877838,
-0.02796464040875435,
-0.02770318277180195,
0.0038819287437945604,
0.13251572847366333,
0.04613769054412842,
-0.0280259158462286,
0.07201312482357025,
-0.020157305523753166,
0.01021562423557043,
-0.019142314791679382,
0.07240898907184601,
0.021824926137924194,
0.04802199453115463,
0.10660848021507263,
-0.03751734644174576,
-0.025204339995980263,
0.03823018819093704,
-0.007247514091432095,
0.21259890496730804,
-0.02768302895128727,
0.14401310682296753,
0.0234108567237854,
0.16535784304141998,
0.004037198144942522,
0.06545671820640564,
0.015197698958218098,
-0.034823864698410034,
0.006073992699384689,
-0.05381692945957184,
-0.02626587450504303,
0.052809108048677444,
0.017010800540447235,
0.059489309787750244,
-0.10556373745203018,
0.06422341614961624,
0.045280300080776215,
0.263717919588089,
0.07519254833459854,
-0.3374471068382263,
-0.09046453982591629,
0.016000010073184967,
-0.035326894372701645,
-0.050114430487155914,
0.02255243808031082,
0.15496665239334106,
-0.08576994389295578,
0.07601236552000046,
-0.08620578795671463,
0.06910087913274765,
-0.0736173689365387,
-0.004588648676872253,
0.08451958000659943,
0.10798022896051407,
0.0031525460071861744,
0.0757705569267273,
-0.19421610236167908,
0.2556628882884979,
-0.007619957905262709,
0.04746091365814209,
-0.059184323996305466,
0.031625811010599136,
0.02785421535372734,
0.022234249860048294,
0.11147918552160263,
-0.0030324021354317665,
-0.10374195128679276,
-0.1891999989748001,
-0.11875684559345245,
0.019269157201051712,
0.11635461449623108,
-0.08066902309656143,
0.11388365924358368,
-0.03277508541941643,
-0.03910188004374504,
0.04871850088238716,
-0.0624738447368145,
-0.08129706978797913,
-0.1235007792711258,
-0.0012866369215771556,
-0.03369726985692978,
0.013921725563704967,
-0.09542065113782883,
-0.10308107733726501,
-0.09478253126144409,
0.14712588489055634,
-0.1134473979473114,
-0.038834501057863235,
-0.15528196096420288,
0.10360345989465714,
0.1447896659374237,
-0.08216556906700134,
0.0616331622004509,
-0.008069139905273914,
0.13000211119651794,
0.037839148193597794,
-0.04445766285061836,
0.11435864120721817,
-0.09536700695753098,
-0.2299543172121048,
-0.056969087570905685,
0.11265071481466293,
0.0405983068048954,
0.059508636593818665,
-0.023616861552000046,
0.02295227348804474,
-0.014790303073823452,
-0.09771544486284256,
0.05837317556142807,
0.041056904941797256,
0.04123670980334282,
0.017108453437685966,
-0.03845193237066269,
0.0385725274682045,
-0.028588788583874702,
-0.03275656700134277,
0.1045311838388443,
0.27668526768684387,
-0.11833974719047546,
0.020922206342220306,
0.030990730971097946,
-0.04674448072910309,
-0.18056756258010864,
0.01544331107288599,
0.10250384360551834,
0.02424062043428421,
0.03335884213447571,
-0.1732032597064972,
0.10613811016082764,
0.08786498755216599,
-0.023607028648257256,
0.10059678554534912,
-0.28801602125167847,
-0.12166237831115723,
0.09397689998149872,
0.13419689238071442,
-0.03967813402414322,
-0.1672218292951584,
-0.05459770932793617,
-0.010101892985403538,
-0.07107273489236832,
0.08692353963851929,
0.0020339293405413628,
0.09883525222539902,
-0.0313623808324337,
-0.01643054001033306,
0.024051113054156303,
-0.0731007307767868,
0.15977846086025238,
-0.01220053993165493,
0.08937365561723709,
-0.034540943801403046,
0.016648292541503906,
-0.0014660321176052094,
-0.07792048901319504,
0.03708992898464203,
-0.11667905747890472,
0.05515766143798828,
-0.10234367847442627,
-0.015207058750092983,
-0.0776943489909172,
0.029842795804142952,
-0.051580119878053665,
-0.04388092830777168,
-0.042394060641527176,
0.04822278395295143,
0.07526268810033798,
-0.0025002460461109877,
0.13907064497470856,
0.014308981597423553,
0.10316695272922516,
0.11043526232242584,
0.05953893065452576,
0.0019138554343953729,
-0.10216884315013885,
-0.03520846366882324,
-0.008334203623235226,
0.049948763102293015,
-0.14820533990859985,
0.010737042874097824,
0.12997779250144958,
0.041450947523117065,
0.11546941846609116,
0.05117648094892502,
-0.05466906726360321,
-0.019861910492181778,
0.08367875218391418,
-0.10649348050355911,
-0.13163405656814575,
-0.023796990513801575,
0.006164076738059521,
-0.1615881472826004,
0.019734418019652367,
0.07105803489685059,
-0.06770303845405579,
0.006660067476332188,
0.0014945589937269688,
0.04876585677266121,
0.0035399694461375475,
0.19171105325222015,
0.0849151611328125,
0.08047249913215637,
-0.08659207820892334,
0.10526536405086517,
0.028508765622973442,
-0.13800063729286194,
0.02372496947646141,
0.06765804439783096,
-0.08088655024766922,
-0.01005338504910469,
0.09170400351285934,
0.09658977389335632,
-0.02430126629769802,
-0.04518376663327217,
-0.1247469037771225,
-0.11843977868556976,
0.06800360232591629,
0.06671731173992157,
0.0667998194694519,
0.021197883412241936,
-0.005221130792051554,
0.029406728222966194,
-0.1078258603811264,
0.13625623285770416,
0.07416732609272003,
0.09965053200721741,
-0.19104371964931488,
0.09056923538446426,
0.011925418861210346,
0.006649661809206009,
-0.014620078727602959,
0.05230679363012314,
-0.12294607609510422,
-0.029186954721808434,
-0.06602035462856293,
0.006430784240365028,
-0.07041899114847183,
0.007327499333769083,
0.0001292850502068177,
-0.050191354006528854,
-0.0395747646689415,
0.004598851781338453,
-0.09383632987737656,
-0.06130542978644371,
0.0009742403635755181,
0.05945093184709549,
-0.09852756559848785,
-0.015430853702127934,
0.03855089470744133,
-0.11982092261314392,
0.0890805721282959,
0.011413653381168842,
0.04547789320349693,
0.013540967367589474,
-0.08969093859195709,
0.03290634974837303,
0.04763522744178772,
-0.0015138519229367375,
0.02544497512280941,
-0.1311316192150116,
-0.004199587740004063,
-0.049615528434515,
-0.005427768919616938,
-0.019878000020980835,
0.04283295199275017,
-0.13693514466285706,
-0.000481033610412851,
-0.06016315892338753,
-0.05202636867761612,
-0.06260208040475845,
0.05077974125742912,
0.07113778591156006,
-0.016693998128175735,
0.16878268122673035,
-0.07372672855854034,
0.03835239261388779,
-0.2400226891040802,
-0.0011370016727596521,
-0.012157930992543697,
-0.06427918374538422,
-0.08258844166994095,
-0.01028454303741455,
0.078823022544384,
-0.049347735941410065,
0.09705895185470581,
-0.034084614366292953,
0.018134238198399544,
0.026300357654690742,
-0.037379223853349686,
0.048201993107795715,
0.04849008098244667,
0.20245914161205292,
0.019397368654608727,
-0.014919322915375233,
0.0654066726565361,
0.01786438189446926,
0.08371171355247498,
0.057042889297008514,
0.15362733602523804,
0.151055246591568,
-0.05364006757736206,
0.10563323646783829,
0.046953991055488586,
-0.11987686157226562,
-0.15477071702480316,
0.14639663696289062,
-0.07227242738008499,
0.12954439222812653,
-0.02200929820537567,
0.17868253588676453,
0.11883239448070526,
-0.20316433906555176,
0.0068444013595581055,
-0.014320353046059608,
-0.08285009115934372,
-0.09389149397611618,
-0.09835892170667648,
-0.09040027111768723,
-0.1723771095275879,
0.016508139669895172,
-0.10418394207954407,
0.007104028481990099,
0.07421868294477463,
0.024059578776359558,
0.022241003811359406,
0.16334792971611023,
0.059300631284713745,
0.02531874179840088,
0.06235603988170624,
0.05058920383453369,
-0.04275691136717796,
-0.02928183786571026,
-0.08716171979904175,
0.020074423402547836,
-0.023553336039185524,
0.03942134231328964,
-0.06527470052242279,
-0.06745055317878723,
0.08619115501642227,
0.04711531847715378,
-0.09917964786291122,
0.02291444130241871,
-0.021910401061177254,
0.043089453130960464,
0.06111874058842659,
0.01174522377550602,
0.00837349146604538,
-0.04771869257092476,
0.20122197270393372,
-0.09451166540384293,
-0.011327581480145454,
-0.11251940578222275,
0.16732586920261383,
-0.012114852666854858,
-0.008062135428190231,
0.03149892017245293,
-0.08946976065635681,
-0.0052291275933384895,
0.15501248836517334,
0.15304133296012878,
-0.04197708144783974,
-0.023339563980698586,
0.01739748939871788,
-0.016512321308255196,
-0.04304489493370056,
0.08935808390378952,
0.09349123388528824,
0.05113459378480911,
-0.06809736043214798,
-0.04580637440085411,
-0.04180990159511566,
-0.0603671558201313,
-0.032731831073760986,
0.05920489504933357,
0.03628996014595032,
-0.008233729749917984,
-0.04537268728017807,
0.07955510914325714,
-0.03850400447845459,
-0.12242899090051651,
0.09281294792890549,
-0.1775585263967514,
-0.17219190299510956,
-0.03232305496931076,
0.08557991683483124,
0.016829941421747208,
0.045333534479141235,
-0.0011318877805024385,
-0.018777577206492424,
0.09369762241840363,
-0.0029286271892488003,
-0.08478975296020508,
-0.08688928186893463,
0.043406348675489426,
-0.05073212459683418,
0.23768571019172668,
-0.02948596701025963,
0.008706144988536835,
0.12846875190734863,
0.03390855714678764,
-0.13459262251853943,
0.014444009400904179,
0.07576601207256317,
-0.097775399684906,
0.049319762736558914,
0.15176492929458618,
-0.0253209937363863,
0.11530710756778717,
0.040977369993925095,
-0.09970629960298538,
-0.008158843964338303,
-0.0797295942902565,
-0.056384239345788956,
-0.060111697763204575,
0.006181841716170311,
-0.03717653453350067,
0.15856796503067017,
0.2005574107170105,
-0.06086646392941475,
-0.03424602001905441,
-0.04668880254030228,
0.03689827024936676,
0.0476611964404583,
0.0855749100446701,
0.005430721677839756,
-0.23243774473667145,
0.028367208316922188,
-0.02746983803808689,
0.018881861120462418,
-0.19169846177101135,
-0.0903862938284874,
0.017965326085686684,
-0.051880646497011185,
-0.09727723896503448,
0.1023438423871994,
0.07495777308940887,
0.04877745360136032,
-0.05944102630019188,
-0.03797846660017967,
-0.05261421948671341,
0.1591472029685974,
-0.1668727695941925,
-0.07708519697189331
] |
null | null | stable-baselines3 |
# **DQN** Agent playing **SpaceInvadersNoFrameskip-v4**
This is a trained model of a **DQN** agent playing **SpaceInvadersNoFrameskip-v4**
using the [stable-baselines3 library](https://github.com/DLR-RM/stable-baselines3)
and the [RL Zoo](https://github.com/DLR-RM/rl-baselines3-zoo).
The RL Zoo is a training framework for Stable Baselines3
reinforcement learning agents,
with hyperparameter optimization and pre-trained agents included.
## Usage (with SB3 RL Zoo)
RL Zoo: https://github.com/DLR-RM/rl-baselines3-zoo<br/>
SB3: https://github.com/DLR-RM/stable-baselines3<br/>
SB3 Contrib: https://github.com/Stable-Baselines-Team/stable-baselines3-contrib
Install the RL Zoo (with SB3 and SB3-Contrib):
```bash
pip install rl_zoo3
```
```bash
# Download model and save it into the logs/ folder
python -m rl_zoo3.load_from_hub --algo dqn --env SpaceInvadersNoFrameskip-v4 -orga mathreader -f logs/
python -m rl_zoo3.enjoy --algo dqn --env SpaceInvadersNoFrameskip-v4 -f logs/
```
If you installed the RL Zoo3 via pip (`pip install rl_zoo3`), from anywhere you can do:
```bash
python -m rl_zoo3.load_from_hub --algo dqn --env SpaceInvadersNoFrameskip-v4 -orga mathreader -f logs/
python -m rl_zoo3.enjoy --algo dqn --env SpaceInvadersNoFrameskip-v4 -f logs/
```
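If you prefer to skip the zoo scripts entirely, the checkpoint can also be loaded with plain SB3. The sketch below is an assumption based on the zoo's default `logs/` layout and the hyperparameters listed further down (AtariWrapper preprocessing plus a 4-frame stack); the exact checkpoint path may differ on your machine.

```python
# Minimal sketch: run the downloaded checkpoint with plain SB3 (no zoo scripts).
# Requires gymnasium[atari] and the Atari ROMs; the path below is an assumed
# default from the zoo's logs/ layout, adjust as needed.
import gymnasium as gym
from stable_baselines3 import DQN
from stable_baselines3.common.atari_wrappers import AtariWrapper
from stable_baselines3.common.vec_env import DummyVecEnv, VecFrameStack

# Rebuild the training-time observation pipeline: Atari preprocessing + 4-frame
# stack (matching the env_wrapper and frame_stack hyperparameters listed below).
env = DummyVecEnv([lambda: AtariWrapper(gym.make("SpaceInvadersNoFrameskip-v4"))])
env = VecFrameStack(env, n_stack=4)

model = DQN.load(
    "logs/dqn/SpaceInvadersNoFrameskip-v4_1/SpaceInvadersNoFrameskip-v4.zip",  # assumed path
    env=env,
)

obs = env.reset()
for _ in range(1_000):
    action, _state = model.predict(obs, deterministic=True)
    obs, rewards, dones, infos = env.step(action)
```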
## Training (with the RL Zoo)
```bash
python -m rl_zoo3.train --algo dqn --env SpaceInvadersNoFrameskip-v4 -f logs/
# Upload the model and generate video (when possible)
python -m rl_zoo3.push_to_hub --algo dqn --env SpaceInvadersNoFrameskip-v4 -f logs/ -orga mathreader
```
## Hyperparameters
```python
OrderedDict([('batch_size', 32),
('buffer_size', 100000),
('env_wrapper',
['stable_baselines3.common.atari_wrappers.AtariWrapper']),
('exploration_final_eps', 0.01),
('exploration_fraction', 0.1),
('frame_stack', 4),
('gradient_steps', 1),
('learning_rate', 0.0001),
('learning_starts', 100000),
('n_timesteps', 1000000.0),
('optimize_memory_usage', False),
('policy', 'CnnPolicy'),
('target_update_interval', 1000),
('train_freq', 4),
('normalize', False)])
```
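Outside the zoo, the same configuration corresponds roughly to the following plain-SB3 call. This is a sketch only: the zoo additionally handles seeding, evaluation callbacks, and logging, which are omitted here.

```python
# Sketch: the hyperparameters above expressed as a direct SB3 DQN run.
import gymnasium as gym
from stable_baselines3 import DQN
from stable_baselines3.common.atari_wrappers import AtariWrapper
from stable_baselines3.common.vec_env import DummyVecEnv, VecFrameStack

env = VecFrameStack(
    DummyVecEnv([lambda: AtariWrapper(gym.make("SpaceInvadersNoFrameskip-v4"))]),
    n_stack=4,  # frame_stack
)

model = DQN(
    "CnnPolicy",
    env,
    learning_rate=1e-4,
    buffer_size=100_000,
    learning_starts=100_000,
    batch_size=32,
    train_freq=4,
    gradient_steps=1,
    target_update_interval=1000,
    exploration_fraction=0.1,
    exploration_final_eps=0.01,
    optimize_memory_usage=False,
)
model.learn(total_timesteps=1_000_000)
```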
# Environment Arguments
```python
{'render_mode': 'rgb_array'}
```
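These arguments are forwarded to the environment constructor when the agent is replayed, i.e. roughly the following (an assumption about the zoo's internals, which pass `env_kwargs` through to `gym.make`):

```python
# Sketch: the environment arguments above applied at construction time.
import gymnasium as gym

env = gym.make("SpaceInvadersNoFrameskip-v4", render_mode="rgb_array")
```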
| {"library_name": "stable-baselines3", "tags": ["SpaceInvadersNoFrameskip-v4", "deep-reinforcement-learning", "reinforcement-learning", "stable-baselines3"], "model-index": [{"name": "DQN", "results": [{"task": {"type": "reinforcement-learning", "name": "reinforcement-learning"}, "dataset": {"name": "SpaceInvadersNoFrameskip-v4", "type": "SpaceInvadersNoFrameskip-v4"}, "metrics": [{"type": "mean_reward", "value": "637.00 +/- 120.48", "name": "mean_reward", "verified": false}]}]}]} | reinforcement-learning | mathreader/dqn-SpaceInvadersNoFrameskip-v4-v2 | [
"stable-baselines3",
"SpaceInvadersNoFrameskip-v4",
"deep-reinforcement-learning",
"reinforcement-learning",
"model-index",
"region:us"
] | 2024-02-13T10:33:12+00:00 | [] | [] | TAGS
#stable-baselines3 #SpaceInvadersNoFrameskip-v4 #deep-reinforcement-learning #reinforcement-learning #model-index #region-us
|
# DQN Agent playing SpaceInvadersNoFrameskip-v4
This is a trained model of a DQN agent playing SpaceInvadersNoFrameskip-v4
using the stable-baselines3 library
and the RL Zoo.
The RL Zoo is a training framework for Stable Baselines3
reinforcement learning agents,
with hyperparameter optimization and pre-trained agents included.
## Usage (with SB3 RL Zoo)
RL Zoo: URL
SB3: URL
SB3 Contrib: URL
Install the RL Zoo (with SB3 and SB3-Contrib):
If you installed the RL Zoo3 via pip ('pip install rl_zoo3'), from anywhere you can do:
## Training (with the RL Zoo)
## Hyperparameters
# Environment Arguments
| [
"# DQN Agent playing SpaceInvadersNoFrameskip-v4\nThis is a trained model of a DQN agent playing SpaceInvadersNoFrameskip-v4\nusing the stable-baselines3 library\nand the RL Zoo.\n\nThe RL Zoo is a training framework for Stable Baselines3\nreinforcement learning agents,\nwith hyperparameter optimization and pre-trained agents included.",
"## Usage (with SB3 RL Zoo)\n\nRL Zoo: URL\nSB3: URL\nSB3 Contrib: URL\n\nInstall the RL Zoo (with SB3 and SB3-Contrib):\n\n\n\n\nIf you installed the RL Zoo3 via pip ('pip install rl_zoo3'), from anywhere you can do:",
"## Training (with the RL Zoo)",
"## Hyperparameters",
"# Environment Arguments"
] | [
"TAGS\n#stable-baselines3 #SpaceInvadersNoFrameskip-v4 #deep-reinforcement-learning #reinforcement-learning #model-index #region-us \n",
"# DQN Agent playing SpaceInvadersNoFrameskip-v4\nThis is a trained model of a DQN agent playing SpaceInvadersNoFrameskip-v4\nusing the stable-baselines3 library\nand the RL Zoo.\n\nThe RL Zoo is a training framework for Stable Baselines3\nreinforcement learning agents,\nwith hyperparameter optimization and pre-trained agents included.",
"## Usage (with SB3 RL Zoo)\n\nRL Zoo: URL\nSB3: URL\nSB3 Contrib: URL\n\nInstall the RL Zoo (with SB3 and SB3-Contrib):\n\n\n\n\nIf you installed the RL Zoo3 via pip ('pip install rl_zoo3'), from anywhere you can do:",
"## Training (with the RL Zoo)",
"## Hyperparameters",
"# Environment Arguments"
] | [
43,
90,
73,
9,
5,
7
] | [
"passage: TAGS\n#stable-baselines3 #SpaceInvadersNoFrameskip-v4 #deep-reinforcement-learning #reinforcement-learning #model-index #region-us \n# DQN Agent playing SpaceInvadersNoFrameskip-v4\nThis is a trained model of a DQN agent playing SpaceInvadersNoFrameskip-v4\nusing the stable-baselines3 library\nand the RL Zoo.\n\nThe RL Zoo is a training framework for Stable Baselines3\nreinforcement learning agents,\nwith hyperparameter optimization and pre-trained agents included.## Usage (with SB3 RL Zoo)\n\nRL Zoo: URL\nSB3: URL\nSB3 Contrib: URL\n\nInstall the RL Zoo (with SB3 and SB3-Contrib):\n\n\n\n\nIf you installed the RL Zoo3 via pip ('pip install rl_zoo3'), from anywhere you can do:## Training (with the RL Zoo)## Hyperparameters# Environment Arguments"
] | [
0.043572068214416504,
0.2414778620004654,
-0.0026879787910729647,
0.012635791674256325,
0.05784223601222038,
0.0030472534708678722,
0.08585051447153091,
0.10650663822889328,
0.024212315678596497,
-0.001382096204906702,
0.003954293206334114,
0.17533031105995178,
0.03632635250687599,
0.13125447928905487,
-0.018073517829179764,
-0.2066594809293747,
-0.013479253277182579,
-0.06247470900416374,
-0.07153085619211197,
0.036099132150411606,
0.07206681370735168,
-0.030116932466626167,
0.036061208695173264,
-0.051406677812337875,
-0.057161085307598114,
0.036824777722358704,
-0.03157254680991173,
0.007067287806421518,
0.15158706903457642,
-0.1222257912158966,
0.12329676002264023,
0.020955175161361694,
0.1896144151687622,
-0.12332789599895477,
0.0339222252368927,
0.08982209116220474,
-0.036988191306591034,
0.013221588917076588,
0.00975361280143261,
-0.052562564611434937,
0.1590864509344101,
-0.09371145814657211,
0.07146181166172028,
0.010926910676062107,
-0.07592244446277618,
-0.1774153709411621,
-0.09356249868869781,
0.07947742193937302,
0.0617753230035305,
0.005319166928529739,
0.03726791962981224,
0.11306490749120712,
-0.020991774275898933,
0.06488905102014542,
0.11562903225421906,
-0.17549200356006622,
0.013578375801444054,
0.17859570682048798,
0.003242473118007183,
0.15767055749893188,
-0.05546637624502182,
0.019877681508660316,
0.02752300351858139,
0.04758313298225403,
0.06873945891857147,
-0.08186400681734085,
-0.1364826112985611,
-0.056155186146497726,
-0.15456219017505646,
-0.03352400287985802,
0.05195203423500061,
-0.011860138736665249,
-0.05783402919769287,
-0.010724928230047226,
-0.04010869935154915,
0.0008851495804265141,
-0.028637725859880447,
0.01805497519671917,
0.07031578570604324,
-0.01226285845041275,
0.02092539705336094,
-0.08391954004764557,
-0.0390290804207325,
-0.038563769310712814,
-0.018022390082478523,
0.12054917961359024,
0.08285853266716003,
0.0266572255641222,
-0.04135355353355408,
0.10274127870798111,
-0.07091585546731949,
-0.05454207584261894,
0.04555258899927139,
-0.03786851093173027,
-0.10615779459476471,
0.02120024710893631,
-0.05905991420149803,
0.026879185810685158,
0.09943640232086182,
0.18048083782196045,
-0.09862488508224487,
0.012620617635548115,
-0.03430783003568649,
0.08121664822101593,
-0.03196052461862564,
0.03197542577981949,
-0.0840383991599083,
-0.016251085326075554,
0.17835216224193573,
0.0030782297253608704,
0.022272996604442596,
0.002074616262689233,
-0.049819961190223694,
-0.02881433069705963,
-0.017756454646587372,
0.06631895154714584,
0.07032092660665512,
0.010587303899228573,
-0.0037596761249005795,
-0.027667716145515442,
-0.036921944469213486,
-0.05629328638315201,
-0.04952820762991905,
0.018803736194968224,
-0.04712437093257904,
-0.047942135483026505,
0.06027210131287575,
-0.005624116864055395,
0.11337806284427643,
-0.025607796385884285,
0.026316547766327858,
-0.019410157576203346,
-0.07494441419839859,
-0.13221681118011475,
-0.0304415225982666,
0.0691632330417633,
0.04371757060289383,
-0.22497159242630005,
-0.16994807124137878,
-0.008539012633264065,
0.017946386709809303,
-0.018741264939308167,
-0.11334165185689926,
0.02453240379691124,
-0.007166135590523481,
-0.049758363515138626,
-0.01601579785346985,
0.10474669933319092,
-0.020438622683286667,
0.018010856583714485,
-0.05593825876712799,
0.16603368520736694,
-0.14290283620357513,
0.031004127115011215,
-0.08706212788820267,
0.023509707301855087,
-0.21286657452583313,
0.041208744049072266,
-0.177636057138443,
0.04863585904240608,
-0.08500861376523972,
0.02327173389494419,
0.021320728585124016,
0.01968831568956375,
0.08580207824707031,
0.10143322497606277,
-0.23631145060062408,
0.05405791476368904,
0.07900930196046829,
-0.022739801555871964,
-0.04218491166830063,
0.06798892468214035,
-0.06558530032634735,
0.1382148116827011,
0.046505436301231384,
0.24831900000572205,
0.10361487418413162,
-0.2036508023738861,
0.061786454170942307,
0.0578593946993351,
-0.08880111575126648,
-0.004730981774628162,
-0.020022382959723473,
0.11598580330610275,
-0.01114928349852562,
0.03338807821273804,
-0.12186288088560104,
0.1456439197063446,
0.02738998830318451,
-0.0165485180914402,
-0.04454165697097778,
-0.1614885926246643,
0.10309953987598419,
-0.015504824928939342,
0.09532155096530914,
-0.042415786534547806,
0.0001161050095106475,
-0.011168917641043663,
0.18012429773807526,
-0.043841805309057236,
0.0007168867159634829,
0.07871408760547638,
0.10895700752735138,
0.028009075671434402,
-0.020230965688824654,
-0.20380273461341858,
-0.0423048660159111,
0.02367858961224556,
0.044489551335573196,
0.2190362960100174,
0.19936694204807281,
0.07770156860351562,
-0.022313760593533516,
-0.025487221777439117,
-0.003248062450438738,
-0.05106664076447487,
0.03467361256480217,
-0.027858436107635498,
-0.024532482028007507,
0.06065356358885765,
-0.09305168688297272,
0.02817818708717823,
-0.13112716376781464,
0.06307920068502426,
-0.17345242202281952,
0.06863926351070404,
0.021998396143317223,
-0.005436043255031109,
0.024577690288424492,
-0.011292695067822933,
-0.034188106656074524,
-0.06233125180006027,
0.07110602408647537,
0.06098933145403862,
0.014702376909554005,
0.0021991983521729708,
-0.0683600977063179,
-0.13828523457050323,
0.08231553435325623,
-0.04042381793260574,
-0.14305958151817322,
0.06392676383256912,
0.011172642931342125,
0.04875864461064339,
-0.05975872278213501,
0.016254881396889687,
0.22900153696537018,
0.05321883037686348,
0.09785865992307663,
-0.04092191904783249,
-0.022525805979967117,
-0.06617844104766846,
-0.06677833944559097,
0.09694591909646988,
0.10812206566333771,
0.060318704694509506,
-0.0030071530491113663,
0.07626225054264069,
0.10942911356687546,
-0.1035122498869896,
-0.0651884600520134,
0.03220061957836151,
-0.05973697826266289,
0.019652515649795532,
0.049140311777591705,
0.02971293032169342,
0.08619047701358795,
0.1833551675081253,
0.008245792239904404,
0.0386311337351799,
-0.025997694581747055,
0.026109617203474045,
-0.15547916293144226,
-0.03145433962345123,
0.04308181628584862,
0.00886955764144659,
-0.07408110797405243,
0.04994636029005051,
0.051439400762319565,
0.13607151806354523,
-0.08217083662748337,
-0.13170577585697174,
-0.059745315462350845,
-0.03804200142621994,
-0.04239124804735184,
0.14975430071353912,
-0.08507520705461502,
-0.19221234321594238,
-0.017164425924420357,
-0.15751953423023224,
-0.02518727444112301,
-0.005179801490157843,
0.002318724524229765,
-0.08325926214456558,
0.017780914902687073,
0.010001576505601406,
-0.03129372000694275,
-0.0684933215379715,
-0.06596160680055618,
-0.05786636844277382,
0.09124112874269485,
0.06932931393384933,
-0.12240120023488998,
-0.00961651187390089,
-0.03742414712905884,
-0.020465577021241188,
0.04516167193651199,
0.08452648669481277,
-0.007267598994076252,
0.07773483544588089,
-0.13209199905395508,
-0.06962883472442627,
0.02834828943014145,
0.2766247093677521,
0.02882981114089489,
0.004668009467422962,
0.17051753401756287,
-0.03629542142152786,
0.04912714660167694,
0.16181479394435883,
0.030781643465161324,
-0.14196757972240448,
0.07090470939874649,
-0.011341600678861141,
-0.09542687982320786,
-0.1706860214471817,
-0.10215658694505692,
-0.037867411971092224,
-0.05015881359577179,
0.05638284236192703,
0.004951419774442911,
-0.04476970434188843,
0.05910305306315422,
0.08782228082418442,
-0.017004497349262238,
-0.06151578947901726,
0.11129767447710037,
0.032263003289699554,
-0.030136963352560997,
0.08078382909297943,
-0.042354047298431396,
-0.04206389561295509,
0.0032403599470853806,
0.22643887996673584,
0.0937788337469101,
-0.01775507442653179,
-0.042567066848278046,
0.019317636266350746,
0.05095715448260307,
0.03613382205367088,
0.11312435567378998,
-0.06975842267274857,
-0.06826137751340866,
-0.035185977816581726,
0.027829548344016075,
-0.02945687249302864,
0.08205190300941467,
0.0630207508802414,
0.005563626065850258,
-0.04653681069612503,
-0.07972332090139389,
-0.04849022626876831,
0.08408913016319275,
-0.027642227709293365,
-0.10093270242214203,
0.09321888536214828,
0.048575710505247116,
0.0016974330646917224,
0.03055831417441368,
0.027994604781270027,
0.01462269201874733,
-0.07982148975133896,
-0.06775744259357452,
0.011468625627458096,
0.07076629996299744,
-0.06822766363620758,
-0.027886953204870224,
-0.19817815721035004,
0.14578363299369812,
0.010630400851368904,
0.04118429124355316,
-0.13048617541790009,
0.1209396943449974,
-0.023116756230592728,
-0.026430301368236542,
0.013811616227030754,
0.0014643745962530375,
0.08203291147947311,
-0.04806509613990784,
0.15762180089950562,
0.009528410620987415,
-0.28092408180236816,
-0.1418946087360382,
-0.08416824042797089,
-0.051183976233005524,
-0.022873088717460632,
0.014752174727618694,
0.0642135739326477,
0.01516205258667469,
0.003868846921250224,
-0.013076163828372955,
0.03185269236564636,
-0.09826882928609848,
-0.06493937969207764,
-0.04839126765727997,
-0.02250157669186592,
-0.06525848805904388,
-0.05647949501872063,
-0.0006809153710491955,
-0.17226077616214752,
0.12522587180137634,
0.11787347495555878,
-0.06451737880706787,
-0.041814323514699936,
-0.06554657220840454,
0.046191465109586716,
-0.07571537792682648,
0.0469326451420784,
0.003414976177737117,
0.019198855385184288,
-0.06806991249322891,
-0.17922484874725342,
0.016097763553261757,
-0.10899919271469116,
0.03772687539458275,
-0.05070559307932854,
0.020257100462913513,
0.08594245463609695,
0.17520126700401306,
0.05856714025139809,
0.01460097823292017,
-0.07239776104688644,
-0.07543374598026276,
-0.0017121878918260336,
-0.06344114243984222,
0.05762333422899246,
-0.009151889942586422,
-0.20333483815193176,
0.02763226442039013,
-0.11414948850870132,
0.06860900670289993,
0.3310066759586334,
0.3324824273586273,
-0.10698744654655457,
0.1177443116903305,
0.04819539934396744,
-0.042202454060316086,
-0.21051374077796936,
-0.002244179602712393,
0.012272895313799381,
0.024992236867547035,
0.13725964725017548,
-0.12924811244010925,
0.05453680083155632,
0.0794181227684021,
-0.024458877742290497,
0.01456840243190527,
-0.09078162908554077,
-0.10816970467567444,
0.20847418904304504,
0.14226987957954407,
0.04421741142868996,
-0.09421348571777344,
0.08391669392585754,
0.004295284394174814,
0.08375877887010574,
0.2107764035463333,
-0.052112679928541183,
0.10695768147706985,
0.005195184610784054,
0.19852910935878754,
0.0328996516764164,
-0.023768596351146698,
0.10834760218858719,
-0.009801650419831276,
0.07911337912082672,
0.03985166177153587,
-0.007676942739635706,
0.010487722232937813,
-0.04522453248500824,
0.014148596674203873,
-0.028376007452607155,
0.010284217074513435,
-0.2274095118045807,
0.0582297146320343,
-0.06368855386972427,
0.04604509472846985,
0.008256820961833,
-0.0999874547123909,
-0.03583388403058052,
0.06431841105222702,
0.08014573156833649,
0.01975327916443348,
0.0436067171394825,
-0.03867863491177559,
0.11051398515701294,
0.20660489797592163,
-0.009811338968575,
0.17751595377922058,
-0.0615963339805603,
0.01464168168604374,
-0.023011628538370132,
-0.04223164543509483,
-0.1462583988904953,
-0.035259708762168884,
0.03498423472046852,
0.057734888046979904,
0.015203364193439484,
0.049647457897663116,
-0.05656236410140991,
0.08498423546552658,
0.021687336266040802,
-0.041541360318660736,
0.033579520881175995,
0.08835696429014206,
0.12415177375078201,
0.010754258371889591,
-0.030121933668851852,
0.06147436052560806,
-0.08128108084201813,
-0.09446098655462265,
-0.004497923422604799,
-0.029991207644343376,
-0.1083834245800972,
0.11353230476379395,
0.16914646327495575,
0.039594944566488266,
-0.057076629251241684,
0.10688766092061996,
-0.02768099494278431,
0.10047874599695206,
0.009198128245770931,
0.06507332623004913,
-0.014091075398027897,
-0.03691792115569115,
0.10611724853515625,
-0.05442855879664421,
-0.01637818105518818,
0.07645545154809952,
-0.06522727757692337,
-0.023877469822764397,
-0.0801999643445015,
0.06034626066684723,
0.09222240000963211,
-0.16854619979858398,
-0.0639432892203331,
-0.032122284173965454,
-0.08628080040216446,
0.013965039514005184,
0.012447911314666271,
0.0710059329867363,
-0.08589600026607513,
0.06316167116165161,
-0.024337708950042725,
0.015639442950487137,
-0.03689891844987869,
0.019222697243094444,
-0.19525384902954102,
-0.002140450058504939,
-0.11280795186758041,
-0.00348020251840353,
-0.002931603929027915,
0.04463808611035347,
-0.04961875081062317,
-0.029358822852373123,
-0.0030675032176077366,
0.044366419315338135,
-0.16609135270118713,
0.002798673929646611,
-0.011639905162155628,
0.03210212290287018,
-0.0002893915225286037,
-0.0983390137553215,
0.014195028692483902,
-0.04294256120920181,
-0.04198618605732918,
0.04925514757633209,
0.009436776861548424,
0.06470516324043274,
-0.2795179784297943,
-0.14905457198619843,
0.030816160142421722,
0.0683867484331131,
0.05483196675777435,
-0.1830425262451172,
0.03568267077207565,
-0.08042316138744354,
-0.02253127470612526,
-0.037770628929138184,
0.018491698428988457,
-0.0539514496922493,
0.0018174031283706427,
-0.04225044324994087,
-0.023033907637000084,
-0.028055014088749886,
-0.07556360960006714,
0.0826747715473175,
0.12462522834539413,
0.07555580884218216,
-0.03807181864976883,
0.09595896303653717,
-0.10009756684303284,
-0.04657831788063049,
-0.04052736237645149,
-0.036951083689928055,
0.017965637147426605,
-0.0870552659034729,
0.048530060797929764,
0.05188591405749321,
0.18719671666622162,
-0.08520494401454926,
-0.058800119906663895,
-0.014255574904382229,
0.0746525228023529,
0.07849094271659851,
0.005095830652862787,
0.17779210209846497,
-0.045693784952163696,
0.05693846940994263,
0.021304311230778694,
0.046699028462171555,
0.10497613251209259,
-0.023569339886307716,
0.14490213990211487,
0.21171095967292786,
-0.037196725606918335,
-0.11048602312803268,
0.043668005615472794,
0.01745123788714409,
-0.002401199424639344,
0.05968761444091797,
0.11983796209096909,
-0.050589341670274734,
-0.10903856158256531,
0.23442286252975464,
0.054169271141290665,
-0.11218088120222092,
0.09546315670013428,
0.039532262831926346,
-0.015890996903181076,
-0.1301896870136261,
0.010444961488246918,
-0.0013640925753861666,
-0.11233190447092056,
0.03386834263801575,
-0.06087532266974449,
-0.025547027587890625,
0.11809267848730087,
0.008789865300059319,
0.03317064419388771,
-0.04139537364244461,
-0.03756232187151909,
-0.04352104663848877,
-0.04273213446140289,
-0.012549578212201595,
-0.02991986647248268,
-0.030186517164111137,
-0.07621737569570541,
-0.007770835887640715,
-0.012012424878776073,
0.030795488506555557,
-0.015285328030586243,
-0.02503054589033127,
-0.021192016080021858,
-0.06697061657905579,
-0.0026312144473195076,
-0.008178025484085083,
0.015549594536423683,
0.010121971368789673,
0.2358063906431198,
0.07042546570301056,
-0.10260069370269775,
-0.01036880537867546,
0.22197756171226501,
-0.03853277862071991,
-0.06528383493423462,
-0.07849395275115967,
0.25128230452537537,
-0.10482002794742584,
0.051095426082611084,
-0.005819917656481266,
-0.06550488620996475,
-0.07153836637735367,
0.2309868484735489,
0.13502730429172516,
-0.1677926480770111,
0.06329060345888138,
-0.0368385910987854,
-0.009490780532360077,
-0.14286863803863525,
0.16013580560684204,
0.1865294873714447,
0.09480160474777222,
-0.12259847670793533,
0.0023130534682422876,
-0.03518044203519821,
-0.018328361213207245,
-0.1660851687192917,
-0.004593863617628813,
-0.029364850372076035,
-0.0427238829433918,
-0.050771355628967285,
0.029773715883493423,
-0.15205919742584229,
-0.0927426889538765,
-0.1916799396276474,
-0.11482496559619904,
-0.12386849522590637,
-0.04549141973257065,
-0.11142764985561371,
-0.0019938007462769747,
0.02257080189883709,
-0.0641874223947525,
0.021061956882476807,
-0.0212461706250906,
-0.05887424945831299,
0.015386379323899746,
-0.08395619690418243,
0.0674985870718956,
0.06488548219203949,
0.15327942371368408,
-0.0790991559624672,
0.025424562394618988,
0.07090727984905243,
-0.057595450431108475,
-0.10164349526166916,
0.06067253649234772,
0.015708057209849358,
-0.1972588747739792,
0.007548294495791197,
0.17712996900081635,
-0.10420889407396317,
0.09745754301548004,
0.048501528799533844,
-0.012951982207596302,
0.0867827981710434,
-0.024721821770071983,
-0.016682926565408707,
-0.04852180927991867,
-0.011212974786758423,
-0.10143939405679703,
0.09892100840806961,
0.0876845121383667,
-0.0517118014395237,
0.07436849176883698,
-0.09508965909481049,
-0.04068392515182495,
0.13103286921977997,
-0.010057874955236912,
-0.08450483530759811,
-0.11667824536561966,
-0.04081142693758011,
0.09684515744447708,
-0.018041390925645828,
-0.20185889303684235,
-0.11639472097158432,
-0.11752668023109436,
-0.00014377340266946703,
-0.03563340753316879,
0.061800602823495865,
0.02430674433708191,
-0.02556120604276657,
-0.008150683715939522,
-0.17615078389644623,
-0.06614746153354645,
0.13479791581630707,
-0.10176112502813339,
-0.07456064969301224
] |
null | null | transformers |
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
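The card leaves this section blank. As a placeholder, here is a minimal sketch assuming the standard 🤗 Transformers causal-LM API — only the repo id is taken from this card's metadata; everything else is a generic assumption, not an author-provided recipe.

```python
# Sketch only: generic text-generation loading, not the authors' documented usage.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "bardsai/jaskier-7b-dpo-v3.1"  # repo id from this card's metadata
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

inputs = tokenizer("Hello, my name is", return_tensors="pt")
output_ids = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```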
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed]
| {"library_name": "transformers", "tags": []} | text-generation | bardsai/jaskier-7b-dpo-v3.1 | [
"transformers",
"safetensors",
"mistral",
"text-generation",
"arxiv:1910.09700",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | 2024-02-13T10:33:26+00:00 | [
"1910.09700"
] | [] | TAGS
#transformers #safetensors #mistral #text-generation #arxiv-1910.09700 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
# Model Card for Model ID
## Model Details
### Model Description
This is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.
- Developed by:
- Funded by [optional]:
- Shared by [optional]:
- Model type:
- Language(s) (NLP):
- License:
- Finetuned from model [optional]:
### Model Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Downstream Use [optional]
### Out-of-Scope Use
## Bias, Risks, and Limitations
### Recommendations
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
## Training Details
### Training Data
### Training Procedure
#### Preprocessing [optional]
#### Training Hyperparameters
- Training regime:
#### Speeds, Sizes, Times [optional]
## Evaluation
### Testing Data, Factors & Metrics
#### Testing Data
#### Factors
#### Metrics
### Results
#### Summary
## Model Examination [optional]
## Environmental Impact
Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).
- Hardware Type:
- Hours used:
- Cloud Provider:
- Compute Region:
- Carbon Emitted:
## Technical Specifications [optional]
### Model Architecture and Objective
### Compute Infrastructure
#### Hardware
#### Software
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Model Card Authors [optional]
## Model Card Contact
| [
"# Model Card for Model ID",
"## Model Details",
"### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:",
"### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Downstream Use [optional]",
"### Out-of-Scope Use",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.",
"## How to Get Started with the Model\n\nUse the code below to get started with the model.",
"## Training Details",
"### Training Data",
"### Training Procedure",
"#### Preprocessing [optional]",
"#### Training Hyperparameters\n\n- Training regime:",
"#### Speeds, Sizes, Times [optional]",
"## Evaluation",
"### Testing Data, Factors & Metrics",
"#### Testing Data",
"#### Factors",
"#### Metrics",
"### Results",
"#### Summary",
"## Model Examination [optional]",
"## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:",
"## Technical Specifications [optional]",
"### Model Architecture and Objective",
"### Compute Infrastructure",
"#### Hardware",
"#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Model Card Authors [optional]",
"## Model Card Contact"
] | [
"TAGS\n#transformers #safetensors #mistral #text-generation #arxiv-1910.09700 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n",
"# Model Card for Model ID",
"## Model Details",
"### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:",
"### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Downstream Use [optional]",
"### Out-of-Scope Use",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.",
"## How to Get Started with the Model\n\nUse the code below to get started with the model.",
"## Training Details",
"### Training Data",
"### Training Procedure",
"#### Preprocessing [optional]",
"#### Training Hyperparameters\n\n- Training regime:",
"#### Speeds, Sizes, Times [optional]",
"## Evaluation",
"### Testing Data, Factors & Metrics",
"#### Testing Data",
"#### Factors",
"#### Metrics",
"### Results",
"#### Summary",
"## Model Examination [optional]",
"## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:",
"## Technical Specifications [optional]",
"### Model Architecture and Objective",
"### Compute Infrastructure",
"#### Hardware",
"#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Model Card Authors [optional]",
"## Model Card Contact"
] | [
56,
6,
3,
82,
28,
3,
4,
9,
9,
10,
42,
20,
3,
4,
5,
9,
11,
13,
3,
12,
5,
4,
5,
3,
4,
9,
53,
9,
8,
6,
3,
14,
8,
7,
9,
4
] | [
"passage: TAGS\n#transformers #safetensors #mistral #text-generation #arxiv-1910.09700 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n# Model Card for Model ID## Model Details### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Downstream Use [optional]### Out-of-Scope Use## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.## How to Get Started with the Model\n\nUse the code below to get started with the model.## Training Details### Training Data### Training Procedure#### Preprocessing [optional]#### Training Hyperparameters\n\n- Training regime:#### Speeds, Sizes, Times [optional]## Evaluation### Testing Data, Factors & Metrics#### Testing Data#### Factors#### Metrics### Results#### Summary## Model Examination [optional]## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:## Technical Specifications [optional]### Model Architecture and Objective### Compute Infrastructure#### Hardware#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Model Card Authors [optional]## Model Card Contact"
] | [
-0.05921921506524086,
0.15253323316574097,
-0.004925556480884552,
0.01970141939818859,
0.09812989830970764,
0.008722675032913685,
0.07155127823352814,
0.11091651022434235,
-0.02038503810763359,
0.11541511863470078,
0.03161177039146423,
0.09504877775907516,
0.11244720220565796,
0.1593349277973175,
0.0006018498679623008,
-0.22924894094467163,
0.050943523645401,
-0.12565383315086365,
-0.028005311265587807,
0.1202453151345253,
0.14323006570339203,
-0.10873830318450928,
0.07482945919036865,
-0.03924073651432991,
-0.006830108352005482,
-0.03327549248933792,
-0.06254202127456665,
-0.05196645110845566,
0.05287102237343788,
0.06693000346422195,
0.07382122427225113,
0.0121690658852458,
0.09054198116064072,
-0.27071383595466614,
0.02402324043214321,
0.07869837433099747,
-0.00047617589007131755,
0.07642106711864471,
0.049837369471788406,
-0.08698169887065887,
0.07614438980817795,
-0.060363397002220154,
0.14962489902973175,
0.07956483215093613,
-0.09049813449382782,
-0.19196605682373047,
-0.07841940224170685,
0.10002946108579636,
0.18888257443904877,
0.05783533677458763,
-0.02747977338731289,
0.11718999594449997,
-0.08618196099996567,
0.013946855440735817,
0.06651762872934341,
-0.05830651894211769,
-0.055825375020504,
0.07012750208377838,
0.08251979202032089,
0.08537944406270981,
-0.13050076365470886,
-0.011774240992963314,
0.015172234736382961,
0.00940374843776226,
0.0883294939994812,
0.017624128609895706,
0.13745273649692535,
0.04126768559217453,
-0.1351923644542694,
-0.04287068545818329,
0.09870852530002594,
0.035997726023197174,
-0.04835180938243866,
-0.24833782017230988,
-0.023138362914323807,
-0.039952121675014496,
-0.03223174810409546,
-0.0381147637963295,
0.04236193001270294,
-0.01381280180066824,
0.07635250687599182,
-0.0030598659068346024,
-0.08292017132043839,
-0.042900193482637405,
0.07140932232141495,
0.06195797771215439,
0.025352943688631058,
-0.016651969403028488,
0.0064301020465791225,
0.12258180975914001,
0.11147689074277878,
-0.12772345542907715,
-0.053019966930150986,
-0.06414514780044556,
-0.08524893969297409,
-0.04640465974807739,
0.03045455552637577,
0.03743596002459526,
0.047410931438207626,
0.2386423945426941,
0.0032438088674098253,
0.054757438600063324,
0.046099163591861725,
0.014072372578084469,
0.06632840633392334,
0.10764557868242264,
-0.05884917825460434,
-0.09735266119241714,
-0.030795203521847725,
0.10186740756034851,
0.006704956758767366,
-0.041407015174627304,
-0.05594591051340103,
0.06964502483606339,
0.020676078274846077,
0.1224241703748703,
0.07868597656488419,
0.002938423305749893,
-0.07543925195932388,
-0.06281042098999023,
0.18152743577957153,
-0.1571107804775238,
0.0444292388856411,
0.03200872242450714,
-0.03442244604229927,
-0.009351148270070553,
0.00990392453968525,
0.02681080251932144,
-0.02011663094162941,
0.09737543761730194,
-0.05644093081355095,
-0.033681318163871765,
-0.11296935379505157,
-0.0371013842523098,
0.030811145901679993,
0.01213210541754961,
-0.029025491327047348,
-0.0342867337167263,
-0.0882277637720108,
-0.0636090338230133,
0.09107700735330582,
-0.07191670686006546,
-0.04744245857000351,
-0.017612621188163757,
-0.07794062048196793,
0.022423118352890015,
0.017721612006425858,
0.09050743281841278,
-0.021899394690990448,
0.03913994878530502,
-0.056751471012830734,
0.06101011112332344,
0.11571475863456726,
0.028108863160014153,
-0.058606795966625214,
0.06155762821435928,
-0.2421950101852417,
0.10317995399236679,
-0.07758963108062744,
0.051325954496860504,
-0.1530446857213974,
-0.026070065796375275,
0.03956404700875282,
0.012061306275427341,
-0.008345595560967922,
0.1417774260044098,
-0.2185831218957901,
-0.03138069063425064,
0.1676056981086731,
-0.10102425515651703,
-0.07971794903278351,
0.06269615143537521,
-0.05407082289457321,
0.11134804040193558,
0.04596652463078499,
-0.023191405460238457,
0.05842197686433792,
-0.14511504769325256,
-0.00791724119335413,
-0.04188765957951546,
-0.017894908785820007,
0.16635635495185852,
0.07102048397064209,
-0.06073606386780739,
0.07092984020709991,
0.019934939220547676,
-0.016795052215456963,
-0.04869792237877846,
-0.028511613607406616,
-0.10498060286045074,
0.011810078285634518,
-0.059134796261787415,
0.02167343720793724,
-0.021296551451086998,
-0.09382132440805435,
-0.029188871383666992,
-0.17379464209079742,
-0.0012200147612020373,
0.08734307438135147,
-0.010546354576945305,
-0.02201107330620289,
-0.11164727807044983,
0.008580547757446766,
0.03398929536342621,
0.0007392297266051173,
-0.13708379864692688,
-0.059298936277627945,
0.02737307921051979,
-0.16233380138874054,
0.02912268228828907,
-0.05535917729139328,
0.046022266149520874,
0.040077272802591324,
-0.03548351675271988,
-0.0344831608235836,
0.01168955210596323,
0.011000183410942554,
-0.01812567003071308,
-0.25495970249176025,
-0.017501724883913994,
-0.02502158097922802,
0.17353887856006622,
-0.22721131145954132,
0.04271984100341797,
0.07614967226982117,
0.14550280570983887,
0.0073052942752838135,
-0.034482456743717194,
0.014565827324986458,
-0.07198352366685867,
-0.03167816624045372,
-0.06257235258817673,
-0.010083765722811222,
-0.03872835263609886,
-0.06014038994908333,
0.04782424867153168,
-0.16939696669578552,
-0.03236479312181473,
0.10534932464361191,
0.06398996710777283,
-0.14835967123508453,
-0.030286256223917007,
-0.0393594354391098,
-0.047035153955221176,
-0.06618485599756241,
-0.054856978356838226,
0.12015452980995178,
0.05620792135596275,
0.04745647683739662,
-0.07151947915554047,
-0.07490099221467972,
0.007241961546242237,
-0.019977761432528496,
-0.0163256898522377,
0.09354335069656372,
0.06967450678348541,
-0.12794628739356995,
0.09154868870973587,
0.0982460081577301,
0.08392132818698883,
0.10398648679256439,
-0.015390566550195217,
-0.08757331967353821,
-0.041474130004644394,
0.023933125659823418,
0.014664852991700172,
0.1483616679906845,
-0.016296299174427986,
0.054420776665210724,
0.0360836423933506,
-0.013510678894817829,
0.01076538860797882,
-0.09628108888864517,
0.02706051431596279,
0.02971329540014267,
-0.015405743382871151,
0.03466423228383064,
-0.04367179423570633,
0.019455796107649803,
0.09001301974058151,
0.041830018162727356,
0.0396038182079792,
0.010561688803136349,
-0.04398298263549805,
-0.11032342165708542,
0.17876994609832764,
-0.12373854219913483,
-0.2460412234067917,
-0.13813963532447815,
0.010937176644802094,
0.04738753288984299,
-0.011057097464799881,
0.006951550021767616,
-0.06640941649675369,
-0.1170244961977005,
-0.09733203053474426,
0.01991088129580021,
0.04529648274183273,
-0.07728998363018036,
-0.06572148203849792,
0.06318122148513794,
0.037644270807504654,
-0.13899093866348267,
0.023945696651935577,
0.0469096377491951,
-0.0813174769282341,
-0.0011905812425538898,
0.07709334045648575,
0.06798645853996277,
0.17623907327651978,
0.014159789308905602,
-0.023712651804089546,
0.025652561336755753,
0.21002908051013947,
-0.14298869669437408,
0.1094568595290184,
0.1327279806137085,
-0.08898334950208664,
0.08212688565254211,
0.20222385227680206,
0.0385010726749897,
-0.10506977140903473,
0.03657889738678932,
0.027060477063059807,
-0.02792542427778244,
-0.24959829449653625,
-0.06908850371837616,
0.001758498721756041,
-0.053698375821113586,
0.06916391849517822,
0.08716317266225815,
0.09721273928880692,
0.016790922731161118,
-0.10066783428192139,
-0.0790279284119606,
0.05001477152109146,
0.10897587984800339,
-0.001458899350836873,
-0.014394176192581654,
0.09075857698917389,
-0.02953648567199707,
0.01689162664115429,
0.09213569760322571,
0.0019032615236938,
0.1793205291032791,
0.052213337272405624,
0.17340974509716034,
0.07910763472318649,
0.06269825994968414,
0.021207094192504883,
0.006816241890192032,
0.02095629647374153,
0.01695442944765091,
-0.004212336614727974,
-0.0863528773188591,
-0.0027415938675403595,
0.1203664243221283,
0.050876569002866745,
0.03059028834104538,
0.014285655692219734,
-0.03054206818342209,
0.08466528356075287,
0.177787184715271,
0.001063879462890327,
-0.1876421719789505,
-0.07282958924770355,
0.07934894412755966,
-0.08512143790721893,
-0.10675539821386337,
-0.029639042913913727,
0.040873926132917404,
-0.17292065918445587,
0.01861744187772274,
-0.020119842141866684,
0.10806277394294739,
-0.12885749340057373,
-0.017452897503972054,
0.055447377264499664,
0.06997017562389374,
-0.009931124746799469,
0.06633757054805756,
-0.1625119000673294,
0.1177479475736618,
0.01653103344142437,
0.06594116985797882,
-0.09538834542036057,
0.095417320728302,
-0.006962447427213192,
0.007516060955822468,
0.1403670459985733,
0.010755252093076706,
-0.0641925036907196,
-0.0961010679602623,
-0.10299893468618393,
-0.010606445372104645,
0.1309773176908493,
-0.14660196006298065,
0.08697716891765594,
-0.02743646875023842,
-0.0437387153506279,
0.0037594304885715246,
-0.12246467173099518,
-0.13224415481090546,
-0.18235477805137634,
0.05769521743059158,
-0.13171130418777466,
0.040173836052417755,
-0.1089821308851242,
-0.04585907980799675,
-0.021465247496962547,
0.1977471560239792,
-0.23280778527259827,
-0.06815840303897858,
-0.15394872426986694,
-0.08265888690948486,
0.1454220414161682,
-0.04706942290067673,
0.08337214589118958,
0.000301246385788545,
0.19080647826194763,
0.020952312275767326,
-0.017133628949522972,
0.1067209243774414,
-0.09975022822618484,
-0.20161914825439453,
-0.09120959788560867,
0.15868841111660004,
0.13963958621025085,
0.038726504892110825,
-0.004869744647294283,
0.032236017286777496,
-0.021885421127080917,
-0.12115032970905304,
0.02010788396000862,
0.17255425453186035,
0.08749033510684967,
0.026468761265277863,
-0.028463367372751236,
-0.11846643686294556,
-0.07225121557712555,
-0.03745346516370773,
0.02470988966524601,
0.1813775599002838,
-0.07139390707015991,
0.18551595509052277,
0.14274363219738007,
-0.054879751056432724,
-0.19840270280838013,
0.02148755080997944,
0.04472679644823074,
0.0060237692669034,
0.03174281120300293,
-0.20237314701080322,
0.09144619107246399,
0.0006281035020947456,
-0.05034751072525978,
0.13383205235004425,
-0.18327344954013824,
-0.15106844902038574,
0.061150215566158295,
0.04303572699427605,
-0.19199669361114502,
-0.1237611323595047,
-0.08872545510530472,
-0.046805474907159805,
-0.1568751484155655,
0.1029038056731224,
0.0011325168889015913,
0.007591354660689831,
0.03782656043767929,
0.024313677102327347,
0.012553532607853413,
-0.041947584599256516,
0.19289998710155487,
-0.02507353574037552,
0.034427378326654434,
-0.0793621614575386,
-0.06381990760564804,
0.06411149352788925,
-0.057697590440511703,
0.0750909373164177,
-0.025500034913420677,
0.015388053841888905,
-0.10115842521190643,
-0.047956179827451706,
-0.029484452679753304,
0.01986371912062168,
-0.09421123564243317,
-0.09366033226251602,
-0.04838487133383751,
0.0944879949092865,
0.08926530182361603,
-0.037268105894327164,
-0.033034052699804306,
-0.07874293625354767,
0.04173892363905907,
0.17448031902313232,
0.18235735595226288,
0.045147113502025604,
-0.07717937231063843,
-0.0013610349269583821,
-0.014655699953436852,
0.04845907539129257,
-0.22060799598693848,
0.06062275543808937,
0.045259539037942886,
0.01552091259509325,
0.11744016408920288,
-0.020618194714188576,
-0.1619492471218109,
-0.0666290745139122,
0.06087447330355644,
-0.06730270385742188,
-0.1811886727809906,
0.00352504407055676,
0.0753183513879776,
-0.16591353714466095,
-0.03711319714784622,
0.04232833534479141,
-0.011535273864865303,
-0.04050648957490921,
0.013207654468715191,
0.08094717562198639,
0.0073035703971982,
0.07697968184947968,
0.05389590561389923,
0.09186159074306488,
-0.10275198519229889,
0.07336891442537308,
0.08092255145311356,
-0.08580191433429718,
0.029650582000613213,
0.0956844761967659,
-0.0660475566983223,
-0.03553546592593193,
0.039692267775535583,
0.08463539928197861,
0.025261107832193375,
-0.04666709899902344,
0.003693421371281147,
-0.09922701120376587,
0.05857077240943909,
0.11215036362409592,
0.035282451659440994,
0.011146705597639084,
0.03799959644675255,
0.04474346339702606,
-0.07786709815263748,
0.11944296956062317,
0.024733934551477432,
0.020655835047364235,
-0.04009570553898811,
-0.040743377059698105,
0.03469119220972061,
-0.027051862329244614,
-0.011984582990407944,
-0.035381630063056946,
-0.07329677045345306,
-0.014250458218157291,
-0.16089624166488647,
-0.006425157655030489,
-0.039050452411174774,
0.006492188666015863,
0.0227071400731802,
-0.03757927939295769,
0.008156952448189259,
0.012379756197333336,
-0.06891508400440216,
-0.05483170598745346,
-0.0225595161318779,
0.09499263763427734,
-0.16361327469348907,
0.02182857319712639,
0.08322018384933472,
-0.12078364938497543,
0.09284685552120209,
0.016550488770008087,
0.002410374814644456,
0.028476644307374954,
-0.15792103111743927,
0.04754367470741272,
-0.020290223881602287,
0.012727295979857445,
0.04053649678826332,
-0.2180718630552292,
-0.005482743959873915,
-0.04065772518515587,
-0.055209364742040634,
-0.008002875372767448,
-0.03194994851946831,
-0.11256447434425354,
0.09542836248874664,
0.010766619816422462,
-0.0858173593878746,
-0.029525602236390114,
0.032997291535139084,
0.07880192995071411,
-0.02688010409474373,
0.15163032710552216,
-0.004930328112095594,
0.07543973624706268,
-0.17439891397953033,
-0.02280678227543831,
-0.009784235619008541,
0.02145213820040226,
-0.02418927662074566,
-0.016610441729426384,
0.04521343484520912,
-0.027311841025948524,
0.18978725373744965,
-0.02763848751783371,
0.047156915068626404,
0.06419318169355392,
0.01327395811676979,
-0.016141459345817566,
0.11109550297260284,
0.05755641311407089,
0.024413742125034332,
0.02059282548725605,
0.0006552583072334528,
-0.04046328365802765,
-0.012729931622743607,
-0.18779614567756653,
0.06844497472047806,
0.14769941568374634,
0.09005311876535416,
-0.014767808839678764,
0.06981590390205383,
-0.09979446232318878,
-0.11724765598773956,
0.10648569464683533,
-0.06312347948551178,
-0.011802246794104576,
-0.06541955471038818,
0.14070585370063782,
0.1514706313610077,
-0.1892511397600174,
0.06684626638889313,
-0.06704412400722504,
-0.05669668689370155,
-0.11357752978801727,
-0.1923627108335495,
-0.05791294202208519,
-0.05011613294482231,
-0.018368201330304146,
-0.05373769626021385,
0.06899537891149521,
0.057158127427101135,
0.011277895420789719,
0.008883214555680752,
0.0839093029499054,
-0.009658100083470345,
0.001425864058546722,
0.031231271103024483,
0.06669623404741287,
0.016144385561347008,
-0.0304893609136343,
0.01806715875864029,
-0.003015234600752592,
0.033999331295490265,
0.059489116072654724,
0.036065202206373215,
-0.028380198404192924,
0.013694645836949348,
-0.03632815182209015,
-0.11369726806879044,
0.043240632861852646,
-0.028342511504888535,
-0.07773103564977646,
0.13286112248897552,
0.026473212987184525,
0.005609886720776558,
-0.022322779521346092,
0.2495104819536209,
-0.07400858402252197,
-0.09536818414926529,
-0.1448878049850464,
0.11703428626060486,
-0.04134928435087204,
0.06479805707931519,
0.03765689954161644,
-0.10748469084501266,
0.018750222399830818,
0.12525403499603271,
0.1550474315881729,
-0.04537956044077873,
0.019106155261397362,
0.02858782559633255,
0.004584235139191151,
-0.04013598710298538,
0.05142189934849739,
0.06933367252349854,
0.14214643836021423,
-0.05173535272479057,
0.08858583122491837,
0.0017827433766797185,
-0.10212727636098862,
-0.04129546508193016,
0.11294585466384888,
-0.012940747663378716,
0.016553698107600212,
-0.05866444855928421,
0.1253037303686142,
-0.059382375329732895,
-0.23649652302265167,
0.061238259077072144,
-0.07580125331878662,
-0.14206883311271667,
-0.02515989914536476,
0.0734870657324791,
-0.015550101175904274,
0.026368482038378716,
0.07198820263147354,
-0.07507873326539993,
0.18898127973079681,
0.03871531784534454,
-0.05198408663272858,
-0.05836968496441841,
0.07604995369911194,
-0.117560975253582,
0.2752254605293274,
0.01097069587558508,
0.05294901132583618,
0.10413134098052979,
-0.02049596607685089,
-0.13178466260433197,
0.024117950350046158,
0.09550730884075165,
-0.08813395351171494,
0.04131056368350983,
0.21484604477882385,
-0.005940921604633331,
0.1187596246600151,
0.07743308693170547,
-0.07539036870002747,
0.047102998942136765,
-0.1141449362039566,
-0.0771128386259079,
-0.08687382191419601,
0.09549140185117722,
-0.0675748735666275,
0.14216206967830658,
0.12683449685573578,
-0.054658904671669006,
0.010759806260466576,
-0.02898469939827919,
0.045599378645420074,
0.0063186027109622955,
0.10157246887683868,
0.009957551956176758,
-0.18577666580677032,
0.02454824559390545,
0.017152229323983192,
0.10993915796279907,
-0.1806284487247467,
-0.09123970568180084,
0.04470835253596306,
0.0021878182888031006,
-0.06369121372699738,
0.12484876811504364,
0.057084910571575165,
0.04630184918642044,
-0.044473882764577866,
-0.029204387217760086,
-0.0060947248712182045,
0.1420498490333557,
-0.10524781048297882,
-0.003831128589808941
] |
null | null | transformers |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# distilbert-base-uncased-finetuned-emotion_new
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 0.8847
- Accuracy: 0.8
- F1: 0.7333
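
As a quick, unofficial illustration of how such a checkpoint is typically used (the card itself gives no snippet), the repo id below is taken from this row's metadata; the label names returned are whatever the checkpoint's config defines:

```python
# Minimal inference sketch; assumes the checkpoint is published under the
# repo id recorded in this dataset row and keeps the sequence-classification head.
from transformers import pipeline

clf = pipeline(
    "text-classification",
    model="gK29382231121/distilbert-base-uncased-finetuned-emotion_new",
)
print(clf("I am so happy today!"))  # -> [{'label': ..., 'score': ...}]
```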
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (a code sketch of this configuration follows the list):
- learning_rate: 2e-05
- train_batch_size: 6
- eval_batch_size: 6
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 20
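
These values map directly onto `TrainingArguments`; the sketch below is an assumption-laden reconstruction, since the card names neither the dataset nor the label set:

```python
# Reconstruction sketch of the reported configuration. Adam betas=(0.9, 0.999)
# and epsilon=1e-08 are the Trainer defaults, so they need no explicit arguments.
from transformers import TrainingArguments

args = TrainingArguments(
    output_dir="distilbert-base-uncased-finetuned-emotion_new",
    learning_rate=2e-5,             # learning_rate: 2e-05
    per_device_train_batch_size=6,  # train_batch_size: 6
    per_device_eval_batch_size=6,   # eval_batch_size: 6
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=20,
    evaluation_strategy="epoch",    # consistent with the per-epoch table below
)
# A Trainer would then be built as usual; the training data is not named in
# this card, so that part cannot be reconstructed here.
```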
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:------:|
| 0.5408 | 1.0 | 4 | 0.7674 | 0.8 | 0.7333 |
| 0.4368 | 2.0 | 8 | 0.7471 | 0.8 | 0.7333 |
| 0.3222 | 3.0 | 12 | 0.7318 | 0.8 | 0.7333 |
| 0.4061 | 4.0 | 16 | 0.7289 | 0.8 | 0.7333 |
| 0.3774 | 5.0 | 20 | 0.7732 | 0.8 | 0.7333 |
| 0.3304 | 6.0 | 24 | 0.7874 | 0.8 | 0.7333 |
| 0.3042 | 7.0 | 28 | 0.8036 | 0.8 | 0.7333 |
| 0.4571 | 8.0 | 32 | 0.8038 | 0.8 | 0.7333 |
| 0.1992 | 9.0 | 36 | 0.8271 | 0.8 | 0.7333 |
| 0.2661 | 10.0 | 40 | 0.8498 | 0.8 | 0.7333 |
| 0.2361 | 11.0 | 44 | 0.8582 | 0.8 | 0.7333 |
| 0.2292 | 12.0 | 48 | 0.8620 | 0.8 | 0.7333 |
| 0.2363 | 13.0 | 52 | 0.8678 | 0.8 | 0.7333 |
| 0.2574 | 14.0 | 56 | 0.8672 | 0.8 | 0.7333 |
| 0.5177 | 15.0 | 60 | 0.8668 | 0.8 | 0.7333 |
| 0.226 | 16.0 | 64 | 0.8726 | 0.8 | 0.7333 |
| 0.1726 | 17.0 | 68 | 0.8788 | 0.8 | 0.7333 |
| 0.2439 | 18.0 | 72 | 0.8823 | 0.8 | 0.7333 |
| 0.2005 | 19.0 | 76 | 0.8842 | 0.8 | 0.7333 |
| 0.2541 | 20.0 | 80 | 0.8847 | 0.8 | 0.7333 |
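
The Accuracy and F1 columns are produced by whatever `compute_metrics` hook was passed to the `Trainer`; a typical implementation yielding these two numbers looks like the sketch below (weighted-average F1 is an assumption, as the card does not state the averaging mode):

```python
# Hedged sketch of a compute_metrics hook producing the two reported columns.
import numpy as np
from sklearn.metrics import accuracy_score, f1_score

def compute_metrics(eval_pred):
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)
    return {
        "accuracy": accuracy_score(labels, preds),
        "f1": f1_score(labels, preds, average="weighted"),  # averaging mode assumed
    }
```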
### Framework versions
- Transformers 4.37.2
- Pytorch 2.1.0+cu118
- Datasets 2.17.0
- Tokenizers 0.15.2
| {"license": "apache-2.0", "tags": ["generated_from_trainer"], "metrics": ["accuracy", "f1"], "base_model": "distilbert-base-uncased", "model-index": [{"name": "distilbert-base-uncased-finetuned-emotion_new", "results": []}]} | text-classification | gK29382231121/distilbert-base-uncased-finetuned-emotion_new | [
"transformers",
"safetensors",
"distilbert",
"text-classification",
"generated_from_trainer",
"base_model:distilbert-base-uncased",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | 2024-02-13T10:34:49+00:00 | [] | [] | TAGS
#transformers #safetensors #distilbert #text-classification #generated_from_trainer #base_model-distilbert-base-uncased #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us
| distilbert-base-uncased-finetuned-emotion\_new
==============================================
This model is a fine-tuned version of distilbert-base-uncased on an unspecified dataset.
It achieves the following results on the evaluation set:
* Loss: 0.8847
* Accuracy: 0.8
* F1: 0.7333
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 2e-05
* train\_batch\_size: 6
* eval\_batch\_size: 6
* seed: 42
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* num\_epochs: 20
### Training results
### Framework versions
* Transformers 4.37.2
* Pytorch 2.1.0+cu118
* Datasets 2.17.0
* Tokenizers 0.15.2
| [
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 6\n* eval\\_batch\\_size: 6\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 20",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.37.2\n* Pytorch 2.1.0+cu118\n* Datasets 2.17.0\n* Tokenizers 0.15.2"
] | [
"TAGS\n#transformers #safetensors #distilbert #text-classification #generated_from_trainer #base_model-distilbert-base-uncased #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 6\n* eval\\_batch\\_size: 6\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 20",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.37.2\n* Pytorch 2.1.0+cu118\n* Datasets 2.17.0\n* Tokenizers 0.15.2"
] | [
68,
98,
4,
33
] | [
"passage: TAGS\n#transformers #safetensors #distilbert #text-classification #generated_from_trainer #base_model-distilbert-base-uncased #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 6\n* eval\\_batch\\_size: 6\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 20### Training results### Framework versions\n\n\n* Transformers 4.37.2\n* Pytorch 2.1.0+cu118\n* Datasets 2.17.0\n* Tokenizers 0.15.2"
] | [
-0.0914938822388649,
0.09465637058019638,
-0.0019068996189162135,
0.11402009427547455,
0.15489327907562256,
0.020437492057681084,
0.1444302201271057,
0.10357940196990967,
-0.0763426274061203,
0.03999235853552818,
0.12151549011468887,
0.13862012326717377,
0.0004471030260901898,
0.141187384724617,
-0.08696981519460678,
-0.2202576994895935,
0.021312380209565163,
0.01433143112808466,
-0.032039329409599304,
0.11717943102121353,
0.10404512286186218,
-0.12246957421302795,
0.08782947063446045,
-0.018687531352043152,
-0.17134323716163635,
0.00619130814447999,
0.014718643389642239,
-0.05062808841466904,
0.1276121437549591,
0.024118050932884216,
0.13183832168579102,
0.025212259963154793,
0.0926089957356453,
-0.20689789950847626,
0.004804091993719339,
0.049338821321725845,
-0.006069837603718042,
0.06475023180246353,
0.02002439647912979,
-0.013208729214966297,
0.08738353848457336,
-0.09217702597379684,
0.06301327794790268,
0.02086714282631874,
-0.12651699781417847,
-0.1964358240365982,
-0.0892859473824501,
0.040647368878126144,
0.10159635543823242,
0.08467555046081543,
-0.012002181261777878,
0.11422029137611389,
-0.08217202872037888,
0.08850916475057602,
0.20791280269622803,
-0.3071562647819519,
-0.05795971304178238,
0.048109520226716995,
0.014627458527684212,
0.07115690410137177,
-0.10345283150672913,
-0.037418387830257416,
0.07111987471580505,
0.024389754980802536,
0.1178320050239563,
-0.022070489823818207,
-0.09716123342514038,
-0.004428342916071415,
-0.14799454808235168,
-0.022147728130221367,
0.170987069606781,
0.05360133945941925,
-0.056735407561063766,
-0.049126554280519485,
-0.06536707282066345,
-0.13627997040748596,
-0.03835159167647362,
-0.013716642744839191,
0.05199325084686279,
-0.0202340018004179,
-0.0408659502863884,
0.007985026575624943,
-0.09321396052837372,
-0.07775197178125381,
-0.05430694669485092,
0.16930463910102844,
0.04044429585337639,
-0.0027716991025954485,
0.0018307238351553679,
0.10426343232393265,
-0.04604966565966606,
-0.1300736516714096,
0.0013968611601740122,
0.012995952740311623,
0.022139593958854675,
-0.060734111815690994,
-0.061117783188819885,
-0.037479501217603683,
0.02473272942006588,
0.17409808933734894,
-0.06862525641918182,
0.04071120172739029,
0.017641378566622734,
0.03540686145424843,
-0.09681084007024765,
0.15709663927555084,
-0.020375652238726616,
-0.03365593031048775,
0.028283117339015007,
0.0762639194726944,
0.05340424180030823,
0.000881157408002764,
-0.11781080067157745,
0.019311407580971718,
0.1043330654501915,
0.02540571801364422,
-0.08360567688941956,
0.07786537706851959,
-0.061577148735523224,
0.0032596304081380367,
0.034945178776979446,
-0.09471391141414642,
0.02184944413602352,
0.0006603292422369123,
-0.048952773213386536,
-0.06208636984229088,
0.03345686197280884,
0.02594781294465065,
0.009862750768661499,
0.10706664621829987,
-0.08086197823286057,
0.004601646680384874,
-0.08521503210067749,
-0.11352050304412842,
0.011849808506667614,
-0.07844794541597366,
0.030111435800790787,
-0.11581352353096008,
-0.2026163637638092,
-0.01046968623995781,
0.04970933496952057,
-0.01571296527981758,
-0.03379412740468979,
-0.07006247341632843,
-0.0785122737288475,
0.012317116372287273,
-0.01402975432574749,
0.053487539291381836,
-0.07320050150156021,
0.09655753523111343,
0.0466897189617157,
0.0641590878367424,
-0.05857997015118599,
0.03958427160978317,
-0.11465071886777878,
0.02801615558564663,
-0.1874428540468216,
0.02689214050769806,
-0.07048971205949783,
0.06842769682407379,
-0.06647258996963501,
-0.08267904072999954,
0.008511051535606384,
0.0026769402902573347,
0.06340205669403076,
0.10663657635450363,
-0.15615546703338623,
-0.0552932471036911,
0.1706557720899582,
-0.10855189710855484,
-0.14514118432998657,
0.12222456932067871,
-0.05877300351858139,
0.057079706341028214,
0.06757538765668869,
0.17387208342552185,
0.06522037833929062,
-0.08744095265865326,
-0.00994914025068283,
-0.007792309392243624,
0.054006706923246384,
-0.022607501596212387,
0.0629366934299469,
0.004319171421229839,
-0.024592570960521698,
0.02279096283018589,
-0.056428149342536926,
0.04833953082561493,
-0.08101756125688553,
-0.0829210877418518,
-0.05049620196223259,
-0.10571318864822388,
0.06643494218587875,
0.043644774705171585,
0.05572904273867607,
-0.1199217438697815,
-0.0769181028008461,
0.07055274397134781,
0.08663072437047958,
-0.0645015761256218,
0.01572597026824951,
-0.06478554755449295,
0.0804913341999054,
-0.03349420800805092,
-0.020713606849312782,
-0.15163515508174896,
-0.05076524242758751,
0.02280418388545513,
0.009274296462535858,
0.005215817131102085,
-0.026180563494563103,
0.06047886610031128,
0.0877218097448349,
-0.07229675352573395,
-0.041252411901950836,
-0.028173139318823814,
0.02429933100938797,
-0.11334045231342316,
-0.1893395185470581,
-0.01175843644887209,
-0.030747197568416595,
0.15227441489696503,
-0.22831688821315765,
0.05461478233337402,
-0.013748401775956154,
0.08173912763595581,
0.02946324087679386,
-0.0033877675887197256,
-0.04341196268796921,
0.07625608891248703,
-0.04808081313967705,
-0.06050734594464302,
0.05310485139489174,
0.013199650682508945,
-0.0828925222158432,
-0.05575592815876007,
-0.1291126310825348,
0.18021051585674286,
0.13286957144737244,
-0.07915661484003067,
-0.08847159147262573,
-0.009395730681717396,
-0.043260782957077026,
-0.025318583473563194,
-0.056971628218889236,
0.007833804003894329,
0.12856097519397736,
-0.021633969619870186,
0.14580537378787994,
-0.08448690921068192,
-0.02720530517399311,
0.014759588986635208,
-0.05067989230155945,
0.020664237439632416,
0.10070309787988663,
0.09517449140548706,
-0.11703002452850342,
0.15270507335662842,
0.1884479820728302,
-0.09235252439975739,
0.1233825534582138,
-0.04604225605726242,
-0.0470220148563385,
-0.017811907455325127,
0.010637368075549603,
0.0019329016795381904,
0.09715099632740021,
-0.11851517111063004,
0.011294860392808914,
0.005934691987931728,
0.024173490703105927,
0.008804600685834885,
-0.2129630744457245,
-0.02991989441215992,
0.03763130307197571,
-0.048743635416030884,
-0.01138971745967865,
-0.021348007023334503,
-0.006264281924813986,
0.09469091147184372,
-0.0054648700170218945,
-0.09114275872707367,
0.05259465053677559,
-0.0004788268997799605,
-0.08468985557556152,
0.21316638588905334,
-0.1011585146188736,
-0.12023420631885529,
-0.12382642179727554,
-0.07119690626859665,
-0.04958810657262802,
0.03614182770252228,
0.07767949253320694,
-0.07065296918153763,
-0.046504028141498566,
-0.10552270710468292,
-0.0025327964685857296,
0.045268550515174866,
0.022061262279748917,
0.01390733290463686,
0.0060615078546106815,
0.07412045449018478,
-0.10239305347204208,
-0.020511029288172722,
-0.039898816496133804,
-0.06561223417520523,
0.04340992122888565,
0.020021840929985046,
0.11054451763629913,
0.1380935162305832,
-0.027549555525183678,
-0.008633655495941639,
-0.032788168638944626,
0.23150594532489777,
-0.04865274950861931,
-0.024487754330039024,
0.13300809264183044,
-0.016313135623931885,
0.047561053186655045,
0.1456366777420044,
0.053218986839056015,
-0.10853169113397598,
0.03201514109969139,
0.02772151678800583,
-0.0226117093116045,
-0.21000388264656067,
-0.05255910009145737,
-0.03722558915615082,
-0.0034009742084890604,
0.09220675379037857,
0.024754097685217857,
0.02487226575613022,
0.06662343442440033,
0.01707010716199875,
0.07191425561904907,
0.001510050380602479,
0.08114784955978394,
0.12579898536205292,
0.0409218966960907,
0.12200180441141129,
-0.04128005728125572,
-0.05112064629793167,
0.03493214398622513,
-0.018899934366345406,
0.20726798474788666,
0.02392205037176609,
0.10754711925983429,
0.05845312774181366,
0.15486586093902588,
-0.0016455636359751225,
0.07411012053489685,
0.0012985578505322337,
-0.040857944637537,
-0.01351824402809143,
-0.04965923726558685,
-0.046240612864494324,
0.04437679797410965,
-0.10980531573295593,
0.07357510924339294,
-0.1279584765434265,
0.023943645879626274,
0.06788218021392822,
0.24301910400390625,
0.05136534571647644,
-0.32176029682159424,
-0.09965436905622482,
0.027091072872281075,
-0.026065930724143982,
-0.027326881885528564,
0.039160460233688354,
0.10286083817481995,
-0.06044802814722061,
0.041508156806230545,
-0.04906272888183594,
0.07909838110208511,
-0.022515833377838135,
0.045058805495500565,
0.04689551144838333,
0.07910948246717453,
-0.006883695255964994,
0.07055478543043137,
-0.2753323018550873,
0.2642439007759094,
0.007735824678093195,
0.07853628695011139,
-0.040941908955574036,
0.0011649603256955743,
0.03773687779903412,
0.11345916241407394,
0.07955317944288254,
-0.015251606702804565,
-0.06434033811092377,
-0.19380664825439453,
-0.046474456787109375,
0.02814589999616146,
0.09284201264381409,
-0.03156566992402077,
0.10038994252681732,
-0.03863779827952385,
0.004480276256799698,
0.08447333425283432,
-0.013796723447740078,
-0.09670216590166092,
-0.08874055743217468,
-0.02742762304842472,
0.03877246752381325,
0.010106544010341167,
-0.087575264275074,
-0.09235179424285889,
-0.12017395347356796,
0.15399254858493805,
-0.05314982682466507,
-0.034396082162857056,
-0.0952993631362915,
0.045647915452718735,
0.04753968492150307,
-0.07358488440513611,
0.0635068267583847,
0.01306109968572855,
0.0859583243727684,
0.01660563237965107,
-0.051136087626218796,
0.11904558539390564,
-0.081510990858078,
-0.18620456755161285,
-0.07259827852249146,
0.10056886076927185,
0.01965062879025936,
0.03884207457304001,
0.003931769635528326,
0.012613262981176376,
-0.013625599443912506,
-0.08631673455238342,
0.005609086714684963,
0.019147580489516258,
0.07230864465236664,
0.04888589307665825,
-0.08351212739944458,
-0.01591719686985016,
-0.05166758596897125,
-0.0317566879093647,
0.1611301302909851,
0.29158836603164673,
-0.08590467274188995,
0.007154058199375868,
0.058445870876312256,
-0.060593172907829285,
-0.20394544303417206,
0.02772488258779049,
0.03236744552850723,
-0.002810197416692972,
0.035681504756212234,
-0.13709253072738647,
0.12223396450281143,
0.11309260129928589,
-0.024455735459923744,
0.0947246178984642,
-0.2739735245704651,
-0.12852896749973297,
0.13550648093223572,
0.1486785113811493,
0.13034851849079132,
-0.14298178255558014,
-0.02657361328601837,
-0.050423409789800644,
-0.12761764228343964,
0.10249520093202591,
-0.11426135152578354,
0.1087711825966835,
-0.016168929636478424,
0.051957279443740845,
0.0017395097529515624,
-0.04789571836590767,
0.12846463918685913,
0.01172699499875307,
0.1205688938498497,
-0.06386616080999374,
-0.01917806640267372,
0.03441247716546059,
-0.05988593399524689,
0.03253444656729698,
-0.10579727590084076,
0.04949009045958519,
-0.05645763501524925,
-0.028247924521565437,
-0.042480792850255966,
0.043771758675575256,
-0.038442887365818024,
-0.06821920722723007,
-0.037479519844055176,
0.02449975349009037,
0.05350128561258316,
-0.011415730230510235,
0.13416077196598053,
0.015503505244851112,
0.14988647401332855,
0.11768544465303421,
0.07280102372169495,
-0.07808231562376022,
-0.003470516763627529,
-0.005451811943203211,
-0.037291962653398514,
0.0602378211915493,
-0.14671282470226288,
0.042458560317754745,
0.11678183078765869,
0.016089387238025665,
0.1581750214099884,
0.07941216975450516,
-0.007383095566183329,
0.006909925956279039,
0.06572644412517548,
-0.16662009060382843,
-0.07476764917373657,
-0.0040940227918326855,
-0.026422828435897827,
-0.11084025353193283,
0.06530650705099106,
0.10728118568658829,
-0.07771048694849014,
0.005082488059997559,
-0.020336709916591644,
0.019173184409737587,
-0.0436350516974926,
0.1651630997657776,
0.06201813742518425,
0.04744335263967514,
-0.08459048718214035,
0.09011051058769226,
0.045626189559698105,
-0.056345123797655106,
0.007890293374657631,
0.03176763281226158,
-0.09763193130493164,
-0.047806646674871445,
0.054115891456604004,
0.1797482818365097,
-0.038381047546863556,
-0.055495914071798325,
-0.13258296251296997,
-0.12550996243953705,
0.053819913417100906,
0.18377886712551117,
0.10872691124677658,
0.020311929285526276,
-0.026425743475556374,
0.013211235404014587,
-0.11600187420845032,
0.1064344272017479,
0.031792983412742615,
0.08785396814346313,
-0.15547621250152588,
0.11061949282884598,
-0.005333236418664455,
0.0012757579097524285,
-0.021915394812822342,
0.04743589088320732,
-0.11824394017457962,
-0.006985522340983152,
-0.12725339829921722,
-0.002512704348191619,
-0.030664779245853424,
0.01830485090613365,
0.007022560108453035,
-0.04869743064045906,
-0.05197744071483612,
0.01853903941810131,
-0.0939844623208046,
-0.019419850781559944,
0.03702768310904503,
0.0713874027132988,
-0.12360458076000214,
-0.04489405080676079,
0.027700548991560936,
-0.07577110826969147,
0.06712868064641953,
0.03574527055025101,
0.024497002363204956,
0.05386712774634361,
-0.19496311247348785,
0.016533540561795235,
0.0781429186463356,
0.011042426340281963,
0.042921341955661774,
-0.10247352719306946,
-0.011847179383039474,
0.00305749149993062,
0.030697675421833992,
0.023386288434267044,
0.08440495282411575,
-0.1299544870853424,
0.008713933639228344,
-0.020959977060556412,
-0.06311309337615967,
-0.05009410157799721,
0.007057643961161375,
0.10397851467132568,
-0.011826805770397186,
0.2099234163761139,
-0.10051386058330536,
0.012289018370211124,
-0.1902405023574829,
0.0012938356958329678,
-0.007648006081581116,
-0.10992077738046646,
-0.14939941465854645,
-0.05340587720274925,
0.04054391011595726,
-0.05037014186382294,
0.14844492077827454,
0.00020350654085632414,
0.026194345206022263,
0.03358232602477074,
-0.03850259631872177,
0.0402771458029747,
0.027739573270082474,
0.23446355760097504,
0.034065958112478256,
-0.044648677110672,
0.018638674169778824,
0.028821470215916634,
0.11621227860450745,
0.04903554171323776,
0.16997937858104706,
0.17015495896339417,
-0.06075584143400192,
0.0993083193898201,
0.035267721861600876,
-0.05573210492730141,
-0.13399387896060944,
0.04773983731865883,
-0.028792332857847214,
0.08733665943145752,
-0.019104376435279846,
0.19806498289108276,
0.08021830022335052,
-0.1618838757276535,
0.014006758108735085,
-0.055513061583042145,
-0.07659298181533813,
-0.10894858837127686,
-0.025150636211037636,
-0.1006808876991272,
-0.16098158061504364,
0.003477640450000763,
-0.12030231952667236,
0.005281173158437014,
0.09288057684898376,
-0.00554812652990222,
-0.01363807637244463,
0.1688944399356842,
-0.010694865137338638,
0.03837676718831062,
0.05315862596035004,
-0.00839142594486475,
-0.044048380106687546,
-0.08533873409032822,
-0.10164593905210495,
0.00809846818447113,
-0.02848614566028118,
0.022123701870441437,
-0.047947514802217484,
-0.030946962535381317,
0.03708430379629135,
-0.008066588081419468,
-0.09523473680019379,
0.017221253365278244,
0.03133460879325867,
0.04452819004654884,
0.04845485836267471,
0.013975806534290314,
0.009467422030866146,
0.014093292877078056,
0.21841289103031158,
-0.07333530485630035,
-0.08884060382843018,
-0.09968321025371552,
0.2434176206588745,
0.054089538753032684,
0.024447767063975334,
0.017733847722411156,
-0.09231259673833847,
0.03023177571594715,
0.19409634172916412,
0.16661442816257477,
-0.0740061104297638,
0.006414048373699188,
-0.019693393260240555,
-0.014183416962623596,
-0.03318142145872116,
0.08523187041282654,
0.1255200058221817,
0.007971348240971565,
-0.06527391076087952,
-0.04141722247004509,
-0.03852161392569542,
-0.006757243070751429,
-0.04692791774868965,
0.051072947680950165,
0.020618094131350517,
0.0012710187584161758,
-0.04569990932941437,
0.054772552102804184,
-0.038056980818510056,
-0.0858035609126091,
0.05914507061243057,
-0.1892903596162796,
-0.1436149626970291,
-0.014672023244202137,
0.10114632546901703,
-0.002273870399221778,
0.04804978892207146,
-0.03384115919470787,
-0.015613718889653683,
0.08457304537296295,
-0.028399961069226265,
-0.05961977317929268,
-0.07839598506689072,
0.05183505639433861,
-0.07719939202070236,
0.23638032376766205,
-0.0299539752304554,
0.05797996371984482,
0.12430445104837418,
0.05032625421881676,
-0.0743299275636673,
0.1088433489203453,
0.043163783848285675,
-0.06707984954118729,
0.03431042656302452,
0.06812845915555954,
-0.05027173087000847,
0.1258588433265686,
0.052133653312921524,
-0.15706874430179596,
0.022034335881471634,
-0.018281273543834686,
-0.09917052835226059,
-0.05994891747832298,
-0.031286634504795074,
-0.0580732561647892,
0.13512961566448212,
0.19011709094047546,
-0.04004078358411789,
0.010298593901097775,
-0.04378649592399597,
0.03968079388141632,
0.06675505638122559,
0.031561072915792465,
-0.028796402737498283,
-0.2339414805173874,
0.035846371203660965,
0.08997131139039993,
-0.006189094390720129,
-0.2598113715648651,
-0.08931773155927658,
-0.010107096284627914,
-0.049242548644542694,
-0.09941335767507553,
0.08476175367832184,
0.11265615373849869,
0.047752123326063156,
-0.060659389942884445,
-0.10670653730630875,
-0.07941043376922607,
0.1634376496076584,
-0.11760987341403961,
-0.11024897545576096
] |
null | null | transformers |
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed to the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
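
Since the card provides no snippet, the following is a minimal, hypothetical sketch; it assumes the checkpoint at `uyiosa/doctor_mistral` (the repo id from this row) is a standard causal language model, which the card itself does not confirm:

```python
# Hypothetical usage sketch: assumes the repo hosts a causal LM compatible
# with AutoModelForCausalLM; nothing in this card confirms the architecture,
# and the medical-style prompt is purely illustrative.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "uyiosa/doctor_mistral"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

prompt = "Patient: I have had a persistent cough for two weeks. Doctor:"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```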
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700); a back-of-envelope sketch of such an estimate follows the list below.
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
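
For illustration only (every number below is a placeholder, not a measurement for this model):

```python
# Rough CO2eq estimate in the spirit of Lacoste et al. (2019):
# energy (kWh) = power draw * hours; emissions = energy * grid carbon intensity.
gpu_power_kw = 0.3      # placeholder: one ~300 W GPU
hours = 10.0            # placeholder: wall-clock training time (unknown here)
kg_co2_per_kwh = 0.4    # placeholder: grid-dependent carbon intensity
print(f"~{gpu_power_kw * hours * kg_co2_per_kwh:.2f} kg CO2eq")
```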
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed]
| {"library_name": "transformers", "tags": []} | null | uyiosa/doctor_mistral | [
"transformers",
"safetensors",
"arxiv:1910.09700",
"endpoints_compatible",
"region:us"
] | 2024-02-13T10:36:04+00:00 | [
"1910.09700"
] | [] | TAGS
#transformers #safetensors #arxiv-1910.09700 #endpoints_compatible #region-us
|
# Model Card for Model ID
## Model Details
### Model Description
This is the model card of a transformers model that has been pushed to the Hub. This model card has been automatically generated.
- Developed by:
- Funded by [optional]:
- Shared by [optional]:
- Model type:
- Language(s) (NLP):
- License:
- Finetuned from model [optional]:
### Model Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Downstream Use [optional]
### Out-of-Scope Use
## Bias, Risks, and Limitations
### Recommendations
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
## Training Details
### Training Data
### Training Procedure
#### Preprocessing [optional]
#### Training Hyperparameters
- Training regime:
#### Speeds, Sizes, Times [optional]
## Evaluation
### Testing Data, Factors & Metrics
#### Testing Data
#### Factors
#### Metrics
### Results
#### Summary
## Model Examination [optional]
## Environmental Impact
Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).
- Hardware Type:
- Hours used:
- Cloud Provider:
- Compute Region:
- Carbon Emitted:
## Technical Specifications [optional]
### Model Architecture and Objective
### Compute Infrastructure
#### Hardware
#### Software
## Citation [optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Model Card Authors [optional]
## Model Card Contact
| [
"# Model Card for Model ID",
"## Model Details",
"### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:",
"### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Downstream Use [optional]",
"### Out-of-Scope Use",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.",
"## How to Get Started with the Model\n\nUse the code below to get started with the model.",
"## Training Details",
"### Training Data",
"### Training Procedure",
"#### Preprocessing [optional]",
"#### Training Hyperparameters\n\n- Training regime:",
"#### Speeds, Sizes, Times [optional]",
"## Evaluation",
"### Testing Data, Factors & Metrics",
"#### Testing Data",
"#### Factors",
"#### Metrics",
"### Results",
"#### Summary",
"## Model Examination [optional]",
"## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:",
"## Technical Specifications [optional]",
"### Model Architecture and Objective",
"### Compute Infrastructure",
"#### Hardware",
"#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Model Card Authors [optional]",
"## Model Card Contact"
] | [
"TAGS\n#transformers #safetensors #arxiv-1910.09700 #endpoints_compatible #region-us \n",
"# Model Card for Model ID",
"## Model Details",
"### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:",
"### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Downstream Use [optional]",
"### Out-of-Scope Use",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.",
"## How to Get Started with the Model\n\nUse the code below to get started with the model.",
"## Training Details",
"### Training Data",
"### Training Procedure",
"#### Preprocessing [optional]",
"#### Training Hyperparameters\n\n- Training regime:",
"#### Speeds, Sizes, Times [optional]",
"## Evaluation",
"### Testing Data, Factors & Metrics",
"#### Testing Data",
"#### Factors",
"#### Metrics",
"### Results",
"#### Summary",
"## Model Examination [optional]",
"## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:",
"## Technical Specifications [optional]",
"### Model Architecture and Objective",
"### Compute Infrastructure",
"#### Hardware",
"#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Model Card Authors [optional]",
"## Model Card Contact"
] | [
31,
6,
3,
82,
28,
3,
4,
9,
9,
10,
42,
20,
3,
4,
5,
9,
11,
13,
3,
12,
5,
4,
5,
3,
4,
9,
53,
9,
8,
6,
3,
14,
8,
7,
9,
4
] | [
"passage: TAGS\n#transformers #safetensors #arxiv-1910.09700 #endpoints_compatible #region-us \n# Model Card for Model ID## Model Details### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Downstream Use [optional]### Out-of-Scope Use## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.## How to Get Started with the Model\n\nUse the code below to get started with the model.## Training Details### Training Data### Training Procedure#### Preprocessing [optional]#### Training Hyperparameters\n\n- Training regime:#### Speeds, Sizes, Times [optional]## Evaluation### Testing Data, Factors & Metrics#### Testing Data#### Factors#### Metrics### Results#### Summary## Model Examination [optional]## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:## Technical Specifications [optional]### Model Architecture and Objective### Compute Infrastructure#### Hardware#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Model Card Authors [optional]## Model Card Contact"
] | [
-0.06646376848220825,
0.2168014943599701,
-0.00225935154594481,
0.023818302899599075,
0.1271018385887146,
-0.001635765191167593,
0.04218708351254463,
0.13324736058712006,
-0.020175931975245476,
0.11144465953111649,
0.046588581055402756,
0.09377603232860565,
0.09928803145885468,
0.18404334783554077,
0.04859916493296623,
-0.2059975117444992,
0.007056170143187046,
-0.09090408682823181,
0.014076028019189835,
0.1116579994559288,
0.13719257712364197,
-0.10291384905576706,
0.08272874355316162,
-0.04045208916068077,
-0.02019004337489605,
0.00012576708104461432,
-0.09259183704853058,
-0.07032395154237747,
0.06885425746440887,
0.06264153122901917,
0.051234472543001175,
0.001456156256608665,
0.09140396863222122,
-0.2864592671394348,
0.017265573143959045,
0.08406311273574829,
0.0027674848679453135,
0.06290827691555023,
0.07236549258232117,
-0.07389893382787704,
0.11328595131635666,
-0.08021481335163116,
0.13019037246704102,
0.08625296503305435,
-0.062064990401268005,
-0.23071379959583282,
-0.07525765895843506,
0.0963398814201355,
0.12251301854848862,
0.06215599179267883,
-0.022921854630112648,
0.15455181896686554,
-0.06248689442873001,
0.012971068732440472,
0.1294165402650833,
-0.11526761949062347,
-0.05572471022605896,
0.061741601675748825,
0.11775490641593933,
0.10740239918231964,
-0.14110268652439117,
-0.0017287094378843904,
0.04900608956813812,
0.029121357947587967,
0.08589313924312592,
0.022661056369543076,
0.12003941088914871,
0.04652795568108559,
-0.13695219159126282,
-0.04037507623434067,
0.12011898308992386,
0.038862764835357666,
-0.06446044892072678,
-0.2168138176202774,
-0.006778308190405369,
-0.0601806715130806,
-0.014732478186488152,
-0.07019448280334473,
0.039128515869379044,
-0.02470310963690281,
0.07317749410867691,
-0.04465159401297569,
-0.1063927412033081,
-0.0421026237308979,
0.0892222449183464,
0.07748593389987946,
0.011527054943144321,
-0.02519804798066616,
0.04627908393740654,
0.13455867767333984,
0.05402068421244621,
-0.10399353504180908,
-0.07017925381660461,
-0.06942764669656754,
-0.09420394152402878,
-0.04035796597599983,
0.056760527193546295,
0.031942449510097504,
0.02665667235851288,
0.22703726589679718,
0.016653569415211678,
0.04155244305729866,
0.0224777739495039,
0.01032855175435543,
0.043662428855895996,
0.0955500528216362,
-0.05303520709276199,
-0.15660029649734497,
-0.04072032496333122,
0.09077946096658707,
-0.0027527001220732927,
-0.036689214408397675,
-0.03966725245118141,
0.03849169611930847,
0.06843466311693192,
0.13122352957725525,
0.07552056759595871,
-0.017929591238498688,
-0.04813180863857269,
-0.030096933245658875,
0.23523783683776855,
-0.1493375599384308,
0.04426715523004532,
-0.02271856553852558,
-0.01804111897945404,
-0.03908449783921242,
0.03597262129187584,
0.022118929773569107,
-0.000004518366949923802,
0.09706240892410278,
-0.058981191366910934,
-0.05378659814596176,
-0.10168042778968811,
-0.03272576630115509,
0.04088849574327469,
-0.013975566253066063,
-0.010589460842311382,
-0.09025166928768158,
-0.09490354359149933,
-0.04766594246029854,
0.05537205561995506,
-0.05123869329690933,
-0.03770573064684868,
0.009465423412621021,
-0.08151785284280777,
-0.005444355774670839,
-0.005417742300778627,
0.10699385404586792,
-0.03222226724028587,
0.04445803165435791,
-0.027600755915045738,
0.05225523188710213,
0.09919606149196625,
0.031576547771692276,
-0.0773419588804245,
0.0561848059296608,
-0.22559374570846558,
0.07503069192171097,
-0.11481974273920059,
0.04335082694888115,
-0.1704932004213333,
-0.042439818382263184,
0.005444696638733149,
0.0139949731528759,
0.013206101022660732,
0.12720820307731628,
-0.19255615770816803,
-0.01654396951198578,
0.13260798156261444,
-0.09212633967399597,
-0.118110790848732,
0.07884611934423447,
-0.029701577499508858,
0.1624738723039627,
0.04682036489248276,
-0.027025915682315826,
0.09224298596382141,
-0.16434773802757263,
-0.07092688232660294,
-0.00949116237461567,
-0.01727987825870514,
0.12109188735485077,
0.07512219995260239,
-0.05991523340344429,
0.046571120619773865,
0.02832140028476715,
-0.038078423589468,
-0.04424772411584854,
-0.050857074558734894,
-0.10884185880422592,
-0.01070026308298111,
-0.08987759798765182,
0.04065500199794769,
-0.01250192429870367,
-0.07916021347045898,
-0.029885273426771164,
-0.18612512946128845,
-0.0030564051121473312,
0.10038342326879501,
0.0035033065360039473,
-0.005652366206049919,
-0.08666291832923889,
0.026358824223279953,
-0.03112892620265484,
-0.008404186926782131,
-0.16764774918556213,
-0.04399421438574791,
0.046902090311050415,
-0.16094985604286194,
0.020117372274398804,
-0.06413903087377548,
0.06334125250577927,
0.03641495108604431,
-0.05590536445379257,
-0.0248766727745533,
-0.01730942726135254,
0.011945613659918308,
-0.05083848536014557,
-0.18994836509227753,
-0.056277405470609665,
-0.037882111966609955,
0.149809330701828,
-0.25956398248672485,
0.032966937869787216,
0.051140617579221725,
0.14649195969104767,
0.00406361510977149,
-0.05115427449345589,
0.01429014839231968,
-0.05360214412212372,
-0.054652128368616104,
-0.06746816635131836,
-0.006135428790003061,
-0.027576493099331856,
-0.05147203803062439,
0.019243421033024788,
-0.1755700707435608,
-0.021410830318927765,
0.09424154460430145,
0.12876708805561066,
-0.1486445665359497,
-0.018640631809830666,
-0.048725154250860214,
-0.06339836865663528,
-0.0715010017156601,
-0.07038594037294388,
0.10712739825248718,
0.0513901449739933,
0.04796046018600464,
-0.07435787469148636,
-0.07092321664094925,
0.02726263552904129,
0.006906150374561548,
-0.03382374346256256,
0.08727246522903442,
0.05199531093239784,
-0.09209315478801727,
0.0756213590502739,
0.1092359870672226,
0.07177663594484329,
0.09363535046577454,
0.01574566215276718,
-0.11756632477045059,
-0.028492970392107964,
0.036266472190618515,
0.02740776725113392,
0.1465986967086792,
-0.05952361226081848,
0.04016614332795143,
0.04494241625070572,
-0.04170418903231621,
0.022319864481687546,
-0.08787637203931808,
0.024075502529740334,
0.025203049182891846,
-0.0034381982404738665,
0.06284574419260025,
-0.02525499276816845,
-0.0050758360885083675,
0.07016654312610626,
0.047779910266399384,
0.04621000960469246,
0.009655474685132504,
-0.01720241829752922,
-0.1047825813293457,
0.16950392723083496,
-0.0951867327094078,
-0.269941508769989,
-0.17632324993610382,
0.026197833940386772,
0.04035249724984169,
-0.022378476336598396,
0.031619444489479065,
-0.07056326419115067,
-0.10630585998296738,
-0.1060405746102333,
-0.002429972169920802,
0.01714223250746727,
-0.06364088505506516,
-0.0741225928068161,
0.07348573952913284,
0.04382912442088127,
-0.14902326464653015,
0.038552410900592804,
0.055694397538900375,
-0.057955220341682434,
-0.0233661737293005,
0.09118817001581192,
0.12397737801074982,
0.14583967626094818,
-0.021366750821471214,
-0.028626007959246635,
0.029004426673054695,
0.19620531797409058,
-0.13469526171684265,
0.10371150821447372,
0.13814030587673187,
-0.04545360431075096,
0.08360563963651657,
0.1560150384902954,
0.029186224564909935,
-0.08317049592733383,
0.05044832453131676,
0.04082648828625679,
-0.043159641325473785,
-0.2666129767894745,
-0.0534592866897583,
0.012832709588110447,
-0.06255637854337692,
0.09786593168973923,
0.10183793306350708,
0.11542957276105881,
0.034910861402750015,
-0.07166364789009094,
-0.043925940990448,
-0.0058974819257855415,
0.11737963557243347,
-0.05490213260054588,
-0.012639665976166725,
0.07686592638492584,
-0.05086168646812439,
0.005355054512619972,
0.10266812145709991,
0.02973790094256401,
0.17442677915096283,
0.020399179309606552,
0.11231429129838943,
0.06195578724145889,
0.08633565157651901,
0.0007386076031252742,
0.02951662428677082,
0.05147615820169449,
0.017203815281391144,
-0.002300140680745244,
-0.10421168059110641,
-0.006156572140753269,
0.1449710875749588,
0.028103826567530632,
0.029669636860489845,
-0.0018948549404740334,
-0.005003341939300299,
0.05121048167347908,
0.1746254414319992,
-0.011592294089496136,
-0.22072425484657288,
-0.0845772922039032,
0.06936841458082199,
-0.06218599155545235,
-0.12968985736370087,
-0.026130788028240204,
0.045467354357242584,
-0.17519839107990265,
0.026703642681241035,
-0.027433741837739944,
0.0919293761253357,
-0.09345759451389313,
-0.02221956104040146,
0.03687324374914169,
0.084866963326931,
-0.014529162086546421,
0.08703910559415817,
-0.14498743414878845,
0.11886418610811234,
0.02978132851421833,
0.09024628251791,
-0.11081171780824661,
0.07909037172794342,
-0.007550720125436783,
0.009180475026369095,
0.19379350543022156,
-0.011335089802742004,
-0.03514958545565605,
    … (remaining values of the 768-dimensional embedding vector omitted for readability) …
] |
null | null | transformers | Based on the paper: "UmlsBERT: Augmenting Contextual Embeddings with a Clinical Metathesaurus" (https://aclanthology.org/2021.naacl-main.139.pdf).
and the github repo: https://github.com/gmichalo/UmlsBERT
The base model was changed from BERT to SpanBERT.
The model was trained from scratch on the MIMIC dataset, using the UMLS Metathesaurus to mask words within the text.
We achieved better accuracy on the MedNLI dataset.
BERT model accuracy: 83%
SpanBERT model accuracy: 86% | {"license": "apache-2.0", "tags": ["medical"]} | text-classification | NitzanBar/umls-spanbert | [
"transformers",
"safetensors",
"bert",
"text-classification",
"medical",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | 2024-02-13T10:37:14+00:00 | [] | [] | TAGS
#transformers #safetensors #bert #text-classification #medical #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us
| Based on the paper: "UmlsBERT: Augmenting Contextual Embeddings with a Clinical Metathesaurus" (URL
and the github repo: URL
The base model was changed from BERT to SpanBERT.
The model was trained from scratch on the MIMIC dataset, using the UMLS Metathesaurus to mask words within the text.
We achieved better accuracy on the MedNLI dataset.
BERT model accuracy: 83%
SpanBERT model accuracy: 86% | [
"TAGS\n#transformers #safetensors #bert #text-classification #medical #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n"
] | [
48
] | [
"passage: TAGS\n#transformers #safetensors #bert #text-classification #medical #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n"
] | [
    … (768-dimensional embedding vector omitted for readability) …
] |
null | null | sample-factory |
An **APPO** model trained on the **doom_health_gathering_supreme** environment.
This model was trained using Sample-Factory 2.0: https://github.com/alex-petrenko/sample-factory.
Documentation for how to use Sample-Factory can be found at https://www.samplefactory.dev/
## Downloading the model
After installing Sample-Factory, download the model with:
```
python -m sample_factory.huggingface.load_from_hub -r candyhaws/rl_course_vizdoom_health_gathering_supreme
```
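Equivalently, the checkpoint files can be fetched with the generic `huggingface_hub` API. This is only a sketch: the target directory is an assumption about where Sample-Factory expects experiments to live, so the CLI above remains the safer route.

```python
# Sketch: fetch the same checkpoint with the generic Hub API.
# The local_dir below is an assumed layout; Sample-Factory's own
# loader takes care of placing files where enjoy/train expect them.
from huggingface_hub import snapshot_download

snapshot_download(
    repo_id="candyhaws/rl_course_vizdoom_health_gathering_supreme",
    local_dir="./train_dir/rl_course_vizdoom_health_gathering_supreme",
)
```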
## Using the model
To run the model after download, use the `enjoy` script corresponding to this environment:
```
# module path assumes the standard Sample-Factory ViZDoom example scripts
# (the original line contained a Colab kernel-launcher path by mistake)
python -m sf_examples.vizdoom.enjoy_vizdoom --algo=APPO --env=doom_health_gathering_supreme --train_dir=./train_dir --experiment=rl_course_vizdoom_health_gathering_supreme
```
You can also upload models to the Hugging Face Hub using the same script with the `--push_to_hub` flag.
See https://www.samplefactory.dev/10-huggingface/huggingface/ for more details
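For example, uploading this experiment might look like the following (module path as in the enjoy command above; the `--hf_repository` value is a placeholder to replace with your own `username/repo`):

```
python -m sf_examples.vizdoom.enjoy_vizdoom --algo=APPO --env=doom_health_gathering_supreme --train_dir=./train_dir --experiment=rl_course_vizdoom_health_gathering_supreme --push_to_hub --hf_repository=<your_username>/rl_course_vizdoom_health_gathering_supreme
```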
## Training with this model
To continue training with this model, use the `train` script corresponding to this environment:
```
# module path assumes the standard Sample-Factory ViZDoom example scripts
python -m sf_examples.vizdoom.train_vizdoom --algo=APPO --env=doom_health_gathering_supreme --train_dir=./train_dir --experiment=rl_course_vizdoom_health_gathering_supreme --restart_behavior=resume --train_for_env_steps=10000000000
```
Note, you may have to adjust `--train_for_env_steps` to a suitably high number as the experiment will resume at the number of steps it concluded at.
| {"library_name": "sample-factory", "tags": ["deep-reinforcement-learning", "reinforcement-learning", "sample-factory"], "model-index": [{"name": "APPO", "results": [{"task": {"type": "reinforcement-learning", "name": "reinforcement-learning"}, "dataset": {"name": "doom_health_gathering_supreme", "type": "doom_health_gathering_supreme"}, "metrics": [{"type": "mean_reward", "value": "12.53 +/- 5.59", "name": "mean_reward", "verified": false}]}]}]} | reinforcement-learning | candyhaws/rl_course_vizdoom_health_gathering_supreme | [
"sample-factory",
"tensorboard",
"deep-reinforcement-learning",
"reinforcement-learning",
"model-index",
"region:us"
] | 2024-02-13T10:37:41+00:00 | [] | [] | TAGS
#sample-factory #tensorboard #deep-reinforcement-learning #reinforcement-learning #model-index #region-us
|
An APPO model trained on the doom_health_gathering_supreme environment.
This model was trained using Sample-Factory 2.0: URL
Documentation for how to use Sample-Factory can be found at URL
## Downloading the model
After installing Sample-Factory, download the model with:
## Using the model
To run the model after download, use the 'enjoy' script corresponding to this environment:
You can also upload models to the Hugging Face Hub using the same script with the '--push_to_hub' flag.
See URL for more details
## Training with this model
To continue training with this model, use the 'train' script corresponding to this environment:
Note, you may have to adjust '--train_for_env_steps' to a suitably high number as the experiment will resume at the number of steps it concluded at.
| [
"## Downloading the model\n\nAfter installing Sample-Factory, download the model with:",
"## Using the model\n\nTo run the model after download, use the 'enjoy' script corresponding to this environment:\n\n\n\nYou can also upload models to the Hugging Face Hub using the same script with the '--push_to_hub' flag.\nSee URL for more details",
"## Training with this model\n\nTo continue training with this model, use the 'train' script corresponding to this environment:\n\n\nNote, you may have to adjust '--train_for_env_steps' to a suitably high number as the experiment will resume at the number of steps it concluded at."
] | [
"TAGS\n#sample-factory #tensorboard #deep-reinforcement-learning #reinforcement-learning #model-index #region-us \n",
"## Downloading the model\n\nAfter installing Sample-Factory, download the model with:",
"## Using the model\n\nTo run the model after download, use the 'enjoy' script corresponding to this environment:\n\n\n\nYou can also upload models to the Hugging Face Hub using the same script with the '--push_to_hub' flag.\nSee URL for more details",
"## Training with this model\n\nTo continue training with this model, use the 'train' script corresponding to this environment:\n\n\nNote, you may have to adjust '--train_for_env_steps' to a suitably high number as the experiment will resume at the number of steps it concluded at."
] | [
34,
19,
59,
67
] | [
"passage: TAGS\n#sample-factory #tensorboard #deep-reinforcement-learning #reinforcement-learning #model-index #region-us \n## Downloading the model\n\nAfter installing Sample-Factory, download the model with:## Using the model\n\nTo run the model after download, use the 'enjoy' script corresponding to this environment:\n\n\n\nYou can also upload models to the Hugging Face Hub using the same script with the '--push_to_hub' flag.\nSee URL for more details## Training with this model\n\nTo continue training with this model, use the 'train' script corresponding to this environment:\n\n\nNote, you may have to adjust '--train_for_env_steps' to a suitably high number as the experiment will resume at the number of steps it concluded at."
] | [
    … (768-dimensional embedding vector omitted for readability) …
] |
null | null | transformers | # [MaziyarPanahi/sqlcoder-7b-2-GGUF](https://huggingface.co/MaziyarPanahi/sqlcoder-7b-2-GGUF)
- Model creator: [defog](https://huggingface.co/defog)
- Original model: [defog/sqlcoder-7b-2](https://huggingface.co/defog/sqlcoder-7b-2)
## Description
[MaziyarPanahi/sqlcoder-7b-2-GGUF](https://huggingface.co/MaziyarPanahi/sqlcoder-7b-2-GGUF) contains GGUF format model files for [defog/sqlcoder-7b-2](https://huggingface.co/defog/sqlcoder-7b-2).
## How to use
Thanks to [TheBloke](https://huggingface.co/TheBloke) for preparing an amazing README on how to use GGUF models:
### About GGUF
GGUF is a new format introduced by the llama.cpp team on August 21st 2023. It is a replacement for GGML, which is no longer supported by llama.cpp.
Here is an incomplete list of clients and libraries that are known to support GGUF:
* [llama.cpp](https://github.com/ggerganov/llama.cpp). The source project for GGUF. Offers a CLI and a server option.
* [text-generation-webui](https://github.com/oobabooga/text-generation-webui), the most widely used web UI, with many features and powerful extensions. Supports GPU acceleration.
* [KoboldCpp](https://github.com/LostRuins/koboldcpp), a fully featured web UI, with GPU accel across all platforms and GPU architectures. Especially good for story telling.
* [GPT4All](https://gpt4all.io/index.html), a free and open source local running GUI, supporting Windows, Linux and macOS with full GPU accel.
* [LM Studio](https://lmstudio.ai/), an easy-to-use and powerful local GUI for Windows and macOS (Silicon), with GPU acceleration. Linux available, in beta as of 27/11/2023.
* [LoLLMS Web UI](https://github.com/ParisNeo/lollms-webui), a great web UI with many interesting and unique features, including a full model library for easy model selection.
* [Faraday.dev](https://faraday.dev/), an attractive and easy to use character-based chat GUI for Windows and macOS (both Silicon and Intel), with GPU acceleration.
* [llama-cpp-python](https://github.com/abetlen/llama-cpp-python), a Python library with GPU accel, LangChain support, and OpenAI-compatible API server.
* [candle](https://github.com/huggingface/candle), a Rust ML framework with a focus on performance, including GPU support, and ease of use.
* [ctransformers](https://github.com/marella/ctransformers), a Python library with GPU accel, LangChain support, and OpenAI-compatible API server. Note, as of time of writing (November 27th 2023), ctransformers has not been updated in a long time and does not support many recent models.
### Explanation of quantisation methods
<details>
<summary>Click to see details</summary>
The new methods available are:
* GGML_TYPE_Q2_K - "type-1" 2-bit quantization in super-blocks containing 16 blocks, each block having 16 weights. Block scales and mins are quantized with 4 bits. This ends up effectively using 2.5625 bits per weight (bpw)
* GGML_TYPE_Q3_K - "type-0" 3-bit quantization in super-blocks containing 16 blocks, each block having 16 weights. Scales are quantized with 6 bits. This ends up using 3.4375 bpw.
* GGML_TYPE_Q4_K - "type-1" 4-bit quantization in super-blocks containing 8 blocks, each block having 32 weights. Scales and mins are quantized with 6 bits. This ends up using 4.5 bpw.
* GGML_TYPE_Q5_K - "type-1" 5-bit quantization. Same super-block structure as GGML_TYPE_Q4_K resulting in 5.5 bpw
* GGML_TYPE_Q6_K - "type-0" 6-bit quantization. Super-blocks with 16 blocks, each block having 16 weights. Scales are quantized with 8 bits. This ends up using 6.5625 bpw

</details>
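To get a feel for what these bpw figures mean in practice, here is a rough size estimate for a 7B-parameter model. This is a back-of-the-envelope sketch: real GGUF files also carry metadata and a few higher-precision tensors, so actual sizes are somewhat larger.

```python
# Rough GGUF file-size estimate from bits-per-weight (bpw).
# Assumes ~7e9 parameters for a 7B model; treat the output as a ballpark.
PARAMS = 7e9

for name, bpw in [("Q2_K", 2.5625), ("Q3_K", 3.4375), ("Q4_K", 4.5),
                  ("Q5_K", 5.5), ("Q6_K", 6.5625)]:
    print(f"{name}: ~{PARAMS * bpw / 8 / 1e9:.1f} GB")
```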
## How to download GGUF files
**Note for manual downloaders:** You almost never want to clone the entire repo! Multiple different quantisation formats are provided, and most users only want to pick and download a single file.
The following clients/libraries will automatically download models for you, providing a list of available models to choose from:
* LM Studio
* LoLLMS Web UI
* Faraday.dev
### In `text-generation-webui`
Under Download Model, you can enter the model repo: [MaziyarPanahi/sqlcoder-7b-2-GGUF](https://huggingface.co/MaziyarPanahi/sqlcoder-7b-2-GGUF) and below it, a specific filename to download, such as: sqlcoder-7b-2-GGUF.Q4_K_M.gguf.
Then click Download.
### On the command line, including multiple files at once
I recommend using the `huggingface-hub` Python library:
```shell
pip3 install huggingface-hub
```
Then you can download any individual model file to the current directory, at high speed, with a command like this:
```shell
huggingface-cli download MaziyarPanahi/sqlcoder-7b-2-GGUF sqlcoder-7b-2-GGUF.Q4_K_M.gguf --local-dir . --local-dir-use-symlinks False
```
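If you prefer Python, the same single-file download can be done with the `huggingface_hub` library (installed above); the repo and filename match the CLI example:

```python
# Python equivalent of the huggingface-cli download above.
from huggingface_hub import hf_hub_download

path = hf_hub_download(
    repo_id="MaziyarPanahi/sqlcoder-7b-2-GGUF",
    filename="sqlcoder-7b-2-GGUF.Q4_K_M.gguf",
    local_dir=".",  # download into the current directory
)
print(path)  # prints the local path of the downloaded file
```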
<details>
<summary>More advanced huggingface-cli download usage (click to read)</summary>
You can also download multiple files at once with a pattern:
```shell
huggingface-cli download MaziyarPanahi/sqlcoder-7b-2-GGUF --local-dir . --local-dir-use-symlinks False --include='*Q4_K*gguf'
```
For more documentation on downloading with `huggingface-cli`, please see: [HF -> Hub Python Library -> Download files -> Download from the CLI](https://huggingface.co/docs/huggingface_hub/guides/download#download-from-the-cli).
To accelerate downloads on fast connections (1Gbit/s or higher), install `hf_transfer`:
```shell
pip3 install hf_transfer
```
And set environment variable `HF_HUB_ENABLE_HF_TRANSFER` to `1`:
```shell
HF_HUB_ENABLE_HF_TRANSFER=1 huggingface-cli download MaziyarPanahi/sqlcoder-7b-2-GGUF sqlcoder-7b-2-GGUF.Q4_K_M.gguf --local-dir . --local-dir-use-symlinks False
```
Windows Command Line users: You can set the environment variable by running `set HF_HUB_ENABLE_HF_TRANSFER=1` before the download command.
</details>
## Example `llama.cpp` command
Make sure you are using `llama.cpp` from commit [d0cee0d](https://github.com/ggerganov/llama.cpp/commit/d0cee0d36d5be95a0d9088b674dbb27354107221) or later.
```shell
./main -ngl 35 -m sqlcoder-7b-2-GGUF.Q4_K_M.gguf --color -c 32768 --temp 0.7 --repeat_penalty 1.1 -n -1 -p "<|im_start|>system
{system_message}<|im_end|>
<|im_start|>user
{prompt}<|im_end|>
<|im_start|>assistant"
```
Change `-ngl 35` to the number of layers to offload to GPU. Remove it if you don't have GPU acceleration.
Change `-c 32768` to the desired sequence length. For extended sequence models - eg 8K, 16K, 32K - the necessary RoPE scaling parameters are read from the GGUF file and set by llama.cpp automatically. Note that longer sequence lengths require much more resources, so you may need to reduce this value.
If you want to have a chat-style conversation, replace the `-p <PROMPT>` argument with `-i -ins`
For other parameters and how to use them, please refer to [the llama.cpp documentation](https://github.com/ggerganov/llama.cpp/blob/master/examples/main/README.md)
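For instance, applying the chat-style change described above gives:

```shell
./main -ngl 35 -m sqlcoder-7b-2-GGUF.Q4_K_M.gguf --color -c 32768 --temp 0.7 --repeat_penalty 1.1 -i -ins
```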
## How to run in `text-generation-webui`
Further instructions can be found in the text-generation-webui documentation, here: [text-generation-webui/docs/04 ‐ Model Tab.md](https://github.com/oobabooga/text-generation-webui/blob/main/docs/04%20%E2%80%90%20Model%20Tab.md#llamacpp).
## How to run from Python code
You can use GGUF models from Python using the [llama-cpp-python](https://github.com/abetlen/llama-cpp-python) or [ctransformers](https://github.com/marella/ctransformers) libraries. Note that at the time of writing (Nov 27th 2023), ctransformers has not been updated for some time and is not compatible with some recent models. Therefore I recommend you use llama-cpp-python.
### How to load this model in Python code, using llama-cpp-python
For full documentation, please see: [llama-cpp-python docs](https://abetlen.github.io/llama-cpp-python/).
#### First install the package
Run one of the following commands, according to your system:
```shell
# Base llama-cpp-python with no GPU acceleration
pip install llama-cpp-python
# With NVidia CUDA acceleration
CMAKE_ARGS="-DLLAMA_CUBLAS=on" pip install llama-cpp-python
# Or with OpenBLAS acceleration
CMAKE_ARGS="-DLLAMA_BLAS=ON -DLLAMA_BLAS_VENDOR=OpenBLAS" pip install llama-cpp-python
# Or with CLBLast acceleration
CMAKE_ARGS="-DLLAMA_CLBLAST=on" pip install llama-cpp-python
# Or with AMD ROCm GPU acceleration (Linux only)
CMAKE_ARGS="-DLLAMA_HIPBLAS=on" pip install llama-cpp-python
# Or with Metal GPU acceleration for macOS systems only
CMAKE_ARGS="-DLLAMA_METAL=on" pip install llama-cpp-python
# On Windows, to set the CMAKE_ARGS variable in PowerShell, follow this format; e.g. for NVidia CUDA:
$env:CMAKE_ARGS = "-DLLAMA_CUBLAS=on"
pip install llama-cpp-python
```
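Whichever variant you installed, a quick import check confirms the wheel built and installed cleanly (this verifies the install, not GPU offload):

```python
# Verify the package imports and report its version.
import llama_cpp
print(llama_cpp.__version__)
```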
#### Simple llama-cpp-python example code
```python
from llama_cpp import Llama
# Set gpu_layers to the number of layers to offload to GPU. Set to 0 if no GPU acceleration is available on your system.
llm = Llama(
model_path="./sqlcoder-7b-2-GGUF.Q4_K_M.gguf", # Download the model file first
n_ctx=32768, # The max sequence length to use - note that longer sequence lengths require much more resources
n_threads=8, # The number of CPU threads to use, tailor to your system and the resulting performance
n_gpu_layers=35 # The number of layers to offload to GPU, if you have GPU acceleration available
)
# Simple inference example
output = llm(
"<|im_start|>system
{system_message}<|im_end|>
<|im_start|>user
{prompt}<|im_end|>
<|im_start|>assistant", # Prompt
max_tokens=512, # Generate up to 512 tokens
stop=["</s>"], # Example stop token - not necessarily correct for this specific model! Please check before using.
echo=True # Whether to echo the prompt
)
# Chat Completion API
llm = Llama(model_path="./sqlcoder-7b-2-GGUF.Q4_K_M.gguf", chat_format="llama-2") # Set chat_format according to the model you are using
llm.create_chat_completion(
messages = [
{"role": "system", "content": "You are a story writing assistant."},
{
"role": "user",
"content": "Write a story about llamas."
}
]
)
```
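Since sqlcoder-7b-2 is a text-to-SQL model, the generic ChatML prompt above is unlikely to be its natural format; a typical call pairs a database schema with a question. The sketch below is illustrative only (the prompt wording is an assumption, not defog's official template); check the original model card for the recommended format:

```python
from llama_cpp import Llama

llm = Llama(model_path="./sqlcoder-7b-2-GGUF.Q4_K_M.gguf", n_gpu_layers=35)

# Illustrative text-to-SQL prompt: a schema plus a natural-language question.
prompt = """### Task
Generate a SQL query to answer the following question: How many orders were placed in 2023?

### Database Schema
CREATE TABLE orders (id INT, placed_at DATE, total NUMERIC);

### SQL
"""

out = llm(prompt, max_tokens=256, temperature=0.0, stop=[";"])
print(out["choices"][0]["text"].strip() + ";")
```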
## How to use with LangChain
Here are guides on using llama-cpp-python and ctransformers with LangChain:
* [LangChain + llama-cpp-python](https://python.langchain.com/docs/integrations/llms/llamacpp)
* [LangChain + ctransformers](https://python.langchain.com/docs/integrations/providers/ctransformers) | {"tags": ["quantized", "2-bit", "3-bit", "4-bit", "5-bit", "6-bit", "8-bit", "GGUF", "transformers", "safetensors", "gguf", "llama", "text-generation", "license:cc-by-sa-4.0", "autotrain_compatible", "endpoints_compatible", "has_space", "text-generation-inference", "region:us"], "model_name": "sqlcoder-7b-2-GGUF", "base_model": "defog/sqlcoder-7b-2", "inference": false, "model_creator": "defog", "pipeline_tag": "text-generation", "quantized_by": "MaziyarPanahi"} | text-generation | MaziyarPanahi/sqlcoder-7b-2-GGUF | [
"transformers",
"gguf",
"mistral",
"quantized",
"2-bit",
"3-bit",
"4-bit",
"5-bit",
"6-bit",
"8-bit",
"GGUF",
"safetensors",
"llama",
"text-generation",
"license:cc-by-sa-4.0",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us",
"base_model:defog/sqlcoder-7b-2"
] | 2024-02-13T10:37:51+00:00 | [] | [] | TAGS
#transformers #gguf #mistral #quantized #2-bit #3-bit #4-bit #5-bit #6-bit #8-bit #GGUF #safetensors #llama #text-generation #license-cc-by-sa-4.0 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us #base_model-defog/sqlcoder-7b-2
| # MaziyarPanahi/sqlcoder-7b-2-GGUF
- Model creator: defog
- Original model: defog/sqlcoder-7b-2
## Description
MaziyarPanahi/sqlcoder-7b-2-GGUF contains GGUF format model files for defog/sqlcoder-7b-2.
## How to use
Thanks to TheBloke for preparing an amazing README on how to use GGUF models:
### About GGUF
GGUF is a new format introduced by the URL team on August 21st 2023. It is a replacement for GGML, which is no longer supported by URL.
Here is an incomplete list of clients and libraries that are known to support GGUF:
* URL. The source project for GGUF. Offers a CLI and a server option.
* text-generation-webui, the most widely used web UI, with many features and powerful extensions. Supports GPU acceleration.
* KoboldCpp, a fully featured web UI, with GPU accel across all platforms and GPU architectures. Especially good for storytelling.
* GPT4All, a free and open source local running GUI, supporting Windows, Linux and macOS with full GPU accel.
* LM Studio, an easy-to-use and powerful local GUI for Windows and macOS (Silicon), with GPU acceleration. Linux available, in beta as of 27/11/2023.
* LoLLMS Web UI, a great web UI with many interesting and unique features, including a full model library for easy model selection.
* URL, an attractive and easy to use character-based chat GUI for Windows and macOS (both Silicon and Intel), with GPU acceleration.
* llama-cpp-python, a Python library with GPU accel, LangChain support, and OpenAI-compatible API server.
* candle, a Rust ML framework with a focus on performance, including GPU support, and ease of use.
* ctransformers, a Python library with GPU accel, LangChain support, and OpenAI-compatible API server. Note, as of time of writing (November 27th 2023), ctransformers has not been updated in a long time and does not support many recent models.
### Explanation of quantisation methods
<details>
<summary>Click to see details</summary>
The new methods available are:
* GGML_TYPE_Q2_K - "type-1" 2-bit quantization in super-blocks containing 16 blocks, each block having 16 weights. Block scales and mins are quantized with 4 bits. This ends up effectively using 2.5625 bits per weight (bpw)
* GGML_TYPE_Q3_K - "type-0" 3-bit quantization in super-blocks containing 16 blocks, each block having 16 weights. Scales are quantized with 6 bits. This ends up using 3.4375 bpw.
* GGML_TYPE_Q4_K - "type-1" 4-bit quantization in super-blocks containing 8 blocks, each block having 32 weights. Scales and mins are quantized with 6 bits. This ends up using 4.5 bpw.
* GGML_TYPE_Q5_K - "type-1" 5-bit quantization. Same super-block structure as GGML_TYPE_Q4_K resulting in 5.5 bpw
* GGML_TYPE_Q6_K - "type-0" 6-bit quantization. Super-blocks with 16 blocks, each block having 16 weights. Scales are quantized with 8 bits. This ends up using 6.5625 bpw
## How to download GGUF files
Note for manual downloaders: You almost never want to clone the entire repo! Multiple different quantisation formats are provided, and most users only want to pick and download a single file.
The following clients/libraries will automatically download models for you, providing a list of available models to choose from:
* LM Studio
* LoLLMS Web UI
* URL
### In 'text-generation-webui'
Under Download Model, you can enter the model repo: MaziyarPanahi/sqlcoder-7b-2-GGUF and below it, a specific filename to download, such as: sqlcoder-7b-2-GGUF.Q4_K_M.gguf.
Then click Download.
### On the command line, including multiple files at once
I recommend using the 'huggingface-hub' Python library:
Then you can download any individual model file to the current directory, at high speed, with a command like this:
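For example, using the repo id and filename from this card (exact flags can vary between `huggingface_hub` versions):

```
# Install the huggingface-hub library first
pip3 install huggingface-hub

# Then fetch a single quantised file from this repo into the current directory
huggingface-cli download MaziyarPanahi/sqlcoder-7b-2-GGUF sqlcoder-7b-2-GGUF.Q4_K_M.gguf --local-dir . --local-dir-use-symlinks False
```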
</details>
<details>
<summary>More advanced huggingface-cli download usage (click to read)</summary>
You can also download multiple files at once with a pattern:
For more documentation on downloading with 'huggingface-cli', please see: HF -> Hub Python Library -> Download files -> Download from the CLI.
To accelerate downloads on fast connections (1Gbit/s or higher), install 'hf_transfer':
And set environment variable 'HF_HUB_ENABLE_HF_TRANSFER' to '1':
Windows Command Line users: You can set the environment variable by running 'set HF_HUB_ENABLE_HF_TRANSFER=1' before the download command.
</details>
## Example 'URL' command
Make sure you are using 'URL' from commit d0cee0d or later.
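A representative invocation, assuming the Q4_K_M file from this repo (the sampling settings are illustrative; the notes below explain how to adapt the flags):

```
./main -ngl 32 -m sqlcoder-7b-2-GGUF.Q4_K_M.gguf --color -c 32768 --temp 0.7 --repeat_penalty 1.1 -n -1 -p "<PROMPT>"
```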
Change '-ngl 32' to the number of layers to offload to GPU. Remove it if you don't have GPU acceleration.
Change '-c 32768' to the desired sequence length. For extended sequence models - eg 8K, 16K, 32K - the necessary RoPE scaling parameters are read from the GGUF file and set by URL automatically. Note that longer sequence lengths require much more resources, so you may need to reduce this value.
If you want to have a chat-style conversation, replace the '-p <PROMPT>' argument with '-i -ins'
For other parameters and how to use them, please refer to the URL documentation
## How to run in 'text-generation-webui'
Further instructions can be found in the text-generation-webui documentation, here: text-generation-webui/docs/04 ‐ Model URL.
## How to run from Python code
You can use GGUF models from Python using the llama-cpp-python or ctransformers libraries. Note that at the time of writing (Nov 27th 2023), ctransformers has not been updated for some time and is not compatible with some recent models. Therefore I recommend you use llama-cpp-python.
### How to load this model in Python code, using llama-cpp-python
For full documentation, please see: llama-cpp-python docs.
#### First install the package
Run one of the following commands, according to your system:
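Representative variants (the CMake flags match the llama-cpp-python builds current when this card was written; check the project's README for your version):

```
# Base llama-cpp-python with no GPU acceleration
pip install llama-cpp-python

# With NVidia CUDA acceleration
CMAKE_ARGS="-DLLAMA_CUBLAS=on" pip install llama-cpp-python

# Or with Metal GPU acceleration for macOS systems only
CMAKE_ARGS="-DLLAMA_METAL=on" pip install llama-cpp-python
```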
#### Simple llama-cpp-python example code
## How to use with LangChain
Here are guides on using llama-cpp-python and ctransformers with LangChain:
* LangChain + llama-cpp-python
* LangChain + ctransformers | [
"# MaziyarPanahi/sqlcoder-7b-2-GGUF\n- Model creator: defog\n- Original model: defog/sqlcoder-7b-2",
"## Description\nMaziyarPanahi/sqlcoder-7b-2-GGUF contains GGUF format model files for defog/sqlcoder-7b-2.",
"## How to use\nThanks to TheBloke for preparing an amazing README on how to use GGUF models:",
"### About GGUF\n\nGGUF is a new format introduced by the URL team on August 21st 2023. It is a replacement for GGML, which is no longer supported by URL.\n\nHere is an incomplete list of clients and libraries that are known to support GGUF:\n\n* URL. The source project for GGUF. Offers a CLI and a server option.\n* text-generation-webui, the most widely used web UI, with many features and powerful extensions. Supports GPU acceleration.\n* KoboldCpp, a fully featured web UI, with GPU accel across all platforms and GPU architectures. Especially good for story telling.\n* GPT4All, a free and open source local running GUI, supporting Windows, Linux and macOS with full GPU accel.\n* LM Studio, an easy-to-use and powerful local GUI for Windows and macOS (Silicon), with GPU acceleration. Linux available, in beta as of 27/11/2023.\n* LoLLMS Web UI, a great web UI with many interesting and unique features, including a full model library for easy model selection.\n* URL, an attractive and easy to use character-based chat GUI for Windows and macOS (both Silicon and Intel), with GPU acceleration.\n* llama-cpp-python, a Python library with GPU accel, LangChain support, and OpenAI-compatible API server.\n* candle, a Rust ML framework with a focus on performance, including GPU support, and ease of use.\n* ctransformers, a Python library with GPU accel, LangChain support, and OpenAI-compatible AI server. Note, as of time of writing (November 27th 2023), ctransformers has not been updated in a long time and does not support many recent models.",
"### Explanation of quantisation methods\n\n<details>\n <summary>Click to see details</summary>\n\nThe new methods available are:\n\n* GGML_TYPE_Q2_K - \"type-1\" 2-bit quantization in super-blocks containing 16 blocks, each block having 16 weight. Block scales and mins are quantized with 4 bits. This ends up effectively using 2.5625 bits per weight (bpw)\n* GGML_TYPE_Q3_K - \"type-0\" 3-bit quantization in super-blocks containing 16 blocks, each block having 16 weights. Scales are quantized with 6 bits. This end up using 3.4375 bpw.\n* GGML_TYPE_Q4_K - \"type-1\" 4-bit quantization in super-blocks containing 8 blocks, each block having 32 weights. Scales and mins are quantized with 6 bits. This ends up using 4.5 bpw.\n* GGML_TYPE_Q5_K - \"type-1\" 5-bit quantization. Same super-block structure as GGML_TYPE_Q4_K resulting in 5.5 bpw\n* GGML_TYPE_Q6_K - \"type-0\" 6-bit quantization. Super-blocks with 16 blocks, each block having 16 weights. Scales are quantized with 8 bits. This ends up using 6.5625 bpw",
"## How to download GGUF files\n\nNote for manual downloaders: You almost never want to clone the entire repo! Multiple different quantisation formats are provided, and most users only want to pick and download a single file.\n\nThe following clients/libraries will automatically download models for you, providing a list of available models to choose from:\n\n* LM Studio\n* LoLLMS Web UI\n* URL",
"### In 'text-generation-webui'\n\nUnder Download Model, you can enter the model repo: MaziyarPanahi/sqlcoder-7b-2-GGUF and below it, a specific filename to download, such as: sqlcoder-7b-2-GGUF.Q4_K_M.gguf.\n\nThen click Download.",
"### On the command line, including multiple files at once\n\nI recommend using the 'huggingface-hub' Python library:\n\n\n\nThen you can download any individual model file to the current directory, at high speed, with a command like this:\n\n\n</details>\n<details>\n <summary>More advanced huggingface-cli download usage (click to read)</summary>\n\nYou can also download multiple files at once with a pattern:\n\n\n\nFor more documentation on downloading with 'huggingface-cli', please see: HF -> Hub Python Library -> Download files -> Download from the CLI.\n\nTo accelerate downloads on fast connections (1Gbit/s or higher), install 'hf_transfer':\n\n\n\nAnd set environment variable 'HF_HUB_ENABLE_HF_TRANSFER' to '1':\n\n\n\nWindows Command Line users: You can set the environment variable by running 'set HF_HUB_ENABLE_HF_TRANSFER=1' before the download command.\n</details>",
"## Example 'URL' command\n\nMake sure you are using 'URL' from commit d0cee0d or later.\n\n\n\nChange '-ngl 32' to the number of layers to offload to GPU. Remove it if you don't have GPU acceleration.\n\nChange '-c 32768' to the desired sequence length. For extended sequence models - eg 8K, 16K, 32K - the necessary RoPE scaling parameters are read from the GGUF file and set by URL automatically. Note that longer sequence lengths require much more resources, so you may need to reduce this value.\n\nIf you want to have a chat-style conversation, replace the '-p <PROMPT>' argument with '-i -ins'\n\nFor other parameters and how to use them, please refer to the URL documentation",
"## How to run in 'text-generation-webui'\n\nFurther instructions can be found in the text-generation-webui documentation, here: text-generation-webui/docs/04 ‐ Model URL.",
"## How to run from Python code\n\nYou can use GGUF models from Python using the llama-cpp-python or ctransformers libraries. Note that at the time of writing (Nov 27th 2023), ctransformers has not been updated for some time and is not compatible with some recent models. Therefore I recommend you use llama-cpp-python.",
"### How to load this model in Python code, using llama-cpp-python\n\nFor full documentation, please see: llama-cpp-python docs.",
"#### First install the package\n\nRun one of the following commands, according to your system:",
"#### Simple llama-cpp-python example code",
"## How to use with LangChain\n\nHere are guides on using llama-cpp-python and ctransformers with LangChain:\n\n* LangChain + llama-cpp-python\n* LangChain + ctransformers"
] | [
"TAGS\n#transformers #gguf #mistral #quantized #2-bit #3-bit #4-bit #5-bit #6-bit #8-bit #GGUF #safetensors #llama #text-generation #license-cc-by-sa-4.0 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us #base_model-defog/sqlcoder-7b-2 \n",
"# MaziyarPanahi/sqlcoder-7b-2-GGUF\n- Model creator: defog\n- Original model: defog/sqlcoder-7b-2",
"## Description\nMaziyarPanahi/sqlcoder-7b-2-GGUF contains GGUF format model files for defog/sqlcoder-7b-2.",
"## How to use\nThanks to TheBloke for preparing an amazing README on how to use GGUF models:",
"### About GGUF\n\nGGUF is a new format introduced by the URL team on August 21st 2023. It is a replacement for GGML, which is no longer supported by URL.\n\nHere is an incomplete list of clients and libraries that are known to support GGUF:\n\n* URL. The source project for GGUF. Offers a CLI and a server option.\n* text-generation-webui, the most widely used web UI, with many features and powerful extensions. Supports GPU acceleration.\n* KoboldCpp, a fully featured web UI, with GPU accel across all platforms and GPU architectures. Especially good for story telling.\n* GPT4All, a free and open source local running GUI, supporting Windows, Linux and macOS with full GPU accel.\n* LM Studio, an easy-to-use and powerful local GUI for Windows and macOS (Silicon), with GPU acceleration. Linux available, in beta as of 27/11/2023.\n* LoLLMS Web UI, a great web UI with many interesting and unique features, including a full model library for easy model selection.\n* URL, an attractive and easy to use character-based chat GUI for Windows and macOS (both Silicon and Intel), with GPU acceleration.\n* llama-cpp-python, a Python library with GPU accel, LangChain support, and OpenAI-compatible API server.\n* candle, a Rust ML framework with a focus on performance, including GPU support, and ease of use.\n* ctransformers, a Python library with GPU accel, LangChain support, and OpenAI-compatible AI server. Note, as of time of writing (November 27th 2023), ctransformers has not been updated in a long time and does not support many recent models.",
"### Explanation of quantisation methods\n\n<details>\n <summary>Click to see details</summary>\n\nThe new methods available are:\n\n* GGML_TYPE_Q2_K - \"type-1\" 2-bit quantization in super-blocks containing 16 blocks, each block having 16 weight. Block scales and mins are quantized with 4 bits. This ends up effectively using 2.5625 bits per weight (bpw)\n* GGML_TYPE_Q3_K - \"type-0\" 3-bit quantization in super-blocks containing 16 blocks, each block having 16 weights. Scales are quantized with 6 bits. This end up using 3.4375 bpw.\n* GGML_TYPE_Q4_K - \"type-1\" 4-bit quantization in super-blocks containing 8 blocks, each block having 32 weights. Scales and mins are quantized with 6 bits. This ends up using 4.5 bpw.\n* GGML_TYPE_Q5_K - \"type-1\" 5-bit quantization. Same super-block structure as GGML_TYPE_Q4_K resulting in 5.5 bpw\n* GGML_TYPE_Q6_K - \"type-0\" 6-bit quantization. Super-blocks with 16 blocks, each block having 16 weights. Scales are quantized with 8 bits. This ends up using 6.5625 bpw",
"## How to download GGUF files\n\nNote for manual downloaders: You almost never want to clone the entire repo! Multiple different quantisation formats are provided, and most users only want to pick and download a single file.\n\nThe following clients/libraries will automatically download models for you, providing a list of available models to choose from:\n\n* LM Studio\n* LoLLMS Web UI\n* URL",
"### In 'text-generation-webui'\n\nUnder Download Model, you can enter the model repo: MaziyarPanahi/sqlcoder-7b-2-GGUF and below it, a specific filename to download, such as: sqlcoder-7b-2-GGUF.Q4_K_M.gguf.\n\nThen click Download.",
"### On the command line, including multiple files at once\n\nI recommend using the 'huggingface-hub' Python library:\n\n\n\nThen you can download any individual model file to the current directory, at high speed, with a command like this:\n\n\n</details>\n<details>\n <summary>More advanced huggingface-cli download usage (click to read)</summary>\n\nYou can also download multiple files at once with a pattern:\n\n\n\nFor more documentation on downloading with 'huggingface-cli', please see: HF -> Hub Python Library -> Download files -> Download from the CLI.\n\nTo accelerate downloads on fast connections (1Gbit/s or higher), install 'hf_transfer':\n\n\n\nAnd set environment variable 'HF_HUB_ENABLE_HF_TRANSFER' to '1':\n\n\n\nWindows Command Line users: You can set the environment variable by running 'set HF_HUB_ENABLE_HF_TRANSFER=1' before the download command.\n</details>",
"## Example 'URL' command\n\nMake sure you are using 'URL' from commit d0cee0d or later.\n\n\n\nChange '-ngl 32' to the number of layers to offload to GPU. Remove it if you don't have GPU acceleration.\n\nChange '-c 32768' to the desired sequence length. For extended sequence models - eg 8K, 16K, 32K - the necessary RoPE scaling parameters are read from the GGUF file and set by URL automatically. Note that longer sequence lengths require much more resources, so you may need to reduce this value.\n\nIf you want to have a chat-style conversation, replace the '-p <PROMPT>' argument with '-i -ins'\n\nFor other parameters and how to use them, please refer to the URL documentation",
"## How to run in 'text-generation-webui'\n\nFurther instructions can be found in the text-generation-webui documentation, here: text-generation-webui/docs/04 ‐ Model URL.",
"## How to run from Python code\n\nYou can use GGUF models from Python using the llama-cpp-python or ctransformers libraries. Note that at the time of writing (Nov 27th 2023), ctransformers has not been updated for some time and is not compatible with some recent models. Therefore I recommend you use llama-cpp-python.",
"### How to load this model in Python code, using llama-cpp-python\n\nFor full documentation, please see: llama-cpp-python docs.",
"#### First install the package\n\nRun one of the following commands, according to your system:",
"#### Simple llama-cpp-python example code",
"## How to use with LangChain\n\nHere are guides on using llama-cpp-python and ctransformers with LangChain:\n\n* LangChain + llama-cpp-python\n* LangChain + ctransformers"
] | [
105,
36,
37,
26,
401,
323,
84,
77,
218,
182,
49,
77,
36,
19,
12,
50
] | [
"passage: TAGS\n#transformers #gguf #mistral #quantized #2-bit #3-bit #4-bit #5-bit #6-bit #8-bit #GGUF #safetensors #llama #text-generation #license-cc-by-sa-4.0 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us #base_model-defog/sqlcoder-7b-2 \n# MaziyarPanahi/sqlcoder-7b-2-GGUF\n- Model creator: defog\n- Original model: defog/sqlcoder-7b-2## Description\nMaziyarPanahi/sqlcoder-7b-2-GGUF contains GGUF format model files for defog/sqlcoder-7b-2.## How to use\nThanks to TheBloke for preparing an amazing README on how to use GGUF models:",
"passage: ### About GGUF\n\nGGUF is a new format introduced by the URL team on August 21st 2023. It is a replacement for GGML, which is no longer supported by URL.\n\nHere is an incomplete list of clients and libraries that are known to support GGUF:\n\n* URL. The source project for GGUF. Offers a CLI and a server option.\n* text-generation-webui, the most widely used web UI, with many features and powerful extensions. Supports GPU acceleration.\n* KoboldCpp, a fully featured web UI, with GPU accel across all platforms and GPU architectures. Especially good for story telling.\n* GPT4All, a free and open source local running GUI, supporting Windows, Linux and macOS with full GPU accel.\n* LM Studio, an easy-to-use and powerful local GUI for Windows and macOS (Silicon), with GPU acceleration. Linux available, in beta as of 27/11/2023.\n* LoLLMS Web UI, a great web UI with many interesting and unique features, including a full model library for easy model selection.\n* URL, an attractive and easy to use character-based chat GUI for Windows and macOS (both Silicon and Intel), with GPU acceleration.\n* llama-cpp-python, a Python library with GPU accel, LangChain support, and OpenAI-compatible API server.\n* candle, a Rust ML framework with a focus on performance, including GPU support, and ease of use.\n* ctransformers, a Python library with GPU accel, LangChain support, and OpenAI-compatible AI server. Note, as of time of writing (November 27th 2023), ctransformers has not been updated in a long time and does not support many recent models.### Explanation of quantisation methods\n\n<details>\n <summary>Click to see details</summary>\n\nThe new methods available are:\n\n* GGML_TYPE_Q2_K - \"type-1\" 2-bit quantization in super-blocks containing 16 blocks, each block having 16 weight. Block scales and mins are quantized with 4 bits. This ends up effectively using 2.5625 bits per weight (bpw)\n* GGML_TYPE_Q3_K - \"type-0\" 3-bit quantization in super-blocks containing 16 blocks, each block having 16 weights. Scales are quantized with 6 bits. This end up using 3.4375 bpw.\n* GGML_TYPE_Q4_K - \"type-1\" 4-bit quantization in super-blocks containing 8 blocks, each block having 32 weights. Scales and mins are quantized with 6 bits. This ends up using 4.5 bpw.\n* GGML_TYPE_Q5_K - \"type-1\" 5-bit quantization. Same super-block structure as GGML_TYPE_Q4_K resulting in 5.5 bpw\n* GGML_TYPE_Q6_K - \"type-0\" 6-bit quantization. Super-blocks with 16 blocks, each block having 16 weights. Scales are quantized with 8 bits. This ends up using 6.5625 bpw## How to download GGUF files\n\nNote for manual downloaders: You almost never want to clone the entire repo! Multiple different quantisation formats are provided, and most users only want to pick and download a single file.\n\nThe following clients/libraries will automatically download models for you, providing a list of available models to choose from:\n\n* LM Studio\n* LoLLMS Web UI\n* URL### In 'text-generation-webui'\n\nUnder Download Model, you can enter the model repo: MaziyarPanahi/sqlcoder-7b-2-GGUF and below it, a specific filename to download, such as: sqlcoder-7b-2-GGUF.Q4_K_M.gguf.\n\nThen click Download."
] | [
-0.07442577183246613,
0.13983842730522156,
-0.002758803777396679,
0.07194438576698303,
0.08795566856861115,
0.03634747117757797,
0.03644540533423424,
0.10122601687908173,
0.049955353140830994,
0.06003871560096741,
0.07598184049129486,
0.043489616364240646,
0.05065353587269783,
0.1532067507505417,
0.07780881226062775,
-0.18534575402736664,
0.04291590303182602,
-0.022437788546085358,
-0.028317872434854507,
0.03114474192261696,
0.05250104144215584,
-0.02687816321849823,
0.0891273245215416,
-0.01598423160612583,
-0.04172828793525696,
-0.04613004997372627,
-0.023834262043237686,
-0.0035264985635876656,
0.057565007358789444,
0.07704836130142212,
-0.06574827432632446,
-0.02977629005908966,
0.0031681135296821594,
-0.1379701793193817,
0.024516155943274498,
0.04923729598522186,
-0.02861914411187172,
0.028975969180464745,
-0.016906052827835083,
0.035636622458696365,
0.13104549050331116,
-0.08670338988304138,
-0.02399822697043419,
0.046960826963186264,
-0.05173036456108093,
-0.14229997992515564,
-0.1200484186410904,
0.017565786838531494,
0.025077715516090393,
0.04909220337867737,
0.01957542635500431,
-0.0013985354453325272,
-0.008764004334807396,
0.04180586710572243,
0.19048848748207092,
-0.24208468198776245,
-0.045717429369688034,
0.1271880865097046,
0.06020645052194595,
0.06147029250860214,
-0.08744004368782043,
0.06340799480676651,
0.012159192003309727,
0.009350999258458614,
0.049941372126340866,
-0.04043978825211525,
0.12065541744232178,
-0.0011603422462940216,
-0.10966971516609192,
-0.017318211495876312,
0.09245506674051285,
-0.018214024603366852,
-0.06375148147344589,
-0.07637245953083038,
-0.040572673082351685,
-0.027014728635549545,
-0.042195335030555725,
0.03957303613424301,
0.022144701331853867,
0.03903811052441597,
0.055927425622940063,
-0.11252572387456894,
-0.03863919526338577,
-0.058593012392520905,
-0.03984099254012108,
0.21517637372016907,
0.020127274096012115,
0.054874617606401443,
0.03085445612668991,
0.10182298719882965,
-0.15579430758953094,
-0.04647178202867508,
-0.103884756565094,
0.003607841907069087,
-0.025382641702890396,
0.051702000200748444,
-0.0084501588717103,
0.056451261043548584,
0.07488119602203369,
0.12493810057640076,
-0.09521695971488953,
0.07653531432151794,
0.062481582164764404,
0.0011998293921351433,
-0.049142636358737946,
0.10000123828649521,
-0.06574564427137375,
-0.08993678539991379,
0.08636467158794403,
0.028627164661884308,
0.0998237133026123,
-0.035069167613983154,
-0.07896767556667328,
-0.008335031569004059,
-0.0315190888941288,
0.03714078292250633,
0.01857742667198181,
0.050839196890592575,
-0.014428246766328812,
-0.01596452109515667,
0.1913614422082901,
-0.08319781720638275,
0.03907039761543274,
0.00775621272623539,
-0.017368195578455925,
-0.028862398117780685,
0.028596438467502594,
-0.020800575613975525,
-0.029195522889494896,
-0.01095900684595108,
-0.08994652330875397,
-0.03555282577872276,
-0.0595426931977272,
-0.03677969053387642,
0.0503661185503006,
-0.06776860356330872,
-0.015685051679611206,
-0.08478650450706482,
-0.2084091305732727,
0.03489892557263374,
0.03561103343963623,
-0.03113282285630703,
-0.002686896361410618,
0.012915091589093208,
-0.04502564296126366,
0.030704129487276077,
0.01791643351316452,
0.0668780505657196,
-0.05264183133840561,
0.04550235718488693,
0.03710734844207764,
0.048721712082624435,
-0.16233882308006287,
-0.0035727492067962885,
-0.027963582426309586,
0.07349450141191483,
-0.06966239213943481,
0.10252925008535385,
-0.10483269393444061,
0.02907104417681694,
-0.05601711943745613,
-0.0035754526033997536,
-0.0275310967117548,
-0.02306629717350006,
0.046009186655282974,
0.07403247803449631,
-0.10305500030517578,
-0.045932166278362274,
0.1140439361333847,
-0.14020845293998718,
-0.055649422109127045,
0.12437643110752106,
0.01448022946715355,
-0.02534102275967598,
0.08042548596858978,
0.08356426656246185,
0.18932534754276276,
-0.06558915972709656,
-0.09122814238071442,
0.044166479259729385,
0.02201232686638832,
-0.0016754860989749432,
0.082199826836586,
0.0132015161216259,
-0.05202271789312363,
0.06801994889974594,
-0.11193814128637314,
0.058188166469335556,
0.018408117815852165,
-0.0528978630900383,
-0.04509330540895462,
-0.07828809320926666,
0.05208716541528702,
-0.02190716564655304,
-0.029667546972632408,
-0.004289411939680576,
-0.08575452119112015,
-0.06595958769321442,
0.13863308727741241,
-0.032604657113552094,
0.019740551710128784,
-0.08173854649066925,
0.16112186014652252,
-0.07871384918689728,
0.04903385788202286,
-0.031346701085567474,
-0.09593811631202698,
0.04421217367053032,
-0.09753715991973877,
0.02823874168097973,
-0.08777017146348953,
0.04978445917367935,
0.06986958533525467,
-0.04518657550215721,
0.03317784518003464,
0.0038250423967838287,
-0.018921062350273132,
-0.06386205554008484,
-0.05115813761949539,
0.00035390397533774376,
-0.03246769681572914,
0.1323634833097458,
-0.07929961383342743,
0.020997341722249985,
0.11921785771846771,
0.02453731559216976,
0.013019906356930733,
-0.08897678554058075,
0.0389973483979702,
-0.01802663505077362,
0.015912465751171112,
-0.04943780601024628,
0.019783983007073402,
0.0354190394282341,
-0.09242811053991318,
0.06388819217681885,
-0.1276819407939911,
0.04542243480682373,
0.0941224992275238,
0.18052038550376892,
0.025104206055402756,
-0.06125035136938095,
0.013331563211977482,
-0.026046480983495712,
0.007654678076505661,
-0.04310363903641701,
0.11757706850767136,
-0.022466380149126053,
0.06270407140254974,
-0.060885559767484665,
0.0007790550589561462,
0.02356180176138878,
0.01380021870136261,
-0.02363709732890129,
0.05701850354671478,
0.06699532270431519,
-0.057166438549757004,
0.059388648718595505,
0.044706132262945175,
-0.03423386812210083,
0.14495402574539185,
0.014974386431276798,
-0.04164239764213562,
-0.04748525097966194,
0.009782999753952026,
0.022665156051516533,
0.1387227326631546,
-0.09725542366504669,
0.0089552141726017,
0.006341567263007164,
0.01622992940247059,
0.06826028227806091,
-0.13160204887390137,
0.014211367815732956,
-0.03619047999382019,
-0.09798917919397354,
0.036052994430065155,
0.02200310304760933,
-0.09888758510351181,
0.040839120745658875,
0.05604670196771622,
0.056382499635219574,
0.033843994140625,
0.0052164215594530106,
-0.0801326334476471,
0.148698627948761,
-0.12233775854110718,
-0.19900420308113098,
-0.14281487464904785,
-0.027629226446151733,
-0.07257328182458878,
0.002350261900573969,
0.027749404311180115,
-0.04560773819684982,
-0.052883706986904144,
-0.0826243981719017,
-0.016738267615437508,
0.00039536505937576294,
0.015950912609696388,
0.056732285767793655,
-0.0627971664071083,
0.0023401621729135513,
-0.11099763214588165,
0.00433379877358675,
0.01768461987376213,
-0.077484130859375,
0.02322249673306942,
0.006003564223647118,
0.09271885454654694,
0.07143419235944748,
0.034223929047584534,
0.003610922023653984,
-0.008944280445575714,
0.19047385454177856,
-0.07031185925006866,
0.07497967779636383,
0.1340387761592865,
0.052613515406847,
0.059130363166332245,
0.010144785046577454,
0.014634128659963608,
-0.07504028081893921,
-0.011114493012428284,
0.028830602765083313,
-0.10973282158374786,
-0.09870913624763489,
-0.05545249953866005,
-0.06624284386634827,
0.0578736737370491,
-0.0011325739324092865,
0.09015622735023499,
-0.03574495017528534,
0.08729945123195648,
-0.014594590291380882,
0.028025787323713303,
0.029300546273589134,
0.0503869503736496,
0.12017785012722015,
-0.00733655272051692,
0.04118465632200241,
-0.08261694759130478,
0.06402900815010071,
0.11111768335103989,
0.12495497614145279,
0.11089803278446198,
-0.0873565524816513,
0.14465904235839844,
0.015909111127257347,
0.06095156818628311,
0.01412359718233347,
0.00807974860072136,
-0.06400470435619354,
-0.005206774920225143,
-0.028572363778948784,
-0.06225302070379257,
-0.06349629163742065,
0.055927176028490067,
-0.021726276725530624,
-0.030897561460733414,
-0.003387676551938057,
0.0697563961148262,
0.05840916186571121,
0.09039013087749481,
0.021126411855220795,
-0.16680514812469482,
-0.11249996721744537,
0.040747787803411484,
-0.006957462057471275,
-0.04317064955830574,
0.024276679381728172,
0.09611617773771286,
-0.03856862336397171,
0.060129035264253616,
-0.033017516136169434,
0.03295668959617615,
-0.07250973582267761,
-0.019619997590780258,
0.0472356341779232,
0.15494300425052643,
0.007647250778973103,
0.06181053817272186,
-0.1872793436050415,
0.006965314969420433,
0.03823556378483772,
0.05357242003083229,
-0.04826410859823227,
0.03620186075568199,
0.07848653942346573,
0.026342477649450302,
0.06502789258956909,
0.02561945468187332,
0.006127683445811272,
-0.032407671213150024,
-0.10739012062549591,
0.06632998585700989,
0.04287052899599075,
-0.0355839841067791,
0.0694565400481224,
-0.03498733788728714,
0.007506333291530609,
-0.019457947462797165,
0.0028266198933124542,
-0.03825773298740387,
-0.1733420044183731,
0.10487858951091766,
0.05981430411338806,
-0.016833104193210602,
-0.1000564694404602,
-0.030577756464481354,
-0.06167376786470413,
0.14943546056747437,
-0.086329884827137,
-0.08323727548122406,
-0.08683811873197556,
-0.030429577454924583,
0.10331292450428009,
-0.08220720291137695,
0.04795045033097267,
-0.03439423814415932,
0.06652408838272095,
-0.033241935074329376,
-0.09553666412830353,
0.03816315531730652,
-0.06838829815387726,
-0.12704138457775116,
-0.003996242769062519,
0.10238417983055115,
0.015635386109352112,
0.0196477510035038,
-0.004313407000154257,
0.011475290171802044,
-0.0002463487908244133,
-0.14311906695365906,
0.033201057463884354,
0.16291874647140503,
-0.08505194634199142,
0.06570614874362946,
-0.01641584187746048,
0.029220636934041977,
0.004877686034888029,
-0.03916711360216141,
0.06525178253650665,
0.16480959951877594,
-0.05794458091259003,
0.11664004623889923,
0.11980754882097244,
-0.0761401429772377,
-0.23568856716156006,
-0.039586957544088364,
-0.017678789794445038,
0.004355498589575291,
-0.0828782171010971,
-0.1917259395122528,
0.075332410633564,
0.07849760353565216,
-0.04058612510561943,
0.24250546097755432,
-0.25478804111480713,
-0.07179124653339386,
-0.03926719352602959,
0.06529064476490021,
0.18999424576759338,
-0.17565935850143433,
-0.06271432340145111,
-0.019658857956528664,
-0.1542113870382309,
0.07133679836988449,
-0.0735725536942482,
0.12592142820358276,
-0.025921061635017395,
0.045068126171827316,
-0.01710507832467556,
-0.05035312473773956,
0.15691134333610535,
-0.05784151330590248,
0.007031615357846022,
-0.06436216831207275,
0.029413266107439995,
0.03289010748267174,
-0.05440230667591095,
0.09612709283828735,
-0.13766413927078247,
0.03391941636800766,
-0.05171168968081474,
-0.03483942896127701,
-0.06126537546515465,
0.02852429449558258,
-0.005478374660015106,
-0.040482062846422195,
-0.09939303994178772,
0.04865336790680885,
0.0006927608046680689,
0.018180450424551964,
-0.026691395789384842,
-0.008264090865850449,
-0.0013466021046042442,
0.10143356770277023,
0.04981251433491707,
-0.13277919590473175,
-0.05726883187890053,
-0.017495375126600266,
-0.02909824438393116,
0.06256283074617386,
-0.132034569978714,
0.030245574191212654,
0.06540650129318237,
0.02043556049466133,
0.05629643425345421,
0.02722647413611412,
-0.10005836188793182,
0.06969861686229706,
0.07250663638114929,
-0.11197935044765472,
-0.182869091629982,
-0.02120286412537098,
-0.03129425644874573,
-0.052361954003572464,
0.06875121593475342,
0.1580239236354828,
-0.0014147087931632996,
-0.006735166534781456,
-0.010618510656058788,
0.07261422276496887,
-0.03130804002285004,
0.11599534749984741,
0.027777250856161118,
0.0035019833594560623,
-0.09753142297267914,
0.0699225664138794,
0.006416040472686291,
-0.025306735187768936,
0.023830793797969818,
0.13749276101589203,
-0.09848742187023163,
-0.07382676750421524,
-0.16843311488628387,
-0.047606516629457474,
-0.057500794529914856,
-0.0420842170715332,
-0.04368456080555916,
-0.05643010511994362,
0.02846367284655571,
0.07635116577148438,
0.025597859174013138,
0.050367530435323715,
-0.005996817722916603,
0.06539275497198105,
-0.04648309573531151,
0.06294810771942139,
-0.058726705610752106,
0.05166193097829819,
-0.13661952316761017,
0.00849718227982521,
0.003358980640769005,
0.05487829074263573,
-0.02540436014533043,
-0.0033950256183743477,
-0.08101379871368408,
-0.033428169786930084,
-0.14740827679634094,
0.042822495102882385,
-0.10900910943746567,
0.025835908949375153,
-0.011430017650127411,
-0.004250079393386841,
-0.01834002509713173,
0.05938713252544403,
-0.027296163141727448,
-0.037932924926280975,
-0.051641110330820084,
0.00012005586177110672,
-0.07205317914485931,
0.001857321709394455,
0.049385834485292435,
-0.03095889464020729,
0.13158389925956726,
-0.0038654953241348267,
0.01211458258330822,
0.038709282875061035,
-0.13138353824615479,
0.03690577298402786,
0.024923603981733322,
-0.022521773353219032,
-0.03709615394473076,
-0.09203153848648071,
0.0359005443751812,
-0.005208475980907679,
0.0029275845736265182,
0.016460038721561432,
0.14649145305156708,
-0.0757271945476532,
-0.011018874123692513,
-0.06388553231954575,
0.009037948213517666,
-0.016230128705501556,
0.033586591482162476,
0.1078183501958847,
0.028416339308023453,
0.058412931859493256,
-0.04958118498325348,
-0.030014803633093834,
-0.09625867754220963,
-0.0033176911529153585,
-0.003211997915059328,
-0.050888001918792725,
-0.039506763219833374,
-0.01014801301062107,
0.033644311130046844,
0.009581007063388824,
0.18021684885025024,
-0.033020585775375366,
-0.07398096472024918,
-0.01370153296738863,
-0.04106881469488144,
0.08807448297739029,
0.004929389338940382,
0.13390982151031494,
0.04537995159626007,
-0.01471918635070324,
-0.0169310811907053,
0.04512111470103264,
0.041671693325042725,
-0.01414667908102274,
0.04718669503927231,
-0.0035513918846845627,
0.05027031898498535,
0.10106471180915833,
0.010134544223546982,
-0.08274905383586884,
-0.09193602204322815,
0.06445984542369843,
-0.11047713458538055,
0.05742660537362099,
-0.06329077482223511,
0.06774471700191498,
0.14810600876808167,
-0.09755690395832062,
0.0454099178314209,
0.026741504669189453,
-0.058277204632759094,
-0.06301417946815491,
-0.13088932633399963,
-0.053785551339387894,
-0.10108891129493713,
-0.0026586062740534544,
-0.07762113213539124,
0.01523472461849451,
0.05877882242202759,
0.004970441572368145,
0.009359623305499554,
0.14258268475532532,
-0.005996625870466232,
-0.03119102492928505,
0.026946168392896652,
0.01210262905806303,
-0.044138118624687195,
0.11894196271896362,
-0.055718641728162766,
0.03377186134457588,
-0.035462237894535065,
0.08090931922197342,
0.03304102271795273,
0.02628260850906372,
0.07497379928827286,
-0.013313502073287964,
-0.0013007372617721558,
-0.0295293889939785,
0.04664692282676697,
0.026156967505812645,
0.14466705918312073,
0.018310051411390305,
-0.07315719127655029,
0.02263898029923439,
0.11253505945205688,
-0.03048733063042164,
-0.011058559641242027,
-0.0930054560303688,
0.10235340893268585,
-0.06060037016868591,
0.010097136721014977,
-0.029428789392113686,
-0.06489624828100204,
0.03295763581991196,
0.17090848088264465,
0.1531715840101242,
-0.07893696427345276,
-0.0005383891984820366,
0.006639156490564346,
-0.007126397918909788,
-0.0021555759012699127,
0.1208220049738884,
0.06264599412679672,
0.2607854902744293,
-0.008599583059549332,
-0.017274493351578712,
-0.012886830605566502,
-0.007193908095359802,
-0.0986374169588089,
0.027992617338895798,
-0.07928909361362457,
0.04749182611703873,
-0.0638706162571907,
-0.0006358213722705841,
-0.03684600442647934,
-0.11586807668209076,
0.0020771785639226437,
-0.07672183960676193,
-0.056333914399147034,
-0.0028393007814884186,
-0.06087789312005043,
0.0188132394105196,
0.027273423969745636,
0.013109310530126095,
0.03249124437570572,
0.055485453456640244,
0.004782829433679581,
-0.141932874917984,
-0.029323741793632507,
0.046639058738946915,
0.036323048174381256,
0.2088669389486313,
-0.027868445962667465,
0.02040313184261322,
0.09264355152845383,
-0.012049399316310883,
-0.13857313990592957,
0.09339982271194458,
0.01291356049478054,
-0.10746775567531586,
-0.008747091516852379,
0.05414766073226929,
-0.022192537784576416,
0.07145420461893082,
0.060366809368133545,
0.1399393230676651,
0.017600737512111664,
0.042950961738824844,
0.0036453865468502045,
-0.06707973778247833,
-0.013621287420392036,
-0.14142310619354248,
0.1541232019662857,
0.10589725524187088,
-0.01149412989616394,
-0.0008098476100713015,
-0.05267456918954849,
0.04312486946582794,
-0.030867310240864754,
0.033400386571884155,
-0.01889011263847351,
-0.10949353128671646,
0.023158498108386993,
-0.009158017113804817,
0.02456504851579666,
-0.22194421291351318,
-0.05682002380490303,
-0.03920813277363777,
0.007638225331902504,
0.003743826411664486,
0.08974262326955795,
0.12010152637958527,
-0.004628287628293037,
-0.049954261630773544,
-0.16208389401435852,
-0.02841051109135151,
0.06058516353368759,
-0.10403192788362503,
-0.08782495558261871
] |
null | null | transformers |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# my_awesome_model
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.2239
- Accuracy: 0.9332
## Model description
More information needed
## Intended uses & limitations
More information needed
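Pending fuller documentation, a minimal inference sketch; the repo id is taken from this card's metadata, and the default `LABEL_0`/`LABEL_1` label names are an assumption:

```python
from transformers import pipeline

# Load the fine-tuned checkpoint for binary text classification.
classifier = pipeline("text-classification", model="Yzagnoev/my_awesome_model")

print(classifier("This was a surprisingly good movie."))
# e.g. [{'label': 'LABEL_1', 'score': 0.99}] -- actual label names depend on the training config
```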
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (a minimal `Trainer` sketch follows the list):
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 2
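
A minimal sketch of the corresponding `TrainingArguments`/`Trainer` setup; the dataset and tokenization handling are assumptions, since the card does not name the data:

```python
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

tokenizer = AutoTokenizer.from_pretrained("distilbert-base-uncased")
model = AutoModelForSequenceClassification.from_pretrained(
    "distilbert-base-uncased", num_labels=2
)

args = TrainingArguments(
    output_dir="my_awesome_model",
    learning_rate=2e-5,              # matches the card
    per_device_train_batch_size=16,  # matches the card
    per_device_eval_batch_size=16,   # matches the card
    num_train_epochs=2,              # matches the card
    seed=42,                         # matches the card
    lr_scheduler_type="linear",      # matches the card; Adam defaults give the listed betas/epsilon
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=train_dataset,  # hypothetical pre-tokenized datasets
    eval_dataset=eval_dataset,
    tokenizer=tokenizer,
)
trainer.train()
```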
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.2255 | 1.0 | 1563 | 0.2254 | 0.9148 |
| 0.1533 | 2.0 | 3126 | 0.2239 | 0.9332 |
### Framework versions
- Transformers 4.37.2
- Pytorch 2.1.0+cu121
- Datasets 2.17.0
- Tokenizers 0.15.1
| {"license": "apache-2.0", "tags": ["generated_from_trainer"], "metrics": ["accuracy"], "base_model": "distilbert-base-uncased", "model-index": [{"name": "my_awesome_model", "results": []}]} | text-classification | Yzagnoev/my_awesome_model | [
"transformers",
"tensorboard",
"safetensors",
"distilbert",
"text-classification",
"generated_from_trainer",
"base_model:distilbert-base-uncased",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | 2024-02-13T10:40:33+00:00 | [] | [] | TAGS
#transformers #tensorboard #safetensors #distilbert #text-classification #generated_from_trainer #base_model-distilbert-base-uncased #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us
| my\_awesome\_model
==================
This model is a fine-tuned version of distilbert-base-uncased on an unknown dataset.
It achieves the following results on the evaluation set:
* Loss: 0.2239
* Accuracy: 0.9332
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 2e-05
* train\_batch\_size: 16
* eval\_batch\_size: 16
* seed: 42
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* num\_epochs: 2
### Training results
### Framework versions
* Transformers 4.37.2
* Pytorch 2.1.0+cu121
* Datasets 2.17.0
* Tokenizers 0.15.1
| [
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 2",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.37.2\n* Pytorch 2.1.0+cu121\n* Datasets 2.17.0\n* Tokenizers 0.15.1"
] | [
"TAGS\n#transformers #tensorboard #safetensors #distilbert #text-classification #generated_from_trainer #base_model-distilbert-base-uncased #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 2",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.37.2\n* Pytorch 2.1.0+cu121\n* Datasets 2.17.0\n* Tokenizers 0.15.1"
] | [
72,
98,
4,
33
] | [
"passage: TAGS\n#transformers #tensorboard #safetensors #distilbert #text-classification #generated_from_trainer #base_model-distilbert-base-uncased #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 2### Training results### Framework versions\n\n\n* Transformers 4.37.2\n* Pytorch 2.1.0+cu121\n* Datasets 2.17.0\n* Tokenizers 0.15.1"
] | [
-0.10019759833812714,
0.11067764461040497,
-0.0027117575518786907,
0.11309350281953812,
0.1407114863395691,
0.015566974878311157,
0.16083598136901855,
0.11523853987455368,
-0.06586716324090958,
0.047139592468738556,
0.12724870443344116,
0.1267809122800827,
0.015589109621942043,
0.11694324761629105,
-0.0821223109960556,
-0.21297866106033325,
0.009262321516871452,
0.023279264569282532,
-0.06282595545053482,
0.11443697661161423,
0.09317267686128616,
-0.12140881270170212,
0.08889606595039368,
-0.01630976051092148,
-0.16684411466121674,
0.004483003169298172,
0.0185470599681139,
-0.04859450086951256,
0.12496213614940643,
0.0340886227786541,
0.1332121193408966,
0.037589170038700104,
0.08573948591947556,
-0.1914825439453125,
0.01013100054115057,
0.06056787073612213,
-0.005684851668775082,
0.08205332607030869,
0.036063775420188904,
-0.00818174984306097,
0.07036479562520981,
-0.09291410446166992,
0.06155337020754814,
0.01713711954653263,
-0.12854750454425812,
-0.20634932816028595,
-0.0883561298251152,
0.034781042486429214,
0.09293109178543091,
0.0757843628525734,
-0.010216014459729195,
0.11729177087545395,
-0.052653003484010696,
0.09577237069606781,
0.20081780850887299,
-0.30478137731552124,
-0.06204235926270485,
0.04693254828453064,
0.026437832042574883,
0.08869023621082306,
-0.09820863604545593,
-0.019617367535829544,
0.059073686599731445,
0.02347065880894661,
0.12702205777168274,
-0.02458905428647995,
-0.05828976631164551,
0.00004997341602575034,
-0.1429450958967209,
-0.017484772950410843,
0.15101875364780426,
0.04993494972586632,
-0.04607582464814186,
-0.04622197151184082,
-0.07422526925802231,
-0.1301300823688507,
-0.040548793971538544,
-0.008262471295893192,
0.04927852749824524,
-0.020487889647483826,
-0.060093268752098083,
-0.02404731698334217,
-0.09585840255022049,
-0.06599218398332596,
-0.055670365691185,
0.14398139715194702,
0.03339707851409912,
0.0050427853129804134,
-0.009880613535642624,
0.09970428794622421,
-0.02551034465432167,
-0.14622047543525696,
0.02152339741587639,
0.021895259618759155,
0.0033334260806441307,
-0.050177957862615585,
-0.05004870519042015,
-0.08081728965044022,
0.02100646123290062,
0.15884080529212952,
-0.05102210491895676,
0.05134177207946777,
0.0014934897189959884,
0.04775834456086159,
-0.10367672145366669,
0.16968229413032532,
-0.04029857739806175,
-0.032456204295158386,
0.024304935708642006,
0.09165651351213455,
0.0570831373333931,
-0.015806002542376518,
-0.13065816462039948,
0.032570730894804,
0.10311777144670486,
0.020490145310759544,
-0.050244126468896866,
0.06643285602331161,
-0.05939091369509697,
-0.013222224079072475,
0.044000059366226196,
-0.09393413364887238,
0.026997370645403862,
0.004754959139972925,
-0.05623389407992363,
-0.0456400066614151,
0.032220155000686646,
0.022079957649111748,
0.0068405186757445335,
0.10715161263942719,
-0.08120670169591904,
0.010925263166427612,
-0.0824371948838234,
-0.12997685372829437,
0.01841440238058567,
-0.09362049400806427,
0.0196848027408123,
-0.10866927355527878,
-0.18304722011089325,
-0.013394098728895187,
0.06342688202857971,
-0.029001379385590553,
-0.031549472361803055,
-0.06290201097726822,
-0.07806146144866943,
0.019294410943984985,
-0.011423278599977493,
0.06484776735305786,
-0.06414144486188889,
0.09802352637052536,
0.03467443957924843,
0.06727663427591324,
-0.06341986358165741,
0.04101603105664253,
-0.10303992033004761,
0.041689373552799225,
-0.1797795593738556,
0.03683985024690628,
-0.07091311365365982,
0.07125449180603027,
-0.07992434501647949,
-0.07109715789556503,
0.0028496324084699154,
-0.0028827181085944176,
0.07481669634580612,
0.09808890521526337,
-0.17802444100379944,
-0.06266740709543228,
0.15227645635604858,
-0.08770041912794113,
-0.14090389013290405,
0.1384061723947525,
-0.05809393897652626,
0.04345238208770752,
0.06315018236637115,
0.1906898021697998,
0.07687285542488098,
-0.08462781459093094,
0.004933589603751898,
0.004034224431961775,
0.06784228980541229,
-0.03145440295338631,
0.07046119123697281,
-0.0010255075758323073,
0.002769702346995473,
0.0120105454698205,
-0.05313775688409805,
0.050824183970689774,
-0.07786566764116287,
-0.09101258963346481,
-0.04106970131397247,
-0.10436925292015076,
0.06719474494457245,
0.04988638311624527,
0.061878304928541183,
-0.10929753631353378,
-0.08742420375347137,
0.06440360099077225,
0.07483801245689392,
-0.07390862703323364,
0.024244826287031174,
-0.06871534883975983,
0.09111680090427399,
-0.05879254266619682,
-0.0154491625726223,
-0.1570628434419632,
-0.045141421258449554,
0.019390396773815155,
0.0026568672619760036,
0.018089890480041504,
-0.006221451796591282,
0.07115030288696289,
0.08330822736024857,
-0.06736969202756882,
-0.03163342550396919,
-0.013815926387906075,
0.015050242654979229,
-0.12376483529806137,
-0.19990336894989014,
-0.01450938917696476,
-0.034796856343746185,
0.1501496285200119,
-0.2342424839735031,
0.05279238149523735,
-0.0016695931553840637,
0.08959422260522842,
0.04089779034256935,
-0.010956166312098503,
-0.03873062878847122,
0.06937503069639206,
-0.04952619969844818,
-0.07016181945800781,
0.05995992198586464,
0.009694836102426052,
-0.10586922615766525,
-0.04246232658624649,
-0.14911918342113495,
0.18212151527404785,
0.1327030509710312,
-0.08019472658634186,
-0.07329218089580536,
0.006981467362493277,
-0.034438036382198334,
-0.027134548872709274,
-0.036830391734838486,
0.0035942550748586655,
0.12769998610019684,
-0.005554917734116316,
0.1542792022228241,
-0.08694887906312943,
-0.032167550176382065,
0.020718999207019806,
-0.04828891158103943,
0.009300608187913895,
0.11511284857988358,
0.08885256946086884,
-0.1079678162932396,
0.14892984926700592,
0.1963522583246231,
-0.09542527049779892,
0.13114608824253082,
-0.0446484200656414,
-0.051352787762880325,
-0.02496159076690674,
0.009727482683956623,
0.01579960808157921,
0.10891955345869064,
-0.11396274715662003,
0.0025126792024821043,
0.009516540914773941,
0.013118362985551357,
0.010651723481714725,
-0.21625153720378876,
-0.02574796974658966,
0.04144356772303581,
-0.05188395455479622,
-0.0007115827756933868,
-0.024552641436457634,
-0.007640476804226637,
0.09821933507919312,
-0.004585641901940107,
-0.08889534324407578,
0.04677675664424896,
-0.0041070966981351376,
-0.07766163349151611,
0.20324033498764038,
-0.09426552057266235,
-0.1433550864458084,
-0.13776732981204987,
-0.06749053299427032,
-0.05450194701552391,
0.032780684530735016,
0.061373621225357056,
-0.06756307929754257,
-0.040991418063640594,
-0.11005542427301407,
-0.003991791047155857,
0.029829727485775948,
0.0193012822419405,
0.021481677889823914,
-0.004403815604746342,
0.08481433987617493,
-0.10051977634429932,
-0.0072693973779678345,
-0.03502807021141052,
-0.05129162594676018,
0.0372275747358799,
0.024707762524485588,
0.1115504652261734,
0.14919300377368927,
-0.026061538606882095,
-0.00765664828941226,
-0.02716790698468685,
0.227743998169899,
-0.05850009247660637,
-0.005881329998373985,
0.12651550769805908,
-0.03202733024954796,
0.057515040040016174,
0.13913887739181519,
0.06372343003749847,
-0.09777047485113144,
0.018723880872130394,
0.03333195298910141,
-0.035029519349336624,
-0.21672765910625458,
-0.03578333929181099,
-0.037666428834199905,
0.005857541225850582,
0.09461888670921326,
0.02930840291082859,
0.022898633033037186,
0.06634201109409332,
0.018884195014834404,
0.08255600929260254,
-0.008953948505222797,
0.07029007375240326,
0.11361095309257507,
0.0406784787774086,
0.13066865503787994,
-0.0468982458114624,
-0.05129459500312805,
0.04132942482829094,
-0.004677255637943745,
0.19999510049819946,
0.022098811343312263,
0.14669445157051086,
0.05076270550489426,
0.1585625261068344,
-0.0019041625782847404,
0.06015600636601448,
-0.010061566717922688,
-0.035915303975343704,
-0.015867874026298523,
-0.05103377252817154,
-0.031000230461359024,
0.03435772284865379,
-0.08321376889944077,
0.05710018053650856,
-0.10353868454694748,
0.01719418726861477,
0.061015937477350235,
0.2319483757019043,
0.057797372341156006,
-0.3217560350894928,
-0.09087039530277252,
0.03186166658997536,
-0.019303597509860992,
-0.020915400236845016,
0.02733795717358589,
0.12670859694480896,
-0.046651989221572876,
0.037378761917352676,
-0.07003038376569748,
0.08635158836841583,
-0.040990035980939865,
0.04473980888724327,
0.05064508318901062,
0.08444598317146301,
-0.010925895534455776,
0.06804386526346207,
-0.2791103422641754,
0.2637650966644287,
0.019467249512672424,
0.06688182055950165,
-0.0450504831969738,
-0.0006802300922572613,
0.038520198315382004,
0.09533538669347763,
0.06951190531253815,
-0.013791583478450775,
-0.05101897194981575,
-0.19256925582885742,
-0.0658053532242775,
0.02071443200111389,
0.09778265655040741,
-0.039918556809425354,
0.10060980170965195,
-0.029306231066584587,
0.0011924310820177197,
0.08037646859884262,
-0.01503133587539196,
-0.08087804913520813,
-0.09869466722011566,
-0.009656496345996857,
0.03687136620283127,
-0.03722069412469864,
-0.07974199950695038,
-0.09686758369207382,
-0.13529346883296967,
0.15331348776817322,
-0.0681944414973259,
-0.03559467941522598,
-0.10310577601194382,
0.05335264280438423,
0.058402176946401596,
-0.08098772168159485,
0.041510336101055145,
0.005158436484634876,
0.08426184207201004,
0.015012107789516449,
-0.06509631872177124,
0.12154761701822281,
-0.0728263407945633,
-0.17972655594348907,
-0.07092706114053726,
0.10838717222213745,
0.020179880782961845,
0.04547523334622383,
-0.0073124440386891365,
0.012152981013059616,
-0.015560775063931942,
-0.07710160315036774,
0.02442130632698536,
0.00561564601957798,
0.051546599715948105,
0.030712056905031204,
-0.05720522254705429,
-0.0025613741017878056,
-0.05933130532503128,
-0.02389913611114025,
0.15079012513160706,
0.2925492525100708,
-0.08473680168390274,
0.011967608705163002,
0.060570020228624344,
-0.06813773512840271,
-0.21000544726848602,
0.036273762583732605,
0.027064798399806023,
0.0033304807730019093,
0.04675736650824547,
-0.14911329746246338,
0.09992343187332153,
0.10247820615768433,
-0.02836848422884941,
0.115265391767025,
-0.29112720489501953,
-0.13693858683109283,
0.12583336234092712,
0.1450922042131424,
0.1180499717593193,
-0.15903757512569427,
-0.04340143874287605,
-0.04072235897183418,
-0.1073203757405281,
0.108147993683815,
-0.12961752712726593,
0.109608955681324,
-0.006875555031001568,
0.05172456055879593,
0.006334154400974512,
-0.051840074360370636,
0.13978011906147003,
0.0001336793357040733,
0.11625853180885315,
-0.061984241008758545,
-0.0164385586977005,
0.05768922343850136,
-0.061778731644153595,
0.01936776377260685,
-0.11603022366762161,
0.044873155653476715,
-0.06113677844405174,
-0.022650549188256264,
-0.04348858445882797,
0.034409698098897934,
-0.03925873339176178,
-0.05803742632269859,
-0.043402716517448425,
0.02587192878127098,
0.04447324573993683,
-0.0070566339418292046,
0.16503213346004486,
0.013102633878588676,
0.14423902332782745,
0.14692838490009308,
0.07686861604452133,
-0.06919044256210327,
-0.008757158182561398,
-0.008240152150392532,
-0.03608669340610504,
0.06344332545995712,
-0.16010722517967224,
0.04212872311472893,
0.12535053491592407,
0.013035088777542114,
0.14979830384254456,
0.06952080875635147,
-0.02988481894135475,
0.014233401976525784,
0.0608752965927124,
-0.16146081686019897,
-0.10508797317743301,
-0.007595548406243324,
-0.030045509338378906,
-0.11969593167304993,
0.05791445076465607,
0.12785571813583374,
-0.0663730651140213,
0.0074557093903422356,
-0.00541059672832489,
0.015494279563426971,
-0.03424257040023804,
0.17972372472286224,
0.07021588832139969,
0.04623090475797653,
-0.08495769649744034,
0.09199389815330505,
0.0578949935734272,
-0.07443088293075562,
0.009068417362868786,
0.04409179836511612,
-0.08453646302223206,
-0.04721128195524216,
0.043176159262657166,
0.1913745403289795,
-0.03265155851840973,
-0.04788843169808388,
-0.14684879779815674,
-0.11358907073736191,
0.05276036262512207,
0.17998814582824707,
0.09868751466274261,
0.015087423846125603,
-0.03525293618440628,
0.010183805599808693,
-0.10805798321962357,
0.11733382195234299,
0.04528610408306122,
0.08929302543401718,
-0.1537856012582779,
0.11731936037540436,
-0.00551610067486763,
0.010966666042804718,
-0.024875931441783905,
0.04657091572880745,
-0.11679106205701828,
-0.008620575070381165,
-0.145248144865036,
-0.0010622103000059724,
-0.02213941514492035,
0.010251045227050781,
0.0009414556552655995,
-0.05515131726861,
-0.05565661936998367,
0.00948457419872284,
-0.0989142432808876,
-0.024284575134515762,
0.0339125394821167,
0.04996289312839508,
-0.12362589687108994,
-0.04990575462579727,
0.022518280893564224,
-0.07338392734527588,
0.06937216967344284,
0.01902598701417446,
0.020056430250406265,
0.04806474223732948,
-0.1855190247297287,
0.021500837057828903,
0.057499729096889496,
0.019488660618662834,
0.04703657329082489,
-0.08265970647335052,
-0.02227623015642166,
-0.005146906711161137,
0.04245232045650482,
0.019658146426081657,
0.08979744464159012,
-0.12280616909265518,
0.014433708041906357,
-0.029076315462589264,
-0.06332500278949738,
-0.050965722650289536,
0.034330397844314575,
0.08721168339252472,
0.01196200866252184,
0.2074645757675171,
-0.09670136868953705,
0.01932312548160553,
-0.2025272250175476,
0.004567801486700773,
0.0019685737788677216,
-0.11671383678913116,
-0.11677179485559464,
-0.053585562855005264,
0.04996539652347565,
-0.061284471303224564,
0.13357214629650116,
0.00926235318183899,
0.02690991945564747,
0.035531118512153625,
-0.028389789164066315,
0.03652818128466606,
0.02669581212103367,
0.21418966352939606,
0.03301848843693733,
-0.04044715687632561,
0.01322428323328495,
0.024759382009506226,
0.11412248015403748,
0.07727012783288956,
0.16735994815826416,
0.1653759926557541,
-0.047584570944309235,
0.09863964468240738,
0.04401977360248566,
-0.04900294914841652,
-0.13626426458358765,
0.06626073271036148,
-0.039585620164871216,
0.1027722954750061,
-0.01574127934873104,
0.19873759150505066,
0.08993887156248093,
-0.15701933205127716,
0.016794875264167786,
-0.04860910028219223,
-0.0863184705376625,
-0.1088586077094078,
-0.06208421289920807,
-0.09828166663646698,
-0.14247027039527893,
-0.006211050786077976,
-0.11155468970537186,
0.01245781872421503,
0.1009899228811264,
0.0017959214746952057,
-0.016242017969489098,
0.163749560713768,
0.0008611345547251403,
0.035836413502693176,
0.062018267810344696,
0.0008616612758487463,
-0.04257979989051819,
-0.07355688512325287,
-0.09994727373123169,
0.012008104473352432,
-0.008131805807352066,
0.024468420073390007,
-0.04587997868657112,
-0.020245717838406563,
0.041861191391944885,
-0.009551173076033592,
-0.11158358305692673,
0.011882970109581947,
0.028891779482364655,
0.04873195290565491,
0.050666287541389465,
0.012873739935457706,
0.01123177632689476,
0.0010381643660366535,
0.2187424749135971,
-0.07579776644706726,
-0.06621373444795609,
-0.0977235734462738,
0.21504957973957062,
0.022861450910568237,
0.012090405449271202,
0.012209150940179825,
-0.09420657902956009,
0.02838418073952198,
0.2093445211648941,
0.18792764842510223,
-0.0987219363451004,
-0.000033142037864308804,
-0.01774763874709606,
-0.009071915410459042,
-0.03500254079699516,
0.09150321781635284,
0.11025530844926834,
0.006845984607934952,
-0.07249758392572403,
-0.05600903183221817,
-0.03894432634115219,
-0.008682544343173504,
-0.051531195640563965,
0.05627221614122391,
0.026947224512696266,
0.009873033501207829,
-0.050235308706760406,
0.059373434633016586,
-0.037524301558732986,
-0.1094055026769638,
0.05062684044241905,
-0.19721020758152008,
-0.15222543478012085,
-0.01903560571372509,
0.11201108247041702,
-0.009866760112345219,
0.044107090681791306,
-0.03392788767814636,
-0.0008012969046831131,
0.0683063194155693,
-0.031113773584365845,
-0.061754949390888214,
-0.059365835040807724,
0.05803288146853447,
-0.10416639596223831,
0.2237098515033722,
-0.03225187957286835,
0.04764861986041069,
0.1266249716281891,
0.04963808134198189,
-0.07040529698133469,
0.0836520865559578,
0.04726316034793854,
-0.06027708575129509,
0.03377300873398781,
0.08545643091201782,
-0.04247882962226868,
0.12078022211790085,
0.0640997439622879,
-0.1357356160879135,
0.01109742745757103,
-0.03347717970609665,
-0.09614472091197968,
-0.05301711708307266,
-0.04238516837358475,
-0.05982768163084984,
0.1325821429491043,
0.18687400221824646,
-0.03550780564546585,
0.01052683126181364,
-0.04102814942598343,
0.015339416451752186,
0.06744774430990219,
0.03645361214876175,
-0.03585364669561386,
-0.2278490513563156,
0.026259895414114,
0.06145141273736954,
-0.0016318528214469552,
-0.2820723056793213,
-0.08301890641450882,
-0.012913252227008343,
-0.04356272891163826,
-0.09421859681606293,
0.08908774703741074,
0.11322517693042755,
0.04987645894289017,
-0.0622892864048481,
-0.08771912008523941,
-0.07512526214122772,
0.15521787106990814,
-0.1255498081445694,
-0.09571827203035355
] |
null | null | transformers |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# flan-t5-small-samsum
This model is a fine-tuned version of [google/flan-t5-small](https://huggingface.co/google/flan-t5-small) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 1.6084
- Rouge1: 44.1542
- Rouge2: 19.8954
- Rougel: 36.576
- Rougelsum: 40.1511
- Gen Len: 16.7875
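For reference, ROUGE scores of this kind are typically computed with the 🤗 `evaluate` package. A minimal sketch with placeholder strings (not the actual evaluation split):

```python
import evaluate

rouge = evaluate.load("rouge")
# Placeholder prediction/reference pair; the real scores above come from the full eval set.
predictions = ["amanda baked cookies and will bring jerry some tomorrow"]
references = ["Amanda baked cookies and will bring Jerry some tomorrow."]
print(rouge.compute(predictions=predictions, references=references, use_stemmer=True))
```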
## Model description
More information needed
## Intended uses & limitations
More information needed
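The card leaves this section open. Given the SAMSum-style name, dialogue summarization is a plausible use; a minimal sketch (the sample dialogue is illustrative, not from any training data):

```python
from transformers import pipeline

summarizer = pipeline("summarization", model="hupenc/flan-t5-small-samsum")
dialogue = (
    "Amanda: I baked cookies. Do you want some?\n"
    "Jerry: Sure!\n"
    "Amanda: I'll bring you some tomorrow :-)"
)
print(summarizer(dialogue, max_length=60)[0]["summary_text"])
```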
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 2
- eval_batch_size: 2
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 3
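For readers reproducing the run, this configuration maps directly onto 🤗 Transformers training arguments. A hedged sketch (the output directory is an assumption; the Adam betas and epsilon listed above are the library defaults, so they need no explicit setting):

```python
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="flan-t5-small-samsum",  # assumed path
    learning_rate=5e-5,
    per_device_train_batch_size=2,
    per_device_eval_batch_size=2,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=3,
)
```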
### Training results
| Training Loss | Epoch | Step | Validation Loss | Rouge1 | Rouge2 | Rougel | Rougelsum | Gen Len |
|:-------------:|:-----:|:-----:|:---------------:|:-------:|:-------:|:-------:|:---------:|:-------:|
| 1.7821 | 1.0 | 7366 | 1.6311 | 43.0516 | 18.9698 | 35.8 | 39.4369 | 16.9560 |
| 1.6646 | 2.0 | 14732 | 1.6093 | 43.3108 | 18.9854 | 35.7539 | 39.4436 | 16.8913 |
| 1.6213 | 3.0 | 22098 | 1.6084 | 44.1542 | 19.8954 | 36.576 | 40.1511 | 16.7875 |
### Framework versions
- Transformers 4.37.2
- Pytorch 2.2.0+cu121
- Datasets 2.17.0
- Tokenizers 0.15.2
| {"license": "apache-2.0", "tags": ["generated_from_trainer"], "metrics": ["rouge"], "base_model": "google/flan-t5-small", "model-index": [{"name": "flan-t5-small-samsum", "results": []}]} | text2text-generation | hupenc/flan-t5-small-samsum | [
"transformers",
"tensorboard",
"safetensors",
"t5",
"text2text-generation",
"generated_from_trainer",
"base_model:google/flan-t5-small",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | 2024-02-13T10:41:42+00:00 | [] | [] | TAGS
#transformers #tensorboard #safetensors #t5 #text2text-generation #generated_from_trainer #base_model-google/flan-t5-small #license-apache-2.0 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
| flan-t5-small-samsum
====================
This model is a fine-tuned version of google/flan-t5-small on an unknown dataset.
It achieves the following results on the evaluation set:
* Loss: 1.6084
* Rouge1: 44.1542
* Rouge2: 19.8954
* Rougel: 36.576
* Rougelsum: 40.1511
* Gen Len: 16.7875
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 5e-05
* train\_batch\_size: 2
* eval\_batch\_size: 2
* seed: 42
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* num\_epochs: 3
### Training results
### Framework versions
* Transformers 4.37.2
* Pytorch 2.2.0+cu121
* Datasets 2.17.0
* Tokenizers 0.15.2
| [
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 5e-05\n* train\\_batch\\_size: 2\n* eval\\_batch\\_size: 2\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 3",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.37.2\n* Pytorch 2.2.0+cu121\n* Datasets 2.17.0\n* Tokenizers 0.15.2"
] | [
"TAGS\n#transformers #tensorboard #safetensors #t5 #text2text-generation #generated_from_trainer #base_model-google/flan-t5-small #license-apache-2.0 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 5e-05\n* train\\_batch\\_size: 2\n* eval\\_batch\\_size: 2\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 3",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.37.2\n* Pytorch 2.2.0+cu121\n* Datasets 2.17.0\n* Tokenizers 0.15.2"
] | [
81,
98,
4,
33
] | [
"passage: TAGS\n#transformers #tensorboard #safetensors #t5 #text2text-generation #generated_from_trainer #base_model-google/flan-t5-small #license-apache-2.0 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 5e-05\n* train\\_batch\\_size: 2\n* eval\\_batch\\_size: 2\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 3### Training results### Framework versions\n\n\n* Transformers 4.37.2\n* Pytorch 2.2.0+cu121\n* Datasets 2.17.0\n* Tokenizers 0.15.2"
] | [
-0.09432537108659744,
0.08814048022031784,
-0.0023169966880232096,
0.11520210653543472,
0.11289773881435394,
-0.00383351044729352,
0.18614067137241364,
0.12187326699495316,
-0.060657672584056854,
0.045753851532936096,
0.14093901216983795,
0.10045479983091354,
0.021066777408123016,
0.1534188687801361,
-0.06874904781579971,
-0.20802618563175201,
0.012451480142772198,
0.02562115527689457,
-0.04785153269767761,
0.13484905660152435,
0.10035915672779083,
-0.11413305252790451,
0.11066337674856186,
0.002507454017177224,
-0.1406642347574234,
0.004429369233548641,
0.02944410778582096,
-0.06186017766594887,
0.13419906795024872,
0.042450129985809326,
0.09137722104787827,
0.05108644813299179,
0.049337342381477356,
-0.17367412149906158,
0.014056380838155746,
0.05277710407972336,
-0.01031826063990593,
0.08705389499664307,
0.053455401211977005,
-0.006081112660467625,
0.06755897402763367,
-0.08701786398887634,
0.03799298033118248,
0.024413274601101875,
-0.12172845005989075,
-0.19571252167224884,
-0.07765579968690872,
0.04309545457363129,
0.067558154463768,
0.08969290554523468,
-0.014185478910803795,
0.15342548489570618,
-0.02171039767563343,
0.09903933852910995,
0.22548390924930573,
-0.3294733762741089,
-0.06339090317487717,
0.029299186542630196,
0.050028592348098755,
0.10008194297552109,
-0.0834093689918518,
0.002951051341369748,
0.05043499171733856,
0.016991538926959038,
0.1476406306028366,
-0.026044905185699463,
-0.015747668221592903,
-0.002178068505600095,
-0.12526419758796692,
-0.038740091025829315,
0.19349023699760437,
0.06596086174249649,
-0.057087335735559464,
-0.08221083879470825,
-0.07823559641838074,
-0.11890681833028793,
-0.022825734689831734,
-0.022679289802908897,
0.05021153762936592,
-0.010271656326949596,
-0.08175717294216156,
-0.07346935570240021,
-0.11711468547582626,
-0.06358268857002258,
-0.04129808768630028,
0.12766601145267487,
0.02546423301100731,
-0.0024201073683798313,
-0.031249381601810455,
0.09621965885162354,
-0.0022857943549752235,
-0.1478375643491745,
0.014550200663506985,
0.019526762887835503,
0.03616306930780411,
-0.038323625922203064,
-0.05039102956652641,
-0.12424419075250626,
0.031077975407242775,
0.1304289996623993,
-0.05339181795716286,
0.034724682569503784,
-0.010786660946905613,
0.04421193152666092,
-0.11352507770061493,
0.16565750539302826,
-0.039907049387693405,
-0.04305605590343475,
0.05501750111579895,
0.11212771385908127,
0.0861736536026001,
-0.012439475394785404,
-0.13304029405117035,
0.012951360084116459,
0.11881916970014572,
0.023852434009313583,
-0.03003067709505558,
0.07172049582004547,
-0.05769961327314377,
-0.01561958622187376,
0.020812533795833588,
-0.08970274031162262,
0.010782054625451565,
-0.002073977142572403,
-0.04881451651453972,
-0.0715567097067833,
0.032180771231651306,
0.0282379612326622,
-0.00765348831191659,
0.06255508214235306,
-0.08362805843353271,
-0.00544138764962554,
-0.057212118059396744,
-0.1078038364648819,
0.016537366434931755,
-0.06253794580698013,
0.01870773173868656,
-0.12268900871276855,
-0.20322158932685852,
0.0008578935521654785,
0.049793221056461334,
-0.03333421051502228,
-0.06371547281742096,
-0.05644303187727928,
-0.07332561910152435,
0.011521277949213982,
-0.01417504996061325,
0.07740318775177002,
-0.06490633636713028,
0.10391193628311157,
0.05615328252315521,
0.05681263282895088,
-0.06386625021696091,
0.031133947893977165,
-0.11384715139865875,
0.041808005422353745,
-0.1494217962026596,
0.04335169866681099,
-0.014240304008126259,
0.07026059180498123,
-0.09540408104658127,
-0.07048740237951279,
-0.03470679745078087,
-0.007368128281086683,
0.07021472603082657,
0.11278378218412399,
-0.14694133400917053,
-0.06417281925678253,
0.18215034902095795,
-0.08944043517112732,
-0.19043301045894623,
0.1473000943660736,
-0.0419895276427269,
0.08174353092908859,
0.07444024085998535,
0.1880979835987091,
0.06669390201568604,
-0.07942383736371994,
0.011734307743608952,
-0.01423362735658884,
0.05975478142499924,
-0.041681837290525436,
0.09588909149169922,
-0.012494005262851715,
-0.01078617013990879,
0.00712957326322794,
-0.06134241819381714,
0.06458620727062225,
-0.061678607016801834,
-0.08311963081359863,
-0.04443224146962166,
-0.10641087591648102,
0.05373121798038483,
0.03419944643974304,
0.0761396586894989,
-0.11615987122058868,
-0.097510926425457,
0.02828180603682995,
0.054720643907785416,
-0.0833144336938858,
0.019520975649356842,
-0.0765969529747963,
0.10414819419384003,
-0.08724966645240784,
-0.009048295207321644,
-0.13644224405288696,
-0.0467451773583889,
0.029544787481427193,
-0.006374059244990349,
0.011748977936804295,
-0.023977546021342278,
0.08693286776542664,
0.06634547561407089,
-0.07584120333194733,
-0.045004721730947495,
-0.01588849350810051,
0.0008129972848109901,
-0.11099149286746979,
-0.17016136646270752,
-0.005097780842334032,
-0.02603677473962307,
0.15888971090316772,
-0.21850718557834625,
0.0496990904211998,
0.016898170113563538,
0.08449354767799377,
0.047646116465330124,
-0.012375742197036743,
-0.012695759534835815,
0.034296974539756775,
-0.049386367201805115,
-0.07307009398937225,
0.06821642816066742,
0.03417159244418144,
-0.12759186327457428,
0.013939646072685719,
-0.15967236459255219,
0.1996607631444931,
0.13214442133903503,
-0.06071522831916809,
-0.05217822641134262,
-0.0020729275420308113,
-0.04118011146783829,
-0.032376810908317566,
-0.03820556402206421,
-0.02852456271648407,
0.12146046757698059,
0.0018753728363662958,
0.16635264456272125,
-0.11423929035663605,
-0.05004753917455673,
0.02176012471318245,
-0.03761908784508705,
-0.0003040714655071497,
0.10309857130050659,
0.03172644227743149,
-0.14122876524925232,
0.1418677419424057,
0.18989825248718262,
-0.0638265535235405,
0.1426674723625183,
-0.04740847274661064,
-0.05962132662534714,
-0.026690643280744553,
0.04494909942150116,
0.024530809372663498,
0.0862988829612732,
-0.0934072881937027,
0.010528966784477234,
0.01558765396475792,
0.005161259789019823,
0.013504277914762497,
-0.20459707081317902,
-0.02435995824635029,
0.05260859429836273,
-0.06700784713029861,
0.004598703235387802,
-0.0021812182385474443,
-0.0206604041159153,
0.09303253889083862,
0.007313945330679417,
-0.0637466162443161,
0.06288167834281921,
0.002372461138293147,
-0.08912383019924164,
0.1913282424211502,
-0.050511445850133896,
-0.18150995671749115,
-0.15230511128902435,
-0.03592214733362198,
-0.06878049671649933,
0.034371379762887955,
0.07083394378423691,
-0.0422806441783905,
-0.037851482629776,
-0.13688404858112335,
-0.001957148080691695,
0.021037042140960693,
0.024309825152158737,
0.010518829338252544,
-0.006980055011808872,
0.09602504968643188,
-0.09824768453836441,
-0.009897240437567234,
-0.006868953350931406,
-0.03254186362028122,
0.03735562413930893,
0.00008907353912945837,
0.10818979144096375,
0.1050146296620369,
-0.019151462242007256,
0.006948812864720821,
-0.03692476078867912,
0.22404460608959198,
-0.05846668779850006,
-0.001549068372696638,
0.15761655569076538,
-0.010981784202158451,
0.06025691330432892,
0.1278783082962036,
0.03695811331272125,
-0.0972040593624115,
0.033089302480220795,
0.019122779369354248,
-0.029496517032384872,
-0.2096726894378662,
-0.00886340532451868,
-0.04513213783502579,
0.012416604906320572,
0.1015353873372078,
0.03869443014264107,
0.05156654119491577,
0.07381859421730042,
0.005122784990817308,
0.09526613354682922,
0.014743622392416,
0.07966295629739761,
0.13305702805519104,
0.05628305673599243,
0.12437291443347931,
-0.049325332045555115,
-0.05940115079283714,
0.04343072324991226,
-0.003939126618206501,
0.17202799022197723,
0.002978616626933217,
0.21121834218502045,
0.03379945456981659,
0.13704869151115417,
0.00035819923505187035,
0.07372485101222992,
-0.006013807374984026,
-0.024186458438634872,
-0.0167918149381876,
-0.06313233077526093,
-0.030032534152269363,
0.03593050315976143,
-0.06989137828350067,
0.06799639761447906,
-0.09271489083766937,
0.03463374450802803,
0.05618195980787277,
0.2657819390296936,
0.04343516752123833,
-0.3477080464363098,
-0.09509502351284027,
0.022088300436735153,
-0.017978794872760773,
-0.031308434903621674,
0.022295432165265083,
0.1527692973613739,
-0.05514739826321602,
0.047003135085105896,
-0.0902746394276619,
0.07915391772985458,
-0.036745768040418625,
0.042401012033224106,
0.04388168826699257,
0.06855697184801102,
-0.007848866283893585,
0.06377021223306656,
-0.27892833948135376,
0.2527472674846649,
0.016935816034674644,
0.0689530223608017,
-0.049801331013441086,
0.0029189104679971933,
0.03165502846240997,
0.05828842148184776,
0.09060008078813553,
-0.018475258722901344,
-0.017353661358356476,
-0.15056854486465454,
-0.09766240417957306,
0.02826165221631527,
0.08670424669981003,
-0.06593118607997894,
0.10749444365501404,
-0.05453167483210564,
-0.00797667820006609,
0.06949411332607269,
0.021401679143309593,
-0.07683674991130829,
-0.09729954600334167,
-0.0011377324117347598,
0.056399378925561905,
-0.0024256021715700626,
-0.09295007586479187,
-0.09823669493198395,
-0.1091952919960022,
0.15168368816375732,
-0.019355928525328636,
-0.0516827255487442,
-0.09898234158754349,
0.05855344980955124,
0.05135412514209747,
-0.08103109896183014,
0.042054735124111176,
0.0018552429974079132,
0.09688375890254974,
0.020910512655973434,
-0.057506151497364044,
0.12042683362960815,
-0.05689553543925285,
-0.18407398462295532,
-0.04908813536167145,
0.14067882299423218,
-0.023241523653268814,
0.0383722148835659,
0.0023810076527297497,
0.01473819836974144,
-0.03578720986843109,
-0.06498275697231293,
0.02876102738082409,
-0.029938584193587303,
0.051543042063713074,
-0.0069373175501823425,
-0.026684360578656197,
0.02020563744008541,
-0.05374521389603615,
-0.04468933120369911,
0.17565743625164032,
0.293794184923172,
-0.06643181294202805,
0.0066049909219145775,
0.04306944087147713,
-0.04343300312757492,
-0.1775505244731903,
0.013246681541204453,
0.01935512386262417,
0.009931154549121857,
0.06561385095119476,
-0.1289515346288681,
0.06051782891154289,
0.07681857794523239,
-0.02836279757320881,
0.09277172386646271,
-0.29716283082962036,
-0.14435115456581116,
0.09016989171504974,
0.1607722043991089,
0.11828020960092545,
-0.16637590527534485,
-0.06244116276502609,
-0.04010097682476044,
-0.11714307963848114,
0.11272311955690384,
-0.1408383846282959,
0.10660521686077118,
-0.006472150329500437,
0.0368291437625885,
0.011800040490925312,
-0.05090826004743576,
0.12278728187084198,
-0.049384407699108124,
0.08662167191505432,
-0.06418172270059586,
-0.005359533242881298,
0.08751361817121506,
-0.06408242136240005,
0.03930274769663811,
-0.1527344435453415,
0.04619443789124489,
-0.047436363995075226,
-0.039657123386859894,
-0.04947792738676071,
0.03503676876425743,
-0.03296936675906181,
-0.05231567844748497,
-0.032658901065588,
0.010082975029945374,
0.05506887286901474,
-0.0012742046965286136,
0.160639688372612,
0.007279049139469862,
0.12145648151636124,
0.15011703968048096,
0.1100495457649231,
-0.055221766233444214,
-0.017058096826076508,
-0.020285986363887787,
-0.04264184087514877,
0.047236137092113495,
-0.1512564867734909,
0.04643508791923523,
0.09978815168142319,
0.003185200970619917,
0.14573685824871063,
0.06195082888007164,
-0.03575826808810234,
0.019788198173046112,
0.06877442449331284,
-0.18185853958129883,
-0.16194406151771545,
-0.049371346831321716,
-0.04169178009033203,
-0.13595600426197052,
0.041790202260017395,
0.13666222989559174,
-0.06652063876390457,
0.002282612258568406,
-0.006952248513698578,
0.008185180835425854,
-0.019069306552410126,
0.1498146802186966,
0.047385673969984055,
0.04212620109319687,
-0.07194661349058151,
0.08557423204183578,
0.05366215854883194,
-0.06698707491159439,
0.016662459820508957,
0.03987680375576019,
-0.09439587593078613,
-0.042481545358896255,
0.03669915348291397,
0.1617652326822281,
-0.021710194647312164,
-0.052165113389492035,
-0.16342340409755707,
-0.11156424880027771,
0.03761664777994156,
0.14216439425945282,
0.07856124639511108,
0.02662467025220394,
-0.017708148807287216,
-0.001608520746231079,
-0.08921448886394501,
0.13348615169525146,
0.04675046354532242,
0.08149034529924393,
-0.16697533428668976,
0.1013750210404396,
-0.00976825226098299,
0.009094873443245888,
-0.02159453183412552,
0.04575660824775696,
-0.0937810093164444,
-0.011401577852666378,
-0.1344892382621765,
0.01641729474067688,
-0.015869837254285812,
-0.003948807716369629,
-0.008419185876846313,
-0.05846218392252922,
-0.06990984082221985,
0.019951360300183296,
-0.0935041531920433,
-0.040311943739652634,
0.035353854298591614,
0.0570157915353775,
-0.11523864418268204,
-0.03566651791334152,
0.032860834151506424,
-0.08133957535028458,
0.09245038032531738,
0.017422059550881386,
0.0005610236548818648,
0.03160813823342323,
-0.1535664200782776,
0.036370377987623215,
0.05385810136795044,
0.009205443784594536,
0.020857736468315125,
-0.10042215138673782,
-0.01262616366147995,
0.014799488708376884,
0.026126576587557793,
0.015979856252670288,
0.10060581564903259,
-0.12210071086883545,
-0.00009802722343010828,
-0.015770386904478073,
-0.037185076624155045,
-0.05568239837884903,
0.023256896063685417,
0.05811728909611702,
0.00994768738746643,
0.21913380920886993,
-0.097664974629879,
0.0028523665387183428,
-0.21141497790813446,
0.0176639873534441,
0.0007810533279553056,
-0.12337649613618851,
-0.13409334421157837,
-0.05361213907599449,
0.0449591688811779,
-0.06010846421122551,
0.11760981380939484,
-0.008881824091076851,
0.0360216461122036,
0.0338725782930851,
-0.008702998049557209,
0.06901426613330841,
0.019220678135752678,
0.24172905087471008,
0.006425208877772093,
-0.04202411696314812,
0.033092062920331955,
0.014688082039356232,
0.11473368853330612,
0.08115489035844803,
0.16912950575351715,
0.1577177345752716,
-0.05358307063579559,
0.11051241308450699,
0.041217099875211716,
-0.014393788762390614,
-0.1434302031993866,
0.03969772905111313,
-0.028752263635396957,
0.10588830709457397,
-0.01534363441169262,
0.21416474878787994,
0.12486592680215836,
-0.14903537929058075,
0.005624936427921057,
-0.049998629838228226,
-0.062221549451351166,
-0.09715531766414642,
-0.10447311401367188,
-0.107839435338974,
-0.14084257185459137,
-0.008640117011964321,
-0.09968771040439606,
0.003641804214566946,
0.0704628974199295,
0.0014332198770716786,
-0.03543303534388542,
0.18045610189437866,
0.010008204728364944,
-0.00035062083043158054,
0.05779534950852394,
-0.0066108847968280315,
-0.03599819168448448,
-0.061042968183755875,
-0.10694198310375214,
0.007210060488432646,
-0.0037479286547750235,
0.02507293038070202,
-0.032445937395095825,
-0.012675389647483826,
0.03130525350570679,
-0.02402145229279995,
-0.11152885854244232,
0.0049141775816679,
0.029229294508695602,
0.04789820685982704,
0.022357843816280365,
0.0183409433811903,
-0.007085159420967102,
0.0069655184634029865,
0.23150725662708282,
-0.06955290585756302,
-0.06649051606655121,
-0.09210243821144104,
0.1515483260154724,
0.008076125755906105,
-0.004019885323941708,
0.016873685643076897,
-0.10012172907590866,
0.03957004100084305,
0.2237454205751419,
0.17956610023975372,
-0.08302394300699234,
0.004502059426158667,
-0.014472231268882751,
-0.009055675938725471,
-0.020488860085606575,
0.07603344321250916,
0.0952640250325203,
0.0020983435679227114,
-0.06364130228757858,
-0.016774604097008705,
-0.036038655787706375,
-0.002185751451179385,
-0.042458679527044296,
0.09067313373088837,
0.01798054203391075,
0.013262704946100712,
-0.041686657816171646,
0.05917557701468468,
-0.025006094947457314,
-0.08167798072099686,
0.011482834815979004,
-0.19901376962661743,
-0.1332925409078598,
-0.03097497671842575,
0.0866120383143425,
-0.013259107246994972,
0.04660402610898018,
-0.0265594944357872,
0.011389429681003094,
0.029587984085083008,
-0.02269625850021839,
-0.08439406007528305,
-0.03395799919962883,
0.05973925441503525,
-0.125967338681221,
0.23256352543830872,
-0.03163767233490944,
0.03685236722230911,
0.1292109489440918,
0.026817049831151962,
-0.09386912733316422,
0.09507124125957489,
0.045535579323768616,
-0.034474071115255356,
0.057718705385923386,
0.07771362364292145,
-0.027460476383566856,
0.10329899191856384,
0.05604051798582077,
-0.0809304490685463,
0.009916216135025024,
-0.02441335842013359,
-0.05892753601074219,
-0.061281152069568634,
-0.05275372043251991,
-0.06391709297895432,
0.13163277506828308,
0.1605108380317688,
-0.05778026208281517,
0.002062993124127388,
-0.044735610485076904,
0.027307812124490738,
0.08835188299417496,
0.03921535611152649,
-0.017637820914387703,
-0.22947071492671967,
0.004247667267918587,
0.07924506813287735,
0.003556932555511594,
-0.31749746203422546,
-0.085939921438694,
-0.02757384069263935,
-0.03703475371003151,
-0.10162842273712158,
0.08380453288555145,
0.14250001311302185,
0.04501932114362717,
-0.05422680452466011,
-0.0499359592795372,
-0.08480043709278107,
0.15979593992233276,
-0.13092486560344696,
-0.09257251769304276
] |
null | null | stable-baselines3 |
# **PPO** Agent playing **LunarLander-v2**
This is a trained model of a **PPO** agent playing **LunarLander-v2**
using the [stable-baselines3 library](https://github.com/DLR-RM/stable-baselines3).
## Usage (with Stable-baselines3)
A minimal, runnable sketch (the checkpoint filename is an assumption; check the repo files for the exact name):

```python
from huggingface_sb3 import load_from_hub
from stable_baselines3 import PPO

# Download the trained agent from the Hub and load it.
# Filename is an assumption; adjust to the actual file in the repo.
checkpoint = load_from_hub("imadjinn/ppo-LunarLander-v2", "ppo-LunarLander-v2.zip")
model = PPO.load(checkpoint)
```
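
To sanity-check the reported mean reward (246.44 +/- 20.91), a hedged evaluation sketch; it assumes a recent stable-baselines3 that uses gymnasium, with `gymnasium[box2d]` installed:

```python
import gymnasium as gym
from stable_baselines3.common.evaluation import evaluate_policy

# Evaluate the loaded agent (from the snippet above) over a handful of episodes.
env = gym.make("LunarLander-v2")
mean_reward, std_reward = evaluate_policy(model, env, n_eval_episodes=10, deterministic=True)
print(f"mean_reward={mean_reward:.2f} +/- {std_reward:.2f}")
```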
| {"library_name": "stable-baselines3", "tags": ["LunarLander-v2", "deep-reinforcement-learning", "reinforcement-learning", "stable-baselines3"], "model-index": [{"name": "PPO", "results": [{"task": {"type": "reinforcement-learning", "name": "reinforcement-learning"}, "dataset": {"name": "LunarLander-v2", "type": "LunarLander-v2"}, "metrics": [{"type": "mean_reward", "value": "246.44 +/- 20.91", "name": "mean_reward", "verified": false}]}]}]} | reinforcement-learning | imadjinn/ppo-LunarLander-v2 | [
"stable-baselines3",
"LunarLander-v2",
"deep-reinforcement-learning",
"reinforcement-learning",
"model-index",
"region:us"
] | 2024-02-13T10:43:15+00:00 | [] | [] | TAGS
#stable-baselines3 #LunarLander-v2 #deep-reinforcement-learning #reinforcement-learning #model-index #region-us
|
# PPO Agent playing LunarLander-v2
This is a trained model of a PPO agent playing LunarLander-v2
using the stable-baselines3 library.
## Usage (with Stable-baselines3)
TODO: Add your code
| [
"# PPO Agent playing LunarLander-v2\nThis is a trained model of a PPO agent playing LunarLander-v2\nusing the stable-baselines3 library.",
"## Usage (with Stable-baselines3)\nTODO: Add your code"
] | [
"TAGS\n#stable-baselines3 #LunarLander-v2 #deep-reinforcement-learning #reinforcement-learning #model-index #region-us \n",
"# PPO Agent playing LunarLander-v2\nThis is a trained model of a PPO agent playing LunarLander-v2\nusing the stable-baselines3 library.",
"## Usage (with Stable-baselines3)\nTODO: Add your code"
] | [
39,
41,
17
] | [
"passage: TAGS\n#stable-baselines3 #LunarLander-v2 #deep-reinforcement-learning #reinforcement-learning #model-index #region-us \n# PPO Agent playing LunarLander-v2\nThis is a trained model of a PPO agent playing LunarLander-v2\nusing the stable-baselines3 library.## Usage (with Stable-baselines3)\nTODO: Add your code"
] | [
0.03942384943366051,
0.04900386184453964,
-0.005304091144353151,
0.026427261531352997,
0.107408307492733,
-0.026511888951063156,
0.11188238859176636,
0.0814051404595375,
0.10722193866968155,
0.04762078449130058,
0.08338645845651627,
0.06030960753560066,
0.05080918222665787,
0.2571701407432556,
0.04754156619310379,
-0.22987541556358337,
0.036159250885248184,
-0.04869936779141426,
0.12395193427801132,
0.07178173214197159,
-0.0038484656251966953,
-0.06485428661108017,
0.020415637642145157,
-0.013290755450725555,
0.05367108806967735,
0.04282612353563309,
-0.01716216839849949,
-0.08207534998655319,
0.07169748842716217,
-0.06345846503973007,
0.06986866891384125,
0.07677983492612839,
0.13218913972377777,
-0.17832116782665253,
0.029566360637545586,
0.02571309357881546,
-0.07189024239778519,
0.01342033501714468,
0.008019951172173023,
0.05120139941573143,
0.17303818464279175,
0.019879888743162155,
0.07844575494527817,
-0.0025605305563658476,
-0.15412317216396332,
-0.018950799480080605,
0.0436202734708786,
0.12546207010746002,
0.08808347582817078,
0.04605821147561073,
0.01970590092241764,
0.17503218352794647,
-0.054352790117263794,
-0.028833400458097458,
0.21759237349033356,
-0.2881564497947693,
-0.031460098922252655,
0.321048766374588,
0.06997483223676682,
0.09725230932235718,
-0.07540661096572876,
-0.03619609400629997,
0.007783263456076384,
-0.013137873262166977,
-0.028666524216532707,
-0.07447073608636856,
0.17313385009765625,
0.05152064561843872,
-0.05057951435446739,
-0.09541505575180054,
0.16948209702968597,
0.006921638268977404,
0.0018855923553928733,
-0.019282981753349304,
0.009060598909854889,
0.07402525842189789,
-0.016097044572234154,
-0.07255112379789352,
0.057438433170318604,
0.05330665782094002,
0.019649166613817215,
-0.1435653269290924,
-0.10762494057416916,
-0.022740179672837257,
-0.008012006990611553,
0.17786912620067596,
-0.009255532175302505,
0.042902372777462006,
0.003065188182517886,
0.10384012013673782,
-0.12480384111404419,
-0.03354184702038765,
-0.0454259067773819,
-0.07565800100564957,
-0.0223417766392231,
-0.02058211714029312,
-0.03580251708626747,
0.07184842973947525,
0.11971849203109741,
0.027368178591132164,
0.09350208193063736,
0.047715865075588226,
-0.03206788748502731,
0.06343851238489151,
0.05555703118443489,
0.14222665131092072,
0.05807621404528618,
0.012854371219873428,
0.13179877400398254,
0.055213116109371185,
0.033023182302713394,
-0.0613492950797081,
-0.18252409994602203,
0.07489913702011108,
-0.07031869143247604,
0.007941240444779396,
0.12051256000995636,
-0.04480670019984245,
-0.1183447614312172,
-0.037500523030757904,
-0.017392054200172424,
-0.06224250793457031,
-0.025395862758159637,
0.0547584593296051,
-0.02883218228816986,
-0.03973718360066414,
0.0011496668448671699,
0.09384800493717194,
0.00953749567270279,
-0.1752052903175354,
0.03303423151373863,
-0.025042934343218803,
-0.10782608389854431,
0.009975161403417587,
0.0022444494534283876,
0.03394931182265282,
0.04408763721585274,
-0.11822668462991714,
-0.30899152159690857,
-0.07652641832828522,
0.05490870401263237,
-0.06516939401626587,
-0.18425025045871735,
-0.13193942606449127,
0.02454492449760437,
-0.09037084132432938,
-0.044885024428367615,
-0.12759265303611755,
-0.028549788519740105,
0.01743689924478531,
0.011519349180161953,
0.10758619755506516,
-0.0106219332665205,
-0.012188062071800232,
-0.1571401208639145,
0.008273907005786896,
-0.20951123535633087,
0.0890483483672142,
-0.019150104373693466,
0.037884220480918884,
-0.032381169497966766,
-0.07404014468193054,
0.030707746744155884,
0.052499737590551376,
-0.01474119070917368,
0.13510210812091827,
-0.15592676401138306,
-0.03691192343831062,
-0.007996266707777977,
-0.13611900806427002,
-0.04786273464560509,
-0.10358831286430359,
-0.04357128217816353,
0.13354332745075226,
0.018664736300706863,
0.15356586873531342,
-0.08709818124771118,
-0.0722038671374321,
0.20489206910133362,
-0.010411538183689117,
-0.12820468842983246,
-0.076752208173275,
0.10165707021951675,
0.021510310471057892,
-0.056606587022542953,
-0.02523270808160305,
-0.1839766949415207,
-0.0152357779443264,
-0.04550420492887497,
-0.047039128839969635,
0.01796751655638218,
-0.010888241231441498,
0.13837894797325134,
0.08494598418474197,
0.05018039792776108,
-0.06086122244596481,
-0.006730288732796907,
0.10779471695423126,
0.08823856711387634,
0.008680110797286034,
0.023406028747558594,
-0.05774238705635071,
0.09552932530641556,
-0.04003755748271942,
-0.0142367510125041,
-0.08283266425132751,
-0.036246106028556824,
-0.026256313547492027,
0.17507147789001465,
0.09440762549638748,
0.2257927656173706,
0.09567736834287643,
0.039160262793302536,
0.031270865350961685,
-0.13181598484516144,
-0.1425403207540512,
-0.0017254541162401438,
0.09020978957414627,
-0.14270411431789398,
-0.04119925573468208,
-0.08974775671958923,
-0.17768175899982452,
-0.12202505767345428,
0.0006432619411498308,
-0.17960017919540405,
0.06390921026468277,
0.05408334732055664,
-0.035177867859601974,
0.03272094577550888,
0.13032332062721252,
-0.011533179320394993,
-0.03967514634132385,
0.0831870287656784,
0.0379033200442791,
-0.041234664618968964,
-0.021742934361100197,
0.11885567009449005,
0.15673065185546875,
0.13124459981918335,
-0.03511447086930275,
0.004914294462651014,
0.07076404243707657,
-0.02309088408946991,
0.06539414077997208,
0.0558244064450264,
0.20973342657089233,
0.188301220536232,
0.038996949791908264,
0.008822928182780743,
-0.07048165798187256,
0.0855446457862854,
-0.0742373839020729,
-0.14302679896354675,
-0.05579735338687897,
0.08729292452335358,
0.016605578362941742,
0.023469142615795135,
0.08711627870798111,
0.024545932188630104,
0.09132762253284454,
0.15968108177185059,
0.01990218088030815,
-0.09659269452095032,
-0.050218869000673294,
0.01175848301500082,
0.027713103219866753,
0.04794301092624664,
-0.04514073207974434,
-0.00937939714640379,
0.017020760104060173,
-0.10303554683923721,
0.031789086759090424,
-0.1413339376449585,
-0.1358717679977417,
0.044326696544885635,
0.003906996920704842,
0.010907664895057678,
0.02786896750330925,
-0.0038291432429105043,
0.019039705395698547,
0.04351753741502762,
-0.06975466758012772,
0.047416772693395615,
-0.024745507165789604,
-0.020031947642564774,
0.03340689837932587,
-0.057257164269685745,
-0.205775648355484,
-0.17696654796600342,
0.00013708483311347663,
-0.09910997003316879,
0.10194740444421768,
0.018308809027075768,
-0.12373185902833939,
0.047737859189510345,
-0.05822649225592613,
0.027574289590120316,
-0.01875593699514866,
-0.049130141735076904,
0.10507171601057053,
0.1525275856256485,
-0.016146350651979446,
0.018018173053860664,
-0.04865182936191559,
-0.10157987475395203,
-0.19632206857204437,
0.0691583976149559,
0.04680244252085686,
0.014610917307436466,
0.10669491440057755,
0.018072687089443207,
0.02367905154824257,
-0.007674071006476879,
-0.016521066427230835,
-0.011659215204417706,
-0.08781040459871292,
0.31909599900245667,
0.04510033503174782,
-0.025173069909214973,
0.02041010931134224,
-0.0043001663871109486,
-0.028083480894565582,
0.03263787180185318,
-0.0985708013176918,
-0.07548979669809341,
-0.08774089068174362,
-0.04367410019040108,
-0.09784720093011856,
0.053299110382795334,
0.05916472524404526,
0.003188040340319276,
-0.07727594673633575,
0.04221395403146744,
0.11369874328374863,
-0.0923808291554451,
-0.07137343287467957,
0.07477962225675583,
0.0972946360707283,
-0.07331304252147675,
0.00012658814375754446,
0.00874367356300354,
0.023951783776283264,
0.037102166563272476,
0.06778035312891006,
-0.03966575115919113,
0.08589404821395874,
-0.19917890429496765,
0.0372927263379097,
0.106058269739151,
0.023754918947815895,
0.0638108178973198,
0.07643651217222214,
-0.1058402881026268,
-0.008500572293996811,
-0.032518330961465836,
-0.21341575682163239,
0.1668180525302887,
0.1355515867471695,
0.06788124144077301,
-0.025637222453951836,
-0.00461410591378808,
-0.0649740919470787,
0.05773647129535675,
0.02723747305572033,
-0.14758841693401337,
0.004883295856416225,
0.06064270809292793,
0.026899009943008423,
0.01614922471344471,
0.07971042394638062,
0.014697225764393806,
-0.1801026314496994,
-0.014406266622245312,
0.10730406641960144,
0.002390873385593295,
0.0053148469887673855,
-0.03175045922398567,
-0.1755964607000351,
0.0751047357916832,
0.004285442177206278,
0.07233936339616776,
-0.1676585078239441,
0.14297930896282196,
-0.10089799761772156,
0.07726949453353882,
-0.004285062663257122,
-0.021311495453119278,
0.02507244050502777,
-0.0541163794696331,
0.15163759887218475,
0.01058570109307766,
-0.021810131147503853,
-0.1200498715043068,
-0.1717042326927185,
-0.019227758049964905,
-0.11788936704397202,
-0.11679866164922714,
0.050424277782440186,
0.062185097485780716,
0.04923136904835701,
-0.061147067695856094,
0.1518532931804657,
-0.047422297298908234,
0.060713399201631546,
-0.06893875449895859,
-0.06755045056343079,
0.03764858841896057,
-0.12588608264923096,
-0.08176055550575256,
0.05573027580976486,
0.19166934490203857,
0.15833087265491486,
-0.02816431224346161,
-0.03472423925995827,
-0.047419581562280655,
-0.006212298292666674,
-0.007802055217325687,
0.0275666993111372,
0.023223137483000755,
0.07315318286418915,
-0.07681374251842499,
-0.11649256944656372,
0.033787861466407776,
-0.06713802367448807,
-0.055589709430933,
-0.015439179725944996,
0.1513158082962036,
0.04671623185276985,
0.07720734924077988,
-0.018946662545204163,
0.03887668624520302,
-0.001724981120787561,
-0.056474871933460236,
0.16197094321250916,
0.03885216265916824,
-0.05193585529923439,
0.06837689876556396,
0.053174007683992386,
0.043745119124650955,
0.03011113777756691,
-0.026783017441630363,
0.206032395362854,
0.1980147808790207,
0.014206883497536182,
0.2175983190536499,
0.03177616000175476,
-0.03772832080721855,
-0.1300560086965561,
-0.065880686044693,
-0.006372632458806038,
0.03559038043022156,
0.08070417493581772,
-0.18207235634326935,
-0.015011128038167953,
-0.05689644813537598,
-0.034518610686063766,
-0.15059494972229004,
-0.28553900122642517,
-0.05957856774330139,
0.20075850188732147,
0.14706264436244965,
0.27519428730010986,
-0.10432573407888412,
0.035197313874959946,
0.02663275972008705,
-0.04912831634283066,
-0.006501141935586929,
0.00018665487004909664,
0.10268618166446686,
-0.15421873331069946,
0.1176437959074974,
0.08486983180046082,
-0.019002694636583328,
0.01058861706405878,
-0.1619086116552353,
0.00936629343777895,
-0.12191236019134521,
0.05354422330856323,
0.1400289237499237,
-0.048128653317689896,
-0.054873593151569366,
0.14033560454845428,
-0.024562934413552284,
-0.22685599327087402,
-0.04648222774267197,
-0.043600670993328094,
-0.010640020482242107,
0.026607351377606392,
-0.1013401448726654,
0.04101909324526787,
0.1330099105834961,
0.009380043484270573,
0.1147187277674675,
0.11749245226383209,
-0.052566803991794586,
0.10792597383260727,
0.2257719188928604,
-0.018785694614052773,
0.04689010605216026,
-0.12743118405342102,
-0.0012336712097749114,
-0.028270328417420387,
0.013657891191542149,
-0.09504974633455276,
-0.09938385337591171,
0.02366873063147068,
0.02872389927506447,
0.009118586778640747,
0.0921793207526207,
-0.029922157526016235,
0.0759170651435852,
0.06817561388015747,
-0.13014446198940277,
-0.16288450360298157,
0.015828335657715797,
-0.007344507612287998,
0.08354310691356659,
0.00027861111448146403,
0.08878035843372345,
-0.11932205408811569,
-0.018093237653374672,
-0.03153328225016594,
-0.03319635987281799,
-0.130486860871315,
-0.07138993591070175,
0.06156524643301964,
0.028095467016100883,
-0.06602972000837326,
0.1398407518863678,
0.026440169662237167,
0.15942534804344177,
0.049197953194379807,
0.012499804608523846,
0.07227300107479095,
-0.05345509201288223,
0.1283530443906784,
0.13818155229091644,
-0.00868943240493536,
-0.05460423603653908,
-0.1013643890619278,
-0.10236792266368866,
0.08925779908895493,
-0.05773641914129257,
0.07476430386304855,
-0.14885357022285461,
-0.06675903499126434,
0.015772046521306038,
0.016141414642333984,
-0.09562095999717712,
0.02571965754032135,
-0.01625603251159191,
-0.18119946122169495,
0.056570518761873245,
-0.048285093158483505,
0.0440407395362854,
-0.06347788125276566,
-0.1110161691904068,
-0.17226378619670868,
0.06091433763504028,
0.08593481779098511,
-0.053876690566539764,
-0.12229149043560028,
0.011023230850696564,
-0.00012518465518951416,
-0.06341652572154999,
-0.05023367330431938,
0.09722746908664703,
-0.11020902544260025,
0.031452205032110214,
-0.012567701749503613,
0.08853451162576675,
-0.03510405123233795,
-0.011538895778357983,
0.044220831245183945,
-0.08039166033267975,
-0.009481523185968399,
0.03534642979502678,
-0.026372017338871956,
-0.04127239063382149,
-0.2689029574394226,
0.0036654395516961813,
0.0341104120016098,
0.02497158572077751,
0.07856601476669312,
0.011906822212040424,
0.021174922585487366,
0.03993808850646019,
-0.15396519005298615,
-0.013395369984209538,
0.14574195444583893,
-0.07689505815505981,
-0.022186370566487312,
0.05703273415565491,
-0.09054436534643173,
0.013882770203053951,
-0.030287226662039757,
0.1345842480659485,
0.023923413828015327,
0.06404478847980499,
-0.0851147472858429,
0.10106813907623291,
-0.1451139897108078,
-0.04998219385743141,
-0.01244612317532301,
0.09761348366737366,
0.07019034773111343,
-0.10272270441055298,
0.014697125181555748,
0.04210108891129494,
0.19416837394237518,
0.016384804621338844,
-0.0356343574821949,
-0.03396720811724663,
0.004015897400677204,
0.22076453268527985,
0.03044266067445278,
0.10457023978233337,
0.07281364500522614,
-0.026583973318338394,
0.12624378502368927,
0.09929762035608292,
0.11280370503664017,
-0.055645186454057693,
0.13904185593128204,
0.04667386785149574,
0.038641396909952164,
0.0614289753139019,
0.06836545467376709,
0.09098632633686066,
-0.0008288522367365658,
0.1138714924454689,
0.013811973854899406,
-0.02422109805047512,
-0.021335409954190254,
0.17759373784065247,
0.10501719266176224,
-0.14769648015499115,
0.029047364369034767,
-0.01258957851678133,
0.039933037012815475,
-0.014194529503583908,
-0.15634691715240479,
-0.07240267097949982,
-0.3315149247646332,
0.1226184144616127,
-0.07119352370500565,
0.019930170848965645,
0.007913772016763687,
-0.037425633519887924,
-0.03296699747443199,
-0.04477746784687042,
0.13151589035987854,
-0.013641550205647945,
-0.006079165264964104,
-0.04815853759646416,
-0.015360191464424133,
-0.11607866734266281,
-0.11200575530529022,
-0.013207737356424332,
-0.13671602308750153,
-0.010119039565324783,
0.05595948174595833,
0.003977729007601738,
0.01821410097181797,
-0.03142618387937546,
0.0024383175186812878,
0.06541839241981506,
-0.05751744285225868,
0.056182678788900375,
0.12097269296646118,
0.08766137808561325,
-0.1058853268623352,
0.031048951670527458,
0.2011747509241104,
0.04359564557671547,
-0.12483977526426315,
0.01449228823184967,
0.1819491684436798,
0.004885740112513304,
0.017068125307559967,
-0.006097703706473112,
-0.0540788508951664,
-0.07554277032613754,
0.1251034289598465,
0.08296554535627365,
-0.09985227137804031,
0.015833314508199692,
-0.0726347416639328,
-0.01594804972410202,
-0.06374675035476685,
0.10130585730075836,
0.09538925439119339,
0.04440245032310486,
-0.10621760785579681,
-0.08487539738416672,
-0.10891728103160858,
0.040588874369859695,
-0.08629853278398514,
-0.07311757653951645,
0.09629398584365845,
-0.07057105004787445,
-0.07029950618743896,
0.025521177798509598,
-0.17978744208812714,
-0.009467960335314274,
0.1711762249469757,
-0.24654000997543335,
-0.0916430801153183,
-0.10857923328876495,
0.14477859437465668,
0.016497576609253883,
0.1013975441455841,
-0.006207061931490898,
-0.007889035157859325,
-0.20577777922153473,
0.024890204891562462,
-0.05293011665344238,
-0.02073732763528824,
0.07814782857894897,
-0.09476397186517715,
0.22629831731319427,
-0.08276885002851486,
0.020940175279974937,
0.012659613974392414,
0.0870661810040474,
-0.030675338581204414,
0.09283176809549332,
-0.03660329803824425,
-0.12576518952846527,
-0.03620953485369682,
0.03001813031733036,
0.013904244638979435,
0.10071761906147003,
0.09772487729787827,
-0.03414725139737129,
0.03389119729399681,
0.09747414290904999,
0.04172342270612717,
-0.023843804374337196,
0.0360250361263752,
-0.17077107727527618,
0.02182629331946373,
-0.018498148769140244,
-0.06935930997133255,
0.03687669709324837,
-0.06603235751390457,
0.1639697551727295,
0.04022442549467087,
0.0670473501086235,
-0.036152735352516174,
0.0073931049555540085,
-0.014454689808189869,
-0.013775371946394444,
-0.026180334389209747,
-0.17259705066680908,
-0.10422050207853317,
-0.1347656100988388,
-0.012701659463346004,
-0.034971047192811966,
0.04591470584273338,
0.023234914988279343,
-0.0003200018545612693,
-0.014577031135559082,
-0.12090865522623062,
0.04360328987240791,
0.11146783083677292,
-0.04631396010518074,
-0.026193076744675636
] |
null | null | transformers |
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed to the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
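
The card leaves this open; since the repository is tagged as a T5 text2text-generation checkpoint, a minimal loading sketch might look like the following (the prompt is purely illustrative, and the checkpoint's actual capabilities are undocumented):

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

repo_id = "OmarHaroon01/t5_pretrain_final_final_final_kaggle"
tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForSeq2SeqLM.from_pretrained(repo_id)

# Illustrative prompt only; the model's training objective is not documented.
inputs = tokenizer("translate English to German: Hello, world!", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```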
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed]
| {"library_name": "transformers", "tags": []} | text2text-generation | OmarHaroon01/t5_pretrain_final_final_final_kaggle | [
"transformers",
"safetensors",
"t5",
"text2text-generation",
"arxiv:1910.09700",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | 2024-02-13T10:44:25+00:00 | [
"1910.09700"
] | [] | TAGS
#transformers #safetensors #t5 #text2text-generation #arxiv-1910.09700 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
# Model Card for Model ID
## Model Details
### Model Description
This is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.
- Developed by:
- Funded by [optional]:
- Shared by [optional]:
- Model type:
- Language(s) (NLP):
- License:
- Finetuned from model [optional]:
### Model Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Downstream Use [optional]
### Out-of-Scope Use
## Bias, Risks, and Limitations
### Recommendations
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
## Training Details
### Training Data
### Training Procedure
#### Preprocessing [optional]
#### Training Hyperparameters
- Training regime:
#### Speeds, Sizes, Times [optional]
## Evaluation
### Testing Data, Factors & Metrics
#### Testing Data
#### Factors
#### Metrics
### Results
#### Summary
## Model Examination [optional]
## Environmental Impact
Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).
- Hardware Type:
- Hours used:
- Cloud Provider:
- Compute Region:
- Carbon Emitted:
## Technical Specifications [optional]
### Model Architecture and Objective
### Compute Infrastructure
#### Hardware
#### Software
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Model Card Authors [optional]
## Model Card Contact
| [
"# Model Card for Model ID",
"## Model Details",
"### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:",
"### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Downstream Use [optional]",
"### Out-of-Scope Use",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.",
"## How to Get Started with the Model\n\nUse the code below to get started with the model.",
"## Training Details",
"### Training Data",
"### Training Procedure",
"#### Preprocessing [optional]",
"#### Training Hyperparameters\n\n- Training regime:",
"#### Speeds, Sizes, Times [optional]",
"## Evaluation",
"### Testing Data, Factors & Metrics",
"#### Testing Data",
"#### Factors",
"#### Metrics",
"### Results",
"#### Summary",
"## Model Examination [optional]",
"## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:",
"## Technical Specifications [optional]",
"### Model Architecture and Objective",
"### Compute Infrastructure",
"#### Hardware",
"#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Model Card Authors [optional]",
"## Model Card Contact"
] | [
"TAGS\n#transformers #safetensors #t5 #text2text-generation #arxiv-1910.09700 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n",
"# Model Card for Model ID",
"## Model Details",
"### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:",
"### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Downstream Use [optional]",
"### Out-of-Scope Use",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.",
"## How to Get Started with the Model\n\nUse the code below to get started with the model.",
"## Training Details",
"### Training Data",
"### Training Procedure",
"#### Preprocessing [optional]",
"#### Training Hyperparameters\n\n- Training regime:",
"#### Speeds, Sizes, Times [optional]",
"## Evaluation",
"### Testing Data, Factors & Metrics",
"#### Testing Data",
"#### Factors",
"#### Metrics",
"### Results",
"#### Summary",
"## Model Examination [optional]",
"## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:",
"## Technical Specifications [optional]",
"### Model Architecture and Objective",
"### Compute Infrastructure",
"#### Hardware",
"#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Model Card Authors [optional]",
"## Model Card Contact"
] | [
58,
6,
3,
82,
28,
3,
4,
9,
9,
10,
42,
20,
3,
4,
5,
9,
11,
13,
3,
12,
5,
4,
5,
3,
4,
9,
53,
9,
8,
6,
3,
14,
8,
7,
9,
4
] | [
"passage: TAGS\n#transformers #safetensors #t5 #text2text-generation #arxiv-1910.09700 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n# Model Card for Model ID## Model Details### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Downstream Use [optional]### Out-of-Scope Use## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.## How to Get Started with the Model\n\nUse the code below to get started with the model.## Training Details### Training Data### Training Procedure#### Preprocessing [optional]#### Training Hyperparameters\n\n- Training regime:#### Speeds, Sizes, Times [optional]## Evaluation### Testing Data, Factors & Metrics#### Testing Data#### Factors#### Metrics### Results#### Summary## Model Examination [optional]## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:## Technical Specifications [optional]### Model Architecture and Objective### Compute Infrastructure#### Hardware#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Model Card Authors [optional]## Model Card Contact"
] | [
-0.053328532725572586,
0.16120538115501404,
-0.005120371468365192,
0.022602224722504616,
0.09686747193336487,
0.013199392706155777,
0.07261143624782562,
0.11177206039428711,
-0.020693831145763397,
0.1128523200750351,
0.0323781855404377,
0.09778297692537308,
0.11381756514310837,
0.15530984103679657,
-0.0018252237932756543,
-0.23414164781570435,
0.051169246435165405,
-0.12603329122066498,
-0.039110470563173294,
0.11734651774168015,
0.14655858278274536,
-0.10434788465499878,
0.07780920714139938,
-0.029932111501693726,
-0.010786613449454308,
-0.030950399115681648,
-0.06109464541077614,
-0.04963193088769913,
0.05158040300011635,
0.07096312940120697,
0.06875279545783997,
0.009741154499351978,
0.09293358027935028,
-0.2676756680011749,
0.021060682833194733,
0.07436702400445938,
-0.0019205488497391343,
0.07644513249397278,
0.05394738167524338,
-0.07786445319652557,
0.08801496773958206,
-0.053122974932193756,
0.14802159368991852,
0.08166222274303436,
-0.09144649654626846,
-0.19256246089935303,
-0.08630277216434479,
0.10201671719551086,
0.17971307039260864,
0.050409309566020966,
-0.02338344417512417,
0.10295069962739944,
-0.08843041211366653,
0.012706292793154716,
0.059160783886909485,
-0.06515879184007645,
-0.05482804775238037,
0.0630323737859726,
0.08173035830259323,
0.0787791833281517,
-0.12468571215867996,
-0.018215585500001907,
0.011311499401926994,
0.00691694812849164,
0.08102929592132568,
0.022060219198465347,
0.14176861941814423,
0.03922285884618759,
-0.1292058527469635,
-0.047744158655405045,
0.10315844416618347,
0.04381343349814415,
-0.04969092458486557,
-0.24839195609092712,
-0.028692634776234627,
-0.03409173712134361,
-0.029329892247915268,
-0.041139665991067886,
0.04428756237030029,
-0.010770969092845917,
0.08322557806968689,
-0.008045176975429058,
-0.07979845255613327,
-0.03690612316131592,
0.06324487924575806,
0.05645342543721199,
0.024454401805996895,
-0.008984005078673363,
0.006743076257407665,
0.1175178587436676,
0.10636600106954575,
-0.12631633877754211,
-0.05289403349161148,
-0.06528059393167496,
-0.0853288322687149,
-0.04429693520069122,
0.03338160738348961,
0.04351643845438957,
0.04334709793329239,
0.24920088052749634,
0.011966975405812263,
0.05556565150618553,
0.03878911957144737,
0.011687099933624268,
0.06360286474227905,
0.11270952969789505,
-0.05845928564667702,
-0.09383665025234222,
-0.033332064747810364,
0.09301437437534332,
0.008503437042236328,
-0.0402098223567009,
-0.06047673895955086,
0.06078295037150383,
0.015703821554780006,
0.12211526930332184,
0.087046779692173,
0.002870776690542698,
-0.07195370644330978,
-0.06478150933980942,
0.19285908341407776,
-0.15949691832065582,
0.047871991991996765,
0.03357849270105362,
-0.040312062948942184,
-0.0005020854296162724,
0.01165273692458868,
0.023987481370568275,
-0.021567439660429955,
0.0924374982714653,
-0.05500924214720726,
-0.03761355206370354,
-0.10879732668399811,
-0.03591866046190262,
0.03197222575545311,
0.0022585385013371706,
-0.02967100404202938,
-0.033424828201532364,
-0.08920473605394363,
-0.0635172426700592,
0.09580977261066437,
-0.07413128018379211,
-0.05156254023313522,
-0.016345804557204247,
-0.0761859342455864,
0.026101797819137573,
0.01702207140624523,
0.08535456657409668,
-0.0213642455637455,
0.037230201065540314,
-0.05421315133571625,
0.06241346150636673,
0.10910454392433167,
0.0320611298084259,
-0.053984515368938446,
0.06094928830862045,
-0.2412392497062683,
0.10316064208745956,
-0.07156267017126083,
0.05108866095542908,
-0.15137021243572235,
-0.025331947952508926,
0.04665522649884224,
0.009590202011168003,
-0.011478574015200138,
0.14007656276226044,
-0.2198302298784256,
-0.029333066195249557,
0.1640782356262207,
-0.09730498492717743,
-0.08055570721626282,
0.059064920991659164,
-0.054139286279678345,
0.10999192297458649,
0.04003598168492317,
-0.023768696933984756,
0.06297750771045685,
-0.14250542223453522,
-0.0039275879971683025,
-0.041889119893312454,
-0.01720282807946205,
0.16010744869709015,
0.07506491243839264,
-0.06698185205459595,
0.077672079205513,
0.022212913259863853,
-0.023321649059653282,
-0.04393244534730911,
-0.022494852542877197,
-0.10826845467090607,
0.009565223939716816,
-0.06269361078739166,
0.02424052357673645,
-0.023944495245814323,
-0.0903024971485138,
-0.029575346037745476,
-0.1770460456609726,
-0.013402442447841167,
0.08679109811782837,
-0.010982494801282883,
-0.019886262714862823,
-0.11693590134382248,
0.012033592909574509,
0.032231178134679794,
0.0004325093177612871,
-0.13445010781288147,
-0.05658498778939247,
0.0273329745978117,
-0.16240260004997253,
0.031236927956342697,
-0.05114622414112091,
0.04928715154528618,
0.03406677767634392,
-0.03175085783004761,
-0.031348153948783875,
0.01572313904762268,
0.006510823033750057,
-0.013680041767656803,
-0.24737438559532166,
-0.02852414920926094,
-0.022412575781345367,
0.16979394853115082,
-0.2190135270357132,
0.04012007266283035,
0.07135825604200363,
0.15074580907821655,
0.006911954842507839,
-0.03669405356049538,
0.005606858059763908,
-0.0768459290266037,
-0.03284264728426933,
-0.0623927041888237,
-0.008401541970670223,
-0.03721899166703224,
-0.054593876004219055,
0.051287684589624405,
-0.16718235611915588,
-0.031153932213783264,
0.1028679683804512,
0.06780845671892166,
-0.13963541388511658,
-0.01705223321914673,
-0.04106766730546951,
-0.043112557381391525,
-0.05709490180015564,
-0.05539087578654289,
0.11148729920387268,
0.05757083371281624,
0.04828811436891556,
-0.06848311424255371,
-0.0756818875670433,
0.006132613401859999,
-0.0179264098405838,
-0.021222935989499092,
0.0928845927119255,
0.07583390921354294,
-0.12310270220041275,
0.09178637713193893,
0.10549022257328033,
0.0892157256603241,
0.10119049996137619,
-0.02137933485209942,
-0.08691582083702087,
-0.04892461374402046,
0.0229446180164814,
0.016364475712180138,
0.13983985781669617,
-0.016759416088461876,
0.05310053750872612,
0.04020100086927414,
-0.012910815887153149,
0.011883769184350967,
-0.09328193217515945,
0.02934250421822071,
0.03636814281344414,
-0.019501443952322006,
0.040251899510622025,
-0.03908125311136246,
0.020790016278624535,
0.08787564933300018,
0.04434992000460625,
0.03818633407354355,
0.013980780728161335,
-0.04370194673538208,
-0.11091572046279907,
0.17051653563976288,
-0.12536633014678955,
-0.239797443151474,
-0.14147889614105225,
0.001731917611323297,
0.041165996342897415,
-0.01159723661839962,
0.0031763319857418537,
-0.06770002096891403,
-0.11874829977750778,
-0.09346967190504074,
0.015001182444393635,
0.04228860139846802,
-0.080612413585186,
-0.05524664744734764,
0.05777253210544586,
0.040611669421195984,
-0.143319234251976,
0.020423002541065216,
0.04869217798113823,
-0.08989228308200836,
-0.00900039542466402,
0.08071441948413849,
0.06998268514871597,
0.17929090559482574,
0.009512054733932018,
-0.020932139828801155,
0.03292093798518181,
0.2157505750656128,
-0.13771237432956696,
0.11451084166765213,
0.14277678728103638,
-0.0911637470126152,
0.08293474465608597,
0.1991184800863266,
0.03884927183389664,
-0.10264625400304794,
0.03326369449496269,
0.022328944876790047,
-0.028676386922597885,
-0.2503291964530945,
-0.06918580830097198,
0.0007976540364325047,
-0.05238448083400726,
0.07527847588062286,
0.08888168632984161,
0.09494108706712723,
0.01729334332048893,
-0.09416709095239639,
-0.08025584369897842,
0.04901478812098503,
0.10409125685691833,
0.010409193113446236,
-0.01156378723680973,
0.09060908854007721,
-0.03323452174663544,
0.01843860000371933,
0.09313460439443588,
0.004041523206979036,
0.17060963809490204,
0.05550962686538696,
0.18336638808250427,
0.07643263041973114,
0.0721396952867508,
0.015671607106924057,
0.013079277239739895,
0.02304760180413723,
0.021578695625066757,
-0.0033059304114431143,
-0.0851421132683754,
-0.009511260315775871,
0.11862117052078247,
0.06801546365022659,
0.020754681900143623,
0.009507957845926285,
-0.033934496343135834,
0.08064714074134827,
0.17465052008628845,
-0.0009437129483558238,
-0.1870066076517105,
-0.06896740943193436,
0.08026526123285294,
-0.08972865343093872,
-0.10345284640789032,
-0.02900044620037079,
0.0354950949549675,
-0.17372116446495056,
0.02448408491909504,
-0.018045885488390923,
0.11108683049678802,
-0.1356782615184784,
-0.01890929788351059,
0.06319493800401688,
0.07008420675992966,
-0.0016097982879728079,
0.06208989396691322,
-0.16155508160591125,
0.10791012644767761,
0.01390943955630064,
0.06503470987081528,
-0.09786296635866165,
0.10111832618713379,
-0.006267238408327103,
-0.007413685787469149,
0.14043578505516052,
0.009255880489945412,
-0.07051325589418411,
-0.08343593031167984,
-0.0979004055261612,
-0.010649190284311771,
0.12877127528190613,
-0.14879846572875977,
0.08456916362047195,
-0.0322830006480217,
-0.04405250772833824,
0.005208021495491266,
-0.10768675804138184,
-0.12857580184936523,
-0.18887875974178314,
0.05537694692611694,
-0.13356289267539978,
0.033175256103277206,
-0.1055491715669632,
-0.0408647358417511,
-0.02885887771844864,
0.19630752503871918,
-0.22321896255016327,
-0.0670507624745369,
-0.15318840742111206,
-0.09096445143222809,
0.14798617362976074,
-0.049908362329006195,
0.08374498039484024,
-0.005065108183771372,
0.18742504715919495,
0.01894373446702957,
-0.024415504187345505,
0.1011786088347435,
-0.09638315439224243,
-0.19627197086811066,
-0.08534666895866394,
0.15457913279533386,
0.13537167012691498,
0.0351712740957737,
-0.004617651924490929,
0.03167666867375374,
-0.0189940445125103,
-0.12101218104362488,
0.022920187562704086,
0.17696480453014374,
0.07036592066287994,
0.024736741557717323,
-0.02639835514128208,
-0.11453131586313248,
-0.06600044667720795,
-0.032452553510665894,
0.02982977218925953,
0.18294402956962585,
-0.07586611062288284,
0.18679921329021454,
0.13732017576694489,
-0.05770440772175789,
-0.1956426501274109,
0.01923983357846737,
0.04058924317359924,
0.00837375782430172,
0.032165057957172394,
-0.20239581167697906,
0.08806682378053665,
0.0007347199134528637,
-0.05074144899845123,
0.13624143600463867,
-0.17552010715007782,
-0.15046143531799316,
0.06929060816764832,
0.03642011433839798,
-0.19279520213603973,
-0.12030941992998123,
-0.08865538984537125,
-0.05107492581009865,
-0.17776648700237274,
0.10758756101131439,
0.02193085290491581,
0.00676411809399724,
0.033654287457466125,
0.026140762493014336,
0.014790141955018044,
-0.0396585576236248,
0.19431912899017334,
-0.02348872646689415,
0.030807901173830032,
-0.08293910324573517,
-0.07001609355211258,
0.05941145867109299,
-0.05705835670232773,
0.0775861069560051,
-0.022215960547327995,
0.013414059765636921,
-0.10643109679222107,
-0.04425564035773277,
-0.03175993636250496,
0.015691282227635384,
-0.09722420573234558,
-0.08909335732460022,
-0.050057362765073776,
0.09262266010046005,
0.0974174216389656,
-0.035089656710624695,
-0.03564268350601196,
-0.07118509709835052,
0.039714183658361435,
0.18831974267959595,
0.17605267465114594,
0.046182651072740555,
-0.08030564337968826,
-0.004098092205822468,
-0.011694483458995819,
0.042484745383262634,
-0.21906526386737823,
0.062426332384347916,
0.05058585852384567,
0.014059843495488167,
0.1173645630478859,
-0.01779606007039547,
-0.15810294449329376,
-0.06761486083269119,
0.05993710458278656,
-0.06326820701360703,
-0.19225671887397766,
0.0032602818682789803,
0.055388111621141434,
-0.16711848974227905,
-0.04538320377469063,
0.0430813767015934,
-0.005750913172960281,
-0.039257556200027466,
0.01613711006939411,
0.08359149098396301,
0.0031580389477312565,
0.07040093839168549,
0.05520293489098549,
0.086640864610672,
-0.10250966250896454,
0.07937785238027573,
0.08386688679456711,
-0.08347215503454208,
0.028158824890851974,
0.09330378472805023,
-0.06144890934228897,
-0.029910072684288025,
0.032212331891059875,
0.08255140483379364,
0.012964491732418537,
-0.04401125758886337,
0.008184057660400867,
-0.10146338492631912,
0.0627170279622078,
0.09755739569664001,
0.03206513822078705,
0.011901181191205978,
0.03383762761950493,
0.04645882546901703,
-0.07481352984905243,
0.11842621862888336,
0.025973208248615265,
0.01822328381240368,
-0.04273592680692673,
-0.04516541585326195,
0.027133917436003685,
-0.02340707741677761,
-0.007566304877400398,
-0.03583317995071411,
-0.06988023966550827,
-0.01722576655447483,
-0.16493180394172668,
-0.01076561864465475,
-0.044063083827495575,
0.008020744659006596,
0.026847293600440025,
-0.0369400717318058,
0.008594665676355362,
0.009077225811779499,
-0.07577309012413025,
-0.06240518018603325,
-0.02245018258690834,
0.0914878100156784,
-0.16343435645103455,
0.023352261632680893,
0.08310231566429138,
-0.12098916620016098,
0.09322582185268402,
0.018653366714715958,
-0.0019369579385966063,
0.02680385299026966,
-0.15561461448669434,
0.0368269607424736,
-0.027320701628923416,
0.014671673998236656,
0.045705173164606094,
-0.21818207204341888,
-0.0014451020397245884,
-0.03558654710650444,
-0.059982262551784515,
-0.010693925432860851,
-0.037350837141275406,
-0.11245633661746979,
0.10088492184877396,
0.012412267737090588,
-0.08672942966222763,
-0.03157110512256622,
0.03652326017618179,
0.08053763210773468,
-0.02631879225373268,
0.15205731987953186,
-0.0010786735219880939,
0.07447176426649094,
-0.1738860309123993,
-0.0210786834359169,
-0.0090115275233984,
0.02177848480641842,
-0.016872623935341835,
-0.01564885675907135,
0.042430613189935684,
-0.026671668514609337,
0.18584245443344116,
-0.027355844154953957,
0.03733034059405327,
0.06316441297531128,
0.01770097203552723,
-0.021354418247938156,
0.10755398869514465,
0.06012963131070137,
0.02173144742846489,
0.019801700487732887,
0.0075409491546452045,
-0.041807159781455994,
-0.018543899059295654,
-0.19347810745239258,
0.07164526730775833,
0.14044208824634552,
0.08769161999225616,
-0.012164209969341755,
0.08067302405834198,
-0.10084949433803558,
-0.11743459850549698,
0.11121641099452972,
-0.059808436781167984,
-0.0022669173777103424,
-0.06652101874351501,
0.13155525922775269,
0.14582572877407074,
-0.19254228472709656,
0.07050827890634537,
-0.06511960923671722,
-0.05269601568579674,
-0.11906112730503082,
-0.1953776627779007,
-0.05703132599592209,
-0.054343048483133316,
-0.015079263597726822,
-0.05059242993593216,
0.07498416304588318,
0.05622640252113342,
0.010858895257115364,
0.0015552249969914556,
0.06971994787454605,
-0.019759170711040497,
0.001521410304121673,
0.032095473259687424,
0.06417544931173325,
0.014362066984176636,
-0.03133942559361458,
0.018592869862914085,
-0.008470231667160988,
0.03991629183292389,
0.0633486732840538,
0.04155107960104942,
-0.028110865503549576,
0.01659207232296467,
-0.0337030366063118,
-0.10854189842939377,
0.04278707876801491,
-0.028698457404971123,
-0.08063279837369919,
0.13984808325767517,
0.025403661653399467,
0.009562181308865547,
-0.022226108238101006,
0.241981640458107,
-0.07480388879776001,
-0.09265431761741638,
-0.14692139625549316,
0.1055137887597084,
-0.04348868504166603,
0.06415078788995743,
0.045384783297777176,
-0.10421041399240494,
0.012057800777256489,
0.12658540904521942,
0.1625804305076599,
-0.0438871793448925,
0.019560009241104126,
0.03037482313811779,
0.00398933095857501,
-0.03853052854537964,
0.05252939090132713,
0.06827457249164581,
0.14848913252353668,
-0.050116557627916336,
0.09223522990942001,
0.0050886585377156734,
-0.09908851981163025,
-0.034064266830682755,
0.11810369789600372,
-0.019035303965210915,
0.019260596483945847,
-0.05601469427347183,
0.11788773536682129,
-0.06368034332990646,
-0.233087420463562,
0.06406685709953308,
-0.07426205277442932,
-0.14131881296634674,
-0.024826664477586746,
0.07676053047180176,
-0.014309047721326351,
0.027850469574332237,
0.0722186341881752,
-0.07654546946287155,
0.19937579333782196,
0.03671684116125107,
-0.058611851185560226,
-0.05623113736510277,
0.07896319031715393,
-0.11419995129108429,
0.27488458156585693,
0.015893742442131042,
0.045155949890613556,
0.1038452610373497,
-0.013412448577582836,
-0.13435201346874237,
0.01833420805633068,
0.09638454020023346,
-0.08846497535705566,
0.04018587991595268,
0.20595665276050568,
-0.0028567397966980934,
0.11962885409593582,
0.07707620412111282,
-0.08087631314992905,
0.049051105976104736,
-0.09828304499387741,
-0.07230360060930252,
-0.08931835740804672,
0.09120666980743408,
-0.07232820242643356,
0.14308606088161469,
0.1311190128326416,
-0.05265164002776146,
0.00968363881111145,
-0.029376711696386337,
0.045510269701480865,
0.004632700700312853,
0.10403459519147873,
0.008749093860387802,
-0.1797543615102768,
0.02403045818209648,
0.01841445453464985,
0.10992073267698288,
-0.1701374351978302,
-0.09734909981489182,
0.043629229068756104,
-0.0012522460892796516,
-0.06121290475130081,
0.1290796846151352,
0.05957380682229996,
0.05011506378650665,
-0.043520737439394,
-0.0211784765124321,
-0.008504665456712246,
0.14072857797145844,
-0.10404830425977707,
-0.00016830587992444634
] |
null | null | transformers |
# Uploaded model
- **Developed by:** mahiatlinux
- **License:** apache-2.0
- **Finetuned from model:** unsloth/tinyllama-bnb-4bit
This llama model was trained 2x faster with [Unsloth](https://github.com/unslothai/unsloth) and Huggingface's TRL library.
[<img src="https://raw.githubusercontent.com/unslothai/unsloth/main/images/unsloth%20made%20with%20love.png" width="200"/>](https://github.com/unslothai/unsloth)
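The card itself ships no usage snippet, so here is a minimal sketch of the Unsloth + TRL recipe the sentence above refers to. The dataset path, LoRA settings, and hyperparameters below are illustrative assumptions, not the values actually used for this model.

```python
# Hedged sketch of an Unsloth + TRL fine-tune from the same base checkpoint.
# Everything past the imports (dataset file, LoRA rank, step count) is assumed.
from unsloth import FastLanguageModel
from trl import SFTTrainer
from transformers import TrainingArguments
from datasets import load_dataset

# Base checkpoint named in this card.
model, tokenizer = FastLanguageModel.from_pretrained(
    model_name="unsloth/tinyllama-bnb-4bit",
    max_seq_length=2048,
    load_in_4bit=True,
)

# Attach LoRA adapters so only a small fraction of the weights is trained.
model = FastLanguageModel.get_peft_model(
    model,
    r=16,
    lora_alpha=16,
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj",
                    "gate_proj", "up_proj", "down_proj"],
)

# Placeholder dataset: any JSONL file whose records carry a "text" field.
dataset = load_dataset("json", data_files="train.jsonl", split="train")

trainer = SFTTrainer(
    model=model,
    tokenizer=tokenizer,
    train_dataset=dataset,
    dataset_text_field="text",
    max_seq_length=2048,
    args=TrainingArguments(
        output_dir="outputs",
        per_device_train_batch_size=2,
        max_steps=60,
        learning_rate=2e-4,
    ),
)
trainer.train()
```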
| {"language": ["en"], "license": "apache-2.0", "tags": ["text-generation-inference", "transformers", "unsloth", "llama", "gguf"], "base_model": "unsloth/tinyllama-bnb-4bit"} | null | mahiatlinux/model5 | [
"transformers",
"gguf",
"llama",
"text-generation-inference",
"unsloth",
"en",
"base_model:unsloth/tinyllama-bnb-4bit",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] | 2024-02-13T10:45:10+00:00 | [] | [
"en"
] | TAGS
#transformers #gguf #llama #text-generation-inference #unsloth #en #base_model-unsloth/tinyllama-bnb-4bit #license-apache-2.0 #endpoints_compatible #region-us
|
# Uploaded model
- Developed by: mahiatlinux
- License: apache-2.0
- Finetuned from model: unsloth/tinyllama-bnb-4bit
This llama model was trained 2x faster with Unsloth and Huggingface's TRL library.
<img src="URL" width="200"/>
| [
"# Uploaded model\n\n- Developed by: mahiatlinux\n- License: apache-2.0\n- Finetuned from model : unsloth/tinyllama-bnb-4bit\n\nThis llama model was trained 2x faster with Unsloth and Huggingface's TRL library.\n\n<img src=\"URL width=\"200\"/>"
] | [
"TAGS\n#transformers #gguf #llama #text-generation-inference #unsloth #en #base_model-unsloth/tinyllama-bnb-4bit #license-apache-2.0 #endpoints_compatible #region-us \n",
"# Uploaded model\n\n- Developed by: mahiatlinux\n- License: apache-2.0\n- Finetuned from model : unsloth/tinyllama-bnb-4bit\n\nThis llama model was trained 2x faster with Unsloth and Huggingface's TRL library.\n\n<img src=\"URL width=\"200\"/>"
] | [
63,
78
] | [
"passage: TAGS\n#transformers #gguf #llama #text-generation-inference #unsloth #en #base_model-unsloth/tinyllama-bnb-4bit #license-apache-2.0 #endpoints_compatible #region-us \n# Uploaded model\n\n- Developed by: mahiatlinux\n- License: apache-2.0\n- Finetuned from model : unsloth/tinyllama-bnb-4bit\n\nThis llama model was trained 2x faster with Unsloth and Huggingface's TRL library.\n\n<img src=\"URL width=\"200\"/>"
] | [
-0.06339544802904129,
0.06385766714811325,
-0.003077042754739523,
0.09262873232364655,
0.05100737512111664,
0.031024787575006485,
0.08123161643743515,
0.14437168836593628,
-0.05459794029593468,
-0.02220834791660309,
0.11186816543340683,
0.10500596463680267,
0.040766533464193344,
-0.007556891534477472,
0.030850403010845184,
-0.16410492360591888,
0.0854489877820015,
-0.025436241179704666,
-0.13823498785495758,
0.03367576748132706,
0.06636486947536469,
-0.015619843266904354,
0.0843258649110794,
-0.030770249664783478,
-0.06551038473844528,
0.02010071836411953,
-0.047521334141492844,
-0.025266071781516075,
-0.006287051364779472,
0.07431270182132721,
-0.03541243448853493,
0.019772540777921677,
0.026076119393110275,
-0.12582197785377502,
0.03140607848763466,
0.03772259131073952,
0.013077774085104465,
0.05065004900097847,
-0.016478143632411957,
0.09057852625846863,
0.16429419815540314,
-0.012101562693715096,
-0.10453570634126663,
0.05312464386224747,
-0.002321898005902767,
-0.12225113064050674,
-0.0358586460351944,
0.12250462174415588,
0.00648823007941246,
0.04859117418527603,
0.027972396463155746,
0.05608145520091057,
-0.07048185914754868,
0.020205724984407425,
0.15082226693630219,
-0.2659325897693634,
-0.0828971266746521,
0.1274225115776062,
0.0302912387996912,
0.03137395903468132,
-0.03707735985517502,
0.05837301164865494,
0.046250056475400925,
-0.00317451567389071,
0.021852701902389526,
-0.06321361660957336,
-0.08493033051490784,
0.060857679694890976,
-0.09323953092098236,
0.020388426259160042,
0.1627909243106842,
0.06529819965362549,
-0.04081720486283302,
0.004795224871486425,
-0.09987626224756241,
0.04414059966802597,
-0.0826532170176506,
0.06582914292812347,
0.08361051976680756,
0.09288866072893143,
-0.009413482621312141,
-0.10902798175811768,
-0.05542832240462303,
-0.03401229530572891,
-0.10588151216506958,
0.08785504847764969,
0.06458087265491486,
0.11167117208242416,
-0.043628089129924774,
0.05240331590175629,
-0.010693751275539398,
-0.12898440659046173,
-0.06420803815126419,
-0.04655035212635994,
0.11613868176937103,
0.10200187563896179,
-0.05712404102087021,
0.09289619326591492,
0.17620979249477386,
0.1551458090543747,
0.14332237839698792,
0.054423872381448746,
0.0346839502453804,
0.03600362688302994,
-0.07596763223409653,
0.0377955324947834,
-0.1614702343940735,
-0.04612356424331665,
0.1240999847650528,
0.07704918086528778,
0.07245756685733795,
0.0027268817648291588,
-0.08247864246368408,
-0.04385772719979286,
-0.05429850518703461,
0.044845208525657654,
0.0644712746143341,
0.07657679915428162,
0.009307486936450005,
-0.06200199946761131,
-0.02403208240866661,
-0.10114898532629013,
-0.03974781185388565,
-0.029968682676553726,
-0.06992991268634796,
0.18087752163410187,
0.07071412354707718,
-0.0033534334506839514,
-0.0508224181830883,
-0.11790921539068222,
-0.06421162188053131,
-0.034573353826999664,
-0.028310276567935944,
0.023323440924286842,
0.06360360234975815,
-0.07621122896671295,
0.020254528149962425,
-0.14156560599803925,
-0.24762257933616638,
0.05192144587635994,
0.15700073540210724,
-0.04461594298481941,
-0.05454177409410477,
-0.01985033042728901,
-0.0490109920501709,
0.038595810532569885,
-0.04586120694875717,
0.06178727746009827,
-0.08125151693820953,
0.038535140454769135,
-0.01204301044344902,
0.09644926339387894,
-0.12692226469516754,
0.030592836439609528,
-0.07612119615077972,
0.04523112624883652,
-0.013381599448621273,
0.08555766195058823,
-0.06797464936971664,
0.1272498518228531,
-0.10358694940805435,
0.03149735555052757,
-0.08650782704353333,
0.0382230281829834,
0.027969492599368095,
0.13512621819972992,
-0.13190807402133942,
-0.0008346213144250214,
0.14650262892246246,
-0.022312022745609283,
-0.12723267078399658,
0.10731825977563858,
0.01210667472332716,
0.09233709424734116,
0.08553098142147064,
0.11762897670269012,
0.12666098773479462,
-0.08786924183368683,
0.018527578562498093,
0.14338964223861694,
0.027202917262911797,
-0.13659007847309113,
0.09697174280881882,
0.028799405321478844,
-0.12174449861049652,
0.09089435636997223,
-0.06968127191066742,
0.13392768800258636,
0.023112913593649864,
-0.08233696222305298,
-0.12214265763759613,
-0.14276431500911713,
-0.054417822510004044,
-0.010871278122067451,
0.02037777006626129,
0.007083165924996138,
-0.05937210097908974,
-0.01846223697066307,
0.18397153913974762,
-0.06036672368645668,
0.013721849769353867,
-0.06379634886980057,
0.06493761390447617,
-0.11406656354665756,
0.09561660140752792,
-0.05064556375145912,
0.03181275352835655,
-0.03471748158335686,
-0.07377520948648453,
0.09406620264053345,
0.0340166874229908,
0.05456942692399025,
-0.05709104239940643,
-0.023493820801377296,
0.04945254698395729,
0.057131391018629074,
-0.023948917165398598,
-0.042346518486738205,
-0.0855972021818161,
0.03921051695942879,
0.008185424841940403,
0.14349707961082458,
-0.03142232075333595,
0.04390692710876465,
-0.059159040451049805,
0.0596792995929718,
-0.046280790120363235,
0.06398042291402817,
0.04161367565393448,
-0.10595200955867767,
-0.008976473473012447,
-0.08510207384824753,
0.08964025974273682,
0.0665932297706604,
-0.05686551332473755,
0.07459830492734909,
0.013123361393809319,
0.13033278286457062,
0.17943529784679413,
0.036285169422626495,
0.09076111018657684,
0.04874839633703232,
-0.02870420552790165,
-0.004717214498668909,
0.0445471853017807,
0.013161718845367432,
-0.027137093245983124,
-0.005743224639445543,
0.12158358097076416,
-0.119321808218956,
-0.010014506056904793,
0.014944669790565968,
-0.08034398406744003,
0.024269651621580124,
0.034406520426273346,
0.1417669951915741,
-0.06821548193693161,
0.05894356593489647,
0.268087774515152,
-0.09098346531391144,
0.14522889256477356,
-0.07926308363676071,
-0.0749722346663475,
0.00047642705612815917,
0.0008331508724950254,
-0.005333714187145233,
0.026085101068019867,
-0.030870752409100533,
0.039257846772670746,
0.04444614797830582,
-0.0126882903277874,
0.07647285610437393,
-0.13062867522239685,
-0.0149542773142457,
-0.007063054945319891,
-0.08396850526332855,
0.035218898206949234,
0.0519520565867424,
-0.09873949736356735,
0.06609414517879486,
-0.006190081592649221,
-0.03978842869400978,
0.04048341140151024,
0.04304640367627144,
-0.018812870606780052,
0.12233831733465195,
-0.07203501462936401,
-0.17001180350780487,
-0.15161986649036407,
-0.05407172068953514,
-0.13767069578170776,
-0.0015579811297357082,
0.0555097833275795,
-0.07956001162528992,
-0.055538419634103775,
-0.07183632254600525,
-0.01741642877459526,
0.022796358913183212,
0.046472445130348206,
0.09929979592561722,
0.042400576174259186,
0.08538202196359634,
-0.10657398402690887,
-0.009890113957226276,
0.021519135683774948,
-0.05524535849690437,
-0.03470233455300331,
-0.08784568309783936,
0.08409816026687622,
0.1141052097082138,
0.03562119975686073,
-0.012479799799621105,
0.08079785853624344,
0.13662143051624298,
0.025408722460269928,
0.06319673359394073,
0.2551833689212799,
0.08956851810216904,
0.07317598909139633,
0.09346766024827957,
0.010411259718239307,
-0.07281990349292755,
-0.01951935514807701,
0.027732549235224724,
-0.06771281361579895,
-0.16111798584461212,
0.00904193427413702,
-0.09579529613256454,
0.03895577788352966,
0.0762164443731308,
0.06952226161956787,
-0.01693948358297348,
0.17705677449703217,
-0.038480158895254135,
0.1318323016166687,
-0.018343577161431313,
0.04006578400731087,
0.15488289296627045,
0.018979400396347046,
0.08343395590782166,
-0.13689680397510529,
-0.022237492725253105,
0.1507604420185089,
0.10067892074584961,
0.10246524959802628,
0.000795466301497072,
0.02758157253265381,
0.051331911236047745,
0.13290750980377197,
0.007319195661693811,
0.08658777177333832,
-0.03954952955245972,
-0.01912027597427368,
-0.06665466725826263,
-0.05977539345622063,
-0.0722925215959549,
0.051198989152908325,
-0.09057371318340302,
-0.052382633090019226,
0.018252119421958923,
0.09968709945678711,
0.07251934707164764,
0.2007167488336563,
0.041090186685323715,
-0.20909756422042847,
-0.04510590434074402,
0.07441745698451996,
0.0019669760949909687,
-0.02704189158976078,
0.07304631918668747,
-0.012243900448083878,
0.0006601019413210452,
0.061341721564531326,
-0.026524782180786133,
0.12255393713712692,
0.04553455486893654,
0.03722293674945831,
0.015327423810958862,
0.12882333993911743,
0.07400252670049667,
0.11256883293390274,
-0.19533118605613708,
0.010399789549410343,
0.02520376443862915,
0.03926907852292061,
-0.0603972002863884,
0.010153282433748245,
0.1459237039089203,
0.06328174471855164,
0.06874284148216248,
0.0015832656063139439,
0.009094174019992352,
0.048460453748703,
-0.16054394841194153,
0.09879849851131439,
-0.010102638974785805,
-0.009323018603026867,
0.0751541405916214,
-0.09908337146043777,
-0.004952870309352875,
0.01186513714492321,
0.07937553524971008,
-0.07325121015310287,
-0.1414850354194641,
0.0032563060522079468,
0.15257884562015533,
-0.08766854554414749,
-0.05912703275680542,
0.015589362010359764,
-0.06693876534700394,
0.15388567745685577,
0.029941923916339874,
-0.09534032642841339,
-0.08114556223154068,
-0.03348280489444733,
0.13995979726314545,
-0.056365400552749634,
0.029531827196478844,
-0.10575801879167557,
0.016125483438372612,
0.033263858407735825,
-0.22538922727108002,
0.023348810151219368,
-0.10136787593364716,
-0.009333325549960136,
0.02007557637989521,
0.03753584250807762,
-0.12570825219154358,
-0.01065998338162899,
0.0031641116365790367,
-0.051055025309324265,
-0.10801461338996887,
-0.12752386927604675,
-0.07914000004529953,
0.16065901517868042,
-0.08435343950986862,
-0.0038603635039180517,
-0.10673610121011734,
0.036319997161626816,
0.007224203087389469,
-0.026052085682749748,
0.04417450726032257,
0.1789805144071579,
-0.021533261984586716,
0.061016011983156204,
0.1976182907819748,
-0.049689095467329025,
-0.31487563252449036,
-0.15553143620491028,
-0.06241097301244736,
-0.031226010993123055,
-0.08557043969631195,
-0.15159130096435547,
0.17774754762649536,
0.0792228952050209,
-0.044149648398160934,
0.12092886120080948,
-0.28837645053863525,
-0.09392353892326355,
0.10186731070280075,
0.004815184976905584,
0.32859036326408386,
-0.17612439393997192,
-0.047625306993722916,
-0.16293643414974213,
-0.1858121156692505,
0.08228781819343567,
-0.26060205698013306,
0.139048770070076,
-0.05335276573896408,
0.03462276607751846,
-0.014425450004637241,
-0.01644602045416832,
0.15644681453704834,
-0.005532183684408665,
0.04332004487514496,
-0.10040203481912613,
0.12381893396377563,
0.09690074622631073,
-0.08303137868642807,
0.16967757046222687,
-0.23060347139835358,
0.06697313487529755,
-0.10642609745264053,
-0.028055226430296898,
-0.010440313257277012,
-0.014627466909587383,
0.021345563232898712,
-0.02604052424430847,
-0.10527428239583969,
-0.0018903325544670224,
0.07441895455121994,
0.020739920437335968,
0.060726098716259,
0.03673117235302925,
-0.10608664900064468,
0.18198460340499878,
-0.010603967122733593,
-0.14311851561069489,
-0.007510053925216198,
-0.09652954339981079,
-0.037894632667303085,
0.06907264143228531,
-0.2937307059764862,
0.037004731595516205,
0.0657389834523201,
-0.053731709718704224,
0.007877950556576252,
0.02102147601544857,
0.011001739650964737,
-0.017169395461678505,
0.08003605157136917,
-0.09060655534267426,
-0.059742771089076996,
-0.03777078539133072,
0.03852325305342674,
-0.087079256772995,
0.03392545506358147,
0.13463464379310608,
-0.05231134966015816,
0.010960549116134644,
0.006722358986735344,
0.024813201278448105,
-0.07696129381656647,
0.10341905802488327,
0.09348743408918381,
-0.023960737511515617,
-0.10966725647449493,
0.16855040192604065,
-0.025485891848802567,
0.034579548984766006,
0.004123454447835684,
0.052343208342790604,
-0.11313469707965851,
-0.08399937301874161,
0.03804416209459305,
0.030672602355480194,
-0.18314142525196075,
-0.06113402917981148,
-0.08207555115222931,
-0.06498647481203079,
0.05777783319354057,
-0.020750554278492928,
0.06963779777288437,
0.0016248661559075117,
-0.028733963146805763,
-0.024246782064437866,
-0.02864612080156803,
0.036282047629356384,
0.06550420075654984,
0.04327072203159332,
-0.19065316021442413,
-0.056657999753952026,
-0.0029269780497998,
0.06097063049674034,
-0.042191363871097565,
0.04051122069358826,
-0.07665438205003738,
-0.0008479008101858199,
-0.3231341540813446,
0.0464438870549202,
-0.04796146973967552,
0.027328528463840485,
0.019161175936460495,
-0.02105754427611828,
-0.08559324592351913,
0.04404577985405922,
-0.08069920539855957,
-0.04190855100750923,
-0.0450284369289875,
0.0018547290237620473,
-0.07130727171897888,
-0.04113642871379852,
0.0340411476790905,
-0.059688765555620193,
0.010694405063986778,
0.03978952392935753,
-0.0526750385761261,
0.043912746012210846,
-0.08966182172298431,
-0.08975236862897873,
0.016563186421990395,
0.06438489258289337,
-0.03476820886135101,
0.05284014344215393,
0.036138392984867096,
0.05710342153906822,
0.054008834064006805,
-0.039036527276039124,
0.044383592903614044,
-0.07327914983034134,
-0.06901062279939651,
-0.09869320690631866,
0.03614817559719086,
-0.03898946940898895,
-0.046439047902822495,
0.141930490732193,
0.10438518971204758,
0.1575525999069214,
-0.01905851438641548,
-0.056578878313302994,
-0.14384236931800842,
0.018700601533055305,
-0.016367359086871147,
-0.1091863363981247,
-0.015987956896424294,
-0.1054849699139595,
-0.007712510414421558,
-0.035578444600105286,
0.14734071493148804,
0.006420309189707041,
-0.06403931975364685,
-0.026053955778479576,
0.0017946558073163033,
0.0826827883720398,
-0.029220186173915863,
0.3281739056110382,
0.10416611284017563,
0.0621848925948143,
-0.07805075496435165,
-0.009776366874575615,
0.12074686586856842,
0.03878898173570633,
-0.0037567808758467436,
0.11258713155984879,
-0.026879316195845604,
0.1926935613155365,
0.03895435854792595,
0.04808717593550682,
0.026537155732512474,
0.11824846267700195,
-0.060516633093357086,
0.07017520070075989,
-0.04394964873790741,
0.14729352295398712,
0.15759095549583435,
-0.06953871250152588,
-0.01330507267266512,
-0.035782065242528915,
-0.013800183311104774,
-0.1344090700149536,
-0.14256300032138824,
-0.10189935564994812,
-0.17680396139621735,
0.0021604166831821203,
-0.04908357188105583,
0.02496907487511635,
0.12392619252204895,
0.012191254645586014,
0.034053489565849304,
0.07112740725278854,
-0.09429342299699783,
-0.08708025515079498,
0.07796216756105423,
-0.034016359597444534,
-0.1191316694021225,
0.1035296618938446,
-0.055791281163692474,
0.041909631341695786,
-0.015961961820721626,
0.013215788640081882,
0.027171866968274117,
0.06486444175243378,
0.06244969367980957,
-0.07571601122617722,
-0.02987995557487011,
-0.06072581559419632,
0.025414979085326195,
0.04576055333018303,
0.045678310096263885,
0.031392984092235565,
-0.059113357216119766,
0.035818230360746384,
0.13383673131465912,
-0.08080361038446426,
-0.15614338219165802,
-0.11289725452661514,
0.016985859721899033,
-0.08296547830104828,
0.014183171093463898,
-0.043325990438461304,
-0.021501578390598297,
-0.027632003650069237,
0.3743724822998047,
0.1216624528169632,
-0.16806380450725555,
-0.03790953755378723,
-0.00988360308110714,
0.008543808944523335,
-0.03881007432937622,
0.16445250809192657,
0.13111761212348938,
0.054924242198467255,
-0.04899970814585686,
-0.06526365131139755,
-0.022298336029052734,
-0.04231555014848709,
-0.16930890083312988,
0.06032777950167656,
-0.08382598310709,
-0.02644583210349083,
-0.02071652188897133,
-0.008979234844446182,
-0.06683535128831863,
-0.019919173792004585,
0.012720765545964241,
0.031715210527181625,
-0.041400108486413956,
-0.09956756234169006,
-0.000951430934946984,
0.061839453876018524,
-0.0009190546115860343,
-0.10452400147914886,
0.06423261016607285,
0.09963454306125641,
-0.0337580069899559,
-0.1773941069841385,
-0.040155455470085144,
0.08130805939435959,
0.08014440536499023,
0.10163398087024689,
0.0399949848651886,
-0.005868354346603155,
0.081943579018116,
-0.05487925931811333,
-0.153142049908638,
0.0815272331237793,
-0.02016904018819332,
-0.04238777607679367,
0.02218264900147915,
-0.05134609341621399,
-0.06749691069126129,
-0.0457116961479187,
0.043650124222040176,
0.14203357696533203,
-0.0619145967066288,
0.08868588507175446,
-0.0010508334962651134,
-0.07477114349603653,
-0.016454625874757767,
-0.1124788224697113,
0.10202623903751373,
0.07815283536911011,
-0.061750929802656174,
-0.04848892614245415,
-0.09687869995832443,
0.10528398305177689,
0.021606529131531715,
-0.12005073577165604,
0.014386570081114769,
-0.002051852410659194,
-0.06875774264335632,
0.009250439703464508,
0.04736728593707085,
-0.15410366654396057,
-0.030815085396170616,
-0.058930832892656326,
-0.011198592372238636,
-0.059732239693403244,
0.1156710684299469,
0.1706739217042923,
0.041659798473119736,
-0.03175545111298561,
-0.15591831505298615,
-0.041059788316488266,
0.016017407178878784,
-0.034965455532073975,
-0.10154350847005844
] |
null | null | peft |
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
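No snippet was provided above, so here is a minimal loading sketch. It assumes this repository holds a PEFT (LoRA-style) adapter on top of the `whisper-large-v3` base named in the card metadata, and it feeds a silent placeholder clip purely to show the call pattern.

```python
# Hedged example: attach this PEFT adapter to its Whisper base and transcribe.
# The card metadata writes "OpenAI/whisper-large-v3"; the Hub id is lowercase.
import torch
from peft import PeftModel
from transformers import WhisperForConditionalGeneration, WhisperProcessor

BASE_ID = "openai/whisper-large-v3"          # base model from the card metadata
ADAPTER_ID = "Zangs3011/test_hindi_WHISPER"  # this repository

processor = WhisperProcessor.from_pretrained(BASE_ID)
base = WhisperForConditionalGeneration.from_pretrained(BASE_ID)
model = PeftModel.from_pretrained(base, ADAPTER_ID)

# One second of silence at 16 kHz stands in for a real audio clip.
waveform = torch.zeros(16000).numpy()
inputs = processor(waveform, sampling_rate=16000, return_tensors="pt")

with torch.no_grad():
    generated_ids = model.generate(input_features=inputs.input_features)
print(processor.batch_decode(generated_ids, skip_special_tokens=True))
```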
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed]
### Framework versions
- PEFT 0.8.2 | {"library_name": "peft", "base_model": "OpenAI/whisper-large-v3"} | null | Zangs3011/test_hindi_WHISPER | [
"peft",
"safetensors",
"arxiv:1910.09700",
"base_model:OpenAI/whisper-large-v3",
"region:us"
] | 2024-02-13T10:46:11+00:00 | [
"1910.09700"
] | [] | TAGS
#peft #safetensors #arxiv-1910.09700 #base_model-OpenAI/whisper-large-v3 #region-us
|
# Model Card for Model ID
## Model Details
### Model Description
- Developed by:
- Funded by [optional]:
- Shared by [optional]:
- Model type:
- Language(s) (NLP):
- License:
- Finetuned from model [optional]:
### Model Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Downstream Use [optional]
### Out-of-Scope Use
## Bias, Risks, and Limitations
### Recommendations
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
## Training Details
### Training Data
### Training Procedure
#### Preprocessing [optional]
#### Training Hyperparameters
- Training regime:
#### Speeds, Sizes, Times [optional]
## Evaluation
### Testing Data, Factors & Metrics
#### Testing Data
#### Factors
#### Metrics
### Results
#### Summary
## Model Examination [optional]
## Environmental Impact
Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).
- Hardware Type:
- Hours used:
- Cloud Provider:
- Compute Region:
- Carbon Emitted:
## Technical Specifications [optional]
### Model Architecture and Objective
### Compute Infrastructure
#### Hardware
#### Software
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Model Card Authors [optional]
## Model Card Contact
### Framework versions
- PEFT 0.8.2 | [
"# Model Card for Model ID",
"## Model Details",
"### Model Description\n\n\n\n\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:",
"### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Downstream Use [optional]",
"### Out-of-Scope Use",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.",
"## How to Get Started with the Model\n\nUse the code below to get started with the model.",
"## Training Details",
"### Training Data",
"### Training Procedure",
"#### Preprocessing [optional]",
"#### Training Hyperparameters\n\n- Training regime:",
"#### Speeds, Sizes, Times [optional]",
"## Evaluation",
"### Testing Data, Factors & Metrics",
"#### Testing Data",
"#### Factors",
"#### Metrics",
"### Results",
"#### Summary",
"## Model Examination [optional]",
"## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:",
"## Technical Specifications [optional]",
"### Model Architecture and Objective",
"### Compute Infrastructure",
"#### Hardware",
"#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Model Card Authors [optional]",
"## Model Card Contact",
"### Framework versions\n\n- PEFT 0.8.2"
] | [
"TAGS\n#peft #safetensors #arxiv-1910.09700 #base_model-OpenAI/whisper-large-v3 #region-us \n",
"# Model Card for Model ID",
"## Model Details",
"### Model Description\n\n\n\n\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:",
"### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Downstream Use [optional]",
"### Out-of-Scope Use",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.",
"## How to Get Started with the Model\n\nUse the code below to get started with the model.",
"## Training Details",
"### Training Data",
"### Training Procedure",
"#### Preprocessing [optional]",
"#### Training Hyperparameters\n\n- Training regime:",
"#### Speeds, Sizes, Times [optional]",
"## Evaluation",
"### Testing Data, Factors & Metrics",
"#### Testing Data",
"#### Factors",
"#### Metrics",
"### Results",
"#### Summary",
"## Model Examination [optional]",
"## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:",
"## Technical Specifications [optional]",
"### Model Architecture and Objective",
"### Compute Infrastructure",
"#### Hardware",
"#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Model Card Authors [optional]",
"## Model Card Contact",
"### Framework versions\n\n- PEFT 0.8.2"
] | [
40,
6,
3,
54,
28,
3,
4,
9,
9,
10,
42,
20,
3,
4,
5,
9,
11,
13,
3,
12,
5,
4,
5,
3,
4,
9,
53,
9,
8,
6,
3,
14,
8,
7,
9,
4,
11
] | [
"passage: TAGS\n#peft #safetensors #arxiv-1910.09700 #base_model-OpenAI/whisper-large-v3 #region-us \n# Model Card for Model ID## Model Details### Model Description\n\n\n\n\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Downstream Use [optional]### Out-of-Scope Use## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.## How to Get Started with the Model\n\nUse the code below to get started with the model.## Training Details### Training Data### Training Procedure#### Preprocessing [optional]#### Training Hyperparameters\n\n- Training regime:#### Speeds, Sizes, Times [optional]## Evaluation### Testing Data, Factors & Metrics#### Testing Data#### Factors#### Metrics### Results#### Summary## Model Examination [optional]## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:## Technical Specifications [optional]### Model Architecture and Objective### Compute Infrastructure#### Hardware#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Model Card Authors [optional]## Model Card Contact### Framework versions\n\n- PEFT 0.8.2"
] | [
-0.12340369075536728,
0.20964087545871735,
-0.0026578435208648443,
0.03044895827770233,
0.08163684606552124,
0.01907583698630333,
0.04888064041733742,
0.12979428470134735,
0.010612472891807556,
0.10601453483104706,
0.06981992721557617,
0.11075510829687119,
0.11342204362154007,
0.21643367409706116,
0.0024683501105755568,
-0.17324063181877136,
0.030185295268893242,
-0.08837804198265076,
0.005328307393938303,
0.12581805884838104,
0.142048642039299,
-0.10108163207769394,
0.08259914070367813,
-0.01254014577716589,
0.002698361175134778,
-0.0402686782181263,
-0.07093684375286102,
-0.019336525350809097,
0.04342759773135185,
0.03378082811832428,
0.05550599470734596,
-0.009715796448290348,
0.09081826359033585,
-0.2598173916339874,
0.018749605864286423,
0.042297497391700745,
0.005100619979202747,
0.08716151863336563,
0.09818874299526215,
-0.03535139560699463,
0.12088041007518768,
-0.02692229300737381,
0.14206331968307495,
0.09203855693340302,
-0.08486077934503555,
-0.22848229110240936,
-0.06369096040725708,
0.07781405001878738,
0.18861441314220428,
0.07991877943277359,
-0.041599173098802567,
0.13531897962093353,
-0.07079365104436874,
0.025429582223296165,
0.03626258671283722,
-0.08759641647338867,
-0.06764029711484909,
0.060845717787742615,
0.12627418339252472,
0.05897857993841171,
-0.12499645352363586,
-0.037143170833587646,
0.029377656057476997,
0.03831232339143753,
0.06469468027353287,
0.009803327731788158,
0.1621878296136856,
0.025757724419236183,
-0.14315831661224365,
-0.04962573200464249,
0.15380430221557617,
0.01880733110010624,
-0.04604065790772438,
-0.23212698101997375,
-0.006269009783864021,
-0.08894474059343338,
-0.024280628189444542,
-0.0511368066072464,
0.03402375429868698,
0.010862383991479874,
0.1228427141904831,
-0.04055675119161606,
-0.09451177716255188,
-0.021922077983617783,
0.09378229081630707,
0.053649891167879105,
0.024153877049684525,
-0.01831495203077793,
0.012376267462968826,
0.12455810606479645,
0.08614643663167953,
-0.13257035613059998,
-0.06720525026321411,
-0.07982297241687775,
-0.04671717435121536,
-0.03795701265335083,
0.042284030467271805,
0.03368604928255081,
0.06673722714185715,
0.26350516080856323,
-0.02117912285029888,
0.061064787209033966,
0.06637135148048401,
0.01627514138817787,
0.051714204251766205,
0.1049022525548935,
-0.04041266441345215,
-0.15910930931568146,
-0.011238064616918564,
0.09697338193655014,
-0.0036076135002076626,
-0.03122262842953205,
-0.04591301083564758,
0.03514810651540756,
0.03637223690748215,
0.11276920884847641,
0.11469072103500366,
-0.014249779284000397,
-0.07705514878034592,
-0.06470406800508499,
0.21165713667869568,
-0.15891747176647186,
0.043524302542209625,
0.024469448253512383,
-0.007081115152686834,
-0.046795252710580826,
0.006809866521507502,
0.01865917071700096,
-0.028482668101787567,
0.06792803853750229,
-0.06496340036392212,
-0.04464559257030487,
-0.1259719282388687,
-0.02408834733068943,
0.030273474752902985,
0.008322354406118393,
-0.038063764572143555,
-0.04051791876554489,
-0.08171916753053665,
-0.10462643951177597,
0.10970591008663177,
-0.05588056892156601,
-0.060106728225946426,
-0.028774823993444443,
-0.0912916511297226,
0.02339482493698597,
0.027628915384411812,
0.07492497563362122,
-0.027341941371560097,
0.04593275114893913,
-0.012628565542399883,
0.06189972162246704,
0.07852211594581604,
0.027782730758190155,
-0.07917096465826035,
0.06277500092983246,
-0.19120949506759644,
0.08374214172363281,
-0.08263211697340012,
0.03638710454106331,
-0.16125202178955078,
-0.008930309675633907,
0.017782840877771378,
0.024895479902625084,
0.03391677513718605,
0.1671580672264099,
-0.220138281583786,
-0.022471725940704346,
0.15741965174674988,
-0.10796599835157394,
-0.1293429583311081,
0.04211048409342766,
-0.039975300431251526,
0.1727498471736908,
0.025898970663547516,
0.0019739652052521706,
0.09815234690904617,
-0.16362367570400238,
-0.028782222419977188,
-0.017621850594878197,
-0.0027132676914334297,
0.08475355803966522,
0.09050797671079636,
-0.08843125402927399,
0.012817040085792542,
0.014273159205913544,
-0.058927930891513824,
-0.01774461753666401,
-0.040333349257707596,
-0.10742637515068054,
0.0066123888827860355,
-0.08774691820144653,
0.021084312349557877,
-0.0020215981639921665,
-0.09489985555410385,
-0.0066161020658910275,
-0.15586090087890625,
-0.059811923652887344,
0.09076111018657684,
0.002079015364870429,
-0.0253857783973217,
-0.11046980321407318,
0.053198136389255524,
-0.03598884493112564,
-0.02260250598192215,
-0.13851308822631836,
-0.022035274654626846,
0.021333320066332817,
-0.1429627388715744,
-0.012099583633244038,
-0.11739864945411682,
0.0671306625008583,
0.005751980468630791,
-0.048236116766929626,
-0.046174246817827225,
-0.0006844713934697211,
0.002246416872367263,
-0.0535334013402462,
-0.23827359080314636,
-0.030626095831394196,
-0.05225076898932457,
0.15242548286914825,
-0.2226295918226242,
0.04289212077856064,
0.031110990792512894,
0.12129644304513931,
0.0037710382603108883,
-0.06713026016950607,
0.024128712713718414,
-0.07430514693260193,
-0.025680311024188995,
-0.07523169368505478,
-0.004475540481507778,
0.0027227161917835474,
-0.034328605979681015,
0.017627699300646782,
-0.11272820085287094,
-0.044313862919807434,
0.09914200007915497,
0.061981093138456345,
-0.15063415467739105,
0.0033914584200829268,
-0.04532816633582115,
-0.060163937509059906,
-0.07715499401092529,
-0.06851803511381149,
0.090985506772995,
0.055745553225278854,
0.03713199496269226,
-0.07579799741506577,
-0.07397736608982086,
0.007373083848506212,
-0.026012564077973366,
-0.011032276786863804,
0.1144980862736702,
0.07231363654136658,
-0.10205700248479843,
0.08975068479776382,
0.06989108771085739,
0.028087036684155464,
0.0811895951628685,
-0.029800988733768463,
-0.10426413267850876,
-0.030009763315320015,
0.05446743965148926,
0.009936902672052383,
0.1712377965450287,
-0.07328074425458908,
0.05643147975206375,
0.04873957112431526,
-0.042888667434453964,
0.04834524542093277,
-0.0857018530368805,
0.009453876875340939,
0.0063486360013484955,
-0.012710127979516983,
0.03191345930099487,
-0.01962081529200077,
0.006640988402068615,
0.07827771455049515,
0.052305225282907486,
0.02979818917810917,
0.023699773475527763,
-0.03694971650838852,
-0.14136254787445068,
0.18074744939804077,
-0.09308471530675888,
-0.23616740107536316,
-0.15770475566387177,
0.05952177196741104,
0.05017866566777229,
-0.013935580849647522,
0.021616585552692413,
-0.05329173430800438,
-0.10632162541151047,
-0.08654996752738953,
-0.001833495101891458,
0.034904845058918,
-0.055254336446523666,
-0.06641294062137604,
0.04610835388302803,
0.044369783252477646,
-0.11980858445167542,
0.029215479269623756,
0.06733658164739609,
-0.020085258409380913,
-0.0019305445021018386,
0.05806471034884453,
0.09222719818353653,
0.1829715371131897,
-0.0031983337830752134,
0.0014337238389998674,
0.06407558172941208,
0.27955642342567444,
-0.1588885337114334,
0.11794600635766983,
0.14119388163089752,
-0.07272349298000336,
0.07291366159915924,
0.18568629026412964,
0.03047158382833004,
-0.09960591048002243,
0.024044642224907875,
0.023013245314359665,
-0.020331790670752525,
-0.2690548300743103,
-0.05665416643023491,
-0.01695535145699978,
-0.07930387556552887,
0.0754946917295456,
0.08900842070579529,
0.08235830813646317,
0.039939459413290024,
-0.0649050921201706,
-0.09896484017372131,
0.027867017313838005,
0.10555469244718552,
-0.019908159971237183,
0.0038992143236100674,
0.08174064755439758,
-0.04292904958128929,
0.010355629958212376,
0.09265141934156418,
-0.017448123544454575,
0.14591778814792633,
0.05296066030859947,
0.10163239389657974,
0.08191157132387161,
0.09210129827260971,
-0.007862620986998081,
0.03698503226041794,
0.012605482712388039,
0.025873003527522087,
0.021178914234042168,
-0.08644749224185944,
0.013447163626551628,
0.11301591992378235,
0.03236837685108185,
0.02540435828268528,
0.01950814574956894,
-0.04215846210718155,
0.043833620846271515,
0.19096525013446808,
0.021270236000418663,
-0.20794640481472015,
-0.08499789983034134,
0.055579449981451035,
-0.08244962245225906,
-0.15416845679283142,
-0.009730982594192028,
0.033080581575632095,
-0.16772052645683289,
0.01807761937379837,
-0.03751436248421669,
0.10221445560455322,
-0.09416019916534424,
-0.042161013931035995,
0.11163312941789627,
0.05347413942217827,
-0.018094949424266815,
0.04876859113574028,
-0.18273460865020752,
0.11063472181558609,
0.02883661910891533,
0.07801008969545364,
-0.09032741189002991,
0.10213429480791092,
0.001036488451063633,
-0.01750524714589119,
0.1702975034713745,
0.0031702513806521893,
-0.04621752351522446,
-0.07991531491279602,
-0.10109105706214905,
-0.003381002228707075,
0.0811089351773262,
-0.13451072573661804,
0.07527469843626022,
-0.03093983232975006,
-0.026342228055000305,
-0.009458125568926334,
-0.0833074226975441,
-0.12952300906181335,
-0.16255566477775574,
0.05509736388921738,
-0.09603489935398102,
0.02514966018497944,
-0.08708824217319489,
-0.05237684026360512,
0.006943344138562679,
0.18227766454219818,
-0.22381970286369324,
-0.11102590709924698,
-0.14642344415187836,
-0.11356878280639648,
0.1620609313249588,
-0.039878129959106445,
0.08392214775085449,
0.00023800409690011293,
0.16139452159404755,
0.00940009020268917,
-0.012321837246418,
0.09214653074741364,
-0.09595609456300735,
-0.19262920320034027,
-0.052757155150175095,
0.16119500994682312,
0.14368441700935364,
0.03246816620230675,
-0.009841397404670715,
0.028064319863915443,
-0.06191481649875641,
-0.12016825377941132,
0.0267228651791811,
0.1690332144498825,
0.06358195096254349,
-0.01848018169403076,
-0.02048684097826481,
-0.10633832961320877,
-0.05620071291923523,
-0.041764285415410995,
-0.008829173631966114,
0.19599057734012604,
-0.06942033022642136,
0.15474875271320343,
0.11034954339265823,
-0.056149017065763474,
-0.2096295803785324,
0.03413212671875954,
0.046707626432180405,
0.021881205961108208,
0.037523236125707626,
-0.18897855281829834,
0.09700210392475128,
-0.01322464644908905,
-0.08250337839126587,
0.1695414036512375,
-0.1665688455104828,
-0.13525813817977905,
0.10710223019123077,
0.023802824318408966,
-0.2174971103668213,
-0.1348956823348999,
-0.1002177968621254,
-0.01801360957324505,
-0.13058027625083923,
0.044464580714702606,
0.007439748849719763,
0.004973176401108503,
0.020936954766511917,
0.008866003714501858,
0.034978363662958145,
-0.05308733507990837,
0.21094049513339996,
-0.03452594578266144,
0.0011726415250450373,
-0.05076753720641136,
-0.07901736348867416,
0.020641634240746498,
-0.050552260130643845,
0.11591015011072159,
-0.00583112146705389,
0.032958485186100006,
-0.1654132753610611,
-0.04233045130968094,
-0.050835881382226944,
0.034495942294597626,
-0.08993896842002869,
-0.08591505140066147,
-0.04130110517144203,
0.09280412644147873,
0.09798816591501236,
-0.02016557939350605,
-0.00453953817486763,
-0.09094485640525818,
0.06225668266415596,
0.2034822702407837,
0.2037808895111084,
0.06754173338413239,
-0.05154796317219734,
0.02137952484190464,
-0.034221358597278595,
0.04629328101873398,
-0.21338677406311035,
0.041778940707445145,
0.05970757082104683,
0.020107584074139595,
0.06708712130784988,
-0.011052841320633888,
-0.1543273627758026,
-0.07565934211015701,
0.0821060910820961,
-0.05891808494925499,
-0.1684045046567917,
-0.030024414882063866,
0.027366751804947853,
-0.2109183371067047,
-0.040299706161022186,
0.02870342880487442,
-0.01541885919868946,
-0.03692619875073433,
0.02013399824500084,
0.0817236676812172,
-0.021489929407835007,
0.09820283204317093,
0.08419951051473618,
0.09347046911716461,
-0.10305967926979065,
0.059481628239154816,
0.07163893431425095,
-0.039197374135255814,
0.03268672153353691,
0.11545604467391968,
-0.0456005297601223,
-0.03836939483880997,
0.07415249198675156,
0.10533022880554199,
0.016244016587734222,
-0.058830756694078445,
0.01166368369013071,
-0.04652655869722366,
0.05792858824133873,
0.09649691730737686,
0.032341089099645615,
0.006534072570502758,
0.06808165460824966,
0.03245725482702255,
-0.08801683783531189,
0.11744289100170135,
0.059884656220674515,
0.017807461321353912,
-0.0567597895860672,
-0.041458725929260254,
-0.01602754555642605,
-0.01405380666255951,
-0.019213512539863586,
-0.004647648893296719,
-0.08068239688873291,
-0.003947157878428698,
-0.10798341780900955,
0.021521763876080513,
-0.07835865765810013,
0.007724241353571415,
0.03317829221487045,
-0.048350170254707336,
0.0009064867626875639,
0.0006548394449055195,
-0.0729377493262291,
-0.05461333692073822,
-0.013054056093096733,
0.07969266176223755,
-0.1320202350616455,
0.04047635942697525,
0.07666869461536407,
-0.10980678349733353,
0.07134688645601273,
-0.008441990241408348,
0.006202814169228077,
-0.004293695092201233,
-0.147475928068161,
0.05525415018200874,
-0.027065670117735863,
-0.005864033475518227,
0.012034066021442413,
-0.19565971195697784,
-0.00857511255890131,
-0.03384929150342941,
-0.06740362197160721,
0.012412081472575665,
-0.008618423715233803,
-0.11950330436229706,
0.11007244139909744,
0.007207435090094805,
-0.05928776413202286,
-0.02486886829137802,
0.03681553527712822,
0.09573263674974442,
-0.011237755417823792,
0.12993858754634857,
-0.023663898929953575,
0.07804971188306808,
-0.17476284503936768,
-0.006192387081682682,
-0.012027435936033726,
0.054788436740636826,
-0.021364476531744003,
-0.030566714704036713,
0.06117399036884308,
-0.023771697655320168,
0.17961733043193817,
-0.010010925121605396,
0.06644268333911896,
0.05206846073269844,
0.012224987149238586,
0.030398083850741386,
0.07786355912685394,
0.06601586192846298,
-0.00868293922394514,
-0.0003457192797213793,
0.03926176205277443,
-0.006206597667187452,
-0.05116709694266319,
-0.1610683649778366,
0.05675124749541283,
0.16119857132434845,
0.056857071816921234,
0.027272630482912064,
0.015266243368387222,
-0.11608510464429855,
-0.0809825137257576,
0.1111050695180893,
-0.021610122174024582,
-0.031650807708501816,
-0.06717116385698318,
0.19355686008930206,
0.1364298015832901,
-0.19685111939907074,
0.06960256397724152,
-0.05161454528570175,
-0.043792497366666794,
-0.14364124834537506,
-0.1792483925819397,
-0.05846137925982475,
-0.04762584716081619,
-0.026486804708838463,
-0.05774926394224167,
0.04696919396519661,
0.033822376281023026,
0.00001445735961169703,
-0.022183384746313095,
0.10496337711811066,
0.016071205958724022,
-0.024771302938461304,
0.04845986142754555,
0.05849583074450493,
0.03573279455304146,
-0.09052862972021103,
0.006818205118179321,
-0.0008573383674956858,
0.021284930408000946,
0.06992872804403305,
0.020423833280801773,
-0.06802229583263397,
0.025635521858930588,
-0.01974688097834587,
-0.12234944105148315,
0.039704106748104095,
-0.014198784716427326,
-0.037286657840013504,
0.14742411673069,
0.039839353412389755,
0.010245408862829208,
-0.018048757687211037,
0.22621934115886688,
-0.08091013133525848,
-0.07661815732717514,
-0.14901289343833923,
0.05523066967725754,
-0.07483982294797897,
0.025769196450710297,
0.03128216043114662,
-0.12109179049730301,
0.011713380925357342,
0.17028267681598663,
0.12062518298625946,
-0.012868905439972878,
0.009202858433127403,
0.048415280878543854,
0.0043489825911819935,
-0.04372500255703926,
0.018367929384112358,
0.046152498573064804,
0.18851816654205322,
-0.07369488477706909,
0.05925231799483299,
-0.014324579387903214,
-0.08345195651054382,
-0.0159384086728096,
0.09323784708976746,
-0.0076723950915038586,
-0.000711631088051945,
-0.06510841846466064,
0.14650718867778778,
-0.08500031381845474,
-0.2091773897409439,
0.05968644097447395,
-0.056927572935819626,
-0.13799096643924713,
-0.0406353659927845,
0.033206623047590256,
-0.019880110397934914,
0.0038524398114532232,
0.06900432705879211,
-0.0455811470746994,
0.18641138076782227,
0.03163398802280426,
-0.048094987869262695,
-0.07966161519289017,
0.05750548467040062,
-0.15205155313014984,
0.28368082642555237,
0.02316247671842575,
0.05691858008503914,
0.10900773108005524,
-0.01977483555674553,
-0.15179020166397095,
0.007057364098727703,
0.10935498774051666,
-0.07067730277776718,
0.06441797316074371,
0.16795063018798828,
0.007539650425314903,
0.12626633048057556,
0.06266562640666962,
-0.04947442188858986,
0.03356697037816048,
-0.08441046625375748,
-0.044508013874292374,
-0.1212749034166336,
0.08166209608316422,
-0.08948409557342529,
0.15890848636627197,
0.11500747501850128,
-0.07195353507995605,
0.0011174705578014255,
-0.018616685643792152,
0.08630277216434479,
0.011609945446252823,
0.1061117872595787,
0.011260389350354671,
-0.1913442760705948,
0.03699443116784096,
0.006071514915674925,
0.1023879125714302,
-0.18385235965251923,
-0.0523555763065815,
0.04078040271997452,
-0.020401155576109886,
-0.07646653056144714,
0.1147783026099205,
0.034708280116319656,
0.028476988896727562,
-0.037368666380643845,
-0.02794753387570381,
0.010647199116647243,
0.1449728161096573,
-0.112214095890522,
-0.015044525265693665
] |
null | null | diffusers |
# Textual inversion text2image fine-tuning - seb1234/textual_inversion_doll
These are textual inversion adaptation weights for runwayml/stable-diffusion-v1-5. You can find some example images below.
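A minimal loading sketch (not part of the auto-generated card): it assumes the standard diffusers textual-inversion workflow, and the placeholder token `<doll>` below is a guess; check the repository's learned-embeds file (or `pipe.tokenizer.get_added_vocab()` after loading) for the actual token name.

```python
# Minimal sketch, assuming the standard diffusers textual-inversion workflow.
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
).to("cuda")

# Register the learned embedding and its placeholder token from this repo.
pipe.load_textual_inversion("seb1234/textual_inversion_doll")

# "<doll>" is an assumed token name; replace it with the token this repo defines.
image = pipe("a photo of <doll> sitting on a shelf").images[0]
image.save("doll.png")
```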
| {"license": "creativeml-openrail-m", "tags": ["stable-diffusion", "stable-diffusion-diffusers", "text-to-image", "diffusers", "textual_inversion"], "base_model": "runwayml/stable-diffusion-v1-5", "inference": true} | text-to-image | seb1234/textual_inversion_doll | [
"diffusers",
"tensorboard",
"safetensors",
"stable-diffusion",
"stable-diffusion-diffusers",
"text-to-image",
"textual_inversion",
"base_model:runwayml/stable-diffusion-v1-5",
"license:creativeml-openrail-m",
"endpoints_compatible",
"diffusers:StableDiffusionPipeline",
"region:us"
] | 2024-02-13T10:46:58+00:00 | [] | [] | TAGS
#diffusers #tensorboard #safetensors #stable-diffusion #stable-diffusion-diffusers #text-to-image #textual_inversion #base_model-runwayml/stable-diffusion-v1-5 #license-creativeml-openrail-m #endpoints_compatible #diffusers-StableDiffusionPipeline #region-us
|
# Textual inversion text2image fine-tuning - seb1234/textual_inversion_doll
These are textual inversion adaptation weights for runwayml/stable-diffusion-v1-5. You can find some example images below.
| [
"# Textual inversion text2image fine-tuning - seb1234/textual_inversion_doll\nThese are textual inversion adaption weights for runwayml/stable-diffusion-v1-5. You can find some example images in the following."
] | [
"TAGS\n#diffusers #tensorboard #safetensors #stable-diffusion #stable-diffusion-diffusers #text-to-image #textual_inversion #base_model-runwayml/stable-diffusion-v1-5 #license-creativeml-openrail-m #endpoints_compatible #diffusers-StableDiffusionPipeline #region-us \n",
"# Textual inversion text2image fine-tuning - seb1234/textual_inversion_doll\nThese are textual inversion adaption weights for runwayml/stable-diffusion-v1-5. You can find some example images in the following."
] | [
101,
61
] | [
"passage: TAGS\n#diffusers #tensorboard #safetensors #stable-diffusion #stable-diffusion-diffusers #text-to-image #textual_inversion #base_model-runwayml/stable-diffusion-v1-5 #license-creativeml-openrail-m #endpoints_compatible #diffusers-StableDiffusionPipeline #region-us \n# Textual inversion text2image fine-tuning - seb1234/textual_inversion_doll\nThese are textual inversion adaption weights for runwayml/stable-diffusion-v1-5. You can find some example images in the following."
] | [
-0.1105625182390213,
-0.08139190077781677,
-0.003832699963822961,
0.018095942214131355,
0.07705997675657272,
-0.007484322413802147,
0.16517752408981323,
0.065194271504879,
-0.07725117355585098,
0.05844689533114433,
0.06593766063451767,
0.05064374953508377,
-0.010134793817996979,
0.11762778460979462,
-0.03096167929470539,
-0.18592698872089386,
-0.006523416843265295,
-0.0033865850418806076,
-0.13972525298595428,
0.06445658951997757,
0.08494772762060165,
-0.042532242834568024,
0.07584961503744125,
0.02589576505124569,
-0.08026403933763504,
0.07107286900281906,
0.045551855117082596,
-0.05103436857461929,
0.09634984284639359,
0.056663066148757935,
0.06542772054672241,
0.11787673085927963,
0.07352378219366074,
-0.13324815034866333,
0.035872336477041245,
0.050058599561452866,
-0.03708973526954651,
0.055462032556533813,
-0.018183957785367966,
-0.05865299329161644,
0.13728387653827667,
-0.046791259199380875,
0.035299643874168396,
0.04762135073542595,
-0.03449219837784767,
-0.05948338285088539,
-0.027003949508070946,
0.0371207557618618,
0.10743775218725204,
0.0027832789346575737,
0.034201327711343765,
0.025361115112900734,
-0.04068269953131676,
0.06725965440273285,
0.26345449686050415,
-0.23999306559562683,
-0.009321068413555622,
0.08953487128019333,
0.050218284130096436,
0.05602579936385155,
-0.05576081573963165,
0.09167949855327606,
0.03390895202755928,
-0.0321432389318943,
0.11535771191120148,
-0.04770154133439064,
0.10019984096288681,
-0.01920117624104023,
-0.111188143491745,
0.05089537054300308,
0.1347023844718933,
-0.010568288154900074,
-0.03811126574873924,
-0.21800807118415833,
-0.1032307893037796,
0.07435514032840729,
-0.04381277412176132,
-0.029664574190974236,
-0.014198279939591885,
0.007621433585882187,
0.016846653074026108,
-0.07822442054748535,
-0.1181945651769638,
-0.07856473326683044,
-0.04290328919887543,
0.10855003446340561,
-0.01711101457476616,
0.0095799770206213,
-0.030548296868801117,
0.1188095360994339,
-0.08753743022680283,
-0.131512850522995,
0.07484372705221176,
-0.07058946043252945,
0.005835663061589003,
0.06788493692874908,
-0.03125747665762901,
-0.287682443857193,
0.03623579815030098,
0.10378559678792953,
0.10410599410533905,
-0.02222018875181675,
-0.015346108004450798,
0.09118803590536118,
-0.07006029039621353,
-0.037901852279901505,
-0.06717608124017715,
-0.04249264672398567,
0.030176077038049698,
0.057466525584459305,
0.10383281856775284,
-0.02462419494986534,
-0.08616163581609726,
-0.06611976027488708,
-0.05227898806333542,
0.040412239730358124,
-0.0396462120115757,
0.05553745850920677,
-0.06888603419065475,
-0.008869820274412632,
0.1918538510799408,
-0.0731099545955658,
0.01276648323982954,
-0.004680714104324579,
0.031096775084733963,
0.07656017690896988,
0.1863894909620285,
0.02207869291305542,
0.02256491780281067,
0.09918766468763351,
-0.06103713437914848,
0.014252369292080402,
-0.0013788024662062526,
-0.08695954829454422,
-0.030343642458319664,
-0.1738835871219635,
-0.02590572088956833,
-0.1323966234922409,
-0.15409894287586212,
0.046768203377723694,
0.02573743276298046,
0.0044554839842021465,
0.07113028317689896,
0.014030292630195618,
-0.021270746365189552,
0.03997382894158363,
0.006922969128936529,
-0.03659975528717041,
-0.021610429510474205,
0.029813729226589203,
0.01090176310390234,
0.12048858404159546,
-0.005179940722882748,
-0.018929604440927505,
-0.06063108146190643,
0.009384693577885628,
-0.2382832020521164,
0.021327629685401917,
-0.10213635861873627,
0.03191370889544487,
-0.06677009165287018,
-0.01841687597334385,
-0.07384298741817474,
0.05379621684551239,
0.040862295776605606,
0.18826091289520264,
-0.2381296008825302,
-0.064338818192482,
0.1431168168783188,
-0.18630702793598175,
-0.03551783785223961,
0.05968201160430908,
0.012395430356264114,
0.043724969029426575,
0.023847080767154694,
0.09020231664180756,
0.08391552418470383,
-0.24474918842315674,
0.10998660326004028,
0.012062367051839828,
-0.07337573170661926,
-0.031132927164435387,
0.017139242962002754,
-0.00977419689297676,
0.03883000835776329,
0.01881464198231697,
-0.0753239169716835,
0.10935249924659729,
-0.0009404963930137455,
-0.006867815274745226,
-0.06629490107297897,
-0.03766186162829399,
0.14299538731575012,
0.041498612612485886,
0.04401611164212227,
-0.031883690506219864,
-0.06592817604541779,
0.057755544781684875,
0.03012399561703205,
-0.07395994663238525,
0.049660976976156235,
-0.019151411950588226,
0.08196558058261871,
-0.1044655442237854,
-0.03721078485250473,
-0.12889912724494934,
-0.01554530393332243,
-0.04007405787706375,
0.1650703102350235,
-0.02338714338839054,
0.0993456095457077,
0.12420830130577087,
0.04392142966389656,
-0.015935556963086128,
-0.01753288507461548,
0.08515065163373947,
0.04505178704857826,
-0.04366860166192055,
-0.1641266644001007,
0.032681904733181,
-0.09874265640974045,
-0.03418855369091034,
-0.19689907133579254,
0.0935809537768364,
0.04136628285050392,
0.21418173611164093,
0.13551442325115204,
-0.025311877951025963,
0.06118614599108696,
-0.033890776336193085,
-0.05081974342465401,
-0.03574851155281067,
0.028010765090584755,
-0.005633214488625526,
-0.037608541548252106,
0.17838484048843384,
-0.12775686383247375,
0.2644275426864624,
0.10150425881147385,
0.002017974853515625,
-0.07899916917085648,
-0.10715141892433167,
-0.008812851272523403,
0.006523120682686567,
-0.05768132954835892,
-0.036799535155296326,
-0.0008677270961925387,
0.016728535294532776,
0.15793973207473755,
-0.03708657994866371,
-0.0031384918838739395,
0.04750031605362892,
-0.04301254823803902,
-0.028854109346866608,
0.06415215879678726,
0.06747382879257202,
-0.017339633777737617,
0.03133659064769745,
0.14851024746894836,
-0.041057128459215164,
0.09514593333005905,
-0.03462621569633484,
-0.08823419362306595,
0.005901474040001631,
0.03468007594347,
0.04344804584980011,
0.13026262819766998,
-0.03910607472062111,
-0.044605378061532974,
0.0241912379860878,
-0.0387314036488533,
0.03455847129225731,
-0.1155817061662674,
-0.016740502789616585,
0.09432786703109741,
-0.016878308728337288,
0.1426745504140854,
0.08057164400815964,
-0.061868030577898026,
0.08982236683368683,
-0.14995531737804413,
-0.05110165476799011,
-0.02557252161204815,
-0.012231937609612942,
-0.11605468392372131,
0.12349404394626617,
-0.08217763155698776,
-0.1225447952747345,
-0.16086947917938232,
0.006612681318074465,
-0.03123900666832924,
0.0159732885658741,
0.04991215839982033,
-0.027115074917674065,
-0.08801555633544922,
-0.12147578597068787,
0.08080915361642838,
0.09401525557041168,
0.04615122824907303,
-0.008411531336605549,
-0.03105243481695652,
0.036031730473041534,
-0.12839189171791077,
0.020676057785749435,
-0.02928425930440426,
0.06383328139781952,
0.030377034097909927,
0.02429363876581192,
0.12502126395702362,
0.08068103343248367,
-0.016805490478873253,
-0.014351283200085163,
0.009444464929401875,
0.10491138696670532,
0.003027253085747361,
0.1122211217880249,
0.19655418395996094,
-0.025920625776052475,
0.05179453641176224,
0.08948787301778793,
0.0664164125919342,
-0.012900495901703835,
0.022048432379961014,
-0.017458969727158546,
-0.08244021236896515,
-0.06421131640672684,
-0.08515875786542892,
-0.06030260771512985,
-0.03071058727800846,
0.04670029506087303,
0.028476454317569733,
0.07556372135877609,
0.09542592614889145,
0.02847684919834137,
0.013371380046010017,
0.08152501285076141,
0.070225290954113,
0.1276533454656601,
-0.022408025339245796,
0.07837895303964615,
-0.06986147165298462,
-0.06853065639734268,
0.08135932683944702,
-0.032377708703279495,
0.14271830022335052,
-0.07318499684333801,
0.05635034665465355,
0.06703130900859833,
0.010783021338284016,
0.10133683681488037,
0.15068025887012482,
-0.07821988314390182,
-0.06386000663042068,
-0.004134092479944229,
-0.091330386698246,
0.032022979110479355,
0.05889463424682617,
-0.0554102398455143,
0.00022774342505726963,
-0.049753446131944656,
0.04786461964249611,
0.013215719722211361,
-0.025143226608633995,
0.06300865858793259,
-0.20824113488197327,
0.022570554167032242,
0.015967905521392822,
0.010351891629397869,
-0.005480518564581871,
0.04128777235746384,
0.2021525502204895,
-0.00915085431188345,
0.06343974173069,
-0.0835670456290245,
0.05260590463876724,
0.024473242461681366,
-0.008193637244403362,
-0.07271844148635864,
0.06412786990404129,
-0.0302518829703331,
-0.007453261874616146,
-0.15961575508117676,
0.09299234300851822,
-0.0039393543265759945,
0.004955723881721497,
-0.01359673123806715,
0.03920866921544075,
0.04842975363135338,
0.18910521268844604,
0.15083642303943634,
-0.03849540278315544,
0.006000860128551722,
0.009916136972606182,
-0.08962981402873993,
-0.03130057454109192,
0.04037290811538696,
-0.03287709876894951,
0.023899605497717857,
0.021469559520483017,
-0.055105049163103104,
0.05510300397872925,
0.038911234587430954,
-0.2417677640914917,
-0.187719464302063,
0.006601255852729082,
0.009426102042198181,
-0.1512906551361084,
-0.10714253038167953,
-0.08269979059696198,
-0.06104743853211403,
0.2075617015361786,
-0.10196348279714584,
-0.06181705370545387,
-0.13743634521961212,
-0.0719861164689064,
0.05390705540776253,
-0.027413446456193924,
0.07391813397407532,
0.013171138241887093,
0.07767603546380997,
-0.11182785034179688,
-0.15038837492465973,
0.12730395793914795,
-0.0873267650604248,
-0.10751927644014359,
-0.11803410202264786,
0.10091440379619598,
-0.039085205644369125,
-0.03448081761598587,
-0.00367204868234694,
0.006297852843999863,
0.0400281585752964,
-0.07349453121423721,
0.054536134004592896,
0.13642160594463348,
0.0088783735409379,
-0.027789641171693802,
-0.06637784838676453,
-0.17711225152015686,
0.03089800477027893,
0.03162635117769241,
0.15516780316829681,
0.13373728096485138,
-0.0981927439570427,
0.1386379450559616,
0.0994887501001358,
-0.04058488458395004,
-0.1874745786190033,
-0.03381049633026123,
-0.05794481560587883,
0.029622463509440422,
0.06806723028421402,
-0.06863463670015335,
0.15242598950862885,
0.031359050422906876,
-0.014674012549221516,
0.13773629069328308,
-0.3741791546344757,
-0.147428497672081,
0.10334527492523193,
0.15442267060279846,
0.1437632143497467,
-0.13486191630363464,
-0.07952934503555298,
-0.015121608972549438,
-0.20124542713165283,
0.016888469457626343,
-0.0782930850982666,
0.00868917815387249,
-0.04802168533205986,
-0.07350035756826401,
-0.003256043652072549,
-0.060357484966516495,
0.08164874464273453,
-0.04448635131120682,
0.06438664346933365,
-0.09927329421043396,
0.025744058191776276,
0.09397176653146744,
-0.04440347105264664,
0.04598338529467583,
-0.2043447345495224,
0.05682515352964401,
-0.1946788877248764,
-0.030477747321128845,
0.03503090515732765,
0.06558240950107574,
-0.02383393608033657,
-0.05311521142721176,
-0.028256570920348167,
-0.0413929708302021,
0.015783879905939102,
-0.028937727212905884,
0.0513036847114563,
-0.02987396903336048,
0.07975804805755615,
0.16607816517353058,
0.07027619332075119,
-0.01867329701781273,
-0.13493943214416504,
-0.05066691339015961,
-0.009244158864021301,
0.08856338262557983,
-0.17030242085456848,
0.01699911803007126,
0.07303570955991745,
0.08104711771011353,
0.05881471931934357,
0.05000252276659012,
-0.021797433495521545,
0.060848820954561234,
0.13888829946517944,
-0.11280988156795502,
0.03652022033929825,
-0.039462435990571976,
0.0092458575963974,
0.07372947782278061,
0.13223406672477722,
0.13547056913375854,
-0.05396393686532974,
0.031677838414907455,
0.001638340181671083,
0.03613882511854172,
-0.03418676555156708,
0.11643272638320923,
0.08597534149885178,
0.03690408915281296,
-0.03988000378012657,
0.058561209589242935,
-0.05266757309436798,
-0.1083095371723175,
-0.04323168098926544,
0.09023913741111755,
-0.1131238117814064,
-0.048770416527986526,
0.0569363571703434,
0.19297172129154205,
-0.059543706476688385,
0.008560107089579105,
-0.07810328900814056,
-0.10740404576063156,
-0.010952131822705269,
0.1887502819299698,
0.05390940606594086,
-0.0510479174554348,
-0.07099026441574097,
-0.036067668348550797,
-0.018336405977606773,
0.09037385135889053,
0.10667698830366135,
0.10633142292499542,
-0.16502097249031067,
-0.046389222145080566,
-0.03457063063979149,
-0.051647406071424484,
-0.08186405897140503,
0.031975600868463516,
-0.04211295396089554,
-0.047904592007398605,
-0.07594811916351318,
0.05444355309009552,
-0.12005340307950974,
-0.032858848571777344,
-0.018606850877404213,
-0.04126633703708649,
0.00305905076675117,
0.02908976376056671,
-0.031037451699376106,
0.010248182341456413,
-0.025987740606069565,
-0.0019586412236094475,
-0.10587259382009506,
-0.016185656189918518,
-0.016057012602686882,
-0.12051794677972794,
0.06559555977582932,
-0.02851896919310093,
-0.0615464523434639,
-0.002896049292758107,
-0.20469118654727936,
-0.038009390234947205,
0.09905723482370377,
-0.027728034183382988,
-0.004103658255189657,
0.0371173731982708,
0.03702126443386078,
0.0027738595381379128,
0.011555055156350136,
-0.05062481760978699,
0.0393931120634079,
-0.08465737104415894,
0.06870707124471664,
-0.06238951161503792,
0.002901890315115452,
-0.07657264918088913,
0.015344934538006783,
0.15844987332820892,
0.044890034943819046,
0.15190823376178741,
-0.11483313143253326,
0.07266522943973541,
-0.11860126256942749,
0.014248109422624111,
0.05102503299713135,
-0.06532762199640274,
0.07524022459983826,
0.015436418354511261,
-0.025377685204148293,
-0.06303989887237549,
0.18580736219882965,
-0.027193857356905937,
-0.13465532660484314,
0.023244421929121017,
-0.015967434272170067,
0.077956922352314,
0.06970153003931046,
0.23134130239486694,
0.01955326274037361,
0.03585021197795868,
-0.1364835649728775,
0.08022238314151764,
0.09189990162849426,
-0.039067160338163376,
0.10661371052265167,
0.1148996353149414,
-0.14319147169589996,
0.13815994560718536,
0.016277305781841278,
-0.022532878443598747,
-0.06871303170919418,
0.06853591650724411,
-0.0665639191865921,
0.12379133701324463,
-0.031432151794433594,
0.007864655926823616,
0.2449411153793335,
-0.03322869539260864,
-0.024575715884566307,
0.10159090161323547,
-0.029227834194898605,
-0.0763666182756424,
-0.18692661821842194,
-0.041362352669239044,
-0.17427265644073486,
0.005611731205135584,
-0.04146823659539223,
-0.016141196712851524,
0.016048476099967957,
0.04927763715386391,
0.08772194385528564,
0.12384582310914993,
0.033518169075250626,
-0.047408394515514374,
0.07322674989700317,
0.0016254500951617956,
-0.06407259404659271,
0.04568810388445854,
0.048706699162721634,
0.038684677332639694,
-0.014279219321906567,
-0.03872578963637352,
0.03108816221356392,
0.03309463709592819,
0.019813308492302895,
-0.00974966213107109,
-0.02699587121605873,
-0.01606248877942562,
-0.006067587528377771,
-0.031125523149967194,
0.16741718351840973,
0.06925933063030243,
-0.008964953944087029,
-0.02907165326178074,
0.13447622954845428,
-0.05266872048377991,
-0.12353280931711197,
-0.11483003944158554,
-0.014447611756622791,
-0.013315049931406975,
0.08155201375484467,
-0.07493145763874054,
-0.1294896900653839,
0.036990053951740265,
0.2096807211637497,
0.22337369620800018,
-0.022630581632256508,
0.0631798580288887,
0.000541043234989047,
-0.010557647794485092,
0.014101425185799599,
0.026189982891082764,
0.014942222274839878,
0.26919233798980713,
-0.01884661614894867,
-0.08495141565799713,
-0.05975818634033203,
-0.08600912988185883,
-0.09196726232767105,
0.02908789925277233,
0.026359548792243004,
-0.04552663490176201,
-0.04590848460793495,
0.07695500552654266,
0.0187996719032526,
-0.10010558366775513,
0.03411343693733215,
-0.0884055495262146,
-0.00354204373434186,
-0.0499151311814785,
0.1365703046321869,
-0.018376560881733894,
-0.02109702117741108,
-0.019011814147233963,
-0.040815822780132294,
0.07345739752054214,
0.02180936560034752,
-0.13616527616977692,
0.06810738891363144,
-0.06800281256437302,
-0.1957753598690033,
0.07837899029254913,
-0.021284859627485275,
0.022025713697075844,
0.02415534295141697,
0.021917466074228287,
-0.0651090145111084,
0.047121018171310425,
-0.009301071055233479,
-0.05625709891319275,
-0.04421313852071762,
0.11079731583595276,
0.016986848786473274,
0.029653828591108322,
0.009067602455615997,
-0.16170276701450348,
0.06978563964366913,
0.027445316314697266,
-0.11394165456295013,
-0.07595369219779968,
0.007715072948485613,
-0.053657807409763336,
0.08819692581892014,
0.0010777199640870094,
0.010309773497283459,
0.025854354724287987,
-0.008572849445044994,
0.03102092631161213,
0.02305179089307785,
0.004621089436113834,
0.05050915479660034,
-0.09488459676504135,
0.008585365489125252,
0.03938855603337288,
0.04583987221121788,
-0.20392566919326782,
-0.0729689821600914,
-0.10808923840522766,
-0.012998366728425026,
-0.0397789403796196,
0.063255675137043,
0.26449188590049744,
0.04873135685920715,
-0.005394093692302704,
-0.2300586998462677,
0.07023264467716217,
0.07256171107292175,
-0.09832238405942917,
-0.03821288421750069
] |
null | null | stable-baselines3 |
# **DQN** Agent playing **SpaceInvadersNoFrameskip-v4**
This is a trained model of a **DQN** agent playing **SpaceInvadersNoFrameskip-v4**
using the [stable-baselines3 library](https://github.com/DLR-RM/stable-baselines3)
and the [RL Zoo](https://github.com/DLR-RM/rl-baselines3-zoo).
The RL Zoo is a training framework for Stable Baselines3
reinforcement learning agents,
with hyperparameter optimization and pre-trained agents included.
## Usage (with SB3 RL Zoo)
RL Zoo: https://github.com/DLR-RM/rl-baselines3-zoo<br/>
SB3: https://github.com/DLR-RM/stable-baselines3<br/>
SB3 Contrib: https://github.com/Stable-Baselines-Team/stable-baselines3-contrib
Install the RL Zoo (with SB3 and SB3-Contrib):
```bash
pip install rl_zoo3
```
```bash
# Download model and save it into the logs/ folder
python -m rl_zoo3.load_from_hub --algo dqn --env SpaceInvadersNoFrameskip-v4 -orga hugo-massonnat -f logs/
python -m rl_zoo3.enjoy --algo dqn --env SpaceInvadersNoFrameskip-v4 -f logs/
```
If you installed the RL Zoo3 via pip (`pip install rl_zoo3`), you can run these commands from anywhere:
```bash
python -m rl_zoo3.load_from_hub --algo dqn --env SpaceInvadersNoFrameskip-v4 -orga hugo-massonnat -f logs/
python -m rl_zoo3.enjoy --algo dqn --env SpaceInvadersNoFrameskip-v4 -f logs/
```
## Training (with the RL Zoo)
```bash
python -m rl_zoo3.train --algo dqn --env SpaceInvadersNoFrameskip-v4 -f logs/
# Upload the model and generate video (when possible)
python -m rl_zoo3.push_to_hub --algo dqn --env SpaceInvadersNoFrameskip-v4 -f logs/ -orga hugo-massonnat
```
## Hyperparameters
```python
OrderedDict([('batch_size', 32),
('buffer_size', 100000),
('env_wrapper',
['stable_baselines3.common.atari_wrappers.AtariWrapper']),
('exploration_final_eps', 0.01),
('exploration_fraction', 0.1),
('frame_stack', 4),
('gradient_steps', 1),
('learning_rate', 0.0001),
('learning_starts', 100000),
('n_timesteps', 1000000.0),
('optimize_memory_usage', False),
('policy', 'CnnPolicy'),
('target_update_interval', 1000),
('train_freq', 4),
('normalize', False)])
```
# Environment Arguments
```python
{'render_mode': 'rgb_array'}
```
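For use outside the zoo's CLI, here is a rough sketch of loading the checkpoint directly with Stable Baselines3. The `logs/` path is an assumption based on the zoo's default layout, and the wrappers mirror the hyperparameters above (AtariWrapper plus a 4-frame stack).

```python
# Rough sketch; the checkpoint path follows the zoo's default logs/ layout
# and may need adjusting to wherever load_from_hub saved the model.
from stable_baselines3 import DQN
from stable_baselines3.common.env_util import make_atari_env
from stable_baselines3.common.vec_env import VecFrameStack

# Rebuild the training-time observation pipeline: Atari wrappers + 4-frame stack.
env = make_atari_env(
    "SpaceInvadersNoFrameskip-v4", n_envs=1, env_kwargs={"render_mode": "rgb_array"}
)
env = VecFrameStack(env, n_stack=4)

model = DQN.load(
    "logs/dqn/SpaceInvadersNoFrameskip-v4_1/SpaceInvadersNoFrameskip-v4.zip", env=env
)

obs = env.reset()
for _ in range(1000):
    action, _states = model.predict(obs, deterministic=True)
    obs, rewards, dones, infos = env.step(action)
```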
| {"library_name": "stable-baselines3", "tags": ["SpaceInvadersNoFrameskip-v4", "deep-reinforcement-learning", "reinforcement-learning", "stable-baselines3"], "model-index": [{"name": "DQN", "results": [{"task": {"type": "reinforcement-learning", "name": "reinforcement-learning"}, "dataset": {"name": "SpaceInvadersNoFrameskip-v4", "type": "SpaceInvadersNoFrameskip-v4"}, "metrics": [{"type": "mean_reward", "value": "575.00 +/- 216.81", "name": "mean_reward", "verified": false}]}]}]} | reinforcement-learning | hugo-massonnat/dqn-SpaceInvadersNoFrameskip-v4 | [
"stable-baselines3",
"SpaceInvadersNoFrameskip-v4",
"deep-reinforcement-learning",
"reinforcement-learning",
"model-index",
"region:us"
] | 2024-02-13T10:50:59+00:00 | [] | [] | TAGS
#stable-baselines3 #SpaceInvadersNoFrameskip-v4 #deep-reinforcement-learning #reinforcement-learning #model-index #region-us
|
# DQN Agent playing SpaceInvadersNoFrameskip-v4
This is a trained model of a DQN agent playing SpaceInvadersNoFrameskip-v4
using the stable-baselines3 library
and the RL Zoo.
The RL Zoo is a training framework for Stable Baselines3
reinforcement learning agents,
with hyperparameter optimization and pre-trained agents included.
## Usage (with SB3 RL Zoo)
RL Zoo: URL
SB3: URL
SB3 Contrib: URL
Install the RL Zoo (with SB3 and SB3-Contrib):
If you installed the RL Zoo3 via pip ('pip install rl_zoo3'), you can run these commands from anywhere:
## Training (with the RL Zoo)
## Hyperparameters
# Environment Arguments
| [
"# DQN Agent playing SpaceInvadersNoFrameskip-v4\nThis is a trained model of a DQN agent playing SpaceInvadersNoFrameskip-v4\nusing the stable-baselines3 library\nand the RL Zoo.\n\nThe RL Zoo is a training framework for Stable Baselines3\nreinforcement learning agents,\nwith hyperparameter optimization and pre-trained agents included.",
"## Usage (with SB3 RL Zoo)\n\nRL Zoo: URL\nSB3: URL\nSB3 Contrib: URL\n\nInstall the RL Zoo (with SB3 and SB3-Contrib):\n\n\n\n\nIf you installed the RL Zoo3 via pip ('pip install rl_zoo3'), from anywhere you can do:",
"## Training (with the RL Zoo)",
"## Hyperparameters",
"# Environment Arguments"
] | [
"TAGS\n#stable-baselines3 #SpaceInvadersNoFrameskip-v4 #deep-reinforcement-learning #reinforcement-learning #model-index #region-us \n",
"# DQN Agent playing SpaceInvadersNoFrameskip-v4\nThis is a trained model of a DQN agent playing SpaceInvadersNoFrameskip-v4\nusing the stable-baselines3 library\nand the RL Zoo.\n\nThe RL Zoo is a training framework for Stable Baselines3\nreinforcement learning agents,\nwith hyperparameter optimization and pre-trained agents included.",
"## Usage (with SB3 RL Zoo)\n\nRL Zoo: URL\nSB3: URL\nSB3 Contrib: URL\n\nInstall the RL Zoo (with SB3 and SB3-Contrib):\n\n\n\n\nIf you installed the RL Zoo3 via pip ('pip install rl_zoo3'), from anywhere you can do:",
"## Training (with the RL Zoo)",
"## Hyperparameters",
"# Environment Arguments"
] | [
43,
90,
73,
9,
5,
7
] | [
"passage: TAGS\n#stable-baselines3 #SpaceInvadersNoFrameskip-v4 #deep-reinforcement-learning #reinforcement-learning #model-index #region-us \n# DQN Agent playing SpaceInvadersNoFrameskip-v4\nThis is a trained model of a DQN agent playing SpaceInvadersNoFrameskip-v4\nusing the stable-baselines3 library\nand the RL Zoo.\n\nThe RL Zoo is a training framework for Stable Baselines3\nreinforcement learning agents,\nwith hyperparameter optimization and pre-trained agents included.## Usage (with SB3 RL Zoo)\n\nRL Zoo: URL\nSB3: URL\nSB3 Contrib: URL\n\nInstall the RL Zoo (with SB3 and SB3-Contrib):\n\n\n\n\nIf you installed the RL Zoo3 via pip ('pip install rl_zoo3'), from anywhere you can do:## Training (with the RL Zoo)## Hyperparameters# Environment Arguments"
] | [
0.043572068214416504,
0.2414778620004654,
-0.0026879787910729647,
0.012635791674256325,
0.05784223601222038,
0.0030472534708678722,
0.08585051447153091,
0.10650663822889328,
0.024212315678596497,
-0.001382096204906702,
0.003954293206334114,
0.17533031105995178,
0.03632635250687599,
0.13125447928905487,
-0.018073517829179764,
-0.2066594809293747,
-0.013479253277182579,
-0.06247470900416374,
-0.07153085619211197,
0.036099132150411606,
0.07206681370735168,
-0.030116932466626167,
0.036061208695173264,
-0.051406677812337875,
-0.057161085307598114,
0.036824777722358704,
-0.03157254680991173,
0.007067287806421518,
0.15158706903457642,
-0.1222257912158966,
0.12329676002264023,
0.020955175161361694,
0.1896144151687622,
-0.12332789599895477,
0.0339222252368927,
0.08982209116220474,
-0.036988191306591034,
0.013221588917076588,
0.00975361280143261,
-0.052562564611434937,
0.1590864509344101,
-0.09371145814657211,
0.07146181166172028,
0.010926910676062107,
-0.07592244446277618,
-0.1774153709411621,
-0.09356249868869781,
0.07947742193937302,
0.0617753230035305,
0.005319166928529739,
0.03726791962981224,
0.11306490749120712,
-0.020991774275898933,
0.06488905102014542,
0.11562903225421906,
-0.17549200356006622,
0.013578375801444054,
0.17859570682048798,
0.003242473118007183,
0.15767055749893188,
-0.05546637624502182,
0.019877681508660316,
0.02752300351858139,
0.04758313298225403,
0.06873945891857147,
-0.08186400681734085,
-0.1364826112985611,
-0.056155186146497726,
-0.15456219017505646,
-0.03352400287985802,
0.05195203423500061,
-0.011860138736665249,
-0.05783402919769287,
-0.010724928230047226,
-0.04010869935154915,
0.0008851495804265141,
-0.028637725859880447,
0.01805497519671917,
0.07031578570604324,
-0.01226285845041275,
0.02092539705336094,
-0.08391954004764557,
-0.0390290804207325,
-0.038563769310712814,
-0.018022390082478523,
0.12054917961359024,
0.08285853266716003,
0.0266572255641222,
-0.04135355353355408,
0.10274127870798111,
-0.07091585546731949,
-0.05454207584261894,
0.04555258899927139,
-0.03786851093173027,
-0.10615779459476471,
0.02120024710893631,
-0.05905991420149803,
0.026879185810685158,
0.09943640232086182,
0.18048083782196045,
-0.09862488508224487,
0.012620617635548115,
-0.03430783003568649,
0.08121664822101593,
-0.03196052461862564,
0.03197542577981949,
-0.0840383991599083,
-0.016251085326075554,
0.17835216224193573,
0.0030782297253608704,
0.022272996604442596,
0.002074616262689233,
-0.049819961190223694,
-0.02881433069705963,
-0.017756454646587372,
0.06631895154714584,
0.07032092660665512,
0.010587303899228573,
-0.0037596761249005795,
-0.027667716145515442,
-0.036921944469213486,
-0.05629328638315201,
-0.04952820762991905,
0.018803736194968224,
-0.04712437093257904,
-0.047942135483026505,
0.06027210131287575,
-0.005624116864055395,
0.11337806284427643,
-0.025607796385884285,
0.026316547766327858,
-0.019410157576203346,
-0.07494441419839859,
-0.13221681118011475,
-0.0304415225982666,
0.0691632330417633,
0.04371757060289383,
-0.22497159242630005,
-0.16994807124137878,
-0.008539012633264065,
0.017946386709809303,
-0.018741264939308167,
-0.11334165185689926,
0.02453240379691124,
-0.007166135590523481,
-0.049758363515138626,
-0.01601579785346985,
0.10474669933319092,
-0.020438622683286667,
0.018010856583714485,
-0.05593825876712799,
0.16603368520736694,
-0.14290283620357513,
0.031004127115011215,
-0.08706212788820267,
0.023509707301855087,
-0.21286657452583313,
0.041208744049072266,
-0.177636057138443,
0.04863585904240608,
-0.08500861376523972,
0.02327173389494419,
0.021320728585124016,
0.01968831568956375,
0.08580207824707031,
0.10143322497606277,
-0.23631145060062408,
0.05405791476368904,
0.07900930196046829,
-0.022739801555871964,
-0.04218491166830063,
0.06798892468214035,
-0.06558530032634735,
0.1382148116827011,
0.046505436301231384,
0.24831900000572205,
0.10361487418413162,
-0.2036508023738861,
0.061786454170942307,
0.0578593946993351,
-0.08880111575126648,
-0.004730981774628162,
-0.020022382959723473,
0.11598580330610275,
-0.01114928349852562,
0.03338807821273804,
-0.12186288088560104,
0.1456439197063446,
0.02738998830318451,
-0.0165485180914402,
-0.04454165697097778,
-0.1614885926246643,
0.10309953987598419,
-0.015504824928939342,
0.09532155096530914,
-0.042415786534547806,
0.0001161050095106475,
-0.011168917641043663,
0.18012429773807526,
-0.043841805309057236,
0.0007168867159634829,
0.07871408760547638,
0.10895700752735138,
0.028009075671434402,
-0.020230965688824654,
-0.20380273461341858,
-0.0423048660159111,
0.02367858961224556,
0.044489551335573196,
0.2190362960100174,
0.19936694204807281,
0.07770156860351562,
-0.022313760593533516,
-0.025487221777439117,
-0.003248062450438738,
-0.05106664076447487,
0.03467361256480217,
-0.027858436107635498,
-0.024532482028007507,
0.06065356358885765,
-0.09305168688297272,
0.02817818708717823,
-0.13112716376781464,
0.06307920068502426,
-0.17345242202281952,
0.06863926351070404,
0.021998396143317223,
-0.005436043255031109,
0.024577690288424492,
-0.011292695067822933,
-0.034188106656074524,
-0.06233125180006027,
0.07110602408647537,
0.06098933145403862,
0.014702376909554005,
0.0021991983521729708,
-0.0683600977063179,
-0.13828523457050323,
0.08231553435325623,
-0.04042381793260574,
-0.14305958151817322,
0.06392676383256912,
0.011172642931342125,
0.04875864461064339,
-0.05975872278213501,
0.016254881396889687,
0.22900153696537018,
0.05321883037686348,
0.09785865992307663,
-0.04092191904783249,
-0.022525805979967117,
-0.06617844104766846,
-0.06677833944559097,
0.09694591909646988,
0.10812206566333771,
0.060318704694509506,
-0.0030071530491113663,
0.07626225054264069,
0.10942911356687546,
-0.1035122498869896,
-0.0651884600520134,
0.03220061957836151,
-0.05973697826266289,
0.019652515649795532,
0.049140311777591705,
0.02971293032169342,
0.08619047701358795,
0.1833551675081253,
0.008245792239904404,
0.0386311337351799,
-0.025997694581747055,
0.026109617203474045,
-0.15547916293144226,
-0.03145433962345123,
0.04308181628584862,
0.00886955764144659,
-0.07408110797405243,
0.04994636029005051,
0.051439400762319565,
0.13607151806354523,
-0.08217083662748337,
-0.13170577585697174,
-0.059745315462350845,
-0.03804200142621994,
-0.04239124804735184,
0.14975430071353912,
-0.08507520705461502,
-0.19221234321594238,
-0.017164425924420357,
-0.15751953423023224,
-0.02518727444112301,
-0.005179801490157843,
0.002318724524229765,
-0.08325926214456558,
0.017780914902687073,
0.010001576505601406,
-0.03129372000694275,
-0.0684933215379715,
-0.06596160680055618,
-0.05786636844277382,
0.09124112874269485,
0.06932931393384933,
-0.12240120023488998,
-0.00961651187390089,
-0.03742414712905884,
-0.020465577021241188,
0.04516167193651199,
0.08452648669481277,
-0.007267598994076252,
0.07773483544588089,
-0.13209199905395508,
-0.06962883472442627,
0.02834828943014145,
0.2766247093677521,
0.02882981114089489,
0.004668009467422962,
0.17051753401756287,
-0.03629542142152786,
0.04912714660167694,
0.16181479394435883,
0.030781643465161324,
-0.14196757972240448,
0.07090470939874649,
-0.011341600678861141,
-0.09542687982320786,
-0.1706860214471817,
-0.10215658694505692,
-0.037867411971092224,
-0.05015881359577179,
0.05638284236192703,
0.004951419774442911,
-0.04476970434188843,
0.05910305306315422,
0.08782228082418442,
-0.017004497349262238,
-0.06151578947901726,
0.11129767447710037,
0.032263003289699554,
-0.030136963352560997,
0.08078382909297943,
-0.042354047298431396,
-0.04206389561295509,
0.0032403599470853806,
0.22643887996673584,
0.0937788337469101,
-0.01775507442653179,
-0.042567066848278046,
0.019317636266350746,
0.05095715448260307,
0.03613382205367088,
0.11312435567378998,
-0.06975842267274857,
-0.06826137751340866,
-0.035185977816581726,
0.027829548344016075,
-0.02945687249302864,
0.08205190300941467,
0.0630207508802414,
0.005563626065850258,
-0.04653681069612503,
-0.07972332090139389,
-0.04849022626876831,
0.08408913016319275,
-0.027642227709293365,
-0.10093270242214203,
0.09321888536214828,
0.048575710505247116,
0.0016974330646917224,
0.03055831417441368,
0.027994604781270027,
0.01462269201874733,
-0.07982148975133896,
-0.06775744259357452,
0.011468625627458096,
0.07076629996299744,
-0.06822766363620758,
-0.027886953204870224,
-0.19817815721035004,
0.14578363299369812,
0.010630400851368904,
0.04118429124355316,
-0.13048617541790009,
0.1209396943449974,
-0.023116756230592728,
-0.026430301368236542,
0.013811616227030754,
0.0014643745962530375,
0.08203291147947311,
-0.04806509613990784,
0.15762180089950562,
0.009528410620987415,
-0.28092408180236816,
-0.1418946087360382,
-0.08416824042797089,
-0.051183976233005524,
-0.022873088717460632,
0.014752174727618694,
0.0642135739326477,
0.01516205258667469,
0.003868846921250224,
-0.013076163828372955,
0.03185269236564636,
-0.09826882928609848,
-0.06493937969207764,
-0.04839126765727997,
-0.02250157669186592,
-0.06525848805904388,
-0.05647949501872063,
-0.0006809153710491955,
-0.17226077616214752,
0.12522587180137634,
0.11787347495555878,
-0.06451737880706787,
-0.041814323514699936,
-0.06554657220840454,
0.046191465109586716,
-0.07571537792682648,
0.0469326451420784,
0.003414976177737117,
0.019198855385184288,
-0.06806991249322891,
-0.17922484874725342,
0.016097763553261757,
-0.10899919271469116,
0.03772687539458275,
-0.05070559307932854,
0.020257100462913513,
0.08594245463609695,
0.17520126700401306,
0.05856714025139809,
0.01460097823292017,
-0.07239776104688644,
-0.07543374598026276,
-0.0017121878918260336,
-0.06344114243984222,
0.05762333422899246,
-0.009151889942586422,
-0.20333483815193176,
0.02763226442039013,
-0.11414948850870132,
0.06860900670289993,
0.3310066759586334,
0.3324824273586273,
-0.10698744654655457,
0.1177443116903305,
0.04819539934396744,
-0.042202454060316086,
-0.21051374077796936,
-0.002244179602712393,
0.012272895313799381,
0.024992236867547035,
0.13725964725017548,
-0.12924811244010925,
0.05453680083155632,
0.0794181227684021,
-0.024458877742290497,
0.01456840243190527,
-0.09078162908554077,
-0.10816970467567444,
0.20847418904304504,
0.14226987957954407,
0.04421741142868996,
-0.09421348571777344,
0.08391669392585754,
0.004295284394174814,
0.08375877887010574,
0.2107764035463333,
-0.052112679928541183,
0.10695768147706985,
0.005195184610784054,
0.19852910935878754,
0.0328996516764164,
-0.023768596351146698,
0.10834760218858719,
-0.009801650419831276,
0.07911337912082672,
0.03985166177153587,
-0.007676942739635706,
0.010487722232937813,
-0.04522453248500824,
0.014148596674203873,
-0.028376007452607155,
0.010284217074513435,
-0.2274095118045807,
0.0582297146320343,
-0.06368855386972427,
0.04604509472846985,
0.008256820961833,
-0.0999874547123909,
-0.03583388403058052,
0.06431841105222702,
0.08014573156833649,
0.01975327916443348,
0.0436067171394825,
-0.03867863491177559,
0.11051398515701294,
0.20660489797592163,
-0.009811338968575,
0.17751595377922058,
-0.0615963339805603,
0.01464168168604374,
-0.023011628538370132,
-0.04223164543509483,
-0.1462583988904953,
-0.035259708762168884,
0.03498423472046852,
0.057734888046979904,
0.015203364193439484,
0.049647457897663116,
-0.05656236410140991,
0.08498423546552658,
0.021687336266040802,
-0.041541360318660736,
0.033579520881175995,
0.08835696429014206,
0.12415177375078201,
0.010754258371889591,
-0.030121933668851852,
0.06147436052560806,
-0.08128108084201813,
-0.09446098655462265,
-0.004497923422604799,
-0.029991207644343376,
-0.1083834245800972,
0.11353230476379395,
0.16914646327495575,
0.039594944566488266,
-0.057076629251241684,
0.10688766092061996,
-0.02768099494278431,
0.10047874599695206,
0.009198128245770931,
0.06507332623004913,
-0.014091075398027897,
-0.03691792115569115,
0.10611724853515625,
-0.05442855879664421,
-0.01637818105518818,
0.07645545154809952,
-0.06522727757692337,
-0.023877469822764397,
-0.0801999643445015,
0.06034626066684723,
0.09222240000963211,
-0.16854619979858398,
-0.0639432892203331,
-0.032122284173965454,
-0.08628080040216446,
0.013965039514005184,
0.012447911314666271,
0.0710059329867363,
-0.08589600026607513,
0.06316167116165161,
-0.024337708950042725,
0.015639442950487137,
-0.03689891844987869,
0.019222697243094444,
-0.19525384902954102,
-0.002140450058504939,
-0.11280795186758041,
-0.00348020251840353,
-0.002931603929027915,
0.04463808611035347,
-0.04961875081062317,
-0.029358822852373123,
-0.0030675032176077366,
0.044366419315338135,
-0.16609135270118713,
0.002798673929646611,
-0.011639905162155628,
0.03210212290287018,
-0.0002893915225286037,
-0.0983390137553215,
0.014195028692483902,
-0.04294256120920181,
-0.04198618605732918,
0.04925514757633209,
0.009436776861548424,
0.06470516324043274,
-0.2795179784297943,
-0.14905457198619843,
0.030816160142421722,
0.0683867484331131,
0.05483196675777435,
-0.1830425262451172,
0.03568267077207565,
-0.08042316138744354,
-0.02253127470612526,
-0.037770628929138184,
0.018491698428988457,
-0.0539514496922493,
0.0018174031283706427,
-0.04225044324994087,
-0.023033907637000084,
-0.028055014088749886,
-0.07556360960006714,
0.0826747715473175,
0.12462522834539413,
0.07555580884218216,
-0.03807181864976883,
0.09595896303653717,
-0.10009756684303284,
-0.04657831788063049,
-0.04052736237645149,
-0.036951083689928055,
0.017965637147426605,
-0.0870552659034729,
0.048530060797929764,
0.05188591405749321,
0.18719671666622162,
-0.08520494401454926,
-0.058800119906663895,
-0.014255574904382229,
0.0746525228023529,
0.07849094271659851,
0.005095830652862787,
0.17779210209846497,
-0.045693784952163696,
0.05693846940994263,
0.021304311230778694,
0.046699028462171555,
0.10497613251209259,
-0.023569339886307716,
0.14490213990211487,
0.21171095967292786,
-0.037196725606918335,
-0.11048602312803268,
0.043668005615472794,
0.01745123788714409,
-0.002401199424639344,
0.05968761444091797,
0.11983796209096909,
-0.050589341670274734,
-0.10903856158256531,
0.23442286252975464,
0.054169271141290665,
-0.11218088120222092,
0.09546315670013428,
0.039532262831926346,
-0.015890996903181076,
-0.1301896870136261,
0.010444961488246918,
-0.0013640925753861666,
-0.11233190447092056,
0.03386834263801575,
-0.06087532266974449,
-0.025547027587890625,
0.11809267848730087,
0.008789865300059319,
0.03317064419388771,
-0.04139537364244461,
-0.03756232187151909,
-0.04352104663848877,
-0.04273213446140289,
-0.012549578212201595,
-0.02991986647248268,
-0.030186517164111137,
-0.07621737569570541,
-0.007770835887640715,
-0.012012424878776073,
0.030795488506555557,
-0.015285328030586243,
-0.02503054589033127,
-0.021192016080021858,
-0.06697061657905579,
-0.0026312144473195076,
-0.008178025484085083,
0.015549594536423683,
0.010121971368789673,
0.2358063906431198,
0.07042546570301056,
-0.10260069370269775,
-0.01036880537867546,
0.22197756171226501,
-0.03853277862071991,
-0.06528383493423462,
-0.07849395275115967,
0.25128230452537537,
-0.10482002794742584,
0.051095426082611084,
-0.005819917656481266,
-0.06550488620996475,
-0.07153836637735367,
0.2309868484735489,
0.13502730429172516,
-0.1677926480770111,
0.06329060345888138,
-0.0368385910987854,
-0.009490780532360077,
-0.14286863803863525,
0.16013580560684204,
0.1865294873714447,
0.09480160474777222,
-0.12259847670793533,
0.0023130534682422876,
-0.03518044203519821,
-0.018328361213207245,
-0.1660851687192917,
-0.004593863617628813,
-0.029364850372076035,
-0.0427238829433918,
-0.050771355628967285,
0.029773715883493423,
-0.15205919742584229,
-0.0927426889538765,
-0.1916799396276474,
-0.11482496559619904,
-0.12386849522590637,
-0.04549141973257065,
-0.11142764985561371,
-0.0019938007462769747,
0.02257080189883709,
-0.0641874223947525,
0.021061956882476807,
-0.0212461706250906,
-0.05887424945831299,
0.015386379323899746,
-0.08395619690418243,
0.0674985870718956,
0.06488548219203949,
0.15327942371368408,
-0.0790991559624672,
0.025424562394618988,
0.07090727984905243,
-0.057595450431108475,
-0.10164349526166916,
0.06067253649234772,
0.015708057209849358,
-0.1972588747739792,
0.007548294495791197,
0.17712996900081635,
-0.10420889407396317,
0.09745754301548004,
0.048501528799533844,
-0.012951982207596302,
0.0867827981710434,
-0.024721821770071983,
-0.016682926565408707,
-0.04852180927991867,
-0.011212974786758423,
-0.10143939405679703,
0.09892100840806961,
0.0876845121383667,
-0.0517118014395237,
0.07436849176883698,
-0.09508965909481049,
-0.04068392515182495,
0.13103286921977997,
-0.010057874955236912,
-0.08450483530759811,
-0.11667824536561966,
-0.04081142693758011,
0.09684515744447708,
-0.018041390925645828,
-0.20185889303684235,
-0.11639472097158432,
-0.11752668023109436,
-0.00014377340266946703,
-0.03563340753316879,
0.061800602823495865,
0.02430674433708191,
-0.02556120604276657,
-0.008150683715939522,
-0.17615078389644623,
-0.06614746153354645,
0.13479791581630707,
-0.10176112502813339,
-0.07456064969301224
] |
null | null | transformers |
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed to the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
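Since the card leaves this section blank, here is a hypothetical quick-start: the repo id is taken from this model's own listing, but the checkpoint's label set is undocumented, so the predicted entity types are whatever the model was trained with.

```python
# Hypothetical quick-start; the label set of this checkpoint is not documented.
from transformers import pipeline

token_classifier = pipeline(
    "token-classification",
    model="kabir5297/Deberta_Huge_data",
    aggregation_strategy="simple",  # merge sub-word pieces into word-level spans
)

print(token_classifier("Hugging Face is based in New York City."))
```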
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed]
| {"library_name": "transformers", "tags": []} | token-classification | kabir5297/Deberta_Huge_data | [
"transformers",
"safetensors",
"deberta-v2",
"token-classification",
"arxiv:1910.09700",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | 2024-02-13T10:53:42+00:00 | [
"1910.09700"
] | [] | TAGS
#transformers #safetensors #deberta-v2 #token-classification #arxiv-1910.09700 #autotrain_compatible #endpoints_compatible #region-us
|
# Model Card for Model ID
## Model Details
### Model Description
This is the model card of a transformers model that has been pushed to the Hub. This model card has been automatically generated.
- Developed by:
- Funded by [optional]:
- Shared by [optional]:
- Model type:
- Language(s) (NLP):
- License:
- Finetuned from model [optional]:
### Model Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Downstream Use [optional]
### Out-of-Scope Use
## Bias, Risks, and Limitations
### Recommendations
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
## Training Details
### Training Data
### Training Procedure
#### Preprocessing [optional]
#### Training Hyperparameters
- Training regime:
#### Speeds, Sizes, Times [optional]
## Evaluation
### Testing Data, Factors & Metrics
#### Testing Data
#### Factors
#### Metrics
### Results
#### Summary
## Model Examination [optional]
## Environmental Impact
Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).
- Hardware Type:
- Hours used:
- Cloud Provider:
- Compute Region:
- Carbon Emitted:
## Technical Specifications [optional]
### Model Architecture and Objective
### Compute Infrastructure
#### Hardware
#### Software
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Model Card Authors [optional]
## Model Card Contact
| [
"# Model Card for Model ID",
"## Model Details",
"### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:",
"### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Downstream Use [optional]",
"### Out-of-Scope Use",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.",
"## How to Get Started with the Model\n\nUse the code below to get started with the model.",
"## Training Details",
"### Training Data",
"### Training Procedure",
"#### Preprocessing [optional]",
"#### Training Hyperparameters\n\n- Training regime:",
"#### Speeds, Sizes, Times [optional]",
"## Evaluation",
"### Testing Data, Factors & Metrics",
"#### Testing Data",
"#### Factors",
"#### Metrics",
"### Results",
"#### Summary",
"## Model Examination [optional]",
"## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:",
"## Technical Specifications [optional]",
"### Model Architecture and Objective",
"### Compute Infrastructure",
"#### Hardware",
"#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Model Card Authors [optional]",
"## Model Card Contact"
] | [
"TAGS\n#transformers #safetensors #deberta-v2 #token-classification #arxiv-1910.09700 #autotrain_compatible #endpoints_compatible #region-us \n",
"# Model Card for Model ID",
"## Model Details",
"### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:",
"### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Downstream Use [optional]",
"### Out-of-Scope Use",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.",
"## How to Get Started with the Model\n\nUse the code below to get started with the model.",
"## Training Details",
"### Training Data",
"### Training Procedure",
"#### Preprocessing [optional]",
"#### Training Hyperparameters\n\n- Training regime:",
"#### Speeds, Sizes, Times [optional]",
"## Evaluation",
"### Testing Data, Factors & Metrics",
"#### Testing Data",
"#### Factors",
"#### Metrics",
"### Results",
"#### Summary",
"## Model Examination [optional]",
"## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:",
"## Technical Specifications [optional]",
"### Model Architecture and Objective",
"### Compute Infrastructure",
"#### Hardware",
"#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Model Card Authors [optional]",
"## Model Card Contact"
] | [
52,
6,
3,
82,
28,
3,
4,
9,
9,
10,
42,
20,
3,
4,
5,
9,
11,
13,
3,
12,
5,
4,
5,
3,
4,
9,
53,
9,
8,
6,
3,
14,
8,
7,
9,
4
] | [
"passage: TAGS\n#transformers #safetensors #deberta-v2 #token-classification #arxiv-1910.09700 #autotrain_compatible #endpoints_compatible #region-us \n# Model Card for Model ID## Model Details### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Downstream Use [optional]### Out-of-Scope Use## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.## How to Get Started with the Model\n\nUse the code below to get started with the model.## Training Details### Training Data### Training Procedure#### Preprocessing [optional]#### Training Hyperparameters\n\n- Training regime:#### Speeds, Sizes, Times [optional]## Evaluation### Testing Data, Factors & Metrics#### Testing Data#### Factors#### Metrics### Results#### Summary## Model Examination [optional]## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:## Technical Specifications [optional]### Model Architecture and Objective### Compute Infrastructure#### Hardware#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Model Card Authors [optional]## Model Card Contact"
] | [
-0.06466816365718842,
0.12197275459766388,
-0.004202458541840315,
0.029149644076824188,
0.12036502361297607,
0.006935111712664366,
0.06423568725585938,
0.1065516322851181,
-0.02678372710943222,
0.11724407225847244,
0.019794756546616554,
0.1101258173584938,
0.10480418801307678,
0.1741989552974701,
-0.005228097550570965,
-0.21802054345607758,
0.044925298541784286,
-0.1332864612340927,
-0.024949483573436737,
0.1159522756934166,
0.13522779941558838,
-0.12204837054014206,
0.06930451840162277,
-0.04209818318486214,
-0.007912974804639816,
-0.0322565995156765,
-0.05880237743258476,
-0.04924030974507332,
0.06812424957752228,
0.06659761071205139,
0.0681338757276535,
0.012976576574146748,
0.10400646179914474,
-0.2772457003593445,
0.020696397870779037,
0.08373123407363892,
0.005089117679744959,
0.06848078966140747,
0.05912717059254646,
-0.0766395553946495,
0.07375232130289078,
-0.07348829507827759,
0.15079154074192047,
0.07845799624919891,
-0.09164275974035263,
-0.20210139453411102,
-0.08833131939172745,
0.09044230729341507,
0.19723249971866608,
0.056620024144649506,
-0.028050832450389862,
0.12256447225809097,
-0.07247937470674515,
0.015507095493376255,
0.06527657806873322,
-0.07301480323076248,
-0.05207262933254242,
0.06885527074337006,
0.06803692132234573,
0.10003920644521713,
-0.13086174428462982,
-0.007739242631942034,
0.030164694413542747,
0.013562135398387909,
0.10788191854953766,
0.018514400348067284,
0.12156243622303009,
0.04128511995077133,
-0.14408166706562042,
-0.03994642570614815,
0.09361754357814789,
0.0397270992398262,
-0.05522630363702774,
-0.2456887662410736,
-0.01887490786612034,
-0.03032955899834633,
-0.033128105103969574,
-0.054186753928661346,
0.05287313833832741,
-0.021889202296733856,
0.08110702037811279,
-0.008305862545967102,
-0.07696685940027237,
-0.04674658551812172,
0.08184117078781128,
0.06834331154823303,
0.026918675750494003,
-0.02635417878627777,
0.008553886786103249,
0.11673150211572647,
0.10659375786781311,
-0.11640386283397675,
-0.04757297784090042,
-0.059655386954545975,
-0.08255314826965332,
-0.051134269684553146,
0.02667141705751419,
0.02667512185871601,
0.04563198238611221,
0.21625038981437683,
0.0014771533897146583,
0.04591803625226021,
0.02884340099990368,
0.01423268485814333,
0.06688534468412399,
0.09831439703702927,
-0.05904420465230942,
-0.11931392550468445,
-0.02217327058315277,
0.10633781552314758,
0.0066702705807983875,
-0.03250111639499664,
-0.05059574916958809,
0.06834481656551361,
0.024872468784451485,
0.12410221993923187,
0.06858504563570023,
0.012844047509133816,
-0.07673057168722153,
-0.06398788094520569,
0.17097899317741394,
-0.16335350275039673,
0.03551461920142174,
0.02393929846584797,
-0.053198374807834625,
-0.014516751281917095,
0.020755909383296967,
0.025373658165335655,
-0.011161229573190212,
0.0990719124674797,
-0.05368759483098984,
-0.03255419060587883,
-0.11457451432943344,
-0.053361907601356506,
0.023612400516867638,
0.02046036161482334,
-0.02989322878420353,
-0.04349198937416077,
-0.10500442981719971,
-0.07200396806001663,
0.08398986607789993,
-0.06797605007886887,
-0.04464494064450264,
-0.03520826995372772,
-0.07841871678829193,
0.014052528887987137,
0.00574632128700614,
0.11038210988044739,
-0.023612231016159058,
0.04836830496788025,
-0.05386055260896683,
0.07064847648143768,
0.13456082344055176,
0.02942483313381672,
-0.05367830768227577,
0.05159740895032883,
-0.2461555302143097,
0.10376992076635361,
-0.07264269888401031,
0.04777098074555397,
-0.16307511925697327,
-0.0198860764503479,
0.043219488114118576,
0.023323146626353264,
-0.005688810721039772,
0.1300518661737442,
-0.2077709287405014,
-0.035600125789642334,
0.17244622111320496,
-0.10709037631750107,
-0.08368578553199768,
0.05450167506933212,
-0.05821350961923599,
0.12177194654941559,
0.05322835594415665,
-0.01946827955543995,
0.025913206860423088,
-0.14479942619800568,
-0.010151791386306286,
-0.05895674601197243,
-0.02808297798037529,
0.15971976518630981,
0.05844848230481148,
-0.05183469131588936,
0.06317830085754395,
0.016611700877547264,
-0.01677791401743889,
-0.04923691228032112,
-0.032810237258672714,
-0.09724095463752747,
0.013582640327513218,
-0.07040367275476456,
0.02299460582435131,
-0.03090890683233738,
-0.08987399935722351,
-0.0339132584631443,
-0.15792976319789886,
0.02252473123371601,
0.09328845143318176,
-0.002892867662012577,
-0.021098729223012924,
-0.10148956626653671,
-0.015995139256119728,
0.02491818368434906,
0.0005505141452886164,
-0.14534035325050354,
-0.054077308624982834,
0.019842583686113358,
-0.1618579775094986,
0.03220152109861374,
-0.027691856026649475,
0.04522697255015373,
0.04093098267912865,
-0.0447295606136322,
-0.03263319656252861,
0.012876871041953564,
0.019164353609085083,
-0.0156744085252285,
-0.2759932279586792,
-0.016470206901431084,
-0.03563441336154938,
0.16242848336696625,
-0.2516782283782959,
0.04283948242664337,
0.05301886796951294,
0.13206687569618225,
0.01450103335082531,
-0.027684900909662247,
0.01769263669848442,
-0.06721992790699005,
-0.03368232399225235,
-0.06463972479104996,
-0.013003064319491386,
-0.038854893296957016,
-0.046517930924892426,
0.033844344317913055,
-0.16243807971477509,
-0.04162684455513954,
0.11188272386789322,
0.03858201205730438,
-0.15623866021633148,
-0.03822393715381622,
-0.04364866018295288,
-0.054222021251916885,
-0.07046913355588913,
-0.052408888936042786,
0.1032525822520256,
0.052487947046756744,
0.058352477848529816,
-0.06068374589085579,
-0.0675206333398819,
0.007080634590238333,
-0.023098701611161232,
-0.019511302933096886,
0.0780739039182663,
0.06551919132471085,
-0.12371252477169037,
0.09357328712940216,
0.09488903731107712,
0.08224551379680634,
0.09777180850505829,
0.004860724322497845,
-0.08961226046085358,
-0.03159978613257408,
0.026874035596847534,
0.015613501891493797,
0.15007883310317993,
-0.023757725954055786,
0.04783238098025322,
0.03627374395728111,
-0.0069470712915062904,
0.005104671698063612,
-0.09547439962625504,
0.03146127611398697,
0.027004197239875793,
-0.010457582771778107,
0.045607659965753555,
-0.05758532136678696,
0.016228504478931427,
0.10561399161815643,
0.042750515043735504,
0.04958893731236458,
0.008061937056481838,
-0.04876195266842842,
-0.11458532512187958,
0.17577087879180908,
-0.12034392356872559,
-0.23503929376602173,
-0.11996304988861084,
-0.010993306525051594,
0.034179072827100754,
-0.007477967068552971,
0.022685132920742035,
-0.07265906035900116,
-0.11820217967033386,
-0.09142062067985535,
0.04840506240725517,
0.05493762716650963,
-0.08353719115257263,
-0.06416553258895874,
0.0699615478515625,
0.04743001237511635,
-0.1389177143573761,
0.02380296215415001,
0.034620750695466995,
-0.0872102677822113,
0.0029819237533956766,
0.08519035577774048,
0.05888642370700836,
0.18008075654506683,
0.010788527317345142,
-0.02595219388604164,
0.01976708509027958,
0.2022116482257843,
-0.13552935421466827,
0.10456039011478424,
0.13558508455753326,
-0.06794831901788712,
0.08161590993404388,
0.20791222155094147,
0.041511692106723785,
-0.10927121341228485,
0.045256976038217545,
0.03615958243608475,
-0.022633234038949013,
-0.24851085245609283,
-0.07810220122337341,
0.009074265137314796,
-0.06641479581594467,
0.07369695603847504,
0.08135901391506195,
0.09815715998411179,
0.016307005658745766,
-0.10439245402812958,
-0.05691203474998474,
0.05359639227390289,
0.11178097128868103,
0.0005312497960403562,
-0.015559414401650429,
0.09605181962251663,
-0.019349027425050735,
0.022786341607570648,
0.09166137129068375,
0.003685855306684971,
0.18121956288814545,
0.05096123367547989,
0.14115169644355774,
0.08846595138311386,
0.05260030925273895,
0.010571611113846302,
0.00519154267385602,
0.01690458506345749,
0.022460386157035828,
-0.016591908410191536,
-0.08765360713005066,
-0.0025916658341884613,
0.1303529143333435,
0.023996995761990547,
0.04771316051483154,
0.005256648175418377,
-0.041017983108758926,
0.08706887066364288,
0.1720123142004013,
0.015851903706789017,
-0.19922499358654022,
-0.06961675733327866,
0.07131917029619217,
-0.08101397007703781,
-0.10567333549261093,
-0.03051498904824257,
0.03491778299212456,
-0.17742227017879486,
0.018296167254447937,
-0.022482281550765038,
0.10083698481321335,
-0.12198518961668015,
-0.014369131997227669,
0.050927821546792984,
0.07674054801464081,
-0.016325542703270912,
0.06606277078390121,
-0.18025541305541992,
0.13103583455085754,
0.016894888132810593,
0.07214778661727905,
-0.0892268717288971,
0.08784275501966476,
0.0024716476909816265,
0.0012896760599687696,
0.14359614253044128,
0.002044856082648039,
-0.05432117357850075,
-0.10964708775281906,
-0.08345004916191101,
-0.013369085267186165,
0.12828469276428223,
-0.1321774125099182,
0.0964808538556099,
-0.018887361511588097,
-0.046898361295461655,
0.005365802440792322,
-0.12184431403875351,
-0.1419176608324051,
-0.1701805591583252,
0.04523906111717224,
-0.1329699605703354,
0.04354427382349968,
-0.10502105951309204,
-0.04792432487010956,
-0.04497060924768448,
0.19979332387447357,
-0.2204522341489792,
-0.07042528688907623,
-0.15197555720806122,
-0.05513730272650719,
0.11936050653457642,
-0.045758746564388275,
0.08609742671251297,
0.014394130557775497,
0.18831388652324677,
0.01490827463567257,
-0.012921185232698917,
0.11221019923686981,
-0.10288592427968979,
-0.20936010777950287,
-0.10631230473518372,
0.13907743990421295,
0.14045150578022003,
0.03718477860093117,
0.00017439395014662296,
0.030241772532463074,
-0.008310341276228428,
-0.11130935698747635,
0.02226727083325386,
0.1818859577178955,
0.11698096990585327,
0.034827422350645065,
-0.04558476433157921,
-0.1282479166984558,
-0.08240164071321487,
-0.04210341349244118,
0.013022612780332565,
0.19483055174350739,
-0.07285630702972412,
0.1734362244606018,
0.15553328394889832,
-0.058381255716085434,
-0.19954784214496613,
0.03224672004580498,
0.04052870720624924,
0.0018155946163460612,
0.05836709216237068,
-0.20120330154895782,
0.09613421559333801,
0.003965679090470076,
-0.05700134113430977,
0.12291120737791061,
-0.1794978678226471,
-0.1494377702474594,
0.05977441370487213,
0.06504373997449875,
-0.18471670150756836,
-0.12030193209648132,
-0.0906098261475563,
-0.04907683655619621,
-0.11491015553474426,
0.08198710530996323,
-0.004478295799344778,
0.010844743810594082,
0.03223739564418793,
0.016181785613298416,
0.011378921568393707,
-0.04127343371510506,
0.18362799286842346,
-0.008517292328178883,
0.048652928322553635,
-0.08058580756187439,
-0.06042208895087242,
0.04949076101183891,
-0.06943603605031967,
0.07448729127645493,
-0.010821190662682056,
0.013506816700100899,
-0.10542158782482147,
-0.05650888383388519,
-0.02736050635576248,
0.022536376491189003,
-0.08460354059934616,
-0.10347003489732742,
-0.03973411023616791,
0.10050813108682632,
0.09168742597103119,
-0.04071033000946045,
-0.05761529505252838,
-0.08062192797660828,
0.0355151891708374,
0.19572728872299194,
0.17283788323402405,
0.05753122270107269,
-0.0621308870613575,
-0.0033672878053039312,
-0.012863550335168839,
0.0513107031583786,
-0.22106507420539856,
0.05985407903790474,
0.037558674812316895,
0.030016964301466942,
0.11966370791196823,
-0.024740351364016533,
-0.16162925958633423,
-0.055835455656051636,
0.055034637451171875,
-0.07074437290430069,
-0.16294218599796295,
0.012024333700537682,
0.0750519260764122,
-0.1531451791524887,
-0.023071492090821266,
0.047226645052433014,
-0.022888783365488052,
-0.03552026301622391,
0.008166993036866188,
0.07943081855773926,
0.011047901585698128,
0.08480099588632584,
0.05661538988351822,
0.09406866878271103,
-0.09947272390127182,
0.06839016079902649,
0.08378785848617554,
-0.09536738693714142,
0.036548908799886703,
0.06979306042194366,
-0.06812794506549835,
-0.036443546414375305,
0.04483252763748169,
0.09306086599826813,
0.03461608663201332,
-0.0496140718460083,
0.009975533001124859,
-0.0991714745759964,
0.05783834680914879,
0.11240030825138092,
0.04345378279685974,
0.005570659413933754,
0.03424717113375664,
0.04788278043270111,
-0.09421747177839279,
0.12488389015197754,
0.03327573090791702,
0.028144897893071175,
-0.04884033277630806,
-0.030886849388480186,
0.03401049226522446,
-0.025599602609872818,
-0.013930507935583591,
-0.03926628828048706,
-0.06888161599636078,
-0.013594170100986958,
-0.17240136861801147,
-0.0029535088688135147,
-0.0383978933095932,
0.006674154195934534,
0.018208200111985207,
-0.03393350914120674,
0.010948311537504196,
0.016729488968849182,
-0.0701742097735405,
-0.056464649736881256,
-0.010333591140806675,
0.10393849015235901,
-0.1704758256673813,
0.009352225810289383,
0.07282818108797073,
-0.12330053001642227,
0.0882670134305954,
0.01709691248834133,
0.009928959421813488,
0.03636078163981438,
-0.12878885865211487,
0.04661022499203682,
-0.00919650960713625,
0.014418895356357098,
0.0528080016374588,
-0.21462877094745636,
-0.0046529583632946014,
-0.05383024364709854,
-0.05298306420445442,
-0.009345833212137222,
-0.03066449612379074,
-0.11840657889842987,
0.10337110608816147,
0.00554006127640605,
-0.07858934998512268,
-0.027129173278808594,
0.03534814342856407,
0.08033894002437592,
-0.02888466790318489,
0.15499885380268097,
-0.014964357949793339,
0.06925736367702484,
-0.18353688716888428,
-0.023686321452260017,
-0.017806880176067352,
0.02434595860540867,
-0.03405971825122833,
-0.01627812534570694,
0.04919828101992607,
-0.026282284408807755,
0.19468159973621368,
-0.01404520682990551,
0.053027138113975525,
0.06716758012771606,
-0.01731599122285843,
-0.025200918316841125,
0.10833217203617096,
0.05057381093502045,
0.012596861459314823,
0.03097045235335827,
0.002724191639572382,
-0.032638128846883774,
-0.0015376374358311296,
-0.16326214373111725,
0.07423728704452515,
0.1683049499988556,
0.0821618065237999,
-0.012878294102847576,
0.059728000313043594,
-0.11478280276060104,
-0.12184109538793564,
0.10090997815132141,
-0.05217669531702995,
-0.014969355426728725,
-0.058527734130620956,
0.1378937065601349,
0.14838765561580658,
-0.1930767446756363,
0.0610617995262146,
-0.06931009143590927,
-0.04964228719472885,
-0.10728566348552704,
-0.1681494265794754,
-0.057678062468767166,
-0.05907471105456352,
-0.0213418398052454,
-0.053565166890621185,
0.0680057555437088,
0.07332909107208252,
0.014450018294155598,
0.013244451954960823,
0.07624361664056778,
-0.018915481865406036,
0.008525490760803223,
0.028719795867800713,
0.06702575832605362,
0.009052124805748463,
-0.043495066463947296,
0.012300699949264526,
-0.005083650816231966,
0.03453131020069122,
0.04542510211467743,
0.034886524081230164,
-0.02928536757826805,
0.007193939294666052,
-0.027187315747141838,
-0.11160685122013092,
0.042607881128787994,
-0.024967635050415993,
-0.06782487779855728,
0.13797178864479065,
0.028327153995633125,
-0.009714162908494473,
-0.02516608126461506,
0.2606247663497925,
-0.07522029429674149,
-0.09206689894199371,
-0.1338571459054947,
0.14171090722084045,
-0.024645673111081123,
0.06760124862194061,
0.033446263521909714,
-0.11445143073797226,
0.026036731898784637,
0.1324138194322586,
0.14793062210083008,
-0.05245824158191681,
0.017998777329921722,
0.021558376029133797,
0.0024578277952969074,
-0.040562503039836884,
0.050375621765851974,
0.07319269329309464,
0.12713441252708435,
-0.052478544414043427,
0.08453772962093353,
-0.005550839006900787,
-0.09648649394512177,
-0.02982695773243904,
0.11937443912029266,
-0.006796215195208788,
0.0190102681517601,
-0.06431698054075241,
0.12723416090011597,
-0.0369122140109539,
-0.2703137695789337,
0.06783135235309601,
-0.06748911738395691,
-0.14737756550312042,
-0.02443784847855568,
0.05718240514397621,
-0.01098309550434351,
0.0282635148614645,
0.06607423722743988,
-0.07097610086202621,
0.19599688053131104,
0.035155706107616425,
-0.04580609127879143,
-0.06696486473083496,
0.07266953587532043,
-0.10395272821187973,
0.29104846715927124,
0.008048768155276775,
0.058571070432662964,
0.09949593245983124,
-0.0245056115090847,
-0.13227766752243042,
0.028358159586787224,
0.0850125253200531,
-0.07332444190979004,
0.05352538824081421,
0.21757395565509796,
-0.013260896317660809,
0.11338580399751663,
0.07414449006319046,
-0.10197935998439789,
0.050293050706386566,
-0.10382803529500961,
-0.09831703454256058,
-0.08538446575403214,
0.09536542743444443,
-0.0577712245285511,
0.14520715177059174,
0.1225363090634346,
-0.0463496632874012,
0.021422745659947395,
-0.023490076884627342,
0.046754222363233566,
0.010136643424630165,
0.12518633902072906,
0.014053969644010067,
-0.19339033961296082,
0.027584588155150414,
0.00011195550177944824,
0.10043127089738846,
-0.20675040781497955,
-0.10115297138690948,
0.053753241896629333,
0.0020328264217823744,
-0.06096027418971062,
0.12357921153306961,
0.05461353063583374,
0.04136013612151146,
-0.04750092700123787,
-0.030941203236579895,
-0.010107891634106636,
0.1610364019870758,
-0.10999637842178345,
-0.003804087173193693
] |
null | null | transformers |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# ViT-emotion-classification
This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on the imagefolder dataset.
It achieves the following results on the evaluation set:
- Loss: 1.2807
- Accuracy: 0.525
## Model description
More information needed
## Intended uses & limitations
More information needed
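
Until the card is completed, a hedged inference sketch may help: the repo id below matches this card, but the emotion label names are assumptions until `id2label` is inspected.

```python
from transformers import pipeline

# Loads the fine-tuned checkpoint; labels come from the unpublished imagefolder
# dataset, so check classifier.model.config.id2label for the actual emotion names.
classifier = pipeline("image-classification", model="felitrisnanto/ViT-emotion-classification")
print(classifier("face.jpg"))  # "face.jpg" is a placeholder path to any local image
```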
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 20
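
These settings map onto `transformers.TrainingArguments` roughly as follows. This is a sketch only, since the output directory and the exact `Trainer` wiring are not stated in the card:

```python
from transformers import TrainingArguments

# Mirrors the hyperparameters listed above; "vit-emotion" is a placeholder output dir.
training_args = TrainingArguments(
    output_dir="vit-emotion",
    learning_rate=5e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=20,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```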
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| No log | 1.0 | 40 | 1.8038 | 0.3875 |
| No log | 2.0 | 80 | 1.5444 | 0.4125 |
| No log | 3.0 | 120 | 1.4651 | 0.4188 |
| No log | 4.0 | 160 | 1.3985 | 0.4562 |
| No log | 5.0 | 200 | 1.2891 | 0.525 |
| No log | 6.0 | 240 | 1.2928 | 0.5 |
| No log | 7.0 | 280 | 1.3412 | 0.5 |
| No log | 8.0 | 320 | 1.3548 | 0.475 |
| No log | 9.0 | 360 | 1.2867 | 0.5312 |
| No log | 10.0 | 400 | 1.3636 | 0.475 |
| No log | 11.0 | 440 | 1.3431 | 0.5188 |
| No log | 12.0 | 480 | 1.2872 | 0.5312 |
| 1.0092 | 13.0 | 520 | 1.3491 | 0.525 |
| 1.0092 | 14.0 | 560 | 1.2864 | 0.5437 |
| 1.0092 | 15.0 | 600 | 1.3278 | 0.5312 |
| 1.0092 | 16.0 | 640 | 1.3772 | 0.5062 |
| 1.0092 | 17.0 | 680 | 1.4458 | 0.5 |
| 1.0092 | 18.0 | 720 | 1.3208 | 0.525 |
| 1.0092 | 19.0 | 760 | 1.4037 | 0.5 |
| 1.0092 | 20.0 | 800 | 1.2810 | 0.5375 |
### Framework versions
- Transformers 4.35.2
- Pytorch 2.1.0+cu121
- Datasets 2.17.0
- Tokenizers 0.15.1
| {"license": "apache-2.0", "tags": ["generated_from_trainer"], "datasets": ["imagefolder"], "metrics": ["accuracy"], "base_model": "google/vit-base-patch16-224-in21k", "model-index": [{"name": "ViT-emotion-classification", "results": [{"task": {"type": "image-classification", "name": "Image Classification"}, "dataset": {"name": "imagefolder", "type": "imagefolder", "config": "default", "split": "train", "args": "default"}, "metrics": [{"type": "accuracy", "value": 0.525, "name": "Accuracy"}]}]}]} | image-classification | felitrisnanto/ViT-emotion-classification | [
"transformers",
"tensorboard",
"safetensors",
"vit",
"image-classification",
"generated_from_trainer",
"dataset:imagefolder",
"base_model:google/vit-base-patch16-224-in21k",
"license:apache-2.0",
"model-index",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | 2024-02-13T10:54:06+00:00 | [] | [] | TAGS
#transformers #tensorboard #safetensors #vit #image-classification #generated_from_trainer #dataset-imagefolder #base_model-google/vit-base-patch16-224-in21k #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #region-us
| ViT-emotion-classification
==========================
This model is a fine-tuned version of google/vit-base-patch16-224-in21k on the imagefolder dataset.
It achieves the following results on the evaluation set:
* Loss: 1.2807
* Accuracy: 0.525
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 5e-05
* train\_batch\_size: 16
* eval\_batch\_size: 16
* seed: 42
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* num\_epochs: 20
### Training results
### Framework versions
* Transformers 4.35.2
* Pytorch 2.1.0+cu121
* Datasets 2.17.0
* Tokenizers 0.15.1
| [
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 5e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 20",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.35.2\n* Pytorch 2.1.0+cu121\n* Datasets 2.17.0\n* Tokenizers 0.15.1"
] | [
"TAGS\n#transformers #tensorboard #safetensors #vit #image-classification #generated_from_trainer #dataset-imagefolder #base_model-google/vit-base-patch16-224-in21k #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 5e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 20",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.35.2\n* Pytorch 2.1.0+cu121\n* Datasets 2.17.0\n* Tokenizers 0.15.1"
] | [
86,
98,
4,
33
] | [
"passage: TAGS\n#transformers #tensorboard #safetensors #vit #image-classification #generated_from_trainer #dataset-imagefolder #base_model-google/vit-base-patch16-224-in21k #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 5e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 20### Training results### Framework versions\n\n\n* Transformers 4.35.2\n* Pytorch 2.1.0+cu121\n* Datasets 2.17.0\n* Tokenizers 0.15.1"
] | [
-0.11862684786319733,
0.13607098162174225,
-0.0025763940066099167,
0.11910273134708405,
0.1404537558555603,
0.004346479661762714,
0.14077715575695038,
0.1390070766210556,
-0.07172218710184097,
0.08159811049699783,
0.1491062492132187,
0.1303545981645584,
0.030898649245500565,
0.18905162811279297,
-0.049535807222127914,
-0.22352102398872375,
0.02580658718943596,
0.0477079339325428,
-0.04870045557618141,
0.12126699090003967,
0.08815955370664597,
-0.1382121741771698,
0.11751259118318558,
0.025848114863038063,
-0.2047291398048401,
-0.007606162689626217,
0.029538599774241447,
-0.05819394066929817,
0.1168045625090599,
0.0373651385307312,
0.09160982817411423,
0.028289178386330605,
0.05270405858755112,
-0.1481442153453827,
0.010206367820501328,
0.07640063017606735,
-0.009214755147695541,
0.09723441302776337,
0.05622915178537369,
0.011492506600916386,
0.016050094738602638,
-0.09463026374578476,
0.04004139453172684,
0.025265377014875412,
-0.11202917993068695,
-0.23511426150798798,
-0.08598961681127548,
0.059083208441734314,
0.0774804875254631,
0.07061222195625305,
-0.002693967893719673,
0.1436288058757782,
-0.005592017434537411,
0.09797509014606476,
0.22986344993114471,
-0.27418267726898193,
-0.07741672545671463,
0.03757063299417496,
0.017696822062134743,
0.07755748927593231,
-0.1007639467716217,
0.012463548220694065,
0.059283606708049774,
0.013740353286266327,
0.15326596796512604,
-0.00546793220564723,
-0.013231445103883743,
-0.026213020086288452,
-0.12508971989154816,
-0.06421228498220444,
0.19508886337280273,
0.0898618996143341,
-0.0470237098634243,
-0.08200053870677948,
-0.08072874695062637,
-0.13877563178539276,
-0.046653974801301956,
-0.011338525451719761,
0.057744529098272324,
-0.03410860896110535,
-0.06401600688695908,
-0.03429725766181946,
-0.09819400310516357,
-0.06963552534580231,
-0.015037794597446918,
0.09685873240232468,
0.053985342383384705,
0.01292209792882204,
-0.021191341802477837,
0.0808539018034935,
-0.04242787882685661,
-0.14507076144218445,
-0.006704701576381922,
0.01800202764570713,
0.025464244186878204,
-0.03126758337020874,
-0.025375982746481895,
-0.11615971475839615,
0.020918212831020355,
0.10874377191066742,
-0.06972605735063553,
0.05691898241639137,
-0.02163911797106266,
0.05179855599999428,
-0.11123023927211761,
0.19115161895751953,
-0.04929012805223465,
0.0159612987190485,
0.04046659544110298,
0.1017889603972435,
0.05127325654029846,
-0.002425429644063115,
-0.1056673526763916,
0.017791470512747765,
0.12042375653982162,
0.0049453722313046455,
-0.03495030477643013,
0.08120119571685791,
-0.0614188052713871,
-0.029111526906490326,
0.07602670043706894,
-0.08666333556175232,
0.026919234544038773,
-0.006897526793181896,
-0.05354881286621094,
-0.0565594881772995,
0.0469067320227623,
-0.011913116089999676,
-0.013371805660426617,
0.04143775254487991,
-0.10228675603866577,
0.012035829946398735,
-0.06763049215078354,
-0.1080244854092598,
0.01311678346246481,
-0.11561828851699829,
0.015319262631237507,
-0.12210355699062347,
-0.1382761150598526,
-0.013023408129811287,
0.05989771708846092,
-0.029016714543104172,
-0.05163951590657234,
-0.04167177900671959,
-0.08057110011577606,
0.026785289868712425,
0.005231655668467283,
0.04709792509675026,
-0.05685161426663399,
0.0881868526339531,
0.044517483562231064,
0.07619098573923111,
-0.019790910184383392,
0.04638770967721939,
-0.0868847668170929,
0.058167941868305206,
-0.2046217918395996,
0.03671747073531151,
-0.05808434262871742,
0.08664451539516449,
-0.1191207617521286,
-0.08627723902463913,
0.0009071230888366699,
-0.020160868763923645,
0.06494573503732681,
0.1077149286866188,
-0.14049723744392395,
-0.05874790623784065,
0.17436181008815765,
-0.10219626873731613,
-0.15532821416854858,
0.11266045272350311,
-0.03060438670217991,
0.029268037527799606,
0.05617973208427429,
0.19773077964782715,
0.07850029319524765,
-0.10863172262907028,
-0.006727805361151695,
-0.031635742634534836,
0.0340501144528389,
-0.053646519780159,
0.07568536698818207,
-0.0005131050711497664,
-0.01108004990965128,
0.022279467433691025,
-0.09523792564868927,
0.06257777661085129,
-0.07312074303627014,
-0.08432256430387497,
-0.0668586865067482,
-0.08747415244579315,
0.041866235435009,
0.05933330953121185,
0.06495782732963562,
-0.1016748771071434,
-0.09083876758813858,
0.028826074674725533,
0.08167547732591629,
-0.09448417276144028,
0.016852281987667084,
-0.08169770985841751,
0.11140599846839905,
-0.1074780747294426,
0.0014938017120584846,
-0.13464124500751495,
-0.02949785627424717,
0.049237966537475586,
-0.0625208243727684,
-0.008478378877043724,
-0.03656590357422829,
0.07331328839063644,
0.06236502155661583,
-0.06268363445997238,
-0.07329478859901428,
-0.03986898064613342,
-0.0025519083719700575,
-0.09935712814331055,
-0.19413237273693085,
-0.023727186024188995,
-0.026893194764852524,
0.10660623013973236,
-0.2199222296476364,
0.04163808375597,
0.05292150005698204,
0.10208848863840103,
0.05913138389587402,
-0.031235113739967346,
0.0039437394589185715,
0.017344877123832703,
-0.03981004282832146,
-0.09011241048574448,
0.061082541942596436,
0.014722965657711029,
-0.06827579438686371,
0.0051209270022809505,
-0.10053195804357529,
0.17472368478775024,
0.13116171956062317,
-0.033463917672634125,
-0.06332863867282867,
-0.0043567162938416,
-0.04356212913990021,
-0.03498488664627075,
-0.03569559007883072,
0.008986911736428738,
0.08051126450300217,
-0.008868166245520115,
0.16240540146827698,
-0.10505981743335724,
-0.026039814576506615,
0.058241575956344604,
-0.02974027581512928,
-0.03810049220919609,
0.08953040093183517,
0.06744997948408127,
-0.13986307382583618,
0.14738419651985168,
0.1662409007549286,
-0.06822194904088974,
0.12521155178546906,
-0.04773518443107605,
-0.06236952543258667,
-0.024040281772613525,
0.04132981598377228,
0.03265652805566788,
0.1292601376771927,
-0.1198551282286644,
-0.01395948976278305,
0.024260707199573517,
0.0016625267453491688,
-0.008375790901482105,
-0.20100361108779907,
-0.008371683768928051,
0.038532670587301254,
-0.05972742661833763,
0.02609979175031185,
-0.005517373792827129,
-0.021567916497588158,
0.08489884436130524,
0.008348884992301464,
-0.042639680206775665,
0.047265421599149704,
0.011100099422037601,
-0.06991036236286163,
0.19499625265598297,
-0.08492761105298996,
-0.21661029756069183,
-0.13107283413410187,
-0.020954333245754242,
-0.08055935055017471,
0.021638698875904083,
0.05819288641214371,
-0.09382759034633636,
-0.057645127177238464,
-0.10397281497716904,
-0.015600989572703838,
0.028978293761610985,
0.03961782902479172,
0.041902970522642136,
-0.0019354389514774084,
0.1307423710823059,
-0.09929072856903076,
-0.007536566816270351,
-0.010077708400785923,
-0.023466931656003,
0.048148881644010544,
0.01983301155269146,
0.12009347230195999,
0.08664698898792267,
-0.027577845379710197,
0.03478600084781647,
-0.020831966772675514,
0.2411278486251831,
-0.07336857914924622,
-0.0025067345704883337,
0.15122878551483154,
0.019550248980522156,
0.06790269911289215,
0.13012559711933136,
0.03827866166830063,
-0.10251547396183014,
0.007758599240332842,
0.021663567051291466,
-0.02579859271645546,
-0.1871739774942398,
-0.0171548780053854,
-0.03924945369362831,
-0.00444507272914052,
0.152572363615036,
0.05523106828331947,
0.061344489455223083,
0.09337121248245239,
0.00033120723674073815,
0.09185243397951126,
-0.004542561247944832,
0.08808216452598572,
0.11049571633338928,
0.045895807445049286,
0.10933946818113327,
-0.0441078245639801,
-0.028400860726833344,
0.03270525485277176,
0.014624936506152153,
0.2257072478532791,
0.0014615242835134268,
0.1745476871728897,
0.04722040519118309,
0.1896134614944458,
0.017876673489809036,
0.05607917532324791,
-0.020971206948161125,
-0.02658970281481743,
-0.009364157915115356,
-0.05445526912808418,
-0.021531760692596436,
0.03677821531891823,
-0.050751328468322754,
0.0655994564294815,
-0.0925455167889595,
0.04166332259774208,
0.06377926468849182,
0.2635495066642761,
0.0365661159157753,
-0.3789684474468231,
-0.09498053789138794,
-0.0053971679881215096,
-0.014488224871456623,
-0.06512360274791718,
0.0023779887706041336,
0.14586958289146423,
-0.0612533763051033,
0.060258202254772186,
-0.10404929518699646,
0.08211296051740646,
-0.051383692771196365,
0.021347982808947563,
0.07492318749427795,
0.09016896039247513,
0.008379991166293621,
0.056661225855350494,
-0.2485174536705017,
0.2579330503940582,
0.01591283641755581,
0.06340377777814865,
-0.04800954461097717,
0.01297358050942421,
0.03521382436156273,
0.10639113932847977,
0.1099725291132927,
-0.005465359892696142,
-0.01662856712937355,
-0.17460858821868896,
-0.08823971450328827,
0.006351474206894636,
0.07203894853591919,
-0.045675501227378845,
0.08031121641397476,
-0.030152365565299988,
-0.023747151717543602,
0.05103715881705284,
-0.0034925774671137333,
-0.08945973217487335,
-0.09370698034763336,
-0.007539461366832256,
0.04352934658527374,
0.015408708713948727,
-0.09620650857686996,
-0.09712299704551697,
-0.10361526906490326,
0.13423673808574677,
-0.01791711524128914,
-0.0405256412923336,
-0.12018390744924545,
0.0892866775393486,
0.05816558748483658,
-0.09259811788797379,
0.08043588697910309,
-0.028549257665872574,
0.13836847245693207,
0.029281003400683403,
-0.062355563044548035,
0.10987292975187302,
-0.059276655316352844,
-0.17301121354103088,
-0.046683523803949356,
0.10513435304164886,
-0.01869458332657814,
0.024247588589787483,
0.0005111363134346902,
0.028401998803019524,
-0.011718365363776684,
-0.058764077723026276,
0.05756045877933502,
0.014515225775539875,
0.05889863893389702,
-0.014903100207448006,
-0.019676700234413147,
0.007737073581665754,
-0.06332728266716003,
-0.029787488281726837,
0.13461267948150635,
0.2437446564435959,
-0.09745744615793228,
0.004943002015352249,
0.018968096002936363,
-0.051279183477163315,
-0.19513830542564392,
0.048698753118515015,
0.06568527966737747,
0.0020221148151904345,
0.0303266029804945,
-0.15475165843963623,
0.07322143018245697,
0.08173877745866776,
-0.030024560168385506,
0.09476637840270996,
-0.265503466129303,
-0.13263669610023499,
0.08120943605899811,
0.1838640421628952,
0.06726694107055664,
-0.14426320791244507,
-0.05513838306069374,
-0.012923520989716053,
-0.09017108380794525,
0.09474195539951324,
-0.06171131134033203,
0.10445062071084976,
-0.02821234054863453,
0.003340694820508361,
0.006336088292300701,
-0.0574214793741703,
0.1290079802274704,
-0.033671990036964417,
0.10798531025648117,
-0.05707699805498123,
-0.009296839125454426,
0.07792694121599197,
-0.07718640565872192,
0.06401468813419342,
-0.09070459008216858,
0.06322236359119415,
-0.061469778418540955,
-0.01584113948047161,
-0.0700555294752121,
0.03382529690861702,
-0.018966028466820717,
-0.025605415925383568,
-0.05134887993335724,
0.02348378114402294,
0.05129716917872429,
-0.0009205437381751835,
0.20003315806388855,
0.05019812658429146,
0.08837173134088516,
0.13760314881801605,
0.04343097656965256,
-0.07744259387254715,
-0.09973973035812378,
-0.027537761256098747,
-0.026997331529855728,
0.0875159427523613,
-0.18562975525856018,
0.04973244294524193,
0.09580063819885254,
0.011054624803364277,
0.1439087837934494,
0.045920830219984055,
-0.03386397287249565,
0.017781836912035942,
0.070749931037426,
-0.15560059249401093,
-0.1623297780752182,
-0.031811706721782684,
-0.016942474991083145,
-0.11406322568655014,
0.06307488679885864,
0.11163705587387085,
-0.08484364300966263,
0.0035264091566205025,
-0.008415672928094864,
0.014919687993824482,
-0.002409477951005101,
0.16151992976665497,
0.08037012070417404,
0.044436246156692505,
-0.09206259250640869,
0.09767723828554153,
0.054202839732170105,
-0.1045539379119873,
0.023341206833720207,
0.025801705196499825,
-0.10466807335615158,
-0.03715038672089577,
0.06702449917793274,
0.14704498648643494,
0.0009470285149291158,
-0.05198730155825615,
-0.14375616610050201,
-0.09377666562795639,
0.057217106223106384,
0.12533320486545563,
0.09250355511903763,
0.01556249987334013,
-0.011633376590907574,
-0.0002801600785460323,
-0.10414283722639084,
0.11900807917118073,
0.03160611540079117,
0.09795945137739182,
-0.2185405194759369,
0.06070994958281517,
0.01890115812420845,
0.031268004328012466,
-0.019250933080911636,
0.029586996883153915,
-0.09804419428110123,
-0.01634339988231659,
-0.05997609347105026,
0.04208562150597572,
-0.036590032279491425,
0.005531745962798595,
-0.0064903125166893005,
-0.06879284232854843,
-0.06098901107907295,
0.04104362055659294,
-0.10035879909992218,
-0.046047866344451904,
0.03615729510784149,
0.07032033801078796,
-0.10205142945051193,
-0.029235344380140305,
0.025446485728025436,
-0.08039696514606476,
0.0790892168879509,
0.014095950871706009,
0.0005695505533367395,
0.023463815450668335,
-0.10042203217744827,
0.010635514743626118,
0.08415917307138443,
0.0029707292560487986,
0.030986838042736053,
-0.10393767058849335,
0.007057030685245991,
-0.0011036454234272242,
0.0009315675706602633,
-0.008648671209812164,
0.10397418588399887,
-0.13311968743801117,
-0.025813480839133263,
-0.037991661578416824,
-0.03324704244732857,
-0.05948195978999138,
0.06284357607364655,
0.08606432378292084,
-0.0030004456639289856,
0.1998845636844635,
-0.08585313707590103,
0.00010894424485741183,
-0.2236366719007492,
0.004054774064570665,
-0.004355728160589933,
-0.1370699256658554,
-0.12498512864112854,
-0.028259091079235077,
0.05320440232753754,
-0.07354100048542023,
0.09616898745298386,
0.016123060137033463,
0.005351166240870953,
0.03564183786511421,
0.0026460830122232437,
-0.0028193816542625427,
0.02624340169131756,
0.18615828454494476,
-0.007491840049624443,
-0.02064543217420578,
0.07138451188802719,
0.018578987568616867,
0.11744797974824905,
0.08343684673309326,
0.0985216423869133,
0.1616794615983963,
-0.04428549483418465,
0.10382276773452759,
0.049792271107435226,
-0.0221316646784544,
-0.1735774725675583,
0.10315366834402084,
-0.07480514794588089,
0.1442563533782959,
-0.013153218664228916,
0.16521982848644257,
0.12247242778539658,
-0.1585603505373001,
0.028127895668148994,
-0.028060633689165115,
-0.07330657541751862,
-0.0704883560538292,
-0.14566665887832642,
-0.11752930283546448,
-0.18573613464832306,
0.014560677111148834,
-0.09806596487760544,
0.007403009105473757,
0.07112178951501846,
-0.009295541793107986,
-0.02352297119796276,
0.20558007061481476,
0.05049857869744301,
-0.002510844962671399,
0.07032909989356995,
0.0015775273786857724,
-0.06872397661209106,
-0.061035364866256714,
-0.08496210724115372,
0.03848041966557503,
-0.00851019099354744,
0.03335035592317581,
-0.02992033213376999,
-0.005304316058754921,
0.0488789863884449,
-0.00015379609249066561,
-0.1098908856511116,
0.018044419586658478,
0.014951071701943874,
0.010534374043345451,
0.0008333100704476237,
0.004809625446796417,
0.006202142685651779,
-0.009944885037839413,
0.1820433884859085,
-0.05484237149357796,
-0.006836243439465761,
-0.11771633476018906,
0.12500759959220886,
0.027567386627197266,
-0.01616005413234234,
0.029098497703671455,
-0.07949063926935196,
0.026830408722162247,
0.21858848631381989,
0.14383293688297272,
-0.0167448278516531,
-0.0014522324781864882,
-0.00857983622699976,
-0.020643005147576332,
-0.03108217939734459,
0.09174125641584396,
0.09339619427919388,
-0.0447763167321682,
-0.05456433817744255,
-0.022962506860494614,
-0.04835779592394829,
-0.016245421022176743,
-0.03796777129173279,
0.039009418338537216,
0.01815023086965084,
0.01443538349121809,
-0.06322929263114929,
0.042721226811409,
0.02114632911980152,
-0.07337304949760437,
0.0894613191485405,
-0.19521470367908478,
-0.13894040882587433,
-0.03288603574037552,
0.10051017254590988,
-0.006018717773258686,
0.02896568924188614,
-0.02127797342836857,
0.015905743464827538,
0.06856148689985275,
-0.02353663370013237,
-0.08276108652353287,
-0.09917619824409485,
0.05052541568875313,
-0.1333807408809662,
0.24502243101596832,
-0.032482389360666275,
0.00661885691806674,
0.11178344488143921,
0.018475130200386047,
-0.11790914833545685,
0.05208421126008034,
0.025536421686410904,
-0.03645871579647064,
0.025177190080285072,
0.11177454143762589,
-0.020675083622336388,
0.10754010826349258,
0.034746766090393066,
-0.08897331357002258,
-0.01814214512705803,
-0.054646085947752,
-0.04100141301751137,
-0.05697661265730858,
-0.02364540658891201,
-0.06854301691055298,
0.12410563230514526,
0.173186793923378,
-0.04242303967475891,
-0.02637280523777008,
-0.06457078456878662,
0.037550926208496094,
0.09137038141489029,
0.016641056165099144,
-0.012749847024679184,
-0.2280997335910797,
0.017656221985816956,
0.024374619126319885,
-0.003917396999895573,
-0.2118307650089264,
-0.11007525026798248,
-0.017804253846406937,
-0.053288642317056656,
-0.08763647079467773,
0.08584819734096527,
0.11768825352191925,
0.04970311373472214,
-0.060780398547649384,
-0.04441050440073013,
-0.06915197521448135,
0.1602003425359726,
-0.12230600416660309,
-0.08478182554244995
] |
null | null | transformers |
# Monarch-7B-dare
Monarch-7B-dare is a DARE-TIES merge of the following model onto the base model [mlabonne/OmniTruthyBeagle-7B-v0](https://huggingface.co/mlabonne/OmniTruthyBeagle-7B-v0), made with [LazyMergekit](https://colab.research.google.com/drive/1obulZ1ROXHjYLn6PPZJwRR6GzgQogxxb?usp=sharing):
* [mlabonne/NeuBeagle-7B](https://huggingface.co/mlabonne/NeuBeagle-7B)
## 🧩 Configuration
```yaml
models:
- model: mlabonne/OmniTruthyBeagle-7B-v0
# No parameters necessary for base model
- model: mlabonne/NeuBeagle-7B
parameters:
density: 0.53
weight: 0.45
merge_method: dare_ties
base_model: mlabonne/OmniTruthyBeagle-7B-v0
parameters:
int8_mask: true
dtype: bfloat16
random_seed: 0
```
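
To reproduce the merge locally, a config like the one above can typically be applied with the mergekit CLI (the output path is a placeholder):

```python
# Notebook-style commands, matching the usage snippet below.
!pip install -qU mergekit
!mergekit-yaml config.yaml ./Monarch-7B-dare
```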
## 💻 Usage
```python
!pip install -qU transformers accelerate
from transformers import AutoTokenizer
import transformers
import torch
model = "mlabonne/Monarch-7B-dare"
messages = [{"role": "user", "content": "What is a large language model?"}]
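# Render the chat with the model's template, appending the assistant turn marker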
tokenizer = AutoTokenizer.from_pretrained(model)
prompt = tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)
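# Half-precision generation pipeline; device_map="auto" places weights on available devices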
pipeline = transformers.pipeline(
"text-generation",
model=model,
torch_dtype=torch.float16,
device_map="auto",
)
outputs = pipeline(prompt, max_new_tokens=256, do_sample=True, temperature=0.7, top_k=50, top_p=0.95)
print(outputs[0]["generated_text"])
``` | {"license": "cc-by-nc-4.0", "tags": ["merge", "mergekit", "lazymergekit"], "base_model": ["mlabonne/NeuBeagle-7B"]} | text-generation | mlabonne/Monarch-7B-dare | [
"transformers",
"safetensors",
"mistral",
"text-generation",
"merge",
"mergekit",
"lazymergekit",
"base_model:mlabonne/NeuBeagle-7B",
"license:cc-by-nc-4.0",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | 2024-02-13T10:54:09+00:00 | [] | [] | TAGS
#transformers #safetensors #mistral #text-generation #merge #mergekit #lazymergekit #base_model-mlabonne/NeuBeagle-7B #license-cc-by-nc-4.0 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
# Monarch-7B-dare
Monarch-7B-dare is a merge of the following models using LazyMergekit:
* mlabonne/NeuBeagle-7B
## Configuration
## Usage
| [
"# Monarch-7B-dare\n\nMonarch-7B-dare is a merge of the following models using LazyMergekit:\n* mlabonne/NeuBeagle-7B",
"## Configuration",
"## Usage"
] | [
"TAGS\n#transformers #safetensors #mistral #text-generation #merge #mergekit #lazymergekit #base_model-mlabonne/NeuBeagle-7B #license-cc-by-nc-4.0 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n",
"# Monarch-7B-dare\n\nMonarch-7B-dare is a merge of the following models using LazyMergekit:\n* mlabonne/NeuBeagle-7B",
"## Configuration",
"## Usage"
] | [
86,
39,
4,
3
] | [
"passage: TAGS\n#transformers #safetensors #mistral #text-generation #merge #mergekit #lazymergekit #base_model-mlabonne/NeuBeagle-7B #license-cc-by-nc-4.0 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n# Monarch-7B-dare\n\nMonarch-7B-dare is a merge of the following models using LazyMergekit:\n* mlabonne/NeuBeagle-7B## Configuration## Usage"
] | [
-0.0284448079764843,
0.028060948476195335,
-0.0038970173336565495,
0.021807124838232994,
0.047985274344682693,
0.03521258756518364,
0.15587396919727325,
0.07479581236839294,
-0.024823222309350967,
0.0033024027943611145,
0.11070521920919418,
0.15871171653270721,
-0.005225749686360359,
0.039782680571079254,
-0.0618726909160614,
-0.18333493173122406,
0.10411687195301056,
0.01783667877316475,
-0.04687954857945442,
0.08467476814985275,
0.11099067330360413,
-0.02285853587090969,
0.11638002842664719,
-0.009758180007338524,
-0.10771863907575607,
0.03974653780460358,
-0.005802366882562637,
-0.048182036727666855,
0.09176996350288391,
0.09059058129787445,
0.07988684624433517,
0.10095126926898956,
-0.016918230801820755,
-0.14027869701385498,
0.031214185059070587,
-0.030379178002476692,
-0.0532657690346241,
0.05487216264009476,
0.046648234128952026,
-0.0282446201890707,
0.16030794382095337,
-0.014325558207929134,
0.02728106826543808,
0.06072284281253815,
-0.12409740686416626,
-0.03483833000063896,
-0.06959205120801926,
0.046432215720415115,
0.06105559691786766,
0.03401362523436546,
-0.00626734783872962,
0.13015049695968628,
-0.07416564226150513,
0.0677758976817131,
0.10968784242868423,
-0.3628918528556824,
-0.010132666677236557,
0.1809643805027008,
0.10249120742082596,
0.009418862871825695,
-0.007561436388641596,
0.08039714395999908,
0.021161438897252083,
0.002965138992294669,
0.09426822513341904,
-0.07380183041095734,
0.10719464719295502,
-0.0395156629383564,
-0.11903376877307892,
-0.021836889907717705,
0.1963447630405426,
0.018799804151058197,
0.022486452013254166,
-0.09413517266511917,
-0.07866101711988449,
0.1200326681137085,
-0.06656184792518616,
-0.03835960105061531,
0.02799227088689804,
0.04381044954061508,
0.016480281949043274,
-0.0729670450091362,
-0.04140925407409668,
-0.023656589910387993,
-0.14037255942821503,
0.03875887393951416,
-0.0015406955499202013,
0.003329917788505554,
-0.01722952537238598,
0.05576908588409424,
-0.054570578038692474,
-0.10582875460386276,
-0.020948873832821846,
-0.07053712755441666,
0.04160425066947937,
-0.015225479379296303,
-0.03953855112195015,
-0.016435468569397926,
0.11933793872594833,
0.21431297063827515,
-0.011695986613631248,
0.04801911115646362,
-0.051424603909254074,
0.0838245078921318,
-0.005087177734822035,
-0.00820055790245533,
-0.07853846251964569,
-0.18891392648220062,
0.09082374721765518,
0.03555251657962799,
0.08297956734895706,
-0.009450524114072323,
-0.12161386013031006,
-0.02818628028035164,
-0.011517430655658245,
0.039622046053409576,
0.06241586059331894,
0.0976819172501564,
-0.07602515071630478,
-0.011702075600624084,
0.2625615894794464,
-0.0354459322988987,
0.01846601814031601,
0.005402753129601479,
-0.018857162445783615,
-0.007131497375667095,
0.07470294833183289,
0.0659894123673439,
-0.010138291865587234,
0.036416079849004745,
-0.059217583388090134,
-0.07341603934764862,
-0.011303999461233616,
-0.10322604328393936,
0.02300199680030346,
-0.030072415247559547,
-0.02314285933971405,
-0.0981554388999939,
-0.24118755757808685,
0.02023535966873169,
0.05397965759038925,
-0.05419157072901726,
-0.07295405119657516,
-0.10474951565265656,
-0.02995784394443035,
0.006325979717075825,
-0.010700860060751438,
-0.06042065843939781,
-0.032271645963191986,
0.006889064330607653,
-0.02056485041975975,
0.032854869961738586,
-0.22931966185569763,
0.019727781414985657,
-0.1177680492401123,
0.06270015984773636,
-0.14661356806755066,
0.09323333203792572,
-0.05831260234117508,
0.05169462040066719,
-0.07964543253183365,
-0.0019136775517836213,
-0.0564420185983181,
0.030294151976704597,
0.01813148707151413,
0.13893994688987732,
-0.05015943571925163,
-0.08491319417953491,
0.06967814266681671,
-0.1118755117058754,
-0.1720375269651413,
0.11108309030532837,
0.008505976758897305,
0.10898200422525406,
0.08114071935415268,
0.14878205955028534,
0.05767450109124184,
-0.020324306562542915,
-0.0054306043311953545,
0.04378895461559296,
-0.05785389617085457,
-0.07725381851196289,
0.056723982095718384,
-0.022373812273144722,
-0.09592019021511078,
0.04459754377603531,
0.0172403734177351,
0.05639495700597763,
-0.0035229609347879887,
-0.060470014810562134,
-0.0409075953066349,
-0.0701046735048294,
0.06872133910655975,
-0.024659954011440277,
0.05022289231419563,
-0.07899979501962662,
-0.046848222613334656,
0.11787626147270203,
0.02805100753903389,
-0.03489452973008156,
0.016617007553577423,
-0.0533919595181942,
0.09264581650495529,
-0.04784584790468216,
0.0483282208442688,
-0.07023495435714722,
-0.10555566847324371,
0.0005585001781582832,
0.006008056458085775,
0.04237522557377815,
-0.002260811161249876,
0.05884406343102455,
0.03751802816987038,
-0.09156745672225952,
-0.0002888147428166121,
0.12937946617603302,
0.03805817291140556,
-0.025734513998031616,
-0.19417737424373627,
-0.0015828877221792936,
-0.04129624739289284,
0.15917587280273438,
-0.07239823043346405,
0.07382258027791977,
0.03631861135363579,
0.16145136952400208,
-0.018969379365444183,
0.042848192155361176,
0.02814429998397827,
0.037356775254011154,
-0.0517897866666317,
0.022150246426463127,
0.12402025610208511,
0.008560878224670887,
-0.1501196026802063,
0.15160469710826874,
-0.14636453986167908,
0.1606067419052124,
0.16194066405296326,
-0.06189248338341713,
0.023883718997240067,
-0.006441345438361168,
0.01944556273519993,
-0.067825086414814,
0.07913517206907272,
-0.0866568312048912,
0.053210824728012085,
-0.014998518861830235,
0.13269160687923431,
-0.09488826990127563,
-0.05353406444191933,
0.00836361013352871,
-0.012054456397891045,
-0.06217693164944649,
0.06528164446353912,
0.04482436925172806,
-0.184895858168602,
0.12400483340024948,
0.25183650851249695,
0.0001157306251116097,
0.11348490417003632,
0.005952267441898584,
0.02962317317724228,
-0.018787940964102745,
0.003099357709288597,
0.025125138461589813,
-0.003421242581680417,
-0.09949669986963272,
0.03789942339062691,
0.04483956843614578,
0.023509323596954346,
0.07164844870567322,
-0.06831991672515869,
0.017640847712755203,
-0.008008023723959923,
-0.024588055908679962,
0.02874598279595375,
0.07243847846984863,
-0.011966963298618793,
0.06510668992996216,
0.009584523737430573,
-0.09606173634529114,
0.09195955097675323,
0.010774391703307629,
-0.07332644611597061,
0.13233326375484467,
-0.14519746601581573,
-0.2162758708000183,
-0.18253815174102783,
-0.08641933649778366,
-0.1077168881893158,
-0.006124473176896572,
0.07219899445772171,
0.039351534098386765,
-0.01778203807771206,
-0.1297944039106369,
0.06107057258486748,
0.0005144246970303357,
-0.0522850900888443,
-0.02062460035085678,
0.02618705853819847,
-0.006232411600649357,
-0.13680069148540497,
-0.019424811005592346,
0.03530573844909668,
-0.062028512358665466,
0.051713086664676666,
-0.11800356954336166,
0.09356324374675751,
0.08764668554067612,
0.01133826095610857,
-0.024121642112731934,
-0.05262037739157677,
0.20457138121128082,
-0.014600078575313091,
0.05698864907026291,
0.09822917729616165,
-0.037937816232442856,
0.06559957563877106,
0.19573354721069336,
0.05053526908159256,
-0.067136250436306,
-0.008422392420470715,
-0.03193708509206772,
-0.05540347844362259,
-0.12852129340171814,
-0.1268506795167923,
-0.08653122931718826,
0.081587053835392,
0.016003917902708054,
0.04543724283576012,
0.06659930944442749,
0.05054621770977974,
-0.0608159638941288,
0.005914956331253052,
0.07472006231546402,
0.09310204535722733,
0.26910051703453064,
0.00023100146790966392,
0.12785452604293823,
-0.06811702251434326,
-0.04928051680326462,
0.061819303780794144,
0.02598036825656891,
-0.01157610584050417,
0.07443470507860184,
0.16331830620765686,
0.04711417853832245,
0.08965209126472473,
0.10188719630241394,
0.06811944395303726,
-0.04317031428217888,
-0.003105649957433343,
-0.041396744549274445,
-0.08435860276222229,
0.0073579759337008,
0.01895522139966488,
-0.021616170182824135,
-0.0016083598602563143,
0.01433473639190197,
-0.09528832137584686,
0.07800769805908203,
0.10511285811662674,
0.07192352414131165,
-0.2357202023267746,
-0.022678563371300697,
0.07851662486791611,
0.02199682407081127,
-0.045940760523080826,
0.014450108632445335,
-0.024118604138493538,
-0.07682619243860245,
0.14012695848941803,
-0.00007497514161514118,
0.1166374534368515,
-0.001591079286299646,
0.0416545644402504,
-0.05670011788606644,
0.029667269438505173,
-0.0066817342303693295,
0.06102466583251953,
-0.22566914558410645,
0.10345494747161865,
0.03256988152861595,
0.022366933524608612,
0.004263353534042835,
0.0036995518021285534,
0.04583286866545677,
0.19509267807006836,
0.04610571265220642,
-0.014864553697407246,
-0.03374920040369034,
0.0044724405743181705,
-0.08105506747961044,
0.027216939255595207,
0.017617080360651016,
-0.045030076056718826,
0.04573700577020645,
-0.021807339042425156,
-0.04859304800629616,
0.03663647547364235,
0.008133968338370323,
-0.14975328743457794,
-0.15741486847400665,
0.05326829478144646,
0.07975549250841141,
0.10100628435611725,
-0.05643235519528389,
-0.00970815122127533,
-0.10601882636547089,
0.2386598140001297,
-0.15657304227352142,
-0.05598914250731468,
-0.08201192319393158,
-0.10102444142103195,
0.09927277266979218,
-0.03338233381509781,
0.08198335021734238,
-0.050976142287254333,
0.024880416691303253,
-0.0840374231338501,
-0.151069775223732,
0.09104523062705994,
-0.08328257501125336,
-0.0898757204413414,
-0.028137944638729095,
0.10757248103618622,
-0.06223801523447037,
0.03946830332279205,
0.028304923325777054,
0.07863621413707733,
-0.04357736185193062,
-0.0680406466126442,
-0.023600183427333832,
0.025743786245584488,
-0.03744194284081459,
0.06960520148277283,
-0.08788193762302399,
-0.14038151502609253,
0.024469323456287384,
-0.012381616048514843,
0.1837681531906128,
0.3202136754989624,
-0.017384778708219528,
0.10455004125833511,
0.23521649837493896,
-0.06811829656362534,
-0.2775688171386719,
-0.10005080699920654,
-0.07953572273254395,
-0.007917564362287521,
-0.0100286565721035,
-0.124086394906044,
0.06589867919683456,
0.14045454561710358,
-0.03287005424499512,
0.016083965077996254,
-0.30267077684402466,
-0.11694546043872833,
0.09124623984098434,
0.05033456161618233,
0.25254905223846436,
-0.12329209595918655,
-0.08082205802202225,
-0.0969478115439415,
-0.16002169251441956,
0.13450394570827484,
-0.11073105037212372,
0.09489557892084122,
-0.03160950168967247,
-0.08278658241033554,
-0.014995284378528595,
-0.02544884756207466,
0.10414934903383255,
-0.055155083537101746,
0.022278469055891037,
-0.06675668060779572,
-0.0034110811538994312,
0.22455759346485138,
-0.028031766414642334,
0.06024196743965149,
-0.2100374698638916,
0.006454844493418932,
-0.009123099036514759,
-0.01562483236193657,
-0.04976196959614754,
0.0990351140499115,
-0.05529224872589111,
-0.04203568026423454,
-0.03279075771570206,
-0.0024592867121100426,
0.004007181618362665,
0.02123618684709072,
0.170895054936409,
-0.04024013504385948,
0.054845597594976425,
0.211096853017807,
0.12324070930480957,
-0.10669758170843124,
0.0808081179857254,
-0.0026772725395858288,
-0.0921529233455658,
0.04219277948141098,
-0.050832003355026245,
0.005044028162956238,
0.08836771547794342,
-0.02995964325964451,
0.07252903282642365,
0.05517279729247093,
0.001872393535450101,
-0.007552428636699915,
0.13550010323524475,
-0.1566954255104065,
-0.11136934161186218,
-0.007532700430601835,
-0.04205680638551712,
-0.017923196777701378,
0.027012063190340996,
0.17504869401454926,
-0.03484077379107475,
-0.0025223877746611834,
0.046031467616558075,
-0.0062634022906422615,
-0.10232973098754883,
0.08021897077560425,
-0.011804178357124329,
0.01614302769303322,
-0.10094530135393143,
0.06706281006336212,
0.0406717024743557,
-0.08796481788158417,
-0.018053293228149414,
0.12430150806903839,
-0.1281130611896515,
-0.10661908239126205,
-0.05252969264984131,
0.21854326128959656,
-0.09511111676692963,
-0.037649236619472504,
-0.11046852916479111,
-0.11890605092048645,
0.02901332639157772,
0.18253786861896515,
0.07811906933784485,
0.06161082908511162,
0.017858732491731644,
-0.05286272242665291,
-0.054443635046482086,
0.08574935048818588,
-0.02563226781785488,
0.09190645068883896,
-0.06992042064666748,
-0.06461063027381897,
-0.01646202988922596,
-0.015631774440407753,
-0.06728726625442505,
0.001199051970615983,
-0.16438716650009155,
-0.040484555065631866,
-0.1626957654953003,
-0.04024999588727951,
-0.13720674812793732,
-0.029822219163179398,
-0.01771995984017849,
0.0037969674449414015,
-0.035801418125629425,
-0.018901420757174492,
-0.03126189857721329,
-0.039104022085666656,
0.006262597627937794,
0.09304308891296387,
-0.0994439572095871,
-0.0037834958638995886,
0.06328830122947693,
-0.04791659116744995,
0.07642599940299988,
-0.0002436815557302907,
-0.031098555773496628,
0.020622730255126953,
-0.11990977078676224,
-0.07177316397428513,
0.05215354636311531,
0.00434061698615551,
0.0335024818778038,
-0.01807243376970291,
-0.033298689872026443,
0.05857056379318237,
-0.00331235327757895,
0.029260145500302315,
0.13116270303726196,
-0.08323945850133896,
0.03438537195324898,
-0.04529416561126709,
-0.0914466381072998,
-0.013388978317379951,
-0.01104709878563881,
0.08451197296380997,
0.017470410093665123,
0.19357933104038239,
-0.06854091584682465,
-0.02061818726360798,
-0.08897518366575241,
-0.0024344935081899166,
0.005571628920733929,
-0.2051256150007248,
-0.10293105989694595,
-0.06943536549806595,
-0.013848090544342995,
-0.010139258578419685,
0.2128276824951172,
-0.05399077758193016,
-0.18112541735172272,
0.058240856975317,
0.01738819107413292,
0.01631753146648407,
0.032940320670604706,
0.2594328224658966,
0.09950824081897736,
-0.016647404059767723,
-0.08945110440254211,
0.07779564708471298,
-0.000606869871262461,
-0.021394336596131325,
0.03296174481511116,
0.09284871816635132,
0.0005255246651358902,
0.09005661308765411,
0.11008962988853455,
0.03743547573685646,
-0.03938602656126022,
0.031188463792204857,
0.037320494651794434,
0.07789257168769836,
-0.020974580198526382,
0.10023725032806396,
0.153661847114563,
-0.09759797900915146,
0.030929137021303177,
0.028317417949438095,
-0.016695773229002953,
-0.09395001828670502,
-0.0698077455163002,
-0.10751019418239594,
-0.11970416456460953,
-0.022272078320384026,
-0.09825805574655533,
-0.04190926253795624,
0.04371877759695053,
0.014943902380764484,
0.0012083138572052121,
0.09648308902978897,
-0.04970825836062431,
-0.026590239256620407,
0.020655492320656776,
-0.06101471558213234,
-0.03290058672428131,
-0.016504663974046707,
-0.03863496333360672,
-0.01602265052497387,
0.014669784344732761,
-0.018489796668291092,
0.04590792581439018,
-0.018358899280428886,
0.034886494278907776,
-0.1013660877943039,
-0.11688597500324249,
-0.01581457629799843,
0.07466419041156769,
-0.0030408476013690233,
0.01751587726175785,
0.031969714909791946,
-0.08112025260925293,
0.03995731100440025,
0.09721670299768448,
-0.021903321146965027,
-0.16774512827396393,
-0.04783489555120468,
0.2393418252468109,
0.006792447995394468,
0.059468723833560944,
0.0069180820137262344,
-0.03282679244875908,
0.00243049836717546,
0.20014512538909912,
0.3296750485897064,
-0.06815113872289658,
0.011853081174194813,
0.009983456693589687,
0.016337333247065544,
0.07021860033273697,
0.05797865614295006,
0.008382589556276798,
0.16469696164131165,
-0.07916349917650223,
-0.007547300774604082,
-0.06046447157859802,
-0.06338038295507431,
-0.11245767772197723,
0.009307230822741985,
0.04217386990785599,
-0.07609562575817108,
0.0018294372130185366,
0.07959537953138351,
-0.10794036835432053,
0.06582485884428024,
-0.03369404003024101,
-0.09310566633939743,
-0.05915312096476555,
-0.06259994953870773,
0.034618351608514786,
-0.0019978436175733805,
0.03160511329770088,
-0.030625076964497566,
-0.024534426629543304,
0.024481385946273804,
-0.04206206649541855,
-0.1208145022392273,
-0.03235718980431557,
-0.001487608882598579,
-0.053055815398693085,
0.06309954077005386,
-0.009433597326278687,
0.027889013290405273,
0.12752772867679596,
0.006683403626084328,
-0.08703096956014633,
0.07629527151584625,
0.012195688672363758,
0.015440474264323711,
0.04523618519306183,
0.0017568264156579971,
-0.0041289012879133224,
0.14809678494930267,
0.036167796701192856,
-0.10925257951021194,
0.07783478498458862,
0.02280096337199211,
-0.10536064952611923,
-0.020996272563934326,
0.040298279374837875,
-0.013227434828877449,
0.09845978021621704,
0.13130132853984833,
-0.007191658020019531,
0.000898149621207267,
-0.019399993121623993,
0.012487507425248623,
0.06904423236846924,
0.015122609212994576,
-0.060117680579423904,
-0.17850995063781738,
-0.035679370164871216,
0.10724040120840073,
-0.0005584085010923445,
-0.24745051562786102,
-0.08621545135974884,
-0.16171392798423767,
0.03035990707576275,
-0.11140485852956772,
0.0676032155752182,
0.19164133071899414,
-0.0062905545346438885,
-0.009596578776836395,
-0.1359805464744568,
-0.017770063132047653,
0.06954647600650787,
-0.08382859826087952,
-0.09823592752218246
] |
null | null | transformers |
# Model Trained Using AutoTrain
- Problem type: Tabular regression
## Validation Metrics
- r2: 0.5287307064016351
- mse: 3.103168000915719e+19
- mae: 2243863540.8
- rmse: 5570608585.168877
- rmsle: 8.027979609819264
- loss: 5570608585.168877
## Best Params
- learning_rate: 0.11299209471906922
- reg_lambda: 1.95078305416454e-06
- reg_alpha: 0.03568550183373181
- subsample: 0.6486218191662874
- colsample_bytree: 0.22654368454464396
- max_depth: 1
- early_stopping_rounds: 481
- n_estimators: 20000
- eval_metric: rmse
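
The parameter names above (reg_lambda, reg_alpha, subsample, colsample_bytree, early_stopping_rounds) match gradient-boosted-tree hyperparameters. As a hedged sketch, assuming the underlying estimator is XGBoost (the card does not state this), the best params could be reproduced for re-training like so:

```python
# Hypothetical re-training sketch: the XGBRegressor class is an assumption;
# only the hyperparameter values are taken from the card above.
from xgboost import XGBRegressor

model = XGBRegressor(
    learning_rate=0.11299209471906922,
    reg_lambda=1.95078305416454e-06,
    reg_alpha=0.03568550183373181,
    subsample=0.6486218191662874,
    colsample_bytree=0.22654368454464396,
    max_depth=1,
    n_estimators=20000,
    eval_metric="rmse",
    early_stopping_rounds=481,
)
# model.fit(X_train, y_train, eval_set=[(X_valid, y_valid)])  # placeholder data
```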
## Usage
```python
import json

import joblib
import pandas as pd

# Load the trained regressor and the feature list it expects.
model = joblib.load('model.joblib')
config = json.load(open('config.json'))
features = config['features']

data = pd.read_csv("data.csv")  # your input data; must contain the listed features
data = data[features]

# This is a regression model, so predict() returns numeric target values
# (predict_proba() only applies to AutoTrain classification models).
predictions = model.predict(data)
``` | {"library_name": "transformers", "tags": ["autotrain", "tabular", "regression", "tabular-regression"], "datasets": ["autotrain-uzdtm-nwkp2/autotrain-data"], "pipeline_tag": "tabular-regression"} | tabular-regression | diffuser34/autotrain-uzdtm-nwkp2 | [
"transformers",
"joblib",
"autotrain",
"tabular",
"regression",
"tabular-regression",
"dataset:autotrain-uzdtm-nwkp2/autotrain-data",
"endpoints_compatible",
"has_space",
"region:us"
] | 2024-02-13T10:54:14+00:00 | [] | [] | TAGS
#transformers #joblib #autotrain #tabular #regression #tabular-regression #dataset-autotrain-uzdtm-nwkp2/autotrain-data #endpoints_compatible #has_space #region-us
|
# Model Trained Using AutoTrain
- Problem type: Tabular regression
## Validation Metrics
- r2: 0.5287307064016351
- mse: 3.103168000915719e+19
- mae: 2243863540.8
- rmse: 5570608585.168877
- rmsle: 8.027979609819264
- loss: 5570608585.168877
## Best Params
- learning_rate: 0.11299209471906922
- reg_lambda: 1.95078305416454e-06
- reg_alpha: 0.03568550183373181
- subsample: 0.6486218191662874
- colsample_bytree: 0.22654368454464396
- max_depth: 1
- early_stopping_rounds: 481
- n_estimators: 20000
- eval_metric: rmse
## Usage
| [
"# Model Trained Using AutoTrain\n\n- Problem type: Tabular regression",
"## Validation Metrics\n\n- r2: 0.5287307064016351\n- mse: 3.103168000915719e+19\n- mae: 2243863540.8\n- rmse: 5570608585.168877\n- rmsle: 8.027979609819264\n- loss: 5570608585.168877",
"## Best Params\n\n- learning_rate: 0.11299209471906922\n- reg_lambda: 1.95078305416454e-06\n- reg_alpha: 0.03568550183373181\n- subsample: 0.6486218191662874\n- colsample_bytree: 0.22654368454464396\n- max_depth: 1\n- early_stopping_rounds: 481\n- n_estimators: 20000\n- eval_metric: rmse",
"## Usage"
] | [
"TAGS\n#transformers #joblib #autotrain #tabular #regression #tabular-regression #dataset-autotrain-uzdtm-nwkp2/autotrain-data #endpoints_compatible #has_space #region-us \n",
"# Model Trained Using AutoTrain\n\n- Problem type: Tabular regression",
"## Validation Metrics\n\n- r2: 0.5287307064016351\n- mse: 3.103168000915719e+19\n- mae: 2243863540.8\n- rmse: 5570608585.168877\n- rmsle: 8.027979609819264\n- loss: 5570608585.168877",
"## Best Params\n\n- learning_rate: 0.11299209471906922\n- reg_lambda: 1.95078305416454e-06\n- reg_alpha: 0.03568550183373181\n- subsample: 0.6486218191662874\n- colsample_bytree: 0.22654368454464396\n- max_depth: 1\n- early_stopping_rounds: 481\n- n_estimators: 20000\n- eval_metric: rmse",
"## Usage"
] | [
62,
17,
75,
110,
3
] | [
"passage: TAGS\n#transformers #joblib #autotrain #tabular #regression #tabular-regression #dataset-autotrain-uzdtm-nwkp2/autotrain-data #endpoints_compatible #has_space #region-us \n# Model Trained Using AutoTrain\n\n- Problem type: Tabular regression## Validation Metrics\n\n- r2: 0.5287307064016351\n- mse: 3.103168000915719e+19\n- mae: 2243863540.8\n- rmse: 5570608585.168877\n- rmsle: 8.027979609819264\n- loss: 5570608585.168877## Best Params\n\n- learning_rate: 0.11299209471906922\n- reg_lambda: 1.95078305416454e-06\n- reg_alpha: 0.03568550183373181\n- subsample: 0.6486218191662874\n- colsample_bytree: 0.22654368454464396\n- max_depth: 1\n- early_stopping_rounds: 481\n- n_estimators: 20000\n- eval_metric: rmse## Usage"
] | [
-0.15059709548950195,
0.15110670030117035,
-0.0040767197497189045,
0.07734968513250351,
0.0654262974858284,
0.041280779987573624,
0.06593304127454758,
0.14125822484493256,
-0.07333216816186905,
0.12870870530605316,
0.1622946560382843,
0.13514825701713562,
0.050419650971889496,
0.12940829992294312,
-0.13317570090293884,
-0.14401564002037048,
0.008287088945508003,
0.019902687519788742,
0.06317541748285294,
0.10933254659175873,
0.09313651919364929,
-0.06852103024721146,
0.085331991314888,
0.032935503870248795,
-0.1865900307893753,
-0.031083595007658005,
0.04455079138278961,
-0.04649534821510315,
0.07072796672582626,
0.07242320477962494,
0.09086719900369644,
-0.020284222438931465,
0.07558738440275192,
-0.09576103091239929,
-0.01333493273705244,
0.0503353551030159,
-0.0030389209277927876,
0.11745700985193253,
0.1443624645471573,
-0.09081029146909714,
0.18857112526893616,
-0.047148577868938446,
0.035207703709602356,
0.07200298458337784,
-0.17839215695858002,
-0.08487369865179062,
-0.15264341235160828,
0.025314373895525932,
0.12880875170230865,
0.0933828204870224,
-0.029836490750312805,
0.1757069081068039,
-0.10663094371557236,
0.09921964257955551,
0.13096627593040466,
-0.28977662324905396,
-0.07448163628578186,
0.060887303203344345,
0.005314679350703955,
-0.10783341526985168,
-0.10378744453191757,
-0.01180913858115673,
0.07470957189798355,
0.026445914059877396,
0.045380569994449615,
-0.021333307027816772,
0.011138000525534153,
0.00833876058459282,
-0.13076432049274445,
-0.04398832470178604,
0.14063015580177307,
0.06409739702939987,
-0.04289711266756058,
-0.07146450877189636,
-0.031760767102241516,
-0.09702202677726746,
-0.021222148090600967,
0.013826906681060791,
-0.015968216583132744,
-0.03842705115675926,
-0.06251680105924606,
0.06939037144184113,
-0.051184024661779404,
-0.05707202106714249,
-0.06630277633666992,
-0.03315642103552818,
-0.009487547911703587,
0.03377070277929306,
0.04469579830765724,
0.04655604809522629,
-0.03351147845387459,
-0.11544869840145111,
-0.015457452274858952,
-0.04056006669998169,
-0.10475359112024307,
-0.03482034057378769,
0.03456351161003113,
0.021273208782076836,
-0.012989593669772148,
0.13214978575706482,
-0.05154919624328613,
0.06668286770582199,
0.013242512941360474,
0.049058035016059875,
-0.02248135395348072,
0.13759136199951172,
-0.041924137622117996,
-0.1753484606742859,
0.05803299695253372,
0.045671410858631134,
-0.017355600371956825,
-0.014708287082612514,
-0.02986248768866062,
-0.06765767186880112,
0.07146883010864258,
0.055752918124198914,
0.053710486739873886,
-0.0021067331545054913,
-0.11756831407546997,
-0.010778669267892838,
0.1077273041009903,
-0.0637483149766922,
0.05403521656990051,
0.01453512441366911,
-0.10587441921234131,
-0.03916170448064804,
0.06211172044277191,
-0.006955168675631285,
-0.05144825950264931,
0.026551978662610054,
-0.09112327545881271,
0.03403324633836746,
-0.07664277404546738,
-0.11815816164016724,
0.032215673476457596,
-0.008736829273402691,
-0.0285846795886755,
-0.11150115728378296,
-0.19013938307762146,
-0.1256258338689804,
-0.024112356826663017,
-0.10153570026159286,
-0.012871569953858852,
-0.06209386885166168,
-0.03860238566994667,
0.06990205496549606,
-0.00319556868635118,
0.04151282459497452,
-0.04839964583516121,
0.026569439098238945,
0.03280884772539139,
0.016292810440063477,
-0.004146811086684465,
0.004133946727961302,
-0.03598735108971596,
0.04778212308883667,
-0.12813793122768402,
0.07796970009803772,
-0.07146799564361572,
0.1023218110203743,
-0.16535958647727966,
-0.05344972386956215,
0.00044665837776847184,
-0.019134048372507095,
0.0994800478219986,
0.1437319815158844,
-0.21969157457351685,
0.04766948148608208,
0.07686810195446014,
-0.04990368336439133,
-0.11758642643690109,
0.0989987850189209,
-0.07426229119300842,
0.005520942620933056,
0.03201648220419884,
0.1040237694978714,
0.07000743597745895,
0.017458844929933548,
-0.07679609954357147,
-0.03751888871192932,
-0.0022533810697495937,
-0.07264664024114609,
0.05845729634165764,
-0.03177984058856964,
-0.043032944202423096,
-0.012462041340768337,
0.0399920679628849,
0.03207234665751457,
-0.09880507737398148,
-0.05547622963786125,
-0.019621536135673523,
-0.09683632850646973,
-0.003265637904405594,
0.041737742722034454,
0.06209324672818184,
-0.06076938658952713,
-0.11019131541252136,
-0.08905331045389175,
0.13791556656360626,
-0.00014093279605731368,
-0.03290063142776489,
-0.1218719333410263,
0.054538652300834656,
-0.09280459582805634,
-0.031581781804561615,
-0.19493480026721954,
-0.04021773859858513,
-0.0280170738697052,
0.09701813012361526,
-0.006979252677410841,
-0.01779632642865181,
0.06583969295024872,
0.02558942139148712,
-0.0022955359891057014,
-0.036106858402490616,
0.087375707924366,
-0.009783618152141571,
-0.104032501578331,
-0.15072311460971832,
-0.016235824674367905,
-0.02106979861855507,
0.07668878138065338,
-0.2532556653022766,
-0.025920193642377853,
-0.0009000241407193244,
0.15158812701702118,
0.03465729206800461,
-0.0688118189573288,
0.02177634835243225,
0.04455740377306938,
-0.030360445380210876,
-0.07877063006162643,
0.03472764045000076,
-0.005614289548248053,
-0.06672630459070206,
0.10567662864923477,
-0.1864110678434372,
0.06301871687173843,
0.05226508155465126,
0.04146493971347809,
-0.1553918570280075,
0.0574503056704998,
-0.025400342419743538,
-0.033401623368263245,
-0.023485515266656876,
-0.023265987634658813,
0.11931828409433365,
0.008771836757659912,
0.12032558768987656,
-0.04242105036973953,
-0.05249011144042015,
-0.0013749153586104512,
-0.02591782994568348,
0.02008126489818096,
0.218113973736763,
-0.017589146271348,
-0.2370711863040924,
0.08065593987703323,
0.03778981789946556,
-0.06222299486398697,
0.01957540772855282,
-0.030343296006321907,
-0.07207555323839188,
-0.09843434393405914,
0.05751339718699455,
0.09564295411109924,
0.12143683433532715,
-0.013661443255841732,
-0.01881394162774086,
0.06038408726453781,
0.003420506604015827,
0.002584045985713601,
-0.10330189764499664,
0.005242384038865566,
-0.013924972154200077,
0.011908004991710186,
-0.026601845398545265,
0.03190338984131813,
0.04300673305988312,
0.09828414022922516,
-0.009631689637899399,
-0.11228739470243454,
-0.004859978333115578,
-0.026519987732172012,
-0.11895235627889633,
0.2196468561887741,
-0.0809720903635025,
-0.12465084344148636,
-0.10615849494934082,
-0.09670867770910263,
-0.00960078090429306,
-0.025338701903820038,
0.01725531928241253,
-0.022405819967389107,
-0.0936892107129097,
-0.08883964270353317,
0.012035381980240345,
-0.016233403235673904,
-0.05215940624475479,
0.021586913615465164,
0.022130075842142105,
0.10580034554004669,
-0.101003497838974,
0.009740653447806835,
-0.03358736261725426,
0.027979107573628426,
0.08286265283823013,
-0.0018657653126865625,
0.13677944242954254,
0.11336321383714676,
0.008238794282078743,
0.042284220457077026,
-0.02650986611843109,
0.25358980894088745,
-0.037918806076049805,
-0.049089379608631134,
0.11436901986598969,
0.013870622031390667,
0.054721713066101074,
0.069373220205307,
0.04296566918492317,
-0.09247435629367828,
-0.012330819852650166,
0.01764976792037487,
-0.015873100608587265,
-0.2529442310333252,
-0.10256291180849075,
-0.05309515818953514,
0.05003109201788902,
0.11768055707216263,
0.025900384411215782,
-0.09815153479576111,
0.04846205934882164,
0.03859001025557518,
0.07527666538953781,
-0.06041949987411499,
0.058267317712306976,
0.014353673905134201,
0.04004666954278946,
0.1428012251853943,
-0.058009181171655655,
0.01166782807558775,
0.06161390244960785,
-0.11134167015552521,
0.2024162858724594,
-0.014887617900967598,
0.0716714859008789,
0.017362285405397415,
0.17704765498638153,
0.05471896380186081,
0.021529648452997208,
0.05666358023881912,
-0.03367362916469574,
0.04234213009476662,
-0.055550467222929,
-0.1003427803516388,
0.04400521516799927,
0.0014313356950879097,
0.024895252659916878,
-0.08455892652273178,
0.08674965053796768,
-0.004719095304608345,
0.16274268925189972,
0.1293327510356903,
-0.38551509380340576,
-0.02721323072910309,
-0.005404455587267876,
-0.020530670881271362,
-0.12301839143037796,
-0.02037159539759159,
-0.059482116252183914,
-0.1371038556098938,
-0.010233988985419273,
-0.0469108410179615,
0.1420433074235916,
-0.1218554899096489,
-0.018565809354186058,
-0.04881201684474945,
0.05746613070368767,
-0.01195549312978983,
0.03319701552391052,
-0.10257705301046371,
0.20524431765079498,
0.04370102658867836,
0.09128046035766602,
-0.022315766662359238,
0.022646404802799225,
0.04401665925979614,
-0.020830221474170685,
0.14388234913349152,
0.007450887933373451,
-0.15992581844329834,
-0.2444254457950592,
-0.11536188423633575,
-0.05863219499588013,
0.054052263498306274,
-0.023259807378053665,
0.0743863582611084,
0.012535639107227325,
-0.02211233228445053,
0.033978842198848724,
-0.08482751995325089,
-0.11968056112527847,
-0.1038803830742836,
0.0360943041741848,
0.03348681703209877,
-0.015339210629463196,
-0.056296203285455704,
-0.028389450162649155,
-0.0435086190700531,
0.19835920631885529,
0.08328049629926682,
0.02129931002855301,
-0.1281314641237259,
0.09561796486377716,
0.14706547558307648,
-0.08492498099803925,
0.04731233790516853,
-0.015318287536501884,
0.01760866306722164,
0.04791485145688057,
-0.03950071334838867,
0.0787283331155777,
-0.08256828784942627,
-0.07783142477273941,
-0.03777923434972763,
0.13328787684440613,
0.007005567662417889,
0.05597737804055214,
0.031722571700811386,
0.06708761304616928,
-0.055977366864681244,
-0.0962521955370903,
0.041764285415410995,
0.06848341226577759,
0.13751542568206787,
0.08654090017080307,
0.010595551691949368,
-0.014248019084334373,
-0.019314024597406387,
-0.03296725079417229,
0.15922951698303223,
0.3264216184616089,
-0.07079540938138962,
0.02171732671558857,
-0.03152830898761749,
-0.020799797028303146,
-0.18937397003173828,
0.018996980041265488,
0.08029834926128387,
0.07753930240869522,
0.07338551431894302,
-0.020530397072434425,
0.0580446757376194,
0.11838779598474503,
-0.031138204038143158,
0.029482431709766388,
-0.27039143443107605,
-0.0989546924829483,
0.17543399333953857,
0.020710289478302002,
0.09121200442314148,
-0.13862541317939758,
-0.06344746798276901,
-0.05048571527004242,
-0.11884458363056183,
0.02452661283314228,
-0.012441361322999,
0.09562143683433533,
-0.038243845105171204,
0.006276299711316824,
0.06650018692016602,
-0.062442597001791,
0.14885810017585754,
-0.06814467906951904,
0.04474505037069321,
-0.058265943080186844,
0.023996934294700623,
0.06239324435591698,
-0.11151256412267685,
0.06525532156229019,
0.0014324842486530542,
0.12853553891181946,
-0.2750738561153412,
-0.010881230235099792,
-0.013081271201372147,
0.03390028700232506,
-0.022076338529586792,
0.007167642470449209,
-0.034200720489025116,
0.043821532279253006,
0.016285177320241928,
-0.0016997671918943524,
0.08089523762464523,
-0.007395909633487463,
0.08607710152864456,
0.12126778066158295,
0.1028144583106041,
-0.04262524098157883,
-0.12267743051052094,
-0.0015436753164976835,
-0.04017355665564537,
0.005247252061963081,
-0.1975729763507843,
0.04600667208433151,
0.11516157537698746,
0.04084775224328041,
0.12538151443004608,
0.033845916390419006,
-0.06556906551122665,
0.014797168783843517,
0.09740548580884933,
-0.11621292680501938,
-0.05041429400444031,
-0.013553132303059101,
0.06825727969408035,
-0.11653322726488113,
-0.04123979061841965,
0.13479338586330414,
-0.030577590689063072,
-0.004964426159858704,
0.031060947105288506,
0.004605463240295649,
0.013784280978143215,
0.23554594814777374,
0.03916124254465103,
0.07180757075548172,
-0.1101226955652237,
0.0752488225698471,
0.07581175118684769,
0.04622027277946472,
0.03710118308663368,
0.09517576545476913,
-0.0793219655752182,
-0.07834840565919876,
-0.04996705800294876,
0.2284160852432251,
-0.06536407768726349,
-0.031025642529129982,
-0.09001477807760239,
-0.07396180927753448,
0.056280046701431274,
0.0775553360581398,
0.07686350494623184,
0.02580084651708603,
0.008302832953631878,
-0.0890735536813736,
-0.06853564083576202,
0.0763317197561264,
0.18031464517116547,
0.036090027540922165,
-0.08384600281715393,
0.08032216876745224,
-0.062063705176115036,
-0.01327129639685154,
-0.0016337280394509435,
0.06328915059566498,
-0.14400874078273773,
-0.00656816316768527,
-0.09491072595119476,
0.04937543720006943,
-0.07500483840703964,
0.01395394653081894,
-0.009182636626064777,
0.018887795507907867,
-0.0497465617954731,
0.033472154289484024,
-0.0646163746714592,
-0.07448194921016693,
-0.0020051742903888226,
0.04448917508125305,
-0.04647238925099373,
-0.015946300700306892,
0.027148481458425522,
-0.05629941076040268,
0.0638899877667427,
0.01207752339541912,
0.07460185140371323,
-0.0002550318604335189,
-0.010998308658599854,
-0.013815360143780708,
0.06678363680839539,
-0.028106609359383583,
0.08015453815460205,
-0.131561741232872,
0.02464478649199009,
-0.003950686194002628,
0.0038600449915975332,
0.035431526601314545,
-0.004788728430867195,
-0.10666751116514206,
-0.018400784581899643,
-0.06913116574287415,
-0.011577161028981209,
-0.08511781692504883,
0.07292736321687698,
0.06590893119573593,
0.1114707738161087,
0.09936268627643585,
-0.044245023280382156,
0.06944862753152847,
-0.16505347192287445,
0.003407062729820609,
-0.017172768712043762,
-0.06274516135454178,
-0.024675123393535614,
0.021587014198303223,
0.12327001988887787,
-0.05087370052933693,
0.02980656921863556,
-0.02299400232732296,
0.029356034472584724,
0.02430710569024086,
0.04143242537975311,
0.024095889180898666,
-0.02014949731528759,
0.11037417501211166,
0.029004190117120743,
-0.010918033309280872,
0.053541384637355804,
0.07607697695493698,
0.051876623183488846,
0.04631561040878296,
0.052678026258945465,
0.12373363226652145,
-0.00804993137717247,
0.08316551893949509,
-0.00909638311713934,
-0.1400066465139389,
-0.06142812594771385,
0.14860031008720398,
-0.0834369957447052,
0.06213034316897392,
-0.05960366874933243,
0.10108629614114761,
0.13201332092285156,
-0.07181726396083832,
-0.0009137390879914165,
-0.05035875737667084,
-0.08320800960063934,
-0.1562693864107132,
-0.0653810054063797,
-0.10877919942140579,
-0.07854064553976059,
0.03986837714910507,
-0.09172454476356506,
0.04224924370646477,
0.10112985223531723,
0.017549140378832817,
0.011450679041445255,
0.05142584443092346,
-0.043666355311870575,
-0.020591212436556816,
0.08152978122234344,
-0.010130571201443672,
-0.0023531049955636263,
-0.04757515713572502,
-0.007847403176128864,
0.02980215474963188,
-0.03948291391134262,
0.07557191699743271,
-0.012272775173187256,
0.00362484366632998,
0.06328523904085159,
-0.006399884354323149,
-0.1255381852388382,
-0.017776288092136383,
0.06566362828016281,
0.04401896148920059,
0.14949461817741394,
0.09281057864427567,
0.05875056982040405,
-0.04385289177298546,
0.23446989059448242,
-0.058886848390102386,
0.005620239302515984,
-0.1567719429731369,
0.16751275956630707,
0.0364023819565773,
-0.02190573140978813,
-0.04727605730295181,
-0.11508449167013168,
0.1055678054690361,
0.1368979811668396,
0.12496818602085114,
-0.04296526685357094,
-0.01888819970190525,
-0.027930481359362602,
-0.017692340537905693,
-0.05980571731925011,
0.01875970885157585,
0.030389845371246338,
0.11430975794792175,
-0.0733257532119751,
-0.053544409573078156,
-0.0365251861512661,
-0.03023962676525116,
-0.011017506942152977,
0.08053377270698547,
0.03871268406510353,
0.0326494425535202,
-0.01816297322511673,
0.0989755243062973,
-0.06816110014915466,
-0.014646710827946663,
0.009820806793868542,
-0.1670253574848175,
-0.18778036534786224,
0.004230715334415436,
0.026659633964300156,
-0.018211431801319122,
0.13078701496124268,
-0.06221553310751915,
-0.0380263589322567,
0.03231222927570343,
-0.020065749064087868,
-0.12032731622457504,
-0.10876226425170898,
-0.027750417590141296,
0.0654875710606575,
0.18806259334087372,
-0.0063377320766448975,
0.06430352479219437,
0.15925142168998718,
-0.0007706093019805849,
-0.21743272244930267,
0.06857840716838837,
0.00525934062898159,
-0.12234508246183395,
0.1357642561197281,
0.15350013971328735,
-0.008124708198010921,
0.14558985829353333,
0.057726725935935974,
-0.11955603957176208,
-0.01675591431558132,
-0.030020250007510185,
0.06325218081474304,
-0.05010680854320526,
-0.028235364705324173,
-0.030530164018273354,
0.13124458491802216,
0.14139629900455475,
-0.03978539630770683,
-0.00826452486217022,
-0.023512210696935654,
0.05134709179401398,
0.003108080942183733,
0.0017426401609554887,
-0.0168294794857502,
-0.19998255372047424,
0.06907626986503601,
0.08852248638868332,
0.001529391622170806,
-0.1516861468553543,
-0.10774815082550049,
0.001345538767054677,
-0.07218154519796371,
-0.023696662858128548,
0.16740384697914124,
0.087065190076828,
0.02538631483912468,
-0.023915285244584084,
-0.1307443529367447,
0.008830435574054718,
0.11651736497879028,
-0.038657620549201965,
-0.11414308100938797
] |
null | null | stable-baselines3 |
# **A2C** Agent playing **PandaReachDense-v3**
This is a trained model of an **A2C** agent playing **PandaReachDense-v3**
using the [stable-baselines3 library](https://github.com/DLR-RM/stable-baselines3).
## Usage (with Stable-baselines3)
A minimal loading sketch (the checkpoint filename below follows the usual huggingface_sb3 naming convention and is an assumption, not something this card confirms):
```python
from stable_baselines3 import A2C
from huggingface_sb3 import load_from_hub

# Download the checkpoint from the Hub, then load the A2C agent.
# The filename "a2c-PandaReachDense-v3.zip" is an assumed convention.
checkpoint = load_from_hub("slc48/a2c-PandaReachDense-v3", "a2c-PandaReachDense-v3.zip")
model = A2C.load(checkpoint)
```
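
A short evaluation sketch, assuming `gymnasium` and `panda_gym` are installed (importing panda_gym registers the PandaReachDense-v3 environment) and reusing the `model` loaded above:

```python
# Roll out the loaded agent (evaluation sketch, not taken from the card).
import gymnasium as gym
import panda_gym  # noqa: F401 -- import registers PandaReachDense-v3

env = gym.make("PandaReachDense-v3")
obs, _ = env.reset()
for _ in range(1000):
    action, _ = model.predict(obs, deterministic=True)
    obs, reward, terminated, truncated, info = env.step(action)
    if terminated or truncated:
        obs, _ = env.reset()
env.close()
```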
| {"library_name": "stable-baselines3", "tags": ["PandaReachDense-v3", "deep-reinforcement-learning", "reinforcement-learning", "stable-baselines3"], "model-index": [{"name": "A2C", "results": [{"task": {"type": "reinforcement-learning", "name": "reinforcement-learning"}, "dataset": {"name": "PandaReachDense-v3", "type": "PandaReachDense-v3"}, "metrics": [{"type": "mean_reward", "value": "-0.20 +/- 0.09", "name": "mean_reward", "verified": false}]}]}]} | reinforcement-learning | slc48/a2c-PandaReachDense-v3 | [
"stable-baselines3",
"PandaReachDense-v3",
"deep-reinforcement-learning",
"reinforcement-learning",
"model-index",
"region:us"
] | 2024-02-13T10:55:45+00:00 | [] | [] | TAGS
#stable-baselines3 #PandaReachDense-v3 #deep-reinforcement-learning #reinforcement-learning #model-index #region-us
|
# A2C Agent playing PandaReachDense-v3
This is a trained model of an A2C agent playing PandaReachDense-v3
using the stable-baselines3 library.
## Usage (with Stable-baselines3)
TODO: Add your code
| [
"# A2C Agent playing PandaReachDense-v3\nThis is a trained model of a A2C agent playing PandaReachDense-v3\nusing the stable-baselines3 library.",
"## Usage (with Stable-baselines3)\nTODO: Add your code"
] | [
"TAGS\n#stable-baselines3 #PandaReachDense-v3 #deep-reinforcement-learning #reinforcement-learning #model-index #region-us \n",
"# A2C Agent playing PandaReachDense-v3\nThis is a trained model of a A2C agent playing PandaReachDense-v3\nusing the stable-baselines3 library.",
"## Usage (with Stable-baselines3)\nTODO: Add your code"
] | [
41,
45,
17
] | [
"passage: TAGS\n#stable-baselines3 #PandaReachDense-v3 #deep-reinforcement-learning #reinforcement-learning #model-index #region-us \n# A2C Agent playing PandaReachDense-v3\nThis is a trained model of a A2C agent playing PandaReachDense-v3\nusing the stable-baselines3 library.## Usage (with Stable-baselines3)\nTODO: Add your code"
] | [
0.028780510649085045,
0.06549051403999329,
-0.004174588713794947,
0.028733979910612106,
0.12748076021671295,
-0.010029550641775131,
0.16130082309246063,
0.07903143763542175,
0.052706290036439896,
-0.055043965578079224,
0.09157051891088486,
-0.079488605260849,
0.04699381813406944,
0.3393711447715759,
0.029525093734264374,
-0.186785027384758,
0.08573613315820694,
0.015584449283778667,
0.018966808915138245,
0.09867662936449051,
0.03466832637786865,
-0.08736564218997955,
0.04568251967430115,
0.03800429776310921,
-0.07686931639909744,
-0.04319252818822861,
-0.03975098207592964,
-0.06744661927223206,
0.10361767560243607,
-0.044310007244348526,
0.1670169234275818,
-0.03489987552165985,
0.10219604521989822,
-0.12577489018440247,
0.031373992562294006,
-0.04813149571418762,
-0.05141052231192589,
0.002818689215928316,
-0.011371237225830555,
0.05937984213232994,
0.04167760908603668,
0.05197896435856819,
0.07366002351045609,
0.04871916025876999,
-0.08704962581396103,
-0.11396265029907227,
-0.006845315918326378,
0.07931416481733322,
0.17974808812141418,
0.04054044932126999,
-0.02474738284945488,
0.09696658700704575,
-0.11350683122873306,
0.01657135598361492,
-0.019304286688566208,
-0.4018571078777313,
0.006876560393720865,
0.15550047159194946,
0.04677277058362961,
0.010903568007051945,
-0.0061170910485088825,
-0.004642391111701727,
0.02805398777127266,
-0.037410516291856766,
0.08670840412378311,
-0.09000635892152786,
0.06153826415538788,
-0.019131680950522423,
-0.04113767296075821,
-0.01751464419066906,
0.2419518232345581,
0.01633240468800068,
-0.08024721592664719,
-0.07922019064426422,
0.009968155063688755,
-0.028026137501001358,
-0.0877801775932312,
-0.06134319305419922,
0.07644549012184143,
0.057131536304950714,
0.10696670413017273,
-0.030399860814213753,
-0.058683689683675766,
-0.04541248828172684,
0.08352918922901154,
-0.03953780233860016,
-0.017566127702593803,
-0.01754307933151722,
-0.06739802658557892,
-0.003707833355292678,
0.015629740431904793,
-0.06615205854177475,
-0.015486059710383415,
-0.044966671615839005,
-0.1556774228811264,
-0.009128551930189133,
-0.0599384643137455,
0.03310214728116989,
0.10073909163475037,
0.13065455853939056,
0.06838785856962204,
0.09685135632753372,
-0.08001106232404709,
0.0389438234269619,
0.06625691801309586,
0.09461154788732529,
-0.044509198516607285,
-0.011874453164637089,
0.14630302786827087,
0.10327376425266266,
0.09657767415046692,
-0.09182082861661911,
-0.12403369694948196,
0.04173071309924126,
0.10965418070554733,
0.03382069617509842,
0.0046537998132407665,
0.04452834278345108,
-0.14144757390022278,
0.023916395381093025,
0.0006972529226914048,
-0.045244041830301285,
-0.03088594414293766,
0.06111180782318115,
-0.04433412477374077,
0.02348744124174118,
-0.012718633748590946,
0.10830001533031464,
0.10152670741081238,
-0.023899899795651436,
-0.052799396216869354,
-0.04201658070087433,
-0.0440504252910614,
-0.05507666990160942,
0.04012975096702576,
0.01289378758519888,
0.04624854028224945,
-0.1184653639793396,
-0.13997629284858704,
0.051258668303489685,
0.019622454419732094,
-0.026321161538362503,
-0.13472233712673187,
-0.09338399767875671,
-0.03747362270951271,
-0.011210841126739979,
0.0030350966844707727,
-0.19588395953178406,
-0.02434816211462021,
-0.03428230062127113,
0.13725687563419342,
0.10810749977827072,
-0.06433141976594925,
-0.06369391083717346,
-0.12834231555461884,
0.06795675307512283,
-0.23485252261161804,
0.038750845938920975,
-0.09932064265012741,
0.12411006540060043,
0.007471752353012562,
0.023616313934326172,
0.1410844624042511,
0.02330038882791996,
0.004575210623443127,
0.1702503114938736,
-0.18833371996879578,
-0.046672217547893524,
0.17527204751968384,
-0.0857074186205864,
-0.17703735828399658,
0.05021136254072189,
-0.02124672941863537,
-0.013779462315142155,
0.06350992619991302,
0.09937554597854614,
-0.01727774553000927,
-0.17061583697795868,
0.02558896690607071,
-0.0014508399181067944,
-0.05959303304553032,
0.021542999893426895,
0.12072649598121643,
0.08040176331996918,
-0.027203790843486786,
-0.0016989230643957853,
-0.15452547371387482,
0.09701786935329437,
-0.023543400689959526,
-0.08447092026472092,
0.022736359387636185,
-0.10411997884511948,
0.10016260296106339,
-0.015677137300372124,
0.10591494292020798,
-0.02265925332903862,
-0.018805475905537605,
-0.032891299575567245,
0.10408006608486176,
-0.0068649593740701675,
0.039593957364559174,
-0.17728297412395477,
0.1326225996017456,
0.02176543138921261,
0.046730607748031616,
-0.10109715908765793,
-0.10202061384916306,
0.06674831360578537,
0.15375585854053497,
0.05606463924050331,
0.03833417221903801,
0.07328703999519348,
0.03443831577897072,
-0.0030986627098172903,
-0.1205538883805275,
-0.12789975106716156,
0.019881807267665863,
0.06068658083677292,
-0.08039596676826477,
-0.05172275751829147,
-0.10460081696510315,
0.21138279139995575,
-0.10705634206533432,
0.012047823518514633,
-0.09333895146846771,
0.010153836570680141,
0.08388294279575348,
0.01348812971264124,
0.08132237941026688,
0.02585482969880104,
-0.04426883906126022,
0.009419471956789494,
0.0882885605096817,
0.044275086373090744,
-0.1379590630531311,
0.03784618154168129,
0.024114131927490234,
0.23272188007831573,
0.15174852311611176,
-0.016499420627951622,
-0.055556558072566986,
0.006534850224852562,
0.03740030899643898,
0.03533044084906578,
0.034956689924001694,
0.06951800733804703,
0.1090264692902565,
0.07713755965232849,
0.1276414394378662,
-0.05066131055355072,
0.17763042449951172,
-0.006530070677399635,
-0.14888496696949005,
0.02993084490299225,
-0.07033783197402954,
0.0941668227314949,
-0.06030277907848358,
0.048379335552453995,
0.05410725995898247,
0.0304675605148077,
0.08504439890384674,
-0.00693494314327836,
0.022639812901616096,
-0.04341154545545578,
0.04943868890404701,
0.06790532171726227,
0.06545940041542053,
0.06452376395463943,
-0.007423467002809048,
0.015456308610737324,
-0.05288444459438324,
-0.0518295019865036,
-0.10519610345363617,
-0.12370408326387405,
0.037892695516347885,
-0.015912096947431564,
-0.04463989660143852,
-0.01629551686346531,
-0.07266248762607574,
0.050321705639362335,
0.05250744894146919,
-0.07199236750602722,
0.028561361134052277,
-0.007090074475854635,
-0.09633425623178482,
0.1130511462688446,
-0.14269201457500458,
-0.31355980038642883,
-0.02000165916979313,
-0.13154496252536774,
-0.02077566273510456,
0.15819574892520905,
-0.057956792414188385,
-0.1681092083454132,
0.03305667266249657,
-0.02401961199939251,
-0.09238096326589584,
0.04225420579314232,
-0.018061356619000435,
0.10221174359321594,
0.0857708528637886,
0.043082691729068756,
0.00862243864685297,
-0.01184127852320671,
-0.03903079405426979,
-0.08788500726222992,
0.07608162611722946,
-0.06721128523349762,
0.1173204705119133,
0.13519366085529327,
0.04123268276453018,
-0.015909500420093536,
-0.02043113484978676,
0.06215733662247658,
0.012027861550450325,
-0.036599598824977875,
0.13453175127506256,
-0.03608042374253273,
-0.00864011887460947,
0.04470202699303627,
0.008029532618820667,
-0.10533943772315979,
0.09432658553123474,
-0.05022074654698372,
-0.06974482536315918,
-0.017500806599855423,
-0.08790571242570877,
-0.09950723499059677,
0.18995612859725952,
0.0490412712097168,
0.007856572046875954,
-0.05151839926838875,
0.036120012402534485,
0.07772433012723923,
0.044773608446121216,
0.007161281071603298,
0.03985898196697235,
-0.005716364365071058,
-0.013170693069696426,
0.05278664082288742,
-0.023887991905212402,
0.009960537776350975,
-0.007844919338822365,
0.13077811896800995,
-0.015673788264393806,
0.10317149013280869,
0.0030158995650708675,
0.008619097992777824,
0.08018261194229126,
0.12394148856401443,
0.08064290136098862,
0.019240466877818108,
-0.11554506421089172,
-0.04732639715075493,
-0.030522609129548073,
-0.18181301653385162,
0.11669926345348358,
0.10738886147737503,
0.05268440023064613,
-0.05564067140221596,
0.22832486033439636,
0.0012100599706172943,
0.10802210867404938,
0.03496129810810089,
-0.17664514482021332,
0.024751557037234306,
0.03574612736701965,
0.050895314663648605,
0.007034227252006531,
0.062039270997047424,
-0.09453237801790237,
-0.1839483082294464,
0.03968557342886925,
0.018860090523958206,
0.05523261800408363,
-0.018427258357405663,
0.018512532114982605,
-0.12044285237789154,
-0.05746040865778923,
0.02161633037030697,
0.02076297253370285,
-0.3029120862483978,
0.06816349923610687,
-0.04133946821093559,
0.07392577081918716,
0.009542034938931465,
0.01343793235719204,
0.06604447960853577,
0.01652485318481922,
0.1375029981136322,
-0.017935138195753098,
0.1707022786140442,
-0.1572514772415161,
-0.16084668040275574,
0.025680551305413246,
-0.059293005615472794,
0.07245437800884247,
0.082563117146492,
0.017692390829324722,
0.0069250138476490974,
-0.00047057756455615163,
0.20794180035591125,
-0.13032017648220062,
-0.0346711240708828,
-0.035274047404527664,
0.019543148577213287,
0.022580156102776527,
-0.03844551369547844,
-0.021310672163963318,
0.06112392246723175,
0.1489492505788803,
0.07546767592430115,
-0.02780069410800934,
-0.04611911624670029,
-0.03938353434205055,
-0.09507237374782562,
-0.044778671115636826,
0.10472412407398224,
-0.07841785997152328,
0.10144548118114471,
-0.07513871043920517,
-0.04432075098156929,
0.11707907915115356,
-0.09250949323177338,
-0.053160861134529114,
-0.07627046853303909,
0.05462219938635826,
0.008296831510961056,
0.13374868035316467,
0.03642493113875389,
0.02114485390484333,
0.10089845955371857,
-0.05001259222626686,
0.08662480860948563,
0.03777577355504036,
-0.03541218861937523,
0.03517242521047592,
-0.05375073477625847,
-0.04829130321741104,
-0.010828596539795399,
0.03814345970749855,
0.24244728684425354,
0.302570104598999,
-0.012830551713705063,
0.1897524893283844,
0.09193363785743713,
0.029696941375732422,
-0.16292639076709747,
-0.1200476586818695,
0.05548451840877533,
0.059938978403806686,
0.06154406815767288,
-0.2788083851337433,
0.057189684361219406,
-0.053967077285051346,
-0.08999616652727127,
-0.06829255819320679,
-0.08560561388731003,
-0.07613074034452438,
0.088682159781456,
0.08794322609901428,
0.09100460261106491,
-0.12551987171173096,
0.015924450010061264,
-0.012671655975282192,
-0.1664767563343048,
0.12128932029008865,
-0.039350032806396484,
0.07007917016744614,
-0.025050386786460876,
-0.06438229978084564,
0.025165842846035957,
-0.02775278501212597,
0.04424511641263962,
-0.1206880658864975,
0.0005293674184940755,
-0.04527926817536354,
-0.03749620169401169,
0.1088484600186348,
0.020565982908010483,
-0.0028168195858597755,
-0.09558401256799698,
-0.011945599690079689,
-0.3103867173194885,
0.01988539844751358,
0.02114551141858101,
-0.039148375391960144,
-0.0012507046340033412,
-0.08678091317415237,
-0.042053963989019394,
0.10508828610181808,
0.03930897265672684,
0.08641290664672852,
0.15335260331630707,
-0.005581455305218697,
-0.021082017570734024,
0.17506572604179382,
0.05701295658946037,
-0.014002309180796146,
0.10069113969802856,
-0.06732672452926636,
-0.06576105207204819,
0.04418903961777687,
-0.1016126498579979,
-0.005435575265437365,
0.005642053205519915,
-0.007821558974683285,
0.07107745110988617,
0.09962856024503708,
-0.03340476378798485,
0.18194207549095154,
0.09798844903707504,
-0.15048468112945557,
0.0030947427731007338,
0.052597809582948685,
-0.032650984823703766,
0.04424609988927841,
-0.04443032294511795,
0.05541829764842987,
-0.07521786540746689,
-0.03790169581770897,
0.02031708136200905,
-0.01010141521692276,
-0.07618512213230133,
0.00011962707503698766,
0.03176301345229149,
0.029956085607409477,
-0.08340912312269211,
0.14036758244037628,
0.016359949484467506,
0.0652431845664978,
0.11902019381523132,
0.019259776920080185,
-0.10460162162780762,
-0.014167122542858124,
-0.02339506521821022,
0.2028627097606659,
-0.007937151938676834,
-0.018536100164055824,
-0.11391238868236542,
-0.12847240269184113,
0.018047582358121872,
-0.10348039865493774,
0.10282431542873383,
-0.052032727748155594,
-0.06570395082235336,
-0.03704213351011276,
-0.05561172217130661,
0.031932998448610306,
0.017090078443288803,
-0.015642894431948662,
-0.16111870110034943,
-0.04170334339141846,
0.06846143305301666,
0.039452772587537766,
-0.06145704537630081,
-0.06289087235927582,
-0.16302458941936493,
0.03506235405802727,
-0.1278870701789856,
0.0010145133128389716,
-0.047339316457509995,
-0.05002537742257118,
-0.05195476487278938,
0.01521157007664442,
-0.0177876316010952,
0.008817745372653008,
-0.05148332938551903,
0.03292781487107277,
0.011250603944063187,
0.0014076961670070887,
-0.06952075660228729,
-0.04419080913066864,
0.032172493636608124,
-0.04430563375353813,
0.0661356970667839,
0.04131564497947693,
-0.005653871223330498,
0.021474739536643028,
-0.07005896419286728,
-0.10248169302940369,
0.10313672572374344,
-0.014939527027308941,
0.050572704523801804,
-0.0603681318461895,
-0.012018447741866112,
0.007195405196398497,
-0.07569561898708344,
-0.007751014549285173,
0.24328774213790894,
-0.010914106853306293,
-0.05394120141863823,
-0.07426224648952484,
-0.036970075219869614,
-0.09100507944822311,
-0.0004900419735349715,
0.1948854625225067,
0.05477539822459221,
0.14600017666816711,
-0.0532439760863781,
0.08785777539014816,
-0.06481330841779709,
-0.01534446980804205,
-0.08259234577417374,
0.030320849269628525,
-0.157977893948555,
-0.08130980283021927,
-0.028043894097208977,
-0.03728124126791954,
0.13441862165927887,
-0.19242097437381744,
0.0032852457370609045,
-0.010904400609433651,
-0.04910553991794586,
0.11381126195192337,
0.0557032972574234,
0.24474471807479858,
0.1050342544913292,
-0.035265225917100906,
0.10503548383712769,
0.12215624749660492,
0.0929517149925232,
-0.03347417712211609,
0.058777112513780594,
-0.05078745633363724,
-0.0868106484413147,
0.09736774861812592,
0.012061800807714462,
0.036776214838027954,
-0.08157306164503098,
0.022900743409991264,
-0.10047483444213867,
0.002025678288191557,
0.02005080319941044,
0.2473200410604477,
0.1967000812292099,
-0.09632564336061478,
-0.012216159142553806,
-0.05708231031894684,
-0.032561756670475006,
-0.04091155156493187,
-0.002459051087498665,
-0.07821618020534515,
-0.21873407065868378,
0.051539067178964615,
-0.0930585265159607,
-0.07632365822792053,
-0.06189138814806938,
-0.04064059257507324,
-0.02870149537920952,
0.046939339488744736,
0.03212931379675865,
0.04136762022972107,
0.05070297420024872,
-0.0371626541018486,
-0.09345480799674988,
0.06879863888025284,
-0.11172787100076675,
-0.042014576494693756,
-0.03408866748213768,
0.014045859687030315,
0.032319605350494385,
-0.07429610192775726,
0.07487598061561584,
-0.012149554677307606,
-0.07710553705692291,
0.036456044763326645,
-0.03482281416654587,
0.02153356932103634,
0.07482071220874786,
0.04184282198548317,
-0.09644174575805664,
0.015602846629917622,
0.18867559731006622,
0.020273970440030098,
0.008802177384495735,
-0.14742465317249298,
0.2000039666891098,
-0.02619965374469757,
0.07266447693109512,
-0.03337041288614273,
-0.015141828916966915,
-0.10115411877632141,
0.19129611551761627,
0.11998134851455688,
-0.24376079440116882,
0.024953339248895645,
-0.12912821769714355,
0.022151969373226166,
-0.13376696407794952,
0.20840151607990265,
0.05465596541762352,
0.10847201198339462,
-0.06020665541291237,
-0.02479162998497486,
-0.1493310034275055,
-0.09408020973205566,
-0.08478302508592606,
-0.0414455346763134,
0.10249399393796921,
0.0031611735466867685,
-0.05072701349854469,
-0.00887248944491148,
-0.1566619724035263,
0.10201162099838257,
-0.048264030367136,
-0.11855816096067429,
-0.0679796114563942,
-0.059141192585229874,
-0.06102965027093887,
0.11088541150093079,
0.11637356877326965,
-0.01684124954044819,
0.024554423987865448,
-0.07280154526233673,
-0.012559473514556885,
0.011003518477082253,
0.005383014678955078,
0.0626269057393074,
-0.04783647879958153,
0.1594477891921997,
-0.021524829789996147,
0.0008918871753849089,
0.04285505786538124,
0.05263057351112366,
-0.07584847509860992,
0.06380704790353775,
0.02512199431657791,
0.028178859502077103,
-0.006920731160789728,
0.059795111417770386,
-0.0196672473102808,
0.08964395523071289,
0.08038042485713959,
-0.007235884666442871,
0.09868589043617249,
-0.03191833570599556,
0.006547331809997559,
-0.057698819786310196,
0.06932510435581207,
-0.12982366979122162,
0.05436630919575691,
0.043436627835035324,
-0.10945180803537369,
0.03841061517596245,
0.02560393325984478,
0.11603125184774399,
0.058632634580135345,
-0.040632184594869614,
-0.10494323819875717,
-0.13799439370632172,
0.023235952481627464,
0.058803655207157135,
-0.06312531977891922,
-0.13800419867038727,
-0.052970461547374725,
-0.2062724232673645,
0.04198472201824188,
-0.07393307238817215,
0.06842854619026184,
0.045238204300403595,
0.01849091611802578,
-0.05578908324241638,
-0.06200101599097252,
0.01771395653486252,
0.13669656217098236,
-0.06059794872999191,
-0.13932769000530243
] |
null | null | transformers | Model description:
Model: pgajo/mbert-xlwa-en-it
Dataset: TASTEset
Unshuffled ratio: ['0']
Shuffled ratio: ['1']
Best exact match epoch: 10
Best exact match: 86.54
Best epoch: 10
Drop duplicates: ['1']
Max epochs = 10
Optimizer lr = 3e-05
Optimizer eps = 1e-08
Batch size = 32
Dataset path = pgajo/EW-TT-PE_U0_S1_Tingredient_P0.25_DROP1_mbert
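
A minimal inference sketch (standard transformers pipeline usage; the question/context strings are placeholders for real inputs):

```python
# Extractive QA with the uploaded checkpoint.
from transformers import pipeline

qa = pipeline(
    "question-answering",
    model="pgajo/mbert-xlwa-en-it_EW-TT-PE_U0_S1_Tingredient_P0.25_DROP1_mbert_E10_DEV87.0",
)
result = qa(question="...", context="...")  # replace with real inputs
print(result["answer"], result["score"])
```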
Results
| epoch | train_loss | train_f1 | train_exact | dev_loss | dev_f1 | dev_exact | test_loss | test_f1 | test_exact |
|--------:|-------------:|-----------:|--------------:|-----------:|---------:|------------:|------------:|----------:|-------------:|
| 1 | 1.18 | 68.16 | 50.69 | 0.7 | 81.28 | 69.51 | 0 | 0 | 0 |
| 2 | 0.39 | 88.83 | 80.23 | 0.62 | 85.69 | 78.57 | 0 | 0 | 0 |
| 3 | 0.16 | 95.33 | 91.53 | 0.7 | 86.71 | 81.04 | 0 | 0 | 0 |
| 4 | 0.09 | 97.02 | 94.56 | 0.79 | 87.62 | 82.42 | 0 | 0 | 0 |
| 5 | 0.07 | 97.82 | 96.07 | 0.71 | 86.34 | 81.32 | 0 | 0 | 0 |
| 6 | 0.06 | 97.58 | 96.07 | 0.63 | 88.88 | 83.79 | 0 | 0 | 0 |
| 7 | 0.04 | 98.77 | 98 | 0.59 | 89.36 | 84.34 | 0 | 0 | 0 |
| 8 | 0.04 | 98.89 | 98.14 | 0.7 | 88.27 | 83.24 | 0 | 0 | 0 |
| 9 | 0.02 | 99.53 | 98.9 | 0.72 | 89.48 | 85.44 | 0 | 0 | 0 |
| 10 | 0.02 | 99.31 | 98.55 | 0.73 | 90.3 | 86.54 | 0 | 0 | 0 | | {} | question-answering | pgajo/mbert-xlwa-en-it_EW-TT-PE_U0_S1_Tingredient_P0.25_DROP1_mbert_E10_DEV87.0 | [
"transformers",
"safetensors",
"bert",
"question-answering",
"endpoints_compatible",
"region:us"
] | 2024-02-13T10:55:55+00:00 | [] | [] | TAGS
#transformers #safetensors #bert #question-answering #endpoints_compatible #region-us
| Model description:
```
Model: pgajo/mbert-xlwa-en-it
Dataset: TASTEset
Unshuffled ratio: ['0']
Shuffled ratio: ['1']
Best exact match epoch: 10
Best exact match: 86.54
Best epoch: 10
Drop duplicates: ['1']
Max epochs = 10
Optimizer lr = 3e-05
Optimizer eps = 1e-08
Batch size = 32
Dataset path = pgajo/EW-TT-PE_U0_S1_Tingredient_P0.25_DROP1_mbert
```
Results
| [] | [
"TAGS\n#transformers #safetensors #bert #question-answering #endpoints_compatible #region-us \n"
] | [
30
] | [
"passage: TAGS\n#transformers #safetensors #bert #question-answering #endpoints_compatible #region-us \n"
] | [
-0.03100396879017353,
0.011429967358708382,
-0.009655450470745564,
-0.0477571114897728,
0.071015864610672,
0.001686002011410892,
0.08008057624101639,
0.05985769256949425,
0.11401950567960739,
0.02590048313140869,
0.1903941035270691,
0.16566626727581024,
-0.07932274788618088,
0.015106523409485817,
-0.13172350823879242,
-0.13182127475738525,
0.11529869586229324,
0.03778080269694328,
-0.03543904423713684,
0.10329030454158783,
0.05029234290122986,
-0.12624382972717285,
0.04368755966424942,
-0.06763096153736115,
-0.062081653624773026,
0.06668882071971893,
0.04820772260427475,
-0.08198674768209457,
0.13085918128490448,
0.03362511843442917,
0.2047542929649353,
0.04677434265613556,
-0.1182841956615448,
-0.21163156628608704,
0.03874710574746132,
-0.011287915520370007,
-0.05873045325279236,
0.019588099792599678,
0.032477255910634995,
-0.07909006625413895,
-0.11140874028205872,
0.027899496257305145,
0.014707351103425026,
0.08549544960260391,
-0.18314984440803528,
-0.16563549637794495,
-0.06621148437261581,
-0.053103990852832794,
0.12317322194576263,
0.08563494682312012,
-0.020668305456638336,
0.1935536116361618,
-0.15425218641757965,
0.0928223505616188,
0.1380285918712616,
-0.32555314898490906,
-0.0027393975760787725,
0.093502476811409,
0.11618221551179886,
0.05096927657723427,
-0.02073126845061779,
0.09022705256938934,
0.07546665519475937,
-0.00581451877951622,
-0.06733445823192596,
-0.0957256555557251,
-0.012503020465373993,
0.09702391922473907,
-0.07598375529050827,
-0.052956461906433105,
0.2470276802778244,
0.031026924028992653,
0.013565225526690483,
-0.008941343985497952,
-0.10310965776443481,
0.030862320214509964,
0.02648748643696308,
-0.06024225428700447,
-0.02690120041370392,
0.06734149158000946,
-0.0001909599086502567,
0.005896252579987049,
-0.1221570298075676,
-0.006722765974700451,
-0.22672583162784576,
0.2768072187900543,
-0.0018987046787515283,
0.08534801006317139,
-0.2428436279296875,
0.015660421922802925,
-0.06141046807169914,
-0.0824490636587143,
-0.013059272430837154,
-0.09494815766811371,
-0.009192516095936298,
-0.02866560034453869,
-0.04682322219014168,
0.015530125238001347,
0.12870869040489197,
0.20563961565494537,
-0.017999636009335518,
0.04083723947405815,
-0.061628565192222595,
0.0725679025053978,
0.03914913535118103,
0.09992070496082306,
0.010195896960794926,
-0.020322704687714577,
-0.016003627330064774,
-0.13105420768260956,
-0.008767413906753063,
-0.03738516569137573,
-0.05202561616897583,
-0.022937579080462456,
0.01343182846903801,
0.16656653583049774,
0.057803552597761154,
0.021070659160614014,
-0.08621648699045181,
0.05785249546170235,
0.022443469613790512,
-0.04320667311549187,
-0.017870478332042694,
0.00882878340780735,
0.06155950948596001,
0.0885266587138176,
-0.07562171667814255,
0.04524178430438042,
0.016779053956270218,
0.06491811573505402,
-0.07376032322645187,
-0.06024041771888733,
-0.019815200939774513,
-0.022853199392557144,
0.06425601989030838,
-0.06728833168745041,
0.08267539739608765,
-0.1562412828207016,
-0.08226612955331802,
0.011612122878432274,
0.02970954217016697,
0.007305266335606575,
0.06759197264909744,
-0.014567295089364052,
-0.039057523012161255,
-0.03480268642306328,
-0.07194317877292633,
-0.10265897214412689,
-0.07100482285022736,
0.06559862941503525,
0.037085019052028656,
0.029506711289286613,
-0.08701489865779877,
0.0126223498955369,
-0.10313430428504944,
0.0696413442492485,
-0.07926147431135178,
-0.03626604750752449,
-0.030684340745210648,
0.19216585159301758,
-0.03995077684521675,
-0.013410759158432484,
-0.11826255917549133,
0.05234655737876892,
-0.05254388228058815,
0.21867278218269348,
-0.03809955716133118,
-0.03585023805499077,
0.23391962051391602,
-0.09690817445516586,
-0.2571674883365631,
0.07713238894939423,
0.006013390142470598,
0.017324132844805717,
0.10797587037086487,
0.19150643050670624,
-0.016850516200065613,
-0.11185130476951599,
0.0474415123462677,
0.11249569058418274,
-0.15280477702617645,
-0.0624573640525341,
0.025971313938498497,
-0.0582793690264225,
-0.1464228332042694,
0.016458844766020775,
0.051048628985881805,
0.04815160855650902,
-0.08806464076042175,
-0.03191754221916199,
-0.02947526052594185,
-0.018536636605858803,
0.061611421406269073,
0.04005695879459381,
0.026151038706302643,
-0.12002047151327133,
0.017315825447440147,
-0.051940858364105225,
-0.04731830582022667,
0.03846436366438866,
0.007411974482238293,
-0.12714537978172302,
0.07094167917966843,
-0.131436288356781,
0.020615974441170692,
-0.16280385851860046,
-0.19247999787330627,
… (remainder of a 768-dimensional embedding vector omitted: several hundred raw float values with no human-readable content) …
] |
null | null | null | # GLiNER: Generalist Model for Named Entity Recognition using Bidirectional Transformer

## Overview
Named Entity Recognition (NER) is essential in various Natural Language Processing (NLP) applications. Traditional NER models are effective but limited to a set of predefined entity types. In contrast, Large Language Models (LLMs) can extract arbitrary entities through natural language instructions, offering greater flexibility. However, their size and cost, particularly for those accessed via APIs like ChatGPT, make them impractical in resource-limited scenarios. In this paper, we introduce a compact NER model trained to identify any type of entity. Leveraging a bidirectional transformer encoder, our model, GLiNER, facilitates parallel entity extraction, an advantage over the slow sequential token generation of LLMs. Through comprehensive testing, GLiNER demonstrates strong performance, outperforming both ChatGPT and fine-tuned LLMs in zero-shot evaluations on various NER benchmarks.
- Arxiv link: https://arxiv.org/abs/2311.08526
- Google colab demo: https://colab.research.google.com/drive/1mhalKWzmfSTqMnR0wQBZvt9-ktTsATHB?usp=sharing
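The parallel extraction described above boils down to scoring every candidate span against every entity-type representation at once. Below is a minimal, illustrative sketch of that idea — this is not GLiNER's actual implementation; the hidden size, random vectors, and 0.5 cutoff are placeholder assumptions:
```python
import torch

# Toy sketch of parallel span-vs-label scoring (not the real GLiNER code).
# Assume a bidirectional encoder already produced one vector per candidate
# span and one per entity-type prompt; scoring all pairs is a single matmul,
# so every entity is extracted in parallel rather than token by token.
hidden = 256                            # placeholder hidden size
span_reprs = torch.randn(12, hidden)    # 12 candidate spans (random stand-ins)
label_reprs = torch.randn(3, hidden)    # e.g. person / date / teams

scores = torch.sigmoid(span_reprs @ label_reprs.T)   # (num_spans, num_labels)
for span_idx, label_idx in (scores > 0.5).nonzero():
    print(f"span {span_idx.item()} -> label {label_idx.item()} "
          f"(score {scores[span_idx, label_idx]:.2f})")
```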
## Usage
```python
import re

import torch

from save_load import load_model  # helper module shipped with the GLiNER repository

# Load the pretrained GLiNER checkpoint on top of a DeBERTa-v3 backbone
model = load_model("gliner_base.pt", model_name="microsoft/deberta-v3-base")
model = model.eval()

def extract_entities(model, text, labels, threshold=0.5):
    def tokenize_text(text):
        # Whitespace/punctuation tokenizer that keeps hyphenated words together
        return re.findall(r'\w+(?:[-_]\w+)*|\S', text.replace("\n", ""))

    tokens = tokenize_text(text)
    input_x = {"tokenized_text": tokens, "ner": None}
    x = model.collate_fn([input_x], labels)
    # flat_ner=True keeps non-overlapping spans scored above the threshold
    output = model.predict(x, flat_ner=True, threshold=threshold)
    result_dict = {}
    for start, end, ent_type in output[0]:
        span_text = " ".join(tokens[start:end + 1]).replace(" ' ", "'")
        result_dict[span_text] = ent_type
    return result_dict
text = """
Cristiano Ronaldo dos Santos Aveiro (Portuguese pronunciation: [kɾiʃˈtjɐnu ʁɔˈnaldu]; born 5 February 1985) is a Portuguese professional footballer who plays as a forward for and captains both Saudi Pro League club Al Nassr and the Portugal national team. Widely regarded as one of the greatest players of all time, Ronaldo has won five Ballon d'Or awards,[note 3] a record three UEFA Men's Player of the Year Awards, and four European Golden Shoes, the most by a European player. He has won 33 trophies in his career, including seven league titles, five UEFA Champions Leagues, the UEFA European Championship and the UEFA Nations League. Ronaldo holds the records for most appearances (183), goals (140) and assists (42) in the Champions League, goals in the European Championship (14), international goals (128) and international appearances (205). He is one of the few players to have made over 1,200 professional career appearances, the most by an outfield player, and has scored over 850 official senior career goals for club and country, making him the top goalscorer of all time.
"""
labels = ["person", "award", "date", "competitions", "teams"]
result = extract_entities(model, text, labels, threshold=0.5)
for k, v in result.items():
    print(f"{k}, {v}")
```
```
- Cristiano Ronaldo dos Santos Aveiro, person
- 5 February 1985, date
- Al Nassr, teams
- Portugal national team, teams
- Ballon d'Or, award
- UEFA Men's Player of the Year Awards, award
- European Golden Shoes, award
- UEFA Champions Leagues, competitions
- UEFA European Championship, competitions
- UEFA Nations League, competitions
- Champions League, competitions
- European Championship, competitions
```
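The `threshold` argument controls the precision/recall trade-off: a lower value surfaces more candidate spans at the cost of noisier labels. For instance, reusing the function defined above:
```python
# Recall-oriented pass: accept weaker span-label matches than the default 0.5
loose_result = extract_entities(model, text, labels, threshold=0.3)
```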
## Resources
- [Pretrained Weight](https://drive.google.com/file/d/100aMdyzk5EC6Rl2kzLmLvMKbHz3Btt34/view?usp=sharing)
- [Training Data](https://drive.google.com/file/d/1MKDx73hzm9sFByJMBJhHqEuBeJzW5TsL/view?usp=sharing)
- [Evaluation Data](https://drive.google.com/file/d/1T-5IbocGka35I7X3CE6yKe5N_Xg2lVKT/view)
## Contact
If you have any questions or need further assistance, please raise an issue.
```bibtex
@misc{zaratiana2023gliner,
title={GLiNER: Generalist Model for Named Entity Recognition using Bidirectional Transformer},
author={Urchade Zaratiana and Nadi Tomeh and Pierre Holat and Thierry Charnois},
year={2023},
eprint={2311.08526},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
| {} | null | urchade/GLiNER | [
"arxiv:2311.08526",
"region:us"
] | 2024-02-13T11:00:25+00:00 | [
"2311.08526"
] | [] | TAGS
#arxiv-2311.08526 #region-us
| # GLiNER: Generalist Model for Named Entity Recognition using Bidirectional Transformer
!GLiNER Logo
## Overview
Named Entity Recognition (NER) is essential in various Natural Language Processing (NLP) applications. Traditional NER models are effective but limited to a set of predefined entity types. In contrast, Large Language Models (LLMs) can extract arbitrary entities through natural language instructions, offering greater flexibility. However, their size and cost, particularly for those accessed via APIs like ChatGPT, make them impractical in resource-limited scenarios. In this paper, we introduce a compact NER model trained to identify any type of entity. Leveraging a bidirectional transformer encoder, our model, GLiNER, facilitates parallel entity extraction, an advantage over the slow sequential token generation of LLMs. Through comprehensive testing, GLiNER demonstrates strong performance, outperforming both ChatGPT and fine-tuned LLMs in zero-shot evaluations on various NER benchmarks.
- Arxiv link: URL
- Google colab demo: URL
## Usage
## Resources
- Pretrained Weight
- Training Data
- Evaluation Data
## Contact
If you have any questions or need further assistance, please raise an issue.
| [
"# GLiNER: Generalist Model for Named Entity Recognition using Bidirectional Transformer\n\n!GLiNER Logo",
"## Overview\nNamed Entity Recognition (NER) is essential in various Natural Language Processing (NLP) applications. Traditional NER models are effective but limited to a set of predefined entity types. In contrast, Large Language Models (LLMs) can extract arbitrary entities through natural language instructions, offering greater flexibility. However, their size and cost, particularly for those accessed via APIs like ChatGPT, make them impractical in resource-limited scenarios. In this paper, we introduce a compact NER model trained to identify any type of entity. Leveraging a bidirectional transformer encoder, our model, GLiNER, facilitates parallel entity extraction, an advantage over the slow sequential token generation of LLMs. Through comprehensive testing, GLiNER demonstrate strong performance, outperforming both ChatGPT and fine-tuned LLMs in zero-shot evaluations on various NER benchmarks.\n- Arxiv link: URL\n- Google colab demo: URL",
"## Usage",
"## Ressources\n- Pretrained Weight\n- Training Data\n- Evaluation Data",
"## Contact\nIf you have any questions or need further assistance please raise an issue."
] | [
"TAGS\n#arxiv-2311.08526 #region-us \n",
"# GLiNER: Generalist Model for Named Entity Recognition using Bidirectional Transformer\n\n!GLiNER Logo",
"## Overview\nNamed Entity Recognition (NER) is essential in various Natural Language Processing (NLP) applications. Traditional NER models are effective but limited to a set of predefined entity types. In contrast, Large Language Models (LLMs) can extract arbitrary entities through natural language instructions, offering greater flexibility. However, their size and cost, particularly for those accessed via APIs like ChatGPT, make them impractical in resource-limited scenarios. In this paper, we introduce a compact NER model trained to identify any type of entity. Leveraging a bidirectional transformer encoder, our model, GLiNER, facilitates parallel entity extraction, an advantage over the slow sequential token generation of LLMs. Through comprehensive testing, GLiNER demonstrate strong performance, outperforming both ChatGPT and fine-tuned LLMs in zero-shot evaluations on various NER benchmarks.\n- Arxiv link: URL\n- Google colab demo: URL",
"## Usage",
"## Ressources\n- Pretrained Weight\n- Training Data\n- Evaluation Data",
"## Contact\nIf you have any questions or need further assistance please raise an issue."
] | [
15,
27,
227,
3,
16,
16
] | [
"passage: TAGS\n#arxiv-2311.08526 #region-us \n# GLiNER: Generalist Model for Named Entity Recognition using Bidirectional Transformer\n\n!GLiNER Logo## Overview\nNamed Entity Recognition (NER) is essential in various Natural Language Processing (NLP) applications. Traditional NER models are effective but limited to a set of predefined entity types. In contrast, Large Language Models (LLMs) can extract arbitrary entities through natural language instructions, offering greater flexibility. However, their size and cost, particularly for those accessed via APIs like ChatGPT, make them impractical in resource-limited scenarios. In this paper, we introduce a compact NER model trained to identify any type of entity. Leveraging a bidirectional transformer encoder, our model, GLiNER, facilitates parallel entity extraction, an advantage over the slow sequential token generation of LLMs. Through comprehensive testing, GLiNER demonstrate strong performance, outperforming both ChatGPT and fine-tuned LLMs in zero-shot evaluations on various NER benchmarks.\n- Arxiv link: URL\n- Google colab demo: URL## Usage## Ressources\n- Pretrained Weight\n- Training Data\n- Evaluation Data## Contact\nIf you have any questions or need further assistance please raise an issue."
] | [
… (768-dimensional embedding vector omitted: hundreds of raw float values with no human-readable content) …
] |
null | null | transformers |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# whisper-small-anat
This model is a fine-tuned version of [openai/whisper-small](https://huggingface.co/openai/whisper-small) on the audiofolder dataset.
It achieves the following results on the evaluation set:
- Loss: 5.6234
- Wer: 145.2381
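For reference, transcription with this checkpoint should work through the standard `transformers` pipeline; the snippet below is an untested sketch, and `sample.wav` is a placeholder path to a 16 kHz audio file:
```python
from transformers import pipeline

# Repo id taken from this model card; "sample.wav" is a placeholder audio file.
asr = pipeline("automatic-speech-recognition", model="alexbrand09/whisper-small-anat")
print(asr("sample.wav")["text"])
```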
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (an equivalent trainer configuration is sketched after the list):
- learning_rate: 1e-05
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 32
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
- training_steps: 40
- mixed_precision_training: Native AMP
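As a rough illustration, the hyperparameters above correspond to a `Seq2SeqTrainingArguments` configuration along these lines — a sketch, not the exact training script; the output directory is a placeholder, and the default AdamW betas/epsilon already match the values listed above:
```python
from transformers import Seq2SeqTrainingArguments

# Mirrors the list above; "whisper-small-anat" is a placeholder output dir.
training_args = Seq2SeqTrainingArguments(
    output_dir="whisper-small-anat",
    learning_rate=1e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    seed=42,
    gradient_accumulation_steps=2,  # 16 x 2 = effective train batch size 32
    lr_scheduler_type="linear",
    warmup_steps=500,
    max_steps=40,
    fp16=True,                      # mixed precision ("Native AMP")
)
```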
### Training results
| Training Loss | Epoch | Step | Validation Loss | Wer |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| No log | 6.67 | 10 | 11.3731 | 142.8571 |
| No log | 13.33 | 20 | 10.3496 | 142.8571 |
| 11.0713 | 20.0 | 30 | 7.2831 | 142.8571 |
| 11.0713 | 26.67 | 40 | 5.6234 | 145.2381 |
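The `Wer` column is word error rate in percent; values above 100 are possible when the hypothesis contains many insertions. A minimal sketch of reproducing the metric with the `evaluate` library (the strings below are made-up placeholders):
```python
import evaluate

wer_metric = evaluate.load("wer")  # jiwer-backed word error rate
score = 100 * wer_metric.compute(
    predictions=["the left ventricle appears appears mildly dilated today"],
    references=["the left ventricle appears mildly dilated"],
)
print(f"WER: {score:.2f}%")  # insertions can push WER past 100%
```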
### Framework versions
- Transformers 4.37.2
- Pytorch 2.2.0+cu121
- Datasets 2.17.0
- Tokenizers 0.15.2
| {"license": "apache-2.0", "tags": ["generated_from_trainer"], "datasets": ["audiofolder"], "metrics": ["wer"], "base_model": "openai/whisper-small", "model-index": [{"name": "whisper-small-anat", "results": [{"task": {"type": "automatic-speech-recognition", "name": "Automatic Speech Recognition"}, "dataset": {"name": "audiofolder", "type": "audiofolder", "config": "default", "split": "None", "args": "default"}, "metrics": [{"type": "wer", "value": 145.23809523809524, "name": "Wer"}]}]}]} | automatic-speech-recognition | alexbrand09/whisper-small-anat | [
"transformers",
"tensorboard",
"safetensors",
"whisper",
"automatic-speech-recognition",
"generated_from_trainer",
"dataset:audiofolder",
"base_model:openai/whisper-small",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"region:us"
] | 2024-02-13T11:04:22+00:00 | [] | [] | TAGS
#transformers #tensorboard #safetensors #whisper #automatic-speech-recognition #generated_from_trainer #dataset-audiofolder #base_model-openai/whisper-small #license-apache-2.0 #model-index #endpoints_compatible #region-us
| whisper-small-anat
==================
This model is a fine-tuned version of openai/whisper-small on the audiofolder dataset.
It achieves the following results on the evaluation set:
* Loss: 5.6234
* Wer: 145.2381
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 1e-05
* train\_batch\_size: 16
* eval\_batch\_size: 8
* seed: 42
* gradient\_accumulation\_steps: 2
* total\_train\_batch\_size: 32
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* lr\_scheduler\_warmup\_steps: 500
* training\_steps: 40
* mixed\_precision\_training: Native AMP
### Training results
### Framework versions
* Transformers 4.37.2
* Pytorch 2.2.0+cu121
* Datasets 2.17.0
* Tokenizers 0.15.2
| [
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 1e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 8\n* seed: 42\n* gradient\\_accumulation\\_steps: 2\n* total\\_train\\_batch\\_size: 32\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 500\n* training\\_steps: 40\n* mixed\\_precision\\_training: Native AMP",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.37.2\n* Pytorch 2.2.0+cu121\n* Datasets 2.17.0\n* Tokenizers 0.15.2"
] | [
"TAGS\n#transformers #tensorboard #safetensors #whisper #automatic-speech-recognition #generated_from_trainer #dataset-audiofolder #base_model-openai/whisper-small #license-apache-2.0 #model-index #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 1e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 8\n* seed: 42\n* gradient\\_accumulation\\_steps: 2\n* total\\_train\\_batch\\_size: 32\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 500\n* training\\_steps: 40\n* mixed\\_precision\\_training: Native AMP",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.37.2\n* Pytorch 2.2.0+cu121\n* Datasets 2.17.0\n* Tokenizers 0.15.2"
] | [
81,
158,
4,
33
] | [
"passage: TAGS\n#transformers #tensorboard #safetensors #whisper #automatic-speech-recognition #generated_from_trainer #dataset-audiofolder #base_model-openai/whisper-small #license-apache-2.0 #model-index #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 1e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 8\n* seed: 42\n* gradient\\_accumulation\\_steps: 2\n* total\\_train\\_batch\\_size: 32\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 500\n* training\\_steps: 40\n* mixed\\_precision\\_training: Native AMP### Training results### Framework versions\n\n\n* Transformers 4.37.2\n* Pytorch 2.2.0+cu121\n* Datasets 2.17.0\n* Tokenizers 0.15.2"
] | [
… (768-dimensional embedding vector omitted: hundreds of raw float values with no human-readable content) …
] |